Sample records for average time series

  1. Time averaging, ageing and delay analysis of financial time series

    NASA Astrophysics Data System (ADS)

    Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf

    2017-06-01

    We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
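
    The time averaged MSD the authors analyze is straightforward to compute from a single trajectory. Below is a minimal numpy sketch (not the paper's code; the GBM parameters are illustrative) of the time averaged MSD applied to a simulated geometric Brownian motion log-price series.

```python
import numpy as np

def time_averaged_msd(x, lags):
    """Time-averaged mean squared displacement of a single trajectory x."""
    x = np.asarray(x, dtype=float)
    T = len(x)
    return np.array([np.mean((x[lag:] - x[:T - lag]) ** 2) for lag in lags])

# Illustration on geometric Brownian motion (the Black-Scholes-Merton benchmark):
rng = np.random.default_rng(0)
mu, sigma, dt, n = 0.05 / 252, 0.2 / np.sqrt(252), 1.0, 2520
log_price = np.cumsum((mu - 0.5 * sigma**2) * dt
                      + sigma * np.sqrt(dt) * rng.standard_normal(n))

lags = np.arange(1, 101)
tamsd = time_averaged_msd(log_price, lags)
print(tamsd[:5])  # grows roughly linearly in the lag for Brownian-type motion
```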

  2. Using Time-Series Regression to Predict Academic Library Circulations.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.

    1984-01-01

    Four methods were used to forecast monthly circulation totals in 15 midwestern academic libraries: dummy time-series regression, lagged time-series regression, simple average (straight-line forecasting), and monthly average (naive forecasting). In tests of forecasting accuracy, the dummy regression and monthly mean methods exhibited the smallest average…
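
    The two naive baselines named in the record are easy to state precisely. A small numpy sketch, assuming a monthly series that starts in calendar month 0; the function names are ours, not the study's.

```python
import numpy as np

def straight_line_forecast(y, horizon):
    """Simple-average baseline: every future month gets the overall historical mean."""
    return np.full(horizon, np.mean(y))

def monthly_average_forecast(y, horizon):
    """Monthly-average baseline: each future month gets the mean of that calendar month."""
    y = np.asarray(y, dtype=float)
    monthly_means = np.array([y[m::12].mean() for m in range(12)])
    start = len(y) % 12                      # calendar position of the first forecast month
    return monthly_means[(start + np.arange(horizon)) % 12]

# Example: five years of monthly circulation counts, forecast one more year.
rng = np.random.default_rng(1)
season = 1000 + 200 * np.sin(2 * np.pi * np.arange(60) / 12)
y = season + rng.normal(0, 50, 60)
print(straight_line_forecast(y, 12)[:3])
print(monthly_average_forecast(y, 12)[:3])
```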

  3. 75 FR 37390 - Caribbean Fishery Management Council; Public Hearings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-29

    ...; rather, all are calculated based on landings data averaged over alternative time series. The overfished... the USVI, and recreational landings data recorded during 2000-2001. These time series were considered... Calculated Based on the Alternative Time Series Described in Section 4.2.1. Also Included Are the Average...

  4. Comparison of detrending methods for fluctuation analysis in hydrology

    NASA Astrophysics Data System (ADS)

    Zhang, Qiang; Zhou, Yu; Singh, Vijay P.; Chen, Yongqin David

    2011-03-01

    Trends within a hydrologic time series can significantly influence the scaling results of fluctuation analysis, such as rescaled range (RS) analysis and (multifractal) detrended fluctuation analysis (MF-DFA). Therefore, removal of trends is important in the study of scaling properties of the time series. In this study, three detrending methods, including the adaptive detrending algorithm (ADA), a Fourier-based method, and the average removing technique, were evaluated by analyzing numerically generated series and observed streamflow series with an obvious, relatively regular periodic trend. Results indicated that: (1) the Fourier-based detrending method and ADA were similar in their detrending behaviour, and given proper parameters, these two methods can produce similarly satisfactory results; (2) series detrended by the Fourier-based method and ADA lose the fluctuation information at larger time scales, and the location of crossover points is heavily impacted by the chosen parameters of these two methods; and (3) the average removing method has an advantage over the other two methods, i.e., the fluctuation information at larger time scales is kept well, an indication of relatively reliable performance in detrending. In addition, the average removing method performed reasonably well in detrending a time series with regular periods or trends. In this sense, the average removing method should be preferred in the study of scaling properties of hydrometeorological series with relatively regular periodic trends using MF-DFA.
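
    A minimal reading of the "average removing technique" is to subtract the phase-averaged cycle of a known period; the numpy sketch below (with an assumed period of 365 samples) illustrates that reading and is not the authors' implementation.

```python
import numpy as np

def average_removing_detrend(x, period):
    """Remove a regular periodic trend by subtracting the phase-averaged cycle.

    For each phase of the (known) period, subtract the mean of all samples
    observed at that phase.
    """
    x = np.asarray(x, dtype=float)
    phases = np.arange(len(x)) % period
    cycle = np.array([x[phases == p].mean() for p in range(period)])
    return x - cycle[phases]

# Example: annual cycle (period 365) superposed on noise.
rng = np.random.default_rng(2)
t = np.arange(5 * 365)
flow = 10 + 3 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 1, t.size)
residual = average_removing_detrend(flow, 365)
print(residual.std())  # close to the noise level once the seasonal cycle is removed
```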

  5. 76 FR 53652 - Atlantic Highly Migratory Species; Atlantic Shark Management Measures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-29

    ... an annual basis over that time series, an average of 780 were released alive and 350 were discarded dead. For oceanic whitetip sharks discarded over the time series, an average of 133 were released alive... time, NMFS is implementing the Recommendations as adopted at the 2010 ICCAT meeting. These...

  6. Adaptive Anchoring Model: How Static and Dynamic Presentations of Time Series Influence Judgments and Predictions.

    PubMed

    Kusev, Petko; van Schaik, Paul; Tsaneva-Atanasova, Krasimira; Juliusson, Asgeir; Chater, Nick

    2018-01-01

    When attempting to predict future events, people commonly rely on historical data. One psychological characteristic of judgmental forecasting of time series, established by research, is that when people make forecasts from series, they tend to underestimate future values for upward trends and overestimate them for downward ones, so-called trend-damping (modeled by anchoring on, and insufficient adjustment from, the average of recent time series values). Events in a time series can be experienced sequentially (dynamic mode), or they can also be retrospectively viewed simultaneously (static mode), not experienced individually in real time. In one experiment, we studied the influence of presentation mode (dynamic and static) on two sorts of judgment: (a) predictions of the next event (forecast) and (b) estimation of the average value of all the events in the presented series (average estimation). Participants' responses in dynamic mode were anchored on more recent events than in static mode for all types of judgment but with different consequences; hence, dynamic presentation improved prediction accuracy, but not estimation. These results are not anticipated by existing theoretical accounts; we develop and present an agent-based model-the adaptive anchoring model (ADAM)-to account for the difference between processing sequences of dynamically and statically presented stimuli (visually presented data). ADAM captures how variation in presentation mode produces variation in responses (and the accuracy of these responses) in both forecasting and judgment tasks. ADAM's model predictions for the forecasting and judgment tasks fit better with the response data than a linear-regression time series model. Moreover, ADAM outperformed autoregressive-integrated-moving-average (ARIMA) and exponential-smoothing models, while neither of these models accounts for people's responses on the average estimation task. Copyright © 2017 The Authors. Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.

  7. Forecasting daily meteorological time series using ARIMA and regression models

    NASA Astrophysics Data System (ADS)

    Murat, Małgorzata; Malinowska, Iwona; Gos, Magdalena; Krzyszczak, Jaromir

    2018-04-01

    The daily air temperature and precipitation time series recorded between January 1, 1980 and December 31, 2010 at four European sites (Jokioinen, Dikopshof, Lleida and Lublin) from different climatic zones were modeled and forecasted. In our forecasting we used Box-Jenkins and Holt-Winters seasonal autoregressive integrated moving-average methods, the autoregressive integrated moving-average with external regressors in the form of Fourier terms, and time series regression including trend and seasonality components, implemented in R software. It was demonstrated that the obtained models are able to capture the dynamics of the time series data and to produce sensible forecasts.
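
    The "ARIMA with external regressors in the form of Fourier terms" approach can be sketched with statsmodels' SARIMAX, passing sine/cosine harmonics as exogenous variables. The paper used R; this Python sketch, with illustrative orders and synthetic data, only mirrors the idea.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

def fourier_terms(n, period, K):
    """First K sine/cosine harmonics of the seasonal cycle, as exogenous regressors."""
    t = np.arange(n)
    cols = []
    for k in range(1, K + 1):
        cols.append(np.sin(2 * np.pi * k * t / period))
        cols.append(np.cos(2 * np.pi * k * t / period))
    return np.column_stack(cols)

# Synthetic daily temperature-like series: annual cycle plus AR(1) noise.
rng = np.random.default_rng(3)
n = 4 * 365
noise = np.zeros(n)
for i in range(1, n):
    noise[i] = 0.7 * noise[i - 1] + rng.normal(0, 1.5)
temp = 10 + 8 * np.sin(2 * np.pi * np.arange(n) / 365.25) + noise

X = fourier_terms(n, 365.25, K=2)
fit = SARIMAX(temp, exog=X, order=(1, 0, 1)).fit(disp=False)
future_X = fourier_terms(n + 30, 365.25, K=2)[-30:]
print(fit.forecast(steps=30, exog=future_X)[:5])
```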

  8. Regional Landslide Mapping Aided by Automated Classification of SqueeSAR™ Time Series (Northern Apennines, Italy)

    NASA Astrophysics Data System (ADS)

    Iannacone, J.; Berti, M.; Allievi, J.; Del Conte, S.; Corsini, A.

    2013-12-01

    Spaceborne InSAR has proven to be very valuable for landslide detection. In particular, extremely slow landslides (Cruden and Varnes, 1996) can now be clearly identified, thanks to the millimetric precision reached by recent multi-interferometric algorithms. The typical approach in radar interpretation for landslide mapping is based on the average annual velocity of the deformation, calculated over the entire time series. The Hotspot and Cluster Analysis (Lu et al., 2012) and the PSI-based matrix approach (Cigna et al., 2013) are examples of landslide mapping techniques based on average annual velocities. However, slope movements can be affected by non-linear deformation trends (i.e., reactivation of dormant landslides, deceleration due to natural or man-made slope stabilization, seasonal activity, etc.). Therefore, analyzing deformation time series is crucial in order to fully characterize slope dynamics. While this is relatively simple to carry out manually when dealing with a small dataset, time series analysis over regional-scale datasets requires automated classification procedures. Berti et al. (2013) developed an automatic procedure for the analysis of InSAR time series based on a sequence of statistical tests. The analysis classifies the time series into six distinctive target trends (0=uncorrelated; 1=linear; 2=quadratic; 3=bilinear; 4=discontinuous without constant velocity; 5=discontinuous with change in velocity) which are likely to represent different slope processes. The analysis also provides a series of descriptive parameters which can be used to characterize the temporal changes of ground motion. All the classification algorithms were integrated into a Graphical User Interface called PSTime. We investigated an area of about 2000 km2 in the Northern Apennines of Italy by using the SqueeSAR™ algorithm (Ferretti et al., 2011). Two Radarsat-1 data stacks, comprising 112 scenes in descending orbit and 124 scenes in ascending orbit, were processed. The time coverage spans April 2003 to November 2012, with an average temporal frequency of 1 scene/month. Radar interpretation has been carried out by considering average annual velocities as well as acceleration/deceleration trends evidenced by PSTime. Altogether, from ascending and descending geometries respectively, this approach allowed the detection of 115 and 112 potential landslides on the basis of average displacement rate, and 77 and 79 landslides on the basis of acceleration trends. In conclusion, time series analysis proved very valuable for landslide mapping. In particular, it highlighted areas with marked acceleration in a specific period in time while still being affected by a low average annual velocity over the entire analysis period. On the other hand, even in areas with high average annual velocity, time series analysis was of primary importance to characterize the slope dynamics in terms of acceleration events.

  9. Investigation of Cepstrum Analysis for Seismic/Acoustic Signal Sensor Range Determination.

    DTIC Science & Technology

    1981-01-01

    distorted by transmission through a linear system. For example, the effect of multipath and reverberation may be modeled in terms of a signal that is...called the short time averaged cepstrum. To derive some analytical expressions for short time average cepstrums we choose some functions of interest...linear process applied to the time series or any equivalent time function. Period: the amount of time required for one cycle of a time series. Saphe

  10. Documentation of a spreadsheet for time-series analysis and drawdown estimation

    USGS Publications Warehouse

    Halford, Keith J.

    2006-01-01

    Drawdowns during aquifer tests can be obscured by barometric pressure changes, earth tides, regional pumping, and recharge events in the water-level record. These stresses can create water-level fluctuations that should be removed from observed water levels prior to estimating drawdowns. Simple models have been developed for estimating unpumped water levels during aquifer tests that are referred to as synthetic water levels. These models sum multiple time series such as barometric pressure, tidal potential, and background water levels to simulate non-pumping water levels. The amplitude and phase of each time series are adjusted so that synthetic water levels match measured water levels during periods unaffected by an aquifer test. Differences between synthetic and measured water levels are minimized with a sum-of-squares objective function. Root-mean-square errors during fitting and prediction periods were compared multiple times at four geographically diverse sites. Prediction error equaled fitting error when fitting periods were greater than or equal to four times prediction periods. The proposed drawdown estimation approach has been implemented in a spreadsheet application. Measured time series are independent so that collection frequencies can differ and sampling times can be asynchronous. Time series can be viewed selectively and magnified easily. Fitting and prediction periods can be defined graphically or entered directly. Synthetic water levels for each observation well are created with earth tides, measured time series, moving averages of time series, and differences between measured and moving averages of time series. Selected series and fitting parameters for synthetic water levels are stored and drawdowns are estimated for prediction periods. Drawdowns can be viewed independently and adjusted visually if an anomaly skews initial drawdowns away from 0. The number of observations in a drawdown time series can be reduced by averaging across user-defined periods. Raw or reduced drawdown estimates can be copied from the spreadsheet application or written to tab-delimited ASCII files.
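
    The core fitting step described here, scaling component series so their sum matches water levels during unaffected periods by minimizing a sum-of-squares objective, amounts to ordinary least squares. A numpy sketch under simplifying assumptions (no phase shifts; those could be approximated by also including lagged copies of each column); all names and parameter values are illustrative, not the spreadsheet's.

```python
import numpy as np

def fit_synthetic_level(components, observed, fit_mask):
    """Scale stress series so their sum matches observed levels in the fit period.

    components : 2-D array, one column per time series (barometric pressure,
                 tidal potential, background water level, ...).
    fit_mask   : boolean array marking times unaffected by the aquifer test.
    """
    A = np.column_stack([np.ones(components.shape[0]), components])
    coef, *_ = np.linalg.lstsq(A[fit_mask], observed[fit_mask], rcond=None)
    synthetic = A @ coef                     # estimated non-pumping water level
    drawdown = synthetic - observed          # positive when pumping lowers the level
    return synthetic, drawdown

# Example with a barometric and a tidal component:
rng = np.random.default_rng(4)
t = np.arange(0.0, 30.0, 1 / 24)             # 30 days, hourly
baro = np.sin(2 * np.pi * t / 5)
tide = 0.3 * np.sin(2 * np.pi * t / 0.5175)  # roughly the M2 tidal period in days
level = 100 - 0.4 * baro - 0.8 * tide + rng.normal(0, 0.01, t.size)
level[t > 20] -= 0.5                         # drawdown from a pumping test
synthetic, drawdown = fit_synthetic_level(np.column_stack([baro, tide]), level, t <= 20)
print(drawdown[t > 25].mean())               # recovers roughly the 0.5 imposed drawdown
```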

  11. The Effect of Non-Normal Distributions on the Integrated Moving Average Model of Time-Series Analysis.

    ERIC Educational Resources Information Center

    Doerann-George, Judith

    The Integrated Moving Average (IMA) model of time series, and the analysis of intervention effects based on it, assume random shocks which are normally distributed. To determine the robustness of the analysis to violations of this assumption, empirical sampling methods were employed. Samples were generated from three populations: normal,…

  12. Why didn't Box-Jenkins win (again)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pack, D.J.; Downing, D.J.

    This paper focuses on the forecasting performance of the Box-Jenkins methodology applied to the 111 time series of the Makridakis competition. It considers the influence of the following factors: (1) time series length, (2) time-series information (autocorrelation) content, (3) time-series outliers or structural changes, (4) averaging results over time series, and (5) forecast time origin choice. It is found that the 111 time series contain substantial numbers of very short series, series with obvious structural change, and series whose histories are relatively uninformative. If these series are typical of those that one must face in practice, the real message of the competition is that univariate time series extrapolations will frequently fail regardless of the methodology employed to produce them.

  13. Glossary-HDSC/OWP

    Science.gov Websites

    Glossary - Precipitation Frequency Data Server. ... provides a measure of the average time between years (and not events) in which a particular value is ... (see RECURRENCE INTERVAL). ANNUAL MAXIMUM SERIES (AMS) - Time series of the largest precipitation amounts in a

  14. Use of Time-Series, ARIMA Designs to Assess Program Efficacy.

    ERIC Educational Resources Information Center

    Braden, Jeffery P.; And Others

    1990-01-01

    Illustrates use of time-series designs for determining efficacy of interventions with fictitious data describing a drug-abuse prevention program. Discusses problems and procedures associated with time-series data analysis using autoregressive integrated moving average (ARIMA) models. Example illustrates application of ARIMA analysis for…

  15. Estimation of time averages from irregularly spaced observations - With application to coastal zone color scanner estimates of chlorophyll concentration

    NASA Technical Reports Server (NTRS)

    Chelton, Dudley B.; Schlax, Michael G.

    1991-01-01

    The sampling error of an arbitrary linear estimate of a time-averaged quantity constructed from a time series of irregularly spaced observations at a fixed location is quantified through a general formalism. The method is applied to satellite observations of chlorophyll from the coastal zone color scanner. The two specific linear estimates under consideration are the composite average, formed from the simple average of all observations within the averaging period, and the optimal estimate, formed by minimizing the mean squared error of the temporal average based on all the observations in the time series. Suboptimal estimates, i.e., optimal estimates computed with approximate signal and measurement error statistics, are shown to be more accurate than composite averages. Suboptimal estimates are also found to be nearly as accurate as optimal estimates using the correct signal and measurement error variances and correlation functions for realistic ranges of these parameters, which makes suboptimal estimation a viable practical alternative to the composite average method generally employed at present.
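
    The composite average is the simpler of the two estimates: a plain mean of whatever observations fall inside each averaging window. A short numpy sketch (our construction, with made-up sampling times):

```python
import numpy as np

def composite_average(times, values, window_edges):
    """Composite average: simple mean of all observations in each averaging window."""
    idx = np.digitize(times, window_edges) - 1
    n_win = len(window_edges) - 1
    out = np.full(n_win, np.nan)             # NaN where a window has no observations
    for w in range(n_win):
        hits = values[idx == w]
        if hits.size:
            out[w] = hits.mean()
    return out

# Irregularly spaced 'chlorophyll' observations averaged into 8-day windows:
rng = np.random.default_rng(5)
times = np.sort(rng.uniform(0, 80, 120))
signal = 1 + 0.5 * np.sin(2 * np.pi * times / 40)
obs = signal + rng.normal(0, 0.2, times.size)
print(composite_average(times, obs, np.arange(0, 81, 8)))
```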

  16. Time Series Modelling of Syphilis Incidence in China from 2005 to 2012

    PubMed Central

    Zhang, Xingyu; Zhang, Tao; Pei, Jiao; Liu, Yuanyuan; Li, Xiaosong; Medrano-Gracia, Pau

    2016-01-01

    Background The infection rate of syphilis in China has increased dramatically in recent decades, becoming a serious public health concern. Early prediction of syphilis is therefore of great importance for health planning and management. Methods In this paper, we analyzed surveillance time series data for primary, secondary, tertiary, congenital and latent syphilis in mainland China from 2005 to 2012. Seasonality and long-term trend were explored with decomposition methods. Autoregressive integrated moving average (ARIMA) was used to fit a univariate time series model of syphilis incidence. A separate multi-variable time series for each syphilis type was also tested using an autoregressive integrated moving average model with exogenous variables (ARIMAX). Results The syphilis incidence rates have increased three-fold from 2005 to 2012. All syphilis time series showed strong seasonality and an increasing long-term trend. Both ARIMA and ARIMAX models fitted and estimated syphilis incidence well. All univariate time series showed the highest goodness-of-fit results with the ARIMA(0,0,1)×(0,1,1) model. Conclusion Time series analysis was an effective tool for modelling the historical and future incidence of syphilis in China. The ARIMAX model showed superior performance to the ARIMA model for the modelling of syphilis incidence. Time series correlations existed between the models for primary, secondary, tertiary, congenital and latent syphilis. PMID:26901682

  17. Time Series Modelling of Syphilis Incidence in China from 2005 to 2012.

    PubMed

    Zhang, Xingyu; Zhang, Tao; Pei, Jiao; Liu, Yuanyuan; Li, Xiaosong; Medrano-Gracia, Pau

    2016-01-01

    The infection rate of syphilis in China has increased dramatically in recent decades, becoming a serious public health concern. Early prediction of syphilis is therefore of great importance for health planning and management. In this paper, we analyzed surveillance time series data for primary, secondary, tertiary, congenital and latent syphilis in mainland China from 2005 to 2012. Seasonality and long-term trend were explored with decomposition methods. Autoregressive integrated moving average (ARIMA) was used to fit a univariate time series model of syphilis incidence. A separate multi-variable time series for each syphilis type was also tested using an autoregressive integrated moving average model with exogenous variables (ARIMAX). The syphilis incidence rates have increased three-fold from 2005 to 2012. All syphilis time series showed strong seasonality and an increasing long-term trend. Both ARIMA and ARIMAX models fitted and estimated syphilis incidence well. All univariate time series showed the highest goodness-of-fit results with the ARIMA(0,0,1)×(0,1,1) model. Time series analysis was an effective tool for modelling the historical and future incidence of syphilis in China. The ARIMAX model showed superior performance to the ARIMA model for the modelling of syphilis incidence. Time series correlations existed between the models for primary, secondary, tertiary, congenital and latent syphilis.
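
    The selected ARIMA(0,0,1)×(0,1,1) specification with a 12-month season can be reproduced in statsmodels; the sketch below fits it to a synthetic monthly incidence-like series, since the surveillance data themselves are not included here.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Monthly incidence-like series with seasonality and an upward trend (synthetic).
rng = np.random.default_rng(6)
months = np.arange(96)
incidence = (2 + 0.03 * months + 0.8 * np.sin(2 * np.pi * months / 12)
             + rng.normal(0, 0.2, months.size))

# The paper's selected specification: ARIMA(0,0,1) x (0,1,1) with a 12-month season.
fit = SARIMAX(incidence, order=(0, 0, 1), seasonal_order=(0, 1, 1, 12)).fit(disp=False)
print(fit.aic)
print(fit.forecast(steps=12))
```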

  18. Modeling Geodetic Processes with Levy α-Stable Distribution and FARIMA

    NASA Astrophysics Data System (ADS)

    Montillet, Jean-Philippe; Yu, Kegen

    2015-04-01

    Over the last few years the scientific community has been using the autoregressive moving average (ARMA) model in the modeling of the noise in global positioning system (GPS) time series (daily solutions). This work starts with an investigation of the limits of the ARMA model, which is widely used in signal processing when the measurement noise is white. Since a typical GPS time series consists of geophysical signals (e.g., seasonal signals) and stochastic processes (e.g., coloured and white noise), the ARMA model may be inappropriate. Therefore, the application of the fractional autoregressive integrated moving average (FARIMA) model is investigated. The simulation results using simulated time series as well as real GPS time series from a few selected stations around Australia show that the FARIMA model fits the time series better than other models when the coloured noise is larger than the white noise. The second part of this work focuses on fitting the GPS time series with the family of Levy α-stable distributions. Using this distribution, a hypothesis test is developed to effectively eliminate coarse outliers from GPS time series, achieving better performance than using the rule of thumb of n standard deviations (with n chosen empirically).
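
    The fractional differencing operator (1-B)^d at the heart of FARIMA has a simple binomial-weight implementation. A numpy sketch (truncated filter; our illustration, not the authors' code):

```python
import numpy as np

def frac_diff(x, d, n_weights=200):
    """Apply the fractional differencing filter (1-B)^d used in FARIMA models.

    Weights follow the binomial recursion w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k,
    truncated at n_weights terms.
    """
    w = np.ones(n_weights)
    for k in range(1, n_weights):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return np.convolve(x, w, mode="full")[: len(x)]

# Long-memory noise: fractionally *integrate* white noise (d = -0.4 differencing),
# then check that differencing with d = 0.4 approximately recovers white noise.
rng = np.random.default_rng(7)
white = rng.standard_normal(2000)
long_memory = frac_diff(white, -0.4)
recovered = frac_diff(long_memory, 0.4)
print(np.corrcoef(white[250:], recovered[250:])[0, 1])  # close to 1 after burn-in
```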

  19. Rotation in the Dynamic Factor Modeling of Multivariate Stationary Time Series.

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    2001-01-01

    Proposes a special rotation procedure for the exploratory dynamic factor model for stationary multivariate time series. The rotation procedure applies separately to each univariate component series of a q-variate latent factor series and transforms such a component, initially represented as white noise, into a univariate moving-average.…

  20. The Prediction of Teacher Turnover Employing Time Series Analysis.

    ERIC Educational Resources Information Center

    Costa, Crist H.

    The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…

  1. Refining Markov state models for conformational dynamics using ensemble-averaged data and time-series trajectories

    NASA Astrophysics Data System (ADS)

    Matsunaga, Y.; Sugita, Y.

    2018-06-01

    A data-driven modeling scheme is proposed for conformational dynamics of biomolecules based on molecular dynamics (MD) simulations and experimental measurements. In this scheme, an initial Markov state model (MSM) is constructed from MD simulation trajectories, and then the MSM parameters are refined using experimental measurements through machine learning techniques. The second step can reduce the bias of MD simulation results due to inaccurate force-field parameters. Either time-series trajectories or ensemble-averaged data can serve as the training data set in the scheme. Using a coarse-grained model of a dye-labeled polyproline-20, we compare the performance of machine learning estimations from the two types of training data sets. Machine learning from time-series data can provide the equilibrium populations of conformational states as well as their transition probabilities. It estimates hidden conformational states more robustly than learning from ensemble-averaged data, although there are limitations in estimating the transition probabilities between minor states. We discuss how to use the machine learning scheme for various experimental measurements including single-molecule time-series trajectories.

  2. Comparison of estimators for rolling samples using Forest Inventory and Analysis data

    Treesearch

    Devin S. Johnson; Michael S. Williams; Raymond L. Czaplewski

    2003-01-01

    The performance of three classes of weighted average estimators is studied for an annual inventory design similar to the Forest Inventory and Analysis program of the United States. The first class is based on an ARIMA(0,1,1) time series model. The equal weight, simple moving average is a member of this class. The second class is based on an ARIMA(0,2,2) time series...

  3. Evaluation of scaling invariance embedded in short time series.

    PubMed

    Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping

    2014-01-01

    Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities, and the consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2,0.3,...,0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with ignorable bias (≤0.03) and sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. We emphasize that our core contribution is that by means of the proposed method one can precisely estimate the Shannon entropy from limited records.

  4. Evaluation of Scaling Invariance Embedded in Short Time Series

    PubMed Central

    Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping

    2014-01-01

    Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities, and the consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2,0.3,...,0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with ignorable bias (≤0.03) and sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. We emphasize that our core contribution is that by means of the proposed method one can precisely estimate the Shannon entropy from limited records. PMID:25549356

  5. KARMA4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khalil, Mohammad; Salloum, Maher; Lee, Jina

    2017-07-10

    KARMA4 is a C++ library for autoregressive moving average (ARMA) modeling and forecasting of time-series data while incorporating both process and observation error. KARMA4 is designed for fitting and forecasting of time-series data for predictive purposes.
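
    KARMA4 itself is a C++ library and its API is not shown in this record. As an illustration of the same modeling idea in Python (not KARMA4's interface), statsmodels' SARIMAX can fit an ARMA process with an additional observation-error variance via measurement_error=True:

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

# ARMA(1,1) process (the "process error") observed with additive noise
# (the "observation error").
rng = np.random.default_rng(8)
n, phi, theta = 500, 0.8, 0.3
e = rng.normal(0, 1.0, n)
state = np.zeros(n)
for t in range(1, n):
    state[t] = phi * state[t - 1] + e[t] + theta * e[t - 1]
observed = state + rng.normal(0, 0.5, n)

# measurement_error=True adds an observation-noise variance to the state-space model.
fit = SARIMAX(observed, order=(1, 0, 1), measurement_error=True).fit(disp=False)
print(fit.params)            # AR, MA, innovation and measurement variances
print(fit.forecast(steps=10))
```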

  6. Conventional and advanced time series estimation: application to the Australian and New Zealand Intensive Care Society (ANZICS) adult patient database, 1993-2006.

    PubMed

    Moran, John L; Solomon, Patricia J

    2011-02-01

    Time series analysis has seen limited application in the biomedical literature. The utility of conventional and advanced time series estimators was explored for intensive care unit (ICU) outcome series. Monthly mean time series, 1993-2006, for hospital mortality, severity-of-illness score (APACHE III), ventilation fraction and patient type (medical and surgical), were generated from the Australia and New Zealand Intensive Care Society adult patient database. Analyses encompassed geographical seasonal mortality patterns, series structural time changes, mortality series volatility using autoregressive moving average and Generalized Autoregressive Conditional Heteroscedasticity models in which predicted variances are updated adaptively, and bivariate and multivariate (vector error correction models) cointegrating relationships between series. The mortality series exhibited marked seasonality, a declining mortality trend and substantial autocorrelation beyond 24 lags. Mortality increased in winter months (July-August); the medical series featured annual cycling, whereas the surgical series demonstrated long and short (3-4 months) cycling. Series structural breaks were apparent in January 1995 and December 2002. The covariance stationary first-differenced mortality series was consistent with a seasonal autoregressive moving average process; the observed conditional-variance volatility (1993-1995) and residual Autoregressive Conditional Heteroscedasticity effects entailed a Generalized Autoregressive Conditional Heteroscedasticity model, preferred by information criterion and mean model forecast performance. Bivariate cointegration, indicating long-term equilibrium relationships, was established between mortality and severity-of-illness scores at the database level and for categories of ICUs. Multivariate cointegration was demonstrated for {log APACHE III score, log ICU length of stay, ICU mortality and ventilation fraction}. A system approach to understanding series time-dependence may be established using conventional and advanced econometric time series estimators. © 2010 Blackwell Publishing Ltd.
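
    The ARMA-mean-plus-GARCH-variance structure described here can be sketched with the arch package (our choice of software; the authors do not state theirs):

```python
import numpy as np
from arch import arch_model

# Synthetic monthly mortality-like series: AR(1) mean with GARCH(1,1) volatility.
rng = np.random.default_rng(9)
n = 400
y = np.zeros(n)
sig2 = np.ones(n)
eps = np.zeros(n)
for t in range(1, n):
    sig2[t] = 0.1 + 0.2 * eps[t - 1] ** 2 + 0.7 * sig2[t - 1]
    eps[t] = np.sqrt(sig2[t]) * rng.standard_normal()
    y[t] = 0.5 * y[t - 1] + eps[t]

# AR(1) conditional mean with GARCH(1,1) conditional variance, as in the abstract.
am = arch_model(y, mean="AR", lags=1, vol="GARCH", p=1, q=1)
res = am.fit(disp="off")
print(res.params)
```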

  7. Defense Applications of Signal Processing

    DTIC Science & Technology

    1999-08-27

    class of multiscale autoregressive moving average (MARMA) processes. These are generalisations of ARMA models in time series analysis, and they contain...including the two theoretical sinusoidal components. Analysis of the amplitude and frequency time series provided some novel insight into the real...communication channels, underwater acoustic signals, radar systems, economic time series and biomedical signals [7]. The alpha stable (aS) distribution has

  8. Time series models on analysing mortality rates and acute childhood lymphoid leukaemia.

    PubMed

    Kis, Maria

    2005-01-01

    In this paper we demonstrate the application of time series models in medical research. Hungarian mortality rates were analysed by autoregressive integrated moving average models, and seasonal time series models were used to examine data on acute childhood lymphoid leukaemia. Mortality data may be analysed by time series methods such as autoregressive integrated moving average (ARIMA) modelling. This method is demonstrated by two examples: analysis of the mortality rates of ischemic heart diseases and analysis of the mortality rates of cancer of the digestive system. Mathematical expressions are given for the results of the analysis. The relationships between time series of mortality rates were studied with ARIMA models. Confidence intervals for the autoregressive parameters were calculated by three methods: estimation based on the standard normal distribution, estimation based on White's theory, and continuous-time estimation. Analysing the confidence intervals of the first-order autoregressive parameters, we may conclude that the confidence intervals obtained by the continuous-time estimation model were much smaller than those of the other estimations. We present a new approach to analysing the occurrence of acute childhood lymphoid leukaemia. We decompose the time series into components. The periodicity of acute childhood lymphoid leukaemia in Hungary was examined using the seasonal decomposition time series method. The cyclic trend of the dates of diagnosis revealed that a higher percentage of the peaks fell within the winter months than in the other seasons. This proves the seasonal occurrence of childhood leukaemia in Hungary.

  9. Beyond long memory in heart rate variability: An approach based on fractionally integrated autoregressive moving average time series models with conditional heteroscedasticity

    NASA Astrophysics Data System (ADS)

    Leite, Argentina; Paula Rocha, Ana; Eduarda Silva, Maria

    2013-06-01

    Heart Rate Variability (HRV) series exhibit long memory and time-varying conditional variance. This work considers the Fractionally Integrated AutoRegressive Moving Average (ARFIMA) models with Generalized AutoRegressive Conditional Heteroscedastic (GARCH) errors. ARFIMA-GARCH models may be used to capture and remove long memory and estimate the conditional volatility in 24 h HRV recordings. The ARFIMA-GARCH approach is applied to fifteen long term HRV series available at Physionet, leading to the discrimination among normal individuals, heart failure patients, and patients with atrial fibrillation.

  10. Fast Algorithms for Mining Co-evolving Time Series

    DTIC Science & Technology

    2011-09-01

    Keogh et al., 2001, 2004] and (b) forecasting, like an autoregressive integrated moving average model (ARIMA) and related methods [Box et al., 1994...computing hardware? We develop models to mine time series with missing values, to extract compact representation from time sequences, to segment the...sequences, and to do forecasting. For large scale data, we propose algorithms for learning time series models, in particular, including Linear Dynamical

  11. Clinical time series prediction: Toward a hierarchical dynamical system framework.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2015-09-01

    Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Fluctuations in Cerebral Hemodynamics

    DTIC Science & Technology

    2003-12-01

    Determination of scaling properties: Detrended Fluctuation Analysis (see (28) and references therein) is commonly used to determine scaling...pressure (averaged over a cardiac beat) of a healthy subject. First 1000 values of the time series are shown. (b) Detrended fluctuation analysis of the time series shown in (a). Fig. 3: Side-by-side boxplot for the

  13. Unveiling signatures of interdecadal climate changes by Hilbert analysis

    NASA Astrophysics Data System (ADS)

    Zappalà, Dario; Barreiro, Marcelo; Masoller, Cristina

    2017-04-01

    A recent study demonstrated that, in a class of networks of oscillators, the optimal network reconstruction from dynamics is obtained when the similarity analysis is performed not on the original dynamical time series, but on transformed series obtained by Hilbert transform [1]. That motivated us to use the Hilbert transform to study another kind of (in a broad sense) "oscillating" series, such as temperature series. Indeed, we found that Hilbert analysis of SAT (Surface Air Temperature) time series uncovers meaningful information about climate and is therefore a promising tool for the study of other climatological variables [2]. In this work we analysed a large dataset of SAT series, performing the Hilbert transform and further analysis with the goal of finding signs of climate change during the analysed period. We used the publicly available ERA-Interim reanalysis dataset [3]. In particular, we worked on daily SAT time series, from 1979 to 2015, at 16380 points arranged over a regular grid on the Earth's surface. From each SAT time series we calculate the anomaly series and also, by using the Hilbert transform, the instantaneous amplitude and instantaneous frequency series. Our first approach is to calculate the relative variation: the difference between the average value over the last 10 years and the average value over the first 10 years, divided by the average value over the whole analysed period. We did these calculations on our transformed series, frequency and amplitude, using both average and standard deviation values. Furthermore, to compare with an established analysis method, we did the same calculations on the anomaly series. We plotted these results as maps, where the colour of each site indicates the value of its relative variation. Finally, to gain insight into the interpretation of our results on real SAT data, we generated synthetic sinusoidal series with various levels of additive noise. By applying Hilbert analysis to the synthetic data, we uncovered a clear trend between mean amplitude and mean frequency: as the noise level grows, the amplitude increases while the frequency decreases. Research funded in part by AGAUR (Generalitat de Catalunya), the EU LINC project (Grant No. 289447) and Spanish MINECO (FIS2015-66503-C3-2-P).
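
    Instantaneous amplitude and frequency series of the kind used here are commonly obtained from the analytic signal. A scipy sketch on a synthetic daily-temperature-like series (our construction, not the authors' pipeline):

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_amplitude_frequency(x, dt=1.0):
    """Instantaneous amplitude and frequency of a series via the Hilbert transform."""
    analytic = hilbert(x - np.mean(x))           # analytic signal of the zero-mean series
    amplitude = np.abs(analytic)
    phase = np.unwrap(np.angle(analytic))
    frequency = np.gradient(phase, dt) / (2 * np.pi)
    return amplitude, frequency

# Daily temperature-like series: annual cycle plus noise (period = 365 days).
rng = np.random.default_rng(10)
t = np.arange(10 * 365)
sat = 12 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 2, t.size)
amp, freq = instantaneous_amplitude_frequency(sat)
print(amp.mean(), 1 / freq[365:-365].mean())     # mean period close to 365 days
```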

  14. Large deviation probabilities for correlated Gaussian stochastic processes and daily temperature anomalies

    NASA Astrophysics Data System (ADS)

    Massah, Mozhdeh; Kantz, Holger

    2016-04-01

    As we have one and only one Earth and no replicas, climate characteristics are usually computed as time averages from a single time series. For understanding climate variability, it is essential to understand how close a single time average will typically be to an ensemble average. To answer this question, we study large deviation probabilities (LDP) of stochastic processes and characterize them by their dependence on the time window. In contrast to iid variables, for which there exists an analytical expression for the rate function, correlated variables such as auto-regressive (short memory) and auto-regressive fractionally integrated moving average (long memory) processes do not have an analytical LDP. We study the LDP for these processes in order to see how correlation affects this probability in comparison to iid data. Although short-range correlations lead to a simple correction of the sample size, long-range correlations lead to a sub-exponential decay of the LDP and hence to a very slow convergence of time averages. This effect is demonstrated for a 120-year-long time series of daily temperature anomalies measured in Potsdam (Germany).

  15. Trends in Average Living Children at the Time of Terminal Contraception: A Time Series Analysis Over 27 Years Using ARIMA (p, d, q) Nonseasonal Model.

    PubMed

    Mumbare, Sachin S; Gosavi, Shriram; Almale, Balaji; Patil, Aruna; Dhakane, Supriya; Kadu, Aniruddha

    2014-10-01

    India's National Family Welfare Programme is dominated by sterilization, particularly tubectomy. Sterilization, being a terminal method of contraception, decides the final number of children for that couple. Many studies have shown a declining trend in the average number of living children at the time of sterilization over short periods. So this study was planned to carry out a time series analysis of the average number of children at the time of terminal contraception, to produce forecasts for the same up to 2020, and to compare the rates of change in various subgroups of the population. Data were preprocessed in MS Access 2007 by creating and running SQL queries. After testing the stationarity of every series with the augmented Dickey-Fuller test, time series analysis and forecasting were done using the best-fit Box-Jenkins ARIMA (p, d, q) nonseasonal model. To compare the rates of change of the average number of children at sterilization in various subgroups, analysis of covariance (ANCOVA) was applied. Forecasting showed that the replacement-level total fertility rate (TFR) of 2.1 will be achieved in 2018 for couples opting for sterilization. The same will be achieved in 2020, 2016, 2018, and 2019 for rural areas, urban areas, Hindu couples, and Buddhist couples, respectively. It will not be achieved until 2020 in Muslim couples. Every stratum of the population showed the declining trend. The decline for male children and in rural areas was significantly faster than the decline for female children and in urban areas, respectively.
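
    The stated workflow, an augmented Dickey-Fuller stationarity test followed by a best-fit nonseasonal ARIMA(p,d,q), maps directly onto statsmodels; the order (0,1,1) and the synthetic series below are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.arima.model import ARIMA

# Yearly average-children-at-sterilization series (synthetic declining trend).
rng = np.random.default_rng(11)
years = np.arange(1986, 2013)
avg_children = 3.6 - 0.05 * (years - years[0]) + rng.normal(0, 0.05, years.size)

# Augmented Dickey-Fuller test for stationarity, as in the paper's workflow.
p_value = adfuller(avg_children)[1]
print(f"ADF p-value: {p_value:.3f}")  # a large p-value suggests differencing first

# Best-fit nonseasonal ARIMA(p,d,q); (0,1,1) is an assumed illustrative order.
fit = ARIMA(avg_children, order=(0, 1, 1)).fit()
print(fit.forecast(steps=8))          # extend the series toward 2020
```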

  16. Work-related accidents among the Iranian population: a time series analysis, 2000–2011

    PubMed Central

    Karimlou, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood

    2015-01-01

    Background Work-related accidents result in human suffering and economic losses and are considered a major health problem worldwide, especially in the economically developing world. Objectives To introduce seasonal autoregressive integrated moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. Methods In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box-Jenkins modeling to develop a time series model of the total number of accidents. Results There was an average of 1476 accidents per month (1476·05±458·77, mean±SD). The final ARIMA(p,d,q)(P,D,Q)s model fitted to the data was ARIMA(1,1,1)×(0,1,1)12, consisting of first-order autoregressive, moving average and seasonal moving average parameters, with a mean absolute percentage error (MAPE) of 20·942. Conclusions The final model showed that time series analysis with ARIMA models was useful for forecasting the number of work-related accidents in Iran. In addition, the forecasted number of work-related accidents for 2011 explained the stability of occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection. PMID:26119774

  17. Work-related accidents among the Iranian population: a time series analysis, 2000-2011.

    PubMed

    Karimlou, Masoud; Salehi, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood

    2015-01-01

    Work-related accidents result in human suffering and economic losses and are considered a major health problem worldwide, especially in the economically developing world. To introduce seasonal autoregressive integrated moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box-Jenkins modeling to develop a time series model of the total number of accidents. There was an average of 1476 accidents per month (1476·05±458·77, mean±SD). The final ARIMA(p,d,q)(P,D,Q)s model fitted to the data was ARIMA(1,1,1)×(0,1,1)12, consisting of first-order autoregressive, moving average and seasonal moving average parameters, with a mean absolute percentage error (MAPE) of 20·942. The final model showed that time series analysis with ARIMA models was useful for forecasting the number of work-related accidents in Iran. In addition, the forecasted number of work-related accidents for 2011 explained the stability of occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection.

  18. Estimation of Parameters from Discrete Random Nonstationary Time Series

    NASA Astrophysics Data System (ADS)

    Takayasu, H.; Nakamura, T.

    For the analysis of nonstationary stochastic time series we introduce a formulation to estimate the underlying time-dependent parameters. This method is designed for random events with small numbers that are out of the applicability range of the normal distribution. The method is demonstrated on numerical data generated by a known system, and applied to time series of traffic accidents, the batting average of a baseball player, and the sales volume of home electronics.

  19. Time Series in Education: The Analysis of Daily Attendance in Two High Schools

    ERIC Educational Resources Information Center

    Koopmans, Matthijs

    2011-01-01

    This presentation discusses the use of a time series approach to the analysis of daily attendance in two urban high schools over the course of one school year (2009-10). After establishing that the series for both schools were stationary, they were examined for moving average processes, autoregression, seasonal dependencies (weekly cycles),…

  20. Generalized seasonal autoregressive integrated moving average models for count data with application to malaria time series with low case numbers.

    PubMed

    Briët, Olivier J T; Amerasinghe, Priyanie H; Vounatsou, Penelope

    2013-01-01

    With the renewed drive towards malaria elimination, there is a need for improved surveillance tools. While time series analysis is an important tool for surveillance, prediction and for measuring interventions' impact, approximations by commonly used Gaussian methods are prone to inaccuracies when case counts are low. Therefore, statistical methods appropriate for count data are required, especially during "consolidation" and "pre-elimination" phases. Generalized autoregressive moving average (GARMA) models were extended to generalized seasonal autoregressive integrated moving average (GSARIMA) models for parsimonious observation-driven modelling of non-Gaussian, non-stationary and/or seasonal time series of count data. The models were applied to monthly malaria case time series in a district in Sri Lanka, where malaria has decreased dramatically in recent years. The malaria series showed long-term changes in the mean, unstable variance and seasonality. After fitting negative-binomial Bayesian models, both a GSARIMA and a GARIMA deterministic seasonality model were selected based on different criteria. Posterior predictive distributions indicated that negative-binomial models provided better predictions than Gaussian models, especially when counts were low. The G(S)ARIMA models were able to capture the autocorrelation in the series. G(S)ARIMA models may be particularly useful in the drive towards malaria elimination, since episode count series are often seasonal and non-stationary, especially when control is increased. Although building and fitting GSARIMA models is laborious, they may provide more realistic prediction distributions than do Gaussian methods and may be more suitable when counts are low.

  1. Generalized Seasonal Autoregressive Integrated Moving Average Models for Count Data with Application to Malaria Time Series with Low Case Numbers

    PubMed Central

    Briët, Olivier J. T.; Amerasinghe, Priyanie H.; Vounatsou, Penelope

    2013-01-01

    Introduction With the renewed drive towards malaria elimination, there is a need for improved surveillance tools. While time series analysis is an important tool for surveillance, prediction and for measuring interventions’ impact, approximations by commonly used Gaussian methods are prone to inaccuracies when case counts are low. Therefore, statistical methods appropriate for count data are required, especially during “consolidation” and “pre-elimination” phases. Methods Generalized autoregressive moving average (GARMA) models were extended to generalized seasonal autoregressive integrated moving average (GSARIMA) models for parsimonious observation-driven modelling of non-Gaussian, non-stationary and/or seasonal time series of count data. The models were applied to monthly malaria case time series in a district in Sri Lanka, where malaria has decreased dramatically in recent years. Results The malaria series showed long-term changes in the mean, unstable variance and seasonality. After fitting negative-binomial Bayesian models, both a GSARIMA and a GARIMA deterministic seasonality model were selected based on different criteria. Posterior predictive distributions indicated that negative-binomial models provided better predictions than Gaussian models, especially when counts were low. The G(S)ARIMA models were able to capture the autocorrelation in the series. Conclusions G(S)ARIMA models may be particularly useful in the drive towards malaria elimination, since episode count series are often seasonal and non-stationary, especially when control is increased. Although building and fitting GSARIMA models is laborious, they may provide more realistic prediction distributions than do Gaussian methods and may be more suitable when counts are low. PMID:23785448

  2. Degree-Pruning Dynamic Programming Approaches to Central Time Series Minimizing Dynamic Time Warping Distance.

    PubMed

    Sun, Tao; Liu, Hongbo; Yu, Hong; Chen, C L Philip

    2016-06-28

    The central time series crystallizes the common patterns of the set it represents. In this paper, we propose a globally constrained degree-pruning dynamic programming (g(dp)²) approach to obtain the central time series through minimizing the dynamic time warping (DTW) distance between two time series. The DTW matching-path theory with global constraints is proved for our degree-pruning strategy, which helps reduce the time complexity and computational cost. Our approach can achieve the optimal solution between two time series. An approximate method for the central time series of multiple time series [called m_g(dp)²] is presented based on DTW barycenter averaging and our g(dp)² approach, by considering a hierarchical merging strategy. As illustrated by the experimental results, our approaches provide better within-group sums of squares and robustness than other relevant algorithms.
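
    The DTW distance minimized here is computed by a standard dynamic program; the numpy sketch below is the textbook unconstrained recursion (the paper's contribution, the globally constrained degree pruning, is not reproduced).

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D series (squared-error local cost)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            # extend the cheapest of the three admissible predecessor paths
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(np.sqrt(D[n, m]))

x = np.sin(np.linspace(0, 2 * np.pi, 50))
y = np.sin(np.linspace(0, 2 * np.pi, 60) - 0.3)  # same shape, warped and shifted
print(dtw_distance(x, y))                         # small despite unequal lengths
```

    A global constraint such as a Sakoe-Chiba band would restrict |i - j| and prune most of the matrix, which is the kind of saving the paper's degree-pruning strategy formalizes.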

  3. Multiscale structure of time series revealed by the monotony spectrum.

    PubMed

    Vamoş, Călin

    2017-03-01

    Observation of complex systems produces time series with specific dynamics at different time scales. The majority of the existing numerical methods for multiscale analysis first decompose the time series into several simpler components and the multiscale structure is given by the properties of their components. We present a numerical method which describes the multiscale structure of arbitrary time series without decomposing them. It is based on the monotony spectrum defined as the variation of the mean amplitude of the monotonic segments with respect to the mean local time scale during successive averagings of the time series, the local time scales being the durations of the monotonic segments. The maxima of the monotony spectrum indicate the time scales which dominate the variations of the time series. We show that the monotony spectrum can correctly analyze a diversity of artificial time series and can discriminate the existence of deterministic variations at large time scales from the random fluctuations. As an application we analyze the multifractal structure of some hydrological time series.
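
    The building block of the monotony spectrum, splitting a series into maximal monotonic segments and recording their amplitudes and durations, is easy to sketch in numpy; tracing the full spectrum would repeat this under successive averagings, which is only noted in a comment. This is our illustration, not the author's code.

```python
import numpy as np

def monotonic_segments(x):
    """Split a series into maximal monotonic segments; return amplitudes and durations."""
    x = np.asarray(x, dtype=float)
    # indices where the sign of the local slope changes, i.e. turning points
    turning = np.where(np.diff(np.sign(np.diff(x))) != 0)[0] + 1
    bounds = np.concatenate(([0], turning, [len(x) - 1]))
    amplitudes = np.abs(np.diff(x[bounds]))
    durations = np.diff(bounds).astype(float)   # the "local time scales"
    return amplitudes, durations

# One point of the monotony spectrum: mean amplitude vs mean local time scale.
rng = np.random.default_rng(12)
series = np.cumsum(rng.standard_normal(5000))   # random-walk test signal
amp, dur = monotonic_segments(series)
print(amp.mean(), dur.mean())
# The spectrum is traced by repeating this after successive moving-average smoothings.
```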

  4. Temporal and long-term trend analysis of class C notifiable diseases in China from 2009 to 2014

    PubMed Central

    Zhang, Xingyu; Hou, Fengsu; Qiao, Zhijiao; Li, Xiaosong; Zhou, Lijun; Liu, Yuanyuan; Zhang, Tao

    2016-01-01

    Objectives Time series models are effective tools for disease forecasting. This study aims to explore the time series behaviour of 11 notifiable diseases in China and to predict their incidence through effective models. Settings and participants The Chinese Ministry of Health started to publish class C notifiable diseases in 2009. The monthly reported case time series of 11 infectious diseases from the surveillance system between 2009 and 2014 were collected. Methods We performed a descriptive and a time series study using the surveillance data. Decomposition methods were used to explore (1) their seasonality, expressed in the form of seasonal indices, and (2) their long-term trend, in the form of a linear regression model. Autoregressive integrated moving average (ARIMA) models were established for each disease. Results The number of cases and deaths caused by hand, foot and mouth disease ranks first among the detected diseases. It occurred most often in May and July and increased, on average, by 0.14126/100 000 per month. The remaining incidence models show good fit, except for the influenza and hydatid disease models. Both the hydatid disease and influenza series become white noise after differencing, so no available ARIMA model can be fitted for these two diseases. Conclusion Time series analysis of effective surveillance time series is useful for better understanding the occurrence of the 11 types of infectious disease. PMID:27797981

  5. Time-Series Analysis: Assessing the Effects of Multiple Educational Interventions in a Small-Enrollment Course

    NASA Astrophysics Data System (ADS)

    Warren, Aaron R.

    2009-11-01

    Time-series designs are an alternative to pretest-posttest methods that are able to identify and measure the impacts of multiple educational interventions, even for small student populations. Here, we use an instrument employing standard multiple-choice conceptual questions to collect data from students at regular intervals. The questions are modified by asking students to distribute 100 Confidence Points among the options in order to indicate the perceived likelihood of each answer option being the correct one. Tracking the class-averaged ratings for each option produces a set of time-series. ARIMA (autoregressive integrated moving average) analysis is then used to test for, and measure, changes in each series. In particular, it is possible to discern which educational interventions produce significant changes in class performance. Cluster analysis can also identify groups of students whose ratings evolve in similar ways. A brief overview of our methods and an example are presented.

  6. Statistical process control of mortality series in the Australian and New Zealand Intensive Care Society (ANZICS) adult patient database: implications of the data generating process.

    PubMed

    Moran, John L; Solomon, Patricia J

    2013-05-24

    Statistical process control (SPC), an industrial sphere initiative, has recently been applied in health care and public health surveillance. SPC methods assume independent observations, and process autocorrelation has been associated with an increase in false alarm frequency. Monthly mean raw mortality (at hospital discharge) time series, 1995-2009, at the individual Intensive Care unit (ICU) level, were generated from the Australia and New Zealand Intensive Care Society adult patient database. Evidence for series (i) autocorrelation and seasonality was demonstrated using (partial)-autocorrelation ((P)ACF) function displays and classical series decomposition and (ii) "in-control" status was sought using risk-adjusted (RA) exponentially weighted moving average (EWMA) control limits (3 sigma). Risk adjustment was achieved using a random coefficient (intercept as ICU site and slope as APACHE III score) logistic regression model, generating an expected mortality series. Application of time-series methods to an exemplar complete ICU series (1995-(end)2009) was via Box-Jenkins methodology: autoregressive moving average (ARMA) and (G)ARCH ((Generalised) Autoregressive Conditional Heteroscedasticity) models, the latter addressing volatility of the series variance. The overall data set, 1995-2009, consisted of 491324 records from 137 ICU sites; average raw mortality was 14.07%; average(SD) raw and expected mortalities ranged from 0.012(0.113) and 0.013(0.045) to 0.296(0.457) and 0.278(0.247) respectively. For the raw mortality series: 71 sites had continuous data for assessment up to or beyond lag40 and 35% had autocorrelation through to lag40; and of 36 sites with continuous data for ≥ 72 months, all demonstrated marked seasonality. Similar numbers and percentages were seen with the expected series. Out-of-control signalling was evident for the raw mortality series with respect to RA-EWMA control limits; a seasonal ARMA model, with GARCH effects, displayed white-noise residuals which were in-control with respect to EWMA control limits and one-step prediction error limits (3SE). The expected series was modelled with a multiplicative seasonal autoregressive model. The data generating process of monthly raw mortality series at the ICU level displayed autocorrelation, seasonality and volatility. False-positive signalling of the raw mortality series was evident with respect to RA-EWMA control limits. A time series approach using residual control charts resolved these issues.
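
    The risk-adjusted EWMA chart at the core of this study can be sketched as follows. This is a minimal illustration, assuming numpy arrays `observed` and `expected` of monthly mortality proportions (the expected series standing in for the risk-adjustment model); the smoothing constant and the variance estimate are illustrative choices, not the paper's.

    ```python
    import numpy as np

    def ra_ewma(observed, expected, lam=0.1, k=3.0):
        resid = observed - expected        # risk adjustment: chart the residual
        sigma = resid.std(ddof=1)
        z = np.empty_like(resid)
        z[0] = resid[0]
        for t in range(1, len(resid)):
            z[t] = lam * resid[t] + (1 - lam) * z[t - 1]
        t = np.arange(1, len(resid) + 1)
        # time-varying 3-sigma EWMA limits
        half_width = k * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        out_of_control = np.abs(z) > half_width
        return z, half_width, out_of_control
    ```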

  7. The Press Relations of a Local School District: An Analysis of the Emergence of School Issues.

    ERIC Educational Resources Information Center

    Morris, Jon R.; Guenter, Cornelius

    Press coverage of a suburban midwest school district is analyzed as a set of time series of observations including the amount and quality of coverage. Possible shifts in these series because of the emergence of controversial issues are analyzed statistically using the Integrated Moving Average Time Series Model. Evidence of significant shifts in…

  8. A univariate model of river water nitrate time series

    NASA Astrophysics Data System (ADS)

    Worrall, F.; Burt, T. P.

    1999-01-01

    Four time series were taken from three catchments in the North and South of England. The sites chosen included two in predominantly agricultural catchments, one at the tidal limit and one downstream of a sewage treatment works. A time series model was constructed for each of these series as a means of decomposing the elements controlling river water nitrate concentrations and to assess whether this approach could provide a simple management tool for protecting water abstractions. Autoregressive (AR) modelling of the detrended and deseasoned time series showed a "memory effect". This memory effect expressed itself as an increase in the winter-summer difference in nitrate levels that was dependent upon the nitrate concentration 12 or 6 months previously. Autoregressive moving average (ARMA) modelling showed that one of the series contained seasonal, non-stationary elements that appeared as an increasing trend in the winter-summer difference. The ARMA model was used to predict nitrate levels and predictions were tested against data held back from the model construction process - predictions gave average percentage errors of less than 10%. Empirical modelling can therefore provide a simple, efficient method for constructing management models for downstream water abstraction.
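
    The detrend/deseason-then-AR step can be sketched as below, assuming a pandas Series `nitrate` of monthly concentrations with a DatetimeIndex; the lag set {6, 12} mirrors the 6- and 12-month memory effect reported above, but the paper's exact specification is not reproduced.

    ```python
    import pandas as pd
    from statsmodels.tsa.ar_model import AutoReg

    def fit_nitrate_ar(nitrate: pd.Series):
        # remove trend with a centred 12-month moving average,
        # then remove the mean seasonal cycle (requires a DatetimeIndex)
        detrended = nitrate - nitrate.rolling(12, center=True).mean()
        deseasoned = detrended - detrended.groupby(detrended.index.month).transform("mean")
        y = deseasoned.dropna().to_numpy()
        fit = AutoReg(y, lags=[6, 12]).fit()   # "memory" at 6 and 12 months
        return fit.params, fit.predict(start=len(y), end=len(y) + 11)  # 12-step forecast
    ```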

  9. A novel weight determination method for time series data aggregation

    NASA Astrophysics Data System (ADS)

    Xu, Paiheng; Zhang, Rong; Deng, Yong

    2017-09-01

    Aggregation in time series is of great importance in time series smoothing, prediction and other time series analysis processes, which makes it crucial to determine the weights in time series correctly and reasonably. In this paper, a novel method to obtain the weights in time series is proposed, in which we adopt the induced ordered weighted aggregation (IOWA) operator and the visibility graph averaging (VGA) operator and linearly combine the weights separately generated by the two operators. The IOWA operator is introduced to the weight determination of time series, through which the time decay factor is taken into consideration. The VGA operator generates weights with respect to the degree distribution in the visibility graph constructed from the corresponding time series, which reflects the relative importance of vertices in the time series. The proposed method is applied to two practical datasets to illustrate its merits. The aggregation of the Construction Cost Index (CCI) demonstrates the ability of the proposed method to smooth time series, while the aggregation of the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) illustrates how the proposed method maintains the variation tendency of the original data.

  10. Ridge Polynomial Neural Network with Error Feedback for Time Series Forecasting

    PubMed Central

    Waheeb, Waddah; Ghazali, Rozaida; Herawan, Tutut

    2016-01-01

    Time series forecasting has gained much attention due to its many practical applications. Higher-order neural network with recurrent feedback is a powerful technique that has been used successfully for time series forecasting. It maintains fast learning and the ability to learn the dynamics of the time series over time. Network output feedback is the most common recurrent feedback for many recurrent neural network models. However, not much attention has been paid to the use of network error feedback instead of network output feedback. In this study, we propose a novel model, called Ridge Polynomial Neural Network with Error Feedback (RPNN-EF) that incorporates higher order terms, recurrence and error feedback. To evaluate the performance of RPNN-EF, we used four univariate time series with different forecasting horizons, namely star brightness, monthly smoothed sunspot numbers, daily Euro/Dollar exchange rate, and Mackey-Glass time-delay differential equation. We compared the forecasting performance of RPNN-EF with the ordinary Ridge Polynomial Neural Network (RPNN) and the Dynamic Ridge Polynomial Neural Network (DRPNN). Simulation results showed an average 23.34% improvement in Root Mean Square Error (RMSE) with respect to RPNN and an average 10.74% improvement with respect to DRPNN. That means that using network errors during training helps enhance the overall forecasting performance for the network. PMID:27959927

  11. Ridge Polynomial Neural Network with Error Feedback for Time Series Forecasting.

    PubMed

    Waheeb, Waddah; Ghazali, Rozaida; Herawan, Tutut

    2016-01-01

    Time series forecasting has gained much attention due to its many practical applications. Higher-order neural network with recurrent feedback is a powerful technique that has been used successfully for time series forecasting. It maintains fast learning and the ability to learn the dynamics of the time series over time. Network output feedback is the most common recurrent feedback for many recurrent neural network models. However, not much attention has been paid to the use of network error feedback instead of network output feedback. In this study, we propose a novel model, called Ridge Polynomial Neural Network with Error Feedback (RPNN-EF) that incorporates higher order terms, recurrence and error feedback. To evaluate the performance of RPNN-EF, we used four univariate time series with different forecasting horizons, namely star brightness, monthly smoothed sunspot numbers, daily Euro/Dollar exchange rate, and Mackey-Glass time-delay differential equation. We compared the forecasting performance of RPNN-EF with the ordinary Ridge Polynomial Neural Network (RPNN) and the Dynamic Ridge Polynomial Neural Network (DRPNN). Simulation results showed an average 23.34% improvement in Root Mean Square Error (RMSE) with respect to RPNN and an average 10.74% improvement with respect to DRPNN. That means that using network errors during training helps enhance the overall forecasting performance for the network.

  12. Wavelet analysis and scaling properties of time series

    NASA Astrophysics Data System (ADS)

    Manimaran, P.; Panigrahi, Prasanta K.; Parikh, Jitendra C.

    2005-10-01

    We propose a wavelet based method for the characterization of the scaling behavior of nonstationary time series. It makes use of the built-in ability of the wavelets for capturing the trends in a data set, in variable window sizes. Discrete wavelets from the Daubechies family are used to illustrate the efficacy of this procedure. After studying binomial multifractal time series with the present and earlier approaches of detrending for comparison, we analyze the time series of averaged spin density in the 2D Ising model at the critical temperature, along with several experimental data sets possessing multifractal behavior.
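
    A minimal sketch of wavelet-based detrending with PyWavelets, assuming the trend is captured by the coarse approximation coefficients at the chosen decomposition level; the wavelet and level are illustrative.

    ```python
    import numpy as np
    import pywt

    def wavelet_detrend(y, wavelet="db4", level=6):
        """Remove the trend carried by the level-`level` approximation."""
        coeffs = pywt.wavedec(y, wavelet, level=level)
        coeffs[0] = np.zeros_like(coeffs[0])             # zero out the approximation
        return pywt.waverec(coeffs, wavelet)[: len(y)]   # detrended fluctuations
    ```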

  13. Intercomparison of Recent Anomaly Time-Series of OLR as Observed by CERES and Computed Using AIRS Products

    NASA Technical Reports Server (NTRS)

    Susskind, Joel; Molnar, Gyula; Iredell, Lena; Loeb, Norman G.

    2011-01-01

    This paper compares recent spatial and temporal anomaly time series of OLR as observed by CERES and computed based on AIRS retrieved surface and atmospheric geophysical parameters over the 7 year time period September 2002 through February 2010. This time period is marked by a substantial decrease of OLR, on the order of +/-0.1 W/sq m/yr, averaged over the globe, and very large spatial variations of changes in OLR in the tropics, with local values ranging from -2.8 W/sq m/yr to +3.1 W/sq m/yr. Global and Tropical OLR both began to decrease significantly at the onset of a strong La Niña in mid-2007. Late 2009 is characterized by a strong El Niño, with a corresponding change in sign of both Tropical and Global OLR anomalies. The spatial patterns of the 7 year short term changes in AIRS and CERES OLR have a spatial correlation of 0.97 and slopes of the linear least squares fits of anomaly time series averaged over different spatial regions agree on the order of +/-0.01 W/sq m/yr. This essentially perfect agreement of OLR anomaly time series derived from observations by two different instruments, determined in totally independent and different manners, implies that both sets of results must be highly stable. This agreement also validates the anomaly time series of the AIRS derived products used to compute OLR and furthermore indicates that anomaly time series of AIRS derived products can be used to explain the factors contributing to anomaly time series of OLR.

  14. Kumaraswamy autoregressive moving average models for double bounded environmental data

    NASA Astrophysics Data System (ADS)

    Bayer, Fábio Mariano; Bayer, Débora Missio; Pumi, Guilherme

    2017-12-01

    In this paper we introduce the Kumaraswamy autoregressive moving average models (KARMA), which is a dynamic class of models for time series taking values in the double bounded interval (a,b) following the Kumaraswamy distribution. The Kumaraswamy family of distribution is widely applied in many areas, especially hydrology and related fields. Classical examples are time series representing rates and proportions observed over time. In the proposed KARMA model, the median is modeled by a dynamic structure containing autoregressive and moving average terms, time-varying regressors, unknown parameters and a link function. We introduce the new class of models and discuss conditional maximum likelihood estimation, hypothesis testing inference, diagnostic analysis and forecasting. In particular, we provide closed-form expressions for the conditional score vector and conditional Fisher information matrix. An application to environmental real data is presented and discussed.

  15. Clinical time series prediction: towards a hierarchical dynamical system framework

    PubMed Central

    Liu, Zitao; Hauskrecht, Milos

    2014-01-01

    Objective Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results We tested our framework by first learning the time series model from data for the patient in the training set, and then applying the model in order to predict future time series values on the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. Conclusion A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. PMID:25534671

  16. Measurement of cardiac output from dynamic pulmonary circulation time CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yee, Seonghwan, E-mail: Seonghwan.Yee@Beaumont.edu; Scalzetti, Ernest M.

    Purpose: To introduce a method of estimating cardiac output from the dynamic pulmonary circulation time CT that is primarily used to determine the optimal time window of CT pulmonary angiography (CTPA). Methods: Dynamic pulmonary circulation time CT series, acquired for eight patients, were retrospectively analyzed. The dynamic CT series was acquired, prior to the main CTPA, in cine mode (1 frame/s) for a single slice at the level of the main pulmonary artery covering the cross sections of the ascending aorta (AA) and descending aorta (DA) during the infusion of iodinated contrast. The time series of contrast changes obtained for DA, which is downstream of AA, was assumed to be related to the time series for AA by convolution with a delay function. The delay time constant in the delay function, representing the average time interval between the cross sections of AA and DA, was determined by least square error fitting between the convolved AA time series and the DA time series. The cardiac output was then calculated by dividing the volume of the aortic arch between the cross sections of AA and DA (estimated from the single slice CT image) by the average time interval, and multiplying the result by a correction factor. Results: The mean cardiac output value for the six patients was 5.11 (l/min) (with a standard deviation of 1.57 l/min), which is in good agreement with the literature value; the data for the other two patients were too noisy for processing. Conclusions: The dynamic single-slice pulmonary circulation time CT series also can be used to estimate cardiac output.
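
    The delay-constant fit can be sketched as below, assuming 1 frame/s sampling and a single-parameter exponential delay kernel (the paper's exact delay function is not given here); `aa` and `da` are the contrast time series for the two aortic cross sections.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    def fit_transit_time(aa, da, dt=1.0):
        """Least-squares fit of the AA-to-DA delay constant, in seconds."""
        t = np.arange(len(aa)) * dt

        def sse(tau):
            kernel = np.exp(-t / tau)                    # assumed delay function
            kernel /= kernel.sum()                       # unit area
            model = np.convolve(aa, kernel)[: len(da)]   # delayed AA series
            return np.sum((model - da) ** 2)

        res = minimize_scalar(sse, bounds=(0.1, 30.0), method="bounded")
        return res.x

    # Cardiac output would then be the aortic-arch volume between the AA and DA
    # cross sections divided by the fitted transit time, times a correction factor.
    ```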

  17. Moving Average Models with Bivariate Exponential and Geometric Distributions.

    DTIC Science & Technology

    1985-03-01

    ordinary time series and of point processes. Developments in Statistics, Vol. 1, P.R. Krishnaiah, ed. Academic Press, New York. [9] Esary, J.D. and...valued and discrete-valued time series with ARMA correlation structure. Multivariate Analysis V, P.R. Krishnaiah, ed. North-Holland. 151-166. [28

  18. Quantifying memory in complex physiological time-series.

    PubMed

    Shirazi, Amir H; Raoufy, Mohammad R; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R; Amodio, Piero; Jafari, G Reza; Montagnese, Sara; Mani, Ali R

    2013-01-01

    In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of "memory length" was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are 'forgotten' quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations.

  19. Quantifying Memory in Complex Physiological Time-Series

    PubMed Central

    Shirazi, Amir H.; Raoufy, Mohammad R.; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R.; Amodio, Piero; Jafari, G. Reza; Montagnese, Sara; Mani, Ali R.

    2013-01-01

    In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of “memory length” was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are ‘forgotten’ quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations. PMID:24039811

  20. Estimating Perturbation and Meta-Stability in the Daily Attendance Rates of Six Small High Schools

    NASA Astrophysics Data System (ADS)

    Koopmans, Matthijs

    This paper discusses the daily attendance rates in six small high schools over a ten-year period and evaluates how stable those rates are. “Stability” is approached from two vantage points: pulse models are fitted to estimate the impact of sudden perturbations and their reverberation through the series, and Autoregressive Fractionally Integrated Moving Average (ARFIMA) techniques are used to detect dependencies over the long range of the series. The analyses are meant to (1) exemplify the utility of time series approaches in educational research, which lacks a time series tradition, (2) discuss some time series features that seem to be particular to daily attendance rate trajectories such as the distinct downward pull coming from extreme observations, and (3) present an analytical approach to handle the important yet distinct patterns of variability that can be found in these data. The analysis also illustrates why the assumption of stability that underlies the habitual reporting of weekly, monthly and yearly averages in the educational literature is questionable, as it reveals dynamical processes (perturbation, meta-stability) that remain hidden in such summaries.

  1. Statistical process control of mortality series in the Australian and New Zealand Intensive Care Society (ANZICS) adult patient database: implications of the data generating process

    PubMed Central

    2013-01-01

    Background Statistical process control (SPC), an industrial sphere initiative, has recently been applied in health care and public health surveillance. SPC methods assume independent observations, and process autocorrelation has been associated with an increase in false alarm frequency. Methods Monthly mean raw mortality (at hospital discharge) time series, 1995–2009, at the individual Intensive Care unit (ICU) level, were generated from the Australia and New Zealand Intensive Care Society adult patient database. Evidence for series (i) autocorrelation and seasonality was demonstrated using (partial)-autocorrelation ((P)ACF) function displays and classical series decomposition and (ii) “in-control” status was sought using risk-adjusted (RA) exponentially weighted moving average (EWMA) control limits (3 sigma). Risk adjustment was achieved using a random coefficient (intercept as ICU site and slope as APACHE III score) logistic regression model, generating an expected mortality series. Application of time-series methods to an exemplar complete ICU series (1995-(end)2009) was via Box-Jenkins methodology: autoregressive moving average (ARMA) and (G)ARCH ((Generalised) Autoregressive Conditional Heteroscedasticity) models, the latter addressing volatility of the series variance. Results The overall data set, 1995-2009, consisted of 491324 records from 137 ICU sites; average raw mortality was 14.07%; average(SD) raw and expected mortalities ranged from 0.012(0.113) and 0.013(0.045) to 0.296(0.457) and 0.278(0.247) respectively. For the raw mortality series: 71 sites had continuous data for assessment up to or beyond lag40 and 35% had autocorrelation through to lag40; and of 36 sites with continuous data for ≥ 72 months, all demonstrated marked seasonality. Similar numbers and percentages were seen with the expected series. Out-of-control signalling was evident for the raw mortality series with respect to RA-EWMA control limits; a seasonal ARMA model, with GARCH effects, displayed white-noise residuals which were in-control with respect to EWMA control limits and one-step prediction error limits (3SE). The expected series was modelled with a multiplicative seasonal autoregressive model. Conclusions The data generating process of monthly raw mortality series at the ICU level displayed autocorrelation, seasonality and volatility. False-positive signalling of the raw mortality series was evident with respect to RA-EWMA control limits. A time series approach using residual control charts resolved these issues. PMID:23705957

  2. Prediction of South China sea level using seasonal ARIMA models

    NASA Astrophysics Data System (ADS)

    Fernandez, Flerida Regine; Po, Rodolfo; Montero, Neil; Addawe, Rizavel

    2017-11-01

    Accelerating sea level rise is an indicator of global warming and poses a threat to low-lying places and coastal countries. This study aims to fit a Seasonal Autoregressive Integrated Moving Average (SARIMA) model to the time series obtained from the TOPEX and Jason series of satellite radar altimeters over the South China Sea from the year 2008 to 2015. With altimetric measurements taken in a 10-day repeat cycle, monthly averages of the satellite altimetry measurements were computed to compose the data set used in the study. SARIMA models were then fitted to the time series in order to find the best-fit model. Results show that the SARIMA(1,0,0)(0,1,1)_12 model best fits the time series; it was used to forecast the values for January 2016 to December 2016. The 12-month forecast using SARIMA(1,0,0)(0,1,1)_12 shows that the sea level gradually increases from January to September 2016, and decreases until December 2016.
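
    A minimal sketch of the best-fit model above using statsmodels, assuming a pandas Series `sea_level` of the monthly mean altimetry heights for 2008-2015.

    ```python
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    def forecast_next_year(sea_level):
        """sea_level: monthly mean altimetry series, 2008-2015."""
        model = SARIMAX(sea_level, order=(1, 0, 0), seasonal_order=(0, 1, 1, 12))
        fit = model.fit(disp=False)
        return fit.forecast(steps=12)   # January-December 2016
    ```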

  3. Associations of daily pediatric asthma emergency department visits with air pollution in Newark, NJ: utilizing time-series and case-crossover study designs.

    PubMed

    Gleason, Jessie A; Fagliano, Jerald A

    2015-10-01

    Asthma is one of the most common chronic diseases affecting children. This study assesses the associations of ozone and fine particulate matter (PM2.5) with pediatric emergency department visits in the urban environment of Newark, NJ. Two study designs were utilized and evaluated for usability. We obtained daily emergency department visits among children aged 3-17 years with a primary diagnosis of asthma during April to September for 2004-2007. Both a time-stratified case-crossover study design with bi-directional control sampling and a time-series study design were utilized. Lagged effects (1-d through 5-d lag, 3-d average, and 5-d average) of ozone and PM2.5 were explored and a dose-response analysis comparing the bottom 5th percentile of 3-d average lag ozone with each 5 percentile increase was performed. Associations of an interquartile-range increase in same-day ozone were similar between the time-series and case-crossover study designs (RR = 1.08, 95% CI 1.04-1.12) and (OR = 1.10, 95% CI 1.06-1.14), respectively. Similar associations were seen for 1-day lag and 3-day average lag ozone levels. PM2.5 was not associated with the outcome in either study design. Dose-response assessment indicated a statistically significant and increasing association around 50-55 ppb, consistent for both study designs. Ozone was statistically positively associated with pediatric asthma ED visits in Newark, NJ. Our results were generally comparable across the time-series and case-crossover study designs, indicating that both are useful for assessing local air pollution impacts.

  4. Variable diffusion in stock market fluctuations

    NASA Astrophysics Data System (ADS)

    Hua, Jia-Chen; Chen, Lijian; Falcon, Liberty; McCauley, Joseph L.; Gunaratne, Gemunu H.

    2015-02-01

    We analyze intraday fluctuations in several stock indices to investigate the underlying stochastic processes using techniques appropriate for processes with nonstationary increments. The five most actively traded stocks each contain two time intervals during the day where the variance of increments can be fit by power-law scaling in time. The fluctuations in return within these intervals follow asymptotic bi-exponential distributions. The autocorrelation function for increments vanishes rapidly, but decays slowly for absolute and squared increments. Based on these results, we propose an intraday stochastic model with a linear variable diffusion coefficient as a lowest-order approximation to the real dynamics of financial markets, and use it to test the effects of the time-averaging techniques typically used in financial time series analysis. We find that our model replicates major stylized facts associated with empirical financial time series. We also find that ensemble averaging techniques can be used to identify the underlying dynamics correctly, whereas time averages fail in this task. Our work indicates that ensemble average approaches will yield new insight into the study of financial markets' dynamics. Our proposed model also provides new insight into the modeling of financial market dynamics at microscopic time scales.

  5. A general statistical test for correlations in a finite-length time series.

    PubMed

    Hanson, Jeffery A; Yang, Haw

    2008-06-07

    The statistical properties of the autocorrelation function from a time series composed of independently and identically distributed stochastic variables have been studied. Analytical expressions for the autocorrelation function's variance have been derived. It has been found that two common ways of calculating the autocorrelation, moving-average and Fourier transform, exhibit different uncertainty characteristics. For periodic time series, the Fourier transform method is preferred because it gives smaller uncertainties that are uniform through all time lags. Based on these analytical results, a statistically robust method has been proposed to test the existence of correlations in a time series. The statistical test is verified by computer simulations and an application to single-molecule fluorescence spectroscopy is discussed.
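
    The two estimators compared above can be sketched as follows for a series `y`; the normalization choices are illustrative, and the FFT route zero-pads to avoid circular wrap-around.

    ```python
    import numpy as np

    def acf_direct(y, max_lag):
        """Moving-average (direct lag-product) estimate of the autocovariance."""
        y = y - y.mean()
        n = len(y)
        return np.array([np.dot(y[: n - k], y[k:]) / n for k in range(max_lag + 1)])

    def acf_fft(y, max_lag):
        """Fourier-transform estimate; zero-padding avoids circular wrap-around."""
        y = y - y.mean()
        n = len(y)
        f = np.fft.rfft(y, 2 * n)
        return np.fft.irfft(f * np.conj(f))[: max_lag + 1] / n
    ```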

  6. NCEP Air Quality Forecast(AQF) Verification. NOAA/NWS/NCEP/EMC

    Science.gov Websites

    [Interactive-page residue; the page lists verification products: Day 1 and Day 2 AOD skill for all thresholds, time series for AOD > 0, and diurnal plots for AOD > 0.]

  7. Smoothing strategies combined with ARIMA and neural networks to improve the forecasting of traffic accidents.

    PubMed

    Barba, Lida; Rodríguez, Nibaldo; Montt, Cecilia

    2014-01-01

    Two smoothing strategies combined with autoregressive integrated moving average (ARIMA) and autoregressive neural networks (ANNs) models to improve the forecasting of time series are presented. The strategy of forecasting is implemented using two stages. In the first stage the time series is smoothed using either, 3-point moving average smoothing, or singular value decomposition of the Hankel matrix (HSVD). In the second stage, an ARIMA model and two ANNs for one-step-ahead time series forecasting are used. The coefficients of the first ANN are estimated through the particle swarm optimization (PSO) learning algorithm, while the coefficients of the second ANN are estimated with the resilient backpropagation (RPROP) learning algorithm. The proposed models are evaluated using a weekly time series of traffic accidents from the Valparaíso region of Chile, from 2003 to 2012. The best result is given by the combination HSVD-ARIMA, with a MAPE of 0.26%, followed by MA-ARIMA with a MAPE of 1.12%; the worst result is given by the MA-ANN based on PSO with a MAPE of 15.51%.
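
    The first of the two-stage strategies (3-point moving average smoothing followed by ARIMA) can be sketched as below, assuming a pandas Series `accidents` of weekly counts; the ARIMA order is a placeholder rather than the tuned model of the paper.

    ```python
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    def ma_arima_forecast(accidents: pd.Series):
        smoothed = accidents.rolling(3, center=True).mean().dropna()  # 3-point MA
        fit = ARIMA(smoothed, order=(2, 1, 1)).fit()                  # placeholder order
        return fit.forecast(steps=1)                                  # one-step-ahead
    ```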

  8. From standard alpha-stable Lévy motions to horizontal visibility networks: dependence of multifractal and Laplacian spectrum

    NASA Astrophysics Data System (ADS)

    Zou, Hai-Long; Yu, Zu-Guo; Anh, Vo; Ma, Yuan-Lin

    2018-05-01

    In recent years, researchers have proposed several methods to transform time series (such as those of fractional Brownian motion) into complex networks. In this paper, we construct horizontal visibility networks (HVNs) based on the α-stable Lévy motion. We aim to study how the multifractal and Laplacian spectra of the transformed networks depend on the parameters α and β of the α-stable Lévy motion. First, we employ the sandbox algorithm to compute the mass exponents and multifractal spectrum to investigate the multifractality of these HVNs. Then we perform least squares fits to find possible relations of the average fractal dimension, the average information dimension and the average correlation dimension against α, using several methods of model selection. We also investigate possible dependence relations of the eigenvalues and energy on α, calculated from the Laplacian and normalized Laplacian operators of the constructed HVNs. All of these constructions and estimates will help us to evaluate the validity and usefulness of the mappings between time series and networks, especially between time series of α-stable Lévy motions and HVNs.
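
    A minimal sketch of the transformation step, assuming SciPy's levy_stable generator and a direct O(n²) horizontal-visibility construction; the parameter values are illustrative.

    ```python
    import numpy as np
    from scipy.stats import levy_stable

    def hvg_edges(y):
        """Horizontal visibility: i and j see each other if every point between
        them lies strictly below min(y[i], y[j])."""
        edges = []
        n = len(y)
        for i in range(n - 1):
            blocker = -np.inf                    # highest point seen so far between i and j
            for j in range(i + 1, n):
                if blocker < min(y[i], y[j]):
                    edges.append((i, j))
                if y[j] >= y[i]:                 # y[j] blocks everything beyond it
                    break
                blocker = max(blocker, y[j])
        return edges

    increments = levy_stable.rvs(alpha=1.5, beta=0.0, size=2000, random_state=0)
    motion = np.cumsum(increments)               # alpha-stable Levy motion
    degrees = np.bincount(np.ravel(hvg_edges(motion)), minlength=len(motion))
    ```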

  9. Recurrence plots revisited

    NASA Astrophysics Data System (ADS)

    Casdagli, M. C.

    1997-09-01

    We show that recurrence plots (RPs) give detailed characterizations of time series generated by dynamical systems driven by slowly varying external forces. For deterministic systems we show that RPs of the time series can be used to reconstruct the RP of the driving force if it varies sufficiently slowly. If the driving force is one-dimensional, its functional form can then be inferred up to an invertible coordinate transformation. The same results hold for stochastic systems if the RP of the time series is suitably averaged and transformed. These results are used to investigate the nonlinear prediction of time series generated by dynamical systems driven by slowly varying external forces. We also consider the problem of detecting a small change in the driving force, and propose a surrogate data technique for assessing statistical significance. Numerically simulated time series and a time series of respiration rates recorded from a subject with sleep apnea are used as illustrative examples.
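
    A recurrence plot reduces to a thresholded pairwise-distance matrix of the delay-embedded trajectory. The following is a minimal sketch; the embedding dimension, delay and threshold fraction are illustrative choices, not those of the paper.

    ```python
    import numpy as np

    def recurrence_plot(y, dim=3, tau=5, eps_frac=0.1):
        n = len(y) - (dim - 1) * tau
        emb = np.column_stack([y[i * tau : i * tau + n] for i in range(dim)])
        dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
        eps = eps_frac * dist.max()         # threshold as a fraction of max distance
        return (dist < eps).astype(int)     # 1 where the trajectory recurs
    ```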

  10. Closed-Loop Optimal Control Implementations for Space Applications

    DTIC Science & Technology

    2016-12-01

    Through the analyses of a series of optimal control problems, several real-time optimal control algorithms are developed that continuously adapt to feedback on the…

  11. Dst and a map of average equivalent ring current: 1958-2007

    NASA Astrophysics Data System (ADS)

    Love, J. J.

    2008-12-01

    A new Dst index construction is made using the original hourly magnetic-observatory data collected over the years 1958-2007; stations: Hermanus South Africa, Kakioka Japan, Honolulu Hawaii, and San Juan Puerto Rico. The construction method we use is generally consistent with the algorithm defined by Sugiura (1964), and which forms the basis for the standard Kyoto Dst index. This involves corrections for observatory baseline shifts, subtraction of the main-field secular variation, and subtraction of specific harmonics that approximate the solar-quiet (Sq) variation. Fourier analysis of the observatory data reveals the nature of Sq: it consists primarily of periodic variation driven by the Earth's rotation, the Moon's orbit, the Earth's orbit, and, to some extent, the solar cycle. Cross coupling of the harmonics associated with each of the external periodic driving forces results in a seemingly complicated Sq time series that is sometimes considered to be relatively random and unpredictable, but which is, in fact, well described in terms of Fourier series. Working in the frequency domain, Sq can be filtered out, and, upon return to the time domain, the local disturbance time series (Dist) for each observatory can be recovered. After averaging the local disturbance time series from each observatory, the global magnetic disturbance time series Dst is obtained. Analysis of this new Dst index is compared with that produced by Kyoto, and various biases and differences are discussed. The combination of the Dist and Dst time series can be used to explore the local-time/universal-time symmetry of an equivalent ring current. Individual magnetic storms can have a complicated disturbance field that is asymmetrical in longitude, presumably due to partial ring currents. Using 50 years of data we map the average local-time magnetic disturbance, finding that it is very nearly proportional to Dst. To our surprise, the primary asymmetry in mean magnetic disturbance is not between midnight and noon, but rather between dawn and dusk, with greatest mean disturbance occurring at dusk. As a result, proposed corrections to Dst for magnetopause and tail currents might be reasonably reconsidered.

  12. Analysis of Zenith Tropospheric Delay above Europe based on long time series derived from the EPN data

    NASA Astrophysics Data System (ADS)

    Baldysz, Zofia; Nykiel, Grzegorz; Figurski, Mariusz; Szafranek, Karolina; Kroszczynski, Krzysztof; Araszkiewicz, Andrzej

    2015-04-01

    In recent years, GNSS has begun to play an increasingly important role in research related to climate monitoring. Based on the GPS system, which has the longest operational record in comparison with other systems, and a common computational strategy applied to all observations, long and homogeneous ZTD (Zenith Tropospheric Delay) time series were derived. This paper presents results of an analysis of 16-year ZTD time series obtained from the EPN (EUREF Permanent Network) reprocessing performed by the Military University of Technology. To maintain the uniformity of the data, the analyzed period (1998-2013) is exactly the same for all stations - observations carried out before 1998 were removed from the time series, and observations processed using a different strategy were recalculated according to the MUT LAC approach. For all 16-year time series (59 stations), Lomb-Scargle periodograms were created to obtain information about the oscillations in the ZTD time series. Because strong annual oscillations disturb the character of oscillations with smaller amplitudes and thus hinder their investigation, Lomb-Scargle periodograms were also created for time series with the annual oscillation removed, in order to verify the presence of semi-annual, ter-annual and quarto-annual oscillations. Linear trends and seasonal components were estimated using LSE (Least Squares Estimation), and the Mann-Kendall trend test was used to confirm the presence of the linear trends designated by the LSE method. To assess the effect of the length of the time series on the estimated size of the linear trend, ZTD time series of two different lengths were compared. For this comparative analysis, 30 stations which have been operating since 1996 were selected. For these stations two periods were analyzed: a shortened 16-year period (1998-2013) and the full 18-year period (1996-2013). For some stations the additional two years of observations had a significant impact on the size of the linear trend - only for 4 stations was the size of the linear trend exactly the same for the two periods. In one case, the trend changed from negative (16-year time series) to positive (18-year time series). The average value of the linear trends for the 16-year time series is 1.5 mm/decade, but their spatial distribution is not uniform. The average value of the linear trends for the 18-year time series is 2.0 mm/decade, with a better spatial distribution and smaller discrepancies.
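
    The periodogram step can be sketched with SciPy, assuming observation times `t_years` (in years) and a gap-free ZTD array `ztd`; the trial period grid is illustrative.

    ```python
    import numpy as np
    from scipy.signal import lombscargle

    def ztd_periodogram(t_years, ztd):
        periods = np.linspace(0.1, 2.0, 500)       # trial periods in years
        omega = 2 * np.pi / periods                # scipy expects angular frequencies
        power = lombscargle(t_years, ztd - ztd.mean(), omega)
        return periods, power                      # annual peak expected near 1.0 year
    ```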

  13. Comparing the performance of FA, DFA and DMA using different synthetic long-range correlated time series

    PubMed Central

    Shao, Ying-Hui; Gu, Gao-Feng; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Sornette, Didier

    2012-01-01

    Notwithstanding the significant efforts to develop estimators of long-range correlations (LRC) and to compare their performance, no clear consensus exists on what is the best method and under which conditions. In addition, synthetic tests suggest that the performance of LRC estimators varies when using different generators of LRC time series. Here, we compare the performances of four estimators [Fluctuation Analysis (FA), Detrended Fluctuation Analysis (DFA), Backward Detrending Moving Average (BDMA), and Centred Detrending Moving Average (CDMA)]. We use three different generators [Fractional Gaussian Noises, and two ways of generating Fractional Brownian Motions]. We find that CDMA has the best performance and DFA is only slightly worse in some situations, while FA performs the worst. In addition, CDMA and DFA are less sensitive to the scaling range than FA. Hence, CDMA and DFA remain “The Methods of Choice” in determining the Hurst index of time series. PMID:23150785
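
    For reference, DFA with linear detrending, one of the estimators compared above, can be sketched as follows; the window sizes and the input series are assumptions of the example.

    ```python
    import numpy as np

    def dfa(y, scales):
        """e.g. dfa(signal, scales=2 ** np.arange(4, 10))"""
        profile = np.cumsum(y - np.mean(y))
        F = []
        for s in scales:
            n_win = len(profile) // s
            segs = profile[: n_win * s].reshape(n_win, s)
            x = np.arange(s)
            rms = []
            for seg in segs:
                coef = np.polyfit(x, seg, 1)            # local linear trend
                rms.append(np.mean((seg - np.polyval(coef, x)) ** 2))
            F.append(np.sqrt(np.mean(rms)))
        # Hurst-like exponent: slope of log F(s) against log s
        h = np.polyfit(np.log(scales), np.log(F), 1)[0]
        return np.array(F), h
    ```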

  14. A New Trend-Following Indicator: Using SSA to Design Trading Rules

    NASA Astrophysics Data System (ADS)

    Leles, Michel Carlo Rodrigues; Mozelli, Leonardo Amaral; Guimarães, Homero Nogueira

    Singular Spectrum Analysis (SSA) is a non-parametric approach that can be used to decompose a time-series into trends, oscillations and noise. Trend-following strategies rely on the principle that financial markets move in trends for an extended period of time. Moving Averages (MAs) are the standard indicator used to design such strategies. In this study, SSA is used as an alternative method to enhance trend resolution in comparison with the traditional MA. New trading rules using SSA as the indicator are proposed. This paper shows that for the Dow Jones Industrial Average (DJIA) and Shanghai Securities Composite Index (SSCI) time-series, the SSA trading rules provided, in general, better results than MA trading rules.
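
    A minimal sketch of SSA trend extraction (embedding, SVD, reconstruction by anti-diagonal averaging), assuming the trend lives in the leading components; the window length and number of components kept are illustrative. A trading rule in the spirit of the paper would then, for example, go long when the price crosses above this trend.

    ```python
    import numpy as np

    def ssa_trend(y, window=50, n_components=2):
        n = len(y)
        k = n - window + 1
        X = np.column_stack([y[i : i + window] for i in range(k)])  # trajectory matrix
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        Xr = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
        # anti-diagonal averaging back to a series
        trend = np.zeros(n)
        counts = np.zeros(n)
        for i in range(window):
            for j in range(k):
                trend[i + j] += Xr[i, j]
                counts[i + j] += 1
        return trend / counts
    ```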

  15. A Comparison of Alternative Approaches to the Analysis of Interrupted Time-Series.

    ERIC Educational Resources Information Center

    Harrop, John W.; Velicer, Wayne F.

    1985-01-01

    Computer-generated data representative of 16 autoregressive integrated moving average (ARIMA) models were used to compare the results of interrupted time-series analysis using: (1) the known model identification, (2) an assumed (1,0,0) model, and (3) an assumed (3,0,0) model as an approximation to the General Transformation approach. (Author/BW)

  16. Base Stability of Aminocyclopropeniums

    DTIC Science & Technology

    2017-11-01

    A series of aminocyclopropeniums were synthesized and their base stability probed in situ using time-resolved proton nuclear magnetic resonance; the compounds were tested for their utility in anion exchange membranes for alkaline fuel cells.

  17. Large-deviation probabilities for correlated Gaussian processes and intermittent dynamical systems

    NASA Astrophysics Data System (ADS)

    Massah, Mozhdeh; Nicol, Matthew; Kantz, Holger

    2018-05-01

    In its classical version, the theory of large deviations makes quantitative statements about the probability of outliers when estimating time averages, if the time series data are independent and identically distributed. We study large-deviation probabilities (LDPs) for time averages in short- and long-range correlated Gaussian processes and show that long-range correlations lead to subexponential decay of LDPs. A particular deterministic intermittent map can, depending on a control parameter, also generate long-range correlated time series. We illustrate numerically, in agreement with the mathematical literature, that this type of intermittency leads to a power law decay of LDPs. The power law decay holds irrespective of whether the correlation time is finite or infinite, and hence irrespective of whether the central limit theorem applies or not.

  18. Characterising experimental time series using local intrinsic dimension

    NASA Astrophysics Data System (ADS)

    Buzug, Thorsten M.; von Stamm, Jens; Pfister, Gerd

    1995-02-01

    Experimental strange attractors are analysed with the averaged local intrinsic dimension proposed by A. Passamante et al. [Phys. Rev. A 39 (1989) 3640] which is based on singular value decomposition of local trajectory matrices. The results are compared to the values of Kaplan-Yorke and the correlation dimension. The attractors, reconstructed with Takens' delay time coordinates from scalar velocity time series, are measured in the hydrodynamic Taylor-Couette system. A period doubling route towards chaos obtained from a very short Taylor-Couette cylinder yields a sequence of experimental time series where the local intrinsic dimension is applied.

  19. Predicting long-term catchment nutrient export: the use of nonlinear time series models

    NASA Astrophysics Data System (ADS)

    Valent, Peter; Howden, Nicholas J. K.; Szolgay, Jan; Komornikova, Magda

    2010-05-01

    After the Second World War the nitrate concentrations in European water bodies changed significantly as a result of increased nitrogen fertilizer use and changes in land use. In recent decades, however, as a consequence of the implementation of nitrate-reducing measures in Europe, nitrate concentrations in water bodies have been slowly decreasing. As a result, the mean and variance of the observed time series also change with time (nonstationarity and heteroscedasticity). Linear models (such as autoregressive (AR), moving average (MA) and autoregressive moving average (ARMA) models) are therefore no longer suitable for detecting changes and properly describing the behaviour of such time series. Time series with sudden changes in statistical characteristics can cause various problems in the calibration of traditional water quality models and thus give biased predictions. Proper statistical analysis of these non-stationary and heteroscedastic time series, with the aim of detecting and subsequently explaining the variations in their statistical characteristics, requires the use of nonlinear time series models. This information can then be used to improve the model building and calibration of conceptual water quality models, or to select the right calibration periods in order to produce reliable predictions. The objective of this contribution is to analyze two long time series of nitrate concentrations of the rivers Ouse and Stour with advanced nonlinear statistical modelling techniques and to compare their performance with traditional linear models of the ARMA class, in order to identify changes in the time series characteristics. The time series were analysed with nonlinear models with multiple regimes, represented by self-exciting threshold autoregressive (SETAR) and Markov-switching (MSW) models. The analysis showed that, based on the value of the residual sum of squares (RSS), SETAR and MSW models described both time series better than models of the ARMA class. In most cases the relative improvement of SETAR models over first-order AR models was low, ranging between 1% and 4%, with the exception of the three-regime model for the River Stour time series, where the improvement was 48.9%. In comparison, the relative improvement of MSW models was between 44.6% and 52.5% for two-regime models and from 60.4% to 75% for three-regime models. However, visual assessment of the models plotted against the original datasets showed that, despite a higher RSS, some ARMA models could describe the analyzed time series better than AR, MA and SETAR models with lower RSS values. In both datasets MSW models provided a very good visual fit, describing most of the extreme values.
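
    A two-regime SETAR(1) fit by grid search over the threshold can be sketched as below for a numpy array `y` of concentrations; the paper's models have richer specifications, so this is only an illustration of the technique.

    ```python
    import numpy as np

    def setar_two_regime(y):
        """SETAR with AR(1) in each regime; regime chosen by y[t-1] vs threshold."""
        x, target = y[:-1], y[1:]
        best_rss, best_r = np.inf, None
        # search thresholds over interior quantiles so each regime keeps data
        for r in np.quantile(x, np.linspace(0.15, 0.85, 50)):
            rss = 0.0
            for mask in (x <= r, x > r):
                X = np.column_stack([np.ones(mask.sum()), x[mask]])
                beta, *_ = np.linalg.lstsq(X, target[mask], rcond=None)
                resid = target[mask] - X @ beta
                rss += resid @ resid
            if rss < best_rss:
                best_rss, best_r = rss, r
        return best_r, best_rss
    ```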

  20. Predicting Energy Consumption for Potential Effective Use in Hybrid Vehicle Powertrain Management Using Driver Prediction

    NASA Astrophysics Data System (ADS)

    Magnuson, Brian

    A proof-of-concept software-in-the-loop study is performed to assess the accuracy of predicted net and charge-gaining energy consumption for potential effective use in optimizing powertrain management of hybrid vehicles. With promising results of improving fuel efficiency of a thermostatic control strategy for a series plug-in hybrid-electric vehicle by 8.24%, the route and speed prediction machine learning algorithms are redesigned and implemented for real-world testing in a stand-alone C++ code-base to ingest map data, learn and predict driver habits, and store driver data for fast startup and shutdown of the controller or computer used to execute the compiled algorithm. Speed prediction is performed using a multi-layer, multi-input, multi-output neural network using feed-forward prediction and gradient descent through back-propagation training. Route prediction utilizes a Hidden Markov Model with a recurrent forward algorithm for prediction and multi-dimensional hash maps to store state and state distribution constraining associations between atomic road segments and end destinations. Predicted energy is calculated using the predicted time-series speed and elevation profile over the predicted route and the road-load equation. Testing of the code-base is performed over a known road network spanning 24x35 blocks on the south hill of Spokane, Washington. A large set of training routes are traversed once to add randomness to the route prediction algorithm, and a subset of the training routes, testing routes, are traversed to assess the accuracy of the net and charge-gaining predicted energy consumption. Each test route is traveled a random number of times with varying speed conditions from traffic and pedestrians to add randomness to speed prediction. Prediction data is stored and analyzed in a post-process Matlab script. The aggregated results and analysis of all traversals of all test routes reflect the performance of the Driver Prediction algorithm. The error of average energy gained through charge-gaining events is 31.3% and the error of average net energy consumed is 27.3%. The average delta and average standard deviation of the delta of predicted energy gained through charge-gaining events is 0.639 and 0.601 Wh respectively for individual time-series calculations. Similarly, the average delta and average standard deviation of the delta of the predicted net energy consumed is 0.567 and 0.580 Wh respectively for individual time-series calculations. The average delta and standard deviation of the delta of the predicted speed is 1.60 and 1.15 respectively also for the individual time-series measurements. The route prediction accuracy is 91%. Overall, test routes are traversed 151 times for a total test distance of 276.4 km.

  1. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L.; Vogel, R. M.

    2015-12-01

    Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.

  2. A Filtering of Incomplete GNSS Position Time Series with Probabilistic Principal Component Analysis

    NASA Astrophysics Data System (ADS)

    Gruszczynski, Maciej; Klos, Anna; Bogusz, Janusz

    2018-04-01

    For the first time, we introduce probabilistic principal component analysis (pPCA) for the spatio-temporal filtering of Global Navigation Satellite System (GNSS) position time series, to estimate and remove Common Mode Error (CME) without interpolating missing values. We used data from the International GNSS Service (IGS) stations which contributed to the latest International Terrestrial Reference Frame (ITRF2014). The efficiency of the proposed algorithm was tested on simulated incomplete time series; CME was then estimated for a set of 25 stations located in Central Europe. The newly applied pPCA was compared with previously used algorithms, which showed that this method is capable of properly spatio-temporally filtering GNSS time series with differing observation time spans. We showed that filtering can be carried out with the pPCA method even when two time series in the dataset share fewer than 100 common epochs of observations. The 1st Principal Component (PC) explained more than 36% of the total variance represented by the time series residuals (series with the deterministic model removed), which, compared to the variances of the other PCs (less than 8%), means that common signals are significant in GNSS residuals. A clear improvement in the spectral indices of the power-law noise was noticed for the Up component, reflected by an average shift towards white noise from -0.98 to -0.67 (30%). We observed a significant average reduction in the errors of station velocities estimated from the filtered residuals, by 35, 28 and 69% for the North, East, and Up components, respectively. The CME series were also analyzed in the context of environmental mass loading influences on the filtering results. Subtracting the environmental loading models from the GNSS residuals reduces the estimated CME variance by 20 and 65% for the horizontal and vertical components, respectively.

  3. Finding hidden periodic signals in time series - an application to stock prices

    NASA Astrophysics Data System (ADS)

    O'Shea, Michael

    2014-03-01

    Data in the form of time series appear in many areas of science. In cases where the periodicity is apparent and the only other contribution to the time series is stochastic in origin, the data can be 'folded' to improve signal to noise, and this has been done for light curves of variable stars, with the folding resulting in a cleaner light curve signal. Stock index prices versus time are classic examples of time series. Repeating patterns have been claimed by many workers and include unusually large returns on small-cap stocks during the month of January, and small returns on the Dow Jones Industrial Average (DJIA) in the months June through September compared to the rest of the year. Such observations imply that these prices have a periodic component. We investigate this for the DJIA. If such a component exists it is hidden in a large non-periodic variation and a large stochastic variation. We show how to extract this periodic component and for the first time reveal its yearly (averaged) shape. This periodic component leads directly to the 'Sell in May and buy at Halloween' adage. We also drill down and show that this yearly variation emerges from approximately half of the underlying stocks making up the DJIA index.
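
    The folding step can be sketched as below, assuming a pandas Series `returns` of daily returns with a DatetimeIndex; the smoothing window is an illustrative choice.

    ```python
    import pandas as pd

    def folded_yearly_shape(returns: pd.Series) -> pd.Series:
        """Fold daily returns on a one-year period (requires a DatetimeIndex)."""
        detrended = returns - returns.mean()              # remove the average drift
        folded = detrended.groupby(detrended.index.dayofyear).mean()
        return folded.rolling(15, center=True, min_periods=1).mean()
    ```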

  4. Monthly Surface Air Temperature Time Series Area-Averaged Over the 30-Degree Latitudinal Belts of the Globe

    DOE Data Explorer

    Lugina, K. M. [Department of Geography, St. Petersburg State University, St. Petersburg, Russia]; Groisman, P. Ya. [National Climatic Data Center, Asheville, North Carolina, USA]; Vinnikov, K. Ya. [Department of Atmospheric Sciences, University of Maryland, College Park, Maryland, USA]; Koknaeva, V. V. [State Hydrological Institute, St. Petersburg, Russia]; Speranskaya, N. A. [State Hydrological Institute, St. Petersburg, Russia]

    2006-01-01

    The mean monthly and annual values of surface air temperature compiled by Lugina et al. have been taken mainly from the World Weather Records, Monthly Climatic Data for the World, and Meteorological Data for Individual Years over the Northern Hemisphere Excluding the USSR. These published records were supplemented with information from different national publications. In the original archive, after removal of station records believed to be nonhomogeneous or biased, 301 and 265 stations were used to determine the mean temperature for the Northern and Southern hemispheres, respectively. The new version of the station temperature archive (used for evaluation of the zonally-averaged temperatures) was created in 1995. The change to the archive was required because data from some stations became unavailable for analyses in the 1990s. During this process, special care was taken to secure homogeneity of zonally averaged time series. When a station (or a group of stations) stopped reporting, a "new" station (or group of stations) was selected in the same region, and its data for the past 50 years were collected and added to the archive. The processing (area-averaging) was organized in such a way that each time series from a new station spans the reference period (1951-1975) and the years thereafter. It was determined that the addition of the new stations had essentially no effect on the zonally-averaged values for the pre-1990 period.

  5. Single-Trial Analysis of Inter-Beat Interval Perturbations Accompanying Single-Switch Scanning: Case Series of Three Children With Severe Spastic Quadriplegic Cerebral Palsy.

    PubMed

    Leung, Brian; Chau, Tom

    2016-02-01

    Single-switch access in conjunction with scanning remains a fundamental solution in restoring communication for many children with profound physical disabilities. However, untimely switch inaction and unintentional switch activations can lead to user frustration and impede functional communication. A previous preliminary study, in the context of a case series with three single-switch users, reported that correct, accidental and missed switch activations could elicit cardiac deceleration and increased phasic skin conductance on average, while deliberate switch non-use was associated with autonomic nonresponse. The present study investigated the possibility of using blood volume pulse recordings from the same three pediatric single-switch users to track the aforementioned switch events on a single-trial basis. Peaks of the line length time series derived from the empirical mode decomposition of the inter-beat interval time series matched, on average, a high percentage (above 80%) of single-switch events, while unmatched peaks coincided moderately (below 37%) with idle time during scanning. These results encourage further study of autonomic measures as complementary information channels to enhance single-switch access.
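
    A sketch of the feature pipeline as we read it, relying on the third-party PyEMD package (pip package EMD-signal) for the empirical mode decomposition; the IMF index and window length here are illustrative choices of ours, not values from the study:

      import numpy as np
      from PyEMD import EMD  # assumed third-party dependency

      def line_length_series(ibi, imf_index=0, window=8):
          """Windowed 'line length' (sum of absolute first differences)
          of one intrinsic mode function of an inter-beat interval series."""
          imfs = EMD()(np.asarray(ibi, dtype=float))
          d = np.abs(np.diff(imfs[imf_index]))
          return np.convolve(d, np.ones(window), mode="valid")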

  6. High School Grade Inflation from 2004 to 2011. ACT Research Report Series, 2013 (3)

    ERIC Educational Resources Information Center

    Zhang, Qian; Sanchez, Edgar I.

    2013-01-01

    This study explores inflation in high school grade point average (HSGPA), defined as trend over time in the conditional average of HSGPA, given ACT® Composite score. The time period considered is 2004 to 2011. Using hierarchical linear modeling, the study updates a previous analysis of Woodruff and Ziomek (2004). The study also investigates…

  7. Performance Comparison of Big Data Analytics With NEXUS and Giovanni

    NASA Astrophysics Data System (ADS)

    Jacob, J. C.; Huang, T.; Lynnes, C.

    2016-12-01

    NEXUS is an emerging data-intensive analysis framework, developed with a new approach to handling science data that enables large-scale analysis, and is available as open source. We compare the performance of NEXUS and Giovanni for 3 statistics algorithms applied to NASA datasets. Giovanni is a statistics web service at NASA Distributed Active Archive Centers (DAACs). NEXUS is a cloud-computing environment developed at JPL and built on Apache Solr, Cassandra, and Spark. We compute a global time-averaged map, a correlation map, and an area-averaged time series. The first two algorithms average over time to produce a value for each pixel in a 2-D map. The third algorithm averages spatially to produce a single value for each time step. This talk reports benchmark findings that indicate a 15x speedup with NEXUS over Giovanni in computing the area-averaged time series of daily precipitation rate from the Tropical Rainfall Measuring Mission (TRMM, 0.25-degree spatial resolution) for the Continental United States over 14 years (2000-2014), with 64-way parallelism and 545 tiles per granule. 16-way parallelism with 16 tiles per granule worked best with NEXUS for computing an 18-year (1998-2015) TRMM daily precipitation global time-averaged map (2.5x speedup) and an 18-year global map of correlation between TRMM daily precipitation and TRMM real-time daily precipitation (7x speedup). These and other benchmark results will be presented along with key lessons learned in applying the NEXUS tiling approach to big data analytics in the cloud.
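
    The third algorithm is the simplest to state: weight each grid cell by the cosine of its latitude and average over space at every time step. A minimal NumPy sketch (ours, not NEXUS or Giovanni code) that ignores NaN pixels:

      import numpy as np

      def area_averaged_series(data, lats):
          """Collapse a (time, lat, lon) array to one value per time step,
          weighting rows by cos(latitude) so equal areas count equally."""
          w = np.broadcast_to(np.cos(np.deg2rad(lats))[None, :, None], data.shape)
          valid = ~np.isnan(data)
          num = np.where(valid, data * w, 0.0).sum(axis=(1, 2))
          den = np.where(valid, w, 0.0).sum(axis=(1, 2))
          return num / den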

  8. Load Balancing Using Time Series Analysis for Soft Real Time Systems with Statistically Periodic Loads

    NASA Technical Reports Server (NTRS)

    Hailperin, Max

    1993-01-01

    This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that our techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. Our method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive methods.

  9. Covariance Function for Nearshore Wave Assimilation Systems

    DTIC Science & Technology

    2018-01-30

    covariance can be modeled by a parameterized Gaussian function, for nearshore wave assimilation applications, the covariance function depends primarily on...case of missing values at the compiled time series, the gaps were filled by weighted interpolation. The weights depend on the number of the...averaging, in order to create the continuous time series, filters out the dependency on the instantaneous meteorological and oceanographic conditions

  10. Smoothing Strategies Combined with ARIMA and Neural Networks to Improve the Forecasting of Traffic Accidents

    PubMed Central

    Rodríguez, Nibaldo

    2014-01-01

    Two smoothing strategies combined with autoregressive integrated moving average (ARIMA) and autoregressive neural network (ANN) models to improve the forecasting of time series are presented. The forecasting strategy is implemented in two stages. In the first stage the time series is smoothed using either 3-point moving average smoothing or singular value decomposition of the Hankel matrix (HSVD). In the second stage, an ARIMA model and two ANNs for one-step-ahead time series forecasting are used. The coefficients of the first ANN are estimated through the particle swarm optimization (PSO) learning algorithm, while the coefficients of the second ANN are estimated with the resilient backpropagation (RPROP) learning algorithm. The proposed models are evaluated using a weekly time series of traffic accidents in the Valparaíso region, Chile, from 2003 to 2012. The best result is given by the combination HSVD-ARIMA, with a MAPE of 0.26%, followed by MA-ARIMA with a MAPE of 1.12%; the worst result is given by the MA-ANN based on PSO, with a MAPE of 15.51%. PMID:25243200
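
    A compact sketch of the HSVD smoothing stage under our reading of the abstract: embed the series in a Hankel matrix, truncate its SVD to a low rank, and average the anti-diagonals back into a series (the moving average alternative is just np.convolve(x, np.ones(3)/3, mode='same')):

      import numpy as np

      def hsvd_smooth(x, window, rank=1):
          """Hankel-SVD smoothing: rank-truncated reconstruction followed
          by anti-diagonal (Hankel) averaging back to a 1-D series."""
          x = np.asarray(x, dtype=float)
          n = len(x)
          H = np.array([x[i:i + window] for i in range(n - window + 1)])
          U, s, Vt = np.linalg.svd(H, full_matrices=False)
          Hr = (U[:, :rank] * s[:rank]) @ Vt[:rank]
          out = np.zeros(n)
          cnt = np.zeros(n)
          for i in range(Hr.shape[0]):
              out[i:i + window] += Hr[i]
              cnt[i:i + window] += 1.0
          return out / cnt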

  11. Filter-based multiscale entropy analysis of complex physiological time series.

    PubMed

    Xu, Yuesheng; Zhao, Liang

    2013-08-01

    Multiscale entropy (MSE) has been widely and successfully used in analyzing the complexity of physiological time series. We reinterpret the averaging process in MSE as filtering a time series by a filter of a piecewise constant type. From this viewpoint, we introduce filter-based multiscale entropy (FME), which filters a time series to generate multiple frequency components, and then we compute the blockwise entropy of the resulting components. By choosing filters adapted to the feature of a given time series, FME is able to better capture its multiscale information and to provide more flexibility for studying its complexity. Motivated by the heart rate turbulence theory, which suggests that the human heartbeat interval time series can be described in piecewise linear patterns, we propose piecewise linear filter multiscale entropy (PLFME) for the complexity analysis of the time series. Numerical results from PLFME are more robust to data of various lengths than those from MSE. The numerical performance of the adaptive piecewise constant filter multiscale entropy without prior information is comparable to that of PLFME, whose design takes prior information into account.
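
    The reinterpretation is easy to see in code: the standard MSE coarse-graining step is exactly a piecewise-constant (boxcar) filter followed by downsampling. A minimal sketch (ours):

      import numpy as np

      def coarse_grain(x, scale):
          """MSE averaging step: mean of non-overlapping windows of length
          `scale`, i.e. a piecewise-constant filter plus downsampling."""
          x = np.asarray(x, dtype=float)
          n = len(x) // scale
          return x[:n * scale].reshape(n, scale).mean(axis=1)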

  12. Neural network versus classical time series forecasting models

    NASA Astrophysics Data System (ADS)

    Nor, Maria Elena; Safuan, Hamizah Mohd; Shab, Noorzehan Fazahiyah Md; Asrul, Mohd; Abdullah, Affendi; Mohamad, Nurul Asmaa Izzati; Lee, Muhammad Hisyam

    2017-05-01

    An artificial neural network (ANN) has an advantage in time series forecasting because it has the potential to solve complex forecasting problems. This is because an ANN is a data-driven approach that can be trained to map past values of a time series. In this study, the forecast performance of a neural network and a classical time series forecasting method, namely the seasonal autoregressive integrated moving average (SARIMA) model, was compared using gold price data. Moreover, the effect of different data preprocessing on the forecast performance of the neural network was examined. The forecast accuracy was evaluated using mean absolute deviation, root mean square error and mean absolute percentage error. It was found that the ANN produced the most accurate forecast when the Box-Cox transformation was used for data preprocessing.
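
    The three accuracy criteria named here are standard; a small reference implementation (ours) for clarity, with scipy.stats.boxcox available for the preprocessing step:

      import numpy as np

      def forecast_errors(y, yhat):
          """Mean absolute deviation, root mean square error, and mean
          absolute percentage error of a forecast."""
          y = np.asarray(y, dtype=float)
          e = y - np.asarray(yhat, dtype=float)
          return {
              "MAD": np.mean(np.abs(e)),
              "RMSE": np.sqrt(np.mean(e ** 2)),
              "MAPE": 100.0 * np.mean(np.abs(e / y)),
          }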

  13. Taylor Series Trajectory Calculations Including Oblateness Effects and Variable Atmospheric Density

    NASA Technical Reports Server (NTRS)

    Scott, James R.

    2011-01-01

    Taylor series integration is implemented in NASA Glenn's Spacecraft N-body Analysis Program and compared head-to-head with the code's existing 8th-order Runge-Kutta-Fehlberg time integration scheme. This paper focuses on trajectory problems that include oblateness and/or variable atmospheric density. Taylor series is shown to be significantly faster and more accurate for oblateness problems up through a 4x4 field, with speedups ranging from a factor of 2 to 13. For problems with variable atmospheric density, speedups average a factor of 24 for atmospheric density alone, and a factor of 1.6 to 8.2 when density and oblateness are combined.

  14. Getting It Right Matters: Climate Spectra and Their Estimation

    NASA Astrophysics Data System (ADS)

    Privalsky, Victor; Yushkov, Vladislav

    2018-06-01

    In many recent publications, climate spectra estimated with different methods from observed, GCM-simulated, and reconstructed time series contain many peaks at time scales from a few years to many decades and even centuries. However, respective spectral estimates obtained with the autoregressive (AR) and multitapering (MTM) methods showed that spectra of climate time series are smooth and contain no evidence of periodic or quasi-periodic behavior. Four order selection criteria for the autoregressive models were studied and proven sufficiently reliable for 25 time series of climate observations at individual locations or spatially averaged at local-to-global scales. As time series of climate observations are short, an alternative reliable nonparametric approach is Thomson's MTM. These results agree with both the earlier climate spectral analyses and the Markovian stochastic model of climate.
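
    For concreteness, a sketch (ours) of a parametric AR spectral estimate with statsmodels; the order-selection criteria studied in the paper are omitted, and the frequency grid is in cycles per sample:

      import numpy as np
      from statsmodels.tsa.ar_model import AutoReg

      def ar_spectrum(x, order, freqs):
          """AR(p) spectral estimate, smooth by construction:
          S(f) = sigma^2 / |1 - sum_k a_k exp(-2*pi*i*f*k)|^2."""
          fit = AutoReg(np.asarray(x, dtype=float), lags=order).fit()
          a = fit.params[1:]   # AR coefficients; params[0] is the constant
          z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(1, order + 1)))
          return fit.sigma2 / np.abs(1.0 - z @ a) ** 2

      # e.g. spec = ar_spectrum(series, order=4, freqs=np.linspace(0.01, 0.5, 200))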

  15. Assessing the catchment's filtering effect on the propagation of meteorological anomalies

    NASA Astrophysics Data System (ADS)

    di Domenico, Antonella; Laguardia, Giovanni; Margiotta, Maria Rosaria

    2010-05-01

    The characteristics of drought propagation within a catchment are evaluated by means of the analysis of time series of water fluxes and storage states. The study area is the Agri basin, Southern Italy, closed at the Tarangelo gauging station (507 km2). After calibrating the IRP weather generator (Veneziano and Iacobellis, 2002) on observed data, a 100-year time series of precipitation was produced. The drought statistics obtained from the synthetic data were compared to those obtained from the limited observations available. The DREAM hydrological model was calibrated on observed precipitation and discharge. From the model run on the synthetic precipitation we obtained the time series of variables relevant for assessing the status of the catchment, namely total runoff and its components, actual evapotranspiration, and soil moisture. The Standardized Precipitation Index (SPI; McKee et al., 1993) was calculated for different averaging periods. The modelled data were processed for the calculation of drought indices; in particular, we chose to transform them into standardized variables. We performed autocorrelation analysis to assess the characteristic time scales of the variables. Moreover, we investigated their relationships through cross-correlation, also assessing the SPI averaging period for which the maximum correlation is reached. The variables' drought statistics, namely number of events, duration, and deficit volumes, were assessed. As a result of the filtering effect exerted by the different catchment storages, the characteristic time scale and the maximum-correlation SPI averaging periods of the different time series tend to increase. Thus, the number of drought events tends to decrease and their duration to increase under increasing storage.
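
    A minimal SPI sketch (ours) showing the standardization idea; the full McKee et al. procedure fits distributions per calendar month and treats zero-rainfall probabilities separately, both of which this omits:

      import numpy as np
      from scipy import stats

      def spi(precip, window):
          """Standardized Precipitation Index sketch: aggregate over
          `window` steps, fit a gamma distribution (assumes positive
          aggregates), and map probabilities to standard-normal quantiles."""
          x = np.convolve(np.asarray(precip, dtype=float),
                          np.ones(window), mode="valid")
          shape, loc, scale = stats.gamma.fit(x, floc=0)
          p = stats.gamma.cdf(x, shape, loc=loc, scale=scale)
          return stats.norm.ppf(np.clip(p, 1e-6, 1 - 1e-6))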

  16. Most suitable mother wavelet for the analysis of fractal properties of stride interval time series via the average wavelet coefficient

    PubMed Central

    Zhang, Zhenwei; VanSwearingen, Jessie; Brach, Jennifer S.; Perera, Subashan

    2016-01-01

    Human gait is a complex interaction of many nonlinear systems, and stride intervals exhibit self-similarity over long time scales that can be modeled as a fractal process. The scaling exponent represents the fractal degree and can be interpreted as a biomarker of relative health and disease. A previous study showed that the average wavelet method provides the most accurate results for estimating this scaling exponent when applied to stride interval time series. The purpose of this paper is to determine the most suitable mother wavelet for the average wavelet method. This paper presents a comparative numerical analysis of sixteen mother wavelets using simulated and real fractal signals. Simulated fractal signals were generated under varying signal lengths and scaling exponents that indicate a range of physiologically conceivable fractal signals. Five candidate wavelets were chosen for their good performance on the mean square error test for both short and long signals. Next, we comparatively analyzed these five mother wavelets for physiologically relevant stride time series lengths. Our analysis showed that the symlet 2 mother wavelet provides a low mean square error and low variance for long time intervals and relatively low errors for short signal lengths. It can be considered the most suitable mother wavelet without the burden of considering the signal length. PMID:27960102
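
    A sketch (ours) of the average wavelet coefficient estimate with the symlet 2 wavelet via PyWavelets; for a fractal signal the mean absolute detail coefficient grows with scale, and the exponent offset conventions depend on whether the signal is treated as fGn or fBm:

      import numpy as np
      import pywt

      def awc_slope(x, wavelet="sym2"):
          """Average wavelet coefficient method: slope of log2(mean |d_j|)
          against dyadic level j; the scaling exponent follows after an
          fGn/fBm-dependent offset."""
          coeffs = pywt.wavedec(np.asarray(x, dtype=float), wavelet)
          details = coeffs[:0:-1]  # detail coefficients, finest scale first
          j = np.arange(1, len(details) + 1)
          logmean = [np.log2(np.mean(np.abs(d))) for d in details]
          return np.polyfit(j, logmean, 1)[0]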

  17. An improved portmanteau test for autocorrelated errors in interrupted time-series regression models.

    PubMed

    Huitema, Bradley E; McKean, Joseph W

    2007-08-01

    A new portmanteau test for autocorrelation among the errors of interrupted time-series regression models is proposed. Simulation results demonstrate that the inferential properties of the proposed Q(H-M) test statistic are considerably more satisfactory than those of the well-known Ljung-Box test and moderately better than those of the Box-Pierce test. These conclusions generally hold for a wide variety of autoregressive (AR), moving average (MA), and ARMA error processes that are associated with time-series regression models of the form described in Huitema and McKean (2000a, 2000b).
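
    For reference, the textbook Ljung-Box statistic that the paper benchmarks against (the proposed Q(H-M) statistic itself is not reproduced here); in practice the degrees of freedom are reduced by the number of fitted ARMA parameters:

      import numpy as np
      from scipy import stats

      def ljung_box(resid, h):
          """Q = n(n+2) * sum_{k=1..h} r_k^2 / (n-k), referred to chi^2(h)."""
          r = np.asarray(resid, dtype=float)
          r = r - r.mean()
          n = len(r)
          acf = np.correlate(r, r, mode="full")[n - 1:] / (r @ r)
          k = np.arange(1, h + 1)
          q = n * (n + 2) * np.sum(acf[1:h + 1] ** 2 / (n - k))
          return q, stats.chi2.sf(q, df=h)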

  18. Stochastic modelling of the monthly average maximum and minimum temperature patterns in India 1981-2015

    NASA Astrophysics Data System (ADS)

    Narasimha Murthy, K. V.; Saravana, R.; Vijaya Kumar, K.

    2018-04-01

    The paper investigates the stochastic modelling and forecasting of monthly average maximum and minimum temperature patterns through a suitable seasonal autoregressive integrated moving average (SARIMA) model for the period 1981-2015 in India. The variations and distributions of monthly maximum and minimum temperatures are analyzed through box plots and cumulative distribution functions. The time series plot indicates that the maximum temperature series contains sharp peaks in almost all years, which is not true of the minimum temperature series, so the two series are modelled separately. The candidate SARIMA model was chosen by inspecting the autocorrelation function (ACF), partial autocorrelation function (PACF), and inverse autocorrelation function (IACF) of the logarithmically transformed temperature series. The SARIMA(1,0,0)x(0,1,1)_12 model is selected for both the monthly average maximum and minimum temperature series based on the minimum Bayesian information criterion. The model parameters are obtained using the maximum-likelihood method, together with the standard errors of the residuals. The adequacy of the selected model is determined using correlation diagnostics (ACF, PACF, IACF, and p values of the Ljung-Box test statistic of the residuals) and normality diagnostics (the kernel and normal density curves of the histogram and a Q-Q plot). Finally, the monthly maximum and minimum temperature patterns of India are forecast for the next 3 years with the selected model.
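
    Fitting a model of this form is routine in statsmodels; the sketch below uses a synthetic stand-in series, since the study's data are not reproduced here:

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      # Synthetic monthly "average maximum temperature" series, 1981-2015.
      idx = pd.date_range("1981-01", periods=420, freq="MS")
      rng = np.random.default_rng(0)
      temps = pd.Series(30 + 5 * np.sin(2 * np.pi * idx.month / 12)
                        + rng.normal(0, 0.5, len(idx)), index=idx)

      # SARIMA(1,0,0)x(0,1,1)_12 on the log-transformed series, as in the abstract.
      res = sm.tsa.SARIMAX(np.log(temps), order=(1, 0, 0),
                           seasonal_order=(0, 1, 1, 12)).fit(disp=False)
      forecast = np.exp(res.get_forecast(steps=36).predicted_mean)  # next 3 years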

  19. A comparative analysis of spectral exponent estimation techniques for 1/fβ processes with applications to the analysis of stride interval time series

    PubMed Central

    Schaefer, Alexander; Brach, Jennifer S.; Perera, Subashan; Sejdić, Ervin

    2013-01-01

    Background: The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self-similarity over long time scales by a power law in the frequency spectrum S(f) = 1/f^β. The scaling exponent β is thus often interpreted as a “biomarker” of relative health and decline. New Method: This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis is to complement previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. Results: The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Comparison with Existing Methods: Class-dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, as the most prevailing method in the literature, exhibited large estimation variances. Conclusions: The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. PMID:24200509

  20. A comparative analysis of spectral exponent estimation techniques for 1/f(β) processes with applications to the analysis of stride interval time series.

    PubMed

    Schaefer, Alexander; Brach, Jennifer S; Perera, Subashan; Sejdić, Ervin

    2014-01-30

    The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self-similarity over long time scales by a power law in the frequency spectrum S(f) = 1/f^β. The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis is to complement previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Class-dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, as the most prevailing method in the literature, exhibited large estimation variances. The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series.
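
    Despite the caveats above about its estimation variance, DFA remains the most common baseline, and a reference implementation (ours) clarifies what is being computed; for 1/f^β noise the DFA slope α relates to the spectral exponent approximately as β = 2α − 1 under common conventions:

      import numpy as np

      def dfa_alpha(x, scales=(4, 8, 16, 32, 64, 128)):
          """First-order DFA: integrate the series, detrend each window
          linearly, and regress log F(n) on log n; the slope is alpha."""
          y = np.cumsum(np.asarray(x, dtype=float) - np.mean(x))
          F = []
          for n in scales:
              m = len(y) // n
              t = np.arange(n)
              msq = []
              for w in y[:m * n].reshape(m, n):
                  coef = np.polyfit(t, w, 1)
                  msq.append(np.mean((w - np.polyval(coef, t)) ** 2))
              F.append(np.sqrt(np.mean(msq)))
          return np.polyfit(np.log(scales), np.log(F), 1)[0]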

  1. Distinguished Lecture Series - Balancing the Energy & Climate Budget

    ScienceCinema

    None

    2017-12-09

    The average American uses 11,400 watts of power continuously. This is the equivalent of burning 114 100-watt light bulbs all the time. The average person globally uses 2,255 watts of power, or a little less than 23 100-watt light bulbs.

  2. Compounding approach for univariate time series with nonstationary variances

    NASA Astrophysics Data System (ADS)

    Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich

    2015-12-01

    A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, averages over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the thus obtained local variances.

  3. Compounding approach for univariate time series with nonstationary variances.

    PubMed

    Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich

    2015-12-01

    A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, averages over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the thus obtained local variances.
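
    The empirical step described in the last sentence reduces to computing windowed variances; a minimal sketch (ours):

      import numpy as np

      def local_variances(x, window):
          """Variance of each non-overlapping window: the empirical
          parameter distribution used in the compounding ansatz."""
          x = np.asarray(x, dtype=float)
          m = len(x) // window
          return x[:m * window].reshape(m, window).var(axis=1, ddof=1)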

  4. The Hurst exponent in energy futures prices

    NASA Astrophysics Data System (ADS)

    Serletis, Apostolos; Rosenberg, Aryeh Adam

    2007-07-01

    This paper extends the work in Elder and Serletis [Long memory in energy futures prices, Rev. Financial Econ., forthcoming, 2007] and Serletis et al. [Detrended fluctuation analysis of the US stock market, Int. J. Bifurcation Chaos, forthcoming, 2007] by re-examining the empirical evidence for random walk type behavior in energy futures prices. In doing so, it uses daily data on energy futures traded on the New York Mercantile Exchange over the period from July 2, 1990 to November 1, 2006, and a statistical physics approach, the ‘detrending moving average’ technique, which provides a reliable framework for testing the information efficiency of financial markets, as shown by Alessio et al. [Second-order moving average and scaling of stochastic time series, Eur. Phys. J. B 27 (2002) 197-200] and Carbone et al. [Time-dependent Hurst exponent in financial time series, Physica A 344 (2004) 267-271; Analysis of clusters formed by the moving average of a long-range correlated time series, Phys. Rev. E 69 (2004) 026105]. The results show that energy futures returns display long memory and that the particular form of long memory is anti-persistence.
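
    A sketch (ours, not the cited authors' code) of one step of the detrending moving average procedure, using the backward-average convention; the Hurst exponent is the slope of log F(n) against log n over many window sizes, with H < 0.5 indicating the anti-persistence reported here:

      import numpy as np

      def dma_fluctuation(x, window):
          """RMS deviation of the integrated series from its backward
          moving average, evaluated at one window size."""
          x = np.asarray(x, dtype=float)
          y = np.cumsum(x - x.mean())
          ma = np.convolve(y, np.ones(window) / window, mode="valid")
          return np.sqrt(np.mean((y[window - 1:] - ma) ** 2))

      # windows = (8, 16, 32, 64, 128, 256)
      # H = np.polyfit(np.log(windows),
      #                np.log([dma_fluctuation(r, n) for n in windows]), 1)[0]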

  5. Finite-size effect and the components of multifractality in transport economics volatility based on multifractal detrending moving average method

    NASA Astrophysics Data System (ADS)

    Chen, Feier; Tian, Kang; Ding, Xiaoxu; Miao, Yuqi; Lu, Chunxia

    2016-11-01

    Analysis of freight rate volatility characteristics has attracted more attention since 2008 due to the credit crunch and the slowdown in marine transportation. The multifractal detrended fluctuation analysis technique is employed to analyze the time series of the Baltic Dry Bulk Freight Rate Index and the market trend for two bulk ship sizes, namely Capesize and Panamax, over the period March 1st 1999 to February 26th 2015. In this paper, the degree of multifractality at different fluctuation sizes is calculated. In addition, a multifractal detrending moving average (MF-DMA) counting technique has been developed to quantify the components of the multifractal spectrum with the finite-size effect taken into consideration. Numerical results show that both the Capesize and Panamax freight rate index time series are of a multifractal nature. The origin of multifractality for the bulk freight rate market series is found to be mostly nonlinear correlation.

  6. Euclidean distance and Kolmogorov-Smirnov analyses of multi-day auditory event-related potentials: a longitudinal stability study

    NASA Astrophysics Data System (ADS)

    Durato, M. V.; Albano, A. M.; Rapp, P. E.; Nawang, S. A.

    2015-06-01

    The validity of ERPs as indices of stable neurophysiological traits is partially dependent on their stability over time. Previous studies on ERP stability, however, have reported diverse stability estimates despite using the same component scoring methods. This present study explores a novel approach to investigating the longitudinal stability of average ERPs, that is, treating the ERP waveform as a time series and then applying Euclidean Distance and Kolmogorov-Smirnov analyses to evaluate the similarity or dissimilarity between the ERP time series of different sessions or run pairs. Nonlinear dynamical analysis shows that, in the absence of a change in medical condition, the average ERPs of healthy human adults are highly longitudinally stable, as evaluated by both the Euclidean distance and the Kolmogorov-Smirnov test.

  7. Robust Semi-Active Ride Control under Stochastic Excitation

    DTIC Science & Technology

    2014-01-01

    broad classes of time-series models which are of practical importance: the Auto-Regressive (AR) models, the Integrated (I) models, and the Moving Average (MA) models [12]. Combinations of these models result in autoregressive moving average (ARMA) and autoregressive integrated moving average (ARIMA) models. [Snippet truncated; equation (20), a compact Heaviside-function form of four up/down cases, is not recoverable.]

  8. Tidal and residual currents measured by an acoustic doppler current profiler at the west end of Carquinez Strait, San Francisco Bay, California, March to November 1988

    USGS Publications Warehouse

    Burau, J.R.; Simpson, M.R.; Cheng, R.T.

    1993-01-01

    Water-velocity profiles were collected at the west end of Carquinez Strait, San Francisco Bay, California, from March to November 1988, using an acoustic Doppler current profiler (ADCP). These data are a series of 10-minute-averaged water velocities collected at 1-meter vertical intervals (bins) in the 16.8-meter water column, beginning 2.1 meters above the estuary bed. To examine the vertical structure of the horizontal water velocities, the data are separated into individual time-series by bin and then used for time-series plots, harmonic analysis, and for input to digital filters. Three-dimensional graphic renditions of the filtered data are also used in the analysis. Harmonic analysis of the time-series data from each bin indicates that the dominant (12.42 hour or M2) partial tidal currents reverse direction near the bottom, on average, 20 minutes sooner than M2 partial tidal currents near the surface. Residual (nontidal) currents derived from the filtered data indicate that currents near the bottom are predominantly up-estuary during the neap tides and down-estuary during the more energetic spring tides.

  9. Creating a monthly time series of the potentiometric surface in the Upper Floridan aquifer, Northern Tampa Bay area, Florida, January 2000-December 2009

    USGS Publications Warehouse

    Lee, Terrie M.; Fouad, Geoffrey G.

    2014-01-01

    In Florida’s karst terrain, where groundwater and surface waters interact, a mapping time series of the potentiometric surface in the Upper Floridan aquifer offers a versatile metric for assessing the hydrologic condition of both the aquifer and overlying streams and wetlands. Long-term groundwater monitoring data were used to generate a monthly time series of potentiometric surfaces in the Upper Floridan aquifer over a 573-square-mile area of west-central Florida between January 2000 and December 2009. Recorded groundwater elevations were collated for 260 groundwater monitoring wells in the Northern Tampa Bay area, and a continuous time series of daily observations was created for 197 of the wells by estimating missing daily values through regression relations with other monitoring wells. Kriging was used to interpolate the monthly average potentiometric-surface elevation in the Upper Floridan aquifer over a decade. The mapping time series gives spatial and temporal coherence to groundwater monitoring data collected continuously over the decade by three different organizations, but at various frequencies. Further, the mapping time series describes the potentiometric surface beneath parts of six regionally important stream watersheds and 11 municipal well fields that collectively withdraw about 90 million gallons per day from the Upper Floridan aquifer. Monthly semivariogram models were developed using monthly average groundwater levels at wells. Kriging was used to interpolate the monthly average potentiometric-surface elevations and to quantify the uncertainty in the interpolated elevations. Drawdown of the potentiometric surface within well fields was likely the cause of a characteristic decrease and then increase in the observed semivariance with increasing lag distance. This characteristic made the use of the hole effect model appropriate for describing the monthly semivariograms and the interpolated surfaces. Spatial variance reflected in the monthly semivariograms decreased markedly between 2002 and 2003, timing that coincided with decreases in well-field pumping. Cross-validation results suggest that the kriging interpolation may smooth over the drawdown of the potentiometric surface near production wells. The groundwater monitoring network of 197 wells yielded an average kriging error in the potentiometric-surface elevations of 2 feet or less over approximately 70 percent of the map area. Additional data collection within the existing monitoring network of 260 wells and near selected well fields could reduce the error in individual months. Reducing the kriging error in other areas would require adding new monitoring wells. Potentiometric-surface elevations fluctuated by as much as 30 feet over the study period, and the spatially averaged elevation for the entire surface rose by about 2 feet over the decade. Monthly potentiometric-surface elevations describe the lateral groundwater flow patterns in the aquifer and are usable at a variety of spatial scales to describe vertical groundwater recharge and discharge conditions for overlying surface-water features.

  10. A comparative study of shallow groundwater level simulation with three time series models in a coastal aquifer of South China

    NASA Astrophysics Data System (ADS)

    Yang, Q.; Wang, Y.; Zhang, J.; Delgado, J.

    2017-05-01

    Accurate and reliable groundwater level forecasting models can help ensure the sustainable use of a watershed's aquifers for urban and rural water supply. In this paper, three time series analysis methods, Holt-Winters (HW), integrated time series (ITS), and seasonal autoregressive integrated moving average (SARIMA), are explored to simulate the groundwater level in a coastal aquifer of South China. Monthly groundwater table depth data collected over a long period, from 2000 to 2011, are simulated with the three time series models and compared. The error criteria are estimated using the coefficient of determination (R²), the Nash-Sutcliffe model efficiency coefficient (E), and the root-mean-squared error. The results indicate that all three models are accurate in reproducing the historical time series of groundwater levels. The comparison of the three models shows that the HW model is more accurate in predicting the groundwater levels than the SARIMA and ITS models. It is recommended that additional studies explore this proposed method, which can be used in turn to facilitate the development and implementation of more effective and sustainable groundwater management strategies.

  11. 77 FR 40889 - Proposed Data Collections Submitted for Public Comment and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-11

    ... evaluation study will be conducted using a group-randomized controlled trial multi-time series design. Four... their time. Estimated Annualized Burden Hours Number of Average burden Respondents Number of responses...

  12. Load Balancing Using Time Series Analysis for Soft Real Time Systems with Statistically Periodic Loads

    NASA Technical Reports Server (NTRS)

    Hailperin, M.

    1993-01-01

    This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that the authors' techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. The authors' method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive load-balancing methods. Results from a preliminary analysis of another system and from simulation with a synthetic load provide some evidence of more general applicability.

  13. Multifractal detrended cross-correlation analysis on gold, crude oil and foreign exchange rate time series

    NASA Astrophysics Data System (ADS)

    Pal, Mayukha; Madhusudana Rao, P.; Manimaran, P.

    2014-12-01

    We apply the recently developed multifractal detrended cross-correlation analysis method to investigate the cross-correlation behavior and fractal nature between two non-stationary time series. We analyze the daily return series of gold prices, West Texas Intermediate and Brent crude oil prices, and foreign exchange rate data over a period of 18 years. The cross-correlation is measured quantitatively from the Hurst scaling exponents and the singularity spectrum. From the results, the existence of multifractal cross-correlation between all of these time series is found. We also found that the cross-correlation between gold and oil prices possesses uncorrelated behavior, while the remaining bivariate time series possess persistent behavior. For five bivariate series, the cross-correlation exponents are less than the calculated average generalized Hurst exponents (GHE) for q<0 and greater than the GHE for q>0; for one bivariate series, the cross-correlation exponent is greater than the GHE for all q values.

  14. The application of time series models to cloud field morphology analysis

    NASA Technical Reports Server (NTRS)

    Chin, Roland T.; Jau, Jack Y. C.; Weinman, James A.

    1987-01-01

    A modeling method for the quantitative description of remotely sensed cloud field images is presented. A two-dimensional texture modeling scheme based on one-dimensional time series procedures is adopted for this purpose. The time series procedure used is the seasonal autoregressive moving average (ARMA) process of Box and Jenkins. Cloud field properties such as directionality, clustering, and cloud coverage can be retrieved by this method. It has been demonstrated that a cloud field image can be quantitatively defined by a small set of parameters and that synthesized surrogates can be reconstructed from these model parameters. This method enables cloud climatology to be studied quantitatively.

  15. On the statistical aspects of sunspot number time series and its association with the summer-monsoon rainfall over India

    NASA Astrophysics Data System (ADS)

    Chattopadhyay, Surajit; Chattopadhyay, Goutami

    The present paper reports studies on the association between the mean annual sunspot numbers and the summer monsoon rainfall over India. The cross-correlations have been studied. After a Box-Cox transformation, time spectral analysis was executed, and it was found that both time series have an important spectrum at the fifth harmonic. An artificial neural network (ANN) model was developed on the data series averaged continuously over five years, and the neural network could establish a predictor-predictand relationship between the sunspot numbers and the mean yearly summer monsoon rainfall over India.

  16. Quantified moving average strategy of crude oil futures market based on fuzzy logic rules and genetic algorithms

    NASA Astrophysics Data System (ADS)

    Liu, Xiaojia; An, Haizhong; Wang, Lijun; Guan, Qing

    2017-09-01

    The moving average strategy is a technical indicator that can generate trading signals to assist investment. While the trading signals tell traders when to buy or sell, the moving average cannot tell them the trading volume, which is a crucial factor for investment. This paper proposes a fuzzy moving average strategy, in which fuzzy logic rules are used to determine the strength of the trading signals, i.e., the trading volume. To compose one fuzzy logic rule, we use four types of moving averages, the length of the moving average period, the fuzzy extent, and the recommended value. Ten fuzzy logic rules form a fuzzy set, which generates a rating level that decides the trading volume. In this process, we apply genetic algorithms to identify an optimal fuzzy logic rule set and utilize crude oil futures prices from the New York Mercantile Exchange (NYMEX) as the experiment data. Each experiment is repeated 20 times. The results show that, firstly, the fuzzy moving average strategy can obtain a more stable rate of return than the moving average strategies. Secondly, the holding-amount series is highly sensitive to the price series. Thirdly, simple moving average methods are more efficient. Lastly, the fuzzy extents of extremely low, high, and very high are more popular. These results are helpful in making investment decisions.

  17. Tide Gauge Records Reveal Improved Processing of Gravity Recovery and Climate Experiment Time-Variable Mass Solutions over the Coastal Ocean

    NASA Astrophysics Data System (ADS)

    Piecuch, Christopher G.; Landerer, Felix W.; Ponte, Rui M.

    2018-05-01

    Monthly ocean bottom pressure solutions from the Gravity Recovery and Climate Experiment (GRACE), derived using surface spherical cap mass concentration (MC) blocks and spherical harmonics (SH) basis functions, are compared to tide gauge (TG) monthly averaged sea level data over 2003-2015 to evaluate improved gravimetric data processing methods near the coast. MC solutions can explain ≳ 42% of the monthly variance in TG time series over broad shelf regions and in semi-enclosed marginal seas. MC solutions also generally explain ~5-32% more TG data variance than SH estimates. Applying a coastline resolution improvement algorithm in the GRACE data processing leads to ~31% more variance in TG records explained by the MC solution on average compared to not using this algorithm. Synthetic observations sampled from an ocean general circulation model exhibit similar patterns of correspondence between modeled TG and MC time series and differences between MC and SH time series in terms of their relationship with TG time series, suggesting that observational results here are generally consistent with expectations from ocean dynamics. This work demonstrates the improved quality of recent MC solutions compared to earlier SH estimates over the coastal ocean, and suggests that the MC solutions could be a useful tool for understanding contemporary coastal sea level variability and change.

  18. SPATIAL VARIABILITY OF PM2.5 IN URBAN AREAS IN THE UNITED STATES

    EPA Science Inventory

    Epidemiologic time-series studies typically use either daily 24-hour PM concentrations averaged across several monitors in a city or data obtained at a 'central monitoring site' to relate to human health effects. If 24-hour average concentrations differ substantially across an ur...

  19. Time Series ARIMA Models of Undergraduate Grade Point Average.

    ERIC Educational Resources Information Center

    Rogers, Bruce G.

    The Auto-Regressive Integrated Moving Average (ARIMA) models, often referred to as Box-Jenkins models, are regression methods for analyzing sequentially dependent observations with large amounts of data. The Box-Jenkins approach, a three-stage procedure consisting of identification, estimation and diagnosis, was used to select the most appropriate…

  20. Macroscopic Spatial Complexity of the Game of Life Cellular Automaton: A Simple Data Analysis

    NASA Astrophysics Data System (ADS)

    Hernández-Montoya, A. R.; Coronel-Brizio, H. F.; Rodríguez-Achach, M. E.

    In this chapter we present a simple data analysis of an ensemble of 20 time series, generated by averaging the spatial positions of the living cells for each state of the Game of Life Cellular Automaton (GoL). We show that at the macroscopic level described by these time series, complexity properties of GoL are also present, and the following emergent properties, typical of data extracted from complex systems such as financial or economic ones, come out: variations of the generated time series follow an asymptotic power-law distribution; large fluctuations tend to be followed by large fluctuations, and small fluctuations by small ones; and linear correlations decay fast, whereas the correlations associated with the absolute variations exhibit long-range memory. Finally, a Detrended Fluctuation Analysis (DFA) of the generated time series indicates that the GoL spatial macro states described by the time series are neither completely ordered nor random, in a measurable and very interesting way.

  1. Using self-organizing maps to infill missing data in hydro-meteorological time series from the Logone catchment, Lake Chad basin.

    PubMed

    Nkiaka, E; Nawaz, N R; Lovett, J C

    2016-07-01

    Hydro-meteorological data is an important asset that can enhance management of water resources. But existing data often contain gaps, leading to uncertainties and so compromising their use. Although many methods exist for infilling data gaps in hydro-meteorological time series, many of these methods require inputs from neighbouring stations, which are often not available, while other methods are computationally demanding. Computing techniques such as artificial intelligence can be used to address this challenge. Self-organizing maps (SOMs), which are a type of artificial neural network, were used for infilling gaps in a hydro-meteorological time series from a Sudano-Sahel catchment. The coefficients of determination obtained were all above 0.75 and 0.65, while the average topographic errors were 0.008 and 0.02, for the rainfall and river discharge time series, respectively. These results further indicate that SOMs are a robust and efficient method for infilling missing gaps in hydro-meteorological time series.
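
    A heavily simplified, self-contained sketch (ours, not the paper's implementation) of the idea: train a small SOM using distances over observed components only, then fill each gap from the corresponding component of its best-matching unit:

      import numpy as np

      def som_impute(X, grid=(6, 6), iters=2000, lr=0.5, sigma=2.0, seed=0):
          """Toy SOM imputation for a (samples, variables) matrix with NaNs."""
          rng = np.random.default_rng(seed)
          n, d = X.shape
          mask = ~np.isnan(X)
          W = rng.uniform(np.nanmin(X), np.nanmax(X), size=(*grid, d))
          gy, gx = np.indices(grid)
          for t in range(iters):
              i = rng.integers(n)
              x, m = X[i], mask[i]
              dist = np.sum((W[..., m] - x[m]) ** 2, axis=-1)  # masked distance
              by, bx = np.unravel_index(np.argmin(dist), grid)
              decay = np.exp(-t / iters)
              h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2)
                         / (2 * (sigma * decay) ** 2))
              W[..., m] += (lr * decay) * h[..., None] * (x[m] - W[..., m])
          out = X.copy()
          for i in range(n):                 # fill gaps from best-matching units
              m = mask[i]
              dist = np.sum((W[..., m] - X[i, m]) ** 2, axis=-1)
              by, bx = np.unravel_index(np.argmin(dist), grid)
              out[i, ~m] = W[by, bx, ~m]
          return out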

  2. RADON CONCENTRATION TIME SERIES MODELING AND APPLICATION DISCUSSION.

    PubMed

    Stránský, V; Thinová, L

    2017-11-01

    In 2010, continuous radon measurement was established at Mladeč Caves in the Czech Republic using the continuous radon monitor RADIM3A. In order to model the radon time series for the years 2010-15, the Box-Jenkins methodology, often used in econometrics, was applied. Because of the behavior of radon concentrations (RCs), a seasonal autoregressive integrated moving average model with exogenous variables (SARIMAX) was chosen to model the measured time series. This model uses the seasonality of the time series, previously acquired values, and delayed atmospheric parameters to forecast RC. The developed model for the RC time series is called regARIMA(5,1,3). Model residuals could be retrospectively compared with seismic evidence of local or global earthquakes that occurred during the RC measurements. This technique enables us to assess whether continuously measured RC could serve as an earthquake precursor.

  3. Multiscale limited penetrable horizontal visibility graph for analyzing nonlinear time series

    NASA Astrophysics Data System (ADS)

    Gao, Zhong-Ke; Cai, Qing; Yang, Yu-Xuan; Dang, Wei-Dong; Zhang, Shan-Shan

    2016-10-01

    Visibility graphs have established themselves as a powerful tool for analyzing time series. In this paper we develop a novel multiscale limited penetrable horizontal visibility graph (MLPHVG). We use nonlinear time series from two typical complex systems, i.e., EEG signals and two-phase flow signals, to demonstrate the effectiveness of our method. Combining MLPHVG and support vector machines, we detect epileptic seizures from EEG signals recorded from healthy subjects and epilepsy patients, and the classification accuracy is 100%. In addition, we derive MLPHVGs from oil-water two-phase flow signals and find that the average clustering coefficient at different scales allows faithfully identifying and characterizing three typical oil-water flow patterns. These findings render our MLPHVG method particularly useful for analyzing nonlinear time series from the perspective of multiscale network analysis.
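
    The basic horizontal visibility criterion underlying MLPHVG links samples i < j whenever every intermediate sample lies strictly below both endpoints; a minimal sketch (ours) of that building block, with the limited-penetrable and multiscale extensions omitted:

      import numpy as np

      def hvg_edges(x):
          """Edge list of the horizontal visibility graph of a series."""
          x = np.asarray(x, dtype=float)
          edges = []
          for i in range(len(x) - 1):
              top = -np.inf
              for j in range(i + 1, len(x)):
                  if top < min(x[i], x[j]):
                      edges.append((i, j))
                  top = max(top, x[j])
                  if top >= x[i]:            # later samples are blocked
                      break
          return edges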

  4. Comparison between wavelet transform and moving average as filter method of MODIS imagery to recognize paddy cropping pattern in West Java

    NASA Astrophysics Data System (ADS)

    Dwi Nugroho, Kreshna; Pebrianto, Singgih; Arif Fatoni, Muhammad; Fatikhunnada, Alvin; Liyantono; Setiawan, Yudi

    2017-01-01

    Information on the area and spatial distribution of paddy fields is needed to support sustainable agriculture and food security programs. Mapping the distribution of cropping patterns in paddy fields is important for maintaining a sustainable paddy field area, and can be done by direct observation or by remote sensing. This paper discusses remote sensing for paddy field monitoring based on MODIS time series data. Time series MODIS data are difficult to classify directly because of temporal noise, so wavelet transform and moving average filters are needed. The objective of this study is to recognize paddy cropping patterns in West Java with the wavelet transform and the moving average applied to MODIS imagery (MOD13Q1) from 2001 to 2015, and then to compare the two methods. The results showed that the spatial distributions of the two methods have almost the same cropping pattern. The accuracy of the wavelet transform (75.5%) is higher than that of the moving average (70.5%). Both methods showed that the majority of cropping patterns in West Java follow a paddy-fallow-paddy-fallow sequence with various planting times. Differences in the planting schedule occur because of the availability of irrigation water.

  5. Using NASA's Giovanni System to Simulate Time-Series Stations in the Outflow Region of California's Eel River

    NASA Technical Reports Server (NTRS)

    Acker, James G.; Shen, Suhung; Leptoukh, Gregory G.; Lee, Zhongping

    2012-01-01

    Oceanographic time-series stations provide vital data for the monitoring of oceanic processes, particularly those associated with trends over time and interannual variability. There are likely numerous locations where the establishment of a time-series station would be desirable, but for reasons of funding or logistics, such establishment may not be feasible. An alternative to an operational time-series station is monitoring of sites via remote sensing. In this study, the NASA Giovanni data system is employed to simulate the establishment of two time-series stations near the outflow region of California's Eel River, which carries a high sediment load. Previous time-series analysis of this location (Acker et al. 2009) indicated that remotely-sensed chl a exhibits a statistically significant increasing trend during summer (low flow) months, but no apparent trend during winter (high flow) months. Examination of several newly-available ocean data parameters in Giovanni, including 8-day resolution data, demonstrates the differences in ocean parameter trends at the two locations compared to regionally-averaged time-series. The hypothesis that the increased summer chl a values are related to increasing SST is evaluated, and the signature of the Eel River plume is defined with ocean optical parameters.

  6. Historical instrumental climate data for Australia - quality and utility for palaeoclimatic studies

    NASA Astrophysics Data System (ADS)

    Nicholls, Neville; Collins, Dean; Trewin, Blair; Hope, Pandora

    2006-10-01

    The quality and availability of climate data suitable for palaeoclimatic calibration and verification for the Australian region are discussed and documented. Details of the various datasets, including problems with the data, are presented. High-quality datasets, where such problems are reduced or even eliminated, are discussed. Many climate datasets are now analysed onto grids, facilitating the preparation of regional-average time series. Work is under way to produce such high-quality, gridded datasets for a variety of hitherto unavailable climate data, including surface humidity, pan evaporation, wind, and cloud. An experiment suggests that only a relatively small number of palaeoclimatic time series could provide a useful estimate of long-term changes in Australian annual average temperature.

  7. Phenologically-tuned MODIS NDVI-based production anomaly estimates for Zimbabwe

    USGS Publications Warehouse

    Funk, Chris; Budde, Michael E.

    2009-01-01

    For thirty years, simple crop water balance models have been used by the early warning community to monitor agricultural drought. These models estimate and accumulate actual crop evapotranspiration, evaluating environmental conditions based on crop water requirements. Unlike seasonal rainfall totals, these models take into account the phenology of the crop, emphasizing conditions during the peak grain filling phase of crop growth. In this paper we describe an analogous metric of crop performance based on time series of Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) imagery. A special temporal filter is used to screen for cloud contamination. Regional NDVI time series are then composited for cultivated areas, and adjusted temporally according to the timing of the rainy season. This adjustment standardizes the NDVI response vis-à-vis the expected phenological response of maize. A national time series index is then created by taking the cropped-area weighted average of the regional series. This national time series provides an effective summary of vegetation response in agricultural areas, and allows for the identification of NDVI green-up during grain filling. Onset-adjusted NDVI values following the grain filling period are well correlated with U.S. Department of Agriculture production figures, possess desirable linear characteristics, and perform better than more common indices such as maximum seasonal NDVI or seasonally averaged NDVI. Thus, just as appropriately calibrated crop water balance models can provide more information than seasonal rainfall totals, the appropriate agro-phenological filtering of NDVI can improve the utility and accuracy of space-based agricultural monitoring.

  8. 76 FR 70165 - Proposed Collection, Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-10

    ... vacancies, labor hires, and labor separations. As the monthly JOLTS time series grow longer, their value in... ensure that requested data can be provided in the desired format, reporting burden (time and financial... businesses and organizations. Total Total Average time Estimated Affected public respondents Frequency...

  9. Weather variability and the incidence of cryptosporidiosis: comparison of time series poisson regression and SARIMA models.

    PubMed

    Hu, Wenbiao; Tong, Shilu; Mengersen, Kerrie; Connell, Des

    2007-09-01

    Few studies have examined the relationship between weather variables and cryptosporidiosis in Australia. This paper examines the potential impact of weather variability on the transmission of cryptosporidiosis and explores the possibility of developing an empirical forecast system. Data on weather variables, notified cryptosporidiosis cases, and population size in Brisbane were supplied by the Australian Bureau of Meteorology, Queensland Department of Health, and Australian Bureau of Statistics, respectively, for the period January 1, 1996-December 31, 2004. Time series Poisson regression and seasonal autoregressive integrated moving average (SARIMA) models were used to examine the potential impact of weather variability on the transmission of cryptosporidiosis. Both the time series Poisson regression and SARIMA models show that seasonality and the monthly maximum temperature at prior moving averages of 1 and 3 months were significantly associated with cryptosporidiosis. This suggests that there may be 50 more cases a year for an increase of 1°C in average maximum temperature in Brisbane. Model assessments indicated that the SARIMA model had better predictive ability than the Poisson regression model (SARIMA: root mean square error (RMSE): 0.40, Akaike information criterion (AIC): -12.53; Poisson regression: RMSE: 0.54, AIC: -2.84). Furthermore, the analysis of residuals shows that the time series Poisson regression appeared to violate a modeling assumption, in that residual autocorrelation persisted. The results of this study suggest that weather variability (particularly maximum temperature) may have played a significant role in the transmission of cryptosporidiosis. A SARIMA model may be a better predictive model than a Poisson regression model for assessing the relationship between weather variability and the incidence of cryptosporidiosis.
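
    The Poisson-regression arm of such a comparison is straightforward to set up; a self-contained sketch on synthetic data (all names and coefficients are placeholders, not values from the study):

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 108                                    # nine years of monthly data
      t = np.arange(n)
      tmax = 25 + 5 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, n)
      tmax_ma3 = np.convolve(tmax, np.ones(3) / 3, mode="same")  # 3-month MA
      cases = rng.poisson(np.exp(0.5 + 0.05 * tmax_ma3))

      res = sm.GLM(cases, sm.add_constant(tmax_ma3),
                   family=sm.families.Poisson()).fit()
      print(res.summary())                       # effect per degree C of max temperature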

  10. Bayesian model averaging method for evaluating associations between air pollution and respiratory mortality: a time-series study

    PubMed Central

    Fang, Xin; Li, Runkui; Kan, Haidong; Bottai, Matteo; Fang, Fang

    2016-01-01

    Objective To demonstrate an application of Bayesian model averaging (BMA) with generalised additive mixed models (GAMM) and provide a novel modelling technique to assess the association between inhalable coarse particles (PM10) and respiratory mortality in time-series studies. Design A time-series study using a regional death registry between 2009 and 2010. Setting 8 districts in a large metropolitan area in Northern China. Participants 9559 permanent residents of the 8 districts who died of respiratory diseases between 2009 and 2010. Main outcome measures Per cent increase in daily respiratory mortality rate (MR) per interquartile range (IQR) increase of PM10 concentration and corresponding 95% confidence interval (CI) in single-pollutant and multipollutant (including NOx, CO) models. Results The Bayesian model averaged GAMM (GAMM+BMA) and the optimal GAMM of PM10, multipollutants and principal components (PCs) of multipollutants showed comparable results for the effect of PM10 on daily respiratory MR, that is, one IQR increase in PM10 concentration corresponded to 1.38% vs 1.39%, 1.81% vs 1.83% and 0.87% vs 0.88% increases, respectively, in daily respiratory MR. However, GAMM+BMA gave slightly but noticeably wider CIs for the single-pollutant model (−1.09 to 4.28 vs −1.08 to 3.93) and the PCs-based model (−2.23 to 4.07 vs −2.03 to 3.88). The CIs of the multiple-pollutant model from the two methods are similar, that is, −1.12 to 4.85 versus −1.11 to 4.83. Conclusions The BMA method may represent a useful tool for modelling uncertainty in time-series studies when evaluating the effect of air pollution on fatal health outcomes. PMID:27531727
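
    The core of BMA is weighting candidate models by their posterior probabilities. A minimal sketch using the common BIC approximation to those weights is given below; the model names, BIC values, and per-model coefficients are hypothetical placeholders, and the paper's actual weighting scheme for GAMMs may differ.

```python
import numpy as np

# Hypothetical candidate models and their BIC values.
bics = {"PM10": 412.3, "PM10+NOx": 409.8, "PM10+NOx+CO": 411.1}
b = np.array(list(bics.values()))
w = np.exp(-0.5 * (b - b.min()))
weights = w / w.sum()  # approximate posterior model probabilities

# Model-averaged effect: weight each model's PM10 coefficient (illustrative values).
betas = np.array([1.38, 1.81, 1.60])  # per-model % increase per IQR
bma_effect = np.dot(weights, betas)
print(dict(zip(bics, weights.round(3))), round(bma_effect, 2))
```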

  11. Interrupted time series analysis of children’s blood lead levels: A case study of lead hazard control program in Syracuse, New York

    PubMed Central

    Shao, Liyang; Zhang, Lianjun; Zhen, Zhen

    2017-01-01

    Children’s blood lead concentrations have been closely monitored over the last two decades in the United States. The bio-monitoring surveillance data collected by local agencies reflect the local temporal trends of children’s blood lead levels (BLLs). However, the analysis and modeling of long-term time series of BLLs have rarely been reported. We attempted to quantify the long-term trends of children’s BLLs in the city of Syracuse, New York and evaluate the impacts of local lead poisoning prevention programs and the Lead Hazard Control Program on reducing children’s BLLs. We applied interrupted time series analysis to the monthly time series of BLL surveillance data and used ARMA (autoregressive and moving average) models to measure the shift in average children’s blood lead level and detect changes in the seasonal pattern. Our results showed that there were three intervention stages over the past 20 years to reduce children’s BLLs in the city of Syracuse, NY. The average of children’s BLLs decreased significantly after the interventions, declining from 8.77μg/dL to 3.94μg/dL during 1992 to 2011. The seasonal variation diminished over the past decade, but more short-term influences appeared in the variation. The lead hazard control treatment intervention proved effective in reducing children’s blood lead levels in Syracuse, NY. Also, the reduction of the seasonal variation of children’s BLLs reflected the impacts of the local lead-based paint mitigation program. Window and door replacement was the major cost of house lead abatement. However, soil lead was not considered a major source of lead hazard in our analysis. PMID:28182688

  12. Relating the large-scale structure of time series and visibility networks.

    PubMed

    Rodríguez, Miguel A

    2017-06-01

    The structure of time series is usually characterized by means of correlations. A new proposal based on visibility networks has been considered recently. Visibility networks are complex networks mapped from surfaces or time series using visibility properties. The structures of time series and visibility networks are closely related, as shown by means of fractional time series in recent works. In these works, a simple relationship between the Hurst exponent H of fractional time series and the exponent of the distribution of edges γ of the corresponding visibility network, which exhibits a power law, is shown. To check and generalize these results, in this paper we delve into this idea of connected structures by defining both structures more properly. In addition to the exponents used before, H and γ, which take into account local properties, we consider two more exponents that, as we will show, characterize global properties. These are the exponent α for time series, which gives the scaling of the variance with the size as var∼T^{2α}, and the exponent κ of their corresponding network, which gives the scaling of the averaged maximum of the number of edges, 〈k_{M}〉∼N^{κ}. With this representation, a more precise connection between the structures of general time series and their associated visibility network is achieved. Similarities and differences are more clearly established, and new scaling forms of complex networks appear in agreement with their respective classes of time series.
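
    For readers unfamiliar with the mapping, the sketch below builds a natural visibility graph from a series: nodes are time points, and two points are linked when the straight line between them clears every intermediate sample. This O(n²) reference implementation is illustrative and not drawn from the paper.

```python
import itertools
import numpy as np

def visibility_graph(x):
    """Natural visibility graph: edge (a, b) when every c between a and b
    lies strictly below the line of sight from (a, x[a]) to (b, x[b])."""
    edges = set()
    for a, b in itertools.combinations(range(len(x)), 2):
        if all(x[c] < x[a] + (x[b] - x[a]) * (c - a) / (b - a)
               for c in range(a + 1, b)):
            edges.add((a, b))
    return edges

x = np.random.default_rng(1).standard_normal(200)
deg = np.zeros(len(x), dtype=int)
for a, b in visibility_graph(x):
    deg[a] += 1
    deg[b] += 1
print(deg.max())  # the maximum number of edges k_M for this realization
```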

  13. Forbidden patterns in financial time series

    NASA Astrophysics Data System (ADS)

    Zanin, Massimiliano

    2008-03-01

    The existence of forbidden patterns, i.e., certain missing sequences in a given time series, is a recently proposed instrument of potential application in the study of time series. Forbidden patterns are related to the permutation entropy, which has the basic properties of classic chaos indicators, such as the Lyapunov exponent or Kolmogorov entropy, thus allowing one to separate deterministic (usually chaotic) from random series; however, it requires fewer values of the series to be calculated, and it is suitable for use with small datasets. In this paper, the appearance of forbidden patterns is studied in different economic indicators such as stock indices (Dow Jones Industrial Average and Nasdaq Composite), NYSE stocks (IBM and Boeing), and others (ten-year bond interest rate), to find evidence of deterministic behavior in their evolution. Moreover, the rate of appearance of the forbidden patterns is calculated, and some considerations about the underlying dynamics are suggested.
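
    Counting ordinal patterns is straightforward to implement. The sketch below tallies order-4 patterns, flags those that never occur as forbidden, and computes the normalized permutation entropy; the order d = 4 and the white-noise test series are illustrative choices.

```python
from itertools import permutations
from math import factorial, log
import numpy as np

def ordinal_counts(x, d=4):
    """Count occurrences of each ordinal pattern of order d."""
    counts = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - d + 1):
        counts[tuple(np.argsort(x[i:i + d]))] += 1
    return counts

x = np.random.default_rng(2).standard_normal(5000)
counts = ordinal_counts(x, d=4)
forbidden = [p for p, c in counts.items() if c == 0]   # never-seen patterns
total = sum(counts.values())
H = -sum(c / total * log(c / total) for c in counts.values() if c) / log(factorial(4))
print(len(forbidden), H)  # white noise: few/no forbidden patterns, H near 1
```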

  14. Joint Seasonal ARMA Approach for Modeling of Load Forecast Errors in Planning Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hafen, Ryan P.; Samaan, Nader A.; Makarov, Yuri V.

    2014-04-14

    To make informed and robust decisions in the probabilistic power system operation and planning process, it is critical to conduct multiple simulations of the generated combinations of wind and load parameters and their forecast errors to handle the variability and uncertainty of these time series. In order for the simulation results to be trustworthy, the simulated series must preserve the salient statistical characteristics of the real series. In this paper, we analyze day-ahead load forecast error data from multiple balancing authority locations and characterize statistical properties such as mean, standard deviation, autocorrelation, correlation between series, time-of-day bias, and time-of-day autocorrelation. We then construct and validate a seasonal autoregressive moving average (ARMA) model to model these characteristics, and use the model to jointly simulate day-ahead load forecast error series for all BAs.
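
    To give a flavor of the simulation step, the sketch below draws a synthetic day-ahead error series from an ARMA process and superimposes a time-of-day bias profile. The ARMA coefficients, noise scale, and bias shape are illustrative assumptions rather than the fitted values from the paper.

```python
import numpy as np
from statsmodels.tsa.arima_process import arma_generate_sample

ar = np.r_[1, -0.7]      # lag-polynomial convention: AR(1) with phi = 0.7
ma = np.r_[1, 0.3]       # MA(1) with theta = 0.3
n_days, per_day = 30, 24
noise_scale = 0.02       # ~2% of peak load, illustrative

err = arma_generate_sample(ar, ma, n_days * per_day, scale=noise_scale)
hour_bias = 0.01 * np.sin(2 * np.pi * np.arange(per_day) / per_day)
err += np.tile(hour_bias, n_days)   # add a time-of-day bias profile
```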

  15. Are U.S. Military Interventions Contagious over Time? Intervention Timing and Its Implications for Force Planning

    DTIC Science & Technology

    2013-01-01

    ... ARIMA Models, Temporal Clustering of Conflicts ... variance across a distribution. Autoregressive integrated moving average (ARIMA) models are used with time-series data sets and are designed to capture...

  16. Using First Differences to Reduce Inhomogeneity in Radiosonde Temperature Datasets.

    NASA Astrophysics Data System (ADS)

    Free, Melissa; Angell, James K.; Durre, Imke; Lanzante, John; Peterson, Thomas C.; Seidel, Dian J.

    2004-11-01

    The utility of a “first difference” method for producing temporally homogeneous large-scale mean time series is assessed. Starting with monthly averages, the method involves dropping data around the time of suspected discontinuities and then calculating differences in temperature from one year to the next, resulting in a time series of year-to-year differences for each month at each station. These first difference time series are then combined to form large-scale means, and mean temperature time series are constructed from the first difference series. When applied to radiosonde temperature data, the method introduces random errors that decrease with the number of station time series used to create the large-scale time series and increase with the number of temporal gaps in the station time series. Root-mean-square errors for annual means of datasets produced with this method using over 500 stations are estimated at no more than 0.03 K, with errors in trends less than 0.02 K decade⁻¹ for 1960–97 at 500 mb. For a 50-station dataset, errors in trends in annual global means introduced by the first differencing procedure may be as large as 0.06 K decade⁻¹ (for six breaks per series), which is greater than the standard error of the trend. Although the first difference method offers significant resource and labor advantages over methods that attempt to adjust the data, it introduces an error in large-scale mean time series that may be unacceptable in some cases.
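
    The method reduces to a per-station difference, a cross-station mean, and a cumulative sum. A minimal sketch under those assumptions (synthetic station data, NaNs marking dropped values) is below; it is a reading of the description above, not the authors' code.

```python
import numpy as np

def first_difference_series(station_data):
    """station_data: 2-D array (stations x years); NaN where data are
    missing or dropped around suspected discontinuities."""
    diffs = station_data[:, 1:] - station_data[:, :-1]  # per-station year-to-year deltas
    mean_diff = np.nanmean(diffs, axis=0)               # large-scale mean delta per year
    return np.concatenate([[0.0], np.nancumsum(mean_diff)])  # anomaly vs first year

rng = np.random.default_rng(3)
temps = 15 + np.cumsum(rng.normal(0, 0.1, (50, 38)), axis=1)  # 50 stations, 1960-97
temps[rng.random(temps.shape) < 0.05] = np.nan                # simulate dropped data
print(first_difference_series(temps)[:5])
```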


  17. Evaluation of COPD's diaphragm motion extracted from 4D-MRI

    NASA Astrophysics Data System (ADS)

    Swastika, Windra; Masuda, Yoshitada; Kawata, Naoko; Matsumoto, Koji; Suzuki, Toshio; Iesato, Ken; Tada, Yuji; Sugiura, Toshihiko; Tanabe, Nobuhiro; Tatsumi, Koichiro; Ohnishi, Takashi; Haneishi, Hideaki

    2015-03-01

    We have developed a method called the intersection profile method to construct 4D-MRI (3D+time) from time series of 2D-MRI. The basic idea is to find the best matching of the intersection profile from the time series of 2D-MRI in the sagittal plane (navigator slice) and the time series of 2D-MRI in the coronal plane (data slice). In this study, we use 4D-MRI to semiautomatically extract the right diaphragm motion of 16 subjects (8 healthy subjects and 8 COPD patients). The diaphragm motion is then evaluated quantitatively by calculating the displacement for each subject and normalizing it. We also generate phase-length maps to view and locate paradoxical motion in the COPD patients. The quantitative results for the normalized displacement show that COPD patients tend to have smaller displacement than healthy subjects. The average normalized displacement of the 8 COPD patients is 9.4 mm and that of the 8 healthy volunteers is 15.3 mm. The generated phase-length maps show that not all of the COPD patients have paradoxical motion; however, when paradoxical motion is present, the phase-length map is able to locate where it occurs.

  18. Predicting Long-Term College Success through Degree Completion Using ACT[R] Composite Score, ACT Benchmarks, and High School Grade Point Average. ACT Research Report Series, 2012 (5)

    ERIC Educational Resources Information Center

    Radunzel, Justine; Noble, Julie

    2012-01-01

    This study compared the effectiveness of ACT[R] Composite score and high school grade point average (HSGPA) for predicting long-term college success. Outcomes included annual progress towards a degree (based on cumulative credit-bearing hours earned), degree completion, and cumulative grade point average (GPA) at 150% of normal time to degree…

  19. Weak ergodicity breaking, irreproducibility, and ageing in anomalous diffusion processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Metzler, Ralf

    2014-01-14

    Single particle traces are standardly evaluated in terms of time averages of the second moment of the position time series r(t). For ergodic processes, one can interpret such results in terms of the known theories for the corresponding ensemble averaged quantities. In anomalous diffusion processes, that are widely observed in nature over many orders of magnitude, the equivalence between (long) time and ensemble averages may be broken (weak ergodicity breaking), and these time averages may no longer be interpreted in terms of ensemble theories. Here we detail some recent results on weakly non-ergodic systems with respect to the time averaged mean squared displacement, the inherent irreproducibility of individual measurements, and methods to determine the exact underlying stochastic process. We also address the phenomenon of ageing, the dependence of physical observables on the time span between initial preparation of the system and the start of the measurement.
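
    The central observable here has a simple form: the time-averaged MSD of a single trajectory, δ²(Δ) = ⟨[r(t+Δ) − r(t)]²⟩ with the average running over t. A minimal sketch on a synthetic Brownian trajectory (for ordinary Brownian motion, time and ensemble averages agree):

```python
import numpy as np

def ta_msd(r, lags):
    """Time-averaged MSD of one trajectory at the given lags."""
    return np.array([np.mean((r[lag:] - r[:-lag]) ** 2) for lag in lags])

rng = np.random.default_rng(16)
traj = np.cumsum(rng.standard_normal(10000))   # Brownian trajectory
print(ta_msd(traj, lags=[1, 10, 100]))         # grows roughly linearly in lag
```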

  1. High Speed Solution of Spacecraft Trajectory Problems Using Taylor Series Integration

    NASA Technical Reports Server (NTRS)

    Scott, James R.; Martini, Michael C.

    2008-01-01

    Taylor series integration is implemented in a spacecraft trajectory analysis code, the Spacecraft N-body Analysis Program (SNAP), and compared with the code's existing eighth-order Runge-Kutta Fehlberg time integration scheme. Nine trajectory problems, including near Earth, lunar, Mars and Europa missions, are analyzed. Head-to-head comparison at five different error tolerances shows that, on average, Taylor series is faster than Runge-Kutta Fehlberg by a factor of 15.8. Results further show that Taylor series has superior convergence properties. Taylor series integration demonstrates that it can provide rapid, highly accurate solutions to spacecraft trajectory problems.
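
    The idea behind a Taylor-series integrator is to advance the state with a truncated Taylor expansion whose coefficients follow from the ODE by recurrence. The sketch below shows one high-order step for the harmonic oscillator y'' = -y, where the recurrence is y_{k+2} = -y_k / ((k+1)(k+2)); it illustrates the technique only and is unrelated to SNAP's actual implementation.

```python
import numpy as np

def taylor_step(y, v, h, order=15):
    """Advance y'' = -y by one step h using normalized Taylor coefficients."""
    c = np.zeros(order + 1)
    c[0], c[1] = y, v
    for k in range(order - 1):
        c[k + 2] = -c[k] / ((k + 1) * (k + 2))          # recurrence from the ODE
    powers = h ** np.arange(order + 1)
    y_new = np.dot(c, powers)                            # y(t + h)
    v_new = np.dot(c[1:] * np.arange(1, order + 1), powers[:-1])  # y'(t + h)
    return y_new, v_new

y, v, h = 1.0, 0.0, 0.5
for _ in range(20):
    y, v = taylor_step(y, v, h)
print(y, np.cos(20 * h))  # agrees with the exact solution to high accuracy
```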

  2. Using SAR satellite data time series for regional glacier mapping

    NASA Astrophysics Data System (ADS)

    Winsvold, Solveig H.; Kääb, Andreas; Nuth, Christopher; Andreassen, Liss M.; van Pelt, Ward J. J.; Schellenberger, Thomas

    2018-03-01

    With dense SAR satellite data time series it is possible to map surface and subsurface glacier properties that vary in time. Using Sentinel-1A and RADARSAT-2 backscatter time series images over mainland Norway and Svalbard, we outline how to map glaciers using descriptive methods. We present five application scenarios. The first shows potential for tracking transient snow lines with SAR backscatter time series and correlates with both optical satellite images (Sentinel-2A and Landsat 8) and equilibrium line altitudes derived from in situ surface mass balance data. In the second application scenario, time series representation of glacier facies corresponding to SAR glacier zones shows potential for a more accurate delineation of the zones and how they change in time. The third application scenario investigates the firn evolution using dense SAR backscatter time series together with a coupled energy balance and multilayer firn model. We find strong correlation between backscatter signals and both the modeled firn air content and modeled wetness in the firn. In the fourth application scenario, we highlight how winter rain events can be detected in SAR time series, revealing important information about the areal extent of internal accumulation. In the last application scenario, averaged summer SAR images were found to have potential in assisting the process of mapping glacier outlines, especially in the presence of seasonal snow. Altogether we present examples of how to map glaciers and to further understand glaciological processes using the existing and future massive amounts of multi-sensor time series data.

  3. Multifractal analysis of time series generated by discrete Ito equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Telesca, Luciano; Czechowski, Zbigniew; Lovallo, Michele

    2015-06-15

    In this study, we show that discrete Ito equations with short-tail Gaussian marginal distribution function generate multifractal time series. The multifractality is due to the nonlinear correlations, which are hidden in Markov processes and are generated by the interrelation between the drift and the multiplicative stochastic forces in the Ito equation. A link between the range of the generalized Hurst exponents and the mean of the squares of all averaged net forces is suggested.
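
    As a point of reference, a discrete Ito iteration has the generic form x_{n+1} = x_n + a(x_n)Δt + b(x_n)√Δt ξ_n with Gaussian ξ_n. The sketch below iterates one such equation with a mean-reverting drift and a multiplicative stochastic force; the coefficient choices are illustrative and not those of the study.

```python
import numpy as np

rng = np.random.default_rng(4)
n, dt = 100_000, 1.0
x = np.empty(n)
x[0] = 0.0
for i in range(n - 1):
    drift = -0.1 * x[i]                        # a(x): mean reversion
    vol = 0.5 * np.sqrt(1.0 + x[i] ** 2)       # b(x): multiplicative force
    x[i + 1] = x[i] + drift * dt + vol * np.sqrt(dt) * rng.standard_normal()
# x is a Markov series whose drift/noise interplay can hide nonlinear correlations
```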

  4. Alcohol and liver cirrhosis mortality in the United States: comparison of methods for the analyses of time-series panel data models.

    PubMed

    Ye, Yu; Kerr, William C

    2011-01-01

    To explore various model specifications in estimating relationships between liver cirrhosis mortality rates and per capita alcohol consumption in aggregate-level cross-section time-series data. Using a series of liver cirrhosis mortality rates from 1950 to 2002 for 47 U.S. states, the effects of alcohol consumption were estimated from pooled autoregressive integrated moving average (ARIMA) models and 4 types of panel data models: generalized estimating equation, generalized least square, fixed effect, and multilevel models. Various specifications of error term structure under each type of model were also examined. Different approaches to controlling for time trends and to using concurrent or accumulated consumption as predictors were also evaluated. When cirrhosis mortality was predicted by total alcohol, highly consistent estimates were found between ARIMA and panel data analyses, with an average overall effect of 0.07 to 0.09. Less consistent estimates were derived using spirits, beer, and wine consumption as predictors. When multiple geographic time series are combined as panel data, none of the existing models can accommodate all sources of heterogeneity, so any type of panel model must employ some form of generalization. Different types of panel data models should thus be estimated to examine the robustness of findings. We also suggest cautious interpretation when beverage-specific volumes are used as predictors. Copyright © 2010 by the Research Society on Alcoholism.

  5. Toward the prediction of the Wolf sunspot number R using time-lagged neural networks

    NASA Astrophysics Data System (ADS)

    Francile, C.; Luoni, M. L.

    We present a prediction of the time series of the Wolf sunspot number R using time-lagged feedforward neural networks. We use two types of networks, focused and distributed, which were trained with the back-propagation of errors algorithm and the temporal back-propagation algorithm, respectively. As inputs to the neural networks we use the time series of the number R averaged annually and monthly with the IR5 method. As data sets for training and testing we chose certain intervals of the time series similar to those in other works, in order to compare the results. Finally we discuss the topology of the networks, the number of delays, the number of neurons per layer, the number of hidden layers, and the results in predicting the series between one and six steps ahead. FULL TEXT IN SPANISH

  6. Seasonal trend analysis and ARIMA modeling of relative humidity and wind speed time series around Yamula Dam

    NASA Astrophysics Data System (ADS)

    Eymen, Abdurrahman; Köylü, Ümran

    2018-02-01

    Local climate change is determined by analysis of long-term recorded meteorological data. In the statistical analysis of the meteorological data, the Mann-Kendall rank test, which is a non-parametric test, has been used; for determining the magnitude of the trend, the Theil-Sen method has been used on the data obtained from 16 meteorological stations. The stations cover the provinces of Kayseri, Sivas, Yozgat, and Nevşehir in the Central Anatolia region of Turkey. Changes in land use affect local climate. Dams are structures that cause major changes on the land. Yamula Dam is located 25 km northwest of Kayseri. The dam has a huge water body of approximately 85 km². The mentioned tests have been used for detecting the presence of any positive or negative trend in the meteorological data. The meteorological data for the seasonal average, maximum, and minimum values of relative humidity and the seasonal average wind speed have been organized as time series and the tests have been conducted accordingly. As a result of these tests, the following were identified: increases were observed in minimum relative humidity values in the spring, summer, and autumn seasons. As for the seasonal average wind speed, decreases were detected for nine stations in all seasons, whereas increases were observed at four stations. After the trend analysis, pre-dam mean relative humidity time series were modeled with the Autoregressive Integrated Moving Average (ARIMA) model, a statistical modeling tool. Post-dam relative humidity values were predicted by the ARIMA models.
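
    Both tests are available through scipy. The sketch below runs a Mann-Kendall-style test (via Kendall's tau against time, which is equivalent for series without ties or autocorrelation) and a Theil-Sen slope on a synthetic humidity-like series; the data are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
t = np.arange(120)                                  # 30 years of seasonal values
x = 60 + 0.05 * t + rng.normal(0, 2, t.size)        # humidity-like series with a trend

tau, p_value = stats.kendalltau(t, x)               # Mann-Kendall via Kendall's tau
slope, intercept, lo, hi = stats.theilslopes(x, t)  # Sen slope with confidence bounds
print(f"tau={tau:.2f}, p={p_value:.3g}, Sen slope={slope:.3f} per season")
```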

  7. Real time detection of farm-level swine mycobacteriosis outbreak using time series modeling of the number of condemned intestines in abattoirs.

    PubMed

    Adachi, Yasumoto; Makita, Kohei

    2015-09-01

    Mycobacteriosis in swine is a common zoonosis found in abattoirs during meat inspections, and the veterinary authority is expected to inform the producer so that corrective actions can be taken when an outbreak is detected. The expected value of the number of condemned carcasses due to mycobacteriosis would therefore be a useful threshold for detecting an outbreak, and the present study aims to develop such an expected value through time series modeling. The model was developed using eight years of inspection data (2003 to 2010) obtained at 2 abattoirs of the Higashi-Mokoto Meat Inspection Center, Japan. The resulting model was validated by comparing the predicted time-dependent values with the actual data for the subsequent 2 years, 2011 and 2012. For the modeling, first, periodicities were checked using the Fast Fourier Transform, and the ensemble average profiles for weekly periodicities were calculated. An Auto-Regressive Integrated Moving Average (ARIMA) model was fitted to the residual of the ensemble average on the basis of the minimum Akaike information criterion (AIC). The sum of the ARIMA model and the weekly ensemble average was regarded as the time-dependent expected value. During 2011 and 2012, the number of wholly or partially condemned carcasses exceeded the 95% confidence interval of the predicted values 20 times. All of these events were associated with the slaughtering of pigs from the three producers with the highest rates of condemnation due to mycobacteriosis.
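
    The pipeline described above (FFT periodicity check, weekly ensemble average, ARIMA on the residual) can be sketched in a few lines. The daily-count data, weekday profile, and ARIMA order below are illustrative assumptions, not the inspection data or the fitted order.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
n_weeks = 416                                   # roughly eight years of daily counts
weekly = np.array([5, 9, 8, 8, 7, 3, 0.5])      # illustrative weekday profile
y = np.tile(weekly, n_weeks) + rng.poisson(2, 7 * n_weeks)

# 1. Periodicity check: the spectrum should peak at frequency 1/7 per day.
freqs = np.fft.rfftfreq(y.size, d=1.0)
peak = freqs[np.argmax(np.abs(np.fft.rfft(y - y.mean())))]

# 2. Weekly ensemble average and its residual.
profile = y.reshape(n_weeks, 7).mean(axis=0)
resid = y - np.tile(profile, n_weeks)

# 3. ARIMA on the residual; expected value = ARIMA forecast + weekly profile.
fit = ARIMA(resid, order=(1, 0, 1)).fit()
expected = fit.forecast(7) + profile
print(round(1 / peak), expected.round(1))
```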

  8. Using wavelet-feedforward neural networks to improve air pollution forecasting in urban environments.

    PubMed

    Dunea, Daniel; Pohoata, Alin; Iordache, Stefania

    2015-07-01

    The paper presents a screening of various feedforward neural networks (FANN) and wavelet-feedforward neural networks (WFANN) applied to time series of ground-level ozone (O3), nitrogen dioxide (NO2), and particulate matter (PM10 and PM2.5 fractions) recorded at four monitoring stations located in various urban areas of Romania, to identify common configurations with optimal generalization performance. Two distinct model runs were performed: data processing using hourly-recorded time series of airborne pollutants during cold months (O3, NO2, and PM10), when residential heating increases the local emissions, and data processing using 24-h daily averaged concentrations (PM2.5) recorded between 2009 and 2012. Dataset variability was assessed using statistical analysis. Time series were passed through various FANNs. Each time series was also decomposed into four time-scale components using three-level wavelets, which were each passed through a FANN and recomposed into a single time series. The agreement between observed and modelled output was evaluated based on statistical significance (r coefficient and correlation between errors and data). Use of a Daubechies db3 wavelet with an Rprop FANN (6-4-1) gave positive results for O3 time series, improving on the exclusive use of the FANN for hourly-recorded time series. NO2 was difficult to model due to time series specificity, but wavelet integration improved FANN performance. The Daubechies db3 wavelet did not improve the FANN outputs for PM10 time series. Both models (FANN/WFANN) overestimated PM2.5 forecasted values in the last quarter of the time series. A potential improvement of the forecasted values could be the integration of a smoothing algorithm to adjust the PM2.5 model outputs.

  9. Understanding the source of multifractality in financial markets

    NASA Astrophysics Data System (ADS)

    Barunik, Jozef; Aste, Tomaso; Di Matteo, T.; Liu, Ruipeng

    2012-09-01

    In this paper, we use the generalized Hurst exponent approach to study the multi-scaling behavior of different financial time series. We show that this approach is robust and powerful in detecting different types of multi-scaling. We observe a puzzling phenomenon where an apparent increase in multifractality is measured in time series generated from shuffled returns, where all time-correlations are destroyed, while the return distributions are conserved. This effect is robust and it is reproduced in several real financial data including stock market indices, exchange rates and interest rates. In order to understand the origin of this effect we investigate different simulated time series by means of the Markov switching multifractal model, autoregressive fractionally integrated moving average processes with stable innovations, fractional Brownian motion and Levy flights. Overall we conclude that the multifractality observed in financial time series is mainly a consequence of the characteristic fat-tailed distribution of the returns, and time-correlations have the effect of decreasing the measured multifractality.
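
    The generalized Hurst exponent H(q) is read off from the scaling of q-th order moments of increments, E|x(t+τ) − x(t)|^q ∼ τ^{qH(q)}. A minimal estimator along those lines is sketched below on a random walk; the lag range and q values are illustrative.

```python
import numpy as np

def generalized_hurst(x, q=2, taus=range(1, 20)):
    """Estimate H(q) from the scaling of q-th order structure functions."""
    taus = np.asarray(list(taus))
    moments = [np.mean(np.abs(x[tau:] - x[:-tau]) ** q) for tau in taus]
    slope = np.polyfit(np.log(taus), np.log(moments), 1)[0]  # slope = q * H(q)
    return slope / q

rw = np.cumsum(np.random.default_rng(7).standard_normal(20000))
print(generalized_hurst(rw, q=1), generalized_hurst(rw, q=3))  # both near 0.5
```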

  10. 10 CFR 300.5 - Submission of an entity statement.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...

  11. 10 CFR 300.5 - Submission of an entity statement.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...

  12. 10 CFR 300.5 - Submission of an entity statement.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...

  13. 10 CFR 300.5 - Submission of an entity statement.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...

  14. 10 CFR 300.5 - Submission of an entity statement.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...

  15. Phase walk analysis of leptokurtic time series.

    PubMed

    Schreiber, Korbinian; Modest, Heike I; Räth, Christoph

    2018-06-01

    Fourier phase information plays a key role in the quantified description of nonlinear data. We present a novel tool for time series analysis that identifies nonlinearities by sensitively detecting correlations among the Fourier phases. The method, called phase walk analysis, is based on well-established measures from random walk analysis, which are here applied to the unwrapped Fourier phases of time series. We provide an analytical description of its functionality and demonstrate its capabilities on systematically controlled leptokurtic noise. Hereby, we investigate the properties of leptokurtic time series and their influence on the Fourier phases of time series. The phase walk analysis is applied to measured and simulated intermittent time series, whose probability density distributions are approximated by power laws. We use the day-to-day returns of the Dow Jones Industrial Average, a synthetic time series with tailored nonlinearities mimicking the power-law behavior of the Dow Jones, and the acceleration of the wind at an Atlantic offshore site. Testing for nonlinearities by means of surrogates shows that the new method yields strong significances for nonlinear behavior. Due to the drastically decreased computing time as compared to embedding space methods, the number of surrogate realizations can be increased by orders of magnitude. Thereby, the probability distribution of the test statistics can be derived very accurately and parameterized, which allows for much more precise tests on nonlinearities.
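
    A rough reading of the idea, for orientation only (this is not the authors' reference implementation): unwrap the Fourier phases of the series, form a cumulative walk from their increments, and compare its excursion against phase-randomized surrogates.

```python
import numpy as np

def phase_walk(x):
    """Cumulative walk built from unwrapped Fourier phase increments."""
    phases = np.angle(np.fft.rfft(x))[1:]      # drop the DC phase
    steps = np.diff(np.unwrap(phases))
    return np.cumsum(steps - steps.mean())     # zero-drift phase walk

rng = np.random.default_rng(8)
x = rng.standard_normal(4096)
walk = phase_walk(x)
print(np.abs(walk).max())  # test statistic; compare against surrogate ensemble
```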

  16. Radiocarbon dating uncertainty and the reliability of the PEWMA method of time-series analysis for research on long-term human-environment interaction

    PubMed Central

    Carleton, W. Christopher; Campbell, David; Collard, Mark

    2018-01-01

    Statistical time-series analysis has the potential to improve our understanding of human-environment interaction in deep time. However, radiocarbon dating—the most common chronometric technique in archaeological and palaeoenvironmental research—creates challenges for established statistical methods. The methods assume that observations in a time-series are precisely dated, but this assumption is often violated when calibrated radiocarbon dates are used because they usually have highly irregular uncertainties. As a result, it is unclear whether the methods can be reliably used on radiocarbon-dated time-series. With this in mind, we conducted a large simulation study to investigate the impact of chronological uncertainty on a potentially useful time-series method. The method is a type of regression involving a prediction algorithm called the Poisson Exponentially Weighted Moving Average (PEWMA). It is designed for use with count time-series data, which makes it applicable to a wide range of questions about human-environment interaction in deep time. Our simulations suggest that the PEWMA method can often correctly identify relationships between time-series despite chronological uncertainty. When two time-series are correlated with a coefficient of 0.25, the method is able to identify that relationship correctly 20–30% of the time, provided the time-series contain low noise levels. With correlations of around 0.5, it is capable of correctly identifying correlations despite chronological uncertainty more than 90% of the time. While further testing is desirable, these findings indicate that the method can be used to test hypotheses about long-term human-environment interaction with a reasonable degree of confidence. PMID:29351329

  17. Changes in the NDVI of Boreal Forests over the period 1984 to 2003 measured using time series of Landsat TM/ETM+ surface reflectance and the GIMMS AVHRR NDVI record.

    NASA Astrophysics Data System (ADS)

    McMillan, A. M.; Rocha, A. V.; Goulden, M. L.

    2006-12-01

    There is a prevailing opinion that the boreal landscape is undergoing change as a result of warming temperatures leading to earlier springs, greater forest fire frequency and possibly CO2 fertilization. One widely-used line of evidence is the GIMMS AVHRR NDVI record. Several studies suggest increasing rates of photosynthesis in boreal forests from 1982 to 1991 (based on NDVI increases) while others suggest declining photosynthesis from 1996 to 2003. We suspect that a portion of these changes is due to the successional stage of the forests. We compiled a time-series of atmospherically-corrected Landsat TM/ETM+ images spanning the period 1984 to 2003 over the BOREAS Northern Study Area and compared spatial and temporal patterns of NDVI between the two records. The Landsat time series is higher resolution and, together with the Canadian Fire Service Large Fire Database, provides stand-age information. We then (1) analyzed the agreement between the Landsat and GIMMS AVHRR time series; (2) determined how the stage of forest succession affected NDVI; and (3) assessed how the calculation method of annual averages of NDVI affects decadal-scale trends. The agreement between the Landsat and the AVHRR records was reasonable, although the depression of NDVI associated with the aerosols from the Pinatubo volcano was greater in the GIMMS time series. Pixels containing high proportions of stands burned within a decade of the observation period showed very high gains in NDVI while the more mature stands were constant. While NDVI appears to exhibit a large sensitivity to the presence of snow, the choice of a May to September averaging period for NDVI over a June to August averaging period did not affect the interannual patterns in NDVI at this location because the snow pack was seldom present in either of these periods. Knowledge of the spatial and temporal patterns of wildfire will prove useful in interpreting trends of remotely-sensed proxies of photosynthesis.

  18. Rainfall disaggregation for urban hydrology: Effects of spatial consistence

    NASA Astrophysics Data System (ADS)

    Müller, Hannes; Haberlandt, Uwe

    2015-04-01

    For urban hydrology, rainfall time series with a high temporal resolution are crucial. Observed time series of this kind are very short in most cases, so they cannot be used. On the contrary, time series with lower temporal resolution (daily measurements) exist for much longer periods. The objective is to derive time series with a long duration and a high resolution by disaggregating the time series of the non-recording stations with information from the time series of the recording stations. The multiplicative random cascade model is a well-known disaggregation model for daily time series. For urban hydrology it is often assumed that a day consists of only 1280 minutes in total as the starting point for the disaggregation process. We introduce a new variant of the cascade model, which works without this assumption and also outperforms the existing approach regarding time series characteristics like wet and dry spell duration, average intensity, fraction of dry intervals and extreme value representation. However, in both approaches rainfall time series of different stations are disaggregated without consideration of surrounding stations. This results in unrealistic spatial patterns of rainfall. We apply a simulated annealing algorithm that has been used successfully for hourly values before. Relative diurnal cycles of the disaggregated time series are resampled to reproduce the spatial dependence of rainfall. To describe spatial dependence we use bivariate characteristics like probability of occurrence, continuity ratio and coefficient of correlation. The investigation area is a sewage system in Northern Germany. We show that the algorithm has the capability to improve spatial dependence. The influence of the chosen disaggregation routine and of the spatial dependence on overflow occurrences and volumes of the sewage system will be analyzed.
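
    To illustrate the cascade mechanism, the sketch below repeatedly splits a daily total in two with random weights; eight levels turn a 1280-minute day into 256 five-minute intervals while conserving mass. The weight distribution and the probability of assigning all rain to one branch are illustrative, not the calibrated parameters of the study.

```python
import numpy as np

rng = np.random.default_rng(9)

def cascade_disaggregate(daily_mm, levels=8, p_one_sided=0.2):
    """Split a daily total into 2**levels intervals via a random cascade."""
    amounts = np.array([daily_mm])
    for _ in range(levels):
        w = rng.uniform(0.3, 0.7, size=amounts.size)       # splitting weights
        one_sided = rng.random(amounts.size) < p_one_sided # all rain to one branch
        w[one_sided] = rng.integers(0, 2, one_sided.sum()) # weight 0 or 1
        amounts = np.column_stack([amounts * w, amounts * (1 - w)]).ravel()
    return amounts  # 256 intervals of 1280 / 256 = 5 minutes each

print(cascade_disaggregate(12.0).sum())  # mass is conserved: 12.0
```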

  19. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L. K.; Vogel, R. M.

    2015-11-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied Generalized Pareto (GP) model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series X, with corresponding failure time series T, should have application to a wide class of natural hazards with rich opportunities for future extensions.

  1. Hydrodynamic and suspended-solids concentration measurements in Suisun Bay, California, 1995

    USGS Publications Warehouse

    Cuetara, Jay I.; Burau, Jon R.; Schoellhamer, David H.

    2001-01-01

    Sea level, current velocity, water temperature, salinity (computed from conductivity and temperature), and suspended-solids data collected in Suisun Bay, California, from May 30, 1995, through October 27, 1995, by the U.S. Geological Survey are documented in this report. Data were collected concurrently at 21 sites. Various parameters were measured at each site. Velocity-profile data were collected at 6 sites, single-point velocity measurements were made at 9 sites, salinity data were collected at 20 sites, and suspended-solids concentrations were measured at 10 sites. Sea-level and velocity data are presented in three forms: harmonic analysis results; time-series plots (sea level, current speed, and current direction versus time); and time-series plots of low-pass-filtered time series. Temperature, salinity, and suspended-solids data are presented as plots of raw and low-pass-filtered time series. The velocity and salinity data presented in this report document a period when the residual current patterns and salt field were transitioning from a freshwater-inflow-dominated condition towards a quasi steady-state summer condition when density-driven circulation and tidal nonlinearities became relatively more important as long-term transport mechanisms. Sacramento-San Joaquin River Delta outflow was high prior to and during this study, so the tidally averaged salinities were abnormally low for this time of year. For example, the tidally averaged salinities varied from 0-12 at Martinez, the western border of Suisun Bay, to a maximum of 2 at Mallard Island, the eastern border of Suisun Bay. Even though salinities increased overall in Suisun Bay during the study period, the near-bed residual currents primarily were directed seaward. Therefore, salinity intrusion through Suisun Bay towards the Delta primarily was accomplished in the absence of the tidally averaged, two-layer flow known as gravitational circulation where, by definition, the net currents are landward at the bed. The Folsom Dam spillway gate failure on July 17, 1995, was analyzed to determine its effect on the hydrodynamics of Suisun Bay. The peak flow of the American River reached roughly 1,000 cubic meters per second as a result of the failure, which is relatively small: roughly 15 percent of the approximately 7,000 cubic meters per second tidal flows that occur daily in Suisun Bay, and it was likely attenuated greatly. Based on analysis of tidally averaged near-bed salinity and depth-averaged currents after the failure, the effect was essentially nonexistent and is indistinguishable from the natural variability.

  2. Impacts of GNSS position offsets on global frame stability

    NASA Astrophysics Data System (ADS)

    Griffiths, Jake; Ray, Jim

    2015-04-01

    Positional offsets appear in Global Navigation Satellite System (GNSS) time series for a variety of reasons. Antenna or radome changes are the most common cause for these discontinuities. Many others are from earthquakes, receiver changes, and different anthropogenic modifications at or near the stations. Some jumps appear for unknown or undocumented reasons. Accurate determination of station velocities, and therefore geophysical parameters and terrestrial reference frames, requires that positional offsets be correctly found and compensated. Williams (2003) found that undetected offsets introduce a random walk error component in individual station time series. The topic of detecting positional offsets has received considerable attention in recent years (e.g., Detection of Offsets in GPS Experiment; DOGEx), and most research groups using GNSS have adopted a mix of manual and automated methods for finding them. The removal of a positional offset from a time series is usually handled by estimating the average station position on both sides of the discontinuity. Except for large earthquake events, the velocity is usually assumed constant and continuous across the positional jump. This approach is sufficient in the absence of time-correlated errors. However, GNSS time series contain periodic and power-law (flicker) errors. In this paper, we evaluate the impact on individual station results and the overall stability of the global reference frame from adding increasing numbers of positional discontinuities. We use the International GNSS Service (IGS) weekly SINEX files, and iteratively insert positional offset parameters. Each iteration includes a restacking of the modified SINEX files using the CATREF software from Institut National de l'Information Géographique et Forestière (IGN). Comparisons of successive stacked solutions are used to assess the impacts on the time series of x-pole and y-pole offsets, along with changes in regularized position and secular velocity for stations with more than 2.5 years of data. Our preliminary results indicate that the change in polar motion scatter is logarithmic with increasing numbers of discontinuities. The best-fit natural logarithm to the changes in scatter for x-pole has R² = 0.58; the fit for the y-pole series has R² = 0.99. From these empirical functions, we find that polar motion scatter increases from zero when the total rate of discontinuities exceeds 0.2 (x-pole) and 1.3 (y-pole) per station, on average (the IGS has 0.65 per station). Thus, the presence of position offsets in GNSS station time series is likely already a contributor to IGS polar motion inaccuracy and global frame instability. Impacts on station position and velocity estimates depend on noise features found in that station's positional time series. For instance, larger changes in velocity occur for stations with shorter and noisier data spans. This is because an added discontinuity parameter for an individual station time series can induce changes in average position on both sides of the break. We will expand on these results, and consider remaining questions about the role of velocity discontinuities and the effects caused by non-core reference frame stations.

  3. Riemannian multi-manifold modeling and clustering in brain networks

    NASA Astrophysics Data System (ADS)

    Slavakis, Konstantinos; Salsabilian, Shiva; Wack, David S.; Muldoon, Sarah F.; Baidoo-Williams, Henry E.; Vettel, Jean M.; Cieslak, Matthew; Grafton, Scott T.

    2017-08-01

    This paper introduces Riemannian multi-manifold modeling in the context of brain-network analytics: brain-network time-series yield features which are modeled as points lying in or close to a union of a finite number of submanifolds within a known Riemannian manifold. Distinguishing disparate time series thus amounts to clustering multiple Riemannian submanifolds. To this end, two feature-generation schemes for brain-network time series are put forth. The first one is motivated by Granger-causality arguments and uses an auto-regressive moving average model to map low-rank linear vector subspaces, spanned by column vectors of appropriately defined observability matrices, to points in the Grassmann manifold. The second one utilizes (non-linear) dependencies among network nodes by introducing kernel-based partial correlations to generate points in the manifold of positive-definite matrices. Based on recently developed research on clustering Riemannian submanifolds, an algorithm is provided for distinguishing time series based on their Riemannian-geometry properties. Numerical tests on time series, synthetically generated from real brain-network structural connectivity matrices, reveal that the proposed scheme outperforms classical and state-of-the-art techniques in clustering brain-network states/structures.

  4. Does preprocessing change nonlinear measures of heart rate variability?

    PubMed

    Gomes, Murilo E D; Guimarães, Homero N; Ribeiro, Antônio L P; Aguirre, Luis A

    2002-11-01

    This work investigated whether methods used to produce a uniformly sampled heart rate variability (HRV) time series significantly change the deterministic signature underlying the dynamics of such signals and some nonlinear measures of HRV. Two methods of preprocessing were used: the convolution of inverse interval function values with a rectangular window, and cubic polynomial interpolation. The HRV time series were obtained from 33 Wistar rats submitted to autonomic blockade protocols and from 17 healthy adults. The analysis of determinism was carried out by the method of surrogate data sets and nonlinear autoregressive moving average modelling and prediction. The scaling exponents alpha, alpha(1) and alpha(2) derived from detrended fluctuation analysis were calculated from raw HRV time series and the respective preprocessed signals. It was shown that the technique of cubic interpolation of HRV time series did not significantly change any nonlinear characteristic studied in this work, while the method of convolution only affected the alpha(1) index. The results suggested that preprocessed time series may be used to study HRV in the field of nonlinear dynamics.
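
    One of the two preprocessing routes, cubic interpolation onto a uniform time grid, can be sketched as follows; the synthetic RR intervals and the 4 Hz resampling rate are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(10)
rr = 0.8 + 0.05 * rng.standard_normal(500)   # RR intervals in seconds (synthetic)
beat_times = np.cumsum(rr)                   # instant of each beat

fs = 4.0                                     # uniform resampling rate in Hz
t_uniform = np.arange(beat_times[0], beat_times[-1], 1 / fs)
hr_uniform = CubicSpline(beat_times, 1 / rr)(t_uniform)  # instantaneous heart rate
```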

  5. Timescales for determining temperature and dissolved oxygen trends in the Long Island Sound (LIS) estuary

    NASA Astrophysics Data System (ADS)

    Staniec, Allison; Vlahos, Penny

    2017-12-01

    Long-term time series represent a critical part of the oceanographic community's efforts to discern natural and anthropogenically forced variations in the environment. They provide regular measurements of climate-relevant indicators including temperature, oxygen concentrations, and salinity. When evaluating time series, it is essential to isolate long-term trends from autocorrelation in the data and noise due to natural variability. Herein we apply a statistical approach, well-established for atmospheric time series, to key parameters in the U.S. east coast's Long Island Sound estuary (LIS). Analysis shows that the LIS time series (established in the early 1990s) is sufficiently long to detect significant trends in physical-chemical parameters including temperature (T) and dissolved oxygen (DO). Over the last two decades, overall (combined surface and deep) LIS T has increased at an average rate of 0.08 ± 0.03 °C yr⁻¹ while overall DO has dropped at an average rate of 0.03 ± 0.01 mg L⁻¹ yr⁻¹ since 1994 at the 95% confidence level. This trend is notably faster than the global open ocean T trend (0.01 °C yr⁻¹), as might be expected for a shallower estuarine system. T and DO trends were always significant for the existing time series using four-month data increments. Rates of change of DO and T in LIS are strongly correlated, and the rate of decrease of DO concentrations is consistent with the expected reduced solubility of DO at these higher temperatures. Thus, changes in T alone, across decadal timescales, can account for between 33 and 100% of the observed decrease in DO. This has significant implications for other dissolved gases and the long-term management of LIS hypoxia.
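
    The statistical approach referenced for atmospheric series is commonly the trend-detection-time framework of Weatherhead et al. (1998); assuming that reading, the sketch below computes the number of years needed to detect a trend ω given noise standard deviation σ_N and lag-1 autocorrelation φ. The input values are illustrative.

```python
import numpy as np

def years_to_detect(omega, sigma_n, phi):
    """Weatherhead et al. (1998) detection time:
    n* = [3.3 * sigma_n / |omega| * sqrt((1 + phi) / (1 - phi))]^(2/3)."""
    return (3.3 * sigma_n / abs(omega) * np.sqrt((1 + phi) / (1 - phi))) ** (2 / 3)

print(years_to_detect(omega=0.08, sigma_n=0.5, phi=0.3))  # illustrative inputs
```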

  6. Multi-Instrument Investigation of Ionospheric Flow Channels and Their Impact on the Ionosphere and Thermosphere during Geomagnetic Storms

    DTIC Science & Technology

    2017-12-29

    ...indicated as shaded intervals in cyan) is shown in the context of the 5-6 August 2011 storm energetics. These are depicted by the time series of... documented in a series of journal articles [Horvath and Lovell, 2017A; 2017B; 2017C; 2017D]. Our findings contribute to the better understanding of...

  7. Using time-delayed mutual information to discover and interpret temporal correlation structure in complex populations

    NASA Astrophysics Data System (ADS)

    Albers, D. J.; Hripcsak, George

    2012-03-01

    This paper addresses how to calculate and interpret the time-delayed mutual information (TDMI) for a complex, diversely and sparsely measured, possibly non-stationary population of time-series of unknown composition and origin. The primary vehicle used for this analysis is a comparison between the time-delayed mutual information averaged over the population and the time-delayed mutual information of an aggregated population (here, aggregation implies the population is conjoined before any statistical estimates are implemented). Through the use of information theoretic tools, a sequence of practically implementable calculations is detailed that allows the average and aggregate time-delayed mutual information to be interpreted. Moreover, these calculations can also be used to understand the degree of homo- or heterogeneity present in the population. To demonstrate that the proposed methods can be used in nearly any situation, the methods are applied and demonstrated on the time series of glucose measurements from two different subpopulations of individuals from the Columbia University Medical Center electronic health record repository, revealing a picture of the composition of the population as well as physiological features.
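
    A basic histogram estimator of TDMI, I(x_t; x_{t+τ}), is sketched below; averaging this quantity over member series versus computing it on the pooled series is the comparison described above. The bin count and test signal are illustrative.

```python
import numpy as np

def tdmi(x, tau, bins=32):
    """Time-delayed mutual information between x(t) and x(t + tau)."""
    a, b = x[:-tau], x[tau:]
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / np.outer(px, py)[mask])))

rng = np.random.default_rng(11)
x = np.sin(0.1 * np.arange(5000)) + 0.3 * rng.standard_normal(5000)
print([round(tdmi(x, tau), 3) for tau in (1, 10, 50)])  # decays with delay
```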

  8. Characterizing Detrended Fluctuation Analysis of multifractional Brownian motion

    NASA Astrophysics Data System (ADS)

    Setty, V. A.; Sharma, A. S.

    2015-02-01

    The Hurst exponent (H) is widely used to quantify long-range dependence in time series data and is estimated using several well-known techniques. Recognizing its ability to remove trends, Detrended Fluctuation Analysis (DFA) is used extensively to estimate a Hurst exponent in non-stationary data. Multifractional Brownian motion (mBm) broadly encompasses a set of models of non-stationary data exhibiting time-varying Hurst exponents, H(t), as against a constant H. Recently, there has been a growing interest in the time dependence of H(t), and sliding window techniques have been used to estimate a local time average of the exponent. This brought to the fore the ability of DFA to estimate scaling exponents in systems with time-varying H(t), such as mBm. This paper characterizes the performance of DFA on mBm data with linearly varying H(t) and further tests the robustness of the estimated time average with respect to data- and technique-related parameters. Our results serve as a benchmark for using DFA as a sliding window estimator to obtain H(t) from time series data.
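
    For reference, first-order DFA reduces to: integrate the centered series, detrend it linearly in windows of size s, and fit the slope of log F(s) against log s. A compact sketch follows; the window sizes and white-noise test input are illustrative.

```python
import numpy as np

def dfa(x, scales):
    """First-order DFA; returns the scaling exponent of F(s) ~ s^alpha."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for s in scales:
        n = len(y) // s
        segments = y[:n * s].reshape(n, s)
        t = np.arange(s)
        ms = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2)
              for seg in segments]           # variance about local linear trend
        F.append(np.sqrt(np.mean(ms)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

x = np.random.default_rng(12).standard_normal(10000)
print(dfa(x, scales=[16, 32, 64, 128, 256]))  # white noise: exponent near 0.5
```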

  9. Trends in College Pricing, 2012. Trends in Higher Education Series

    ERIC Educational Resources Information Center

    Baum, Sandy; Ma, Jennifer

    2012-01-01

    Widespread concern about the high and rising price of college makes timely data on tuition increases in historical context particularly important. The increase in average published tuition and fees at public four-year colleges and universities for the 2012-13 academic year is smaller than it has been in recent years--and below the average growth…

  10. A Comparison of Forecast Error Generators for Modeling Wind and Load Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Ning; Diao, Ruisheng; Hafen, Ryan P.

    2013-12-18

    This paper presents four algorithms to generate random forecast error time series, including a truncated-normal distribution model, a state-space based Markov model, a seasonal autoregressive moving average (ARMA) model, and a stochastic-optimization based model. The error time series are used to create real-time (RT), hour-ahead (HA), and day-ahead (DA) wind and load forecast time series that statistically match historically observed forecasting data sets, used for variable generation integration studies. A comparison is made using historical DA load forecast and actual load values to generate new sets of DA forecasts with similar statistical forecast error characteristics. This paper discusses and compares the capabilities of each algorithm to preserve the characteristics of the historical forecast data sets.
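
    Of the four generators named above, the truncated-normal model is the simplest to sketch. The snippet below is a hedged illustration, not the authors' implementation; the bounds, the standard deviation, and the multiplicative use of the errors are assumptions for the example.

```python
import numpy as np
from scipy.stats import truncnorm

def truncated_normal_errors(n, mean, std, lower, upper, seed=None):
    """Draw a forecast-error time series from a truncated normal distribution.

    lower/upper bound the error (e.g. physical limits on forecast deviation);
    truncnorm expects the bounds expressed in standard-score units.
    """
    a, b = (lower - mean) / std, (upper - mean) / std
    return truncnorm.rvs(a, b, loc=mean, scale=std, size=n, random_state=seed)

# Hypothetical day-ahead load forecast errors: zero mean, 3% std, capped at ±10%.
errors = truncated_normal_errors(24 * 365, mean=0.0, std=0.03,
                                 lower=-0.10, upper=0.10, seed=42)
synthetic_forecast_factor = 1.0 + errors   # multiplicative error on actual load
```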

  11. Inflow forecasting model construction with stochastic time series for coordinated dam operation

    NASA Astrophysics Data System (ADS)

    Kim, T.; Jung, Y.; Kim, H.; Heo, J. H.

    2014-12-01

    Dam inflow forecasting is one of the most important tasks in dam operation for effective water resources management and control. In general, stochastic time series models can be applied to dam inflow forecasting only when the data are stationary, because most stochastic processes are based on stationarity. However, recent hydrological data no longer satisfy stationarity because of climate change. Therefore, a stochastic time series model that can consider seasonality and trend in the data series, the SARIMAX (Seasonal AutoRegressive Integrated Moving Average with eXogenous variables) model, was constructed in this study. This SARIMAX model can increase the performance of a stochastic time series model by considering the nonstationary components and external variables such as precipitation. For application, models were constructed for four coordinated dams on the Han River in South Korea with monthly time series data. As a result, the models of each dam have similar performance, and it would be possible to use the model for coordinated dam operation. Acknowledgement: This research was supported by a grant 'Establishing Active Disaster Management System of Flood Control Structures by using 3D BIM Technique' [NEMA-NH-12-57] from the Natural Hazard Mitigation Research Group, National Emergency Management Agency of Korea.
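
    statsmodels provides a SARIMAX implementation matching the model class described above. The sketch below fits a seasonal model with precipitation as an exogenous regressor; the data, the orders, and the series names are hypothetical, and in practice the orders would be chosen by identification diagnostics (ACF/PACF, information criteria).

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical monthly series: dam inflow (endogenous), precipitation (exogenous).
idx = pd.date_range("1990-01", periods=300, freq="MS")
rng = np.random.default_rng(0)
precip = pd.Series(50 + 30 * np.sin(2 * np.pi * idx.month / 12)
                   + rng.normal(0, 5, 300), index=idx)
inflow = 0.8 * precip + rng.normal(0, 8, 300)

model = SARIMAX(inflow, exog=precip,
                order=(1, 0, 1),              # non-seasonal (p, d, q)
                seasonal_order=(1, 1, 1, 12)) # seasonal (P, D, Q, s), s = 12 months
result = model.fit(disp=False)
print(result.summary())

# Forecasting requires future values of the exogenous variable; here the last
# observed year of precipitation stands in as an assumed future input.
future_precip = precip[-12:].values
forecast = result.forecast(steps=12, exog=future_precip)
```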

  12. Universal fractal scaling in stream chemistry and its implications for solute transport and water quality trend detection

    PubMed Central

    Kirchner, James W.; Neal, Colin

    2013-01-01

    The chemical dynamics of lakes and streams affect their suitability as aquatic habitats and as water supplies for human needs. Because water quality is typically monitored only weekly or monthly, however, the higher-frequency dynamics of stream chemistry have remained largely invisible. To illuminate a wider spectrum of water quality dynamics, rainfall and streamflow were sampled in two headwater catchments at Plynlimon, Wales, at 7-h intervals for 1–2 y and weekly for over two decades, and were analyzed for 45 solutes spanning the periodic table from H+ to U. Here we show that in streamflow, all 45 of these solutes, including nutrients, trace elements, and toxic metals, exhibit fractal 1/f^α scaling on time scales from hours to decades (α = 1.05 ± 0.15, mean ± SD). We show that this fractal scaling can arise through dispersion of random chemical inputs distributed across a catchment. These 1/f time series are non-self-averaging: monthly, yearly, or decadal averages are approximately as variable, one from the next, as individual measurements taken hours or days apart, defying naive statistical expectations. (By contrast, stream discharge itself is nonfractal, and self-averaging on time scales of months and longer.) In the solute time series, statistically significant trends arise much more frequently, on all time scales, than one would expect from conventional t statistics. However, these same trends are poor predictors of future trends, much poorer than one would expect from their calculated uncertainties. Our results illustrate how 1/f time series pose fundamental challenges to trend analysis and change detection in environmental systems. PMID:23842090
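
    The spectral exponent α in the 1/f^α scaling reported above can be estimated, under simplifying assumptions, by fitting a line to the log-log periodogram. The sketch below is that naive estimator for evenly sampled data; the paper's analysis of irregularly sampled, decades-long records would require more careful spectral methods.

```python
import numpy as np
from scipy.signal import periodogram

def spectral_exponent(x, fs=1.0):
    """Estimate alpha in S(f) ~ 1/f**alpha by a least-squares fit of the
    log-periodogram against log-frequency (zero frequency excluded)."""
    f, pxx = periodogram(x, fs=fs)
    keep = (f > 0) & (pxx > 0)
    slope, _ = np.polyfit(np.log(f[keep]), np.log(pxx[keep]), 1)
    return -slope   # S ~ f**slope, so alpha = -slope

# Sanity checks: white noise gives alpha ≈ 0, a random walk alpha ≈ 2;
# the stream solutes in this study cluster around alpha ≈ 1.
rng = np.random.default_rng(0)
print(spectral_exponent(rng.normal(size=8192)))             # ≈ 0
print(spectral_exponent(np.cumsum(rng.normal(size=8192))))  # ≈ 2
```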

  13. Universal fractal scaling in stream chemistry and its implications for solute transport and water quality trend detection

    NASA Astrophysics Data System (ADS)

    Kirchner, James W.; Neal, Colin

    2013-07-01

    The chemical dynamics of lakes and streams affect their suitability as aquatic habitats and as water supplies for human needs. Because water quality is typically monitored only weekly or monthly, however, the higher-frequency dynamics of stream chemistry have remained largely invisible. To illuminate a wider spectrum of water quality dynamics, rainfall and streamflow were sampled in two headwater catchments at Plynlimon, Wales, at 7-h intervals for 1-2 y and weekly for over two decades, and were analyzed for 45 solutes spanning the periodic table from H+ to U. Here we show that in streamflow, all 45 of these solutes, including nutrients, trace elements, and toxic metals, exhibit fractal 1/f^α scaling on time scales from hours to decades (α = 1.05 ± 0.15, mean ± SD). We show that this fractal scaling can arise through dispersion of random chemical inputs distributed across a catchment. These 1/f time series are non-self-averaging: monthly, yearly, or decadal averages are approximately as variable, one from the next, as individual measurements taken hours or days apart, defying naive statistical expectations. (By contrast, stream discharge itself is nonfractal, and self-averaging on time scales of months and longer.) In the solute time series, statistically significant trends arise much more frequently, on all time scales, than one would expect from conventional t statistics. However, these same trends are poor predictors of future trends, much poorer than one would expect from their calculated uncertainties. Our results illustrate how 1/f time series pose fundamental challenges to trend analysis and change detection in environmental systems.

  14. How Reliable is Bayesian Model Averaging Under Noisy Data? Statistical Assessment and Implications for Robust Model Selection

    NASA Astrophysics Data System (ADS)

    Schöniger, Anneli; Wöhling, Thomas; Nowak, Wolfgang

    2014-05-01

    Bayesian model averaging ranks the predictive capabilities of alternative conceptual models based on Bayes' theorem. The individual models are weighted with their posterior probability of being the best one in the considered set of models. Finally, their predictions are combined into a robust weighted average, and the predictive uncertainty can be quantified. This rigorous procedure does not, however, yet account for possible instabilities due to measurement noise in the calibration data set. This is a major drawback, since posterior model weights may suffer from a lack of robustness related to the uncertainty in noisy data, which may compromise the reliability of model ranking. We present a new statistical concept to account for measurement noise as a source of uncertainty for the weights in Bayesian model averaging. Our suggested upgrade reflects the limited information content of data for the purpose of model selection. It allows us to assess the significance of the determined posterior model weights, the confidence in model selection, and the accuracy of the quantified predictive uncertainty. Our approach rests on a brute-force Monte Carlo framework. We determine the robustness of model weights against measurement noise by repeatedly perturbing the observed data with random realizations of measurement error. Then, we analyze the induced variability in posterior model weights and introduce this "weighting variance" as an additional term into the overall prediction uncertainty analysis scheme. We further determine the theoretical upper limit in performance of the model set which is imposed by measurement noise. As an extension to the merely relative model ranking, this analysis provides a measure of absolute model performance. To finally decide whether better data or longer time series are needed to ensure a robust basis for model selection, we resample the measurement time series and assess the convergence of model weights for increasing time series length. We illustrate our suggested approach with an application to model selection between different soil-plant models, following up on a study by Wöhling et al. (2013). Results show that measurement noise compromises the reliability of model ranking and causes a significant amount of weighting uncertainty if the calibration data time series is not long enough to compensate for its noisiness. This additional contribution to the overall predictive uncertainty is neglected without our approach. Thus, we strongly advocate including our suggested upgrade in the Bayesian model averaging routine.
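
    The brute-force Monte Carlo idea, repeatedly perturbing the data with realizations of measurement error and tracking the induced spread of posterior model weights, can be sketched as below. Gaussian likelihoods, equal priors, and the toy models are assumptions of this illustration, not the study's setup.

```python
import numpy as np

def bma_weights(data, model_preds, sigma, prior=None):
    """Posterior model weights from Gaussian likelihoods (equal priors by default)."""
    n_models = len(model_preds)
    prior = np.full(n_models, 1.0 / n_models) if prior is None else prior
    loglik = np.array([-0.5 * np.sum(((data - m) / sigma) ** 2) for m in model_preds])
    w = prior * np.exp(loglik - loglik.max())   # subtract max for numerical stability
    return w / w.sum()

def weighting_variance(data, model_preds, sigma, n_mc=1000, seed=0):
    """Perturb the data with measurement noise n_mc times and collect the
    induced mean and variance of the posterior model weights."""
    rng = np.random.default_rng(seed)
    samples = np.array([bma_weights(data + rng.normal(0, sigma, size=data.shape),
                                    model_preds, sigma)
                        for _ in range(n_mc)])
    return samples.mean(axis=0), samples.var(axis=0)

# Hypothetical example: two competing models for the same observations.
t = np.linspace(0, 1, 50)
data = 2.0 * t + np.random.default_rng(1).normal(0, 0.3, size=t.shape)
preds = [2.0 * t, 1.8 * t + 0.1]
print(weighting_variance(data, preds, sigma=0.3))
```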

  15. The application of complex network time series analysis in turbulent heated jets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charakopoulos, A. K.; Karakasidis, T. E., E-mail: thkarak@uth.gr; Liakopoulos, A.

    In the present study, we applied the methodology of complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., time series corresponding to regions that are close to the jet axis from time series originating at regions with a different dynamical regime, based on the constructed network properties. Applying the phase space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed time series into networks and evaluated the topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail various dynamical regions of the jet flow, and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those originating from the experimental data. As far as the efficiency of the two methods for network construction is concerned, we conclude that both methodologies lead to network properties that present almost the same qualitative behavior and allow us to reveal the underlying system dynamics.

  16. The application of complex network time series analysis in turbulent heated jets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charakopoulos, A. K.; Karakasidis, T. E., E-mail: thkarak@uth.gr; Liakopoulos, A.

    2014-06-15

    In the present study, we applied the methodology of complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., time series corresponding to regions that are close to the jet axis from time series originating at regions with a different dynamical regime, based on the constructed network properties. Applying the phase space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed time series into networks and evaluated the topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail various dynamical regions of the jet flow, and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those originating from the experimental data. As far as the efficiency of the two methods for network construction is concerned, we conclude that both methodologies lead to network properties that present almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
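
    Of the two construction methods mentioned, the natural visibility algorithm is easy to state: samples become nodes, and two samples are linked if the straight line between them stays above every intermediate sample. Below is a hedged brute-force sketch; the helper names are hypothetical, and a production version would use a faster algorithm and a graph library such as networkx for the topological metrics.

```python
import numpy as np

def natural_visibility_graph(x):
    """Map a time series to a graph: points (a, x[a]) and (b, x[b]) are linked
    if every sample between them lies below the straight line joining them."""
    n = len(x)
    edges = set()
    for a in range(n - 1):
        edges.add((a, a + 1))                   # neighbours always see each other
        for b in range(a + 2, n):
            c = np.arange(a + 1, b)
            line = x[b] + (x[a] - x[b]) * (b - c) / (b - a)
            if np.all(x[a + 1:b] < line):
                edges.add((a, b))
    return edges

def degrees(edges, n):
    deg = np.zeros(n, dtype=int)
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return deg

rng = np.random.default_rng(0)
series = rng.normal(size=200)
edges = natural_visibility_graph(series)
print(degrees(edges, len(series)).mean())       # average degree of the network
```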

  17. The influence of trading volume on market efficiency: The DCCA approach

    NASA Astrophysics Data System (ADS)

    Sukpitak, Jessada; Hengpunya, Varagorn

    2016-09-01

    For a single market, the cross-correlation between market efficiency and trading volume, which is an indicator of market liquidity, is analysed in detail. The study begins by creating a time series of market efficiency, obtained by applying the time-varying Hurst exponent with a one-year sliding window to daily closing prices. The time series of trading volume corresponding to the same time period is derived from a one-year moving average of daily trading volume. Subsequently, the detrended cross-correlation coefficient is employed to quantify the degree of cross-correlation between the two time series. It was found that the cross-correlation coefficients of all considered stock markets are close to 0 and clearly outside the range in which the correlation is considered significant, at almost every time scale. The obtained results show that market liquidity, in terms of trading volume, has hardly any effect on market efficiency.
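
    The detrended cross-correlation coefficient normalises the detrended covariance of two integrated series by their individual detrended variances, so it lies in [-1, 1]. The sketch below assumes non-overlapping windows and linear detrending; it illustrates the coefficient in general, not the authors' code.

```python
import numpy as np

def dcca_coefficient(x, y, s):
    """Detrended cross-correlation coefficient rho_DCCA at window size s."""
    xp = np.cumsum(x - x.mean())                # integrated profiles
    yp = np.cumsum(y - y.mean())
    n_seg = len(xp) // s
    t = np.arange(s)
    f_xy = f_xx = f_yy = 0.0
    for i in range(n_seg):
        sx = xp[i * s:(i + 1) * s]
        sy = yp[i * s:(i + 1) * s]
        rx = sx - np.polyval(np.polyfit(t, sx, 1), t)   # linear-fit residuals
        ry = sy - np.polyval(np.polyfit(t, sy, 1), t)
        f_xy += np.mean(rx * ry)
        f_xx += np.mean(rx * rx)
        f_yy += np.mean(ry * ry)
    return f_xy / np.sqrt(f_xx * f_yy)

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = 0.5 * x + rng.normal(size=2000)             # positively cross-correlated pair
print(dcca_coefficient(x, y, s=50))
```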

  18. Climatic factors and community - associated methicillin-resistant Staphylococcus aureus skin and soft-tissue infections - a time-series analysis study.

    PubMed

    Sahoo, Krushna Chandra; Sahoo, Soumyakanta; Marrone, Gaetano; Pathak, Ashish; Lundborg, Cecilia Stålsby; Tamhankar, Ashok J

    2014-08-29

    Skin and soft tissue infections caused by Staphylococcus aureus (SA-SSTIs), including methicillin-resistant Staphylococcus aureus (MRSA), have experienced a significant surge all over the world. Changing climatic factors are affecting the global burden of dermatological infections, and there is a lack of information on the association between climatic factors and MRSA infections. Therefore, the association of temperature and relative humidity (RH) with the occurrence of SA-SSTIs (n = 387) and also MRSA (n = 251) was monitored for 18 months in the outpatient clinic at a tertiary care hospital located in Bhubaneswar, Odisha, India. The Kirby-Bauer disk diffusion method was used for antibiotic susceptibility testing. Time-series analysis was used to investigate the potential association of climatic factors (weekly averages of maximum temperature, minimum temperature and RH) with the weekly incidence of SA-SSTIs and MRSA infections. The analysis showed that a combination of weekly average maximum temperature above 33 °C coinciding with weekly average RH between 55% and 78% is most favorable for the occurrence of SA-SSTIs and MRSA; within these parameters, each unit increase in occurrence of MRSA was associated with an increase in weekly average maximum temperature of 1.7 °C (p = 0.044) and an increase in weekly average RH of 10% (p = 0.097).

  19. Maximum likelihood estimation for periodic autoregressive moving average models

    USGS Publications Warehouse

    Vecchia, A.V.

    1985-01-01

    A useful class of models for seasonal time series that cannot be filtered or standardized to achieve second-order stationarity is that of periodic autoregressive moving average (PARMA) models, which are extensions of ARMA models that allow periodic (seasonal) parameters. An approximation to the exact likelihood for Gaussian PARMA processes is developed, and a straightforward algorithm for its maximization is presented. The algorithm is tested on several periodic ARMA(1, 1) models through simulation studies and is compared to moment estimation via the seasonal Yule-Walker equations. Applicability of the technique is demonstrated through an analysis of a seasonal stream-flow series from the Rio Caroni River in Venezuela.

  20. Time series analysis of temporal networks

    NASA Astrophysics Data System (ADS)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without knowledge of the full network structure; rather, an estimate of some of the properties is sufficient to launch the attack. In this paper we show that even if the network structure at a future time point is not available, one can still manage to estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them, and, using a standard forecast model of time series, try to predict the properties of a temporal network at a later time instance. To this aim, we consider eight properties such as the number of active nodes, average degree, and clustering coefficient, and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as Auto-Regressive-Integrated-Moving-Average (ARIMA). We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency domain properties of the time series obtained from spectrogram analysis could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application we show how such a prediction scheme can be used to launch targeted attacks on temporal networks. Contribution to the Topical Issue "Temporal Network Theory and Applications", edited by Petter Holme.
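
    The core of the pipeline, forecasting a scalar network property with an ARIMA model, can be sketched with statsmodels as below. The synthetic average-degree series, the train/test split, and the (1, 1, 1) order are assumptions for illustration; the paper selects orders per property and dataset.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical time series of a network property (e.g. average degree per
# snapshot) sampled at regular intervals from a temporal contact network.
rng = np.random.default_rng(0)
avg_degree = 5 + 0.01 * np.arange(200) + rng.normal(0, 0.3, 200)

train, test = avg_degree[:180], avg_degree[180:]
model = ARIMA(train, order=(1, 1, 1)).fit()   # (p, d, q) chosen e.g. by AIC
forecast = model.forecast(steps=len(test))

# Percentage error of the property forecast, the accuracy measure used above.
pct_error = 100 * np.abs(forecast - test) / np.abs(test)
print(pct_error.mean())
```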

  1. Spatial variability in the trends in extreme storm surges and weekly-scale high water levels in the eastern Baltic Sea

    NASA Astrophysics Data System (ADS)

    Soomere, Tarmo; Pindsoo, Katri

    2016-03-01

    We address the possibility of separating the overall increasing trend in maximum water levels of semi-enclosed water bodies into associated trends in the heights of local storm surges and basin-scale components of the water level, based on recorded and modelled local water level time series. The test area is the Baltic Sea. Sequences of strong storms may substantially increase its water volume and raise the average sea level by almost 1 m for a few weeks. Such events are singled out from the water level time series using a weekly-scale average. The trends in the annual maxima of the weekly average have an almost constant value along the entire eastern Baltic Sea coast for averaging intervals longer than 4 days. Their slopes are ~4 cm/decade for the 8-day running average and decrease with an increase of the averaging interval. The trends for maxima of local storm surge heights represent almost the entire spatial variability in the water level maxima. Their slopes vary from almost zero for the open Baltic Proper coast up to 5-7 cm/decade in the eastern Gulf of Finland and Gulf of Riga. This pattern suggests that an increase in wind speed in strong storms is unlikely in this area, but storm duration may have increased and wind direction may have rotated.

  2. Studies in astronomical time series analysis. I - Modeling random processes in the time domain

    NASA Technical Reports Server (NTRS)

    Scargle, J. D.

    1981-01-01

    Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm of time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 272 is considered as an example.

  3. The Living Planet Index: using species population time series to track trends in biodiversity

    PubMed Central

    Loh, Jonathan; Green, Rhys E; Ricketts, Taylor; Lamoreux, John; Jenkins, Martin; Kapos, Valerie; Randers, Jorgen

    2005-01-01

    The Living Planet Index was developed to measure the changing state of the world's biodiversity over time. It uses time-series data to calculate average rates of change in a large number of populations of terrestrial, freshwater and marine vertebrate species. The dataset contains about 3000 population time series for over 1100 species. Two methods of calculating the index are outlined: the chain method and a method based on linear modelling of log-transformed data. The dataset is analysed to compare the relative representation of biogeographic realms, ecoregional biomes, threat status and taxonomic groups among species contributing to the index. The two methods show very similar results: terrestrial species declined on average by 25% from 1970 to 2000. Birds and mammals are over-represented in comparison with other vertebrate classes, and temperate species are over-represented compared with tropical species, but there is little difference in representation between threatened and non-threatened species. Some of the problems arising from over-representation are reduced by the way in which the index is calculated. It may be possible to reduce this further by post-stratification and weighting, but new information would first need to be collected for data-poor classes, realms and biomes. PMID:15814346
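
    The chain method mentioned above compounds the average annual log-rate of change across populations into an index anchored at 1 in the baseline year. A hedged sketch follows; it ignores the real index's treatment of data quality, weighting, and confidence intervals.

```python
import numpy as np

def living_planet_index(populations):
    """Chain-method index: average the annual log10 rates of change across
    populations, then chain them into an index with baseline year = 1.
    `populations` has shape (n_populations, n_years); NaNs mark missing
    counts and are ignored in the yearly average."""
    logs = np.log10(populations)
    rates = logs[:, 1:] - logs[:, :-1]          # per-population log10 rate
    mean_rate = np.nanmean(rates, axis=0)       # average across populations
    return np.concatenate([[1.0], 10 ** np.cumsum(mean_rate)])

# Two hypothetical populations, one declining and one roughly stable:
pops = np.array([[100, 90, 80, 72, 65],
                 [50, 50, 51, 49, 50]], dtype=float)
print(living_planet_index(pops))                # index value for each year
```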

  4. Epidemiology meets econometrics: using time-series analysis to observe the impact of bed occupancy rates on the spread of multidrug-resistant bacteria.

    PubMed

    Kaier, K; Meyer, E; Dettenkofer, M; Frank, U

    2010-10-01

    Two multivariate time-series analyses were carried out to identify the impact of bed occupancy rates, turnover intervals and the average length of hospital stay on the spread of multidrug-resistant bacteria in a teaching hospital. Epidemiological data on the incidences of meticillin-resistant Staphylococcus aureus (MRSA) and extended-spectrum beta-lactamase (ESBL)-producing bacteria were collected. Time-series of bed occupancy rates, turnover intervals and the average length of stay were tested for inclusion in the models as independent variables. Incidence was defined as nosocomial cases per 1000 patient-days. This included all patients infected or colonised with MRSA/ESBL more than 48h after admission. Between January 2003 and July 2008, a mean incidence of 0.15 nosocomial MRSA cases was identified. ESBL was not included in the surveillance until January 2005. Between January 2005 and July 2008 the mean incidence of nosocomial ESBL was also 0.15 cases per 1000 patient-days. The two multivariate models demonstrate a temporal relationship between bed occupancy rates in general wards and the incidence of nosocomial MRSA and ESBL. Similarly, the temporal relationship between the monthly average length of stay in intensive care units (ICUs) and the incidence of nosocomial MRSA and ESBL was demonstrated. Overcrowding in general wards and long periods of ICU stay were identified as factors influencing the spread of multidrug-resistant bacteria in hospital settings.

  5. Electromagnetic pulse propagation in dispersive planar dielectrics.

    PubMed

    Moten, K; Durney, C H; Stockham, T G

    1989-01-01

    The responses of a plane-wave pulse train irradiating a lossy dispersive dielectric half-space are investigated. The incident pulse train is expressed as a Fourier series with summing done by the inverse fast Fourier transform. The Fourier series technique is adopted to avoid the many difficulties often encountered in finding the inverse Fourier transform when transform analyses are used. Calculations are made for propagation in pure water, and typical waveforms inside the dielectric half-space are presented. Higher harmonics are strongly attenuated, resulting in a single continuous sinusoidal waveform at the fundamental frequency deep in the material. The time-averaged specific absorption rate (SAR) for pulse-train propagation is shown to be the sum of the time-averaged SARs of the individual harmonic components of the pulse train. For the same average power, calculated SARs reveal that pulse trains generally penetrate deeper than carrier-frequency continuous waves but not deeper than continuous waves at frequencies approaching the fundamental of the pulse train. The effects of rise time on the propagating pulse train in the dielectrics are shown and explained. Since most practical pulsed systems are very limited in bandwidth, no pronounced differences between their response and continuous wave (CW) response would be expected. Typical results for pulse-train propagation in arrays of dispersive planar dielectric slabs are presented. Expressing the pulse train as a Fourier series provides a practical way of interpreting the dispersion characteristics from the spectral point of view.

  6. The CACAO Method for Smoothing, Gap Filling, and Characterizing Seasonal Anomalies in Satellite Time Series

    NASA Technical Reports Server (NTRS)

    Verger, Aleixandre; Baret, F.; Weiss, M.; Kandasamy, S.; Vermote, E.

    2013-01-01

    Consistent, continuous, and long time series of global biophysical variables derived from satellite data are required for global change research. A novel climatology fitting approach called CACAO (Consistent Adjustment of the Climatology to Actual Observations) is proposed to reduce noise and fill gaps in time series by scaling and shifting the seasonal climatological patterns to the actual observations. The shift and scale CACAO parameters adjusted for each season allow quantifying shifts in the timing of seasonal phenology and inter-annual variations in magnitude as compared to the average climatology. CACAO was assessed first over simulated daily Leaf Area Index (LAI) time series with varying fractions of missing data and noise. Then, performances were analyzed over actual satellite LAI products derived from the AVHRR Long-Term Data Record for the 1981-2000 period over the BELMANIP2 globally representative sample of sites. Comparison with two widely used temporal filtering methods, the asymmetric Gaussian (AG) model and the Savitzky-Golay (SG) filter as implemented in TIMESAT, revealed that CACAO achieved better performances for smoothing AVHRR time series characterized by a high level of noise and frequent missing observations. The resulting smoothed time series captures well the vegetation dynamics and shows no gaps as compared to the 50-60% of still missing data after AG or SG reconstructions. Results of simulation experiments as well as confrontation with actual AVHRR time series indicate that the proposed CACAO method is more robust to noise and missing data than the AG and SG methods for phenology extraction.
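
    CACAO's core adjustment, rescaling and shifting a seasonal climatology to match the valid observations, can be illustrated in a simplified form that fits only a magnitude scale and offset by least squares; the actual method also adjusts the timing shift per season. Everything in the snippet is a hypothetical toy example.

```python
import numpy as np

def fit_scale_shift(climatology, observations):
    """Least-squares scale (a) and offset (b) so that a * climatology + b best
    matches the valid observations of one season; gaps are then filled with
    the adjusted climatology."""
    valid = ~np.isnan(observations)
    A = np.column_stack([climatology[valid], np.ones(valid.sum())])
    (a, b), *_ = np.linalg.lstsq(A, observations[valid], rcond=None)
    reconstructed = np.where(valid, observations, a * climatology + b)
    return a, b, reconstructed

# Hypothetical seasonal LAI: a climatological curve plus gappy, noisy observations.
t = np.linspace(0, 1, 36)
clim = 2 + 3 * np.exp(-((t - 0.5) / 0.15) ** 2)      # average seasonal course
rng = np.random.default_rng(0)
obs = 1.2 * clim - 0.3 + rng.normal(0, 0.1, t.size)
obs[rng.random(t.size) < 0.5] = np.nan               # 50% missing observations
a, b, filled = fit_scale_shift(clim, obs)
print(a, b)   # ≈ 1.2 and ≈ -0.3: the inter-annual anomaly vs the climatology
```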

  7. Effect of spatial averaging on multifractal properties of meteorological time series

    NASA Astrophysics Data System (ADS)

    Hoffmann, Holger; Baranowski, Piotr; Krzyszczak, Jaromir; Zubik, Monika

    2016-04-01

    Introduction: The process-based models for large-scale simulations require input of agro-meteorological quantities that are often in the form of time series of coarse spatial resolution. Therefore, knowledge about their scaling properties is fundamental for transferring locally measured fluctuations to larger scales and vice-versa. However, the scaling analysis of these quantities is complicated due to the presence of localized trends and non-stationarities. Here we assess how spatially aggregating meteorological data to coarser resolutions affects the data's temporal scaling properties. While it is known that spatial aggregation may affect spatial data properties (Hoffmann et al., 2015), it is unknown how it affects temporal data properties. Therefore, the objective of this study was to characterize the aggregation effect (AE) with regard to both temporal and spatial input data properties, considering scaling properties (i.e. statistical self-similarity) of the chosen agro-meteorological time series through multifractal detrended fluctuation analysis (MFDFA).

    Materials and Methods: Time series covering the years 1982-2011 were spatially averaged from 1 km to 10, 25, 50 and 100 km resolution to assess the impact of spatial aggregation. Daily minimum, mean and maximum air temperature (2 m), precipitation, global radiation, wind speed and relative humidity (Zhao et al., 2015) were used. To reveal the multifractal structure of the time series, we used the procedure described in Baranowski et al. (2015). The diversity of the studied multifractals was evaluated by the parameters of the time series spectra. In order to analyse differences in multifractal properties relative to the 1 km resolution grids, data of coarser resolutions were disaggregated to 1 km.

    Results and Conclusions: Analysing the effect of spatial averaging on multifractal properties, we observed that spatial patterns of the multifractal spectrum (MS) of all meteorological variables differed from the 1 km grids, and MS parameters were biased by -29.1 % (precipitation; width of MS) up to >4 % (min. temperature, radiation; asymmetry of MS). Also, the spatial variability of MS parameters was strongly affected at the highest aggregation (100 km). The obtained results confirm that spatial data aggregation may strongly affect temporal scaling properties. This should be taken into account when upscaling for large-scale studies.

    Acknowledgements: The study was conducted within FACCE MACSUR. Please see Baranowski et al. (2015) for details on funding.

    References: Baranowski, P., Krzyszczak, J., Sławiński, C. et al. (2015). Climate Research 65, 39-52. Hoffmann, H., Zhao, G., Van Bussel, L.G.J. et al. (2015). Climate Research 65, 53-69. Zhao, G., Siebert, S., Rezaei, E. et al. (2015). Agricultural and Forest Meteorology 200, 156-171.

  8. A synergic simulation-optimization approach for analyzing biomolecular dynamics in living organisms.

    PubMed

    Sadegh Zadeh, Kouroush

    2011-01-01

    A synergic simulation-optimization approach was developed and implemented to study protein-substrate dynamics and binding kinetics in living organisms. The forward problem is a system of several coupled nonlinear partial differential equations which, with a given set of kinetics and diffusion parameters, can provide not only the commonly used bleached-area-averaged time series in fluorescence microscopy experiments but also the more informative full biomolecular/drug space-time series, and can be successfully used to study the dynamics of both Dirac and Gaussian fluorescence-labeled biomacromolecules in vivo. The incomplete Cholesky preconditioner was coupled with the finite difference discretization scheme and an adaptive time-stepping strategy to solve the forward problem. The proposed approach was validated against analytical as well as reference solutions and used to simulate the dynamics of GFP-tagged glucocorticoid receptor (GFP-GR) in a mouse cancer cell during a fluorescence recovery after photobleaching experiment. Model analysis indicates that the commonly practiced bleach-spot-averaged time series is not an efficient approach to extract physiological information from fluorescence microscopy protocols. It was recommended that experimental biophysicists use the full space-time series resulting from experimental protocols to study the dynamics of biomacromolecules and drugs in living organisms. It was also concluded that, in the parameterization of biological mass transfer processes, setting the norm of the gradient of the penalty function at the solution to zero is not an efficient stopping rule to end the inverse algorithm. Theoreticians should use multi-criteria stopping rules to quantify model parameters by optimization.

  9. Alteration of Box-Jenkins methodology by implementing genetic algorithm method

    NASA Astrophysics Data System (ADS)

    Ismail, Zuhaimy; Maarof, Mohd Zulariffin Md; Fadzli, Mohammad

    2015-02-01

    A time series is a set of values sequentially observed through time. The Box-Jenkins methodology is a systematic method of identifying, fitting, checking, and using integrated autoregressive moving average time series models for forecasting. The Box-Jenkins method is appropriate for medium to long time series (at least 50 observations). When modeling such series, the difficulty lies in choosing the correct model order at the identification stage and in obtaining accurate parameter estimates. This paper presents the development of a genetic algorithm heuristic method for solving the identification and estimation problems in Box-Jenkins modeling. Data on international tourist arrivals to Malaysia were used to illustrate the effectiveness of the proposed method. The forecast results generated from the proposed model outperformed the single traditional Box-Jenkins model.
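
    In the same spirit, a toy genetic search over ARIMA orders can use the AIC of each fitted candidate as the fitness function. The sketch below (selection plus mutation only, no crossover) is a hedged illustration of the idea, not the authors' algorithm.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def fitness(series, order):
    """Fitness of a candidate (p, d, q): negative AIC of the fitted model."""
    try:
        return -ARIMA(series, order=tuple(int(v) for v in order)).fit().aic
    except Exception:
        return -np.inf          # penalise candidates that fail to fit

def ga_order_search(series, generations=10, pop_size=8, seed=0):
    """Toy genetic search over ARIMA orders: keep the fitter half of the
    population and mutate it to refill the other half."""
    rng = np.random.default_rng(seed)
    pop = [tuple(rng.integers(0, 4, size=3)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda o: fitness(series, o), reverse=True)
        parents = pop[: pop_size // 2]
        children = [tuple(np.clip(np.array(p) + rng.integers(-1, 2, 3), 0, 4))
                    for p in parents]
        pop = parents + children
    return pop[0]

rng = np.random.default_rng(1)
series = np.cumsum(rng.normal(size=200))        # integrated (d = 1) toy series
print(ga_order_search(series))
```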

  10. Root System Water Consumption Pattern Identification on Time Series Data

    PubMed Central

    Figueroa, Manuel; Pope, Christopher

    2017-01-01

    In agriculture, soil and meteorological sensors are used along low power networks to capture data, which allows for optimal resource usage and minimizing environmental impact. This study uses time series analysis methods for outliers’ detection and pattern recognition on soil moisture sensor data to identify irrigation and consumption patterns and to improve a soil moisture prediction and irrigation system. This study compares three new algorithms with the current detection technique in the project; the results greatly decrease the number of false positives detected. The best result is obtained by the Series Strings Comparison (SSC) algorithm averaging a precision of 0.872 on the testing sets, vastly improving the current system’s 0.348 precision. PMID:28621739

  11. Root System Water Consumption Pattern Identification on Time Series Data.

    PubMed

    Figueroa, Manuel; Pope, Christopher

    2017-06-16

    In agriculture, soil and meteorological sensors are used along low power networks to capture data, which allows for optimal resource usage and minimizing environmental impact. This study uses time series analysis methods for outliers' detection and pattern recognition on soil moisture sensor data to identify irrigation and consumption patterns and to improve a soil moisture prediction and irrigation system. This study compares three new algorithms with the current detection technique in the project; the results greatly decrease the number of false positives detected. The best result is obtained by the Series Strings Comparison (SSC) algorithm averaging a precision of 0.872 on the testing sets, vastly improving the current system's 0.348 precision.

  12. Bayesian model averaging method for evaluating associations between air pollution and respiratory mortality: a time-series study.

    PubMed

    Fang, Xin; Li, Runkui; Kan, Haidong; Bottai, Matteo; Fang, Fang; Cao, Yang

    2016-08-16

    To demonstrate an application of Bayesian model averaging (BMA) with generalised additive mixed models (GAMM) and provide a novel modelling technique to assess the association between inhalable coarse particles (PM10) and respiratory mortality in time-series studies. A time-series study using a regional death registry between 2009 and 2010. 8 districts in a large metropolitan area in Northern China. 9559 permanent residents of the 8 districts who died of respiratory diseases between 2009 and 2010. Per cent increase in daily respiratory mortality rate (MR) per interquartile range (IQR) increase of PM10 concentration and corresponding 95% confidence interval (CI) in single-pollutant and multipollutant (including NOx, CO) models. The Bayesian model averaged GAMM (GAMM+BMA) and the optimal GAMM of PM10, multipollutants and principal components (PCs) of multipollutants showed comparable results for the effect of PM10 on daily respiratory MR, that is, one IQR increase in PM10 concentration corresponded to 1.38% vs 1.39%, 1.81% vs 1.83% and 0.87% vs 0.88% increase, respectively, in daily respiratory MR. However, GAMM+BMA gave slightly but noticeably wider CIs for the single-pollutant model (-1.09 to 4.28 vs -1.08 to 3.93) and the PCs-based model (-2.23 to 4.07 vs -2.03 to 3.88). The CIs of the multiple-pollutant model from the two methods are similar, that is, -1.12 to 4.85 versus -1.11 to 4.83. The BMA method may represent a useful tool for modelling uncertainty in time-series studies when evaluating the effect of air pollution on fatal health outcomes.

  13. Time-series analysis of lung texture on bone-suppressed dynamic chest radiograph for the evaluation of pulmonary function: a preliminary study

    NASA Astrophysics Data System (ADS)

    Tanaka, Rie; Matsuda, Hiroaki; Sanada, Shigeru

    2017-03-01

    The density of lung tissue, as demonstrated on imaging, depends on the relative increases and decreases in the volume of air and lung vessels per unit volume of lung. Therefore, a time-series analysis of lung texture can be used to evaluate relative pulmonary function. This study was performed to assess time-series analysis of lung texture on dynamic chest radiographs during respiration, and to demonstrate its usefulness in the diagnosis of pulmonary impairments. Sequential chest radiographs of 30 patients were obtained using a dynamic flat-panel detector (FPD; 100 kV, 0.2 mAs/pulse, 15 frames/s, SID = 2.0 m; Prototype, Konica Minolta). Imaging was performed during respiration, and 210 images were obtained over 14 seconds. Commercial bone suppression image-processing software (Clear Read Bone Suppression; Riverain Technologies, Miamisburg, Ohio, USA) was applied to the sequential chest radiographs to create corresponding bone suppression images. Average pixel values, standard deviation (SD), kurtosis, and skewness were calculated based on a density histogram analysis in lung regions. Regions of interest (ROIs) were manually located in the lungs, and the same ROIs were traced by the template matching technique during respiration. The average pixel value effectively differentiated regions with ventilatory defects from normal lung tissue. The average pixel values in normal areas changed dynamically in synchronization with the respiratory phase, whereas those in regions of ventilatory defects showed reduced variations in pixel value. There were no significant differences between ventilatory defects and normal lung tissue in the other parameters. We confirmed that time-series analysis of lung texture was useful for the evaluation of pulmonary function in dynamic chest radiography during respiration. Pulmonary impairments were detected as reduced changes in pixel value. This technique is a simple, cost-effective diagnostic tool for the evaluation of regional pulmonary function.

  14. Gravity and magma induced spreading of Mount Etna volcano revealed by satellite radar interferometry

    NASA Technical Reports Server (NTRS)

    Lundgren, P.; Casu, F.; Manzo, M.; Pepe, A.; Berardino, P.; Sansosti, E.; Lanari, R.

    2004-01-01

    Mount Etna underwent a cycle of eruptive activity over the past ten years. Here we compute ground displacement maps and deformation time series from more than 400 radar interferograms to reveal Mount Etna's average and time-varying surface deformation from 1992 to 2001.

  15. A Computer Program for the Generation of ARIMA Data

    ERIC Educational Resources Information Center

    Green, Samuel B.; Noles, Keith O.

    1977-01-01

    The autoregressive integrated moving average (ARIMA) model has been applied to time series data in psychological and educational research. A program is described that generates ARIMA data of a known order. The program enables researchers to explore statistical properties of ARIMA data and simulate systems producing time-dependent observations.…
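
    A modern equivalent of such a generator takes a few lines with statsmodels: simulate the ARMA core with known coefficients, then integrate d times. The coefficients below are arbitrary; note statsmodels' sign convention, where the AR lag polynomial is given as [1, -φ1, ...].

```python
import numpy as np
from statsmodels.tsa.arima_process import arma_generate_sample

def generate_arima(ar, ma, d, n, sigma=1.0, seed=None):
    """Generate ARIMA(p, d, q) data of a known order: simulate the ARMA core,
    then integrate it d times. `ar` and `ma` are lag-polynomial coefficients
    including the leading 1, following the statsmodels convention."""
    rng = np.random.default_rng(seed)
    x = arma_generate_sample(ar, ma, nsample=n, scale=sigma,
                             distrvs=rng.standard_normal)
    for _ in range(d):
        x = np.cumsum(x)        # each integration raises d by one
    return x

# ARIMA(1, 1, 1) with AR coefficient 0.6 and MA coefficient 0.4:
series = generate_arima(ar=[1, -0.6], ma=[1, 0.4], d=1, n=500, seed=0)
```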

  16. 78 FR 59775 - Blueberry Promotion, Research and Information Order; Assessment Rate Increase

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-30

    ... demand. \\6\\ The econometric model used statistical methods with time series data to measure how strongly... been over 15 times greater than the costs. At the opposite end of the spectrum in the supply response, the average BCR was computed to be 5.36, implying that the benefits of the USHBC were over five times...

  17. Multifractality Signatures in Quasars Time Series. I. 3C 273

    NASA Astrophysics Data System (ADS)

    Belete, A. Bewketu; Bravo, J. P.; Canto Martins, B. L.; Leão, I. C.; De Araujo, J. M.; De Medeiros, J. R.

    2018-05-01

    The presence of multifractality in a time series indicates different correlations for different time scales as well as intermittent behaviour that cannot be captured by a single scaling exponent. The identification of a multifractal nature allows for a characterization of the dynamics and of the intermittency of the fluctuations in non-linear and complex systems. In this study, we search for a possible multifractal structure (multifractality signature) of the flux variability in the quasar 3C 273 time series for all electromagnetic wavebands at different observation points, and for the origins of the observed multifractality. This study is intended to highlight how the scaling behaves across the different bands of the selected candidate; this can be used as an additional new technique to group quasars based on the fractal signatures observed in their time series, and to determine whether quasars are non-linear physical systems or not. The Multifractal Detrended Moving Average (MFDMA) algorithm has been used to study the scaling in non-linear, complex and dynamic systems. To achieve this goal, we applied the backward (θ = 0) MFDMA method for one-dimensional signals. We observe weak multifractal (close to monofractal) behaviour in some of the time series of our candidate, except in the mm, UV and X-ray bands. The non-linear temporal correlation is the main source of the observed multifractality in the time series, whereas the heaviness of the distribution contributes less.

  18. Hybrid Wavelet De-noising and Rank-Set Pair Analysis approach for forecasting hydro-meteorological time series

    NASA Astrophysics Data System (ADS)

    WANG, D.; Wang, Y.; Zeng, X.

    2017-12-01

    Accurate, fast forecasting of hydro-meteorological time series is presently a major challenge in drought and flood mitigation. This paper proposes a hybrid approach, Wavelet De-noising (WD) and Rank-Set Pair Analysis (RSPA), that takes full advantage of a combination of the two approaches to improve forecasts of hydro-meteorological time series. WD allows decomposition and reconstruction of a time series by the wavelet transform, and hence separation of the noise from the original series. RSPA, a more reliable and efficient version of Set Pair Analysis, is integrated with WD to form the hybrid WD-RSPA approach. Two types of hydro-meteorological data sets with different characteristics and different levels of human influences at some representative stations are used to illustrate the WD-RSPA approach. The approach is also compared to three other generic methods: the conventional Auto Regressive Integrated Moving Average (ARIMA) method, Artificial Neural Networks (ANNs) (BP-error Back Propagation, MLP-Multilayer Perceptron and RBF-Radial Basis Function), and RSPA alone. Nine error metrics are used to evaluate the model performance. The results show that WD-RSPA is accurate, feasible, and effective. In particular, WD-RSPA is found to be the best among the various generic methods compared in this paper, even when the extreme events are included within a time series.
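
    The WD half of the hybrid can be sketched with PyWavelets: decompose the series, soft-threshold the detail coefficients, and reconstruct. The universal threshold with a MAD noise estimate is a standard but assumed choice here, as are the wavelet, the level, and the test signal.

```python
import numpy as np
import pywt

def wavelet_denoise(x, wavelet="db4", level=4):
    """Wavelet de-noising: decompose, soft-threshold the detail coefficients
    with the universal threshold, and reconstruct the smoothed series."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    # Noise scale estimated from the finest detail level via the MAD.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(x)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 512)
clean = np.sin(2 * np.pi * 3 * t)
denoised = wavelet_denoise(clean + rng.normal(0, 0.3, t.size))
```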

  19. Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory

    PubMed Central

    Tao, Qing

    2017-01-01

    Online time series prediction is the mainstream method in a wide range of fields, ranging from speech analysis and noise cancelation to stock market analysis. However, as the length of real-world time series increases, the data often contain many outliers. These outliers can mislead the learned model if treated as normal points in the process of prediction. To address this issue, in this paper we propose a robust and adaptive online gradient learning method, RoAdam (Robust Adam), for long short-term memory (LSTM) to predict time series with outliers. This method tunes the learning rate of the stochastic gradient algorithm adaptively in the process of prediction, which reduces the adverse effect of outliers. It tracks the relative prediction error of the loss function with a weighted average through a modification of Adam, a popular stochastic gradient algorithm for training deep neural networks. In our algorithm, a large value of the relative prediction error corresponds to a small learning rate, and vice versa. The experiments on both synthetic data and real time series show that our method achieves better performance compared to the existing methods based on LSTM. PMID:29391864

  20. Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory.

    PubMed

    Yang, Haimin; Pan, Zhisong; Tao, Qing

    2017-01-01

    Online time series prediction is the mainstream method in a wide range of fields, ranging from speech analysis and noise cancelation to stock market analysis. However, as the length of real-world time series increases, the data often contain many outliers. These outliers can mislead the learned model if treated as normal points in the process of prediction. To address this issue, in this paper we propose a robust and adaptive online gradient learning method, RoAdam (Robust Adam), for long short-term memory (LSTM) to predict time series with outliers. This method tunes the learning rate of the stochastic gradient algorithm adaptively in the process of prediction, which reduces the adverse effect of outliers. It tracks the relative prediction error of the loss function with a weighted average through a modification of Adam, a popular stochastic gradient algorithm for training deep neural networks. In our algorithm, a large value of the relative prediction error corresponds to a small learning rate, and vice versa. The experiments on both synthetic data and real time series show that our method achieves better performance compared to the existing methods based on LSTM.
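
    The adaptive-rate mechanism, a weighted average of the relative prediction error that inversely scales the learning rate, can be sketched independently of the LSTM. The class below is a hedged reading of that idea with made-up constants; it is not the published RoAdam algorithm.

```python
import numpy as np

class AdaptiveLearningRate:
    """Sketch of the adaptive-rate idea: track a weighted average of the
    relative prediction error and shrink the learning rate when the current
    error is large relative to the recent past (a likely outlier)."""
    def __init__(self, base_lr=1e-3, beta=0.98, floor=0.1, ceil=10.0):
        self.base_lr, self.beta = base_lr, beta
        self.floor, self.ceil = floor, ceil     # clip the relative error
        self.avg_rel_error, self.prev_loss = 1.0, None

    def step(self, loss):
        if self.prev_loss is not None:
            rel = np.clip(loss / (self.prev_loss + 1e-12), self.floor, self.ceil)
            self.avg_rel_error = (self.beta * self.avg_rel_error
                                  + (1 - self.beta) * rel)
        self.prev_loss = loss
        return self.base_lr / self.avg_rel_error   # large error -> small rate

sched = AdaptiveLearningRate()
for loss in [1.0, 0.9, 0.8, 8.0, 0.7]:          # the 8.0 mimics an outlier step
    print(round(sched.step(loss), 6))
```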

  1. Developing a comprehensive time series of GDP per capita for 210 countries from 1950 to 2015

    PubMed Central

    2012-01-01

    Background: Income has been extensively studied and utilized as a determinant of health. There are several sources of income expressed as gross domestic product (GDP) per capita, but there are no time series that are complete for the years between 1950 and 2015 for the 210 countries for which data exist. It is in the interest of population health research to establish a global time series that is complete from 1950 to 2015.

    Methods: We collected GDP per capita estimates expressed in either constant US dollar terms or international dollar terms (corrected for purchasing power parity) from seven sources. We applied several stages of models, including ordinary least-squares regressions and mixed effects models, to complete each of the seven source series from 1950 to 2015. The three US dollar and four international dollar series were each averaged to produce two new GDP per capita series.

    Results and discussion: Nine complete series from 1950 to 2015 for 210 countries are available for use. These series can serve various analytical purposes and can illustrate myriad economic trends and features. The derivation of the two new series allows researchers to avoid any series-specific biases that may exist. The modeling approach used is flexible and will allow for yearly updating as new estimates are produced by the source series.

    Conclusion: GDP per capita is a necessary tool in population health research, and our development and implementation of a new method has allowed for the most comprehensive known time series to date. PMID:22846561

  2. Volatility Behaviors of Financial Time Series by Percolation System on Sierpinski Carpet Lattice

    NASA Astrophysics Data System (ADS)

    Pei, Anqi; Wang, Jun

    2015-01-01

    The financial time series is simulated and investigated by the percolation system on the Sierpinski carpet lattice, where percolation is usually employed to describe the behavior of connected clusters in a random graph, and the Sierpinski carpet lattice is a graph which corresponds to the fractal Sierpinski carpet. To study the fluctuation behavior of returns for the financial model and the Shanghai Composite Index, we establish a daily volatility measure, the multifractal volatility (MFV) measure, to obtain MFV series, which have long-range cross-correlations with the squared daily return series. The autoregressive fractionally integrated moving average (ARFIMA) model is used to analyze the MFV series, on which it performs better than on other volatility series. By a comparative study of the multifractality and volatility analysis of the data, the simulation data of the proposed model exhibits very similar behaviors to those of the real stock index, which indicates a degree of rationality of the model for market applications.

  3. Main Trend Extraction Based on Irregular Sampling Estimation and Its Application in Storage Volume of Internet Data Center

    PubMed Central

    Dou, Chao

    2016-01-01

    The storage volume of an internet data center is a classical time series. Predicting the storage volume of a data center is very valuable for business. However, the storage volume series from a data center is always “dirty”: it contains noise, missing data, and outliers, so it is necessary to extract the main trend of the storage volume series for future prediction processing. In this paper, we propose an irregular sampling estimation method to extract the main trend of the time series, in which the Kalman filter is used to remove the “dirty” data; then cubic spline interpolation and an averaging method are used to reconstruct the main trend. The developed method is applied to the storage volume series of an internet data center. The experimental results show that the developed method can estimate the main trend of the storage volume series accurately and make a great contribution to predicting the future volume value.
 PMID:28090205

  4. Main Trend Extraction Based on Irregular Sampling Estimation and Its Application in Storage Volume of Internet Data Center.

    PubMed

    Miao, Beibei; Dou, Chao; Jin, Xuebo

    2016-01-01

    The storage volume of an internet data center is a classical time series. Predicting the storage volume of a data center is very valuable for business. However, the storage volume series from a data center is always "dirty": it contains noise, missing data, and outliers, so it is necessary to extract the main trend of the storage volume series for future prediction processing. In this paper, we propose an irregular sampling estimation method to extract the main trend of the time series, in which the Kalman filter is used to remove the "dirty" data; then cubic spline interpolation and an averaging method are used to reconstruct the main trend. The developed method is applied to the storage volume series of an internet data center. The experimental results show that the developed method can estimate the main trend of the storage volume series accurately and make a great contribution to predicting the future volume value.
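
    A hedged sketch of that pipeline, a scalar random-walk Kalman filter to suppress the "dirty" samples followed by a cubic spline through locally averaged values, is given below. The process/measurement variances, the weekly averaging window, and the synthetic series are all assumptions for illustration.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def kalman_smooth_1d(z, q=1e-4, r=1.0):
    """Scalar random-walk Kalman filter that skips missing (NaN) samples."""
    first = np.flatnonzero(~np.isnan(z))[0]
    x, p = z[first], r                          # initialise at first valid sample
    out = np.empty_like(z)
    for k, zk in enumerate(z):
        p += q                                  # predict step (random walk)
        if not np.isnan(zk):
            g = p / (p + r)                     # Kalman gain
            x += g * (zk - x)                   # correct with the observation
            p *= 1 - g
        out[k] = x
    return out

# Hypothetical daily storage-volume series with noise and gaps.
rng = np.random.default_rng(0)
t = np.arange(365.0)
z = 100 + 0.2 * t + rng.normal(0, 3, t.size)
z[rng.random(t.size) < 0.1] = np.nan            # missing data
filtered = kalman_smooth_1d(z, q=0.05, r=9.0)

# Reconstruct the main trend: a cubic spline through weekly averages.
weeks = t[::7]
weekly_mean = np.array([np.mean(filtered[i:i + 7]) for i in range(0, 365, 7)])
trend = CubicSpline(weeks, weekly_mean)(t)
```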

  5. Time-series analysis of delta13C from tree rings. I. Time trends and autocorrelation.

    PubMed

    Monserud, R A; Marshall, J D

    2001-09-01

    Univariate time-series analyses were conducted on stable carbon isotope ratios obtained from tree-ring cellulose. We looked for the presence and structure of autocorrelation. Significant autocorrelation violates the statistical independence assumption and biases hypothesis tests. Its presence would indicate the existence of lagged physiological effects that persist for longer than the current year. We analyzed data from 28 trees (60-85 years old; mean = 73 years) of western white pine (Pinus monticola Dougl.), ponderosa pine (Pinus ponderosa Laws.), and Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco var. glauca) growing in northern Idaho. Material was obtained by the stem analysis method from rings laid down in the upper portion of the crown throughout each tree's life. The sampling protocol minimized variation caused by changing light regimes within each tree. Autoregressive moving average (ARMA) models were used to describe the autocorrelation structure over time. Three time series were analyzed for each tree: the stable carbon isotope ratio (delta(13)C); discrimination (delta); and the difference between ambient and internal CO(2) concentrations (c(a) - c(i)). Converting from ring cellulose to whole-leaf tissue did not affect the analysis, because the conversion effect was almost completely removed by the detrending that precedes time-series analysis. A simple linear or quadratic model adequately described the time trend. The residuals from the trend had a constant mean and variance, thus ensuring stationarity, a requirement for autocorrelation analysis. The trend over time for c(a) - c(i) was particularly strong (R(2) = 0.29-0.84). Autoregressive moving average analyses of the residuals from these trends indicated that two-thirds of the individual tree series contained significant autocorrelation, whereas the remaining third were random (white noise) over time. We were unable to distinguish beforehand between individuals with and without significant autocorrelation. Significant ARMA models were all of low order, with either first- or second-order (i.e., lagged 1 or 2 years, respectively) models performing well. A simple autoregressive model, AR(1), was the most common. The most useful generalization was that the same ARMA model holds for each of the three series (delta(13)C, delta, c(a) - c(i)) for an individual tree, if the time trend has been properly removed for each series. The mean series for the two pine species were described by first-order ARMA models (1-year lags), whereas the Douglas-fir mean series were described by second-order models (2-year lags) with negligible first-order effects. Apparently, the process of constructing a mean time series for a species preserves an underlying signal related to delta(13)C while canceling some of the random individual tree variation. Furthermore, the best model for the overall mean series (e.g., for a species) cannot be inferred from a consensus of the individual tree model forms, nor can its parameters be estimated reliably from the mean of the individual tree parameters. Because two-thirds of the individual tree time series contained significant autocorrelation, the normal assumption of a random structure over time is unwarranted, even after accounting for the time trend. The residuals of an appropriate ARMA model satisfy the independence assumption and can be used to make hypothesis tests.

  6. Rapid determination of thermodynamic parameters from one-dimensional programmed-temperature gas chromatography for use in retention time prediction in comprehensive multidimensional chromatography.

    PubMed

    McGinitie, Teague M; Ebrahimi-Najafabadi, Heshmatollah; Harynuk, James J

    2014-01-17

    A new method for estimating the thermodynamic parameters ΔH(T0), ΔS(T0), and ΔCP for use in thermodynamic modeling of GC×GC separations has been developed. The method is an alternative to the traditional isothermal separations required to fit a three-parameter thermodynamic model to retention data. Herein, a non-linear optimization technique is used to estimate the parameters from a series of temperature-programmed separations using the Nelder-Mead simplex algorithm. With this method, the time required to obtain estimates of thermodynamic parameters for a series of analytes is significantly reduced. The new method allows for precise predictions of retention time, with the average error being only 0.2 s for 1D separations. Predictions for GC×GC separations were also in agreement with experimental measurements, with an average relative error of 0.37% for (1)tr and 2.1% for (2)tr. Copyright © 2013 Elsevier B.V. All rights reserved.
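
    The core of the method can be sketched as follows: predict temperature-programmed retention times by integrating a thermodynamic retention model, and let Nelder-Mead adjust the three parameters until predictions match observations. Everything below (constants, temperature programs, phase ratio, the "observed" times) is an illustrative assumption, not the published implementation.

    ```python
    # Sketch: estimate (dH, dS, dCp) by matching predicted temperature-programmed
    # retention times to observations with Nelder-Mead. All constants and data
    # are illustrative.
    import numpy as np
    from scipy.optimize import minimize

    R = 8.314          # J/(mol K)
    T0 = 373.15        # reference temperature (K), assumed
    beta = 250.0       # phase ratio, assumed
    t_M = 60.0         # hold-up time (s), assumed constant

    def retention_time(params, T_start, ramp, dt=0.1):
        """Integrate the retention equation for a linear ramp T(t) = T_start + ramp*t."""
        dH0, dS0, dCp = params
        frac, t = 0.0, 0.0
        while frac < 1.0 and t < 3600.0:
            T = T_start + ramp * t
            dH = dH0 + dCp * (T - T0)
            dS = dS0 + dCp * np.log(T / T0)
            k = np.exp(-dH / (R * T) + dS / R) / beta   # retention factor
            frac += dt / (t_M * (1.0 + k))              # fraction of column traversed
            t += dt
        return t

    ramps = [5 / 60, 10 / 60, 20 / 60]   # ramp rates in K/s (hypothetical)
    t_obs = np.array([retention_time((-35e3, -70.0, -40.0), 323.15, r) for r in ramps])

    def sse(params):
        t_pred = np.array([retention_time(params, 323.15, r) for r in ramps])
        return np.sum((t_pred - t_obs) ** 2)

    res = minimize(sse, x0=(-30e3, -60.0, -30.0), method="Nelder-Mead")
    print("Estimated dH, dS, dCp:", res.x)
    ```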

  7. Atmospheric mold spore counts in relation to meteorological parameters

    NASA Astrophysics Data System (ADS)

    Katial, R. K.; Zhang, Yiming; Jones, Richard H.; Dyer, Philip D.

    Fungal spore counts of Cladosporium, Alternaria, and Epicoccum were studied during 8 years in Denver, Colorado. Fungal spore counts were obtained daily during the pollinating season with a Rotorod sampler. Weather data were obtained from the National Climatic Data Center. Daily averages of temperature, relative humidity, daily precipitation, barometric pressure, and wind speed were studied. A time series analysis was performed on the data to mathematically model the spore counts in relation to weather parameters. Using SAS PROC ARIMA software, a regression analysis was performed, regressing the spore counts on the weather variables assuming an autoregressive moving average (ARMA) error structure. Cladosporium was found to be positively correlated (P<0.02) with average daily temperature and relative humidity, and negatively correlated with precipitation. Alternaria and Epicoccum did not show increased predictability with weather variables. A mathematical model was derived for Cladosporium spore counts using the annual seasonal cycle and significant weather variables. The models for Alternaria and Epicoccum incorporated the annual seasonal cycle only. Fungal spore counts can be modeled by time series analysis and related to meteorological parameters while controlling for seasonality; this modeling can provide estimates of exposure to fungal aeroallergens.
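
    A minimal sketch of the modelling approach named above, regression with an ARMA error structure, re-expressed with statsmodels' SARIMAX on simulated weather and log spore-count data (all variable names and effect sizes are illustrative):

    ```python
    # Sketch: regression of (log) spore counts on weather variables with AR(1)
    # errors, analogous to the SAS PROC ARIMA approach. Data are simulated.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(1)
    n = 365
    temp = 15 + 10 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n)
    humidity = 50 + rng.normal(0, 10, n)
    precip = rng.exponential(1.0, n)

    # Simulated log counts: positive temperature/humidity effects, negative
    # precipitation effect, plus AR(1) errors
    err = np.zeros(n)
    for t in range(1, n):
        err[t] = 0.6 * err[t - 1] + rng.normal(0, 0.5)
    log_counts = 1.0 + 0.08 * temp + 0.02 * humidity - 0.15 * precip + err

    X = pd.DataFrame({"temp": temp, "humidity": humidity, "precip": precip})
    model = SARIMAX(log_counts, exog=X, order=(1, 0, 0)).fit(disp=False)
    print(model.summary())
    ```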

  8. Data imputation analysis for Cosmic Rays time series

    NASA Astrophysics Data System (ADS)

    Fernandes, R. C.; Lucio, P. S.; Fernandez, J. H.

    2017-05-01

    The occurrence of missing data in Galactic Cosmic Ray (GCR) time series is inevitable, since losses arise from mechanical and human failure, technical problems, and the different periods of operation of GCR stations. The aim of this study was to perform multiple imputation in order to reconstruct the observational dataset. The study used the monthly time series of GCR Climax (CLMX) and Roma (ROME) from 1960 to 2004 to simulate scenarios of 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80% and 90% missing data relative to the observed ROME series, with 50 replicates; the CLMX station was then used as a proxy for allocating these scenarios. Three methods for monthly dataset imputation were selected: Amelia II, which runs a bootstrap Expectation-Maximization algorithm; MICE, which runs an algorithm based on Multivariate Imputation by Chained Equations; and mtsdi, an Expectation-Maximization-based method for imputation of missing values in multivariate normal time series. The synthetic time series were evaluated against the observed ROME series using several skill measures, such as RMSE, NRMSE, the Agreement Index, R, R2, the F-test and the t-test. The results showed that for CLMX and ROME, the R2 and R statistics were 0.98 and 0.96, respectively. Increasing the number of gaps degrades the quality of the reconstructed time series. Data imputation was most efficient with the mtsdi method, with negligible errors and the best skill coefficients. The results suggest an upper limit of about 60% missing data for imputation of monthly averages. It is noteworthy that the CLMX, ROME and KIEL stations have no missing data in the target period. This methodology allowed 43 time series to be reconstructed.
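
    The study used the R packages Amelia II, mice and mtsdi. As a rough Python analogue of the chained-equations idea, the sketch below punches artificial gaps into a simulated station record, imputes them with help from a correlated neighbour series, and scores the result with RMSE; the data and gap fractions are illustrative.

    ```python
    # Sketch: impute artificial gaps in one cosmic-ray-like series using a
    # correlated neighbour station, using scikit-learn's chained-equations-style
    # IterativeImputer as a stand-in for the R packages named above.
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    rng = np.random.default_rng(2)
    n = 540                                     # monthly values, 1960-2004
    clmx = 4000 + 300 * np.sin(2 * np.pi * np.arange(n) / 132) + rng.normal(0, 50, n)
    rome = 0.9 * clmx + rng.normal(0, 40, n)    # correlated proxy station

    for frac in (0.1, 0.3, 0.6):
        rome_gapped = rome.copy()
        gaps = rng.choice(n, size=int(frac * n), replace=False)
        rome_gapped[gaps] = np.nan

        X = np.column_stack([clmx, rome_gapped])
        filled = IterativeImputer(random_state=0).fit_transform(X)[:, 1]

        rmse = np.sqrt(np.mean((filled[gaps] - rome[gaps]) ** 2))
        print(f"{int(frac * 100)}% missing: RMSE = {rmse:.1f}")
    ```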

  9. Measurement error in time-series analysis: a simulation study comparing modelled and monitored data.

    PubMed

    Butland, Barbara K; Armstrong, Ben; Atkinson, Richard W; Wilkinson, Paul; Heal, Mathew R; Doherty, Ruth M; Vieno, Massimo

    2013-11-13

    Assessing health effects from background exposure to air pollution is often hampered by the sparseness of pollution monitoring networks. However, regional atmospheric chemistry-transport models (CTMs) can provide pollution data with national coverage at fine geographical and temporal resolution. We used statistical simulation to compare the impact on epidemiological time-series analysis of additive measurement error in sparse monitor data as opposed to geographically and temporally complete model data. Statistical simulations were based on a theoretical area of 4 regions each consisting of twenty-five 5 km × 5 km grid-squares. In the context of a 3-year Poisson regression time-series analysis of the association between mortality and a single pollutant, we compared the error impact of using daily grid-specific model data as opposed to daily regional average monitor data. We investigated how this comparison was affected if we changed the number of grids per region containing a monitor. To inform simulations, estimates (e.g. of pollutant means) were obtained from observed monitor data for 2003-2006 for national network sites across the UK and corresponding model data that were generated by the EMEP-WRF CTM. Average within-site correlations between observed monitor and model data were 0.73 and 0.76 for rural and urban daily maximum 8-hour ozone respectively, and 0.67 and 0.61 for rural and urban loge(daily 1-hour maximum NO2). When regional averages were based on 5 or 10 monitors per region, health effect estimates exhibited little bias. However, with only 1 monitor per region, the regression coefficient in our time-series analysis was attenuated by an estimated 6% for urban background ozone, 13% for rural ozone, 29% for urban background loge(NO2) and 38% for rural loge(NO2). For grid-specific model data the corresponding figures were 19%, 22%, 54% and 44% respectively, i.e. similar for rural loge(NO2) but more marked for urban loge(NO2). Even if correlations between model and monitor data appear reasonably strong, additive classical measurement error in model data may lead to appreciable bias in health effect estimates. As process-based air pollution models become more widely used in epidemiological time-series analysis, assessments of error impact that include statistical simulation may be useful.
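
    The attenuation mechanism the study quantifies can be reproduced in miniature: simulate daily counts driven by a true exposure, add classical measurement error of increasing size, and watch the Poisson regression coefficient shrink. The effect size and error variances below are illustrative assumptions.

    ```python
    # Sketch: attenuation of a Poisson time-series health-effect estimate under
    # additive classical measurement error in the exposure. Values illustrative.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n = 1095                                   # 3 years of daily data
    true_pollutant = 30 + 10 * np.sin(2 * np.pi * np.arange(n) / 365) \
        + rng.normal(0, 5, n)
    beta = 0.004                               # true log-rate increase per unit
    counts = rng.poisson(np.exp(3.0 + beta * true_pollutant))

    for noise_sd in (0.0, 5.0, 10.0):          # increasing classical error
        measured = true_pollutant + rng.normal(0, noise_sd, n)
        X = sm.add_constant(measured)
        fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
        atten = 100 * (1 - fit.params[1] / beta)
        print(f"error sd {noise_sd:4.1f}: estimate {fit.params[1]:.5f} "
              f"(attenuation ~{atten:.0f}%)")
    ```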

  10. Zernike phase-contrast electron cryotomography applied to marine cyanobacteria infected with cyanophages.

    PubMed

    Dai, Wei; Fu, Caroline; Khant, Htet A; Ludtke, Steven J; Schmid, Michael F; Chiu, Wah

    2014-11-01

    Advances in electron cryotomography have provided new opportunities to visualize the internal 3D structures of a bacterium. An electron microscope equipped with Zernike phase-contrast optics produces images with markedly increased contrast compared with images obtained by conventional electron microscopy. Here we describe a protocol to apply Zernike phase plate technology for acquiring electron tomographic tilt series of cyanophage-infected cyanobacterial cells embedded in ice, without staining or chemical fixation. We detail the procedures for aligning and assessing phase plates for data collection, and methods for obtaining 3D structures of cyanophage assembly intermediates in the host by subtomogram alignment, classification and averaging. Acquiring three or four tomographic tilt series takes ∼12 h on a JEM2200FS electron microscope. We expect this time requirement to decrease substantially as the technique matures. The time required for annotation and subtomogram averaging varies widely depending on the project goals and data volume.

  11. A 15-Year Time-series Study of Tooth Extraction in Brazil

    PubMed Central

    Cunha, Maria Aparecida Goncalves de Melo; Lino, Patrícia Azevedo; dos Santos, Thiago Rezende; Vasconcelos, Mara; Lucas, Simone Dutra; de Abreu, Mauro Henrique Nogueira Guimarães

    2015-01-01

    Tooth loss is considered to be a public health problem. Time-series studies that assess the influence of social conditions and access to health services on tooth loss are scarce. This study aimed to examine the time-series of permanent tooth extraction in Brazil between 1998 and 2012 and to compare these series in municipalities with different Human Development Index (HDI) scores and with different access to primary and secondary care. The time-series study was performed between 1998 and 2012, using data from the Brazilian National Health Information System. Two annual rates of tooth extraction were calculated and evaluated separately according to 3 parameters: the HDI, the presence of a Dental Specialty Center, and coverage by Oral Health Teams. The time-series was analyzed using a linear regression model. An overall decrease in the tooth-loss tendencies during this period was observed, particularly in the tooth-extraction rate during primary care procedures. In the municipalities with an HDI lower than the median, the average tooth-loss rates were higher than in the municipalities with a higher HDI. The municipalities with lower rates of Oral Health Team coverage also showed lower extraction rates than the municipalities with higher coverage rates. In general, Brazil has shown a decreasing trend in permanent tooth extraction during these 15 years. Increased human development and access to dental services have influenced tooth-extraction rates. PMID:26632688

  12. Ecology of West Nile virus across four European countries: empirical modelling of the Culex pipiens abundance dynamics as a function of weather.

    PubMed

    Groen, Thomas A; L'Ambert, Gregory; Bellini, Romeo; Chaskopoulou, Alexandra; Petric, Dusan; Zgomba, Marija; Marrama, Laurence; Bicout, Dominique J

    2017-10-26

    Culex pipiens is the major vector of West Nile virus in Europe, causing frequent outbreaks throughout the southern part of the continent. Proper empirical modelling of the population dynamics of this species can help in understanding West Nile virus epidemiology and in optimizing vector surveillance and mosquito control efforts. But modelling results may differ from place to place. In this study we examine which types of models and weather variables can be used consistently across different locations. Weekly mosquito trap collections from eight functional units located in France, Greece, Italy and Serbia, spanning several years, were combined. Additionally, rainfall, relative humidity and temperature were recorded. Correlations between lagged weather conditions and Cx. pipiens dynamics were analysed. Seasonal autoregressive integrated moving-average (SARIMA) models were also fitted to describe the temporal dynamics of Cx. pipiens and to check whether the weather variables could improve these models. Correlations were strongest with mean temperature at short time lags, followed by relative humidity, most likely due to collinearity. Precipitation alone had weak correlations and inconsistent patterns across sites. SARIMA models could also make reasonable predictions, especially when longer time series of Cx. pipiens observations are available. Average temperature was a consistently good predictor across sites. When only short time series (roughly under 4 years) of observations are available, average temperature can therefore be used to model Cx. pipiens dynamics. When longer time series (over 4 years) are available, SARIMAs can provide better statistical descriptions of Cx. pipiens dynamics without the need for further weather variables. This suggests that density dependence is also an important determinant of Cx. pipiens dynamics.
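
    A minimal sketch of the lagged-correlation screening step described above, on simulated weekly temperature and trap-count series (the 2-week lag built into the simulation is an illustrative assumption):

    ```python
    # Sketch: screen lagged correlations between weekly mean temperature and
    # weekly mosquito trap counts. Data are simulated, not the study's traps.
    import numpy as np

    rng = np.random.default_rng(4)
    weeks = 260                                        # ~5 years of weekly data
    temp = 15 + 10 * np.sin(2 * np.pi * np.arange(weeks) / 52) \
        + rng.normal(0, 2, weeks)
    # Abundance responding to temperature ~2 weeks earlier (assumed lag)
    counts = np.clip(5 + 3 * np.roll(temp, 2) + rng.normal(0, 5, weeks), 0, None)

    for lag in range(0, 7):
        r = np.corrcoef(temp[:weeks - lag], counts[lag:])[0, 1]
        print(f"lag {lag} weeks: r = {r:.2f}")
    ```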

  13. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
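
    For reference, the hazard function of generalized Pareto exceedances follows in one line from the density and survival function. With scale σ > 0 and shape ξ (a standard parameterization, shown here as a brief sketch rather than the paper's full derivation), the hazard is constant for ξ = 0, decreasing in x for ξ > 0, and increasing for ξ < 0:

    ```latex
    % Generalized Pareto density and survival function (x >= 0):
    f(x) = \frac{1}{\sigma}\left(1 + \frac{\xi x}{\sigma}\right)^{-1/\xi - 1},
    \qquad
    S(x) = 1 - F(x) = \left(1 + \frac{\xi x}{\sigma}\right)^{-1/\xi}

    % Hazard function:
    h(x) = \frac{f(x)}{S(x)} = \frac{1}{\sigma + \xi x}
    ```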

  14. Impact of the zero-markup drug policy on hospitalisation expenditure in western rural China: an interrupted time series analysis.

    PubMed

    Yang, Caijun; Shen, Qian; Cai, Wenfang; Zhu, Wenwen; Li, Zongjie; Wu, Lina; Fang, Yu

    2017-02-01

    To assess the long-term effects of the introduction of China's zero-markup drug policy on hospitalisation expenditure and hospitalisation expenditure after reimbursement. An interrupted time series design was used to evaluate the impact of the zero-markup drug policy on hospitalisation expenditure and hospitalisation expenditure after reimbursement at primary health institutions in Fufeng County of Shaanxi Province, western China. Two regression models were developed. Monthly average hospitalisation expenditure and monthly average hospitalisation expenditure after reimbursement in primary health institutions were analysed, covering the period 2009 through 2013. For the monthly average hospitalisation expenditure, the increasing trend slowed after the introduction of the zero-markup drug policy (coefficient = -16.49, P = 0.009). For the monthly average hospitalisation expenditure after reimbursement, the increasing trend slowed after the introduction of the zero-markup drug policy (coefficient = -10.84, P = 0.064), and a significant decrease in the intercept was noted after the second intervention of changes in reimbursement schemes of the new rural cooperative medical insurance (coefficient = -220.64, P < 0.001). A statistically significant decrease in the level or trend of monthly average hospitalisation expenditure and monthly average hospitalisation expenditure after reimbursement was detected after the introduction of the zero-markup drug policy in western China. However, hospitalisation expenditure and hospitalisation expenditure after reimbursement were still increasing. More effective policies are needed to prevent these costs from continuing to rise. © 2016 John Wiley & Sons Ltd.
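
    The regression models described above follow the usual interrupted-time-series form, with a pre-intervention trend, a level change and a trend change at the policy date. A minimal sketch with simulated monthly expenditures (the intervention month and coefficients are illustrative):

    ```python
    # Sketch: segmented (interrupted) time-series regression with a level and
    # slope change at an assumed policy date. Data are simulated.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n = 60                                   # monthly series, 2009-2013
    t = np.arange(n)
    policy = (t >= 24).astype(float)         # assumed policy start at month 24
    t_after = np.where(t >= 24, t - 24, 0)

    # Simulated expenditure: rising trend that slows after the intervention
    y = 500 + 20 * t - 16 * t_after + rng.normal(0, 30, n)

    X = sm.add_constant(np.column_stack([t, policy, t_after]))
    fit = sm.OLS(y, X).fit()
    print(fit.params)   # [intercept, pre-trend, level change, trend change]
    ```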

  15. Climatic Factors and Community-Associated Methicillin-Resistant Staphylococcus aureus Skin and Soft-Tissue Infections: A Time-Series Analysis Study

    PubMed Central

    Sahoo, Krushna Chandra; Sahoo, Soumyakanta; Marrone, Gaetano; Pathak, Ashish; Lundborg, Cecilia Stålsby; Tamhankar, Ashok J.

    2014-01-01

    Skin and soft tissue infections caused by Staphylococcus aureus (SA-SSTIs), including those caused by methicillin-resistant Staphylococcus aureus (MRSA), have surged significantly all over the world. Changing climatic factors are affecting the global burden of dermatological infections, and there is a lack of information on the association between climatic factors and MRSA infections. Therefore, the association of temperature and relative humidity (RH) with the occurrence of SA-SSTIs (n = 387) and of MRSA (n = 251) was monitored for 18 months in the outpatient clinic at a tertiary care hospital located in Bhubaneswar, Odisha, India. The Kirby-Bauer disk diffusion method was used for antibiotic susceptibility testing. Time-series analysis was used to investigate the potential association of climatic factors (weekly averages of maximum temperature, minimum temperature and RH) with the weekly incidence of SA-SSTIs and MRSA infections. The analysis showed that a combination of weekly average maximum temperature above 33 °C and weekly average RH between 55% and 78% is most favorable for the occurrence of SA-SSTIs and MRSA; within these ranges, each unit increase in MRSA occurrence was associated with a 1.7 °C increase in weekly average maximum temperature (p = 0.044) and a 10% increase in weekly average RH (p = 0.097). PMID:25177823

  16. A Time Series of Mean Global Sea Surface Temperature from the Along-Track Scanning Radiometers

    NASA Astrophysics Data System (ADS)

    Veal, Karen L.; Corlett, Gary; Remedios, John; Llewellyn-Jones, David

    2010-12-01

    A climate data set requires a long time series of consistently processed data, with suitably long periods of overlap between different instruments to allow characterization of any inter-instrument biases. The data obtained from ESA's three Along-Track Scanning Radiometers (ATSRs) together comprise an 18-year record of SST with overlap periods of at least 6 months, and the data from all three ATSRs have been consistently processed. These factors, together with the stability of the instruments and the precision of the derived SST, make this data set eminently suitable for the construction of a time series of SST that complies with many of the GCOS requirements for a climate data set. A time series of global and regional average SST anomalies has been constructed from the ATSR version 2 data set. An analysis of the overlap periods of successive instruments was used to remove intra-series biases and align the series to a common reference. An ATSR climatology has been developed and used to calculate the SST anomalies. The ATSR-1 and AATSR time series have been aligned to ATSR-2. The largest adjustment is ~0.2 K between ATSR-2 and AATSR, suspected to be due to a shift of the 12 μm filter function for AATSR. An uncertainty of 0.06 K is assigned to the relative anomaly record derived from the dual three-channel night-time data. A relative uncertainty of 0.07 K is assigned to the dual night-time two-channel record, except in the ATSR-1 period (1994-1996) where it is larger.

  17. Time Series Analysis for Forecasting Hospital Census: Application to the Neonatal Intensive Care Unit

    PubMed Central

    Hoover, Stephen; Jackson, Eric V.; Paul, David; Locke, Robert

    2016-01-01

    Background: Accurate prediction of future patient census in hospital units is essential for patient safety, health outcomes, and resource planning. Forecasting census in the Neonatal Intensive Care Unit (NICU) is particularly challenging due to limited ability to control the census and clinical trajectories. The fixed average census approach, using the average census from the previous year, is a forecasting alternative used in clinical practice, but has limitations due to census variations. Objective: Our objectives are to: (i) analyze the daily NICU census at a single health care facility and develop census forecasting models, (ii) explore models with and without patient data characteristics obtained at the time of admission, and (iii) evaluate the accuracy of the models compared with the fixed average census approach. Methods: We used five years of retrospective daily NICU census data for model development (January 2008 - December 2012, N=1827 observations) and one year of data for validation (January - December 2013, N=365 observations). Best-fitting ARIMA and linear regression models were applied to various 7-day prediction periods and compared using error statistics. Results: The census showed a slightly increasing linear trend. Best-fitting models included a non-seasonal model, ARIMA(1,0,0), seasonal ARIMA models, ARIMA(1,0,0)x(1,1,2)7 and ARIMA(2,1,4)x(1,1,2)14, as well as a seasonal linear regression model. The proposed forecasting models resulted on average in a 36.49% improvement in forecasting accuracy compared with the fixed average census approach. Conclusions: Time series models provide higher prediction accuracy under different census conditions compared with the fixed average census approach. The presented methodology is easily applicable in clinical practice, can be generalized to other care settings, supports short- and long-term census forecasting, and informs staff resource planning. PMID:27437040

  18. Time Series Analysis for Forecasting Hospital Census: Application to the Neonatal Intensive Care Unit.

    PubMed

    Capan, Muge; Hoover, Stephen; Jackson, Eric V; Paul, David; Locke, Robert

    2016-01-01

    Accurate prediction of future patient census in hospital units is essential for patient safety, health outcomes, and resource planning. Forecasting census in the Neonatal Intensive Care Unit (NICU) is particularly challenging due to limited ability to control the census and clinical trajectories. The fixed average census approach, using the average census from the previous year, is a forecasting alternative used in clinical practice, but has limitations due to census variations. Our objectives are to: (i) analyze the daily NICU census at a single health care facility and develop census forecasting models, (ii) explore models with and without patient data characteristics obtained at the time of admission, and (iii) evaluate the accuracy of the models compared with the fixed average census approach. We used five years of retrospective daily NICU census data for model development (January 2008 - December 2012, N=1827 observations) and one year of data for validation (January - December 2013, N=365 observations). Best-fitting ARIMA and linear regression models were applied to various 7-day prediction periods and compared using error statistics. The census showed a slightly increasing linear trend. Best-fitting models included a non-seasonal model, ARIMA(1,0,0), seasonal ARIMA models, ARIMA(1,0,0)x(1,1,2)7 and ARIMA(2,1,4)x(1,1,2)14, as well as a seasonal linear regression model. The proposed forecasting models resulted on average in a 36.49% improvement in forecasting accuracy compared with the fixed average census approach. Time series models provide higher prediction accuracy under different census conditions compared with the fixed average census approach. The presented methodology is easily applicable in clinical practice, can be generalized to other care settings, supports short- and long-term census forecasting, and informs staff resource planning.
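
    A compact sketch of the evaluation design in the two NICU records above: fit a seasonal ARIMA to a training period, forecast a holdout year, and compare against the fixed average census baseline. The census series, seasonal order and error metric below are illustrative stand-ins.

    ```python
    # Sketch: seasonal ARIMA census forecast vs. fixed-average baseline on a
    # simulated daily census with a weekly cycle and mild trend.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(6)
    n_train, n_test = 1827, 365
    t = np.arange(n_train + n_test)
    census = 40 + 0.002 * t + 3 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 2, len(t))
    train, test = census[:n_train], census[n_train:]

    fit = ARIMA(train, order=(1, 0, 0), seasonal_order=(1, 1, 1, 7)).fit()
    fc = fit.forecast(steps=n_test)

    baseline = np.full(n_test, train[-365:].mean())   # previous year's average
    mae = lambda f: np.mean(np.abs(f - test))
    print(f"ARIMA MAE: {mae(fc):.2f}  fixed-average MAE: {mae(baseline):.2f}")
    ```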

  19. Spatial and Temporal scales of time-averaged 700 MB height anomalies

    NASA Technical Reports Server (NTRS)

    Gutzler, D.

    1981-01-01

    The monthly and seasonal forecasting technique is based to a large extent on the extrapolation of trends in the positions of the centers of time averaged geopotential height anomalies. The complete forecasted height pattern is subsequently drawn around the forecasted anomaly centers. The efficacy of this technique was tested and time series of observed monthly mean and 5 day mean 700 mb geopotential heights were examined. Autocorrelation statistics are generated to document the tendency for persistence of anomalies. These statistics are compared to a red noise hypothesis to check for evidence of possible preferred time scales of persistence. Space-time spectral analyses at middle latitudes are checked for evidence of periodicities which could be associated with predictable month-to-month trends. A local measure of the average spatial scale of anomalies is devised for guidance in the completion of the anomaly pattern around the forecasted centers.

  20. Road Traffic Injury Trends in the City of Valledupar, Colombia. A Time Series Study from 2008 to 2012

    PubMed Central

    Rodríguez, Jorge Martín; Peñaloza, Rolando Enrique; Moreno Montoya, José

    2015-01-01

    Objective: To analyze the temporal behavior of road-traffic injuries (RTI) in Valledupar, Colombia from January 2008 to December 2012. Methodology: An observational study was conducted based on records from the Colombian National Legal Medicine and Forensic Sciences Institute regional office in Valledupar. Different variables were analyzed, such as the injured person's sex, age, education level, and type of road user; the timeframe, place and circumstances of crashes; and the vehicles involved. Furthermore, a time series analysis was conducted using an autoregressive integrated moving average model. Results: There were 105 events per month on average; 64.9% of RTI involved men; 82.3% of the persons injured were from 18 to 59 years of age, with an average age of 35.4 years; the road users most involved in RTI were motorcyclists (69%), followed by pedestrians (12%); and 70% had up to upper-secondary education. Sunday was the day with the most RTI occurrences, and 93% of the RTI occurred in the urban area. The time series showed a seasonal pattern and a significant trend effect. The modeling process verified the existence of both memory and related extrinsic variables. Conclusions: An RTI occurrence pattern was identified, which showed an upward trend during the period analyzed. Motorcyclists were the main road users involved in RTI, which suggests the need to design and implement specific measures for that type of road user, from regulations for graduated licensing for young drivers to monitoring road user behavior for the promotion of road safety. PMID:26657887

  1. Road Traffic Injury Trends in the City of Valledupar, Colombia. A Time Series Study from 2008 to 2012.

    PubMed

    Rodríguez, Jorge Martín; Peñaloza, Rolando Enrique; Moreno Montoya, José

    2015-01-01

    To analyze the temporal behavior of road-traffic injuries (RTI) in Valledupar, Colombia from January 2008 to December 2012. An observational study was conducted based on records from the Colombian National Legal Medicine and Forensic Sciences Institute regional office in Valledupar. Different variables were analyzed, such as the injured person's sex, age, education level, and type of road user; the timeframe, place and circumstances of crashes; and the vehicles involved. Furthermore, a time series analysis was conducted using an autoregressive integrated moving average model. There were 105 events per month on average; 64.9% of RTI involved men; 82.3% of the persons injured were from 18 to 59 years of age, with an average age of 35.4 years; the road users most involved in RTI were motorcyclists (69%), followed by pedestrians (12%); and 70% had up to upper-secondary education. Sunday was the day with the most RTI occurrences, and 93% of the RTI occurred in the urban area. The time series showed a seasonal pattern and a significant trend effect. The modeling process verified the existence of both memory and related extrinsic variables. An RTI occurrence pattern was identified, which showed an upward trend during the period analyzed. Motorcyclists were the main road users involved in RTI, which suggests the need to design and implement specific measures for that type of road user, from regulations for graduated licensing for young drivers to monitoring road user behavior for the promotion of road safety.

  2. Correlates of depression in bipolar disorder

    PubMed Central

    Moore, Paul J.; Little, Max A.; McSharry, Patrick E.; Goodwin, Guy M.; Geddes, John R.

    2014-01-01

    We analyse time series from 100 patients with bipolar disorder for correlates of depression symptoms. As the sampling interval is non-uniform, we quantify the extent of missing and irregular data using new measures of compliance and continuity. We find that uniformity of response is negatively correlated with the standard deviation of sleep ratings (ρ = –0.26, p = 0.01). To investigate the correlation structure of the time series themselves, we apply the Edelson–Krolik method for correlation estimation. We examine the correlation between depression symptoms for a subset of patients and find that self-reported measures of sleep and appetite/weight show a lower average correlation than other symptoms. Using surrogate time series as a reference dataset, we find no evidence that depression is correlated between patients, though we note a possible loss of information from sparse sampling. PMID:24352942
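
    For irregularly sampled series like these, the Edelson-Krolik approach estimates correlation by binning all pairwise sample products by time lag. The sketch below is a simplified version on synthetic data; the full method also includes a measurement-error correction, omitted here.

    ```python
    # Sketch: simplified Edelson-Krolik-style discrete correlation estimate for
    # two irregularly sampled series, binning all sample pairs by time lag.
    import numpy as np

    rng = np.random.default_rng(7)
    t_a = np.sort(rng.uniform(0, 100, 80))     # irregular sampling times
    t_b = np.sort(rng.uniform(0, 100, 90))
    a = np.sin(0.3 * t_a) + rng.normal(0, 0.3, len(t_a))
    b = np.sin(0.3 * t_b) + rng.normal(0, 0.3, len(t_b))

    ua = (a - a.mean()) / a.std()
    ub = (b - b.mean()) / b.std()
    lags = t_b[None, :] - t_a[:, None]         # all pairwise lags
    prods = ua[:, None] * ub[None, :]          # all pairwise products

    for lo in range(0, 10, 2):                 # 2-unit lag bins
        mask = (lags >= lo) & (lags < lo + 2)
        print(f"lag [{lo},{lo + 2}): DCF = {prods[mask].mean():.2f}")
    ```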

  3. Data mining on long-term barometric data within the ARISE2 project

    NASA Astrophysics Data System (ADS)

    Hupe, Patrick; Ceranna, Lars; Pilger, Christoph

    2016-04-01

    The Comprehensive nuclear-Test-Ban Treaty (CTBT) led to the implementation of an international infrasound array network. The International Monitoring System (IMS) network includes 48 certified stations, each providing data for up to 15 years. As part of work package 3 of the ARISE2 project (Atmospheric dynamics Research InfraStructure in Europe, phase 2), the data sets are being statistically evaluated with regard to atmospheric dynamics. The current study focuses on fluctuations of absolute air pressure. Time series have been analysed for 17 monitoring stations located all over the world between Greenland and Antarctica, spanning the latitudes so as to represent different climate zones and characteristic atmospheric conditions, which enables quantitative comparisons between those regions. Analyses include wavelet power spectra, multi-annual time series of average variances at long-wave scales, and spectral densities used to derive characteristic features and special events. The evaluations reveal periodicities in average variances on the 2- to 20-day scale, with a maximum in the winter months and a minimum in summer of the respective hemisphere. This applies chiefly to time series of IMS stations beyond the tropics, where the dominance of cyclones and anticyclones changes with the seasons. Furthermore, spectral density analyses show striking signals for several dynamic features within one day, e.g., the semidiurnal tide.
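
    A minimal sketch of one of the analyses listed above, a continuous wavelet power spectrum of a pressure record, using PyWavelets on a synthetic series containing a semidiurnal tide and a synoptic-scale wave (all amplitudes and the wavelet choice are illustrative):

    ```python
    # Sketch: continuous wavelet power spectrum of a synthetic barometric record.
    import numpy as np
    import pywt

    rng = np.random.default_rng(8)
    hours = 24 * 120                           # ~4 months of hourly pressure
    t = np.arange(hours)
    pressure = (1013
                + 1.0 * np.sin(2 * np.pi * t / 12)        # semidiurnal tide
                + 5.0 * np.sin(2 * np.pi * t / (24 * 6))  # ~6-day synoptic wave
                + rng.normal(0, 0.5, hours))

    scales = np.arange(2, 400, 4)
    coef, freqs = pywt.cwt(pressure - pressure.mean(), scales, "morl",
                           sampling_period=1.0)           # frequencies in 1/hour
    power = np.abs(coef) ** 2
    dominant = 1 / freqs[power.mean(axis=1).argmax()]
    print(f"dominant period ~{dominant:.0f} hours")
    ```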

  4. [The trial of business data analysis at the Department of Radiology by constructing the auto-regressive integrated moving-average (ARIMA) model].

    PubMed

    Tani, Yuji; Ogasawara, Katsuhiko

    2012-01-01

    This study aimed to contribute to the management of a healthcare organization by providing management information through time-series analysis of business data accumulated in the hospital information system, which had not been utilized thus far. We examined the performance of a prediction method based on the auto-regressive integrated moving-average (ARIMA) model, using business data obtained at the Radiology Department. We built the model from the number of radiological examinations in the past 9 years, predicted the number of radiological examinations in the final year, and then compared the actual values with the forecast values. We established that the performance prediction method was simple and cost-effective, using free software, and that a simple model could be built by removing trend components from the data in pre-processing. The difference between predicted and actual values was 10%; however, it was more important to understand the chronological change than the individual time-series values. Furthermore, the method is highly versatile and adaptable to general time-series data. Therefore, different healthcare organizations can use this method for the analysis and forecasting of their business data.

  5. Deriving Vegetation Dynamics of Natural Terrestrial Ecosystems from MODIS NDVI/EVI Data over Turkey.

    PubMed

    Evrendilek, Fatih; Gulbeyaz, Onder

    2008-09-01

    The 16-day composite MODIS vegetation indices (VIs) at 500-m resolution for the period from 2000 to 2007 were seasonally averaged on the basis of the estimated distribution of 16 potential natural terrestrial ecosystems (NTEs) across Turkey. Graphical and statistical analyses of the time-series VIs for the NTEs spatially disaggregated in terms of biogeoclimate zones and land cover types included descriptive statistics, correlations, discrete Fourier transform (DFT), time-series decomposition, and simple linear regression (SLR) models. Our spatio-temporal analyses revealed that both MODIS VIs, on average, depicted similar seasonal variations for the NTEs, with the NDVI values having higher mean and SD values. The seasonal VIs were most correlated in decreasing order for: barren/sparsely vegetated land > grassland > shrubland/woodland > forest; (sub)nival > warm temperate > alpine > cool temperate > boreal = Mediterranean; and summer > spring > autumn > winter. The most pronounced differences between the MODIS VI responses over Turkey occurred in boreal and Mediterranean climate zones and forests, and in winter (the senescence phase of the growing season). Our results showed the potential of the time-series MODIS VI datasets for estimating and monitoring seasonal and interannual ecosystem dynamics over Turkey, a potential that needs to be further improved and refined through systematic and extensive field measurements and validations across various biomes.
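
    One of the time-series analyses listed above, decomposition into trend, seasonal and residual components, can be sketched with statsmodels on a synthetic VI series with 23 composites per year (the amplitudes and trend are illustrative):

    ```python
    # Sketch: additive decomposition of an NDVI-like 16-day composite series.
    import numpy as np
    from statsmodels.tsa.seasonal import seasonal_decompose

    rng = np.random.default_rng(9)
    years, per_year = 8, 23                    # 2000-2007, 16-day composites
    n = years * per_year
    t = np.arange(n)
    ndvi = (0.45 + 0.0001 * t                              # weak interannual trend
            + 0.2 * np.sin(2 * np.pi * t / per_year)       # growing-season cycle
            + rng.normal(0, 0.02, n))

    result = seasonal_decompose(ndvi, period=per_year, model="additive")
    print("seasonal amplitude ~", np.ptp(result.seasonal[:per_year]).round(3))
    ```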

  6. Intimate partner violence in Madrid: a time series analysis (2008-2016).

    PubMed

    Sanz-Barbero, Belén; Linares, Cristina; Vives-Cases, Carmen; González, José Luis; López-Ossorio, Juan José; Díaz, Julio

    2018-06-02

    This study analyzes whether there are time patterns in different intimate partner violence (IPV) indicators and aims to obtain models that can predict the behavior of these time series. Univariate autoregressive moving average models were used to analyze the time series corresponding to the number of daily calls to the 016 telephone IPV helpline and the number of daily police reports filed in the Community of Madrid during the period 2008-2015. Predictions were made for both dependent variables for 2016. The daily number of calls to the 016 telephone IPV helpline decreased during January 2008-April 2012 and increased during April 2012-December 2015. No statistically significant change was observed in the trend of the number of daily IPV police reports. The number of IPV police reports filed increased on weekends and on Christmas holidays. The number of calls to the 016 IPV help line increased on Mondays. Using data from 2008 to 2015, the univariate autoregressive moving average models predicted 64.2% of calls to the 016 telephone IPV helpline and 73.2% of police reports filed during 2016 in the Community of Madrid. Our results suggest the need for an increase in police and judicial resources on nonwork days. Also, the 016 telephone IPV helpline should be especially active on work days. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. Self-averaging in complex brain neuron signals

    NASA Astrophysics Data System (ADS)

    Bershadskii, A.; Dremencov, E.; Fukayama, D.; Yadid, G.

    2002-12-01

    Nonlinear statistical properties of the Ventral Tegmental Area (VTA) of the limbic brain are studied in vivo. The VTA plays a key role in the generation of pleasure and in the development of psychological drug addiction. It is shown that spiking time series of the VTA dopaminergic neurons exhibit long-range correlations with self-averaging behavior. This specific VTA phenomenon has no relation to the VTA rewarding function. The latter result reveals the complex role of the VTA in the limbic brain.

  8. Hierarchical time series bottom-up approach for forecast the export value in Central Java

    NASA Astrophysics Data System (ADS)

    Mahkya, D. A.; Ulama, B. S.; Suhartono

    2017-10-01

    The purpose of this study is to obtain the best model for, and predictions of, the export value of Central Java using hierarchical time series. Export value is an injection variable in a country's economy: if a country's export value increases, its economy grows. Appropriate modeling is therefore needed to predict the export value, especially in Central Java. Export value in Central Java is grouped into 21 commodities, each with a different pattern. One time-series approach that can handle such grouped data is the hierarchical approach; here, the bottom-up variant of hierarchical time series is used. The individual series at all levels are forecast using Autoregressive Integrated Moving Average (ARIMA), Radial Basis Function Neural Network (RBFNN), and hybrid ARIMA-RBFNN models. The best models were selected using the Symmetric Mean Absolute Percentage Error (sMAPE). The results of the analysis showed that, for the export value of Central Java, the bottom-up approach with hybrid ARIMA-RBFNN modeling can be used for long-term predictions, while for short- and medium-term predictions a bottom-up approach with RBFNN modeling can be used. Overall, the bottom-up approach with RBFNN modeling gives the best results.
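
    The bottom-up approach itself is simple to sketch: forecast every bottom-level series separately, then sum the forecasts to obtain the aggregate. The example below uses ARIMA for each of three synthetic commodity series standing in for the 21 commodities; the RBFNN and hybrid models are not reproduced here.

    ```python
    # Sketch: bottom-up hierarchical forecasting with one ARIMA per bottom series.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(10)
    n, h = 120, 12                                   # 10 years monthly, 1-year horizon
    series = [
        100 + 0.5 * np.arange(n) + rng.normal(0, 5, n),          # trending
        50 + 10 * np.sin(2 * np.pi * np.arange(n) / 12)
            + rng.normal(0, 3, n),                               # seasonal
        80 + np.cumsum(rng.normal(0, 2, n)),                     # random walk
    ]

    bottom_forecasts = []
    for y in series:
        fit = ARIMA(y, order=(1, 1, 1)).fit()        # one model per bottom series
        bottom_forecasts.append(fit.forecast(steps=h))

    total_forecast = np.sum(bottom_forecasts, axis=0)  # bottom-up aggregation
    print(total_forecast.round(1))
    ```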

  9. 1977 Nationwide Personal Transportation Study : vehicle occupancy

    DOT National Transportation Integrated Search

    1981-04-01

    This report is part of a series that presents findings from the 1977 Nationwide Personal Transportation Study (NPTS). This report contains average vehicle occupancy rates by trip characteristics (trip purpose, trip length, time of day and day of the ...

  10. Nonlinear time-series-based adaptive control applications

    NASA Technical Reports Server (NTRS)

    Mohler, R. R.; Rajkumar, V.; Zakrzewski, R. R.

    1991-01-01

    A control design methodology based on a nonlinear time-series reference model is presented. It is indicated by highly nonlinear simulations that such designs successfully stabilize troublesome aircraft maneuvers undergoing large changes in angle of attack as well as large electric power transients due to line faults. In both applications, the nonlinear controller was significantly better than the corresponding linear adaptive controller. For the electric power network, a flexible AC transmission system with series capacitor power feedback control is studied. A bilinear autoregressive moving average reference model is identified from system data, and the feedback control is manipulated according to a desired reference state. The control is optimized according to a predictive one-step quadratic performance index. A similar algorithm is derived for control of rapid changes in aircraft angle of attack over a normally unstable flight regime. In the latter case, however, a generalization of a bilinear time-series model reference includes quadratic and cubic terms in angle of attack.
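
    The one-step quadratic optimization described above has a closed form for a scalar bilinear model. The sketch below assumes a toy model y[t+1] = a*y[t] + (b + c*y[t])*u[t] with illustrative coefficients, not the identified aircraft or power-network models from the report.

    ```python
    # Sketch: one-step quadratic predictive control of a scalar bilinear model.
    # Minimizing J = (y[t+1] - r)^2 + lam*u^2 over u has a closed-form solution.
    import numpy as np

    a, b, c = 0.9, 0.5, 0.2        # assumed bilinear model coefficients
    lam, r = 0.1, 1.0              # control weight and reference state

    y = 0.0
    for t in range(20):
        g = b + c * y                          # input gain depends on the state
        u = g * (r - a * y) / (g**2 + lam)     # argmin of the one-step cost
        y = a * y + g * u                      # plant assumed to match the model
        print(f"t={t:2d}  u={u:+.3f}  y={y:.3f}")
    ```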

  11. A New Normal for the Sea Ice Index

    NASA Technical Reports Server (NTRS)

    Fetterer, Florence; Windnagel, Ann; Meier, Walter N.

    2014-01-01

    The NSIDC Sea Ice Index is a popular data product that shows users how ice extent and concentration have changed since the beginning of the passive microwave satellite record in 1978. It shows time series of monthly ice extent anomalies rather than actual extent values, in order to emphasize the information the data are carrying. Along with the time series, an image of average extent for the previous month is shown as a white field, with a pink line showing the median extent for that month. These are updated monthly; corresponding daily products are updated daily.

  12. Decoding Dynamic Brain Patterns from Evoked Responses: A Tutorial on Multivariate Pattern Analysis Applied to Time Series Neuroimaging Data.

    PubMed

    Grootswagers, Tijl; Wardle, Susan G; Carlson, Thomas A

    2017-04-01

    Multivariate pattern analysis (MVPA) or brain decoding methods have become standard practice in analyzing fMRI data. Although decoding methods have been extensively applied in brain-computer interfaces, these methods have only recently been applied to time series neuroimaging data such as MEG and EEG to address experimental questions in cognitive neuroscience. In a tutorial style review, we describe a broad set of options to inform future time series decoding studies from a cognitive neuroscience perspective. Using example MEG data, we illustrate the effects that different options in the decoding analysis pipeline can have on experimental results where the aim is to "decode" different perceptual stimuli or cognitive states over time from dynamic brain activation patterns. We show that decisions made at both preprocessing (e.g., dimensionality reduction, subsampling, trial averaging) and decoding (e.g., classifier selection, cross-validation design) stages of the analysis can significantly affect the results. In addition to standard decoding, we describe extensions to MVPA for time-varying neuroimaging data including representational similarity analysis, temporal generalization, and the interpretation of classifier weight maps. Finally, we outline important caveats in the design and interpretation of time series decoding experiments.
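
    A minimal sketch of the core time-resolved decoding step reviewed above: cross-validate a linear classifier independently at each time point of an epoch. The "MEG" data are synthetic, with a decodable signal injected in a known time window.

    ```python
    # Sketch: time-resolved decoding of two stimulus classes from synthetic
    # trials x channels x time data.
    import numpy as np
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(11)
    n_trials, n_channels, n_times = 100, 30, 50
    X = rng.normal(0, 1, (n_trials, n_channels, n_times))
    y = np.repeat([0, 1], n_trials // 2)
    # Inject a decodable signal for class 1 between time points 20 and 35
    X[y == 1, :5, 20:35] += 0.8

    accuracy = np.array([
        cross_val_score(LinearSVC(), X[:, :, t], y, cv=5).mean()
        for t in range(n_times)
    ])
    print("peak accuracy %.2f at time point %d" % (accuracy.max(), accuracy.argmax()))
    ```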

  13. Automated classification of Permanent Scatterers time-series based on statistical characterization tests

    NASA Astrophysics Data System (ADS)

    Berti, Matteo; Corsini, Alessandro; Franceschini, Silvia; Iannacone, Jean Pascal

    2013-04-01

    The application of space borne synthetic aperture radar interferometry has progressed, over the last two decades, from the pioneering use of single interferograms for analyzing changes on the earth's surface to the development of advanced multi-interferogram techniques for analyzing any sort of natural phenomenon that involves movement of the ground. The success of multi-interferogram techniques in the analysis of natural hazards such as landslides and subsidence is widely documented in the scientific literature and demonstrated by the consensus among end-users. Despite the great potential of the technique, radar interpretation of slope movements is generally based on the sole analysis of average displacement velocities, while the information contained in multi-interferogram time series is often overlooked if not completely neglected. The underuse of PS time series is probably due to the detrimental effect of residual atmospheric errors, which leave the PS time series with erratic, irregular fluctuations that are often difficult to interpret, and also to the difficulty of performing a visual, supervised analysis of the time series for a large dataset. In this work we present a procedure for automatic classification of PS time series based on a series of statistical characterization tests. The procedure classifies the time series into six distinctive target trends (0 = uncorrelated; 1 = linear; 2 = quadratic; 3 = bilinear; 4 = discontinuous without constant velocity; 5 = discontinuous with change in velocity) and retrieves for each trend a series of descriptive parameters which can be efficiently used to characterize the temporal changes of ground motion. The classification algorithms were developed and tested using an ENVISAT dataset available in the frame of the EPRS-E project (Extraordinary Plan of Environmental Remote Sensing) of the Italian Ministry of Environment (track "Modena", Northern Apennines). This dataset was generated using standard processing, so the time series are affected by a significant noise-to-signal ratio. The results of the analysis show that even with such a rough-quality dataset, our automated classification procedure can greatly improve radar interpretation of mass movements. In general, uncorrelated PS (type 0) are concentrated in flat areas such as fluvial terraces and valley bottoms, and along stable watershed divides; linear PS (type 1) are mainly located on slopes (both inside and outside mapped landslides) or near the edges of scarps or steep slopes; non-linear PS (types 2 to 5) typically fall inside landslide deposits or in the surrounding areas. The spatial distribution of classified PS allows detection of deformation phenomena not visible from the average velocity alone, and provides important information on the temporal evolution of the phenomena, such as acceleration, deceleration, seasonal fluctuations, and abrupt or continuous changes of the displacement rate. Based on these encouraging results we integrated all the classification algorithms into a Graphical User Interface (called PSTime), which is freely available as a standalone application.

  14. Hybrid model for forecasting time series with trend, seasonal and calendar variation patterns

    NASA Astrophysics Data System (ADS)

    Suhartono; Rahayu, S. P.; Prastyo, D. D.; Wijayanti, D. G. P.; Juliyanto

    2017-09-01

    Most monthly time series data in economics and business in Indonesia and other Moslem countries not only contain trend and seasonal patterns, but are also affected by two types of calendar variation effects, i.e. the effect of the number of working or trading days, and holiday effects. The purpose of this research is to develop a hybrid model, or a combination of several forecasting models, to predict time series that contain trend, seasonal and calendar variation patterns. This hybrid model is a combination of classical models (namely time series regression and ARIMA models) and/or modern methods (an artificial intelligence method, i.e. Artificial Neural Networks). A simulation study was used to show that the proposed procedure for building the hybrid model works well for forecasting time series with trend, seasonal and calendar variation patterns. Furthermore, the proposed hybrid model is applied to forecasting real data, i.e. monthly data on the inflow and outflow of currency at Bank Indonesia. The results show that the hybrid model tends to provide more accurate forecasts than the individual forecasting models. Moreover, this result is in line with the third conclusion of the M3 competition, i.e. that hybrid models on average provide more accurate forecasts than individual models.

  15. Visual Circular Analysis of 266 Years of Sunspot Counts.

    PubMed

    Buelens, Bart

    2016-06-01

    Sunspots, colder areas that are visible as dark spots on the surface of the Sun, have been observed for centuries. Their number varies with a period of ∼11 years, a phenomenon closely related to the solar activity cycle. Recently, observation records dating back to 1749 have been reassessed, resulting in the release of a time series of sunspot numbers covering 266 years of observations. This series is analyzed using circular analysis to determine the periodicity of the occurrence of solar maxima. The circular analysis is combined with spiral graphs to provide a single visualization, simultaneously showing the periodicity of the series, the degree to which individual cycle lengths deviate from the average period, and differences in levels reached during the different maxima. This type of visualization of cyclic time series with varying cycle lengths in which significant events occur periodically is broadly applicable. It is aimed particularly at science communication, education, and public outreach.
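
    A minimal sketch of the circular-analysis idea: map event times onto a phase angle of an assumed 11-year cycle and summarize them with the circular mean and mean resultant length. The list of maximum years below is approximate and purely illustrative.

    ```python
    # Sketch: circular statistics of solar-maximum timing on an assumed
    # 11-year cycle. The years listed are approximate, for illustration only.
    import numpy as np

    maxima = np.array([1761, 1769, 1778, 1787, 1804, 1816, 1829, 1837, 1848,
                       1860, 1870, 1883, 1893, 1905, 1917, 1928, 1937, 1947,
                       1957, 1968, 1979, 1989, 2001, 2014])
    period = 11.0
    angles = 2 * np.pi * (maxima % period) / period      # phase on the cycle

    # Circular mean and mean resultant length (R near 1 = tight periodicity)
    C, S = np.cos(angles).mean(), np.sin(angles).mean()
    R = np.hypot(C, S)
    mean_phase = np.arctan2(S, C) % (2 * np.pi)
    print(f"mean phase = {mean_phase:.2f} rad, resultant length R = {R:.2f}")
    ```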

  16. deltaGseg: macrostate estimation via molecular dynamics simulations and multiscale time series analysis.

    PubMed

    Low, Diana H P; Motakis, Efthymios

    2013-10-01

    Binding free energy calculations obtained through molecular dynamics simulations reflect intermolecular interaction states through a series of independent snapshots. Typically, the free energies of multiple simulated series (each with slightly different starting conditions) need to be estimated. Previous approaches carry out this task by moving averages at certain decorrelation times, assuming that the system comes from a single conformation description of binding events. Here, we discuss a more general approach that uses statistical modeling, wavelets denoising and hierarchical clustering to estimate the significance of multiple statistically distinct subpopulations, reflecting potential macrostates of the system. We present the deltaGseg R package that performs macrostate estimation from multiple replicated series and allows molecular biologists/chemists to gain physical insight into the molecular details that are not easily accessible by experimental techniques. deltaGseg is a Bioconductor R package available at http://bioconductor.org/packages/release/bioc/html/deltaGseg.html.
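
    The two core steps attributed to deltaGseg, wavelet denoising of replicated series and hierarchical clustering into candidate macrostates, can be roughly re-expressed in Python with PyWavelets and SciPy (this is not the R package's implementation; thresholds and series are illustrative):

    ```python
    # Sketch: wavelet-denoise replicated free-energy-like series, then cluster
    # their levels hierarchically into candidate macrostates.
    import numpy as np
    import pywt
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(12)
    # Three replicated series fluctuating around two macrostate levels
    levels = [-50.0, -50.0, -42.0]
    series = [lvl + rng.normal(0, 2.0, 512) for lvl in levels]

    def denoise(x, wavelet="db4", level=4):
        coeffs = pywt.wavedec(x, wavelet, level=level)
        # Universal threshold estimated from the finest detail coefficients
        thr = np.median(np.abs(coeffs[-1])) / 0.6745 * np.sqrt(2 * np.log(len(x)))
        coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)

    denoised_means = np.array([denoise(x).mean() for x in series]).reshape(-1, 1)
    Z = linkage(denoised_means, method="average")
    print("cluster labels:", fcluster(Z, t=2, criterion="maxclust"))
    ```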

  17. Estimating short-run and long-run interaction mechanisms in interictal state.

    PubMed

    Ozkaya, Ata; Korürek, Mehmet

    2010-04-01

    We address the issue of analyzing electroencephalogram (EEG) recordings from seizure patients in order to test, model and determine the statistical properties that distinguish between EEG states (interictal, pre-ictal, ictal), by introducing a new class of time series analysis methods. In the present study we first employ statistical methods to determine the non-stationary behavior of focal interictal epileptiform series within very short time intervals; second, for such intervals that are deemed non-stationary, we suggest modeling by the Autoregressive Integrated Moving Average (ARIMA) process, well known in time series analysis. We finally address the question of causal relationships between epileptic states and between brain areas during epileptiform activity. We estimate the interaction between different EEG series (channels) in short time intervals by performing Granger-causality analysis, and estimate such interaction in long time intervals by employing cointegration analysis; both methods are well known in econometrics. We find, first, that the causal relationship between neuronal assemblies can be identified according to the duration and direction of their possible mutual influences; and second, that although the estimated bidirectional causality in short time intervals indicates that the neuronal ensembles positively affect each other, in long time intervals neither of them is affected (in terms of increasing amplitudes) by this relationship. Moreover, cointegration analysis of the EEG series enables us to identify whether there is a causal link from the interictal state to the ictal state.
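
    Both interaction tests named above are available in statsmodels. A minimal sketch on a synthetic pair of signals with a built-in short-run influence and a long-run tie:

    ```python
    # Sketch: Granger causality for short-run influence and the Engle-Granger
    # test for a long-run (cointegrating) relationship, on synthetic signals.
    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests, coint

    rng = np.random.default_rng(13)
    n = 500
    x = np.cumsum(rng.normal(0, 1, n))            # integrated "channel 1"
    y = 0.8 * x + rng.normal(0, 1, n)             # long-run tied to x
    y[1:] += 0.3 * np.diff(x)                     # plus a short-run influence

    # Short-run: does x Granger-cause y? (tests up to lag 2, results printed)
    res = grangercausalitytests(np.column_stack([y, x]), maxlag=2)

    # Long-run: are x and y cointegrated?
    t_stat, p_value, _ = coint(y, x)
    print(f"Engle-Granger p-value: {p_value:.4f}")
    ```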

  18. Processing short-term and long-term information with a combination of polynomial approximation techniques and time-delay neural networks.

    PubMed

    Fuchs, Erich; Gruber, Christian; Reitmaier, Tobias; Sick, Bernhard

    2009-09-01

    Neural networks are often used to process temporal information, i.e., any kind of information related to time series. In many cases, time series contain short-term and long-term trends or behavior. This paper presents a new approach to capture temporal information with various reference periods simultaneously. A least squares approximation of the time series with orthogonal polynomials is used to describe short-term trends contained in a signal (average, increase, curvature, etc.). Long-term behavior is modeled with the tapped delay lines of a time-delay neural network (TDNN). This network takes the coefficients of the orthogonal expansion of the approximating polynomial as inputs, thus considering short-term and long-term information efficiently. The advantages of the method are demonstrated by means of artificial data and two real-world application examples, the prediction of the user number in a computer network and online tool wear classification in turning.
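
    The short-term feature extraction described above can be sketched by least-squares fitting Legendre polynomials over a sliding window; each window yields a small coefficient vector (roughly: average, slope, curvature, ...) that would feed the TDNN's tapped delay lines. The window length and degree below are illustrative, and the network itself is omitted.

    ```python
    # Sketch: sliding-window orthogonal-polynomial features for a noisy signal.
    import numpy as np
    from numpy.polynomial import legendre

    rng = np.random.default_rng(14)
    t = np.arange(300)
    signal = np.sin(0.05 * t) + 0.002 * t + rng.normal(0, 0.1, 300)

    window, degree = 20, 3
    features = []
    for start in range(0, len(signal) - window, window // 2):
        seg = signal[start:start + window]
        x = np.linspace(-1, 1, window)               # Legendre domain
        coeffs = legendre.legfit(x, seg, degree)     # short-term trend summary
        features.append(coeffs)

    features = np.array(features)                    # inputs for the TDNN
    print(features.shape, features[0].round(3))
    ```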

  19. GPS Time Series and Geodynamic Implications for the Hellenic Arc Area, Greece

    NASA Astrophysics Data System (ADS)

    Hollenstein, Ch.; Heller, O.; Geiger, A.; Kahle, H.-G.; Veis, G.

    The quantification of crustal deformation and its temporal behavior is an important contribution to earthquake hazard assessment. With GPS measurements, especially from continuous operating stations, pre-, co-, post- and interseismic movements can be recorded and monitored. We present results of a continuous GPS network which has been operated in the Hellenic Arc area, Greece, since 1995. In order to obtain coordinate time series of high precision which are representative for crustal deformation, a main goal was to eliminate effects which are not of tectonic origin. By applying different steps of improvement, non-tectonic irregularities were reduced significantly, and the precision could be improved by an average of 40%. The improved time series are used to study the crustal movements in space and time. They serve as a base for the estimation of velocities and for the visualization of the movements in terms of trajectories. Special attention is given to large earthquakes (M>6), which occurred near GPS sites during the measuring time span.

  20. On the construction of a time base and the elimination of averaging errors in proxy records

    NASA Astrophysics Data System (ADS)

    Beelaerts, V.; De Ridder, F.; Bauwens, M.; Schmitz, N.; Pintelon, R.

    2009-04-01

    Proxies are sources of climate information stored in natural archives (e.g. ice cores, sediment layers on ocean floors and animals with calcareous marine skeletons). Measuring these proxies produces very short records and mostly involves sampling solid substrates, which is subject to the following two problems. Problem 1: natural archives are sampled equidistantly along their accretion axis. Starting from these distance series, a time series needs to be constructed, as comparison of different data records is only meaningful on a time grid. The time series will be non-equidistant, as the accretion rate is non-constant. Problem 2: a typical example of sampling solid substrates is drilling. Because of the dimensions of the drill, the holes drilled are not infinitesimally small. Consequently, samples are not taken at a point in distance, but over a volume in distance. This holds for most sampling methods in solid substrates. As a consequence, when the continuous proxy signal is sampled, it is averaged over the volume of the sample, resulting in an underestimation of the amplitude. Whether this averaging effect is significant depends on the volume of the sample and the variations of interest in the proxy signal. Starting from the measured signal, the continuous signal needs to be reconstructed in order to eliminate these averaging errors. The aim is to provide an efficient identification algorithm to identify the non-linearities in the distance-time relationship, called time base distortions (TBD), and to correct for the averaging effects. Because this is a parametric method, an assumption about the proxy signal needs to be made: the proxy record on a time base is assumed to be harmonic; this is an obvious assumption because natural archives often exhibit a seasonal cycle. In a first approach, the averaging effects are assumed to act in one direction only, i.e. the direction of the axis along which the measurements were performed. The measured, averaged proxy signal is modeled by the following signal model:
    \bar{y}(n,\theta) = \frac{1}{\delta} \int_{n\Delta - \delta/2}^{n\Delta + \delta/2} y(m,\theta)\, dm
    where m is the position, x(m) = \Delta m, \delta is the averaging width, \theta are the unknown parameters, and y(m,\theta) is the proxy signal to be identified (the proxy signal as found in the natural archive), modeled as
    y(m,\theta) = A_0 + \sum_{k=1}^{H} \left[ A_k \sin(k\omega t(m)) + A_{k+H} \cos(k\omega t(m)) \right]
    with
    t(m) = m T_S + g(m) T_S.
    Here T_S = 1/f_S is the sampling period, f_S the sampling frequency, and g(m) the unknown time base distortion (TBD). In this work a splines approximation of the TBD is chosen:
    g(m) = \sum_{l=1}^{b} b_l \phi_l(m),
    where b is a vector of unknown time base distortion parameters and \phi is a set of splines. The estimates of the unknown parameters were obtained with a nonlinear least squares algorithm. The vessel density measured in the mangrove tree R. mucronata was used to illustrate the method; vessel density is a proxy for rainfall in tropical regions. The proxy data on the newly constructed time base showed a yearly periodicity, as expected, and the correction for the averaging effect increased the amplitude by 11.18%.
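
    A minimal sketch of the identification problem described above: jointly estimate the harmonic parameters and the time base distortion by nonlinear least squares, with the box averaging applied to the model before comparing with the measurements. For simplicity a piecewise-linear (hat-function) basis replaces the splines, and all settings are illustrative.

    ```python
    # Sketch: joint estimation of harmonic parameters and a time base distortion
    # (TBD) from a box-averaged measurement, via nonlinear least squares.
    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(15)
    N, Ts, omega = 200, 1.0, 2 * np.pi / 20.0
    m = np.arange(N)
    knots = np.linspace(0, N - 1, 6)                 # TBD basis nodes (assumed)

    def model(params, width=3):
        A0, A1, A2, *b = params
        g = np.interp(m, knots, b)                   # hat-function TBD g(m)
        tm = m * Ts + g * Ts
        y = A0 + A1 * np.sin(omega * tm) + A2 * np.cos(omega * tm)
        # Box averaging over the sample width (stand-in for volume averaging)
        kernel = np.ones(width) / width
        return np.convolve(y, kernel, mode="same")

    true = [0.5, 1.0, 0.3, 0.0, 1.5, -1.0, 0.5, 2.0, 0.0]
    measured = model(true) + rng.normal(0, 0.05, N)

    fit = least_squares(lambda p: model(p) - measured,
                        x0=[0, 0.5, 0.5, 0, 0, 0, 0, 0, 0])
    print("estimated harmonic amplitudes:", fit.x[:3].round(2))
    ```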

  1. Soft-plastic brace for lower limb fractures in patients with spinal cord injury.

    PubMed

    Uehara, K; Akai, M; Kubo, T; Yamasaki, N; Okuma, Y; Tobimatsu, Y; Iwaya, T

    2013-04-01

    Retrospective study at a rehabilitation center. Patients with spinal cord injury, even if they are wheelchair users, sometimes suffer from fractures of the lower limb bones. As their bones are too weak for surgery, and because a precise reduction is not required for restoration, such patients are often indicated for conservative treatment. This case series study investigated the use of a hinged, soft-plastic brace as a conservative approach to treating fractures of the lower extremities of patients with spinal cord injury. National Rehabilitation Center, Japan. Fifteen patients (male, n=10; female, n=5; average age, 52.7 years) with 19 fractures of the femur or the tibia who were treated with a newly developed hinged, soft-plastic brace were studied. All of them used wheelchairs. We analyzed the time taken for fracture union, the time spent wearing orthotics, the degree of malalignment, the femorotibial angle and side effects. The fractures in this series were caused by relatively low-energy impact. The average time taken for fracture union was 80.1 (37-189) days, and the average amount of time spent wearing orthotics was 77.9 (42-197) days. On final X-ray imaging, the average femorotibial angle was 176.9° (s.d. ±8.90), and 15° of malalignment in the sagittal plane occurred in one patient. A hinged, soft-plastic brace is a useful option as a conservative approach for treating fractures of the lower extremities in patients with spinal cord injury.

  2. XDATA

    DTIC Science & Technology

    2017-12-01

    Data analysis tools which operate on varied data sources including time series ... and raw detections from geo-located tweets: micro-paths (10M, no distance/time filter), raw tracks (10M), raw detections (10M). Approved for public release.

  3. Solar Activity and the Sea-surface Temperature Record-evidence of a Long-period Variation in Solar Total Irradiance

    NASA Technical Reports Server (NTRS)

    Reid, George C.

    1990-01-01

    There have been many suggestions over the years of a connection between solar activity and the Earth's climate on time scales long compared to the 11-year sunspot cycle. They have remained little more than suggestions largely because of the major uncertainties in the climate record itself, and the difficulty in trying to compile a global average from an assembly of measurements that are uneven in both quality and distribution. Different climate time series have been compared with solar activity, some suggesting a positive correlation, some a negative correlation, and some no correlation at all. The only excuse for making yet another such suggestion is that much effort has been devoted in recent years to compiling climate records for the past century or more that are internally consistent and believable, and that a decadal-scale record of solar total irradiance is emerging from spacecraft measurements and can be used to set limits on the variation that is likely to have occurred on these time scales. The work described here was originally inspired by the observation that the time series of globally averaged sea-surface temperatures (SST) over the past 120 years or so, as compiled by the British Meteorological Office group (Folland and Kates, 1984), bore a reasonable similarity to the long-term average sunspot number, which is an indicator of the secular variability of solar activity. The two time series are shown with the sunspot number plotted as the 135-month running mean and the SST variation plotted as the departure from an arbitrary average value. The simplest explanation of the similarity, if one accepts it as other than coincidental, is that the sun's luminosity may have been varying more or less in step with the level of solar activity, or in other words that there is a close coupling between the sun's magnetic condition and its radiative output on time scales longer than the 11-year cycle. Such an idea is not new, and in fact the time series shown can be regarded as a modern extension of the proposal put forward by Eddy (1977) to explain the covariance between various global climate indicators and solar activity as revealed by the C-14 record over the past millennium.

  4. Sensitivity analysis of machine-learning models of hydrologic time series

    NASA Astrophysics Data System (ADS)

    O'Reilly, A. M.

    2017-12-01

    Sensitivity analysis traditionally has been applied to assessing model response to perturbations in model parameters, where the parameters are those model input variables adjusted during calibration. Unlike physics-based models, where parameters represent real phenomena, the equivalent of parameters for machine-learning models are simply mathematical "knobs" that are automatically adjusted during training/testing/verification procedures. Thus the challenge of extracting knowledge of hydrologic system functionality from machine-learning models lies in their very nature, leading to the label "black box." Sensitivity analysis of the forcing-response behavior of machine-learning models, however, can provide understanding of how the physical phenomena represented by model inputs affect the physical phenomena represented by model outputs. As part of a previous study, hybrid spectral-decomposition artificial neural network (ANN) models were developed to simulate the observed behavior of hydrologic response contained in multidecadal datasets of lake water level, groundwater level, and spring flow. Model inputs used moving window averages (MWA) to represent various frequencies and frequency-band components of time series of rainfall and groundwater use. Using these forcing time series, the MWA-ANN models were trained to predict time series of lake water level, groundwater level, and spring flow at 51 sites in central Florida, USA. A time series of sensitivities for each MWA-ANN model was produced by perturbing the forcing time series and computing the change in the response time series per unit change in perturbation. Variations in forcing-response sensitivities are evident between response types (lake water level, groundwater level, or spring flow), spatially (among sites of the same type), and temporally. Two characteristics generally common among sites are more uniform sensitivities to rainfall over time and notable increases in sensitivities to groundwater use during significant drought periods.
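
    A minimal sketch of the perturbation procedure described above, using a generic scikit-learn network and synthetic forcing data as stand-ins for the MWA-ANN models; the finite-difference step size and the column layout are assumptions, not the study's setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in data: columns could be rainfall MWAs and a pumping MWA.
rng = np.random.default_rng(1)
X = rng.random((500, 3))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.1 * rng.standard_normal(500)

ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000,
                   random_state=0).fit(X, y)

def sensitivity(model, X, j, eps=1e-3):
    """d(response)/d(input_j), estimated by central finite differences."""
    Xp, Xm = X.copy(), X.copy()
    Xp[:, j] += eps
    Xm[:, j] -= eps
    return (model.predict(Xp) - model.predict(Xm)) / (2 * eps)

# one sensitivity value per input pattern, i.e. a time series of sensitivities
s_rain = sensitivity(ann, X, j=0)
s_pump = sensitivity(ann, X, j=2)
print("mean sensitivity to rainfall input:", s_rain.mean().round(2))
print("mean sensitivity to pumping input:", s_pump.mean().round(2))
```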

  5. Time lag between immigration and tuberculosis rates in immigrants in the Netherlands: a time-series analysis.

    PubMed

    van Aart, C; Boshuizen, H; Dekkers, A; Korthals Altes, H

    2017-05-01

    In low-incidence countries, most tuberculosis (TB) cases are foreign-born. We explored the temporal relationship between immigration and TB in first-generation immigrants between 1995 and 2012 to assess whether immigration can be a predictor for TB in immigrants from high-incidence countries. We obtained monthly data on immigrant TB cases and immigration for the three countries of origin most frequently represented among TB cases in the Netherlands: Morocco, Somalia and Turkey. The best-fitting seasonal autoregressive integrated moving average (SARIMA) model for the immigration time series was used to prewhiten the TB time series. The cross-correlation function (CCF) was then computed on the residual time series to detect time lags between immigration and TB rates. We identified a 17-month lag between Somali immigration and Somali immigrant TB cases, but no time lag for immigrants from Morocco and Turkey. The absence of a lag in the Moroccan and Turkish populations may be attributed to the relatively low TB prevalence in the countries of origin and an increased likelihood of reactivation TB in an ageing immigrant population. Understanding the time lag between Somali immigration and TB disease would benefit from a closer epidemiological analysis of cohorts of Somali cases diagnosed within the first years after entry.
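
    The prewhitening-plus-CCF workflow can be sketched with statsmodels as below; the SARIMA orders, the synthetic series and the 17-month lag planted in them are illustrative, not the study's data or model.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX
from statsmodels.tsa.stattools import ccf

rng = np.random.default_rng(2)
n, lag = 240, 17
# synthetic monthly immigration with a seasonal cycle, zero mean
immigration = 10 * np.sin(2 * np.pi * np.arange(n) / 12) + rng.standard_normal(n)
tb = np.empty(n)
tb[:lag] = rng.standard_normal(lag)
tb[lag:] = 0.05 * immigration[:-lag] + rng.standard_normal(n - lag)

# fit SARIMA to the input (immigration) series ...
res = SARIMAX(immigration, order=(1, 0, 0),
              seasonal_order=(1, 0, 0, 12)).fit(disp=False)
resid_x = res.resid                  # prewhitened immigration
# ... and filter the TB series with the same fitted parameters
resid_y = res.apply(tb).resid

cc = ccf(resid_y, resid_x, adjusted=False)   # corr(tb_{t+k}, imm_t), k >= 0
print("peak cross-correlation at lag", int(np.argmax(cc[:36])), "months")
```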

  6. Multifractal detrending moving-average cross-correlation analysis

    NASA Astrophysics Data System (ADS)

    Jiang, Zhi-Qiang; Zhou, Wei-Xing

    2011-07-01

    There are a number of situations in which several signals are simultaneously recorded in complex systems and exhibit long-term power-law cross correlations. Multifractal detrended cross-correlation analysis (MFDCCA) approaches can be used to quantify such cross correlations, such as the MFDCCA based on detrended fluctuation analysis (MFXDFA). We develop in this work a class of MFDCCA algorithms based on detrending moving-average analysis, called MFXDMA. The performance of the proposed MFXDMA algorithms is compared with the MFXDFA method by extensive numerical experiments on pairs of time series generated from bivariate fractional Brownian motions, two-component autoregressive fractionally integrated moving-average processes, and binomial measures, all of which have theoretical expressions for their multifractal nature. In all cases, the scaling exponents hxy extracted by the MFXDMA and MFXDFA algorithms are very close to the theoretical values. For bivariate fractional Brownian motions, the scaling exponent of the cross correlation is independent of the cross-correlation coefficient between the two time series, and the MFXDFA and centered MFXDMA algorithms have comparable performance, outperforming the forward and backward MFXDMA algorithms. For two-component autoregressive fractionally integrated moving-average processes, we also find that the MFXDFA and centered MFXDMA algorithms have comparable performance, while the forward and backward MFXDMA algorithms perform slightly worse. For binomial measures, the forward MFXDMA algorithm exhibits the best performance, the centered MFXDMA algorithm performs worst, and the backward MFXDMA algorithm outperforms the MFXDFA algorithm when the moment order q<0 and underperforms it when q>0. We apply these algorithms to the return time series of two stock market indexes and to their volatilities. For the returns, the centered MFXDMA algorithm gives the best estimates of hxy(q), since its hxy(2) is closest to 0.5, as expected, and the MFXDFA algorithm has the second-best performance. For the volatilities, the forward and backward MFXDMA algorithms give similar results, while the centered MFXDMA and MFXDFA algorithms fail to extract a plausible multifractal nature.

  7. Coronal Mass Ejection Data Clustering and Visualization of Decision Trees

    NASA Astrophysics Data System (ADS)

    Ma, Ruizhe; Angryk, Rafal A.; Riley, Pete; Filali Boubrahimi, Soukaina

    2018-05-01

    Coronal mass ejections (CMEs) can be categorized as either “magnetic clouds” (MCs) or non-MCs. Features such as a large magnetic field, low plasma beta, and low proton temperature suggest that a CME event is also an MC event; however, so far there is neither a definitive method nor an automatic process to distinguish the two. Human labeling is time-consuming, and results can fluctuate owing to the imprecise definition of such events. In this study, we approach the problem of MC and non-MC distinction from a time series data analysis perspective and show how clustering can shed some light on this problem. Although many algorithms exist for traditional data clustering in Euclidean space, they are not well suited for time series data. Problems such as inadequate distance measures, inaccurate cluster center descriptions, and the lack of intuitive cluster representations need to be addressed for effective time series clustering. Our data analysis in this work is twofold: clustering and visualization. For clustering, we compared the results of the popular hierarchical agglomerative clustering technique to a distance density clustering heuristic we developed previously for time series data. In both cases, dynamic time warping is used as the similarity measure. For classification as well as visualization, we use decision trees to aggregate single-dimensional clustering results to form a multidimensional time series decision tree, with averaged time series presenting each decision. In this study, we achieved modest accuracy and, more importantly, an intuitive interpretation of how different parameters contribute to an MC event.
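
    Dynamic time warping, the similarity measure named above, reduces to a short dynamic program; a plain quadratic-time sketch (real pipelines usually add a window constraint for speed):

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D series."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return D[n, m]

t = np.linspace(0, 2 * np.pi, 100)
print(dtw_distance(np.sin(t), np.sin(t + 0.5)))   # small: same shape, shifted
print(dtw_distance(np.sin(t), np.ones_like(t)))   # larger: different shape
```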

  8. Hidden discriminative features extraction for supervised high-order time series modeling.

    PubMed

    Nguyen, Ngoc Anh Thi; Yang, Hyung-Jeong; Kim, Sunhee

    2016-11-01

    In this paper, an orthogonal-Tucker-decomposition-based extraction of high-order discriminative subspaces from a tensor-based time series data structure, named Tensor Discriminative Feature Extraction (TDFE), is presented. TDFE relies on the employment of category information for the maximization of the between-class scatter and the minimization of the within-class scatter to extract optimal hidden discriminative feature subspaces that are simultaneously spanned by every modality for supervised tensor modeling. In this context, the proposed tensor-decomposition method provides the following benefits: i) it reduces dimensionality while robustly mining the underlying discriminative features; ii) it results in effective, interpretable features that lead to improved classification and visualization; and iii) it reduces the processing time during the training stage and the filtering of the projection by solving the generalized eigenvalue problem at each alternation step. Two real third-order tensor structures of time series datasets (an epilepsy electroencephalogram (EEG) modeled as channel × frequency bin × time frame, and a microarray dataset modeled as gene × sample × time) were used for the evaluation of TDFE. The experimental results corroborate the advantages of the proposed method, with average classification accuracies of 98.26% and 89.63% for the epilepsy and microarray datasets, respectively. These averages represent an improvement over matrix-based algorithms and recent tensor-based discriminant-decomposition approaches, especially considering the small number of samples used in practice.
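
    TDFE itself maximizes a supervised scatter criterion that is not reproduced here; the sketch below shows only the unsupervised orthogonal Tucker (HOSVD) factor extraction it builds on, applied to a synthetic channel × frequency × time tensor. Shapes and ranks are illustrative.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move axis `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    """Truncated higher-order SVD: orthogonal factor per mode plus core."""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])                 # leading left singular vectors
    core = T
    for mode, U in enumerate(factors):           # core = T x_0 U0' x_1 U1' x_2 U2'
        core = np.moveaxis(np.tensordot(U.T, core, axes=(1, mode)), 0, mode)
    return core, factors

rng = np.random.default_rng(3)
X = rng.standard_normal((8, 32, 64))             # channel x freq bin x time frame
core, (Uc, Uf, Ut) = hosvd(X, ranks=(4, 8, 16))
print(core.shape, Uc.shape, Uf.shape, Ut.shape)  # (4, 8, 16) (8, 4) (32, 8) (64, 16)
```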

  9. Light-weight Parallel Python Tools for Earth System Modeling Workflows

    NASA Astrophysics Data System (ADS)

    Mickelson, S. A.; Paul, K.; Xu, H.; Dennis, J.; Brown, D. I.

    2015-12-01

    With the growth in computing power over the last 30 years, earth system modeling codes have become increasingly data-intensive. As an example, it is expected that the data required for the next Intergovernmental Panel on Climate Change (IPCC) Assessment Report (AR6) will increase by more than 10x, to an expected 25PB per climate model. Faced with this daunting challenge, developers of the Community Earth System Model (CESM) have chosen to change the format of their data for long-term storage from time-slice to time-series, in order to reduce the download bandwidth needed for later analysis and post-processing by climate scientists. Hence, efficient tools are required to (1) transform the data from time-slice to time-series format and (2) compute climatology statistics, needed for many diagnostic computations, on the resulting time-series data. To address the first of these challenges, we have developed a parallel Python tool for converting time-slice model output to time-series format. To address the second, we have developed a parallel Python tool to perform fast time-averaging of time-series data. These tools are designed to be lightweight and easy to install, to have very few dependencies, and to be easily inserted into the Earth system modeling workflow with negligible disruption. In this work, we present the motivation, approach, and testing results of these two lightweight parallel Python tools, as well as our plans for future research and development.
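
    The second tool's core operation, time-averaging time-series output into a climatology, reduces to a few lines; below is a serial NumPy stand-in (the actual tool parallelizes across variables/files and reads NetCDF, neither of which is shown here).

```python
import numpy as np

nyears, nlat, nlon = 30, 96, 144
rng = np.random.default_rng(4)
# monthly time-series output: (time, lat, lon) with time = nyears * 12
ts = rng.standard_normal((nyears * 12, nlat, nlon))

# reshape to (year, month, lat, lon) and average over the years
climatology = ts.reshape(nyears, 12, nlat, nlon).mean(axis=0)
print(climatology.shape)   # (12, nlat, nlon): one mean field per calendar month
```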

  10. Nonparametric autocovariance estimation from censored time series by Gaussian imputation.

    PubMed

    Park, Jung Wook; Genton, Marc G; Ghosh, Sujit K

    2009-02-01

    One of the most frequently used methods to model the autocovariance function of a second-order stationary time series is to use the parametric framework of autoregressive and moving average models developed by Box and Jenkins. However, such parametric models, though very flexible, may not always be adequate to model autocovariance functions with sharp changes. Furthermore, if the data do not follow the parametric model and are censored at a certain value, the estimation results may not be reliable. We develop a Gaussian imputation method to estimate an autocovariance structure via nonparametric estimation of the autocovariance function in order to address both censoring and incorrect model specification. We demonstrate the effectiveness of the technique in terms of bias and efficiency with simulations under various rates of censoring and underlying models. We describe its application to a time series of silicon concentrations in the Arctic.
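
    A heavily simplified sketch of the idea follows: impute left-censored values by a truncated-Gaussian conditional mean, then estimate the autocovariance nonparametrically. The paper imputes from the joint (dependent) Gaussian model; the marginal version below, with crude moment estimates and AR(1)-style synthetic data, only illustrates the mechanics.

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(5)
n, c = 500, -0.5                      # n samples, censoring threshold c
x = rng.standard_normal(n)
for t in range(1, n):                 # introduce AR(1)-like dependence
    x[t] = 0.6 * x[t - 1] + 0.8 * x[t]
censored = x < c                      # values below c are only known to be <= c

# crude moment estimates from the uncensored part (the paper does this properly)
mu, sigma = x[~censored].mean(), x[~censored].std()
# E[X | X <= c] for X ~ N(mu, sigma): truncated-normal mean
fill = truncnorm.mean(-np.inf, (c - mu) / sigma, loc=mu, scale=sigma)
xi = np.where(censored, fill, x)

def autocov(z, max_lag=10):
    """Nonparametric autocovariance estimates at lags 0..max_lag."""
    z = z - z.mean()
    return np.array([np.mean(z[:len(z) - k] * z[k:]) for k in range(max_lag + 1)])

print(autocov(xi).round(3))
```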

  11. Predictive time-series modeling using artificial neural networks for Linac beam symmetry: an empirical study.

    PubMed

    Li, Qiongge; Chan, Maria F

    2017-01-01

    Over half of cancer patients receive radiotherapy (RT) as partial or full cancer treatment. Daily quality assurance (QA) of RT in cancer treatment closely monitors the performance of the medical linear accelerator (Linac) and is critical for continuous improvement of patient safety and quality of care. Cumulative longitudinal QA measurements are valuable for understanding the behavior of the Linac and allow physicists to identify trends in the output and take preventive actions. In this study, artificial neural network (ANN) and autoregressive moving average (ARMA) time-series prediction modeling techniques were both applied to 5 years of daily Linac QA data. Verification tests and other evaluations were then performed for all models. Preliminary results showed that ANN time-series predictive modeling has advantages over ARMA techniques for accurate and effective applicability in the dosimetry and QA field.

  12. A hybrid clustering approach for multivariate time series - A case study applied to failure analysis in a gas turbine.

    PubMed

    Fontes, Cristiano Hora; Budman, Hector

    2017-11-01

    A clustering problem involving multivariate time series (MTS) requires the selection of similarity metrics. This paper shows the limitations of the PCA similarity factor (SPCA) as a single metric in nonlinear problems where there are differences in magnitude of the same process variables due to expected changes in operation conditions. A novel method for clustering MTS based on a combination between SPCA and the average-based Euclidean distance (AED) within a fuzzy clustering approach is proposed. Case studies involving either simulated or real industrial data collected from a large scale gas turbine are used to illustrate that the hybrid approach enhances the ability to recognize normal and fault operating patterns. This paper also proposes an oversampling procedure to create synthetic multivariate time series that can be useful in commonly occurring situations involving unbalanced data sets.
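
    The PCA similarity factor used as the first metric can be sketched in a few lines: it is the average squared cosine of the principal angles between the leading principal-component subspaces of two data windows. The window sizes and rank below are assumptions for illustration.

```python
import numpy as np

def spca(X, Y, k=2):
    """PCA similarity factor between data windows X, Y (rows = time, cols = variables)."""
    Ux = np.linalg.svd(X - X.mean(0), full_matrices=False)[2][:k].T
    Uy = np.linalg.svd(Y - Y.mean(0), full_matrices=False)[2][:k].T
    # singular values of Ux' Uy are the cosines of the principal angles
    cosines = np.linalg.svd(Ux.T @ Uy, compute_uv=False)
    return np.sum(cosines ** 2) / k        # 1 = identical subspaces

rng = np.random.default_rng(6)
base = rng.standard_normal((200, 4))
print(spca(base, base + 0.05 * rng.standard_normal((200, 4))))  # near 1
print(spca(base, rng.standard_normal((200, 4))))                # lower
```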

  13. Hazard function theory for nonstationary natural hazards

    DOE PAGES

    Read, Laura K.; Vogel, Richard M.

    2016-04-11

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case, demonstrate how metrics such as reliability and average return period are impacted by nonstationarity, and discuss the implications for planning and design. Our theoretical analysis linking the hazard random variable X with the corresponding failure time series T should have application to a wide class of natural hazards, with opportunities for future extensions.
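
    For the generalized Pareto (GPD) case analyzed here, the hazard function has a simple closed form, h(x) = f(x)/(1 - F(x)) = 1/(sigma + xi*x); a short check with scipy, with illustrative parameters:

```python
import numpy as np
from scipy.stats import genpareto

xi, sigma = 0.2, 10.0           # shape and scale of the GPD exceedance model
x = np.linspace(0, 50, 6)       # magnitudes above the POT threshold

# hazard = pdf / survival function
hazard = genpareto.pdf(x, xi, scale=sigma) / genpareto.sf(x, xi, scale=sigma)
print(hazard)                   # numerically
print(1.0 / (sigma + xi * x))   # closed form, identical
```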

  14. Detecting macroeconomic phases in the Dow Jones Industrial Average time series

    NASA Astrophysics Data System (ADS)

    Wong, Jian Cheng; Lian, Heng; Cheong, Siew Ann

    2009-11-01

    In this paper, we perform statistical segmentation and clustering analysis of the Dow Jones Industrial Average (DJI) time series between January 1997 and August 2008. Modeling the index movements and log-index movements as stationary Gaussian processes, we find a total of 116 and 119 statistically stationary segments respectively. These can then be grouped into between five and seven clusters, each representing a different macroeconomic phase. The macroeconomic phases are distinguished primarily by their volatilities. We find that the US economy, as measured by the DJI, spends most of its time in a low-volatility phase and a high-volatility phase. The former can be roughly associated with economic expansion, while the latter contains the economic contraction phase in the standard economic cycle. Both phases are interrupted by a moderate-volatility market correction phase, but extremely-high-volatility market crashes are found mostly within the high-volatility phase. From the temporal distribution of various phases, we see a high-volatility phase from mid-1998 to mid-2003, and another starting mid-2007 (the current global financial crisis). Transitions from the low-volatility phase to the high-volatility phase are preceded by a series of precursor shocks, whereas the transition from the high-volatility phase to the low-volatility phase is preceded by a series of inverted shocks. The time scale for both types of transitions is about a year. We also identify the July 1997 Asian Financial Crisis to be the trigger for the mid-1998 transition, and an unnamed May 2006 market event related to corrections in the Chinese markets to be the trigger for the mid-2007 transition.

  16. Studies in astronomical time series analysis: Modeling random processes in the time domain

    NASA Technical Reports Server (NTRS)

    Scargle, J. D.

    1979-01-01

    Random process models phrased in the time domain are used to analyze astrophysical time series data produced by random processes. A moving average (MA) model represents the data as a sequence of pulses occurring randomly in time, with random amplitudes. An autoregressive (AR) model represents the correlations in the process in terms of a linear function of past values. The best AR model is determined from sampled data and transformed to an MA for interpretation. The randomness of the pulse amplitudes is maximized by a FORTRAN algorithm which is relatively stable numerically. Results of test cases are given to study the effects of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the optical light curve of the quasar 3C 273 is given.
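
    The AR-fit-then-transform-to-MA step can be sketched with statsmodels; the AR(2) test process and lag choices below are illustrative, not the paper's FORTRAN implementation.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg
from statsmodels.tsa.arima_process import arma2ma

rng = np.random.default_rng(7)
n = 2000
x = np.zeros(n)
for t in range(2, n):                      # synthetic AR(2) test process
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()

fit = AutoReg(x, lags=2, trend="n").fit()
arparams = fit.params                       # estimated AR coefficients

# MA(inf) weights psi_j in x_t = sum_j psi_j * e_{t-j}: the implied pulse shape
psi = arma2ma(np.r_[1, -arparams], np.array([1.0]), lags=10)
print(psi.round(3))
```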

  17. Visibility graph analysis on quarterly macroeconomic series of China based on complex network theory

    NASA Astrophysics Data System (ADS)

    Wang, Na; Li, Dong; Wang, Qiwen

    2012-12-01

    The visibility graph approach and complex network theory provide a new insight into time series analysis. The inheritance of the visibility graph from the original time series is explored further in this paper. We found that degree distributions of visibility graphs extracted from pseudo Brownian motion series obtained by the frequency domain algorithm exhibit exponential behavior, in which the exponential exponent is a binomial function of the Hurst index inherited in the time series. Our simulations showed that the quantitative relations between the Hurst indexes and the exponents of the degree distribution function differ between series, and that the visibility graph inherits some important features of the original time series. Further, we convert several quarterly macroeconomic series of China, including the growth rates of value-added of three industry series and the growth rates of Gross Domestic Product (GDP), to graphs by the visibility algorithm and explore the topological properties of the associated graphs, namely the degree distribution and correlations, the clustering coefficient, the average path length, and community structure. Based on complex network analysis, we find that the degree distributions of the networks associated with the growth rates of value-added of the three industries are almost exponential, while the degree distributions of the networks associated with the GDP growth rates are scale free. We also discuss the assortativity and disassortativity of the four associated networks as they relate to the evolutionary process of the original macroeconomic series. All the constructed networks have “small-world” features. The community structures of the associated networks suggest dynamic changes in the original macroeconomic series. We also detected relationships among government policy changes, community structures of the associated networks, and macroeconomic dynamics, and find that government policies in China strongly influenced the dynamics of GDP and the adjustment of the three industries. The work in this paper provides a new way to understand the dynamics of economic development.
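
    The natural visibility algorithm that converts a series to a graph is compact: two samples are linked if the straight line between them passes above every intermediate sample. An O(n²) sketch on synthetic data (series length and input are illustrative):

```python
import numpy as np

def visibility_graph(x):
    """Adjacency matrix of the natural visibility graph of series x."""
    n = len(x)
    A = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(i + 1, n):
            k = np.arange(i + 1, j)
            # heights of the line from (i, x_i) to (j, x_j) at intermediate times
            line = x[i] + (x[j] - x[i]) * (k - i) / (j - i)
            if np.all(x[k] < line):       # nothing blocks the view
                A[i, j] = A[j, i] = True
    return A

rng = np.random.default_rng(8)
series = rng.random(200)
A = visibility_graph(series)
degrees = A.sum(axis=1)                   # input to the degree distribution
print("mean degree:", degrees.mean())
```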

  18. SIMULATING ATMOSPHERIC EXPOSURE IN A NATIONAL RISK ASSESSMENT USING AN INNOVATIVE METEOROLOGICAL SAMPLING SCHEME

    EPA Science Inventory

    Multimedia risk assessments require the temporal integration of atmospheric concentration and deposition with other media modules. However, providing an extended time series of estimates is computationally expensive. An alternative approach is to substitute long-term average a...

  19. Detrending moving average algorithm for multifractals

    NASA Astrophysics Data System (ADS)

    Gu, Gao-Feng; Zhou, Wei-Xing

    2010-07-01

    The detrending moving average (DMA) algorithm is a widely used technique to quantify the long-term correlations of nonstationary time series and the long-range correlations of fractal surfaces, and contains a parameter θ determining the position of the detrending window. We develop multifractal detrending moving average (MFDMA) algorithms for the analysis of one-dimensional multifractal measures and higher-dimensional multifractals, as a generalization of the DMA method. The performance of the one-dimensional and two-dimensional MFDMA methods is investigated using synthetic multifractal measures with analytical solutions for backward (θ=0), centered (θ=0.5), and forward (θ=1) detrending windows. We find that the estimated multifractal scaling exponent τ(q) and the singularity spectrum f(α) are in good agreement with the theoretical values. In addition, the backward MFDMA method has the best performance, providing the most accurate estimates of the scaling exponents with the lowest error bars, while the centered MFDMA method has the worst performance. The backward MFDMA algorithm is also found to outperform multifractal detrended fluctuation analysis. The one-dimensional backward MFDMA method is applied to the time series of the Shanghai Stock Exchange Composite Index, and its multifractal nature is confirmed.
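
    A sketch of the backward (θ=0) detrending moving average for a single moment order q; MFDMA repeats this across a range of q values to build τ(q). The scales and the white-noise test series (expected exponent near 0.5) are illustrative.

```python
import numpy as np

def dma_fluctuation(x, scales, q=2):
    """Backward DMA fluctuation function F(n) for moment order q."""
    profile = np.cumsum(x - np.mean(x))
    F = []
    for n in scales:
        # trailing (backward) moving average of the profile
        ma = np.convolve(profile, np.ones(n) / n, mode="valid")
        resid = profile[n - 1:] - ma
        F.append(np.mean(np.abs(resid) ** q) ** (1.0 / q))
    return np.array(F)

rng = np.random.default_rng(9)
x = rng.standard_normal(10000)
scales = np.unique(np.logspace(1, 3, 12).astype(int))
F = dma_fluctuation(x, scales)
h = np.polyfit(np.log(scales), np.log(F), 1)[0]   # slope of log F vs log n
print("estimated scaling exponent:", round(h, 2))
```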

  20. The role of global cloud climatologies in validating numerical models

    NASA Technical Reports Server (NTRS)

    HARSHVARDHAN

    1993-01-01

    The purpose of this work is to estimate sampling errors of area-time averaged rain rate due to temporal sampling by satellites. In particular, sampling errors are estimated for the proposed low-inclination-orbit satellite of the Tropical Rainfall Measuring Mission (TRMM) (35 deg inclination and 350 km altitude), one of the sun-synchronous polar orbiting satellites of the NOAA series (98.89 deg inclination and 833 km altitude), and two simultaneous sun-synchronous polar orbiting satellites, all assumed to carry a perfect passive microwave sensor for direct rainfall measurements. The estimate is made by studying the satellite orbits and the autocovariance function of the area-averaged rain rate time series. A model based on an exponential fit of the autocovariance function is used for the actual calculations. Varying visiting intervals and the total coverage of the averaging area on each visit by the satellites are taken into account in the model. The data are generated by a General Circulation Model (GCM) with a diurnal cycle and parameterized convective processes. A special run of the GCM was made at NASA/GSFC in which the rainfall and precipitable water fields were retained globally for every hour of the run for a whole year.

  1. Forecasting air quality time series using deep learning.

    PubMed

    Freeman, Brian S; Taylor, Graham; Gharabaghi, Bahram; Thé, Jesse

    2018-04-13

    This paper presents one of the first applications of deep learning (DL) techniques to predict air pollution time series. Air quality management relies extensively on time series data captured at air monitoring stations as the basis of identifying population exposure to airborne pollutants and determining compliance with local ambient air standards. In this paper, 8-hr averaged surface ozone (O3) concentrations were predicted using deep learning consisting of a recurrent neural network (RNN) with long short-term memory (LSTM). Hourly air quality and meteorological data were used to train and forecast values up to 72 hours ahead with low error rates. The LSTM was able to forecast the duration of continuous O3 exceedances as well. Prior to training the network, the dataset was reviewed for missing data and outliers. Missing data were imputed using a novel technique that averaged gaps of less than eight time steps with incremental steps based on first-order differences of neighboring time periods. Data were then used to train decision trees to evaluate input feature importance over different time prediction horizons. The number of features used to train the LSTM model was reduced from 25 to 5, resulting in improved accuracy as measured by mean absolute error (MAE). Parameter sensitivity analysis identified that the look-back nodes associated with the RNN were a significant source of error if not aligned with the prediction horizon. Overall, MAEs of less than 2 were calculated for predictions out to 72 hours. Novel deep learning techniques were used to train an 8-hour averaged ozone forecast model. Missing data and outliers within the captured dataset were replaced using a new imputation method that generated values closer to the expected value based on the time and season. Decision trees were used to identify the input variables with the greatest importance. The methods presented in this paper allow air managers to forecast long-range air pollution concentrations while monitoring only key parameters and without transforming the dataset in its entirety, thus allowing real-time inputs and continuous prediction.

  2. A multimodel approach to interannual and seasonal prediction of Danube discharge anomalies

    NASA Astrophysics Data System (ADS)

    Rimbu, Norel; Ionita, Monica; Patrut, Simona; Dima, Mihai

    2010-05-01

    Interannual and seasonal predictability of Danube river discharge is investigated using three model types: 1) time series models, 2) linear regression models of discharge on large-scale climate mode indices, and 3) models based on stable teleconnections. All models are calibrated using discharge and climatic data for the period 1901-1977 and validated for the period 1978-2008. Various time series models, such as autoregressive (AR), moving average (MA), autoregressive moving average (ARMA), and singular spectrum analysis combined with ARMA (SSA+ARMA) models, have been calibrated and their skill evaluated. The best results were obtained using SSA+ARMA models, which have also proved to have the highest forecast skill for other European rivers (Gamiz-Fortis et al. 2008). Multiple linear regression models using large-scale climatic mode indices as predictors have higher forecast skill than the time series models. The best predictors for Danube discharge are the North Atlantic Oscillation (NAO) and the East Atlantic/Western Russia patterns during winter and spring. Other patterns, like Polar/Eurasian or Tropical Northern Hemisphere (TNH), are good predictors for summer and autumn discharge. Based on the stable teleconnection approach (Ionita et al. 2008), we construct prediction models from a combination of sea surface temperature (SST), temperature (T) and precipitation (PP) in the regions where discharge and SST, T and PP variations are stably correlated. The forecast skill of these models is higher than that of the time series and multiple regression models. The models calibrated and validated in our study can be used for operational prediction of interannual and seasonal Danube discharge anomalies. References: Gamiz-Fortis, S., D. Pozo-Vazquez, R.M. Trigo, and Y. Castro-Diez, Quantifying the predictability of winter river flow in Iberia. Part I: interannual predictability. J. Climate, 2484-2501, 2008. Gamiz-Fortis, S., D. Pozo-Vazquez, R.M. Trigo, and Y. Castro-Diez, Quantifying the predictability of winter river flow in Iberia. Part II: seasonal predictability. J. Climate, 2503-2518, 2008. Ionita, M., G. Lohmann, and N. Rimbu, Prediction of spring Elbe river discharge based on stable teleconnections with global temperature and precipitation. J. Climate, 6215-6226, 2008.

  3. A study of intensity, fatigue and precision in two specific interval trainings in young tennis players: high-intensity interval training versus intermittent interval training

    PubMed Central

    Suárez Rodríguez, David; del Valle Soto, Miguel

    2017-01-01

    Background The aim of this study is to find the differences between two specific interval exercises. We begin with the hypothesis that the use of microintervals of work and rest allow for greater intensity of play and a reduction in fatigue. Methods Thirteen competition-level male tennis players took part in two interval training exercises comprising nine 2 min series, which consisted of hitting the ball with cross-court forehand and backhand shots, behind the service box. One was a high-intensity interval training (HIIT), made up of periods of continuous work lasting 2 min, and the other was intermittent interval training (IIT), this time with intermittent 2 min intervals, alternating periods of work with rest periods. Average heart rate (HR) and lactate levels were registered in order to observe the physiological intensity of the two exercises, along with the Borg Scale results for perceived exertion and the number of shots and errors in order to determine the intensity achieved and the degree of fatigue throughout the exercise. Results There were no significant differences in the average heart rate, lactate or the Borg Scale. Significant differences were registered, on the other hand, with a greater number of shots in the first two HIIT series (series 1 p>0.009; series 2 p>0.056), but not in the third. The number of errors was significantly lower in all the IIT series (series 1 p<0.035; series 2 p<0.010; series 3 p<0.001). Conclusion Our study suggests that high-intensity intermittent training allows for greater intensity of play in relation to the real time spent on the exercise, reduced fatigue levels and the maintaining of greater precision in specific tennis-related exercises. PMID:29021912

  4. Time-series analysis in imatinib-resistant chronic myeloid leukemia K562-cells under different drug treatments.

    PubMed

    Zhao, Yan-Hong; Zhang, Xue-Fang; Zhao, Yan-Qiu; Bai, Fan; Qin, Fan; Sun, Jing; Dong, Ying

    2017-08-01

    Chronic myeloid leukemia (CML) is characterized by the accumulation of active BCR-ABL protein. Imatinib is the first-line treatment of CML; however, many patients are resistant to this drug. In this study, we aimed to compare the differences in expression patterns and functions of time-series genes in imatinib-resistant CML cells under different drug treatments. GSE24946 was downloaded from the GEO database, which included 17 samples of K562-r cells with (n=12) or without drug administration (n=5). Three drug treatment groups were considered for this study: arsenic trioxide (ATO), AMN107, and ATO+AMN107. Each group had one sample at each time point (3, 12, 24, and 48 h). Time-series genes with a ratio of standard deviation/average (coefficient of variation) >0.15 were screened, and their expression patterns were revealed based on Short Time-series Expression Miner (STEM). Then, the functional enrichment analysis of time-series genes in each group was performed using DAVID, and the genes enriched in the top ten functional categories were extracted to detect their expression patterns. Different time-series genes were identified in the three groups, and most of them were enriched in the ribosome and oxidative phosphorylation pathways. Time-series genes in the three treatment groups had different expression patterns and functions. Time-series genes in the ATO group (e.g. CCNA2 and DAB2) were significantly associated with cell adhesion, those in the AMN107 group were related to cellular carbohydrate metabolic process, while those in the ATO+AMN107 group (e.g. AP2M1) were significantly related to cell proliferation and antigen processing. In imatinib-resistant CML cells, ATO could influence genes related to cell adhesion, AMN107 might affect genes involved in cellular carbohydrate metabolism, and the combination therapy might regulate genes involved in cell proliferation.

  5. Modeling and projection of dengue fever cases in Guangzhou based on variation of weather factors.

    PubMed

    Li, Chenlu; Wang, Xiaofeng; Wu, Xiaoxu; Liu, Jianing; Ji, Duoying; Du, Juan

    2017-12-15

    Dengue fever is one of the most serious vector-borne infectious diseases, especially in Guangzhou, China. Dengue viruses and their vector Aedes albopictus are sensitive to climate change, primarily through weather factors. Previous research has mainly focused on identifying the relationship between climate factors and dengue cases, or on developing dengue case models with some non-climate factors. However, there has been little research addressing the modeling and projection of dengue cases solely from the perspective of climate change. This study considered this topic using long time series data (1998-2014). First, sensitive weather factors were identified through a meta-analysis that included literature review screening, lagged analysis, and collinearity analysis. The key factors determined were monthly average temperature at a lag of two months, and monthly average relative humidity and monthly average precipitation at lags of three months. Second, time series Poisson analysis was used with the generalized additive model approach to develop a dengue model based on the key weather factors for January 1998 to December 2012. Data from January 2013 to July 2014 were used to validate that the model was reliable and reasonable. Finally, future weather data (January 2020 to December 2070) were input into the model to project the occurrence of dengue cases under different climate scenarios (RCP 2.6 and RCP 8.5). Longer time series analysis and scientifically selected weather variables were used to develop the model to ensure reliability. The projections suggested that seasonal disease control (especially in summer and fall) and mitigation of greenhouse gas emissions could help reduce the incidence of dengue fever. The results of this study are intended to provide a theoretical basis for the prevention and control of dengue fever in Guangzhou.

  6. Stochastic model stationarization by eliminating the periodic term and its effect on time series prediction

    NASA Astrophysics Data System (ADS)

    Moeeni, Hamid; Bonakdari, Hossein; Fatemi, Seyed Ehsan

    2017-04-01

    Because time series stationarization has a key role in stochastic modeling results, three methods for eliminating the effect of the periodic term on time series stationarity are analyzed in this study: seasonal differencing, seasonal standardization and spectral analysis. First, six time series, including 4 streamflow series and 2 water temperature series, are stationarized. The stochastic term for these series is subsequently modeled with ARIMA. For the analysis, 9228 models are considered. It is observed that seasonal standardization and spectral analysis eliminate the periodic term completely, while seasonal differencing maintains seasonal correlation structures. The results indicate that all three methods perform acceptably overall. However, model accuracy in monthly streamflow prediction is higher with seasonal differencing than with the other two methods. Another advantage of seasonal differencing over the other methods is that the monthly streamflow is never estimated as negative. Standardization is the best method for predicting monthly water temperature, although it is quite similar to seasonal differencing, while spectral analysis performs the weakest in all cases. It is concluded that for each monthly seasonal series, seasonal differencing is the best stationarization method in terms of periodic-effect elimination. Moreover, the monthly water temperature is predicted with more accuracy than the monthly streamflow: the criterion of the average stochastic term divided by the amplitude of the periodic term obtained for monthly streamflow and monthly water temperature was 0.19 and 0.30, 0.21 and 0.13, and 0.07 and 0.04, respectively.
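
    Seasonal differencing and seasonal standardization, two of the three methods compared, are each a one-liner on a monthly series; a sketch with a synthetic streamflow series (the data and period-12 seasonality are illustrative):

```python
import numpy as np

rng = np.random.default_rng(10)
nyears = 20
months = np.tile(np.arange(12), nyears)
flow = 50 + 30 * np.sin(2 * np.pi * months / 12) \
       + 5 * rng.standard_normal(12 * nyears)    # synthetic monthly streamflow

# seasonal differencing: z_t = x_t - x_{t-12}
diffed = flow[12:] - flow[:-12]

# seasonal standardization: z_t = (x_t - mu_month) / sigma_month
mu = np.array([flow[months == m].mean() for m in range(12)])
sd = np.array([flow[months == m].std() for m in range(12)])
standardized = (flow - mu[months]) / sd[months]

print(diffed.std().round(2), standardized.std().round(2))
```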

  7. A Source Term for Wave Attenuation by Sea Ice in WAVEWATCH III (registered trademark): IC4

    DTIC Science & Technology

    2017-06-07

    [Figure captions: diamonds indicate active, moored AWACs; circle indicates location of R/V Sikuliaq; thick magenta and white lines indicate the path of R/V Sikuliaq (past and future ship position, respectively). Figure 10: time series of ...]

  8. Cascading Oscillators in Decoding Speech: Reflection of a Cortical Computation Principle

    DTIC Science & Technology

    2016-09-06

    Combining an experimental paradigm based on Ghitza and Greenberg (2009) for speech with the approach of Farbood et al. (2013) to timing in key ... Fuglsang, 2015). A model was developed which uses modulation spectrograms to construct an oscillating time series synchronized with the slowly varying ...

  9. Characteristic mega-basin water storage behavior using GRACE.

    PubMed

    Reager, J T; Famiglietti, James S

    2013-06-01

    A long-standing challenge for hydrologists has been a lack of observational data on global-scale basin hydrological behavior. With observations from NASA's Gravity Recovery and Climate Experiment (GRACE) mission, hydrologists are now able to study terrestrial water storage for large river basins (>200,000 km²) with monthly time resolution. Here we provide results of a time series model of basin-averaged GRACE terrestrial water storage anomaly and Global Precipitation Climatology Project precipitation for the world's largest basins. We address the short (10 year) length of the GRACE record by adopting a parametric spectral method to calculate frequency-domain transfer functions of storage response to precipitation forcing, and then generalize these transfer functions based on large-scale basin characteristics, such as percent forest cover and basin temperature. Among the parameters tested, results show that temperature, soil water-holding capacity, and percent forest cover are important controls on relative storage variability, while basin area and mean terrain slope are less important. The derived empirical relationships were accurate (0.54 ≤ Ef ≤ 0.84) in modeling global-scale water storage anomaly time series for the study basins using only precipitation, average basin temperature, and two land-surface variables, offering the potential for synthesis of basin storage time series beyond the GRACE observational period. Such an approach could be applied toward gap filling between current and future GRACE missions and for predicting basin storage given predictions of future precipitation.
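
    One standard non-parametric way to estimate such a storage-response transfer function is the ratio of cross- to auto-spectrum, H(f) = S_xy(f) / S_xx(f); the paper's parametric spectral method differs, so the Welch-based sketch below, with a synthetic low-pass forcing-response pair, is only illustrative.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(11)
n = 2048
precip = rng.standard_normal(n)
# synthetic "storage" that integrates the forcing with some memory plus noise
storage = signal.lfilter([0.5], [1, -0.9], precip) + 0.1 * rng.standard_normal(n)

f, Sxx = signal.welch(precip, nperseg=256)        # auto-spectrum of the forcing
_, Sxy = signal.csd(precip, storage, nperseg=256) # cross-spectrum
H = Sxy / Sxx                                     # empirical transfer function
print(np.abs(H[:5]).round(2))   # gain is largest at low frequencies, as expected
```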

  11. Real Time Search Algorithm for Observation Outliers During Monitoring Engineering Constructions

    NASA Astrophysics Data System (ADS)

    Latos, Dorota; Kolanowski, Bogdan; Pachelski, Wojciech; Sołoducha, Ryszard

    2017-12-01

    Real-time monitoring of engineering structures in case of an emergency or disaster requires the collection of a large amount of data to be processed by specific analytical techniques. A quick and accurate assessment of the state of the object is crucial for a probable rescue action. One of the more significant methods for evaluating large sets of data, whether collected over a specified interval of time or permanently, is time series analysis. This paper presents a search algorithm for those time series elements which deviate from their values expected during monitoring. Quick and proper detection of observations indicating anomalous behavior of the structure allows a variety of preventive actions to be taken. In the algorithm, the mathematical formulae used provide maximal sensitivity to detect even minimal changes in the object's behavior. The sensitivity analyses were conducted for the moving average algorithm as well as for the Douglas-Peucker algorithm used in the generalization of linear objects in GIS. In addition to determining the size of deviations from the average, the so-called Hausdorff distance was used. Simulations and verification on laboratory survey data showed that the approach provides sufficient sensitivity for automatic real-time analysis of large amounts of data obtained from different and various sensors (total stations, levelling, cameras, radar).
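
    A minimal sketch of the moving-average screening idea: flag observations that deviate from a local running mean by more than a robust threshold. The window length and threshold factor are assumptions, and the Hausdorff-distance step is not reproduced here.

```python
import numpy as np

def flag_outliers(x, window=15, k=4.0):
    """Boolean mask of points deviating from the local running mean."""
    pad = window // 2
    xp = np.pad(x, pad, mode="edge")
    ma = np.convolve(xp, np.ones(window) / window, mode="valid")
    resid = x - ma[:len(x)]
    # robust sigma from the median absolute deviation
    thresh = k * np.median(np.abs(resid)) / 0.6745
    return np.abs(resid) > thresh

rng = np.random.default_rng(12)
obs = np.sin(np.linspace(0, 6, 500)) + 0.01 * rng.standard_normal(500)
obs[123] += 0.2                                # a displaced reading
print(np.flatnonzero(flag_outliers(obs)))      # should include index 123
```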

  12. Calculation of power spectrums from digital time series with missing data points

    NASA Technical Reports Server (NTRS)

    Murray, C. W., Jr.

    1980-01-01

    Two algorithms are developed for calculating power spectrums from the autocorrelation function when there are missing data points in the time series. Both methods use an average sampling interval to compute lagged products. One method, the correlation function power spectrum, takes the discrete Fourier transform of the lagged products directly to obtain the spectrum, while the other, the modified Blackman-Tukey power spectrum, takes the Fourier transform of the mean lagged products. Both techniques require fewer calculations than other procedures since only 50% to 80% of the maximum lags need be calculated. The algorithms are compared with the Fourier transform power spectrum and two least squares procedures (all for an arbitrary data spacing). Examples are given showing recovery of frequency components from simulated periodic data where portions of the time series are missing and random noise has been added to both the time points and to values of the function. In addition the methods are compared using real data. All procedures performed equally well in detecting periodicities in the data.
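
    A sketch of the modified Blackman-Tukey idea: estimate each lag's autocovariance from the mean of the available lagged products (skipping missing points), apply a lag window, and Fourier-transform the result. The Hann lag window and the random gap pattern are assumptions; the paper's average-sampling-interval handling of irregular spacing is not reproduced.

```python
import numpy as np

def bt_spectrum(x, max_lag):
    """Blackman-Tukey-style spectrum from a series with NaN-marked gaps."""
    ok = ~np.isnan(x)
    xd = np.where(ok, x - np.nanmean(x), 0.0)   # zeros contribute nothing below
    acov = np.empty(max_lag + 1)
    for k in range(max_lag + 1):
        pairs = ok[:-k or None] & ok[k:]        # lags where both points exist
        acov[k] = np.sum(xd[:-k or None] * xd[k:]) / pairs.sum()
    w = 0.5 * (1 + np.cos(np.pi * np.arange(max_lag + 1) / max_lag))  # Hann lag window
    seq = np.r_[acov * w, (acov * w)[-2:0:-1]]  # symmetric (even) extension
    return np.real(np.fft.rfft(seq))

rng = np.random.default_rng(13)
t = np.arange(1000, dtype=float)
x = np.sin(2 * np.pi * t / 50) + 0.5 * rng.standard_normal(1000)
x[rng.random(1000) < 0.2] = np.nan              # 20% of points missing
spec = bt_spectrum(x, max_lag=100)
print("peak frequency index:", int(np.argmax(spec)))  # 4, i.e. period 200/4 = 50
```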

  13. Lead-lag cross-sectional structure and detection of correlated anticorrelated regime shifts: Application to the volatilities of inflation and economic growth rates

    NASA Astrophysics Data System (ADS)

    Zhou, Wei-Xing; Sornette, Didier

    2007-07-01

    We have recently introduced the “thermal optimal path” (TOP) method to investigate the real-time lead-lag structure between two time series. The TOP method consists in searching for a robust noise-averaged optimal path of the distance matrix along which the two time series have the greatest similarity. Here, we generalize the TOP method by introducing a more general definition of distance which takes into account possible regime shifts between positive and negative correlations. This generalization to track possible changes of correlation signs is able to identify possible transitions from one convention (or consensus) to another. Numerical simulations on synthetic time series verify that the new TOP method performs as expected even in the presence of substantial noise. We then apply it to investigate changes of convention in the dependence structure between the historical volatilities of the USA inflation rate and economic growth rate. Several measures show that the new TOP method significantly outperforms standard cross-correlation methods.

  14. The nature of turbulence in a triangular lattice gas automaton

    NASA Astrophysics Data System (ADS)

    Duong-Van, Minh; Feit, M. D.; Keller, P.; Pound, M.

    1986-12-01

    Power spectra calculated from the coarse-graining of a simple lattice gas automaton, and those of other time-averaged stochastic time series that we have investigated, have exponents in the range -1.6 to -2, consistent with observations of fully developed turbulence. This power spectrum is a natural consequence of coarse-graining; the exponent -2 represents the continuum limit.

  15. A better understanding of long-range temporal dependence of traffic flow time series

    NASA Astrophysics Data System (ADS)

    Feng, Shuo; Wang, Xingmin; Sun, Haowei; Zhang, Yi; Li, Li

    2018-02-01

    Long-range temporal dependence is an important research perspective for the modelling of traffic flow time series. Various methods have been proposed to characterize it, including autocorrelation function analysis, spectral analysis and fractal analysis. However, few studies have examined the daily temporal dependence (i.e. the similarity between different daily traffic flow time series), which can help us better understand the long-range temporal dependence, such as the origin of the crossover phenomenon. Moreover, considering both types of dependence contributes to establishing more accurate models and describing the properties of traffic flow time series. In this paper, we study the properties of the daily temporal dependence by a simple averaging method and a Principal Component Analysis (PCA) based method. We also study the long-range temporal dependence by Detrended Fluctuation Analysis (DFA) and Multifractal Detrended Fluctuation Analysis (MFDFA). The results show that both the daily and the long-range temporal dependence exert considerable influence on the traffic flow series. The DFA results reveal that the daily temporal dependence creates the crossover phenomenon when estimating the Hurst exponent which characterizes the long-range temporal dependence. Furthermore, comparison of the DFA tests shows that the PCA-based method is the better way to extract the daily temporal dependence, especially when the difference between days is significant.
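
    Detrended fluctuation analysis, the tool used above to expose the crossover, integrates the series, removes a linear fit in windows of each scale, and reads the Hurst exponent from F(s) ~ s^H; a compact sketch on white noise (so H should come out near 0.5):

```python
import numpy as np

def dfa(x, scales):
    """DFA fluctuation function F(s) with linear detrending per window."""
    profile = np.cumsum(x - np.mean(x))
    F = []
    for s in scales:
        nwin = len(profile) // s
        segs = profile[:nwin * s].reshape(nwin, s)
        t = np.arange(s)
        coeffs = np.polyfit(t, segs.T, 1)             # linear fit per window
        trend = np.outer(coeffs[0], t) + coeffs[1][:, None]
        F.append(np.sqrt(np.mean((segs - trend) ** 2)))
    return np.array(F)

rng = np.random.default_rng(14)
x = rng.standard_normal(20000)
scales = np.unique(np.logspace(1, 3, 10).astype(int))
F = dfa(x, scales)
H = np.polyfit(np.log(scales), np.log(F), 1)[0]
print("Hurst exponent:", round(H, 2))
# A crossover would appear as a kink in log F vs log s, e.g. when a daily
# pattern dominates fluctuations up to some scale.
```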

  16. Time series inversion of spectra from ground-based radiometers

    NASA Astrophysics Data System (ADS)

    Christensen, O. M.; Eriksson, P.

    2013-02-01

    Retrieving time series of atmospheric constituents from ground-based spectrometers often requires different temporal averaging depending on the altitude region in focus. This can lead to several datasets existing for one instrument, which complicates validation and comparisons between instruments. This paper puts forth a possible solution by incorporating the temporal domain into the maximum a posteriori (MAP) retrieval algorithm. The state vector is enlarged to include measurements spanning a time period, and the temporal correlations between the true atmospheric states are explicitly specified in the a priori covariance matrix. This allows the MAP method to effectively select the best temporal smoothing for each altitude, removing the need for several datasets to cover different altitudes. The method is compared to traditional averaging of spectra using a simulated retrieval of water vapour in the mesosphere. The simulations show that the method offers a significant advantage compared to the traditional method, extending the sensitivity an additional 10 km upwards without reducing the temporal resolution at lower altitudes. The method is also tested on the OSO water vapour microwave radiometer, confirming the advantages found in the simulation. Additionally, it is shown how the method can interpolate data in time and provide diagnostic values to evaluate the interpolated data.
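
    A toy linear version, with assumed matrices rather than the paper's forward model, shows the mechanics: stack the states of several measurement times, give the a priori covariance an exponential temporal correlation, and solve the MAP normal equations.

      import numpy as np

      # Toy linear MAP retrieval: the state vector stacks one profile per
      # measurement time, and S_a carries exponential temporal (and altitude)
      # correlations, letting the retrieval smooth over time where individual
      # spectra carry little information. All matrices here are assumptions.
      rng = np.random.default_rng(3)
      n_alt, n_time = 5, 6
      n = n_alt * n_time
      K = np.eye(n) + 0.2 * rng.standard_normal((n, n))   # toy forward model
      x_true = rng.standard_normal(n)
      y = K @ x_true + 0.5 * rng.standard_normal(n)

      alt, t = np.arange(n_alt), np.arange(n_time)
      C_alt = np.exp(-np.abs(alt[:, None] - alt[None, :]))        # altitude corr.
      C_t = np.exp(-np.abs(t[:, None] - t[None, :]) / 2.0)        # temporal corr.
      S_a = np.kron(C_t, C_alt)                                   # a priori cov.
      S_e_inv = np.eye(n) / 0.5**2                                # meas. precision

      x_a = np.zeros(n)
      lhs = K.T @ S_e_inv @ K + np.linalg.inv(S_a)
      x_hat = x_a + np.linalg.solve(lhs, K.T @ S_e_inv @ (y - K @ x_a))
      print(np.round(x_hat[:n_alt], 2))                           # first profile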

  17. Increasing trend in the average temperature in Finland, 1847-2012

    NASA Astrophysics Data System (ADS)

    Mikkonen, Santtu; Laine, Marko; Mäkelä, Hanna M.; Gregow, Hilppa; Tuomenvirta, Heikki; Lahtinen, Matti; Laaksonen, Ari

    2014-05-01

    The global average temperature has increased by about 0.8 °C since the mid-19th century. It has been shown that this increase is statistically significant and that it can, for the most part, be attributed to human-induced climate change (IPCC 2007). A temperature increase is obvious also in regional and local temperatures in many parts of the world. However, compared with the global average temperature, the regional and local temperatures exhibit higher levels of noise, which is largely removed from the global temperature by the higher level of averaging. Because Finland is located in northern latitudes, it is subject to the polar amplification of climate change-induced warming, which is due to the enhanced melting of snow and ice and other feedback mechanisms. Therefore, warming in Finland is expected to be approximately 50% higher than the global average. Conversely, the location of Finland between the Atlantic Ocean and continental Eurasia makes the weather very variable, and thus the temperature signal is rather noisy. The change in mean temperature in Finland was investigated with Dynamic Linear Models (DLM) in order to determine the sign and the magnitude of the trend in the temperature time series within the last 165 years. The data consisted of gridded monthly mean temperatures. The grid has a 10 km spatial resolution, and it was created by interpolating a homogenized temperature series measured at Finnish weather stations. Seasonal variation in temperature and the autocorrelation structure of the time series were taken into account in the DLM models. We found that the Finnish temperature time series exhibits a statistically significant increasing trend, which is consistent with human-induced global warming. The mean temperature has risen by clearly over 2 °C in the years 1847-2012, which amounts to 0.16 °C/decade. The warming rate before the 1940s was close to the linear trend for the whole period, whereas the temperature change in the mid-20th century was negligible. However, the warming after the late 1960s has been remarkably fast. The model indicates that within the last 40 years the rate of change has been as high as 0.30 °C/decade. The increase in temperature has been highest in spring and in late autumn, but the change in the summer months has been less evident. The observed warming is somewhat higher than the global trend, which confirms the assumption that warming is stronger at higher latitudes.
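
    As an illustrative stand-in for the paper's DLM (the actual model and gridded data are not reproduced here), a stochastic local linear trend plus a seasonal component can be fitted with statsmodels' structural time-series class; the smoothed slope state then reads out the warming rate.

      import numpy as np
      import statsmodels.api as sm

      # Synthetic monthly temperatures: ~0.16 degC/decade trend plus a
      # seasonal cycle and noise (illustrative, not the Finnish grid data).
      rng = np.random.default_rng(4)
      n = 12 * 60
      y = (0.016 / 12) * np.arange(n) \
          + 8 * np.sin(2 * np.pi * np.arange(n) / 12) \
          + rng.standard_normal(n)

      model = sm.tsa.UnobservedComponents(y, level='local linear trend',
                                          seasonal=12)
      res = model.fit(disp=False)
      print(res.trend.smoothed[-1] * 12 * 10)   # recovered slope, degC/decade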

  18. Comparison of missing value imputation methods in time series: the case of Turkish meteorological data

    NASA Astrophysics Data System (ADS)

    Yozgatligil, Ceylan; Aslan, Sipan; Iyigun, Cem; Batmaz, Inci

    2013-04-01

    This study aims to compare several imputation methods for completing the missing values of spatio-temporal meteorological time series. To this end, six imputation methods are assessed with respect to various criteria, including accuracy, robustness, precision, and efficiency, for artificially created missing data in monthly total precipitation and mean temperature series obtained from the Turkish State Meteorological Service. Of these methods, simple arithmetic average, normal ratio (NR), and NR weighted with correlations are the simple ones, whereas the multilayer perceptron type neural network and the multiple imputation strategy based on expectation-maximization with Markov chain Monte Carlo (EM-MCMC) are computationally intensive ones. In addition, we propose a modification of the EM-MCMC method. Besides using a conventional accuracy measure based on squared errors, we also suggest the correlation dimension (CD) technique of nonlinear dynamic time series analysis, which takes spatio-temporal dependencies into account, for evaluating imputation performances. Detailed graphical and quantitative analysis indicates that although the computational methods, particularly the EM-MCMC method, are computationally inefficient, they are favorable for the imputation of meteorological time series across the different missingness periods, with respect to both measures and both series studied. To conclude, using the EM-MCMC algorithm to impute missing values before conducting any statistical analyses of meteorological data will decrease the amount of uncertainty and give more robust results. Moreover, the CD measure can be recommended for the performance evaluation of missing data imputation, particularly with computational methods, since it gives more precise results in meteorological time series.
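
    Of the simple methods, the normal ratio estimator is compact enough to sketch (generic formula, hypothetical values): neighbour observations are scaled by the ratio of long-term means and averaged.

      import numpy as np

      def normal_ratio_impute(target_mean, neighbor_means, neighbor_obs):
          """Normal ratio (NR) estimate of a missing observation: the average
          of neighbour observations scaled by the ratio of long-term means."""
          ratios = target_mean / np.asarray(neighbor_means)
          return np.mean(ratios * np.asarray(neighbor_obs))

      # Hypothetical monthly precipitation (mm) at three neighbour stations
      print(normal_ratio_impute(55.0, [60.0, 48.0, 70.0], [62.0, 50.0, 65.0]))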

  19. Iterative Procedures for Exact Maximum Likelihood Estimation in the First-Order Gaussian Moving Average Model

    DTIC Science & Technology

    1990-11-01

    (Fragmentary OCR excerpt.) The garbled identity is the Sherman-Morrison special case of what the report, citing Phadke and others, calls Woodbury's formula: (Q + aa')⁻¹ = Q⁻¹ - Q⁻¹aa'Q⁻¹ / (1 + a'Q⁻¹a). Surviving table-of-contents fragments read "2. The First-Order Moving Average Model" and "3. Some Approaches to the Iterative...". The text concerns iterative evaluation of the approximate likelihood function in some time series models; useful suggestions have included the Cholesky decomposition of the covariance matrix.

  20. Zernike Phase Contrast Electron Cryo-Tomography Applied to Marine Cyanobacteria Infected with Cyanophages

    PubMed Central

    Dai, Wei; Fu, Caroline; Khant, Htet A.; Ludtke, Steven J.; Schmid, Michael F.; Chiu, Wah

    2015-01-01

    Advances in electron cryo-tomography have provided a new opportunity to visualize the internal 3D structures of a bacterium. An electron microscope equipped with Zernike phase contrast optics produces images with dramatically increased contrast compared to images obtained by conventional electron microscopy. Here we describe a protocol to apply Zernike phase plate technology for acquiring electron tomographic tilt series of cyanophage-infected cyanobacterial cells embedded in ice, without staining or chemical fixation. We detail the procedures for aligning and assessing phase plates for data collection, and methods to obtain 3D structures of cyanophage assembly intermediates in the host, by subtomogram alignment, classification and averaging. Acquiring three to four tomographic tilt series takes approximately 12 h on a JEM2200FS electron microscope. We expect this time requirement to decrease substantially as the technique matures. Time required for annotation and subtomogram averaging varies widely depending on the project goals and data volume. PMID:25321408

  1. Seasonal flux and assemblage composition of planktic foraminifera from the northern Gulf of Mexico, 2008-2012

    USGS Publications Warehouse

    Reynolds, Caitlin E.; Richey, Julie N.; Poore, Richard Z.

    2013-01-01

    The U.S. Geological Survey anchored a sediment trap in the northern Gulf of Mexico beginning in 2008 to collect seasonal time-series data on the flux and assemblage composition of live planktic foraminifers. This report provides an update of the previous time-series data to include results from 2012. Ten species, or varieties, constituted ~92 percent of the 2012 assemblage: Globigerinoides ruber (pink and white varieties), Globigerinoides sacculifer, Globigerina calida, Globigerinella aequilateralis, Globorotalia menardii group [The Gt. menardii group includes Gt. menardii, Gt. tumida, and Gt. ungulata], Orbulina universa, Globorotalia truncatulinoides, Pulleniatina spp., and Neogloboquadrina dutertrei. The mean daily flux was 158 tests per square meter per day (m–2 day–1), with maximum fluxes of >450 tests m–2 day–1 during the beginning of July and mid-August and minimum fluxes of <… tests m–2 day–1 during the beginning of February and mid-July. Globorotalia truncatulinoides showed a clear preference for the winter, consistent with data from 2008 to 2011. Globigerinoides ruber (white) flux data for 2012 (average 23 tests m–2 day–1) were consistent with data from 2011 (average 30 tests m–2 day–1) and 2010 (average 29 tests m–2 day–1) and showed a steady threefold increase since 2009 (average 11 tests m–2 day–1) and a tenfold increase from the 2008 flux (3 tests m–2 day–1).

  2. Bootstrap-after-bootstrap model averaging for reducing model uncertainty in model selection for air pollution mortality studies.

    PubMed

    Roberts, Steven; Martin, Michael A

    2010-01-01

    Concerns have been raised about findings of associations between particulate matter (PM) air pollution and mortality that are based on a single "best" model arising from a model selection procedure, because such a strategy may ignore the model uncertainty inherent in searching through a set of candidate models to find the best model. Model averaging has been proposed as a method of allowing for model uncertainty in this context. Here, an extension (double BOOT) to a previously described bootstrap model-averaging procedure (BOOT) is proposed for use in time series studies of the association between PM and mortality. We compared double BOOT and BOOT with Bayesian model averaging (BMA) and a standard method of model selection [standard Akaike's information criterion (AIC)]. Actual time series data from the United States were used to conduct a simulation study comparing and contrasting the performance of double BOOT, BOOT, BMA, and standard AIC. Double BOOT produced estimates of the effect of PM on mortality with smaller root mean squared error than those produced by BOOT, BMA, and standard AIC. This performance boost resulted from the double BOOT estimates having smaller variance than those produced by BOOT and BMA. Double BOOT is a viable alternative to BOOT and BMA for producing estimates of the mortality effect of PM.
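
    The single-bootstrap idea underlying BOOT can be sketched generically: resample, let AIC pick the model in each replicate, and average the coefficient of interest across replicates. The sketch below uses synthetic data and ignores serial dependence (and the double BOOT extension), so it illustrates the concept rather than the authors' procedure.

      import numpy as np

      def ols_aic(X, y):
          """OLS fit plus an AIC value (up to an additive constant)."""
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          resid = y - X @ beta
          n, k = X.shape
          return beta, n * np.log(resid @ resid / n) + 2 * k

      rng = np.random.default_rng(5)
      n = 500
      pm = rng.gamma(2, 10, n)
      temp = rng.normal(20, 5, n)
      y = 0.02 * pm + 0.1 * temp + rng.standard_normal(n)  # synthetic outcome

      # Candidate models: with and without a temperature confounder
      candidates = [np.column_stack([np.ones(n), pm]),
                    np.column_stack([np.ones(n), pm, temp])]
      draws = []
      for _ in range(200):
          idx = rng.integers(0, n, n)                      # bootstrap resample
          fits = [ols_aic(X[idx], y[idx]) for X in candidates]
          beta, _ = min(fits, key=lambda f: f[1])          # AIC-best model
          draws.append(beta[1])                            # PM coefficient
      print(np.mean(draws), np.std(draws))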

  3. Ensemble Deep Learning for Biomedical Time Series Classification

    PubMed Central

    2016-01-01

    Ensemble learning has been proven, in both theory and practice, to improve generalization ability effectively. In this paper, we first briefly outline the current status of research on it. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database, which contains a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost. PMID:27725828
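
    The Simple Average combiner at the end of that list is the easiest piece to make concrete; in this generic sketch (stand-in random outputs, not the paper's networks), per-class probabilities from all ensemble members are averaged before taking the argmax.

      import numpy as np

      rng = np.random.default_rng(14)
      n_members, n_samples, n_classes = 5, 4, 3
      # Stand-in member outputs: per-sample class probability vectors
      probs = rng.dirichlet(np.ones(n_classes), size=(n_members, n_samples))
      ensemble_probs = probs.mean(axis=0)        # simple average over members
      print(ensemble_probs.argmax(axis=1))       # ensemble class decisions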

  4. PROMIS series. Volume 8: Midlatitude ground magnetograms

    NASA Technical Reports Server (NTRS)

    Fairfield, D. H.; Russell, C. T.

    1990-01-01

    This is the eighth in a series of volumes pertaining to the Polar Region Outer Magnetosphere International Study (PROMIS). This volume contains 24 hour stack plots of 1-minute average, H and D component, ground magnetograms for the period March 10 through June 16, 1986. Nine midlatitude ground stations were selected from the UCLA magnetogram data base that was constructed from all available digitized magnetogram stations. The primary purpose of this publication is to allow users to define universal times and onset longitudes of magnetospheric substorms.

  5. Forecasting coconut production in the Philippines with ARIMA model

    NASA Astrophysics Data System (ADS)

    Lim, Cristina Teresa

    2015-02-01

    The study aimed to depict the situation of the coconut industry in the Philippines in future years by applying the Autoregressive Integrated Moving Average (ARIMA) method. Data on coconut production, one of the major industrial crops of the country, for the period 1990 to 2012 were analyzed using time-series methods. The autocorrelation (ACF) and partial autocorrelation functions (PACF) were calculated for the data. An appropriate Box-Jenkins autoregressive moving average model was fitted, and the validity of the model was tested using standard statistical techniques. The forecasting power of the fitted autoregressive moving average (ARMA) model was then used to forecast coconut production for the eight leading years.
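
    As a minimal illustration of this Box-Jenkins workflow (synthetic data, not the paper's series; the order (1,1,1) stands in for one chosen by inspecting the ACF and PACF):

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(6)
      y = 100 + np.cumsum(rng.normal(1.5, 4.0, 23))  # 23 annual values, 1990-2012

      res = ARIMA(y, order=(1, 1, 1)).fit()          # order from ACF/PACF study
      print(res.forecast(steps=8))                   # the eight leading years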

  6. Status of alewife and rainbow smelt in U.S. waters of Lake Ontario, 2015

    USGS Publications Warehouse

    Walsh, Maureen; Weidel, Brian C.; Connerton, Michael J.; Holden, Jeremy P.

    2016-01-01

    In 2015 the joint USGS and NYSDEC surveys for Alewife and Rainbow Smelt were combined for the first time into a comprehensive spring pelagic prey fish survey. The adult Alewife abundance and weight indices in 2015 increased slightly from 2014 levels, and adult Alewife abundance has remained relatively stable for the past five years. Adult Alewife condition in both spring and fall increased from 2014 values and was above long-term means. Yearling Alewife abundance was the lowest observed in the 38-year time series. Alewife year class strength at age 1 is related to the number of spawning adults and summer temperatures and winter duration in the first year after hatching. Moderate year classes were produced during 2009-2011, and 2012 was the largest year class in the time series. However, severe winters in 2013-2014 and 2014-2015 contributed to two successive very small year classes for the first time in the time series. We expect adult Alewife abundance and biomass to decline in 2016 as older and larger fish decline in the population. The number of spawning adults increased in 2015, summer temperatures were slightly below average, and the anticipated winter duration is below average (i.e., milder winter) for 2015-2016, so these conditions will likely produce a low to moderate year class. A third successive weak year class could be problematic for the Lake Ontario Alewife population and may be of concern to binational lake managers. Rainbow Smelt were also assessed and the population continues to persist at a low and stable level.

  7. ARIMA representation for daily solar irradiance and surface air temperature time series

    NASA Astrophysics Data System (ADS)

    Kärner, Olavi

    2009-06-01

    Autoregressive integrated moving average (ARIMA) models are used to compare the long-range temporal variability of the total solar irradiance (TSI) at the top of the atmosphere (TOA) and surface air temperature series. The comparison shows that one and the same type of model is applicable to represent both the TSI and the air temperature series. In terms of model type, the surface air temperature series closely imitates that of the TSI. This may mean that currently no other forcing of the climate system is capable of changing the random-walk-type variability established by the varying activity of the rotating Sun. The result should inspire more detailed examination of the dependence of various climate series on short-range fluctuations of the TSI.

  8. Cost-benefit analysis of the 55-mph speed limit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forester, T.H.; McNown, R.F.; Singell, L.D.

    1984-01-01

    This article presents the results of an empirical study estimating the reduction in fatalities attributable to the imposed 55-mph speed limit. Time series data for the US from 1952 to 1979 are employed in a regression model capturing the relation between fatalities, average speed, variability of speed, and the speed limit. Also discussed are alternative approaches to valuing human life and the value of time. A series of benefit-cost ratios is provided, based on alternative measures of the benefits and costs of life saving. The paper concludes that the 55-mph speed limit is not cost efficient unless additional time on the highway is valued significantly below levels estimated in the best research on the value of time. 12 references, 1 table.

  9. Estimations of the Global Distribution and Time Series of UV Noontime Irradiance (305, 310, 324, 380 nm, and Erythemal) from TOMS and SeaWiFS Data

    NASA Technical Reports Server (NTRS)

    Herman, J.

    2004-01-01

    The amount of UV irradiance reaching the Earth's surface is estimated from the measured cloud reflectivity, ozone, aerosol amounts, and surface reflectivity time series from 1980 to 1992 and 1997 to 2000, to estimate changes that have occurred over a 21-year period. Recent analysis of the TOMS data shows that there has been an apparent increase in reflectivity (decrease in UV) in the Southern Hemisphere that is related to a calibration error in EP-TOMS. Data from the well-calibrated SeaWiFS satellite instrument have been used to correct the EP-TOMS reflectivity and UV time series. After correction, some of the local trend features seen in the N7 time series (1980 to 1992) are continued in the combined time series, but the overall zonal average and global trends have changed. In addition to correcting the EP-TOMS radiance calibration, the use of SeaWiFS cloud data permits estimation of UV irradiance at higher spatial resolution (1 to 4 km) than is available from TOMS (100 km), under the assumption that ozone is slowly varying over a scale of 100 km. The key results include a continuing decrease in cloud cover over Europe and North America with a corresponding increase in UV, and a decrease in UV irradiance near Antarctica.

  10. Nonlinear Dynamics of River Runoff Elucidated by Horizontal Visibility Graphs

    NASA Astrophysics Data System (ADS)

    Lange, Holger; Rosso, Osvaldo A.

    2017-04-01

    We investigate a set of long-term river runoff time series at daily resolution from Brazil, monitored by the Agencia Nacional de Aguas. A total of 150 time series was obtained, with an average length of 65 years. Both long-term trends and human influence (water management, e.g. for power production) on the dynamical behaviour are analyzed. We use Horizontal Visibility Graphs (HVGs) to determine the individual temporal networks for the time series, and extract their degree and their distance (shortest path length) distributions. Statistical and information-theoretic properties of these distributions are calculated: robust estimators of skewness and kurtosis, the maximum degree occurring in the time series, the Shannon entropy, permutation complexity and Fisher Information. For the latter, we also compare the information measures obtained from the degree distributions to those computed from the original time series directly, to investigate the impact of graph construction on the dynamical properties as reflected in these measures. The focus is on universal properties of the HVG, common to all runoff series, on the one hand, and on site-specific aspects on the other. The results demonstrate that the assumption of power law behaviour for the degree distribution does not generally hold, and that management has a significant impact on this distribution. We also show that a specific pretreatment of the time series conventional in hydrology, the elimination of seasonality by a separate z-transformation for each calendar day, is highly detrimental to the nonlinear behaviour. It changes long-term correlations and the overall dynamics towards more random behaviour. Analysis based on the transformed data easily leads to spurious results and bears a high risk of misinterpretation.
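
    The HVG construction itself is simple enough to sketch (a generic illustration, not the authors' code): two time points are linked when every sample strictly between them lies below both. For an i.i.d. series the mean degree tends to 4, a useful sanity check.

      import numpy as np

      def hvg_degrees(x):
          """Degree sequence of the horizontal visibility graph: t_i and t_j
          are linked iff every sample strictly between them lies below
          min(x_i, x_j)."""
          n = len(x)
          deg = np.zeros(n, dtype=int)
          for i in range(n):
              for j in range(i + 1, n):
                  if i + 1 == j or x[i + 1:j].max() < min(x[i], x[j]):
                      deg[i] += 1
                      deg[j] += 1
                  if x[j] >= x[i]:      # a taller bar blocks everything beyond
                      break
          return deg

      rng = np.random.default_rng(7)
      deg = hvg_degrees(rng.random(2000))
      print(deg.mean())                 # tends to 4 for an i.i.d. series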

  11. Monitoring of Viral Induced Cell Death Using Real Time Cell Analysis

    DTIC Science & Technology

    2016-11-01

    (Fragmentary excerpt.) Studies have shown that real-time cell analysis (RTCA) platforms such as the xCELLigence can be used to gather quantitative measurements of viral... Teng, Z., Kuang, X., Wang, J., Zhang, X. Real-time cell analysis – a new method for dynamic, quantitative measurement of infectious viruses and... cytopathogenicity. Figure caption fragment: A) Real-time monitoring of BSR cells infected with a 1:10 dilution series of Gan Gan virus; the curve is an average of eight...

  12. Evaluation of the effects of climate and man intervention on ground waters and their dependent ecosystems using time series analysis

    NASA Astrophysics Data System (ADS)

    Gemitzi, Alexandra; Stefanopoulos, Kyriakos

    2011-06-01

    Summary: Groundwaters and their dependent ecosystems are affected both by meteorological conditions and by human interventions, mainly in the form of groundwater abstractions for irrigation needs. This work aims at investigating the quantitative effects of meteorological conditions and human intervention on groundwater resources and their dependent ecosystems. Various seasonal Auto-Regressive Integrated Moving Average (ARIMA) models with external predictor variables were used in order to model the influence of meteorological conditions and human intervention on the groundwater level time series. Initially, a seasonal ARIMA model that simulates the abstraction time series using temperature (T) as an external predictor variable was prepared. Thereafter, seasonal ARIMA models were developed in order to simulate the groundwater level time series at 8 monitoring locations, using the appropriate predictor variables determined for each individual case. The spatial component was introduced through the use of Geographical Information Systems (GIS). The proposed methodology was applied to the Neon Sidirochorion alluvial aquifer (Northern Greece), for which a 7-year-long time series (i.e., 2003-2010) of piezometric and groundwater abstraction data exists. According to the developed ARIMA models, three distinct groups of groundwater level time series exist: the first proves to be dependent only on the meteorological parameters, the second group demonstrates a mixed dependence both on meteorological conditions and on human intervention, whereas the third group shows a clear influence from human intervention. Moreover, there is evidence that groundwater abstraction has affected an important protected ecosystem.
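
    A hedged sketch of a seasonal ARIMA model with an external predictor, in the spirit of the paper: a groundwater level series driven by temperature. All data and model orders here are synthetic and illustrative, not the study's.

      import numpy as np
      from statsmodels.tsa.statespace.sarimax import SARIMAX

      rng = np.random.default_rng(8)
      n = 12 * 7                                 # seven years of monthly data
      temp = 15 + 10 * np.sin(2 * np.pi * np.arange(n) / 12) \
             + rng.normal(0, 1, n)
      level = 50 - 0.3 * temp + np.cumsum(rng.normal(0, 0.2, n))

      res = SARIMAX(level, exog=temp, order=(1, 1, 1),
                    seasonal_order=(1, 0, 0, 12)).fit(disp=False)
      print(res.params)                          # exog coefficient near -0.3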

  13. Visibility graph network analysis of natural gas price: The case of North American market

    NASA Astrophysics Data System (ADS)

    Sun, Mei; Wang, Yaqi; Gao, Cuixia

    2016-11-01

    Fluctuations in the price of natural gas significantly affect the global economy. Research on the characteristics of natural gas price fluctuations, their turning points, and the cycles over which they influence subsequent prices is therefore of great significance. Global natural gas trade concentrates on three regional markets: the North American market, the European market and the Asia-Pacific market, with North America having the most developed natural gas financial market. In addition, sound legal supervision and coordinated regulations make the North American market more open and more competitive. This paper focuses on the North American natural gas market specifically. The Henry Hub natural gas spot price time series is converted to a visibility graph network, which provides a new direction for macro analysis of time series, and several indicators are investigated: degree and degree distribution, the average shortest path length and community structure. The internal mechanisms underlying price fluctuations are explored through these indicators. The results show that the natural gas price visibility graph network (NGP-VGN) exhibits small-world and scale-free properties simultaneously. After random rearrangement of the original price time series, the degree distribution of the network becomes exponential, different from the original one. This means that the original price time series has fractal characteristics with long-range negative correlations. In addition, nodes with large degree correspond to significant geopolitical or economic events. Communities correspond to time cycles in the visibility graph network. The cycles of the time series and the impact scope of hubs can be found by community structure partition.

  14. The application of neural networks to myoelectric signal analysis: a preliminary study.

    PubMed

    Kelly, M F; Parker, P A; Scott, R N

    1990-03-01

    Two neural network implementations are applied to myoelectric signal (MES) analysis tasks. The motivation behind this research is to explore more reliable methods of deriving control for multi-degree-of-freedom arm prostheses. A discrete Hopfield network is used to calculate the time series parameters for a moving average MES model. It is demonstrated that the Hopfield network is capable of generating the same time series parameters as those produced by the conventional sequential least squares (SLS) algorithm. Furthermore, it can be extended to applications utilizing larger amounts of data, and possibly to higher-order time series models, without significant degradation in computational efficiency. The second neural network implementation involves using a two-layer perceptron to classify a single-site MES based on two features, specifically the first time series parameter and the signal power. Using these features, the perceptron is trained to distinguish between four separate arm functions. The two-dimensional decision boundaries used by the perceptron classifier are delineated. It is also demonstrated that the perceptron is able to rapidly compensate for variations when new data are incorporated into the training set. This adaptive quality suggests that perceptrons may provide a useful tool for future MES analysis.

  15. The construction of a Central Netherlands temperature

    NASA Astrophysics Data System (ADS)

    van der Schrier, G.; van Ulden, A.; van Oldenborgh, G. J.

    2011-05-01

    The Central Netherlands Temperature (CNT) is a monthly daily mean temperature series constructed from homogenized time series from the centre of the Netherlands. The purpose of this series is to offer a homogeneous time series representative of a larger area in order to study large-scale temperature changes. It will also facilitate comparison with climate models, which resolve similar scales. From 1906 onwards, temperature measurements in the Netherlands have been sufficiently standardized to construct a high-quality series. Long time series have been constructed by merging nearby stations and using the overlap to calibrate the differences. These long time series, and a few time series of only a few decades in length, have been subjected to a homogeneity analysis in which significant breaks and artificial trends have been corrected. Many of the detected breaks correspond to changes in the observations that are documented in the station metadata. This version of the CNT, to which we attach the version number 1.1, is constructed as the unweighted average of four stations (De Bilt, Winterswijk/Hupsel, Oudenbosch/Gilze-Rijen and Gemert/Volkel), with the stations Eindhoven and Deelen added from 1951 and 1958 onwards, respectively. The global gridded datasets used for detecting and attributing climate change are based on raw observational data. Although some homogeneity adjustments are made, these are not based on knowledge of local circumstances but only on statistical evidence. Despite this handicap, and the fact that these datasets use grid boxes far larger than the area associated with the Central Netherlands Temperature, the temperature interpolated to the CNT region shows a warming trend that is broadly consistent with the CNT trend in all of these datasets. The actual trends differ from the CNT trend by up to 30%, which highlights the need to base future global gridded temperature datasets on homogenized time series.

  16. Structural Equation Modeling of Multivariate Time Series

    ERIC Educational Resources Information Center

    du Toit, Stephen H. C.; Browne, Michael W.

    2007-01-01

    The covariance structure of a vector autoregressive process with moving average residuals (VARMA) is derived. It differs from other available expressions for the covariance function of a stationary VARMA process and is compatible with current structural equation methodology. Structural equation modeling programs, such as LISREL, may therefore be…

  17. 40 CFR Appendix D to Part 60 - Required Emission Inventory Information

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... one device operates in series. The method of control efficiency determination shall be indicated (e.g., design efficiency, measured efficiency, estimated efficiency). (iii) Annual average control efficiency, in percent, taking into account control equipment down time. This shall be a combined efficiency when...

  18. 40 CFR Appendix D to Part 60 - Required Emission Inventory Information

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... one device operates in series. The method of control efficiency determination shall be indicated (e.g., design efficiency, measured efficiency, estimated efficiency). (iii) Annual average control efficiency, in percent, taking into account control equipment down time. This shall be a combined efficiency when...

  19. 40 CFR Appendix D to Part 60 - Required Emission Inventory Information

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... one device operates in series. The method of control efficiency determination shall be indicated (e.g., design efficiency, measured efficiency, estimated efficiency). (iii) Annual average control efficiency, in percent, taking into account control equipment down time. This shall be a combined efficiency when...

  20. 40 CFR Appendix D to Part 60 - Required Emission Inventory Information

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... one device operates in series. The method of control efficiency determination shall be indicated (e.g., design efficiency, measured efficiency, estimated efficiency). (iii) Annual average control efficiency, in percent, taking into account control equipment down time. This shall be a combined efficiency when...

  1. 40 CFR Appendix D to Part 60 - Required Emission Inventory Information

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... one device operates in series. The method of control efficiency determination shall be indicated (e.g., design efficiency, measured efficiency, estimated efficiency). (iii) Annual average control efficiency, in percent, taking into account control equipment down time. This shall be a combined efficiency when...

  2. Severe Droughts Reduce Estuarine Primary Productivity with Cascading Effects on Higher Trophic Levels

    EPA Science Inventory

    Using a 10 year time-series dataset, we analyzed the effects of two severe droughts on water quality and ecosystem processes in a temperate, eutrophic estuary (Neuse River Estuary, North Carolina). During the droughts, dissolved inorganic nitrogen concentrations were on average 4...

  3. ERBE_S10N_WFOV_SF_ERBS_AreaAverageTimeSeries_Edition4

    Atmospheric Science Data Center

    2017-08-02

    NOTE: There is a data gap in 1993 and 1998 due to instrument issues (see section 1.3 in the Data Quality Summary). These years have months of missing data, so users should exercise caution when using data from these years.

  4. Tropical Tropospheric Ozone (TTO) Maps from Nimbus 7 and Earth-Probe TOMS by the Modified-Residual Method. 1; Validation, Evaluation and Trends based on Atlantic Regional Time Series

    NASA Technical Reports Server (NTRS)

    Thompson, Anne M.; Hudson, Robert D.

    1998-01-01

    The well-known wave-one pattern seen in tropical total ozone [Shiotani, 1992; Ziemke et al., 1996, 1998] has been used to develop a modified-residual (MR) method for retrieving time-averaged stratospheric ozone and tropospheric ozone column amounts from TOMS (Total Ozone Mapping Spectrometer) over the 14 complete calendar years of Nimbus 7 observations (1979-1992) and from TOMS on the Earth-Probe (1996-present) and ADEOS platforms (1996-1997). Nine- to sixteen-day averaged tropical tropospheric ozone (TTO) maps, validated with ozonesondes, show a seasonality expected from dynamical and chemical influences. The maps may be viewed on a homepage: http://metosrv2.umd.edu/tropo. Stratospheric column ozone, which is also derived by the modified-residual method, compares well with sondes (to within 6-7 DU) and with stratospheric ozone column derived from other satellites (within 8-10 DU). Validation of the TTO time series is presently limited to ozonesonde comparisons with Atlantic stations and sites on the adjacent continents (Ascension Island; Natal, Brazil; Brazzaville); for the sounding periods, TTO at all locations agrees with the sonde record to within +/-7 DU. TTO time series and the magnitude of the wave-one pattern show ENSO signals in the strongest El Niño periods from 1979-1998. From 12°N to 12°S, zonally averaged tropospheric ozone shows no significant trend from 1980-1990. Trends are also not significant during this period in localized regions, e.g. from just west of South America across to southern Africa. This is consistent with the ozonesonde record at Natal, Brazil (the only tropical ozone data publicly available for the 1980s), which shows an increase that is not statistically significant. The lack of trend in tropospheric ozone agrees with a statistical analysis based on another method for deriving TTO from TOMS, the so-called convective-cloud-differential approach of Ziemke et al. [1998].

  5. Dissolved organic nitrogen dynamics in the North Sea: A time series analysis (1995-2005)

    NASA Astrophysics Data System (ADS)

    Van Engeland, T.; Soetaert, K.; Knuijt, A.; Laane, R. W. P. M.; Middelburg, J. J.

    2010-09-01

    Dissolved organic nitrogen (DON) dynamics in the North Sea were explored by means of long-term time series of nitrogen parameters from the Dutch national monitoring program. Generally, the data quality was good, with few missing data points. Different imputation methods were used to verify the robustness of the patterns against these missing data. No long-term trends in DON concentrations were found over the sampling period (1995-2005). Inter-annual variability in the different time series showed both common and station-specific behavior. The stations could be divided into two regions, based on absolute concentrations and the dominant time scales of variability. Average DON concentrations were 11 μmol l−1 in the coastal region and 5 μmol l−1 in the open sea. Organic fractions of total dissolved nitrogen (TDN) averaged 38 and 71% in the coastal zone and open sea, respectively, but increased over time due to decreasing dissolved inorganic nitrogen (DIN) concentrations. In both regions intra-annual variability dominated over inter-annual variability, but DON variation in the open sea was markedly shifted towards shorter time scales relative to the coastal stations. In the coastal zone a consistent seasonal DON cycle existed, with high values in spring-summer and low values in autumn-winter. In the open sea seasonality was weak. A marked shift in the seasonality was found at the Dogger Bank, with DON accumulation towards summer and low values in winter prior to 1999, and accumulation in spring and decline throughout summer after 1999. This study clearly shows that DON is a dynamic actor in the North Sea and should be monitored systematically to enable us to fully understand the functioning of this ecosystem.

  6. Time series forecasting using ERNN and QR based on Bayesian model averaging

    NASA Astrophysics Data System (ADS)

    Pwasong, Augustine; Sathasivam, Saratha

    2017-08-01

    The Bayesian model averaging technique is a multi-model combination technique. It was employed here to amalgamate the Elman recurrent neural network (ERNN) technique with the quadratic regression (QR) technique. The amalgamation produced a hybrid technique known as the hybrid ERNN-QR technique. The forecasting potential of the hybrid technique is compared with the forecasting capabilities of the individual ERNN and QR techniques. The outcome revealed that the hybrid technique is superior to the individual techniques in the mean square error sense.
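
    The combination step can be sketched generically: approximate posterior model weights from an information criterion and take the weighted average of the member forecasts. The numbers below are hypothetical, and BIC-based weights are a common stand-in for the full Bayesian computation, not necessarily the paper's.

      import numpy as np

      def bma_weights(sse, n_obs, n_params):
          """Approximate posterior model weights from BIC."""
          sse = np.asarray(sse, dtype=float)
          bic = n_obs * np.log(sse / n_obs) + np.asarray(n_params) * np.log(n_obs)
          w = np.exp(-0.5 * (bic - bic.min()))
          return w / w.sum()

      # Hypothetical in-sample fits of two candidate models
      w = bma_weights(sse=[410.0, 385.0], n_obs=200, n_params=[3, 9])
      forecast_a, forecast_b = 12.1, 13.4        # the two models' point forecasts
      print(w, w[0] * forecast_a + w[1] * forecast_b)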

  7. Tuition and Required Fees, New Jersey Colleges and Universities: 1976-77 through 1980-81. Research Note Series, Volume 1, Number 1.

    ERIC Educational Resources Information Center

    Delehanty, Kathleen

    Historical trends from 1976-77 through 1980-81 in tuition and required fee charges in New Jersey colleges and universities are examined. The overall five-year percentage changes in average annual tuition/fees in the different New Jersey collegiate sectors are outlined for different types of students (full-time and part-time, undergraduate and…

  8. Hangar Fire Suppression Utilizing Novec 1230

    DTIC Science & Technology

    2018-01-01

    (Fragmentary excerpt.) ...fuel fires in aircraft hangars. A 30×30×8-ft concrete-and-steel test structure was constructed for this test series. Four discharge assemblies... structure. System discharge parameters (discharge time, discharge rate, and quantity of agent discharged) were adjusted to produce the desired Novec 1230...

  9. A Comparison of Forecast Error Generators for Modeling Wind and Load Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Ning; Diao, Ruisheng; Hafen, Ryan P.

    2013-07-25

    This paper presents four algorithms to generate random forecast error time series, and compares their performance. The error time series are used to create real-time (RT), hour-ahead (HA), and day-ahead (DA) wind and load forecast time series that statistically match historically observed forecasting data sets used in power grid operation, in order to study the net load balancing need in variable generation integration studies. The four algorithms are truncated-normal distribution models, state-space based Markov models, seasonal autoregressive moving average (ARMA) models, and a stochastic-optimization based approach. The comparison is made using historical DA load forecasts and actual load values to generate new sets of DA forecasts with similar statistical forecast error characteristics (i.e., mean, standard deviation, autocorrelation, and cross-correlation). The results show that all methods generate satisfactory results. One method may preserve one or two required statistical characteristics better than the other methods, but may not preserve the other statistical characteristics as well. Because the wind and load forecast error generators are used in wind integration studies to produce wind and load forecast time series for stochastic planning processes, it is sometimes critical to use multiple methods to generate the error time series to obtain a statistically robust result. Therefore, this paper discusses and compares the capabilities of each algorithm to preserve the characteristics of the historical forecast data sets.
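
    For the ARMA branch, statsmodels can draw synthetic error series directly; the sketch below generates an autocorrelated forecast-error series with a chosen bias and scale (all parameters are illustrative, not the paper's fitted values).

      import numpy as np
      from statsmodels.tsa.arima_process import arma_generate_sample

      np.random.seed(9)
      ar = [1, -0.8]        # AR(1), coefficient 0.8 (convention: 1 - 0.8L)
      ma = [1, 0.3]
      err = arma_generate_sample(ar, ma, nsample=24 * 365, scale=0.02)
      err = 0.01 + err      # add a small systematic forecast bias
      # Check the statistics the paper says must be preserved
      print(err.mean(), err.std(), np.corrcoef(err[:-1], err[1:])[0, 1])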

  10. Topological data analysis of financial time series: Landscapes of crashes

    NASA Astrophysics Data System (ADS)

    Gidea, Marian; Katz, Yuri

    2018-02-01

    We explore the evolution of daily returns of four major US stock market indices during the technology crash of 2000 and the financial crisis of 2007-2009. Our methodology is based on topological data analysis (TDA). We use persistent homology to detect and quantify topological patterns that appear in multidimensional time series. Using a sliding window, we extract time-dependent point cloud data sets, to which we associate a topological space. We detect transient loops that appear in this space, and we measure their persistence. This is encoded in real-valued functions referred to as 'persistence landscapes'. We quantify the temporal changes in persistence landscapes via their Lp-norms. We test this procedure on multidimensional time series generated by various non-linear and non-equilibrium models. We find that, in the vicinity of financial meltdowns, the Lp-norms exhibit strong growth prior to the primary peak, which ascends during a crash. Remarkably, the average spectral density at low frequencies of the time series of Lp-norms of the persistence landscapes demonstrates a strong rising trend for 250 trading days prior to either the dotcom crash on 03/10/2000 or the Lehman bankruptcy on 09/15/2008. Our study suggests that TDA provides a new type of econometric analysis, which complements the standard statistical measures. The method can be used to detect early warning signals of imminent market crashes. We believe that this approach can be used beyond the analysis of the financial time series presented here.
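
    Under stated assumptions, the sliding-window stage can be sketched as follows. The code assumes the third-party ripser package for persistent homology and uses the total persistence of 1-dimensional features as a crude stand-in for the paper's landscape Lp-norms.

      import numpy as np
      from ripser import ripser   # assumed installed: pip install ripser

      def window_signal(returns, w):
          """Total persistence of H1 features in sliding point clouds built
          from w-day windows of a multidimensional return series (a crude
          stand-in for Lp-norms of persistence landscapes)."""
          signal = []
          for t in range(len(returns) - w):
              cloud = returns[t:t + w]                  # w points in R^d
              h1 = ripser(cloud, maxdim=1)['dgms'][1]   # birth-death pairs
              signal.append(np.sum(h1[:, 1] - h1[:, 0]) if len(h1) else 0.0)
          return np.asarray(signal)

      rng = np.random.default_rng(11)
      returns = rng.standard_normal((300, 4)) * 0.01    # synthetic 4-index returns
      print(window_signal(returns, w=50)[:5])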

  11. Use of Ovine-based Collagen Extracellular Matrix and Gentian Violet/Methylene Blue Antibacterial Foam Dressings to Help Improve Clinical Outcomes in Lower Extremity Wounds: A Retrospective Cohort Study.

    PubMed

    Lullove, Eric J

    2017-04-01

    Dressings that provide broad spectrum metalloprotease reduction along with inherent aspects of an extracellular matrix may contribute to improved wound healing outcomes and shorter treatment times. The author performed a retrospective case series analysis to determine the clinical outcomes of regular debridement with the use of ovine-based collagen extracellular matrix dressings and gentian violet/methylene blue polyurethane antibacterial foam dressings in treating 53 patients with 53 chronic lower extremity wounds (diabetic foot ulcers [DFUs], venous leg ulcers, and heel pressure ulcers). Patients were treated twice weekly in an outpatient clinic for the first 4 weeks and weekly thereafter until closure. Average body mass index (BMI) for the study population was 28.3, and the average patient age was 75.9 years. Mean percent wound surface area reduction at 4, 8, and 12 weeks was 38.5%, 73.3%, and 91.3%, respectively. Average time to closure for all wounds was 10.6 weeks (range, 5-24 weeks). All wounds were 100% reepithelialized by week 20 except 1 DFU that reepithelialized at week 24. The average cost of care for a single wound episode (from presentation to closure) was $2749.49. Results of this analysis showed that the healing of chronic wounds in this series could be achieved at a reasonable cost with regular debridement and a collagen matrix dressing regimen, even in patients of advanced age and above average BMI as well as in wounds that did not achieve > 40% wound surface area reduction at 4 weeks.

  12. Investigating the creeping section of the San Andreas Fault using ALOS PALSAR interferometry

    NASA Astrophysics Data System (ADS)

    Agram, P. S.; Wortham, C.; Zebker, H. A.

    2010-12-01

    In recent years, time-series InSAR techniques have been used to study the temporal characteristics of various geophysical phenomena that produce surface deformation, including earthquakes and magma migration in volcanoes. Conventional InSAR and time-series InSAR techniques have also been successfully used to study aseismic creep across faults in urban areas like the Northern Hayward Fault in California [1-3]. However, application of these methods to studying time-dependent creep across the Central San Andreas Fault using the C-band ERS and Envisat radar satellites has had limited success. While these techniques estimate the average long-term far-field deformation rates reliably, creep measurement close to the fault (< 3-4 km) is virtually impossible due to heavy decorrelation at C-band (6 cm wavelength). Shanker and Zebker (2009) [4] used the Persistent Scatterer (PS) time-series InSAR technique to estimate a time-dependent non-uniform creep signal across a section of the creeping segment of the San Andreas Fault. However, the identified PS network was spatially too sparse (about 1 PS per square kilometer) to study the temporal characteristics of deformation in areas close to the fault. In this work, we use L-band (24 cm wavelength) SAR data from the PALSAR instrument on board the ALOS satellite, launched by the Japanese Aerospace Exploration Agency (JAXA) in 2006, to study the temporal characteristics of creep across the Central San Andreas Fault. The longer wavelength at L-band improves observed correlation over the entire scene, significantly increasing the ground area coverage of estimated deformation in each interferogram, at the cost of decreased sensitivity of the interferometric phase to surface deformation. However, noise levels in our deformation estimates can be decreased by combining information from multiple SAR acquisitions using time-series InSAR techniques. We analyze 13 SAR acquisitions spanning the time period from March 2007 to December 2009 using the Short Baseline Subset Analysis (SBAS) time-series InSAR technique [3]. We present detailed comparisons of the estimated time series of fault creep as a function of position along the fault, including the locked section around Parkfield, CA. We also present comparisons between the InSAR time series and GPS network observations in the Parkfield region. During these three years of observation, the average fault creep is estimated to be 35 mm/yr. References [1] Bürgmann, R., E. Fielding and J. Sukhatme, Slip along the Hayward fault, California, estimated from space-based synthetic aperture radar interferometry, Geology, 26, 559-562, 1998. [2] Ferretti, A., C. Prati and F. Rocca, Permanent Scatterers in SAR Interferometry, IEEE Trans. Geosci. Remote Sens., 39, 8-20, 2001. [3] Lanari, R., F. Casu, M. Manzo, and P. Lundgren, Application of SBAS D-InSAR technique to fault creep: A case study of the Hayward Fault, California. Remote Sensing of Environment, 109(1), 20-28, 2007. [4] Shanker, A. P., and H. Zebker, Edgelist phase unwrapping algorithm for time-series InSAR. J. Opt. Soc. Am. A, 37(4), 2010.

  13. Evolution of record-breaking high and low monthly mean temperatures

    NASA Astrophysics Data System (ADS)

    Anderson, A. L.; Kostinski, A. B.

    2011-12-01

    We examine the ratio of record-breaking highs to record-breaking lows as a function of the extent of the time series for monthly mean temperatures within the continental United States (1900-2006) and ask the following question: how are record-breaking high and low surface temperatures in the United States affected by the time period considered? We find that the ratio of record-breaking highs to lows in 2006 increases as the time series extend further into the past. For example: in 2006, the ratio of record-breaking highs to record-breaking lows is ≈ 13 : 1 with 1950 as the first year and ≈ 25 : 1 with 1900 as the first year; both ratios are an order of magnitude greater than 3-σ for stationary simulations. We also find record-breaking events are more sensitive to trends in time series of monthly averages than in time series of the corresponding daily values. When we consider the ratio as it evolves with respect to a fixed start year, we find it is strongly correlated with the ensemble mean. Correlation coefficients are 0.76 and 0.82 for 1900-2006 and 1950-2006, respectively; 3-σ = 0.3 for pairs of uncorrelated stationary time series. We find similar values for globally distributed time series: 0.87 and 0.92 for 1900-2006 and 1950-2006, respectively. However, the ratios evolve differently: global ratios increase throughout (1920-2006) while continental United States ratios decrease from about 1940 to 1970. (Based on Anderson and Kostinski (2011), Evolution and distribution of record-breaking high and low monthly mean temperatures. Journal of Applied Meteorology and Climatology. doi: 10.1175/JAMC-D-10-05025.1)
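
    The stationary benchmark invoked here is easy to simulate (generic sketch, not the authors' code): in i.i.d. series, record highs and lows in the final year are equally likely, so the ratio hovers near 1; sustained ratios like 13:1 therefore point to a trend.

      import numpy as np

      rng = np.random.default_rng(10)
      n_series, n_years = 5000, 107                # e.g. stations, 1900-2006
      x = rng.standard_normal((n_series, n_years))
      highs = x[:, -1] > x[:, :-1].max(axis=1)     # record high in final year
      lows = x[:, -1] < x[:, :-1].min(axis=1)      # record low in final year
      print(highs.sum() / lows.sum())              # fluctuates around 1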

  14. Population-level administration of AlcoholEdu for college: an ARIMA time-series analysis.

    PubMed

    Wyatt, Todd M; Dejong, William; Dixon, Elizabeth

    2013-08-01

    Autoregressive integrated moving average (ARIMA) modeling is a powerful analytic tool for conducting interrupted time-series analysis, yet it is rarely used in studies of public health campaigns or programs. This study demonstrated the use of ARIMA to assess AlcoholEdu for College, an online alcohol education course for first-year students, and other health and safety programs introduced at a moderate-size public university in the South. From 1992 to 2009, the university administered annual Core Alcohol and Drug Surveys to samples of undergraduates (Ns = 498 to 1032). AlcoholEdu and other health and safety programs that began during the study period were assessed through a series of quasi-experimental ARIMA analyses. Implementation of AlcoholEdu in 2004 was significantly associated with substantial decreases in alcohol consumption and alcohol- or drug-related negative consequences. These improvements were sustained over time as succeeding first-year classes took the course. Previous studies have shown that AlcoholEdu has an initial positive effect on students' alcohol use and associated negative consequences. This investigation suggests that these positive changes may be sustainable over time through yearly implementation of the course with first-year students. ARIMA time-series analysis holds great promise for investigating the effect of program and policy interventions to address alcohol- and drug-related problems on campus.
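
    A minimal interrupted time-series analysis in the ARIMA family can be sketched with a step dummy whose coefficient estimates the post-intervention level change (synthetic data and model order, not the study's).

      import numpy as np
      from statsmodels.tsa.statespace.sarimax import SARIMAX

      rng = np.random.default_rng(13)
      n = 18                                     # 18 annual survey waves
      y = 5 + np.cumsum(rng.normal(0, 0.1, n))
      y[12:] -= 0.8                              # drop after the intervention
      step = (np.arange(n) >= 12).astype(float)  # 0 before, 1 after

      res = SARIMAX(y, exog=step, order=(1, 0, 0), trend='c').fit(disp=False)
      print(res.params)                          # exog coefficient near -0.8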

  15. Geomagnetic field declination: from decadal to centennial scales

    NASA Astrophysics Data System (ADS)

    Dobrica, Venera; Demetrescu, Crisan; Mandea, Mioara

    2018-04-01

    Declination annual mean time series longer than 1 century, provided by 24 geomagnetic observatories worldwide, together with 5 Western European declination series reconstructed over the last 4 centuries, have been analyzed in terms of the frequency constituents of the secular variation at inter-decadal and sub-centennial timescales of 20-35 and 70-90 years. The observatory and reconstructed time series have been processed by several types of filtering, namely Hodrick-Prescott, running averages, and Butterworth. The Hodrick-Prescott filtering allows us to separate a quasi-oscillation at a decadal timescale, which is assumed to be related to external variations and is called the 11-year constituent, from a long-term trend. The latter has been decomposed into two other oscillations, called the inter-decadal and sub-centennial constituents, by applying Butterworth filtering with cutoffs at 30 and 73 years, respectively. The analysis shows that the generally accepted geomagnetic jerks occur around extrema in the time derivative of the trend and coincide with extrema in the time derivative of the 11-year constituent. The sub-centennial constituent is traced back to 1600 in the five 400-year-long time series and seems to be a major constituent of the secular variation, geomagnetic jerks included.
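
    Both filtering steps are available off the shelf; this sketch (illustrative parameters, synthetic declination data, not the paper's settings) separates a Hodrick-Prescott trend and then band-passes the two constituents with a zero-phase Butterworth filter.

      import numpy as np
      from scipy.signal import butter, filtfilt
      from statsmodels.tsa.filters.hp_filter import hpfilter

      rng = np.random.default_rng(12)
      years = np.arange(1900, 2018)
      decl = (0.02 * (years - 1900)
              + 0.3 * np.sin(2 * np.pi * years / 28)    # inter-decadal signal
              + 0.5 * np.sin(2 * np.pi * years / 80)    # sub-centennial signal
              + 0.1 * rng.standard_normal(years.size))

      cycle, trend = hpfilter(decl, lamb=100)           # common lamb for annual data
      b, a = butter(3, [1/35, 1/20], btype='bandpass', fs=1.0)
      inter_decadal = filtfilt(b, a, decl)              # 20-35 yr constituent
      b, a = butter(3, [1/90, 1/70], btype='bandpass', fs=1.0)
      sub_centennial = filtfilt(b, a, decl)             # 70-90 yr constituent
      print(trend[-1], inter_decadal[-1], sub_centennial[-1])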

  16. Practical analysis of tide gauges records from Antarctica

    NASA Astrophysics Data System (ADS)

    Galassi, Gaia; Spada, Giorgio

    2015-04-01

    We have collected and analyzed in a basic way the currently available time series from tide gauges deployed along the coasts of Antarctica. The database of the Permanent Service for Mean Sea Level (PSMSL) holds relative sea level information for 17 stations, which are mostly concentrated in the Antarctic Peninsula (8 out of 17). For 7 of the PSMSL stations, Revised Local Reference (RLR) monthly and yearly observations are available, spanning from year 1957.79 (Almirante Brown) to 2013.95 (Argentine Islands). For the remaining 11 stations, only metric monthly data can be obtained during the time window 1957-2013. The record length of the available time series generally does not exceed 20 years. Remarkable exceptions are the RLR station of Argentine Islands, located in the Antarctic Peninsula (AP) (time span: 1958-2013, record length: 54 years, completeness 98%), and the metric station of Syowa in East Antarctica (1975-2012, 37 years, 92%). The general quality (geographical coverage and length of record) of the time series hinders a coherent geophysical interpretation of the relative sea-level data along the coasts of Antarctica. However, in an attempt to characterize the available relative sea-level signals, we have stacked (i.e., averaged) the RLR time series for the AP and for the whole of Antarctica. The time series so obtained have been analyzed using simple regression in order to estimate a trend and a possible sea-level acceleration. For the AP, the trend is 1.8 ± 0.2 mm/yr, and for the whole of Antarctica it is 2.1 ± 0.1 mm/yr (both during 1957-2013). The modeled values of Glacial Isostatic Adjustment (GIA) obtained with ICE-5G(VM2) using the program SELEN range between -0.7 and -1.6 mm/yr, showing that the sea-level trend recorded by tide gauges is strongly influenced by GIA. Subtracting the average GIA contribution (-1.1 mm/yr) from the observed sea-level trend of the two stacks, we obtain 3.2 and 2.9 mm/yr for Antarctica and the AP, respectively, which are interpreted as the effect of current ice melting and steric ocean contributions. By the Ensemble Empirical Mode Decomposition method, we have detected different oscillations embedded in the sea-level signals for Antarctica and the AP. This confirms previously recognized connections between sea-level variations in Antarctica and ocean modes like ENSO.

  17. NeuroRhythmics: software for analyzing time-series measurements of saltatory movements in neuronal processes.

    PubMed

    Kerlin, Aaron M; Lindsley, Tara A

    2008-08-15

    Time-lapse imaging of living neurons both in vivo and in vitro has revealed that the growth of axons and dendrites is highly dynamic and characterized by alternating periods of extension and retraction. These growth dynamics are associated with important features of neuronal development and are differentially affected by experimental treatments, but the underlying cellular mechanisms are poorly understood. NeuroRhythmics was developed to semi-automate specific quantitative tasks involved in analysis of two-dimensional time-series images of processes that exhibit saltatory elongation. This software provides detailed information on periods of growth and nongrowth that it identifies by transitions in elongation (i.e. initiation time, average rate, duration) and information regarding the overall pattern of saltatory growth (i.e. time of pattern onset, frequency of transitions, relative time spent in a state of growth vs. nongrowth). Plots and numeric output are readily imported into other applications. The user has the option to specify criteria for identifying transitions in growth behavior, which extends the potential application of the software to neurons of different types or developmental stage and to other time-series phenomena that exhibit saltatory dynamics. NeuroRhythmics will facilitate mechanistic studies of periodic axonal and dendritic growth in neurons.
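
    The transition analysis the software performs can be approximated in a few lines; the sketch below is an assumption about the general approach, not the actual NeuroRhythmics code or criteria: each frame-to-frame interval is classified as growth or nongrowth by a user-set rate threshold, and the phases are run-length encoded to yield durations.

      import numpy as np

      def growth_phases(length, dt=1.0, rate_thresh=0.5):
          """Classify intervals of a neurite-length series as growth or
          nongrowth by an elongation-rate threshold, then return the
          alternating phases as (is_growth, duration) pairs."""
          rate = np.diff(length) / dt                # elongation rate
          growing = rate > rate_thresh               # user-specified criterion
          change = np.flatnonzero(np.diff(growing.astype(int))) + 1
          runs = np.split(growing, change)           # run-length encode phases
          return [(bool(r[0]), len(r) * dt) for r in runs]

      # Hypothetical neurite length measurements (one frame per time unit)
      length = np.array([0, 1, 2.2, 3.1, 3.0, 2.8, 2.9, 4.0, 5.2, 5.1])
      for is_growth, dur in growth_phases(length):
          print("growth" if is_growth else "nongrowth", dur)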

  18. Liquid Nitrogen Cryotherapy for Conjunctival Lymphangiectasia: A Case Series

    PubMed Central

    Fraunfelder, Frederick W.

    2009-01-01

    Purpose: To report a case series of conjunctival lymphangiectasia treated with liquid nitrogen cryotherapy. Methods: A 1.5-mm Brymill cryoprobe was applied in a double freeze-thaw method after an incisional biopsy of a portion of the conjunctiva in patients with conjunctival lymphangiectasia. Freeze times were 1 to 2 seconds with thawing of 5 to 10 seconds between treatments. Patients were reexamined at 1 day, 2 weeks, 3 months, 6 months, and yearly following cryotherapy. Results: Five eyes of 4 patients (3 male and 1 female) with biopsy-proven conjunctival lymphangiectasia underwent liquid nitrogen cryotherapy. The average patient age was 53 years. Ocular examination revealed large lymphatic vessels that were translucent and without conjunctival injection. Subjective symptoms included epiphora, ocular irritation, eye redness, and occasional blurred vision. After treatment with liquid nitrogen cryotherapy, the patients’ symptoms and signs resolved within 2 weeks. Lymphangiectasia recurred twice in one patient, at 1 and 3 years postoperatively. In another patient, lymphangiectasia recurred at 6 months. The average time to recurrence in these 3 eyes was 18 months. Average length of follow-up was 24.5 months for all subjects. Conclusion: Liquid nitrogen cryotherapy may be an effective surgical alternative in the treatment of conjunctival lymphangiectasia. Cryotherapy may need to be repeated in some instances. PMID:20126499

  19. Liquid nitrogen cryotherapy for conjunctival lymphangiectasia: a case series.

    PubMed

    Fraunfelder, Frederick W

    2009-12-01

    To report a case series of conjunctival lymphangiectasia treated with liquid nitrogen cryotherapy. A 1.5-mm Brymill cryoprobe was applied in a double freeze-thaw method after an incisional biopsy of a portion of the conjunctiva in patients with conjunctival lymphangiectasia. Freeze times were 1 to 2 seconds with thawing of 5 to 10 seconds between treatments. Patients were reexamined at 1 day, 2 weeks, 3 months, 6 months, and yearly following cryotherapy. Five eyes of 4 patients (3 male and 1 female) with biopsy-proven conjunctival lymphangiectasia underwent liquid nitrogen cryotherapy. The average patient age was 53 years. Ocular examination revealed large lymphatic vessels that were translucent and without conjunctival injection. Subjective symptoms included epiphora, ocular irritation, eye redness, and occasional blurred vision. After treatment with liquid nitrogen cryotherapy, the patients' symptoms and signs resolved within 2 weeks. Lymphangiectasia recurred twice in one patient, at 1 and 3 years postoperatively. In another patient, lymphangiectasia recurred at 6 months. The average time to recurrence in these 3 eyes was 18 months. Average length of follow-up was 24.5 months for all subjects. Liquid nitrogen cryotherapy may be an effective surgical alternative in the treatment of conjunctival lymphangiectasia. Cryotherapy may need to be repeated in some instances.

  20. Seasonal flux and assemblage composition of planktic foraminifera from the northern Gulf of Mexico, 2008-11

    USGS Publications Warehouse

    Reynolds, Caitlin E.; Poore, Richard Z.

    2013-01-01

    The U.S. Geological Survey anchored a sediment trap in the northern Gulf of Mexico to collect seasonal time-series data on the flux and assemblage composition of live planktic foraminifers. This report provides an update of the previous time-series data to include results from 2011. Ten species, or varieties, constituted ~92 percent of the 2011 assemblage: Globigerinoides ruber (pink and white varieties), Globigerinoides sacculifer, Globigerina calida, Globigerinella aequilateralis, Globorotalia menardii group [the Gt. menardii group includes Gt. menardii, Gt. tumida, and Gt. ungulata], Orbulina universa, Globorotalia truncatulinoides, Pulleniatina spp., and Neogloboquadrina dutertrei. The mean daily flux was 205 tests per square meter per day (m-2 day-1), with maximum fluxes of >600 tests m-2 day-1 during mid-February and mid-September and minimum fluxes of -2 day-1 during mid-March, the beginning of May, and November. Globorotalia truncatulinoides showed a clear preference for the winter, consistent with data from 2008 to 2010. Globigerinoides ruber (white) flux data for 2011 (average 30 tests m-2 day-1) were consistent with data from 2010 (average 29 tests m-2 day-1) and showed a steady threefold increase since 2009 (average 11 tests m-2 day-1) and a tenfold increase from the 2008 flux (3 tests m-2 day-1).

  1. A hybrid wavelet de-noising and Rank-Set Pair Analysis approach for forecasting hydro-meteorological time series.

    PubMed

    Wang, Dong; Borthwick, Alistair G; He, Handan; Wang, Yuankun; Zhu, Jieyu; Lu, Yuan; Xu, Pengcheng; Zeng, Xiankui; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Liu, Jiufu; Zou, Ying; He, Ruimin

    2018-01-01

    Accurate, fast forecasting of hydro-meteorological time series is presently a major challenge in drought and flood mitigation. This paper proposes a hybrid approach, wavelet de-noising (WD) and Rank-Set Pair Analysis (RSPA), that takes full advantage of a combination of the two approaches to improve forecasts of hydro-meteorological time series. WD allows decomposition and reconstruction of a time series by the wavelet transform, and hence separation of the noise from the original series. RSPA, a more reliable and efficient version of Set Pair Analysis, is integrated with WD to form the hybrid WD-RSPA approach. Two types of hydro-meteorological data sets with different characteristics and different levels of human influence at some representative stations are used to illustrate the WD-RSPA approach. The approach is also compared to three other generic methods: the conventional Auto Regressive Integrated Moving Average (ARIMA) method, Artificial Neural Networks (ANNs: BP - error Back Propagation, MLP - Multilayer Perceptron, and RBF - Radial Basis Function), and RSPA alone. Nine error metrics are used to evaluate model performance. The WD-RSPA model invariably produced smaller error measures than the three other generic methods, indicating better forecasting capability, even when extreme events are included within a time series. The results show that WD-RSPA is accurate, feasible, and effective. Copyright © 2017 Elsevier Inc. All rights reserved.
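    The WD step can be illustrated with PyWavelets; the wavelet choice, decomposition level, and universal soft threshold below are common defaults, not necessarily the paper's settings:

    ```python
    import numpy as np
    import pywt  # PyWavelets

    def wavelet_denoise(x, wavelet="db4", level=3):
        """Soft-threshold the detail coefficients (universal threshold) and
        reconstruct: a generic wavelet de-noising step."""
        coeffs = pywt.wavedec(x, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise scale from finest level
        thr = sigma * np.sqrt(2.0 * np.log(len(x)))
        coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[: len(x)]
    ```

    The de-noised series, rather than the raw one, is then handed to the forecasting stage (RSPA in the paper).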

  2. True random bit generators based on current time series of contact glow discharge electrolysis

    NASA Astrophysics Data System (ADS)

    Rojas, Andrea Espinel; Allagui, Anis; Elwakil, Ahmed S.; Alawadhi, Hussain

    2018-05-01

    Random bit generators (RBGs) in today's digital information and communication systems employ high-rate physical entropy sources such as electronic, photonic, or thermal time-series signals. However, the proper functioning of such physical systems is bound by specific constraints that make them in some cases weak and susceptible to external attacks. In this study, we show that the electrical current time series of contact glow discharge electrolysis, a dc voltage-powered micro-plasma in liquids, can be used for generating random bit sequences over a wide range of high dc voltages. The current signal is quantized into a binary stream by first applying a simple moving-average function, which centers the distribution around zero, and then applying logical operations that enable the binarized data to pass all tests in the industry-standard randomness test suite of the National Institute of Standards and Technology. Furthermore, the robustness of this RBG against power supply attacks has been examined and verified.
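    A sketch of that quantization pipeline follows; the window length, lag, and the final XOR step are illustrative choices standing in for the paper's unspecified logical operations:

    ```python
    import numpy as np

    def bits_from_current(i_t, window=64, lag=17):
        """Center a current trace with a simple moving average, binarize by sign,
        then XOR with a lagged copy to whiten residual bias (illustrative step)."""
        kernel = np.ones(window) / window
        centered = i_t - np.convolve(i_t, kernel, mode="same")  # zero-centered signal
        raw_bits = (centered > 0).astype(np.uint8)              # sign quantization
        return raw_bits[lag:] ^ raw_bits[:-lag]                 # one possible logical op
    ```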

  3. Forecasting of cyanobacterial density in Torrão reservoir using artificial neural networks.

    PubMed

    Torres, Rita; Pereira, Elisa; Vasconcelos, Vítor; Teles, Luís Oliva

    2011-06-01

    The ability of general regression neural networks (GRNN) to forecast the density of cyanobacteria in the Torrão reservoir (Tâmega river, Portugal) over a period of 15 days, based on three years of collected physical and chemical data, was assessed. Several models were developed, and 176 were selected based on their correlation values for the verification series. A time lag of 11 was used, equivalent to one sample (periods of 15 days in the summer and 30 days in the winter). Several combinations of the series were used. Input and output data collected from three depths of the reservoir were applied (surface, euphotic zone limit, and bottom). The model with the highest average correlation value yielded correlations of 0.991, 0.843, and 0.978 for the training, verification, and test series, respectively. This model had the three series independent in time: first the test series, then the verification series and, finally, the training series. Only six input variables were considered significant to the performance of this model: ammonia, phosphates, dissolved oxygen, water temperature, pH, and water evaporation, physical and chemical parameters referring to the three depths of the reservoir. These variables are common to the next four best models produced and, although these included other input variables, their performance was not better than that of the selected best model.

  4. Cross-bispectrum computation and variance estimation

    NASA Technical Reports Server (NTRS)

    Lii, K. S.; Helland, K. N.

    1981-01-01

    A method for the estimation of cross-bispectra of discrete real time series is developed. The asymptotic variance properties of the bispectrum are reviewed, and a method for the direct estimation of bispectral variance is given. The symmetry properties are described which minimize the computations necessary to obtain a complete estimate of the cross-bispectrum in the right-half-plane. A procedure is given for computing the cross-bispectrum by subdividing the domain into rectangular averaging regions which help reduce the variance of the estimates and allow easy application of the symmetry relationships to minimize the computational effort. As an example of the procedure, the cross-bispectrum of a numerically generated, exponentially distributed time series is computed and compared with theory.
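    A direct, segment-averaged estimator of the cross-bispectrum B(f1, f2) = E[X(f1) Y(f2) Z*(f1+f2)] can be written in a few lines; the averaging over segments plays the variance-reducing role of the rectangular averaging regions described above (function name and segment count are our assumptions, and the symmetry reductions mentioned in the abstract are omitted for clarity):

    ```python
    import numpy as np

    def cross_bispectrum(x, y, z, nseg=32):
        """Direct (FFT, segment-averaged) cross-bispectrum estimate for three
        real time series of equal length."""
        n = len(x) // nseg
        B = np.zeros((n, n), dtype=complex)
        f1 = np.arange(n)[:, None]
        f2 = np.arange(n)[None, :]
        for k in range(nseg):
            s = slice(k * n, (k + 1) * n)
            X, Y, Z = np.fft.fft(x[s]), np.fft.fft(y[s]), np.fft.fft(z[s])
            B += X[f1] * Y[f2] * np.conj(Z[(f1 + f2) % n])
        return B / nseg   # more segments -> lower variance, coarser frequency grid
    ```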

  5. Geodesic regression for image time-series.

    PubMed

    Niethammer, Marc; Huang, Yang; Vialard, François-Xavier

    2011-01-01

    Registration of image-time series has so far been accomplished (i) by concatenating registrations between image pairs, (ii) by solving a joint estimation problem resulting in piecewise geodesic paths between image pairs, (iii) by kernel based local averaging or (iv) by augmenting the joint estimation with additional temporal irregularity penalties. Here, we propose a generative model extending least squares linear regression to the space of images by using a second-order dynamic formulation for image registration. Unlike previous approaches, the formulation allows for a compact representation of an approximation to the full spatio-temporal trajectory through its initial values. The method also opens up possibilities to design image-based approximation algorithms. The resulting optimization problem is solved using an adjoint method.

  6. Random walker in temporally deforming higher-order potential forces observed in a financial crisis.

    PubMed

    Watanabe, Kota; Takayasu, Hideki; Takayasu, Misako

    2009-11-01

    Basic peculiarities of market price fluctuations are known to be well described by a recently developed random-walk model in a temporally deforming quadratic potential force whose center is given by a moving average of past price traces [M. Takayasu, T. Mizuno, and H. Takayasu, Physica A 370, 91 (2006)]. By analyzing high-frequency financial time series of exceptional events, such as bubbles and crashes, we confirm the appearance of a higher-order potential force in the markets. We show the statistical significance of its existence by applying the information criterion. This type of time series analysis is expected to be applied widely for detecting nonstationary symptoms in random phenomena.
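    The underlying model can be simulated in a few lines; the parameter values are arbitrary, and setting c to a nonzero value switches on a higher-order potential force of the kind the paper detects:

    ```python
    import numpy as np

    def simulate_walker(n=10000, window=20, b=0.3, c=0.0, sigma=1.0, seed=0):
        """Random walk attracted to the moving average M_t of its own past:
        p[t+1] = p[t] - b*(p[t]-M_t) - c*(p[t]-M_t)**3 + noise."""
        rng = np.random.default_rng(seed)
        p = np.zeros(n)
        for t in range(window, n - 1):
            m = p[t - window:t].mean()          # center of the deforming potential
            dp = p[t] - m
            p[t + 1] = p[t] - b * dp - c * dp**3 + sigma * rng.standard_normal()
        return p
    ```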

  7. Comparative climatology of four marine stratocumulus regimes

    NASA Technical Reports Server (NTRS)

    Hanson, Howard P.

    1990-01-01

    The climatologies of marine stratocumulus (MSc) cloud regimes off the west coasts of California, Peru, Morocco, and Angola are examined. Long-term, annual averages are presented for several quantities of interest in the four MSc regimes. The climatologies were constructed using the Comprehensive Ocean-Atmosphere Data Set (COADS). A 40-year time series of observations was extracted for 32 x 32 deg analysis domains. The data were taken from the monthly-averaged, 2 deg product. The resolution of the analysis is therefore limited to scales greater than 200 km, with submonthly variability not resolved. The averages of total cloud cover, sea surface temperature, and surface pressure are presented.

  8. Comparing the structure of an emerging market with a mature one under global perturbation

    NASA Astrophysics Data System (ADS)

    Namaki, A.; Jafari, G. R.; Raei, R.

    2011-09-01

    In this paper we investigate the Tehran Stock Exchange (TSE) and the Dow Jones Industrial Average (DJIA) in terms of perturbed correlation matrices. There are two methods of perturbing a stock market, namely local and global perturbation. In the local method, we replace a correlation coefficient of the cross-correlation matrix with one calculated from two Gaussian-distributed time series, whereas in the global method, we reconstruct the correlation matrix after replacing the original return series with Gaussian-distributed time series. The local perturbation is just a technical study. We analyze these markets through two statistical approaches, random matrix theory (RMT) and the correlation coefficient distribution. Using RMT, we find that the largest eigenvalue is an influence common to all stocks and that this eigenvalue has a peak during financial shocks. We find there are a few correlated stocks that provide the essential robustness of the stock market, but by replacing these return time series with Gaussian-distributed time series, the mean values of the correlation coefficients, the largest eigenvalues of the stock markets, and the fraction of eigenvalues that deviate from the RMT prediction fall sharply in both markets. By comparing the two markets, we can see that the DJIA is more sensitive to global perturbations. These findings are crucial for risk management and portfolio selection.
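    The global perturbation test reduces to a few numpy calls. In the sketch below (function name ours), `returns` holds one stock per row, and the RMT bound in the comment is the Marchenko-Pastur upper edge for N stocks observed over T points:

    ```python
    import numpy as np

    def global_perturbation(returns, seed=0):
        """Largest correlation eigenvalue of the real market vs. the same-shape
        matrix of Gaussian series (the 'global perturbation' of the abstract)."""
        n_stocks, n_obs = returns.shape
        lam_real = np.linalg.eigvalsh(np.corrcoef(returns)).max()
        gauss = np.random.default_rng(seed).standard_normal(returns.shape)
        lam_gauss = np.linalg.eigvalsh(np.corrcoef(gauss)).max()
        lam_rmt = (1 + np.sqrt(n_stocks / n_obs)) ** 2   # Marchenko-Pastur upper edge
        return lam_real, lam_gauss, lam_rmt
    ```

    A large gap between `lam_real` and `lam_rmt`, collapsing under perturbation, reproduces the market-mode behavior the abstract describes.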

  9. An Algorithm Framework for Isolating Anomalous Signals in Electromagnetic Data

    NASA Astrophysics Data System (ADS)

    Kappler, K. N.; Schneider, D.; Bleier, T.; MacLean, L. S.

    2016-12-01

    QuakeFinder and its international collaborators have installed and currently maintain an array of 165 three-axis induction magnetometer instrument sites in California, Peru, Taiwan, Greece, Chile, and Sumatra. Based on research by Bleier et al. (2009), Fraser-Smith et al. (1990), and Freund (2007), the electromagnetic data from these instruments are being analyzed for pre-earthquake signatures. This analysis consists of both private research by QuakeFinder and institutional collaborations (PUCP in Peru, NCU in Taiwan, NOA in Greece, LASP at University of Colorado, Stanford, UCLA, NASA-ESI, NASA-AMES, and USC-CSEP). QuakeFinder has developed an algorithm framework aimed at isolating anomalous signals (pulses) in the time series. Results are presented from an application of this framework to induction-coil magnetometer data. Our data-driven approach starts with sliding windows, of a variety of lengths and overlaps, applied to uniformly resampled array data. Data variance (a proxy for energy) is calculated on each window, and a short-term average/long-term average (STA/LTA) filter is applied to the variance time series. Pulses are identified by flagging time intervals in the STA/LTA-filtered time series which exceed a threshold. Flagged time intervals are subsequently fed into a feature extraction program which computes statistical properties of the resampled data. These features are then filtered using a Principal Component Analysis (PCA) based method to cluster similar pulses. We explore the extent to which this approach categorizes pulses with known sources (e.g. cars, lightning, etc.), so that the remaining pulses of unknown origin can be analyzed with respect to their relationship with seismicity. We seek a correlation between these daily pulse counts (with known sources removed) and subsequent (days to weeks) seismic events greater than M5 within a 15 km radius. Thus we explore functions which map daily pulse counts to a time series representing the likelihood of a seismic event occurring at some future time. These "pseudo-probabilities" can in turn be represented as Molchan diagrams. The Molchan curve provides an effective cost function for optimization and allows for a rigorous statistical assessment of the validity of pre-earthquake signals in the electromagnetic data.
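    The pulse-flagging stage compresses to a short numpy routine; window sizes and the threshold below are placeholders, not QuakeFinder's operational settings:

    ```python
    import numpy as np

    def flag_pulses(x, win=256, nsta=4, nlta=64, thresh=5.0):
        """Variance (energy proxy) in non-overlapping windows, then an STA/LTA
        ratio on the variance series; windows with ratio > thresh are flagged."""
        nwin = len(x) // win
        var = x[: nwin * win].reshape(nwin, win).var(axis=1)
        sta = np.convolve(var, np.ones(nsta) / nsta, mode="same")
        lta = np.convolve(var, np.ones(nlta) / nlta, mode="same")
        ratio = sta / np.maximum(lta, 1e-30)
        return np.flatnonzero(ratio > thresh)   # indices to feed feature extraction
    ```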

  10. A statistical approach for generating synthetic tip stress data from limited CPT soundings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Basalams, M.K.

    CPT tip stress data obtained from a uranium mill tailings impoundment are treated as time series. A statistical class of models developed for time series is explored to investigate its applicability to the tip stress series. These models were developed by Box and Jenkins (1970) and are known as Autoregressive Moving Average (ARMA) models. This research demonstrates how to apply ARMA models to tip stress series. Generation of synthetic tip stress series that preserve the main statistical characteristics of the measured series is also investigated. Multiple regression analysis is used to model the regional variation of the ARMA model parameters as well as the regional variation of the mean and the standard deviation of the measured tip stress series. The reliability of the generated series is investigated from a geotechnical point of view as well as from a statistical point of view. Estimation of the total settlement using the measured and the generated series subjected to the same loading condition is performed. The variation of friction angle with depth of the impoundment materials is also investigated. This research shows that these series can be modeled by the Box and Jenkins ARMA models. A third-degree autoregressive model, AR(3), is selected to represent these series. A theoretical double exponential density function is fitted to the AR(3) model residuals. Synthetic tip stress series are generated at nearby locations. The generated series are shown to be reliable in estimating the total settlement and the friction angle variation with depth for this particular site.
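    With statsmodels, the AR(3)-plus-double-exponential-residual recipe looks roughly like this; the standardization step and the Laplace scale choice are our assumptions, and scaling back to field units is simplified:

    ```python
    import numpy as np
    from statsmodels.tsa.ar_model import AutoReg

    def synthetic_tip_stress(q_c, n_out, seed=0):
        """Fit an AR(3) to a standardized tip-stress series and generate a
        synthetic series driven by double-exponential (Laplace) innovations."""
        z = (q_c - q_c.mean()) / q_c.std()
        res = AutoReg(z, lags=3, trend="n").fit()
        phi = res.params                              # [phi1, phi2, phi3]
        b = res.resid.std() / np.sqrt(2.0)            # Laplace scale matching residual sd
        rng = np.random.default_rng(seed)
        out = np.zeros(n_out + 100)                   # 100-sample warm-up
        for t in range(3, len(out)):
            out[t] = phi @ out[t-3:t][::-1] + rng.laplace(scale=b)
        return out[100:] * q_c.std() + q_c.mean()     # warm-up discarded, units restored
    ```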

  11. Generalized Riemann hypothesis and stochastic time series

    NASA Astrophysics Data System (ADS)

    Mussardo, Giuseppe; LeClair, André

    2018-06-01

    Using the Dirichlet theorem on the equidistribution of residue classes modulo q and the Lemke Oliver–Soundararajan conjecture on the distribution of pairs of residues on consecutive primes, we show that the domain of convergence of the infinite product of Dirichlet L-functions of non-principal characters can be extended from Re(s) > 1 down to Re(s) > 1/2, without encountering any zeros before reaching this critical line. The possibility of doing so can be traced back to a universal diffusive random-walk behavior of a series C_N over the primes, which underlies the convergence of the infinite product of the Dirichlet functions. The series C_N presents several aspects in common with stochastic time series, and its control requires addressing a problem similar to the single Brownian trajectory problem in statistical mechanics. In the case of the Dirichlet functions of non-principal characters, we show that this problem can be solved in terms of a self-averaging procedure based on an ensemble of block variables computed on extended intervals of primes. These intervals, called inertial intervals, ensure the ergodicity and stationarity of the time series underlying the quantity C_N. The infinity of primes also ensures the absence of rare events which would have been responsible for a scaling behavior different from the universal law of random walks.

  12. Hydroclimate temporal variability in a coastal Mediterranean watershed: the Tafna basin, North-West Algeria

    NASA Astrophysics Data System (ADS)

    Boulariah, Ouafik; Longobardi, Antonia; Meddi, Mohamed

    2017-04-01

    One of the major challenges scientists, practitioners, and stakeholders are nowadays involved in is to provide the worldwide population with reliable water supplies while protecting, at the same time, the quality and quantity of freshwater ecosystems. Climate and land use changes undermine the balance between water demand and water availability, causing alteration of river flow regimes. Knowledge of the temporal and spatial variability of hydro-climate variables is clearly helpful to plan drought and flood hazard mitigation strategies, but also to adapt them to future environmental scenarios. The present study relates to the coastal semi-arid Tafna catchment, located in the North-West of Algeria, within the Mediterranean basin. The aim is to investigate the temporal variability of streamflow and rainfall indices in six sub-basins of the larger Tafna catchment, attempting to relate streamflow and rainfall changes. Rainfall and streamflow time series have been preliminarily tested for data quality and homogeneity, through the coupled application of the two-tailed t test, the Pettitt test, and the Cumsum test (significance levels of 0.1, 0.05, and 0.01). Subsequently, maximum annual daily rainfall and streamflow and average daily annual rainfall and streamflow time series have been derived and tested for temporal variability, through the application of the Mann-Kendall and Sen's tests. Overall, maximum annual daily streamflow time series exhibit a negative trend, which is however significant for only 30% of the stations. Maximum annual daily rainfall series also exhibit a negative trend, which is instead significant for 80% of the stations. In the case of average daily annual streamflow and rainfall, the tendency for a decrease in time is less clear and, in both cases, appears significant for 60% of the stations.
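    For reference, the Mann-Kendall statistic and Sen's slope used above can be computed directly; this plain implementation omits the tie correction and is O(n^2), which is fine for annual series:

    ```python
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(x):
        """Two-sided Mann-Kendall trend test (no tie correction) plus Sen's slope."""
        n = len(x)
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0   # continuity corr.
        p = 2.0 * (1.0 - norm.cdf(abs(z)))
        slopes = [(x[j] - x[i]) / (j - i) for i in range(n - 1) for j in range(i + 1, n)]
        return z, p, np.median(slopes)   # Sen's slope is the median pairwise slope
    ```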

  13. The Search for Solar Gravity-Mode Oscillations: an Analysis Using ULYSSES Magnetic Field Data

    NASA Astrophysics Data System (ADS)

    Denison, David G. T.; Walden, Andrew T.

    1999-04-01

    In 1995 Thomson, Maclennan, and Lanzerotti (TML) reported on work where they carried out a time-series analysis of energetic particle fluxes measured by Ulysses and Voyager 2 and concluded that solar g-mode oscillations had been detected. The approach is based on finding significant peaks in spectra using a statistical F-test. Using three sets of 2048 hourly averages of Ulysses magnetic field magnitude data, and the same multitaper spectral estimation techniques, we obtain, on average, nine coincidences with the lines listed in the TML paper. We could not reject the hypothesis that the F-test peaks we obtained are uniformly distributed, and further statistical computations show that a sequence of uniformly distributed lines generated on the frequency grid would have, on average, nine coincidences with the lines of TML. Further, we find that a time series generated from a model with a smooth spectrum of the same form as derived from the Ulysses magnetic field magnitude data and having no true spectral lines above 2 μHz, when subjected to the multitaper F-tests, gives rise to essentially the same number of "identified" lines and coincident frequencies as found with our Ulysses data. We conclude that our average nine coincidences with the lines found by TML can arise by mechanisms wholly unconnected with the existence of real physical spectral lines and hence find no firm evidence that g-modes can be detected in our sample of magnetic field data.

  14. Wavelet regression model in forecasting crude oil price

    NASA Astrophysics Data System (ADS)

    Hamid, Mohd Helmie; Shabri, Ani

    2017-05-01

    This study presents the performance of the wavelet multiple linear regression (WMLR) technique in daily crude oil forecasting. The WMLR model was developed by integrating the discrete wavelet transform (DWT) and the multiple linear regression (MLR) model. The original time series was decomposed into sub-time series at different scales by wavelet theory. Correlation analysis was conducted to assist in the selection of optimal decomposed components as inputs for the WMLR model. The daily WTI crude oil price series is used in this study to test the prediction capability of the proposed model. The forecasting performance of the WMLR model was also compared with regular multiple linear regression (MLR), the Autoregressive Integrated Moving Average (ARIMA), and Generalized Autoregressive Conditional Heteroscedasticity (GARCH) models using root mean square error (RMSE) and mean absolute error (MAE). Based on the experimental results, the WMLR model performs better than the other forecasting techniques tested in this study.

  15. A time-series method for automated measurement of changes in mitotic and interphase duration from time-lapse movies.

    PubMed

    Sigoillot, Frederic D; Huckins, Jeremy F; Li, Fuhai; Zhou, Xiaobo; Wong, Stephen T C; King, Randall W

    2011-01-01

    Automated time-lapse microscopy can visualize proliferation of large numbers of individual cells, enabling accurate measurement of the frequency of cell division and the duration of interphase and mitosis. However, extraction of quantitative information by manual inspection of time-lapse movies is too time-consuming to be useful for analysis of large experiments. Here we present an automated time-series approach that can measure changes in the duration of mitosis and interphase in individual cells expressing fluorescent histone 2B. The approach requires analysis of only 2 features, nuclear area and average intensity. Compared to supervised learning approaches, this method reduces processing time and does not require generation of training data sets. We demonstrate that this method is as sensitive as manual analysis in identifying small changes in interphase or mitotic duration induced by drug or siRNA treatment. This approach should facilitate automated analysis of high-throughput time-lapse data sets to identify small molecules or gene products that influence timing of cell division.

  16. Common mode error in Antarctic GPS coordinate time series and its effect on bedrock-uplift estimates

    NASA Astrophysics Data System (ADS)

    Liu, Bin; King, Matt; Dai, Wujiao

    2018-05-01

    Spatially correlated common mode error (CME) always exists in regional or larger GPS networks. We applied independent component analysis (ICA) to GPS vertical coordinate time series in Antarctica from 2010 to 2014 and made a comparison with principal component analysis (PCA). Using PCA/ICA, the time series can be decomposed into a set of temporal components and their spatial responses, and we assume the components with common spatial responses are CME. An average reduction of ~40% in the RMS values was achieved with both PCA and ICA filtering. However, the common mode components obtained from the two approaches have different spatial and temporal features. The ICA time series present interesting correlations with modeled atmospheric and non-tidal ocean loading displacements. A white noise (WN) plus power-law noise (PL) model was adopted in the GPS velocity estimation using maximum likelihood estimation (MLE), with a ~55% reduction of the velocity uncertainties after filtering using ICA. Meanwhile, spatiotemporal filtering reduces the amplitude of the PL and periodic terms in the GPS time series. Finally, we compare the GPS uplift velocities, after correction for elastic effects, with recent models of glacial isostatic adjustment (GIA). The agreement of the GPS-observed velocities with four GIA models is generally improved after the spatiotemporal filtering, with a mean reduction of ~0.9 mm/yr in the WRMS values, possibly allowing for more confident separation of the various GIA model predictions.
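    A PCA-based version of the filtering step fits in a few lines; ICA would replace the SVD with an unmixing algorithm such as FastICA, but the remove-common-modes logic is the same (function name and mode count are ours):

    ```python
    import numpy as np

    def remove_cme_pca(res, n_modes=1):
        """Stack detrended vertical residuals (time x stations), remove the
        leading principal component(s) as common mode error."""
        X = res - res.mean(axis=0)
        U, S, Vt = np.linalg.svd(X, full_matrices=False)
        cme = (U[:, :n_modes] * S[:n_modes]) @ Vt[:n_modes]   # rank-k common signal
        return X - cme
    ```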

  17. Variance Analysis of Unevenly Spaced Time Series Data

    DTIC Science & Technology

    1995-12-01

    Data were subsequently removed from each simulated data set using typical TWSTFT data patterns to create two unevenly spaced sets with average... and techniques are presented for correcting errors caused by uneven data spacing in typical TWSTFT data sets. INTRODUCTION. Data points obtained from an... the possible data available. In TWSTFT, the task is less daunting: time transfers are typically measured on Monday, Wednesday, and Friday, so, in a

  18. Water Column Variability in Coastal Regions

    DTIC Science & Technology

    1997-09-30

    Andrews, Woods, and Kester deployed a spar buoy at a central location in Narragansett Bay to obtain time-series variations at multiple depths (1, 4

  19. Time series inversion of spectra from ground-based radiometers

    NASA Astrophysics Data System (ADS)

    Christensen, O. M.; Eriksson, P.

    2013-07-01

    Retrieving time series of atmospheric constituents from ground-based spectrometers often requires different temporal averaging depending on the altitude region in focus. This can lead to several datasets existing for one instrument, which complicates validation and comparisons between instruments. This paper puts forth a possible solution by incorporating the temporal domain into the maximum a posteriori (MAP) retrieval algorithm. The state vector is increased to include measurements spanning a time period, and the temporal correlations between the true atmospheric states are explicitly specified in the a priori uncertainty matrix. This allows the MAP method to effectively select the best temporal smoothing for each altitude, removing the need for several datasets to cover different altitudes. The method is compared to traditional averaging of spectra using a simulated retrieval of water vapour in the mesosphere. The simulations show that the method offers a significant advantage compared to the traditional method, extending the sensitivity an additional 10 km upwards without reducing the temporal resolution at lower altitudes. The method is also tested on the Onsala Space Observatory (OSO) water vapour microwave radiometer confirming the advantages found in the simulation. Additionally, it is shown how the method can interpolate data in time and provide diagnostic values to evaluate the interpolated data.
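    The key ingredient, temporal correlation in the a priori covariance, can be sketched as a Kronecker product; the exponential kernel is an assumption for illustration, since the paper does not prescribe a particular correlation function:

    ```python
    import numpy as np

    def temporal_apriori(times, sigma_alt, corr_time):
        """A priori covariance for a state vector stacking profiles at several
        measurement times: exponential temporal correlation exp(-|ti-tj|/tau)
        scaling a single-time (altitude x altitude) covariance block."""
        t = np.asarray(times, dtype=float)
        temporal = np.exp(-np.abs(t[:, None] - t[None, :]) / corr_time)
        return np.kron(temporal, sigma_alt)   # (ntime*nalt) x (ntime*nalt)
    ```

    A short `corr_time` lets the retrieval resolve fast variations at well-measured altitudes, while a long one pools measurements where the signal is weak, which is exactly the altitude-dependent smoothing the abstract describes.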

  20. Time-Series Forecast Modeling on High-Bandwidth Network Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wucherl; Sim, Alex

    With the increasing number of geographically distributed scientific collaborations and the growing sizes of scientific data, it has become challenging for users to achieve the best possible network performance on a shared network. In this paper, we have developed a model to forecast expected bandwidth utilization on high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and the scheduling of data movements on high-bandwidth networks to accommodate the ever-increasing data volume of large-scale scientific data applications. A univariate time-series forecast model is developed with the Seasonal decomposition of Time series by Loess (STL) and the AutoRegressive Integrated Moving Average (ARIMA) on Simple Network Management Protocol (SNMP) path utilization measurement data. Compared with a traditional approach such as the Box-Jenkins methodology to train the ARIMA model, our forecast model reduces computation time by up to 92.6%. It also shows resilience against abrupt network usage changes. Finally, our forecast model conducts a large number of multi-step forecasts, and the forecast errors are within the mean absolute deviation (MAD) of the monitored measurements.

  1. Time-Series Forecast Modeling on High-Bandwidth Network Measurements

    DOE PAGES

    Yoo, Wucherl; Sim, Alex

    2016-06-24

    With the increasing number of geographically distributed scientific collaborations and the growing sizes of scientific data, it has become challenging for users to achieve the best possible network performance on a shared network. In this paper, we have developed a model to forecast expected bandwidth utilization on high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and the scheduling of data movements on high-bandwidth networks to accommodate the ever-increasing data volume of large-scale scientific data applications. A univariate time-series forecast model is developed with the Seasonal decomposition of Time series by Loess (STL) and the AutoRegressive Integrated Moving Average (ARIMA) on Simple Network Management Protocol (SNMP) path utilization measurement data. Compared with a traditional approach such as the Box-Jenkins methodology to train the ARIMA model, our forecast model reduces computation time by up to 92.6%. It also shows resilience against abrupt network usage changes. Finally, our forecast model conducts a large number of multi-step forecasts, and the forecast errors are within the mean absolute deviation (MAD) of the monitored measurements.
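    A minimal STL + ARIMA pipeline in the spirit of the model the two records above describe, using statsmodels; the period, ARIMA order, and horizon are placeholders:

    ```python
    import numpy as np
    from statsmodels.tsa.seasonal import STL
    from statsmodels.tsa.arima.model import ARIMA

    def stl_arima_forecast(util, period=24, steps=24, order=(1, 1, 1)):
        """STL-decompose a path-utilization series, forecast the seasonally
        adjusted part with ARIMA, then add back the last seasonal cycle."""
        dec = STL(util, period=period).fit()
        adjusted = util - dec.seasonal                 # trend + remainder
        fc = ARIMA(adjusted, order=order).fit().forecast(steps=steps)
        season = np.tile(dec.seasonal[-period:], steps // period + 1)[:steps]
        return np.asarray(fc) + season
    ```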

  2. Functional Connectivity Parcellation of the Human Thalamus by Independent Component Analysis.

    PubMed

    Zhang, Sheng; Li, Chiang-Shan R

    2017-11-01

    As a key structure to relay and integrate information, the thalamus supports multiple cognitive and affective functions through the connectivity between its subnuclei and cortical and subcortical regions. Although extant studies have largely described thalamic regional functions in anatomical terms, evidence accumulates to suggest a more complex picture of subareal activities and connectivities of the thalamus. In this study, we aimed to parcellate the thalamus and examine whole-brain connectivity of its functional clusters. With resting state functional magnetic resonance imaging data from 96 adults, we used independent component analysis (ICA) to parcellate the thalamus into 10 components. On the basis of the independence assumption, ICA helps to identify how subclusters overlap spatially. Whole-brain functional connectivity of each subdivision was computed for the independent component's time course (ICtc), which is a unique time series representing an IC. For comparison, we computed seed-region-based functional connectivity using the averaged time course across all voxels within a thalamic subdivision. The results showed that, at p < 10^-6, corrected, 49% of voxels on average overlapped among subdivisions. Compared with seed-region analysis, ICtc analysis revealed patterns of connectivity that were more clearly distinguished between thalamic clusters. ICtc analysis demonstrated thalamic connectivity to the primary motor cortex, which has eluded seed-region analysis as well as previous studies based on averaged time series, and clarified thalamic connectivity to the hippocampus, caudate nucleus, and precuneus. The new findings elucidate the functional organization of the thalamus and suggest that ICA clustering in combination with ICtc, rather than seed-region analysis, better distinguishes whole-brain connectivities among functional clusters of a brain region.

  3. Time Series of Greenland Ice-Sheet Elevations and Mass Changes from ICESat 2003-2009

    NASA Astrophysics Data System (ADS)

    Zwally, H. J.; Li, J.; Medley, B.; Robbins, J. W.; Yi, D.

    2015-12-01

    We follow the repeat-track analysis (RTA) of ICESat surface-elevation data by a second stage that adjusts the measured elevations on repeat passes to the reference track taking into account the cross-track slope (αc), in order to construct elevation time series. αc are obtained from RTA simultaneous solutions for αc, dh/dt, and h0. The height measurements on repeat tracks are initially interpolated to uniform along-track reference points (every 172 m) and times (ti) giving the h(xi,ti) used in the RTA solutions. The xi are the cross-track spacings from the reference track and i is the laser campaign index. The adjusted elevation measurements at the along-track reference points are hr(ti) = h(xi,ti) - xi tan(αc) - h0. The hr(ti) time series are averaged over 50 km cells creating H(ti) series and further averaged (weighted by cell area) to H(t) time series over drainage systems (DS), elevation bands, regions, and the entire ice sheet. Temperature-driven changes in the rate of firn compaction, CT(t), are calculated for 50 km cells with our firn-compaction model giving I(t) = H(t) - CT(t) - B(t) where B(t) is the vertical motion of the bedrock. During 2003 to 2009, the average dCT(t)/dt in the accumulation zone is -5 cm/yr, which amounts to a -75 km3/yr correction to ice volume change estimates. The I(t) are especially useful for studying the seasonal cycle of mass gains and losses and interannual variations. The H(t) for the ablation zone are fitted with a multi-variate function with a linear component describing the upward component of ice flow plus winter accumulation (fall through spring) and a portion of a sine function describing the superimposed summer melting. During fall to spring the H(t) indicate that the upward motion of the ice flow is at a rate of 1 m/yr, giving an annual mass gain of 180 Gt/yr in the ablation zone. The summer loss from surface melting in the high-melt summer of 2005 is 350 Gt/yr, giving a net surface loss of 170 Gt/yr from the ablation zone for 2005. During 2003-2008, the H(t) for the ablation zone show accelerations of the mass losses in the northwest DS8 and in the west-central DS7 (including Jacobshavn glacier) and offsetting decelerations of the mass losses in the east-central DS3 and southeast DS4, much of which occurred in 2008 possibly due to an eastward shift in the surface mass balance.

  4. Short-term load forecasting of power system

    NASA Astrophysics Data System (ADS)

    Xu, Xiaobin

    2017-05-01

    In order to ensure the scientific soundness of power system optimization, it is necessary to improve load forecasting accuracy. Power system load forecasting is based on accurate statistical and survey data and starts from the history and current situation of electricity consumption, using scientific methods to predict the future development trend and variation pattern of the power load. Short-term load forecasting is the basis of power system operation and analysis, and is of great significance to unit commitment, economic dispatch, and safety checks. Therefore, load forecasting of the power system is explained in detail in this paper. First, we use data from 2012 to 2014 to establish a partial least squares model for regression analysis of the relationship between daily maximum load, daily minimum load, daily average load, and each meteorological factor, and, by inspecting the histogram of regression coefficients, we select daily maximum temperature, daily minimum temperature, and daily average temperature as the meteorological factors used to improve the accuracy of load forecasting. Secondly, in the case of uncertain climate impact, we use a time series model to predict the load data for 2015: the 2009-2014 load data were sorted, and the previous six years of data were used to forecast the corresponding period of 2015. The criterion for prediction accuracy is the average of the standard deviations between the prediction results and the average load of the previous six years. Finally, considering the climate effect, we use a BP neural network model to predict the data for 2015 and optimize the forecast results on the basis of the time series model.
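    The factor-screening step can be reproduced with scikit-learn's PLS implementation; the function name, feature names, and component count below are illustrative:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def screen_weather_factors(X, y, names, n_components=2, keep=3):
        """Partial least squares of daily load y on meteorological factors X;
        rank factors by |regression coefficient|, a stand-in for reading the
        coefficient histogram described in the abstract."""
        pls = PLSRegression(n_components=n_components).fit(X, y)
        coefs = np.abs(pls.coef_).ravel()
        order = np.argsort(coefs)[::-1][:keep]
        return [(names[i], float(coefs[i])) for i in order]
    ```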

  5. Correcting for day of the week and public holiday effects: improving a national daily syndromic surveillance service for detecting public health threats.

    PubMed

    Buckingham-Jeffery, Elizabeth; Morbey, Roger; House, Thomas; Elliot, Alex J; Harcourt, Sally; Smith, Gillian E

    2017-05-19

    As service provision and patient behaviour varies by day, healthcare data used for public health surveillance can exhibit large day of the week effects. These regular effects are further complicated by the impact of public holidays. Real-time syndromic surveillance requires the daily analysis of a range of healthcare data sources, including family doctor consultations (called general practitioners, or GPs, in the UK). Failure to adjust for such reporting biases during analysis of syndromic GP surveillance data could lead to misinterpretations including false alarms or delays in the detection of outbreaks. The simplest smoothing method to remove a day of the week effect from daily time series data is a 7-day moving average. Public Health England developed the working day moving average in an attempt also to remove public holiday effects from daily GP data. However, neither of these methods adequately account for the combination of day of the week and public holiday effects. The extended working day moving average was developed. This is a further data-driven method for adding a smooth trend curve to a time series graph of daily healthcare data, that aims to take both public holiday and day of the week effects into account. It is based on the assumption that the number of people seeking healthcare services is a combination of illness levels/severity and the ability or desire of patients to seek healthcare each day. The extended working day moving average was compared to the seven-day and working day moving averages through application to data from two syndromic indicators from the GP in-hours syndromic surveillance system managed by Public Health England. The extended working day moving average successfully smoothed the syndromic healthcare data by taking into account the combined day of the week and public holiday effects. In comparison, the seven-day and working day moving averages were unable to account for all these effects, which led to misleading smoothing curves. The results from this study make it possible to identify trends and unusual activity in syndromic surveillance data from GP services in real-time independently of the effects caused by day of the week and public holidays, thereby improving the public health action resulting from the analysis of these data.
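    Simplified pandas versions of the first two smoothers make the contrast concrete; the extended working day moving average itself involves additional modeling not reproduced here:

    ```python
    import pandas as pd

    def smoothed(counts, holidays):
        """counts: daily pd.Series with a DatetimeIndex. Returns a plain 7-day
        moving average and a working-day average (weekends and public holidays
        excluded before smoothing), simplified versions of the PHE methods."""
        seven_day = counts.rolling(7, center=True).mean()
        working = counts.index.dayofweek < 5
        working &= ~counts.index.isin(pd.DatetimeIndex(holidays))
        working_day = counts.where(working).rolling(7, center=True,
                                                    min_periods=1).mean()
        return seven_day, working_day
    ```

    On a series with a Monday bank holiday, `seven_day` dips around the holiday while `working_day` simply ignores it, which is the behavior the working day moving average was designed to achieve.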

  6. Implications of different approaches for characterizing ambient air pollutant concentrations within the urban airshed for time-series studies and health benefits analyses.

    PubMed

    Strickland, Matthew J; Darrow, Lyndsey A; Mulholland, James A; Klein, Mitchel; Flanders, W Dana; Winquist, Andrea; Tolbert, Paige E

    2011-05-11

    In time-series studies of the health effects of urban air pollutants, decisions must be made about how to characterize pollutant levels within the airshed. Emergency department visits for pediatric asthma exacerbations were collected from Atlanta hospitals. Concentrations of carbon monoxide, nitrogen dioxide, ozone, sulfur dioxide, particulate matter less than 10 microns in diameter (PM10), particulate matter less than 2.5 microns in diameter (PM2.5), and the PM2.5 components elemental carbon, organic carbon, and sulfate were obtained from networks of ambient air quality monitors. For each pollutant we created three different daily metrics. For one metric we used the measurements from a centrally-located monitor; for the second we averaged measurements across the network of monitors; and for the third we estimated the population-weighted average concentration using an isotropic spatial model. Rate ratios for each of the metrics were estimated from time-series models. For pollutants with relatively homogeneous spatial distributions we observed only small differences in the rate ratio across the three metrics. Conversely, for spatially heterogeneous pollutants we observed larger differences in the rate ratios. For a given pollutant, the strength of evidence for an association (i.e., chi-square statistics) tended to be similar across metrics. Given that the chi-square statistics were similar across the metrics, the differences in the rate ratios for the spatially heterogeneous pollutants may seem like a relatively small issue. However, these differences are important for health benefits analyses, where results from epidemiological studies on the health effects of pollutants (per unit change in concentration) are used to predict the health impacts of a reduction in pollutant concentrations. We discuss the relative merits of the different metrics as they pertain to time-series studies and health benefits analyses.

  7. Forecasting daily attendances at an emergency department to aid resource planning

    PubMed Central

    Sun, Yan; Heng, Bee Hoon; Seow, Yian Tay; Seow, Eillyne

    2009-01-01

    Background: Accurate forecasting of emergency department (ED) attendances can be a valuable tool for micro and macro level planning. Methods: The data for analysis were the counts of daily patient attendances at the ED of an acute care regional general hospital from July 2005 to March 2008. Patients were stratified into three acuity categories, i.e. P1, P2 and P3, with P1 being the most acute and P3 being the least acute. The autoregressive integrated moving average (ARIMA) method was separately applied to each of the three acuity categories and to total patient attendances. Independent variables included in the model were public holiday (yes or no), ambient air quality measured by the pollution standard index (PSI), daily ambient average temperature, and daily relative humidity. The seasonal components of weekly and yearly periodicities in the time series of daily attendances were also studied. Univariate analysis by t-tests and multivariate time series analysis were carried out in SPSS version 15. Results: By time series analyses, P1 attendances did not show any weekly or yearly periodicity and were only predicted by ambient air quality of PSI > 50. P2 and total attendances showed weekly periodicities and were also significantly predicted by public holiday. P3 attendances were significantly correlated with day of the week, month of the year, public holiday, and ambient air quality of PSI > 50. After applying the developed models to validate the forecast, the MAPE of prediction by the models was 16.8%, 6.7%, 8.6%, and 4.8% for P1, P2, P3, and total attendances, respectively. The models were able to account for most of the significant autocorrelations present in the data. Conclusion: Time series analysis has been shown to provide a useful, readily available tool for predicting emergency department workload that can be used to plan staff rosters and resource planning. PMID:19178716

  8. Reliability prediction of ontology-based service compositions using Petri net and time series models.

    PubMed

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether a process meets their quantitative quality requirements. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy.

  9. Analysis of temperature trends in Northern Serbia

    NASA Astrophysics Data System (ADS)

    Tosic, Ivana; Gavrilov, Milivoj; Unkašević, Miroslava; Marković, Slobodan; Petrović, Predrag

    2017-04-01

    An analysis of air temperature trends in Northern Serbia for the annual and seasonal time series is performed for two periods: 1949-2013 and 1979-2013. Three data sets of surface air temperatures (monthly mean, monthly maximum, and monthly minimum temperatures) are analyzed at 9 stations with altitudes varying between 75 m and 102 m. Monthly mean temperatures are obtained as the average of the daily mean temperatures, while monthly maximum (minimum) temperatures are the maximum (minimum) values of daily temperatures in the corresponding month. Positive trends were found in 29 out of 30 time series; the only negative trend was found in winter during the period 1979-2013. Applying the Mann-Kendall test, significant positive trends were found in 15 series, 7 in the period 1949-2013 and 8 in the period 1979-2013, and no significant trend was found in the remaining 15 series. Significant positive trends dominate for the year, spring, and summer, where they were found in 14 out of 18 cases. Significant positive trends were found 7, 5, and 3 times in mean, maximum, and minimum temperatures, respectively. It was found that positive temperature trends are dominant in Northern Serbia.

  10. The Santander Atlantic Time-Series Station (SATS): A Time Series combination of a monthly hydrographic Station and The Biscay AGL Oceanic Observatory.

    NASA Astrophysics Data System (ADS)

    Lavin, Alicia; Somavilla, Raquel; Cano, Daniel; Rodriguez, Carmen; Gonzalez-Pola, Cesar; Viloria, Amaia; Tel, Elena; Ruiz-Villareal, Manuel

    2017-04-01

    Long-term time series stations have been developed in order to document seasonal to decadal scale variations in key physical and biogeochemical parameters. Long-term time series measurements are crucial for determining the physical and biological mechanisms controlling the system. The Science and Technology Ministers of the G7, in their Tsukuba Communiqué, have stated that 'many parts of the ocean interior are not sufficiently observed' and that 'it is crucial to develop far stronger scientific knowledge necessary to assess the ongoing changes in the ocean and their impact on economies.' Time series have classically been obtained by oceanographic ships that regularly cover standard sections and stations. Since 1991, shelf and slope waters of the southern Bay of Biscay have been regularly sampled in a monthly hydrographic line north of Santander, to a depth of 1000 m in the early stages and over the whole water column down to 2580 m in recent times. Nearby, in June 2007, the IEO deployed an oceanic-meteorological buoy (AGL Buoy, 43° 50.67'N, 3° 46.20'W, 40 km offshore; www.boya-agl.st.ieo.es). The Santander Atlantic Time Series Station is integrated in the Spanish Institute of Oceanography Observing System (IEOOS). The long-term hydrographic monitoring has made it possible to define the seasonality of the main oceanographic features, such as the upwelling, the Iberian Poleward Current, low-salinity incursions, and the trends and interannual variability of the mixed layer and of the main water masses, North Atlantic Central Water and Mediterranean Water. The relation of these changes to the high-frequency surface conditions recorded by the Biscay AGL has been examined using satellite and reanalysis data as well. During the FIXO3 project (Fixed-point Open Ocean Observatories), and using these combined sources, products and quality-controlled series of high interest and utility for scientific purposes have been developed: hourly products such as sea surface temperature and salinity anomalies, significant wave height relative to the monthly average, and currents relative to seasonal averages. Ocean-atmosphere heat fluxes (latent and sensible) are computed from the buoy atmospheric and oceanic measurements. Estimates of the mixed layer depth and bulk series at different water levels are provided on a monthly basis. Quality-controlled series are distributed for sea surface salinity, oxygen, and chlorophyll data. Some sensors are particularly affected by biofouling, and monthly visits to the buoy make it possible to follow the behaviour of these sensors. The chlorophyll-fluorescence sensor is the main concern, but the dissolved oxygen sensor is also problematic: periods of realistic smooth variations present strong offsets that are corrected based on Winkler analysis of water samples. Wind, air temperature, and humidity sensors on the buoy are also compared monthly with the research vessel data. The next step will consist of better validation of the data, mainly the ten-year record from the Biscay AGL buoy, but also the 25-year record of station 7, close to the buoy. The data will be cleaned and analyzed, and the final products will be published and disseminated to improve their use.

  11. 40 CFR 63.653 - Monitoring, recordkeeping, and implementation plan for emissions averaging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... § 63.120 of subpart G; and (ii) For closed vent systems with control devices, conduct an initial design... different times, and/or in different submittals, later submittals may refer to earlier submittals instead of... controlled using a treatment process or series of treatment processes that achieves an emission reduction...

  12. FREQUENCY DISTRIBUTIONS AND SPATIAL ANALYSIS OF FINE PARTICLE MEASUREMENTS IN ST. LOUIS DURING THE REGIONAL AIR POLLUTION STUDY/REGIONAL AIR MONITORING SYSTEM

    EPA Science Inventory

    Community, time-series epidemiology typically uses either 24-hour integrated particulate matter (PM) concentrations averaged across several monitors in a city or data obtained at a central monitoring site to relate PM concentrations to human health effects. If 24-hour integrated...

  13. Salary Compression: A Time-Series Ratio Analysis of ARL Position Classifications

    ERIC Educational Resources Information Center

    Seaman, Scott

    2007-01-01

    Although salary compression has previously been identified in such professional schools as engineering, business, and computer science, there is now evidence of salary compression among Association of Research Libraries members. Using salary data from the "ARL Annual Salary Survey", this study analyzes average annual salaries from 1994-1995…

  14. Effects of Forecasts on the Revisions of Concurrent Seasonally Adjusted Data Using the X-11 Seasonal Adjustment Procedure.

    ERIC Educational Resources Information Center

    Bobbitt, Larry; Otto, Mark

    Three Autoregressive Integrated Moving Average (ARIMA) forecast procedures for Census Bureau X-11 concurrent seasonal adjustment were empirically tested. Forty time series from three Census Bureau economic divisions (business, construction, and industry) were analyzed. Forecasts were obtained from fitted seasonal ARIMA models augmented with…

  15. Forecasting Techniques and Library Circulation Operations: Implications for Management.

    ERIC Educational Resources Information Center

    Ahiakwo, Okechukwu N.

    1988-01-01

    Causal regression and time series models were developed using six years of data for home borrowing, average readership, and books consulted at a university library. The models were tested for efficacy in producing short-term planning and control data. Combined models were tested in establishing evaluation measures. (10 references) (Author/MES)

  16. Correlated errors in geodetic time series: Implications for time-dependent deformation

    USGS Publications Warehouse

    Langbein, J.; Johnson, H.

    1997-01-01

    Analysis of frequent trilateration observations from the two-color electronic distance measuring networks in California demonstrates that the noise power spectra are dominated by white noise at higher frequencies and power-law behavior at lower frequencies. In contrast, Earth scientists typically have assumed that only white noise is present in a geodetic time series, since a combination of infrequent measurements and low precision usually precludes identifying the time-correlated signature in such data. After removing a linear trend from the two-color data, it becomes evident that there are primarily two recognizable types of time-correlated noise present in the residuals. The first type is a seasonal variation in displacement, which is probably a result of measuring to shallow surface monuments installed in clayey soil that responds to seasonally occurring rainfall; this noise is significant only for a small fraction of the sites analyzed. The second type of correlated noise becomes evident only after spectral analysis of line length changes and shows a functional relation at long periods between power and frequency of the form 1/f^α, where f is frequency and α ≈ 2. With α = 2, this type of correlated noise is termed random-walk noise, and its source is mainly thought to be small random motions of geodetic monuments with respect to the Earth's crust, though other sources are possible. Because the line length changes in the two-color networks are measured at irregular intervals, power spectral techniques cannot reliably estimate the level of 1/f^α noise. Rather, we also use here a maximum likelihood estimation technique which assumes that there are only two sources of noise in the residual time series (white noise and random-walk noise) and estimates the amount of each. From this analysis we find that the random-walk noise level averages about 1.3 mm/√yr and that our estimates of the white noise component confirm theoretical limitations of the measurement technique. In addition, the seasonal noise can be as large as 3 mm in amplitude but typically is less than 0.5 mm. Because of the presence of random-walk noise in these time series, modeling and interpretation of the geodetic data must account for this source of error. By way of example we show that estimating the time-varying strain tensor (a form of spatial averaging) from geodetic data having both random-walk and white noise error components results in seemingly significant variations in the rate of strain accumulation; spatial averaging does reduce the size of both noise components but not their relative influence on the resulting strain accumulation model. Copyright 1997 by the American Geophysical Union.
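    The two noise components are easy to synthesize, which is useful for checking how much spurious "deformation" pure monument noise can mimic. The amplitudes below follow the averages quoted above; the function name is ours:

    ```python
    import numpy as np

    def synth_geodetic_noise(n, dt_yr, white_mm=1.0, rw_mm_sqrt_yr=1.3, seed=0):
        """Deformation-free series: white measurement noise plus random-walk
        monument noise (1/f^2 spectrum) of amplitude rw_mm_sqrt_yr mm/sqrt(yr)."""
        rng = np.random.default_rng(seed)
        white = white_mm * rng.standard_normal(n)
        steps = rw_mm_sqrt_yr * np.sqrt(dt_yr) * rng.standard_normal(n)
        return white + np.cumsum(steps)   # cumulative sum gives the random walk
    ```

    Fitting a straight line to such a series typically yields a "rate" well away from zero, which is exactly why the abstract insists that interpretation must account for random-walk error.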

  17. Operative Treatment of Traumatic Hallux Valgus in Elite Athletes.

    PubMed

    Covell, D Jeff; Lareau, Craig R; Anderson, Robert B

    2017-06-01

    Traumatic hallux valgus is an increasingly common injury in the athletic population and represents a unique variant of turf toe. Failure to appropriately recognize and treat these injuries can lead to continued pain, decreased performance, progressive deformities, and ultimately degeneration of the hallux metatarsophalangeal joint. Limited literature currently exists to assist in the diagnosis, management, and operative treatment of these injuries. Nineteen patients were reviewed in this series, including 12 National Football League, 6 college, and 1 high school player who was a college prospect. The average age for all patients at the time of surgery was 24.4 years (range, 19-33 years). Return to play and complications were evaluated. Overall, good operative results were obtained, with 74% of patients returning to their preinjury level of play at an average recovery time of 3.4 months. The impact of this injury cannot be overstated, as one-quarter of players were unable to return to play. Level IV, case series.

  18. A scan statistic for identifying optimal risk windows in vaccine safety studies using self-controlled case series design.

    PubMed

    Xu, Stanley; Hambidge, Simon J; McClure, David L; Daley, Matthew F; Glanz, Jason M

    2013-08-30

    In the examination of the association between vaccines and rare adverse events after vaccination in postlicensure observational studies, it is challenging to define appropriate risk windows because prelicensure RCTs provide little insight on the timing of specific adverse events. Past vaccine safety studies have often used prespecified risk windows based on prior publications, biological understanding of the vaccine, and expert opinion. Recently, a data-driven approach was developed to identify appropriate risk windows for vaccine safety studies that use the self-controlled case series design. This approach employs both the maximum incidence rate ratio and the linear relation between the estimated incidence rate ratio and the inverse of average person time at risk, given a specified risk window. In this paper, we present a scan statistic that can identify appropriate risk windows in vaccine safety studies using the self-controlled case series design while taking into account the dependence of time intervals within an individual and while adjusting for time-varying covariates such as age and seasonality. This approach uses the maximum likelihood ratio test based on fixed-effects models, which has been used for analyzing data from self-controlled case series design in addition to conditional Poisson models. Copyright © 2013 John Wiley & Sons, Ltd.
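
    To make the scanning idea concrete, the following minimal Python sketch scores candidate post-vaccination risk windows [1, w] with a simple Poisson likelihood-ratio statistic and keeps the best-scoring window. It is a didactic simplification, not the authors' method: the published scan is based on fixed-effects models and adjusts for age and seasonality, and all names and inputs below are hypothetical.

      import numpy as np

      def scan_risk_windows(event_days, obs_days, max_window):
          """Return the risk-window length [1, w] that maximises a simple
          Poisson likelihood-ratio statistic (didactic sketch only)."""
          events = np.asarray(event_days, dtype=float)
          n = len(events)
          best_w, best_lrt = None, -np.inf
          for w in range(1, max_window + 1):
              n_in = np.sum(events <= w)             # events inside the window
              n_out = n - n_in                       # events outside
              e_in = n * w / obs_days                # expected counts if the
              e_out = n * (obs_days - w) / obs_days  # event rate were constant
              lrt = 0.0
              if n_in > 0:
                  lrt += n_in * np.log(n_in / e_in)
              if n_out > 0:
                  lrt += n_out * np.log(n_out / e_out)
              if 2.0 * lrt > best_lrt:
                  best_w, best_lrt = w, 2.0 * lrt
          return best_w, best_lrt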

  19. Comparison of time-series registration methods in breast dynamic infrared imaging

    NASA Astrophysics Data System (ADS)

    Riyahi-Alam, S.; Agostini, V.; Molinari, F.; Knaflitz, M.

    2015-03-01

    Automated motion reduction in dynamic infrared imaging is in demand in clinical applications, since movement corrupts the time-temperature series of each pixel, introducing thermal artifacts that might bias the clinical decision. All previously proposed registration methods are feature-based algorithms requiring manual intervention. The aim of this work is to optimize the registration strategy specifically for Breast Dynamic Infrared Imaging and to make it user-independent. We implemented and evaluated three different 3D time-series registration methods (linear affine, non-linear B-spline, and Demons) applied to 12 datasets of healthy breast thermal images. The results are evaluated through normalized mutual information, with average values of 0.70 ±0.03, 0.74 ±0.03 and 0.81 ±0.09 (out of 1) for affine, B-spline and Demons registration, respectively, as well as through breast boundary overlap and the Jacobian determinant of the deformation field. The statistical analysis of the results showed that the symmetric diffeomorphic Demons registration method also achieves the best breast alignment and non-negative Jacobian values, which guarantee image similarity and anatomical consistency of the transformation, owing to homologous forces that shorten the pixel geometric disparities across all frames. We propose Demons registration as an effective technique for time-series dynamic infrared registration, stabilizing the local temperature oscillation.
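
    For readers who want to experiment, the sketch below runs a symmetric-forces Demons registration of one thermal frame onto a reference frame with SimpleITK. It is only an approximation of the pipeline described above; the exact filter, iteration count, smoothing value and file names are assumptions.

      import SimpleITK as sitk

      # file names are hypothetical placeholders for two thermal frames
      fixed = sitk.ReadImage("frame_000.tif", sitk.sitkFloat32)
      moving = sitk.ReadImage("frame_042.tif", sitk.sitkFloat32)

      demons = sitk.FastSymmetricForcesDemonsRegistrationFilter()
      demons.SetNumberOfIterations(100)
      demons.SetStandardDeviations(1.5)   # Gaussian smoothing of the field

      displacement = demons.Execute(fixed, moving)
      transform = sitk.DisplacementFieldTransform(displacement)
      registered = sitk.Resample(moving, fixed, transform,
                                 sitk.sitkLinear, 0.0)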

  20. Forecasting malaria cases using climatic factors in delhi, India: a time series analysis.

    PubMed

    Kumar, Varun; Mangal, Abha; Panesar, Sanjeet; Yadav, Geeta; Talwar, Richa; Raut, Deepak; Singh, Saudan

    2014-01-01

    Background. Malaria still remains a public health problem in developing countries, and changing environmental and climatic factors pose the biggest challenge in fighting against the scourge of malaria. Therefore, the study was designed to forecast malaria cases using climatic factors as predictors in Delhi, India. Methods. The total number of monthly malaria slide positives occurring from January 2006 to December 2013 was taken from the register maintained at the malaria clinic at the Rural Health Training Centre (RHTC), Najafgarh, Delhi. Climatic data of monthly mean rainfall, relative humidity, and mean maximum temperature were taken from the Regional Meteorological Centre, Delhi. The Expert Modeler of SPSS ver. 21 was used for analyzing the time series data. Results. An autoregressive integrated moving average model, ARIMA(0,1,1)(0,1,0)12, was the best-fit model and could explain 72.5% of the variability in the time series data. Rainfall (P value = 0.004) and relative humidity (P value = 0.001) were found to be significant predictors of malaria transmission in the study area. The seasonal adjusted factor (SAF) for malaria cases shows a peak during the months of August and September. Conclusion. ARIMA time series models are a simple and reliable tool for producing forecasts of malaria in Delhi, India.

  1. Arima model and exponential smoothing method: A comparison

    NASA Astrophysics Data System (ADS)

    Wan Ahmad, Wan Kamarul Ariffin; Ahmad, Sabri

    2013-04-01

    This study shows the comparison between the Autoregressive Integrated Moving Average (ARIMA) model and the Exponential Smoothing Method in making a prediction. The comparison is focused on the ability of both methods to make forecasts with different numbers of data sources and different lengths of forecasting period. For this purpose, the data from The Price of Crude Palm Oil (RM/tonne), Exchange Rates of Ringgit Malaysia (RM) against the Great Britain Pound (GBP), and The Price of SMR 20 Rubber Type (cents/kg), three different time series, are used in the comparison process. The forecasting accuracy of each model is then measured by examining the prediction errors produced, using Mean Squared Error (MSE), Mean Absolute Percentage Error (MAPE), and Mean Absolute Deviation (MAD). The study shows that the ARIMA model can produce a better prediction for long-term forecasting with limited data sources, but cannot produce a better prediction for a time series with a narrow range from one point to the next, as in the Exchange Rates series. On the contrary, the Exponential Smoothing Method can produce a better forecast for the Exchange Rates series, which has a narrow range from one point to the next, while it cannot produce a better prediction for a longer forecasting period.
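
    The three error measures are straightforward to compute; a minimal sketch follows (variable names are hypothetical, and MAPE assumes no zero actual values):

      import numpy as np

      def forecast_errors(actual, forecast):
          """Return MSE, MAPE (%) and MAD of a forecast against actuals."""
          actual = np.asarray(actual, float)
          forecast = np.asarray(forecast, float)
          err = actual - forecast
          mse = np.mean(err ** 2)
          mape = np.mean(np.abs(err / actual)) * 100.0
          mad = np.mean(np.abs(err))
          return mse, mape, mad

      # e.g. compare the two methods on the same hold-out period:
      # forecast_errors(y_test, arima_forecast)
      # forecast_errors(y_test, smoothing_forecast)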

  2. Textural changes of FER-A peridotite in time series piston-cylinder experiments at 1.0 GPa, 1300°C

    NASA Astrophysics Data System (ADS)

    Schwab, B. E.; Mercer, C. N.; Johnston, A.

    2012-12-01

    A series of eight 1.0 GPa, 1300°C partial melting experiments was performed using FER-A peridotite starting material to investigate potential textural changes in the residual crystalline phases over time. Powdered peridotite, with a layer of vitreous carbon spheres as a melt sink, was sealed in graphite-lined Pt capsules and run in CaF2 furnace assemblies in a 1.27 cm piston-cylinder apparatus at the University of Oregon. Run durations ranged from 4 to 128 hours. Experimental charges were mounted in epoxy, cut, and polished for analysis. In a first attempt to quantify the mineral textures, individual 500x BSE images were collected from selected, representative locations on each of the experimental charges using the FEI Quanta 250 ESEM at Humboldt State University. The Noran System Seven (NSS) EDS system was used to collect x-ray maps (spectral images) to aid in identification of phases. A combination of image analysis techniques within NSS and ImageJ software is being used to process the images and quantify the mineral textures observed. The goals are to quantify the size, shape, and abundance of residual olivine (ol), orthopyroxene (opx), clinopyroxene (cpx), and spinel crystals within the selected sample areas of the run products. Additional work will be done to compare the results of the selected areas with larger (lower magnification) images acquired using the same techniques. Preliminary results indicate that measurements of average grain area, minimum grain area, and average, maximum, and minimum grain perimeter show the greatest change (generally decreasing) for ol, opx, and cpx between the shortest-duration, 4-hour, experiment and the subsequent, 8-hour, experiment. The largest relative change in nearly all of these measurements appears to be for cpx. After the initial decrease, preliminary measurements remain relatively constant for ol, opx, and cpx, respectively, in experiments from 8 to 128 hours in duration. In contrast, measured parameters of spinel grains increase from the 4-hour to the 8-hour experiment and continue to fluctuate over the time interval investigated. Spinel also represents the smallest number of individual grains (average n = 25) in any experiment. Average aspect ratios for all minerals remain relatively constant (~1.5-2) throughout the time series. Additional measurements and refinements are underway.

  3. Wavelet application to the time series analysis of DORIS station coordinates

    NASA Astrophysics Data System (ADS)

    Bessissi, Zahia; Terbeche, Mekki; Ghezali, Boualem

    2009-06-01

    This article addresses the analysis of residual time series of DORIS station coordinates using the wavelet transform. Several analysis techniques, already developed in other disciplines, were employed in the statistical study of the geodetic time series of stations. The wavelet transform allows one, on the one hand, to describe the residual signals in both time and frequency, and on the other hand, to determine and quantify systematic signals such as periodicity and trend. Trend is the short- or long-term change in the signal; it is an average curve that represents the general pace of the signal's evolution. Periodicity, in turn, is a process that repeats itself, identical to itself, after a time interval called the period. In this context, this article consists, on the one hand, in determining the systematic signals by wavelet analysis of time series of DORIS station coordinates, and on the other hand, in applying wavelet-packet denoising, which makes it possible to obtain a well-filtered signal, smoother than the original. The DORIS data used in the treatment are a set of weekly residual time series from 1993 to 2004 from eight stations: DIOA, COLA, FAIB, KRAB, SAKA, SODB, THUB and SYPB. They correspond to the ign03wd01 solution, expressed in stcd format, derived by the IGN/JPL analysis center. Although these data are not very recent, the goal of this study is to assess the contribution of the wavelet analysis method on the DORIS data, compared to the other analysis methods already studied.
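
    A minimal wavelet-packet denoising sketch in Python with PyWavelets is given below; the wavelet, decomposition level and universal soft threshold are assumptions, not the settings used by the authors.

      import numpy as np
      import pywt

      def wp_denoise(x, wavelet="db4", level=4):
          """Soft-threshold the wavelet-packet detail coefficients of a
          coordinate residual series and reconstruct a smoother signal."""
          x = np.asarray(x, float)
          wp = pywt.WaveletPacket(data=x, wavelet=wavelet,
                                  mode="symmetric", maxlevel=level)
          nodes = wp.get_level(level, order="natural")
          detail = np.concatenate([n.data for n in nodes[1:]])
          sigma = np.median(np.abs(detail)) / 0.6745   # noise scale
          thr = sigma * np.sqrt(2.0 * np.log(len(x)))  # universal threshold
          for node in nodes[1:]:                       # keep the approximation
              node.data = pywt.threshold(node.data, thr, mode="soft")
          return wp.reconstruct(update=False)[: len(x)]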

  4. A multi-centennial time series of well-constrained ΔR values for the Irish Sea derived using absolutely-dated shell samples from the mollusc Arctica islandica

    NASA Astrophysics Data System (ADS)

    Butler, P. G.; Scourse, J. D.; Richardson, C. A.; Wanamaker, A. D., Jr.

    2009-04-01

    Determinations of the local correction (ΔR) to the globally averaged marine radiocarbon reservoir age are often isolated in space and time, derived from heterogeneous sources and constrained by significant uncertainties. Although time series of ΔR at single sites can be obtained from sediment cores, these are subject to multiple uncertainties related to sedimentation rates, bioturbation and interspecific variations in the source of radiocarbon in the analysed samples. Coral records provide better resolution, but these are available only for tropical locations. It is shown here that it is possible to use the shell of the long-lived bivalve mollusc Arctica islandica as a source of high resolution time series of absolutely-dated marine radiocarbon determinations for the shelf seas surrounding the North Atlantic Ocean. Annual growth increments in the shell can be crossdated and chronologies can be constructed in precise analogy with tree-rings. Because the calendar dates of the samples are known, ΔR can be determined with high precision and accuracy, and because all the samples are from the same species, the time series of ΔR values possesses a high degree of internal consistency. Presented here is a multi-centennial (AD 1593 - AD 1933) time series of 31 ΔR values for a site in the Irish Sea close to the Isle of Man. The mean value of ΔR (-62 14C yrs) does not change significantly during this period, but increased variability is apparent before AD 1750.

  5. Benchmarking homogenization algorithms for monthly data

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2012-01-01

    The COST (European Cooperation in Science and Technology) Action ES0601: advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline at which details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Training the users on homogenization software was found to be very important. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that automatic algorithms can perform as well as manual ones.

  6. Benchmarking monthly homogenization algorithms

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2011-08-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Training was found to be very important. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that currently automatic algorithms can perform as well as manual ones.

  7. Performance of time-series methods in forecasting the demand for red blood cell transfusion.

    PubMed

    Pereira, Arturo

    2004-05-01

    Planning future blood collection efforts must be based on adequate forecasts of transfusion demand. In this study, univariate time-series methods were investigated for their performance in forecasting the monthly demand for RBCs at one tertiary-care, university hospital. Three time-series methods were investigated: autoregressive integrated moving average (ARIMA), the Holt-Winters family of exponential smoothing models, and one neural-network-based method. The time series consisted of the monthly demand for RBCs from January 1988 to December 2002 and was divided into two segments: the older one was used to fit or train the models, and the more recent one to test the accuracy of predictions. Performance was compared across forecasting methods by calculating goodness-of-fit statistics, the percentage of months in which forecast-based supply would have met the RBC demand (coverage rate), and the outdate rate. The RBC transfusion series was best fitted by a seasonal ARIMA(0,1,1)(0,1,1)12 model. Over 1-year time horizons, forecasts generated by ARIMA or exponential smoothing lay within the +/- 10 percent interval of the real RBC demand in 79 percent of months (62% in the case of neural networks). The coverage rates for the three methods were 89, 91, and 86 percent, respectively. Over 2-year time horizons, exponential smoothing largely outperformed the other methods. Predictions by exponential smoothing lay within the +/- 10 percent interval of real values in 75 percent of the 24 forecasted months, and the coverage rate was 87 percent. Over 1-year time horizons, predictions of RBC demand generated by ARIMA or exponential smoothing are accurate enough to be of help in the planning of blood collection efforts. For longer time horizons, exponential smoothing outperforms the other forecasting methods.
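
    As an illustration of the exponential-smoothing approach and of the coverage-style evaluation, the sketch below fits a Holt-Winters model with statsmodels to a synthetic monthly demand series and reports how many forecasts fall within +/- 10 percent of the actual values. The data and settings are stand-ins, not the hospital series.

      import numpy as np
      from statsmodels.tsa.holtwinters import ExponentialSmoothing

      rng = np.random.default_rng(0)
      t = np.arange(180)          # 15 years of synthetic monthly demand
      demand = (1000 + 2.0 * t + 80 * np.sin(2 * np.pi * t / 12)
                + rng.normal(0, 30, t.size))

      train, test = demand[:-12], demand[-12:]
      fit = ExponentialSmoothing(train, trend="add", seasonal="add",
                                 seasonal_periods=12).fit()
      pred = fit.forecast(12)

      within = np.abs(pred - test) <= 0.10 * np.abs(test)
      print(f"within +/-10% of demand: {100 * within.mean():.0f}% of months")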

  8. Conditional adaptive Bayesian spectral analysis of nonstationary biomedical time series.

    PubMed

    Bruce, Scott A; Hall, Martica H; Buysse, Daniel J; Krafty, Robert T

    2018-03-01

    Many studies of biomedical time series signals aim to measure the association between frequency-domain properties of time series and clinical and behavioral covariates. However, the time-varying dynamics of these associations are largely ignored due to a lack of methods that can assess the changing nature of the relationship through time. This article introduces a method for the simultaneous and automatic analysis of the association between the time-varying power spectrum and covariates, which we refer to as conditional adaptive Bayesian spectrum analysis (CABS). The procedure adaptively partitions the grid of time and covariate values into an unknown number of approximately stationary blocks and nonparametrically estimates local spectra within blocks through penalized splines. CABS is formulated in a fully Bayesian framework, in which the number and locations of partition points are random, and fit using reversible jump Markov chain Monte Carlo techniques. Estimation and inference averaged over the distribution of partitions allows for the accurate analysis of spectra with both smooth and abrupt changes. The proposed methodology is used to analyze the association between the time-varying spectrum of heart rate variability and self-reported sleep quality in a study of older adults serving as the primary caregiver for their ill spouse. © 2017, The International Biometric Society.

  9. Relevance analysis and short-term prediction of PM2.5 concentrations in Beijing based on multi-source data

    NASA Astrophysics Data System (ADS)

    Ni, X. Y.; Huang, H.; Du, W. P.

    2017-02-01

    The PM2.5 problem is proving to be a major public crisis and is of great public concern, requiring an urgent response. Understanding and prediction of PM2.5 from the perspective of atmospheric dynamic theory are still limited due to the complexity of the formation and development of PM2.5. In this paper, we attempted to carry out relevance analysis and short-term prediction of PM2.5 concentrations in Beijing, China, using multi-source data mining. A correlation analysis model relating PM2.5 to physical data (meteorological data, including regional average rainfall, daily mean temperature, average relative humidity, average wind speed and maximum wind speed, and other pollutant concentration data, including CO, NO2, SO2 and PM10) and social media data (microblog data) was proposed, based on multivariate statistical analysis. The study found that, among these factors, the average wind speed, the concentrations of CO, NO2 and PM10, and the daily number of microblog entries with the key words 'Beijing; Air pollution' show high mathematical correlation with PM2.5 concentrations. The correlation analysis was further studied using a machine-learning model, the Back Propagation Neural Network (BPNN), which was found to perform better in correlation mining. Finally, an Autoregressive Integrated Moving Average (ARIMA) time series model was applied to explore short-term prediction of PM2.5. The predicted results were in good agreement with the observed data. This study is useful for realizing real-time monitoring, analysis and pre-warning of PM2.5, and it also helps to broaden the application of big data and multi-source data mining methods.
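
    A BPNN in this sense is a standard feed-forward network trained by backpropagation; a minimal sketch with scikit-learn follows, with synthetic stand-in predictors in place of the meteorological, pollutant and microblog series.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      # columns stand in for [wind speed, CO, NO2, PM10, microblog count]
      rng = np.random.default_rng(1)
      X = rng.normal(size=(500, 5))
      y = 2.0 * X[:, 1] + 1.5 * X[:, 2] + X[:, 3] + rng.normal(0, 0.5, 500)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      bpnn = make_pipeline(StandardScaler(),
                           MLPRegressor(hidden_layer_sizes=(16,),
                                        max_iter=2000, random_state=0))
      bpnn.fit(X_tr, y_tr)
      print("held-out R^2:", bpnn.score(X_te, y_te))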

  10. Evaluation of scale invariance in physiological signals by means of balanced estimation of diffusion entropy.

    PubMed

    Zhang, Wenqing; Qiu, Lu; Xiao, Qin; Yang, Huijie; Zhang, Qingjun; Wang, Jianyong

    2012-11-01

    By means of the concept of the balanced estimation of diffusion entropy, we evaluate the reliable scale invariance embedded in different sleep stages and stride records. Segments corresponding to waking, light sleep, rapid eye movement (REM) sleep, and deep sleep stages are extracted from long-term electroencephalogram signals. For each stage the scaling exponent value is distributed over a considerably wide range, which tells us that the scaling behavior is subject and sleep cycle dependent. The average of the scaling exponent values for waking segments is almost the same as that for REM segments (~0.8). The waking and REM stages have a significantly higher value of the average scaling exponent than the light sleep stages (~0.7). For the stride series, the original diffusion entropy (DE) and the balanced estimation of diffusion entropy (BEDE) give almost the same results for detrended series. The evolutions of local scaling invariance show that the physiological states change abruptly, although in the experiments great efforts have been made to keep conditions unchanged. The global behavior of a single physiological signal may therefore lose rich information on physiological states. Methodologically, the BEDE can evaluate with considerable precision the scale invariance in very short time series (~10^2 points), while the original DE method sometimes may underestimate scale-invariance exponents or even fail to detect scale-invariant behavior. The BEDE method is sensitive to trends in time series. The existence of trends may lead to an unreasonably high value of the scaling exponent and consequent mistaken conclusions.
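
    For orientation, ordinary diffusion entropy analysis can be sketched in a few lines: windowed sums of the increments form diffusion trajectories, the entropy of their distribution is estimated at each window length t, and the scaling exponent delta is the slope of S(t) against ln t. The balanced estimator of the paper corrects the bias of the plain histogram entropy used in this simplified version.

      import numpy as np

      def diffusion_entropy(xi, t_values, bins=64):
          """Plain (unbalanced) diffusion entropy: slope of S(t) vs ln t."""
          xi = np.asarray(xi, float)
          csum = np.cumsum(np.concatenate(([0.0], xi)))
          S = []
          for t in t_values:
              x = csum[t:] - csum[:-t]      # overlapping windowed sums
              p, edges = np.histogram(x, bins=bins, density=True)
              w, nz = np.diff(edges), p > 0
              S.append(-np.sum(p[nz] * np.log(p[nz]) * w[nz]))
          delta, _ = np.polyfit(np.log(t_values), S, 1)
          return delta

      # uncorrelated Gaussian increments should give delta close to 0.5
      xi = np.random.default_rng(2).normal(size=20000)
      print(diffusion_entropy(xi, t_values=np.arange(2, 200, 5)))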

  11. Evaluation of scale invariance in physiological signals by means of balanced estimation of diffusion entropy

    NASA Astrophysics Data System (ADS)

    Zhang, Wenqing; Qiu, Lu; Xiao, Qin; Yang, Huijie; Zhang, Qingjun; Wang, Jianyong

    2012-11-01

    By means of the concept of the balanced estimation of diffusion entropy, we evaluate the reliable scale invariance embedded in different sleep stages and stride records. Segments corresponding to waking, light sleep, rapid eye movement (REM) sleep, and deep sleep stages are extracted from long-term electroencephalogram signals. For each stage the scaling exponent value is distributed over a considerably wide range, which tells us that the scaling behavior is subject and sleep cycle dependent. The average of the scaling exponent values for waking segments is almost the same as that for REM segments (~0.8). The waking and REM stages have a significantly higher value of the average scaling exponent than the light sleep stages (~0.7). For the stride series, the original diffusion entropy (DE) and the balanced estimation of diffusion entropy (BEDE) give almost the same results for detrended series. The evolutions of local scaling invariance show that the physiological states change abruptly, although in the experiments great efforts have been made to keep conditions unchanged. The global behavior of a single physiological signal may therefore lose rich information on physiological states. Methodologically, the BEDE can evaluate with considerable precision the scale invariance in very short time series (~10^2 points), while the original DE method sometimes may underestimate scale-invariance exponents or even fail to detect scale-invariant behavior. The BEDE method is sensitive to trends in time series. The existence of trends may lead to an unreasonably high value of the scaling exponent and consequent mistaken conclusions.

  12. How to estimate exposure when studying the temperature-mortality relationship? A case study of the Paris area.

    PubMed

    Schaeffer, Laura; de Crouy-Chanel, Perrine; Wagner, Vérène; Desplat, Julien; Pascal, Mathilde

    2016-01-01

    Time series studies assessing the effect of temperature on mortality generally use temperatures measured by a single weather station. In the Paris region, there is a substantial measurement network, and a variety of exposure indicators created from multiple stations can be tested. The aim of this study is to test the influence of exposure indicators on the temperature-mortality relationship in the Paris region. The relationship between temperature and non-accidental mortality was assessed based on a time series analysis using Poisson regression and a generalised additive model. Twenty-five stations in Paris and its three neighbouring departments were used to create four exposure indicators. These indicators were (1) the temperature recorded by one reference station, (2) a simple average of the temperatures of all stations, (3) an average weighted by the departmental population, and (4) a classification of the stations based on land use and an average weighted by the population in each class. The relative risks and the Akaike criteria were similar for all the exposure indicators. The estimated temperature-mortality relationship therefore did not appear to be significantly affected by the indicator used, regardless of study zone (departments or region) or age group. The increase in temperatures from the 90th to the 99th percentile of the temperature distribution led to a significant increase in mortality among those over 75 years (RR = 1.10 [95% CI, 1.07; 1.14]). Conversely, the decrease in temperature between the 10th and 1st percentile had a significant effect on mortality among those under 75 years (RR = 1.04 [95% CI, 1.01; 1.06]). In the Paris area, there is no added value in taking multiple climatic stations into account when estimating exposure in time series studies. Methods to better represent the subtle temperature variations in densely populated areas in epidemiological studies are needed.
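
    Indicator (3), for example, is a one-line computation once each station is assigned the population it represents (all values below are hypothetical):

      import numpy as np

      def population_weighted_temperature(temps, pop):
          """Average of station temperatures weighted by the population
          each station represents; temps has shape (n_days, n_stations)."""
          w = np.asarray(pop, float)
          return np.asarray(temps, float) @ (w / w.sum())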

  13. Use of a scenario-neutral approach to identify the key hydro-meteorological attributes that impact runoff from a natural catchment

    NASA Astrophysics Data System (ADS)

    Guo, Danlu; Westra, Seth; Maier, Holger R.

    2017-11-01

    Scenario-neutral approaches are being used increasingly for assessing the potential impact of climate change on water resource systems, as these approaches allow the performance of these systems to be evaluated independently of climate change projections. However, practical implementations of these approaches are still scarce, with a key limitation being the difficulty of generating a range of plausible future time series of hydro-meteorological data. In this study we apply a recently developed inverse stochastic generation approach to support the scenario-neutral analysis, and thus identify the key hydro-meteorological variables to which the system is most sensitive. The stochastic generator simulates synthetic hydro-meteorological time series that represent plausible future changes in (1) the average, extremes and seasonal patterns of rainfall; and (2) the average values of temperature (Ta), relative humidity (RH) and wind speed (uz) as variables that drive potential evapotranspiration (PET). These hydro-meteorological time series are then fed through a conceptual rainfall-runoff model to simulate the potential changes in runoff as a function of changes in the hydro-meteorological variables, and runoff sensitivity is assessed with both correlation and Sobol' sensitivity analyses. The method was applied to a case study catchment in South Australia, and the results showed that the most important hydro-meteorological attributes for runoff were winter rainfall followed by the annual average rainfall, while the PET-related meteorological variables had comparatively little impact. The high importance of winter rainfall can be related to the winter-dominated nature of both the rainfall and runoff regimes in this catchment. The approach illustrated in this study can greatly enhance our understanding of the key hydro-meteorological attributes and processes that are likely to drive catchment runoff under a changing climate, thus enabling the design of climate impact assessments tailored to specific water resource systems.
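
    A Sobol' sensitivity analysis of this kind can be sketched with the SALib package; the perturbation multipliers, their bounds and the toy runoff response below are illustrative assumptions, not the study's generator or rainfall-runoff model.

      import numpy as np
      from SALib.sample import saltelli
      from SALib.analyze import sobol

      problem = {
          "num_vars": 3,
          "names": ["winter_rain", "annual_rain", "pet"],
          "bounds": [[0.7, 1.3], [0.7, 1.3], [0.9, 1.1]],
      }

      X = saltelli.sample(problem, 1024)

      def runoff_response(x):      # stand-in for the rainfall-runoff model
          winter, annual, pet = x
          return 1.8 * winter + 1.2 * annual - 0.3 * pet

      Y = np.array([runoff_response(x) for x in X])
      Si = sobol.analyze(problem, Y)
      print(dict(zip(problem["names"], Si["ST"])))  # total-order indices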

  14. Evaluating the impact of abrupt changes in forest policy and management practices on landscape dynamics: analysis of a Landsat image time series in the Atlantic Northern Forest.

    PubMed

    Legaard, Kasey R; Sader, Steven A; Simons-Legaard, Erin M

    2015-01-01

    Sustainable forest management is based on functional relationships between management actions, landscape conditions, and forest values. Changes in management practices make it fundamentally more difficult to study these relationships because the impacts of current practices are difficult to disentangle from the persistent influences of past practices. Within the Atlantic Northern Forest of Maine, U.S.A., forest policy and management practices changed abruptly in the early 1990s. During the 1970s-1980s, a severe insect outbreak stimulated salvage clearcutting of large contiguous tracts of spruce-fir forest. Following clearcut regulation in 1991, management practices shifted abruptly to near complete dependence on partial harvesting. Using a time series of Landsat satellite imagery (1973-2010) we assessed cumulative landscape change caused by these very different management regimes. We modeled predominant temporal patterns of harvesting and segmented a large study area into groups of landscape units with similar harvest histories. Time series of landscape composition and configuration metrics averaged within groups revealed differences in landscape dynamics caused by differences in management history. In some groups (24% of landscape units), salvage caused rapid loss and subdivision of intact mature forest. Persistent landscape change was created by large salvage clearcuts (often averaging > 100 ha) and conversion of spruce-fir to deciduous and mixed forest. In groups that were little affected by salvage (56% of landscape units), contemporary partial harvesting caused loss and subdivision of intact mature forest at even greater rates. Patch shape complexity and edge density reached high levels even where cumulative harvest area was relatively low. Contemporary practices introduced more numerous and much smaller patches of stand-replacing disturbance (typically averaging <15 ha) and a correspondingly large amount of edge. Management regimes impacted different areas to different degrees, producing different trajectories of landscape change that should be recognized when studying the impact of policy and management practices on forest ecology.

  15. Evaluating the Impact of Abrupt Changes in Forest Policy and Management Practices on Landscape Dynamics: Analysis of a Landsat Image Time Series in the Atlantic Northern Forest

    PubMed Central

    Legaard, Kasey R.; Sader, Steven A.; Simons-Legaard, Erin M.

    2015-01-01

    Sustainable forest management is based on functional relationships between management actions, landscape conditions, and forest values. Changes in management practices make it fundamentally more difficult to study these relationships because the impacts of current practices are difficult to disentangle from the persistent influences of past practices. Within the Atlantic Northern Forest of Maine, U.S.A., forest policy and management practices changed abruptly in the early 1990s. During the 1970s-1980s, a severe insect outbreak stimulated salvage clearcutting of large contiguous tracts of spruce-fir forest. Following clearcut regulation in 1991, management practices shifted abruptly to near complete dependence on partial harvesting. Using a time series of Landsat satellite imagery (1973-2010) we assessed cumulative landscape change caused by these very different management regimes. We modeled predominant temporal patterns of harvesting and segmented a large study area into groups of landscape units with similar harvest histories. Time series of landscape composition and configuration metrics averaged within groups revealed differences in landscape dynamics caused by differences in management history. In some groups (24% of landscape units), salvage caused rapid loss and subdivision of intact mature forest. Persistent landscape change was created by large salvage clearcuts (often averaging > 100 ha) and conversion of spruce-fir to deciduous and mixed forest. In groups that were little affected by salvage (56% of landscape units), contemporary partial harvesting caused loss and subdivision of intact mature forest at even greater rates. Patch shape complexity and edge density reached high levels even where cumulative harvest area was relatively low. Contemporary practices introduced more numerous and much smaller patches of stand-replacing disturbance (typically averaging <15 ha) and a correspondingly large amount of edge. Management regimes impacted different areas to different degrees, producing different trajectories of landscape change that should be recognized when studying the impact of policy and management practices on forest ecology. PMID:26106893

  16. Month-wise variation and prediction of bulk tank somatic cell count in Brazilian dairy herds and its impact on payment based on milk quality.

    PubMed

    Busanello, Marcos; de Freitas, Larissa Nazareth; Winckler, João Pedro Pereira; Farias, Hiron Pereira; Dos Santos Dias, Carlos Tadeu; Cassoli, Laerte Dagher; Machado, Paulo Fernando

    2017-01-01

    Payment programs based on milk quality (PPBMQ) are used in several countries around the world as an incentive to improve milk quality. One of the principal milk parameters used in such programs is the bulk tank somatic cell count (BTSCC). In this study, using data from an average of 37,000 farms per month in Brazil where milk was analyzed, BTSCC data were divided into different payment classes based on milk quality. Then, descriptive and graphical analyses were performed. The probability of a change to a worse payment class was calculated, future BTSCC values were predicted using time series models, and financial losses due to the failure to reach the maximum bonus for the payment based on milk quality were simulated. In Brazil, the mean BTSCC has remained high in recent years, without a tendency to improve. The probability of changing to a worse payment class was strongly affected by both the BTSCC average and BTSCC standard deviation for classes 1 and 2 (1000-200,000 and 201,000-400,000 cells/mL, respectively) and only by the BTSCC average for classes 3 and 4 (401,000-500,000 and 501,000-800,000 cells/mL, respectively). The time series models indicated that at some point in the year, farms would not remain in their current class and would accrue financial losses due to payments based on milk quality. The BTSCC for Brazilian dairy farms has not recently improved. The probability of a class change to a worse class is a metric that can aid in decision-making and stimulate farmers to improve milk quality. A time series model can be used to predict the future value of the BTSCC, making it possible to estimate financial losses and to show, moreover, that financial losses occur in all classes of the PPBMQ because the farmers do not remain in the best payment class in all months.

  17. Benchmarking homogenization algorithms for monthly data

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratiannil, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.; Willett, K.

    2013-09-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies. The algorithms were validated against a realistic benchmark dataset. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including i) the centered root mean square error relative to the true homogeneous values at various averaging scales, ii) the error in linear trend estimates and iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that currently automatic algorithms can perform as well as manual ones.

  18. A robust algorithm for optimisation and customisation of fractal dimensions of time series modified by nonlinearly scaling their time derivatives: mathematical theory and practical applications.

    PubMed

    Fuss, Franz Konstantin

    2013-01-01

    Standard methods for computing the fractal dimensions of time series are usually tested with continuous nowhere differentiable functions, but not benchmarked with actual signals. Therefore they can produce opposite results in extreme signals. These methods also use different scaling methods, that is, different amplitude multipliers, which makes it difficult to compare fractal dimensions obtained from different methods. The purpose of this research was to develop an optimisation method that computes the fractal dimension of a normalised (dimensionless) and modified time series signal with a robust algorithm and a running average method, and that maximises the difference between two fractal dimensions, for example, a minimum and a maximum one. The signal is modified by transforming its amplitude by a multiplier, which has a non-linear effect on the signal's time derivative. The optimisation method identifies the optimal multiplier of the normalised amplitude for targeted decision making based on fractal dimensions. The optimisation method provides an additional filter effect and makes the fractal dimensions less noisy. The method is exemplified by, and explained with, different signals, such as human movement, EEG, and acoustic signals.
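
    The core idea (an amplitude multiplier that changes the estimated fractal dimension, scanned to maximise the difference between two signals' dimensions) can be sketched with a crude box-counting estimator. This is a stand-in for the paper's robust algorithm, and the grids of multipliers and box sizes are arbitrary.

      import numpy as np

      def boxcount_fd(y, eps_list):
          """Box-counting dimension of the graph of a signal sampled on
          t in [0, 1] (a crude estimator, for illustration only)."""
          t = np.linspace(0.0, 1.0, len(y))
          counts = [len({(int(ti / e), int(yi / e))
                         for ti, yi in zip(t, y)}) for e in eps_list]
          slope, _ = np.polyfit(np.log(1.0 / np.asarray(eps_list)),
                                np.log(counts), 1)
          return slope

      def optimal_multiplier(y1, y2, eps_list, a_grid):
          """Pick the amplitude multiplier maximising the difference
          between the two signals' estimated fractal dimensions."""
          norm = lambda y: (y - y.min()) / (y.max() - y.min())
          n1 = norm(np.asarray(y1, float))
          n2 = norm(np.asarray(y2, float))
          diffs = [abs(boxcount_fd(a * n1, eps_list)
                       - boxcount_fd(a * n2, eps_list)) for a in a_grid]
          return a_grid[int(np.argmax(diffs))]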

  19. Streamflow Prediction based on Chaos Theory

    NASA Astrophysics Data System (ADS)

    Li, X.; Wang, X.; Babovic, V. M.

    2015-12-01

    Chaos theory is a popular approach to hydrologic time series prediction. The local model (LM) based on this theory utilizes time-delay embedding to reconstruct the phase-space diagram. Its efficacy depends on the embedding parameters, i.e. the embedding dimension, the time lag, and the number of nearest neighbours, so optimal estimation of these parameters is critical to the application of the local model. However, these embedding parameters are conventionally estimated separately, using Average Mutual Information (AMI) and False Nearest Neighbors (FNN). This may lead to a local optimum and thus limit the prediction accuracy. To address this limitation, this paper applies a local model combined with simulated annealing (SA) to find the global optimum of the embedding parameters. It is also compared with another global optimization approach, the Genetic Algorithm (GA). These proposed hybrid methods are applied to daily and monthly streamflow time series for examination. The results show that global optimization enables the local model to provide more accurate predictions than local optimization. The LM combined with SA has an additional advantage in computational efficiency. The proposed scheme can also be applied to other fields, such as the prediction of hydro-climatic time series, error correction, etc.
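
    The building blocks of the local model are easy to state in code: a time-delay embedding, and a prediction that averages the successors of the nearest phase-space neighbours. An SA or GA wrapper would simply search over (dim, tau, k) to minimise out-of-sample prediction error; the sketch below, with hypothetical default parameters, shows only the inner model.

      import numpy as np

      def delay_embed(x, dim, tau):
          """Phase-space reconstruction by time-delay embedding."""
          n = len(x) - (dim - 1) * tau
          return np.column_stack([x[i * tau: i * tau + n]
                                  for i in range(dim)])

      def local_model_predict(x, dim=3, tau=1, k=5):
          """One-step prediction: average the successors of the k nearest
          neighbours of the current point (zeroth-order local model)."""
          x = np.asarray(x, float)
          emb = delay_embed(x, dim, tau)
          history, query = emb[:-1], emb[-1]
          succ = x[(dim - 1) * tau + 1:]   # successor of each history row
          d = np.linalg.norm(history - query, axis=1)
          return succ[np.argsort(d)[:k]].mean()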

  20. A Robust Algorithm for Optimisation and Customisation of Fractal Dimensions of Time Series Modified by Nonlinearly Scaling Their Time Derivatives: Mathematical Theory and Practical Applications

    PubMed Central

    2013-01-01

    Standard methods for computing the fractal dimensions of time series are usually tested with continuous nowhere differentiable functions, but not benchmarked with actual signals. Therefore they can produce opposite results in extreme signals. These methods also use different scaling methods, that is, different amplitude multipliers, which makes it difficult to compare fractal dimensions obtained from different methods. The purpose of this research was to develop an optimisation method that computes the fractal dimension of a normalised (dimensionless) and modified time series signal with a robust algorithm and a running average method, and that maximises the difference between two fractal dimensions, for example, a minimum and a maximum one. The signal is modified by transforming its amplitude by a multiplier, which has a non-linear effect on the signal's time derivative. The optimisation method identifies the optimal multiplier of the normalised amplitude for targeted decision making based on fractal dimensions. The optimisation method provides an additional filter effect and makes the fractal dimensions less noisy. The method is exemplified by, and explained with, different signals, such as human movement, EEG, and acoustic signals. PMID:24151522

  1. Univariate Time Series Prediction of Solar Power Using a Hybrid Wavelet-ARMA-NARX Prediction Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nazaripouya, Hamidreza; Wang, Yubo; Chu, Chi-Cheng

    This paper proposes a new hybrid method for super short-term solar power prediction. Solar output power usually has a complex, nonstationary, and nonlinear characteristic due to the intermittent and time varying behavior of solar radiance. In addition, solar power dynamics is fast and essentially inertia-less. Accurate super short-term prediction is required to compensate for the fluctuations and reduce the impact of solar power penetration on the power system. The objective is to predict one-step-ahead solar power generation based only on historical solar power time series data. The proposed method incorporates the discrete wavelet transform (DWT), Auto-Regressive Moving Average (ARMA) models, and Recurrent Neural Networks (RNN), where the RNN architecture is based on Nonlinear Auto-Regressive models with eXogenous inputs (NARX). The wavelet transform is utilized to decompose the solar power time series into a set of better-behaved component series for prediction. The ARMA model is employed as a linear predictor, while NARX is used as a nonlinear pattern recognition tool to estimate and compensate for the error of the wavelet-ARMA prediction. The proposed method is applied to data captured from UCLA solar PV panels, and the results are compared with some of the common and most recent solar power prediction methods. The results validate the effectiveness of the proposed approach and show a considerable improvement in prediction precision.

  2. Construction of regulatory networks using expression time-series data of a genotyped population.

    PubMed

    Yeung, Ka Yee; Dombek, Kenneth M; Lo, Kenneth; Mittler, John E; Zhu, Jun; Schadt, Eric E; Bumgarner, Roger E; Raftery, Adrian E

    2011-11-29

    The inference of regulatory and biochemical networks from large-scale genomics data is a basic problem in molecular biology. The goal is to generate testable hypotheses of gene-to-gene influences and subsequently to design bench experiments to confirm these network predictions. Coexpression of genes in large-scale gene-expression data implies coregulation and potential gene-gene interactions, but provides little information about the direction of influences. Here, we use both time-series data and genetics data to infer directionality of edges in regulatory networks: time-series data contain information about the chronological order of regulatory events and genetics data allow us to map DNA variations to variations at the RNA level. We generate microarray data measuring time-dependent gene-expression levels in 95 genotyped yeast segregants subjected to a drug perturbation. We develop a Bayesian model averaging regression algorithm that incorporates external information from diverse data types to infer regulatory networks from the time-series and genetics data. Our algorithm is capable of generating feedback loops. We show that our inferred network recovers existing and novel regulatory relationships. Following network construction, we generate independent microarray data on selected deletion mutants to prospectively test network predictions. We demonstrate the potential of our network to discover de novo transcription-factor binding sites. Applying our construction method to previously published data demonstrates that our method is competitive with leading network construction algorithms in the literature.

  3. Dynamic Forecasting Conditional Probability of Bombing Attacks Based on Time-Series and Intervention Analysis.

    PubMed

    Li, Shuying; Zhuang, Jun; Shen, Shifei

    2017-07-01

    In recent years, various types of terrorist attacks occurred, causing worldwide catastrophes. According to the Global Terrorism Database (GTD), among all attack tactics, bombing attacks happened most frequently, followed by armed assaults. In this article, a model for analyzing and forecasting the conditional probability of bombing attacks (CPBAs) based on time-series methods is developed. In addition, intervention analysis is used to analyze the sudden increase in the time-series process. The results show that the CPBA increased dramatically at the end of 2011. During that time, the CPBA increased by 16.0% in a two-month period to reach the peak value, but still stays 9.0% greater than the predicted level after the temporary effect gradually decays. By contrast, no significant fluctuation can be found in the conditional probability process of armed assault. It can be inferred that some social unrest, such as America's troop withdrawal from Afghanistan and Iraq, could have led to the increase of the CPBA in Afghanistan, Iraq, and Pakistan. The integrated time-series and intervention model is used to forecast the monthly CPBA in 2014 and through 2064. The average relative error compared with the real data in 2014 is 3.5%. The model is also applied to the total number of attacks recorded by the GTD between 2004 and 2014. © 2016 Society for Risk Analysis.

  4. Near Real-Time Event Detection & Prediction Using Intelligent Software Agents

    DTIC Science & Technology

    2006-03-01

    value was 0.06743. Multiple autoregressive integrated moving average (ARIMA) models were then built to see if the raw data, differenced data, or...slight improvement. The best adjusted r^2 value was found to be 0.1814. Successful results were not expected from linear or ARIMA-based modelling...appear, 2005. [63] Mora-Lopez, L., Mora, J., Morales-Bueno, R., et al. Modelling time series of climatic parameters with probabilistic finite

  5. Change-point analysis of geophysical time-series: application to landslide displacement rate (Séchilienne rock avalanche, France)

    NASA Astrophysics Data System (ADS)

    Amorese, D.; Grasso, J.-R.; Garambois, S.; Font, M.

    2018-05-01

    The rank-sum multiple change-point method is a robust statistical procedure designed to search for the optimal number and location of change points in an arbitrary continuous or discrete sequence of values. As such, this procedure can be used to analyse time-series data. Twelve years of robust data sets for the Séchilienne (French Alps) rockslide show a continuous increase in average displacement rate from 50 to 280 mm per month over the 2004-2014 period, followed by a strong decrease back to 50 mm per month in the 2014-2015 period. Where previous studies tentatively suggested possible kinematic phases, they relied solely on empirical threshold values. In this paper, we analyse how a statistical algorithm for change-point detection helps to better understand the time phases in landslide kinematics. First, we test the efficiency of the statistical algorithm on geophysical benchmark data, these data sets (stream flows and Northern Hemisphere temperatures) having already been analysed by independent statistical tools. Second, we apply the method to 12-yr daily time-series of the Séchilienne landslide, for rainfall and displacement data, from 2003 December to 2015 December, in order to quantitatively extract changes in landslide kinematics. We find two strong significant discontinuities in the weekly cumulated rainfall values: an average rainfall rate increase is resolved in 2012 April and a decrease in 2014 August. Four robust changes are highlighted in the displacement time-series (2008 May, 2009 November-December-2010 January, 2012 September and 2014 March), the 2010 one being preceded by a significant but weak rainfall rate increase (in 2009 November). Accordingly, we are able to quantitatively define five kinematic stages for the Séchilienne rock avalanche during this period. The synchronization between rainfall and displacement rate, only resolved at the end of 2009 and the beginning of 2010, corresponds to a remarkable change (a fourfold increase in mean displacement rate) in the landslide kinematics. This suggests that an increase in rainfall is able to drive an increase in the landslide displacement rate, but that most of the kinematics of the landslide is not directly attributable to rainfall amount. Detailed exploration of the characteristics of the five kinematic stages suggests that the weekly averaged displacement rates are tied more to the frequency of rainy days than to the rainfall rate values. These results suggest that the pattern of the Séchilienne rock avalanche is consistent with previous findings that landslide kinematics depends not only on rainfall but also on soil moisture conditions (which are known to be more strongly related to precipitation frequency than to precipitation amount). Finally, our analysis of the displacement rate time-series pinpoints a change in the susceptibility of the slope response to rainfall, which was slower before the end of 2009 than after. The kinematic history as depicted by statistical tools opens new routes to understanding the apparent complexity of the Séchilienne landslide kinematics.
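
    Readers who want to reproduce this style of segmentation can start from an off-the-shelf change-point library; the sketch below uses the rank-based cost in ruptures, which is in the same nonparametric spirit as the rank-sum procedure (though not the paper's exact algorithm), on a synthetic displacement-rate series with two mean shifts.

      import numpy as np
      import ruptures as rpt

      rng = np.random.default_rng(3)
      rate = np.concatenate([rng.normal(50, 8, 150),    # mm/month
                             rng.normal(180, 15, 200),
                             rng.normal(60, 8, 100)])

      algo = rpt.Binseg(model="rank").fit(rate)
      print(algo.predict(n_bkps=2))   # indices closing each segment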

  6. Development of temporal modelling for forecasting and prediction of malaria infections using time-series and ARIMAX analyses: a case study in endemic districts of Bhutan.

    PubMed

    Wangdi, Kinley; Singhasivanon, Pratap; Silawan, Tassanee; Lawpoolsri, Saranath; White, Nicholas J; Kaewkungwal, Jaranit

    2010-09-03

    Malaria still remains a public health problem in some districts of Bhutan despite a marked reduction of cases in the last few years. To strengthen the country's prevention and control measures, this study was carried out to develop forecasting and prediction models of malaria incidence in the endemic districts of Bhutan using time series and ARIMAX analyses. The study was carried out retrospectively using the monthly malaria cases reported from the health centres to the Vector-borne Disease Control Programme (VDCP) and the meteorological data from the Meteorological Unit, Department of Energy, Ministry of Economic Affairs. Time series analysis was performed on monthly malaria cases, from 1994 to 2008, in seven malaria endemic districts. Time series models derived from a multiplicative seasonal autoregressive integrated moving average (ARIMA) were used to identify the best model using data from 1994 to 2006. A best-fit model was selected for each individual district and for the overall endemic area, and the monthly cases from January to December of 2009 and 2010 were forecasted. In developing the prediction model, the monthly reported malaria cases and the meteorological factors from 1996 to 2008 for the seven districts were analysed. ARIMAX modelling was employed to determine predictors of malaria in the subsequent month. It was found that the ARIMA (p, d, q) (P, D, Q)s model (p and P representing the autoregressive and seasonal autoregressive parameters; d and D the non-seasonal and seasonal differencing; q and Q the moving average and seasonal moving average parameters; and s the length of the seasonal period) for the overall endemic districts was (2,1,1)(0,1,1)12; the models fitted to individual districts revealed two most common forms, (2,1,1)(0,1,1)12 and (1,1,1)(0,1,1)12. The forecasted monthly malaria cases from January to December varied from 15 to 82 cases in 2009 and from 67 to 149 cases in 2010, with a population of 285,375 in 2009 and an expected population of 289,085 in 2010. The ARIMAX models of monthly cases and climatic factors showed considerable variation among the districts. In general, the mean maximum temperature lagged by one month was a strong positive predictor of increased malaria cases in four districts. The number of cases in the previous month was also a significant predictor in one district, whereas no variable could predict malaria cases in two districts. The ARIMA time-series models were useful for forecasting the number of cases in the endemic areas of Bhutan. There was no consistency in the predictors of malaria cases when using the ARIMAX model with selected lag times and climatic predictors. The ARIMA forecasting models could be employed for planning and managing the malaria prevention and control programme in Bhutan.
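
    In modern open-source terms, an ARIMAX of this form corresponds to a SARIMAX model with exogenous regressors; a minimal statsmodels sketch follows, with a synthetic case series and a single lagged-temperature predictor standing in for the Bhutanese data.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.statespace.sarimax import SARIMAX

      rng = np.random.default_rng(4)
      n = 180
      X = pd.DataFrame({"max_temp_lag1":
                        25 + 5 * np.sin(2 * np.pi * np.arange(n) / 12)
                        + rng.normal(0, 1, n)})
      y = 40 + 3 * X["max_temp_lag1"] + rng.normal(0, 5, n)

      model = SARIMAX(y, exog=X, order=(2, 1, 1),
                      seasonal_order=(0, 1, 1, 12))
      fit = model.fit(disp=False)
      # forecasting requires (here: placeholder) future exogenous values
      print(fit.forecast(steps=12, exog=X.tail(12)).round(1))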

  7. Rapid Calculation of Spacecraft Trajectories Using Efficient Taylor Series Integration

    NASA Technical Reports Server (NTRS)

    Scott, James R.; Martini, Michael C.

    2011-01-01

    A variable-order, variable-step Taylor series integration algorithm was implemented in NASA Glenn's SNAP (Spacecraft N-body Analysis Program) code. SNAP is a high-fidelity trajectory propagation program that can propagate the trajectory of a spacecraft about virtually any body in the solar system. The Taylor series algorithm's very high order accuracy and excellent stability properties lead to large reductions in computer time relative to the code's existing 8th-order Runge-Kutta scheme. Head-to-head comparison on near-Earth, lunar, Mars, and Europa missions showed that Taylor series integration is 15.8 times faster than Runge-Kutta on average, and is more accurate. These speedups were obtained for calculations involving central body, other body, thrust, and drag forces. Similar speedups have been obtained for calculations that include the J2 spherical harmonic for central body gravitation. The algorithm includes a step size selection method that directly calculates the step size and never requires a repeat step. High-order Taylor series integration algorithms have been shown to provide major reductions in computer time over conventional integration methods in numerous scientific applications. The objective here was to directly implement Taylor series integration in an existing trajectory analysis code and demonstrate that large reductions in computer time (an order of magnitude) could be achieved while simultaneously maintaining high accuracy. This software greatly accelerates the calculation of spacecraft trajectories. At each time level, the spacecraft position, velocity, and mass are expanded in a high-order Taylor series whose coefficients are obtained through efficient differentiation arithmetic. This makes it possible to take very large time steps at minimal cost, resulting in large savings in computer time. The Taylor series algorithm is implemented primarily through three subroutines: (1) a driver routine that automatically introduces auxiliary variables, sets up initial conditions, and integrates; (2) a routine that calculates system reduced derivatives using recurrence relations for quotients and products; and (3) a routine that determines the step size and sums the series. The order of accuracy used in a trajectory calculation is arbitrary and can be set by the user. The algorithm directly calculates the motion of other planetary bodies and does not require ephemeris files (except to start the calculation). The code also runs with Taylor series and Runge-Kutta used interchangeably for different phases of a mission.
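
    SNAP's implementation is not shown here; the toy integrator below illustrates the two core ideas on the scalar test problem y' = -y: Taylor coefficients generated by a recurrence relation, and a step size computed directly from the last retained term so that no step is ever rejected and repeated.

```python
import math

def taylor_step(y, order, tol):
    # coefficients of the Taylor series of the solution of y' = -y:
    # c[k+1] = -c[k] / (k + 1)
    c = [y]
    for k in range(order):
        c.append(-c[k] / (k + 1))
    # direct step-size choice: make the last retained term about tol,
    # so the step never has to be repeated
    h = (tol / abs(c[-1])) ** (1.0 / order) if c[-1] != 0 else 1.0
    return sum(ck * h**k for k, ck in enumerate(c)), h

y, t = 1.0, 0.0
while t < 5.0:
    y, h = taylor_step(y, order=15, tol=1e-12)
    t += h
print(y, math.exp(-t))  # the two values agree closely
```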

  8. Using satellite laser ranging to measure ice mass change in Greenland and Antarctica

    NASA Astrophysics Data System (ADS)

    Bonin, Jennifer A.; Chambers, Don P.; Cheng, Minkang

    2018-01-01

    A least squares inversion of satellite laser ranging (SLR) data over Greenland and Antarctica could extend gravimetry-based estimates of mass loss back to the early 1990s and fill any future gap between the current Gravity Recovery and Climate Experiment (GRACE) and the future GRACE Follow-On mission. The results of a simulation suggest that, while separating the mass change between Greenland and Antarctica is not possible at the limited spatial resolution of the SLR data, estimating the total combined mass change of the two areas is feasible. When the method is applied to real SLR and GRACE gravity series, we find significantly different estimates of inverted mass loss. There are large, unpredictable, interannual differences between the two inverted data types, making us conclude that the current 5×5 spherical harmonic SLR series cannot be used to stand in for GRACE. However, a comparison with the longer IMBIE time series suggests that on a 20-year time frame, the inverted SLR series' interannual excursions may average out, and the long-term mass loss estimate may be reasonable.

  9. Water quality management using statistical analysis and time-series prediction model

    NASA Astrophysics Data System (ADS)

    Parmar, Kulwinder Singh; Bhardwaj, Rashmi

    2014-12-01

    This paper deals with water quality management using statistical analysis and a time-series prediction model. The monthly variation of water quality standards has been used to compare the statistical mean, median, mode, standard deviation, kurtosis, skewness and coefficient of variation at the Yamuna River. The model was validated using R-squared, root mean square error, mean absolute percentage error, maximum absolute percentage error, mean absolute error, maximum absolute error, normalized Bayesian information criterion, Ljung-Box analysis, predicted values and confidence limits. Using an autoregressive integrated moving average (ARIMA) model, future values of the water quality parameters have been estimated. It is observed that the predictive model is useful at 95 % confidence limits and that the curve is platykurtic for potential of hydrogen (pH), free ammonia, total Kjeldahl nitrogen, dissolved oxygen and water temperature (WT), and leptokurtic for chemical oxygen demand and biochemical oxygen demand. Also, it is observed that the predicted series is close to the original series, indicating a close fit. All parameters except pH and WT cross the prescribed limits of the World Health Organization/United States Environmental Protection Agency, and thus the water is not fit for drinking, agriculture and industrial use.
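
    The fit measures listed above follow standard textbook definitions; a small sketch computing a few of them (the observed/predicted values are made up for illustration):

```python
import numpy as np

def validation_metrics(observed, predicted):
    """RMSE, MAE, MaxAE, MAPE and MaxAPE with their usual formulas."""
    o = np.asarray(observed, float)
    p = np.asarray(predicted, float)
    err, pct = o - p, 100 * np.abs(o - p) / np.abs(o)
    return {"RMSE": np.sqrt(np.mean(err**2)), "MAE": np.mean(np.abs(err)),
            "MaxAE": np.max(np.abs(err)), "MAPE": np.mean(pct),
            "MaxAPE": np.max(pct)}

# illustrative observed vs ARIMA-predicted pH values
print(validation_metrics([7.2, 7.4, 7.1, 7.6], [7.1, 7.5, 7.0, 7.7]))
```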

  10. Comparison of hybrid spectral-decomposition artificial neural network models for understanding climatic forcing of groundwater levels

    NASA Astrophysics Data System (ADS)

    Abrokwah, K.; O'Reilly, A. M.

    2017-12-01

    Groundwater is an important resource that is extracted every day because of its invaluable use for domestic, industrial and agricultural purposes. The need for sustaining groundwater resources is clearly indicated by declining water levels and has motivated the modeling and forecasting of groundwater levels. In this study, spectral decomposition of climatic forcing time series was used to develop hybrid wavelet analysis (WA) and moving window average (MWA) artificial neural network (ANN) models. These techniques are explored by modeling historical groundwater levels in order to provide understanding of potential causes of the observed groundwater-level fluctuations. Selection of the appropriate decomposition level for WA and window size for MWA helps in understanding the important time scales of climatic forcing, such as rainfall, that influence water levels. The discrete wavelet transform (DWT) is used to decompose the input time-series data into various levels of approximation and detail wavelet coefficients, whilst the MWA acts as a low-pass signal-filtering technique for removing high-frequency signals from the input data. The variables used to develop and validate the models were daily average rainfall measurements from five National Oceanic and Atmospheric Administration (NOAA) weather stations and daily water-level measurements from two wells recorded from 1978 to 2008 in central Florida, USA. Using different decomposition levels and different window sizes, several WA-ANN and MWA-ANN models for simulating the water levels were created and their relative performances compared against each other. The WA-ANN models performed better than the corresponding MWA-ANN models, and higher decomposition levels of the input signal by the DWT gave the best results. The results obtained show the applicability and feasibility of hybrid WA-ANN and MWA-ANN models for simulating daily water levels using only climatic forcing time series as model inputs.
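
    A sketch of the two input transformations, with PyWavelets for the DWT and a plain moving average for the MWA; the rainfall series here is synthetic, not the Florida data.

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(1)
rain = rng.gamma(0.3, 8.0, size=1024)  # stand-in for a daily rainfall series

# WA input: multi-level discrete wavelet decomposition of the forcing
coeffs = pywt.wavedec(rain, "db4", level=4)  # [cA4, cD4, cD3, cD2, cD1]

# MWA input: low-pass filtering with a trailing moving-window average
window = 30
mwa = np.convolve(rain, np.ones(window) / window, mode="valid")

print([len(c) for c in coeffs], mwa.shape)  # inputs that would feed the ANNs
```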

  11. Non-linear feature extraction from HRV signal for mortality prediction of ICU cardiovascular patient.

    PubMed

    Karimi Moridani, Mohammad; Setarehdan, Seyed Kamaledin; Motie Nasrabadi, Ali; Hajinasrollah, Esmaeil

    2016-01-01

    Intensive care unit (ICU) patients are at risk of in-ICU morbidity and mortality, making specific systems for identifying at-risk patients a necessity for improving clinical care. This study presents a new method for predicting in-hospital mortality using heart rate variability (HRV) collected during a patient's ICU stay. In this paper, an HRV time series processing based method is proposed for mortality prediction of ICU cardiovascular patients. HRV signals were obtained by measuring R-R time intervals. A novel method, named the return map, is then developed that reveals useful information from the HRV time series. This study also proposes several features that can be extracted from the return map, including the angle between two vectors, the area of triangles formed by successive points, the shortest distance to the 45° line, and their various combinations. Finally, a thresholding technique is proposed to extract the risk period and to predict mortality. The data used to evaluate the proposed algorithm were obtained from 80 cardiovascular ICU patients, from the first 48 h of the first ICU stay of 40 males and 40 females. This study showed that the angle feature has on average a sensitivity of 87.5% (with 12 false alarms), the area feature 89.58% (with 10 false alarms), the shortest-distance feature 85.42% (with 14 false alarms) and, finally, the combined feature 92.71% (with seven false alarms). The results showed that the last half hour before the patient's death is very informative for diagnosing the patient's condition and potentially saving his or her life. These results confirm that it is possible to predict mortality based on the features introduced in this paper, relying on the variations of the HRV dynamic characteristics.
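
    The paper defines the exact feature formulas; the sketch below uses plausible stand-in definitions for the three geometric quantities on the return map of points (RR[n], RR[n+1]).

```python
import numpy as np

def return_map_features(rr):
    # points of the return map: (RR[n], RR[n+1])
    p = np.column_stack([rr[:-1], rr[1:]])
    # shortest distance of each point to the 45-degree identity line
    dist45 = np.abs(p[:, 0] - p[:, 1]) / np.sqrt(2)
    # angle between successive point vectors (stand-in definition)
    v1, v2 = p[:-1], p[1:]
    cosang = np.sum(v1 * v2, axis=1) / (
        np.linalg.norm(v1, axis=1) * np.linalg.norm(v2, axis=1))
    angles = np.arccos(np.clip(cosang, -1.0, 1.0))
    # area of the triangle formed by three successive points
    a, b, c = p[:-2], p[1:-1], p[2:]
    areas = 0.5 * np.abs((b[:, 0] - a[:, 0]) * (c[:, 1] - a[:, 1])
                         - (b[:, 1] - a[:, 1]) * (c[:, 0] - a[:, 0]))
    return dist45, angles, areas

rr = np.array([0.80, 0.82, 0.79, 0.85, 0.83, 0.81])  # RR intervals in seconds
print([f.mean() for f in return_map_features(rr)])
```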

  12. Arthroscopic bursectomy for recalcitrant trochanteric bursitis after hip arthroplasty.

    PubMed

    Van Hofwegen, Christopher; Baker, Champ L; Savory, Carlton G; Baker, Champ L

    2013-01-01

    This study evaluated the use of arthroscopic bursectomy for pain relief in patients with trochanteric bursitis after hip arthroplasty. In this retrospective case series of 12 patients undergoing arthroscopic treatment of recalcitrant trochanteric bursitis after hip arthroplasty, outcomes were assessed via phone interview with a numeric pain rating scale from 1 to 10 and were compared with preoperative pain ratings. Patients were asked the percentage of time they had painless hip function and whether they would have the surgery again. At an average 36-month follow-up (range, 4-85 months), the average numeric pain scale rating improved from 9.3 to 3.3. Patients reported painless use of the hip an average of 62% of the time. Ten of the 12 patients in the study felt the pain relief gained was substantial enough to warrant having the procedure again. Arthroscopic bursectomy was thus a viable option for patients with recalcitrant bursitis after hip arthroplasty.

  13. Variability of rainfall over Lake Kariba catchment area in the Zambezi river basin, Zimbabwe

    NASA Astrophysics Data System (ADS)

    Muchuru, Shepherd; Botai, Joel O.; Botai, Christina M.; Landman, Willem A.; Adeola, Abiodun M.

    2016-04-01

    In this study, average monthly and annual rainfall totals recorded for the period 1970 to 2010 from a network of 13 stations across the Lake Kariba catchment area of the Zambezi river basin were analyzed in order to characterize the spatio-temporal variability of rainfall across the catchment area. In the analysis, the data were subjected to intervention and homogeneity analysis using the Cumulative Summation (CUSUM) technique and step change analysis using the rank-sum test. Furthermore, rainfall variability was characterized by trend analysis using the non-parametric Mann-Kendall statistic. Additionally, the rainfall series were decomposed and the spectral characteristics derived using Cross Wavelet Transform (CWT) and Wavelet Coherence (WC) analysis. The advantage of using the wavelet-based parameters is that they vary in time and can therefore be used to quantitatively detect time-scale-dependent correlations and phase shifts between rainfall time series at various localized time-frequency scales. The annual and seasonal rainfall series were homogeneous and demonstrated no apparent significant shifts. According to the inhomogeneity classification, the rainfall series recorded across the Lake Kariba catchment area belonged to categories A (useful) and B (doubtful), i.e., there were zero to one, and two, absolute tests rejecting the null hypothesis (at the 5 % significance level), respectively. Lastly, the long-term variability of the rainfall series across the Lake Kariba catchment area exhibited non-significant positive and negative trends with coherent oscillatory modes that are constantly locked in phase in the Morlet wavelet space.
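
    A compact implementation of the Mann-Kendall trend statistic used above, in its standard formulation and ignoring tie corrections; the annual totals are made up for illustration.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    x = np.asarray(x, float)
    n = len(x)
    # S statistic: sign of every pairwise difference
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0  # variance without tie correction
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return z, 2 * (1 - norm.cdf(abs(z)))  # z score and two-sided p-value

annual_rain = [712, 655, 840, 590, 760, 610, 700, 680, 730, 640]  # made-up mm
print(mann_kendall(annual_rain))
```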

  14. Severe European winters in a secular perspective

    NASA Astrophysics Data System (ADS)

    Hoy, Andreas; Hänsel, Stephanie

    2017-04-01

    Temperature conditions during winter are substantially shaped by strong year-to-year variability. European winters since the late 1980s - compared to previous decades and centuries - were mainly characterised by a high temperature level, including recent record-warm winters. Yet comparably cold winters and severe cold spells still occur nowadays, as recently observed from 2009 to 2013 and in early 2017. In 2010, Central England experienced its second coldest December since the start of observations more than 350 years ago, and some of the lowest temperatures ever measured in northern Europe (below -50 °C in Lapland) were recorded in January 1999. Analysing the thermal characteristics and spatial distribution of severe (historical) winters - using early instrumental data - helps expand and consolidate our knowledge of past weather extremes. This contribution presents efforts in this direction. We focus on a) compiling and assessing a very long-term instrumental, spatially widespread and well-distributed, high-quality meteorological data set to b) investigate very cold winter temperatures in Europe from early measurements until today. In a first step, we analyse the longest available time series of monthly temperature averages within Europe. Our dataset extends from the Nordic countries to the Mediterranean and from the British Isles to Russia. We utilise homogenised time series as much as possible in order to ensure reliable results. Homogenised data derive from the NORDHOM (Scandinavia) and HISTALP (greater alpine region) datasets or were obtained from national weather services and universities. Other (not specifically homogenised) data were derived from the ECA&D dataset or national institutions. The employed time series often start during the 18th century, with Paris and Central England being the longest datasets (from 1659). In a second step, daily temperature averages are involved. Only some of those series are homogenised, but those available are sufficiently distributed throughout Europe to ensure reliable results. Furthermore, the comparably dense network of long-term observations allows appropriate quality checking within the network. Additionally, the large collection of homogenised monthly data enables assessing the quality of many daily series. Daily data are used to sum up negative values for the respective winter periods to create time series of "cold summations", which are a good indicator of the severity of winters in most parts of Europe. Additionally, days below certain thresholds may be counted or summed up. Future work will include daily minimum and maximum temperatures, allowing an extensive set of climate indices to be calculated and applied, refining the work presented here.
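
    The "cold summation" index reduces to a one-line computation; a sketch with made-up daily means:

```python
import numpy as np

def cold_sum(daily_mean_temp):
    """Sum of the negative daily mean temperatures of one winter (deg C days);
    a larger magnitude indicates a more severe winter."""
    t = np.asarray(daily_mean_temp, float)
    return t[t < 0].sum()

winter = [-3.2, 1.5, -8.1, -12.4, 0.3, -5.0]  # illustrative daily means, deg C
print(cold_sum(winter))  # -> -28.7
```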

  15. An Estimation of the Likelihood of Significant Eruptions During 2000-2009 Using Poisson Statistics on Two-Point Moving Averages of the Volcanic Time Series

    NASA Technical Reports Server (NTRS)

    Wilson, Robert M.

    2001-01-01

    Since 1750, the number of cataclysmic volcanic eruptions (volcanic explosivity index (VEI)>=4) per decade spans 2-11, with 96 percent located in the tropics and extra-tropical Northern Hemisphere. A two-point moving average of the volcanic time series has higher values since the 1860's than before, being 8.00 in the 1910's (the highest value) and 6.50 in the 1980's, the highest since the 1910's peak. Because of the usual behavior of the first difference of the two-point moving averages, one infers that its value for the 1990's will measure approximately 6.50 +/- 1, implying that approximately 7 +/- 4 cataclysmic volcanic eruptions should be expected during the present decade (2000-2009). Because cataclysmic volcanic eruptions (especially those having VEI>=5) nearly always have been associated with short-term episodes of global cooling, the occurrence of even one might confuse our ability to assess the effects of global warming. Poisson probability distributions reveal that the probability of one or more events with a VEI>=4 within the next ten years is >99 percent. It is approximately 49 percent for an event with a VEI>=5, and 18 percent for an event with a VEI>=6. Hence, the likelihood that a climatically significant volcanic eruption will occur within the next ten years appears reasonably high.
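
    The quoted probabilities follow from the Poisson model P(N >= 1) = 1 - e^(-lambda); the decadal rates below are back-of-envelope values chosen to reproduce the stated numbers, not the author's inputs.

```python
from scipy.stats import poisson

# assumed decadal rates consistent with the probabilities quoted above
for vei, lam in [(">=4", 7.0), (">=5", 0.67), (">=6", 0.20)]:
    p = 1 - poisson.pmf(0, lam)  # P(N >= 1) = 1 - exp(-lambda)
    print(f"VEI {vei}: P(at least one eruption per decade) = {p:.2f}")
```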

  16. A High Precision Prediction Model Using Hybrid Grey Dynamic Model

    ERIC Educational Resources Information Center

    Li, Guo-Dong; Yamaguchi, Daisuke; Nagai, Masatake; Masuda, Shiro

    2008-01-01

    In this paper, we propose a new prediction analysis model which combines the first order one variable Grey differential equation Model (abbreviated as GM(1,1) model) from grey system theory and time series Autoregressive Integrated Moving Average (ARIMA) model from statistics theory. We abbreviate the combined GM(1,1) ARIMA model as ARGM(1,1)…

  17. Relating Factor Models for Longitudinal Data to Quasi-Simplex and NARMA Models

    ERIC Educational Resources Information Center

    Rovine, Michael J.; Molenaar, Peter C. M.

    2005-01-01

    In this article we show the one-factor model can be rewritten as a quasi-simplex model. Using this result along with addition theorems from time series analysis, we describe a common general model, the nonstationary autoregressive moving average (NARMA) model, that includes as a special case, any latent variable model with continuous indicators…

  18. Predictability of monthly temperature and precipitation using automatic time series forecasting methods

    NASA Astrophysics Data System (ADS)

    Papacharalampous, Georgia; Tyralis, Hristos; Koutsoyiannis, Demetris

    2018-02-01

    We investigate the predictability of monthly temperature and precipitation by applying automatic univariate time series forecasting methods to a sample of 985 40-year-long monthly temperature and 1552 40-year-long monthly precipitation time series. The methods include a naïve one based on the monthly values of the last year, as well as the random walk (with drift), AutoRegressive Fractionally Integrated Moving Average (ARFIMA), exponential smoothing state-space model with Box-Cox transformation, ARMA errors, Trend and Seasonal components (BATS), simple exponential smoothing, Theta and Prophet methods. Prophet is a recently introduced model inspired by the nature of time series forecasted at Facebook and has not been applied to hydrometeorological time series before, while the use of random walk, BATS, simple exponential smoothing and Theta is rare in hydrology. The methods are tested in performing multi-step ahead forecasts for the last 48 months of the data. We further investigate how different choices of handling the seasonality and non-normality affect the performance of the models. The results indicate that: (a) all the examined methods apart from the naïve and random walk ones are accurate enough to be used in long-term applications; (b) monthly temperature and precipitation can be forecasted to a level of accuracy which can barely be improved using other methods; (c) the externally applied classical seasonal decomposition results mostly in better forecasts compared to the automatic seasonal decomposition used by the BATS and Prophet methods; and (d) Prophet is competitive, especially when it is combined with externally applied classical seasonal decomposition.
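
    A sketch of finding (c) above, externally applied classical seasonal decomposition wrapped around a simple exponential smoothing forecast, on a synthetic monthly series; this uses statsmodels and is not the paper's code.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose
from statsmodels.tsa.holtwinters import SimpleExpSmoothing

# synthetic 40-year monthly temperature-like series standing in for the data
rng = np.random.default_rng(6)
idx = pd.date_range("1978-01", periods=480, freq="MS")
y = pd.Series(15 + 8 * np.sin(2 * np.pi * idx.month / 12)
              + rng.normal(0, 1, 480), index=idx)

# externally applied classical seasonal decomposition
dec = seasonal_decompose(y, model="additive", period=12)
deseasonalized = y - dec.seasonal

# forecast the deseasonalized series, then add the seasonal component back
fit = SimpleExpSmoothing(deseasonalized).fit()
forecast = fit.forecast(48) + dec.seasonal[-48:].values  # 48 is a multiple of 12
print(forecast.head())
```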

  19. Time series regression model for infectious disease and weather.

    PubMed

    Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro

    2015-10-01

    Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context.
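
    A minimal sketch of two of the recommended modifications: a negative binomial GLM to absorb overdispersion, with the logarithm of lagged counts as an autocorrelation control. The data and column names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# hypothetical weekly counts and temperature (3 years of data)
rng = np.random.default_rng(2)
df = pd.DataFrame({
    "cases": rng.poisson(20, 156),
    "temp": 15 + 10 * np.sin(np.arange(156) * 2 * np.pi / 52),
})
# log of lagged counts: the proposed control for autocorrelation by contagion
df["log_lag_cases"] = np.log(df["cases"].shift(1) + 1)
df = df.dropna()

X = sm.add_constant(df[["temp", "log_lag_cases"]])
# negative binomial family absorbs the overdispersion discussed above
fit = sm.GLM(df["cases"], X,
             family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(fit.params)
```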

  20. Spaceborne Sun-Induced Vegetation Fluorescence Time Series from 2007 to 2015 Evaluated with Australian Flux Tower Measurements

    NASA Technical Reports Server (NTRS)

    Sanders, Abram F. J.; Verstraeten, Willem W.; Kooreman, Maurits L.; van Leth, Thomas C.; Beringer, Jason; Joiner, Joanna

    2016-01-01

    A global, monthly averaged time series of Sun-induced Fluorescence (SiF), spanning January 2007 to June 2015, was derived from Metop-A Global Ozone Monitoring Experiment 2 (GOME-2) spectral measurements. Far-red SiF was retrieved using the filling-in of deep solar Fraunhofer lines and atmospheric absorption bands based on the general methodology described by Joiner et al, AMT, 2013. A Principal Component (PC) analysis of spectra over non-vegetated areas was performed to describe the effects of atmospheric absorption. Our implementation (SiF KNMI) is an independent algorithm and differs from the latest implementation of Joiner et al, AMT, 2013 (SiF NASA, v26), because we used desert reference areas for determining PCs (as opposed to cloudy ocean and some desert) and a wider fit window that covers water vapour and oxygen absorption bands (as opposed to only Fraunhofer lines). As a consequence, more PCs were needed (35 as opposed to 12). The two time series (SiF KNMI and SiF NASA, v26) correlate well (overall R of 0.78) except for tropical rain forests. Sensitivity experiments suggest the strong impact of the water vapour absorption band on retrieved SiF values. Furthermore, we evaluated the SiF time series with Gross Primary Productivity (GPP) derived from twelve flux towers in Australia. Correlations for individual towers range from 0.37 to 0.84. They are particularly high for managed biome types. In the de-seasonalized Australian SiF time series, the break of the Millennium Drought during local summer of 2010/2011 is clearly observed.

  1. Variability in tropical cyclone heat potential over the Southwest Indian Ocean

    NASA Astrophysics Data System (ADS)

    Malan, N.; Reason, C. J. C.; Loveday, B. R.

    2013-12-01

    Tropical cyclone heat potential (TCHP) has been proposed as being important for hurricane and typhoon intensity. Here, a climatology of TCHP is developed for the Southwest Indian Ocean, a basin that experiences on average 11-12 tropical cyclones per year, many of which impact Mauritius, Reunion, Madagascar, and Mozambique. SODA data and a regional ocean model forced with the GFDL-CORE v.2b reanalysis winds and heat fluxes are used to derive TCHP values for the 1948-2007 period. The results indicate that TCHP increases through the austral summer, peaking in March. Values of TCHP above 40 kJ cm-2, suggested as the minimum needed for tropical cyclone intensification, are still present in the northern Mozambique Channel in May. A time series of TCHP spatially averaged over the Seychelles-Chagos thermocline ridge (SCTR), an important area for tropical cyclones, is presented. The model time series, which agrees well with XBT-based observations (r = 0.82, p = 0.01), shows considerable interannual variability overlaying an upward tendency that matches an observed increase in severe tropical cyclone days in the Southwest Indian Ocean. Although an increase in severe storms is seen during 1997-2007, the upward TCHP tendency after 1997 coincides with a decrease in total cyclone numbers, a mismatch that is ascribed to increased atmospheric anticyclonicity over the basin. Seasons of increased (decreased) TCHP over the SCTR appear to be associated with dry (wet) conditions over certain areas of southern and East Africa and are linked with changes in zonal wind and vertical motion in the midtroposphere.
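
    TCHP has a standard definition, the ocean heat content integrated from the surface down to the 26 °C isotherm; a sketch with an illustrative profile (the constants and profile values are assumptions, not the study's inputs):

```python
import numpy as np

RHO, CP = 1026.0, 4178.0  # assumed seawater density (kg/m3), heat capacity (J/kg/K)

def tchp_kj_per_cm2(temp_c, depth_m):
    """Heat content above 26 degC integrated over depth, in kJ/cm2."""
    t = np.asarray(temp_c, float)
    z = np.asarray(depth_m, float)
    excess = np.clip(t - 26.0, 0.0, None)  # only water warmer than 26 degC counts
    j_per_m2 = RHO * CP * np.sum((excess[:-1] + excess[1:]) / 2 * np.diff(z))
    return j_per_m2 * 1e-7  # J/m2 -> kJ/cm2

profile_t = [29.0, 28.5, 27.2, 26.3, 25.0]  # illustrative temperatures (degC)
profile_z = [0, 20, 40, 60, 80]             # depths (m)
print(tchp_kj_per_cm2(profile_t, profile_z))  # ~47, above the 40 kJ/cm2 threshold
```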

  2. Real Data and Rapid Results: Ocean Color Data Analysis with Giovanni (GES DISC Interactive Online Visualization and ANalysis Infrastructure)

    NASA Technical Reports Server (NTRS)

    Acker, J. G.; Leptoukh, G.; Kempler, S.; Gregg, W.; Berrick, S.; Zhu, T.; Liu, Z.; Rui, H.; Shen, S.

    2004-01-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has taken a major step addressing the challenge of using archived Earth Observing System (EOS) data for regional or global studies by developing an infrastructure with a World Wide Web interface which allows online, interactive data analysis: the GES DISC Interactive Online Visualization and ANalysis Infrastructure, or "Giovanni." Giovanni provides a data analysis environment that is largely independent of underlying data file format. The Ocean Color Time-Series Project has created an initial implementation of Giovanni using monthly Standard Mapped Image (SMI) data products from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) mission. Giovanni users select geophysical parameters, and the geographical region and time period of interest. The system rapidly generates a graphical or ASCII numerical data output. Currently available output options are: Area plot (averaged or accumulated over any available data period for any rectangular area); Time plot (time series averaged over any rectangular area); Hovmöller plots (image view of any longitude-time and latitude-time cross sections); ASCII output for all plot types; and area plot animations. Future plans include correlation plots, output formats compatible with Geographical Information Systems (GIS), and higher temporal resolution data. The Ocean Color Time-Series Project will produce sensor-independent ocean color data beginning with the Coastal Zone Color Scanner (CZCS) mission and extending through SeaWiFS and Moderate Resolution Imaging Spectroradiometer (MODIS) data sets, and will enable incorporation of Visible/Infrared Imaging Radiometer Suite (VIIRS) data, which will be added to Giovanni. The first phase of Giovanni will also include tutorials demonstrating the use of Giovanni and collaborative assistance in the development of research projects using the SeaWiFS and Ocean Color Time-Series Project data in the online Laboratory for Ocean Color Users (LOCUS). The synergy of Giovanni with high-quality ocean color data provides users with the ability to investigate a variety of important oceanic phenomena, such as coastal primary productivity related to pelagic fisheries, seasonal patterns and interannual variability, interdependence of atmospheric dust aerosols and harmful algal blooms, and the potential effects of climate change on oceanic productivity.

  3. Characterization of traffic-related PM concentration distribution and fluctuation patterns in near-highway urban residential street canyons.

    PubMed

    Hahn, Intaek; Brixey, Laurie A; Wiener, Russell W; Henkle, Stacy W; Baldauf, Richard

    2009-12-01

    Analyses of outdoor traffic-related particulate matter (PM) concentration distribution and fluctuation patterns in urban street canyons within a microscale distance of less than 500 m from a highway source are presented as part of the results from the Brooklyn Traffic Real-Time Ambient Pollutant Penetration and Environmental Dispersion (B-TRAPPED) study. Various patterns of spatial and temporal changes in the street canyon PM concentrations were investigated using time-series data of real-time PM concentrations measured during multiple monitoring periods. Concurrent time-series data of local street canyon wind conditions and wind data from the John F. Kennedy (JFK) International Airport National Weather Service (NWS) were used to characterize the effects of various wind conditions on the behavior of street canyon PM concentrations. Our results suggest that wind direction may strongly influence time-averaged mean PM concentration distribution patterns in near-highway urban street canyons. The rooftop-level wind speeds were found to be strongly correlated with the PM concentration fluctuation intensities in the middle sections of the street blocks. The ambient turbulence generated by shifting local wind directions (angles) showed a good correlation with the PM concentration fluctuation intensities along the entire distance of the first and second street blocks only when the wind angle standard deviations were larger than 30 degrees. Within-canyon turbulent shearing, caused by fluctuating local street canyon wind speeds, showed no correlation with PM concentration fluctuation intensities. The time-averaged mean PM concentration distribution along the longitudinal distances of the street blocks when the wind direction was mostly constant and parallel to the street was found to be similar to the distribution pattern for the entire monitoring period, when wind direction fluctuated wildly. Finally, we showed that two different PM concentration metrics (time-averaged mean concentration and number of concentration peaks above a certain threshold level) can possibly lead to different assessments of spatial concentration distribution patterns.

  4. Limb lengthening in short stature patients.

    PubMed

    Aldegheri, R; Dall'Oca, C

    2001-07-01

    A series of 140 patients with short stature operated on for limb lengthening (80 had achondroplasia, 20 had hypochondroplasia, 20 had Turner syndrome, 10 had idiopathic short stature due to an undemonstrated cause, 5 regarded their stature as too short, and 5 had a psychopathic personality due to dysmorphophobia that had developed because of their short stature) was reviewed. All patients underwent symmetric lengthening of both femora and tibiae; 10 of these achondroplastic patients underwent lengthening of the humeri. We carried out the 580 lengthening procedures by means of three different surgical techniques: 440 callotasis, 120 chondrodiatasis and 20 mid-shaft osteotomy. In the 130 patients with a disproportionate short stature, the average gain in length was 18.2 +/- 3.93 cm: 43.8% had complications and 3.8% had sequelae; the average treatment time was 31 months. In the 10 patients with proportionate short stature, the average gain in length was 10.8 +/- 1.00 cm: 4 experienced complications and none had sequelae; the average treatment time was 21 months. Patients who underwent lengthening of the upper limbs experienced an average gain in length of 10.2 +/- 1.25 cm: the average treatment time was 9 months and none of them experienced any complications or sequelae. The authors discuss how difficult it is to achieve the benefits of this surgery: they underline the strong commitment on the part of the patients and their families, the time in the hospital, the number of operations and, above all, the severity of those permanent sequelae that occurred.

  5. OceanXtremes: Scalable Anomaly Detection in Oceanographic Time-Series

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Armstrong, E. M.; Chin, T. M.; Gill, K. M.; Greguska, F. R., III; Huang, T.; Jacob, J. C.; Quach, N.

    2016-12-01

    The oceanographic community must meet the challenge to rapidly identify features and anomalies in complex and voluminous observations to further science and improve decision support. Given this data-intensive reality, we are developing an anomaly detection system, called OceanXtremes, powered by an intelligent, elastic Cloud-based analytic service backend that enables execution of domain-specific, multi-scale anomaly and feature detection algorithms across the entire archive of 15 to 30-year ocean science datasets. Our parallel analytics engine is extending the NEXUS system and exploits multiple open-source technologies: Apache Cassandra as a distributed spatial "tile" cache, Apache Spark for in-memory parallel computation, and Apache Solr for spatial search and storing pre-computed tile statistics and other metadata. OceanXtremes provides these key capabilities: Parallel generation (Spark on a compute cluster) of 15 to 30-year Ocean Climatologies (e.g. sea surface temperature or SST) in hours or overnight, using simple pixel averages or customizable Gaussian-weighted "smoothing" over latitude, longitude, and time; Parallel pre-computation, tiling, and caching of anomaly fields (daily variables minus a chosen climatology) with pre-computed tile statistics; Parallel detection (over the time-series of tiles) of anomalies or phenomena by regional area-averages exceeding a specified threshold (e.g. high SST in El Nino or SST "blob" regions), or more complex, custom data mining algorithms; Shared discovery and exploration of ocean phenomena and anomalies (facet search using Solr), along with unexpected correlations between key measured variables; Scalable execution for all capabilities on a hybrid Cloud, using our on-premise OpenStack Cloud cluster or at Amazon. The key idea is that the parallel data-mining operations will be run "near" the ocean data archives (a local "network" hop) so that we can efficiently access the thousands of files making up a three decade time-series. The presentation will cover the architecture of OceanXtremes, parallelization of the climatology computation and anomaly detection algorithms using Spark, example results for SST and other time-series, and parallel performance metrics.
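
    The Spark/Cassandra machinery aside, the core climatology-anomaly-threshold logic can be sketched in a few lines of numpy on a toy SST stack (the data and region are invented):

```python
import numpy as np

rng = np.random.default_rng(7)
# toy SST stack: 30 years of monthly fields (time x lat x lon),
# with a warm "blob" injected into the final year
sst = 20 + rng.normal(0, 0.5, size=(360, 30, 40))
sst[-12:, 10:20, 15:25] += 2.0

# climatology from per-month pixel averages
clim = sst.reshape(30, 12, 30, 40).mean(axis=0)  # 12 monthly mean fields
anom = sst - np.tile(clim, (30, 1, 1))           # variable minus climatology

# flag months whose regional area-average anomaly exceeds a threshold
region = anom[:, 10:20, 15:25].mean(axis=(1, 2))
print(np.where(region > 1.0)[0])                 # -> the last 12 time indices
```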

  6. Quantifying the behavior of price dynamics at opening time in stock market

    NASA Astrophysics Data System (ADS)

    Ochiai, Tomoshiro; Takada, Hideyuki; Nacher, Jose C.

    2014-11-01

    The availability of huge volumes of financial data has offered the possibility of understanding the markets as a complex system characterized by several stylized facts. Here we first show that the time evolution of Japan's Nikkei stock average index (Nikkei 225) futures follows the resistance and breaking-acceleration effects when the complete time series data is analyzed. However, in stock markets there are periods where no regular trades occur between the close of the market on one day and the next day's open. To examine these time gaps we decompose the time series data into opening time and intermediate time. Our analysis indicates that for the intermediate time, both the resistance and the breaking-acceleration effects are still observed. However, for the opening time there are almost no resistance and breaking-acceleration effects, and volatility is constantly high. These findings highlight unique dynamic differences between stock markets and the forex market and suggest that current risk management strategies may need to be revised to address the absence of these dynamic effects at the opening time.

  7. Time Series Analysis for Spatial Node Selection in Environment Monitoring Sensor Networks

    PubMed Central

    Bhandari, Siddhartha; Jurdak, Raja; Kusy, Branislav

    2017-01-01

    Wireless sensor networks are widely used in environmental monitoring. The number of sensor nodes to be deployed will vary depending on the desired spatio-temporal resolution. Selecting an optimal number, position and sampling rate for an array of sensor nodes in environmental monitoring is a challenging question. Most of the current solutions are either theoretical or simulation-based where the problems are tackled using random field theory, computational geometry or computer simulations, limiting their specificity to a given sensor deployment. Using an empirical dataset from a mine rehabilitation monitoring sensor network, this work proposes a data-driven approach where co-integrated time series analysis is used to select the number of sensors from a short-term deployment of a larger set of potential node positions. Analyses conducted on temperature time series show 75% of sensors are co-integrated. Using only 25% of the original nodes can generate a complete dataset within a 0.5 °C average error bound. Our data-driven approach to sensor position selection is applicable for spatiotemporal monitoring of spatially correlated environmental parameters to minimize deployment cost without compromising data resolution. PMID:29271880
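
    A sketch of the co-integration screening idea with the Engle-Granger test from statsmodels, applied to two synthetic co-located sensor series (the data are invented, not the mine-site deployment):

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(3)
common = np.cumsum(rng.normal(size=500))           # shared random-walk trend
temp_a = common + rng.normal(scale=0.5, size=500)  # two co-located sensors
temp_b = 0.9 * common + rng.normal(scale=0.5, size=500)

score, pvalue, _ = coint(temp_a, temp_b)
# a small p-value suggests sensor B is redundant: its series can be
# reconstructed from sensor A plus a stable linear relationship
print(f"p = {pvalue:.4f} -> {'co-integrated' if pvalue < 0.05 else 'keep both'}")
```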

  8. Functional mixed effects spectral analysis

    PubMed Central

    KRAFTY, ROBERT T.; HALL, MARTICA; GUO, WENSHENG

    2011-01-01

    In many experiments, time series data can be collected from multiple units and multiple time series segments can be collected from the same unit. This article introduces a mixed effects Cramér spectral representation which can be used to model the effects of design covariates on the second-order power spectrum while accounting for potential correlations among the time series segments collected from the same unit. The transfer function is composed of a deterministic component to account for the population-average effects and a random component to account for the unit-specific deviations. The resulting log-spectrum has a functional mixed effects representation where both the fixed effects and random effects are functions in the frequency domain. It is shown that, when the replicate-specific spectra are smooth, the log-periodograms converge to a functional mixed effects model. A data-driven iterative estimation procedure is offered for the periodic smoothing spline estimation of the fixed effects, penalized estimation of the functional covariance of the random effects, and unit-specific random effects prediction via the best linear unbiased predictor. PMID:26855437

  9. Testing for the Presence of Correlation Changes in a Multivariate Time Series: A Permutation Based Approach.

    PubMed

    Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Hunyadi, Borbála; Ceulemans, Eva

    2018-01-15

    Detecting abrupt correlation changes in multivariate time series is crucial in many application fields such as signal processing, functional neuroimaging, climate studies, and financial analysis. To detect such changes, several promising correlation change tests exist, but they may suffer from severe loss of power when there is actually more than one change point underlying the data. To deal with this drawback, we propose a permutation based significance test for Kernel Change Point (KCP) detection on the running correlations. Given a requested number of change points K, KCP divides the time series into K + 1 phases by minimizing the within-phase variance. The new permutation test looks at how the average within-phase variance decreases when K increases and compares this to the results for permuted data. The results of an extensive simulation study and applications to several real data sets show that, depending on the setting, the new test performs either at par with or better than the state-of-the-art significance tests for detecting the presence of correlation changes, implying that its use can be generally recommended.

  10. Power estimation using simulations for air pollution time-series studies

    PubMed Central

    2012-01-01

    Background: Estimation of power to assess associations of interest can be challenging for time-series studies of the acute health effects of air pollution because there are two dimensions of sample size (time-series length and daily outcome counts), and because these studies often use generalized linear models to control for complex patterns of covariation between pollutants and time trends, meteorology and possibly other pollutants. In general, statistical software packages for power estimation rely on simplifying assumptions that may not adequately capture this complexity. Here we examine the impact of various factors affecting power using simulations, with comparison of power estimates obtained from simulations with those obtained using statistical software. Methods: Power was estimated for various analyses within a time-series study of air pollution and emergency department visits using simulations for specified scenarios. Mean daily emergency department visit counts, model parameter value estimates and daily values for air pollution and meteorological variables from actual data (8/1/98 to 7/31/99 in Atlanta) were used to generate simulated daily outcome counts with specified temporal associations with air pollutants and randomly generated error based on a Poisson distribution. Power was estimated by conducting analyses of the association between simulated daily outcome counts and air pollution in 2000 data sets for each scenario. Power estimates from simulations and statistical software (G*Power and PASS) were compared. Results: In the simulation results, increasing time-series length and average daily outcome counts both increased power to a similar extent. Our results also illustrate the low power that can result from using outcomes with low daily counts or short time series, and the reduction in power that can accompany use of multipollutant models. Power estimates obtained using standard statistical software were very similar to those from the simulations when properly implemented; implementation, however, was not straightforward. Conclusions: These analyses demonstrate the similar impact on power of increasing time-series length versus increasing daily outcome counts, which has not previously been reported. Implementation of power software for these studies is discussed and guidance is provided. PMID:22995599
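
    The simulation approach can be pared down to a few lines; the sketch below estimates power for a single-pollutant Poisson regression, without the covariate adjustment used in the study, and all parameter values are illustrative.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)

def simulated_power(n_days=365, mean_count=20, beta=0.02, n_sims=200):
    """Share of simulated datasets in which the pollutant effect is detected
    (two-sided p < 0.05) by a Poisson regression."""
    pollutant = rng.normal(10, 3, size=n_days)  # fixed exposure series
    X = sm.add_constant(pollutant)
    hits = 0
    for _ in range(n_sims):
        lam = mean_count * np.exp(beta * (pollutant - pollutant.mean()))
        y = rng.poisson(lam)  # outcome counts with the specified association
        fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        hits += fit.pvalues[1] < 0.05
    return hits / n_sims

print(simulated_power())
```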

  11. Power estimation using simulations for air pollution time-series studies.

    PubMed

    Winquist, Andrea; Klein, Mitchel; Tolbert, Paige; Sarnat, Stefanie Ebelt

    2012-09-20

    Estimation of power to assess associations of interest can be challenging for time-series studies of the acute health effects of air pollution because there are two dimensions of sample size (time-series length and daily outcome counts), and because these studies often use generalized linear models to control for complex patterns of covariation between pollutants and time trends, meteorology and possibly other pollutants. In general, statistical software packages for power estimation rely on simplifying assumptions that may not adequately capture this complexity. Here we examine the impact of various factors affecting power using simulations, with comparison of power estimates obtained from simulations with those obtained using statistical software. Power was estimated for various analyses within a time-series study of air pollution and emergency department visits using simulations for specified scenarios. Mean daily emergency department visit counts, model parameter value estimates and daily values for air pollution and meteorological variables from actual data (8/1/98 to 7/31/99 in Atlanta) were used to generate simulated daily outcome counts with specified temporal associations with air pollutants and randomly generated error based on a Poisson distribution. Power was estimated by conducting analyses of the association between simulated daily outcome counts and air pollution in 2000 data sets for each scenario. Power estimates from simulations and statistical software (G*Power and PASS) were compared. In the simulation results, increasing time-series length and average daily outcome counts both increased power to a similar extent. Our results also illustrate the low power that can result from using outcomes with low daily counts or short time series, and the reduction in power that can accompany use of multipollutant models. Power estimates obtained using standard statistical software were very similar to those from the simulations when properly implemented; implementation, however, was not straightforward. These analyses demonstrate the similar impact on power of increasing time-series length versus increasing daily outcome counts, which has not previously been reported. Implementation of power software for these studies is discussed and guidance is provided.

  12. Application of Time-series Model to Predict Groundwater Quality Parameters for Agriculture: (Plain Mehran Case Study)

    NASA Astrophysics Data System (ADS)

    Mehrdad Mirsanjari, Mir; Mohammadyari, Fatemeh

    2018-03-01

    Groundwater is regarded as a considerable water source, particularly in arid and semi-arid regions with deficient surface water sources. Forecasting of hydrological variables is a suitable tool in water resources management, and time series concepts are efficient means in the forecasting process. In this study, data on qualitative parameters (electrical conductivity and sodium adsorption ratio) from 17 groundwater wells in the Mehran Plain were used to model the trend of parameter changes over time. Using the selected models, the qualitative parameters of groundwater are predicted for the next seven years. Data from 2003 to 2016 were collected and fitted with AR, MA, ARMA, ARIMA and SARIMA models. Afterwards, the best model was determined using the Akaike information criterion (AIC) and the correlation coefficient. After modeling the parameters, maps of agricultural land use in 2016 and 2023 were generated and the changes between these years were studied. Based on the results, the predicted average SAR (Sodium Adsorption Ratio) in all wells will increase by 2023 compared to 2016. The average EC (Electrical Conductivity) will increase in the ninth and fifteenth wells and decrease in the other wells. The results indicate that the quality of groundwater for agriculture in the Mehran Plain will decline over the next seven years.

  13. Analysis of Supergranule Sizes and Velocities Using Solar Dynamics Observatory (SDO)/Helioseismic Magnetic Imager (HMI) and Solar and Heliospheric Observatory (SOHO)/Michelson Doppler Imager (MDI) Dopplergrams

    NASA Technical Reports Server (NTRS)

    Williams, Peter E.; Pesnell, W. Dean; Beck, John G.; Lee, Shannon

    2013-01-01

    Co-temporal Doppler images from Solar and Heliospheric Observatory (SOHO)/ Michelson Doppler Imager (MDI) and Solar Dynamics Observatory (SDO)/Helioseismic Magnetic Imager (HMI) have been analyzed to extract quantitative information about global properties of the spatial and temporal characteristics of solar supergranulation. Preliminary comparisons show that supergranules appear to be smaller and have stronger horizontal velocity flows within HMI data than was measured with MDI. There appears to be no difference in their evolutionary timescales. Supergranule sizes and velocities were analyzed over a ten-day time period at a 15-minute cadence. While the averages of the time-series retain the aforementioned differences, fluctuations of these parameters first observed in MDI data were seen in both MDI and HMI time-series, exhibiting a strong cross-correlation. This verifies that these fluctuations are not instrumental, but are solar in origin. The observed discrepancies between the averaged values from the two sets of data are a consequence of instrument resolution. The lower spatial resolution of MDI results in larger observed structures with lower velocities than is seen in HMI. While these results offer a further constraint on the physical nature of supergranules, they also provide a level of calibration between the two instruments.

  14. Seasonal flux and assemblage composition of planktic foraminifera from the northern Gulf of Mexico, 2008–14

    USGS Publications Warehouse

    Reynolds, Caitlin E.; Richey, Julie N.

    2016-07-28

    The U.S. Geological Survey anchored a sediment trap in the northern Gulf of Mexico in January 2008 to collect seasonal time-series data on the flux and assemblage composition of live planktic foraminifers. This report provides an update of the previous time-series data to include continuous results from January 2013 through May 2014. Ten taxa constituted ~95 percent of both the 2013 and 2014 assemblages: Globigerinoides ruber (pink and white varieties), Globigerinoides sacculifer, Globigerina calida, Globigerinella aequilateralis, Globorotalia menardii group [The Gt. menardii group includes Gt. menardii, Gt. tumida, and Gt. ungulata], Orbulina universa, Globorotalia truncatulinoides, Pulleniatina spp., and Neogloboquadrina dutertrei. In 2013, the mean daily flux was 177 tests per square meter per day (m−2 day−1), with maximum fluxes of >1,200 tests m−2 day−1 during the middle of February and minimum fluxes of <13 tests m−2 day−1 during the beginning of November. In 2014, the mean daily flux was 189 tests m−2 day−1, with maximum fluxes of >900 tests m−2 day−1 at the end of January and minimum fluxes of <30 tests m−2 day−1 at the beginning of January. Globorotalia truncatulinoides showed a clear preference for the winter, consistent with data from 2008 to 2012. Globigerinoides ruber (white) flux data for 2012 (average 23 tests m−2 day−1) were consistent with data from 2011 (average 30 tests m−2 day−1) and 2010 (average 29 tests m−2 day−1) and showed a steady threefold increase since 2009 (average 11 tests m−2 day−1) and a tenfold increase from the 2008 flux (3 tests m−2 day−1). The flux data from 2013 (average 15 tests m−2 day−1) and 2014 (average 8 tests m−2 day−1) showed decline from the previous 3 years.

  15. Electrical Evaluation of RCA MWS5001D Random Access Memory, Volume 5, Appendix D

    NASA Technical Reports Server (NTRS)

    Klute, A.

    1979-01-01

    The electrical characterization and qualification test results are presented for the RCA MWS 5001D random access memory. The tests included functional tests, AC and DC parametric tests, AC parametric worst-case pattern selection test, determination of worst-case transition for setup and hold times, and a series of schmoo plots. Average input high current, worst case input high current, output low current, and data setup time are some of the results presented.

  16. Quantifying Uncertainty from Computational Factors in Simulations of a Model Ballistic System

    DTIC Science & Technology

    2017-08-01

    Comparison of runs 6–9 with the corresponding simulations from the stop time study (Tables 22 and 23) shows that the restart series produces…

  17. Modeling Slip System Strength Evolution in Ti 7Al Informed by In situ Grain Stress Measurements (Postprint)

    DTIC Science & Technology

    2017-02-17

    The time for the tomography and diffraction sweeps was approximately 42 min. In a typical quasi-static in-situ experiment, loading is halted and the… data is used to extract individual grain-average stress tensors in a large aggregate of Ti-7Al grains (≈500) over a time series of prescribed states…

  18. Cartesian-Grid Simulations of a Canard-Controlled Missile with a Free-Spinning Tail

    NASA Technical Reports Server (NTRS)

    Murman, Scott M.; Aftosmis, Michael J.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    The proposed paper presents a series of simulations of a geometrically complex, canard-controlled, supersonic missile with free-spinning tail fins. Time-dependent simulations were performed using an inviscid Cartesian-grid-based method with results compared to both experimental data and high-resolution Navier-Stokes computations. At fixed free stream conditions and canard deflections, the tail spin rate was iteratively determined such that the net rolling moment on the empennage is zero. This rate corresponds to the time-asymptotic rate of the free-to-spin fin system. After obtaining spin-averaged aerodynamic coefficients for the missile, the investigation seeks a fixed-tail approximation to the spin-averaged aerodynamic coefficients, and examines the validity of this approximation over a variety of freestream conditions.

  19. D-region differential-phase measurements and ionization variability studies

    NASA Technical Reports Server (NTRS)

    Weiland, R. M.; Bowhill, S. A.

    1978-01-01

    Measurements of electron densities in the D region are made by the partial-reflection differential-absorption and differential-phase techniques. The differential-phase data are obtained by a hard-wired phase-measuring system. Electron-density profiles obtained by the two techniques on six occasions are plotted and compared. Electron-density profiles obtained at the same time on 30 occasions during the years 1975 through 1977 are averaged to form a single profile for each technique. The effect of varying the assumed collision-frequency profile on these averaged profiles is studied. Time series of D-region electron-density data obtained at 3.4-minute intervals on six days during the summer of 1977 are examined for wave-like disturbances and tidal oscillations.

  20. Anesthesia for left ventricular assist device insertion: a case series and review.

    PubMed

    Broussard, David; Donaldson, Emilie; Falterman, Jason; Bates, Michael

    2011-01-01

    From October 2008 to June 2010, a total of 42 patients had the HeartMate II left ventricular assist device inserted surgically at Ochsner Medical Center in New Orleans, LA. A retrospective electronic record review was conducted on this series of patients to analyze elements of perioperative anesthetic care, including general anesthetic care, echocardiographic considerations, and blood product usage. Etomidate was used to induce anesthesia for 34 of 42 patients (81%) in this series, with an average dose of 16.5 mg (±6 mg). The average intraoperative fentanyl dose was 1,318 µg (±631 µg). On average, patients were extubated 91 hours (±72 hours) after arrival in the intensive care unit and left the unit on day 9 (±5 days). The average left ventricular ejection fraction of the patients in this series was 13% (±5%). Sixteen patients were evaluated as having severe right-heart dysfunction preoperatively. Two of 42 patients required surgical closure of echocardiographically identified patent foramen ovale. Twelve of 42 patients underwent surgical correction of tricuspid regurgitation. On average, 3 units (±2.6 units) of fresh frozen plasma were transfused intraoperatively and 10 units postoperatively. Intraoperative red blood cell usage averaged 1.1 units (maximum, 7 units), with an average 9.3 units administered in the first 48 hours postoperatively.

  1. Comparison of seasonal variability of Aquarius sea surface salinity time series with in situ observations in the Karimata Strait, Indonesia

    NASA Astrophysics Data System (ADS)

    Susanto, R. D.; Setiawan, A.; Zheng, Q.; Sulistyo, B.; Adi, T. R.; Agustiadi, T.; Trenggono, M.; Triyono, T.; Kuswardani, A.

    2016-12-01

    The seasonal variability of the full-lifetime Aquarius sea surface salinity time series, from August 25, 2011 to June 7, 2015, is compared to salinity time series obtained from in situ observations in the Karimata Strait. The Karimata Strait plays dual roles in water exchange between the Pacific and the Indian Ocean. The salinity in the Karimata Strait is strongly affected by seasonal monsoon winds. During the boreal winter monsoon, northwesterly winds draw low salinity water from the South China Sea into the Java Sea, and at the same time, the Java Sea receives an influx of Indian Ocean water via the Sunda Strait. The Java Sea water reduces the main Indonesian throughflow in the Makassar Strait. Conditions are reversed during the summer monsoon. Low salinity water from the South China Sea also controls the vertical structure of water properties in the upper layer of the Makassar Strait and the Lombok Strait. As a part of the South China Sea and Indonesian Seas Transport/Exchange (SITE) program, a trawl-resistant bottom-mounted CTD was deployed in the Karimata Strait from mid-2010 to mid-2016 at a water depth of 40 m. CTD casts during the mooring recoveries and deployments are used for comparison with the bottom salinity data. This in situ salinity time series is compared with various Aquarius NASA salinity products (level 2, level 3 ascending and descending tracks, and the seven-day rolling average) to check their consistency and correlation through statistical analysis. The preliminary results show that the seasonal variability of the Aquarius salinity time series has larger amplitude than that of the in situ data.

  2. Crustal displacements due to continental water loading

    USGS Publications Warehouse

    Van Dam, T.; Wahr, J.; Milly, P.C.D.; Shmakin, A.B.; Blewitt, G.; Lavallee, D.; Larson, K.M.

    2001-01-01

    The effects of long-wavelength (> 100 km), seasonal variability in continental water storage on vertical crustal motions are assessed. The modeled vertical displacements (ΔrM) have root-mean-square (RMS) values for 1994-1998 as large as 8 mm, with ranges up to 30 mm, and are predominantly annual in character. Regional strains are on the order of 20 nanostrain for tilt and 5 nanostrain for horizontal deformation. We compare ΔrM with observed Global Positioning System (GPS) heights (ΔrO) (which include adjustments to remove estimated effects of atmospheric pressure and annual tidal and non-tidal ocean loading) for 147 globally distributed sites. When the ΔrO time series are adjusted by ΔrM, their variances are reduced, on average, by an amount equal to the variance of ΔrM. Of the ΔrO time series exhibiting a strong annual signal, more than half are found to have an annual harmonic that is in phase and of comparable amplitude with the annual harmonic in ΔrM. The ΔrM time series exhibit long-period variations that could be mistaken for secular tectonic trends or post-glacial rebound when observed over a time span of a few years.

  3. Modeling Polio Data Using the First Order Non-Negative Integer-Valued Autoregressive, INAR(1), Model

    NASA Astrophysics Data System (ADS)

    Vazifedan, Turaj; Shitan, Mahendran

    Time series data may consist of counts, such as the number of road accidents, the number of patients in a certain hospital, or the number of customers waiting for service at a certain time. When the values of the observations are large, it is usual to use a Gaussian Autoregressive Moving Average (ARMA) process to model the time series. However, if the observed counts are small, it is not appropriate to use an ARMA process to model the observed phenomenon. In such cases we need to model the time series data using a Non-Negative Integer-valued Autoregressive (INAR) process. The modeling of count data is based on the binomial thinning operator. In this paper we illustrate the modeling of count data using the monthly number of Poliomyelitis cases in the United States from January 1970 to December 1983. We applied the AR(1), Poisson regression, and INAR(1) models, and the suitability of these models was assessed using the Index of Agreement (I.A.). We found that the INAR(1) model is more appropriate in the sense that it had a better I.A., and it is natural since the data are counts.
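
    To make the binomial thinning operator concrete, the sketch below (Python, with hypothetical parameter values) simulates an INAR(1) process X_t = α ∘ X_{t-1} + ε_t, where α ∘ X_{t-1} is a binomial draw keeping each of the previous counts with probability α and ε_t are Poisson innovations; it illustrates the model class, not the paper's code.

        import numpy as np

        def simulate_inar1(alpha, lam, n, seed=0):
            # X_t = alpha o X_{t-1} + e_t, with binomial thinning 'o'
            # and Poisson(lam) innovations e_t
            rng = np.random.default_rng(seed)
            x = np.zeros(n, dtype=int)
            x[0] = rng.poisson(lam)
            for t in range(1, n):
                survivors = rng.binomial(x[t - 1], alpha)  # thinning step
                x[t] = survivors + rng.poisson(lam)        # new arrivals
            return x

        counts = simulate_inar1(alpha=0.5, lam=2.0, n=168)  # 14 years, monthly
        print(counts[:12])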

  4. Land, carbon and water footprints in Taiwan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Yung-Jaan, E-mail: yungjaanlee@gmail.com

    The consumer responsibility approach uses footprints as indicators of the total direct and indirect effects of a product or consumption activity. This study used a time-series analysis of three environmental pressures to quantify the total environmental pressures caused by consumption in Taiwan: land footprint, carbon footprint, and water footprint. Land footprint is the pressure from appropriation of biologically productive land and water area. Carbon footprint is the pressure from greenhouse gas emissions. Water footprint is the pressure from freshwater consumption. The conventional carbon footprint is the total CO₂ emitted by a certain activity or the CO₂ accumulated during a product life cycle. This definition cannot be used to convert CO₂ emissions into land units. This study responds to the need for "CO₂ land" in the footprint family by applying the carbon footprint concept used by GFN. The analytical results showed that, in terms of land footprint, consumption by the average Taiwan citizen required appropriation of 5.39 gha (hectares of land with global-average biological productivity) in 2000 and 3.63 gha in 2011. The average Taiwan citizen had a carbon footprint of 3.95 gha in 2000 and 5.94 gha in 2011. These results indicate that separately analyzing the land and carbon footprints enables their trends to be compared and appropriate policies and strategies for different sectors to be proposed accordingly. The average Taiwan citizen had a blue water footprint of 801 m³ in 2000 and 784 m³ in 2011. By comparison, the respective global averages in 2011 were 1.23 gha, 2.36 gha, and 163 m³ of blue water. Overall, Taiwan revealed higher environmental pressures than the rest of the world, demonstrating that Taiwan has become a high-footprint state and has appropriated environmental resources from other countries. That is, through its imports of products with embodied pressures and its exports, Taiwan has transferred the environmental pressures from consuming goods and services to other parts of the world, which is an environmental injustice. This study examines the time-series trend of land, carbon, and water footprints in Taiwan. If these analyses can be downscaled to city/county levels, they will be more useful for examining the different sustainability performance of local governments in different regions. - Highlights: • This study used a time-series analysis of three environmental pressures to quantify the total environmental pressures caused by consumption in Taiwan: land footprint, carbon footprint and water footprint. • The average Taiwan citizen had a land footprint of 5.39 gha in 2000 and 3.63 gha in 2011. • The average Taiwan citizen had a carbon footprint of 3.95 gha in 2000 and 5.94 gha in 2011. • The average Taiwan citizen had a blue water footprint of 801 m³ in 2000 and 784 m³ in 2011. • By comparison, the respective global averages in 2011 were 1.23 gha, 2.36 gha and 163 m³ of blue water. Taiwan revealed higher environmental pressures compared to the rest of the world.

  5. Extracting the regional common-mode component of GPS station position time series from dense continuous network

    NASA Astrophysics Data System (ADS)

    Tian, Yunfeng; Shen, Zheng-Kang

    2016-02-01

    We develop a spatial filtering method to remove random noise and extract the spatially correlated transients (i.e., the common-mode component (CMC)) that deviate from zero mean over the span of detrended position time series of a continuous Global Positioning System (CGPS) network. The technique utilizes a weighting scheme that incorporates two factors—distances between neighboring sites and their correlations of long-term residual position time series. We use a grid search algorithm to find the optimal thresholds for deriving the CMC that minimizes the root-mean-square (RMS) of the filtered residual position time series. Compared with the principal component analysis technique, our method achieves better (>13% on average) reduction of residual position scatters for the CGPS stations in western North America, eliminating regional transients of all spatial scales. It also has advantages in data manipulation: it requires less intervention and is applicable to a dense network of any spatial extent. Our method can also be used to detect the CMC irrespective of its origins (i.e., tectonic or nontectonic), if such signals are of particular interest for further study. By varying the filtering distance range, the long-range CMC related to atmospheric disturbance can be filtered out, uncovering CMC associated with transient tectonic deformation. A correlation-based clustering algorithm is adopted to identify station clusters that share common regional transient characteristics.
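
    A minimal sketch of such a weighted stacking filter is given below (Python; the distance scale d0 and correlation cutoff are hypothetical stand-ins for the thresholds the authors obtain by grid search).

        import numpy as np

        def common_mode(residuals, dists, corrs, d0=400.0, corr_min=0.4):
            # residuals: (n_stations, n_epochs) detrended position residuals
            # dists: distances (km) of each station from the target station
            # corrs: correlation of each residual series with the target's
            w = np.exp(-dists / d0) * corrs   # distance and correlation weights
            w[corrs < corr_min] = 0.0         # drop weakly correlated neighbors
            if w.sum() == 0.0:
                return np.zeros(residuals.shape[1])
            return (w[:, None] * residuals).sum(axis=0) / w.sum()

    Subtracting the returned series from the target station's residuals yields the filtered time series; shrinking d0 restricts the filter to short-range CMC, in the spirit of the variable filtering distance range described above.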

  6. Reliability Prediction of Ontology-Based Service Compositions Using Petri Net and Time Series Models

    PubMed Central

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets the quantitative quality requirement. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the Non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into a NMSPN model; employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy. PMID:24688429
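
    The ARMA fitting and forecasting step can be sketched with statsmodels as below; the response-time history is synthetic, and mapping predicted mean response times to firing rates by taking reciprocals is an assumption about how the rates enter the NMSPN, not the paper's stated mapping.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(1)
        # hypothetical history of one component's response times (ms)
        history = 120 + rng.normal(0, 5, 200).cumsum() * 0.05 + rng.normal(0, 5, 200)

        fit = ARIMA(history, order=(2, 0, 1)).fit()   # ARMA(2,1): d = 0
        mean_rt = fit.forecast(steps=10)              # predicted response times
        firing_rates = 1.0 / mean_rt                  # rates fed to the NMSPN
        print(firing_rates)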

  7. NEW SUNS IN THE COSMOS. III. MULTIFRACTAL SIGNATURE ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freitas, D. B. de; Nepomuceno, M. M. F.; Junior, P. R. V. de Moraes

    2016-11-01

    In the present paper, we investigate the multifractality signatures in hourly time series extracted from the CoRoT spacecraft database. Our analysis is intended to highlight the possibility that astrophysical time series can be members of a particular class of complex and dynamic processes, which require several photometric variability diagnostics to characterize their structural and topological properties. To achieve this goal, we search for contributions due to a nonlinear temporal correlation and effects caused by heavier tails than the Gaussian distribution, using a detrending moving average algorithm for one-dimensional multifractal signals (MFDMA). We observe that the correlation structure is the main source of multifractality, while the heavy-tailed distribution plays a minor role in generating the multifractal effects. Our work also reveals that the rotation period of stars is inherently scaled by the degree of multifractality. As a result, analyzing the multifractal degree of the referred series, we uncover an evolution of multifractality from shorter to larger periods.
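
    The core of a detrending-moving-average analysis can be sketched as follows (Python): the series is integrated into a profile, a moving average is subtracted, and the q-th order fluctuation function F_q(n) is regressed against the window size n to give a generalized Hurst exponent h(q). This is a single-q, backward-MA variant under assumed parameters, not the authors' full MFDMA implementation.

        import numpy as np

        def dma_fluctuation(x, windows, q=2.0):
            # q-th order fluctuation of the backward detrending moving average
            y = np.cumsum(x - np.mean(x))          # profile of the series
            F = []
            for n in windows:
                ma = np.convolve(y, np.ones(n) / n, mode="valid")
                resid = y[n - 1:] - ma             # profile minus moving average
                F.append(np.mean(np.abs(resid) ** q) ** (1.0 / q))
            return np.array(F)

        x = np.random.default_rng(2).standard_normal(4096)
        ns = np.array([8, 16, 32, 64, 128, 256])
        h = np.polyfit(np.log(ns), np.log(dma_fluctuation(x, ns)), 1)[0]
        print(h)   # ~0.5 for uncorrelated noise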

  8. Scaling analysis and model estimation of solar corona index

    NASA Astrophysics Data System (ADS)

    Ray, Samujjwal; Ray, Rajdeep; Khondekar, Mofazzal Hossain; Ghosh, Koushik

    2018-04-01

    A monthly average solar green coronal index time series for the period from January 1939 to December 2008, collected from NOAA (the National Oceanic and Atmospheric Administration), is analysed in this paper from the perspective of scaling analysis and modelling. Smoothing and de-noising were done using a suitable mother wavelet as a prerequisite. The Finite Variance Scaling Method (FVSM), the Higuchi method, rescaled range (R/S) analysis, and a generalized method have been applied to calculate the scaling exponents and fractal dimensions of the time series. The autocorrelation function (ACF) is used to identify an autoregressive (AR) process, and the partial autocorrelation function (PACF) has been used to determine the order of the AR model. Finally, a best-fit model has been proposed using the Yule-Walker method, with supporting results on goodness of fit and the wavelet spectrum. The results reveal an anti-persistent, Short Range Dependent (SRD), self-similar property with signatures of non-causality, non-stationarity and nonlinearity in the data series. The model shows the best fit to the data under observation.
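
    The order-selection and fitting steps translate directly into statsmodels calls, as in the sketch below (the synthetic AR(2) series is a hypothetical stand-in for the de-noised coronal index): the PACF cutting off after lag p suggests an AR(p), whose coefficients are then estimated from the Yule-Walker equations.

        import numpy as np
        from statsmodels.regression.linear_model import yule_walker
        from statsmodels.tsa.stattools import pacf

        rng = np.random.default_rng(3)
        x = np.zeros(840)                     # 70 years of monthly values
        for t in range(2, 840):
            x[t] = 1.2 * x[t - 1] - 0.4 * x[t - 2] + rng.normal()

        print(np.round(pacf(x, nlags=5), 2))  # cuts off after lag 2 -> AR(2)
        rho, sigma = yule_walker(x, order=2)  # Yule-Walker estimates
        print(rho, sigma)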

  9. Computational problems in autoregressive moving average (ARMA) models

    NASA Technical Reports Server (NTRS)

    Agarwal, G. C.; Goodarzi, S. M.; Oneill, W. D.; Gottlieb, G. L.

    1981-01-01

    The choice of the sampling interval and the selection of the order of the model in time series analysis are considered. Band limited (up to 15 Hz) random torque perturbations are applied to the human ankle joint. The applied torque input, the angular rotation output, and the electromyographic activity using surface electrodes from the extensor and flexor muscles of the ankle joint are recorded. Autoregressive moving average models are developed. A parameter constraining technique is applied to develop more reliable models. The asymptotic behavior of the system must be taken into account during parameter optimization to develop predictive models.

  10. PERIODIC AUTOREGRESSIVE-MOVING AVERAGE (PARMA) MODELING WITH APPLICATIONS TO WATER RESOURCES.

    USGS Publications Warehouse

    Vecchia, A.V.

    1985-01-01

    Results involving correlation properties and parameter estimation for autoregressive-moving-average models with periodic parameters are presented. A multivariate representation of the PARMA model is used to derive parameter space restrictions and difference equations for the periodic autocorrelations. Close approximation to the likelihood function for Gaussian PARMA processes results in efficient maximum-likelihood estimation procedures. Terms in the Fourier expansion of the parameters are sequentially included, and a selection criterion is given for determining the optimal number of harmonics to be included. Application of the techniques is demonstrated through analysis of a monthly streamflow time series.

  11. Statistical analysis of low level atmospheric turbulence

    NASA Technical Reports Server (NTRS)

    Tieleman, H. W.; Chen, W. W. L.

    1974-01-01

    The statistical properties of low-level wind-turbulence data were obtained with the model 1080 total vector anemometer and the model 1296 dual split-film anemometer, both manufactured by Thermo Systems Incorporated. The data obtained from the above fast-response probes were compared with the results obtained from a pair of Gill propeller anemometers. The digitized time series representing the three velocity components and the temperature were each divided into a number of blocks, the length of which depended on the lowest frequency of interest and also on the storage capacity of the available computer. A moving-average and differencing high-pass filter was used to remove the trend and the low frequency components in the time series. The calculated results for each of the anemometers used are represented in graphical or tabulated form.
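
    A moving-average high-pass filter of this kind amounts to subtracting a centred running mean from each series, as in this sketch (Python; the window length is a hypothetical choice and the differencing stage mentioned above is omitted):

        import numpy as np

        def ma_highpass(x, window):
            # subtract a centred moving average to remove the trend and
            # low-frequency components (window assumed odd)
            padded = np.pad(x, window // 2, mode="edge")
            trend = np.convolve(padded, np.ones(window) / window, mode="valid")
            return x - trend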

  12. Fractal Analysis of Air Pollutant Concentrations

    NASA Astrophysics Data System (ADS)

    Cortina-Januchs, M. G.; Barrón-Adame, J. M.; Vega-Corona, A.; Andina, D.

    2010-05-01

    Air pollution poses significant threats to human health and the environment throughout the developed and developing countries. This work focuses on fractal analysis of pollutant concentrations in Salamanca, Mexico. The city of Salamanca has been catalogued as one of the most polluted cities in Mexico. The main causes of pollution in this city are fixed emission sources, such as the chemical industry and electricity generation. Sulphur dioxide (SO2) and particulate matter less than 10 micrometers in diameter (PM10) are the most important pollutants in this region. Air pollutant concentrations were investigated by applying the box-counting method to time series obtained from the Automatic Environmental Monitoring Network (AEMN). One year of hourly average concentration time series was analyzed in order to characterize the temporal structures of SO2 and PM10.
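
    A bare-bones box-counting estimate for a concentration series rescaled to the unit square might look as follows (Python; the grid sizes and test series are illustrative): count the boxes of side ε touched by the graph and regress log N(ε) on log(1/ε).

        import numpy as np

        def box_counting_dimension(series, eps_list):
            t = np.linspace(0.0, 1.0, len(series))
            y = (series - series.min()) / (np.ptp(series) + 1e-12)
            counts = []
            for eps in eps_list:
                # boxes of side eps touched by the sampled graph
                boxes = set(zip((t / eps).astype(int), (y / eps).astype(int)))
                counts.append(len(boxes))
            slope, _ = np.polyfit(np.log(1.0 / np.asarray(eps_list)),
                                  np.log(counts), 1)
            return slope

        hourly = np.random.default_rng(4).standard_normal(8760).cumsum()
        print(box_counting_dimension(hourly, [1/8, 1/16, 1/32, 1/64]))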

  13. A Novel Analysis Of The Connection Between Indian Monsoon Rainfall And Solar Activity

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, S.; Narasimha, R.

    2005-12-01

    The existence of possible correlations between the solar cycle period, as extracted from the yearly means of sunspot numbers, and any periodicities that may be present in the Indian monsoon rainfall has been addressed using wavelet analysis. The wavelet transform coefficient maps of the sunspot-number time series and those of the homogeneous Indian monsoon rainfall annual time series data reveal striking similarities, especially around the 11-year period. A novel method to analyse and quantify this similarity by devising statistical schemes is suggested in this paper. The wavelet transform coefficient maxima at the 11-year period for the sunspot numbers and the monsoon rainfall have each been modelled as a point process in time, and a statistical scheme for identifying a trend or dependence between the two processes has been devised. A regression analysis of parameters in these processes reveals a nearly linear trend with small but systematic deviations from the regressed line. Suitable function models for these deviations have been obtained through an unconstrained error minimisation scheme. These models provide an excellent fit to the time series of the given wavelet transform coefficient maxima obtained from actual data. Statistical significance tests on these deviations suggest with 99% confidence that the deviations are sample fluctuations obtained from normal distributions. In fact, our earlier studies (see Bhattacharyya and Narasimha, 2005, Geophys. Res. Lett., Vol. 32, No. 5) revealed that average rainfall is higher during periods of greater solar activity for all cases, at confidence levels varying from 75% to 99%, being 95% or greater in 3 out of 7 of them. Analysis using standard wavelet techniques reveals higher power in the 8-16 y band during the higher solar activity period, in 6 of the 7 rainfall time series, at confidence levels exceeding 99.99%. Furthermore, a comparison between the wavelet cross spectra of solar activity with rainfall and with noise (including noise simulating the rainfall spectrum and probability distribution) revealed that over the two test periods, of high and low solar activity respectively, the average cross power of the solar activity index with rainfall exceeds that with the noise at z-test confidence levels exceeding 99.99% over period-bands covering the 11.6 y sunspot cycle (see Bhattacharyya and Narasimha, SORCE 2005, 14-16th September, Durango, Colorado, USA). These results provide strong evidence for connections between Indian rainfall and solar activity. The present study reveals in addition the presence of subharmonics of the solar cycle period in the monsoon rainfall time series, together with information on their phase relationships.

  14. Studies in astronomical time series analysis. IV - Modeling chaotic and random processes with linear filters

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.

    1990-01-01

    While chaos arises only in nonlinear systems, standard linear time series models are nevertheless useful for analyzing data from chaotic processes. This paper introduces such a model, the chaotic moving average. This time-domain model is based on the theorem that any chaotic process can be represented as the convolution of a linear filter with an uncorrelated process called the chaotic innovation. A technique, minimum phase-volume deconvolution, is introduced to estimate the filter and innovation. The algorithm measures the quality of a model using the volume covered by the phase-portrait of the innovation process. Experiments on synthetic data demonstrate that the algorithm accurately recovers the parameters of simple chaotic processes. Though tailored for chaos, the algorithm can detect both chaos and randomness, distinguish them from each other, and separate them if both are present. It can also recover nonminimum-delay pulse shapes in non-Gaussian processes, both random and chaotic.

  15. An effective chaos-geometric computational approach to analysis and prediction of evolutionary dynamics of the environmental systems: Atmospheric pollution dynamics

    NASA Astrophysics Data System (ADS)

    Buyadzhi, V. V.; Glushkov, A. V.; Khetselius, O. Yu; Bunyakova, Yu Ya; Florko, T. A.; Agayar, E. V.; Solyanikova, E. P.

    2017-10-01

    The present paper concerns the results of a computational study of the dynamics of atmospheric pollutant (nitrogen dioxide, sulphurous anhydride, etc.) concentrations in the atmosphere of industrial cities (Odessa), using the methods of dynamical systems and chaos theory. Chaotic behaviour in the nitrogen dioxide and sulphurous anhydride concentration time series at several sites of the Odessa city is numerically investigated. As usual, to reconstruct the corresponding attractor, the time delay and embedding dimension are needed. The former is determined by the methods of the autocorrelation function and average mutual information, and the latter is calculated by means of a correlation dimension method and the algorithm of false nearest neighbours. Further, the spectrum of Lyapunov exponents, the Kaplan-Yorke dimension and the Kolmogorov entropy are computed. A low-dimensional chaos has been found to exist in the time series of the atmospheric pollutant concentrations.
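
    Two of the reconstruction ingredients named above, the delay chosen at the first minimum of the average mutual information and the delay-coordinate embedding itself, can be sketched as follows (Python, histogram-based AMI; the false-nearest-neighbours step for the embedding dimension is omitted):

        import numpy as np

        def average_mutual_information(x, lag, bins=32):
            # histogram estimate of I(x(t); x(t+lag)); scan lags and take
            # the first minimum as the reconstruction delay
            pxy, _, _ = np.histogram2d(x[:-lag], x[lag:], bins=bins)
            pxy = pxy / pxy.sum()
            px, py = pxy.sum(axis=1), pxy.sum(axis=0)
            mask = pxy > 0
            return np.sum(pxy[mask] * np.log(pxy[mask] / np.outer(px, py)[mask]))

        def delay_embed(x, dim, tau):
            # vectors [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]
            n = len(x) - (dim - 1) * tau
            return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])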

  16. Relationship research between meteorological disasters and stock markets based on a multifractal detrending moving average algorithm

    NASA Astrophysics Data System (ADS)

    Li, Qingchen; Cao, Guangxi; Xu, Wei

    2018-01-01

    Based on a multifractal detrending moving average algorithm (MFDMA), this study uses the autoregressive fractionally integrated moving average (ARFIMA) process to demonstrate the effectiveness of MFDMA in the detection of auto-correlation at different sample lengths and to simulate some artificial time series with the same length as the actual sample interval. We analyze the effect of predictable and unpredictable meteorological disasters on the US and Chinese stock markets and the degree of long memory in different sectors. Furthermore, we conduct a preliminary investigation to determine whether the fluctuations of financial markets caused by meteorological disasters derive from the normal evolution of the financial system itself. We also propose several reasonable recommendations.

  17. Retrograde Intramedullary Nail With Femoral Head Allograft for Large Deficit Tibiotalocalcaneal Arthrodesis.

    PubMed

    Bussewitz, Bradly; DeVries, J George; Dujela, Michael; McAlister, Jeffrey E; Hyer, Christopher F; Berlet, Gregory C

    2014-07-01

    Large bone defects present a difficult task for surgeons when performing single-stage, complex combined hindfoot and ankle reconstruction. There are few data in a case series format in the literature evaluating the use of frozen femoral head allograft during tibiotalocalcaneal arthrodesis in various populations. The authors evaluated 25 patients from 2003 to 2011 who required a femoral head allograft and an intramedullary nail. The average time to the final follow-up visit was 83 ± 63.6 weeks (range, 10-265). Twelve patients (48%) healed the fusion. Twenty-one patients (84%) retained a braceable limb. Four patients (16%) required major amputation. This series may allow surgeons to more accurately predict the success and clinical outcome of these challenging cases. Level IV, case series. © The Author(s) 2014.

  18. The range and response of Neonicotinoids on hemlock woolly adelgid, Adelges tsugae (Hemiptera: Adelgidae)

    Treesearch

    Shimat V. Joseph; S. Kristine Braman; Jim Quick; James L. Hanula

    2011-01-01

    Hemlock woolly adelgid (HWA), Adelges tsugae Annand is a serious pest of eastern and Carolina hemlock in the eastern United States. A series of experiments compared commercially available and experimental insecticides, rates, application methods and timing for HWA control in Georgia and North Carolina. Safari 20 SG (dinotefuran) provided an average of 79 to 87%...

  19. Trend Damping: Under-Adjustment, Experimental Artifact, or Adaptation to Features of the Natural Environment?

    ERIC Educational Resources Information Center

    Harvey, Nigel; Reimers, Stian

    2013-01-01

    People's forecasts from time series underestimate future values for upward trends and overestimate them for downward ones. This trend damping may occur because (a) people anchor on the last data point and make insufficient adjustment to take the trend into account, (b) they adjust toward the average of the trends they have encountered within the…

  20. No Evidence of Suicide Increase Following Terrorist Attacks in the United States: An Interrupted Time-Series Analysis of September 11 and Oklahoma City

    ERIC Educational Resources Information Center

    Pridemore, William Alex; Trahan, Adam; Chamlin, Mitchell B.

    2009-01-01

    There is substantial evidence of detrimental psychological sequelae following disasters, including terrorist attacks. The effect of these events on extreme responses such as suicide, however, is unclear. We tested competing hypotheses about such effects by employing autoregressive integrated moving average techniques to model the impact of…

  1. Predicting Rehabilitation Success Rate Trends among Ethnic Minorities Served by State Vocational Rehabilitation Agencies: A National Time Series Forecast Model Demonstration Study

    ERIC Educational Resources Information Center

    Moore, Corey L.; Wang, Ningning; Washington, Janique Tynez

    2017-01-01

    Purpose: This study assessed and demonstrated the efficacy of two select empirical forecast models (i.e., autoregressive integrated moving average [ARIMA] model vs. grey model [GM]) in accurately predicting state vocational rehabilitation agency (SVRA) rehabilitation success rate trends across six different racial and ethnic population cohorts…

  2. 77 FR 3818 - Order Making Fiscal Year 2012 Annual Adjustments to Transaction Fee Rates

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-25

    ... Commission must project the aggregate dollar amount of covered sales of securities on the securities... series {Δ1, Δ2, ..., Δ120}. These are given by μ = 0.0087 and σ... + σ²/2), or on average ADS_t = 1.017 × ADS_{t-1}. 6. For December 2011, this gives a forecast ADS of...

  3. Hausdorff clustering

    NASA Astrophysics Data System (ADS)

    Basalto, Nicolas; Bellotti, Roberto; de Carlo, Francesco; Facchi, Paolo; Pantaleo, Ester; Pascazio, Saverio

    2008-10-01

    A clustering algorithm based on the Hausdorff distance is analyzed and compared to the single, complete, and average linkage algorithms. The four clustering procedures are applied to a toy example and to time series of financial data. The dendrograms are scrutinized and their features compared. The Hausdorff linkage relies on firm mathematical grounds and turns out to be very effective when one has to discriminate among complex structures.
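
    The flavour of the procedure can be reproduced with SciPy, as below: pairwise symmetric Hausdorff distances between series (viewed as planar curves) feed a hierarchical clustering. Note that this sketch applies average linkage on top of Hausdorff distances, whereas the paper defines a Hausdorff linkage between clusters themselves; the data are synthetic.

        import numpy as np
        from scipy.spatial.distance import directed_hausdorff, squareform
        from scipy.cluster.hierarchy import linkage

        rng = np.random.default_rng(5)
        curves = [np.column_stack([np.arange(100.0),
                                   rng.standard_normal(100).cumsum()])
                  for _ in range(8)]

        n = len(curves)
        D = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                # symmetric Hausdorff distance between two curves
                d = max(directed_hausdorff(curves[i], curves[j])[0],
                        directed_hausdorff(curves[j], curves[i])[0])
                D[i, j] = D[j, i] = d

        Z = linkage(squareform(D), method="average")  # dendrogram input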

  4. Time-series modeling and prediction of global monthly absolute temperature for environmental decision making

    NASA Astrophysics Data System (ADS)

    Ye, Liming; Yang, Guixia; Van Ranst, Eric; Tang, Huajun

    2013-03-01

    A generalized, structural, time series modeling framework was developed to analyze the monthly records of absolute surface temperature, one of the most important environmental parameters, using a deterministic-stochastic combined (DSC) approach. Although the development of the framework was based on the characterization of the variation patterns of a global dataset, the methodology could be applied to any monthly absolute temperature record. Deterministic processes were used to characterize the variation patterns of the global trend and the cyclic oscillations of the temperature signal, involving polynomial functions and the Fourier method, respectively, while stochastic processes were employed to account for any remaining patterns in the temperature signal, involving seasonal autoregressive integrated moving average (SARIMA) models. A prediction of the monthly global surface temperature during the second decade of the 21st century using the DSC model shows that the global temperature will likely continue to rise at twice the average rate of the past 150 years. The evaluation of prediction accuracy shows that DSC models perform systematically well against selected models of other authors, suggesting that DSC models, when coupled with other eco-environmental models, can be used as a supplemental tool for short-term (~10-year) environmental planning and decision making.
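
    The DSC idea, deterministic trend and cycle regressors plus a SARIMA remainder, can be sketched with statsmodels as below; the synthetic series, the polynomial degree and the model orders are hypothetical choices, not the authors' fitted specification.

        import numpy as np
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        rng = np.random.default_rng(6)
        t = np.arange(360.0)                 # 30 years of monthly values
        y = 0.002 * t + 2.0 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.3, 360)

        def regressors(t):
            # deterministic part: polynomial trend + annual Fourier pair
            return np.column_stack([t, t ** 2,
                                    np.sin(2 * np.pi * t / 12),
                                    np.cos(2 * np.pi * t / 12)])

        fit = SARIMAX(y, exog=regressors(t), order=(1, 0, 1),
                      seasonal_order=(1, 0, 0, 12)).fit(disp=False)
        future = np.arange(360.0, 480.0)     # the following decade
        pred = fit.forecast(steps=120, exog=regressors(future))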

  5. Reduction of the dimension of neural network models in problems of pattern recognition and forecasting

    NASA Astrophysics Data System (ADS)

    Nasertdinova, A. D.; Bochkarev, V. V.

    2017-11-01

    Deep neural networks with a large number of parameters are a powerful tool for solving problems of pattern recognition, prediction and classification. Nevertheless, overfitting remains a serious problem in the use of such networks. A method of solving the problem of overfitting is proposed in this article. This method is based on reducing the number of independent parameters of a neural network model using principal component analysis, and can be implemented using existing libraries of neural computing. The algorithm was tested on the problem of recognition of handwritten symbols from the MNIST database, as well as on the task of predicting time series (series of the average monthly number of sunspots and series of the Lorenz system were used). It is shown that applying principal component analysis enables the number of parameters of the neural network model to be reduced while preserving good results. The average error rate for the recognition of handwritten figures from the MNIST database was 1.12% (which is comparable to the results obtained using deep learning methods), while the number of parameters of the neural network can be reduced by a factor of up to 130.

  6. Rainfall-induced slope failures near Los Angeles detected by time series of high-resolution satellite imagery

    NASA Astrophysics Data System (ADS)

    McKinney, E.; Moon, S.

    2017-12-01

    Tectonically active, soil-mantled, and often fire-scorched landscapes of the Los Angeles region are susceptible to slope failures, such as mudflows and landslides, during high-intensity precipitation events. During 2016-2017, this area received a precipitation rate that was 90 mm higher than the long-term precipitation rates averaged over 30 years. These precipitation rates were 24% higher than the long-term averages and 245% higher than those over the 2011-2016 period of drought. In this study, we examined the occurrence of slope failures near Los Angeles in response to the high rainfall rates over 2016-2017. We composited time series of high-resolution Planetscope satellite images with resolutions of 3-4 m/pixel for 4 selected locations after reviewing a total area of 190,000 km2. We mapped the surface changes by comparing satellite images before and after the winter of 2016-2017. Preliminary analysis using spectral bands highlighted the surface changes made by mudflows, landslides, lake levels and land developments. We compared these changes across 2016-2017 with those over a period of recent drought (2011-2016) to assess the influence of high rainfall rates on slope failures.

  7. Wind data mining by Kohonen Neural Networks.

    PubMed

    Fayos, José; Fayos, Carolina

    2007-02-14

    Time series of Circulation Weather Type (CWT), including daily averaged wind direction and vorticity, are self-classified by similarity using Kohonen Neural Networks (KNN). It is shown that KNN is able to map by similarity all 7300 five-day CWT sequences during the period of 1975-94, in London, United Kingdom. It gives, as a first result, the most probable wind sequences preceding each one of the 27 CWT Lamb classes in that period. Inversely, as a second result, the observed diffuse correlation between five-day CWT sequences and the CWT of the 6th day, in the long 20-year period, can be generalized to predict the latter from the previous CWT sequence in a different test period, like 1995, as both time series are similar. Although the average prediction error is comparable to that obtained by standard forecasting methods, the KNN approach gives complementary results, as they depend only on an objective classification of observed CWT data, without any model assumption. The 27 CWT of the Lamb Catalogue were coded with binary three-dimensional vectors, pointing to faces, edges and vertices of a "wind-cube," so that similar CWT vectors were close.

  8. Incomplete Early Childhood Immunization Series and Missing Fourth DTaP Immunizations; Missed Opportunities or Missed Visits?

    PubMed

    Robison, Steve G

    2013-01-01

    The successful completion of early childhood immunizations is a proxy for overall quality of early care. Immunization status is usually assessed by up-to-date (UTD) rates covering combined series of different immunizations. However, series UTD rates often only bear on which single immunization is missing, rather than on the success of all immunizations. In the US, most series UTD rates are limited by missing fourth DTaP-containing immunizations (diphtheria/tetanus/pertussis) due at 15 to 18 months of age. Missing 4th DTaP immunizations are associated either with a lack of visits at 15 to 18 months of age, or with visits without immunizations. Typical immunization data, however, cannot distinguish between these two reasons. This study compared immunization records from the Oregon ALERT IIS with medical encounter records for two-year-olds in the Oregon Health Plan. Among those with 3 valid DTaPs by 9 months of age, 31.6% failed to receive a timely 4th DTaP; of those without a 4th DTaP, 42.1% did not have any provider visits from 15 through 18 months of age, while 57.9% had at least one provider visit. Those with a 4th DTaP averaged 2.45 encounters, while those with encounters but without a 4th DTaP averaged 2.23 encounters.

  9. Transmembrane protein CD93 diffuses by a continuous time random walk.

    NASA Astrophysics Data System (ADS)

    Goiko, Maria; de Bruyn, John; Heit, Bryan

    Molecular motion within the cell membrane is a poorly-defined process. In this study, we characterized the diffusion of the transmembrane protein CD93. By careful analysis of the dependence of the ensemble-averaged mean squared displacement (EA-MSD, ⟨r²⟩) on time t and of the ensemble-averaged, time-averaged MSD (EA-TAMSD, ⟨δ²⟩) on lag time τ and total measurement time T, we showed that the motion of CD93 is well described by a continuous-time random walk (CTRW). CD93 tracks were acquired using single particle tracking. The tracks were classified as confined or free, and the behavior of the MSD analyzed. EA-MSDs of both populations grew non-linearly with t, indicative of anomalous diffusion. Their EA-TAMSDs were found to depend on both τ and T, indicating non-ergodicity. Free molecules had ⟨r²⟩ ∝ t^α and ⟨δ²⟩ ∝ τ/T^(1−α), with α ≈ 0.5, consistent with a CTRW. Mean maximal excursion analysis supported this result. Confined CD93 had ⟨r²⟩ ∝ t⁰ and ⟨δ²⟩ ∝ (τ/T)^α, with α ≈ 0.3, consistent with a confined CTRW. CTRWs are described by a series of random jumps interspersed with power-law distributed waiting times, and may arise due to the interactions of CD93 with the endocytic machinery. Supported by NSERC.
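
    The distinction between the ensemble- and time-averaged MSD can be reproduced in a few lines: the sketch below (Python, arbitrary parameters) simulates a 2-D CTRW with Pareto-tailed waiting times and computes its time-averaged MSD; for α < 1 the TA-MSD scatters between realizations, the hallmark of the non-ergodicity described above.

        import numpy as np

        def simulate_ctrw(n_jumps, alpha=0.5, dt=0.01, t_max=100.0, seed=7):
            # Gaussian jumps separated by power-law waiting times
            # psi(t) ~ t^-(1+alpha), sampled on a regular time grid
            rng = np.random.default_rng(seed)
            times = np.cumsum(rng.pareto(alpha, n_jumps) + 1.0)
            jumps = rng.normal(0.0, 0.1, size=(n_jumps, 2))
            pos = np.vstack([np.zeros(2), np.cumsum(jumps, axis=0)])
            grid = np.arange(0.0, t_max, dt)
            return pos[np.searchsorted(times, grid)]

        def ta_msd(track, max_lag):
            # time-averaged MSD over a sliding window, one value per lag
            return np.array([np.mean(np.sum((track[m:] - track[:-m]) ** 2, axis=1))
                             for m in range(1, max_lag)])

        msd = ta_msd(simulate_ctrw(20000), 200)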

  10. OPCAB in patients on hemodialysis.

    PubMed

    Milani, Rodrigo; Brofman, Paulo Roberto Slud; Souza, José Augusto Moutinho de; Barboza, Laura; Guimarães, Maximiliano Ricardo; Barbosa, Alexandre; Varela, Alexandre Manoel; Ravagnelli, Marcel Rogers; Silva, Francisco Maia da

    2007-01-01

    To analyze the hospital outcomes of patients with chronic renal insufficiency on hemodialysis submitted to OPCAB. Fifty-one patients with chronic renal insufficiency were submitted to OPCAB. Hemodialysis was performed on the day before and the day after the operation. Myocardial revascularization was performed using LIMA's suture and suction stabilization. Fifty-one patients, with an average age of 61.28±11.09 years, were analyzed. Thirty patients (58.8%) were female. The predominant functional class was IV, in 21 (41.1%) of the patients. The left ventricular ejection fraction was poor in 21 (41.1%) patients. The mean EUROSCORE of this series was 7.65±3.83 and the mean number of distal anastomoses was 3.1±0.78 per patient. The average time of mechanical ventilation was 3.78±4.35 hours and the mean ICU stay was 41.9±13.8 hours, while the average hospitalization was 6.5±1.31 days. With respect to complications, nine (17.6%) of the patients developed atrial fibrillation, and one (1.9%) patient suffered an ischemic stroke but recovered well during hospitalization. There were no deaths in this series. Patients with chronic renal insufficiency on hemodialysis have always been a high-risk population for myocardial revascularization. In this series, the absence of extracorporeal circulation appeared to be safe and efficient in this special subgroup of patients. The operations were performed with low rates of complications, no deaths, and relatively short stays in the ICU and in hospital.

  11. Management of Toxic Epidermal Necrolysis with Plasmapheresis and Cyclosporine A: Our 10 Years’ Experience

    PubMed Central

    Giudice, Giuseppe; Maggio, Giulio; Bufano, Loredana; Memeo, Giuseppe

    2017-01-01

    Background: The management of toxic epidermal necrolysis (TEN) is controversial, and there is no uniform strategy. Objective: To share our 10 years’ experience in treating severe TEN with a novel protocol based on the association of cyclosporine A and plasmapheresis. Methods: In this case series, we retrospectively collected and assessed the 12 cases of severe TEN treated from 2005 to 2015 at the Burn Unit of the University of Bari Policlinico hospital. Results: Average affected body surface area was 77%; average SCORTEN was 4.3. The 12 patients had been treated with culprit drug withdrawal, systemic corticosteroids, and/or cyclosporine A with no response. The protocol was successfully administered in all 12 cases. Average time to response from protocol start was 4.9 days. Average time to remission from protocol start was 22 days; average hospital stay at our unit was 24.8 days. Four patients developed severe complications; 1 patient died. No complications linked to the protocol therapeutic measures were observed. The relatively small number of cases, given the rarity of the condition, is a limitation of this report. Conclusion: Our protocol based on the association of cyclosporine A and plasmapheresis is safe and efficacious in treating severe TEN. PMID:28280663

  12. A Bayesian CUSUM plot: Diagnosing quality of treatment.

    PubMed

    Rosthøj, Steen; Jacobsen, Rikke-Line

    2017-12-01

    To present a CUSUM plot based on Bayesian diagnostic reasoning displaying evidence in favour of "healthy" rather than "sick" quality of treatment (QOT), and to demonstrate a technique using Kaplan-Meier survival curves permitting application to case series with ongoing follow-up. For a case series with known final outcomes: consider each case a diagnostic test of good versus poor QOT (expected vs. increased failure rates), determine the likelihood ratio (LR) of the observed outcome, convert the LR to a weight by taking its log to base 2, and add up the weights sequentially in a plot showing how many times the odds in favour of good QOT have been doubled. For a series with observed survival times and an expected survival curve: divide the curve into time intervals, determine "healthy" and specify "sick" risks of failure in each interval, construct a "sick" survival curve, determine the LR of survival or failure at the given observation times, convert to weights, and add up. The Bayesian plot was applied retrospectively to 39 children with acute lymphoblastic leukaemia with completed follow-up, using Nordic collaborative results as reference, showing equal odds between good and poor QOT. In the ongoing treatment trial, with 22 of 37 children still at risk for an event, QOT has been monitored with average survival curves as reference, with odds so far favouring good QOT 2:1. QOT in small patient series can be assessed with a Bayesian CUSUM plot, retrospectively when all treatment outcomes are known, but also in ongoing series with unfinished follow-up. © 2017 John Wiley & Sons, Ltd.
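
    The weighting scheme for the known-outcome case reduces to a cumulative log2 likelihood ratio, as in this sketch (Python; the "healthy" and "sick" failure risks and the outcome series are hypothetical):

        import numpy as np

        def bayesian_cusum(failed, p_good, p_bad):
            # LR of each observed outcome for good vs. poor QOT; log2 turns
            # the running product of LRs into a count of odds doublings
            failed = np.asarray(failed, dtype=bool)
            lr = np.where(failed, p_good / p_bad,
                          (1.0 - p_good) / (1.0 - p_bad))
            return np.cumsum(np.log2(lr))

        outcomes = [False] * 8 + [True] + [False] * 11   # 20 cases, 1 failure
        print(bayesian_cusum(outcomes, p_good=0.10, p_bad=0.20))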

  13. Reconstruction of regional mean temperature for East Asia since 1900s and its uncertainties

    NASA Astrophysics Data System (ADS)

    Hua, W.

    2017-12-01

    Regional average surface air temperature (SAT) is one of the key variables often used to investigate climate change. Unfortunately, because observations over East Asia are limited, there are gaps in the observational data sampling available for regional mean SAT analysis, which is important for estimating past climate change. In this study, the regional average temperature of East Asia since the 1900s is calculated by an Empirical Orthogonal Function (EOF)-based optimal interpolation (OA) method that takes the data errors into account. The results show that our estimate is more precise and robust than the results from a simple average, which provides a better way for past climate reconstruction. In addition to the reconstructed regional average SAT anomaly time series, we also estimated the uncertainties of the reconstruction. The root mean square error (RMSE) results show that the error decreases with time and is not sufficiently large to alter the conclusions on the persistent warming in East Asia during the twenty-first century. Moreover, a test of the influence of data error on the reconstruction clearly shows the sensitivity of the reconstruction to the size of the data error.

  14. Primary production export flux in Marguerite Bay (Antarctic Peninsula): Linking upper water-column production to sediment trap flux

    NASA Astrophysics Data System (ADS)

    Weston, Keith; Jickells, Timothy D.; Carson, Damien S.; Clarke, Andrew; Meredith, Michael P.; Brandon, Mark A.; Wallace, Margaret I.; Ussher, Simon J.; Hendry, Katharine R.

    2013-05-01

    A study was carried out to assess primary production and associated export flux in the coastal waters of the western Antarctic Peninsula at an oceanographic time-series site. New, i.e., exportable, primary production in the upper water-column was estimated in two ways: by nutrient deficit measurements, and by primary production rate measurements using separate 14C-labelled radioisotope and 15N-labelled stable isotope uptake incubations. The resulting average annual exportable primary production estimates at the time-series site from nutrient deficit and primary production rates were 13 and 16 mol C m-2, respectively. Regenerated primary production was measured using 15N-labelled ammonium and urea uptake, and was low throughout the sampling period. The exportable primary production measurements were compared with sediment trap flux measurements from two locations: the time-series site and a site 40 km away in deeper water. Results showed ~1% of the upper mixed layer exportable primary production was exported to traps at 200 m depth at the time-series site (total water column depth 520 m). The maximum particle flux rate to sediment traps at the deeper offshore site (total water column depth 820 m) was lower than the flux at the coastal time-series site. Flux of particulate organic carbon was similar throughout the spring-summer high flux period for both sites. Remineralisation of particulate organic matter predominantly occurred in the upper water-column (<200 m depth), with minimal remineralisation below 200 m, at both sites. This highly productive region on the Western Antarctic Peninsula is therefore best characterised as 'high recycling, low export'.

  15. A general theory on frequency and time-frequency analysis of irregularly sampled time series based on projection methods - Part 2: Extension to time-frequency analysis

    NASA Astrophysics Data System (ADS)

    Lenoir, Guillaume; Crucifix, Michel

    2018-03-01

    Geophysical time series are sometimes sampled irregularly along the time axis. The situation is particularly frequent in palaeoclimatology. Yet, there is so far no general framework for handling the continuous wavelet transform when the time sampling is irregular. Here we provide such a framework. To this end, we define the scalogram as the continuous-wavelet-transform equivalent of the extended Lomb-Scargle periodogram defined in Part 1 of this study (Lenoir and Crucifix, 2018). The signal being analysed is modelled as the sum of a locally periodic component in the time-frequency plane, a polynomial trend, and a background noise. The mother wavelet adopted here is the Morlet wavelet classically used in geophysical applications. The background noise model is a stationary Gaussian continuous autoregressive-moving-average (CARMA) process, which is more general than the traditional Gaussian white and red noise processes. The scalogram is smoothed by averaging over neighbouring times in order to reduce its variance. The Shannon-Nyquist exclusion zone is however defined as the area corrupted by local aliasing issues. The local amplitude in the time-frequency plane is then estimated with least-squares methods. We also derive an approximate formula linking the squared amplitude and the scalogram. Based on this property, we define a new analysis tool: the weighted smoothed scalogram, which we recommend for most analyses. The estimated signal amplitude also gives access to band and ridge filtering. Finally, we design a test of significance for the weighted smoothed scalogram against the stationary Gaussian CARMA background noise, and provide algorithms for computing confidence levels, either analytically or with Monte Carlo Markov chain methods. All the analysis tools presented in this article are available to the reader in the Python package WAVEPAL.

  16. Development of temporal modelling for forecasting and prediction of malaria infections using time-series and ARIMAX analyses: A case study in endemic districts of Bhutan

    PubMed Central

    2010-01-01

    Background Malaria still remains a public health problem in some districts of Bhutan despite a marked reduction of cases in the last few years. To strengthen the country's prevention and control measures, this study was carried out to develop forecasting and prediction models of malaria incidence in the endemic districts of Bhutan using time series and ARIMAX. Methods This study was carried out retrospectively using the monthly reported malaria cases from the health centres to the Vector-borne Disease Control Programme (VDCP) and the meteorological data from the Meteorological Unit, Department of Energy, Ministry of Economic Affairs. Time series analysis was performed on monthly malaria cases, from 1994 to 2008, in seven malaria endemic districts. The time series models derived from a multiplicative seasonal autoregressive integrated moving average (ARIMA) were deployed to identify the best model using data from 1994 to 2006. The best-fit model was selected for each individual district and for the overall endemic area, and the monthly cases from January to December 2009 and 2010 were forecasted. In developing the prediction model, the monthly reported malaria cases and the meteorological factors from 1996 to 2008 of the seven districts were analysed. The method of ARIMAX modelling was employed to determine predictors of malaria of the subsequent month. Results It was found that the ARIMA (p, d, q) (P, D, Q)s model (p and P representing the autoregressive and seasonal autoregressive parameters; d and D representing the non-seasonal and seasonal differencing; q and Q representing the moving average and seasonal moving average parameters; and s representing the length of the seasonal period) for the overall endemic districts was (2,1,1)(0,1,1)12; the modelling data from each district revealed the two most common ARIMA models to be (2,1,1)(0,1,1)12 and (1,1,1)(0,1,1)12. The forecasted monthly malaria cases from January to December 2009 and 2010 varied from 15 to 82 cases in 2009 and 67 to 149 cases in 2010, where the population in 2009 was 285,375 and the expected population in 2010 was 289,085. The ARIMAX model of monthly cases and climatic factors showed considerable variations among the different districts. In general, the mean maximum temperature lagged at one month was a strong positive predictor of increased malaria cases for four districts. The monthly number of cases of the previous month was also a significant predictor in one district, whereas no variable could predict malaria cases for two districts. Conclusions The ARIMA models of time-series analysis were useful in forecasting the number of cases in the endemic areas of Bhutan. There was no consistency in the predictors of malaria cases when using the ARIMAX model with selected lag times and climatic predictors. The ARIMA forecasting models could be employed for planning and managing the malaria prevention and control programme in Bhutan. PMID:20813066

  17. Prediction of retention times in comprehensive two-dimensional gas chromatography using thermodynamic models.

    PubMed

    McGinitie, Teague M; Harynuk, James J

    2012-09-14

    A method was developed to accurately predict both the primary and secondary retention times for a series of alkanes, ketones and alcohols in a flow-modulated GC×GC system. This was accomplished through the use of a three-parameter thermodynamic model in which ΔH, ΔS, and ΔCp for an analyte's interaction with the stationary phases in both dimensions are known. Coupling this thermodynamic model with a time-summation calculation, it was possible to accurately predict both ¹tr and ²tr for all analytes. The model was able to predict retention times regardless of the temperature ramp used, with an average error of only 0.64% for ¹tr and an average error of only 2.22% for ²tr. The model shows promise for the accurate prediction of retention times in GC×GC for a wide range of compounds and is able to utilize data collected from 1D experiments. Copyright © 2012 Elsevier B.V. All rights reserved.
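
    The time-summation idea can be sketched as follows (Python): the thermodynamic triplet fixes the retention factor k(T), and small time steps through the temperature program each advance the analyte by a fraction dt / (t_m (1 + k)) of the column until it elutes. All numerical values, including the phase ratio, hold-up time and program, are hypothetical.

        import numpy as np

        R = 8.314  # J/(mol K)

        def retention_factor(T, dH, dS, dCp, beta=250.0, T0=298.15):
            # k(T) from dH, dS at reference T0 and a constant dCp
            dH_T = dH + dCp * (T - T0)
            dS_T = dS + dCp * np.log(T / T0)
            return np.exp(-(dH_T - T * dS_T) / (R * T)) / beta

        def retention_time(dH, dS, dCp, T_start=323.15, ramp=10.0 / 60.0,
                           t_m=30.0, dt=0.01):
            # march through a linear ramp (K/s); elution occurs when the
            # summed fraction of column traversed reaches 1
            t, x = 0.0, 0.0
            while x < 1.0:
                k = retention_factor(T_start + ramp * t, dH, dS, dCp)
                x += dt / (t_m * (1.0 + k))
                t += dt
            return t

        print(retention_time(dH=-40000.0, dS=-95.0, dCp=-50.0))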

  18. Historical Data Analysis of Hospital Discharges Related to the Amerithrax Attack in Florida

    PubMed Central

    Burke, Lauralyn K.; Brown, C. Perry; Johnson, Tammie M.

    2016-01-01

    Interrupted time-series analysis (ITSA) can be used to identify, quantify, and evaluate the magnitude and direction of an event on the basis of time-series data. This study evaluates the impact of the bioterrorist anthrax attacks ("Amerithrax") on hospital inpatient discharges in the metropolitan statistical area of Palm Beach, Broward, and Miami-Dade counties in the fourth quarter of 2001. Three statistical methods—standardized incidence ratio (SIR), segmented regression, and an autoregressive integrated moving average (ARIMA)—were used to determine whether Amerithrax influenced inpatient utilization. The SIR found a non–statistically significant 2 percent decrease in hospital discharges. Although the segmented regression test found a slight increase in the discharge rate during the fourth quarter, it was also not statistically significant; therefore, it could not be attributed to Amerithrax. Segmented regression diagnostics run in preparation for ARIMA indicated that the quarterly time series was not serially correlated, violating one of the assumptions for the use of the ARIMA method, so the ARIMA method could not properly evaluate the impact on the time-series data. Lack of granularity in the data time frames hindered the successful evaluation of the impact by the three analytic methods. This study demonstrates that the granularity of the data points is as important as the number of data points in a time series. ITSA is important for the ability to evaluate the impact that any hazard may have on inpatient utilization. Knowledge of hospital utilization patterns during disasters offers healthcare and civic professionals valuable information to plan, respond, mitigate, and evaluate any outcomes stemming from biothreats. PMID:27843420

  19. Using a chemistry transport model to account for the spatial variability of exposure concentrations in epidemiologic air pollution studies.

    PubMed

    Valari, Myrto; Menut, Laurent; Chatignoux, Edouard

    2011-02-01

    Environmental epidemiology, and more specifically time-series analysis, has traditionally used area-averaged pollutant concentrations measured at central monitors as exposure surrogates to associate health outcomes with air pollution. However, spatial aggregation has been shown to contribute to the overall bias in the estimation of the exposure-response functions. This paper presents the benefit of adding features of the spatial variability of exposure by using concentration fields modeled with a chemistry transport model instead of monitor data and accounting for human activity patterns. On the basis of county-level census data for the city of Paris, France, and a Monte Carlo simulation, a simple activity model was developed accounting for the temporal variability between working and evening hours as well as during transit. By combining activity data with modeled concentrations, the downtown, suburban, and rural spatial patterns in exposure to nitrogen dioxide, ozone, and PM2.5 (particulate matter [PM] ≤ 2.5 μm in aerodynamic diameter) were captured and parametrized. Exposures predicted with this model were used in a time-series study of the short-term effect of air pollution on total nonaccidental mortality for the 4-yr period from 2001 to 2004. It was shown that the time series of the exposure surrogates developed here are less correlated across co-pollutants than in the case of the area-averaged monitor data. This led to less biased exposure-response functions when all three co-pollutants were inserted simultaneously in the same regression model. This finding yields insight into pollutant-specific health effects that are otherwise masked by the high correlation among co-pollutants.

  20. Contemporary Surface Seasonal Oscillation and Vertical Deformation in Tibetan Plateau and Nepal Derived from the GPS, Leveling and GRACE Data

    NASA Astrophysics Data System (ADS)

    Shen, W.; Pan, Y.; Hwang, C.; Ding, H.

    2015-12-01

    We use 168 Continuous Global Positioning System (CGPS) stations distributed in the Tibetan Plateau (TP) and Nepal, with record lengths of 2.5 to 14 years, to estimate the present-day velocity field in this area, including the horizontal and vertical deformations under the frame ITRF2008. We estimate and remove common mode errors in the regional GPS time series using principal component analysis (PCA), obtaining time series with a high signal-to-noise ratio. Following maximum likelihood estimation analysis, a power law plus white noise stochastic model is adopted to estimate the velocity field. The highlight of the Tibetan region is the vertical crustal deformation. GPS vertical time series present seasonal oscillations caused by temporal mass loads, hence GRACE data from CSR are used to study the mass load changes. After removing the mass load deformations from the GPS vertical rates, the results are improved. Leveling data spanning about 48 years in this region are also used to estimate the rates of vertical movements. Our study suggests that the southern boundary of Nepal is still sinking due to the India plate colliding with the Eurasian plate. The uplift rates from south to north of the TP reduce gradually. The Himalayas region and north Nepal uplift at around 6 mm/yr on average. The uplift rate along the eastern TP in Qinghai is around 2.7 mm/yr on average. In contrast, the southeast of the Tibetan Plateau, south Yunnan and the Tarim in Xinjiang sink with different magnitudes. Our observation results suggest a complicated mechanism of mass migration in the TP. This study is supported by National 973 Project China (grant Nos. 2013CB733302 and 2013CB733305), NSFC (grant Nos. 41174011, 41429401, 41210006, 41128003, 41021061).

  1. Mortality on match days of the German national soccer team: a time series analysis from 1995 to 2009.

    PubMed

    Medenwald, D; Kuss, O

    2014-09-01

    There is inconsistent evidence on population mortality, especially cardiovascular disease mortality, on match days of national soccer teams during particular international tournaments. This study examines the number of deaths in Germany on match days of the national soccer team during a long-term period including several tournaments. We analysed all registered daily deaths in Germany from 1995 to 2009 (11 225 966 cases) using time series analysis methods. Following the Box/Jenkins approach, we applied a seasonal autoregressive integrated moving average model. To assess the effect of match days, we performed an intervention analysis by including a transfer function model representing match days of the national team in the statistical analyses. We conducted separate analyses for all matches and for matches during international tournaments (European and World Championships) only. Time series and results were stratified in terms of sex, age (<50 years, 50-70 years, >70 years) and cause of death (cardiovascular deaths, injuries, others). We performed a further independent analysis focusing only on the effect of match results (victory, loss, draw) and kind of tournament (international championships, qualifications, friendly matches). Most of the results did not indicate a distinct effect of matches of the national team on general mortality. Moreover, all deviations from the null value were small when compared with the average number of daily deaths (n=2270). There is no relevant increase or decrease in mortality on match days of the German national soccer team. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
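
    The Box/Jenkins intervention setup described, a seasonal ARIMA model with a transfer term for match days, can be sketched with statsmodels' SARIMAX using a pulse regressor. The data and model orders below are placeholders, not those of the paper.

    import numpy as np
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(2)
    n = 2000  # synthetic daily death counts
    deaths = (2270 + 150 * np.sin(2 * np.pi * np.arange(n) / 365.25)
              + rng.normal(0, 50, n))
    match_day = np.zeros(n)
    match_day[rng.choice(n, size=60, replace=False)] = 1  # pulse on match days

    # Seasonal ARIMA with the match-day indicator as an intervention regressor.
    model = SARIMAX(deaths, exog=match_day.reshape(-1, 1),
                    order=(1, 0, 1), seasonal_order=(1, 0, 1, 7))
    res = model.fit(disp=False)
    print(dict(zip(res.param_names, res.params)))  # 'x1' = match-day shift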

  2. Modern Era Retrospective-analysis for Research and Applications (MERRA) Global Water and Energy Budgets

    NASA Technical Reports Server (NTRS)

    Bosilovich, Michael G.; Chen, Junye

    2009-01-01

    In the summer of 2009, NASA's Modern Era Retrospective-analysis for Research and Applications (MERRA) will have completed 28 years of global satellite data analyses. Here, we characterize the global water and energy budgets of MERRA, compared with available observations and the latest reanalyses. In this analysis, the climatologies of the global average components are studied, as well as the separate land and ocean averages. In addition, the time series of the global averages are evaluated. For example, the global difference of precipitation and evaporation generally shows the influence of water vapor observations on the system. Because the observing systems change in time, especially remotely sensed observations of water, significant temporal variations can occur across the 28-year record. These variations are also closely connected to changes in the atmospheric energy and water budgets. The net imbalance of the energy budget at the surface can be large and of different sign for different reanalyses. In MERRA, the imbalance of energy at the surface tends to improve with time, being smallest during the most recent period, when satellite observations are most abundant.

  3. User's manual for the Graphical Constituent Loading Analysis System (GCLAS)

    USGS Publications Warehouse

    Koltun, G.F.; Eberle, Michael; Gray, J.R.; Glysson, G.D.

    2006-01-01

    This manual describes the Graphical Constituent Loading Analysis System (GCLAS), an interactive cross-platform program for computing the mass (load) and average concentration of a constituent that is transported in stream water over a period of time. GCLAS computes loads as a function of an equal-interval streamflow time series and an equal- or unequal-interval time series of constituent concentrations. The constituent-concentration time series may be composed of measured concentrations or a combination of measured and estimated concentrations. GCLAS is not intended for use in situations where concentration data (or an appropriate surrogate) are collected infrequently or where an appreciable proportion of the concentration values are censored. It is assumed that the constituent-concentration time series used by GCLAS adequately represents the true time-varying concentration. Commonly, measured constituent concentrations are collected at a frequency that is less than ideal (from a load-computation standpoint), so estimated concentrations must be inserted in the time series to better approximate the expected chemograph. GCLAS provides tools to facilitate estimation and entry of instantaneous concentrations for that purpose. Water-quality samples collected for load computation frequently are collected in a single vertical or at a single point in a stream cross section. Several factors, some of which may vary as a function of time and (or) streamflow, can affect whether the sample concentrations are representative of the mean concentration in the cross section. GCLAS provides tools to aid the analyst in assessing whether concentrations in samples collected in a single vertical or at a single point in a stream cross section exhibit systematic bias with respect to the mean concentrations. In cases where bias is evident, the analyst can construct coefficient relations in GCLAS to reduce or eliminate the observed bias. GCLAS can export load and concentration data in formats suitable for entry into the U.S. Geological Survey's National Water Information System. GCLAS can also import and export data in formats that are compatible with various commonly used spreadsheet and statistics programs.
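
    The core computation, integrating the product of streamflow and concentration over time, is straightforward; a hedged sketch of the idea (GCLAS's own algorithms, coefficient adjustments, and unit handling are more involved):

    import numpy as np

    def constituent_load(q_m3s, c_mgL, dt_s):
        """Approximate a constituent load by integrating Q * C over time.

        q_m3s : streamflow (m^3/s) at equal intervals dt_s seconds apart.
        c_mgL : constituent concentration (mg/L) at the same instants.
        Returns the total load in metric tons.
        """
        # 1 m^3/s times 1 mg/L equals 1 g/s; integrate by the trapezoidal rule.
        grams = np.trapz(q_m3s * c_mgL, dx=dt_s)
        return grams / 1e6  # grams -> metric tons

    # Hourly data for one day: a storm pulse in flow, concentration lagging it.
    t = np.arange(24.0)
    q = 10 + 40 * np.exp(-0.5 * ((t - 12) / 3.0) ** 2)   # m^3/s
    c = 20 + 180 * np.exp(-0.5 * ((t - 13) / 3.0) ** 2)  # mg/L
    print(f"daily load: {constituent_load(q, c, 3600.0):.1f} t")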

  4. Analysing the accuracy of machine learning techniques to develop an integrated influent time series model: case study of a sewage treatment plant, Malaysia.

    PubMed

    Ansari, Mozafar; Othman, Faridah; Abunama, Taher; El-Shafie, Ahmed

    2018-04-01

    The function of a sewage treatment plant is to treat the sewage to acceptable standards before being discharged into the receiving waters. To design and operate such plants, it is necessary to measure and predict the influent flow rate. In this research, the influent flow rate of a sewage treatment plant (STP) was modelled and predicted by autoregressive integrated moving average (ARIMA), nonlinear autoregressive network (NAR) and support vector machine (SVM) regression time series algorithms. To evaluate the models' accuracy, the root mean square error (RMSE) and coefficient of determination (R²) were calculated as initial assessment measures, while relative error (RE), peak flow criterion (PFC) and low flow criterion (LFC) were calculated as final evaluation measures to demonstrate the detailed accuracy of the selected models. An integrated model was developed based on the individual models' prediction ability for low, average and peak flow. An initial assessment of the results showed that the ARIMA model was the least accurate and the NAR model was the most accurate. The RE results also show that the SVM model's frequency of errors above 10% or below −10% was greater than the NAR model's. The influent was also forecasted up to 44 weeks ahead by both models. The graphical results indicate that the NAR model made better predictions than the SVM model. The final evaluation of NAR and SVM demonstrated that SVM made better predictions at peak flow and NAR fit well for low and average inflow ranges. The integrated model developed includes the NAR model for low and average influent and the SVM model for peak inflow.

  5. Assessing the Impact of Different Measurement Time Intervals on Observed Long-Term Wind Speed Trends

    NASA Astrophysics Data System (ADS)

    Azorin-Molina, C.; Vicente-Serrano, S. M.; McVicar, T.; Jerez, S.; Revuelto, J.; López Moreno, J. I.

    2014-12-01

    During the last two decades climate studies have reported a tendency toward a decline in measured near-surface wind speed in some regions of Europe, North America, Asia and Australia. This weakening in observed wind speed has been recently termed "global stilling", showing a worldwide average trend of -0.140 m s-1 dec-1 during the last 50 years. The precise cause of the "global stilling" remains largely uncertain and has been hypothetically attributed to several factors, mainly related to: (i) an increasing surface roughness (i.e. forest growth, land use changes, and urbanization); (ii) a slowdown in large-scale atmospheric circulation; (iii) instrumental drifts and technological improvements, maintenance, and shifts in measurement sites and calibration issues; (iv) sunlight dimming due to air pollution; and (v) astronomical changes. This study proposes a novel investigation aimed at analyzing how different measurement time intervals used to calculate a wind speed series can affect the sign and magnitude of long-term wind speed trends. For instance, National Weather Services across the globe estimate daily average wind speed using different time intervals and formulae that may affect the trend results. Firstly, we carried out a comprehensive review of wind studies reporting the sign and magnitude of wind speed trends and the sampling intervals used. Secondly, we analyzed near-surface wind speed trends recorded at 59 land-based stations across Spain, comparing monthly mean wind speed series obtained from: (a) daily mean wind speed data averaged from standard 10-min mean observations at 0000, 0700, 1300 and 1800 UTC; and (b) average wind speed of 24 hourly measurements (i.e., wind run measurements) from 0000 to 2400 UTC. Thirdly and finally, we quantified the impact of anemometer drift (i.e. bearing malfunction) by presenting preliminary results (1 year of paired measurements) from a comparison of one new anemometer sensor against one malfunctioning anemometer sensor with old bearings.
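
    The effect of the measurement time interval is easy to demonstrate: a daily mean built from four fixed synoptic hours generally differs from the mean of all 24 hourly values. The pandas sketch below uses synthetic data with a diurnal cycle; the stations and formulae of the study are not reproduced.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(3)
    idx = pd.date_range("2000-01-01", periods=24 * 365, freq="h")
    hours = idx.hour.to_numpy()
    # Synthetic wind speed with a diurnal cycle peaking mid-afternoon.
    wind = (4 + 1.5 * np.sin(2 * np.pi * (hours - 9) / 24)
            + rng.gamma(2, 0.5, len(idx)))
    ws = pd.Series(wind, index=idx)

    # (a) daily mean from the four standard observation hours only
    daily_4obs = ws[ws.index.hour.isin([0, 7, 13, 18])].resample("D").mean()
    # (b) daily mean of all 24 hourly values (wind-run style)
    daily_24h = ws.resample("D").mean()

    print((daily_4obs - daily_24h).mean())  # systematic offset between methods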

  6. The Leicester AATSR Global Analyser (LAGA) - Giving Young Students the Opportunity to Examine Space Observations of Global Climate-Related Processes

    NASA Astrophysics Data System (ADS)

    Llewellyn-Jones, David; Good, Simon; Corlett, Gary

    A PC-based analysis package has been developed for the dual purposes of, firstly, providing a 'quick-look' capability to research workers inspecting long time series of global satellite datasets of sea-surface temperature (SST); and, secondly, providing an introduction for students, either undergraduates or advanced high-school students, to the characteristics of commonly used analysis techniques for large geophysical datasets from satellites. Students can also gain insight into the behaviour of some basic climate-related large-scale or global processes. The package gives students immediate access to up to 16 years of continuous global SST data, mainly from the Advanced Along-Track Scanning Radiometer, currently flying on ESA's Envisat satellite. The data are available and are presented in the form of monthly averages, spatially averaged to half-degree or one-sixth-degree longitude-latitude grids. There are simple button-operated facilities for defining and calculating box averages; producing time series of such averages; defining and displaying transects and their evolution over time; and examining anomalous behaviour by displaying the difference between observed values and values derived from climatological means. By using these facilities a student rapidly gains familiarity with such processes as annual variability, the El Niño effect, as well as major current systems such as the Gulf Stream and other climatically important phenomena. In fact, the student is given immediate insight into the basic methods of examining geophysical data in a research context, without needing to acquire special analysis skills or go through the lengthy data retrieval and preparation procedures that are more generally required, as precursors to serious investigation, in the research laboratory. This software package, called the Leicester AATSR Global Analyser (LAGA), is written in a well-known and widely used analysis language, and the package can be run by using software that is readily available free of charge.
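
    The box-average and anomaly facilities that LAGA exposes through buttons reduce to a few array operations on a gridded monthly SST field; a minimal sketch with synthetic data, on a coarser grid than the package actually offers:

    import numpy as np

    rng = np.random.default_rng(4)
    n_months = 16 * 12  # 16 years of monthly grids (time, lat, lon)
    lat = np.linspace(-89, 89, 90)
    sst = (15 + 10 * np.cos(np.deg2rad(lat))[None, :, None]
           + 2 * np.sin(2 * np.pi * np.arange(n_months) / 12)[:, None, None]
           + rng.normal(0, 0.3, (n_months, 90, 180)))

    # Box average: mean over a latitude/longitude window -> a time series.
    box = sst[:, 40:60, 60:90].mean(axis=(1, 2))

    # Anomaly: subtract the climatological mean of each calendar month.
    clim = box.reshape(-1, 12).mean(axis=0)
    anom = box - np.tile(clim, n_months // 12)
    print(anom[:12].round(2))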

  7. Development of a Robust Identifier for NPPs Transients Combining ARIMA Model and EBP Algorithm

    NASA Astrophysics Data System (ADS)

    Moshkbar-Bakhshayesh, Khalil; Ghofrani, Mohammad B.

    2014-08-01

    This study introduces a novel identification method for the recognition of nuclear power plant (NPP) transients by combining the autoregressive integrated moving-average (ARIMA) model and a neural network with the error backpropagation (EBP) learning algorithm. The proposed method consists of three steps. First, an EBP-based identifier is adopted to distinguish the plant's normal states from the faulty ones. In the second step, ARIMA models use the integrated (I) process to convert non-stationary data of the selected variables into stationary data. Subsequently, ARIMA processes, including autoregressive (AR), moving-average (MA), or autoregressive moving-average (ARMA), are used to forecast the time series of the selected plant variables. In the third step, to identify the type of transient, the forecasted time series are fed to a modular identifier developed using the latest advances in the EBP learning algorithm. Bushehr nuclear power plant (BNPP) transients are probed to analyze the ability of the proposed identifier. Recognition of a transient is based on the similarity of its statistical properties to those of the reference transient, rather than on the values of the input patterns. Greater robustness against noisy data and an improved balance between memorization and generalization are salient advantages of the proposed identifier. Reduced false identification, dependence of the identification solely on the sign of each output signal, selection of the plant variables for transient training independently of each other, and extendibility to the identification of more transients without unfavorable effects are further merits of the proposed identifier.

  8. Detecting daily routines of older adults using sensor time series clustering.

    PubMed

    Hajihashemi, Zahra; Yefimova, Maria; Popescu, Mihail

    2014-01-01

    The aim of this paper is to develop an algorithm to identify deviations in the patterns of day-to-day activities of older adults, in order to generate alerts to healthcare providers for timely interventions. Daily routines, such as bathroom visits, can be monitored by automated in-home sensor systems. We present a novel approach that finds periodicity in sensor time series data using a clustering approach. For this study, we used a data set from TigerPlace, a retirement community in Columbia, MO, where apartments are equipped with a network of motion, pressure and depth sensors. A retrospective multiple case study (N=3) design was used to quantify bathroom visits as part of the older adult's daily routine, over a 10-day period. The distribution of the duration, number, and average time between sensor hits was used to define the confidence level for routine visit extraction. Then, hierarchical clustering was applied to extract periodic patterns. The performance of the proposed method was evaluated through experimental results.

  9. Interannual Variability of OLR as Observed by AIRS and CERES

    NASA Technical Reports Server (NTRS)

    Susskind, Joel; Molnar, Gyula; Iredell, Lena; Loeb, Norman G.

    2012-01-01

    This paper compares spatial anomaly time series of OLR (Outgoing Longwave Radiation) and OLR(sub CLR) (Clear Sky OLR) as determined using observations from CERES Terra and AIRS over the time period September 2002 through June 2011. Both AIRS and CERES show a significant decrease in global mean and tropical mean OLR over this time period. We find excellent agreement of the anomaly time series of the two OLR data sets in almost every detail, down to the 1 deg x 1 deg spatial grid point level. The extremely close agreement of OLR anomaly time series derived from observations by two different instruments implies that both sets of results must be highly stable. This agreement also validates to some extent the anomaly time series of the AIRS derived products used in the computation of the AIRS OLR product. The paper also examines the correlations of anomaly time series of AIRS and CERES OLR, on different spatial scales, as well as those of other AIRS derived products, with that of the NOAA Sea Surface Temperature (SST) product averaged over the NOAA Niño-4 spatial region. We refer to these SST anomalies as the El Niño Index. Large spatially coherent positive and negative correlations of OLR anomaly time series with that of the El Niño Index are found in different spatial regions. Anomalies of global mean, and especially tropical mean, OLR are highly positively correlated with the El Niño Index. These correlations indicate that the recent global and tropical mean decreases in OLR over the period September 2002 through June 2011, as observed by both AIRS and CERES, are primarily the result of a transition from an El Niño condition at the beginning of the data record to La Niña conditions toward the end of the data period. We show that the close correlation of global mean, and especially tropical mean, OLR anomalies with the El Niño Index can be well accounted for by temporal changes of OLR within two spatial regions which lie outside the NOAA Niño-4 region, in which anomalies of cloud cover and mid-tropospheric water vapor are both highly negatively correlated with the El Niño Index. Agreement of the AIRS and CERES OLR(sub CLR) anomaly time series is less close, which may be a result of the large sampling differences in the ensemble of cases included in each OLR(sub CLR) data set.

  10. The Weighted-Average Lagged Ensemble.

    PubMed

    DelSole, T; Trenary, L; Tippett, M K

    2017-11-01

    A lagged ensemble is an ensemble of forecasts from the same model initialized at different times but verifying at the same time. The skill of a lagged ensemble mean can be improved by assigning weights to different forecasts in such a way as to maximize skill. If the forecasts are bias corrected, then an unbiased weighted lagged ensemble requires the weights to sum to one. Such a scheme is called a weighted-average lagged ensemble. In the limit of uncorrelated errors, the optimal weights are positive and decay monotonically with lead time, so that the least skillful forecasts have the least weight. In more realistic applications, the optimal weights do not always behave this way. This paper presents a series of analytic examples designed to illuminate conditions under which the weights of an optimal weighted-average lagged ensemble become negative or depend nonmonotonically on lead time. It is shown that negative weights are most likely to occur when the errors grow rapidly and are highly correlated across lead time. The weights are most likely to behave nonmonotonically when the mean square error is approximately constant over the range of forecasts included in the lagged ensemble. An extreme example of the latter behavior is presented in which the optimal weights vanish everywhere except at the shortest and longest lead times.
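
    Under the sum-to-one constraint the optimal weights have a closed form: minimizing w'Cw subject to 1'w = 1, where C is the error covariance across lead times, gives w = C⁻¹1 / (1'C⁻¹1). The numpy sketch below, with invented numbers, shows how rapidly growing, highly correlated errors drive some weights negative, as the paper describes.

    import numpy as np

    def optimal_lagged_weights(C):
        """Weights minimizing the mean square error of a weighted-average
        lagged ensemble, subject to the weights summing to one."""
        ones = np.ones(C.shape[0])
        w = np.linalg.solve(C, ones)
        return w / w.sum()

    # Error std grows quickly with lead time; errors are highly correlated.
    std = np.array([1.0, 1.6, 2.6, 4.2])
    lags = np.arange(4)
    corr = 0.9 ** np.abs(np.subtract.outer(lags, lags))
    C = np.outer(std, std) * corr

    w = optimal_lagged_weights(C)
    print(w.round(3))  # note the negative weights at longer leads
    print(w.sum())     # constraint: exactly one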

  11. Comparison of different synthetic 5-min rainfall time series on the results of rainfall runoff simulations in urban drainage modelling

    NASA Astrophysics Data System (ADS)

    Krämer, Stefan; Rohde, Sophia; Schröder, Kai; Belli, Aslan; Maßmann, Stefanie; Schönfeld, Martin; Henkel, Erik; Fuchs, Lothar

    2015-04-01

    The design of urban drainage systems with numerical simulation models requires long, continuous rainfall time series with high temporal resolution. However, suitable observed time series are rare. As a result, usual design concepts often use uncertain or unsuitable rainfall data, which renders them uneconomic or unsustainable. An expedient alternative to observed data is the use of long, synthetic rainfall time series as input for the simulation models. Within the project SYNOPSE, several different methods to generate synthetic rainfall data as input for urban drainage modelling are advanced, tested, and compared. Synthetic rainfall time series from three different precipitation model approaches (one parametric stochastic model using an alternating renewal approach, one non-parametric stochastic model using a resampling approach, and one downscaling approach based on a regional climate model) are provided for three catchments with different sewer system characteristics in different climate regions of Germany: Hamburg (northern Germany): maritime climate, mean annual rainfall 770 mm; combined sewer system length 1.729 km (city center of Hamburg), storm water sewer system length (Hamburg Harburg) 168 km. Brunswick (Lower Saxony, northern Germany): transitional climate from maritime to continental, mean annual rainfall 618 mm; sewer system length 278 km, connected impervious area 379 ha, height difference 27 m. Freiburg im Breisgau (southern Germany): Central European transitional climate, mean annual rainfall 908 mm; sewer system length 794 km, connected impervious area 1 546 ha, height difference 284 m. Hydrodynamic models are set up for each catchment to simulate rainfall runoff processes in the sewer systems. Long-term event time series are extracted from the three different synthetic rainfall time series (comprising up to 600 years of continuous rainfall) provided for each catchment and from observed gauge rainfall (reference rainfall) according to national hydraulic design standards. The synthetic and reference long-term event time series are used as rainfall input for the hydrodynamic sewer models. For comparison of the synthetic rainfall time series against the reference rainfall and against each other, the number of surcharged manholes, the number of surcharges per manhole, and the average surcharge volume per manhole are applied as hydraulic performance criteria. The results are discussed and assessed to answer the following questions: Are the synthetic rainfall approaches suitable for generating high-resolution rainfall series, and do they produce, in combination with numerical rainfall runoff models, valid results for the design of urban drainage systems? What are the bounds of uncertainty in the runoff results depending on the synthetic rainfall model and on the climate region? The work is carried out within the SYNOPSE project, funded by the German Federal Ministry of Education and Research (BMBF).

  12. Volatility measurement with directional change in Chinese stock market: Statistical property and investment strategy

    NASA Astrophysics Data System (ADS)

    Ma, Junjun; Xiong, Xiong; He, Feng; Zhang, Wei

    2017-04-01

    Stock price fluctuations are studied in this paper from an intrinsic-time perspective. Events, namely directional changes (DC) and overshoots, are taken as the time scale of the price time series. The corresponding statistical properties and parameter estimation of this directional-change law are tested in the Chinese stock market. Furthermore, a directional-change trading strategy is proposed for investing in the market portfolio in the Chinese stock market, and in-sample and out-of-sample performance are compared among the different methods of model parameter estimation. We conclude that the DC method can capture important fluctuations in the Chinese stock market and gain profit owing to the statistical property that the average upturn overshoot size is bigger than the average downturn directional-change size. The optimal parameter of the DC method is not fixed, and we obtained a 1.8% annual excess return with this DC-based trading strategy.
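
    The directional-change event detector underlying this kind of analysis needs only a relative threshold and a running extreme; the following is a minimal sketch of the standard DC algorithm (the paper's parameter estimation and trading rules are not reproduced):

    import numpy as np

    def directional_changes(prices, theta=0.01):
        """Detect directional-change (DC) events at relative threshold theta.
        Returns a list of (index, 'up' or 'down') DC confirmation points."""
        events = []
        mode = "up"      # trend currently being tracked
        ext = prices[0]  # running extreme: max in up mode, min in down mode
        for i, p in enumerate(prices):
            if mode == "up":
                ext = max(ext, p)
                if p <= ext * (1 - theta):   # reversal exceeds the threshold
                    events.append((i, "down"))
                    mode, ext = "down", p
            else:
                ext = min(ext, p)
                if p >= ext * (1 + theta):
                    events.append((i, "up"))
                    mode, ext = "up", p
        return events

    rng = np.random.default_rng(5)
    prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 2000)))
    print(len(directional_changes(prices)))  # number of intrinsic-time events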

  13. Load balancing for massively-parallel soft-real-time systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hailperin, M.

    1988-09-01

    Global load balancing, if practical, would allow the effective use of massively-parallel ensemble architectures for large soft-real-time problems. The challenge is to replace quick global communication, which is impractical in a massively-parallel system, with statistical techniques. In this vein, the author proposes a novel approach to decentralized load balancing based on statistical time-series analysis. Each site estimates the system-wide average load using information about past loads of individual sites and attempts to match that average. This estimation process is practical because the soft-real-time systems of interest naturally exhibit loads that are periodic, in a statistical sense akin to seasonality in econometrics. It is shown how this load-characterization technique can be the foundation for a load-balancing system in an architecture employing cut-through routing and an efficient multicast protocol.

  14. On nonstationarity and antipersistency in global temperature series

    NASA Astrophysics Data System (ADS)

    Kärner, O.

    2002-10-01

    Statistical analysis is carried out for satellite-based global daily tropospheric and stratospheric temperature anomaly and solar irradiance data sets. Behavior of the series appears to be nonstationary with stationary daily increments. Estimating long-range dependence between the increments reveals a remarkable difference between the two temperature series. The global average tropospheric temperature anomaly behaves similarly to the solar irradiance anomaly. Their daily increments show antipersistency for scales longer than 2 months. This property points to a cumulative negative feedback in the Earth climate system governing the tropospheric variability during the last 22 years. The result emphasizes a dominant role of solar irradiance variability in variations of the tropospheric temperature and gives no support to the theory of anthropogenic climate change. The global average stratospheric temperature anomaly proceeds like a one-dimensional random walk at least up to 11 years, allowing a good representation of the monthly series by means of autoregressive integrated moving average (ARIMA) models.
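
    One standard way to quantify (anti)persistency of increments is detrended fluctuation analysis, whose exponent falls below 0.5 for antipersistent series; the abstract does not name the authors' estimator, so the compact DFA-1 sketch below is merely one common choice.

    import numpy as np

    def dfa_exponent(x, scales=None):
        """Estimate the DFA-1 scaling exponent of a series x.
        alpha < 0.5 indicates antipersistent fluctuations."""
        x = np.asarray(x, float)
        y = np.cumsum(x - x.mean())  # the profile
        if scales is None:
            scales = np.unique(np.logspace(
                2, np.log10(len(x) // 4), 12).astype(int))
        F = []
        for s in scales:
            n = len(y) // s
            segs = y[: n * s].reshape(n, s)
            t = np.arange(s)
            ms = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2)
                  for seg in segs]  # local linear detrend in each window
            F.append(np.sqrt(np.mean(ms)))
        alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
        return alpha

    rng = np.random.default_rng(6)
    print(dfa_exponent(rng.normal(size=20000)))  # ~0.5 for uncorrelated increments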

  15. Essential/precursor chemicals and drug consumption: impacts of US sodium permanganate and Mexico pseudoephedrine controls on the numbers of US cocaine and methamphetamine users.

    PubMed

    Cunningham, James K; Liu, Lon-Mu; Callaghan, Russell C

    2016-11-01

    In December 2006 the United States regulated sodium permanganate, a cocaine essential chemical. In March 2007 Mexico, the United States' primary source for methamphetamine, closed a chemical company accused of illicitly importing 60+ tons of pseudoephedrine, a methamphetamine precursor chemical. US cocaine availability and methamphetamine availability, respectively, decreased in association. This study tested whether the controls had impacts upon the numbers of US cocaine users and methamphetamine users. Auto-regressive integrated moving average (ARIMA) intervention time-series analysis. Comparison series (heroin and marijuana users) were used. United States, 2002-14. The National Survey on Drug Use and Health (n = 723 283), a complex sample survey of the US civilian, non-institutionalized population. Estimates of the numbers of (1) past-year users and (2) past-month users were constructed for each calendar quarter from 2002 to 2014, providing each series with 52 time-periods. Downward shifts in cocaine users started at the time of the cocaine regulation. Past-year and past-month cocaine user series levels decreased by approximately 1 946 271 (-32%) (P < 0.05) and 694 770 (-29%) (P < 0.01), respectively; no apparent recovery occurred through 2014. Downward shifts in methamphetamine users started at the time of the chemical company closure. Past-year and past-month methamphetamine series levels decreased by 494 440 (-35%) [P < 0.01; 95% confidence interval (CI) = -771 897, -216 982] and 277 380 (-45%) (P < 0.05; CI = -554 073, -686), respectively; partial recovery possibly occurred in 2013. The comparison series changed little at the intervention times. Essential/precursor chemical controls in the United States (2006) and Mexico (2007) were associated with large, extended (7+ years) reductions in cocaine users and methamphetamine users in the United States. © 2016 Society for the Study of Addiction.

  16. 40 CFR Appendix N to Part 50 - Interpretation of the National Ambient Air Quality Standards for PM2.5

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... series of daily values represents the 98th percentile for that year. Creditable samples include daily... measured (or averaged from hourly measurements in AQS) from midnight to midnight (local standard time) from... design value (DV) or a 24-hour PM2.5 NAAQS DV to determine if those metrics, which are judged to be based...

  17. Time series analysis of collective motions in proteins

    NASA Astrophysics Data System (ADS)

    Alakent, Burak; Doruker, Pemra; Çamurdan, Mehmet C.

    2004-01-01

    The dynamics of α-amylase inhibitor tendamistat around its native state is investigated using time series analysis of the principal components of the Cα atomic displacements obtained from molecular dynamics trajectories. Collective motion along a principal component is modeled as a homogeneous nonstationary process, which is the result of the damped oscillations in local minima superimposed on a random walk. The motion in local minima is described by a stationary autoregressive moving average model, consisting of the frequency, damping factor, moving average parameters and random shock terms. Frequencies for the first 50 principal components are found to be in the 3-25 cm-1 range, which correlate well with the principal component indices and also with atomistic normal mode analysis results. Damping factors, though their correlation is less pronounced, decrease as principal component indices increase, indicating that low frequency motions are less affected by friction. The existence of a positive moving average parameter indicates that the stochastic force term is likely to disturb the mode in opposite directions for two successive sampling times, showing the mode's tendency to stay close to a minimum. All four of these parameters affect the mean square fluctuations of a principal mode within a single minimum. The inter-minima transitions are described by a random walk model, which is driven by a random shock term considerably smaller than that for the intra-minimum motion. The principal modes are classified into three subspaces based on their dynamics: essential, semiconstrained, and constrained, at least in partial consistency with previous studies. The Gaussian-type distributions of the intermediate modes, called "semiconstrained" modes, are explained by asserting that this random walk behavior is not completely free but confined between energy barriers.

  18. Air pollution and daily mortality: A new approach to an old problem

    NASA Astrophysics Data System (ADS)

    Lipfert, Frederick W.; Murray, Christian J.

    2012-08-01

    Many time-series studies find associations between acute health effects and ambient air quality under current conditions. However, few such studies link mortality with morbidity to provide rational bases for improving public health. This paper describes a research project that developed and validated a new modeling approach directly addressing changes in life expectancies and the prematurity of deaths associated with transient changes in air quality. We used state-space modeling and Kalman filtering of elderly Philadelphia mortality counts from 1974 to 1988 to estimate the size of the population at highest risk of imminent death. This subpopulation appears stable over time but is sensitive to season and to environmental factors: ambient temperature, ozone, and total suspended particulate matter (TSP), used as an index of airborne particles in this demonstration of the methodology. This population at extreme risk averages fewer than 0.1% of the elderly. By considering successively longer lags or moving averages of TSP, we find that cumulative short-term effects on entry to the at-risk pool tend to level off and decrease as periods of exposure longer than a few days are considered. These estimated environmental effects on the elderly are consistent with previous analyses using conventional time-series methods. However, this new model suggests that such environmentally linked deaths comprise only about half of the subjects whose frailty is associated with environmental factors. The average life expectancy of persons in the at-risk pool is estimated to be 5-7 days, which may be reduced by less than one day by environmental effects. These results suggest that exposures leading up to severe acute frailty and subsequent risk of imminent death may be more important from a public health perspective than those directly associated with subsequent mortality.

  19. Reducing the legal blood alcohol concentration limit for driving in developing countries: a time for change? Results and implications derived from a time-series analysis (2001-10) conducted in Brazil.

    PubMed

    Andreuccetti, Gabriel; Carvalho, Heraclito B; Cherpitel, Cheryl J; Ye, Yu; Ponce, Julio C; Kahn, Tulio; Leyton, Vilma

    2011-12-01

    In Brazil, a new law introduced in 2008 has lowered the blood alcohol concentration limit for drivers from 0.06 to 0.02, but its effectiveness in reducing traffic accidents remains uncertain. This study evaluated the effects of this enactment on road traffic injuries and fatalities. Time-series analysis using autoregressive integrated moving average (ARIMA) modelling. State and capital of São Paulo, Brazil. A total of 1,471,087 non-fatal and 51,561 fatal road traffic accident cases in both regions. Monthly rates of traffic injuries and fatalities per 100,000 inhabitants from January 2001 to June 2010. The new traffic law was responsible for significant reductions in traffic injury and fatality rates in both localities (P<0.05). A stronger effect was observed for traffic fatality rates (-7.2 and -16.0% in the average monthly rate in the State and capital, respectively) compared to traffic injury rates (-1.8 and -2.3% in the State and capital, respectively). Lowering the blood alcohol concentration limit in Brazil had a greater impact on traffic fatalities than injuries, with a higher effect in the capital, where presumably police enforcement was enhanced. © 2011 The Authors, Addiction © 2011 Society for the Study of Addiction.

  20. Predicting Secchi disk depth from average beam attenuation in a deep, ultra-clear lake

    USGS Publications Warehouse

    Larson, G.L.; Hoffman, R.L.; Hargreaves, B.R.; Collier, R.W.

    2007-01-01

    We addressed potential sources of error in estimating the water clarity of mountain lakes by investigating the use of beam transmissometer measurements to estimate Secchi disk depth. The optical properties Secchi disk depth (SD) and beam transmissometer attenuation (BA) were measured in Crater Lake (Crater Lake National Park, Oregon, USA) at a designated sampling station near the maximum depth of the lake. A standard 20 cm black and white disk was used to measure SD. The transmissometer light source had a nearly monochromatic wavelength of 660 nm and a path length of 25 cm. We created a SD prediction model by regression of the inverse SD of 13 measurements recorded on days when environmental conditions were acceptable for disk deployment with BA averaged over the same depth range as the measured SD. The relationship between inverse SD and averaged BA was significant and the average 95% confidence interval for predicted SD relative to the measured SD was ±1.6 m (range = -4.6 to 5.5 m) or ±5.0%. Eleven additional sample dates tested the accuracy of the predictive model. The average 95% confidence interval for these sample dates was ±0.7 m (range = -3.5 to 3.8 m) or ±2.2%. The 1996-2000 time-series means for measured and predicted SD varied by 0.1 m, and the medians varied by 0.5 m. The time-series mean annual measured and predicted SDs also varied little, with intra-annual differences between measured and predicted mean annual SD ranging from -2.1 to 0.1 m. The results demonstrated that this prediction model reliably estimated Secchi disk depths and can be used to significantly expand optical observations in an environment where the conditions for standardized SD deployments are limited. © 2007 Springer Science+Business Media B.V.
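
    The prediction model is a linear regression of inverse Secchi depth on depth-averaged beam attenuation, inverted to predict SD; a minimal sketch (the coefficients below come from synthetic points, not the Crater Lake calibration):

    import numpy as np

    # Paired observations: depth-averaged beam attenuation (1/m), Secchi depth (m).
    ba = np.array([0.10, 0.11, 0.12, 0.13, 0.15, 0.16, 0.18, 0.20,
                   0.22, 0.24, 0.26, 0.28, 0.30])
    sd = np.array([38.0, 36.5, 34.0, 32.5, 30.0, 28.5, 26.0, 24.0,
                   22.5, 21.0, 19.5, 18.5, 17.0])

    # Fit 1/SD = a + b * BA by ordinary least squares.
    b, a = np.polyfit(ba, 1.0 / sd, 1)

    def predict_sd(ba_value):
        """Predicted Secchi depth from depth-averaged beam attenuation."""
        return 1.0 / (a + b * ba_value)

    print(predict_sd(0.17))  # about 26.5 m for this synthetic calibration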

  1. No evidence of suicide increase following terrorist attacks in the United States: an interrupted time-series analysis of September 11 and Oklahoma City.

    PubMed

    Pridemore, William Alex; Trahan, Adam; Chamlin, Mitchell B

    2009-12-01

    There is substantial evidence of detrimental psychological sequelae following disasters, including terrorist attacks. The effect of these events on extreme responses such as suicide, however, is unclear. We tested competing hypotheses about such effects by employing autoregressive integrated moving average techniques to model the impact of September 11 and the Oklahoma City bombing on monthly suicide counts at the local, state, and national level. Unlike prior studies that provided conflicting evidence, rigorous time series techniques revealed no support for an increase or decrease in suicides following these events. We conclude that while terrorist attacks produce subsequent psychological morbidity and may affect self and collective efficacy well beyond their immediate impact, these effects are not strong enough to influence levels of suicide mortality.

  2. Spatial and temporal patterns of dengue in Guangdong province of China.

    PubMed

    Wang, Chenggang; Yang, Weizhong; Fan, Jingchun; Wang, Furong; Jiang, Baofa; Liu, Qiyong

    2015-03-01

    The aim of the study was to describe the spatial and temporal patterns of dengue in Guangdong from 1978 to 2010. Time series analysis was performed using data on annual dengue incidence in Guangdong province for 1978-2010. Annual average dengue incidences for each city were mapped for 4 periods using a geographical information system (GIS). Hot spot analysis was used to identify spatial patterns of dengue cases for 2005-2010 using the CrimeStat III software. The incidence of dengue in Guangdong province had fallen steadily from 1978 to 2010. The time series was a random sequence without regularity and with no fixed cycle. The geographic range of dengue fever had expanded from 1978 to 2010. Cases were mostly concentrated in Zhanjiang and the developed regions of the Pearl River Delta and Shantou. © 2013 APJPH.

  3. Understanding characteristics in multivariate traffic flow time series from complex network structure

    NASA Astrophysics Data System (ADS)

    Yan, Ying; Zhang, Shen; Tang, Jinjun; Wang, Xiaofei

    2017-07-01

    Discovering dynamic characteristics in traffic flow is a significant step in designing effective traffic management and control strategies for relieving traffic congestion in urban cities. A new method based on complex network theory is proposed to study multivariate traffic flow time series. The data were collected from loop detectors on a freeway over one year. In order to construct a complex network from the original traffic flow, a weighted Frobenius norm is adopted to estimate the similarity between multivariate time series, and principal component analysis is implemented to determine the weights. We discuss how to select the optimal critical threshold for networks at different hours in terms of the cumulative probability distribution of degree. Furthermore, two statistical properties of the networks, the normalized network structure entropy and the cumulative probability of degree, are utilized to explore the hourly variation in traffic flow. The results demonstrate that these two statistical quantities exhibit patterns similar to those of traffic flow parameters, with morning and evening peak hours. Accordingly, we detect three traffic states, trough, peak, and transitional hours, based on the correlation between the two aforementioned properties. The resulting state classification represents the hourly fluctuation in traffic flow, as shown by analyzing annual average hourly values of traffic volume, occupancy, and speed in the corresponding hours.
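
    The pipeline (similarity matrix, threshold, network statistics) is easy to sketch. Below, plain pairwise correlation stands in for the paper's PCA-weighted Frobenius-norm similarity, and the entropy is the normalized Shannon entropy of the degree distribution; both substitutions are simplifications.

    import numpy as np

    def network_from_series(series, threshold):
        """Build an undirected network by thresholding pairwise |correlation|.
        series : (n_nodes, n_samples) array of traffic-flow time series."""
        A = (np.abs(np.corrcoef(series)) >= threshold).astype(int)
        np.fill_diagonal(A, 0)
        return A

    def normalized_structure_entropy(A):
        """Normalized Shannon entropy of the degree distribution."""
        deg = A.sum(axis=1)
        p = deg / deg.sum()
        p = p[p > 0]
        return -(p * np.log(p)).sum() / np.log(len(deg))

    rng = np.random.default_rng(7)
    base = rng.normal(size=500)                          # shared congestion signal
    series = 0.8 * base + rng.normal(0, 0.6, (30, 500))  # 30 detectors
    A = network_from_series(series, threshold=0.5)
    print(A.sum() // 2, normalized_structure_entropy(A))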

  4. Efficient Bayesian inference for natural time series using ARFIMA processes

    NASA Astrophysics Data System (ADS)

    Graves, T.; Gramacy, R. B.; Franzke, C. L. E.; Watkins, N. W.

    2015-11-01

    Many geophysical quantities, such as atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long memory (LM). LM implies that these quantities experience non-trivial temporal memory, which potentially not only enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LM. In this paper we present a modern and systematic approach to the inference of LM. We use the flexible autoregressive fractional integrated moving average (ARFIMA) model, which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LM, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g., short-memory effects) can be integrated over in order to focus on long-memory parameters and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data and the central England temperature (CET) time series, with favorable comparison to the standard estimators. For CET we also extend our method to seasonal long memory.
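
    The fractional integration at the heart of ARFIMA expands (1 - B)^d into a filter with simple recursive coefficients, w_0 = 1 and w_k = w_{k-1}(k - 1 - d)/k; the sketch below applies a truncated version of it and is illustrative only, not the authors' approximate likelihood.

    import numpy as np

    def frac_diff_weights(d, n):
        """First n coefficients of (1 - B)^d via the standard recursion."""
        w = np.empty(n)
        w[0] = 1.0
        for k in range(1, n):
            w[k] = w[k - 1] * (k - 1 - d) / k
        return w

    def frac_difference(x, d):
        """Apply the (truncated) fractional difference (1 - B)^d to x."""
        w = frac_diff_weights(d, len(x))
        return np.array([np.dot(w[: t + 1], x[t::-1]) for t in range(len(x))])

    rng = np.random.default_rng(8)
    x = np.cumsum(rng.normal(size=1000))  # a random walk (d = 1)
    y = frac_difference(x, 0.4)           # partially whitened series
    print(frac_diff_weights(0.4, 5))      # slowly decaying filter weights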

  5. Time series modelling and forecasting of emergency department overcrowding.

    PubMed

    Kadri, Farid; Harrou, Fouzi; Chaabane, Sondès; Tahon, Christian

    2014-09-01

    Efficient management of patient flow (demand) in emergency departments (EDs) has become an urgent issue for many hospital administrations. Today, more and more attention is being paid to hospital management systems to optimally manage patient flow and to improve management strategies, efficiency and safety in such establishments. To this end, EDs require significant human and material resources, but unfortunately these are limited. Within such a framework, the ability to accurately forecast demand in emergency departments has considerable implications for hospitals to improve resource allocation and strategic planning. The aim of this study was to develop models for forecasting daily attendances at the hospital emergency department in Lille, France. The study demonstrates how time-series analysis can be used to forecast, at least in the short term, demand for emergency services in a hospital emergency department. The forecasts were based on daily patient attendances at the paediatric emergency department in Lille regional hospital centre, France, from January 2012 to December 2012. An autoregressive integrated moving average (ARIMA) method was applied separately to each of the two GEMSA categories and total patient attendances. Time-series analysis was shown to provide a useful, readily available tool for forecasting emergency department demand.

  6. Continuous measurement of suspended-sediment discharge in rivers by use of optical backscatterance sensors

    USGS Publications Warehouse

    Schoellhamer, D.H.; Wright, S.A.; Bogen, J.; Fergus, T.; Walling, D.

    2003-01-01

    Optical sensors have been used to measure turbidity and suspended-sediment concentration by many marine and estuarine studies, and optical sensors can provide automated, continuous time series of suspended-sediment concentration and discharge in rivers. Three potential problems with using optical sensors are biological fouling, particle-size variability, and particle-reflectivity variability. Despite varying particle size, output from an optical backscatterance sensor in the Sacramento River at Freeport, California, USA, was calibrated successfully to discharge-weighted, cross-sectionally averaged suspended-sediment concentration, which was measured with the equal discharge-, or width-increment, methods and an isokinetic sampler. A correction for sensor drift was applied to the 3-year time series. However, the calibration of an optical backscatterance sensor used in the Colorado River at Cisco, Utah, USA, was affected by particle-size variability. The adjusted time series at Freeport was used to calculate hourly suspended-sediment discharge that compared well with daily values from a sediment station at Freeport. The appropriateness of using optical sensors in rivers should be evaluated on a site-specific basis and measurement objectives, potential particle size effects, and potential fouling should be considered.

  7. Longest time series of glacier mass changes in the Himalaya based on stereo imagery

    NASA Astrophysics Data System (ADS)

    Bolch, T.; Pieczonka, T.; Benn, D. I.

    2010-12-01

    Mass loss of Himalayan glaciers has wide-ranging consequences such as declining water resources, sea level rise and an increasing risk of glacial lake outburst floods (GLOFs). The assessment of the regional and global impact of glacier changes in the Himalaya is, however, hampered by a lack of mass balance data for most of the range. Multi-temporal digital terrain models (DTMs) allow glacier mass balance to be calculated since the availability of stereo imagery. Here we present the longest time series of mass changes in the Himalaya and show the high value of early stereo spy imagery such as Corona (years 1962 and 1970) aerial images and recent high resolution satellite data (Cartosat-1) to calculate a time series of glacier changes south of Mt. Everest, Nepal. We reveal that the glaciers are significantly losing mass with an increasing rate since at least ~1970, despite thick debris cover. The specific mass loss is 0.32 ± 0.08 m w.e. a-1, however, not higher than the global average. The spatial patterns of surface lowering can be explained by variations in debris-cover thickness, glacier velocity, and ice melt due to exposed ice cliffs and ponds.

  8. Unraveling spurious properties of interaction networks with tailored random networks.

    PubMed

    Bialonski, Stephan; Wendler, Martin; Lehnertz, Klaus

    2011-01-01

    We investigate interaction networks that we derive from multivariate time series with methods frequently employed in diverse scientific fields such as biology, quantitative finance, physics, earth and climate sciences, and the neurosciences. Mimicking experimental situations, we generate time series with finite length and varying frequency content but from independent stochastic processes. Using the correlation coefficient and the maximum cross-correlation, we estimate interdependencies between these time series. With clustering coefficient and average shortest path length, we observe unweighted interaction networks, derived via thresholding the values of interdependence, to possess non-trivial topologies as compared to Erdős-Rényi networks, which would indicate small-world characteristics. These topologies reflect the mostly unavoidable finiteness of the data, which limits the reliability of typically used estimators of signal interdependence. We propose random networks that are tailored to the way interaction networks are derived from empirical data. Through an exemplary investigation of multichannel electroencephalographic recordings of epileptic seizures (known for their complex spatial and temporal dynamics) we show that such random networks help to distinguish network properties of interdependence structures related to seizure dynamics from those spuriously induced by the applied methods of analysis.

  10. Temporal–Spatial Surface Seasonal Mass Changes and Vertical Crustal Deformation in South China Block from GPS and GRACE Measurements

    PubMed Central

    He, Meilin; Shen, Wenbin; Chen, Ruizhi; Ding, Hao; Guo, Guangyi

    2017-01-01

    The solid Earth deforms elastically in response to variations in surface atmospheric, hydrological, and ice/glacier mass loads. Continuous geodetic observations by Global Positioning System (CGPS) stations and the Gravity Recovery and Climate Experiment (GRACE) record such deformations, allowing seasonal and secular mass changes to be estimated. In this paper, we present the seasonal variation of surface mass changes and the vertical crustal deformation in the South China Block (SCB) identified by GPS and GRACE observations with records spanning from 1999 to 2016. We used 33 CGPS stations to construct time series of coordinate changes, which are decomposed by empirical orthogonal functions (EOFs) in the SCB. The average weighted root-mean-square (WRMS) reduction is 38% when we subtract GRACE-modeled vertical displacements from the GPS time series. The first common mode shows clear seasonal changes, indicating seasonal surface mass re-distribution in and around the South China Block. The correlation between the GRACE and GPS time series is analyzed, which provides a reference for further improvement of the seasonal variation of the CGPS time series. Inversion of the GRACE observations yields the surface deformation caused by the surface mass load, at a rate of about −0.4 to −0.8 mm/year, which is used to correct the long-term trend of non-tectonic loading in the GPS vertical velocity field and thus to further explain the crustal tectonic movement in the SCB and its surroundings. PMID:29301236
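
    The quoted 38% WRMS reduction compares the weighted RMS of the GPS series before and after subtracting the GRACE-modeled displacement; a small sketch of the statistic with synthetic numbers:

    import numpy as np

    def wrms(r, sigma):
        """Weighted root-mean-square of residuals r with uncertainties sigma."""
        w = 1.0 / sigma**2
        return np.sqrt(np.sum(w * r**2) / np.sum(w))

    def wrms_reduction(gps, grace_model, sigma):
        """Fractional WRMS reduction after removing the modeled signal."""
        return 1.0 - wrms(gps - grace_model, sigma) / wrms(gps, sigma)

    rng = np.random.default_rng(10)
    t = np.arange(1500)
    seasonal = 4.0 * np.sin(2 * np.pi * t / 365.25)    # mm, loading signal
    gps = seasonal + rng.normal(0, 2.0, t.size)        # GPS vertical residuals
    grace = seasonal + rng.normal(0, 0.5, t.size)      # GRACE-modeled loading
    sigma = np.full(t.size, 2.0)                       # formal errors (mm)
    print(f"WRMS reduction: {wrms_reduction(gps, grace, sigma):.0%}")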

  11. The quasi-biennial vertical oscillations at global GPS stations: identification by ensemble empirical mode decomposition.

    PubMed

    Pan, Yuanjin; Shen, Wen-Bin; Ding, Hao; Hwang, Cheinway; Li, Jin; Zhang, Tengxu

    2015-10-14

    Modeling nonlinear vertical components of a GPS time series is critical to separating sources contributing to mass displacements. Improved vertical precision in GPS positioning at stations for velocity fields is key to resolving the mechanism of certain geophysical phenomena. In this paper, we use ensemble empirical mode decomposition (EEMD) to analyze the daily GPS time series at 89 continuous GPS stations, spanning from 2002 to 2013. EEMD decomposes a GPS time series into different intrinsic mode functions (IMFs), which are used to identify different kinds of signals and secular terms. Our study suggests that the GPS records contain not only the well-known signals (such as semi-annual and annual signals) but also the seldom-noted quasi-biennial oscillations (QBS). The quasi-biennial signals are explained by modeled loadings of atmosphere, non-tidal and hydrology that deform the surface around the GPS stations. In addition, the loadings derived from GRACE gravity changes are also consistent with the quasi-biennial deformations derived from the GPS observations. By removing the modeled components, the weighted root-mean-square (WRMS) variation of the GPS time series is reduced by 7.1% to 42.3%; in particular, after removing the seasonal and QBO signals, the average improvement percentages for seasonal and QBO signals are 25.6% and 7.5%, respectively, suggesting that it is important to account for the QBS signals in GPS records to improve the observed vertical deformations.
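
    Assuming the third-party PyEMD package (installed as EMD-signal), an EEMD decomposition of a daily vertical series might look like the sketch below; the station data, ensemble size, and noise level are placeholders, and the exact settings of the study are not given in the abstract.

    import numpy as np
    from PyEMD import EEMD  # assumes the EMD-signal package is installed

    rng = np.random.default_rng(9)
    t = np.arange(4000)  # ~11 years of daily solutions
    # Synthetic vertical series: annual plus quasi-biennial signals, trend, noise.
    up = (3.0 * np.sin(2 * np.pi * t / 365.25)
          + 1.2 * np.sin(2 * np.pi * t / 800.0)  # ~2.2-year oscillation
          + 0.002 * t + rng.normal(0, 1.5, t.size))

    eemd = EEMD(trials=100, noise_width=0.2)  # ensemble size, noise level
    imfs = eemd.eemd(up)                      # (n_imfs, n_epochs)
    print(imfs.shape)  # inspect the IMFs for the quasi-biennial mode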

  13. Evidence for a fundamental and pervasive shift away from nature-based recreation

    PubMed Central

    Pergams, Oliver R. W.; Zaradic, Patricia A.

    2008-01-01

    After 50 years of steady increase, per capita visits to U.S. National Parks have declined since 1987. To evaluate whether we are seeing a fundamental shift away from people's interest in nature, we tested for similar longitudinal declines in 16 time series representing four classes of nature participation variables: (i) visitation to various types of public lands in the U.S. and National Parks in Japan and Spain, (ii) number of various types of U.S. game licenses issued, (iii) indicators of time spent camping, and (iv) indicators of time spent backpacking or hiking. The four variables with the greatest per capita participation were visits to Japanese National Parks, U.S. State Parks, U.S. National Parks, and U.S. National Forests, with an average individual participating 0.74–2.75 times per year. All four time series are in downtrends, with linear regressions showing ongoing losses of −1.0% to −3.1% per year. The longest and most complete time series tested suggest that typical declines in per capita nature recreation began between 1981 and 1991, are proceeding at rates of −1.0% to −1.3% per year, and total to date −18% to −25%. Spearman correlation analyses were performed on untransformed time series and on transformed percentage year-to-year changes. Results showed very highly significant correlations between many of the highest per capita participation variables in both untransformed and in difference models, further corroborating the general downtrend in nature recreation. In conclusion, all major lines of evidence point to an ongoing and fundamental shift away from nature-based recreation. PMID:18250312

  14. Multi-temporal AirSWOT elevations on the Willamette river: error characterization and algorithm testing

    NASA Astrophysics Data System (ADS)

    Tuozzolo, S.; Frasson, R. P. M.; Durand, M. T.

    2017-12-01

    We analyze a multi-temporal dataset of in-situ and airborne water surface measurements from the March 2015 AirSWOT field campaign on the Willamette River in Western Oregon, which included six days of AirSWOT flights over a 75km stretch of the river. We examine systematic errors associated with dark water and layover effects in the AirSWOT dataset, and test the efficacies of different filtering and spatial averaging techniques at reconstructing the water surface profile. Finally, we generate a spatially-averaged time-series of water surface elevation and water surface slope. These AirSWOT-derived reach-averaged values are ingested in a prospective SWOT discharge algorithm to assess its performance on SWOT-like data collected from a borderline SWOT-measurable river (mean width = 90m).

  15. WaterWatch - Maps, graphs, and tables of current, recent, and past streamflow conditions

    USGS Publications Warehouse

    Jian, Xiaodong; Wolock, David; Lins, Harry F.

    2008-01-01

    WaterWatch (http://water.usgs.gov/waterwatch/) is a U.S. Geological Survey (USGS) World Wide Web site that displays maps, graphs, and tables describing real-time, recent, and past streamflow conditions for the United States. The real-time information generally is updated on an hourly basis. WaterWatch provides streamgage-based maps that show the location of more than 3,000 long-term (30 years or more) USGS streamgages; use colors to represent streamflow conditions compared to historical streamflow; feature a point-and-click interface allowing users to retrieve graphs of stream stage (water elevation) and flow; and highlight locations where extreme hydrologic events, such as floods and droughts, are occurring.

    The streamgage-based maps show streamflow conditions for real-time, average daily, and 7-day average streamflow. The real-time streamflow maps highlight flood and high-flow conditions. The 7-day average streamflow maps highlight below-normal and drought conditions.

    WaterWatch also provides hydrologic unit code (HUC) maps. HUC-based maps are derived from the streamgage-based maps and illustrate streamflow conditions in hydrologic regions. These maps show average streamflow conditions for 1-, 7-, 14-, and 28-day periods, and for monthly average streamflow; highlight regions of low flow or hydrologic drought; and provide historical runoff and streamflow conditions beginning in 1901.

    WaterWatch summarizes streamflow conditions in a region (state or hydrologic unit) in terms of the long-term typical condition at streamgages in the region. Summary tables are provided along with time-series plots that depict variations through time. WaterWatch also includes tables of current streamflow information and locations of flooding.
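    The 7-day average streamflow classification such maps rely on can be sketched as below. This is a deliberate simplification: it pools all historical 7-day means rather than the day-of-year climatology an operational system would use, and the synthetic flow record and the 25th-percentile threshold are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    flow = rng.gamma(shape=2.0, scale=50.0, size=365 * 30)   # 30 yrs of daily flow

    def trailing_mean(x, window=7):
        """Trailing moving average over the given window."""
        kernel = np.ones(window) / window
        return np.convolve(x, kernel, mode="valid")

    q7 = trailing_mean(flow)
    today, historical = q7[-1], q7[:-1]

    # Rank the current 7-day mean against the historical distribution.
    percentile = 100 * (historical < today).mean()
    status = "below normal" if percentile < 25 else "normal or above"
    print(f"current 7-day flow at the {percentile:.0f}th percentile: {status}")
    ```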

  16. A Temporal Mining Framework for Classifying Un-Evenly Spaced Clinical Data: An Approach for Building Effective Clinical Decision-Making System.

    PubMed

    Jane, Nancy Yesudhas; Nehemiah, Khanna Harichandran; Arputharaj, Kannan

    2016-01-01

    Clinical time-series data acquired from electronic health records (EHR) are subject to temporal complexities such as irregular observations, missing values, and time-constrained attributes that make the knowledge discovery process challenging. This paper presents a temporal rough set induced neuro-fuzzy (TRiNF) mining framework that handles these complexities and builds an effective clinical decision-making system. TRiNF provides two functionalities, namely temporal data acquisition (TDA) and temporal classification. In TDA, a time-series forecasting model is constructed by adopting an improved double exponential smoothing method. The forecasting model is used for missing-value imputation and temporal pattern extraction. The relevant attributes are selected using a temporal pattern based rough set approach. In temporal classification, a classification model is built with the selected attributes using a temporal pattern induced neuro-fuzzy classifier. For experimentation, this work uses two clinical time-series datasets, of hepatitis and thrombosis patients. The experimental results show that the proposed TRiNF framework significantly reduces the error rate, achieving average classification accuracies of 92.59% on the hepatitis dataset and 91.69% on the thrombosis dataset. These results demonstrate the effectiveness of the proposed framework in terms of improved classification accuracy.
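    The forecasting core of the TDA step, double exponential smoothing (Holt's linear method), can be sketched as follows. The smoothing constants and the hypothetical lab-value series are illustrative, and the paper's "improved" variant is not reproduced here.

    ```python
    from typing import List, Tuple

    def holt(series: List[float], alpha: float = 0.5, beta: float = 0.3) -> Tuple[float, float]:
        """Return the final level and trend after smoothing the series."""
        level, trend = series[0], series[1] - series[0]
        for x in series[1:]:
            prev_level = level
            level = alpha * x + (1 - alpha) * (level + trend)
            trend = beta * (level - prev_level) + (1 - beta) * trend
        return level, trend

    def forecast(series: List[float], steps: int) -> List[float]:
        """Extrapolate the smoothed level and trend for h steps ahead."""
        level, trend = holt(series)
        return [level + (h + 1) * trend for h in range(steps)]

    lab_values = [7.2, 7.0, 6.9, 6.5, 6.6, 6.3]   # hypothetical clinical series
    print(forecast(lab_values, steps=2))           # impute the next two values
    ```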

  17. Crustal Displacements Due to Continental Water Loading

    NASA Technical Reports Server (NTRS)

    vanDam, T.; Wahr, J.; Milly, P. C. D.; Shmakin, A. B.; Blewitt, G.; Lavallee, D.; Larson, K. M.

    2001-01-01

    The effects of long-wavelength (> 100 km), seasonal variability in continental water storage on vertical crustal motions are assessed. The modeled vertical displacements (Δr_M) have root-mean-square (RMS) values for 1994-1998 as large as 8 mm, with ranges up to 30 mm, and are predominantly annual in character. Regional strains are on the order of 20 nanostrain for tilt and 5 nanostrain for horizontal deformation. We compare Δr_M with observed Global Positioning System (GPS) heights (Δr_O), which include adjustments to remove estimated effects of atmospheric pressure and annual tidal and non-tidal ocean loading, for 147 globally distributed sites. When the Δr_O time series are adjusted by Δr_M, their variances are reduced, on average, by an amount equal to the variance of Δr_M. Of the Δr_O time series exhibiting a strong annual signal, more than half are found to have an annual harmonic that is in phase and of comparable amplitude with the annual harmonic in Δr_M. The Δr_M time series exhibit long-period variations that could be mistaken for secular tectonic trends or post-glacial rebound when observed over a time span of a few years.
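    The variance-reduction comparison and the annual-harmonic fit described above can be sketched as below, with synthetic Δr_M and Δr_O series standing in for the model output and GPS heights; amplitudes, phases, and noise levels are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    t = np.arange(0, 5, 1 / 365.25)                   # 5 years, daily epochs
    dr_M = 4.0 * np.cos(2 * np.pi * t - 0.3)          # modeled displacement, mm
    dr_O = dr_M + rng.normal(0, 3.0, t.size)          # observed = model + noise

    # Adjusting the observations by the model should cut the variance by
    # roughly the variance of the model itself, as the paper reports.
    var_before, var_after = dr_O.var(), (dr_O - dr_M).var()
    print(f"variance reduced by {var_before - var_after:.1f} mm^2 "
          f"(model variance {dr_M.var():.1f} mm^2)")

    # Annual harmonic (amplitude and phase) by linear least squares.
    A = np.column_stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t), np.ones_like(t)])
    c, s, _ = np.linalg.lstsq(A, dr_O, rcond=None)[0]
    print(f"annual amplitude {np.hypot(c, s):.2f} mm, phase {np.arctan2(s, c):.2f} rad")
    ```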

  18. Model calibration criteria for estimating ecological flow characteristics

    USGS Publications Warehouse

    Vis, Marc; Knight, Rodney; Poole, Sandra; Wolfe, William J.; Seibert, Jan; Breuer, Lutz; Kraft, Philipp

    2016-01-01

    Quantification of streamflow characteristics in ungauged catchments remains a challenge. Hydrological modeling is often used to derive flow time series and to calculate streamflow characteristics for subsequent applications that may differ from those envisioned by the modelers. While the estimation of model parameters for ungauged catchments is a challenging research task in itself, it is important to evaluate whether simulated time series preserve critical aspects of the streamflow hydrograph. To address this question, seven calibration objective functions were evaluated for their ability to preserve ecologically relevant streamflow characteristics of the average annual hydrograph using a runoff model, HBV-light, at 27 catchments in the southeastern United States. Calibration trials were repeated 100 times to reduce parameter uncertainty effects on the results, and 12 ecological flow characteristics were computed for comparison. Our results showed that the most suitable calibration strategy varied according to streamflow characteristic. Combined objective functions generally gave the best results, though a clear underprediction bias was observed. The occurrence of low prediction errors for certain combinations of objective function and flow characteristic suggests that (1) incorporating multiple ecological flow characteristics into a single objective function would increase model accuracy, potentially benefitting decision-making processes; and (2) there may be a need to have different objective functions available to address specific applications of the predicted time series.
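    A combined objective function of the kind evaluated above can be sketched as an equal-weight average of Nash-Sutcliffe efficiency (NSE) on flows and on log flows, which balances high- and low-flow fit. The weights, the epsilon guard, and the synthetic series are illustrative assumptions, not the paper's exact formulation.

    ```python
    import numpy as np

    def nse(obs: np.ndarray, sim: np.ndarray) -> float:
        """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the mean."""
        return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def combined_objective(obs: np.ndarray, sim: np.ndarray, eps: float = 0.01) -> float:
        """Equal-weight NSE on flows and log flows."""
        return 0.5 * nse(obs, sim) + 0.5 * nse(np.log(obs + eps), np.log(sim + eps))

    rng = np.random.default_rng(5)
    obs = rng.gamma(2.0, 30.0, 365)                       # synthetic daily flows
    sim = np.clip(obs * rng.normal(1.0, 0.15, obs.size), 0.0, None)  # mock model run
    print(f"combined NSE = {combined_objective(obs, sim):.3f}")
    ```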

  19. Wavelet assessment of cerebrospinal compensatory reserve and cerebrovascular pressure reactivity

    NASA Astrophysics Data System (ADS)

    Latka, M.; Turalska, M.; Kolodziej, W.; Latka, D.; West, B.

    2006-03-01

    We employ complex continuous wavelet transforms to develop a consistent mathematical framework capable of quantifying both cerebrospinal compensatory reserve and cerebrovascular pressure reactivity. The wavelet gain, defined as the frequency-dependent ratio of time-averaged wavelet coefficients of intracranial pressure (ICP) and arterial blood pressure (ABP) fluctuations, characterizes the damping of spontaneous arterial blood pressure oscillations. This gain is introduced as a novel measure of cerebrospinal compensatory reserve. For a group of 10 patients who died as a result of head trauma (Glasgow Outcome Scale, GOS = 1), the average gain calculated at 0.05 Hz is 0.45, significantly exceeding the average gain of 0.24 for 16 patients with favorable outcome (GOS = 2; p = 4×10⁻⁵). We also study the dynamics of the instantaneous phase difference between the fluctuations of the ABP and ICP time series. The time-averaged synchronization index, which depends upon frequency, quantifies the stability of the phase difference and is used as a cerebrovascular pressure-reactivity index. The average phase difference for GOS = 1 is close to zero, in sharp contrast to the mean value of 30° for patients with GOS = 2. We hypothesize that in patients who died, the impairment of cerebral autoregulation is followed by the breakdown of residual pressure reactivity.
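    Both measures can be sketched with a complex Morlet continuous wavelet transform, here via the PyWavelets package (pip install PyWavelets). The synthetic ABP/ICP traces, the wavelet parameterization, and the sampling rate are illustrative stand-ins for patient recordings, not the authors' processing pipeline.

    ```python
    import numpy as np
    import pywt

    fs = 10.0                                      # Hz, sampling rate
    t = np.arange(0, 600, 1 / fs)                  # 10 minutes of data
    rng = np.random.default_rng(6)
    abp = np.sin(2 * np.pi * 0.05 * t) + 0.3 * rng.normal(size=t.size)
    icp = 0.4 * np.sin(2 * np.pi * 0.05 * t - 0.5) + 0.3 * rng.normal(size=t.size)

    # Scale whose pseudo-frequency matches the 0.05 Hz band of interest.
    wavelet = "cmor1.5-1.0"
    scale = fs * pywt.central_frequency(wavelet) / 0.05

    w_abp, _ = pywt.cwt(abp, [scale], wavelet, sampling_period=1 / fs)
    w_icp, _ = pywt.cwt(icp, [scale], wavelet, sampling_period=1 / fs)

    # Wavelet gain: ratio of time-averaged coefficient magnitudes (ICP/ABP).
    gain = np.mean(np.abs(w_icp)) / np.mean(np.abs(w_abp))

    # Synchronization index: stability of the instantaneous phase difference.
    dphi = np.angle(w_icp[0]) - np.angle(w_abp[0])
    sync = np.abs(np.mean(np.exp(1j * dphi)))
    mean_dphi = np.degrees(np.angle(np.mean(np.exp(1j * dphi))))
    print(f"gain = {gain:.2f}, sync index = {sync:.2f}, mean phase diff = {mean_dphi:.1f} deg")
    ```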

  20. Design of Interrogation Protocols for Radiation Dose Measurements Using Optically-Stimulated Luminescent Dosimeters.

    PubMed

    Abraham, Sara A; Kearfott, Kimberlee J; Jawad, Ali H; Boria, Andrew J; Buth, Tobias J; Dawson, Alexander S; Eng, Sheldon C; Frank, Samuel J; Green, Crystal A; Jacobs, Mitchell L; Liu, Kevin; Miklos, Joseph A; Nguyen, Hien; Rafique, Muhammad; Rucinski, Blake D; Smith, Travis; Tan, Yanliang

    2017-03-01

    Optically-stimulated luminescent (OSL) dosimeters can be interrogated multiple times post-irradiation, but each interrogation removes a fraction of the signal stored within the dosimeter. This signal loss must be corrected to avoid systematic errors when estimating the average signal from a series of interrogations, and a minimum number of consecutive readings is required to determine an average signal that is within a desired accuracy of the true signal at a desired statistical confidence. This paper establishes a technical basis for determining the required number of readings for a particular application when using certain OSL dosimetry systems.
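    The depletion correction implied above can be sketched as follows: if each readout removes a fixed fraction f of the stored signal, reading i is rescaled by (1 − f)^(i − 1) before averaging, and the sample variance then fixes the number of readings needed for a target accuracy and confidence. The depletion fraction, reader noise, and 0.5%/95% targets below are hypothetical, not values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    true_signal = 1000.0       # arbitrary counts
    f = 0.002                  # assumed per-read depletion fraction
    n_reads = 10

    # Simulate successive readouts, each seeing a slightly depleted signal.
    reads = np.array([
        rng.normal(true_signal * (1 - f) ** i, 5.0) for i in range(n_reads)
    ])
    corrected = reads / (1 - f) ** np.arange(n_reads)

    # Readings needed so the 95% half-width is within 0.5% of the mean.
    sem_target = 0.005 * corrected.mean() / 1.96
    n_needed = int(np.ceil((corrected.std(ddof=1) / sem_target) ** 2))
    print(f"corrected mean = {corrected.mean():.1f}; ~{n_needed} reads needed")
    ```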
