Computation of canonical correlation and best predictable aspect of future for time series
NASA Technical Reports Server (NTRS)
Pourahmadi, Mohsen; Miamee, A. G.
1989-01-01
The canonical correlation between the (infinite) past and future of a stationary time series is shown to be the limit of the canonical correlation between the (infinite) past and (finite) future, and computation of the latter is reduced to a (generalized) eigenvalue problem involving (finite) matrices. This provides a convenient and essentially finite-dimensional algorithm for computing canonical correlations and components of a time series. An upper bound is conjectured for the largest canonical correlation.
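The finite-dimensional computation described above can be sketched as follows: the canonical correlations between a finite past block and a finite future block are the singular values of Sxx^(-1/2) Sxy Syy^(-1/2) built from sample covariances, an equivalent form of the generalized eigenvalue problem. The AR(1) example and the two-lag windows are illustrative choices, not taken from the paper.

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between column blocks X and Y, obtained as
    singular values of Sxx^{-1/2} Sxy Syy^{-1/2} (an equivalent form of
    the generalized eigenvalue problem)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Sxx, Syy = X.T @ X / len(X), Y.T @ Y / len(Y)
    Sxy = X.T @ Y / len(X)

    def inv_sqrt(S):
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    return np.linalg.svd(inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy),
                         compute_uv=False)

# Illustrative AR(1) series: two past lags vs. two future values.
rng = np.random.default_rng(0)
x = np.zeros(5000)
for t in range(1, 5000):
    x[t] = 0.8 * x[t - 1] + rng.standard_normal()
past = np.column_stack([x[1:-3], x[2:-2]])    # x_{t-2}, x_{t-1}
future = np.column_stack([x[3:-1], x[4:]])    # x_t,   x_{t+1}
rho = canonical_correlations(past, future)    # largest should be near 0.8
```

For an AR(1) process the future depends on the past only through the most recent value, so the largest canonical correlation approaches the autoregressive coefficient.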
Analysis and Forecasting of Shoreline Position
NASA Astrophysics Data System (ADS)
Barton, C. C.; Tebbens, S. F.
2007-12-01
Analysis of historical shoreline positions on sandy coasts, of shorelines in the geologic record, and of sea-level rise curves reveals that the dynamics of the underlying processes produce temporal/spatial signals that exhibit power-law scaling and are therefore self-affine fractals. Self-affine time series signals can be quantified over many orders of magnitude in time and space in terms of persistence, a measure of the degree of correlation between adjacent values in the stochastic portion of a time series. Fractal statistics developed for self-affine time series are used to forecast a probability envelope bounding future shoreline positions. The envelope provides the standard deviation as a function of three variables: persistence, a constant equal to the value of the power spectral density when 1/period equals 1, and the number of time increments. The persistence of a twenty-year time series of the mean-high-water (MHW) shoreline positions was measured for four profiles surveyed at Duck, NC at the Field Research Facility (FRF) by the U.S. Army Corps of Engineers. The four MHW shoreline time series signals are self-affine with persistence ranging between 0.8 and 0.9, which indicates that the shoreline position time series are weakly persistent (where zero is uncorrelated) and have highly varying trends for all time intervals sampled. Forecasts of a probability envelope for future MHW positions are made for the 20 years of record and beyond to 50 years from the start of the data records. The forecasts describe the twenty-year data sets well and indicate that within a 96% confidence envelope, future decadal MHW shoreline excursions should be within 14.6 m of the position at the start of data collection. This is a stable-oscillatory shoreline.
The forecasting method introduced here includes the stochastic portion of the time series. The traditional method of predicting shoreline change, by contrast, reduces the time series to a linear trend line fit to historic shoreline positions and extrapolates it linearly to forecast future positions; its linearly increasing mean breaks the confidence envelope eight years into the future and continues to increase. The traditional method is a poor representation of the observed shoreline position time series and a poor basis for extrapolating future shoreline positions.
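As an illustration (not the authors' exact FRF procedure), the persistence of a self-affine series can be estimated from the log-log slope of its power spectral density, S(f) ∝ f^(−β): white noise gives β ≈ 0, a random walk β ≈ 2. The frequency cutoff below is an illustrative choice.

```python
import numpy as np

def spectral_exponent(x, dt=1.0, fmax=0.1):
    """Estimate beta in S(f) ~ f**(-beta) from the periodogram slope,
    using only low frequencies where the power law is cleanest."""
    x = np.asarray(x, float) - np.mean(x)
    f = np.fft.rfftfreq(len(x), d=dt)[1:]          # drop the zero frequency
    S = np.abs(np.fft.rfft(x))[1:] ** 2            # periodogram
    keep = f <= fmax
    slope, _ = np.polyfit(np.log(f[keep]), np.log(S[keep]), 1)
    return -slope

rng = np.random.default_rng(0)
beta_noise = spectral_exponent(rng.standard_normal(4096))           # near 0
beta_walk = spectral_exponent(np.cumsum(rng.standard_normal(4096))) # near 2
```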
75 FR 47320 - Millington Securities, Inc., et al.; Notice of Application
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-05
... investment. At such time, the Series also will transmit to the Unaffiliated Underlying Fund a list of the... Unit Investment Trusts (the ``Trust''), on behalf of itself and any future series, and any future... by or under common control with the Depositor) and their respective series (the future UITs, together...
How long will the traffic flow time series keep efficacious to forecast the future?
NASA Astrophysics Data System (ADS)
Yuan, PengCheng; Lin, XuXun
2017-02-01
This paper investigates how long a historical traffic flow time series remains effective for forecasting the future. Within this framework, we first collect traffic flow time series data at different granularities. Then, using the modified rescaled range analysis method, we analyze the long-memory property of the traffic flow time series by computing the Hurst exponent. We calculate the long-term memory cycle, test its significance, and compare the result with that of the maximum Lyapunov exponent method. Our results show that both the freeway and the ground-way traffic flow time series demonstrate a positively correlated trend (i.e., have the long-term memory property), and both of their memory cycles are about 30 h. We believe this study is useful for short-term and long-term traffic flow prediction and management.
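The classic rescaled range (R/S) statistic behind the Hurst exponent can be sketched as below; this is the textbook estimator, not the authors' modified version, and the window sizes are illustrative.

```python
import numpy as np

def hurst_rs(x, windows=(16, 32, 64, 128, 256)):
    """Rescaled-range estimate of the Hurst exponent: the mean R/S value
    over windows of size w scales as w**H."""
    x = np.asarray(x, float)
    mean_rs = []
    for w in windows:
        rs = []
        for i in range(len(x) // w):
            seg = x[i * w:(i + 1) * w]
            dev = np.cumsum(seg - seg.mean())      # cumulative deviations
            if seg.std() > 0:
                rs.append((dev.max() - dev.min()) / seg.std())
        mean_rs.append(np.mean(rs))
    H, _ = np.polyfit(np.log(windows), np.log(mean_rs), 1)
    return H

rng = np.random.default_rng(0)
# White noise: H close to 0.5 (small-sample R/S bias pushes it slightly up).
h_noise = hurst_rs(rng.standard_normal(8192))
# A random walk is strongly persistent: H close to 1.
h_walk = hurst_rs(np.cumsum(rng.standard_normal(8192)))
```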
Association mining of dependency between time series
NASA Astrophysics Data System (ADS)
Hafez, Alaaeldin
2001-03-01
Time series analysis is considered a crucial component of strategic control over a broad variety of disciplines in business, science and engineering. Time series data is a sequence of observations collected over intervals of time. Each time series describes a phenomenon as a function of time. Analysis on time series data includes discovering trends (or patterns) in a time series sequence. In the last few years, data mining has emerged and been recognized as a new technology for data analysis. Data mining is the process of discovering potentially valuable patterns, associations, trends, sequences and dependencies in data. Data mining techniques can discover information that many traditional business analysis and statistical techniques fail to deliver. In this paper, we adapt and innovate data mining techniques to analyze time series data. By using data mining techniques, maximal frequent patterns are discovered and used in predicting future sequences or trends, where trends describe the behavior of a sequence. In order to include different types of time series (e.g. irregular and non-systematic), we consider past frequent patterns of the same time sequences (local patterns) and of other dependent time sequences (global patterns). We use the word 'dependent' instead of the word 'similar' for emphasis on real life time series where two time series sequences could be completely different (in values, shapes, etc.), but they still react to the same conditions in a dependent way. In this paper, we propose the Dependence Mining Technique that could be used in predicting time series sequences. The proposed technique consists of three phases: (a) for all time series sequences, generate their trend sequences; (b) discover maximal frequent trend patterns and generate pattern vectors (to keep information about frequent trend patterns); (c) use trend pattern vectors to predict future time series sequences.
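A minimal sketch of the first two phases, symbolising each series as a trend sequence and counting frequent trend patterns; the symbols, window length, and support threshold are illustrative choices, not the paper's exact encoding.

```python
from collections import Counter

def trend_sequence(series, eps=0.0):
    """Map a numeric series to a symbolic trend sequence:
    U(p), D(own), or F(lat) for each consecutive pair."""
    out = []
    for a, b in zip(series, series[1:]):
        d = b - a
        out.append('U' if d > eps else 'D' if d < -eps else 'F')
    return ''.join(out)

def frequent_patterns(trends, length, min_count=2):
    """Count sliding-window subpatterns; keep those meeting min_count."""
    c = Counter(trends[i:i + length] for i in range(len(trends) - length + 1))
    return {p: n for p, n in c.items() if n >= min_count}

ts = [1, 2, 3, 2, 3, 4, 3, 4, 5]
tr = trend_sequence(ts)           # 'UUDUUDUU'
pats = frequent_patterns(tr, 2)   # frequent length-2 trend patterns
```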
A novel time series link prediction method: Learning automata approach
NASA Astrophysics Data System (ADS)
Moradabadi, Behnaz; Meybodi, Mohammad Reza
2017-09-01
Link prediction is a main social network challenge that uses the network structure to predict future links. Common link prediction approaches use a static graph representation, where a snapshot of the network is analyzed to find hidden or future links. For example, similarity-metric-based link prediction is a common traditional approach that calculates a similarity metric for each non-connected link, sorts the links by their similarity metrics, and labels the links with higher similarity scores as future links. Because people's activities in social networks are dynamic and uncertain, and the structure of the networks changes over time, using deterministic graphs for modeling and analysis of social networks may not be appropriate. In the time-series link prediction problem, the time series of link occurrences is used to predict future links. In this paper, we propose a new time series link prediction method based on learning automata. In the proposed algorithm, for each link that must be predicted there is one learning automaton, and each learning automaton tries to predict the existence or non-existence of the corresponding link. To predict the link occurrence at time T, there is a chain consisting of stages 1 through T - 1, and the learning automaton passes through these stages to learn the existence or non-existence of the corresponding link. Our preliminary link prediction experiments with co-authorship and email networks have provided satisfactory results when time series of link occurrences are considered.
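A toy version of the idea, assuming a simple two-action linear reward-inaction (L_RI) automaton per link; the learning rate, occurrence history, and reward rule here are illustrative assumptions, not the authors' exact scheme.

```python
import random

class LinkAutomaton:
    """Two-action (absent / present) linear reward-inaction automaton:
    on reward, the chosen action's probability moves toward 1;
    on penalty, nothing changes (the 'inaction' part)."""
    def __init__(self, a=0.2):
        self.p = [0.5, 0.5]   # P(predict absent), P(predict present)
        self.a = a            # learning rate

    def choose(self, rng):
        return 1 if rng.random() < self.p[1] else 0

    def reward(self, action):
        self.p[action] += self.a * (1 - self.p[action])
        self.p[1 - action] = 1 - self.p[action]

rng = random.Random(0)
history = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]   # hypothetical link occurrences
la = LinkAutomaton()
for obs in history:
    act = la.choose(rng)
    if act == obs:          # correct prediction -> reward; otherwise ignore
        la.reward(act)
prediction = int(la.p[1] > 0.5)
```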
Sea change: Charting the course for biogeochemical ocean time-series research in a new millennium
NASA Astrophysics Data System (ADS)
Church, Matthew J.; Lomas, Michael W.; Muller-Karger, Frank
2013-09-01
Ocean time-series provide vital information needed for assessing ecosystem change. This paper summarizes the historical context, major program objectives, and future research priorities for three contemporary ocean time-series programs: The Hawaii Ocean Time-series (HOT), the Bermuda Atlantic Time-series Study (BATS), and the CARIACO Ocean Time-Series. These three programs operate in physically and biogeochemically distinct regions of the world's oceans, with HOT and BATS located in the open-ocean waters of the subtropical North Pacific and North Atlantic, respectively, and CARIACO situated in the anoxic Cariaco Basin of the tropical Atlantic. All three programs sustain near-monthly shipboard occupations of their field sampling sites, with HOT and BATS beginning in 1988, and CARIACO initiated in 1996. The resulting data provide some of the only multi-disciplinary, decadal-scale determinations of time-varying ecosystem change in the global ocean. Facilitated by a scoping workshop (September 2010) sponsored by the Ocean Carbon Biogeochemistry (OCB) program, leaders of these time-series programs sought community input on existing program strengths and on future research directions. Themes that emerged from these discussions included: 1. Shipboard time-series programs are key to informing our understanding of the connectivity between changes in ocean-climate and biogeochemistry. 2. The scientific and logistical support provided by shipboard time-series programs forms the backbone for numerous research and education programs. Future studies should be encouraged that seek mechanistic understanding of ecological interactions underlying the biogeochemical dynamics at these sites. 3. Detecting time-varying trends in ocean properties and processes requires consistent, high-quality measurements. Time-series must carefully document analytical procedures and, where possible, trace the accuracy of analyses to certified standards and internal reference materials. 4. Leveraged implementation, testing, and validation of autonomous and remote observing technologies at time-series sites provide new insights into spatiotemporal variability underlying ecosystem changes. 5. The value of existing time-series data for formulating and validating ecosystem models should be promoted. In summary, the scientific underpinnings of ocean time-series programs remain as strong and important today as when these programs were initiated. The emerging data inform our knowledge of the ocean's biogeochemistry and ecology, and improve our predictive capacity about planetary change.
Trading with the Future and Futures Trading. Series on Public Issues No. 14.
ERIC Educational Resources Information Center
Auernheimer, Leonardo
In this booklet, one of a series intended to apply economic principles to major social and political issues of the day, it is proposed that speculation is often misunderstood, particularly in the operation of the futures markets. These are markets in which obligations to consummate sales and purchases at some time in the future are traded at a…
The Hurst exponent in energy futures prices
NASA Astrophysics Data System (ADS)
Serletis, Apostolos; Rosenberg, Aryeh Adam
2007-07-01
This paper extends the work in Elder and Serletis [Long memory in energy futures prices, Rev. Financial Econ., forthcoming, 2007] and Serletis et al. [Detrended fluctuation analysis of the US stock market, Int. J. Bifurcation Chaos, forthcoming, 2007] by re-examining the empirical evidence for random-walk-type behavior in energy futures prices. In doing so, it uses daily data on energy futures traded on the New York Mercantile Exchange over the period from July 2, 1990 to November 1, 2006, and a statistical physics approach, the 'detrending moving average' technique, which provides a reliable framework for testing informational efficiency in financial markets, as shown by Alessio et al. [Second-order moving average and scaling of stochastic time series, Eur. Phys. J. B 27 (2002) 197-200] and Carbone et al. [Time-dependent Hurst exponent in financial time series, Physica A 344 (2004) 267-271; Analysis of clusters formed by the moving average of a long-range correlated time series, Phys. Rev. E 69 (2004) 026105]. The results show that energy futures returns display long memory and that the particular form of long memory is anti-persistence.
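The 'detrending moving average' idea can be sketched as follows: the rms deviation of a self-affine profile from its n-point backward moving average scales as n^H, and the log-log slope gives the Hurst exponent. The random-walk test series and window sizes are illustrative, not the paper's data.

```python
import numpy as np

def dma_hurst(y, windows=(4, 8, 16, 32, 64, 128)):
    """Detrending-moving-average estimate of the Hurst exponent H:
    rms(y - MA_n(y)) ~ n**H for a self-affine profile y."""
    y = np.asarray(y, float)
    sigma = []
    for n in windows:
        ma = np.convolve(y, np.ones(n) / n, mode='valid')  # n-point MA
        # Align each point with the MA of the window ending at it.
        sigma.append(np.sqrt(np.mean((y[n - 1:] - ma) ** 2)))
    h, _ = np.polyfit(np.log(windows), np.log(sigma), 1)
    return h

rng = np.random.default_rng(1)
walk = np.cumsum(rng.standard_normal(20000))  # Brownian profile, H = 0.5
h = dma_hurst(walk)
```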
NASA Astrophysics Data System (ADS)
Du, Kongchang; Zhao, Ying; Lei, Jiaqiang
2017-09-01
In hydrological time series prediction, singular spectrum analysis (SSA) and discrete wavelet transform (DWT) are widely used as preprocessing techniques for artificial neural network (ANN) and support vector machine (SVM) predictors. These hybrid or ensemble models seem to largely reduce the prediction error. In the current literature, researchers apply these techniques to the whole observed time series and then use the resulting set of reconstructed or decomposed time series as inputs to the ANN or SVM. However, through two comparative experiments and mathematical deduction, we found that this usage of SSA and DWT in building hybrid models is incorrect. Since SSA and DWT adopt 'future' values to perform the calculation, the series generated by SSA reconstruction or DWT decomposition contain information from 'future' values. These hybrid models therefore report misleadingly 'high' prediction performance and may cause large errors in practice.
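The leakage can be demonstrated without SSA or DWT machinery at all: any filter applied to the whole series, such as a centered moving average (and likewise whole-series SSA reconstruction or DWT decomposition), mixes future samples into past outputs, while a causal filter does not. The shock location and filter width below are arbitrary.

```python
import numpy as np

def centered_smooth(x, k=5):
    """Non-causal smoother: each output mixes (k-1)//2 future samples."""
    return np.convolve(x, np.ones(k) / k, mode='same')

def causal_smooth(x, k=5):
    """Causal smoother: uses only past and present samples."""
    return np.array([x[max(0, t - k + 1):t + 1].mean()
                     for t in range(len(x))])

rng = np.random.default_rng(0)
clean = rng.standard_normal(1000)
shocked = clean.copy()
shocked[500] += 10.0                    # a "future" event at t = 500
# At t = 498 the causal output is unchanged, but the centered output
# already reflects the shock two steps before it happens:
leak = centered_smooth(shocked)[498] - centered_smooth(clean)[498]
no_leak = causal_smooth(shocked)[498] - causal_smooth(clean)[498]
```

A predictor trained on the centered output at t = 498 has implicitly seen the value at t = 500, which is exactly the flaw the authors identify.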
A Review of Subsequence Time Series Clustering
Teh, Ying Wah
2014-01-01
Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The literature reviewed is categorized into three groups: the preproof, interproof, and postproof periods. Moreover, various state-of-the-art approaches to performing subsequence time series clustering are discussed under each of these categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies. PMID:25140332
A review of subsequence time series clustering.
Zolhavarieh, Seyedjamal; Aghabozorgi, Saeed; Teh, Ying Wah
2014-01-01
Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The literature reviewed is categorized into three groups: the preproof, interproof, and postproof periods. Moreover, various state-of-the-art approaches to performing subsequence time series clustering are discussed under each of these categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies.
Applications of physical methods in high-frequency futures markets
NASA Astrophysics Data System (ADS)
Bartolozzi, M.; Mellen, C.; Chan, F.; Oliver, D.; Di Matteo, T.; Aste, T.
2007-12-01
In the present work we demonstrate the application of different physical methods to high-frequency, or tick-by-tick, financial time series data. In particular, we calculate the Hurst exponent and inverse statistics for price time series taken from a range of futures indices. Additionally, we show that in a limit order book the relaxation times of an imbalanced book state, with more demand or supply, can be described by stretched exponential laws analogous to those seen in many physical systems.
Bendel, David; Beck, Ferdinand; Dittmer, Ulrich
2013-01-01
In the present study, climate change impacts on combined sewer overflows (CSOs) in Baden-Wuerttemberg, Southern Germany, were assessed based on continuous long-term rainfall-runoff simulations. As input data, synthetic rainfall time series were used. The applied precipitation generator NiedSim-Klima accounts for climate change effects on precipitation patterns. Time series for the past (1961-1990) and future (2041-2050) were generated for various locations. Comparing the simulated CSO activity of the two periods, we observe significantly higher overflow frequencies for the future. Changes in overflow volume and overflow duration depend on the type of overflow structure. Both values will increase at simple CSO structures that merely divide the flow, whereas they will decrease when the CSO structure is combined with a storage tank. However, there is wide variation between the results of different precipitation time series (representative of different locations).
Zheng, Zeyu; Yamasaki, Kazuko; Tenenbaum, Joel N; Stanley, H Eugene
2013-01-01
In a highly interdependent economic world, the nature of relationships between financial entities is becoming an increasingly important area of study. Recently, many studies have shown the usefulness of minimal spanning trees (MST) in extracting interactions between financial entities. Here, we propose a modified MST network whose metric distance is defined in terms of cross-correlation coefficient absolute values, enabling the connections between anticorrelated entities to manifest properly. We investigate 69 daily time series, comprising three types of financial assets: 28 stock market indicators, 21 currency futures, and 20 commodity futures. We show that though the resulting MST network evolves over time, the financial assets of similar type tend to have connections which are stable over time. In addition, we find a characteristic time lag between the volatility time series of the stock market indicators and those of the EU CO(2) emission allowance (EUA) and crude oil futures (WTI). This time lag is given by the peak of the cross-correlation function of the volatility time series EUA (or WTI) with that of the stock market indicators, and is markedly different (>20 days) from 0, showing that the volatility of stock market indicators today can predict the volatility of EU emissions allowances and of crude oil in the near future.
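The characteristic lag described above is simply the location of the cross-correlation peak, which can be computed directly; the sketch below uses synthetic series in which one copy leads the other by a known 5 samples (illustrative data, not the EUA/WTI volatility series).

```python
import numpy as np

def lead_lag(a, b, max_lag=30):
    """Lag (in samples) at which corr(a[t], b[t + lag]) peaks;
    a positive result means a leads b by that many samples."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    lags = list(range(-max_lag, max_lag + 1))
    cc = [np.mean(a[max(0, -k):len(a) - max(0, k)] *
                  b[max(0, k):len(b) - max(0, -k)])
          for k in lags]
    return lags[int(np.argmax(cc))]

rng = np.random.default_rng(0)
base = rng.standard_normal(2000)
leader = base[5:]     # sees the common signal 5 steps earlier
lagger = base[:-5]
shift = lead_lag(leader, lagger)
```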
NASA Astrophysics Data System (ADS)
Zheng, Zeyu; Yamasaki, Kazuko; Tenenbaum, Joel N.; Stanley, H. Eugene
2013-01-01
In a highly interdependent economic world, the nature of relationships between financial entities is becoming an increasingly important area of study. Recently, many studies have shown the usefulness of minimal spanning trees (MST) in extracting interactions between financial entities. Here, we propose a modified MST network whose metric distance is defined in terms of cross-correlation coefficient absolute values, enabling the connections between anticorrelated entities to manifest properly. We investigate 69 daily time series, comprising three types of financial assets: 28 stock market indicators, 21 currency futures, and 20 commodity futures. We show that though the resulting MST network evolves over time, the financial assets of similar type tend to have connections which are stable over time. In addition, we find a characteristic time lag between the volatility time series of the stock market indicators and those of the EU CO2 emission allowance (EUA) and crude oil futures (WTI). This time lag is given by the peak of the cross-correlation function of the volatility time series EUA (or WTI) with that of the stock market indicators, and is markedly different (>20 days) from 0, showing that the volatility of stock market indicators today can predict the volatility of EU emissions allowances and of crude oil in the near future.
A Study on Predictive Analytics Application to Ship Machinery Maintenance
2013-09-01
Looking at the nature of the time series forecasting method, it would be better applied to offline analysis. The application for real-time online...other system attributes in future. Two techniques of statistical analysis, mainly time series models and cumulative sum control charts, are discussed in...statistical tool employed for the two techniques of statistical analysis. Both time series forecasting as well as CUSUM control charts are shown to be
miniSEED: The Backbone Data Format for Seismological Time Series
NASA Astrophysics Data System (ADS)
Ahern, T. K.; Benson, R. B.; Trabant, C. M.
2017-12-01
In 1987, the International Federation of Digital Seismograph Networks (FDSN), adopted the Standard for the Exchange of Earthquake Data (SEED) format to be used for data archiving and exchange of seismological time series data. Since that time, the format has evolved to accommodate new capabilities and features. For example, a notable change in 1992 allowed the format, which includes both the comprehensive metadata and the time series samples, to be used in two additional forms: 1) a container for metadata only, called "dataless SEED", and 2) a stand-alone structure for time series, called "miniSEED". While specifically designed for seismological data and related metadata, this format has proven to be a useful format for a wide variety of geophysical time series data. Many FDSN data centers now store temperature, pressure, infrasound, tilt and other time series measurements in this internationally used format. Since April 2016, members of the FDSN have been in discussions to design a next generation miniSEED format to accommodate current and future needs, to further generalize the format, and to address a number of historical problems or limitations. We believe the correct approach is to simplify the header, allow for arbitrary header additions, expand the current identifiers, and allow for anticipated future identifiers which are currently unknown. We also believe the primary goal of the format is for efficient archiving, selection and exchange of time series data. By focusing on these goals we avoid trying to generalize the format too broadly into specialized areas such as efficient, low-latency delivery, or including unbounded non-time series data. Our presentation will provide an overview of this format and highlight its most valuable characteristics for time series data from any geophysical domain or beyond.
Code of Federal Regulations, 2011 CFR
2011-10-01
... recipient in the market. Present value means the value at the time of calculation of a future payment, or series of future payments discounted by the time value of money as represented by an interest rate or...
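The definition quoted above amounts to the standard discounting formula PV = Σ p_t / (1 + r)^t; a minimal sketch:

```python
def present_value(payments, rate):
    """Present value of a series of future payments, each discounted by
    the time value of money at interest rate `rate` per period."""
    return sum(p / (1 + rate) ** t for t, p in enumerate(payments, start=1))

# Three annual payments of 100 discounted at 5%:
pv = present_value([100.0, 100.0, 100.0], 0.05)   # about 272.32
```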
Non-linear forecasting in high-frequency financial time series
NASA Astrophysics Data System (ADS)
Strozzi, F.; Zaldívar, J. M.
2005-08-01
A new methodology based on state space reconstruction techniques has been developed for trading in financial markets. The methodology has been tested using 18 high-frequency foreign exchange time series. The results are in apparent contradiction with the efficient market hypothesis, which states that no profitable information about future movements can be obtained by studying the past price series. In our (off-line) analysis, positive gain may be obtained in all those series. The trading methodology is quite general and may be adapted to other financial time series. Finally, the steps for its on-line application are discussed.
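State space reconstruction typically starts from a Takens delay-coordinate embedding; a minimal sketch, with an arbitrary embedding dimension and delay (the trading rules built on top of the reconstruction are not reproduced here):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Delay-coordinate embedding: row t is
    (x[t], x[t + tau], ..., x[t + (dim - 1) * tau]),
    the standard reconstruction of the state space from one series."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

E = delay_embed(np.arange(10.0), dim=3, tau=2)   # 6 state vectors in R^3
```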
Future heat stress arising from climate change on Iran's population health.
Modarres, Reza; Ghadami, Mohammad; Naderi, Sohrab; Naderi, Mohammad
2018-04-05
Climate change-induced extreme heat events are becoming a major issue in different parts of the world, especially in developing countries. The assessment of regional and temporal past and future change in heat waves is a crucial task for public health strategies and management. The historical and future heat index (HI) time series are investigated for temporal change across Iran to study the impact of global warming on public health. The heat index is calculated, and a nonparametric trend assessment is carried out for the historical time series (1981-2010). The future change in heat index is also projected for the 2020-2049 and 2070-2099 periods. A rise in the historical heat index and extreme caution conditions for the summer and spring seasons over major parts of Iran are notable for the historical (1981-2010) series in this study. Different climate change scenarios show that the heat index will exceed the critical threshold for human adaptability in the future across the country. The impact of climate change on heat index risk in Iran is significant in the future. To cope with this situation, developing early warning systems and health care strategies to deal with population growth and remarkable socio-economic features in the future is essential.
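For reference, one widely used heat index formulation is the NOAA/Rothfusz regression in degrees Fahrenheit; the paper does not state which HI formula it uses, so this particular choice is an assumption for illustration (the regression is valid roughly above 80 °F and 40% relative humidity).

```python
def heat_index_f(T, RH):
    """NOAA/Rothfusz heat index regression (T in deg F, RH in percent)."""
    return (-42.379 + 2.04901523 * T + 10.14333127 * RH
            - 0.22475541 * T * RH - 6.83783e-3 * T * T
            - 5.481717e-2 * RH * RH + 1.22874e-3 * T * T * RH
            + 8.5282e-4 * T * RH * RH - 1.99e-6 * T * T * RH * RH)

hi = heat_index_f(90.0, 70.0)   # a hot, humid afternoon: about 106 F
```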
Future heat stress arising from climate change on Iran's population health
NASA Astrophysics Data System (ADS)
Modarres, Reza; Ghadami, Mohammad; Naderi, Sohrab; Naderi, Mohammad
2018-04-01
Climate change-induced extreme heat events are becoming a major issue in different parts of the world, especially in developing countries. The assessment of regional and temporal past and future change in heat waves is a crucial task for public health strategies and managements. The historical and future heat index (HI) time series are investigated for temporal change across Iran to study the impact of global warming on public health. The heat index is calculated, and the nonparametric trend assessment is carried out for historical time series (1981-2010). The future change in heat index is also projected for 2020-2049 and 2070-2099 periods. A rise in the historical heat index and extreme caution conditions for summer and spring seasons for major parts of Iran are notable for historical (1981-2010) series in this study. Using different climate change scenarios shows that heat index will exceed the critical threshold for human adaptability in the future in the country. The impact of climate change on heat index risk in Iran is significant in the future. To cope with this crucial situation, developing early warning systems and health care strategies to deal with population growth and remarkable socio-economic features in future is essential.
NASA Astrophysics Data System (ADS)
Fatichi, S.; Ivanov, V. Y.; Caporali, E.
2013-04-01
This study extends a stochastic downscaling methodology to generation of an ensemble of hourly time series of meteorological variables that express possible future climate conditions at a point-scale. The stochastic downscaling uses general circulation model (GCM) realizations and an hourly weather generator, the Advanced WEather GENerator (AWE-GEN). Marginal distributions of factors of change are computed for several climate statistics using a Bayesian methodology that can weight GCM realizations based on the model relative performance with respect to a historical climate and a degree of disagreement in projecting future conditions. A Monte Carlo technique is used to sample the factors of change from their respective marginal distributions. As a comparison with traditional approaches, factors of change are also estimated by averaging GCM realizations. With either approach, the derived factors of change are applied to the climate statistics inferred from historical observations to re-evaluate parameters of the weather generator. The re-parameterized generator yields hourly time series of meteorological variables that can be considered to be representative of future climate conditions. In this study, the time series are generated in an ensemble mode to fully reflect the uncertainty of GCM projections, climate stochasticity, as well as uncertainties of the downscaling procedure. Applications of the methodology in reproducing future climate conditions for the periods of 2000-2009, 2046-2065 and 2081-2100, using the period of 1962-1992 as the historical baseline are discussed for the location of Firenze (Italy). The inferences of the methodology for the period of 2000-2009 are tested against observations to assess reliability of the stochastic downscaling procedure in reproducing statistics of meteorological variables at different time scales.
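The core 'factor of change' step, perturbing historically inferred climate statistics before re-parameterizing the weather generator, can be sketched as below. The statistic names, values, and the additive-versus-multiplicative split are hypothetical placeholders, not AWE-GEN's actual parameter set.

```python
def apply_factors_of_change(hist_stats, factors, additive=('mean_T',)):
    """Perturb historical climate statistics with GCM-derived factors of
    change: additive for temperature-like statistics, multiplicative
    for precipitation-like ones."""
    return {k: v + factors[k] if k in additive else v * factors[k]
            for k, v in hist_stats.items()}

hist = {'mean_T': 14.2, 'mean_P': 2.6}      # hypothetical observed statistics
fc = {'mean_T': 1.8, 'mean_P': 0.92}        # hypothetical factors of change
future = apply_factors_of_change(hist, fc)  # inputs to re-parameterization
```

In the ensemble mode described above, the factors themselves would be drawn repeatedly from their marginal distributions rather than fixed as here.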
Statistical regularities of Carbon emission trading market: Evidence from European Union allowances
NASA Astrophysics Data System (ADS)
Zheng, Zeyu; Xiao, Rui; Shi, Haibo; Li, Guihong; Zhou, Xiaofeng
2015-05-01
As an emerging financial market, the carbon emission trading market has seen its trading value increase markedly. In recent years, carbon emission allowances have become a form of investment: they are bought and sold not only by carbon emitters but also by investors. In this paper, we analyzed the price fluctuations of European Union allowances (EUA) futures in the European Climate Exchange (ECX) market from 2007 to 2011. The probability density function of the return time series is symmetric with power-law tails. We found that there are only short-range correlations in price changes (returns), but long-range correlations in the absolute values of price changes (volatility). Further, the detrended fluctuation analysis (DFA) approach was applied with a focus on long-range autocorrelations and the Hurst exponent. We observed long-range power-law autocorrelations in the volatility, which quantifies risk, and found that they decay much more slowly than the autocorrelations of the return time series. Our analysis also showed that significant cross-correlations exist between the EUA return time series and many other return series, across a wide range of fields including stock markets, energy-related commodity futures, and financial futures. The significant cross-correlations between energy-related futures and EUA reflect the physical relationship between carbon emission and the energy production process. Additionally, the cross-correlations between financial futures and EUA indicate that speculation may have become an important factor affecting the price of EUA. Finally, we modeled the long-range volatility time series of EUA with a particular version of the GARCH process, and the result also suggests long-range volatility autocorrelations.
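A compact version of the DFA procedure mentioned above: integrate the series, detrend it linearly in windows of size n, and read the scaling exponent α from the rms fluctuation F(n) ∝ n^α (α ≈ 0.5 for uncorrelated data). The window sizes and test data are illustrative.

```python
import numpy as np

def dfa(x, windows=(8, 16, 32, 64, 128)):
    """First-order detrended fluctuation analysis: the rms fluctuation
    F(n) of the integrated series around per-window linear fits scales
    as n**alpha."""
    y = np.cumsum(np.asarray(x, float) - np.mean(x))   # profile
    F = []
    for n in windows:
        t = np.arange(n)
        resid2 = []
        for i in range(len(y) // n):
            seg = y[i * n:(i + 1) * n]
            coef = np.polyfit(t, seg, 1)               # local linear trend
            resid2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(resid2)))
    alpha, _ = np.polyfit(np.log(windows), np.log(F), 1)
    return alpha

rng = np.random.default_rng(0)
alpha_noise = dfa(rng.standard_normal(16384))   # near 0.5: no long memory
```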
The Recalibrated Sunspot Number: Impact on Solar Cycle Predictions
NASA Astrophysics Data System (ADS)
Clette, F.; Lefevre, L.
2017-12-01
Recently, and for the first time since their creation, the sunspot number and group number series were entirely revisited, and a first fully recalibrated version was officially released in July 2015 by the World Data Center SILSO (Brussels). Those reference long-term series are widely used as input data or as a calibration reference by various solar cycle prediction methods. Therefore, past predictions may now need to be redone using the new sunspot series, and methods already used for predicting cycle 24 will require adaptation before attempting predictions of the next cycles. In order to clarify the nature of the applied changes, we describe the different corrections applied to the sunspot and group number series, which affect extended time periods and can reach up to 40%. While some changes simply involve constant scale factors, other corrections vary with time or follow the solar cycle modulation. Depending on the prediction method and on the selected time interval, this can lead to different responses and biases. Moreover, together with the new series, standard error estimates are progressively being added to the new sunspot numbers, which may help derive more accurate uncertainties for predicted activity indices. We conclude on the new round of recalibration now undertaken in the framework of a broad multi-team collaboration articulated around upcoming ISSI workshops, and we outline the corrections that can still be expected in the future as part of a permanent upgrading and quality control process. From now on, sunspot-based predictive models should be made more adaptable, and regular updates of predictions should become common practice in order to track periodic upgrades of the sunspot number series, as is done with other modern solar observational series.
Dou, Chao
2016-01-01
The storage volume of an internet data center is a classic example of a time series, and predicting it is valuable for business planning. However, the storage volume series from a data center is always "dirty," containing noise, missing data, and outliers, so it is necessary to extract the main trend of the series before further prediction processing. In this paper, we propose an irregular sampling estimation method to extract the main trend of the time series, in which a Kalman filter is used to remove the "dirty" data; cubic spline interpolation and averaging are then used to reconstruct the main trend. The developed method is applied to the storage volume series of an internet data center. The experimental results show that the method estimates the main trend of the storage volume series accurately and contributes greatly to predicting future volume values. PMID:28090205
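The extraction pipeline described (Kalman filtering to suppress "dirty" samples, then reconstructing the trend) can be sketched roughly as follows. This is a simplified local-level filter of my own devising, not the paper's implementation: it bridges missing values with the model prediction and gates outliers by innovation size, whereas the paper uses cubic spline interpolation and averaging for the reconstruction step:

```python
import numpy as np

def kalman_trend(z, q=1e-2, r=0.25, gate=3.0):
    """Local-level Kalman filter. Missing samples (NaN) are bridged by the
    prediction; points whose innovation exceeds `gate` standard deviations
    are treated as outliers and skipped."""
    x = float(z[np.isfinite(z)][0])            # state estimate
    p = 1.0                                    # state variance
    out = np.empty(len(z))
    for k, zk in enumerate(z):
        p += q                                 # predict step (random-walk level)
        if np.isfinite(zk) and (zk - x) ** 2 <= gate ** 2 * (p + r):
            g = p / (p + r)                    # Kalman gain
            x += g * (zk - x)                  # update step
            p *= 1.0 - g
        out[k] = x
    return out

rng = np.random.default_rng(1)
t = np.arange(300.0)
clean = 50.0 + 0.1 * t                         # slowly growing storage volume
z = clean + rng.normal(0.0, 0.5, 300)
z[::37] = np.nan                               # dropped samples
z[100] += 40.0                                 # one gross outlier
trend = kalman_trend(z)
```

The gate rejects the injected outlier, so the recovered trend stays close to the clean ramp despite the missing and corrupted samples.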
Time series analysis of monthly pulpwood use in the Northeast
James T. Bones
1980-01-01
Time series analysis was used to develop a model that depicts pulpwood use in the Northeast. The model is useful in forecasting future pulpwood requirements (short term) or monitoring pulpwood-use activity in relation to past use patterns. The model predicted a downturn in use during 1980.
Prediction of mortality rates using a model with stochastic parameters
NASA Astrophysics Data System (ADS)
Tan, Chon Sern; Pooi, Ah Hin
2016-10-01
Prediction of future mortality rates is crucial to insurance companies because they face longevity risks while providing retirement benefits to a population whose life expectancy is increasing. In previous literature, a time series model based on the multivariate power-normal distribution was applied to mortality data from the United States for the years 1933 to 2000 in order to forecast mortality rates for the years 2001 to 2010. In this paper, a more dynamic approach based on the multivariate time series is proposed, in which the model uses stochastic parameters that vary with time. The resulting prediction intervals perform better because, apart from covering the observed future mortality rates well, they also tend to have distinctly shorter interval lengths.
Code of Federal Regulations, 2010 CFR
2010-10-01
... eligible for capital assistance. Capital assistance means Federal financial assistance for capital projects... recipient in the market. Present value means the value at the time of calculation of a future payment, or series of future payments discounted by the time value of money as represented by an interest rate or...
NASA Technical Reports Server (NTRS)
Hill, Emma M.; Ponte, Rui M.; Davis, James L.
2007-01-01
Comparison of monthly mean tide-gauge time series to corresponding model time series based on a static inverted barometer (IB) for pressure-driven fluctuations and an ocean general circulation model (OM) reveals that the combined model successfully reproduces seasonal and interannual changes in relative sea level at many stations. Removal of the OM and IB from the tide-gauge record produces residual time series with a mean global variance reduction of 53%. The OM is mis-scaled for certain regions, and 68% of the residual time series contain significant seasonal variability after removal of the OM and IB from the tide-gauge data. Including OM admittance parameters and seasonal coefficients in a regression model for each station, with IB also removed, produces residual time series with a mean global variance reduction of 71%. Examination of the regional improvement in variance caused by scaling the OM, including seasonal terms, or both, indicates weakness in the model at predicting sea-level variation for constricted ocean regions. The model is particularly effective at reproducing sea-level variation for stations in North America, Europe, and Japan. The RMS residual for many stations in these areas is 25-35 mm. The production of "cleaner" tide-gauge time series, with oceanographic variability removed, is important for future analysis of nonsecular and regionally differing sea-level variations. Understanding the ocean model's strengths and weaknesses will allow for future improvements of the model.
Evaluating the uncertainty of predicting future climate time series at the hourly time scale
NASA Astrophysics Data System (ADS)
Caporali, E.; Fatichi, S.; Ivanov, V. Y.
2011-12-01
A stochastic downscaling methodology is developed to generate hourly, point-scale time series for several meteorological variables, such as precipitation, cloud cover, shortwave radiation, air temperature, relative humidity, wind speed, and atmospheric pressure. The methodology uses multi-model General Circulation Model (GCM) realizations and an hourly weather generator, AWE-GEN. Probabilistic descriptions of factors of change (a measure of climate change with respect to historic conditions) are computed for several climate statistics and different aggregation times using a Bayesian approach that weights the individual GCM contributions. The Monte Carlo method is applied to sample the factors of change from their respective distributions, thereby permitting the generation of time series in an ensemble fashion, which reflects the uncertainty of future climate projections as well as the uncertainty of the downscaling procedure. Applications of the methodology and probabilistic expressions of certainty in reproducing future climates for the periods 2000-2009, 2046-2065 and 2081-2100, using the 1962-1992 period as the baseline, are discussed for the location of Firenze (Italy). The climate predictions for the period 2000-2009 are tested against observations, permitting assessment of the reliability and uncertainties of the methodology in reproducing statistics of meteorological variables at different time scales.
Bivariate analysis of floods in climate impact assessments.
Brunner, Manuela Irene; Sikorska, Anna E; Seibert, Jan
2018-03-01
Climate impact studies regarding floods usually focus on peak discharges, and a bivariate assessment of peak discharges and hydrograph volumes is not commonly included. A joint consideration of peak discharges and hydrograph volumes, however, is crucial when assessing flood risks for current and future climate conditions. Here, we present a methodology to develop synthetic design hydrographs for future climate conditions that jointly consider peak discharges and hydrograph volumes. First, change factors are derived based on a regional climate model and are applied to observed precipitation and temperature time series. Second, the modified time series are fed into a calibrated hydrological model to simulate runoff time series for future conditions. Third, these time series are used to construct synthetic design hydrographs. The bivariate flood frequency analysis used in the construction of synthetic design hydrographs takes into account the dependence between peak discharges and hydrograph volumes, and represents the shape of the hydrograph. The latter is modeled using a probability density function, while the dependence between the design variables peak discharge and hydrograph volume is modeled using a copula. We applied this approach to a set of eight mountainous catchments in Switzerland to construct catchment-specific and season-specific design hydrographs for a control and three scenario climates. Our work demonstrates that projected climate changes have an impact not only on peak discharges but also on hydrograph volumes and on hydrograph shapes, at both annual and seasonal scales. These changes are not necessarily proportional, which implies that climate impact assessments on future floods should consider more flood characteristics than just flood peaks. Copyright © 2017. Published by Elsevier B.V.
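The copula construction at the heart of the bivariate analysis can be illustrated with a small sketch. The Gaussian copula and Gumbel margins below are generic stand-ins (the abstract does not specify these choices), and all parameter values are invented for illustration:

```python
import math
import numpy as np

def gumbel_ppf(u, loc, scale):
    """Inverse CDF of a Gumbel distribution, a common flood-frequency margin."""
    return loc - scale * np.log(-np.log(u))

def sample_peak_volume(n, rho, rng):
    """Gaussian copula: correlate standard normals, map them to uniforms via
    the normal CDF, then push each uniform through its marginal inverse CDF."""
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
    u = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))  # normal CDF
    peaks = gumbel_ppf(u[:, 0], loc=120.0, scale=30.0)   # peak discharge, m3/s
    volumes = gumbel_ppf(u[:, 1], loc=8.0, scale=2.5)    # hydrograph volume, hm3
    return peaks, volumes

rng = np.random.default_rng(2)
peaks, volumes = sample_peak_volume(5000, rho=0.8, rng=rng)
```

The dependence between the sampled peaks and volumes is controlled by rho independently of the margins, which is exactly what makes copulas convenient for design-hydrograph construction.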
77 FR 4588 - Incapital LLC and Incapital Unit Trust; Notice of Application
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-30
... sales charge. If such a market is not maintained at any time for any Series, holders of the Units... future unit investment trusts (collectively, with the Incapital Trust, the ``Trusts'') and series of the Trusts (``Series'') that are sponsored by Incapital or any entity controlling, controlled by or under...
77 FR 76303 - Notice of Availability of Producer Price Index (PPI) Data Users Survey
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-27
... conducted a survey of PPI data users in late 1976 through early 1977. Since that time, numerous new time series data have been introduced with the goal of fulfilling the needs of data users. This survey will... series, and identify areas for future expansion. DATES: The Producer Price Index (PPI) user survey will...
A large set of potential past, present and future hydro-meteorological time series for the UK
NASA Astrophysics Data System (ADS)
Guillod, Benoit P.; Jones, Richard G.; Dadson, Simon J.; Coxon, Gemma; Bussi, Gianbattista; Freer, James; Kay, Alison L.; Massey, Neil R.; Sparrow, Sarah N.; Wallom, David C. H.; Allen, Myles R.; Hall, Jim W.
2018-01-01
Hydro-meteorological extremes such as drought and heavy precipitation can have large impacts on society and the economy. With potentially increasing risks associated with such events due to climate change, properly assessing the associated impacts and uncertainties is critical for adequate adaptation. However, the application of risk-based approaches often requires large sets of extreme events, which are not commonly available. Here, we present such a large set of hydro-meteorological time series for recent past and future conditions for the United Kingdom based on weather@home 2, a modelling framework consisting of a global climate model (GCM) driven by observed or projected sea surface temperature (SST) and sea ice, which is downscaled to 25 km over the European domain by a regional climate model (RCM). Sets of 100 time series are generated for each of (i) a historical baseline (1900-2006), (ii) five near-future scenarios (2020-2049) and (iii) five far-future scenarios (2070-2099). The five scenarios in each future time slice all follow the Representative Concentration Pathway 8.5 (RCP8.5) and sample the range of sea surface temperature and sea ice changes from CMIP5 (Coupled Model Intercomparison Project Phase 5) models. Validation of the historical baseline highlights good performance for temperature and potential evaporation, but substantial seasonal biases in mean precipitation, which are corrected using a linear approach. For extremes in low precipitation over a long accumulation period (> 3 months) and shorter-duration high precipitation (1-30 days), the time series generally represent past statistics well. Future projections show small precipitation increases in winter but large decreases in summer on average, leading to an overall drying, consistent with the most recent UK Climate Projections (UKCP09), though larger in magnitude.
Both drought and high-precipitation events are projected to increase in frequency and intensity in most regions, highlighting the need for appropriate adaptation measures. Overall, the presented dataset is a useful tool for assessing the risk associated with drought and more generally with hydro-meteorological extremes in the UK.
Kusev, Petko; van Schaik, Paul; Tsaneva-Atanasova, Krasimira; Juliusson, Asgeir; Chater, Nick
2018-01-01
When attempting to predict future events, people commonly rely on historical data. One psychological characteristic of judgmental forecasting of time series, established by research, is that when people make forecasts from series, they tend to underestimate future values for upward trends and overestimate them for downward ones, so-called trend-damping (modeled by anchoring on, and insufficient adjustment from, the average of recent time series values). Events in a time series can be experienced sequentially (dynamic mode), or they can also be retrospectively viewed simultaneously (static mode), not experienced individually in real time. In one experiment, we studied the influence of presentation mode (dynamic and static) on two sorts of judgment: (a) predictions of the next event (forecast) and (b) estimation of the average value of all the events in the presented series (average estimation). Participants' responses in dynamic mode were anchored on more recent events than in static mode for all types of judgment but with different consequences; hence, dynamic presentation improved prediction accuracy, but not estimation. These results are not anticipated by existing theoretical accounts; we develop and present an agent-based model-the adaptive anchoring model (ADAM)-to account for the difference between processing sequences of dynamically and statically presented stimuli (visually presented data). ADAM captures how variation in presentation mode produces variation in responses (and the accuracy of these responses) in both forecasting and judgment tasks. ADAM's model predictions for the forecasting and judgment tasks fit better with the response data than a linear-regression time series model. Moreover, ADAM outperformed autoregressive-integrated-moving-average (ARIMA) and exponential-smoothing models, while neither of these models accounts for people's responses on the average estimation task. Copyright © 2017 The Authors. 
Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
Leverage effect and its causality in the Korea composite stock price index
NASA Astrophysics Data System (ADS)
Lee, Chang-Yong
2012-02-01
In this paper, we investigate the leverage effect and its causality in the time series of the Korea Composite Stock Price Index from November of 1997 to September of 2010. The leverage effect, which can be quantitatively expressed as a negative correlation between past return and future volatility, is measured by using the cross-correlation coefficient of different time lags between the two time series of the return and the volatility. We find that past return and future volatility are negatively correlated and that the cross correlation is moderate and decays over 60 trading days. We also carry out a partial correlation analysis in order to confirm that the negative correlation between past return and future volatility is neither an artifact nor influenced by the traded volume. To determine the causality of the leverage effect within the decay time, we additionally estimate the cross correlation between past volatility and future return. With the estimate, we perform a statistical hypothesis test to demonstrate that the causal relation is in favor of the return influencing the volatility rather than the other way around.
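The lagged cross-correlation measurement described above is easy to sketch. The toy data-generating process below (negative returns inflating next-period volatility) is a constructed illustration, not the KOSPI data:

```python
import numpy as np

def lead_lag_corr(ret, vol, lag):
    """Correlation of return at time t with volatility at time t+lag.
    lag > 0 probes past return vs. future volatility (the leverage effect);
    lag < 0 probes past volatility vs. future return."""
    if lag >= 0:
        a, b = ret[:len(ret) - lag], vol[lag:]
    else:
        a, b = ret[-lag:], vol[:len(vol) + lag]
    return float(np.corrcoef(a, b)[0, 1])

# Toy series in which negative returns raise next-period volatility.
rng = np.random.default_rng(3)
n = 6000
ret = np.empty(n)
vol = np.empty(n)
sigma = 1.0
for t in range(n):
    vol[t] = sigma
    ret[t] = sigma * rng.standard_normal()
    sigma = 1.0 + 0.4 * max(0.0, -ret[t])      # leverage-style feedback

c_lev = lead_lag_corr(ret, vol, 1)             # past return vs future volatility
c_rev = lead_lag_corr(ret, vol, -1)            # past volatility vs future return
```

On such a series the lag +1 correlation comes out clearly negative while the lag -1 correlation is near zero, mirroring the causal asymmetry the paper tests for.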
A radarsat-2 quad-polarized time series for monitoring crop and soil conditions in Barrax, Spain
USDA-ARS?s Scientific Manuscript database
The European Space Agency (ESA) along with multiple university and agency investigators joined to conduct the AgriSAR Campaign in 2009. The main objective was to analyze a dense time series of RADARSAT-2 quad-pol data to define and quantify the performance of Sentinel-1 and other future ESA C-Band ...
17 CFR 140.74 - Delegation of authority to issue special calls for Series 03 Reports and Form 40.
Code of Federal Regulations, 2010 CFR
2010-04-01
... issue special calls for Series 03 Reports and Form 40. 140.74 Section 140.74 Commodity and Securities... Functions § 140.74 Delegation of authority to issue special calls for Series 03 Reports and Form 40. (a) The Commodity Futures Trading Commission hereby delegates, until such time as the Commission orders otherwise...
17 CFR 140.74 - Delegation of authority to issue special calls for Series 03 Reports and Form 40.
Code of Federal Regulations, 2011 CFR
2011-04-01
... issue special calls for Series 03 Reports and Form 40. 140.74 Section 140.74 Commodity and Securities... Functions § 140.74 Delegation of authority to issue special calls for Series 03 Reports and Form 40. (a) The Commodity Futures Trading Commission hereby delegates, until such time as the Commission orders otherwise...
17 CFR 140.74 - Delegation of authority to issue special calls for Series 03 Reports and Form 40.
Code of Federal Regulations, 2012 CFR
2012-04-01
... issue special calls for Series 03 Reports and Form 40. 140.74 Section 140.74 Commodity and Securities... Functions § 140.74 Delegation of authority to issue special calls for Series 03 Reports and Form 40. (a) The Commodity Futures Trading Commission hereby delegates, until such time as the Commission orders otherwise...
Mohammed, Emad A.; Naugler, Christopher
2017-01-01
Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. A fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand. PMID:28400996
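Of the three models in the tool, the Holt-Winters variants carry the seasonal structure. A from-scratch sketch of the additive variant (not the tool's actual code; the smoothing constants and the synthetic monthly series are arbitrary) might look like:

```python
import numpy as np

def holt_winters_additive(y, m, alpha=0.3, beta=0.05, gamma=0.2, h=12):
    """Additive Holt-Winters: recursive level, trend and seasonal updates,
    returning an h-step-ahead forecast."""
    level = float(np.mean(y[:m]))
    trend = (np.mean(y[m:2 * m]) - np.mean(y[:m])) / m
    season = list(y[:m] - level)
    for t, yt in enumerate(y):
        s = season[t % m]
        new_level = alpha * (yt - s) + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        season[t % m] = gamma * (yt - new_level) + (1 - gamma) * s
        level = new_level
    return np.array([level + (k + 1) * trend + season[(len(y) + k) % m]
                     for k in range(h)])

# Synthetic monthly test volumes: linear growth plus annual seasonality.
t = np.arange(120)
volumes = 1000.0 + 5.0 * t + 100.0 * np.sin(2 * np.pi * t / 12)
forecast = holt_winters_additive(volumes, m=12, h=12)
future_t = np.arange(120, 132)
truth = 1000.0 + 5.0 * future_t + 100.0 * np.sin(2 * np.pi * future_t / 12)
mape = float(np.mean(np.abs(forecast - truth) / truth))
```

Ranking models by an error measure such as MAPE on held-out data, as the tool does, then reduces to comparing this figure across candidates.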
ERIC Educational Resources Information Center
McLinden, Michael
2013-01-01
This publication focuses on national and international policy initiatives to develop a better understanding of part-time learners and the types of flexibility that may enhance their study especially pedagogically. As part of our five-strand research project "Flexible Pedagogies: preparing for the future" it: (1) highlights the challenges…
Time Series Modelling of Syphilis Incidence in China from 2005 to 2012
Zhang, Xingyu; Zhang, Tao; Pei, Jiao; Liu, Yuanyuan; Li, Xiaosong; Medrano-Gracia, Pau
2016-01-01
Background The infection rate of syphilis in China has increased dramatically in recent decades, becoming a serious public health concern. Early prediction of syphilis is therefore of great importance for health planning and management. Methods In this paper, we analyzed surveillance time series data for primary, secondary, tertiary, congenital and latent syphilis in mainland China from 2005 to 2012. Seasonality and long-term trend were explored with decomposition methods. An autoregressive integrated moving average (ARIMA) model was used to fit a univariate time series model of syphilis incidence. A separate multi-variable time series for each syphilis type was also tested using an autoregressive integrated moving average model with exogenous variables (ARIMAX). Results The syphilis incidence rates increased three-fold from 2005 to 2012. All syphilis time series showed strong seasonality and an increasing long-term trend. Both the ARIMA and ARIMAX models fitted and estimated syphilis incidence well. All univariate time series showed the highest goodness-of-fit results with the ARIMA(0,0,1)×(0,1,1) model. Conclusion Time series analysis was an effective tool for modelling the historical and future incidence of syphilis in China. The ARIMAX model showed superior performance to the ARIMA model for the modelling of syphilis incidence. Time series correlations existed between the models for primary, secondary, tertiary, congenital and latent syphilis. PMID:26901682
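The decomposition step mentioned in the Methods can be sketched with the classical additive scheme: a centred moving-average trend, seasonal means of the detrended series, and a remainder. This is a generic illustration on synthetic data, not the authors' code:

```python
import numpy as np

def decompose_additive(y, m=12):
    """Classical additive decomposition: centred moving-average trend,
    seasonal means of the detrended series, and a remainder."""
    n = len(y)
    trend = np.full(n, np.nan)
    half = m // 2
    for t in range(half, n - half):
        window = y[t - half:t + half + 1].astype(float)
        window[0] *= 0.5                       # 2 x m moving average:
        window[-1] *= 0.5                      # half-weight the endpoints
        trend[t] = window.sum() / m
    detrended = y - trend
    seasonal = np.array([np.nanmean(detrended[k::m]) for k in range(m)])
    seasonal -= seasonal.mean()                # constrain seasonal to zero mean
    remainder = y - trend - np.tile(seasonal, n // m + 1)[:n]
    return trend, seasonal, remainder

# Synthetic monthly incidence: linear trend plus annual cycle.
t = np.arange(96)
y = 2.0 + 0.01 * t + 0.5 * np.sin(2 * np.pi * t / 12)
trend, seasonal, remainder = decompose_additive(y, m=12)
```

On a clean trend-plus-cycle series the seasonal component is recovered exactly, which is the sanity check to run before moving on to ARIMA fitting.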
NASA Astrophysics Data System (ADS)
Kurniati, Devi; Hoyyi, Abdul; Widiharih, Tatik
2018-05-01
Time series data are a series of observations taken or measured at equal time intervals. Time series analysis is used to analyze data while accounting for the effect of time; its purpose is to characterize the patterns of a data set and to predict future values based on past observations. One forecasting method for time series data is the state space model. This study discusses the modeling and forecasting of electric energy consumption using the state space model for univariate data. The modeling stage begins with selection of the optimal Autoregressive (AR) order, followed by determination of the state vector through canonical correlation analysis, parameter estimation, and forecasting. The results show that the state space model of order 4 forecasts electric energy consumption with a Mean Absolute Percentage Error (MAPE) of 3.655%, which places the model in the very good forecasting category.
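The first stage of the state space modelling, optimal AR order selection, can be illustrated with Yule-Walker estimation plus an AIC scan. This is a generic sketch (the paper does not give its estimation code), with simulated AR(2) data standing in for the electricity series:

```python
import numpy as np

def yule_walker(x, p):
    """Fit an AR(p) model by solving the Yule-Walker equations."""
    xc = x - np.mean(x)
    n = len(xc)
    r = np.array([xc[:n - k] @ xc[k:] / n for k in range(p + 1)])
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    phi = np.linalg.solve(R, r[1:])
    sigma2 = r[0] - phi @ r[1:]                # innovation variance
    return phi, sigma2

def select_ar_order(x, pmax=8):
    """Choose the AR order minimising AIC = n log(sigma^2) + 2p."""
    n = len(x)
    aic = [n * np.log(yule_walker(x, p)[1]) + 2 * p for p in range(1, pmax + 1)]
    return int(np.argmin(aic)) + 1

# Simulate an AR(2) process and recover its coefficients and order.
rng = np.random.default_rng(4)
n = 4000
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.75 * x[t - 1] - 0.5 * x[t - 2] + rng.standard_normal()
phi, sigma2 = yule_walker(x, 2)
order = select_ar_order(x)
```

The selected order then fixes the dimension of the candidate state vector before the canonical correlation step.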
Time series modeling in traffic safety research.
Lavrenz, Steven M; Vlahogianni, Eleni I; Gkritza, Konstantina; Ke, Yue
2018-08-01
The use of statistical models for analyzing traffic safety (crash) data has been well-established. However, time series techniques have traditionally been underrepresented in the corresponding literature, due to challenges in data collection, along with a limited knowledge of proper methodology. In recent years, new types of high-resolution traffic safety data, especially in measuring driver behavior, have made time series modeling techniques an increasingly salient topic of study. Yet there remains a dearth of information to guide analysts in their use. This paper provides an overview of the state of the art in using time series models in traffic safety research, and discusses some of the fundamental techniques and considerations in classic time series modeling. It also presents ongoing and future opportunities for expanding the use of time series models, and explores newer modeling techniques, including computational intelligence models, which hold promise in effectively handling ever-larger data sets. The information contained herein is meant to guide safety researchers in understanding this broad area of transportation data analysis, and provide a framework for understanding safety trends that can influence policy-making. Copyright © 2017 Elsevier Ltd. All rights reserved.
``Wibbly-Wobbly, Timey-Wimey-Stuff:'' Teaching with a Time Lord
NASA Astrophysics Data System (ADS)
Larsen, K.
2014-07-01
November 2013 marked the 50th anniversary of the premiere of Doctor Who, the longest-running science fiction television series in history (790 episodes spanning 1963-1989 and 2005-present). The revival of the BBC series in 2005 has been both critically acclaimed and commercially successful. The travels of the 900-plus-year-old Time Lord and his companions introduce viewers to the past, present, and future of our planet and many others. While the series is obviously fictional, there is also a surprising amount of fairly accurate science as well.
NASA Astrophysics Data System (ADS)
Duveiller, G.; Donatelli, M.; Fumagalli, D.; Zucchini, A.; Nelson, R.; Baruth, B.
2017-02-01
Coupled atmosphere-ocean general circulation models (GCMs) simulate different realizations of possible future climates at global scale under contrasting scenarios of land-use and greenhouse gas emissions. Such data require several additional processing steps before they can be used to drive impact models. Spatial downscaling, typically by regional climate models (RCM), and bias-correction are two such steps that have already been addressed for Europe. Yet, the errors in the resulting daily meteorological variables may be too large for specific model applications. Crop simulation models are particularly sensitive to these inconsistencies and thus require further processing of GCM-RCM outputs. Moreover, crop models are often run in a stochastic manner by using various plausible weather time series (often generated using stochastic weather generators) to represent climate at a time scale for a period of interest (e.g. 2000 ± 15 years), while GCM simulations typically provide a single time series for a given emission scenario. To inform agricultural policy-making, data on near- and medium-term decadal time scales are mostly requested, e.g. 2020 or 2030. Taking a sample of multiple years from these unique time series to represent time horizons in the near future is particularly problematic because selecting overlapping years may lead to spurious trends, creating artefacts in the results of the impact model simulations. This paper presents a database of consolidated and coherent future daily weather data for Europe that addresses these problems. Input data consist of daily temperature and precipitation from three dynamically downscaled and bias-corrected regional climate simulations of the IPCC A1B emission scenario created within the ENSEMBLES project. Solar radiation is estimated from temperature based on an auto-calibration procedure. Wind speed and relative air humidity are collected from historical series.
From these variables, reference evapotranspiration and vapour pressure deficit are estimated ensuring consistency within daily records. The weather generator ClimGen is then used to create 30 synthetic years of all variables to characterize the time horizons of 2000, 2020 and 2030, which can readily be used for crop modelling studies.
NASA Astrophysics Data System (ADS)
Hamid, Nor Zila Abd; Adenan, Nur Hamiza; Noorani, Mohd Salmi Md
2017-08-01
Forecasting and analyzing the ozone (O3) concentration time series is important because the pollutant is harmful to health. This study is a pilot study for forecasting and analyzing the O3 time series in a Malaysian educational area, namely Shah Alam, using a chaotic approach. Through this approach, the observed hourly scalar time series is reconstructed into a multi-dimensional phase space, which is then used to forecast the future time series through the local linear approximation method. The main purpose is to forecast high O3 concentrations. The original method performed poorly, but the improved method addressed this weakness, enabling the high concentrations to be successfully forecast. The correlation coefficient between the observed and forecasted time series obtained with the improved method is 0.9159, and both the mean absolute error and root mean squared error are low. Thus, the improved method is advantageous. Time series analysis by means of the phase space plot and the Cao method identified the presence of low-dimensional chaotic dynamics in the observed O3 time series. Results showed that at least seven factors affect the studied O3 time series, which is consistent with the factors identified in the diurnal variations investigation and the sensitivity analysis of past studies. In conclusion, the chaotic approach has successfully forecast and analyzed the O3 time series in the educational area of Shah Alam. These findings are expected to help stakeholders such as the Ministry of Education and the Department of Environment achieve better air pollution management.
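The chaotic approach described, reconstructing a phase space by delay embedding and forecasting with a local linear approximation, can be sketched as follows. This is a generic illustration using the logistic map rather than the Shah Alam O3 data; the embedding dimension, delay, and neighbour count are arbitrary:

```python
import numpy as np

def embed(x, dim, tau):
    """Delay embedding: row t is [x(t), x(t+tau), ..., x(t+(dim-1)tau)]."""
    n = len(x) - (dim - 1) * tau
    return np.array([x[i:i + (dim - 1) * tau + 1:tau] for i in range(n)])

def local_linear_forecast(x, dim=2, tau=1, k=10):
    """One-step forecast: find the k nearest past states to the current
    embedded state and fit an affine map from states to their successors."""
    states = embed(x, dim, tau)
    successors = x[(dim - 1) * tau + 1:]       # value following each state
    current = states[-1]
    history = states[:len(successors)]         # states with a known successor
    dist = np.linalg.norm(history - current, axis=1)
    nearest = np.argsort(dist)[:k]
    A = np.hstack([history[nearest], np.ones((k, 1))])  # affine local model
    coef, *_ = np.linalg.lstsq(A, successors[nearest], rcond=None)
    return float(np.append(current, 1.0) @ coef)

# Chaotic test signal: the logistic map, a standard low-dimensional benchmark.
x = np.empty(2001)
x[0] = 0.4
for t in range(2000):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])
pred = local_linear_forecast(x[:2000], dim=2, tau=1, k=10)
```

Because nearby states of a low-dimensional chaotic system evolve similarly over short horizons, the one-step prediction error on such a series is small, which is the property the paper exploits for O3 forecasting.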
Models for forecasting hospital bed requirements in the acute sector.
Farmer, R D; Emami, J
1990-01-01
STUDY OBJECTIVE--The aim was to evaluate the current approach to forecasting hospital bed requirements. DESIGN--The study was a time series and regression analysis. The time series for mean duration of stay for general surgery in the age group 15-44 years (1969-1982) was used in the evaluation of different methods of forecasting future values of mean duration of stay and its subsequent use in the estimation of hospital bed requirements. RESULTS--It has been suggested that the simple trend fitting approach suffers from model specification error and imposes unjustified restrictions on the data. The time series approach (Box-Jenkins method) was shown to be a more appropriate way of modelling the data. CONCLUSION--The simple trend fitting approach is inferior to the time series approach in modelling hospital bed requirements. PMID:2277253
How bootstrap can help in forecasting time series with more than one seasonal pattern
NASA Astrophysics Data System (ADS)
Cordeiro, Clara; Neves, M. Manuela
2012-09-01
The search for the future is an appealing challenge in time series analysis. The diversity of forecasting methodologies is inevitable and still expanding. Exponential smoothing methods are the launch platform for modelling and forecasting in time series analysis. Recently this methodology has been combined with bootstrapping, revealing good performance. The Boot.EXPOS algorithm, which combines exponential smoothing and bootstrap methodologies, has shown promising results for forecasting time series with one seasonal pattern. For series with more than one seasonal pattern, the double seasonal Holt-Winters methods and the exponential smoothing methods were developed. A new challenge was to combine these seasonal methods with the bootstrap and to carry over a resampling scheme similar to the one used in the Boot.EXPOS procedure. The performance of this partnership is illustrated for some well-known data sets available in software.
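A stripped-down version of the exponential-smoothing-plus-bootstrap idea might look as follows; this is not the actual Boot.EXPOS code, just a sketch in which residuals from simple exponential smoothing are resampled to build an empirical forecast interval:

```python
import random

def ses(x, alpha):
    """Simple exponential smoothing: returns the final level and the
    one-step-ahead residuals (observation minus previous level)."""
    level = x[0]
    resid = []
    for obs in x[1:]:
        resid.append(obs - level)
        level = alpha * obs + (1 - alpha) * level
    return level, resid

def bootstrap_forecast(x, alpha=0.3, reps=500, seed=1):
    """Resample the smoothing residuals onto the final level to form an
    empirical one-step-ahead forecast distribution (Boot.EXPOS-style idea)."""
    random.seed(seed)
    level, resid = ses(x, alpha)
    draws = sorted(level + random.choice(resid) for _ in range(reps))
    return level, (draws[int(0.025 * reps)], draws[int(0.975 * reps)])

series = [20 + (i % 4) for i in range(40)]   # a crude seasonal pattern
point, interval = bootstrap_forecast(series)
print(round(point, 2), tuple(round(v, 2) for v in interval))
```

The full procedure resamples within the fitted seasonal model rather than over raw residuals, but the same point-forecast-plus-resampled-uncertainty structure applies.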
NASA Astrophysics Data System (ADS)
Marcos-Garcia, Patricia; Pulido-Velazquez, Manuel; Lopez-Nicolas, Antonio
2016-04-01
Extreme natural phenomena, and droughts in particular, constitute a serious environmental, economic and social issue in Southern Mediterranean countries, and are common in the Mediterranean Spanish basins because of the high temporal and spatial rainfall variability. Drought events are complex, often difficult to identify and quantify in both time and space, and a universally accepted definition does not even exist. This fact, along with uncertainty about the future duration and intensity of the phenomena on account of climate change, makes it necessary to increase knowledge about the impacts of climate change on droughts in order to design management plans and mitigation strategies. This work aims to evaluate the impact of climate change on both meteorological and hydrological droughts through a generalization of the Standardized Precipitation Index (SPI). We use the Standardized Flow Index (SFI), computed from flow time series instead of rainfall time series, to assess hydrological drought. For meteorological droughts, the Standardized Precipitation and Evapotranspiration Index (SPEI) has been applied to assess the impact of temperature variability. To characterize climate change impacts on droughts, we have used projections from the CORDEX project (Coordinated Regional Climate Downscaling Experiment). Future rainfall and temperature time series for the short (2011-2040) and medium term (2041-2070) were obtained, applying a quantile mapping method to correct the bias of these time series. For the hydrological drought, the Témez hydrological model, a conceptual, lumped model with few parameters, has been applied to simulate the impacts of future temperature and rainfall time series on runoff and river discharges. It is nevertheless necessary to point out the time lag between meteorological and hydrological droughts.
The case study is the Jucar river basin (Spain), a highly regulated system with a share of 80% of water use for irrigated agriculture. The results show that climate change would increase the historical drought impacts in the river basin. Acknowledgments: the study has been supported by the IMPADAPT project (CGL2013-48424-C2-1-R) with Spanish MINECO (Ministerio de Economía y Competitividad) and European FEDER funds.
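The standardized-index family (SPI, SFI, SPEI) shares one core operation: transforming an accumulation series into standard-deviation units. The sketch below uses a plain z-score as a stand-in for the operational gamma-fit-plus-normal-quantile transform; the rainfall values are hypothetical, and the same code applied to flow totals gives the SFI analogue:

```python
import statistics

def standardized_index(values):
    """Minimal SPI-like transform: standardize each accumulation against
    the sample mean and standard deviation. The operational SPI fits a
    gamma distribution and maps through the normal quantile; a plain
    z-score keeps the idea visible. Values <= -1 are commonly read as
    moderate drought."""
    mu = statistics.fmean(values)
    sigma = statistics.stdev(values)
    return [(v - mu) / sigma for v in values]

# hypothetical monthly rainfall accumulations (mm)
rain = [55, 60, 58, 20, 62, 57, 61, 15, 59, 63]
spi = standardized_index(rain)
drought_months = [i for i, z in enumerate(spi) if z <= -1.0]
print(drought_months)
```

The two low-rainfall months stand out as drought months; SPEI differs only in standardizing a climatic water balance (precipitation minus potential evapotranspiration) instead of precipitation alone.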
Testing the weak-form efficiency of the WTI crude oil futures market
NASA Astrophysics Data System (ADS)
Jiang, Zhi-Qiang; Xie, Wen-Jie; Zhou, Wei-Xing
2014-07-01
The weak-form efficiency of energy futures markets has long been studied, and the empirical evidence suggests conflicting conclusions. In this work, nonparametric methods are adopted to estimate the Hurst indexes of WTI crude oil futures prices (1983-2012), and a strict statistical test in the spirit of bootstrapping is put forward to verify the weak-form market efficiency hypothesis. The results show that the crude oil futures market is efficient when the whole period is considered. When the whole series is divided into three sub-series separated by the outbreaks of the Gulf War and the Iraq War, it is found that the Gulf War reduced the efficiency of the market. If the sample is split into two sub-series based on the signing date of the North American Free Trade Agreement, the market is found to be inefficient in the sub-period during which the Gulf War broke out. The same analysis on short time series in moving windows shows that the market is inefficient only when turbulent events occur, such as the oil price crash in 1985, the Gulf War, and the oil price crash in 2008.
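A rough rescaled-range (R/S) Hurst estimate, one of the nonparametric options alluded to above, can be sketched as follows; the chunk sizes and the i.i.d. test series are assumptions, and R/S on finite samples is known to be biased upward, which is exactly why the abstract's bootstrap-style significance test matters:

```python
import math, random

def rs_hurst(series, min_chunk=8):
    """Rough Hurst estimate by rescaled-range analysis: compute R/S on
    chunks of doubling sizes and fit log(R/S) ~ H * log(n). For an
    efficient (i.i.d.-increment) series H should be near 0.5."""
    def rs(chunk):
        m = sum(chunk) / len(chunk)
        dev, cum, lo, hi = 0.0, 0.0, 0.0, 0.0
        for v in chunk:
            cum += v - m
            lo, hi = min(lo, cum), max(hi, cum)
            dev += (v - m) ** 2
        s = math.sqrt(dev / len(chunk))
        return (hi - lo) / s if s else 0.0
    xs, ys = [], []
    n = min_chunk
    while n <= len(series) // 2:
        vals = [rs(series[i:i + n]) for i in range(0, len(series) - n + 1, n)]
        xs.append(math.log(n))
        ys.append(math.log(sum(vals) / len(vals)))
        n *= 2
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
            / sum((a - mx) ** 2 for a in xs))

random.seed(0)
returns = [random.gauss(0, 1) for _ in range(4096)]   # i.i.d. "returns"
h = rs_hurst(returns)
print(round(h, 2))
```

An estimate well above 0.5 on real price returns suggests persistence (inefficiency), but only a null distribution built by resampling tells whether the excess is significant.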
Using SAR satellite data time series for regional glacier mapping
NASA Astrophysics Data System (ADS)
Winsvold, Solveig H.; Kääb, Andreas; Nuth, Christopher; Andreassen, Liss M.; van Pelt, Ward J. J.; Schellenberger, Thomas
2018-03-01
With dense SAR satellite data time series it is possible to map surface and subsurface glacier properties that vary in time. Using Sentinel-1A and RADARSAT-2 backscatter time series images over mainland Norway and Svalbard, we outline how to map glaciers using descriptive methods. We present five application scenarios. The first shows the potential of SAR backscatter time series for tracking transient snow lines, which correlate with both optical satellite images (Sentinel-2A and Landsat 8) and equilibrium line altitudes derived from in situ surface mass balance data. In the second application scenario, time series representation of glacier facies corresponding to SAR glacier zones shows potential for a more accurate delineation of the zones and of how they change in time. The third application scenario investigates firn evolution using dense SAR backscatter time series together with a coupled energy balance and multilayer firn model. We find strong correlation between backscatter signals and both the modeled firn air content and the modeled wetness in the firn. In the fourth application scenario, we highlight how winter rain events can be detected in SAR time series, revealing important information about the areal extent of internal accumulation. In the last application scenario, averaged summer SAR images were found to have potential for assisting the process of mapping glacier outlines, especially in the presence of seasonal snow. Altogether we present examples of how to map glaciers and to further understand glaciological processes using the existing and future massive amount of multi-sensor time series data.
Using satellite laser ranging to measure ice mass change in Greenland and Antarctica
NASA Astrophysics Data System (ADS)
Bonin, Jennifer A.; Chambers, Don P.; Cheng, Minkang
2018-01-01
A least squares inversion of satellite laser ranging (SLR) data over Greenland and Antarctica could extend gravimetry-based estimates of mass loss back to the early 1990s and fill any future gap between the current Gravity Recovery and Climate Experiment (GRACE) and the future GRACE Follow-On mission. The results of a simulation suggest that, while separating the mass change between Greenland and Antarctica is not possible at the limited spatial resolution of the SLR data, estimating the total combined mass change of the two areas is feasible. When the method is applied to real SLR and GRACE gravity series, we find significantly different estimates of inverted mass loss. There are large, unpredictable, interannual differences between the two inverted data types, making us conclude that the current 5×5 spherical harmonic SLR series cannot be used to stand in for GRACE. However, a comparison with the longer IMBIE time series suggests that on a 20-year time frame, the inverted SLR series' interannual excursions may average out, and the long-term mass loss estimate may be reasonable.
Keckhut, P; Funatsu, B M; Claud, C; Hauchecorne, A
2015-01-01
Stratospheric temperature series derived from the Advanced Microwave Sounding Unit (AMSU) on board successive NOAA satellites reveal, during periods of overlap, some biases and drifts. Part of the reason for these discrepancies could be atmospheric tides: as the orbits of these satellites drifted, large changes were induced in the actual times of measurement. NOAA 15 and 16, which exhibit a long period of overlap, allow deriving diurnal tides that can correct such temperature drifts. The characteristics of the derived diurnal tides during summer periods are in good agreement with those calculated with the Global Scale Wave Model, indicating that most of the observed drifts are likely due to atmospheric tides. Cooling estimates can be biased by a factor of 2 if times of measurement are not considered. When diurnal tides are considered, trends derived from temperature lidar series are in good agreement with AMSU series. Future adjustments of temperature time series based on successive AMSU instruments will require corrections associated with the local times of measurement. PMID:26300563
Flood return level analysis of Peaks over Threshold series under changing climate
NASA Astrophysics Data System (ADS)
Li, L.; Xiong, L.; Hu, T.; Xu, C. Y.; Guo, S.
2016-12-01
Obtaining insights into future flood estimation is of great significance for water planning and management. Traditional flood return level analysis under the stationarity assumption has been challenged by changing environments. A method that takes the nonstationary context into consideration has been extended to derive flood return levels for Peaks over Threshold (POT) series. For POT series, a Poisson distribution is normally assumed to describe the arrival rate of exceedance events, but this distributional assumption has at times been reported as invalid. The Negative Binomial (NB) distribution is therefore proposed as an alternative to the Poisson assumption. Flood return levels were extrapolated in a nonstationary context for the POT series of the Weihe basin, China under future climate scenarios. The results show that the flood return levels estimated under nonstationarity differ depending on whether a Poisson or an NB distribution is assumed, and the difference is found to be related to the threshold value of the POT series. The study indicates the importance of distribution selection in flood return level analysis under nonstationarity and provides a reference on the impact of climate change on future flood estimation in the Weihe basin.
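One way to see why the arrival-rate distribution matters: with the same mean number of exceedances per year, a Negative Binomial (overdispersed, clustered) count model assigns a different probability to "at least one flood this year" than a Poisson model does. The parameter values below are illustrative:

```python
import math

def p_event_poisson(lam):
    """P(at least one exceedance per year) if counts are Poisson(lam)."""
    return 1 - math.exp(-lam)

def p_event_negbin(lam, r):
    """Same probability if counts are Negative Binomial with mean lam and
    dispersion r (variance lam + lam**2 / r > lam, i.e. clustered events)."""
    p = r / (r + lam)          # success-probability parameterisation
    return 1 - p ** r          # 1 - P(zero events)

lam = 2.0                      # two threshold exceedances per year on average
print(round(p_event_poisson(lam), 3), round(p_event_negbin(lam, r=1.0), 3))
```

Because clustered counts concentrate events into fewer years, the NB model gives a lower annual event probability at the same mean rate, which in turn shifts the derived return levels.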
50 CFR 600.315 - National Standard 2-Scientific Information.
Code of Federal Regulations, 2014 CFR
2014-10-01
...., abundance, environmental, catch statistics, market and trade trends) provide time-series information on... comment should be solicited at appropriate times during the review of scientific information... information or the promise of future data collection or analysis. In some cases, due to time constraints...
Time Series Data Analysis of Wireless Sensor Network Measurements of Temperature.
Bhandari, Siddhartha; Bergmann, Neil; Jurdak, Raja; Kusy, Branislav
2017-05-26
Wireless sensor networks have gained significant traction in environmental signal monitoring and analysis. The cost or lifetime of the system typically depends on the frequency at which environmental phenomena are monitored: if sampling rates are reduced, energy is saved. Using empirical datasets collected from environmental monitoring sensor networks, this work performs time series analyses of measured temperature. Unlike previous works, which have concentrated on suppressing the transmission of some data samples through time-series analysis while still maintaining high sampling rates, this work investigates reducing the sampling rate (and sensor wake-up rate) and examines the effects on accuracy. Results show that the sampling period of the sensor can be increased up to one hour while still allowing intermediate and future states to be estimated, with interpolation RMSE less than 0.2 °C and forecasting RMSE less than 1 °C.
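The accuracy-versus-sampling-rate tradeoff can be reproduced on a synthetic diurnal temperature curve: subsample, reconstruct by linear interpolation, and measure RMSE. The curve shape, amplitude, and sampling interval below are assumptions chosen only to mirror the structure of the experiment:

```python
import math

def subsample(series, period):
    """Keep every period-th sample (i.e. let the sensor sleep longer)."""
    idx = list(range(0, len(series), period))
    return idx, [series[i] for i in idx]

def linear_interp(idx, vals, n):
    """Reconstruct the full-length series by linear interpolation between
    the kept samples (constant beyond the last kept sample)."""
    out, j = [], 0
    for t in range(n):
        while j + 1 < len(idx) and idx[j + 1] < t:
            j += 1
        if t <= idx[0]:
            out.append(vals[0])
        elif t >= idx[-1]:
            out.append(vals[-1])
        else:
            w = (t - idx[j]) / (idx[j + 1] - idx[j])
            out.append((1 - w) * vals[j] + w * vals[j + 1])
    return out

def rmse(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

# one synthetic diurnal temperature cycle sampled every 5 minutes
temps = [20 + 5 * math.sin(2 * math.pi * t / 288) for t in range(289)]
idx, vals = subsample(temps, 12)            # keep one sample per hour
err = rmse(temps, linear_interp(idx, vals, len(temps)))
print(round(err, 4))
```

For a smooth diurnal signal, hourly sampling leaves a small interpolation error, consistent with the sub-0.2 °C figure reported; noisier field data would of course fare worse.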
Clinical time series prediction: Toward a hierarchical dynamical system framework.
Liu, Zitao; Hauskrecht, Milos
2015-09-01
Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient's condition, the dynamics of a disease, the effects of various patient management interventions, and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework combines the advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when compared against the best performing baseline, and a 5.25% average improvement when only short-term predictions were considered. A hierarchical dynamical system framework that can model irregularly sampled time series data is a promising new direction for modeling clinical time series and improving their predictive performance.
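The evaluation metrics named above are simple to state precisely. The sketch below computes the mean absolute error for a hypothetical baseline and a hypothetical model on made-up lab values (not the paper's CBC data) and expresses the improvement as a percentage, as the abstract does:

```python
def mae(actual, predicted):
    """Mean absolute prediction error."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def mape(actual, predicted):
    """Mean absolute percentage error (actual values assumed non-zero)."""
    return 100 * sum(abs(a - p) / abs(a)
                     for a, p in zip(actual, predicted)) / len(actual)

# hypothetical lab values and two competing model forecasts
actual   = [4.5, 4.8, 5.1, 4.9]
baseline = [4.0, 4.0, 4.0, 4.0]
model    = [4.4, 4.9, 5.0, 4.8]
improvement = (100 * (mae(actual, baseline) - mae(actual, model))
               / mae(actual, baseline))
print(round(improvement, 1), round(mape(actual, model), 2))
```

Relative improvement over the best baseline, rather than raw error, is what the reported 3.13% and 5.25% figures measure.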
Landsat Time-Series Analysis Opens New Approaches for Regional Glacier Mapping
NASA Astrophysics Data System (ADS)
Winsvold, S. H.; Kääb, A.; Nuth, C.; Altena, B.
2016-12-01
The archive of Landsat satellite scenes is important for mapping of glaciers, especially as it represents the longest running and continuous satellite record of sufficient resolution to track glacier changes over time. Contributing optical sensors newly launched (Landsat 8 and Sentinel-2A) or upcoming in the near future (Sentinel-2B) will provide very high temporal resolution of optical satellite images, especially in high-latitude regions. Because of the potential that lies within such near-future dense time series, methods for mapping glaciers from space should be revisited. We present application scenarios that utilize and explore dense time series of optical data for automatic mapping of glacier outlines and glacier facies. Throughout the season, glaciers display a temporal sequence of properties in optical reflection as the seasonal snow melts away, and glacier ice appears in the ablation area and firn in the accumulation area. In one application scenario we simulated potential future seasonal resolution using several years of Landsat 5TM/7ETM+ data, and found a sinusoidal evolution of the spectral reflectance for on-glacier pixels throughout a year. We believe this is because of the shortwave infrared band and its sensitivity to snow grain size. The parameters retrieved from the fitted sine curve can be used for glacier mapping purposes; we also found similar results using, e.g., the mean of summer band ratio images. In individual optical mapping scenes, conditions will vary (e.g., snow, ice, and clouds) and will not be equally optimal over the entire scene. Using robust statistics on stacked pixels reveals a potential for synthesizing optimal mapping scenes from a temporal stack, as we present in a further application scenario. The dense time series available from satellite imagery will also promote multi-temporal and multi-sensor based analyses.
The seasonal pattern of snow and ice on a glacier seen in the optical time series can in the summer season also be observed using radar backscatter series. Optical sensors reveal the reflective properties at the surface, while radar sensors may penetrate the surface, revealing properties from a certain volume. In an outlook to this contribution we have explored how we can combine information from SAR and optical sensor systems for different purposes.
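The sinusoidal fit to a yearly reflectance cycle mentioned above reduces, for uniform sampling over a full period, to projecting onto one Fourier harmonic. The day-of-year grid and reflectance values below are synthetic assumptions:

```python
import math

def fit_annual_sine(doy, values):
    """Fit values ~ c + a*cos(2*pi*d/365) + b*sin(2*pi*d/365) for
    day-of-year d. With uniform sampling over the year, the Fourier
    projections give the least-squares coefficients directly."""
    w = 2 * math.pi / 365.0
    n = len(values)
    c = sum(values) / n
    a = 2 * sum(v * math.cos(w * d) for d, v in zip(doy, values)) / n
    b = 2 * sum(v * math.sin(w * d) for d, v in zip(doy, values)) / n
    amplitude = math.hypot(a, b)
    phase = math.atan2(b, a)        # day of peak = phase / w
    return c, amplitude, phase

# synthetic on-glacier reflectance: bright spring snow, darker late summer
days = list(range(0, 365, 5))
refl = [0.6 + 0.3 * math.cos(2 * math.pi * (d - 120) / 365) for d in days]
c, amp, phase = fit_annual_sine(days, refl)
print(round(c, 3), round(amp, 3))
```

The mean level, amplitude, and peak day recovered from the fit are the kind of per-pixel parameters that can then feed a glacier/non-glacier classification.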
Time Series Analysis of Technology Trends based on the Internet Resources
NASA Astrophysics Data System (ADS)
Kobayashi, Shin-Ichi; Shirai, Yasuyuki; Hiyane, Kazuo; Kumeno, Fumihiro; Inujima, Hiroshi; Yamauchi, Noriyoshi
Information technology has become increasingly important in recent years for the development of our society, bringing many changes with incredible speed. Hence, when we investigate R&D themes or plan business strategies in IT, we must understand the overall situation around the target technology area, not only the technology itself. It is especially crucial to understand this situation as a time series, in order to know what will happen in the near future in the target area. For this purpose, we developed a method to generate multiple-phased trend maps automatically from Internet content. Furthermore, we introduced quantitative indicators to analyze possible near-future changes. Evaluation of this method yielded successful and interesting results.
A Proposed Data Fusion Architecture for Micro-Zone Analysis and Data Mining
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevin McCarthy; Milos Manic
Data fusion requires the ability to combine or "fuse" data from multiple data sources. Time series analysis is a data mining technique used to predict future values of a data set based upon past values. Unlike other data mining techniques, however, time series analysis places special emphasis on periodicity and on how seasonal and other time-based factors tend to affect trends over time. One of the difficulties encountered in developing generic time series techniques is the wide variability of the data sets available for analysis. This presents challenges all the way from the data gathering stage to results presentation. This paper presents an architecture designed and used to facilitate the collection of disparate data sets well suited to time series analysis as well as other predictive data mining techniques. Results show this architecture provides a flexible, dynamic framework for the capture and storage of a myriad of dissimilar data sets and can serve as a foundation from which to build a complete data fusion architecture.
Return periods of losses associated with European windstorm series in a changing climate
NASA Astrophysics Data System (ADS)
Karremann, Melanie K.; Pinto, Joaquim G.; Reyers, Mark; Klawa, Matthias
2015-04-01
During the last decades, several windstorm series hit Europe leading to large aggregated losses. Such storm series are examples of serial clustering of extreme cyclones, presenting a considerable risk for the insurance industry. Clustering of events and return periods of storm series affecting Europe are quantified based on potential losses using empirical models. Moreover, possible future changes of clustering and return periods of European storm series with high potential losses are quantified. Historical storm series are identified using 40 winters of NCEP reanalysis data (1973/1974-2012/2013). Time series of top events (1, 2 or 5 year return levels) are used to assess return periods of storm series both empirically and theoretically. Return periods of historical storm series are estimated based on the Poisson and the negative binomial distributions. Additionally, 800 winters of ECHAM5/MPI-OM1 general circulation model simulations for present (SRES scenario 20C: years 1960-2000) and future (SRES scenario A1B: years 2060-2100) climate conditions are investigated. Clustering is identified for most countries in Europe, and estimated return periods are similar for reanalysis and present day simulations. Future changes of return periods are estimated for fixed return levels and fixed loss index thresholds. For the former, shorter return periods are found for Western Europe, but changes are small and spatially heterogeneous. For the latter, which combines the effects of clustering and event ranking shifts, shorter return periods are found everywhere except for Mediterranean countries. These changes are generally not statistically significant between recent and future climate. However, the return periods for the fixed loss index approach are mostly beyond the range of preindustrial natural climate variability. This is not true for fixed return levels.
The quantification of losses associated with storm series permits a more adequate windstorm risk assessment in a changing climate.
Human Mars Lander Design for NASA's Evolvable Mars Campaign
NASA Technical Reports Server (NTRS)
Polsgrove, Tara; Chapman, Jack; Sutherlin, Steve; Taylor, Brian; Fabisinski, Leo; Collins, Tim; Cianciolo Dwyer, Alicia; Samareh, Jamshid; Robertson, Ed; Studak, Bill;
2016-01-01
Landing humans on Mars will require entry, descent, and landing capability beyond the current state of the art. Nearly twenty times more delivered payload and an order of magnitude improvement in precision landing capability will be necessary. To better assess entry, descent, and landing technology options and sensitivities to future human mission design variations, a series of design studies on human-class Mars landers has been initiated. This paper describes the results of the first design study in the series of studies to be completed in 2016 and includes configuration, trajectory and subsystem design details for a lander with Hypersonic Inflatable Aerodynamic Decelerator (HIAD) entry technology. Future design activities in this series will focus on other entry technology options.
NASA Astrophysics Data System (ADS)
Mosier, T. M.; Hill, D. F.; Sharp, K. V.
2013-12-01
High spatial resolution time-series data are critical for many hydrological and earth science studies. Multiple groups have developed historical and forecast datasets of high-resolution monthly time-series for regions of the world such as the United States (e.g. PRISM for hindcast data and MACA for long-term forecasts); however, analogous datasets have not been available for most data-scarce regions. The current work fills this data need by producing and freely distributing hindcast and forecast time-series datasets of monthly precipitation and mean temperature for all global land surfaces, gridded at a 30 arc-second resolution. The hindcast data are constructed through a Delta downscaling method, using as inputs the 0.5 degree monthly time-series and 30 arc-second climatology global weather datasets developed by Willmott & Matsuura and WorldClim, respectively. The forecast data are formulated using a similar downscaling method, but with an additional step to remove bias from the climate variable's probability distribution over each region of interest. The downscaling package is designed to be compatible with a number of general circulation models (GCMs) (e.g. GCMs developed for the IPCC AR4 report and CMIP5), and is presently implemented using time-series data from the NCAR CESM1 model in conjunction with 30 arc-second future decadal climatologies distributed by the Consultative Group on International Agricultural Research. The resulting downscaled datasets are 30 arc-second time-series forecasts of monthly precipitation and mean temperature available for all global land areas. As an example of these data, historical and forecast 30 arc-second monthly time-series from 1950 through 2070 are created and analyzed for the region encompassing Pakistan. For this case study, forecast datasets corresponding to the representative concentration pathway scenarios RCP4.5 and RCP8.5 developed by the IPCC are presented and compared.
This exercise highlights a range of potential meteorological trends for the Pakistan region and more broadly serves to demonstrate the utility of the presented 30 arc-second monthly precipitation and mean temperature datasets for use in data scarce regions.
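The Delta downscaling step itself is compact: the coarse model's change signal is added to a high-resolution observed climatology. The sketch below shows the additive (temperature) form on a made-up 2x2 block of fine cells; precipitation typically uses a multiplicative ratio instead:

```python
def delta_downscale(coarse_hist, coarse_future, fine_clim):
    """Delta-method sketch: the change (delta) between a coarse-model
    future and its historical baseline is applied to every cell of a
    high-resolution observed climatology, preserving the fine spatial
    pattern while adopting the coarse change signal."""
    delta = coarse_future - coarse_hist
    return [[cell + delta for cell in row] for row in fine_clim]

# one coarse cell containing a 2x2 block of 30 arc-second climatology values
fine_january_temp = [[4.0, 6.5],
                     [8.0, 2.5]]          # degC, spatial detail preserved
projected = delta_downscale(coarse_hist=5.0, coarse_future=6.8,
                            fine_clim=fine_january_temp)
print(projected)
```

Every fine cell is warmed by the same 1.8 °C delta, so elevation-driven contrasts in the climatology survive the projection, which is the point of the method.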
Analysis of continuous GPS measurements from southern Victoria Land, Antarctica
Willis, Michael J.
2007-01-01
Several years of continuous data have been collected at remote bedrock Global Positioning System (GPS) sites in southern Victoria Land, Antarctica. Annual to sub-annual variations are observed in the position time-series. An atmospheric pressure loading (APL) effect is calculated from pressure field anomalies supplied by the European Centre for Medium-Range Weather Forecasts (ECMWF) model loading an elastic Earth model. The predicted APL signal has a moderate correlation with the vertical position time-series at McMurdo, Ross Island (International Global Navigation Satellite System Service (IGS) station MCM4), produced using a global solution. In contrast, a local solution in which MCM4 is the fiducial site generates a vertical time series for a remote site in Victoria Land (Cape Roberts, ROB4) which exhibits a low, inverse correlation with the predicted atmospheric pressure loading signal. If, in the future, known and well modeled geophysical loads can be separated from the time-series, then local hydrological loading, of interest for glaciological and climate applications, can potentially be extracted from the GPS time-series.
Brenčič, Mihael
2016-01-01
Northern hemisphere elementary circulation mechanisms, defined with the Dzerdzeevski classification and published on a daily basis from 1899-2012, are analysed with statistical methods as continuous categorical time series. The classification consists of 41 elementary circulation mechanisms (ECMs), which are assigned to calendar days. Empirical marginal probabilities of each ECM were determined. Seasonality and periodicity effects were investigated with moving dispersion filters and a randomisation procedure on the ECM categories, as well as with time analyses of the ECM mode. The time series were determined to be non-stationary, with strong time-dependent trends. During the investigated period, periodicity interchanges with periods when no seasonality is present. In the time series structure, the strongest division is visible at the milestone of 1986, showing that the atmospheric circulation pattern reflected in the ECMs has significantly changed. This change is a result of a change in the frequency of ECM categories: before 1986, the appearance of ECMs was more diverse, and afterwards fewer ECMs appear. The statistical approach applied to categorical climatic time series opens up new potential insight into climate variability and change studies to be performed in the future. PMID:27116375
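Two of the elementary computations on a categorical series, the empirical marginal probabilities and the windowed mode, can be sketched as follows; the toy daily codes below are assumptions, not actual Dzerdzeevski ECM categories:

```python
from collections import Counter

def marginal_probabilities(labels):
    """Empirical marginal probability of each category in a categorical
    (ECM-style) daily series."""
    counts = Counter(labels)
    n = len(labels)
    return {k: c / n for k, c in counts.items()}

def mode_series(labels, window):
    """Most frequent category in each non-overlapping window: a simple
    way to track the dominant circulation type through time."""
    out = []
    for i in range(0, len(labels) - window + 1, window):
        out.append(Counter(labels[i:i + window]).most_common(1)[0][0])
    return out

# toy daily sequence of circulation-type codes
days = ["E1", "E1", "E2", "E1", "E3", "E1", "E2", "E2", "E2", "E2"]
print(marginal_probabilities(days))
print(mode_series(days, 5))
```

A shift in the mode series between two sub-periods, like the one the study reports at 1986, is exactly the kind of structural change these simple summaries can expose before heavier statistics are applied.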
10 CFR 300.5 - Submission of an entity statement.
Code of Federal Regulations, 2010 CFR
2010-01-01
... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...
10 CFR 300.5 - Submission of an entity statement.
Code of Federal Regulations, 2011 CFR
2011-01-01
... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...
10 CFR 300.5 - Submission of an entity statement.
Code of Federal Regulations, 2012 CFR
2012-01-01
... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...
10 CFR 300.5 - Submission of an entity statement.
Code of Federal Regulations, 2014 CFR
2014-01-01
... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...
10 CFR 300.5 - Submission of an entity statement.
Code of Federal Regulations, 2013 CFR
2013-01-01
... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...
Chang, Li-Chiu; Chen, Pin-An; Chang, Fi-John
2012-08-01
A reliable forecast of future events possesses great value. The main purpose of this paper is to propose an innovative learning technique for reinforcing the accuracy of two-step-ahead (2SA) forecasts. The real-time recurrent learning (RTRL) algorithm for recurrent neural networks (RNNs) can effectively model the dynamics of complex processes and has been used successfully in one-step-ahead forecasts for various time series. A reinforced RTRL algorithm for 2SA forecasts using RNNs is proposed in this paper, and its performance is investigated by two famous benchmark time series and a streamflow during flood events in Taiwan. Results demonstrate that the proposed reinforced 2SA RTRL algorithm for RNNs can adequately forecast the benchmark (theoretical) time series, significantly improve the accuracy of flood forecasts, and effectively reduce time-lag effects.
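The difference between one-step and two-step-ahead forecasting can be seen with any recurrence-based predictor: the iterated approach feeds its own step-one output back in, so step-one errors propagate into step two, which is the weakness the reinforced scheme targets. The linear "model" below is a hypothetical stand-in for the trained recurrent network:

```python
def one_step(model, history):
    """A trained one-step-ahead predictor (here a hypothetical linear
    model standing in for the recurrent neural network)."""
    w1, w2 = model
    return w1 * history[-1] + w2 * history[-2]

def two_step_iterated(model, history):
    """2SA forecast by feeding the one-step prediction back as input;
    any error made at step one propagates into step two."""
    first = one_step(model, history)
    return one_step(model, history + [first])

# Fibonacci-like recurrence x_t = x_{t-1} + x_{t-2}: exact model (1, 1)
series = [1, 1, 2, 3, 5, 8]
model = (1.0, 1.0)
print(two_step_iterated(model, series))
```

With the exact model the iterated two-step forecast is perfect; with an imperfect model the compounding of step-one error is what the reinforced RTRL training is designed to suppress.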
2016-08-21
... less pronounced for pelvis velocity. Seat velocity and dynamic displacement were not recorded for this test series, which would provide key information for evaluating the effectiveness of the seat; displacement/time history data should be recorded for all future test series. Conclusions and future work include interfacing with seat manufacturers to broaden the occupant protection range and recording dynamic stroke on all drop tower tests to evaluate the correlation between displacement rate and lumbar compression.
Nonlinear techniques for forecasting solar activity directly from its time series
NASA Technical Reports Server (NTRS)
Ashrafi, S.; Roszman, L.; Cooley, J.
1992-01-01
Numerical techniques for constructing nonlinear predictive models to forecast solar flux directly from its time series are presented. This approach makes it possible to extract dynamical invariants of our system without reference to any underlying solar physics. We consider the dynamical evolution of solar activity in a reconstructed phase space that captures the (strange) attractor, give a procedure for constructing a predictor of future solar activity, and discuss extraction of dynamical invariants such as Lyapunov exponents and attractor dimension.
Nonlinear techniques for forecasting solar activity directly from its time series
NASA Technical Reports Server (NTRS)
Ashrafi, S.; Roszman, L.; Cooley, J.
1993-01-01
This paper presents numerical techniques for constructing nonlinear predictive models to forecast solar flux directly from its time series. This approach makes it possible to extract dynamical invariants of our system without reference to any underlying solar physics. We consider the dynamical evolution of solar activity in a reconstructed phase space that captures the (strange) attractor, give a procedure for constructing a predictor of future solar activity, and discuss extraction of dynamical invariants such as Lyapunov exponents and attractor dimension.
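The phase-space reconstruction these two papers describe is usually done by time-delay embedding, after which a simple predictor looks up the nearest past state and returns what followed it. The sketch below illustrates that standard approach on a periodic signal; the embedding dimension and delay are illustrative choices, not taken from the papers.

```python
import math

# Sketch of forecasting a series directly from its values: reconstruct
# a phase space by time-delay embedding, then predict the next value
# from the nearest neighbour's successor. Dimension and delay are
# illustrative, not the papers' settings.
def embed(series, dim, delay):
    """Return delay vectors [x[i], x[i+delay], ..., x[i+(dim-1)*delay]]."""
    span = (dim - 1) * delay
    return [series[i:i + span + 1:delay] for i in range(len(series) - span)]

def nn_predict(series, dim=3, delay=2):
    """Predict the next value via the nearest delay vector's successor."""
    vectors = embed(series, dim, delay)
    query = vectors[-1]
    best, best_dist = None, float("inf")
    for i, v in enumerate(vectors[:-1]):   # successors of these are known
        d = sum((a - b) ** 2 for a, b in zip(v, query))
        if d < best_dist:
            best, best_dist = i, d
    span = (dim - 1) * delay
    return series[best + span + 1]         # value that followed the match

# Example on a periodic (hence predictable) solar-cycle-like signal
signal = [math.sin(0.3 * t) for t in range(200)]
pred = nn_predict(signal)
true_next = math.sin(0.3 * 200)
```

For genuinely chaotic data the same lookup works only over short horizons, with the prediction error growing at a rate set by the largest Lyapunov exponent, which is why the papers treat those invariants as the key quantities to extract.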
Future mission studies: Forecasting solar flux directly from its chaotic time series
NASA Technical Reports Server (NTRS)
Ashrafi, S.
1991-01-01
The mathematical structure of the programs written to construct a nonlinear predictive model to forecast solar flux directly from its time series, without reference to any underlying solar physics, is presented. The method and the programs are written so that the same technique can be applied to forecast other chaotic time series, such as geomagnetic data, attitude and orbit data, and even financial indexes and stock market data. Perhaps the most important application of this technique to flight dynamics is to model the Goddard Trajectory Determination System (GTDS) output of residuals between the observed position of a spacecraft and the calculated position with no drag (drag flag = off). This would result in a new model of drag working directly from observed data.
NASA Astrophysics Data System (ADS)
Pan, Supriya
2018-01-01
Cosmological models with a time-dependent Λ (read as Λ(t)) have been investigated widely in the literature. Models whose background dynamics can be solved analytically are of special interest. Additionally, whether a specific model allows past or future singularities at finite cosmic time provides a generic test of its viability against current observations. Accordingly, in this work we consider a variety of Λ(t) models, focusing on their evolution and singular behavior. We find that a series of models in this class can be solved exactly when the background universe is described by a spatially flat Friedmann-Lemaître-Robertson-Walker (FLRW) line element. The solutions, in terms of the scale factor of the FLRW universe, yield different universe models, such as power-law expansion, oscillating, and singularity-free universes. However, we also find that a large number of the models in this series permit past or future cosmological singularities at finite cosmic time. Finally, we note that the avoidance of future singularities is possible for certain models under some specific restrictions.
On statistical inference in time series analysis of the evolution of road safety.
Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora
2013-11-01
Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include the annual (or monthly) number of road traffic accidents, traffic fatalities, or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among the disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to under- or overestimation of standard errors and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency, the paper describes rigorous statistical techniques for overcoming them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating accident occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether linear, generalized linear, or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches, are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research.
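The paper's core warning, that serial dependency invalidates the i.i.d. standard-error formula, is easy to demonstrate by simulation. The sketch below (illustrative parameters, not from the paper) generates positively autocorrelated AR(1) series and compares the naive standard error of the mean with the empirically observed sampling variability.

```python
import math
import random

# Simulation of the effect of serial dependency: for a positively
# autocorrelated AR(1) series, the naive i.i.d. standard error of the
# mean understates the true sampling variability. Parameters are
# illustrative.
random.seed(42)

def ar1_series(n, phi=0.7):
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + random.gauss(0.0, 1.0)
        out.append(x)
    return out

n, reps = 200, 500
means, naive_ses = [], []
for _ in range(reps):
    s = ar1_series(n)
    m = sum(s) / n
    var = sum((v - m) ** 2 for v in s) / (n - 1)
    means.append(m)
    naive_ses.append(math.sqrt(var / n))   # i.i.d. formula

grand = sum(means) / reps
true_se = math.sqrt(sum((m - grand) ** 2 for m in means) / (reps - 1))
avg_naive_se = sum(naive_ses) / reps
# For phi = 0.7 the true SE is roughly sqrt((1+phi)/(1-phi)), i.e.
# about 2.4 times, larger than the naive estimate.
```

Confidence intervals built from the naive standard error would therefore be far too narrow here, exactly the kind of erroneous inference the paper cautions against.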
Self-affinity in the dengue fever time series
NASA Astrophysics Data System (ADS)
Azevedo, S. M.; Saba, H.; Miranda, J. G. V.; Filho, A. S. Nascimento; Moret, M. A.
2016-06-01
Dengue is a complex public health problem that is common in tropical and subtropical regions. The disease has risen substantially in the last three decades, and the occurrences of reported dengue cases in Bahia, Brazil, exhibit self-affine behavior. This study uses detrended fluctuation analysis (DFA) to verify the scale behavior in time series of dengue cases and to evaluate the long-range correlations, characterized by the power-law α exponent, for different cities in Bahia, Brazil. The scaling exponent (α) presents different long-range correlations, i.e. uncorrelated, anti-persistent, persistent and diffusive behaviors. The long-range correlations highlight the complex behavior of the time series of this disease. The findings show that there are two distinct types of scale behavior. In the first, the time series presents a persistent α exponent over a one-month period. For longer periods, the time series signal approaches subdiffusive behavior. The hypothesis of long-range correlations in the time series of occurrences of reported dengue cases was validated. The observed self-affinity is useful as a forecasting tool for future periods through extrapolation of the α exponent behavior. This complex system has a higher predictability over relatively short times (approximately one month), suggesting a new tool for epidemiological control strategies. However, predictions for longer periods using DFA are limited by the subdiffusive behavior.
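Detrended fluctuation analysis itself is compact enough to sketch: integrate the centered series, detrend it linearly within boxes of size n, and read α as the slope of log F(n) against log n. The implementation below is a minimal first-order DFA with illustrative box sizes; for white noise α should come out near 0.5.

```python
import math
import random

# Minimal first-order detrended fluctuation analysis (DFA).
# Box sizes are illustrative choices.
def dfa_alpha(series, box_sizes):
    mean = sum(series) / len(series)
    profile, acc = [], 0.0
    for v in series:
        acc += v - mean
        profile.append(acc)                # integrated (cumulative) signal
    log_n, log_f = [], []
    for n in box_sizes:
        sq, boxes = 0.0, len(profile) // n
        for b in range(boxes):
            seg = profile[b * n:(b + 1) * n]
            t = list(range(n))
            mt, ms = sum(t) / n, sum(seg) / n
            # least-squares linear detrend within the box
            slope = sum((ti - mt) * (si - ms) for ti, si in zip(t, seg)) \
                    / sum((ti - mt) ** 2 for ti in t)
            inter = ms - slope * mt
            sq += sum((si - (slope * ti + inter)) ** 2
                      for ti, si in zip(t, seg))
        f = math.sqrt(sq / (boxes * n))    # RMS fluctuation at scale n
        log_n.append(math.log(n))
        log_f.append(math.log(f))
    # alpha is the slope of log F(n) versus log n
    k = len(log_n)
    mx, my = sum(log_n) / k, sum(log_f) / k
    return sum((x - mx) * (y - my) for x, y in zip(log_n, log_f)) \
           / sum((x - mx) ** 2 for x in log_n)

random.seed(7)
noise = [random.gauss(0, 1) for _ in range(4000)]
alpha = dfa_alpha(noise, [8, 16, 32, 64, 128])
```

Values of α near 0.5 indicate an uncorrelated series, α between 0.5 and 1 a persistent one, and α below 0.5 an anti-persistent one, the regimes the study distinguishes for the dengue case counts.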
Time series analysis for psychological research: examining and forecasting change
Jebb, Andrew T.; Tay, Louis; Wang, Wei; Huang, Qiming
2015-01-01
Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials. PMID:26106341
ERIC Educational Resources Information Center
Trumper, Ricardo
2006-01-01
Bearing in mind students' misconceptions about basic concepts in astronomy, the present study conducted a series of constructivist activities aimed at changing future elementary and junior high school teachers' conceptions about the cause of seasonal changes, and several characteristics of the Sun-Earth-Moon relative movements like Moon phases,…
Clinical time series prediction: towards a hierarchical dynamical system framework
Liu, Zitao; Hauskrecht, Milos
2014-01-01
Objective: Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient condition, the dynamics of a disease, the effect of various patient management interventions, and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods: Our hierarchical dynamical system framework for modeling clinical time series combines the advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results: We tested our framework by first learning the time series model from data for the patients in the training set, and then applying the model to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered.
Conclusion: A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising direction for modeling clinical time series and for improving their predictive performance. PMID:25534671
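The linear dynamical system half of such a framework can be illustrated with a scalar Kalman filter that tracks a latent state and forecasts the next observation. This is only a sketch of the building block, not the paper's hierarchical Gaussian-process model, and all parameters are illustrative.

```python
# Sketch of the linear-dynamical-system component: a scalar Kalman
# filter for y_t = x_t + noise(r), x_t = a * x_{t-1} + noise(q).
# Parameters a, q, r are illustrative assumptions.
def kalman_forecast(observations, a=1.0, q=0.1, r=1.0):
    """Filter the series and return the one-step-ahead forecast."""
    x, p = observations[0], 1.0            # initial state estimate
    for y in observations[1:]:
        # predict
        x_pred = a * x
        p_pred = a * a * p + q
        # update with the new observation
        k = p_pred / (p_pred + r)          # Kalman gain
        x = x_pred + k * (y - x_pred)
        p = (1.0 - k) * p_pred
    return a * x                            # forecast of the next y

# Example: noisy lab-like measurements of a level near 5.0
obs = [5.2, 4.8, 5.1, 4.9, 5.0, 5.3, 4.7, 5.0]
pred = kalman_forecast(obs)
```

The paper's contribution is to let Gaussian-process sequences handle the irregular sampling within segments while a system like this one governs the transitions between them.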
Simulated hydrologic response to climate change during the 21st century in New Hampshire
Bjerklie, David M.; Sturtevant, Luke P.
2018-01-24
The U.S. Geological Survey, in cooperation with the New Hampshire Department of Environmental Services and the Department of Health and Human Services, has developed a hydrologic model to assess the effects of short- and long-term climate change on hydrology in New Hampshire. This report documents the model and datasets developed by using the model to predict how climate change will affect the hydrologic cycle and provide data that can be used by State and local agencies to identify locations that are vulnerable to the effects of climate change in areas across New Hampshire. Future hydrologic projections were developed from the output of five general circulation models for two future climate scenarios. The scenarios are based on projected future greenhouse gas emissions and estimates of land-use and land-cover change within a projected global economic framework. An evaluation of the possible effect of projected future temperature on modeling of evapotranspiration is summarized to address concerns regarding the implications of the future climate on model parameters that are based on climate variables. The results of the model simulations are hydrologic projections indicating increasing streamflow across the State with large increases in streamflow during winter and early spring and general decreases during late spring and summer. Wide spatial variability in changes to groundwater recharge is projected, with general decreases in the Connecticut River Valley and at high elevations in the northern part of the State and general increases in coastal and lowland areas of the State. In general, total winter snowfall is projected to decrease across the State, but there is a possibility of increasing snow in some locations, particularly during November, February, and March. The simulated future changes in recharge and snowfall vary by watershed across the State. 
This means that each area of the State could experience very different changes, depending on topography or other factors. Therefore, planning for infrastructure and public safety needs to be flexible in order to address the range of possible outcomes indicated by the various model simulations. The absolute magnitude and timing of the daily streamflows, especially the larger floods, are not considered to be reliably simulated compared to changes in frequency and duration of daily streamflows and changes in accumulated monthly and seasonal streamflow volumes. Simulated current and future streamflow, groundwater recharge, and snowfall datasets include simulated data derived from the five general circulation models used in this study for a current reference time period and two future time periods. Average monthly streamflow time series datasets are provided for 27 streamgages in New Hampshire. Fourteen of the 27 streamgages associated with daily streamflow time series showed a good calibration. Average monthly groundwater recharge and snowfall time series for the same reference time period and two future time periods are also provided for each of the 467 hydrologic response units that compose the model.
Complexity analysis based on generalized deviation for financial markets
NASA Astrophysics Data System (ADS)
Li, Chao; Shang, Pengjian
2018-03-01
In this paper, a new modified method, complexity analysis based on generalized deviation, is proposed as a measure for investigating the correlation between past price and future volatility in financial time series. In comparison with the earlier retarded volatility model, the new approach is both simple and computationally efficient. The method based on the generalized deviation function presents an exhaustive way of quantifying the rules of the financial market. The robustness of this method is verified by numerical experiments with both artificial and financial time series. Results show that the generalized deviation complexity analysis method not only identifies the volatility of financial time series, but also provides a comprehensive way of distinguishing the different characteristics of stock indices and individual stocks. Exponential functions can be used to successfully fit the volatility curves and quantify the changes of complexity for stock market data. We then study the influence of the negative domain of the deviation coefficient and the differences between volatile and calm periods. After the data analysis of the experimental model, we found that the generalized deviation model has definite advantages in exploring the relationship between historical returns and future volatility.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plotkin, S.; Stephens, T.; McManus, W.
2013-03-01
Scenarios of new vehicle technology deployment serve various purposes; some will seek to establish plausibility. This report proposes two reality checks for scenarios: (1) implications of manufacturing constraints on timing of vehicle deployment and (2) investment decisions required to bring new vehicle technologies to market. An estimated timeline of 12 to more than 22 years from initial market introduction to saturation is supported by historical examples and based on the product development process. Researchers also consider the series of investment decisions to develop and build the vehicles and their associated fueling infrastructure. A proposed decision tree analysis structure could be used to systematically examine investors' decisions and the potential outcomes, including consideration of cash flow and return on investment. This method requires data or assumptions about capital cost, variable cost, revenue, timing, and probability of success/failure, and would result in a detailed consideration of the value proposition of large investments and long lead times. This is one of a series of reports produced as a result of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency effort to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.
A wrinkle in time: asymmetric valuation of past and future events.
Caruso, Eugene M; Gilbert, Daniel T; Wilson, Timothy D
2008-08-01
A series of studies shows that people value future events more than equivalent events in the equidistant past. Whether people imagined being compensated or compensating others, they required and offered more compensation for events that would take place in the future than for identical events that had taken place in the past. This temporal value asymmetry (TVA) was robust in between-persons comparisons and absent in within-persons comparisons, which suggests that participants considered the TVA irrational. Contemplating future events produced greater affect than did contemplating past events, and this difference mediated the TVA. We suggest that the TVA, the gain-loss asymmetry, and hyperbolic time discounting can be unified in a three-dimensional value function that describes how people value gains and losses of different magnitudes at different moments in time.
Quantifying the consequences of changing hydroclimatic extremes on protection levels for the Rhine
NASA Astrophysics Data System (ADS)
Sperna Weiland, Frederiek; Hegnauer, Mark; Buiteveld, Hendrik; Lammersen, Rita; van den Boogaard, Henk; Beersma, Jules
2017-04-01
The Dutch method for quantifying the magnitude and frequency of occurrence of discharge extremes in the Rhine basin, and the potential influence of climate change thereon, is presented. In the Netherlands, flood protection design requires estimates of discharge extremes for return periods of 1,000 up to 100,000 years. Observed discharge records are too short to derive such extreme return discharges; therefore, extreme value assessment is based on very long synthetic discharge time-series generated with the Generator of Rainfall And Discharge Extremes (GRADE). The GRADE instrument consists of (1) a stochastic weather generator based on time series resampling of historical rainfall and temperature, (2) a hydrological model optimized following the GLUE methodology, and (3) a hydrodynamic model to simulate the propagation of flood waves based on the generated hydrological time-series. To assess the potential influence of climate change, the four KNMI'14 climate scenarios are applied. These four scenarios represent a large part of the uncertainty spanned by the GCMs used for the IPCC 5th assessment report (the CMIP5 GCM simulations under different climate forcings) and are for this purpose tailored to the Rhine and Meuse river basins. To derive the probability distributions of extreme discharges under climate change, the historical synthetic rainfall and temperature series simulated with the weather generator are transformed to the future following the KNMI'14 scenarios. For this transformation the Advanced Delta Change method, which allows the changes in the extremes to differ from those in the means, is used. Subsequently, the hydrological model is forced with the historical and future (i.e., transformed) synthetic time-series, after which the propagation of the flood waves is simulated with the hydrodynamic model to obtain the extreme discharge statistics for both current and future climate conditions.
The study shows that both for 2050 and 2085 increases in discharge extremes for the river Rhine at Lobith are projected by all four KNMI'14 climate scenarios. This poses increased requirements for flood protection design in order to prepare for changing climate conditions.
NASA Astrophysics Data System (ADS)
Sobradelo, R.; Martí, J.; Mendoza-Rosas, A. T.; Gómez, G.
2012-04-01
The Canary Islands are an active volcanic region, densely populated and visited by several million tourists every year. Nearly twenty eruptions have been reported in written chronicles over the last 600 years, suggesting that the probability of a new eruption in the near future is far from zero. This shows the importance of assessing and monitoring the volcanic hazard of the region in order to reduce and manage its potential volcanic risk, and ultimately contribute to the design of appropriate preparedness plans. Hence, probabilistic analysis of the volcanic eruption time series for the Canary Islands is an essential step in the assessment of volcanic hazard and risk in the area. Such a series describes complex processes involving different types of eruptions over different time scales. Here we propose a statistical method for calculating the probabilities of future eruptions that is most appropriate given the nature of the documented historical eruptive data. We first characterise the eruptions by their magnitudes, and then carry out a preliminary analysis of the data to establish the requirements for the statistical method. Past studies of eruptive time series used conventional statistics and treated the series as a homogeneous process. In this paper, we use a method that accounts for the time-dependence of the series and includes rare or extreme events, in the form of a few data points for large eruptions, since such data require special methods of analysis. Hence, we use a statistical method from extreme value theory. In particular, we apply a non-homogeneous Poisson process to the historical eruptive data of the Canary Islands to estimate the probability of having at least one volcanic event of a magnitude greater than one in the upcoming years.
Shortly after the publication of this method, an eruption took place on the island of El Hierro for the first time in historical times, supporting our method and contributing towards the validation of our results.
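A back-of-the-envelope version of the probability the paper estimates can be written with a homogeneous Poisson process, where the chance of at least one event in t years is 1 - exp(-λt). The rate below is a rough stand-in derived from the abstract's "nearly twenty eruptions in 600 years"; the paper itself fits a non-homogeneous process, which this sketch does not capture.

```python
import math

# For a homogeneous Poisson process with rate lambda (events/year),
# P(at least one event in t years) = 1 - exp(-lambda * t).
# The rate below is an illustrative assumption from the historical
# record (~20 eruptions in 600 years), not the paper's fitted model.
def prob_at_least_one(rate_per_year, years):
    return 1.0 - math.exp(-rate_per_year * years)

rate = 20 / 600.0                 # about 0.033 eruptions per year
p_50 = prob_at_least_one(rate, 50)
```

With these illustrative numbers the 50-year probability is already above 80%, which conveys why the authors describe the eruption probability as "far from zero"; a non-homogeneous model would let the rate itself vary with time.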
CI2 for creating and comparing confidence-intervals for time-series bivariate plots.
Mullineaux, David R
2017-02-01
Currently no method exists for calculating and comparing the confidence-intervals (CI) for the time-series of a bivariate plot. The study's aim was to develop 'CI2' as a method to calculate the CI on time-series bivariate plots, and to identify whether the CI of two bivariate time-series overlap. The test data were the knee and ankle angles from 10 healthy participants running on a motorised standard-treadmill and a non-motorised curved-treadmill. For a recommended 10+ trials, CI2 involves calculating 95% confidence-ellipses at each time-point, then taking as the CI the points on the ellipses that are perpendicular to the direction vector between the means of two adjacent time-points. Consecutive pairs of CI create convex quadrilaterals, and any overlap of these quadrilaterals at the same time, or at ±1 frame as a time-lag calculated using cross-correlations, indicates where the two time-series differ. CI2 showed no group differences between left and right legs on either treadmill, but comparisons of the same leg between treadmills showed less knee extension on the curved-treadmill before heel-strike for all participants. To improve and standardise the use of CI2 it is recommended to remove outlier time-series, use 95% confidence-ellipses, and scale the ellipse by the fixed Chi-square value as opposed to the sample-size-dependent F-value. For practical use, and to aid in standardisation or future development of CI2, Matlab code is provided. CI2 provides an effective method to quantify the CI of bivariate plots, and to explore the differences in CI between two bivariate time-series.
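The building block CI2 relies on, a 95% confidence ellipse at one time-point scaled by the fixed chi-square value (5.991 for 2 degrees of freedom) rather than the sample-size-dependent F-value, can be sketched from the 2x2 sample covariance matrix. The trial data below are made-up numbers, not the study's measurements.

```python
import math

# Sketch of a 95% confidence ellipse for one time-point of a bivariate
# series, scaled by the fixed chi-square critical value (2 d.f.).
CHI2_95_2DF = 5.991

def confidence_ellipse(points):
    """Return (centre, half-axis lengths, orientation in radians)."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    # eigenvalues of the 2x2 sample covariance matrix
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    l1, l2 = tr / 2 + disc, tr / 2 - disc
    # eigenvector for l1 gives the major-axis orientation
    angle = math.atan2(l1 - sxx, sxy) if sxy != 0 else 0.0
    axes = (math.sqrt(CHI2_95_2DF * l1),
            math.sqrt(CHI2_95_2DF * max(l2, 0.0)))
    return (mx, my), axes, angle

# Ten trials of (knee, ankle) angles at one time-point (made-up data)
trials = [(40.1, 10.2), (41.0, 9.8), (39.5, 10.5), (40.7, 10.0),
          (40.3, 9.9), (39.9, 10.4), (40.5, 10.1), (40.0, 10.3),
          (40.8, 9.7), (39.6, 10.6)]
centre, axes, angle = confidence_ellipse(trials)
```

CI2 then keeps only the two points of each ellipse perpendicular to the direction of travel between adjacent time-point means, which is what turns the per-frame ellipses into a band along the bivariate curve.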
Time is an affliction: Why ecology cannot be as predictive as physics and why it needs time series
NASA Astrophysics Data System (ADS)
Boero, F.; Kraberg, A. C.; Krause, G.; Wiltshire, K. H.
2015-07-01
Ecological systems depend on both constraints and historical contingencies, both of which shape their present observable state. In contrast to ahistorical systems, which are governed solely by constraints (i.e. laws), historical systems and their dynamics can be understood only if properly described over the course of time. Describing these dynamics and understanding long-term variability can be seen as the mission of long time series, which measure not only simple abiotic features but also complex biological variables, such as species diversity and abundances, allowing deep insights into the functioning of food webs and ecosystems in general. Long time-series are irreplaceable for understanding change and, crucially, inherent system variability, and thus for envisaging future scenarios. Notwithstanding this, current policies for funding and evaluating scientific research discourage the maintenance of long-term series, despite a clear need for long-term strategies to cope with climate change. Time series are crucial to the pursuit of the much-invoked Ecosystem Approach and to the passage from simple monitoring programs to large-scale and long-term Earth observatories, thus promoting a better understanding of the causes and effects of change in ecosystems. The few ongoing long time series in European waters must be integrated and networked so as to form the nodes of a series of observatories which, together, should allow the long-term management of the features and characteristics of European waters. Human capacity building in this area of expertise and stronger societal involvement are also urgently needed, since the expertise needed to recognize and describe species, and therefore to record them reliably in the context of time series, is rapidly vanishing from the European scientific community.
A probabilistic method for constructing wave time-series at inshore locations using model scenarios
Long, Joseph W.; Plant, Nathaniel G.; Dalyander, P. Soupy; Thompson, David M.
2014-01-01
Continuous time-series of wave characteristics (height, period, and direction) are constructed using a base set of model scenarios and simple probabilistic methods. This approach utilizes an archive of computationally intensive, highly spatially resolved numerical wave model output to develop time-series of historical or future wave conditions without performing additional, continuous numerical simulations. The archive of model output contains wave simulations from a set of model scenarios derived from an offshore wave climatology. Time-series of wave height, period, direction, and associated uncertainties are constructed at locations included in the numerical model domain. The confidence limits are derived using statistical variability of oceanographic parameters contained in the wave model scenarios. The method was applied to a region in the northern Gulf of Mexico and assessed using wave observations at 12 m and 30 m water depths. Prediction skill for significant wave height is 0.58 and 0.67 at the 12 m and 30 m locations, respectively, with similar performance for wave period and direction. The skill of this simplified, probabilistic time-series construction method is comparable to existing large-scale, high-fidelity operational wave models but provides higher spatial resolution output at low computational expense. The constructed time-series can be developed to support a variety of applications including climate studies and other situations where a comprehensive survey of wave impacts on the coastal area is of interest.
NASA Astrophysics Data System (ADS)
McCloskey, John
2008-03-01
The Sumatra-Andaman earthquake of 26 December 2004 (Boxing Day 2004) and its tsunami will endure in our memories as one of the worst natural disasters of our time. For geophysicists, the scale of the devastation and the likelihood of another equally destructive earthquake set out a series of challenges of how we might use science not only to understand the earthquake and its aftermath but also to help in planning for future earthquakes in the region. In this article a brief account of these efforts is presented. Earthquake prediction is probably impossible, but earth scientists are now able to identify particularly dangerous places for future events by developing an understanding of the physics of stress interaction. Having identified such a dangerous area, a series of numerical Monte Carlo simulations is described which allow us to get an idea of what the most likely consequences of a future earthquake are by modelling the tsunami generated by lots of possible, individually unpredictable, future events. As this article was being written, another earthquake occurred in the region, which had many expected characteristics but was enigmatic in other ways. This has spawned a series of further theories which will contribute to our understanding of this extremely complex problem.
Linear and nonlinear trending and prediction for AVHRR time series data
NASA Technical Reports Server (NTRS)
Smid, J.; Volf, P.; Slama, M.; Palus, M.
1995-01-01
The variability of the AVHRR calibration coefficient in time was analyzed using algorithms of linear and nonlinear time series analysis. Specifically, we have used spline trend modeling, autoregressive process analysis, an incremental neural network learning algorithm, and redundancy functional testing. The analysis performed on the available AVHRR data sets revealed that (1) the calibration data have nonlinear dependencies, (2) the calibration data depend strongly on the target temperature, (3) both the calibration coefficients and the temperature time series can be modeled, to a first approximation, as autonomous dynamical systems, and (4) the high-frequency residuals of the analyzed data sets are best modeled as an autoregressive process of order 10. We have dealt with a nonlinear identification problem and the problem of noise filtering (data smoothing). System identification and filtering are significant problems for AVHRR data sets. The algorithms outlined in this study can be used for future EOS missions. Prediction and smoothing algorithms for time series of calibration data provide a functional characterization of the data. These algorithms can be particularly useful when calibration data are incomplete or sparse.
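A minimal numpy sketch of the residual-modeling step: a least-squares fit of an order-10 autoregressive process to a synthetic series (the data and coefficients below are illustrative stand-ins, not the AVHRR values):

```python
import numpy as np

def fit_ar(x, order):
    """Least-squares fit of AR(p): x[t] = sum_i a_i * x[t-i] + e[t]."""
    X = np.column_stack([x[order - i - 1 : len(x) - i - 1]
                         for i in range(order)])
    y = x[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

rng = np.random.default_rng(1)
# Synthetic stand-in for the high-frequency calibration residuals:
# here an AR(2) signal, fit with order 10 as in the abstract.
n = 2000
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal()

coef = fit_ar(x, 10)
pred = np.dot(coef, x[-10:][::-1])  # one-step-ahead forecast
```

With enough data the fitted leading coefficients recover the generating process, and the same fit yields the smoothing/prediction characterization the abstract refers to.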
Predicting future forestland area: a comparison of econometric approaches.
SoEun Ahn; Andrew J. Plantinga; Ralph J. Alig
2000-01-01
Predictions of future forestland area are an important component of forest policy analyses. In this article, we test the ability of econometric land use models to accurately forecast forest area. We construct a panel data set for Alabama consisting of county-level time-series observations for the period 1964 to 1992. We estimate models using restricted data sets-namely,...
ERIC Educational Resources Information Center
Trumper, Ricardo
2006-01-01
In view of students' alternative conceptions about basic concepts in astronomy, we conducted a series of constructivist activities with future elementary and junior high school teachers aimed at changing their conceptions about the cause of seasonal changes, and of several characteristics of the Sun-Earth-Moon relative movements like Moon phases,…
Performance Predictions for the Adaptive Optics System at LCRD's Ground Station 1
NASA Technical Reports Server (NTRS)
Roberts, Lewis C., Jr.; Burruss, Rick; Roberts, Jennifer E.; Piazzolla, Sabino; Dew, Sharon; Truong, Tuan; Fregoso, Santos; Page, Norm
2015-01-01
NASA's LCRD mission will lay the foundation for future laser communication systems. We show the design of the Table Mountain ground station's AO system and time series of predicted coupling efficiency.
NASA Astrophysics Data System (ADS)
Di Piazza, A.; Cordano, E.; Eccel, E.
2012-04-01
The issue of climate change detection is considered a major challenge. In particular, high temporal resolution climate change scenarios are required in the evaluation of the effects of climate change on agricultural management (crop suitability, yields, risk assessment, etc.), energy production, and water management. In this work, a "Weather Generator" technique was used for downscaling climate change scenarios for temperature. An R package (RMAWGEN, Cordano and Eccel, 2011 - available on http://cran.r-project.org) was developed to generate synthetic daily weather conditions using the theory of vectorial auto-regressive (VAR) models. The VAR model was chosen for its ability to maintain the temporal and spatial correlations among variables. In particular, observed time series of daily maximum and minimum temperature are transformed into "new" normally-distributed variable time series, which are used to calibrate the parameters of a VAR model by ordinary least squares methods. The implemented algorithm, applied to monthly mean climatic values downscaled from Global Climate Model predictions, can therefore generate several stochastic daily scenarios in which the statistical consistency among series is preserved. Further details are given in the RMAWGEN documentation. An application is presented here using a dataset of daily temperature time series recorded at 41 different sites of the Trentino region for the period 1958-2010. Temperature time series were pre-processed to fill missing values (by a site-specifically calibrated Inverse Distance Weighting algorithm, corrected with elevation) and to remove inhomogeneities. Several climatic indices, useful for several impact assessment applications, were taken into account, and their time trends within the time series were analyzed.
The indices range from the more classical ones, such as annual mean temperatures, seasonal mean temperatures and their anomalies (from the reference period 1961-1990), to the climate change indices selected from the list recommended by the World Meteorological Organization Commission for Climatology (WMO-CCL) and the Research Programme on Climate Variability and Predictability (CLIVAR) project's Expert Team on Climate Change Detection, Monitoring and Indices (ETCCDMI). Each index was applied both to observed (and processed) data and to synthetic time series produced by the Weather Generator, over the thirty-year reference period 1981-2010, in order to validate the procedure. Climate projections were statistically downscaled for a selection of sites for the two 30-year periods 2021-2050 and 2071-2099 of the European project "Ensembles" multi-model output (scenario A1B). The use of several climatic indices strengthens the trend analysis of both the generated synthetic series and the future climate projections.
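The core VAR machinery can be sketched in a few lines of numpy; the bivariate series, coefficient matrix, and seed below are illustrative stand-ins for the normal-scored Tmax/Tmin station series handled by RMAWGEN:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic daily Tmax/Tmin anomalies with cross-correlation, standing
# in for the normally-distributed transformed series of the abstract.
n = 3000
e = rng.multivariate_normal([0, 0], [[1.0, 0.7], [0.7, 1.0]], size=n)
x = np.zeros((n, 2))
A_true = np.array([[0.7, 0.1], [0.2, 0.6]])
for t in range(1, n):
    x[t] = A_true @ x[t - 1] + e[t]

# Ordinary-least-squares estimate of the VAR(1) coefficient matrix,
# as in the calibration step: A_hat = argmin ||x[t] - A x[t-1]||^2.
B, *_ = np.linalg.lstsq(x[:-1], x[1:], rcond=None)
A_hat = B.T
resid = x[1:] - x[:-1] @ A_hat.T
cov = np.cov(resid.T)

# Generate one synthetic scenario preserving the fitted temporal and
# cross-variable correlation structure.
sim = np.zeros((n, 2))
for t in range(1, n):
    sim[t] = A_hat @ sim[t - 1] + rng.multivariate_normal([0, 0], cov)
```

RMAWGEN adds the normal-score transform, multi-site handling, and higher VAR orders on top of this basic fit-and-simulate loop.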
NASA Astrophysics Data System (ADS)
Chek, Mohd Zaki Awang; Ahmad, Abu Bakar; Ridzwan, Ahmad Nur Azam Ahmad; Jelas, Imran Md.; Jamal, Nur Faezah; Ismail, Isma Liana; Zulkifli, Faiz; Noor, Syamsul Ikram Mohd
2012-09-01
The main objective of this study is to forecast the future claims amount of Invalidity Pension Scheme (IPS). All data were derived from SOCSO annual reports from year 1972 - 2010. These claims consist of all claims amount from 7 benefits offered by SOCSO such as Invalidity Pension, Invalidity Grant, Survivors Pension, Constant Attendance Allowance, Rehabilitation, Funeral and Education. Prediction of future claims of Invalidity Pension Scheme will be made using Univariate Forecasting Models to predict the future claims among workforce in Malaysia.
Rinaldi, Luca; Vecchi, Tomaso; Fantino, Micaela; Merabet, Lotfi B; Cattaneo, Zaira
2018-03-01
In many cultures, humans conceptualize the past as behind the body and the future as in front. Whether this spatial mapping of time depends on visual experience is still not known. Here, we addressed this issue by testing early-blind participants in a space-time motor congruity task requiring them to classify a series of words as referring to the past or the future by moving their hand backward or forward. Sighted participants showed a preferential mapping between forward movements and future-words and backward movements and past-words. Critically, blind participants did not show any such preferential time-space mapping. Furthermore, in a questionnaire requiring participants to think about past and future events, blind participants did not appear to perceive the future as psychologically closer than the past, as it is the case of sighted individuals. These findings suggest that normal visual development is crucial for representing time along the sagittal space. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Simulation of an ensemble of future climate time series with an hourly weather generator
NASA Astrophysics Data System (ADS)
Caporali, E.; Fatichi, S.; Ivanov, V. Y.; Kim, J.
2010-12-01
There is evidence that climate change is occurring in many regions of the world. Climate change predictions at the local scale and at fine temporal resolution are thus needed for hydrological, ecological, geomorphological, and agricultural applications that can provide thematic insights into the corresponding impacts. Numerous downscaling techniques have been proposed to bridge the gap between the spatial scales adopted in General Circulation Models (GCMs) and regional analyses. Nevertheless, the time and spatial resolutions obtained, as well as the type of meteorological variables, may not be sufficient for detailed studies of climate change effects at local scales. In this context, this study presents a stochastic downscaling technique that makes use of an hourly weather generator to simulate time series of predicted future climate. Using a Bayesian approach, the downscaling procedure derives distributions of factors of change for several climate statistics from a multi-model ensemble of GCMs. Factors of change are sampled from their distributions using a Monte Carlo technique to fully account for the probabilistic information obtained with the Bayesian multi-model ensemble. The factors of change are subsequently applied to the statistics derived from observations to re-evaluate the parameters of the weather generator. The weather generator can reproduce a wide set of climate variables and statistics over a range of temporal scales, from extremes to the low-frequency inter-annual variability. The final result of the procedure is an ensemble of hourly time series of meteorological variables that can be considered representative of future climate, as inferred from the GCMs. The generated ensemble of scenarios also accounts for the uncertainty arising from the multiple GCMs used in downscaling.
Applications of the procedure in reproducing present and future climates are presented for different locations world-wide: Tucson (AZ), Detroit (MI), and Firenze (Italy). The stochastic downscaling is carried out with eight GCMs from the CMIP3 multi-model dataset (IPCC AR4, A1B scenario).
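The factor-of-change Monte Carlo step can be illustrated with a minimal numpy sketch (all numbers below are invented: a hypothetical 8-GCM ensemble of multiplicative factors for monthly mean precipitation, with a lognormal distribution standing in for the Bayesian posterior):

```python
import numpy as np

rng = np.random.default_rng(3)

# Observed monthly mean precipitation (mm/day) at a hypothetical site.
obs_monthly_mean = np.array([2.1, 1.9, 1.6, 1.3, 0.9, 0.4,
                             0.2, 0.3, 0.8, 1.5, 2.0, 2.2])

# Hypothetical per-month factors of change, one per GCM; the Bayesian
# step of the abstract turns these into a distribution, approximated
# here as lognormal around the multi-model mean.
ensemble_factors = rng.lognormal(mean=np.log(0.9), sigma=0.1,
                                 size=(8, 12))      # 8 GCMs x 12 months
mu = np.log(ensemble_factors).mean(axis=0)
sd = np.log(ensemble_factors).std(axis=0)

# Monte Carlo sampling of the factors, then application to the
# observed statistics, yields an ensemble of target statistics with
# which the weather generator would be re-parameterized.
n_draws = 500
draws = rng.lognormal(mean=mu, sigma=sd, size=(n_draws, 12))
future_stats = draws * obs_monthly_mean   # (500, 12) future monthly means
```

Each sampled row would drive one re-parameterization of the hourly generator, so the final ensemble of hourly series carries the multi-GCM uncertainty through to the output.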
Interannual Change Detection of Mediterranean Seagrasses Using RapidEye Image Time Series
Traganos, Dimosthenis; Reinartz, Peter
2018-01-01
Recent research studies have highlighted the decrease in the coverage of Mediterranean seagrasses, mainly due to anthropogenic activities. The lack of data on the distribution of these significant aquatic plants complicates the quantification of their decreasing tendency. While Mediterranean seagrasses are declining, satellite remote sensing technology is growing at an unprecedented pace, resulting in a wealth of spaceborne image time series. Here, we exploit recent advances in high spatial resolution sensors and machine learning to study Mediterranean seagrasses. We process a multispectral RapidEye time series between 2011 and 2016 to detect interannual seagrass dynamics in 888 submerged hectares of the Thermaikos Gulf, NW Aegean Sea, Greece (eastern Mediterranean Sea). We assess the extent change of two Mediterranean seagrass species, the dominant Posidonia oceanica and Cymodocea nodosa, following atmospheric and analytical water column correction, as well as machine learning classification, using Random Forests, of the RapidEye time series. Prior corrections are necessary to untangle the initially weak signal of the submerged seagrass habitats from satellite imagery. The central results of this study show that P. oceanica seagrass area has declined by 4.1%, with a trend of −11.2 ha/yr, while C. nodosa seagrass area has increased by 17.7% with a trend of +18 ha/yr throughout the 5-year study period. Trends of change in spatial distribution of seagrasses in the Thermaikos Gulf site are in line with reported trends in the Mediterranean. Our presented methodology could be a time- and cost-effective method toward the quantitative ecological assessment of seagrass dynamics elsewhere in the future. From small meadows to whole coastlines, knowledge of aquatic plant dynamics could resolve decline or growth trends and accurately highlight key units for future restoration, management, and conservation. PMID:29467777
Grootswagers, Tijl; Wardle, Susan G; Carlson, Thomas A
2017-04-01
Multivariate pattern analysis (MVPA) or brain decoding methods have become standard practice in analyzing fMRI data. Although decoding methods have been extensively applied in brain-computer interfaces, these methods have only recently been applied to time series neuroimaging data such as MEG and EEG to address experimental questions in cognitive neuroscience. In a tutorial style review, we describe a broad set of options to inform future time series decoding studies from a cognitive neuroscience perspective. Using example MEG data, we illustrate the effects that different options in the decoding analysis pipeline can have on experimental results where the aim is to "decode" different perceptual stimuli or cognitive states over time from dynamic brain activation patterns. We show that decisions made at both preprocessing (e.g., dimensionality reduction, subsampling, trial averaging) and decoding (e.g., classifier selection, cross-validation design) stages of the analysis can significantly affect the results. In addition to standard decoding, we describe extensions to MVPA for time-varying neuroimaging data including representational similarity analysis, temporal generalization, and the interpretation of classifier weight maps. Finally, we outline important caveats in the design and interpretation of time series decoding experiments.
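A toy time-resolved decoding pipeline can make the basic scheme concrete. The sketch below uses entirely synthetic "MEG-like" data and a nearest-centroid classifier with k-fold cross-validation as a simple stand-in for the classifiers surveyed in the review; trial counts, channel counts, and the onset time are all invented:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic data: trials x channels x timepoints, two stimulus classes
# that become discriminable after "stimulus onset" at timepoint 20.
n_trials, n_chan, n_time = 80, 32, 60
X = rng.normal(size=(n_trials, n_chan, n_time))
y = np.repeat([0, 1], n_trials // 2)
X[y == 1, :, 20:] += 0.5          # class effect from t = 20 onward

def decode_over_time(X, y, n_folds=5):
    """Nearest-centroid decoding at each timepoint with k-fold
    cross-validation; returns mean accuracy per timepoint."""
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, n_folds)
    acc = np.zeros(X.shape[2])
    for t in range(X.shape[2]):
        Xt = X[:, :, t]
        correct = 0
        for f in folds:
            train = np.setdiff1d(idx, f)
            c0 = Xt[train][y[train] == 0].mean(axis=0)
            c1 = Xt[train][y[train] == 1].mean(axis=0)
            d0 = np.linalg.norm(Xt[f] - c0, axis=1)
            d1 = np.linalg.norm(Xt[f] - c1, axis=1)
            correct += np.sum((d1 < d0) == y[f])
        acc[t] = correct / len(y)
    return acc

acc = decode_over_time(X, y)   # near chance before t=20, high after
```

Swapping the classifier, adding dimensionality reduction, or averaging trials before decoding, the choices the review examines, all slot into this same per-timepoint loop.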
NASA Astrophysics Data System (ADS)
Clarke, Hannah; Done, Fay; Casadio, Stefano; Mackin, Stephen; Dinelli, Bianca Maria; Castelli, Elisa
2016-08-01
The long time-series of observations made by the Along Track Scanning Radiometer (ATSR) missions represents a valuable resource for a wide range of research and EO applications. With the advent of ESA's Long-Term Data Preservation (LTDP) programme, thought has turned to the preservation and improved understanding of such long time-series, to support their continued exploitation in both existing and new areas of research, bringing the possibility of improving the existing data set and of informing and contributing towards future missions. For this reason, the 'Long Term Stability of the ATSR Instrument Series: SWIR Calibration, Cloud Masking and SAA' project, commonly known as the ATSR Long Term Stability (or ALTS) project, is designed to explore the key characteristics of the data set and new and innovative ways of enhancing and exploiting it. Work has focussed on: a new approach to the assessment of Short Wave Infra-Red (SWIR) channel calibration; development of a new method for Total Column Water Vapour (TCWV) retrieval; study of the South Atlantic Anomaly (SAA); Radiative Transfer (RT) modelling for ATSR; providing AATSR observations with their location in the original instrument grid; strategies for the retrieval and archiving of historical ATSR documentation; study of TCWV retrieval over land; and development of new methods for cloud masking. This paper provides an overview of these activities and illustrates the importance of preserving and understanding 'old' data for continued use in the future.
The Supply of Part-Time Higher Education in the UK. Research Report
ERIC Educational Resources Information Center
Callender, Claire; Birkbeck, Anne Jamieson; Mason, Geoff
2010-01-01
This report explores the supply of part-time higher education in the UK, with particular consideration to the study of part-time undergraduate provision in England. It is the final publication in the series of reports on individual student markets that were commissioned by Universities UK following the publication of the reports on the Future size…
ERIC Educational Resources Information Center
Ginevra, Maria Cristina; Di Maggio, Ilaria; Nota, Laura; Soresi, Salvatore
2017-01-01
A career intervention based on life design approach was devised for a group of young adults at risk for the process of career construction. It was aimed at fostering a series of resources useful to cope with career transitions, to encourage reflection on the future, to identify one's own strengths, and to plan future projects. Results of the study…
Time-series analysis of the transcriptome and proteome of Escherichia coli upon glucose repression.
Borirak, Orawan; Rolfe, Matthew D; de Koning, Leo J; Hoefsloot, Huub C J; Bekker, Martijn; Dekker, Henk L; Roseboom, Winfried; Green, Jeffrey; de Koster, Chris G; Hellingwerf, Klaas J
2015-10-01
Time-series transcript and protein profiles were measured upon initiation of carbon catabolite repression (CCR) in Escherichia coli, in order to investigate the extent of post-transcriptional control in this prototypical response. A glucose-limited chemostat culture was used as the CCR-free reference condition. Stopping the pump and simultaneously adding a pulse of glucose that saturated the cells for at least 1 h was used to initiate the glucose response. Samples were collected and subjected to quantitative time-series analysis of both the transcriptome (using microarray analysis) and the proteome (through a combination of 15N metabolic labeling and mass spectrometry). Changes in the transcriptome and the corresponding proteome were analyzed using statistical procedures designed specifically for time-series data. By comparison of the two sets of data, a total of 96 genes were identified that are post-transcriptionally regulated. This gene list provides candidates for future in-depth investigation of the molecular mechanisms involved in post-transcriptional regulation during carbon catabolite repression in E. coli, such as the involvement of small RNAs.
Hostetler, S.W.; Alder, J.R.; Allan, A.M.
2011-01-01
We have completed an array of high-resolution simulations of present and future climate over Western North America (WNA) and Eastern North America (ENA) by dynamically downscaling global climate simulations using a regional climate model, RegCM3. The simulations are intended to provide long time series of internally consistent surface and atmospheric variables for use in climate-related research. In addition to providing high-resolution weather and climate data for the past, present, and future, we have developed an integrated data flow and methodology for processing, summarizing, viewing, and delivering the climate datasets to a wide range of potential users. Our simulations were run over 50- and 15-kilometer model grids in an attempt to capture more of the climatic detail associated with processes such as topographic forcing than can be captured by general circulation models (GCMs). The simulations were run using output from four GCMs. All simulations span the present (for example, 1968-1999) and common periods of the future (2040-2069), and two simulations continuously cover 2010-2099. The trace gas concentrations in our simulations were the same as those of the GCMs: the IPCC 20th-century time series for 1968-1999 and the A2 time series for simulations of the future. We demonstrate that RegCM3 is capable of producing present-day annual and seasonal climatologies of air temperature and precipitation that are in good agreement with observations. Important features of the high-resolution climatology of temperature, precipitation, snow water equivalent (SWE), and soil moisture are consistently reproduced in all model runs over WNA and ENA. The simulations provide a potential range of future climate change for selected decades and display common patterns in the direction and magnitude of changes. As expected, there are some model-to-model differences that limit interpretability and give rise to uncertainties.
Here, we provide background information about the GCMs and the RegCM3, a basic evaluation of the model output and examples of simulated future climate. We also provide information needed to access the web applications for visualizing and downloading the data, and give complete metadata that describe the variables in the datasets.
NASA Astrophysics Data System (ADS)
Baydaroğlu, Özlem; Koçak, Kasım; Duran, Kemal
2018-06-01
Prediction of the water amount that will enter reservoirs in the following month is of vital importance, especially for semi-arid countries like Turkey. Climate projections emphasize that water scarcity will be one of the serious problems of the future. This study presents a methodology for predicting river flow for the subsequent month from the time series of observed monthly river flow, using hybrid models built around support vector regression (SVR). Monthly river flow observed over the period 1940-2012 for the Kızılırmak River in Turkey has been used for training the method, which was then applied for predictions over a period of 3 years. SVR is a specific implementation of support vector machines (SVMs), which transforms the observed input data time series into a high-dimensional feature space (input matrix) by way of a kernel function and performs a linear regression in this space. SVR requires a special input matrix. The input matrix was produced by wavelet transforms (WT), singular spectrum analysis (SSA), and a chaotic approach (CA) applied to the input time series. WT convolves the original time series into a series of wavelets, and SSA decomposes the time series into a trend, an oscillatory component, and a noise component by singular value decomposition. CA uses a phase space formed by trajectories, which represent the dynamics producing the time series. All three methods for producing the input matrix for the SVR proved successful, while the SVR-WT combination resulted in the highest coefficient of determination and the lowest mean absolute error.
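The chaotic-approach input matrix (a delay embedding) and the kernel regression step can be sketched with numpy. Note the hedges: the flow series below is synthetic, and RBF kernel ridge regression is used as a simple stand-in for SVR (same kernel trick, different loss function):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic monthly "river flow": seasonal cycle plus noise, a stand-in
# for the observed Kizilirmak record.
t = np.arange(360)
flow = 50 + 30 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 3, 360)

def embed(x, dim, lag=1):
    """Phase-space (delay) embedding: each row collects dim lagged
    values -- the 'chaotic approach' input matrix of the abstract."""
    rows = len(x) - dim * lag
    X = np.column_stack([x[i * lag : i * lag + rows] for i in range(dim)])
    y = x[dim * lag :]
    return X, y

def kernel_ridge_predict(Xtr, ytr, Xte, gamma=1e-3, alpha=1.0):
    """RBF kernel ridge regression, used here as a simple stand-in
    for the kernel regression performed by SVR."""
    def K(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    ym = ytr.mean()
    coef = np.linalg.solve(K(Xtr, Xtr) + alpha * np.eye(len(Xtr)), ytr - ym)
    return K(Xte, Xtr) @ coef + ym

X, y = embed(flow, dim=12)       # predict next month from last 12
split = 300
pred = kernel_ridge_predict(X[:split], y[:split], X[split:])
rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
```

Replacing `embed` with wavelet or SSA components of the series reproduces, in spirit, the WT and SSA variants compared in the study.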
Volcanic hazard assessment for the Canary Islands (Spain) using extreme value theory
NASA Astrophysics Data System (ADS)
Sobradelo, R.; Martí, J.; Mendoza-Rosas, A. T.; Gómez, G.
2011-10-01
The Canary Islands are an active volcanic region, densely populated and visited by several million tourists every year. Nearly twenty eruptions have been reported in written chronicles in the last 600 yr, suggesting that the probability of a new eruption in the near future is far from zero. This shows the importance of assessing and monitoring the volcanic hazard of the region in order to reduce and manage its potential volcanic risk, and ultimately to contribute to the design of appropriate preparedness plans. Hence, probabilistic analysis of the volcanic eruption time series for the Canary Islands is an essential step in the assessment of volcanic hazard and risk in the area. Such a series describes complex processes involving different types of eruptions over different time scales. Here we propose a statistical method for calculating the probabilities of future eruptions that is appropriate given the nature of the documented historical eruptive data. We first characterize the eruptions by their magnitudes, and then carry out a preliminary analysis of the data to establish the requirements for the statistical method. Past studies of eruptive time series used conventional statistics and treated the series as a homogeneous process. In this paper, we use a method that accounts for the time-dependence of the series and includes rare or extreme events, in the form of a few data points for large eruptions, since such data require special methods of analysis. Hence, we use a statistical method from extreme value theory. In particular, we apply a non-homogeneous Poisson process to the historical eruptive data of the Canary Islands to estimate the probability of having at least one volcanic event of a magnitude greater than one in the upcoming years. This is done in three steps: first, we analyze the historical eruptive series to assess independence and homogeneity of the process.
Second, we perform a Weibull analysis of the distribution of repose time between successive eruptions. Third, we analyze the non-homogeneous Poisson process with a generalized Pareto distribution as the intensity function.
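The stationary-Poisson baseline that the non-homogeneous model improves upon can be illustrated in a few lines; the eruption years below are illustrative values for a 600-year window, not the actual Canary catalog:

```python
import numpy as np

# Hypothetical catalog: years of eruptions with magnitude > 1 over a
# 600-year documented window (illustrative, not the real record).
eruption_years = np.array([1430, 1492, 1585, 1646, 1677, 1704, 1706,
                           1730, 1793, 1798, 1824, 1909, 1949, 1971])

# Stationary-Poisson baseline: with rate lambda, the probability of
# at least one event in the next T years is 1 - exp(-lambda * T).
window = 600.0
lam = len(eruption_years) / window
p_20yr = 1.0 - np.exp(-lam * 20)

# Repose times feed the Weibull step of the abstract; the empirical
# coefficient of variation indicates whether the process is more
# clustered (CV > 1) or more regular (CV < 1) than Poisson (CV = 1).
repose = np.diff(eruption_years)
cv = repose.std() / repose.mean()
```

The paper's non-homogeneous Poisson process with a generalized Pareto intensity replaces the constant `lam` with a time-dependent rate, which is what lets the few large-magnitude events be treated with extreme value methods.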
Adaptation options to future climate of maize crop in Southern Italy examined using thermal sums
NASA Astrophysics Data System (ADS)
Di Tommasi, P.; Alfieri, S. M.; Bonfante, A.; Basile, A.; De Lorenzi, F.; Menenti, M.
2012-04-01
Future climate scenarios predict substantial changes in air temperature within a few decades, and agriculture needs to increase its capacity for adaptation, both by changing the spatial distribution of crops and by shifting the timing of management. In this context, predicting the future behaviour of crops with respect to the present climate can be useful for farm and landscape management. In this work, thermal sums were used to simulate a maize crop in a future scenario, in terms of the length of the growing season and of the intervals between the main phenological stages. The area under study is the Sele plain (Campania Region), a pedo-climatically homogeneous area and one of the most agriculturally advanced and relevant flatlands in Southern Italy. Maize was selected for the present study since it is extensively grown in the Sele Plain for water buffalo feeding. Daily time series of climatic data for the area under study were generated within the Italian project AGROSCENARI, and include maximum and minimum temperature and precipitation. The 1961-1990 and 1998-2008 periods were compared to a future climate scenario (2021-2050). Future time series were generated using a statistical downscaling technique (Tomozeiu et al., 2007) from general circulation models (AOGCMs). Differences in crop development length were calculated for different maize varieties under 3 management options for sowing time: the customary date (typical for the area), before the customary date, and after it. The interactions between the future thermal regime and the length of the growing season under the different management options were analyzed. Moreover, the frequency of spells of high temperatures during anthesis was examined. The feasibility of the early sowing option was discussed in relation to field trafficability at the beginning of the crop cycle. The work was carried out within the Italian national project AGROSCENARI, funded by the Ministry for Agricultural, Food and Forest Policies (MIPAAF, D.M. 8608/7303/2008).
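The thermal-sum mechanism can be sketched with a simple growing-degree-day accumulation; the temperature series, base temperature, and degree-day threshold below are illustrative, not the Sele Plain data or the varieties' actual requirements:

```python
import numpy as np

rng = np.random.default_rng(6)

def growing_degree_days(tmax, tmin, base=10.0):
    """Daily thermal sum: mean temperature above a base threshold,
    accumulated over the season (simple GDD formulation)."""
    tmean = (tmax + tmin) / 2.0
    return np.cumsum(np.maximum(tmean - base, 0.0))

# Synthetic daily temperatures for one 180-day season: seasonal
# course plus day-to-day noise.
days = np.arange(180)
tmin = 8 + 10 * np.sin(np.pi * days / 180) + rng.normal(0, 1.5, 180)
tmax = tmin + 10

gdd = growing_degree_days(tmax, tmin)

# Days needed to reach a phenological stage requiring, say, 800
# degree-days; a uniformly warmer scenario shortens the interval.
maturity = int(np.searchsorted(gdd, 800))
gdd_future = growing_degree_days(tmax + 2, tmin + 2)
maturity_future = int(np.searchsorted(gdd_future, 800))
```

Comparing `maturity` under present and future series, for each variety's threshold and each sowing date, is precisely the kind of calculation the abstract describes.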
Multiresolution forecasting for futures trading using wavelet decompositions.
Zhang, B L; Coggins, R; Jabri, M A; Dersch, D; Flower, B
2001-01-01
We investigate the effectiveness of a financial time-series forecasting strategy that exploits the multiresolution property of the wavelet transform. A financial series is decomposed into an overcomplete, shift-invariant, scale-related representation. In transform space, each individual wavelet series is modeled by a separate multilayer perceptron (MLP). We apply the Bayesian method of automatic relevance determination to choose short past windows (short-term history) for the inputs to the MLPs at lower scales and long past windows (long-term history) at higher scales. To form the overall forecast, the individual forecasts are then recombined either by the linear reconstruction property of the inverse transform with the chosen autocorrelation shell representation, or by another perceptron which learns the weight of each scale in the prediction of the original time series. The forecast results are then passed to a money management system to generate trades.
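A minimal shift-invariant decomposition with the additive reconstruction property the recombination step relies on can be sketched as follows (a simple moving-average à trous scheme on synthetic data, standing in for the authors' autocorrelation shell filters):

```python
import numpy as np

def atrous_decompose(x, levels=3):
    """Shift-invariant a trous decomposition: successive smoothings
    s_0, s_1, ..., with details d_j = s_{j-1} - s_j, so that
    x = s_J + sum_j d_j holds exactly (linear reconstruction)."""
    s = x.astype(float)
    details = []
    for j in range(levels):
        w = 2 ** (j + 1) + 1                  # widening smoothing window
        kernel = np.ones(w) / w
        pad = np.pad(s, w // 2, mode="edge")  # keep length constant
        s_next = np.convolve(pad, kernel, mode="valid")
        details.append(s - s_next)
        s = s_next
    return details, s

rng = np.random.default_rng(7)
t = np.arange(512)
series = np.sin(2 * np.pi * t / 64) + 0.3 * rng.normal(size=512)

details, smooth = atrous_decompose(series)
# Linear reconstruction: summing the components returns the series,
# which is what allows per-scale forecasts to be recombined additively.
recon = smooth + sum(details)
```

In the strategy described, each entry of `details` (and the final `smooth`) would be forecast by its own MLP, and the per-scale forecasts summed, or weighted by a further perceptron, to forecast the original series.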
Anelli, Filomena; Ciaramelli, Elisa; Arzy, Shahar; Frassinetti, Francesca
2016-11-01
Accumulating evidence suggests that humans process time and space in similar ways. Humans represent time along a spatial continuum, and the perception of temporal durations can be altered through manipulations of spatial attention by prismatic adaptation (PA). Here, we investigated whether PA-induced manipulations of spatial attention can also influence more conceptual aspects of time, such as humans' ability to travel mentally back and forward in time (mental time travel, MTT). Before and after leftward- and rightward-PA, participants projected themselves into the past, present or future (i.e., self-projection), and, for each condition, determined whether a series of events were located in the past or the future with respect to that specific self-location in time (i.e., self-reference). The results demonstrated that leftward and rightward shifts of spatial attention facilitated the recognition of past and future events, respectively. These findings suggest that spatial attention affects the temporal processing of the human self.
Light-weight Parallel Python Tools for Earth System Modeling Workflows
NASA Astrophysics Data System (ADS)
Mickelson, S. A.; Paul, K.; Xu, H.; Dennis, J.; Brown, D. I.
2015-12-01
With the growth in computing power over the last 30 years, earth system modeling codes have become increasingly data-intensive. As an example, it is expected that the data required for the next Intergovernmental Panel on Climate Change (IPCC) Assessment Report (AR6) will increase by more than 10x to an expected 25PB per climate model. Faced with this daunting challenge, developers of the Community Earth System Model (CESM) have chosen to change the format of their data for long-term storage from time-slice to time-series, in order to reduce the download bandwidth needed for later analysis and post-processing by climate scientists. Hence, efficient tools are required to (1) perform the transformation of the data from time-slice to time-series format and to (2) compute climatology statistics, needed for many diagnostic computations, on the resulting time-series data. To address the first of these two challenges, we have developed a parallel Python tool for converting time-slice model output to time-series format. To address the second, we have developed a parallel Python tool to perform fast time-averaging of time-series data. These tools are designed to be light-weight and easy to install, to have very few dependencies, and to be easily inserted into the Earth system modeling workflow with negligible disruption. In this work, we present the motivation, approach, and testing results of these two light-weight parallel Python tools, as well as our plans for future research and development.
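The two transformations can be illustrated with a toy, serial sketch; the dictionary-based data layout here is hypothetical and stands in for the NetCDF handling and parallelism of the actual tools:

```python
# Illustrative sketch of the two post-processing steps named above:
# (1) convert time-slice records (one mapping of all variables per time
# step) into time-series records (one list per variable), and
# (2) compute a climatology (mean over years for each month) from a
# monthly time series.  Real model output lives in NetCDF files and is
# processed in parallel; plain dicts and lists stand in for that here.

def slices_to_series(slices):
    """slices: list of {var_name: value} mappings, one per time step."""
    series = {}
    for snapshot in slices:
        for var, value in snapshot.items():
            series.setdefault(var, []).append(value)
    return series

def monthly_climatology(series):
    """series: monthly values, length a multiple of 12 -> 12 monthly means."""
    assert len(series) % 12 == 0
    years = len(series) // 12
    return [sum(series[y * 12 + m] for y in range(years)) / years
            for m in range(12)]
```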
NASA Technical Reports Server (NTRS)
Jackson, T. J.; Shiue, J.; Oneill, P.; Wang, J.; Fuchs, J.; Owe, M.
1984-01-01
The verification of a multi-sensor aircraft system developed to study soil moisture applications is discussed. This system consisted of a three-beam push broom L band microwave radiometer, a thermal infrared scanner, a multispectral scanner, video and photographic cameras, and an onboard navigational instrument. Ten flights were made over agricultural sites in Maryland and Delaware with little or no vegetation cover. Comparisons of aircraft and ground measurements showed that the system was reliable and consistent. Time series analysis of microwave and evaporation data showed a strong similarity that indicates a potential direction for future research.
Kwasniok, Frank
2013-11-01
A time series analysis method for predicting the probability density of a dynamical system is proposed. A nonstationary parametric model of the probability density is estimated from data within a maximum likelihood framework and then extrapolated to forecast the future probability density and explore the system for critical transitions or tipping points. Parameter uncertainty is fully and systematically accounted for. The technique is generic and independent of the underlying dynamics of the system. The method is verified on simulated data and then applied to prediction of Arctic sea-ice extent.
Burby, Joshua W.; Lacker, Daniel
2016-01-01
Systems as diverse as the interacting species in a community, alleles at a genetic locus, and companies in a market are characterized by competition (over resources, space, capital, etc.) and adaptation. Neutral theory, built around the hypothesis that individual performance is independent of group membership, has found utility across the disciplines of ecology, population genetics, and economics, both because of the success of the neutral hypothesis in predicting system properties and because deviations from these predictions provide information about the underlying dynamics. However, most tests of neutrality are weak, based on static system properties such as species-abundance distributions or the number of singletons in a sample. Time-series data provide a window onto a system's dynamics, and should furnish tests of the neutral hypothesis that are more powerful at detecting deviations from neutrality and more informative about the type of competitive asymmetry that drives the deviation. Here, we present a neutrality test for time-series data. We apply this test to several microbial and financial time series and find that most of these systems are not neutral. Our test isolates the covariance structure of neutral competition, thus facilitating further exploration of the nature of asymmetry in the covariance structure of competitive systems. Much as neutrality tests from population genetics based on relative abundance distributions have enabled researchers to scan entire genomes for genes under selection, we anticipate that our time-series test will be useful for quick significance tests of neutrality across a range of ecological, economic, and sociological systems for which time-series data are available. Future work can use our test to categorize and compare the dynamic fingerprints of particular competitive asymmetries (frequency dependence, volatility smiles, etc.) to improve forecasting and management of complex adaptive systems. PMID:27689714
The Impact of Nature Experience on Willingness to Support Conservation
Zaradic, Patricia A.; Pergams, Oliver R. W.; Kareiva, Peter
2009-01-01
We hypothesized that willingness to financially support conservation depends on one's experience with nature. In order to test this hypothesis, we used a novel time-lagged correlation analysis to examine time series data on nature participation and evaluate its relationship with future conservation support (measured as contributions to conservation NGOs). Our results suggest that the type and timing of nature experience may determine future conservation investment. Time spent hiking or backpacking is correlated with increased conservation contributions 11–12 years later. On the other hand, contributions are negatively correlated with past time spent on activities such as public lands visitation or fishing. Our results suggest that each hiker or backpacker translates to $200–$300 annually in future NGO contributions. We project that the recent decline in popularity of hiking and backpacking will negatively impact conservation NGO contributions from approximately 2010–2011 through at least 2018. PMID:19809511
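A minimal sketch of a time-lagged correlation analysis of this kind, assuming plain Pearson correlation at each lag; the study's actual estimator, units (years) and data are not reproduced here:

```python
# Illustrative time-lagged correlation: Pearson r between x(t) and
# y(t + lag) for a range of forward lags, returning the lag with the
# highest correlation.  Lag units are arbitrary time steps here.

def pearson(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = (sum((a - mu) ** 2 for a in u) * sum((b - mv) ** 2 for b in v)) ** 0.5
    return num / den

def best_lag(x, y, max_lag):
    """Correlate x against y shifted forward by 0..max_lag steps."""
    scores = {}
    for lag in range(max_lag + 1):
        scores[lag] = pearson(x[:len(x) - lag], y[lag:])
    return max(scores, key=scores.get), scores
```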
Exploring fractal behaviour of blood oxygen saturation in preterm babies
NASA Astrophysics Data System (ADS)
Zahari, Marina; Hui, Tan Xin; Zainuri, Nuryazmin Ahmat; Darlow, Brian A.
2017-04-01
Recent evidence has been emerging that oxygenation instability in preterm babies could lead to an increased risk of retinal injury such as retinopathy of prematurity. There is potential for disease severity to be better understood using nonlinear methods for time series data, such as fractal theories [1]. Fractal theories have been employed by researchers in various disciplines motivated to look into the behaviour or structure of irregular fluctuations in temporal data. In this study, an investigation was carried out to examine whether fractal behaviour could be detected in blood oxygen time series. Detection of the presence of fractals in the oxygen data of preterm infants was performed using the methods of power spectrum, empirical probability distribution function and autocorrelation function. The results from these fractal identification methods indicate the possibility that these data exhibit fractal nature. Subsequently, a fractal framework for future research on oxygen time series was suggested.
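The power-spectrum check mentioned above can be sketched as follows: estimate the spectrum with a plain DFT periodogram and fit the log-log slope beta in P(f) ~ f^(-beta); a clearly nonzero beta over a range of frequencies is the power-spectrum signature of self-affine (fractal) behaviour. This is an illustrative implementation, not the authors' code:

```python
import math, cmath

# Power-spectrum test for fractal scaling: compute a periodogram via a
# direct DFT (O(n^2), fine for short series) and fit the slope of
# log-power against log-frequency by least squares.

def periodogram(x):
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(1, n // 2)]          # skip the zero frequency

def spectral_slope(x):
    p = periodogram(x)
    logs = [(math.log(k + 1), math.log(pk))
            for k, pk in enumerate(p) if pk > 0]
    m = len(logs)
    mf = sum(f for f, _ in logs) / m
    mp = sum(q for _, q in logs) / m
    b = sum((f - mf) * (q - mp) for f, q in logs) / \
        sum((f - mf) ** 2 for f, _ in logs)
    return -b   # beta, so that P(f) ~ f^(-beta)
```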
Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis
NASA Astrophysics Data System (ADS)
Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.
2015-06-01
This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index (PSEi) and its volatility using a finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH and PARCH models. The study also aimed to find out the reasons behind the behavior of the PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers' Price Index and terms of trade - can be used in projecting future values of the PSEi; this was examined using the Granger Causality Test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5)-ARCH(1). Consumer Price Index, crude oil price and foreign exchange rate were concluded to Granger-cause the Philippine Stock Exchange Composite Index.
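The ARCH(1) component of the selected model can be illustrated with a hedged sketch: the conditional variance is sigma_t^2 = omega + alpha * e_{t-1}^2, and a quick estimate of (omega, alpha) comes from regressing squared residuals on their lag (the regression behind Engle's ARCH test). The paper fits ARIMA(1,1,5)-ARCH(1) by maximum likelihood, which is not reproduced here:

```python
import random

# Simulate an ARCH(1) process and recover (omega, alpha) by OLS of
# e_t^2 on e_{t-1}^2, since E[e_t^2 | e_{t-1}] = omega + alpha * e_{t-1}^2.
# This is a consistent quick estimate, not the MLE used in practice.

def simulate_arch1(n, omega, alpha, seed=1):
    random.seed(seed)
    e, prev = [], 0.0
    for _ in range(n):
        var = omega + alpha * prev ** 2
        prev = random.gauss(0.0, 1.0) * var ** 0.5
        e.append(prev)
    return e

def fit_arch1(e):
    y = [v ** 2 for v in e[1:]]
    x = [v ** 2 for v in e[:-1]]
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    alpha = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    omega = my - alpha * mx
    return omega, alpha
```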
Cascade Error Projection with Low Bit Weight Quantization for High Order Correlation Data
NASA Technical Reports Server (NTRS)
Duong, Tuan A.; Daud, Taher
1998-01-01
In this paper, we reinvestigate the solution of the chaotic time series prediction problem using a neural network approach. The nature of this problem is such that the data sequences never repeat; rather, they lie in a chaotic regime. However, these data sequences are correlated in high order between past, present, and future data. We use the Cascade Error Projection (CEP) learning algorithm to capture the high-order correlation between past and present data to predict future data under limited weight-quantization constraints. This helps to predict future information, providing better timely estimation for intelligent control systems. In our earlier work, it was shown that CEP can sufficiently learn the 5- to 8-bit parity problem with 4 or more bits of weight quantization, and a color segmentation problem with 7 or more bits. In this paper, we demonstrate that chaotic time series can be learned and generalized well with as low as 4-bit weight quantization using round-off and truncation techniques. The results show that generalization suffers less as more bits of weight quantization become available, and that error surfaces obtained with the round-off technique are more symmetric around zero than those obtained with the truncation technique. This study suggests that CEP is an implementable learning technique for hardware consideration.
An evaluation of Dynamic TOPMODEL for low flow simulation
NASA Astrophysics Data System (ADS)
Coxon, G.; Freer, J. E.; Quinn, N.; Woods, R. A.; Wagener, T.; Howden, N. J. K.
2015-12-01
Hydrological models are essential tools for drought risk management, often providing input to water resource system models, aiding our understanding of low flow processes within catchments and providing low flow predictions. However, simulating low flows and droughts is challenging, as hydrological systems often demonstrate threshold effects in connectivity, non-linear groundwater contributions and a greater influence of water resource system elements during low flow periods. These dynamic processes are typically not well represented in commonly used hydrological models due to data and model limitations. Furthermore, calibrated or behavioural models may not be effectively evaluated during more extreme drought periods. A better understanding of the processes that occur during low flows, and of how these are represented within models, is thus required if we want to provide robust and reliable predictions of future drought events. In this study, we assess the performance of Dynamic TOPMODEL for low flow simulation. Dynamic TOPMODEL was applied to a number of UK catchments in the Thames region using time series of observed rainfall and potential evapotranspiration data that capture multiple historic droughts over a period of several years. Model performance was assessed against the observed discharge time series using a limits-of-acceptability framework, which included uncertainty in the discharge time series. We evaluate the models against multiple signatures of catchment low-flow behaviour and investigate differences in model performance between catchments, between model diagnostics and between low flow periods. We also consider the impact of surface water and groundwater abstractions and discharges on the observed discharge time series and how this affects the model evaluation. From this analysis of model performance, we suggest future improvements to Dynamic TOPMODEL to improve the representation of low flow processes within the model structure.
NASA Astrophysics Data System (ADS)
Ruhi, A.; Olden, J. D.; Sabo, J. L.
2015-12-01
In the American Southwest, hydrologic drought has become a new normal as a result of increasing human appropriation of freshwater resources and increased aridity associated with global warming. Although drought has often been touted as a threat to freshwater biodiversity, connecting drought to the extinction risk of highly imperiled faunas remains a challenge. Here we combine time-series methods from signal processing and econometrics to analyze a spatially comprehensive and long-term dataset to link discharge variation and community abundance of fish across the American Southwest. This novel time series framework identifies ongoing trends in daily discharge anomalies across the Southwest, quantifies the effect of the historical hydrologic drivers on fish community abundance, and allows us to simulate species trajectories and range-wide risk of decline (quasiextinction) under scenarios of future climate. Spectral anomalies have declined over the last 30 years in at least a quarter of the stream gaging stations across the American Southwest, and these anomalies are robust predictors of historical abundance of native and non-native fishes. Quasiextinction probabilities are high (>50%) for nearly three quarters of the native species across several large river basins in the same region, and the negative trend in annual anomalies increases quasiextinction risk for native fishes but reduces this risk for non-native fishes. These findings suggest that ongoing drought is causing range-wide collapse and replacement of native fish faunas, and that this homogenization of western fish faunas will continue given the prevailing negative trend in discharge anomalies. Additionally, this combination of methods can be applied elsewhere as long as environmental and biological long-term time-series data are available.
Collectively, these methods make it possible to identify the link between hydroclimatic forcing and ecological responses, and thus may help anticipate the potential impacts of ongoing and future hydrologic extremes in freshwater ecosystems.
MAKING THE WEASELS WILD AGAIN: ENSURING FUTURE AIR DOMINANCE THROUGH EFFECTIVE SEAD TRAINING
2016-06-01
…both multi-mission design series (MMDS) and joint SEAD training, as well as improve the capabilities of its electronic warfare (EW) ranges in order… USAF units to train for multi-mission design series (MMDS) SEAD operations. MMDS training includes the use of multiple USAF airborne platforms… not provided SEAD aircrews with either the quantity or quality of training required to conduct effective operations. At that time, Major Jon Norman
The United States Strategy in East Asia-Pacific-Implications for Australia’s Defenses
1996-01-01
Mohamed Najib bin Tun Abdul Razak, "Towards Cooperative Security and Regional Stability, the Malaysian View." The Army and the Future. Ed. David Horner… repair and maintenance of warships on a commercial basis. See The Hon. Datuk Seri Mohamed Najib bin Tun Abdul Razak, 132. Times says that Jakarta… Cambridge University Press, 1965. The Hon. Datuk Seri Mohamed Najib bin Tun Abdul Razak. "Towards Cooperative Security and Regional Stability: The
Asquith, W.H.; Mosier, J. G.; Bush, P.W.
1997-01-01
The watershed simulation model Hydrologic Simulation Program—Fortran (HSPF) was used to generate simulated flow (runoff) from the 13 watersheds to the six bay systems because adequate gaged streamflow data from which to estimate freshwater inflows are not available; only about 23 percent of the adjacent contributing watershed area is gaged. The model was calibrated for the gaged parts of three watersheds—that is, selected input parameters (meteorologic and hydrologic properties and conditions) that control runoff were adjusted in a series of simulations until an adequate match between model-generated flows and a set (time series) of gaged flows was achieved. The primary model input is rainfall and evaporation data and the model output is a time series of runoff volumes. After calibration, simulations driven by daily rainfall for a 26-year period (1968–93) were done for the 13 watersheds to obtain runoff under current (1983–93), predevelopment (pre-1940 streamflow and pre-urbanization), and future (2010) land-use conditions for estimating freshwater inflows and for comparing runoff under the three land-use conditions; and to obtain time series of runoff from which to estimate time series of freshwater inflows for trend analysis.
NASA Astrophysics Data System (ADS)
Xu, Xijin; Tang, Qian; Xia, Haiyue; Zhang, Yuling; Li, Weiqiu; Huo, Xia
2016-04-01
Chaotic time series prediction based on nonlinear systems has shown superior performance in the prediction field. We studied prenatal exposure to polychlorinated biphenyls (PCBs) in umbilical cord blood in an electronic waste (e-waste) contaminated area by chaotic time series prediction using the least squares self-exciting threshold autoregressive (SETAR) model. Specific prediction steps based on the proposed methods for prenatal PCB exposure were put forward, and the proposed scheme's validity was further verified by numerical simulation experiments. The experimental results show that: 1) seven kinds of PCB congeners negatively correlate with five different indices of birth status: newborn weight, height, gestational age, Apgar score and anogenital distance; 2) the prenatally PCB-exposed group was at greater risk compared to the reference group; 3) PCBs increasingly accumulated with time in newborns; and 4) the possibility of newborns suffering from related diseases in the future was greater. The desirable numerical simulation results demonstrate the feasibility of applying mathematical models in the environmental toxicology field.
NASA Astrophysics Data System (ADS)
Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.
2016-05-01
Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis, which provides a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
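A minimal sketch of the precursor-coincidence part of the method, assuming a fixed forward window and a simple Poisson null; the published framework also handles lags and more general point-process null models:

```python
# Event coincidence sketch: the fraction of events in series A that are
# followed by at least one event in series B within a tolerance window
# delta_t, plus the expected rate under a homogeneous Poisson null.

def coincidence_rate(a_events, b_events, delta_t):
    """Precursor rate: share of A-events with a B-event in (t, t + delta_t]."""
    hits = sum(1 for ta in a_events
               if any(0 < tb - ta <= delta_t for tb in b_events))
    return hits / len(a_events)

def poisson_null_rate(n_b, t_total, delta_t):
    """P(at least one of n_b uniformly scattered B-events falls in a window)."""
    return 1.0 - (1.0 - delta_t / t_total) ** n_b
```

A coincidence rate far above the null rate is the kind of signal that, in the paper, links flood events to subsequent epidemic outbreaks.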
Artificial neural networks applied to forecasting time series.
Montaño Moreno, Juan J; Palmer Pol, Alfonso; Muñoz Gracia, Pilar
2011-04-01
This study offers a description and comparison of the main models of Artificial Neural Networks (ANN) which have proved useful in time series forecasting, as well as a standard procedure for the practical application of ANN to this type of task. The Multilayer Perceptron (MLP), Radial Basis Function (RBF), Generalized Regression Neural Network (GRNN), and Recurrent Neural Network (RNN) models are analyzed. With this aim in mind, we use a time series made up of 244 time points. A comparative study establishes that the error made by the four neural network models analyzed is less than 10%. According to the interpretation criteria for this performance, it can be concluded that the neural network models show a close fit regarding their forecasting capacity. The model with the best performance is the RBF, followed by the RNN and MLP. The GRNN model shows the worst performance. Finally, we analyze the advantages and limitations of ANN, possible solutions to these limitations, and directions for future research.
The construction of a Central Netherlands temperature
NASA Astrophysics Data System (ADS)
van der Schrier, G.; van Ulden, A.; van Oldenborgh, G. J.
2011-05-01
The Central Netherlands Temperature (CNT) is a monthly series of daily mean temperatures constructed from homogenized time series from the centre of the Netherlands. The purpose of this series is to offer a homogeneous time series representative of a larger area in order to study large-scale temperature changes. It will also facilitate comparison with climate models, which resolve similar scales. From 1906 onwards, temperature measurements in the Netherlands have been sufficiently standardized to construct a high-quality series. Long time series have been constructed by merging nearby stations and using the overlap to calibrate the differences. These long time series, and a few time series of only a few decades in length, have been subjected to a homogeneity analysis in which significant breaks and artificial trends have been corrected. Many of the detected breaks correspond to changes in the observations that are documented in the station metadata. This version of the CNT, to which we attach the version number 1.1, is constructed as the unweighted average of four stations (De Bilt, Winterswijk/Hupsel, Oudenbosch/Gilze-Rijen and Gemert/Volkel), with the stations Eindhoven and Deelen added from 1951 and 1958 onwards, respectively. The global gridded datasets used for detecting and attributing climate change are based on raw observational data. Although some homogeneity adjustments are made, these are not based on knowledge of local circumstances but only on statistical evidence. Despite this handicap, and the fact that these datasets use grid boxes far larger than the area associated with the Central Netherlands Temperature, the temperature interpolated to the CNT region shows a warming trend that is broadly consistent with the CNT trend in all of these datasets. The actual trends differ from the CNT trend by up to 30%, which highlights the need to base future global gridded temperature datasets on homogenized time series.
Regional Glacier Mapping by Combination of Dense Optical and SAR Satellite Image Time-Series
NASA Astrophysics Data System (ADS)
Winsvold, S. H.; Kääb, A.; Andreassen, L. M.; Nuth, C.; Schellenberger, T.; van Pelt, W.
2016-12-01
Near-future dense time series from both SAR (Sentinel-1A and B) and optical satellite sensors (Landsat 8, Sentinel-2A and B) will promote new multi-sensor time series applications for glacier mapping. We assess such combinations of optical and SAR data, among others, by 1) using SAR data to supplement optical time series that suffer from heavy cloud cover (chronological gap-filling), 2) merging the two data types based on stack statistics (standard deviation, mean, maximum, etc.), or 3) better explaining glacier facies patterns in SAR data using optical satellite images. As one example, summer SAR backscatter time series have been largely unexplored and even neglected in many glaciological studies due to the high content of liquid melt water on the ice surface and its intrusion into the upper part of the snow and firn. This water content causes strong specular scattering and absorption of the radar signal, and little energy is scattered back to the SAR sensor. In many scenes of a Sentinel-1 time series we find a significant temporal backscatter difference between the glacier ice surface and the seasonal snow as it melts up-glacier. Even though both surfaces typically have wet conditions, we suggest that the backscatter difference is due to the different roughness lengths of the two surfaces. Higher backscatter is found on the ice surface in the ablation area compared to the firn/seasonal snow surface. We also find and present other backscatter patterns in the Sentinel-1 time series related to glacier facies and weather events. For the Ny Ålesund area, Svalbard, we use Radarsat-2 time series to explore the glacier backscatter conditions over a period of more than 5 years, discussing distinct temporal signals from, among others, refreezing of the firn in late autumn and temporary lakes. All these examples are analyzed using the three methods above.
By this multi-temporal and multi-sensor approach we also explore and describe the possible connection between combined SAR/optical time series and surface mass balance.
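Method 2) above, merging a co-registered scene stack into per-pixel statistics, can be sketched as follows; plain nested lists stand in for raster bands, and this is illustrative rather than the authors' processing chain:

```python
# Per-pixel stack statistics over an image time series: for each pixel,
# compute the mean, (population) standard deviation and maximum across
# all scenes, e.g. to condense SAR backscatter scenes before mapping.

def stack_stats(scenes):
    """scenes: list of 2-D grids of equal shape -> dict of 2-D grids."""
    rows, cols, n = len(scenes[0]), len(scenes[0][0]), len(scenes)
    out = {"mean": [], "std": [], "max": []}
    for i in range(rows):
        mean_r, std_r, max_r = [], [], []
        for j in range(cols):
            vals = [s[i][j] for s in scenes]
            m = sum(vals) / n
            mean_r.append(m)
            std_r.append((sum((v - m) ** 2 for v in vals) / n) ** 0.5)
            max_r.append(max(vals))
        out["mean"].append(mean_r)
        out["std"].append(std_r)
        out["max"].append(max_r)
    return out
```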
Detection of long term persistence in time series of the Neuquen River (Argentina)
NASA Astrophysics Data System (ADS)
Seoane, Rafael; Paz González, Antonio
2014-05-01
In the Patagonian region of Argentina, previous hydrometeorological studies developed using general circulation models show variations in annual mean flows. Future climate scenarios obtained from high-resolution models indicate decreases in total annual precipitation, and these decreases are most important in the Neuquén River basin (23,000 km²). The aim of this study was the estimation of long term persistence in the Neuquén River basin (Argentina). The detection of variations in long range dependence and long memory of the time series was evaluated with the Hurst exponent. We applied rescaled adjusted range (R/S) analysis to a time series of river discharges measured from 1903 to 2011, divided into two subperiods: the first from 1903 to 1970 and the second from 1970 to 2011. Results show a small increase in persistence for the second period. Our results are consistent with those obtained by Koch and Markovic (2007), who observed and estimated an increase of the Hurst exponent for the period 1960-2000 in the Elbe River (Germany). References: Hurst, H. (1951). "Long term storage capacities of reservoirs." Trans. Am. Soc. Civil Engrs., 116:776-808. Koch and Markovic (2007). "Evidences for Climate Change in Germany over the 20th Century from the Stochastic Analysis of hydro-meteorological Time Series." MODSIM07, International Congress on Modelling and Simulation, Christchurch, New Zealand.
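A minimal sketch of the rescaled adjusted range (R/S) estimate of the Hurst exponent H used in studies of this kind; the window sizes and averaging scheme here are illustrative choices:

```python
import math

# R/S estimation of the Hurst exponent: for each window length n, compute
# R/S = (range of cumulative mean-adjusted sums) / (standard deviation),
# average over non-overlapping windows, and fit log(R/S) against log(n).
# H near 1 indicates strong persistence, H near 0.5 an uncorrelated
# series, and H near 0 anti-persistence.

def rescaled_range(x):
    n = len(x)
    m = sum(x) / n
    dev = [v - m for v in x]
    cum, s = [], 0.0
    for d in dev:
        s += d
        cum.append(s)
    r = max(cum) - min(cum)
    sd = (sum(d * d for d in dev) / n) ** 0.5
    return r / sd if sd > 0 else 0.0

def hurst_rs(x, window_sizes):
    pts = []
    for n in window_sizes:
        rs_vals = [rescaled_range(x[i:i + n])
                   for i in range(0, len(x) - n + 1, n)]
        pts.append((math.log(n), math.log(sum(rs_vals) / len(rs_vals))))
    k = len(pts)
    mx = sum(p for p, _ in pts) / k
    my = sum(q for _, q in pts) / k
    return sum((p - mx) * (q - my) for p, q in pts) / \
           sum((p - mx) ** 2 for p, _ in pts)
```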
Road safety forecasts in five European countries using structural time series models.
Antoniou, Constantinos; Papadimitriou, Eleonora; Yannis, George
2014-01-01
Modeling road safety development is a complex task that needs to consider both the quantifiable impact of specific parameters and the underlying trends that cannot always be measured or observed. The objective of this research is to apply structural time series models for obtaining reliable medium- to long-term forecasts of road traffic fatality risk using data from 5 countries with different characteristics from all over Europe (Cyprus, Greece, Hungary, Norway, and Switzerland). Two structural time series models are considered: (1) the local linear trend model and (2) the latent risk time series model. Furthermore, a structured decision tree for the selection of the applicable model for each situation (developed within the Road Safety Data, Collection, Transfer and Analysis [DaCoTA] research project, cofunded by the European Commission) is outlined. First, the fatality and exposure data used for the development of the models are presented and explored. Then, the modeling process is presented, including model selection, the introduction of intervention variables, and the development of mobility scenarios. The forecasts using the developed models appear realistic and lie within acceptable confidence intervals. The proposed methodology proves very efficient for handling different cases of data availability and quality, providing an appropriate alternative from the family of structural time series models for each country. A concluding section provides perspectives and directions for future research.
Characteristic mega-basin water storage behavior using GRACE.
Reager, J T; Famiglietti, James S
2013-06-01
A long-standing challenge for hydrologists has been a lack of observational data on global-scale basin hydrological behavior. With observations from NASA's Gravity Recovery and Climate Experiment (GRACE) mission, hydrologists are now able to study terrestrial water storage for large river basins (>200,000 km²) with monthly time resolution. Here we provide results of a time series model of basin-averaged GRACE terrestrial water storage anomaly and Global Precipitation Climatology Project precipitation for the world's largest basins. We address the short (10 year) length of the GRACE record by adopting a parametric spectral method to calculate frequency-domain transfer functions of storage response to precipitation forcing and then generalize these transfer functions based on large-scale basin characteristics, such as percent forest cover and basin temperature. Among the parameters tested, results show that temperature, soil water-holding capacity, and percent forest cover are important controls on relative storage variability, while basin area and mean terrain slope are less important. The derived empirical relationships were accurate (0.54 ≤ Ef ≤ 0.84) in modeling global-scale water storage anomaly time series for the study basins using only precipitation, average basin temperature, and two land-surface variables, offering the potential for synthesis of basin storage time series beyond the GRACE observational period. Such an approach could be applied toward gap filling between current and future GRACE missions and for predicting basin storage given predictions of future precipitation.
The application of neural networks to myoelectric signal analysis: a preliminary study.
Kelly, M F; Parker, P A; Scott, R N
1990-03-01
Two neural network implementations are applied to myoelectric signal (MES) analysis tasks. The motivation behind this research is to explore more reliable methods of deriving control for multi-degree-of-freedom arm prostheses. A discrete Hopfield network is used to calculate the time series parameters for a moving average MES model. It is demonstrated that the Hopfield network is capable of generating the same time series parameters as those produced by the conventional sequential least squares (SLS) algorithm. Furthermore, it can be extended to applications utilizing larger amounts of data, and possibly to higher-order time series models, without significant degradation in computational efficiency. The second neural network implementation involves using a two-layer perceptron for classifying a single-site MES based on two features, specifically the first time series parameter and the signal power. Using these features, the perceptron is trained to distinguish between four separate arm functions. The two-dimensional decision boundaries used by the perceptron classifier are delineated. It is also demonstrated that the perceptron is able to rapidly compensate for variations when new data are incorporated into the training set. This adaptive quality suggests that perceptrons may provide a useful tool for future MES analysis.
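The perceptron classification idea can be sketched briefly. This toy uses two classes rather than the paper's four arm functions, and the two-feature clusters (first time series parameter, signal power) are invented for illustration:

```python
import numpy as np

# Toy stand-in for the classifier above: two classes instead of four arm
# functions; each sample has two features. Cluster centers are assumptions.
rng = np.random.default_rng(1)
n = 40
class_a = rng.normal([0.2, 1.0], 0.1, size=(n, 2))
class_b = rng.normal([0.8, 3.0], 0.1, size=(n, 2))
X = np.vstack([class_a, class_b])
y = np.array([-1] * n + [1] * n)

# Classic perceptron rule: on a mistake, w <- w + y_i * x_i
Xb = np.hstack([X, np.ones((2 * n, 1))])  # append a bias input
w = np.zeros(3)
for _ in range(100):                       # epochs
    mistakes = 0
    for xi, yi in zip(Xb, y):
        if yi * (w @ xi) <= 0:             # misclassified or on the boundary
            w += yi * xi
            mistakes += 1
    if mistakes == 0:
        break

accuracy = float(np.mean(np.sign(Xb @ w) == y))
print(accuracy)   # 1.0 on this linearly separable toy set
```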
NASA Astrophysics Data System (ADS)
Vlasenko, A. V.; Sizonenko, A. B.; Zhdanov, A. A.
2018-05-01
Discrete time series or mappings are proposed for describing the dynamics of a nonlinear system. The article considers the problem of forecasting the dynamics of the system from the time series it generates. In particular, the commercial rate of drilling oil and gas wells can be considered as a series in which each next value depends on the previous one. The main parameter here is the technical drilling speed. To eliminate measurement error and represent the commercial speed of the object with good accuracy at the current moment, in the future, or at any elapsed time point, the use of the Kalman filter is suggested. For the transition from a deterministic model to a probabilistic one, the use of ensemble modeling is suggested. Ensemble systems can provide a wide range of visual output, which helps the user to evaluate the measure of confidence in the model. In particular, the availability of information on the estimated calendar duration of the construction of oil and gas wells will allow drilling companies to optimize production planning by rationalizing the approach to loading drilling rigs, which ultimately leads to maximization of profit and an increase in their competitiveness.
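The Kalman-filter step can be sketched for a scalar state. The noise variances and the constant "true" drilling rate below are illustrative assumptions, not values from the article:

```python
import random

def kalman_1d(measurements, q=1e-4, r=0.25):
    """Scalar Kalman filter; q = process noise var., r = measurement noise var."""
    x, p = measurements[0], 1.0        # initial state estimate and variance
    estimates = []
    for z in measurements:
        p = p + q                      # predict: uncertainty grows
        k = p / (p + r)                # Kalman gain
        x = x + k * (z - x)            # update with the innovation
        p = (1 - k) * p
        estimates.append(x)
    return estimates

random.seed(42)
true_speed = 10.0                      # hypothetical steady commercial rate
noisy = [true_speed + random.gauss(0, 0.5) for _ in range(200)]
smoothed = kalman_1d(noisy)
print(abs(smoothed[-1] - true_speed) < 0.3)   # filtered estimate stays close
```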
A Source Term for Wave Attenuation by Sea Ice in WAVEWATCH III (registered trademark): IC4
2017-06-07
Use of a prototype pulse oximeter for time series analysis of heart rate variability
NASA Astrophysics Data System (ADS)
González, Erika; López, Jehú; Hautefeuille, Mathieu; Velázquez, Víctor; Del Moral, Jésica
2015-05-01
This work presents the development of a low-cost pulse oximeter prototype consisting of pulsed red and infrared commercial LEDs and a broad-spectrum photodetector used to register time series of heart rate and oxygen saturation of blood. This platform, besides providing these values like any other pulse oximeter, processes the signals to compute a power spectrum analysis of the patient's heart rate variability in real time; additionally, the device allows access to all raw and analyzed data if database construction is required or another kind of further analysis is desired. Since the prototype is capable of acquiring data for long periods of time, it is suitable for collecting data in real-life activities, enabling the development of future wearable applications.
NASA Astrophysics Data System (ADS)
He, Ling-Yun; Chen, Shu-Peng
2011-01-01
Nonlinear dependency between characteristic financial and commodity market quantities (variables) is crucially important, especially between trading volume and market price. Studies on nonlinear dependency between price and volume can provide practical insights into market trading characteristics, as well as theoretical understanding of market dynamics. Indeed, nonlinear dependency and its underlying dynamical mechanisms between price and volume can help researchers and technical analysts understand market dynamics by integrating the market variables, instead of investigating them separately, as in the current literature. Therefore, to investigate nonlinear dependency of price-volume relationships in agricultural commodity futures markets in China and the US, we perform a new statistical test to detect cross-correlations and apply a new methodology called Multifractal Detrended Cross-Correlation Analysis (MF-DCCA), which is an efficient algorithm for analyzing two spatially or temporally correlated time series. We discuss theoretically the relationship between the bivariate cross-correlation exponent and the generalized Hurst exponents for time series of the respective variables. We also perform an empirical study and find that there exists a power-law cross-correlation between them, and that multifractal features are significant in all the analyzed agricultural commodity futures markets.
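The core of the DCCA/MF-DCCA family can be sketched as a detrended cross-correlation coefficient at a single box size (no multifractal q-spectrum); the series and box size below are illustrative:

```python
import numpy as np

def dcca_coefficient(x, y, scale):
    """Detrended cross-correlation coefficient at a single box size
    (the non-multifractal core of the DCCA/MF-DCCA family)."""
    X = np.cumsum(x - np.mean(x))              # integrated profiles
    Y = np.cumsum(y - np.mean(y))
    t = np.arange(scale)
    f_xx = f_yy = f_xy = 0.0
    for i in range(len(X) // scale):
        seg = slice(i * scale, (i + 1) * scale)
        # local linear detrending within each box
        rx = X[seg] - np.polyval(np.polyfit(t, X[seg], 1), t)
        ry = Y[seg] - np.polyval(np.polyfit(t, Y[seg], 1), t)
        f_xx += np.mean(rx * rx)
        f_yy += np.mean(ry * ry)
        f_xy += np.mean(rx * ry)
    return f_xy / np.sqrt(f_xx * f_yy)

rng = np.random.default_rng(7)
a = rng.standard_normal(1000)
rho_self = dcca_coefficient(a, a, 50)                          # identical series -> 1.0
rho_indep = dcca_coefficient(a, rng.standard_normal(1000), 50) # independent -> near 0
print(round(rho_self, 6), abs(rho_indep) < 0.5)
```

The full MF-DCCA method repeats this across many scales and moment orders q to obtain generalized exponents.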
Applications and Comparisons of Four Time Series Models in Epidemiological Surveillance Data
Young, Alistair A.; Li, Xiaosong
2014-01-01
Public health surveillance systems provide valuable data for reliable prediction of future epidemic events. This paper describes a study that used nine types of infectious disease data collected through a national public health surveillance system in mainland China to evaluate and compare the performances of four time series methods, namely, two decomposition methods (regression and exponential smoothing), autoregressive integrated moving average (ARIMA) and support vector machine (SVM). The data obtained from 2005 to 2011 and in 2012 were used as modeling and forecasting samples, respectively. The performances were evaluated based on three metrics: mean absolute error (MAE), mean absolute percentage error (MAPE), and mean square error (MSE). The accuracy of the statistical models in forecasting future epidemic disease proved their effectiveness in epidemiological surveillance. Although the comparisons found that no single method is completely superior to the others, the present study indeed highlighted that the SVM outperforms the ARIMA model and the decomposition methods in most cases. PMID:24505382
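The three evaluation metrics named above are straightforward to state in code; the case counts below are invented for illustration:

```python
# MAE, MAPE, and MSE as used for forecast evaluation above.
def mae(actual, forecast):
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def mape(actual, forecast):
    # expressed as a percentage; requires nonzero actual values
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

def mse(actual, forecast):
    return sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual)

cases = [100, 200, 300]     # hypothetical monthly case counts
preds = [110, 210, 310]     # each off by 10
print(mae(cases, preds), round(mape(cases, preds), 2), mse(cases, preds))   # 10.0 6.11 100.0
```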
Immediate versus sustained effects: interrupted time series analysis of a tailored intervention.
Hanbury, Andria; Farley, Katherine; Thompson, Carl; Wilson, Paul M; Chambers, Duncan; Holmes, Heather
2013-11-05
Detailed intervention descriptions and robust evaluations that test intervention impact--and explore reasons for impact--are an essential part of progressing implementation science. Time series designs enable the impact and sustainability of intervention effects to be tested. When combined with time series designs, qualitative methods can provide insight into intervention effectiveness and help identify areas for improvement for future interventions. This paper describes the development, delivery, and evaluation of a tailored intervention designed to increase primary health care professionals' adoption of a national recommendation that women with mild to moderate postnatal depression (PND) are referred for psychological therapy as a first stage treatment. Three factors influencing referral for psychological treatment were targeted using three related intervention components: a tailored educational meeting, a tailored educational leaflet, and changes to an electronic system data template used by health professionals during consultations for PND. Evaluation comprised time series analysis of monthly audit data on percentage referral rates and monthly first prescription rates for anti-depressants. Interviews were conducted with a sample of health professionals to explore their perceptions of the intervention components and to identify possible factors influencing intervention effectiveness. The intervention was associated with a significant, immediate, positive effect upon percentage referral rates for psychological treatments. This effect was not sustained over the ten-month follow-on period. Monthly rates of anti-depressant prescriptions remained consistently high after the intervention. Qualitative interview findings suggest that the key messages received from the intervention concerned what appropriate antidepressant prescribing is, suggesting that this underlies the lack of impact upon prescribing rates. However, an understanding that psychological treatment can have long-term benefits was also cited. Barriers to referral identified before the intervention were cited again after the intervention, suggesting the intervention had not successfully tackled the barriers targeted. A time series design allowed the initial and sustained impact of our intervention to be tested. Combined with qualitative interviews, this provided insight into intervention effectiveness. Future research should test factors influencing intervention sustainability, and promote adoption of the targeted behavior and dis-adoption of competing behaviors where appropriate.
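The interrupted time series design can be sketched as a segmented regression with level-change and trend-change terms; the synthetic referral series below mimics an immediate but unsustained effect and is not the study's data:

```python
import numpy as np

def its_fit(y, t_break):
    """Segmented regression: baseline level/trend plus post-intervention
    level change and trend change."""
    t = np.arange(len(y), dtype=float)
    post = (t >= t_break).astype(float)       # intervention indicator
    X = np.column_stack([np.ones_like(t), t, post, post * (t - t_break)])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta   # [level, trend, level change, trend change]

# Synthetic referral rates: flat at 20%, an immediate +15 jump at month 12,
# then a decay of 1 point per month -- an unsustained effect in miniature.
t = np.arange(24, dtype=float)
y = 20.0 + np.where(t >= 12, 15.0 - (t - 12), 0.0)
beta = its_fit(y, 12)
print(np.round(beta, 2))   # level change ~ +15, trend change ~ -1
```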
NASA Astrophysics Data System (ADS)
Patra, S. R.
2017-12-01
Evapotranspiration (ET0) influences water resources and is considered a vital process in arid hydrologic frameworks. It is one of the most important measures for identifying drought conditions. Therefore, time series forecasting of evapotranspiration is very important to help decision makers and water system managers build proper systems to sustain and manage water resources. Time series analysis assumes that history repeats itself; hence, by analysing past values, better choices, or forecasts, can be made for the future. Ten years of ET0 data were used in this study to ensure a satisfactory forecast of monthly values. In this study, three models are presented: the autoregressive integrated moving average (ARIMA) model, an artificial neural network (ANN) model, and a support vector machine (SVM) model. These three models are used for forecasting monthly reference crop evapotranspiration based on ten years of past historical records (1991-2001) of measured evaporation at Ganjam region, Odisha, India, without considering climate data. The developed models allow water resource managers to predict up to 12 months ahead, making these predictions very useful for optimizing the resources needed for effective water resources management. In this study, multistep-ahead prediction is performed, which is more complex and troublesome than one-step-ahead prediction. Our investigation suggests that nonlinear relationships may exist among the monthly indices, so the ARIMA model might not be able to effectively extract the full relationship hidden in the historical data. Support vector machines are potentially helpful time series forecasting strategies on account of their strong nonlinear mapping capability and robustness to complexity in forecasting data. SVMs have greater learning capability in time series modelling than ANNs. For instance, SVMs implement the structural risk minimization principle, which allows better generalization compared with neural networks, which use the empirical risk minimization principle. The reliability of these computational models was analysed in light of simulation results, and it was found that the SVM model produces the best results among the three. Future research should be directed toward extending the validation data set and checking the validity of our results in different areas with hybrid intelligence techniques.
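Multistep-ahead prediction is commonly done recursively: a one-step model's output is fed back as an input for the next step. The sketch below uses a least-squares AR(2) as a stand-in for the paper's ARIMA/ANN/SVM models, on a deterministic trend so the expected forecasts are obvious:

```python
import numpy as np

def fit_ar2(series):
    """Least-squares AR(2) with intercept: x_t ~ a*x_{t-1} + b*x_{t-2} + c."""
    y = series[2:]
    X = np.column_stack([series[1:-1], series[:-2], np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def forecast(series, coef, horizon):
    """Recursive multistep-ahead prediction: each forecast is fed back in."""
    hist = list(series)
    out = []
    for _ in range(horizon):
        nxt = coef[0] * hist[-1] + coef[1] * hist[-2] + coef[2]
        out.append(nxt)
        hist.append(nxt)
    return out

train = np.arange(1.0, 25.0)      # deterministic trend, illustration only
preds = forecast(train, fit_ar2(train), 12)
print([round(p, 3) for p in preds[:3]])   # continues the trend: 25, 26, 27
```

The recursive scheme compounds one-step errors over the horizon, which is why multistep forecasting is harder than one-step-ahead forecasting.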
Paleoclimate and bubonic plague: a forewarning of future risk?
McMichael, Anthony J
2010-08-27
Pandemics of bubonic plague have occurred in Eurasia since the sixth century AD. Climatic variations in Central Asia affect the population size and activity of the plague bacterium's reservoir rodent species, influencing the probability of human infection. Using innovative time-series analysis of surrogate climate records spanning 1,500 years, a study in BMC Biology concludes that climatic fluctuations may have influenced these pandemics. This has potential implications for health risks from future climate change.
Starting and Promoting a "First-Time" Association Seminar Series. TECHNIQUES.
ERIC Educational Resources Information Center
Paul, Sharon A.
1984-01-01
As the competition among providers in the continuing education market intensifies, universities starting new seminars will need to alter their marketing and recruitment procedures drastically. Telemarketing and a two-step marketing approach will undoubtedly become more widespread in the future. Individuals responsible for marketing continuing…
MODIS Interactive Subsetting Tool (MIST)
NASA Astrophysics Data System (ADS)
McAllister, M.; Duerr, R.; Haran, T.; Khalsa, S. S.; Miller, D.
2008-12-01
In response to requests from the user community, NSIDC has teamed with the Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) and the Moderate Resolution Data Center (MrDC) to provide time series subsets of satellite data covering stations in the Greenland Climate Network (GC-Net) and the International Arctic Systems for Observing the Atmosphere (IASOA) network. To serve these data NSIDC created the MODIS Interactive Subsetting Tool (MIST). MIST works with 7 km by 7 km subset time series of certain Version 5 (V005) MODIS products over GC-Net and IASOA stations. User-selected data are delivered in a text Comma Separated Value (CSV) file format. MIST also provides online analysis capabilities that include generating time series and scatter plots. Currently, MIST is a Beta prototype and NSIDC intends that user requests will drive future development of the tool. The intent of this poster is to introduce MIST to the MODIS data user audience and illustrate some of the online analysis capabilities.
Washburne, Alex D.; Burby, Joshua W.; Lacker, Daniel; ...
2016-09-30
Systems as diverse as the interacting species in a community, alleles at a genetic locus, and companies in a market are characterized by competition (over resources, space, capital, etc) and adaptation. Neutral theory, built around the hypothesis that individual performance is independent of group membership, has found utility across the disciplines of ecology, population genetics, and economics, both because of the success of the neutral hypothesis in predicting system properties and because deviations from these predictions provide information about the underlying dynamics. However, most tests of neutrality are weak, based on static system properties such as species-abundance distributions or the number of singletons in a sample. Time-series data provide a window onto a system’s dynamics, and should furnish tests of the neutral hypothesis that are more powerful to detect deviations from neutrality and more informative about the type of competitive asymmetry that drives the deviation. Here, we present a neutrality test for time-series data. We apply this test to several microbial time-series and financial time-series and find that most of these systems are not neutral. Our test isolates the covariance structure of neutral competition, thus facilitating further exploration of the nature of asymmetry in the covariance structure of competitive systems. Much like neutrality tests from population genetics that use relative abundance distributions have enabled researchers to scan entire genomes for genes under selection, we anticipate our time-series test will be useful for quick significance tests of neutrality across a range of ecological, economic, and sociological systems for which time-series data are available. Future work can use our test to categorize and compare the dynamic fingerprints of particular competitive asymmetries (frequency dependence, volatility smiles, etc) to improve forecasting and management of complex adaptive systems.
M. Matonis; R. Hubbard; K. Gebert; B. Hahn; C. Regan
2014-01-01
The Future Forest Webinar Series facilitated dialogue between scientists and managers about the challenges and opportunities created by the mountain pine beetle (MPB) epidemic. The series consisted of six webinars facilitated by the USFS Rocky Mountain Research Station, the Northern and Rocky Mountain Regions, and the Colorado Forest Restoration Institute. The series...
NASA Astrophysics Data System (ADS)
Qian, Xi-Yuan; Liu, Ya-Min; Jiang, Zhi-Qiang; Podobnik, Boris; Zhou, Wei-Xing; Stanley, H. Eugene
2015-06-01
When common factors strongly influence two power-law cross-correlated time series recorded in complex natural or social systems, using detrended cross-correlation analysis (DCCA) without considering these common factors will bias the results. We use detrended partial cross-correlation analysis (DPXA) to uncover the intrinsic power-law cross correlations between two simultaneously recorded time series in the presence of nonstationarity after removing the effects of other time series acting as common forces. The DPXA method is a generalization of the detrended cross-correlation analysis that takes into account partial correlation analysis. We demonstrate the method by using bivariate fractional Brownian motions contaminated with a fractional Brownian motion. We find that the DPXA is able to recover the analytical cross Hurst indices, and thus the multiscale DPXA coefficients are a viable alternative to the conventional cross-correlation coefficient. We demonstrate the advantage of the DPXA coefficients over the DCCA coefficients by analyzing contaminated bivariate fractional Brownian motions. We calculate the DPXA coefficients and use them to extract the intrinsic cross correlation between crude oil and gold futures by taking into consideration the impact of the U.S. dollar index. We develop the multifractal DPXA (MF-DPXA) method in order to generalize the DPXA method and investigate multifractal time series. We analyze multifractal binomial measures masked with strong white noises and find that the MF-DPXA method quantifies the hidden multifractal nature while the multifractal DCCA method fails.
Tuarob, Suppawong; Tucker, Conrad S; Kumara, Soundar; Giles, C Lee; Pincus, Aaron L; Conroy, David E; Ram, Nilam
2017-04-01
It is believed that anomalous mental states such as stress and anxiety not only cause suffering for the individuals, but also lead to tragedies in some extreme cases. The ability to predict the mental state of an individual at both current and future time periods could prove critical to healthcare practitioners. Currently, the practical way to predict an individual's mental state is through mental examinations that involve psychological experts performing the evaluations. However, such methods can be time- and resource-consuming, limiting their broad applicability to a wide population. Furthermore, some individuals may also be unaware of their mental states or may feel uncomfortable expressing themselves during the evaluations. Hence, their anomalous mental states could remain undetected for a prolonged period of time. The objective of this work is to demonstrate the ability of advanced machine learning based approaches to generate mathematical models that predict current and future mental states of an individual. The problem of mental state prediction is transformed into a time series forecasting problem, where an individual is represented as a multivariate time series stream of monitored physical and behavioral attributes. A personalized mathematical model is then automatically generated to capture the dependencies among these attributes, and is used for prediction of mental states for each individual. In particular, we first illustrate the drawbacks of traditional multivariate time series forecasting methodologies such as vector autoregression. Then, we show that such issues can be mitigated by using machine learning regression techniques which are modified for capturing temporal dependencies in time series data.
A case study using the data from 150 human participants illustrates that the proposed machine learning based forecasting methods are more suitable for high-dimensional psychological data than the traditional vector autoregressive model in terms of both magnitude of error and directional accuracy. These results not only present a successful usage of machine learning techniques in psychological studies, but also serve as a building block for multiple medical applications that could rely on an automated system to gauge individuals' mental states. Copyright © 2017 Elsevier Inc. All rights reserved.
Headlines: Planet Earth: Improving Climate Literacy with Short Format News Videos
NASA Astrophysics Data System (ADS)
Tenenbaum, L. F.; Kulikov, A.; Jackson, R.
2012-12-01
One of the challenges of communicating climate science is the sense that climate change is remote and unconnected to daily life--something that's happening to someone else or in the future. To help face this challenge, NASA's Global Climate Change website http://climate.nasa.gov has launched a new video series, "Headlines: Planet Earth," which focuses on current climate news events. This rapid-response video series uses 3D video visualization technology combined with real-time satellite data and images to throw a spotlight on real-world events. The "Headlines: Planet Earth" news video products will be deployed frequently, ensuring timeliness. NASA's Global Climate Change Website makes extensive use of interactive media, immersive visualizations, ground-based and remote images, narrated and time-lapse videos, time-series animations, and real-time scientific data, plus maps and user-friendly graphics that make the scientific content both accessible and engaging to the public. The site has also won two consecutive Webby Awards for Best Science Website. Connecting climate science to current real-world events will contribute to improving climate literacy by making climate science relevant to everyday life.
Reilly, Carolyn Miller; Higgins, Melinda; Smith, Andrew; Culler, Steven D; Dunbar, Sandra B
2015-11-01
This paper presents a secondary in-depth analysis of five persons with heart failure randomized to receive an education and behavioral intervention on fluid restriction as part of a larger study. Using a single subject analysis design, time series analyses models were constructed for each of the five patients for a period of 180 days to determine correlations between daily measures of patient reported fluid intake, thoracic impedance, and weights, and relationships between patient reported outcomes of symptom burden and health related quality of life over time. Negative relationships were observed between fluid intake and thoracic impedance, and between impedance and weight, while positive correlations were observed between daily fluid intake and weight. By constructing time series analyses of daily measures of fluid congestion, trends and patterns of fluid congestion emerged which could be used to guide individualized patient care or future research endeavors. Employment of such a specialized analysis technique allows for the elucidation of clinically relevant findings potentially disguised when only evaluating aggregate outcomes of larger studies. Copyright © 2015 Elsevier Inc. All rights reserved.
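The per-patient correlations between daily series can be sketched with a plain Pearson coefficient; the 180-day series below are synthetic, constructed so weight tracks fluid intake and impedance moves inversely, echoing the sign pattern reported above:

```python
# Plain Pearson correlation between daily series; all data here are
# synthetic illustrations, not the study's measurements.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

intake = [1.5 + 0.05 * ((d * 7) % 10) for d in range(180)]   # liters/day
weight = [70.0 + 2.0 * (i - 1.5) for i in intake]            # kg
impedance = [25.0 - 3.0 * (i - 1.5) for i in intake]         # ohms
print(round(pearson(intake, weight), 2), round(pearson(intake, impedance), 2))   # 1.0 -1.0
```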
Doing Your Thing: Fourth Grade.
ERIC Educational Resources Information Center
Potter, Beverly
The fourth grade instructional unit, part of a grade school level career education series, is designed to assist learners in relating present experiences to past and future ones. Before the main body of the lessons is described, field testing results are reported, and key items are presented: the concepts, the estimated instructional time, the…
Analysis of Patent Activity in the Field of Quantum Information Processing
NASA Astrophysics Data System (ADS)
Winiarczyk, Ryszard; Gawron, Piotr; Miszczak, Jarosław Adam; Pawela, Łukasz; Puchała, Zbigniew
2013-03-01
This paper provides an analysis of patent activity in the field of quantum information processing. Data from the PatentScope database for the years 1993-2011 was used. In order to predict future trends in the number of filed patents, time series models were used.
ERIC Educational Resources Information Center
Dillon, Naomi
2008-01-01
Life, in general, is a series of ever-increasing challenges. Ideally, the lessons learned from previous experiences prepare a person for future ones. That isn't always the case, particularly when puberty hits. Despite the many environmental and personal variables converging at the same time, schools can be instrumental in guiding teens through…
Forecasting--A Systematic Modeling Methodology. Paper No. 489.
ERIC Educational Resources Information Center
Mabert, Vincent A.; Radcliffe, Robert C.
In an attempt to bridge the gap between academic understanding and practical business use, the Box-Jenkins technique of time series analysis for forecasting future events is presented with a minimum of mathematical notation. The method is presented in three stages: a discussion of traditional forecasting techniques, focusing on traditional…
USDA-ARS?s Scientific Manuscript database
Testing soil salinity assessment methodologies over different regions is important for future continental and global scale applications. A novel regional-scale soil salinity modeling approach using plant-performance metrics was proposed by Zhang et al. (2015) for farmland in the Yellow River Delta, ...
Tracking MODIS NDVI time series to estimate fuel accumulation
Kellie A. Uyeda; Douglas A. Stow; Philip J. Riggan
2015-01-01
Patterns of post-fire recovery in southern California chaparral shrublands are important for understanding fuel available for future fires. Satellite remote sensing provides an opportunity to examine these patterns over large spatial extents and at high temporal resolution. The relatively limited temporal range of satellite remote sensing products has previously...
Adaptive Sensing of Time Series with Application to Remote Exploration
NASA Technical Reports Server (NTRS)
Thompson, David R.; Cabrol, Nathalie A.; Furlong, Michael; Hardgrove, Craig; Low, Bryan K. H.; Moersch, Jeffrey; Wettergreen, David
2013-01-01
We address the problem of adaptive, information-optimal data collection in time series. Here a remote sensor or explorer agent throttles its sampling rate in order to track anomalous events while obeying constraints on time and power. This problem is challenging because the agent has limited visibility -- all collected datapoints lie in the past, but its resource allocation decisions require predicting far into the future. Our solution is to continually fit a Gaussian process model to the latest data and optimize the sampling plan online to maximize information gain. We compare the performance characteristics of stationary and nonstationary Gaussian process models. We also describe an application based on geologic analysis during planetary rover exploration. Here adaptive sampling can improve coverage of localized anomalies and potentially benefit the mission science yield of long autonomous traverses.
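The information-gain idea can be sketched with a bare-bones Gaussian process: fit the samples collected so far, then query the candidate time with the largest predictive variance. The RBF kernel, length scale, noise level, and sample times are illustrative assumptions, not the paper's models:

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential kernel between two 1-D sets of timestamps."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

def gp_posterior_var(x_train, x_query, noise=1e-4):
    """Posterior predictive variance of a zero-mean, unit-variance GP."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_query)                  # (n_train, n_query)
    v = np.linalg.solve(K, Ks)
    return 1.0 - np.sum(Ks * v, axis=0)         # prior variance is 1

observed = np.array([0.0, 1.0, 2.0, 8.0])       # timestamps sampled so far
candidates = np.linspace(0.0, 10.0, 101)
var = gp_posterior_var(observed, candidates)
next_t = candidates[np.argmax(var)]             # most informative next sample
print(next_t)   # falls in the unsampled gap between t = 2 and t = 8
```

A full information-gain criterion would also weigh the expected anomaly signal and the power cost of each sample; this sketch uses predictive variance alone.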
Performance of time-series methods in forecasting the demand for red blood cell transfusion.
Pereira, Arturo
2004-05-01
Planning future blood collection efforts must be based on adequate forecasts of transfusion demand. In this study, univariate time-series methods were investigated for their performance in forecasting the monthly demand for RBCs at one tertiary-care, university hospital. Three time-series methods were investigated: autoregressive integrated moving average (ARIMA), the Holt-Winters family of exponential smoothing models, and one neural-network-based method. The time series consisted of the monthly demand for RBCs from January 1988 to December 2002 and was divided into two segments: the older one was used to fit or train the models, and the younger to test the accuracy of predictions. Performance was compared across forecasting methods by calculating goodness-of-fit statistics, the percentage of months in which forecast-based supply would have met the RBC demand (coverage rate), and the outdate rate. The RBC transfusion series was best fitted by a seasonal ARIMA(0,1,1)(0,1,1)12 model. Over 1-year time horizons, forecasts generated by ARIMA or exponential smoothing lay within the +/- 10 percent interval of the real RBC demand in 79 percent of months (62% in the case of neural networks). The coverage rates for the three methods were 89, 91, and 86 percent, respectively. Over 2-year time horizons, exponential smoothing largely outperformed the other methods. Predictions by exponential smoothing lay within the +/- 10 percent interval of real values in 75 percent of the 24 forecasted months, and the coverage rate was 87 percent. Over 1-year time horizons, predictions of RBC demand generated by ARIMA or exponential smoothing are accurate enough to be of help in the planning of blood collection efforts. For longer time horizons, exponential smoothing outperforms the other forecasting methods.
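The Holt-Winters family the study tested can be illustrated with a bare-bones additive recursion (level, trend, and twelve seasonal indices updated in turn). This is a minimal sketch on synthetic monthly data, not the hospital's series; the smoothing constants are illustrative assumptions.

```python
import math

def holt_winters_additive(y, m, alpha=0.3, beta=0.05, gamma=0.2, horizon=12):
    """Additive Holt-Winters: level + trend + m-period seasonal indices."""
    level = sum(y[:m]) / m
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / (m * m)
    season = [y[i] - level for i in range(m)]
    for t in range(m, len(y)):
        prev_level = level
        s = season[t % m]
        level = alpha * (y[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % m] = gamma * (y[t] - level) + (1 - gamma) * s
    return [level + (h + 1) * trend + season[(len(y) + h) % m]
            for h in range(horizon)]

def mape(actual, forecast):
    """Mean absolute percentage error."""
    return 100.0 * sum(abs(a - f) / abs(a)
                       for a, f in zip(actual, forecast)) / len(actual)

# Synthetic monthly demand: linear trend plus a 12-month seasonal cycle.
demand = [100 + 0.5 * t + 10 * math.sin(2 * math.pi * t / 12) for t in range(132)]
history, future = demand[:120], demand[120:]
forecast = holt_winters_additive(history, m=12)
err = mape(future, forecast)
```

On this clean series the 12-month-ahead forecast tracks the truth closely; real demand data would of course add noise and require tuning the smoothing constants.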
NASA Astrophysics Data System (ADS)
Allen, D. M.; Henry, C.; Demon, H.; Kirste, D. M.; Huang, J.
2011-12-01
Sustainable management of groundwater resources, particularly in water stressed regions, requires estimates of groundwater recharge. This study in southern Mali, Africa, compares approaches for estimating groundwater recharge and understanding recharge processes using a variety of methods encompassing groundwater level-climate data analysis, GRACE satellite data analysis, and recharge modelling for current and future climate conditions. Time series data for GRACE (2002-2006) and observed groundwater level data (1982-2001) do not overlap. To overcome this problem, GRACE time series data were appended to the observed historical time series data, and the records compared. Terrestrial water storage anomalies from GRACE were corrected for soil moisture (SM) using the Global Land Data Assimilation System (GLDAS) to obtain monthly groundwater storage anomalies (GRACE-SM) and monthly recharge estimates. Historical groundwater storage anomalies and recharge were determined with the water table fluctuation method, using observation data from 15 wells. Historical annual recharge averaged 145.0 mm (or 15.9% of annual rainfall) and compared favourably with the GRACE-SM estimate of 149.7 mm (or 14.8% of annual rainfall). Both records show a low in May; the historical record peaks in September, whereas the GRACE-SM peak is shifted later in the year, to November, suggesting that the GLDAS may poorly predict the timing of soil water storage in this region. Recharge simulation results show good agreement between the timing and magnitude of the mean monthly simulated recharge and the regional mean monthly storage anomaly hydrograph generated from all monitoring wells. Under future climate conditions, annual recharge is projected to decrease by 8% for areas with luvisols and by 11% for areas with nitosols. Given this potential reduction in groundwater recharge, there may be added stress placed on an already stressed resource.
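The water table fluctuation (WTF) estimate used for the historical record amounts to multiplying water-table rises by the specific yield. A toy sketch, with hypothetical monthly heads and an assumed specific yield; the paper's well data and Sy value are not reproduced here.

```python
def wtf_recharge(heads, specific_yield):
    """Water-table fluctuation method: recharge = Sy * sum of head rises (m)."""
    rises = [b - a for a, b in zip(heads, heads[1:]) if b > a]
    return specific_yield * sum(rises)

# Hypothetical monthly water-table elevations (m) and an assumed Sy of 0.05.
heads = [10.00, 10.20, 10.90, 11.50, 11.30, 11.10,
         11.60, 11.40, 11.20, 11.00, 10.80, 10.60]
recharge_mm = wtf_recharge(heads, 0.05) * 1000  # convert m to mm
```

Summing only the rises treats declines as drainage/abstraction rather than negative recharge, which is the usual convention for the method.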
Connectionist Architectures for Time Series Prediction of Dynamical Systems
NASA Astrophysics Data System (ADS)
Weigend, Andreas Sebastian
We investigate the effectiveness of connectionist networks for predicting the future continuation of temporal sequences. The problem of overfitting, particularly serious for short records of noisy data, is addressed by the method of weight-elimination: a term penalizing network complexity is added to the usual cost function in back-propagation. We describe the dynamics of the procedure and clarify the meaning of the parameters involved. From a Bayesian perspective, the complexity term can be usefully interpreted as an assumption about the prior distribution of the weights. We analyze three time series. On the benchmark sunspot series, the networks outperform traditional statistical approaches. We show that the network performance does not deteriorate when there are more input units than needed. In the second example, the notoriously noisy foreign exchange rate series, we pick one weekday and one currency (DM vs. US). Given exchange rate information up to and including a Monday, the task is to predict the rate for the following Tuesday. Weight-elimination manages to extract a significant part of the dynamics and makes the solution interpretable. In the third example, the networks predict the resource utilization of a chaotic computational ecosystem for hundreds of steps forward in time.
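The weight-elimination complexity term has the form lambda * sum_i (w_i/w0)^2 / (1 + (w_i/w0)^2): roughly quadratic for weights small relative to the scale w0, and saturating near 1 for large ones, so every large weight costs about the same while small weights are pushed toward zero. A minimal sketch of the penalty alone (w0 and the example weights are illustrative):

```python
def weight_elimination_penalty(weights, w0=1.0):
    """Complexity term added to the back-propagation cost in weight-elimination."""
    return sum((w / w0) ** 2 / (1.0 + (w / w0) ** 2) for w in weights)

tiny  = weight_elimination_penalty([0.01])   # ~w^2: nearly free
large = weight_elimination_penalty([100.0])  # saturates just below 1
```

During training this penalty (scaled by lambda) is added to the usual squared-error cost, and lambda is adjusted as training proceeds.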
Li, Jia; Xia, Yunni; Luo, Xin
2014-01-01
OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets the quantitative quality requirement. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the Non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response time series of individual service components; fitting these series with an autoregressive moving-average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into a NMSPN model; and employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach, based on experimental data, is presented; it shows that our approach achieves higher prediction accuracy.
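As a stand-in for the ARMA fitting step, the simplest member of that family, an AR(1) model, can be fitted by the method-of-moments (Yule-Walker) estimate phi = autocovariance(1) / variance. This is a hedged sketch on a synthetic response-time-like series, not the paper's NMSPN pipeline.

```python
import random

def fit_ar1(x):
    """Yule-Walker estimates of the mean and AR(1) coefficient."""
    n = len(x)
    mu = sum(x) / n
    var = sum((v - mu) ** 2 for v in x) / n
    cov1 = sum((x[t] - mu) * (x[t - 1] - mu) for t in range(1, n)) / n
    return mu, cov1 / var

def forecast_next(x, mu, phi):
    """One-step-ahead AR(1) forecast."""
    return mu + phi * (x[-1] - mu)

# Synthetic series with true autoregressive coefficient 0.8.
random.seed(0)
x = [0.0]
for _ in range(2000):
    x.append(0.8 * x[-1] + random.gauss(0.0, 1.0))
mu, phi = fit_ar1(x)
pred = forecast_next(x, mu, phi)
```

A full ARMA(p,q) fit adds moving-average terms and requires iterative estimation, but the fit-then-forecast structure is the same.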
Time series modelling to forecast prehospital EMS demand for diabetic emergencies.
Villani, Melanie; Earnest, Arul; Nanayakkara, Natalie; Smith, Karen; de Courten, Barbora; Zoungas, Sophia
2017-05-05
Acute diabetic emergencies are often managed by prehospital Emergency Medical Services (EMS). The projected growth in the prevalence of diabetes is likely to result in rising demand for prehospital EMS that are already under pressure. The aims of this study were to model the temporal trends and provide forecasts of prehospital attendances for diabetic emergencies. A time series analysis on monthly cases of hypoglycemia and hyperglycemia was conducted using data from the Ambulance Victoria (AV) electronic database between 2009 and 2015. Using the seasonal autoregressive integrated moving average (SARIMA) modelling process, different models were evaluated. The most parsimonious model with the highest accuracy was selected. Forty-one thousand four hundred fifty-four prehospital diabetic emergencies were attended over a seven-year period, with an increase in the annual median monthly caseload between 2009 (484.5) and 2015 (549.5). Hypoglycemia (70%) and people with type 1 diabetes (48%) accounted for most attendances. The SARIMA (0,1,0,12) model provided the best fit, with a MAPE of 4.2%, and predicts a monthly caseload of approximately 740 by the end of 2017. Prehospital EMS demand for diabetic emergencies is increasing. SARIMA time series models are a valuable tool for forecasting future caseload with high accuracy, and they predict increasing numbers of prehospital diabetic emergencies into the future. The model generated by this study may be used by service providers to allow appropriate planning and resource allocation of EMS for diabetic emergencies.
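The "I" terms in a SARIMA specification denote ordinary and seasonal differencing, which strip trend and a stable seasonal pattern from the series before any AR/MA structure is fitted. A small illustration on synthetic data (not the Ambulance Victoria series): one lag-1 and one lag-12 difference reduce a deterministic trend-plus-seasonal series to zeros.

```python
def difference(x, lag=1):
    """Difference a series at the given lag."""
    return [x[i] - x[i - lag] for i in range(lag, len(x))]

# Linear trend plus a spike every 12th month (a stable seasonal pattern).
series = [10 + 0.5 * t + (5 if t % 12 == 0 else 0) for t in range(48)]
d1 = difference(series, 1)    # removes the linear trend (a periodic part remains)
d12 = difference(d1, 12)      # removes the stable seasonal pattern
```

Real caseload data would leave correlated noise after differencing, which is what the AR and MA terms of the SARIMA model are fitted to.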
Modeling pollen time series using seasonal-trend decomposition procedure based on LOESS smoothing.
Rojo, Jesús; Rivero, Rosario; Romero-Morte, Jorge; Fernández-González, Federico; Pérez-Badia, Rosa
2017-02-01
Analysis of airborne pollen concentrations provides valuable information on plant phenology and is thus a useful tool in agriculture (for predicting harvests in crops such as the olive and for deciding when to apply phytosanitary treatments) as well as in medicine and the environmental sciences. Variations in airborne pollen concentrations, moreover, are indicators of changing plant life cycles. By modeling pollen time series, we can not only identify the variables influencing pollen levels but also predict future pollen concentrations. In this study, airborne pollen time series were modeled using the seasonal-trend decomposition procedure based on LOcally wEighted Scatterplot Smoothing (LOESS), known as STL. The data series (daily Poaceae pollen concentrations over the period 2006-2014) was broken up into seasonal and residual (stochastic) components. The seasonal component was compared with data on Poaceae flowering phenology obtained by field sampling. Residuals were fitted to a model generated from daily temperature and rainfall values, and daily pollen concentrations, using partial least squares regression (PLSR). This method was then applied to predict daily pollen concentrations for 2014 (independent validation data) using results for the seasonal component of the time series and estimates of the residual component for the period 2006-2013. Correlation between predicted and observed values was r = 0.79 for the pre-peak period (i.e., the period prior to the peak pollen concentration) and r = 0.63 for the post-peak period. Separate analysis of each of the components of the pollen data series enables the sources of variability to be identified more accurately than analysis of the original non-decomposed data series, and for this reason this procedure has proved to be a suitable technique for analyzing the main environmental factors influencing airborne pollen concentrations.
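STL itself fits the trend and seasonal components with repeated LOESS passes; the idea can be previewed with the simpler classical additive decomposition (a centered moving-average trend and averaged seasonal indices). The sketch below uses a synthetic series of period 12; it is a simplified stand-in, not STL, and not the Poaceae data.

```python
import math

def decompose_additive(y, m):
    """Classical additive decomposition with a centered 2xM moving average."""
    half = m // 2
    trend = [None] * len(y)
    for t in range(half, len(y) - half):
        w = y[t - half:t + half + 1]            # m + 1 values, ends half-weighted
        trend[t] = (0.5 * w[0] + sum(w[1:-1]) + 0.5 * w[-1]) / m
    detrended = [(y[t] - trend[t], t % m)
                 for t in range(len(y)) if trend[t] is not None]
    seasonal = []
    for k in range(m):
        vals = [d for d, kk in detrended if kk == k]
        seasonal.append(sum(vals) / len(vals))
    centre = sum(seasonal) / m                  # force indices to average to zero
    return trend, [s - centre for s in seasonal]

y = [2.0 + 0.1 * t + 10 * math.sin(2 * math.pi * t / 12) for t in range(120)]
trend, seasonal = decompose_additive(y, 12)
```

The residual (what the paper feeds into PLSR) is then simply the observation minus the recovered trend and seasonal index.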
The long-range correlation and evolution law of centennial-scale temperatures in Northeast China.
Zheng, Xiaohui; Lian, Yi; Wang, Qiguang
2018-01-01
This paper applies the detrended fluctuation analysis (DFA) method to investigate the long-range correlation of monthly mean temperatures from three typical measurement stations at Harbin, Changchun, and Shenyang in Northeast China from 1909 to 2014. The results reveal the memory characteristics of the climate system in this region. By comparing the temperatures from different time periods and investigating the variations of its scaling exponents at the three stations during these different time periods, we found that the monthly mean temperature has long-range correlation, which indicates that the temperature in Northeast China has long-term memory and good predictability. The monthly time series of temperatures over the past 106 years also shows good long-range correlation characteristics. These characteristics are also obviously observed in the annual mean temperature time series. Finally, we separated the centennial-length temperature time series into two time periods. These results reveal that the long-range correlations at the Harbin station over these two time periods have large variations, whereas no obvious variations are observed at the other two stations. This indicates that warming affects the regional climate system's predictability differently at different time periods. The research results can provide a quantitative reference point for regional climate predictability assessment and future climate model evaluation.
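Detrended fluctuation analysis can be written compactly: integrate the mean-removed series, detrend the profile linearly in windows of size n, and read the scaling exponent off the slope of log F(n) versus log n. A minimal DFA-1 sketch on white noise, for which the exponent should land near 0.5; the station temperature data are not reproduced here.

```python
import math, random

def dfa_exponent(x, scales):
    """DFA-1 scaling exponent of series x over the given window sizes."""
    mu = sum(x) / len(x)
    profile, c = [], 0.0
    for v in x:                       # integrated, mean-removed profile
        c += v - mu
        profile.append(c)
    log_n, log_f = [], []
    for n in scales:
        sq, cnt = 0.0, 0
        for s in range(len(profile) // n):
            seg = profile[s * n:(s + 1) * n]
            tm, ym = (n - 1) / 2.0, sum(seg) / n
            denom = sum((t - tm) ** 2 for t in range(n))
            slope = sum((t - tm) * (seg[t] - ym) for t in range(n)) / denom
            for t in range(n):        # residuals after linear detrending
                r = seg[t] - (ym + slope * (t - tm))
                sq += r * r
                cnt += 1
        log_n.append(math.log(n))
        log_f.append(0.5 * math.log(sq / cnt))   # log F(n)
    lm, fm = sum(log_n) / len(log_n), sum(log_f) / len(log_f)
    return (sum((a - lm) * (b - fm) for a, b in zip(log_n, log_f))
            / sum((a - lm) ** 2 for a in log_n))

random.seed(1)
noise = [random.gauss(0.0, 1.0) for _ in range(4096)]
alpha = dfa_exponent(noise, [8, 16, 32, 64, 128, 256])
```

An exponent near 0.5 indicates an uncorrelated series; values between 0.5 and 1, as reported for the temperature records, indicate long-range persistence.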
NASA Astrophysics Data System (ADS)
Miura, T.; Kato, A.; Wang, J.; Vargas, M.; Lindquist, M.
2015-12-01
Satellite vegetation index (VI) time series data serve as an important means to monitor and characterize seasonal changes of terrestrial vegetation and their interannual variability. It is, therefore, critical to ensure quality of such VI products and one method of validating VI product quality is cross-comparison with in situ flux tower measurements. In this study, we evaluated the quality of VI time series derived from Visible Infrared Imaging Radiometer Suite (VIIRS) onboard the Suomi National Polar-orbiting Partnership (NPP) spacecraft by cross-comparison with in situ radiation flux measurements at select flux tower sites over North America and Europe. VIIRS is a new polar-orbiting satellite sensor series, slated to replace National Oceanic and Atmospheric Administration's Advanced Very High Resolution Radiometer in the afternoon overpass and to continue the highly-calibrated data streams initiated with the Moderate Resolution Imaging Spectroradiometer (MODIS) of National Aeronautics and Space Administration's Earth Observing System. The selected sites covered a wide range of biomes, including croplands, grasslands, evergreen needle forest, woody savanna, and open shrublands. The two VIIRS indices of the Top-of-Atmosphere (TOA) Normalized Difference Vegetation Index (NDVI) and the atmospherically-corrected, Top-of-Canopy (TOC) Enhanced Vegetation Index (EVI) (daily, 375 m spatial resolution) were compared against the TOC NDVI and a two-band version of EVI (EVI2) calculated from tower radiation flux measurements, respectively. VIIRS and Tower VI time series showed comparable seasonal profiles across biomes with statistically significant correlations (> 0.60; p-value < 0.01). "Start-of-season (SOS)" phenological metric values extracted from VIIRS and Tower VI time series were also highly compatible (R2 > 0.95), with mean differences of 2.3 days and 5.0 days for the NDVI and the EVI, respectively. 
These results indicate that VIIRS VI time series can capture the seasonal evolution of the vegetated land surface as well as in situ radiometric measurements do. Future studies that address biophysical or physiological interpretations of Tower VI time series derived from radiation flux measurements are desirable.
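The tower-side indices compared above are simple reflectance ratios. NDVI and the two-band EVI (EVI2) can be computed as follows; the reflectance values in the example are illustrative, not tower data.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def evi2(nir, red):
    """Two-band Enhanced Vegetation Index (EVI2, Jiang et al. 2008)."""
    return 2.5 * (nir - red) / (nir + 2.4 * red + 1.0)

vi_ndvi = ndvi(0.5, 0.1)   # dense vegetation: high NIR, low red reflectance
vi_evi2 = evi2(0.5, 0.1)
```

EVI2 approximates the three-band EVI without a blue band, which is why it can be computed from two-channel tower radiometers.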
Forecast models for suicide: Time-series analysis with data from Italy.
Preti, Antonio; Lentini, Gianluca
2016-01-01
The prediction of suicidal behavior is a complex task. To fine-tune targeted preventative interventions, predictive analytics (i.e. forecasting future risk of suicide) is more important than exploratory data analysis (pattern recognition, e.g. detection of seasonality in suicide time series). This study sets out to investigate the accuracy of forecasting models of suicide for men and women. A total of 101 499 male and 39 681 female suicides that occurred in Italy from 1969 to 2003 were investigated. In order to apply the forecasting model and test its accuracy, the time series were split into a training set (1969 to 1996; 336 months) and a test set (1997 to 2003; 84 months). The main outcome was the accuracy of forecasting models on the monthly number of suicides. These measures of accuracy were used: mean absolute error; root mean squared error; mean absolute percentage error; mean absolute scaled error. In both male and female suicides a change in the trend pattern was observed, rising from 1969 to a maximum around 1990 and decreasing thereafter. The variances attributable to the seasonal and trend components were, respectively, 24% and 64% in male suicides, and 28% and 41% in female ones. Both annual and seasonal historical trends of monthly data contributed to forecasting future trends of suicide with a margin of error around 10%. The finding is clearer in male than in female time series of suicide. The main conclusion of the study is that models taking seasonality into account seem to be able to derive information on deviation from the mean when this occurs as a zenith, but they fail to reproduce it when it occurs as a nadir. Preventative efforts should concentrate on the factors that influence the occurrence of increases above the main trend in both seasonal and cyclic patterns of suicides.
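The four accuracy measures listed are all short formulas; MASE scales the test-set MAE by the in-sample MAE of a seasonal-naive forecast, so values below 1 beat that baseline. A hedged sketch with toy numbers (not the Italian suicide data):

```python
import math

def mae(actual, pred):
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

def rmse(actual, pred):
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual))

def mape(actual, pred):
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, pred)) / len(actual)

def mase(actual, pred, train, m=12):
    """Mean absolute scaled error against a seasonal-naive baseline."""
    scale = (sum(abs(train[t] - train[t - m]) for t in range(m, len(train)))
             / (len(train) - m))
    return mae(actual, pred) / scale

train = list(range(24))                  # toy training series
obs, fc = [100, 110, 120], [110, 100, 130]
scores = (mae(obs, fc), rmse(obs, fc), mape(obs, fc), mase(obs, fc, train))
```

MAPE is undefined when an observed value is zero, which is one reason MASE is often preferred for count-like series such as monthly suicides.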
Revising time series of the Elbe river discharge for flood frequency determination at gauge Dresden
NASA Astrophysics Data System (ADS)
Bartl, S.; Schümberg, S.; Deutsch, M.
2009-11-01
The German research programme RIsk MAnagement of eXtreme flood events has worked to improve regional hazard assessment for the large rivers in Germany. Here we focused on the Elbe river at its gauge Dresden, which is among the oldest gauges in Europe, with officially available daily discharge time series beginning on 1 January 1890. The project aimed, on the one hand, to extend and revise the existing time series and, on the other hand, to examine the variability of the Elbe river discharge conditions on a greater time scale. Therefore, one major task was the search for historical documents and the examination of the retrieved documents and the information they contain. After analysing this information, the development of the river course and the discharge conditions were discussed. Using the knowledge thus provided, a historical hydraulic model was established in another subproject; its results were then used here. A further purpose was the determination of flood frequency based on all pre-processed data. The knowledge obtained about historical changes was also used to gain insight into possible future variations under climate change conditions. In particular, variations in the runoff characteristics of the Elbe river over the course of the year were analysed. We succeeded in obtaining a much longer discharge time series containing fewer errors and uncertainties. Hence, an optimized regional hazard assessment was realised.
Model-Based Design of Long-Distance Tracer Transport Experiments in Plants.
Bühler, Jonas; von Lieres, Eric; Huber, Gregor J
2018-01-01
Studies of long-distance transport of tracer isotopes in plants offer a high potential for functional phenotyping, but so far measurement time is a bottleneck because continuous time series of at least 1 h are required to obtain reliable estimates of transport properties. Hence, usual throughput values are between 0.5 and 1 samples h⁻¹. Here, we propose to increase sample throughput by introducing temporal gaps in the data acquisition of each plant sample and measuring multiple plants one after another in a rotating scheme. In contrast to common time series analysis methods, mechanistic tracer transport models allow the analysis of interrupted time series. The uncertainties of the model parameter estimates are used as a measure of how much information was lost compared to complete time series. A case study was set up to systematically investigate different experimental schedules for different throughput scenarios ranging from 1 to 12 samples h⁻¹. Selected designs with only a small number of data points were found to be sufficient for adequate parameter estimation, implying that the presented approach enables a substantial increase in sample throughput. The presented general framework for automated generation and evaluation of experimental schedules allows the determination of a maximal sample throughput and the respective optimal measurement schedule depending on the required statistical reliability of data acquired by future experiments.
NREL Launches Electrification Futures Study Series | News | NREL
The first report in the series, "Electrification Futures Study: End-Use Electric Technology Cost and Performance Projections through 2050," includes foundational data on the cost and performance of electric technologies. The report uses a combination of recently published literature and expert assessment to develop future cost and performance projections.
NASA Astrophysics Data System (ADS)
Geiger, Tobias
2018-04-01
Gross domestic product (GDP) represents a widely used metric to compare economic development across time and space. GDP estimates have been routinely assembled only since the beginning of the second half of the 20th century, making comparisons with prior periods cumbersome or even impossible. In recent years various efforts have been put forward to re-estimate national GDP for specific years in the past centuries and even millennia, providing new insights into past economic development on a snapshot basis. In order to make this wealth of data utilizable across research disciplines, we here present a first continuous and consistent data set of GDP time series for 195 countries from 1850 to 2009, based mainly on data from the Maddison Project and other population and GDP sources. The GDP data are consistent with Penn World Tables v8.1 and future GDP projections from the Shared Socio-economic Pathways (SSPs), and are freely available at http://doi.org/10.5880/pik.2018.010 (Geiger and Frieler, 2018). To ease usability, we additionally provide GDP per capita data and further supplementary and data description files in the online archive. We utilize various methods to handle missing data and discuss the advantages and limitations of our methodology. Despite known shortcomings this data set provides valuable input, e.g., for climate impact research, in order to consistently analyze economic impacts from pre-industrial times to the future.
The study of Thai stock market across the 2008 financial crisis
NASA Astrophysics Data System (ADS)
Kanjamapornkul, K.; Pinčák, Richard; Bartoš, Erik
2016-11-01
Cohomology theory for financial markets allows us to deform the Kolmogorov space of time series data over a time period, with an explicit definition of eight market states in a grand unified theory. The anti-de Sitter space induced by a coupling behavior field among traders in a financial market crash acts like a gravitational field in financial-market spacetime. Under this hybrid mathematical superstructure, we redefine a behavior matrix using the Pauli matrices and a modified Wilson loop for time series data. We use it to detect the 2008 financial market crash via the degree of a cohomology group of the sphere over the tensor field in the correlation matrix over all possible dominated stocks underlying the Thai SET50 Index Futures. The empirical analysis of the financial tensor network was performed with the help of empirical mode decomposition and intrinsic time-scale decomposition of the correlation matrix, together with the calculation of the closeness centrality of a planar graph.
Water quality management using statistical analysis and time-series prediction model
NASA Astrophysics Data System (ADS)
Parmar, Kulwinder Singh; Bhardwaj, Rashmi
2014-12-01
This paper deals with water quality management using statistical analysis and a time-series prediction model. The monthly variation of water quality standards has been used to compare the statistical mean, median, mode, standard deviation, kurtosis, skewness, and coefficient of variation at the Yamuna River. The model was validated using R-squared, root mean square error, mean absolute percentage error, maximum absolute percentage error, mean absolute error, maximum absolute error, normalized Bayesian information criterion, Ljung-Box analysis, predicted values and confidence limits. Using an autoregressive integrated moving average (ARIMA) model, future values of the water quality parameters have been estimated. It is observed that the predictive model is useful at 95 % confidence limits and that the curve is platykurtic for potential of hydrogen (pH), free ammonia, total Kjeldahl nitrogen, dissolved oxygen, and water temperature (WT), and leptokurtic for chemical oxygen demand and biochemical oxygen demand. Also, the predicted series is close to the original series, providing a good fit. All parameters except pH and WT cross the prescribed limits of the World Health Organization/United States Environmental Protection Agency, and thus the water is not fit for drinking, agricultural or industrial use.
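The comparison statistics named in the abstract (mean, median, standard deviation, coefficient of variation, skewness, kurtosis) are standard moments. A minimal sketch on toy readings; the Yamuna measurements themselves are not reproduced here.

```python
import statistics as st

def describe(x):
    """Descriptive statistics of the kind compared month by month."""
    n = len(x)
    mean = st.mean(x)
    sd = st.pstdev(x)                     # population standard deviation
    skew = sum((v - mean) ** 3 for v in x) / (n * sd ** 3)
    kurt = sum((v - mean) ** 4 for v in x) / (n * sd ** 4) - 3.0  # excess
    return {"mean": mean, "median": st.median(x), "sd": sd,
            "cv": sd / mean, "skewness": skew, "excess_kurtosis": kurt}

stats = describe([1.0, 2.0, 3.0, 4.0, 5.0])
```

Negative excess kurtosis corresponds to a platykurtic distribution and positive to a leptokurtic one, matching the paper's classification of the parameters.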
Facilities Planning for Small Colleges.
ERIC Educational Resources Information Center
O'Neill, Joseph P.; And Others
This second publication in a three-part series called "Alternative Futures" is essentially a workbook that, followed step by step, allows a college to see how its use of space has changed over time. Especially designed for small colleges, the kit makes use of the information that is routinely collected, such as annual financial statements and…
Palestine 1918: General Edmund Allenby’s Application of Operational Art and Design
the British connected a series of joint combined arms operations in time, space, and purpose to achieve the strategic goals of the Empire. The campaign...perception. This monograph analyzes the campaign in Palestine through the lens of Army Design Methodology (ADM) to illuminate how future leaders can
Evaluation as Story: The Narrative Quality of Educational Evaluation.
ERIC Educational Resources Information Center
Wachtman, Edward L.
The author presents his opinion that educational evaluation has much similarity to the nonfiction narrative, (defined as a series of events ordered in time), particularly as it relates a current situation to future possibilities. He refers to Stake's statement that evaluation is concerned not only with outcomes but also with antecedents and with…
Following and Giving Directions: Fifth Grade.
ERIC Educational Resources Information Center
Davis, Nancy
The fifth grade instructional unit, part of a grade school level career education series, is designed to assist learners in understanding how present experiences relate to past and future ones. Before the main body of the lessons is described, field test results are reported and key items are presented: the concepts, the estimated time for…
ERIC Educational Resources Information Center
Serafini, Frank, Ed.; Gee, Elisabeth, Ed.
2017-01-01
Bringing together renowned scholars in literacy education, this volume offers the first comprehensive account of the evolution and future of multiliteracies pedagogy. This groundbreaking collection examines the rich contributions of the New London Group (NLG)--an international gathering of noted scholars who met in 1996 and influenced the…
Bivariate autoregressive state-space modeling of psychophysiological time series data.
Smith, Daniel M; Abtahi, Mohammadreza; Amiri, Amir Mohammad; Mankodiya, Kunal
2016-08-01
Heart rate (HR) and electrodermal activity (EDA) are often used as physiological measures of psychological arousal in various neuropsychology experiments. In this exploratory study, we analyze HR and EDA data collected from four participants, each with a history of suicidal tendencies, during a cognitive task known as the Paced Auditory Serial Addition Test (PASAT). A central aim of this investigation is to guide future research by assessing heterogeneity in the population of individuals with suicidal tendencies. Using a state-space modeling approach to time series analysis, we evaluate the effect of an exogenous input, i.e., the stimulus presentation rate which was increased systematically during the experimental task. Participants differed in several parameters characterizing the way in which psychological arousal was experienced during the task. Increasing the stimulus presentation rate was associated with an increase in EDA in participants 2 and 4. The effect on HR was positive for participant 2 and negative for participants 3 and 4. We discuss future directions in light of the heterogeneity in the population indicated by these findings.
NASA Astrophysics Data System (ADS)
Fink, G.; Koch, M.
2010-12-01
An important aspect of water resources and hydrological engineering is the assessment of hydrological risk due to the occurrence of extreme events, e.g. droughts or floods. When dealing with the latter, as is the focus here, the classical methods of flood frequency analysis (FFA) are usually used for the proper dimensioning of a hydraulic structure, for the purpose of bringing the flood risk down to an acceptable level. FFA is based on extreme value statistics theory. Despite the progress of methods in this scientific branch, the development, selection, and fitting of an appropriate distribution function still remain a challenge, particularly when certain underlying assumptions of the theory are not met in real applications. This is, for example, the case when the stationarity condition for a random flood time series is no longer satisfied, as may be the situation when long-term hydrological impacts of future climate change are to be considered. The objective here is to verify the applicability of classical (stationary) FFA to predicted flood time series in the Fulda catchment in central Germany, as they may occur in the wake of climate change during the 21st century. These discharge time series at the outlet of the Fulda basin have been simulated with a distributed hydrological model (SWAT) forced by predicted climate variables from a regional climate model for Germany (REMO). From the simulated future daily time series, annual maximum (extreme) values are computed and analyzed for the purpose of risk evaluation. Although the 21st-century estimated extreme flood series of the Fulda river turn out to be only mildly non-stationary, alleviating the need for further action and concern at first sight, a more detailed analysis of the risk, as quantified, for example, by the return period, shows non-negligible differences in the calculated risk levels. 
This could be verified by employing a new method, the so-called flood series maximum analysis (FSMA) method, which consists in the stochastic simulation of numerous trajectories of a stochastic process with a given GEV distribution over a certain length of time (larger than the desired return period). The maximum value of each trajectory is then computed, and all of these maxima are used to determine the empirical distribution of the maximum series. Through graphical inversion of this distribution function, the size of the design flood for a given risk (quantile) and a given life duration can be inferred. The results of numerous simulations show that for stationary flood series the new FSMA method yields, as expected, nearly identical risk values to the classical FFA approach. However, once the flood time series becomes slightly non-stationary, for the reasons discussed, and regardless of whether the trend is increasing or decreasing, large differences in the computed risk values for a given design flood occur. In other words, for the same risk, the new FSMA method would lead to different design flood values for a hydraulic structure than the classical FFA method. This, in turn, could lead to cost savings in the realization of a hydraulic project.
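The FSMA recipe (simulate many trajectories of annual maxima from a fitted GEV, take each trajectory's maximum over the structure's design life, and read the design flood off the empirical distribution of those maxima) can be sketched with inverse-CDF sampling. The GEV parameters below are purely illustrative, not fitted Fulda values; for a stationary series the result should reproduce the classical closed-form quantile.

```python
import math, random

def gev_quantile(p, mu, sigma, xi):
    """Inverse CDF of the GEV distribution (xi != 0)."""
    return mu + sigma / xi * ((-math.log(p)) ** (-xi) - 1.0)

def fsma_design_flood(mu, sigma, xi, life_years, risk, n_traj=5000, seed=42):
    """Empirical (1 - risk) quantile of the max of life_years annual maxima."""
    rng = random.Random(seed)
    maxima = sorted(
        max(gev_quantile(rng.random(), mu, sigma, xi) for _ in range(life_years))
        for _ in range(n_traj))
    return maxima[int((1.0 - risk) * n_traj)]

# Stationary check: P(max <= x) = F(x)^T, so the exact answer is the GEV
# quantile at p = (1 - risk) ** (1 / life_years).
mu, sigma, xi, T, risk = 100.0, 30.0, 0.1, 50, 0.10
simulated = fsma_design_flood(mu, sigma, xi, T, risk)
exact = gev_quantile((1.0 - risk) ** (1.0 / T), mu, sigma, xi)
```

The value of the simulation route is that non-stationarity is trivial to accommodate: let mu or sigma drift with the year index inside the trajectory loop, which is exactly where FSMA and classical FFA diverge.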
Future projects in asteroseismology: the unique role of Antarctica
NASA Astrophysics Data System (ADS)
Mosser, B.; Siamois Team
Asteroseismology requires observables registered under stringent conditions: very high sensitivity, uninterrupted time series, and long duration. These specifications then allow one to study the details of stellar interior structure. Space-borne and ground-based asteroseismic projects are presented and compared. With CoRoT as a precursor, then Kepler and perhaps Plato, the roadmap in space appears to be precisely designed. In parallel, ground-based projects are necessary to provide different and unique information on bright stars with Doppler measurements. Dome C appears to be the ideal place for ground-based asteroseismic observations. The unequalled weather conditions yield a duty cycle comparable to that of space. Long time series (up to 3 months) will be possible, thanks to the long duration of the polar night.
Earth Observing System, Conclusions and Recommendations
NASA Technical Reports Server (NTRS)
1984-01-01
The following Earth Observing System (E.O.S.) recommendations were made: (1) A program must be initiated to ensure that present time series of Earth science data are maintained and continued. (2) A data system that provides easy, integrated, and complete access to past, present, and future data must be developed as soon as possible. (3) A long-term research effort must be sustained to study and understand these time series of Earth observations. (4) The E.O.S. should be established as an information system to carry out those aspects of the above recommendations which go beyond existing and currently planned activities. (5) The scientific direction of the E.O.S. should be established and continued through an international scientific steering committee.
Imai, Chisato; Hashizume, Masahiro
2015-03-01
Time series analysis is suitable for investigating relatively direct and short-term effects of exposures on outcomes. In environmental epidemiology, it has been one of the standard approaches for assessing the impacts of environmental factors on acute non-infectious diseases (e.g. cardiovascular deaths), conventionally with generalized linear or additive models (GLM and GAM). However, the same analysis practices are often applied to infectious diseases, despite substantial differences from non-infectious diseases that may pose analytical challenges. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a systematic review was conducted to elucidate important issues in assessing the associations between environmental factors and infectious diseases using time series analysis with GLM and GAM. Published studies on the associations between weather factors and malaria, cholera, dengue, and influenza were targeted. Our review raised issues regarding the estimation of the susceptible population and exposure lag times, the adequacy of seasonal adjustments, the presence of strong autocorrelations, and the lack of a smaller observation time unit for outcomes (i.e. daily data). These concerns may be attributable to features specific to infectious diseases, such as transmission among individuals and complicated causal mechanisms. The consequence of not taking adequate measures to address these issues is distortion of the risk quantification of exposure factors. Future studies should pay careful attention to these details and examine alternative models or methods that improve time series regression analysis for environmental determinants of infectious diseases.
Gershon, Andrea; Thiruchelvam, Deva; Moineddin, Rahim; Zhao, Xiu Yan; Hwee, Jeremiah; To, Teresa
2017-06-01
Knowing trends in and forecasting hospitalization and emergency department visit rates for chronic obstructive pulmonary disease (COPD) can enable health care providers, hospitals, and health care decision makers to plan for the future. We conducted a time-series analysis using health care administrative data from the Province of Ontario, Canada, to determine previous trends in acute care hospitalization and emergency department visit rates for COPD and then to forecast future rates. Individuals aged 35 years and older with physician-diagnosed COPD were identified using four universal government health administrative databases and a validated case definition. Monthly COPD hospitalization and emergency department visit rates per 1,000 people with COPD were determined from 2003 to 2014 and then forecasted to 2024 using autoregressive integrated moving average models. Between 2003 and 2014, COPD prevalence increased from 8.9 to 11.1%. During that time, there were 274,951 hospitalizations and 290,482 emergency department visits for COPD. After accounting for seasonality, we found that monthly COPD hospitalization and emergency department visit rates per 1,000 individuals with COPD remained stable. COPD prevalence was forecasted to increase to 12.7% (95% confidence interval [CI], 11.4-14.1) by 2024, whereas monthly COPD hospitalization and emergency department visit rates per 1,000 people with COPD were forecasted to remain stable at 2.7 (95% CI, 1.6-4.4) and 3.7 (95% CI, 2.3-5.6), respectively. Forecasted age- and sex-stratified rates were also stable. COPD hospital and emergency department visit rates per 1,000 people with COPD have been stable for more than a decade and are projected to remain stable in the near future. Given increasing COPD prevalence, this means notably more COPD health service use in the future.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-31
... trade VIX derivatives (e.g., options and futures) often hedge their positions with the SPX option series... options and futures. Because those option series are typically used to hedge VIX derivatives, market... settlement date for volatility index options and futures, modified Hybrid Opening System (HOSS) opening...
An investigation of fMRI time series stationarity during motor sequence learning foot tapping tasks.
Muhei-aldin, Othman; VanSwearingen, Jessie; Karim, Helmet; Huppert, Theodore; Sparto, Patrick J; Erickson, Kirk I; Sejdić, Ervin
2014-04-30
Understanding complex brain networks using functional magnetic resonance imaging (fMRI) is of great interest to clinical and scientific communities. To utilize advanced analysis methods such as graph theory for these investigations, the stationarity of fMRI time series needs to be understood as it has important implications on the choice of appropriate approaches for the analysis of complex brain networks. In this paper, we investigated the stationarity of fMRI time series acquired from twelve healthy participants while they performed a motor (foot tapping sequence) learning task. Since prior studies have documented that learning is associated with systematic changes in brain activation, a sequence learning task is an optimal paradigm to assess the degree of non-stationarity in fMRI time-series in clinically relevant brain areas. We predicted that brain regions involved in a "learning network" would demonstrate non-stationarity and may violate assumptions associated with some advanced analysis approaches. Six blocks of learning, and six control blocks of a foot tapping sequence were performed in a fixed order. The reverse arrangement test was utilized to investigate the time series stationarity. Our analysis showed some non-stationary signals with a time varying first moment as a major source of non-stationarity. We also demonstrated a decreased number of non-stationarities in the third block as a result of priming and repetition. Most of the current literature does not examine stationarity prior to processing. The implication of our findings is that future investigations analyzing complex brain networks should utilize approaches robust to non-stationarities, as graph-theoretical approaches can be sensitive to non-stationarities present in data. Copyright © 2014 Elsevier B.V. All rights reserved.
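A generic implementation of the reverse arrangement test on synthetic signals (not the authors' code) can be sketched as follows; it counts pairs i < j with x[i] > x[j] and compares the count with its null mean and variance under stationarity:

```python
import numpy as np

def reverse_arrangements(x):
    """Count reversals A = #{i < j : x[i] > x[j]} and return A with its z-score
    under the stationary (exchangeable) null hypothesis."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    A = sum(int(np.sum(x[i] > x[i + 1:])) for i in range(n - 1))
    mean = n * (n - 1) / 4.0                  # null mean of A
    var = n * (n - 1) * (2 * n + 5) / 72.0    # null variance of A
    return A, (A - mean) / np.sqrt(var)

rng = np.random.default_rng(2)
stationary = rng.normal(size=200)
# a time-varying first moment, the major source of non-stationarity reported
trending = rng.normal(size=200) + 0.02 * np.arange(200)
_, z_stat = reverse_arrangements(stationary)
_, z_trend = reverse_arrangements(trending)
```

A large |z| rejects stationarity; an increasing mean suppresses reversals, so `z_trend` comes out strongly negative while `z_stat` stays near zero.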
RECENT DEVELOPMENTS IN HYDROWEB DATABASE Water level time series on lakes and reservoirs (Invited)
NASA Astrophysics Data System (ADS)
Cretaux, J.; Arsen, A.; Calmant, S.
2013-12-01
We present the current state of the Hydroweb database as well as developments in progress. It provides offline water level time series on rivers, reservoirs and lakes based on altimetry data from several satellites (Topex/Poseidon, ERS, Jason-1&2, GFO and ENVISAT). The major developments in Hydroweb concern an operational data centre with automatic acquisition and processing of IGDR data for updating time series in near real time (both for lakes and rivers), as well as the use of additional remote sensing data, such as satellite imagery allowing the calculation of lake surface areas. A lake data centre is under development at Legos in coordination with the Hydrolare Project led by the SHI (State Hydrological Institute of the Russian Academy of Science). It will provide level-surface-volume variations of about 230 lakes and reservoirs, calculated through a combination of various satellite images (Modis, Asar, Landsat, Cbers) and radar altimetry (Topex/Poseidon, Jason-1 & 2, GFO, Envisat, ERS2, AltiKa). The final objective is to propose a data centre fully based on remote sensing techniques and controlled by in situ infrastructure for the Global Terrestrial Network for Lakes (GTN-L) under the supervision of WMO and GCOS. In a longer perspective, the Hydroweb database will integrate data from future missions (Jason-3, Jason-CS, Sentinel-3A/B) and will serve in the design of the SWOT mission. Hydroweb products will be used as input data for the simulation of SWOT products (water height and surface variations of lakes and rivers). In the future, the SWOT mission will make it possible to monitor, on a sub-monthly basis, lakes and reservoirs worldwide larger than 250 m × 250 m, and Hydroweb will host water level and extent products from this mission.
Quantifying the behavior of price dynamics at opening time in stock market
NASA Astrophysics Data System (ADS)
Ochiai, Tomoshiro; Takada, Hideyuki; Nacher, Jose C.
2014-11-01
The availability of huge volumes of financial data has offered the possibility of understanding markets as a complex system characterized by several stylized facts. Here we first show that the time evolution of Japan's Nikkei stock average index (Nikkei 225) futures follows the resistance and breaking-acceleration effects when the complete time series is analyzed. However, in stock markets there are periods when no regular trades occur, between the close of the market on one day and the next day's open. To examine these time gaps we decompose the time series into opening time and intermediate time. Our analysis indicates that for the intermediate time, both the resistance and breaking-acceleration effects are still observed. For the opening time, however, there are almost no resistance or breaking-acceleration effects, and volatility is constantly high. These findings highlight unique dynamic differences between stock markets and the forex market and suggest that current risk management strategies may need to be revised to address the absence of these dynamic effects at the opening time.
Djurdjevic, Tanja; Rehwald, Rafael; Knoflach, Michael; Matosevic, Benjamin; Kiechl, Stefan; Gizewski, Elke Ruth; Glodny, Bernhard; Grams, Astrid Ellen
2017-03-01
After intraarterial recanalisation (IAR), the haemorrhage and the blood-brain barrier (BBB) disruption can be distinguished using dual-energy computed tomography (DECT). The aim of the present study was to investigate whether future infarction development can be predicted from DECT. DECT scans of 20 patients showing 45 BBB disrupted areas after IAR were assessed and compared with follow-up examinations. Receiver operator characteristic (ROC) analyses using densities from the iodine map (IM) and virtual non-contrast (VNC) were performed. Future infarction areas are denser than future non-infarction areas on IM series (23.44 ± 24.86 vs. 5.77 ± 2.77; p < 0.0001) and more hypodense on VNC series (29.71 ± 3.33 vs. 35.33 ± 3.50; p < 0.0001). ROC analyses for the IM series showed an area under the curve (AUC) of 0.99 (cut-off: <9.97 HU; p < 0.05; sensitivity 91.18 %; specificity 100.00 %; accuracy 0.93) for the prediction of future infarctions. The AUC for the prediction of haemorrhagic infarctions was 0.78 (cut-off >17.13 HU; p < 0.05; sensitivity 90.00 %; specificity 62.86 %; accuracy 0.69). The VNC series allowed prediction of infarction volume. Future infarction development after IAR can be reliably predicted with the IM series. The prediction of haemorrhages and of infarction size is less reliable. • The IM series (DECT) can predict future infarction development after IAR. • Later haemorrhages can be predicted using the IM and the BW series. • The volume of definable hypodense areas in VNC correlates with infarction volume.
Time series analysis of gold production in Malaysia
NASA Astrophysics Data System (ADS)
Muda, Nora; Hoon, Lee Yuen
2012-05-01
Gold is a soft, malleable, bright yellow metallic element that is unaffected by air and most reagents. It is highly valued as an asset or investment commodity and is extensively used in jewellery, industrial applications, dentistry and medicine. In Malaysia, gold mining is limited to several areas such as Pahang, Kelantan, Terengganu, Johor and Sarawak. The main purpose of this case study is to obtain a suitable model for the production of gold in Malaysia. The model can also be used to predict Malaysia's future gold production. The Box-Jenkins time series method was used to perform the analysis with the following steps: identification, estimation, diagnostic checking and forecasting. In addition, the accuracy of prediction was tested using the mean absolute percentage error (MAPE). From the analysis, the ARIMA(3,1,1) model was found to be the best-fitting model, with a MAPE of 3.704%, indicating that the prediction is very accurate. Hence, this model can be used for forecasting. This study is expected to help the private and public sectors understand the gold production scenario and plan gold mining activities in Malaysia.
Manikandan, Narayanan; Subha, Srinivasan
2016-01-01
Software development life cycle has been characterized by destructive disconnects between activities like planning, analysis, design, and programming. In particular, software built around prediction-based results is always a big challenge for designers. Time series forecasting, for quantities like currency exchange rates, stock prices, and weather, is an area where extensive research has been going on for the last three decades. In the early days, problems of financial analysis and prediction were solved by statistical models and methods. Over the last two decades, a large number of Artificial Neural Network based learning models have been proposed to solve the problems of financial data and obtain accurate results in prediction of future trends and prices. This paper addresses some architectural design issues for performance improvement through combining the strengths of multivariate econometric time series models and Artificial Neural Networks. It provides an adaptive approach for predicting exchange rates, which can be called a hybrid methodology for predicting exchange rates. This framework is tested for the accuracy and performance of the parallel algorithms used.
Abrupt Shift in the Observed Runoff from the Southwest Greenland Ice Sheet?
NASA Astrophysics Data System (ADS)
Ahlstrom, A.; Petersen, D.; Box, J.; Langen, P. P.; Citterio, M.
2016-12-01
Mass loss of the Greenland ice sheet has contributed significantly to sea level rise in recent years and is considered a crucial parameter when estimating the impact of future climate change. Few observational records of sufficient length exist to validate surface mass balance models, especially the estimated runoff. Here we present an observational time series from 1975-2014 of discharge from a large proglacial lake, Tasersiaq, in West Greenland (66.3°N, 50.4°W) with a mainly ice-covered catchment. We argue that the discharge time series is a representative measure of ice sheet runoff, making it the only observational record of runoff to exceed the 30-year period needed to assess the climatological state of the ice sheet. We proceed to isolate the runoff part of the signal from precipitation and from glacial lake outburst floods identified in a small sub-catchment. Similarly, the impact of major volcanic eruptions is clearly identified. We examine the trend and annual variability in the annual discharge, relating it to likely atmospheric forcing mechanisms, and compare the observational time series with modelled runoff from the regional climate model HIRHAM.
Predicting Physical Time Series Using Dynamic Ridge Polynomial Neural Networks
Al-Jumeily, Dhiya; Ghazali, Rozaida; Hussain, Abir
2014-01-01
Forecasting naturally occurring phenomena is a common problem in many domains of science, and it has been addressed and investigated by many scientists. The importance of time series prediction stems from the fact that it has a wide range of applications, including control systems, engineering processes, environmental systems and economics. From knowledge of some aspects of the previous behaviour of the system, the aim of the prediction process is to determine or predict its future behaviour. In this paper, we consider a novel application of a higher order polynomial neural network architecture called the Dynamic Ridge Polynomial Neural Network, which combines the properties of higher order and recurrent neural networks, for the prediction of physical time series. In this study, four types of signals have been used: the Lorenz attractor, the mean value of the AE index, sunspot number, and heat wave temperature. The simulation results showed good improvements in terms of signal-to-noise ratio in comparison to a number of benchmarked higher order and feedforward neural networks. PMID:25157950
Quentin, Wilm; Neubauer, Simone; Leidl, Reiner; König, Hans-Helmut
2007-01-01
This paper reviews the international literature that employed time-series analysis to evaluate the effects of advertising bans on aggregate consumption of cigarettes or tobacco. A systematic search of the literature was conducted. Three groups of studies were defined, representing analyses of advertising bans in the U.S.A., in other countries, and in 22 OECD countries. The estimated effects of advertising bans and their significance were analysed. Twenty-four studies were identified. They used a wide array of explanatory variables, models, estimating methods and data sources. Eighteen studies found a negative effect of an advertising ban on aggregate consumption, but only ten of these found a significant effect. Two studies using data from 22 OECD countries suggested that partial bans would have little or no influence on aggregate consumption, whereas complete bans would significantly reduce consumption. The results imply that advertising bans have a negative but sometimes only modest impact on consumption, and that complete bans can be expected to be more effective. Because of the methodological limitations of analysing the effects of advertising bans with time-series approaches, different approaches should also be used in the future.
NASA Astrophysics Data System (ADS)
Caffarra, Amelia; Zottele, Fabio; Gleeson, Emily; Donnelly, Alison
2014-05-01
In order to predict the impact of future climate warming on trees it is important to quantify the effect climate has on their development. Our understanding of the phenological response to environmental drivers has given rise to various mathematical models of the annual growth cycle of plants. These models simulate the timing of phenophases by quantifying the relationship between development and its triggers, typically temperature. In addition, other environmental variables have an important role in determining the timing of budburst. For example, photoperiod has been shown to have a strong influence on phenological events of a number of tree species, including Betula pubescens (birch). A recently developed model for birch (DORMPHOT), which integrates the effects of temperature and photoperiod on budburst, was applied to future temperature projections from a 19-member ensemble of regional climate simulations (on a 25 km grid) generated as part of the ENSEMBLES project, to simulate the timing of birch budburst in Ireland each year up to the end of the present century. Gridded temperature time series data from the climate simulations were used as input to the DORMPHOT model to simulate future budburst timing. The results showed an advancing trend in the timing of birch budburst over most regions in Ireland up to 2100. Interestingly, this trend appeared greater in the northeast of the country than in the southwest, where budburst is currently relatively early. These results could have implications for future forest planning, species distribution modeling, and the birch allergy season.
Sizirici, Banu; Tansel, Berrin
2010-01-01
The purpose of this study was to evaluate the suitability of using time series analysis of selected leachate quantity and quality parameters to forecast the duration of the post-closure period of a closed landfill. Selected leachate quality parameters (i.e., sodium, chloride, iron, bicarbonate, total dissolved solids (TDS), and ammonium as N) and volatile organic compounds (VOCs) (i.e., vinyl chloride, 1,4-dichlorobenzene, chlorobenzene, benzene, toluene, ethyl benzene, xylenes, total BTEX) were analyzed with the time series multiplicative decomposition model to estimate the projected levels of the parameters. These parameters were selected based on their detection levels and the consistency of their detection in leachate samples. In addition, VOCs detected in leachate and their chemical transformations were considered in view of the decomposition stage of the landfill. Projected leachate quality trends were analyzed and compared with the maximum contaminant level (MCL) for the respective parameters. Conditions that lead to specific trends (i.e., increasing, decreasing, or steady) and interactions of leachate quality parameters were evaluated. Decreasing trends were projected for leachate quantity and for concentrations of sodium, chloride, TDS, ammonia as N, vinyl chloride, 1,4-dichlorobenzene, benzene, toluene, ethyl benzene, xylenes, and total BTEX. Increasing trends were projected for concentrations of iron, bicarbonate, and chlorobenzene. Anaerobic conditions in the landfill favor corrosion of iron, resulting in higher concentrations over time. Bicarbonate formation as a byproduct of bacterial respiration during waste decomposition, together with the lime rock cap system of the landfill, contributes to the increasing levels of bicarbonate in leachate. Chlorobenzene is produced during anaerobic biodegradation of 1,4-dichlorobenzene; hence, the increasing trend of chlorobenzene may be due to the declining trend of 1,4-dichlorobenzene.
The time series multiplicative decomposition model in general provides an adequate forecast for future planning purposes for the parameters monitored in leachate. The model projections for 1,4-dichlorobenzene were relatively less accurate in comparison to the projections for vinyl chloride and chlorobenzene. Based on the trends observed, future monitoring needs for the selected leachate parameters were identified.
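The classical multiplicative decomposition the study relies on (series = trend × seasonal × residual) can be sketched from scratch; the synthetic monthly series below, with a decaying trend, is an illustrative stand-in for the leachate data:

```python
import numpy as np

def multiplicative_decompose(y, period):
    """Classical multiplicative decomposition: y = trend * seasonal * residual."""
    y = np.asarray(y, dtype=float)
    n, k = len(y), period // 2
    # centred moving-average trend (2 x period MA for even periods)
    trend = np.full(n, np.nan)
    for i in range(k, n - k):
        window = y[i - k:i + k + 1].copy()
        if period % 2 == 0:
            window[0] *= 0.5
            window[-1] *= 0.5
            trend[i] = window.sum() / period
        else:
            trend[i] = window.mean()
    # seasonal indices: average detrended ratio at each position, mean 1
    detrended = y / trend
    seasonal = np.array([np.nanmean(detrended[i::period]) for i in range(period)])
    seasonal /= seasonal.mean()
    seasonal_full = np.tile(seasonal, n // period + 1)[:n]
    residual = y / (trend * seasonal_full)
    return trend, seasonal_full, residual

rng = np.random.default_rng(4)
months = np.arange(120)
# hypothetical monthly concentration: decaying trend x seasonal cycle x noise
series = (300 * 0.99 ** months) * (1 + 0.1 * np.sin(2 * np.pi * months / 12)) \
         * rng.normal(1.0, 0.02, 120)
trend, seas, resid = multiplicative_decompose(series, 12)
```

Projecting the fitted trend and seasonal indices forward gives the kind of forecast the study compares against the MCLs.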
Modeling pollen time series using seasonal-trend decomposition procedure based on LOESS smoothing
NASA Astrophysics Data System (ADS)
Rojo, Jesús; Rivero, Rosario; Romero-Morte, Jorge; Fernández-González, Federico; Pérez-Badia, Rosa
2017-02-01
Analysis of airborne pollen concentrations provides valuable information on plant phenology and is thus a useful tool in agriculture—for predicting harvests in crops such as the olive and for deciding when to apply phytosanitary treatments—as well as in medicine and the environmental sciences. Variations in airborne pollen concentrations, moreover, are indicators of changing plant life cycles. By modeling pollen time series, we can not only identify the variables influencing pollen levels but also predict future pollen concentrations. In this study, airborne pollen time series were modeled using a seasonal-trend decomposition procedure based on LOcally wEighted Scatterplot Smoothing (LOESS), known as STL. The data series—daily Poaceae pollen concentrations over the period 2006-2014—was broken up into seasonal and residual (stochastic) components. The seasonal component was compared with data on Poaceae flowering phenology obtained by field sampling. Residuals were fitted to a model generated from daily temperature and rainfall values, and daily pollen concentrations, using partial least squares regression (PLSR). This method was then applied to predict daily pollen concentrations for 2014 (independent validation data) using results for the seasonal component of the time series and estimates of the residual component for the period 2006-2013. The correlation between predicted and observed values was r = 0.79 for the pre-peak period (i.e., the period prior to the peak pollen concentration) and r = 0.63 for the post-peak period. Separate analysis of each component of the pollen data series enables the sources of variability to be identified more accurately than analysis of the original non-decomposed series, and this procedure has therefore proved to be a suitable technique for analyzing the main environmental factors influencing airborne pollen concentrations.
Scaling and Multiscaling in Financial Time Series
2005-01-07
Under the α-stable process model (Mandelbrot, Fama, 1963), the return process X(t) satisfies dX(t) = μ dt + σ dL_α(t), where L_α(t) is an α-stable Lévy process. [Slide residue: a table of scaling estimates for several series (Future JY/USD, Nikkei 225, Future Nikkei, Japanese Yen, French index) over horizons of 6 months to 1.5 years is not recoverable.] Optimal portfolios are stable under linear superposition, and the CAPM relates the mean return of an asset to its covariance with the market.
SURMODERR: A MATLAB toolbox for estimation of velocity uncertainties of a non-permanent GPS station
NASA Astrophysics Data System (ADS)
Teza, Giordano; Pesci, Arianna; Casula, Giuseppe
2010-08-01
SURMODERR is a MATLAB toolbox for the estimation of reliable velocity uncertainties of a non-permanent GPS station (NPS), i.e. a GPS receiver used in campaign-style measurements. The implemented method is based on subsampling the daily coordinate time series of one or more continuous GPS stations located inside or close to the area where the NPSs are installed. The continuous time series are subsampled according to real or planned occupation tables, and random errors arising from antenna replacement between surveys are taken into account. To overcome the uncertainty underestimation that typically characterizes short-duration GPS time series, statistical analysis of the simulated data is performed to estimate the velocity uncertainties of the real NPS. The basic hypotheses required are: (i) the signal for each coordinate must be a long-term linear trend plus seasonal and colored noise; (ii) standard data processing should already have been performed to provide the daily data series; and (iii) if the method is applied to survey planning, future behavior should not differ significantly from past behavior. To show the strength of the approach, two case studies with real data are presented and discussed (Central Apennine and Panarea Island, Italy).
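The subsampling idea can be reproduced as a small Monte Carlo experiment; the trend, noise level, set-up error, and occupation table below are illustrative assumptions (the toolbox additionally models seasonal signals and coloured noise in more detail):

```python
import numpy as np

rng = np.random.default_rng(6)
day = np.arange(3650)                               # 10 years of daily solutions
# hypothetical continuous-station coordinate series (mm): trend + season + noise
series = (2.0 / 365.25) * day \
         + 1.5 * np.sin(2 * np.pi * day / 365.25) \
         + rng.normal(0, 1.0, day.size)

# hypothetical occupation table: one 5-day campaign per year
campaigns = [np.arange(y * 365, y * 365 + 5) for y in range(10)]

slopes = []
for _ in range(1000):
    t_all, x_all = [], []
    for epochs in campaigns:
        offset = rng.normal(0, 2.0)                 # random antenna set-up error (mm)
        t_all.append(day[epochs])
        x_all.append(series[epochs] + offset)
    t_all, x_all = np.concatenate(t_all), np.concatenate(x_all)
    slopes.append(np.polyfit(t_all, x_all, 1)[0] * 365.25)   # velocity in mm/yr

vel_sigma = np.std(slopes)          # velocity uncertainty of the simulated NPS
```

The spread of the subsampled velocity estimates, rather than the formal error of a single short campaign fit, is what gives the realistic uncertainty.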
Reliability Prediction of Ontology-Based Service Compositions Using Petri Net and Time Series Models
Li, Jia; Xia, Yunni; Luo, Xin
2014-01-01
OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether a process meets their quantitative quality requirements. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving-average model (ARMA) and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the input of the NMSPN model and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy. PMID:24688429
Citizenship for the 21st Century. Our Democracy: How America Works Series.
ERIC Educational Resources Information Center
Callahan, William T., Jr., Ed.; Banaszak, Ronald A., Ed.
As part of the formulation of a new multidisciplinary civics curriculum for students in grades 8 and 9, a major national conference on the future of civic education was conceived, on the premise that early adolescence is an especially appropriate time to introduce the fundamental ideas of a democratic society. This volume contains the…
Spatial, Temporal and Spatio-Temporal Patterns of Maritime Piracy.
Marchione, Elio; Johnson, Shane D
2013-11-01
To examine patterns in the timing and location of incidents of maritime piracy to see whether, like many urban crimes, attacks cluster in space and time. Data for all incidents of maritime piracy worldwide recorded by the National Geospatial Intelligence Agency are analyzed using time-series models and methods originally developed to detect disease contagion. At the macro level, analyses suggest that incidents of pirate attacks are concentrated in five subregions of the earth's oceans and that the time series for these different subregions differ. At the micro level, analyses suggest that for the last 16 years (or more), pirate attacks appear to cluster in space and time suggesting that patterns are not static but are also not random. Much like other types of crime, pirate attacks cluster in space, and following an attack at one location the risk of others at the same location or nearby is temporarily elevated. The identification of such regularities has implications for the understanding of maritime piracy and for predicting the future locations of attacks.
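The space-time clustering finding can be checked with a Knox-type permutation test; the sketch below, with made-up coordinates and thresholds, is a generic illustration rather than the paper's exact method:

```python
import numpy as np

def knox_test(coords, times, ds, dt, n_perm=999, rng=None):
    """Knox test: count event pairs close in both space and time, then compare
    the count with its permutation distribution (shuffling the event times)."""
    if rng is None:
        rng = np.random.default_rng()
    coords = np.asarray(coords, float)
    times = np.asarray(times, float)
    d_space = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    close_space = np.triu(d_space <= ds, k=1)      # each pair counted once

    def count(ts):
        return int(np.sum(close_space & (np.abs(ts[:, None] - ts[None, :]) <= dt)))

    observed = count(times)
    perm = [count(rng.permutation(times)) for _ in range(n_perm)]
    p = (1 + sum(c >= observed for c in perm)) / (n_perm + 1)
    return observed, p

rng = np.random.default_rng(7)
# synthetic attacks: one contagious cluster plus a diffuse background
cluster_xy = np.column_stack([rng.normal(0, 0.5, 30), rng.normal(0, 0.5, 30)])
background_xy = rng.uniform(-10, 10, (70, 2))
coords = np.vstack([cluster_xy, background_xy])
times = np.concatenate([rng.uniform(0, 10, 30), rng.uniform(0, 365, 70)])

obs, p = knox_test(coords, times, ds=2.0, dt=14.0, rng=rng)
```

A small p indicates that, as the authors report, the risk of a nearby attack is temporarily elevated following an attack at a given location.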
Dyer, Bryce; Hassani, Hossein; Shadi, Mehran
2016-01-01
The format of cycling time trials in England, Wales and Northern Ireland involves riders competing individually over several fixed race distances of 10-100 miles and over time-constrained formats of 12 and 24 h in duration. Drawing on data provided by the national governing body covering England and Wales, an analysis of six male competition record progressions was undertaken to illustrate their progression. Future forecasts are then projected through use of the Singular Spectrum Analysis technique. This method has not been applied to sport-based time series data before. All six records have seen progressive improvement and are non-linear in nature. Five records saw their highest rate of record change during the 1950-1969 period. Whilst new records have generally become less frequent since this period, the magnitude of performance improvement has generally increased. The Singular Spectrum Analysis technique successfully provided forecast projections in the short to medium term with a high level of fit to the time series data.
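Singular Spectrum Analysis forecasting works by embedding the series in a Hankel trajectory matrix, keeping the leading singular components, and extending the reconstruction with the induced linear recurrence. This minimal sketch (window length, rank and function names are illustrative choices, not the authors' settings) shows the mechanism:

```python
import numpy as np

def ssa_forecast(x, L, r, steps):
    """Basic SSA: rank-r reconstruction plus linear-recurrence forecasting."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    K = N - L + 1
    # Trajectory (Hankel) matrix: columns are length-L lagged windows
    X = np.column_stack([x[i:i + L] for i in range(K)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :r] * s[:r]) @ Vt[:r, :]
    # Diagonal averaging back to a series of length N
    rec, cnt = np.zeros(N), np.zeros(N)
    for j in range(K):
        rec[j:j + L] += Xr[:, j]
        cnt[j:j + L] += 1
    rec /= cnt
    # Minimum-norm linear recurrence from the leading left singular vectors
    Ur = U[:, :r]
    pi = Ur[-1, :]                             # last coordinates of each vector
    R = (Ur[:-1, :] @ pi) / (1.0 - pi @ pi)    # recurrence coefficients
    out = list(rec)
    for _ in range(steps):
        out.append(R @ np.array(out[-(L - 1):]))
    return rec, np.array(out[N:])
```

On a record-progression series, the leading components capture the slow non-linear trend, and the recurrence extrapolates it without assuming a parametric form.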
NASA Technical Reports Server (NTRS)
Luthcke, Scott; Rowlands, David; Lemoine, Frank; Zelensky, Nikita; Beckley, Brian; Klosko, Steve; Chinn, Doug
2006-01-01
Although satellite altimetry has been around for thirty years, the last fifteen, beginning with the launch of TOPEX/Poseidon (TP), have yielded an abundance of significant results including: monitoring of ENSO events, detection of internal tides, determination of accurate global tides, unambiguous delineation of Rossby waves and their propagation characteristics, accurate determination of geostrophic currents, and a multi-decadal time series of mean sea level trend and dynamic ocean topography variability. While the high level of accuracy being achieved is a result of both instrument maturity and the quality of models and correction algorithms applied to the data, improving the quality of the Climate Data Records produced from altimetry is highly dependent on concurrent progress being made in fields such as orbit determination. The precision orbits form the reference frame from which the radar altimeter observations are made. Therefore, the accuracy of the altimetric mapping is limited to a great extent by the accuracy to which a satellite orbit can be computed. The TP mission represents the first time that the radial component of an altimeter orbit was routinely computed with an accuracy of 2 cm. Recently it has been demonstrated that it is possible to compute the radial component of Jason orbits with an accuracy of better than 1 cm. Additionally, still further improvements in TP orbits are being achieved with new techniques and algorithms largely developed from combined Jason and TP data analysis. While these recent POD achievements are impressive, the new accuracies are now revealing subtle systematic orbit errors that manifest as both intra- and inter-annual ocean topography errors. Additionally, the construction of inter-decadal time series of climate data records requires the removal of systematic differences across multiple missions.
Current and future efforts must focus on the understanding and reduction of these errors in order to generate a complete and consistent time series of improved orbits across multiple missions and decades required for the most stringent climate-related research. This presentation discusses the POD progress and achievements made over nearly three decades, and presents the future challenges, goals and their impact on altimetric derived ocean sciences.
Sea surface temperature 1871-2099 in 38 cells in the Caribbean region.
Sheppard, Charles; Rioja-Nieto, Rodolfo
2005-09-01
Sea surface temperature (SST) data with monthly resolution are provided for 38 cells in the Caribbean Sea and Bahamas region, plus Bermuda. These series are derived from the HadISST1 data set for historical time (1871-1999) and from the HadCM3 coupled climate model for predicted SST (1950-2099). Statistical scaling of the forecast data sets is performed to produce confluent SST series according to a now-established method. These SST series are available for download. High water temperatures in 1998 killed enormous amounts of corals in tropical seas, though in the Caribbean region the effects at that time appeared less marked than in the Indo-Pacific. However, SSTs are rising in accordance with world-wide trends and it has been predicted that temperature will become increasingly important in this region in the near future. Patterns of SST rise within the Caribbean region are shown, and the importance of sub-regional patterns within this biologically highly interconnected area is noted.
Yasuhara, Moriaki; Doi, Hideyuki; Wei, Chih-Lin; Danovaro, Roberto; Myhre, Sarah E
2016-05-19
The link between biodiversity and ecosystem functioning (BEF) over long temporal scales is poorly understood. Here, we investigate biological monitoring and palaeoecological records on decadal, centennial and millennial time scales from a BEF framework by using deep sea, soft-sediment environments as a test bed. Results generally show positive BEF relationships, in agreement with BEF studies based on present-day spatial analyses and short-term manipulative experiments. However, the deep-sea BEF relationship is much noisier across longer time scales compared with modern observational studies. We also demonstrate with palaeoecological time-series data that a larger species pool does not enhance ecosystem stability through time, whereas higher abundance as an indicator of higher ecosystem functioning may enhance ecosystem stability. These results suggest that BEF relationships are potentially time scale-dependent. Environmental impacts on biodiversity and ecosystem functioning may be much stronger than biodiversity impacts on ecosystem functioning at long, decadal-millennial, time scales. Longer time scale perspectives, including palaeoecological and ecosystem monitoring data, are critical for predicting future BEF relationships on a rapidly changing planet. © 2016 The Author(s).
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, L. K.; Vogel, R. M.
2015-11-01
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied Generalized Pareto (GP) model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series X, with corresponding failure time series T, should have application to a wide class of natural hazards with rich opportunities for future extensions.
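For the GP exceedance model the hazard function has a simple closed form: with scale σ and shape ξ, the survival function S(x) = (1 + ξx/σ)^(-1/ξ) and density f(x) = σ^(-1)(1 + ξx/σ)^(-1/ξ-1) give h(x) = f(x)/S(x) = 1/(σ + ξx). The function names below are ours, not the paper's, and the snippet simply verifies the identity numerically via h(x) = -d/dx log S(x):

```python
import numpy as np

def gp_survival(x, sigma, xi):
    """P(X > x) for the Generalized Pareto distribution (xi != 0, x >= 0)."""
    return (1.0 + xi * x / sigma) ** (-1.0 / xi)

def gp_hazard(x, sigma, xi):
    """Closed-form GP hazard: f(x) / S(x) = 1 / (sigma + xi * x)."""
    return 1.0 / (sigma + xi * x)

# Check against a central finite difference of -log S (hazard = -d/dx log S)
sigma, xi, x, h = 2.0, 0.3, 1.5, 1e-6
fd = -(np.log(gp_survival(x + h, sigma, xi))
       - np.log(gp_survival(x - h, sigma, xi))) / (2 * h)
```

Note the sign of ξ controls the nonstationary-design implications: for ξ > 0 (heavy tail) the hazard of larger exceedances decays in x, while for ξ < 0 (bounded tail) it grows.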
Laser Time-of-Flight Mass Spectrometry for Future In Situ Planetary Missions
NASA Technical Reports Server (NTRS)
Getty, S. A.; Brinckerhoff, W. B.; Cornish, T.; Ecelberger, S. A.; Li, X.; Floyd, M. A. Merrill; Chanover, N.; Uckert, K.; Voelz, D.; Xiao, X.;
2012-01-01
Laser desorption/ionization time-of-flight mass spectrometry (LD-TOF-MS) is a versatile, low-complexity instrument class that holds significant promise for future landed in situ planetary missions that emphasize compositional analysis of surface materials. Here we describe a 5kg-class instrument that is capable of detecting and analyzing a variety of analytes directly from rock or ice samples. Through laboratory studies of a suite of representative samples, we show that detection and analysis of key mineral composition, small organics, and particularly, higher molecular weight organics are well suited to this instrument design. A mass range exceeding 100,000 Da has recently been demonstrated. We describe recent efforts in instrument prototype development and future directions that will enhance our analytical capabilities targeting organic mixtures on primitive and icy bodies. We present results on a series of standards, simulated mixtures, and meteoritic samples.
Acclimatization to extreme heat
NASA Astrophysics Data System (ADS)
Warner, M. E.; Ganguly, A. R.; Bhatia, U.
2017-12-01
Heat extremes throughout the globe, as well as in the United States, are expected to increase. These heat extremes have been shown to impact human health, resulting in some of the highest death tolls among comparable natural disasters. But in order to inform decision makers and best understand future mortality and morbidity, adaptation and mitigation must be considered. Defined as the ability of individuals or society to change behavior and/or adapt physiologically, acclimatization encompasses the gradual adaptation that occurs over time. Therefore, this research aims to account for acclimatization to extreme heat by using a hybrid methodology that incorporates future air conditioning use and installation patterns with future temperature-related time series data. While previous studies have not accounted for energy usage patterns and market saturation scenarios, we integrate such factors to compare the impact of air conditioning as a tool for acclimatization, with a particular emphasis on mortality within vulnerable communities.
Financial states of world financial and commodities markets around sovereign debt crisis
NASA Astrophysics Data System (ADS)
Nobi, Ashadun; Lee, Jae Woo
2017-11-01
We applied a threshold method to construct a complex network from the cross-correlation coefficients of 46 daily time series, comprising 23 global indices and 23 commodity futures, from 2010 to 2014. We identify financial states of both global indices and commodity futures based on changes in the network structure. The average correlation trends downward over the study period, except for sharp peaks during crises. The threshold networks are generated at a threshold value of θ = 0.1, and the change in each node's degree over time is used to identify the financial state of each index. We observe that commodity futures such as EU CO2 emissions, live cattle and natural gas, as well as the financial indices of the Jakarta (Indonesia) stock exchange (JKSE) and the Kuala Lumpur stock exchange (KLSE), change states frequently. By the average change in links we identify the indices that are most reactive to crises.
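The threshold-network construction can be sketched as follows; whether signed or absolute correlations are thresholded, and how windows are handled, are assumptions here rather than the authors' exact choices:

```python
import numpy as np

def threshold_network(returns, theta=0.1):
    """Adjacency matrix linking series whose correlation meets the threshold.

    returns: (T, N) array, one column of daily returns per index/future.
    Returns the adjacency matrix and the degree of each node.
    """
    C = np.corrcoef(returns, rowvar=False)
    A = (C >= theta).astype(int)   # assumption: signed correlations thresholded
    np.fill_diagonal(A, 0)         # no self-loops
    return A, A.sum(axis=1)
```

To identify financial states as in the abstract, one would recompute (A, degrees) on sliding windows and flag indices whose degree jumps sharply between windows.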
NASA Astrophysics Data System (ADS)
Dai, Jun; Zhou, Haigang; Zhao, Shaoquan
2017-01-01
This paper considers a multi-scale futures hedging strategy that minimizes lower partial moments (LPM). To do this, wavelet analysis is adopted to decompose time series data into components at different scales. Next, different parametric estimation methods with known distributions are applied to calculate the LPM of hedged portfolios, which is the key to determining multi-scale hedge ratios over different time scales. These parametric methods are then compared with the prevailing nonparametric kernel metric method. Empirical results indicate that in the China Securities Index 300 (CSI 300) index futures and spot markets, hedge ratios and hedge efficiency estimated by the nonparametric kernel metric method are inferior to those estimated by parametric hedging models based on the features of the sequence distributions. In addition, if minimum LPM is selected as the hedge target, the hedging period, degree of risk aversion, and target return each affect the multi-scale hedge ratios and hedge efficiency.
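The minimum-LPM target can be made concrete with a small numerical sketch; the brute-force grid search and the function names are illustrative, not the paper's estimation procedure:

```python
import numpy as np

def lower_partial_moment(returns, target=0.0, order=2):
    """LPM_n(tau) = E[max(tau - R, 0)^n]: penalizes only below-target returns."""
    shortfall = np.maximum(target - np.asarray(returns, dtype=float), 0.0)
    return float(np.mean(shortfall ** order))

def min_lpm_hedge_ratio(spot, futures, target=0.0, order=2):
    """Grid search for h minimizing the LPM of the hedged return spot - h * futures."""
    grid = np.linspace(0.0, 2.0, 201)
    lpms = [lower_partial_moment(spot - h * futures, target, order) for h in grid]
    return float(grid[int(np.argmin(lpms))])
```

When the spot is perfectly tracked by the futures the minimizer is h = 1; in the multi-scale setting of the paper, each wavelet component of the spot and futures series would be hedged separately to obtain scale-specific ratios.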
Doran, Kara S.; Howd, Peter A.; Sallenger, Asbury H.
2016-01-04
Recent studies, and most of their predecessors, use tide gage data to quantify SL acceleration, ASL(t). In the current study, three techniques were used to calculate acceleration from tide gage data, and of those examined, it was determined that the two techniques based on sliding a regression window through the time series are more robust compared to the technique that fits a single quadratic form to the entire time series, particularly if there is temporal variation in the magnitude of the acceleration. The single-fit quadratic regression method has been the most commonly used technique in determining acceleration in tide gage data. The inability of the single-fit method to account for time-varying acceleration may explain some of the inconsistent findings between investigators. Properly quantifying ASL(t) from field measurements is of particular importance in evaluating numerical models of past, present, and future SLR resulting from anticipated climate change.
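The sliding-window technique the study favors can be sketched as below: fit a quadratic in each window and read the acceleration from twice the leading coefficient. Window length and centering are illustrative choices, not the authors' settings:

```python
import numpy as np

def sliding_acceleration(t, sea_level, window):
    """Quadratic fit x(t) = a t^2 + b t + c in each window; acceleration = 2a.

    Unlike a single whole-record quadratic fit, this recovers the time-varying
    acceleration A_SL(t) from a tide gage series.
    """
    centers, accel = [], []
    for i in range(len(t) - window + 1):
        tw = t[i:i + window] - t[i]      # local time origin for conditioning
        a = np.polyfit(tw, sea_level[i:i + window], 2)[0]
        centers.append(t[i + window // 2])
        accel.append(2.0 * a)
    return np.array(centers), np.array(accel)
```

For a record with constant curvature the two methods agree; when acceleration varies in time, only the sliding fit tracks it, which is the robustness the study reports.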
Bao, Wei; Yue, Jun; Rao, Yulei
2017-01-01
The application of deep learning approaches to finance has received a great deal of attention from both investors and researchers. This study presents a novel deep learning framework where wavelet transforms (WT), stacked autoencoders (SAEs) and long-short term memory (LSTM) are combined for stock price forecasting. The SAEs for hierarchically extracted deep features is introduced into stock price forecasting for the first time. The deep learning framework comprises three stages. First, the stock price time series is decomposed by WT to eliminate noise. Second, SAEs is applied to generate deep high-level features for predicting the stock price. Third, high-level denoising features are fed into LSTM to forecast the next day's closing price. Six market indices and their corresponding index futures are chosen to examine the performance of the proposed model. Results show that the proposed model outperforms other similar models in both predictive accuracy and profitability performance.
Sentinel 2 products and data quality status
NASA Astrophysics Data System (ADS)
Clerc, Sebastien; Gascon, Ferran; Bouzinac, Catherine; Touli-Lebreton, Dimitra; Francesconi, Benjamin; Lafrance, Bruno; Louis, Jerome; Alhammoud, Bahjat; Massera, Stephane; Pflug, Bringfried; Viallefont, Francoise; Pessiot, Laetitia
2017-04-01
Since July 2015, Sentinel-2A provides high-quality multi-spectral images with 10 m spatial resolution. With the launch of Sentinel-2B scheduled for early March 2017, the mission will create a consistent time series with a revisit time of 5 days. The consistency of the time series is ensured by some specific performance requirements such as multi-temporal spatial co-registration and radiometric stability, routinely monitored by the Sentinel-2 Mission Performance Centre (S2MPC). The products also provide a rich set of metadata and auxiliary data to support higher-level processing. This presentation will focus on the current status of the Sentinel-2 L1C and L2A products, including dissemination and product format aspects. Up-to-date mission performance estimations will be presented. Finally we will provide an outlook on the future evolutions: commissioning tasks for Sentinel-2B, geometric refinement, product format and processing improvements.
NASA Technical Reports Server (NTRS)
Molnar, Gyula I.; Susskind, Joel; Iredell, Lena
2011-01-01
In the beginning, a good measure of a GCM's performance was its ability to simulate the observed mean seasonal cycle. That is, a reasonable simulation of the means (i.e., small biases) and standard deviations of TODAY'S climate would suffice. Here, we argue that coupled GCM (CGCM for short) simulations of FUTURE climates should be evaluated in much more detail, both spatially and temporally. Arguably, it is not the bias, but rather the reliability of the model-generated anomaly time series, even down to the [C]GCM grid scale, which really matters. This statement is underlined by the societal need to address potential REGIONAL climate variability and climate drifts/changes in a manner suitable for policy decisions.
Imai, Chisato; Hashizume, Masahiro
2015-01-01
Background: Time series analysis is suitable for investigations of relatively direct and short-term effects of exposures on outcomes. In environmental epidemiology studies, this method has been one of the standard approaches for assessing the impacts of environmental factors on acute non-infectious diseases (e.g. cardiovascular deaths), conventionally with generalized linear or additive models (GLM and GAM). However, the same analysis practices are often applied to infectious diseases despite substantial differences from non-infectious diseases that may pose analytical challenges. Methods: Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a systematic review was conducted to elucidate important issues in assessing the associations between environmental factors and infectious diseases using time series analysis with GLM and GAM. Published studies on the associations between weather factors and malaria, cholera, dengue, and influenza were targeted. Findings: Our review raised issues regarding the estimation of susceptible populations and exposure lag times, the adequacy of seasonal adjustments, the presence of strong autocorrelations, and the lack of a smaller observation time unit for outcomes (i.e. daily data). These concerns may be attributable to features specific to infectious diseases, such as transmission among individuals and complicated causal mechanisms. Conclusion: The consequence of not taking adequate measures to address these issues is distortion of the appropriate risk quantification of exposure factors. Future studies should pay careful attention to these details and examine alternative models or methods that improve studies using time series regression analysis for environmental determinants of infectious diseases. PMID:25859149
Healthy and happy in Europe? On the association between happiness and life expectancy over time.
Bjørnskov, Christian
2008-04-01
This paper revisits the standard finding in individual-level studies that happiness leads to longevity. It does so in a cross-country time-series analysis in which the use of a random effects estimator controls for most relevant time-invariant factors. The findings suggest that happiness is negatively associated with longevity at the national level, and suggests a potential indirect transmission channel, as national happiness is negatively associated with public health expenditures. The paper concludes by discussing the implications of the results for public policy and future research.
NASA Astrophysics Data System (ADS)
Van Uytven, Els; Willems, Patrick
2017-04-01
Current trends in hydro-meteorological variables indicate the potential impact of climate change on hydrological extremes, and they therefore give increased importance to climate adaptation strategies in water management. The impact of climate change on hydro-meteorological and hydrological extremes is, however, highly uncertain. This is due to uncertainties introduced by the climate models, the internal variability inherent to the climate system, the greenhouse gas scenarios and the statistical downscaling methods. In view of the need to define sustainable climate adaptation strategies, these uncertainties must be assessed, which is commonly done by means of ensemble approaches. Because more and more climate models and statistical downscaling methods become available, there is a need to facilitate climate impact and uncertainty analysis. A Climate Perturbation Tool has been developed for that purpose, which combines a set of statistical downscaling methods including weather typing, weather generator, transfer function and advanced perturbation-based approaches. By use of an interactive interface, climate impact modelers can apply these statistical downscaling methods in a semi-automatic way to an ensemble of climate model runs. The tool is applicable to any region, but has so far been demonstrated on cases in Belgium, Suriname, Vietnam and Bangladesh. Time series representing future local-scale precipitation, temperature and potential evapotranspiration (PET) conditions were obtained, starting from time series of historical observations. Uncertainties in the future meteorological conditions are represented in two different ways: through an ensemble of time series, and through a reduced set of synthetic scenarios. Both aim to span the full uncertainty range as assessed from the ensemble of climate model runs and downscaling methods.
For Belgium, for instance, use was made of 100-year time series of 10-minute precipitation observations and daily temperature and PET observations at Uccle, and a large ensemble of 160 global climate model runs (CMIP5) covering all four representative concentration pathway (RCP) greenhouse gas scenarios. While evaluating the downscaled meteorological series, particular attention was given to the performance of extreme-value metrics (e.g., for precipitation, intensity-duration-frequency statistics). Moreover, the total uncertainty was decomposed into the fractional uncertainties of each of the uncertainty sources considered. Research assessing the additional uncertainty due to parameter and structural uncertainties of the hydrological impact model is ongoing.
NASA Astrophysics Data System (ADS)
Yan, Fang; Winijkul, Ekbordin; Bond, Tami C.; Streets, David G.
2014-04-01
Estimates of future emissions are necessary for understanding the future health of the atmosphere, designing national and international strategies for air quality control, and evaluating mitigation policies. Emission inventories are uncertain and future projections even more so, thus it is important to quantify the uncertainty inherent in emission projections. This paper is the second in a series that seeks to establish a more mechanistic understanding of future air pollutant emissions based on changes in technology. The first paper in this series (Yan et al., 2011) described a model that projects emissions based on dynamic changes of vehicle fleet, Speciated Pollutant Emission Wizard-Trend, or SPEW-Trend. In this paper, we explore the underlying uncertainties of global and regional exhaust PM emission projections from on-road vehicles in the coming decades using sensitivity analysis and Monte Carlo simulation. This work examines the emission sensitivities due to uncertainties in retirement rate, timing of emission standards, transition rate of high-emitting vehicles called “superemitters”, and emission factor degradation rate. It is concluded that global emissions are most sensitive to parameters in the retirement rate function. Monte Carlo simulations show that emission uncertainty caused by lack of knowledge about technology composition is comparable to the uncertainty demonstrated by alternative economic scenarios, especially during the period 2010-2030.
Multi-scale correlations in different futures markets
NASA Astrophysics Data System (ADS)
Bartolozzi, M.; Mellen, C.; di Matteo, T.; Aste, T.
2007-07-01
In the present work we investigate the multiscale nature of the correlations for high frequency data (1 min) in different futures markets over a period of two years, starting on the 1st of January 2003 and ending on the 31st of December 2004. In particular, by using the concept of local Hurst exponent, we point out how the behaviour of this parameter, usually considered as a benchmark for persistency/antipersistency recognition in time series, is largely time-scale dependent in the market context. These findings are a direct consequence of the intrinsic complexity of a system where trading strategies are scale-adaptive. Moreover, our analysis points out different regimes in the dynamical behaviour of the market indices under consideration.
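A local Hurst exponent of the kind used above can be estimated by applying a rescaled-range (R/S) fit inside a moving window; the window sizes, step and aggregation here are illustrative, not the authors' exact estimator:

```python
import numpy as np

def hurst_rs(x, min_win=8):
    """R/S estimate of the Hurst exponent: slope of log E[R/S] vs log window."""
    x = np.asarray(x, dtype=float)
    wins, ratios = [], []
    w = min_win
    while w <= len(x) // 2:
        vals = []
        for start in range(0, len(x) - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())   # cumulative deviation profile
            spread = dev.max() - dev.min()      # range R
            sd = seg.std()                      # scale S
            if sd > 0:
                vals.append(spread / sd)
        wins.append(w)
        ratios.append(np.mean(vals))
        w *= 2
    return np.polyfit(np.log(wins), np.log(ratios), 1)[0]

def local_hurst(x, window=512, step=128):
    """Hurst exponent on sliding windows: exposes time-scale-dependent persistence."""
    return np.array([hurst_rs(x[i:i + window])
                     for i in range(0, len(x) - window + 1, step)])
```

H near 0.5 indicates no memory, H above 0.5 persistence and H below 0.5 antipersistence; plotting `local_hurst` over time is how scale-adaptive regimes like those in the abstract become visible.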
Time takes space: selective effects of multitasking on concurrent spatial processing.
Mäntylä, Timo; Coni, Valentina; Kubik, Veit; Todorov, Ivo; Del Missier, Fabio
2017-08-01
Many everyday activities require coordination and monitoring of complex relations of future goals and deadlines. Cognitive offloading may provide an efficient strategy for reducing control demands by representing future goals and deadlines as a pattern of spatial relations. We tested the hypothesis that multiple-task monitoring involves time-to-space transformational processes, and that these spatial effects are selective with greater demands on coordinate (metric) than categorical (nonmetric) spatial relation processing. Participants completed a multitasking session in which they monitored four series of deadlines, running on different time scales, while making concurrent coordinate or categorical spatial judgments. We expected and found that multitasking taxes concurrent coordinate, but not categorical, spatial processing. Furthermore, males showed a better multitasking performance than females. These findings provide novel experimental evidence for the hypothesis that efficient multitasking involves metric relational processing.
ERIC Educational Resources Information Center
Devlin, Maureen E., Ed.; Meyerson, Joel W., Ed.
This book summarizes presentations and discussions from the Fall 1999 symposium of the Forum for the Future of Higher Education. Part 1, "Winner-Take-All Markets," includes: (1) "Higher Education: The Ultimate Winner-Take-All Market?" (Robert H. Frank); (2) "The Return to Attending a More Selective College: 1960 to the…
ERIC Educational Resources Information Center
Harvey, Nigel; Reimers, Stian
2013-01-01
People's forecasts from time series underestimate future values for upward trends and overestimate them for downward ones. This trend damping may occur because (a) people anchor on the last data point and make insufficient adjustment to take the trend into account, (b) they adjust toward the average of the trends they have encountered within the…
USDA-ARS?s Scientific Manuscript database
It is widely believed that in Germany and Europe the risk of soil erosion by water increases as a result of changes in climate. Especially, an increase of the frequency of extreme precipitation events during phenological crop phases with reduced soil cover is very likely for the near future. A monit...
The Ratio of Public Investment in Education in China
ERIC Educational Resources Information Center
Liu, Zeyun; Yuan, Liansheng
2007-01-01
Based on cross-section data worldwide and time series data in China, the essay is intended to make an analysis of the factors which have impacts on the ratio of public investment in education by using econometric models and then the future ratio may be predicted. Conclusions are as follows. First, the proportion of fiscal revenue to GDP (gross…
ERIC Educational Resources Information Center
Bachman, Jerald G.; And Others
To explore costs and benefits of part-time work for high school students, survey responses of high school seniors from the classes of 1980 through 1984 were examined, distinguishing between those working many hours, those working fewer hours, and those not employed. Because hours of work differed by sex and by college plans, most analyses…
Geerse, Daphne J; Coolen, Bert H; Roerdink, Melvyn
2015-01-01
Walking ability is frequently assessed with the 10-meter walking test (10MWT), which may be instrumented with multiple Kinect v2 sensors to complement the typical stopwatch-based time to walk 10 meters with quantitative gait information derived from Kinect's 3D body point's time series. The current study aimed to evaluate a multi-Kinect v2 set-up for quantitative gait assessments during the 10MWT against a gold-standard motion-registration system by determining between-systems agreement for body point's time series, spatiotemporal gait parameters and the time to walk 10 meters. To this end, the 10MWT was conducted at comfortable and maximum walking speed, while 3D full-body kinematics was concurrently recorded with the multi-Kinect v2 set-up and the Optotrak motion-registration system (i.e., the gold standard). Between-systems agreement for body point's time series was assessed with the intraclass correlation coefficient (ICC). Between-systems agreement was similarly determined for the gait parameters' walking speed, cadence, step length, stride length, step width, step time, stride time (all obtained for the intermediate 6 meters) and the time to walk 10 meters, complemented by Bland-Altman's bias and limits of agreement. Body point's time series agreed well between the motion-registration systems, particularly so for body points in motion. For both comfortable and maximum walking speeds, the between-systems agreement for the time to walk 10 meters and all gait parameters except step width was high (ICC ≥ 0.888), with negligible biases and narrow limits of agreement. Hence, body point's time series and gait parameters obtained with a multi-Kinect v2 set-up match well with those derived with a gold standard in 3D measurement accuracy. 
Future studies are recommended to test the clinical utility of the multi-Kinect v2 set-up to automate 10MWT assessments, thereby complementing the time to walk 10 meters with reliable spatiotemporal gait parameters obtained objectively in a quick, unobtrusive and patient-friendly manner.
PMID:26461498
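The agreement statistics named in the abstract can be computed as follows. This sketch assumes the two-way random-effects, absolute-agreement, single-measures ICC (often denoted ICC(A,1)); the abstract does not state which ICC variant was used, so that choice and the function names are ours:

```python
import numpy as np

def icc_a1(data):
    """ICC(A,1): two-way random effects, absolute agreement, single measures.

    data: (n_subjects, k_systems) array, e.g. one column per motion-registration
    system for a given gait parameter.
    """
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    row_m = data.mean(axis=1)          # per-subject means
    col_m = data.mean(axis=0)          # per-system means
    msr = k * np.sum((row_m - grand) ** 2) / (n - 1)
    msc = n * np.sum((col_m - grand) ** 2) / (k - 1)
    mse = (np.sum((data - row_m[:, None] - col_m[None, :] + grand) ** 2)
           / ((n - 1) * (k - 1)))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

def bland_altman(a, b):
    """Bland-Altman bias and 95% limits of agreement between two systems."""
    diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

Feeding each gait parameter (one column per system) through `icc_a1` and `bland_altman` reproduces the kind of agreement summary reported: ICC near 1 with a negligible bias and narrow limits.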
Kapsenberg, Lydia; Kelley, Amanda L.; Shaw, Emily C.; Martz, Todd R.; Hofmann, Gretchen E.
2015-01-01
Understanding how declining seawater pH caused by anthropogenic carbon emissions, or ocean acidification, impacts Southern Ocean biota is limited by a paucity of pH time-series. Here, we present the first high-frequency in-situ pH time-series in near-shore Antarctica from spring to winter under annual sea ice. Observations from autonomous pH sensors revealed a seasonal increase of 0.3 pH units. The summer season was marked by an increase in temporal pH variability relative to spring and early winter, matching coastal pH variability observed at lower latitudes. Using our data, simulations of ocean acidification show a future period of deleterious wintertime pH levels potentially expanding to 7–11 months annually by 2100. Given the presence of (sub)seasonal pH variability, Antarctic marine species have an existing physiological tolerance of temporal pH change that may influence adaptation to future acidification. Yet, pH-induced ecosystem changes remain difficult to characterize in the absence of sufficient physiological data on present-day tolerances. It is therefore essential to incorporate natural and projected temporal pH variability in the design of experiments intended to study ocean acidification biology.
Wavelet Statistical Analysis of Low-Latitude Geomagnetic Measurements
NASA Astrophysics Data System (ADS)
Papa, A. R.; Akel, A. F.
2009-05-01
Following previous works by our group (Papa et al., JASTP, 2006), where we analyzed a series of records acquired at the Vassouras National Geomagnetic Observatory in Brazil for the month of October 2000, we introduce a wavelet analysis for the same type of data and for other periods. It is well known that wavelets allow a more detailed study in several senses: the time window for analysis can be drastically reduced compared with other traditional methods (Fourier, for example), while at the same time allowing an almost continuous tracking of both the amplitude and frequency of signals as time goes by. This advantage opens possibilities for potentially useful forecasting methods of the type also advanced by our group in previous works (see, for example, Papa and Sosman, JASTP, 2008). However, the simultaneous statistical analysis of both time series (in our case, amplitude and frequency) is a challenging matter, and it is in this task that we consider our main goal to lie. Some possible directions for future work are outlined.
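The kind of joint amplitude/frequency tracking described above can be illustrated with a continuous wavelet transform. The sketch below is a generic NumPy Morlet CWT applied to a toy record whose dominant frequency jumps halfway through; it is not the observatory pipeline, and all signal parameters are assumptions.

```python
import numpy as np

def morlet_cwt(x, scales, w0=6.0):
    """Continuous wavelet transform with a Morlet mother wavelet.

    Returns a (len(scales), len(x)) complex array; |W| tracks signal
    amplitude, and the best-matching scale tracks the dominant
    frequency as it evolves over time.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    t = np.arange(n) - n // 2              # wavelet support, in samples
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        u = t / s
        psi = np.exp(1j * w0 * u - 0.5 * u**2) / np.sqrt(s)
        # correlation of the signal with the scaled, centered wavelet
        out[i] = np.convolve(x, np.conj(psi)[::-1], mode="same")
    return out

# Toy record: 2 Hz for the first half, 6 Hz for the second half
fs, n = 64.0, 1024
tt = np.arange(n) / fs
sig = np.where(tt < n / fs / 2, np.sin(2*np.pi*2.0*tt), np.sin(2*np.pi*6.0*tt))
freqs = np.linspace(1.0, 8.0, 30)          # Hz
scales = 6.0 * fs / (2 * np.pi * freqs)    # Morlet scale for each frequency
W = np.abs(morlet_cwt(sig, scales))
print(freqs[W[:, 200].argmax()], freqs[W[:, 800].argmax()])  # low, then high
```

The scale-to-frequency mapping `s = w0 * fs / (2*pi*f)` is the standard Morlet convention.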
NASA Astrophysics Data System (ADS)
Campbell, Adam J.; Hulbe, Christina L.; Lee, Choon-Ki
2018-01-01
As time series observations of Antarctic change proliferate, it is imperative that mathematical frameworks through which they are understood keep pace. Here we present a new method of interpreting remotely sensed change using spatial statistics and apply it to the specific case of thickness change on the Ross Ice Shelf. First, a numerical model of ice shelf flow is used together with empirical orthogonal function analysis to generate characteristic patterns of response to specific forcings. Because they are continuous and scalable in space and time, the patterns allow short duration observations to be placed in a longer time series context. Second, focusing only on changes that are statistically significant, the synthetic response surfaces are used to extract magnitude and timing of past events from the observational data. Slowdown of Kamb and Whillans Ice Streams is clearly detectable in remotely sensed thickness change. Moreover, those past events will continue to drive thinning into the future.
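The empirical orthogonal function step described above reduces, in its basic form, to a singular value decomposition of the centered (time, space) data matrix. A minimal NumPy sketch with synthetic data follows; the array shapes and the toy "thickness change" forcing are illustrative assumptions, not the paper's model output.

```python
import numpy as np

def eof_analysis(field, n_modes=3):
    """Empirical orthogonal functions of a (time, space) data matrix.

    Returns spatial patterns, principal-component time series, and the
    fraction of variance explained by each mode.
    """
    X = field - field.mean(axis=0)            # remove the time mean
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    variance = s**2 / np.sum(s**2)
    pcs = U[:, :n_modes] * s[:n_modes]        # temporal amplitudes
    patterns = Vt[:n_modes]                   # spatial structures
    return patterns, pcs, variance[:n_modes]

# Synthetic "thickness change": one dominant spatial pattern whose
# amplitude grows through time, plus weak noise
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)[:, None]                    # 50 "years"
space = np.sin(np.linspace(0, np.pi, 40))[None, :]    # 40 "grid cells"
field = 5.0 * t * space + 0.1 * rng.standard_normal((50, 40))
patterns, pcs, var = eof_analysis(field)
print(var[0] > 0.95)   # leading mode captures nearly all variance
```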
NASA Astrophysics Data System (ADS)
Pulido-Velazquez, David; Collados-Lara, Antonio-Juan; Alcalá, Francisco J.
2017-04-01
This research proposes and applies a method to assess potential impacts of future climatic scenarios on aquifer rainfall recharge in wide and varied regions. The territory of continental Spain was selected to demonstrate the application. The method requires generating future series of climatic variables (precipitation, temperature) for the system and simulating them within a hydrological model previously calibrated against the historical data. In a previous work, Alcalá and Custodio (2014) used the atmospheric chloride mass balance (CMB) method for the spatial evaluation of average aquifer recharge by rainfall over the whole of continental Spain, assuming long-term steady conditions of the balance variables. The distributed average CMB variables necessary to calculate recharge were estimated from available variable-length data series of variable quality and spatial coverage. The CMB variables were regionalized by ordinary kriging at the same 4976 nodes of a 10 km x 10 km grid. Two main sources of uncertainty affecting recharge estimates (given by the coefficient of variation, CV), induced by the inherent natural variability of the variables and by mapping, were segregated. Based on these stationary results we define a simple empirical rainfall-recharge model. We consider the spatiotemporal variability of rainfall and temperature to be the most important climatic variables influencing potential aquifer recharge under the natural regime. Changes in these variables can be important in the assessment of future potential impacts of climatic scenarios on spatiotemporally distributed renewable groundwater resources. For instance, if temperature increases, actual evapotranspiration (EA) will increase, reducing the water available for other groundwater balance components, including recharge. 
For this reason, instead of defining an infiltration rate coefficient that relates precipitation (P) and recharge, we propose to define a transformation function that allows estimating the spatial distribution of recharge (both its average value and its uncertainty) from the difference between P and EA in each area. A complete analysis of potential short-term (2016-2045) future climate scenarios in continental Spain has been performed considering different sources of uncertainty. It is based on the historical climatic data for the period 1976-2005 and the climate model simulations (for the control period [1976-2005] and future scenarios [2016-2045]) performed within the EU CORDEX project. The most pessimistic emission scenario (RCP8.5) has been considered. For the RCP8.5 scenario we have analyzed the time series generated by simulating with five Regional Climate Models (CCLM4-8-17, RCA4, HIRHAM5, RACMO22E, and WRF331F) nested to 4 different General Circulation Models (GCMs). Two different conceptual approaches (bias correction and delta change techniques) have been applied to generate potential future climate scenarios from these data. Different ensembles of the obtained time series have been proposed to obtain more representative scenarios, considering either all the simulations or only those providing better approximations to the historical statistics based on a multicriteria analysis. This was a step towards analyzing future potential impacts on aquifer recharge by simulating the scenarios within a rainfall-recharge model. This research has been supported by the CGL2013-48424-C2-2-R (MINECO) and the PMAFI/06/14 (UCAM) projects.
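One of the two scenario-generation techniques named above, the delta-change method, is simple enough to sketch. The version below assumes monthly series aligned to calendar years and a multiplicative (precipitation-style) change factor; the data in the example are synthetic, not CORDEX output.

```python
import numpy as np

def delta_change(observed, control, future):
    """Monthly multiplicative delta change for a precipitation-like series.

    All three inputs are monthly series starting in January, with length a
    multiple of 12. The observed series is rescaled by the climate model's
    future/control mean ratio for each calendar month: the model supplies
    only the change signal, while observed variability is preserved.
    """
    obs, ctl, fut = (np.asarray(a, float).reshape(-1, 12)
                     for a in (observed, control, future))
    ratio = fut.mean(axis=0) / ctl.mean(axis=0)   # 12 monthly change factors
    return (obs * ratio).ravel()

# Synthetic check: the scenario run is the control run scaled month by
# month, so the rescaled observations must reproduce those same factors
rng = np.random.default_rng(1)
obs = 50 + 10 * rng.standard_normal(30 * 12)      # 30 years of observations
ctl = 50 + 10 * rng.standard_normal(30 * 12)      # model control run
f = np.linspace(1.1, 0.8, 12)                     # wetter winters, drier summers
fut = (ctl.reshape(-1, 12) * f).ravel()           # synthetic scenario run
out = delta_change(obs, ctl, fut)
print(np.allclose(out.reshape(-1, 12) / obs.reshape(-1, 12), f))
```

Bias correction, the other named technique, would instead adjust the model's future run toward the observed statistics.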
A framework for periodic outlier pattern detection in time-series sequences.
Rasheed, Faraz; Alhajj, Reda
2014-05-01
Periodic pattern detection in time-ordered sequences is an important data mining task, which discovers in the time series all patterns that exhibit temporal regularities. Periodic pattern mining has a large number of applications in real life; it helps understanding the regular trend of the data along time, and enables the forecast and prediction of future events. An interesting related and vital problem that has not received enough attention is to discover outlier periodic patterns in a time series. Outlier patterns are defined as those which are different from the rest of the patterns; outliers are not noise. While noise does not belong to the data and it is mostly eliminated by preprocessing, outliers are actual instances in the data but have exceptional characteristics compared with the majority of the other instances. Outliers are unusual patterns that rarely occur, and, thus, have lesser support (frequency of appearance) in the data. Outlier patterns may hint toward discrepancy in the data such as fraudulent transactions, network intrusion, change in customer behavior, recession in the economy, epidemic and disease biomarkers, severe weather conditions like tornados, etc. We argue that detecting the periodicity of outlier patterns might be more important in many sequences than the periodicity of regular, more frequent patterns. In this paper, we present a robust and time efficient suffix tree-based algorithm capable of detecting the periodicity of outlier patterns in a time series by giving more significance to less frequent yet periodic patterns. Several experiments have been conducted using both real and synthetic data; all aspects of the proposed approach are compared with the existing algorithm InfoMiner; the reported results demonstrate the effectiveness and applicability of the proposed approach.
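The key idea above, that a rare pattern can still be strongly periodic, can be illustrated with a naive scoring scan. The paper's algorithm is suffix-tree based for efficiency and is compared against InfoMiner; the sketch below reproduces only the scoring idea (confidence relative to periodic slots, not to sequence length) on a toy symbol sequence.

```python
def outlier_periodicity(seq, symbol, max_period=None):
    """Score candidate (period, phase) pairs for one symbol's occurrences.

    For each candidate period p, confidence is the fraction of the
    periodic slots (phase, phase+p, ...) actually holding the symbol, so
    a rare-but-regular pattern scores highly even though its overall
    support in the sequence is low.
    """
    n = len(seq)
    max_period = max_period or n // 2
    best = {}
    for p in range(2, max_period + 1):
        for phase in range(p):
            slots = range(phase, n, p)
            conf = sum(1 for i in slots if seq[i] == symbol) / len(slots)
            if conf > best.get(p, (0.0, 0))[0]:
                best[p] = (conf, phase)
    return best

# 'c' is rare (5 of 20 symbols) yet perfectly periodic with period 4
seq = "abcd" * 5
scores = outlier_periodicity(seq, "c")
print(scores[4])   # → (1.0, 2): full confidence at phase 2
```

This brute-force scan is quadratic in the sequence length; the suffix-tree formulation exists precisely to avoid that cost.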
Online Time Series Analysis of Land Products over Asia Monsoon Region via Giovanni
NASA Technical Reports Server (NTRS)
Shen, Suhung; Leptoukh, Gregory G.; Gerasimov, Irina
2011-01-01
Time series analysis is critical to the study of land cover/land use changes and climate. Time series studies at local-to-regional scales require data at higher spatial resolution, such as 1 km or less. MODIS land products at 250 m to 1 km resolution enable such studies. However, these MODIS land data files are distributed in 10°x10° tiles because of large data volumes. Conducting a time series study requires downloading all tiles that include the study area for the time period of interest, and mosaicking the tiles spatially. This can be an extremely time-consuming process. In support of the Monsoon Asia Integrated Regional Study (MAIRS) program, NASA GES DISC (Goddard Earth Sciences Data and Information Services Center) has processed MODIS land products at 1 km resolution over the Asia monsoon region (0°-60°N, 60°-150°E) with a common data structure and format. The processed data have been integrated into the Giovanni system (Goddard Interactive Online Visualization ANd aNalysis Infrastructure), which enables users to explore, analyze, and download data over an area and time period of interest easily. Currently, the following regional MODIS land products are available in Giovanni: 8-day 1 km land surface temperature and active fire, monthly 1 km vegetation index, and yearly 0.05° and 500 m land cover types. More data will be added in the near future. By combining atmospheric and oceanic data products in the Giovanni system, it is possible to conduct further analyses of environmental and climate changes associated with the land, ocean, and atmosphere. This presentation demonstrates exploring land products in the Giovanni system with sample case scenarios.
NASA Astrophysics Data System (ADS)
Champion, N.
2012-08-01
Contrary to aerial images, satellite images are often affected by the presence of clouds. Identifying and removing these clouds is one of the primary steps in processing satellite images, as clouds may alter subsequent procedures such as atmospheric corrections, DSM production or land cover classification. The main goal of this paper is to present the cloud detection approach developed at the French Mapping Agency. Our approach relies on the availability of multi-temporal satellite images (i.e., time series that generally contain between 5 and 10 images) and is based on a region-growing procedure. Seeds (corresponding to clouds) are first extracted through a pixel-to-pixel comparison between the images contained in the time series (the presence of a cloud is here assumed to be related to a high variation of reflectance between two images). Clouds are then delineated finely using a dedicated region-growing algorithm. The method, originally designed for panchromatic SPOT5-HRS images, is tested in this paper using time series with 9 multi-temporal satellite images. Our preliminary experiments show the good performance of our method. In the near future, the method will be applied to Pléiades images acquired during the in-flight commissioning phase of the satellite (launched at the end of 2011). In that context, it is a particular goal of this paper to show to what extent and in which way our method can be adapted to this kind of imagery.
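The two-stage logic described above (seeds from a strong inter-date reflectance difference, then region growing into a weaker threshold) can be sketched as a flood fill. This is a generic NumPy illustration under assumed thresholds and a synthetic difference image, not the agency's production algorithm.

```python
import numpy as np

def grow_clouds(diff, seed_thresh, grow_thresh):
    """Two-stage cloud mask: seeds where the inter-date reflectance
    difference is strong, then 4-connected region growing into pixels
    above a weaker threshold.
    """
    seeds = diff > seed_thresh
    candidate = diff > grow_thresh
    mask = seeds.copy()
    stack = list(zip(*np.nonzero(seeds)))
    while stack:                               # flood fill from the seeds
        r, c = stack.pop()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if (0 <= rr < diff.shape[0] and 0 <= cc < diff.shape[1]
                    and candidate[rr, cc] and not mask[rr, cc]):
                mask[rr, cc] = True
                stack.append((rr, cc))
    return mask

# A bright "cloud": core above the seed threshold, fringe between the
# two thresholds, background below both
diff = np.zeros((8, 8))
diff[2:6, 2:6] = 0.15      # fringe
diff[3:5, 3:5] = 0.40      # core (seeds)
mask = grow_clouds(diff, seed_thresh=0.3, grow_thresh=0.1)
print(int(mask.sum()))     # whole 4x4 cloud recovered, not just the core
```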
NASA Astrophysics Data System (ADS)
Chen, R. S.; Levy, M.; Baptista, S.; Adamo, S.
2010-12-01
Vulnerability to climate variability and change will depend on dynamic interactions between different aspects of climate, land-use change, and socioeconomic trends. Measurements and projections of these changes are difficult at the local scale but necessary for effective planning. New data sources and methods make it possible to assess land-use and socioeconomic changes that may affect future patterns of climate vulnerability. In this paper we report on new time series data sets that reveal trends in the spatial patterns of climate vulnerability in the Caribbean/Gulf of Mexico Region. Specifically, we examine spatial time series data for human population over the period 1990-2000, time series data on land use and land cover over 2000-2009, and infant mortality rates as a proxy for poverty for 2000-2008. We compare the spatial trends for these measures to the distribution of climate-related natural disaster risk hotspots (cyclones, floods, landslides, and droughts) in terms of frequency, mortality, and economic losses. We use these data to identify areas where climate vulnerability appears to be increasing and where it may be decreasing. Regions where trends and patterns are especially worrisome include coastal areas of Guatemala and Honduras.
Observing climate change trends in ocean biogeochemistry: when and where.
Henson, Stephanie A; Beaulieu, Claudie; Lampitt, Richard
2016-04-01
Understanding the influence of anthropogenic forcing on the marine biosphere is a high priority. Climate change-driven trends need to be accurately assessed and detected in a timely manner. As part of the effort towards detection of long-term trends, a network of ocean observatories and time series stations provides high-quality data for a number of key parameters, such as pH, oxygen concentration or primary production (PP). Here, we use an ensemble of global coupled climate models to assess the temporal and spatial scales over which observations of eight biogeochemically relevant variables must be made to robustly detect a long-term trend. We find that, as a global average, continuous time series are required for between 14 (pH) and 32 (PP) years to distinguish a climate change trend from natural variability. Regional differences are extensive, with low latitudes and the Arctic generally needing shorter time series (<~30 years) to detect trends than other areas. In addition, we quantify the 'footprint' of existing and planned time series stations, that is, the area over which a station is representative of a broader region. Footprints are generally largest for pH and sea surface temperature, but nevertheless the existing network of observatories represents only 9-15% of the global ocean surface. Our results present a quantitative framework for assessing the adequacy of current and future ocean observing networks for detection and monitoring of climate change-driven responses in the marine ecosystem. © 2016 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
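The study estimates detection times from a model ensemble; a standard closed-form alternative from the trend-detection literature (Weatherhead et al., 1998) captures the same dependence on noise level and autocorrelation and is easy to sketch. The parameter values below are purely illustrative, not taken from this paper.

```python
def detection_time(trend_per_year, noise_sd, lag1_autocorr):
    """Approximate years of continuous observations needed to detect a
    linear trend at ~95% confidence with ~90% power, for noise with the
    given standard deviation and lag-1 autocorrelation
    (Weatherhead et al., 1998 closed form)."""
    phi = lag1_autocorr
    return ((3.3 * noise_sd / abs(trend_per_year))
            * ((1 + phi) / (1 - phi)) ** 0.5) ** (2.0 / 3.0)

# Illustrative: a pH trend of -0.002 yr^-1 against monthly-mean noise of
# 0.01 pH units with modest autocorrelation
print(round(detection_time(-0.002, 0.01, 0.3), 1))
```

The formula makes the abstract's qualitative point explicit: detection time grows with noise variance and autocorrelation, and shrinks with trend magnitude.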
Replacement predictions for drinking water networks through historical data.
Malm, Annika; Ljunggren, Olle; Bergstedt, Olof; Pettersson, Thomas J R; Morrison, Gregory M
2012-05-01
Lifetime distribution functions and current network age data can be combined to provide an assessment of the future replacement needs for drinking water distribution networks. Reliable lifetime predictions are limited by a lack of understanding of deterioration processes for different pipe materials under varied conditions. An alternative approach is the use of real historical data for replacement over an extended time series. In this paper, future replacement needs are predicted through historical data representing more than one hundred years of drinking water pipe replacement in Gothenburg, Sweden. The verified data fit well with commonly used lifetime distribution curves. Predictions for the future are discussed in the context of path dependence theory. Copyright © 2012 Elsevier Ltd. All rights reserved.
77 FR 16280 - Capital Research and Management Company, et al.; Notice of Application
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-20
.... APPLICANTS: American Funds Insurance Series (``AFIS''), Capital Research and Management Company (``CRMC.../search.htm or by calling (202) 551-8090. Applicants' Representations 1. AFIS is organized as a... extent necessary to permit any existing or future series of AFIS and any other existing or future...
NASA Astrophysics Data System (ADS)
Müller, Ruben; Schütze, Niels
2014-05-01
Water resources systems with reservoirs are expected to be sensitive to climate change. Assessment studies that analyze the impact of climate change on the performance of reservoirs can be divided into two groups: (1) studies that simulate operation under projected inflows with the current set of operational rules; because the operational rules are not adapted, the future performance of these reservoirs can be underestimated and the impact overestimated; and (2) studies that optimize the operational rules to best adapt the system to the projected conditions before assessing the impact. The latter allows future performance to be estimated more realistically, and adaptation strategies based on new operation rules are available if required. Multi-purpose reservoirs serve various, often conflicting functions. If all functions cannot be served simultaneously at a maximum level, an effective compromise between the multiple objectives of reservoir operation has to be found. Yet under climate change the historically preferred compromise may no longer be the most suitable compromise in the future. Therefore, a multi-objective climate change impact assessment approach for multi-purpose multi-reservoir systems is proposed in this study. Projected inflows are provided in a first step using a physically based rainfall-runoff model. In a second step, a time series model is applied to generate long-term inflow time series. Finally, the long-term inflow series are used as driving variables for a simulation-based multi-objective optimization of the reservoir system in order to derive optimal operation rules. As a result, the adapted Pareto-optimal set of diverse best-compromise solutions can be presented to the decision maker to assist in assessing climate change adaptation measures with respect to the future performance of the multi-purpose reservoir system. The approach is tested on a multi-purpose multi-reservoir system in a mountainous catchment in Germany. 
A climate change assessment is performed for climate change scenarios based on the SRES emission scenarios A1B, B1 and A2 for a set of statistically downscaled meteorological data. The future performance of the multi-purpose multi-reservoir system is quantified and possible intensifications of trade-offs between management goals or reservoir utilizations are shown.
Optimizing Use of Water Management Systems during Changes of Hydrological Conditions
NASA Astrophysics Data System (ADS)
Výleta, Roman; Škrinár, Andrej; Danáčová, Michaela; Valent, Peter
2017-10-01
When designing water management systems and their components, there is a need for more detailed research on the hydrological conditions of the river basin whose runoff creates the main source of water in the reservoir. Over the lifetime of a water management system, the hydrological time series are never repeated in the same form as those which served as input for the design of the system components. The design assumes the observed time series to be representative at the time of the system's use. However, this is a rather unrealistic assumption, because the hydrological past will not be exactly repeated over the design lifetime. When designing water management systems, specialists may therefore face insufficient or oversized capacity designs, or possibly wrong specification of the management rules, which may lead to non-optimal use. It is therefore necessary to establish a comprehensive approach to simulating the fluctuations in interannual runoff (taking into account the current dry and wet periods) in the form of stochastic modelling techniques in water management practice. The paper deals with a methodological procedure for modelling mean monthly flows using the stochastic Thomas-Fiering model, modified by applying the Wilson-Hilferty transformation to the independent random numbers. This transformation is usually applied in the event of significant asymmetry in the observed time series. The methodological procedure was applied to data acquired at the gauging station of Horné Orešany on the Parná Stream. Observed mean monthly flows for the period 1.11.1980 - 31.10.2012 served as the model input. After estimating the model parameters and the Wilson-Hilferty transformation parameters, synthetic time series of mean monthly flows were simulated. These were compared with the observed hydrological time series using basic statistical characteristics (e.g. 
mean, standard deviation and skewness) to test the quality of the model simulation. The synthetic hydrological series of monthly flows were created with the same statistical properties as the time series observed in the past. The compiled model was able to take into account the diversity of extreme hydrological situations in the form of synthetic series of mean monthly flows, confirming the occurrence of sets of flows that could occur in the future. The results of stochastic modelling in the form of synthetic time series of mean monthly flows, which take into account the seasonal fluctuation of runoff within the year, could be applicable in engineering hydrology (e.g., for optimal use of an existing water management system in connection with reassessment of the economic risks of the system).
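The Thomas-Fiering recursion with Wilson-Hilferty-transformed residuals can be sketched compactly. The monthly statistics below are hypothetical, not the Parná Stream values, and the exact parameter-estimation details of the study are not reproduced.

```python
import numpy as np

def wilson_hilferty(t, skew):
    """Map a standard-normal variate to a standardized skewed
    (Pearson III-type) variate via the Wilson-Hilferty transformation."""
    if abs(skew) < 1e-9:
        return t
    g = skew / 6.0
    return (2.0 / skew) * ((1.0 + g * t - g * g) ** 3 - 1.0)

def thomas_fiering(means, sds, skews, lag1_r, n_years, seed=0):
    """Generate synthetic mean monthly flows (Thomas-Fiering model).

    lag1_r[j] is the correlation between month j and month j+1 flows;
    skewed residuals enter through the Wilson-Hilferty transformation.
    """
    rng = np.random.default_rng(seed)
    q = [means[0]]
    for i in range(1, n_years * 12):
        j, jn = (i - 1) % 12, i % 12            # current and next month
        b = lag1_r[j] * sds[jn] / sds[j]
        xi = wilson_hilferty(rng.standard_normal(), skews[jn])
        q_next = (means[jn] + b * (q[-1] - means[j])
                  + xi * sds[jn] * np.sqrt(1 - lag1_r[j] ** 2))
        q.append(max(q_next, 0.0))              # flows cannot be negative
    return np.array(q)

# Hypothetical monthly statistics (m^3/s), with spring snowmelt peak
means = np.array([3, 4, 6, 8, 7, 5, 3, 2, 2, 2.5, 3, 3.5])
sds = 0.3 * means
skews = np.full(12, 1.0)
r = np.full(12, 0.6)
flows = thomas_fiering(means, sds, skews, r, n_years=500)
# A long synthetic run reproduces the seasonal means it was built from:
print(np.allclose(flows.reshape(-1, 12).mean(axis=0), means, rtol=0.1))
```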
Neural basis of postural instability identified by VTC and EEG
Cao, Cheng; Jaiswal, Niharika; Newell, Karl M.
2010-01-01
In this study, we investigated the neural basis of virtual time to contact (VTC) and the hypothesis that VTC provides predictive information for future postural instability. A novel approach was developed to differentiate stable pre-falling and transition-to-instability stages within a single postural trial while a subject performed a challenging single-leg stance with eyes closed. Specifically, we utilized wavelet transform and stage segmentation algorithms using the VTC time series as input. The VTC time series was time-locked with multichannel (n = 64) EEG signals to examine its underlying neural substrates. To identify the focal sources of the neural substrates of VTC, a two-step approach was designed combining independent component analysis (ICA) and low-resolution tomography (LORETA) of multichannel EEG. There were two major findings: (1) a significant increase of VTC minimal values (along with enhanced variability of VTC) was observed during the transition-to-instability stage with progression to ultimate loss of balance and falling; and (2) these VTC dynamics were associated with pronounced modulation of EEG predominantly within the theta, alpha and gamma frequency bands. The sources of this EEG modulation were identified at the anterior cingulate cortex (ACC) and the junction of the precuneus and parietal lobe, as well as at the occipital cortex. The findings support the hypothesis that the systematic increase of minimal VTC values, concomitant with modulation of EEG signals at the frontal-central and parietal-occipital areas, serves collectively to predict future postural instability. PMID:19655130
Financing College in Hard Times: Work and Student Aid. The CSU Crisis and California's Future
ERIC Educational Resources Information Center
Civil Rights Project / Proyecto Derechos Civiles, 2011
2011-01-01
This report is the third in a series of reports designed to analyze the impact of fiscal cutbacks on opportunity for higher education in the California State University system, the huge network of 23 universities that provides the largest share of Bachelor of Arts (BA) level education in the state. The first study, "Higher Tuition,…
ERIC Educational Resources Information Center
Liu, Wendy; Aaker, Jennifer
2007-01-01
In this research, we investigate the impact of significant life experiences on intertemporal decisions among young adults. A series of experiments focus specifically on the impact of experiencing the death of a close other by cancer. We show that such an experience, which bears information about time, is associated with making decisions that favor…
The Future of U.S. Doctoral Programs in Physics (May 22-23, 1989). Topical Conference Series.
ERIC Educational Resources Information Center
Neal, Homer A., Ed.; Wilson, Jack M., Ed.
The 1990s represent an unusual period in physics. Some areas are in a state of unusual excitement, while divisions are growing within the discipline over priorities. Another problem facing the field at this time is that few U.S. nationals are going into careers related to physics. In addition, the percentage of females and minorities…
Cross-cultural comparisons of delay discounting of gain and loss.
Ishii, Keiko; Gang, Lili; Takahashi, Taiki
2016-11-01
People generally tend to discount future outcomes in favor of smaller but immediate gains (i.e., delay discounting). The present research examined cultural similarities and differences in delay discounting of gain and loss between Chinese and Japanese participants, based on a q-exponential model of intertemporal choice. Using a hypothetical situation, we asked 65 Japanese participants and 51 Chinese participants to choose between receiving (or paying) a different amount of money immediately or with a specified delay (1 week, 2 weeks, 1 month, 6 months, 1 year, 5 years, and 25 years). For each delay, participants completed a series of 40 binary choices for gain or loss. Regardless of culture, the q-exponential model was the optimal model. Both impulsivity and time-inconsistency were higher for future gains than for future losses. In addition to these cultural similarities, Chinese participants discounted future gains and losses more steeply than did Japanese participants. In contrast, Japanese participants were more time-inconsistent in delay discounting than were Chinese participants, suggesting that the reduction in their subjective value depended relatively more on delay.
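The q-exponential discount function referenced above has the closed form V = A / (1 + (1 - q) k D)^(1/(1 - q)), where k indexes impulsivity and the departure of q from 1 indexes time-inconsistency; q -> 1 recovers exponential (time-consistent) discounting. A small sketch with illustrative parameter values, not values estimated in the study:

```python
import numpy as np

def q_exponential_value(amount, delay, k, q):
    """Subjective value under q-exponential discounting.

    q -> 1 recovers exponential discounting; q = 0 gives simple
    hyperbolic discounting.
    """
    if abs(1.0 - q) < 1e-9:
        return amount * np.exp(-k * delay)
    return amount / (1.0 + (1.0 - q) * k * delay) ** (1.0 / (1.0 - q))

# Same k: the hyperbolic-like curve (q=0) discounts the far future less
# steeply than the exponential one (q=1), producing preference reversals
for d in (1, 12, 60):                      # delays in months
    print(d,
          round(q_exponential_value(100, d, 0.05, 0.0), 1),
          round(q_exponential_value(100, d, 0.05, 1.0), 1))
```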
Spatial, Temporal and Spatio-Temporal Patterns of Maritime Piracy
Marchione, Elio
2013-01-01
Objectives: To examine patterns in the timing and location of incidents of maritime piracy to see whether, like many urban crimes, attacks cluster in space and time. Methods: Data for all incidents of maritime piracy worldwide recorded by the National Geospatial Intelligence Agency are analyzed using time-series models and methods originally developed to detect disease contagion. Results: At the macro level, analyses suggest that incidents of pirate attacks are concentrated in five subregions of the Earth's oceans and that the time series for these subregions differ. At the micro level, analyses suggest that for the last 16 years (or more) pirate attacks have clustered in space and time, indicating that patterns are not static but are also not random. Conclusions: Much like other types of crime, pirate attacks cluster in space, and following an attack at one location the risk of others at the same location or nearby is temporarily elevated. The identification of such regularities has implications for the understanding of maritime piracy and for predicting the future locations of attacks. PMID:25076796
Temporal Decomposition of a Distribution System Quasi-Static Time-Series Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mather, Barry A; Hunsberger, Randolph J
This paper documents the first phase of an investigation into reducing runtimes of complex OpenDSS models through parallelization. As the method seems promising, future work will quantify, and further mitigate, errors arising from this process. In this initial report, we demonstrate how, through the use of temporal decomposition, the run times of a complex distribution-system-level quasi-static time series simulation can be reduced roughly in proportion to the level of parallelization. Using this method, the monolithic model runtime of 51 hours was reduced to a minimum of about 90 minutes. As expected, this comes at the expense of control and voltage errors at the time-slice boundaries. All evaluations were performed using a real distribution circuit model with the addition of 50 PV systems, representing a mock complex PV impact study. We are able to reduce induced transition errors through the addition of controls initialization, though small errors persist. The time savings with parallelization are so significant that we feel additional investigation to reduce control errors is warranted.
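The decomposition idea is generic: split the simulation horizon into chunks that can run on separate workers, and prepend a warm-up window to each chunk so that carried state (standing in for controller settings) is re-initialized before the slice of interest. The sketch below uses a toy stateful simulator in place of OpenDSS, which is not reproduced here, to show why warm-up shrinks the time-slice boundary errors.

```python
import numpy as np

def simulate(load, state0=0.0, alpha=0.1):
    """Toy quasi-static simulation: each step's output depends on state
    carried from the previous step (a stand-in for control state)."""
    out = np.empty(len(load))
    s = state0
    for i, x in enumerate(load):
        s = (1 - alpha) * s + alpha * x
        out[i] = s
    return out

def simulate_decomposed(load, n_chunks, warmup):
    """Run chunks independently (each iteration could be a parallel
    worker), preceding each by a warm-up window whose output is then
    discarded."""
    n = len(load)
    bounds = np.linspace(0, n, n_chunks + 1, dtype=int)
    out = np.empty(n)
    for a, b in zip(bounds[:-1], bounds[1:]):
        w = max(a - warmup, 0)
        out[a:b] = simulate(load[w:b])[a - w:]
    return out

rng = np.random.default_rng(2)
load = 1.0 + 0.1 * rng.standard_normal(8760)       # one simulated year
exact = simulate(load)
err_cold = np.abs(simulate_decomposed(load, 8, warmup=0) - exact).max()
err_warm = np.abs(simulate_decomposed(load, 8, warmup=200) - exact).max()
print(err_warm < err_cold)   # warm-up shrinks the boundary error
```

The warm-up cost is fixed per chunk, so the speedup remains roughly proportional to the number of workers.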
There's alcohol in my soap: portrayal and effects of alcohol use in a popular television series.
van Hoof, Joris J; de Jong, Menno D T; Fennis, Bob M; Gosselt, Jordy F
2009-06-01
Two studies are reported addressing media influences on adolescents' alcohol-related attitudes and behaviours. A content analysis was conducted to investigate the prevalence of alcohol portrayal in a Dutch soap series. The coding scheme covered alcohol consumption per soap character, drinking situations and drinking times. Inter-coder reliability was satisfactory. The results showed that alcohol portrayal was prominent and that many instances of alcohol use reflected undesirable behaviours. To assess the influence of such alcohol cues on adolescents, a 2x2 experiment was conducted focusing on the separate and combined effects of alcohol portrayal in the soap series and surrounding alcohol commercials. Whereas the alcohol commercials had the expected effects on adolescents' attitudes, the alcohol-related soap content appeared to have only unexpected effects: adolescents who were exposed to the alcohol portrayal in the soap series had a less positive attitude towards alcohol and lower drinking intentions. Implications of these findings for health policy and future research are discussed.
NASA Astrophysics Data System (ADS)
Rahim, K. J.; Cumming, B. F.; Hallett, D. J.; Thomson, D. J.
2007-12-01
An accurate assessment of historical local Holocene data is important in making future climate predictions. Holocene climate is often obtained through proxy measures such as diatoms or pollen using radiocarbon dating. Wiggle Match Dating (WMD) uses an iterative least squares approach to tune a core with a large number of 14C dates to the 14C calibration curve. This poster presents a new method of tuning a time series when only a modest number of 14C dates are available. The method uses multitaper spectral estimation, and it specifically makes use of a multitaper spectral coherence tuning technique. Holocene climate reconstructions are often based on a simple depth-time fit such as a linear interpolation, splines, or low-order polynomials. Many of these models make use of only a small number of 14C dates, each of which is a point estimate with a significant variance. Our technique tunes the 14C dates to a reference series, such as tree rings, varves, or the radiocarbon calibration curve. The amount of 14C in the atmosphere is not constant, and a significant source of variance is solar activity. A decrease in solar activity coincides with an increase in cosmogenic isotope production, and an increase in cosmogenic isotope production coincides with a decrease in temperature. The method presented uses multitaper coherence estimates and adjusts the phase of the time series to line up significant line components with those of the reference series, in an attempt to obtain a better depth-time fit than the original model. Given recent concerns and demonstrations of the variation in estimated dates from radiocarbon labs, methods to confirm and tune the depth-time fit can aid climate reconstructions by improving, and serving to confirm, the accuracy of the underlying depth-time fit. Climate reconstructions can then be made on the improved depth-time fit. 
This poster presents a run-through of this process using Chauvin Lake in the Canadian prairies and Mt. Barr Cirque Lake in British Columbia as examples.
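The coherence step at the heart of this tuning can be sketched numerically. The following is a minimal illustration, not the authors' code: it uses sine tapers (a simple orthogonal multitaper family) in place of the DPSS tapers usually paired with multitaper estimation, and evaluates the DFT directly at a single frequency; all names are hypothetical.

```python
import cmath
import math

def sine_tapers(n, k):
    # Riedel-Sidorenko sine tapers: a simple orthogonal multitaper family
    return [[math.sqrt(2.0 / (n + 1)) * math.sin(math.pi * (j + 1) * (t + 1) / (n + 1))
             for t in range(n)] for j in range(k)]

def dft_at(x, f):
    # DFT coefficient at frequency f (in cycles per record length)
    n = len(x)
    return sum(x[t] * cmath.exp(-2j * math.pi * f * t / n) for t in range(n))

def mt_coherence(x, y, f, k=4):
    # magnitude-squared multitaper coherence at frequency f;
    # the phase of `cross` is what a tuning step would align
    n = len(x)
    cross, sxx, syy = 0j, 0.0, 0.0
    for w in sine_tapers(n, k):
        xk = dft_at([w[t] * x[t] for t in range(n)], f)
        yk = dft_at([w[t] * y[t] for t in range(n)], f)
        cross += xk * yk.conjugate()
        sxx += abs(xk) ** 2
        syy += abs(yk) ** 2
    return abs(cross) ** 2 / (sxx * syy)
```

A series compared against itself yields coherence 1 at a frequency where it has power; a tuning procedure would shift the proxy series until its significant line components align in phase with those of the reference series.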
Does the Rain fall in our heads?
NASA Astrophysics Data System (ADS)
Costa, M. E. G.; Rodrigues, M. A. S.
2012-04-01
In our school, science activities are developed in partnership with other school subjects. Interdisciplinary projects are valued from beginning to end, and it is common for teachers of different areas to work together on a science project. Research into articles written in English is very important, not only for the development of our students' scientific literacy but also as a way of widening their knowledge and exposing them to different perspectives, instead of being limited to articles in Portuguese. In this work, we study rainfall trends in our municipality (Góis, Portugal). Analysis of long-term rainfall time series is essential for evaluating the variability and tendency of the climate over secular time scales. This, in turn, leads to a better understanding of the regional climate, allowing a prognosis of the future climate, which is of great importance for managing natural and water resources and for planning human activities through scenarios and their impacts. This work consists of the analysis of long-term observed rainfall series for the municipality of Góis.
Neumeister, Veronique M; Anagnostou, Valsamo; Siddiqui, Summar; England, Allison Michal; Zarrella, Elizabeth R; Vassilakopoulou, Maria; Parisi, Fabio; Kluger, Yuval; Hicks, David G; Rimm, David L
2012-12-05
Companion diagnostic tests can depend on accurate measurement of protein expression in tissues. Preanalytic variables, especially cold ischemic time (the time from tissue removal to fixation in formalin), can affect the measurement and may cause false-negative results. We examined 23 proteins, including four commonly used breast cancer biomarker proteins, to quantify their sensitivity to cold ischemia in breast cancer tissues. A series of 93 breast cancer specimens with known time-to-fixation represented in a tissue microarray and a second series of 25 matched pairs of core needle biopsies and breast cancer resections were used to evaluate changes in antigenicity as a function of cold ischemic time. Estrogen receptor (ER), progesterone receptor (PgR), HER2, Ki67, and 19 other antigens were tested. Each antigen was measured using the AQUA method of quantitative immunofluorescence on at least one series. All statistical tests were two-sided. We found no evidence for loss of antigenicity with time-to-fixation for ER, PgR, HER2, or Ki67 in a 4-hour time window. However, with a bootstrapping analysis, we observed a trend toward loss for ER and PgR, a statistically significant loss of antigenicity for phosphorylated tyrosine (P = .0048), and trends toward loss for other proteins. There was evidence of increased antigenicity in acetylated lysine, AKAP13 (P = .009), and HIF1A (P = .046), proteins known to be expressed under conditions of hypoxia. The loss of antigenicity for phosphorylated tyrosine and the increases in expression of AKAP13 and HIF1A were confirmed in the biopsy/resection series. Key breast cancer biomarkers show no evidence of loss of antigenicity, although this dataset assesses only a relatively short time beyond the 1-hour limit in recent guidelines. Other proteins show changes in antigenicity in both directions. 
Future studies that extend the time range and normalize for heterogeneity will provide more comprehensive information on preanalytic variation due to cold ischemic time.
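The bootstrapping analysis referred to above can be sketched in outline. This is a hypothetical, minimal version, not the study's code: it resamples (cold-ischemic-time, AQUA-score) pairs with replacement and recomputes a least-squares slope to obtain a percentile confidence interval for the antigenicity trend.

```python
import random

def slope(pairs):
    # ordinary least-squares slope of y (AQUA score) on x (time-to-fixation)
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    num = sum((x - mx) * (y - my) for x, y in pairs)
    den = sum((x - mx) ** 2 for x, _ in pairs)
    return num / den

def bootstrap_slope_ci(pairs, n_boot=2000, alpha=0.05, seed=0):
    # percentile bootstrap: resample cases with replacement, refit the slope
    rng = random.Random(seed)
    slopes = sorted(slope([rng.choice(pairs) for _ in pairs])
                    for _ in range(n_boot))
    return slopes[int(alpha / 2 * n_boot)], slopes[int((1 - alpha / 2) * n_boot) - 1]
```

If the interval excludes zero, the marker shows a statistically detectable loss (or gain) of antigenicity over the sampled time window.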
NASA Astrophysics Data System (ADS)
Liang, Y.; Gallaher, D. W.; Grant, G.; Lv, Q.
2011-12-01
Change over time is the central driver of climate change detection. The goal is to diagnose the underlying causes and make projections into the future. In an effort to optimize this process we have developed the Data Rod model, an object-oriented approach that provides the ability to query grid cell changes and their relationships to neighboring grid cells through time. The time series data are organized in time-centric structures called "data rods." A single data rod can be pictured as the multi-spectral data history at one grid cell: a vertical column of data through time. This resolves the long-standing problem of managing time-series data and opens new possibilities for temporal data analysis. This structure enables rapid time-centric analysis at any grid cell across multiple sensors and satellite platforms. Collections of data rods can be spatially and temporally filtered, statistically analyzed, and aggregated for use with pattern matching algorithms. Likewise, individual image pixels can be extracted to generate multi-spectral imagery at any spatial and temporal location. The Data Rods project has created a series of prototype databases to store and analyze massive datasets containing multi-modality remote sensing data. Using object-oriented technology, this method overcomes the operational limitations of traditional relational databases. To demonstrate the speed and efficiency of time-centric analysis using the Data Rods model, we have developed a sea ice detection algorithm. This application determines the concentration of sea ice in a small spatial region across a long temporal window. If performed using traditional analytical techniques, this task would typically require extensive data downloads and spatial filtering. Using Data Rods databases, the exact spatio-temporal data set is immediately available. No extraneous data are downloaded, and all selected data querying occurs transparently on the server side. 
Moreover, fundamental statistical calculations such as running averages are easily implemented against the time-centric columns of data.
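The "data rod" idea, one time-ordered column of values per grid cell, can be sketched with a toy in-memory structure. The names here are hypothetical and the project's actual databases are far more elaborate; the sketch only shows how a time-centric column makes per-cell statistics such as running averages direct.

```python
from collections import defaultdict

class DataRods:
    """Toy time-centric store: one 'rod' (time-ordered value list) per grid cell."""

    def __init__(self):
        self._rods = defaultdict(list)          # (row, col) -> [(time, value), ...]

    def append(self, cell, t, value):
        self._rods[cell].append((t, value))

    def rod(self, cell):
        # the full multi-temporal history at one grid cell, in time order
        return [v for _, v in sorted(self._rods[cell])]

    def running_mean(self, cell, window):
        # fundamental statistics run directly down the time-centric column
        vals = self.rod(cell)
        return [sum(vals[i - window + 1:i + 1]) / window
                for i in range(window - 1, len(vals))]
```

Appending the values 1..5 at one cell and asking for a 2-step running mean returns [1.5, 2.5, 3.5, 4.5]; no data outside the queried cell is touched.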
Brothers, Allyson; Gabrian, Martina; Wahl, Hans-Werner; Diehl, Manfred
2016-01-01
This study examined how two distinct facets of perceived personal lifetime – future time perspective (FTP) and awareness of age-related change (AARC) – are associated with one another, and how they may interact to predict psychological well-being. To better understand associations among subjective perceptions of lifetime, aging and well-being, we tested a series of models to investigate questions of directionality, indirect effects, and conditional processes among FTP, AARC-Gains, AARC-Losses, and psychological well-being. In all models, we tested for differences between middle-aged and older adults, and between adults from the U.S. and Germany. Analyses were conducted within a structural equation modeling framework on a cross-national, 2.5-year longitudinal sample of 537 community-residing adults (age 40–98 years). Awareness of age-related losses (AARC-Losses) at Time 1 predicted FTP at Time 2, but FTP did not predict AARC-Gains or AARC-Losses. Furthermore, future time perspective mediated the association between AARC-Losses and well-being. Moderation analyses revealed a buffering effect of awareness of age-related gains (AARC-Gains) in which perceptions of more age-related gains diminished the negative effect of a limited future time perspective on well-being. Effects were robust across age groups and countries. Taken together, these findings suggest that perceived age-related loss experiences may sensitize individuals to perceive a more limited future lifetime which may then lead to lower psychological well-being. In contrast, perceived age-related gains may function as a resource to preserve psychological well-being, in particular when time is perceived as running out. PMID:27243764
[The case-case-time-control study design].
Wang, Jing; Zhuo, Lin; Zhan, Siyan
2014-12-01
Although 'self-matched case-only studies' (such as the case-crossover or self-controlled case-series method) can control time-invariant confounders (measured or unmeasured) by design, they cannot control confounders that vary with time. A bidirectional case-crossover design can be used to adjust for exposure-time trends. In pharmacoepidemiology, however, illness often influences the future use of medications, making a bidirectional design problematic. Suissa's case-time-control design combines the case-crossover and case-control designs and can adjust for exposure-trend bias, but the control group may reintroduce selection bias if the matching does not go well. We propose a "case-case-time-control" design, an extension of the case-time-control design. Rather than using a sample of external controls, we choose future cases as controls for current cases to counter the bias arising from temporal trends in exposure to the target of interest. At the end of this article we discuss the strengths and limitations of this design based on an applied example.
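The estimator behind this family of designs can be sketched numerically. In a case-crossover analysis, the odds ratio comes from subjects with discordant exposure between the current (hazard) and reference periods; the case-case-time-control estimate then divides the current cases' OR by the future cases' OR, so that a shared exposure-time trend cancels. A minimal, hypothetical illustration:

```python
def crossover_or(subjects):
    # subjects: (exposed_in_hazard_period, exposed_in_reference_period) per case
    b = sum(1 for h, r in subjects if h and not r)   # discordant: hazard only
    c = sum(1 for h, r in subjects if r and not h)   # discordant: reference only
    return b / c

def case_case_time_control_or(current_cases, future_cases):
    # future cases serve as controls for the exposure-time trend:
    # the trend inflates both numerator and denominator ORs and cancels
    return crossover_or(current_cases) / crossover_or(future_cases)
```

With 8:2 discordant current cases (crossover OR 4.0) and 2:2 discordant future cases (crossover OR 1.0, i.e. no trend), the trend-adjusted OR is 4.0.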
The Future of Engineering Education--Revisited
ERIC Educational Resources Information Center
Wankat, Phillip C.; Bullard, Lisa G.
2016-01-01
This paper revisits the landmark CEE series, "The Future of Engineering Education," published in 2000 (available free in the CEE archives on the internet) to examine the predictions made in the original paper as well as the tools and approaches documented. Most of the advice offered in the original series remains current. Despite new…
Gross, Markus; Magar, Vanesa
2016-01-01
In previous work, the authors demonstrated how data from climate simulations can be utilized to estimate regional wind power densities. In particular, it was shown that the quality of wind power densities estimated from the UPSCALE global dataset in offshore regions of Mexico compared well with regional high-resolution studies. Additionally, a link between surface temperature and moist air density in the estimates was presented. UPSCALE is an acronym for UK on PRACE (the Partnership for Advanced Computing in Europe)—weather-resolving Simulations of Climate for globAL Environmental risk. The UPSCALE experiment was performed in 2012 by NCAS (National Centre for Atmospheric Science)-Climate, at the University of Reading and the UK Met Office Hadley Centre. The study included a 25.6-year, five-member ensemble simulation of the HadGEM3 global atmosphere, at 25 km resolution for present climate conditions. The initial conditions for the ensemble runs were taken from consecutive days of a test configuration. In the present paper, the emphasis is placed on the single climate run for a potential future climate scenario in the UPSCALE experiment dataset, using the Representative Concentration Pathway (RCP) 8.5 climate change scenario. First, tests were performed to ensure that results using only one instantiation of the current climate dataset are as robust as possible within the constraints of the available data. To achieve this, an artificial time series over a longer sampling period was created. It was then shown that the longer time series provided almost the same results as the shorter ones, supporting the argument that the short time series is sufficient to capture the climate. Finally, with the confidence that one instantiation is sufficient, the future climate dataset was analysed to provide, for the first time, a projection of future changes in wind power resources using the UPSCALE dataset. 
It is hoped that this, in turn, will provide some guidance for wind power developers and policy makers to prepare and adapt for climate change impacts on wind energy production. Although offshore locations around Mexico were used as a case study, the dataset is global and hence the methodology presented can be readily applied at any desired location. PMID:27788208
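The quantities involved, moist air density and wind power density, follow from textbook formulas rather than anything specific to the UPSCALE runs. A minimal sketch (function names are illustrative):

```python
R_DRY, R_VAPOUR = 287.05, 461.5   # specific gas constants, J/(kg*K)

def moist_air_density(pressure_pa, temp_k, vapour_pressure_pa):
    # ideal-gas mixture of dry air and water vapour;
    # warmer and moister air is less dense, hence the temperature link
    p_dry = pressure_pa - vapour_pressure_pa
    return p_dry / (R_DRY * temp_k) + vapour_pressure_pa / (R_VAPOUR * temp_k)

def wind_power_density(speeds, rho):
    # W/m^2; note the mean of v^3, not the cube of the mean speed
    return 0.5 * rho * sum(v ** 3 for v in speeds) / len(speeds)
```

At standard conditions (101325 Pa, 288.15 K, dry air) this gives the familiar 1.225 kg/m^3, and a steady 10 m/s flow then carries about 612.5 W/m^2.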
Not my future? Core values and the neural representation of future events.
Brosch, Tobias; Stussi, Yoann; Desrichard, Olivier; Sander, David
2018-06-01
Individuals with pronounced self-transcendence values have been shown to put greater weight on the long-term consequences of their actions when making decisions. Using functional magnetic resonance imaging, we investigated the neural mechanisms underlying the evaluation of events occurring several decades in the future as well as the role of core values in these processes. Thirty-six participants viewed a series of events, consisting of potential consequences of climate change, which could occur in the near future (around 2030), and thus would be experienced by the participants themselves, or in the far future (around 2080). We observed increased activation in anterior VMPFC (BA11), a region involved in encoding the personal significance of future events, when participants were envisioning far future events, demonstrating for the first time that the role of the VMPFC in future projection extends to the time scale of decades. Importantly, this activation increase was observed only in participants with pronounced self-transcendence values measured by self-report questionnaire, as shown by a statistically significant interaction of temporal distance and value structure. These findings suggest that future projection mechanisms are modulated by self-transcendence values to allow for a more extensive simulation of far future events. Consistent with this, these participants reported similar concern ratings for near and far future events, whereas participants with pronounced self-enhancement values were more concerned about near future events. Our findings provide a neural substrate for the tendency of individuals with pronounced self-transcendence values to consider the long-term consequences of their actions.
First Generation Least Expensive Approach to Fission (FiGLEAF) Testing Results
NASA Technical Reports Server (NTRS)
VanDyke, Melissa; Houts, Mike; Pedersen, Kevin; Godfroy, Tom; Dickens, Ricky; Poston, David; Reid, Bob; Salvail, Pat; Ring, Peter; Schmidt, George R. (Technical Monitor)
2000-01-01
Successful development of space fission systems will require an extensive program of affordable and realistic testing. In addition to tests related to design/development of the fission system, realistic testing of the actual flight unit must also be performed. Testing can be divided into two categories: non-nuclear tests and nuclear tests. Full power nuclear tests of space fission systems are expensive, time consuming, and of limited use, even in the best of programmatic environments. If the system is designed to operate within established radiation damage and fuel burnup limits while simultaneously being designed to allow close simulation of heat from fission using resistance heaters, high confidence in fission system performance and lifetime can be attained through a series of non-nuclear tests. Non-nuclear tests are affordable and timely, and the cause of component and system failures can be quickly and accurately identified. MSFC is leading a Safe Affordable Fission Engine (SAFE) test series whose ultimate goal is the demonstration of a 300 kW flight configuration system using non-nuclear testing. This test series is carried out in collaboration with other NASA centers, other government agencies, industry, and universities. The paper describes the SAFE test series, including test article descriptions, test results and conclusions, and future test plans.
Tong, Feifei; Lian, Yan; Zhou, Huang; Shi, Xiaohong; He, Fengjiao
2014-10-21
A new multichannel series piezoelectric quartz crystal (MSPQC) cell sensor for real-time monitoring of living cells in vitro is reported in this paper. The constructed sensor was used successfully to monitor adhesion, spreading, proliferation, and apoptosis of MG63 osteosarcoma cells and to investigate the effects of different concentrations of cobalt chloride on MG63 cells. Quantitative, real-time, dynamic cell analysis was conducted using the MSPQC cell sensor. Compared with methods such as fluorescence staining and morphological observation by microscopy, the MSPQC cell sensor is noninvasive, label free, simple, cheap, and capable of online monitoring. It can automatically record the growth status of cells and quantitatively evaluate cell proliferation and the apoptotic response to drugs. It will be a valuable detection and analysis tool for the acquisition of cellular-level information and is anticipated to have applications in cell biology research and cytotoxicity testing in the future.
Memory effects in stock price dynamics: evidences of technical trading
Garzarelli, Federico; Cristelli, Matthieu; Pompa, Gabriele; Zaccaria, Andrea; Pietronero, Luciano
2014-01-01
Technical trading represents a class of investment strategies for financial markets based on the analysis of trends and recurrent patterns in price time series. According to standard economic theories, these strategies should not be used because they cannot be profitable. On the contrary, it is well known that technical traders exist and operate on different time scales. In this paper we investigate whether technical trading produces detectable signals in price time series and whether memory effects are introduced into the price dynamics. In particular, we focus on a specific class of patterns known as supports and resistances. We first develop a criterion to detect the potential values of supports and resistances. Then we show that memory effects in the price dynamics are associated with these selected values: prices are more likely to bounce off these values than to cross them. Such an effect is quantitative evidence of the so-called self-fulfilling prophecy, that is, the self-reinforcement of agents' beliefs and sentiment about future stock price behavior. PMID:24671011
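The bounce-versus-cross comparison at a candidate level can be sketched as follows. This is a hypothetical simplification of the paper's criterion: a tolerance band around the level, counting whether the price exits the band on the side it entered (a bounce) or on the opposite side (a cross).

```python
def bounce_stats(prices, level, eps):
    # classify each visit to the band [level - eps, level + eps]:
    # exit on the entry side -> bounce, exit on the far side -> cross
    bounces = crosses = 0
    side, inside = None, False
    for p in prices:
        if not inside:
            if p > level + eps:
                side = "above"
            elif p < level - eps:
                side = "below"
            else:
                inside = True
        else:
            if p > level + eps:
                bounces += side == "above"
                crosses += side == "below"
                side, inside = "above", False
            elif p < level - eps:
                bounces += side == "below"
                crosses += side == "above"
                side, inside = "below", False
    return bounces, crosses
```

A level at which bounces significantly outnumber crossings, relative to a null model with no memory, is evidence of the effect described above.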
Bao, Wei; Rao, Yulei
2017-01-01
The application of deep learning approaches to finance has received a great deal of attention from both investors and researchers. This study presents a novel deep learning framework in which wavelet transforms (WT), stacked autoencoders (SAEs) and long short-term memory (LSTM) are combined for stock price forecasting. SAEs for hierarchically extracting deep features are introduced into stock price forecasting for the first time. The deep learning framework comprises three stages. First, the stock price time series is decomposed by WT to eliminate noise. Second, SAEs are applied to generate deep high-level features for predicting the stock price. Third, the high-level denoised features are fed into LSTM to forecast the next day's closing price. Six market indices and their corresponding index futures are chosen to examine the performance of the proposed model. Results show that the proposed model outperforms similar models in both predictive accuracy and profitability. PMID:28708865
Meyer, Swen; Blaschek, Michael; Duttmann, Rainer; Ludwig, Ralf
2016-02-01
According to current climate projections, Mediterranean countries are at high risk for an even more pronounced susceptibility to changes in the hydrological budget and extremes. These changes are expected to have severe direct impacts on the management of water resources, agricultural productivity and drinking water supply. Current projections of future hydrological change, based on regional climate model results and subsequent hydrological modeling schemes, are very uncertain and poorly validated. The Rio Mannu di San Sperate Basin, located in Sardinia, Italy, is one test site of the CLIMB project. The Water Simulation Model (WaSiM) was set up to model current and future hydrological conditions. The availability of measured meteorological and hydrological data is poor, as is common for many Mediterranean catchments. In this study we conducted a soil sampling campaign in the Rio Mannu catchment. We tested different deterministic and hybrid geostatistical interpolation methods on soil textures and evaluated the performance of the applied models. We calculated a new soil texture map based on the best prediction method. The soil model in WaSiM was set up with the improved new soil information. The simulation results were compared to a standard soil parametrization. WaSiM was validated with spatial evapotranspiration rates using the triangle method (Jiang and Islam, 1999). WaSiM was driven with the meteorological forcing taken from 4 different ENSEMBLES climate projections for a reference (1971-2000) and a future (2041-2070) time series. The climate change impact was assessed based on differences between the reference and future time series. The simulation results show a reduction of all hydrological quantities in the future spring season. Furthermore, the simulation results reveal an earlier onset of dry conditions in the catchment. 
We show that a solid soil model setup based on short-term field measurements can improve long-term modeling results, which is especially important in ungauged catchments. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Koslow, J. A.; Brodeur, R.; Duffy-Anderson, J. T.; Perry, I.; jimenez Rosenberg, S.; Aceves, G.
2016-02-01
Ichthyoplankton time series available from the Bering Sea, Gulf of Alaska and California Current (Oregon to Baja California) provide a potential ocean observing network to assess climate impacts on fish communities along the west coast of North America. Larval fish abundance reflects spawning stock biomass, so these data sets provide indicators of the status of a broad range of exploited and unexploited fish populations. Analyses to date have focused on individual time series, which generally exhibit significant change in relation to climate. Off California, a suite of 24 midwater fish taxa have declined > 60%, correlated with declining midwater oxygen concentrations, and overall larval fish abundance has declined 72% since 1969, a trend based on the decline of predominantly cool-water affinity taxa in response to warming ocean temperatures. Off Oregon, there were dramatic differences in community structure and abundance of larval fishes between warm and cool ocean conditions. Midwater deoxygenation and warming sea surface temperature trends are predicted to continue as a result of global climate change. US, Canadian, and Mexican fishery scientists are now collaborating in a virtual ocean observing network to synthesize available ichthyoplankton time series and compare patterns of change in relation to climate. This will provide regional indicators of populations and groups of taxa sensitive to warming, deoxygenation and potentially other stressors, establish the relevant scales of coherence among sub-regions and across Large Marine Ecosystems, and provide the basis for predicting future climate change impacts on these ecosystems.
Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion
NASA Astrophysics Data System (ADS)
Li, Z.; Ghaith, M.
2017-12-01
Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Subsequently, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov chain Monte Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy for complicated hydrological models and can provide probabilistic forecasting in a more computationally efficient manner than the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
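The core mechanics, representing an uncertain model output as a Hermite polynomial in a standard normal variable and fitting the coefficients at collocation points, can be sketched as follows. This is a hypothetical, minimal univariate version; the study's PCEs are multivariate and wrap a full hydrological model.

```python
import math

def hermite(n, z):
    # probabilists' Hermite polynomials: He_{k+1}(z) = z*He_k(z) - k*He_{k-1}(z)
    h_prev, h = 1.0, z
    if n == 0:
        return h_prev
    for k in range(1, n):
        h_prev, h = h, z * h - k * h_prev
    return h

def solve(A, b):
    # Gauss-Jordan elimination with partial pivoting (tiny systems only)
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(n):
            if r != i and M[r][i] != 0.0:
                f = M[r][i] / M[i][i]
                M[r] = [a - f * c for a, c in zip(M[r], M[i])]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit_pce(points, outputs, order):
    # probabilistic collocation: match the expansion to the model output
    # at order+1 collocation points (roots of the next Hermite polynomial)
    A = [[hermite(k, z) for k in range(order + 1)] for z in points]
    return solve(A, outputs)

def pce_mean_var(coeffs):
    # orthogonality of He_k under the standard normal gives closed-form moments
    mean = coeffs[0]
    var = sum(c * c * math.factorial(k) for k, c in enumerate(coeffs) if k > 0)
    return mean, var
```

For the toy model y = z^2 with collocation points {-sqrt(3), 0, sqrt(3)} (the roots of He_3), the fit recovers y = He_0(z) + He_2(z), i.e. mean 1 and variance 2, matching E[z^2] = 1 and Var[z^2] = 2 for standard normal z.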
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, Laura K.; Vogel, Richard M.
2016-04-01
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
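For the generalized Pareto exceedance model the hazard function has a simple closed form: with survival function S(x) = (1 + xi*x/sigma)^(-1/xi) and density f(x) = (1/sigma)(1 + xi*x/sigma)^(-1/xi - 1), the ratio h(x) = f(x)/S(x) collapses to 1/(sigma + xi*x). The sketch below is a numerical check consistent with that derivation, not a reproduction of the paper's code.

```python
import math

def gpd_survival(x, sigma, xi):
    # exceedance probability P(X > x) for the generalized Pareto model
    if xi == 0.0:
        return math.exp(-x / sigma)           # exponential limit
    return (1.0 + xi * x / sigma) ** (-1.0 / xi)

def gpd_hazard(x, sigma, xi):
    # h(x) = f(x) / S(x) = 1 / (sigma + xi * x):
    # constant when xi = 0, decreasing in x when xi > 0 (heavy tail)
    return 1.0 / (sigma + xi * x)
```

For xi = 0 the hazard is flat (the memoryless exponential tail); nonstationarity enters when sigma or xi drift with time, which is what changes the average return periods and reliabilities discussed above.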
Burned area detection based on Landsat time series in savannas of southern Burkina Faso
NASA Astrophysics Data System (ADS)
Liu, Jinxiu; Heiskanen, Janne; Maeda, Eduardo Eiji; Pellikka, Petri K. E.
2018-02-01
West African savannas are subject to regular fires, which have impacts on vegetation structure, biodiversity and carbon balance. Efficient and accurate mapping of the burned area associated with seasonal fires can greatly benefit decision making in land management. Since coarse-resolution burned area products cannot meet the accuracy needed for fire management and climate modelling at local scales, medium-resolution Landsat data are a promising alternative for local-scale studies. In this study, we developed an algorithm for continuous monitoring of annual burned areas using Landsat time series. The algorithm is based on burned pixel detection using harmonic model fitting with Landsat time series and breakpoint identification in the time series data. This approach was tested in a savanna area in southern Burkina Faso using 281 images acquired between October 2000 and April 2016. An overall accuracy of 79.2% was obtained, with balanced omission and commission errors. This represents a significant improvement over the MODIS burned area product (67.6%), which had more omission errors than commission errors, indicating underestimation of the total burned area. By observing the spatial distribution of burned areas, we found that the Landsat-based method misclassified cropland and cloud shadows as burned areas due to their similar spectral response, and the MODIS burned area product omitted small and fragmented burned areas. The proposed algorithm is flexible and robust against decreased data availability caused by clouds and Landsat 7 missing lines, and therefore has high potential for application to other landscapes in future studies.
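A minimal version of the harmonic-fit-plus-breakpoint idea can be sketched on synthetic data. The series, thresholds and training window below are assumptions for illustration, not the paper's algorithm settings.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic vegetation-index series: one annual harmonic plus noise, with an
# abrupt drop (a simulated burn) at t = 4.0 years; ~23 observations per year.
t = np.arange(138) / 23.0
y = (0.5 + 0.2*np.cos(2*np.pi*t) + 0.1*np.sin(2*np.pi*t)
     + 0.02*rng.standard_normal(t.size))
y[t >= 4.0] -= 0.3                        # spectral drop after the fire

# Fit a first-order harmonic model to an undisturbed training period.
train = t < 3.0
A = np.column_stack([np.ones(train.sum()),
                     np.cos(2*np.pi*t[train]), np.sin(2*np.pi*t[train])])
coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)

def predict(tt):
    return coef[0] + coef[1]*np.cos(2*np.pi*tt) + coef[2]*np.sin(2*np.pi*tt)

# Flag a breakpoint where residuals exceed 5 sigma of the training residuals.
sigma_r = np.std(y[train] - predict(t[train]))
flagged = np.abs(y - predict(t)) > 5.0 * sigma_r
breakpoint_time = t[flagged][0]           # first flagged observation
```

On this toy series the first flagged observation coincides with the simulated burn date; a real detector also has to separate burns from clouds, shadows and cropland, which is where the misclassifications discussed above arise.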
Arismendi, Ivan; Johnson, Sherri; Dunham, Jason B.; Haggerty, Roy; Hockman-Wert, David
2012-01-01
Temperature is a fundamentally important driver of ecosystem processes in streams. Recent warming of terrestrial climates around the globe has motivated concern about consequent increases in stream temperature. More specifically, observed trends of increasing air temperature and declining stream flow are widely believed to result in corresponding increases in stream temperature. Here, we examined the evidence for this using long-term stream temperature data from minimally and highly human-impacted sites located across the Pacific continental United States. Based on hypothesized climate impacts, we predicted that we should find warming trends in the maximum, mean and minimum temperatures, as well as increasing variability over time. These predictions were not fully realized. Warming trends were most prevalent in a small subset of locations with longer time series beginning in the 1950s. More recent series of observations (1987-2009) exhibited fewer warming trends and more cooling trends in both minimally and highly human-influenced systems. Trends in variability were much less evident, regardless of the length of time series. Based on these findings, we conclude that our perspective of climate impacts on stream temperatures is clouded considerably by a lack of long-term data on minimally impacted streams, and biased spatio-temporal representation of existing time series. Overall, our results highlight the need to develop more mechanistic, process-based understanding of linkages between climate change, other human impacts and stream temperature, and to deploy sensor networks that will provide better information on trends in stream temperatures in the future.
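The abstract does not name the trend test used; the Mann-Kendall test is the standard nonparametric choice for detecting monotonic warming or cooling trends in such records, and can be sketched on synthetic data as follows.

```python
import numpy as np

# Mann-Kendall trend test (no-ties variance approximation): S sums the signs
# of all pairwise differences; Z is approximately N(0,1) under the null of
# no monotonic trend.
def mann_kendall_z(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    diffs = np.sign(x[None, :] - x[:, None])   # element (i, j) = sign(x_j - x_i)
    s = diffs[np.triu_indices(n, k=1)].sum()   # pairs with i < j
    var_s = n*(n - 1)*(2*n + 5)/18.0
    if s > 0:
        return (s - 1)/np.sqrt(var_s)
    if s < 0:
        return (s + 1)/np.sqrt(var_s)
    return 0.0

rng = np.random.default_rng(8)
# Synthetic 23-year record (cf. the 1987-2009 window) with a strong trend.
warming = 0.08*np.arange(23) + 0.3*rng.standard_normal(23)
z = mann_kendall_z(warming)        # |z| > 1.96 indicates a trend at the 5% level
```

For real stream series, serial correlation inflates the test's false-positive rate, so pre-whitening or a modified variance is usually applied before interpreting z.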
NASA Technical Reports Server (NTRS)
Prochazka, Ivan; Kodat, Jan; Blazej, Josef; Sun, Xiaoli (Editor)
2015-01-01
We report on the design, construction and performance of photon-counting detector packages based on silicon avalanche photodiodes. These photon-counting devices have been optimized for extremely high stability of their detection delay. The detectors have been designed for future applications in fundamental metrology and optical time transfer in space. The detectors have been qualified for operation in space missions. The exceptional radiation tolerance of the detection chip itself and of all critical components of a detector package has been verified in a series of experiments.
A recurrence-weighted prediction algorithm for musical analysis
NASA Astrophysics Data System (ADS)
Colucci, Renato; Leguizamon Cucunuba, Juan Sebastián; Lloyd, Simon
2018-03-01
Forecasting the future behaviour of a system using past data is an important topic. In this article we apply nonlinear time series analysis in the context of music, and present new algorithms for extending a sample of music while maintaining characteristics similar to the original piece. By using ideas from ergodic theory, we adapt the classical prediction method of Lorenz analogues to take into account recurrence times, and demonstrate with examples how the new algorithm can produce predictions with a high degree of similarity to the original sample.
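The classical Lorenz method of analogues that the algorithm builds on can be sketched as follows. This is the unweighted base method on a toy periodic series; the recurrence-time weighting that the paper adds is omitted.

```python
import numpy as np

# Method of analogues: embed the series into delay vectors of length m, find
# the past window closest to the current one, and predict its successor.
def predict_analogue(series, m=3):
    x = np.asarray(series, dtype=float)
    current = x[-m:]
    best_i, best_d = 0, np.inf
    for i in range(len(x) - m):           # all strictly earlier windows
        d = np.linalg.norm(x[i:i + m] - current)
        if d < best_d:
            best_i, best_d = i, d
    return x[best_i + m]                  # successor of the closest analogue

# On a perfectly periodic series the analogue prediction is numerically exact.
t = np.arange(200)
periodic = np.sin(2*np.pi*t/25)
pred = predict_analogue(periodic, m=3)
actual_next = np.sin(2*np.pi*200/25)
```

For chaotic rather than periodic data, several near analogues are typically averaged, and the paper's contribution is to weight them using recurrence times from ergodic theory.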
NASA Astrophysics Data System (ADS)
Boudhina, Nissaf; Zitouna-Chebbi, Rim; Mekki, Insaf; Jacob, Frédéric; Ben Mechlia, Nétij; Masmoudi, Moncef; Prévot, Laurent
2018-06-01
Estimating evapotranspiration in hilly watersheds is paramount for managing water resources, especially in semiarid/subhumid regions. The eddy covariance (EC) technique allows continuous measurements of latent heat flux (LE). However, time series of EC measurements often experience large portions of missing data because of instrumental malfunctions or quality filtering. Existing gap-filling methods are questionable over hilly crop fields because of changes in airflow inclination and subsequent aerodynamic properties. We evaluated the performances of different gap-filling methods before and after tailoring to conditions of hilly crop fields. The tailoring consisted of splitting the LE time series beforehand on the basis of upslope and downslope winds. The experiment was set up in an agricultural hilly watershed in northeastern Tunisia. EC measurements were collected throughout the growth cycle of three wheat crops, two of them located in adjacent fields on opposite hillslopes, and the third one located in a flat field. We considered four gap-filling methods: the REddyProc method, the linear regression between LE and net radiation (Rn), the multi-linear regression of LE against the other energy fluxes, and the use of evaporative fraction (EF). Regardless of the method, the splitting of the LE time series did not impact the gap-filling rate, and it might improve the accuracy of LE retrievals in some cases. Regardless of the method, the obtained accuracies of LE estimates after gap filling were close to instrumental accuracies, and they were comparable to those reported in previous studies over flat and mountainous terrains. Overall, REddyProc was the most appropriate method, for both gap-filling rate and retrieval accuracy. Thus, it seems possible to conduct gap filling for LE time series collected over hilly crop fields, provided the LE time series are split beforehand on the basis of upslope-downslope winds.
Future works should address consecutive vegetation growth cycles for a larger panel of conditions in terms of climate, vegetation, and water status.
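A minimal sketch of the regression gap-filling idea with the upslope/downslope split is given below. The coefficients, noise level and gap pattern are invented for illustration; they do not come from the study.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical flux records: LE responds to net radiation Rn with a different
# slope for upslope vs downslope winds (coefficients are made up).
n = 500
Rn = rng.uniform(50, 600, n)                       # net radiation, W m^-2
upslope = rng.random(n) < 0.5                      # wind-direction class
LE = np.where(upslope, 0.55*Rn + 10, 0.35*Rn + 5) + 5*rng.standard_normal(n)

gaps = rng.random(n) < 0.2                         # ~20% missing records
LE_obs = LE.copy()
LE_obs[gaps] = np.nan

# Fit LE ~ Rn separately per wind class on valid data, then fill each gap
# with the regression of the matching class.
filled = LE_obs.copy()
for cls in (True, False):
    sel = (upslope == cls) & ~gaps
    a, b = np.polyfit(Rn[sel], LE_obs[sel], 1)
    fill = (upslope == cls) & gaps
    filled[fill] = a*Rn[fill] + b

rmse = np.sqrt(np.mean((filled[gaps] - LE[gaps])**2))
```

Fitting a single pooled regression instead would mix the two aerodynamic regimes and bias the filled values, which is the failure mode the upslope/downslope split is designed to avoid.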
Scharnweber, Tobias; Hevia, Andrea; Buras, Allan; van der Maaten, Ernst; Wilmking, Martin
2016-10-01
Element composition of annually resolved tree-rings constitutes a promising biological proxy for reconstructions of environmental conditions and pollution history. However, several methodological and physiological issues have to be addressed before sound conclusions can be drawn from dendrochemical time series. For example, radial and vertical translocation processes of elements in the wood might blur or obscure any dendrochemical signal. In this study, we tested the degree of synchronism of elemental time series within and between trees of one coniferous (Pinus sylvestris L.) and one broadleaf (Castanea sativa Mill.) species growing in conventionally managed forests without direct pollution sources in their surroundings. Micro X-ray fluorescence (μXRF) analysis was used to establish time series of relative concentrations of multiple elements (Mg, Al, P, Cl, K, Ca, Cr, Mn, Fe and Ni) for different stem heights and stem exposures. We found a common long-term (decadal) trend for most elements in both species, but only little coherence in the high-frequency domain (inter-annual variations). Aligning the element curves by cambial age instead of year of ring formation reduced the standard deviations between the single measurements. This points to an influence of age on longer-term trends and would require detrending in order to extract any environmental signal from dendrochemical time series. The common signal was stronger for pine than for chestnut. In pine, many elements show a concentration gradient, with higher values towards the tree crown. Mobility of elements in the stem, leading to high within- and between-tree variability, as well as a potential age trend, apparently complicate the establishment of reliable dendrochemical chronologies. For future wood-chemical studies, we recommend working with element ratios instead of single-element time series, considering potential age trends, and analyzing more than one sample per tree to account for internal variability.
Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Savani, N. P.; Vourlidas, A.; Pulkkinen, A.; Nieves-Chinchilla, T.; Lavraud, B.; Owens, M. J.
2013-01-01
We investigate a coronal mass ejection (CME) propagating toward Earth on 29 March 2011. This event is specifically chosen for its predominantly northward directed magnetic field, so that the influence of the momentum flux onto Earth can be isolated. We focus our study on understanding how a small Earth-directed segment propagates. Mass images are created from the white-light cameras onboard STEREO, which are also converted into mass height-time maps (mass J-maps). The mass tracks on these J-maps correspond to the sheath region between the CME and its associated shock front as detected by in situ measurements at L1. A time series of mass measurements from the STEREO COR-2A instrument is made along the Earth propagation direction. Qualitatively, this mass time series shows a remarkable resemblance to the L1 in situ density series. The in situ measurements are used as inputs into a three-dimensional (3-D) magnetospheric space weather simulation from the Community Coordinated Modeling Center. These simulations display a sudden compression of the magnetosphere from the large momentum flux at the leading edge of the CME, and predictions are made for the time derivative of the magnetic field (dB/dt) on the ground. The predicted dB/dt values were then compared with observations from specific equatorially located ground stations and showed notable similarity. This study of the momentum of a CME from the Sun down to its influence on magnetic ground stations on Earth is presented as a preliminary proof of concept, such that future attempts may try to use remote sensing to create density and velocity time series as inputs to magnetospheric simulations.
Ip, Ryan H L; Li, W K; Leung, Kenneth M Y
2013-09-15
Large-scale environmental remediation projects applied to sea water always involve large amounts of capital investment. Rigorous effectiveness evaluations of such projects are, therefore, necessary and essential for policy review and future planning. This study aims at investigating the effectiveness of environmental remediation using three different Seemingly Unrelated Regression (SUR) time series models with intervention effects: Model (1) assuming no correlation within or across variables, Model (2) assuming no correlation across variables but allowing correlations within a variable across different sites, and Model (3) allowing all possible correlations among variables (i.e., an unrestricted model). The results suggested that the unrestricted SUR model is the most reliable one, consistently having the smallest variations of the estimated model parameters. We discussed our results with reference to marine water quality management in Hong Kong while bringing managerial issues into consideration. Copyright © 2013 Elsevier Ltd. All rights reserved.
Kellett, Stephen; Simmonds-Buckley, Mel; Totterdell, Peter
2017-08-18
The evidence base for treatment of hypersexuality disorder (HD) has few studies with appropriate methodological rigor. This study therefore conducted a single case experiment of cognitive analytic therapy (CAT) for HD using an A/B design with extended follow-up. Cruising, pornography usage, masturbation frequency and associated cognitions and emotions were measured daily in a 231-day time series. Following a three-week assessment baseline (A: 21 days), treatment was delivered via outpatient sessions (B: 147 days), with the follow-up period lasting 63 days. Results show that cruising and pornography usage extinguished. The total sexual outlet score no longer met caseness, and the primary nomothetic hypersexuality outcome measure met recovery criteria. Reduced pornography consumption was mediated by reduced obsessionality and greater interpersonal connectivity. The utility of the CAT model for intimacy problems shows promise. Directions for future HD outcome research are also provided.
Moody, George B; Mark, Roger G; Goldberger, Ary L
2011-01-01
PhysioNet provides free web access to over 50 collections of recorded physiologic signals and time series, and related open-source software, in support of basic, clinical, and applied research in medicine, physiology, public health, biomedical engineering and computing, and medical instrument design and evaluation. Its three components (PhysioBank, the archive of signals; PhysioToolkit, the software library; and PhysioNetWorks, the virtual laboratory for collaborative development of future PhysioBank data collections and PhysioToolkit software components) connect researchers and students who need physiologic signals and relevant software with researchers who have data and software to share. PhysioNet's annual open engineering challenges stimulate rapid progress on unsolved or poorly solved questions of basic or clinical interest, by focusing attention on achievable solutions that can be evaluated and compared objectively using freely available reference data.
Quantitative evaluation of pregnant women delivery status' records in Akure, Nigeria.
Adejumo, Adebowale O; Suleiman, Esivue A; Okagbue, Hilary I; Oguntunde, Pelumi E; Odetunmibi, Oluwole A
2018-02-01
In this data article, monthly records (datasets) of total deliveries, normal deliveries, deliveries through Caesarean section and numbers of stillbirths from pregnant women in Akure, the capital city of Ondo State, Nigeria, over a period of ten years, between January 2007 and December 2016, were considered. Correlational and time series analyses were conducted on the monthly records of total delivery, normal delivery (vaginal delivery), delivery through Caesarean section, and number of stillbirths, in order to observe the patterns each of these indicators follows and to recommend an appropriate model for forecasting their future values. The data were obtained in raw form from the State Specialist Hospital (SSH), Akure, Ondo State, Nigeria. The description of and variation in each of these indicators (total delivery, normal delivery, Caesarean section, and stillbirths) were considered separately using descriptive statistics and box plots. Different models were also proposed for each of these indicators using time series models.
The slippery slope: how small ethical transgressions pave the way for larger future transgressions.
Welsh, David T; Ordóñez, Lisa D; Snyder, Deirdre G; Christian, Michael S
2015-01-01
Many recent corporate scandals have been described as resulting from a slippery slope in which a series of small infractions gradually increased over time (e.g., McLean & Elkind, 2003). However, behavioral ethics research has rarely considered how unethical behavior unfolds over time. In this study, we draw on theories of self-regulation to examine whether individuals engage in a slippery slope of increasingly unethical behavior. First, we extend Bandura's (1991, 1999) social-cognitive theory by demonstrating how the mechanism of moral disengagement can reduce ethicality over a series of gradually increasing indiscretions. Second, we draw from recent research connecting regulatory focus theory and behavioral ethics (Gino & Margolis, 2011) to demonstrate that inducing a prevention focus moderates this mediated relationship by reducing one's propensity to slide down the slippery slope. We find support for the developed model across 4 multiround studies. (c) 2015 APA, all rights reserved.
Industry contributions to aggregate workplace injury and illness rate trends: 1992-2008.
Ruser, John W
2014-10-01
Aggregate workplace injury and illness rates have generally declined over the past quarter century. Assessing which industries contributed to these declines is hampered by industry coding changes that broke time series data. Ratios were estimated to convert older incidence rate data to current industry codes and to create long industry time series from data of the BLS Survey of Occupational Injuries and Illnesses. These data were used to assess contributions to aggregate trends from within-industry incidence rate trends and across-industry hours shifts. Hours shifts toward safer industries do not explain aggregate incidence rate declines. Rather, the declines resulted from within-industry declines. The top 20 contributors out of 307 industries account for 40 percent of the decline and include both goods-producing and service-providing industries. These data help focus future research on industries responsible for rate declines and factors hypothesized as contributing to the declines. © Published 2014 by Wiley Periodicals, Inc.
The Current Status and Future of GNSS-Meteorology in Europe
NASA Astrophysics Data System (ADS)
Jones, J.; Guerova, G.; Dousa, J.; Dick, G.; de Haan, S.; Pottiaux, E.; Bock, O.; Pacione, R.
2017-12-01
GNSS is a well-established atmospheric observing system which can accurately sense water vapour, the most abundant greenhouse gas, accounting for 60-70% of atmospheric warming. Water vapour observations are currently under-sampled in operational meteorology, and obtaining and exploiting additional high-quality humidity observations is essential to improve severe weather forecasting and climate monitoring. Inconsistencies introduced into long-term time series by improved GNSS processing algorithms make climate trend analysis challenging. Ongoing re-processing efforts using state-of-the-art models are underway which will provide consistent time series of tropospheric data, using 15+ years of GNSS observations from over 600 stations worldwide. These datasets will enable validation of systematic biases from a range of instrumentation, improve knowledge of climatic trends in atmospheric water vapour, and will potentially be of great benefit to global and regional NWP reanalyses and climate model simulations (e.g., IPCC AR5). COST Action ES1206 is a 4-year project, running from 2013 to 2017, which has coordinated new and improved capabilities from concurrent developments in the GNSS, meteorological and climate communities. For the first time, the synergy of multi-GNSS constellations has been used to develop new, more advanced tropospheric products, exploiting the full potential of multi-GNSS on a wide range of temporal and spatial scales - from real-time products for monitoring and forecasting severe weather, to the highest quality post-processed products suitable for climate research. The Action has also promoted the use of meteorological data as an input to real-time GNSS positioning, navigation, and timing services and has stimulated knowledge and data transfer throughout Europe and beyond.
This presentation will give an overview of COST Action ES1206 plus an overview of ground-based GNSS-meteorology in Europe in general, including current status and future opportunities.
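The core conversion behind ground-based GNSS meteorology, from zenith wet delay (ZWD) to precipitable water vapour (PWV), can be sketched with the standard Bevis et al. (1992) refractivity constants. This is a simplification: operational products model the weighted mean temperature Tm and the hydrostatic delay much more carefully.

```python
RHO_W = 1000.0    # density of liquid water, kg m^-3
R_V = 461.5       # specific gas constant of water vapour, J kg^-1 K^-1
K2P = 0.221       # k2' = 22.1 K hPa^-1, expressed in K Pa^-1 (Bevis et al. 1992)
K3 = 3.739e3      # k3 = 3.739e5 K^2 hPa^-1, expressed in K^2 Pa^-1

def pwv_from_zwd(zwd_m, tm_kelvin):
    """Convert zenith wet delay (m) to precipitable water vapour (m)."""
    # Dimensionless conversion factor Pi, roughly 0.15 near Tm = 275 K.
    pi_factor = 1.0e6 / (RHO_W * R_V * (K3 / tm_kelvin + K2P))
    return pi_factor * zwd_m

# A 150 mm zenith wet delay at a mean temperature of 275 K gives ~23 mm of PWV.
pwv_mm = 1000.0 * pwv_from_zwd(0.150, 275.0)
```

Because Pi varies only weakly with Tm, errors in Tm of a few kelvin change PWV by well under a millimetre, which is why surface-based Tm models are usually adequate.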
Memory, mental time travel and The Moustachio Quartet
Clayton, Nicola; Wilkins, Clive
2017-01-01
Mental time travel allows us to revisit our memories and imagine future scenarios, and this is why memories are not only about the past but are also prospective. These episodic memories are not a fixed store of what happened, however; they are reassessed each time they are revisited and depend on the sequence in which events unfold. In this paper, we shall explore the complex relationships between memory and human experience, including through a series of novels, 'The Moustachio Quartet', that can be read in any order. To do so, we shall integrate evidence from science and the arts to explore the subjective nature of memory and mental time travel, and argue that it has evolved primarily for prospection as opposed to retrospection. Furthermore, we shall question the notion that mental time travel is a uniquely human construct, and argue that some of the best evidence for the evolution of mental time travel comes from our distantly related cousins, the corvids, which cache food for the future and rely on long-lasting and highly accurate memories of what, where and when they stored their stashes of food. PMID:28479980
NASA Astrophysics Data System (ADS)
Wu, Qi
2010-03-01
Demand forecasts play a crucial role in supply chain management. The future demand for a certain product is the basis for the respective replenishment system. For demand series with small samples, seasonality, nonlinearity, randomness and fuzziness, existing support vector kernels do not approximate the random curve of the sales time series well in L2 space (the space of square-integrable functions). In this paper, we present a hybrid intelligent system combining the wavelet kernel support vector machine and particle swarm optimization for demand forecasting. The results of an application to car sales series forecasting show that the forecasting approach based on the hybrid PSOWv-SVM model is effective and feasible; a comparison with other methods shows that, for the discussed example, it outperforms the hybrid PSOv-SVM model and other traditional methods.
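The wavelet-kernel idea can be sketched with the commonly used Morlet-type wavelet kernel. Kernel ridge regression stands in here for the paper's epsilon-insensitive SVM, and no PSO tuning is performed, so this is an illustrative simplification on toy seasonal data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Morlet-type wavelet kernel, a common "wavelet kernel" for kernel machines:
# K(x, z) = prod_i cos(1.75*(x_i - z_i)/a) * exp(-(x_i - z_i)^2 / (2*a^2)).
def wavelet_kernel(X, Z, a):
    D = X[:, None, :] - Z[None, :, :]
    return np.prod(np.cos(1.75*D/a) * np.exp(-D**2/(2*a**2)), axis=-1)

def fit_predict(X_tr, y_tr, X_te, a, lam):
    K = wavelet_kernel(X_tr, X_tr, a)
    alpha = np.linalg.solve(K + lam*np.eye(len(K)), y_tr)   # ridge solution
    return wavelet_kernel(X_te, X_tr, a) @ alpha

# Toy seasonal demand series; embed with 12 monthly lags, forecast one step.
t = np.arange(150)
demand = 100 + 20*np.sin(2*np.pi*t/12) + 2*rng.standard_normal(t.size)
m = 12
X = np.array([demand[i:i+m] for i in range(len(demand) - m)])
y = demand[m:]
X_tr, y_tr, X_te, y_te = X[:-12], y[:-12], X[-12:], y[-12:]

y_mean = y_tr.mean()
preds = fit_predict(X_tr, y_tr - y_mean, X_te, a=25.0, lam=1e-2) + y_mean
rmse = np.sqrt(np.mean((preds - y_te)**2))
```

In the paper, the kernel width and the SVM hyperparameters are the quantities tuned by particle swarm optimization; here they are simply fixed by hand.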
Utilization management in radiology, part 2: perspectives and future directions.
Duszak, Richard; Berlin, Jonathan W
2012-10-01
Increased utilization of medical imaging in the early part of the last decade has resulted in numerous efforts to reduce associated spending. Recent initiatives have focused on managing utilization with radiology benefits managers and real-time order entry decision support systems. Although these approaches might seem mutually exclusive and their application to radiology appears unique, the historical convergence and broad acceptance of both programs within the pharmacy sector may offer parallels for their potential future in medical imaging. In this second installment of a two-part series, anticipated trends in radiology utilization management are reviewed. Perspectives on current and future potential roles of radiologists in such initiatives are discussed, particularly in light of emerging physician payment models. Copyright © 2012 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Using GRACE and climate model simulations to predict mass loss of Alaskan glaciers through 2100
Wahr, John; Burgess, Evan; Swenson, Sean
2016-05-30
Glaciers in Alaska are currently losing mass at a rate of ~−50 Gt a⁻¹, one of the largest ice loss rates of any regional collection of mountain glaciers on Earth. Existing projections of Alaska's future sea-level contributions tend to be divergent and are not tied directly to regional observations. Here we develop a simple, regional observation-based projection of Alaska's future sea-level contribution. We compute a time series of recent Alaska glacier mass variability using monthly GRACE gravity fields from August 2002 through December 2014. We also construct a three-parameter model of Alaska glacier mass variability based on monthly ERA-Interim snowfall and temperature fields. When these three model parameters are fitted to the GRACE time series, the model explains 94% of the variance of the GRACE data. Using these parameter values, we then apply the model to simulated fields of monthly temperature and snowfall from the Community Earth System Model, to obtain predictions of mass variations through 2100. We conclude that mass loss rates may increase to between −80 and −110 Gt a⁻¹ by 2100, with a total sea-level rise contribution of 19 ± 4 mm during the 21st century.
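The abstract does not give the three-parameter model's form, so the sketch below uses a plausible degree-day-style stand-in (accumulation proportional to snowfall, melt proportional to temperature above a threshold) fitted to a synthetic "GRACE-like" series to illustrate the calibration step.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(5)

# Hypothetical three-parameter mass model: monthly mass change equals
# alpha*snowfall minus beta*(temperature excess above threshold T0),
# accumulated into a mass time series.
months = 150
snow = rng.uniform(0.0, 2.0, months)
temp = 5.0*np.sin(2*np.pi*np.arange(months)/12) + rng.standard_normal(months)

def mass_series(params):
    alpha, beta, t0 = params
    return np.cumsum(alpha*snow - beta*np.maximum(temp - t0, 0.0))

true_params = (3.0, 2.0, 1.0)
grace_like = mass_series(true_params) + 0.5*rng.standard_normal(months)

# Fit the three parameters to the "GRACE" series by nonlinear least squares,
# then report the fraction of variance explained (cf. the paper's 94%).
fit = least_squares(lambda p: mass_series(p) - grace_like, x0=(1.0, 1.0, 0.0))
explained = 1.0 - np.var(mass_series(fit.x) - grace_like) / np.var(grace_like)
```

Once calibrated, the same function can simply be re-evaluated with climate-model temperature and snowfall fields to produce the forward projection, which is the structure of the paper's 2100 prediction.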
Applying downscaled global climate model data to a hydrodynamic surface-water and groundwater model
Swain, Eric; Stefanova, Lydia; Smith, Thomas
2014-01-01
Precipitation data from Global Climate Models have been downscaled to smaller regions. Adapting this downscaled precipitation data to a coupled hydrodynamic surface-water/groundwater model of southern Florida allows an examination of future conditions and their effect on groundwater levels, inundation patterns, surface-water stage and flows, and salinity. The downscaled rainfall data include the 1996-2001 time series from the European Center for Medium-Range Weather Forecasting ERA-40 simulation and both the 1996-1999 and 2038-2057 time series from two global climate models: the Community Climate System Model (CCSM) and the Geophysical Fluid Dynamic Laboratory (GFDL). Synthesized surface-water inflow datasets were developed for the 2038-2057 simulations. The resulting hydrologic simulations, with and without a 30-cm sea-level rise, were compared with each other and field data to analyze a range of projected conditions. Simulations predicted generally higher future stage and groundwater levels and surface-water flows, with sea-level rise inducing higher coastal salinities. A coincident rise in sea level, precipitation and surface-water flows resulted in a narrower inland saline/fresh transition zone. The inland areas were affected more by the rainfall difference than by the sea-level rise, and the rainfall differences had little effect on coastal inundation but a larger effect on coastal salinities.
Overview of the Future Forest Webinar Series [Chapter 1
Sarah Hines; Megan Matonis
2014-01-01
The Future Forest Webinar Series was created to facilitate dialogue between scientists and managers about the challenges and opportunities created by the mountain pine beetle (MPB) epidemic. A core team of scientists and managers from the USFS Rocky Mountain Research Station and the Northern and Rocky Mountain Regions worked together to develop the format and content...
NASA Astrophysics Data System (ADS)
Leggett, L. Mark W.; Ball, David A.
2018-02-01
The difference between the time series trend for temperature expected from the increasing level of atmospheric CO2 and that for the (more slowly rising) observed temperature has been termed the global surface temperature slowdown. In this paper, we characterise the single time series made from the subtraction of these two time series as the `global surface temperature gap'. We also develop an analogous atmospheric CO2 gap series from the difference between the level of CO2 and first-difference CO2 (that is, the change in CO2 from one period to the next). This paper provides three further pieces of evidence concerning the global surface temperature slowdown. First, we find that the present size of both the global surface temperature gap and the CO2 gap is unprecedented over a period starting at least as far back as the 1860s. Second, ARDL and Granger causality analyses involving the global surface temperature gap against the major candidate physical drivers of the ocean heat sink and biosphere evapotranspiration are conducted. In each case where ocean heat data was available, it was significant in the models: however, evapotranspiration, or its argued surrogate precipitation, also remained significant in the models alongside ocean heat. In terms of relative scale, the standardised regression coefficient for evapotranspiration was repeatedly of the same order of magnitude as—typically as much as half that for—ocean heat. The foregoing is evidence that, alongside the ocean heat sink, evapotranspiration is also likely to be making a substantial contribution to the global atmospheric temperature outcome. Third, there is evidence that both the ocean heat sink and the evapotranspiration process might be able to continue into the future to keep the temperature lower than the level-of-CO2 models would suggest. 
It is shown that this means there can be benefit in using the first-difference CO2 to temperature relationship shown in Leggett and Ball (Atmos Chem Phys 15(20):11571-11592, 2015) to forecast future global surface temperature.
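The construction of the two gap series can be sketched on stylised data. The trend model and scalings below are assumptions chosen only to mimic the qualitative setting (accelerating CO2, observed temperature rising more slowly than the level-of-CO2 expectation); the paper's exact definitions are not reproduced in the abstract.

```python
import numpy as np

# Stylised annual series, 1860-2020.
years = np.arange(1860, 2021)
co2 = 285.0 + 0.012*(years - 1860)**2          # CO2 level, ppm
temp_expected = 0.01*(co2 - co2[0])            # level-of-CO2 temperature model
temp_observed = temp_expected - 0.002*(years - 1860)   # slower observed rise

# The "temperature gap": expected-from-CO2 minus observed temperature.
temp_gap = temp_expected - temp_observed

# The "CO2 gap": standardised CO2 level minus standardised first-difference CO2
# (the change in CO2 from one year to the next).
def standardise(x):
    return (x - x.mean()) / x.std()

co2_diff = np.diff(co2)
co2_gap = standardise(co2[1:]) - standardise(co2_diff)
```

With these stylised inputs both gaps grow over time, which is the qualitative pattern the paper reports as unprecedented since the 1860s.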
A comprehensive segmentation analysis of crude oil market based on time irreversibility
NASA Astrophysics Data System (ADS)
Xia, Jianan; Shang, Pengjian; Lu, Dan; Yin, Yi
2016-05-01
In this paper, we perform a comprehensive entropic segmentation analysis of crude oil futures prices from 1983 to 2014, using the Jensen-Shannon divergence as the statistical distance between segments, and analyze the results from the original series S and the series beginning in 1986 (marked S*) to find common segments which have the same boundaries. We then apply time irreversibility analysis to each segment to divide all segments into two groups according to their degree of asymmetry. Based on the temporal distribution of the common segments and the highly asymmetric segments, we find that these two types of segments appear alternately and basically do not overlap in the daily group, while the common portions are also highly asymmetric segments in the weekly group. In addition, the temporal distribution of the common segments is fairly close to the times of crises, wars and other events, because the shock from severe events to the oil price makes these common segments quite different from their adjacent segments. The common segments can be confirmed in the daily group series, or in the weekly group series, owing to the large divergence between common segments and their neighbors. The identification of highly asymmetric segments is helpful for recognizing the segments which are not badly affected by such events and can recover to steady states automatically. Finally, we rearrange the segments by merging connected common segments or highly asymmetric segments into a single segment, and conjoin the connected segments which are neither common nor highly asymmetric.
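A single step of entropic segmentation using the Jensen-Shannon divergence can be sketched as follows. The full method recurses on each side and keeps a split only if the divergence is statistically significant; the data here are synthetic.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

rng = np.random.default_rng(6)

# Slide a cut point along a symbolised series, compare the symbol
# distributions of the two sides with the Jensen-Shannon divergence,
# and split where the divergence is largest.
def best_split(symbols, n_bins):
    best_cut, best_js = None, -1.0
    for cut in range(n_bins, len(symbols) - n_bins):
        p = np.bincount(symbols[:cut], minlength=n_bins) / cut
        q = np.bincount(symbols[cut:], minlength=n_bins) / (len(symbols) - cut)
        js = jensenshannon(p, q)**2        # squared distance = the divergence
        if js > best_js:
            best_cut, best_js = cut, js
    return best_cut, best_js

# Two regimes with shifted return distributions, discretised into 4 symbols.
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(2, 1, 500)])
symbols = np.digitize(x, np.quantile(x, [0.25, 0.5, 0.75]))
cut, js = best_split(symbols, 4)
```

On this toy series the maximising cut lands at the true regime change near index 500, the analogue of the paper's segment boundaries coinciding with crises and wars.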
Modeling urban flood risk territories for Riga city
NASA Astrophysics Data System (ADS)
Piliksere, A.; Sennikovs, J.; Virbulis, J.; Bethers, U.; Bethers, P.; Valainis, A.
2012-04-01
Riga, the capital of Latvia, is located on the River Daugava at the Gulf of Riga. The main flooding risks for Riga city are: (1) storm-caused water setup in the southern part of the Gulf of Riga (storm event), (2) water level increases caused by Daugava River discharge maxima (spring snow-melting event) and (3) strong rainfall or rapid snow melting in densely populated urban areas. The first two flooding factors were discussed previously (Piliksere et al., 2011). The aims of the study were (1) the identification of flood risk situations in densely populated areas, (2) the quantification of flooding scenarios caused by rain and snow-melting events of different return periods for the present day, the near future (2021-2050) and the far future (2071-2100), taking into account projections of climate change, (3) the estimation of groundwater levels for Riga city, (4) the building and calibration of a hydrological mathematical model based on SWMM (EPA, 2004) for the domain potentially vulnerable to rain and snow-melt flooding events, (5) the calculation of rain and snow-melting flood events with different return periods, and (6) the mapping of potentially flooded areas on a fine grid. The time series of short-term precipitation events during the warm period of the year (i.e., rain events) were analyzed for a 35-year period. Annual maxima of precipitation intensity for events of different durations (5 min; 15 min; 1 h; 3 h; 6 h; 12 h; 1 day; 2 days; 4 days; 10 days) were calculated. The time series of long-term simultaneous precipitation data and observations of the reduction in snow-cover thickness were analyzed for a 27-year period. Snow-thawing periods were detected and maxima of snow-melting intensity for events of different durations (1 day; 2 days; 4 days; 7 days; 10 days) were calculated. 
According to occurrence probability, six scenarios for each event, for the present day, the near future and the far future, with return periods of once in 5, 10, 20, 50, 100 and 200 years, were constructed based on Gumbel extreme value analysis. Hydrological modelling driven by temperature and precipitation data series from regional climate models was used to evaluate rain event maxima in the future periods. The use of climate model data in hydrological models introduces systematic errors; therefore the bias correction method of Sennikovs and Bethers (2009) was applied to determine future rainfall intensities. A SWMM model was built for the urban area. Objects of hydraulic importance (manifolds, penstocks, ditches, pumping stations, weirs, wells, catchment sub-basins, etc.) were included in the model. Riga has both a pure rain sewage system and a mixed rainwater/household sewage system. The sewage system, with wastewater load proportional to population density, was taken into account and calibrated. The model system was calibrated for a real rain event against the water flux time series into the sewage treatment plant of Riga. A high-resolution (~1.5 points per square meter) digital terrain map was used as the base for the finite element mesh for the geospatial mapping of the results of the hydraulic calculations. The main results of the study are (1) detection of the hot spots of densely populated urban areas; (2) identification of the weak links of the melioration and sewage systems; (3) mapping of the elevation of groundwater mainly caused by snow melting. References: A. Piliksere, A. Valainis, J. Seņņikovs (2011), A flood risk assessment for Riga city taking account of climate changes, EGU, Vienna, Austria. EPA (2004), Storm water management model, user's manual version 5.0, US Environmental Protection Agency. J. Sennikovs, U. Bethers (2009), Statistical downscaling method of regional climate model results for hydrological modelling, 18th World IMACS/MODSIM Congress, Cairns, Australia.
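The "once in T years" scenario construction rests on Gumbel extreme value analysis of annual maxima. A minimal sketch using a method-of-moments fit and hypothetical intensity maxima (the study's data and exact fitting procedure are not reproduced):

```python
# Sketch: return levels from a Gumbel distribution fitted to annual maxima
# by the method of moments.
import math

annual_maxima = [31.0, 44.5, 28.2, 52.1, 39.8,
                 47.3, 35.6, 41.0, 58.4, 33.7]   # hypothetical intensities, mm/h

n = len(annual_maxima)
mean = sum(annual_maxima) / n
var = sum((v - mean) ** 2 for v in annual_maxima) / (n - 1)
beta = math.sqrt(6 * var) / math.pi              # Gumbel scale parameter
mu = mean - 0.5772 * beta                        # location (Euler-Mascheroni constant)

def return_level(T):
    """Intensity exceeded on average once every T years."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

for T in (5, 10, 20, 50, 100, 200):
    print(T, round(return_level(T), 1))
```

Maximum-likelihood fitting would be preferred for short records, but the return-level formula is the same.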
Initial Validation of NDVI Time Series from AVHRR, VEGETATION, and MODIS
NASA Technical Reports Server (NTRS)
Morisette, Jeffrey T.; Pinzon, Jorge E.; Brown, Molly E.; Tucker, Jim; Justice, Christopher O.
2004-01-01
The paper addresses Theme 7: multi-sensor opportunities for VEGETATION. We present an analysis of a long-term vegetation record derived from three moderate-resolution sensors: AVHRR, VEGETATION, and MODIS. While empirically based manipulation can ensure agreement between the three data sets, there is a need to validate the series. This paper uses atmospherically corrected ETM+ data available over the EOS Land Validation Core Sites as an independent data set with which to compare the time series. We use ETM+ data from 15 globally distributed sites, 7 of which have repeat coverage in time. These high-resolution data are compared to the values of each sensor by spatially aggregating the ETM+ data to each sensor's spatial resolution. The aggregated ETM+ value provides a point estimate for a specific site on a specific date, and its standard deviation is used to construct a confidence interval. The values from each moderate-resolution sensor are then evaluated with respect to that confidence interval. Results show that AVHRR, VEGETATION, and MODIS data can be combined to assess temporal uncertainties and address data continuity issues, and that the atmospherically corrected ETM+ data provide an independent source with which to compare that record. The final product is a consistent time series climate record that links historical observations to current and future measurements.
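The validation step described, aggregating fine ETM+ pixels to a coarse footprint and checking the coarse-sensor value against a confidence interval, can be sketched as follows. The NDVI values, footprint size, and the two-sigma interval are illustrative assumptions, not the study's actual data or interval construction:

```python
# Sketch: aggregate fine-resolution (ETM+-like) NDVI to one coarse cell
# and test whether a coarse-sensor value falls inside a confidence interval
# built from the fine-pixel spread.
import numpy as np

rng = np.random.default_rng(3)
# ~16x16 fine pixels (~30 m each) inside one ~500 m coarse cell, clipped to valid NDVI:
fine_ndvi = np.clip(rng.normal(0.62, 0.05, size=(16, 16)), 0.0, 1.0)

point_estimate = float(fine_ndvi.mean())         # aggregated ETM+-like value
sd = float(fine_ndvi.std(ddof=1))
ci = (point_estimate - 2 * sd, point_estimate + 2 * sd)   # approximate 95% interval

modis_like_value = 0.60                          # hypothetical coarse observation
agrees = ci[0] <= modis_like_value <= ci[1]
print(point_estimate, ci, agrees)
```

In practice the aggregation would weight by sensor point spread functions rather than taking a plain mean.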
Study nonlinear dynamics of stratospheric ozone concentration at Pakistan Terrestrial region
NASA Astrophysics Data System (ADS)
Jan, Bulbul; Zai, Muhammad Ayub Khan Yousuf; Afradi, Faisal Khan; Aziz, Zohaib
2018-03-01
This study investigates the nonlinear dynamics of the stratospheric ozone layer over the Pakistan atmospheric region. Ozone is now considered one of the most important issues in the world because of its diverse effects on the earth's biosphere, including human health, ecosystems, marine life, agricultural yield and climate change. This paper therefore deals with total monthly time series data of stratospheric ozone over the Pakistan atmospheric region from 1970 to 2013. Two approaches, basic statistical analysis and the fractal dimension (D), were adopted to study the nature of the nonlinear dynamics of the stratospheric ozone level. The results show that the Hurst exponent values from both fractal dimension methods reveal anti-persistent behavior (negative correlation), i.e. a decreasing trend for all lags, and that rescaled range analysis is more appropriate than detrended fluctuation analysis. In the seasonal time series, all months follow anti-persistent behavior except November, which shows persistent behavior, i.e. an independent and increasing trend. The normality test statistics also confirm the nonlinear behavior of ozone, and the rejection of the hypothesis provides strong evidence of the complexity of the data. This study will be useful to researchers working in the same field in the future to verify the complex nature of stratospheric ozone.
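The rescaled range analysis the abstract favors can be sketched in a few lines. This is a minimal R/S estimator on synthetic white noise (H near 0.5); the abstract's anti-persistent ozone series would yield H below 0.5:

```python
# Sketch: a minimal rescaled-range (R/S) estimate of the Hurst exponent.
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for w in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())
            r = dev.max() - dev.min()            # range of cumulative deviations
            s = seg.std(ddof=0)
            if s > 0:
                rs_vals.append(r / s)            # rescaled range for this window
        log_n.append(np.log(w))
        log_rs.append(np.log(np.mean(rs_vals)))
    slope, _ = np.polyfit(log_n, log_rs, 1)      # H is the log-log slope
    return float(slope)

rng = np.random.default_rng(42)
H = hurst_rs(rng.standard_normal(4096))
print(H)  # roughly 0.5 for uncorrelated noise (small-sample bias pushes it slightly higher)
```

Production analyses apply finite-sample corrections (e.g. Anis-Lloyd) to reduce the known upward bias at small window sizes.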
Wet tropospheric delays forecast based on Vienna Mapping Function time series analysis
NASA Astrophysics Data System (ADS)
Rzepecka, Zofia; Kalita, Jakub
2016-04-01
It is well known that the dry part of the zenith tropospheric delay (ZTD) is much easier to model than the wet part (ZTW). The aim of this research is to apply stochastic modeling and prediction of ZTW using time series analysis tools. Application of time series analysis enables a closer understanding of ZTW behavior as well as short-term prediction of future ZTW values. The ZTW data used for the studies were obtained from the GGOS service maintained by the Vienna University of Technology. The resolution of the data is six hours, and ZTW values for the years 2010-2013 were adopted for the study. The International GNSS Service (IGS) permanent stations LAMA and GOPE, located in mid-latitudes, were selected for the investigation. Initially the seasonal part was separated and modeled using periodic signals and frequency analysis. The prominent annual and semi-annual signals were removed using sine and cosine functions. The autocorrelation of the resulting signal is significant for several days (20-30 samples). The residuals of this fitting were further analyzed and modeled with ARIMA processes. For both stations, optimal ARMA processes were obtained on the basis of several criteria. On this basis, ZTW values were predicted one day ahead, leaving white-noise residuals. The accuracy of the prediction can be estimated at about 3 cm.
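The two-step scheme described, harmonic removal followed by a stochastic model of the residual, can be sketched as follows. A simple AR(1) fit stands in for the paper's full ARIMA order selection, and the data are synthetic with assumed parameters:

```python
# Sketch: remove annual/semi-annual harmonics by least squares, then model
# the residual autocorrelation with an AR(1) coefficient.
import numpy as np

rng = np.random.default_rng(1)
n = 4 * 365 * 4                                  # four years at 6-hour resolution
t = np.arange(n)
year = 365.0 * 4                                 # samples per year
season = 10 * np.sin(2 * np.pi * t / year) + 3 * np.cos(4 * np.pi * t / year)
resid_true = np.zeros(n)
for i in range(1, n):                            # AR(1) residual, assumed phi = 0.7
    resid_true[i] = 0.7 * resid_true[i - 1] + rng.standard_normal()
ztw = 100 + season + resid_true                  # synthetic ZTW-like series

# Step 1: fit mean plus annual and semi-annual sine/cosine terms.
X = np.column_stack([np.ones(n),
                     np.sin(2 * np.pi * t / year), np.cos(2 * np.pi * t / year),
                     np.sin(4 * np.pi * t / year), np.cos(4 * np.pi * t / year)])
coef, *_ = np.linalg.lstsq(X, ztw, rcond=None)
resid = ztw - X @ coef

# Step 2: AR(1) coefficient from the lag-1 autocorrelation of the residual.
phi = float(np.corrcoef(resid[:-1], resid[1:])[0, 1])
print(phi)  # recovers a value near the assumed 0.7
```

A real application would compare ARMA orders by information criteria, as the paper does, and iterate the forecast six hours at a time out to one day.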
Andrews, Gavin J
2017-04-01
Part one in this two-paper series reviewed the nature of geographical thinking in nursing research thus far. The current paper builds on it by looking forward and providing a particular vision for future research. It argues that it is time to once again look to the parent discipline of human geography for inspiration, specifically to its turn towards non-representational theory, involving an emphasis on life that onflows prior to meaning, significance, and full cognition; on life's 'taking-place'. The paper introduces this way of viewing and animating the world. Some potential connections to nursing research and practice are suggested, as are some specific avenues for future inquiry. It explains how, through non-representational theory, nursing might be re-imagined as something that reveals space-time. © 2016 John Wiley & Sons Ltd.
The fractal feature and price trend in the gold future market at the Shanghai Futures Exchange (SFE)
NASA Astrophysics Data System (ADS)
Wu, Binghui; Duan, Tingting
2017-05-01
The price of gold futures is affected by many factors, which include the fluctuation of the gold price and changes in the trading environment. Fractal analysis can help investors gain a better understanding of price fluctuations and make reasonable investment decisions in the gold futures market. After analyzing gold futures prices from January 2nd, 2014 to April 12th, 2016 at the Shanghai Futures Exchange (SFE) in China, the conclusion is drawn that the gold futures market shows persistence within each trading day, with all Hurst indexes greater than 0.5. The changing features of the Hurst index indicate that the persistence of the gold futures market first strengthened and then weakened. As a complicated nonlinear system, the gold futures market can be well reflected by an Elman neural network, which is capable of memorizing previous prices and is particularly suited to forecasting time series in comparison with other types of neural networks. After analyzing the price trend in the gold futures market, the results show that the relative error between the actual value of gold futures and the predictive value of the Elman neural network is small. This model, which performs well in data fitting and prediction, can help investors analyze and foresee the price tendency in the gold futures market.
One hundred and fifty years of sprint and distance running – Past trends and future prospects
Weiss, Martin; Newman, Alexandra; Whitmore, Ceri; Weiss, Stephan
2016-01-01
Sprint and distance running have experienced remarkable performance improvements over the past century. Attempts to forecast running performances have an almost equally long history but have so far relied on relatively short data series. Here, we compile a comprehensive set of season-best performances for eight Olympically contested running events. With this data set, we conduct (1) an exponential time series analysis and (2) a power-law experience curve analysis to quantify the rate of past performance improvements and to forecast future performances until the year 2100. We find that the sprint and distance running performances of women and men improve exponentially with time and converge at yearly rates of 4% ± 3% and 2% ± 2%, respectively, towards their asymptotic limits. Running performances can also be modelled with the experience curve approach, yielding learning rates of 3% ± 1% and 6% ± 2% for the women's and men's events, respectively. Long-term trends suggest that: (1) women will continue to run 10-20% slower than men, (2) 9.50 s over the 100 m dash may only be broken at the end of this century and (3) several middle- and long-distance records may be broken within the next two to three decades. The prospects of witnessing a sub-2 hour marathon before 2100 remain inconclusive. Our results should be interpreted cautiously as forecasting human behaviour is intrinsically uncertain. Future season-best sprint and distance running performances will continue to scatter around the trends identified here and may yield unexpected improvements of standing world records. PMID:26088705
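The experience-curve idea, performance falling as a power law of cumulative experience, reduces to a straight-line fit in log-log space, with the learning rate recovered from the slope. A noise-free sketch with assumed numbers (an illustrative 6% rate, not the paper's fitted values for any event):

```python
# Sketch: recover an experience-curve learning rate (fractional improvement
# per doubling of cumulative experience) from a log-log slope.
import numpy as np

learning_rate_true = 0.06                        # assumed: 6% improvement per doubling
b = np.log2(1.0 - learning_rate_true)            # corresponding power-law exponent

experience = np.array([1, 2, 4, 8, 16, 32, 64], dtype=float)  # cumulative experience units
time_s = 11.0 * experience ** b                  # hypothetical season-best times, noise-free

slope, intercept = np.polyfit(np.log2(experience), np.log2(time_s), 1)
learning_rate = 1.0 - 2.0 ** slope
print(learning_rate)  # recovers 0.06
```

With real, noisy season-best data the slope would carry an uncertainty, which is where the paper's ±1-2% ranges come from.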
Satellite Ocean Color: Present Status, Future Challenges
NASA Technical Reports Server (NTRS)
Gregg, Watson W.; McClain, Charles R.; Zukor, Dorothy J. (Technical Monitor)
2001-01-01
We are midway into our fifth consecutive year of nearly continuous, high-quality ocean color observations from space. The Ocean Color and Temperature Scanner/Polarization and Directionality of the Earth's Reflectances (OCTS/POLDER: Nov. 1996 - Jun. 1997), the Sea-viewing Wide Field-of-view Sensor (SeaWiFS: Sep. 1997 - present), and now the Moderate Resolution Imaging Spectroradiometer (MODIS: Sep. 2000 - present) have provided, and continue to provide, unprecedented views of chlorophyll dynamics on global scales. Global synoptic views of ocean chlorophyll were once a fantasy for ocean color scientists. It took nearly the entire 8-year lifetime of limited Coastal Zone Color Scanner (CZCS) observations to compile seasonal climatologies; SeaWiFS now produces comparably complete fields in about 8 days. For the first time, scientists may observe spatial and temporal variability never before seen in a synoptic context. Even more exciting, we are beginning to plausibly ask questions of interannual variability. We stand at the beginning of a long-term time series of ocean color, from which we may begin to ask questions of interdecadal variability and climate change. These are the scientific questions being addressed by users of the 18-year Advanced Very High Resolution Radiometer time series with respect to terrestrial processes and ocean temperatures. The nearly 5-year time series of ocean color observations now being constructed, with possibilities of continued observations, can put us at comparable standing with our terrestrial and physical oceanographic colleagues, and enable us to understand how ocean biological processes contribute to, and are affected by, global climate change.
ERIC Educational Resources Information Center
Johnston, Lloyd D.; O'Malley, Patrick M.; Bachman, Jerald G.; Schulenberg, John E.; Miech, Richard A.
2014-01-01
Monitoring the Future (MTF) is a research program conducted at the University of Michigan's Institute for Social Research under a series of investigator-initiated research grants from the National Institute on Drug Abuse--one of the National Institutes of Health. The study comprises several ongoing series of annual surveys of nationally…
ERIC Educational Resources Information Center
Lee, Everett S.; Bouvier, Leon F.
"Growth and Future of Cities" and "The Nation's Minorities" are units eight and nine, respectively from the fourteen-units series Population Profiles. The former initiates its consideration of our urban future with two divergent points of view on population distribution. Those views are brought into perspective by an historical investigation of…
ERIC Educational Resources Information Center
Methe, Scott A.
2012-01-01
The purpose of this extended commentary article is to frame the set of studies in the first of two issues and recommend areas of inquiry for future research. This special series issue features studies examining the technical qualities of formative assessment procedures that were developed to inform intervention. This article intends to emphasize…
Future Air Transportation System Breakout Series Report
NASA Technical Reports Server (NTRS)
2001-01-01
This presentation discusses the AvSTAR future system effort: it is critically important; it is an investment in the future; it needs to follow a systems engineering process; and the efforts need to be carried out in a worldwide context.
NASA Astrophysics Data System (ADS)
Wang, Zhuosen; Schaaf, Crystal B.; Sun, Qingsong; Kim, JiHyun; Erb, Angela M.; Gao, Feng; Román, Miguel O.; Yang, Yun; Petroy, Shelley; Taylor, Jeffrey R.; Masek, Jeffrey G.; Morisette, Jeffrey T.; Zhang, Xiaoyang; Papuga, Shirley A.
2017-07-01
Seasonal vegetation phenology can significantly alter surface albedo, which in turn affects the global energy balance and the albedo warming/cooling feedbacks that impact climate change. To monitor and quantify the surface dynamics of heterogeneous landscapes, high temporal and spatial resolution synthetic time series of albedo and the enhanced vegetation index (EVI) were generated from the 500 m Moderate Resolution Imaging Spectroradiometer (MODIS) operational Collection V006 daily BRDF/NBAR/albedo products and 30 m Landsat 5 albedo and near-nadir reflectance data through the use of the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM). The traditional Landsat albedo (Shuai et al., 2011) makes use of the MODIS BRDF/Albedo products (MCD43) by assigning appropriate BRDFs from coincident MODIS products to each Landsat image to generate a 30 m Landsat albedo product for that acquisition date. The available cloud-free Landsat 5 albedos (generated every 16 days at best, owing to clouds) were used in conjunction with the daily MODIS albedos to determine the appropriate 30 m albedos for the intervening daily time steps in this study. These enhanced daily 30 m spatial resolution synthetic time series were then used to track albedo and vegetation phenology dynamics over three Ameriflux tower sites (Harvard Forest in 2007, Santa Rita in 2011 and Walker Branch in 2005). These Ameriflux sites were chosen as they are all located quite near new towers coming online for the National Ecological Observatory Network (NEON), and thus represent locations which will be served by spatially paired albedo measures in the near future. The availability of data from the NEON towers will greatly expand the sources of tower albedometer data available for evaluation of satellite products. 
At these three Ameriflux tower sites the synthetic time series of broadband shortwave albedos were evaluated against the tower albedo measurements, with a root mean square error (RMSE) of less than 0.013 and a bias within the range of ±0.006. These synthetic time series provide much greater spatial detail than the 500 m gridded MODIS data, especially over more heterogeneous surfaces, which improves efforts to characterize and monitor spatial variation across species and communities. The mean difference between the maximum and minimum synthetic albedo within the MODIS pixels over a subset of satellite data of Harvard Forest (16 km by 14 km) was as high as 0.2 during the snow-covered period and reduced to around 0.1 during the snow-free period. Similarly, we have used STARFM to couple MODIS Nadir BRDF-Adjusted Reflectance (NBAR) values with Landsat 5 reflectances to generate daily synthetic time series of NBAR and thus of the Enhanced Vegetation Index (NBAR-EVI) at 30 m resolution. While STARFM is normally used with directional reflectances, the use of the view-angle-corrected daily MODIS NBAR values provides more consistent time series. These synthetic time series of EVI are shown to capture seasonal vegetation dynamics with finer spatial and temporal detail, especially over heterogeneous land surfaces.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singh, Ravindra; Uluski, Robert; Reilly, James T.
The objective of this survey is to benchmark current practices for DMS implementation to serve as a guide for future system implementations. The survey sought information on current plans to implement DMS, DMS functions of interest, implementation challenges, functional benefits achieved, and other relevant information. These survey results were combined (where possible) with results of similar surveys conducted in the previous four years to observe trends over time.
International Symposium of the Society of MADMEN (1st), 14-16 June 1982
1982-06-01
During 1980 the NAVAIRDEVCEN conducted a series of flight tests in an effort to evaluate two advanced compensator systems; namely, the Compensator...The fact that the A indices can be used at all, however unsatisfactorily, as indicators of activity in the MAD band is the result of correlations...updating capability based on real-time solar flare occurrence information, it should be possible to transmit both immediate and estimated future
1987-02-01
flowcharting. 3. Program Coding in HLL. This stage consists of transcribing the previously designed program into a form that can be translated into the machine...specified conditions 7. Documentation. Program documentation is necessary for user information, for maintenance, and for future applications. Flowcharts...particular CPU. Asynchronous. Operating without reference to an overall timing source. BASIC. Beginners' All-purpose Symbolic Instruction Code; a widely
Scale-free avalanche dynamics in the stock market
NASA Astrophysics Data System (ADS)
Bartolozzi, M.; Leinweber, D. B.; Thomas, A. W.
2006-10-01
Self-organized criticality (SOC) has been claimed to play an important role in many natural and social systems. In the present work we empirically investigate the relevance of this theory to stock-market dynamics. Avalanches in stock-market indices are identified using a multi-scale wavelet-filtering analysis designed to remove Gaussian noise from the index. Here, new methods are developed to identify the optimal filtering parameters which maximize the noise removal. The filtered time series is reconstructed and compared with the original time series. A statistical analysis of both high-frequency Nasdaq E-mini Futures and daily Dow Jones data is performed. The results of this new analysis confirm earlier results revealing a robust power-law behaviour in the probability distribution function of the sizes, duration and laminar times between avalanches. This power-law behaviour holds the potential to be established as a stylized fact of stock market indices in general. While the memory process, implied by the power-law distribution of the laminar times, is not consistent with classical models for SOC, we note that a power-law distribution of the laminar times cannot be used to rule out self-organized critical behaviour.
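The power-law behaviour reported for avalanche sizes, durations, and laminar times is usually quantified with a maximum-likelihood (Hill-type) exponent estimate. A minimal sketch on synthetic Pareto draws, not the paper's market data:

```python
# Sketch: maximum-likelihood estimate of a continuous power-law exponent
# alpha for p(x) ~ x^(-alpha), x >= xmin.
import numpy as np

rng = np.random.default_rng(7)
alpha_true, xmin = 2.5, 1.0
u = rng.uniform(size=20000)
samples = xmin * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))   # inverse-CDF Pareto draws

def powerlaw_alpha(x, xmin):
    """Continuous power-law MLE: alpha = 1 + n / sum(ln(x / xmin))."""
    x = np.asarray(x, dtype=float)
    x = x[x >= xmin]
    return 1.0 + len(x) / float(np.sum(np.log(x / xmin)))

alpha_hat = powerlaw_alpha(samples, xmin)
print(alpha_hat)  # close to the true 2.5
```

A full analysis would also select xmin objectively (e.g. by minimizing a Kolmogorov-Smirnov distance) and test the power-law hypothesis against alternatives.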
Untenable nonstationarity: An assessment of the fitness for purpose of trend tests in hydrology
NASA Astrophysics Data System (ADS)
Serinaldi, Francesco; Kilsby, Chris G.; Lombardo, Federico
2018-01-01
The detection and attribution of long-term patterns in hydrological time series have been important research topics for decades. A significant portion of the literature regards such patterns as 'deterministic components' or 'trends' even though the complexity of hydrological systems does not allow easy deterministic explanations and attributions. Consequently, trend estimation techniques have been developed to make and justify statements about tendencies in the historical data, which are often used to predict future events. Testing trend hypotheses on observed time series is widespread in the hydro-meteorological literature mainly due to the interest in detecting consequences of human activities on the hydrological cycle. This analysis usually relies on the application of some null hypothesis significance tests (NHSTs) for slowly-varying and/or abrupt changes, such as Mann-Kendall, Pettitt, or similar, to summary statistics of hydrological time series (e.g., annual averages, maxima, minima, etc.). However, the reliability of this application has seldom been explored in detail. This paper discusses misuse, misinterpretation, and logical flaws of NHST for trends in the analysis of hydrological data from three different points of view: historic-logical, semantic-epistemological, and practical. Based on a review of NHST rationale, and basic statistical definitions of stationarity, nonstationarity, and ergodicity, we show that even if the empirical estimation of trends in hydrological time series is always feasible from a numerical point of view, it is uninformative and does not allow the inference of nonstationarity without assuming a priori additional information on the underlying stochastic process, according to deductive reasoning. This prevents the use of trend NHST outcomes to support nonstationary frequency analysis and modeling. 
We also show that the correlation structures characterizing hydrological time series might easily be underestimated, further compromising the attempt to draw conclusions about trends spanning the period of records. Moreover, even though adjusting procedures accounting for correlation have been developed, some of them are insufficient or are applied only to some tests, while some others are theoretically flawed but still widely applied. In particular, using 250 unimpacted stream flow time series across the conterminous United States (CONUS), we show that the test results can dramatically change if the sequences of annual values are reproduced starting from daily stream flow records, whose larger sizes enable a more reliable assessment of the correlation structures.
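The Mann-Kendall test discussed above can be sketched in a few lines. This is the classical form, without the tie corrections or the autocorrelation adjustments whose shortcomings the paper examines:

```python
# Sketch: classical Mann-Kendall trend test (S statistic and the
# normal-approximation Z score, no tie or autocorrelation correction).
import math

def mann_kendall(x):
    """Return the MK S statistic and its normal-approximation Z score."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])          # sign of each pairwise difference
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0       # variance under the null, no ties
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)             # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

s, z = mann_kendall(list(range(20)))   # strictly increasing series
print(s, z)                            # s = 190; z well above 1.96
```

The paper's point is precisely that a significant Z here does not by itself license a nonstationarity claim when the series is autocorrelated.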
Fossil-Fuel CO2 Emissions Database and Exploration System
NASA Astrophysics Data System (ADS)
Krassovski, M.; Boden, T.; Andres, R. J.; Blasing, T. J.
2012-12-01
The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL) quantifies the release of carbon from fossil-fuel use and cement production at global, regional, and national spatial scales. The CDIAC emission time series estimates are based largely on annual energy statistics published at the national level by the United Nations (UN). CDIAC has developed a relational database to house the collected data and information, and a web-based interface to help users worldwide identify, explore, and download desired emission data. The available information is divided into two major groups: time series and gridded data. The time series data are offered at global, regional, and national scales. Publications containing historical energy statistics make it possible to estimate fossil-fuel CO2 emissions back to 1751. Etemad et al. (1991) published a summary compilation that tabulates coal, brown coal, peat, and crude oil production by nation and year. Footnotes in the Etemad et al. (1991) publication extend the energy statistics time series back to 1751. Summary compilations of fossil fuel trade were published by Mitchell (1983, 1992, 1993, 1995). Mitchell's work tabulates solid and liquid fuel imports and exports by nation and year. These pre-1950 production and trade data were digitized, and CO2 emission calculations were made following the procedures discussed in Marland and Rotty (1984) and Boden et al. (1995). The gridded data comprise annual and monthly estimates. The annual data form a time series recording 1° latitude by 1° longitude CO2 emissions, in units of million metric tons of carbon per year, from anthropogenic sources for 1751-2008. The monthly fossil-fuel CO2 emission estimates for 1950-2008 provided in this database are derived from the time series of global, regional, and national fossil-fuel CO2 emissions (Boden et al. 2011), the references therein, and the methodology described in Andres et al. (2011).
The data accessible here take these tabular, national, mass-emissions data and distribute them spatially on a one degree latitude by one degree longitude grid. The within-country spatial distribution is achieved through a fixed population distribution as reported in Andres et al. (1996). This presentation introduces the newly built database and web interface, reflecting the present state and functionality of the Fossil-Fuel CO2 Emissions Database and Exploration System as well as future plans for expansion.
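The within-country distribution described above, allocating a national emissions total to grid cells in proportion to population, can be sketched as follows (illustrative function and toy numbers, not CDIAC code):

```python
import numpy as np

def grid_national_emissions(national_total, population_grid, country_mask):
    """Distribute a national emissions total (e.g., MtC/yr) over grid
    cells in proportion to each cell's share of the national population.
    `population_grid` and `country_mask` are 2-D arrays on the same
    one-degree grid; the mask is True inside the country."""
    pop = np.where(country_mask, population_grid, 0.0)
    total_pop = pop.sum()
    if total_pop == 0:
        raise ValueError("country mask contains no population")
    return national_total * pop / total_pop

# Toy 2x2 grid: one country occupying the left column only
pop = np.array([[3.0, 5.0], [1.0, 2.0]])
mask = np.array([[True, False], [True, False]])
grid = grid_national_emissions(100.0, pop, mask)
print(grid)  # 75.0 in the cell holding 3/4 of the national population
```

Using a fixed population distribution, as in Andres et al. (1996), means the spatial pattern within each country is held constant while the national totals vary year by year.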
The Impact of United States Monetary Policy in the Crude Oil futures market
NASA Astrophysics Data System (ADS)
Padilla-Padilla, Fernando M.
This research examines the empirical impact that United States monetary policy, through the federal funds rate, has on the volatility of crude oil prices in the futures market. Prior research has shown how macroeconomic events and variables have impacted different financial markets over both short- and long-term movements. After testing and decomposing the variables, the two stationary time series were analyzed using a Vector Autoregressive Model (VAR). The empirical evidence shows, with statistical significance, a direct relationship when explaining crude oil prices as a function of the federal funds rate at lag t-1 and an indirect relationship when explained as a function of the federal funds rate at lag t-2. These results partially address gaps in the literature on the implications of monetary policy for the crude oil futures market.
Hazard function theory for nonstationary natural hazards
Read, Laura K.; Vogel, Richard M.
2016-04-11
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
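For the generalized Pareto case named above, the hazard function has a simple closed form: h(x) = f(x) / (1 - F(x)) = 1 / (scale + shape * x). A minimal numerical check of this identity (SciPy's `genpareto` assumed; the parameter values are illustrative, not from the paper):

```python
import numpy as np
from scipy.stats import genpareto

def gpd_hazard(x, shape, scale):
    """Closed-form hazard rate of the generalized Pareto distribution:
    h(x) = f(x) / (1 - F(x)) = 1 / (scale + shape * x)."""
    return 1.0 / (scale + shape * x)

xi, sigma = 0.2, 10.0  # illustrative shape and scale
x = np.linspace(0.0, 50.0, 6)
closed_form = gpd_hazard(x, xi, sigma)
numeric = genpareto.pdf(x, xi, scale=sigma) / genpareto.sf(x, xi, scale=sigma)
print(np.allclose(closed_form, numeric))  # True
```

For positive shape the hazard rate decreases with magnitude, which is what makes the heavy-tailed GPD a natural model for extreme POT events.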
NASA Technical Reports Server (NTRS)
Campbell, Petya K. Entcheva; Middleton, Elizabeth M.; Thome, Kurt J.; Kokaly, Raymond F.; Huemmrich, Karl Fred; Lagomasino, David; Novick, Kimberly A.; Brunsell, Nathaniel A.
2013-01-01
This study evaluated Earth Observing 1 (EO-1) Hyperion reflectance time series at established calibration sites to assess the instrument stability and suitability for monitoring vegetation functional parameters. Our analysis using three pseudo-invariant calibration sites in North America indicated that the reflectance time series are devoid of apparent spectral trends and their stability is consistently within 2.5-5 percent throughout most of the spectral range spanning the 12-plus year data record. Using three vegetated sites instrumented with eddy covariance towers, the Hyperion reflectance time series were evaluated for their ability to determine important variables of ecosystem function. A number of narrowband and derivative vegetation indices (VI) closely described the seasonal profiles in vegetation function and ecosystem carbon exchange (e.g., net and gross ecosystem productivity) in three very different ecosystems, including a hardwood forest and tallgrass prairie in North America, and a Miombo woodland in Africa. Our results demonstrate the potential for scaling the carbon flux tower measurements to local and regional landscape levels. The VIs with stronger relationships to the CO2 parameters were derived using continuous reflectance spectra and included wavelengths associated with chlorophyll content and/or chlorophyll fluorescence. Since these indices cannot be calculated from broadband multispectral instrument data, the opportunity to exploit these spectrometer-based VIs in the future will depend on the launch of satellites such as EnMAP and HyspIRI. This study highlights the practical utility of space-borne spectrometers for characterization of the spectral stability and uniformity of the calibration sites in support of sensor cross-comparisons, and demonstrates the potential of narrowband VIs to track and spatially extend ecosystem functional status as well as carbon processes measured at flux towers.
Cloern, James E.; Abreu, Paulo C.; Carstensen, Jacob; Chauvaud, Laurent; Elmgren, Ragnar; Grall, Jacques; Greening, Holly; Johansson, John O.R.; Kahru, Mati; Sherwood, Edward T.; Xu, Jie; Yin, Kedong
2016-01-01
Time series of environmental measurements are essential for detecting, measuring and understanding changes in the Earth system and its biological communities. Observational series have accumulated over the past 2–5 decades from measurements across the world's estuaries, bays, lagoons, inland seas and shelf waters influenced by runoff. We synthesize information contained in these time series to develop a global view of changes occurring in marine systems influenced by connectivity to land. Our review is organized around four themes: (i) human activities as drivers of change; (ii) variability of the climate system as a driver of change; (iii) successes, disappointments and challenges of managing change at the sea-land interface; and (iv) discoveries made from observations over time. Multidecadal time series reveal that many of the world's estuarine–coastal ecosystems are in a continuing state of change, and the pace of change is faster than we could have imagined a decade ago. Some have been transformed into novel ecosystems with habitats, biogeochemistry and biological communities outside the natural range of variability. Change takes many forms including linear and nonlinear trends, abrupt state changes and oscillations. The challenge of managing change is daunting in the coastal zone where diverse human pressures are concentrated and intersect with different responses to climate variability over land and over ocean basins. The pace of change in estuarine–coastal ecosystems will likely accelerate as the human population and economies continue to grow and as global climate change accelerates. Wise stewardship of the resources upon which we depend is critically dependent upon a continuing flow of information from observations to measure, understand and anticipate future changes along the world's coastlines.
NASA Technical Reports Server (NTRS)
Rudasill-Neigh, Christopher S.; Bolton, Douglas K.; Diabate, Mouhamad; Williams, Jennifer J.; Carvalhais, Nuno
2014-01-01
Forests contain a majority of the aboveground carbon (C) found in ecosystems, and understanding biomass lost from disturbance is essential to improve our C-cycle knowledge. Our study region in the Wisconsin and Minnesota Laurentian Forest had a strong decline in Normalized Difference Vegetation Index (NDVI) from 1982 to 2007, observed with the National Oceanic and Atmospheric Administration's (NOAA) series of Advanced Very High Resolution Radiometer (AVHRR). To understand the potential role of disturbances in the terrestrial C-cycle, we developed an algorithm to map forest disturbances from either harvest or insect outbreak for Landsat time-series stacks. We merged two image analysis approaches into one algorithm to monitor forest change that included: (1) multiple disturbance index thresholds to capture clear-cut harvest; and (2) a spectral trajectory-based image analysis with multiple confidence interval thresholds to map insect outbreak. We produced 20 maps and evaluated classification accuracy with air-photos and insect air-survey data to understand the performance of our algorithm. We achieved overall accuracies ranging from 65% to 75%, with an average accuracy of 72%. The producer's and user's accuracy ranged from a maximum of 32% to 70% for insect disturbance, 60% to 76% for insect mortality and 82% to 88% for harvested forest, which was the dominant disturbance agent. Forest disturbances accounted for 22% of total forested area (7349 km2). Our algorithm provides a basic approach to map disturbance history where large impacts to forest stands have occurred and highlights the limited spectral sensitivity of Landsat time-series to outbreaks of defoliating insects. We found that only harvest and insect mortality events can be mapped with adequate accuracy with a non-annual Landsat time-series. This limited our understanding of the land-cover drivers of the NDVI decline.
We demonstrate that to capture more subtle disturbances with spectral trajectories, future observations must be temporally dense enough to distinguish between disturbance type and frequency in heterogeneous landscapes.
NASA Astrophysics Data System (ADS)
Guo, Danlu; Westra, Seth; Maier, Holger R.
2017-11-01
Scenario-neutral approaches are being used increasingly for assessing the potential impact of climate change on water resource systems, as these approaches allow the performance of these systems to be evaluated independently of climate change projections. However, practical implementations of these approaches are still scarce, with a key limitation being the difficulty of generating a range of plausible future time series of hydro-meteorological data. In this study we apply a recently developed inverse stochastic generation approach to support the scenario-neutral analysis, and thus identify the key hydro-meteorological variables to which the system is most sensitive. The stochastic generator simulates synthetic hydro-meteorological time series that represent plausible future changes in (1) the average, extremes and seasonal patterns of rainfall; and (2) the average values of temperature (Ta), relative humidity (RH) and wind speed (uz) as variables that drive potential evapotranspiration (PET). These hydro-meteorological time series are then fed through a conceptual rainfall-runoff model to simulate the potential changes in runoff as a function of changes in the hydro-meteorological variables, and runoff sensitivity is assessed with both correlation and Sobol' sensitivity analyses. The method was applied to a case study catchment in South Australia, and the results showed that the most important hydro-meteorological attributes for runoff were winter rainfall followed by the annual average rainfall, while the PET-related meteorological variables had comparatively little impact.
The approach illustrated in this study can greatly enhance our understanding of the key hydro-meteorological attributes and processes that are likely to drive catchment runoff under a changing climate, thus enabling the design of tailored climate impact assessments to specific water resource systems.
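The correlation-based part of such a sensitivity analysis can be sketched with a toy runoff response (all attribute ranges and coefficients here are invented for illustration; the study itself used a conceptual rainfall-runoff model and Sobol' indices as well):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
# Invented perturbation samples for three hydro-meteorological
# attributes (fractional changes relative to the baseline climate).
winter_rain = rng.uniform(-0.3, 0.3, n)
annual_rain = rng.uniform(-0.3, 0.3, n)
pet_change = rng.uniform(-0.1, 0.1, n)

# Toy stand-in for the rainfall-runoff model: runoff responds strongly
# to winter rainfall, moderately to annual rainfall, weakly to PET.
runoff = (1.0 + 2.0 * winter_rain + 1.0 * annual_rain
          - 0.2 * pet_change + rng.normal(0.0, 0.05, n))

for name, attr in [("winter rainfall", winter_rain),
                   ("annual rainfall", annual_rain),
                   ("PET change", pet_change)]:
    r = np.corrcoef(attr, runoff)[0, 1]
    print(f"{name}: r = {r:+.2f}")
```

Ranking the attributes by the magnitude of these correlations reproduces, for this toy model, the kind of ordering the study reports (winter rainfall first, PET-related variables last).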
Environmental triggers of Past Ebola Outbreaks in Africa, 1981 - 2014
NASA Astrophysics Data System (ADS)
Dartevelle, S.; NguyRobertson, A. L.
2016-12-01
Ebola virus, especially its most common and lethal form, Zaire ebolavirus, has eluded scientists for nearly 50 years. What is its primary host? Why does it go dormant, only to reappear in full force years later? What are the driving forces behind its intriguing dynamics? It has been surmised that local environmental factors (such as droughts and seasons) might be at play behind the on-and-off outbursts of Ebola outbreaks. So far, however, no clear lead has been demonstrated, leaving Ebola a hidden lethal menace constantly lurking in the environment for many African communities. We analyzed long-term time series of three environmental variables that may influence the cycle of Ebola virus outbreaks: (i) vegetation health, as determined from the Normalized Difference Vegetation Index (NDVI) collected by the AVHRR and MODIS satellite sensors, and the weather variables (ii) temperature and (iii) precipitation from the Climate Forecast System ver. 2. Time series data were averaged monthly and spatially over a 100 km grid around past known outbreak locations. Seasonal effects were removed from these time series before applying statistical analyses to identify causal linkages between NDVI, temperature, precipitation, and Ebola outbreaks. Likewise, possible tipping points prior to outbreaks (i.e., early warning signals of an upcoming outbreak) were identified. Our results indicate a causal dynamic link between outbreaks and the three environmental variables in the months prior to an outbreak, likely due to an abnormal change in the local precipitation pattern, which influences NDVI values and, to a lesser extent, temperature. Furthermore, our results provide evidence that these factors exhibit the early warning signals of a dynamical system at a tipping point prior to a future outbreak. These tipping-point and early-warning models may open new ways to develop forecast models of future Ebola outbreaks.
[US Government — Approved for Public Release 16-560].
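The seasonal-effect removal mentioned above is commonly done by subtracting each calendar month's long-term mean; a minimal sketch (the exact deseasonalizing method used by the authors is not specified, so this is a generic illustration):

```python
import numpy as np

def deseasonalize(values, months):
    """Remove the mean seasonal cycle from a monthly series by
    subtracting each calendar month's long-term mean, a standard
    preprocessing step before causal or tipping-point analysis."""
    values = np.asarray(values, dtype=float)
    months = np.asarray(months)
    anomalies = np.empty_like(values)
    for m in range(1, 13):
        sel = months == m
        anomalies[sel] = values[sel] - values[sel].mean()
    return anomalies

# Four years of a pure seasonal cycle deseasonalizes to zero anomalies
months = np.tile(np.arange(1, 13), 4)
cycle = np.sin(2 * np.pi * (months - 1) / 12)
print(np.allclose(deseasonalize(cycle, months), 0.0))  # True
```

Whatever signal survives this subtraction is the anomaly series on which causal-linkage and early-warning statistics are then computed.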
Busanello, Marcos; de Freitas, Larissa Nazareth; Winckler, João Pedro Pereira; Farias, Hiron Pereira; Dos Santos Dias, Carlos Tadeu; Cassoli, Laerte Dagher; Machado, Paulo Fernando
2017-01-01
Payment programs based on milk quality (PPBMQ) are used in several countries around the world as an incentive to improve milk quality. One of the principal milk parameters used in such programs is the bulk tank somatic cell count (BTSCC). In this study, using data from an average of 37,000 farms per month in Brazil where milk was analyzed, BTSCC data were divided into different payment classes based on milk quality. Then, descriptive and graphical analyses were performed. The probability of a change to a worse payment class was calculated, future BTSCC values were predicted using time series models, and financial losses due to the failure to reach the maximum bonus for the payment based on milk quality were simulated. In Brazil, the mean BTSCC has remained high in recent years, without a tendency to improve. The probability of changing to a worse payment class was strongly affected by both the BTSCC average and BTSCC standard deviation for classes 1 and 2 (1000-200,000 and 201,000-400,000 cells/mL, respectively) and only by the BTSCC average for classes 3 and 4 (401,000-500,000 and 501,000-800,000 cells/mL, respectively). The time series models indicated that at some point in the year, farms would not remain in their current class and would accrue financial losses due to payments based on milk quality. The BTSCC for Brazilian dairy farms has not recently improved. The probability of a class change to a worse class is a metric that can aid in decision-making and stimulate farmers to improve milk quality. A time series model can be used to predict the future value of the BTSCC, making it possible to estimate financial losses and to show, moreover, that financial losses occur in all classes of the PPBMQ because the farmers do not remain in the best payment class in all months.
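The probability of exceeding a payment-class limit can be sketched under a simplifying assumption the abstract does not state, namely that monthly BTSCC values are normally distributed around the farm's mean; the class limit below is the one quoted in the abstract, while the farm's mean and standard deviation are invented:

```python
from statistics import NormalDist

def prob_worse_class(mean_scc, sd_scc, upper_limit):
    """P(bulk-tank SCC exceeds its payment-class upper limit),
    assuming, as a simplification, that monthly BTSCC is normally
    distributed around the farm's long-run mean."""
    return 1.0 - NormalDist(mean_scc, sd_scc).cdf(upper_limit)

# Hypothetical class-1 farm: mean 180,000 cells/mL, sd 30,000,
# class 1 upper limit 200,000 cells/mL (from the abstract).
p = prob_worse_class(180_000, 30_000, 200_000)
print(round(p, 3))
```

This makes concrete why the abstract finds the change probability driven by both the mean and the standard deviation in the lower classes: with the mean close under the limit, the spread controls how often a month spills over.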
Time Frequency Analysis of The Land Subsidence Monitored Data with Exploration Geophysics
NASA Astrophysics Data System (ADS)
Wang, Shang-Wei
2014-05-01
Taiwan's geography and the water demands of various industries have led to excessive groundwater extraction in the Zhuoshui River alluvial fan, causing land subsidence that affects the safety of high-speed rail traffic and public infrastructure. In-depth research into the causes and behavior of the subsidence is therefore necessary, considering all related factors, including the groundwater extracted by each industry, the impact of climate-change-driven rainfall, and the characteristics of the ground formations. We conducted a series of in situ measurements and analyzed the monitoring data with the Hilbert-Huang Transform (HHT), discussing the subsidence mechanism and estimating the extent to which future high-speed rail traffic may be affected, as a reference for remediation. We investigate the characteristics of land subsidence in the Yun Lin area. The HHT and signal normalization are used to discuss the physical meanings of, and interactions among, the time series of settlement, groundwater, pumping, rainfall, and ground micro-tremor. Broadband seismic signals from the Broadband Array in Taiwan for Seismology (BATS) obtained near the Zhuoshui River (WLGB in Chia Yi, WGKB in Yun Lin, and RLNB in Zhang Hua) were analyzed using HHT and empirical mode decomposition (EMD) to characterize the micro-tremor of the settled ground. Comparing ten-year series of micro-tremor, groundwater, and land subsidence monitoring-well data yields further information about the subsidence. Electrical resistivity tomography (ERT) was performed to correlate the resistivity profile with borehole logging data at the test area, and the relationships among resistivity, groundwater variation, and ground subsidence obtained there are discussed. Active and passive multichannel analysis of surface waves (MASW) can yield Poisson's ratio from the shear-wave and pressure-wave velocities; the groundwater level can be inferred where Poisson's ratio approaches 0.5.
Repeated measurements over time reveal the fluctuation of groundwater stages and the variation of the ground.
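The Poisson's-ratio criterion above follows from elastic wave theory: nu = (Vp^2 - 2 Vs^2) / (2 (Vp^2 - Vs^2)), which approaches 0.5 as the shear velocity vanishes in fluid-like, saturated ground. A minimal sketch (the velocity values are illustrative, not from the survey):

```python
def poissons_ratio(vp, vs):
    """Poisson's ratio from P-wave (vp) and S-wave (vs) velocities:
    nu = (vp^2 - 2*vs^2) / (2*(vp^2 - vs^2)).
    As vs -> 0 (saturated, fluid-like ground), nu -> 0.5, which is
    the criterion used above to infer the groundwater level."""
    return (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))

print(poissons_ratio(1500.0, 100.0))  # near 0.5: below the water table
print(poissons_ratio(1000.0, 600.0))  # stiffer, drier material: lower nu
```

In a MASW profile, the depth at which the computed ratio climbs toward 0.5 is taken as the presumed groundwater level.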
ERIC Educational Resources Information Center
Schulenberg, John E.; Johnston, Lloyd D.; O'Malley, Patrick M.; Bachman, Jerald G.; Miech, Richard A.; Patrick, Megan E.
2017-01-01
Monitoring the Future (MTF), now in its 42nd year, is a research program conducted at the University of Michigan's Institute for Social Research under a series of investigator-initiated, competing research grants from the National Institute on Drug Abuse--one of the National Institutes of Health. The study comprises several ongoing series of…
ERIC Educational Resources Information Center
Center for the Study of Mathematics Curriculum, 2012
2012-01-01
In 2009-10 a series of Workshops was organized to focus on STEM (science, technology, engineering, and mathematics) learning design for young students and adolescents. The objective was to provide visionary leadership to the education community by: (a) identifying and analyzing the needs and opportunities for future STEM curriculum development and…
Stakeholder Analysis of an Executable Architecture Systems Engineering (EASE) Tool
2013-06-21
The FCR tables and stakeholder feedback are then used as the foundation of a Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis. Finally, the SWOT analysis and stakeholder feedback are translated into an EASE future development strategy: a series of recommendations regarding…
ERIC Educational Resources Information Center
Johnston, Lloyd D.; O'Malley, Patrick M.; Bachman, Jerald G.; Schulenberg, John E.; Miech, Richard A.
2016-01-01
Monitoring the Future (MTF), now in its 41st year, is a research program conducted at the University of Michigan's Institute for Social Research under a series of investigator-initiated, competing research grants from the National Institute on Drug Abuse--one of the National Institutes of Health. The study comprises several ongoing series of…
ERIC Educational Resources Information Center
Johnston, Lloyd D.; O'Malley, Patrick M.; Bachman, Jerald G.; Schulenberg, John E.; Miech, Richard A.
2015-01-01
Monitoring the Future (MTF), now in its 40th year, is a research program conducted at the University of Michigan's Institute for Social Research under a series of investigator-initiated research grants from the National Institute on Drug Abuse--one of the National Institutes of Health. The study comprises several ongoing series of annual surveys…
Satellite Ocean Biology: Past, Present, Future
NASA Technical Reports Server (NTRS)
McClain, Charles R.
2012-01-01
Since 1978 when the first satellite ocean color proof-of-concept sensor, the Nimbus-7 Coastal Zone Color Scanner, was launched, much progress has been made in refining the basic measurement concept and expanding the research applications of global satellite time series of biological and optical properties such as chlorophyll-a concentrations. The seminar will review the fundamentals of satellite ocean color measurements (sensor design considerations, on-orbit calibration, atmospheric corrections, and bio-optical algorithms), scientific results from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) and Moderate resolution Imaging Spectroradiometer (MODIS) missions, and the goals of future NASA missions such as PACE, the Aerosol, Cloud, Ecology (ACE), and Geostationary Coastal and Air Pollution Events (GeoCAPE) missions.
Space Fission Propulsion Testing and Development Progress. Phase 1
NASA Technical Reports Server (NTRS)
VanDyke, Melissa; Houts, Mike; Pedersen, Kevin; Godfroy, Tom; Dickens, Ricky; Poston, David; Reid, Bob; Salvail, Pat; Ring, Peter; Rodgers, Stephen L. (Technical Monitor)
2001-01-01
Successful development of space fission systems will require an extensive program of affordable and realistic testing. In addition to tests related to design/development of the fission system, realistic testing of the actual flight unit must also be performed. Testing can be divided into two categories, non-nuclear tests and nuclear tests. Full power nuclear tests of space fission systems are expensive, time consuming, and of limited use, even in the best of programmatic environments. If the system is designed to operate within established radiation damage and fuel burn up limits while simultaneously being designed to allow close simulation of heat from fission using resistance heaters, high confidence in fission system performance and lifetime can be attained through a series of non-nuclear tests. Non-nuclear tests are affordable and timely, and the cause of component and system failures can be quickly and accurately identified. MSFC is leading a Safe Affordable Fission Engine (SAFE) test series whose ultimate goal is the demonstration of a 300 kW flight configuration system using non-nuclear testing. This test series is carried out in collaboration with other NASA centers, other government agencies, industry, and universities. If SAFE-related nuclear tests are desired they will have a high probability of success and can be performed at existing nuclear facilities. The paper describes the SAFE non-nuclear test series, which includes test article descriptions, test results and conclusions, and future test plans.
Phase 1 space fission propulsion system testing and development progress
NASA Astrophysics Data System (ADS)
van Dyke, Melissa; Houts, Mike; Pedersen, Kevin; Godfroy, Tom; Dickens, Ricky; Poston, David; Reid, Bob; Salvail, Pat; Ring, Peter
2001-02-01
Successful development of space fission systems will require an extensive program of affordable and realistic testing. In addition to tests related to design/development of the fission system, realistic testing of the actual flight unit must also be performed. Testing can be divided into two categories, non-nuclear tests and nuclear tests. Full power nuclear tests of space fission systems are expensive, time consuming, and of limited use, even in the best of programmatic environments. If the system is designed to operate within established radiation damage and fuel burn up limits while simultaneously being designed to allow close simulation of heat from fission using resistance heaters, high confidence in fission system performance and lifetime can be attained through a series of non-nuclear tests. Non-nuclear tests are affordable and timely, and the cause of component and system failures can be quickly and accurately identified. MSFC is leading a Safe Affordable Fission Engine (SAFE) test series whose ultimate goal is the demonstration of a 300 kW flight configuration system using non-nuclear testing. This test series is carried out in collaboration with other NASA centers, other government agencies, industry, and universities. If SAFE-related nuclear tests are desired, they will have a high probability of success and can be performed at existing nuclear facilities. The paper describes the SAFE non-nuclear test series, which includes test article descriptions, test results and conclusions, and future test plans.
Downscaling climate change scenarios for apple pest and disease modeling in Switzerland
NASA Astrophysics Data System (ADS)
Hirschi, M.; Stoeckli, S.; Dubrovsky, M.; Spirig, C.; Calanca, P.; Rotach, M. W.; Fischer, A. M.; Duffy, B.; Samietz, J.
2012-02-01
As a consequence of current and projected climate change in temperate regions of Europe, agricultural pests and diseases are expected to occur more frequently and possibly to extend to previously non-affected regions. Given their economic and ecological relevance, detailed forecasting tools for various pests and diseases have been developed, which model their phenology, depending on actual weather conditions, and suggest management decisions on that basis. Assessing the future risk of pest-related damages requires future weather data at high temporal and spatial resolution. Here, we use a combined stochastic weather generator and re-sampling procedure for producing site-specific hourly weather series representing present and future (1980-2009 and 2045-2074 time periods) climate conditions in Switzerland. The climate change scenarios originate from the ENSEMBLES multi-model projections and provide probabilistic information on future regional changes in temperature and precipitation. Hourly weather series are produced by first generating daily weather data for these climate scenarios and then using a nearest neighbor re-sampling approach for creating realistic diurnal cycles. These hourly weather series are then used for modeling the impact of climate change on important life phases of the codling moth and on the number of predicted infection days of fire blight. Codling moth (Cydia pomonella) and fire blight (Erwinia amylovora) are two major pest and disease threats to apple, one of the most important commercial and rural crops across Europe. Results for the codling moth indicate a shift in the occurrence and duration of life phases relevant for pest control. In southern Switzerland, a 3rd generation per season occurs only very rarely under today's climate conditions but is projected to become normal in the 2045-2074 time period. 
While the potential risk of a 3rd generation is also increasing significantly in northern Switzerland (for most stations from roughly 1% on average today to over 60% in the future for the median climate change signal of the multi-model projections), the actual risk will critically depend on the pace of adaptation of the codling moth with respect to the critical photoperiod. To control this additional generation, an intensification and prolongation of control measures (e.g. insecticides) will be required, implying an increasing risk of pesticide resistance. For fire blight, the projected changes in infection days are less certain due to uncertainties in the leaf wetness approximation and the simulation of the blooming period. Two compensating effects are projected: warmer temperatures favoring infections are balanced by a temperature-induced advancement of the blooming period, leading to no significant change in the number of infection days under future climate conditions for most stations.
Linzer, Mark; Warde, Carole; Alexander, R Wayne; Demarco, Deborah M; Haupt, Allison; Hicks, Leroi; Kutner, Jean; Mangione, Carol M; Mechaber, Hilit; Rentz, Meridith; Riley, Joanne; Schuster, Barbara; Solomon, Glen D; Volberding, Paul; Ibrahim, Tod
2009-10-01
To establish guidelines for more effectively incorporating part-time faculty into departments of internal medicine, a task force was convened in early 2007 by the Association of Specialty Professors. The task force used informal surveys, current literature, and consensus building among members of the Alliance for Academic Internal Medicine to produce a consensus statement and a series of recommendations. The task force agreed that part-time faculty could enrich a department of medicine, enhance workforce flexibility, and provide high-quality research, patient care, and education in a cost-effective manner. The task force provided a series of detailed steps for operationalizing part-time practice; to do so, key issues were addressed, such as fixed costs, malpractice insurance, space, cross-coverage, mentoring, career development, productivity targets, and flexible scheduling. Recommendations included (1) increasing respect for work-family balance, (2) allowing flexible time as well as part-time employment, (3) directly addressing negative perceptions about part-time faculty, (4) developing policies to allow flexibility in academic advancement, (5) considering part-time faculty as candidates for leadership positions, (6) encouraging granting agencies, including the National Institutes of Health and Veterans Administration, to consider part-time faculty as eligible for research career development awards, and (7) supporting future research in "best practices" for incorporating part-time faculty into academic departments of medicine.
Legave, Jean-Michel; Guédon, Yann; Malagi, Gustavo; El Yaacoubi, Adnane; Bonhomme, Marc
2015-01-01
The responses of flowering phenology to temperature increases in temperate fruit trees have rarely been investigated across contrasting climatic regions. Such a framework is well suited to highlighting varying responses to diverse warming contexts, which potentially combine declines in chill accumulation (CA) with increases in heat accumulation (HA). To examine this issue, a data set was constituted for apple tree from flowering dates collected for two phenological stages of three cultivars in seven climate-contrasting temperate regions of Western Europe and in three mild regions, one in Northern Morocco and two in Southern Brazil. Multiple change-point models were applied to the flowering date series, as well as to the corresponding series of mean temperature during two successive periods that respectively determine the fulfillment of chill and heat requirements. A new overview in space and time of flowering date changes in apple tree was provided, highlighting not only flowering date advances, as in previous studies, but also stationary flowering date series. At the global scale, differentiated flowering time patterns result from varying interactions between contrasting thermal determinisms of flowering dates and contrasting warming contexts. This may explain flowering date advances in most European regions and in Morocco vs. stationary flowering date series in the Brazilian regions. A notable exception in Europe was found in the French Mediterranean region, where the flowering date series was stationary. While the flowering duration series were stationary whatever the region, the flowering durations were far longer in mild regions than in temperate regions. Our findings suggest a new warming vulnerability in temperate Mediterranean regions, which could shift toward responding more to chill decline and consequently experience late and extended flowering under future warming scenarios.
2012-01-01
Background Companion diagnostic tests can depend on accurate measurement of protein expression in tissues. Preanalytic variables, especially cold ischemic time (time from tissue removal to fixation in formalin), can affect the measurement and may cause false-negative results. We examined 23 proteins, including four commonly used breast cancer biomarker proteins, to quantify their sensitivity to cold ischemia in breast cancer tissues. Methods A series of 93 breast cancer specimens with known time-to-fixation represented in a tissue microarray and a second series of 25 matched pairs of core needle biopsies and breast cancer resections were used to evaluate changes in antigenicity as a function of cold ischemic time. Estrogen receptor (ER), progesterone receptor (PgR), HER2 or Ki67, and 19 other antigens were tested. Each antigen was measured using the AQUA method of quantitative immunofluorescence on at least one series. All statistical tests were two-sided. Results We found no evidence for loss of antigenicity with time-to-fixation for ER, PgR, HER2, or Ki67 in a 4-hour time window. However, with a bootstrapping analysis, we observed a trend toward loss for ER and PgR, a statistically significant loss of antigenicity for phosphorylated tyrosine (P = .0048), and trends toward loss for other proteins. There was evidence of increased antigenicity in acetylated lysine, AKAP13 (P = .009), and HIF1A (P = .046), which are proteins known to be expressed in conditions of hypoxia. The loss of antigenicity for phosphorylated tyrosine and the increases in expression of AKAP13 and HIF1A were confirmed in the biopsy/resection series. Conclusions Key breast cancer biomarkers show no evidence of loss of antigenicity, although this dataset assesses only a relatively short time beyond the 1-hour limit in recent guidelines. Other proteins show changes in antigenicity in both directions.
Future studies that extend the time range and normalize for heterogeneity will provide more comprehensive information on preanalytic variation due to cold ischemic time. PMID:23090068
Multivariate time series modeling of short-term system scale irrigation demand
NASA Astrophysics Data System (ADS)
Perera, Kushan C.; Western, Andrew W.; George, Biju; Nawarathna, Bandara
2015-12-01
Travel time limits the ability of irrigation system operators to react to short-term irrigation demand fluctuations that result from variations in weather, including very hot periods and rainfall events, as well as the various other pressures and opportunities that farmers face. Short-term system-wide irrigation demand forecasts can assist in system operation. Here we developed a multivariate time series (ARMAX) model to forecast irrigation demands with respect to aggregated service point flows (IDCGi, ASP) and off-take regulator flows (IDCGi, OTR) across five command areas, comprising the areas served by four irrigation channels and the overall study area. These command-area-specific ARMAX models forecast daily IDCGi, ASP and IDCGi, OTR 1-5 days ahead using real-time flow data recorded at the service points and the uppermost regulators, together with observed meteorological data collected from automatic weather stations. The model efficiency and predictive performance were quantified using the root mean squared error (RMSE), Nash-Sutcliffe model efficiency coefficient (NSE), anomaly correlation coefficient (ACC) and mean square skill score (MSSS). During the evaluation period, NSE for IDCGi, ASP and IDCGi, OTR across the five command areas ranged from 0.78 to 0.98. The models were capable of generating skillful forecasts (MSSS ⩾ 0.5 and ACC ⩾ 0.6) of IDCGi, ASP and IDCGi, OTR for all five lead days, and these forecasts were better than using the long-term monthly mean irrigation demand. Overall, the predictive performance of the ARMAX time series models was higher than in almost all previous studies we are aware of.
Further, the IDCGi, ASP and IDCGi, OTR forecasts improved the operators' ability to react to near-future irrigation demand fluctuations, as the developed ARMAX time series models are self-adaptive, reflecting short-term changes in irrigation demand driven by the various pressures and opportunities that farmers face, such as changing water policy, the continued development of water markets, drought and changing technology.
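The skill metrics used above (NSE, MSSS) are standard and easy to reproduce. A minimal Python sketch with illustrative numbers rather than the study's flow data; note that when the reference forecast is the climatological mean, MSSS reduces to NSE:

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance around the observed mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    svar = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / svar

def msss(obs, sim, ref):
    """Mean square skill score of `sim` against a reference forecast
    (e.g. the long-term monthly mean demand)."""
    mse = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
    return 1.0 - mse(obs, sim) / mse(obs, ref)

# Illustrative demand series (not the study's data)
obs = [10.0, 12.0, 15.0, 11.0, 9.0]
sim = [10.5, 11.5, 14.0, 11.2, 9.3]
ref = [11.4] * 5            # climatological mean as the reference forecast
print(round(nse(obs, sim), 3))        # → 0.923
print(round(msss(obs, sim, ref), 3))  # → 0.923 (equals NSE for this reference)
```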
Future projection of design storms using a GCM-informed weather generator
NASA Astrophysics Data System (ADS)
Kim, T. W.; Wi, S.; Valdés-Pineda, R.; Valdés, J. B.
2017-12-01
Rainfall Intensity-Duration-Frequency (IDF) curves are one of the most common tools used to provide planners with a description of the frequency of extreme rainfall events of various intensities and durations. Therefore, deriving appropriate IDF estimates is important to avoid failures of water structures that can cause severe damage. Evaluating IDF estimates in the context of climate change has become more important because projections from climate models suggest that the frequency of intense rainfall events will increase in the future due to the increase in greenhouse gas emissions. In this study, the Bartlett-Lewis (BL) stochastic rainfall model is employed to generate annual maximum series of various sub-daily durations for test basins of the Model Parameter Estimation Experiment (MOPEX) project, and to derive IDF curves in the context of the climate changes projected by the North American Regional Climate Change Assessment Program (NARCCAP) models. From our results, it has been found that the observed annual rainfall maximum series is reasonably represented by the synthetic annual maximum series generated by the BL model. The observed data are perturbed by change factors to incorporate the NARCCAP climate change scenarios into the IDF estimates. The future IDF curves show a significant difference from the historical IDF curves calculated for the period 1968-2000. Overall, the projected IDF curves show an increasing trend over time. The impacts of changes in extreme rainfall on the hydrologic response of the MOPEX basins are also explored. Acknowledgement: This research was supported by a grant [MPSS-NH-2015-79] through the Disaster and Safety Management Institute funded by the Ministry of Public Safety and Security of the Korean government.
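The change-factor step can be sketched as follows. This is an assumption-laden illustration, not the study's BL-model procedure: invented annual maxima, a multiplicative change factor, and a Gumbel fit by the method of moments to obtain a T-year return level for one duration of an IDF curve:

```python
import math
import statistics

def gumbel_return_level(annual_max, T):
    """T-year return level from a Gumbel fit by the method of moments."""
    mean, sd = statistics.mean(annual_max), statistics.stdev(annual_max)
    beta = sd * math.sqrt(6) / math.pi      # Gumbel scale parameter
    mu = mean - 0.5772 * beta               # Gumbel location (Euler constant)
    return mu - beta * math.log(-math.log(1 - 1 / T))

obs_amax = [22.0, 31.0, 27.5, 40.2, 25.1, 33.3, 29.0, 36.8]  # mm/h, illustrative
change_factor = 1.15                        # e.g. from a climate projection
fut_amax = [x * change_factor for x in obs_amax]

for T in (10, 50):
    print(T, round(gumbel_return_level(obs_amax, T), 1),
          round(gumbel_return_level(fut_amax, T), 1))
```

Because the method-of-moments fit is scale-equivariant, a multiplicative change factor simply scales every return level by the same factor, which is the essence of the perturbation approach.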
Future Directions for Fusion Propulsion Research at NASA
NASA Technical Reports Server (NTRS)
Adams, Robert B.; Cassibry, Jason T.
2005-01-01
Fusion propulsion is inevitable if the human race remains dedicated to exploration of the solar system. There are fundamental reasons why fusion surpasses more traditional approaches for routine crewed missions to Mars, crewed missions to the outer planets, and deep space high speed robotic missions, assuming that reduced trip times, increased payloads, and higher available power are desired. A recent series of informal discussions were held among members from government, academia, and industry concerning fusion propulsion. We compiled a sufficient set of arguments for utilizing fusion in space. If the U.S. is to lead the effort and produce a working system in a reasonable amount of time, NASA must take the initiative, relying on, but not waiting for, DOE guidance. Arguments for fusion propulsion are presented, along with fusion-enabled mission examples, the fusion technology trade space, and a proposed outline for future efforts.
Scaling properties of Polish rain series
NASA Astrophysics Data System (ADS)
Licznar, P.
2009-04-01
Scaling properties and the multifractal nature of precipitation time series had not been studied for local Polish conditions until recently, due to the lack of long series of high-resolution data. The first Polish study of precipitation time series scaling phenomena was made on the basis of pluviograph data from the Wroclaw University of Environmental and Life Sciences meteorological station, located in the south-western part of the country. The 38 annual rainfall records from the years 1962-2004 were converted into digital format and transformed into a standard format of 5-minute time series. The scaling properties and multifractal character of this material were studied by means of several different techniques: power spectral density analysis, functional box-counting, probability distribution/multiple scaling and trace moment methods. The results demonstrated the general scaling character of the time series over time scales ranging from 5 minutes up to at least 24 hours. At the same time, some characteristic breaks in the scaling behavior were recognized. It is believed that these breaks are artificial, arising from the limited measuring precision of the pluviograph rain gauge. In particular, the limited precision in recording low-intensity precipitation was found to be the main cause of the artificial break in the energy spectra, as reported by other authors before. The analysis of the codimension and moment scaling functions showed signs of a first-order multifractal phase transition. Such behavior is typical of dressed multifractal processes that are observed by spatial or temporal averaging on scales larger than the inner scale of those processes. The fractal dimension of the rainfall process support derived from the geometry of the codimension and moment scaling functions was found to be 0.45. The same fractal dimension estimated by means of the functional box-counting method was equal to 0.58.
In the final part of the study, the double trace moment method was used to estimate the local universal multifractal rainfall parameters (α = 0.69; C1 = 0.34; H = -0.01). The research demonstrated the fractal character of the rainfall process support and the multifractal character of the variability of rainfall intensity values in the analyzed time series. It is believed that the scaling of local Wroclaw rainfall over timescales from 24 hours down to 5 minutes opens the door for future research concerning, for example, the implementation of random cascades for disaggregating daily precipitation totals into smaller time intervals. The output of such random cascades, in the form of 5-minute artificial rainfall scenarios, could be of great practical use for urban hydrology and for the design and hydrodynamic modeling of storm water and combined sewage conveyance systems.
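The trace moment technique can be illustrated with a short sketch: aggregate a positive intensity series over increasing block sizes and estimate the moment scaling exponent K(q) from the log-log slope of the q-th moment against scale. The data and parameters below are illustrative stand-ins, not the Wroclaw series:

```python
import math
import random

def aggregate(series, m):
    """Average the series over non-overlapping blocks of length m."""
    return [sum(series[i:i + m]) / m for i in range(0, len(series) - m + 1, m)]

def K_estimate(series, q, block_sizes=(1, 2, 4, 8, 16)):
    """Estimate K(q): the q-th moment of the series aggregated to a coarser
    resolution is expected to scale as (scale ratio)**K(q)."""
    xs, ys = [], []
    for m in block_sizes:
        agg = aggregate(series, m)
        moment = sum(a ** q for a in agg) / len(agg)
        xs.append(math.log(m))
        ys.append(math.log(moment))
    # ordinary least-squares slope of log-moment vs log-block-size
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope       # moments decay under aggregation for q > 1

random.seed(1)
rain = [random.lognormvariate(0, 1) for _ in range(1024)]  # stand-in intensities
print(round(K_estimate(rain, 2), 3))   # positive: variability is scale-dependent
```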
NASA Astrophysics Data System (ADS)
Sharma, A.; Woldemeskel, F. M.; Sivakumar, B.; Mehrotra, R.
2014-12-01
We outline a new framework for assessing uncertainties in model simulations, be they hydro-ecological simulations for known scenarios, or climate simulations for assumed scenarios representing the future. This framework is illustrated here using GCM projections for future climates for hydrologically relevant variables (precipitation and temperature), with the uncertainty segregated into three dominant components: model uncertainty, scenario uncertainty (representing greenhouse gas emission scenarios), and ensemble uncertainty (representing uncertain initial conditions and states). A novel uncertainty metric, the Square Root Error Variance (SREV), is used to quantify the uncertainties involved. Computing the SREV requires: (1) interpolating raw and corrected GCM outputs to a common grid; (2) converting these to percentiles; (3) estimating SREV for model, scenario, initial condition and total uncertainty at each percentile; and (4) transforming SREV to a time series. The outcome is a spatially varying series of SREVs associated with each model that can be used to assess how uncertain the system is at each simulated point or time. This framework, while illustrated in a climate change context, is applicable to the assessment of uncertainty in any modelling framework. The proposed method is applied to monthly precipitation and temperature from 6 CMIP3 and 13 CMIP5 GCMs across the world. For CMIP3, the B1, A1B and A2 scenarios are considered, whereas for CMIP5, the RCP2.6, RCP4.5 and RCP8.5 scenarios, representing low, medium and high emissions, are used. For both CMIP3 and CMIP5, model structure is the largest source of uncertainty, which reduces significantly after correcting for biases. Scenario uncertainty increases in the future, especially for temperature, due to the divergence of the three emission scenarios analysed. While CMIP5 precipitation simulations exhibit a small reduction in total uncertainty relative to CMIP3, there is almost no reduction observed for temperature projections.
Estimating uncertainty in both space and time sheds light on the spatial and temporal patterns of uncertainty in GCM outputs, providing an effective platform for risk-based assessments of any alternative plans or decisions that may be formulated using GCM simulations.
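A highly simplified sketch of step (3) for the model-uncertainty component only: the square root of the error variance across models, computed percentile rank by percentile rank. The published method also involves regridding, bias correction and the percentile transform, all omitted here, and the model values are invented:

```python
import math
import statistics

def srev_model_component(model_series):
    """model_series: dict mapping model name -> list of simulated values.
    Each model's values are sorted (a crude percentile transform), then the
    spread across models about the multi-model mean is summarized, per
    percentile rank, as a square-root error variance."""
    cols = list(zip(*(sorted(v) for v in model_series.values())))
    out = []
    for col in cols:                         # one percentile rank at a time
        mean = statistics.mean(col)
        var = sum((v - mean) ** 2 for v in col) / len(col)
        out.append(math.sqrt(var))
    return out

# Invented GCM outputs for one grid cell (names are placeholders)
models = {
    "gcm_a": [1.0, 2.0, 3.0, 4.0],
    "gcm_b": [1.5, 2.5, 3.5, 4.5],
    "gcm_c": [0.5, 1.5, 2.5, 3.5],
}
print([round(s, 3) for s in srev_model_component(models)])
# → [0.408, 0.408, 0.408, 0.408]: the three models disagree by a constant offset
```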
NASA Astrophysics Data System (ADS)
Berton, R.; Shaw, S. B.; Chandler, D. G.; Driscoll, C. T.
2014-12-01
Climatic change affects streamflow in watersheds with a winter snowpack and an annual snowmelt hydrograph. In the northeastern US, changes in streamflow are driven by both the advanced timing of snowmelt and increasing summer precipitation. Climate projections for the region in the 21st century are for warmer winters and wetter summers. Water planners need to understand future changes in flow metrics to determine whether current water resources are capable of fulfilling future demands or adapting to future changes in climate. The study of teleconnection patterns between variations in oceanic indices and hydrologic variables may help improve the understanding of future water resource conditions in a watershed. The purpose of this study is to evaluate the correlation between oceanic indices and discharge variations in the Merrimack Watershed. The Merrimack Watershed is the fourth largest basin in New England, draining much of New Hampshire and northeastern portions of Massachusetts, USA. Variations in sea surface temperature (SST) and sea level pressure (SLP) are characterized by the Atlantic Multi-decadal Oscillation (AMO) and the North Atlantic Oscillation (NAO), respectively. We hypothesize that temporal changes in discharge are related to AMO and NAO variations, since precipitation and discharge are highly correlated in the Merrimack. The Merrimack Watershed consists of undisturbed (reference) catchments and disturbed (developed) basins with long stream gauge records (> 100 years). Developed basins provide an opportunity to evaluate the impacts of river regulation and land development on teleconnection patterns as well as changing climate.
Time series of the AMO and NAO indices over the past 150 years, along with Merrimack annual precipitation and discharge time series, have shown a 1 to 2-year watershed hydrologic memory; higher correlations of Merrimack annual precipitation and discharge with AMO and NAO are observed when a 1 to 2-year lag is applied to the AMO and NAO indices. For instance, the mean correlation of AMO with precipitation/discharge for a zero-year lag was 0.16/0.09 and increased to 0.26/0.23 for a 1-year lag. Our study provides insight into the lagged hydrologic response of reference catchments and developed basins to variations in oceanic indices.
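The lagged correlation analysis described above is straightforward to sketch. The index and discharge anomalies below are invented so that the discharge trails the index by exactly one year, producing the kind of lag-dependent correlation increase the study reports:

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def lagged_corr(index, discharge, lag):
    """Correlate index[t] with discharge[t + lag] (lag >= 0, in years)."""
    if lag == 0:
        return pearson(index, discharge)
    return pearson(index[:-lag], discharge[lag:])

# Invented anomalies: discharge is the index shifted forward by one year
amo = [0.1, -0.2, 0.3, 0.0, 0.4, -0.1, 0.2, 0.5, -0.3, 0.1]
q   = [0.9, 0.1, -0.2, 0.3, 0.0, 0.4, -0.1, 0.2, 0.5, -0.3]
print(round(lagged_corr(amo, q, 0), 2))
print(round(lagged_corr(amo, q, 1), 2))   # → 1.0 for this constructed example
```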
CU Prime Diversity Workshops: Creating Spaces for Growth Amongst Organizers
NASA Astrophysics Data System (ADS)
Hyater-Adams, Simone
2016-03-01
CU Prime is a graduate-student-run organization that was created to promote community and inclusion among students in the CU Physics Department. With a mission to improve the experiences of students, especially those underrepresented in the department and field, the core organizers developed three programs: a seminar series, a class, and a mentorship program. However, because this is strictly volunteer time for most organizers, there is little time for development and growth as a group. In response, we developed a series of diversity workshops for the group, in order to provide space and time for organizers to reflect on and grapple with difficult issues around diversity and inclusion that are important to consider when running these programs. With a structure based on readings, informal videos, and reflection, there have been five workshops on topics ranging from gender in physics to how to be an ally. We overview the structure and framing of these workshops, along with the challenges and successes encountered in developing them, as well as plans for future development.
Rainfall height stochastic modelling as a support tool for landslides early warning
NASA Astrophysics Data System (ADS)
Capparelli, G.; Giorgio, M.; Greco, R.; Versace, P.
2009-04-01
Occurrence of landslides is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and rainfall spatial and temporal variability. Although heavy landslides frequently occurred in Campania, southern Italy, during the last decade, no complete data sets are available for the natural slopes where landslides occurred. As a consequence, landslide risk assessment procedures and early warning systems in Campania still rely on simple empirical models based on the correlation between daily rainfall records and observed landslides, like the FLAIR model [Versace et al., 2003]. The effectiveness of such systems could be improved by reliable quantitative rainfall prediction. In mountainous areas, rainfall spatial and temporal variability are very pronounced due to orographic effects, making predictions even more complicated. Existing rain gauge networks are not dense enough to resolve the small-scale spatial variability, and the same limitation of spatial resolution affects rainfall height maps provided by radar sensors as well as by physically based meteorological models. Therefore, analysis of on-site recorded rainfall height time series still represents the most effective approach for a reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR and ARMA [Box and Jenkins, 1976]. Sometimes exogenous information coming from additional series of observations is also taken into account, and the models are called ARX and ARMAX (e.g. Salas [1992]). Such models gave the best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series.
Conversely, they are not able to model the behaviour of intermittent time series, like point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this purpose are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted in conjunction with the FLAIR model to calculate the probability of flowslide occurrence. The final aim of the study is in fact to provide a useful tool for implementing an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of the Campania Region civil protection agency meteorological warning network. So far, the model has been applied only to data series recorded at a single rain gauge. Future extension will deal with spatial correlation between time series recorded at different gauges. ACKNOWLEDGEMENTS The research was co-financed by the Italian Ministry of University, by means of the PRIN 2006 program, within the research project entitled 'Definition of critical rainfall thresholds for destructive landslides for civil protection purposes'. REFERENCES Box, G.E.P. and Jenkins, G.M., 1976. Time Series Analysis: Forecasting and Control, Holden-Day, San Francisco. Cowpertwait, P.S.P., Kilsby, C.G. and O'Connell, P.E., 2002. A space-time Neyman-Scott model of rainfall: Empirical analysis of extremes, Water Resources Research, 38(8):1-14. Salas, J.D., 1992. Analysis and modeling of hydrological time series, in D.R. Maidment, ed., Handbook of Hydrology, McGraw-Hill, New York. Heneker, T.M., Lambert, M.F. and Kuczera, G., 2001. A point rainfall model for risk-based design, Journal of Hydrology, 247(1-2):54-71. Versace, P., Sirangelo, B. and Capparelli, G., 2003.
Forewarning model of landslides triggered by rainfall. Proc. 3rd International Conference on Debris-Flow Hazards Mitigation: Mechanics, Prediction and Assessment, Davos.
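The pulse-based models cited above (DRIP, NSRP) build rainfall from rectangular intensity pulses. A minimal single-layer sketch (a plain Poisson rectangular pulse process, simpler than either model, with invented parameters) shows how such models produce the intermittency that AR-type models reproduce poorly:

```python
import random

def rect_pulse_series(n_steps, dt=1.0, arrival_rate=0.02,
                      mean_duration=5.0, mean_intensity=2.0, seed=0):
    """Storm origins arrive as a Poisson process; each pulse has an
    exponentially distributed duration and intensity; overlapping pulses
    superpose. Units and parameter values are illustrative."""
    rng = random.Random(seed)
    series = [0.0] * n_steps
    t = rng.expovariate(arrival_rate)
    while t < n_steps * dt:
        dur = rng.expovariate(1.0 / mean_duration)
        inten = rng.expovariate(1.0 / mean_intensity)
        start = int(t / dt)
        end = min(n_steps, int((t + dur) / dt) + 1)
        for i in range(start, end):
            series[i] += inten          # overlapping pulses add up
        t += rng.expovariate(arrival_rate)
    return series

rain = rect_pulse_series(1000)
dry = sum(1 for r in rain if r == 0.0) / len(rain)
print(round(dry, 2))   # large dry fraction: the series is intermittent
```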
ERIC Educational Resources Information Center
Johnston, Lloyd D.; O'Malley, Patrick M.; Bachman, Jerald G.; Schulenberg, John E.
2010-01-01
Now in its 35th year, Monitoring the Future (MTF) is a long-term program of research conducted at the University of Michigan's Institute for Social Research under a series of investigator-initiated research grants from the National Institute on Drug Abuse. The study is comprised of several ongoing series of annual surveys of nationally…
ERIC Educational Resources Information Center
Johnston, Lloyd D.; O'Malley, Patrick M.; Bachman, Jerald G.; Schulenberg, John E.; Miech, Richard A.
2014-01-01
This occasional paper presents national demographic subgroup trends for U.S. secondary school students in a series of figures and tables. It supplements two of four annual monographs from the Monitoring the Future (MTF) study, namely the "Overview of Key Findings" and "Volume I: Secondary School Students." MTF is funded by the…
ERIC Educational Resources Information Center
Johnston, Lloyd D.; O'Malley, Patrick M.; Bachman, Jerald G.; Schulenberg, John E.
2007-01-01
Monitoring the Future is a long-term program of research being conducted at the University of Michigan's Institute for Social Research under a series of investigator-initiated research grants from the National Institute on Drug Abuse. Now in its 32nd year, the study is comprised of several ongoing series of annual surveys of nationally…
Circular analysis in complex stochastic systems
Valleriani, Angelo
2015-01-01
Ruling out observations can lead to wrong models. This danger arises unwittingly when one selects observations, experiments, simulations or time series based on their outcome. In stochastic processes, conditioning on the future outcome biases all local transition probabilities and makes them consistent with the selected outcome. This circular self-consistency leads to models that are inconsistent with physical reality. It is also the reason why models built solely on macroscopic observations are prone to this fallacy. PMID:26656656
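The bias from conditioning on an outcome is easy to demonstrate numerically. In this sketch (not from the paper), a fair ±1 random walk is selected on having a positive final sum, and the per-step "up" frequency among the selected paths is inflated above the true value of 0.5:

```python
import random

def selected_up_frequency(n_paths=2000, n_steps=20, seed=42):
    """Simulate fair coin-flip walks, keep only those ending above zero,
    and return the empirical per-step probability of an up-move among them."""
    rng = random.Random(seed)
    ups = steps = 0
    for _ in range(n_paths):
        path = [rng.choice((-1, 1)) for _ in range(n_steps)]
        if sum(path) > 0:           # condition on the "successful" outcome
            ups += path.count(1)
            steps += n_steps
    return ups / steps

print(round(selected_up_frequency(), 3))   # noticeably above the true 0.5
```

The local dynamics are unbiased by construction; the apparent upward drift exists only because the selection made every retained transition consistent with the chosen outcome.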
Neural network approaches to capture temporal information
NASA Astrophysics Data System (ADS)
van Veelen, Martijn; Nijhuis, Jos; Spaanenburg, Ben
2000-05-01
The automated design and construction of neural networks is receiving growing attention from the neural networks community. Both the growing availability of computing power and the development of mathematical and probabilistic theory have had a profound impact on the design and modelling approaches for neural networks. This impact is most apparent in the application of neural networks to time series prediction. In this paper, we give our views on past, contemporary and future design and modelling approaches to neural forecasting.
Development of Alabama Resources Information System (ARIS)
NASA Technical Reports Server (NTRS)
Herring, B. E.; Vachon, R. I.
1976-01-01
A formal, organized set of information concerning the development status of the Alabama Resources Information System (ARIS) as of September 1976 is provided. A series of computer source-language programs is presented, together with flow charts for each program to make future changes easier. Listings of the variable names used in the various source-code programs, with their meanings, and copies of the user manuals prepared up to this time are given.
1991-06-01
commercial products. The D-series of reports includes publications of the Environmental Effects of Dredging Programs: Dredging Operations Technical Support…insufficient data are available, areas for future productive research are recommended. The major amount of information available is for the upland area, where…Consequently, the upland, wetland, and aquatic areas that appear either as an end product or transiently at all CDFs are permanently established
NASA Astrophysics Data System (ADS)
Liu, Xiaojia; An, Haizhong; Wang, Lijun; Guan, Qing
2017-09-01
The moving average strategy is a technical indicator that can generate trading signals to assist investment. While the trading signals tell traders when to buy or sell, the moving average cannot tell the trading volume, which is a crucial factor for investment. This paper proposes a fuzzy moving average strategy, in which a fuzzy logic rule is used to determine the strength of trading signals, i.e., the trading volume. To compose one fuzzy logic rule, we use four types of moving averages, the length of the moving average period, the fuzzy extent, and the recommended value. Ten fuzzy logic rules form a fuzzy set, which generates a rating level that decides the trading volume. In this process, we apply genetic algorithms to identify an optimal fuzzy logic rule set and use crude oil futures prices from the New York Mercantile Exchange (NYMEX) as the experiment data. Each experiment is repeated 20 times. The results show that, first, the fuzzy moving average strategy can obtain a more stable rate of return than plain moving average strategies. Second, the holding-amount series is highly sensitive to the price series. Third, simple moving average methods are more efficient. Lastly, the fuzzy extents of extremely low, high, and very high are the most popular. These results are helpful in investment decisions.
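The crossover logic behind such a strategy can be sketched in a few lines; the window lengths, the 2% saturation point used to grade signal strength, and the price series are illustrative assumptions, not the genetically optimized fuzzy rule set of the paper:

```python
import numpy as np

def sma(prices, window):
    """Simple moving average; entries before the first full window are NaN."""
    out = np.full(len(prices), np.nan)
    for i in range(window - 1, len(prices)):
        out[i] = prices[i - window + 1 : i + 1].mean()
    return out

def fuzzy_signal(prices, short=3, long=5):
    """Crossover sign gives the buy/sell timing; a saturating membership on
    the normalized SMA gap grades signal strength, standing in for the
    paper's rated trading volume."""
    s, l = sma(prices, short), sma(prices, long)
    gap = (s - l) / l                                 # relative divergence
    strength = np.clip(np.abs(gap) / 0.02, 0.0, 1.0)  # full strength at a 2% gap
    side = np.sign(gap)                               # +1 buy, -1 sell
    return side, strength

prices = np.array([10, 10.2, 10.5, 10.9, 11.4, 11.2, 10.8, 10.3, 9.9, 9.6])
side, strength = fuzzy_signal(prices)  # uptrend first, downtrend at the end
```

A full fuzzy rule set would combine several such memberships (per moving-average type, period, and extent) and aggregate them into the rating level that sets the volume.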
Recombinant Temporal Aberration Detection Algorithms for Enhanced Biosurveillance
Murphy, Sean Patrick; Burkom, Howard
2008-01-01
Objective Broadly, this research aims to improve the outbreak detection performance and, therefore, the cost effectiveness of automated syndromic surveillance systems by building novel, recombinant temporal aberration detection algorithms from components of previously developed detectors. Methods This study decomposes existing temporal aberration detection algorithms into two sequential stages and investigates the individual impact of each stage on outbreak detection performance. The data forecasting stage (Stage 1) generates predictions of time series values a certain number of time steps in the future based on historical data. The anomaly measure stage (Stage 2) compares features of this prediction to corresponding features of the actual time series to compute a statistical anomaly measure. A Monte Carlo simulation procedure is then used to examine the recombinant algorithms’ ability to detect synthetic aberrations injected into authentic syndromic time series. Results New methods obtained with procedural components of published, sometimes widely used, algorithms were compared to the known methods using authentic datasets with plausible stochastic injected signals. Performance improvements were found for some of the recombinant methods, and these improvements were consistent over a range of data types, outbreak types, and outbreak sizes. For gradual outbreaks, the WEWD MovAvg7+WEWD Z-Score recombinant algorithm performed best; for sudden outbreaks, the HW+WEWD Z-Score performed best. Conclusion This decomposition was found not only to yield valuable insight into the effects of the aberration detection algorithms but also to produce novel combinations of data forecasters and anomaly measures with enhanced detection performance. PMID:17947614
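A minimal recombination of the two stages might pair a 7-point moving-average forecaster (Stage 1) with a residual Z-score (Stage 2); the baseline series and injected spike below are synthetic illustrations, not the study's authentic data:

```python
import numpy as np

def movavg_forecast(series, window=7):
    """Stage 1: predict each value as the mean of the previous `window` values."""
    preds = np.full(len(series), np.nan)
    for t in range(window, len(series)):
        preds[t] = series[t - window : t].mean()
    return preds

def zscore_anomaly(series, preds, window=7):
    """Stage 2: standardize the forecast residual by the recent residual spread."""
    resid = series - preds
    scores = np.full(len(series), np.nan)
    for t in range(2 * window, len(series)):
        hist = resid[t - window : t]
        sd = hist.std(ddof=1)
        scores[t] = (resid[t] - hist.mean()) / sd if sd > 0 else 0.0
    return scores

counts = 20.0 + 0.5 * np.sin(np.arange(60))  # gently varying baseline
counts[50] += 10                             # injected aberration
scores = zscore_anomaly(counts, movavg_forecast(counts))
```

Swapping in a different Stage 1 forecaster (e.g. Holt-Winters) or Stage 2 measure yields the kind of recombinant variants the study compares.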
What the ADEA CCI series of articles means to me: reflections of a mid-career dental faculty member.
Novak, Karen F
2009-02-01
In this reflection article, Dr. Karen Novak, a mid-career faculty member at a U.S. dental school, identifies important messages and insights she gained from a series of twenty-one articles about the future of dental education published in the Journal of Dental Education from October 2005 to February 2009. This article addresses four questions: 1) What influence have these articles had on an academic dentist's perspectives about her role and priorities as a dental school faculty member and her own career plans and future directions? 2) What are the key messages in these articles for other dental educators who are at similar places in their careers? 3) What additional topics concerning the future of academic dentistry should be covered in future articles? and 4) What issues and priorities should receive the most attention from academic dentistry in the next decade? The American Dental Education Association's Commission on Change and Innovation in Dental Education (ADEA CCI) was established to provide a mechanism for stakeholders in academic dentistry to meet and consider future directions in the education of the nation's dental workforce. Along with ADEA, these stakeholders included dental schools, the American Dental Association (ADA) Board of Trustees, the Commission on Dental Accreditation (CODA), the ADA Council on Dental Education and Licensure (CDEL), the Joint Commission on National Dental Examinations (JCNDE), the dental licensure community, the ADA Foundation, and advanced dental education programs. The ADEA CCI was created to build consensus within the dental community for innovative changes in the education of general dentists. One outcome of this process was a series of articles intended to raise awareness and stimulate dialogue about issues and forces shaping the future of dental education. 
Collectively, this series of articles is known as the Perspectives and Reflections in Dental Education (PRIDE) series to acknowledge the commitment of the academic dental community to reflect on current practices and future directions and also to represent the pride of dental school faculty members in their educational responsibilities and accomplishments.
NASA Astrophysics Data System (ADS)
Dodds, S. F.; Mock, C. J.
2009-12-01
All available instrumental winter precipitation data for the Central Valley of California back to 1850 were digitized and analyzed to construct continuous time series. Many of these data, in paper or microfilm format, extend prior to modern National Weather Service Cooperative Data Program and Historical Climate Network data, and were recorded by volunteer observers from networks such as the US Army Surgeon General, Smithsonian Institution, and US Army Signal Service. Given the temporal incompleteness of individual records, detailed documentary data from newspapers, personal diaries and journals, ship logbooks, and weather enthusiasts’ instrumental data were used in conjunction with instrumental data to reconstruct precipitation frequency per month and season and continuous days of precipitation, and to identify anomalous precipitation events. Multiple linear regression techniques, using surrounding stations and the relationships between modern and historical records, bridged timeframes lacking data and ensured the homogeneity of the time series. The metadata for each station were carefully screened, and notes were made about any possible changes to the instrumentation, the location of instruments, or an untrained observer, to verify that anomalous events were not recorded incorrectly. Precipitation in the Central Valley varies throughout the entire region, but waterways link the differing elevations and latitudes. This study integrates the individual station data with additional accounts of flood descriptions through unique newspaper and journal data. River heights and flood extents inundating cities, agricultural lands, and individual homes are often recorded within unique documentary sources, which add to the understanding of flood occurrence within this area. Comparisons were also made between dam and levee construction through time and how waters are diverted through cities in natural and anthropogenically changed environments.
Some precipitation events that led to flooding in the Central Valley from the mid-19th century through the early 20th century stand out at particular stations as more extreme than anything in the modern record. Flood years included in the study are 1850, 1862, 1868, 1878, 1881, 1890, and 1907. These flood years were compared to the modern record and reconstructed through time series and maps. Incorporating the extent and effects of these anomalous events into future climate studies could improve models and preparedness for future floods.
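The regression-based bridging of data gaps described above can be sketched as follows; the two "neighbor" stations, their coefficients, and the 20-month gap are synthetic stand-ins for the historical station networks:

```python
import numpy as np

# Synthetic records: a target station linearly related to two neighbors.
rng = np.random.default_rng(2)
n = 120  # months
neighbor1 = 50 + 30 * np.abs(np.sin(np.linspace(0, 10, n))) + rng.normal(0, 3, n)
neighbor2 = 0.8 * neighbor1 + rng.normal(0, 3, n)
target = 0.6 * neighbor1 + 0.3 * neighbor2 + 5 + rng.normal(0, 2, n)

missing = np.zeros(n, dtype=bool)
missing[40:60] = True  # a 20-month gap in the target record

# Fit the regression on the overlap, then infill the gap.
X = np.column_stack([neighbor1, neighbor2, np.ones(n)])
coef, *_ = np.linalg.lstsq(X[~missing], target[~missing], rcond=None)
filled = target.copy()
filled[missing] = X[missing] @ coef

err = np.abs(filled[missing] - target[missing]).mean()  # infill error vs. truth
```

In practice the screened station metadata would decide which neighbors and which overlap periods are trustworthy enough to enter the fit.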
New Approach To Hour-By-Hour Weather Forecast
NASA Astrophysics Data System (ADS)
Liao, Q. Q.; Wang, B.
2017-12-01
Fine hourly forecasts at a single station are required in many applications of production and daily life. Most previous MOS (Model Output Statistics) approaches used a linear regression model, which has difficulty capturing the nonlinear nature of weather prediction, and forecast accuracy has not been sufficient at high temporal resolution. This study predicts future meteorological elements, including temperature, precipitation, relative humidity and wind speed, in a local region over a relatively short period at the hourly level. Using hour-by-hour NWP (Numerical Weather Prediction) meteorological fields from Forecast.io (https://darksky.net/dev/docs/forecast) and real-time instrumental observations from 29 stations in Yunnan and 3 stations in Tianjin, China, from June to October 2016, hour-by-hour predictions are made up to 24 hours ahead. This study presents an ensemble approach that combines information from the instrumental observations themselves and from NWP. An autoregressive moving-average (ARMA) model is used to predict future values of the observation time series. The newest NWP products are fed into the equations derived from the multiple linear regression MOS technique. The residual series of the MOS outputs is handled with an autoregressive (AR) model, exploiting the linear properties present in the time series. Because of the complex nonlinear nature of atmospheric flow, a support vector machine (SVM) is also introduced. Basic data quality control and cross-validation make it possible to optimize the model function parameters and to perform 24-hour-ahead residual reduction with the AR/SVM models. Results show that the AR model technique is better than the corresponding multi-variate MOS regression method, especially in the first 4 hours when the predictor is temperature. The combined MOS-AR model, which is comparable to the MOS-SVM model, outperforms MOS alone. Their root mean square error and correlation coefficient for 2 m temperature are 1.6 degrees Celsius and 0.91, respectively. The 24-hour forecast accuracy, defined as a deviation of no more than 2 degrees Celsius, is 78.75% for the MOS-AR model and 81.23% for the AR model.
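The MOS-plus-AR residual correction at the core of the method can be sketched as follows; the synthetic NWP predictor, the AR(1) residual process, and all coefficients are illustrative assumptions, not the study's data:

```python
import numpy as np

# Toy setup: obs = a*nwp + b + AR(1) noise. MOS recovers (a, b); an AR(1)
# fit on the MOS residuals then supplies the hour-ahead correction.
rng = np.random.default_rng(1)
nwp = 15 + 5 * np.sin(np.linspace(0, 6 * np.pi, 200))  # hourly NWP temperature
e = np.zeros(200)
for t in range(1, 200):
    e[t] = 0.8 * e[t - 1] + rng.normal(0, 0.3)          # persistent model error
obs = 1.1 * nwp - 2.0 + e

# MOS step: least-squares fit of observations on the NWP predictor.
A = np.column_stack([nwp, np.ones_like(nwp)])
coef, *_ = np.linalg.lstsq(A, obs, rcond=None)
mos = A @ coef
resid = obs - mos

# AR(1) step: phi from lag-1 regression, then a one-step residual correction.
phi = (resid[:-1] @ resid[1:]) / (resid[:-1] @ resid[:-1])
corrected = mos[1:] + phi * resid[:-1]

rmse_mos = np.sqrt(np.mean((obs[1:] - mos[1:]) ** 2))
rmse_ar = np.sqrt(np.mean((obs[1:] - corrected) ** 2))
```

The SVM variant would replace the AR step with a nonlinear regression on the same residual series.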
Impact of the climate change to shallow groundwater in Baltic artesian basin
NASA Astrophysics Data System (ADS)
Lauva, D.; Bethers, P.; Timuhins, A.; Sennikovs, J.
2012-04-01
The purpose of our work was to find the long-term pattern of annual shallow groundwater changes in the region of Latvia, to model groundwater levels for the contemporary climate and future climate scenarios, and to generalize the model to the Baltic artesian basin (BAB) region. Latvia is located in the middle part of BAB and occupies about 65,000 square kilometers. The BAB territory (480,000 square kilometers) also includes Lithuania and Estonia as well as parts of Poland, Russia, Belarus and the Baltic Sea, and is more than seven times larger than Latvia. Precipitation and spring snowmelt are the main sources of groundwater recharge in the BAB territory. The long-term pattern of annual shallow groundwater changes was extracted from the data of 25 monitoring wells in the territory of Latvia. The main Latvian groundwater level fluctuation regime can be described as a function with two maxima (in spring and late autumn) and two minima (in winter and late summer). The mathematical model METUL (developed by the Latvian University of Agriculture) was chosen for the groundwater modelling. It was calibrated on the observations in 25 gauging wells around Latvia. After the calibration we made calculations using data provided by an ensemble of regional climate models, yielding a continuous groundwater table time series from 1961 to 2100, which was analysed and split into three time windows for further analysis: contemporary climate (1961-1990), near future (2021-2050) and far future (2071-2100). The daily average temperature, precipitation and humidity time series were used as METUL forcing parameters. The statistical downscaling method (Sennikovs and Bethers, 2009) was applied for the bias correction of RCM-calculated and measured variables. Qualitative differences between the future and contemporary annual groundwater regimes are expected.
According to the RCM climate projections, the future Latvian annual groundwater cycle changes to a curve with a single peak and a single low point. Acknowledgements. This research was supported by the European Social Fund project "Establishment of interdisciplinary scientist group and modelling system for groundwater research" (Project Nr. 2009/0212/1DP/1.1.1.2.0/09/APIA/VIAA/060). Regional climate model data was provided through the ENSEMBLES data archive, funded by the EU FP6 Integrated Project ENSEMBLES (Contract number 505539). Reference: Sennikovs, J., Bethers, U. 2009. Statistical downscaling method of regional climate model results for hydrological modelling. In: Proceedings of 18th World IMACS / MODSIM Congress.
Why didn't Box-Jenkins win (again)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pack, D.J.; Downing, D.J.
This paper focuses on the forecasting performance of the Box-Jenkins methodology applied to the 111 time series of the Makridakis competition. It considers the influence of the following factors: (1) time-series length, (2) time-series information (autocorrelation) content, (3) time-series outliers or structural changes, (4) averaging results over time series, and (5) forecast time origin choice. It is found that the 111 time series contain substantial numbers of very short series, series with obvious structural change, and series whose histories are relatively uninformative. If these series are typical of those that one must face in practice, the real message of the competition is that univariate time series extrapolations will frequently fail regardless of the methodology employed to produce them.
Zhang, Yingying; Wang, Juncheng; Vorontsov, A M; Hou, Guangli; Nikanorova, M N; Wang, Hongliang
2014-01-01
The international marine ecological safety monitoring demonstration station in the Yellow Sea was developed as a collaborative project between China and Russia. It is a nonprofit technical workstation designed as a facility for marine scientific research for public welfare. By undertaking long-term monitoring of the marine environment and automatic data collection, this station will provide valuable information for marine ecological protection and disaster prevention and reduction. The results of some initial research by scientists at the research station into predictive modeling of marine ecological environments and early warning are described in this paper. Marine ecological processes are influenced by many factors including hydrological and meteorological conditions, biological factors, and human activities. Consequently, it is very difficult to incorporate all these influences and their interactions in a deterministic or analytical model. A prediction model integrating a time series prediction approach with neural network nonlinear modeling is proposed for marine ecological parameters. The model explores the natural fluctuations in marine ecological parameters by learning from the latest observed data automatically, and then predicting future values of the parameter. The model is updated in a "rolling" fashion with new observed data from the monitoring station. Prediction experiments showed that the neural network prediction model based on time series data is effective for marine ecological prediction and can be used in the development of early warning systems.
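The "rolling" refit-and-predict update can be sketched with a small linear autoregression standing in for the station's neural network; the window sizes and the synthetic daily-cycle parameter are illustrative assumptions:

```python
import numpy as np

def rolling_forecast(series, order=4, train=60):
    """One-step-ahead forecasts, refitting a small AR model on the latest
    `train` observations before every prediction (the 'rolling' update)."""
    def lag_matrix(y):
        # Row i holds [y_{t-1}, ..., y_{t-order}, 1] for target y_t.
        X = np.column_stack([y[order - 1 - j : len(y) - 1 - j] for j in range(order)])
        return np.column_stack([X, np.ones(len(X))]), y[order:]
    preds = []
    for t in range(train, len(series)):
        X, targ = lag_matrix(series[t - train : t])   # refit on newest data
        coef, *_ = np.linalg.lstsq(X, targ, rcond=None)
        feats = np.concatenate([series[t - 1 : t - 1 - order : -1], [1.0]])
        preds.append(feats @ coef)                    # one-step forecast
    return np.array(preds)

t = np.arange(200)
temp = 20 + 3 * np.sin(2 * np.pi * t / 24)  # a daily-cycle-like parameter
preds = rolling_forecast(temp)
err = np.abs(preds - temp[60:]).mean()
```

Replacing the least-squares fit with a neural network trained on the same sliding window gives the structure described in the abstract.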
Aschieri, Filippo; Smith, Justin D
2012-01-01
This article presents the therapeutic assessment (TA; Finn, 2007) of a traumatized young woman named Claire. Claire reported feeling debilitated by academic demands and the expectations of her parents, and was finding it nearly impossible to progress in her studies. She was also finding it difficult to develop and sustain intimate relationships. The emotional aspects of close relationships were extremely difficult for her and she routinely blamed herself for her struggles in this arena. The assessor utilized the TA model for adults, with the exception of not including an optional intervention session. The steps of TA, particularly the extended inquiry and the discussion of test findings along the way, cultivated a supportive and empathic atmosphere with Claire. By employing the single-case time-series experimental design used in previous TA studies (e.g., Smith, Handler, & Nash, 2010; Smith, Wolf, Handler, & Nash, 2009), the authors demonstrated that Claire experienced statistically significant improvement correlated with the onset of TA. Results indicated that participation in TA coincided with a positive shift in the trajectory of her reported symptoms and with recognizing the affection she held for others in her life. This case illustrates the successful application of case-based time-series methodology in the evaluation of an adult TA. The potential implications for future study are discussed.
125 years of glacier survey of the Austrian Alpine Club: results and future challenges
NASA Astrophysics Data System (ADS)
Fischer, Andrea
2016-04-01
One of the aims of the German and Austrian Alpine Club was the scientific investigation of the Alps. In 1891, several years after Swiss initiatives, Richter put out a call to contribute to regular glacier length surveys in the Eastern Alps. Since then more than 100 glaciers have been surveyed, first on a biannual and later on an annual basis. The database includes measured data showing a general glacier retreat since 1891, with two periods of glacier advances in the 1920s and 1980s. Less well known are the sketches and reports which illustrate, for instance, changes in surface texture. The interpretation of length change data requires a larger sample for reasonable conclusions on a regional scale. Nearly every time series in the long history of investigation includes gaps, e.g. in cases of problematic snout positions on steep rock walls or in lakes, or of debris-covered tongues. Current climate change adds the problem of glaciers splitting up into several smaller glaciers which behave differently. Several basic questions need to be addressed to arrive at the most accurate prolonged time series: How should measurements on disintegrating or debris-covered (and thus more or less stagnating) glaciers be documented, and how can we homogenize length change time series? Despite these uncertainties, length change data are amongst the longest available records, bridging the gap to moraine dating of the early Holocene.
Detrended Cross Correlation Analysis: a new way to figure out the underlying cause of global warming
NASA Astrophysics Data System (ADS)
Hazra, S.; Bera, S. K.
2016-12-01
Analysing non-stationary time series is a challenging task in earth science, seismology, solar physics, climate, biology, finance, etc. In most cases, external noise such as oscillations, high-frequency noise and low-frequency noise at different scales leads to erroneous results. Many statistical methods have been proposed to find the correlation between two non-stationary time series. N. Scafetta and B. J. West, Phys. Rev. Lett. 90, 248701 (2003), reported a strong relationship between solar flare intermittency (SFI) and global temperature anomalies (GTA) using diffusion entropy analysis. It has recently been shown that detrended cross-correlation analysis (DCCA) is a better technique for removing the effects of unwanted signals as well as local and periodic trends. DCCA is thus more suitable for finding the correlation between two non-stationary time series, and it allows the correlation coefficient to be estimated at different scales. Motivated by this, we have applied the DCCA technique to find the relationship between SFI and GTA. We have also applied this technique to the relationship between GTA and carbon dioxide density, and between GTA and methane density, in the Earth's atmosphere. In future work we will try to find the relationship between GTA and aerosols, water vapour density and ozone depletion in the Earth's atmosphere. This analysis will help improve understanding of the causes of global warming.
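A minimal DCCA cross-correlation coefficient, following the usual recipe (integrate both series, detrend linearly in boxes of scale n, normalize the detrended covariance by the two detrended variances), might look like this; the non-overlapping box scheme and the test series are illustrative choices:

```python
import numpy as np

def dcca_rho(x, y, n):
    """DCCA coefficient at scale n: detrended covariance of the integrated
    profiles, normalized by the two detrended variances (rho in [-1, 1])."""
    X = np.cumsum(x - x.mean())
    Y = np.cumsum(y - y.mean())
    t = np.arange(n)
    cov_xy = var_x = var_y = 0.0
    boxes = 0
    for start in range(0, len(x) - n + 1, n):  # non-overlapping boxes
        xs, ys = X[start : start + n], Y[start : start + n]
        dx = xs - np.polyval(np.polyfit(t, xs, 1), t)  # remove local linear trend
        dy = ys - np.polyval(np.polyfit(t, ys, 1), t)
        cov_xy += (dx * dy).mean()
        var_x += (dx * dx).mean()
        var_y += (dy * dy).mean()
        boxes += 1
    return (cov_xy / boxes) / np.sqrt((var_x / boxes) * (var_y / boxes))

# Two noisy copies of a common signal should show strong cross-correlation.
rng = np.random.default_rng(3)
common = rng.normal(0, 1, 500)
a = common + 0.3 * rng.normal(0, 1, 500)
b = common + 0.3 * rng.normal(0, 1, 500)
rho = dcca_rho(a, b, 20)
```

Scanning n over a range of scales gives the scale-dependent correlation profile the abstract refers to.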
Multifractal analysis of visibility graph-based Ito-related connectivity time series.
Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano
2016-02-01
In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by the Ito equations with multiplicative power-law noise. We show that multifractality of the connectivity time series (i.e., the series of numbers of links outgoing any node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of connectivity degree distribution that can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the procedure of visibility graph that, connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive for detecting wide "depressions" in input time series.
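The natural visibility graph that generates the connectivity series can be sketched directly from its geometric criterion; this brute-force version is for illustration only and scales poorly with series length:

```python
import numpy as np

def visibility_degrees(series):
    """Degree of each node in the natural visibility graph: points
    (a, y_a) and (b, y_b) are linked if every intermediate point lies
    strictly below the straight line joining them."""
    n = len(series)
    deg = np.zeros(n, dtype=int)
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                series[c] < series[b] + (series[a] - series[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                deg[a] += 1
                deg[b] += 1
    return deg

# The connectivity time series studied above is exactly this degree sequence.
deg = visibility_degrees(np.array([1.0, 3.0, 2.0, 4.0, 1.5]))
```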
Time-series analysis of sleep wake stage of rat EEG using time-dependent pattern entropy
NASA Astrophysics Data System (ADS)
Ishizaki, Ryuji; Shinba, Toshikazu; Mugishima, Go; Haraguchi, Hikaru; Inoue, Masayoshi
2008-05-01
We performed electroencephalography (EEG) for six male Wistar rats to clarify temporal behaviors at different levels of consciousness. Levels were identified both by conventional sleep analysis methods and by our novel entropy method. In our method, time-dependent pattern entropy is introduced, by which EEG is reduced to binary symbolic dynamics and the pattern of symbols in a sliding temporal window is considered. A high correlation was obtained between level of consciousness as measured by the conventional method and mean entropy in our entropy method. Mean entropy was maximal while awake (stage W) and decreased as sleep deepened. These results suggest that time-dependent pattern entropy may offer a promising method for future sleep research.
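The sliding-window pattern entropy idea can be sketched as follows; the median binarization threshold, window length, and pattern length m are illustrative choices rather than the study's parameters:

```python
import numpy as np
from collections import Counter

def pattern_entropy(signal, window=50, m=3):
    """Reduce the signal to binary symbols (above/below its median), then
    compute the Shannon entropy of length-m symbol patterns in each
    sliding window."""
    symbols = (signal > np.median(signal)).astype(int)
    ent = []
    for start in range(len(signal) - window + 1):
        w = symbols[start : start + window]
        pats = Counter(tuple(w[i : i + m]) for i in range(window - m + 1))
        total = sum(pats.values())
        p = np.array([c / total for c in pats.values()])
        ent.append(float(-(p * np.log2(p)).sum()))
    return np.array(ent)

rng = np.random.default_rng(4)
noisy = rng.normal(0, 1, 300)                      # irregular, awake-like
regular = np.sin(np.linspace(0, 30 * np.pi, 300))  # rhythmic, sleep-like
e_noisy = pattern_entropy(noisy).mean()
e_regular = pattern_entropy(regular).mean()
```

Consistent with the abstract's finding, an irregular signal yields a higher mean pattern entropy than a rhythmic one.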
Design and analysis considerations for deployment mechanisms in a space environment
NASA Technical Reports Server (NTRS)
Vorlicek, P. L.; Gore, J. V.; Plescia, C. T.
1982-01-01
On the second flight of the INTELSAT V spacecraft, the time required for successful deployment of the north solar array was longer than originally predicted; the south solar array deployed as predicted. As a result of the difference in deployment times, a series of experiments was conducted to locate its cause. Deployment rate sensitivity to hinge friction and temperature levels was investigated. A digital computer simulation of the deployment was created to evaluate the effects of parameter changes on deployment. The hinge design was optimized for nominal solar array deployment time for future INTELSAT V satellites. The nominal deployment times of both solar arrays on the third flight of INTELSAT V confirm the validity of the simulation and design optimization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan
A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.
Appropriating Video Surveillance for Art and Environmental Awareness: Experiences from ARTiVIS.
Mendes, Mónica; Ângelo, Pedro; Correia, Nuno; Nisi, Valentina
2018-06-01
Arts, Real-Time Video and Interactivity for Sustainability (ARTiVIS) is an ongoing collaborative research project investigating how real-time video, DIY surveillance technologies and sensor data can be used as a tool for environmental awareness, activism and artistic explorations. The project consists of a series of digital contexts for aesthetic contemplation of nature and civic engagement, aiming to foster awareness and empowerment of local populations through DIY surveillance. At the core of the ARTiVIS efforts are a series of interactive installations (namely B-Wind!, Hug@tree and Play with Fire) that use surveillance technologies and real-time video as raw material to promote environmental awareness through the emotion generated by real-time connections with nature. Throughout the project's development, the surveillance concept has been shifting from the use of surveillance technology in a centralized platform to the idea of veillance with distributed peer-to-peer networks that can be used for science and environmental monitoring. In this paper we present the history of the ARTiVIS project and related and inspiring work, describe ongoing research, and explore the present and future challenges of appropriating surveillance technology for artistic, educational and civic engagement purposes.
NASA Technical Reports Server (NTRS)
Milesi, Cristina; Costa-Cabral, Mariza; Rath, John; Mills, William; Roy, Sujoy; Thrasher, Bridget; Wang, Weile; Chiang, Felicia; Loewenstein, Max; Podolske, James
2014-01-01
Water resource managers planning for the adaptation to future events of extreme precipitation now have access to high-resolution downscaled daily projections derived from statistical bias correction and constructed analogs. We also show that along the Pacific Coast the Northern Oscillation Index (NOI) is a reliable predictor of storm likelihood, and therefore a predictor of seasonal precipitation totals and of the likelihood of extremely intense precipitation. Such time series can be used to project intensity-duration curves into the future or as input to stormwater models. However, few climate projection studies have explored the impact of the downscaling method used on the range and uncertainty of predictions for local flood protection studies. Here we present a study of future climate flood risk at NASA Ames Research Center, located in the South Bay Area, comparing the range of predictions of extreme precipitation events calculated from three sets of time series downscaled from CMIP5 data: 1) the Bias Correction Constructed Analogs method downscaled to a 1/8 degree (12 km) grid; 2) the Bias Correction Spatial Disaggregation method downscaled to a 1 km grid; and 3) a statistical model of extreme daily precipitation events and projected NOI from CMIP5 models. In addition, predicted years of extreme precipitation are used to estimate the risk of overtopping of the retention pond located on the site through simulations with the EPA SWMM hydrologic model. Preliminary results indicate that the intensity of extreme precipitation events is expected to increase and flood the NASA Ames retention pond. The results from these estimates will assist flood protection managers in planning infrastructure adaptations.
Urbanization and stream ecology: Diverse mechanisms of change
Roy, Allison; Capps, Krista A.; El-Sabaawi, Rana W.; Jones, Krista L.; Parr, Thomas B.; Ramirez, Alonso; Smith, Robert F.; Walsh, Christopher J.; Wenger, Seth J.
2016-01-01
The field of urban stream ecology has evolved rapidly in the last 3 decades, and it now includes natural scientists from numerous disciplines working with social scientists, landscape planners and designers, and land and water managers to address complex, socioecological problems that have manifested in urban landscapes. Over the last decade, stream ecologists have met 3 times at the Symposium on Urbanization and Stream Ecology (SUSE) to discuss current research, identify knowledge gaps, and promote future research collaborations. The papers in this special series on urbanization and stream ecology include both primary research studies and conceptual synthesis papers spurred from discussions at SUSE in May 2014. The themes of the meeting are reflected in the papers in this series emphasizing global differences in mechanisms and responses of stream ecosystems to urbanization and management solutions in diverse urban streams. Our hope is that this series will encourage continued interdisciplinary and collaborative research to increase the global understanding of urban stream ecology toward stream protection and restoration in urban landscapes.
ÖGRO survey on radiotherapy capacity in Austria : Status quo and estimation of future demands.
Zurl, Brigitte; Bayerl, Anja; De Vries, Alexander; Geinitz, Hans; Hawliczek, Robert; Knocke-Abulesz, Tomas-Henrik; Lukas, Peter; Pötter, Richard; Raunik, Wolfgang; Scholz, Brigitte; Schratter-Sehn, Annemarie; Sedlmayer, Felix; Seewald, Dietmar; Selzer, Edgar; Kapp, Karin S
2018-04-01
A comprehensive evaluation of the current national and regional radiotherapy capacity in Austria, with an estimation of demands for 2020 and 2030, was performed by the Austrian Society for Radiation Oncology, Radiobiology and Medical Radiophysics (ÖGRO). All Austrian centers provided data on the number of megavoltage (MV) units, treatment series, fractions, percentage of retreatments and complex treatment techniques, as well as the daily operating hours, for the year 2014. In addition, waiting times until the beginning of radiotherapy were prospectively recorded over the first quarter of 2015. National and international epidemiological prediction data were used to estimate future demands. For a population of 8.51 million, 43 MV units were available. In 14 radiooncological centers, a total of 19,940 series, with a mean number of 464 patients per MV unit per year and a mean fraction number of 20 (range 16-24) per case, were recorded. The average re-irradiation ratio was 14%. The survey on waiting times until start of treatment showed provision shortages in 40% of centers, with a mean waiting time of 13.6 days (range 0.5-29.3 days) and a mean maximum waiting time of 98.2 days. Of all centers, 21% had no or only a limited ability to deliver complex treatment techniques. Predictions for 2020 and 2030 indicate an increased need for MV units, to totals of 63 and 71, respectively. This ÖGRO survey revealed major regional differences in radiooncological capacity. Considering epidemiological developments, an aggravation of the situation can be expected shortly. This analysis serves as a basis for improved regional public health care planning.
NASA Astrophysics Data System (ADS)
Spaans, Karsten; Hatton, Emma; Gonzalez, Pablo; Walters, Richard; McDougall, Alistair; Wright, Tim; Hooper, Andy
2017-04-01
The advantages of the Sentinel-1 constellation for InSAR applications over previous radar missions are numerous, and include small baselines, a planned operation time of 20 years, continuous and systematic acquisition of data over tectonic and volcanic areas, near-global coverage of the earth and free data availability. In order to take advantage of these properties, we at the Centre for the Observation and Modelling of Earthquakes, Volcanoes, and Tectonics (COMET) are developing a system that routinely processes and freely distributes interferometric products and time series over tectonic and volcanic regions. This project, and similar efforts at other institutions, will be a game changer for monitoring and studying tectonic and volcanic activity using InSAR. Since December 2016, the COMET-LiCS InSAR portal (http://comet.nerc.ac.uk/COMET-LiCS-portal/) has been live, delivering interferograms and coherence estimates over the entire Alpine-Himalayan belt. The portal already contains tens of thousands of products, which can be browsed in a user-friendly interface and downloaded for free by the general public. For our processing, we use the Climate and Environmental Monitoring from Space (CEMS) facility, where we have large storage and processing facilities at our disposal and a complete duplicate of the Sentinel-1 archive is maintained. This greatly simplifies the infrastructure we have had to develop for automated processing of large areas. Here we will give an overview of the current status of the processing system, as well as discuss future plans. We will cover the infrastructure we developed to automatically produce interferograms and its challenges, and the processing strategy for time series analysis. We will outline the objectives of the system in the near and distant future, and a roadmap for its continued development. Finally, we will highlight some of the scientific results and projects linked to the system.
NASA Astrophysics Data System (ADS)
Shean, D. E.; Joughin, I.; Smith, B.; Floricioiu, D.
2015-12-01
Greenland's large marine-terminating outlet glaciers have displayed marked retreat, speedup, and thinning in recent decades. Jakobshavn Isbrae, one of Greenland's largest outlet glaciers, has retreated ~15 km, accelerated ~150%, and thinned ~200 m since the early 1990s. Here, we present a comprehensive analysis of high-resolution elevation (~2-5 m/px) and velocity (~100 m/px) time series with dense temporal coverage (daily-monthly). The Jakobshavn DEM time series consists of >70 WorldView-1/2/3 stereo DEMs and >11 TanDEM-X DEMs spanning 2008-2015. Complementary point elevation data from Operation IceBridge (ATM, LVIS), pre-IceBridge ATM flights, and ICESat-1 GLAS extend the surface elevation record to 1999 and provide essential absolute control data, enabling sub-meter horizontal/vertical accuracy for gridded DEMs. Velocity data are primarily derived from TerraSAR-X/TanDEM-X image pairs with 11-day interval from 2009-2015. These elevation and velocity data capture outlet glacier evolution with unprecedented detail during the post-ICESat era. The lower trunk of Jakobshavn displays significant seasonal velocity variations, with recent rates of ~8 km/yr during winter and >17 km/yr during summer. DEM data show corresponding seasonal elevation changes of -30 to -45 m in summer and +15 to +20 m in winter, with decreasing magnitude upstream. Seasonal discharge varies from ~30-35 Gt/yr in winter to ~45-55 Gt/yr in summer, and we integrate these measurements for improved long-term mass-balance estimates. Recent interannual trends show increased discharge, velocity, and thinning (-15 to -20 m/yr), which is consistent with long-term altimetry records. The DEM time series also reveal new details about calving front and mélange evolution during the seasonal cycle. Similar time series are available for Kangerdlugssuaq and Helheim Glaciers. 
These observations are improving our understanding of outlet glacier dynamics, while complementing ongoing efforts to constrain estimates for ice-sheet mass balance and present/future sea level rise contributions.
Smith, Justin D.; Borckardt, Jeffrey J.; Nash, Michael R.
2013-01-01
The case-based time-series design is a viable methodology for treatment outcome research. However, the literature has not fully addressed the problem of missing observations with such autocorrelated data streams. Specifically, to what extent do missing observations compromise inference when observations are not independent? Do the available missing data replacement procedures preserve inferential integrity? Does the extent of autocorrelation matter? We use Monte Carlo simulation modeling of a single-subject intervention study to address these questions. We find power sensitivity to be within acceptable limits across four proportions of missing observations (10%, 20%, 30%, and 40%) when missing data are replaced using the Expectation-Maximization Algorithm, more commonly known as the EM Procedure (Dempster, Laird, & Rubin, 1977). This applies to data streams with lag-1 autocorrelation estimates under 0.80. As autocorrelation estimates approach 0.80, the replacement procedure yields an unacceptable power profile. The implications of these findings and directions for future research are discussed. PMID:22697454
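The setting can be illustrated with a minimal sketch. This is not the study's Monte Carlo design: it uses a simplified EM-style scheme for a lag-1 autocorrelated (AR(1)) stream, where the E-step fills each missing point with its conditional mean given its neighbours (assuming unit innovation variance) and the M-step re-estimates the autocorrelation from the completed series. All function names are illustrative:

```python
import random

def ar1_series(n, phi, seed=0):
    """Simulate a lag-1 autocorrelated (AR(1)) data stream."""
    rng = random.Random(seed)
    x = [rng.gauss(0, 1)]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0, 1))
    return x

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation estimate."""
    m = sum(x) / len(x)
    num = sum((x[t] - m) * (x[t - 1] - m) for t in range(1, len(x)))
    den = sum((v - m) ** 2 for v in x)
    return num / den

def em_impute(x, missing, iters=20):
    """EM-style replacement: alternate filling each missing index with its
    AR(1) conditional mean given its neighbours (E-step) and re-estimating
    the autocorrelation from the completed series (M-step)."""
    y = list(x)
    obs = [v for i, v in enumerate(y) if i not in missing]
    mean = sum(obs) / len(obs)
    for i in missing:
        y[i] = mean                      # initial fill with the observed mean
    phi = 0.0
    for _ in range(iters):
        phi = lag1_autocorr(y)
        for i in sorted(missing):        # conditional mean given both neighbours
            left = y[i - 1] if i > 0 else mean
            right = y[i + 1] if i + 1 < len(y) else mean
            y[i] = phi * (left + right) / (1 + phi * phi)
    return y, phi
```

As the abstract notes, schemes of this kind degrade as the true autocorrelation approaches 0.80, since the conditional fills increasingly smooth over the very structure being estimated.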
Portfolio management under sudden changes in volatility and heterogeneous investment horizons
NASA Astrophysics Data System (ADS)
Fernandez, Viviana; Lucey, Brian M.
2007-03-01
We analyze the implications for portfolio management of accounting for conditional heteroskedasticity and sudden changes in volatility, based on a sample of weekly data of the Dow Jones Country Titans, the CBT-municipal bond, and spot and futures prices of commodities for the period 1992-2005. To that end, we first use the ICSS algorithm to detect long-term volatility shifts, and incorporate that information into PGARCH models fitted to the returns series. At the next stage, we simulate returns series and compute a wavelet-based value at risk, which takes into consideration the investor's time horizon. We repeat the same procedure for artificial data generated from semi-parametric estimates of the distribution functions of returns, which account for fat tails. Our estimation results show that neglecting GARCH effects and volatility shifts may lead to an overestimation of financial risk at different time horizons. In addition, we conclude that investors benefit from holding commodities, as their low or even negative correlation with stock and bond indices contributes to portfolio diversification.
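The first stage, detecting sudden variance shifts, can be sketched for the single-break case via the Inclan-Tiao centered cumulative sum of squares that underlies the ICSS algorithm; the full algorithm applies this test iteratively to sub-segments, which is omitted here, and the function name is illustrative:

```python
def icss_single_break(returns):
    """Inclan-Tiao centered cumulative sum of squares for one variance break:
    returns the most likely break index k* and max_k sqrt(T/2)*|D_k|, to be
    compared with the 1.358 asymptotic 95% critical value."""
    T = len(returns)
    csum, c = [], 0.0
    for r in returns:
        c += r * r                       # running sum of squared returns
        csum.append(c)
    ct = csum[-1]
    best_k, best_d = 1, -1.0
    for k in range(1, T):
        d = abs(csum[k - 1] / ct - k / T)  # centered statistic D_k
        if d > best_d:
            best_k, best_d = k, d
    return best_k, (T / 2.0) ** 0.5 * best_d
```

A statistic above 1.358 flags a variance regime change; the break dates found this way are then fed to the PGARCH stage as dummy shifts.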
Artificial neural networks for modeling time series of beach litter in the southern North Sea.
Schulz, Marcus; Matthies, Michael
2014-07-01
In European marine waters, existing monitoring programs of beach litter need to be improved with respect to the litter items used as indicators of pollution levels, efficiency, and effectiveness. In order to ease and focus future monitoring of beach litter on a few important litter items, feed-forward neural networks consisting of three layers were developed to relate single litter items to general categories of marine litter. The neural networks developed were applied to seven beaches in the southern North Sea and modeled time series of five general categories of marine litter, such as litter from fishing, shipping, and tourism. Results of regression analyses show that the general categories were predicted moderately to well, with statistically significant regressions. Measured and modeled data were of the same order of magnitude, and minima and maxima overlapped well. Neural networks were found to be eligible tools for delivering reliable predictions of marine litter with low computational effort and little input of information. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Sen, Asok K.; Ogrin, Darko
2016-02-01
Long instrumental records of meteorological variables such as temperature and precipitation are very useful for studying regional climate in the past, present, and future. They can also be useful for understanding the influence of large-scale atmospheric circulation processes on the regional climate. This paper investigates the monthly, winter, and annual temperature time series obtained from the instrumental records in Zagreb, Croatia, for the period 1864-2010. Using wavelet analysis, the dominant modes of variability in these temperature series are identified, and the time intervals over which these modes may persist are delineated. The results reveal that all three temperature records exhibit low-frequency variability with a dominant periodicity at around 7.7 years. The 7.7-year cycle has also been observed in the temperature data recorded at several other stations in Europe, especially in Northern and Western Europe, and may be linked to the North Atlantic Oscillation (NAO) and/or solar/geomagnetic activity.
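The paper locates the dominant mode with wavelet analysis; a plain periodogram, sketched below as a simpler stand-in (not the authors' method), illustrates the basic idea of finding the dominant periodicity in an annual record:

```python
import math

def dominant_period(x, dt=1.0):
    """Return the period (in units of dt) with the largest periodogram power,
    excluding the zero frequency; a plain DFT is adequate for short records."""
    n = len(x)
    m = sum(x) / n
    xc = [v - m for v in x]              # remove the mean
    best_k, best_p = 1, -1.0
    for k in range(1, n // 2 + 1):
        re = sum(xc[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(xc[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        p = re * re + im * im            # periodogram power at frequency k/n
        if p > best_p:
            best_k, best_p = k, p
    return n * dt / best_k
```

Unlike the wavelet transform, this gives no information about *when* the mode is active, which is why the paper also delineates the time intervals over which the 7.7-year cycle persists.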
Observing and Understanding Tropospheric Ozone Changes
NASA Astrophysics Data System (ADS)
Logan, Jennifer; Schultz, Martin; Oltmans, Samuel
2010-03-01
Tropospheric Ozone Changes Workshop; Boulder, Colorado, 14-16 October 2009; Prompted by the lack of consensus on, and the need to assess current understanding of, long-term changes in tropospheric ozone, a workshop was held in Colorado to (1) evaluate the consistency of data records; (2) assess robust long-term changes; (3) determine how to combine observations and model studies; and (4) define research and observation needs for the future. At the workshop, long-term ozone records from regionally representative surface and mountain sites, ozonesondes, and aircraft were reviewed by region. In western Europe there are several time series of ~15-40 years from all platforms. Overall, they show a rise in ozone into the middle to late 1990s and a leveling off, or in some cases declines, in the 2000s, in general agreement with precursor emission changes. However, significant differences in detail in the time series from nearby locations provide less confidence in changes before the late 1990s.
Regenerating time series from ordinal networks.
McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael
2017-03-01
Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discretely sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
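The construction can be sketched compactly: embed the series, map each window to its rank permutation (ordinal pattern), count pattern-to-pattern transitions, and regenerate a symbolic sequence by a weighted random walk. Mapping walk states back to amplitude values, as the paper requires for full surrogate time series, is omitted here, and the function names are illustrative:

```python
import random
from collections import defaultdict

def ordinal_patterns(x, m=3):
    """Map each length-m window to its rank permutation (ordinal pattern)."""
    return [tuple(sorted(range(m), key=lambda i: x[t + i]))
            for t in range(len(x) - m + 1)]

def ordinal_network(patterns):
    """Weighted directed graph of transitions between successive patterns."""
    net = defaultdict(lambda: defaultdict(int))
    for a, b in zip(patterns, patterns[1:]):
        net[a][b] += 1
    return net

def random_walk(net, start, steps, seed=0):
    """Regenerate a symbolic series by a weighted random walk on the network."""
    rng = random.Random(seed)
    node, out = start, [start]
    for _ in range(steps):
        nbrs = net.get(node)
        if not nbrs:                      # terminal node: no outgoing edges
            break
        choices, weights = zip(*nbrs.items())
        node = rng.choices(choices, weights=weights)[0]
        out.append(node)
    return out
```

Because transition weights are empirical frequencies, the walk is exactly the kind of stochastic approximation of the deterministic flow that the abstract describes.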
GPS Position Time Series @ JPL
NASA Technical Reports Server (NTRS)
Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen
2013-01-01
Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning raw time series; variations in the analysis and post-processing are driven by different users: JPL Global Time Series/Velocities, for researchers studying the reference frame and combining GPS with VLBI/SLR/DORIS; JPL/SOPAC Combined Time Series/Velocities, for crustal deformation in tectonic, volcanic, and ground water studies; and ARIA Time Series/Coseismic Data Products, focused on hazard monitoring and response. The ARIA data system is designed to integrate GPS and InSAR: GPS tropospheric delay is used for correcting InSAR, and Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. (InSAR time series analysis is covered separately by Zhen Liu.)
NASA Astrophysics Data System (ADS)
Li, Jiqing; Duan, Zhipeng; Huang, Jing
2018-06-01
As global climate change intensifies, the shortage of water resources in China is becoming increasingly serious. Using reasonable methods to study changes in precipitation is very important for planning and management of water resources. Based on the time series of precipitation in Beijing from 1951 to 2015, the multi-scale features of precipitation are analyzed by the Extreme-point Symmetric Mode Decomposition (ESMD) method to forecast the precipitation shift. The results show that the precipitation series have periodic changes of 2.6, 4.3, 14 and 21.7 years, and the variance contribution rate of each modal component shows that inter-annual variation dominates the precipitation in Beijing. It is predicted that precipitation in Beijing will continue to decrease in the near future.
Golan, Ofer; Ashwin, Emma; Granader, Yael; McClintock, Suzy; Day, Kate; Leggett, Victoria; Baron-Cohen, Simon
2010-03-01
This study evaluated The Transporters, an animated series designed to enhance emotion comprehension in children with autism spectrum conditions (ASC). n = 20 children with ASC (aged 4-7) watched The Transporters every day for 4 weeks. Participants were tested before and after intervention on emotional vocabulary and emotion recognition at three levels of generalization. Two matched control groups of children (ASC group, n = 18 and typically developing group, n = 18) were also assessed twice without any intervention. The intervention group improved significantly more than the clinical control group on all task levels, performing comparably to typical controls at Time 2. We conclude that using The Transporters significantly improves emotion recognition in children with ASC. Future research should evaluate the series' effectiveness with lower-functioning individuals.
Handbook for Conducting Future Studies in Education.
ERIC Educational Resources Information Center
Phi Delta Kappa, Bloomington, IN.
This handbook is designed to aid school administrators, policy-makers, and teachers in bringing a "futures orientation" to their schools. The first part of the book describes a "futuring process" developed as a tool for examining alternative future probabilities. It consists of a series of diverging and converging techniques that alternately…
ERIC Educational Resources Information Center
Johnston, Lloyd D.; Schulenberg, John E.; O'Malley, Patrick M.; Bachman, Jerald G.; Miech, Richard A.; Patrick, Megan E.
2017-01-01
This occasional paper presents subgroup findings from the Monitoring the Future (MTF) study on levels of, and trends in, the use of a number of substances for nationally representative samples of high school graduates ages 19-30. The data have been gathered in a series of follow-up surveys of representative subsamples of high school seniors who…
ERIC Educational Resources Information Center
Johnston, Lloyd D.; O'Malley, Patrick M.; Bachman, Jerald G.; Schulenberg, John E.; Miech, Richard A.
2016-01-01
This occasional paper presents subgroup findings from the Monitoring the Future (MTF) study on levels of, and trends in, the use of a number of substances for nationally representative samples of high school graduates ages 19-30. The data have been gathered in a series of follow-up surveys of representative subsamples of high school seniors who…
ERIC Educational Resources Information Center
Johnston, Lloyd D.; O'Malley, Patrick M.; Miech, Richard A.; Bachman, Jerald G.; Schulenberg, John E.
2016-01-01
This occasional paper presents national demographic subgroup data for the 1975-2015 Monitoring the Future (MTF) national survey results on 8th, 10th, and 12th graders' use of drugs, alcohol, and tobacco. MTF is funded by the National Institute on Drug Abuse at the National Institutes of Health under a series of investigator-initiated, competitive…
ERIC Educational Resources Information Center
Johnston, Lloyd D.; O'Malley, Patrick M.; Miech, Richard A.; Bachman, Jerald G.; Schulenberg, John E.
2015-01-01
This occasional paper presents national demographic subgroup data for the 1975-2014 Monitoring the Future (MTF) national survey results on 8th, 10th, and 12th graders' use of drugs, alcohol, and tobacco. MTF is funded by the National Institute on Drug Abuse at the National Institutes of Health under a series of investigator-initiated, competitive…
ERIC Educational Resources Information Center
Johnston, Lloyd D.; O'Malley, Patrick M.; Miech, Richard A.; Bachman, Jerald G.; Schulenberg, John E.
2017-01-01
This occasional paper presents national demographic subgroup data for the 1975-2016 Monitoring the Future (MTF) national survey results on 8th, 10th, and 12th graders' use of drugs, alcohol, and tobacco. MTF is funded by the National Institute on Drug Abuse at the National Institutes of Health under a series of investigator-initiated, competitive…
ERIC Educational Resources Information Center
O'Malley, Patrick M.; And Others
Conducted as part of the Monitoring the Future project, this study used a cohort-sequential design to examine period, age, and cohort effects on substance use among American youth between the ages of 18 and 28 from the high school classes of 1976 to 1986. This manuscript supersedes Paper 14 in the series which reported on American youth from 18-24…
ERIC Educational Resources Information Center
Johnston, Lloyd D.; O'Malley, Patrick M.; Bachman, Jerald G.; Schulenberg, John E.
2008-01-01
Monitoring the Future is a long-term program of research being conducted at the University of Michigan's Institute for Social Research under a series of investigator-initiated research grants from the National Institute on Drug Abuse. Now in its 33rd year, the study is comprised of several ongoing series of annual surveys of nationally…
ERIC Educational Resources Information Center
Johnston, Lloyd D.; O'Malley, Patrick M.; Bachman, Jerald G.; Schulenberg, John E.
2009-01-01
Monitoring the Future is a long-term program of research being conducted at the University of Michigan's Institute for Social Research under a series of investigator-initiated research grants from the National Institute on Drug Abuse. Now in its 34th year, the study is comprised of several ongoing series of annual surveys of nationally…
ERIC Educational Resources Information Center
Johnston, Lloyd D.; O'Malley, Patrick M.; Bachman, Jerald G.; Schulenberg, John E.; Miech, Richard A.
2015-01-01
This occasional paper presents subgroup findings from the Monitoring the Future (MTF) study on levels of and trends in the use of a number of substances for nationally representative samples of high school graduates ages 19-30. The data have been gathered in a series of follow-up surveys of representative subsamples of high school seniors who were…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jadun, Paige; McMillan, Colin; Steinberg, Daniel
This report is the first in a series of Electrification Futures Study (EFS) publications. The EFS is a multiyear research project to explore widespread electrification in the future energy system of the United States. More specifically, the EFS is designed to examine electric technology advancement and adoption for end uses in all major economic sectors as well as electricity consumption growth and load profiles, future power system infrastructure development and operations, and the economic and environmental implications of widespread electrification. Because of the expansive scope and the multiyear duration of the study, research findings and supporting data will be published as a series of reports, with each report released on its own timeframe.
Drought variability and change across the Iberian Peninsula
NASA Astrophysics Data System (ADS)
Coll, Joan Ramon; Aguilar, Enric
2015-04-01
Drought variability and change across the Iberian Peninsula are assessed in this study for the 20th century and the first decade of the 21st century using state-of-the-art drought indices: the Sc-PDSI, the SPI and the SPEI. Daily temperature and precipitation data from 24 time series regularly spread over the Iberian Peninsula are quality controlled and homogenized at a monthly scale to create the Monthly Iberian Temperature and Precipitation Series (MITPS) for the period 1906-2010. The Sc-PDSI, the 12-month SPI and the 12-month SPEI are computed on a monthly basis from the new MITPS dataset to identify dry and wet conditions over time. Only precipitation data are required to compute the SPI, but potential evapotranspiration (PET) is also needed for the Sc-PDSI and SPEI; it is estimated using Thornthwaite's method. The analysis conducted in this study confirms that drought conditions are worsening over most of the Iberian Peninsula, strongly driven by global warming, especially during the last three decades. All drought indices show a drying trend in the Pyrenees, the Ebro basin, central Iberia and the south and south-east, while a wetting trend is identified in the west and north-west. Future projections also indicate a clear increase in hydrological drought conditions over the 21st century; thus, water saving and the application of effective water management strategies will be crucial to minimize the impact of hydrological droughts on the Iberian Peninsula in the near future. KEY WORDS: Drought, climate change, Iberian Peninsula, drought indices.
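Of the three indices, the SPI is the simplest to sketch. The study fits a parametric distribution to precipitation accumulations; the empirical (rank-based) variant below is a common simplification, not the authors' implementation, and the Gringorten plotting position is an illustrative choice:

```python
from statistics import NormalDist

def empirical_spi(precip):
    """Empirical SPI: convert each precipitation accumulation to its
    plotting-position probability, then to a standard normal deviate.
    Negative values indicate dry conditions, positive values wet ones."""
    n = len(precip)
    order = sorted(range(n), key=lambda i: precip[i])
    nd = NormalDist()
    spi = [0.0] * n
    for rank, i in enumerate(order, start=1):
        p = (rank - 0.44) / (n + 0.12)   # Gringorten plotting position
        spi[i] = nd.inv_cdf(p)
    return spi
```

For the 12-month SPI of the paper, `precip` would hold 12-month running accumulations rather than raw monthly totals; the SPEI applies the same transform to precipitation minus PET.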
Guan, Hongjun; Dai, Zongli; Zhao, Aiwu; He, Jie
2018-01-01
In this paper, we propose a hybrid method to forecast stock prices, the High-order-fuzzy-fluctuation-Trends-based Back Propagation (HTBP) Neural Network model. First, we compare each value of the historical training data with the previous day's value to obtain a fluctuation trend time series (FTTS). The FTTS is then fuzzified into a fuzzy time series (FFTS) based on the amplitude and direction of the increasing, equal, and decreasing fluctuations. Since the relationship between the FFTS and future fluctuation trends is nonlinear, the HTBP neural network algorithm is used to find the mapping rules through self-learning. Finally, the algorithm's outputs are used to predict future fluctuations. The proposed model provides some innovative features: (1) it combines fuzzy set theory and a neural network algorithm to avoid the overfitting problems present in traditional models; (2) the BP neural network algorithm can intelligently explore the internal rules of sequential data, without the need to analyze the influence factors of specific rules and their paths of action; (3) the hybrid model reasonably removes noise from the internal rules through proper fuzzy treatment. This paper takes the TAIEX data set of the Taiwan stock exchange as an example and analyzes the prediction performance of the model. The experimental results show that this method can predict the stock market in a very simple way. We also use this method to predict the Shanghai stock exchange composite index, further verifying the effectiveness and universality of the method. PMID:29420584
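The first two steps, building the fluctuation trend series and fuzzifying it, can be sketched as follows. The threshold rule is an illustrative choice, not the paper's exact membership scheme, and the BP network stage is omitted:

```python
def fluctuation_trends(prices):
    """First differences of the price series (the FTTS of the abstract)."""
    return [b - a for a, b in zip(prices, prices[1:])]

def fuzzify(ftts, k=0.5):
    """Fuzzify fluctuations into linguistic labels using a threshold of k
    standard deviations (an assumed rule): -1 = decreasing, 0 = roughly
    equal, 1 = increasing."""
    m = sum(ftts) / len(ftts)
    sd = (sum((v - m) ** 2 for v in ftts) / len(ftts)) ** 0.5
    return [1 if v > k * sd else -1 if v < -k * sd else 0 for v in ftts]
```

In the full model, lagged windows of these labels form the high-order inputs from which the network learns the nonlinear mapping to the next fluctuation.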
NASA Astrophysics Data System (ADS)
Sawant, S. A.; Chakraborty, M.; Suradhaniwar, S.; Adinarayana, J.; Durbha, S. S.
2016-06-01
Satellite-based earth observation (EO) platforms have proved capable of spatio-temporally monitoring changes on the earth's surface. Long-term satellite missions have provided a huge repository of optical remote sensing datasets, and the United States Geological Survey (USGS) Landsat program is one of the oldest sources of optical EO datasets. This historical and near-real-time EO archive is a rich source of information for understanding seasonal changes in horticultural crops. Citrus (Mandarin / Nagpur Orange) is one of the major horticultural crops cultivated in central India. Erratic rainfall and dependency on groundwater for irrigation have a wide impact on citrus crop yield. Wide variations are also reported in temperature and relative humidity, causing early fruit onset and an increase in crop water requirement. There is therefore a need to study crop growth stages and crop evapotranspiration at spatio-temporal scale for managing scarce resources. In this study, an attempt has been made to understand citrus crop growth stages using Normalized Difference Vegetation Index (NDVI) time series data obtained from the Landsat archives (http://earthexplorer.usgs.gov/). A total of 388 Landsat 4, 5, 7 and 8 scenes (from 1990 to Aug. 2015) for Worldwide Reference System (WRS) 2, path 145 and row 45, were selected to understand seasonal variations in citrus crop growth. Given Landsat's 30 m spatial resolution, orchards larger than 2 hectares were selected to obtain homogeneous crop-cover pixels. To account for differences in band wavelength ranges across the Landsat sensors (4, 5, 7 and 8), NDVI was selected to obtain a continuous, sensor-independent time series. The obtained crop growth stage information has been used to estimate the citrus basal crop coefficient (Kcb). Satellite-based Kcb estimates were combined with weather parameters observed by a proximal agrometeorological sensing system for crop ET estimation.
The results show that time series EO-based crop growth stage estimates provide better information about geographically separated citrus orchards. Attempts are being made to estimate regional variations in citrus crop water requirement for effective irrigation planning. In the future, high-resolution Sentinel-2 observations from the European Space Agency (ESA) will be used to fill the time gaps and to better understand citrus crop canopy parameters.
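The sensor-independent index underlying the time series is a simple band ratio; a minimal sketch, assuming per-scene NIR and red reflectance values have already been extracted for an orchard pixel:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance;
    as a band ratio it stays comparable across Landsat 4-8 sensors."""
    if nir + red == 0:
        return 0.0                 # guard against an empty/no-signal pixel
    return (nir - red) / (nir + red)

def ndvi_series(scenes):
    """NDVI time series from (nir, red) reflectance pairs, one per scene date."""
    return [ndvi(n, r) for n, r in scenes]
```

Dense vegetation drives the value toward 1, bare soil toward 0, which is what lets the seasonal NDVI curve mark citrus growth stages.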
Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P
2014-10-01
There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating similar event-bearing time series, such as, for example, electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.
Wang, Tongli; Hamann, Andreas; Spittlehouse, Dave; Carroll, Carlos
2016-01-01
Large volumes of gridded climate data have become available in recent years including interpolated historical data from weather stations and future predictions from general circulation models. These datasets, however, are at various spatial resolutions that need to be converted to scales meaningful for applications such as climate change risk and impact assessments or sample-based ecological research. Extracting climate data for specific locations from large datasets is not a trivial task and typically requires advanced GIS and data management skills. In this study, we developed a software package, ClimateNA, that facilitates this task and provides a user-friendly interface suitable for resource managers and decision makers as well as scientists. The software locally downscales historical and future monthly climate data layers into scale-free point estimates of climate values for the entire North American continent. The software also calculates a large number of biologically relevant climate variables that are usually derived from daily weather data. ClimateNA covers 1) 104 years of historical data (1901–2014) in monthly, annual, decadal and 30-year time steps; 2) three paleoclimatic periods (Last Glacial Maximum, Mid Holocene and Last Millennium); 3) three future periods (2020s, 2050s and 2080s); and 4) annual time-series of model projections for 2011–2100. Multiple general circulation models (GCMs) were included for both paleo and future periods, and two representative concentration pathways (RCP4.5 and 8.5) were chosen for future climate data. PMID:27275583
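The core of extracting point estimates from gridded layers is spatial interpolation. The bilinear sketch below illustrates the idea only; ClimateNA's actual downscaling also applies elevation adjustments and derives the biologically relevant variables, none of which is shown, and the function name is illustrative:

```python
def bilinear(grid, row, col):
    """Bilinear interpolation of a 2-D gridded layer at fractional indices
    (row, col); grid[r][c] holds the climate value of cell (r, c)."""
    r0, c0 = int(row), int(col)
    r1 = min(r0 + 1, len(grid) - 1)      # clamp at the grid edge
    c1 = min(c0 + 1, len(grid[0]) - 1)
    fr, fc = row - r0, col - c0          # fractional offsets within the cell
    top = grid[r0][c0] * (1 - fc) + grid[r0][c1] * fc
    bot = grid[r1][c0] * (1 - fc) + grid[r1][c1] * fc
    return top * (1 - fr) + bot * fr
```

A user's (latitude, longitude) is first mapped to fractional grid indices for the layer's resolution; the same routine then serves every historical or future monthly layer.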
SPAGETTA, a Gridded Weather Generator: Calibration, Validation and its Use for Future Climate
NASA Astrophysics Data System (ADS)
Dubrovsky, Martin; Rotach, Mathias W.; Huth, Radan
2017-04-01
SPAGETTA is a new (launched in 2016) stochastic multi-site, multi-variate weather generator (WG). It can produce realistic synthetic daily (or monthly, or annual) weather series representing both present and future climate conditions at multiple sites (grids or stations irregularly distributed in space). The generator, whose model is based on Wilks' (1999) multi-site extension of the parametric (Richardson-type) single-site M&Rfi generator, may be run in two modes. In the first mode, it is run as a classical generator: it is calibrated in a first step using weather data from multiple sites, and only then can it produce arbitrarily long synthetic time series mimicking the spatial and temporal structure of the calibration weather data. To generate weather series representing the future climate, the WG parameters are modified according to a climate change scenario, typically derived from GCM or RCM simulations. In the second mode, the user provides only basic information (not necessarily realistic) on the temporal and spatial auto-correlation structure of the surface weather variables and their mean annual cycle; the generator itself derives the parameters of the underlying autoregressive model, which produces the multi-site weather series. In the latter mode of operation, the user may prescribe a spatially varying trend, which is superimposed on the values produced by the generator; this feature has been implemented for use in developing the methodology for assessing the significance of trends in multi-site weather series (for more details see another EGU-2017 contribution: Huth and Dubrovsky, 2017, Evaluating collective significance of climatic trends: A comparison of methods on synthetic data; EGU2017-4993). This contribution will focus on the first (classical) mode.
The poster will present (a) the model underlying the generator, (b) results of validation tests made in terms of spatial hot/cold/dry/wet spells, and (c) results of a pilot climate change impact experiment, in which (i) the WG parameters representing the spatial and temporal variability are modified using climate change scenarios, and then (ii) the effect on the above spatial validation indices, derived from the synthetic series produced by the modified WG, is analysed. In this experiment, the generator is calibrated using the E-OBS gridded daily weather data for several European regions, and the climate change scenarios are derived from a selected RCM simulation (taken from the CORDEX database).
Perception of aircraft Deviation Cues
NASA Technical Reports Server (NTRS)
Martin, Lynne; Azuma, Ronald; Fox, Jason; Verma, Savita; Lozito, Sandra
2005-01-01
To begin addressing the need for new displays, required by a future airspace concept to support new roles that will be assigned to flight crews, a study of potentially informative display cues was undertaken. Two cues were tested on a simple plan display: aircraft trajectory and flight corridor. Of particular interest was the speed and accuracy with which participants could detect an aircraft deviating outside its flight corridor. The presence of the trajectory cue significantly reduced participant reaction time to a deviation, while the flight corridor cue did not. Although the effect was non-significant, the flight corridor cue appeared to be related to the accuracy of participants' judgments rather than their speed. As this is the second of a series of studies, these issues will be addressed further in future studies.
Kirchner, James W.; Neal, Colin
2013-01-01
The chemical dynamics of lakes and streams affect their suitability as aquatic habitats and as water supplies for human needs. Because water quality is typically monitored only weekly or monthly, however, the higher-frequency dynamics of stream chemistry have remained largely invisible. To illuminate a wider spectrum of water quality dynamics, rainfall and streamflow were sampled in two headwater catchments at Plynlimon, Wales, at 7-h intervals for 1–2 y and weekly for over two decades, and were analyzed for 45 solutes spanning the periodic table from H+ to U. Here we show that in streamflow, all 45 of these solutes, including nutrients, trace elements, and toxic metals, exhibit fractal 1/fα scaling on time scales from hours to decades (α = 1.05 ± 0.15, mean ± SD). We show that this fractal scaling can arise through dispersion of random chemical inputs distributed across a catchment. These 1/f time series are non–self-averaging: monthly, yearly, or decadal averages are approximately as variable, one from the next, as individual measurements taken hours or days apart, defying naive statistical expectations. (By contrast, stream discharge itself is nonfractal, and self-averaging on time scales of months and longer.) In the solute time series, statistically significant trends arise much more frequently, on all time scales, than one would expect from conventional t statistics. However, these same trends are poor predictors of future trends—much poorer than one would expect from their calculated uncertainties. Our results illustrate how 1/f time series pose fundamental challenges to trend analysis and change detection in environmental systems. PMID:23842090
Integrated orbital servicing study follow-on. Volume 3: Engineering test unit and controls
NASA Technical Reports Server (NTRS)
1978-01-01
A one-g servicing demonstration system that can be used to investigate and develop, in a real-time, hands-on situation, a wide variety of the mechanism and control system aspects of orbital servicing in the form of module exchange is described, including the engineering test unit and the servicer servo drive console. A series of recommendations for future work is given concerning the control problem and more efficient module exchanges, mechanical elements, and electronics.
Stochastic demographic forecasting.
Lee, R D
1992-11-01
"This paper describes a particular approach to stochastic population forecasting, which is implemented for the U.S.A. through 2065. Statistical time series methods are combined with demographic models to produce plausible long run forecasts of vital rates, with probability distributions. The resulting mortality forecasts imply gains in future life expectancy that are roughly twice as large as those forecast by the Office of the Social Security Actuary.... Resulting stochastic forecasts of the elderly population, elderly dependency ratios, and payroll tax rates for health, education and pensions are presented." excerpt
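A common time-series ingredient in this style of stochastic demographic forecasting (as in Lee's later work with Carter) is a random walk with drift for a mortality index, which yields central forecasts with probability intervals that widen with the horizon. A minimal sketch of that step follows; the index values `k` are invented for illustration and are not the paper's data:

```python
import math
import statistics
from statistics import NormalDist

def rw_drift_forecast(k, horizon, level=0.95):
    """Forecast a random walk with drift h steps ahead; the
    probability interval widens like sqrt(h)."""
    diffs = [b - a for a, b in zip(k, k[1:])]
    drift = statistics.mean(diffs)       # estimated drift per step
    se = statistics.stdev(diffs)         # innovation standard deviation
    z = NormalDist().inv_cdf(0.5 + level / 2)
    out = []
    for h in range(1, horizon + 1):
        centre = k[-1] + drift * h
        half = z * se * math.sqrt(h)
        out.append((centre - half, centre, centre + half))
    return out

# Illustrative declining mortality index (hypothetical values)
k = [0.0, -0.9, -2.1, -2.8, -4.0, -5.1, -5.9, -7.2, -8.0, -9.1]
for lo, mid, hi in rw_drift_forecast(k, 3):
    print(round(lo, 2), round(mid, 2), round(hi, 2))
```

The widening intervals are what turn point projections of vital rates into probability distributions for derived quantities such as dependency ratios.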
Detection of a sudden change of the field time series based on the Lorenz system.
Da, ChaoJiu; Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan
2017-01-01
We conducted an exploratory study of the detection of a sudden change of a field time series based on the numerical solution of the Lorenz system. First, the times when the Lorenz path jumped between the regions to the left and right of the equilibrium point of the Lorenz system were quantitatively marked, giving the sudden change times of the Lorenz system. Second, the numerical solution of the Lorenz system was regarded as a vector; thus, this solution could be considered a vector time series. We transformed the vector time series into a scalar time series using the vector inner product, taking into account the geometric and topological features of the Lorenz system path. Third, sudden changes in the resulting time series were detected using the sliding t-test method. Comparing the test results with the quantitatively marked times indicated that the method could detect every sudden change of the Lorenz path; thus, the method is effective. Finally, we used the method to detect sudden changes in a pressure field time series and a temperature field time series, and obtained good results for both, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between field time series and vector time series; thus, we provide a new method for detecting sudden changes in field time series.
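The sliding t-test step can be sketched as follows. This is a minimal illustration on synthetic data with an arbitrary window length, not the authors' implementation; in their setting the scalar series would come from inner products of the Lorenz state vectors:

```python
import numpy as np

def sliding_t_test(x, n=10):
    """Two-sample t statistic between the n points before and after
    each index; a large |t| flags a candidate sudden change."""
    t = np.full(len(x), np.nan)
    for i in range(n, len(x) - n):
        a, b = x[i - n:i], x[i:i + n]
        sp = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / n)  # SE of mean diff
        t[i] = (b.mean() - a.mean()) / sp if sp > 0 else 0.0
    return t

# A step change in the mean at index 100 should give a |t| peak near there.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
t = sliding_t_test(x, n=10)
print(int(np.nanargmax(np.abs(t))))  # near 100
```

In practice the window length n trades off sensitivity to short-lived shifts against the stability of the t statistic.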
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-11
... Chief, at (202) 551-6821 (Division of Investment Management, Exemptive Applications Office... management investment company currently comprising 23 series (the ``Compass Funds'').\\1\\ Each series of the... series of the Trust and any other existing or future registered open-end management investment company or...
ERIC Educational Resources Information Center
Hesselbein, Frances, Ed.; And Others
The 31 papers in this volume address the requirements and qualities of leadership and leaders in the organization of the future. Papers are grouped into the following categories: Leading the Organization of the Future, Future Leaders in Action, Learning to Lead for Tomorrow, and Executives on the Future of Leadership. Some of the papers included…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chichester, Heather Jean MacLean; Hayes, Steven Lowe; Dempsey, Douglas
This report summarizes the objectives of the current irradiation testing activities being undertaken by the Advanced Fuels Campaign relative to supporting the development and demonstration of innovative design features for metallic fuels in order to realize reliable performance to ultra-high burnups. The AFC-3 and AFC-4 test series are nearing completion; the experiments in this test series that have been completed or are in progress are reviewed, and the objectives and test matrices for the final experiments in these two series are defined. The objectives, testing strategy, and test parameters associated with a future AFC test series, AFC-5, are documented. Finally, the future intersections and/or synergies of the AFC irradiation testing program with those of the TREAT transient testing program, emerging needs of proposed Versatile Test Reactor concepts, and the Joint Fuel Cycle Study program's Integrated Recycle Test are discussed.
Volatility of linear and nonlinear time series
NASA Astrophysics Data System (ADS)
Kalisky, Tomer; Ashkenazy, Yosef; Havlin, Shlomo
2005-07-01
Previous studies indicated that nonlinear properties of Gaussian-distributed time series with long-range correlations, ui, can be detected and quantified by studying the correlations in the magnitude series ∣ui∣, the “volatility.” However, the origin of this empirical observation remains unclear, and the exact relation between the correlations in ui and the correlations in ∣ui∣ is still unknown. Here we develop analytical relations between the scaling exponent of a linear series ui and that of its magnitude series ∣ui∣. Moreover, we find that nonlinear time series exhibit stronger (or the same) correlations in the magnitude time series compared with linear time series with the same two-point correlations. Based on these results, we propose a simple model that generates multifractal time series by explicitly inserting long-range correlations in the magnitude series; the nonlinear multifractal time series is generated by multiplying a long-range correlated time series [that represents the magnitude series] with an uncorrelated time series [that represents the sign series sgn(ui)]. We apply our techniques to daily deep-ocean temperature records from the equatorial Pacific, the region of the El-Niño phenomenon, and find: (i) long-range correlations from several days to several years with 1/f power spectrum, (ii) significant nonlinear behavior as expressed by long-range correlations of the volatility series, and (iii) a broad multifractal spectrum.
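The proposed construction (a long-range correlated magnitude series multiplied by an uncorrelated sign series) can be sketched as below. The Fourier-filtering step used to generate long-range correlations, and the spectral exponent chosen, are illustrative assumptions rather than the authors' exact procedure:

```python
import numpy as np

def long_range_correlated(n, beta, rng):
    """Fourier filtering: shape white noise to a ~1/f^beta spectrum,
    producing a long-range correlated Gaussian series."""
    f = np.fft.rfftfreq(n)
    f[0] = f[1]  # avoid division by zero at the DC bin
    spec = np.fft.rfft(rng.normal(size=n)) * f ** (-beta / 2)
    x = np.fft.irfft(spec, n)
    return (x - x.mean()) / x.std()

rng = np.random.default_rng(1)
n = 4096
magnitude = np.abs(long_range_correlated(n, 0.8, rng))  # correlated magnitudes
sign = rng.choice([-1.0, 1.0], size=n)                  # uncorrelated signs
series = magnitude * sign                               # multifractal candidate
print(series.shape)
```

Because the signs are independent of the magnitudes, the two-point correlations of `series` stay weak while its volatility inherits the long-range correlations of `magnitude`, which is the mechanism the abstract describes.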
Ocean Observatory efforts in and around Monterey Bay, California, 1930 to the present
NASA Astrophysics Data System (ADS)
Chavez, F. P.; Pennington, J. T.; Collins, C. A.; Paduan, J. D.; Marinovich, B.; Bellingham, J.
2002-12-01
Monterey Bay (MB) is a deep (>1000 m), non-estuarine embayment in central California broadly open to the coastal ocean. Its oceanography has received considerable study, beginning in the early 1930s when MB was the center of a large sardine fishery and continuing intermittently since the collapse of the fishery in the 1950s. Many studies have been conducted within and offshore of MB, primarily by the many marine science laboratories and academic departments ringing Monterey Bay, and the time series studies constitute an impressive, albeit discontinuous, record (at least 39 of 61 years between 1928-1989). The Monterey Bay Aquarium Research Institute (MBARI) initiated in 1989 a program of interdisciplinary semi-monthly time series cruises to stations within and offshore of MB. In addition to the shipboard time series, MBARI has maintained two moorings since 1989 (M1 and M2). More recently, additional moorings have been deployed for shorter periods by MBARI (M3, S2, S3) and the Naval Postgraduate School (NPS) (M4). The M moorings are equipped with meteorological, physical, chemical, and bio-optical instrumentation. The S moorings have current meters and sediment traps. Since 1997, as part of a cooperative program between MBARI and NPS, quarterly cruises that occupy CalCOFI line 67 to 300 km from shore have been carried out. High-frequency radar (CODAR) measurements of MB by NPS, Cal State University MB and UC Santa Cruz, collected since 1995, have recently been augmented with coverage south of Point Sur. Since 1997, UC Santa Cruz and MBARI have carried out cooperative studies of zooplankton abundance and composition. Finally, in 1998 modeling studies were initiated in an effort to integrate the available data and to direct future observational studies. Several new collaborative initiatives, funded by NSF (MARS), NOAA (CIMT) and ONR (AOSN), geared toward adding new and more sophisticated observing and modeling capabilities, are set to begin in the near future.
In this paper we review some of the discoveries and scientific advances that have resulted from the sustained time series and show that we are beginning to understand the functioning and complexities of Monterey Bay pelagic ecosystems. As new technologies are deployed to explore the interconnected physical, geological, chemical and biological processes, the challenge will be to integrate these data into new conceptual and dynamical models of ocean dynamics. This will require a truly synergistic effort between organizations and disciplines.
Duality between Time Series and Networks
Campanharo, Andriana S. L. O.; Sirer, M. Irmak; Malmgren, R. Dean; Ramos, Fernando M.; Amaral, Luís A. Nunes
2011-01-01
Studying the interaction between a system's components and the temporal evolution of the system are two common ways to uncover and characterize its internal workings. Recently, several maps from a time series to a network have been proposed with the intent of using network metrics to characterize time series. Although these maps demonstrate that different time series result in networks with distinct topological properties, it remains unclear how these topological properties relate to the original time series. Here, we propose a map from a time series to a network with an approximate inverse operation, making it possible to use network statistics to characterize time series and time series statistics to characterize networks. As a proof of concept, we generate an ensemble of time series ranging from periodic to random and confirm that application of the proposed map retains much of the information encoded in the original time series (or networks) after application of the map (or its inverse). Our results suggest that network analysis can be used to distinguish different dynamic regimes in time series and, perhaps more importantly, time series analysis can provide a powerful set of tools that augment the traditional network analysis toolkit to quantify networks in new and useful ways. PMID:21858093
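One simple map in this spirit is a quantile-based transition network: nodes are value quantiles and directed edges count transitions between quantiles at successive time steps, so that a random walk on the network approximately inverts the map. The sketch below is an illustration of that idea with arbitrary choices (q = 4 quantiles, sine vs. noise test series), not necessarily the authors' exact map:

```python
import numpy as np

def series_to_network(x, q=4):
    """Map a time series to a weighted directed network: node i is the
    i-th value quantile, and edge (i, j) counts transitions from
    quantile i to quantile j at successive time steps."""
    edges = np.quantile(x, np.linspace(0, 1, q + 1))
    labels = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, q - 1)
    w = np.zeros((q, q))
    for a, b in zip(labels[:-1], labels[1:]):
        w[a, b] += 1
    return w / w.sum()  # normalized transition-frequency matrix

x_per = np.sin(np.linspace(0, 20 * np.pi, 1000))  # periodic series
w_per = series_to_network(x_per)
w_rnd = series_to_network(np.random.default_rng(2).normal(size=1000))
# A smooth periodic series concentrates weight on few edges (adjacent
# quantiles); uncorrelated noise spreads weight over many edges.
print((w_per > 0).sum(), (w_rnd > 0).sum())
```

The contrast in edge occupancy between the two matrices is one network statistic that distinguishes the periodic and random dynamic regimes mentioned in the abstract.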
New approaches to some methodological problems of meteor science
NASA Technical Reports Server (NTRS)
Meisel, David D.
1987-01-01
Several low-cost approaches to continuous radioscatter monitoring of the incoming meteor flux are described. Preliminary experiments were attempted using the standard time and frequency stations WWVH and CHU (on frequencies near 15 MHz) during nighttime hours. Around-the-clock monitoring using the international standard aeronautical beacon frequency of 75 MHz was also attempted. The techniques are simple and can be managed routinely by amateur astronomers with relatively little technical expertise. Time series analysis can now be performed using relatively inexpensive microcomputers. Several algorithmic approaches to the analysis of meteor rates are discussed. Methods of obtaining optimal filter predictions of future meteor flux are also discussed.
Memory interface simulator: A computer design aid
NASA Technical Reports Server (NTRS)
Taylor, D. S.; Williams, T.; Weatherbee, J. E.
1972-01-01
Results are presented of a study conducted with a digital simulation model being used in the design of the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. The model simulates the activity involved as instructions are fetched from random access memory for execution in one of the system's central processing units. A series of model runs measured instruction execution time under various assumptions pertaining to the CPUs and the interface between the CPUs and RAM. Design tradeoffs are presented in the following areas: bus widths, CPU microprogram read-only memory cycle time, multiple instruction fetch, and instruction mix.
NASA Astrophysics Data System (ADS)
Forsythe, N.; Fowler, H. J.; Blenkinsop, S.; Burton, A.; Kilsby, C. G.; Archer, D. R.; Harpham, C.; Hashmi, M. Z.
2014-09-01
Assessing local climate change impacts requires downscaling from Global Climate Model simulations. Here, a stochastic rainfall model (RainSim) combined with a rainfall-conditioned weather generator (CRU WG) has been successfully applied in a semi-arid mountain climate, for part of the Upper Indus Basin (UIB), for point stations at a daily time-step to explore climate change impacts. Validation of the simulated time-series against observations (1961-1990) demonstrated the models' skill in reproducing climatological means of core variables, with monthly RMSE of <2.0 mm for precipitation and ⩽0.4 °C for mean temperature and daily temperature range. This level of performance is impressive given the complexity of climate processes operating in this mountainous context at the boundary between monsoonal and mid-latitude (westerly) weather systems. Of equal importance, the model captures well the observed interannual variability as quantified by the first and last decile of 30-year climatic periods. Differences between a control (1961-1990) and future (2071-2100) regional climate model (RCM) time-slice experiment were then used to provide change factors which could be applied within the rainfall and weather models to produce perturbed 'future' weather time-series. These project year-round increases in precipitation (maximum seasonal mean change: +27%; annual mean change: +18%) with increased intensity in the wettest months (February, March, April) and year-round increases in mean temperature (annual mean +4.8 °C). Climatic constraints on the productivity of natural resource-dependent systems were also assessed using relevant indices from the European Climate Assessment (ECA) and indicate potential future risk to water resources and local agriculture. However, the uniformity of projected temperature increases is in stark contrast to recent seasonally asymmetrical trends in observations, so an alternative scenario of extrapolated trends was also explored.
We conclude that interannual variability in climate will continue to have the dominant impact on water resources management whichever trajectory is followed. This demonstrates the need for sophisticated downscaling methods which can evaluate changes in variability and sequencing of events to explore climate change impacts in this region.
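The change-factor perturbation described above can be sketched for temperature as an additive monthly adjustment to an observed series. The 360-day calendar, synthetic temperatures, and uniform +4.8 °C factor below are illustrative assumptions, not the study's configuration; precipitation would typically use multiplicative rather than additive factors:

```python
import numpy as np

def apply_change_factors(daily_temp, months, delta_by_month):
    """Perturb an observed daily temperature series with additive
    monthly change factors (RCM future minus RCM control means)."""
    return daily_temp + np.array([delta_by_month[m] for m in months])

# One hypothetical year: daily temperatures with a seasonal cycle plus noise
rng = np.random.default_rng(5)
months = np.repeat(np.arange(1, 13), 30)   # simplified 360-day year
temps = (10 + 12 * np.sin(2 * np.pi * (np.arange(360) - 80) / 360)
         + rng.normal(0, 2, 360))
delta = {m: 4.8 for m in range(1, 13)}     # uniform +4.8 degC warming
future = apply_change_factors(temps, months, delta)
print(round(float(future.mean() - temps.mean()), 1))  # 4.8
```

Applying the factors inside the weather generator, as in the study, additionally perturbs variability and wet-day structure rather than only the monthly means shown here.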
ERIC Educational Resources Information Center
Johnston, Lloyd D.; O'Malley, Patrick M.; Bachman, Jerald G.; Schulenberg, John E.; Miech, Richard A.
2014-01-01
This occasional paper presents subgroup findings from the Monitoring the Future (MTF) study on levels of and trends in the use of a number of substances for nationally representative samples of high school graduates ages 19-30. The data have been gathered in a series of follow-up surveys of representative subsamples of high school seniors who were…
Authentic Astronomical Discovery in Planetariums: Data-Driven Immersive Lectures
NASA Astrophysics Data System (ADS)
Wyatt, Ryan Jason
2018-01-01
Planetariums are akin to “branch offices” for astronomy in major cities and other locations around the globe. With immersive, fulldome video technology, modern digital planetariums offer the opportunity to integrate authentic astronomical data into both pre-recorded shows and live lectures. At the California Academy of Sciences Morrison Planetarium, we host the monthly Benjamin Dean Astronomy Lecture Series, which features researchers describing their cutting-edge work to well-informed lay audiences. The Academy’s visualization studio and engineering teams work with researchers to visualize their data in both pre-rendered and real-time formats, and these visualizations are integrated into a variety of programs—including lectures! The assets are then made available to any other planetariums with similar software to support their programming. A lecturer can thus give the same immersive presentation to audiences in a variety of planetariums. The Academy has also collaborated with Chicago’s Adler Planetarium to bring Kavli Fulldome Lecture Series to San Francisco, and the two theaters have also linked together in live “domecasts” to share real-time content with audiences in both cities. These lecture series and other, similar projects suggest a bright future for astronomers to bring their research to the public in an immersive and visually compelling format.
NASA Astrophysics Data System (ADS)
Spellman, P.; Griffis, V. W.; LaFond, K.
2013-12-01
A changing climate brings about new challenges for flood risk analysis and water resources planning and management. Current methods for estimating flood risk in the US involve fitting the Pearson Type III (P3) probability distribution to the logarithms of the annual maximum flood (AMF) series using the method of moments. These methods are employed under the premise of stationarity, which assumes that the fitted distribution is time-invariant and that variables affecting stream flow, such as climate, do not fluctuate. However, climate change would bring about shifts in meteorological forcings which can alter the summary statistics (mean, variance, skew) of flood series used for P3 parameter estimation, resulting in erroneous flood risk projections. To ascertain the degree to which future risk may be misrepresented by current techniques, we use climate scenarios generated from global climate models (GCMs) as input to a hydrological model to explore how relative changes to current climate affect flood response for watersheds in the northeastern United States. The watersheds were calibrated and run on a daily time step using the continuous, semi-distributed, process-based Soil and Water Assessment Tool (SWAT). Nash-Sutcliffe Efficiency (NSE), RMSE to Standard Deviation ratio (RSR) and Percent Bias (PBIAS) were all used to assess model performance. Eight climate scenarios were chosen from GCM output based on relative precipitation and temperature changes from the current climate of the watershed and then further bias-corrected. Four of the scenarios were selected to represent warm-wet, warm-dry, cool-wet and cool-dry future climates, and the other four were chosen to represent more extreme, albeit possible, changes in precipitation and temperature. We quantify changes in response by comparing the differences in total mass balance and summary statistics of the logarithms of the AMF series from historical baseline values.
We then compare forecasts of flood quantiles obtained by fitting a P3 distribution to the logs of the historical AMF data with those from the generated AMF series.
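The standard fitting step (P3 on the logarithms of the AMF series by the method of moments) can be sketched as follows. The Wilson-Hilferty frequency-factor approximation and the ten-value flow record are illustrative assumptions, not the study's data or code:

```python
import math
import statistics
from statistics import NormalDist

def lp3_quantile(flows, p):
    """Flood quantile with non-exceedance probability p from a
    log-Pearson Type III fit by the method of moments, using the
    Wilson-Hilferty approximation for the frequency factor K."""
    logs = [math.log10(q) for q in flows]
    m = statistics.mean(logs)
    s = statistics.stdev(logs)
    n = len(logs)
    # sample skew of the logs with the usual bias correction
    g = (n / ((n - 1) * (n - 2))) * sum(((x - m) / s) ** 3 for x in logs)
    z = NormalDist().inv_cdf(p)
    if abs(g) < 1e-6:
        k = z  # zero skew reduces to the lognormal case
    else:
        k = (2 / g) * ((1 + g * z / 6 - g ** 2 / 36) ** 3 - 1)
    return 10 ** (m + k * s)

# 100-year flood (p = 0.99) from a short, purely illustrative record
flows = [3200, 2100, 4500, 1800, 2900, 5100, 2400, 3800, 1500, 2700]
print(round(lp3_quantile(flows, 0.99)))
```

Under nonstationarity, the mean, variance, and skew fed into this fit drift over time, which is exactly how the climate-driven errors discussed in the abstract enter the quantile estimates.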
A randomized intervention of reminder letter for human papillomavirus vaccine series completion.
Chao, Chun; Preciado, Melissa; Slezak, Jeff; Xu, Lanfang
2015-01-01
Completion rates for the three-dose series of the human papillomavirus (HPV) vaccine have generally been low. This study evaluated the effectiveness of a reminder letter intervention on HPV vaccine three-dose series completion. Female members of the Kaiser Permanente Southern California Health Plan who had received at least one dose, but not more than two doses, of the HPV vaccine by February 13, 2013, and who were between ages 9 and 26 years at the time of first HPV vaccination were included. Eighty percent of these females were randomized to receive the reminder letter, and 20% were randomized to receive standard of care (control). The reminder letters were mailed quarterly to those who had not completed the series. The proportion of series completion at the end of the 12-month evaluation period was compared using a chi-square test. A total of 9,760 females were included in the intervention group and 2,445 in the control group. HPV vaccine series completion was 56.4% in the intervention group and 46.6% in the control group (p < .001). The effect of the intervention appeared to be stronger in girls aged 9-17 years compared with young women aged 18-26 years at the first dose, and in blacks compared with whites. Reminder letters scheduled quarterly were effective in enhancing HPV vaccine series completion among those who initiated the vaccine. However, a large gap in series completion remained despite the intervention. Future studies should address other barriers to series completion, including those at the provider and health care system level. Copyright © 2015 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
A systematic analysis of model performance during simulations based on observed landcover/use change is used to quantify errors associated with simulations of known "future" conditions. Calibrated and uncalibrated assessments of relative change over different lengths of...
NASA Astrophysics Data System (ADS)
Zhou, Yu; Chen, Shi
2016-02-01
In this paper, we investigate the high-frequency cross-correlation between Chinese treasury futures contracts and a treasury ETF. We analyze the logarithmic returns of these two price series, from which we conclude that neither return series is normally distributed and that the futures market has greater volatility. We find significant cross-correlation between the two series, and we further confirm the relationship using the DCCA coefficient and the DMCA coefficient. We quantify the long-range cross-correlation with the DCCA method and further show that the relationship is multifractal. An arbitrage algorithm based on DFA regression with stable returns is proposed in the final section.
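The DCCA coefficient referred to above is the detrended covariance of two series divided by the product of their detrended fluctuations at a given scale. A minimal sketch on synthetic data follows; the box size and the shared-component construction are illustrative assumptions, not the authors' data or code:

```python
import numpy as np

def dcca_coefficient(x, y, n):
    """DCCA cross-correlation coefficient rho(n) = F2_xy / (F_x * F_y),
    computed over non-overlapping boxes of size n with linear detrending
    of the integrated profiles."""
    X, Y = np.cumsum(x - np.mean(x)), np.cumsum(y - np.mean(y))
    f_xy = f_xx = f_yy = 0.0
    t = np.arange(n)
    for b in range(len(x) // n):
        xs, ys = X[b * n:(b + 1) * n], Y[b * n:(b + 1) * n]
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)  # detrended residuals
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f_xy += np.mean(rx * ry)
        f_xx += np.mean(rx * rx)
        f_yy += np.mean(ry * ry)
    return f_xy / np.sqrt(f_xx * f_yy)

rng = np.random.default_rng(3)
common = rng.normal(size=2000)            # shared driving component
x = common + 0.5 * rng.normal(size=2000)  # two series sharing that component
y = common + 0.5 * rng.normal(size=2000)
print(round(dcca_coefficient(x, y, 20), 2))  # strongly positive
```

Like a correlation coefficient, rho(n) lies in [-1, 1]; scanning it over box sizes n is what reveals scale-dependent (and potentially multifractal) cross-correlation.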
An Algorithm Framework for Isolating Anomalous Signals in Electromagnetic Data
NASA Astrophysics Data System (ADS)
Kappler, K. N.; Schneider, D.; Bleier, T.; MacLean, L. S.
2016-12-01
QuakeFinder and its international collaborators have installed and currently maintain an array of 165 three-axis induction magnetometer instrument sites in California, Peru, Taiwan, Greece, Chile and Sumatra. Based on research by Bleier et al. (2009), Fraser-Smith et al. (1990), and Freund (2007), the electromagnetic data from these instruments are being analyzed for pre-earthquake signatures. This analysis consists of both private research by QuakeFinder and work with institutional collaborators (PUCP in Peru, NCU in Taiwan, NOA in Greece, LASP at University of Colorado, Stanford, UCLA, NASA-ESI, NASA-AMES and USC-CSEP). QuakeFinder has developed an algorithm framework aimed at isolating anomalous signals (pulses) in the time series, and results are presented from an application of this framework to induction-coil magnetometer data. Our data-driven approach starts with sliding windows applied to uniformly resampled array data, with a variety of lengths and overlaps. Data variance (a proxy for energy) is calculated on each window, and a short-term average/long-term average (STA/LTA) filter is applied to the variance time series. Pulse identification is done by flagging time intervals in the STA/LTA-filtered time series which exceed a threshold. Flagged time intervals are subsequently fed into a feature extraction program which computes statistical properties of the resampled data. These features are then filtered using a Principal Component Analysis (PCA) based method to cluster similar pulses. We explore the extent to which this approach categorizes pulses with known sources (e.g., cars, lightning), so that the remaining pulses of unknown origin can be analyzed with respect to their relationship with seismicity. We seek a correlation between these daily pulse counts (with known sources removed) and subsequent (days to weeks) seismic events greater than M5 within a 15 km radius.
We therefore explore functions that map daily pulse counts to a time series representing the likelihood of a seismic event occurring at some future time. These "pseudo-probabilities" can in turn be represented as Molchan diagrams. The Molchan curve provides an effective cost function for optimization and allows a rigorous statistical assessment of the validity of pre-earthquake signals in the electromagnetic data.
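The pulse-flagging pipeline described above (windowed variance as an energy proxy, STA/LTA filtering, thresholding) can be sketched in a few lines. The window lengths and threshold below are illustrative defaults, not QuakeFinder's actual parameters:

```python
import numpy as np

def sta_lta_flags(x, win=64, sta=5, lta=50, threshold=3.0):
    """Flag anomalous pulses via an STA/LTA filter applied to a
    sliding-window variance (energy proxy) series.
    All window lengths and the threshold are illustrative."""
    # Variance in non-overlapping windows: a proxy for signal energy.
    n = len(x) // win
    var = np.array([x[i * win:(i + 1) * win].var() for i in range(n)])
    flags = []
    for t in range(lta, n):
        sta_val = var[t - sta:t].mean()   # short-term average
        lta_val = var[t - lta:t].mean()   # long-term average
        if lta_val > 0 and sta_val / lta_val > threshold:
            flags.append(t)               # window index exceeding threshold
    return var, flags
```

Running this on synthetic noise with an injected high-energy burst flags the window indices around the burst while leaving the background unflagged.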
Modeling species invasions in Ecopath with Ecosim: an evaluation using Laurentian Great Lakes models
Langseth, Brian J.; Rogers, Mark; Zhang, Hongyan
2012-01-01
Invasive species affect the structure and processes of the ecosystems they invade. Invasive species have been particularly relevant to the Laurentian Great Lakes, where they have played a part in both historical and recent changes to Great Lakes food webs and the fisheries supported therein. There is increased interest in understanding the effects of ecosystem changes on fisheries within the Great Lakes, and ecosystem models provide an essential tool through which this understanding can develop. A commonly used model for exploring fisheries management questions within an ecosystem context is the Ecopath with Ecosim (EwE) modeling software. Incorporating invasive species into EwE models is a challenging process, and descriptions and comparisons of methods for modeling species invasions are lacking. We compared four methods for incorporating invasive species into EwE models for both Lake Huron and Lake Michigan, based on the ability of each to reproduce patterns in observed data time series. The methods differed in whether invasive species biomass was forced in the model, the initial level of invasive species biomass at the beginning of time-dynamic simulations, and the approach used to cause invasive species biomass to increase at the time of invasion. The overall process of species invasion could be reproduced by all methods, but fits to observed time series varied among the methods and models considered. We recommend forcing invasive species biomass when model objectives are to understand past ecosystem impacts and when time series of invasive species biomass are available. Among methods where invasive species time series were not forced, mediating the strength of predator–prey interactions performed best for the Lake Huron model, but worse for the Lake Michigan model.
Starting invasive species biomass at high values and then artificially removing biomass until the time of invasion performed well for both models, but was more complex than starting invasive species biomass at low values. In general, for understanding the effect of invasive species on future fisheries management actions, we recommend initiating invasive species biomass at low levels based on the greater simplicity and realism of the method compared to others.
Time series analysis of temporal networks
NASA Astrophysics Data System (ADS)
Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh
2016-01-01
A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies, knowing these properties alone is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without knowledge of the full network structure; an estimate of some of the properties is sufficient to launch the attack. In this paper we show that even if the network structure at a future time point is not available, one can still estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them, and, using a standard time series forecast model, predict the properties of the temporal network at a later time instance. To this end, we consider eight properties, such as the number of active nodes, average degree, and clustering coefficient, and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as an autoregressive integrated moving average (ARIMA) process. We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency-domain properties of the time series, obtained from spectrogram analysis, can be used to refine the prediction framework by identifying beforehand the cases where the prediction error is likely to be high. This leads to an improvement of 7.96% (for error level ≤ 20%) in prediction accuracy on average across all datasets. As an application, we show how such a prediction scheme can be used to launch targeted attacks on temporal networks.
Contribution to the Topical Issue "Temporal Network Theory and Applications", edited by Petter Holme.
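The forecasting step above relies on a standard ARIMA model. A minimal sketch of the autoregressive core of that idea, an ordinary least-squares AR(p) fit iterated forward (differencing and moving-average terms are omitted, and the order is illustrative, not the paper's):

```python
import numpy as np

def fit_ar(x, p=2):
    """Ordinary-least-squares AR(p) fit: x[t] ≈ c + a_1 x[t-1] + ... + a_p x[t-p].
    A simplified stand-in for full ARIMA fitting."""
    n = len(x)
    # Design matrix: intercept column plus lag-1..lag-p columns.
    X = np.column_stack([np.ones(n - p)] + [x[p - k:n - k] for k in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coef  # [c, a_1, ..., a_p]

def forecast(x, coef, steps):
    """Iterate the fitted AR model forward to predict future values."""
    p = len(coef) - 1
    hist = list(x)
    out = []
    for _ in range(steps):
        nxt = coef[0] + sum(coef[k] * hist[-k] for k in range(1, p + 1))
        hist.append(nxt)
        out.append(nxt)
    return np.array(out)
```

On a simulated AR(1) series the fit recovers the true coefficient closely, and `forecast` then extrapolates a network-property series (e.g., number of active nodes) a few steps ahead.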
NASA Astrophysics Data System (ADS)
Friedl, M. A.; Melaas, E. K.; Sulla-menashe, D. J.; Gray, J. M.
2014-12-01
Phenology, the seasonal progression of organisms through stages of dormancy, active growth, and senescence, is a key regulator of ecosystem processes and is widely used as an indicator of vegetation responses to climate change. This is especially true in temperate forests, where seasonal dynamics in canopy development and senescence are tightly coupled to the climate system. Despite this, understanding of climate-phenology interactions is incomplete. A key impediment to improving this understanding is that available datasets are geographically sparse and, in most cases, include relatively short time series. Remote sensing has been widely promoted as a useful tool for studies of large-scale phenology, but long-term remote sensing studies have been limited to AVHRR data, which suffer from limitations related to their coarse spatial resolution and to uncertainties in the atmospheric corrections and radiometric adjustments used to create AVHRR time series. In this study, we used 30 years of Landsat data to quantify the nature and magnitude of long-term trends and short-term variability in the timing of spring leaf emergence and fall senescence. Our analysis focuses on temperate forest locations in the Northeastern United States that are co-located with surface meteorological observations, where we have estimated the timing of leaf emergence and leaf senescence at annual time steps using atmospherically corrected surface reflectances from Landsat TM and ETM+ imagery. Comparison of results from Landsat against ground observations demonstrates that phenological events can be reliably estimated from Landsat time series. More importantly, results from this analysis suggest two main conclusions related to the nature of climate change impacts on temperate forest phenology. First, there is clear evidence of trends towards longer growing seasons in the Landsat record.
Second, interannual variability is large, with average year-to-year variability exceeding the magnitude of total changes to the growing season that have occurred over the last three decades. Based on these results we suggest that year-to-year variability in phenology, rather than long-term trends, provides the best basis for predicting future changes in temperate forest phenology in response to climate change.
NASA Astrophysics Data System (ADS)
Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.
2017-12-01
In recent years, more and more Carbon Capture and Storage (CCS) studies focus on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic measures (magnitudes, phases, distributions, etc.) is proposed for injection control. The primary task for ATLS is detecting seismic events in a long-term, continuously recorded time series. Because the time-varying signal-to-noise ratio (SNR) of a long-term record and the uneven energy distribution of seismic event waveforms make automatic seismic detection difficult, in this work an improved probabilistic autoregressive (AR) method for automatic seismic event detection is applied. This algorithm, called sequentially discounting AR learning (SDAR), identifies effective seismic events in the time series through change point detection (CPD) on the seismic record. In this method, an anomalous signal (a seismic event) is treated as a change point in the time series (the seismic record): the statistical model of the signal in the neighborhood of the event point changes because of the seismic event's occurrence. SDAR thus aims to find the statistical irregularities of the record through CPD. SDAR has three advantages. 1. Anti-noise ability: SDAR does not use waveform attributes (such as amplitude, energy, or polarization) for signal detection, so it is an appropriate technique for low-SNR data. 2. Real-time estimation: when new data appear in the record, the probability distribution models are automatically updated by SDAR for online processing. 3. Discounting property: SDAR introduces a discounting parameter to decrease the influence of older data on current estimates, making it a robust algorithm for non-stationary signal processing.
With these three advantages, the SDAR method can handle non-stationary, time-varying long-term series and achieve real-time monitoring. Finally, we apply SDAR to a synthetic model and to Tomakomai Ocean Bottom Cable (OBC) baseline data to demonstrate the feasibility and advantages of our method.
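The discounting idea behind SDAR can be illustrated with a much-simplified sketch: an online AR(1) model whose sufficient statistics are updated with a forgetting factor, with the normalized squared prediction residual serving as a change score. This is a sketch of the principle only, not the complete published SDAR algorithm, and the discounting rate `r` is an illustrative choice:

```python
import numpy as np

def sdar_scores(x, r=0.02):
    """Online change scores in the spirit of SDAR: AR(1) statistics are
    updated with discounting factor r so old data lose influence; the
    score is the squared prediction residual normalized by the running
    residual variance. A simplified illustration, not the full algorithm."""
    mu, c0, c1, var = x[0], 1.0, 0.0, 1.0
    scores = [0.0]
    for t in range(1, len(x)):
        mu = (1 - r) * mu + r * x[t]                     # discounted mean
        c0 = (1 - r) * c0 + r * (x[t - 1] - mu) ** 2     # lag-0 covariance
        c1 = (1 - r) * c1 + r * (x[t] - mu) * (x[t - 1] - mu)  # lag-1 covariance
        a = c1 / c0 if c0 > 0 else 0.0                   # discounted AR(1) coefficient
        pred = mu + a * (x[t - 1] - mu)                  # one-step prediction
        resid = x[t] - pred
        var = (1 - r) * var + r * resid ** 2             # discounted residual variance
        scores.append(resid ** 2 / var if var > 0 else 0.0)
    return np.array(scores)
```

On a series with an abrupt mean shift, the score spikes at the change point and then relaxes as the discounted model adapts, which is the behavior a change-point trigger would threshold on.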
NASA Astrophysics Data System (ADS)
Leauthaud, C.; Demarty, J.; Cappelaere, B.; Grippa, M.; Kergoat, L.; Velluet, C.; Guichard, F.; Mougin, E.; Chelbi, S.; Sultan, B.
2015-06-01
Rainfall and climatic conditions are the main drivers of natural and cultivated vegetation productivity in the semiarid region of the Central Sahel. In a context of decreasing cultivable area per capita, understanding and predicting changes in the water cycle are crucial. Yet it remains challenging to project future climatic conditions in West Africa, since there is no consensus on the sign of future precipitation changes in climate model simulations. The Sahel region has experienced severe climatic changes in the past 60 years that provide a first basis for understanding the response of the water cycle to non-stationary conditions in this part of the world. The objective of this study was to better understand the response of the water cycle to highly variable climatic regimes in the Central Sahel, using historical climate records and the coupling of a land surface energy and water model with a vegetation model that together simulated the Sahelian water, energy and vegetation cycles. To do so, we relied on a reconstructed long-term climate series in Niamey, Republic of Niger, in which three precipitation regimes can be distinguished, with a relative deficit exceeding 25% for the driest period compared to the wettest. Two temperature scenarios (+2 and +4 °C), consistent with future warming scenarios, were superimposed on this climatic signal to generate six virtual future 20-year climate time series. Simulations by the two coupled models forced by these virtual scenarios showed a strong response of the water budget and its components to temperature and precipitation changes, including decreases in transpiration, runoff and drainage for all scenarios except those with the highest precipitation. Such climatic changes also strongly impacted soil temperature and moisture. This study illustrates the potential of using the strong climatic variations recorded in recent decades to better understand potential future climate variations.
Detection of a sudden change of the field time series based on the Lorenz system
Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan
2017-01-01
We conducted an exploratory study of the detection of a sudden change of the field time series based on the numerical solution of the Lorenz system. First, the time when the Lorenz path jumped between the regions on the left and right of the equilibrium point of the Lorenz system was quantitatively marked, and the sudden change time of the Lorenz system was obtained. Second, the numerical solution of the Lorenz system was regarded as a vector; thus, this solution could be considered a vector time series. We transformed the vector time series into a scalar time series using the vector inner product, considering the geometric and topological features of the Lorenz system path. Third, sudden changes in the resulting time series were detected using the sliding t-test method. Comparing the test results with the quantitatively marked times showed that the method could detect every sudden change of the Lorenz path; the method is therefore effective. Finally, we used the method to detect sudden changes in pressure field and temperature field time series, and obtained good results for both, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between field time series and vector time series; thus, we provide a new method for the detection of sudden changes in field time series. PMID:28141832
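The three-step procedure above (integrate the Lorenz system, reduce the vector path to a scalar series via an inner product, then apply a sliding t-test) can be sketched as follows. The reference vector, step size and window length are illustrative choices, not the paper's:

```python
import numpy as np

def lorenz_series(n=2000, dt=0.01, s=10.0, r=28.0, b=8.0 / 3.0):
    """Integrate the Lorenz system with 4th-order Runge-Kutta and reduce
    the vector path to a scalar series via an inner product with a fixed
    reference vector (an illustrative choice)."""
    def f(v):
        x, y, z = v
        return np.array([s * (y - x), x * (r - z) - y, x * y - b * z])
    v = np.array([1.0, 1.0, 1.0])
    path = np.empty((n, 3))
    for i in range(n):
        k1 = f(v); k2 = f(v + 0.5 * dt * k1)
        k3 = f(v + 0.5 * dt * k2); k4 = f(v + dt * k3)
        v = v + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
        path[i] = v
    ref = np.array([1.0, 1.0, 1.0])   # reference vector for the inner product
    return path, path @ ref

def sliding_t(series, w=50):
    """Sliding t-test: t statistic between adjacent windows of length w;
    large |t| marks a sudden change in the series mean."""
    t_stat = np.full(len(series), np.nan)
    for i in range(w, len(series) - w):
        a, b_ = series[i - w:i], series[i:i + w]
        sp = np.sqrt((a.var(ddof=1) + b_.var(ddof=1)) / w)
        t_stat[i] = (b_.mean() - a.mean()) / sp if sp > 0 else 0.0
    return t_stat
```

Peaks in |t| along the scalar series line up with the path's jumps between the two lobes of the attractor, which is the correspondence the paper verifies against the quantitatively marked jump times.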
The transport forecast - an important stage of transport management
NASA Astrophysics Data System (ADS)
Dragu, Vasile; Dinu, Oana; Oprea, Cristina; Alina Roman, Eugenia
2017-10-01
The transport system is a powerful system with varying loads in operation, arising from changes in freight and passenger traffic over different time periods. The variations are due to the specific conditions of organization and development of socio-economic activities, and their causes fall into three groups: economic, technical and organizational. Assessing transport demand variability supports proper forecasting and development of the transport system, given that the market price is determined by the equilibrium between supply and demand. Reducing transport demand variability through technical, organizational, administrative and legislative measures increases the efficiency and effectiveness of transport. The paper presents a new way of assessing future transport needs through dynamic series. Both researchers and practitioners in transport planning can benefit from the research results. This paper analyzes, in an original approach, how a good transport forecast can lead to better transport management, with significant effects on fully meeting transport demand in qualitative terms. The case study shows how dynamic statistical series can be used to identify the size of future demand addressed to the transport system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grenzeback, L. R.; Brown, A.; Fischer, M. J.
2013-03-01
Freight transportation demand is projected to grow to 27.5 billion tons in 2040, and to nearly 30.2 billion tons in 2050. This report describes the current and future demand for freight transportation in terms of tons and ton-miles of commodities moved by truck, rail, water, pipeline, and air freight carriers. It outlines the economic, logistics, transportation, and policy and regulatory factors that shape freight demand, the trends and 2050 outlook for these factors, and their anticipated effect on freight demand. After describing federal policy actions that could influence future freight demand, the report then summarizes the capabilities of available analytical models for forecasting freight demand. This is one in a series of reports produced as a result of the Transportation Energy Futures project, a Department of Energy-sponsored multi-agency effort to pinpoint underexplored strategies for reducing GHGs and petroleum dependence related to transportation.
NASA Astrophysics Data System (ADS)
Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio
2017-04-01
Assessing the impacts of potential future climate change scenarios for precipitation and temperature is essential to designing adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods to generate potential future scenarios for an Alpine catchment from historical data and from the climate model simulations performed in the frame of the EU CORDEX project. The initial information employed to define these downscaling approaches is the historical climatic data (taken from the Spain02 project for the period 1971-2000, with a spatial resolution of 12.5 km) and the future series provided by climate models for the horizon period 2071-2100. We have used information from nine climate model simulations (five different regional climate models (RCMs) nested to four different global climate models (GCMs)) from the European CORDEX project. In our application we have focused on the Representative Concentration Pathway (RCP) 8.5 emissions scenario, the most unfavorable scenario considered in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC). For each RCM we have generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and, for both of them, five different transformation techniques (first-moment correction; first- and second-moment correction; regression functions; quantile mapping using distribution-derived transformations; and quantile mapping using empirical quantiles). Ensembles of the obtained series are proposed to obtain more representative potential future climate scenarios for studying potential impacts.
In this work we propose a non-uniformly weighted combination of the future series, giving more weight to those coming from models (delta-change approaches), or combinations of models and techniques, that better approximate the basic and drought statistics of the historical data. A multi-objective analysis using basic statistics (mean, standard deviation and asymmetry coefficient) and drought statistics (duration, magnitude and intensity) has been performed to identify which models best fit the historical series. The drought statistics have been obtained from the Standardized Precipitation Index (SPI) series using the theory of runs. This analysis makes it possible to discriminate the best RCM and, within the bias-correction method, the best combination of model and correction technique. We have also analyzed the possibilities of using different stochastic weather generators to approximate the basic and drought statistics of the historical series. These analyses have been performed for our case study in both a lumped and a distributed way, in order to assess sensitivity to the spatial scale. The statistics of the future temperature series obtained with the different ensemble options are quite homogeneous, but precipitation shows a higher sensitivity to the adopted method and spatial scale. The global increments in the mean temperature values are 31.79 %, 31.79 %, 31.03 % and 31.74 % for the distributed bias-correction, distributed delta-change, lumped bias-correction and lumped delta-change ensembles, respectively, and in precipitation they are -25.48 %, -28.49 %, -26.42 % and -27.35 %, respectively. Acknowledgments: This research work has been partially supported by the GESINHIMPADAPT project (CGL2013-48424-C2-2-R) with Spanish MINECO funds. We would also like to thank the Spain02 and CORDEX projects for the data provided for this study, and the R package qmap.
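One of the five transformation techniques named above, quantile mapping using empirical quantiles, can be sketched in a few lines. This is a simplified illustration of the general technique, not the study's implementation (which used the R package qmap):

```python
import numpy as np

def quantile_map(obs_hist, mod_hist, mod_fut):
    """Empirical quantile mapping: correct a future model series by
    locating each value's percentile in the historical model series and
    mapping that percentile onto the observed historical distribution.
    A simplified illustration of the technique."""
    # Percentile of each future value within the historical model series
    q = np.searchsorted(np.sort(mod_hist), mod_fut) / len(mod_hist)
    q = np.clip(q, 0.0, 1.0)
    # Map those percentiles onto the observed historical distribution
    return np.quantile(obs_hist, q)
```

By construction, feeding the historical model series back through the mapping reproduces the observed distribution, which is the bias-correction property the method is built on; applying the same mapping to the 2071-2100 model series transfers the simulated climate signal onto the observed distribution.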
NASA Astrophysics Data System (ADS)
Haff, P. K.
2012-12-01
Technological modification of the earth's surface (e.g., agriculture, urbanization) is an old story in human history, but what about the future? The future of landscape in an accelerating technological world, beyond a relatively short time horizon, lies hidden behind an impenetrable veil of complexity. Sufficiently complex dynamics generates not only the trajectory of a variable of interest (e.g., vegetation cover) but also the environment in which that variable evolves (e.g., background climate). There is no way to anticipate what variables will define that environment—the dynamics creates its own variables. We are always open to surprise by a change of conditions we thought or assumed were fixed or by the appearance of new phenomena of whose possible existence we had been unaware or thought unlikely. This is especially true under the influence of technology, where novelty is the rule. Lack of direct long-term predictability of landscape change does not, however, mean we cannot say anything about its future. The presence of persistence (finite time scales) in a system means that prediction by a calibrated numerical model should be good for a limited period of time barring bad luck or faulty implementation. Short-term prediction, despite its limitations, provides an option for dealing with the longer-term future. If a computer-controlled car tries to drive itself from New York to Los Angeles, no conceivable (or possible) stand-alone software can be constructed to predict a priori the space-time trajectory of the vehicle. Yet the drive is normally completed easily by most drivers. The trip is successfully completed because each in a series of very short (linear) steps can be "corrected" on the fly by the driver, who takes her cues from the environment to keep the car on the road and headed toward its destination. 
This metaphor differs in a fundamental way from the usual notion of predicting geomorphic change, because it involves a goal—to reach a desired destination—whereas the natural evolution of landscape has no such goal. Goals will become an essential feature of landscape prediction. The presence of a goal potentially increases our ability to predict, provided it is possible to use feedback (i.e., management) to nudge the system back in the "right" direction when it starts to stray. Under a regime of accelerating technology, the closest we can get to predicting the longer-term future of landscape is adaptive management, which at large scale really amounts to geoengineering the system. The goal presumably would be to maintain a condition conducive to human well-being, for example to maintain a suitable fraction of global arable land. A successful "prediction" would be to stay within an envelope of states consistent with that goal. We cannot say, however, in what specific state the landscape will be at any time beyond the near future; this will depend on the future sequence of management decisions, which are, like the system they are managing, unpredictable, except shortly before they are implemented. The landscape of the future will thus likely be the result of a series of quick fixes to previous trends in landscape change. Similar comments apply to the prediction, or management, of climate. There is of course no guarantee that it will be possible to stay within the desired envelope of well-being.
Orsini, Luisa; Schwenk, Klaus; De Meester, Luc; Colbourne, John K.; Pfrender, Michael E.; Weider, Lawrence J.
2013-01-01
Evolutionary changes are determined by a complex assortment of ecological, demographic and adaptive histories. Predicting how evolution will shape the genetic structures of populations coping with current (and future) environmental challenges has principally relied on investigations through space, in lieu of time, because long-term phenotypic and molecular data are scarce. Yet, dormant propagules in sediments, soils and permafrost are convenient natural archives of population-histories from which to trace adaptive trajectories along extended time periods. DNA sequence data obtained from these natural archives, combined with pioneering methods for analyzing both ecological and population genomic time-series data, are likely to provide predictive models to forecast evolutionary responses of natural populations to environmental changes resulting from natural and anthropogenic stressors, including climate change. PMID:23395434
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan
A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.
Multiple Indicator Stationary Time Series Models.
ERIC Educational Resources Information Center
Sivo, Stephen A.
2001-01-01
Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…
Imaging the Ways to a Preferred Future.
ERIC Educational Resources Information Center
MacKenzie, Terri; Frommelt, Nancy
1985-01-01
Presents a series of exercises that can be used with any age level to stimulate visioning skills (e.g., dreaming, creating, intuiting, and imaging). The exercises focus on building imagination skills, guided imaging, envisioning the future that students would prefer, and creating a futures wheel. (DMM)
Applications of functional data analysis: A systematic review.
Ullah, Shahid; Finch, Caroline F
2013-03-19
Functional data analysis (FDA) is increasingly being used to better analyze, model and predict time series data. Key aspects of FDA include the choice of smoothing technique, data reduction, adjustment for clustering, functional linear modeling and forecasting methods. A systematic review using 11 electronic databases was conducted to identify FDA application studies published in the peer-reviewed literature during 1995-2010. Papers reporting methodological considerations only were excluded, as were non-English articles. In total, 84 FDA application articles were identified; 75.0% of the reviewed articles were published since 2005. Applications of FDA have appeared in a large number of publications across various fields of science, the majority related to biomedicine (21.4%). Overall, 72 studies (85.7%) provided information about the type of smoothing technique used, with B-spline smoothing (29.8%) being the most popular. Functional principal component analysis (FPCA) for extracting information from functional data was reported in 51 (60.7%) studies. One-quarter (25.0%) of the published studies used functional linear models to describe relationships between explanatory and outcome variables, and only 8.3% used FDA for forecasting time series data. Despite its clear benefits for analyzing time series data, full appreciation of the key features and value of FDA has been limited to date, though the applications show its relevance to many public health and biomedical problems. Wider application of FDA to all studies involving correlated measurements should allow better modeling of, and predictions from, such data in the future, especially as FDA makes no a priori assumptions about age and time effects.
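The most frequently reported FDA step in the review, functional principal component analysis, can be sketched with a discretized FPCA: center the sampled curves, then take an SVD. Smoothing (e.g., B-splines) would normally precede this step; the curves here are assumed pre-smoothed, and the code is an illustration of the technique, not any reviewed study's implementation:

```python
import numpy as np

def fpca(curves, n_components=2):
    """Discretized functional PCA on sampled curves (rows = observations,
    columns = evaluation points): center, then SVD. Curves are assumed
    to be pre-smoothed."""
    mean = curves.mean(axis=0)
    X = curves - mean
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    components = Vt[:n_components]          # principal functions (eigenfunctions)
    scores = X @ components.T               # per-curve component scores
    explained = S ** 2 / (S ** 2).sum()     # fraction of variance per component
    return mean, components, scores, explained[:n_components]
```

On curves built from two underlying modes plus small noise, the first two components capture nearly all the variance, and the per-curve scores provide the low-dimensional representation used in downstream functional linear models or forecasts.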
A SmallSat constellation mission architecture for a GRACE-type mission design
NASA Astrophysics Data System (ADS)
Deccia, C. M. A.; Nerem, R. S.; Yunck, T.
2017-12-01
The Gravity Recovery and Climate Experiment (GRACE) launched in 2002 and has been providing invaluable information on Earth's time-varying gravity field, and GRACE-FO will continue this time series. In this work, we focus on architectures for future post-GRACE-FO missions. Single pairs of satellites like GRACE and GRACE-FO are inherently limited in their spatio-temporal coverage. Full global coverage for a single pair can take up to 30 days at spatial resolutions of a few hundred kilometers, so a single satellite pair is unable to observe sub-monthly signals in Earth's time-varying gravity field (e.g., hydrologic signals). Small satellite systems are becoming increasingly affordable and will soon allow a constellation of GRACE-type satellites to be deployed, with the capability to range between multiple satellites. Here, using simulation studies, we investigate the performance of such a constellation for different numbers of satellites (N) and different orbital configurations, in order to understand the improved performance that might be gained from such future mission architectures.
Live Aircraft Encounter Visualization at FutureFlight Central
NASA Technical Reports Server (NTRS)
Murphy, James R.; Chinn, Fay; Monheim, Spencer; Otto, Neil; Kato, Kenji; Archdeacon, John
2018-01-01
Researchers at the National Aeronautics and Space Administration (NASA) have developed an aircraft data streaming capability that can be used to visualize live aircraft in near real-time. During a joint Federal Aviation Administration (FAA)/NASA Airborne Collision Avoidance System flight series, test sorties between unmanned aircraft and manned intruder aircraft were shown in real-time at NASA Ames' FutureFlight Central tower facility as a virtual representation of the encounter. This capability leveraged existing live surveillance, video, and audio data streams distributed through a Live, Virtual, Constructive test environment, then depicted the encounter from the point of view of any aircraft in the system, showing the proximity of the other aircraft. For the demonstration, position report data were sent to the ground from on-board sensors on the unmanned aircraft. The point of view can be changed dynamically, allowing encounters to be observed from all angles. Visualizing the encounters in real-time provides a safe and effective method for observing live flight testing and a strong alternative to traveling to the remote test range.
Dynamical behaviors of inter-out-of-equilibrium state intervals in Korean futures exchange markets
NASA Astrophysics Data System (ADS)
Lim, Gyuchang; Kim, SooYong; Kim, Kyungsik; Lee, Dong-In; Scalas, Enrico
2008-05-01
A recently discovered feature of financial markets, the two-phase phenomenon, is utilized to categorize a financial time series into two phases, namely equilibrium and out-of-equilibrium states. For out-of-equilibrium states, we analyze the time intervals at which the state is revisited. The power-law distribution of inter-out-of-equilibrium state intervals is shown, and we present an analogy with discrete-time heat bath dynamics, similar to random Ising systems. In the mean-field approximation, this model reduces to a one-dimensional multiplicative process. By varying global and local model parameters, the relationship between volatilities in financial markets and the interaction strengths between agents in the Ising model is investigated and discussed.
Ecosystems resilience to drought: indicators derived from time-series of Earth Observation data
NASA Astrophysics Data System (ADS)
Garcia, Monica; Fernández, Nestor; Delibes, Miguel
2013-04-01
Increasing our understanding of how ecosystems differ in their vulnerability to extreme climatic events such as drought is critical. Resilient ecosystems are able to cope with climatic perturbations while retaining the same essential function, structure and feedbacks. However, if the effect of a perturbation is amplified, abrupt shifts can occur, as in desertification processes. Empirical indicators of robustness and resilience to drought events could be developed from time series of Earth Observation (EO) data. So far, the information content of EO time series for monitoring ecosystem resilience has been underutilized, being mostly limited to detection of greening or rainfall use efficiency (RUE) trends at interannual time-scales. Detection of thresholds, shifts, extremes, and hysteresis processes is still in its infancy using EO data. Only recently have some studies begun to exploit this avenue of research using vegetation indices, with some controversy due to the substitution of space for time. In drylands, where ecosystem functioning is largely controlled by rainfall, a key variable for monitoring is evapotranspiration, as it connects the energy, water and carbon cycles. It can be estimated from EO data using a surface energy balance approach. In this work we propose the use of new empirical indicators of resilience to drought derived from EO time series. They are extracted from analyses of lagged cross-correlations between rainfall and evapotranspiration anomalies at several time-steps. This also helps elucidate whether an observed extreme ecological response can be attributed to a climate extreme. Additionally, increases in autocorrelation have been proposed to detect losses of resilience or changes in the capacity to recover from a perturbation. Our objective was to compare rates of recovery from drought of different ecosystems in the natural park of Doñana (Spain), composed of wetlands, pine forest, and shrublands with and without access to groundwater.
The recovery was characterized by (i) the duration of effects, (ii) resistance to change, and (iii) autocorrelation of the time series. Time series for 2000-2008 from the MODIS satellite and meteorological stations were used. Evapotranspiration was estimated from EO data using a contextual (triangle) surface energy balance approach. Analyses were performed at time-steps from 1 month up to 1 year. Among the four ecosystems, wetlands were the most resilient, with a faster rate of recovery from drought but at the same time greater transient responses. Perennial vegetation types showed more resistance to drought but higher persistence of effects into the following year, especially shrublands without access to groundwater. Drought effects in pine forests were minimal, as they access groundwater during dry periods. Our results suggest that in a future context of more extreme rainfall, the long-term success of vegetation types with access to the water table might depend on their capability to balance groundwater extractions and rainfall recharge. For vegetation types without access to the water table, success will depend on their recovery potential after a drought sequence of several years.
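The lagged cross-correlation indicator described above can be sketched as follows. This is an illustrative stand-in, not the study's code; the function name and the use of Pearson correlation at each lag are assumptions:

```python
from statistics import mean, pstdev

def lagged_xcorr(x, y, max_lag):
    """Pearson cross-correlation of two anomaly series at lags
    0..max_lag (y lagging x). The lag at which the correlation peaks
    is a simple proxy for how quickly one variable (e.g.
    evapotranspiration) responds to an anomaly in the other
    (e.g. rainfall)."""
    out = []
    for lag in range(max_lag + 1):
        a, b = x[:len(x) - lag], y[lag:]          # align b `lag` steps behind a
        ma, mb = mean(a), mean(b)
        sa, sb = pstdev(a), pstdev(b)
        cov = sum((u - ma) * (v - mb) for u, v in zip(a, b)) / len(a)
        out.append(cov / (sa * sb) if sa and sb else 0.0)
    return out
```

A response that peaks at a longer lag, or with a weaker peak, would indicate a slower or more muted coupling between rainfall and evapotranspiration anomalies.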
Efficient Algorithms for Segmentation of Item-Set Time Series
NASA Astrophysics Data System (ADS)
Chundi, Parvathi; Rosenkrantz, Daniel J.
We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
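The segmentation scheme outlined above can be sketched with a small dynamic program. The measure function (item-set intersection) and per-point difference (symmetric-difference size) used here are one concrete choice, not necessarily the paper's definitions, and all names are illustrative:

```python
def segment_difference(sets, measure=frozenset.intersection):
    """Difference between a segment's representative item set (here the
    intersection of its time points' sets, one possible measure
    function) and the item set at each time point in the segment."""
    rep = measure(*sets)
    return sum(len(rep ^ s) for s in sets)

def optimal_segmentation(series, k):
    """Dynamic program: split `series` (a list of frozensets) into `k`
    contiguous segments minimising the total segment difference.
    Returns (total cost, sorted list of segment start indices > 0)."""
    n = len(series)
    # cost[i][j] = segment difference of series[i..j]
    cost = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):
            cost[i][j] = segment_difference(series[i:j + 1])
    INF = float("inf")
    dp = [[INF] * n for _ in range(k + 1)]   # dp[m][j]: best cost of series[..j] in m segments
    cut = [[-1] * n for _ in range(k + 1)]
    for j in range(n):
        dp[1][j] = cost[0][j]
    for m in range(2, k + 1):
        for j in range(m - 1, n):
            for i in range(m - 2, j):
                c = dp[m - 1][i] + cost[i + 1][j]
                if c < dp[m][j]:
                    dp[m][j], cut[m][j] = c, i
    bounds, j = [], n - 1                    # recover segment boundaries
    for m in range(k, 1, -1):
        i = cut[m][j]
        bounds.append(i + 1)
        j = i
    return dp[k][n - 1], sorted(bounds)
```

Precomputing all segment-difference values and filling the table costs polynomial time; the paper's contribution is computing those values efficiently for each measure function.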
Uranium-Series Dating of the East Franklin Mountain's Fault Carbonates in El Paso, Texas
NASA Astrophysics Data System (ADS)
Garcia, V. H.; Ma, L.; Pavlis, T. L.; Hurtado, J. M., Jr.
2017-12-01
Direct dating of fault activity is a fundamentally important part of many paleoseismic studies and has potential implications for the quantity, magnitude, recurrence intervals, and timing of past and future earthquake occurrences. Faults in the Rio Grande Rift (RGR) in southern New Mexico and West Texas have often been overlooked in seismic hazard assessments due to inferred low tectonic rates and long recurrence intervals. However, there is geologic evidence from surface ruptures that at least 22 large earthquakes (M > 6.25) have occurred in the RGR within the last 10,000 years. The binational conurbation of the El Paso-Juarez region (home to 2.3 million people) lies in the southern extent of the RGR and is traversed by many Quaternary faults, which pose a potentially catastrophic hazard for the region. One fault in particular, the East Franklin Mountains fault (EFMF), is made up of many smaller fault segments that cross through heavily populated areas of the El Paso-Juarez region. Direct dating of past movement on a central segment of the EFMF is a fundamental and important piece of the puzzle in understanding when and how often seismic activity occurred on the fault. In this study, we applied Uranium-series (U-series) dating to fault carbonates collected from a trench dug on the central segment of the EFMF. Fault-related calcite precipitates and pedogenic carbonates from a nearby soil profile were collected to (1) constrain the timing of past fault activity and (2) understand the relationship and timing of pedogenic carbonate formation away from the EFMF. U-series dating reveals that pedogenic carbonates collected from colluvial wedges along the fault are approximately half the optically stimulated luminescence age of the deposits, suggesting the U-series dates record a relatively continuous accumulation of carbonates post-deposition.
U-series dates from within the EFMF, however, provided potentially the best estimates for the age of the most recent seismic event, with ages of 10-12 kyr, suggesting this method has potential for broader applications in paleoseismic studies.
Accelerating Into the Future: From 0 to GeV in a Few Centimeters (LBNL Summer Lecture Series)
Leemans, Wim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Accelerator and Fusion Research Division (AFRD) and Laser Optics and Accelerator Systems Integrated Studies (LOASIS)
2018-05-04
Summer Lecture Series 2008: By exciting electric fields in plasma-based waveguides, lasers accelerate electrons in a fraction of the distance conventional accelerators require. The Accelerator and Fusion Research Division's LOASIS program, headed by Wim Leemans, has used 40-trillion-watt laser pulses to deliver billion-electron-volt (1 GeV) electron beams within centimeters. Leemans looks ahead to BELLA, 10-GeV accelerating modules that could power a future linear collider.
A novel weight determination method for time series data aggregation
NASA Astrophysics Data System (ADS)
Xu, Paiheng; Zhang, Rong; Deng, Yong
2017-09-01
Aggregation in time series is of great importance in time series smoothing, prediction and other time series analysis processes, which makes it crucial to determine the weights in time series correctly and reasonably. In this paper, a novel method to obtain the weights in time series is proposed, in which we adopt the induced ordered weighted aggregation (IOWA) operator and the visibility graph averaging (VGA) operator and linearly combine the weights separately generated by the two operators. The IOWA operator is introduced to the weight determination of time series, through which the time decay factor is taken into consideration. The VGA operator generates weights with respect to the degree distribution in the visibility graph constructed from the corresponding time series, which reflects the relative importance of vertices in the time series. The proposed method is applied to two practical datasets to illustrate its merits. The aggregation of the Construction Cost Index (CCI) demonstrates the ability of the proposed method to smooth time series, while the aggregation of the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) illustrates how the proposed method maintains the variation tendency of the original data.
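A minimal sketch of the two-component weighting idea: degree-based weights from a natural visibility graph, linearly combined with exponentially decaying recency weights as a simple stand-in for the IOWA time-decay weights. The combination rule and the `alpha` and `decay` parameters are illustrative assumptions, not the paper's formulas:

```python
def visibility_degrees(x):
    """Degree of each node in the natural visibility graph of series x:
    points (a, x[a]) and (b, x[b]) are connected if every point between
    them lies strictly below the straight line joining them."""
    n = len(x)
    deg = [0] * n
    for a in range(n):
        for b in range(a + 1, n):
            if all(x[c] < x[a] + (x[b] - x[a]) * (c - a) / (b - a)
                   for c in range(a + 1, b)):
                deg[a] += 1
                deg[b] += 1
    return deg

def combined_weights(x, alpha=0.5, decay=0.9):
    """Linear combination of degree-based (VGA-style) weights and
    exponentially decaying recency weights (a simple stand-in for the
    IOWA time-decay component); alpha balances the two parts."""
    deg = visibility_degrees(x)
    w_vg = [d / sum(deg) for d in deg]                       # importance by graph degree
    raw = [decay ** (len(x) - 1 - t) for t in range(len(x))]  # newer points weigh more
    w_td = [r / sum(raw) for r in raw]
    return [alpha * a + (1 - alpha) * b for a, b in zip(w_vg, w_td)]
```

The weighted average `sum(w * v for w, v in zip(combined_weights(x), x))` would then serve as the aggregated (smoothed) value of the series.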
Future Aspiring Aviators, Primary: An Aviation Curriculum Guide K-3
DOT National Transportation Integrated Search
1995-01-01
Prepared ca. 1995. The Federal Aviation Administration is pleased to present the Aviation Education Teacher's Guide Series. The series includes four publications specifically designed as resources to those interested in aviation education. The guides...
NASA Astrophysics Data System (ADS)
Boulariah, Ouafik; Longobardi, Antonia; Meddi, Mohamed
2017-04-01
One of the major challenges scientists, practitioners and stakeholders are nowadays involved in is to provide the worldwide population with reliable water supplies while protecting, at the same time, the quality and quantity of freshwater ecosystems. Climate and land use changes undermine the balance between water demand and water availability, causing alteration of river flow regimes. Knowledge of the temporal and spatial variability of hydro-climatic variables is clearly helpful to plan drought and flood hazard mitigation strategies, but also to adapt them to future environmental scenarios. The present study relates to the coastal semi-arid Tafna catchment, located in the North-West of Algeria, within the Mediterranean basin. The aim is the investigation of the temporal variability of streamflow and rainfall indices in six sub-basins of the large Tafna catchment, attempting to relate streamflow and rainfall changes. Rainfall and streamflow time series were preliminarily tested for data quality and homogeneity through the coupled application of the two-tailed t test, Pettitt test and Cumsum test (significance levels of 0.1, 0.05 and 0.01). Subsequently, maximum annual daily rainfall and streamflow and average daily annual rainfall and streamflow time series were derived and tested for temporal variability through the application of the Mann-Kendall and Sen's tests. Overall, maximum annual daily streamflow time series exhibit a negative trend, which is however significant for only 30% of the stations. Maximum annual daily rainfall series also exhibit a negative trend, which is instead significant for 80% of the stations. In the case of average daily annual streamflow and rainfall, the tendency to decrease over time is unclear and, in both cases, appears significant for 60% of the stations.
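The Mann-Kendall test and Sen's slope used above can be sketched as follows (a basic implementation without tie correction, for illustration only; not the study's code):

```python
from math import sqrt
from statistics import NormalDist, median

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction): returns the S
    statistic, the normal-approximation Z score, and the two-sided
    p-value. Negative S indicates a decreasing tendency."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n) for j in range(i + 1, n))
    var = n * (n - 1) * (2 * n + 5) / 18
    z = 0.0 if s == 0 else (s - (1 if s > 0 else -1)) / sqrt(var)
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return s, z, p

def sens_slope(x):
    """Sen's slope estimator: the median of all pairwise slopes."""
    return median((x[j] - x[i]) / (j - i)
                  for i in range(len(x)) for j in range(i + 1, len(x)))
```

A significant negative Z at, say, the 0.05 level is what "significant negative trend" means for the streamflow and rainfall series above; Sen's slope then quantifies the rate of decrease.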
Ramseyer, Simon T; Helbling, Christoph; Lussi, Adrian
2015-06-01
In the present case series, the authors report on seven cases of erosively worn dentitions (98 posterior teeth) which were treated with direct resin composite. In all cases, both arches were restored using the so-called stamp technique. All patients were treated with standardized materials and protocols. Prior to treatment, a waxup was made on cast models to build up the loss of occlusion as well as to ensure the optimal future anatomy and function of the eroded teeth to be restored. During treatment, teeth were restored using silicone templates (i.e., two "stamps," one on the vestibular and one on the oral aspect of each tooth), which were filled with resin composite in order to transfer the planned future restoration (i.e., in the shape of the waxup) from the extra- to the intraoral situation. Baseline examinations were performed in all patients after treatment, and photographs as well as radiographs were taken. To evaluate the outcome, the modified United States Public Health Service (USPHS) criteria were used. The patients were re-assessed after a mean observation time of 40 months (40.8 ± 7.2 months). The overall outcome of the restorations was good, with almost exclusively "Alpha" scores given. Only the marginal integrity and the anatomical form received a "Charlie" score (10.2%) in two cases. Direct resin composite restorations made with the stamp technique are a valuable treatment option for restoring erosively worn dentitions.
NASA Astrophysics Data System (ADS)
Norman, S. P.; Hargrove, W. W.; Lee, D. C.; Spruce, J.
2013-12-01
Wildfires could provide a cost-effective means to maintain or restore some aspects of fire-adapted landscapes. Yet with the added influence of climate change and invasive species, wildfires may also facilitate or accelerate undesired type conversions. As megafires become increasingly common across portions of the US West, managers require a framework for long-term monitoring that integrates the trajectories of fire-prone landscapes and objectives, not just conditions immediately after a burn. Systematic use of satellite data provides an efficient cross-jurisdictional solution to this problem. Since 2000, MODIS technology has provided high-frequency, 240 m resolution observations of Earth. Using this data stream, the ForWarn system, developed through a partnership of the US Forest Service, NASA-Stennis and others, provides 46 estimates of the Normalized Difference Vegetation Index (NDVI) per year for the conterminous US. From this time series, a variety of secondary metrics have been derived, including median annual NDVI, amplitude, and phenological spikiness. Each is both a fire- and recovery-sensitive measure that allows managers to systematically track conditions with respect to either the pre-fire baseline or desired future conditions more adaptively. In dry interior forests where wildfires could be used to thin stands, recovery to untreated conditions may not be desired given fuels objectives or climate change. In more mesic systems, fire effects may be monitored as staged succession. With both coarse-filter monitoring and desired conditions in hand, managers can better recognize and prioritize problems in disturbance-prone landscapes.
Wakie, Tewodros; Evangelista, Paul H.; Jarnevich, Catherine S.; Laituri, Melinda
2014-01-01
We used correlative models with species occurrence points, Moderate Resolution Imaging Spectroradiometer (MODIS) vegetation indices, and topo-climatic predictors to map the current distribution and potential habitat of invasive Prosopis juliflora in Afar, Ethiopia. Time-series of MODIS Enhanced Vegetation Index (EVI) and Normalized Difference Vegetation Index (NDVI) data with 250 m spatial resolution were selected as remote sensing predictors for mapping distributions, while WorldClim bioclimatic products and topographic variables generated from the Shuttle Radar Topography Mission (SRTM) product were used to predict potential infestations. We ran Maxent models using non-correlated variables and the 143 species-occurrence points. Maxent-generated probability surfaces were converted into binary maps using the 10-percentile logistic threshold values. Performance of the models was evaluated using the area under the receiver-operating characteristic (ROC) curve (AUC). Our results indicate that the extent of P. juliflora invasion is approximately 3,605 km2 in the Afar region (AUC = 0.94), while the potential habitat for future infestations is 5,024 km2 (AUC = 0.95). Our analyses demonstrate that time-series of MODIS vegetation indices and species occurrence points can be used with the Maxent modeling software to map the current distribution of P. juliflora, while topo-climatic variables are good predictors of potential habitat in Ethiopia. Our results can quantify current and future infestations, and inform management and policy decisions for containing P. juliflora. Our methods can also be replicated for managing invasive species in other East African countries.
Current and future climate- and air pollution-mediated impacts on human health.
Doherty, Ruth M; Heal, Mathew R; Wilkinson, Paul; Pattenden, Sam; Vieno, Massimo; Armstrong, Ben; Atkinson, Richard; Chalabi, Zaid; Kovats, Sari; Milojevic, Ai; Stevenson, David S
2009-12-21
We describe a project to quantify the burden of heat and ozone on mortality in the UK, both for the present-day and under future emission scenarios. Mortality burdens attributable to heat and ozone exposure are estimated by combination of climate-chemistry modelling and epidemiological risk assessment. Weather forecasting models (WRF) are used to simulate the driving meteorology for the EMEP4UK chemistry transport model at 5 km by 5 km horizontal resolution across the UK; the coupled WRF-EMEP4UK model is used to simulate daily surface temperature and ozone concentrations for the years 2003, 2005 and 2006, and for future emission scenarios. The outputs of these models are combined with evidence on the ozone-mortality and heat-mortality relationships derived from epidemiological analyses (time series regressions) of daily mortality in 15 UK conurbations, 1993-2003, to quantify present-day health burdens. During the August 2003 heatwave period, elevated ozone concentrations > 200 microg m-3 were measured at sites in London and elsewhere. This and other ozone photochemical episodes cause breaches of the UK air quality objective for ozone. Simulations performed with WRF-EMEP4UK reproduce the August 2003 heatwave temperatures and ozone concentrations. There remains day-to-day variability in the high ozone concentrations during the heatwave period, which on some days may be explained by ozone import from the European continent.Preliminary calculations using extended time series of spatially-resolved WRF-EMEP4UK model output suggest that in the summers (May to September) of 2003, 2005 & 2006 over 6000 deaths were attributable to ozone and around 5000 to heat in England and Wales. The regional variation in these deaths appears greater for heat-related than for ozone-related burdens.Changes in UK health burdens due to a range of future emission scenarios will be quantified. 
These future emissions scenarios span a range of possible futures, from assuming current air quality legislation is fully implemented, to a more optimistic case with maximum feasible reductions, through to a more pessimistic case with continued strong economic growth and minimal implementation of air quality legislation. Elevated surface ozone concentrations during the 2003 heatwave period led to exceedances of the current UK air quality objective standards. A coupled climate-chemistry model is able to reproduce these temperature and ozone extremes. By combining model simulations of surface temperature and ozone with ozone- and heat-mortality relationships derived from an epidemiological regression model, we estimate present-day and future health burdens across the UK. Future air quality legislation may need to consider the risk of increases in future heatwaves.
Hanbury, Andria; Wallace, Louise; Clark, Michael
2009-09-01
The aim of this study was to test the effectiveness of a theory of planned behaviour intervention to increase adherence of community mental health professionals to a national suicide prevention guideline. Routinely collected audit adherence data from an intervention and control site were collected and analysed using time series analysis to test whether the intervention significantly increased adherence. The effects of a local and national event on adherence were also examined. A Theory of Planned Behaviour (TPB) questionnaire, developed from interview findings, was administered to the health professionals. Subjective norms were found to be the most significant predictor of intention to adhere to the guideline, and were targeted with an interactive educational intervention. Time series analysis applied to routinely collected audit adherence data was used to test intervention effectiveness. The TPB accounted for 58% of the variance in intention to adhere, with subjective norms the only significant predictor. The intervention did not significantly increase adherence; however, the national and local events were found to have significantly increased adherence. The TPB was a useful framework for exploring barriers to adherence; however, this did not translate into an effective intervention. Future research should seek collaboration with local experts, and use this information in combination with the TPB, to develop interventions. Collaborative research with experts in pedagogy may also help to develop more effective interventions, particularly education-based interventions that require adult learning.
Digitalizing historical high resolution water level data: Challenges and opportunities
NASA Astrophysics Data System (ADS)
Holinde, Lars; Hein, Hartmut; Barjenbruch, Ulrich
2017-04-01
Historical tide-gauge data offer the opportunity to determine variations in key characteristics of water level data and to analyse past extreme events (storm surges). This information is important for calculating future trends and scenarios. But there are challenges involved, due to the extensive effort needed to digitalize gauge sheets and quality-control the resulting historical data. Under these conditions, two main sources of inaccuracy in historical time series can be identified. First are several challenges arising from the digitalization of the historical data, e.g. the general quality of the sheets, multiple crossing lines of the observed water levels, and additional comments on the sheet describing problems or giving extra information about the measurements. Second are problems during the measurements themselves. These can include incorrect positioning of the sheets, trouble with the tide-gauge, and maintenance. Errors resulting from these problems can be, e.g., flat lines, discontinuities and outliers. In particular, the characterization of outliers has to be conducted carefully, to distinguish between true outliers and genuine extreme events. Methods for the quality control process involve the use of statistics, machine learning and neural networks. These will be described and applied to three different time series from tide gauge stations at the coast of Lower Saxony, Germany. Resulting difficulties and outcomes of the quality control process will be presented and explained. Furthermore, we will present a first glance at analyses of these time series.
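A toy version of the quality-control step might flag isolated spikes and flat runs while leaving sustained excursions (candidate storm surges) untouched. All thresholds and names here are illustrative assumptions, not the methods actually used in the study:

```python
from statistics import median

def flag_anomalies(levels, window=5, spike=1.0, flat_run=4):
    """Flag isolated spikes (candidate digitalization errors) and flat
    runs (candidate stuck-gauge readings) in a water-level series.
    A sustained excursion of many consecutive high values is left
    unflagged, since it may be a real storm surge."""
    n = len(levels)
    spikes, flats = [], []
    # spike: value far from the median of its neighbours
    for i in range(n):
        lo, hi = max(0, i - window // 2), min(n, i + window // 2 + 1)
        neighbours = levels[lo:i] + levels[i + 1:hi]
        if neighbours and abs(levels[i] - median(neighbours)) > spike:
            spikes.append(i)
    # flat line: a run of identical consecutive values
    run = 1
    for i in range(1, n + 1):
        if i < n and levels[i] == levels[i - 1]:
            run += 1
        else:
            if run >= flat_run:
                flats.extend(range(i - run, i))
            run = 1
    return spikes, flats
```

Flagged indices would then be handed to a human or a learned classifier for the real-outlier-versus-extreme-event decision described above.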
Analysis of time series for postal shipments in Regional VII East Java Indonesia
NASA Astrophysics Data System (ADS)
Kusrini, DE; Ulama, B. S. S.; Aridinanti, L.
2018-03-01
The change in the number of goods delivered through PT. Pos Regional VII East Java Indonesia indicates that the trend of increasing and decreasing delivery of documents and non-documents is strongly influenced by conditions outside of PT. Pos Regional VII East Java Indonesia, so that predicting the number of documents and non-documents requires a model that can accommodate this. Since the monthly time series plot shows fluctuations over 2013-2016, modeling was carried out using ARIMA or seasonal ARIMA, and the best model was selected based on the smallest AIC value. The analysis of the number of shipments of each product sent through the Sub-Regional Postal Office VII East Java covers 5 of the 26 post offices in the territory. The largest number of shipments occurs for the PPB (Paket Pos Biasa, regular package shipment/non-document) and SKH (Surat Kilat Khusus, Special Express Mail/document) products. The time series models generated are largely random walk models, meaning that the number of future shipments is influenced by random effects that are difficult to predict. Some are AR and MA models, except for express shipment products with the Malang post office destination, which follow a seasonal ARIMA model with lags 6 and 12. This means that the number of items in a given month is affected by the number of items in the previous 6 months.
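For products that follow a random walk, the point forecast is simply the last observed value, with uncertainty that widens over the horizon. A minimal sketch using a bootstrap of historical differences (the interval construction and all names are illustrative, not the study's method):

```python
import random

def random_walk_forecast(history, horizon, n_paths=1000, seed=0):
    """For a random-walk model, the point forecast is the last
    observation; uncertainty grows with the horizon. Simulate future
    paths by resampling historical one-step differences (a simple
    bootstrap) and report an approximate 90% interval."""
    rng = random.Random(seed)
    diffs = [b - a for a, b in zip(history, history[1:])]
    last = history[-1]
    finals = []
    for _ in range(n_paths):
        v = last
        for _ in range(horizon):
            v += rng.choice(diffs)   # one bootstrapped step
        finals.append(v)
    finals.sort()
    return last, (finals[int(0.05 * n_paths)], finals[int(0.95 * n_paths)])
```

This illustrates why the abstract calls random-walk shipment counts "difficult to predict": the best point forecast never moves, while the interval keeps widening.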
Sea level data and techniques for detecting vertical crustal movements
NASA Technical Reports Server (NTRS)
Lennon, G. W.
1978-01-01
An attempt is made to survey problems, requirements, and the outlook for the future in the study of sea level time series so as to determine the relative movement of land and sea levels. The basic aim is to eliminate from the record the contributions from whatever marine dynamic phenomena respond to treatment, allowing the secular element to be identified with optimum clarity. Nevertheless the concept of sea level perturbation varies according to regional experience. The recent work of the Permanent Service for Mean Sea Level helps to eliminate geodetic noise from the series and makes it possible, perhaps, to treat the global mean sea level data bank so as to define eustatic changes in ocean volume which, in the present context, may be regarded as the final goal, allowing the identification of vertical crustal motion itself.
Shaping low-thrust trajectories with thrust-handling feature
NASA Astrophysics Data System (ADS)
Taheri, Ehsan; Kolmanovsky, Ilya; Atkins, Ella
2018-02-01
Shape-based methods are becoming popular in low-thrust trajectory optimization due to their fast computation speeds. In existing shape-based methods, constraints are treated at the acceleration level but not at the thrust level. These two constraint types are not equivalent, since spacecraft mass decreases over time as fuel is expended. This paper develops a shape-based method, based on a Fourier series approximation, that is capable of representing trajectories defined in spherical coordinates and that enforces thrust constraints. An objective function can be incorporated to minimize overall mission cost, i.e., achieve minimum ΔV. A representative mission from Earth to Mars is studied. The proposed Fourier series technique is demonstrated to be capable of generating feasible and near-optimal trajectories. These attributes can facilitate future low-thrust mission designs where different trajectory alternatives must be rapidly constructed and evaluated.
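The core idea, representing a trajectory coordinate as a truncated Fourier series whose derivatives are available analytically, then checking the constraint at the thrust rather than the acceleration level, can be sketched in one dimension. This is a simplified illustration, not the paper's formulation; all names are assumptions:

```python
from math import pi, sin, cos

def fourier_series(coeffs, T):
    """Return r(t), r'(t), r''(t) for a truncated Fourier series
    r(t) = a0 + sum_n (a_n cos(n*w*t) + b_n sin(n*w*t)), w = 2*pi/T.
    coeffs = (a0, [(a_1, b_1), (a_2, b_2), ...]). Derivatives come
    from term-by-term analytic differentiation."""
    a0, ab = coeffs
    w = 2 * pi / T
    def r(t):
        return a0 + sum(a * cos(n * w * t) + b * sin(n * w * t)
                        for n, (a, b) in enumerate(ab, 1))
    def dr(t):
        return sum(n * w * (-a * sin(n * w * t) + b * cos(n * w * t))
                   for n, (a, b) in enumerate(ab, 1))
    def d2r(t):
        return sum(-(n * w) ** 2 * (a * cos(n * w * t) + b * sin(n * w * t))
                   for n, (a, b) in enumerate(ab, 1))
    return r, dr, d2r

def thrust_feasible(d2r, mass, t_grid, t_max):
    """Check the thrust (not just acceleration) constraint along the
    shape: |m(t) * r''(t)| <= T_max, with mass a function of time since
    propellant is spent as the trajectory progresses."""
    return all(abs(mass(t) * d2r(t)) <= t_max for t in t_grid)
```

With a constant mass the thrust and acceleration constraints coincide; once `mass(t)` decreases, a shape that satisfies the acceleration bound can still violate the thrust bound early in the mission, which is the distinction the paper addresses.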
Parallel optimization of signal detection in active magnetospheric signal injection experiments
NASA Astrophysics Data System (ADS)
Gowanlock, Michael; Li, Justin D.; Rude, Cody M.; Pankratius, Victor
2018-05-01
Signal detection and extraction requires substantial manual parameter tuning at different stages in the processing pipeline. Time-series data depends on domain-specific signal properties, necessitating unique parameter selection for a given problem. The large potential search space makes this parameter selection process time-consuming and subject to variability. We introduce a technique to search and prune such parameter search spaces in parallel and select parameters for time series filters using breadth- and depth-first search strategies to increase the likelihood of detecting signals of interest in the field of magnetospheric physics. We focus on studying geomagnetic activity in the extremely and very low frequency ranges (ELF/VLF) using ELF/VLF transmissions from Siple Station, Antarctica, received at Québec, Canada. Our technique successfully detects amplified transmissions and achieves substantial speedup performance gains as compared to an exhaustive parameter search. We present examples where our algorithmic approach reduces the search from hundreds of seconds down to less than 1 s, with a ranked signal detection in the top 99th percentile, thus making it valuable for real-time monitoring. We also present empirical performance models quantifying the trade-off between the quality of signal recovered and the algorithm response time required for signal extraction. In the future, improved signal extraction in scenarios like the Siple experiment will enable better real-time diagnostics of conditions of the Earth's magnetosphere for monitoring space weather activity.
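A stripped-down version of the staged search-and-prune idea: sweep the parameter grid one parameter at a time, keeping only the best few partial candidates before expanding the next parameter. This is a generic sketch, not the authors' algorithm; `keep`, the scoring interface, and the breadth-first staging are assumptions:

```python
def prune_search(grid, score, keep=2):
    """Breadth-first sweep over a parameter grid, pruning all but the
    `keep` best candidates at each stage before expanding the next
    parameter. `grid` is a list of per-parameter value lists; `score`
    maps a (possibly partial) parameter tuple to a number, higher
    being better -- e.g. a detection quality metric for a time-series
    filter configured with those parameters."""
    frontier = [()]
    for values in grid:
        expanded = [cand + (v,) for cand in frontier for v in values]
        expanded.sort(key=score, reverse=True)   # best partial candidates first
        frontier = expanded[:keep]               # prune the rest
    return frontier[0]
```

Pruning trades exhaustiveness for speed, which is the quality-versus-response-time trade-off the abstract quantifies; an exhaustive search scores every combination, while this sketch scores only a thin frontier.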
NASA Astrophysics Data System (ADS)
Hermosilla, Txomin; Wulder, Michael A.; White, Joanne C.; Coops, Nicholas C.; Hobart, Geordie W.
2017-12-01
The use of time series satellite data allows for the temporally dense, systematic, transparent, and synoptic capture of land dynamics over time. Subsequent to the opening of the Landsat archive, several time series approaches for characterizing landscape change have been developed, often representing a particular analytical time window. The information richness and widespread utility of these time series data have created a need to maintain the currency of time series information via the addition of new data, as it becomes available. When an existing time series is temporally extended, it is critical that previously generated change information remains consistent, thereby not altering reported change statistics or science outcomes based on that change information. In this research, we investigate the impacts and implications of adding additional years to an existing 29-year annual Landsat time series for forest change. To do so, we undertook a spatially explicit comparison of the 29 overlapping years of a time series representing 1984-2012, with a time series representing 1984-2016. Surface reflectance values, and presence, year, and type of change were compared. We found that the addition of years to extend the time series had minimal effect on the annual surface reflectance composites, with slight band-specific differences (r ≥ 0.1) in the final years of the original time series being updated. The area of stand replacing disturbances and determination of change year are virtually unchanged for the overlapping period between the two time-series products. Over the overlapping temporal period (1984-2012), the total area of change differs by 0.53%, equating to an annual difference in change area of 0.019%. Overall, the spatial and temporal agreement of the changes detected by both time series was 96%. Further, our findings suggest that the entire pre-existing historic time series does not need to be re-processed during the update process. 
Critically, given the time series change detection and update approach followed here, science outcomes or reports representing one temporal epoch can be considered stable and will not be altered when a time series is updated with newly available data.
Highly comparative time-series analysis: the empirical structure of time series and their methods.
Fulcher, Ben D; Little, Max A; Jones, Nick S
2013-06-06
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
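The feature-based organization described above can be illustrated with a toy sketch: each series is reduced to a small feature vector, and similar series are retrieved by distance in z-scored feature space. The four features, the library, and the query series are all illustrative inventions; the published work uses thousands of features and methods.

```python
import numpy as np

def features(x):
    """Toy reduced representation: mean, std, lag-1 autocorrelation, skewness."""
    x = np.asarray(x, dtype=float)
    xc = x - x.mean()
    ac1 = np.dot(xc[:-1], xc[1:]) / np.dot(xc, xc)
    skew = np.mean(xc ** 3) / (x.std() ** 3)
    return np.array([x.mean(), x.std(), ac1, skew])

def nearest_series(library, query):
    """Index of the library series most similar to the query in z-scored
    feature space (a stand-in for retrieving 'alternative' data or methods)."""
    F = np.array([features(s) for s in library])
    mu, sd = F.mean(axis=0), F.std(axis=0) + 1e-12
    d = np.linalg.norm((F - mu) / sd - (features(query) - mu) / sd, axis=1)
    return int(np.argmin(d))

rng = np.random.default_rng(0)
white = rng.normal(size=500)                    # uncorrelated noise
walk = np.cumsum(rng.normal(size=500))          # strongly autocorrelated
sine = np.sin(np.linspace(0, 20 * np.pi, 500))  # deterministic oscillation
lib = [white, walk, sine]
query = rng.normal(size=500)                    # a new white-noise series
print(nearest_series(lib, query))               # → 0 (the white-noise series)
```

Even this crude representation separates noise, random walks, and oscillations; the paper's contribution is doing this at scale with thousands of features and algorithms.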
Flexible risk metrics for identifying and monitoring conservation-priority species
Stanton, Jessica C.; Semmens, Brice X.; McKann, Patrick C.; Will, Tom; Thogmartin, Wayne E.
2016-01-01
Region-specific conservation programs should have objective, reliable metrics for species prioritization and progress evaluation that are customizable to the goals of a program, easy to comprehend and communicate, and standardized across time. Regional programs may have vastly different goals, spatial coverage, or management agendas, and one-size-fits-all schemes may not always be the best approach. We propose a quantitative and objective framework for generating metrics for prioritizing species that is straightforward to implement and update, customizable to different spatial resolutions, and based on readily available time-series data. This framework is also well suited to handling missing data and observer error. We demonstrate this approach using North American Breeding Bird Survey (NABBS) data to identify conservation priority species from a list of over 300 landbirds across 33 bird conservation regions (BCRs). To highlight the flexibility of the framework for different management goals and timeframes, we calculate two different metrics. The first identifies species that may be inadequately monitored by NABBS protocols in the near future (TMT, time to monitoring threshold), and the other identifies species likely to decline significantly in the near future based on recent trends (TPD, time to percent decline). Within the individual BCRs we found up to 45% (mean 28%) of the species analyzed had overall declining population trajectories, which could result in up to 37 species declining below a minimum NABBS monitoring threshold in at least one currently occupied BCR within the next 50 years. Additionally, up to 26% (mean 8%) of the species analyzed within the individual BCRs may decline by 30% within the next decade. Conservation workers interested in conserving avian diversity and abundance within these BCRs can use these metrics to plan alternative monitoring schemes or highlight the urgency of those populations experiencing the fastest declines.
However, this framework is adaptable to many taxa besides birds where abundance time-series data are available.
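Under a constant exponential trend, both metrics reduce to closed-form expressions; the sketch below illustrates that arithmetic only. The paper estimates trends and their uncertainty from NABBS time series; the rates, abundances, and thresholds here are hypothetical.

```python
import math

def time_to_percent_decline(annual_rate, percent):
    """Years until a population declines by `percent`, assuming a constant
    exponential trend `annual_rate` (e.g. -0.03 for -3 %/yr).
    Returns math.inf for non-declining trends."""
    if annual_rate >= 0:
        return math.inf
    return math.log(1 - percent / 100) / math.log(1 + annual_rate)

def time_to_monitoring_threshold(n0, threshold, annual_rate):
    """Years until abundance n0 falls below a minimum monitoring threshold."""
    if n0 <= threshold:
        return 0.0
    if annual_rate >= 0:
        return math.inf
    return math.log(threshold / n0) / math.log(1 + annual_rate)

# A population declining 3 %/yr takes ln(0.7)/ln(0.97) ≈ 11.7 years to lose 30 %:
print(round(time_to_percent_decline(-0.03, 30), 1))  # → 11.7
```

In practice these times would be reported with credible intervals propagated from the trend estimates, which this deterministic sketch omits.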
Costagli, Mauro; Waggoner, R Allen; Ueno, Kenichi; Tanaka, Keiji; Cheng, Kang
2009-04-15
In functional magnetic resonance imaging (fMRI), even subvoxel motion dramatically corrupts the blood oxygenation level-dependent (BOLD) signal, invalidating the assumption that intensity variation in time is primarily due to neuronal activity. Thus, correction of the subject's head movements is a fundamental step to be performed prior to data analysis. Most motion correction techniques register a series of volumes assuming that rigid body motion, characterized by rotational and translational parameters, occurs. Unlike the most widely used applications for fMRI data processing, which correct motion in the image domain by numerically estimating rotational and translational components simultaneously, the algorithm presented here operates in a three-dimensional k-space, to decouple and correct rotations and translations independently, offering new ways and more flexible procedures to estimate the parameters of interest. We developed an implementation of this method in MATLAB, and tested it on both simulated and experimental data. Its performance was quantified in terms of square differences and center of mass stability across time. Our data show that the algorithm proposed here successfully corrects for rigid-body motion, and its employment in future fMRI studies is feasible and promising.
A new approach for evaluating flexible working hours.
Giebel, Ole; Janssen, Daniela; Schomann, Carsten; Nachreiner, Friedhelm
2004-01-01
Recent studies on flexible working hours show that at least some of these working time arrangements seem to be associated with impairing effects on health and well-being. According to the available evidence, the variability of working hours seems to play an important role. The question, however, is how this variability can be assessed and used to explain or predict impairments. Based on earlier methods used to assess shift-work effects, a time series analysis approach was applied to flexible working hours. Four weeks of working-hours data from 137 respondents, derived from a survey on flexible work hours involving 15 companies from different production and service sectors in Germany, were converted to time series and analyzed by spectral analysis. A cluster analysis of the resulting power spectra yielded 5 clusters of flexible work hours. Analyzing these clusters for differences in reported impairments showed that workers whose schedules suppressed circadian and weekly rhythms experienced the severest impairments, especially in circadian-controlled functions like sleep and digestion. The results thus indicate that analyzing the periodicity of flexible working hours is a promising approach for predicting impairments, which should be investigated further in the future.
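The spectral step can be sketched as follows: a working-hours record is encoded as an hourly 0/1 series, Fourier-transformed, and the dominant periodicity read off the power spectrum. The regular 09:00-17:00, Monday-Friday schedule below is a made-up example; the study applied this to reported flexible schedules and then clustered the resulting spectra.

```python
import numpy as np

hours = 24 * 28                      # four weeks at hourly resolution
t = np.arange(hours)
# Hypothetical regular schedule: working 09:00-17:00, Monday-Friday
working = ((t % 24 >= 9) & (t % 24 < 17) & (t % 168 < 120)).astype(float)

x = working - working.mean()                 # remove the DC component
power = np.abs(np.fft.rfft(x)) ** 2
freqs = np.fft.rfftfreq(hours, d=1.0)        # cycles per hour

# Period (in hours) of the strongest spectral component:
peak_period = 1.0 / freqs[np.argmax(power[1:]) + 1]
print(round(peak_period, 3))                 # → 24.0
```

For a regular schedule the circadian (24 h) line dominates, with a secondary weekly (168 h) line; highly variable schedules suppress both, which is the property the clusters in the study pick up.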
ERIC Educational Resources Information Center
Ngan, Chun-Kit
2013-01-01
Making decisions over multivariate time series is an important topic which has gained significant interest in the past decade. A time series is a sequence of data points which are measured and ordered over uniform time intervals. A multivariate time series is a set of multiple, related time series in a particular domain in which domain experts…
Magnetohydrodynamic modelling of solar disturbances in the interplanetary medium
NASA Astrophysics Data System (ADS)
Dryer, M.
1985-12-01
A scientifically constructed series of interplanetary magnetohydrodynamic models is presented that comprises the foundation for a composite solar-terrestrial environment model. These models, unique in the field of solar wind physics, include both 2-1/2D and 3D time-dependent codes that will lead to future operational status. We have also developed a geomagnetic storm forecasting strategy, referred to as the Solar Terrestrial Environment Model (STEM/2000), whereby these models would be appended in modular fashion to solar, magnetosphere, ionosphere, thermosphere, and neutral atmosphere models. We stress that these models, while not yet appropriate for operational use, outline a strategy or blueprint for the future. This strategy, if implemented in its essential features, offers a high probability for technology transfer from theory to operational testing within approximately a decade. It would ensure that real-time observations drive physically based models, the outputs of which would be used by space environment forecasters.
Projecting crop yield in northern high latitude area.
Matsumura, Kanichiro
2014-01-01
Changing climatic conditions on seasonal and longer time scales influence agricultural production. Improvements in soil and fertilizer are strong factors in agricultural production, but production is influenced by climatic conditions even in highly developed countries. Projections that rely on only a few predictors are therefore particularly valuable. Monthly temperature and precipitation, wintertime 500 hPa geopotential height, and the previous year's yield are used as predictors to forecast spring wheat yield in advance. Canadian small agricultural divisions (SADs) are used for the analysis. Each SAD is composed of a collection of Canadian Agricultural Regions (CARs) with similar weather and growing conditions. Spring wheat yields in each CAR are forecast from the following variables: (a) the previous year's yield, (b) climate conditions in the earlier stages of the growing season and, (c) the previous year's wintertime northern hemisphere 500 hPa geopotential height field. Arctic outflow events in the Okanagan Valley in Canada are associated with episodes of extremely low temperatures during wintertime. Principal component analysis (PCA) is applied to wintertime northern hemisphere 500 hPa geopotential height anomalies. The spatial pattern of PCA mode 1 corresponds to the Arctic Oscillation, which influences the prevailing westerlies; the meandering of the westerlies in turn influences climatic conditions. A spatial similarity is found between the composite of 500 hPa geopotential height anomalies for the top five wintertime Arctic outflow event years and the spatial pattern of mode 3. Mode 3's spatial pattern resembles the Pacific/North American (PNA) pattern, which describes the variation of the atmospheric circulation over the Pacific Ocean and North America. Climate conditions from April to June and May to July, mode 3's time coefficients, and the previous year's yield are used to forecast spring wheat yield in each SAD.
A cross-validation procedure that generates eight sets of models for the eight validation periods is used. To assess agreement between observed and calculated values, the root mean squared error skill score (RMSE SS) is used, with the persistence model serving as the reference benchmark. The results show that SADs near the USA border achieve better RMSE SS values, and mode 3's time coefficients can be a useful predictor, especially for inland provinces such as Manitoba. Among the 27 Canadian Prairie SADs with complete yield data, 67% of Alberta's SADs, 86% of Manitoba's SADs, and 77% of Saskatchewan's SADs achieve positive skill scores. In each SAD, a near-future yield projection is calculated by applying the 2013 predictors to the eight sets of models and averaging the eight resulting forecasts. A series of outputs, including the forecast yield value for each SAD, is provided through a smartphone application. A system for providing climatic conditions at a point, developed with the permission of the Climatic Research Unit, University of East Anglia, is proposed for patenting. Several patented systems resemble the one proposed in this paper, but they differ in essence. The proposed system consists of two parts: the first estimates equations from time series data; the second acquires the latest climatic conditions, applies them to the estimated equations, and calculates a future projection. If the procedure is refined and dedicated devices are developed, this series of ideas could be patented. As future work, a crop index for Hokkaido is also introduced.
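A common form of the skill score against a persistence benchmark is SS = 1 − RMSE_model / RMSE_persistence, positive when the model beats persistence. The sketch below assumes that form (the paper's exact formula may differ) and uses invented yield numbers.

```python
import numpy as np

def rmse(pred, obs):
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

def rmse_skill_score(model_pred, obs, persistence_pred):
    """RMSE skill score against a persistence reference:
    1 - RMSE_model / RMSE_persistence. Positive values mean the model
    beats the 'next year equals this year' benchmark."""
    return 1.0 - rmse(model_pred, obs) / rmse(persistence_pred, obs)

obs = [2.1, 2.4, 2.2, 2.6, 2.8]          # observed yields (t/ha), illustrative
persistence = [2.0, 2.1, 2.4, 2.2, 2.6]  # previous year's yield as forecast
model = [2.2, 2.3, 2.3, 2.5, 2.7]        # hypothetical regression forecast
print(round(rmse_skill_score(model, obs, persistence), 3))  # → 0.617
```

A score of 0 means no improvement over persistence, which is why positive values in the border SADs are the headline result of the abstract.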
De Keersmaecker, Wanda; Lhermitte, Stef; Honnay, Olivier; Farifteh, Jamshid; Somers, Ben; Coppin, Pol
2014-07-01
Increasing frequency of extreme climate events is likely to impose increased stress on ecosystems and to jeopardize the services that ecosystems provide. It is therefore of major importance to assess the effects of extreme climate events on the temporal stability (i.e., the resistance, the resilience, and the variance) of ecosystem properties. Most time series of ecosystem properties are, however, affected by varying data characteristics, uncertainties, and noise, which complicate the comparison of ecosystem stability metrics (ESMs) between locations. There is thus a strong need for a more comprehensive understanding of the reliability of stability metrics and of how they can be used to compare ecosystem stability globally. The objective of this study was to evaluate the performance of temporal ESMs based on time series of the Moderate Resolution Imaging Spectroradiometer derived Normalized Difference Vegetation Index of 15 global land-cover types. We provide a framework (i) to assess the reliability of ESMs as a function of data characteristics, uncertainties, and noise and (ii) to integrate reliability estimates in future global studies of ecosystem stability against climate disturbances. The performance of our framework was tested through (i) a global ecosystem comparison and (ii) a comparison of ecosystem stability in response to the 2003 drought. The results show the influence of data quality on the accuracy of ecosystem stability metrics. White noise, biased noise, and trends have a stronger effect on the accuracy of stability metrics than the length of the time series, temporal resolution, or amount of missing values. Moreover, we demonstrate the importance of integrating reliability estimates to interpret stability metrics within confidence limits.
Based on these confidence limits, other studies dealing with specific ecosystem types or locations can be put into context, and a more reliable assessment of ecosystem stability against environmental disturbances can be obtained. © 2013 John Wiley & Sons Ltd.
Multiscale structure of time series revealed by the monotony spectrum.
Vamoş, Călin
2017-03-01
Observation of complex systems produces time series with specific dynamics at different time scales. The majority of the existing numerical methods for multiscale analysis first decompose the time series into several simpler components and the multiscale structure is given by the properties of their components. We present a numerical method which describes the multiscale structure of arbitrary time series without decomposing them. It is based on the monotony spectrum defined as the variation of the mean amplitude of the monotonic segments with respect to the mean local time scale during successive averagings of the time series, the local time scales being the durations of the monotonic segments. The maxima of the monotony spectrum indicate the time scales which dominate the variations of the time series. We show that the monotony spectrum can correctly analyze a diversity of artificial time series and can discriminate the existence of deterministic variations at large time scales from the random fluctuations. As an application we analyze the multifractal structure of some hydrological time series.
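A simplified sketch of the idea: split the series into maximal monotonic runs, record their mean amplitude and mean local time scale (duration), and repeat under successive moving-average smoothings. This illustrates the definitions given above with white noise; it is not the published algorithm.

```python
import numpy as np

def monotonic_segments(x):
    """Split a series into maximal monotonic runs; return the mean
    absolute amplitude and mean duration of the runs."""
    sign = np.sign(np.diff(x))
    # indices where the sign of the increment changes = turning points
    turns = np.flatnonzero(sign[1:] * sign[:-1] < 0) + 1
    bounds = np.concatenate(([0], turns, [len(x) - 1]))
    amps = np.abs(x[bounds[1:]] - x[bounds[:-1]])
    durs = np.diff(bounds).astype(float)
    return amps.mean(), durs.mean()

def monotony_spectrum(x, max_window=50, step=2):
    """Mean segment amplitude vs. mean local time scale under successive
    moving-average smoothings (a simplified sketch of the method)."""
    spectrum = []
    for w in range(1, max_window, step):
        xs = np.convolve(x, np.ones(w) / w, mode="valid")
        spectrum.append(monotonic_segments(xs))
    return np.array(spectrum)   # columns: mean amplitude, mean duration

rng = np.random.default_rng(1)
spec = monotony_spectrum(rng.normal(size=2000))
# Smoothing lengthens monotonic runs: mean duration grows with the window.
print(bool(spec[0, 1] < spec[-1, 1]))  # → True
```

For pure noise the amplitude decays monotonically with the local time scale; a maximum at some intermediate scale would indicate deterministic variation at that scale, which is the diagnostic the abstract describes.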
NASA Astrophysics Data System (ADS)
Terry, R. L.; Funning, G.; Floyd, M.
2017-12-01
The Geysers geothermal field in California, which provides a large portion of northern California's power, has seen declining steam pressures over the past three decades, accompanied by surface subsidence. Together, these two phenomena are likely the result of the exploitation of the reservoir without adequate time for natural restoration. To combat the decline in steam pressures, The Geysers began injecting imported wastewater into the geothermal reservoir in 1997 and expanded injection in 2003. In 2012 and 2013, we installed three continuously recording GPS stations in The Geysers to closely monitor crustal deformation due to both the extraction of steam and the injection of wastewater. To assess the impact of the current injection and extraction activities on the geothermal reservoir, we analyze the position time series from these GPS stations alongside wastewater injection and steam extraction data. We use common-mode filtering to remove regionally correlated noise from our GPS time series, and also estimate and subtract any seasonal signals present. To predict the effect of injection and production on surface movement, we summed the monthly time series of well data within a rectangular grid framework. We then use an array of Mogi sources based on each grid cell's total volume change to calculate the expected surface deformation due to these volume changes at depth. The temporal resolution provided by GPS allows us to characterize more accurately how the subsurface geothermal reservoir responds to forcing. For example, based on a similar spatiotemporal relationship between injection and seismicity, we hypothesize that there may be a delayed deformation response following injection, related to the permeability of the reservoir, and are undertaking detailed comparisons between our GPS time series and the well data to identify this response. Overall changes in the sense and rate of vertical motion in the field due to injection over time are also expected.
We anticipate that the impact of discovering a relationship between injection and surface deformation will be of great importance in maintaining and managing geothermal resources in the future.
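The forward model in such an analysis is typically the Mogi point-source approximation, where a volume change ΔV at depth d produces vertical surface displacement u_z = (1 − ν) ΔV d / (π (r² + d²)^{3/2}). The sketch below sums such sources over grid cells; the cell geometry and volume changes are invented for illustration, not The Geysers' actual values.

```python
import numpy as np

def mogi_uz(x, y, xs, ys, depth, dV, nu=0.25):
    """Vertical surface displacement of a Mogi point source of volume
    change dV at (xs, ys, depth) in an elastic half-space (Poisson ratio nu):
    uz = (1 - nu) * dV * depth / (pi * R**3), R = sqrt(r**2 + depth**2)."""
    r2 = (x - xs) ** 2 + (y - ys) ** 2
    R3 = (r2 + depth ** 2) ** 1.5
    return (1.0 - nu) * dV * depth / (np.pi * R3)

def grid_uz(x, y, cells):
    """Sum the contributions of an array of Mogi sources, one per grid cell;
    `cells` is a list of (xs, ys, depth, dV) tuples (dV < 0 = net extraction)."""
    return sum(mogi_uz(x, y, *c) for c in cells)

# Hypothetical reservoir: two cells at 2.5 km depth with net volume loss
cells = [(0.0, 0.0, 2500.0, -1e6), (1000.0, 0.0, 2500.0, -5e5)]
x = np.linspace(-5000.0, 5000.0, 201)
uz = grid_uz(x, np.zeros_like(x), cells)
print(round(uz.min(), 4))   # maximum modelled subsidence (m) along the profile
```

Net extraction (negative ΔV) gives subsidence; injected volumes enter the same sum with positive sign, so the model directly predicts how injection modulates the vertical GPS time series.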
Support vector machines for TEC seismo-ionospheric anomalies detection
NASA Astrophysics Data System (ADS)
Akhoondzadeh, M.
2013-02-01
Using time series prediction methods, it is possible to track the behavior of earthquake precursors into the future and to announce early warnings when the difference between the predicted value and the observed value exceeds a predefined threshold. Support Vector Machines (SVMs) are widely used due to their many advantages for classification and regression tasks. This study investigates Total Electron Content (TEC) time series using an SVM to detect seismo-ionospheric anomalous variations induced by three powerful earthquakes: Tohoku (11 March 2011), Haiti (12 January 2010) and Samoa (29 September 2009). The durations of the TEC time series datasets are 49, 46 and 71 days for the Tohoku, Haiti and Samoa earthquakes, respectively, each with a time resolution of 2 h. In the case of the Tohoku earthquake, the results show that the difference between the value predicted by the SVM method and the observed value reaches its maximum (i.e., 129.31 TECU) at earthquake time, during a period of high geomagnetic activity. The SVM method detected a considerable number of anomalous occurrences 1 and 2 days prior to the Haiti earthquake and also 1 and 5 days before the Samoa earthquake, during periods of low geomagnetic activity. To show that the method acts sensibly on both nonevent and event TEC data, i.e., to perform null-hypothesis tests in which the method is also calibrated, the same period of data from the year before the Samoa earthquake has been taken into account. Furthermore, the TEC anomalies detected using the SVM method were compared to previous results (Akhoondzadeh and Saradjian, 2011; Akhoondzadeh, 2012) obtained from the mean, median, wavelet and Kalman filter methods. The SVM-detected anomalies are similar to those detected using the previous methods.
It can be concluded that SVM can be a suitable learning method to detect the novelty changes of a nonlinear time series such as variations of earthquake precursors.
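The detection logic, flagging epochs where the one-step-ahead prediction error exceeds a threshold, can be sketched as below. For a dependency-light illustration, a least-squares autoregressive predictor stands in for the paper's SVM regressor; the synthetic TEC series and injected anomaly are invented.

```python
import numpy as np

def window_matrix(x, p):
    """Design matrix of p-lag windows for one-step-ahead prediction."""
    X = np.column_stack([x[i:len(x) - p + i] for i in range(p)])
    return X, x[p:]

def detect_anomalies(x, p=12, k=2.0):
    """One-step-ahead prediction residuals thresholded at mean + k*std.
    The paper fits an SVM regressor; a least-squares AR(p) predictor
    stands in here to illustrate the thresholding logic only."""
    X, y = window_matrix(np.asarray(x, float), p)
    A = np.column_stack([X, np.ones(len(X))])   # lags plus an intercept
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = np.abs(y - A @ coef)
    thresh = resid.mean() + k * resid.std()
    return np.flatnonzero(resid > thresh) + p   # indices in the original series

# Synthetic 2-h TEC series: diurnal cycle + noise + one injected anomaly
t = np.arange(49 * 12)                          # 49 days at 12 samples/day
tec = 20 + 8 * np.sin(2 * np.pi * t / 12) \
      + np.random.default_rng(2).normal(0, 0.5, t.size)
tec[400] += 10.0                                # anomalous TEC enhancement
print(detect_anomalies(tec))                    # includes index 400
```

Real TEC is far less regular than this sinusoid, which is why a nonlinear learner such as an SVM, and calibration on nonevent periods, matter in the actual study.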
NASA Astrophysics Data System (ADS)
Zhang, G.; Ganguly, S.; Saatchi, S. S.; Hagen, S. C.; Harris, N.; Yu, Y.; Nemani, R. R.
2013-12-01
Spatial and temporal patterns of forest disturbance and regrowth processes are key for understanding aboveground terrestrial vegetation biomass and carbon stocks at regional-to-continental scales. The NASA Carbon Monitoring System (CMS) program seeks key input datasets, especially information related to impacts of natural and man-made disturbances in forested landscapes of the Conterminous U.S. (CONUS), that would reduce uncertainties in current carbon stock estimation and emission models. This study provides an end-to-end forest disturbance detection framework based on pixel time series analysis of MODIS (Moderate Resolution Imaging Spectroradiometer) and Landsat surface spectral reflectance data. We applied the BFAST (Breaks for Additive Seasonal and Trend) algorithm to Normalized Difference Vegetation Index (NDVI) data for the period from 2000 to 2011. A harmonic seasonal model was implemented in BFAST to decompose the time series into seasonal and interannual trend components in order to detect abrupt changes in the magnitude and direction of these components. To apply BFAST across the whole of CONUS, we built a parallel computing setup for processing massive time-series data using the high performance computing facility of the NASA Earth Exchange (NEX). In the implementation, we extracted the dominant deforestation events from the magnitude of abrupt changes in both the seasonal and interannual components, and estimated the dates of the corresponding deforestation events. We estimated the recovery rate for deforested regions through regression models relating NDVI values to time since disturbance for all pixels. A similar implementation of the BFAST algorithm was performed over selected Landsat scenes (all cloud-free Landsat data were used to generate NDVI from atmospherically corrected spectral reflectances) to demonstrate the spatial coherence of the retrieved layers between MODIS and Landsat.
In the future, the application of this massively parallel disturbance detection setup will facilitate large-scale processing and wall-to-wall mapping of forest disturbance and regrowth from Landsat data for the whole of CONUS. This exercise will aid in improving the present capabilities of the NASA CMS effort in reducing uncertainties in national-level estimates of biomass and carbon stocks.
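A single-break caricature of the BFAST idea: fit a harmonic seasonal term plus a trend that is allowed one level/slope change, and scan candidate break dates for the best fit. BFAST itself uses iterative season/trend decomposition with structural-change tests; this sketch, with a synthetic NDVI series, only illustrates the model form.

```python
import numpy as np

def harmonic_design(t, period, trend_break=None):
    """Season (one harmonic) + linear trend, optionally with a level and
    slope change after `trend_break` (a crude stand-in for BFAST's model)."""
    cols = [np.ones_like(t), t,
            np.sin(2 * np.pi * t / period), np.cos(2 * np.pi * t / period)]
    if trend_break is not None:
        after = (t >= trend_break).astype(float)
        cols += [after, after * (t - trend_break)]
    return np.column_stack(cols)

def best_break(y, t, period, margin=20):
    """Scan candidate break dates; return the one minimizing the residual
    sum of squares of the harmonic-plus-broken-trend fit."""
    best = (np.inf, None)
    for tb in t[margin:-margin]:
        X = harmonic_design(t, period, tb)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = float(np.sum((y - X @ beta) ** 2))
        best = min(best, (rss, tb), key=lambda b: b[0])
    return best[1]

# Synthetic 12-year monthly NDVI-like series with an abrupt drop at t = 80
t = np.arange(0.0, 144.0)
rng = np.random.default_rng(3)
ndvi = 0.6 + 0.15 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.02, t.size)
ndvi[t >= 80] -= 0.25                           # simulated disturbance
print(best_break(ndvi, t, period=12))           # → 80.0
```

Because each pixel's scan is independent, exactly this per-pixel structure is what makes the CONUS-wide run embarrassingly parallel on a facility like NEX.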
Characterizing the Responses of Land Surface Phenology to the Rainy Season in the Congo Basin
NASA Astrophysics Data System (ADS)
Yan, D.; Zhang, X.; Yu, Y.; Guo, W.
2016-12-01
The most pronounced climate changes across the Congo Basin are predicted to be changes in the timing and amount of rainfall in the coming decades. These changes are expected to drive a significant shift in land surface phenology (LSP), so an understanding of its responses to the rainy season can benefit predictions of changes in the Congolese ecosystem under future climate change scenarios. However, quantitative analyses have not been performed to investigate the relationship between LSP and the rainy season in the Congo Basin. Based on 30-minute observations acquired by the Spinning Enhanced Visible and Infrared Imager (SEVIRI) onboard the METEOSAT Second Generation series of geostationary satellites, we generated a time series of three-day angularly corrected Two-band Enhanced Vegetation Index (EVI2) between 2006 and 2013. We then reconstructed the EVI2 temporal trajectories and retrieved the timings and magnitudes of LSP using the hybrid piecewise logistic model. We further associated the phenological timings and magnitudes with those of the rainy seasons derived from the three-hourly rainfall rate measurements provided by the Tropical Rainfall Measurement Mission Product 3B42. Finally, we investigated the impacts of tree cover on the timing discrepancy between LSP and the rainy season. Results show that LSP was strongly associated with the rainy season. Specifically, the SEVIRI EVI2 time series reveals that two annual canopy greenness cycles (CGC) occur in the Congolese rainforests, whereas a single annual CGC with strong seasonal amplitude was identified for other land cover types. The spatial shifts in CGC timings closely follow those of the rainy season controlled by the seasonal migration of the Intertropical Convergence Zone. However, tree cover controls the timing discrepancy between LSP and the rainy season. The accumulated vegetation greenness during a CGC shows a strong dependence on the total rainfall received.
NASA Astrophysics Data System (ADS)
Huttenlau, Matthias; Schneeberger, Klaus; Winter, Benjamin; Pazur, Robert; Förster, Kristian; Achleitner, Stefan; Bolliger, Janine
2017-04-01
Devastating flood events have caused substantial economic damage across Europe during past decades. Flood risk management has therefore become a topic of crucial interest across state agencies, research communities and the public sector, including insurers. There is consensus that mitigating flood risk relies on impact assessments which quantitatively account for a broad range of aspects in a (changing) environment. Flood risk assessments which take into account the interaction between the drivers climate change, land-use change and socio-economic change might bring new insights to the understanding of the magnitude and spatial characteristics of flood risks. Furthermore, the comparative assessment of different adaptation measures can give valuable information for decision-making. With this contribution we present an inter- and transdisciplinary research project aiming at developing and applying such an impact assessment relying on a coupled modelling framework for the Province of Vorarlberg in Austria. Stakeholder engagement ensures that the final outcomes of our study are accepted and successfully implemented in flood management practice. The study addresses three key questions: (i) What are scenarios of land-use and climate change for the study area? (ii) How will the magnitude and spatial characteristics of future flood risk change as a result of changes in climate and land use? (iii) Are there spatial planning and building-protection measures which effectively reduce future flood risk? The modelling framework has a modular structure comprising the modules (i) climate change, (ii) land-use change, (iii) hydrologic modelling, (iv) flood risk analysis, and (v) adaptation measures. Meteorological time series are coupled with spatially explicit scenarios of land-use change to model runoff time series. The runoff time series are combined with impact indicators such as building damages and the results are statistically assessed to analyse flood risk scenarios.
Thus, the regional flood risk can be expressed in terms of expected annual damage and damages associated with a low probability of occurrence. We consider building protection measures explicitly as part of the consequence analysis of flood risk whereas spatial planning measures are already considered as explicit scenarios in the course of land-use change modelling.
NASA Astrophysics Data System (ADS)
Gromoll, B.
2004-06-01
New refrigerators are essential for future high-temperature superconductivity (HTS) series products. These must meet demands, regarding cooling power, initial cost and in particular reliability, that are only partly fulfilled by refrigerators available on the market today. Without proper refrigeration techniques it will be almost impossible to bring HTS products to the market. Based on the experience gained from the construction and operation of HTS prototypes within our company, such as the 400 kW motor, the 1.2 MVA current limiter and the 1 MVA traction transformer, each equipped with refrigerators available on the market today, criteria have been established to identify the future technical and economic requirements. These criteria apply to efficiency, maintainability, operating flexibility, feasibility of integration and the performance/cost ratio. For the temperature range of 20 K to 77 K, cooling with Gifford-McMahon, pulse tube, Stirling and mixture-cascade refrigerators is applicable. The development potential of these processes is compared for the different applications in future series products. Presented are the necessary steps towards reliable and economical refrigerators from the viewpoint of an equipment manufacturer. These are essential for a market entry in the year 2008.
A Review of the Stockton Training Series: Instructor Reports of Current Use and Future Application
ERIC Educational Resources Information Center
Krieger, Kenin M.; Whittingham, Martyn
2005-01-01
For the past decade, the Stockton training series has offered instructors a valuable training tool for use in a variety of clinical settings. Counselor educators were asked to reflect upon the series and its application for beginning leader training. Specifically, surveys were distributed to a wide range of instructors who train group leaders;…
The Future of Transportation: Safety, Opportunity, and Innovation
DOT National Transportation Integrated Search
2016-12-30
This report summarizes key findings from the Future of Transportation: Safety, Opportunity, and Innovation thought leadership speaker series held at Volpe, The National Transportation Systems Center, during the summer and fall of 2016.
GIS-based hydrologic modeling offers a convenient means of assessing the impacts associated with land-cover/use change for environmental planning efforts. Future scenarios can be developed through a combination of modifications to the land-cover/use maps used to parameterize hydr...
NASA Astrophysics Data System (ADS)
Syafrina, A. H.; Zalina, M. D.; Juneng, L.
2014-09-01
A stochastic downscaling methodology known as the Advanced Weather Generator (AWE-GEN) has been tested at four stations in Peninsular Malaysia using observations available from 1975 to 2005. The methodology involves a stochastic downscaling procedure based on a Bayesian approach. Climate statistics from a multi-model ensemble of General Circulation Model (GCM) outputs were calculated and factors of change were derived to produce probability distribution functions (PDFs). New parameters were then obtained to project future climate time series. The projections of extreme precipitation were based on the RCP 6.0 scenario (2081-2100). The model was able to simulate both hourly and 24-h extreme precipitation, as well as wet spell durations, quite well for almost all regions. However, the performance of the GCMs varies significantly across regions, showing high variability of monthly precipitation for both the observed and future periods. Both hourly and 24-h extreme precipitation appear to increase in the future, while extreme wet spells remain unchanged, up to return periods of 10-40 years.
Jewkes, Rachel; Gibbs, Andrew; Jama-Shai, Nwabisa; Willan, Samantha; Misselhorn, Alison; Mushinga, Mildred; Washington, Laura; Mbatha, Nompumelelo; Skiweyiya, Yandisa
2014-12-29
Gender-based violence and HIV are highly prevalent in the harsh environment of informal settlements, and reducing violence there is very challenging. The group intervention Stepping Stones has been shown to reduce men's perpetration of violence in more rural areas, but violence experienced by women in that study was not affected. Economic empowerment interventions with gender training can protect older women from violence, but microloan interventions have proved challenging with young women. We investigated whether combining a broad economic empowerment intervention with Stepping Stones could reduce violence among young men and women. The intervention, Creating Futures, was developed as a new generation of economic empowerment intervention that enabled livelihood strengthening through helping participants find work or set up a business; it did not give cash or make loans. We piloted Stepping Stones with Creating Futures in two informal settlements of Durban with 232 out-of-school youth, mostly aged 18-30, and evaluated it with a shortened interrupted time series design: two baseline surveys and follow-ups at 28 and 58 weeks post-baseline. 94/110 men and 111/122 women completed the last assessment, 85.5% and 90.2% respectively of those enrolled. To determine trend, we built random effects regression models with each individual as the cluster for each variable, and measured the slope of the line across the time points. Men's mean earnings in the past month rose to 247% of baseline, from R411 (~$40) to R1015 (~$102), and women's to 278% of baseline, from R174 (~$17) to R484 (~$48) (trend test, p < 0.0001). There was a significant reduction in women's experience of the combined measure of physical and/or sexual IPV in the prior three months, from 30.3% to 18.9% (p = 0.037). This was not seen for men. However, both men and women scored significantly better on gender attitudes, and men significantly reduced their controlling practices in their relationships.
Among men, the prevalence of moderate or severe depression symptomatology and of suicidal thoughts decreased significantly (p < 0.0001 and p = 0.01, respectively). These findings are very positive for an exploratory study and indicate that the Creating Futures/Stepping Stones intervention has potential for impact in these difficult settings with young men and women. Further evaluation is needed.
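The trend estimation described above (a slope across time points, with each individual as a cluster) can be illustrated with a minimal within-individual estimator. The study used random effects regression models, for which this demeaned ordinary-least-squares slope is only a simple stand-in; the data here are invented.

```python
import numpy as np

def within_slope(y):
    """Trend slope across repeated measures, removing each individual's
    own level by demeaning. This 'within' estimator is a simple stand-in
    for the fixed-part slope of a random-intercept regression.

    y: array of shape (n_individuals, n_timepoints).
    """
    n, t = y.shape
    time = np.arange(t, dtype=float)
    yd = y - y.mean(axis=1, keepdims=True)   # demean each individual's outcomes
    td = time - time.mean()                  # demean the time points
    return (yd * td).sum() / (n * (td ** 2).sum())

# Toy data: three people start at different levels, all rise by 2 per wave.
base = np.array([[10.0], [20.0], [30.0]])
y = base + 2.0 * np.arange(4)
print(within_slope(y))  # 2.0
```

A full analysis would fit a mixed model per outcome (e.g. with statsmodels' MixedLM) and test the slope's significance, as the abstract's trend test does.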
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-07
... Series, adjusted option series and any options series until the time to expiration for such series is... time to expiration for such series is less than nine months be treated differently. Specifically, under... series until the time to expiration for such series is less than nine months. Accordingly, the...