Regenerating time series from ordinal networks
NASA Astrophysics Data System (ADS)
McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael
2017-03-01
Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discretely sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
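The construction-and-regeneration loop described above can be sketched in a few lines. This is a minimal illustration, not the authors' code; the function names, the embedding length m, and the use of rank vectors as ordinal patterns are my own choices.

```python
import random

import numpy as np

def ordinal_pattern(window):
    # rank vector of a window, e.g. [0.1, 0.5, 0.3] -> (0, 2, 1)
    return tuple(int(r) for r in np.argsort(np.argsort(window)))

def build_ordinal_network(x, m=3):
    # nodes are ordinal patterns of sliding windows; weighted directed edges
    # count the observed transitions between consecutive windows
    patterns = [ordinal_pattern(x[i:i + m]) for i in range(len(x) - m + 1)]
    edges = {}
    for a, b in zip(patterns, patterns[1:]):
        edges.setdefault(a, {})
        edges[a][b] = edges[a].get(b, 0) + 1
    return edges

def regenerate(edges, start, steps, rng):
    # regenerate a symbol sequence by a weighted random walk on the network
    seq, node = [start], start
    for _ in range(steps):
        nbrs = edges.get(node)
        if not nbrs:
            break
        targets = list(nbrs)
        node = rng.choices(targets, weights=[nbrs[t] for t in targets])[0]
        seq.append(node)
    return seq
```

The walk produces a surrogate sequence of ordinal symbols; mapping symbols back to amplitude values, as the paper's surrogates require, would need an additional decoding step not shown here.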
Efficient Algorithms for Segmentation of Item-Set Time Series
NASA Astrophysics Data System (ADS)
Chundi, Parvathi; Rosenkrantz, Daniel J.
We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
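The dynamic-programming scheme outlined above can be sketched as follows. This is an illustrative reconstruction under assumptions: the measure function here is the union of the segment's item sets, and the segment difference is the total symmetric difference, which are plausible stand-ins rather than the paper's exact definitions.

```python
def segment_cost(sets_in_segment):
    # measure function (illustrative choice): union of the segment's item sets;
    # segment difference: total symmetric difference to each time point's set
    merged = set().union(*sets_in_segment)
    return sum(len(merged ^ s) for s in sets_in_segment)

def optimal_segmentation(series, k):
    # dynamic program: best[j][t] = minimal total difference when the first t
    # time points are split into j segments
    n = len(series)
    INF = float("inf")
    best = [[INF] * (n + 1) for _ in range(k + 1)]
    back = [[0] * (n + 1) for _ in range(k + 1)]
    best[0][0] = 0.0
    for j in range(1, k + 1):
        for t in range(j, n + 1):
            for s in range(j - 1, t):
                c = best[j - 1][s] + segment_cost(series[s:t])
                if c < best[j][t]:
                    best[j][t], back[j][t] = c, s
    # walk the back-pointers to recover the segment boundaries
    boundaries, t = [], n
    for j in range(k, 0, -1):
        boundaries.append((back[j][t], t))
        t = back[j][t]
    return best[k][n], boundaries[::-1]
```

For example, splitting [{1}, {1}, {1}, {2, 3}, {2, 3}] into two segments places the boundary where the item sets change.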
Constructing networks from a dynamical system perspective for multivariate nonlinear time series.
Nakamura, Tomomichi; Tanizawa, Toshihiro; Small, Michael
2016-03-01
We describe a method for constructing networks for multivariate nonlinear time series. We approach the interaction between the various scalar time series from a deterministic dynamical system perspective and provide a generic and algorithmic test for whether the interaction between two measured time series is statistically significant. The method can be applied even when the data exhibit no obvious qualitative similarity: a situation in which the naive method utilizing the cross correlation function directly cannot correctly identify connectivity. To establish the connectivity between nodes we apply the previously proposed small-shuffle surrogate (SSS) method, which can investigate whether there are correlation structures in short-term variabilities (irregular fluctuations) between two data sets from the viewpoint of deterministic dynamical systems. The procedure to construct networks based on this idea is composed of three steps: (i) each time series is considered as a basic node of a network, (ii) the SSS method is applied to verify the connectivity between each pair of time series taken from the whole multivariate time series, and (iii) the pair of nodes is connected with an undirected edge when the null hypothesis cannot be rejected. The network constructed by the proposed method indicates the intrinsic (essential) connectivity of the elements included in the system or the underlying (assumed) system. The method is demonstrated for numerical data sets generated by known systems and applied to several experimental time series.
A probabilistic method for constructing wave time-series at inshore locations using model scenarios
Long, Joseph W.; Plant, Nathaniel G.; Dalyander, P. Soupy; Thompson, David M.
2014-01-01
Continuous time-series of wave characteristics (height, period, and direction) are constructed using a base set of model scenarios and simple probabilistic methods. This approach utilizes an archive of computationally intensive, highly spatially resolved numerical wave model output to develop time-series of historical or future wave conditions without performing additional, continuous numerical simulations. The archive of model output contains wave simulations from a set of model scenarios derived from an offshore wave climatology. Time-series of wave height, period, direction, and associated uncertainties are constructed at locations included in the numerical model domain. The confidence limits are derived using statistical variability of oceanographic parameters contained in the wave model scenarios. The method was applied to a region in the northern Gulf of Mexico and assessed using wave observations at 12 m and 30 m water depths. Prediction skill for significant wave height is 0.58 and 0.67 at the 12 m and 30 m locations, respectively, with similar performance for wave period and direction. The skill of this simplified, probabilistic time-series construction method is comparable to existing large-scale, high-fidelity operational wave models but provides higher spatial resolution output at low computational expense. The constructed time-series can be developed to support a variety of applications including climate studies and other situations where a comprehensive survey of wave impacts on the coastal area is of interest.
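The core idea of reusing an archive of precomputed scenarios can be sketched as a nearest-scenario lookup. This is a minimal sketch under assumptions: the real method is probabilistic and carries uncertainty estimates, whereas here each offshore observation is simply matched to its closest archived scenario.

```python
import numpy as np

def inshore_series(offshore_obs, scenario_inputs, scenario_outputs):
    # map each offshore observation to the nearest archived scenario and reuse
    # that scenario's stored inshore model output, so no new wave-model runs
    # are needed to build a continuous inshore time series
    out = []
    for obs in np.atleast_2d(offshore_obs):
        d = np.linalg.norm(np.asarray(scenario_inputs) - obs, axis=1)
        out.append(scenario_outputs[int(np.argmin(d))])
    return np.array(out)
```

A toy example with three archived scenarios (hypothetical offshore height/period pairs and inshore heights) shows the lookup in action.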
A KST framework for correlation network construction from time series signals
NASA Astrophysics Data System (ADS)
Qi, Jin-Peng; Gu, Quan; Zhu, Ying; Zhang, Ping
2018-04-01
A KST (Kolmogorov-Smirnov test and T statistic) method is used for the construction of a correlation network based on the fluctuation of each time series within the multivariate time signals. In this method, each time series is divided equally into multiple segments, and the maximal data fluctuation in each segment is calculated by a KST change detection procedure. Connections between the time series are derived from the data fluctuation matrix and used for construction of the fluctuation correlation network (FCN). The method was tested with synthetic simulations, and the results were compared with those obtained using the KS test or T statistic alone for detection of data fluctuation. The novelty of this study is that the correlation analysis is based on the data fluctuation in each segment of each time series rather than on the original time signals, which is more meaningful for many real-world applications and for the analysis of large-scale time signals where prior knowledge is uncertain.
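The segment-then-correlate pipeline above can be sketched as follows. This is an illustrative stand-in: the per-segment fluctuation is measured here as max minus min rather than by the paper's KST change-detection statistic, and the threshold is an assumption.

```python
import numpy as np

def fluctuation_profile(x, n_seg):
    # divide the series into equal segments; per-segment fluctuation
    # (max - min here, a simple stand-in for the KST statistic)
    return np.array([s.max() - s.min()
                     for s in np.array_split(np.asarray(x), n_seg)])

def fluctuation_network(series_list, n_seg=10, thresh=0.7):
    # connect two series when their segment-fluctuation profiles correlate
    profiles = [fluctuation_profile(x, n_seg) for x in series_list]
    edges = set()
    for i in range(len(profiles)):
        for j in range(i + 1, len(profiles)):
            if abs(np.corrcoef(profiles[i], profiles[j])[0, 1]) >= thresh:
                edges.add((i, j))
    return edges
```

Two signals sharing a volatility regime change are linked even when their raw waveforms differ, which is the point of correlating fluctuations rather than the original signals.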
On the deduction of chemical reaction pathways from measurements of time series of concentrations.
Samoilov, Michael; Arkin, Adam; Ross, John
2001-03-01
We discuss the deduction of reaction pathways in complex chemical systems from measurements of time series of chemical concentrations of reacting species. First we review a technique called correlation metric construction (CMC) and show the construction of a reaction pathway from measurements on a part of glycolysis. Then we present two new, improved methods for the analysis of time series of concentrations, entropy metric construction (EMC) and the entropy reduction method (ERM), and illustrate EMC with calculations on a model reaction system. (c) 2001 American Institute of Physics.
A novel weight determination method for time series data aggregation
NASA Astrophysics Data System (ADS)
Xu, Paiheng; Zhang, Rong; Deng, Yong
2017-09-01
Aggregation in time series is of great importance in time series smoothing, prediction, and other analysis processes, which makes it crucial to determine the weights in time series correctly and reasonably. In this paper, a novel method to obtain the weights in time series is proposed, in which we adopt the induced ordered weighted aggregation (IOWA) operator and the visibility graph averaging (VGA) operator and linearly combine the weights separately generated by the two operators. The IOWA operator is introduced to the weight determination of time series, through which the time decay factor is taken into consideration. The VGA operator generates weights with respect to the degree distribution in the visibility graph constructed from the corresponding time series, which reflects the relative importance of vertices in the time series. The proposed method is applied to two practical datasets to illustrate its merits. The aggregation of the Construction Cost Index (CCI) demonstrates the ability of the proposed method to smooth time series, while the aggregation of the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) illustrates how the proposed method maintains the variation tendency of the original data.
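The linear combination of the two weight families can be sketched as follows. This is a minimal sketch under assumptions: the IOWA weights are modeled as a simple exponential decay and the VGA weights as normalized visibility-graph degrees; the decay and mixing parameters are hypothetical.

```python
import numpy as np

def visibility_degrees(x):
    # natural visibility graph: points i and j see each other if every point
    # between them lies strictly below the straight line connecting them
    n = len(x)
    deg = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                deg[i] += 1
                deg[j] += 1
    return deg

def combined_weights(x, decay=0.9, alpha=0.5):
    # linear mix of IOWA-style decay weights (recent points count more) and
    # VGA weights proportional to visibility-graph degree
    x = np.asarray(x, dtype=float)
    n = len(x)
    w_iowa = decay ** np.arange(n - 1, -1, -1)
    w_iowa /= w_iowa.sum()
    d = visibility_degrees(x)
    w_vga = d / d.sum()
    return alpha * w_iowa + (1 - alpha) * w_vga
```

Both weight vectors are normalized before mixing, so the combined weights again sum to one, as any aggregation operator requires.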
The construction of a Central Netherlands temperature
NASA Astrophysics Data System (ADS)
van der Schrier, G.; van Ulden, A.; van Oldenborgh, G. J.
2011-05-01
The Central Netherlands Temperature (CNT) is a monthly series of daily mean temperatures constructed from homogenized time series from the centre of the Netherlands. The purpose of this series is to offer a homogeneous time series representative of a larger area in order to study large-scale temperature changes. It will also facilitate a comparison with climate models, which resolve similar scales. From 1906 onwards, temperature measurements in the Netherlands have been sufficiently standardized to construct a high-quality series. Long time series have been constructed by merging nearby stations and using the overlap to calibrate the differences. These long time series and a few time series of only a few decades in length have been subjected to a homogeneity analysis in which significant breaks and artificial trends have been corrected. Many of the detected breaks correspond to changes in the observations that are documented in the station metadata. This version of the CNT, to which we attach the version number 1.1, is constructed as the unweighted average of four stations (De Bilt, Winterswijk/Hupsel, Oudenbosch/Gilze-Rijen and Gemert/Volkel) with the stations Eindhoven and Deelen added from 1951 and 1958 onwards, respectively. The global gridded datasets used for detecting and attributing climate change are based on raw observational data. Although some homogeneity adjustments are made, these are not based on knowledge of local circumstances but only on statistical evidence. Despite this handicap, and the fact that these datasets use grid boxes that are far larger than the area associated with the Central Netherlands Temperature, the temperature interpolated to the CNT region shows a warming trend that is broadly consistent with the CNT trend in all of these datasets. The actual trends differ from the CNT trend by up to 30%, which highlights the need to base future global gridded temperature datasets on homogenized time series.
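The merge-and-calibrate step for nearby stations can be sketched as follows. This is a minimal illustration of overlap calibration, not the CNT construction itself: it shifts the newer record by the mean difference over the overlap and splices, ignoring break detection and trend correction.

```python
import numpy as np

def calibrate_and_merge(old, new, n_overlap):
    # the last n_overlap values of `old` and the first n_overlap of `new`
    # cover the same months; shift `new` by the mean overlap difference,
    # then splice the records into one long series
    old, new = np.asarray(old, float), np.asarray(new, float)
    offset = float(np.mean(old[-n_overlap:] - new[:n_overlap]))
    return np.concatenate([old, new[n_overlap:] + offset]), offset
```

With a constant instrument bias, the overlap mean recovers the offset exactly, and the spliced series is continuous.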
Inflow forecasting model construction with stochastic time series for coordinated dam operation
NASA Astrophysics Data System (ADS)
Kim, T.; Jung, Y.; Kim, H.; Heo, J. H.
2014-12-01
Dam inflow forecasting is one of the most important tasks in dam operation for effective water resources management and control. In general, stochastic time series models for dam inflow forecasting are applicable when the data are stationary, because most stochastic processes are based on stationarity. However, recent hydrological data no longer satisfy stationarity because of climate change. Therefore, a stochastic time series model that can account for seasonality and trend in the data series, the SARIMAX (Seasonal AutoRegressive Integrated Moving Average with eXogenous variables) model, was constructed in this study. The SARIMAX model can improve on standard stochastic time series models by considering nonstationary components and external variables such as precipitation. For application, models were constructed for four coordinated dams on the Han River in South Korea using monthly time series data. As a result, the models for each dam show similar performance, and it would be possible to use them for coordinated dam operation. Acknowledgement: This research was supported by a grant, 'Establishing Active Disaster Management System of Flood Control Structures by using 3D BIM Technique' [NEMA-NH-12-57], from the Natural Hazard Mitigation Research Group, National Emergency Management Agency of Korea.
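The seasonal-plus-exogenous structure of such a model can be sketched with a tiny least-squares stand-in. This is not SARIMAX (no differencing, no MA terms): it regresses the current value on its first lag, its seasonal lag, and an exogenous input such as precipitation; all names and parameters are illustrative.

```python
import numpy as np

def fit_seasonal_arx(y, exog, season=12):
    # minimal stand-in for a SARIMAX-type model: least-squares regression of
    # y[t] on a constant, y[t-1], y[t-season], and an exogenous input
    y, exog = np.asarray(y, float), np.asarray(exog, float)
    t = np.arange(season, len(y))
    X = np.column_stack([np.ones(len(t)), y[t - 1], y[t - season], exog[t]])
    coef, *_ = np.linalg.lstsq(X, y[t], rcond=None)
    return coef

def predict_next(y, exog_next, coef, season=12):
    # one-step-ahead forecast from the fitted coefficients
    return float(coef @ np.array([1.0, y[-1], y[-season], exog_next]))
```

On synthetic data generated by exactly this recurrence, the regression recovers the generating coefficients, which is a useful sanity check before moving to a full SARIMAX fit.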
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Xiaoran, E-mail: sxr0806@gmail.com; School of Mathematics and Statistics, The University of Western Australia, Crawley WA 6009; Small, Michael, E-mail: michael.small@uwa.edu.au
In this work, we propose a novel method to transform a time series into a weighted and directed network. For a given time series, we first generate a set of segments via a sliding window, and then use a doubly symbolic scheme to characterize every windowed segment by combining absolute amplitude information with an ordinal pattern characterization. Based on this construction, a network can be directly constructed from the given time series: segments corresponding to different symbol pairs are mapped to network nodes, and the temporal succession between nodes is represented by directed links. With this conversion, the dynamics underlying the time series are encoded into the network structure. We illustrate the potential of our networks with a well-studied dynamical model as a benchmark example. Results show that network measures for characterizing global properties can detect the dynamical transitions in the underlying system. Moreover, we employ a random walk algorithm to sample loops in our networks, and find that time series with different dynamics exhibit distinct cycle structure. That is, the relative prevalence of loops with different lengths can be used to identify the underlying dynamics.
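The doubly symbolic construction can be sketched as follows. This is an illustrative reading, not the authors' implementation: the amplitude symbols here are quantile bins, the ordinal symbol is the window's rank vector, and both parameters are assumptions.

```python
import numpy as np

def doubly_symbolic_network(x, m=3, n_bins=2):
    # node label = (amplitude-bin sequence, ordinal pattern) of each window;
    # a directed weighted link records each temporal succession of windows
    x = np.asarray(x, float)
    bounds = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1)[1:-1])
    def symbol(w):
        bins = tuple(int(np.searchsorted(bounds, v)) for v in w)
        # rank vector as ordinal pattern (ties broken by position)
        pattern = tuple(int(r) for r in np.argsort(np.argsort(w)))
        return (bins, pattern)
    nodes = [symbol(x[i:i + m]) for i in range(len(x) - m + 1)]
    net = {}
    for a, b in zip(nodes, nodes[1:]):
        net[(a, b)] = net.get((a, b), 0) + 1
    return net
```

Every consecutive pair of windows contributes exactly one unit of link weight, so the total weight equals the number of successions.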
NASA Astrophysics Data System (ADS)
Li, Xiuming; Sun, Mei; Gao, Cuixia; Han, Dun; Wang, Minggang
2018-02-01
This paper presents the parametric modified limited penetrable visibility graph (PMLPVG) algorithm for constructing complex networks from time series. We modify the penetrable visibility criterion of the limited penetrable visibility graph (LPVG) in order to improve the rationality of the original penetrable visibility and preserve the dynamic characteristics of the time series. The addition of a view angle provides a new approach to characterizing the dynamic structure of the time series that is invisible to the previous algorithm. The reliability of the PMLPVG algorithm is verified by applying it to three types of artificial data as well as actual data on natural gas prices in different regions. The empirical results indicate that the PMLPVG algorithm can distinguish different time series from each other. Meanwhile, the analysis results for the natural gas price data using PMLPVG are consistent with those of detrended fluctuation analysis (DFA). The results imply that the PMLPVG algorithm may be a reasonable and significant tool for identifying various time series in different fields.
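The underlying LPVG criterion, which PMLPVG modifies, can be sketched as follows: two points are linked if at most a fixed number of intermediate points "penetrate" the line of sight. This sketch covers only plain LPVG, not the parametric view-angle modification, and the parameter name is mine.

```python
def lpvg_edges(x, penetrable=1):
    # limited penetrable visibility: i and j are connected if at most
    # `penetrable` intermediate points lie on or above the sight line
    n = len(x)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            blocked = sum(1 for k in range(i + 1, j)
                          if x[k] >= x[i] + (x[j] - x[i]) * (k - i) / (j - i))
            if blocked <= penetrable:
                edges.add((i, j))
    return edges
```

Setting `penetrable=0` recovers the natural visibility graph; allowing one penetration links pairs that a single spike would otherwise separate.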
A univariate model of river water nitrate time series
NASA Astrophysics Data System (ADS)
Worrall, F.; Burt, T. P.
1999-01-01
Four time series were taken from three catchments in the North and South of England. The sites chosen included two in predominantly agricultural catchments, one at the tidal limit and one downstream of a sewage treatment works. A time series model was constructed for each of these series as a means of decomposing the elements controlling river water nitrate concentrations and of assessing whether this approach could provide a simple management tool for protecting water abstractions. Autoregressive (AR) modelling of the detrended and deseasoned time series showed a "memory effect". This memory effect expressed itself as an increase in the winter-summer difference in nitrate levels that was dependent upon the nitrate concentration 12 or 6 months previously. Autoregressive moving average (ARMA) modelling showed that one of the series contained seasonal, non-stationary elements that appeared as an increasing trend in the winter-summer difference. The ARMA model was used to predict nitrate levels, and predictions were tested against data held back from the model construction process; predictions gave average percentage errors of less than 10%. Empirical modelling can therefore provide a simple, efficient method for constructing management models for downstream water abstraction.
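The detrend/deseason-then-AR pipeline can be sketched in miniature. This is a simplified illustration, not the paper's model: deseasoning here subtracts month-of-year means, and only an AR(1) coefficient is fitted rather than a full seasonal ARMA.

```python
import numpy as np

def deseason(x, period=12):
    # subtract the mean of each month-of-year (simple deseasoning)
    x = np.asarray(x, float)
    means = np.array([x[i::period].mean() for i in range(period)])
    return x - means[np.arange(len(x)) % period], means

def fit_ar1(z):
    # least-squares AR(1) coefficient of the deseasoned residual
    z = np.asarray(z, float)
    return float(np.dot(z[:-1], z[1:]) / np.dot(z[:-1], z[:-1]))
```

A "memory effect" like the one described would show up as a substantial AR coefficient at the 6- or 12-month lag; the same ratio estimator extends to those lags.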
An approach to constructing a homogeneous time series of soil moisture using SMOS
USDA-ARS?s Scientific Manuscript database
Overlapping soil moisture time series derived from two satellite microwave radiometers (SMOS, Soil Moisture and Ocean Salinity; AMSR-E, Advanced Microwave Scanning Radiometer - Earth Observing System) are used to generate a soil moisture time series from 2003 to 2010. Two statistical methodologies f...
Phase Time and Envelope Time in Time-Distance Analysis and Acoustic Imaging
NASA Technical Reports Server (NTRS)
Chou, Dean-Yi; Duvall, Thomas L.; Sun, Ming-Tsung; Chang, Hsiang-Kuang; Jimenez, Antonio; Rabello-Soares, Maria Cristina; Ai, Guoxiang; Wang, Gwo-Ping; Goode, Philip; Marquette, William;
1999-01-01
Time-distance analysis and acoustic imaging are two related techniques to probe the local properties of the solar interior. In this study, we discuss the relation of phase time and envelope time between the two techniques. The location of the envelope peak of the cross-correlation function in time-distance analysis is identified as the travel time of the wave packet formed by modes with the same ω/l. The phase time of the cross-correlation function provides information about the phase change accumulated along the wave path, including the phase change at the boundaries of the mode cavity. The acoustic signals constructed with the technique of acoustic imaging contain both phase and intensity information. The phase of constructed signals can be studied by computing the cross-correlation function between time series constructed with ingoing and outgoing waves. In this study, we use data taken with the Taiwan Oscillation Network (TON) instrument and the Michelson Doppler Imager (MDI) instrument. The analysis is carried out for the quiet Sun. We use the relation of envelope time versus distance measured in time-distance analyses to construct the acoustic signals in acoustic imaging analyses. The phase time of the cross-correlation function of constructed ingoing and outgoing time series is twice the difference between the phase time and envelope time in time-distance analyses, as predicted. The envelope peak of the cross-correlation function between constructed ingoing and outgoing time series is located at zero time, as predicted, for results of one bounce at 3 mHz for all four data sets and two bounces at 3 mHz for two TON data sets, but it differs from zero in the other cases. The cause of this deviation of the envelope peak from zero is not known.
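The basic travel-time measurement underlying both techniques, locating the peak of a cross-correlation function, can be sketched as follows. This is a bare-bones proxy: it finds only the peak lag and does not separate phase time from envelope time as the study does.

```python
import numpy as np

def peak_lag(a, b):
    # lag (in samples) of the peak of the cross-correlation function,
    # a simple proxy for the travel time of a wave packet
    a = a - a.mean()
    b = b - b.mean()
    cc = np.correlate(a, b, mode="full")
    return int(np.argmax(cc)) - (len(b) - 1)
```

With numpy's correlation convention, the recovered lag is the feature's position in the first signal minus its position in the second, so a pulse arriving later in `b` gives a negative lag.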
Memory and betweenness preference in temporal networks induced from time series
NASA Astrophysics Data System (ADS)
Weng, Tongfeng; Zhang, Jie; Small, Michael; Zheng, Rui; Hui, Pan
2017-02-01
We construct temporal networks from time series by unfolding the temporal information into an additional topological dimension of the networks. Thus, we are able to introduce memory entropy analysis to unravel the memory effect within the considered signal. We find distinct patterns in the entropy growth rate of the aggregate network at different memory scales for time series with different dynamics, ranging from white noise, 1/f noise, and autoregressive processes to periodic and chaotic dynamics. Interestingly, for a chaotic time series, an exponential scaling emerges in the memory entropy analysis. We demonstrate that the memory exponent can successfully characterize bifurcation phenomena and differentiate the human cardiac system in healthy and pathological states. Moreover, we show that the betweenness preference analysis of these temporal networks can further characterize dynamical systems and separate distinct electrocardiogram recordings. Our work explores the memory effect and betweenness preference in temporal networks constructed from time series data, providing a new perspective for understanding the underlying dynamical systems.
Construction of regulatory networks using expression time-series data of a genotyped population.
Yeung, Ka Yee; Dombek, Kenneth M; Lo, Kenneth; Mittler, John E; Zhu, Jun; Schadt, Eric E; Bumgarner, Roger E; Raftery, Adrian E
2011-11-29
The inference of regulatory and biochemical networks from large-scale genomics data is a basic problem in molecular biology. The goal is to generate testable hypotheses of gene-to-gene influences and subsequently to design bench experiments to confirm these network predictions. Coexpression of genes in large-scale gene-expression data implies coregulation and potential gene-gene interactions, but provides little information about the direction of influences. Here, we use both time-series data and genetics data to infer directionality of edges in regulatory networks: time-series data contain information about the chronological order of regulatory events and genetics data allow us to map DNA variations to variations at the RNA level. We generate microarray data measuring time-dependent gene-expression levels in 95 genotyped yeast segregants subjected to a drug perturbation. We develop a Bayesian model averaging regression algorithm that incorporates external information from diverse data types to infer regulatory networks from the time-series and genetics data. Our algorithm is capable of generating feedback loops. We show that our inferred network recovers existing and novel regulatory relationships. Following network construction, we generate independent microarray data on selected deletion mutants to prospectively test network predictions. We demonstrate the potential of our network to discover de novo transcription-factor binding sites. Applying our construction method to previously published data demonstrates that our method is competitive with leading network construction algorithms in the literature.
Relation between delayed feedback and delay-coupled systems and its application to chaotic lasers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soriano, Miguel C., E-mail: miguel@ifisc.uib-csic.es; Flunkert, Valentin; Fischer, Ingo
2013-12-15
We present a systematic approach to identify the similarities and differences between a chaotic system with delayed feedback and two mutually delay-coupled systems. We consider the general case in which the coupled systems are either unsynchronized or in a generally synchronized state, in contrast to the mostly studied case of identical synchronization. We construct a new time-series for each of the two coupling schemes, respectively, and present analytic evidence and numerical confirmation that these two constructed time-series are statistically equivalent. From the construction, it then follows that the distribution of time-series segments that are small compared to the overall delay in the system is independent of the value of the delay and of the coupling scheme. By focusing on numerical simulations of delay-coupled chaotic lasers, we present a practical example of our findings.
Real-time Series Resistance Monitoring in PV Systems; NREL (National Renewable Energy Laboratory)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deceglie, M. G.; Silverman, T. J.; Marion, B.
We apply the physical principles of a familiar method, suns-Voc, to a new application: the real-time detection of series resistance changes in modules and systems operating outside. The real-time series resistance (RTSR) method that we describe avoids the need for collecting IV curves or constructing full series-resistance-free IV curves. RTSR is most readily deployable at the module level on micro-inverters or module-integrated electronics, but it can also be extended to full strings. Automated detection of series resistance increases can provide early warnings of some of the most common reliability issues, which also pose fire risks, including broken ribbons, broken solder bonds, and contact problems in the junction or combiner box. We describe the method in detail and describe a sample application to data collected from modules operating in the field.
Perl Modules for Constructing Iterators
NASA Technical Reports Server (NTRS)
Tilmes, Curt
2009-01-01
The Iterator Perl Module provides a general-purpose framework for constructing iterator objects within Perl, and a standard API for interacting with those objects. Iterators are an object-oriented design pattern where a description of a series of values is used in a constructor. Subsequent queries can request values in that series. These Perl modules build on the standard Iterator framework and provide iterators for some other types of values. Iterator::DateTime constructs iterators from DateTime objects or Date::Parse descriptions and iCal/RFC 2445 style recurrence descriptions. It supports a variety of input parameters, including a start to the sequence, an end to the sequence, an iCal/RFC 2445 recurrence describing the frequency of the values in the series, and a format description that can refine the presentation manner of the DateTime. Iterator::String constructs iterators from string representations. This module is useful in contexts where the API consists of supplying a string and getting back an iterator where the specific iteration desired is opaque to the caller. It is of particular value to the Iterator::Hash module, which provides nested iterations. Iterator::Hash constructs iterators from Perl hashes that can include multiple iterators. The constructed iterators will return all the permutations of the iterations of the hash by nested iteration of embedded iterators. A hash simply includes a set of keys mapped to values. It is a very common data structure used throughout Perl programming. The Iterator::Hash module allows a hash to include strings defining iterators (parsed and dispatched with Iterator::String) that are used to construct an overall series of hash values.
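The nested-iteration behavior attributed to Iterator::Hash can be illustrated with a short Python analogue (not the Perl API itself): each key maps to an iterable, and the generator yields every permutation of the embedded iterations.

```python
import itertools

def hash_iterator(spec):
    # Python analogue of Iterator::Hash's nested iteration: yield one dict per
    # combination of the per-key iterables, varying the last key fastest
    keys = list(spec)
    for combo in itertools.product(*(spec[k] for k in keys)):
        yield dict(zip(keys, combo))
```

For a hash with keys "a" over [1, 2] and "b" over ["x"], the iterator produces the two value combinations in order.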
NASA Astrophysics Data System (ADS)
Donges, Jonathan F.; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V.; Marwan, Norbert; Dijkstra, Henk A.; Kurths, Jürgen
2015-11-01
We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.
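One of the constructions pyunicorn offers, the recurrence network, can be sketched without the library to show the idea. This is a plain-numpy sketch, not pyunicorn's API; the embedding and threshold parameters are illustrative.

```python
import numpy as np

def recurrence_network(x, m=2, tau=1, eps=0.5):
    # delay-embed the series, then link every pair of state vectors whose
    # Euclidean distance is below eps (self-loops removed)
    x = np.asarray(x, float)
    n = len(x) - (m - 1) * tau
    emb = np.column_stack([x[i * tau: i * tau + n] for i in range(m)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    adj = (dist < eps).astype(int)
    np.fill_diagonal(adj, 0)
    return adj
```

The resulting adjacency matrix is symmetric and zero on the diagonal, and standard complex-network measures can then be applied to it.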
The application of complex network time series analysis in turbulent heated jets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Charakopoulos, A. K.; Karakasidis, T. E., E-mail: thkarak@uth.gr; Liakopoulos, A.
2014-06-15
In the present study, we applied the methodology of complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., distinguishing time series from regions close to the jet axis from those originating in regions with a different dynamical regime, based on the constructed network properties. Applying the phase-space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed the time series into networks and evaluated topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail the various dynamical regions of the jet flow, and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those of the networks obtained from the experimental data. As far as the efficiency of the two network-construction methods is concerned, we conclude that both methodologies lead to network properties with almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
Empirical method to measure stochasticity and multifractality in nonlinear time series
NASA Astrophysics Data System (ADS)
Lin, Chih-Hao; Chang, Chia-Seng; Li, Sai-Ping
2013-12-01
An empirical algorithm is used here to study the stochastic and multifractal nature of nonlinear time series. A parameter can be defined to quantitatively measure the deviation of the time series from a Wiener process so that the stochasticity of different time series can be compared. The local volatility of the time series under study can be constructed using this algorithm, and the multifractal structure of the time series can be analyzed by using this local volatility. As an example, we employ this method to analyze financial time series from different stock markets. The result shows that while developed markets evolve very much like an Ito process, the emergent markets are far from efficient. Differences about the multifractal structures and leverage effects between developed and emergent markets are discussed. The algorithm used here can be applied in a similar fashion to study time series of other complex systems.
Nonlinear techniques for forecasting solar activity directly from its time series
NASA Technical Reports Server (NTRS)
Ashrafi, S.; Roszman, L.; Cooley, J.
1992-01-01
Numerical techniques for constructing nonlinear predictive models to forecast solar flux directly from its time series are presented. This approach makes it possible to extract dynamical invariants of the system without reference to any underlying solar physics. We consider the dynamical evolution of solar activity in a reconstructed phase space that captures the strange attractor, give a procedure for constructing a predictor of future solar activity, and discuss the extraction of dynamical invariants such as Lyapunov exponents and attractor dimension.
Nonlinear techniques for forecasting solar activity directly from its time series
NASA Technical Reports Server (NTRS)
Ashrafi, S.; Roszman, L.; Cooley, J.
1993-01-01
This paper presents numerical techniques for constructing nonlinear predictive models to forecast solar flux directly from its time series. This approach makes it possible to extract dynamical invariants of the system without reference to any underlying solar physics. We consider the dynamical evolution of solar activity in a reconstructed phase space that captures the strange attractor, give a procedure for constructing a predictor of future solar activity, and discuss the extraction of dynamical invariants such as Lyapunov exponents and attractor dimension.
Visibility Graph Based Time Series Analysis.
Stephen, Mutua; Gu, Changgui; Yang, Huijie
2015-01-01
Network-based time series analysis has made considerable achievements in recent years. By mapping mono- or multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
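The visibility mapping these methods build on can be sketched in a few lines. The following is a minimal, brute-force O(n^2) sketch of the natural visibility criterion (an edge links two samples if the straight line between them clears every intermediate sample); the paper's segment-wise linking of graphs into a temporal network of networks is not reproduced here.

```python
def visibility_graph(series):
    """Natural visibility graph: one node per sample; an edge links samples
    a and b if every intermediate sample lies strictly below the straight
    line connecting (a, series[a]) and (b, series[b])."""
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            ya, yb = series[a], series[b]
            visible = all(
                series[c] < yb + (ya - yb) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges

# Consecutive samples are always mutually visible, so the chain
# 0-1-...-(n-1) is contained in every natural visibility graph.
edges = visibility_graph([1.0, 3.0, 2.0, 4.0, 1.5])
```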
Sriyudthsak, Kansuporn; Iwata, Michio; Hirai, Masami Yokota; Shiraishi, Fumihide
2014-06-01
The availability of large-scale datasets has led to more effort being made to understand the characteristics of metabolic reaction networks. However, because the large-scale data are semi-quantitative and may contain biological variations and/or analytical errors, it remains a challenge to construct a mathematical model with precise parameters using only these data. The present work proposes a simple method, referred to as PENDISC (Parameter Estimation in a Non-DImensionalized S-system with Constraints), to assist the complex process of parameter estimation in the construction of a mathematical model for a given metabolic reaction system. The PENDISC method was evaluated using two simple mathematical models: a linear metabolic pathway model with inhibition and a branched metabolic pathway model with inhibition and activation. The results indicate that a smaller number of data points and rate constant parameters enhances the agreement between calculated values and time-series data of metabolite concentrations, and leads to faster convergence when the same initial estimates are used for the fitting. This method is also shown to be applicable to noisy time-series data and to unmeasurable metabolite concentrations in a network, and to have the potential to handle metabolome data of a relatively large-scale metabolic reaction system. Furthermore, it was applied to aspartate-derived amino acid biosynthesis in the plant Arabidopsis thaliana. The result confirms that the constructed mathematical model satisfactorily agrees with the time-series datasets of seven metabolite concentrations.
Nonstationary time series prediction combined with slow feature analysis
NASA Astrophysics Data System (ADS)
Wang, G.; Chen, X.
2015-07-01
Almost all climate time series have some degree of nonstationarity due to external driving forces perturbing the observed system. Therefore, these external driving forces should be taken into account when constructing the climate dynamics. This paper presents a new technique of obtaining the driving forces of a time series from the slow feature analysis (SFA) approach, and then introduces them into a predictive model to predict nonstationary time series. The basic theory of the technique is to consider the driving forces as state variables and to incorporate them into the predictive model. Experiments using a modified logistic time series and winter ozone data in Arosa, Switzerland, were conducted to test the model. The results showed improved prediction skills.
NASA Astrophysics Data System (ADS)
Mukhin, Dmitry; Gavrilov, Andrey; Loskutov, Evgeny; Feigin, Alexander
2016-04-01
We suggest a method for the empirical forecast of climate dynamics based on the reconstruction of reduced dynamical models in the form of random dynamical systems [1,2] derived from observational time series. The construction of a proper embedding - the set of variables determining the phase space in which the model operates - is no doubt the most important step in such modeling, but this task is non-trivial due to the huge dimension of time series of typical climatic fields. In practice, an appropriate expansion of the observational time series is needed, yielding a number of principal components that are considered as phase variables and are efficient for the construction of a low-dimensional evolution operator. We emphasize two main features the reduced models should have for capturing the main dynamical properties of the system: (i) taking into account time-lagged teleconnections in the atmosphere-ocean system and (ii) reflecting the nonlinear nature of these teleconnections. In accordance with these principles, in this report we present a methodology that combines a new way of constructing an embedding via spatio-temporal data expansion with nonlinear model construction on the basis of artificial neural networks. The methodology is applied to NCEP/NCAR reanalysis data, including fields of sea level pressure, geopotential height, and wind speed covering the Northern Hemisphere. Its efficiency for the interannual forecast of various climate phenomena, including ENSO, PDO, NAO, and strong blocking conditions over the mid-latitudes, is demonstrated. We also investigate the ability of the models to reproduce and predict the evolution of qualitative features of the dynamics, such as spectral peaks, critical transitions, and statistics of extremes. This research was supported by the Government of the Russian Federation (Agreement No. 14.Z50.31.0033 with the Institute of Applied Physics RAS). [1] Y. I. Molkov, E. M. Loskutov, D. N. Mukhin, and A. M. Feigin, "Random dynamical models from time series," Phys. Rev. E, vol. 85, no. 3, p. 036216, 2012. [2] D. Mukhin, D. Kondrashov, E. Loskutov, A. Gavrilov, A. Feigin, and M. Ghil, "Predicting Critical Transitions in ENSO models. Part II: Spatially Dependent Models," J. Clim., vol. 28, no. 5, pp. 1962-1976, 2015.
Dynamic Factor Analysis of Nonstationary Multivariate Time Series.
ERIC Educational Resources Information Center
Molenaar, Peter C. M.; And Others
1992-01-01
The dynamic factor model proposed by P. C. Molenaar (1985) is exhibited, and a dynamic nonstationary factor model (DNFM) is constructed with latent factor series that have time-varying mean functions. The use of a DNFM is illustrated using data from a television viewing habits study. (SLD)
Characterizing time series: when Granger causality triggers complex networks
NASA Astrophysics Data System (ADS)
Ge, Tian; Cui, Yindong; Lin, Wei; Kurths, Jürgen; Liu, Chong
2012-08-01
In this paper, we propose a new approach to characterize time series with noise perturbations in both the time and frequency domains by combining Granger causality and complex networks. We construct directed and weighted complex networks from time series and use representative network measures to describe their physical and topological properties. Through analyzing the typical dynamical behaviors of some physical models and the MIT-BIH (Massachusetts Institute of Technology-Beth Israel Hospital) human electrocardiogram data sets, we show that the proposed approach is able to capture and characterize various dynamics and has much potential for analyzing real-world time series of rather short length.
Bispectral Inversion: The Construction of a Time Series from Its Bispectrum
1988-04-13
take the inverse transform. Since the goal is to compute a time series given its bispectrum, it would also be nice to stay entirely in the frequency... domain and be able to go directly from the bispectrum to the Fourier transform of the time series without the need to inverse transform continuous... the picture. The approximations arise from representing the bicovariance, which is the inverse transform of a continuous function, by the inverse discrete...
Correlations of stock price fluctuations under multi-scale and multi-threshold scenarios
NASA Astrophysics Data System (ADS)
Sui, Guo; Li, Huajiao; Feng, Sida; Liu, Xueyong; Jiang, Meihui
2018-01-01
The multi-scale method is widely used in analyzing time series of financial markets, and it can provide market information for different economic entities who focus on different periods. Through constructing multi-scale networks of price-fluctuation correlations in the stock market, we can detect the topological relationships between the time series. Previous research has not addressed the problem that the original fluctuation correlation networks are fully connected networks, and more information exists within these networks than is currently being utilized. Here we use listed coal companies as a case study. First, we decompose the original stock price fluctuation series into different time scales. Second, we construct the stock price fluctuation correlation networks at different time scales. Third, we delete the edges of the network based on thresholds and analyze the network indicators. Through combining the multi-scale method with the multi-threshold method, we bring to light the implicit information of fully connected networks.
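The edge-deletion step can be sketched as follows: compute pairwise Pearson correlations among the series, then keep only edges whose absolute correlation reaches a threshold. This is a generic sketch under the assumption that the scale decomposition (whose exact transform the abstract does not specify) has already been applied; in the paper's setting it would be repeated per time scale and per threshold.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def threshold_network(series_by_name, threshold):
    """Prune the fully connected correlation network: keep an edge only
    when |rho| is at least the threshold."""
    names = list(series_by_name)
    edges = []
    for i, u in enumerate(names):
        for v in names[i + 1:]:
            rho = pearson(series_by_name[u], series_by_name[v])
            if abs(rho) >= threshold:
                edges.append((u, v, round(rho, 3)))
    return edges
```

Sweeping the threshold from 0 upward and tracking network indicators (degree, components, clustering) reveals how much structure the fully connected network actually hides.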
A Time Series of Mean Global Sea Surface Temperature from the Along-Track Scanning Radiometers
NASA Astrophysics Data System (ADS)
Veal, Karen L.; Corlett, Gary; Remedios, John; Llewellyn-Jones, David
2010-12-01
A climate data set requires a long time series of consistently processed data with suitably long periods of overlap between different instruments, which allows characterization of any inter-instrument biases. The data obtained from ESA's three Along-Track Scanning Radiometers (ATSRs) together comprise an 18-year record of SST with overlap periods of at least 6 months. The data from all three ATSRs have been consistently processed. These factors, together with the stability of the instruments and the precision of the derived SST, make this data set eminently suitable for the construction of a time series of SST that complies with many of the GCOS requirements for a climate data set. A time series of global and regional average SST anomalies has been constructed from the ATSR version 2 data set. An analysis of the overlap periods of successive instruments was used to remove intra-series biases and align the series to a common reference. An ATSR climatology has been developed and used to calculate the SST anomalies. The ATSR-1 time series and the AATSR time series have been aligned to ATSR-2. The largest adjustment is ~0.2 K between ATSR-2 and AATSR, which is suspected to be due to a shift of the 12 μm filter function for AATSR. An uncertainty of 0.06 K is assigned to the relative anomaly record that is derived from the dual three-channel night-time data. A relative uncertainty of 0.07 K is assigned to the dual night-time two-channel record, except in the ATSR-1 period (1994-1996) where it is larger.
Simple Deterministically Constructed Recurrent Neural Networks
NASA Astrophysics Data System (ADS)
Rodan, Ali; Tiňo, Peter
A large number of models for time series processing, forecasting, or modeling follow a state-space formulation. Models in the specific class of state-space approaches referred to as Reservoir Computing fix their state-transition function. The state space with the associated state-transition structure forms a reservoir, which is supposed to be sufficiently complex to capture a large number of features of the input stream that can potentially be exploited by the reservoir-to-output readout mapping. The largely "black box" character of reservoirs prevents a deeper theoretical investigation of the dynamical properties of successful reservoirs. Reservoir construction is largely driven by a series of (more or less) ad hoc randomized model-building stages, with both researchers and practitioners having to rely on trial and error. We show that a very simple deterministically constructed reservoir with a simple cycle topology gives performance comparable to that of the Echo State Network (ESN) on a number of time series benchmarks. Moreover, we argue that the memory capacity of such a model can be made arbitrarily close to the proven theoretical limit.
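The cycle topology itself is easy to sketch. Below is a minimal stand-in: each reservoir unit receives its cyclic predecessor's state scaled by a single weight r, plus an input term of fixed magnitude v with deterministic signs. The alternating-sign rule and the parameter values are illustrative assumptions, and the trained linear readout of a full ESN is omitted.

```python
import math

def cycle_reservoir_states(inputs, n_units=20, r=0.9, v=0.5):
    """Run a simple cycle reservoir on a scalar input sequence: the
    state-transition weights form a single directed cycle with uniform
    weight r; input weights have magnitude v with deterministic signs.
    Returns the sequence of reservoir state vectors."""
    # Deterministic input signs; alternating signs are the simplest stand-in
    # for the published deterministic sign-generation schemes.
    w_in = [v if i % 2 == 0 else -v for i in range(n_units)]
    state = [0.0] * n_units
    states = []
    for u in inputs:
        # Unit i receives r * state[i-1]; Python's index -1 wraps around,
        # which closes the cycle at unit 0.
        state = [math.tanh(r * state[i - 1] + w_in[i] * u)
                 for i in range(n_units)]
        states.append(state)
    return states
```

In a complete model, a linear readout (typically ridge regression from states to targets) would be trained on these state vectors; only the readout is task-specific.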
NASA Astrophysics Data System (ADS)
Jia, Duo; Wang, Cangjiao; Lei, Shaogang
2018-01-01
Mapping vegetation dynamic types in mining areas is significant for revealing the mechanisms of environmental damage and for guiding ecological construction. Dynamic types of vegetation can be identified by applying interannual normalized difference vegetation index (NDVI) time series. However, phase differences and time shifts in interannual time series decrease mapping accuracy in mining regions. To overcome these problems and to increase the accuracy of mapping vegetation dynamics, an interannual Landsat time series for optimum vegetation growing status was constructed first by using the enhanced spatial and temporal adaptive reflectance fusion model algorithm. We then proposed a Markov random field optimized semisupervised Gaussian dynamic time warping kernel-based fuzzy c-means (FCM) cluster algorithm for interannual NDVI time series to map dynamic vegetation types in mining regions. The proposed algorithm has been tested in the Shengli mining region and Shendong mining region, which are typical representatives of China's open-pit and underground mining regions, respectively. Experiments show that the proposed algorithm can solve the problems of phase differences and time shifts to achieve better performance when mapping vegetation dynamic types. The overall accuracies for the Shengli and Shendong mining regions were 93.32% and 89.60%, respectively, with improvements of 7.32% and 25.84% when compared with the original semisupervised FCM algorithm.
Layered Ensemble Architecture for Time Series Forecasting.
Rahman, Md Mustafizur; Islam, Md Monirul; Murase, Kazuyuki; Yao, Xin
2016-01-01
Time series forecasting (TSF) has been widely used in many application areas such as science, engineering, and finance. The phenomena generating time series are usually unknown, and the information available for forecasting is limited to the past values of the series. It is, therefore, necessary to use an appropriate number of past values, termed the lag, for forecasting. This paper proposes a layered ensemble architecture (LEA) for TSF problems. Our LEA consists of two layers, each of which uses an ensemble of multilayer perceptron (MLP) networks. While the first ensemble layer tries to find an appropriate lag, the second ensemble layer employs the obtained lag for forecasting. Unlike most previous work on TSF, the proposed architecture considers both the accuracy and the diversity of the individual networks in constructing an ensemble. LEA trains different networks in the ensemble by using different training sets, with the aim of maintaining diversity among the networks. However, it uses the appropriate lag and combines the best trained networks to construct the ensemble. This reflects LEA's emphasis on the accuracy of the networks. The proposed architecture has been tested extensively on time series data from the NN3 and NN5 neural network forecasting competitions. It has also been tested on several standard benchmark time series data sets. In terms of forecasting accuracy, our experimental results have revealed clearly that LEA is better than other ensemble and nonensemble methods.
NASA Astrophysics Data System (ADS)
Zou, Hai-Long; Yu, Zu-Guo; Anh, Vo; Ma, Yuan-Lin
2018-05-01
In recent years, researchers have proposed several methods to transform time series (such as those of fractional Brownian motion) into complex networks. In this paper, we construct horizontal visibility networks (HVNs) based on α-stable Lévy motion. We aim to study how the multifractal and Laplacian spectra of the transformed networks depend on the parameters α and β of the α-stable Lévy motion. First, we employ the sandbox algorithm to compute the mass exponents and multifractal spectrum to investigate the multifractality of these HVNs. Then we perform least-squares fits to find possible relations of the average fractal dimension, the average information dimension, and the average correlation dimension to these parameters, using several methods of model selection. We also investigate possible dependence relations of the eigenvalues and energy on these parameters, calculated from the Laplacian and normalized Laplacian operators of the constructed HVNs. All of these constructions and estimates will help us to evaluate the validity and usefulness of the mappings between time series and networks, especially between time series of α-stable Lévy motions and HVNs.
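The horizontal visibility criterion underlying these HVNs is even simpler than the natural one: two samples are linked iff every value between them is strictly smaller than both endpoints. A minimal sketch follows; sampling of α-stable increments is omitted (Gaussian increments would correspond to the α = 2 special case), and the spectral analysis of the resulting graph is out of scope here.

```python
def horizontal_visibility_graph(series):
    """Horizontal visibility graph: samples a and b are linked iff every
    value strictly between them is smaller than both series[a] and series[b]."""
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            if all(series[c] < min(series[a], series[b])
                   for c in range(a + 1, b)):
                edges.add((a, b))
    return edges
```

The HVG of any series contains the path through consecutive samples, and its degree distribution is the usual starting point for the multifractal analysis the abstract describes.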
Multiscale Poincaré plots for visualizing the structure of heartbeat time series.
Henriques, Teresa S; Mariani, Sara; Burykin, Anton; Rodrigues, Filipa; Silva, Tiago F; Goldberger, Ary L
2016-02-09
Poincaré delay maps are widely used in the analysis of cardiac interbeat interval (RR) dynamics. To facilitate visualization of the structure of these time series, we introduce multiscale Poincaré (MSP) plots. Starting with the original RR time series, the method employs a coarse-graining procedure to create a family of time series, each of which represents the system's dynamics in a different time scale. Next, the Poincaré plots are constructed for the original and the coarse-grained time series. Finally, as an optional adjunct, color can be added to each point to represent its normalized frequency. We illustrate the MSP method on simulated Gaussian white and 1/f noise time series. The MSP plots of 1/f noise time series reveal relative conservation of the phase space area over multiple time scales, while those of white noise show a marked reduction in area. We also show how MSP plots can be used to illustrate the loss of complexity when heartbeat time series from healthy subjects are compared with those from patients with chronic (congestive) heart failure syndrome or with atrial fibrillation. This generalized multiscale approach to Poincaré plots may be useful in visualizing other types of time series.
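The coarse-graining plus delay-map pipeline described above can be sketched directly. This uses standard non-overlapping window averaging (as in multiscale entropy); the optional color-coding of points by normalized frequency is left out.

```python
def coarse_grain(series, scale):
    """Average the series over non-overlapping windows of the given scale."""
    return [sum(series[i:i + scale]) / scale
            for i in range(0, len(series) - scale + 1, scale)]

def msp_points(series, scales=(1, 2, 4)):
    """Multiscale Poincaré point clouds: for each scale, coarse-grain the
    series and pair each value with its successor, (x_n, x_{n+1})."""
    plots = {}
    for s in scales:
        cg = coarse_grain(series, s)
        plots[s] = list(zip(cg[:-1], cg[1:]))
    return plots
```

Plotting each scale's point cloud side by side shows whether the phase-space area is conserved across scales (as for 1/f noise) or collapses (as for white noise).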
NASA Astrophysics Data System (ADS)
Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen
2016-04-01
We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].
NASA Astrophysics Data System (ADS)
Arqub, Omar Abu; El-Ajou, Ahmad; Momani, Shaher
2015-07-01
Building fractional mathematical models for specific phenomena and developing numerical or analytical solutions for these fractional mathematical models are crucial issues in mathematics, physics, and engineering. In this work, a new analytical technique for constructing and predicting solitary pattern solutions of time-fractional dispersive partial differential equations is proposed based on the generalized Taylor series formula and residual error function. The new approach provides solutions in the form of a rapidly convergent series with easily computable components using symbolic computation software. For method evaluation and validation, the proposed technique was applied to three different models and compared with some of the well-known methods. The resultant simulations clearly demonstrate the superiority and potential of the proposed technique in terms of the quality of performance and accuracy of substructure preservation in the construct, as well as the prediction of solitary pattern solutions for time-fractional dispersive partial differential equations.
Mining Recent Temporal Patterns for Event Detection in Multivariate Time Series Data
Batal, Iyad; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos
2015-01-01
Improving the performance of classifiers using pattern mining techniques has been an active topic of data mining research. In this work we introduce the recent temporal pattern mining framework for finding predictive patterns for monitoring and event detection problems in complex multivariate time series data. This framework first converts time series into time-interval sequences of temporal abstractions. It then constructs more complex temporal patterns backwards in time using temporal operators. We apply our framework to health care data of 13,558 diabetic patients and show its benefits by efficiently finding useful patterns for detecting and diagnosing adverse medical conditions that are associated with diabetes. PMID:25937993
A daily Azores-Iceland North Atlantic Oscillation index back to 1850.
Cropper, Thomas; Hanna, Edward; Valente, Maria Antónia; Jónsson, Trausti
2015-07-01
We present the construction of a continuous, daily (09:00 UTC), station-based (Azores-Iceland) North Atlantic Oscillation (NAO) Index back to 1871 which is extended back to 1850 with additional daily mean data. The constructed index more than doubles the length of previously existing, widely available, daily NAO time series. The index is created using entirely observational sea-level pressure (SLP) data from Iceland and 73.5% of observational SLP data from the Azores - the remainder being filled in via reanalysis (Twentieth Century Reanalysis Project and European Mean Sea Level Pressure) SLP data. Icelandic data are taken from the Southwest Iceland pressure series. We construct and document a new Ponta Delgada SLP time series based on recently digitized and newly available data that extend back to 1872. The Ponta Delgada time series is created by splicing together several fractured records (from Ponta Delgada, Lajes, and Santa Maria) and filling in the major gaps (pre-1872, 1888-1905, and 1940-1941) and occasional days (145) with reanalysis data. Further homogeneity corrections are applied to the Azores record, and the daily (09:00 UTC) NAO index is then calculated. The resulting index, with its extended temporal length and daily resolution, is the first reconstruction of daily NAO back into the 19th Century and therefore is useful for researchers across multiple disciplines.
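The core of a station-based NAO index, standardizing each station's sea-level pressure series and differencing them, can be sketched as follows. Note the simplifying assumption: a real construction, including the one above, normalizes against a base-period climatology (per calendar day or month) rather than whole-series moments.

```python
import math

def standardize(x):
    """Convert a series to zero-mean, unit-variance anomalies."""
    m = sum(x) / len(x)
    s = math.sqrt(sum((v - m) ** 2 for v in x) / len(x))
    return [(v - m) / s for v in x]

def nao_index(slp_azores, slp_iceland):
    """Station-based NAO index: standardized southern-station SLP minus
    standardized northern-station SLP, per time step."""
    az = standardize(slp_azores)
    ic = standardize(slp_iceland)
    return [a - i for a, i in zip(az, ic)]
```

Positive index values correspond to an anomalously strong Azores high and/or deep Icelandic low, the classic positive NAO phase.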
Characterizing Time Series Data Diversity for Wind Forecasting: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, Brian S; Chartan, Erol Kevin; Feng, Cong
Wind forecasting plays an important role in integrating variable and uncertain wind power into the power grid. Various forecasting models have been developed to improve forecasting accuracy. However, it is challenging to accurately compare the true forecasting performances of different methods and forecasters due to the lack of diversity in forecasting test datasets. This paper proposes a time series characteristic analysis approach to visualize and quantify wind time series diversity. The developed method first calculates six time series characteristic indices from various perspectives. Then principal component analysis is performed to reduce the data dimension while preserving the important information. The diversity of the time series dataset is visualized by the geometric distribution of the newly constructed principal component space. The volume of the 3-dimensional (3D) convex polytope (or the length of the 1D number axis, or the area of the 2D convex polygon) is used to quantify the time series data diversity. The method is tested with five datasets with various degrees of diversity.
Constructive Processes in Linear Order Problems Revealed by Sentence Study Times
ERIC Educational Resources Information Center
Mynatt, Barbee T.; Smith, Kirk H.
1977-01-01
This research was a further test of the theory of constructive processes proposed by Foos, Smith, Sabol, and Mynatt (1976) to account for differences among presentation orders in the construction of linear orders. This theory is composed of different series of mental operations that must be performed when an order relationship is integrated with…
Hu, Wenfa; He, Xinhua
2014-01-01
Time, quality, and cost are three important but conflicting objectives in a building construction project. It is a tough challenge for project managers to optimize all three simultaneously since they are measured in different units. This paper presents a time-cost-quality optimization model that enables managers to optimize multiple objectives. The model is based on the project breakdown structure method, in which the task resources of a construction project are divided into a series of activities and further into construction labor, materials, equipment, and administration. The resources utilized in a construction activity ultimately determine its construction time, cost, and quality, and a complex time-cost-quality trade-off model is finally generated based on correlations between construction activities. A genetic algorithm is applied in the model to solve the comprehensive nonlinear time-cost-quality problems. The building of a three-storey house is used as an example to illustrate the implementation of the model, demonstrate its advantages in optimizing the trade-off of construction time, cost, and quality, and help make winning decisions in construction practice. The computed time-cost-quality curves, presented as visual graphics in the case study, show traditional cost-time assumptions to be reasonable and demonstrate the sophistication of the time-cost-quality trade-off model.
Multidimensional stock network analysis: An Escoufier's RV coefficient approach
NASA Astrophysics Data System (ADS)
Lee, Gan Siew; Djauhari, Maman A.
2013-09-01
The current practice of stock network analysis is based on the assumption that the time series of the closing stock price can represent the behaviour of each stock. This assumption has led to the minimal spanning tree (MST) and sub-dominant ultrametric (SDU) being considered indispensable tools for filtering the economic information contained in the network. Recently, researchers have attempted to represent a stock not only as a univariate time series of the closing price but as a bivariate time series of closing price and volume. In that work, they developed the so-called multidimensional MST to filter the important economic information. However, in this paper, we show that their approach is applicable only to that bivariate case. This leads us to introduce a new methodology for constructing an MST in which each stock is represented by a multivariate time series. An example from the Malaysian stock exchange is presented and discussed to illustrate the advantages of the method.
Phase space reconstruction and estimation of the largest Lyapunov exponent for gait kinematic data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Josiński, Henryk; Świtoński, Adam; Silesian University of Technology, Akademicka 16, 44-100 Gliwice
The authors describe an example of application of nonlinear time series analysis directed at identifying the presence of deterministic chaos in human motion data by means of the largest Lyapunov exponent. The method was previously verified on the basis of a time series constructed from the numerical solutions of both the Lorenz and the Rössler nonlinear dynamical systems.
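A hedged sketch of the idea behind the largest-Lyapunov-exponent estimate: track the mean log-divergence of nearest neighbours in the reconstructed state space (in the spirit of Rosenstein et al.). The gait data of the abstract are replaced here by the fully chaotic logistic map, one of the verification systems of the same kind as the Lorenz and Rössler examples, whose exponent is known analytically (ln 2 ≈ 0.693):

```python
import math

def logistic_series(n, x0=0.4, r=4.0):
    """Scalar chaotic test series from the fully chaotic logistic map."""
    xs, x = [], x0
    for _ in range(n + 100):
        x = r * x * (1 - x)
        xs.append(x)
    return xs[100:]                      # discard the transient

def largest_lyapunov(xs, k_max=4, min_sep=10):
    n = len(xs)
    logs = [[] for _ in range(k_max + 1)]
    for i in range(n - k_max):
        # nearest neighbour in state space, excluding temporal neighbours
        j = min((j for j in range(n - k_max) if abs(j - i) > min_sep),
                key=lambda j: abs(xs[j] - xs[i]))
        if xs[j] == xs[i]:
            continue
        for k in range(k_max + 1):
            d = abs(xs[j + k] - xs[i + k])
            if d > 0:
                logs[k].append(math.log(d))
    mean_log = [sum(l) / len(l) for l in logs]
    # slope of the mean log-divergence per step approximates the exponent
    return (mean_log[k_max] - mean_log[0]) / k_max

lam = largest_lyapunov(logistic_series(800))
print(round(lam, 2))   # close to ln 2 for the logistic map
```

For flow data such as gait kinematics, a delay embedding and a time-normalized divergence curve would be needed on top of this one-dimensional sketch.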
Bivariate analysis of floods in climate impact assessments.
Brunner, Manuela Irene; Sikorska, Anna E; Seibert, Jan
2018-03-01
Climate impact studies regarding floods usually focus on peak discharges and a bivariate assessment of peak discharges and hydrograph volumes is not commonly included. A joint consideration of peak discharges and hydrograph volumes, however, is crucial when assessing flood risks for current and future climate conditions. Here, we present a methodology to develop synthetic design hydrographs for future climate conditions that jointly consider peak discharges and hydrograph volumes. First, change factors are derived based on a regional climate model and are applied to observed precipitation and temperature time series. Second, the modified time series are fed into a calibrated hydrological model to simulate runoff time series for future conditions. Third, these time series are used to construct synthetic design hydrographs. The bivariate flood frequency analysis used in the construction of synthetic design hydrographs takes into account the dependence between peak discharges and hydrograph volumes, and represents the shape of the hydrograph. The latter is modeled using a probability density function while the dependence between the design variables peak discharge and hydrograph volume is modeled using a copula. We applied this approach to a set of eight mountainous catchments in Switzerland to construct catchment-specific and season-specific design hydrographs for a control and three scenario climates. Our work demonstrates that projected climate changes have an impact not only on peak discharges but also on hydrograph volumes and on hydrograph shapes both at an annual and at a seasonal scale. These changes are not necessarily proportional which implies that climate impact assessments on future floods should consider more flood characteristics than just flood peaks. Copyright © 2017. Published by Elsevier B.V.
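The copula step above can be illustrated with a minimal sketch. A Gaussian copula with an assumed correlation and assumed exponential marginals stands in for the fitted peak-volume model; all parameter values are invented:

```python
import math
import random
from statistics import NormalDist

def sample_peak_volume(n, rho=0.7, peak_scale=100.0, vol_scale=50.0, seed=1):
    """Draw dependent (peak discharge, hydrograph volume) pairs via a
    Gaussian copula with hypothetical exponential marginals."""
    rng = random.Random(seed)
    nd = NormalDist()
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1 - rho**2) * rng.gauss(0, 1)
        u1, u2 = nd.cdf(z1), nd.cdf(z2)          # correlated uniforms
        # inverse-CDF transform to the assumed exponential marginals
        peak = -peak_scale * math.log(1 - u1)
        vol = -vol_scale * math.log(1 - u2)
        pairs.append((peak, vol))
    return pairs

pairs = sample_peak_volume(2000)
```

The study itself fits catchment- and season-specific marginals and copula families; this sketch only shows the mechanics of sampling dependent design variables.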
Reilly, Carolyn Miller; Higgins, Melinda; Smith, Andrew; Culler, Steven D; Dunbar, Sandra B
2015-11-01
This paper presents a secondary in-depth analysis of five persons with heart failure randomized to receive an education and behavioral intervention on fluid restriction as part of a larger study. Using a single subject analysis design, time series analyses models were constructed for each of the five patients for a period of 180 days to determine correlations between daily measures of patient reported fluid intake, thoracic impedance, and weights, and relationships between patient reported outcomes of symptom burden and health related quality of life over time. Negative relationships were observed between fluid intake and thoracic impedance, and between impedance and weight, while positive correlations were observed between daily fluid intake and weight. By constructing time series analyses of daily measures of fluid congestion, trends and patterns of fluid congestion emerged which could be used to guide individualized patient care or future research endeavors. Employment of such a specialized analysis technique allows for the elucidation of clinically relevant findings potentially disguised when only evaluating aggregate outcomes of larger studies. Copyright © 2015 Elsevier Inc. All rights reserved.
Modeling and forecasting of KLCI weekly return using WT-ANN integrated model
NASA Astrophysics Data System (ADS)
Liew, Wei-Thong; Liong, Choong-Yeun; Hussain, Saiful Izzuan; Isa, Zaidi
2013-04-01
The forecasting of weekly returns is one of the most challenging tasks in investment since the time series are volatile and non-stationary. In this study, an integrated model of the wavelet transform and an artificial neural network, WT-ANN, is studied for modeling and forecasting the KLCI weekly return. First, the WT is applied to decompose the weekly return time series in order to eliminate noise. Then, a mathematical model of the time series is constructed using the ANN. The performance of the suggested model is evaluated by the root mean squared error (RMSE), mean absolute error (MAE), and mean absolute percentage error (MAPE). The results show that the WT-ANN model can be considered a feasible and powerful model for time series modeling and prediction.
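The three evaluation metrics named above can be written out explicitly (y holds observed values, yhat the forecasts; the sample numbers are made up):

```python
import math

def rmse(y, yhat):
    """Root mean squared error."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y))

def mae(y, yhat):
    """Mean absolute error."""
    return sum(abs(a - b) for a, b in zip(y, yhat)) / len(y)

def mape(y, yhat):
    """Mean absolute percentage error; undefined if any observation is 0."""
    return 100.0 * sum(abs((a - b) / a) for a, b in zip(y, yhat)) / len(y)

y, yhat = [1.0, 2.0, 4.0], [1.5, 2.0, 3.0]
print(rmse(y, yhat), mae(y, yhat), mape(y, yhat))  # ~0.645, 0.5, 25.0
```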
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gentili, Pier Luigi, E-mail: pierluigi.gentili@unipg.it; Gotoda, Hiroshi; Dolnik, Milos
Forecasting of aperiodic time series is a compelling challenge for science. In this work, we analyze aperiodic spectrophotometric data, proportional to the concentrations of two forms of a thermoreversible photochromic spiro-oxazine, that are generated when a cuvette containing a solution of the spiro-oxazine undergoes photoreaction and convection due to localized ultraviolet illumination. We construct the phase space for the system using Takens' theorem and we calculate the Lyapunov exponents and the correlation dimensions to ascertain the chaotic character of the time series. Finally, we predict the time series using three distinct methods: a feed-forward neural network, fuzzy logic, and a local nonlinear predictor. We compare the performances of these three methods.
Construction of the Non-Rigid Earth Rotation Series
NASA Astrophysics Data System (ADS)
Pashkevich, V. V.
2007-01-01
In recent years, many attempts have been made to derive a high-precision theory of non-rigid Earth rotation. For this purpose, different transfer functions are used. Usually these transfer functions are applied to the series representing the nutation in longitude and obliquity of the rigid Earth rotation with respect to the ecliptic of date. The aim of this investigation is the construction of a new high-precision non-rigid Earth rotation series (SN9000), dynamically adequate to the DE404/LE404 ephemeris over a 2000-year time span, presented as functions of the Euler angles Ψ, θ and φ with respect to the fixed ecliptic plane and equinox J2000.0.
Reconstructing multi-mode networks from multivariate time series
NASA Astrophysics Data System (ADS)
Gao, Zhong-Ke; Yang, Yu-Xuan; Dang, Wei-Dong; Cai, Qing; Wang, Zhen; Marwan, Norbert; Boccaletti, Stefano; Kurths, Jürgen
2017-09-01
Unveiling the dynamics hidden in multivariate time series is a task of the utmost importance in a broad variety of areas in physics. We here propose a method that leads to the construction of a novel functional network, a multi-mode weighted graph combined with an empirical mode decomposition, and to the realization of multi-information fusion of multivariate time series. The method is illustrated in a couple of successful applications (a multi-phase flow and an epileptic electro-encephalogram), which demonstrate its powerfulness in revealing the dynamical behaviors underlying the transitions of different flow patterns, and enabling to differentiate brain states of seizure and non-seizure.
What does the structure of its visibility graph tell us about the nature of the time series?
NASA Astrophysics Data System (ADS)
Franke, Jasper G.; Donner, Reik V.
2017-04-01
Visibility graphs are a recently introduced method to construct complex network representations from univariate time series in order to study their dynamical characteristics [1]. In recent years, this approach has been successfully applied to a considerable variety of geoscientific research questions and data sets, including non-trivial temporal patterns in complex earthquake catalogs [2] and time-reversibility in climate time series [3]. It has been shown that several characteristic features of the networks thus constructed differ between stochastic and deterministic (possibly chaotic) processes, which is, however, relatively hard to exploit in real-world applications. In this study, we propose studying two new measures related to the network complexity of visibility graphs constructed from time series: one is a special type of network entropy [4] and the other a recently introduced measure of the heterogeneity of the network's degree distribution [5]. For paradigmatic model systems exhibiting bifurcation sequences between regular and chaotic dynamics, both properties clearly trace the transitions between the two types of regimes and exhibit marked quantitative differences for regular and chaotic dynamics. Moreover, for dynamical systems with a small amount of additive noise, the considered properties show gradual changes prior to the bifurcation point. This finding appears closely related to the impending loss of stability of the current state, which is known to lead to critical slowing down as the transition point is approached. In this spirit, both considered visibility graph characteristics provide alternative tracers of dynamical early warning signals consistent with classical indicators.
Our results demonstrate that measures of visibility graph complexity (i) provide a potentially useful means of tracing changes in the dynamical patterns encoded in a univariate time series that originate from increasing autocorrelation and (ii) make it possible to systematically distinguish regular from deterministic-chaotic dynamics. We demonstrate the application of our method to different model systems as well as selected paleoclimate time series from the North Atlantic region. Notably, visibility graph based methods are particularly suited for studying the latter type of geoscientific data, since they do not impose intrinsic restrictions or assumptions on the nature of the time series under investigation in terms of noise process, linearity or sampling homogeneity. [1] Lacasa, Lucas, et al. "From time series to complex networks: The visibility graph." Proceedings of the National Academy of Sciences 105.13 (2008): 4972-4975. [2] Telesca, Luciano, and Michele Lovallo. "Analysis of seismic sequences by using the method of visibility graph." EPL (Europhysics Letters) 97.5 (2012): 50002. [3] Donges, Jonathan F., Reik V. Donner, and Jürgen Kurths. "Testing time series irreversibility using complex network methods." EPL (Europhysics Letters) 102.1 (2013): 10004. [4] Small, Michael. "Complex networks from time series: capturing dynamics." 2013 IEEE International Symposium on Circuits and Systems (ISCAS2013), Beijing (2013): 2509-2512. [5] Jacob, Rinku, K.P. Harikrishnan, Ranjeev Misra, and G. Ambika. "Measure for degree heterogeneity in complex networks and its application to recurrence network analysis." arXiv preprint 1605.06607 (2016).
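The natural visibility criterion underlying these graphs (two samples are linked if every intermediate sample lies strictly below the straight line joining them) can be sketched in a few lines. This is a generic O(n²) construction for evenly sampled series, not the authors' implementation:

```python
def visibility_edges(series):
    """Edge list of the natural visibility graph of an evenly sampled series."""
    n = len(series)
    edges = []
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                series[c] < series[b]
                + (series[a] - series[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.append((a, b))
    return edges

# The samples 3 and 2 see each other over the dip at 1;
# adjacent samples always connect.
print(visibility_edges([3.0, 1.0, 2.0]))  # [(0, 1), (0, 2), (1, 2)]
```

Network entropy and degree-heterogeneity measures of the kind studied in the abstract would then be computed from the degree sequence of this graph.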
46 CFR 108.550 - Survival craft launching and recovery arrangements: General.
Code of Federal Regulations, 2011 CFR
2011-10-01
... must be designed, based on the ultimate strength of the construction material, to be at least 4.5 times...-MOBILE OFFSHORE DRILLING UNITS DESIGN AND EQUIPMENT Lifesaving Equipment § 108.550 Survival craft... approved under approval series 160.132, with a winch approved under approval series 160.115. Each launching...
46 CFR 108.550 - Survival craft launching and recovery arrangements: General.
Code of Federal Regulations, 2010 CFR
2010-10-01
... must be designed, based on the ultimate strength of the construction material, to be at least 4.5 times...-MOBILE OFFSHORE DRILLING UNITS DESIGN AND EQUIPMENT Lifesaving Equipment § 108.550 Survival craft... approved under approval series 160.132, with a winch approved under approval series 160.115. Each launching...
A Continuous Long-Term Record of Magnetic-Storm Occurrence and Intensity
NASA Astrophysics Data System (ADS)
Love, J. J.
2007-05-01
Hourly magnetometer data have been produced by ground-based magnetic observatories for over a century. These data are used for a wide variety of applications, including many for space physics. In particular, hourly data from a longitudinal necklace of mid-latitude observatories can be used to construct a time series recording the storm-time disturbance index Dst, one of the most useful scalar summaries of magnetic storm intensity which is generally interpreted in terms of an equivalent equatorial magnetospheric ring current. Dst has been routinely calculated in a temporally piece-wise fashion since the IGY using a subset of the available observatories: four or five stations, typically including Honolulu (HON), San Juan (SJG), Kakioka Japan (KAK), Hermanus South Africa (HER), and Alibag India (ABG). In this presentation we discuss a single continuous Dst time series made using a denser and more uniform distribution of observatories than that which is standard: including, additionally, Watheroo Australia (WAT), Apia Samoa (API), and Vassouras Brazil (VSS). Starting first with the data from each individual observatory, we subtract the geomagnetic secular variation, caused primarily by the core dynamo, and the solar-quiet (Sq) variation, caused primarily by the ionospheric dynamo. The latter requires careful spectral analysis, and those intermediate results are, themselves, of scientific interest. Following this, we combine the disturbance residuals from each station to form the continuous Dst time series. Statistics deduced from this model allow us to quantify the likelihood of storm occurrence and intensity, both of which are modulated in time by the solar cycle. This analysis is accomplished using a 50 year Dst time series. The prospects for constructing a longer continuous Dst time series are discussed.
Dynamic Factor Analysis Models with Time-Varying Parameters
ERIC Educational Resources Information Center
Chow, Sy-Miin; Zu, Jiyun; Shifren, Kim; Zhang, Guangjian
2011-01-01
Dynamic factor analysis models with time-varying parameters offer a valuable tool for evaluating multivariate time series data with time-varying dynamics and/or measurement properties. We use the Dynamic Model of Activation proposed by Zautra and colleagues (Zautra, Potter, & Reich, 1997) as a motivating example to construct a dynamic factor…
Using spectrotemporal indices to improve the fruit-tree crop classification accuracy
NASA Astrophysics Data System (ADS)
Peña, M. A.; Liao, R.; Brenning, A.
2017-06-01
This study assesses the potential of spectrotemporal indices derived from satellite image time series (SITS) to improve the classification accuracy of fruit-tree crops. Six major fruit-tree crop types in the Aconcagua Valley, Chile, were classified by applying various linear discriminant analysis (LDA) techniques on a Landsat-8 time series of nine images corresponding to the 2014-15 growing season. As features we not only used the complete spectral resolution of the SITS, but also all possible normalized difference indices (NDIs) that can be constructed from any two bands of the time series, a novel approach to derive features from SITS. Due to the high dimensionality of this "enhanced" feature set we used the lasso and ridge penalized variants of LDA (PLDA). Although classification accuracies yielded by the standard LDA applied on the full-band SITS were good (misclassification error rate, MER = 0.13), they were further improved by 23% (MER = 0.10) with ridge PLDA using the enhanced feature set. The most important bands to discriminate the crops of interest were mainly concentrated on the first two image dates of the time series, corresponding to the crops' greenup stage. Despite the high predictor weights provided by the red and near infrared bands, typically used to construct greenness spectral indices, other spectral regions were also found important for the discrimination, such as the shortwave infrared band at 2.11-2.19 μm, sensitive to foliar water changes. These findings support the usefulness of spectrotemporal indices in the context of SITS-based crop type classifications, which until now have been mainly constructed by the arithmetic combination of two bands of the same image date in order to derive greenness temporal profiles like those from the normalized difference vegetation index.
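The "enhanced" feature construction described above, all pairwise normalized difference indices NDI_ij = (b_i − b_j)/(b_i + b_j), is straightforward to sketch for a single pixel. The band values below are hypothetical reflectances, not data from the study:

```python
from itertools import combinations

def all_ndis(bands):
    """Normalized difference index for every unordered band pair."""
    return {(i, j): (bands[i] - bands[j]) / (bands[i] + bands[j])
            for i, j in combinations(range(len(bands)), 2)}

pixel = [0.10, 0.15, 0.20, 0.45]          # hypothetical reflectances
ndis = all_ndis(pixel)
print(len(ndis))                          # 6 indices from 4 bands
print(round(ndis[(0, 3)], 3))             # -0.636 (an NDVI-like pair)
```

Stacked across all image dates of the SITS, these pairwise indices form the high-dimensional feature set that the penalized LDA variants then handle.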
Halliday, David M; Senik, Mohd Harizal; Stevenson, Carl W; Mason, Rob
2016-08-01
The ability to infer network structure from multivariate neuronal signals is central to computational neuroscience. Directed network analyses typically use parametric approaches based on auto-regressive (AR) models, where networks are constructed from estimates of AR model parameters. However, the validity of using low-order AR models for neurophysiological signals has been questioned. A recent article introduced a non-parametric approach to estimate directionality in bivariate data; non-parametric approaches are free from concerns over model validity. We extend the non-parametric framework to include measures of directed conditional independence, using scalar measures that decompose the overall partial correlation coefficient summatively by direction, and a set of functions that decompose the partial coherence summatively by direction. A time domain partial correlation function allows both time and frequency views of the data to be constructed. The conditional independence estimates are conditioned on a single predictor. The framework is applied to simulated cortical neuron networks and mixtures of Gaussian time series data with known interactions, and to experimental data consisting of local field potential recordings from bilateral hippocampus in anaesthetised rats. The framework offers a novel, non-parametric alternative for estimating directed interactions in multivariate neuronal recordings, with increased flexibility in dealing with both spike train and time series data. Copyright © 2016 Elsevier B.V. All rights reserved.
Typology of State Types: Persistence and Transition
2015-04-28
is the lack of positive transition among the weakest states. Our findings are derived from a minimalist construct of a refined time series dataset...states based on a "minimalist" construct of the Country Indicators for Foreign Policy (CIFP) fragile states project and its core structural...begin with the rationale for developing a minimalist construct of a state typology model (STM), similar to the approach taken by Gravingholt, Ziaja
Transition Icons for Time-Series Visualization and Exploratory Analysis.
Nickerson, Paul V; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd M; Tighe, Patrick J; Rashidi, Parisa
2018-03-01
The modern healthcare landscape has seen the rapid emergence of techniques and devices that temporally monitor and record physiological signals. The prevalence of time-series data within the healthcare field necessitates the development of methods that can analyze the data in order to draw meaningful conclusions. Time-series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call transition icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition icons are adept at detecting and displaying subtle differences and similarities, e.g., between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods that collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from symbolic aggregate approXimation representations, and compiles transition frequencies into a bag of patterns constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the transition icon technique for two time-series datasets-postoperative pain scores, and hip-worn accelerometer activity counts. We believe transition icons can be an important tool for researchers approaching time-series data, as they give rich and intuitive information about collective time-series behaviors.
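A hedged sketch of the pipeline described above: z-normalize, reduce with piecewise aggregate approximation (PAA), discretize against Gaussian breakpoints (the SAX representation), then count symbol transitions as a bag of patterns. The word length and alphabet size are illustrative choices, not the paper's parameters:

```python
from collections import Counter
from statistics import NormalDist, mean, stdev

def sax_symbols(series, segments=8, alphabet=3):
    """SAX symbols of a series (length assumed divisible by `segments`)."""
    mu, sd = mean(series), stdev(series)
    z = [(x - mu) / sd for x in series]
    seg = len(z) // segments
    paa = [mean(z[i*seg:(i+1)*seg]) for i in range(segments)]
    nd = NormalDist()
    cuts = [nd.inv_cdf(k / alphabet) for k in range(1, alphabet)]
    return [sum(v > c for c in cuts) for v in paa]   # symbols 0..alphabet-1

def transition_counts(symbols):
    """Bag of adjacent symbol transitions for one series."""
    return Counter(zip(symbols, symbols[1:]))

series = [float(i % 16) for i in range(64)]          # toy sawtooth
syms = sax_symbols(series)
print(transition_counts(syms))
```

Normalizing such counts within each patient group and arranging them on a grid gives the icon-style display the abstract describes.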
Real-Time Series Resistance Monitoring in PV Systems Without the Need for I-V Curves
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deceglie, Michael G.; Silverman, Timothy J.; Marion, Bill
We apply the physical principles of a familiar method, suns-Voc, to a new application: the real-time detection of series resistance changes in modules and systems operating outside. The real-time series resistance (RTSR) method that we describe avoids the need for collecting I-V curves or constructing full series resistance-free I-V curves. RTSR is most readily deployable at the module level on microinverters or module-integrated electronics, but it can also be extended to full strings. We found that automated detection of series resistance increases can provide early warnings of some of the most common reliability issues, which also pose fire risks, including broken ribbons, broken solder bonds, and contact problems in the junction or combiner box. We also describe the method in detail and describe a sample application to data collected from modules operating in the field.
Joint Seasonal ARMA Approach for Modeling of Load Forecast Errors in Planning Studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hafen, Ryan P.; Samaan, Nader A.; Makarov, Yuri V.
2014-04-14
To make informed and robust decisions in the probabilistic power system operation and planning process, it is critical to conduct multiple simulations of the generated combinations of wind and load parameters and their forecast errors to handle the variability and uncertainty of these time series. In order for the simulation results to be trustworthy, the simulated series must preserve the salient statistical characteristics of the real series. In this paper, we analyze day-ahead load forecast error data from multiple balancing authority locations and characterize statistical properties such as mean, standard deviation, autocorrelation, correlation between series, time-of-day bias, and time-of-day autocorrelation. We then construct and validate a seasonal autoregressive moving average (ARMA) model to model these characteristics, and use the model to jointly simulate day-ahead load forecast error series for all BAs.
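As a toy stand-in for the seasonal ARMA model, a plain AR(1) already illustrates the fit-then-simulate workflow: estimate the coefficient from the lag-1 autocorrelation of an observed error series, then simulate a synthetic series that preserves its mean, variance, and autocorrelation. Seasonality, cross-series correlation, and time-of-day effects are omitted:

```python
import random
import statistics

def fit_ar1(errors):
    """Method-of-moments AR(1) fit: phi from the lag-1 autocorrelation."""
    mu = statistics.fmean(errors)
    x = [e - mu for e in errors]
    phi = sum(a * b for a, b in zip(x, x[1:])) / sum(a * a for a in x)
    resid_sd = statistics.pstdev(x) * (1 - phi ** 2) ** 0.5
    return mu, phi, resid_sd

def simulate_ar1(mu, phi, resid_sd, n, seed=7):
    """Simulate an AR(1) series with Gaussian innovations."""
    rng = random.Random(seed)
    out, x = [], 0.0
    for _ in range(n):
        x = phi * x + rng.gauss(0, resid_sd)
        out.append(mu + x)
    return out

observed = simulate_ar1(0.0, 0.8, 1.0, 500, seed=1)   # stand-in "real" errors
mu, phi, sd = fit_ar1(observed)
synthetic = simulate_ar1(mu, phi, sd, 500)
```

The paper's joint seasonal ARMA extends this by adding seasonal terms and by simulating all balancing authorities' series together with their cross-correlations.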
Detection of anomalous signals in temporally correlated data (Invited)
NASA Astrophysics Data System (ADS)
Langbein, J. O.
2010-12-01
Detection of transient tectonic signals in data obtained from large geodetic networks requires the ability to detect signals that are both temporally and spatially coherent. In this report I will describe a modification to an existing method that estimates both the coefficients of a temporally correlated noise model and an efficient filter based on that model. This filter, when applied to the original time-series, effectively whitens (or flattens) the power spectrum. The filtered data provide the means to calculate running averages, which are then used to detect deviations from the background trends. For large networks, time-series of signal-to-noise ratio (SNR) can be easily constructed since, by filtering, each of the original time-series has been transformed into one that is closer to having a Gaussian distribution with a variance of 1.0. Anomalous intervals may be identified by counting the number of GPS sites for which the SNR exceeds a specified value. For example, during one time interval, if there were 5 out of 20 time-series with SNR>2, this would be considered anomalous; typically, one would expect at 95% confidence that there would be at least 1 out of 20 time-series with an SNR>2. For time intervals with an anomalously large number of high SNR values, the spatial distribution of the SNR is mapped to identify the location of the anomalous signal(s) and their degree of spatial clustering. Estimating the filter that should be used to whiten the data requires modification of the existing methods that employ maximum likelihood estimation to determine the temporal covariance of the data. In these methods, it is assumed that the noise components in the data are a combination of white, flicker and random-walk processes and that they are derived from three different and independent sources.
Instead, in this new method, the covariance matrix is constructed assuming that only one source is responsible for the noise and that this source can be represented as a white-noise random-number generator convolved with a filter whose spectral properties are frequency (f) independent at the highest frequencies, 1/f at the middle frequencies, and 1/f² at the lowest frequencies. For data sets with no gaps in their time-series, construction of the covariance and inverse covariance matrices is extremely efficient. Application of the above algorithm to real data potentially involves several iterations, as small tectonic signals of interest are often indistinguishable from background noise. Consequently, simple plots of the time-series of each GPS site are used to identify the largest outliers and signals, regardless of their cause. Any analysis of the background noise levels must factor in these other signals, while the gross outliers need to be removed.
Reconstruction of the Precipitation in the Canary Islands for the Period 1595-1836.
NASA Astrophysics Data System (ADS)
García, Ricardo; Macias, Antonio; Gallego, David; Hernández, Emiliano; Gimeno, Luis; Ribera, Pedro
2003-08-01
Historical documentary sources in the Canary Islands have been used to construct cereal production series for the period 1595-1836. The cereal growth period in this region covers essentially the rainy season, making these crops adequate to characterize the annual precipitation. A proxy for the Islands' rainfall based on the historical series of wheat and barley production has been constructed and assessed by using two independent series of dry and wet years. The spectral analysis of the crop production reveals strongly non-stationary behavior. This fact, along with the direct comparison with several reconstructed and instrumental North Atlantic Oscillation series, suggests the potential use of the reconstructed precipitation as a proxy for this climatic oscillation during preinstrumental times. This is an abridged version of the full-length article that is available online (10.1175/BAMS-84-8-García).
Using First Differences to Reduce Inhomogeneity in Radiosonde Temperature Datasets.
NASA Astrophysics Data System (ADS)
Free, Melissa; Angell, James K.; Durre, Imke; Lanzante, John; Peterson, Thomas C.; Seidel, Dian J.
2004-11-01
The utility of a “first difference” method for producing temporally homogeneous large-scale mean time series is assessed. Starting with monthly averages, the method involves dropping data around the time of suspected discontinuities and then calculating differences in temperature from one year to the next, resulting in a time series of year-to-year differences for each month at each station. These first difference time series are then combined to form large-scale means, and mean temperature time series are constructed from the first difference series. When applied to radiosonde temperature data, the method introduces random errors that decrease with the number of station time series used to create the large-scale time series and increase with the number of temporal gaps in the station time series. Root-mean-square errors for annual means of datasets produced with this method using over 500 stations are estimated at no more than 0.03 K, with errors in trends less than 0.02 K decade⁻¹ for 1960-97 at 500 mb. For a 50-station dataset, errors in trends in annual global means introduced by the first differencing procedure may be as large as 0.06 K decade⁻¹ (for six breaks per series), which is greater than the standard error of the trend. Although the first difference method offers significant resource and labor advantages over methods that attempt to adjust the data, it introduces an error in large-scale mean time series that may be unacceptable in some cases.
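The first-difference procedure can be sketched for annual data: difference each station series year-to-year (discarding differences that touch years flagged around suspected discontinuities), average the differences across stations, then accumulate to recover a large-scale anomaly series. The two-station example is synthetic:

```python
from statistics import fmean

def first_difference(series, drop_years=()):
    """Year-to-year differences; differences touching a dropped year are None."""
    s = [None if yr in drop_years else v for yr, v in enumerate(series)]
    return [s[i+1] - s[i] if s[i] is not None and s[i+1] is not None else None
            for i in range(len(s) - 1)]

def large_scale_mean(stations, drops):
    """Average the station first differences, then accumulate into a series."""
    diffs = [first_difference(s, d) for s, d in zip(stations, drops)]
    mean_diff = [fmean([d[i] for d in diffs if d[i] is not None])
                 for i in range(len(diffs[0]))]
    out = [0.0]
    for d in mean_diff:
        out.append(out[-1] + d)
    return out

# Two stations sharing a 0.1 K/yr trend; station B has a spurious 1.0 K jump
# at year 5 that is excluded by dropping that year before differencing.
a = [0.1 * t for t in range(10)]
b = [0.1 * t + (1.0 if t >= 5 else 0.0) for t in range(10)]
rec = large_scale_mean([a, b], [(), (5,)])
print(round(rec[-1] - rec[0], 3))   # 0.9, the true 10-year change
```

This illustrates the method's appeal (the step change never enters the mean series) and its cost: each dropped year removes differences and thus adds random error, as the abstract quantifies.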
NASA Astrophysics Data System (ADS)
Yuan, Y.; Meng, Y.; Chen, Y. X.; Jiang, C.; Yue, A. Z.
2018-04-01
In this study, we proposed a method to map urban encroachment onto farmland using satellite image time series (SITS) based on the hierarchical hidden Markov model (HHMM). In this method, the farmland change process is decomposed into three hierarchical levels, i.e., the land cover level, the vegetation phenology level, and the SITS level. A three-level HHMM is then constructed to model the multi-level semantic structure of the farmland change process. Once the HHMM is established, a change from farmland to built-up can be detected by inferring the underlying state sequence that is most likely to have generated the input time series. The performance of the method is evaluated on MODIS time series in Beijing. Results on both simulated and real datasets demonstrate that our method improves the change detection accuracy compared with the HMM-based method.
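The inference step, finding the state sequence most likely to have generated the observations, is standard Viterbi decoding. The sketch below uses a flat two-state HMM rather than the paper's three-level HHMM; the states, observation symbols, and probabilities are invented to illustrate how a farmland-to-built-up change is read off the decoded sequence:

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence (log-space Viterbi)."""
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]])
          for s in states}]
    path = {s: [s] for s in states}
    for o in obs[1:]:
        V.append({})
        new_path = {}
        for s in states:
            prob, prev = max(
                (V[-2][p] + math.log(trans_p[p][s]) + math.log(emit_p[s][o]), p)
                for p in states)
            V[-1][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])
    return path[best]

# Illustrative two-state model: 'farm' emits mostly high NDVI ('hi'),
# 'built' emits mostly low NDVI ('lo'); 'built' is nearly absorbing.
states = ('farm', 'built')
start_p = {'farm': 0.9, 'built': 0.1}
trans_p = {'farm': {'farm': 0.95, 'built': 0.05},
           'built': {'farm': 0.01, 'built': 0.99}}
emit_p = {'farm': {'hi': 0.8, 'lo': 0.2},
          'built': {'hi': 0.1, 'lo': 0.9}}
obs = ['hi', 'hi', 'hi', 'lo', 'lo', 'lo']
print(viterbi(obs, states, start_p, trans_p, emit_p))
# → ['farm', 'farm', 'farm', 'built', 'built', 'built']
```

The decoded change point (between the third and fourth observations) is the detected conversion date; the HHMM generalizes this by nesting phenology states inside each land-cover state.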
Fractal analysis on human dynamics of library loans
NASA Astrophysics Data System (ADS)
Fan, Chao; Guo, Jin-Li; Zha, Yi-Long
2012-12-01
In this paper, the fractal characteristic of human behaviors is investigated from the perspective of time series constructed with the amount of library loans. The values of the Hurst exponent and length of non-periodic cycle calculated through rescaled range analysis indicate that the time series of human behaviors and their sub-series are fractal with self-similarity and long-range dependence. Then the time series are converted into complex networks by the visibility algorithm. The topological properties of the networks such as scale-free property and small-world effect imply that there is a close relationship among the numbers of repetitious behaviors performed by people during certain periods of time. Our work implies that there is intrinsic regularity in the human collective repetitious behaviors. The conclusions may be helpful to develop some new approaches to investigate the fractal feature and mechanism of human dynamics, and provide some references for the management and forecast of human collective behaviors.
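The natural visibility algorithm that converts a series into a network has a compact definition: two samples are linked if the straight line between them clears every intermediate sample. A minimal sketch on toy data (brute-force O(n²), not an optimized implementation):

```python
def visibility_edges(series):
    """Natural visibility graph: nodes are time indices; i and j (i < j)
    are linked if every intermediate sample lies strictly below the
    line of sight between (i, series[i]) and (j, series[j])."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                series[k] < series[i]
                + (series[j] - series[i]) * (k - i) / (j - i)
                for k in range(i + 1, j))
            if visible:
                edges.add((i, j))
    return edges

ts = [3, 1, 4, 1, 5, 2]
edges = visibility_edges(ts)
print(sorted(edges))
# → [(0, 1), (0, 2), (1, 2), (2, 3), (2, 4), (3, 4), (4, 5)]
```

Adjacent samples are always connected, while large values (here the 4 and the 5) block longer-range visibility; degree heterogeneity in the resulting graph is what reveals the scale-free property discussed above.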
NASA Astrophysics Data System (ADS)
Heinemeier, Jan; Jungner, Högne; Lindroos, Alf; Ringbom, Åsa; von Konow, Thorborg; Rud, Niels
1997-03-01
A method for refining lime mortar samples for 14C dating has been developed. It includes mechanical and chemical separation of mortar carbonate with optical control of the purity of the samples. The method has been applied to a large series of AMS datings on lime mortar from three medieval churches on the Åland Islands, Finland. The datings show convincing internal consistency and confine the construction time of the churches to AD 1280-1380 with a most probable date just before AD 1300. We have also applied the method to the controversial Newport Tower, Rhode Island, USA. Our mortar datings confine the building to colonial time in the 17th century and thus refute claims of Viking origin of the tower. For the churches, a parallel series of datings of organic (charcoal) inclusions in the mortar show less reliable results than the mortar samples, which is ascribed to poor association with the construction time.
NASA Astrophysics Data System (ADS)
Chorozoglou, D.; Kugiumtzis, D.; Papadimitriou, E.
2018-06-01
Seismic hazard assessment in the area of Greece is attempted by studying the structure of the earthquake network, e.g., whether it is small-world or random. In this network, a node represents a seismic zone in the study area and a connection between two nodes is given by the correlation of the seismic activity of the two zones. To investigate the network structure, and particularly the small-world property, the earthquake correlation network is compared with randomized ones. Simulations on multivariate time series of different lengths and numbers of variables show that, for the construction of randomized networks, the method that randomizes the time series performs better than methods that randomize the original network connections directly. Based on the appropriate randomization method, the network approach is applied to time series of earthquakes that occurred between main shocks in the territory of Greece spanning the period 1999-2015. The characterization of networks on sliding time windows revealed that small-world structure emerges in the last time interval, shortly before the main shock.
Financing School Construction. Educational Facilities Review Series, Number 12.
ERIC Educational Resources Information Center
Piele, Philip K.
The combination of defeated bond issues and rising building costs is contributing to a decline in both the construction of new school buildings and the remodeling of existing buildings. For the first time in many years, debt service and capital outlay expenditures actually declined on a per pupil basis. No change in either voter preferences or…
Chen, Chi-Kan
2017-07-26
The identification of genetic regulatory networks (GRNs) provides insights into complex cellular processes. A class of recurrent neural networks (RNNs) captures the dynamics of GRNs. Algorithms combining the RNN and machine learning schemes were proposed to reconstruct small-scale GRNs using gene expression time series. We present new GRN reconstruction methods with neural networks. The RNN is extended to a class of recurrent multilayer perceptrons (RMLPs) with latent nodes. Our methods contain two steps: the edge rank assignment step and the network construction step. The former assigns ranks to all possible edges by a recursive procedure based on the estimated weights of wires of the RNN/RMLP (RE_RNN/RE_RMLP), and the latter constructs a network consisting of top-ranked edges under which the optimized RNN simulates the gene expression time series. Particle swarm optimization (PSO) is applied to optimize the parameters of the RNNs and RMLPs in a two-step algorithm. The proposed RE_RNN-RNN and RE_RMLP-RNN algorithms are tested on synthetic and experimental gene expression time series of small GRNs of about 10 genes. The experimental time series are from studies of yeast cell cycle regulated genes and E. coli DNA repair genes. The unstable estimation of the RNN using experimental time series with limited data points can lead to fairly arbitrary predicted GRNs. Our methods incorporate the RNN and RMLP into a two-step structure learning procedure. Results show that RE_RMLP, using an RMLP with a suitable number of latent nodes to reduce the parameter dimension, often produces more accurate edge ranks than RE_RNN using the regularized RNN on short simulated time series. Combining, by a weighted majority voting rule, the networks derived by RE_RMLP-RNN using different numbers of latent nodes in step one to infer the GRN, the method performs consistently and outperforms published algorithms for GRN reconstruction on most benchmark time series.
The framework of two-step algorithms can potentially incorporate different nonlinear differential equation models to reconstruct the GRN.
Testing the shape of distributions of weather data
NASA Astrophysics Data System (ADS)
Baccon, Ana L. P.; Lunardi, José T.
2016-08-01
The characterization of the statistical distributions of observed weather data is of crucial importance both for the construction and for the validation of weather models, such as weather generators (WGs). An important class of WGs (e.g., the Richardson-type generators) reduces the time series of each variable to a time series of its residual elements, and the residuals are often assumed to be normally distributed. In this work we propose an approach to investigate whether the shape assumed for the distribution of residuals is consistent with the observed data of a given site. Specifically, this procedure tests whether the same distribution shape for the residual noise is maintained over time. The proposed approach is an adaptation to climate time series of a procedure first introduced to test the shapes of distributions of growth rates of business firms aggregated in large panels of short time series. We illustrate the procedure by applying it to the residual time series of maximum temperature at a given location, and investigate the empirical consistency of two assumptions, namely (i) the common assumption that the distribution of the residuals is Gaussian, and (ii) that the residual noise has a time-invariant shape which coincides with the empirical distribution of all the residual noise of the whole time series pooled together.
Atlantic multi-decadal oscillation covaries with Agulhas leakage
Biastoch, Arne; Durgadoo, Jonathan V.; Morrison, Adele K.; van Sebille, Erik; Weijer, Wilbert; Griffies, Stephen M.
2015-01-01
The interoceanic transfer of seawater between the Indian Ocean and the Atlantic, ‘Agulhas leakage', forms a choke point for the overturning circulation in the global ocean. Here, by combining output from a series of high-resolution ocean and climate models with in situ and satellite observations, we construct a time series of Agulhas leakage for the period 1870–2014. The time series demonstrates the impact of Southern Hemisphere westerlies on decadal timescales. Agulhas leakage shows a correlation with the Atlantic Multi-decadal Oscillation on multi-decadal timescales; the former leading by 15 years. This is relevant for climate in the North Atlantic. PMID:26656850
Zhang, Fang; Wagner, Anita K; Soumerai, Stephen B; Ross-Degnan, Dennis
2009-02-01
Interrupted time series (ITS) is a strong quasi-experimental research design, which is increasingly applied to estimate the effects of health services and policy interventions. We describe and illustrate two methods for estimating confidence intervals (CIs) around absolute and relative changes in outcomes calculated from segmented regression parameter estimates. We used multivariate delta and bootstrapping methods (BMs) to construct CIs around relative changes in level and trend, and around absolute changes in outcome based on segmented linear regression analyses of time series data corrected for autocorrelated errors. Using previously published time series data, we estimated CIs around the effect of prescription alerts for interacting medications with warfarin on the rate of prescriptions per 10,000 warfarin users per month. Both the multivariate delta method (MDM) and the BM produced similar results. BM is preferred for calculating CIs of relative changes in outcomes of time series studies, because it does not require large sample sizes when parameter estimates are obtained correctly from the model. Caution is needed when sample size is small.
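The bootstrap idea can be illustrated with a simplified variant of the paper's setup: here the "relative change" is the post-intervention mean compared against a counterfactual extrapolated from the pre-intervention trend, and only pre-period residuals are resampled. This is a sketch of the percentile-CI mechanics, not the authors' segmented-regression model (which additionally corrects for autocorrelated errors); all data are invented:

```python
import random

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def relative_change(ys, t0):
    """Observed post-period mean vs the pre-trend counterfactual."""
    a, b = fit_line(list(range(t0)), ys[:t0])
    counterfactual = [a + b * t for t in range(t0, len(ys))]
    return sum(ys[t0:]) / sum(counterfactual) - 1.0

def bootstrap_ci(ys, t0, n_boot=2000, alpha=0.05, seed=0):
    """Percentile CI from resampling pre-period residuals (a simplified
    stand-in for the segmented-regression bootstrap)."""
    rng = random.Random(seed)
    a, b = fit_line(list(range(t0)), ys[:t0])
    resid = [y - (a + b * t) for t, y in enumerate(ys[:t0])]
    stats = []
    for _ in range(n_boot):
        fake = [a + b * t + rng.choice(resid) for t in range(t0)] + ys[t0:]
        stats.append(relative_change(fake, t0))
    stats.sort()
    return (stats[int(alpha / 2 * n_boot)],
            stats[int((1 - alpha / 2) * n_boot) - 1])

# Toy monthly rate series: flat at 100 before an alert, 20% lower after.
random.seed(42)
ys = ([100 + random.gauss(0, 2) for _ in range(24)]
      + [80 + random.gauss(0, 2) for _ in range(24)])
change = relative_change(ys, 24)
lo, hi = bootstrap_ci(ys, 24)
print(round(change, 2), lo < change < hi)
```

The percentile interval requires no normality assumption on the change statistic, which is why the abstract favors bootstrapping for relative changes.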
Dst and a map of average equivalent ring current: 1958-2007
NASA Astrophysics Data System (ADS)
Love, J. J.
2008-12-01
A new Dst index construction is made using the original hourly magnetic-observatory data collected over the years 1958-2007; stations: Hermanus South Africa, Kakioka Japan, Honolulu Hawaii, and San Juan Puerto Rico. The construction method we use is generally consistent with the algorithm defined by Sugiura (1964), and which forms the basis for the standard Kyoto Dst index. This involves corrections for observatory baseline shifts, subtraction of the main-field secular variation, and subtraction of specific harmonics that approximate the solar-quiet (Sq) variation. Fourier analysis of the observatory data reveals the nature of Sq: it consists primarily of periodic variation driven by the Earth's rotation, the Moon's orbit, the Earth's orbit, and, to some extent, the solar cycle. Cross coupling of the harmonics associated with each of the external periodic driving forces results in a seemingly complicated Sq time series that is sometimes considered to be relatively random and unpredictable, but which is, in fact, well described in terms of Fourier series. Working in the frequency domain, Sq can be filtered out, and, upon return to the time domain, the local disturbance time series (Dist) for each observatory can be recovered. After averaging the local disturbance time series from each observatory, the global magnetic disturbance time series Dst is obtained. Analysis of this new Dst index is compared with that produced by Kyoto, and various biases and differences are discussed. The combination of the Dist and Dst time series can be used to explore the local-time/universal-time symmetry of an equivalent ring current. Individual magnetic storms can have a complicated disturbance field that is asymmetrical in longitude, presumably due to partial ring currents. Using 50 years of data we map the average local-time magnetic disturbance, finding that it is very nearly proportional to Dst. 
To our surprise, the primary asymmetry in mean magnetic disturbance is not between midnight and noon, but rather between dawn and dusk, with greatest mean disturbance occurring at dusk. As a result, proposed corrections to Dst for magnetopause and tail currents might be reasonably reconsidered.
Future mission studies: Forecasting solar flux directly from its chaotic time series
NASA Technical Reports Server (NTRS)
Ashrafi, S.
1991-01-01
The mathematical structure of the programs written to construct a nonlinear predictive model to forecast solar flux directly from its time series without reference to any underlying solar physics is presented. This method and the programs are written so that one could apply the same technique to forecast other chaotic time series, such as geomagnetic data, attitude and orbit data, and even financial indexes and stock market data. Perhaps the most important application of this technique to flight dynamics is to model Goddard Trajectory Determination System (GTDS) output of residues between observed position of spacecraft and calculated position with no drag (drag flag = off). This would result in a new model of drag working directly from observed data.
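Model-free forecasting of a chaotic series of this kind typically rests on delay-coordinate embedding plus local prediction. The sketch below is a generic zeroth-order nearest-neighbour predictor on a toy periodic signal, not the GTDS-related code described in the abstract:

```python
import math

def embed(series, dim, tau):
    """Delay vectors x_t = (s_t, s_{t-tau}, ..., s_{t-(dim-1)tau})."""
    start = (dim - 1) * tau
    return [tuple(series[t - k * tau] for k in range(dim))
            for t in range(start, len(series))]

def nn_forecast(series, dim=3, tau=1):
    """Predict the next value by finding the past state nearest to the
    current one and reusing its observed successor (a local model of
    order zero; no underlying physics assumed)."""
    vectors = embed(series, dim, tau)
    current = vectors[-1]
    best_i, best_d = None, float('inf')
    for i, v in enumerate(vectors[:-1]):
        d = math.dist(v, current)
        if d < best_d:
            best_i, best_d = i, d
    start = (dim - 1) * tau
    return series[start + best_i + 1]  # successor of the nearest neighbour

# Toy periodic signal: the forecast should continue the cycle.
series = [math.sin(0.3 * t) for t in range(200)]
pred = nn_forecast(series[:-1])
print(abs(pred - series[-1]) < 0.05)  # → True
```

The same routine works unchanged on any scalar series (geomagnetic indices, orbit residuals, financial data), which is the generality the abstract emphasizes.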
[Introduction and some problems of the rapid time series laboratory reporting system].
Kanao, M; Yamashita, K; Kuwajima, M
1999-09-01
We introduced an on-line system of biochemical, hematological, serological, urinary, bacteriological, and emergency examinations and associated office work using a client server system NEC PC-LACS based on a system consisting of concentration of outpatient blood collection, concentration of outpatient reception, and outpatient examination by reservation. Using this on-line system, results of 71 items in chemical serological, hematological, and urinary examinations are rapidly reported within 1 hour. Since the ordering system at our hospital has not been completed yet, we constructed a rapid time series reporting system in which time series data obtained on 5 serial occasions are printed on 2 sheets of A4 paper at the time of the final report. In each consultation room of the medical outpatient clinic, at the neuromedical outpatient clinic, and at the kidney center where examinations are frequently performed, terminal equipment and a printer for inquiry were established for real-time output of time series reports. Results are reported by FAX to the other outpatient clinics and wards, and subsequently, time series reports are output at the clinical laboratory department. This system allowed rapid examination, especially preconsultation examination. This system was also useful for reducing office work and for effectively utilizing examination data.
Rai, Shesh N; Trainor, Patrick J; Khosravi, Farhad; Kloecker, Goetz; Panchapakesan, Balaji
2016-01-01
The development of biosensors that produce time series data will facilitate improvements in biomedical diagnostics and in personalized medicine. The time series produced by these devices often contains characteristic features arising from biochemical interactions between the sample and the sensor. To use such characteristic features for determining sample class, similarity-based classifiers can be utilized. However, the construction of such classifiers is complicated by the variability in the time domains of such series that renders the traditional distance metrics such as Euclidean distance ineffective in distinguishing between biological variance and time domain variance. The dynamic time warping (DTW) algorithm is a sequence alignment algorithm that can be used to align two or more series to facilitate quantifying similarity. In this article, we evaluated the performance of DTW distance-based similarity classifiers for classifying time series that mimics electrical signals produced by nanotube biosensors. Simulation studies demonstrated the positive performance of such classifiers in discriminating between time series containing characteristic features that are obscured by noise in the intensity and time domains. We then applied a DTW distance-based k-nearest neighbors classifier to distinguish the presence/absence of mesenchymal biomarker in cancer cells in buffy coats in a blinded test. Using a train-test approach, we find that the classifier had high sensitivity (90.9%) and specificity (81.8%) in differentiating between EpCAM-positive MCF7 cells spiked in buffy coats and those in plain buffy coats.
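The core of such a classifier, the DTW alignment cost plus a k-NN vote, fits in a short sketch. The toy series below are invented; the biosensor signals and any warping-window constraints used in the study are not reproduced:

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic-programming DTW with unit
    steps; returns the cost of the optimal alignment path."""
    INF = float('inf')
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def knn_classify(query, labelled, k=3):
    """k-NN under DTW distance; `labelled` is [(series, label), ...]."""
    dists = sorted((dtw_distance(query, s), lab) for s, lab in labelled)
    votes = [lab for _, lab in dists[:k]]
    return max(set(votes), key=votes.count)

# A time-shifted copy of a peak is DTW-close but Euclidean-far.
peak = [0, 0, 1, 3, 1, 0, 0]
shifted = [0, 1, 3, 1, 0, 0, 0]
flat = [0] * 7
print(dtw_distance(peak, shifted), dtw_distance(peak, flat))  # → 0.0 5.0
```

The zero DTW cost for the shifted peak is exactly the time-domain invariance that makes Euclidean distance, which would penalize the shift heavily, ineffective here.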
NASA Astrophysics Data System (ADS)
Saturnino, Diana; Olsen, Nils; Finlay, Chris
2017-04-01
High-precision magnetic measurements collected by satellites such as Swarm or CHAMP, flying at altitudes between 300 and 800 km, allow for improved geomagnetic field modelling. An accurate description of the internal (core and crust) field must account for contributions from other sources, such as the ionosphere and magnetosphere. However, the description of the rapidly changing external field contributions, particularly during the quiet times from which the data are selected, constitutes a major challenge in the construction of such models. Our study attempts to obtain improved knowledge of ionospheric field contributions during quiet-time conditions, in particular at night local times. We use two different datasets: ground magnetic observatory time series (obtained below the ionospheric E-layer currents), and Swarm satellite measurements acquired above these currents. First, we remove from the data estimates of the core, lithospheric, and large-scale magnetospheric magnetic contributions as given by the CHAOS-6 model, to obtain corrected time series. Then, we focus on the differences of the corrected time series: for a pair of ground magnetic observatories, we determine the time series of the difference, and similarly we determine difference time series at satellite altitude, given by the difference between the Swarm Alpha and Charlie satellites taken in the vicinity of the ground observatory locations. The resulting difference time series are analysed with respect to their temporal and spatial variations, with emphasis on measurements at night local times.
Influence of the time scale on the construction of financial networks.
Emmert-Streib, Frank; Dehmer, Matthias
2010-09-30
In this paper we investigate the definition and formation of financial networks. Specifically, we study the influence of the time scale on their construction. For our analysis we use correlation-based networks obtained from the daily closing prices of stock market data. More precisely, we use the stocks that currently comprise the Dow Jones Industrial Average (DJIA) and estimate financial networks where nodes correspond to stocks and edges correspond to non-vanishing correlation coefficients. That means we include an edge in the network only if the corresponding correlation coefficient is statistically significantly different from zero. This construction procedure results in unweighted, undirected networks. By separating the time series of stock prices into non-overlapping intervals, we obtain one network per interval. The length of these intervals corresponds to the time scale of the data, whose influence on the construction of the networks is studied in this paper. Numerical analysis of four different measures as a function of the time scale used for network construction allows us to gain insights about the intrinsic time scale of the stock market with respect to a meaningful graph-theoretical analysis.
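The construction rule, keep an edge only when the correlation is significantly non-zero, can be sketched as follows. The return series are invented (a shared factor linking two "stocks"); the DJIA data and the paper's exact significance procedure are not reproduced:

```python
import math
import random

def pearson(x, y):
    """Sample Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def correlation_network(returns, t_crit=2.0):
    """Unweighted, undirected network: an edge (i, j) is included only
    when the correlation of the two return series is significantly
    non-zero under a t-test, |r|*sqrt((n-2)/(1-r^2)) > t_crit."""
    tickers = list(returns)
    n_obs = len(next(iter(returns.values())))
    edges = set()
    for i, a in enumerate(tickers):
        for b in tickers[i + 1:]:
            r = pearson(returns[a], returns[b])
            t = abs(r) * math.sqrt((n_obs - 2) / (1 - r * r))
            if t > t_crit:
                edges.add((a, b))
    return edges

# Toy daily returns: A and B share a common factor, C is independent.
random.seed(7)
factor = [random.gauss(0, 1) for _ in range(250)]
returns = {
    'A': [f + random.gauss(0, 0.5) for f in factor],
    'B': [f + random.gauss(0, 0.5) for f in factor],
    'C': [random.gauss(0, 1) for _ in range(250)],
}
net = correlation_network(returns)
print(sorted(net))  # the ('A', 'B') edge is expected
```

Shortening the interval (smaller `n_obs`) raises the correlation needed to pass the t-threshold, which is one mechanism by which the time scale reshapes the network.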
Wood construction codes issues in the United States
Douglas R. Rammer
2006-01-01
The current wood construction codes find their origin in the 1935 Wood Handbook: Wood as an Engineering Material, published by the USDA Forest Service. Many of the current design recommendations can be traced back to statements in this book. Since that time, a series of developments, both historical and recent, has led to a multi-layered system for use of wood products in...
ERIC Educational Resources Information Center
Trent, John
2016-01-01
This article reports the results of a multiple qualitative case study which investigated the challenges that seven early career English language teachers in Hong Kong confronted as they constructed their professional and personal identities. A series of in-depth interviews with participants during the entire first year of their full-time teaching…
Persistent topological features of dynamical systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maletić, Slobodan, E-mail: slobodan@hitsz.edu.cn; Institute of Nuclear Sciences Vinča, University of Belgrade, Belgrade; Zhao, Yi, E-mail: zhao.yi@hitsz.edu.cn
Inspired by an early work of Muldoon et al., Physica D 65, 1–16 (1993), we present a general method for constructing simplicial complex from observed time series of dynamical systems based on the delay coordinate reconstruction procedure. The obtained simplicial complex preserves all pertinent topological features of the reconstructed phase space, and it may be analyzed from topological, combinatorial, and algebraic aspects. In focus of this study is the computation of homology of the invariant set of some well known dynamical systems that display chaotic behavior. Persistent homology of simplicial complex and its relationship with the embedding dimensions are examined by studying the lifetime of topological features and topological noise. The consistency of topological properties for different dynamic regimes and embedding dimensions is examined. The obtained results shed new light on the topological properties of the reconstructed phase space and open up new possibilities for application of advanced topological methods. The method presented here may be used as a generic method for constructing simplicial complex from a scalar time series that has a number of advantages compared to the mapping of the same time series to a complex network.
Evolution of the Sunspot Number and Solar Wind B Time Series
NASA Astrophysics Data System (ADS)
Cliver, Edward W.; Herbst, Konstantin
2018-03-01
The past two decades have witnessed significant changes in our knowledge of long-term solar and solar wind activity. The sunspot number time series (1700-present) developed by Rudolf Wolf during the second half of the 19th century was revised and extended by the group sunspot number series (1610-1995) of Hoyt and Schatten during the 1990s. The group sunspot number is significantly lower than the Wolf series before ˜1885. An effort from 2011-2015 to understand and remove differences between these two series via a series of workshops had the unintended consequence of prompting several alternative constructions of the sunspot number. Thus it has been necessary to expand and extend the sunspot number reconciliation process. On the solar wind side, after a decade of controversy, an ISSI International Team used geomagnetic and sunspot data to obtain a high-confidence time series of the solar wind magnetic field strength (B) from 1750-present that can be compared with two independent long-term (> ˜600 year) series of annual B-values based on cosmogenic nuclides. In this paper, we trace the twists and turns leading to our current understanding of long-term solar and solar wind activity.
Early-Time Solution of the Horizontal Unconfined Aquifer in the Buildup Phase
NASA Astrophysics Data System (ADS)
Gravanis, Elias; Akylas, Evangelos
2017-10-01
We derive the early-time solution of the Boussinesq equation for the horizontal unconfined aquifer in the buildup phase under constant recharge and zero inflow. The solution is expressed as a power series of a suitable similarity variable, which is constructed so as to satisfy the boundary conditions at both ends of the aquifer; that is, it is a polynomial approximation of the exact solution. The series turns out to be asymptotic, and it is regularized by resummation techniques that are used to define divergent series. The outflow rate in this regime is linear in time, and the (dimensionless) coefficient is calculated to eight significant figures. The local error of the series is quantified by its deviation from satisfying the self-similar Boussinesq equation at every point. The local error turns out to be everywhere positive; hence, so is the integrated error, which in turn quantifies the degree of convergence of the series to the exact solution.
Studies in astronomical time series analysis. I - Modeling random processes in the time domain
NASA Technical Reports Server (NTRS)
Scargle, J. D.
1981-01-01
Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm of time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 272 is considered as an example.
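The autoregressive model discussed above is easy to demonstrate: simulate an AR(1) process and recover its coefficient by lag-1 least squares. This is a generic sketch, unrelated to the FORTRAN implementation described in the abstract:

```python
import random

def simulate_ar1(phi, n, sigma=1.0, seed=0):
    """x_t = phi * x_{t-1} + e_t, with e_t ~ N(0, sigma^2)."""
    rng = random.Random(seed)
    x = [rng.gauss(0, sigma)]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0, sigma))
    return x

def estimate_phi(x):
    """Lag-1 least-squares (Yule-Walker) estimate of the AR(1)
    coefficient: regress x_t on x_{t-1} through the origin."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(v * v for v in x[:-1])
    return num / den

x = simulate_ar1(0.7, 5000)
print(round(estimate_phi(x), 2))  # close to the true phi = 0.7
```

A moving-average model would instead express x_t as a weighted sum of past innovations; the abstract's point is that both families, and their ARMA combinations, can be fit directly in the time domain.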
NASA Astrophysics Data System (ADS)
Feng, Lian-Li; Tian, Shou-Fu; Wang, Xiu-Bin; Zhang, Tian-Tian
2016-09-01
In this paper, the time fractional Fordy-Gibbons equation is investigated with the Riemann-Liouville derivative. The equation can be reduced to the Caudrey-Dodd-Gibbon equation, the Sawada-Kotera equation, the Kaup-Kupershmidt equation, etc. By means of the Lie group analysis method, the invariance properties and symmetry reductions of the equation are derived. Furthermore, by means of the power series theory, exact power series solutions of the equation are also constructed. Finally, two kinds of conservation laws of the equation are obtained with the aid of the self-adjoint method. Supported by the Fundamental Research Funds for Key Discipline Construction under Grant No. XZD201602, the Fundamental Research Funds for the Central Universities under Grant Nos. 2015QNA53 and 2015XKQY14, the Fundamental Research Funds for Postdoctoral at the Key Laboratory of Gas and Fire Control for Coal Mines, the General Financial Grant from the China Postdoctoral Science Foundation under Grant No. 2015M570498, and Natural Sciences Foundation of China under Grant No. 11301527
PSO-MISMO modeling strategy for multistep-ahead time series prediction.
Bao, Yukun; Xiong, Tao; Hu, Zhongyi
2014-05-01
Multistep-ahead time series prediction is one of the most challenging research topics in the field of time series modeling and prediction, and is continually under research. Recently, the multiple-input several multiple-outputs (MISMO) modeling strategy has been proposed as a promising alternative for multistep-ahead time series prediction, exhibiting advantages compared with the two currently dominating strategies, the iterated and the direct strategies. Built on the established MISMO strategy, this paper proposes a particle swarm optimization (PSO)-based MISMO modeling strategy, which is capable of determining the number of sub-models in a self-adaptive mode, with varying prediction horizons. Rather than deriving crisp divides with equal-sized prediction horizons from the established MISMO, the proposed PSO-MISMO strategy, implemented with neural networks, employs a heuristic to create flexible divides with varying sizes of prediction horizons and to generate corresponding sub-models, providing considerable flexibility in model construction, which has been validated with simulated and real datasets.
NASA Astrophysics Data System (ADS)
Abe, R.; Hamada, K.; Hirata, N.; Tamura, R.; Nishi, N.
2015-05-01
As with BIM-based quality management in the construction industry, there is strong demand in the shipbuilding field for quality management of the manufacturing process of each member. Accurately capturing the time series of three-dimensional deformation through each process stage is in particular demand. In this study, focusing on the shipbuilding field, we examine a three-dimensional measurement method. In a shipyard, since large equipment and components are intricately arranged in a limited space, the placement of measuring equipment and targets is constrained. Moreover, the elements to be measured move at each process stage, so establishing reference points for time series comparison requires careful design. This paper discusses a method for measuring welding deformation in time series using a total station. In particular, using multiple sets of measurement data obtained with this approach, we evaluate the amount of deformation at each process stage.
Fundamentals of Construction. Introduction to Construction Series. Instructor Edition.
ERIC Educational Resources Information Center
Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.
This competency-based curriculum guide begins the Introduction to Construction series. The series is designed with the flexible training requirements of open shop contractors, preapprenticeship programs, multicraft high school programs, technology education programs, and cooperative education programs in mind. This guide contains 3 sections and 15…
NASA Astrophysics Data System (ADS)
Serov, Vladislav V.; Kheifets, A. S.
2014-12-01
We analyze a transfer ionization (TI) reaction in the fast proton-helium collision H⁺ + He → H⁰ + He²⁺ + e⁻ by solving a time-dependent Schrödinger equation (TDSE) under the classical projectile motion approximation in one-dimensional kinematics. In addition, we construct various time-independent analogs of our model using lowest-order perturbation theory in the form of the Born series. By comparing various aspects of the TDSE and the Born series calculations, we conclude that the recent discrepancies of experimental and theoretical data may be attributed to deficiency of the Born models used by other authors. We demonstrate that the correct Born series for TI should include the momentum-space overlap between the double-ionization amplitude and the wave function of the transferred electron.
Causality networks from multivariate time series and application to epilepsy.
Siggiridou, Elsa; Koutlis, Christos; Tsimpiris, Alkiviadis; Kimiskidis, Vasilios K; Kugiumtzis, Dimitris
2015-08-01
Granger causality and variants of this concept allow the study of complex dynamical systems as networks constructed from multivariate time series. In this work, a large number of Granger causality measures used to form causality networks from multivariate time series are assessed. For this, realizations of high-dimensional coupled dynamical systems are considered and the performance of the Granger causality measures is evaluated, seeking the measures that form networks closest to the true network of the dynamical system. In particular, the comparison focuses on Granger causality measures that reduce the state space dimension when many variables are observed. Further, the linear and nonlinear Granger causality measures of dimension reduction are compared to a standard Granger causality measure on electroencephalographic (EEG) recordings containing episodes of epileptiform discharges.
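A bivariate, order-1 Granger test illustrates the principle behind such measures: y "Granger-causes" x if adding lagged y significantly reduces the residual variance of an autoregression of x. The sketch below uses an invented coupled pair and is not one of the dimension-reduction measures compared in the paper:

```python
import random

def ols_rss(X, y):
    """Residual sum of squares of least squares via normal equations
    (Gaussian elimination; fine for a handful of regressors)."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * t for r, t in zip(X, y)) for i in range(k)]
    for c in range(k):                      # forward elimination
        p = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        b[c], b[p] = b[p], b[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            for j in range(c, k):
                A[r][j] -= f * A[c][j]
            b[r] -= f * b[c]
    beta = [0.0] * k
    for c in reversed(range(k)):            # back substitution
        beta[c] = (b[c] - sum(A[c][j] * beta[j]
                              for j in range(c + 1, k))) / A[c][c]
    return sum((t - sum(bi * xi for bi, xi in zip(beta, r))) ** 2
               for r, t in zip(X, y))

def granger_f(x, y, lag=1):
    """F statistic for 'y Granger-causes x' with order-1 lags."""
    X_r = [[1.0, x[t - 1]] for t in range(lag, len(x))]
    X_u = [[1.0, x[t - 1], y[t - 1]] for t in range(lag, len(x))]
    target = x[lag:]
    rss_r, rss_u = ols_rss(X_r, target), ols_rss(X_u, target)
    df = len(target) - 3
    return (rss_r - rss_u) / (rss_u / df)

# Toy coupled pair: y drives x with one step of delay, not vice versa.
rng = random.Random(3)
y, x = [rng.gauss(0, 1)], [rng.gauss(0, 1)]
for _ in range(999):
    y.append(0.5 * y[-1] + rng.gauss(0, 1))
    x.append(0.5 * x[-1] + 0.8 * y[-2] + rng.gauss(0, 1))
print(granger_f(x, y) > granger_f(y, x))  # → True: the causal direction wins
```

Running such a test over every ordered pair of variables yields the causality network; the measures compared in the paper differ mainly in how they tame this pairwise regression when the number of variables is large.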
Drought over Seoul and Its Association with Solar Cycles
NASA Astrophysics Data System (ADS)
Park, Jong-Hyeok; Chang, Heon-Young
2013-12-01
We have investigated drought periodicities in Seoul to look for any indication of a relationship between drought in Korea and solar activity. The work is motivated, in view of the solar-terrestrial connection, by the search for an example of an extreme weather condition controlled by solar activity. The periodicity of drought in Seoul has been re-examined using the wavelet transform technique, as no consensus has yet been reached. We have chosen Seoul because its daily precipitation record extends back more than 200 years, which meets our requirement that analyses of drought frequency demand long-term historical data to ensure reliable estimates. We have examined three types of time series of the Effective Drought Index (EDI). First, we have directly analyzed the EDI time series. We have then constructed and analyzed a time series in which the number of days whose EDI is less than -1.5 for a given month of the year is given as a function of time, and another in which the number of occasions where the EDI values of three consecutive days are all less than -1.5 is given as a function of time. All the time series data sets we analyzed are periodic. Apart from the annual cycle due to seasonal variations, periodicities shorter than the 11 year sunspot cycle, ~ 3, ~ 4, ~ 6 years, have been confirmed. The solar periodicities, shorter than the Hale period, to which these short periodicities may correspond are not yet known. Longer periodicities possibly related to Gleissberg cycles, ~ 55, ~ 120 years, can also be seen. However, a periodicity comparable to the 11 year solar cycle seems absent in both the EDI and the constructed data sets.
Application of information-retrieval methods to the classification of physical data
NASA Technical Reports Server (NTRS)
Mamotko, Z. N.; Khorolskaya, S. K.; Shatrovskiy, L. I.
1975-01-01
Scientific data received from satellites are characterized as a multi-dimensional time series, whose terms are vector functions of a vector of measurement conditions. Information retrieval methods are used to construct lower dimensional samples on the basis of the condition vector, in order to obtain these data and to construct partial relations. The methods are applied to the joint Soviet-French Arkad project.
McKenna, Thomas M; Bawa, Gagandeep; Kumar, Kamal; Reifman, Jaques
2007-04-01
The physiology analysis system (PAS) was developed as a resource to support the efficient warehousing, management, and analysis of physiology data, particularly continuous time-series data that may be extensive, of variable quality, and distributed across many files. The PAS incorporates time-series data collected by many types of data-acquisition devices, and it is designed to free users from data management burdens. This Web-based system allows both discrete (attribute) and time-series (ordered) data to be manipulated, visualized, and analyzed via a client's Web browser. All processes occur on a server, so that the client does not have to download data or any application programs, and the PAS is independent of the client's computer operating system. The PAS contains a library of functions, written in different computer languages, that the client can add to and use to perform specific data operations. Functions from the library are sequentially inserted into a function chain-based logical structure to construct sophisticated data operators from simple function building blocks, affording ad hoc query and analysis of time-series data. These features support advanced mining of physiology data.
NASA Astrophysics Data System (ADS)
Grohs, Jacob R.; Li, Yongqiang; Dillard, David A.; Case, Scott W.; Ellis, Michael W.; Lai, Yeh-Hung; Gittleman, Craig S.
Temperature and humidity fluctuations in operating fuel cells impose significant biaxial stresses in the constrained proton exchange membranes (PEMs) of a fuel cell stack. The strength of the PEM, and its ability to withstand cyclic environment-induced stresses, plays an important role in membrane integrity and consequently, fuel cell durability. In this study, a pressure loaded blister test is used to characterize the biaxial strength of Gore-Select® series 57 over a range of times and temperatures. Hencky's classical solution for a pressurized circular membrane is used to estimate biaxial strength values from burst pressure measurements. A hereditary integral is employed to construct the linear viscoelastic analog to Hencky's linear elastic exact solution. Biaxial strength master curves are constructed using traditional time-temperature superposition principle techniques and the associated temperature shift factors show good agreement with shift factors obtained from constitutive (stress relaxation) and fracture (knife slit) tests of the material.
Students' Conception of Infinite Series
ERIC Educational Resources Information Center
Martinez-Planell, Rafael; Gonzalez, Ana Carmen; DiCristina, Gladys; Acevedo, Vanessa
2012-01-01
This is a report of a study of students' understanding of infinite series. It has a three-fold purpose: to show that students may construct two essentially different notions of infinite series, to show that one of the constructions is particularly difficult for students, and to examine the way in which these two different constructions may be…
Cross over of recurrence networks to random graphs and random geometric graphs
NASA Astrophysics Data System (ADS)
Jacob, Rinku; Harikrishnan, K. P.; Misra, R.; Ambika, G.
2017-02-01
Recurrence networks are complex networks constructed from the time series of chaotic dynamical systems where the connection between two nodes is limited by the recurrence threshold. This condition makes the topology of every recurrence network unique, with the degree distribution determined by the probability density variations of the representative attractor from which it is constructed. Here we numerically investigate the properties of recurrence networks from standard low-dimensional chaotic attractors using some basic network measures and show how the recurrence networks are different from random and scale-free networks. In particular, we show that all recurrence networks can cross over to random geometric graphs by adding a sufficient amount of noise to the time series, and to classical random graphs by increasing the range of interaction to the system size. We also highlight the effectiveness of a combined plot of characteristic path length and clustering coefficient in capturing the small changes in the network characteristics.
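The threshold-based construction this abstract describes can be sketched in a few lines of Python; the delay-embedding parameters, toy signal and threshold below are illustrative assumptions, not the paper's settings:

```python
import math

def embed(x, dim=2, tau=1):
    """Time-delay embedding of a scalar series into dim-dimensional vectors."""
    return [tuple(x[i + j * tau] for j in range(dim))
            for i in range(len(x) - (dim - 1) * tau)]

def recurrence_network(x, eps, dim=2, tau=1):
    """Adjacency sets: nodes i and j are connected when their embedded
    state vectors lie within the recurrence threshold eps (Euclidean)."""
    pts = embed(x, dim, tau)
    adj = {i: set() for i in range(len(pts))}
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            if math.dist(pts[i], pts[j]) <= eps:
                adj[i].add(j)
                adj[j].add(i)
    return adj

# Toy signal: a sampled sine wave, so recurrences occur across cycles.
x = [math.sin(0.3 * k) for k in range(200)]
net = recurrence_network(x, eps=0.1)
degrees = [len(neigh) for neigh in net.values()]
```

Network measures such as the degree distribution, characteristic path length and clustering coefficient can then be computed on `net` with any graph library.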
Mehdizadeh, Sina; Sanjari, Mohammad Ali
2017-11-07
This study aimed to determine the effect of added noise, filtering and time series length on the largest Lyapunov exponent (LyE) value calculated for time series obtained from a passive dynamic walker. The simplest passive dynamic walker model, comprising two massless legs connected by a frictionless hinge joint at the hip, was adopted to generate walking time series. The generated time series was used to construct a state space with an embedding dimension of 3 and a time delay of 100 samples. The LyE was calculated as the exponential rate of divergence of neighboring trajectories of the state space using Rosenstein's algorithm. To determine the effect of noise on LyE values, seven levels of Gaussian white noise (SNR = 55-25 dB in 5 dB steps) were added to the time series. In addition, filtering was performed using a range of cutoff frequencies from 3 Hz to 19 Hz in 2 Hz steps. The LyE was calculated for both noise-free and noisy time series with different lengths of 6, 50, 100 and 150 strides. Results demonstrated a high percentage error in LyE in the presence of noise. These observations therefore suggest that Rosenstein's algorithm might not perform well in the presence of added experimental noise. Furthermore, findings indicated that at least 50 walking strides are required to calculate LyE to account for the effect of noise. Finally, observations support that conservative filtering of the time series with a high cutoff frequency might be more appropriate prior to calculating LyE. Copyright © 2017 Elsevier Ltd. All rights reserved.
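The noise-robustness test above relies on adding Gaussian white noise at a prescribed SNR. That step can be sketched as follows (the test signal and seed are arbitrary; the scaling assumes SNR is defined as a power ratio in decibels):

```python
import math, random

def add_noise_snr(x, snr_db, seed=0):
    """Add zero-mean Gaussian white noise scaled so that the resulting
    signal-to-noise ratio (power ratio, in dB) equals snr_db."""
    rng = random.Random(seed)
    p_signal = sum(v * v for v in x) / len(x)
    sigma = math.sqrt(p_signal / 10 ** (snr_db / 10))
    return [v + rng.gauss(0.0, sigma) for v in x]

# Illustrative stand-in for a walking time series; the study used
# seven levels from 55 dB down to 25 dB.
clean = [math.sin(0.1 * k) for k in range(1000)]
noisy = add_noise_snr(clean, snr_db=25)
```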
Kepler Fine Guidance Sensor Data
NASA Technical Reports Server (NTRS)
Van Cleve, Jeffrey; Campbell, Jennifer Roseanna
2017-01-01
The Kepler and K2 missions collected Fine Guidance Sensor (FGS) data in addition to the science data, as discussed in the Kepler Instrument Handbook (KIH, Van Cleve and Caldwell 2016). The FGS CCDs are frame transfer devices (KIH Table 7) located in the corners of the Kepler focal plane (KIH Figure 24), which are read out 10 times every second. The FGS data are being made available to the user community for scientific analysis as flux and centroid time series, along with a limited number of FGS full frame images which may be useful for constructing a World Coordinate System (WCS) or otherwise putting the time series data in context. This document describes the data content and file format, and gives example MATLAB scripts to read the time series. There are three file types delivered as the FGS data: 1. Flux and Centroid (FLC) data: time series of star signal and centroid data. 2. Ancillary FGS Reference (AFR) data: catalog of information about the observed stars in the FLC data. 3. FGS Full-Frame Image (FGI) data: full-frame image snapshots of the FGS CCDs.
NASA Astrophysics Data System (ADS)
Diao, Chunyuan
In today's big data era, the increasing availability of satellite and airborne platforms at various spatial and temporal scales creates unprecedented opportunities to understand complex and dynamic systems (e.g., plant invasion). Time series remote sensing is becoming increasingly important for monitoring earth system dynamics and interactions. To date, most time series remote sensing studies have been conducted with images acquired at coarse spatial scale, due to their relatively high temporal resolution. The construction of time series at fine spatial scale, however, has been limited to a few discrete images acquired within or across years. The objective of this research is to advance time series remote sensing at fine spatial scale, particularly to shift from discrete time series remote sensing to continuous time series remote sensing. The objective will be achieved through the following aims: 1) Advance intra-annual time series remote sensing under the pure-pixel assumption; 2) Advance intra-annual time series remote sensing under the mixed-pixel assumption; 3) Advance inter-annual time series remote sensing in monitoring the land surface dynamics; and 4) Advance the species distribution model with time series remote sensing. Taking invasive saltcedar as an example, four methods (i.e., phenological time series remote sensing model, temporal partial unmixing method, multiyear spectral angle clustering model, and time series remote sensing-based spatially explicit species distribution model) were developed to achieve the objectives. Results indicated that the phenological time series remote sensing model could effectively map saltcedar distributions through characterizing the seasonal phenological dynamics of plant species throughout the year.
The proposed temporal partial unmixing method, compared to conventional unmixing methods, could more accurately estimate saltcedar abundance within a pixel by exploiting the adequate temporal signatures of saltcedar. The multiyear spectral angle clustering model could guide the selection of the most representative remotely sensed image for repetitive saltcedar mapping over space and time. Through incorporating spatial autocorrelation, the species distribution model developed in the study could identify the suitable habitats of saltcedar at a fine spatial scale and locate appropriate areas at high risk of saltcedar infestation. Among 10 environmental variables, the distance to the river and the phenological attributes summarized by the time series remote sensing were regarded as the most important. These methods developed in the study provide new perspectives on how the continuous time series can be leveraged under various conditions to investigate the plant invasion dynamics.
Modeling multivariate time series on manifolds with skew radial basis functions.
Jamshidi, Arta A; Kirby, Michael J
2011-01-01
We present an approach for constructing nonlinear empirical mappings from high-dimensional domains to multivariate ranges. We employ radial basis functions and skew radial basis functions for constructing a model using data that are potentially scattered or sparse. The algorithm progresses iteratively, adding a new function at each step to refine the model. The placement of the functions is driven by a statistical hypothesis test that accounts for correlation in the multivariate range variables. The test is applied on training and validation data and reveals nonstatistical or geometric structure when it fails. At each step, the added function is fit to data contained in a spatiotemporally defined local region to determine the parameters, in particular the scale of the local model. The scale of the function is determined by the zero crossings of the autocorrelation function of the residuals. The model parameters and the number of basis functions are determined automatically from the given data, and there is no need to initialize any ad hoc parameters save for the selection of the skew radial basis functions. Compactly supported skew radial basis functions are employed to improve model accuracy, order, and convergence properties. The extension of the algorithm to higher-dimensional ranges produces reduced-order models by exploiting the existence of correlation in the range variable data. Structure is tested not just in a single time series but between all pairs of time series. We illustrate the new methodology on several problems, including modeling data on manifolds and the prediction of chaotic time series.
Influence of the Time Scale on the Construction of Financial Networks
Emmert-Streib, Frank; Dehmer, Matthias
2010-01-01
Background In this paper we investigate the definition and formation of financial networks. Specifically, we study the influence of the time scale on their construction. Methodology/Principal Findings For our analysis we use correlation-based networks obtained from the daily closing prices of stock market data. More precisely, we use the stocks that currently comprise the Dow Jones Industrial Average (DJIA) and estimate financial networks where nodes correspond to stocks and edges correspond to non-vanishing correlation coefficients. That is, we include an edge in the network only if a correlation coefficient is statistically significantly different from zero. This construction procedure results in unweighted, undirected networks. By separating the time series of stock prices into non-overlapping intervals, we obtain one network per interval. The length of these intervals corresponds to the time scale of the data, whose influence on the construction of the networks is studied in this paper. Conclusions/Significance Numerical analysis of four different measures in dependence on the time scale for the construction of networks allows us to gain insights about the intrinsic time scale of the stock market with respect to a meaningful graph-theoretical analysis. PMID:20949124
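A minimal sketch of such a correlation network, with edges only for statistically significant correlations, might look like the following. The Fisher z-transform with a normal critical value is one common significance test, and the synthetic series are assumptions, not the DJIA data:

```python
import math, random

def pearson(a, b):
    """Sample Pearson correlation coefficient of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def correlation_network(series, z_crit=1.96):
    """Undirected, unweighted edges for pairs whose correlation is
    significantly non-zero (Fisher z-transform, normal approximation)."""
    names = list(series)
    edges = set()
    for i, u in enumerate(names):
        for v in names[i + 1:]:
            r = pearson(series[u], series[v])
            z = 0.5 * math.log((1 + r) / (1 - r)) * math.sqrt(len(series[u]) - 3)
            if abs(z) > z_crit:
                edges.add((u, v))
    return edges

# Synthetic price-like series: A and B share a common driver, C is independent.
rng = random.Random(1)
common = [rng.gauss(0, 1) for _ in range(300)]
series = {
    "A": [c + 0.1 * rng.gauss(0, 1) for c in common],
    "B": [c + 0.1 * rng.gauss(0, 1) for c in common],
    "C": [rng.gauss(0, 1) for _ in range(300)],
}
edges = correlation_network(series)
```

One network per non-overlapping interval would follow by slicing the series before calling `correlation_network`.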
Construction of Optimally Reduced Empirical Model by Spatially Distributed Climate Data
NASA Astrophysics Data System (ADS)
Gavrilov, A.; Mukhin, D.; Loskutov, E.; Feigin, A.
2016-12-01
We present an approach to empirical reconstruction of the evolution operator in stochastic form from space-distributed time series. The main problem in empirical modeling consists in choosing appropriate phase variables which can efficiently reduce the dimension of the model at minimal loss of information about the system's dynamics, which leads to a more robust model and a better-quality reconstruction. For this purpose we incorporate two key steps in the model. The first step is a standard preliminary reduction of the observed time series dimension by decomposition via a certain empirical basis (e.g., an empirical orthogonal function basis or its nonlinear or spatio-temporal generalizations). The second step is construction of an evolution operator by principal components (PCs), the time series obtained by the decomposition. In this step we introduce a new way of reducing the dimension of the embedding in which the evolution operator is constructed. It is based on choosing proper combinations of delayed PCs to take into account the most significant spatio-temporal couplings. The evolution operator is sought as a nonlinear random mapping parameterized using artificial neural networks (ANN). A Bayesian approach is used to learn the model and to find optimal hyperparameters: the number of PCs, the dimension of the embedding, and the degree of nonlinearity of the ANN. The results of applying the method to climate data (sea surface temperature, sea level pressure) and their comparison with the same method based on non-reduced embedding are presented. The study is supported by Government of Russian Federation (agreement #14.Z50.31.0033 with the Institute of Applied Physics of RAS).
Two approaches to timescale modeling for proxy series with chronological errors.
NASA Astrophysics Data System (ADS)
Divine, Dmitry; Godtliebsen, Fred
2010-05-01
A substantial part of the proxy series used in paleoclimate research has chronological uncertainties. Any constructed timescale is therefore only an estimate of the true, but unknown, timescale. An accurate assessment of the timing of events in paleoproxy series and networks, as well as the use of proxy-based paleoclimate reconstructions in GCM model scoring experiments, requires the effect of these errors to be properly taken into account. We consider two types of timescale error models corresponding to the two basic approaches to construction of the (depth-) age scale in a proxy series. Typically, chronological control of a proxy series stemming from marine and terrestrial sedimentary archives is based on the use of 14C dates, reference horizons or their combination. Depending on the prevalent origin of the available fixpoints (age markers), the following approaches to timescale modeling are proposed. 1) 14C dates. The algorithm uses a Markov-chain Monte Carlo sampling technique to generate an ordered set of perturbed age markers. Proceeding sequentially from the youngest to the oldest fixpoint, the sampler draws random numbers from the age distribution of each individual 14C date. Every following perturbed age marker is generated such that the condition of no age reversal is fulfilled. The relevant regression model is then applied to construct a simulated timescale. 2) Reference horizons (e.g., volcanic or dust layers, T bomb peak) generally provide absolutely dated fixpoints. Due to natural variability in the sedimentation (accumulation) rate, however, the dating uncertainty in the interpolated timescale tends to grow with the span to the nearest fixpoint. The (accumulation, sedimentation) process associated with formation of a proxy series is modelled using a stochastic Levy process.
The respective increments for the process are drawn from the log-normal distribution with the mean/variance ratio prescribed as a site(proxy)-dependent external parameter. The number of generated annual increments corresponds to the time interval between the considered reference horizons. The simulated series is then rescaled to match the length of the actual core section being modelled. Within each method, a multitude of timescales is generated, creating a number of possible realisations of a proxy series or a proxy-based reconstruction in the time domain. This allows consideration of a proxy record in a probabilistic framework. The effect of accounting for uncertainties in chronology on a reconstructed environmental variable is illustrated with two case studies of marine sediment records.
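The no-reversal Monte Carlo sampling of age markers described in the first approach can be sketched as follows; Gaussian date distributions and simple rejection of reversed draws are simplifying assumptions (calibrated 14C age distributions are generally non-Gaussian):

```python
import random

def perturb_age_markers(ages, sigmas, n_draws=1000, seed=0):
    """Monte Carlo realisations of dated markers (mean age, 1-sigma error),
    resampling any draw that would violate stratigraphic order, proceeding
    from the youngest to the oldest fixpoint."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        sample, prev = [], float("-inf")
        for mu, s in zip(ages, sigmas):
            a = rng.gauss(mu, s)
            while a <= prev:          # enforce no age reversal
                a = rng.gauss(mu, s)
            sample.append(a)
            prev = a
        draws.append(sample)
    return draws

# Three hypothetical dates (years BP) with 1-sigma uncertainties.
draws = perturb_age_markers([100.0, 200.0, 300.0], [10.0, 10.0, 10.0], n_draws=200)
```

Each realisation would then be fed to a regression model to build one simulated timescale.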
A technique to detect microclimatic inhomogeneities in historical temperature records
NASA Astrophysics Data System (ADS)
Runnalls, K. E.; Oke, T. R.
2003-04-01
A technique to identify inhomogeneities in historical temperature records caused by microclimatic changes to the surroundings of a climate station (e.g. minor instrument relocations, vegetation growth/removal, construction of houses, roads, runways) is presented. The technique uses daily maximum and minimum temperatures to estimate the magnitude of nocturnal cooling. The test station is compared to a nearby reference station by constructing time series of monthly "cooling ratios". It is argued that the cooling ratio is a particularly sensitive measure of microclimatic differences between neighbouring climate stations: first, because microclimatic character is best expressed at night in stable conditions; second, because larger-scale climatic influences common to both stations are removed by the use of a ratio, and because the ratio can be shown to be invariant in the mean with weather variables such as wind and cloud. Inflections (change points) in time series of cooling ratios therefore signal microclimatic change in one of the station records. Hurst rescaling is applied to the time series to aid in the identification of change points, which can then be compared to documented station history events, if sufficient metadata is available. Results for a variety of air temperature records, ranging from rural to urban stations, are presented to illustrate the applicability of the technique.
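A minimal sketch of a cooling-ratio series follows. Here nocturnal cooling is approximated by the daily max-min range and months by fixed 30-day blocks, both simplifications of the technique described above; the synthetic data contain a change point that halves cooling at the test station:

```python
def monthly_cooling_ratio(tmax_test, tmin_test, tmax_ref, tmin_ref, days=30):
    """Monthly mean ratio of nocturnal cooling at a test station to that at
    a reference station, with cooling approximated by the daily max-min range."""
    cool_test = [a - b for a, b in zip(tmax_test, tmin_test)]
    cool_ref = [a - b for a, b in zip(tmax_ref, tmin_ref)]
    return [sum(cool_test[i:i + days]) / sum(cool_ref[i:i + days])
            for i in range(0, len(cool_test), days)]

# Four 30-day "months"; halfway through, a microclimatic change (e.g. new
# construction nearby) halves the nocturnal cooling at the test station.
n = 120
tmax_ref, tmin_ref = [20.0] * n, [10.0] * n
tmax_test = [20.0] * n
tmin_test = [10.0] * 60 + [15.0] * 60
ratios = monthly_cooling_ratio(tmax_test, tmin_test, tmax_ref, tmin_ref)
# → [1.0, 1.0, 0.5, 0.5]: the inflection marks the change point
```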
Ozone Time Series From GOMOS and SAGE II Measurements
NASA Astrophysics Data System (ADS)
Kyrola, E. T.; Laine, M.; Tukiainen, S.; Sofieva, V.; Zawodny, J. M.; Thomason, L. W.
2011-12-01
Satellite measurements are essential for monitoring changes in the global stratospheric ozone distribution. Both the natural variation and anthropogenic change are strongly dependent on altitude. Stratospheric ozone has been measured from space with good vertical resolution since 1985 by the SAGE II solar occultation instrument. The advantage of the occultation measurement principle is its self-calibration, which is essential to ensuring stable time series. SAGE II measurements in 1985-2005 have been a valuable data set in investigations of trends in the vertical distribution of ozone. This time series can now be extended by the GOMOS measurements started in 2002. GOMOS is a stellar occultation instrument and therefore offers a natural continuation of SAGE II measurements. In this paper we study how well GOMOS and SAGE II measurements agree with each other in the period 2002-2005, when both instruments were measuring. We detail how the different spatial and temporal sampling of these two instruments affects the conformity of the measurements. We also study how retrieval specifics, such as absorption cross sections and the assumed aerosol model, affect the results. Various combined time series are constructed using different estimators and latitude-time grids. We also show preliminary results from a novel time series analysis based on a Markov chain Monte Carlo approach.
New Comprehensive System to Construct Speleothem Fabrics Time Series
NASA Astrophysics Data System (ADS)
Frisia, S.; Borsato, A.
2014-12-01
Speleothem fabrics record processes that influence the way geochemical proxy data are encoded in speleothems, yet there has been little advance in the use of fabrics as a complement to palaeo-proxy datasets since the fabric classification we proposed in 2010. The systematic use of fabric documentation in speleothem science has been limited by the absence of a comprehensive numerical system that would allow the construction of fabric time series comparable with the widely used geochemical time series. Documentation of speleothem fabrics is fundamental for a robust interpretation of speleothem time series where stable isotopes and trace elements are used as proxies, because fabrics highlight depositional as well as post-depositional processes whose understanding complements reconstructions based on geochemistry. Here we propose a logical system that allows transformation of microscope observations into numbers tied to acronyms that specify each fabric type and subtype. The rationale for ascribing progressive numbers to fabrics is based on the most up-to-date growth models. In this conceptual framework, the progression reflects hydrological conditions, bio-mediation and diagenesis. The lowest numbers are given to calcite fabrics formed at relatively constant drip rates: the columnar types (compact and open). Higher numbers are ascribed to columnar fabrics characterized by the presence of impurities that cause elongation or lattice distortion (Elongated, Fascicular Optic and Radiaxial calcites). The sequence progresses with the dendritic fabrics, followed by micrite (M), which has been observed in association with microbial films. Microsparite (Ms) and mosaic calcite (Mc) have the highest numbers, being considered diagenetic. Acronyms and suffixes are intended to become universally acknowledged. Thus, fabrics can be plotted vs. age to yield time series, where numbers are replaced by the acronyms.
This will result in a visual representation of climate- or environmental-related parameters underpinning speleothem crystal growth. The Fabric log thus becomes a useful tool providing robustness to the geochemical data or test the overall utility of the speleothem record.
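The proposed coding might be sketched as a simple mapping from fabric acronyms to ordered numbers; apart from M, Ms and Mc, which the abstract names, the acronyms and exact numeric values below are hypothetical placeholders:

```python
# Hypothetical numeric coding ordered as in the abstract: compact and open
# columnar lowest, distorted columnar types next, then dendritic, micrite (M),
# microsparite (Ms), and mosaic calcite (Mc) highest.
FABRIC_CODE = {"Cc": 1, "Co": 2, "Ce": 3, "Cfo": 4, "Crx": 5,
               "D": 6, "M": 7, "Ms": 8, "Mc": 9}

def fabric_time_series(ages, fabrics):
    """Pair each dated sample with its numeric fabric code, giving a
    fabric-vs-age series that can be plotted alongside geochemical proxies."""
    return [(age, FABRIC_CODE[f]) for age, f in zip(ages, fabrics)]

series = fabric_time_series([1200, 1150, 1100], ["Cc", "M", "Mc"])
```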
Evaluation of agreement between temporal series obtained from electrocardiogram and pulse wave.
NASA Astrophysics Data System (ADS)
Leikan, GM; Rossi, E.; Sanz, MCuadra; Delisle Rodríguez, D.; Mántaras, MC; Nicolet, J.; Zapata, D.; Lapyckyj, I.; Siri, L. Nicola; Perrone, MS
2016-04-01
Heart rate variability allows the study of cardiovascular autonomic nervous system modulation. Usually, this signal is obtained from the electrocardiogram (ECG). A simpler method for recording the pulse wave (PW) is by means of finger photoplethysmography (PPG), which also provides information about the duration of the cardiac cycle. In this study, the correlation and agreement between the time series of the intervals between heartbeats obtained from the ECG and those obtained from the PPG were studied. The signals analyzed were obtained from young, healthy subjects at rest. For statistical analysis, the Pearson correlation coefficient and the Bland and Altman limits of agreement were used. Results show that the time series constructed from the PW would not replace the ones obtained from the ECG.
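Bland-Altman limits of agreement for paired interbeat-interval series can be computed as sketched below; the synthetic "ECG" and "PPG" series, the 2 ms offset and the noise level are illustrative assumptions, not the study's data:

```python
import math, random, statistics

def bland_altman(a, b):
    """Bland-Altman bias and 95% limits of agreement for paired series."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Synthetic RR intervals (ms): the "PPG" series is the "ECG" series plus a
# 2 ms offset and measurement noise (purely illustrative values).
rng = random.Random(42)
rr_ecg = [800 + 20 * math.sin(0.2 * k) for k in range(200)]
rr_ppg = [v + 2 + rng.gauss(0, 5) for v in rr_ecg]
bias, (low, high) = bland_altman(rr_ecg, rr_ppg)
```

Agreement is judged by whether the limits of agreement are narrow enough for the intended clinical use, not by the correlation coefficient alone.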
On system behaviour using complex networks of a compression algorithm
NASA Astrophysics Data System (ADS)
Walker, David M.; Correa, Debora C.; Small, Michael
2018-01-01
We construct complex networks of scalar time series using a data compression algorithm. The structure and statistics of the resulting networks can be used to help characterize complex systems, and one property, in particular, appears to be a useful discriminating statistic in surrogate data hypothesis tests. We demonstrate these ideas on systems with known dynamical behaviour and also show that our approach is capable of identifying behavioural transitions within electroencephalogram recordings as well as changes due to a bifurcation parameter of a chaotic system. The technique we propose is dependent on a coarse-grained quantization of the original time series and therefore provides potential for a spatial scale-dependent characterization of the data. Finally, the method is as computationally efficient as the underlying compression algorithm and provides a compression of the salient features of long time series.
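The core idea, coarse-graining a series into symbols and using compressed length as a dynamical discriminator, can be sketched with zlib; the quantization scheme and bin count below are illustrative, not the network construction used in the paper:

```python
import math, random, zlib

def compression_complexity(x, n_bins=8):
    """Coarse-grain a scalar series into n_bins symbols and return the
    length of its zlib-compressed symbol string as a complexity proxy."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / n_bins or 1.0
    symbols = bytes(min(int((v - lo) / width), n_bins - 1) for v in x)
    return len(zlib.compress(symbols, 9))

# A structured (periodic) signal compresses far better than noise.
rng = random.Random(0)
periodic = [math.sin(0.1 * k) for k in range(2000)]
noise = [rng.uniform(-1, 1) for _ in range(2000)]
```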
Patel, Ameera X; Bullmore, Edward T
2016-11-15
Connectome mapping using techniques such as functional magnetic resonance imaging (fMRI) has become a focus of systems neuroscience. There remain many statistical challenges in analysis of functional connectivity and network architecture from BOLD fMRI multivariate time series. One key statistic for any time series is its (effective) degrees of freedom, df, which will generally be less than the number of time points (or nominal degrees of freedom, N). If we know the df, then probabilistic inference on other fMRI statistics, such as the correlation between two voxel or regional time series, is feasible. However, we currently lack good estimators of df in fMRI time series, especially after the degrees of freedom of the "raw" data have been modified substantially by denoising algorithms for head movement. Here, we used a wavelet-based method both to denoise fMRI data and to estimate the (effective) df of the denoised process. We show that seed voxel correlations corrected for locally variable df could be tested for false positive connectivity with better control over Type I error and greater specificity of anatomical mapping than probabilistic connectivity maps using the nominal degrees of freedom. We also show that wavelet despiked statistics can be used to estimate all pairwise correlations between a set of regional nodes, assign a P value to each edge, and then iteratively add edges to the graph in order of increasing P. These probabilistically thresholded graphs are likely more robust to regional variation in head movement effects than comparable graphs constructed by thresholding correlations. Finally, we show that time-windowed estimates of df can be used for probabilistic connectivity testing or dynamic network analysis so that apparent changes in the functional connectome are appropriately corrected for the effects of transient noise bursts. 
Wavelet despiking is both an algorithm for fMRI time series denoising and an estimator of the (effective) df of denoised fMRI time series. Accurate estimation of df offers many potential advantages for probabilistically thresholding functional connectivity and network statistics tested in the context of spatially variant and non-stationary noise. Code for wavelet despiking, seed correlational testing and probabilistic graph construction is freely available to download as part of the BrainWavelet Toolbox at www.brainwavelet.org. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Fitzgerald, Michael G.; Karlinger, Michael R.
1983-01-01
Time-series models were constructed for analysis of daily runoff and sediment discharge data from selected rivers of the Eastern United States. Logarithmic transformation and first-order differencing of the data sets were necessary to produce second-order, stationary time series and remove seasonal trends. Cyclic models accounted for less than 42 percent of the variance in the water series and 31 percent in the sediment series. Analysis of the apparent oscillations of given frequencies occurring in the data indicates that frequently occurring storms can account for as much as 50 percent of the variation in sediment discharge. Components of the frequency analysis indicate that a linear representation is reasonable for the water-sediment system. Models that incorporate lagged water discharge as input prove superior to univariate techniques in modeling and prediction of sediment discharges. The random component of the models includes errors in measurement and model hypothesis and indicates no serial correlation. An index of sediment production within or between drainage basins can be calculated from model parameters.
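The preprocessing described above, logarithmic transformation followed by first-order differencing, can be sketched as:

```python
import math

def log_difference(x):
    """First-order difference of the log-transformed series: a standard step
    to stabilise variance and remove trend before fitting time-series models."""
    logs = [math.log(v) for v in x]
    return [b - a for a, b in zip(logs, logs[1:])]

# A discharge series growing 5% per step becomes constant after the
# transform, since differencing the logs removes the exponential trend.
discharge = [100.0 * 1.05 ** k for k in range(50)]
stationary = log_difference(discharge)
```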
NASA Astrophysics Data System (ADS)
Filimonov, M. Yu.
2017-12-01
The method of special series with recursively calculated coefficients is used to solve nonlinear partial differential equations. The recurrence in finding the coefficients of the series is achieved through a special choice of the functions in powers of which the solution is expanded. We obtain a sequence of linear partial differential equations for the coefficients of the series constructed. In many cases, one can deal instead with a sequence of linear ordinary differential equations. We construct classes of solutions in the form of convergent series for a certain class of nonlinear evolution equations. A new class of solutions of the generalized Boussinesq equation with an arbitrary function, in the form of a convergent series, is constructed.
Persistent homology of time-dependent functional networks constructed from coupled time series
NASA Astrophysics Data System (ADS)
Stolz, Bernadette J.; Harrington, Heather A.; Porter, Mason A.
2017-04-01
We use topological data analysis to study "functional networks" that we construct from time-series data from both experimental and synthetic sources. We use persistent homology with a weight rank clique filtration to gain insights into these functional networks, and we use persistence landscapes to interpret our results. Our first example uses time-series output from networks of coupled Kuramoto oscillators. Our second example consists of biological data in the form of functional magnetic resonance imaging data that were acquired from human subjects during a simple motor-learning task in which subjects were monitored for three days during a five-day period. With these examples, we demonstrate that (1) using persistent homology to study functional networks provides fascinating insights into their properties and (2) the position of the features in a filtration can sometimes play a more vital role than persistence in the interpretation of topological features, even though conventionally the latter is used to distinguish between signal and noise. We find that persistent homology can detect differences in synchronization patterns in our data sets over time, giving insight both on changes in community structure in the networks and on increased synchronization between brain regions that form loops in a functional network during motor learning. For the motor-learning data, persistence landscapes also reveal that on average the majority of changes in the network loops take place on the second of the three days of the learning process.
Complex effusive events at Kilauea as documented by the GOES satellite and remote video cameras
Harris, A.J.L.; Thornber, C.R.
1999-01-01
GOES provides thermal data for all of the Hawaiian volcanoes once every 15 min. We show how volcanic radiance time series produced from this data stream can be used as a simple measure of effusive activity. Two types of radiance trends in these time series can be used to monitor effusive activity: (a) Gradual variations in radiance reveal steady flow-field extension and tube development. (b) Discrete spikes correlate with short bursts of activity, such as lava fountaining or lava-lake overflows. We are confident that any effusive event covering more than 10,000 m2 of ground in less than 60 min will be unambiguously detectable using this approach. We demonstrate this capability using GOES, video camera and ground-based observational data for the current eruption of Kilauea volcano (Hawai'i). A GOES radiance time series was constructed from 3987 images between 19 June and 12 August 1997. This time series displayed 24 radiance spikes elevated more than two standard deviations above the mean; 19 of these are correlated with video-recorded short-burst effusive events. Less ambiguous events are interpreted, assessed and related to specific volcanic events by simultaneous use of permanently recording video camera data and ground-observer reports. The GOES radiance time series are automatically processed on data reception and made available in near-real-time, so such time series can contribute to three main monitoring functions: (a) automatically alerting major effusive events; (b) event confirmation and assessment; and (c) establishing effusive event chronology.
A time series model: First-order integer-valued autoregressive (INAR(1))
NASA Astrophysics Data System (ADS)
Simarmata, D. M.; Novkaniza, F.; Widyaningsih, Y.
2017-07-01
Nonnegative integer-valued time series arise in many applications. A first-order integer-valued autoregressive model (INAR(1)) is constructed with the binomial thinning operator to model nonnegative integer-valued time series. INAR(1) depends on the process value one period before. The parameter of the model can be estimated by Conditional Least Squares (CLS). The specification of INAR(1) follows that of AR(1). Forecasting in INAR(1) uses median or Bayesian forecasting methodology. The median forecast is the least integer s for which the cumulative distribution function (CDF) up to s is greater than or equal to 0.5. The Bayesian methodology forecasts h steps ahead by generating the model parameter and the innovation-term parameter using Adaptive Rejection Metropolis Sampling within Gibbs sampling (ARMS), then finding the least integer s for which the CDF up to s is greater than or equal to u, where u is drawn from the Uniform(0,1) distribution. INAR(1) is applied to monthly pneumonia cases in Penjaringan, Jakarta Utara, from January 2008 to April 2016.
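A minimal sketch of the INAR(1) recursion via binomial thinning and of the median forecasting rule described above. Poisson innovations and the parameter values are illustrative assumptions, not taken from the paper:

```python
import math
import random

def thin(x, alpha, rng):
    # Binomial thinning: alpha ∘ x is the number of x Bernoulli(alpha) survivors.
    return sum(1 for _ in range(x) if rng.random() < alpha)

def poisson_draw(lam, rng):
    # Knuth's method for sampling the Poisson(lam) innovation term.
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def simulate_inar1(alpha, lam, n, x0=0, seed=1):
    # X_t = alpha ∘ X_{t-1} + eps_t: the value one period before is thinned,
    # then a nonnegative integer innovation is added.
    rng = random.Random(seed)
    xs, x = [], x0
    for _ in range(n):
        x = thin(x, alpha, rng) + poisson_draw(lam, rng)
        xs.append(x)
    return xs

def median_forecast(x, alpha, lam):
    # Least integer s whose one-step-ahead CDF reaches 0.5, computed by
    # convolving Binomial(x, alpha) with Poisson(lam).
    def binom_pmf(j):
        return math.comb(x, j) * alpha**j * (1 - alpha)**(x - j)
    def pois_pmf(k):
        return math.exp(-lam) * lam**k / math.factorial(k)
    cdf, s = 0.0, -1
    while cdf < 0.5:
        s += 1
        cdf += sum(binom_pmf(j) * pois_pmf(s - j) for j in range(min(x, s) + 1))
    return s

path = simulate_inar1(alpha=0.5, lam=1.0, n=50)
forecast = median_forecast(path[-1], alpha=0.5, lam=1.0)
```

For example, with the last observation equal to 5, alpha = 0.5 and lam = 1.0, the one-step-ahead CDF first reaches 0.5 at s = 3, so the median forecast is 3.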
NASA Astrophysics Data System (ADS)
Matsunaga, Y.; Sugita, Y.
2018-06-01
A data-driven modeling scheme is proposed for the conformational dynamics of biomolecules based on molecular dynamics (MD) simulations and experimental measurements. In this scheme, an initial Markov State Model (MSM) is constructed from MD simulation trajectories, and the MSM parameters are then refined against experimental measurements through machine learning techniques. The second step can reduce the bias in MD simulation results due to inaccurate force-field parameters. Either time-series trajectories or ensemble-averaged data can serve as the training data set in the scheme. Using a coarse-grained model of a dye-labeled polyproline-20, we compare the performance of machine learning estimations from the two types of training data sets. Machine learning from time-series data can provide the equilibrium populations of conformational states as well as their transition probabilities. It estimates hidden conformational states more robustly than learning from ensemble-averaged data, although there are limitations in estimating the transition probabilities between minor states. We discuss how to use the machine learning scheme for various experimental measurements, including single-molecule time-series trajectories.
49 CFR 178.274 - Specifications for UN portable tanks.
Code of Federal Regulations, 2010 CFR
2010-10-01
... and 178.277, as applicable. Design type means a portable tank or series of portable tanks made of... the top of the shell during the hydraulic pressure test equal to not less than 1.5 times the design... be designed and constructed to withstand a hydraulic test pressure of not less than 1.5 times the...
49 CFR 178.274 - Specifications for UN portable tanks.
Code of Federal Regulations, 2013 CFR
2013-10-01
... and 178.277, as applicable. Design type means a portable tank or series of portable tanks made of... the top of the shell during the hydraulic pressure test equal to not less than 1.5 times the design... be designed and constructed to withstand a hydraulic test pressure of not less than 1.5 times the...
46 CFR 160.015-3 - Construction of lifeboat winches.
Code of Federal Regulations, 2010 CFR
2010-10-01
... shall be maintained at all times based on the approved working load. (b) Worm gears, spur gears, or a... drums shall be at least 16 times the diameter of the falls. (f) A weighted lever hand brake shall be... davit arms as they approach the final stowed position. These switches shall be connected in series, they...
46 CFR 160.015-3 - Construction of lifeboat winches.
Code of Federal Regulations, 2011 CFR
2011-10-01
... davit arms as they approach the final stowed position. These switches shall be connected in series, they... shall be maintained at all times based on the approved working load. (b) Worm gears, spur gears, or a... drums shall be at least 16 times the diameter of the falls. (f) A weighted lever hand brake shall be...
Decomposition rates for hand-piled fuels
Clinton S. Wright; Alexander M. Evans; Joseph C. Restaino
2017-01-01
Hand-constructed piles in eastern Washington and north-central New Mexico were weighed periodically between October 2011 and June 2015 to develop decay-rate constants that are useful for estimating the rate of piled biomass loss over time. Decay-rate constants (k) were determined by fitting negative exponential curves to time series of pile weight for each site. Piles...
New Features for Neuron Classification.
Hernández-Pérez, Leonardo A; Delgado-Castillo, Duniel; Martín-Pérez, Rainer; Orozco-Morales, Rubén; Lorenzo-Ginori, Juan V
2018-04-28
This paper addresses the problem of obtaining new neuron features capable of improving the results of neuron classification. Most studies on neuron classification using morphological features have been based on Euclidean geometry. Here, three one-dimensional (1D) time series are instead derived from the three-dimensional (3D) structure of the neuron, and a spatial time series is then constructed from which the features are calculated. Digitally reconstructed neurons were separated into control and pathological sets related to three categories of alterations caused by epilepsy, Alzheimer's disease (long and local projections), and ischemia. These neuron sets were then subjected to supervised classification, and the results were compared across three sets of features: morphological features, features obtained from the time series, and a combination of both. The best results were obtained using features from the time series, which outperformed classification using only morphological features, showing higher correct classification rates with differences of 5.15%, 3.75%, and 5.33% for epilepsy and Alzheimer's disease (long and local projections), respectively. The morphological features were better for the ischemia set, with a difference of 3.05%. Features related to the time series, such as variance, Spearman auto-correlation, partial auto-correlation, mutual information, and local minima and maxima, exhibited the best performance. We also compared different feature evaluators, among which ReliefF was the best ranked.
Park, Chihyun; Yun, So Jeong; Ryu, Sung Jin; Lee, Soyoung; Lee, Young-Sam; Yoon, Youngmi; Park, Sang Chul
2017-03-15
Cellular senescence irreversibly arrests the growth of human diploid cells. In addition, recent studies have indicated that senescence is a multi-step, evolving process related to important complex biological processes. Most studies analyzed only the genes and their functions representing each senescence phase, without considering gene-level interactions and continuously perturbed genes. It is necessary to reveal the genotypic mechanism, inferred from the affected genes and their interactions, underlying the senescence process. We suggest a novel computational approach to identify an integrative network which profiles an underlying genotypic signature from time-series gene expression data. The relatively perturbed genes were selected for each time point based on a proposed scoring measure termed the perturbation score. The selected genes were then integrated with protein-protein interactions to construct a time-point-specific network. From these constructed networks, the edges conserved across time points were extracted to form the common network, and a statistical test was performed to demonstrate that the network could explain the phenotypic alteration. As a result, it was confirmed that the difference in average perturbation scores of the common network between two time points could explain the phenotypic alteration. We also performed functional enrichment on the common network and identified a high association with phenotypic alteration. Remarkably, we observed that the identified cell-cycle-specific common network played an important role in replicative senescence as a key regulator. Heretofore, network analysis of time-series gene expression data has focused on how topological structure changes over time. Conversely, we focused on the conserved structure whose context changes over time, and showed that it can explain the phenotypic changes.
We expect that the proposed method will help to elucidate the biological mechanisms unrevealed by existing approaches.
Empirical Investigation of Critical Transitions in Paleoclimate
NASA Astrophysics Data System (ADS)
Loskutov, E. M.; Mukhin, D.; Gavrilov, A.; Feigin, A.
2016-12-01
In this work we apply a new empirical method for the analysis of complex spatially distributed systems to the analysis of paleoclimate data. The method consists of two general parts: (i) revealing the optimal phase-space variables and (ii) constructing an empirical prognostic model from observed time series. The construction of phase-space variables is based on decomposing the data into nonlinear dynamical modes; it was successfully applied to the global SST field, where it clearly separated time scales and revealed a climate shift in the observed data interval [1]. The second part, a Bayesian approach to optimal reconstruction of the evolution operator from time series, is based on representing the evolution operator as a nonlinear stochastic function modeled by artificial neural networks [2,3]. Here we focus on the investigation of critical transitions - abrupt changes in climate dynamics - in much longer time-scale processes. It is well known that there were a number of critical transitions on different time scales in the past. We demonstrate the first results of applying our empirical methods to the analysis of paleoclimate variability. In particular, we discuss the possibility of detecting, identifying and predicting such critical transitions by means of nonlinear empirical modeling using paleoclimate record time series. The study is supported by the Government of the Russian Federation (agreement #14.Z50.31.0033 with the Institute of Applied Physics of RAS).
1. Mukhin, D., Gavrilov, A., Feigin, A., Loskutov, E., & Kurths, J. (2015). Principal nonlinear dynamical modes of climate variability. Scientific Reports, 5, 15510. http://doi.org/10.1038/srep15510
2. Molkov, Ya. I., Mukhin, D. N., Loskutov, E. M., & Feigin, A. M. (2012). Random dynamical models from time series. Phys. Rev. E, 85(3).
3. Mukhin, D., Kondrashov, D., Loskutov, E., Gavrilov, A., Feigin, A., & Ghil, M. (2015). Predicting Critical Transitions in ENSO models. Part II: Spatially Dependent Models. Journal of Climate, 28(5), 1962-1976. http://doi.org/10.1175/JCLI-D-14-00240.1
Approximate scaling properties of RNA free energy landscapes
NASA Technical Reports Server (NTRS)
Baskaran, S.; Stadler, P. F.; Schuster, P.
1996-01-01
RNA free energy landscapes are analysed by means of "time-series" that are obtained from random walks restricted to excursion sets. The power spectra, the scaling of the jump size distribution, and the scaling of the curve length measured with different yard stick lengths are used to describe the structure of these "time series". Although they are stationary by construction, we find that their local behavior is consistent with both AR(1) and self-affine processes. Random walks confined to excursion sets (i.e., with the restriction that the fitness value exceeds a certain threshold at each step) exhibit essentially the same statistics as free random walks. We find that an AR(1) time series is in general approximately self-affine on timescales up to approximately the correlation length. We present an empirical relation between the correlation parameter rho of the AR(1) model and the exponents characterizing self-affinity.
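The AR(1) process the abstract compares against is easy to generate and check; a minimal sketch (the parameter values are illustrative, not from the paper):

```python
import random

def ar1(rho, n, sigma=1.0, seed=0):
    # AR(1): x[t] = rho * x[t-1] + sigma * Gaussian noise; its correlation
    # length is roughly -1/ln(rho), the timescale up to which the abstract
    # finds approximate self-affinity.
    rng = random.Random(seed)
    x, xs = 0.0, []
    for _ in range(n):
        x = rho * x + sigma * rng.gauss(0.0, 1.0)
        xs.append(x)
    return xs

def lag1_autocorr(xs):
    # Sample lag-1 autocorrelation; for a long AR(1) series it estimates rho.
    m = sum(xs) / len(xs)
    num = sum((a - m) * (b - m) for a, b in zip(xs, xs[1:]))
    den = sum((a - m) ** 2 for a in xs)
    return num / den

series = ar1(rho=0.9, n=20000)
rho_hat = lag1_autocorr(series)
```

Fitting the correlation parameter rho this way is the first step toward the empirical rho-versus-self-affinity relation the abstract reports.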
Detecting a currency’s dominance using multivariate time series analysis
NASA Astrophysics Data System (ADS)
Syahidah Yusoff, Nur; Sharif, Shamshuritawati
2017-09-01
A currency exchange rate is the price of one country’s currency in terms of another country’s currency. Four different prices (opening, closing, highest, and lowest) result from daily trading activities. In the past, many studies were carried out using the closing price only. However, the four prices are interrelated, so a multivariate time series can provide more information than a univariate one. The aim of this paper is therefore to compare the results of two different approaches, the mean vector and Escoufier’s RV coefficient, for constructing similarity matrices of 20 world currencies. Both matrices are then used in place of the correlation matrix required by network topology. With the help of the degree centrality measure, we can detect the dominant currency in both networks. The pros and cons of both approaches are presented at the end of this paper.
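The dominance test described above reduces to degree centrality on the similarity network; a minimal sketch with a hypothetical four-currency network (the currencies and links are made up for illustration, not the paper's data):

```python
def degree_centrality(adjacency):
    # Normalized degree centrality: a node's number of links divided by the
    # maximum possible (n - 1); the highest-centrality currency is "dominant".
    n = len(adjacency)
    return {node: len(nbrs) / (n - 1) for node, nbrs in adjacency.items()}

# Hypothetical similarity network over four currencies.
network = {
    "USD": {"EUR", "JPY", "GBP"},
    "EUR": {"USD", "GBP"},
    "JPY": {"USD"},
    "GBP": {"USD", "EUR"},
}
centrality = degree_centrality(network)
dominant = max(centrality, key=centrality.get)
```

In this toy network the node linked to all others has centrality 1.0 and is flagged as dominant.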
Novel Flood Detection and Analysis Method Using Recurrence Property
NASA Astrophysics Data System (ADS)
Wendi, Dadiyorto; Merz, Bruno; Marwan, Norbert
2016-04-01
Temporal changes in flood hazard are known to be difficult to detect and attribute because of multiple drivers that include non-stationary and highly variable processes. These drivers, such as human-induced climate change, natural climate variability, implementation of flood defence, river training, or land use change, can act on different space-time scales and influence or mask each other. Flood time series may show complex behavior that varies at a range of time scales and may cluster in time. This study focuses on the application of recurrence-based data analysis techniques (recurrence plots) for understanding and quantifying spatio-temporal changes in flood hazard in Germany. The recurrence plot is known as an effective tool to visualize the dynamics of phase-space trajectories, i.e., trajectories constructed from a time series using an embedding dimension and a time delay, and it is effective in analyzing non-stationary and non-linear time series. The emphasis is on identifying characteristic recurrence properties that can associate typical dynamic behavior with certain flood situations.
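The recurrence-plot construction sketched above (delay embedding, then thresholded pairwise distances) can be written compactly; the embedding parameters and the sine test signal are illustrative choices, not the study's settings:

```python
import math

def embed(series, dim, delay):
    # Time-delay embedding: state vectors v[i] = (x[i], x[i+delay], ...,
    # x[i+(dim-1)*delay]) reconstruct the phase-space trajectory.
    n = len(series) - (dim - 1) * delay
    return [[series[i + j * delay] for j in range(dim)] for i in range(n)]

def recurrence_plot(series, dim, delay, eps):
    # R[i][j] = 1 when embedded states i and j are within eps (Euclidean).
    states = embed(series, dim, delay)
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    return [[1 if dist(a, b) <= eps else 0 for b in states] for a in states]

# A period-20 signal recurs every 20 samples, producing diagonal lines.
signal = [math.sin(2 * math.pi * t / 20) for t in range(100)]
rp = recurrence_plot(signal, dim=2, delay=5, eps=0.1)
```

The diagonal lines parallel to the main diagonal are the structures that recurrence quantification then measures.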
POD Model Reconstruction for Gray-Box Fault Detection
NASA Technical Reports Server (NTRS)
Park, Han; Zak, Michail
2007-01-01
Proper orthogonal decomposition (POD) is the mathematical basis of a method of constructing low-order mathematical models for the "gray-box" fault-detection algorithm that is a component of a diagnostic system known as beacon-based exception analysis for multi-missions (BEAM). POD has been successfully applied in reducing computational complexity by generating simple models that can be used for control and simulation for complex systems such as fluid flows. In the present application to BEAM, POD brings the same benefits to automated diagnosis. BEAM is a method of real-time or offline, automated diagnosis of a complex dynamic system. The gray-box approach makes it possible to utilize incomplete or approximate knowledge of the dynamics of the system that one seeks to diagnose. In the gray-box approach, a deterministic model of the system is used to filter a time series of system sensor data to remove the deterministic components of the time series from further examination. What is left after the filtering operation is a time series of residual quantities that represent the unknown (or at least unmodeled) aspects of the behavior of the system. Stochastic modeling techniques are then applied to the residual time series. The procedure for detecting abnormal behavior of the system then becomes one of looking for statistical differences between the residual time series and the predictions of the stochastic model.
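As a sketch of the POD idea (not the BEAM implementation): the leading POD mode is the dominant eigenvector of the snapshot covariance matrix, recoverable here by power iteration; the snapshot data are made up for illustration:

```python
def dominant_pod_mode(snapshots, iters=200):
    # Power iteration on the snapshot covariance C = (1/m) * sum s s^T yields
    # the leading POD mode; projecting data onto a few such modes gives the
    # kind of low-order model used to filter out the deterministic component.
    n = len(snapshots[0])
    m = len(snapshots)
    C = [[sum(s[i] * s[j] for s in snapshots) / m for j in range(n)]
         for i in range(n)]
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Snapshots lying along one direction: the mode recovers that direction.
data = [[2.0, 0.1], [-3.0, -0.15], [1.0, 0.05]]
mode = dominant_pod_mode(data)
```

A full POD would keep several modes (e.g., via SVD of the snapshot matrix); one mode suffices to show the mechanism.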
Huang, Xin; Zeng, Jun; Zhou, Lina; Hu, Chunxiu; Yin, Peiyuan; Lin, Xiaohui
2016-08-31
Time-series metabolomics studies can provide insight into the dynamics of disease development and facilitate the discovery of prospective biomarkers. To improve the performance of early risk identification, a new strategy for analyzing time-series data based on dynamic networks (ATSD-DN) in a systematic time dimension is proposed. In ATSD-DN, the non-overlapping ratio was applied to measure the changes in feature ratios during the process of disease development and to construct dynamic networks. Dynamic concentration analysis and network topological structure analysis were performed to extract early warning information. This strategy was applied to the study of time-series lipidomics data from a stepwise hepatocarcinogenesis rat model. A ratio of lyso-phosphatidylcholine (LPC) 18:1/free fatty acid (FFA) 20:5 was identified as the potential biomarker for hepatocellular carcinoma (HCC). It can be used to classify HCC and non-HCC rats, and the area under the curve values in the discovery and external validation sets were 0.980 and 0.972, respectively. This strategy was also compared with a weighted relative difference accumulation algorithm (wRDA), multivariate empirical Bayes statistics (MEBA) and support vector machine-recursive feature elimination (SVM-RFE). The better performance of ATSD-DN suggests its potential for a more complete presentation of time-series changes and effective extraction of early warning information.
Visibility graph analysis on quarterly macroeconomic series of China based on complex network theory
NASA Astrophysics Data System (ADS)
Wang, Na; Li, Dong; Wang, Qiwen
2012-12-01
The visibility graph approach and complex network theory provide new insight into time series analysis. The inheritance of the visibility graph from the original time series is further explored in this paper. We found that degree distributions of visibility graphs extracted from Pseudo Brownian Motion series obtained by the Frequency Domain algorithm exhibit exponential behavior, in which the exponential exponent is a binomial function of the Hurst index inherited from the time series. Our simulations show that the quantitative relations between the Hurst indexes and the exponents of the degree distribution function differ between series, and that the visibility graph inherits some important features of the original time series. Further, we convert several quarterly macroeconomic series of China, including the growth rates of value added of the three industries and the growth rate of Gross Domestic Product (GDP), to graphs by the visibility algorithm and explore the topological properties of the associated graphs, namely the degree distribution and correlations, the clustering coefficient, the average path length, and community structure. Based on complex network analysis, we find that the degree distributions of the networks associated with the growth rates of value added of the three industries are almost exponential, while the degree distributions of the networks associated with the GDP growth rate series are scale free. We also discuss the assortativity and disassortativity of the four associated networks as they relate to the evolutionary process of the original macroeconomic series. All the constructed networks have “small-world” features. The community structures of the associated networks suggest dynamic changes in the original macroeconomic series. We also examined the relationship among government policy changes, community structures of the associated networks, and macroeconomic dynamics.
We find that government policies in China strongly influence the dynamics of GDP and the adjustment of the three industries. The work in this paper provides a new way to understand the dynamics of economic development.
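The visibility algorithm referred to above connects two samples when no intermediate sample blocks the straight line between them; a minimal pure-Python sketch of the natural visibility criterion (the toy series is illustrative):

```python
def visibility_graph(series):
    # Natural visibility: samples a and b (as points (t, x[t])) are linked
    # when every sample between them lies strictly below the line joining them.
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            if all(series[c] < series[b] + (series[a] - series[b]) * (b - c) / (b - a)
                   for c in range(a + 1, b)):
                edges.add((a, b))
    return edges

# The peak at index 1 sees every other sample; index 0 cannot see past it.
edges = visibility_graph([1.0, 5.0, 1.0, 2.0])
```

The degree distribution and other network measures discussed in the abstract are then computed on the graph these edges define.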
Cascading Oscillators in Decoding Speech: Reflection of a Cortical Computation Principle
2016-09-06
Combining an experimental paradigm based on Ghitza and Greenberg (2009) for speech with the approach of Farbood et al. (2013) to timing in key...Fuglsang, 2015). A model was developed which uses modulation spectrograms to construct an oscillating time-series synchronized with the slowly varying...
Zhou, Wei; Wen, Junhao; Qu, Qiang; Zeng, Jun; Cheng, Tian
2018-01-01
Recommender systems are vulnerable to shilling attacks. Forged user-generated content data, such as user ratings and reviews, are used by attackers to manipulate recommendation rankings. Shilling attack detection in recommender systems is of great significance to maintain the fairness and sustainability of recommender systems. The current studies have problems in terms of the poor universality of algorithms, difficulty in selection of user profile attributes, and lack of an optimization mechanism. In this paper, a shilling behaviour detection structure based on abnormal group user findings and rating time series analysis is proposed. This paper adds to the current understanding in the field by studying the credibility evaluation model in-depth based on the rating prediction model to derive proximity-based predictions. A method for detecting suspicious ratings based on suspicious time windows and target item analysis is proposed. Suspicious rating time segments are determined by constructing a time series, and data streams of the rating items are examined and suspicious rating segments are checked. To analyse features of shilling attacks by a group user's credibility, an abnormal group user discovery method based on time series and time window is proposed. Standard testing datasets are used to verify the effect of the proposed method.
NASA Astrophysics Data System (ADS)
Chen, Yu-Wen; Wang, Yetmen; Chang, Liang-Cheng
2017-04-01
Groundwater resources play a vital role in regional water supply. To avoid irreversible environmental impacts such as land subsidence, identifying the characteristics of a groundwater system is crucial for sustainable management of the groundwater resource. This study proposes a signal-processing approach to identify the character of groundwater systems based on long-term hydrologic observations, including groundwater level and rainfall. The study contains two steps. First, a linear signal model (LSM) is constructed and calibrated to simulate the variation of underground hydrology based on the time series of groundwater levels and rainfall. The mass balance equation of the proposed LSM contains three major terms, the net rate of horizontal exchange, the rate of rainfall recharge, and the rate of pumpage, and four parameters require calibration. Because reliable records of pumpage are rare, the time-variant groundwater amplitudes at daily frequency (P) calculated by STFT are used as linear indicators of pumpage in place of pumpage records. Time series obtained from 39 observation wells and 50 rainfall stations in and around the study area, the Pingtung Plain, are paired for model construction. Second, the well-calibrated parameters of the linear signal model can be used to interpret the characteristics of the groundwater system. For example, the rainfall recharge coefficient (γ) is the transform ratio between rainfall intensity and groundwater level rise. An area whose observation well shows higher γ indicates that the saturated zone there is easily affected by rainfall events and that the material of the unsaturated zone may be gravel or coarse sand with a high infiltration ratio. Considering the spatial distribution of γ, the values of γ decrease from the upstream to the downstream of major rivers and are also correlated with the spatial distribution of the grain size of surface soil.
Via the time series of groundwater levels and rainfall, the well-calibrated parameters of the LSM are thus able to identify the characteristics of the aquifer.
ERIC Educational Resources Information Center
Hobson, Sarah R.; Vu, Julie F.
2015-01-01
This paper draws upon an explanation of the proleptic, an understanding of time as being socially constructed within specific contexts, to interpret a series of dramatic sequences enacted in ethnodramatic pedagogy. The authors present two major arguments: (1) teachers can help students analyze the processes that influence and shape their…
Xia, Li C; Steele, Joshua A; Cram, Jacob A; Cardon, Zoe G; Simmons, Sheri L; Vallino, Joseph J; Fuhrman, Jed A; Sun, Fengzhu
2011-01-01
The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However LSA, as originally developed, does not consider time series data with replicates, which hinders the full exploitation of available information. With replicates, it is possible to understand the variability of local similarity (LS) score and to obtain its confidence interval. We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique into an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond that of ordinary correlation analysis. These statistically significant associations can provide insights to the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage, which can be accessed at http://meta.usc.edu/softs/lsa.
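The local, possibly time-delayed association that LSA scores can be sketched with a small dynamic program over shifted products; this illustrates the idea only and is not the published eLSA implementation (it ignores normalization, replicates, significance testing, and negative associations):

```python
def local_similarity(x, y, max_delay=3):
    # For each shift d up to max_delay, keep the best contiguous run of
    # pairwise products x[i] * y[i + d] (a run resets when it drops below
    # zero); the largest run, scaled by the series length, is the score.
    n = len(x)
    best = 0.0
    for d in range(-max_delay, max_delay + 1):
        run = 0.0
        for i in range(n):
            j = i + d
            if 0 <= j < len(y):
                run = max(0.0, run + x[i] * y[j])
                best = max(best, run)
    return best / n

# y is x inverted, i.e. the alternating series delayed by one step: the
# zero-lag association is negative, but a one-step shift aligns the series.
x = [1, -1, 1, -1, 1, -1, 1, -1]
y = [-v for v in x]
score = local_similarity(x, y)
```

This is exactly the case ordinary correlation misses: at zero lag the two series anti-align, yet the delayed association is nearly perfect.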
PMID:22784572
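The local similarity (LS) score underlying LSA can be sketched as a scan over candidate time delays with a running partial sum that resets at zero; the following is a minimal illustration on z-normalized series (the function name, delay bound, and whole-series normalization are our simplification, not the eLSA implementation):

```python
import numpy as np

def local_similarity(x, y, max_delay=3):
    """Maximal local association between two z-normalized series,
    allowing a shift of up to max_delay steps; positively and
    negatively associated subintervals are both scored (LSA sketch)."""
    n = len(x)
    best = 0.0
    for d in range(-max_delay, max_delay + 1):
        pos = neg = 0.0
        for i in range(max(0, d), min(n, n + d)):
            prod = x[i] * y[i - d]
            pos = max(0.0, pos + prod)  # running sum, reset below zero
            neg = max(0.0, neg - prod)
            best = max(best, pos, neg)
    return best / n
```

A series scored against itself (or its negation) yields 1 after z-normalization, while an unrelated series scores near 0; in eLSA the statistical significance of the score is then assessed across replicates.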
Sriyudthsak, Kansuporn; Shiraishi, Fumihide; Hirai, Masami Yokota
2016-01-01
The high-throughput acquisition of metabolome data is greatly anticipated for the complete understanding of cellular metabolism in living organisms. A variety of analytical technologies have been developed to acquire large-scale metabolic profiles under different biological or environmental conditions. Time series data are useful for predicting the most likely metabolic pathways because they provide important information regarding the accumulation of metabolites, which implies causal relationships in the metabolic reaction network. Considerable effort has been undertaken to utilize these data for constructing a mathematical model that merges system properties and quantitatively characterizes a whole metabolic system in toto. However, technical difficulties remain between the provision and the utilization of data. Although hundreds of metabolites can be measured, providing information on the metabolic reaction system, the simultaneous measurement of thousands of metabolites is still challenging. In addition, it is nontrivial to logically predict the dynamic behaviors of unmeasurable metabolite concentrations without sufficient information on the metabolic reaction network. Thus, consolidating the advantages of advancements in both metabolomics and mathematical modeling remains to be accomplished. This review outlines the conceptual basis of, and recent advances in, technologies in both research fields. It also highlights the potential for constructing a large-scale mathematical model by estimating model parameters from time series metabolome data in order to comprehensively understand metabolism at the systems level.
Iwata, Michio; Miyawaki-Kuwakado, Atsuko; Yoshida, Erika; Komori, Soichiro; Shiraishi, Fumihide
2018-02-02
In a mathematical model, estimation of parameters from time-series data of metabolic concentrations in cells is a challenging task. However, it seems that a promising approach for such estimation has not yet been established. Biochemical Systems Theory (BST) is a powerful methodology to construct a power-law type model for a given metabolic reaction system and to then characterize it efficiently. In this paper, we discuss the use of an S-system root-finding method (S-system method) to estimate parameters from time-series data of metabolite concentrations. We demonstrate that the S-system method is superior to the Newton-Raphson method in terms of the convergence region and iteration number. We also investigate the usefulness of a translocation technique and a complex-step differentiation method toward the practical application of the S-system method. The results indicate that the S-system method is useful to construct mathematical models for a variety of metabolic reaction networks. Copyright © 2018 Elsevier Inc. All rights reserved.
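The appeal of the S-system (power-law) form used in BST is that steady states become linear in the logarithms of the concentrations, which is what makes root-finding tractable. A minimal two-metabolite sketch (all rate constants and kinetic orders below are illustrative, not taken from the paper):

```python
import numpy as np

# Hypothetical two-metabolite S-system (power-law) model:
#   dX1/dt = a1*X2**g12 - b1*X1**h11
#   dX2/dt = a2*X1**g21 - b2*X2**h22
a1, g12, b1, h11 = 2.0, 0.5, 1.0, 0.8
a2, g21, b2, h22 = 1.5, 0.4, 1.2, 0.6

# Taking logs at steady state turns root finding into a linear solve:
#   h11*y1 - g12*y2 = log(a1/b1),  -g21*y1 + h22*y2 = log(a2/b2),
# where y_i = log(X_i).
A = np.array([[h11, -g12],
              [-g21, h22]])
rhs = np.array([np.log(a1 / b1), np.log(a2 / b2)])
y = np.linalg.solve(A, rhs)
X_ss = np.exp(y)

# Verify: production and degradation terms balance at the steady state
r1 = a1 * X_ss[1]**g12 - b1 * X_ss[0]**h11
r2 = a2 * X_ss[0]**g21 - b2 * X_ss[1]**h22
```

For larger networks the same log-linearization underlies the S-system root-finding step that the paper compares against Newton-Raphson iteration.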
Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo
2014-07-01
Reservoir computing is a recently introduced machine learning paradigm that has already shown excellent performance in the processing of empirical data. We study a particular kind of reservoir computer called a time-delay reservoir, which is constructed by sampling the solution of a time-delay differential equation. We show its good performance in forecasting the conditional covariances associated with multivariate discrete-time nonlinear stochastic processes of VEC-GARCH type, as well as in predicting factual daily market realized volatilities computed with intraday quotes, using daily log-return series of moderate size as training input. We tackle some problems associated with the lack of task-universality for individually operating reservoirs and propose a solution based on the use of parallel arrays of time-delay reservoirs. Copyright © 2014 Elsevier Ltd. All rights reserved.
PROMIS series. Volume 8: Midlatitude ground magnetograms
NASA Technical Reports Server (NTRS)
Fairfield, D. H.; Russell, C. T.
1990-01-01
This is the eighth in a series of volumes pertaining to the Polar Region Outer Magnetosphere International Study (PROMIS). This volume contains 24 hour stack plots of 1-minute average, H and D component, ground magnetograms for the period March 10 through June 16, 1986. Nine midlatitude ground stations were selected from the UCLA magnetogram data base that was constructed from all available digitized magnetogram stations. The primary purpose of this publication is to allow users to define universal times and onset longitudes of magnetospheric substorms.
Introduction to Carpentry. Introduction to Construction Series. Instructor Edition.
ERIC Educational Resources Information Center
Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.
This competency-based curriculum guide on the specialty area of carpentry is part of the Introduction to Construction series. The series is designed with the flexible training requirements of open shop contractors, preapprenticeship programs, multicraft high school programs, technology education programs, and cooperative education programs in…
Introduction to Bricklaying. Introduction to Construction Series. Instructor Edition.
ERIC Educational Resources Information Center
Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.
This competency-based curriculum guide on the specialty area of bricklaying is part of the Introduction to Construction series. The series is designed with the flexible training requirements of open shop contractors, preapprenticeship programs, multicraft high school programs, technology education programs, and cooperative education programs in…
Xie, Ping; Wu, Zi Yi; Zhao, Jiang Yan; Sang, Yan Fang; Chen, Jie
2018-04-01
A stochastic hydrological process is influenced by both stochastic and deterministic factors. A hydrological time series contains not only pure random components reflecting its inheritance characteristics, but also deterministic components reflecting variability characteristics, such as jump, trend, period, and stochastic dependence. As a result, the stochastic hydrological process presents complicated evolution phenomena and rules. To better understand these complicated phenomena and rules, this study described the inheritance and variability characteristics of an inconsistent hydrological series from two aspects: stochastic process simulation and time series analysis. In addition, several frequency analysis approaches for inconsistent time series were compared to reveal the main problems in inconsistency study. We then proposed a new concept of hydrological genes, originating from biological genes, to describe inconsistent hydrological processes. The hydrological genes were constructed using moment methods, such as general moments, weight function moments, probability-weighted moments, and L-moments. Meanwhile, the five components of a stochastic hydrological process, namely the jump, trend, periodic, dependence, and pure random components, were defined as five hydrological bases. With this method, the inheritance and variability of inconsistent hydrological time series were synthetically considered, and the inheritance, variability, and evolution principles were fully described. Our study contributes to revealing the inheritance, variability, and evolution principles in the probability distribution of hydrological elements.
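Of the moment methods listed, L-moments are typically estimated through unbiased probability-weighted moments (PWMs); a minimal sketch (the function name and the choice of reporting only the first three L-moments plus the L-CV are ours):

```python
import numpy as np

def l_moments(sample):
    """First three L-moments (lambda1..lambda3) and the L-CV of a
    sample, estimated via unbiased probability-weighted moments
    b0, b1, b2 computed from the order statistics."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1 = b0                      # mean
    l2 = 2 * b1 - b0             # L-scale
    l3 = 6 * b2 - 6 * b1 + b0    # third L-moment (skewness-related)
    return l1, l2, l3, l2 / l1   # L-CV in the last slot
```

For a uniform(0, 1) sample the estimates approach the theoretical values lambda1 = 1/2, lambda2 = 1/6, lambda3 = 0, which is a convenient sanity check before applying the estimator to streamflow records.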
InPRO: Automated Indoor Construction Progress Monitoring Using Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Hamledari, Hesam
In this research, an envisioned intelligent robotic solution for automated indoor data collection and inspection that employs a series of unmanned aerial vehicles (UAVs), entitled "InPRO", is presented. InPRO consists of four stages, namely: 1) automated path planning; 2) autonomous UAV-based indoor inspection; 3) automated computer vision-based assessment of progress; and 4) automated updating of 4D building information models (BIM). The work presented in this thesis addresses the third stage of InPRO. A series of computer vision-based methods that automate the assessment of construction progress using images captured at indoor sites are introduced. The proposed methods employ computer vision and machine learning techniques to detect the components of under-construction indoor partitions. In particular, framing (studs), insulation, electrical outlets, and different states of drywall sheets (installing, plastering, and painting) are automatically detected in digital images. High accuracy rates, real-time performance, and operation without a priori information are indicators of the methods' promising performance.
NASA Astrophysics Data System (ADS)
Yamada, Masayoshi; Fukuzawa, Masayuki; Kitsunezuka, Yoshiki; Kishida, Jun; Nakamori, Nobuyuki; Kanamori, Hitoshi; Sakurai, Takashi; Kodama, Souichi
1995-05-01
In order to detect pulsation in a series of noisy ultrasound-echo moving images of a newborn baby's head for pediatric diagnosis, a digital image processing system capable of recording at the video rate and processing the recorded series of images was constructed. The time-sequence variations of each pixel value in a series of moving images were analyzed, and an algorithm based on the Fourier transform was then developed for pulsation detection, noting that the pulsation associated with blood flow is periodically modulated by the heartbeat. Pulsation detection for pediatric diagnosis was successfully performed on a series of noisy ultrasound-echo moving images of a newborn baby's head by using the image processing system and the pulsation detection algorithm developed here.
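The per-pixel Fourier idea can be sketched on a synthetic intensity trace: a periodic pulsation buried in noise shows up as a dominant spectral peak in a physiologically plausible band. All numbers below (frame rate, heart rate, noise level, band limits) are illustrative, not from the paper:

```python
import numpy as np

fs = 30.0                      # frames per second (illustrative)
t = np.arange(0, 10, 1 / fs)   # 10 s of frames
heart_rate_hz = 2.0            # ~120 bpm pulsation buried in noise
rng = np.random.default_rng(0)
pixel = 100 + 5 * np.sin(2 * np.pi * heart_rate_hz * t) \
        + rng.normal(0, 2, t.size)

# Power spectrum of the zero-mean pixel intensity trace
spec = np.abs(np.fft.rfft(pixel - pixel.mean()))**2
freqs = np.fft.rfftfreq(pixel.size, d=1 / fs)

# Flag the pixel as pulsatile if its spectral peak falls in a
# plausible heartbeat band (here 1-4 Hz for a newborn)
band = (freqs >= 1.0) & (freqs <= 4.0)
peak_freq = freqs[band][np.argmax(spec[band])]
```

Repeating this over every pixel yields a map of pulsatile regions, which is the essence of the detection scheme described in the abstract.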
Implicit Wiener series analysis of epileptic seizure recordings.
Barbero, Alvaro; Franz, Matthias; van Drongelen, Wim; Dorronsoro, José R; Schölkopf, Bernhard; Grosse-Wentrup, Moritz
2009-01-01
Implicit Wiener series are a powerful tool to build Volterra representations of time series with any degree of non-linearity. A natural question is then whether higher order representations yield more useful models. In this work we shall study this question for ECoG data channel relationships in epileptic seizure recordings, considering whether quadratic representations yield more accurate classifiers than linear ones. To do so we first show how to derive statistical information on the Volterra coefficient distribution and how to construct seizure classification patterns over that information. As our results illustrate, a quadratic model seems to provide no advantages over a linear one. Nevertheless, we shall also show that the interpretability of the implicit Wiener series provides insights into the inter-channel relationships of the recordings.
NASA Astrophysics Data System (ADS)
Feigin, Alexander; Gavrilov, Andrey; Loskutov, Evgeny; Mukhin, Dmitry
2015-04-01
Proper decomposition of a complex system into well separated "modes" is a way to reveal and understand the mechanisms governing the system's behaviour, as well as to discover essential feedbacks and nonlinearities. Decomposition is also a natural procedure for constructing adequate and concurrently simplest models both of the corresponding sub-systems and of the system as a whole. In recent works, two new methods of decomposition of the Earth's climate system into well separated modes were discussed. The first method [1-3] is based on MSSA (Multichannel Singular Spectral Analysis) [4] for linearly expanding vector (space-distributed) time series and makes allowance for delayed correlations of the processes recorded at spatially separated points. The second one [5-7] allows the construction of nonlinear dynamic modes, but neglects delays in the correlations. It was demonstrated [1-3] that the first method provides effective separation of different time scales, but prevents correct reduction of the data dimension: the slope of the variance spectrum of the spatio-temporal empirical orthogonal functions, which are the "structural material" for linear spatio-temporal modes, is too flat. The second method overcomes this problem: the variance spectrum of nonlinear modes falls off much more sharply [5-7]. However, neglecting time-lagged correlations introduces a mode-selection error that is uncontrolled and increases with the mode's time scale. In this report we combine these two methods in such a way that the resulting algorithm allows the construction of nonlinear spatio-temporal modes. The algorithm is applied to the decomposition of (i) several hundred years of globally distributed data generated by the INM RAS Coupled Climate Model [8], and (ii) a 156-year time series of SST anomalies distributed over the globe [9].
We compare the efficiency of different methods of decomposition and discuss the ability of nonlinear spatio-temporal modes to yield adequate and concurrently simplest ("optimal") models of climate systems. 1. Feigin A.M., Mukhin D., Gavrilov A., Volodin E.M., and Loskutov E.M. (2013) "Separation of spatial-temporal patterns ("climatic modes") by combined analysis of really measured and generated numerically vector time series", AGU 2013 Fall Meeting, Abstract NG33A-1574. 2. Alexander Feigin, Dmitry Mukhin, Andrey Gavrilov, Evgeny Volodin, and Evgeny Loskutov (2014) "Approach to analysis of multiscale space-distributed time series: separation of spatio-temporal modes with essentially different time scales", Geophysical Research Abstracts, Vol. 16, EGU2014-6877. 3. Dmitry Mukhin, Dmitri Kondrashov, Evgeny Loskutov, Andrey Gavrilov, Alexander Feigin, and Michael Ghil (2014) "Predicting critical transitions in ENSO models, Part II: Spatially dependent models", Journal of Climate (accepted, doi: 10.1175/JCLI-D-14-00240.1). 4. Ghil, M., R. M. Allen, M. D. Dettinger, K. Ide, D. Kondrashov, et al. (2002) "Advanced spectral methods for climatic time series", Rev. Geophys. 40(1), 3.1-3.41. 5. Dmitry Mukhin, Andrey Gavrilov, Evgeny M Loskutov and Alexander M Feigin (2014) "Nonlinear Decomposition of Climate Data: a New Method for Reconstruction of Dynamical Modes", AGU 2014 Fall Meeting, Abstract NG43A-3752. 6. Andrey Gavrilov, Dmitry Mukhin, Evgeny Loskutov, and Alexander Feigin (2015) "Empirical decomposition of climate data into nonlinear dynamic modes", Geophysical Research Abstracts, Vol. 17, EGU2015-627. 7. Dmitry Mukhin, Andrey Gavrilov, Evgeny Loskutov, Alexander Feigin, and Juergen Kurths (2015) "Reconstruction of principal dynamical modes from climatic variability: nonlinear approach", Geophysical Research Abstracts, Vol. 17, EGU2015-5729. 8. http://83.149.207.89/GCM_DATA_PLOTTING/GCM_INM_DATA_XY_en.htm. 9.
http://iridl.ldeo.columbia.edu/SOURCES/.KAPLAN/.EXTENDED/.v2/.ssta/.
Use of a prototype pulse oximeter for time series analysis of heart rate variability
NASA Astrophysics Data System (ADS)
González, Erika; López, Jehú; Hautefeuille, Mathieu; Velázquez, Víctor; Del Moral, Jésica
2015-05-01
This work presents the development of a low-cost pulse oximeter prototype consisting of pulsed red and infrared commercial LEDs and a broad-spectrum photodetector used to register time series of heart rate and blood oxygen saturation. Besides providing these values like any other pulse oximeter, the platform processes the signals to compute a power spectrum analysis of the patient's heart rate variability in real time, and the device additionally allows access to all raw and analyzed data in case database construction or another kind of further analysis is desired. Since the prototype is capable of acquiring data for long periods of time, it is suitable for collecting data during real-life activities, enabling the development of future wearable applications.
Introduction to Drywall. Introduction to Construction Series. Instructor Edition.
ERIC Educational Resources Information Center
Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.
This competency-based curriculum guide on the specialty area of drywall is part of the Introduction to Construction series. The series is designed with the flexible training requirements of open shop contractors, preapprenticeship programs, multicraft high school programs, technology education programs, and cooperative education programs in mind.…
Introduction to Concrete Masonry. Introduction to Construction Series. Instructor Edition.
ERIC Educational Resources Information Center
Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.
This competency-based curriculum guide on the specialty area of concrete masonry is part of the Introduction to Construction series. The series is designed with the flexible training requirements of open shop contractors, preapprenticeship programs, multicraft high school programs, technology education programs, and cooperative education programs…
Introduction to Plumbing. Introduction to Construction Series. Instructor Edition.
ERIC Educational Resources Information Center
Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.
This competency-based curriculum guide on the specialty area of plumbing is part of the Introduction to Construction series. The series is designed with the flexible training requirements of open shop contractors, preapprenticeship programs, multicraft high school programs, technology education programs, and cooperative education programs in mind.…
Introduction to Sheet Metal. Introduction to Construction Series. Instructor Edition.
ERIC Educational Resources Information Center
Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.
This competency-based curriculum guide on the specialty area of sheet metal is part of the Introduction to Construction series. The series is designed with the flexible training requirements of open shop contractors, preapprenticeship programs, multicraft high school programs, technology education programs, and cooperative education programs in…
Signatures of ecological processes in microbial community time series.
Faust, Karoline; Bauchinger, Franziska; Laroche, Béatrice; de Buyl, Sophie; Lahti, Leo; Washburne, Alex D; Gonze, Didier; Widder, Stefanie
2018-06-28
Growth rates, interactions between community members, stochasticity, and immigration are important drivers of microbial community dynamics. In sequencing data analysis, such as network construction and community model parameterization, we make implicit assumptions about the nature of these drivers and thereby restrict model outcome. Despite apparent risk of methodological bias, the validity of the assumptions is rarely tested, as comprehensive procedures are lacking. Here, we propose a classification scheme to determine the processes that gave rise to the observed time series and to enable better model selection. We implemented a three-step classification scheme in R that first determines whether dependence between successive time steps (temporal structure) is present in the time series and then assesses with a recently developed neutrality test whether interactions between species are required for the dynamics. If the first and second tests confirm the presence of temporal structure and interactions, then parameters for interaction models are estimated. To quantify the importance of temporal structure, we compute the noise-type profile of the community, which ranges from black in case of strong dependency to white in the absence of any dependency. We applied this scheme to simulated time series generated with the Dirichlet-multinomial (DM) distribution, Hubbell's neutral model, the generalized Lotka-Volterra model and its discrete variant (the Ricker model), and a self-organized instability model, as well as to human stool microbiota time series. The noise-type profiles for all but DM data clearly indicated distinctive structures. The neutrality test correctly classified all but DM and neutral time series as non-neutral. The procedure reliably identified time series for which interaction inference was suitable. 
Both tests were required, as we demonstrated that all structured time series, including those generated with the neutral model, achieved a moderate to high goodness of fit to the Ricker model. We present a fast and robust scheme to classify community structure and to assess the prevalence of interactions directly from microbial time series data. The procedure not only serves to determine ecological drivers of microbial dynamics, but also to guide selection of appropriate community models for prediction and follow-up analysis.
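The noise-type profile (black/brown for strong temporal dependency through white for none) can be illustrated by classifying a series according to the slope of its log-log power spectrum; the labels and thresholds below are illustrative, not the authors' procedure:

```python
import numpy as np

def noise_color(series):
    """Rough noise-type label from the log-log power-spectrum slope:
    ~0 for white noise, ~-2 for brown/black noise (random walks).
    Thresholds are illustrative sketch values."""
    x = np.asarray(series, dtype=float)
    spec = np.abs(np.fft.rfft(x - x.mean()))**2
    f = np.fft.rfftfreq(x.size)
    mask = f > 0                         # drop the DC bin
    slope = np.polyfit(np.log(f[mask]), np.log(spec[mask]), 1)[0]
    if slope > -0.5:
        return "white", slope
    if slope > -1.5:
        return "pink", slope
    return "brown", slope
```

A white-noise series yields a near-zero slope, while its cumulative sum (a random walk, i.e. strong dependence between successive steps) yields a slope near -2, mirroring the white-to-black axis of the community noise-type profile.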
Nonlinear Dynamics of River Runoff Elucidated by Horizontal Visibility Graphs
NASA Astrophysics Data System (ADS)
Lange, Holger; Rosso, Osvaldo A.
2017-04-01
We investigate a set of long-term river runoff time series at daily resolution from Brazil, monitored by the Agencia Nacional de Aguas. A total of 150 time series was obtained, with an average length of 65 years. Both long-term trends and human influence (water management, e.g. for power production) on the dynamical behaviour are analyzed. We use Horizontal Visibility Graphs (HVGs) to determine the individual temporal networks for the time series, and extract their degree and their distance (shortest path length) distributions. Statistical and information-theoretic properties of these distributions are calculated: robust estimators of skewness and kurtosis, the maximum degree occurring in the time series, the Shannon entropy, permutation complexity and Fisher information. For the latter, we also compare the information measures obtained from the degree distributions to those computed from the original time series directly, to investigate the impact of graph construction on the dynamical properties as reflected in these measures. Focus is on universal properties of the HVG common to all runoff series on the one hand, and on site-specific aspects on the other. Results demonstrate that the assumption of power law behaviour for the degree distribution does not generally hold, and that management has a significant impact on this distribution. We also show that a specific pretreatment of the time series conventional in hydrology, the elimination of seasonality by a separate z-transformation for each calendar day, is highly detrimental to the nonlinear behaviour. It changes long-term correlations and the overall dynamics towards more random behaviour. Analysis based on the transformed data easily leads to spurious results and bears a high risk of misinterpretation.
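The HVG construction itself is simple: nodes i < j are linked iff every intermediate value lies strictly below both endpoints. A minimal sketch (the early-exit scan and function name are ours):

```python
import numpy as np

def horizontal_visibility_graph(series):
    """Edge list of the horizontal visibility graph: (i, j) is an edge
    iff x[k] < min(x[i], x[j]) for every k with i < k < j."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    edges = []
    for i in range(n - 1):
        blocker = -np.inf          # tallest value seen between i and j
        for j in range(i + 1, n):
            if blocker < min(x[i], x[j]):
                edges.append((i, j))
            if x[j] >= x[i]:
                break              # nothing past j is visible from i
            blocker = max(blocker, x[j])
    return edges
```

For the series [3, 1, 2, 4], the HVG links all consecutive pairs plus (0, 2) and (0, 3); degree and shortest-path-length distributions, as analyzed in the paper, follow directly from this edge list.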
Complex network construction based on user group attention sequence
NASA Astrophysics Data System (ADS)
Zhang, Gaowei; Xu, Lingyu; Wang, Lei
2018-04-01
In traditional complex network construction, it is common to use the similarity between nodes to define the weights of the network and thereby build the network. However, this approach tends to focus only on the coupling between nodes, while ignoring the information transfer between nodes and its directionality. In the network public opinion space, based on the set of stock series that network user groups pay attention to within a certain period of time, we vectorize the different stocks and build a complex network.
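One simple way to recover directionality, in contrast to symmetric similarity weights, is to link u -> v when past values of u correlate with future values of v. This is a generic lagged-correlation sketch, not the authors' construction; the function name, lag, and threshold are illustrative:

```python
import numpy as np

def lagged_correlation_network(series_matrix, lag=1, threshold=0.5):
    """Directed weighted adjacency: A[u, v] is the lag-`lag`
    cross-correlation between series u (earlier) and series v (later),
    kept only when its magnitude exceeds `threshold`."""
    X = np.asarray(series_matrix, dtype=float)
    m, n = X.shape
    A = np.zeros((m, m))
    for u in range(m):
        for v in range(m):
            if u == v:
                continue
            a, b = X[u, :n - lag], X[v, lag:]
            c = np.corrcoef(a, b)[0, 1]
            if abs(c) > threshold:
                A[u, v] = c
    return A
```

When one series simply tracks another with a one-step delay, the network contains the edge leader -> follower but not the reverse, which is exactly the asymmetry that symmetric similarity weights cannot express.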
Nonlinear analysis of the occurrence of hurricanes in the Gulf of Mexico and the Caribbean Sea
NASA Astrophysics Data System (ADS)
Rojo-Garibaldi, Berenice; Salas-de-León, David Alberto; Adela Monreal-Gómez, María; Sánchez-Santillán, Norma Leticia; Salas-Monreal, David
2018-04-01
Hurricanes are complex systems that carry large amounts of energy. Their impact often produces natural disasters involving the loss of human lives and of materials and infrastructure valued at billions of US dollars. However, not everything about hurricanes is negative, as hurricanes are the main source of rainwater for the regions where they develop. This study presents a nonlinear analysis of the time series of hurricane occurrence in the Gulf of Mexico and the Caribbean Sea from 1749 to 2012. The hurricane time series was constructed from the North Atlantic basin hurricane database (HURDAT) and published historical information, and it provides a unique historical record of ocean-atmosphere interactions. The Lyapunov exponent indicated that the system presents chaotic dynamics, and the spectral and nonlinear analyses of the hurricane time series showed behavior at the chaotic edge. One possible explanation for this chaotic edge is the individual chaotic behavior of hurricanes, whether considered by category or individually regardless of category, together with their behavior on a regular basis.
NASA Astrophysics Data System (ADS)
Vlasenko, A. V.; Sizonenko, A. B.; Zhdanov, A. A.
2018-05-01
Discrete time series, or mappings, are proposed for describing the dynamics of a nonlinear system. The article considers the problem of forecasting the dynamics of the system from the time series it generates. In particular, the commercial rate of drilling oil and gas wells can be considered as a series in which each next value depends on the previous one. The main parameter here is the technical drilling speed. With the aim of eliminating measurement error and presenting the commercial speed of the object with good accuracy at the current, a future, or any elapsed time point, the use of the Kalman filter is suggested. For the transition from a deterministic model to a probabilistic one, ensemble modeling is suggested. Ensemble systems can provide a wide range of visual output, which helps the user to evaluate the measure of confidence in the model. In particular, the availability of information on the estimated calendar duration of the construction of oil and gas wells will allow drilling companies to optimize production planning by rationalizing the approach to loading drilling rigs, which ultimately leads to maximization of profit and an increase in their competitiveness.
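A scalar Kalman filter with a random-walk state model is one minimal way to smooth such a noisy speed series; the process and measurement variances below are illustrative, not calibrated to drilling data:

```python
import numpy as np

def kalman_1d(measurements, q=0.01, r=1.0, x0=0.0, p0=1.0):
    """Scalar Kalman filter under a random-walk state model.
    q: process variance, r: measurement variance (both illustrative);
    returns the filtered state estimate at each step."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # predict: the state follows a random walk
        p = p + q
        # update: blend the prediction with the new measurement
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)
        p = (1 - k) * p
        estimates.append(x)
    return np.array(estimates)
```

On a noisy series around a constant true speed, the filtered estimate has a much lower error than the raw measurements, which is the error-elimination role the abstract assigns to the filter; ensemble runs with perturbed q and r would then give the probabilistic spread it mentions.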
"Time-dependent flow-networks"
NASA Astrophysics Data System (ADS)
Tupikina, Liubov; Molkentin, Nora; Lopez, Cristobal; Hernandez-Garcia, Emilio; Marwan, Norbert; Kurths, Jürgen
2015-04-01
Complex networks have been successfully applied to various systems such as society, technology, and, recently, climate. Links in a climate network are defined between two geographical locations if the correlation between the time series of some climate variable is higher than a threshold. Network links are therefore considered to imply information or heat exchange. However, the relationship between the oceanic and atmospheric flows and the climate network's structure is still unclear. Recently, a theoretical approach verifying the correlation between ocean currents and surface air temperature networks has been introduced, in which Pearson correlation networks were constructed from advection-diffusion dynamics on an underlying flow. Since the continuous approach has its limitations, i.e. high computational complexity and a fixed variety of flows in the underlying system, we introduce a new method of flow-networks for time-varying velocity fields that includes external forcing, noise, and temperature decay. The flow-network construction can be divided into several steps: first, we obtain the linear recursive equation for the temperature time series. Then we compute the correlation matrix of the time series, averaging the tensor product over all realizations of the noise; we interpret this matrix as a weighted adjacency matrix of the flow-network and analyze it using network measures. We apply the method to different types of moving flows with geographical relevance, such as a meandering flow. Analyzing the flow-networks using network measures, we find that our approach can highlight zones of high velocity by degree and transition zones by betweenness, while the combination of these network measures can uncover how the flow propagates over time. Flow-networks can be a powerful tool for understanding the connection between a system's dynamics and its network topology, analyzed using network measures, in order to shed light on different climatic phenomena.
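A toy version of the linear recursion in the abstract: temperature on a ring of cells is advected and mixed one step per iteration with decay and noise, and the resulting correlation matrix is read as a flow-network adjacency. The ring size, mixing, decay, and threshold are all illustrative:

```python
import numpy as np

# T_{t+1} = decay * P @ T_t + noise, on a ring with uniform flow
n, steps, decay = 20, 5000, 0.8
shift = np.roll(np.eye(n), 1, axis=1)   # shift operator = uniform flow
P = 0.5 * (np.eye(n) + shift)           # advection with local mixing
rng = np.random.default_rng(1)
T = np.zeros(n)
history = np.empty((steps, n))
for t in range(steps):
    T = decay * (P @ T) + rng.standard_normal(n)
    history[t] = T

C = np.corrcoef(history.T)              # correlation matrix ~ adjacency

# Threshold into an unweighted flow-network and take node degrees;
# only cells close along the flow remain linked on this ring.
A = (np.abs(C) > 0.15) & ~np.eye(n, dtype=bool)
degree = A.sum(axis=1)
```

Correlations decay with distance along the flow, so degree (and, on richer flows, betweenness) picks out the underlying transport structure, which is the core idea of the flow-network method.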
Hangar Fire Suppression Utilizing Novec 1230
2018-01-01
…fuel fires in aircraft hangars. A 30×30×8-ft concrete-and-steel test structure was constructed for this test series. Four discharge assemblies…structure. System discharge parameters (discharge time, discharge rate, and quantity of agent discharged) were adjusted to produce the desired Novec 1230…
Identifying Changes of Complex Flood Dynamics with Recurrence Analysis
NASA Astrophysics Data System (ADS)
Wendi, D.; Merz, B.; Marwan, N.
2016-12-01
Temporal changes in a flood hazard system are known to be difficult to detect and attribute due to multiple drivers that include complex processes that are non-stationary and highly variable. These drivers, such as human-induced climate change, natural climate variability, implementation of flood defenses, river training, or land use change, can act variably on different space-time scales and influence or mask each other. Flood time series may show complex behavior that varies over a range of time scales and may cluster in time. Moreover, hydrological time series (i.e. discharge) are often subject to measurement errors, such as rating curve errors, especially in the case of extremes, where observations are actually derived through extrapolation. This study focuses on the application of recurrence-based data analysis techniques (recurrence plots) for understanding and quantifying spatio-temporal changes in flood hazard in Germany. The recurrence plot is known as an effective tool for visualizing the dynamics of phase space trajectories, i.e. those constructed from a time series by using an embedding dimension and a time delay, and it is known to be effective in analyzing non-stationary and non-linear time series. The sensitivity of recurrence analysis to common measurement errors and noise will also be analyzed and evaluated against conventional methods. The emphasis will be on the identification of characteristic recurrence properties that could associate typical dynamics with certain flood events.
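The recurrence plot itself can be sketched in a few lines: time-delay embed the series, compute pairwise distances, and threshold. The embedding dimension, delay, and 10%-of-maximum threshold below are illustrative defaults, not the study's settings:

```python
import numpy as np

def recurrence_plot(x, dim=3, delay=2, eps=None):
    """Binary recurrence matrix from a time-delay embedding of x.
    R[i, j] = 1 when embedded states i and j lie within eps of each
    other (default eps: 10% of the maximum phase-space distance)."""
    x = np.asarray(x, dtype=float)
    m = len(x) - (dim - 1) * delay            # number of embedded states
    emb = np.column_stack([x[k * delay : k * delay + m]
                           for k in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    if eps is None:
        eps = 0.1 * d.max()
    return (d <= eps).astype(int)
```

Recurrence quantification measures (recurrence rate, determinism, laminarity) are then computed from the diagonal and vertical line structures of this matrix.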
Minimum complexity echo state network.
Rodan, Ali; Tino, Peter
2011-01-01
Reservoir computing (RC) refers to a new class of state-space models with a fixed state transition structure (the reservoir) and an adaptable readout from the state space. The reservoir is supposed to be sufficiently complex so as to capture a large number of features of the input stream that can be exploited by the reservoir-to-output readout mapping. The field of RC has been growing rapidly with many successful applications. However, RC has been criticized for not being principled enough. Reservoir construction is largely driven by a series of randomized model-building stages, with both researchers and practitioners having to rely on a series of trials and errors. To initialize a systematic study of the field, we concentrate on one of the most popular classes of RC methods, namely echo state networks, and ask: What is the minimal complexity of reservoir construction for obtaining competitive models, and what is the memory capacity (MC) of such simplified reservoirs? On a number of widely used time series benchmarks of different origin and characteristics, as well as by conducting a theoretical analysis, we show that a simple deterministically constructed cycle reservoir is comparable to the standard echo state network methodology. The (short-term) MC of linear cyclic reservoirs can be made arbitrarily close to the proven optimal value.
33 CFR 149.315 - What embarkation, launching, and recovery arrangements must rescue boats meet?
Code of Federal Regulations, 2014 CFR
2014-07-01
... GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION... the shortest possible time. (c) If the rescue boat is one of the deepwater port's survival craft, then... disengaging apparatus, approved under approval series 160.170, instead of a lifeboat release mechanism. (f...
Selection of Marine Corps Drill Instructors
1980-03-01
Construction and Cross-Validation Statistics for Drill Instructor School Performance Success Keys... Race, and School Attrition... Key-Construction and Cross-Validation Statistics for Drill... constructed form, the Alternation Ranking of Series Drill Instructors. In this form, DIs in a Series are ranked from highest to lowest in terms of their
NASA Astrophysics Data System (ADS)
Tarmizi, H. B.; Daulay, M.; Muda, I.
2017-03-01
This study aims to test the aggregation of the economic growth of North Sumatra and the influence of the Tax on Acquisition of Land and Buildings on the Construction Cost Index in North Sumatra. This type of research is an explanatory survey with quantitative methods. The population and sample comprise districts in North Sumatra, observed with time-series and cross-sectional data. The analysis tool used is multiple regression. The results show that there is aggregation of economic growth in North Sumatra and that the Tax on Acquisition of Land and Buildings affects the Construction Cost Index.
A method for analyzing temporal patterns of variability of a time series from Poincare plots.
Fishman, Mikkel; Jacono, Frank J; Park, Soojin; Jamasebi, Reza; Thungtong, Anurak; Loparo, Kenneth A; Dick, Thomas E
2012-07-01
The Poincaré plot is a popular two-dimensional time series analysis tool because of its intuitive display of dynamic system behavior. Poincaré plots have been used to visualize heart rate and respiratory pattern variabilities. However, conventional quantitative analysis relies primarily on statistical measurements of the cumulative distribution of points, making it difficult to interpret irregular or complex plots. Moreover, the plots are constructed to reflect highly correlated regions of the time series, reducing the amount of nonlinear information that is presented and thereby hiding potentially relevant features. We propose temporal Poincaré variability (TPV), a novel analysis methodology that uses standard techniques to quantify the temporal distribution of points and to detect nonlinear sources responsible for physiological variability. In addition, the analysis is applied across multiple time delays, yielding richer insight into system dynamics than the traditional circle return plot. The method is applied to data sets of R-R intervals and to synthetic point process data extracted from the Lorenz time series. The results demonstrate that TPV complements the traditional analysis and can be applied more generally and more consistently than the conventional measures, including to Poincaré plots with multiple clusters, and can address questions regarding potential structure underlying the variability of a data set.
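For reference, the lagged Poincaré plot and the conventional SD1/SD2 descriptors that TPV is contrasted with reduce to a few lines; the function names are ours:

```python
import numpy as np

def poincare(rr, lag=1):
    """Point cloud of the lagged Poincaré plot: pairs (x_t, x_{t+lag})."""
    rr = np.asarray(rr, float)
    return rr[:-lag], rr[lag:]

def sd1_sd2(rr, lag=1):
    """Dispersion perpendicular to (SD1) and along (SD2) the identity line."""
    x, y = poincare(rr, lag)
    sd1 = np.std((y - x) / np.sqrt(2))
    sd2 = np.std((y + x) / np.sqrt(2))
    return sd1, sd2
```

TPV, by contrast, quantifies how the points are distributed in time and repeats the analysis across multiple lags.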
VizieR Online Data Catalog: Evolution of solar irradiance during Holocene (Vieira+, 2011)
NASA Astrophysics Data System (ADS)
Vieira, L. E. A.; Solanki, S. K.; Krivova, N. A.; Usoskin, I.
2011-05-01
This is a composite total solar irradiance (TSI) time series for 9495BC to 2007AD constructed as described in Sect. 3.3 of the paper. Since the TSI is the main external heat input into the Earth's climate system, a consistent record covering as long a period as possible is needed for climate models. This was our main motivation for constructing this composite TSI time series. In order to produce a representative time series, we divided the Holocene into four periods according to the available data for each period. Table 4 (see below) summarizes the periods considered and the models available for each period. After the end of the Maunder Minimum we compute daily values, while prior to the end of the Maunder Minimum we compute 10-year averages. For the period for which both solar disk magnetograms and continuum images are available (period 1) we employ the SATIRE-S reconstruction (Krivova et al. 2003A&A...399L...1K; Wenzler et al. 2006A&A...460..583W). The SATIRE-T reconstruction (Krivova et al. 2010JGRA..11512112K) is used from the beginning of the Maunder Minimum (approximately 1640AD) to 1977AD. Prior to 1640AD reconstructions are based on cosmogenic isotopes (this paper). Different models of the Earth's geomagnetic field are available before and after approximately 5000BC. Therefore we treat periods 3 and 4 (before and after 5000BC) separately. Further details can be found in the paper. We emphasize that the reconstructions based on different proxies have different time resolutions. (1 data file).
Kim, Kiyeon; Omori, Ryosuke; Ito, Kimihito
2017-12-01
The estimation of the basic reproduction number is essential to understand epidemic dynamics, and time series data of infected individuals are usually used for the estimation. However, such data are not always available. Methods to estimate the basic reproduction number using genealogies constructed from nucleotide sequences of pathogens have previously been proposed. Here, we propose a new method to estimate epidemiological parameters of outbreaks using the time series change of Tajima's D statistic on the nucleotide sequences of pathogens. To relate the time evolution of Tajima's D to the number of infected individuals, we constructed a parsimonious mathematical model describing both the transmission process of pathogens among hosts and the evolutionary process of the pathogens. As a case study we applied this method to field data of nucleotide sequences of pandemic influenza A (H1N1) 2009 viruses collected in Argentina. The Tajima's D-based method estimated the basic reproduction number to be 1.55 with 95% highest posterior density (HPD) between 1.31 and 2.05, and the date of the epidemic peak to be 10th July with 95% HPD between 22nd June and 9th August. The estimated basic reproduction number was consistent with estimates obtained from a birth-death skyline plot and from the time series of the number of infected individuals. These results suggest that Tajima's D statistic on nucleotide sequences of pathogens can be useful for estimating epidemiological parameters of outbreaks. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
A novel approach for detecting heat waves: the Standardized Heat-Wave Index.
NASA Astrophysics Data System (ADS)
Cucchi, Marco; Petitta, Marcello; Calmanti, Sandro
2016-04-01
Extreme temperatures have an impact on the energy balance of any living organism and on the operational capabilities of critical infrastructures. The ability to capture the occurrence of extreme temperature events is therefore an essential property of a multi-hazard extreme climate indicator. In this paper we introduce a new index for the detection of such extreme temperature events, called the SHI (Standardized Heat-Wave Index), developed in the context of the XCF project for the construction of a multi-hazard extreme climate indicator (ECI). The SHI is a probabilistic index based on the analysis of maximum daily temperature time series; it is standardized, enabling comparisons over space/time and with other indices, and it is capable of describing both extreme cold and hot events. Given a particular location, the SHI is constructed from the time series of local maximum daily temperatures with the following procedure: a three-day cumulated maximum daily temperature is assigned to each day of the time series; for each of these values, the probability of occurrence within the calendar month the reference day belongs to is computed; these probability values are then projected onto a standard normal distribution, yielding the standardized index. In this work we present results obtained using the NCEP Reanalysis dataset for air temperature at the sigma 0.995 level, whose timespan ranges from 1948 to 2014. Given the specific framework of this work, the geographical focus of the study is limited to the African continent. We present a validation of the index by showing its use for monitoring heat-waves under different climate regimes.
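The three-step procedure can be sketched as follows; the backward three-day window and the rank-based probability estimate are our assumptions about details the abstract leaves open:

```python
import numpy as np
from statistics import NormalDist

def shi(tmax, months):
    """Standardized Heat-Wave Index sketch: 3-day cumulated daily maxima ->
    month-wise empirical probabilities -> standard-normal quantiles."""
    tmax = np.asarray(tmax, float)
    # backward 3-day cumulation (the first two days use the available history)
    cum3 = np.convolve(tmax, np.ones(3), mode="full")[:len(tmax)]
    index = np.empty_like(cum3)
    inv_cdf = NormalDist().inv_cdf
    for m in np.unique(months):
        sel = np.where(months == m)[0]
        ranks = cum3[sel].argsort().argsort() + 1   # 1..n within the month
        p = ranks / (len(sel) + 1.0)                # empirical probability
        index[sel] = [inv_cdf(v) for v in p]        # project on N(0, 1)
    return index
```

By construction the index is roughly standard-normal within each calendar month, so hot and cold extremes map to large positive and negative values respectively.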
Investigation on Law and Economics Based on Complex Network and Time Series Analysis.
Yang, Jian; Qu, Zhao; Chang, Hui
2015-01-01
The research focuses on the cooperative relationships and strategic tendencies among three mutually interacting parties in financing: small enterprises, commercial banks, and micro-credit companies. Complex network theory and time series analysis were applied to obtain quantitative evidence. Moreover, this paper builds a fundamental model describing the interaction among them through evolutionary game theory. Combining the results of the data analysis with the current situation, it is justifiable to put forward reasonable legislative recommendations for regulations on lending activities among small enterprises, commercial banks, and micro-credit companies. The approach in this research provides a framework for constructing mathematical models and applying econometrics and evolutionary games to the issue of corporate financing.
NASA Astrophysics Data System (ADS)
Yashima, Kenta; Ito, Kana; Nakamura, Kazuyuki
2013-03-01
When an infectious disease prevails throughout a population, epidemic parameters such as the basic reproduction ratio and the initial point of infection are estimated from time series data of the infected population. However, it is unclear how the structure of the host population affects this estimation accuracy. In other words, for what kind of city is it difficult to estimate epidemic parameters? To answer this question, we simulate epidemic data by constructing commuting networks with different structures and running the infection process over each network. From the resulting time series data for each network structure, we analyze the estimation accuracy of the epidemic parameters.
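A minimal version of such a simulation: discrete-time SIR dynamics run over a contact network given as an adjacency list, returning the time series of infectious counts from which parameters would be re-estimated. The transmission and recovery probabilities are illustrative, not values from the study:

```python
import numpy as np

def simulate_sir(adj, beta=0.3, gamma=0.1, i0=1, steps=100, seed=0):
    """Discrete-time SIR on a contact network (adjacency list).
    Returns the per-step count of infectious nodes."""
    rng = np.random.default_rng(seed)
    n = len(adj)
    state = np.zeros(n, dtype=int)                 # 0 = S, 1 = I, 2 = R
    state[rng.choice(n, size=i0, replace=False)] = 1
    curve = []
    for _ in range(steps):
        infected = np.where(state == 1)[0]
        for i in infected:                         # transmission along edges
            for j in adj[i]:
                if state[j] == 0 and rng.random() < beta:
                    state[j] = 1
        # recovery of previously infectious nodes
        state[infected[rng.random(len(infected)) < gamma]] = 2
        curve.append(int((state == 1).sum()))
    return curve
```

Running this over networks with different commuting structures, then fitting epidemic parameters back to each curve, reproduces the kind of experiment the abstract describes.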
Application of computational mechanics to the analysis of natural data: an example in geomagnetism.
Clarke, Richard W; Freeman, Mervyn P; Watkins, Nicholas W
2003-01-01
We discuss how the ideal formalism of computational mechanics can be adapted to apply to a noninfinite series of corrupted and correlated data, that is typical of most observed natural time series. Specifically, a simple filter that removes the corruption that creates rare unphysical causal states is demonstrated, and the concept of effective soficity is introduced. We believe that computational mechanics cannot be applied to a noisy and finite data series without invoking an argument based upon effective soficity. A related distinction between noise and unresolved structure is also defined: Noise can only be eliminated by increasing the length of the time series, whereas the resolution of previously unresolved structure only requires the finite memory of the analysis to be increased. The benefits of these concepts are demonstrated in a simulated time series by (a) the effective elimination of white noise corruption from a periodic signal using the expletive filter and (b) the appearance of an effectively sofic region in the statistical complexity of a biased Poisson switch time series that is insensitive to changes in the word length (memory) used in the analysis. The new algorithm is then applied to an analysis of a real geomagnetic time series measured at Halley, Antarctica. Two principal components in the structure are detected that are interpreted as the diurnal variation due to the rotation of the Earth-based station under an electrical current pattern that is fixed with respect to the Sun-Earth axis and the random occurrence of a signature likely to be that of the magnetic substorm. In conclusion, some useful terminology for the discussion of model construction in general is introduced.
Visibility graphlet approach to chaotic time series
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mutua, Stephen; Computer Science Department, Masinde Muliro University of Science and Technology, P.O. Box 190-50100, Kakamega; Gu, Changgui, E-mail: gu-changgui@163.com, E-mail: hjyang@ustc.edu.cn
Many novel methods have been proposed for mapping time series into complex networks. Although some dynamical behaviors can be effectively captured by existing approaches, the preservation and tracking of the temporal behaviors of a chaotic system remains an open problem. In this work, we extended the visibility graphlet approach to investigate both discrete and continuous chaotic time series. We applied visibility graphlets to capture the reconstructed local states, so that each is treated as a node and tracked downstream to create a temporal chain link. Our empirical findings show that the approach accurately captures the dynamical properties of chaotic systems. Networks constructed from periodic dynamic phases all converge to regular networks and to unique network structures for each model in the chaotic zones. Furthermore, our results show that the characterization of chaotic and non-chaotic zones in the Lorenz system corresponds to the maximal Lyapunov exponent, thus providing a simple and straightforward way to analyze chaotic systems.
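A common starting point for such time-series-to-network mappings is the horizontal visibility graph, in which observations i and j are linked when every observation between them lies strictly below both. A sketch (the graphlet chaining of the paper is a further step not shown here):

```python
def horizontal_visibility_graph(x):
    """Edge list (i, j), i < j, of the horizontal visibility graph:
    i and j are linked iff x[k] < min(x[i], x[j]) for all i < k < j."""
    n = len(x)
    edges = []
    for i in range(n - 1):
        edges.append((i, i + 1))        # neighbours always see each other
        level = x[i + 1] if i + 1 < n else None
        for j in range(i + 2, n):
            # level is the running maximum of x[i+1 .. j-1]
            if x[i] > level and x[j] > level:
                edges.append((i, j))
            level = max(level, x[j])
            if level >= x[i]:
                break                   # nothing further can see node i
    return edges
```

Degree distributions of such graphs distinguish periodic from chaotic series, which is the kind of dynamical signature the graphlet extension tracks over time.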
Credit Default Swaps networks and systemic risk
Puliga, Michelangelo; Caldarelli, Guido; Battiston, Stefano
2014-01-01
Credit Default Swaps (CDS) spreads should reflect default risk of the underlying corporate debt. Actually, it has been recognized that CDS spread time series did not anticipate but only followed the increasing risk of default before the financial crisis. In principle, the network of correlations among CDS spread time series could at least display some form of structural change to be used as an early warning of systemic risk. Here we study a set of 176 CDS time series of financial institutions from 2002 to 2011. Networks are constructed in various ways, some of which display structural change at the onset of the credit crisis of 2008, but never before. By taking these networks as a proxy of interdependencies among financial institutions, we run stress-test based on Group DebtRank. Systemic risk before 2008 increases only when incorporating a macroeconomic indicator reflecting the potential losses of financial assets associated with house prices in the US. This approach indicates a promising way to detect systemic instabilities. PMID:25366654
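A threshold-based correlation network, one of the simplest of the construction methods compared in the paper, can be sketched as follows; the threshold value is illustrative:

```python
import numpy as np

def correlation_network(series, threshold=0.6):
    """Undirected network over time series: nodes are institutions,
    edges link pairs whose correlation exceeds `threshold`."""
    X = np.asarray(series, float)       # shape: (n_institutions, n_times)
    C = np.corrcoef(X)
    n = C.shape[0]
    edges = [(i, j) for i in range(n) for j in range(i + 1, n)
             if C[i, j] > threshold]
    return edges, C
```

Recomputing such a network in sliding windows and watching its structure (density, clustering, components) over time is the basic form of the structural-change monitoring described above.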
Inference for local autocorrelations in locally stationary models.
Zhao, Zhibiao
2015-04-01
For non-stationary processes, the time-varying correlation structure provides useful insights into the underlying model dynamics. We study estimation and inference for the local autocorrelation process in locally stationary time series. Our constructed simultaneous confidence band can be used to address important hypothesis testing problems, such as whether the local autocorrelation process is indeed time-varying and whether the local autocorrelation is zero. In particular, our result provides an important generalization of the R function acf() to locally stationary Gaussian processes. Simulation studies and two empirical applications are developed. For the global temperature series, we find that the local autocorrelations are time-varying and have a "V" shape during 1910-1960. For the S&P 500 index, we conclude that the returns satisfy the efficient-market hypothesis, whereas the magnitudes of returns show significant local autocorrelations.
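A rough empirical counterpart of the local autocorrelation process is the lag-1 autocorrelation computed in sliding windows; the paper's estimator and its simultaneous confidence band are considerably more refined than this sketch:

```python
import numpy as np

def local_acf(x, window=50, lag=1):
    """Lag-`lag` sample autocorrelation in sliding windows of length `window`."""
    x = np.asarray(x, float)
    out = []
    for start in range(len(x) - window + 1):
        w = x[start:start + window]
        out.append(np.corrcoef(w[:-lag], w[lag:])[0, 1])
    return np.array(out)
```

Plotting `local_acf` over time shows whether the autocorrelation is roughly constant (a stationary model suffices) or drifts, which is exactly the hypothesis the confidence band is designed to test.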
Koda, Satoru; Onda, Yoshihiko; Matsui, Hidetoshi; Takahagi, Kotaro; Yamaguchi-Uehara, Yukiko; Shimizu, Minami; Inoue, Komaki; Yoshida, Takuhiro; Sakurai, Tetsuya; Honda, Hiroshi; Eguchi, Shinto; Nishii, Ryuei; Mochida, Keiichi
2017-01-01
We report the comprehensive identification of periodic genes and their network inference, based on a gene co-expression analysis and an Auto-Regressive eXogenous (ARX) model with a group smoothly clipped absolute deviation (SCAD) method, using a time-series transcriptome dataset in a model grass, Brachypodium distachyon. To reveal the diurnal changes in the transcriptome of B. distachyon, we performed RNA-seq analysis of its leaves sampled through a diurnal cycle of over 48 h at 4 h intervals, using three biological replicates, and identified 3,621 periodic genes through our wavelet analysis. The expression data make it feasible to infer network sparsity based on ARX models. We found that genes involved in biological processes such as transcriptional regulation, protein degradation, post-transcriptional modification, and photosynthesis are significantly enriched among the periodic genes, suggesting that these processes might be regulated by the circadian rhythm in B. distachyon. On the basis of the time-series expression patterns of the periodic genes, we constructed a chronological gene co-expression network and identified putative transcription factor-encoding genes that might be involved in the time-specific transcriptional regulatory network. Moreover, we inferred a transcriptional network composed of the periodic genes in B. distachyon, aiming to identify genes associated with other genes through variable selection by grouping time points for each gene. Based on the ARX model with group SCAD regularization applied to our time-series expression datasets of the periodic genes, we constructed gene networks and found that they exhibit a typical scale-free structure. Our findings demonstrate that the diurnal changes in the transcriptome in B. distachyon leaves have a sparse network structure, revealing the spatiotemporal gene regulatory network over the cyclic phase transitions in B. distachyon diurnal growth.
NASA Astrophysics Data System (ADS)
Watkins, Nicholas; Clarke, Richard; Freeman, Mervyn
2002-11-01
We discuss how the ideal formalism of Computational Mechanics can be adapted to apply to a non-infinite series of corrupted and correlated data, as is typical of most observed natural time series. Specifically, a simple filter that removes the corruption that creates rare unphysical causal states is demonstrated, and the new concept of effective soficity is introduced. The benefits of these new concepts are demonstrated on simulated time series by (a) the effective elimination of white noise corruption from a periodic signal using the expletive filter and (b) the appearance of an effectively sofic region in the statistical complexity of a biased Poisson switch time series that is insensitive to changes in the word length (memory) used in the analysis. The new algorithm is then applied to the analysis of a real geomagnetic time series measured at Halley, Antarctica. Two principal components in the structure are detected that are interpreted as the diurnal variation due to the rotation of the Earth-based station under an electrical current pattern that is fixed with respect to the Sun-Earth axis and the random occurrence of a signature likely to be that of the magnetic substorm. In conclusion, a hypothesis is advanced about model construction in general (see also Clarke et al., arXiv:cond-mat/0110228).
Interventions for preventing injuries in the construction industry.
van der Molen, H F; Lehtola, M M; Lappalainen, J; Hoonakker, P L T; Hsiao, H; Haslam, R; Hale, A R; Verbeek, J
2007-10-17
Construction workers are frequently exposed to various types of injury-inducing hazards. A number of injury prevention interventions have been proposed, yet the effectiveness of these is uncertain. To assess the effects of interventions for preventing injuries among workers at construction sites. We searched the Cochrane Injuries Group's specialised register, CENTRAL, MEDLINE, EMBASE, PsycINFO, OSH-ROM (including NIOSHTIC and HSELINE), EI Compendex. The reference lists of relevant papers, reviews and websites were also searched. The searches were not restricted by language or publication status. All databases were searched up to June 2006. Randomized controlled trials, controlled before-after studies and interrupted time series of all types of interventions for preventing fatal and non-fatal injuries among workers at construction sites. Two authors independently extracted data and assessed study quality. For interrupted time series, we reanalysed the studies and used an initial effect, measured as the change in injury-rate in the year after the intervention, as well as a sustained effect, measured as the change in time trend before and after the intervention. Five interrupted time series studies met the inclusion criteria. Three studies evaluated the effect of regulations, one evaluated a safety campaign, and one a drug-free workplace program on fatal or non-fatal injuries compared to no drug-free workplace program. The overall methodological quality was low. The regulatory interventions did not show either an initial or sustained effect on fatal or non-fatal injuries, with effect sizes of 0.69 (95% confidence interval (CI) -1.70 to 3.09) and 0.28 (95% CI 0.05 to 0.51). The safety campaign did have an initial and sustained effect, reducing non-fatal injuries with effect sizes of -1.82 (95% CI -2.90 to -0.75) and -1.30 (95% CI -1.79 to -0.80) respectively. 
The drug-free workplace program did have an initial and sustained effect, reducing non-fatal injuries compared to no intervention, with effect sizes of -6.74 (95% CI -10.02 to -3.54) and -1.76 (95% CI -3.11 to -0.41) respectively. The vast majority of technical, human factors and organisational interventions which are recommended by standard texts of safety, consultants and safety courses, have not been adequately evaluated. There is no evidence that regulations for reducing fatal and non-fatal injuries are effective. There is limited evidence that a multifaceted safety campaign and a multifaceted drug program can reduce non-fatal injuries in the construction industry.
ERIC Educational Resources Information Center
Cenziper, Debbie; Grotto, Jason
This series of articles examines the condition of public schools and public school construction in Florida's Miami and Dade Counties. To prepare the series, the Miami Herald studied thousands of pages of construction records, correspondence, school district reports, and accounting statements over 15 years. It analyzed state and national…
Gupta, Rahul; Audhkhasi, Kartik; Jacokes, Zach; Rozga, Agata; Narayanan, Shrikanth
2018-01-01
Studies of time-continuous human behavioral phenomena often rely on ratings from multiple annotators. Since the ground truth of the target construct is often latent, the standard practice is to use ad-hoc metrics (such as averaging annotator ratings). Despite being easy to compute, such metrics may not provide accurate representations of the underlying construct. In this paper, we present a novel method for modeling multiple time series annotations over a continuous variable that computes the ground truth by modeling annotator-specific distortions. We condition the ground truth on a set of features extracted from the data and further assume that the annotators provide their ratings as modifications of the ground truth, with each annotator having specific distortion tendencies. We train the model using an Expectation-Maximization based algorithm and evaluate it on a study involving natural interaction between a child and a psychologist, predicting confidence ratings of the children's smiles. We compare and analyze the model against two baselines in which: (i) the ground truth is considered to be the framewise mean of ratings from the various annotators and (ii) each annotator is assumed to bear a distinct time delay in annotation, and their annotations are aligned before computing the framewise mean.
Inverse sequential procedures for the monitoring of time series
NASA Technical Reports Server (NTRS)
Radok, Uwe; Brown, Timothy J.
1995-01-01
When one or more new values are added to a developing time series, they change its descriptive parameters (mean, variance, trend, coherence). A 'change index' (CI) is developed as a quantitative indicator that the changed parameters remain compatible with the existing 'base' data. CI formulae are derived, in terms of normalized likelihood ratios, for small samples from Poisson, Gaussian, and chi-square distributions, and for regression coefficients measuring linear or exponential trends. A substantial parameter change creates a rapid or abrupt CI decrease which persists when the length of the base is changed. Except for a special Gaussian case, the CI has no simple explicit regions for tests of hypotheses. However, its design ensures that the series sampled need not conform strictly to the distribution form assumed for the parameter estimates. The use of the CI is illustrated with both constructed and observed data samples, processed with a Fortran code 'Sequitor'.
Xie, Ping; Zhao, Jiang Yan; Wu, Zi Yi; Sang, Yan Fang; Chen, Jie; Li, Bin Bin; Gu, Hai Ting
2018-04-01
The analysis of inconsistent hydrological series is one of the major problems that must be solved for engineering hydrological calculation in a changing environment. In this study, the differences between non-consistency and non-stationarity were analyzed from the perspective of the composition of hydrological series. Inconsistent hydrological phenomena were generalized into hydrological processes with inheritance, variability and evolution characteristics or regulations. Furthermore, hydrological genes were identified following the theory of biological genes, while their inheritance bases and variability bases were determined based on the composition of hydrological series under different time scales. To identify and test the components of hydrological genes, we constructed a diagnosis system of hydrological genes. With the P-3 distribution as an example, we described the process of construction and expression of the moment genes to illustrate the inheritance, variability and evolution principles of hydrological genes. With the annual minimum 1-month runoff series of Yunjinghong station in the Lancangjiang River basin as an example, we verified the feasibility and practicability of hydrological gene theory for the calculation of inconsistent hydrological frequency. The results showed that the method could be used to reveal the evolution of inconsistent hydrological series. Therefore, it provides a new research pathway for engineering hydrological calculation in a changing environment and an essential reference for the assessment of water security.
A generalized conditional heteroscedastic model for temperature downscaling
NASA Astrophysics Data System (ADS)
Modarres, R.; Ouarda, T. B. M. J.
2014-11-01
This study describes a method for deriving the time-varying second-order moment, or heteroscedasticity, of local daily temperature and its association with large-scale predictors from the Canadian Coupled General Circulation Model. This is carried out by applying a multivariate generalized autoregressive conditional heteroscedasticity (MGARCH) approach to construct the conditional variance-covariance structure between General Circulation Model (GCM) predictors and maximum and minimum temperature time series during 1980-2000. Two MGARCH specifications, namely diagonal VECH and dynamic conditional correlation (DCC), are applied, and 25 GCM predictors were selected for bivariate temperature heteroscedastic modeling. It is observed that the conditional covariance between predictors and temperature is not very strong and mostly depends on the interaction between the random processes governing the temporal variation of predictors and predictands. The DCC model reveals a time-varying conditional correlation between GCM predictors and temperature time series. No remarkable increasing or decreasing change is observed in the correlation coefficients between GCM predictors and observed temperature during 1980-2000, while a weak winter-summer seasonality is clear for both conditional covariance and correlation. Furthermore, the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) stationarity test and the Brock-Dechert-Scheinkman (BDS) nonlinearity test showed that the GCM predictors, temperature, and their conditional correlation time series are nonlinear but stationary during 1980-2000. However, the degree of nonlinearity of the temperature time series is higher than that of most of the GCM predictors.
The geometry of chaotic dynamics — a complex network perspective
NASA Astrophysics Data System (ADS)
Donner, R. V.; Heitzig, J.; Donges, J. F.; Zou, Y.; Marwan, N.; Kurths, J.
2011-12-01
Recently, several complex network approaches to time series analysis have been developed and applied to study a wide range of model systems as well as real-world data, e.g., geophysical or financial time series. Among these techniques, recurrence-based concepts, and prominently ɛ-recurrence networks, most faithfully represent the geometrical fine structure of the attractors underlying chaotic (and, less interestingly, non-chaotic) time series. In this paper we demonstrate that the well-known graph-theoretical properties of local clustering coefficient and global (network) transitivity can meaningfully be exploited to define two new local and two new global measures of dimension in phase space: local upper and lower clustering dimension as well as global upper and lower transitivity dimension. Rigorous analytical as well as numerical results for self-similar sets and simple chaotic model systems suggest that these measures are well-behaved in most non-pathological situations and that they can be estimated reasonably well using ɛ-recurrence networks constructed from relatively short time series. Moreover, we study the relationship between clustering and transitivity dimensions on the one hand, and traditional measures like pointwise dimension or local Lyapunov dimension on the other hand. We also provide further evidence that the local clustering coefficients, or equivalently the local clustering dimensions, are useful for identifying unstable periodic orbits and other dynamically invariant objects from time series. Our results demonstrate that ɛ-recurrence networks constitute an important link between dynamical systems and graph theory.
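The two graph quantities the abstract builds on can be illustrated on a toy ɛ-recurrence network. This is a minimal sketch, assuming a scalar (unembedded) series and an arbitrary ɛ; the paper works with attractors in phase space and derives dimensions from these quantities.

```python
from math import sin

def recurrence_network(series, eps):
    """Adjacency of an epsilon-recurrence network: link states closer than eps (no self-loops)."""
    n = len(series)
    return [[1 if i != j and abs(series[i] - series[j]) <= eps else 0
             for j in range(n)] for i in range(n)]

def transitivity(adj):
    """Global network transitivity: 3 * (number of triangles) / (number of connected triples)."""
    n = len(adj)
    triangles = sum(adj[i][j] and adj[j][k] and adj[i][k]
                    for i in range(n) for j in range(i + 1, n) for k in range(j + 1, n))
    triples = sum(d * (d - 1) // 2 for d in (sum(row) for row in adj))
    return 3.0 * triangles / triples if triples else 0.0

x = [sin(0.3 * t) for t in range(100)]
T = transitivity(recurrence_network(x, eps=0.2))
```

Transitivity measures the fraction of "neighbour of my neighbour is my neighbour" relations, which is what links it to an effective dimension of the underlying point set.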
33 CFR 149.315 - What embarkation, launching, and recovery arrangements must rescue boats meet?
Code of Federal Regulations, 2012 CFR
2012-07-01
... GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION... the shortest possible time. (c) If the rescue boat is one of the deepwater port's survival craft, then... apparatus, approved under approval series 160.170, instead of a lifeboat release mechanism. (f) The rescue...
33 CFR 149.315 - What embarkation, launching, and recovery arrangements must rescue boats meet?
Code of Federal Regulations, 2010 CFR
2010-07-01
... GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION... the shortest possible time. (c) If the rescue boat is one of the deepwater port's survival craft, then... apparatus, approved under approval series 160.170, instead of a lifeboat release mechanism. (f) The rescue...
33 CFR 149.315 - What embarkation, launching, and recovery arrangements must rescue boats meet?
Code of Federal Regulations, 2011 CFR
2011-07-01
... GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION... the shortest possible time. (c) If the rescue boat is one of the deepwater port's survival craft, then... apparatus, approved under approval series 160.170, instead of a lifeboat release mechanism. (f) The rescue...
ERIC Educational Resources Information Center
Bobbitt, Larry; Otto, Mark
Three Autoregressive Integrated Moving Averages (ARIMA) forecast procedures for Census Bureau X-11 concurrent seasonal adjustment were empirically tested. Forty time series from three Census Bureau economic divisions (business, construction, and industry) were analyzed. Forecasts were obtained from fitted seasonal ARIMA models augmented with…
14 CFR Appendix A to Part 150 - Noise Exposure Maps
Code of Federal Regulations, 2012 CFR
2012-01-01
... series of n events in time period T, in seconds. Note: When T is one hour, LT is referred to as one-hour... sound attenuation into the design and construction of a structure may be necessary to achieve..., noise exposure maps prepared in connection with studies which were either Federally funded or Federally...
14 CFR Appendix A to Part 150 - Noise Exposure Maps
Code of Federal Regulations, 2010 CFR
2010-01-01
... series of n events in time period T, in seconds. Note: When T is one hour, LT is referred to as one-hour... sound attenuation into the design and construction of a structure may be necessary to achieve..., noise exposure maps prepared in connection with studies which were either Federally funded or Federally...
14 CFR Appendix A to Part 150 - Noise Exposure Maps
Code of Federal Regulations, 2014 CFR
2014-01-01
... series of n events in time period T, in seconds. Note: When T is one hour, LT is referred to as one-hour... sound attenuation into the design and construction of a structure may be necessary to achieve..., noise exposure maps prepared in connection with studies which were either Federally funded or Federally...
14 CFR Appendix A to Part 150 - Noise Exposure Maps
Code of Federal Regulations, 2013 CFR
2013-01-01
... series of n events in time period T, in seconds. Note: When T is one hour, LT is referred to as one-hour... sound attenuation into the design and construction of a structure may be necessary to achieve..., noise exposure maps prepared in connection with studies which were either Federally funded or Federally...
14 CFR Appendix A to Part 150 - Noise Exposure Maps
Code of Federal Regulations, 2011 CFR
2011-01-01
... series of n events in time period T, in seconds. Note: When T is one hour, LT is referred to as one-hour... sound attenuation into the design and construction of a structure may be necessary to achieve..., noise exposure maps prepared in connection with studies which were either Federally funded or Federally...
NASA Astrophysics Data System (ADS)
Stiller-Reeve, Mathew; Stephenson, David; Spengler, Thomas
2017-04-01
For climate services to be relevant and informative for users, scientific data definitions need to match users' perceptions or beliefs. This study proposes and tests novel yet simple methods to compare beliefs of timing of recurrent climatic events with empirical evidence from multiple historical time series. The methods are tested by applying them to the onset date of the monsoon in Bangladesh, where several scientific monsoon definitions can be applied, yielding different results for monsoon onset dates. It is a challenge to know which monsoon definition compares best with people's beliefs. Time series from eight different scientific monsoon definitions in six regions are compared with respondent beliefs from a previously completed survey concerning the monsoon onset. Beliefs about the timing of the monsoon onset are represented probabilistically for each respondent by constructing a probability mass function (PMF) from elicited responses about the earliest, normal, and latest dates for the event. A three-parameter circular modified triangular distribution (CMTD) is used to allow for the possibility (albeit small) of the onset at any time of the year. These distributions are then compared to the historical time series using two approaches: likelihood scores, and the mean and standard deviation of time series of dates simulated from each belief distribution. The methods proposed give the basis for further iterative discussion with decision-makers in the development of eventual climate services. This study uses Jessore, Bangladesh, as an example and finds that a rainfall definition, applying a 10 mm day-1 threshold to NCEP-NCAR reanalysis (Reanalysis-1) data, best matches the survey respondents' beliefs about monsoon onset.
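The elicitation step above can be sketched with a plain triangular PMF over day-of-year plus a small uniform floor, a crude stand-in for the circular modified triangular distribution (CMTD) of the study; the dates, floor value, and discretization below are illustrative assumptions, not the paper's construction.

```python
from math import log

def belief_pmf(earliest, mode, latest, n_days=365, floor=1e-4):
    """Triangular PMF on integer days from elicited earliest/normal/latest dates,
    with a small uniform floor so every day of the year has nonzero probability."""
    pmf = [floor] * n_days
    for d in range(earliest, latest + 1):
        if d <= mode:
            pmf[d] += (d - earliest + 1) / ((latest - earliest) * (mode - earliest + 1))
        else:
            pmf[d] += (latest - d) / ((latest - earliest) * (latest - mode))
    total = sum(pmf)
    return [p / total for p in pmf]

def log_likelihood(pmf, onset_days):
    """Likelihood score of historical onset dates under one respondent's belief distribution."""
    return sum(log(pmf[d]) for d in onset_days)

pmf = belief_pmf(earliest=150, mode=160, latest=180)   # illustrative day-of-year values
score = log_likelihood(pmf, [158, 162, 167])
```

Comparing such scores across scientific monsoon definitions is one way to see which definition's onset series best matches a respondent's beliefs.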
Finite element techniques in computational time series analysis of turbulent flows
NASA Astrophysics Data System (ADS)
Horenko, I.
2009-04-01
In recent years there has been a considerable increase of interest in the mathematical modeling and analysis of complex systems that undergo transitions between several phases or regimes. Such systems can be found, e.g., in weather forecasting (transitions between weather conditions), climate research (ice ages and warm ages), computational drug design (conformational transitions) and in econometrics (e.g., transitions between different phases of the market). In all cases, the accumulation of sufficiently detailed time series has led to the formation of huge databases, containing enormous but still undiscovered treasures of information. However, the extraction of the essential dynamics and the identification of the phases is usually hindered by the multidimensional nature of the signal, i.e., the information is "hidden" in the time series. Standard filtering approaches (such as wavelet-based spectral methods) in general have infeasible numerical complexity in high dimensions, while other standard methods (such as the Kalman filter, MVAR, ARCH/GARCH, etc.) impose strong assumptions about the type of the underlying dynamics. An approach based on the optimization of a specially constructed regularized functional (describing the quality of the data description in terms of a certain number of specified models) will be introduced. Based on this approach, several new adaptive mathematical methods for simultaneous EOF/SSA-like data-based dimension reduction and identification of hidden phases in high-dimensional time series will be presented. The methods exploit the topological structure of the analysed data and do not impose severe assumptions on the underlying dynamics. Special emphasis will be placed on the mathematical assumptions and the numerical cost of the constructed methods. The application of the presented methods will first be demonstrated on a toy example, and the results will be compared with those obtained by standard approaches.
The importance of accounting for the mathematical assumptions used in the analysis will be pointed out with this example. Finally, applications to the analysis of meteorological and climate data will be presented.
Determination of sustainable values for the parameters of the construction of residential buildings
NASA Astrophysics Data System (ADS)
Grigoreva, Larisa; Grigoryev, Vladimir
2018-03-01
Norms or calculated indicators for the duration of construction of high-rise residential buildings and multifunctional complexes are mandatory inputs when construction companies form housing construction programs, plan capital investments, and develop strategic plans. Determining stable values of these parameters for the construction of high-rise residential buildings makes it possible to establish a reasonable construction duration at the planning and design stages of residential complexes, taking into account the influence of market-condition factors. The concept of enlarged models for high-rise residential construction is based on a realistic mapping, in time and space, of the most significant stages of work with their organizational and technological interconnection: the preparatory period, the underground part, the above-ground part, external engineering networks, and landscaping. The total duration of construction of a residential building, depending on the duration of each stage and the degree of their overlapping, can be determined by one of four proposed options. At the same time, a unified approach to determining the overall construction duration was developed on the basis of flow-line construction organization, with the results tested on high-rise residential buildings of the typical I-155B series, and the coefficients for overlapping the work stages of the building were determined.
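The overlap idea can be made concrete with a toy calculation. The abstract does not give formulas, so the stage durations, overlap values, and the simple "duration minus overlap with predecessor" rule below are purely hypothetical illustrations of the general principle.

```python
def total_duration(stages):
    """Total construction duration for sequential stages that may overlap their predecessor.

    `stages` is a list of (duration, overlap_with_previous) pairs, in months.
    """
    total = 0.0
    for duration, overlap in stages:
        total += duration - min(overlap, total)  # overlap cannot exceed work already scheduled
    return total

# Hypothetical stages: preparation, underground part, above-ground part, networks, landscaping.
plan = [(2, 0), (5, 1), (12, 2), (3, 1), (2, 1)]
months = total_duration(plan)
```

With no overlaps the total is just the sum of the stage durations; each month of overlap shaves a month off the schedule, which is how combining stages shortens the overall duration.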
Early-time solution of the horizontal unconfined aquifer in the build-up phase
NASA Astrophysics Data System (ADS)
Gravanis, Elias; Akylas, Evangelos
2017-04-01
The Boussinesq equation is a dynamical equation for the free surface of saturated subsurface flow over an impervious bed. The Boussinesq equation is non-linear. The non-linearity comes from the reduction of the dimensionality of the problem: the flow is assumed to be vertically homogeneous, so that the flow rate through a cross-section of the flow is proportional to the free surface height times the hydraulic gradient, which is assumed to be equal to the slope of the free surface (Dupuit approximation). In general, 'vertically' means normally to the bed; combining the Dupuit approximation with the continuity equation leads to the Boussinesq equation. There are very few transient exact solutions. Self-similar solutions have been constructed in the past by various authors. A power-series type of solution was derived for a self-similar Boussinesq equation by Barenblatt in 1990, and that type of solution has generated a certain amount of literature. For the unconfined flow case with zero recharge rate, Boussinesq derived an exact solution for the horizontal aquifer by assuming separation of variables. This is actually an exact asymptotic solution of the horizontal-aquifer recession phase at late times. The kinematic wave is an interesting solution obtained by dropping the non-linear term in the Boussinesq equation. Although it is an approximate solution, and holds well only for small values of the Henderson and Wooding λ parameter (that is, for steep slopes, high conductivity or small recharge rate), it becomes less and less approximate for smaller values of the parameter; that is, it is asymptotically exact with respect to that parameter. In the present work we consider the case of unconfined subsurface flow over a horizontal bed in the build-up phase under a constant recharge rate.
This is a case with an infinite Henderson and Wooding parameter, that is, the limiting case where the non-linear term is present in the Boussinesq equation while the linear spatial-derivative term drops out. Nonetheless, no analogue of the kinematic wave or of the Boussinesq separable solution exists in this case. The late-time state of the build-up phase under constant recharge rate is simply the steady-state solution. Our aim is to construct the early-time asymptotic solution of this problem. The solution is expressed as a power series in a suitable similarity variable, which is constructed so as to satisfy the boundary conditions at both ends of the aquifer; that is, it is a polynomial approximation of the exact solution. The series turns out to be asymptotic, and it is regularized by re-summation techniques which are used to define divergent series. The outflow rate in this regime is linear in time, and the (dimensionless) coefficient is calculated to eight significant figures. The local error of the series is quantified by its deviation from satisfying the self-similar Boussinesq equation at every point. The local error turns out to be everywhere positive; hence, so is the integrated error, which in turn quantifies the degree of convergence of the series to the exact solution.
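For reference, the one-dimensional Boussinesq equation over a horizontal impervious bed with constant recharge rate, as described in the abstract, takes the following standard form (the symbols are assumptions here, since the abstract does not fix notation: h(x,t) free-surface height, K hydraulic conductivity, f drainable porosity, N recharge rate):

```latex
f\,\frac{\partial h}{\partial t}
  = K\,\frac{\partial}{\partial x}\!\left( h\,\frac{\partial h}{\partial x} \right) + N
```

The non-linearity the abstract refers to enters through the product $h\,\partial h/\partial x$, i.e., through the Dupuit flux per unit width $q = -K\,h\,\partial h/\partial x$.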
Lambert, Bruno; Flahault, Antoine; Chartier-Kastler, Emmanuel; Hanslik, Thomas
2013-01-01
Background Despite the fact that urinary tract infection (UTI) is a very frequent disease, little is known about its seasonality in the community. Methods and Findings We estimated the seasonality of UTI using multiple time series constructed from available proxies of UTI. Eight time series based on two databases were used: sales of urinary antibacterial medications reported by a panel of pharmacy stores in France between 2000 and 2012, and search trends on the Google search engine for UTI-related terms between 2004 and 2012 in France, Germany, Italy, the USA, China, Australia and Brazil. Differences between summers and winters were statistically assessed with the Mann-Whitney test. We evaluated seasonality by applying the Harmonic Product Spectrum to the Fast Fourier Transform. Seven of the eight time series displayed a significant increase in medication sales or web searches in the summer compared to the winter, ranging from 8% to 20%. All eight time series displayed a periodicity of one year. Annual increases were seen in the summer for UTI drug sales in France and Google searches in France, the USA, Germany, Italy, and China. Increases occurred in the austral summer for Google searches in Brazil and Australia. Conclusions An annual seasonality of UTIs was evidenced in seven different countries, with peaks during the summer. PMID:24204587
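The periodicity-detection step can be sketched with a naive discrete Fourier transform on a synthetic monthly series. This is a simplified stand-in, assuming a plain spectral peak search rather than the Harmonic Product Spectrum refinement used in the study, and the series itself is fabricated.

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform (O(n^2)); adequate for short monthly series."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def dominant_period(x):
    """Period (in samples) of the strongest spectral component, excluding the mean."""
    n = len(x)
    spectrum = dft(x)
    k = max(range(1, n // 2 + 1), key=lambda k: abs(spectrum[k]))
    return n / k

# Synthetic 4-year monthly series with a summer peak each year.
monthly = [10 + 3 * math.sin(2 * math.pi * t / 12) for t in range(48)]
period = dominant_period(monthly)
```

On a monthly sales or search-volume series, a dominant period of 12 samples is exactly the one-year seasonality the study reports.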
Nonlinear model updating applied to the IMAC XXXII Round Robin benchmark system
NASA Astrophysics Data System (ADS)
Kurt, Mehmet; Moore, Keegan J.; Eriten, Melih; McFarland, D. Michael; Bergman, Lawrence A.; Vakakis, Alexander F.
2017-05-01
We consider the application of a new nonlinear model updating strategy to a computational benchmark system. The approach relies on analyzing system response time series in the frequency-energy domain by constructing both Hamiltonian and forced and damped frequency-energy plots (FEPs). The system parameters are then characterized and updated by matching the backbone branches of the FEPs with the frequency-energy wavelet transforms of experimental and/or computational time series. The main advantage of this method is that no nonlinearity model is assumed a priori, and the system model is updated solely based on simulation and/or experimental measured time series. By matching the frequency-energy plots of the benchmark system and its reduced-order model, we show that we are able to retrieve the global strongly nonlinear dynamics in the frequency and energy ranges of interest, identify bifurcations, characterize local nonlinearities, and accurately reconstruct time series. We apply the proposed methodology to a benchmark problem, which was posed to the system identification community prior to the IMAC XXXII (2014) and XXXIII (2015) Conferences as a "Round Robin Exercise on Nonlinear System Identification". We show that we are able to identify the parameters of the non-linear element in the problem with a priori knowledge about its position.
NASA Astrophysics Data System (ADS)
Butler, P. G.; Scourse, J. D.; Richardson, C. A.; Wanamaker, A. D., Jr.
2009-04-01
Determinations of the local correction (ΔR) to the globally averaged marine radiocarbon reservoir age are often isolated in space and time, derived from heterogeneous sources and constrained by significant uncertainties. Although time series of ΔR at single sites can be obtained from sediment cores, these are subject to multiple uncertainties related to sedimentation rates, bioturbation and interspecific variations in the source of radiocarbon in the analysed samples. Coral records provide better resolution, but these are available only for tropical locations. It is shown here that it is possible to use the shell of the long-lived bivalve mollusc Arctica islandica as a source of high-resolution time series of absolutely dated marine radiocarbon determinations for the shelf seas surrounding the North Atlantic Ocean. Annual growth increments in the shell can be crossdated and chronologies can be constructed in precise analogy with the use of tree-rings. Because the calendar dates of the samples are known, ΔR can be determined with high precision and accuracy, and because all the samples are from the same species, the time series of ΔR values possesses a high degree of internal consistency. Presented here is a multi-centennial (AD 1593 - AD 1933) time series of 31 ΔR values for a site in the Irish Sea close to the Isle of Man. The mean value of ΔR (-62 14C yrs) does not change significantly during this period, but increased variability is apparent before AD 1750.
Indicator saturation: a novel approach to detect multiple breaks in geodetic time series.
NASA Astrophysics Data System (ADS)
Jackson, L. P.; Pretis, F.; Williams, S. D. P.
2016-12-01
Geodetic time series can record long-term trends, quasi-periodic signals at a variety of time scales from days to decades, and sudden breaks due to natural or anthropogenic causes. The causes of breaks range from instrument replacement to earthquakes to unknown (i.e., no attributable cause). Furthermore, breaks can be permanent or short-lived and range over at least two orders of magnitude in size (mm to hundreds of mm). Accounting for this range of possible signal characteristics requires a flexible time series method that can distinguish between true and false breaks, outliers and time-varying trends. One such method, Indicator Saturation (IS), comes from the field of econometrics, where analysing stochastic signals in these terms is a common problem. The IS approach differs from alternative break detection methods by considering every point in the time series as a break until it is demonstrated statistically that it is not. A linear model is constructed with a break function at every point in time, and all but the statistically significant breaks are removed through a general-to-specific model selection algorithm designed for more variables than observations. The IS method is flexible because it allows multiple breaks of different forms (e.g., impulses, shifts in the mean, and changing trends) to be detected, while simultaneously modelling any underlying variation driven by additional covariates. We apply the IS method to identify breaks in a suite of synthetic GPS time series used for the Detection of Offsets in GPS Experiments (DOGEX). We optimise the method to maximise the ratio of true-positive to false-positive detections, which improves the estimates of errors in the long-term rates of land motion currently required by the GPS community.
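The core idea of treating every point as a candidate break can be illustrated with a drastically simplified single-break scan: place a mean-shift at each interior point and keep the one minimizing the residual sum of squares. This is a sketch only; the actual IS method saturates the model with indicators at all points simultaneously and prunes them with a general-to-specific selection algorithm.

```python
def best_mean_shift(y):
    """Scan every interior point as a candidate mean-shift; return (index, total SSE)."""
    def sse(seg):
        m = sum(seg) / len(seg)
        return sum((v - m) ** 2 for v in seg)
    best = min(range(1, len(y)), key=lambda b: sse(y[:b]) + sse(y[b:]))
    return best, sse(y[:best]) + sse(y[best:])

# Synthetic GPS-like series: level 0 mm, then a 5 mm offset at index 30 (e.g., an antenna change).
series = [0.0] * 30 + [5.0] * 30
b, err = best_mean_shift(series)
```

A real implementation would compare the fitted break against a no-break model at a chosen significance level, and handle multiple breaks, outliers, and trend changes jointly.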
NASA Astrophysics Data System (ADS)
Sun, Chao; Liu, Yongxue; Zhao, Saishuai; Zhou, Minxi; Yang, Yuhao; Li, Feixue
2016-03-01
Salt marshes are among the most dynamic and valuable ecosystems in coastal zones, and in these areas it is crucial to obtain accurate remote sensing information on the spatial distributions of species over time. However, discriminating various types of salt marsh is rather difficult because of their strong spectral similarities. Previous salt marsh mapping studies have focused mainly on high spatial and spectral (i.e., hyperspectral) resolution images combined with auxiliary information; however, the results are often limited to small regions. With its high temporal and moderate spatial resolution, Chinese HuanJing-1 (HJ-1) satellite optical imagery can be used not only to monitor phenological changes of salt marsh vegetation over short time intervals, but also to obtain coverage of large areas. Here, we apply HJ-1 satellite imagery to the middle coast of Jiangsu in east China to monitor changes in salt marsh vegetation cover. First, we constructed a monthly NDVI time-series to classify various types of salt marsh, and then we tested the possibility of using a compressed time-series to broaden the applicability of this particular approach. Our principal findings are as follows: (1) the overall accuracy of salt marsh mapping based on the monthly NDVI time-series was 90.3%, which was ∼16.0% higher than the single-phase classification strategy; (2) a compressed time-series, including NDVI from six key months (April, June-September, and November), showed very little reduction (2.3%) in overall accuracy but led to obvious improvements in unstable regions; and (3) a simple rule for Spartina alterniflora identification was established using a scene solely from November, which may provide an effective way to regularly monitor its distribution.
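The per-pixel feature the study classifies is straightforward to compute. This sketch shows the standard NDVI formula and a monthly profile for one pixel; the reflectance values are illustrative numbers, not calibrated HJ-1 data.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel (reflectance inputs)."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

def monthly_ndvi_profile(nir_series, red_series):
    """NDVI time-series for one pixel across months -- the feature vector to classify."""
    return [ndvi(n, r) for n, r in zip(nir_series, red_series)]

# Illustrative reflectances for the six key months (April, June-September, November).
profile = monthly_ndvi_profile([0.30, 0.45, 0.50, 0.52, 0.48, 0.35],
                               [0.10, 0.08, 0.07, 0.07, 0.08, 0.09])
```

Species with similar single-date spectra can still differ in the shape of this seasonal profile, which is why the monthly time-series outperforms single-phase classification.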
MIMO model of an interacting series process for Robust MPC via System Identification.
Wibowo, Tri Chandra S; Saad, Nordin
2010-07-01
This paper discusses empirical modeling using system identification techniques, with a focus on an interacting series process. The study is carried out experimentally using a gaseous pilot plant as the process, whose dynamics exhibit the typical behaviour of an interacting series process. Three practical approaches are investigated and their performances are evaluated. The models developed are also examined in a real-time implementation of linear model predictive control. The selected model is able to reproduce the main dynamic characteristics of the plant in open loop and produces zero steady-state error in the closed-loop control system. Several issues concerning the identification process and the construction of a MIMO state-space model for an interacting series process are deliberated. 2010 ISA. Published by Elsevier Ltd. All rights reserved.
Chaos control in delayed phase space constructed by the Takens embedding theory
NASA Astrophysics Data System (ADS)
Hajiloo, R.; Salarieh, H.; Alasty, A.
2018-01-01
In this paper, the problem of chaos control in discrete-time chaotic systems with unknown governing equations and limited measurable states is investigated. Using the time series of only one measurable state, an algorithm is proposed to stabilize unstable fixed points. The approach consists of three steps: first, using the Takens embedding theory, a delayed phase space preserving the topological characteristics of the unknown system is reconstructed. Second, a dynamic model is identified by the recursive least squares method to estimate the time-series data in the delayed phase space. Finally, based on the reconstructed model, an appropriate linear delayed feedback controller is obtained for stabilizing unstable fixed points of the system. Controller gains are computed using a systematic approach. The effectiveness of the proposed algorithm is examined by applying it to the generalized hyperchaotic Hénon system, the predator-prey population map, and the discrete-time Lorenz system.
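The first step, delay embedding, can be sketched directly. This is a generic Takens-style embedding of a scalar series; the embedding dimension and delay below are illustrative, and the paper's identification and control stages build on top of these vectors.

```python
def delay_embed(x, dim, tau):
    """Takens delay embedding: map a scalar series to vectors
    (x_i, x_{i+tau}, ..., x_{i+(dim-1)*tau})."""
    last = len(x) - (dim - 1) * tau
    return [tuple(x[i + j * tau] for j in range(dim)) for i in range(last)]

series = [0.1 * t for t in range(10)]        # placeholder for the single measurable state
vectors = delay_embed(series, dim=3, tau=2)
```

For a suitable dimension and delay, the embedded vectors preserve the topology of the unknown system's attractor, which is what justifies fitting the dynamic model in the delayed phase space.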
Investigation on Law and Economics Based on Complex Network and Time Series Analysis
Yang, Jian; Qu, Zhao; Chang, Hui
2015-01-01
The research focuses on the cooperative relationship and the strategic tendencies among three mutually interacting parties in financing: small enterprises, commercial banks and micro-credit companies. Complex network theory and time series analysis were applied to obtain quantitative evidence. Moreover, this paper builds a fundamental model describing the particular interaction among them through evolutionary game theory. Combining the results of the data analysis with the current situation, it is justifiable to put forward reasonable legislative recommendations for regulating lending activities among small enterprises, commercial banks and micro-credit companies. The approach in this research provides a framework for constructing mathematical models and applying econometrics and evolutionary game theory to the issue of corporate financing. PMID:26076460
NASA Astrophysics Data System (ADS)
Inc, Mustafa; Yusuf, Abdullahi; Aliyu, Aliyu Isa; Baleanu, Dumitru
2018-04-01
This paper studies the symmetry analysis, explicit solutions, convergence analysis, and conservation laws (CLs) for two different space-time fractional nonlinear evolution equations with the Riemann-Liouville (RL) derivative. The governing equations are reduced to nonlinear ordinary differential equations (ODEs) of fractional order using their Lie point symmetries. In the reduced equations the derivative is in the Erdelyi-Kober (EK) sense, and the power series technique is applied to derive explicit solutions for the reduced fractional ODEs. The convergence of the obtained power series solutions is also presented. Moreover, the new conservation theorem and the generalization of the Noether operators are developed to construct nonlocal CLs for the governing equations. Some interesting figures for the obtained explicit solutions are presented.
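For reference, the Riemann-Liouville fractional derivative of order α appearing in such equations is defined as follows (standard textbook definition, stated here for context rather than taken from the paper):

```latex
{}_{0}D_{t}^{\alpha}\, u(t)
  = \frac{1}{\Gamma(n-\alpha)}\,\frac{d^{n}}{dt^{n}}
    \int_{0}^{t} (t-s)^{\,n-\alpha-1}\, u(s)\, ds,
  \qquad n-1 < \alpha \le n,\ n \in \mathbb{N}.
```

The nonlocality of this operator (the integral over the whole history $0 \le s \le t$) is what makes the conservation laws constructed for such equations nonlocal as well.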
Deconvolution of time series in the laboratory
NASA Astrophysics Data System (ADS)
John, Thomas; Pietschmann, Dirk; Becker, Volker; Wagner, Christian
2016-10-01
In this study, we present two practical applications of the deconvolution of time series in Fourier space. First, using a software approach, we reconstruct a filtered input signal of sound cards that has been heavily distorted by a built-in high-pass filter. Using deconvolution, we can partially bypass the filter and extend the dynamic frequency range by two orders of magnitude. Second, we construct the required input signals for a mechanical shaker in order to obtain arbitrary acceleration waveforms, referred to as feedforward control. For both situations, experimental and theoretical approaches are discussed to determine the system-dependent frequency response. Moreover, for the shaker, we propose a simple feedback loop as an extension to the feedforward control in order to handle nonlinearities of the system.
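Deconvolution in Fourier space reduces to dividing the measured spectrum by the system's frequency response, bin by bin. This minimal sketch uses a naive DFT and a fabricated response with no zero bins; a practical implementation (as for the sound card or shaker above) must regularize near-zero response bins rather than divide through them.

```python
import cmath
import math

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def deconvolve(measured, response):
    """Divide spectra bin-by-bin; assumes the response has no zero bins (else regularize)."""
    Y, H = dft(measured), dft(response)
    return idft([y / h for y, h in zip(Y, H)])

# Circularly convolve a test signal with a simple smoothing response, then invert it.
x = [1.0, 2.0, 3.0, 4.0, 0.0, 0.0, 0.0, 0.0]
h = [1.0, 0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
y = idft([a * b for a, b in zip(dft(x), dft(h))])
x_rec = deconvolve(y, h)
```

The round trip recovers the original signal because multiplication and division in the frequency domain are exact inverses wherever the response spectrum is nonzero.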
75 FR 9002 - Draft Regulatory Guide: Issuance, Availability
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-26
... Guide, DG-3040, ``Design, Construction, and Inspection of Embankment Retention Systems at Fuel Cycle... guide in the agency's ``Regulatory Guide'' series. This series was developed to describe and make... to be satisfactory for the design, construction, and inspection of embankment retention systems used...
Cheng, Karen Elizabeth; Crary, David J; Ray, Jaideep; Safta, Cosmin
2013-01-01
Objective We discuss the use of structural models for the analysis of biosurveillance-related data. Methods and results Using a combination of real and simulated data, we have constructed a data set that represents a plausible time series resulting from surveillance of a large-scale bioterrorist anthrax attack in Miami. We discuss the performance of anomaly detection with structural models for these data using receiver operating characteristic (ROC) and activity monitoring operating characteristic (AMOC) analysis. In addition, we show that these techniques provide a method for predicting the level of the outbreak that is valid for approximately 2 weeks post-alarm. Conclusions Structural models provide an effective tool for the analysis of biosurveillance data, in particular for time series with a noisy, non-stationary background and missing data. PMID:23037798
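ROC analysis of a detector like the one above can be sketched generically: sweep the alarm threshold over the detector's anomaly scores and trace out (false-positive rate, true-positive rate) pairs. The scores and labels below are fabricated; this is the evaluation machinery, not the structural-model detector itself.

```python
def roc_points(scores, labels):
    """ROC curve as (FPR, TPR) pairs, sweeping the alarm threshold from high to low."""
    pos = sum(labels)
    neg = len(labels) - pos
    points = [(0.0, 0.0)]
    for t in sorted(set(scores), reverse=True):
        tp = sum(1 for s, l in zip(scores, labels) if s >= t and l == 1)
        fp = sum(1 for s, l in zip(scores, labels) if s >= t and l == 0)
        points.append((fp / neg, tp / pos))
    return points

def auc(points):
    """Trapezoidal area under the ROC curve."""
    return sum((x2 - x1) * (y1 + y2) / 2.0
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

# Perfectly separated detector output: anomaly scores vs. outbreak labels.
pts = roc_points([0.1, 0.2, 0.8, 0.9], [0, 0, 1, 1])
area = auc(pts)
```

AMOC analysis replaces the true-positive axis with time-to-detection, but the threshold-sweeping structure is the same.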
Spectral analysis of finite-time correlation matrices near equilibrium phase transitions
NASA Astrophysics Data System (ADS)
Vinayak; Prosen, T.; Buča, B.; Seligman, T. H.
2014-10-01
We study spectral densities for systems on lattices which, at a phase transition, display power-law spatial correlations. Constructing the spatial correlation matrix, we prove that its eigenvalue density shows a power law that can be derived from the spatial correlations. In practice, time series are short in the sense that they are either not stationary over long time intervals or not available over long time intervals. Moreover, we usually do not have time series available for all variables. We perform numerical simulations on a two-dimensional Ising model with the usual Metropolis algorithm as the time evolution. Using all spins on a grid with periodic boundary conditions, we find a power law that is, for large grids, compatible with the analytic result. We still find a power law even if we choose a fairly small subset of grid points at random, although the exponents of the power laws are smaller under such circumstances. For very short time series leading to singular correlation matrices, we use a recently developed technique to lift the degeneracy at zero in the spectrum and find a significant signature of critical behavior even in this case, as compared to high-temperature results, which tend to those of random matrix models.
46 CFR 199.150 - Survival craft launching and recovery arrangements; general.
Code of Federal Regulations, 2013 CFR
2013-10-01
... approval series 160.163. (b) Unless expressly provided otherwise in this part, each survival craft must be... attachment to the vessel must be designed, based on the ultimate strength of the construction material, to be at least 4.5 times the load imparted on the attachment by the launching appliance and its fully...
Boolean network inference from time series data incorporating prior biological knowledge.
Haider, Saad; Pal, Ranadip
2012-01-01
Numerous approaches exist for modeling genetic regulatory networks (GRNs), but the low sampling rates often employed in biological studies prevent the inference of detailed models from experimental data. In this paper, we analyze the issues involved in estimating a model of a GRN from single cell line time series data with limited time points. We present an inference approach for a Boolean Network (BN) model of a GRN from limited transcriptomic or proteomic time series data based on prior biological knowledge of connectivity, constraints on the attractor structure, and robust design. We applied our inference approach to 6-time-point transcriptomic data on a Human Mammary Epithelial Cell line (HMEC) after application of Epidermal Growth Factor (EGF) and generated a BN with a plausible biological structure satisfying the data. We further defined and applied a similarity measure to compare synthetic BNs with BNs generated through the proposed approach from transitions of various paths of the synthetic BNs. We also compared the performance of our algorithm with two existing BN inference algorithms. Through theoretical analysis and simulations, we showed the rarity of arriving at a BN with a plausible biological structure from limited time series data using random connectivity, or in the absence of structure in the data. When applied to experimental data and to data generated from synthetic BNs, the framework was able to estimate BNs with high similarity scores. Comparison with existing BN inference algorithms showed the better performance of our proposed algorithm for limited time series data. The proposed framework can also be applied to optimize the connectivity of a GRN from experimental data when the prior biological knowledge on regulators is limited or not unique.
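The Boolean Network dynamics underlying such inference can be sketched with a synchronous update rule and attractor detection. The three-gene network below is a hypothetical toy, not the HMEC model from the paper; it only illustrates what "constraints on the attractor structure" constrain.

```python
def step(state, functions):
    """Synchronous update: every gene applies its Boolean rule to the current state."""
    return tuple(f(state) for f in functions)

def find_attractor(state, functions):
    """Iterate until a state repeats; return the attractor cycle that was entered."""
    seen = {}
    trajectory = []
    while state not in seen:
        seen[state] = len(trajectory)
        trajectory.append(state)
        state = step(state, functions)
    return trajectory[seen[state]:]

# Hypothetical 3-gene toy network: g0 copies g1, g1 copies g0, g2 = g0 AND g1.
rules = [lambda s: s[1], lambda s: s[0], lambda s: s[0] and s[1]]
cycle = find_attractor((1, 0, 0), rules)
```

Because the state space is finite, every trajectory must eventually enter such a cycle; requiring inferred networks to reproduce observed attractors is one way limited time series data can still constrain the model.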
Series resonant converter with auxiliary winding turns: analysis, design and implementation
NASA Astrophysics Data System (ADS)
Lin, Bor-Ren
2018-05-01
Conventional series resonant converters have been researched and applied in high-efficiency power units owing to their low switching losses. Their main problems are wide frequency variation and high circulating current. As a result, resonant converters are limited to a narrow input voltage range, and a large input capacitor is normally adopted in commercial power units to meet the minimum hold-up time requirement when AC power is off. To overcome these problems, a resonant converter with auxiliary secondary windings is presented in this paper to achieve high voltage gain in low-input-voltage cases, such as during the hold-up time when utility power is off. Since the high voltage gain is used only in the low-input-voltage case, the frequency variation of the proposed converter is reduced compared to the conventional resonant converter. Compared to the conventional resonant converter, the hold-up time of the proposed converter is more than 40 ms. A larger magnetising inductance of the transformer is used to reduce circulating current losses. Finally, a laboratory prototype is constructed and experiments are provided to verify the converter performance.
Multifractality and Network Analysis of Phase Transition
Li, Wei; Yang, Chunbin; Han, Jihui; Su, Zhu; Zou, Yijiang
2017-01-01
Many models and real complex systems possess critical thresholds at which the systems shift dramatically from one sate to another. The discovery of early-warnings in the vicinity of critical points are of great importance to estimate how far the systems are away from the critical states. Multifractal Detrended Fluctuation analysis (MF-DFA) and visibility graph method have been employed to investigate the multifractal and geometrical properties of the magnetization time series of the two-dimensional Ising model. Multifractality of the time series near the critical point has been uncovered from the generalized Hurst exponents and singularity spectrum. Both long-term correlation and broad probability density function are identified to be the sources of multifractality. Heterogeneous nature of the networks constructed from magnetization time series have validated the fractal properties. Evolution of the topological quantities of the visibility graph, along with the variation of multifractality, serve as new early-warnings of phase transition. Those methods and results may provide new insights about the analysis of phase transition problems and can be used as early-warnings for a variety of complex systems. PMID:28107414
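The visibility graph construction used above can be sketched as follows. This is the natural visibility criterion of Lacasa et al.; the short example series is made up purely for illustration.

```python
import numpy as np

def natural_visibility_graph(x):
    """Adjacency matrix of the natural visibility graph of series x.

    Points (i, x[i]) and (j, x[j]) are linked if every intermediate
    point lies strictly below the straight line joining them.
    """
    n = len(x)
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                A[i, j] = A[j, i] = 1
    return A

# toy series: consecutive points are always mutually visible
x = np.array([3.0, 1.0, 2.0, 0.5, 4.0])
A = natural_visibility_graph(x)
```

Topological quantities of the resulting graph (degree distribution, clustering, and so on) are then tracked as the control parameter approaches the critical point.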
Davis, Lindsay E
2014-12-15
To utilize a skills-based workshop series to develop pharmacy students' drug information, writing, critical-thinking, and evaluation skills during the final didactic year of training. A workshop series was implemented to focus on written (researched) responses to drug information questions. These workshops used blinded peer-grading to facilitate timely feedback and strengthen assessment skills. Each workshop was aligned to the didactic coursework content to complement and extend learning, while bridging and advancing research, writing, and critical thinking skills. Attainment of knowledge and skills was assessed by rubric-facilitated peer grades, faculty member grading, peer critique, and faculty member-guided discussion of drug information responses. Annual instructor and course evaluations consistently revealed favorable student feedback regarding workshop value. A drug information workshop series using peer-grading as the primary assessment tool was successfully implemented and was well received by pharmacy students.
Predicting disease progression from short biomarker series using expert advice algorithm
NASA Astrophysics Data System (ADS)
Morino, Kai; Hirata, Yoshito; Tomioka, Ryota; Kashima, Hisashi; Yamanishi, Kenji; Hayashi, Norihiro; Egawa, Shin; Aihara, Kazuyuki
2015-05-01
Well-trained clinicians may be able to provide diagnosis and prognosis from very short biomarker series using information and experience gained from previous patients. Although mathematical methods can potentially help clinicians to predict the progression of diseases, no method so far estimates the patient state from a very short time series of a biomarker for making a diagnosis and/or prognosis by employing information from previous patients. Here, we propose a mathematical framework for integrating other patients' datasets to infer and predict the state of the disease in the current patient based on their short history. We extend the machine-learning framework of "prediction with expert advice" to deal with unstable dynamics. We construct this mathematical framework by combining expert advice with a mathematical model of prostate cancer. Our model predicted well the individual biomarker series of patients with prostate cancer used as clinical samples.
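The authors extend the "prediction with expert advice" framework; a minimal sketch of the standard exponentially weighted forecaster underlying such frameworks might look like the following. This is not the authors' exact algorithm, and the learning rate eta is an arbitrary illustrative choice.

```python
import numpy as np

def exp_weights_forecast(expert_preds, outcomes, eta=0.5):
    """Sequential prediction with expert advice via exponential weights.

    expert_preds: (T, K) array, expert k's prediction at each time t
                  (e.g. models fitted to previous patients).
    outcomes:     (T,) observed values for the current patient.
    Returns the forecaster's predictions (weighted averages of experts).
    """
    T, K = expert_preds.shape
    w = np.ones(K) / K
    preds = np.empty(T)
    for t in range(T):
        preds[t] = w @ expert_preds[t]          # aggregate the advice
        losses = (expert_preds[t] - outcomes[t]) ** 2
        w = w * np.exp(-eta * losses)           # down-weight bad experts
        w /= w.sum()
    return preds
```

Weight mass shifts quickly toward the experts (previous patients) whose trajectories best match the current patient's short history.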
Automatic construction of a recurrent neural network based classifier for vehicle passage detection
NASA Astrophysics Data System (ADS)
Burnaev, Evgeny; Koptelov, Ivan; Novikov, German; Khanipov, Timur
2017-03-01
Recurrent Neural Networks (RNNs) are extensively used for time-series modeling and prediction. We propose an approach for the automatic construction of a binary classifier based on Long Short-Term Memory RNNs (LSTM-RNNs) for detecting a vehicle's passage through a checkpoint. As input to the classifier we use multidimensional signals from various sensors installed at the checkpoint. The obtained results demonstrate that the previous approach to handcrafting a classifier, consisting of a set of deterministic rules, can be successfully replaced by automatic RNN training on appropriately labelled data.
TaiWan Ionospheric Model (TWIM) prediction based on time series autoregressive analysis
NASA Astrophysics Data System (ADS)
Tsai, L. C.; Macalalad, Ernest P.; Liu, C. H.
2014-10-01
As described in a previous paper, a three-dimensional ionospheric electron density (Ne) model, named the TaiWan Ionospheric Model (TWIM), has been constructed from vertical Ne profiles retrieved from FormoSat3/Constellation Observing System for Meteorology, Ionosphere, and Climate GPS radio occultation measurements and from worldwide ionosonde foF2 and foE data. The TWIM exhibits vertically fitted α-Chapman-type layers with distinct F2, F1, E, and D layers, and surface spherical harmonic expansions for the fitted layer parameters, including peak density, peak density height, and scale height. To improve the TWIM into a real-time model, we have developed a time series autoregressive model to forecast short-term TWIM coefficients. The time series of TWIM coefficients are considered as realizations of stationary stochastic processes within a processing window of 30 days. Their autocorrelation coefficients are used to derive the autoregressive parameters and then forecast the TWIM coefficients, based on the least squares method and the Lagrange multiplier technique. The forecast root-mean-square relative TWIM coefficient errors are generally <30% for 1 day predictions. The forecast TWIM foE and foF2 values are also compared and evaluated against worldwide ionosonde data.
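A minimal sketch of least-squares AR(p) fitting and forecasting of a coefficient time series, in the spirit of the short-term forecasting described above: the order p and the example series are illustrative, and this omits the paper's Lagrange multiplier step.

```python
import numpy as np

def ar_forecast(x, p=3, steps=1):
    """Fit an AR(p) model to x by least squares and forecast ahead.

    The series is de-meaned, lagged copies form the design matrix,
    and forecasts are iterated one step at a time.
    """
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    y = x - mu
    # column k holds lag k+1 of the series
    X = np.column_stack([y[p - k - 1:len(y) - k - 1] for k in range(p)])
    phi, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    hist = list(y)
    for _ in range(steps):
        hist.append(phi @ hist[-1:-p - 1:-1])  # newest lag first
    return np.array(hist[len(y):]) + mu
```

On a purely periodic series the fitted coefficients reproduce the recurrence exactly, so multi-step forecasts continue the cycle.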
Evaluation of COPD's diaphragm motion extracted from 4D-MRI
NASA Astrophysics Data System (ADS)
Swastika, Windra; Masuda, Yoshitada; Kawata, Naoko; Matsumoto, Koji; Suzuki, Toshio; Iesato, Ken; Tada, Yuji; Sugiura, Toshihiko; Tanabe, Nobuhiro; Tatsumi, Koichiro; Ohnishi, Takashi; Haneishi, Hideaki
2015-03-01
We have developed a method, called the intersection profile method, to construct 4D-MRI (3D + time) from time series of 2D-MRI. The basic idea is to find the best match between intersection profiles from a time series of 2D-MRI in the sagittal plane (navigator slice) and a time series of 2D-MRI in the coronal plane (data slice). In this study, we use 4D-MRI to semiautomatically extract the right diaphragm motion of 16 subjects (8 healthy subjects and 8 COPD patients). The diaphragm motion is then evaluated quantitatively by calculating and normalizing the displacement of each subject. We also generate phase-length maps to view and locate paradoxical motion in the COPD patients. The quantitative results for the normalized displacement show that COPD patients tend to have smaller displacements than healthy subjects: the average normalized displacement of the 8 COPD patients is 9.4 mm, compared with 15.3 mm for the 8 healthy volunteers. The generated phase-length maps show that not all COPD patients exhibit paradoxical motion; when paradoxical motion is present, however, the phase-length map is able to locate where it occurs.
Network Analyses for Space-Time High Frequency Wind Data
NASA Astrophysics Data System (ADS)
Laib, Mohamed; Kanevski, Mikhail
2017-04-01
Recently, network science has made important contributions to the analysis, modelling and visualization of complex time series. Numerous methods have been proposed for constructing networks. This work studies spatio-temporal wind data using networks based on the Granger causality test. Furthermore, a visual comparison is carried out across several data frequencies and different sizes of moving window. The main attention is paid to the temporal evolution of connectivity intensity. The Hurst exponent is applied to the resulting time series in order to explore whether there is long connectivity memory. The results reveal the space-time structure of the wind data, and the approach can be applied to other environmental data. The dataset used presents a challenging case study: it consists of high-frequency (10 minute) wind data from 120 measuring stations in Switzerland for the period 2012-2013. The distribution of stations covers different geomorphological zones and elevation levels. The results are also compared with a Pearson correlation network.
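A pairwise Granger causality test of the kind used to build such networks can be sketched as a restricted-versus-unrestricted regression F-test; the lag order here is an illustrative choice, and a full network would apply the test to every ordered station pair.

```python
import numpy as np
from scipy import stats

def granger_f_test(x, y, p=2):
    """F-test of 'x Granger-causes y' with lag order p.

    Compares an AR(p) model of y against one augmented with p lags
    of x. Returns (F, p_value); a small p_value suggests the lags of
    x improve the prediction of y.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(y) - p
    Yl = np.column_stack([y[p - k - 1:len(y) - k - 1] for k in range(p)])
    Xl = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    ones = np.ones((n, 1))
    target = y[p:]

    def rss(design):
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        r = target - design @ beta
        return r @ r

    rss_r = rss(np.hstack([ones, Yl]))        # restricted: y lags only
    rss_u = rss(np.hstack([ones, Yl, Xl]))    # unrestricted: + x lags
    df2 = n - 2 * p - 1
    F = ((rss_r - rss_u) / p) / (rss_u / df2)
    return F, 1 - stats.f.cdf(F, p, df2)
```

An edge x → y is drawn whenever the p-value falls below a chosen significance level within the current moving window.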
Predicting future forestland area: a comparison of econometric approaches.
SoEun Ahn; Andrew J. Plantinga; Ralph J. Alig
2000-01-01
Predictions of future forestland area are an important component of forest policy analyses. In this article, we test the ability of econometric land use models to accurately forecast forest area. We construct a panel data set for Alabama consisting of county-level time-series observations for the period 1964 to 1992. We estimate models using restricted data sets, namely,...
ERIC Educational Resources Information Center
Lee, Jinsol
2017-01-01
In the fall of 2012, a series of teacher union strikes in Chicago catalyzed controversial discussions in education within the political sector, as the goals for student achievement gained increasing attention. Hence, discourses as systems of representation within the particular context and time-period of the teacher union strikes in Chicago…
Creating Worlds, Constructing Meaning: The Scottish Storyline Method. Teacher to Teacher Series.
ERIC Educational Resources Information Center
Creswell, Jeff
The approach known as the Storyline Method was developed by a group of educators at Jordanhill College of Education in Glasgow (Scotland). The development of the Storyline Method took place over years, and the approach, with its simple framework of Storyline, key questions, and activities, has stood the test of time. Storyline uses the power of…
Subsidence Evaluation of High-Speed Railway in Shenyang Based on Time-Series Insar
NASA Astrophysics Data System (ADS)
Zhang, Yun; Wei, Lianhuan; Li, Jiayu; Liu, Shanjun; Mao, Yachun; Wu, Lixin
2018-04-01
More and more high-speed railways are under construction in China. Slow settlement along high-speed railway tracks and at newly built stations can lead to inhomogeneous deformation of the local area, and its accumulation may threaten the safe operation of the high-speed rail system. In this paper, surface deformation of a newly built high-speed railway station as well as the railway lines in the Shenyang region is retrieved by time series InSAR analysis using multi-orbit COSMO-SkyMed images. This paper focuses on the non-uniform subsidence caused by changes in the local environment along the railway. The accuracy of the settlement results is verified by cross validation of the results obtained from two different orbits during the same period.
Understanding multi-scale structural evolution in granular systems through gMEMS
NASA Astrophysics Data System (ADS)
Walker, David M.; Tordesillas, Antoinette
2013-06-01
We show how the rheological response of a material to applied loads can be systematically coded, analyzed and succinctly summarized, according to an individual grain's property (e.g. kinematics). Individual grains are considered as their own smart sensor akin to microelectromechanical systems (e.g. gyroscopes, accelerometers), each capable of recognizing their evolving role within self-organizing building block structures (e.g. contact cycles and force chains). A symbolic time series is used to represent their participation in such self-assembled building blocks and a complex network summarizing their interrelationship with other grains is constructed. In particular, relationships between grain time series are determined according to the information theory Hamming distance or the metric Euclidean distance. We then use topological distance to find network communities enabling groups of grains at remote physical metric distances in the material to share a classification. In essence grains with similar structural and functional roles at different scales are identified together. This taxonomy distills the dissipative structural rearrangements of grains down to its essential features and thus provides pointers for objective physics-based internal variable formalisms used in the construction of robust predictive continuum models.
Construction and Analysis of Long-Term Surface Temperature Dataset in Fujian Province
NASA Astrophysics Data System (ADS)
Li, W. E.; Wang, X. Q.; Su, H.
2017-09-01
Land surface temperature (LST) is a key parameter of land surface physical processes on global and regional scales, linking the heat fluxes and interactions between the ground and atmosphere. Based on MODIS 8-day LST products (MOD11A2) from the split-window algorithms, we constructed monthly and annual LST datasets for Fujian Province from 2000 to 2015. We then analyzed the monthly and yearly LST time series and further investigated the LST distribution and its evolution. The average LST of Fujian Province reaches its highest in July and its lowest in January. The monthly and annual LST time series present significant periodic features (annual and interannual) from 2000 to 2015. The spatial distribution shows that LST in the north and west of Fujian Province is lower than in the south and east. With the rapid development and urbanization of the coastal area of Fujian Province, LST in coastal urban regions is significantly higher than in mountainous rural regions. The LST distribution may be affected by climate, topography and land cover types. The spatio-temporal distribution characteristics of LST could provide good references for agricultural layout and environmental monitoring in Fujian Province.
The Decline in Diffuse Support for National Politics
Jennings, Will; Clarke, Nick; Moss, Jonathan; Stoker, Gerry
2017-01-01
Abstract This research note considers how to track long-term trajectories of political discontent in Britain. Many accounts are confined to using either survey data drawn from recent decades or imperfect behavioral measures such as voting or party membership as indicators of political disengagement. We instead develop an approach that provides the long view on political disaffection. We first consider time-series data available from repeated survey measures. We next replicate historic survey questions to observe change in public opinion relative to earlier points in time. Finally, we use Stimson’s (1991) dyad-ratios algorithm to construct an over-time index of political discontent that combines data from multiple poll series. This reveals rising levels of political discontent for both specific and diffuse measures of mass opinion. Our method and findings offer insights into the rising tide of disillusionment afflicting many contemporary democracies. PMID:29731522
NASA Technical Reports Server (NTRS)
Struzenberg, L. L.; West, J. S.
2011-01-01
This paper describes the use of targeted Loci/CHEM CFD simulations to evaluate the effects of a dual-engine first-stage hot-fire test on an evolving integrated launch pad/test article design. This effort was undertaken as part of the NESC Independent Assessment of the Taurus II Stage Test Series. The underlying conceptual model included development of a series of computational models and simulations to analyze the plume-induced environments on the pad, facility structures and test article. A pathfinder simulation was first developed, capable of providing quick-turnaround evaluation of plume impingement pressures on the flame deflector. Results from this simulation were available in time to provide data for an ongoing structural assessment of the deflector. The resulting recommendation was available in a timely manner and was incorporated into the construction schedule for the new launch stand under construction at Wallops Flight Facility. A series of Reynolds-Averaged Navier-Stokes (RANS) quasi-steady simulations representative of key elements of the test profile was performed to identify potential concerns with the test configuration and test profile. As required, unsteady hybrid-RANS/LES simulations were performed to provide additional insight into critical aspects of the test sequence. Modifications to the test-specific hardware and to the thermal protection of facility structures, as well as modifications to the planned hot-fire test profile, were implemented based on these simulation results.
NASA Astrophysics Data System (ADS)
Mosier, T. M.; Hill, D. F.; Sharp, K. V.
2013-12-01
High spatial resolution time-series data are critical for many hydrological and earth science studies. Multiple groups have developed historical and forecast datasets of high-resolution monthly time series for regions of the world such as the United States (e.g. PRISM for hindcast data and MACA for long-term forecasts); however, analogous datasets have not been available for most data-scarce regions. The current work fills this data need by producing and freely distributing hindcast and forecast time-series datasets of monthly precipitation and mean temperature for all global land surfaces, gridded at a 30 arc-second resolution. The hindcast data are constructed through a Delta downscaling method, using as inputs the 0.5 degree monthly time-series and 30 arc-second climatology global weather datasets developed by Willmott & Matsuura and WorldClim, respectively. The forecast data are formulated using a similar downscaling method, but with an additional step to remove bias from the climate variable's probability distribution over each region of interest. The downscaling package is designed to be compatible with a number of general circulation models (GCMs), e.g. those developed for the IPCC AR4 report and CMIP5, and is presently implemented using time-series data from the NCAR CESM1 model in conjunction with 30 arc-second future decadal climatologies distributed by the Consultative Group on International Agricultural Research. The resulting downscaled datasets are 30 arc-second time-series forecasts of monthly precipitation and mean temperature available for all global land areas. As an example of these data, historical and forecast 30 arc-second monthly time series from 1950 through 2070 are created and analyzed for the region encompassing Pakistan. For this case study, forecast datasets corresponding to the future Representative Concentration Pathway (RCP) 4.5 and 8.5 scenarios developed by the IPCC are presented and compared. This exercise highlights a range of potential meteorological trends for the Pakistan region and more broadly serves to demonstrate the utility of the presented 30 arc-second monthly precipitation and mean temperature datasets for use in data-scarce regions.
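The Delta downscaling step described above, in which a coarse-grid anomaly is combined with a high-resolution climatology, can be sketched as follows. Nearest-neighbour upsampling via np.kron stands in for proper spatial interpolation, and the toy grid sizes are illustrative.

```python
import numpy as np

def delta_downscale(coarse_month, coarse_clim, fine_clim, additive=True):
    """Delta-method downscaling of one month of data.

    coarse_month: coarse-grid field for the target month.
    coarse_clim:  coarse-grid climatology for that calendar month.
    fine_clim:    high-resolution climatology for that month.
    Temperature uses additive anomalies; precipitation typically
    uses multiplicative ratios.
    """
    factor = fine_clim.shape[0] // coarse_month.shape[0]
    up = lambda a: np.kron(a, np.ones((factor, factor)))  # crude upsample
    if additive:                       # temperature: additive anomaly
        return fine_clim + up(coarse_month - coarse_clim)
    ratio = coarse_month / np.maximum(coarse_clim, 1e-9)
    return fine_clim * up(ratio)       # precipitation: multiplicative
```

The fine-scale spatial pattern thus comes entirely from the high-resolution climatology, while the coarse model supplies the temporal anomaly.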
Improving Photometry and Stellar Signal Preservation with Pixel-Level Systematic Error Correction
NASA Technical Reports Server (NTRS)
Kolodzijczak, Jeffrey J.; Smith, Jeffrey C.; Jenkins, Jon M.
2013-01-01
The Kepler Mission has demonstrated that excellent stellar photometric performance can be achieved using apertures constructed from optimally selected CCD pixels. The clever methods used to correct for systematic errors, while very successful, still have some limitations in their ability to extract long-term trends in stellar flux. They also leave poorly correlated bias sources, such as drifting moiré pattern, uncorrected. We illustrate several approaches in which applying systematic error correction algorithms to the pixel time series, rather than to the co-added raw flux time series, provides significant advantages. Examples include spatially localized determination of time-varying moiré pattern biases, greater sensitivity to radiation-induced pixel sensitivity drops (SPSDs), improved precision of co-trending basis vectors (CBVs), and a means of distinguishing stellar variability from co-trending terms even when they are correlated. For the last item, the approach enables physical interpretation of appropriately scaled coefficients, derived in the fit of pixel time series to the CBVs, as linear combinations of various spatial derivatives of the pixel response function (PRF). We demonstrate that the residuals of a fit of the so-derived pixel coefficients to various PRF-related components can be deterministically interpreted in terms of physically meaningful quantities, such as the component of the stellar flux time series that is correlated with the CBVs, as well as relative pixel gain, proper motion and parallax. The approach also enables us to parameterize and assess the limiting factors in the uncertainties of these quantities.
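The fit of a pixel time series to cotrending basis vectors that the abstract builds on can be sketched as an ordinary least-squares projection. The basis vectors below are synthetic stand-ins, not Kepler's actual CBVs.

```python
import numpy as np

def fit_to_cbvs(pixel_ts, cbvs):
    """Least-squares fit of one pixel time series to cotrending basis
    vectors (plus a constant); returns (coefficients, residual series).

    pixel_ts: (T,) flux values of one pixel.
    cbvs:     (T, m) basis vectors, one per column.
    """
    T = len(pixel_ts)
    design = np.column_stack([np.ones(T), cbvs])   # constant + CBVs
    coef, *_ = np.linalg.lstsq(design, pixel_ts, rcond=None)
    residual = pixel_ts - design @ coef
    return coef, residual
```

The per-pixel coefficients from such fits are what the paper interprets as linear combinations of spatial derivatives of the PRF.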
[The construction of life profiles by social class in Chile].
Torres, C
1989-11-01
The author develops a series of life profiles for men and women living in the Greater Santiago area of Chile over the past 25 years. These profiles, which are based on the concept of life expectancy at birth, illustrate the length of time individuals take to go through such life cycle stages as education, employment, unemployment, and retirement. The concept is used to analyze changes in the life profile over time and how these differ by class. (SUMMARY IN ENG)
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-31
... a novel or unusual design feature associated with aluminum-lithium fuselage construction that may... of the Bombardier C-series airplanes will be fabricated using aluminum-lithium construction. Structure fabricated from aluminum-lithium may provide different levels of protection from post-crash fuel...
Dean, Roger T; Dunsmuir, William T M
2016-06-01
Many articles on perception, performance, psychophysiology, and neuroscience seek to relate pairs of time series through assessments of their cross-correlations. Most such series are individually autocorrelated: they do not comprise independent values. Given this situation, an unfounded reliance is often placed on cross-correlation as an indicator of relationships (e.g., referent vs. response, leading vs. following). Such cross-correlations can indicate spurious relationships, because of autocorrelation. Given these dangers, we here simulated how and why such spurious conclusions can arise, to provide an approach to resolving them. We show that when multiple pairs of series are aggregated in several different ways for a cross-correlation analysis, problems remain. Finally, even a genuine cross-correlation function does not answer key motivating questions, such as whether there are likely causal relationships between the series. Thus, we illustrate how to obtain a transfer function describing such relationships, informed by any genuine cross-correlations. We illustrate the confounds and the meaningful transfer functions by two concrete examples, one each in perception and performance, together with key elements of the R software code needed. The approach involves autocorrelation functions, the establishment of stationarity, prewhitening, the determination of cross-correlation functions, the assessment of Granger causality, and autoregressive model development. Autocorrelation also limits the interpretability of other measures of possible relationships between pairs of time series, such as mutual information. We emphasize that further complexity may be required as the appropriate analysis is pursued fully, and that causal intervention experiments will likely also be needed.
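A minimal sketch of the prewhitening step described above, assuming for simplicity that an AR(1) filter estimated from the first series is adequate; a real analysis would select the AR order, verify residual whiteness, and assess significance bounds.

```python
import numpy as np

def prewhitened_ccf(x, y, max_lag=5):
    """Cross-correlation of two series after prewhitening.

    An AR(1) coefficient is fitted to x, then BOTH series are filtered
    with it before the cross-correlation function is computed. This
    guards against spurious cross-correlation driven by autocorrelation.
    """
    x = np.asarray(x, float) - np.mean(x)
    y = np.asarray(y, float) - np.mean(y)
    phi = (x[1:] @ x[:-1]) / (x[:-1] @ x[:-1])   # AR(1) coefficient of x
    ex = x[1:] - phi * x[:-1]                    # prewhitened x
    ey = y[1:] - phi * y[:-1]                    # same filter applied to y
    ex = (ex - ex.mean()) / ex.std()
    ey = (ey - ey.mean()) / ey.std()
    n = len(ex)
    ccf = {}
    for k in range(-max_lag, max_lag + 1):
        if k >= 0:
            ccf[k] = (ex[:n - k] @ ey[k:]) / n   # positive k: x leads y
        else:
            ccf[k] = (ex[-k:] @ ey[:n + k]) / n
    return ccf
```

Genuine lead-lag structure survives the filtering, whereas correlation induced purely by shared autocorrelation is suppressed.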
A Bayesian CUSUM plot: Diagnosing quality of treatment.
Rosthøj, Steen; Jacobsen, Rikke-Line
2017-12-01
To present a CUSUM plot based on Bayesian diagnostic reasoning, displaying evidence in favour of "healthy" rather than "sick" quality of treatment (QOT), and to demonstrate a technique using Kaplan-Meier survival curves that permits application to case series with ongoing follow-up. For a case series with known final outcomes: consider each case a diagnostic test of good versus poor QOT (expected vs. increased failure rates), determine the likelihood ratio (LR) of the observed outcome, convert the LR to a weight by taking its logarithm to base 2, and add up the weights sequentially in a plot showing how many times the odds in favour of good QOT have been doubled. For a series with observed survival times and an expected survival curve: divide the curve into time intervals, determine "healthy" and specify "sick" risks of failure in each interval, construct a "sick" survival curve, determine the LR of survival or failure at the given observation times, convert to weights, and add up. The Bayesian plot was applied retrospectively to 39 children with acute lymphoblastic leukaemia with completed follow-up, using Nordic collaborative results as reference, showing equal odds between good and poor QOT. In the ongoing treatment trial, with 22 of 37 children still at risk of an event, QOT has been monitored with average survival curves as reference, with the odds so far favouring good QOT 2:1. QOT in small patient series can be assessed with a Bayesian CUSUM plot, retrospectively when all treatment outcomes are known, but also in ongoing series with unfinished follow-up. © 2017 John Wiley & Sons, Ltd.
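The retrospective version of the plot, for a series with known outcomes and fixed "healthy" and "sick" failure probabilities, can be sketched as follows; the probabilities in the example are illustrative, not those of the leukaemia series.

```python
import math

def bayesian_cusum(outcomes, p_good, p_bad):
    """Cumulative log2 likelihood-ratio path for a case series.

    outcomes: sequence of 0 (success) / 1 (failure).
    p_good, p_bad: failure probability under good / poor QOT.
    Each case contributes log2 of the likelihood ratio of its outcome,
    so the running sum is the number of doublings of the odds in
    favour of good QOT.
    """
    path, total = [], 0.0
    for fail in outcomes:
        if fail:
            lr = p_good / p_bad              # failure: evidence against good QOT
        else:
            lr = (1 - p_good) / (1 - p_bad)  # success: evidence for good QOT
        total += math.log2(lr)
        path.append(total)
    return path
```

With p_good = 0.1 and p_bad = 0.2, each success adds about +0.17 doublings and each failure subtracts exactly one, so the path drifts up under good QOT and down under poor QOT.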
Fractional Brownian motion time-changed by gamma and inverse gamma process
NASA Astrophysics Data System (ADS)
Kumar, A.; Wyłomańska, A.; Połoczański, R.; Sundar, S.
2017-02-01
Many real time series exhibit behavior consistent with long-range-dependent data. Additionally, very often these time series have periods of constant values and have characteristics similar to Gaussian processes, although they are not Gaussian. There is therefore a need to consider new classes of systems to model these kinds of empirical behavior. Motivated by this fact, in this paper we analyze two processes which exhibit the long-range dependence property and have additional interesting characteristics that may be observed in real phenomena. Both are constructed as the superposition of fractional Brownian motion (FBM) and another process. In the first case the internal process, which plays the role of time, is the gamma process, while in the second case the internal process is its inverse. We present their main properties in detail, paying particular attention to long-range dependence. Moreover, we show how to simulate these processes and estimate their parameters. We propose a novel method based on the rescaled modified cumulative distribution function for estimating the parameters of the second process. This method is very useful for describing rounded data, such as waiting times of subordinated processes delayed by inverse subordinators. Using the Monte Carlo method, we show the effectiveness of the proposed estimation procedures. Finally, we present applications of the proposed models to real time series.
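A simulation sketch of the first construction, FBM subordinated by a gamma process: FBM is sampled at the random clock times by a Cholesky factorization of its covariance (fine for short series; the parameter values are illustrative, and this is not the authors' exact simulation scheme).

```python
import numpy as np

def fbm_at_times(t, H, rng=None):
    """Sample fractional Brownian motion at given increasing times t > 0
    using a Cholesky factorization of the FBM covariance."""
    rng = rng or np.random.default_rng()
    cov = 0.5 * (t[:, None] ** (2 * H) + t[None, :] ** (2 * H)
                 - np.abs(t[:, None] - t[None, :]) ** (2 * H))
    L = np.linalg.cholesky(cov + 1e-9 * np.eye(len(t)))  # jitter for stability
    return L @ rng.standard_normal(len(t))

def gamma_time_changed_fbm(n, H, shape=1.0, scale=1.0, rng=None):
    """Observe B_H(G_1), ..., B_H(G_n), where the random clock G is the
    running sum of i.i.d. gamma increments (a discretized gamma process)."""
    rng = rng or np.random.default_rng()
    G = np.cumsum(rng.gamma(shape, scale, size=n))  # strictly increasing clock
    return G, fbm_at_times(G, H, rng=rng)
```

The inverse-subordinated variant would instead hold the clock constant between gamma jumps, which produces the flat periods mentioned above.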
Forecasting Construction Cost Index based on visibility graph: A network approach
NASA Astrophysics Data System (ADS)
Zhang, Rong; Ashuri, Baabak; Shyr, Yu; Deng, Yong
2018-03-01
Engineering News-Record (ENR), a professional magazine in the field of global construction engineering, publishes the Construction Cost Index (CCI) every month. Cost estimators and contractors assess projects, arrange budgets and prepare bids by forecasting CCI. However, fluctuations and uncertainties in CCI lead to irrational estimations now and then. This paper aims at achieving more accurate predictions of CCI using a network approach in which the time series is first converted into a visibility graph and future values are forecast via link prediction. According to the experimental results, the proposed method shows satisfactory performance, with acceptable error measures. Compared with other methods, the proposed method is easier to implement and forecasts CCI with smaller errors. The proposed method is thus efficient at providing considerably accurate CCI predictions, which will contribute to construction engineering by assisting individuals and organizations in reducing costs and making project schedules.
Space architecture monograph series. Volume 4: Genesis 2: Advanced lunar outpost
NASA Technical Reports Server (NTRS)
Fieber, Joseph P.; Huebner-Moths, Janis; Paruleski, Kerry L.; Moore, Gary T. (Editor)
1991-01-01
This research and design study investigated advanced lunar habitats for astronauts and mission specialists on the Earth's moon. Design recommendations are based on environmental response to the lunar environment, human habitability (human factors and environmental behavior research), transportability (a structural and materials system with least mass), constructability (minimizing extravehicular time), construction dependability and resilience, and suitability for NASA lunar research missions in the 21st century. The recommended design uses lunar lava tubes, with construction being a combination of Space Station Freedom-derived hard modules and lightweight Kevlar-laminate inflatable structures. The proposed habitat includes research labs and a biotron, crew quarters and a crew support facility, mission control, a health maintenance facility, maintenance work areas, and areas for psychological retreat, privacy, and contemplation. Furniture, specialized equipment, and lighting are included in the analysis and design. Drawings include base master plans, construction sequencing, overall architectural configuration, detailed floor plans, sections and axonometrics, with interior perspectives.
Chen, Yushun; Viadero, Roger C; Wei, Xinchao; Fortney, Ronald; Hedrick, Lara B; Welsh, Stuart A; Anderson, James T; Lin, Lian-Shin
2009-01-01
Refining best management practices (BMPs) for future highway construction depends on a comprehensive understanding of environmental impacts from current construction methods. Based on a before-after-control-impact (BACI) experimental design, long-term stream monitoring (1997-2006) was conducted at upstream (control, n = 3) and downstream (impact, n = 6) sites in the Lost River watershed of the Mid-Atlantic Highlands region, West Virginia. Monitoring data were analyzed to assess impacts during and after highway construction on 15 water quality parameters and on macroinvertebrate condition using the West Virginia stream condition index (WVSCI). Principal components analysis (PCA) identified regional primary water quality variances, and paired t tests and time series analysis detected seven construction-impacted water quality parameters, which were mainly associated with the second principal component. In particular, impacts on turbidity, total suspended solids, and total iron during construction, impacts on chloride and sulfate during and after construction, and impacts on acidity and nitrate after construction were observed at the downstream sites. The construction had statistically significant impacts on macroinvertebrate index scores (i.e., WVSCI) after construction, but did not change the overall good biological condition. Implementing BMPs that address those construction-impacted water quality parameters can be an effective mitigation strategy for future highway construction in this highlands region.
Data Rescue for precipitation station network in Slovak Republic
NASA Astrophysics Data System (ADS)
Fasko, Pavel; Bochníček, Oliver; Švec, Marek; Paľušová, Zuzana; Markovič, Ladislav
2016-04-01
Transparency of archive catalogues is essential for data rescue and supports further activities such as digitization and homogenization. In the Slovak Republic, visualization of the continuity of time series from precipitation stations (approximately 1250 stations) since the beginning of observations is currently under way (meteorological stations gradually began operating during the second half of the 19th century in Slovakia). The visualization is combined with activities such as verification and provision of access to the data listed in the archive catalogue, localization of stations according to historical yearbooks, conversion of coordinates into x-JTSK, y-JTSK, and assignment to hydrological catchments. Clustering precipitation stations by hydrological catchment on a map and visualizing record duration (line graphs) will enable effective selection of corresponding precipitation stations for the extension of time series. This process should be followed by break and trend detection and homogenization. The risks and problems encountered in verifying records from archive catalogues, and in their digitization, repair, and visualization, are presented in the poster. While searching through the historical and often short time series, we recognized the particular importance of stations located at middle and higher altitudes: they might replace the fictitious points used until now in the construction of precipitation maps. Supplementing and extending the time series of individual stations will make it possible to follow changes in precipitation totals over a given period, as well as areal totals for individual catchments in various time periods, of value mainly to hydrologists and agro-climatologists.
NASA Technical Reports Server (NTRS)
Chelton, Dudley B.; Schlax, Michael G.
1991-01-01
A formalism is developed to quantify the sampling error of an arbitrary linear estimate of a time-averaged quantity constructed from a time series of irregularly spaced observations at a fixed location. The method is applied to satellite observations of chlorophyll from the coastal zone color scanner. The two specific linear estimates under consideration are the composite average, formed from the simple average of all observations within the averaging period, and the optimal estimate, formed by minimizing the mean squared error of the temporal average based on all the observations in the time series. The resulting suboptimal estimates are shown to be more accurate than composite averages. Suboptimal estimates are also found to be nearly as accurate as optimal estimates using the correct signal and measurement-error variances and correlation functions for realistic ranges of these parameters, which makes them a viable practical alternative to the composite-average method generally employed at present.
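The optimal linear estimate referred to above is the standard Gauss-Markov solution: given the covariance matrix among the observations and the covariance vector between each observation and the target time average, the minimum-mean-squared-error weights follow from a linear solve. A minimal sketch with illustrative (not observed) covariance values:

```python
import numpy as np

def optimal_weights(C, c):
    """Weights minimizing the mean squared error of a linear estimate.

    C : covariance matrix among the observations (signal + noise)
    c : covariance vector between each observation and the target
        time-averaged quantity
    The estimate of the average is then w @ data.
    """
    return np.linalg.solve(C, c)

# two observations with signal variance 1, noise variance 0.5, mutual
# covariance 0.6, each correlated 0.8 with the target (toy numbers)
C = np.array([[1.5, 0.6],
              [0.6, 1.5]])
c = np.array([0.8, 0.8])
w = optimal_weights(C, c)
# the symmetric problem yields equal weights for both observations
```

The composite average corresponds to fixed equal weights 1/n regardless of the covariances, which is why it is generally less accurate.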
Inferring the interplay between network structure and market effects in Bitcoin
NASA Astrophysics Data System (ADS)
Kondor, Dániel; Csabai, István; Szüle, János; Pósfai, Márton; Vattay, Gábor
2014-12-01
A main focus in economics research is understanding the time series of prices of goods and assets. While statistical models using only the properties of the time series itself have been successful in many aspects, we expect to gain a better understanding of the phenomena involved if we can model the underlying system of interacting agents. In this article, we consider the history of Bitcoin, a novel digital currency system, for which the complete list of transactions is available for analysis. Using this dataset, we reconstruct the transaction network between users and analyze changes in the structure of the subgraph induced by the most active users. Our approach is based on the unsupervised identification of important features of the time variation of the network. Applying the widely used method of Principal Component Analysis to the matrix constructed from snapshots of the network at different times, we are able to show how structural changes in the network accompany significant changes in the exchange price of bitcoins.
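The PCA step described above, applied to a matrix whose rows are network snapshots at different times, can be sketched as follows. The random adjacency matrices here are a toy stand-in for the Bitcoin transaction subgraphs, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy stand-in for the transaction network: 10 snapshots of a 6-node
# graph, with each flattened adjacency matrix forming one row
snapshots = rng.random((10, 6, 6)) < 0.3
X = snapshots.reshape(10, -1).astype(float)

# PCA via SVD of the column-centered snapshot matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T              # snapshot coordinates in PC space
explained = s**2 / np.sum(s**2)  # fraction of variance per component
```

Tracking the leading scores over time is one way to detect the structural changes that accompany price moves; the specific features used in the paper are not reproduced here.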
Zapadlo, Michal; Krupcík, Ján; Májek, Pavel; Armstrong, Daniel W; Sandra, Pat
2010-09-10
The orthogonality of three columns coupled in two series was studied for the congener-specific comprehensive two-dimensional GC separation of polychlorinated biphenyls (PCBs). A non-polar capillary column coated with poly(5%-phenyl-95%-methyl)siloxane was used as the first ((1)D) column in both series. A polar capillary column coated with 70% cyanopropyl-polysilphenylene-siloxane or a capillary column coated with the ionic liquid 1,12-di(tripropylphosphonium)dodecane bis(trifluoromethane-sulfonyl)imide was used as the second ((2)D) column. Nine multi-congener standard PCB solutions containing subsets of all 209 native PCBs, a mixture of all 209 PCBs, and Aroclor 1242 and 1260 formulations were used to study the orthogonality of both column series. Retention times of the corresponding PCB congeners on the (1)D and (2)D columns were used to construct retention time dependences (apex plots) for assessing the orthogonality of the coupled columns. For a visual assessment of the peak density of PCB congeners on the retention plane, 2D images were compared. In addition to the visual assessment of the distribution of PCBs on the retention plane, the degree of orthogonality of both column series was evaluated by Pearson's correlation coefficient, obtained by correlating the retention times t(R,i,2D) and t(R,i,1D) of corresponding PCB congeners on both column series. It was demonstrated that the apolar + ionic liquid column series is almost orthogonal both for the 2D separation of PCBs present in the Aroclor 1242 and 1260 formulations and for the separation of all 209 PCBs. All toxic, dioxin-like PCBs, with the exception of PCB 118, which overlaps with PCB 106, were resolved by the apolar/ionic liquid series, while on the apolar/polar column series three toxic PCBs overlapped with other congeners (105+127, 81+148 and 118+106).
Constructing Weyl group multiple Dirichlet series
NASA Astrophysics Data System (ADS)
Chinta, Gautam; Gunnells, Paul E.
2010-01-01
Let $\Phi$ be a reduced root system of rank $r$. A Weyl group multiple Dirichlet series for $\Phi$ is a Dirichlet series in $r$ complex variables $s_1,\dots,s_r$, initially converging for $\mathrm{Re}(s_i)$ sufficiently large, that has meromorphic continuation to $\mathbb{C}^r$ and satisfies functional equations under the transformations of $\mathbb{C}^r$ corresponding to the Weyl group of $\Phi$. A heuristic definition of such a series was given by Brubaker, Bump, Chinta, Friedberg, and Hoffstein, and such series have been investigated in certain special cases by others. In this paper we generalize results of Chinta and Gunnells to construct Weyl group multiple Dirichlet series by a uniform method and show in all cases that they have the expected properties.
User's manual for the Graphical Constituent Loading Analysis System (GCLAS)
Koltun, G.F.; Eberle, Michael; Gray, J.R.; Glysson, G.D.
2006-01-01
This manual describes the Graphical Constituent Loading Analysis System (GCLAS), an interactive cross-platform program for computing the mass (load) and average concentration of a constituent that is transported in stream water over a period of time. GCLAS computes loads as a function of an equal-interval streamflow time series and an equal- or unequal-interval time series of constituent concentrations. The constituent-concentration time series may be composed of measured concentrations or a combination of measured and estimated concentrations. GCLAS is not intended for use in situations where concentration data (or an appropriate surrogate) are collected infrequently or where an appreciable number of the concentration values are censored. It is assumed that the constituent-concentration time series used by GCLAS adequately represents the true time-varying concentration. Commonly, measured constituent concentrations are collected at a frequency that is less than ideal (from a load-computation standpoint), so estimated concentrations must be inserted in the time series to better approximate the expected chemograph. GCLAS provides tools to facilitate estimation and entry of instantaneous concentrations for that purpose. Water-quality samples collected for load computation frequently are collected in a single vertical or at a single point in a stream cross section. Several factors, some of which may vary as a function of time and (or) streamflow, can affect whether the sample concentrations are representative of the mean concentration in the cross section. GCLAS provides tools to aid the analyst in assessing whether concentrations in samples collected in a single vertical or at a single point in a stream cross section exhibit systematic bias with respect to the mean concentrations. In cases where bias is evident, the analyst can construct coefficient relations in GCLAS to reduce or eliminate the observed bias.
GCLAS can export load and concentration data in formats suitable for entry into the U.S. Geological Survey's National Water Information System. GCLAS can also import and export data in formats that are compatible with various commonly used spreadsheet and statistics programs.
Assessment and maintenance of a 15 year old stress-laminated timber bridge
T. Russell Gentry; Karl N. Brohammer; John Wells; James P. Wacker
2006-01-01
A timber bridge consisting of three 6.7-meter spans with a stress-laminated deck was constructed in 1991 in the Spirit Creek State Forest near Augusta, Georgia, USA. The stress-laminated bridge uses a series of post-tensioning bars to hold the laminations together. The bridge remained in service until 2001 with no maintenance, at which time the bridge was inspected,...
ERIC Educational Resources Information Center
Ginevra, Maria Cristina; Di Maggio, Ilaria; Nota, Laura; Soresi, Salvatore
2017-01-01
A career intervention based on life design approach was devised for a group of young adults at risk for the process of career construction. It was aimed at fostering a series of resources useful to cope with career transitions, to encourage reflection on the future, to identify one's own strengths, and to plan future projects. Results of the study…
A strategy for Local Surface Stability Monitoring Using SAR Imagery
NASA Astrophysics Data System (ADS)
Kim, J.; Lan, C. W.; Lin, S. Y.; vanGasselt, S.; Yun, H.
2017-12-01
To provide sufficient facilities for a growing number of residents, many construction and maintenance projects for infrastructure and buildings are now under way above and below the surface of urban areas. In some cases, disasters have occurred when development proceeded on unknown or geologically unstable ground or in over-developed areas. To avoid such damage, it is essential to run a regular monitoring scheme that characterizes ground stability across the whole urban area. Through long-term monitoring, we first aim to observe surface stability over construction sites. Second, we propose to implement automatic extraction and tracking of suspicious unstable areas. To achieve this, we used 12-day-interval C-band Sentinel-1A Synthetic Aperture Radar (SAR) images as the main source for regular monitoring. The Differential Interferometric SAR (D-InSAR) technique was applied to generate interferograms, and as updated Sentinel-1A SAR images accumulated, time series of interferograms were formed accordingly. For observing surface stability over known construction sites, the interferograms and unwrapped products can be used to identify surface displacement occurring before and after specific events. In addition, Small Baseline Subset (SBAS) and Permanent Scatterers (PS) approaches combining a set of unwrapped D-InSAR interferograms were applied to derive displacement velocities over long periods. For some cases, we conducted ascending- and descending-mode time series analysis to decompose the three surface motion vectors and precisely identify the risk pattern. For the extraction of suspicious unstable areas, we propose to develop an automatic pattern recognition algorithm to identify specific fringe patterns associated with various potential risks.
The detected fringes were tracked in the time series interferograms and overlapped with various GIS layers to find correlations with the environmental elements causing the risks. Taipei City and Taichung City located in northern Taiwan and Ulsan City in Korea were selected to demonstrate the feasibility of the proposed method.
NASA Astrophysics Data System (ADS)
Niazmardi, S.; Safari, A.; Homayouni, S.
2017-09-01
Crop mapping through classification of Satellite Image Time-Series (SITS) data can provide very valuable information for several agricultural applications, such as crop monitoring, yield estimation, and crop inventory. However, SITS classification is not straightforward, because different images in a SITS dataset carry different levels of information about the classification problem, and because SITS data are four-dimensional and cannot be classified using conventional classification algorithms. To address these issues, we present in this paper a classification strategy based on Multiple Kernel Learning (MKL) algorithms for SITS data. In this strategy, kernels are first constructed from the individual images of the SITS data and are then combined into a composite kernel using an MKL algorithm. Once constructed, the composite kernel can be used to classify the data with kernel-based classification algorithms. We compared the computational time and classification performance of the proposed strategy using different MKL algorithms for the purpose of crop mapping: MKL-Sum, SimpleMKL, LPMKL, and Group-Lasso MKL. Experimental tests of the proposed strategy on two SITS datasets, acquired by SPOT satellite sensors, showed that the strategy provides better performance than the standard classification algorithm, and that the optimization method of the MKL algorithm affects both the computational time and the classification accuracy of the strategy.
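The composite-kernel idea above reduces to a convex combination of base kernels, one per acquisition. In full MKL the weights are learned; the sketch below fixes them for illustration, and the random arrays merely stand in for per-image pixel features:

```python
import numpy as np

def rbf_kernel(X, gamma):
    """Gaussian RBF kernel matrix for rows of X."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def composite_kernel(images, weights, gamma=1.0):
    """Convex combination of base kernels, one per image of the series.

    images : list of (n_samples, n_bands) arrays (one per acquisition)
    weights: non-negative weights, normalized to sum to one here
             (learned by the MKL algorithm in the actual method)
    """
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    return sum(w * rbf_kernel(img, gamma) for w, img in zip(weights, images))

rng = np.random.default_rng(1)
images = [rng.normal(size=(8, 4)) for _ in range(3)]  # 3 acquisitions
K = composite_kernel(images, [0.5, 0.3, 0.2])
```

Because each base kernel is positive semi-definite and the weights are non-negative, the composite kernel is a valid kernel and can be handed to any kernel-based classifier.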
Wang, Xuanwen; Dong, Xiuwen Sue; Choi, Sang D; Dement, John
2017-05-01
This study examined trends and patterns of work-related musculoskeletal disorders (WMSDs) among construction workers in the USA, with an emphasis on older workers. WMSDs were identified from the 1992-2014 Survey of Occupational Injuries and Illnesses (SOII), and employment was estimated from the Current Population Survey (CPS). Risk of WMSDs was measured as the number of WMSDs per 10 000 full-time equivalent workers and stratified by major demographic and employment subgroups. Time series analysis was performed to examine the trend of WMSDs in construction. The number of WMSDs dropped significantly in the US construction industry, following the overall injury trends. However, the rate of WMSDs in construction remained higher than in all industries combined; the median days away from work increased from 8 days in 1992 to 13 days in 2014, and the proportion of WMSDs for construction workers aged 55 to 64 years almost doubled. By occupation, construction labourers had the largest number of WMSD cases, while helpers, heating and air-conditioning mechanics, cement masons and sheet metal workers had the highest rates of WMSDs. The major cause of WMSDs in construction was overexertion, and back injuries accounted for more than 40% of WMSDs among construction workers. The estimated wage loss for private wage-and-salary construction workers was $46 million in 2014. Construction workers continue to face a higher risk of WMSDs. Ergonomic solutions that reduce overexertion, the primary exposure for WMSDs, should be adopted extensively at construction sites, particularly for workers with a higher risk of WMSDs.
Nonlinear multi-analysis of agent-based financial market dynamics by epidemic system
NASA Astrophysics Data System (ADS)
Lu, Yunfan; Wang, Jun; Niu, Hongli
2015-10-01
Based on an epidemic dynamical system, we construct a new agent-based financial time series model. In order to check and verify its rationality, we compare the statistical properties of the time series model with those of real stock market indices, the Shanghai Stock Exchange Composite Index and the Shenzhen Stock Exchange Component Index. For analyzing the statistical properties, we combine multi-parameter analysis with tail distribution analysis, modified rescaled range analysis, and multifractal detrended fluctuation analysis. For a better perspective, three-dimensional diagrams are used to present the analysis results. The empirical research in this paper indicates that the long-range dependence property and the multifractal phenomenon exist in both the real returns and the proposed model. Therefore, the new agent-based financial model can reproduce some important features of real stock markets.
An agreement coefficient for image comparison
Ji, Lei; Gallo, Kevin
2006-01-01
Combination of datasets acquired from different sensor systems is necessary to construct a long time-series dataset for remotely sensed land-surface variables. Assessment of the agreement of the data derived from various sources is an important issue in understanding the data continuity through the time-series. Some traditional measures, including correlation coefficient, coefficient of determination, mean absolute error, and root mean square error, are not always optimal for evaluating the data agreement. For this reason, we developed a new agreement coefficient for comparing two different images. The agreement coefficient has the following properties: non-dimensional, bounded, symmetric, and distinguishable between systematic and unsystematic differences. The paper provides examples of agreement analyses for hypothetical data and actual remotely sensed data. The results demonstrate that the agreement coefficient does include the above properties, and therefore is a useful tool for image comparison.
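One published form of such an agreement coefficient (following Ji and Gallo, 2006) is AC = 1 - SSD/SPOD, where SSD is the sum of squared differences and SPOD is a sum of potential differences built from deviations about each image's mean. A minimal sketch, with the toy array as a stand-in for real imagery:

```python
import numpy as np

def agreement_coefficient(x, y):
    """Agreement coefficient AC = 1 - SSD/SPOD (Ji & Gallo, 2006 form):
    non-dimensional, bounded above by 1, symmetric in x and y, and
    equal to 1 only when the two images are identical."""
    x, y = np.ravel(x).astype(float), np.ravel(y).astype(float)
    ssd = np.sum((x - y) ** 2)
    mx, my = x.mean(), y.mean()
    # sum of potential differences: combines the systematic offset
    # |mx - my| with each pixel's deviation from its own image mean
    spod = np.sum((abs(mx - my) + np.abs(x - mx)) *
                  (abs(mx - my) + np.abs(y - my)))
    return 1.0 - ssd / spod

a = np.array([[1.0, 2.0], [3.0, 4.0]])
print(agreement_coefficient(a, a))        # 1.0
print(agreement_coefficient(a, a + 1.0))  # < 1: systematic offset penalized
```

Unlike the correlation coefficient, this measure penalizes a constant bias between the two images, which is the property that makes it useful for cross-sensor continuity checks.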
Using terrestrial radar to explore lava channel erosion on Momotombo volcano, Nicaragua
NASA Astrophysics Data System (ADS)
Gallant, E.; Deng, F.; Xie, S.; Connor, L.; Connor, C.; Saballos, J. A.; Dixon, T. H.; Myhre, D.
2017-12-01
We explore the application of terrestrial radar as a tool for imaging topography on Momotombo volcano, Nicaragua. A major feature of the edifice is an incised lava flow channel (possibly created by the 1904 eruption) that measures 150m in width and up to 60m in depth. This feature is unusual because most lava channels are constructional in nature and constrained by levees on their margins. The radar elevation model was used alongside a TerraSAR-X/TanDEM-X DEM to help create a topographic time series. We consider the possibility that the channel was formed during the 1904 eruption by thermal and / or mechanical erosion. We aim to quantify the energy required to create the observed topography by merging this topographic time series with existing field observations and mathematical models of erosion via lava flow.
Advanced functional network analysis in the geosciences: The pyunicorn package
NASA Astrophysics Data System (ADS)
Donges, Jonathan F.; Heitzig, Jobst; Runge, Jakob; Schultz, Hanna C. H.; Wiedermann, Marc; Zech, Alraune; Feldhoff, Jan; Rheinwalt, Aljoscha; Kutza, Hannes; Radebach, Alexander; Marwan, Norbert; Kurths, Jürgen
2013-04-01
Functional networks are a powerful tool for analyzing large geoscientific datasets such as global fields of climate time series originating from observations or model simulations. pyunicorn (pythonic unified complex network and recurrence analysis toolbox) is an open-source, fully object-oriented and easily parallelizable package written in Python. It allows for constructing functional networks (aka climate networks) representing the structure of statistical interrelationships in large datasets and, subsequently, investigating this structure using advanced methods of complex network theory, such as measures for networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn allows one to study the complex dynamics of geoscientific systems, as recorded by time series, by means of recurrence networks and visibility graphs. The range of possible applications of the package is outlined, drawing on several examples from climatology.
Shoreline Position Dynamics: Measurement and Analysis
NASA Astrophysics Data System (ADS)
Barton, C. C.; Rigling, B.; Hunter, N.; Tebbens, S. F.
2012-12-01
The dynamics of sandy shoreline position is a fundamental property of complex beach face processes and is characterized by the power scaling exponent. Spectral analysis was performed on the temporal position of four sandy shorelines extracted from four shore-perpendicular profiles, each resurveyed approximately seven times per year over twenty-seven years at the Field Research Facility (FRF) operated by the U.S. Army Corps of Engineers, located at Kitty Hawk, NC. The four shorelines we studied are mean-higher-high-water (MHHW), mean-high-water (MHW), mean-low-water (MLW), and mean-lower-low-water (MLLW), with elevations of 0.75 m, 0.65 m, -0.33 m, and -0.37 m respectively, relative to the NGVD29 geodetic datum. Spectral analysis used to quantify scaling exponents requires data evenly spaced in time. Our previous studies of shoreline dynamics used the Lomb Periodogram method for spectral analysis, which we now show does not return the correct scaling exponent for unevenly spaced data. New to this study is the use of slotted resampling and a linear predictor to construct an evenly spaced data set from an unevenly spaced data set, which has been shown with synthetic data to return correct values of the scaling exponents. A periodogram linear regression (PLR) estimate is used to determine the scaling exponent β of the constructed evenly spaced time series. This study shows that sandy shoreline position exhibits nonlinear self-affine dynamics through time. The time series of the four shorelines have scaling exponents ranging as follows: MHHW, β = 1.3-2.2; MHW, β = 1.3-2.1; MLW, β = 1.2-1.6; and MLLW, β = 1.2-1.6. Time series with β greater than 1 are non-stationary (mean and standard deviation are not constant through time) and are increasingly internally correlated with increasing β. The range of scaling exponents of the MLW and MLLW shorelines, near β = 1.5, is indicative of a diffusion process.
The range of scaling exponents for the MHW and MHHW shorelines indicates spatially variable dynamics higher on the beach face.
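The periodogram linear regression step on an evenly spaced series amounts to fitting the slope of log power versus log frequency. The sketch below is ours (the slotted-resampling stage is omitted); it synthesizes a signal with a prescribed β = 2 spectrum and recovers the exponent:

```python
import numpy as np

def plr_exponent(y, dt=1.0):
    """Periodogram linear regression (PLR): for P(f) ~ f**(-beta),
    the slope of log P vs log f is -beta."""
    y = np.asarray(y, dtype=float)
    coef = np.fft.rfft(y - y.mean())
    freq = np.fft.rfftfreq(len(y), d=dt)[1:]   # drop the zero frequency
    power = np.abs(coef[1:]) ** 2
    slope, _ = np.polyfit(np.log(freq), np.log(power), 1)
    return -slope

# synthesize a series whose spectral amplitudes follow f**-1 (so the
# power follows f**-2, i.e. beta = 2), with random phases, fixed seed
rng = np.random.default_rng(42)
n = 1024
freq = np.fft.rfftfreq(n)[1:]
amps = freq ** -1.0 * np.exp(2j * np.pi * rng.random(freq.size))
amps[-1] = freq[-1] ** -1.0        # Nyquist coefficient must be real
signal = np.fft.irfft(np.concatenate(([0.0], amps)), n)
print(round(plr_exponent(signal), 3))  # 2.0
```

A β near 1.5 recovered this way is the diffusion-like signature discussed in the abstract; values above 1 indicate non-stationarity.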
Cunningham, James K; Liu, Lon-Mu; Callaghan, Russell C
2016-11-01
In December 2006 the United States regulated sodium permanganate, a cocaine essential chemical. In March 2007 Mexico, the United States' primary source for methamphetamine, closed a chemical company accused of illicitly importing 60+ tons of pseudoephedrine, a methamphetamine precursor chemical. US cocaine availability and methamphetamine availability, respectively, decreased in association. This study tested whether the controls had impacts on the numbers of US cocaine users and methamphetamine users. Auto-regressive integrated moving average (ARIMA) intervention time-series analysis was used, with comparison series (heroin and marijuana users). The setting was the United States, 2002-14; the data source was the National Survey on Drug Use and Health (n = 723 283), a complex sample survey of the US civilian, non-institutionalized population. Estimates of the numbers of (1) past-year users and (2) past-month users were constructed for each calendar quarter from 2002 to 2014, providing each series with 52 time periods. Downward shifts in cocaine users started at the time of the cocaine regulation. Past-year and past-month cocaine user series levels decreased by approximately 1 946 271 (-32%) (P < 0.05) and 694 770 (-29%) (P < 0.01), respectively; no apparent recovery occurred through 2014. Downward shifts in methamphetamine users started at the time of the chemical company closure. Past-year and past-month methamphetamine series levels decreased by 494 440 (-35%) [P < 0.01; 95% confidence interval (CI) = -771 897, -216 982] and 277 380 (-45%) (P < 0.05; CI = -554 073, -686), respectively; partial recovery possibly occurred in 2013. The comparison series changed little at the intervention times. Essential/precursor chemical controls in the United States (2006) and Mexico (2007) were associated with large, extended (7+ years) reductions in cocaine users and methamphetamine users in the United States.
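The core of an intervention analysis is estimating a level shift at a known intervention date. A full ARIMA intervention model also handles autocorrelated noise; the sketch below shows only the simpler level-shift regression on synthetic data with an assumed shift, as a minimal illustration:

```python
import numpy as np

def level_shift(series, t0):
    """OLS estimate of a step change at time index t0: fits
    y_t = mu + delta * 1{t >= t0}; returns (delta, standard error)."""
    y = np.asarray(series, dtype=float)
    n = len(y)
    X = np.column_stack([np.ones(n), (np.arange(n) >= t0).astype(float)])
    beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = float(res[0]) / (n - 2)            # residual variance
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    return beta[1], se

# synthetic quarterly series: level 100 before the control, 70 after
rng = np.random.default_rng(3)
y = np.concatenate([100 + rng.normal(0, 2, 20), 70 + rng.normal(0, 2, 32)])
delta, se = level_shift(y, 20)   # delta should be near -30
```

In the actual study the noise term is modeled with ARIMA structure and comparison series guard against coincidental shifts; neither refinement is shown here.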
The Chern-Simons Current in Systems of DNA-RNA Transcriptions
NASA Astrophysics Data System (ADS)
Capozziello, Salvatore; Pincak, Richard; Kanjamapornkul, Kabin; Saridakis, Emmanuel N.
2018-04-01
A Chern-Simons current, coming from the ghost and anti-ghost fields of supersymmetry theory, can be used to define a spectrum of gene expression in new time series data, where a spinor field is adopted as an alternative representation of a gene instead of the standard alphabet sequence of bases $A, T, C, G, U$. After a general discussion of the use of supersymmetry in biological systems, we give examples of its use for living organisms, discuss the codon and anti-codon ghost fields, and develop an algebraic construction for trash DNA, the DNA region which does not seem active in biological systems. As a general result, all hidden states of a codon can be computed by Chern-Simons 3-forms. Finally, we plot a time series of genetic variations of a viral glycoprotein gene and a host T-cell receptor gene by using a gene tensor correlation network related to the Chern-Simons current. An empirical analysis of genetic shift in host cell receptor genes, with a separated cluster of genes, and of genetic drift in the viral gene is obtained by using a tensor correlation plot over time series data derived from the empirical mode decomposition of the Chern-Simons current.
NASA Astrophysics Data System (ADS)
Yan, Ying; Zhang, Shen; Tang, Jinjun; Wang, Xiaofei
2017-07-01
Discovering dynamic characteristics of traffic flow is a significant step in designing effective traffic management and control strategies for relieving congestion in urban cities. A new method based on complex network theory is proposed to study multivariate traffic flow time series. The data were collected from loop detectors on a freeway over one year. To construct a complex network from the original traffic flow, a weighted Frobenius norm is adopted to estimate similarity between multivariate time series, and Principal Component Analysis is used to determine the weights. We discuss how to select the optimal critical threshold for networks at different hours in terms of the cumulative probability distribution of degree. Furthermore, two statistical properties of the networks, normalized network structure entropy and cumulative probability of degree, are used to explore hourly variation in traffic flow. The results demonstrate that these two statistical quantities exhibit patterns similar to traffic flow parameters, with morning and evening peak hours. Accordingly, we detect three traffic states (trough, peak, and transitional hours) according to the correlation between the two aforementioned properties. The classification of states represents the hourly fluctuation in traffic flow, as confirmed by analyzing annual average hourly values of traffic volume, occupancy, and speed in the corresponding hours.
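The pipeline above (weighted Frobenius similarity, threshold to an adjacency matrix, degree-based structure entropy) can be sketched with synthetic windows in place of loop-detector data. The fixed weights stand in for the PCA-derived ones, and the entropy formula is the common degree-distribution form, an assumption on our part:

```python
import numpy as np

def weighted_frobenius(a, b, w):
    """Weighted Frobenius distance between two multivariate windows
    (rows = time steps, columns = variables, w = per-variable weights)."""
    return np.sqrt(np.sum(w * (a - b) ** 2))

def structure_entropy(adj):
    """Normalized network structure entropy from the degree sequence:
    E = -sum(p_i * ln p_i) / ln N, with p_i = k_i / sum(k)."""
    k = adj.sum(axis=0).astype(float)
    p = k / k.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(adj.shape[0])

rng = np.random.default_rng(7)
windows = rng.normal(size=(24, 12, 3))   # 24 hourly windows, 3 variables
w = np.array([0.5, 0.3, 0.2])            # stand-in for PCA weights
d = np.array([[weighted_frobenius(a, b, w) for b in windows]
              for a in windows])
adj = (d < np.quantile(d[d > 0], 0.2)).astype(int)  # critical threshold
np.fill_diagonal(adj, 0)
ent = structure_entropy(adj)             # in (0, 1]; 1 = uniform degrees
```

Tracking this entropy hour by hour is what separates the trough, peak, and transitional states in the paper.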
NASA Astrophysics Data System (ADS)
Dergachev, V. A.; Dmitriev, P. B.
2017-12-01
An inhomogeneous time series of measurements of the percentage content of biogenic silica in the samples of joint cores BDP-96-1 and BDP-96-2 from the bottom of Lake Baikal, drilled at a depth of 321 m under water, has been analyzed. The composite depth of the cores is 77 m, which covers the Pleistocene Epoch to 1.8 Ma. The time series was reduced to a regular form with a time step of 1 kyr, which allowed 16 distinct quasi-periodic components with periods from 19 to 251 kyr to be revealed in this series at a significance level of their amplitudes exceeding 4σ. For this, the combined spectral periodogram (a modification of the spectral analysis method) was used. Some of the revealed quasi-harmonics are related to the characteristic cyclical oscillations of the Earth's orbital parameters. Special focus was paid to the temporal change in the parameters of the revealed quasi-harmonic components over the Pleistocene Epoch, which was studied by constructing the spectral density of the analyzed data in running windows of 201 and 701 kyr.
Liao, Fuyuan; Jan, Yih-Kuen
2012-06-01
This paper presents a recurrence network approach for the analysis of skin blood flow dynamics in response to loading pressure. Recurrence is a fundamental property of many dynamical systems, which can be explored in phase spaces constructed from observational time series. A visualization tool of recurrence analysis called recurrence plot (RP) has been proved to be highly effective to detect transitions in the dynamics of the system. However, it was found that delay embedding can produce spurious structures in RPs. Network-based concepts have been applied for the analysis of nonlinear time series recently. We demonstrate that time series with different types of dynamics exhibit distinct global clustering coefficients and distributions of local clustering coefficients and that the global clustering coefficient is robust to the embedding parameters. We applied the approach to study skin blood flow oscillations (BFO) response to loading pressure. The results showed that global clustering coefficients of BFO significantly decreased in response to loading pressure (p<0.01). Moreover, surrogate tests indicated that such a decrease was associated with a loss of nonlinearity of BFO. Our results suggest that the recurrence network approach can practically quantify the nonlinear dynamics of BFO.
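The recurrence-network construction and the global clustering coefficient used above can be sketched directly. The paper works with embedded phase-space states; this one-dimensional sketch skips the delay embedding and uses a plain sinusoid, so it is illustrative only:

```python
import numpy as np

def recurrence_network(x, eps):
    """Adjacency matrix of a recurrence network: states i and j are
    linked when |x_i - x_j| < eps (self-loops removed)."""
    x = np.asarray(x, dtype=float)
    A = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
    np.fill_diagonal(A, 0)
    return A

def global_clustering(A):
    """Network transitivity: trace(A^3) counts closed triples, and
    sum k*(k-1) counts all connected triples."""
    closed = np.trace(np.linalg.matrix_power(A, 3))
    k = A.sum(axis=0)
    triples = np.sum(k * (k - 1))
    return closed / triples if triples else 0.0

x = np.sin(np.linspace(0, 8 * np.pi, 200))
C = global_clustering(recurrence_network(x, 0.2))  # in [0, 1]
```

In the blood-flow application, a drop in this coefficient under loading pressure is the signature of reduced nonlinear structure; unlike recurrence plots, the measure is reported to be robust to the embedding parameters.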
Stochastic modeling for time series InSAR: with emphasis on atmospheric effects
NASA Astrophysics Data System (ADS)
Cao, Yunmeng; Li, Zhiwei; Wei, Jianchao; Hu, Jun; Duan, Meng; Feng, Guangcai
2018-02-01
Despite the many applications of time series interferometric synthetic aperture radar (TS-InSAR) techniques in geophysical problems, error analysis and assessment have been largely overlooked. Tropospheric propagation error is still the dominant error source of InSAR observations. However, the spatiotemporal variation of atmospheric effects is seldom considered in the present standard TS-InSAR techniques, such as persistent scatterer interferometry and small baseline subset interferometry. The failure to consider the stochastic properties of atmospheric effects not only affects the accuracy of the estimators, but also makes it difficult to assess the uncertainty of the final geophysical results. To address this issue, this paper proposes a network-based variance-covariance estimation method to model the spatiotemporal variation of tropospheric signals, and to estimate the temporal variance-covariance matrix of TS-InSAR observations. The constructed stochastic model is then incorporated into the TS-InSAR estimators both for parameters (e.g., deformation velocity, topography residual) estimation and uncertainty assessment. It is an incremental and positive improvement to the traditional weighted least squares methods to solve the multitemporal InSAR time series. The performance of the proposed method is validated by using both simulated and real datasets.
He, Meilin; Shen, Wenbin; Chen, Ruizhi; Ding, Hao; Guo, Guangyi
2017-01-01
The solid Earth deforms elastically in response to variations of surface atmosphere, hydrology, and ice/glacier mass loads. Continuous geodetic observations by Global Positioning System (CGPS) stations and the Gravity Recovery and Climate Experiment (GRACE) record such deformations to estimate seasonal and secular mass changes. In this paper, we present the seasonal variation of surface mass changes and crustal vertical deformation in the South China Block (SCB) identified by GPS and GRACE observations with records spanning from 1999 to 2016. We used 33 CGPS stations to construct a time series of coordinate changes, which are decomposed by empirical orthogonal functions (EOFs) in the SCB. The average weighted root-mean-square (WRMS) reduction is 38% when we subtract GRACE-modeled vertical displacements from the GPS time series. The first common mode shows clear seasonal changes, indicating seasonal surface mass re-distribution in and around the South China Block. The correlation between GRACE and GPS time series is analyzed, which provides a reference for further improvement of the seasonal variation of CGPS time series. Inversion of the GRACE observations yields the surface deformation caused by surface mass-change loading, at a rate of about −0.4 to −0.8 mm/year; this is used to correct the long-term trend of non-tectonic loading in the GPS vertical velocity field and thus to further explain crustal tectonic movement in the SCB and surroundings. PMID:29301236
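The EOF decomposition used above can be illustrated with a synthetic station-by-epoch matrix; the 33 "stations" sharing one annual mode are invented, and SVD here stands in for whatever EOF implementation the authors used.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(120)                                  # 120 monthly epochs
seasonal = np.sin(2 * np.pi * t / 12.0)             # common annual signal

# 33 synthetic stations: one shared seasonal mode with random amplitudes,
# plus observation noise (all values invented).
X = np.outer(rng.uniform(0.5, 1.5, 33), seasonal)
X += 0.1 * rng.normal(size=X.shape)

X -= X.mean(axis=1, keepdims=True)                  # remove station means
U, s, Vt = np.linalg.svd(X, full_matrices=False)    # EOFs via SVD
explained = s**2 / (s**2).sum()                     # variance fractions
first_mode = Vt[0]                                  # leading temporal mode
```

The leading mode should capture the shared seasonal variability, analogous to the "first common mode" in the abstract.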
Fundamentals of Construction. Instructor Edition. Introduction to Construction Series.
ERIC Educational Resources Information Center
Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.
This instructor's guide contains the materials required to teach a competency-based introductory course in the fundamentals of construction to students who have chosen to explore careers in construction. The following topics are covered in the course's 10 instructional units: industry orientation (exploring the construction industry and starting a…
Jane, Nancy Yesudhas; Nehemiah, Khanna Harichandran; Arputharaj, Kannan
2016-01-01
Clinical time-series data acquired from electronic health records (EHR) are liable to temporal complexities such as irregular observations, missing values and time-constrained attributes that make the knowledge discovery process challenging. This paper presents a temporal rough set induced neuro-fuzzy (TRiNF) mining framework that handles these complexities and builds an effective clinical decision-making system. TRiNF provides two functionalities, namely temporal data acquisition (TDA) and temporal classification. In TDA, a time-series forecasting model is constructed by adopting an improved double exponential smoothing method. The forecasting model is used in missing value imputation and temporal pattern extraction. The relevant attributes are selected using a temporal pattern based rough set approach. In temporal classification, a classification model is built with the selected attributes using a temporal pattern induced neuro-fuzzy classifier. For experimentation, this work uses two clinical time-series datasets of hepatitis and thrombosis patients. The experimental results show that, with the proposed TRiNF framework, there is a significant reduction in the error rate, yielding an average classification accuracy of 92.59% for the hepatitis dataset and 91.69% for the thrombosis dataset. The obtained classification results prove the efficiency of the proposed framework in terms of its improved classification accuracy.
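The improved double exponential smoothing method is not specified here, but the classical (Holt) form it builds on can be sketched as follows; the smoothing constants and observations are arbitrary.

```python
def holt_forecast(series, alpha=0.5, beta=0.3):
    """One-step-ahead forecast by double (Holt) exponential smoothing.

    alpha smooths the level, beta smooths the trend; both are
    illustrative values, not tuned parameters.
    """
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + trend          # forecast for the next time step

obs = [10.0, 12.0, 14.0, 16.0, 18.0]   # invented observations
forecast = holt_forecast(obs)
```

In an imputation setting, such a forecast would replace a missing observation at the next time point.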
Markovic, Gabriela; Schult, Marie-Louise; Bartfai, Aniko; Elg, Mattias
2017-01-31
Progress in early cognitive recovery after acquired brain injury is uneven and unpredictable, and thus the evaluation of rehabilitation is complex. The use of time-series measurements is susceptible to statistical change due to process variation. To evaluate the feasibility of using a time-series method, statistical process control, in early cognitive rehabilitation. Participants were 27 patients with acquired brain injury undergoing interdisciplinary rehabilitation of attention within 4 months post-injury. The outcome measure, the Paced Auditory Serial Addition Test, was analysed using statistical process control. Statistical process control identifies if and when change occurs in the process according to 3 patterns: rapid, steady or stationary performers. The statistical process control method was adjusted, in terms of constructing the baseline and the total number of measurement points, in order to measure a process in change. Statistical process control methodology is feasible for use in early cognitive rehabilitation, since it provides information about change in a process, thus enabling adjustment of the individual treatment response. Together with the results indicating discernible subgroups that respond differently to rehabilitation, statistical process control could be a valid tool in clinical decision-making. This study is a starting-point in understanding the rehabilitation process using a real-time-measurements approach.
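A bare-bones control-chart version of statistical process control might look like the following sketch; the baseline window, 3-sigma limits, and PASAT-like scores are illustrative, not the study's adjusted construction.

```python
import statistics

def control_chart(scores, baseline_n=5):
    """Indices of measurements outside 3-sigma limits set by a baseline."""
    base = scores[:baseline_n]
    centre = statistics.mean(base)
    sigma = statistics.stdev(base)
    ucl, lcl = centre + 3 * sigma, centre - 3 * sigma
    return [i for i, s in enumerate(scores[baseline_n:], start=baseline_n)
            if s > ucl or s < lcl]

# Invented repeated test scores: stable baseline, then improvement.
pasat = [30, 31, 29, 30, 31, 32, 33, 45, 47, 48]
shifts = control_chart(pasat)     # indices where the process has changed
```

Out-of-limit points signal that the rehabilitation process has changed, which is the kind of individual-level information the study uses for treatment adjustment.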
Reduced rank models for travel time estimation of low order mode pulses.
Chandrayadula, Tarun K; Wage, Kathleen E; Worcester, Peter F; Dzieciuch, Matthew A; Mercer, James A; Andrew, Rex K; Howe, Bruce M
2013-10-01
Mode travel time estimation in the presence of internal waves (IWs) is a challenging problem. IWs perturb the sound speed, which results in travel time wander and mode scattering. A standard approach to travel time estimation is to pulse compress the broadband signal, pick the peak of the compressed time series, and average the peak time over multiple receptions to reduce variance. The peak-picking approach implicitly assumes there is a single strong arrival and does not perform well when there are multiple arrivals due to scattering. This article presents a statistical model for the scattered mode arrivals and uses the model to design improved travel time estimators. The model is based on an Empirical Orthogonal Function (EOF) analysis of the mode time series. Range-dependent simulations and data from the Long-range Ocean Acoustic Propagation Experiment (LOAPEX) indicate that the modes are represented by a small number of EOFs. The reduced-rank EOF model is used to construct a travel time estimator based on the Matched Subspace Detector (MSD). Analysis of simulation and experimental data show that the MSDs are more robust to IW scattering than peak picking. The simulation analysis also highlights how IWs affect the mode excitation by the source.
Mapping the structure of the world economy.
Lenzen, Manfred; Kanemoto, Keiichiro; Moran, Daniel; Geschke, Arne
2012-08-07
We have developed a new series of environmentally extended multi-region input-output (MRIO) tables with applications in carbon, water, and ecological footprinting and Life-Cycle Assessment, as well as trend and key driver analyses. Such applications have recently been at the forefront of global policy debates, such as assigning responsibility for emissions embodied in internationally traded products. The new time series was constructed using advanced parallelized supercomputing resources, and significantly advances the previous state of the art because of four innovations. First, it is available as a continuous 20-year time series of MRIO tables. Second, it distinguishes 187 individual countries comprising more than 15,000 industry sectors, and hence offers unsurpassed detail. Third, it provides information with a delay of only 1-3 years, significantly improving timeliness. Fourth, it presents MRIO elements with accompanying standard deviations in order to allow users to understand the reliability of the data. These advances will lead to material improvements in the capability of applications that rely on input-output tables. The timeliness of information means that analyses are more relevant to current policy questions. The continuity of the time series enables the robust identification of key trends and drivers of global environmental change. The high country and sector detail drastically improves the resolution of Life-Cycle Assessments. Finally, the availability of information on uncertainty allows policy-makers to quantitatively judge the level of confidence that can be placed in the results of analyses.
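The canonical computation such MRIO tables feed, a Leontief-inverse footprint, can be sketched for a toy 3-sector economy; all coefficients below are invented for illustration.

```python
import numpy as np

# Invented inter-industry coefficient matrix A: A[i, j] is input from
# sector i needed per unit output of sector j.
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.08]])
f = np.array([0.9, 0.4, 0.2])         # emissions intensity per unit output
y = np.array([100.0, 50.0, 80.0])     # final demand

L = np.linalg.inv(np.eye(3) - A)      # Leontief inverse (I - A)^-1
x = L @ y                             # total output, direct + indirect
footprint = f @ x                     # emissions embodied in final demand
```

The footprint exceeds the direct emissions f·y because the Leontief inverse accounts for upstream supply chains, the mechanism behind "emissions embodied in internationally traded products."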
Revision of Primary Series Maps
2000-01-01
In 1992, the U.S. Geological Survey (USGS) completed a 50-year effort to provide primary series map coverage of the United States. Many of these maps now need to be updated to reflect the construction of new roads and highways and other changes that have taken place over time. The USGS has formulated a graphic revision plan to help keep the primary series maps current. Primary series maps include 1:20,000-scale quadrangles of Puerto Rico, 1:24,000- or 1:25,000-scale quadrangles of the conterminous United States, Hawaii, and U.S. Territories, and 1:63,360-scale quadrangles of Alaska. The revision of primary series maps from new collection sources is accomplished using a variety of processes. The raster revision process combines the scanned content of paper maps with raster updating technologies. The vector revision process involves the automated plotting of updated vector files. Traditional processes use analog stereoplotters and manual scribing instruments on specially coated map separates. The ability to select from or combine these processes increases the efficiency of the National Mapping Division map revision program.
Nonlinear time-series analysis of current signal in cathodic contact glow discharge electrolysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allagui, Anis, E-mail: aallagui@sharjah.ac.ae; Abdelkareem, Mohammad Ali; Rojas, Andrea Espinel
In the standard two-electrode configuration employed in electrolytic processes, when the control dc voltage is brought to a critical value, the system undergoes a transition from conventional electrolysis to contact glow discharge electrolysis (CGDE), which has also been referred to as liquid-submerged micro-plasma, glow discharge plasma electrolysis, electrode effect, electrolytic plasma, etc. The light-emitting process is associated with the development of an irregular and erratic current time series, which has been arbitrarily labelled as “random,” and has thus dissuaded further research in this direction. Here, we examine the current time-series signals measured in a cathodic CGDE configuration in a concentrated KOH solution at different dc bias voltages greater than the critical voltage. We show that the signals are, in fact, not random according to the NIST SP 800-22 test suite definition. We also demonstrate that post-processing low-pass filtered sequences requires less time than the native as-measured sequences, suggesting a superposition of low-frequency chaotic fluctuations and high-frequency behaviors (which may be produced by more than one possible source of entropy). Using an array of nonlinear time-series analyses for dynamical systems, i.e., the computation of largest Lyapunov exponents and correlation dimensions, and reconstruction of phase portraits, we found that the low-pass filtered datasets undergo a transition from quasi-periodic to chaotic to quasi-hyper-chaotic behavior, and back again to chaos, as the voltage controlling parameter is increased. The high-frequency part of the signals is discussed in terms of highly nonlinear turbulent motion developed around the working electrode.
Time-series analysis of delta13C from tree rings. I. Time trends and autocorrelation.
Monserud, R A; Marshall, J D
2001-09-01
Univariate time-series analyses were conducted on stable carbon isotope ratios obtained from tree-ring cellulose. We looked for the presence and structure of autocorrelation. Significant autocorrelation violates the statistical independence assumption and biases hypothesis tests. Its presence would indicate the existence of lagged physiological effects that persist for longer than the current year. We analyzed data from 28 trees (60-85 years old; mean = 73 years) of western white pine (Pinus monticola Dougl.), ponderosa pine (Pinus ponderosa Laws.), and Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco var. glauca) growing in northern Idaho. Material was obtained by the stem analysis method from rings laid down in the upper portion of the crown throughout each tree's life. The sampling protocol minimized variation caused by changing light regimes within each tree. Autoregressive moving average (ARMA) models were used to describe the autocorrelation structure over time. Three time series were analyzed for each tree: the stable carbon isotope ratio (delta(13)C); discrimination (delta); and the difference between ambient and internal CO(2) concentrations (c(a) - c(i)). The effect of converting from ring cellulose to whole-leaf tissue did not affect the analysis because it was almost completely removed by the detrending that precedes time-series analysis. A simple linear or quadratic model adequately described the time trend. The residuals from the trend had a constant mean and variance, thus ensuring stationarity, a requirement for autocorrelation analysis. The trend over time for c(a) - c(i) was particularly strong (R(2) = 0.29-0.84). Autoregressive moving average analyses of the residuals from these trends indicated that two-thirds of the individual tree series contained significant autocorrelation, whereas the remaining third were random (white noise) over time. We were unable to distinguish between individuals with and without significant autocorrelation beforehand. Significant ARMA models were all of low order, with either first- or second-order (i.e., lagged 1 or 2 years, respectively) models performing well. A simple autoregressive (AR(1)), model was the most common. The most useful generalization was that the same ARMA model holds for each of the three series (delta(13)C, delta, c(a) - c(i)) for an individual tree, if the time trend has been properly removed for each series. The mean series for the two pine species were described by first-order ARMA models (1-year lags), whereas the Douglas-fir mean series were described by second-order models (2-year lags) with negligible first-order effects. Apparently, the process of constructing a mean time series for a species preserves an underlying signal related to delta(13)C while canceling some of the random individual tree variation. Furthermore, the best model for the overall mean series (e.g., for a species) cannot be inferred from a consensus of the individual tree model forms, nor can its parameters be estimated reliably from the mean of the individual tree parameters. Because two-thirds of the individual tree time series contained significant autocorrelation, the normal assumption of a random structure over time is unwarranted, even after accounting for the time trend. The residuals of an appropriate ARMA model satisfy the independence assumption, and can be used to make hypothesis tests.
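The detrend-then-model workflow described above can be sketched minimally; the synthetic series with a linear trend and AR(1) residuals (phi = 0.6) is invented, and the lag-1 autocorrelation of the residuals serves as a simple AR(1) coefficient estimate.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
t = np.arange(n)

# Simulate AR(1) residuals: e[i] = 0.6 * e[i-1] + white noise.
e = np.empty(n)
e[0] = rng.normal()
for i in range(1, n):
    e[i] = 0.6 * e[i - 1] + rng.normal()
x = 0.02 * t + e                          # linear trend + AR(1) noise

# Step 1: remove the time trend (here linear, as in the simple models above).
trend = np.polyval(np.polyfit(t, x, 1), t)
r = x - trend                             # stationary residuals

# Step 2: lag-1 autocorrelation of residuals, a crude AR(1) estimate.
phi = np.corrcoef(r[:-1], r[1:])[0, 1]
```

A phi estimate near its true value (0.6 here) illustrates why residuals that still carry autocorrelation violate the independence assumption of standard hypothesis tests.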
Convoy projects: a case study from Germany
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gruener, W.
The Convoy concept was formulated at a time when Kraftwerk Union (KWU) anticipated the start of five to six turnkey nuclear power plant projects within a very short time frame, related to as many customers, sites, and licensing authorities. To counteract the rapidly escalating costs and schedules of projects then under construction, a series of measures were enacted with the purpose of streamlining and optimizing the work procedures both within the company and in relation with the external partners - customers, licensing authorities and their technical advisors, suppliers, and subsuppliers. The organizational framework and the various tools deployed by the turnkey contractor ensured smooth and effective design and construction procedures. The adherence to the original budget and schedule is truly remarkable, encouraging further investigation into the applicability of such methods and procedures for future projects.
The use of Tcl and Tk to improve design and code reutilization
NASA Technical Reports Server (NTRS)
Rodriguez, Lisbet; Reinholtz, Kirk
1995-01-01
Tcl and Tk facilitate design and code reuse in the ZIPSIM series of high-performance, high-fidelity spacecraft simulators. Tcl and Tk provide a framework for the construction of the Graphical User Interfaces for the simulators. The interfaces are architected such that a large proportion of the design and code is used for several applications, which has reduced design time and life-cycle costs.
Monitoring tropical environments with Space Shuttle photography
NASA Technical Reports Server (NTRS)
Helfert, Michael R.; Lulla, Kamlesh P.
1989-01-01
Orbital photography from the Space Shuttle missions (1981-88) and earlier manned spaceflight programs (1962-1975) allows remote sensing time series to be constructed for observations of environmental change in selected portions of the global tropics. Particular topics and regions include deforestation, soil erosion, supersedimentation in streams, lacustrine, and estuarine environments, and desertification in the greater Amazon, tropical Africa and Madagascar, South and Southeast Asia, and the Indo-Pacific archipelagoes.
ERIC Educational Resources Information Center
Martinez, George A.
Mexican Americans were legally defined as Whites as a result of treaty obligations with Mexico that expressly allowed Mexicans to become U.S. citizens. Federal laws of the time required that an alien be White to become a U.S. citizen. The government of Mexico and the U.S. Department of State pressured the U.S. Census Bureau to reclassify Mexican…
NASA Technical Reports Server (NTRS)
Prochzaka, Ivan; Kodat, Jan; Blazej, Josef; Sun, Xiaoli (Editor)
2015-01-01
We report on the design, construction, and performance of photon-counting detector packages based on silicon avalanche photodiodes. These photon-counting devices have been optimized for extremely high stability of their detection delay. The detectors have been designed for future applications in fundamental metrology and optical time transfer in space, and have been qualified for operation in space missions. The exceptional radiation tolerance of the detection chip itself and of all critical components of a detector package has been verified in a series of experiments.
Design of fuzzy cognitive maps using neural networks for predicting chaotic time series.
Song, H J; Miao, C Y; Shen, Z Q; Roel, W; Maja, D H; Francky, C
2010-12-01
As a powerful paradigm for knowledge representation and a simulation mechanism applicable to numerous research and application fields, Fuzzy Cognitive Maps (FCMs) have attracted a great deal of attention from various research communities. However, the traditional FCMs do not provide efficient methods to determine the states of the investigated system and to quantify causalities which are the very foundation of the FCM theory. Therefore in many cases, constructing FCMs for complex causal systems greatly depends on expert knowledge. The manually developed models have a substantial shortcoming due to model subjectivity and difficulties with accessing its reliability. In this paper, we propose a fuzzy neural network to enhance the learning ability of FCMs so that the automatic determination of membership functions and quantification of causalities can be incorporated with the inference mechanism of conventional FCMs. In this manner, FCM models of the investigated systems can be automatically constructed from data, and therefore are independent of the experts. Furthermore, we employ mutual subsethood to define and describe the causalities in FCMs. It provides more explicit interpretation for causalities in FCMs and makes the inference process easier to understand. To validate the performance, the proposed approach is tested in predicting chaotic time series. The simulation studies show the effectiveness of the proposed approach. Copyright © 2010 Elsevier Ltd. All rights reserved.
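The conventional FCM inference step that the proposed fuzzy neural network builds on can be sketched as follows; the 3-concept map, weights, and sigmoid steepness are invented for illustration.

```python
import math

def fcm_step(state, W, lam=1.0):
    """One FCM update: each concept aggregates weighted causal inputs
    and is squashed by a sigmoid. W[j][i] is the causality j -> i."""
    sig = lambda v: 1.0 / (1.0 + math.exp(-lam * v))
    n = len(state)
    return [sig(sum(W[j][i] * state[j] for j in range(n)))
            for i in range(n)]

# Invented 3-concept causal map.
W = [[0.0, 0.6, -0.4],
     [0.0, 0.0, 0.7],
     [0.5, 0.0, 0.0]]
state = [0.5, 0.5, 0.5]
for _ in range(50):        # iterate toward a fixed point
    state = fcm_step(state, W)
```

With moderate weights the iteration contracts to a fixed point, the steady "state of the investigated system" that FCM inference reads off.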
NASA Astrophysics Data System (ADS)
Doin, Marie-Pierre; Lodge, Felicity; Guillaso, Stephane; Jolivet, Romain; Lasserre, Cecile; Ducret, Gabriel; Grandin, Raphael; Pathier, Erwan; Pinel, Virginie
2012-01-01
We assemble a processing chain that handles InSAR computation from raw data to time series analysis. A large part of the chain (from raw data to geocoded unwrapped interferograms) is based on ROI PAC modules (Rosen et al., 2004), with original routines rearranged and combined with new routines to process in series and in a common radar geometry all SAR images and interferograms. A new feature of the software is the range-dependent spectral filtering to improve coherence in interferograms with long spatial baselines. Additional components include a module to estimate and remove digital elevation model errors before unwrapping, a module to mitigate the effects of the atmospheric phase delay and remove residual orbit errors, and a module to construct the phase change time series from small baseline interferograms (Berardino et al. 2002). This paper describes the main elements of the processing chain and presents an example of application of the software using a data set from the ENVISAT mission covering the Etna volcano.
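The small-baseline time-series construction step (Berardino et al. 2002) can be illustrated schematically: each interferogram observes a phase (here, displacement) difference between two acquisition dates, and least squares recovers the per-date series. Dates, pairs, and values below are invented, and atmospheric and DEM-error corrections are omitted.

```python
import numpy as np

dates = [0, 1, 2, 3, 4]                               # acquisition epochs
pairs = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4)]
truth = np.array([0.0, 2.0, 3.0, 3.5, 5.0])           # mm, date 0 = reference
obs = np.array([truth[j] - truth[i] for i, j in pairs])  # interferogram values

# Design matrix: unknowns are displacements at dates 1..4.
G = np.zeros((len(pairs), len(dates) - 1))
for k, (i, j) in enumerate(pairs):
    if j > 0:
        G[k, j - 1] = 1.0
    if i > 0:
        G[k, i - 1] = -1.0

m, *_ = np.linalg.lstsq(G, obs, rcond=None)
series = np.concatenate([[0.0], m])                   # recovered time series
```

In real processing the observations are noisy and a stochastic model, such as the variance-covariance weighting proposed in the TS-InSAR abstract above, would weight this least-squares solve.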
A Fast Framework for Abrupt Change Detection Based on Binary Search Trees and Kolmogorov Statistic
Qi, Jin-Peng; Qi, Jie; Zhang, Qing
2016-01-01
Change-Point (CP) detection has attracted considerable attention in the fields of data mining and statistics; it is very meaningful to discuss how to quickly and efficiently detect abrupt change from large-scale bioelectric signals. Currently, most of the existing methods, like Kolmogorov-Smirnov (KS) statistic and so forth, are time-consuming, especially for large-scale datasets. In this paper, we propose a fast framework for abrupt change detection based on binary search trees (BSTs) and a modified KS statistic, named BSTKS (binary search trees and Kolmogorov statistic). In this method, first, two binary search trees, termed as BSTcA and BSTcD, are constructed by multilevel Haar Wavelet Transform (HWT); second, three search criteria are introduced in terms of the statistic and variance fluctuations in the diagnosed time series; last, an optimal search path is detected from the root to leaf nodes of two BSTs. The studies on both the synthetic time series samples and the real electroencephalograph (EEG) recordings indicate that the proposed BSTKS can detect abrupt change more quickly and efficiently than KS, t-statistic (t), and Singular-Spectrum Analyses (SSA) methods, with the shortest computation time, the highest hit rate, the smallest error, and the highest accuracy out of four methods. This study suggests that the proposed BSTKS is very helpful for useful information inspection on all kinds of bioelectric time series signals. PMID:27413364
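A brute-force version of the KS change-point search that BSTKS accelerates can be sketched as follows; this is only the baseline comparison method, not the binary-search-tree/wavelet framework itself.

```python
import bisect

def ks_stat(a, b):
    """Two-sample Kolmogorov-Smirnov statistic (max CDF distance)."""
    sa, sb = sorted(a), sorted(b)
    def cdf(s, v):                        # empirical CDF: fraction of s <= v
        return bisect.bisect_right(s, v) / len(s)
    return max(abs(cdf(sa, v) - cdf(sb, v)) for v in sa + sb)

def change_point(x, min_seg=5):
    """Split index maximising the KS statistic between the two segments."""
    return max(range(min_seg, len(x) - min_seg),
               key=lambda k: ks_stat(x[:k], x[k:]))

signal = [0.0] * 30 + [1.0] * 30          # synthetic step change at index 30
cp = change_point(signal)
```

The scan is O(n) KS evaluations, which is what makes exhaustive KS methods slow on large-scale signals and motivates the tree-based search above.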
Globally-Gridded Interpolated Night-Time Marine Air Temperatures 1900-2014
NASA Astrophysics Data System (ADS)
Junod, R.; Christy, J. R.
2016-12-01
Over the past century, climate records have pointed to an increase in global near-surface average temperature. Near-surface air temperature over the oceans is a relatively unused parameter in understanding the current state of the climate, but it is useful as an independent temperature metric over the oceans and serves as a geographical and physical complement to near-surface air temperature over land. Though versions of this dataset exist (e.g., HadMAT1 and HadNMAT2), it has been strongly recommended that various groups generate climate records independently. This University of Alabama in Huntsville (UAH) study began with the construction of monthly night-time marine air temperature (UAHNMAT) values from the early twentieth century through to the present era. Data from the International Comprehensive Ocean and Atmosphere Data Set (ICOADS) were used to compile a time series of gridded UAHNMAT (20S-70N). This time series was homogenized to correct for the many biases, such as increasing ship height, solar deck heating, etc. The time series of UAHNMAT, once adjusted to a standard reference height, is gridded to 1.25° pentad grid boxes and interpolated using kriging. This study will present results that quantify the variability and trends and compare them with trends in related datasets, including HadNMAT2 and sea-surface temperatures (HadISST & ERSSTv4).
Wohlin, Åsa
2015-03-21
The distribution of codons in the nearly universal genetic code is a long-discussed issue. At the atomic level, the numeral series 2x^2 (x = 5-0) lies behind electron shells and orbitals. Numeral series appear in formulas for the spectral lines of hydrogen. The question here was whether some similar scheme could be found in the genetic code. A table of 24 codons was constructed (synonyms counted as one) for 20 amino acids, four of which have two different codons. An atomic mass analysis was performed, built on common isotopes. It was found that a numeral series 5 to 0 with exponent 2/3, times 10^2, revealed detailed congruency with codon-grouped amino acid side-chains, simultaneously with the division into atom kinds, further with main 3rd-base groups, backbone chains, and with codon-grouped amino acids in relation to their origin from glycolysis or the citrate cycle. Hence, it is proposed that this series may, in a dynamic way, have guided the selection of amino acids into codon domains. Series with simpler exponents also showed noteworthy correlations with the atomic mass distribution on main codon domains; especially the 2x^2 series times a factor of 16 appeared as a conceivable underlying level, both for the atomic mass and charge distribution. Furthermore, it was found that atomic mass transformations between numeral systems, possibly interpretable as dimension-degree steps, connected the atomic mass of codon bases with codon-grouped amino acids and with the exponent-2/3 series in several astonishing ways. Thus, it is suggested that they may be part of a deeper reference system. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ward, Robin Eichel
This research explored the effects of Roundhouse diagram construction and use on meaningful learning of science concepts in a 6th-grade science classroom. This investigation examined the transformation of students' science concepts as they became more proficient in constructing Roundhouse diagrams, what problems students encountered while constructing Roundhouse diagrams, and how choices of iconic images affected their progress in meaningfully learning science concepts as they constructed a series of Roundhouse diagrams. The process of constructing a Roundhouse diagram involved recognizing the learner's relevant existing concepts, evaluating the central concepts for a science lesson and breaking them down into their component parts, reconstructing the learner's conceptual framework by reducing the amount of detail efficiently, reviewing the reconstruction process, and linking each key concept to an iconic image. The researcher collected and analyzed qualitative and quantitative data to determine the effectiveness of the Roundhouse diagram. Data included field notes, observations, students' responses to Roundhouse diagram worksheets, students' perceptions from evaluation sheets, students' mastery of technique sheets, tapes and transcripts of students' interviews, student-constructed Roundhouse diagrams, and documentation of science grades both pre- and post-Roundhouse diagramming. This multiple case study focused on six students although the whole class was used for statistical purposes. Stratified purposeful sampling was used to facilitate comparisons as well as week-by-week comparisons of students' science grades and Roundhouse diagram scores to gain additional insight into the effectiveness of the Roundhouse diagramming method. Through participation in constructing a series of Roundhouse diagrams, middle school students gained a greater understanding of science concepts. Roundhouse diagram scores improved over time during the 10-week Roundhouse diagramming session. Students' science scores improved as they became more proficient in constructing the Roundhouse diagrams. The major problems associated with constructing Roundhouse diagrams were extracting the main ideas from the textbook, understanding science concepts in terms of whole/part relationships, paraphrasing sentences effectively, and sequencing events in an accurate order. A positive relationship existed for the case study group based on students' choices and drawings of iconic images and the meaningful learning of science concepts.
Welsh, Stuart A.; Chen, Yushun; Viadero, Stuart C.; Wei, Xinchao; Hedrick, Lara B.; Anderson, James T.; Lin, Lian-Shin
2009-01-01
Refining best management practices (BMPs) for future highway construction depends on a comprehensive understanding of the environmental impacts of current construction methods. Based on a before-after-control-impact (BACI) experimental design, long-term stream monitoring (1997–2006) was conducted at upstream (control, n = 3) and downstream (impact, n = 6) sites in the Lost River watershed of the Mid-Atlantic Highlands region, West Virginia. Monitoring data were analyzed to assess impacts during and after highway construction on 15 water quality parameters and on macroinvertebrate condition using the West Virginia stream condition index (WVSCI). Principal components analysis (PCA) identified the primary regional water quality variances, and paired t tests and time series analysis detected seven construction-impacted water quality parameters, which were mainly associated with the second principal component. In particular, impacts on turbidity, total suspended solids, and total iron during construction; impacts on chloride and sulfate during and after construction; and impacts on acidity and nitrate after construction were observed at the downstream sites. The construction had statistically significant impacts on macroinvertebrate index scores (i.e., WVSCI) after construction, but did not change the overall good biological condition. Implementing BMPs that address these construction-impacted water quality parameters can be an effective mitigation strategy for future highway construction in this highlands region.
Extracting Leading Nonlinear Modes of Changing Climate From Global SST Time Series
NASA Astrophysics Data System (ADS)
Mukhin, D.; Gavrilov, A.; Loskutov, E. M.; Feigin, A. M.; Kurths, J.
2017-12-01
Data-driven modeling of climate requires adequate principal variables extracted from observed high-dimensional data. Constructing such variables requires finding spatial-temporal patterns that explain a substantial part of the variability and comprise all dynamically related time series in the data. The difficulties of this task arise from the nonlinearity and non-stationarity of the climate dynamical system. The nonlinearity makes linear methods of data decomposition insufficient for separating the different processes entangled in the observed time series. On the other hand, various forcings, both anthropogenic and natural, make the dynamics non-stationary, and we should be able to describe the response of the system to such forcings in order to separate out the modes explaining the internal variability. The method we present is aimed at overcoming both of these problems. It is based on the Nonlinear Dynamical Mode (NDM) decomposition [1,2], but takes external forcing signals into account. Each mode depends on hidden, a priori unknown, time series which, together with the external forcing time series, are mapped onto data space. Finding both the hidden signals and the mapping allows us to study the evolution of the modes' structure under changing external conditions and to compare the roles of internal variability and forcing in the observed behavior. The method is used to extract the principal modes of SST variability on inter-annual and multidecadal time scales, accounting for external forcings such as CO2, variations in solar activity, and volcanic activity. The structure of the revealed teleconnection patterns as well as their forecast under different CO2 emission scenarios are discussed. [1] Mukhin, D., Gavrilov, A., Feigin, A., Loskutov, E., & Kurths, J. (2015). Principal nonlinear dynamical modes of climate variability. Scientific Reports, 5, 15510. [2] Gavrilov, A., Mukhin, D., Loskutov, E., Volodin, E., Feigin, A., & Kurths, J. (2016).
Method for reconstructing nonlinear modes with adaptive structure from multidimensional data. Chaos: An Interdisciplinary Journal of Nonlinear Science, 26(12), 123101.
ERIC Educational Resources Information Center
Stickler, Leslie; Sykes, Gary
2016-01-01
This report reviews the scholarly and research evidence supporting the construct labeled modeling and explaining content (MEC), which is measured via a performance assessment in the "ETS"® National Observational Teaching Examination (NOTE) assessment series. This construct involves practices at the heart of teaching that deal with how…
Characteristic research on Hong Kong "I learned" series computer textbooks
NASA Astrophysics Data System (ADS)
Hu, Jinyan; Liu, Zhongxia; Li, Yuanyuan; Lu, Jianheng; Zhang, Lili
2011-06-01
Currently, the construction of information technology textbooks for primary and middle schools is an important part of the information technology curriculum reform. By analyzing and distilling the characteristics of the high-quality Hong Kong textbook series "I Learn · Elementary School Computer Cognitive Curriculum", this article aims to provide inspiration and a reference for the construction and development of information technology teaching materials in mainland Chinese schools.
Shi, Wei; Xia, Jun
2017-02-01
Water quality risk management is a topic of global research interest, closely linked with sustainable water resource development. Ammonium nitrogen (NH3-N) and the permanganate index (CODMn), the focus indicators in the Huai River Basin, are selected to reveal their joint transition laws based on Markov theory. A time-varying moments model, with either time or a land cover index as the explanatory variable, is applied to build the time-varying marginal distributions of the water quality time series. A time-varying copula model, which takes into consideration the non-stationarity in the marginal distributions and/or the time variation in the dependence structure between the water quality series, is constructed to describe a bivariate frequency analysis for the NH3-N and CODMn series at the same monitoring gauge. The larger first-order Markov joint transition probability indicates that the water quality states Class Vw, Class IV, and Class III will occur easily in the water body at Bengbu Sluice. Both the marginal distribution and copula models are nonstationary, and the explanatory variable time yields better performance than the land cover index in describing the non-stationarities in the marginal distributions. In modelling changes in the dependence structure, the time-varying copula has a better fitting performance than copulas with a constant or time-trend dependence parameter. The largest synchronous encounter risk probability of NH3-N and CODMn simultaneously reaching Class V is 50.61%, while the asynchronous encounter risk probability is largest when NH3-N and CODMn are inferior to the Class V and Class IV water quality standards, respectively.
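The first-order Markov transitions underpinning this kind of analysis can be estimated directly from a categorized water-quality series. A minimal sketch, using a made-up class sequence rather than the Huai River data (the function name and class labels are illustrative):

```python
import numpy as np

def transition_matrix(states, n_classes):
    """Estimate a first-order Markov transition matrix from a sequence
    of categorical water-quality classes coded 0..n_classes-1."""
    counts = np.zeros((n_classes, n_classes))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0   # avoid division by zero for unvisited classes
    return counts / row_sums

# Toy sequence of monthly quality classes (hypothetical data)
seq = [2, 2, 3, 3, 3, 2, 1, 2, 3, 4, 4, 3, 2, 2]
P = transition_matrix(seq, 5)       # P[i, j] = Prob(next class = j | current = i)
```

Joint transition probabilities for two indicators follow the same counting scheme over the product of their class labels.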
Tong, Feifei; Lian, Yan; Zhou, Huang; Shi, Xiaohong; He, Fengjiao
2014-10-21
A new multichannel series piezoelectric quartz crystal (MSPQC) cell sensor for real-time monitoring of living cells in vitro is reported in this paper. The constructed sensor was used successfully to monitor the adhesion, spreading, proliferation, and apoptosis of MG63 osteosarcoma cells and to investigate the effects of different concentrations of cobalt chloride on MG63 cells. Quantitative, real-time, dynamic cell analyses were conducted using the MSPQC cell sensor. Compared with methods such as fluorescence staining and morphological observation by microscopy, the MSPQC cell sensor is noninvasive, label-free, simple, cheap, and capable of online monitoring. It can automatically record the growth status of cells and quantitatively evaluate cell proliferation and the apoptotic response to drugs. It will be a valuable detection and analysis tool for the acquisition of cellular-level information and is anticipated to find application in cell biology research and cytotoxicity testing in the future.
Extended space expectation values in quantum dynamical system evolutions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demiralp, Metin
2014-10-06
The time-variant power series expansion for the expectation value of a given quantum dynamical operator is a well-known and well-investigated issue in quantum dynamics. However, depending on singularities of the operator and the Hamiltonian, this expansion either may not exist or may not converge for all time instances except the beginning of the evolution. This work focuses on this issue and seeks certain cures for these negativities. We work in the extended space obtained by adding all images of the initial wave function under positive integer powers of the system Hamiltonian. This requires the introduction of certain appropriately defined weight operators. The resulting better convergence of the temporal power series urges us to call the newly defined entities "extended space expectation values," even though they are constructed over certain weight operators and are somehow pseudo expectation values.
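For context, the series in question is the standard temporal Maclaurin expansion of the expectation value, whose terms involve repeated applications of the Hamiltonian to the initial state — precisely the images spanning the extended space. A sketch from standard quantum mechanics (not taken verbatim from the paper):

```latex
\langle \hat{A} \rangle(t)
  = \langle \psi(0) \,|\, e^{i\hat{H}t/\hbar}\, \hat{A}\, e^{-i\hat{H}t/\hbar} \,|\, \psi(0) \rangle
  = \sum_{n=0}^{\infty} \frac{1}{n!} \left( \frac{i t}{\hbar} \right)^{\!n}
    \langle \psi(0) \,|\, [\hat{H}, \hat{A}]_{(n)} \,|\, \psi(0) \rangle ,
\qquad
[\hat{H}, \hat{A}]_{(0)} = \hat{A}, \quad
[\hat{H}, \hat{A}]_{(n)} = [\hat{H}, [\hat{H}, \hat{A}]_{(n-1)}] .
```

Expanding the nested commutators produces inner products of the form $\langle \psi(0) | \hat{H}^{k} \hat{A}\, \hat{H}^{m} | \psi(0) \rangle$, i.e., quantities built from the images $\hat{H}^{m}|\psi(0)\rangle$ — hence the motivation for adjoining those images to the working space.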
NASA Technical Reports Server (NTRS)
Rhodes, E. J., Jr.; Cacciani, A.; Blamont, J.; Tomczyk, S.; Ulrich, R. K.; Howard, R. F.
1984-01-01
A program was developed to evaluate the performance of three different devices as possible space-borne solar velocity field imagers. Two of these three devices, a magneto-optical filter and a molecular adherence Fabry-Perot interferometer, were installed in a newly-constructed observing system located at the 60-foot tower telescope at the Mt. Wilson Observatory. Time series of solar filtergrams and Dopplergrams lasting up to 10 hours per day were obtained with the filter, while shorter runs were obtained with the Fabry-Perot. Two-dimensional k_h-omega power spectra, which clearly show the well-known p-mode ridges, were computed from the time series obtained with the magneto-optical filter. These power spectra were compared with similar power spectra obtained recently with the 13.7-m McMath spectrograph at Kitt Peak.
Testing for detailed balance in a financial market
NASA Astrophysics Data System (ADS)
Fiebig, H. R.; Musgrove, D. P.
2015-06-01
We test a historical price-time series in a financial market (the NASDAQ 100 index) for a statistical property known as detailed balance. The presence of detailed balance would imply that the market can be modeled by a stochastic process based on a Markov chain, thus leading to equilibrium. In economic terms, a positive outcome of the test would support the efficient market hypothesis, a cornerstone of neo-classical economic theory. In contrast to the usage in prevalent economic theory the term equilibrium here is tied to the returns, rather than the price-time series. The test is based on an action functional S constructed from the elements of the detailed balance condition and the historical data set, and then analyzing S by means of simulated annealing. Checks are performed to verify the validity of the analysis method. We discuss the outcome of this analysis.
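A reversible (detailed-balance) Markov chain satisfies π_i P_ij = π_j P_ji, so in a stationary sample the empirical counts of i→j and j→i transitions should agree up to noise. A crude illustrative check on discretized returns follows; this is not the action-functional/simulated-annealing test used in the paper, and the binning and names are ad hoc:

```python
import numpy as np

def balance_asymmetry(series, n_bins=5):
    """Discretize log-returns into quantile bins, count bin-to-bin
    transitions, and measure the asymmetry of the count matrix. For a
    reversible chain, off-diagonal flows n_ij and n_ji should agree up
    to sampling noise, making this chi-square-like statistic small."""
    returns = np.diff(np.log(series))
    edges = np.quantile(returns, np.linspace(0, 1, n_bins + 1))
    states = np.clip(np.searchsorted(edges, returns, side="right") - 1,
                     0, n_bins - 1)
    n = np.zeros((n_bins, n_bins))
    for a, b in zip(states[:-1], states[1:]):
        n[a, b] += 1
    num, den = (n - n.T) ** 2, n + n.T
    mask = den > 0
    return 0.5 * np.sum(num[mask] / den[mask])

rng = np.random.default_rng(0)
prices = np.exp(np.cumsum(rng.normal(0, 0.01, 5000)))  # reversible toy walk
stat = balance_asymmetry(prices)
```

For the iid-increment toy walk the statistic stays near its chi-square expectation; a market series violating detailed balance would inflate it.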
NASA Technical Reports Server (NTRS)
Talpe, Matthieu J.; Nerem, R. Steven; Forootan, Ehsan; Schmidt, Michael; Lemoine, Frank G.; Enderlin, Ellyn M.; Landerer, Felix W.
2017-01-01
We construct long-term time series of Greenland and Antarctic ice sheet mass change from satellite gravity measurements. A statistical reconstruction approach is developed based on a principal component analysis (PCA) to combine high-resolution spatial modes from the Gravity Recovery and Climate Experiment (GRACE) mission with the gravity information from conventional satellite tracking data. Uncertainties of this reconstruction are rigorously assessed; they include temporal limitations for short GRACE measurements, spatial limitations for the low-resolution conventional tracking data measurements, and limitations of the estimated statistical relationships between low- and high-degree potential coefficients reflected in the PCA modes. Trends of mass variations in Greenland and Antarctica are assessed against a number of previous studies. The resulting time series for Greenland show a higher rate of mass loss than other methods before 2000, while the Antarctic ice sheet appears heavily influenced by interannual variations.
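The core of such a PCA-based reconstruction is a truncated EOF decomposition: spatial modes learned from the high-resolution record, and anomalies re-expressed in that low-rank basis. A self-contained sketch on synthetic data (grid size, mode count, and noise level are arbitrary; the actual method additionally links these modes to low-degree potential coefficients from conventional tracking data):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical setup: 120 months of high-resolution fields on 50 grid cells
t_hi, n_grid, k = 120, 50, 3
true_modes = rng.normal(size=(k, n_grid))
pcs = rng.normal(size=(t_hi, k))
X = pcs @ true_modes + 0.1 * rng.normal(size=(t_hi, n_grid))

# EOF/PCA decomposition of the anomaly field
Xa = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xa, full_matrices=False)
eofs = Vt[:k]                      # leading spatial modes
recon = (Xa @ eofs.T) @ eofs       # rank-k reconstruction of the anomalies

explained = 1 - np.sum((Xa - recon) ** 2) / np.sum(Xa ** 2)
```

With the modes in hand, a longer but coarser record can be regressed onto the PC time series to extend the high-resolution reconstruction backward in time.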
Howell, G. P.; Ryan, J. M.; Morgans, B. T.; Cooper, G. J.
1991-01-01
Laparotomy and anastomosis of the small bowel after penetrating injury to the abdomen is a lengthy procedure. This paper describes the use of skin staplers for bowel anastomosis and presents the results of a short series of experiments upon dead pigs to compare the staple technique with conventional handsewn anastomosis. The time taken to perform each small bowel anastomosis, the integrity of the anastomosis and the skill required were assessed. The staple technique was considerably faster (mean construction time: 5.4 min, range 4-6 min) than the handsewn technique (mean construction time: 12 min, range 10-14 min), at least halving the anastomosis time (Kolmogorov two-sample test P = 0.05). In addition, the stapled anastomosis had a higher intraluminal failure pressure (mean failure pressure: 65 cmH2O, 6.37 kPa, range 30-70 cmH2O) than the handsewn anastomosis (mean failure pressure: 38.6 cmH2O, 3.78 kPa, range 10-70 cmH2O). PMID:2018326
[Multivariate Adaptive Regression Splines (MARS), an alternative for the analysis of time series].
Vanegas, Jairo; Vásquez, Fabián
Multivariate Adaptive Regression Splines (MARS) is a non-parametric modelling method that extends the linear model, incorporating nonlinearities and interactions between variables. It is a flexible tool that automates the construction of predictive models: selecting relevant variables, transforming the predictor variables, processing missing values, and preventing overfitting via self-testing. It is also able to predict, taking into account structural factors that might influence the outcome variable, thereby generating hypothetical models. The end result can identify relevant cut-off points in data series. The method is rarely used in health research, so it is proposed here as a tool for the evaluation of relevant public health indicators. For demonstration purposes, data series on the mortality of children under 5 years of age in Costa Rica, covering the period 1978-2008, were used. Copyright © 2016 SESPAS. Publicado por Elsevier España, S.L.U. All rights reserved.
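The building blocks of MARS are hinge functions max(0, x − t), added greedily and refit by least squares. A toy, one-dimensional caricature of the forward pass (real MARS adds mirrored hinge pairs, handles many predictors, and prunes with generalized cross-validation; all names and data here are illustrative):

```python
import numpy as np

def hinge(x, t):
    return np.maximum(0.0, x - t)

def fit_mars_1d(x, y, candidate_knots, n_hinges=3):
    """Greedy forward selection of hinge basis functions, refitting the
    whole model by least squares at each step."""
    basis, chosen = [np.ones_like(x)], []
    for _ in range(n_hinges):
        best = None
        for t in candidate_knots:
            if t in chosen:
                continue
            B = np.column_stack(basis + [hinge(x, t)])
            coef, *_ = np.linalg.lstsq(B, y, rcond=None)
            rss = np.sum((y - B @ coef) ** 2)
            if best is None or rss < best[0]:
                best = (rss, t)
        chosen.append(best[1])
        basis.append(hinge(x, best[1]))
    B = np.column_stack(basis)
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    return chosen, coef, B @ coef

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 10, 200))
# Piecewise-linear truth with a cut-off point at x = 4
y = np.where(x < 4, 1.0, 1.0 + 2.0 * (x - 4)) + 0.1 * rng.normal(size=200)
knots_found, coef, yhat = fit_mars_1d(x, y, np.arange(1, 10, 0.5))
```

The recovered knot near x = 4 is exactly the kind of "relevant cut-off point" the abstract refers to.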
Construction Theory and Noise Analysis Method of Global CGCS2000 Coordinate Frame
NASA Astrophysics Data System (ADS)
Jiang, Z.; Wang, F.; Bai, J.; Li, Z.
2018-04-01
The definition, renewal, and maintenance of geodetic datums have long been issues of international concern. In recent years, many countries have been studying and implementing the modernization and renewal of their local geodetic reference coordinate frames. Based on precise results from roughly 15 years of continuous observations by the national CORS (continuously operating reference system) network and from the mainland GNSS (Global Navigation Satellite System) network between 1999 and 2007, this paper studies the construction of a mathematical model of the Global CGCS2000 frame, and mainly analyzes the theory and algorithm of the two-step method for formulating the Global CGCS2000 Coordinate Frame. Finally, the noise characteristics of the coordinate time series are estimated quantitatively using the criterion of maximum likelihood estimation.
ERIC Educational Resources Information Center
Odell, John H.
A school construction guide offers key personnel in school development projects information on the complex task of master planning and construction of schools in Australia. This chapter of the guide provides advice on the various types of construction that may be used, the materials available, and some elementary aspects of the services required…
Careers in Construction: Construction Industry Series: Student Manual and Instructor's Guide.
ERIC Educational Resources Information Center
Texas Education Agency, Austin. Dept. of Occupational Education and Technology.
The guide for instructors of construction occupations provides instructional suggestions and informational sources for structuring an exploratory program. The program is divided into the following blocks, representing different experiences in construction: (1) wood; (2) finishing; (3) engineering, support, and management services; (4) metal; (5)…
In and out of glacial extremes by way of dust-climate feedbacks.
Shaffer, Gary; Lambert, Fabrice
2018-02-27
Mineral dust aerosols cool Earth directly by scattering incoming solar radiation and indirectly by affecting clouds and biogeochemical cycles. Recent Earth history has featured quasi-100,000-y, glacial-interglacial climate cycles with lower/higher temperatures and greenhouse gas concentrations during glacials/interglacials. Global average, glacial maxima dust levels were more than 3 times higher than during interglacials, thereby contributing to glacial cooling. However, the timing, strength, and overall role of dust-climate feedbacks over these cycles remain unclear. Here we use dust deposition data and temperature reconstructions from ice sheet, ocean sediment, and land archives to construct dust-climate relationships. Although absolute dust deposition rates vary greatly among these archives, they all exhibit striking, nonlinear increases toward coldest glacial conditions. From these relationships and reconstructed temperature time series, we diagnose glacial-interglacial time series of dust radiative forcing and iron fertilization of ocean biota, and use these time series to force Earth system model simulations. The results of these simulations show that dust-climate feedbacks, perhaps set off by orbital forcing, push the system in and out of extreme cold conditions such as glacial maxima. Without these dust effects, glacial temperature and atmospheric CO 2 concentrations would have been much more stable at higher, intermediate glacial levels. The structure of residual anomalies over the glacial-interglacial climate cycles after subtraction of dust effects provides constraints for the strength and timing of other processes governing these cycles. Copyright © 2018 the Author(s). Published by PNAS.
Cohen, Michael X
2017-09-27
The number of simultaneously recorded electrodes in neuroscience is steadily increasing, providing new opportunities for understanding brain function, but also new challenges for appropriately dealing with the increase in dimensionality. Multivariate source separation analysis methods have been particularly effective at improving signal-to-noise ratio while reducing the dimensionality of the data and are widely used for cleaning, classifying and source-localizing multichannel neural time series data. Most source separation methods produce a spatial component (that is, a weighted combination of channels to produce one time series); here, this is extended to apply source separation to a time series, with the idea of obtaining a weighted combination of successive time points, such that the weights are optimized to satisfy some criteria. This is achieved via a two-stage source separation procedure, in which an optimal spatial filter is first constructed and then its optimal temporal basis function is computed. This second stage is achieved with a time-delay-embedding matrix, in which additional rows of a matrix are created from time-delayed versions of existing rows. The optimal spatial and temporal weights can be obtained by solving a generalized eigendecomposition of covariance matrices. The method is demonstrated in simulated data and in an empirical electroencephalogram study on theta-band activity during response conflict. Spatiotemporal source separation has several advantages, including defining empirical filters without the need to apply sinusoidal narrowband filters. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
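The first (spatial) stage of such a procedure reduces to a generalized eigendecomposition of two channel covariance matrices: one emphasizing the signal of interest (S) and one serving as reference (R). A toy sketch with a simulated 6 Hz source (channel count, sampling rate, and the way S is formed are illustrative; the paper's narrowband criterion and the second, temporal stage via time-delay embedding are omitted):

```python
import numpy as np

rng = np.random.default_rng(3)
n_ch, n_t = 8, 5000
t = np.arange(n_t) / 250.0                       # hypothetical 250 Hz sampling
mix = rng.normal(size=n_ch)                      # source projection pattern
theta = np.sin(2 * np.pi * 6 * t)                # 6 Hz "theta" source
data = np.outer(mix, theta) + rng.normal(size=(n_ch, n_t))

# Reference covariance R from the raw data; signal covariance S from a
# signal-heavy version standing in for narrowband filtering.
R = np.cov(data)
S = np.cov(3 * np.outer(mix, theta) + 0.5 * rng.normal(size=(n_ch, n_t)))

# Generalized eigendecomposition R^{-1} S w = lambda w: the eigenvector with
# the largest eigenvalue is the spatial filter maximizing the
# signal-to-reference variance ratio.
evals, evecs = np.linalg.eig(np.linalg.solve(R, S))
w = np.real(evecs[:, np.argmax(np.real(evals))])
component = w @ data                             # one reduced time series
corr = np.corrcoef(component, theta)[0, 1]
```

The temporal stage described in the abstract would repeat the same decomposition on a time-delay-embedded version of `component` to obtain an optimal temporal basis function.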
Flow networks for Ocean currents
NASA Astrophysics Data System (ADS)
Tupikina, Liubov; Molkenthin, Nora; Marwan, Norbert; Kurths, Jürgen
2014-05-01
Complex networks have been successfully applied to various systems such as society, technology, and, recently, climate. Links in a climate network are defined between two geographical locations if the correlation between the time series of some climate variable exceeds a threshold; network links are therefore considered to imply heat exchange. However, the relationship between oceanic and atmospheric flows and the structure of a climate network is still unclear. Recently, a theoretical approach verifying the correlation between ocean currents and surface air temperature networks has been introduced, where Pearson correlation networks were constructed from advection-diffusion dynamics on an underlying flow. Since the continuous approach has its limitations, e.g., its high computational complexity, we here introduce a new, discrete construction of flow-networks, which is then applied to static and dynamic velocity fields. Analyzing the flow-networks of prototypical flows, we find that our approach can highlight zones of high velocity by degree and transition zones by betweenness, while the combination of these network measures can uncover how the flow propagates in time. We also apply the method to time series data of the Equatorial Pacific Ocean Current and the Gulf Stream for changing velocity fields, which was not possible before, and analyse the properties of the dynamical system. Flow-networks are powerful tools for understanding the step from a system's dynamics to a network topology that can be analyzed with network measures, and for shedding light on different climatic phenomena.
Two-body potential model based on cosine series expansion for ionic materials
Oda, Takuji; Weber, William J.; Tanigawa, Hisashi
2015-09-23
We examine a method for constructing two-body potential models for ionic materials from a Fourier series basis. In this method, the coefficients of the cosine basis functions are uniquely determined by solving simultaneous linear equations to minimize the sum of weighted mean square errors in energy, force, and stress, with first-principles calculation results used as the reference data. As a validation test of the method, potential models for magnesium oxide are constructed. The mean square errors converge appropriately with respect to the truncation of the cosine series. This result mathematically indicates that the constructed potential model is sufficiently close to the one that would be achieved with the non-truncated Fourier series, and demonstrates that this potential virtually provides the minimum error from the reference data attainable within a two-body representation. The constructed potential models work appropriately in both molecular statics and dynamics simulations, especially if a two-step correction to revise errors expected in the reference data is performed, and the models clearly outperform the two existing Buckingham potential models that were tested. Moreover, the good agreement with first-principles calculations over a broad range of energies and forces should enable the prediction of materials behavior away from equilibrium conditions, such as a system under irradiation.
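Because the model is linear in the cosine coefficients, the fit reduces to linear least squares. A sketch with a synthetic reference curve standing in for first-principles data (the cutoff, term count, and Buckingham-like target are arbitrary; the actual scheme also fits forces and stresses with weights):

```python
import numpy as np

def cosine_basis(r, n_terms, r_cut):
    """Basis functions cos(k*pi*r/r_cut) for k = 0..n_terms-1,
    evaluated at pair separations r."""
    k = np.arange(n_terms)
    return np.cos(np.outer(r, k) * np.pi / r_cut)

rng = np.random.default_rng(4)
r_cut = 8.0
r = np.linspace(1.5, r_cut, 400)
# Stand-in "reference data": a Buckingham-like pair energy plus noise
v_ref = 1000 * np.exp(-r / 0.3) - 30 / r**6 + 1e-3 * rng.normal(size=r.size)

n_terms = 25
A = cosine_basis(r, n_terms, r_cut)
coef, *_ = np.linalg.lstsq(A, v_ref, rcond=None)   # simultaneous linear eqs.
v_fit = A @ coef
rmse = np.sqrt(np.mean((v_fit - v_ref) ** 2))
```

Increasing `n_terms` and watching `rmse` plateau mirrors the truncation-convergence check described in the abstract.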
Ultrasonic Cole-Cole diagram for solutions and application to alpha-chymotrypsin.
Cerf, R; Salehi, S T; Rogez, D
1989-04-01
Deconvolution of ultrasonic data into single relaxations is rarely feasible when only the absorption or the velocity of the waves is measured. Here we use both series of data to construct a Cole-Cole diagram for a solution. When applied to alpha-chymotrypsin, this method shows two relaxations that are well separated on the time scale, a result that will help simplify analyses of the ultrasonic data for this enzyme.
Construction of Solar-Wind-Like Magnetic Fields
NASA Technical Reports Server (NTRS)
Roberts, Dana Aaron
2012-01-01
Fluctuations in the solar wind tend not only to have velocities and magnetic fields correlated in the sense consistent with Alfven waves traveling from the Sun, but also to have a remarkably constant magnetic field magnitude despite being broadband. This paper provides, for the first time, a method for constructing fields with nearly constant magnetic field magnitude, zero divergence, and any specified power spectrum for the fluctuations of the components of the field. Every wave vector, k, is associated with two polarizations; the relative phases of these can be chosen to minimize the variance of the field magnitude while retaining the random character of the fields. The method is applied to a case with one spatial coordinate that demonstrates good agreement with observed time series and power spectra of the magnetic field in the solar wind, as well as with the distribution of the angles of rapid changes (discontinuities), thus showing a deep connection between two seemingly unrelated issues. It is suggested that using this construction will lead to more realistic simulations of solar wind turbulence and of the propagation of energetic particles.
Small-world bias of correlation networks: From brain to climate
NASA Astrophysics Data System (ADS)
Hlinka, Jaroslav; Hartman, David; Jajcay, Nikola; Tomeček, David; Tintěra, Jaroslav; Paluš, Milan
2017-03-01
Complex systems are commonly characterized by the properties of their graph representation. Dynamical complex systems are then typically represented by a graph of temporal dependencies between time series of state variables of their subunits. It has been shown recently that graphs constructed in this way tend to have relatively clustered structure, potentially leading to spurious detection of small-world properties even in the case of systems with no or randomly distributed true interactions. However, the strength of this bias depends heavily on a range of parameters and its relevance for real-world data has not yet been established. In this work, we assess the relevance of the bias using two examples of multivariate time series recorded in natural complex systems. The first is the time series of local brain activity as measured by functional magnetic resonance imaging in resting healthy human subjects, and the second is the time series of average monthly surface air temperature coming from a large reanalysis of climatological data over the period 1948-2012. In both cases, the clustering in the thresholded correlation graph is substantially higher compared with a realization of a density-matched random graph, while the shortest paths are relatively short, showing thus distinguishing features of small-world structure. However, comparable or even stronger small-world properties were reproduced in correlation graphs of model processes with randomly scrambled interconnections. This suggests that the small-world properties of the correlation matrices of these real-world systems indeed do not reflect genuinely the properties of the underlying interaction structure, but rather result from the inherent properties of the correlation matrix.
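The bias can be demonstrated in a few lines: threshold the correlation matrix of mutually independent AR(1) processes and compare the clustering of the resulting graph with the density-matched Erdős-Rényi expectation. This is an illustration of the effect, not the paper's analysis pipeline; series length, AR coefficient, and density are arbitrary:

```python
import numpy as np

def transitivity(A):
    """Global clustering coefficient of an undirected 0/1 adjacency matrix:
    trace(A^3) counts each triangle 6 times, sum(deg*(deg-1)) counts each
    connected triple twice, and the two factors of 2/6 cancel to 1/3*3."""
    deg = A.sum(axis=1)
    triples = np.sum(deg * (deg - 1))
    return np.trace(A @ A @ A) / triples if triples > 0 else 0.0

rng = np.random.default_rng(5)
n_series, n_t, phi = 60, 200, 0.9
# Independent AR(1) processes: no true interactions between the series
x = np.zeros((n_series, n_t))
eps = rng.normal(size=(n_series, n_t))
for i in range(1, n_t):
    x[:, i] = phi * x[:, i - 1] + eps[:, i]

C = np.corrcoef(x)
np.fill_diagonal(C, 0.0)
density = 0.1
thresh = np.quantile(np.abs(C[np.triu_indices(n_series, 1)]), 1 - density)
A = (np.abs(C) >= thresh).astype(float)
np.fill_diagonal(A, 0.0)

cc_corr = transitivity(A)
cc_random = density   # expected clustering of a density-matched random graph
```

For truly independent processes the graph should ideally look Erdős-Rényi (clustering close to the density); the typical excess of `cc_corr` over `cc_random` is the inherent correlation-matrix bias discussed above.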
Sensitivity vector fields in time-delay coordinate embeddings: theory and experiment.
Sloboda, A R; Epureanu, B I
2013-02-01
Identifying changes in the parameters of a dynamical system can be vital in many diagnostic and sensing applications. Sensitivity vector fields (SVFs) are one way of identifying such parametric variations by quantifying their effects on the morphology of a dynamical system's attractor. In many cases, SVFs are a more effective means of identification than commonly employed modal methods. Previously, it has only been possible to construct SVFs for a given dynamical system when a full set of state variables is available. This severely restricts SVF applicability because it may be cost prohibitive, or even impossible, to measure the entire state in high-dimensional systems. Thus, the focus of this paper is constructing SVFs with only partial knowledge of the state by using time-delay coordinate embeddings. Local models are employed in which the embedded states of a neighborhood are weighted in a way referred to as embedded point cloud averaging. Application of the presented methodology to both simulated and experimental time series demonstrates its utility and reliability.
Alvine, Gregory F; Swain, James M; Asher, Marc A; Burton, Douglas C
2004-08-01
The controversy of burst fracture surgical management is addressed in this retrospective case study and literature review. The series consisted of 40 consecutive patients, index included, with 41 fractures treated with stiff, limited segment transpedicular bone-anchored instrumentation and arthrodesis from 1987 through 1994. No major acute complications such as death, paralysis, or infection occurred. For the 30 fractures with pre- and postoperative computed tomography studies, spinal canal compromise was 61% and 32%, respectively. Neurologic function improved in 7 of 14 patients (50%) and did not worsen in any. The principal problem encountered was screw breakage, which occurred in 16 of the 41 (39%) instrumented fractures. As we have previously reported, transpedicular anterior bone graft augmentation significantly decreased variable screw placement (VSP) implant breakage. However, it did not prevent Isola implant breakage in two-motion segment constructs. Compared with VSP, Isola provided better sagittal plane realignment and constructs that have been found to be significantly stiffer. Unplanned reoperation was necessary in 9 of the 40 patients (23%). At 1- and 2-year follow-up, 95% and 79% of patients were available for study, and a satisfactory outcome was achieved in 84% and 79%, respectively. These satisfaction and reoperation rates are consistent with the literature of the time. Based on these observations and the loads to which implant constructs are exposed following posterior realignment and stabilization of burst fractures, we recommend that three- or four-motion segment constructs, rather than two-motion, be used. To save valuable motion segments, planned construct shortening can be used. An alternative is sequential or staged anterior corpectomy and structural grafting.
Lienkamp, Karen; Madkour, Ahmad E.; Musante, Ashlan; Nelson, Christopher F.; Nüsslein, Klaus
2014-01-01
Synthetic Mimics of Antimicrobial Peptides (SMAMPs) imitate natural host-defense peptides, a vital component of the body's immune system. This work presents a molecular construction kit that allows the easy and versatile synthesis of a broad variety of facially amphiphilic oxanorbornene-derived monomers. Their ring-opening metathesis polymerization (ROMP) and deprotection provide several series of SMAMPs. Using amphiphilicity, monomer feed ratio, and molecular weight as parameters, polymers with 533 times higher selectivity (selectivity = hemolytic concentration/minimum inhibitory concentration) for bacteria over mammalian cells were discovered. Some of these polymers were 50 times more selective for Gram-positive over Gram-negative bacteria, while other polymers surprisingly showed the opposite preference. This kind of "double selectivity" (bacteria over mammalian cells and one bacterial type over another) is unprecedented in other polymer systems and is attributed to the monomer's facial amphiphilicity. PMID:18593128
NASA Astrophysics Data System (ADS)
Auer, I.; Kirchengast, A.; Proske, H.
2009-09-01
The ongoing climate change debate focuses more and more on changing extreme events. Information on past events can be derived from a number of sources, such as instrumental data and residual impacts in the landscape, but also from chronicles and people's memories. A project called "A Tale of Two Valleys", within the framework of the research program "proVision", allowed us to study past extreme events in two inner-alpine valleys using the sources mentioned above. Instrumental climate time series provided information for the past 200 years, although great attention had to be given to the homogeneity of the series. To derive homogenized time series of selected climate change indices, methods such as HOCLIS and Vincent were applied. Trend analyses of climate change indices inform about the increase or decrease of extreme events. Traces of major geomorphodynamic processes of the past (e.g., rockfalls, landslides, debris flows) that were triggered or affected by extreme weather events are still apparent in the landscape and could be evaluated by geomorphological analysis using remote sensing and field data. Regional chronicles provided additional knowledge and covered longer periods back in time; however, compared with meteorological time series, they carry a high degree of subjectivity, and intermittent recording cannot be ruled out. Finally, questionnaires and oral history complemented our picture of past extreme weather events: people were affected differently and have different memories. The joint analysis of these four data sources showed agreement to some extent, but also some explainable differences: meteorological data are point measurements only, sometimes with too coarse a temporal resolution, and, due to land-use changes and improved constructional measures, the impact of an extreme meteorological event may be different today compared with earlier times.
Holocene monsoon variability as resolved in small complex networks from palaeodata
NASA Astrophysics Data System (ADS)
Rehfeld, K.; Marwan, N.; Breitenbach, S.; Kurths, J.
2012-04-01
To understand the impacts of Holocene precipitation and/or temperature changes in the spatially extensive and complex region of Asia, it is promising to combine the information from palaeo-archives such as stalagmites, tree rings, and marine sediment records from India and China. To this end, complex networks present a powerful and increasingly popular tool for the description and analysis of interactions within complex spatially extended systems in the geosciences, and therefore appear predestined for this task. Such a network is typically constructed by thresholding a similarity matrix which in turn is based on a set of time series representing the (Earth) system dynamics at different locations. Looking into the pre-instrumental past, information about the system's processes and thus its state is available only through the reconstructed time series, which -- most often -- are irregularly sampled in time and space. Interpolation techniques are often used for signal reconstruction, but they introduce additional errors, especially when records have large gaps. We have recently developed and extensively tested methods to quantify linear (Pearson correlation) and non-linear (mutual information) similarity in the presence of heterogeneous and irregular sampling. To illustrate our approach, we derive small networks from significantly correlated (linked) time series which are supposed to capture the underlying Asian monsoon dynamics. We assess and discuss whether and where links and directionalities in these networks from irregularly sampled time series can be soundly detected. Finally, we investigate the role of Northern Hemisphere temperature with respect to the correlation patterns and find that those derived from warm phases (e.g., the Medieval Warm Period) are significantly different from patterns found in cold phases (e.g., the Little Ice Age).
Monitoring forest dynamics with multi-scale and time series imagery.
Huang, Chunbo; Zhou, Zhixiang; Wang, Di; Dian, Yuanyong
2016-05-01
To track forest dynamics and effectively evaluate forest ecosystem services, timely acquisition of spatial and quantitative information on forestland is necessary. Here, a new method is proposed for mapping forest cover changes by combining multi-scale satellite remote-sensing imagery with time series data. Using time series Normalized Difference Vegetation Index products derived from Moderate Resolution Imaging Spectroradiometer images (MODIS-NDVI) and Landsat Thematic Mapper/Enhanced Thematic Mapper Plus (TM/ETM+) images as data sources, a hierarchical stepwise analysis from coarse scale to fine scale was developed for detecting areas of forest change. At the coarse scale, MODIS-NDVI data with 1-km resolution were used to detect changes in land cover types, and a land cover change map was constructed using NDVI values for the vegetation growing seasons. At the fine scale, based on the coarse-scale results, Landsat TM/ETM+ data with 30-m resolution were used to precisely locate forest change and determine its trend by analyzing time series of forest vegetation indices (IFZ). The method was tested using data for Hubei Province, China. The MODIS-NDVI data from 2001 to 2012 were used to detect land cover changes, with an overall accuracy of 94.02% at the coarse scale. At the fine scale, the available TM/ETM+ images for the vegetation growing seasons between 2001 and 2012 were used to locate and verify forest changes in the Three Gorges Reservoir Area, with an overall accuracy of 94.53%. The accuracy of the two-layer hierarchical monitoring results indicates that the multi-scale monitoring method is feasible and reliable.
NASA Astrophysics Data System (ADS)
Zheng, Jinde; Pan, Haiyang; Cheng, Junsheng
2017-02-01
To timely detect incipient failures of rolling bearings and accurately locate faults, a novel rolling bearing fault diagnosis method is proposed based on composite multiscale fuzzy entropy (CMFE) and ensemble support vector machines (ESVMs). Fuzzy entropy (FuzzyEn), an improvement of sample entropy (SampEn), is a nonlinear method for measuring the complexity of time series. Since FuzzyEn (or SampEn) at a single scale cannot reflect this complexity effectively, multiscale fuzzy entropy (MFE) is developed by defining the FuzzyEn of coarse-grained time series, which represents the system dynamics at different scales. However, MFE values are affected by the data length, especially when the data are not long enough. By combining the information of the multiple coarse-grained time series at the same scale, the CMFE algorithm proposed in this paper enhances MFE as well as FuzzyEn. Compared with MFE, CMFE yields much more stable and consistent values for short-term time series as the scale factor increases. In this paper, CMFE is employed to measure the complexity of vibration signals of rolling bearings and is applied to extract the nonlinear features hidden in those signals. The physical reasons why CMFE is suitable for rolling bearing fault diagnosis are also explored. On this basis, an ensemble-SVM multi-classifier is constructed for the intelligent, automatic classification of fault features. Finally, the proposed fault diagnosis method is applied to experimental data, and the results indicate that it can effectively distinguish different fault categories and severities of rolling bearings.
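The coarse-graining and composite-averaging idea behind CMFE can be sketched as follows; the FuzzyEn implementation is a simplified textbook variant (baseline-removed templates, Chebyshev distance, Gaussian-style similarity), and the parameter values are illustrative rather than the authors' settings:

```python
import math
import random

def coarse_grain(x, tau, offset=0):
    """Non-overlapping window means: one coarse-grained series at scale tau."""
    return [sum(x[i:i + tau]) / tau for i in range(offset, len(x) - tau + 1, tau)]

def _phi(x, m, r):
    """Mean fuzzy similarity over all pairs of length-m, mean-removed templates."""
    vecs = []
    for i in range(len(x) - m + 1):
        v = x[i:i + m]
        mu = sum(v) / m
        vecs.append([a - mu for a in v])
    total, count = 0.0, 0
    for i in range(len(vecs)):
        for j in range(len(vecs)):
            if i != j:
                d = max(abs(a - b) for a, b in zip(vecs[i], vecs[j]))
                total += math.exp(-(d ** 2) / r)
                count += 1
    return total / count

def fuzzy_en(x, m=2, r=0.15):
    """FuzzyEn = ln(phi_m) - ln(phi_{m+1})."""
    return math.log(_phi(x, m, r)) - math.log(_phi(x, m + 1, r))

def cmfe(x, tau, m=2, r=0.15):
    """Composite MFE: average FuzzyEn over all tau coarse-grained series at scale tau."""
    return sum(fuzzy_en(coarse_grain(x, tau, k), m, r) for k in range(tau)) / tau
```

At scale 1 the composite reduces to plain FuzzyEn; at larger scales, averaging over all `tau` offset series is what stabilizes the estimate for short records.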
DOT National Transportation Integrated Search
2017-06-01
Performance analyses of newly constructed linear BMPs in retaining stormwater run-off from 1 in. precipitation in post-construction highway applications and urban areas were conducted using numerical simulations and field observation. A series of...
Comparison of Meteorological Data and Stable Isotope Time Series from an Indonesian Stalagmite
NASA Astrophysics Data System (ADS)
Watanabe, Y.; Matsuoka, H.; Sakai, S.; Ueda, J.; Yamada, M.; Ohsawa, S.; Kiguchi, M.; Satomura, T.; Nakai, S.; Brahmantyo, B.; Maryunani, K. A.; Tagami, T.; Takemura, K.; Yoden, S.
2007-12-01
In the last decade, geochemical records in stalagmites have been widely recognized as a powerful tool for the elucidation of the paleoclimate/environment of terrestrial areas. Previous data have mainly been reported from middle latitudes. This study, however, aims at reconstructing past climate variations in the Asian equatorial regions by using oxygen and carbon isotope ratios recorded in Indonesian stalagmites. In particular, we focused on the comparison of meteorological data and stable isotope time series from an Indonesian stalagmite, in order to check whether the geochemistry of the stalagmite is influenced by local precipitation. We performed geological surveys in the Buniayu limestone caves, Sukabumi, West Java, Indonesia, and collected a series of stalagmite/stalactite and drip water samples. A stalagmite sample was examined in thin section to identify banding. Moreover, to construct the age model of the stalagmite, we measured both (1) the number of bands and (2) uranium-series disequilibrium ages using MC-ICP-MS. These data suggest that the layers are predominantly annual bands. Oxygen and carbon isotope ratios were analyzed on the stalagmite at annual time scales. The carbon isotope ratios correlate clearly with the oxygen isotope ratios. Furthermore, the proxy data were compared with the meteorological data set of the past 80 years, showing a good correlation between the temporal variation of oxygen/carbon isotope ratios and annual precipitation. These lines of evidence suggest that the isotopic variation is predominantly caused by kinetic mass fractionation driven by the degassing of carbon dioxide in the cave.
NASA Astrophysics Data System (ADS)
Max-Moerbeck, W.; Richards, J. L.; Hovatta, T.; Pavlidou, V.; Pearson, T. J.; Readhead, A. C. S.
2014-11-01
We present a practical implementation of a Monte Carlo method to estimate the significance of cross-correlations in unevenly sampled time series of data, whose statistical properties are modelled with a simple power-law power spectral density. This implementation builds on published methods; we introduce a number of improvements in the normalization of the cross-correlation function estimate and a bootstrap method for estimating the significance of the cross-correlations. A closely related matter is the estimation of a model for the light curves, which is critical for the significance estimates. We present a graphical and quantitative demonstration that uses simulations to show how common it is to get high cross-correlations for unrelated light curves with steep power spectral densities. This demonstration highlights the dangers of interpreting them as signs of a physical connection. We show that by using interpolation and the Hanning sampling window function we are able to reduce the effects of red-noise leakage and to recover steep simple power-law power spectral densities. We also introduce the use of a Neyman construction for the estimation of the errors in the power-law index of the power spectral density. This method provides a consistent way to estimate the significance of cross-correlations in unevenly sampled time series of data.
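A much-simplified version of such a Monte Carlo significance estimate can be sketched as follows; for brevity it uses AR(1) "red noise" surrogates in place of the paper's simulated power-law power spectral densities, and the function names and parameters are illustrative:

```python
import math
import random

def ar1(n, phi=0.95):
    """AR(1) red-noise surrogate; a stand-in for full power-law PSD simulation."""
    x = [random.gauss(0, 1)]
    for _ in range(n - 1):
        x.append(phi * x[-1] + random.gauss(0, 1))
    return x

def xcorr0(a, b):
    """Normalized cross-correlation at zero lag (Pearson r)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    va = math.sqrt(sum((u - ma) ** 2 for u in a))
    vb = math.sqrt(sum((v - mb) ** 2 for v in b))
    return cov / (va * vb)

def significance_threshold(n, trials=2000, quantile=0.95):
    """|r| exceeded by chance only (1 - quantile) of the time for UNRELATED pairs."""
    rs = sorted(abs(xcorr0(ar1(n), ar1(n))) for _ in range(trials))
    return rs[int(quantile * trials)]
```

Running this with strongly autocorrelated surrogates yields chance thresholds far above the white-noise value, which is exactly the danger the abstract highlights: high cross-correlations between unrelated red-noise light curves are common.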
Discrimination of coherent features in turbulent boundary layers by the entropy method
NASA Technical Reports Server (NTRS)
Corke, T. C.; Guezennec, Y. G.
1984-01-01
Entropy in information theory is defined as the expected or mean value of the measure of the amount of self-information contained in the ith point of a distribution series x_i, based on its probability of occurrence p(x_i). If p(x_i) is the probability of the ith state of the system in probability space, then the entropy, E(X) = -Σ p(x_i) log p(x_i), is a measure of the disorder in the system. Based on this concept, a method was devised which sought to minimize the entropy in a time series in order to construct the signature of the most coherent motions. The constrained minimization was performed using a Lagrange multiplier approach, which resulted in the solution of a simultaneous set of non-linear coupled equations to obtain the coherent time series. The application of the method to space-time data taken by a rake of sensors in the near-wall region of a turbulent boundary layer is presented. The results yielded coherent velocity motions made up of locally decelerated or accelerated fluid having a streamwise scale of approximately 100 nu/u(tau), which is in qualitative agreement with the results from other less objective discrimination methods.
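The entropy objective E(X) = -Σ p(x_i) log p(x_i) can be computed for a binned time series as follows; the bin count is an arbitrary illustrative choice, and this shows only the quantity being minimized, not the constrained Lagrange-multiplier minimization itself:

```python
import math

def shannon_entropy(series, bins=16):
    """E(X) = -sum p(x_i) * log p(x_i), with p() estimated from a histogram."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / bins or 1.0  # degenerate (constant) series -> one bin
    counts = [0] * bins
    for x in series:
        k = min(int((x - lo) / width), bins - 1)
        counts[k] += 1
    n = len(series)
    return -sum(c / n * math.log(c / n) for c in counts if c)
```

A perfectly ordered (constant) signal has zero entropy, and entropy grows as the distribution spreads over more states, which is why minimizing it singles out the most coherent motions.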
NASA Astrophysics Data System (ADS)
Chen, Tsing-Chang; Yen, Ming-Cheng; Wu, Kuang-Der; Ng, Thomas
1992-08-01
The time evolution of the Indian monsoon is closely related to the locations of the northward-migrating monsoon troughs and ridges, which can be well depicted with the 30-60-day filtered 850-mb streamfunction. Thus, long-range forecasts of the large-scale low-level monsoon can be obtained from those of the filtered 850-mb streamfunction. These long-range forecasts were made in this study in terms of the Auto-Regressive (AR) Moving-Average process. The historical series of the AR model were constructed from 4-month time series of the 30-60-day filtered 850-mb streamfunction [ψ̃(850 mb)]. However, the phase of the last low-frequency cycle in the ψ̃(850 mb) time series can be skewed by the bandpass filtering. To reduce this phase skewness, a simple scheme is introduced. With this phase modification of the filtered 850-mb streamfunction, we performed pilot forecast experiments for three summers with the AR forecast process. The forecast errors in the positions of the northward-propagating monsoon troughs and ridges at Day 20 are generally within the range of 1-2 days behind the observed, except in some extreme cases.
Agreement evaluation of AVHRR and MODIS 16-day composite NDVI data sets
Ji, Lei; Gallo, Kevin P.; Eidenshink, Jeffery C.; Dwyer, John L.
2008-01-01
Satellite-derived normalized difference vegetation index (NDVI) data have been used extensively to detect and monitor vegetation conditions at regional and global levels. A combination of NDVI data sets derived from AVHRR and MODIS can be used to construct a long NDVI time series that may also be extended to VIIRS. Comparative analysis of NDVI data derived from AVHRR and MODIS is critical to understanding data continuity through the time series. In this study, the AVHRR and MODIS 16-day composite NDVI products were compared using regression and agreement analysis methods. The analysis shows a high agreement between the AVHRR-NDVI and MODIS-NDVI observed in 2002 and 2003 for the conterminous United States, but the difference between the two data sets is appreciable. Twenty per cent of the total difference between the two data sets is due to systematic difference, with the remainder due to unsystematic difference. The systematic difference can be eliminated with a linear regression-based transformation between the two data sets, and the unsystematic difference can be partially reduced by applying spatial filters to the data. We conclude that the continuity of the NDVI time series from AVHRR to MODIS is satisfactory, but a linear transformation between the two data sets is recommended.
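The linear regression-based transformation that removes the systematic difference amounts to ordinary least squares; a minimal sketch with illustrative names (the actual coefficients would come from the paired AVHRR/MODIS samples):

```python
def fit_linear(x, y):
    """Ordinary least-squares slope a and intercept b for y ≈ a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((u - mx) ** 2 for u in x)
    sxy = sum((u - mx) * (v - my) for u, v in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

def transform(avhrr_ndvi, a, b):
    """Map AVHRR NDVI onto the MODIS scale, removing the systematic difference."""
    return [a * v + b for v in avhrr_ndvi]
```

Applying `transform` with coefficients fit on co-located pixel pairs removes the systematic component of the difference; the remaining unsystematic scatter is what the spatial filtering targets.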
Tani, Yuji; Ogasawara, Katsuhiko
2012-01-01
This study aimed to contribute to the management of a healthcare organization by providing management information through time-series analysis of business data accumulated in the hospital information system, which had not been utilized thus far. We examined the performance of a prediction method using the auto-regressive integrated moving-average (ARIMA) model, applied to business data obtained at the Radiology Department. We built the model using the number of radiological examinations over the past 9 years, predicted the number of examinations in the final year, and then compared the forecast values with the actual values. Using free software, we established that the prediction method was simple and cost-effective. In addition, we were able to build a simple model by pre-processing the data to remove trend components. The difference between predicted and actual values was 10%; however, understanding the chronological change was more important than the individual time-series values. Furthermore, our method is highly versatile and adaptable to general time-series data. Therefore, different healthcare organizations can use our method for the analysis and forecasting of their business data.
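The modeling steps described above (remove the trend component by differencing, then fit an autoregressive model) can be sketched as a minimal ARIMA(1,1,0)-style forecaster; this is an illustrative stand-in, not the free software the authors used:

```python
def difference(series):
    """First-order differencing removes a linear trend component."""
    return [b - a for a, b in zip(series, series[1:])]

def fit_ar1(series):
    """Least-squares AR(1) coefficient: x[t] ≈ phi * x[t-1]."""
    num = sum(a * b for a, b in zip(series, series[1:]))
    den = sum(a * a for a in series[:-1])
    return num / den

def forecast(series, steps):
    """Difference, fit AR(1) on the differences, then re-integrate the forecasts."""
    d = difference(series)
    phi = fit_ar1(d)
    level, last = series[-1], d[-1]
    out = []
    for _ in range(steps):
        last = phi * last
        level += last
        out.append(level)
    return out
```

On a pure linear trend the differences are constant, the fitted AR(1) coefficient is 1, and the forecast simply extends the trend, which matches the intuition that differencing handles the trend while the AR part handles the residual dynamics.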
Global Warming Estimation From Microwave Sounding Unit
NASA Technical Reports Server (NTRS)
Prabhakara, C.; Iacovazzi, R., Jr.; Yoo, J.-M.; Dalu, G.
1998-01-01
Microwave Sounding Unit (MSU) Ch 2 data sets, collected from sequential, polar-orbiting, Sun-synchronous National Oceanic and Atmospheric Administration operational satellites, contain systematic calibration errors that are coupled to the diurnal temperature cycle over the globe. Since these coupled errors in MSU data differ between successive satellites, it is necessary to make compensatory adjustments to these multisatellite data sets in order to determine long-term global temperature change. With the aid of the observations during overlapping periods of successive satellites, we can determine such adjustments and use them to account for the coupled errors in the long-term time series of MSU Ch 2 global temperature. In turn, these adjusted MSU Ch 2 data sets can be used to yield the global temperature trend. In a pioneering study, Spencer and Christy (SC) (1990) developed a procedure to derive the global temperature trend from MSU Ch 2 data. However, that procedure can leave unaccounted residual errors in the time series of the temperature anomalies deduced by SC, which could lead to a spurious long-term temperature trend. In the present study, we have developed a method that avoids the shortcomings of the SC procedure: the magnitude of the coupled errors is not determined explicitly; instead, based on some assumptions, these errors are eliminated in three separate steps. Based on our analysis, we find a global warming of 0.23+/-0.12 K between 1980 and 1991.
Also, in this study, the time series of global temperature anomalies constructed by removing the global mean annual temperature cycle compares favorably with a similar time series obtained from conventional observations of temperature.
An Update of Sea Level Rise in the northwestern part of the Arabian Gulf
NASA Astrophysics Data System (ADS)
Alothman, Abdulaziz; Bos, Machiel; Fernandes, Rui
2017-04-01
Relative sea level variations in the northwestern part of the Arabian Gulf have been estimated in the past using no more than 10 to 15 years of observations. In Alothman et al. (2014), we almost doubled that period, to 28.7 years, by examining all available tide gauge data in the area and constructing a mean gauge time series from seven coastal tide gauges. For the period 1979-2007 we found a relative sea level rise of about 2 mm/yr, which corresponds to an absolute sea level rise of about 1.5 mm/yr based on the vertical displacement of GNSS stations in the region. By taking temporal correlations into account, we concluded that previously published results underestimate the true sea level rate error in this area by a factor of 5-10. In this work, we discuss and update the methodology and results of Alothman et al. (2014), particularly by checking and extending the GNSS solutions. Since 3 of the 6 GPS stations used only started observing at the end of 2011, the longer time series now yield significantly lower uncertainties in the estimated vertical rates. In addition, we compare our results with GRACE-derived ocean bottom pressure time series, which are a good proxy for changes in water mass in this area over time.
Initial Validation of NDVI time series from AVHRR, VEGETATION, and MODIS
NASA Technical Reports Server (NTRS)
Morisette, Jeffrey T.; Pinzon, Jorge E.; Brown, Molly E.; Tucker, Jim; Justice, Christopher O.
2004-01-01
The paper will address Theme 7: Multi-sensor opportunities for VEGETATION. We present an analysis of a long-term vegetation record derived from three moderate-resolution sensors: AVHRR, VEGETATION, and MODIS. While empirically based manipulation can ensure agreement between the three data sets, there is a need to validate the series. This paper uses atmospherically corrected ETM+ data available over the EOS Land Validation Core Sites as an independent data set with which to compare the time series. We use ETM+ data from 15 globally distributed sites, 7 of which have repeat coverage in time. These high-resolution data are compared to the values of each sensor by spatially aggregating the ETM+ data to each specific sensor's spatial coverage. The aggregated ETM+ value provides a point estimate for a specific site on a specific date, and the standard deviation of that point estimate is used to construct a confidence interval. The values from each moderate-resolution sensor are then evaluated with respect to that confidence interval. Results show that AVHRR, VEGETATION, and MODIS data can be combined to assess temporal uncertainties and address data continuity issues, and that the atmospherically corrected ETM+ data provide an independent source with which to compare that record. The final product is a consistent time series climate record that links historical observations to current and future measurements.
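The point estimate and confidence interval construction from the aggregated high-resolution pixels can be sketched as follows; the normal-approximation z value and the use of the sample standard deviation are assumptions for illustration, not the paper's exact procedure:

```python
import math

def confidence_interval(values, z=1.96):
    """Mean of aggregated pixel values, with an illustrative z * sd interval."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return mean - z * sd, mean + z * sd
```

A moderate-resolution sensor's NDVI value for that site and date would then be judged consistent if it falls inside the interval.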
Estimated Marine Residence Times for Drowned Barbadian Paleoreefs
NASA Astrophysics Data System (ADS)
Mey, J. L.
2008-12-01
Fossil corals are used to estimate past sea level and also to calibrate 14C ages with the aid of U-Th and U-Pa dating methods. These coral fossils have often been subaerially exposed and thus are affected by diagenesis during their initial interaction with fresh water. In an effort to understand when such disequilibria in fossil coral reefs occurred, we have quantified our 'dissolution-cum-adsorption' model (Mey, 2008) for the uranium series disequilibria using a geometrical construction, based on the evolution of the activities in a 230Th/238U versus 234U/238U diagram for closed versus open systems. The traditional age equations for the uranium-series with excess daughters have been used to construct a relationship between (i) the angles of the equal age lines in the 230Th/238U versus 234U/238U activity diagrams, and (ii) the quantified angles of the regressed lines of several uranium series disequilibria trends from Barbados. Our results indicate that the severity of the Barbados uranium series disequilibria is not only explained by 234U and 230Th addition, but may also reflect a loss of 238U through dissolution of coral skeletal structure. The net effect is 238U removal, whereas 234U and 230Th remain; thus, the disequilibria for the extant coral increase the excess daughters' ratio. Our results further indicate that the activity of 234U is reduced (compared to 230Th), as would be expected in regard to the lower mobility of trapped 230Th. It is proposed that the major dissolution that caused the uranium series disequilibria occurred during one relatively short-lived event when the paleoreefs experienced the very first freshwater exposure. During this event, the diagenetic potential was at its maximum for redistribution of the uranium series; this then caused the 234U and the 230Th to behave in a systematic way, resulting in linear trends. 
The linear trends in the open-system uranium series were set early, as shown in the 230Th/238U versus 234U/238U activity diagrams. The timing of the reefs' first freshwater exposure is calculated based on the results of our new model. From the relationship between (i) dissolution, (ii) in-grown 230Th, and (iii) excess 234U, we derived that the 60,000-year-old Marine Isotope Stage 3 (MIS 3) reef was exposed to freshwater 36,000-38,000 years after growth in the marine environment. We have calculated these 'marine residence times' for the MIS 3, 5a, 5c, 5e, 6.0, 7a, and 7c reefs; our results correspond with the duration of the sea level high stand in each of the stages. References: Mey, J. L. (2008) The Uranium Series Diagenesis and the Morphology of Drowned Barbadian Paleoreefs, PhD dissertation, 325 pp., Graduate Center, City University of New York, New York.
ERIC Educational Resources Information Center
Argon, Joe, Ed.; Spoor, Dana L.; Cox, Susan M.; Brown, Andrew; Ray, Jennifer
1998-01-01
Presents a series of articles that examine decision making in school construction and renovation projects. Topics include preparing for a construction project, purchasing windows that provide protection at a reasonable cost, choosing the best flooring and carpeting, and dealing with deregulation. An industry roundtable discussion on project…
Introduction to Construction Welding. Instructor Edition. Introduction to Construction Series.
ERIC Educational Resources Information Center
Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.
This instructor's guide contains the materials required to teach a competency-based introductory course in construction welding to students who have chosen to explore careers in construction. It contains three units: welding materials, welding tools, and applied skills. Each instructional unit includes some or all of the basic components of a unit…
Ground-water conditions in Utah, spring of 2002
Burden, Carole B.; Enright, Michael; Danner, M.R.; Fisher, M.J.; Haraden, Peter L.; Kenney, T.A.; Wilkowske, C.D.; Eacret, Robert J.; Downhour, Paul; Slaugh, B.A.; Swenson, R.L.; Howells, J.H.; Christiansen, H.K.
2002-01-01
This is the thirty-ninth in a series of annual reports that describe ground-water conditions in Utah. Reports in this series, published cooperatively by the U.S. Geological Survey and the Utah Department of Natural Resources, Division of Water Resources and Division of Water Rights, provide data to enable interested parties to maintain awareness of changing ground-water conditions. This report, like the others in the series, contains information on well construction, ground-water withdrawal from wells, water-level changes, precipitation, streamflow, and chemical quality of water. Information on well construction included in this report refers only to wells constructed for new appropriations of ground water. Supplementary data are included in reports of this series only for those years or areas which are important to a discussion of changing ground-water conditions and for which applicable data are available. This report includes individual discussions of selected significant areas of ground-water development in the State for calendar year 2001. Most of the reported data were collected by the U.S. Geological Survey in cooperation with the Utah Department of Natural Resources, Division of Water Rights and Division of Water Resources.
Ground-water conditions in Utah, spring of 1997
Gerner, S.J.; Steiger, J.I.; Sory, J.D.; Burden, Carole B.; Loving, B.L.; Brockner, S.J.; Danner, M.R.; Downhour, Paul; Slaugh, B.A.; Swenson, R.L.; Howells, J.H.; Christiansen, H.K.; Herbert, L.R.
1997-01-01
This is the thirty-fourth in a series of annual reports that describe ground-water conditions in Utah. Reports in this series, published cooperatively by the U.S. Geological Survey and the Utah Department of Natural Resources, Division of Water Resources, provide data to enable interested parties to maintain awareness of changing ground-water conditions. This report, like the others in the series, contains information on well construction, ground-water withdrawal from wells, water-level changes, precipitation, streamflow, and chemical quality of water. Information on well construction included in this report refers only to wells constructed for new appropriations of ground water. Supplementary data are included in reports of this series only for those years or areas for which applicable data are available and are important to a discussion of changing ground-water conditions. This report includes individual discussions of selected significant areas of ground-water development in the State for calendar year 1996. Most of the reported data were collected by the U.S. Geological Survey in cooperation with the Utah Department of Natural Resources, Divisions of Water Rights and Water Resources.
Ground-water conditions in Utah, spring of 1999
Burden, Carole B.; Spangler, L.E.; Sory, J.D.; Eacret, Robert J.; Kenney, T.A.; Johnson, K.K.; Loving, B.L.; Brockner, S.J.; Danner, M.R.; Downhour, Paul; Slaugh, B.A.; Swenson, R.L.; Howells, J.H.; Christiansen, H.K.; Fisher, M.J.
1999-01-01
This is the thirty-sixth in a series of annual reports that describe ground-water conditions in Utah. Reports in this series, published cooperatively by the U.S. Geological Survey and the Utah Department of Natural Resources, Division of Water Resources and Division of Water Rights, provide data to enable interested parties to maintain awareness of changing ground-water conditions. This report, like the others in the series, contains information on well construction, ground-water withdrawal from wells, water-level changes, precipitation, streamflow, and chemical quality of water. Information on well construction included in this report refers only to wells constructed for new appropriations of ground water. Supplementary data are included in reports of this series only for those years or areas which are important to a discussion of changing ground-water conditions and for which applicable data are available. This report includes individual discussions of selected significant areas of ground-water development in the State for calendar year 1998. Most of the reported data were collected by the U.S. Geological Survey in cooperation with the Utah Department of Natural Resources, Divisions of Water Rights and Water Resources.
Ground-water conditions in Utah, spring of 2001
Burden, Carole B.; Sory, J.D.; Danner, M.R.; Fisher, M.J.; Haraden, Peter L.; Kenney, T.A.; Eacret, Robert J.; Downhour, Paul; Slaugh, B.A.; Swenson, R.L.; Howells, J.H.; Christiansen, H.K.
2001-01-01
This is the thirty-eighth in a series of annual reports that describe ground-water conditions in Utah. Reports in this series, published cooperatively by the U.S. Geological Survey and the Utah Department of Natural Resources, Division of Water Resources and Division of Water Rights, provide data to enable interested parties to maintain awareness of changing ground-water conditions. This report, like the others in the series, contains information on well construction, ground-water withdrawal from wells, water-level changes, precipitation, streamflow, and chemical quality of water. Information on well construction included in this report refers only to wells constructed for new appropriations of ground water. Supplementary data are included in reports of this series only for those years or areas which are important to a discussion of changing ground-water conditions and for which applicable data are available. This report includes individual discussions of selected significant areas of ground-water development in the State for calendar year 2000. Most of the reported data were collected by the U.S. Geological Survey in cooperation with the Utah Department of Natural Resources, Division of Water Rights and Division of Water Resources.
Ground-water conditions in Utah, spring of 1998
Susong, David D.; Burden, Carole B.; Sory, J.D.; Eacret, Robert J.; Johnson, K.K.; Loving, B.L.; Brockner, S.J.; Danner, M.R.; Downhour, Paul; Slaugh, B.A.; Swenson, R.L.; Howells, J.H.; Christiansen, H.K.; Herbert, L.R.
1998-01-01
This is the thirty-fifth in a series of annual reports that describe ground-water conditions in Utah. Reports in this series, published cooperatively by the U.S. Geological Survey and the Utah Department of Natural Resources, Division of Water Resources, provide data to enable interested parties to maintain awareness of changing ground-water conditions. This report, like the others in the series, contains information on well construction, ground-water withdrawal from wells, water-level changes, precipitation, streamflow, and chemical quality of water. Information on well construction included in this report refers only to wells constructed for new appropriations of ground water. Supplementary data are included in reports of this series only for those years or areas which are important to a discussion of changing ground-water conditions and for which applicable data are available. This report includes individual discussions of selected significant areas of ground-water development in the State for calendar year 1997. Most of the reported data were collected by the U.S. Geological Survey in cooperation with the Utah Department of Natural Resources, Divisions of Water Rights and Water Resources.
Ground-water conditions in Utah, spring of 2003
Burden, Carole B.; Enright, Michael; Danner, M.R.; Fisher, M.J.; Haraden, Peter L.; Kenney, T.A.; Wilkowske, C.D.; Eacret, Robert J.; Downhour, Paul; Slaugh, B.A.; Swenson, R.L.; Howells, J.H.; Christiansen, H.K.
2003-01-01
This is the fortieth in a series of annual reports that describe ground-water conditions in Utah. Reports in this series, published cooperatively by the U.S. Geological Survey and the Utah Department of Natural Resources, Division of Water Resources and Division of Water Rights, provide data to enable interested parties to maintain awareness of changing ground-water conditions. This report, like the others in the series, contains information on well construction, ground-water withdrawal from wells, water-level changes, precipitation, streamflow, and chemical quality of water. Information on well construction included in this report refers only to wells constructed for new appropriations of ground water. Supplementary data are included in reports of this series only for those years or areas which are important to a discussion of changing ground-water conditions and for which applicable data are available. This report includes individual discussions of selected significant areas of ground-water development in the State for calendar year 2002. Most of the reported data were collected by the U.S. Geological Survey in cooperation with the Utah Department of Natural Resources, Division of Water Rights and Division of Water Resources.
Ground-water conditions in Utah, spring of 2000
Burden, Carole B.; Sory, J.D.; Danner, M.R.; Johnson, K.K.; Kenny, T.A.; Brockner, S.J.; Eacret, Robert J.; Downhour, Paul; Slaugh, B.A.; Swenson, R.L.; Howells, J.H.; Christiansen, H.K.; Fisher, M.J.
2000-01-01
This is the thirty-seventh in a series of annual reports that describe ground-water conditions in Utah. Reports in this series, published cooperatively by the U.S. Geological Survey and the Utah Department of Natural Resources, Division of Water Resources and Division of Water Rights, provide data to enable interested parties to maintain awareness of changing ground-water conditions. This report, like the others in the series, contains information on well construction, ground-water withdrawal from wells, water-level changes, precipitation, streamflow, and chemical quality of water. Information on well construction included in this report refers only to wells constructed for new appropriations of ground water. Supplementary data are included in reports of this series only for those years or areas which are important to a discussion of changing ground-water conditions and for which applicable data are available. This report includes individual discussions of selected significant areas of ground-water development in the State for calendar year 1999. Most of the reported data were collected by the U.S. Geological Survey in cooperation with the Utah Department of Natural Resources, Divisions of Water Rights and Water Resources.
Ground-water conditions in Utah, spring of 2004
Burden, Carole B.; Allen, David V.; Danner, M.R.; Walzem, Vince; Cillessen, J.L.; Kenney, T.A.; Wilkowske, C.D.; Eacret, Robert J.; Downhour, Paul; Slaugh, B.A.; Swenson, R.L.; Howells, J.H.; Christiansen, H.K.; Fisher, M.J.
2004-01-01
This is the forty-first in a series of annual reports that describe ground-water conditions in Utah. Reports in this series, published cooperatively by the U.S. Geological Survey and the Utah Department of Natural Resources, Division of Water Resources and Division of Water Rights, provide data to enable interested parties to maintain awareness of changing ground-water conditions. This report, like the others in the series, contains information on well construction, ground-water withdrawal from wells, water-level changes, precipitation, streamflow, and chemical quality of water. Information on well construction included in this report refers only to wells constructed for new appropriations of ground water. Supplementary data are included in reports of this series only for those years or areas which are important to a discussion of changing ground-water conditions and for which applicable data are available. This report includes individual discussions of selected significant areas of ground-water development in the State for calendar year 2003. Most of the reported data were collected by the U.S. Geological Survey in cooperation with the Utah Department of Natural Resources, Division of Water Rights and Division of Water Resources.
Implications of Version 8 TOMS and SBUV Data for Long-Term Trend Analysis
NASA Technical Reports Server (NTRS)
Frith, Stacey M.
2004-01-01
Total ozone data from the Total Ozone Mapping Spectrometer (TOMS) and profile/total ozone data from the Solar Backscatter Ultraviolet (SBUV; SBUV/2) series of instruments have recently been reprocessed using new retrieval algorithms (referred to as Version 8 for both) and updated calibrations. In this paper, we incorporate the Version 8 data into a TOMS/SBUV merged total ozone data set and an SBUV merged profile ozone data set. The Total Merged Ozone Data (Total MOD) combines data from multiple TOMS and SBUV instruments to form an internally consistent global data set with virtually complete time coverage from October 1978 through December 2003. Calibration differences between instruments are accounted for using external adjustments based on instrument intercomparisons during overlap periods. Previous results showed errors due to aerosol loading and sea glint are significantly reduced in the V8 TOMS retrievals. Using SBUV as a transfer standard, calibration differences between V8 Nimbus 7 and Earth Probe TOMS data are approx. 1.3%, suggesting small errors in calibration remain. We will present updated total ozone long-term trends based on the Version 8 data. The Profile Merged Ozone Data (Profile MOD) data set is constructed using data from the SBUV series of instruments. In previous versions, SAGE data were used to establish the long-term external calibration of the combined data set. For SBUV Version 8, we assess the V8 profile data through comparisons with SAGE and between SBUV instruments in overlap periods. We then construct a consistently calibrated long-term time series. Updated zonal mean trends as a function of altitude and season from the new profile data set will be shown, and uncertainties in determining the best long-term calibration will be discussed.
Schoberle, Taylor J; Nguyen-Coleman, C Kim; May, Gregory S
2013-01-01
Fungal species are continuously being studied to not only understand disease in humans and plants but also to identify novel antibiotics and other metabolites of industrial importance. Genetic manipulations, such as gene deletion, gene complementation, and gene over-expression, are common techniques to investigate fungal gene functions. Although advances in transformation efficiency and promoter usage have improved genetic studies, some basic steps in vector construction are still laborious and time-consuming. Gateway cloning technology solves this problem by increasing the efficiency of vector construction through the use of λ phage integrase proteins and att recombination sites. We developed a series of Gateway-compatible vectors for use in genetic studies in a range of fungal species. They contain nutritional and drug-resistance markers and can be utilized to manipulate different filamentous fungal genomes. Copyright © 2013 Elsevier Inc. All rights reserved.
On vector-valued Poincaré series of weight 2
NASA Astrophysics Data System (ADS)
Meneses, Claudio
2017-10-01
Given a pair (Γ , ρ) of a Fuchsian group of the first kind, and a unitary representation ρ of Γ of arbitrary rank, the problem of construction of vector-valued Poincaré series of weight 2 is considered. Implications in the theory of parabolic bundles are discussed. When the genus of the group is zero, it is shown how an explicit basis for the space of these functions can be constructed.
Zhao, Zhibiao
2011-06-01
We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.
Detecting recurrence domains of dynamical systems by symbolic dynamics.
beim Graben, Peter; Hutt, Axel
2013-04-12
We propose an algorithm for the detection of recurrence domains of complex dynamical systems from time series. Our approach exploits the characteristic checkerboard texture of recurrence domains exhibited in recurrence plots. In phase space, recurrence plots yield intersecting balls around sampling points that could be merged into cells of a phase space partition. We construct this partition by a rewriting grammar applied to the symbolic dynamics of time indices. A maximum entropy principle defines the optimal size of intersecting balls. The final application to high-dimensional brain signals yields an optimal symbolic recurrence plot revealing functional components of the signal.
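The recurrence plots that underpin this approach can be sketched in a few lines. The following is a minimal illustration, not the authors' algorithm; the logistic-map test series and the threshold eps are assumptions introduced here for demonstration:

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix: R[i, j] = 1 when |x_i - x_j| <= eps,
    i.e. the trajectory at time j revisits the eps-ball around point i."""
    x = np.asarray(x, dtype=float)
    d = np.abs(x[:, None] - x[None, :])  # all pairwise distances
    return (d <= eps).astype(int)

# Example series: a short chaotic logistic-map trajectory (an assumption,
# standing in for the brain signals analyzed in the paper)
x = [0.4]
for _ in range(99):
    x.append(3.9 * x[-1] * (1.0 - x[-1]))

R = recurrence_matrix(x, eps=0.05)
```

Recurrence domains then appear as dense square blocks along the diagonal of `R`; the maximum entropy principle mentioned above would be used to choose `eps`.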
Chen, Y.; Viadero, R.C.; Wei, X.; Fortney, Ronald H.; Hedrick, Lara B.; Welsh, S.A.; Anderson, James T.; Lin, L.-S.
2009-01-01
Refining best management practices (BMPs) for future highway construction depends on a comprehensive understanding of environmental impacts from current construction methods. Based on a before-after-control impact (BACI) experimental design, long-term stream monitoring (1997-2006) was conducted at upstream (as control, n = 3) and downstream (as impact, n = 6) sites in the Lost River watershed of the Mid-Atlantic Highlands region, West Virginia. Monitoring data were analyzed to assess impacts during and after highway construction on 15 water quality parameters and macroinvertebrate condition using the West Virginia stream condition index (WVSCI). Principal components analysis (PCA) identified regional primary water quality variances, and paired t tests and time series analysis detected seven highway construction-impacted water quality parameters which were mainly associated with the second principal component. In particular, impacts on turbidity, total suspended solids, and total iron during construction, impacts on chloride and sulfate during and after construction, and impacts on acidity and nitrate after construction were observed at the downstream sites. The construction had statistically significant impacts on macroinvertebrate index scores (i.e., WVSCI) after construction, but did not change the overall good biological condition. Implementing BMPs that address those construction-impacted water quality parameters can be an effective mitigation strategy for future highway construction in this highlands region. Copyright © 2009 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved.
Deciphering hierarchical features in the energy landscape of adenylate kinase folding/unfolding
NASA Astrophysics Data System (ADS)
Taylor, J. Nicholas; Pirchi, Menahem; Haran, Gilad; Komatsuzaki, Tamiki
2018-03-01
Hierarchical features of the energy landscape of the folding/unfolding behavior of adenylate kinase, including its dependence on denaturant concentration, are elucidated in terms of single-molecule fluorescence resonance energy transfer (smFRET) measurements in which the proteins are encapsulated in a lipid vesicle. The core in constructing the energy landscape from single-molecule time-series across different denaturant concentrations is the application of rate-distortion theory (RDT), which naturally considers the effects of measurement noise and sampling error, in combination with change-point detection and the quantification of the FRET efficiency-dependent photobleaching behavior. Energy landscapes are constructed as a function of observation time scale, revealing multiple partially folded conformations at small time scales that are situated in a superbasin. As the time scale increases, these denatured states merge into a single basin, demonstrating the coarse-graining of the energy landscape as observation time increases. Because the photobleaching time scale is dependent on the conformational state of the protein, possible nonequilibrium features are discussed, and a statistical test for violation of the detailed balance condition is developed based on the state sequences arising from the RDT framework.
Satellite Ocean Color: Present Status, Future Challenges
NASA Technical Reports Server (NTRS)
Gregg, Watson W.; McClain, Charles R.; Zukor, Dorothy J. (Technical Monitor)
2001-01-01
We are midway into our 5th consecutive year of nearly continuous, high-quality ocean color observations from space. The Ocean Color and Temperature Scanner/Polarization and Directionality of the Earth's Reflectances (OCTS/POLDER: Nov. 1996 - Jun. 1997), the Sea-viewing Wide Field-of-view Sensor (SeaWiFS: Sep. 1997 - present), and now the Moderate Resolution Imaging Spectrometer (MODIS: Sep. 2000 - present) have provided, and continue to provide, unprecedented views of chlorophyll dynamics on global scales. Global synoptic views of ocean chlorophyll were once a fantasy for ocean color scientists. It took nearly the entire 8-year lifetime of limited Coastal Zone Color Scanner (CZCS) observations to compile seasonal climatologies. Now SeaWiFS produces comparably complete fields in about 8 days. For the first time, scientists may observe spatial and temporal variability never before seen in a synoptic context. Even more exciting, we are beginning to plausibly ask questions of interannual variability. We stand at the beginning of a long time series of ocean color, from which we may begin to ask questions of interdecadal variability and climate change. These are the scientific questions being addressed by users of the 18-year Advanced Very High Resolution Radiometer time series with respect to terrestrial processes and ocean temperatures. The nearly 5-year time series of ocean color observations now being constructed, with possibilities of continued observations, can put us on comparable footing with our terrestrial and physical oceanographic colleagues, and enable us to understand how ocean biological processes contribute to, and are affected by, global climate change.
Lee, Casey J.; Ziegler, Andrew C.
2010-01-01
The U.S. Geological Survey, in cooperation with the Johnson County, Kansas, Stormwater Management Program, investigated the effects of urbanization, construction activity, management practices, and impoundments on suspended-sediment transport in Johnson County from February 2006 through November 2008. Streamgages and continuous turbidity sensors were operated at 15 sites within the urbanizing 57-square-mile Mill Creek Basin, and 4 sites downstream from the other largest basins (49 to 66 square miles) in Johnson County. The largest sediment yields in Johnson County were observed downstream from basins with increased construction activity. Sediment yields attributed to the largest (68 acre) active construction site in the study area were 9,300 tons per square mile in 2007 and 12,200 tons per square mile in 2008; 5 to 55 times larger than yields observed at other sampling sites. However, given erodible soils and steep slopes at this site, sediment yields were relatively small compared to the range in historic values from construction sites without erosion and sediment controls in the United States (2,300 to 140,000 tons per square mile). Downstream from this construction site, a sediment forebay and wetland were constructed in series upstream from Shawnee Mission Lake, a 120-acre reservoir within Shawnee Mission Park. Although the original intent of the sediment forebay and constructed wetland were unrelated to upstream construction, they were nonetheless evaluated in 2008 to characterize sediment removal before stream entry into the lake. The sediment forebay was estimated to reduce 33 percent of sediment transported to the lake, whereas the wetland did not appear to decrease downstream sediment transport. 
Comparisons of time-series data and relations between turbidity and sediment concentration indicate that larger silt-sized particles were deposited within the sediment forebay, whereas smaller silt and clay-sized sediments were transported through the wetland and into the lake. Data collected at sites upstream and downstream from the constructed wetland indicated that hydraulic retention alone did not substantially reduce sediment loading to Shawnee Mission Lake. Mean-daily turbidity values at sampling sites downstream from basins with increased construction activity were compared to U.S. Environmental Protection Agency turbidity criteria designed to reduce discharge of pollutants from construction sites. The U.S. Environmental Protection Agency numeric turbidity criteria specify that effluent from construction sites greater than 20 acres not exceed a mean-daily turbidity value of 280 nephelometric turbidity units beginning in 2011; these criteria will apply to sites greater than 10 acres beginning in 2014. Although numeric criteria would not have been applicable to data from sampling sites in Johnson County because they were not directly downstream from construction sites and because individual states still have to determine additional details as to how these criteria will be enforced, comparisons were made to characterize the potential of construction site effluent in Johnson County to exceed U.S. Environmental Protection Agency criteria, even under extensive erosion and sediment controls. Numeric criteria were exceeded at sampling sites downstream from basins with increased construction activity for multiple days during the study period, potentially indicating the need for additional erosion and sediment controls and (or) treatment to bring discharges from construction sites into compliance with future numeric turbidity criteria.
Among sampling sites in the Mill Creek Basin, sediment yields from the urbanizing Clear Creek Basin were approximately 2 to 3 times those from older, more stable urban or rural basins. Sediments eroded from construction sites adjacent to or surrounding streams appear to be more readily transported downstream, whereas sediments eroded from construction sites in headwater areas are more likely to
Bed-level adjustments in the Arno River, central Italy
NASA Astrophysics Data System (ADS)
Rinaldi, Massimo; Simon, Andrew
1998-02-01
Two distinct phases of bed-level adjustment over the last 150 years are identified for the principal alluvial reaches of the Arno River (Upper Valdarno and Lower Valdarno). The planimetric configuration of the river in these reaches is the result of a series of hydraulic works (canalization, rectification, artificial cut-offs, etc.) carried out particularly between the 18th and the 19th centuries. Subsequently, a series of interventions at basin level (construction of weirs, variations in land use), intense instream gravel-mining after World War II, and the construction of two dams on the Arno River, caused widespread degradation of the streambed. Since about 1900, total lowering of the channel bed is typically between 2 and 4 m in the Upper Valdarno Reach and between 5 and 8 m in some areas of the Lower Valdarno Reach. Bed-level adjustments with time are analyzed for a large number of cross-sections and described by an exponential-decay function. This analysis identified the existence of two main phases of lowering: the first, triggered at the end of the past century; the second, triggered in the interval 1945-1960 and characterized by more intense degradation of the streambed. The first phase derived from changes in land-use and land-management practices. The second phase is the result of the superimposition of two factors: intense instream mining of gravel, and the construction of the Levane and La Penna dams.
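The exponential-decay description of bed-level lowering can be illustrated with a simple least-squares fit. This is a hedged sketch on synthetic data; the functional form z(t) = z0 + dz·exp(-k·t) and every parameter value below are assumptions for demonstration, not the authors' calibration of the Arno cross-sections:

```python
import numpy as np
from scipy.optimize import curve_fit

def bed_level(t, z0, dz, k):
    """Exponential-decay adjustment toward an asymptote: z0 is the final
    bed elevation (m), dz the remaining lowering at t = 0 (m), and k the
    decay rate (1/yr)."""
    return z0 + dz * np.exp(-k * t)

# Synthetic record: ~3 m of degradation decaying over several decades,
# with small measurement noise added
rng = np.random.default_rng(0)
t = np.arange(0.0, 60.0, 5.0)                # years since onset
z_true = bed_level(t, z0=100.0, dz=3.0, k=0.08)
z_obs = z_true + rng.normal(0.0, 0.05, t.size)

# Recover the decay parameters from the noisy observations
(p_z0, p_dz, p_k), _ = curve_fit(bed_level, t, z_obs, p0=(99.0, 2.0, 0.05))
```

Fitting such a curve to repeated cross-section surveys is one common way to characterize the timing and magnitude of the two degradation phases described above.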
Neural network approaches to capture temporal information
NASA Astrophysics Data System (ADS)
van Veelen, Martijn; Nijhuis, Jos; Spaanenburg, Ben
2000-05-01
The automated design and construction of neural networks is receiving growing attention from the neural networks community. Both the growing availability of computing power and the development of mathematical and probabilistic theory have had a profound impact on the design and modelling approaches of neural networks. This impact is most apparent in the use of neural networks for time series prediction. In this paper, we give our views on past, contemporary and future design and modelling approaches to neural forecasting.
NGNP High Temperature Materials White Paper
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lew Lommers; George Honma
2012-08-01
This white paper is one in a series that addresses key generic issues of the combined construction and operating license (COL) pre-application program for the Next Generation Nuclear Plant reactor using prismatic block fuel technology. The purpose of the pre-application program interactions with the NRC staff is to reduce the time required for COL application review by identifying and addressing key regulatory issues and, if possible, obtaining agreements for their resolution.
Celestial mechanics solutions that escape
NASA Astrophysics Data System (ADS)
Gingold, Harry; Solomon, Daniel
2017-08-01
We establish the existence of an open set of initial conditions through which pass solutions without singularities to Newton's gravitational equations in R3 on a semi-infinite interval in forward time, for which every pair of particles separates like At, A > 0, as t → ∞. The solutions are constructible as series with rapid uniform convergence and their asymptotic behavior to any order is prescribed. We show that this family of solutions depends on 6N parameters subject to certain constraints.
Collison, R S; Grismer, M E
2015-11-01
We evaluated subsurface flow (SSF) constructed wetland treatment performance with respect to organics (COD) and nitrogen (ammonium and nitrate) removal from domestic (septic tank) wastewater as affected by the presence of plants, substrate "rock" cation exchange capacity (CEC), laboratory versus field conditions and use of synthetic as compared to actual domestic wastewater. This article considers the effects of plants on constructed wetland treatment in the field. Each constructed wetland system comprised two beds (2.6 m long by 0.28 m wide and deep filled with ~18 mm crushed lava rock) separated by an aeration tank connected in series. The lava rock had a porosity of ~47% and a CEC of 4 meq/100 g. One pair of constructed wetland systems was planted with cattails in May 2008, while an adjacent pair of systems remained un-planted. Collected septic tank or synthesized wastewater was allowed to gravity feed each constructed wetland system and effluent samples were regularly collected and tested for COD and nitrogen species during four time periods spanning November 2008 through June 2009. These effluent concentrations were tested for statistical differences at the 95% level for individual time periods as well as the overall 6-month period. Organics removal from domestic wastewater was 78.8% and 76.1% in the planted and un-planted constructed wetland systems, respectively, while ammonium removal was 94.5% and 90.2%, respectively. Similarly, organics removal from the synthetic wastewater of equivalent strength was 88.8% and 90.1% for planted and un-planted constructed wetland systems, respectively, while ammonium removal was 96.9% and 97.3%, respectively.
ERIC Educational Resources Information Center
Department of Health, Education, and Welfare, Washington, DC. Office of the Secretary.
This handbook provides a basis for consideration of acceptable approaches which are available and may be used to initiate the use of construction management services in the planning, design, and construction of federally assisted construction projects. It includes the mandatory federal requirements as well as acceptable procedures for selecting…
Architectural: Construction, Supervision, and Inspection. Course of Study.
ERIC Educational Resources Information Center
Robson, Frank
This architectural course of study is part of a construction, supervision, and inspection series, which provides instructional materials for community or junior college technical courses in the inspection program. Material covered pertains to: construction contracts, schedules, and site preparation; footings and foundations; masonry and…
Teaching Construction: A Design-Based Course Model
ERIC Educational Resources Information Center
Love, Tyler S.; Salgado, Carlos A.
2016-01-01
The focus on construction in T&E education has drastically changed. This article presents a series of topics and design-based labs that can be taught at various grade levels to integrate STEM concepts while also increasing students' overall awareness of construction and structural technologies.
ERIC Educational Resources Information Center
Hadipriono, Fabian C.; And Others
An interactive training model called SAVR (Safety in Construction Using Virtual Reality) was developed to train construction students, novice engineers, and construction workers to prevent falls from scaffolding. The model was implemented in a graphics supercomputer, the ONYX Reality Engine2. The SAVR model provides trainees with an immersive,…
NASA Astrophysics Data System (ADS)
Alakent, Burak; Camurdan, Mehmet C.; Doruker, Pemra
2005-10-01
Time series models, which are constructed from the projections of the molecular-dynamics (MD) runs on principal components (modes), are used to mimic the dynamics of two proteins: tendamistat and immunity protein of colicin E7 (ImmE7). Four independent MD runs of tendamistat and three independent runs of ImmE7 protein in vacuum are used to investigate the energy landscapes of these proteins. It is found that mean-square displacements of residues along the modes in different time scales can be mimicked by time series models, which are utilized in dividing protein dynamics into different regimes with respect to the dominating motion type. The first two regimes constitute the dominance of intraminimum motions during the first 5ps and the random walk motion in a hierarchically higher-level energy minimum, which comprise the initial time period of the trajectories up to 20-40ps for tendamistat and 80-120ps for ImmE7. These are also the time ranges within which the linear nonstationary time series are completely satisfactory in explaining protein dynamics. Encountering energy barriers enclosing higher-level energy minima constrains the random walk motion of the proteins, and pseudorelaxation processes at different levels of minima are detected in tendamistat, depending on the sampling window size. Correlation (relaxation) times of 30-40ps and 150-200ps are detected for two energy envelopes of successive levels for tendamistat, which gives an overall idea about the hierarchical structure of the energy landscape. However, it should be stressed that correlation times of the modes are highly variable with respect to conformational subspaces and sampling window sizes, indicating the absence of an actual relaxation. 
The random-walk step sizes and the time length of the second regime are used to illuminate an important difference between the dynamics of the two proteins, which cannot be clarified by investigating relaxation times alone: ImmE7 has lower energy barriers enclosing the higher-level energy minimum, preventing the protein from relaxing and letting it move in a random-walk fashion for a longer period of time.
NASA Astrophysics Data System (ADS)
Kasatkina, T. I.; Dushkin, A. V.; Pavlov, V. A.; Shatovkin, R. R.
2018-03-01
Neural network methods have recently been applied in information systems and software development for predicting dynamics series. They are more flexible than existing analogues and are capable of taking the nonlinearities of the series into account. In this paper, we propose a modified algorithm for predicting dynamics series that includes a method for training neural networks and an approach to describing and presenting the input data, based on prediction with the multilayer perceptron method. To construct the neural network, the values of the series at its extremum points and the corresponding time values, formed using the sliding window method, are used as input data. The proposed algorithm can act as a standalone approach to predicting dynamics series or serve as one component of a forecasting system. We compare the efficiency of predicting the evolution of dynamics series, for short-term one-step and long-term multi-step forecasts, between the classical multilayer perceptron method and the modified algorithm, using synthetic and real data. The result of this modification is a reduction in the iterative error that accumulates when previously predicted values are fed back as inputs to the neural network, as well as an increase in the accuracy of the network's iterative predictions.
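The sliding-window construction and the iterative multi-step forecast described in this abstract can be sketched as follows. A least-squares linear predictor stands in for the paper's multilayer perceptron, and all function names are illustrative, not taken from the paper:

```python
import numpy as np

def sliding_windows(series, width):
    """Input/target pairs: each window of `width` past values
    predicts the next value (the sliding window method)."""
    X = np.array([series[i:i + width] for i in range(len(series) - width)])
    y = np.array(series[width:])
    return X, y

def iterative_forecast(series, width, steps):
    """Fit an affine one-step predictor (a stand-in for the MLP)
    and roll it forward, feeding each prediction back as an input,
    which is the source of the iterative error the paper minimizes."""
    X, y = sliding_windows(series, width)
    A = np.hstack([X, np.ones((len(X), 1))])      # affine model
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    window = list(series[-width:])
    out = []
    for _ in range(steps):
        pred = float(np.dot(w[:-1], window) + w[-1])
        out.append(pred)
        window = window[1:] + [pred]              # feed prediction back
    return out
```

On a linear trend the fed-back forecast continues the trend exactly; on real data each fed-back prediction compounds the one-step error, which is the effect the modified algorithm targets.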
Research on Evaluation of resource allocation efficiency of transportation system based on DEA
NASA Astrophysics Data System (ADS)
Zhang, Zhehui; Du, Linan
2017-06-01
In this paper, we select time series data for the years 1985-2015 and construct inputs from land (shoreline) resources, capital, and labor. The output indices are freight volume and passenger volume. Using quantitative analysis based on the DEA method, we evaluate the resource allocation efficiency of railway, highway, water transport, and civil aviation in China. The research shows that the resource allocation efficiency of the various modes of transport differs markedly and that the impact on scale efficiency is more significant. The two most important ways to optimize resource allocation and improve the efficiency of the combination of transport modes are promoting coordination among the various modes and constructing an integrated transportation system.
NASA Astrophysics Data System (ADS)
Brunner, Manuela Irene; Seibert, Jan; Favre, Anne-Catherine
2018-02-01
Traditional design flood estimation approaches have focused on peak discharges and have often neglected other hydrograph characteristics such as hydrograph volume and shape. Synthetic design hydrograph estimation procedures overcome this deficiency by jointly considering peak discharge, hydrograph volume, and shape. Such procedures have recently been extended to allow for the consideration of process variability within a catchment by a flood-type specific construction of design hydrographs. However, they depend on observed runoff time series and are not directly applicable in ungauged catchments where such series are not available. To obtain reliable flood estimates, there is a need for an approach that allows for the consideration of process variability in the construction of synthetic design hydrographs in ungauged catchments. In this study, we therefore propose an approach that combines a bivariate index flood approach with event-type specific synthetic design hydrograph construction. First, regions of similar flood reactivity are delineated and a classification rule that enables the assignment of ungauged catchments to one of these reactivity regions is established. Second, event-type specific synthetic design hydrographs are constructed using the pooled data divided by event type from the corresponding reactivity region in a bivariate index flood procedure. The approach was tested and validated on a dataset of 163 Swiss catchments. The results indicated that 1) random forest is a suitable classification model for the assignment of an ungauged catchment to one of the reactivity regions, 2) the combination of a bivariate index flood approach and event-type specific synthetic design hydrograph construction enables the consideration of event types in ungauged catchments, and 3) the use of probabilistic class memberships in regional synthetic design hydrograph construction helps to alleviate the problem of misclassification. 
Event-type specific synthetic design hydrograph sets enable the inclusion of process variability into design flood estimation and can be used as a compromise between single best estimate synthetic design hydrographs and continuous simulation studies.
NASA Astrophysics Data System (ADS)
Langbein, J. O.
2016-12-01
Most time series of geophysical phenomena are contaminated with temporally correlated errors that limit the precision of any derived parameters. Ignoring temporal correlations will result in biased and unrealistic estimates of velocity and its error as estimated from geodetic position measurements. Obtaining better estimates of uncertainties is limited by several factors, including selection of the correct model for the background noise and the computational requirements to estimate the parameters of the selected noise model when there are numerous observations. Here, I address the second problem of computational efficiency using maximum likelihood estimation (MLE). Most geophysical time series have background noise processes that can be represented as a combination of white and power-law noise, 1/f^n, with frequency f. Time-domain techniques involving construction and inversion of large data covariance matrices are employed. Bos et al. [2012] demonstrate one technique that substantially increases the efficiency of the MLE methods, but it provides only an approximate solution for power-law indices greater than 1.0. That restriction can be removed by simply forming a data filter that adds noise processes rather than combining them in quadrature. Consequently, the inversion of the data covariance matrix is simplified, and it provides robust results for a wide range of power-law indices. With the new formulation, the efficiency is typically improved by about a factor of 8 over previous MLE algorithms [Langbein, 2004]. The new algorithm can be downloaded at http://earthquake.usgs.gov/research/software/#est_noise. The main program provides a number of basic functions that can be used to model the time-dependent part of time series and a variety of models that describe the temporal covariance of the data. In addition, the program is packaged with a few companion programs and scripts that can help with data analysis and with interpretation of the noise modeling.
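The additive white-plus-power-law noise model can be illustrated with a standard fractional-difference (Kasdin-style) filter that colors white noise to a 1/f^n spectrum. This is a generic sketch of the noise model itself, not the est_noise implementation:

```python
import numpy as np

def power_law_filter(n_samples, index):
    """Impulse response h_k of the fractional-difference filter that
    colors unit white noise into 1/f**index power-law noise:
    h_0 = 1, h_k = h_{k-1} * (k - 1 + index/2) / k."""
    h = np.empty(n_samples)
    h[0] = 1.0
    for k in range(1, n_samples):
        h[k] = h[k - 1] * (k - 1 + index / 2.0) / k
    return h

def synthetic_noise(n_samples, white_amp, plaw_amp, index, rng):
    """White noise plus filtered (colored) noise, added directly
    rather than combined in quadrature."""
    white = rng.standard_normal(n_samples)
    h = power_law_filter(n_samples, index)
    colored = np.convolve(rng.standard_normal(n_samples), h)[:n_samples]
    return white_amp * white + plaw_amp * colored
```

For index = 2 the filter coefficients are all 1, so the colored component reduces to a cumulative sum of white noise, i.e. a random walk, which is a quick sanity check on the recursion.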
The magnetic tides of Honolulu
Love, Jeffrey J.; Rigler, Erin Joshua
2013-01-01
We review the phenomenon of time-stationary, periodic quiet-time geomagnetic tides. These are generated by the ionospheric and oceanic dynamos and, to a lesser extent, by the quiet-time magnetosphere, and they are affected by currents induced in the Earth's electrically conducting interior. We examine historical time series of hourly magnetic-vector measurements made at the Honolulu observatory. We construct high-resolution, frequency-domain Lomb-periodogram and maximum-entropy power spectra that reveal a panorama of stationary harmonics across periods from 0.1 to 10,000 d, including harmonics that result from amplitude and phase modulation. We identify solar-diurnal tides and their annual and solar-cycle sideband modulations, lunar semi-diurnal tides and their solar-diurnal sidebands, and tides due to precession of lunar eccentricity and nodes. We provide evidence that a method intended for separating the ionospheric and oceanic dynamo signals by midnight subsampling of observatory time series is prone to frequency-domain aliasing. The tidal signals we summarize in this review can be used to test our fundamental understanding of the dynamics of the quiet-time ionosphere and magnetosphere and of induction in the ocean and in the electrically conducting interior of the Earth, and they are useful for defining a quiet-time baseline against which magnetospheric-storm intensity is measured.
Stock market context of the Lévy walks with varying velocity
NASA Astrophysics Data System (ADS)
Kutner, Ryszard
2002-11-01
We developed the most general Lévy walks with varying velocity, called the Weierstrass walks (WW) model for short, by which one can describe both stationary and non-stationary stochastic time series. We considered a non-Brownian random walk where the walker moves, in general, with a velocity that assumes a different constant value between successive turning points, i.e., the velocity is a piecewise constant function. This model is a kind of Lévy walk in which we assume a hierarchical, self-similar (in a stochastic sense) spatio-temporal representation of the main quantities, such as the waiting-time distribution and the sojourn probability density (the principal quantities in the continuous-time random walk formalism). The WW model makes it possible to analyze both the structure of the Hurst exponent and the power-law behavior of the kurtosis. This structure results from the hierarchical spatio-temporal coupling between the walker's displacement and the corresponding time of the walk. The analysis uses both the fractional diffusion and the super-Burnett coefficients. We constructed a diffusion phase diagram that distinguishes regions occupied by classes of different universality. We study only those classes which are characteristic of stationary situations. We thus have a model ready for describing data presented, e.g., in the form of moving averages, an operation often applied to stochastic time series, especially financial ones. The model was inspired by properties of financial time series and was tested on empirical data extracted from the Warsaw stock exchange, since that exchange offers an opportunity to study in an unbiased way several features of a stock exchange in its early stage.
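The core mechanism, a walker whose velocity is constant between heavy-tailed turning points, can be simulated in a few lines. This is a simplified one-dimensional illustration of the flight mechanism, not the full hierarchical Weierstrass construction:

```python
import random

def levy_walk(n_flights, alpha, rng=random):
    """1-D walk with piecewise-constant velocity: each flight has a
    Pareto(alpha)-distributed duration (heavy tail), and the velocity
    keeps a constant magnitude and random sign during the flight.
    Returns the (time, position) sequence at the turning points."""
    t, x = 0.0, 0.0
    path = [(t, x)]
    for _ in range(n_flights):
        duration = rng.paretovariate(alpha)   # heavy-tailed flight time
        velocity = rng.choice([-1.0, 1.0])    # constant within the flight
        t += duration
        x += velocity * duration
        path.append((t, x))
    return path
```

Because displacement and elapsed time are coupled within each flight (|Δx| = |v|·Δt), long flights dominate both the spatial and the temporal statistics, which is the coupling behind the anomalous Hurst and kurtosis behavior described above.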
Weiss, Jonathan D.
1995-01-01
A shock velocity and damage location sensor providing a means of measuring shock speed and damage location. The sensor consists of a long series of time-of-arrival "points" constructed with fiber optics. The fiber optic sensor apparatus measures shock velocity as the fiber sensor is progressively crushed while a shock wave proceeds in a direction along the fiber. The light received by a receiving means changes as time-of-arrival points are destroyed when the sensor is disturbed by the shock. The sensor may comprise a transmitting fiber bent into a series of loops and fused to a receiving fiber at various places (time-of-arrival points) along the receiving fiber's length. At the "points" of contact, where a portion of the light leaves the transmitting fiber and enters the receiving fiber, the loops are required to allow the light to travel backwards through the receiving fiber toward a receiving means. The sensor may also comprise a single optical fiber wherein the time-of-arrival points are comprised of reflection planes distributed along the fiber's length. In this configuration, as the shock front proceeds along the fiber it destroys one reflector after another. The output received by a receiving means from this sensor may be a series of downward steps produced as the shock wave destroys one time-of-arrival point after another, or a nonsequential pattern of steps in the event time-of-arrival points are destroyed at any point along the sensor.
Temporal structure and gain-loss asymmetry for real and artificial stock indices
NASA Astrophysics Data System (ADS)
Siven, Johannes Vitalis; Lins, Jeffrey Todd
2009-11-01
Previous research has shown that for stock indices, the most likely time until a return of a particular size has been observed is longer for gains than for losses. We demonstrate that this so-called gain-loss asymmetry vanishes if the temporal dependence structure is destroyed by scrambling the time series. We also show that an artificial index constructed as a simple average of a number of individual stocks displays gain-loss asymmetry; this allows us to explicitly analyze the dependence between the index constituents. We consider mutual information and correlation-based measures and show that the stock returns indeed have a higher degree of dependence in times of market downturns than upturns.
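The waiting-time statistic behind the gain-loss asymmetry, and the scrambling surrogate used to destroy temporal dependence, can be computed directly from a log-price series. A minimal sketch with illustrative names; `level` plays the role of the fixed return size in the text:

```python
import numpy as np

def inverse_stats(log_prices, level):
    """For each starting day, the waiting time until the cumulative
    log-return first reaches +level (a gain) or -level (a loss)."""
    gains, losses = [], []
    n = len(log_prices)
    for t in range(n - 1):
        cum = log_prices[t + 1:] - log_prices[t]
        up = np.where(cum >= level)[0]
        down = np.where(cum <= -level)[0]
        if up.size and (not down.size or up[0] < down[0]):
            gains.append(int(up[0]) + 1)
        elif down.size:
            losses.append(int(down[0]) + 1)
    return gains, losses

def scrambled(log_prices, rng):
    """Surrogate with the temporal dependence destroyed: permute the
    one-day returns, then rebuild the price path."""
    r = np.diff(log_prices)
    return np.concatenate(([log_prices[0]],
                           log_prices[0] + np.cumsum(rng.permutation(r))))
```

Comparing the most probable gain and loss waiting times on the original series and on `scrambled(...)` surrogates reproduces the vanishing-asymmetry test described above.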
Human Migration and Agricultural Expansion: An Impending Threat to the Maya Biosphere Reserve
NASA Technical Reports Server (NTRS)
Sader, Steven; Reining, Conard; Sever, Thomas L.; Soza, Carlos
1997-01-01
Evidence is presented of the current threats to the Maya Biosphere Reserve in northern Guatemala as derived through time-series Landsat Thematic Mapper observations and analysis. Estimates of deforestation rates and trends are examined for different management units within the reserve and buffer zones. The satellite imagery was used to quantify and monitor rates, patterns, and trends of forest clearing during a time period corresponding to new road construction and significant human migration into the newly accessible forest region. Satellite imagery is appropriate technology in a vast and remote tropical region where aerial photography and extensive field-based methods are not cost-effective and current, timely data is essential for establishing conservation priorities.
The business cycle and the incidence of workplace injuries: evidence from the U.S.A.
Asfaw, Abay; Pana-Cryan, Regina; Rosa, Roger
2011-02-01
The current study explored the association between the business cycle and the incidence of workplace injuries to identify cyclically sensitive industries and the relative contribution of physical capital and labor utilization within industries. Bureau of Labor Statistics nonfatal injury rates from 1976 through 2007 were examined across five industry sectors with respect to several macroeconomic indicators. Within industries, injury associations with utilization of labor and physical capital over time were tested using time series regression methods. Pro-cyclical associations between business cycle indicators and injury incidence were observed in mining, construction, and manufacturing but not in agriculture or trade. Physical capital utilization was the highest potential contributor to injuries in mining, while labor utilization was the highest potential contributor in construction. In manufacturing, each effect had a similar association with injuries. The incidence of workplace injury is associated with the business cycle. However, the degree of association and the mechanisms through which the business cycle affects the incidence of workplace injuries were not the same across industries. The results suggest that firms in the construction, manufacturing, and mining industries should take additional precautionary safety measures during cyclical upturns. Potential differences among industries in the mechanisms through which the business cycle affects injury incidence suggest different protective strategies for those industries. For example, in construction, additional efforts might be undertaken to ensure workers are adequately trained and not excessively fatigued, while safety procedures continue to be followed even during boom times. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Dodds, S. F.; Mock, C. J.
2009-12-01
All available instrumental winter precipitation data for the Central Valley of California back to 1850 were digitized and analyzed to construct continuous time series. Many of these data, in paper or microfilm format, extend prior to modern National Weather Service Cooperative Data Program and Historical Climate Network data, and were recorded by volunteer observers from networks such as the US Army Surgeon General, Smithsonian Institution, and US Army Signal Service. Given temporally incomplete individual records, detailed documentary data from newspapers, personal diaries and journals, ship logbooks, and weather enthusiasts' instrumental data were used in conjunction with instrumental data to reconstruct precipitation frequency per month and season and continuous days of precipitation, and to identify anomalous precipitation events. Multilinear regression techniques, using surrounding stations and the relationships between modern and historical records, were applied to bridge timeframes lacking data and to ensure the homogeneity of the time series. The metadata for each station were carefully screened, and notes were made about any possible changes to the instrumentation, the location of instruments, or untrained observers, to verify that anomalous events were not recorded incorrectly. Precipitation in the Central Valley varies throughout the entire region, but waterways link the differing elevations and latitudes. This study integrates the individual station data with additional accounts of flood descriptions through unique newspaper and journal data. River heights and flood extents inundating cities, agricultural lands, and individual homes are often recorded within unique documentary sources, which add to the understanding of flood occurrence within this area. Comparisons were also made between dam and levee construction through time and how waters are diverted through cities in natural and anthropogenically changed environments.
Some precipitation events that led to flooding in the Central Valley from the mid-19th century through the early 20th century stand out more at particular stations than anything in the modern record. Years included in the study are 1850, 1862, 1868, 1878, 1881, 1890, and 1907. These flood years were compared to the modern record and reconstructed through time series and maps. Incorporating the extent and effects of these anomalous events into future climate studies could improve models and preparedness for future floods.
Teen Series' Reception: Television, Adolescence and Culture of Feelings.
ERIC Educational Resources Information Center
Pasquier, Dominique
1996-01-01
Noting the popularity of television teen series among young viewers in France, this study examined how the programs are used as a way of defining gender identity for children and adolescents. Results indicated construction of meanings of characters and plots varied by age, gender, and social background of viewers. Relationship to series relied on…
New Ground Truth Capability from InSAR Time Series Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buckley, S; Vincent, P; Yang, D
2005-07-13
We demonstrate that next-generation interferometric synthetic aperture radar (InSAR) processing techniques applied to existing data provide rich InSAR ground truth content for exploitation in seismic source identification. InSAR time series analyses utilize tens of interferograms and can be implemented in different ways. In one such approach, conventional InSAR displacement maps are inverted in a final post-processing step. Alternatively, computationally intensive data reduction can be performed with specialized InSAR processing algorithms. The typical final result of these approaches is a synthesized set of cumulative displacement maps. Examples from our recent work demonstrate that these InSAR processing techniques can provide appealing new ground truth capabilities. We construct movies showing the areal and temporal evolution of deformation associated with previous nuclear tests. In other analyses, we extract time histories of centimeter-scale surface displacement associated with tunneling. The potential exists to identify millimeter-per-year surface movements when sufficient data exist for InSAR techniques to isolate and remove phase signatures associated with digital elevation model errors and the atmosphere.
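The post-processing inversion mentioned above amounts, per pixel, to a least-squares solve relating interferogram pairs to per-epoch displacement. A hedged sketch with illustrative names; each interferogram observes the displacement difference between its two acquisition epochs:

```python
import numpy as np

def invert_time_series(pairs, obs, n_epochs):
    """Least-squares inversion of interferometric observations for a
    per-epoch displacement time series, with epoch 0 fixed at zero.
    Each interferogram (i, j) in `pairs` observes d[j] - d[i]."""
    G = np.zeros((len(pairs), n_epochs - 1))
    for row, (i, j) in enumerate(pairs):
        if j > 0:
            G[row, j - 1] += 1.0      # + displacement at later epoch
        if i > 0:
            G[row, i - 1] -= 1.0      # - displacement at earlier epoch
    d, *_ = np.linalg.lstsq(G, np.asarray(obs, float), rcond=None)
    return np.concatenate(([0.0], d))
```

With more interferograms than epochs the system is overdetermined and the least-squares fit averages down per-interferogram noise, which is why tens of interferograms yield a cleaner cumulative displacement history than any single pair.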
Liu, Shuyuan; Liu, Xiangnan; Liu, Meiling; Wu, Ling; Ding, Chao; Huang, Zhi
2017-05-30
An effective method to monitor heavy metal stress in crops is of critical importance to assure agricultural production and food security. Phenology, as a sensitive indicator of environmental change, can respond to heavy metal stress in crops and remote sensing is an effective method to detect plant phenological changes. This study focused on identifying the rice phenological differences under varied heavy metal stress using EVI (enhanced vegetation index) time-series, which was obtained from HJ-1A/B CCD images and fitted with asymmetric Gaussian model functions. We extracted three phenological periods using first derivative analysis: the tillering period, heading period, and maturation period; and constructed two kinds of metrics with phenological characteristics: date-intervals and time-integrated EVI, to explore the rice phenological differences under mild and severe stress levels. Results indicated that under severe stress the values of the metrics for presenting rice phenological differences in the experimental areas of heavy metal stress were smaller than the ones under mild stress. This finding represents a new method for monitoring heavy metal contamination through rice phenology.
Data-Driven Modeling of Complex Systems by means of a Dynamical ANN
NASA Astrophysics Data System (ADS)
Seleznev, A.; Mukhin, D.; Gavrilov, A.; Loskutov, E.; Feigin, A.
2017-12-01
The data-driven methods for modeling and prognosis of complex dynamical systems are becoming more and more popular in various fields due to the growth of high-resolution data. We distinguish two basic steps in such an approach: (i) determining the phase subspace of the system, or embedding, from available time series, and (ii) constructing an evolution operator acting in this reduced subspace. In this work we suggest a novel approach that combines these two steps by means of the construction of an artificial neural network (ANN) with a special topology. The proposed ANN-based model, on the one hand, projects the data onto a low-dimensional manifold and, on the other hand, models a dynamical system on this manifold. It is, in fact, a recurrent multilayer ANN that has internal dynamics and is capable of generating time series. A very important point of the proposed methodology is the optimization of the model, which allows us to avoid overfitting: we use a Bayesian criterion to optimize the ANN structure and to estimate both the degree of evolution operator nonlinearity and the complexity of the nonlinear manifold onto which the data are projected. The proposed modeling technique will be applied to the analysis of high-dimensional dynamical systems: the Lorenz '96 model of atmospheric turbulence, producing high-dimensional space-time chaos, and a quasi-geostrophic three-layer model of the Earth's atmosphere with natural orography, describing the dynamics of synoptic vortexes as well as mesoscale blocking systems. The possibility of applying the proposed methodology to the analysis of real measured data is also discussed. The study was supported by the Russian Science Foundation (grant #16-12-10198).
Rubinstein, Robert; Kurien, Susan; Cambon, Claude
2015-06-22
The representation theory of the rotation group is applied to construct a series expansion of the correlation tensor in homogeneous anisotropic turbulence. The resolution of angular dependence is the main analytical difficulty posed by anisotropic turbulence; representation theory parametrises this dependence by a tensor analogue of the standard spherical harmonics expansion of a scalar. As a result, the series expansion is formulated in terms of explicitly constructed tensor bases with scalar coefficients determined by angular moments of the correlation tensor.
NASA Astrophysics Data System (ADS)
Ai, Jinquan; Gao, Wei; Gao, Zhiqiang; Shi, Runhe; Zhang, Chao
2017-04-01
Spartina alterniflora is an aggressive invasive plant species that replaces native species and changes the structure and function of the ecosystem across coastal wetlands in China, and it is thus a major conservation concern. Mapping the spread of its invasion is a necessary first step for the implementation of effective ecological management strategies. The performance of a phenology-based approach for S. alterniflora mapping is explored in the coastal wetland of the Yangtze Estuary using a time series of GaoFen satellite no. 1 wide field of view camera (GF-1 WFV) imagery. First, a time series of the normalized difference vegetation index (NDVI) was constructed to evaluate the phenology of S. alterniflora. Two phenological stages (the senescence stage from November to mid-December and the green-up stage from late April to May) were determined to be important for S. alterniflora detection in the study area based on NDVI temporal profiles, spectral reflectance curves of S. alterniflora and its coexistent species, and field surveys. Three phenology feature sets representing three major phenology-based detection strategies were then compared for mapping S. alterniflora: (1) single-date imagery acquired within the optimal phenological window, (2) multitemporal imagery, including four images from the two important phenological windows, and (3) monthly NDVI time series imagery. Support vector machine and maximum likelihood classifiers were applied to each phenology feature set at different training sample sizes. For all phenology feature sets, consistently high mapping accuracies were produced under sufficient training sample sizes, although significantly improved classification accuracies (10%) were obtained when the monthly NDVI time series imagery was employed. The optimal single-date imagery had the lowest accuracies of all detection strategies.
The multitemporal analysis showed little reduction in overall accuracy compared with the use of monthly NDVI time series imagery. These results show the importance of considering the phenological stage when selecting images for mapping S. alterniflora using GF-1 WFV imagery. Furthermore, in light of the better tradeoff between the number of images and classification accuracy when using multitemporal GF-1 WFV imagery, we suggest using multitemporal imagery acquired at appropriate phenological windows for S. alterniflora mapping at regional scales.
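Two building blocks of this kind of phenology-based detection, computing an NDVI time series and locating transitions from its first derivative, can be sketched generically (the asymmetric-Gaussian fitting step is omitted, and the function names are illustrative):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index from near-infrared and
    red reflectance: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

def transition_dates(ndvi_series):
    """Rough phenological transitions from the first derivative of a
    smoothed NDVI curve: green-up at the maximum rate of increase,
    senescence at the maximum rate of decrease."""
    d = np.gradient(np.asarray(ndvi_series, float))
    return int(np.argmax(d)), int(np.argmin(d))
```

In practice the derivative would be taken on the fitted asymmetric-Gaussian curve rather than the raw samples, so that cloud gaps and sensor noise do not create spurious extrema.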
Lafuente, M J; Petit, T; Gancedo, C
1997-12-22
We have constructed a series of plasmids to facilitate the fusion of promoters, with or without the coding regions of genes of Schizosaccharomyces pombe, to the lacZ gene of Escherichia coli. These vectors carry a multiple cloning region in which fission yeast DNA may be inserted in three different reading frames with respect to the coding region of lacZ. The plasmids were constructed with the ura4+ or the his3+ marker of S. pombe. Functionality of the plasmids was tested by measuring in parallel the expression of fructose 1,6-bisphosphatase and beta-galactosidase under the control of the fbp1+ promoter under different conditions.
Brief history of US debt limits before 1939
Hall, George J.; Sargent, Thomas J.
2018-01-01
Between 1776 and 1920, the US Congress designed more than 200 distinct securities and stated the maximum amount of each that the Treasury could sell. Between 1917 and 1939, Congress gradually delegated all decisions about designing US debt instruments to the Treasury. In 1939, Congress began imposing a limit on the par value of total federal debt outstanding. By summing Congressional borrowing authorizations outstanding each year for each bond, we construct a time series of implied federal debt limits before 1939. PMID:29507220
Visibility graph analysis on heartbeat dynamics of meditation training
NASA Astrophysics Data System (ADS)
Jiang, Sen; Bian, Chunhua; Ning, Xinbao; Ma, Qianli D. Y.
2013-06-01
We apply visibility graph analysis to human heartbeat dynamics by constructing complex networks from heartbeat interval time series and investigating the statistical properties of the networks before and during chi and yoga meditation. The experimental results show that visibility graph analysis can reveal the dynamical changes caused by meditation training, manifested as a more regular heartbeat, which is closely related to the adjustment of the autonomic nervous system; visibility graph analysis is thus effective for evaluating the effect of meditation.
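The natural visibility graph used in this kind of analysis maps each sample of the interval series to a node and links two samples whenever the straight line between them clears every intermediate sample. A minimal sketch:

```python
import numpy as np

def visibility_graph(series):
    """Natural visibility graph of a time series: edge (i, j) exists
    when every intermediate sample k satisfies
    y_k < y_j + (y_i - y_j) * (j - k) / (j - i)."""
    y = np.asarray(series, dtype=float)
    n = len(y)
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            visible = all(
                y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.add((i, j))
    return edges

def degrees(edges, n):
    """Degree sequence, the basic statistic compared across conditions."""
    deg = [0] * n
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    return deg
```

Adjacent samples are always mutually visible, so the path graph is a subgraph; it is the extra long-range visibility edges, and the degree distribution they produce, that carry the dynamical information compared before and during meditation.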
Stellar Variability in the VVV Survey: An Update
NASA Astrophysics Data System (ADS)
Catelan, M.; Dekany, I.; Hempel, M.; Minniti, D.
The Vista Variables in the Via Lactea (VVV) ESO Public Survey consists of a near-infrared time-series survey of the Galactic bulge and inner disk, covering 562 square degrees of sky over a total timespan of more than 5 years. In this paper, we provide an updated account of the current status of the survey, especially in the context of stellar variability studies. In this sense, we give a first description of our efforts towards the construction of the VVV Variable Star Catalog (VVV-VSC).
Ant colony system algorithm for the optimization of beer fermentation control.
Xiao, Jie; Zhou, Ze-Kui; Zhang, Guang-Xin
2004-12-01
Beer fermentation is a dynamic process that must be guided along a temperature profile to obtain the desired results. The ant colony system algorithm was applied to optimize the kinetic model of this process. Over a fixed period of fermentation time, a series of different temperature profiles of the mixture was constructed, and an optimal one was then chosen. The optimal temperature profile maximized the final ethanol production and minimized the byproduct concentration and spoilage risk. The satisfactory results obtained did not require much computational effort.
ERIC Educational Resources Information Center
Cummins, John
2017-01-01
This paper is a description and analysis of the history of the renovation of Memorial Stadium and the building of the Barclay Simpson Student Athlete High Performance Center (SAHPC) on the Berkeley campus, showing how incremental changes over time result in a much riskier and financially less viable project than originally anticipated. It…
The ATLAS Experiment: Mapping the Secrets of the Universe (LBNL Summer Lecture Series)
Barnett, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Physics Division
2018-01-12
Summer Lecture Series 2007: Michael Barnett of Berkeley Lab's Physics Division discusses the ATLAS Experiment at the European Laboratory for Particle Physics' (CERN) Large Hadron Collider. The collider will explore the aftermath of collisions at the highest energy ever produced in the lab, and will recreate the conditions of the universe a billionth of a second after the Big Bang. The ATLAS detector is half the size of the Notre Dame Cathedral and required 2000 physicists and engineers from 35 countries for its construction. Its goals are to examine mini-black holes, identify dark matter, understand antimatter, search for extra dimensions of space, and learn about the fundamental forces that have shaped the universe since the beginning of time and will determine its fate.
Disequilibrium in the uranium and actinium series in oil scale samples.
Landsberger, S; Tamalis, D; Leblanc, C; Yoho, M D
2017-01-01
We have investigated the disequilibrium of the uranium and actinium series and have found that both 226Ra (90,200 ± 4300 Bq/kg) and 228Ra have activity concentrations orders of magnitude higher than 238U (1.83 ± 0.36 Bq/kg) and 232Th (7.0 ± 0.4 Bq/kg), which are at the head of the decay series. In addition, the activity concentration of 210Pb (24,400 ± 1200 Bq/kg) was about 3.6 times less than that of 226Ra. Once an efficiency curve was constructed, summing corrections for specific isotopes in the decay chain also needed to be taken into consideration. Furthermore, self-attenuation of the photons, especially the 46.5 keV line belonging to 210Pb, was calculated to be 78%, since the scale had elevated elemental concentrations of high-Z elements such as barium and strontium.
Therapeutic Assessment of Complex Trauma: A Single-Case Time-Series Study.
Tarocchi, Anna; Aschieri, Filippo; Fantini, Francesca; Smith, Justin D
2013-06-01
The cumulative effect of repeated traumatic experiences in early childhood incrementally increases the risk of adjustment problems later in life. Surviving traumatic environments can lead to the development of an interrelated constellation of emotional and interpersonal symptoms termed complex posttraumatic stress disorder (CPTSD). Effective treatment of trauma begins with a multimethod psychological assessment and requires the use of several evidence-based therapeutic processes, including establishing a safe therapeutic environment, reprocessing the trauma, constructing a new narrative, and managing emotional dysregulation. Therapeutic Assessment (TA) is a semistructured, brief intervention that uses psychological testing to promote positive change. The case study of Kelly, a middle-aged woman with a history of repeated interpersonal trauma, illustrates delivery of the TA model for CPTSD. Results of this single-case time-series experiment indicate statistically significant symptom improvement as a result of participating in TA. We discuss the implications of these findings for assessing and treating trauma-related concerns, such as CPTSD. PMID:24159267
The string prediction models as invariants of time series in the forex market
NASA Astrophysics Data System (ADS)
Pincak, R.
2013-12-01
In this paper we apply a new string-theory-based approach to real financial markets. The models are constructed around the idea of prediction models based on string invariants (PMBSI). The performance of PMBSI is compared to support vector machines (SVM) and artificial neural networks (ANN) on an artificial and a financial time series, and a brief overview of the results and analysis is given. The first model uses the correlation function as the invariant; the second is based on deviations from the closed string/pattern form (PMBCS). We found clear differences between the two approaches: the first model cannot predict the behavior of the forex market efficiently, whereas the second can and is, in addition, able to generate a relevant profit per year. The presented string models could be useful for portfolio creation and financial risk management in the banking sector, as well as for a nonlinear statistical approach to data optimization.
Variable Basal Melt Rates of Antarctic Peninsula Ice Shelves, 1994-2016
NASA Astrophysics Data System (ADS)
Adusumilli, Susheel; Fricker, Helen Amanda; Siegfried, Matthew R.; Padman, Laurie; Paolo, Fernando S.; Ligtenberg, Stefan R. M.
2018-05-01
We have constructed 23-year (1994-2016) time series of Antarctic Peninsula (AP) ice-shelf height change using data from four satellite radar altimeters (ERS-1, ERS-2, Envisat, and CryoSat-2). Combining these time series with output from atmospheric and firn models, we partitioned the total height-change signal into contributions from varying surface mass balance, firn state, ice dynamics, and basal mass balance. On the Bellingshausen coast of the AP, ice shelves lost 84 ± 34 Gt a⁻¹ to basal melting, compared to contributions of 50 ± 7 Gt a⁻¹ from surface mass balance and ice dynamics. Net basal melting on the Weddell coast was 51 ± 71 Gt a⁻¹. Recent changes in ice-shelf height include increases over major AP ice shelves driven by changes in firn state. Basal melt rates near Bawden Ice Rise, a major pinning point of Larsen C Ice Shelf, showed large increases, potentially leading to substantial loss of buttressing if sustained.
Matsunaga, Yasuhiro; Sugita, Yuji
2018-01-01
Single-molecule experiments and molecular dynamics (MD) simulations are indispensable tools for investigating protein conformational dynamics. The former provide time-series data, such as donor-acceptor distances, whereas the latter give atomistic information, although this information is often biased by model parameters. Here, we devise a machine-learning method to combine the complementary information from the two approaches and construct a consistent model of conformational dynamics. It is applied to the folding dynamics of the formin-binding protein WW domain. MD simulations over 400 μs led to an initial Markov state model (MSM), which was then "refined" using single-molecule Förster resonance energy transfer (FRET) data through hidden Markov modeling. The refined or data-assimilated MSM reproduces the FRET data and features hairpin one in the transition-state ensemble, consistent with mutation experiments. The folding pathway in the data-assimilated MSM suggests interplay between hydrophobic contacts and turn formation. Our method provides a general framework for investigating conformational transitions in other proteins. PMID:29723137
Multi-frequency complex network from time series for uncovering oil-water flow structure.
Gao, Zhong-Ke; Yang, Yu-Xuan; Fang, Peng-Cheng; Jin, Ning-De; Xia, Cheng-Yi; Hu, Li-Dan
2015-02-04
Uncovering complex oil-water flow structure represents a challenge in diverse scientific disciplines. This challenge stimulated us to develop a new distributed conductance sensor for measuring local flow signals at different positions and to propose a novel approach based on multi-frequency complex networks to uncover the flow structures from experimental multivariate measurements. In particular, based on the fast Fourier transform, we demonstrate how to derive multi-frequency complex networks from multivariate time series. We construct complex networks at different frequencies and then detect community structures. Our results indicate that the community structures faithfully represent the structural features of oil-water flow patterns. Furthermore, we investigate the network statistics at different frequencies for each derived network and find that the frequency clustering coefficient enables us to uncover the evolution of flow patterns and yields deep insights into the formation of flow structures. These results present a first step towards a network visualization of complex flow patterns from a community structure perspective.
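A minimal sketch of the frequency-resolved network idea described above: each channel is band-limited by FFT masking, and channels whose band-limited signals correlate strongly are linked. The 0.5 correlation threshold and the test frequencies are illustrative assumptions, and the community-detection step of the paper is omitted.

```python
import numpy as np

def band_network(signals, fs, band, threshold=0.5):
    """Build an adjacency matrix for one frequency band: filter each channel
    to the band via FFT masking, then link channels whose band-limited
    signals have |correlation| >= threshold."""
    signals = np.asarray(signals, dtype=float)
    n_ch, n = signals.shape
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs < band[1])
    filtered = np.empty_like(signals)
    for c in range(n_ch):
        spec = np.fft.rfft(signals[c])
        spec[~mask] = 0.0                  # zero all bins outside the band
        filtered[c] = np.fft.irfft(spec, n)
    corr = np.corrcoef(filtered)
    # No self-loops; edge iff band-limited correlation exceeds the threshold.
    return (np.abs(corr) >= threshold) & ~np.eye(n_ch, dtype=bool)
```

Repeating this over several bands yields one network per frequency, on which community structure can then be analyzed.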
Automatic Detection of Driver Fatigue Using Driving Operation Information for Transportation Safety
Li, Zuojin; Chen, Liukui; Peng, Jun; Wu, Ying
2017-01-01
Fatigued driving is a major cause of road accidents. For this reason, the method in this paper detects drivers' fatigue levels from steering wheel angle (SWA) and yaw angle (YA) information collected under real driving conditions. It analyzes the operational features of SWA and YA under different fatigue statuses, then calculates approximate entropy (ApEn) features over a short sliding window on the time series. Using the nonlinear feature construction theory of dynamic time series, with the fatigue features as input, a "2-6-6-3" multi-level back propagation (BP) neural network classifier is designed to perform fatigue detection. An approximately 15-h experiment was carried out on a real road, and the data retrieved were segmented and labeled with three fatigue levels after expert evaluation, namely "awake", "drowsy" and "very drowsy". An average accuracy of 88.02% in fatigue identification was achieved in the experiment, endorsing the value of the proposed method for engineering applications. PMID:28587072
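The ApEn feature mentioned above quantifies the irregularity of a short signal window. Below is a generic implementation of Pincus' approximate entropy, not the paper's exact pipeline; m = 2 and r = 0.2 × standard deviation are common defaults, assumed here rather than taken from the paper.

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D series. Lower values indicate
    more regular (self-similar) signals. r defaults to 0.2 * std(x)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()

    def phi(m):
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])   # m-length templates
        # Chebyshev distance between all template pairs (self-matches kept,
        # as in the original definition, so counts are never zero).
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = np.mean(d <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)
```

Sliding this over short windows of SWA/YA signals would yield one fatigue-related feature per window.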
A dynamic factor model of the evaluation of the financial crisis in Turkey.
Sezgin, F; Kinay, B
2010-01-01
Factor analysis has been widely used in economics and finance in situations where a relatively large number of variables are believed to be driven by a few common causes of variation. Dynamic factor analysis (DFA), a combination of factor and time series analysis, involves autocorrelation matrices calculated from multivariate time series. Dynamic factor models were traditionally used to construct economic indicators and for macroeconomic analysis, business cycle studies, and forecasting. In recent years, dynamic factor models have become more popular in empirical macroeconomics. They have advantages over other methods in various respects: factor models can, for instance, cope with many variables without running into the scarce-degrees-of-freedom problems often faced in regression-based analysis. In this study, a model which determines the effect of the global crisis on Turkey is proposed. The main aim of the paper is to analyze how several macroeconomic quantities change before the onset of the crisis and to determine whether a crisis can be forecast.
Real Time Search Algorithm for Observation Outliers During Monitoring Engineering Constructions
NASA Astrophysics Data System (ADS)
Latos, Dorota; Kolanowski, Bogdan; Pachelski, Wojciech; Sołoducha, Ryszard
2017-12-01
Real-time monitoring of engineering structures in case of an emergency or disaster requires the collection of a large amount of data to be processed by specific analytical techniques. A quick and accurate assessment of the state of the object is crucial for a possible rescue action. One of the more significant methods for evaluating large sets of data, collected either during a specified interval of time or permanently, is time series analysis. This paper presents a search algorithm for those time series elements which deviate from their expected values during monitoring. Quick and proper detection of observations indicating anomalous behavior of the structure allows a variety of preventive actions to be taken. The mathematical formulae used in the algorithm provide maximal sensitivity, detecting even minimal changes in the object's behavior. Sensitivity analyses were conducted for the moving average algorithm as well as for the Douglas-Peucker algorithm used for generalization of linear objects in GIS. In addition to determining the size of deviations from the average, the so-called Hausdorff distance was used. Simulations and verification against laboratory survey data showed that the approach provides sufficient sensitivity for automatic real-time analysis of large amounts of data obtained from different sensors (total stations, leveling, cameras, radar).
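A minimal sketch of a moving-average outlier screen of the kind discussed above: each new observation is compared against the mean of a trailing window, and flagged when it deviates by more than k running standard deviations. The window length and k are illustrative assumptions; the Hausdorff-distance refinement of the paper is not reproduced here.

```python
import numpy as np

def flag_outliers(series, window=5, k=3.0):
    """Flag observations deviating from the trailing moving average by more
    than k running standard deviations (simple real-time-style screen)."""
    series = np.asarray(series, dtype=float)
    flags = np.zeros(len(series), dtype=bool)
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu, sigma = hist.mean(), hist.std()
        if sigma == 0:
            sigma = 1e-12        # constant history: any change is an outlier
        flags[i] = abs(series[i] - mu) > k * sigma
    return flags
```

In a monitoring deployment, each flagged observation would trigger a closer inspection of the sensor and structure.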
Edgelist phase unwrapping algorithm for time series InSAR analysis.
Shanker, A Piyush; Zebker, Howard
2010-03-01
We present here a new integer programming formulation for phase unwrapping of multidimensional data. Phase unwrapping is a key problem in many coherent imaging systems, including time series synthetic aperture radar interferometry (InSAR), with two spatial and one temporal data dimensions. The minimum cost flow (MCF) [IEEE Trans. Geosci. Remote Sens. 36, 813 (1998)] phase unwrapping algorithm describes a global cost minimization problem involving flow between phase residues computed over closed loops. Here we replace closed loops by reliable edges as the basic construct, thus leading to the name "edgelist." Our algorithm has several advantages over current methods: it simplifies the representation of multidimensional phase unwrapping, it incorporates data from external sources, such as GPS, where available to better constrain the unwrapped solution, and it treats regularly sampled or sparsely sampled data alike. It is thus particularly applicable to time series InSAR, where data are often irregularly spaced in time and individual interferograms can be corrupted with large decorrelated regions. We show that, similar to the MCF network problem, the edgelist formulation also exhibits total unimodularity, which enables us to solve the integer program by using efficient linear programming tools. We apply our method to a persistent scatterer InSAR data set from the creeping section of the Central San Andreas Fault and find that the average creep rate of 22 mm/yr is constant within 3 mm/yr over 1992-2004 but varies systematically with ground location, with a slightly higher rate in 1992-1998 than in 1999-2003.
Application of the Hilbert-Huang Transform to Financial Data
NASA Technical Reports Server (NTRS)
Huang, Norden
2005-01-01
A paper discusses the application of the Hilbert-Huang transform (HHT) method to time-series financial-market data. The method was described, variously without and with the HHT name, in several prior NASA Tech Briefs articles and supporting documents. To recapitulate: the method is especially suitable for analyzing time-series data that represent nonstationary and nonlinear phenomena, including physical phenomena and, in the present case, financial-market processes. The method involves empirical mode decomposition (EMD), in which a complicated signal is decomposed into a finite number of functions, called "intrinsic mode functions" (IMFs), that admit well-behaved Hilbert transforms. The HHT consists of the combination of EMD and Hilbert spectral analysis. The local energies and the instantaneous frequencies derived from the IMFs through Hilbert transforms can be used to construct an energy-frequency-time distribution, denoted a Hilbert spectrum. The present paper begins with a discussion of prior approaches to quantification of market volatility, summarizes the HHT method, and then describes the application of the method in performing time-frequency analysis of mortgage-market data from the years 1972 through 2000. Filtering by use of the EMD is shown to be useful for quantifying market volatility.
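The Hilbert-spectral half of the HHT can be sketched compactly: given a (mono-component) signal such as a single IMF, the instantaneous frequency is the derivative of the phase of its analytic signal. The FFT-based Hilbert construction below is a generic illustration and omits the EMD sifting that produces the IMFs in the first place.

```python
import numpy as np

def instantaneous_frequency(x, fs):
    """Instantaneous frequency (Hz) of a mono-component signal via the
    analytic signal, built with an FFT-based Hilbert transform."""
    n = len(x)
    spec = np.fft.fft(x)
    # Standard analytic-signal weights: keep DC/Nyquist, double positives.
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    analytic = np.fft.ifft(spec * h)
    phase = np.unwrap(np.angle(analytic))
    return np.diff(phase) * fs / (2.0 * np.pi)
```

Applied to each IMF, this yields the frequency-vs-time traces that populate the Hilbert spectrum.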
Binding Isotherms and Time Courses Readily from Magnetic Resonance.
Xu, Jia; Van Doren, Steven R
2016-08-16
Evidence is presented that binding isotherms, simple or biphasic, can be extracted directly from noninterpreted, complex 2D NMR spectra using principal component analysis (PCA) to reveal the largest trend(s) across the series. This approach renders peak picking unnecessary for tracking population changes. In 1:1 binding, the first principal component captures the binding isotherm from NMR-detected titrations in fast, slow, and even intermediate and mixed exchange regimes, as illustrated for phospholigand associations with proteins. Although the sigmoidal shifts and line broadening of intermediate exchange distort binding isotherms constructed conventionally, applying PCA directly to these spectra along with Pareto scaling overcomes the distortion. Applying PCA to time-domain NMR data also yields binding isotherms from titrations in fast or slow exchange. The algorithm also readily extracts time courses, such as breathing and heart rate in chest imaging, from magnetic resonance imaging movies. Similarly, two-step binding processes detected by NMR are easily captured by principal components 1 and 2. PCA obviates the customary focus on specific peaks or regions of images. Applying it directly to a series of complex data will easily delineate binding isotherms, equilibrium shifts, and time courses of reactions or fluctuations.
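The core PCA step described above can be sketched with a plain SVD: flatten each spectrum to a row, center across the titration series, and read the first principal-component scores as the isotherm trend. This omits the Pareto scaling discussed in the abstract; the simulated two-state titration in the usage example (with a hypothetical Kd of 1 in ligand units) is only a sanity check, not real NMR data.

```python
import numpy as np

def first_pc_scores(spectra):
    """Score of each (flattened) spectrum on the first principal component.
    For a two-state 1:1 titration, this trend tracks the binding isotherm."""
    X = np.asarray(spectra, dtype=float)
    Xc = X - X.mean(axis=0)                    # center across the series
    u, s, vt = np.linalg.svd(Xc, full_matrices=False)
    return u[:, 0] * s[0]                      # PC1 score per titration point
```

The scores (up to sign and offset) can then be fit with a standard binding model to extract the dissociation constant.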
Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim
2015-01-01
Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then, in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. The construction of the DPDs method and results illustrating the approach are presented. The advantage of applying DPDs is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.
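The differencing at the heart of DPDs reduces to subtracting each frame from its successor. The sketch below assumes the phase images have already been converted to dry-mass density maps (pg per square micrometer per pixel) and that the pixel area is known; both are assumptions for illustration, not details from the paper.

```python
import numpy as np

def dynamic_phase_differences(mass_maps, px_area_um2):
    """DPD sketch: subtract the antecedent from the subsequent frame of a
    time-lapse stack of dry-mass density maps (pg/um^2 per pixel) and report
    the net dry-mass change per step in picograms."""
    stack = np.asarray(mass_maps, dtype=float)
    dpd = np.diff(stack, axis=0)                  # one difference image per pair
    net_pg = dpd.sum(axis=(1, 2)) * px_area_um2   # integrate density over area
    return dpd, net_pg
```

The per-pair difference images correspond to the color-coded DPD projections; the net series is the picogram-quantified time dependence.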
A Generalized Wave Diagram for Moving Sources
NASA Astrophysics Data System (ADS)
Alt, Robert; Wiley, Sam
2004-12-01
Many introductory physics texts [1-5] accompany the discussion of the Doppler effect and the formation of shock waves with diagrams illustrating the effect of a source moving through an elastic medium. Typically these diagrams consist of a series of equally spaced dots, representing the location of the source at different times. These are surrounded by a series of successively smaller circles representing wave fronts (see Fig. 1). While such a diagram provides a clear illustration of the shock wave produced by a source moving at a speed greater than the wave speed, and also the resultant pattern when the source speed is less than the wave speed (the Doppler effect), the texts do not often show the details of the construction. As a result, the key connection between the relative distance traveled by the source and the distance traveled by the wave is not explicitly made. In this paper we describe an approach emphasizing this connection that we have found to be a useful classroom supplement to the usual text presentation. As shown in Fig. 2 and Fig. 3, the Doppler effect and the shock wave can be illustrated by diagrams generated by the construction that follows.
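The construction behind such diagrams can be made explicit with a few lines of code: each wavefront emitted at time k·dt is a circle centered at the source's position at emission, with radius equal to the distance the wave has traveled since. This is a generic sketch of the textbook construction, with arbitrary illustrative speeds, not the authors' figures.

```python
import math

def wavefronts(v_source, v_wave, dt, n):
    """Centers (x) and radii of the n wavefront circles emitted at times
    0, dt, ..., (n-1)*dt by a source moving along x, observed at the time
    of the last emission."""
    t_now = (n - 1) * dt
    fronts = []
    for k in range(n):
        t_emit = k * dt
        center_x = v_source * t_emit          # source position at emission
        radius = v_wave * (t_now - t_emit)    # distance traveled by the wave
        fronts.append((center_x, radius))
    return fronts

def mach_angle(v_source, v_wave):
    """Half-angle of the shock cone; defined only for v_source > v_wave."""
    return math.asin(v_wave / v_source)
```

For a supersonic source (v_source > v_wave) the circles share a common tangent line through the source, the Mach cone; for a subsonic source they nest asymmetrically, giving the Doppler pattern.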
Vegetation Response to Climate Change in the Southern Part of Qinghai-Tibet Plateau at Basinal Scale
NASA Astrophysics Data System (ADS)
Liu, X.; Liu, C.; Kang, Q.; Yin, B.
2018-04-01
Global climate change has significantly affected vegetation variation in the third-polar region of the world - the Qinghai-Tibet Plateau. As one of the most important indicators of vegetation variation (growth, coverage and tempo-spatial change), the Normalized Difference Vegetation Index (NDVI) is widely employed to study the response of vegetation to climate change. However, a long-term series analysis cannot be achieved because a single data source is constrained by time sequence. Therefore, a new framework was presented in this paper to extend the product series of monthly NDVI, taking as an example the Yarlung Zangbo River Basin, one of the most important river basins in the Qinghai-Tibet Plateau. NDVI products were acquired from two public sources: Global Inventory Modeling and Mapping Studies (GIMMS) Advanced Very High Resolution Radiometer (AVHRR) and Moderate-Resolution Imaging spectroradiometer (MODIS). After having been extended using the new framework, the new time series of NDVI covers a 384 months period (1982-2013), 84 months longer than previous time series of NDVI product, greatly facilitating NDVI related scientific research. In the new framework, the Gauss Filtering Method was employed to filter out noise in the NDVI product. Next, the standard method was introduced to enhance the comparability of the two data sources, and a pixel-based regression method was used to construct NDVI-extending models with one pixel after another. The extended series of NDVI fit well with original AVHRR-NDVI. With the extended time-series, temporal trends and spatial heterogeneity of NDVI in the study area were studied. Principal influencing factors on NDVI were further determined. The monthly NDVI is highly correlated with air temperature and precipitation in terms of climatic change wherein the spatially averaged NDVI slightly increases in the summer and has increased in temperature and decreased in precipitation in the 32 years period. 
The spatial heterogeneity of NDVI is in accordance with the seasonal variation of the two climate-change factors. All of these findings can provide valuable scientific support for water-land resources exploration in the third-polar region of the world.
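The pixel-based regression step mentioned above can be sketched simply: for each pixel, fit a linear model between the two sensors' NDVI over the overlap months, then apply that model to the earlier AVHRR-only months. This is a minimal illustration of the idea; the paper's Gaussian filtering and standardization steps are omitted, and all array shapes are assumptions.

```python
import numpy as np

def extend_ndvi(avhrr_overlap, modis_overlap, avhrr_early):
    """Per-pixel linear models fitted on the overlap period map older AVHRR
    NDVI onto the MODIS-consistent scale.
    Inputs are (n_months, n_pixels) arrays; avhrr_early holds the months to
    be converted."""
    a = np.asarray(avhrr_overlap, dtype=float)
    m = np.asarray(modis_overlap, dtype=float)
    e = np.asarray(avhrr_early, dtype=float)
    out = np.empty_like(e)
    for p in range(a.shape[1]):
        slope, intercept = np.polyfit(a[:, p], m[:, p], 1)  # one model per pixel
        out[:, p] = slope * e[:, p] + intercept
    return out
```

Concatenating the converted early months with the MODIS record yields the extended series.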
Capattery double layer capacitor life performance
NASA Astrophysics Data System (ADS)
Evans, David A.; Clark, Nancy H.; Baca, W. E.; Miller, John R.; Barker, Thomas B.
Double layer capacitors (DLCs) have received increased use in computer memory backup applications for consumer products during the past ten years. Their extraordinarily high capacitance density along with their maintenance-free operation makes them particularly suited for these products. These same features also make DLCs very attractive in military applications. Unfortunately, lifetime performance data have not been reported in the literature for any DLC component. Our objective in this study was to investigate the effects that voltage and temperature have on the properties and performance of single and series-connected DLCs as a function of time. Evans model RE110474, 0.47-farad, 11.0-volt Capatteries were evaluated. These components have a tantalum package, use welded construction, and contain a glass-to-metal seal, all incorporated to circumvent the typical DLC failure modes of electrolyte loss and container corrosion. A five-level, two-factor Central Composite Design was used in the study. Single and series-connected Capatteries rated for 85 °C, 11.0-volt operation were subjected to test temperatures between 25 and 95 °C and voltages between 0 and 12.9 volts (9 test conditions). Measured responses included capacitance, equivalent series resistance, and discharge time. Data were analyzed using regression analysis to obtain response functions relating DLC properties to their voltage, temperature, and test time history. These results are described and should aid system and component engineers in using DLCs in critical applications.
Motor potential profile and a robust method for extracting it from time series of motor positions.
Wang, Hongyun
2006-10-21
Molecular motors are small, and, as a result, motor operation is dominated by high-viscous friction and large thermal fluctuations from the surrounding fluid environment. The small size has hindered, in many ways, the studies of physical mechanisms of molecular motors. For a macroscopic motor, it is possible to observe/record experimentally the internal operation details of the motor. This is not yet possible for molecular motors. The chemical reaction in a molecular motor has many occupancy states, each having a different effect on the motor motion. The overall effect of the chemical reaction on the motor motion can be characterized by the motor potential profile. The potential profile reveals how the motor force changes with position in a motor step, which may lead to insights into how the chemical reaction is coupled to force generation. In this article, we propose a mathematical formulation and a robust method for constructing motor potential profiles from time series of motor positions measured in single molecule experiments. Numerical examples based on simulated data are shown to demonstrate the method. Interestingly, it is the small size of molecular motors (negligible inertia) that makes it possible to recover the potential profile from time series of motor positions. For a macroscopic motor, the variation of driving force within a cycle is smoothed out by the large inertia.
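A generic flavor of the reconstruction problem above can be sketched with a drift-based estimator: for overdamped dynamics, gamma·dx/dt = -U'(x) + noise, so bin-averaging the observed drift gives -U'(x)/gamma, which integrates to a potential profile. This is a standard textbook estimator shown for illustration, not the specific robust method proposed in the paper; gamma and the bin count are assumed parameters.

```python
import numpy as np

def potential_from_trajectory(x, dt, gamma=1.0, bins=8):
    """Drift-based potential estimate for an overdamped trajectory:
    bin-average the local drift <dx/dt | x>, set U'(x) = -gamma * drift,
    and integrate across the bins."""
    x = np.asarray(x, dtype=float)
    drift = (x[1:] - x[:-1]) / dt
    edges = np.linspace(x.min(), x.max(), bins + 1)
    idx = np.clip(np.digitize(x[:-1], edges) - 1, 0, bins - 1)
    mean_drift = np.array([drift[idx == b].mean() if np.any(idx == b) else 0.0
                           for b in range(bins)])
    width = edges[1] - edges[0]
    potential = np.cumsum(-gamma * mean_drift) * width   # integrate U'(x)
    potential -= potential.min()                         # anchor minimum at 0
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, potential
```

For a particle in a quadratic well, the recovered profile rises away from the well center, as expected.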
The risk characteristics of solar and geomagnetic activity
NASA Astrophysics Data System (ADS)
Podolska, Katerina
2016-04-01
The main aim of this contribution is a deeper analysis of the influence of solar activity, which is expected to have an impact on human health, and therefore on mortality, in particular from civilization and degenerative diseases. We have constructed characteristics that represent the risk of solar and geomagnetic activity to human health on the basis of our previous analysis of the association between daily numbers of deaths from diseases of the nervous system and diseases of the circulatory system and solar and geomagnetic activity in the Czech Republic during the years 1994-2013. We used long-period daily time series of numbers of deaths by cause, long-period time series of solar activity indices (namely R and F10.7), geomagnetic indices (Kp planetary index, Dst), and ionospheric parameters (foF2 and TEC). The ionospheric parameters were related to the geographic location of the Czech Republic and adjusted for middle geographic latitudes. The risk characteristics were composed by cluster analysis of the time series according to the phases of the solar cycle, the seasonal insolation at mid-latitudes, or the daily period, based on the impact of solar and geomagnetic activity on mortality from causes in groups VI (Diseases of the nervous system) and IX (Diseases of the circulatory system) of the 10th Revision of the International Classification of Diseases (ICD-10, WHO).
On the divergence of triangular and eccentric spherical sums of double Fourier series
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karagulyan, G A
We construct a continuous function on the torus with almost everywhere divergent triangular sums of double Fourier series. We also prove an analogous theorem for eccentric spherical sums. Bibliography: 14 titles.
Lindhiem, Oliver; Shaffer, Anne
2017-04-01
Parenting behaviors are multifaceted and dynamic and therefore challenging to quantify. Measurement methods have critical implications for study results, particularly for prevention trials designed to modify parenting behaviors. Although multiple approaches can complement one another and contribute to a more complete understanding of prevention trials, the assumptions and implications of each approach are not always clearly addressed. Greater attention to the measurement of complex constructs such as parenting is needed to advance the field of prevention science. This series examines the challenges of measuring changes in parenting behaviors in the context of prevention trials. All manuscripts in the special series address measurement issues and make practical recommendations for prevention researchers. Manuscripts in this special series include (1) empirical studies that demonstrate novel measurement approaches, (2) re-analyses of prevention trial outcome data directly comparing and contrasting two or more methods, and (3) a statistical primer and practical guide to analyzing proportion data.
Nonparametric model validations for hidden Markov models with applications in financial econometrics
Zhao, Zhibiao
2011-01-01
We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for the transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable to continuous-time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise. PMID:21750601
DynamicRoots: A Software Platform for the Reconstruction and Analysis of Growing Plant Roots.
Symonova, Olga; Topp, Christopher N; Edelsbrunner, Herbert
2015-01-01
We present a software platform for reconstructing and analyzing the growth of a plant root system from a time series of 3D voxelized shapes. It aligns the shapes with each other, constructs a geometric graph representation together with the function that records the time of growth, and organizes the branches into a hierarchy that reflects the order of creation. The software includes the automatic computation of structural and dynamic traits for each root in the system, enabling the quantification of growth at a fine scale. These are important advances in plant phenotyping with applications to the study of genetic and environmental influences on growth.
Construct-a-Greenhouse. Science by Design Series.
ERIC Educational Resources Information Center
Lee, Felicia
This book is one of four books in the Science-by-Design Series created by TERC and funded by the National Science Foundation (NSF). This series presents directed instruction on how to successfully formulate and carry out product design. Students learn and apply concepts in science and technology to design and build a pair of insulated gloves, a…
Bose, Eliezer; Hravnak, Marilyn; Sereika, Susan M
Patients undergoing continuous vital sign monitoring (heart rate [HR], respiratory rate [RR], pulse oximetry [SpO2]) in real time display interrelated vital sign changes during situations of physiological stress. Patterns in this physiological cross-talk could portend impending cardiorespiratory instability (CRI). Vector autoregressive (VAR) modeling with Granger causality tests is one of the most flexible ways to elucidate underlying causal mechanisms in time series data. The purpose of this article is to illustrate the development of patient-specific VAR models using vital sign time series data in a sample of acutely ill, monitored, step-down unit patients and determine their Granger causal dynamics prior to onset of an incident CRI. CRI was defined as vital signs beyond stipulated normality thresholds (HR = 40-140/minute, RR = 8-36/minute, SpO2 < 85%) and persisting for 3 minutes within a 5-minute moving window (60% of the duration of the window). A 6-hour time segment prior to onset of first CRI was chosen for time series modeling in 20 patients using a six-step procedure: (a) the uniform time series for each vital sign was assessed for stationarity, (b) appropriate lag was determined using a lag-length selection criteria, (c) the VAR model was constructed, (d) residual autocorrelation was assessed with the Lagrange Multiplier test, (e) stability of the VAR system was checked, and (f) Granger causality was evaluated in the final stable model. The primary cause of incident CRI was low SpO2 (60% of cases), followed by out-of-range RR (30%) and HR (10%). Granger causality testing revealed that change in RR caused change in HR (21%; i.e., RR changed before HR changed) more often than change in HR causing change in RR (15%). Similarly, changes in RR caused changes in SpO2 (15%) more often than changes in SpO2 caused changes in RR (9%). 
For HR and SpO2, changes in HR causing changes in SpO2 and changes in SpO2 causing changes in HR occurred with equal frequency (18%). Within this sample of acutely ill patients who experienced a CRI event, VAR modeling indicated that RR changes tend to occur before changes in HR and SpO2. These findings suggest that contextual assessment of RR changes as the earliest sign of CRI is warranted. Use of VAR modeling may be helpful in other nursing research applications based on time series data.
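The core of step (f) above, a pairwise Granger test, can be sketched with ordinary least squares alone: compare a restricted AR model of the effect series against an unrestricted model that adds a lag of the candidate cause. The synthetic series, coefficients, and thresholds below are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic pair in which x drives y one step later (x Granger-causes y).
n = 2000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t-1] + rng.standard_normal()
    y[t] = 0.3 * y[t-1] + 0.8 * x[t-1] + rng.standard_normal()

def rss(target, regressors):
    """Residual sum of squares of an OLS fit with an intercept term."""
    X = np.column_stack([np.ones(len(target))] + regressors)
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    r = target - X @ beta
    return float(r @ r)

def granger_f(cause, effect):
    """F-statistic for adding one lag of `cause` to an AR(1) model of `effect`."""
    restricted = rss(effect[1:], [effect[:-1]])
    unrestricted = rss(effect[1:], [effect[:-1], cause[:-1]])
    dof = len(effect) - 1 - 3          # observations minus estimated parameters
    return (restricted - unrestricted) / (unrestricted / dof)

f_x_to_y = granger_f(x, y)   # expected large: past x helps predict y
f_y_to_x = granger_f(y, x)   # expected small: past y adds little for x
```

In practice the study's workflow (stationarity checks, lag selection, residual diagnostics, stability) would be handled by a statistics package such as statsmodels; the sketch only isolates the causality comparison.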
3-D ultrasound volume reconstruction using the direct frame interpolation method.
Scheipers, Ulrich; Koptenko, Sergei; Remlinger, Rachel; Falco, Tony; Lachaine, Martin
2010-11-01
A new method for 3-D ultrasound volume reconstruction using tracked freehand 3-D ultrasound is proposed. The method is based on solving the forward volume reconstruction problem using direct interpolation of high-resolution ultrasound B-mode image frames. A series of ultrasound B-mode image frames (an image series) is acquired using the freehand scanning technique and position sensing via optical tracking equipment. The proposed algorithm creates additional intermediate image frames by directly interpolating between two or more adjacent image frames of the original image series. The target volume is filled using the original frames in combination with the additionally constructed frames. Compared with conventional volume reconstruction methods, no additional filling of empty voxels or holes within the volume is required, because the whole extent of the volume is defined by the arrangement of the original and the additionally constructed B-mode image frames. The proposed direct frame interpolation (DFI) method was tested on two different data sets acquired while scanning the head and neck region of different patients. The first data set consisted of eight B-mode 2-D frame sets acquired under optimal laboratory conditions. The second data set consisted of 73 image series acquired during a clinical study. Sample volumes were reconstructed for all 81 image series using the proposed DFI method with four different interpolation orders, as well as with the pixel nearest-neighbor method using three different interpolation neighborhoods. In addition, volumes based on a reduced number of image frames were reconstructed for comparison of the different methods' accuracy and robustness in reconstructing image data that lies between the original image frames. 
The DFI method is based on a forward approach making use of a priori information about the position and shape of the B-mode image frames (e.g., masking information) to optimize the reconstruction procedure and to reduce computation times and memory requirements. The method is straightforward, independent of additional input or parameters, and uses the high-resolution B-mode image frames instead of usually lower-resolution voxel information for interpolation. The DFI method can be considered as a valuable alternative to conventional 3-D ultrasound reconstruction methods based on pixel or voxel nearest-neighbor approaches, offering better quality and competitive reconstruction time.
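The full DFI pipeline uses tracked frame positions and masking information, but the core intensity step, creating intermediate frames between two acquired B-mode frames, reduces to weighted averaging. The toy 4x4 frames below are illustrative assumptions:

```python
import numpy as np

def interpolate_frames(f0, f1, n_intermediate):
    """Linearly interpolate n_intermediate frames strictly between f0 and f1."""
    weights = np.linspace(0.0, 1.0, n_intermediate + 2)[1:-1]
    return [(1.0 - w) * f0 + w * f1 for w in weights]

# Two adjacent toy B-mode frames with constant intensities 0 and 10.
frame_a = np.zeros((4, 4))
frame_b = np.full((4, 4), 10.0)
mid_frames = interpolate_frames(frame_a, frame_b, 3)  # intensities 2.5, 5.0, 7.5
```

In the actual method the interpolated frames would also be positioned in the volume according to the interpolated tracking poses, so the original and constructed frames together fill the target voxel grid without a separate hole-filling pass.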
Nowcasting influenza outbreaks using open-source media report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ray, Jaideep; Brownstein, John S.
We construct and verify a statistical method to nowcast influenza activity from a time series of the frequency of reports concerning influenza-related topics. Such reports are published electronically by both public health organizations and newspapers/media sources, and thus can be harvested easily via web crawlers. Since media reports are timely, whereas reports from public health organizations are delayed by at least two weeks, using timely, open-source data to compensate for the lag in "official" reports can be useful. We use morbidity data from networks of sentinel physicians (both the Centers for Disease Control and Prevention's ILINet and France's Sentinelles network) as the gold standard of influenza-like illness (ILI) activity. The time series of media reports is obtained from HealthMap (http://healthmap.org). We find that the time series of media reports shows some correlation (~0.5) with ILI activity; further, this can be leveraged into an autoregressive moving average model with exogenous inputs (ARMAX model) to nowcast ILI activity. We find that the ARMAX models have more predictive skill compared to autoregressive (AR) models fitted to ILI data, i.e., it is possible to exploit the information content in the open-source data. We also find that when the open-source data are non-informative, the ARMAX models reproduce the performance of AR models. The statistical models are tested on data from the 2009 swine-flu outbreak as well as the mild 2011-2012 influenza season in the U.S.A.
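The nowcasting idea can be sketched, dropping the moving-average term, as an ARX regression: today's ILI activity is fitted on its own past plus the contemporaneous media signal. The synthetic series and coefficients below are illustrative assumptions, not HealthMap or ILINet data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy ILI series driven by its own past plus an observable media signal.
n = 500
media = rng.standard_normal(n)
ili = np.zeros(n)
for t in range(1, n):
    ili[t] = 0.6 * ili[t-1] + 0.5 * media[t] + 0.2 * rng.standard_normal()

# ARX(1) nowcast fitted by least squares: ili[t] ~ a*ili[t-1] + b*media[t] + c.
X = np.column_stack([ili[:-1], media[1:], np.ones(n - 1)])
coef, *_ = np.linalg.lstsq(X, ili[1:], rcond=None)
a_hat, b_hat, _ = coef
```

When the exogenous coefficient b is indistinguishable from zero, the fit collapses to a plain AR model, mirroring the paper's finding that ARMAX reproduces AR performance on non-informative media data.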
School Construction Defies Fiscal Doldrums
ERIC Educational Resources Information Center
Sack, Joetta L.
2004-01-01
This paper constitutes the first of a three part series examining the boom in the construction and renovation of K-12 schools and the continuing challenges that communities face in getting the facilities their students and educators need. Part 1 reports on the increase of school construction and renovation that began in 2002, despite a sagging…
NASA Astrophysics Data System (ADS)
Forootan, Ehsan; Kusche, Jürgen; Talpe, Matthieu; Shum, C. K.; Schmidt, Michael
2017-12-01
In recent decades, decomposition techniques have enabled increasingly more applications for dimension reduction, as well as extraction of additional information from geophysical time series. Traditionally, the principal component analysis (PCA)/empirical orthogonal function (EOF) method and more recently the independent component analysis (ICA) have been applied to extract statistically orthogonal (uncorrelated) and independent modes, respectively, that represent the maximum variance of the time series. PCA and ICA can be classified as stationary signal decomposition techniques since they are based on decomposing the autocovariance matrix and diagonalizing higher (than two) order statistical tensors from centered time series, respectively. However, the stationarity assumption in these techniques is not justified for many geophysical and climate variables even after removing cyclic components, e.g., the commonly removed dominant seasonal cycles. In this paper, we present a novel decomposition method, the complex independent component analysis (CICA), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA, where (a) we first define a new complex dataset that contains the observed time series in its real part and their Hilbert-transformed series as its imaginary part, (b) an ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex dataset in (a), and finally, (c) the dominant independent complex modes are extracted and used to represent the dominant space and time amplitudes and associated phase propagation patterns. The performance of CICA is examined by analyzing synthetic data constructed from multiple physically meaningful modes in a simulation framework with known truth.
Next, global terrestrial water storage (TWS) data from the Gravity Recovery And Climate Experiment (GRACE) gravimetry mission (2003-2016), and satellite radiometric sea surface temperature (SST) data (1982-2016) over the Atlantic and Pacific Oceans are used with the aim of demonstrating signal separations of the North Atlantic Oscillation (NAO) from the Atlantic Multi-decadal Oscillation (AMO), and the El Niño Southern Oscillation (ENSO) from the Pacific Decadal Oscillation (PDO). CICA results indicate that ENSO-related patterns can be extracted from the Gravity Recovery And Climate Experiment Terrestrial Water Storage (GRACE TWS) with an accuracy of 0.5-1 cm in terms of equivalent water height (EWH). The magnitude of errors in extracting NAO or AMO from SST data using the complex EOF (CEOF) approach reaches up to 50% of the signal itself, while it is reduced to 16% when applying CICA. Larger errors with magnitudes of 100% and 30% of the signal itself are found while separating ENSO from PDO using CEOF and CICA, respectively. We thus conclude that the CICA is more effective than CEOF in separating non-stationary patterns.
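Step (a) of the CICA construction, forming a complex series whose imaginary part is the Hilbert transform of the observed series, can be sketched with an FFT-based analytic signal; the fourth-order-cumulant ICA of step (b) is omitted here. The pure-cosine test signal is an illustrative assumption:

```python
import numpy as np

def analytic_signal(x):
    """Complex analytic signal: real part is x, imaginary part its Hilbert transform."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0          # double positive frequencies, zero negatives
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

# For cos(2*pi*5*t) sampled over whole periods, the analytic signal is exp(i*2*pi*5*t).
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
z = analytic_signal(np.cos(2 * np.pi * 5 * t))
```

The instantaneous amplitude |z| and phase angle of such complex series are what the subsequent complex ICA decomposes into space-time amplitude and phase-propagation patterns.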
Series Transmission Line Transformer
Buckles, Robert A.; Booth, Rex; Yen, Boris T.
2004-06-29
A series transmission line transformer is set forth which includes two or more impedance-matched sets of at least two transmission lines, such as shielded cables, connected in parallel at one end and in series at the other in a cascading fashion. The cables are wound about a magnetic core. The series transmission line transformer (STLT) can provide higher impedance ratios and bandwidths, is scalable, and is of simpler design and construction.
Optimization of power rationing order based on fuzzy evaluation model
NASA Astrophysics Data System (ADS)
Zhang, Siyuan; Liu, Li; Xie, Peiyuan; Tang, Jihong; Wang, Canlin
2018-04-01
With the development of production and economic growth, China's electricity load has increased significantly. Over the years, to alleviate the contradiction of power shortage, China has adopted a series of policies and measures to accelerate electric power construction, which has promoted rapid development of the power industry and led to great achievements in power construction. After large-scale construction of power facilities, the grid's long-term power shortage has improved to some extent, but over certain periods power development remains uneven. On the whole, supply is still insufficient, and power rationing remains severe in some areas, so it is necessary to study the power rationing order.
Alternative methods of flexible base compaction acceptance.
DOT National Transportation Integrated Search
2012-05-01
In the Texas Department of Transportation, flexible base construction is governed by a series of stockpile : and field tests. A series of concerns with these existing methods, along with some premature failures in the : field, led to this project inv...
Measurement invariance, the lack thereof, and modeling change.
Edwards, Michael C; Houts, Carrie R; Wirth, R J
2017-08-17
Measurement invariance issues should be considered during test construction. In this paper, we provide a conceptual overview of measurement invariance and describe how the concept is implemented in several different statistical approaches. Typical applications look for invariance over things such as mode of administration (paper and pencil vs. computer based), language/translation, age, time, and gender, to cite just a few examples. To the extent that the relationships between items and constructs are stable/invariant, we can be more confident in score interpretations. A series of simulated examples are reported which highlight different kinds of non-invariance, the impact it can have, and the effect of appropriately modeling a lack of invariance. One example focuses on the longitudinal context, where measurement invariance is critical to understanding trends over time. Software syntax is provided to help researchers apply these models with their own data. The simulation studies demonstrate the negative impact an erroneous assumption of invariance may have on scores and substantive conclusions drawn from naively analyzing those scores. Measurement invariance implies that the links between the items and the construct of interest are invariant over some domain, grouping, or classification. Examining a new or existing test for measurement invariance should be part of any test construction/implementation plan. In addition to reviewing implications of the simulation study results, we also provide a discussion of the limitations of current approaches and areas in need of additional research.
A voxel-based finite element model for the prediction of bladder deformation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chai Xiangfei; Herk, Marcel van; Hulshof, Maarten C. C. M.
2012-01-15
Purpose: A finite element (FE) bladder model was previously developed to predict bladder deformation caused by bladder filling change. However, two factors prevent a wide application of FE models: (1) the labor required to construct an FE model with a high-quality mesh and (2) the long computation time needed to construct the FE model and solve the FE equations. In this work, we address these issues by constructing a low-resolution voxel-based FE bladder model directly from the binary segmentation images and compare the accuracy and computational efficiency of the voxel-based model used to simulate bladder deformation with those of a classical FE model with a tetrahedral mesh. Methods: For ten healthy volunteers, a series of MRI scans of the pelvic region was recorded at regular intervals of 10 min over 1 h. For this series of scans, the bladder volume gradually increased while rectal volume remained constant. All pelvic structures were defined from a reference image for each volunteer, including bladder wall, small bowel, prostate (male), uterus (female), rectum, pelvic bone, spine, and the rest of the body. Four separate FE models were constructed from these structures: one with a tetrahedral mesh (used in previous study), one with a uniform hexahedral mesh, one with a nonuniform hexahedral mesh, and one with a low-resolution nonuniform hexahedral mesh. Appropriate material properties were assigned to all structures and uniform pressure was applied to the inner bladder wall to simulate bladder deformation from urine inflow. Performance of the hexahedral meshes was evaluated against the performance of the standard tetrahedral mesh by comparing the accuracy of bladder shape prediction and computational efficiency. Results: FE model with a hexahedral mesh can be quickly and automatically constructed.
No substantial differences were observed between the simulation results of the tetrahedral mesh and hexahedral meshes (<1% difference in mean Dice similarity coefficient to manual contours and <0.02 cm difference in mean standard deviation of residual errors). The average equation solving time (without manual intervention) for the first two types of hexahedral meshes increased to 2.3 h and 2.6 h compared to the 1.1 h needed for the tetrahedral mesh; however, the low-resolution nonuniform hexahedral mesh dramatically decreased the equation solving time to 3 min without reducing accuracy. Conclusions: Voxel-based mesh generation allows fast, automatic, and robust creation of finite element bladder models directly from binary segmentation images without user intervention. Even the low-resolution voxel-based hexahedral mesh yields comparable accuracy in bladder shape prediction and is more than 20 times faster in computational speed than the tetrahedral mesh. This approach makes it more feasible and accessible to apply the FE method to model bladder deformation in adaptive radiotherapy.
Degradation data analysis based on a generalized Wiener process subject to measurement error
NASA Astrophysics Data System (ADS)
Li, Junxing; Wang, Zhihua; Zhang, Yongbo; Fu, Huimin; Liu, Chengrui; Krishnaswamy, Sridhar
2017-09-01
Wiener processes have received considerable attention in degradation modeling over the last two decades. In this paper, we propose a generalized Wiener process degradation model that takes unit-to-unit variation, time-correlated structure, and measurement error into consideration simultaneously. The constructed methodology subsumes a series of models studied in the literature as limiting cases. A simple method is given to determine the transformed time scale forms of the Wiener process degradation model. Then model parameters can be estimated based on a maximum likelihood estimation (MLE) method. The cumulative distribution function (CDF) and the probability density function (PDF) of the Wiener process with measurement errors are given based on the concept of the first hitting time (FHT). The percentiles of performance degradation (PD) and failure time distribution (FTD) are also obtained. Finally, a comprehensive simulation study is conducted to demonstrate the necessity of incorporating measurement errors in the degradation model and the efficiency of the proposed model. Two illustrative real applications involving the degradation of carbon-film resistors and the wear of sliding metal are given. The comparative results show that the constructed approach yields reasonable results with enhanced inference precision.
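As a rough illustration of the model class (not the authors' estimation procedure), the sketch below simulates degradation paths with a transformed time scale Λ(t) = t^q, random unit-to-unit drift, Brownian variation, and additive measurement error; all parameter values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Generalized Wiener degradation: X(t) = eta * L(t) + sigma_b * B(L(t)),
# observed as Y(t) = X(t) + measurement error, with time scale L(t) = t**q.
n_units, n_obs, q = 200, 50, 1.2
t = np.linspace(0.1, 5.0, n_obs)
L = t ** q                                   # transformed time scale
dL = np.diff(L, prepend=0.0)                 # increments of the time scale
eta = rng.normal(2.0, 0.3, size=n_units)     # unit-to-unit drift variation
sigma_b, sigma_eps = 0.4, 0.1

paths = np.empty((n_units, n_obs))
for i in range(n_units):
    # Independent Gaussian increments of a Wiener process with drift eta[i],
    # then additive measurement error on each observation.
    inc = rng.normal(eta[i] * dL, sigma_b * np.sqrt(dL))
    paths[i] = np.cumsum(inc) + rng.normal(0.0, sigma_eps, n_obs)

mean_final = paths[:, -1].mean()   # should approach mean drift * L(t_max)
```

Fitting the model to such data (recovering q, the drift distribution, sigma_b, and sigma_eps by MLE) and computing first-hitting-time percentiles are the steps the paper develops.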
Kitayama, Tomoya; Kinoshita, Ayako; Sugimoto, Masahiro; Nakayama, Yoichi; Tomita, Masaru
2006-07-17
In order to improve understanding of metabolic systems there have been attempts to construct S-system models from time courses. Conventionally, non-linear curve-fitting algorithms have been used for modelling, because of the non-linear properties of parameter estimation from time series. However, the huge iterative calculations required have hindered the development of large-scale metabolic pathway models. To solve this problem we propose a novel method involving power-law modelling of metabolic pathways from the Jacobian of the targeted system and the steady-state flux profiles by linearization of S-systems. The results of two case studies modelling a straight and a branched pathway, respectively, showed that our method reduced the number of unknown parameters needing to be estimated. The time-courses simulated by conventional kinetic models and those described by our method behaved similarly under a wide range of perturbations of metabolite concentrations. The proposed method reduces calculation complexity and facilitates the construction of large-scale S-system models of metabolic pathways, realizing a practical application of reverse engineering of dynamic simulation models from the Jacobian of the targeted system and steady-state flux profiles.
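An S-system represents each metabolite's rate of change as a difference of two power-law terms. A minimal two-metabolite example, with illustrative parameter values and simple Euler integration rather than the linearization method described above, looks like:

```python
import numpy as np

# Minimal two-metabolite S-system (power-law) model:
#   dX1/dt = a1            - b1 * X1**h11
#   dX2/dt = a2 * X1**g21  - b2 * X2**h22
a1, b1, h11 = 2.0, 1.0, 1.0
a2, b2, g21, h22 = 1.5, 0.5, 0.8, 1.0
dt, n = 0.01, 3000

X = np.empty((n, 2))
X[0] = [0.5, 0.5]
for i in range(1, n):
    x1, x2 = X[i-1]
    dx1 = a1 - b1 * x1 ** h11
    dx2 = a2 * x1 ** g21 - b2 * x2 ** h22
    X[i] = [x1 + dt * dx1, x2 + dt * dx2]    # Euler step
```

At steady state the production and degradation terms balance, giving X1* = (a1/b1)^(1/h11) = 2 and X2* = (a2/b2) * X1*^g21; the paper's contribution is estimating the kinetic orders (g, h) and rate constants (a, b) from the Jacobian and steady-state fluxes instead of nonlinear curve fitting.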
Reconstructing biochemical pathways from time course data.
Srividhya, Jeyaraman; Crampin, Edmund J; McSharry, Patrick E; Schnell, Santiago
2007-03-01
Time series data on biochemical reactions reveal transient behavior, away from chemical equilibrium, and contain information on the dynamic interactions among reacting components. However, this information can be difficult to extract using conventional analysis techniques. We present a new method to infer biochemical pathway mechanisms from time course data using a global nonlinear modeling technique to identify the elementary reaction steps which constitute the pathway. The method involves the generation of a complete dictionary of polynomial basis functions based on the law of mass action. Using these basis functions, there are two approaches to model construction, namely the general to specific and the specific to general approach. We demonstrate that our new methodology reconstructs the chemical reaction steps and connectivity of the glycolytic pathway of Lactococcus lactis from time course experimental data.
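The dictionary-based reconstruction can be sketched for a one-step toy pathway A → B: simulate trajectories, build polynomial (mass-action) basis functions, and regress the numerical derivative onto the dictionary. The rate constant, initial conditions, and second-order dictionary are illustrative assumptions, not the glycolysis data of the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy pathway A -> B with mass-action rate k, so dA/dt = -k*A.
# Several initial conditions are simulated so the basis columns are not collinear.
k, dt, n_steps = 0.7, 0.01, 400
A_all, B_all = [], []
for _ in range(5):
    A = np.empty(n_steps)
    B = np.empty(n_steps)
    A[0], B[0] = rng.uniform(0.5, 2.0), rng.uniform(0.0, 2.0)
    for i in range(1, n_steps):
        A[i] = A[i-1] - k * A[i-1] * dt
        B[i] = B[i-1] + k * A[i-1] * dt
    A_all.append(A)
    B_all.append(B)

# Numerical derivative per trajectory, then pool all samples.
dA_dt = np.concatenate([np.gradient(a, dt) for a in A_all])
A = np.concatenate(A_all)
B = np.concatenate(B_all)

# Complete dictionary of polynomial (mass-action) basis functions up to order 2.
basis = np.column_stack([A, B, A * A, A * B, B * B])
coef, *_ = np.linalg.lstsq(basis, dA_dt, rcond=None)   # coef[0] should be near -k
```

A sparse fit like this identifies which elementary reaction terms are active (here, only the A column), which is the "specific" end of the general-to-specific model construction the abstract describes.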
ERIC Educational Resources Information Center
Heinecken, Dawn
2013-01-01
This essay follows the insights of reader response theory to examine how readers of Phyllis Reynolds Naylor's Alice McKinley series negotiate textual meaning and construct particular identities in relation to the series' controversial content. Ranking second on the American Library Association's top one hundred list of banned and challenged books…
PASMet: a web-based platform for prediction, modelling and analyses of metabolic systems
Sriyudthsak, Kansuporn; Mejia, Ramon Francisco; Arita, Masanori; Hirai, Masami Yokota
2016-01-01
PASMet (Prediction, Analysis and Simulation of Metabolic networks) is a web-based platform for proposing and verifying mathematical models to understand the dynamics of metabolism. The advantages of PASMet include user-friendliness and accessibility, which enable biologists and biochemists to easily perform mathematical modelling. PASMet offers a series of user-functions to handle the time-series data of metabolite concentrations. The functions are organised into four steps: (i) Prediction of a probable metabolic pathway and its regulation; (ii) Construction of mathematical models; (iii) Simulation of metabolic behaviours; and (iv) Analysis of metabolic system characteristics. Each function contains various statistical and mathematical methods that can be used independently. Users who may not have enough knowledge of computing or programming can easily and quickly analyse their local data without software downloads, updates or installations. Users only need to upload their files in comma-separated values (CSV) format or enter their model equations directly into the website. Once the time-series data or mathematical equations are uploaded, PASMet automatically performs computation on server-side. Then, users can interactively view their results and directly download them to their local computers. PASMet is freely available with no login requirement at http://pasmet.riken.jp/ from major web browsers on Windows, Mac and Linux operating systems. PMID:27174940
Liver DCE-MRI Registration in Manifold Space Based on Robust Principal Component Analysis.
Feng, Qianjin; Zhou, Yujia; Li, Xueli; Mei, Yingjie; Lu, Zhentai; Zhang, Yu; Feng, Yanqiu; Liu, Yaqin; Yang, Wei; Chen, Wufan
2016-09-29
A technical challenge in the registration of dynamic contrast-enhanced magnetic resonance (DCE-MR) imaging in the liver is intensity variations caused by contrast agents. Such variations lead to the failure of the traditional intensity-based registration method. To address this problem, a manifold-based registration framework for liver DCE-MR time series is proposed. We assume that liver DCE-MR time series are located on a low-dimensional manifold and determine intrinsic similarities between frames. Based on the obtained manifold, the large deformation of two dissimilar images can be decomposed into a series of small deformations between adjacent images on the manifold through gradual deformation of each frame to the template image along the geodesic path. Furthermore, manifold construction is important in automating the selection of the template image, which is an approximation of the geodesic mean. Robust principal component analysis is performed to separate motion components from intensity changes induced by contrast agents; the components caused by motion are used to guide registration in eliminating the effect of contrast enhancement. Visual inspection and quantitative assessment are further performed on clinical dataset registration. Experiments show that the proposed method effectively reduces movements while preserving the topology of contrast-enhancing structures and provides improved registration performance.
Purpose Designed Facilities. School Buildings Planning, Design, and Construction Series No. 4.
ERIC Educational Resources Information Center
Odell, John H.
A school construction guide offers key personnel in school development projects information on the complex task of master planning and construction of schools in Australia. This chapter of the guide provides advice on issues involving school space design and the kinds of spaces required in schools, including lecture spaces and seminar rooms,…
ERIC Educational Resources Information Center
Odell, John H.
A school construction guide offers key personnel in school development projects information on the complex task of master planning and construction of schools in Australia. This chapter of the guide provides advice on school building design issues, such as the fundamentals of good design and designs that accommodate change, issues affecting…
1987-11-24
of the assortment of manufactured parts for partial and complete frames, as well as abutments, support walls, and bridgehead construction...Uniform Series II Generation based on anticipated spans; and • Increased effectiveness of prefabrication for steel and masonry bridge construction...support structures and abutments. Parallel to and on an equal par with standard primary construction trades already cited, the scientific-technical
ERIC Educational Resources Information Center
Odell, John H.
A school construction guide offers key personnel in school development projects information on the complex task of master planning and construction of schools in Australia. This chapter of the guide provides advice on site selection covering selection criteria; traffic issues; and site services, such as water, power, and sewer. Additionally…
Planning for Safety on the Jobsite. Safety in Industry, Construction Industry Series.
ERIC Educational Resources Information Center
Occupational Safety and Health Administration, Washington, DC.
Work injuries and their monetary losses in the construction industry can be effectively prevented only through an aggressive and well-planned safety effort. The purpose of this bulletin is to provide guidelines to aid the construction contractor in complying with legal requirements and in attaining the objective of keeping costly accidents and…
NASA Astrophysics Data System (ADS)
Curme, Chester
Technological advances have provided scientists with large high-dimensional datasets that describe the behaviors of complex systems: from the statistics of energy levels in complex quantum systems, to the time-dependent transcription of genes, to price fluctuations among assets in a financial market. In this environment, where it may be difficult to infer the joint distribution of the data, network science has flourished as a way to gain insight into the structure and organization of such systems by focusing on pairwise interactions. This work focuses on a particular setting, in which a system is described by multivariate time series data. We consider time-lagged correlations among elements in this system, in such a way that the measured interactions among elements are asymmetric. Finally, we allow these interactions to be characteristically weak, so that statistical uncertainties may be important to consider when inferring the structure of the system. We introduce a methodology for constructing statistically validated networks to describe such a system, extend the methodology to accommodate interactions with a periodic component, and show how consideration of bipartite community structures in these networks can aid in the construction of robust statistical models. An example of such a system is a financial market, in which high frequency returns data may be used to describe contagion, or the spreading of shocks in price among assets. These data provide the experimental testing ground for our methodology. We study NYSE data from both the present day and one decade ago, examine the time scales over which the validated lagged correlation networks exist, and relate differences in the topological properties of the networks to an increasing economic efficiency. We uncover daily periodicities in the validated interactions, and relate our findings to explanations of the Epps Effect, an empirical phenomenon of financial time series. 
We also study bipartite community structures in networks composed of market returns and news sentiment signals for 40 countries. We compare the degrees to which markets anticipate news and news anticipates markets, and use the community structures to construct a recommender system for inputs to prediction models. Finally, we complement this work with novel investigations of the exogenous news items that may drive the financial system, using topic models. This includes an analysis of how investors and the general public interact with these news items, using Internet search data, and of how the diversity of stories in the news both responds to and influences market movements.
Ground-water conditions in Utah, spring of 2005
Burden, Carole B.; Allen, David V.; Danner, M.R.; Walzem, Vince; Cillessen, J.L.; Kenney, T.A.; Wilkowske, C.D.; Eacret, Robert J.; Downhour, Paul; Slaugh, B.A.; Swenson, R.L.; Howells, J.H.; Christiansen, H.K.; Fisher, M.J.
2005-01-01
This is the forty-second in a series of annual reports that describe ground-water conditions in Utah. Reports in this series, published cooperatively by the U.S. Geological Survey and the Utah Department of Natural Resources, Division of Water Resources and Division of Water Rights, provide data to enable interested parties to maintain awareness of changing ground-water conditions. This report, like the others in the series, contains information on well construction, ground-water withdrawal from wells, water-level changes, precipitation, streamflow, and chemical quality of water. Information on well construction included in this report refers only to wells constructed for new appropriations of ground water. Supplementary data are included in reports of this series only for those years or areas which are important to a discussion of changing ground-water conditions and for which applicable data are available. This report includes individual discussions of selected significant areas of ground-water development in the State for calendar year 2004. Most of the reported data were collected by the U.S. Geological Survey in cooperation with the Utah Department of Natural Resources, Division of Water Rights and Division of Water Resources. This report is available online at http://www.waterrights.utah.gov/techinfo/wwwpub/gw2005.pdf and http://ut.water.usgs.gov/publications/GW2005.pdf.
Rajaprasad, Sunku Venkata Siva; Chalapathi, Pasupulati Venkata
2015-01-01
Background Construction activity has made considerable breakthroughs in the past two decades on the back of increases in development activities, government policies, and public demand. At the same time, occupational health and safety issues have become a major concern to construction organizations. The unsatisfactory safety performance of the construction industry has always been highlighted, since the safety management system is a neglected area and is not implemented systematically in Indian construction organizations. Due to a lack of enforcement of the applicable legislation, most construction organizations are forced to opt for the implementation of Occupational Health Safety Assessment Series (OHSAS) 18001 to improve safety performance. Methods In order to better understand the factors influencing the implementation of OHSAS 18001, an interpretive structural modeling approach has been applied, and the factors have been classified using matrice d'impacts croisés-multiplication appliquée à un classement (MICMAC) analysis. The study proposes an underlying theoretical framework to identify these factors and to help the management of Indian construction organizations understand the interactions among the factors influencing the implementation of OHSAS 18001. Results Safety culture, continual improvement, morale of employees, and safety training were identified as dependent variables. Safety performance, sustainable construction, and a conducive working environment were identified as linkage variables. Management commitment and safety policy were identified as driver variables. Conclusion Management commitment has the maximum driving power, and the most influential factor is safety policy, which clearly states the commitment of top management towards occupational safety and health. PMID:26929828
Multiscale multifractal DCCA and complexity behaviors of return intervals for Potts price model
NASA Astrophysics Data System (ADS)
Wang, Jie; Wang, Jun; Stanley, H. Eugene
2018-02-01
To investigate the characteristics of extreme events in financial markets and the return intervals among these events, we use a Potts dynamic system to construct a random financial time series model of the attitudes of market traders. We use multiscale multifractal detrended cross-correlation analysis (MM-DCCA) and Lempel-Ziv complexity (LZC) to study numerically the return intervals of two major Chinese stock market indices and of the proposed model. The new MM-DCCA method is based on the Hurst surface and provides more interpretable cross-correlations of the dynamic mechanism between different return interval series. We scale the LZC method with different exponents to illustrate the complexity of return intervals at different scales. Empirical studies indicate that the return intervals from the Potts system and from the real stock market indices share similar statistical properties.
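The abstract does not spell out which LZC variant is used; as a rough, hypothetical sketch, the classic Lempel-Ziv (1976) phrase-counting complexity of a symbolized series can be computed as follows (binarizing around the median is one common symbolization choice, not necessarily the authors'):

```python
def lz76_complexity(s: str) -> int:
    """Number of phrases in the LZ76 parsing: a phrase is extended as long as
    it already occurs in the prefix seen so far, then a new phrase starts."""
    i, count, n = 0, 0, len(s)
    while i < n:
        length = 1
        while i + length <= n and s[i:i + length] in s[:i + length - 1]:
            length += 1
        count += 1
        i += length
    return count

def symbolize(series):
    """Binarize a series around its median (a simple, illustrative choice)."""
    med = sorted(series)[len(series) // 2]
    return "".join("1" if v > med else "0" for v in series)

# a periodic series parses into far fewer phrases than an irregular one
assert lz76_complexity("0101010101010101") < lz76_complexity("0110100110010110")
```

Comparing such counts across rescaled versions of a series gives a crude picture of how complexity varies with scale, which is the role LZC plays in the study.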
Relativistic Fluid Dynamics Far From Local Equilibrium
NASA Astrophysics Data System (ADS)
Romatschke, Paul
2018-01-01
Fluid dynamics is traditionally thought to apply only to systems near local equilibrium. In this case, the effective theory of fluid dynamics can be constructed as a gradient series. Recent applications of resurgence suggest that this gradient series diverges, but can be Borel resummed, giving rise to a hydrodynamic attractor solution which is well defined even for large gradients. Arbitrary initial data quickly approaches this attractor via nonhydrodynamic mode decay. This suggests the existence of a new theory of far-from-equilibrium fluid dynamics. In this Letter, the framework of fluid dynamics far from local equilibrium for a conformal system is introduced, and the hydrodynamic attractor solutions for resummed Baier-Romatschke-Son-Starinets-Stephanov theory, kinetic theory in the relaxation time approximation, and strongly coupled N =4 super Yang-Mills theory are identified for a system undergoing Bjorken flow.
Shaping low-thrust trajectories with thrust-handling feature
NASA Astrophysics Data System (ADS)
Taheri, Ehsan; Kolmanovsky, Ilya; Atkins, Ella
2018-02-01
Shape-based methods are becoming popular in low-thrust trajectory optimization due to their fast computation speeds. In existing shape-based methods constraints are treated at the acceleration level but not at the thrust level. These two constraint types are not equivalent since spacecraft mass decreases over time as fuel is expended. This paper develops a shape-based method based on a Fourier series approximation that is capable of representing trajectories defined in spherical coordinates and that enforces thrust constraints. An objective function can be incorporated to minimize overall mission cost, i.e., achieve minimum ΔV . A representative mission from Earth to Mars is studied. The proposed Fourier series technique is demonstrated capable of generating feasible and near-optimal trajectories. These attributes can facilitate future low-thrust mission designs where different trajectory alternatives must be rapidly constructed and evaluated.
2013-01-01
Background Matching pursuit algorithm (MP), especially with recent multivariate extensions, offers unique advantages in analysis of EEG and MEG. Methods We propose a novel construction of an optimal Gabor dictionary, based upon the metrics introduced in this paper. We implement this construction in freely available software for MP decomposition of multivariate time series, with a user friendly interface via the Svarog package (Signal Viewer, Analyzer and Recorder On GPL, http://braintech.pl/svarog), and provide a hands-on introduction to its application to EEG. Finally, we describe numerical and mathematical optimizations used in this implementation. Results Optimal Gabor dictionaries, based on the metric introduced in this paper, for the first time allowed for a priori assessment of the maximum one-step error of the MP algorithm. Variants of multivariate MP, implemented in the accompanying software, are organized according to the mathematical properties of the algorithms, relevant in the light of EEG/MEG analysis. Some of these variants have been successfully applied to both multichannel and multitrial EEG and MEG in previous studies, improving preprocessing for EEG/MEG inverse solutions and parameterization of evoked potentials in single trials; we mention also ongoing work and possible novel applications. Conclusions Mathematical results presented in this paper improve our understanding of the basics of the MP algorithm. A simple introduction to its properties and advantages, together with the accompanying stable and user-friendly Open Source software package, paves the way for a widespread and reproducible analysis of multivariate EEG and MEG time series and novel applications, while retaining a high degree of compatibility with the traditional, visual analysis of EEG. PMID:24059247
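The multivariate variants and optimal Gabor dictionaries require the accompanying software, but the core greedy loop of matching pursuit can be sketched in a few lines; the toy orthonormal dictionary below is purely illustrative and much simpler than a Gabor dictionary:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matching_pursuit(signal, atoms, n_iter):
    """Greedy MP: repeatedly pick the unit-norm atom most correlated with the
    residual, record its coefficient, and subtract its projection."""
    residual = list(signal)
    picks = []
    for _ in range(n_iter):
        best = max(range(len(atoms)), key=lambda k: abs(dot(residual, atoms[k])))
        coeff = dot(residual, atoms[best])
        residual = [r - coeff * a for r, a in zip(residual, atoms[best])]
        picks.append((best, coeff))
    return picks, residual

# toy dictionary: the standard basis of R^3 (orthonormal, so MP is exact here)
atoms = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
picks, residual = matching_pursuit([3.0, 0.0, 1.0], atoms, 2)
```

With an overcomplete, non-orthogonal dictionary the same loop applies, but the decomposition is no longer exact after finitely many steps, which is where dictionary optimality matters.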
NASA Astrophysics Data System (ADS)
Maria, Radoane; Constantin, Nechita; Francisca, Chiriloaei; Nicolae, Radoane
2016-04-01
In this paper we analyze climatic and hydrological records from 29 meteorological stations and 48 hydrometric stations, all located within two large drainage basins (Siret and Prut) in the eastern part of Romania. These records are complemented by information obtained from more than 20 dendrochronological series collected across the two studied basins. To correlate radial tree-ring growth with climate, we used climatic data from the National Climatic Grid with a spatial resolution of 10x10 km. The climatic information derived from the dendrochronological series complements the data from the meteorological stations, especially since some of the trees are old enough to cover periods before 1900. Annual series of the variables temperature (T), precipitation (P), number of days per year with sunshine (S), water discharge (Qw) and suspended sediment load (Qs) cover the interval 1950-2010. These variables form a well-known cascade of influence: precipitation drives runoff, which in turn drives erosion. The last link in this chain, the sediment transported in channels and its movement from source to delivery, is a key issue of dynamic geomorphology. The rhythm of this process may change dramatically, signaling major changes in the landform domain (climate change or human intervention). Our analysis of the climatic, hydrological and dendrochronological time series aims to answer the following questions: What is the spatial variability of the behavior of these series? Can we identify areas where the series share similar features? What factors complicate the cascade transmission of variability P → Qw → Qs? Can we identify common thresholds of change in the series, and what are their causes? In the competition between natural and human drivers, can we quantify their respective weights for the considered series in this geographic space?
Using dendrochronological analysis we establish years with exceptional hydrological events prior to the period of instrumental records. The resulting data are modeled against the known records in order to reconstruct water discharge (Qw) and suspended sediment load (Qs). The study covers the eastern part of Romania, i.e. the hydrographic basins of the Siret and Prut, an area of more than 43,000 km2. Its geological structure, relief and land use offer the potential to better understand the spatial variability of natural processes over the last six decades. A number of human interventions have also occurred within the Siret and Prut drainage basins (dam construction, channelization, sediment mining and deforestation), primarily during the past century. The history of human intervention in the area began during the twentieth century with the construction of bank protection structures, especially following the catastrophic flood events of 1970-1975. Natural reforestation, which followed several centuries of intense deforestation, has been most intense from the 1950s onward.
Gallium arsenide (GaAs) solar cell modeling studies
NASA Technical Reports Server (NTRS)
Heinbockel, J. H.
1980-01-01
Various models were constructed which allow for the variation of system components. Computer studies were then performed using these models in order to study the effects of various system changes. In particular, GaAs and Si flat-plate solar power arrays were studied and compared. Series and shunt resistance models were constructed, and models for the chemical kinetics of the annealing process were prepared. For all models constructed, various parametric studies were performed.
Seismic vulnerability of new highway construction, executive summary.
DOT National Transportation Integrated Search
2002-03-01
This executive summary gives an overview of the results of FHWA Contract DTFH61-92-C-00112, Seismic Research Program, which performed a series of special studies addressing the seismic design of new construction. The objectives of this project we...
NASA Astrophysics Data System (ADS)
Aljoumani, Basem; Kluge, Björn; sanchez, Josep; Wessolek, Gerd
2017-04-01
Highways and main roads are potential sources of contamination for the surrounding environment. High traffic rates result in elevated heavy metal concentrations in road runoff, soil and water seepage, which has attracted much attention in the recent past. Predicting the transfer of heavy metals from the roadside into deeper soil layers is very important for preventing groundwater pollution. This study was carried out on data from a number of lysimeters installed along the A115 highway (Germany), which carries a mean daily traffic of 90,000 vehicles per day. Three polyethylene (PE) lysimeters were installed at the A115 highway, each 150 cm long, 100 cm wide and 60 cm high. The lysimeters were filled with different soil materials recently used for embankment construction in Germany. With the obtained data, we will develop a time series analysis model to predict total and dissolved metal concentrations in road runoff and in the soil solution of the roadside embankments. The time series consists of monthly measurements of heavy metals and was transformed to stationarity. Subsequently, the transformed data will be analyzed in the time domain in order to obtain the parameters of a seasonal autoregressive integrated moving average (ARIMA) model, following the four-phase approach for identifying and fitting ARIMA models: identification, parameter estimation, diagnostic checking, and forecasting. An automatic selection criterion, such as the Akaike information criterion, will be used to enhance this flexible approach to model building.
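A full seasonal ARIMA fit needs a statistics package, but the identification/estimation/selection cycle with an automatic AIC criterion can be illustrated on a plain AR model; the synthetic series, candidate orders and AIC form below are illustrative assumptions, not the study's data:

```python
import math
import random

def gauss_solve(A, y):
    """Solve A b = y by Gaussian elimination with partial pivoting."""
    n = len(y)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    b = [0.0] * n
    for i in range(n - 1, -1, -1):
        b[i] = (M[i][n] - sum(M[i][j] * b[j] for j in range(i + 1, n))) / M[i][i]
    return b

def fit_ar(x, p):
    """Conditional least-squares AR(p) fit; returns (coefficients, RSS)."""
    n = len(x)
    XtX = [[sum(x[t - i - 1] * x[t - j - 1] for t in range(p, n)) for j in range(p)]
           for i in range(p)]
    Xty = [sum(x[t - i - 1] * x[t] for t in range(p, n)) for i in range(p)]
    b = gauss_solve(XtX, Xty)
    rss = sum((x[t] - sum(b[i] * x[t - i - 1] for i in range(p))) ** 2
              for t in range(p, n))
    return b, rss

def aic(rss, n, k):
    # Gaussian-likelihood AIC up to an additive constant
    return n * math.log(rss / n) + 2 * k

# identification/estimation/selection loop on a synthetic AR(1) series
rng = random.Random(1)
x = [0.0]
for _ in range(400):
    x.append(0.7 * x[-1] + rng.gauss(0.0, 1.0))
scores = {p: aic(fit_ar(x, p)[1], len(x) - p, p) for p in (1, 2, 3)}
best_p = min(scores, key=scores.get)
```

The diagnostic-checking phase would then inspect the residuals of the selected model before it is used for forecasting.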
A multi-temporal analysis approach for land cover mapping in support of nuclear incident response
NASA Astrophysics Data System (ADS)
Sah, Shagan; van Aardt, Jan A. N.; McKeown, Donald M.; Messinger, David W.
2012-06-01
Remote sensing can be used to rapidly generate land use maps for assisting emergency response personnel with resource deployment decisions and impact assessments. In this study we focus on constructing accurate land cover maps of the impacted area in the case of a nuclear material release. The proposed methodology integrates results from two different approaches to increase classification accuracy. The data used included RapidEye scenes over the Nine Mile Point Nuclear Power Station (Oswego, NY). The first step was building a coarse-scale land cover map from freely available, high temporal resolution MODIS data using a time-series approach. In the case of a nuclear accident, high spatial resolution commercial satellites such as RapidEye or IKONOS can acquire images of the affected area. Land use maps from the two image sources were integrated using a probability-based approach. Classification results were obtained for four land classes - forest, urban, water and vegetation - using Euclidean and Mahalanobis distances as metrics. Despite the coarse resolution of MODIS pixels, acceptable accuracies were obtained using time series features. The overall accuracies using the fusion-based approach were in the neighborhood of 80% when compared with GIS data sets from New York State. The classifications were improved by this fused approach, with a few supplementary advantages such as correction for cloud cover and independence from time of year. We conclude that this method can generate highly accurate land maps using coarse spatial resolution time series satellite imagery and a single-date, high spatial resolution, multi-spectral image.
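The abstract does not detail the exact probability-based integration rule; one simple option, shown here with invented per-pixel class probabilities, is an independence-assuming product rule over the four classes:

```python
def fuse_probabilities(p_coarse, p_fine):
    """Combine per-class probabilities from two classifiers by a product rule
    (assumes the two sources err independently), then renormalize."""
    raw = {c: p_coarse[c] * p_fine[c] for c in p_coarse}
    total = sum(raw.values())
    return {c: v / total for c, v in raw.items()}

# hypothetical per-pixel outputs from a MODIS time-series classifier and a
# single-date RapidEye classifier over the four classes used in the study
p_modis = {"forest": 0.50, "urban": 0.20, "water": 0.10, "vegetation": 0.20}
p_rapideye = {"forest": 0.40, "urban": 0.40, "water": 0.05, "vegetation": 0.15}
fused = fuse_probabilities(p_modis, p_rapideye)
label = max(fused, key=fused.get)
```

Any monotone combination of the two posteriors would serve the same purpose; the product rule simply sharpens agreement between the sources.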
Brief history of US debt limits before 1939.
Hall, George J; Sargent, Thomas J
2018-03-20
Between 1776 and 1920, the US Congress designed more than 200 distinct securities and stated the maximum amount of each that the Treasury could sell. Between 1917 and 1939, Congress gradually delegated all decisions about designing US debt instruments to the Treasury. In 1939, Congress began imposing a limit on the par value of total federal debt outstanding. By summing Congressional borrowing authorizations outstanding each year for each bond, we construct a time series of implied federal debt limits before 1939. Copyright © 2018 the Author(s). Published by PNAS.
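The paper's construction of implied limits, summing the authorizations outstanding each year across securities, can be sketched with toy, invented securities (not the actual instruments from the study):

```python
def implied_debt_limits(authorizations):
    """Sum the authorization ceilings outstanding in each year.
    Each entry is (security, first_year, last_year, max_issuable)."""
    limits = {}
    for _security, first, last, cap in authorizations:
        for year in range(first, last + 1):
            limits[year] = limits.get(year, 0) + cap
    return limits

# toy, invented authorizations for illustration only
auths = [("Bond A", 1917, 1919, 100), ("Bond B", 1918, 1920, 50)]
limits = implied_debt_limits(auths)
```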
NASA Technical Reports Server (NTRS)
Belton, Michael J. S.; Mueller, Beatrice
1991-01-01
The scientific objectives were as follows: (1) to construct a well sampled photometric time series of comet Halley extending to large heliocentric distances both post and pre-perihelion passage and derive a precise ephemeris for the nuclear spin so that the physical and chemical characteristics of individual regions of activity on the nucleus can be determined; and (2) to extend the techniques in the study of Comet Halley to the study of other cometary nuclei and to obtain new observational data.
NASA Astrophysics Data System (ADS)
Shaulov, S. B.; Besshapov, S. P.; Kabanova, N. V.; Sysoeva, T. I.; Antonov, R. A.; Anyuhina, A. M.; Bronvech, E. A.; Chernov, D. V.; Galkin, V. I.; Tkaczyk, W.; Finger, M.; Sonsky, M.
2009-12-01
The expedition carried out in March 2008 to Lake Baikal became an important stage in the development of the SPHERE experiment. During the expedition the SPHERE-2 installation was hoisted, for the first time, on a tethered balloon, APA, to a height of 700 m over the lake surface covered with ice and snow. A series of test measurements was made, and preliminary results of the data processing are presented. The next stage of the SPHERE experiment is to begin accumulating statistics for constructing the CR spectrum in the energy range 10-10 eV.
Smooth information flow in temperature climate network reflects mass transport
NASA Astrophysics Data System (ADS)
Hlinka, Jaroslav; Jajcay, Nikola; Hartman, David; Paluš, Milan
2017-03-01
A directed climate network is constructed by Granger causality analysis of air temperature time series from a regular grid covering the whole Earth. Using winner-takes-all network thresholding approach, a structure of a smooth information flow is revealed, hidden to previous studies. The relevance of this observation is confirmed by comparison with the air mass transfer defined by the wind field. Their close relation illustrates that although the information transferred due to the causal influence is not a physical quantity, the information transfer is tied to the transfer of mass and energy.
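A minimal pairwise Granger-causality statistic, restricted to lag 1 for brevity (the study's lag structure and thresholding scheme are not reproduced), can be sketched as follows; a directed edge x → y would be drawn when the F statistic is large:

```python
import random

def granger_f(x, y):
    """Lag-1 Granger statistic for x -> y: F-test of adding x_{t-1} to an
    AR(1) model of y (one restriction, conditional least squares)."""
    Y, L, X = y[1:], y[:-1], x[:-1]
    # restricted model: y_t = a * y_{t-1}
    a = sum(u * v for u, v in zip(L, Y)) / sum(u * u for u in L)
    rss_r = sum((v - a * u) ** 2 for u, v in zip(L, Y))
    # unrestricted model: y_t = a * y_{t-1} + b * x_{t-1} (2x2 normal equations)
    s11 = sum(u * u for u in L)
    s12 = sum(u * w for u, w in zip(L, X))
    s22 = sum(w * w for w in X)
    r1 = sum(u * v for u, v in zip(L, Y))
    r2 = sum(w * v for w, v in zip(X, Y))
    det = s11 * s22 - s12 * s12
    a2 = (r1 * s22 - r2 * s12) / det
    b2 = (s11 * r2 - s12 * r1) / det
    rss_u = sum((v - a2 * u - b2 * w) ** 2 for u, v, w in zip(L, Y, X))
    return (rss_r - rss_u) / (rss_u / (len(Y) - 2))

# synthetic pair where x drives y: a directed edge x -> y should appear
rng = random.Random(0)
x = [rng.gauss(0, 1) for _ in range(500)]
y = [0.0]
for t in range(1, 500):
    y.append(0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.gauss(0, 1))
```

The winner-takes-all thresholding in the study then keeps, for each node, only the strongest incoming links, which is what reveals the smooth flow structure.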
Construction of a surface air temperature series for Qingdao in China for the period 1899 to 2014
NASA Astrophysics Data System (ADS)
Li, Yan; Tinz, Birger; von Storch, Hans; Wang, Qingyuan; Zhou, Qingliang; Zhu, Yani
2018-03-01
We present a homogenized surface air temperature (SAT) time series at 2 m height for the city of Qingdao in China from 1899 to 2014. This series is derived from three data sources: newly digitized and homogenized observations of the German National Meteorological Service from 1899 to 1913, homogenized observation data of the China Meteorological Administration (CMA) from 1961 to 2014, and the gridded dataset of Willmott and Matsuura (2012, University of Delaware) to fill the gap from 1914 to 1960. Based on this new series, long-term trends are described. The SAT in Qingdao has a significant warming trend of 0.11 ± 0.03 °C decade-1 during 1899-2014. The coldest period occurred during 1909-1918 and the warmest during 1999-2008. For the seasonal mean SAT, the most significant warming is found in spring, followed by winter. The homogenized time series of Qingdao is provided and archived on the Deutscher Wetterdienst (DWD) web page under overseas stations of the Deutsche Seewarte (http://www.dwd.de/EN/ourservices/overseas_stations/ueberseedoku/doi_qingdao.html) in ASCII format. A short description of the data is freely available at https://doi.org/10.5676/DWD/Qing_v1, and the data can be downloaded at http://dwd.de/EN/ourservices/overseas_stations/ueberseedoku/data_qingdao.txt.
Retrieval and Mapping of Heavy Metal Concentration in Soil Using Time Series Landsat 8 Imagery
NASA Astrophysics Data System (ADS)
Fang, Y.; Xu, L.; Peng, J.; Wang, H.; Wong, A.; Clausi, D. A.
2018-04-01
Heavy metal pollution is a critical global environmental problem which has long been a concern. The traditional approach to obtaining heavy metal concentrations, relying on field sampling and lab testing, is expensive and time consuming. Although many related studies use spectrometer data to build relational models between heavy metal concentration and spectral information, and then apply these models to hyperspectral imagery for prediction, this approach can hardly map the soil metal concentration of an area quickly and accurately, due to the discrepancies between spectrometer data and remote sensing imagery. Taking advantage of the easy accessibility of Landsat 8 data, this study utilizes Landsat 8 imagery to retrieve the soil Cu concentration and map its distribution in the study area. To enlarge the spectral information for more accurate retrieval and mapping, 11 single-date Landsat 8 images from 2013-2017 are combined to form a time series. Three regression methods, partial least squares regression (PLSR), artificial neural networks (ANN) and support vector regression (SVR), are used for model construction. By comparing these models unbiasedly, the best model is selected to map the Cu concentration distribution. The produced distribution map shows good spatial autocorrelation and consistency with the locations of mining areas.
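PLSR, ANN and SVR require ML libraries, but the "comparing these models unbiasedly" step can be illustrated with k-fold cross-validation on identical splits, here with two simple stand-in regressors on synthetic data (all names and the data are illustrative):

```python
import random

def kfold_mse(xs, ys, fit, predict, k=5, seed=0):
    """Mean held-out squared error over k folds: every model sees the same
    splits, so the comparison is on equal footing."""
    idx = list(range(len(xs)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    sse, n = 0.0, 0
    for fold in folds:
        hold = set(fold)
        xtr = [xs[i] for i in idx if i not in hold]
        ytr = [ys[i] for i in idx if i not in hold]
        model = fit(xtr, ytr)
        for i in fold:
            sse += (predict(model, xs[i]) - ys[i]) ** 2
            n += 1
    return sse / n

def fit_mean(xs, ys):          # baseline: predict the training mean
    return sum(ys) / len(ys)

def predict_mean(m, x):
    return m

def fit_line(xs, ys):          # simple linear regression stand-in
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    b = sum((u - mx) * (v - my) for u, v in zip(xs, ys)) / \
        sum((u - mx) ** 2 for u in xs)
    return (my - b * mx, b)

def predict_line(m, x):
    return m[0] + m[1] * x

rng = random.Random(1)
xs = [rng.uniform(0, 10) for _ in range(200)]
ys = [2.0 * u + rng.gauss(0, 1) for u in xs]
```

Swapping in PLSR, ANN and SVR fits under the same `fit`/`predict` interface reproduces the model-selection protocol the study describes.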
Semiparametric modeling: Correcting low-dimensional model error in parametric models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berry, Tyrus, E-mail: thb11@psu.edu; Harlim, John, E-mail: jharlim@psu.edu; Department of Meteorology, the Pennsylvania State University, 503 Walker Building, University Park, PA 16802-5013
2016-03-01
In this paper, a semiparametric modeling approach is introduced as a paradigm for addressing model error arising from unresolved physical phenomena. Our approach compensates for model error by learning an auxiliary dynamical model for the unknown parameters. Practically, the proposed approach consists of the following steps. Given a physics-based model and a noisy data set of historical observations, a Bayesian filtering algorithm is used to extract a time series of the parameter values. Subsequently, the diffusion forecast algorithm is applied to the retrieved time series in order to construct the auxiliary model for the time-evolving parameters. The semiparametric forecasting algorithm consists of integrating the existing physics-based model with an ensemble of parameters sampled from the probability density function of the diffusion forecast. To specify initial conditions for the diffusion forecast, a Bayesian semiparametric filtering method that extends the Kalman-based filtering framework is introduced. In difficult test examples, which introduce chaotically and stochastically evolving hidden parameters into the Lorenz-96 model, we show that our approach can effectively compensate for model error, with forecasting skill comparable to that of the perfect model.
Stability Estimation of ABWR on the Basis of Noise Analysis
NASA Astrophysics Data System (ADS)
Furuya, Masahiro; Fukahori, Takanori; Mizokami, Shinya; Yokoya, Jun
In order to investigate the stability of a nuclear reactor core loaded with mixed uranium-plutonium oxide (MOX) fuel, channel stability and regional stability tests were conducted with the SIRIUS-F facility. The SIRIUS-F facility was designed and constructed to provide a highly accurate simulation of thermal-hydraulic (channel) instabilities and coupled thermal-hydraulics-neutronics instabilities of Advanced Boiling Water Reactors (ABWRs). A real-time simulation was performed by modal point kinetics of reactor neutronics and fuel-rod thermal conduction on the basis of a measured void fraction in the reactor core section of the facility. A time series analysis was performed to calculate the decay ratio and resonance frequency from a dominant pole of a transfer function by applying autoregressive (AR) methods to the time series of the core inlet flow rate. Experiments were conducted with the SIRIUS-F facility simulating an ABWR loaded with MOX fuel. The variations in the decay ratio and resonance frequency among the five common AR methods are within 0.03 and 0.01 Hz, respectively. In this system, the appropriate decay ratio and resonance frequency can be estimated on the basis of the Yule-Walker method with a model order of 30.
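The study fits AR models of order 30; as a minimal illustration with an assumed AR(2) model and sampling rate, the decay ratio and resonance frequency follow from the dominant pole of the fitted model:

```python
import cmath
import math

def dominant_pole_metrics(a1, a2, fs):
    """Decay ratio and resonance frequency from the dominant pole of an AR(2)
    model x_t = a1*x_{t-1} + a2*x_{t-2} + e_t (poles solve z^2 - a1*z - a2 = 0).
    For a pole r*exp(i*theta): decay ratio = r**(2*pi/theta), i.e. the amplitude
    ratio between successive oscillation peaks; resonance = theta*fs/(2*pi)."""
    disc = cmath.sqrt(a1 * a1 + 4 * a2)
    pole = max(((a1 + disc) / 2, (a1 - disc) / 2), key=abs)
    r, theta = abs(pole), abs(cmath.phase(pole))
    return r ** (2 * math.pi / theta), theta * fs / (2 * math.pi)

# pole placed at 0.9*exp(i*pi/10) with fs = 10 Hz -> resonance at 0.5 Hz
a1 = 2 * 0.9 * math.cos(math.pi / 10)
a2 = -0.81
dr, f_res = dominant_pole_metrics(a1, a2, 10.0)
```

With a fitted order-30 model, the same pole-to-metric mapping is applied to the largest-magnitude complex pole of the AR transfer function.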
Extra-terrestrial construction processes - Advancements, opportunities and challenges
NASA Astrophysics Data System (ADS)
Lim, Sungwoo; Prabhu, Vibha Levin; Anand, Mahesh; Taylor, Lawrence A.
2017-10-01
Government space agencies, including NASA and ESA, are conducting preliminary studies on building alternative space-habitat systems for deep-space exploration. Such studies include the development of advanced technologies for planetary surface exploration, including an in-depth understanding of the use of local resources. Currently, NASA plans to land humans on Mars in the 2030s. Similarly, other space agencies from Europe (ESA), Canada (CSA), Russia (Roscosmos), India (ISRO), Japan (JAXA) and China (CNSA) have already initiated or announced plans for a series of lunar missions over the next decade, ranging from orbiters and landers to rovers for extended stays on the lunar surface. As space exploration is one of humanity's oldest dreams, there has been a series of research efforts aimed at establishing temporary or permanent settlements on other planetary bodies, including the Moon and Mars. This paper reviews current projects developing extra-terrestrial construction, broadly categorised as: (i) ISRU-based construction materials; (ii) fabrication methods; and (iii) construction processes. It also discusses four categories of challenges to developing an appropriate construction process: (i) lunar simulants; (ii) material fabrication and curing; (iii) microwave-sintering-based fabrication; and (iv) fully autonomous and scaled-up construction processes.
On the construction of a time base and the elimination of averaging errors in proxy records
NASA Astrophysics Data System (ADS)
Beelaerts, V.; De Ridder, F.; Bauwens, M.; Schmitz, N.; Pintelon, R.
2009-04-01
Proxies are sources of climate information stored in natural archives (e.g. ice cores, sediment layers on ocean floors, and animals with calcareous marine skeletons). Measuring these proxies produces very short records and mostly involves sampling solid substrates, which is subject to the following two problems. Problem 1: Natural archives are sampled equidistantly on a distance grid along their accretion axis. Starting from these distance series, a time series needs to be constructed, as comparison of different data records is only meaningful on a time grid. The time series will be non-equidistant, as the accretion rate is non-constant. Problem 2: A typical example of sampling solid substrates is drilling. Because of the dimensions of the drill, the holes drilled will not be infinitesimally small. Consequently, samples are not taken at a point in distance, but rather over a volume in distance. This holds for most sampling methods in solid substrates. As a consequence, when the continuous proxy signal is sampled, it is averaged over the volume of the sample, resulting in an underestimation of the amplitude. Whether this averaging effect is significant depends on the volume of the sample and the variations of interest in the proxy signal. Starting from the measured signal, the continuous signal needs to be reconstructed in order to eliminate these averaging errors. The aim is to provide an efficient algorithm for identifying the non-linearities in the distance-time relationship, called time base distortions, and for correcting the averaging effects. Because this is a parametric method, an assumption about the proxy signal needs to be made: the proxy record on a time base is assumed to be harmonic, a natural assumption since natural archives often exhibit a seasonal cycle. In a first approach, the averaging effects are assumed to act in one direction only, i.e. the direction of the axis along which the measurements were performed.
The measured, averaged proxy signal is modeled by the following signal model:

ȳ(n, θ) = (Δ/δ) ∫_{n−δ/(2Δ)}^{n+δ/(2Δ)} y(m, θ) dm

where m is the position index, x(m) = Δm, δ is the width of the sample volume, θ are the unknown parameters, and y(m, θ) is the proxy signal we want to identify (the proxy signal as found in the natural archive), which we model as

y(m, θ) = A_0 + Σ_{k=1}^{H} [A_k sin(kω t(m)) + A_{k+H} cos(kω t(m))]

with t(m) = m T_S + g(m) T_S. Here T_S = 1/f_S is the sampling period, f_S the sampling frequency, and g(m) the unknown time base distortion (TBD). In this work a splines approximation of the TBD is chosen:

g(m) = Σ_l b_l φ_l(m)

where b is the vector of unknown time base distortion parameters and {φ_l} is a set of splines. Estimates of the unknown parameters were obtained with a nonlinear least squares algorithm. The vessel density measured in the mangrove tree R. mucronata was used to illustrate the method; vessel density is a proxy for rainfall in tropical regions. The proxy data on the newly constructed time base showed a yearly periodicity, as expected, and the correction for the averaging effect increased the amplitude by 11.18%.
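The averaging over a finite sample volume described above shrinks the amplitude of a harmonic proxy; for a pure sinusoid the attenuation factor has a closed form (the window width below is a hypothetical choice, not the study's):

```python
import math

def averaging_attenuation(omega, delta):
    """Averaging sin(omega*t) over a window of width delta multiplies its
    amplitude by sin(omega*delta/2) / (omega*delta/2), since
    (1/delta) * integral_{-delta/2}^{delta/2} sin(omega*(t+u)) du
    equals that factor times sin(omega*t)."""
    half = omega * delta / 2
    return math.sin(half) / half

# yearly cycle (omega = 2*pi per year) sampled with a drill that averages
# over 1/6 of a year (hypothetical window)
factor = averaging_attenuation(2 * math.pi, 1 / 6)
correction = 1 / factor  # multiply the measured amplitude by this
```

The paper's parametric fit recovers this correction jointly with the time base distortion rather than applying the closed form directly.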
An algebraic method for constructing stable and consistent autoregressive filters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harlim, John, E-mail: jharlim@psu.edu; Department of Meteorology, the Pennsylvania State University, University Park, PA 16802; Hong, Hoon, E-mail: hong@ncsu.edu
2015-02-15
In this paper, we introduce an algebraic method to construct stable and consistent univariate autoregressive (AR) models of low order for filtering and predicting nonlinear turbulent signals with memory depth. By stable, we refer to the classical stability condition for the AR model. By consistent, we refer to the classical consistency constraints of Adams–Bashforth methods of order two. One attractive feature of this algebraic method is that the model parameters can be obtained without directly knowing any training data set, as opposed to many standard, regression-based parameterization methods. It takes only long-time average statistics as inputs. The proposed method provides a discretization time step interval which guarantees the existence of a stable and consistent AR model and simultaneously produces the parameters for the AR models. In our numerical examples with two chaotic time series with different characteristics of decaying time scales, we find that the proposed AR models produce significantly more accurate short-term predictive skill and comparable filtering skill relative to the linear regression-based AR models. These encouraging results are robust across wide ranges of discretization times, observation times, and observation noise variances. Finally, we also find that the proposed model produces an improved short-time prediction relative to the linear regression-based AR models in forecasting a data set that characterizes the variability of the Madden–Julian Oscillation, a dominant tropical atmospheric wave pattern.
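The full algebraic construction also enforces the Adams–Bashforth consistency constraints, which are not reproduced here; this sketch shows only the underlying idea of obtaining AR parameters from long-time average statistics (autocovariances) and checking the classical stability condition:

```python
import cmath

def ar2_from_stats(c0, c1, c2):
    """AR(2) coefficients from long-time average statistics (autocovariances)
    via the Yule-Walker equations c1 = a1*c0 + a2*c1 and c2 = a1*c1 + a2*c0."""
    det = c0 * c0 - c1 * c1
    a1 = (c1 * c0 - c2 * c1) / det
    a2 = (c0 * c2 - c1 * c1) / det
    return a1, a2

def is_stable(a1, a2):
    """Classical stability: both roots of z^2 - a1*z - a2 inside the unit circle."""
    disc = cmath.sqrt(a1 * a1 + 4 * a2)
    return all(abs(z) < 1 for z in ((a1 + disc) / 2, (a1 - disc) / 2))

# autocovariances consistent with a stable AR(2) having a1 = 0.5, a2 = 0.3
a1, a2 = ar2_from_stats(1.0, 5 / 7, 23 / 35)
```

No time series is needed at any point: only the averaged statistics enter, which is the feature the paper highlights.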
Attractor States in Teaching and Learning Processes: A Study of Out-of-School Science Education.
Geveke, Carla H; Steenbeek, Henderien W; Doornenbal, Jeannette M; Van Geert, Paul L C
2017-01-01
In order for out-of-school science activities that take place during school hours but outside the school context to be successful, instructors must have sufficient pedagogical content knowledge (PCK) to guarantee high-quality teaching and learning. We argue that PCK is a quality of the instructor-pupil system that is constructed in real-time interaction. When PCK is evident in real-time interaction, we define it as Expressed Pedagogical Content Knowledge (EPCK). The aim of this study is to empirically explore whether EPCK shows a systematic pattern of variation, and if so, whether the pattern occurs in recurrent and temporarily stable attractor states as predicted by complex dynamic systems theory. This study concerned nine out-of-school activities in which pupils of upper primary school classes participated. A multivariate coding scheme was used to capture EPCK in real time. A principal component analysis of the time series of all the variables reduced their number to a few components. A cluster analysis revealed general descriptions of the components across all cases. Cluster analyses of individual cases divided the time series into sequences, revealing High-, Low-, and Non-EPCK states. High-EPCK attractor states emerged at particular moments during activities, rather than being present all the time. Such High-EPCK attractor states were only found in a few cases, namely those where the pupils were prepared for the visit and the instructors were trained.
Fretheim, Atle; Zhang, Fang; Ross-Degnan, Dennis; Oxman, Andrew D; Cheyne, Helen; Foy, Robbie; Goodacre, Steve; Herrin, Jeph; Kerse, Ngaire; McKinlay, R James; Wright, Adam; Soumerai, Stephen B
2015-03-01
There is often substantial uncertainty about the impacts of health system and policy interventions. Despite that, randomized controlled trials (RCTs) are uncommon in this field, partly because experiments can be difficult to carry out. An alternative method for impact evaluation is the interrupted time-series (ITS) design. Little is known, however, about how results from the two methods compare. Our aim was to explore whether ITS studies yield results that differ from those of randomized trials. We conducted single-arm ITS analyses (segmented regression) based on data from the intervention arm of cluster randomized trials (C-RCTs), that is, discarding control arm data. Secondarily, we included the control group data in the analyses, by subtracting control group data points from intervention group data points, thereby constructing a time series representing the difference between the intervention and control groups. We compared the results from the single-arm and controlled ITS analyses with results based on conventional aggregated analyses of trial data. The findings were largely concordant, yielding effect estimates with overlapping 95% confidence intervals (CI) across different analytical methods. However, our analyses revealed the importance of a concurrent control group and of taking baseline and follow-up trends into account in the analysis of C-RCTs. The ITS design is valuable for evaluation of health systems interventions, both when RCTs are not feasible and in the analysis and interpretation of data from C-RCTs. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
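A single-arm ITS analysis of the kind described reduces to segmented regression: an intercept, a baseline trend, a level-change term, and a slope-change term at the interruption. A minimal sketch on synthetic data, assuming a known interruption point (all values illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(48, dtype=float)                  # e.g., 48 monthly data points
post = (t >= 24).astype(float)                  # indicator: after the intervention
t_after = post * (t - 24)                       # time elapsed since the intervention

# synthetic series: baseline trend, then a level drop and a slope change
y = 10 + 0.20*t - 3.0*post - 0.15*t_after + rng.normal(0, 0.5, t.size)

# segmented regression: columns are intercept, time, level change, slope change
X = np.column_stack([np.ones_like(t), t, post, t_after])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
level_change, slope_change = beta[2], beta[3]   # should be near -3.0 and -0.15
```

The controlled variant in the abstract applies the same model to the difference between intervention and control series, which removes concurrent trends shared by both arms.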
Spatio-temporal prediction of daily temperatures using time-series of MODIS LST images
NASA Astrophysics Data System (ADS)
Hengl, Tomislav; Heuvelink, Gerard B. M.; Perčec Tadić, Melita; Pebesma, Edzer J.
2012-01-01
A computational framework to generate daily temperature maps using time-series of publicly available MODIS MOD11A2 product Land Surface Temperature (LST) images (1 km resolution; 8-day composites) is illustrated using temperature measurements from the national network of meteorological stations (159) in Croatia. The input data set contains 57,282 ground measurements of daily temperature for the year 2008. Temperature was modeled as a function of latitude, longitude, distance from the sea, elevation, time, insolation, and the MODIS LST images. The original rasters were first converted to principal components to reduce noise and filter missing pixels in the LST images. The residuals were next analyzed for spatio-temporal auto-correlation; sum-metric separable variograms were fitted to account for zonal and geometric space-time anisotropy. The final predictions were generated for time-slices of a 3D space-time cube, constructed in the R environment for statistical computing. The results show that the space-time regression model can explain a significant part of the variation in station data (84%). MODIS LST 8-day (cloud-free) images are an unbiased estimator of the daily temperature, but with relatively low precision (±4.1°C); however, their added value is that they systematically improve detection of local changes in land surface temperature due to local meteorological conditions and/or active heat sources (urban areas, land cover classes). The results of 10-fold cross-validation show that use of spatio-temporal regression-kriging and incorporation of time-series of remote sensing images leads to significantly more accurate maps of temperature than if plain spatial techniques were used. The average (global) accuracy of mapping temperature was ±2.4°C. The regression-kriging explained 91% of variability in daily temperatures, compared to 44% for ordinary kriging.
Further software advancements are anticipated: interactive space-time variogram exploration, and automated retrieval, resampling, and filtering of MODIS images.
NASA Technical Reports Server (NTRS)
Olsen, J. H.; Liu, H. T.
1973-01-01
The water tunnel constructed at the NASA Ames Research Center is described, along with the flow field adjacent to an oscillating airfoil. The design and operational procedures of the tunnel are described in detail. Hydrogen bubble and thymol blue techniques are used to visualize the flow field. Results of the flow visualizations are presented in a series of still pictures and a high-speed movie. These results show that the stall process is more complicated than simple shedding from the leading edge or the trailing edge, particularly at relatively low frequency oscillations comparable to those of a helicopter blade. Therefore, any successful theory for predicting the stall loads on helicopter blades must treat an irregular separated region rather than a discrete vortex passing over each blade surface.
Selection of optimal complexity for ENSO-EMR model by minimum description length principle
NASA Astrophysics Data System (ADS)
Loskutov, E. M.; Mukhin, D.; Mukhina, A.; Gavrilov, A.; Kondrashov, D. A.; Feigin, A. M.
2012-12-01
One of the main problems arising in modeling data taken from a natural system is finding a phase space suitable for construction of an evolution operator model. Since we usually deal with very high-dimensional behavior, we are forced to construct a model working in some projection of the system phase space corresponding to the time scales of interest. Selection of an optimal projection is a non-trivial problem, since there are many ways to reconstruct phase variables from a given time series, especially in the case of a spatio-temporal data field. Actually, finding an optimal projection is a significant part of model selection because, on the one hand, the transformation of data to some phase variable vector can be considered a required component of the model. On the other hand, such an optimization of the phase space makes sense only in relation to the parametrization of the model we use, i.e., the representation of the evolution operator, so we should find an optimal structure of the model together with the phase variable vector. In this paper we propose to use the principle of minimum description length (Molkov et al., 2009) to select models of optimal complexity. The proposed method is applied to optimization of the Empirical Model Reduction (EMR) of the ENSO phenomenon (Kravtsov et al., 2005; Kondrashov et al., 2005). This model operates within a subset of leading EOFs constructed from the spatio-temporal field of SST in the Equatorial Pacific, and has the form of multi-level stochastic differential equations (SDE) with polynomial parameterization of the right-hand side. Optimal values of the number of EOFs, the order of the polynomial, and the number of levels are estimated from the Equatorial Pacific SST dataset. References: Ya. Molkov, D. Mukhin, E. Loskutov, G. Fidelin and A. Feigin, Using the minimum description length principle for global reconstruction of dynamic systems from noisy time series, Phys. Rev. E, Vol. 80, 046207, 2009. S. Kravtsov, D. Kondrashov and M. Ghil, 2005: Multilevel regression modeling of nonlinear processes: Derivation and applications to climatic variability. J. Climate, 18 (21), 4404-4424. D. Kondrashov, S. Kravtsov, A. W. Robertson and M. Ghil, 2005: A hierarchy of data-based ENSO models. J. Climate, 18, 4425-4444.
Construction and performance monitoring of various asphalt mixes in Illinois : 2016 interim report.
DOT National Transportation Integrated Search
2017-02-01
A series of five experimental projects were constructed to better determine the life-cycle cost and performance of pavement overlays using various levels of asphalt binder replacement (ABR) from use of reclaimed asphalt pavement (RAP), recycled aspha...
Long-term Trend of Satellite-observed Chlorophyll-a Concentration Variations in the East/Japan Sea
NASA Astrophysics Data System (ADS)
Park, J. E.; PARK, K. A.
2016-02-01
Long-term time-series of satellite ocean color data enable us to analyze the effects of climate change on the ocean ecosystem through chlorophyll-a concentration as a proxy for phytoplankton biomass. In this study, we constructed a 17-year time-series dataset (1998-2014) of chlorophyll-a concentration by combining SeaWiFS (OrbView-2, 1997-2010) and MODIS (Aqua, 2002-present) data in the East Sea (Japan Sea). Several types of errors, such as anomalously high values (speckle errors), stripe-like patterns, and discrepancies originating from the time gap between the two satellites, were eliminated to enhance the accuracy of the chlorophyll-a concentration data. The composited chlorophyll-a concentration maps, after post-processing of the speckle errors, were improved significantly, reducing abnormal variability by up to 14%. Using the database, we investigated spatial and temporal variability of chlorophyll-a concentration in the East Sea. The spatial distribution of the long-term trend of chlorophyll-a concentration indicated an obvious distinction between the northern and southern regions of the subpolar front. It revealed predominant seasonal variability as well as long-term changes in the timing of spring blooms. This study addresses the important role of local climate change on the fast-changing ecosystem of the East Sea, often regarded as a miniature ocean.
Rio, Daniel E.; Rawlings, Robert R.; Woltz, Lawrence A.; Gilman, Jodi; Hommer, Daniel W.
2013-01-01
A linear time-invariant model based on statistical time series analysis in the Fourier domain for single subjects is further developed and applied to functional MRI (fMRI) blood-oxygen level-dependent (BOLD) multivariate data. This methodology was originally developed to analyze multiple stimulus input evoked response BOLD data. However, to analyze clinical data generated using a repeated measures experimental design, the model has been extended to handle multivariate time series data and demonstrated on control and alcoholic subjects taken from data previously analyzed in the temporal domain. Analysis of BOLD data is typically carried out in the time domain where the data has a high temporal correlation. These analyses generally employ parametric models of the hemodynamic response function (HRF) where prewhitening of the data is attempted using autoregressive (AR) models for the noise. However, this data can be analyzed in the Fourier domain. Here, assumptions made on the noise structure are less restrictive, and hypothesis tests can be constructed based on voxel-specific nonparametric estimates of the hemodynamic transfer function (HRF in the Fourier domain). This is especially important for experimental designs involving multiple states (either stimulus or drug induced) that may alter the form of the response function. PMID:23840281
Near-Field Postseismic Deformation Measurements from the Andaman and Nicobar Islands
NASA Astrophysics Data System (ADS)
Freymueller, J. T.; Rajendran, C.; Rajendran, K.; Rajamani, A.
2006-12-01
Since the December 26, 2004 Sumatra-Andaman Islands earthquake, we have carried out campaign GPS measurements at several sites in the Andaman and Nicobar Islands (India) and installed three continuous GPS sites in the region. Most of these sites had pre-earthquake measurements, which showed slow westward motion relative to the Indian plate. Postseismic measurements, on the other hand, show average westward velocities of several cm/yr to a few decimeters per year relative to the Indian plate. The motion of all sites is strongly non-linear in time, and is not uniform in space. We use a combination of continuous site time series and nearby campaign site time series to construct the most complete possible postseismic displacement records. Postseismic deformation from large earthquakes is likely to be dominated by a combination of afterslip on the deeper subduction interface, and viscoelastic relaxation of the mantle. Afterslip following the (similar magnitude) 1964 Alaska earthquake amounted to 20-50% of the magnitude of the coseismic slip, and smaller subduction zone earthquakes have exhibited the same or even larger proportion of afterslip to coseismic slip. We compare the time decay and spatial pattern of the observed postseismic displacement to postseismic deformation models and to observations from the Alaska earthquake.
Flexible Wing Base Micro Aerial Vehicles: Composite Materials for Micro Air Vehicles
NASA Technical Reports Server (NTRS)
Ifju, Peter G.; Ettinger, Scott; Jenkins, David; Martinez, Luis
2002-01-01
This paper will discuss the development of the University of Florida's Micro Air Vehicle concept. A series of flexible wing based aircraft that possess highly desirable flight characteristics were developed. Since computational methods to accurately model flight at the low Reynolds numbers associated with this scale are still under development, our effort has relied heavily on trial and error. Hence a time efficient method was developed to rapidly produce prototype designs. The airframe and wings are fabricated using a unique process that incorporates carbon fiber composite construction. Prototypes can be fabricated in around five man-hours, allowing many design revisions to be tested in a short period of time. The resulting aircraft are far more durable, yet lighter, than their conventional counterparts. This process allows for thorough testing of each design in order to determine what changes were required on the next prototype. The use of carbon fiber allows for wing flexibility without sacrificing durability. The construction methods developed for this project were the enabling technology that allowed us to implement our designs. The resulting aircraft were the winning entries in the International Micro Air Vehicle Competition for the past two years. Details of the construction method are provided in this paper along with a background on our flexible wing concept.
Fluid Mechanics Experiments as a Unifying Theme in the Physics Instrumentation Laboratory Course
NASA Astrophysics Data System (ADS)
Borrero-Echeverry, Daniel
2017-11-01
We discuss the transformation of a junior-level instrumentation laboratory course from a sequence of cookbook lab exercises to a semester-long, project-based course. In the original course, students conducted a series of activities covering the usual electronics topics (amplifiers, filters, oscillators, logic gates, etc.) and learned basic LabVIEW programming for data acquisition and analysis. Students complained that these topics seemed disconnected and not immediately applicable to "real" laboratory work. To provide a unifying theme, we restructured the course around the design, construction, and instrumentation of a low-cost Taylor-Couette cell, in which fluid is sheared between rotating coaxial cylinders. The electronics labs were reworked to guide students from fundamental electronics through the design and construction of a stepper motor driver, which was used to actuate the cylinders. Some of the legacy labs were replaced with a module on computer-aided design (CAD) in which students designed parts for the apparatus, which they then built in the departmental machine shop. Signal processing topics like spectral analysis were introduced in the context of time-series analysis of video data acquired from flow visualization. The course culminated in a capstone project in which students conducted experiments of their own design on a variety of topics in rheology and nonlinear dynamics.
An Efficient Pattern Mining Approach for Event Detection in Multivariate Temporal Data
Batal, Iyad; Cooper, Gregory; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos
2015-01-01
This work proposes a pattern mining approach to learn event detection models from complex multivariate temporal data, such as electronic health records. We present Recent Temporal Pattern mining, a novel approach for efficiently finding predictive patterns for event detection problems. This approach first converts the time series data into time-interval sequences of temporal abstractions. It then constructs more complex time-interval patterns backward in time using temporal operators. We also present the Minimal Predictive Recent Temporal Patterns framework for selecting a small set of predictive and non-spurious patterns. We apply our methods for predicting adverse medical events in real-world clinical data. The results demonstrate the benefits of our methods in learning accurate event detection models, which is a key step for developing intelligent patient monitoring and decision support systems. PMID:26752800
ERIC Educational Resources Information Center
Hales, James A.; Snyder, James F.
1982-01-01
Presents narratives, charts, and diagrams showing the unique aspects of the four subsystems of human technical endeavor. These subsystems are communication, construction, manufacturing, and transportation. (For part 1 of this series, see CE 511 770.) (CT)
Bose, Eliezer; Hravnak, Marilyn; Sereika, Susan M.
2016-01-01
Background Patients undergoing continuous vital sign monitoring (heart rate [HR], respiratory rate [RR], pulse oximetry [SpO2]) in real time display inter-related vital sign changes during situations of physiologic stress. Patterns in this physiological cross-talk could portend impending cardiorespiratory instability (CRI). Vector autoregressive (VAR) modeling with Granger causality tests is one of the most flexible ways to elucidate underlying causal mechanisms in time series data. Purpose The purpose of this article is to illustrate development of patient-specific VAR models using vital sign time series (VSTS) data in a sample of acutely ill, monitored, step-down unit (SDU) patients, and to determine their Granger causal dynamics prior to onset of an incident CRI. Approach CRI was defined as vital signs beyond stipulated normality thresholds (HR outside 40–140/minute, RR outside 8–36/minute, or SpO2 < 85%) persisting for 3 minutes within a 5-minute moving window (60% of the duration of the window). A 6-hour time segment prior to onset of the first CRI was chosen for time series modeling in 20 patients using a six-step procedure: (a) the uniform time series for each vital sign was assessed for stationarity; (b) appropriate lag was determined using a lag-length selection criterion; (c) the VAR model was constructed; (d) residual autocorrelation was assessed with the Lagrange multiplier test; (e) stability of the VAR system was checked; and (f) Granger causality was evaluated in the final stable model. Results The primary cause of incident CRI was low SpO2 (60% of cases), followed by out-of-range RR (30%) and HR (10%). Granger causality testing revealed that change in RR caused change in HR (21%) (i.e., RR changed before HR changed) more often than change in HR causing change in RR (15%). Similarly, changes in RR caused changes in SpO2 (15%) more often than changes in SpO2 caused changes in RR (9%).
For HR and SpO2, changes in HR causing changes in SpO2 and changes in SpO2 causing changes in HR occurred with equal frequency (18%). Discussion Within this sample of acutely ill patients who experienced a CRI event, VAR modeling indicated that RR changes tend to occur before changes in HR and SpO2. These findings suggest that contextual assessment of RR changes as the earliest sign of CRI is warranted. Use of VAR modeling may be helpful in other nursing research applications based on time series data. PMID:27977564
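The Granger causality step can be illustrated with the standard restricted-versus-unrestricted regression F-test that underlies VAR-based causality tests. The sketch below is a library-free illustration of that idea on synthetic data, not the authors' six-step pipeline (which also covers stationarity, lag selection, residual autocorrelation, and stability checks):

```python
import numpy as np
from scipy import stats

def granger_p(x, y, p=2):
    """p-value for 'lags of x help predict y beyond y's own lags' (F-test, lag order p)."""
    n = len(y)
    rows = n - p
    Y = y[p:]
    own = [y[p - i:n - i] for i in range(1, p + 1)]     # y's own lags
    cross = [x[p - i:n - i] for i in range(1, p + 1)]   # candidate causal lags
    Xr = np.column_stack([np.ones(rows)] + own)          # restricted model
    Xu = np.column_stack([Xr] + cross)                   # unrestricted model
    rss = lambda M: np.sum((Y - M @ np.linalg.lstsq(M, Y, rcond=None)[0])**2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    df_u = rows - Xu.shape[1]
    F = ((rss_r - rss_u) / p) / (rss_u / df_u)
    return float(stats.f.sf(F, p, df_u))

# synthetic example in the spirit of "RR changes lead HR changes"
rng = np.random.default_rng(7)
rr = rng.normal(0, 1, 500)
hr = np.empty(500); hr[0] = 0.0
for t in range(1, 500):
    hr[t] = 0.5*hr[t-1] + 0.8*rr[t-1] + rng.normal(0, 1)

p_rr_to_hr = granger_p(rr, hr)   # should be tiny: rr genuinely leads hr
p_hr_to_rr = granger_p(hr, rr)   # should be unremarkable
```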
Neural Networks as a Tool for Constructing Continuous NDVI Time Series from AVHRR and MODIS
NASA Technical Reports Server (NTRS)
Brown, Molly E.; Lary, David J.; Vrieling, Anton; Stathakis, Demetris; Mussa, Hamse
2008-01-01
The long term Advanced Very High Resolution Radiometer-Normalized Difference Vegetation Index (AVHRR-NDVI) record provides a critical historical perspective on vegetation dynamics necessary for global change research. Despite the proliferation of new sources of global, moderate resolution vegetation datasets, the remote sensing community is still struggling to create datasets derived from multiple sensors that allow the simultaneous use of spectral vegetation for time series analysis. To overcome the non-stationary aspect of NDVI, we use an artificial neural network (ANN) to map the NDVI indices from AVHRR to those from MODIS using atmospheric, surface type and sensor-specific inputs to account for the differences between the sensors. The NDVI dynamics and range of MODIS NDVI data at one degree is matched and extended through the AVHRR record. Four years of overlap between the two sensors is used to train a neural network to remove atmospheric and sensor specific effects on the AVHRR NDVI. In this paper, we present the resulting continuous dataset, its relationship to MODIS data, and a validation of the product.
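The cross-sensor mapping idea can be sketched with a tiny one-hidden-layer network trained by gradient descent on a synthetic "overlap period"; the generating function, inputs, and network size are invented for illustration and bear no relation to the real AVHRR/MODIS calibration:

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic overlap: treat "MODIS NDVI" as an unknown nonlinear function of
# "AVHRR NDVI" plus one auxiliary input (a stand-in for atmospheric/sensor terms)
n = 2000
avhrr = rng.uniform(0.0, 1.0, n)
aux = rng.uniform(0.0, 1.0, n)
X = np.column_stack([avhrr, aux])
y = 0.9*avhrr + 0.1*np.tanh(3.0*(avhrr - 0.5)) - 0.05*aux + rng.normal(0, 0.01, n)

# one hidden layer of tanh units, trained by full-batch gradient descent on MSE
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(8000):
    H = np.tanh(X @ W1 + b1)                      # hidden activations
    err = (H @ W2 + b2).ravel() - y               # prediction error
    dH = (err[:, None] @ W2.T) * (1.0 - H**2)     # backprop through tanh
    W2 -= lr * (H.T @ err[:, None]) / n
    b2 -= lr * err.mean()
    W1 -= lr * (X.T @ dH) / n
    b1 -= lr * dH.mean(axis=0)

rmse = np.sqrt(np.mean(((np.tanh(X @ W1 + b1) @ W2 + b2).ravel() - y)**2))
```

In the study itself the inputs also include atmospheric and surface-type variables, and the trained network is then applied across the full AVHRR record.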
An Environmental Data Set for Vector-Borne Disease Modeling and Epidemiology
Chabot-Couture, Guillaume; Nigmatulina, Karima; Eckhoff, Philip
2014-01-01
Understanding the environmental conditions of disease transmission is important in the study of vector-borne diseases. Low- and middle-income countries bear a significant portion of the disease burden, but data about weather conditions in those countries can be sparse and difficult to reconstruct. Here, we describe methods to assemble high-resolution gridded time series data sets of air temperature, relative humidity, land temperature, and rainfall for such areas, and we test these methods on the island of Madagascar. Air temperature and relative humidity were constructed using statistical interpolation of weather station measurements; the resulting median 95th percentile absolute errors were 2.75°C and 16.6%. Missing pixels from the MOD11 remote sensing land temperature product were estimated using Fourier decomposition and time-series analysis, thus providing an alternative to the 8-day and 30-day aggregated products. The RFE 2.0 remote sensing rainfall estimator was characterized by comparing it with multiple interpolated rainfall products, and we observed significant differences in temporal and spatial heterogeneity relevant to vector-borne disease modeling. PMID:24755954
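The Fourier/harmonic gap-filling idea for a single land-temperature pixel can be sketched as a least squares fit of annual and semiannual harmonics to the valid samples, evaluated at the missing times. All data here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(365)
# synthetic pixel: annual + semiannual cycle, observed with noise and gaps
truth = 20 + 8*np.sin(2*np.pi*t/365) + 2*np.sin(4*np.pi*t/365 + 0.5)
obs = truth + rng.normal(0, 0.8, t.size)
missing = rng.random(t.size) < 0.3              # e.g., cloudy acquisitions
obs_masked = np.where(missing, np.nan, obs)

def harmonics(t):
    # design matrix: mean term plus annual and semiannual sin/cos pairs
    return np.column_stack([np.ones_like(t, dtype=float),
                            np.sin(2*np.pi*t/365), np.cos(2*np.pi*t/365),
                            np.sin(4*np.pi*t/365), np.cos(4*np.pi*t/365)])

valid = ~np.isnan(obs_masked)
coef, *_ = np.linalg.lstsq(harmonics(t[valid]), obs_masked[valid], rcond=None)
filled = np.where(valid, obs_masked, harmonics(t) @ coef)   # gaps replaced by the fit
```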
Marot, Marci E.; Adams, C. Scott; Richwine, Kathryn A.; Smith, Christopher G.; Osterman, Lisa E.; Bernier, Julie C.
2014-01-01
Scientists from the U.S. Geological Survey, St. Petersburg Coastal and Marine Science Center conducted a time-series collection of shallow sediment cores from the back-barrier environments along the Chandeleur Islands, Louisiana from March 2012 through July 2013. The sampling efforts were part of a larger USGS study to evaluate effects on the geomorphology of the Chandeleur Islands following the construction of an artificial sand berm to reduce oil transport onto federally managed lands. The objective of this study was to evaluate the response of the back-barrier tidal and wetland environments to the berm. This report serves as an archive for sedimentological, radiochemical, and microbiological data derived from the sediment cores. Data are available for a time-series of four sampling periods: March 2012; July 2012; September 2012; and July 2013. Downloadable data are available as Excel spreadsheets and as JPEG files. Additional files include: ArcGIS shapefiles of the sampling sites, detailed results of sediment grain size analyses, and formal Federal Geographic Data Committee metadata.
Drought Analysis for Kuwait Using Standardized Precipitation Index
2014-01-01
Implementation of adequate measures to assess and monitor droughts is recognized as a major challenge for researchers involved in water resources management. The objective of this study is to assess the hydrologic drought characteristics from the historical rainfall records of Kuwait, with its arid environment, by employing the criterion of the Standardized Precipitation Index (SPI). A wide range of monthly total precipitation data from January 1967 to December 2009 is used for the assessment. The computation of the SPI series is performed for intermediate and long time scales of 3, 6, 12, and 24 months. The drought severity and duration are also estimated. The bivariate probability distribution for these two drought characteristics is constructed by using the Clayton copula. It has been shown that the drought SPI series for the time scales examined have no systematic trend component but a seasonal pattern related to the rainfall data. The results are used to perform univariate and bivariate frequency analyses for the drought events. The study will help in evaluating the risk of future droughts in the region, assessing their consequences for the economy, environment, and society, and adopting measures for mitigating the effects of droughts. PMID:25386598
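SPI at a given accumulation scale is conventionally computed by fitting a gamma distribution to accumulated precipitation and mapping the fitted CDF through the standard normal quantile function. A minimal sketch (with no handling of zero-rainfall months, which an arid record like Kuwait's would require in practice):

```python
import numpy as np
from scipy import stats

def spi(precip, scale=3):
    """Standardized Precipitation Index at a given accumulation scale (months)."""
    x = np.convolve(precip, np.ones(scale), mode="valid")   # running k-month totals
    a, loc, b = stats.gamma.fit(x, floc=0)                  # gamma fit, location fixed at 0
    # probability-integral transform to a standard normal score
    return stats.norm.ppf(stats.gamma.cdf(x, a, loc=0, scale=b))

# example on synthetic monthly rainfall; the result is roughly ~N(0, 1) by design
rng = np.random.default_rng(3)
series = spi(rng.gamma(2.0, 15.0, 600), scale=3)
```

Values below about -1.5 would then be classed as severe drought, and runs of negative SPI give the duration and severity used in the copula analysis.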
Short-term prediction of chaotic time series by using RBF network with regression weights.
Rojas, I; Gonzalez, J; Cañas, A; Diaz, A F; Rojas, F J; Rodriguez, M
2000-10-01
We propose a framework for constructing and training a radial basis function (RBF) neural network. The structure of the gaussian functions is modified using a pseudo-gaussian function (PG) in which two scaling parameters sigma are introduced, which eliminates the symmetry restriction and provides the neurons in the hidden layer with greater flexibility with respect to function approximation. We propose a modified PG-BF (pseudo-gaussian basis function) network in which regression weights are used to replace the constant weights in the output layer. For this purpose, a sequential learning algorithm is presented to adapt the structure of the network, in which it is possible to create a new hidden unit and also to detect and remove inactive units. A salient feature of the network is that the overall output is calculated as the weighted average of the outputs associated with each receptive field. The superior performance of the proposed PG-BF system over the standard RBF is illustrated using the problem of short-term prediction of chaotic time series.
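Two ingredients named in the abstract, an asymmetric pseudo-gaussian activation and regression (local linear) output weights combined by a normalized weighted average, can be sketched on a classic chaotic one-step prediction task. Centers and widths here are fixed by hand rather than learned sequentially as in the paper:

```python
import numpy as np

def pseudo_gaussian(x, c, sig_l, sig_r):
    # asymmetric activation: a different width on each side of the center
    s = np.where(x < c, sig_l, sig_r)
    return np.exp(-(x - c)**2 / (2.0 * s**2))

# chaotic series from the logistic map; task: predict x[t+1] from x[t]
x = np.empty(600); x[0] = 0.3
for t in range(599):
    x[t+1] = 3.9 * x[t] * (1.0 - x[t])
X, y = x[:-1], x[1:]

centers = np.linspace(0.05, 0.95, 12)
Phi = np.column_stack([pseudo_gaussian(X, c, 0.08, 0.12) for c in centers])
Phi /= Phi.sum(axis=1, keepdims=True)           # weighted-average (normalized) output
# regression weights: each hidden unit contributes a local linear model a_i + b_i * x
A = np.column_stack([Phi, Phi * X[:, None]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
rmse = np.sqrt(np.mean((A @ coef - y)**2))
```

Replacing constant output weights with these local linear models is what lets a modest number of units track the curvature of the chaotic map.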
Coral radiocarbon constraints on the source of the Indonesian throughflow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, M.D.; Schrag, D.P.; Kashgarian, M.
1997-06-01
Radiocarbon variability in Porites spp. corals from Guam and the Makassar Strait (Indonesian Seaway) was used to identify the source waters contributing to the Indonesian throughflow. Time series with bimonthly resolution were constructed using accelerator mass spectrometry. The seasonal variability ranges from 15 to 60‰, with large interannual variability. Δ¹⁴C values from Indonesia and Guam have a nearly identical range. Annual mean Δ¹⁴C values from Indonesia are 50 to 60‰ higher than in corals from Canton in the South Equatorial Current [Druffel, 1987]. These observations support a year-round North Pacific source for the Indonesian throughflow and imply negligible contribution by South Equatorial Current water. The large seasonality in Δ¹⁴C values from both sites emphasizes the dynamic behavior of radiocarbon in the surface ocean and suggests that Δ¹⁴C time series of similar resolution can help constrain seasonal and interannual changes in ocean circulation in the Pacific over the last several decades. © 1997 American Geophysical Union
Difficulties in tracking the long-term global trend in tropical forest area.
Grainger, Alan
2008-01-15
The long-term trend in tropical forest area receives less scrutiny than the tropical deforestation rate. We show that constructing a reliable trend is difficult and evidence for decline is unclear, within the limits of errors involved in making global estimates. A time series for all tropical forest area, using data from Forest Resources Assessments (FRAs) of the United Nations Food and Agriculture Organization, is dominated by three successively corrected declining trends. Inconsistencies between these trends raise questions about their reliability, especially because differences seem to result as much from errors as from changes in statistical design and use of new data. A second time series for tropical moist forest area shows no apparent decline. The latter may be masked by the errors involved, but a "forest return" effect may also be operating, in which forest regeneration in some areas offsets deforestation (but not biodiversity loss) elsewhere. A better monitoring program is needed to give a more reliable trend. Scientists who use FRA data should check how the accuracy of their findings depends on errors in the data.
A 5-year analysis of crop phenologies from the United States Heartland (Invited)
NASA Astrophysics Data System (ADS)
Johnson, D. M.
2010-12-01
Time series imagery data from the National Aeronautics and Space Administration (NASA) Moderate Resolution Imaging Spectroradiometer (MODIS) was intersected with annually updated field-level crop data from the United States Department of Agriculture (USDA) Farm Service Agency (FSA). Phenological metrics were derived for major crop types found in the United States (US) Heartland region. The specific MODIS data consisted of the 16-day composited Normalized Difference Vegetation Index (NDVI) 250 meter spatial resolution imagery from the Terra satellite. Crops evaluated included corn, soybeans, wheat, cotton, sorghum, rice, and other small grains. Charts showing the annual average state-level NDVI phenologies by crop were constructed for the five years between 2006 and 2010. The states of interest covered the intensively cultivated regions in the US Great Plains, Corn Belt, and Mississippi River Alluvial Plain. Results demonstrated the recent biophysical growth cycles of prevalent and widespread US crops and how they varied by geography and year. Linkages between the time series data and planting practices, weather impacts, crop progress reports, and yields were also investigated.
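Phenology metrics of the kind derived from 16-day NDVI composites can be illustrated with a simple threshold rule on a single stylized season; this is an illustration, not the operational definition used in the study:

```python
import numpy as np

def phenology_metrics(ndvi, threshold_frac=0.5):
    """Threshold-based metrics from one season of composited NDVI (indices, not dates)."""
    base, peak = ndvi.min(), ndvi.max()
    thresh = base + threshold_frac * (peak - base)   # half-amplitude threshold
    above = np.nonzero(ndvi >= thresh)[0]
    return {"start": int(above[0]),                  # first composite above threshold
            "peak": int(np.argmax(ndvi)),            # composite of maximum greenness
            "end": int(above[-1]),                   # last composite above threshold
            "amplitude": peak - base}

# stylized corn-like season over 23 16-day composites
t = np.arange(23)
ndvi = 0.2 + 0.6 * np.exp(-0.5 * ((t - 12) / 3.5)**2)
m = phenology_metrics(ndvi)
```

Comparing such metrics across years and states is one simple way to surface the planting-practice and weather signals discussed above.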
Hou, Ying-Yu; He, Yan-Bo; Wang, Jian-Lin; Tian, Guo-Liang
2009-10-01
Based on the time series of 10-day composite NOAA Pathfinder AVHRR Land (PAL) data (8 km x 8 km), and by using the land surface energy balance equation and the "VI-Ts" (vegetation index-land surface temperature) method, a new algorithm for land surface evapotranspiration (ET) was constructed. This new algorithm did not need support from meteorological observation data; all of its parameters and variables were directly inverted or derived from remote sensing data. A widely accepted remote sensing ET model, the SEBS model, was chosen to validate the new algorithm. The validation test showed that the ET and its seasonal variation trend estimated by the SEBS model and the new algorithm agreed well, suggesting that the ET estimated from the new algorithm was reliable and able to reflect actual land surface ET. The new remote sensing ET algorithm is practical and operational, offering a new approach to studying the spatiotemporal variation of ET at continental and global scales based on long-term time series of satellite remote sensing images.
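A minimal sketch of the "VI-Ts" idea: within each vegetation-index bin, a pixel's surface temperature is placed between the warm (dry) and cold (wet) edges to yield an evaporative-fraction proxy. The binning scheme and edge definitions here are simplified assumptions, not the paper's actual algorithm:

```python
import numpy as np

def evaporative_fraction(ndvi, ts, n_bins=20):
    """Toy VI-Ts (triangle) interpolation: per NDVI bin, position each
    pixel's surface temperature Ts between the warm (dry) and cold (wet)
    edges, giving an evaporative-fraction proxy in [0, 1]."""
    ndvi, ts = np.asarray(ndvi, float), np.asarray(ts, float)
    bins = np.linspace(ndvi.min(), ndvi.max(), n_bins + 1)
    idx = np.clip(np.digitize(ndvi, bins) - 1, 0, n_bins - 1)
    ef = np.empty_like(ts)
    for b in range(n_bins):
        m = idx == b
        if not m.any():
            continue
        t_max, t_min = ts[m].max(), ts[m].min()   # dry and wet edges
        span = max(t_max - t_min, 1e-6)
        ef[m] = (t_max - ts[m]) / span            # hot pixel -> 0, cold -> 1
    return ef
```

Multiplying such an evaporative fraction by the available energy (net radiation minus ground heat flux) would then yield an ET estimate, which is the general spirit of energy-balance partitioning from remote sensing inputs alone.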
Position-sensitive proportional counter with low-resistance metal-wire anode
Kopp, Manfred K.
1980-01-01
A position-sensitive proportional counter circuit is provided which allows the use of a conventional (low-resistance, metal-wire anode) proportional counter for spatial resolution of an ionizing event along the anode of the counter. A pair of specially designed active-capacitance preamplifiers is used to terminate the anode ends, wherein the anode is treated as an RC line. The preamplifiers act as stabilized active-capacitance loads, and each is composed of a series-feedback, low-noise amplifier and a unity-gain, shunt-feedback amplifier whose output is connected through a feedback capacitor to the series-feedback amplifier input. The stabilized capacitance loading of the anode allows distributed RC-line position encoding and subsequent time-difference decoding by sensing the difference in rise times of pulses at the anode ends, where the difference is primarily in response to the distributed capacitance along the anode. This allows the use of lower-resistance wire anodes for spatial radiation detection, which simplifies the counter construction and handling of the anodes, and stabilizes the anode resistivity at high count rates (>10^6 counts/sec).
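The rise-time decoding step can be illustrated with a toy linear model in which the normalized rise-time difference maps onto position along the wire. Real RC-line encoding is nonlinear and depends on the distributed resistance and capacitance, so the function below is only a sketch:

```python
def event_position(rise_a, rise_b, anode_length=100.0):
    """Toy rise-time decoding for an RC-line anode: an event nearer end A
    yields a faster-rising pulse at A and a slower one at B. Mapping the
    normalized rise-time difference onto the wire gives the position.
    Both rise times share a unit; anode_length sets the output scale."""
    diff = (rise_b - rise_a) / (rise_a + rise_b)   # -1 (at end B) .. +1 (at end A)
    return anode_length * (1.0 + diff) / 2.0       # 0 at end B, anode_length at end A

# Equal rise times place the event at the wire's midpoint
print(event_position(50.0, 50.0))  # 50.0
```

In the actual circuit the decoding is done in analog hardware by timing the pulse-shape difference, but the proportionality idea is the same.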
NASA Astrophysics Data System (ADS)
Das, L.; Dutta, M.; Akhter, J.; Meher, J. K.
2016-12-01
Creating station-level (local-scale) climate change information for the mountainous locations of the Western Himalayan Region (WHR) in India is a challenging task because of limited data availability and poor data quality. In the present study, missing values in the station data were handled with the Multiple Imputation by Chained Equations (MICE) technique. In total, 22 rain-gauge stations and 16 temperature stations with continuous records during the 1901-2005 and 1969-2009 periods, respectively, were considered as reference stations for developing downscaled rainfall and temperature time series from five commonly available GCMs in the IPCC's different generations of assessment reports, namely the 2nd, 3rd, 4th and 5th, hereafter known as SAR, TAR, AR4 and AR5, respectively. Downscaling models were developed using combined data from the ERA-Interim reanalysis and the GCMs' historical runs (even though the forcings were not identical across generations) as predictors and station-level rainfall and temperature as predictands. Station-level downscaled rainfall and temperature time series were constructed for the five GCMs available in each generation. A regionally averaged downscaled time series comprising all stations was prepared for each model and generation, and the downscaled results were compared with the observed time series. Finally, an Overall Model Improvement Index (OMII) was developed from the downscaling results and used to investigate model improvement across generations as well as the improvement of downscaling results obtained from the Empirical Statistical Downscaling (ESD) methods. In the case of temperature, models have improved from SAR to AR5 over the study area; in almost all the GCMs, TAR shows the worst performance over the WHR according to the different statistical indices used in this study. In the case of precipitation, no model shows gradual improvement from SAR to AR5 for either interpolated or downscaled values.
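The chained-equations idea behind MICE can be sketched as follows: each column with gaps is repeatedly regressed on the remaining columns and its missing entries replaced by the predictions. This single-imputation sketch omits the multiple-draws and noise components of full MICE:

```python
import numpy as np

def mice_impute(x, n_iter=10):
    """Minimal chained-equations imputation (in the spirit of MICE, without
    the multiple-draws step): each column with missing values is repeatedly
    regressed on all other columns and its NaNs replaced by predictions."""
    x = np.array(x, float)
    miss = np.isnan(x)
    col_means = np.nanmean(x, axis=0)
    x[miss] = np.take(col_means, np.where(miss)[1])   # mean-fill to start
    for _ in range(n_iter):
        for j in range(x.shape[1]):
            rows = miss[:, j]
            if not rows.any():
                continue
            others = np.delete(x, j, axis=1)
            a = np.column_stack([others, np.ones(len(x))])  # add intercept
            coef, *_ = np.linalg.lstsq(a[~rows], x[~rows, j], rcond=None)
            x[rows, j] = a[rows] @ coef
    return x

# Two correlated "stations"; one record missing at station 2
data = [[10.0, 20.1], [12.0, 24.0], [11.0, 22.0], [13.0, np.nan]]
filled = mice_impute(data)
```

Because station 2 tracks roughly twice station 1 in this toy data, the imputed value lands near 26, which is the kind of cross-station consistency the technique exploits.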
An a priori model for the reduction of nutation observations: KSV(1994.3) nutation series
NASA Technical Reports Server (NTRS)
Herring, T. A.
1995-01-01
We discuss the formulation of a new nutation series to be used in the reduction of modern space geodetic data. The motivation for developing such a series is to obtain a nutation series with smaller short-period errors than the IAU 1980 nutation series and to provide a series that can be used with techniques such as the Global Positioning System (GPS) that have sensitivity to nutations but cannot directly separate the effects of nutations from errors in the dynamical force models that affect the satellite orbits. A modern nutation series should allow the errors in the force models for GPS to be better understood. The series is constructed by convolving the Kinoshita and Souchay rigid Earth nutation series with an Earth response function whose parameters are partly based on geophysical models of the Earth and partly estimated from a long series (1979-1993) of very long baseline interferometry (VLBI) estimates of nutation angles. Secular rates of change of the nutation angles, representing corrections to the precession constant and a secular change of the obliquity of the ecliptic, are included in the theory. Time-dependent amplitudes of the Free Core Nutation (FCN), which is most likely excited by variations in atmospheric pressure, are included when the geophysical parameters are estimated. The complex components of the prograde annual nutation are estimated simultaneously with the geophysical parameters because of the large contribution to the nutation from the S1 atmospheric tide. The weighted root mean square (WRMS) scatter of the nutation angle estimates about this new model is 0.32 mas, and the largest correction to the series when the amplitudes of the ten largest nutations are estimated is 0.18 +/- 0.03 mas for the in-phase component of the prograde 18.6-year nutation.
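Convolving a rigid-Earth series with an Earth response function amounts, term by term, to multiplying each rigid-Earth amplitude by a frequency-dependent transfer function containing resonance terms such as the FCN. The single-resonance form and every numeric value below are illustrative placeholders, not the KSV(1994.3) coefficients:

```python
def transfer_function(sigma, sigma_fcn=-1.00232, amp_fcn=-0.00026):
    """Illustrative Earth response: a flat (rigid-Earth) term plus a single
    Free Core Nutation resonance. `sigma` is a tidal frequency and
    `sigma_fcn` the FCN resonance frequency in the same (here arbitrary)
    units; both coefficients are placeholders for this sketch."""
    return 1.0 + amp_fcn / (sigma - sigma_fcn)

def nonrigid_amplitude(rigid_amp, sigma):
    """Nonrigid amplitude = rigid-Earth amplitude x transfer function."""
    return rigid_amp * transfer_function(sigma)
```

Fitting the handful of response parameters to VLBI nutation-angle estimates, rather than estimating every term independently, is what keeps the series compact while absorbing the non-rigid Earth's behavior.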
Dziewit, Lukasz; Adamczuk, Marcin; Szuplewska, Magdalena; Bartosik, Dariusz
2011-08-01
We have developed a DIY (Do It Yourself) series of genetic cassettes, which facilitate construction of novel versatile vectors for Alphaproteobacteria. All the cassettes are based on defined genetic modules derived from three natural plasmids of Paracoccus aminophilus JCM 7686. We have constructed over 50 DIY cassettes, which differ in structure and specific features. All of them are functional in eight strains representing three orders of Alphaproteobacteria: Rhodobacterales, Rhizobiales and Caulobacterales. Besides various replication and stabilization systems, many of the cassettes also contain selective markers appropriate for Alphaproteobacteria (40 cassettes) and genetic modules responsible for mobilization for conjugal transfer (24 cassettes). All the DIY cassettes are bordered by different types of polylinkers, which facilitate vector construction. Using these DIY cassettes, we have created a set of compatible Escherichia coli-Alphaproteobacteria mobilizable shuttle vectors (high or low copy number in E. coli), which will greatly assist the genetic manipulation of Alphaproteobacteria. Copyright © 2011 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Odell, John H.
A school construction guide offers key personnel in school development projects information on the complex task of master planning and construction of schools in Australia. This chapter of the guide provides advice on how to set up a master planning team and establish a plan for quickly completing the building process. It provides an overview of…
Ground-water conditions in Utah, spring of 2007
Burden, Carole B.; Allen, David V.; Danner, M.R.; Enright, Michael; Cillessen, J.L.; Gerner, S.J.; Eacret, Robert J.; Downhour, Paul; Slaugh, Bradley A.; Swenson, Robert L.; Howells, James H.; Christiansen, Howard K.; Fisher, Martel J.
2007-01-01
This is the forty-fourth in a series of annual reports that describe ground-water conditions in Utah. Reports in this series, published cooperatively by the U.S. Geological Survey and the Utah Department of Natural Resources, Division of Water Resources and Division of Water Rights, and the Utah Department of Environmental Quality, Division of Water Quality, provide data to enable interested parties to maintain awareness of changing ground-water conditions. This report, like the others in the series, contains information on well construction, ground-water withdrawal from wells, water-level changes, precipitation, streamflow, and chemical quality of water. Information on well construction included in this report refers only to wells constructed for new appropriations of ground water. Supplementary data are included in reports of this series only for those years or areas which are important to a discussion of changing ground-water conditions and for which applicable data are available. This report includes individual discussions of selected significant areas of ground-water development in the State for calendar year 2006. Most of the reported data were collected by the U.S. Geological Survey in cooperation with the Utah Department of Natural Resources, Division of Water Resources and Division of Water Rights, and the Utah Department of Environmental Quality, Division of Water Quality. This report is available online at http://www.waterrights.utah.gov/ and http://ut.water.usgs.gov/newUTAH/GW2007.pdf.
Ground-water conditions in Utah, spring of 2008
Burden, Carole B.; Allen, David V.; Danner, M.R.; Fisher, Martel J.; Freeman, Michael L.; Downhour, Paul; Wilkowske, C.D.; Eacret, Robert J.; Enright, Michael; Swenson, Robert L.; Howells, James H.; Christiansen, Howard K.
2008-01-01
This is the forty-fifth in a series of annual reports that describe ground-water conditions in Utah. Reports in this series, published cooperatively by the U.S. Geological Survey and the Utah Department of Natural Resources, Division of Water Resources and Division of Water Rights, and the Utah Department of Environmental Quality, Division of Water Quality, provide data to enable interested parties to maintain awareness of changing ground-water conditions. This report, like the others in the series, contains information on well construction, ground-water withdrawal from wells, water-level changes, precipitation, streamflow, and chemical quality of water. Information on well construction included in this report refers only to wells constructed for new appropriations of ground water. Supplementary data are included in reports of this series only for those years or areas which are important to a discussion of changing ground-water conditions and for which applicable data are available. This report includes individual discussions of selected significant areas of ground-water development in the State for calendar year 2007. Most of the reported data were collected by the U.S. Geological Survey in cooperation with the Utah Department of Natural Resources, Division of Water Resources and Division of Water Rights, and the Utah Department of Environmental Quality, Division of Water Quality. This report is available online at http://www.waterrights.utah.gov/techinfo/ and http://ut.water.usgs.gov/publications/GW2008.pdf.
Ground-water conditions in Utah, spring of 2009
Burden, Carole B.; Allen, David V.; Rowland, Ryan C.; Fisher, Martel J.; Freeman, Michael L.; Downhour, Paul; Nielson, Ashley; Eacret, Robert J.; Myers, Andrew; Slaugh, Bradley A.; Swenson, Robert L.; Howells, James H.; Christiansen, Howard K.
2009-01-01
This is the forty-sixth in a series of annual reports that describe ground-water conditions in Utah. Reports in this series, published cooperatively by the U.S. Geological Survey and the Utah Department of Natural Resources, Division of Water Resources and Division of Water Rights, and the Utah Department of Environmental Quality, Division of Water Quality, provide data to enable interested parties to maintain awareness of changing ground-water conditions. This report, like the others in the series, contains information on well construction, ground-water withdrawal from wells, water-level changes, precipitation, streamflow, and chemical quality of water. Information on well construction included in this report refers only to wells constructed for new appropriations of ground water. Supplementary data are included in reports of this series only for those years or areas which are important to a discussion of changing ground-water conditions and for which applicable data are available. This report includes individual discussions of selected significant areas of ground-water development in the State for calendar year 2008. Most of the reported data were collected by the U.S. Geological Survey in cooperation with the Utah Department of Natural Resources, Division of Water Resources and Division of Water Rights, and the Utah Department of Environmental Quality, Division of Water Quality. This report is available online at http://www.waterrights.utah.gov/techinfo/ and http://ut.water.usgs.gov/publications/GW2009.pdf.
Iterative Refinement of a Binding Pocket Model: Active Computational Steering of Lead Optimization
2012-01-01
Computational approaches for binding affinity prediction are most frequently demonstrated through cross-validation within a series of molecules or through performance shown on a blinded test set. Here, we show how such a system performs in an iterative, temporal lead optimization exercise. A series of gyrase inhibitors with known synthetic order formed the set of molecules that could be selected for “synthesis.” Beginning with a small number of molecules, based only on structures and activities, a model was constructed. Compound selection was done computationally, each time making five selections based on confident predictions of high activity and five selections based on a quantitative measure of three-dimensional structural novelty. Compound selection was followed by model refinement using the new data. Iterative computational candidate selection produced rapid improvements in selected compound activity, and incorporation of explicitly novel compounds uncovered much more diverse active inhibitors than strategies lacking active novelty selection. PMID:23046104
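The selection loop described above can be sketched as one round of batch selection: several picks with the highest predicted activity plus several picks maximizing distance from everything chosen so far. Here Tanimoto dissimilarity on binary fingerprints stands in for the paper's 3D structural-novelty measure, and `model` is any fitted regressor with a `.predict` method (both are assumptions of this sketch):

```python
import numpy as np

def select_batch(model, pool_fp, chosen_fp, n_active=5, n_novel=5):
    """One selection round: n_active compounds with the highest predicted
    activity, plus n_novel compounds least similar (Tanimoto) to anything
    already chosen. pool_fp is an array of 0/1 fingerprint rows; chosen_fp
    is a list of fingerprints of previously selected compounds."""
    preds = model.predict(pool_fp)
    by_activity = list(np.argsort(preds)[::-1][:n_active])

    def novelty(fp):
        # 1 - max Tanimoto similarity to the already-chosen set
        sims = [(fp & c).sum() / max((fp | c).sum(), 1) for c in chosen_fp]
        return 1.0 - max(sims) if sims else 1.0

    nov = np.array([novelty(fp) for fp in pool_fp])
    nov[by_activity] = -1.0                      # no double-picking
    by_novelty = list(np.argsort(nov)[::-1][:n_novel])
    return by_activity + by_novelty
```

After "synthesizing" (revealing the activities of) the selected batch, the model is refit on the enlarged set and the round repeats, which is the iterative refinement the abstract describes.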
NASA Astrophysics Data System (ADS)
Vershkov, A. N.; Petrovskaya, M. S.
2016-11-01
The series in ellipsoidal harmonics for derivatives of the Earth's gravity potential are used only on the reference ellipsoid enveloping the Earth due to their very complex mathematical structure. In the current study, series in ellipsoidal harmonics are constructed for the first- and second-order derivatives of the potential at satellite altitudes; their structure is similar to that of the series on the reference ellipsoid. The point P is chosen at an arbitrary satellite altitude; then the ellipsoid of revolution that passes through this point and is confocal to the reference ellipsoid is described. An object-centered coordinate system with the origin at the point P is considered. Using a sequence of transformations, nonsingular series in ellipsoidal harmonics are constructed for the first and second derivatives of the potential in the object-centered coordinate system. These series can be applied to develop a model of the Earth's potential based on combined use of surface gravitational force measurements, data on the satellite orbital position, its acceleration, or measurements of the gravitational force gradients of the first and second order. The technique is applicable to any other planet of the Solar System.
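The key geometric step, finding the confocal ellipsoid of revolution through a given point, reduces to a quadratic in the squared semi-minor axis. A sketch with rounded GRS80-like constants (illustrative values, not those used in the study):

```python
import math

# Rounded GRS80-like reference ellipsoid (illustrative constants)
A_REF = 6378137.0          # semi-major axis, m
B_REF = 6356752.3          # semi-minor axis, m
E2 = A_REF**2 - B_REF**2   # squared linear eccentricity, shared by the confocal family

def confocal_semiminor(x, y, z):
    """Semi-minor axis u of the ellipsoid of revolution through (x, y, z)
    that is confocal to the reference ellipsoid:
        (x^2 + y^2)/(u^2 + E^2) + z^2/u^2 = 1,
    rearranged into a quadratic in u^2 and solved for the positive root."""
    r2, z2 = x * x + y * y, z * z
    p = r2 + z2 - E2
    u2 = 0.5 * (p + math.sqrt(p * p + 4.0 * z2 * E2))
    return math.sqrt(u2)

# A point on the reference ellipsoid itself recovers u = B_REF,
# while a satellite point yields a larger confocal ellipsoid.
u_surface = confocal_semiminor(A_REF, 0.0, 0.0)
u_orbit = confocal_semiminor(A_REF + 400e3, 0.0, 0.0)
```

With the confocal ellipsoid through P in hand, the object-centered expansion at satellite altitude can reuse the same harmonic structure as on the reference ellipsoid, which is the point of the construction.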
Accounting Issues: An Essay Series. Part II--Accounts Receivable
ERIC Educational Resources Information Center
Laux, Judith A.
2007-01-01
This is the second in a series of articles designed to help academics refocus the introductory accounting course on the theoretical underpinnings of accounting. Intended as a supplement for the principles course, this article connects the asset Accounts Receivable to the essential theoretical constructs, discusses the inherent tradeoffs and…