Sample records for time series framework

  1. A general framework for time series data mining based on event analysis: application to the medical domains of electroencephalography and stabilometry.

    PubMed

    Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P

    2014-10-01

    There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains that generate iconographic time series, such as electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Clinical time series prediction: Toward a hierarchical dynamical system framework.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2015-09-01

    Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient condition, the dynamics of a disease, the effects of various patient management interventions, and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines the advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. Copyright © 2014 Elsevier B.V. All rights reserved.
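
    A minimal sketch of the two-level idea, not the authors' exact model: Gaussian process regression smooths irregularly sampled values within fixed windows, and a simple AR(1) transition fitted by least squares links consecutive window-level summaries. All data, window sizes and kernels are illustrative.

```python
# Sketch: per-window Gaussian processes over irregular samples, with an AR(1)
# transition linking window summaries (illustrative stand-in for an LDS).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 10, 80))            # irregular sampling times
y = np.sin(t) + 0.1 * rng.standard_normal(80)  # hypothetical lab-value series

window, summaries = 2.0, []
for start in np.arange(0.0, 10.0, 2.0):
    m = (t >= start) & (t < start + window)
    gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel())
    gp.fit(t[m, None], y[m])
    # summarize the window by the GP posterior mean at its midpoint
    summaries.append(gp.predict([[start + window / 2]])[0])

z = np.asarray(summaries)
# AR(1) transition between window-level states, fit by least squares
a = np.linalg.lstsq(z[:-1, None], z[1:], rcond=None)[0][0]
print("transition coefficient:", round(a, 3), "next summary:", round(a * z[-1], 3))
```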

  3. Clinical time series prediction: towards a hierarchical dynamical system framework

    PubMed Central

    Liu, Zitao; Hauskrecht, Milos

    2014-01-01

    Objective Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient condition, the dynamics of a disease, the effects of various patient management interventions, and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods Our hierarchical dynamical system framework for modeling clinical time series combines the advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results We tested our framework by first learning the time series model from data for the patients in the training set, and then applying the model to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. Conclusion A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. PMID:25534671

  4. A novel water quality data analysis framework based on time-series data mining.

    PubMed

    Deng, Weihui; Wang, Guoyin

    2017-07-01

    The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks in the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly Dissolved Oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. The case study uncovered the relationship between water quality in the mainstream and its tributaries, as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method for mining hidden and valuable knowledge from historical water quality time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.
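
    A hedged sketch of the granulation step using the standard backward normal cloud generator, which summarizes a window by its digital characteristics (Ex, En, He); the paper's two-dimensional clouds and its similarity measure may differ from this simplification.

```python
# Sketch: windows -> normal-cloud descriptors (Ex, En, He) via the standard
# backward cloud generator, then a similarity matrix in the granulated space.
import numpy as np

def backward_cloud(x):
    ex = x.mean()
    en = np.sqrt(np.pi / 2) * np.mean(np.abs(x - ex))
    he = np.sqrt(max(x.var(ddof=1) - en ** 2, 0.0))
    return np.array([ex, en, he])

rng = np.random.default_rng(1)
do = 8 + np.sin(np.linspace(0, 12, 520)) + 0.3 * rng.standard_normal(520)  # DO stand-in

clouds = np.array([backward_cloud(w) for w in do.reshape(-1, 52)])  # yearly windows
dist = np.linalg.norm(clouds[:, None, :] - clouds[None, :, :], axis=-1)
sim = 1.0 / (1.0 + dist)                       # simple similarity measure
print("least similar (most anomalous) window:", sim.sum(axis=0).argmin())
```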

  5. Koopman Operator Framework for Time Series Modeling and Analysis

    NASA Astrophysics Data System (ADS)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator, which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties, without requiring explicit knowledge of the generative model. We also introduce different notions of distance on the space of such model forms, which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance, in conjunction with classical machine learning techniques, to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in a power grid application.
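
    One common data-driven route to such a finite linear Koopman representation is dynamic mode decomposition (DMD) on a delay-embedded series; the sketch below is a generic DMD computation under illustrative parameters, not the paper's specific model forms or distances.

```python
# Sketch: DMD on a Hankel (delay) embedding; Koopman eigenvalues become
# features usable for classification, forecasting, or anomaly detection.
import numpy as np

def hankel(x, rows):
    cols = len(x) - rows + 1
    return np.stack([x[i:i + cols] for i in range(rows)])

rng = np.random.default_rng(2)
n = np.arange(400)
x = np.sin(0.2 * n) + 0.05 * rng.standard_normal(400)

H = hankel(x, rows=20)
X, Y = H[:, :-1], H[:, 1:]
U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = 6                                                 # truncation rank
A = U[:, :r].T @ Y @ Vt[:r].T @ np.diag(1.0 / s[:r])  # reduced linear operator
phases = np.sort(np.abs(np.angle(np.linalg.eigvals(A))))
print("eigenvalue phases (rad/sample), 0.2 expected:", np.round(phases, 3))
```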

  6. A Personalized Predictive Framework for Multivariate Clinical Time Series via Adaptive Model Selection.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2017-11-01

    Building an accurate predictive model of clinical time series for a patient is critical for understanding the patient condition, its dynamics, and optimal patient management. Unfortunately, this process is not straightforward. First, patient-specific variations are typically large, and population-based models derived or learned from many different patients are often unable to support accurate predictions for each individual patient. Moreover, the time series observed for one patient at any point in time may be too short and insufficient to learn a high-quality patient-specific model just from the patient's own data. To address these problems we propose, develop and experiment with a new adaptive forecasting framework for building multivariate clinical time series models for a patient and for supporting patient-specific predictions. The framework relies on an adaptive model switching approach that at any point in time selects the most promising time series model out of a pool of many possible models, and consequently combines the advantages of population, patient-specific and short-term individualized predictive models. We demonstrate that the adaptive model switching framework is a very promising approach to support personalized time series prediction, and that it is able to outperform predictions based on pure population and patient-specific models, as well as other patient-specific model adaptation strategies.
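
    A minimal sketch of the switching idea: keep a pool of simple forecasters (a fixed population-level value, last value, local trend) and predict at each step with whichever had the lowest recent error. The paper's model pool and selection criterion are considerably richer; the series and pool here are hypothetical.

```python
# Sketch of adaptive model switching: at each step, predict with the pool
# member that had the smallest mean absolute error over the last 10 steps.
import numpy as np

rng = np.random.default_rng(3)
y = np.cumsum(rng.standard_normal(200)) + 50  # hypothetical patient series

pool = {
    "population": lambda h: 50.0,                     # population-level value
    "last-value": lambda h: h[-1],                    # short-term individualized
    "trend":      lambda h: h[-1] + (h[-1] - h[-2]),  # local linear trend
}
recent_err = {name: [] for name in pool}
preds = []
for i in range(5, len(y)):
    hist = y[:i]
    best = min(pool, key=lambda n: np.mean(recent_err[n][-10:] or [np.inf]))
    preds.append(pool[best](hist))
    for name, f in pool.items():
        recent_err[name].append(abs(f(hist) - y[i]))

print("switching MAE:", np.mean(np.abs(np.array(preds) - y[5:])))
```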

  7. Mining Recent Temporal Patterns for Event Detection in Multivariate Time Series Data

    PubMed Central

    Batal, Iyad; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2015-01-01

    Improving the performance of classifiers using pattern mining techniques has been an active topic of data mining research. In this work we introduce the recent temporal pattern mining framework for finding predictive patterns for monitoring and event detection problems in complex multivariate time series data. This framework first converts time series into time-interval sequences of temporal abstractions. It then constructs more complex temporal patterns backwards in time using temporal operators. We apply our framework to health care data of 13,558 diabetic patients and show its benefits by efficiently finding useful patterns for detecting and diagnosing adverse medical conditions that are associated with diabetes. PMID:25937993
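
    A small sketch of the framework's first step, converting a numeric series into a time-interval sequence of value abstractions; the thresholds and data are hypothetical, and the backward pattern construction with temporal operators is not reproduced here.

```python
# Sketch: abstract a numeric series into a time-interval sequence of states
# (low/normal/high), the representation temporal patterns are mined from.
import numpy as np

def abstract_intervals(t, x, low, high):
    """Merge consecutive samples with the same state into intervals."""
    states = np.where(x < low, "low", np.where(x > high, "high", "normal"))
    intervals, start = [], 0
    for i in range(1, len(states) + 1):
        if i == len(states) or states[i] != states[start]:
            intervals.append((t[start], t[i - 1], states[start]))
            start = i
    return intervals

t = np.arange(12)
glucose = np.array([90, 95, 150, 160, 155, 100, 98, 60, 58, 92, 97, 99])
for begin, end, state in abstract_intervals(t, glucose, low=70, high=140):
    print(f"[{begin}, {end}] {state}")
```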

  8. A Framework and Algorithms for Multivariate Time Series Analytics (MTSA): Learning, Monitoring, and Recommendation

    ERIC Educational Resources Information Center

    Ngan, Chun-Kit

    2013-01-01

    Making decisions over multivariate time series is an important topic which has gained significant interest in the past decade. A time series is a sequence of data points which are measured and ordered over uniform time intervals. A multivariate time series is a set of multiple, related time series in a particular domain in which domain experts…

  9. Semi-autonomous remote sensing time series generation tool

    NASA Astrophysics Data System (ADS)

    Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher

    2017-10-01

    High spatial and temporal resolution data are vital for crop monitoring and phenology change detection. Due to the lack of suitable satellite architecture and frequent cloud cover, the availability of daily high-spatial-resolution data is still far from reality. Remote sensing time series generation of high spatial and temporal resolution data by data fusion seems to be a practical alternative. However, it is not an easy process, since it involves multiple steps and also requires multiple tools. In this paper, a Geographic Information System (GIS) based tool is presented for semi-autonomous time series generation. This tool eliminates these difficulties by automating all the steps and enables users to generate synthetic time series data with ease. Firstly, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Then two main frameworks are created, one to perform all the pre-processing steps on various satellite data and the other to perform data fusion to generate time series. The two frameworks can be used individually to perform specific tasks or combined to perform both processes in one go. The tool can handle most known geo data formats currently available, which makes it a generic tool for time series generation from various remote sensing satellite data. It is developed as a common platform with a good interface that provides many functions to enable further development of more remote sensing applications. A detailed description of the capabilities and advantages of the frameworks is given in this paper.

  10. Clustering of financial time series

    NASA Astrophysics Data System (ADS)

    D'Urso, Pierpaolo; Cappelli, Carmela; Di Lallo, Dario; Massari, Riccardo

    2013-05-01

    This paper addresses the topic of classifying financial time series in a fuzzy framework, proposing two fuzzy clustering models both based on GARCH models. In general, clustering of financial time series, due to their peculiar features, requires the definition of suitable distance measures. To this end, the first fuzzy clustering model exploits the autoregressive representation of GARCH models and employs, in the framework of a partitioning around medoids algorithm, the classical autoregressive metric. The second fuzzy clustering model, also based on the partitioning around medoids algorithm, uses the Caiado distance, a Mahalanobis-like distance based on estimated GARCH parameters and covariances, which takes into account information about the volatility structure of the time series. In order to illustrate the merits of the proposed fuzzy approaches, an application to the problem of classifying 29 time series of Euro exchange rates against international currencies is presented and discussed, also comparing the fuzzy models with their crisp versions.
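
    A hedged sketch of the parameter-based distance idea using the Python `arch` package: fit a GARCH(1,1) to each series and compare estimated parameter vectors. Plain Euclidean distance and a crisp medoid assignment stand in for the paper's AR/Caiado metrics and fuzzy partitioning around medoids; the return series are synthetic.

```python
# Sketch: GARCH(1,1) parameter vectors per series (via `arch`), then pairwise
# distances and a crisp nearest-medoid assignment for illustration.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(4)
# hypothetical stand-ins for return series from two volatility regimes
series = [s * rng.standard_normal(500) for s in (0.5, 0.6, 2.0, 2.2)]

params = []
for y in series:
    res = arch_model(y, vol="GARCH", p=1, q=1, rescale=False).fit(disp="off")
    params.append(res.params[["omega", "alpha[1]", "beta[1]"]].to_numpy())
params = np.array(params)

dist = np.linalg.norm(params[:, None, :] - params[None, :, :], axis=-1)
medoids = [0, 2]  # two illustrative medoids
print("cluster of each series:", dist[:, medoids].argmin(axis=1))
```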

  11. Multichannel biomedical time series clustering via hierarchical probabilistic latent semantic analysis.

    PubMed

    Wang, Jin; Sun, Xiangping; Nahavandi, Saeid; Kouzani, Abbas; Wu, Yuchuan; She, Mary

    2014-11-01

    Biomedical time series clustering that automatically groups a collection of time series according to their internal similarity is of importance for medical record management and inspection, such as bio-signal archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, i.e., the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), which was originally developed for visual motion analysis, to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to variations of parameters, including the length of local segments and the dictionary size. Although the experimental evaluation used multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for clustering multichannel biomedical time series according to their structural similarity, which has many applications in biomedical time series management. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  12. Modeling sports highlights using a time-series clustering framework and model interpretation

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Regunathan; Otsuka, Isao; Xiong, Ziyou; Divakaran, Ajay

    2005-01-01

    In our past work on sports highlights extraction, we have shown the utility of detecting audience reaction using an audio classification framework. The audio classes in that framework were chosen based on intuition. In this paper, we present a systematic way of identifying the key audio classes for sports highlights extraction using a time series clustering framework. We treat the low-level audio features as a time series and model the highlight segments as "unusual" events in a background of a "usual" process. The set of audio classes to characterize the sports domain is then identified by analyzing the consistent patterns in each of the clusters output from the time series clustering framework. The distribution of features from the training data so obtained for each of the key audio classes is parameterized by a Minimum Description Length Gaussian Mixture Model (MDL-GMM). We also interpret the meaning of each of the mixture components of the MDL-GMM for the key audio class (the "highlight" class) that is correlated with highlight moments. Our results show that the "highlight" class is a mixture of audience cheering and the commentator's excited speech. Furthermore, we show that the precision-recall performance for highlights extraction based on this "highlight" class is better than that of our previous approach, which uses only audience cheering as the key highlight class.

  13. Integrating field plots, lidar, and landsat time series to provide temporally consistent annual estimates of biomass from 1990 to present

    Treesearch

    Warren B. Cohen; Hans-Erik Andersen; Sean P. Healey; Gretchen G. Moisen; Todd A. Schroeder; Christopher W. Woodall; Grant M. Domke; Zhiqiang Yang; Robert E. Kennedy; Stephen V. Stehman; Curtis Woodcock; Jim Vogelmann; Zhe Zhu; Chengquan Huang

    2015-01-01

    We are developing a system that provides temporally consistent biomass estimates for national greenhouse gas inventory reporting to the United Nations Framework Convention on Climate Change. Our model-assisted estimation framework relies on remote sensing to scale from plot measurements to lidar strip samples, to Landsat time series-based maps. As a demonstration, new...

  14. From Networks to Time Series

    NASA Astrophysics Data System (ADS)

    Shimada, Yutaka; Ikeguchi, Tohru; Shigehara, Takaomi

    2012-10-01

    In this Letter, we propose a framework to transform a complex network into a time series. The transformation from complex networks to time series is realized by classical multidimensional scaling. Applying the transformation method to a model proposed by Watts and Strogatz [Nature (London) 393, 440 (1998)], we show that ring lattices are transformed into periodic time series, small-world networks into noisy periodic time series, and random networks into random time series. We also show that these relationships hold analytically, by using circulant-matrix theory and the perturbation theory of linear operators. The results are generalized to several high-dimensional lattices.
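
    A sketch of the transformation under stated assumptions: shortest-path distances on a small-world graph, classical MDS via double-centering and eigendecomposition, and one embedding coordinate read along the node ordering as the resulting series. Graph sizes and parameters are illustrative.

```python
# Sketch: network -> distance matrix -> classical MDS -> coordinate series.
import numpy as np
import networkx as nx

G = nx.connected_watts_strogatz_graph(n=100, k=4, p=0.05, seed=0)
n = G.number_of_nodes()
D = np.zeros((n, n))
for i, lengths in nx.all_pairs_shortest_path_length(G):
    for j, d in lengths.items():
        D[i, j] = d

# classical MDS: double-center the squared distances, then eigendecompose
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J
w, V = np.linalg.eigh(B)                        # ascending eigenvalues
coords = V[:, ::-1] * np.sqrt(np.maximum(w[::-1], 0.0))

x = coords[:, 0]  # leading coordinate, read along node labels, as a series
print("first five values of the derived time series:", np.round(x[:5], 3))
```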

  15. Non-parametric directionality analysis - Extension for removal of a single common predictor and application to time series.

    PubMed

    Halliday, David M; Senik, Mohd Harizal; Stevenson, Carl W; Mason, Rob

    2016-08-01

    The ability to infer network structure from multivariate neuronal signals is central to computational neuroscience. Directed network analyses typically use parametric approaches based on auto-regressive (AR) models, where networks are constructed from estimates of AR model parameters. However, the validity of using low order AR models for neurophysiological signals has been questioned. A recent article introduced a non-parametric approach to estimate directionality in bivariate data; non-parametric approaches are free from concerns over model validity. We extend the non-parametric framework to include measures of directed conditional independence, using scalar measures that decompose the overall partial correlation coefficient summatively by direction, and a set of functions that decompose the partial coherence summatively by direction. A time domain partial correlation function allows both time and frequency views of the data to be constructed. The conditional independence estimates are conditioned on a single predictor. The framework is applied to simulated cortical neuron networks and mixtures of Gaussian time series data with known interactions, and to experimental data consisting of local field potential recordings from bilateral hippocampus in anaesthetised rats. The framework offers a novel non-parametric approach to the estimation of directed interactions in multivariate neuronal recordings, with increased flexibility in dealing with both spike train and time series data. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Detecting and characterizing high-frequency oscillations in epilepsy: a case study of big data analysis

    NASA Astrophysics Data System (ADS)

    Huang, Liang; Ni, Xuan; Ditto, William L.; Spano, Mark; Carney, Paul R.; Lai, Ying-Cheng

    2017-01-01

    We develop a framework to uncover and analyse dynamical anomalies from massive, nonlinear and non-stationary time series data. The framework consists of three steps: preprocessing of massive datasets to eliminate erroneous data segments, application of the empirical mode decomposition and Hilbert transform paradigm to obtain the fundamental components embedded in the time series at distinct time scales, and statistical/scaling analysis of the components. As a case study, we apply our framework to detecting and characterizing high-frequency oscillations (HFOs) from a big database of rat electroencephalogram recordings. We find a striking phenomenon: HFOs exhibit on-off intermittency that can be quantified by algebraic scaling laws. Our framework can be generalized to big data-related problems in other fields such as large-scale sensor data and seismic data analysis.
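
    A hedged sketch of the decomposition step using the EMD-signal package (PyEMD) and SciPy's Hilbert transform: extract the fastest intrinsic mode function and flag amplitude bursts, a crude stand-in for the paper's HFO detection and on-off intermittency analysis. The signal and threshold are synthetic.

```python
# Sketch: EMD + Hilbert envelope of the fastest IMF, thresholded to label
# burst ("on") episodes in a synthetic EEG-like trace.
import numpy as np
from PyEMD import EMD
from scipy.signal import hilbert

rng = np.random.default_rng(5)
t = np.linspace(0, 2, 2000)                     # 2 s at 1 kHz
eeg = np.sin(2 * np.pi * 8 * t) + 0.2 * rng.standard_normal(t.size)
eeg[800:900] += 0.8 * np.sin(2 * np.pi * 180 * t[800:900])  # injected HFO burst

imfs = EMD()(eeg)                               # IMFs, fastest oscillation first
envelope = np.abs(hilbert(imfs[0]))             # instantaneous amplitude
on = envelope > 3 * np.median(envelope)         # crude on/off labelling
print("burst detected around samples:", np.flatnonzero(on)[[0, -1]])
```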

  17. Visualizing the intercity correlation of PM2.5 time series in the Beijing-Tianjin-Hebei region using ground-based air quality monitoring data.

    PubMed

    Liu, Jianzheng; Li, Weifeng; Wu, Jiansheng; Liu, Yonghong

    2018-01-01

    The Beijing-Tianjin-Hebei area faces a severe fine particulate matter (PM2.5) problem. To date, considerable progress has been made toward understanding the PM2.5 problem, including spatial-temporal characterization, driving factors, and health effects. However, little research has been done on the dynamic interactions and relationships between PM2.5 concentrations in different cities in this area. To address the research gap, this study discovered a phenomenon of time-lagged intercity correlations of PM2.5 time series and proposed a visualization framework based on this phenomenon to visualize the interaction in PM2.5 concentrations between cities. The visualizations produced using the framework show that there are significant time-lagged correlations between the PM2.5 time series in different cities in this area. The visualizations also show that the correlations are more significant in colder months and between cities that are closer, and that there are seasonal changes in the temporal order of the correlated PM2.5 time series. Further analysis suggests that the time-lagged intercity correlations of PM2.5 time series are most likely due to synoptic meteorological variations. We argue that the visualizations demonstrate the interactions of air pollution between cities in the Beijing-Tianjin-Hebei area and the significant effect of synoptic meteorological conditions on PM2.5 pollution. The visualization framework could help determine the pathway of regional transportation of air pollution and may also be useful in delineating the area of interaction of PM2.5 pollution for impact analysis.
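
    The core computation behind such visualizations is a lagged correlation between city pairs; a minimal sketch with synthetic stand-ins for two cities' PM2.5 series:

```python
# Sketch: Pearson correlation between two cities' series over a range of time
# lags, reporting the lag with the strongest intercity correlation.
import numpy as np

rng = np.random.default_rng(6)
base = np.convolve(rng.standard_normal(400), np.ones(12) / 12, mode="same")
city_a = base + 0.1 * rng.standard_normal(400)
city_b = np.roll(base, 3) + 0.1 * rng.standard_normal(400)  # lags by 3 steps

def lagged_corr(a, b, lag):
    if lag > 0:
        a, b = a[:-lag], b[lag:]
    elif lag < 0:
        a, b = a[-lag:], b[:lag]
    return np.corrcoef(a, b)[0, 1]

lags = range(-12, 13)
r = [lagged_corr(city_a, city_b, k) for k in lags]
best = max(zip(lags, r), key=lambda p: p[1])
print(f"strongest correlation r={best[1]:.2f} at lag {best[0]} steps")
```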

  18. Visualizing the intercity correlation of PM2.5 time series in the Beijing-Tianjin-Hebei region using ground-based air quality monitoring data

    PubMed Central

    Li, Weifeng; Wu, Jiansheng; Liu, Yonghong

    2018-01-01

    The Beijing-Tianjin-Hebei area faces a severe fine particulate matter (PM2.5) problem. To date, considerable progress has been made toward understanding the PM2.5 problem, including spatial-temporal characterization, driving factors, and health effects. However, little research has been done on the dynamic interactions and relationships between PM2.5 concentrations in different cities in this area. To address the research gap, this study discovered a phenomenon of time-lagged intercity correlations of PM2.5 time series and proposed a visualization framework based on this phenomenon to visualize the interaction in PM2.5 concentrations between cities. The visualizations produced using the framework show that there are significant time-lagged correlations between the PM2.5 time series in different cities in this area. The visualizations also show that the correlations are more significant in colder months and between cities that are closer, and that there are seasonal changes in the temporal order of the correlated PM2.5 time series. Further analysis suggests that the time-lagged intercity correlations of PM2.5 time series are most likely due to synoptic meteorological variations. We argue that the visualizations demonstrate the interactions of air pollution between cities in the Beijing-Tianjin-Hebei area and the significant effect of synoptic meteorological conditions on PM2.5 pollution. The visualization framework could help determine the pathway of regional transportation of air pollution and may also be useful in delineating the area of interaction of PM2.5 pollution for impact analysis. PMID:29438417

  19. JTSA: an open source framework for time series abstractions.

    PubMed

    Sacchi, Lucia; Capozzi, Davide; Bellazzi, Riccardo; Larizza, Cristiana

    2015-10-01

    The evaluation of the clinical status of a patient is frequently based on the temporal evolution of some parameters, making the detection of temporal patterns a priority in data analysis. Temporal abstraction (TA) is a methodology widely used in medical reasoning for summarizing and abstracting longitudinal data. This paper describes JTSA (Java Time Series Abstractor), a framework including a library of algorithms for time series preprocessing and abstraction and an engine to execute a workflow for temporal data processing. The JTSA framework is grounded on a comprehensive ontology that models temporal data processing both from the data storage and the abstraction computation perspective. The JTSA framework is designed to allow users to build their own analysis workflows by combining different algorithms. Thanks to the modular structure of a workflow, simple to highly complex patterns can be detected. The JTSA framework has been developed in Java 1.7 and is distributed under GPL as a jar file. JTSA provides: a collection of algorithms to perform temporal abstraction and preprocessing of time series, a framework for defining and executing data analysis workflows based on these algorithms, and a GUI for workflow prototyping and testing. The whole JTSA project relies on a formal model of the data types and of the algorithms included in the library. This model is the basis for the design and implementation of the software application. Taking into account this formalized structure, the user can easily extend the JTSA framework by adding new algorithms. Results are shown in the context of the EU project MOSAIC, where JTSA is used to extract relevant patterns from data related to the long-term monitoring of diabetic patients. The proof that JTSA is a versatile tool that can be adapted to different needs is given by its possible uses, both as a standalone tool for data summarization and as a module to be embedded into other architectures to select specific phenotypes based on TAs in a large dataset. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  20. Automatising the analysis of stochastic biochemical time-series

    PubMed Central

    2015-01-01

    Background Mathematical and computational modelling of biochemical systems has seen a lot of effort devoted to the definition and implementation of high-performance mechanistic simulation frameworks. Within these frameworks it is possible to analyse complex models under a variety of configurations, eventually selecting the best setting of, e.g., parameters for a target system. Motivation This operational pipeline relies on the ability to interpret the predictions of a model, often represented as simulation time-series. Thus, an efficient data analysis pipeline is crucial to automatise time-series analyses, bearing in mind that errors in this phase might mislead the modeller's conclusions. Results For this reason we have developed an intuitive framework-independent Python tool to automate analyses common to a variety of modelling approaches. These include assessment of useful non-trivial statistics for simulation ensembles, e.g., estimation of master equations. Intuitive and domain-independent batch scripts will allow the researcher to automatically prepare reports, thus speeding up the usual model-definition, testing and refinement pipeline. PMID:26051821

  21. Registration of 4D time-series of cardiac images with multichannel Diffeomorphic Demons.

    PubMed

    Peyrat, Jean-Marc; Delingette, Hervé; Sermesant, Maxime; Pennec, Xavier; Xu, Chenyang; Ayache, Nicholas

    2008-01-01

    In this paper, we propose a generic framework for intersubject non-linear registration of 4D time-series images. In this framework, spatio-temporal registration is defined by mapping trajectories of physical points as opposed to spatial registration that solely aims at mapping homologous points. First, we determine the trajectories we want to register in each sequence using a motion tracking algorithm based on the Diffeomorphic Demons algorithm. Then, we perform simultaneously pairwise registrations of corresponding time-points with the constraint to map the same physical points over time. We show this trajectory registration can be formulated as a multichannel registration of 3D images. We solve it using the Diffeomorphic Demons algorithm extended to vector-valued 3D images. This framework is applied to the inter-subject non-linear registration of 4D cardiac CT sequences.

  22. A Spatiotemporal Prediction Framework for Air Pollution Based on Deep RNN

    NASA Astrophysics Data System (ADS)

    Fan, J.; Li, Q.; Hou, J.; Feng, X.; Karimian, H.; Lin, S.

    2017-10-01

    Time series data in practical applications always contain missing values due to sensor malfunction, network failure, outliers, etc. In order to handle missing values in time series, as well as the lack of consideration of temporal properties in machine learning models, we propose a spatiotemporal prediction framework based on missing value processing algorithms and a deep recurrent neural network (DRNN). By using a missing tag and missing interval to represent time series patterns, we implement three different missing value fixing algorithms, which are further incorporated into a deep neural network that consists of LSTM (Long Short-term Memory) layers and fully connected layers. Real-world air quality and meteorological datasets (Jingjinji area, China) are used for model training and testing. Deep feed forward neural networks (DFNN) and gradient boosting decision trees (GBDT) are trained as baseline models against the proposed DRNN. The performance of the three missing value fixing algorithms, as well as of the different machine learning models, is evaluated and analysed. Experiments show that the proposed DRNN framework outperforms both DFNN and GBDT, thereby validating the capacity of the proposed framework. Our results also provide useful insights for better understanding of the different strategies that handle missing values.
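
    A minimal PyTorch sketch of the input encoding the abstract describes, assuming each step carries the imputed value, a missing tag, and the interval since the last observation; the network is untrained and sized for shape illustration only.

```python
# Sketch: (value, missing tag, missing interval) features fed to an LSTM.
import torch
import torch.nn as nn

# one variable, five time steps, NaN marks missing sensor readings
x = torch.tensor([3.1, float("nan"), float("nan"), 2.7, 2.9])
mask = (~torch.isnan(x)).float()     # missing tag: 1 observed, 0 missing
delta = torch.zeros_like(x)          # missing interval since last observation
filled = x.clone()
for t in range(1, len(x)):
    if mask[t] == 0:
        delta[t] = delta[t - 1] + 1
        filled[t] = filled[t - 1]    # carry last observation forward
features = torch.stack([filled, mask, delta], dim=-1).unsqueeze(0)  # (1, T, 3)

lstm = nn.LSTM(input_size=3, hidden_size=16, batch_first=True)
head = nn.Linear(16, 1)
out, _ = lstm(features)
print("next-step prediction (untrained):", head(out[:, -1]).item())
```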

  23. Phenological classification of the United States: A geographic framework for extending multi-sensor time-series data

    USGS Publications Warehouse

    Gu, Yingxin; Brown, Jesslyn F.; Miura, Tomoaki; van Leeuwen, Willem J.D.; Reed, Bradley C.

    2010-01-01

    This study introduces a new geographic framework, phenological classification, for the conterminous United States based on Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) time-series data and a digital elevation model. The resulting pheno-class map comprises 40 pheno-classes, each having unique phenological and topographic characteristics. Cross-comparison of the pheno-classes with the 2001 National Land Cover Database indicates that the new map contains additional phenological and climate information. The pheno-class framework may be a suitable basis for the development of an Advanced Very High Resolution Radiometer (AVHRR)-MODIS NDVI translation algorithm and for various biogeographic studies.

  24. A Temporal Mining Framework for Classifying Un-Evenly Spaced Clinical Data: An Approach for Building Effective Clinical Decision-Making System.

    PubMed

    Jane, Nancy Yesudhas; Nehemiah, Khanna Harichandran; Arputharaj, Kannan

    2016-01-01

    Clinical time-series data acquired from electronic health records (EHR) are liable to temporal complexities such as irregular observations, missing values and time-constrained attributes that make the knowledge discovery process challenging. This paper presents a temporal rough set induced neuro-fuzzy (TRiNF) mining framework that handles these complexities and builds an effective clinical decision-making system. TRiNF provides two functionalities, namely temporal data acquisition (TDA) and temporal classification. In TDA, a time-series forecasting model is constructed by adopting an improved double exponential smoothing method. The forecasting model is used in missing value imputation and temporal pattern extraction. The relevant attributes are selected using a temporal pattern based rough set approach. In temporal classification, a classification model is built with the selected attributes using a temporal pattern induced neuro-fuzzy classifier. For experimentation, this work uses two clinical time series datasets of hepatitis and thrombosis patients. The experimental results show that with the proposed TRiNF framework there is a significant reduction in the error rate, yielding an average classification accuracy of 92.59% on the hepatitis dataset and 91.69% on the thrombosis dataset. The obtained classification results prove the efficiency of the proposed framework in terms of its improved classification accuracy.
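
    A sketch of the TDA forecasting component, assuming plain Holt double exponential smoothing (the paper adopts an improved variant); smoothing constants and the series are illustrative.

```python
# Sketch: Holt's double exponential smoothing, used here to impute the next
# (missing) observation from the observed history.
def holt(series, alpha=0.5, beta=0.3):
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev) + (1 - beta) * trend
    return level + trend  # one-step-ahead forecast

observed = [36.8, 37.0, 37.4, 37.9, 38.1]   # e.g. a vital-sign series
print("imputed next value:", round(holt(observed), 2))
```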

  25. Reliability prediction of ontology-based service compositions using Petri net and time series models.

    PubMed

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets the quantitative quality requirement. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the Non-Markovian stochastic Petri net (NMSPN) and the time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy.
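
    A hedged sketch of the ARMA step with statsmodels: fit an ARMA(1,1) to a hypothetical response-time history and convert forecasts to firing rates, assuming rate ≈ 1/response time; the NMSPN mapping is not reproduced.

```python
# Sketch: ARMA(1,1) fit to a response-time history, forecasts converted to
# firing rates under the assumption rate = 1 / response time.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(7)
resp_ms = 120 + np.cumsum(rng.standard_normal(100))   # response times in ms

fit = ARIMA(resp_ms, order=(1, 0, 1)).fit()           # ARMA(1,1) with constant
forecast = fit.forecast(steps=3)
print("predicted response times (ms):", np.round(forecast, 1))
print("implied firing rates (1/s):", np.round(1000.0 / forecast, 2))
```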

  26. Analysis of air pollution mortality in terms of life expectancy changes: relation between time series, intervention, and cohort studies.

    PubMed

    Rabl, Ari

    2006-02-01

    Information on life expectancy change is of great concern for policy makers, as evidenced by discussions of the so-called "harvesting" issue (i.e., the question of how large a loss of life each death in the mortality results of time series studies corresponds to). Whereas most epidemiological studies of air pollution mortality have been formulated in terms of mortality risk, this paper shows that a formulation in terms of life expectancy change is mathematically equivalent, but offers several advantages: it automatically takes into account the constraint that everybody dies exactly once, regardless of pollution; it provides a unified framework for time series, intervention studies and cohort studies; and in time series and intervention studies, it yields the life expectancy change directly as a time integral of the observed mortality rate. Results are presented for life expectancy change in time series studies. Determination of the corresponding total number of attributable deaths (as opposed to the number of observed deaths) is shown to be problematic. The time variation of mortality after a change in exposure is shown to depend on the processes by which the body can repair air pollution damage, in particular on their time constants. Hypothetical results are presented for repair models that are plausible in view of the available intervention studies of air pollution and of smoking cessation. If these repair models can also be assumed for acute effects, the results of cohort studies are compatible with those of time series. The proposed life expectancy framework provides information on the life expectancy change in time series studies, and it clarifies the relation between the results of time series, intervention, and cohort studies.
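
    To make the "time integral" relation concrete, the standard life-table identities (notation mine, not necessarily the paper's) connect the mortality rate to survival and life expectancy:

```latex
% Survival from the age-specific mortality hazard, and life expectancy
% at birth as the integral of survival.
\[
  S(t) = \exp\!\Big(-\int_0^{t} \mu(u)\,\mathrm{d}u\Big),
  \qquad
  e_0 = \int_0^{\infty} S(t)\,\mathrm{d}t,
\]
% so a change in the mortality hazard shifts life expectancy by
\[
  \Delta e_0 = \int_0^{\infty}
  \Big[ e^{-\int_0^{t} \left(\mu(u)+\Delta\mu(u)\right)\mathrm{d}u}
      - e^{-\int_0^{t} \mu(u)\,\mathrm{d}u} \Big]\,\mathrm{d}t .
\]
```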

  27. Time series evapotranspiration maps at a regional scale: A methodology, evaluation, and their use in water resources management

    NASA Astrophysics Data System (ADS)

    Gowda, P. H.

    2016-12-01

    Evapotranspiration (ET) is an important process in an ecosystem's water budget and is closely linked to its productivity. Therefore, regional-scale daily time series ET maps developed at high and medium resolutions have large utility in studying the carbon-energy-water nexus and managing water resources. There are efforts to develop such datasets on regional to global scales, but they often face the limitation of spatial-temporal resolution tradeoffs in satellite remote sensing technology. In this study, we developed frameworks for generating high and medium resolution daily ET maps from Landsat and MODIS (Moderate Resolution Imaging Spectroradiometer) data, respectively. For developing high resolution (30-m) daily time series ET maps with Landsat TM data, the series version of the Two Source Energy Balance (TSEB) model was used to compute sensible and latent heat fluxes of soil and canopy separately. Landsat 5 (2000-2011) and Landsat 8 (2013-2014) imagery for path/rows 28/35 and 27/36 covering central Oklahoma was used. MODIS data (2001-2014) covering Oklahoma and the Texas Panhandle were used to develop medium resolution (250-m) time series daily ET maps with the SEBS (Surface Energy Balance System) model. An extensive network of weather stations managed by the Texas High Plains ET Network and Oklahoma Mesonet was used to generate spatially interpolated inputs of air temperature, relative humidity, wind speed, solar radiation, pressure, and reference ET. A linear interpolation sub-model was used to estimate the daily ET between image acquisition days. Accuracy assessment of the daily ET maps was done against eddy covariance data from two grassland sites at El Reno, OK. Statistical results indicated good performance by the modeling frameworks developed for deriving time series ET maps. The results indicate that the proposed ET mapping framework is suitable for deriving daily time series ET maps at regional scale with Landsat and MODIS data.

  28. Toward a comprehensive landscape vegetation monitoring framework

    NASA Astrophysics Data System (ADS)

    Kennedy, Robert; Hughes, Joseph; Neeti, Neeti; Larrue, Tara; Gregory, Matthew; Roberts, Heather; Ohmann, Janet; Kane, Van; Kane, Jonathan; Hooper, Sam; Nelson, Peder; Cohen, Warren; Yang, Zhiqiang

    2016-04-01

    Blossoming Earth observation resources provide great opportunity to better understand land vegetation dynamics, but also require new techniques and frameworks to exploit their potential. Here, I describe several parallel projects that leverage time-series Landsat imagery to describe vegetation dynamics at regional and continental scales. At the core of these projects are the LandTrendr algorithms, which distill time-series earth observation data into periods of consistent long or short-duration dynamics. In one approach, we built an integrated, empirical framework to blend these algorithmically-processed time-series data with field data and lidar data to ascribe yearly change in forest biomass across the US states of Washington, Oregon, and California. In a separate project, we expanded from forest-only monitoring to full landscape land cover monitoring over the same regional scale, including both categorical class labels and continuous-field estimates. In these and other projects, we apply machine-learning approaches to ascribe all changes in vegetation to driving processes such as harvest, fire, urbanization, etc., allowing full description of both disturbance and recovery processes and drivers. Finally, we are moving toward extension of these same techniques to continental and eventually global scales using Google Earth Engine. Taken together, these approaches provide one framework for describing and understanding processes of change in vegetation communities at broad scales.

  29. A graph-based approach to detect spatiotemporal dynamics in satellite image time series

    NASA Astrophysics Data System (ADS)

    Guttler, Fabio; Ienco, Dino; Nin, Jordi; Teisseire, Maguelonne; Poncelet, Pascal

    2017-08-01

    Enhancing the frequency of satellite acquisitions represents a key issue for the Earth Observation community nowadays. Repeated observations are crucial for monitoring purposes, particularly when intra-annual processes should be taken into account. Time series of images constitute a valuable source of information in these cases. The goal of this paper is to propose a new methodological framework to automatically detect and extract spatiotemporal information from satellite image time series (SITS). Existing methods dealing with such data are usually classification-oriented and cannot provide information about evolutions and temporal behaviors. In this paper we propose a graph-based strategy that combines object-based image analysis (OBIA) with data mining techniques. Image objects computed at each individual timestamp are connected across the time series, generating a set of evolution graphs. Each evolution graph is associated with a particular area within the study site and stores information about its temporal evolution. Such information can be deeply explored at the evolution graph scale or used to compare the graphs and supply a general picture at the study site scale. We validated our framework on two study sites located in the South of France, involving different types of natural, semi-natural and agricultural areas. The results obtained from a Landsat SITS support the quality of the methodological approach and illustrate how the framework can be employed to extract and characterize spatiotemporal dynamics.

  30. Reliability Prediction of Ontology-Based Service Compositions Using Petri Net and Time Series Models

    PubMed Central

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets the quantitative quality requirement. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the Non-Markovian stochastic Petri net (NMSPN) and the time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy. PMID:24688429

  31. hctsa: A Computational Framework for Automated Time-Series Phenotyping Using Massive Feature Extraction.

    PubMed

    Fulcher, Ben D; Jones, Nick S

    2017-11-22

    Phenotype measurements frequently take the form of time series, but we currently lack a systematic method for relating these complex data streams to scientifically meaningful outcomes, such as relating the movement dynamics of organisms to their genotype or measurements of brain dynamics of a patient to their disease diagnosis. Previous work addressed this problem by comparing implementations of thousands of diverse scientific time-series analysis methods in an approach termed highly comparative time-series analysis. Here, we introduce hctsa, a software tool for applying this methodological approach to data. hctsa includes an architecture for computing over 7,700 time-series features and a suite of analysis and visualization algorithms to automatically select useful and interpretable time-series features for a given application. Using exemplar applications to high-throughput phenotyping experiments, we show how hctsa allows researchers to leverage decades of time-series research to quantify and understand informative structure in time-series data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  32. Predicting disease progression from short biomarker series using expert advice algorithm

    NASA Astrophysics Data System (ADS)

    Morino, Kai; Hirata, Yoshito; Tomioka, Ryota; Kashima, Hisashi; Yamanishi, Kenji; Hayashi, Norihiro; Egawa, Shin; Aihara, Kazuyuki

    2015-05-01

    Well-trained clinicians may be able to provide diagnosis and prognosis from very short biomarker series using information and experience gained from previous patients. Although mathematical methods can potentially help clinicians to predict the progression of diseases, there is no method so far that estimates the patient state from very short time-series of a biomarker for making diagnosis and/or prognosis by employing the information of previous patients. Here, we propose a mathematical framework for integrating other patients' datasets to infer and predict the state of the disease in the current patient based on their short history. We extend a machine-learning framework of "prediction with expert advice" to deal with unstable dynamics. We construct this mathematical framework by combining expert advice with a mathematical model of prostate cancer. Our model predicted well the individual biomarker series of patients with prostate cancer that are used as clinical samples.
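
    A minimal sketch of the underlying "prediction with expert advice" scheme, the exponentially weighted average forecaster, with other patients' trajectories acting as the experts; the paper's extension to unstable dynamics is not shown, and all data are synthetic.

```python
# Sketch: exponentially weighted average forecaster over a set of "experts".
import numpy as np

rng = np.random.default_rng(8)
T, n_experts = 30, 4
experts = np.cumsum(rng.standard_normal((n_experts, T)), axis=1)  # advice
truth = experts[2] + 0.1 * rng.standard_normal(T)  # patient resembles expert 2

eta, w = 0.5, np.ones(n_experts) / n_experts
for t in range(T):
    pred = w @ experts[:, t]                # weighted combination of advice
    loss = (experts[:, t] - truth[t]) ** 2  # per-expert squared loss
    w = w * np.exp(-eta * loss)             # exponential weight update
    w /= w.sum()
print("final expert weights (index 2 should dominate):", np.round(w, 3))
```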

  33. Predicting disease progression from short biomarker series using expert advice algorithm.

    PubMed

    Morino, Kai; Hirata, Yoshito; Tomioka, Ryota; Kashima, Hisashi; Yamanishi, Kenji; Hayashi, Norihiro; Egawa, Shin; Aihara, Kazuyuki

    2015-05-20

    Well-trained clinicians may be able to provide diagnosis and prognosis from very short biomarker series using information and experience gained from previous patients. Although mathematical methods can potentially help clinicians to predict the progression of diseases, there is no method so far that estimates the patient state from very short time-series of a biomarker for making diagnosis and/or prognosis by employing the information of previous patients. Here, we propose a mathematical framework for integrating other patients' datasets to infer and predict the state of the disease in the current patient based on their short history. We extend a machine-learning framework of "prediction with expert advice" to deal with unstable dynamics. We construct this mathematical framework by combining expert advice with a mathematical model of prostate cancer. Our model predicted well the individual biomarker series of patients with prostate cancer that are used as clinical samples.

  34. GIAnT - Generic InSAR Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Agram, P.; Jolivet, R.; Riel, B. V.; Simons, M.; Doin, M.; Lasserre, C.; Hetland, E. A.

    2012-12-01

    We present a computing framework for studying the spatio-temporal evolution of ground deformation from interferometric synthetic aperture radar (InSAR) data. Several open-source tools, including the Repeat Orbit Interferometry PACkage (ROI-PAC) and InSAR Scientific Computing Environment (ISCE) from NASA-JPL, and the Delft Object-oriented Repeat Interferometric Software (DORIS), have enabled scientists to generate individual interferograms from raw radar data with relative ease. Numerous computational techniques and algorithms that reduce phase information from multiple interferograms to a deformation time-series have been developed and verified over the past decade. However, the sharing and direct comparison of products from multiple processing approaches has been hindered by: 1) the absence of simple standards for sharing estimated time-series products, 2) the use of proprietary software tools with license restrictions, and 3) the closed-source nature of the exact implementation of many of these algorithms. We have developed this computing framework to address all of the above issues, attempting to take the first steps towards creating a community software repository for InSAR time-series analysis. To date, we have implemented the short baseline subset algorithm (SBAS), NSBAS and multi-scale interferometric time-series (MInTS) in this framework, and the associated source code is included in the GIAnT distribution. A number of the associated routines have been optimized for performance and scalability with large data sets. Some of the new features in our processing framework are: 1) the use of daily solutions from continuous GPS stations to correct for orbit errors, 2) the use of meteorological data sets to estimate the tropospheric delay screen, and 3) a data-driven bootstrapping approach to estimate the uncertainties associated with estimated time-series products. We are currently working on incorporating tidal load corrections for individual interferograms and propagation of noise covariance models through the processing chain for robust estimation of uncertainties in the deformation estimates. We will demonstrate the ease of use of our framework with results ranging from regional-scale analysis around Long Valley, CA and Parkfield, CA to continental-scale analysis in western South America. We will also present preliminary results from a new time-series approach that simultaneously estimates deformation over the complete spatial domain at all time epochs on a distributed computing platform. GIAnT has been developed entirely using open-source tools and uses Python as the underlying platform. We build on the extensive numerical (NumPy) and scientific (SciPy) Python computing libraries to develop an object-oriented, flexible and modular framework for time-series InSAR applications. The toolbox is currently configured to work with outputs from ROI-PAC, ISCE and DORIS, but can easily be extended to support products from other SAR/InSAR processors. The toolbox libraries include support for Hierarchical Data Format (HDF5) memory-mapped files, parallel processing with Python's multiprocessing module, and support for many convex optimization solvers such as CSDP and CVXOPT. An extensive set of routines for dealing with ASCII and XML files has also been included for controlling the processing parameters.
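
    A toy sketch of the SBAS inversion at the heart of such time-series methods: each interferogram constrains the sum of displacement increments between its two acquisition dates, so a least-squares solve recovers the deformation history. Dates, pairs, and noise here are made up.

```python
# Sketch: SBAS-style least squares from interferogram pairs to a
# displacement time series at a single pixel.
import numpy as np

dates = [0, 1, 2, 3, 4]                      # acquisition epochs
pairs = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4)]  # short baselines
true_incr = np.array([2.0, 1.5, 0.5, 1.0])   # mm between successive dates

# design matrix: interferogram (i, j) sums the increments between i and j
A = np.zeros((len(pairs), len(dates) - 1))
for k, (i, j) in enumerate(pairs):
    A[k, i:j] = 1.0
obs = A @ true_incr + 0.1 * np.random.default_rng(9).standard_normal(len(pairs))

incr, *_ = np.linalg.lstsq(A, obs, rcond=None)
print("recovered displacement time series:", np.round(np.cumsum(incr), 2))
```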

  35. Penetration with Long Rods: A Theoretical Framework and Comparison with Instrumented Impacts

    DTIC Science & Technology

    1981-05-01

    program to begin probing the details of the interaction process. The theoretical framework underlying such a program is explained in detail. The theory of...of the time sequence of events during penetration. Data from one series of experiments, reported in detail elsewhere, is presented and discussed within the theoretical framework .

  36. A new approach for measuring power spectra and reconstructing time series in active galactic nuclei

    NASA Astrophysics Data System (ADS)

    Li, Yan-Rong; Wang, Jian-Min

    2018-05-01

    We provide a new approach to measure power spectra and reconstruct time series in active galactic nuclei (AGNs) based on the fact that the Fourier transform of AGN stochastic variations is a series of complex Gaussian random variables. The approach parametrizes a stochastic series in frequency domain and transforms it back to time domain to fit the observed data. The parameters and their uncertainties are derived in a Bayesian framework, which also allows us to compare the relative merits of different power spectral density models. The well-developed fast Fourier transform algorithm together with parallel computation enables an acceptable time complexity for the approach.
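
    A hedged sketch of the premise: draw independent complex Gaussian Fourier coefficients with variance set by an assumed power-law PSD and invert to the time domain (essentially the Timmer and König recipe); the Bayesian fitting step is not reproduced.

```python
# Sketch: generate a series whose Fourier coefficients are independent complex
# Gaussians with variance set by a power-law PSD.
import numpy as np

rng = np.random.default_rng(10)
n, dt, slope = 1024, 1.0, -2.0
freqs = np.fft.rfftfreq(n, dt)[1:]             # positive frequencies
psd = freqs ** slope                           # assumed power-law PSD

coef = np.sqrt(psd / 2) * (rng.standard_normal(freqs.size)
                           + 1j * rng.standard_normal(freqs.size))
lightcurve = np.fft.irfft(np.concatenate([[0.0], coef]), n=n)
print("simulated light curve std:", float(np.round(lightcurve.std(), 4)))
```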

  37. Identification of stress responsive genes by studying specific relationships between mRNA and protein abundance.

    PubMed

    Morimoto, Shimpei; Yahara, Koji

    2018-03-01

    Protein expression is regulated by the production and degradation of mRNAs and proteins, but the specifics of their relationship are controversial. Although technological advances have enabled genome-wide and time-series surveys of mRNA and protein abundance, recent studies have shown paradoxical results, with most statistical analyses being limited to linear correlation, or analysis of variance applied separately to mRNA and protein datasets. Here, using recently analyzed genome-wide time-series data, we have developed a statistical analysis framework for identifying which types of genes or biological gene groups have significant correlation between mRNA and protein abundance after accounting for potential time delays. Our framework stratifies all genes in terms of the extent of time delay, conducts gene clustering in each stratum, and performs a non-parametric statistical test of the correlation between mRNA and protein abundance in a gene cluster. Consequently, we revealed stronger correlations than previously reported between mRNA and protein abundance in two metabolic pathways. Moreover, we identified a pair of stress responsive genes (ADC17 and KIN1) that showed a highly similar time series of mRNA and protein abundance. Furthermore, we confirmed the robustness of the analysis framework by applying it to another genome-wide time-series dataset and identifying a cytoskeleton-related gene cluster (keratin 18, keratin 17, and mitotic spindle positioning) that shows similar correlation. The significant correlation and highly similar changes of mRNA and protein abundance suggest a concerted role of these genes in the cellular stress response, which we consider provides an answer to the question of the specific relationships between mRNA and protein in a cell. In addition, our framework for studying the relationship between mRNAs and proteins in a cell will provide a basis for studying specific relationships between mRNA and protein abundance after accounting for potential time delays.

  18. A deep learning framework for financial time series using stacked autoencoders and long-short term memory.

    PubMed

    Bao, Wei; Yue, Jun; Rao, Yulei

    2017-01-01

    The application of deep learning approaches to finance has received a great deal of attention from both investors and researchers. This study presents a novel deep learning framework in which wavelet transforms (WT), stacked autoencoders (SAEs) and long-short term memory (LSTM) are combined for stock price forecasting. SAEs, which hierarchically extract deep features, are introduced into stock price forecasting for the first time. The deep learning framework comprises three stages. First, the stock price time series is decomposed by WT to eliminate noise. Second, SAEs are applied to generate deep high-level features for predicting the stock price. Third, the high-level denoising features are fed into LSTM to forecast the next day's closing price. Six market indices and their corresponding index futures are chosen to examine the performance of the proposed model. Results show that the proposed model outperforms other similar models in both predictive accuracy and profitability performance.

  19. Learning time series for intelligent monitoring

    NASA Technical Reports Server (NTRS)

    Manganaris, Stefanos; Fisher, Doug

    1994-01-01

    We address the problem of classifying time series according to their morphological features in the time domain. In a supervised machine-learning framework, we induce a classification procedure from a set of preclassified examples. For each class, we infer a model that captures its morphological features using Bayesian model induction and the minimum message length approach to assign priors. In the performance task, we classify a time series into one of the learned classes when there is enough evidence to support that decision. Time series with sufficiently novel features, belonging to classes not present in the training set, are recognized as such. We report results from experiments in a monitoring domain of interest to NASA.

  20. Using missing ordinal patterns to detect nonlinearity in time series data.

    PubMed

    Kulp, Christopher W; Zunino, Luciano; Osborne, Thomas; Zawadzki, Brianna

    2017-08-01

    The number of missing ordinal patterns (NMP) is the number of ordinal patterns that do not appear in a series after it has been symbolized using the Bandt and Pompe methodology. In this paper, the NMP is demonstrated as a test for nonlinearity using a surrogate framework in order to see if the NMP for a series is statistically different from the NMP of iterative amplitude adjusted Fourier transform (IAAFT) surrogates. It is found that the NMP works well as a test statistic for nonlinearity, even in the cases of very short time series. Both model and experimental time series are used to demonstrate the efficacy of the NMP as a test for nonlinearity.
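
    As a concrete illustration of the test statistic, here is a minimal sketch (not the authors' code) of the NMP computation via Bandt-Pompe symbolization; the logistic-map example is a standard toy case, not data from the paper.

    ```python
    import math
    import numpy as np

    def missing_ordinal_patterns(x, d=4):
        """Count ordinal patterns of order d (Bandt-Pompe symbolization)
        that never occur in the series x."""
        x = np.asarray(x, dtype=float)
        seen = set()
        for i in range(len(x) - d + 1):
            # The pattern is the permutation that sorts the length-d window.
            seen.add(tuple(np.argsort(x[i:i + d])))
        return math.factorial(d) - len(seen)

    # Deterministic chaos forbids many patterns; noise of equal length misses few.
    x = [0.4]
    for _ in range(500):
        x.append(4.0 * x[-1] * (1.0 - x[-1]))          # logistic map, r = 4
    print(missing_ordinal_patterns(x, d=4))
    print(missing_ordinal_patterns(np.random.default_rng(0).normal(size=501), d=4))
    ```

    The surrogate test then compares the observed NMP against the NMP distribution of IAAFT surrogates of the same series.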

  1. Compression based entropy estimation of heart rate variability on multiple time scales.

    PubMed

    Baumert, Mathias; Voss, Andreas; Javorka, Michal

    2013-01-01

    Heart rate fluctuates beat by beat in a complex manner. The aim of this study was to develop a framework for entropy assessment of heart rate fluctuations on multiple time scales. We employed the Lempel-Ziv algorithm for lossless data compression to investigate the compressibility of RR interval time series on different time scales, using a coarse-graining procedure. We estimated the entropy of RR interval time series of 20 young and 20 old subjects and also investigated the compressibility of randomly shuffled surrogate RR time series. The original RR time series displayed significantly smaller compression entropy values than randomized RR interval data. The RR interval time series of older subjects showed significantly different entropy characteristics over multiple time scales from those of younger subjects. In conclusion, data compression may be a useful approach for multiscale entropy assessment of heart rate variability.
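
    A minimal sketch of the idea follows. It is not the authors' implementation: zlib's DEFLATE (an LZ77 relative) stands in for the Lempel-Ziv compressor, and the RR intervals are synthetic.

    ```python
    import zlib
    import numpy as np

    def coarse_grain(rr, scale):
        """Average non-overlapping windows of length `scale`."""
        n = len(rr) // scale
        return np.reshape(rr[:n * scale], (n, scale)).mean(axis=1)

    def compression_entropy(rr, scale=1):
        """Compressed-to-original length ratio of a quantized RR series;
        lower values indicate more regular, more compressible dynamics."""
        cg = coarse_grain(np.asarray(rr, dtype=float), scale)
        # Quantize to bytes so the compressor sees a symbol sequence.
        q = np.digitize(cg, np.linspace(cg.min(), cg.max(), 255)).astype(np.uint8)
        raw = q.tobytes()
        return len(zlib.compress(raw, 9)) / len(raw)

    rr = np.random.default_rng(0).normal(800, 50, 2000)  # synthetic RR intervals (ms)
    print([round(compression_entropy(rr, s), 3) for s in (1, 2, 4, 8)])
    ```

    Comparing these ratios between original and randomly shuffled series reproduces the spirit of the surrogate comparison described above.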

  2. Event coincidence analysis for quantifying statistical interrelationships between event time series. On the role of flood events as triggers of epidemic outbreaks

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.

    2016-05-01

    Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis to provide a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows us to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
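
    The core coincidence count is simple enough to sketch directly; the following is a simplified toy version (hypothetical event times, no significance test), not the study's analysis code.

    ```python
    import numpy as np

    def coincidence_rate(a, b, delta_t=2.0, tau=0.0):
        """Fraction of events in `a` followed by at least one event in `b`
        within the window [t + tau, t + tau + delta_t] (precursor form of
        event coincidence analysis)."""
        a, b = np.asarray(a, float), np.asarray(b, float)
        hits = sum(np.any((b >= t + tau) & (b <= t + tau + delta_t)) for t in a)
        return hits / len(a)

    floods = np.array([3.0, 10.0, 18.0, 25.0])    # hypothetical event times (weeks)
    outbreaks = np.array([4.5, 11.0, 30.0])
    print(coincidence_rate(floods, outbreaks, delta_t=2.0))   # 0.5
    ```

    In the full method this rate is compared against its distribution under a null model (e.g., independent Poisson processes) to assess significance.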

  3. Towards a Socio-Cultural Framework for the Analysis of Joint Student-Teacher Development over Technology-Based Mathematics Lessons

    ERIC Educational Resources Information Center

    Monaghan, John

    2013-01-01

    This paper offers a framework, an extension of Valsiner's "zone theory", for the analysis of joint student-teacher development over a series of technology-based mathematics lessons. The framework is suitable for developing research studies over a moderately long period of time and considers interrelated student-teacher development as…

  4. Fuzzy time-series based on Fibonacci sequence for stock price forecasting

    NASA Astrophysics Data System (ADS)

    Chen, Tai-Liang; Cheng, Ching-Hsue; Jong Teoh, Hia

    2007-07-01

    Time-series models have been utilized to make reasonably accurate predictions in the areas of stock price movements, academic enrollments, weather, etc. To promote the forecasting performance of fuzzy time-series models, this paper proposes a new model which incorporates the concept of the Fibonacci sequence, the framework of Song and Chissom's model, and the weighted method of Yu's model. This paper employs a 5-year period of TSMC (Taiwan Semiconductor Manufacturing Company) stock price data and a 13-year period of TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) stock index data as experimental datasets. By comparing our forecasting performance with that of Chen's (Forecasting enrollments based on fuzzy time-series. Fuzzy Sets Syst. 81 (1996) 311-319), Yu's (Weighted fuzzy time-series models for TAIEX forecasting. Physica A 349 (2004) 609-624) and Huarng's (The application of neural networks to forecast fuzzy time series. Physica A 336 (2006) 481-491) models, we conclude that the proposed model surpasses these conventional fuzzy time-series models in accuracy.

  5. ControlShell - A real-time software framework

    NASA Technical Reports Server (NTRS)

    Schneider, Stanley A.; Ullman, Marc A.; Chen, Vincent W.

    1991-01-01

    ControlShell is designed to enable modular design and implementation of real-time software. It is an object-oriented tool-set for real-time software system programming. It provides a series of execution and data interchange mechanisms that form a framework for building real-time applications. These mechanisms allow a component-based approach to real-time software generation and management. By defining a set of interface specifications for intermodule interaction, ControlShell provides a common platform that is the basis for real-time code development and exchange.

  6. Data-driven discovery of partial differential equations.

    PubMed

    Rudy, Samuel H; Brunton, Steven L; Proctor, Joshua L; Kutz, J Nathan

    2017-04-01

    We propose a sparse regression method capable of discovering the governing partial differential equation(s) of a given system by time series measurements in the spatial domain. The regression framework relies on sparsity-promoting techniques to select the nonlinear and partial derivative terms of the governing equations that most accurately represent the data, bypassing a combinatorially large search through all possible candidate models. The method balances model complexity and regression accuracy by selecting a parsimonious model via Pareto analysis. Time series measurements can be made in an Eulerian framework, where the sensors are fixed spatially, or in a Lagrangian framework, where the sensors move with the dynamics. The method is computationally efficient, robust, and demonstrated to work on a variety of canonical problems spanning a number of scientific domains including Navier-Stokes, the quantum harmonic oscillator, and the diffusion equation. Moreover, the method is capable of disambiguating between potentially nonunique dynamical terms by using multiple time series taken with different initial data. Thus, for a traveling wave, the method can distinguish between a linear wave equation and the Korteweg-de Vries equation, for instance. The method provides a promising new technique for discovering governing equations and physical laws in parameterized spatiotemporal systems, where first-principles derivations are intractable.
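
    The sparsity-promoting step can be sketched compactly. The following is not the paper's STRidge code but a sequentially thresholded least-squares variant in the same spirit, run on a toy candidate library with made-up coefficients.

    ```python
    import numpy as np

    def stlsq(theta, u_t, lam=0.1, n_iter=10):
        """Solve theta @ xi ≈ u_t, repeatedly zeroing coefficients whose
        magnitude falls below `lam` and refitting on the survivors."""
        xi = np.linalg.lstsq(theta, u_t, rcond=None)[0]
        for _ in range(n_iter):
            small = np.abs(xi) < lam
            xi[small] = 0.0
            big = ~small
            if big.any():
                xi[big] = np.linalg.lstsq(theta[:, big], u_t, rcond=None)[0]
        return xi

    # Toy library: recover u_t = -2*u_x + 0.5*u_xx among candidate terms.
    rng = np.random.default_rng(1)
    u, u_x, u_xx = rng.normal(size=(3, 200))
    theta = np.column_stack([u, u_x, u_xx, u * u_x])
    u_t = -2.0 * u_x + 0.5 * u_xx + 0.01 * rng.normal(size=200)
    print(stlsq(theta, u_t))           # approximately [0, -2, 0.5, 0]
    ```

    In the actual method the columns of theta are numerical derivatives of the measured field (u, u_x, u_xx, uu_x, ...) evaluated on the space-time grid.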

  7. Monitoring of tissue ablation using time series of ultrasound RF data.

    PubMed

    Imani, Farhad; Wu, Mark Z; Lasso, Andras; Burdette, Everett C; Daoud, Mohammad; Fitchinger, Gabor; Abolmaesumi, Purang; Mousavi, Parvin

    2011-01-01

    This paper is the first report on the monitoring of tissue ablation using ultrasound RF echo time series. We calculate frequency and time domain features of time series of RF echoes from stationary tissue and transducer, and correlate them with ablated and non-ablated tissue properties. We combine these features in a nonlinear classification framework and demonstrate up to 99% classification accuracy in distinguishing ablated and non-ablated regions of tissue, in areas as small as 12 mm2 in size. We also demonstrate significant improvement of ablated tissue classification using RF time series compared to the conventional approach of using single RF scan lines. The results of this study suggest RF echo time series as a promising approach for monitoring ablation, and capturing the changes in the tissue microstructure as a result of heat-induced necrosis.

  8. A Regularized Linear Dynamical System Framework for Multivariate Time Series Analysis.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2015-01-01

    Linear Dynamical System (LDS) is an elegant mathematical framework for modeling and learning Multivariate Time Series (MTS). However, in general, it is difficult to set the dimension of an LDS's hidden state space. A small number of hidden states may not be able to model the complexities of an MTS, while a large number of hidden states can lead to overfitting. In this paper, we study learning methods that impose various regularization penalties on the transition matrix of the LDS model and propose a regularized LDS learning framework (rLDS) which aims to (1) automatically shut down LDSs' spurious and unnecessary dimensions, and consequently, address the problem of choosing the optimal number of hidden states; (2) prevent the overfitting problem given a small amount of MTS data; and (3) support accurate MTS forecasting. To learn the regularized LDS from data, we incorporate a second order cone program and a generalized gradient descent method into the Maximum a Posteriori framework and use Expectation Maximization to obtain a low-rank transition matrix of the LDS model. We propose two priors for modeling the matrix which lead to two instances of our rLDS. We show that our rLDS is able to recover well the intrinsic dimensionality of the time series dynamics and it improves the predictive performance when compared to baselines on both synthetic and real-world MTS datasets.

  9. Quantitative Framework and Management Expectation Tool for the Selection of Bioremediation Approaches at Chlorinated Solvent Sites

    DTIC Science & Technology

    2015-03-19

    to Abiotic Degradation: Magnetite (FeO.Fe2O3) often occurs naturally in sediments formed by weathering of igneous or metamorphic rock ... Attenuation (MNA) ... Integrate the decision-making framework into an easy-to-use application (Excel spreadsheet) ... Guide users in the selection of

  10. Automated Bayesian model development for frequency detection in biological time series.

    PubMed

    Granqvist, Emma; Oldroyd, Giles E D; Morris, Richard J

    2011-06-24

    A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlight the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data. Biological time series often deviate significantly from the requirements of optimality for Fourier transformation. In this paper we present an alternative approach based on Bayesian inference. We show the value of placing spectral analysis in the framework of Bayesian inference and demonstrate how model comparison can automate this procedure.

  11. Automated Bayesian model development for frequency detection in biological time series

    PubMed Central

    2011-01-01

    Background A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlight the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data. Biological time series often deviate significantly from the requirements of optimality for Fourier transformation. In this paper we present an alternative approach based on Bayesian inference. We show the value of placing spectral analysis in the framework of Bayesian inference and demonstrate how model comparison can automate this procedure. PMID:21702910

  12. A Unified Framework for Estimating Minimum Detectable Effects for Comparative Short Interrupted Time Series Designs

    ERIC Educational Resources Information Center

    Price, Cristofer; Unlu, Fatih

    2014-01-01

    The Comparative Short Interrupted Time Series (C-SITS) design is a frequently employed quasi-experimental method, in which the pre- and post-intervention changes observed in the outcome levels of a treatment group are compared with those of a comparison group, where the difference between the former and the latter is attributed to the treatment. The…

  13. Probabilistic reasoning over seismic RMS time series: volcano monitoring through HMMs and SAX technique

    NASA Astrophysics Data System (ADS)

    Aliotta, M. A.; Cassisi, C.; Prestifilippo, M.; Cannata, A.; Montalto, P.; Patanè, D.

    2014-12-01

    In recent years, volcanic activity at Mt. Etna was often characterized by cyclic occurrences of fountains. In the period between January 2011 and June 2013, 38 episodes of lava fountains have been observed. Automatic recognition of the volcano's states related to lava fountain episodes (Quiet, Pre-Fountaining, Fountaining, Post-Fountaining) is very useful for monitoring purposes. We discovered that such states are strongly related to the trend of the RMS (Root Mean Square) of the seismic signal recorded in the summit area. Within the framework of the PON SIGMA project (Integrated Cloud-Sensor System for Advanced Multirisk Management), we modeled the system generating the sampled values (assuming the RMS time series to be a stochastic process with the Markov property) using Hidden Markov Models (HMMs), which are a powerful tool for modeling any time-varying series. HMM analysis seeks to discover the sequence of hidden states from the observed emissions. In our framework, observed emissions are characters generated by the SAX (Symbolic Aggregate approXimation) technique. SAX is able to map RMS time series values to discrete literal emissions. Our experiments showed how to predict volcano states by means of SAX and HMMs.
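
    The SAX step is easy to illustrate. Below is a generic SAX/PAA sketch on a synthetic series, not the project's code; the segment count and alphabet size are arbitrary choices here.

    ```python
    import numpy as np
    from scipy.stats import norm

    def sax_symbolize(x, n_segments=20, alphabet="abcd"):
        """PAA + SAX: z-normalize, average into segments, then map each
        segment mean to a letter via equiprobable Gaussian breakpoints."""
        z = (np.asarray(x, float) - np.mean(x)) / np.std(x)
        paa = np.array([seg.mean() for seg in np.array_split(z, n_segments)])
        k = len(alphabet)
        breakpoints = norm.ppf(np.arange(1, k) / k)   # cuts N(0,1) into k parts
        return "".join(alphabet[i] for i in np.digitize(paa, breakpoints))

    rms = np.abs(np.random.default_rng(3).normal(size=600)).cumsum()  # toy RMS trend
    print(sax_symbolize(rms))   # letters climb from 'a' to 'd' as the level rises
    ```

    The resulting letter sequence is what the HMM consumes as its observed emissions when decoding the hidden volcano states.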

  14. Sequentially Executed Model Evaluation Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-20

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in the development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for the current iteration. It has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is analyzed for anomalies.

  15. A deep learning framework for financial time series using stacked autoencoders and long-short term memory

    PubMed Central

    Bao, Wei; Rao, Yulei

    2017-01-01

    The application of deep learning approaches to finance has received a great deal of attention from both investors and researchers. This study presents a novel deep learning framework in which wavelet transforms (WT), stacked autoencoders (SAEs) and long-short term memory (LSTM) are combined for stock price forecasting. SAEs, which hierarchically extract deep features, are introduced into stock price forecasting for the first time. The deep learning framework comprises three stages. First, the stock price time series is decomposed by WT to eliminate noise. Second, SAEs are applied to generate deep high-level features for predicting the stock price. Third, the high-level denoising features are fed into LSTM to forecast the next day’s closing price. Six market indices and their corresponding index futures are chosen to examine the performance of the proposed model. Results show that the proposed model outperforms other similar models in both predictive accuracy and profitability performance. PMID:28708865

  16. The Use of Piecewise Growth Models to Estimate Learning Trajectories and RtI Instructional Effects in a Comparative Interrupted Time-Series Design

    ERIC Educational Resources Information Center

    Zvoch, Keith

    2016-01-01

    Piecewise growth models (PGMs) were used to estimate and model changes in the preliteracy skill development of kindergartners in a moderately sized school district in the Pacific Northwest. PGMs were applied to interrupted time-series (ITS) data that arose within the context of a response-to-intervention (RtI) instructional framework. During the…

  17. A data mining framework for time series estimation.

    PubMed

    Hu, Xiao; Xu, Peng; Wu, Shaozhi; Asgari, Shadnaz; Bergsneider, Marvin

    2010-04-01

    Time series estimation techniques are usually employed in biomedical research to derive variables less accessible from a set of related and more accessible variables. These techniques are traditionally built from systems modeling approaches including simulation, blind deconvolution, and state estimation. In this work, we define target time series (TTS) and its related time series (RTS) as the output and input of a time series estimation process, respectively. We then propose a novel data mining framework for time series estimation when TTS and RTS represent different sets of observed variables from the same dynamic system. This is made possible by mining a database of instances of TTS, its simultaneously recorded RTS, and the input/output dynamic models between them. The key mining strategy is to formulate a mapping function for each TTS-RTS pair in the database that translates a feature vector extracted from RTS to the dissimilarity between true TTS and its estimate from the dynamic model associated with the same TTS-RTS pair. At run time, a feature vector is extracted from an inquiry RTS and supplied to the mapping function associated with each TTS-RTS pair to calculate a dissimilarity measure. An optimal TTS-RTS pair is then selected by analyzing these dissimilarity measures. The associated input/output model of the selected TTS-RTS pair is then used to simulate the TTS given the inquiry RTS as an input. An exemplary implementation was built to address a biomedical problem of noninvasive intracranial pressure assessment. The performance of the proposed method was superior to that of a simple training-free approach of finding the optimal TTS-RTS pair by a conventional similarity-based search on RTS features. Copyright © 2009 Elsevier Inc. All rights reserved.

  18. Operational data fusion framework for building frequent Landsat-like imagery in a cloudy region

    USDA-ARS?s Scientific Manuscript database

    An operational data fusion framework is built to generate dense time-series Landsat-like images for a cloudy region by fusing Moderate Resolution Imaging Spectroradiometer (MODIS) data products and Landsat imagery. The Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) is integrated in ...

  19. 78 FR 64478 - Request for Comments on the Preliminary Cybersecurity Framework

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-29

    ... February 26, 2013, and a series of open public workshops. The preliminary Framework was developed in..., methodologies, procedures and processes that align policy, business, and technological approaches to address....nist.gov/itl/cyberframework.cfm . DATES: Comments must be received by 5:00 p.m. Eastern Time December...

  20. A Multilevel Multiset Time-Series Model for Describing Complex Developmental Processes

    PubMed Central

    Ma, Xin; Shen, Jianping

    2017-01-01

    The authors sought to develop an analytical platform where multiple sets of time series can be examined simultaneously. This multivariate platform capable of testing interaction effects among multiple sets of time series can be very useful in empirical research. The authors demonstrated that the multilevel framework can readily accommodate this analytical capacity. Given their intention to use the multilevel multiset time-series model to pursue complicated research purposes, their resulting model is relatively simple to specify, to run, and to interpret. These advantages make the adoption of their model relatively effortless as long as researchers have the basic knowledge and skills in working with multilevel growth modeling. With multiple potential extensions of their model, the establishment of this analytical platform for analysis of multiple sets of time series can inspire researchers to pursue far more advanced research designs to address complex developmental processes in reality. PMID:29881094

  1. Data-driven discovery of partial differential equations

    PubMed Central

    Rudy, Samuel H.; Brunton, Steven L.; Proctor, Joshua L.; Kutz, J. Nathan

    2017-01-01

    We propose a sparse regression method capable of discovering the governing partial differential equation(s) of a given system by time series measurements in the spatial domain. The regression framework relies on sparsity-promoting techniques to select the nonlinear and partial derivative terms of the governing equations that most accurately represent the data, bypassing a combinatorially large search through all possible candidate models. The method balances model complexity and regression accuracy by selecting a parsimonious model via Pareto analysis. Time series measurements can be made in an Eulerian framework, where the sensors are fixed spatially, or in a Lagrangian framework, where the sensors move with the dynamics. The method is computationally efficient, robust, and demonstrated to work on a variety of canonical problems spanning a number of scientific domains including Navier-Stokes, the quantum harmonic oscillator, and the diffusion equation. Moreover, the method is capable of disambiguating between potentially nonunique dynamical terms by using multiple time series taken with different initial data. Thus, for a traveling wave, the method can distinguish between a linear wave equation and the Korteweg–de Vries equation, for instance. The method provides a promising new technique for discovering governing equations and physical laws in parameterized spatiotemporal systems, where first-principles derivations are intractable. PMID:28508044

  2. A KST framework for correlation network construction from time series signals

    NASA Astrophysics Data System (ADS)

    Qi, Jin-Peng; Gu, Quan; Zhu, Ying; Zhang, Ping

    2018-04-01

    A KST (Kolmogorov-Smirnov test and T statistic) method is used for construction of a correlation network based on the fluctuation of each time series within the multivariate time signals. In this method, each time series is divided equally into multiple segments, and the maximal data fluctuation in each segment is calculated by a KST change detection procedure. Connections between each time series are derived from the data fluctuation matrix, and are used for construction of the fluctuation correlation network (FCN). The method was tested with synthetic simulations, and the results were compared with those obtained using the KS test or T statistic alone for detection of data fluctuation. The novelty of this study is that the correlation analysis is based on the data fluctuation in each segment of each time series rather than on the original time signals, which is more meaningful for many real-world applications and for the analysis of large-scale time signals where prior knowledge is uncertain.
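
    A simplified stand-in for the pipeline (segment-wise fluctuation scoring followed by network construction) can be sketched as follows; it uses the KS statistic alone rather than the paper's combined KST detector, and toy signals.

    ```python
    import numpy as np
    from scipy.stats import ks_2samp

    def fluctuation_network(signals, n_segments=10, threshold=0.7):
        """Score each segment of each series with a two-sample KS statistic
        against the whole series, then connect series whose fluctuation
        profiles are strongly correlated."""
        profiles = []
        for x in signals:
            x = np.asarray(x, float)
            segs = np.array_split(x, n_segments)
            profiles.append([ks_2samp(s, x).statistic for s in segs])
        corr = np.corrcoef(profiles)
        adj = (np.abs(corr) >= threshold).astype(int)
        np.fill_diagonal(adj, 0)
        return adj

    rng = np.random.default_rng(7)
    base = rng.normal(size=400).cumsum()
    signals = [base + rng.normal(0, 0.5, 400) for _ in range(4)]
    signals.append(rng.normal(size=400))       # one unrelated series
    print(fluctuation_network(signals))        # the first four should cluster
    ```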

  3. Causal Analysis of Self-tracked Time Series Data Using a Counterfactual Framework for N-of-1 Trials.

    PubMed

    Daza, Eric J

    2018-02-01

    Many of an individual's historically recorded personal measurements vary over time, thereby forming a time series (e.g., wearable-device data, self-tracked fitness or nutrition measurements, regularly monitored clinical events or chronic conditions). Statistical analyses of such n-of-1 (i.e., single-subject) observational studies (N1OSs) can be used to discover possible cause-effect relationships to then self-test in an n-of-1 randomized trial (N1RT). However, a principled way of determining how and when to interpret an N1OS association as a causal effect (e.g., as if randomization had occurred) is needed. Our goal in this paper is to help bridge the methodological gap between risk-factor discovery and N1RT testing by introducing a basic counterfactual framework for N1OS design and personalized causal analysis. We introduce and characterize what we call the average period treatment effect (APTE), i.e., the estimand of interest in an N1RT, and build an analytical framework around it that can accommodate autocorrelation and time trends in the outcome, effect carryover from previous treatment periods, and slow onset or decay of the effect. The APTE is loosely defined as a contrast (e.g., difference, ratio) of averages of potential outcomes the individual can theoretically experience under different treatment levels during a given treatment period. To illustrate the utility of our framework for APTE discovery and estimation, two common causal inference methods are specified within the N1OS context. We then apply the framework and methods to search for estimable and interpretable APTEs using six years of the author's self-tracked weight and exercise data, and report both the preliminary findings and the challenges we faced in conducting N1OS causal discovery. Causal analysis of an individual's time series data can be facilitated by an N1RT counterfactual framework. However, for inference to be valid, the veracity of certain key assumptions must be assessed critically, and the hypothesized causal models must be interpretable and meaningful. Schattauer GmbH.

  4. Development of a Neural Network-Based Renewable Energy Forecasting Framework for Process Industries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Soobin; Ryu, Jun-Hyung; Hodge, Bri-Mathias

    2016-06-25

    This paper presents a neural network-based forecasting framework for photovoltaic power (PV) generation as a decision-supporting tool to employ renewable energies in the process industry. The applicability of the proposed framework is illustrated by comparing its performance against other methodologies such as linear and nonlinear time series modelling approaches. A case study of an actual PV power plant in South Korea is presented.

  5. Uncertainty evaluation of a regional real-time system for rain-induced landslides

    NASA Astrophysics Data System (ADS)

    Kirschbaum, Dalia; Stanley, Thomas; Yatheendradas, Soni

    2015-04-01

    A new prototype regional model and evaluation framework has been developed over Central America and the Caribbean region using satellite-based information including precipitation estimates, modeled soil moisture, topography, and soils, as well as regionally available datasets such as road networks and distance to fault zones. The algorithm framework incorporates three static variables: a susceptibility map; a 24-hr rainfall triggering threshold; and an antecedent soil moisture variable threshold, which have been calibrated using historic landslide events. The thresholds are regionally heterogeneous and are based on the percentile distribution of the rainfall or antecedent moisture time series. A simple decision tree algorithm framework integrates all three variables with the rainfall and soil moisture time series and generates a landslide nowcast in real-time based on the previous 24 hours over this region. This system has been evaluated using several available landslide inventories over the Central America and Caribbean region. Spatiotemporal uncertainty and evaluation metrics of the model are presented here based on available landslide reports. This work also presents a probabilistic representation of potential landslide activity over the region, which can be used to further refine and improve the real-time landslide hazard assessment system as well as to better identify and characterize the uncertainties inherent in this type of regional approach. The landslide algorithm provides a flexible framework to improve hazard estimation and reduce uncertainty at any spatial and temporal scale.
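
    The decision logic can be made concrete with a minimal sketch. The variable names, percentile choices, and data below are illustrative stand-ins, not the operational algorithm's actual thresholds.

    ```python
    import numpy as np

    def landslide_nowcast(susceptible, rain_24h, moisture,
                          rain_series, moisture_series,
                          rain_pct=95, moisture_pct=90):
        """Three-variable decision tree: nowcast only where the cell is
        susceptible AND both the 24-h rainfall and the antecedent soil
        moisture exceed percentiles of their own historical series."""
        if not susceptible:
            return "no nowcast"
        rain_thr = np.percentile(rain_series, rain_pct)
        moist_thr = np.percentile(moisture_series, moisture_pct)
        if rain_24h > rain_thr and moisture > moist_thr:
            return "landslide nowcast"
        return "no nowcast"

    rng = np.random.default_rng(11)
    rain_hist = rng.gamma(2.0, 5.0, 10_000)    # synthetic 24-h rainfall record
    moist_hist = rng.beta(2.0, 5.0, 10_000)    # synthetic soil moisture record
    print(landslide_nowcast(True, 40.0, 0.55, rain_hist, moist_hist))
    ```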

  6. Price responsiveness of demand for cigarettes: does rationality matter?

    PubMed

    Laporte, Audrey

    2006-01-01

    Meta-analysis is applied to aggregate-level studies that model the demand for cigarettes using static, myopic, or rational addiction frameworks in an attempt to synthesize key findings in the literature and to identify determinants of the variation in reported price elasticity estimates across studies. The results suggest that the rational addiction framework produces statistically similar estimates to the static framework but that studies that use the myopic framework tend to report more elastic price effects. Studies that applied panel data techniques or controlled for cross-border smuggling reported more elastic price elasticity estimates, whereas the use of instrumental variable techniques and time trends or time dummy variables produced less elastic estimates. The finding that myopic models produce different estimates than either of the other two model frameworks underscores that careful attention must be given to time series properties of the data.

  7. Phase unwrapping in three dimensions with application to InSAR time series.

    PubMed

    Hooper, Andrew; Zebker, Howard A

    2007-09-01

    The problem of phase unwrapping in two dimensions has been studied extensively in the past two decades, but the three-dimensional (3D) problem has so far received relatively little attention. We develop here a theoretical framework for 3D phase unwrapping and also describe two algorithms for implementation, both of which can be applied to synthetic aperture radar interferometry (InSAR) time series. We test the algorithms on simulated data and find both give more accurate results than a two-dimensional algorithm. When applied to actual InSAR time series, we find good agreement both between the algorithms and with ground truth.

  8. State-space forecasting of Schistosoma haematobium time-series in Niono, Mali.

    PubMed

    Medina, Daniel C; Findley, Sally E; Doumbia, Seydou

    2008-08-13

    Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with infectious diseases. The incidence of Schistosoma sp.-which are neglected tropical diseases exposing and infecting more than 500 and 200 million individuals in 77 countries, respectively-is rising because of 1) numerous irrigation and hydro-electric projects, 2) steady shifts from nomadic to sedentary existence, and 3) ineffective control programs. Notwithstanding the colossal scope of these parasitic infections, less than 0.5% of Schistosoma sp. investigations have attempted to predict their spatial and or temporal distributions. Undoubtedly, public health programs in developing countries could benefit from parsimonious forecasting and early warning systems to enhance management of these parasitic diseases. In this longitudinal retrospective (01/1996-06/2004) investigation, the Schistosoma haematobium time-series for the district of Niono, Mali, was fitted with general-purpose exponential smoothing methods to generate contemporaneous on-line forecasts. These methods, which are encapsulated within a state-space framework, accommodate seasonal and inter-annual time-series fluctuations. Mean absolute percentage error values were circa 25% for 1- to 5-month horizon forecasts. The exponential smoothing state-space framework employed herein produced reasonably accurate forecasts for this time-series, which reflects the incidence of S. haematobium-induced terminal hematuria. It obliquely captured prior non-linear interactions between disease dynamics and exogenous covariates (e.g., climate, irrigation, and public health interventions), thus obviating the need for more complex forecasting methods in the district of Niono, Mali. Therefore, this framework could assist with managing and assessing S. haematobium transmission and intervention impact, respectively, in this district and potentially elsewhere in the Sahel.
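
    The forecasting setup is straightforward to reproduce in outline. The sketch below uses statsmodels' Holt-Winters exponential smoothing on a synthetic seasonal series (the district's case counts are not reproduced here), mirroring the 1- to 5-month horizon evaluation.

    ```python
    import numpy as np
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    rng = np.random.default_rng(5)
    months = np.arange(96)
    cases = (50 + 0.3 * months                       # trend
             + 20 * np.sin(2 * np.pi * months / 12)  # annual seasonality
             + rng.normal(0, 5, 96))                 # noise

    train, test = cases[:-5], cases[-5:]
    model = ExponentialSmoothing(train, trend="add", seasonal="add",
                                 seasonal_periods=12).fit()
    forecast = model.forecast(5)                     # 1- to 5-month horizon
    mape = np.mean(np.abs((test - forecast) / test)) * 100
    print(f"MAPE over the 5-month horizon: {mape:.1f}%")
    ```

    State-space formulations of exponential smoothing also yield prediction intervals, which is part of what makes them attractive for early warning systems.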

  9. State–Space Forecasting of Schistosoma haematobium Time-Series in Niono, Mali

    PubMed Central

    Medina, Daniel C.; Findley, Sally E.; Doumbia, Seydou

    2008-01-01

    Background Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with infectious diseases. The incidence of Schistosoma sp.—which are neglected tropical diseases exposing and infecting more than 500 and 200 million individuals in 77 countries, respectively—is rising because of 1) numerous irrigation and hydro-electric projects, 2) steady shifts from nomadic to sedentary existence, and 3) ineffective control programs. Notwithstanding the colossal scope of these parasitic infections, less than 0.5% of Schistosoma sp. investigations have attempted to predict their spatial and or temporal distributions. Undoubtedly, public health programs in developing countries could benefit from parsimonious forecasting and early warning systems to enhance management of these parasitic diseases. Methodology/Principal Findings In this longitudinal retrospective (01/1996–06/2004) investigation, the Schistosoma haematobium time-series for the district of Niono, Mali, was fitted with general-purpose exponential smoothing methods to generate contemporaneous on-line forecasts. These methods, which are encapsulated within a state–space framework, accommodate seasonal and inter-annual time-series fluctuations. Mean absolute percentage error values were circa 25% for 1- to 5-month horizon forecasts. Conclusions/Significance The exponential smoothing state–space framework employed herein produced reasonably accurate forecasts for this time-series, which reflects the incidence of S. haematobium–induced terminal hematuria. It obliquely captured prior non-linear interactions between disease dynamics and exogenous covariates (e.g., climate, irrigation, and public health interventions), thus obviating the need for more complex forecasting methods in the district of Niono, Mali. Therefore, this framework could assist with managing and assessing S. haematobium transmission and intervention impact, respectively, in this district and potentially elsewhere in the Sahel. PMID:18698361

  10. Copulas and time series with long-ranged dependencies.

    PubMed

    Chicheportiche, Rémy; Chakraborti, Anirban

    2014-04-01

    We review ideas on temporal dependencies and recurrences in discrete time series from several areas of natural and social sciences. We revisit existing studies and redefine the relevant observables in the language of copulas (joint laws of the ranks). We propose that copulas provide an appropriate mathematical framework to study nonlinear time dependencies and related concepts, like aftershocks, the Omori law, recurrences, and waiting times. We also critically argue, using this global approach, that previous phenomenological attempts involving only a long-ranged autocorrelation function lacked complexity in that they were essentially monoscale.
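
    The basic object, the joint law of the ranks, is simple to construct empirically. A minimal sketch (toy data) follows.

    ```python
    import numpy as np
    from scipy.stats import rankdata

    def empirical_copula_sample(x, y):
        """Normalized ranks of two series; their joint law is the empirical
        copula, which separates the dependence structure from the marginals."""
        n = len(x)
        return rankdata(x) / (n + 1), rankdata(y) / (n + 1)

    # Dependence survives a monotone distortion of a marginal:
    rng = np.random.default_rng(2)
    x = rng.normal(size=5000)
    y = x + rng.normal(size=5000)
    u, v = empirical_copula_sample(np.exp(x), y)   # exp() warps the marginal only
    print(np.corrcoef(u, v)[0, 1])                 # rank correlation is unchanged
    ```

    Recurrence and waiting-time observables can then be expressed as functionals of (u, v) pairs at different lags, which is the reformulation the authors advocate.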

  11. Functional linear models to test for differences in prairie wetland hydraulic gradients

    USGS Publications Warehouse

    Greenwood, Mark C.; Sojda, Richard S.; Preston, Todd M.; Swayne, David A.; Yang, Wanhong; Voinov, A.A.; Rizzoli, A.; Filatova, T.

    2010-01-01

    Functional data analysis provides a framework for analyzing multiple time series measured frequently in time, treating each series as a continuous function of time. Functional linear models are used to test for effects on hydraulic gradient functional responses collected from three types of land use in Northeastern Montana at fourteen locations. Penalized regression-splines are used to estimate the underlying continuous functions based on the discretely recorded (over time) gradient measurements. Permutation methods are used to assess the statistical significance of effects. A method for accommodating missing observations in each time series is described. Hydraulic gradients may be an initial and fundamental ecosystem process that responds to climate change. We suggest other potential uses of these methods for detecting evidence of climate change.

  12. Theories of behaviour change synthesised into a set of theoretical groupings: introducing a thematic series on the theoretical domains framework.

    PubMed

    Francis, Jill J; O'Connor, Denise; Curran, Janet

    2012-04-24

    Behaviour change is key to increasing the uptake of evidence into healthcare practice. Designing behaviour-change interventions first requires problem analysis, ideally informed by theory. Yet the large number of partly overlapping theories of behaviour makes it difficult to select the most appropriate theory. The need for an overarching theoretical framework of behaviour change was addressed in research in which 128 explanatory constructs from 33 theories of behaviour were identified and grouped. The resulting Theoretical Domains Framework (TDF) appears to be a helpful basis for investigating implementation problems. Research groups in several countries have conducted TDF-based studies. It seems timely to bring together the experience of these teams in a thematic series to demonstrate further applications and to report key developments. This overview article describes the TDF, provides a brief critique of the framework, and introduces this thematic series. In a brief review to assess the extent of TDF-based research, we identified 133 papers that cite the framework. Of these, 17 used the TDF as the basis for empirical studies to explore health professionals' behaviour. The identified papers provide evidence of the impact of the TDF on implementation research. Two major strengths of the framework are its theoretical coverage and its capacity to elicit beliefs that could signify key mediators of behaviour change. The TDF provides a useful conceptual basis for assessing implementation problems, designing interventions to enhance healthcare practice, and understanding behaviour-change processes. We discuss limitations and research challenges and introduce papers in this series.

  13. Theories of behaviour change synthesised into a set of theoretical groupings: introducing a thematic series on the theoretical domains framework

    PubMed Central

    2012-01-01

    Behaviour change is key to increasing the uptake of evidence into healthcare practice. Designing behaviour-change interventions first requires problem analysis, ideally informed by theory. Yet the large number of partly overlapping theories of behaviour makes it difficult to select the most appropriate theory. The need for an overarching theoretical framework of behaviour change was addressed in research in which 128 explanatory constructs from 33 theories of behaviour were identified and grouped. The resulting Theoretical Domains Framework (TDF) appears to be a helpful basis for investigating implementation problems. Research groups in several countries have conducted TDF-based studies. It seems timely to bring together the experience of these teams in a thematic series to demonstrate further applications and to report key developments. This overview article describes the TDF, provides a brief critique of the framework, and introduces this thematic series. In a brief review to assess the extent of TDF-based research, we identified 133 papers that cite the framework. Of these, 17 used the TDF as the basis for empirical studies to explore health professionals’ behaviour. The identified papers provide evidence of the impact of the TDF on implementation research. Two major strengths of the framework are its theoretical coverage and its capacity to elicit beliefs that could signify key mediators of behaviour change. The TDF provides a useful conceptual basis for assessing implementation problems, designing interventions to enhance healthcare practice, and understanding behaviour-change processes. We discuss limitations and research challenges and introduce papers in this series. PMID:22531601

  14. GRASS GIS: The first Open Source Temporal GIS

    NASA Astrophysics Data System (ADS)

    Gebbert, Sören; Leppelt, Thomas

    2015-04-01

    GRASS GIS is a full-featured, general-purpose Open Source geographic information system (GIS) with raster, 3D raster and vector processing support [1]. Recently, time was introduced as a new dimension that transformed GRASS GIS into the first Open Source temporal GIS with comprehensive spatio-temporal analysis, processing and visualization capabilities [2]. New spatio-temporal data types were introduced in GRASS GIS version 7 to manage raster, 3D raster and vector time series. These new data types, called space time datasets, are designed to efficiently handle hundreds of thousands of time-stamped raster, 3D raster and vector map layers of any size. Time stamps can be defined as time intervals or time instances, in Gregorian calendar time or relative time. Space time datasets simplify the processing and analysis of large time series in GRASS GIS, since they are used as input and output parameters in temporal modules; handling a space time dataset is therefore equivalent to handling raster, 3D raster and vector map layers in GRASS GIS. A new dedicated Python library, the GRASS GIS Temporal Framework, was designed to implement the spatio-temporal data types and their management. The framework provides the functionality to efficiently handle hundreds of thousands of time-stamped map layers and their spatio-temporal topological relations. It supports reasoning based on the temporal granularity of space time datasets as well as on their temporal topology, and was designed in conjunction with the PyGRASS library [3] to support parallel processing of large datasets, which has a long tradition in GRASS GIS [4,5]. We will present a subset of the more than 40 temporal modules that were implemented based on the GRASS GIS Temporal Framework, PyGRASS and the GRASS GIS Python scripting library. These modules provide a comprehensive temporal GIS tool set. The functionality ranges from space time dataset and time-stamped map layer management, through temporal aggregation, temporal accumulation, spatio-temporal statistics, spatio-temporal sampling, temporal algebra, temporal topology analysis, time series animation and temporal topology visualization, to time series import and export capabilities with support for the NetCDF and VTK data formats. We will also present several temporal modules that support parallel processing of raster and 3D raster time series. [1] Neteler, M., Beaudette, D., Cavallini, P., Lami, L., Cepicky, J., 2008. GRASS GIS. In: Hall, G.B., Leahy, M.G. (Eds.), Open Source Approaches in Spatial Data Handling, Vol. 2, pp. 171-199. doi:10.1007/978-3-540-74831-19 [2] Gebbert, S., Pebesma, E., 2014. A temporal GIS for field based environmental modeling. Environ. Model. Softw. 53, 1-12. [3] Zambelli, P., Gebbert, S., Ciolli, M., 2013. PyGRASS: An Object Oriented Python Application Programming Interface (API) for Geographic Resources Analysis Support System (GRASS) Geographic Information System (GIS). ISPRS International Journal of Geo-Information 2, 201-219. [4] Löwe, P., Klump, J., Thaler, J., 2012. The FOSS GIS Workbench on the GFZ Load Sharing Facility compute cluster. Geophysical Research Abstracts Vol. 14, EGU2012-4491, General Assembly, European Geosciences Union, Vienna, Austria. [5] Akhter, S., Aida, K., Chemin, Y., 2010. GRASS GIS on High Performance Computing with MPI, OpenMP and Ninf-G Programming Framework. ISPRS Conference, Kyoto, 9-12 August 2010.

  15. Time series modeling in traffic safety research.

    PubMed

    Lavrenz, Steven M; Vlahogianni, Eleni I; Gkritza, Konstantina; Ke, Yue

    2018-08-01

    The use of statistical models for analyzing traffic safety (crash) data has been well-established. However, time series techniques have traditionally been underrepresented in the corresponding literature, due to challenges in data collection, along with a limited knowledge of proper methodology. In recent years, new types of high-resolution traffic safety data, especially in measuring driver behavior, have made time series modeling techniques an increasingly salient topic of study. Yet there remains a dearth of information to guide analysts in their use. This paper provides an overview of the state of the art in using time series models in traffic safety research, and discusses some of the fundamental techniques and considerations in classic time series modeling. It also presents ongoing and future opportunities for expanding the use of time series models, and explores newer modeling techniques, including computational intelligence models, which hold promise in effectively handling ever-larger data sets. The information contained herein is meant to guide safety researchers in understanding this broad area of transportation data analysis, and provide a framework for understanding safety trends that can influence policy-making. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Boolean network inference from time series data incorporating prior biological knowledge.

    PubMed

    Haider, Saad; Pal, Ranadip

    2012-01-01

    Numerous approaches exist for modeling genetic regulatory networks (GRNs), but the low sampling rates often employed in biological studies prevent the inference of detailed models from experimental data. In this paper, we analyze the issues involved in estimating a model of a GRN from single cell line time series data with limited time points. We present an inference approach for a Boolean Network (BN) model of a GRN from limited transcriptomic or proteomic time series data based on prior biological knowledge of connectivity, constraints on attractor structure, and robust design. We applied our inference approach to 6-time-point transcriptomic data on a Human Mammary Epithelial Cell line (HMEC) after application of Epidermal Growth Factor (EGF), and generated a BN with a plausible biological structure satisfying the data. We further defined and applied a similarity measure to compare synthetic BNs with BNs generated through the proposed approach constructed from transitions of various paths of the synthetic BNs. We also compared the performance of our algorithm with that of two existing BN inference algorithms. Through theoretical analysis and simulations, we showed how rarely a BN with plausible biological structure can be arrived at from limited time series data under random connectivity and in the absence of structure in the data. When applied to experimental data and to data generated from synthetic BNs, the framework was able to estimate BNs with high similarity scores. Comparison with existing BN inference algorithms showed the better performance of our proposed algorithm for limited time series data. The proposed framework can also be applied to optimize the connectivity of a GRN from experimental data when the prior biological knowledge on regulators is limited or not unique.
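
    The consistency check at the heart of such inference can be sketched in a few lines: enumerate candidate Boolean functions over the prior-knowledge regulators and keep those that reproduce every observed transition. This toy version (hypothetical 3-gene transitions, not the paper's data) also illustrates why limited time points leave many consistent networks.

    ```python
    import itertools

    def consistent_boolean_functions(transitions, regulators, target):
        """All truth tables over `regulators` (prior-knowledge candidates)
        that reproduce every observed state transition of `target`."""
        k = len(regulators)
        rows = list(itertools.product([0, 1], repeat=k))
        consistent = []
        for bits in itertools.product([0, 1], repeat=2 ** k):
            table = dict(zip(rows, bits))
            if all(table[tuple(s[g] for g in regulators)] == s_next[target]
                   for s, s_next in transitions):
                consistent.append(table)
        return consistent

    # Two observed transitions of a hypothetical 3-gene network; gene 2 is
    # assumed (prior knowledge) to be regulated by genes 0 and 1.
    transitions = [((1, 0, 0), (1, 0, 1)), ((0, 1, 1), (0, 1, 1))]
    print(len(consistent_boolean_functions(transitions, (0, 1), 2)))  # 4 remain
    ```

    Constraints on attractor structure and robustness, as used in the paper, then prune this consistent set further.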

  17. Highly active self-immobilized FI-Zr catalysts in a PCP framework for ethylene polymerization.

    PubMed

    Li, He; Xu, Bo; He, Jianghao; Liu, Xiaoming; Gao, Wei; Mu, Ying

    2015-12-04

    A series of zirconium-based porous coordination polymers (PCPs) containing FI catalysts in the frameworks have been developed and studied as catalysts for ethylene polymerization. These PCPs exhibit good catalytic activities and long lifetimes, producing polyethylenes with high molecular weights and bimodal molecular weight distributions in the form of particles.

  18. Continuous time transfer using GPS carrier phase.

    PubMed

    Dach, Rolf; Schildknecht, Thomas; Springer, Tim; Dudle, Gregor; Prost, Leon

    2002-11-01

    The Astronomical Institute of the University of Berne is hosting one of the Analysis Centers (AC) of the International GPS Service (IGS). A network of a few GPS stations in Europe and North America is routinely analyzed for time transfer purposes, using the carrier phase observations. This work is done in the framework of a joint project with the Swiss Federal Office of Metrology and Accreditation (METAS). The daily solutions are computed independently. The resulting time transfer series show jumps of up to 1 ns at the day boundaries. A method to concatenate the daily time transfer solutions to a continuous series was developed. A continuous time series is available for a time span of more than 4 mo. The results were compared with the time transfer results from other techniques such as two-way satellite time and frequency transfer. This concatenation improves the results obtained in a daily computing scheme because a continuous time series better reflects the characteristics of continuously working clocks.

  19. Stochastic modelling of non-stationary financial assets

    NASA Astrophysics Data System (ADS)

    Estevens, Joana; Rocha, Paulo; Boto, João P.; Lind, Pedro G.

    2017-11-01

    We model non-stationary volume-price distributions with a log-normal distribution and collect the time series of its two parameters. The time series of the two parameters are shown to be stationary and Markov-like and consequently can be modelled with Langevin equations, which are derived directly from their series of values. Having the evolution equations of the log-normal parameters, we reconstruct the statistics of the first moments of volume-price distributions which fit well the empirical data. Finally, the proposed framework is general enough to study other non-stationary stochastic variables in other research fields, namely, biology, medicine, and geology.

  20. An Observation Analysis Tool for time-series analysis and sensor management in the FREEWAT GIS environment for water resources management

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Neumann, Jakob; Cardoso, Mirko; Rossetto, Rudy; Foglia, Laura; Borsi, Iacopo

    2017-04-01

    In situ time-series are an important aspect of environmental modelling, especially with the advancement of numerical simulation techniques and increased model complexity. In order to make use of the increasing data available through the requirements of the EU Water Framework Directive, the FREEWAT GIS environment incorporates the newly developed Observation Analysis Tool for time-series analysis. The tool is used to import time-series data into QGIS from local CSV files, online sensors using the istSOS service, or MODFLOW model result files, and it enables visualisation, pre-processing of data for model development, and post-processing of model results. OAT can be used as a pre-processor for calibration observations, integrating the creation of observations for calibration directly from sensor time-series. The tool consists of an expandable Python library of processing methods and an interface integrated in the QGIS FREEWAT plug-in, which includes a large number of modelling capabilities, data management tools and calibration capacity.

  1. Classification of time-series images using deep convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Hatami, Nima; Gavet, Yann; Debayle, Johan

    2018-04-01

    Convolutional Neural Networks (CNNs) have achieved great success in image recognition tasks by automatically learning a hierarchical feature representation from raw data. While the majority of the Time-Series Classification (TSC) literature is focused on 1D signals, this paper uses Recurrence Plots (RP) to transform time-series into 2D texture images and then takes advantage of a deep CNN classifier. The image representation of time-series introduces feature types that are not available for 1D signals, so TSC can be treated as a texture image recognition task. The CNN model also allows learning different levels of representation together with a classifier, jointly and automatically. Therefore, using RP and CNN in a unified framework is expected to boost the recognition rate of TSC. Experimental results on the UCR time-series classification archive demonstrate competitive accuracy of the proposed approach, compared not only to the existing deep architectures but also to the state-of-the-art TSC algorithms.
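    A minimal sketch of the recurrence-plot transformation (embedding dimension 1 and the heuristic threshold are simplifying assumptions; real RPs usually use a delay embedding):

    ```python
    import numpy as np

    def recurrence_plot(x, eps=None):
        """Binary recurrence matrix of a 1-D series (embedding dimension 1)."""
        x = np.asarray(x, dtype=float)
        d = np.abs(x[:, None] - x[None, :])      # pairwise distances
        if eps is None:
            eps = 0.1 * d.max()                  # heuristic threshold
        return (d <= eps).astype(np.uint8)       # 2D texture image

    x = np.sin(np.linspace(0, 8 * np.pi, 128))
    img = recurrence_plot(x)                     # 128x128 image for a 2D CNN
    print(img.shape, img.mean())
    ```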

  2. Hierarchical Aligned Cluster Analysis for Temporal Clustering of Human Motion.

    PubMed

    Zhou, Feng; De la Torre, Fernando; Hodgins, Jessica K

    2013-03-01

    Temporal segmentation of human motion into plausible motion primitives is central to understanding and building computational models of human motion. Several issues contribute to the challenge of discovering motion primitives: the exponential nature of all possible movement combinations, the variability in the temporal scale of human actions, and the complexity of representing articulated motion. We pose the problem of learning motion primitives as one of temporal clustering, and derive an unsupervised hierarchical bottom-up framework called hierarchical aligned cluster analysis (HACA). HACA finds a partition of a given multidimensional time series into m disjoint segments such that each segment belongs to one of k clusters. HACA combines kernel k-means with the generalized dynamic time alignment kernel to cluster time series data. Moreover, it provides a natural framework to find a low-dimensional embedding for time series. HACA is efficiently optimized with a coordinate descent strategy and dynamic programming. Experimental results on motion capture and video data demonstrate the effectiveness of HACA for segmenting complex motions and as a visualization tool. We also compare the performance of HACA to state-of-the-art algorithms for temporal clustering on data of a honey bee dance. The HACA code is available online.

  3. A framework for the recognition of high-level surgical tasks from video images for cataract surgeries

    PubMed Central

    Lalys, Florent; Riffaud, Laurent; Bouget, David; Jannin, Pierre

    2012-01-01

    The need for better integration of the new generation of Computer-Assisted-Surgical (CAS) systems has recently been emphasized. One necessity to achieve this objective is to retrieve data from the Operating Room (OR) with different sensors, and then to derive models from these data. Recently, the use of videos from cameras in the OR has demonstrated its efficiency. In this paper, we propose a framework to assist in the development of systems for the automatic recognition of high-level surgical tasks using microscope video analysis. We validated its use on cataract procedures. The idea is to combine state-of-the-art computer vision techniques with time series analysis. The first step of the framework consisted of the definition of several visual cues for extracting semantic information, thereby characterizing each frame of the video. Five image-based classifiers were therefore implemented. A pupil segmentation step was also applied for dedicated visual cue detection. Time series classification algorithms were then applied to model the time-varying data. Dynamic Time Warping (DTW) and Hidden Markov Models (HMM) were tested. This combination exploited the advantages of each method for a better understanding of the problem. The framework was finally validated through various studies. Six binary visual cues were chosen along with 12 phases to detect, obtaining accuracies of 94%. PMID:22203700

  4. Vegetation Response to Climate Change in the Southern Part of Qinghai-Tibet Plateau at Basinal Scale

    NASA Astrophysics Data System (ADS)

    Liu, X.; Liu, C.; Kang, Q.; Yin, B.

    2018-04-01

    Global climate change has significantly affected vegetation variation in the third-polar region of the world, the Qinghai-Tibet Plateau. As one of the most important indicators of vegetation variation (growth, coverage and tempo-spatial change), the Normalized Difference Vegetation Index (NDVI) is widely employed to study the response of vegetation to climate change. However, a long-term series analysis cannot be achieved from a single data source, which is constrained in its time sequence. Therefore, a new framework is presented in this paper to extend the product series of monthly NDVI, taking as an example the Yarlung Zangbo River Basin, one of the most important river basins in the Qinghai-Tibet Plateau. NDVI products were acquired from two public sources: the Global Inventory Modeling and Mapping Studies (GIMMS) Advanced Very High Resolution Radiometer (AVHRR) and the Moderate-Resolution Imaging Spectroradiometer (MODIS). After extension with the new framework, the new time series of NDVI covers a 384-month period (1982-2013), 84 months longer than previous NDVI products, greatly facilitating NDVI-related scientific research. In the new framework, the Gauss Filtering Method was employed to filter out noise in the NDVI product. Next, a standardization method was introduced to enhance the comparability of the two data sources, and a pixel-based regression method was used to construct NDVI-extending models pixel by pixel. The extended series of NDVI fits well with the original AVHRR-NDVI. With the extended time series, temporal trends and spatial heterogeneity of NDVI in the study area were studied, and the principal factors influencing NDVI were determined. The monthly NDVI is highly correlated with air temperature and precipitation; the spatially averaged summer NDVI increased slightly over the 32-year period, during which temperature increased and precipitation decreased. The spatial heterogeneity of NDVI is in accordance with the seasonal variation of these two climate factors. These findings can provide valuable scientific support for water and land resources exploration in the third-polar region of the world.
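    The pixel-based regression step can be sketched as one linear model per pixel, fitted over the overlap period of the two sensors and then applied to extend the series; array names, shapes and the purely linear form are assumptions:

    ```python
    import numpy as np

    def extend_ndvi(avhrr, modis_overlap, modis_new):
        """avhrr, modis_overlap: (T, H, W) over the common period;
        modis_new: (T2, H, W) months to convert into AVHRR-like NDVI."""
        T, H, W = avhrr.shape
        out = np.empty((modis_new.shape[0], H, W))
        for i in range(H):
            for j in range(W):
                # one linear model per pixel: avhrr ~ a * modis + b
                a, b = np.polyfit(modis_overlap[:, i, j], avhrr[:, i, j], 1)
                out[:, i, j] = a * modis_new[:, i, j] + b
        return out

    rng = np.random.default_rng(0)
    modis_o = rng.random((36, 4, 4))
    avhrr_o = 0.9 * modis_o + 0.02             # toy inter-sensor relation
    print(extend_ndvi(avhrr_o, modis_o, rng.random((12, 4, 4))).shape)
    ```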

  5. Model-based Clustering of Categorical Time Series with Multinomial Logit Classification

    NASA Astrophysics Data System (ADS)

    Frühwirth-Schnatter, Sylvia; Pamminger, Christoph; Winter-Ebmer, Rudolf; Weber, Andrea

    2010-09-01

    A common problem in many areas of applied statistics is to identify groups of similar time series in a panel of time series. However, distance-based clustering methods cannot easily be extended to time series data, where an appropriate distance measure is rather difficult to define, particularly for discrete-valued time series. Markov chain clustering, proposed by Pamminger and Frühwirth-Schnatter [6], is an approach for clustering discrete-valued time series obtained by observing a categorical variable with several states. This model-based clustering method is based on finite mixtures of first-order time-homogeneous Markov chain models. In order to further explain group membership, we present an extension to the approach of Pamminger and Frühwirth-Schnatter [6] by formulating a probabilistic model for the latent group indicators within the Bayesian classification rule using a multinomial logit model. The parameters are estimated for a fixed number of clusters within a Bayesian framework using a Markov chain Monte Carlo (MCMC) sampling scheme, a (full) Gibbs-type sampler involving only draws from standard distributions. Finally, an application to a panel of Austrian wage mobility data is presented, which leads to an interesting segmentation of the Austrian labour market.

  6. Application of process monitoring to anomaly detection in nuclear material processing systems via system-centric event interpretation of data from multiple sensors of varying reliability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao

    In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework, which utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit not only from known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both the time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing, and results are discussed.

  7. An Experimental Design of a Foundational Framework for the Application of Affective Computing to Soaring Flight Simulation and Training

    ERIC Educational Resources Information Center

    Moon, Shannon

    2017-01-01

    In the absence of tools for intelligent tutoring systems for soaring flight simulation training, this study evaluated a framework foundation to measure pilot performance, affect, and physiological response to training in real-time. Volunteers were asked to perform a series of flight tasks selected from Federal Aviation Administration Practical…

  8. Learning in the model space for cognitive fault diagnosis.

    PubMed

    Chen, Huanhuan; Tino, Peter; Rodan, Ali; Yao, Xin

    2014-01-01

    The emergence of large sensor networks has facilitated the collection of large amounts of real-time data to monitor and control complex engineering systems. However, in many cases the collected data may be incomplete or inconsistent, while the underlying environment may be time-varying or unformulated. In this paper, we develop an innovative cognitive fault diagnosis framework that tackles the above challenges. This framework investigates fault diagnosis in the model space instead of the signal space. Learning in the model space is implemented by fitting a series of models using a series of signal segments selected with a sliding window. By investigating the learning techniques in the fitted model space, faulty models can be discriminated from healthy models using a one-class learning algorithm. The framework enables us to construct a fault library when unknown faults occur, which can be regarded as cognitive fault isolation. This paper also theoretically investigates how to measure the pairwise distance between two models in the model space and incorporates the model distance into the learning algorithm in the model space. The results on three benchmark applications and one simulated model for the Barcelona water distribution network confirm the effectiveness of the proposed framework.
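    The core idea of learning in the model space can be sketched by fitting a small autoregressive model to each sliding window and running a one-class learner on the fitted coefficients, so that faulty segments appear as outliers in model space. The paper itself uses reservoir models and a model-distance metric, so this AR-plus-OneClassSVM version is only an illustration:

    ```python
    import numpy as np
    from sklearn.svm import OneClassSVM

    def ar_coeffs(window, order=4):
        """Least-squares AR fit: x[t] ~ sum_k a_k x[t-k-1]."""
        n = len(window)
        X = np.column_stack([window[order - k - 1:n - k - 1]
                             for k in range(order)])
        return np.linalg.lstsq(X, window[order:], rcond=None)[0]

    def window_models(signal, width=200, step=50, order=4):
        return np.array([ar_coeffs(signal[s:s + width], order)
                         for s in range(0, len(signal) - width, step)])

    rng = np.random.default_rng(1)
    healthy = np.sin(np.linspace(0, 60, 3000)) + 0.1 * rng.standard_normal(3000)
    detector = OneClassSVM(nu=0.05).fit(window_models(healthy))

    faulty = np.sign(np.sin(np.linspace(0, 8, 400)))     # changed dynamics
    print(detector.predict(window_models(faulty)))       # mostly -1 (outliers)
    ```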

  9. Transition Icons for Time-Series Visualization and Exploratory Analysis.

    PubMed

    Nickerson, Paul V; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd M; Tighe, Patrick J; Rashidi, Parisa

    2018-03-01

    The modern healthcare landscape has seen the rapid emergence of techniques and devices that temporally monitor and record physiological signals. The prevalence of time-series data within the healthcare field necessitates the development of methods that can analyze the data in order to draw meaningful conclusions. Time-series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call transition icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition icons are adept at detecting and displaying subtle differences and similarities, e.g., between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods that collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from symbolic aggregate approXimation representations and compiles transition frequencies into a bag of patterns constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the transition icon technique for two time-series datasets: postoperative pain scores and hip-worn accelerometer activity counts. We believe transition icons can be an important tool for researchers approaching time-series data, as they give rich and intuitive information about collective time-series behaviors.
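    A minimal sketch of the transition-frequency ingredient: symbolize a z-normalized series with Gaussian breakpoints (a simplified SAX without the piecewise-aggregate step) and count symbol-to-symbol transitions; the icon rendering itself is omitted, and all parameter choices are assumptions:

    ```python
    import numpy as np
    from scipy.stats import norm

    def sax_symbols(x, alphabet=4):
        z = (x - x.mean()) / x.std()                            # z-normalize
        cuts = norm.ppf(np.linspace(0, 1, alphabet + 1)[1:-1])  # equiprobable bins
        return np.digitize(z, cuts)                             # symbols 0..alphabet-1

    def transition_matrix(symbols, alphabet=4):
        M = np.zeros((alphabet, alphabet))
        for a, b in zip(symbols[:-1], symbols[1:]):
            M[a, b] += 1
        return M / max(M.sum(), 1.0)                  # normalized frequencies

    x = np.cumsum(np.random.default_rng(2).standard_normal(500))
    print(transition_matrix(sax_symbols(x)))          # 4x4 "icon" of one series
    ```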

  10. A procedure of multiple period searching in unequally spaced time-series with the Lomb-Scargle method

    NASA Technical Reports Server (NTRS)

    Van Dongen, H. P.; Olofsen, E.; VanHartevelt, J. H.; Kruyt, E. W.; Dinges, D. F. (Principal Investigator)

    1999-01-01

    Periodogram analysis of unequally spaced time-series, as part of many biological rhythm investigations, is complicated. The mathematical framework is scattered over the literature, and the interpretation of results is often debatable. In this paper, we show that the Lomb-Scargle method is the appropriate tool for periodogram analysis of unequally spaced data. A unique procedure of multiple period searching is derived, facilitating the assessment of the various rhythms that may be present in a time-series. All relevant mathematical and statistical aspects are considered in detail, and much attention is given to the correct interpretation of results. The use of the procedure is illustrated by examples, and problems that may be encountered are discussed. It is argued that, when following the procedure of multiple period searching, we can even benefit from the unequal spacing of a time-series in biological rhythm research.
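    A short sketch of a Lomb-Scargle periodogram on an unequally spaced series using SciPy; the multiple-period search described above would then iteratively remove the strongest period and recompute the periodogram on the residuals (the sampling scheme and frequency grid here are assumptions):

    ```python
    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(3)
    t = np.sort(rng.uniform(0, 100, 300))        # irregular sampling times
    y = np.sin(2 * np.pi * t / 7.0) + 0.5 * rng.standard_normal(t.size)

    periods = np.linspace(2, 30, 2000)
    omega = 2 * np.pi / periods                  # angular test frequencies
    power = lombscargle(t, y - y.mean(), omega)
    print("strongest period ~", periods[np.argmax(power)])   # close to 7
    ```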

  11. Reconstructing land use history from Landsat time-series. Case study of a swidden agriculture system in Brazil

    NASA Astrophysics Data System (ADS)

    Dutrieux, Loïc P.; Jakovac, Catarina C.; Latifah, Siti H.; Kooistra, Lammert

    2016-05-01

    We developed a method to reconstruct land use history from Landsat image time-series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The Breaks For Additive Season and Trend (BFAST) framework is used for defining the time-series regression models, which may contain trend and phenology, hence appropriately modelling vegetation intra- and inter-annual dynamics. All available Landsat data are used for a selected study area, and the time-series are partitioned into segments delimited by breakpoints. Segments can be associated with land use regimes, while the breakpoints then correspond to shifts in land use regimes. In order to further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained from a set of visually interpreted time-series profiles, to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil (state of Amazonas). The number and frequency of cultivation cycles is of particular ecological relevance in these systems since they largely affect the capacity of the forest to regenerate after land abandonment. We applied the method to a Landsat time-series of the Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale. Agricultural field boundaries used to apply the method were derived using a multi-temporal segmentation approach. We validated the number of cultivation cycles predicted by the method against in-situ information collected from interviews with farmers, resulting in a Normalized Residual Mean Squared Error (NRMSE) of 0.25. Overall the method performed well, producing maps with coherent spatial patterns. We identified various sources of error in the approach, including low data availability in the 1990s and sub-object mixture of land uses. We conclude that the method holds great promise for land use history mapping in the tropics and beyond.

  12. Reconstructing Land Use History from Landsat Time-Series. Case study of Swidden Agriculture Intensification in Brazil

    NASA Astrophysics Data System (ADS)

    Dutrieux, L.; Jakovac, C. C.; Siti, L. H.; Kooistra, L.

    2015-12-01

    We developed a method to reconstruct land use history from Landsat image time-series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The BFAST framework is used for defining the time-series regression models, which may contain trend and phenology, hence appropriately modelling vegetation intra- and inter-annual dynamics. All available Landsat data are used, and the time-series are partitioned into segments delimited by breakpoints. Segments can be associated with land use regimes, while the breakpoints then correspond to shifts in regimes. To further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained from a set of visually interpreted time-series profiles, to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil. The number and frequency of cultivation cycles is of particular ecological relevance in these systems since they largely affect the capacity of the forest to regenerate after abandonment. We applied the method to a Landsat time-series of the Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale. Agricultural field boundaries used to apply the method were derived using a multi-temporal segmentation. We validated the number of cultivation cycles predicted against in-situ information collected from interviews with farmers, resulting in a Normalized RMSE of 0.25. Overall the method performed well, producing maps with coherent patterns. We identified various sources of error in the approach, including low data availability in the 1990s and sub-object mixture of land uses. We conclude that the method holds great promise for land use history mapping in the tropics and beyond. Spatial and temporal patterns were further analysed from an ecological perspective in a follow-up study. The results show that changes in land use patterns, such as land use intensification and reduced agricultural expansion, reflect the socio-economic transformations that occurred in the region.

  13. A Proposed Data Fusion Architecture for Micro-Zone Analysis and Data Mining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevin McCarthy; Milos Manic

    Data Fusion requires the ability to combine or "fuse" data from multiple data sources. Time Series Analysis is a data mining technique used to predict future values from a data set based upon past values. Unlike other data mining techniques, however, Time Series places special emphasis on periodicity and how seasonal and other time-based factors tend to affect trends over time. One of the difficulties encountered in developing generic time series techniques is the wide variability of the data sets available for analysis. This presents challenges all the way from the data gathering stage to results presentation. This paper presents an architecture designed and used to facilitate the collection of disparate data sets well suited to Time Series analysis as well as other predictive data mining techniques. Results show this architecture provides a flexible, dynamic framework for the capture and storage of a myriad of dissimilar data sets and can serve as a foundation from which to build a complete data fusion architecture.

  14. Combining synthetic controls and interrupted time series analysis to improve causal inference in program evaluation.

    PubMed

    Linden, Ariel

    2018-04-01

    Interrupted time series analysis (ITSA) is an evaluation methodology in which a single treatment unit's outcome is studied over time and the intervention is expected to "interrupt" the level and/or trend of the outcome. The internal validity is strengthened considerably when the treated unit is contrasted with a comparable control group. In this paper, we introduce a robust evaluation framework that combines the synthetic controls method (SYNTH) to generate a comparable control group and ITSA regression to assess covariate balance and estimate treatment effects. We evaluate the effect of California's Proposition 99 for reducing cigarette sales, by comparing California to other states not exposed to smoking reduction initiatives. SYNTH is used to reweight nontreated units to make them comparable to the treated unit. These weights are then used in ITSA regression models to assess covariate balance and estimate treatment effects. Covariate balance was achieved for all but one covariate. While California experienced a significant decrease in the annual trend of cigarette sales after Proposition 99, there was no statistically significant treatment effect when compared to synthetic controls. The advantage of using this framework over regression alone is that it ensures that a comparable control group is generated. Additionally, it offers a common set of statistical measures familiar to investigators, the capability for assessing covariate balance, and enhancement of the evaluation with a comprehensive set of postestimation measures. Therefore, this robust framework should be considered as a primary approach for evaluating treatment effects in multiple group time series analysis. © 2018 John Wiley & Sons, Ltd.
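    The ITSA regression component can be sketched as a segmented regression with level and trend terms before the interruption plus level-change and trend-change terms after it; the data below are simulated, and the synthetic-control weights would enter the same design as WLS weights:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n, t0 = 30, 18                               # years of data, interruption
    t = np.arange(n)
    post = (t >= t0).astype(float)
    y = 120 - 1.5 * t - 8 * post - 2.0 * post * (t - t0) + rng.normal(0, 2, n)

    X = sm.add_constant(np.column_stack([t, post, post * (t - t0)]))
    fit = sm.OLS(y, X).fit()                     # sm.WLS(..., weights=w) for SYNTH
    print(fit.params)   # intercept, trend, level change, trend change
    ```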

  15. A matching framework to improve causal inference in interrupted time-series analysis.

    PubMed

    Linden, Ariel

    2018-04-01

    Interrupted time-series analysis (ITSA) is a popular evaluation methodology in which a single treatment unit's outcome is studied over time and the intervention is expected to "interrupt" the level and/or trend of the outcome, subsequent to its introduction. When ITSA is implemented without a comparison group, the internal validity may be quite poor. Therefore, adding a comparable control group to serve as the counterfactual is always preferred. This paper introduces a novel matching framework, ITSAMATCH, which creates a comparable control group by matching directly on covariates and then uses these matches in the outcomes model. We evaluate the effect of California's Proposition 99 (passed in 1988) for reducing cigarette sales by comparing California to other states not exposed to smoking reduction initiatives. We compare ITSAMATCH results to two commonly used approaches: synthetic controls (SYNTH), which reweights nontreated units to make them comparable to the treated unit, and regression adjustment, which adjusts for covariates directly. The methods are compared by assessing covariate balance and treatment effects. Both ITSAMATCH and SYNTH achieved covariate balance and estimated similar treatment effects. The regression model found no treatment effect and produced inconsistent covariate adjustment. While the matching framework achieved results comparable to SYNTH, it has the advantage of being technically less complicated, while producing statistical estimates that are straightforward to interpret. Conversely, regression adjustment may "adjust away" a treatment effect. Given its advantages, ITSAMATCH should be considered as a primary approach for evaluating treatment effects in multiple-group time-series analysis. © 2017 John Wiley & Sons, Ltd.

  16. The forecasting of menstruation based on a state-space modeling of basal body temperature time series.

    PubMed

    Fukaya, Keiichi; Kawamori, Ai; Osada, Yutaka; Kitazawa, Masumi; Ishiguro, Makio

    2017-09-20

    Women's basal body temperature (BBT) shows a periodic pattern that is associated with the menstrual cycle. Although this fact suggests that daily BBT time series can be useful for estimating the underlying phase state as well as for predicting the length of the current menstrual cycle, little attention has been paid to modeling BBT time series. In this study, we propose a state-space model that involves the menstrual phase as a latent state variable to explain the daily fluctuation of BBT and the menstruation cycle length. Conditional distributions of the phase are obtained by using sequential Bayesian filtering techniques. A predictive distribution of the next menstruation day can be derived based on this conditional distribution and the model, leading to a novel statistical framework that provides a sequentially updated prediction of the upcoming menstruation day. We applied this framework to a real data set of women's BBT and menstruation days and compared the prediction accuracy of the proposed method with that of previous methods, showing that the proposed method generally provides a better prediction. Because BBT can be obtained with relatively little cost and effort, the proposed method can be useful for women's health management. Potential extensions of this framework as the basis for modeling and predicting events associated with the menstrual cycle are discussed. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  17. Quantifying Transmission Heterogeneity Using Both Pathogen Phylogenies and Incidence Time Series

    PubMed Central

    Li, Lucy M.; Grassly, Nicholas C.; Fraser, Christophe

    2017-01-01

    Heterogeneity in individual-level transmissibility can be quantified by the dispersion parameter k of the offspring distribution. Quantifying heterogeneity is important as it affects other parameter estimates, it modulates the degree of unpredictability of an epidemic, and it needs to be accounted for in models of infection control. Aggregated data such as incidence time series are often not sufficiently informative to estimate k. Incorporating phylogenetic analysis can help to estimate k concurrently with other epidemiological parameters. We have developed an inference framework that uses particle Markov chain Monte Carlo to estimate k and other epidemiological parameters using both incidence time series and the pathogen phylogeny. Using the framework to fit a modified compartmental transmission model that includes the parameter k to simulated data, we found that more accurate and less biased estimates of the reproductive number were obtained by combining epidemiological and phylogenetic analyses. However, k was most accurately estimated using the pathogen phylogeny alone. Accurately estimating k was necessary for unbiased estimates of the reproductive number, but it did not affect the accuracy of reporting probability and epidemic start date estimates. We further demonstrated that inference was possible in the presence of phylogenetic uncertainty by sampling from the posterior distribution of phylogenies. Finally, we used the inference framework to estimate transmission parameters from epidemiological and genetic data collected during a poliovirus outbreak. Despite the large degree of phylogenetic uncertainty, we demonstrated that incorporating phylogenetic data in parameter inference improved the accuracy and precision of estimates. PMID:28981709
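    To illustrate what the dispersion parameter k quantifies (this is only an illustration, not the paper's particle MCMC framework), offspring counts can be modeled as negative binomial with mean R and dispersion k, with k recovered by maximum likelihood from simulated counts:

    ```python
    import numpy as np
    from scipy.stats import nbinom
    from scipy.optimize import minimize

    rng = np.random.default_rng(5)
    R_true, k_true = 2.0, 0.3                 # reproductive number, dispersion
    # negative binomial with n=k, p=k/(k+R) has mean R and dispersion k
    counts = nbinom.rvs(k_true, k_true / (k_true + R_true), size=2000,
                        random_state=rng)

    def nll(log_params):
        R, k = np.exp(log_params)             # keep both parameters positive
        return -nbinom.logpmf(counts, k, k / (k + R)).sum()

    res = minimize(nll, x0=np.log([1.0, 1.0]))
    print(np.exp(res.x))                      # approximately [2.0, 0.3]
    ```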

  18. Recursive Bayesian recurrent neural networks for time-series modeling.

    PubMed

    Mirikitani, Derrick T; Nikolaev, Nikolay

    2010-02-01

    This paper develops a probabilistic approach to recursive second-order training of recurrent neural networks (RNNs) for improved time-series modeling. A general recursive Bayesian Levenberg-Marquardt algorithm is derived to sequentially update the weights and the covariance (Hessian) matrix. The main strengths of the approach are a principled handling of the regularization hyperparameters that leads to better generalization, and stable numerical performance. The framework involves the adaptation of a noise hyperparameter and local weight prior hyperparameters, which represent the noise in the data and the uncertainties in the model parameters. Experimental investigations using artificial and real-world data sets show that RNNs equipped with the proposed approach outperform standard real-time recurrent learning and extended Kalman training algorithms for recurrent networks, as well as other contemporary nonlinear neural models, on time-series modeling.

  19. An Algorithm Framework for Isolating Anomalous Signals in Electromagnetic Data

    NASA Astrophysics Data System (ADS)

    Kappler, K. N.; Schneider, D.; Bleier, T.; MacLean, L. S.

    2016-12-01

    QuakeFinder and its international collaborators have installed and currently maintain an array of 165 three-axis induction magnetometer instrument sites in California, Peru, Taiwan, Greece, Chile and Sumatra. Based on research by Bleier et al. (2009), Fraser-Smith et al. (1990), and Freund (2007), the electromagnetic data from these instruments are being analyzed for pre-earthquake signatures. This analysis is conducted both privately by QuakeFinder and with institutional collaborators (PUCP in Peru, NCU in Taiwan, NOA in Greece, LASP at University of Colorado, Stanford, UCLA, NASA-ESI, NASA-AMES and USC-CSEP). QuakeFinder has developed an algorithm framework aimed at isolating anomalous signals (pulses) in the time series. Results are presented from an application of this framework to induction-coil magnetometer data. Our data-driven approach starts with sliding windows applied to uniformly resampled array data with a variety of lengths and overlaps. Data variance (a proxy for energy) is calculated on each window, and a short-term average/long-term average (STA/LTA) filter is applied to the variance time series. Pulse identification is done by flagging time intervals in the STA/LTA filtered time series which exceed a threshold. Flagged time intervals are subsequently fed into a feature extraction program which computes statistical properties of the resampled data. These features are then filtered using a Principal Component Analysis (PCA) based method to cluster similar pulses. We explore the extent to which this approach categorizes pulses with known sources (e.g. cars, lightning, etc.), while the remaining pulses of unknown origin can be analyzed with respect to their relationship with seismicity. We seek a correlation between these daily pulse-counts (with known sources removed) and subsequent (days to weeks) seismic events greater than M5 within a 15 km radius. Thus we explore functions which map daily pulse-counts to a time series representing the likelihood of a seismic event occurring at some future time. These "pseudo-probabilities" can in turn be represented as Molchan diagrams. The Molchan curve provides an effective cost function for optimization and allows for a rigorous statistical assessment of the validity of pre-earthquake signals in the electromagnetic data.
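    The pulse-flagging core described above can be sketched as windowed variance, an STA/LTA filter on the variance series, and a threshold; the window lengths and threshold below are hypothetical:

    ```python
    import numpy as np

    def sta_lta_flags(x, win=50, sta=5, lta=100, thresh=4.0):
        """Variance per window as an energy proxy, STA/LTA on that series,
        then a threshold to flag candidate pulse windows."""
        v = np.array([x[i:i + win].var() for i in range(0, len(x) - win, win)])
        ratio = np.zeros_like(v)
        for i in range(lta, len(v)):
            ratio[i] = v[i - sta:i].mean() / v[i - lta:i].mean()
        return np.where(ratio > thresh)[0]

    rng = np.random.default_rng(6)
    x = rng.standard_normal(100_000)
    x[60_000:60_200] += 10 * rng.standard_normal(200)   # injected pulse
    print(sta_lta_flags(x))                             # windows near 1200
    ```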

  20. Applying time series Landsat data for vegetation change analysis in the Florida Everglades Water Conservation Area 2A during 1996-2016

    NASA Astrophysics Data System (ADS)

    Zhang, Caiyun; Smith, Molly; Lv, Jie; Fang, Chaoyang

    2017-05-01

    Mapping plant communities and documenting their changes is critical to the on-going Florida Everglades restoration project. In this study, a framework was designed to map dominant vegetation communities and inventory their changes in the Florida Everglades Water Conservation Area 2A (WCA-2A) using time series Landsat images spanning 1996-2016. An object-based change analysis technique was incorporated into the framework. A hybrid pixel/object-based change detection approach was developed to effectively collect training samples for historical images with sparse reference data. An object-based quantification approach was also developed to assess the expansion/reduction of a specific class, such as cattail (an invasive species in the Everglades), from the object-based classifications of two dates of imagery. The study confirmed results in the literature that cattail expanded substantially during 1996-2007. It also revealed that cattail expansion was constrained after 2007. Application of time series Landsat data is valuable for documenting vegetation changes in the WCA-2A impoundment. The digital techniques developed will benefit global wetland mapping and change analysis in general, and the Florida Everglades WCA-2A in particular.

  1. Cerium-based metal organic frameworks with UiO-66 architecture: synthesis, properties and redox catalytic activity.

    PubMed

    Lammert, Martin; Wharmby, Michael T; Smolders, Simon; Bueken, Bart; Lieb, Alexandra; Lomachenko, Kirill A; Vos, Dirk De; Stock, Norbert

    2015-08-14

    A series of nine Ce(IV)-based metal organic frameworks with the UiO-66 structure containing linker molecules of different sizes and functionalities were obtained under mild synthesis conditions and short reaction times. Thermal and chemical stabilities were determined, and a Ce-UiO-66-BDC/TEMPO system was successfully employed for the aerobic oxidation of benzyl alcohol.

  2. Statistical tools for analysis and modeling of cosmic populations and astronomical time series: CUDAHM and TSE

    NASA Astrophysics Data System (ADS)

    Loredo, Thomas; Budavari, Tamas; Scargle, Jeffrey D.

    2018-01-01

    This presentation provides an overview of open-source software packages addressing two challenging classes of astrostatistics problems. (1) CUDAHM is a C++ framework for hierarchical Bayesian modeling of cosmic populations, leveraging graphics processing units (GPUs) to enable applying this computationally challenging paradigm to large datasets. CUDAHM is motivated by measurement error problems in astronomy, where density estimation and linear and nonlinear regression must be addressed for populations of thousands to millions of objects whose features are measured with possibly complex uncertainties, potentially including selection effects. An example calculation demonstrates accurate GPU-accelerated luminosity function estimation for simulated populations of 10^6 objects in about two hours using a single NVIDIA Tesla K40c GPU. (2) Time Series Explorer (TSE) is a collection of software in Python and MATLAB for exploratory analysis and statistical modeling of astronomical time series. It comprises a library of stand-alone functions and classes, as well as an application environment for interactive exploration of time series data. The presentation will summarize key capabilities of this emerging project, including new algorithms for analysis of irregularly-sampled time series.

  3. Scale Mixture Models with Applications to Bayesian Inference

    NASA Astrophysics Data System (ADS)

    Qin, Zhaohui S.; Damien, Paul; Walker, Stephen

    2003-11-01

    Scale mixtures of uniform distributions are used to model non-normal data in time series and econometrics in a Bayesian framework. Heteroscedastic and skewed data models are also tackled using scale mixtures of uniform distributions.

  4. Optimal HRF and smoothing parameters for fMRI time series within an autoregressive modeling framework.

    PubMed

    Galka, Andreas; Siniatchkin, Michael; Stephani, Ulrich; Groening, Kristina; Wolff, Stephan; Bosch-Bayard, Jorge; Ozaki, Tohru

    2010-12-01

    The analysis of time series obtained by functional magnetic resonance imaging (fMRI) may be approached by fitting predictive parametric models, such as nearest-neighbor autoregressive models with exogenous input (NNARX). As a part of the modeling procedure, it is possible to apply instantaneous linear transformations to the data. Spatial smoothing, a common preprocessing step, may be interpreted as such a transformation. The autoregressive parameters may be constrained such that they provide a response behavior that corresponds to the canonical haemodynamic response function (HRF). We present an algorithm for estimating the parameters of the linear transformations and of the HRF within a rigorous maximum-likelihood framework. Using this approach, an optimal amount of both the spatial smoothing and the HRF can be estimated simultaneously for a given fMRI data set. An example from a motor-task experiment is discussed. It is found that, for this data set, weak but non-zero spatial smoothing is optimal. Furthermore, it is demonstrated that activated regions can be estimated within the maximum-likelihood framework.

  5. Design of the HELICS High-Performance Transmission-Distribution-Communication-Market Co-Simulation Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan S; Krishnamurthy, Dheepak; Top, Philip

    This paper describes the design rationale for a new cyber-physical-energy co-simulation framework for electric power systems. This new framework will support very large-scale (100,000+ federates) co-simulations with off-the-shelf power-systems, communication, and end-use models. Other key features include cross-platform operating system support, integration of both event-driven (e.g. packetized communication) and time-series (e.g. power flow) simulation, and the ability to co-iterate among federates to ensure model convergence at each time step. After describing requirements, we begin by evaluating existing co-simulation frameworks, including HLA and FMI, and conclude that none provide the required features. Then we describe the design for the new layered co-simulation architecture.

  6. Design of the HELICS High-Performance Transmission-Distribution-Communication-Market Co-Simulation Framework: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan S; Krishnamurthy, Dheepak; Top, Philip

    This paper describes the design rationale for a new cyber-physical-energy co-simulation framework for electric power systems. This new framework will support very large-scale (100,000+ federates) co-simulations with off-the-shelf power-systems, communication, and end-use models. Other key features include cross-platform operating system support, integration of both event-driven (e.g. packetized communication) and time-series (e.g. power flow) simulation, and the ability to co-iterate among federates to ensure model convergence at each time step. After describing requirements, we begin by evaluating existing co-simulation frameworks, including HLA and FMI, and conclude that none provide the required features. Then we describe the design for the new layered co-simulation architecture.

  7. Utilization of Historic Information in an Optimisation Task

    NASA Technical Reports Server (NTRS)

    Boesser, T.

    1984-01-01

    One of the basic components of a discrete model of motor behavior and decision making, which describes tracking and supervisory control in unitary terms, is assumed to be a filtering mechanism tied to the representational principles of human memory for time-series information. In a series of experiments, subjects used the time-series information with certain significant limitations: there is a range effect; asymmetric distributions seem to be recognized, but it does not seem to be possible to optimize performance based on skewed distributions. Thus there is a transformation of the displayed data between the perceptual system and representation in memory involving a loss of information. This rules out a number of representational principles for time-series information in memory and fits very well into the framework of a comprehensive discrete model for the control of complex systems, modelling continuous control (tracking), discrete responses, supervisory behavior and learning.

  8. A Fast Framework for Abrupt Change Detection Based on Binary Search Trees and Kolmogorov Statistic

    PubMed Central

    Qi, Jin-Peng; Qi, Jie; Zhang, Qing

    2016-01-01

    Change-Point (CP) detection has attracted considerable attention in the fields of data mining and statistics; it is very meaningful to discuss how to quickly and efficiently detect abrupt change in large-scale bioelectric signals. Currently, most of the existing methods, like the Kolmogorov-Smirnov (KS) statistic and so forth, are time-consuming, especially for large-scale datasets. In this paper, we propose a fast framework for abrupt change detection based on binary search trees (BSTs) and a modified KS statistic, named BSTKS (binary search trees and Kolmogorov statistic). In this method, first, two binary search trees, termed BSTcA and BSTcD, are constructed by a multilevel Haar Wavelet Transform (HWT); second, three search criteria are introduced in terms of the statistic and variance fluctuations in the diagnosed time series; last, an optimal search path is detected from the root to the leaf nodes of the two BSTs. Studies on both synthetic time series samples and real electroencephalograph (EEG) recordings indicate that the proposed BSTKS can detect abrupt change more quickly and efficiently than the KS, t-statistic (t), and Singular-Spectrum Analysis (SSA) methods, with the shortest computation time, the highest hit rate, the smallest error, and the highest accuracy of the four methods. This study suggests that the proposed BSTKS is very helpful for inspecting useful information in all kinds of bioelectric time series signals. PMID:27413364

  9. A Fast Framework for Abrupt Change Detection Based on Binary Search Trees and Kolmogorov Statistic.

    PubMed

    Qi, Jin-Peng; Qi, Jie; Zhang, Qing

    2016-01-01

    Change-Point (CP) detection has attracted considerable attention in the fields of data mining and statistics; it is very meaningful to discuss how to quickly and efficiently detect abrupt change in large-scale bioelectric signals. Currently, most of the existing methods, like the Kolmogorov-Smirnov (KS) statistic and so forth, are time-consuming, especially for large-scale datasets. In this paper, we propose a fast framework for abrupt change detection based on binary search trees (BSTs) and a modified KS statistic, named BSTKS (binary search trees and Kolmogorov statistic). In this method, first, two binary search trees, termed BSTcA and BSTcD, are constructed by a multilevel Haar Wavelet Transform (HWT); second, three search criteria are introduced in terms of the statistic and variance fluctuations in the diagnosed time series; last, an optimal search path is detected from the root to the leaf nodes of the two BSTs. Studies on both synthetic time series samples and real electroencephalograph (EEG) recordings indicate that the proposed BSTKS can detect abrupt change more quickly and efficiently than the KS, t-statistic (t), and Singular-Spectrum Analysis (SSA) methods, with the shortest computation time, the highest hit rate, the smallest error, and the highest accuracy of the four methods. This study suggests that the proposed BSTKS is very helpful for inspecting useful information in all kinds of bioelectric time series signals.
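    The ingredients of BSTKS can be sketched as a multilevel Haar approximation followed by a KS-style two-sample scan for the change point; the linear scan below replaces the paper's binary-search-tree traversal and is only illustrative:

    ```python
    import numpy as np
    from scipy.stats import ks_2samp

    def haar_approx(x, levels=3):
        """Approximation coefficients of a multilevel Haar transform
        (length must be divisible by 2**levels)."""
        for _ in range(levels):
            x = 0.5 * (x[0::2] + x[1::2])
        return x

    def ks_change_point(x, min_seg=8):
        stats = [ks_2samp(x[:i], x[i:]).statistic
                 for i in range(min_seg, len(x) - min_seg)]
        return min_seg + int(np.argmax(stats))

    rng = np.random.default_rng(7)
    x = np.concatenate([rng.normal(0, 1, 512), rng.normal(2, 1, 512)])
    coarse = haar_approx(x, levels=3)               # 1024 -> 128 samples
    print(ks_change_point(coarse) * 2 ** 3)         # ~512 in original samples
    ```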

  10. Statistical attribution analysis of the nonstationarity of the annual runoff series of the Weihe River.

    PubMed

    Xiong, Lihua; Jiang, Cong; Du, Tao

    2014-01-01

    Time-varying moments models, based on the Pearson Type III and normal distributions respectively, are built under the generalized additive model in location, scale and shape (GAMLSS) framework to analyze the nonstationarity of the annual runoff series of the Weihe River, the largest tributary of the Yellow River. The detection of nonstationarities in hydrological time series (annual runoff, precipitation and temperature) from 1960 to 2009 is carried out using a GAMLSS model, and then the covariate analysis for the annual runoff series is implemented with GAMLSS. Finally, the attribution of each covariate to the nonstationarity of annual runoff is analyzed quantitatively. The results demonstrate that (1) obvious change-points exist in all three hydrological series, (2) precipitation, temperature and irrigated area are all significant covariates of the annual runoff series, and (3) temperature increase plays the main role in the reduction of the annual runoff series in the study basin, followed by the decrease of precipitation and the increase of irrigated area.

  11. MVIAeval: a web tool for comprehensively evaluating the performance of a new missing value imputation algorithm.

    PubMed

    Wu, Wei-Sheng; Jhou, Meng-Jhun

    2017-01-13

    Missing value imputation is important for microarray data analyses because microarray data with missing values would significantly degrade the performance of the downstream analyses. Although many microarray missing value imputation algorithms have been developed, an objective and comprehensive performance comparison framework is still lacking. To solve this problem, we previously proposed a framework which can perform a comprehensive performance comparison of different existing algorithms, and which can also evaluate the performance of a new algorithm. However, constructing our framework is not an easy task for interested researchers. To save researchers' time and effort, here we present an easy-to-use web tool named MVIAeval (Missing Value Imputation Algorithm evaluator), which implements our performance comparison framework. MVIAeval provides a user-friendly interface allowing users to upload the R code of their new algorithm and select (i) the test datasets among 20 benchmark microarray (time series and non-time series) datasets, (ii) the compared algorithms among 12 existing algorithms, (iii) the performance indices from three existing ones, (iv) the comprehensive performance scores from two possible choices, and (v) the number of simulation runs. The comprehensive performance comparison results are then generated and shown as both figures and tables. MVIAeval is a useful tool for researchers to easily conduct a comprehensive and objective performance evaluation of their newly developed missing value imputation algorithm for microarray data or any data which can be represented in matrix form (e.g. NGS data or proteomics data). Thus, MVIAeval will greatly expedite the progress of research on missing value imputation algorithms.

  12. A novel multi-target regression framework for time-series prediction of drug efficacy.

    PubMed

    Li, Haiqing; Zhang, Wei; Chen, Ying; Guo, Yumeng; Li, Guo-Zheng; Zhu, Xiaoxin

    2017-01-18

    Extracting knowledge from small samples is a challenging pharmacokinetic problem to which statistical methods can be applied. Pharmacokinetic data are special due to the small samples of high dimensionality, which makes it difficult to adopt conventional methods to predict the efficacy of a traditional Chinese medicine (TCM) prescription. The main purpose of our study is to obtain some knowledge of the correlations in a TCM prescription. Here, a novel method named the Multi-target Regression Framework is proposed to deal with the problem of efficacy prediction. We employ the correlation between the values of different time sequences and add the predictive targets of previous times as features to predict the value of the current time. Several experiments were conducted to test the validity of our method, and the results of leave-one-out cross-validation clearly demonstrate the competitiveness of our framework. Compared with linear regression, artificial neural networks, and partial least squares, support vector regression combined with our framework demonstrates the best performance and appears to be more suitable for this task.

  13. A novel multi-target regression framework for time-series prediction of drug efficacy

    PubMed Central

    Li, Haiqing; Zhang, Wei; Chen, Ying; Guo, Yumeng; Li, Guo-Zheng; Zhu, Xiaoxin

    2017-01-01

    Extracting knowledge from small samples is a challenging pharmacokinetic problem to which statistical methods can be applied. Pharmacokinetic data are special due to the small samples of high dimensionality, which makes it difficult to adopt conventional methods to predict the efficacy of a traditional Chinese medicine (TCM) prescription. The main purpose of our study is to obtain some knowledge of the correlations in a TCM prescription. Here, a novel method named the Multi-target Regression Framework is proposed to deal with the problem of efficacy prediction. We employ the correlation between the values of different time sequences and add the predictive targets of previous times as features to predict the value of the current time. Several experiments were conducted to test the validity of our method, and the results of leave-one-out cross-validation clearly demonstrate the competitiveness of our framework. Compared with linear regression, artificial neural networks, and partial least squares, support vector regression combined with our framework demonstrates the best performance and appears to be more suitable for this task. PMID:28098186
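    The lagged-target idea can be sketched by appending the targets observed at time t-1 to the features used to predict time t, here with support vector regression; the data shapes and names are hypothetical:

    ```python
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.multioutput import MultiOutputRegressor

    rng = np.random.default_rng(8)
    n, p, m = 40, 6, 3                  # time points, base features, targets
    X = rng.standard_normal((n, p))
    Y = np.cumsum(rng.standard_normal((n, m)), axis=0)   # smooth target series

    # features at time t = base features at t + targets observed at t-1
    X_aug = np.hstack([X[1:], Y[:-1]])
    model = MultiOutputRegressor(SVR(C=1.0)).fit(X_aug, Y[1:])
    print(model.predict(X_aug[-1:]))    # one-step-ahead multi-target prediction
    ```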

  14. Flood mapping in ungauged basins using fully continuous hydrologic-hydraulic modeling

    NASA Astrophysics Data System (ADS)

    Grimaldi, Salvatore; Petroselli, Andrea; Arcangeletti, Ettore; Nardi, Fernando

    2013-04-01

    In this work, a fully-continuous hydrologic-hydraulic modeling framework for flood mapping is introduced and tested. It is characterized by the simulation of a long rainfall time series at sub-daily resolution that feeds a continuous rainfall-runoff model producing a discharge time series, which is directly given as input to a bi-dimensional hydraulic model. The main advantage of the proposed approach is to avoid the use of the design hyetograph and the design hydrograph, which constitute the main source of subjective analysis and uncertainty in standard methods. The proposed procedure is optimized for small and ungauged watersheds where empirical models are commonly applied. Results of a simple real case study confirm that this experimental fully-continuous framework may pave the way for the implementation of a less subjective and potentially automated procedure for flood hazard mapping.

  15. A combinatorial framework to quantify peak/pit asymmetries in complex dynamics.

    PubMed

    Hasson, Uri; Iacovacci, Jacopo; Davis, Ben; Flanagan, Ryan; Tagliazucchi, Enzo; Laufs, Helmut; Lacasa, Lucas

    2018-02-23

    We explore a combinatorial framework which efficiently quantifies the asymmetries between minima and maxima in the local fluctuations of time series. We first showcase its performance by applying it to a battery of synthetic cases. We find rigorous results for some canonical dynamical models (stochastic processes with and without correlations, chaotic processes), complemented by extensive numerical simulations for a range of processes which indicate that the methodology correctly distinguishes different complex dynamics and outperforms state-of-the-art metrics in several cases. Subsequently, we apply this methodology to real-world problems emerging across several disciplines, including cases in neurobiology, finance and climate science. We conclude that differences between the statistics of local maxima and local minima in time series are highly informative of the complex underlying dynamics, and a graph-theoretic extraction procedure allows these features to be used for statistical learning purposes.
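    A much simpler cousin of the peak/pit asymmetry statistic, comparing the value distributions of local maxima and local minima with a KS distance (the paper's combinatorial, graph-theoretic construction is more elaborate; this sketch only conveys the underlying intuition):

    ```python
    import numpy as np
    from scipy.signal import find_peaks
    from scipy.stats import ks_2samp

    def peak_pit_asymmetry(x):
        """KS distance between the fluctuation sizes at local maxima
        and at local minima; larger values indicate more asymmetry."""
        peaks, _ = find_peaks(x)
        pits, _ = find_peaks(-x)
        return ks_2samp(x[peaks] - x.mean(), x.mean() - x[pits]).statistic

    rng = np.random.default_rng(9)
    print(peak_pit_asymmetry(rng.standard_normal(5000)))  # small: symmetric
    print(peak_pit_asymmetry(rng.gamma(1.5, size=5000)))  # larger: skewed
    ```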

  16. What InSAR time-series methods are best suited for the Ecuadorian volcanoes

    NASA Astrophysics Data System (ADS)

    Mirzaee, S.; Amelung, F.

    2017-12-01

    Ground displacement measurements from stacks of SAR images obtained using interferometric time-series approaches play an increasingly important role in volcanic hazard assessment. Inflation of the ground surface can indicate that magma is ascending to shallower levels and that a volcano is getting ready for an eruption. Commonly used InSAR time-series approaches include Small Baseline (SB), Persistent Scatterer InSAR (PSI) and SqueeSAR methods, but it remains unclear which approach is best suited for volcanic environments. In this poster we present InSAR deformation measurements for the active volcanoes of Ecuador (Cotopaxi, Tungurahua and Pichincha) using a variety of InSAR time-series methods. We discuss the pros and cons of each method given the available data stacks (TerraSAR-X, COSMO-SkyMed and Sentinel-1) in an effort to design a comprehensive observation strategy for the Ecuadorian volcanoes. SAR data are provided in the framework of the Group on Earth Observations' Ecuadorian Volcano Geohazard Supersite.

  17. Paleoclimate networks: a concept meeting central challenges in the reconstruction of paleoclimate dynamics

    NASA Astrophysics Data System (ADS)

    Rehfeld, Kira; Goswami, Bedartha; Marwan, Norbert; Breitenbach, Sebastian; Kurths, Jürgen

    2013-04-01

    Statistical analysis of dependencies amongst paleoclimate data helps to infer the climatic processes they reflect. Three key challenges have to be addressed, however: the datasets are heterogeneous in (i) time and (ii) space, and (iii) time itself is a variable that needs to be reconstructed, which introduces additional uncertainties. To address these issues in a flexible way, we developed the paleoclimate network framework, inspired by the increasing application of complex networks in climate research. Each node in the paleoclimate network represents a paleoclimate archive and an associated time series. Links between these nodes are assigned if their time series are significantly similar. The base of the paleoclimate network is therefore formed by linear and nonlinear estimators of Pearson correlation, mutual information and event synchronization, which quantify similarity from irregularly sampled time series. Age uncertainties are propagated into the final network analysis using time series ensembles which reflect the uncertainty. We discuss how spatial heterogeneity influences the results obtained from network measures, and demonstrate the power of the approach by inferring teleconnection variability of the Asian summer monsoon for the past 1000 years.

  18. Frequency-phase analysis of resting-state functional MRI

    PubMed Central

    Goelman, Gadi; Dan, Rotem; Růžička, Filip; Bezdicek, Ondrej; Růžička, Evžen; Roth, Jan; Vymazal, Josef; Jech, Robert

    2017-01-01

    We describe an analysis method that characterizes the correlation between coupled time-series functions by their frequencies and phases. It provides a unified framework for the simultaneous assessment of the frequency and latency of a coupled time-series. The analysis is demonstrated on resting-state functional MRI data of 34 healthy subjects. Interactions between fMRI time-series are represented by cross-correlation (with time-lag) functions. A general linear model is used on the cross-correlation functions to obtain the frequencies and phase-differences of the original time-series. We define symmetric, antisymmetric and asymmetric cross-correlation functions that correspond, respectively, to in-phase, 90° out-of-phase and any phase difference between a pair of time-series, the latter two of which had not been introduced before. Seed maps of the motor system were calculated to demonstrate the strength and capabilities of the analysis. Unique types of functional connections, their dominant frequencies and phase-differences have been identified. The relation between phase-differences and time-delays is shown. The phase-differences are speculated to inform transfer-time and/or to reflect a difference in the hemodynamic response between regions that are modulated by neurotransmitter concentrations. The analysis can be used with any coupled functions in many disciplines, including electrophysiology, EEG or MEG in neuroscience. PMID:28272522

  19. A novel framework for simulating non-stationary, non-linear, non-Normal hydrological time series using Markov Switching Autoregressive Models

    NASA Astrophysics Data System (ADS)

    Birkel, C.; Paroli, R.; Spezia, L.; Tetzlaff, D.; Soulsby, C.

    2012-12-01

    In this paper we present a novel model framework using the class of Markov Switching Autoregressive Models (MSARMs) to examine catchments as complex stochastic systems that exhibit non-stationary, non-linear and non-Normal rainfall-runoff and solute dynamics. MSARMs are pairs of stochastic processes, one observed and one unobserved, or hidden. We model the unobserved process as a finite-state Markov chain and assume that the observed process, given the hidden Markov chain, is conditionally autoregressive, meaning that the current observation depends on its recent past (system memory). The model is fully embedded in a Bayesian analysis based on Markov Chain Monte Carlo (MCMC) algorithms for model selection and uncertainty assessment, whereby the autoregressive order and the dimension of the hidden Markov chain state-space are essentially self-selected. The hidden states of the Markov chain represent unobserved levels of variability in the observed process that may result from complex interactions between hydroclimatic variability on the one hand and catchment characteristics affecting water and solute storage on the other. To deal with non-stationarity, additional meteorological and hydrological time series, along with a periodic component, can be included in the MSARMs as covariates. This extension allows identification of potential underlying drivers of temporal rainfall-runoff and solute dynamics. We applied the MSARM framework to streamflow and conservative tracer (deuterium and oxygen-18) time series from an intensively monitored 2.3 km2 experimental catchment in eastern Scotland. Statistical time series analysis, in the form of MSARMs, suggested that the streamflow and isotope tracer time series are not controlled by simple linear rules. MSARMs showed that the dependence of current observations on past inputs, which transport models often capture in the form of long-tailed travel time and residence time distributions, can be efficiently explained by non-stationarity of the system input (climatic variability) and/or the complexity of catchment storage characteristics. The statistical model is also capable of reproducing short-term (event) and longer-term (inter-event), as well as wet and dry, dynamical "hydrological states". These reflect the non-linear transport mechanisms of flow pathways induced by transient climatic and hydrological variables and modified by catchment characteristics. We conclude that MSARMs are a powerful tool for analyzing the temporal dynamics of hydrological data, allowing for explicit treatment of non-stationary, non-linear and non-Normal characteristics.
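
    A minimal sketch of the generative side of an MSARM: a two-state hidden Markov chain switching the parameters of a conditionally autoregressive observation process. The Bayesian MCMC estimation used in the paper is not shown, and all parameter values and state labels are hypothetical.

        import numpy as np

        rng = np.random.default_rng(42)
        T = 500

        # Hidden Markov chain: state 0 = "dry", state 1 = "wet" (hypothetical labels).
        P = np.array([[0.95, 0.05],
                      [0.10, 0.90]])            # transition probabilities
        phi   = np.array([0.6, 0.9])            # AR(1) coefficient per state
        mu    = np.array([1.0, 4.0])            # level per state
        sigma = np.array([0.2, 0.8])            # noise scale per state

        s = np.zeros(T, dtype=int)
        y = np.zeros(T)
        for t in range(1, T):
            s[t] = rng.choice(2, p=P[s[t - 1]])  # hidden state update
            # Observation is conditionally AR(1) given the current hidden state.
            y[t] = (mu[s[t]] + phi[s[t]] * (y[t - 1] - mu[s[t]])
                    + sigma[s[t]] * rng.standard_normal())

        print("fraction of time in wet state: %.2f" % s.mean())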

  20. A Modularized Efficient Framework for Non-Markov Time Series Estimation

    NASA Astrophysics Data System (ADS)

    Schamberg, Gabriel; Ba, Demba; Coleman, Todd P.

    2018-06-01

    We present a compartmentalized approach to finding the maximum a posteriori (MAP) estimate of a latent time series that obeys a dynamic stochastic model and is observed through noisy measurements. We specifically consider modern signal processing problems with non-Markov signal dynamics (e.g. group sparsity) and/or non-Gaussian measurement models (e.g. point-process observation models used in neuroscience). Through the use of auxiliary variables in the MAP estimation problem, we show that a consensus formulation of the alternating direction method of multipliers (ADMM) enables iteratively computing separate estimates based on the likelihood and the prior and subsequently "averaging" them in an appropriate sense using a Kalman smoother. The approach applies to a broad class of problem settings and requires only modular adjustments when various aspects of the statistical model are interchanged. Under broad log-concavity assumptions, we show that the separate estimation problems are convex optimization problems and that the iterative algorithm converges to the MAP estimate. The framework can thus capture non-Markov latent time series models and non-Gaussian measurement models. We provide example applications involving (i) group-sparsity priors, within the context of electrophysiologic spectrotemporal estimation, and (ii) non-Gaussian measurement models, within the context of dynamic analyses of learning with neural spiking and behavioral observations.
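
    A minimal consensus-ADMM sketch of the splitting idea described above, on a deliberately simple toy problem in which two quadratic terms stand in for the likelihood and prior subproblems; the paper's Kalman-smoother averaging step and non-Gaussian models are not reproduced.

        import numpy as np

        # Toy consensus problem: minimize f1(x) + f2(x) with
        # f_i(x) = 0.5 * w_i * ||x - a_i||^2  (stand-ins for likelihood and prior).
        a = [np.array([1.0, 3.0]), np.array([5.0, -1.0])]
        w = [1.0, 3.0]
        rho = 1.0

        x = [np.zeros(2) for _ in range(2)]   # local copies of the estimate
        u = [np.zeros(2) for _ in range(2)]   # scaled dual variables
        z = np.zeros(2)                        # consensus variable

        for it in range(100):
            for i in range(2):
                # Closed-form prox of f_i evaluated at (z - u_i).
                x[i] = (w[i] * a[i] + rho * (z - u[i])) / (w[i] + rho)
            z = np.mean([x[i] + u[i] for i in range(2)], axis=0)   # "averaging" step
            for i in range(2):
                u[i] += x[i] - z                                    # dual update

        exact = (w[0] * a[0] + w[1] * a[1]) / (w[0] + w[1])
        print("ADMM:", np.round(z, 4), "exact:", np.round(exact, 4))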

  1. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
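
    A minimal sketch of the inlier/outlier segmentation idea, assuming a one-dimensional feature series: fit a simple statistical model (here a Gaussian) to each subsequence, build an affinity matrix from pairwise model distances, and read a background/outlier split from the second eigenvector of the normalized affinity. The actual audio features and model comparison of the framework are simplified away, and all data are synthetic.

        import numpy as np

        rng = np.random.default_rng(1)
        # Hypothetical 1-D feature time series: quiet background with two loud bursts.
        x = rng.normal(0.0, 1.0, 2000)
        x[400:450] += rng.normal(6.0, 2.0, 50)
        x[1500:1540] += rng.normal(6.0, 2.0, 40)

        win = 50
        segs = x[: len(x) // win * win].reshape(-1, win)
        # Fit a Gaussian model (mean, std) to every subsequence.
        params = np.column_stack([segs.mean(axis=1), segs.std(axis=1)])

        # Affinity between subsequences: Gaussian kernel on model-parameter distance.
        d2 = ((params[:, None, :] - params[None, :, :]) ** 2).sum(-1)
        A = np.exp(-d2 / (2 * np.median(d2)))

        # The second eigenvector of the symmetrically normalized affinity
        # bipartitions the windows into dominant background and sparse outliers.
        Dinv = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
        evals, evecs = np.linalg.eigh(Dinv @ A @ Dinv)
        labels = (evecs[:, -2] > 0).astype(int)
        minority = labels if labels.sum() <= len(labels) / 2 else 1 - labels
        print("outlier windows:", np.where(minority == 1)[0])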

  2. Robust evaluation of time series classification algorithms for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Harvey, Dustin Y.; Worden, Keith; Todd, Michael D.

    2014-03-01

    Structural health monitoring (SHM) systems provide real-time damage and performance information for civil, aerospace, and mechanical infrastructure through analysis of structural response measurements. The supervised learning methodology for data-driven SHM involves computation of low-dimensional, damage-sensitive features from raw measurement data that are then used in conjunction with machine learning algorithms to detect, classify, and quantify damage states. However, these systems often suffer from performance degradation in real-world applications due to varying operational and environmental conditions. Probabilistic approaches to robust SHM system design suffer from incomplete knowledge of all conditions a system will experience over its lifetime. Info-gap decision theory enables nonprobabilistic evaluation of the robustness of competing models and systems in a variety of decision making applications. Previous work employed info-gap models to handle feature uncertainty when selecting various components of a supervised learning system, namely features from a pre-selected family and classifiers. In this work, the info-gap framework is extended to robust feature design and classifier selection for general time series classification through an efficient, interval arithmetic implementation of an info-gap data model. Experimental results are presented for a damage type classification problem on a ball bearing in a rotating machine. The info-gap framework in conjunction with an evolutionary feature design system allows for fully automated design of a time series classifier to meet performance requirements under maximum allowable uncertainty.

  3. Scenario Generation and Assessment Framework Solution in Support of the Comprehensive Approach

    DTIC Science & Technology

    2010-04-01

    attention, stress, fatigue etc.) and neurofeedback tracking for evaluation in a qualitative manner the real involvement of the trained participants in CAX...Series, Softrade, 2006 (in Bulgarian). [11] Minchev Z., Dukov G., Georgiev S. EEG Spectral Analysis in Serious Gaming: An Ad Hoc Experimental...Nonlinear and linear forecasting of the EEG time series, Biological Cybernetics, 66, 221-259, 1991. [20] Schubert, J., Svenson, P., and Mårtenson, Ch

  4. A Multipixel Time Series Analysis Method Accounting for Ground Motion, Atmospheric Noise, and Orbital Errors

    NASA Astrophysics Data System (ADS)

    Jolivet, R.; Simons, M.

    2018-02-01

    Interferometric synthetic aperture radar time series methods aim to reconstruct time-dependent ground displacements over large areas from sets of interferograms in order to detect transient, periodic, or small-amplitude deformation. Because of computational limitations, most existing methods consider each pixel independently, ignoring important spatial covariances between observations. We describe a framework to reconstruct time series of ground deformation while considering all pixels simultaneously, allowing us to account for spatial covariances, imprecise orbits, and residual atmospheric perturbations. We describe spatial covariances by an exponential decay function dependent on pixel-to-pixel distance. We approximate the impact of imprecise orbit information and residual long-wavelength atmosphere as a low-order polynomial function. Tests on synthetic data illustrate the importance of incorporating full covariances between pixels in order to avoid biased parameter reconstruction. An example of application to the northern Chilean subduction zone highlights the potential of this method.
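
    A minimal sketch of the two ingredients named above, on synthetic data: an exponential-decay covariance over pixel-to-pixel distance used inside a generalized least-squares estimate, with a planar ramp standing in for the low-order polynomial that absorbs orbital and long-wavelength atmospheric errors. All numbers are hypothetical.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100
        coords = rng.uniform(0, 50, (n, 2))              # pixel positions (km)

        # Exponential spatial covariance: C_ij = sigma^2 * exp(-d_ij / lam).
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        sigma2, lam = 1.0, 10.0
        C = sigma2 * np.exp(-d / lam) + 1e-9 * np.eye(n)

        # Synthetic observation: uniform displacement + planar ramp + correlated noise.
        disp_true = 5.0
        ramp = 0.02 * coords[:, 0] - 0.01 * coords[:, 1]
        noise = np.linalg.cholesky(C) @ rng.standard_normal(n)
        obs = disp_true + ramp + noise

        # GLS over [constant, x-ramp, y-ramp], weighting by the full covariance.
        G = np.column_stack([np.ones(n), coords[:, 0], coords[:, 1]])
        Ci = np.linalg.inv(C)
        m = np.linalg.solve(G.T @ Ci @ G, G.T @ Ci @ obs)
        print("displacement, ramp_x, ramp_y:", np.round(m, 3))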

  5. A model-free characterization of recurrences in stationary time series

    NASA Astrophysics Data System (ADS)

    Chicheportiche, Rémy; Chakraborti, Anirban

    2017-05-01

    The study of recurrences in earthquakes, climate, financial time series, etc. is crucial to better forecast disasters and limit their consequences. Most previous phenomenological studies of recurrences have involved only a long-ranged autocorrelation function and have ignored the multi-scaling properties induced by potential higher-order dependencies. We argue that copulas are a natural model-free framework to study non-linear dependencies in time series and related concepts like recurrences. Consequently, we find that (i) non-linear dependencies do impact both the statistics and dynamics of recurrence times, and (ii) the scaling arguments for the unconditional distribution may not be applicable. Hence, fitting and/or simulating the intertemporal distribution of recurrence intervals is very much system-specific and cannot benefit from universal features, in contrast to previous claims. This has important implications in epilepsy prognosis and financial risk management applications.

  6. Determination of sample size for higher volatile data using new framework of Box-Jenkins model with GARCH: A case study on gold price

    NASA Astrophysics Data System (ADS)

    Roslindar Yaziz, Siti; Zakaria, Roslinazairimah; Hura Ahmad, Maizah

    2017-09-01

    The Box-Jenkins - GARCH model has been shown to be a promising tool for forecasting highly volatile time series. In this study, a framework for determining the optimal sample size using the Box-Jenkins model with GARCH is proposed for practical application in analysing and forecasting highly volatile data. The proposed framework is applied to the daily world gold price series from 1971 to 2013. The data are divided into 12 different sample sizes (from 30 to 10200 observations). Each sample is tested using different combinations of the hybrid Box-Jenkins - GARCH model. Our study shows that the optimal sample size for forecasting the gold price using the framework of the hybrid model is 1250 observations, i.e. a 5-year sample. Hence, the empirical results of the model selection criteria and 1-step-ahead forecasting evaluations suggest that the most recent 12.25% (5 years) of the 10200 observations is sufficient for the Box-Jenkins - GARCH model, with forecasting performance similar to that obtained using the full 41-year series.
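
    A common two-stage way to fit such a hybrid, sketched here with statsmodels and the arch package: a Box-Jenkins ARIMA mean model, then a GARCH(1,1) fit to its residuals. This illustrates the model class only; the study's exact hybrid combinations, orders and sample-size search are not reproduced, and the series below is synthetic.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA
        from arch import arch_model

        rng = np.random.default_rng(7)
        # Hypothetical stand-in for a gold price series: random walk with drift.
        log_price = np.cumsum(0.0002 + 0.01 * rng.standard_normal(1250))

        # Stage 1: Box-Jenkins mean model on log prices (order chosen for illustration).
        arima_res = ARIMA(log_price, order=(1, 1, 1)).fit()

        # Stage 2: GARCH(1,1) on the ARIMA residuals to capture volatility clustering
        # (the first residual is dropped because of the differencing).
        garch_res = arch_model(arima_res.resid[1:], vol="Garch", p=1, q=1).fit(disp="off")

        print(garch_res.params)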

  7. Enrollment Projection within a Decision-Making Framework.

    ERIC Educational Resources Information Center

    Armstrong, David F.; Nunley, Charlene Wenckowski

    1981-01-01

    Two methods used to predict enrollment at Montgomery College in Maryland are compared and evaluated, and the administrative context in which they are used is considered. The two methods involve time series analysis (curve fitting) and indicator techniques (yield from components). (MSE)

  8. A tool for NDVI time series extraction from wide-swath remotely sensed images

    NASA Astrophysics Data System (ADS)

    Li, Zhishan; Shi, Runhe; Zhou, Cong

    2015-09-01

    The Normalized Difference Vegetation Index (NDVI) is one of the most widely used indicators for monitoring vegetation coverage on the land surface. The time-series features of NDVI are capable of reflecting dynamic changes of various ecosystems. Calculating NDVI from Moderate Resolution Imaging Spectroradiometer (MODIS) and other wide-swath remotely sensed images provides an important way to monitor the spatial and temporal characteristics of large-scale NDVI. However, difficulties still exist for ecologists in extracting such information correctly and efficiently, because of the specialized processing the original remote sensing images require, including radiometric calibration, geometric correction, multiple data composition and curve smoothing. In this study, we developed an efficient and convenient online toolbox with a friendly graphical user interface for non-specialists who want to extract NDVI time series. Technically, it is based on Java Web and Web GIS, with the Struts, Spring and Hibernate frameworks (SSH) integrated for ease of maintenance and expansion. Latitude, longitude and time period are the key inputs users need to provide, and the NDVI time series are calculated automatically.
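
    Once the preprocessing the tool automates is done, the index itself is a one-line, per-pixel computation, NDVI = (NIR − Red) / (NIR + Red). A minimal sketch with hypothetical reflectance arrays:

        import numpy as np

        def ndvi(nir, red, eps=1e-9):
            """Per-pixel NDVI = (NIR - Red) / (NIR + Red), guarded against /0."""
            nir = np.asarray(nir, dtype=float)
            red = np.asarray(red, dtype=float)
            return (nir - red) / (nir + red + eps)

        # Hypothetical stack: 12 monthly scenes of 2x2-pixel reflectances.
        rng = np.random.default_rng(0)
        nir_stack = rng.uniform(0.3, 0.6, (12, 2, 2))
        red_stack = rng.uniform(0.05, 0.2, (12, 2, 2))

        series = ndvi(nir_stack, red_stack)      # shape (12, 2, 2)
        pixel_ts = series[:, 0, 0]               # NDVI time series for one location
        print(np.round(pixel_ts, 3))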

  9. Agatha: Disentangling period signals from correlated noise in a periodogram framework

    NASA Astrophysics Data System (ADS)

    Feng, F.; Tuomi, M.; Jones, H. R. A.

    2018-04-01

    Agatha is a framework of periodograms to disentangle periodic signals from correlated noise and to solve the two-dimensional model selection problem: signal dimension and noise model dimension. These periodograms are calculated by applying likelihood maximization and marginalization and combined in a self-consistent way. Agatha can be used to select the optimal noise model and to test the consistency of signals in time and can be applied to time series analyses in other astronomical and scientific disciplines. An interactive web implementation of the software is also available at http://agatha.herts.ac.uk/.

  10. 33 CFR Appendix B to Part 263 - Application of Multiobjective Planning Framework to Continuing Authorities Program

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...

  11. 33 CFR Appendix B to Part 263 - Application of Multiobjective Planning Framework to Continuing Authorities Program

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...

  12. 33 CFR Appendix B to Part 263 - Application of Multiobjective Planning Framework to Continuing Authorities Program

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...

  13. 33 CFR Appendix B to Part 263 - Application of Multiobjective Planning Framework to Continuing Authorities Program

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...

  14. 33 CFR Appendix B to Part 263 - Application of Multiobjective Planning Framework to Continuing Authorities Program

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...

  15. Predicting critical transitions in dynamical systems from time series using nonstationary probability density modeling.

    PubMed

    Kwasniok, Frank

    2013-11-01

    A time series analysis method for predicting the probability density of a dynamical system is proposed. A nonstationary parametric model of the probability density is estimated from data within a maximum likelihood framework and then extrapolated to forecast the future probability density and explore the system for critical transitions or tipping points. Parameter uncertainty is systematically accounted for. The technique is generic, independent of the underlying dynamics of the system. The method is verified on simulated data and then applied to prediction of Arctic sea-ice extent.
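
    A minimal sketch of the idea under a strong simplification: a Gaussian whose mean drifts linearly in time, fitted by maximum likelihood and extrapolated to a future time. The paper's model class and its systematic treatment of parameter uncertainty are richer; all values below are synthetic.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(5)
        t = np.arange(200.0)
        x = 10.0 - 0.02 * t + 0.5 * rng.standard_normal(t.size)  # drifting observable

        def neg_log_lik(theta):
            a, b, log_s = theta
            mu = a + b * t                      # time-dependent (nonstationary) mean
            s2 = np.exp(2 * log_s)
            return 0.5 * np.sum((x - mu) ** 2 / s2 + np.log(2 * np.pi * s2))

        fit = minimize(neg_log_lik, x0=np.array([x.mean(), 0.0, 0.0]))
        a, b, log_s = fit.x

        # Extrapolate the fitted nonstationary density to a future time.
        t_future = 400.0
        print("forecast mean %.2f, sd %.2f at t=%d"
              % (a + b * t_future, np.exp(log_s), t_future))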

  16. A novel Bayesian approach to acoustic emission data analysis.

    PubMed

    Agletdinov, E; Pomponi, E; Merson, D; Vinogradov, A

    2016-12-01

    The acoustic emission (AE) technique is a popular tool for materials characterization and non-destructive testing. Originating from the stochastic motion of defects in solids, AE is a random process by nature. A challenging problem arises whenever an attempt is made to identify points corresponding to changes in the trends of the fluctuating AE time series. A general Bayesian framework is proposed for the analysis of AE time series, aimed at automatically finding the breakpoints that signal a crossover in the dynamics of the underlying AE sources. Copyright © 2016 Elsevier B.V. All rights reserved.
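
    A minimal Bayesian-breakpoint sketch in the same spirit, assuming a single shift in the mean of a Gaussian series with unit variance: the posterior over the breakpoint location follows from per-split likelihoods under a flat prior. The paper's model for AE time series is more general; the data below are synthetic.

        import numpy as np

        rng = np.random.default_rng(2)
        # Synthetic AE-like record: a shift in mean level at t = 120.
        x = np.concatenate([rng.normal(0.0, 1.0, 120), rng.normal(1.5, 1.0, 80)])
        n = len(x)

        def loglik(k):
            # Log-likelihood of a breakpoint at k (segment means fitted,
            # unit variance assumed for simplicity).
            l, r = x[:k], x[k:]
            return -0.5 * (((l - l.mean()) ** 2).sum() + ((r - r.mean()) ** 2).sum())

        ks = np.arange(10, n - 10)               # keep both segments non-trivial
        ll = np.array([loglik(k) for k in ks])

        # Flat prior over k -> posterior proportional to the likelihood.
        post = np.exp(ll - ll.max())
        post /= post.sum()
        print("MAP breakpoint:", ks[post.argmax()],
              " posterior mass there: %.2f" % post.max())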

  17. Kernel canonical-correlation Granger causality for multiple time series

    NASA Astrophysics Data System (ADS)

    Wu, Guorong; Duan, Xujun; Liao, Wei; Gao, Qing; Chen, Huafu

    2011-04-01

    Canonical-correlation analysis as a multivariate statistical technique has been applied to multivariate Granger causality analysis to infer information flow in complex systems. It shows unique appeal and great superiority over the traditional vector autoregressive method, due to the simplified procedure that detects causal interaction between multiple time series, and the avoidance of potential model estimation problems. However, it is limited to the linear case. Here, we extend the framework of canonical correlation to include the estimation of multivariate nonlinear Granger causality for drawing inference about directed interaction. Its feasibility and effectiveness are verified on simulated data.

  18. Multimedia content description framework

    NASA Technical Reports Server (NTRS)

    Bergman, Lawrence David (Inventor); Mohan, Rakesh (Inventor); Li, Chung-Sheng (Inventor); Smith, John Richard (Inventor); Kim, Michelle Yoonk Yung (Inventor)

    2003-01-01

    A framework is provided for describing multimedia content and a system in which a plurality of multimedia storage devices employing the content description methods of the present invention can interoperate. In accordance with one form of the present invention, the content description framework is a description scheme (DS) for describing streams or aggregations of multimedia objects, which may comprise audio, images, video, text, time series, and various other modalities. This description scheme can accommodate an essentially limitless number of descriptors in terms of features, semantics or metadata, and facilitate content-based search, index, and retrieval, among other capabilities, for both streamed or aggregated multimedia objects.

  19. Discovering System Health Anomalies Using Data Mining Techniques

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok N.

    2005-01-01

    We present a data mining framework for the analysis and discovery of anomalies in high-dimensional time series of sensor measurements that would be found in an Integrated System Health Monitoring system. We specifically treat the problem of discovering anomalous features in the time series that may be indicative of a system anomaly, or, in the case of a manned system, an anomaly due to the human. Identification of these anomalies is crucial to building stable, reusable, and cost-efficient systems. The framework consists of an analysis platform and new algorithms that can scale to thousands of sensor streams to discover temporal anomalies. We discuss the mathematical framework that underlies the system and also describe in detail how this framework is general enough to encompass both discrete and continuous sensor measurements. We also describe a new set of data mining algorithms based on kernel methods and hidden Markov models that allow for the rapid assimilation, analysis, and discovery of system anomalies. We then describe the performance of the system on a real-world problem in the aircraft domain where we analyze the cockpit data from aircraft as well as data from the aircraft propulsion, control, and guidance systems. These data are discrete and continuous sensor measurements and are dealt with seamlessly in order to discover anomalous flights. We conclude with recommendations that describe the tradeoffs in building an integrated scalable platform for robust anomaly detection in ISHM applications.

  20. Functional connectivity analysis in resting state fMRI with echo-state networks and non-metric clustering for network structure recovery

    NASA Astrophysics Data System (ADS)

    Wismüller, Axel; DSouza, Adora M.; Abidin, Anas Z.; Wang, Xixi; Hobbs, Susan K.; Nagarajan, Mahesh B.

    2015-03-01

    Echo state networks (ESN) are recurrent neural networks where the hidden layer is replaced with a fixed reservoir of neurons. Unlike feed-forward networks, neuron training in ESN is restricted to the output neurons alone, thereby providing a computational advantage. We demonstrate the use of such ESNs in our mutual connectivity analysis (MCA) framework for recovering the primary motor cortex network associated with hand movement from resting-state functional MRI (fMRI) data. The framework consists of two steps: (1) defining a pair-wise affinity matrix between different pixel time series within the brain to characterize network activity, and (2) recovering network components from the affinity matrix with non-metric clustering. Here, ESNs are used to evaluate pair-wise cross-estimation performance between pixel time series to create the affinity matrix, which is subsequently subjected to non-metric clustering with the Louvain method. For comparison, the ground truth of the motor cortex network structure is established with a task-based fMRI sequence. Overlap between the primary motor cortex network recovered with our model-free MCA approach and the ground truth was measured with the Dice coefficient. Our results show that network recovery with our proposed MCA approach is in close agreement with the ground truth. Such network recovery is achieved without requiring low-pass filtering of the time series ensembles prior to analysis, an fMRI preprocessing step that has courted controversy in recent years. Thus, we conclude that our MCA framework allows recovery and visualization of the underlying functionally connected networks of the brain in resting-state fMRI.
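
    A minimal echo state network sketch: a fixed random reservoir rescaled to spectral radius below one, with only a ridge-regression readout trained, here for one-step prediction of a toy signal. The MCA pipeline wraps such cross-prediction in a pairwise affinity matrix plus clustering, which is not shown; all hyperparameters are hypothetical.

        import numpy as np

        rng = np.random.default_rng(11)
        # Toy signal to be predicted one step ahead.
        t = np.arange(0, 60, 0.1)
        u = np.sin(t) + 0.05 * rng.standard_normal(t.size)

        n_res = 200
        W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1

        # Drive the fixed reservoir; only the readout is trained (the ESN property).
        states = np.zeros((len(u), n_res))
        s = np.zeros(n_res)
        for k in range(len(u) - 1):
            s = np.tanh(W_in[:, 0] * u[k] + W @ s)
            states[k + 1] = s                    # state after seeing u[0..k]

        washout = 100
        X = states[washout:]                     # state at k encodes inputs up to u[k-1]
        y = u[washout:]                          # one-step-ahead targets
        # Ridge-regression readout: W_out = (X^T X + lambda I)^{-1} X^T y.
        lam = 1e-6
        W_out = np.linalg.solve(X.T @ X + lam * np.eye(n_res), X.T @ y)

        pred = X @ W_out
        print("one-step RMSE: %.4f" % np.sqrt(np.mean((pred - y) ** 2)))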

  1. Inner shelf morphologic controls on the dynamics of the beach and bar system, Fire Island, New York

    USGS Publications Warehouse

    Hapke, Cheryl J.; Schwab, William C.; Gayes, Paul T.; McCoy, Clay; Viso, Richard; Lentz, Erika E.; Rosati, Julie D.; Wang, Ping; Roberts, Tiffany M.

    2011-01-01

    The mechanism of sediment exchange between offshore sand ridges and the beach at Fire Island, New York is largely unknown. However, recent evidence from repeat nearshore bathymetry surveys, coupled with the complex but consistent bar morphology and patterns of shoreline change, demonstrates that there is a feedback occurring between the regional geologic framework and modern processes. Analysis of bathymetric survey data provides direct confirmation that the offshore ridges are connected to the shoreface and are spatially persistent. The fixed nature of the nearshore morphology is further supported by time series camera data that indicate persistent bars with breaks that re-form in the same locations. A long-term time series of shoreline change shows distinct zones of erosion and accretion that are pervasive over time scales greater than a half-century, and their length-scales are similar to the spacing of the offshore ridge-trough system. The first-order geologic framework is responsible for the existence and locations of the ridges and troughs, which then influence the morphodynamics of the beach and bar system.

  2. Spatio-temporal Granger causality: a new framework

    PubMed Central

    Luo, Qiang; Lu, Wenlian; Cheng, Wei; Valdes-Sosa, Pedro A.; Wen, Xiaotong; Ding, Mingzhou; Feng, Jianfeng

    2015-01-01

    That physiological oscillations of various frequencies are present in fMRI signals is the rule, not the exception. Herein, we propose a novel theoretical framework, spatio-temporal Granger causality, which allows us to more reliably and precisely estimate the Granger causality from experimental datasets possessing time-varying properties caused by physiological oscillations. Within this framework, Granger causality is redefined as a global index measuring the directed information flow between two time series with time-varying properties. Both theoretical analyses and numerical examples demonstrate that Granger causality is a monotonically increasing function of the temporal resolution used in the estimation. This is consistent with the general principle of coarse graining, which causes information loss by smoothing out very fine-scale details in time and space. Our results confirm that the Granger causality at the finer spatio-temporal scales considerably outperforms the traditional approach in terms of an improved consistency between two resting-state scans of the same subject. To optimally estimate the Granger causality, the proposed theoretical framework is implemented through a combination of several approaches, such as dividing the optimal time window and estimating the parameters at the fine temporal and spatial scales. Taken together, our approach provides a novel and robust framework for estimating the Granger causality from fMRI, EEG, and other related data. PMID:23643924
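
    For orientation, a sketch of the classical time-domain bivariate Granger test (via statsmodels), which is the baseline that the spatio-temporal redefinition above builds on; the paper's fine-scale spatio-temporal estimation is not reproduced, and the coupled AR processes below are synthetic.

        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(8)
        T = 500
        x = np.zeros(T)
        y = np.zeros(T)
        for t in range(1, T):
            x[t] = 0.5 * x[t - 1] + rng.standard_normal()
            y[t] = 0.4 * y[t - 1] + 0.5 * x[t - 1] + rng.standard_normal()  # x drives y

        # Column order is [effect, cause]: the test asks whether the 2nd column
        # Granger-causes the 1st.
        res = grangercausalitytests(np.column_stack([y, x]), maxlag=2, verbose=False)
        p = res[1][0]["ssr_ftest"][1]
        print("p-value for x -> y at lag 1: %.3g" % p)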

  3. DTWscore: differential expression and cell clustering analysis for time-series single-cell RNA-seq data.

    PubMed

    Wang, Zhuo; Jin, Shuilin; Liu, Guiyou; Zhang, Xiurui; Wang, Nan; Wu, Deliang; Hu, Yang; Zhang, Chiping; Jiang, Qinghua; Xu, Li; Wang, Yadong

    2017-05-23

    The development of single-cell RNA sequencing has enabled profound discoveries in biology, ranging from the dissection of the composition of complex tissues to the identification of novel cell types and dynamics in some specialized cellular environments. However, the large-scale generation of single-cell RNA-seq (scRNA-seq) data collected at multiple time points remains a challenge for effectively measuring gene expression patterns in transcriptome analysis. We present an algorithm based on the Dynamic Time Warping score (DTWscore) combined with time-series data that enables the detection of gene expression changes across scRNA-seq samples and the recovery of potential cell types from complex mixtures of multiple cell types. The DTWscore successfully classifies cells of different types using the most highly variable genes from time-series scRNA-seq data. The study was confined to methods that are implemented and available within the R framework. Sample datasets and R packages are available at https://github.com/xiaoxiaoxier/DTWscore .
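
    Dynamic time warping itself is compact enough to sketch: a dynamic program over all monotone alignments of two series, which is what allows expression profiles that are shifted or stretched in time to be compared. This is the generic DTW distance, not the DTWscore gene-selection statistic built on top of it; the profiles below are synthetic.

        import numpy as np

        def dtw_distance(a, b):
            """Classic O(len(a)*len(b)) dynamic time warping distance."""
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        # Two expression-like profiles with the same shape, shifted in time.
        t = np.linspace(0, 1, 30)
        g1 = np.exp(-((t - 0.4) ** 2) / 0.01)
        g2 = np.exp(-((t - 0.6) ** 2) / 0.01)

        print("euclidean: %.2f" % np.linalg.norm(g1 - g2))
        print("dtw:       %.2f" % dtw_distance(g1, g2))  # far smaller: shapes align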

  4. i-SVOC -- A simulation program for indoor SVOCs (Version 1.0)

    EPA Science Inventory

    Program i-SVOC estimates the emissions, transport, and sorption of semivolatile organic compounds (SVOCs) in the indoor environment as functions of time when a series of initial conditions is given. This program implements a framework for dynamic modeling of indoor SVOCs develope...

  5. Applicability of Hydrologic Landscapes for Model Calibration at the Watershed Scale in the Pacific Northwest

    EPA Science Inventory

    The Pacific Northwest Hydrologic Landscapes (PNW HL) at the assessment unit scale has provided a solid conceptual classification framework to relate and transfer hydrologically meaningful information between watersheds without access to streamflow time series. A collection of tec...

  6. Tipping point analysis of ocean acoustic noise

    NASA Astrophysics Data System (ADS)

    Livina, Valerie N.; Brouwer, Albert; Harris, Peter; Wang, Lian; Sotirakopoulos, Kostas; Robinson, Stephen

    2018-02-01

    We apply tipping point analysis to a large record of ocean acoustic data to identify the main components of the acoustic dynamical system and study possible bifurcations and transitions of the system. The analysis is based on a statistical physics framework with stochastic modelling, where we represent the observed data as a composition of deterministic and stochastic components estimated from the data using time-series techniques. We analyse long-term and seasonal trends, system states and acoustic fluctuations to reconstruct a one-dimensional stochastic equation to approximate the acoustic dynamical system. We apply potential analysis to acoustic fluctuations and detect several changes in the system states in the past 14 years. These are most likely caused by climatic phenomena. We analyse trends in sound pressure level within different frequency bands and hypothesize a possible anthropogenic impact on the acoustic environment. The tipping point analysis framework provides insight into the structure of the acoustic data and helps identify its dynamic phenomena, correctly reproducing the probability distribution and scaling properties (power-law correlations) of the time series.

  7. Early warning by near-real time disturbance monitoring (Invited)

    NASA Astrophysics Data System (ADS)

    Verbesselt, J.; Zeileis, A.; Herold, M.

    2013-12-01

    Near real-time monitoring of ecosystem disturbances is critical for rapidly assessing and addressing impacts on carbon dynamics, biodiversity, and socio-ecological processes. Satellite remote sensing enables cost-effective and accurate monitoring at frequent time steps over large areas, yet generic methods to detect disturbances within newly captured satellite images are lacking. We propose a multi-purpose time-series-based disturbance detection approach that identifies and models stable historical variation to enable change detection within newly acquired data. Satellite image time series of vegetation greenness provide a global record of terrestrial vegetation productivity over the past decades. Here, we assess and demonstrate the method by applying it to (1) real-world satellite greenness image time series between February 2000 and July 2011 covering Somalia, to detect drought-related vegetation disturbances, and (2) Landsat image time series, to detect forest disturbances. First, the results illustrate that disturbances are successfully detected in near real-time while being robust to seasonality and noise. Second, major drought-related disturbances corresponding with the most drought-stressed regions of Somalia are detected from mid-2010 onwards. Third, the method can be applied to Landsat image time series with a lower temporal data density. Furthermore, the method can analyze in-situ or satellite time series of biophysical indicators from local to global scales, since it is fast, does not depend on thresholds and does not require time series gap filling. While the data and methods used are appropriate for proof-of-concept development of global-scale disturbance monitoring, specific applications (e.g., drought or deforestation monitoring) mandate integration within an operational monitoring framework. The real-time monitoring method is implemented in an open-source environment and is freely available in the BFAST package for R. Information illustrating how to apply the method to satellite image time series is available at http://bfast.R-Forge.R-project.org/ and in the example section of the bfastmonitor() function within the BFAST package.

  8. Perl Modules for Constructing Iterators

    NASA Technical Reports Server (NTRS)

    Tilmes, Curt

    2009-01-01

    The Iterator Perl Module provides a general-purpose framework for constructing iterator objects within Perl, and a standard API for interacting with those objects. Iterators are an object-oriented design pattern in which a description of a series of values is used in a constructor; subsequent queries can request values in that series. These Perl modules build on the standard Iterator framework and provide iterators for some other types of values. Iterator::DateTime constructs iterators from DateTime objects or Date::Parse descriptions and iCal/RFC 2445-style recurrence descriptions. It supports a variety of input parameters, including a start to the sequence, an end to the sequence, an iCal/RFC 2445 recurrence describing the frequency of the values in the series, and a format description that can refine the presentation of the DateTime. Iterator::String constructs iterators from string representations. This module is useful in contexts where the API consists of supplying a string and getting back an iterator, and where the specific iteration desired is opaque to the caller. It is of particular value to the Iterator::Hash module, which provides nested iterations. Iterator::Hash constructs iterators from Perl hashes that can include multiple iterators. The constructed iterators return all permutations of the iterations of the hash by nested iteration of embedded iterators. A hash simply maps a set of keys to values and is a very common data structure in Perl programming. The Iterator::Hash module allows a hash to include strings defining iterators (parsed and dispatched with Iterator::String) that are used to construct an overall series of hash values.

  9. Nonparametric autocovariance estimation from censored time series by Gaussian imputation.

    PubMed

    Park, Jung Wook; Genton, Marc G; Ghosh, Sujit K

    2009-02-01

    One of the most frequently used methods to model the autocovariance function of a second-order stationary time series is to use the parametric framework of autoregressive and moving average models developed by Box and Jenkins. However, such parametric models, though very flexible, may not always be adequate to model autocovariance functions with sharp changes. Furthermore, if the data do not follow the parametric model and are censored at a certain value, the estimation results may not be reliable. We develop a Gaussian imputation method to estimate an autocovariance structure via nonparametric estimation of the autocovariance function in order to address both censoring and incorrect model specification. We demonstrate the effectiveness of the technique in terms of bias and efficiency with simulations under various rates of censoring and underlying models. We describe its application to a time series of silicon concentrations in the Arctic.

  10. Exploring fractal behaviour of blood oxygen saturation in preterm babies

    NASA Astrophysics Data System (ADS)

    Zahari, Marina; Hui, Tan Xin; Zainuri, Nuryazmin Ahmat; Darlow, Brian A.

    2017-04-01

    Recent evidence has been emerging that oxygenation instability in preterm babies could lead to an increased risk of retinal injury such as retinopathy of prematurity. There is potential for disease severity to be better understood using nonlinear methods for time series data, such as fractal theory [1]. Fractal theories have been employed by researchers in various disciplines motivated to look into the behaviour or structure of irregular fluctuations in temporal data. In this study, an investigation was carried out to examine whether fractal behaviour could be detected in blood oxygen saturation time series. Detection of the presence of fractals in the oxygen data of preterm infants was performed using the methods of power spectrum, empirical probability distribution function and autocorrelation function. The results from these fractal identification methods indicate the possibility that these data exhibit fractal nature. Subsequently, a fractal framework for future research was suggested for oxygen time series.

  11. Probabilistic Reasoning Over Seismic Time Series: Volcano Monitoring by Hidden Markov Models at Mt. Etna

    NASA Astrophysics Data System (ADS)

    Cassisi, Carmelo; Prestifilippo, Michele; Cannata, Andrea; Montalto, Placido; Patanè, Domenico; Privitera, Eugenio

    2016-07-01

    From January 2011 to December 2015, Mt. Etna was mainly characterized by a cyclic eruptive behavior with more than 40 lava fountains from the New South-East Crater. Using the RMS (Root Mean Square) of the seismic signal recorded by stations close to the summit area, an automatic recognition of the different states of volcanic activity (QUIET, PRE-FOUNTAIN, FOUNTAIN, POST-FOUNTAIN) has been applied for monitoring purposes. Since the values of the RMS time series calculated on the seismic signal are generated from a stochastic process, we can try to model the system generating its sampled values, assumed to be a Markov process, using Hidden Markov Models (HMMs). HMM analysis seeks to recover the sequence of hidden states from the observations. In our framework, observations are characters generated by the Symbolic Aggregate approXimation (SAX) technique, which maps RMS time series values to symbols of a pre-defined alphabet. The main advantages of the proposed framework, based on HMMs and SAX, with respect to other automatic systems applied to seismic signals at Mt. Etna, are the use of multiple stations and static thresholds to characterize the volcano states well. Its application to a wide seismic dataset of Etna volcano shows that the volcano states can be inferred. The experimental results show that, in most cases, we detected lava fountains in advance.
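
    A minimal SAX sketch: z-normalize, reduce with a piecewise aggregate approximation (PAA), then map segment means to letters using equiprobable Gaussian breakpoints. The window length, alphabet size and input series below are hypothetical, not those used at Mt. Etna.

        import numpy as np
        from scipy.stats import norm

        def sax(x, n_segments=8, alphabet="abcd"):
            x = (x - x.mean()) / x.std()                  # z-normalize
            seg = np.array_split(x, n_segments)
            paa = np.array([s.mean() for s in seg])       # piecewise aggregate approx.
            # Breakpoints that make each letter equiprobable under N(0, 1).
            k = len(alphabet)
            breakpoints = norm.ppf(np.arange(1, k) / k)
            idx = np.searchsorted(breakpoints, paa)
            return "".join(alphabet[i] for i in idx)

        rng = np.random.default_rng(4)
        # Hypothetical RMS-like record: slow oscillation plus noise.
        rms = np.abs(np.sin(np.linspace(0, 3 * np.pi, 240))) + 0.1 * rng.random(240)
        print(sax(rms))   # a short symbolic word, e.g. something like 'cdcaadbc'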

  12. Mutual connectivity analysis (MCA) using generalized radial basis function neural networks for nonlinear functional connectivity network recovery in resting-state functional MRI

    NASA Astrophysics Data System (ADS)

    D'Souza, Adora M.; Abidin, Anas Zainul; Nagarajan, Mahesh B.; Wismüller, Axel

    2016-03-01

    We investigate the applicability of a computational framework, called mutual connectivity analysis (MCA), for directed functional connectivity analysis in both synthetic and resting-state functional MRI data. This framework consists of first evaluating non-linear cross-predictability between every pair of time series prior to recovering the underlying network structure using community detection algorithms. We obtain the non-linear cross-prediction score between time series using Generalized Radial Basis Functions (GRBF) neural networks. These cross-prediction scores characterize the underlying functionally connected networks within the resting brain, which can be extracted using non-metric clustering approaches, such as the Louvain method. We first test our approach on synthetic models with known directional influence and network structure. Our method is able to capture the directional relationships between time series (with an area under the ROC curve = 0.92 +/- 0.037) as well as the underlying network structure (Rand index = 0.87 +/- 0.063) with high accuracy. Furthermore, we test this method for network recovery on resting-state fMRI data, where results are compared to the motor cortex network recovered from a motor stimulation sequence, resulting in a strong agreement between the two (Dice coefficient = 0.45). We conclude that our MCA approach is effective in analyzing non-linear directed functional connectivity and in revealing underlying functional network structure in complex systems.

  13. Cross-entropy clustering framework for catchment classification

    NASA Astrophysics Data System (ADS)

    Tongal, Hakan; Sivakumar, Bellie

    2017-09-01

    There is an increasing interest in catchment classification and regionalization in hydrology, as they are useful for identification of appropriate model complexity and transfer of information from gauged catchments to ungauged ones, among others. This study introduces a nonlinear cross-entropy clustering (CEC) method for classification of catchments. The method specifically considers embedding dimension (m), sample entropy (SampEn), and coefficient of variation (CV) to represent dimensionality, complexity, and variability of the time series, respectively. The method is applied to daily streamflow time series from 217 gauging stations across Australia. The results suggest that a combination of linear and nonlinear parameters (i.e. m, SampEn, and CV), representing different aspects of the underlying dynamics of streamflows, could be useful for determining distinct patterns of flow generation mechanisms within a nonlinear clustering framework. For the 217 streamflow time series, nine hydrologically homogeneous clusters that have distinct patterns of flow regime characteristics and specific dominant hydrological attributes with different climatic features are obtained. Comparison of the results with those obtained using the widely employed k-means clustering method (which results in five clusters, with the loss of some information about the features of the clusters) suggests the superiority of the cross-entropy clustering method. The outcomes from this study provide a useful guideline for employing the nonlinear dynamic approaches based on hydrologic signatures and for gaining an improved understanding of streamflow variability at a large scale.
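
    Of the three features, sample entropy is the least standard to compute; a direct (slightly simplified) quadratic-time sketch is below, with the conventional choices m = 2 and r = 0.2 times the standard deviation. A regular signal scores low, an irregular one high; embedding dimension and CV come from embedding analysis and simple statistics respectively.

        import numpy as np

        def sample_entropy(x, m=2, r_factor=0.2):
            """SampEn(m, r): -log of the conditional probability that sequences
            matching for m points also match for m + 1 points (self-matches excluded)."""
            x = np.asarray(x, dtype=float)
            r = r_factor * x.std()

            def count_matches(mm):
                templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
                c = 0
                for i in range(len(templates) - 1):
                    d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                    c += int((d <= r).sum())
                return c

            B, A = count_matches(m), count_matches(m + 1)
            return -np.log(A / B) if A > 0 and B > 0 else np.inf

        rng = np.random.default_rng(6)
        regular = np.sin(np.linspace(0, 20 * np.pi, 1000))
        noisy = rng.standard_normal(1000)
        print("SampEn sine:  %.3f" % sample_entropy(regular))
        print("SampEn noise: %.3f" % sample_entropy(noisy))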

  14. Mutual Connectivity Analysis (MCA) Using Generalized Radial Basis Function Neural Networks for Nonlinear Functional Connectivity Network Recovery in Resting-State Functional MRI.

    PubMed

    DSouza, Adora M; Abidin, Anas Zainul; Nagarajan, Mahesh B; Wismüller, Axel

    2016-03-29

    We investigate the applicability of a computational framework, called mutual connectivity analysis (MCA), for directed functional connectivity analysis in both synthetic and resting-state functional MRI data. This framework consists of first evaluating non-linear cross-predictability between every pair of time series prior to recovering the underlying network structure using community detection algorithms. We obtain the non-linear cross-prediction score between time series using Generalized Radial Basis Functions (GRBF) neural networks. These cross-prediction scores characterize the underlying functionally connected networks within the resting brain, which can be extracted using non-metric clustering approaches, such as the Louvain method. We first test our approach on synthetic models with known directional influence and network structure. Our method is able to capture the directional relationships between time series (with an area under the ROC curve = 0.92 ± 0.037) as well as the underlying network structure (Rand index = 0.87 ± 0.063) with high accuracy. Furthermore, we test this method for network recovery on resting-state fMRI data, where results are compared to the motor cortex network recovered from a motor stimulation sequence, resulting in a strong agreement between the two (Dice coefficient = 0.45). We conclude that our MCA approach is effective in analyzing non-linear directed functional connectivity and in revealing underlying functional network structure in complex systems.

  15. A hierarchical spatial framework and database for the national river fish habitat condition assessment

    USGS Publications Warehouse

    Wang, L.; Infante, D.; Esselman, P.; Cooper, A.; Wu, D.; Taylor, W.; Beard, D.; Whelan, G.; Ostroff, A.

    2011-01-01

    Fisheries management programs, such as the National Fish Habitat Action Plan (NFHAP), urgently need a nationwide spatial framework and database for health assessment and policy development to protect and improve riverine systems. To meet this need, we developed a spatial framework and database using the National Hydrography Dataset Plus (1:100,000-scale; http://www.horizon-systems.com/nhdplus). This framework uses interconfluence river reaches and their local and network catchments as fundamental spatial river units, and a series of ecological and political spatial descriptors as hierarchy structures that allow users to extract or analyze information at spatial scales they define. The database consists of variables describing channel characteristics, network position/connectivity, climate, elevation, gradient, and size. It contains a series of natural and human-induced catchment factors that are known to influence river characteristics. Our framework and database assemble all river reaches and their descriptors in one place for the first time for the conterminous United States, and provide users with the capability of adding data, conducting analyses, developing management scenarios and regulations, and tracking management progress at a variety of spatial scales. This database provides the essential data needed to achieve the objectives of NFHAP and other management programs. The downloadable beta version is available at http://ec2-184-73-40-15.compute-1.amazonaws.com/nfhap/main/.

  16. Comprehensive Assessment of Models and Events based on Library tools (CAMEL)

    NASA Astrophysics Data System (ADS)

    Rastaetter, L.; Boblitt, J. M.; DeZeeuw, D.; Mays, M. L.; Kuznetsova, M. M.; Wiegand, C.

    2017-12-01

    At the Community Coordinated Modeling Center (CCMC), the assessment of modeling skill using a library of model-data comparison metrics is taken to the next level by fully integrating the ability to request a series of runs with the same model parameters for a list of events. The CAMEL framework initiates and runs a series of selected, pre-defined simulation settings for participating models (e.g., WSA-ENLIL and SWMF-SC+IH for the heliosphere; SWMF-GM, OpenGGCM, LFM and GUMICS for the magnetosphere) and performs post-processing using existing tools for a host of different output parameters. The framework compares the resulting time series data with the respective observational data and computes a suite of metrics such as Prediction Efficiency, Root Mean Square Error, Probability of Detection, Probability of False Detection, and Heidke Skill Score for each model-data pair. The system then plots scores by event, and aggregated over all events, for all participating models and run settings. We are building on past experience with model-data comparisons of magnetosphere and ionosphere model outputs in the GEM2008, GEM-CEDAR CETI2010 and Operational Space Weather Model challenges (2010-2013). The framework can also be applied to solar-heliosphere and radiation-belt models. The CAMEL framework takes advantage of model simulations described with Space Physics Archive Search and Extract (SPASE) metadata and a database backend design developed for a next-generation Run-on-Request system at the CCMC.
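
    The verification metrics named above are standard and straightforward to sketch: continuous scores from paired model/observation series, categorical scores from a contingency table built with an event threshold. The series and threshold below are hypothetical.

        import numpy as np

        rng = np.random.default_rng(9)
        obs = rng.normal(0, 1, 500)
        model = obs + rng.normal(0, 0.5, 500)        # hypothetical model with noise

        # Continuous scores.
        rmse = np.sqrt(np.mean((model - obs) ** 2))
        pe = 1.0 - np.mean((model - obs) ** 2) / obs.var()   # prediction efficiency

        # Categorical scores from an event threshold.
        thr = 1.0
        hits = np.sum((model >= thr) & (obs >= thr))
        miss = np.sum((model <  thr) & (obs >= thr))
        fa   = np.sum((model >= thr) & (obs <  thr))
        cn   = np.sum((model <  thr) & (obs <  thr))

        pod  = hits / (hits + miss)                  # probability of detection
        pofd = fa / (fa + cn)                        # probability of false detection
        n = hits + miss + fa + cn
        exp_correct = ((hits + miss) * (hits + fa) + (cn + miss) * (cn + fa)) / n
        hss = (hits + cn - exp_correct) / (n - exp_correct)   # Heidke skill score

        print(f"RMSE={rmse:.3f} PE={pe:.3f} POD={pod:.3f} POFD={pofd:.3f} HSS={hss:.3f}")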

  17. Stochastic modeling of sunshine number data

    NASA Astrophysics Data System (ADS)

    Brabec, Marek; Paulescu, Marius; Badescu, Viorel

    2013-11-01

    In this paper, we present a unified statistical modeling framework for estimating and forecasting sunshine number (SSN) data. The sunshine number was proposed earlier to describe sunshine time series in qualitative terms (Theor Appl Climatol 72 (2002) 127-136) and has since been shown to be useful not only for theoretical purposes but also for practical considerations, e.g. those related to the development of photovoltaic energy production. Statistical modeling and prediction of SSN as a binary time series has, however, been a challenging problem. Our statistical model for SSN time series is based on an underlying stochastic process formulation of Markov chain type. We show how its transition probabilities can be efficiently estimated within a logistic regression framework. In fact, our logistic Markovian model can be fitted relatively easily via a maximum likelihood approach. This is optimal in many respects, and it also enables us to use formalized statistical inference theory to obtain not only point estimates of the transition probabilities and their functions of interest, but also the related uncertainties, and to test various hypotheses of practical interest. It is straightforward to deal with non-homogeneous transition probabilities in this framework. Very importantly, from both physical and practical points of view, the logistic Markov model class allows us to test hypotheses about how SSN depends on various external covariates (e.g. elevation angle, solar time, etc.) and about details of the dynamic model (order and functional shape of the Markov kernel, etc.). Therefore, using the generalized additive model (GAM) approach, we can fit and compare models of various complexity while keeping the physical interpretation of the statistical model and its parts. After introducing the Markovian model and a general approach for identifying its parameters, we illustrate its use and performance on high-resolution SSN data from the Solar Radiation Monitoring Station of the West University of Timisoara.
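
    A minimal sketch of the core estimation step, on synthetic data: the transition probabilities of a binary SSN Markov chain recovered by logistic regression of the current state on the lagged state; external covariates (elevation angle, solar time, ...) would simply enter as additional columns.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(10)
        # Synthetic binary SSN sequence from known transition probabilities.
        p_stay = {0: 0.8, 1: 0.9}             # P(s_t = s_{t-1} | s_{t-1})
        s = [0]
        for _ in range(5000):
            s.append(s[-1] if rng.random() < p_stay[s[-1]] else 1 - s[-1])
        s = np.array(s)

        # Logistic regression of s_t on s_{t-1}; covariates would be extra columns.
        X = s[:-1].reshape(-1, 1)
        y = s[1:]
        fit = LogisticRegression().fit(X, y)

        p01 = fit.predict_proba([[0]])[0, 1]   # P(sunshine | previously none)
        p11 = fit.predict_proba([[1]])[0, 1]   # P(sunshine | sunshine before)
        print("P(1|0)=%.3f (true 0.2)   P(1|1)=%.3f (true 0.9)" % (p01, p11))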

  18. Stochastic modeling of sunshine number data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brabec, Marek, E-mail: mbrabec@cs.cas.cz; Paulescu, Marius; Badescu, Viorel

    2013-11-13

    In this paper, we present a unified statistical modeling framework for estimating and forecasting sunshine number (SSN) data. The sunshine number was proposed earlier to describe sunshine time series in qualitative terms (Theor Appl Climatol 72 (2002) 127-136) and has since been shown to be useful not only for theoretical purposes but also for practical considerations, e.g. those related to the development of photovoltaic energy production. Statistical modeling and prediction of SSN as a binary time series has, however, been a challenging problem. Our statistical model for SSN time series is based on an underlying stochastic process formulation of Markov chain type. We show how its transition probabilities can be efficiently estimated within a logistic regression framework. In fact, our logistic Markovian model can be fitted relatively easily via a maximum likelihood approach. This is optimal in many respects, and it also enables us to use formalized statistical inference theory to obtain not only point estimates of the transition probabilities and their functions of interest, but also the related uncertainties, and to test various hypotheses of practical interest. It is straightforward to deal with non-homogeneous transition probabilities in this framework. Very importantly, from both physical and practical points of view, the logistic Markov model class allows us to test hypotheses about how SSN depends on various external covariates (e.g. elevation angle, solar time, etc.) and about details of the dynamic model (order and functional shape of the Markov kernel, etc.). Therefore, using the generalized additive model (GAM) approach, we can fit and compare models of various complexity while keeping the physical interpretation of the statistical model and its parts. After introducing the Markovian model and a general approach for identifying its parameters, we illustrate its use and performance on high-resolution SSN data from the Solar Radiation Monitoring Station of the West University of Timisoara.

  19. Using a time-series statistical framework to quantify trends and abrupt change in US corn, soybean, and wheat yields from 1970-2016

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Ives, A. R.; Turner, M. G.; Kucharik, C. J.

    2017-12-01

    Previous studies have identified global agricultural regions where long-term crop yield increases have "stagnated". These studies used a variety of simple statistical methods that often ignore important aspects of time series regression modeling, and they can produce differing and contradictory results, which creates uncertainty regarding food security given rapid global population growth. Here, we present a new statistical framework incorporating time series-based algorithms into standard regression models to quantify spatiotemporal yield trends of US maize, soybean, and winter wheat from 1970-2016. Our primary goal was to quantify spatial differences in yield trends for these three crops using USDA county-level data. This information was used to identify regions experiencing the largest changes in the rate of yield increase over time, and to determine whether abrupt shifts in the rate of yield increase have occurred. Although crop yields continue to increase in most maize-, soybean-, and winter wheat-growing areas, yield increases have stagnated in some key agricultural regions during the most recent 15 to 16 years: some maize-growing areas, except for the northern Great Plains, have shown a significant trend toward smaller annual yield increases; soybean has maintained consistent long-term yield gains in the northern Great Plains, the Midwest, and the southeastern US, but has shifted to smaller annual increases in other regions; and winter wheat has maintained a moderate annual increase in eastern South Dakota and eastern US locations, but shows a decline in the magnitude of annual increases across the central Great Plains and western US regions. Our results suggest that there were abrupt shifts in the rate of annual yield increase in a variety of US regions for all three crops. The framework presented here can be broadly applied to yield trend analyses for other crops and regions of the Earth.

  20. Classification of Satellite Derived Chlorophyll a Space-Time Series by Means of Quantile Regression: An Application to the Adriatic Sea

    NASA Astrophysics Data System (ADS)

    Girardi, P.; Pastres, R.; Gaetan, C.; Mangin, A.; Taji, M. A.

    2015-12-01

    In this paper, we present the results of a classification of Adriatic waters based on spatial time series of remotely sensed chlorophyll type-a. The study was carried out using a clustering procedure combining quantile smoothing with an agglomerative clustering algorithm. The smoothing function includes a seasonal term, allowing one to classify areas according to "similar" seasonal evolution as well as "similar" trends. This methodology, here applied for the first time to Ocean Colour data, is more robust than other classical methods, as it does not require any assumption on the probability distribution of the data. The approach was applied to the classification of an eleven-year time series, from January 2002 to December 2012, of monthly chlorophyll type-a concentrations covering the whole Adriatic Sea. The dataset was made available by ACRI (http://hermes.acri.fr) in the framework of the GlobColour Project (http://www.globcolour.info). Data were obtained by calibrating Ocean Colour data provided by different satellite missions, such as MERIS, SeaWiFS and MODIS. The results clearly show the presence of North-South and West-East gradients in chlorophyll levels, which is consistent with literature findings. This analysis could provide a sound basis for the identification of "water bodies" and of chlorophyll type-a thresholds that define their Good Ecological Status, in terms of trophic level, as required by the implementation of the Marine Strategy Framework Directive. The forthcoming availability of Sentinel-3 OLCI data, in continuity with previous missions and with the prospect of a monitoring system lasting more than 15 years, offers a real opportunity to expand our study in support of the implementation of both the EU Marine Strategy Framework Directive and the UNEP-MAP Ecosystem Approach in the Mediterranean.

  1. Methods for associating or dissociating guest materials with a metal organic framework, systems for associating or dissociating guest materials within a series of metal organic frameworks, thermal energy transfer assemblies, and methods for transferring thermal energy

    DOEpatents

    McGrail, B. Peter; Brown, Daryl R.; Thallapally, Praveen K.

    2016-08-02

    Methods for releasing associated guest materials from a metal organic framework are provided. Methods for associating guest materials with a metal organic framework are also provided. Methods are provided for selectively associating or dissociating guest materials with a metal organic framework. Systems for associating or dissociating guest materials within a series of metal organic frameworks are provided. Thermal energy transfer assemblies are provided. Methods for transferring thermal energy are also provided.

  2. Methods for associating or dissociating guest materials with a metal organic framework, systems for associating or dissociating guest materials within a series of metal organic frameworks, thermal energy transfer assemblies, and methods for transferring thermal energy

    DOEpatents

    McGrail, B. Peter; Brown, Daryl R.; Thallapally, Praveen K.

    2014-08-05

    Methods for releasing associated guest materials from a metal organic framework are provided. Methods for associating guest materials with a metal organic framework are also provided. Methods are provided for selectively associating or dissociating guest materials with a metal organic framework. Systems for associating or dissociating guest materials within a series of metal organic frameworks are provided. Thermal energy transfer assemblies are provided. Methods for transferring thermal energy are also provided.

  3. Monitoring volcano activity through Hidden Markov Model

    NASA Astrophysics Data System (ADS)

    Cassisi, C.; Montalto, P.; Prestifilippo, M.; Aliotta, M.; Cannata, A.; Patanè, D.

    2013-12-01

    During 2011-2013, Mt. Etna was mainly characterized by cyclic occurrences of lava fountains, totaling 38 episodes. During this time interval the volcano's states (QUIET, PRE-FOUNTAIN, FOUNTAIN, POST-FOUNTAIN), whose automatic recognition is very useful for monitoring purposes, turned out to be strongly related to the trend of the RMS (Root Mean Square) of the seismic signal recorded by stations close to the summit area. Since the RMS time series behavior is considered stochastic, we can model the system generating its values, assumed to be a Markov process, using Hidden Markov Models (HMMs), a powerful tool for modeling time-varying series. HMM analysis seeks to recover the sequence of hidden states from the observed emissions. In our framework, the observed emissions are characters generated by the SAX (Symbolic Aggregate approXimation) technique, which maps RMS time series values to discrete symbolic emissions. The experiments show how it is possible to infer volcano states by means of HMMs and SAX.
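
    The SAX symbolization step described above is simple enough to sketch. The symbol sequence produced here would be the observed emissions for an HMM, which could then be fit with, e.g., the third-party hmmlearn package (an assumption; the record does not name a toolkit). The RMS series is synthetic.

    ```python
    # Sketch of SAX symbolization of an RMS-like series (synthetic data).
    import numpy as np
    from scipy.stats import norm

    def sax(series, word_len, alphabet_size):
        """Piecewise Aggregate Approximation, then Gaussian-breakpoint binning."""
        z = (series - series.mean()) / series.std()               # z-normalize
        paa = z[: len(z) // word_len * word_len].reshape(word_len, -1).mean(axis=1)
        # Breakpoints splitting the standard normal into equiprobable regions
        brk = norm.ppf(np.linspace(0, 1, alphabet_size + 1)[1:-1])
        return np.searchsorted(brk, paa)                          # symbols 0..a-1

    rms = np.abs(np.random.default_rng(1).normal(size=600)).cumsum() * 0.01
    print(sax(rms, word_len=20, alphabet_size=4))
    ```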

  4. Information mining over heterogeneous and high-dimensional time-series data in clinical trials databases.

    PubMed

    Altiparmak, Fatih; Ferhatosmanoglu, Hakan; Erdal, Selnur; Trost, Donald C

    2006-04-01

    An effective analysis of clinical trials data involves analyzing different types of data, such as heterogeneous and high-dimensional time series. Current time series analysis methods generally assume that the series at hand are long enough for statistical techniques to be applied to them. Other idealized assumptions are that data are collected at equally spaced intervals and that series being compared have equal lengths. However, these assumptions do not hold for many real data sets, especially clinical trials data sets. In addition, the data sources differ from each other, the data are heterogeneous, and the sensitivity of the experiments varies by source. Approaches for mining time series data therefore need to be revisited with this wide range of requirements in mind. In this paper, we propose a novel approach for information mining that involves two major steps: applying a data mining algorithm over homogeneous subsets of the data, and identifying common or distinct patterns over the information gathered in the first step. Our approach is implemented specifically for heterogeneous and high-dimensional time series from clinical trials. Within this framework, we propose a new way of utilizing frequent itemset mining, as well as clustering and declustering techniques with novel distance metrics for measuring similarity between time series. By clustering the data, we find groups of analytes (substances in blood) that are most strongly correlated. Most of the relationships that were already known are verified by the clinical panels and, in addition, we identify novel groups that need further biomedical analysis. A slight modification to our algorithm results in an effective declustering of high-dimensional time series data, which is then used for "feature selection." Using industry-sponsored clinical trials data sets, we are able to identify a small set of analytes that effectively models the state of normal health.
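
    A minimal sketch of the clustering idea, grouping correlated series under a 1 - |corr| distance; the six "analytes" are synthetic stand-ins, and the paper's actual distance metrics are novel rather than this textbook choice.

    ```python
    # Sketch: group analyte series by correlation strength using hierarchical
    # clustering with a 1 - |corr| distance (synthetic stand-ins, not CBC data).
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    rng = np.random.default_rng(2)
    base1, base2 = rng.normal(size=(2, 100))
    analytes = np.stack([base1 + rng.normal(0, 0.3, 100) for _ in range(3)] +
                        [base2 + rng.normal(0, 0.3, 100) for _ in range(3)])

    dist = 1 - np.abs(np.corrcoef(analytes))        # correlation distance matrix
    np.fill_diagonal(dist, 0)                       # remove floating-point residue
    Z = linkage(squareform(dist, checks=False), method="average")
    print(fcluster(Z, t=0.5, criterion="distance")) # two groups of three expected
    ```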

  5. Causality Analysis of fMRI Data Based on the Directed Information Theory Framework.

    PubMed

    Wang, Zhe; Alahmadi, Ahmed; Zhu, David C; Li, Tongtong

    2016-05-01

    This paper aims to conduct fMRI-based causality analysis in brain connectivity by exploiting the directed information (DI) theory framework. Unlike the well-known Granger causality (GC) analysis, which relies on the linear prediction technique, the DI theory framework does not impose any modeling constraints on the sequences to be evaluated and ensures estimation convergence. Moreover, it can be used to generate the GC graphs. In this paper, first, we introduce the core concepts in the DI framework. Second, we present how to conduct causality analysis using DI measures between two time series, providing the detailed procedure for calculating the DI for two finite time series. The two major steps involved here are optimal bin size selection for data digitization and probability estimation. Finally, we demonstrate the applicability of DI-based causality analysis using both simulated data and experimental fMRI data, and compare the results with those of GC analysis. Our analysis indicates that GC analysis is effective in detecting linear or nearly linear causal relationships, but may have difficulty in capturing nonlinear ones. DI-based causality analysis, on the other hand, is more effective in capturing both linear and nonlinear causal relationships. Moreover, it is observed that brain connectivity among different regions generally involves dynamic two-way information transmissions. Our results show that when bidirectional information flow is present, DI is more effective than GC at quantifying the overall causal relationship.
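
    A toy version of the DI estimation pipeline described above, under a strong first-order Markov simplification: quantile binning for digitization, plug-in probability estimation, and a conditional-mutual-information rate. The paper's full procedure (including optimal bin-size selection) is more involved; data here are synthetic.

    ```python
    # Simplified plug-in estimate of a directed-information rate under a
    # first-order Markov assumption: I(X_{t-1}; Y_t | Y_{t-1}).
    import numpy as np

    def di_rate(x, y, bins=4):
        """Conditional mutual information I(X_{t-1}; Y_t | Y_{t-1}) in bits."""
        xq = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
        yq = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
        joint = np.zeros((bins, bins, bins))
        for a, b, c in zip(xq[:-1], yq[1:], yq[:-1]):   # (x_{t-1}, y_t, y_{t-1})
            joint[a, b, c] += 1
        p = joint / joint.sum()
        p_ac = p.sum(axis=1); p_bc = p.sum(axis=0); p_c = p.sum(axis=(0, 1))
        nz = p > 0
        num = p * p_c[None, None, :]
        den = p_ac[:, None, :] * p_bc[None, :, :]
        return float(np.sum(p[nz] * np.log2(num[nz] / den[nz])))

    rng = np.random.default_rng(3)
    x = rng.normal(size=5000)
    y = np.roll(x, 1) * 0.8 + rng.normal(0, 0.5, 5000)   # y is driven by lagged x
    print(di_rate(x, y), di_rate(y, x))                  # first should dominate
    ```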

  6. Remaking the Facilities Organization. APPA Thought Leaders Series 2016

    ERIC Educational Resources Information Center

    APPA: Association of Higher Education Facilities Officers, 2016

    2016-01-01

    The 2016 Thought Leaders symposium, supported in part by long-time sponsor Jacobs, focused on "Remaking the Facilities Organization." The report makes the case for a customer-centric higher education facilities organization. Experts provide their insights on how a focus on the customer provides a framework for driving change. The…

  7. Impact of model relative accuracy in framework of rescaling observations in hydrological data assimilation studies

    USDA-ARS?s Scientific Manuscript database

    Soil moisture datasets (e.g. satellite-, model-, station-based) vary greatly with respect to their signal, noise, and/or combined time-series variability. Minimizing differences in signal variances is particularly important in data assimilation techniques to optimize the accuracy of the analysis obt...
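
    The rescaling step this abstract alludes to is commonly implemented as variance matching; a minimal sketch on synthetic series (the specific method used in the manuscript is not stated in the truncated abstract):

    ```python
    # Sketch of variance-matching rescaling: map an observed soil-moisture
    # series onto the model climatology so the signal variances agree.
    import numpy as np

    def rescale(obs, model):
        """Linearly rescale `obs` to the mean and variance of `model`."""
        return model.mean() + (obs - obs.mean()) * (model.std() / obs.std())

    rng = np.random.default_rng(4)
    model = 0.25 + 0.05 * rng.standard_normal(365)   # hypothetical model series
    obs = 0.40 + 0.12 * rng.standard_normal(365)     # satellite series, different climatology
    print(rescale(obs, model).std(), model.std())    # standard deviations now match
    ```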

  8. Comparing drinking water treatment costs to source water protection costs using time series analysis.

    EPA Science Inventory

    We present a framework to compare water treatment costs to source water protection costs, an important knowledge gap for drinking water treatment plants (DWTPs). This trade-off helps to determine what incentives a DWTP has to invest in natural infrastructure or pollution reductio...

  9. An architecture for consolidating multidimensional time-series data onto a common coordinate grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shippert, Tim; Gaustad, Krista

    Consolidating measurement data for use by data models or in inter-comparison studies frequently requires transforming the data onto a common grid. Standard methods for interpolating multidimensional data are often not appropriate for data with non-homogeneous dimensionality, and are hard to implement consistently for different datastreams. These challenges only increase with the automated procedures necessary for continuous, operational datastreams. In this paper we introduce a method of applying a series of one-dimensional transformations to merge data onto a common grid, examine the challenges of ensuring consistent application of data consolidation methods, present a framework for addressing those challenges, and describe the implementation of such a framework for the Atmospheric Radiation Measurement (ARM) program.

  10. An architecture for consolidating multidimensional time-series data onto a common coordinate grid

    DOE PAGES

    Shippert, Tim; Gaustad, Krista

    2016-12-16

    Consolidating measurement data for use by data models or in inter-comparison studies frequently requires transforming the data onto a common grid. Standard methods for interpolating multidimensional data are often not appropriate for data with non-homogeneous dimensionality, and are hard to implement consistently for different datastreams. These challenges only increase with the automated procedures necessary for continuous, operational datastreams. In this paper we introduce a method of applying a series of one-dimensional transformations to merge data onto a common grid, examine the challenges of ensuring consistent application of data consolidation methods, present a framework for addressing those challenges, and describe the implementation of such a framework for the Atmospheric Radiation Measurement (ARM) program.
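
    A minimal sketch of regridding by a series of one-dimensional transformations, using SciPy interpolation on a synthetic time-height field; the ARM implementation is of course far more general.

    ```python
    # Regrid a 2-D (time x height) field onto a common grid one axis at a time.
    import numpy as np
    from scipy.interpolate import interp1d

    rng = np.random.default_rng(5)
    t_src = np.sort(rng.uniform(0, 24, 40))            # irregular source times
    h_src = np.linspace(0, 2000, 15)                   # source heights
    field = np.sin(t_src[:, None] / 4) + 0.001 * h_src[None, :]

    t_dst, h_dst = np.linspace(0, 24, 48), np.linspace(0, 2000, 30)
    step1 = interp1d(t_src, field, axis=0, bounds_error=False)(t_dst)  # time axis first
    step2 = interp1d(h_src, step1, axis=1, bounds_error=False)(h_dst)  # then height axis
    print(step2.shape)                                 # (48, 30): common grid
    ```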

  11. A scalable database model for multiparametric time series: a volcano observatory case study

    NASA Astrophysics Data System (ADS)

    Montalto, Placido; Aliotta, Marco; Cassisi, Carmelo; Prestifilippo, Michele; Cannata, Andrea

    2014-05-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. By the term time series we refer to a set of observations of a given phenomenon acquired sequentially in time; when the observation intervals are equally spaced, one speaks of the sampling period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. Standardization makes it possible to perform operations such as querying and visualization across many measures by synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in particular operations for the reorganization and archiving of data from different sources, such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the loader layer checks the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although the system has to manage huge amounts of data, performance is guaranteed by a smart table-partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for visualizing the acquired data, which make it possible to query different time series over a specified time range or to follow signal acquisition in real time, subject to a per-user data access policy.

  12. A multidisciplinary database for geophysical time series management

    NASA Astrophysics Data System (ADS)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. By the term time series we refer to a set of observations of a given phenomenon acquired sequentially in time; when the observation intervals are equally spaced, one speaks of the sampling period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. Standardization makes it possible to perform operations such as querying and visualization across many measures by synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in particular operations for the reorganization and archiving of data from different sources, such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the loader layer checks the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although the system has to manage huge amounts of data, performance is guaranteed by a smart table-partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for visualizing the acquired data, which make it possible to query different time series over a specified time range or to follow signal acquisition in real time, subject to a per-user data access policy.
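
    A minimal sketch of the common-time-scale standardization both TSDSystem abstracts describe, with pandas (an assumption; the actual system is built on a relational database) and synthetic series:

    ```python
    # Place two series with different native sampling on one common time scale
    # so they can be queried and visualized together.
    import numpy as np
    import pandas as pd

    rms = pd.Series(np.random.rand(120), name="seismic_rms",
                    index=pd.date_range("2013-01-01", periods=120, freq="10s"))
    temp = pd.Series(20 + np.random.rand(40), name="temperature",
                     index=pd.date_range("2013-01-01 00:00:03", periods=40, freq="30s"))

    common = pd.concat([rms, temp], axis=1).resample("30s").mean()
    print(common.head())   # both measures aligned on a common 30-second scale
    ```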

  13. Flexible risk metrics for identifying and monitoring conservation-priority species

    USGS Publications Warehouse

    Stanton, Jessica C.; Semmens, Brice X.; McKann, Patrick C.; Will, Tom; Thogmartin, Wayne E.

    2016-01-01

    Region-specific conservation programs should have objective, reliable metrics for species prioritization and progress evaluation that are customizable to the goals of a program, easy to comprehend and communicate, and standardized across time. Regional programs may have vastly different goals, spatial coverage, or management agendas, and one-size-fits-all schemes may not always be the best approach. We propose a quantitative and objective framework for generating prioritization metrics that is straightforward to implement and update, customizable to different spatial resolutions, and based on readily available time-series data. This framework is also well suited to handling missing data and observer error. We demonstrate this approach using North American Breeding Bird Survey (NABBS) data to identify conservation priority species from a list of over 300 landbirds across 33 bird conservation regions (BCRs). To highlight the flexibility of the framework for different management goals and timeframes we calculate two different metrics. The first identifies species that may be inadequately monitored by NABBS protocols in the near future (TMT, time to monitoring threshold), and the other identifies species likely to decline significantly in the near future based on recent trends (TPD, time to percent decline). Within the individual BCRs we found up to 45% (mean 28%) of the species analyzed had overall declining population trajectories, which could result in up to 37 species declining below a minimum NABBS monitoring threshold in at least one currently occupied BCR within the next 50 years. Additionally, up to 26% (mean 8%) of the species analyzed within the individual BCRs may decline by 30% within the next decade. Conservation workers interested in conserving avian diversity and abundance within these BCRs can use these metrics to plan alternative monitoring schemes or highlight the urgency of those populations experiencing the fastest declines. However, this framework is adaptable to many taxa besides birds wherever abundance time-series data are available.
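
    A TMT-style calculation can be sketched in a few lines: fit a log-linear trend and solve for the time at which the predicted index crosses a monitoring threshold. Counts and threshold here are hypothetical, and the paper's framework additionally handles missing data and observer error.

    ```python
    # Sketch of a time-to-monitoring-threshold metric from a log-linear trend.
    import numpy as np

    def time_to_threshold(counts, years, threshold):
        slope, intercept = np.polyfit(years, np.log(counts), 1)
        if slope >= 0:
            return np.inf                      # not declining
        return (np.log(threshold) - (intercept + slope * years[-1])) / slope

    years = np.arange(1990, 2016)
    counts = 500 * np.exp(-0.03 * (years - 1990)) \
             * np.random.default_rng(6).lognormal(0, 0.1, years.size)
    print(f"~{time_to_threshold(counts, years, 50):.0f} years to monitoring threshold")
    ```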

  14. A risk-based framework to assess long-term effects of policy and water supply changes on water resources systems

    NASA Astrophysics Data System (ADS)

    Hassanzadeh, Elmira; Elshorbagy, Amin; Wheater, Howard; Gober, Patricia

    2015-04-01

    Climate uncertainty can affect water resources availability and management decisions. Sustainable water resources management therefore requires evaluating policy and management decisions under a wide range of possible future water supply conditions. This study proposes a risk-based framework to integrate water supply uncertainty into a forward-looking decision making context. To apply this framework, a stochastic reconstruction scheme is used to generate a large ensemble of flow series. For the Rocky Mountain basins considered here, two key characteristics of the annual hydrograph are the annual flow volume and the timing of the seasonal flood peak. These are perturbed to represent natural randomness and potential changes due to future climate. 30-year series of perturbed flows are used as input to the SWAMP model, an integrated water resources model that simulates the regional water supply-demand system and estimates the economic productivity of water along with other sustainability indicators, including system vulnerability and resilience. The simulation results are used to construct 2D maps of net revenue for a particular water sector, e.g., hydropower, or for all sectors combined. Each map cell represents a risk scenario of net revenue based on a particular annual flow volume, a particular timing of the peak flow, and 200 stochastic realizations of flow series. This framework is demonstrated for a water resources system in the Saskatchewan River Basin (SaskRB) in Saskatchewan, Canada. Critical historical drought sequences, derived from tree-ring reconstructions of several hundred years of annual river flows, are used to evaluate the system's performance (net revenue risk) under extremely low flow conditions and to locate these droughts on the previously produced 2D risk maps. This simulation and analysis framework is repeated under various reservoir operation strategies (e.g., maximizing flood protection or maximizing water supply security); development proposals, such as irrigation expansion; and changes in energy prices. Such risk-based analysis demonstrates the relative reduction or increase of risk associated with management and policy decisions and allows decision makers to explore the relative importance of policy versus natural water supply change in a water resources system.

  15. Time series modeling of live-cell shape dynamics for image-based phenotypic profiling.

    PubMed

    Gordonov, Simon; Hwang, Mun Kyung; Wells, Alan; Gertler, Frank B; Lauffenburger, Douglas A; Bathe, Mark

    2016-01-01

    Live-cell imaging can be used to capture spatio-temporal aspects of cellular responses that are not accessible to fixed-cell imaging. As the use of live-cell imaging continues to increase, new computational procedures are needed to characterize and classify the temporal dynamics of individual cells. For this purpose, here we present the general experimental-computational framework SAPHIRE (Stochastic Annotation of Phenotypic Individual-cell Responses) to characterize phenotypic cellular responses from time series imaging datasets. Hidden Markov modeling is used to infer and annotate morphological state and state-switching properties from image-derived cell shape measurements. Time series modeling is performed on each cell individually, making the approach broadly useful for analyzing asynchronous cell populations. Two-color fluorescent cells simultaneously expressing actin and nuclear reporters enabled us to profile temporal changes in cell shape following pharmacological inhibition of cytoskeleton-regulatory signaling pathways. Results are compared with existing approaches conventionally applied to fixed-cell imaging datasets, and indicate that time series modeling captures heterogeneous dynamic cellular responses that can improve drug classification and offer additional important insight into mechanisms of drug action. The software is available at http://saphire-hcs.org.

  16. Spectral Unmixing Analysis of Time Series Landsat 8 Images

    NASA Astrophysics Data System (ADS)

    Zhuo, R.; Xu, L.; Peng, J.; Chen, Y.

    2018-05-01

    Temporal analysis of Landsat 8 images opens up new opportunities in the unmixing procedure. Although spectral analysis of time series Landsat imagery has its own advantages, it has rarely been studied. Using the temporal information can provide improved unmixing performance compared with independent image analyses, and different land cover types may show different temporal patterns, which can aid their discrimination. Therefore, this letter presents time series K-P-Means, a new solution to the problem of unmixing time series Landsat imagery. The proposed approach obtains "purified" pixels in order to achieve optimal unmixing performance. Vertex component analysis (VCA) is used for endmember initialization. First, nonnegative least squares (NNLS) is used to estimate abundance maps from the current endmembers. Each endmember is then re-estimated as the mean of its "purified" pixels, i.e., the residuals of the mixed pixels after excluding the contributions of all nondominant endmembers. Assembling the two main steps (abundance estimation and endmember update) into an iterative optimization framework yields the complete algorithm. Experiments using both simulated and real Landsat 8 images show that the proposed "joint unmixing" approach provides more accurate endmember and abundance estimates than the "separate unmixing" approach.
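
    A simplified sketch of the alternating loop described above, with NNLS abundance estimation and an endmember update fit to abundance-weighted "purified" residuals; random-pixel initialization stands in for VCA, and the data are synthetic.

    ```python
    # Alternating abundance-estimation / endmember-update loop (simplified).
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(7)
    bands, pixels, k = 8, 200, 3
    E_true = rng.uniform(0.1, 1.0, (bands, k))
    A_true = rng.dirichlet(np.ones(k), pixels).T                # k x pixels
    Y = E_true @ A_true + rng.normal(0, 0.01, (bands, pixels))  # mixed pixels

    E = Y[:, rng.choice(pixels, k, replace=False)]              # crude init
    for _ in range(20):
        A = np.column_stack([nnls(E, Y[:, j])[0] for j in range(pixels)])
        for i in range(k):
            purified = Y - E @ A + np.outer(E[:, i], A[i])      # drop other endmembers
            w = A[i]                                            # abundance weights
            E[:, i] = (purified * w).sum(axis=1) / ((w ** 2).sum() + 1e-12)
    print(np.abs(Y - E @ A).mean())                             # reconstruction error
    ```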

  17. A framework for delineating the regional boundaries of PM2.5 pollution: A case study of China.

    PubMed

    Liu, Jianzheng; Li, Weifeng; Wu, Jiansheng

    2018-04-01

    Fine particulate matter (PM2.5) pollution has been a major issue in many countries. Considerable research has demonstrated that PM2.5 pollution is a regional issue, but little has been done to investigate the regional extent of PM2.5 pollution or to define areas in which PM2.5 pollutants interact. To allow for a better understanding of the regional nature and spatial patterns of PM2.5 pollution, this study proposes a novel framework for delineating regional boundaries of PM2.5 pollution. The framework consists of four steps: cross-correlation analysis, time-series clustering, generation of Voronoi polygons, and polygon smoothing using the polynomial approximation with exponential kernel method. Using the framework, regional PM2.5 boundaries for China are produced; the boundaries define areas where the monthly PM2.5 time series of any two cities show, on average, more than 50% similarity with each other. These areas demonstrate straightforwardly that PM2.5 pollution is not limited to a single city or a single province. We also found that PM2.5 areas in China tend to be larger in cold months but more fragmented in warm months, suggesting that in cold months the interactions between PM2.5 concentrations in adjacent cities are stronger than in warm months. The proposed framework provides a tool to delineate PM2.5 boundaries and identify areas where PM2.5 pollutants interact. It can help define air pollution management zones and assess impacts related to PM2.5 pollution. It can also be used in analyses of other air pollutants. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Using permutation tests to enhance causal inference in interrupted time series analysis.

    PubMed

    Linden, Ariel

    2018-06-01

    Interrupted time series analysis (ITSA) is an evaluation methodology in which a single treatment unit's outcome is studied serially over time and the intervention is expected to "interrupt" the level and/or trend of that outcome. Internal validity is strengthened considerably when the treated unit is contrasted with a comparable control group. In this paper, we introduce a robustness check based on permutation tests to further improve causal inference. We evaluate the effect of California's Proposition 99 for reducing cigarette sales by iteratively casting each nontreated state into the role of "treated," creating a comparable control group using the ITSAMATCH package in Stata, and then evaluating treatment effects using ITSA regression. If statistically significant "treatment effects" are estimated for pseudotreated states, then any significant changes in the outcome of the actual treatment unit (California) cannot be attributed to the intervention. We perform these analyses setting the cutpoint significance level to P > .40 for identifying balanced matches (the highest threshold possible for which controls could still be found for California) and use the difference in differences of trends as the treatment effect estimator. Only California attained a statistically significant treatment effect, strengthening confidence in the conclusion that Proposition 99 reduced cigarette sales. The proposed permutation testing framework provides an additional robustness check to either support or refute a treatment effect identified for the true treated unit in ITSA. Given its value and ease of implementation, this framework should be considered as a standard robustness test in all multiple-group interrupted time series analyses. © 2018 John Wiley & Sons, Ltd.
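
    The placebo-permutation logic is easy to sketch on synthetic data: estimate the difference-in-trends effect for the treated unit, repeat with every control unit cast as pseudo-treated, and compare. The matching step (ITSAMATCH) and the real Proposition 99 data are omitted here.

    ```python
    # Placebo-permutation check for an interrupted time series effect.
    import numpy as np

    rng = np.random.default_rng(8)
    t = np.arange(20.0); post = (t >= 10).astype(float)
    controls = 0.2 * t + rng.normal(0, 0.5, (30, 20))           # shared trend, no break
    treated = 0.2 * t - 0.8 * post * (t - 10) + rng.normal(0, 0.5, 20)

    def diff_in_trends(y):
        """Post-interruption slope change from an OLS segmented-trend model."""
        X = np.column_stack([np.ones_like(t), t, post, post * (t - 10)])
        return np.linalg.lstsq(X, y, rcond=None)[0][-1]

    actual = diff_in_trends(treated)
    placebos = np.array([diff_in_trends(c) for c in controls])
    p = np.mean(np.abs(placebos) >= np.abs(actual))
    print(f"effect={actual:.2f}, permutation p~{p:.2f}")
    ```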

  19. JCell--a Java-based framework for inferring regulatory networks from time series data.

    PubMed

    Spieth, C; Supper, J; Streichert, F; Speer, N; Zell, A

    2006-08-15

    JCell is a Java-based application for reconstructing gene regulatory networks from experimental data. The framework provides several algorithms to identify genetic and metabolic dependencies based on experimental data conjoint with mathematical models to describe and simulate regulatory systems. Owing to the modular structure, researchers can easily implement new methods. JCell is a pure Java application with additional scripting capabilities and thus widely usable, e.g. on parallel or cluster computers. The software is freely available for download at http://www-ra.informatik.uni-tuebingen.de/software/JCell.

  20. Halogen dependent symmetry change in two series of wheel cluster organic frameworks built from La18 tertiary building units.

    PubMed

    Fang, Wei-Hui; Zhang, Lei; Zhang, Jian; Yang, Guo-Yu

    2016-01-25

    Two series of wheel cluster organic frameworks (WCOFs) built from La18 tertiary building units have been made hydrothermally; they show halogen-dependent structural symmetry and demonstrate different chiral performance.

  1. Investigation on Law and Economics Based on Complex Network and Time Series Analysis.

    PubMed

    Yang, Jian; Qu, Zhao; Chang, Hui

    2015-01-01

    The research focuses on the cooperative relationship and the strategy tendency among three mutually interactive parties in financing: small enterprises, commercial banks and micro-credit companies. Complex network theory and time series analysis were applied to figure out the quantitative evidence. Moreover, this paper built up a fundamental model describing the particular interaction among them through evolutionary game. Combining the results of data analysis and current situation, it is justifiable to put forward reasonable legislative recommendations for regulations on lending activities among small enterprises, commercial banks and micro-credit companies. The approach in this research provides a framework for constructing mathematical models and applying econometrics and evolutionary game in the issue of corporation financing.

  2. Forecasting Electricity Prices in an Optimization Hydrothermal Problem

    NASA Astrophysics Data System (ADS)

    Matías, J. M.; Bayón, L.; Suárez, P.; Argüelles, A.; Taboada, J.

    2007-12-01

    This paper presents an economic dispatch algorithm for a hydrothermal system within the framework of a competitive and deregulated electricity market. The optimization problem of one firm is described, whose objective function can be defined as profit maximization. Since next-day price forecasting is a crucial aspect, this paper proposes an efficient yet highly accurate new next-day price forecasting method using a functional time series approach that exploits the daily seasonal structure of the price series. For the optimization problem, an optimal control technique is applied and Pontryagin's theorem is employed.

  3. Statistical analysis of hydrological response in urbanising catchments based on adaptive sampling using inter-amount times

    NASA Astrophysics Data System (ADS)

    ten Veldhuis, Marie-Claire; Schleiss, Marc

    2017-04-01

    In this study, we introduce an alternative approach for the analysis of hydrological flow time series, using an adaptive sampling framework based on inter-amount times (IATs). The main difference from conventional flow time series is the rate at which low and high flows are sampled: the unit of analysis for IATs is a fixed flow amount instead of a fixed time window. We analysed statistical distributions of flows and IATs across a wide range of sampling scales to investigate the sensitivity of statistical properties such as quantiles, variance, skewness, scaling parameters and flashiness indicators to the sampling scale. We did this based on streamflow time series for 17 (semi)urbanised basins in North Carolina, US, ranging from 13 km2 to 238 km2 in size. Results showed that adaptive sampling of flow time series based on inter-amounts leads to a more balanced representation of low flow and peak flow values in the statistical distribution. While conventional sampling gives a lot of weight to low flows, as these are most ubiquitous in flow time series, IAT sampling gives relatively more weight to high flow values, since a given flow amount is accumulated in a shorter time. As a consequence, IAT sampling gives more information about the tail of the distribution associated with high flows, while conventional sampling gives relatively more information about low flow periods. We present results of statistical analyses across a range of subdaily to seasonal scales and highlight some interesting insights that can be derived from IAT statistics with respect to basin flashiness and the impact of urbanisation on hydrological response.
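
    Computing IATs amounts to inverting the cumulative flow volume at fixed flow-amount increments; a minimal sketch on synthetic (skewed) flows:

    ```python
    # Inter-amount-time (IAT) sampling of a flow series.
    import numpy as np

    rng = np.random.default_rng(9)
    flow = rng.lognormal(0, 1, 1000)                 # hourly flow, arbitrary units
    cumvol = np.concatenate([[0.0], np.cumsum(flow)])
    hours = np.arange(cumvol.size, dtype=float)

    amount = 50.0                                    # fixed amount per sample
    targets = np.arange(amount, cumvol[-1], amount)
    crossings = np.interp(targets, cumvol, hours)    # time each amount is reached
    iats = np.diff(crossings)                        # inter-amount times
    print(iats.min(), np.median(iats), iats.max())   # short IATs flag high-flow periods
    ```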

  4. A Comparison of Joint Model and Fully Conditional Specification Imputation for Multilevel Missing Data

    ERIC Educational Resources Information Center

    Mistler, Stephen A.; Enders, Craig K.

    2017-01-01

    Multiple imputation methods can generally be divided into two broad frameworks: joint model (JM) imputation and fully conditional specification (FCS) imputation. JM draws missing values simultaneously for all incomplete variables using a multivariate distribution, whereas FCS imputes variables one at a time from a series of univariate conditional…

  5. Toward A Theory of HRD Learning Participation

    ERIC Educational Resources Information Center

    Wang, Greg G.; Wang, Jia

    2005-01-01

    This article fills a gap by identifying an under-studied area for learning participation (LP) in HRD theory building. A literature review is presented to identify gaps in adult education and HRD literature. An HRD LP framework is then proposed, from cross-sectional/time-series perspectives, to describe the pattern, factors, structure, and the…

  6. Did School Finance Equalization Increase Revenue Instability for School Districts?

    ERIC Educational Resources Information Center

    Balu, Rekha

    2011-01-01

    This study uses an Interrupted Time Series analysis with a non-equivalent comparison group to estimate the causal effect of school finance equalization on district revenue instability. The author applies a microeconomic framework to an understudied problem in education finance and policy. In so doing, she illustrates how policies can sometimes…

  7. Fast and Flexible Multivariate Time Series Subsequence Search

    NASA Technical Reports Server (NTRS)

    Bhaduri, Kanishka; Oza, Nikunj C.; Zhu, Qiang; Srivastava, Ashok N.

    2010-01-01

    Multivariate Time-Series (MTS) are ubiquitous and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical monitoring, and financial systems. Domain experts are often interested in searching for interesting multivariate patterns in these MTS databases, which often contain several gigabytes of data. Surprisingly, research on MTS search is very limited. Most existing work supports only queries with the same length of data, or queries on a fixed set of variables. In this paper, we propose an efficient and flexible subsequence search framework for massive MTS databases that, for the first time, enables querying on any subset of variables with arbitrary time delays between them. We propose two algorithms to solve this problem: (1) a List Based Search (LBS) algorithm which uses sorted lists for indexing, and (2) an R*-tree Based Search (RBS) which uses Minimum Bounding Rectangles (MBR) to organize the subsequences. Both algorithms guarantee that all matching patterns within the specified thresholds will be returned (no false dismissals). The very few false alarms can be removed by a post-processing step. Since our framework is also capable of Univariate Time-Series (UTS) subsequence search, we first demonstrate the efficiency of our algorithms on several UTS datasets previously used in the literature. We follow this up with experiments using two large MTS databases from the aviation domain, each containing several million observations. Both these tests show that our algorithms have very high prune rates (>99%), thus needing actual disk access for less than 1% of the observations. To the best of our knowledge, MTS subsequence search has never been attempted on datasets of the size we have used in this paper.

  8. A framework for assessing cumulative effects in watersheds: an introduction to Canadian case studies.

    PubMed

    Dubé, Monique G; Duinker, Peter; Greig, Lorne; Carver, Martin; Servos, Mark; McMaster, Mark; Noble, Bram; Schreier, Hans; Jackson, Lee; Munkittrick, Kelly R

    2013-07-01

    From 2008 to 2013, a series of studies supported by the Canadian Water Network were conducted in Canadian watersheds in an effort to improve methods to assess cumulative effects. These studies fit under a common framework for watershed cumulative effects assessment (CEA). This article presents an introduction to the Special Series on Watershed CEA in IEAM including the framework and its impetus, a brief introduction to each of the articles in the series, challenges, and a path forward. The framework includes a regional water monitoring program that produces 3 core outputs: an accumulated state assessment, stressor-response relationships, and development of predictive cumulative effects scenario models. The framework considers core values, indicators, thresholds, and use of consistent terminology. It emphasizes that CEA requires 2 components, accumulated state quantification and predictive scenario forecasting. It recognizes both of these components must be supported by a regional, multiscale monitoring program. Copyright © 2013 SETAC.

  9. Data-driven coarse graining in action: Modeling and prediction of complex systems

    NASA Astrophysics Data System (ADS)

    Krumscheid, S.; Pradas, M.; Pavliotis, G. A.; Kalliadasis, S.

    2015-10-01

    In many physical, technological, social, and economic applications, one is commonly faced with the task of estimating statistical properties, such as mean first passage times of a continuous temporal process, from empirical data (experimental observations). Typically, however, an accurate and reliable estimation of such properties directly from the data alone is not possible, as the time series is often too short or the particular phenomenon of interest is only rarely observed. We propose here a theoretical-computational framework which provides a systematic and rational estimation of statistical quantities of a given temporal process, such as waiting times between subsequent bursts of activity in intermittent signals. Our framework is illustrated with applications to real-world data sets, ranging from marine biology to paleoclimatic data.

  10. Modeling Geomagnetic Variations using a Machine Learning Framework

    NASA Astrophysics Data System (ADS)

    Cheung, C. M. M.; Handmer, C.; Kosar, B.; Gerules, G.; Poduval, B.; Mackintosh, G.; Munoz-Jaramillo, A.; Bobra, M.; Hernandez, T.; McGranaghan, R. M.

    2017-12-01

    We present a framework for data-driven modeling of Heliophysics time series data. The Solar Terrestrial Interaction Neural net Generator (STING) is an open source python module built on top of state-of-the-art statistical learning frameworks (traditional machine learning methods as well as deep learning). To showcase the capability of STING, we deploy it for the problem of predicting the temporal variation of geomagnetic fields. The data used includes solar wind measurements from the OMNI database and geomagnetic field data taken by magnetometers at US Geological Survey observatories. We examine the predictive capability of different machine learning techniques (recurrent neural networks, support vector machines) for a range of forecasting times (minutes to 12 hours). STING is designed to be extensible to other types of data. We show how STING can be used on large sets of data from different sensors/observatories and adapted to tackle other problems in Heliophysics.

  11. Time series analysis of temporal networks

    NASA Astrophysics Data System (ADS)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies, only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without knowledge of the full network structure; an estimate of some of the properties is sufficient to launch the attack. In this paper we show that even if the network structure at a future time point is not available, one can still estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them, and, using a standard time series forecast model, predict the properties of the temporal network at a later time instance. To this end, we consider eight properties, such as the number of active nodes, average degree, and clustering coefficient, and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as an Auto-Regressive Integrated Moving Average (ARIMA) process. We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency-domain properties of the time series obtained from spectrogram analysis can be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application we show how such a prediction scheme can be used to launch targeted attacks on temporal networks. Contribution to the Topical Issue "Temporal Network Theory and Applications", edited by Petter Holme.
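
    A minimal sketch of the forecasting step: one mapped network property, modeled as a synthetic series with AR(1)-like memory, forecast with an ARIMA model via statsmodels (an assumed toolkit; the paper does not prescribe one).

    ```python
    # Forecast a synthetic "active nodes" series with an ARIMA model.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(10)
    n = 200
    active = np.empty(n); active[0] = 100.0
    for i in range(1, n):                      # AR(1) around a slow upward drift
        active[i] = 0.7 * active[i - 1] + 30 + 0.05 * i + rng.normal(0, 5)

    fit = ARIMA(active, order=(1, 1, 1)).fit()
    print(fit.forecast(steps=5))               # property values at future snapshots
    ```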

  12. Conditional Spectral Analysis of Replicated Multiple Time Series with Application to Nocturnal Physiology.

    PubMed

    Krafty, Robert T; Rosen, Ori; Stoffer, David S; Buysse, Daniel J; Hall, Martica H

    2017-01-01

    This article considers the problem of analyzing associations between power spectra of multiple time series and cross-sectional outcomes when data are observed from multiple subjects. The motivating application comes from sleep medicine, where researchers are able to non-invasively record physiological time series signals during sleep. The frequency patterns of these signals, which can be quantified through the power spectrum, contain interpretable information about biological processes. An important problem in sleep research is drawing connections between power spectra of time series signals and clinical characteristics; these connections are key to understanding biological pathways through which sleep affects, and can be treated to improve, health. Such analyses are challenging as they must overcome the complicated structure of a power spectrum from multiple time series as a complex positive-definite matrix-valued function. This article proposes a new approach to such analyses based on a tensor-product spline model of Cholesky components of outcome-dependent power spectra. The approach flexibly models power spectra as nonparametric functions of frequency and outcome while preserving geometric constraints. Formulated in a fully Bayesian framework, a Whittle likelihood based Markov chain Monte Carlo (MCMC) algorithm is developed for automated model fitting and for conducting inference on associations between outcomes and spectral measures. The method is used to analyze data from a study of sleep in older adults and uncovers new insights into how stress and arousal are connected to the amount of time one spends in bed.

  13. Empirical intrinsic geometry for nonlinear modeling and time series filtering.

    PubMed

    Talmon, Ronen; Coifman, Ronald R

    2013-07-30

    In this paper, we present a method for time series analysis based on empirical intrinsic geometry (EIG). EIG enables one to reveal the low-dimensional parametric manifold as well as to infer the underlying dynamics of high-dimensional time series. By incorporating concepts of information geometry, this method extends existing geometric analysis tools to support stochastic settings and parametrizes the geometry of empirical distributions. However, the statistical models are not required as priors; hence, EIG may be applied to a wide range of real signals without existing definitive models. We show that the inferred model is noise-resilient and invariant under different observation and instrumental modalities. In addition, we show that it can be extended efficiently to newly acquired measurements in a sequential manner. These two advantages enable us to revisit the Bayesian approach and incorporate empirical dynamics and intrinsic geometry into a nonlinear filtering framework. We show applications to nonlinear and non-Gaussian tracking problems as well as to acoustic signal localization.

  14. Supercapacitors of nanocrystalline metal-organic frameworks.

    PubMed

    Choi, Kyung Min; Jeong, Hyung Mo; Park, Jung Hyo; Zhang, Yue-Biao; Kang, Jeung Ku; Yaghi, Omar M

    2014-07-22

    The high porosity of metal-organic frameworks (MOFs) has been used to achieve exceptional gas adsorptive properties but as yet remains largely unexplored for electrochemical energy storage devices. This study shows that MOFs made as nanocrystals (nMOFs) can be doped with graphene and successfully incorporated into devices to function as supercapacitors. A series of 23 different nMOFs with multiple organic functionalities and metal ions, differing pore sizes and shapes, discrete and infinite metal oxide backbones, large and small nanocrystals, and a variety of structure types have been prepared and examined. Several members of this series give high capacitance; in particular, a zirconium MOF exhibits exceptionally high capacitance. It has the stack and areal capacitance of 0.64 and 5.09 mF cm(-2), about 6 times that of the supercapacitors made from the benchmark commercial activated carbon materials and a performance that is preserved over at least 10000 charge/discharge cycles.

  15. Mixture Hidden Markov Models in Finance Research

    NASA Astrophysics Data System (ADS)

    Dias, José G.; Vermunt, Jeroen K.; Ramos, Sofia

    Finite mixture models have proven to be a powerful framework whenever unobserved heterogeneity cannot be ignored. We introduce in finance research the Mixture Hidden Markov Model (MHMM) that takes into account time and space heterogeneity simultaneously. This approach is flexible in the sense that it can deal with the specific features of financial time series data, such as asymmetry, kurtosis, and unobserved heterogeneity. This methodology is applied to model simultaneously 12 time series of Asian stock markets indexes. Because we selected a heterogeneous sample of countries including both developed and emerging countries, we expect that heterogeneity in market returns due to country idiosyncrasies will show up in the results. The best fitting model was the one with two clusters at country level with different dynamics between the two regimes.

  16. How to measure ecosystem stability? An evaluation of the reliability of stability metrics based on remote sensing time series across the major global ecosystems.

    PubMed

    De Keersmaecker, Wanda; Lhermitte, Stef; Honnay, Olivier; Farifteh, Jamshid; Somers, Ben; Coppin, Pol

    2014-07-01

    Increasing frequency of extreme climate events is likely to impose increased stress on ecosystems and to jeopardize the services that ecosystems provide. It is therefore of major importance to assess the effects of extreme climate events on the temporal stability (i.e., the resistance, the resilience, and the variance) of ecosystem properties. Most time series of ecosystem properties are, however, affected by varying data characteristics, uncertainties, and noise, which complicate the comparison of ecosystem stability metrics (ESMs) between locations. There is therefore a strong need for a more comprehensive understanding of the reliability of stability metrics and how they can be used to compare ecosystem stability globally. The objective of this study was to evaluate the performance of temporal ESMs based on time series of the Moderate Resolution Imaging Spectroradiometer derived Normalized Difference Vegetation Index of 15 global land-cover types. We provide a framework (i) to assess the reliability of ESMs as a function of data characteristics, uncertainties, and noise, and (ii) to integrate reliability estimates into future global studies of ecosystem stability against climate disturbances. The performance of our framework was tested through (i) a global ecosystem comparison and (ii) a comparison of ecosystem stability in response to the 2003 drought. The results show the influence of data quality on the accuracy of ecosystem stability metrics. White noise, biased noise, and trends have a stronger effect on the accuracy of stability metrics than the length of the time series, the temporal resolution, or the amount of missing values. Moreover, we demonstrate the importance of integrating reliability estimates to interpret stability metrics within confidence limits. Based on these confidence limits, other studies dealing with specific ecosystem types or locations can be put into context, and a more reliable assessment of ecosystem stability against environmental disturbances can be obtained. © 2013 John Wiley & Sons Ltd.

  17. The Hurst exponent in energy futures prices

    NASA Astrophysics Data System (ADS)

    Serletis, Apostolos; Rosenberg, Aryeh Adam

    2007-07-01

    This paper extends the work in Elder and Serletis [Long memory in energy futures prices, Rev. Financial Econ., forthcoming, 2007] and Serletis et al. [Detrended fluctuation analysis of the US stock market, Int. J. Bifurcation Chaos, forthcoming, 2007] by re-examining the empirical evidence for random-walk-type behavior in energy futures prices. In doing so, it uses daily data on energy futures traded on the New York Mercantile Exchange, over the period from July 2, 1990 to November 1, 2006, and a statistical physics approach, the 'detrending moving average' technique, which provides a reliable framework for testing information efficiency in financial markets, as shown by Alessio et al. [Second-order moving average and scaling of stochastic time series, Eur. Phys. J. B 27 (2002) 197-200] and Carbone et al. [Time-dependent Hurst exponent in financial time series, Physica A 344 (2004) 267-271; Analysis of clusters formed by the moving average of a long-range correlated time series, Phys. Rev. E 69 (2004) 026105]. The results show that energy futures returns display long memory and that the particular form of long memory is anti-persistence.
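
    The detrending moving average technique lends itself to a compact sketch: compute fluctuations about a backward moving average over several window sizes and read the scaling exponent off the log-log slope. The conventions here (backward alignment, window grid) are one common variant, and the test series is a synthetic random walk rather than futures prices.

    ```python
    # Detrending-moving-average (DMA) estimate of the Hurst exponent
    # (H ~ 0.5 expected for a Brownian random walk).
    import numpy as np

    def dma_hurst(y, windows):
        F = []
        for n in windows:
            ma = np.convolve(y, np.ones(n) / n, mode="valid")  # moving average
            F.append(np.sqrt(np.mean((y[n - 1:] - ma) ** 2)))  # fluctuation
        slope, _ = np.polyfit(np.log(windows), np.log(F), 1)
        return slope

    walk = np.cumsum(np.random.default_rng(11).normal(size=20000))
    print(dma_hurst(walk, windows=[10, 20, 40, 80, 160, 320]))
    ```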

  18. Dynamical glucometry: Use of multiscale entropy analysis in diabetes

    NASA Astrophysics Data System (ADS)

    Costa, Madalena D.; Henriques, Teresa; Munshi, Medha N.; Segal, Alissa R.; Goldberger, Ary L.

    2014-09-01

    Diabetes mellitus (DM) is one of the world's most prevalent medical conditions. Contemporary management focuses on lowering mean blood glucose values toward a normal range, but largely ignores the dynamics of glucose fluctuations. We probed analyte time series obtained from continuous glucose monitor (CGM) sensors. We show that the fluctuations in CGM values sampled every 5 min are not uncorrelated noise. Next, using multiscale entropy analysis, we quantified the complexity of the temporal structure of the CGM time series from a group of elderly subjects with type 2 DM and age-matched controls. We further probed the structure of these CGM time series using detrended fluctuation analysis. Our findings indicate that the dynamics of glucose fluctuations from control subjects are more complex than those of subjects with type 2 DM over time scales ranging from about 5 min to 5 h. These findings support consideration of a new framework, dynamical glucometry, to guide mechanistic research and to help assess and compare therapeutic interventions, which should enhance complexity of glucose fluctuations and not just lower mean and variance of blood glucose levels.
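
    Multiscale entropy itself can be sketched compactly: coarse-grain the series at increasing scales and compute sample entropy at each. This simplified O(N^2) version re-estimates the tolerance per scale and runs on a synthetic series, not CGM data.

    ```python
    # Multiscale sample entropy (simplified implementation for small N).
    import numpy as np

    def sample_entropy(x, m=2, r=0.15):
        x = np.asarray(x); tol = r * x.std()
        def matches(mm):
            emb = np.lib.stride_tricks.sliding_window_view(x, mm)
            d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
            return np.sum(d <= tol) - emb.shape[0]   # exclude self-matches
        return -np.log(matches(m + 1) / matches(m))

    def multiscale_entropy(x, scales=(1, 2, 4, 8)):
        return [sample_entropy(x[: len(x) // s * s].reshape(-1, s).mean(axis=1))
                for s in scales]

    series = np.random.default_rng(12).normal(size=1200).cumsum() * 0.1 + 120
    print(multiscale_entropy(series))
    ```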

  19. Growth Modeling with Nonignorable Dropout: Alternative Analyses of the STAR*D Antidepressant Trial

    ERIC Educational Resources Information Center

    Muthen, Bengt; Asparouhov, Tihomir; Hunter, Aimee M.; Leuchter, Andrew F.

    2011-01-01

    This article uses a general latent variable framework to study a series of models for nonignorable missingness due to dropout. Nonignorable missing data modeling acknowledges that missingness may depend not only on covariates and observed outcomes at previous time points as with the standard missing at random assumption, but also on latent…

  20. LASP Time Series Server (LaTiS): Overcoming Data Access Barriers via a Common Data Model in the Middle Tier (Invited)

    NASA Astrophysics Data System (ADS)

    Lindholm, D. M.; Wilson, A.

    2010-12-01

    The Laboratory for Atmospheric and Space Physics at the University of Colorado has developed an Open Source, OPeNDAP compliant, Java Servlet based, RESTful web service to serve time series data. In addition to handling OPeNDAP style requests and returning standard responses, existing modules for alternate output formats can be reused or customized. It is also simple to reuse or customize modules to directly read various native data sources and even to perform some processing on the server. The server is built around a common data model based on the Unidata Common Data Model (CDM) which merges the NetCDF, HDF, and OPeNDAP data models. The server framework features a modular architecture that supports pluggable Readers, Writers, and Filters via the common interface to the data, enabling a workflow that reads data from their native form, performs some processing on the server, and presents the results to the client in its preferred form. The service is currently being used operationally to serve time series data for the LASP Interactive Solar Irradiance Data Center (LISIRD, http://lasp.colorado.edu/lisird/) and as part of the Time Series Data Server (TSDS, http://tsds.net/). I will present the data model and how it enables reading, writing, and processing concerns to be separated into loosely coupled components. I will also share thoughts for evolving beyond the time series abstraction and providing a general purpose data service that can be orchestrated into larger workflows.

  1. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    NASA Technical Reports Server (NTRS)

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
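
    The store-coefficients-then-query idea can be sketched with sqlite3 standing in for MySQL and the third-party PyWavelets package (pywt) for the decomposition; the schema and signal here are hypothetical.

    ```python
    # Store wavelet coefficients of a trend in a relational table, then query
    # the compact approximation level instead of scanning the raw series.
    import sqlite3
    import numpy as np
    import pywt

    sig = np.sin(np.linspace(0, 8 * np.pi, 1024)) \
          + np.random.default_rng(13).normal(0, 0.1, 1024)
    coeffs = pywt.wavedec(sig, "db4", level=4)     # [cA4, cD4, cD3, cD2, cD1]

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE wavelet (level INTEGER, idx INTEGER, value REAL)")
    for lvl, arr in enumerate(coeffs):
        db.executemany("INSERT INTO wavelet VALUES (?, ?, ?)",
                       [(lvl, i, float(v)) for i, v in enumerate(arr)])

    print(db.execute("SELECT COUNT(*), AVG(value) FROM wavelet WHERE level = 0").fetchone())
    ```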

  2. On the relationship between health, education and economic growth: Time series evidence from Malaysia

    NASA Astrophysics Data System (ADS)

    Khan, Habib Nawaz; Razali, Radzuan B.; Shafei, Afza Bt.

    2016-11-01

    The objectives of this paper are twofold: first, to empirically investigate the effects of an enlarged number of healthy and well-educated people on economic growth in Malaysia within the Endogenous Growth Model framework; second, to examine the causal links between education, health and economic growth using annual time series data from 1981 to 2014 for Malaysia. The data series were checked for their time series properties using ADF and KPSS tests. The long-run co-integration relationship was investigated with the vector autoregressive (VAR) method, short- and long-run dynamics were examined with a vector error correction model (VECM), and causality analysis was performed through the Engle-Granger technique. The results showed a long-run co-integration relationship and positively significant effects of education and health on economic growth in Malaysia. They also confirmed a feedback hypothesis between the variables in the case of Malaysia. The findings underscore the policy relevance of human capital (health and education) to Malaysia's growth process; it is therefore suggested that policy makers focus on the education and health sectors for sustainable economic growth in Malaysia.
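
    The unit-root screening step is a one-liner with statsmodels (an assumed toolkit; KPSS is available from the same module); the series below is a synthetic stationary stand-in for the 1981-2014 data.

    ```python
    # ADF unit-root test on a synthetic stationary series.
    import numpy as np
    from statsmodels.tsa.stattools import adfuller

    growth = np.random.default_rng(14).normal(0.05, 0.02, 34)  # 1981-2014 stand-in
    stat, pvalue, *_ = adfuller(growth)
    print(f"ADF statistic {stat:.2f}, p-value {pvalue:.3f}")   # small p: reject unit root
    ```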

  3. The GOLM-database standard- a framework for time-series data management based on free software

    NASA Astrophysics Data System (ADS)

    Eichler, M.; Francke, T.; Kneis, D.; Reusser, D.

    2009-04-01

    Monitoring and modelling projects usually involve time series data originating from different sources. File formats, temporal resolution and meta-data documentation rarely adhere to a common standard. As a result, much effort is spent on converting, harmonizing, merging, checking, resampling and reformatting these data. Moreover, in work groups or over the course of time, these tasks tend to be carried out redundantly and repeatedly, especially when new data become available. The resulting duplication of data in various formats consumes additional resources. We propose a database structure and complementary scripts for facilitating these tasks. The GOLM (General Observation and Location Management) framework allows for the import and storage of time series data of different types while assisting in meta-data documentation, plausibility checking and harmonization. The imported data can be visually inspected, and their coverage among locations and variables may be visualized. Supplementary scripts provide options for exporting data for selected stations and variables and for resampling the data to the desired temporal resolution. These tools can, for example, be used for generating model input files or reports. Since GOLM fully supports network access, the system can be used efficiently by distributed working groups accessing the same data over the internet. GOLM's database structure and the complementary scripts can easily be customized to specific needs. All involved software, such as MySQL, R, PHP and OpenOffice, as well as the scripts for building and using the database, including documentation, are free for download. GOLM was developed out of the practical requirements of the OPAQUE project. It has been tested and further refined in the ERANET-CRUE and SESAM projects, all of which used GOLM to manage meteorological, hydrological and/or water quality data.
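
    The resampling step described above is the kind of task the export scripts automate. A small illustration with pandas follows; pandas stands in for the GOLM scripts (which use MySQL, R and PHP), and the station data are invented.

```python
# Illustrative sketch of the harmonization/resampling step described above:
# irregular raw observations are brought onto common temporal resolutions
# before export, e.g. for model input files or reports.
import numpy as np
import pandas as pd

# Irregular raw observations (7-13 minute spacing) from a hypothetical station
rng = np.random.default_rng(0)
idx = pd.to_datetime("2009-01-01") + pd.to_timedelta(
    np.cumsum(rng.integers(7, 14, size=100)), unit="m")
raw = pd.Series(rng.random(100), index=idx, name="precip_mm")

hourly = raw.resample(pd.Timedelta(hours=1)).sum(min_count=1)  # model input
daily = raw.resample(pd.Timedelta(days=1)).sum(min_count=1)    # report
print(hourly.head())
print(daily.head())
```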

  4. Investigation on Law and Economics Based on Complex Network and Time Series Analysis

    PubMed Central

    Yang, Jian; Qu, Zhao; Chang, Hui

    2015-01-01

    The research focuses on the cooperative relationship and strategic tendencies among three mutually interacting parties in financing: small enterprises, commercial banks and micro-credit companies. Complex network theory and time series analysis were applied to obtain quantitative evidence. Moreover, this paper builds a fundamental model describing the particular interaction among them through evolutionary game theory. Combining the results of the data analysis with the current situation, it is justifiable to put forward reasonable legislative recommendations for regulating lending activities among small enterprises, commercial banks and micro-credit companies. The approach in this research provides a framework for constructing mathematical models and applying econometrics and evolutionary game theory to the issue of corporate financing. PMID:26076460

  5. Delay differential analysis of time series.

    PubMed

    Lainscsek, Claudia; Sejnowski, Terrence J

    2015-03-01

    Nonlinear dynamical system analysis based on embedding theory has been used for modeling and prediction, but it also has applications to signal detection and classification of time series. An embedding creates a multidimensional geometrical object from a single time series. Traditionally either delay or derivative embeddings have been used. The delay embedding is composed of delayed versions of the signal, and the derivative embedding is composed of successive derivatives of the signal. The delay embedding has been extended to nonuniform embeddings to take multiple timescales into account. Both embeddings provide information on the underlying dynamical system without having direct access to all the system variables. Delay differential analysis is based on functional embeddings, a combination of the derivative embedding with nonuniform delay embeddings. Small delay differential equation (DDE) models that best represent relevant dynamic features of time series data are selected from a pool of candidate models for detection or classification. We show that the properties of DDEs support spectral analysis in the time domain where nonlinear correlation functions are used to detect frequencies, frequency and phase couplings, and bispectra. These can be efficiently computed with short time windows and are robust to noise. For frequency analysis, this framework is a multivariate extension of discrete Fourier transform (DFT), and for higher-order spectra, it is a linear and multivariate alternative to multidimensional fast Fourier transform of multidimensional correlations. This method can be applied to short or sparse time series and can be extended to cross-trial and cross-channel spectra if multiple short data segments of the same experiment are available. Together, this time-domain toolbox provides higher temporal resolution, increased frequency and phase coupling information, and it allows an easy and straightforward implementation of higher-order spectra across time compared with frequency-based methods such as the DFT and cross-spectral analysis.
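
    The functional embedding underlying the analysis above combines derivative and nonuniform delay embeddings. The following sketch shows a plain nonuniform delay embedding and a least-squares fit of a small DDE-style model; the lags, signal, and model terms are illustrative only, not the authors' selected models.

```python
# Minimal sketch of the (nonuniform) delay embedding underlying the DDE
# analysis described above: each row pairs the signal with delayed copies
# of itself at chosen lags.
import numpy as np

def delay_embed(x: np.ndarray, lags: tuple) -> np.ndarray:
    """Return a matrix whose columns are x(t), x(t-lags[0]), x(t-lags[1]), ..."""
    m = max(lags)
    cols = [x[m:]] + [x[m - L : len(x) - L] for L in lags]
    return np.column_stack(cols)

t = np.linspace(0, 40 * np.pi, 4000)
x = np.sin(t) + 0.5 * np.sin(3 * t)            # toy signal with two timescales
emb = delay_embed(x, lags=(7, 23))             # nonuniform lags -> 3-D object

# A small DDE model such as dx/dt ~ a1*x(t-7) + a2*x(t-23) + a3*x(t-7)**2
# can then be fit by least squares on these columns:
dxdt = np.gradient(x[23:], t[23:])
A = np.column_stack([emb[:, 1], emb[:, 2], emb[:, 1] ** 2])
a, *_ = np.linalg.lstsq(A, dxdt, rcond=None)
print("fitted DDE coefficients:", a)
```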

  6. Dynamic decision making for dam-break emergency management - Part 1: Theoretical framework

    NASA Astrophysics Data System (ADS)

    Peng, M.; Zhang, L. M.

    2013-02-01

    An evacuation decision for dam breaks is a very serious issue. A late decision may lead to loss of lives and properties, but a very early evacuation will incur unnecessary expenses. This paper presents a risk-based framework of dynamic decision making for dam-break emergency management (DYDEM). The dam-break emergency management in both time scale and space scale is introduced first to define the dynamic decision problem. The probability of dam failure is taken as a stochastic process and estimated using a time-series analysis method. The flood consequences are taken as functions of warning time and evaluated with a human risk analysis model (HURAM) based on Bayesian networks. A decision criterion is suggested to decide whether to evacuate the population at risk (PAR) or to delay the decision. The optimum time for evacuating the PAR is obtained by minimizing the expected total loss, which integrates the time-related probabilities and flood consequences. When a delayed decision is chosen, the decision making can be updated with available new information. A specific dam-break case study is presented in a companion paper to illustrate the application of this framework to complex dam-breaching problems.
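
    The decision criterion described above, minimizing expected total loss over the evacuation timing, can be illustrated numerically. The probability and consequence curves below are hypothetical placeholders, not the paper's HURAM or time-series failure models.

```python
# Numeric sketch of the decision criterion described above: choose the
# warning time that minimizes expected total loss. All curves are
# hypothetical stand-ins.
import numpy as np

w = np.linspace(0, 48, 481)              # candidate warning times (hours)
p_breach = 0.3                           # assessed dam-failure probability
life_loss = 1000 * np.exp(-0.1 * w)      # flood consequences fall with warning time
evac_cost = 2.0 * w                      # earlier orders disrupt the PAR for longer

expected_loss = p_breach * life_loss + (1 - p_breach) * evac_cost
w_opt = w[np.argmin(expected_loss)]
print(f"optimal warning time: {w_opt:.1f} h, "
      f"expected loss: {expected_loss.min():.1f}")
```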

  7. ACCEPT: Introduction of the Adverse Condition and Critical Event Prediction Toolbox

    NASA Technical Reports Server (NTRS)

    Martin, Rodney A.; Santanu, Das; Janakiraman, Vijay Manikandan; Hosein, Stefan

    2015-01-01

    The prediction of anomalies or adverse events is a challenging task, and there are a variety of methods which can be used to address the problem. In this paper, we introduce a generic framework developed in MATLAB® called ACCEPT (Adverse Condition and Critical Event Prediction Toolbox). ACCEPT is an architectural framework designed to compare and contrast the performance of a variety of machine learning and early warning algorithms, and it tests the capability of these algorithms to robustly predict the onset of adverse events in any time-series data-generating system or process.

  8. Multivariate exploration of non-intrusive load monitoring via spatiotemporal pattern network

    DOE PAGES

    Liu, Chao; Akintayo, Adedotun; Jiang, Zhanhong; ...

    2017-12-18

    Non-intrusive load monitoring (NILM) of electrical demand for the purpose of identifying load components has thus far mostly been studied using univariate data, e.g., using only whole building electricity consumption time series to identify a certain type of end-use such as lighting load. However, using additional variables in the form of multivariate time series data may provide more information in terms of extracting distinguishable features in the context of energy disaggregation. In this work, a novel probabilistic graphical modeling approach, namely the spatiotemporal pattern network (STPN), is proposed for energy disaggregation using multivariate time-series data. The STPN framework is shown to be capable of handling diverse types of multivariate time-series to improve energy disaggregation performance. The technique outperforms the state-of-the-art factorial hidden Markov model (FHMM) and combinatorial optimization (CO) techniques in multiple real-life test cases. Furthermore, based on two homes' aggregate electric consumption data, a similarity metric is defined for the energy disaggregation of one home using a trained model based on the other home (i.e., the out-of-sample case). The proposed similarity metric allows us to enhance scalability by learning supervised models for a few homes and deploying such models to many other similar but unmodeled homes with significantly high disaggregation accuracy.

  9. Multivariate exploration of non-intrusive load monitoring via spatiotemporal pattern network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Chao; Akintayo, Adedotun; Jiang, Zhanhong

    Non-intrusive load monitoring (NILM) of electrical demand for the purpose of identifying load components has thus far mostly been studied using univariate data, e.g., using only whole building electricity consumption time series to identify a certain type of end-use such as lighting load. However, using additional variables in the form of multivariate time series data may provide more information in terms of extracting distinguishable features in the context of energy disaggregation. In this work, a novel probabilistic graphical modeling approach, namely the spatiotemporal pattern network (STPN), is proposed for energy disaggregation using multivariate time-series data. The STPN framework is shown to be capable of handling diverse types of multivariate time-series to improve energy disaggregation performance. The technique outperforms the state-of-the-art factorial hidden Markov model (FHMM) and combinatorial optimization (CO) techniques in multiple real-life test cases. Furthermore, based on two homes' aggregate electric consumption data, a similarity metric is defined for the energy disaggregation of one home using a trained model based on the other home (i.e., the out-of-sample case). The proposed similarity metric allows us to enhance scalability by learning supervised models for a few homes and deploying such models to many other similar but unmodeled homes with significantly high disaggregation accuracy.

  10. A Physiological Time Series Dynamics-Based Approach to Patient Monitoring and Outcome Prediction

    PubMed Central

    Lehman, Li-Wei H.; Adams, Ryan P.; Mayaud, Louis; Moody, George B.; Malhotra, Atul; Mark, Roger G.; Nemati, Shamim

    2015-01-01

    Cardiovascular variables such as heart rate (HR) and blood pressure (BP) are regulated by an underlying control system, and therefore, the time series of these vital signs exhibit rich dynamical patterns of interaction in response to external perturbations (e.g., drug administration), as well as pathological states (e.g., onset of sepsis and hypotension). A question of interest is whether “similar” dynamical patterns can be identified across a heterogeneous patient cohort, and be used for prognosis of patients’ health and progress. In this paper, we used a switching vector autoregressive framework to systematically learn and identify a collection of vital sign time series dynamics, which are possibly recurrent within the same patient and may be shared across the entire cohort. We show that these dynamical behaviors can be used to characterize the physiological “state” of a patient. We validate our technique using simulated time series of the cardiovascular system, and human recordings of HR and BP time series from an orthostatic stress study with known postural states. Using the HR and BP dynamics of an intensive care unit (ICU) cohort of over 450 patients from the MIMIC II database, we demonstrate that the discovered cardiovascular dynamics are significantly associated with hospital mortality (dynamic modes 3 and 9, p = 0.001, p = 0.006 from logistic regression after adjusting for the APACHE scores). Combining the dynamics of BP time series and SAPS-I or APACHE-III provided a more accurate assessment of patient survival/mortality in the hospital than using SAPS-I and APACHE-III alone (p = 0.005 and p = 0.045). Our results suggest that the discovered dynamics of vital sign time series may contain additional prognostic value beyond that of the baseline acuity measures, and can potentially be used as an independent predictor of outcomes in the ICU. PMID:25014976
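
    The paper's method is a Bayesian switching vector autoregressive model; a much simpler stand-in conveys the idea of a shared library of "dynamic modes": fit low-order VAR coefficients in sliding windows and cluster the coefficient vectors. The sketch below is only a rough illustration on toy data, not the authors' inference procedure.

```python
# Simplified stand-in for the switching-VAR idea above: fit a VAR(1) in
# sliding windows over an HR/BP series and cluster the coefficient vectors
# into a small library of shared "dynamic modes".
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n = 2000
hr = np.cumsum(rng.normal(size=n)) * 0.1 + 70      # toy heart-rate series
bp = np.cumsum(rng.normal(size=n)) * 0.1 + 120     # toy blood-pressure series
X = np.column_stack([hr, bp])

def var1_coeffs(seg: np.ndarray) -> np.ndarray:
    """Least-squares VAR(1) coefficient matrix for one window, flattened."""
    Y, Z = seg[1:], seg[:-1]
    A, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    return A.ravel()

win = 100
feats = np.array([var1_coeffs(X[i:i + win]) for i in range(0, n - win, win)])
modes = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(feats)
print("per-window dynamic-mode labels:", modes)
```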

  11. Complex-valued time-series correlation increases sensitivity in FMRI analysis.

    PubMed

    Kociuba, Mary C; Rowe, Daniel B

    2016-07-01

    To develop a linear matrix representation of correlation between complex-valued (CV) time-series in the temporal Fourier frequency domain, and demonstrate its increased sensitivity over correlation between magnitude-only (MO) time-series in functional MRI (fMRI) analysis. The standard in fMRI is to discard the phase before the statistical analysis of the data, despite evidence of task related change in the phase time-series. With a real-valued isomorphism representation of Fourier reconstruction, correlation is computed in the temporal frequency domain with CV time-series data, rather than with the standard of MO data. A MATLAB simulation compares the Fisher-z transform of MO and CV correlations for varying degrees of task related magnitude and phase amplitude change in the time-series. The increased sensitivity of the complex-valued Fourier representation of correlation is also demonstrated with experimental human data. Since the correlation description in the temporal frequency domain is represented as a summation of second order temporal frequencies, the correlation is easily divided into experimentally relevant frequency bands for each voxel's temporal frequency spectrum. The MO and CV correlations for the experimental human data are analyzed for four voxels of interest (VOIs) to show the framework with high and low contrast-to-noise ratios in the motor cortex and the supplementary motor cortex. The simulation demonstrates the increased strength of CV correlations over MO correlations for low magnitude contrast-to-noise time-series. In the experimental human data, the MO correlation maps are noisier than the CV maps, and it is more difficult to distinguish the motor cortex in the MO correlation maps after spatial processing. Including both magnitude and phase in the spatial correlation computations more accurately defines the correlated left and right motor cortices. Sensitivity in correlation analysis is important to preserve the signal of interest in fMRI data sets with high noise variance, and avoid excessive processing induced correlation. Copyright © 2016 Elsevier Inc. All rights reserved.
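
    A rough illustration of the central idea, correlating complex-valued series in the temporal frequency domain within an experimentally relevant band and Fisher-z transforming the result, follows. This is a simplified stand-in, not the paper's exact real-valued isomorphism formulation, and the task frequency and band width are invented.

```python
# Simplified illustration: correlate two complex-valued time series in the
# temporal frequency domain, restricted to a band of interest, then apply
# the Fisher-z transform to the correlation magnitude.
import numpy as np

rng = np.random.default_rng(2)
n, task_freq = 256, 8                        # samples and task bin (illustrative)
phase = rng.uniform(0, 2 * np.pi, n)
x = np.exp(1j * phase) + 0.1 * (rng.normal(size=n) + 1j * rng.normal(size=n))
y = np.exp(1j * phase) + 0.1 * (rng.normal(size=n) + 1j * rng.normal(size=n))

X, Y = np.fft.fft(x), np.fft.fft(y)
band = slice(task_freq - 2, task_freq + 3)   # experimentally relevant band

num = np.sum(X[band] * np.conj(Y[band]))
den = np.sqrt(np.sum(np.abs(X[band])**2) * np.sum(np.abs(Y[band])**2))
r = np.abs(num / den)                        # CV correlation magnitude in band
z = np.arctanh(r)                            # Fisher-z transform
print(f"band-limited CV correlation r={r:.3f}, z={z:.3f}")
```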

  12. Conditional adaptive Bayesian spectral analysis of nonstationary biomedical time series.

    PubMed

    Bruce, Scott A; Hall, Martica H; Buysse, Daniel J; Krafty, Robert T

    2018-03-01

    Many studies of biomedical time series signals aim to measure the association between frequency-domain properties of time series and clinical and behavioral covariates. However, the time-varying dynamics of these associations are largely ignored due to a lack of methods that can assess the changing nature of the relationship through time. This article introduces a method for the simultaneous and automatic analysis of the association between the time-varying power spectrum and covariates, which we refer to as conditional adaptive Bayesian spectrum analysis (CABS). The procedure adaptively partitions the grid of time and covariate values into an unknown number of approximately stationary blocks and nonparametrically estimates local spectra within blocks through penalized splines. CABS is formulated in a fully Bayesian framework, in which the number and locations of partition points are random, and fit using reversible jump Markov chain Monte Carlo techniques. Estimation and inference averaged over the distribution of partitions allows for the accurate analysis of spectra with both smooth and abrupt changes. The proposed methodology is used to analyze the association between the time-varying spectrum of heart rate variability and self-reported sleep quality in a study of older adults serving as the primary caregiver for their ill spouse. © 2017, The International Biometric Society.

  13. Evaluation of recent GRACE monthly solution series with an ice sheet perspective

    NASA Astrophysics Data System (ADS)

    Horwath, Martin; Groh, Andreas

    2016-04-01

    GRACE monthly global gravity field solutions have undergone a remarkable evolution, leading to the latest (Release 5) series by CSR, GFZ, and JPL, to new series by other processing centers, such as ITSG and AIUB, as well as to efforts to derive combined solutions, particularly by the EGSIEM (European Gravity Service for Improved Emergency Management) project. For applications, such as GRACE inferences on ice sheet mass balance, the obvious question is on what GRACE solution series to base the assessment. Here we evaluate different GRACE solution series (including the ones listed above) in a unified framework. We concentrate on solutions expanded up to degree 90 or higher, since this is most appropriate for polar applications. We empirically assess the error levels in the spectral as well as in the spatial domain based on the month-to-month scatter in the high spherical harmonic degrees. We include empirical assessment of error correlations. We then apply all series to infer Antarctic and Greenland mass change time series and compare the results in terms of apparent signal content and noise level. We find that the ITSG solutions show lowest noise level in the high degrees (above 60). A preliminary combined solution from the EGSIEM project shows lowest noise in the degrees below 60. This virtue maps into the derived ice mass time series, where the EGSIEM-based results show the lowest noise in most cases. Meanwhile, there is no indication that any of the considered series systematically dampens actual geophysical signals.

  14. A configurable distributed high-performance computing framework for satellite's TDI-CCD imaging simulation

    NASA Astrophysics Data System (ADS)

    Xue, Bo; Mao, Bingjing; Chen, Xiaomei; Ni, Guoqiang

    2010-11-01

    This paper presents a configurable distributed high-performance computing (HPC) framework for TDI-CCD imaging simulation. It uses the strategy pattern to accommodate multiple algorithms, helping to decrease simulation time at low expense. Imaging simulation for a TDI-CCD mounted on a satellite contains four processes: 1) degradation introduced by the atmosphere, 2) degradation introduced by the optical system, 3) degradation introduced by the TDI-CCD electronics plus a re-sampling step, and 4) data integration. Processes 1) to 3) use diverse data-intensive algorithms such as FFT, convolution and Lagrange interpolation, which require powerful CPUs. Even on an Intel Xeon X5550 processor, the conventional serial method takes more than 30 hours for a simulation whose result image is 1500 × 1462 pixels. A literature review found no mature distributed HPC framework in this field. We therefore developed a distributed computing framework for TDI-CCD imaging simulation, based on WCF [1], which uses a client/server (C/S) architecture and harnesses idle CPU resources in the LAN. The server pushes the tasks of processes 1) to 3) to the free computing nodes, yielding HPC at low cost. In a computing experiment with four symmetric nodes and one server, this framework reduced simulation time by about 74%; adding more asymmetric nodes to the computing network decreased the time further. In conclusion, this framework can provide effectively unlimited computing capacity, provided the network and the task management server can keep up, and it offers a new HPC solution for TDI-CCD imaging simulation and similar applications.
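
    The strategy pattern mentioned above lets each degradation stage accept interchangeable algorithm implementations. The following Python sketch stands in for the actual WCF-based implementation; the functions and placeholder kernels are hypothetical.

```python
# Sketch of the strategy pattern described above: the pipeline depends only
# on a stage's interface, so FFT- or convolution-based variants can be
# swapped without touching the rest of the simulation.
import numpy as np

def fft_blur(img: np.ndarray) -> np.ndarray:
    """Frequency-domain stand-in for one degradation algorithm."""
    f = np.fft.fft2(img)
    f[5:, :] = 0                      # crude low-pass as a placeholder PSF
    return np.real(np.fft.ifft2(f))

def conv_blur(img: np.ndarray) -> np.ndarray:
    """Spatial-domain alternative implementing the same stage."""
    k = np.ones((3, 3)) / 9.0
    out = np.zeros_like(img)
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            out[i, j] = np.sum(img[i - 1:i + 2, j - 1:j + 2] * k)
    return out

def simulate(img: np.ndarray, optics_strategy) -> np.ndarray:
    """The pipeline calls whatever strategy it was configured with."""
    return optics_strategy(img)

img = np.random.rand(32, 32)
out_fft = simulate(img, fft_blur)     # same pipeline, two different strategies
out_conv = simulate(img, conv_blur)
print(out_fft.shape, out_conv.shape)
```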

  15. Model-Based Design of Long-Distance Tracer Transport Experiments in Plants.

    PubMed

    Bühler, Jonas; von Lieres, Eric; Huber, Gregor J

    2018-01-01

    Studies of long-distance transport of tracer isotopes in plants offer high potential for functional phenotyping, but so far measurement time is a bottleneck, because continuous time series of at least 1 h are required to obtain reliable estimates of transport properties. Hence, usual throughput values are between 0.5 and 1 samples h^-1. Here, we propose to increase sample throughput by introducing temporal gaps in the data acquisition of each plant sample and measuring multiple plants one after another in a rotating scheme. In contrast to common time series analysis methods, mechanistic tracer transport models allow the analysis of interrupted time series. The uncertainties of the model parameter estimates are used as a measure of how much information was lost compared to complete time series. A case study was set up to systematically investigate different experimental schedules for throughput scenarios ranging from 1 to 12 samples h^-1. Selected designs with only a small number of data points were found to be sufficient for adequate parameter estimation, implying that the presented approach enables a substantial increase in sample throughput. The presented general framework for automated generation and evaluation of experimental schedules allows the determination of a maximal sample throughput and the respective optimal measurement schedule, depending on the required statistical reliability of data acquired in future experiments.

  16. Exploratory Causal Analysis in Bivariate Time Series Data

    NASA Astrophysics Data System (ADS)

    McCracken, James M.

    Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools, including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between "voltage drives current" and "current drives voltage" as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data sets, but little research exists on how these tools compare to each other in practice. This work introduces and defines exploratory causal analysis (ECA) to address this issue, along with the concept of data causality in the taxonomy of causal studies introduced in this work. The motivation is to provide a framework for exploring potential causal structures in time series data sets. ECA is used on several synthetic and empirical data sets, and it is found that all of the tested time series causality tools agree with each other (and intuitive notions of causality) for many simple systems but can provide conflicting causal inferences for more complicated systems. It is proposed that such disagreements between different time series causality tools during ECA might provide deeper insight into the data than could be found otherwise.

  17. The Recalibrated Sunspot Number: Impact on Solar Cycle Predictions

    NASA Astrophysics Data System (ADS)

    Clette, F.; Lefevre, L.

    2017-12-01

    Recently, and for the first time since their creation, the sunspot number and group number series were entirely revisited, and a first fully recalibrated version was officially released in July 2015 by the World Data Center SILSO (Brussels). Those reference long-term series are widely used as input data or as a calibration reference by various solar cycle prediction methods. Therefore, past predictions may now need to be redone using the new sunspot series, and methods already used for predicting cycle 24 will require adaptations before attempting predictions of the next cycles. In order to clarify the nature of the applied changes, we describe the different corrections applied to the sunspot and group number series, which affect extended time periods and can reach up to 40%. While some changes simply involve constant scale factors, other corrections vary with time or follow the solar cycle modulation. Depending on the prediction method and on the selected time interval, this can lead to different responses and biases. Moreover, together with the new series, standard error estimates are progressively being added to the new sunspot numbers, which may help derive more accurate uncertainties for predicted activity indices. We conclude with the new round of recalibration now being undertaken in the framework of a broad multi-team collaboration articulated around upcoming ISSI workshops, and we outline the corrections that can still be expected as part of a permanent upgrading and quality-control process. From now on, sunspot-based predictive models should be made more adaptable, and regular updates of predictions should become common practice in order to track periodic upgrades of the sunspot number series, just as is done with other modern solar observational series.

  18. Enhancing Instructional Problem Solving: An Efficient System for Assisting Struggling Learners. Practical Intervention in the Schools Series

    ERIC Educational Resources Information Center

    Begeny, John C.; Schulte, Ann C.; Johnson, Kent

    2012-01-01

    This book presents a schoolwide model of instructional support designed to make the most of available time, resources, and personnel--one that is also fully compatible with other problem-solving models, such as response to intervention. The authors provide a comprehensive and cohesive framework for linking assessment and intervention. They show…

  19. Creating Worlds, Constructing Meaning: The Scottish Storyline Method. Teacher to Teacher Series.

    ERIC Educational Resources Information Center

    Creswell, Jeff

    The approach known as the Storyline Method was developed by a group of educators at Jordanhill College of Education in Glasgow (Scotland). The development of the Storyline Method took place over years, and the approach, with its simple framework of Storyline, key questions, and activities, has stood the test of time. Storyline uses the power of…

  20. Fully Bayesian Estimation of Data from Single Case Designs

    ERIC Educational Resources Information Center

    Rindskopf, David

    2013-01-01

    Single case designs (SCDs) generally consist of a small number of short time series in two or more phases. The analysis of SCDs statistically fits in the framework of a multilevel model, or hierarchical model. The usual analysis does not take into account the uncertainty in the estimation of the random effects. This not only has an effect on the…

  1. Time-series modeling and prediction of global monthly absolute temperature for environmental decision making

    NASA Astrophysics Data System (ADS)

    Ye, Liming; Yang, Guixia; Van Ranst, Eric; Tang, Huajun

    2013-03-01

    A generalized, structural, time series modeling framework was developed to analyze the monthly records of absolute surface temperature, one of the most important environmental parameters, using a deterministic-stochastic combined (DSC) approach. Although the development of the framework was based on the characterization of the variation patterns of a global dataset, the methodology can be applied to any monthly absolute temperature record. Deterministic processes were used to characterize the variation patterns of the global trend and the cyclic oscillations of the temperature signal, using polynomial functions and the Fourier method, respectively, while stochastic processes were employed to account for any remaining patterns in the temperature signal, using seasonal autoregressive integrated moving average (SARIMA) models. A prediction of the monthly global surface temperature during the second decade of the 21st century using the DSC model shows that the global temperature will likely continue to rise at twice the average rate of the past 150 years. The evaluation of prediction accuracy shows that DSC models perform consistently well against selected models of other authors, suggesting that DSC models, when coupled with other eco-environmental models, can be used as a supplemental tool for short-term (~10-year) environmental planning and decision making.
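
    The DSC decomposition described above lends itself to a compact sketch: a polynomial trend plus Fourier terms capture the deterministic part, and a SARIMA model is fit to the stochastic residual. The orders, synthetic data and statsmodels usage below are illustrative, not the authors' implementation.

```python
# Sketch of a deterministic-stochastic combined (DSC) decomposition:
# polynomial trend + annual Fourier pair by least squares, SARIMA residual.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(3)
n = 360                                              # 30 years of monthly data
t = np.arange(n)
temp = 14 + 0.002 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, n)

# Deterministic part: quadratic trend + annual Fourier pair
D = np.column_stack([np.ones(n), t, t**2,
                     np.sin(2 * np.pi * t / 12), np.cos(2 * np.pi * t / 12)])
beta, *_ = np.linalg.lstsq(D, temp, rcond=None)
residual = temp - D @ beta

# Stochastic part: seasonal ARIMA on the residual
sarima = SARIMAX(residual, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12)).fit(disp=False)

# Combined DSC forecast for the next 24 months
t_new = np.arange(n, n + 24)
D_new = np.column_stack([np.ones(24), t_new, t_new**2,
                         np.sin(2 * np.pi * t_new / 12), np.cos(2 * np.pi * t_new / 12)])
forecast = D_new @ beta + sarima.forecast(steps=24)
print(forecast[:6])
```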

  2. Cationic Covalent Organic Frameworks: A Simple Platform of Anionic Exchange for Porosity Tuning and Proton Conduction.

    PubMed

    Ma, Heping; Liu, Bailing; Li, Bin; Zhang, Liming; Li, Yang-Guang; Tan, Hua-Qiao; Zang, Hong-Ying; Zhu, Guangshan

    2016-05-11

    Mimicking proton conduction mechanism of Nafion to construct novel proton-conducting materials with low cost and high proton conductivity is of wide interest. Herein, we have designed and synthesized a cationic covalent organic framework with high thermal and chemical stability by combining a cationic monomer, ethidium bromide (EB) (3,8-diamino-5-ethyl-6-phenylphenanthridinium bromide), with 1,3,5-triformylphloroglucinol (TFP) in Schiff base reactions. This is the first time that the stable cationic crystalline frameworks allowed for the fabrication of a series of charged COFs (EB-COF:X, X = F, Cl, Br, I) through ion exchange processes. Exchange of the extra framework ions can finely modulate the COFs' porosity and pore sizes at nanoscale. More importantly, by introducing PW12O40(3-) into this porous cationic framework, we can greatly enhance the proton conductivity of ionic COF-based material. To the best of our knowledge, EB-COF:PW12 shows the best proton conductivity at room temperature among ever reported porous organic materials.

  3. Can an Analysis of the Contrast Between pre-Galilean and Newtonian Theoretical Frameworks Help Students Develop a Scientific Mindset

    NASA Astrophysics Data System (ADS)

    Kalman, Calvin S.; Aulls, Mark W.

    This study examines a course in which students use two writing activities and collaborative group activities to examine the conceptual structure of the calculus-based introductory Physics course. Students are presented with two alternative frameworks; pre-Galilean Physics and Newtonian Physics. The idea of the course design is that students would at first view the frameworks almost in a theatrical sense as a view of a drama involving a conflict of actors;Aristotle, Galileo, Newton and others occurring a long time ago. As participants passing through a series of interventions, the students become aware that the frameworks relate concepts from different parts of the course and learn to evaluate the two alternative frameworks. They develop a scientific mindset changing their outlook on the course material from the viewpoint that it consists of a tool kit of assorted practices, classified according to problem type, to the viewpoint that it comprises a connected structure of concepts.

  4. Benchmarking hydrological model predictive capability for UK River flows and flood peaks.

    NASA Astrophysics Data System (ADS)

    Lane, Rosanna; Coxon, Gemma; Freer, Jim; Wagener, Thorsten

    2017-04-01

    Data and hydrological models are now available for national hydrological analyses. However, hydrological model performance varies between catchments, and lumped, conceptual models are not able to produce adequate simulations everywhere. This study aims to benchmark hydrological model performance for catchments across the United Kingdom within an uncertainty analysis framework. We have applied four hydrological models from the FUSE framework to 1128 catchments across the UK. These models are all lumped models run at a daily timestep, but they differ in structural architecture and process parameterisations, therefore producing different but equally plausible simulations. We apply FUSE over a 20-year period (1988-2008) within a GLUE Monte Carlo uncertainty analysis framework. Model performance was evaluated for each catchment, model structure and parameter set using standard performance metrics, calculated both for the whole time series and to assess seasonal differences in model performance. The GLUE uncertainty analysis framework was then applied to produce simulated 5th and 95th percentile uncertainty bounds for the daily flow time series, and additionally annual maximum prediction bounds for each catchment. The results show that model performance varies significantly in space and time depending on catchment characteristics including climate, geology and human impact. We identify regions where models systematically fail to produce good results, and present reasons why this could be the case. We also identify regions or catchment characteristics where one model performs better than others, and we have explored which structural components or parameterisations enable certain models to produce better simulations in these catchments. Model predictive capability was assessed for each catchment by examining the ability of the models to produce discharge prediction bounds which successfully bound the observed discharge. These results improve our understanding of the predictive capability of simple conceptual hydrological models across the UK and help us to identify where further effort is needed to develop modelling approaches that better represent different catchment and climate typologies.
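
    A GLUE analysis of the kind described above can be sketched end to end: Monte Carlo parameter sampling, a behavioural threshold on a performance metric, and percentile prediction bounds. The toy one-parameter runoff model, threshold and data below are illustrative; FUSE itself is not reproduced, and formal GLUE uses likelihood-weighted rather than plain percentiles.

```python
# Minimal GLUE-style sketch: sample parameters, keep "behavioural" runs by
# Nash-Sutcliffe efficiency, and form 5th/95th percentile flow bounds.
import numpy as np

rng = np.random.default_rng(4)
n_days, true_k = 365, 0.15
rain = rng.exponential(2.0, n_days)

def linear_reservoir(k: float) -> np.ndarray:
    """Toy daily runoff model: storage drains at rate k."""
    s, q = 0.0, np.empty(n_days)
    for i in range(n_days):
        s += rain[i]
        q[i] = k * s
        s -= q[i]
    return q

obs = linear_reservoir(true_k) + rng.normal(0, 0.2, n_days)

samples = rng.uniform(0.01, 0.5, 2000)               # Monte Carlo parameter draws
sims = np.array([linear_reservoir(k) for k in samples])
nse = 1 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)

behavioural = sims[nse > 0.7]                        # behavioural threshold
lower = np.percentile(behavioural, 5, axis=0)        # 5th/95th percentile bounds
upper = np.percentile(behavioural, 95, axis=0)
coverage = np.mean((obs >= lower) & (obs <= upper))
print(f"{len(behavioural)} behavioural runs, obs coverage {coverage:.2f}")
```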

  5. A new framework for quantifying uncertainties in modelling studies for future climates - how more certain are CMIP5 precipitation and temperature simulations compared to CMIP3?

    NASA Astrophysics Data System (ADS)

    Sharma, A.; Woldemeskel, F. M.; Sivakumar, B.; Mehrotra, R.

    2014-12-01

    We outline a new framework for assessing uncertainties in model simulations, be they hydro-ecological simulations for known scenarios or climate simulations for assumed scenarios representing the future. This framework is illustrated here using GCM projections of hydrologically relevant variables (precipitation and temperature) for future climates, with the uncertainty segregated into three dominant components: model uncertainty, scenario uncertainty (representing greenhouse gas emission scenarios), and ensemble uncertainty (representing uncertain initial conditions and states). A novel uncertainty metric, the Square Root Error Variance (SREV), is used to quantify the uncertainties involved. The SREV requires: (1) interpolating raw and corrected GCM outputs to a common grid; (2) converting these to percentiles; (3) estimating SREV for model, scenario, initial-condition and total uncertainty at each percentile; and (4) transforming SREV to a time series. The outcome is a spatially varying series of SREVs associated with each model that can be used to assess how uncertain the system is at each simulated point or time. This framework, while illustrated in a climate change context, is applicable to the assessment of uncertainties in any modelling framework. The proposed method is applied to monthly precipitation and temperature from 6 CMIP3 and 13 CMIP5 GCMs across the world. For CMIP3, the B1, A1B and A2 scenarios are considered, whereas for CMIP5, RCP2.6, RCP4.5 and RCP8.5 are used, representing low, medium and high emissions. For both CMIP3 and CMIP5, model structure is the largest source of uncertainty, which reduces significantly after correcting for biases. Scenario uncertainty increases in the future, especially for temperature, due to the divergence of the three emission scenarios analysed. While CMIP5 precipitation simulations exhibit a small reduction in total uncertainty over CMIP3, almost no reduction is observed for temperature projections. Estimating uncertainty in both space and time sheds light on the spatial and temporal patterns of uncertainties in GCM outputs, providing an effective platform for risk-based assessments of any alternate plans or decisions that may be formulated using GCM simulations.
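
    The abstract does not give the SREV estimator itself, so the following is only one plausible reading of the variance partition across the model, scenario and ensemble dimensions, offered as a loosely hedged sketch on synthetic data rather than the paper's definition.

```python
# Hedged sketch of a variance partition in the spirit of SREV: simulations
# indexed by (model, scenario, ensemble, time) have their spread attributed
# along each dimension, with square roots taken of the variance components.
# The exact SREV estimator is defined in the paper; this is an assumption.
import numpy as np

rng = np.random.default_rng(5)
n_model, n_scen, n_ens, n_time = 13, 3, 5, 120
sims = (rng.normal(0, 1.0, (n_model, 1, 1, n_time))              # model differences
        + rng.normal(0, 0.5, (1, n_scen, 1, n_time))             # scenario differences
        + rng.normal(0, 0.2, (n_model, n_scen, n_ens, n_time)))  # ensemble noise

model_srev = np.sqrt(sims.mean(axis=(1, 2)).var(axis=0))      # across models
scenario_srev = np.sqrt(sims.mean(axis=(0, 2)).var(axis=0))   # across scenarios
ensemble_srev = np.sqrt(sims.var(axis=2).mean(axis=(0, 1)))   # within ensembles
total = np.sqrt(model_srev**2 + scenario_srev**2 + ensemble_srev**2)
print(total[:5])   # a time series of total uncertainty, per step (4) above
```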

  6. Software Design Challenges in Time Series Prediction Systems Using Parallel Implementation of Artificial Neural Networks.

    PubMed

    Manikandan, Narayanan; Subha, Srinivasan

    2016-01-01

    The software development life cycle has been characterized by destructive disconnects between activities like planning, analysis, design, and programming. Software built around prediction results is a particular challenge for designers. Time series forecasting of data like currency exchange rates, stock prices, and weather has been an area of extensive research for the last three decades. Early on, financial analysis and prediction problems were addressed with statistical models and methods. Over the last two decades, a large number of Artificial Neural Network based learning models have been proposed to solve financial data problems and obtain accurate predictions of future trends and prices. This paper addresses architectural design issues for performance improvement by combining the strengths of multivariate econometric time series models and Artificial Neural Networks. It provides an adaptive, hybrid methodology for predicting exchange rates. The framework is tested for the accuracy and performance of the parallel algorithms used.

  7. Software Design Challenges in Time Series Prediction Systems Using Parallel Implementation of Artificial Neural Networks

    PubMed Central

    Manikandan, Narayanan; Subha, Srinivasan

    2016-01-01

    The software development life cycle has been characterized by destructive disconnects between activities like planning, analysis, design, and programming. Software built around prediction results is a particular challenge for designers. Time series forecasting of data like currency exchange rates, stock prices, and weather has been an area of extensive research for the last three decades. Early on, financial analysis and prediction problems were addressed with statistical models and methods. Over the last two decades, a large number of Artificial Neural Network based learning models have been proposed to solve financial data problems and obtain accurate predictions of future trends and prices. This paper addresses architectural design issues for performance improvement by combining the strengths of multivariate econometric time series models and Artificial Neural Networks. It provides an adaptive, hybrid methodology for predicting exchange rates. The framework is tested for the accuracy and performance of the parallel algorithms used. PMID:26881271

  8. MARTe: A Multiplatform Real-Time Framework

    NASA Astrophysics Data System (ADS)

    Neto, André C.; Sartori, Filippo; Piccolo, Fabio; Vitelli, Riccardo; De Tommasi, Gianmaria; Zabeo, Luca; Barbalace, Antonio; Fernandes, Horacio; Valcarcel, Daniel F.; Batista, Antonio J. N.

    2010-04-01

    Development of real-time applications is usually associated with nonportable code targeted at specific real-time operating systems. The boundary between hardware drivers, system services, and user code is commonly not well defined, making development on the target host significantly difficult. The Multithreaded Application Real-Time executor (MARTe) is a framework built over a multiplatform library that allows the execution of the same code in different operating systems. The framework provides the high-level interfaces with hardware, external configuration programs, and user interfaces, while assuring hard real-time performance. End-users of the framework are required to define and implement algorithms inside a well-defined block of software, named Generic Application Module (GAM), that is executed by the real-time scheduler. Each GAM is reconfigurable with a set of predefined configuration meta-parameters and interchanges information through a set of data pipes that serve as its inputs and outputs. Using these connections, different GAMs can be chained either in series or in parallel. GAMs can be developed and debugged in a non-real-time system and, only once the robustness of the code and correctness of the algorithm are verified, deployed to the real-time system. The software also supplies a large set of utilities that greatly ease the interaction with and debugging of a running system. Among the most useful are a highly efficient real-time logger, HTTP introspection of real-time objects, and HTTP remote configuration. MARTe is currently being used to successfully drive the plasma vertical stabilization controller on the largest magnetic confinement fusion device in the world, with a control loop cycle of 50 µs and a jitter under 1 µs. In this particular project, MARTe is used with the Real-Time Application Interface (RTAI)/Linux operating system, exploiting new x86 multicore processor technology.

  9. Shape Distributions of Nonlinear Dynamical Systems for Video-Based Inference.

    PubMed

    Venkataraman, Vinay; Turaga, Pavan

    2016-12-01

    This paper presents a shape-theoretic framework for dynamical analysis of the nonlinear dynamical systems which appear frequently in video-based inference tasks. Traditional approaches to dynamical modeling have included linear and nonlinear methods, each with its respective drawbacks. The novel approach we propose is the use of descriptors of the shape of the dynamical attractor as a feature representation of the nature of the dynamics. The proposed framework has two main advantages over traditional approaches: a) the representation of the dynamical system is derived directly from the observational data, without any inherent assumptions, and b) the proposed features show stability under different time-series lengths, where traditional dynamical invariants fail. We illustrate our idea using nonlinear dynamical models such as the Lorenz and Rossler systems, where our feature representations (shape distributions) support our hypothesis that the local shape of the reconstructed phase space can be used as a discriminative feature. Our experimental analyses on these models also indicate that the proposed framework shows stability for different time-series lengths, which is useful when the available number of samples is small or variable. The specific applications of interest in this paper are: 1) activity recognition using motion capture and RGBD sensors, 2) activity quality assessment for applications in stroke rehabilitation, and 3) dynamical scene classification. We provide experimental validation through action and gesture recognition experiments on motion capture and Kinect datasets. In all these scenarios, we show experimental evidence of the favorable properties of the proposed representation.
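
    The shape-distribution idea above can be sketched in a few lines: reconstruct the attractor by delay embedding, sample random pairwise distances on it, and histogram them into a fixed-length descriptor (a D2-style shape distribution). All parameters below are illustrative, not the paper's settings.

```python
# Sketch of a shape-distribution feature from a reconstructed phase space.
import numpy as np

rng = np.random.default_rng(6)

def embed(x: np.ndarray, dim: int = 3, tau: int = 10) -> np.ndarray:
    """Uniform delay embedding of a scalar series into R^dim."""
    m = (dim - 1) * tau
    return np.column_stack([x[m - i * tau : len(x) - i * tau] for i in range(dim)])

def shape_distribution(x: np.ndarray, n_pairs: int = 5000, bins: int = 32) -> np.ndarray:
    """Histogram of random pairwise distances on the embedded attractor."""
    pts = embed(x)
    i = rng.integers(0, len(pts), n_pairs)
    j = rng.integers(0, len(pts), n_pairs)
    d = np.linalg.norm(pts[i] - pts[j], axis=1)
    hist, _ = np.histogram(d, bins=bins, density=True)
    return hist                                 # fixed-length feature vector

t = np.linspace(0, 100, 5000)
periodic = np.sin(t)
noisy = np.sin(t) + 0.5 * rng.normal(size=t.size)
print(np.round(shape_distribution(periodic)[:8], 3))
print(np.round(shape_distribution(noisy)[:8], 3))   # visibly different histogram
```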

  10. Inferring the nature of anthropogenic threats from long-term abundance records.

    PubMed

    Shoemaker, Kevin T; Akçakaya, H Resit

    2015-02-01

    Diagnosing the processes that threaten species persistence is critical for recovery planning and risk forecasting. Dominant threats are typically inferred by experts on the basis of a patchwork of informal methods. Transparent, quantitative diagnostic tools would contribute much-needed consistency, objectivity, and rigor to the process of diagnosing anthropogenic threats. Long-term census records, available for an increasingly large and diverse set of taxa, may exhibit characteristic signatures of specific threatening processes and thereby provide information for threat diagnosis. We developed a flexible Bayesian framework for diagnosing threats on the basis of long-term census records and diverse ancillary sources of information. We tested this framework with simulated data from artificial populations subjected to varying degrees of exploitation and habitat loss and several real-world abundance time series for which threatening processes are relatively well understood: bluefin tuna (Thunnus maccoyii) and Atlantic cod (Gadus morhua) (exploitation) and Red Grouse (Lagopus lagopus scotica) and Eurasian Skylark (Alauda arvensis) (habitat loss). Our method correctly identified the process driving population decline for over 90% of time series simulated under moderate to severe threat scenarios. Successful identification of threats approached 100% for severe exploitation and habitat loss scenarios. Our method identified threats less successfully when threatening processes were weak and when populations were simultaneously affected by multiple threats. Our method selected the presumed true threat model for all real-world case studies, although results were somewhat ambiguous in the case of the Eurasian Skylark. In the latter case, incorporation of an ancillary source of information (records of land-use change) increased the weight assigned to the presumed true model from 70% to 92%, illustrating the value of the proposed framework in bringing diverse sources of information into a common rigorous framework. Ultimately, our framework may greatly assist conservation organizations in documenting threatening processes and planning species recovery. © 2014 Society for Conservation Biology.

  11. New anion-templated 3D heterobimetallic open frameworks based on lanthanide-carboxylate layers and copper pillars

    NASA Astrophysics Data System (ADS)

    Li, Yun-Wu; Wang, Yong-Hui; Li, Yang-Guang; Wang, En-Bo

    2008-06-01

    A series of new three-dimensional (3D) lanthanide-transition metal (4f-3d) heterobimetallic open frameworks, [Ln2(1,2-bdc)2(H2O)2Cu(inic)2](ClO4) (Ln = Eu (1), Tb (2), Nd (3) and Sm (4); 1,2-bdc = 1,2-benzenedicarboxylate; Hinic = isonicotinic acid), have been hydrothermally synthesized and characterized by elemental analysis, IR, TG and single-crystal X-ray diffraction analysis. Compounds 1-4 are isostructural. They possess a new anion-templated 3D heterobimetallic open framework, which is observed for the first time in the {Ln/TM/bdc/inic} (TM = transition metal) system. Compounds 1 and 2 exhibit the characteristic fluorescent properties of Eu(III) and Tb(III), respectively.

  12. Present-day deformation in Europe, as seen by the EPOS-GNSS prototype solution in double difference, and first co- and post-seismic displacements associated with 2016 Amatrice and Norcia earthquakes (Italy)

    NASA Astrophysics Data System (ADS)

    Socquet, Anne; Déprez, Aline; Cotte, Nathalie; Maubant, Louise; Walpersdorf, Andrea; Bato, Mary Grace

    2017-04-01

    We present here a new pan-European velocity field, obtained by processing 500+ cGPS stations in double difference, in the framework of the implementation phase of the European Plate Observing System (EPOS) project. This prototype solution spans the 2000-2016 period and includes data from the RING, NOA, RENAG and European Permanent Network (EPN) cGPS networks. The data set is first split into daily sub-networks (between 8 and 14 sub-networks). The sub-networks consist of about 40 stations, with 2 overlapping stations. For each day and for each sub-network, the GAMIT processing is conducted independently. Once each sub-network achieves satisfactory results, a daily combination is performed in order to produce SINEX files. The chi-square value associated with the combination allows us to evaluate its quality. Eventually, a multi-year combination generates position time series for each station. Each time series is visualized, and the jumps associated with material changes (antenna or receiver) are estimated and corrected. This procedure allows us to generate daily solutions and position time series for all stations. The associated "interseismic" velocity field has then been estimated from the time series using the MIDAS software, and compared to another independent estimate obtained by Kalman filtering with the globk software. In addition to this velocity field, we present a specific zoom on Italy, with a strain rate map as well as time series showing co- and post-seismic movements associated with the 2016 Amatrice and Norcia earthquakes.

  13. Boolean network identification from perturbation time series data combining dynamics abstraction and logic programming.

    PubMed

    Ostrowski, M; Paulevé, L; Schaub, T; Siegel, A; Guziolowski, C

    2016-11-01

    Boolean networks (and more general logic models) are useful frameworks to study signal transduction across multiple pathways. Logic models can be learned from a prior knowledge network structure and multiplex phosphoproteomics data. However, most efficient and scalable training methods focus on the comparison of two time-points and assume that the system has reached an early steady state. In this paper, we generalize such a learning procedure to take into account the time series traces of phosphoproteomics data in order to discriminate Boolean networks according to their transient dynamics. To that end, we identify a necessary condition that must be satisfied by the dynamics of a Boolean network to be consistent with a discretized time series trace. Based on this condition, we use Answer Set Programming to compute an over-approximation of the set of Boolean networks which fit best with experimental data, and we provide the corresponding encodings. Combined with model-checking approaches, we end up with a global learning algorithm. Our approach is able to learn logic models with a true positive rate higher than 78% in two case studies of mammalian signaling networks; for a larger case study, our method provides optimal answers after 7 min of computation. We quantified the gain in prediction precision of our method compared to learning approaches based on static data. Finally, as an application, our method proposes erroneous time-points in the time series data with respect to the optimal learned logic models. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
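
    The consistency condition above can be illustrated with a brute-force check: a candidate Boolean network is kept only if its dynamics can reproduce the discretized trace. The sketch below assumes synchronous updates for simplicity (the paper works with an over-approximation of transient dynamics encoded in Answer Set Programming), and the network and trace are hypothetical.

```python
# Brute-force illustration of trace consistency for a candidate Boolean
# network over nodes (A, B, C), with synchronous updates as a simplification.
from itertools import product

def update(state):
    """Candidate network: next synchronous state from the current one."""
    a, b, c = state
    return (a or c,        # A' = A OR C
            a and not c,   # B' = A AND NOT C
            b)             # C' = B

def consistent(network, trace):
    """Does the trace appear as consecutive synchronous states of the network?"""
    return all(network(trace[i]) == trace[i + 1] for i in range(len(trace) - 1))

# Discretized phosphoproteomics trace (hypothetical measurements)
trace = [(1, 0, 0), (1, 1, 0), (1, 1, 1), (1, 0, 1)]
print("candidate consistent with trace:", consistent(update, trace))

# Enumerating all states shows which transitions the candidate model allows:
for s in product((0, 1), repeat=3):
    print(s, "->", update(s))
```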

  14. A Web-Based Framework For a Time-Domain Warehouse

    NASA Astrophysics Data System (ADS)

    Brewer, J. M.; Bloom, J. S.; Kennedy, R.; Starr, D. L.

    2009-09-01

    The Berkeley Transients Classification Pipeline (TCP) uses a machine-learning classifier to automatically categorize transients from large data torrents and provide automated notification of astronomical events of scientific interest. As part of the training process, we created a large warehouse of light-curve sources with well-labelled classes that serve as priors to the classification engine. This web-based interactive framework, which we are now making public via DotAstro.org (http://dotastro.org/), allows us to ingest time-variable source data in a wide variety of formats and store it in a common internal data model. Data is passed between pipeline modules in a prototype XML representation of time-series format (VOTimeseries), which can also be emitted to collaborators through dotastro.org. After import, the sources can be visualized using Google Sky, light curves can be inspected interactively, and classifications can be manually adjusted.

  15. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action

    PubMed Central

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

    We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels—from individual dynamics, to dyadic dynamics, up to global group-level of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures. PMID:27920748

  16. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action.

    PubMed

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

    We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels-from individual dynamics, to dyadic dynamics, up to global group-level of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures.

  17. Directed dynamical influence is more detectable with noise

    PubMed Central

    Jiang, Jun-Jie; Huang, Zi-Gang; Huang, Liang; Liu, Huan; Lai, Ying-Cheng

    2016-01-01

    Successful identification of directed dynamical influence in complex systems is relevant to significant problems of current interest. Traditional methods based on Granger causality and transfer entropy have issues such as difficulty with nonlinearity and large data requirement. Recently a framework based on nonlinear dynamical analysis was proposed to overcome these difficulties. We find, surprisingly, that noise can counterintuitively enhance the detectability of directed dynamical influence. In fact, intentionally injecting a proper amount of asymmetric noise into the available time series has the unexpected benefit of dramatically increasing confidence in ascertaining the directed dynamical influence in the underlying system. This result is established based on both real data and model time series from nonlinear ecosystems. We develop a physical understanding of the beneficial role of noise in enhancing detection of directed dynamical influence. PMID:27066763

  18. Directed dynamical influence is more detectable with noise.

    PubMed

    Jiang, Jun-Jie; Huang, Zi-Gang; Huang, Liang; Liu, Huan; Lai, Ying-Cheng

    2016-04-12

    Successful identification of directed dynamical influence in complex systems is relevant to significant problems of current interest. Traditional methods based on Granger causality and transfer entropy have issues such as difficulty with nonlinearity and large data requirement. Recently a framework based on nonlinear dynamical analysis was proposed to overcome these difficulties. We find, surprisingly, that noise can counterintuitively enhance the detectability of directed dynamical influence. In fact, intentionally injecting a proper amount of asymmetric noise into the available time series has the unexpected benefit of dramatically increasing confidence in ascertaining the directed dynamical influence in the underlying system. This result is established based on both real data and model time series from nonlinear ecosystems. We develop a physical understanding of the beneficial role of noise in enhancing detection of directed dynamical influence.

  19. Spatio-temporal Event Classification using Time-series Kernel based Structured Sparsity

    PubMed Central

    Jeni, László A.; Lőrincz, András; Szabó, Zoltán; Cohn, Jeffrey F.; Kanade, Takeo

    2016-01-01

    In many behavioral domains, such as facial expression and gesture, sparse structure is prevalent. This sparsity would be well suited for event detection but for one problem. Features typically are confounded by alignment error in space and time. As a consequence, high-dimensional representations such as SIFT and Gabor features have been favored despite their much greater computational cost and potential loss of information. We propose a Kernel Structured Sparsity (KSS) method that can handle both the temporal alignment problem and the structured sparse reconstruction within a common framework, and it can rely on simple features. We characterize spatio-temporal events as time series of motion patterns and, by utilizing time-series kernels, we apply standard structured-sparse coding techniques to tackle this important problem. We evaluated the KSS method using both gesture and facial expression datasets that include spontaneous behavior and differ in degree of difficulty and type of ground truth coding. KSS outperformed both sparse and non-sparse methods that utilize complex image features and their temporal extensions. In the case of early facial event classification, KSS had 10% higher accuracy, as measured by F1 score, than kernel SVM methods. PMID:27830214

  20. 'TIME': A Web Application for Obtaining Insights into Microbial Ecology Using Longitudinal Microbiome Data.

    PubMed

    Baksi, Krishanu D; Kuntal, Bhusan K; Mande, Sharmila S

    2018-01-01

    Realization of the importance of microbiome studies, coupled with the decreasing sequencing cost, has led to the exponential growth of microbiome data. A number of these microbiome studies have focused on understanding changes in the microbial community over time. Such longitudinal microbiome studies have the potential to offer unique insights pertaining to microbial social networks as well as their responses to perturbations. In this communication, we introduce a web-based framework called 'TIME' ('Temporal Insights into Microbial Ecology'), developed specifically to obtain meaningful insights from microbiome time-series data. The TIME web server is designed to accept a wide range of popular formats as input, with options to preprocess and filter the data. Multiple samples, defined by a series of longitudinal time points along with their metadata information, can be compared in order to interactively visualize the temporal variations. In addition to standard microbiome data analytics, the web server implements popular time-series analysis methods such as dynamic time warping, Granger causality, and the Dickey-Fuller test to generate interactive layouts that facilitate easy biological inferences. Apart from this, a new metric for comparing metagenomic time-series data has been introduced to effectively visualize the similarities and differences in the trends of the resident microbial groups. The visualizations can be augmented with stationarity information pertaining to the microbial groups, which is used to predict microbial competition as well as community structure. Additionally, the 'causality graph analysis' module incorporated in TIME allows predicting taxa that might have a higher influence on community structure in different conditions. TIME also allows users to easily identify potential taxonomic markers from a longitudinal microbiome analysis. We illustrate the utility of the web-server features on a few published time-series microbiome datasets and demonstrate the ease with which it can be used to perform complex analyses.
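
    One of the named building blocks, the Dickey-Fuller stationarity test, can be reproduced offline in a few lines, assuming the statsmodels package; the synthetic abundance series stands in for a real taxon's longitudinal profile.

```python
# Augmented Dickey-Fuller test on a (synthetic) taxon-abundance series,
# assuming statsmodels is installed; a random walk should test non-stationary.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(1)
abundance = np.cumsum(rng.normal(size=200))   # synthetic longitudinal series
stat, pvalue, *_ = adfuller(abundance)
print(f"ADF statistic = {stat:.2f}, p = {pvalue:.3f} (p > 0.05: non-stationary)")
```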

  1. An Analytical Time-Domain Expression for the Net Ripple Produced by Parallel Interleaved Converters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Brian B.; Krein, Philip T.

    We apply modular arithmetic and Fourier series to analyze the superposition of N interleaved triangular waveforms with identical amplitudes and duty ratios. Here, interleaving refers to the condition when a collection of periodic waveforms with identical periods are each uniformly phase-shifted across one period. The main result is a time-domain expression which provides an exact representation of the summed and interleaved triangular waveforms, where the peak amplitude and the parameters of the time-periodic component are all specified in closed form. The analysis is general and can be used to study various applications in multi-converter systems. This model is unique not only in that it reveals a simple and intuitive expression for the net ripple, but also in that its derivation via modular arithmetic and Fourier series is distinct from prior approaches. The analytical framework is experimentally validated with a system of three parallel converters under time-varying operating conditions.
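
    The setup is easy to verify numerically: sum N identical triangular carriers uniformly phase-shifted across one period and inspect the net peak-to-peak ripple. The sketch below does exactly that with scipy; it checks the construction numerically rather than reproducing the paper's closed-form expression.

```python
# Numerical superposition of N interleaved triangular waveforms; the paper's
# closed-form ripple expression is not reproduced, only the setup.
import numpy as np
from scipy.signal import sawtooth

N, T = 3, 1.0                       # three converters, unit switching period
t = np.linspace(0, 2 * T, 4000)
carriers = [sawtooth(2 * np.pi * (t / T + k / N), width=0.5) for k in range(N)]
net = np.sum(carriers, axis=0)      # summed, interleaved waveform
print("net peak-to-peak ripple:", net.max() - net.min())
```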

  2. A theoretically consistent stochastic cascade for temporal disaggregation of intermittent rainfall

    NASA Astrophysics Data System (ADS)

    Lombardo, F.; Volpi, E.; Koutsoyiannis, D.; Serinaldi, F.

    2017-06-01

    Generating fine-scale time series of intermittent rainfall that are fully consistent with any given coarse-scale totals is a key and open issue in many hydrological problems. We propose a stationary disaggregation method that simulates rainfall time series with given dependence structure, wet/dry probability, and marginal distribution at a target finer (lower-level) time scale, preserving full consistency with variables at a parent coarser (higher-level) time scale. We account for the intermittent character of rainfall at fine time scales by merging a discrete stochastic representation of intermittency and a continuous one of rainfall depths. This approach yields a unique and parsimonious mathematical framework providing general analytical formulations of the mean, variance, and autocorrelation function (ACF) of a mixed-type stochastic process in terms of the means, variances, and ACFs of the continuous and discrete components. To achieve the full consistency between variables at finer and coarser time scales in terms of marginal distribution and coarse-scale totals, the generated lower-level series are adjusted according to a procedure that does not affect the stochastic structure implied by the original model. To assess model performance, we study the rainfall process as intermittent with both independent and dependent occurrences, where dependence is quantified by the probability that two consecutive time intervals are dry. In either case, we provide analytical formulations of the main statistics of our mixed-type disaggregation model and show their clear accordance with Monte Carlo simulations. An application to real-world rainfall time series is shown as a proof of concept.
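
    The mixed-type construction, a discrete occurrence process multiplied by continuous depths, can be illustrated with a toy generator. The sketch below uses a two-state wet/dry Markov chain and exponential depths; the transition probabilities and depth distribution are hypothetical stand-ins for the paper's calibrated components, and no coarse-scale adjustment step is included.

```python
# Toy mixed-type rainfall series: Markov-chain intermittency times continuous
# depths. All parameter values are hypothetical illustrations.
import numpy as np

def simulate_rain(n, p_dry_to_wet=0.2, p_wet_to_wet=0.6, seed=0):
    rng = np.random.default_rng(seed)
    wet, depths = False, np.zeros(n)
    for i in range(n):
        wet = rng.random() < (p_wet_to_wet if wet else p_dry_to_wet)
        if wet:
            depths[i] = rng.exponential(scale=2.0)  # rainfall depth (e.g., mm)
    return depths

series = simulate_rain(1000)
print("wet fraction:", (series > 0).mean(),
      " mean wet depth:", series[series > 0].mean())
```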

  3. A public choice framework for controlling transmissible and evolving diseases

    PubMed Central

    Althouse, Benjamin M.; Bergstrom, Theodore C.; Bergstrom, Carl T.

    2009-01-01

    Control measures used to limit the spread of infectious disease often generate externalities. Vaccination for transmissible diseases can reduce the incidence of disease even among the unvaccinated, whereas antimicrobial chemotherapy can lead to the evolution of antimicrobial resistance and thereby limit its own effectiveness over time. We integrate the economic theory of public choice with mathematical models of infectious disease to provide a quantitative framework for making allocation decisions in the presence of these externalities. To illustrate, we present a series of examples: vaccination for tetanus, vaccination for measles, antibiotic treatment of otitis media, and antiviral treatment of pandemic influenza. PMID:20018681

  4. A general theory on frequency and time-frequency analysis of irregularly sampled time series based on projection methods - Part 2: Extension to time-frequency analysis

    NASA Astrophysics Data System (ADS)

    Lenoir, Guillaume; Crucifix, Michel

    2018-03-01

    Geophysical time series are sometimes sampled irregularly along the time axis. The situation is particularly frequent in palaeoclimatology. Yet, there is so far no general framework for handling the continuous wavelet transform when the time sampling is irregular. Here we provide such a framework. To this end, we define the scalogram as the continuous-wavelet-transform equivalent of the extended Lomb-Scargle periodogram defined in Part 1 of this study (Lenoir and Crucifix, 2018). The signal being analysed is modelled as the sum of a locally periodic component in the time-frequency plane, a polynomial trend, and a background noise. The mother wavelet adopted here is the Morlet wavelet classically used in geophysical applications. The background noise model is a stationary Gaussian continuous autoregressive-moving-average (CARMA) process, which is more general than the traditional Gaussian white and red noise processes. The scalogram is smoothed by averaging over neighbouring times in order to reduce its variance. The Shannon-Nyquist exclusion zone, however, is defined as the area corrupted by local aliasing issues. The local amplitude in the time-frequency plane is then estimated with least-squares methods. We also derive an approximate formula linking the squared amplitude and the scalogram. Based on this property, we define a new analysis tool: the weighted smoothed scalogram, which we recommend for most analyses. The estimated signal amplitude also gives access to band and ridge filtering. Finally, we design a test of significance for the weighted smoothed scalogram against the stationary Gaussian CARMA background noise, and provide algorithms for computing confidence levels, either analytically or with Markov chain Monte Carlo methods. All the analysis tools presented in this article are available to the reader in the Python package WAVEPAL.
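
    The Part 1 building block, a Lomb-Scargle periodogram of an irregularly sampled series, can be sketched with scipy; the wavelet/scalogram machinery of Part 2 and the WAVEPAL package itself are not reproduced here, and the sampling times and noise level below are illustrative.

```python
# Lomb-Scargle periodogram of an irregularly sampled series via scipy.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 100, 300))              # irregular sampling times
y = np.sin(2 * np.pi * t / 10) + 0.3 * rng.standard_normal(t.size)
omega = 2 * np.pi * np.linspace(0.01, 0.5, 500)    # angular test frequencies
power = lombscargle(t, y - y.mean(), omega)
print("dominant period:", 2 * np.pi / omega[power.argmax()])  # close to 10
```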

  5. A generic multi-hazard and multi-risk framework and its application illustrated in a virtual city

    NASA Astrophysics Data System (ADS)

    Mignan, Arnaud; Euchner, Fabian; Wiemer, Stefan

    2013-04-01

    We present a generic framework to implement hazard correlations in multi-risk assessment strategies. We consider hazard interactions (process I), time-dependent vulnerability (process II) and time-dependent exposure (process III). Our approach is based on the Monte Carlo method to simulate a complex system, which is defined from assets exposed to a hazardous region. We generate 1-year time series, sampling from a stochastic set of events. Each time series corresponds to one risk scenario, and the analysis of multiple time series allows for the probabilistic assessment of losses and for the recognition of more or less probable risk paths. Each sampled event is associated to a time of occurrence, a damage footprint and a loss footprint. The occurrence of an event depends on its rate, which is conditional on the occurrence of past events (process I, concept of correlation matrix). Damage depends on the hazard intensity and on the vulnerability of the asset, which is conditional on previous damage on that asset (process II). Losses are the product of damage and exposure value, this value being the original exposure minus previous losses (process III, no reconstruction considered). The Monte Carlo method allows for a straightforward implementation of uncertainties and for implementation of numerous interactions, which is otherwise challenging in an analytical multi-risk approach. We apply our framework to a synthetic data set, defined by a virtual city within a virtual region. This approach gives the opportunity to perform multi-risk analyses in a controlled environment while not requiring real data, which may be difficult to access or simply unavailable to the public. Based on this heuristic approach, we define a 100 by 100 km region where earthquakes, volcanic eruptions, fluvial floods, hurricanes and coastal floods can occur. All hazards are harmonized to a common format. We define a 20 by 20 km city, composed of 50,000 identical buildings with a fixed economic value. Vulnerability curves are defined in terms of mean damage ratio as a function of hazard intensity. All data are based on simple equations found in the literature and on other simplifications. We show the impact of earthquake-earthquake interaction and hurricane-storm surge coupling, as well as of time-dependent vulnerability and exposure, on aggregated loss curves. One main result is the emergence of low-probability, high-consequence (extreme) events when correlations are implemented. While the concept of a virtual city can suggest the theoretical benefits of multi-risk assessment for decision support, demonstrating its real-world practicality will require the study of real test sites.
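
    A heavily stripped-down version of the Monte Carlo scheme conveys the mechanics: sample one-year event sequences, let an event raise the occurrence rates of subsequent hazards (process I) and deplete the exposed value with each loss (process III). Every number below is hypothetical, and process II (time-dependent vulnerability) is omitted for brevity.

```python
# Crude Monte Carlo multi-risk sketch with hazard interaction (process I)
# and exposure depletion (process III); all parameters are hypothetical.
import numpy as np

def one_year_loss(rng, base_rates=(0.10, 0.05), interaction=3.0,
                  mean_damage_ratio=0.2, exposure=1e9):
    rates = np.array(base_rates, dtype=float)
    loss = 0.0
    for hazard in range(len(rates)):
        for _ in range(rng.poisson(rates[hazard])):
            loss += mean_damage_ratio * exposure
            exposure -= mean_damage_ratio * exposure  # process III: no reconstruction
            rates *= interaction                      # process I: triggering raises rates
    return loss

rng = np.random.default_rng(3)
losses = np.array([one_year_loss(rng) for _ in range(10_000)])
print("expected annual loss:", losses.mean())
print("99th-percentile loss:", np.quantile(losses, 0.99))
```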

  6. A framework for scalable parameter estimation of gene circuit models using structural information.

    PubMed

    Kuwahara, Hiroyuki; Fan, Ming; Wang, Suojin; Gao, Xin

    2013-07-01

    Systematic and scalable parameter estimation is a key to constructing complex gene regulatory models and to ultimately facilitating an integrative systems biology approach to quantitatively understand the molecular mechanisms underpinning gene regulation. Here, we report a novel framework for efficient and scalable parameter estimation that focuses specifically on modeling of gene circuits. Exploiting the structure commonly found in gene circuit models, this framework decomposes a system of coupled rate equations into individual ones and efficiently integrates them separately to reconstruct the mean time evolution of the gene products. The accuracy of the parameter estimates is refined by iteratively increasing the accuracy of numerical integration using the model structure. As a case study, we applied our framework to four gene circuit models with complex dynamics based on three synthetic datasets and one time series microarray data set. We compared our framework to three state-of-the-art parameter estimation methods and found that our approach consistently generated higher-quality parameter solutions efficiently. Although many general-purpose parameter estimation methods have been applied to the modeling of gene circuits, our results suggest that more tailored approaches that exploit domain-specific information may be key to the reverse engineering of complex biological systems. http://sfb.kaust.edu.sa/Pages/Software.aspx. Supplementary data are available at Bioinformatics online.

  7. Biodiversity-ecosystem functioning relationships in long-term time series and palaeoecological records: deep sea as a test bed.

    PubMed

    Yasuhara, Moriaki; Doi, Hideyuki; Wei, Chih-Lin; Danovaro, Roberto; Myhre, Sarah E

    2016-05-19

    The link between biodiversity and ecosystem functioning (BEF) over long temporal scales is poorly understood. Here, we investigate biological monitoring and palaeoecological records on decadal, centennial and millennial time scales from a BEF framework by using deep sea, soft-sediment environments as a test bed. Results generally show positive BEF relationships, in agreement with BEF studies based on present-day spatial analyses and short-term manipulative experiments. However, the deep-sea BEF relationship is much noisier across longer time scales compared with modern observational studies. We also demonstrate with palaeoecological time-series data that a larger species pool does not enhance ecosystem stability through time, whereas higher abundance as an indicator of higher ecosystem functioning may enhance ecosystem stability. These results suggest that BEF relationships are potentially time-scale-dependent. Environmental impacts on biodiversity and ecosystem functioning may be much stronger than biodiversity impacts on ecosystem functioning at long (decadal to millennial) time scales. Longer time scale perspectives, including palaeoecological and ecosystem monitoring data, are critical for predicting future BEF relationships on a rapidly changing planet. © 2016 The Author(s).

  8. Spatiotemporal Visualization of Time-Series Satellite-Derived CO2 Flux Data Using Volume Rendering and GPU-Based Interpolation on a Cloud-Driven Digital Earth

    NASA Astrophysics Data System (ADS)

    Wu, S.; Yan, Y.; Du, Z.; Zhang, F.; Liu, R.

    2017-10-01

    The ocean carbon cycle has a significant influence on global climate, and is commonly evaluated using time-series satellite-derived CO2 flux data. Location-aware and globe-based visualization is an important technique for analyzing and presenting the evolution of climate change. To achieve realistic simulation of the spatiotemporal dynamics of ocean carbon, a cloud-driven digital earth platform is developed to support the interactive analysis and display of multi-geospatial data, and an original visualization method based on our digital earth is proposed to demonstrate the spatiotemporal variations of carbon sinks and sources using time-series satellite data. Specifically, a volume rendering technique using half-angle slicing and a particle system is implemented to dynamically display the released or absorbed CO2 gas. To enable location-aware visualization within the virtual globe, we present a 3D particle-mapping algorithm to render particle-slicing textures onto geospace. In addition, a GPU-based interpolation framework using CUDA during real-time rendering is designed to obtain smooth effects in both spatial and temporal dimensions. To demonstrate the capabilities of the proposed method, a series of satellite datasets is used to simulate the air-sea carbon cycle in the China Sea. The results show that the suggested strategies provide realistic simulation effects and acceptable interactive performance on the digital earth.

  9. Interpretable Categorization of Heterogeneous Time Series Data

    NASA Technical Reports Server (NTRS)

    Lee, Ritchie; Kochenderfer, Mykel J.; Mengshoel, Ole J.; Silbermann, Joshua

    2017-01-01

    We analyze data from simulated aircraft encounters to validate and inform the development of a prototype aircraft collision avoidance system. The high-dimensional and heterogeneous time series dataset is analyzed to discover properties of near mid-air collisions (NMACs) and categorize the NMAC encounters. Domain experts use these properties to better organize and understand NMAC occurrences. Existing solutions either are not capable of handling high-dimensional and heterogeneous time series datasets or do not provide explanations that are interpretable by a domain expert. The latter is critical to the acceptance and deployment of safety-critical systems. To address this gap, we propose grammar-based decision trees along with a learning algorithm. Our approach extends decision trees with a grammar framework for classifying heterogeneous time series data. A context-free grammar is used to derive decision expressions that are interpretable, application-specific, and support heterogeneous data types. In addition to classification, we show how grammar-based decision trees can also be used for categorization, which is a combination of clustering and generating interpretable explanations for each cluster. We apply grammar-based decision trees to a simulated aircraft encounter dataset and evaluate the performance of four variants of our learning algorithm. The best algorithm is used to analyze and categorize near mid-air collisions in the aircraft encounter dataset. We describe each discovered category in detail and discuss its relevance to aircraft collision avoidance.

  10. ReTrOS: a MATLAB toolbox for reconstructing transcriptional activity from gene and protein expression data.

    PubMed

    Minas, Giorgos; Momiji, Hiroshi; Jenkins, Dafyd J; Costa, Maria J; Rand, David A; Finkenstädt, Bärbel

    2017-06-26

    Given the development of high-throughput experimental techniques, an increasing number of whole-genome transcription profiling time series data sets, with good temporal resolution, are becoming available to researchers. The ReTrOS toolbox (Reconstructing Transcription Open Software) provides MATLAB-based implementations of two related methods, namely ReTrOS-Smooth and ReTrOS-Switch, for reconstructing the temporal transcriptional activity profile of a gene from a given mRNA expression time series or protein reporter time series. The methods are based on fitting a differential equation model incorporating the processes of transcription, translation and degradation. The toolbox provides a framework for model fitting along with statistical analyses of the model, with a graphical interface and model visualisation. We highlight several applications of the toolbox, including the reconstruction of the temporal cascade of transcriptional activity inferred from mRNA expression data and protein reporter data in the core circadian clock in Arabidopsis thaliana, and how such reconstructed transcription profiles can be used to study the effects of different cell lines and conditions. The ReTrOS toolbox allows users to analyse gene and/or protein expression time series where, given appropriately formulated prior information about a minimal set of kinetic parameters, in particular degradation rates, users are able to infer the timing of changes in transcriptional activity. Data from any organism and obtained from a range of technologies can be used as input due to the flexible and generic nature of the model and implementation. The output from this software provides a useful analysis of time-series data and can be incorporated into further modelling approaches or in hypothesis generation.
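
    The differential-equation backbone that the toolbox fits can be written down compactly: a transcriptional activity profile drives mRNA, which drives protein, each subject to first-order degradation. The forward simulation below is a sketch of that model, not the ReTrOS fitting code; the sinusoidal activity profile and all rate constants are hypothetical.

```python
# Forward simulation of the transcription -> mRNA -> protein model that
# ReTrOS-style methods invert; rates and the drive tau(t) are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

def tau(t):
    """Unknown transcriptional activity; here a circadian-like drive."""
    return 1.0 + np.sin(2 * np.pi * t / 24.0)

def rhs(t, y, beta=0.5, delta_m=0.2, delta_p=0.05):
    m, p = y
    return [tau(t) - delta_m * m,      # transcription minus mRNA degradation
            beta * m - delta_p * p]    # translation minus protein degradation

sol = solve_ivp(rhs, (0, 96), [0.0, 0.0], t_eval=np.linspace(0, 96, 200))
mRNA, protein = sol.y
print("final mRNA and protein levels:", mRNA[-1], protein[-1])
```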

  11. Constraints on Lorentz symmetry violations using lunar laser ranging observations

    NASA Astrophysics Data System (ADS)

    Bourgoin, Adrien

    2016-12-01

    General Relativity (GR) and the standard model of particle physics provide a comprehensive description of the four interactions of nature. A quantum gravity theory is expected to merge these two pillars of modern physics. According to unification theories, such a merger would lead to the breaking of a fundamental symmetry that appears in both GR and the standard model of particle physics: Lorentz symmetry. Lorentz symmetry violations in all fields of physics can be parametrized by an effective field theory framework called the standard-model extension (SME). Local Lorentz invariance violations in the gravitational sector should impact the orbital motion of bodies inside the solar system, such as the Moon. Thus, accurate lunar laser ranging (LLR) data can be analyzed to study the lunar motion precisely and to look for irregularities. For this purpose, ELPN (Ephéméride Lunaire Parisienne Numérique), a new lunar ephemeris, has been integrated in the SME framework. This new numerical solution of the lunar motion provides time series dated in barycentric dynamical time (temps dynamique barycentrique, TDB). Among these series are the barycentric position and velocity of the Earth-Moon vector, the lunar libration angles, the difference between terrestrial time and TDB, and partial derivatives integrated from the variational equations. ELPN predictions have been used to analyze LLR observations. In the GR framework, the residual standard deviations turn out to be of the same order of magnitude as those of the INPOP13b and DE430 ephemerides. In the framework of the minimal SME, the LLR data analysis provided constraints on local Lorentz invariance violations. Special attention was paid to the analysis of uncertainties in order to provide the most realistic constraints. Therefore, linear combinations of SME coefficients were first derived and fitted to the LLR observations; realistic uncertainties were then determined with a resampling method. The LLR data analysis did not reveal local Lorentz invariance violations arising in the lunar orbit. GR predictions are thus recovered with absolute precisions of the order of 10^-9 to 10^-12.

  12. An historically consistent and broadly applicable MRV system based on LiDAR sampling and Landsat time-series

    Treesearch

    W. Cohen; H. Andersen; S. Healey; G. Moisen; T. Schroeder; C. Woodall; G. Domke; Z. Yang; S. Stehman; R. Kennedy; C. Woodcock; Z. Zhu; J. Vogelmann; D. Steinwand; C. Huang

    2014-01-01

    The authors are developing a REDD+ MRV system that tests different biomass estimation frameworks and components. Design-based inference from a costly field plot network was compared to sampling with LiDAR strips and a smaller set of plots in combination with Landsat for disturbance monitoring. Biomass estimation uncertainties associated with these different data sets...

  13. Potential effects of climate change on streambed scour and risks to salmonid survival in snow-dominated mountain basins

    Treesearch

    Jaime R. Goode; John M. Buffington; Daniele Tonina; Daniel J. Isaak; Russell F. Thurow; Seth Wenger; David Nagel; Charlie Luce; Doerthe Tetzlaff; Chris Soulsby

    2013-01-01

    Snowmelt-dominated basins in northern latitudes provide critical habitat for salmonids. As such, these systems may be especially vulnerable to climate change because of potential shifts in the frequency, magnitude, and timing of flows that can scour incubating embryos. A general framework is presented to examine this issue, using a series of physical models that link...

  14. Flood risk in a changing world - a coupled transdisciplinary modelling framework for flood risk assessment in an Alpine study area

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Schneeberger, Klaus; Winter, Benjamin; Pazur, Robert; Förster, Kristian; Achleitner, Stefan; Bolliger, Janine

    2017-04-01

    Devastating flood events have caused substantial economic damage across Europe during past decades. Flood risk management has therefore become a topic of crucial interest across state agencies, research communities and the public sector, including insurers. There is consensus that mitigating flood risk relies on impact assessments which quantitatively account for a broad range of aspects in a (changing) environment. Flood risk assessments which take into account the interaction between the drivers climate change, land-use change and socio-economic change might bring new insights to the understanding of the magnitude and spatial characteristics of flood risks. Furthermore, the comparative assessment of different adaptation measures can give valuable information for decision-making. With this contribution we present an inter- and transdisciplinary research project aiming to develop and apply such an impact assessment, relying on a coupled modelling framework for the Province of Vorarlberg in Austria. Stakeholder engagement ensures that the final outcomes of our study are accepted and successfully implemented in flood management practice. The study addresses three key questions: (i) What are scenarios of land-use and climate change for the study area? (ii) How will the magnitude and spatial characteristics of future flood risk change as a result of changes in climate and land use? (iii) Are there spatial planning and building-protection measures which effectively reduce future flood risk? The modelling framework has a modular structure comprising the modules (i) climate change, (ii) land-use change, (iii) hydrologic modelling, (iv) flood risk analysis, and (v) adaptation measures. Meteorological time series are coupled with spatially explicit scenarios of land-use change to model runoff time series. The runoff time series are combined with impact indicators such as building damages, and the results are statistically assessed to analyse flood risk scenarios. Thus, the regional flood risk can be expressed in terms of the expected annual damage and the damages associated with a low probability of occurrence. We consider building-protection measures explicitly as part of the consequence analysis of flood risk, whereas spatial planning measures are already considered as explicit scenarios in the course of land-use change modelling.

  15. The Earth Observation Monitor - Automated monitoring and alerting for spatial time-series data based on OGC web services

    NASA Astrophysics Data System (ADS)

    Eberle, J.; Hüttich, C.; Schmullius, C.

    2014-12-01

    Spatial time series data have been freely available around the globe from earth observation satellites and meteorological stations for many years. They provide useful and important information for detecting ongoing changes of the environment, but for end-users it is often too complex to extract this information from the original time-series datasets. This issue led to the development of the Earth Observation Monitor (EOM), an operational framework and research project to provide simple access, analysis and monitoring tools for global spatial time-series data. A multi-source data processing middleware in the backend is linked to MODIS data from the Land Processes Distributed Archive Center (LP DAAC) and Google Earth Engine as well as daily climate station data from the NOAA National Climatic Data Center. OGC Web Processing Services are used to integrate datasets from linked data providers or external OGC-compliant interfaces into the EOM. Users can use either the web portal (webEOM) or the mobile application (mobileEOM) to execute these processing services and to retrieve the requested data for a given point or polygon in user-friendly file formats (CSV, GeoTIFF). Besides data access tools, users can also run further time-series analyses such as trend calculations, breakpoint detections or the derivation of phenological parameters from vegetation time-series data. Furthermore, data from climate stations can be aggregated over a given time interval. Calculated results can be visualized in the client and downloaded for offline usage. Automated monitoring and alerting of the time-series data integrated by the user is provided by an OGC Sensor Observation Service with a coupled OGC Web Notification Service. Users can decide which datasets and parameters are monitored with a given filter expression (e.g., a precipitation value higher than x millimeters per day, the occurrence of a MODIS fire point, or the detection of a time-series anomaly). Datasets integrated in the SOS service are updated in near-real-time based on the linked data providers mentioned above. An alert is automatically pushed to the user if the new data meet the conditions of the registered filter expression. This monitoring service is available on the web portal with alerting by email, and within the mobile app with alerting by email and push notification.
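
    The simplest of the analyses listed above, a linear trend on a vegetation index series, looks as follows offline; the synthetic NDVI series and its roughly 16-day cadence are illustrative, and the web services themselves are not reproduced.

```python
# Ordinary least-squares trend on a synthetic NDVI series (illustrative of
# the EOM trend-calculation tool, not its actual service code).
import numpy as np

years = np.arange(2000, 2015, 1 / 23)        # roughly 16-day compositing step
rng = np.random.default_rng(6)
ndvi = (0.5 + 0.005 * (years - 2000)         # slow greening trend
        + 0.2 * np.sin(2 * np.pi * years)    # seasonal cycle
        + 0.02 * rng.standard_normal(years.size))
slope, intercept = np.polyfit(years, ndvi, 1)
print(f"NDVI trend: {slope:+.4f} per year")
```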

  16. pyBadlands: A framework to simulate sediment transport, landscape dynamics and basin stratigraphic evolution through space and time

    PubMed Central

    2018-01-01

    Understanding Earth surface responses in terms of sediment dynamics to climatic variability and tectonic forcing is hindered by the limited ability of current models to simulate the long-term evolution of sediment transfer and associated morphological changes. This paper presents pyBadlands, an open-source python-based framework which computes over geological time (1) sediment transport from landmasses to coasts, (2) reworking of marine sediments by longshore currents and (3) development of coral reef systems. pyBadlands is cross-platform, distributed under the GPLv3 license and available on GitHub (http://github.com/badlands-model). Here, we describe the underlying physical assumptions behind the simulated processes and the main options already available in the numerical framework. Along with the source code, a list of hands-on examples is provided that illustrates the model capabilities. In addition, pre- and post-processing classes have been built and are accessible as a companion toolbox which comprises a series of workflows to efficiently build, quantify and explore simulation input and output files. While the framework has been primarily designed for research, its simplicity of use and portability makes it a great tool for teaching purposes. PMID:29649301

  17. DROUGHT IN THE ANTHROPOCENE: what/who causes abnormally dry conditions? (Invited)

    NASA Astrophysics Data System (ADS)

    Van Loon, A.; Van Lanen, H.

    2013-12-01

    Deforestation for agriculture, reservoir construction for hydropower, groundwater abstraction for irrigation, river diversion for navigation. These are only some examples of human interventions in river basins. The consequences of these interventions can be far-reaching, but are often difficult to distinguish from natural influences on the water system, such as meteorological droughts. River basin managers in water-stressed regions need to quantify both human and natural effects on the water system to adapt their water management accordingly. 'Drought' is a natural hazard, which is caused by climatic processes and their intrinsic variability, and cannot be prevented by short-term, local water management. 'Water scarcity' refers to the long-term unsustainable use of water resources and is a process that water managers and policy makers can influence. Water scarcity and drought are keywords for river basin managers in water-stressed regions, like Australia, California, China and the Mediterranean Basin. The interrelationship between drought and water scarcity, however, is complex. In regions with low water availability and high human pressures, water scarcity situations are common and can be exacerbated by drought events. The worst situation is a multi-year drought in a (semi-)arid region with high demand for water. In monitoring the hydrological system for water management purposes, it is difficult (but essential) to determine which part of the temporal variation in a hydrological variable is caused by water scarcity (human induced) and which part by drought (natural). So the urgent question of many water managers is: how to distinguish between water scarcity and drought? Here, we present a new quantitative approach to distinguish between the two, namely the observation-modelling framework proposed by Van Loon and Van Lanen (2013) to separate natural (drought) and human (water scarcity) effects on the hydrological system. The basis of the framework is simulation of the situation that would have occurred without human influence, i.e. the 'naturalised' situation, using a hydrological model. The resulting time series of naturalised state variables and fluxes can then be compared to observed time series. Additionally, anomalies (i.e. deviations from a threshold) are determined from both time series and compared. This analysis allows for quantification of the relative effect of drought and water scarcity. To show the general applicability of the framework, we investigated case study areas with contrasting climate and catchment properties in Spain, the Czech Republic and the Netherlands. Using these case study areas we could analyse the effect of groundwater abstraction and water transfer on groundwater levels and streamflow. The proposed observation-modelling framework is rather generic. We demonstrate the range of methods that can be used and the range of human influences the framework can be applied to. The observation-modelling framework can help water managers, policy makers and stakeholders in water-stressed regions to combat water scarcity, and to better adapt to drought by decreasing their vulnerability. A clear distinction between drought and water scarcity is needed in the anthropocene.

  18. Rating knowledge sharing in cross-domain collaborative filtering.

    PubMed

    Li, Bin; Zhu, Xingquan; Li, Ruijiang; Zhang, Chengqi

    2015-05-01

    Cross-domain collaborative filtering (CF) aims to share common rating knowledge across multiple related CF domains to boost the CF performance. In this paper, we view CF domains as a 2-D site-time coordinate system, on which multiple related domains, such as similar recommender sites or successive time-slices, can share group-level rating patterns. We propose a unified framework for cross-domain CF over the site-time coordinate system by sharing group-level rating patterns and imposing user/item dependence across domains. A generative model, called ratings over site-time (ROST), which can generate and predict ratings for multiple related CF domains, is developed as the basic model for the framework. We further introduce cross-domain user/item dependence into ROST and extend it to two real-world cross-domain CF scenarios: 1) ROST (sites) for alleviating rating sparsity in the target domain, where multiple similar sites are viewed as related CF domains and some items in the target domain depend on their correspondences in the related ones; and 2) ROST (time) for modeling user-interest drift over time, where a series of time-slices are viewed as related CF domains and a user at the current time-slice depends on herself in the previous time-slice. All these ROST models are instances of the proposed unified framework. The experimental results show that ROST (sites) can effectively alleviate the sparsity problem to improve rating prediction performance and ROST (time) can clearly track and visualize user-interest drift over time.

  19. Developing a Performance Assessment Framework and Indicators for Communicable Disease Management in Natural Disasters.

    PubMed

    Babaie, Javad; Ardalan, Ali; Vatandoost, Hasan; Goya, Mohammad Mehdi; Akbarisari, Ali

    2016-02-01

    Communicable disease management (CDM) is an important component of disaster public health response operations. However, there is no established performance assessment (PA) framework or related indicators for CDM. This study aimed to develop a PA framework and indicators for CDM in disasters. A series of methods were used. First, a systematic literature review (SLR) was performed in order to extract the existing PA frameworks and indicators. Then, using a qualitative approach, interviews with purposively selected experts were conducted and used in developing the PA framework and indicators. Finally, the analytical hierarchy process (AHP) was used for weighting the developed indicators. The input, process, products, and outcomes (IPPO) framework was found to be an appropriate framework for CDM PA. Seven main functions were identified for CDM during disasters, and forty PA indicators were developed across the four categories. In conclusion, given the lack of any existing PA framework for CDM in disasters, this study developed a PA framework (the IPPO framework) through a series of methods; the framework is appropriate for the task and its indicators can measure the performance of CDM in disasters.

  20. Cross-correlation earthquake precursors in the hydrogeochemical and geoacoustic signals for the Kamchatka peninsula

    NASA Astrophysics Data System (ADS)

    Ryabinin, Gennadiy; Gavrilov, Valeriy; Polyakov, Yuriy; Timashev, Serge

    2012-06-01

    We propose a new type of earthquake precursor based on the analysis of correlation dynamics between geophysical signals of different nature. The precursor is found using a two-parameter cross-correlation function introduced within the framework of flicker-noise spectroscopy, a general statistical physics approach to the analysis of time series. We consider an example of cross-correlation analysis for water salinity time series, an integral characteristic of the chemical composition of groundwater, and geoacoustic emissions recorded at the G-1 borehole on the Kamchatka peninsula in the time frame from 2001 to 2003, which is characterized by a sequence of three groups of significant seismic events. We found that cross-correlation precursors took place 27, 31, and 35 days ahead of the strongest earthquakes for each group of seismic events, respectively. At the same time, precursory anomalies in the signals themselves were observed only in the geoacoustic emissions for one group of earthquakes.
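
    The paper's two-parameter flicker-noise-spectroscopy correlation function is more elaborate than a standard tool, but the underlying precursor logic, watching how the correlation between two signals behaves as a function of lag, can be illustrated with a plain lagged Pearson correlation; the synthetic driver/response pair below is hypothetical.

```python
# Plain lagged cross-correlation (not the FNS two-parameter function) used
# to recover a known 30-sample lead of one synthetic series over another.
import numpy as np

def lagged_corr(x, y, lag):
    """Pearson correlation between x and y shifted by `lag` samples."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    return np.corrcoef(x, y)[0, 1]

rng = np.random.default_rng(8)
drive = rng.standard_normal(1000)
response = np.roll(drive, 30) + 0.5 * rng.standard_normal(1000)
best = max(range(-60, 61), key=lambda L: lagged_corr(drive, response, L))
print("best lag:", best)   # expect about 30
```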

  1. Ice Stream Slowdown Will Drive Long-Term Thinning of the Ross Ice Shelf, With or Without Ocean Warming

    NASA Astrophysics Data System (ADS)

    Campbell, Adam J.; Hulbe, Christina L.; Lee, Choon-Ki

    2018-01-01

    As time series observations of Antarctic change proliferate, it is imperative that mathematical frameworks through which they are understood keep pace. Here we present a new method of interpreting remotely sensed change using spatial statistics and apply it to the specific case of thickness change on the Ross Ice Shelf. First, a numerical model of ice shelf flow is used together with empirical orthogonal function analysis to generate characteristic patterns of response to specific forcings. Because they are continuous and scalable in space and time, the patterns allow short duration observations to be placed in a longer time series context. Second, focusing only on changes that are statistically significant, the synthetic response surfaces are used to extract magnitude and timing of past events from the observational data. Slowdown of Kamb and Whillans Ice Streams is clearly detectable in remotely sensed thickness change. Moreover, those past events will continue to drive thinning into the future.
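
    The statistical core of the method, empirical orthogonal function analysis, amounts to a singular value decomposition of the space-time anomaly matrix. The sketch below shows that decomposition on synthetic data standing in for modeled thickness change.

```python
# EOF analysis via SVD: rows are time steps, columns are spatial points.
# The random field is a stand-in for ice-shelf model output.
import numpy as np

rng = np.random.default_rng(10)
field = rng.standard_normal((40, 500))    # 40 snapshots, 500 grid nodes
anomaly = field - field.mean(axis=0)      # remove the temporal mean
u, s, vt = np.linalg.svd(anomaly, full_matrices=False)
eofs = vt                                 # spatial patterns (modes)
pcs = u * s                               # time series of mode amplitudes
explained = s**2 / (s**2).sum()
print("variance explained by first 3 modes:", explained[:3].round(3))
```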

  2. Reconstructing Genetic Regulatory Networks Using Two-Step Algorithms with the Differential Equation Models of Neural Networks.

    PubMed

    Chen, Chi-Kan

    2017-07-26

    The identification of genetic regulatory networks (GRNs) provides insights into complex cellular processes. A class of recurrent neural networks (RNNs) captures the dynamics of GRNs. Algorithms combining the RNN and machine learning schemes were proposed to reconstruct small-scale GRNs using gene expression time series. We present new GRN reconstruction methods with neural networks. The RNN is extended to a class of recurrent multilayer perceptrons (RMLPs) with latent nodes. Our methods contain two steps: the edge rank assignment step and the network construction step. The former assigns ranks to all possible edges by a recursive procedure based on the estimated weights of wires of the RNN/RMLP (RE_RNN/RE_RMLP), and the latter constructs a network consisting of top-ranked edges under which the optimized RNN simulates the gene expression time series. Particle swarm optimization (PSO) is applied to optimize the parameters of the RNNs and RMLPs in a two-step algorithm. The proposed RE_RNN-RNN and RE_RMLP-RNN algorithms are tested on synthetic and experimental gene expression time series of small GRNs of about 10 genes. The experimental time series are from studies of yeast cell cycle regulated genes and E. coli DNA repair genes. The unstable estimation of an RNN using experimental time series with limited data points can lead to fairly arbitrary predicted GRNs. Our methods incorporate the RNN and RMLP into a two-step structure learning procedure. Results show that RE_RMLP, which uses an RMLP with a suitable number of latent nodes to reduce the parameter dimension, often produces more accurate edge ranks than RE_RNN, which uses the regularized RNN, on short simulated time series. When the networks derived by RE_RMLP-RNN with different numbers of latent nodes in step one are combined by a weighted majority voting rule to infer the GRN, the method performs consistently and outperforms published algorithms for GRN reconstruction on most benchmark time series. The framework of two-step algorithms can potentially incorporate different nonlinear differential equation models to reconstruct the GRN.

  3. Using complex networks towards information retrieval and diagnostics in multidimensional imaging

    NASA Astrophysics Data System (ADS)

    Banerjee, Soumya Jyoti; Azharuddin, Mohammad; Sen, Debanjan; Savale, Smruti; Datta, Himadri; Dasgupta, Anjan Kr; Roy, Soumen

    2015-12-01

    We present a fresh and broad yet simple approach towards information retrieval in general and diagnostics in particular by applying the theory of complex networks to multidimensional, dynamic images. We demonstrate a successful use of our method with the time series generated from high-content thermal imaging videos of patients suffering from the aqueous deficient dry eye (ADDE) disease. Remarkably, network analyses of thermal imaging time series of contact lens users and patients upon whom Laser-Assisted in situ Keratomileusis (Lasik) surgery has been conducted exhibit pronounced similarity with results obtained from ADDE patients. We also propose a general framework for the transformation of multidimensional images to networks for futuristic biometry. Our approach is general and scalable to other fluctuation-based devices, where network parameters derived from fluctuations act as effective discriminators and diagnostic markers.

  4. Using complex networks towards information retrieval and diagnostics in multidimensional imaging.

    PubMed

    Banerjee, Soumya Jyoti; Azharuddin, Mohammad; Sen, Debanjan; Savale, Smruti; Datta, Himadri; Dasgupta, Anjan Kr; Roy, Soumen

    2015-12-02

    We present a fresh and broad yet simple approach towards information retrieval in general and diagnostics in particular by applying the theory of complex networks to multidimensional, dynamic images. We demonstrate a successful use of our method with the time series generated from high-content thermal imaging videos of patients suffering from the aqueous deficient dry eye (ADDE) disease. Remarkably, network analyses of thermal imaging time series of contact lens users and patients upon whom Laser-Assisted in situ Keratomileusis (Lasik) surgery has been conducted exhibit pronounced similarity with results obtained from ADDE patients. We also propose a general framework for the transformation of multidimensional images to networks for futuristic biometry. Our approach is general and scalable to other fluctuation-based devices, where network parameters derived from fluctuations act as effective discriminators and diagnostic markers.

  5. Using complex networks towards information retrieval and diagnostics in multidimensional imaging

    PubMed Central

    Banerjee, Soumya Jyoti; Azharuddin, Mohammad; Sen, Debanjan; Savale, Smruti; Datta, Himadri; Dasgupta, Anjan Kr; Roy, Soumen

    2015-01-01

    We present a fresh and broad yet simple approach towards information retrieval in general and diagnostics in particular by applying the theory of complex networks to multidimensional, dynamic images. We demonstrate a successful use of our method with the time series generated from high-content thermal imaging videos of patients suffering from the aqueous deficient dry eye (ADDE) disease. Remarkably, network analyses of thermal imaging time series of contact lens users and patients upon whom Laser-Assisted in situ Keratomileusis (Lasik) surgery has been conducted exhibit pronounced similarity with results obtained from ADDE patients. We also propose a general framework for the transformation of multidimensional images to networks for futuristic biometry. Our approach is general and scalable to other fluctuation-based devices, where network parameters derived from fluctuations act as effective discriminators and diagnostic markers. PMID:26626047
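
    The abstracts above do not spell out the series-to-network mapping, so the sketch below uses one standard construction, the natural visibility graph, purely to make the idea concrete: each sample becomes a node, and two nodes are linked if the straight line between them clears every intermediate sample.

```python
# Natural visibility graph: one common way to turn a time series into a
# network (an illustration of the idea, not the paper's specific mapping).
import numpy as np

def visibility_graph(y):
    n, edges = len(y), set()
    for i in range(n):
        for j in range(i + 1, n):
            # i "sees" j if all samples between them lie below the i-j line
            if all(y[k] < y[i] + (y[j] - y[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

rng = np.random.default_rng(4)
series = np.sin(np.linspace(0, 6 * np.pi, 60)) + 0.1 * rng.standard_normal(60)
print(len(visibility_graph(series)), "edges")
```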

  6. Responsive space: Concept analysis and theoretical framework

    NASA Astrophysics Data System (ADS)

    Saleh, Joseph H.; Dubos, Gregory F.

    2009-08-01

    Customers' needs are dynamic and evolve in response to unfolding environmental uncertainties. The ability of a company or an industry to address these changing customers' needs in a timely and cost-effective way is a measure of its responsiveness. In the space industry, a systemic discrepancy exists between the time constants associated with the change of customers' needs, and the response time of the industry in delivering on-orbit solutions to these needs. There are important penalties associated with such delays, and space responsiveness is recognized as a strategic imperative in commercial competitive and military environments. In this paper, we provide a critical assessment of the literature on responsive space and introduce a new multi-disciplinary framework for thinking about and addressing issues of space responsiveness. Our framework advocates three levels of responsiveness: a global industry-wide responsiveness, a local stakeholder responsiveness, and an interactive or inter-stakeholder responsiveness. We introduce and motivate the use of "responsiveness maps" for multiple stakeholders. We then identify "levers of responsiveness": technical spacecraft- and launch-centric, as well as "soft" levers (e.g., acquisition policies) for improving the responsiveness of the space industry. Finally, we propose a series of research questions to aggressively tackle problems associated with space responsiveness.

  7. Nosiheptide Biosynthesis Featuring a Unique Indole Side Ring Formation on the Characteristic Thiopeptide Framework

    PubMed Central

    Yu, Yi; Duan, Lian; Zhang, Qi; Liao, Rijing; Ding, Ying; Pan, Haixue; Wendt-Pienkowski, Evelyn; Tang, Gongli; Shen, Ben; Liu, Wen

    2009-01-01

    Nosiheptide (NOS), belonging to the e series of thiopeptide antibiotics that exhibit potent activity against various bacterial pathogens, bears a unique indole side ring system and regiospecific hydroxyl groups on the characteristic macrocyclic core. Here, cloning, sequencing and characterization of the nos gene cluster from Streptomyces actuosus ATCC 25421 as a model for this series of thiopeptides has unveiled new insights into their biosynthesis. Bioinformatics-based sequence analysis and in vivo investigation into the gene functions show that NOS biosynthesis shares a common strategy with recently characterized b or c series thiopeptides for forming the characteristic macrocyclic core, which features a ribosomally synthesized precursor peptide with conserved posttranslational modifications. However, it apparently proceeds via a different route for tailoring the thiopeptide framework, allowing the final product to exhibit the distinct structural characteristics of e series thiopeptides, such as the indole side ring system. Chemical complementation supports the notion that the S-adenosylmethionine (AdoMet)-dependent protein NosL may play a central role in converting Trp to the key 3-methylindole moiety by an unusual carbon side chain rearrangement, most likely via a radical-initiated mechanism. Characterization of the indole side ring-opened analog of NOS from the nosN mutant strain is consistent with the proposed methyltransferase activity of its encoded protein, shedding light into the timing of the individual steps for indole side ring biosynthesis. These results also suggest the feasibility of engineering novel thiopeptides for drug discovery by manipulating the NOS biosynthetic machinery. PMID:19678698

  8. Development of indicators of vegetation recovery based on time series analysis of SPOT Vegetation data

    NASA Astrophysics Data System (ADS)

    Lhermitte, S.; Tips, M.; Verbesselt, J.; Jonckheere, I.; Van Aardt, J.; Coppin, Pol

    2005-10-01

    Large-scale wildfires have direct impacts on natural ecosystems and play a major role in vegetation ecology and the carbon budget. Accurate methods for describing post-fire development of vegetation are therefore essential for the understanding and monitoring of terrestrial ecosystems. Time series analysis of satellite imagery offers the potential to quantify these parameters with spatial and temporal accuracy. Current research focuses on the potential of time series analysis of SPOT Vegetation S10 data (1999-2001) to quantify the vegetation recovery of large-scale burns detected in the framework of GBA2000. The objective of this study was to provide quantitative estimates of the spatio-temporal variation of vegetation recovery based on remote sensing indicators. Southern Africa was used as a pilot study area, given the availability of ground and satellite data. An automated technique was developed to extract consistent indicators of vegetation recovery from the SPOT-VGT time series. Reference areas were used to quantify the vegetation regrowth by means of Regeneration Indices (RI). Two kinds of recovery indicators (time- and value-based) were tested for RIs of NDVI, SR, SAVI, NDWI, and pure band information. The effects of vegetation structure and temporal fire regime features on the recovery indicators were subsequently analyzed. Statistical analyses were conducted to assess whether the recovery indicators differed between vegetation types and depended on the timing of the burning season. Results highlighted the importance of appropriate reference areas and of correct normalization of the SPOT-VGT data.

  9. Memetic Approaches for Optimizing Hidden Markov Models: A Case Study in Time Series Prediction

    NASA Astrophysics Data System (ADS)

    Bui, Lam Thu; Barlow, Michael

    We propose a methodology for employing memetics (local search) within the framework of evolutionary algorithms to optimize the parameters of hidden Markov models. With this proposal, the rate and frequency of using local search are automatically changed over time, either at the population or the individual level. At the population level, we allow the rate of using local search to decay over time to zero (at the final generation). At the individual level, each individual is equipped with information about when it will do local search and for how long. This information evolves over time alongside the main elements of the chromosome representing the individual.
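
    The population-level decay rule is simple to sketch: the probability of applying local search falls linearly to zero at the final generation. The HMM fitness itself is abstracted into a callable here; the toy problem (maximizing -x^2) and all operators are hypothetical stand-ins.

```python
# Memetic loop with a local-search rate that decays to zero by the final
# generation; the fitness, mutation and local-search operators are toys.
import random

def evolve(pop, generations, mutate, local_search, fitness):
    for g in range(generations):
        p_ls = 1.0 - g / max(generations - 1, 1)  # decays linearly to 0
        pop = [local_search(x) if random.random() < p_ls else mutate(x)
               for x in pop]
        pop.sort(key=fitness, reverse=True)
    return pop

random.seed(0)
best = evolve(pop=[random.uniform(-5, 5) for _ in range(20)],
              generations=50,
              mutate=lambda x: x + random.gauss(0, 0.5),
              local_search=lambda x: 0.9 * x,    # greedy pull toward optimum
              fitness=lambda x: -x * x)[0]
print("best solution:", best)
```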

  10. Simulating Pre-Asymptotic, Non-Fickian Transport Although Doing Simple Random Walks - Supported By Empirical Pore-Scale Velocity Distributions and Memory Effects

    NASA Astrophysics Data System (ADS)

    Most, S.; Jia, N.; Bijeljic, B.; Nowak, W.

    2016-12-01

    Pre-asymptotic characteristics are almost ubiquitous when analyzing solute transport processes in porous media. These pre-asymptotic aspects are caused by spatial coherence in the velocity field and by its heterogeneity. From the Lagrangian perspective of particle displacements, the causes of pre-asymptotic, non-Fickian transport are a skewed velocity distribution, statistical dependencies between subsequent increments of particle positions (memory), and dependence between the x-, y- and z-components of particle increments. Valid simulation frameworks should account for these factors. We propose a particle tracking random walk (PTRW) simulation technique that can use empirical pore-space velocity distributions as input, enforces memory between subsequent random walk steps, and considers cross-dependence. Thus, it is able to simulate pre-asymptotic non-Fickian transport phenomena. Our PTRW framework contains an advection/dispersion term plus a diffusion term. The advection/dispersion term produces time series of particle increments from the velocity CDFs. These time series are equipped with memory by enforcing that the CDF values of subsequent velocities change only slightly. The latter is achieved through a random walk on the axis of CDF values between 0 and 1. The virtual diffusion coefficient for that random walk is our only fitting parameter. Cross-dependence can be enforced by constraining the random walk to certain combinations of CDF values between the three velocity components in x, y and z. We will show that this modelling framework is capable of simulating non-Fickian transport by comparison with a pore-scale transport simulation, and we analyze the approach to asymptotic behavior.
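
    The memory mechanism described above can be sketched directly: velocities are drawn by inverting an empirical CDF at positions produced by a bounded random walk on [0, 1], so consecutive velocity quantiles change only slightly. In the sketch, `d_cdf` plays the role of the single fitting parameter (the virtual diffusion coefficient), and the lognormal velocity sample is a hypothetical stand-in for pore-scale data.

```python
# Velocity series with memory: a random walk on CDF values in [0, 1],
# inverted through an empirical velocity distribution. `d_cdf` mimics the
# paper's single fitting parameter; the velocity sample is synthetic.
import numpy as np

rng = np.random.default_rng(5)
velocity_samples = rng.lognormal(0.0, 1.0, 10_000)  # stand-in pore velocities

def velocity_series(n_steps, d_cdf=0.02):
    u = rng.uniform()
    out = np.empty(n_steps)
    for i in range(n_steps):
        u = np.clip(u + rng.normal(0.0, d_cdf), 0.0, 1.0)  # walk on CDF axis
        out[i] = np.quantile(velocity_samples, u)          # invert empirical CDF
    return out

print(velocity_series(5))
```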

  11. The use of Tcl and Tk to improve design and code reutilization

    NASA Technical Reports Server (NTRS)

    Rodriguez, Lisbet; Reinholtz, Kirk

    1995-01-01

    Tcl and Tk facilitate design and code reuse in the ZIPSIM series of high-performance, high-fidelity spacecraft simulators. Tcl and Tk provide a framework for the construction of the Graphical User Interfaces for the simulators. The interfaces are architected such that a large proportion of the design and code is used for several applications, which has reduced design time and life-cycle costs.

  12. A Comparison of Pseudo-Maximum Likelihood and Asymptotically Distribution-Free Dynamic Factor Analysis Parameter Estimation in Fitting Covariance Structure Models to Block-Toeplitz Matrices Representing Single-Subject Multivariate Time-Series.

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    1998-01-01

    Pseudo-Maximum Likelihood (p-ML) and Asymptotically Distribution Free (ADF) estimation methods for estimating dynamic factor model parameters within a covariance structure framework were compared through a Monte Carlo simulation. Both methods appear to give consistent model parameter estimates, but only ADF gives standard errors and chi-square…

  13. About the cumulants of periodic signals

    NASA Astrophysics Data System (ADS)

    Barrau, Axel; El Badaoui, Mohammed

    2018-01-01

    This note studies cumulants of time series. Although these functions originate from probability theory, they are commonly used as features of deterministic signals, so their classical properties are examined in this modified framework. We show that the additivity of cumulants, which is ensured in the case of independent random variables, requires a different hypothesis here. Practical applications are proposed, in particular an analysis of the failure of the JADE algorithm to separate certain specific periodic signals.

  14. Estimating the decomposition of predictive information in multivariate systems

    NASA Astrophysics Data System (ADS)

    Faes, Luca; Kugiumtzis, Dimitris; Nollo, Giandomenico; Jurysta, Fabrice; Marinazzo, Daniele

    2015-03-01

    In the study of complex systems from observed multivariate time series, a central question is how the evolution of a target system can be explained in terms of the information stored in the system itself and the information transferred to it from other interacting systems. We present a framework for the model-free estimation of information storage and information transfer, computed as the terms composing the predictive information about the target of a multivariate dynamical process. The approach tackles the curse of dimensionality by employing a nonuniform embedding scheme that selects progressively, among the past components of the multivariate process, only those that contribute most, in terms of conditional mutual information, to the present of the target process. Moreover, it computes all information-theoretic quantities using a nearest-neighbor technique designed to compensate for the bias due to the different dimensionality of individual entropy terms. The resulting estimators of prediction entropy, storage entropy, transfer entropy, and partial transfer entropy are tested on simulations of coupled linear stochastic and nonlinear deterministic dynamic processes, demonstrating the superiority of the proposed approach over traditional estimators based on uniform embedding. The framework is then applied to multivariate physiologic time series, yielding physiologically well-interpretable information decompositions of cardiovascular and cardiorespiratory interactions during head-up tilt and of joint brain-heart dynamics during sleep.
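
    The nonuniform embedding step can be illustrated with the rough Python sketch below: it greedily selects the lagged components most informative about the target. Plain mutual information on residuals (via scikit-learn) is a crude stand-in for the paper's conditional mutual information with nearest-neighbor bias compensation; all names and data are illustrative.

    ```python
    import numpy as np
    from sklearn.feature_selection import mutual_info_regression

    def nonuniform_embedding(X, target, max_lag=5, n_terms=3):
        """X: (n, k) multivariate series; target: (n,) series to predict."""
        n, k = X.shape
        # candidate matrix of all lagged components
        cands = np.column_stack([np.roll(X[:, j], lag)
                                 for j in range(k) for lag in range(1, max_lag + 1)])
        names = [(j, lag) for j in range(k) for lag in range(1, max_lag + 1)]
        y = target[max_lag:]
        cands = cands[max_lag:]
        selected, residual = [], y.copy()
        for _ in range(n_terms):
            mi = mutual_info_regression(cands, residual)
            best = int(np.argmax(mi))
            selected.append(names[best])
            # crude conditioning: regress out the chosen term, keep the residual
            c = cands[:, best]
            residual = residual - np.polyval(np.polyfit(c, residual, 1), c)
        return selected

    rng = np.random.default_rng(0)
    x = rng.normal(size=600)
    y = 0.8 * np.roll(x, 2) + rng.normal(scale=0.3, size=600)  # y driven by x at lag 2
    print(nonuniform_embedding(np.column_stack([x, y]), y))    # (0, 2) should rank first
    ```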

  15. Wavelet neural networks: a practical guide.

    PubMed

    Alexandridis, Antonios K; Zapranis, Achilleas D

    2013-06-01

    Wavelet networks (WNs) are a new class of networks that have been used with great success in a wide range of applications. However, a generally accepted framework for applying WNs is missing from the literature. In this study, we present a complete statistical model identification framework for applying WNs in various applications. The following subjects are thoroughly examined: the structure of a WN, training methods, initialization algorithms, variable significance and variable selection algorithms, model selection methods, and finally methods to construct confidence and prediction intervals. In addition, the complexity of each algorithm is discussed. Our proposed framework was tested in two simulated cases, in one chaotic time series described by the Mackey-Glass equation, and in three real datasets described by daily temperatures in Berlin, daily wind speeds in New York, and breast cancer classification. Our results show that the proposed algorithms produce stable and robust results, indicating that our proposed framework can be applied in various applications. Copyright © 2013 Elsevier Ltd. All rights reserved.
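
    As a small illustration of the WN structure (not the authors' implementation), the sketch below builds Mexican-hat "wavelons" with fixed, heuristically initialized translations and dilations and fits only the linear output weights by least squares; a full WN would also adapt the wavelet parameters during training.

    ```python
    import numpy as np

    def mexican_hat(z):
        return (1.0 - z ** 2) * np.exp(-0.5 * z ** 2)

    def wn_design_matrix(x, translations, dilations):
        Z = (x[:, None] - translations[None, :]) / dilations[None, :]
        # wavelon outputs plus a direct linear term and a bias
        return np.column_stack([mexican_hat(Z), x, np.ones_like(x)])

    # toy 1-D regression problem
    x = np.linspace(-3, 3, 200)
    y = np.sin(2 * x) * np.exp(-0.1 * x ** 2) \
        + 0.05 * np.random.default_rng(1).normal(size=x.size)

    t = np.linspace(-3, 3, 8)          # translations spread over the input range
    d = np.full(8, 0.75)               # common dilation (illustrative choice)
    Phi = wn_design_matrix(x, t, d)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    y_hat = Phi @ w                    # WN prediction
    ```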

  16. A penalized framework for distributed lag non-linear models.

    PubMed

    Gasparrini, Antonio; Scheipl, Fabian; Armstrong, Ben; Kenward, Michael G

    2017-09-01

    Distributed lag non-linear models (DLNMs) are a modelling tool for describing potentially non-linear and delayed dependencies. Here, we illustrate an extension of the DLNM framework through the use of penalized splines within generalized additive models (GAM). This extension offers built-in model selection procedures and the possibility of accommodating assumptions on the shape of the lag structure through specific penalties. In addition, this framework includes, as special cases, simpler models previously proposed for linear relationships (DLMs). Alternative versions of penalized DLNMs are compared with each other and with the standard unpenalized version in a simulation study. Results show that this penalized extension to the DLNM class provides greater flexibility and improved inferential properties. The framework exploits recent theoretical developments of GAMs and is implemented using efficient routines within freely available software. Real-data applications are illustrated through two reproducible examples in time series and survival analysis. © 2017 The Authors Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.

  17. High-Performance Integrated Control of water quality and quantity in urban water reservoirs

    NASA Astrophysics Data System (ADS)

    Galelli, S.; Castelletti, A.; Goedbloed, A.

    2015-11-01

    This paper contributes a novel High-Performance Integrated Control framework to support the real-time operation of urban water supply storages affected by water quality problems. We use a 3-D, high-fidelity simulation model to predict the main water quality dynamics and inform a real-time controller based on Model Predictive Control. The integration of the simulation model into the control scheme is performed by a model reduction process that identifies a low-order, dynamic emulator running 4 orders of magnitude faster. The model reduction, which relies on a semiautomatic procedural approach integrating time series clustering and variable selection algorithms, generates a compact and physically meaningful emulator that can be coupled with the controller. The framework is used to design the hourly operation of Marina Reservoir, a 3.2 Mm3 storm-water-fed reservoir located in the center of Singapore, operated for drinking water supply and flood control. Because of its recent formation from a former estuary, the reservoir suffers from high salinity levels, whose behavior is modeled with Delft3D-FLOW. Results show that our control framework reduces the minimum salinity levels by nearly 40% and cuts the average annual deficit of drinking water supply by about 2 times the active storage of the reservoir (about 4% of the total annual demand).

  18. Exploring Evolving Media Discourse Through Event Cueing.

    PubMed

    Lu, Yafeng; Steptoe, Michael; Burke, Sarah; Wang, Hong; Tsai, Jiun-Yi; Davulcu, Hasan; Montgomery, Douglas; Corman, Steven R; Maciejewski, Ross

    2016-01-01

    Online news, microblogs and other media documents all contain valuable insight regarding events and responses to events. Underlying these documents is the concept of framing, a process in which communicators act (consciously or unconsciously) to construct a point of view that encourages facts to be interpreted by others in a particular manner. As media discourse evolves, how topics and documents are framed can undergo change, shifting the discussion to different viewpoints or rhetoric. What causes these shifts can be difficult to determine directly; however, by linking secondary datasets and enabling visual exploration, we can enhance the hypothesis generation process. In this paper, we present a visual analytics framework for event cueing using media data. As discourse develops over time, our framework applies a time series intervention model that tests whether the level of framing differs before and after a given date. If the model indicates that the periods before and after are statistically significantly different, this cues an analyst to explore related datasets to help enhance their understanding of what (if any) events may have triggered these changes in discourse. Our framework consists of entity extraction and sentiment analysis as lenses for data exploration and uses two different models for intervention analysis. To demonstrate the usage of our framework, we present a case study on exploring potential relationships between climate change framing and conflicts in Africa.
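
    A minimal sketch of the kind of intervention test described above, assuming a simple level-shift model with a linear trend control; the synthetic series and variable names are illustrative, and the paper's two intervention models may differ in detail.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(42)
    n, event_idx = 200, 120
    framing = rng.normal(0.0, 1.0, n)
    framing[event_idx:] += 1.5                  # synthetic post-event level shift

    trend = np.arange(n)
    step = (trend >= event_idx).astype(float)   # 0 before, 1 after the date
    X = sm.add_constant(np.column_stack([trend, step]))
    fit = sm.OLS(framing, X).fit()
    print(fit.params, fit.pvalues[2])           # p-value of the step (intervention) term
    ```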

  19. The Measurand Framework: Scaling Exploratory Data Analysis

    NASA Astrophysics Data System (ADS)

    Schneider, D.; MacLean, L. S.; Kappler, K. N.; Bleier, T.

    2017-12-01

    Since 2005 QuakeFinder (QF) has acquired a unique dataset with outstanding spatial and temporal sampling of Earth's time-varying magnetic field along several active fault systems. This QF network consists of 124 stations in California and 45 stations along fault zones in Greece, Taiwan, Peru, Chile and Indonesia. Each station is equipped with three feedback induction magnetometers, two ion sensors, a 4 Hz geophone, a temperature sensor, and a humidity sensor. Data are continuously recorded at 50 Hz with GPS timing and transmitted daily to the QF data center in California for analysis. QF is attempting to detect and characterize anomalous EM activity occurring ahead of earthquakes. In order to analyze this sizable dataset, QF has developed an analytical framework to support processing the time series input data and hypothesis testing to evaluate the statistical significance of potential precursory signals. The framework was developed with a need to support legacy, in-house processing, but with an eye towards big-data processing with Apache Spark and other modern big-data technologies. In this presentation, we describe our framework, which supports rapid experimentation and iteration of candidate signal processing techniques via modular data transformation stages, tracking of provenance, and automatic re-computation of downstream data when upstream data is updated. Furthermore, we discuss how the processing modules can be ported to big-data platforms like Apache Spark and demonstrate a migration path from local, in-house processing to cloud-friendly processing.

  20. Time series modelling and forecasting of emergency department overcrowding.

    PubMed

    Kadri, Farid; Harrou, Fouzi; Chaabane, Sondès; Tahon, Christian

    2014-09-01

    Efficient management of patient flow (demand) in emergency departments (EDs) has become an urgent issue for many hospital administrations. Today, more and more attention is being paid to hospital management systems to optimally manage patient flow and to improve management strategies, efficiency and safety in such establishments. To this end, EDs require significant human and material resources, but unfortunately these are limited. Within such a framework, the ability to accurately forecast demand in emergency departments has considerable implications for hospitals to improve resource allocation and strategic planning. The aim of this study was to develop models for forecasting daily attendances at the hospital emergency department in Lille, France. The study demonstrates how time-series analysis can be used to forecast, at least in the short term, demand for emergency services in a hospital emergency department. The forecasts were based on daily patient attendances at the paediatric emergency department in Lille regional hospital centre, France, from January 2012 to December 2012. An autoregressive integrated moving average (ARIMA) method was applied separately to each of the two GEMSA categories and total patient attendances. Time-series analysis was shown to provide a useful, readily available tool for forecasting emergency department demand.
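
    An illustrative Python version of the forecasting workflow, using a synthetic attendance series and an arbitrary ARIMA order (not the orders identified in the study):

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(7)
    idx = pd.date_range("2012-01-01", "2012-12-31", freq="D")
    weekly = 10 * np.sin(2 * np.pi * np.arange(len(idx)) / 7)   # weekly cycle
    attendances = pd.Series(120 + weekly + rng.normal(0, 8, len(idx)), index=idx)

    model = ARIMA(attendances, order=(2, 0, 1)).fit()
    print(model.forecast(steps=7))      # next 7 days of expected attendances
    ```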

  1. Modeling of electrical and mesoscopic circuits at quantum nanoscale from heat momentum operator

    NASA Astrophysics Data System (ADS)

    El-Nabulsi, Rami Ahmad

    2018-04-01

    We develop a new method to study electrical circuits at quantum nanoscale by introducing a heat momentum operator which reproduces quantum effects similar to those obtained in Suykens's nonlocal-in-time kinetic energy approach for the case of reversible motion. The series expansion of the heat momentum operator is similar to the momentum operator obtained in the framework of minimal length phenomenologies characterized by the deformation of Heisenberg algebra. The quantization of both LC and mesoscopic circuits revealed a number of motivating features like the emergence of a generalized uncertainty relation and a minimal charge similar to those obtained in the framework of minimal length theories. Additional features were obtained and discussed accordingly.

  2. EMAP and EMAGE: a framework for understanding spatially organized data.

    PubMed

    Baldock, Richard A; Bard, Jonathan B L; Burger, Albert; Burton, Nicolas; Christiansen, Jeff; Feng, Guanjie; Hill, Bill; Houghton, Derek; Kaufman, Matthew; Rao, Jianguo; Sharpe, James; Ross, Allyson; Stevenson, Peter; Venkataraman, Shanmugasundaram; Waterhouse, Andrew; Yang, Yiya; Davidson, Duncan R

    2003-01-01

    The Edinburgh Mouse Atlas Project (EMAP) is a time series of mouse-embryo volumetric models. The models provide a context-free spatial framework onto which structural interpretations and experimental data can be mapped. This enables collation, comparison, and query of complex spatial patterns with respect to each other and with respect to known or hypothesized structure. The atlas also includes a time-dependent anatomical ontology and a mapping between the ontology and the spatial models in the form of delineated anatomical regions or tissues. The models provide a natural, graphical context for browsing and visualizing complex data. The Edinburgh Mouse Atlas Gene-Expression Database (EMAGE) is one of the first applications of the EMAP framework and provides a spatially mapped gene-expression database with associated tools for data mapping, submission, and query. In this article, we describe the underlying principles of the Atlas and the gene-expression database, and provide a practical introduction to the use of the EMAP and EMAGE tools, including the use of new techniques for whole-body gene-expression data capture and mapping.

  3. Revealing Real-Time Emotional Responses: a Personalized Assessment based on Heartbeat Dynamics

    NASA Astrophysics Data System (ADS)

    Valenza, Gaetano; Citi, Luca; Lanatá, Antonio; Scilingo, Enzo Pasquale; Barbieri, Riccardo

    2014-05-01

    Emotion recognition through computational modeling and analysis of physiological signals has been widely investigated in the last decade. Most of the proposed emotion recognition systems require relatively long time series of multivariate records and do not provide accurate real-time characterizations using short time series. To overcome these limitations, we propose a novel personalized probabilistic framework able to characterize the emotional state of a subject through the analysis of heartbeat dynamics exclusively. The study includes thirty subjects presented with a set of standardized images gathered from the international affective picture system, alternating levels of arousal and valence. Due to the intrinsic nonlinearity and nonstationarity of the RR interval series, a specific point-process model was devised for instantaneous identification, considering autoregressive nonlinearities up to the third order according to the Wiener-Volterra representation, thus tracking very fast stimulus-response changes. Features from the instantaneous spectrum and bispectrum, as well as the dominant Lyapunov exponent, were extracted and considered as input features to a support vector machine for classification. Results, with emotions estimated every 10 seconds, show an overall accuracy in recognizing four emotional states based on the circumplex model of affect of 79.29%, with 79.15% on the valence axis and 83.55% on the arousal axis.

  4. Revealing real-time emotional responses: a personalized assessment based on heartbeat dynamics.

    PubMed

    Valenza, Gaetano; Citi, Luca; Lanatá, Antonio; Scilingo, Enzo Pasquale; Barbieri, Riccardo

    2014-05-21

    Emotion recognition through computational modeling and analysis of physiological signals has been widely investigated in the last decade. Most of the proposed emotion recognition systems require relatively long time series of multivariate records and do not provide accurate real-time characterizations using short time series. To overcome these limitations, we propose a novel personalized probabilistic framework able to characterize the emotional state of a subject through the analysis of heartbeat dynamics exclusively. The study includes thirty subjects presented with a set of standardized images gathered from the international affective picture system, alternating levels of arousal and valence. Due to the intrinsic nonlinearity and nonstationarity of the RR interval series, a specific point-process model was devised for instantaneous identification, considering autoregressive nonlinearities up to the third order according to the Wiener-Volterra representation, thus tracking very fast stimulus-response changes. Features from the instantaneous spectrum and bispectrum, as well as the dominant Lyapunov exponent, were extracted and considered as input features to a support vector machine for classification. Results, with emotions estimated every 10 seconds, show an overall accuracy in recognizing four emotional states based on the circumplex model of affect of 79.29%, with 79.15% on the valence axis and 83.55% on the arousal axis.

  5. Scale and time dependence of serial correlations in word-length time series of written texts

    NASA Astrophysics Data System (ADS)

    Rodriguez, E.; Aguilar-Cornejo, M.; Femat, R.; Alvarez-Ramirez, J.

    2014-11-01

    This work considered the quantitative analysis of large written texts. To this end, the text was converted into a time series by taking the sequence of word lengths. Detrended fluctuation analysis (DFA) was used for characterizing long-range serial correlations of the time series. Specifically, the DFA was implemented within a rolling-window framework for estimating the variations of the correlation strength, quantified in terms of the scaling exponent, along the text. Also, a filtering derivative was used to compute the dependence of the scaling exponent on the scale. The analysis was applied to three famous English-written literary narrations; namely, Alice in Wonderland (by Lewis Carroll), Dracula (by Bram Stoker) and Sense and Sensibility (by Jane Austen). The results showed that high correlations appear for scales of about 50-200 words, suggesting that at these scales the text contains the strongest coherence. The scaling exponent was not constant along the text, showing important variations with apparent cyclical behavior. An interesting coincidence between the scaling exponent variations and changes in narrative units (e.g., chapters) was found. This suggests that the scaling exponent obtained from the DFA is able to detect changes in narration structure as expressed by the usage of words of different lengths.
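
    A compact sketch of the standard DFA computation (textbook algorithm, not the authors' code): integrate the mean-removed word-length series, detrend it in windows of size s, and regress log fluctuation on log s to obtain the scaling exponent.

    ```python
    import numpy as np

    def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
        y = np.cumsum(x - np.mean(x))              # integrated profile
        flucts = []
        for s in scales:
            n_win = len(y) // s
            rms = []
            for w in range(n_win):
                seg = y[w * s:(w + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
                rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
            flucts.append(np.mean(rms))
        alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
        return alpha

    # stand-in "text": random word lengths give alpha near 0.5 (no correlations)
    word_lengths = np.random.default_rng(3).integers(1, 12, size=5000)
    print(dfa_exponent(word_lengths))
    ```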

  6. Detecting population-environmental interactions with mismatched time series data.

    PubMed

    Ferguson, Jake M; Reichert, Brian E; Fletcher, Robert J; Jager, Henriëtte I

    2017-11-01

    Time series analysis is an essential method for decomposing the influences of density and exogenous factors such as weather and climate on population regulation. However, there has been little work focused on understanding how well commonly collected data can reconstruct the effects of environmental factors on population dynamics. We show that, analogous to similar scale issues in spatial data analysis, coarsely sampled temporal data can fail to detect covariate effects when interactions occur on timescales that are fast relative to the survey period. We propose a method for modeling mismatched time series data that couples high-resolution environmental data to low-resolution abundance data. We illustrate our approach with simulations and by applying it to Florida's southern Snail kite population. Our simulation results show that our method can reliably detect linear environmental effects and that detecting nonlinear effects requires high-resolution covariate data even when the population turnover rate is slow. In the Snail kite analysis, our approach performed among the best in a suite of previously used environmental covariates explaining Snail kite dynamics and was able to detect a potential phenological shift in the environmental dependence of Snail kites. Our work provides a statistical framework for reliably detecting population-environment interactions from coarsely surveyed time series. An important implication of this work is that the low predictability of animal population growth by weather variables found in previous studies may be due, in part, to how these data are utilized as covariates. © 2017 by the Ecological Society of America.

  7. Detecting population–environmental interactions with mismatched time series data

    PubMed Central

    Ferguson, Jake M.; Reichert, Brian E.; Fletcher, Robert J.; Jager, Henriëtte I.

    2017-01-01

    Time series analysis is an essential method for decomposing the influences of density and exogenous factors such as weather and climate on population regulation. However, there has been little work focused on understanding how well commonly collected data can reconstruct the effects of environmental factors on population dynamics. We show that, analogous to similar scale issues in spatial data analysis, coarsely sampled temporal data can fail to detect covariate effects when interactions occur on timescales that are fast relative to the survey period. We propose a method for modeling mismatched time series data that couples high-resolution environmental data to low-resolution abundance data. We illustrate our approach with simulations and by applying it to Florida’s southern Snail kite population. Our simulation results show that our method can reliably detect linear environmental effects and that detecting nonlinear effects requires high-resolution covariate data even when the population turnover rate is slow. In the Snail kite analysis, our approach performed among the best in a suite of previously used environmental covariates explaining Snail kite dynamics and was able to detect a potential phenological shift in the environmental dependence of Snail kites. Our work provides a statistical framework for reliably detecting population–environment interactions from coarsely surveyed time series. An important implication of this work is that the low predictability of animal population growth by weather variables found in previous studies may be due, in part, to how these data are utilized as covariates. PMID:28759123

  8. A data-driven dynamics simulation framework for railway vehicles

    NASA Astrophysics Data System (ADS)

    Nie, Yinyu; Tang, Zhao; Liu, Fengjia; Chang, Jian; Zhang, Jianjun

    2018-03-01

    The finite element (FE) method is essential for simulating vehicle dynamics with fine details, especially for train crash simulations. However, factors such as the complexity of meshes and the distortion involved in a large deformation would undermine its calculation efficiency. An alternative method, multi-body (MB) dynamics simulation, provides satisfying time efficiency but limited accuracy when a highly nonlinear dynamic process is involved. To maintain the advantages of both methods, this paper proposes a data-driven simulation framework for dynamics simulation of railway vehicles. This framework uses machine learning techniques to extract nonlinear features from training data generated by FE simulations, so that specific mesh structures can be formulated by a surrogate element (or surrogate elements) to replace the original mechanical elements, and the dynamics simulation can be implemented by co-simulation with the surrogate element(s) embedded into a MB model. This framework consists of a series of techniques including data collection, feature extraction, training data sampling, surrogate element building, and model evaluation and selection. To verify the feasibility of this framework, we present two case studies, a vertical dynamics simulation and a longitudinal dynamics simulation, based on co-simulation with MATLAB/Simulink and Simpack, and a further comparison with a popular data-driven model (the Kriging model) is provided. The simulation results show that using the Legendre polynomial regression model in building surrogate elements can largely cut down the simulation time without sacrificing accuracy.
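
    The surrogate-element idea can be sketched as follows, assuming hypothetical force-displacement training pairs in place of real FE output; the polynomial degree is likewise an illustrative choice.

    ```python
    import numpy as np
    from numpy.polynomial import legendre

    rng = np.random.default_rng(5)
    disp = np.linspace(-1.0, 1.0, 400)            # normalized displacement samples
    # stand-in for FE-generated training data: a softening nonlinear element
    force = 3.0 * disp - 2.0 * disp ** 3 + 0.02 * rng.normal(size=disp.size)

    coeffs = legendre.legfit(disp, force, deg=5)  # fit the Legendre surrogate
    surrogate_force = lambda u: legendre.legval(u, coeffs)

    # the cheap surrogate replaces the FE element inside the MB dynamics loop
    print(surrogate_force(0.3))
    ```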

  9. The benefit of modeled ozone data for the reconstruction of a 99-year UV radiation time series

    NASA Astrophysics Data System (ADS)

    Junk, J.; Feister, U.; Helbig, A.; Görgen, K.; Rozanov, E.; Krzyścin, J. W.; Hoffmann, L.

    2012-08-01

    Solar erythemal UV radiation (UVER) is highly relevant for numerous biological processes that affect plants, animals, and human health. Nevertheless, long-term UVER records are scarce. As significant declines in the column ozone concentration were observed in the past and a recovery of the stratospheric ozone layer is anticipated by the middle of the 21st century, there is a strong interest in the temporal variation of UVER time series. Therefore, we combined ground-based measurements of different meteorological variables with modeled ozone data sets to reconstruct time series of daily totals of UVER at the Meteorological Observatory, Potsdam, Germany. Artificial neural networks were trained with measured UVER, sunshine duration, the day of year, measured and modeled total column ozone, as well as the minimum solar zenith angle. This allows for the reconstruction of daily totals of UVER for the period from 1901 to 1999. Additionally, analyses of the long-term variations from 1901 until 1999 of the reconstructed, new UVER data set are presented. The time series of monthly and annual totals of UVER provide a long-term meteorological basis for epidemiological investigations in human health and occupational medicine for the region of Potsdam and Berlin. A strong benefit of our ANN approach is the fact that it can be easily adapted to different geographical locations, as successfully tested in the framework of the COST Action 726.

  10. Understanding the amplitudes of noise correlation measurements

    USGS Publications Warehouse

    Tsai, Victor C.

    2011-01-01

    Cross correlation of ambient seismic noise is known to result in time series from which station-station travel-time measurements can be made. Part of the reason that these cross-correlation travel-time measurements are reliable is that there exists a theoretical framework that quantifies how these travel times depend on the features of the ambient noise. However, corresponding theoretical results do not currently exist to describe how the amplitudes of the cross correlation depend on such features. For example, currently it is not possible to take a given distribution of noise sources and calculate the cross correlation amplitudes one would expect from such a distribution. Here, we provide a ray-theoretical framework for calculating cross correlations. This framework differs from previous work in that it explicitly accounts for attenuation as well as the spatial distribution of sources and therefore can address the issue of quantifying amplitudes in noise correlation measurements. After introducing the general framework, we apply it to two specific problems. First, we show that we can quantify the amplitudes of coherency measurements, and find that the decay of coherency with station-station spacing depends crucially on the distribution of noise sources. We suggest that researchers interested in performing attenuation measurements from noise coherency should first determine how the dominant sources of noise are distributed. Second, we show that we can quantify the signal-to-noise ratio of noise correlations more precisely than previous work, and that these signal-to-noise ratios can be estimated for given situations prior to the deployment of seismometers. It is expected that there are applications of the theoretical framework beyond the two specific cases considered, but these applications await future work.

  11. A framework for periodic outlier pattern detection in time-series sequences.

    PubMed

    Rasheed, Faraz; Alhajj, Reda

    2014-05-01

    Periodic pattern detection in time-ordered sequences is an important data mining task, which discovers in the time series all patterns that exhibit temporal regularities. Periodic pattern mining has a large number of applications in real life; it helps in understanding the regular trend of the data along time, and enables the forecasting and prediction of future events. An interesting, related, and vital problem that has not received enough attention is to discover outlier periodic patterns in a time series. Outlier patterns are defined as those which are different from the rest of the patterns; outliers are not noise. While noise does not belong to the data and is mostly eliminated by preprocessing, outliers are actual instances in the data that have exceptional characteristics compared with the majority of the other instances. Outliers are unusual patterns that rarely occur and thus have lower support (frequency of appearance) in the data. Outlier patterns may hint toward discrepancies in the data such as fraudulent transactions, network intrusion, change in customer behavior, recession in the economy, epidemic and disease biomarkers, severe weather conditions like tornados, etc. We argue that detecting the periodicity of outlier patterns might be more important in many sequences than the periodicity of regular, more frequent patterns. In this paper, we present a robust and time-efficient suffix tree-based algorithm capable of detecting the periodicity of outlier patterns in a time series by giving more significance to less frequent yet periodic patterns. Several experiments have been conducted using both real and synthetic data; all aspects of the proposed approach are compared with the existing algorithm InfoMiner; the reported results demonstrate the effectiveness and applicability of the proposed approach.
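
    A toy illustration of the underlying notion of a periodic outlier (not the suffix-tree algorithm of the paper): a rare symbol is declared periodic when the gaps between its occurrences are nearly constant.

    ```python
    import numpy as np

    def outlier_period(seq, symbol, tol=1):
        """Return the period of a rare symbol if its occurrence gaps are stable."""
        pos = np.flatnonzero(np.asarray(seq) == symbol)
        if len(pos) < 3:
            return None
        gaps = np.diff(pos)
        return int(np.median(gaps)) if np.ptp(gaps) <= tol else None

    # 'f' is a rare (outlier) symbol planted every 7 steps in a noisy sequence
    rng = np.random.default_rng(4)
    seq = list(rng.choice(list("abc"), size=70))
    seq[::7] = ["f"] * len(seq[::7])
    print(outlier_period(seq, "f"))    # -> 7
    ```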

  12. ASSESSING MULTIMEDIA/MULTIPATHWAY EXPOSURE TO ARSENIC USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    A series of case studies is presented focusing on multimedia/multipathway population exposures to arsenic, employing the Population Based Modeling approach of the MENTOR (Modeling Environment for Total Risks) framework. This framework currently considers five exposure routes: i...

  13. A Space-Time Network-Based Modeling Framework for Dynamic Unmanned Aerial Vehicle Routing in Traffic Incident Monitoring Applications

    PubMed Central

    Zhang, Jisheng; Jia, Limin; Niu, Shuyun; Zhang, Fan; Tong, Lu; Zhou, Xuesong

    2015-01-01

    It is essential for transportation management centers to equip and manage a network of fixed and mobile sensors in order to quickly detect traffic incidents and further monitor the related impact areas, especially for high-impact accidents with dramatic traffic congestion propagation. As emerging small Unmanned Aerial Vehicles (UAVs) start to enjoy a more flexible regulatory environment, it is critically important to fully explore the potential of using UAVs for monitoring recurring and non-recurring traffic conditions and special events on transportation networks. This paper presents a space-time network-based modeling framework for integrated fixed and mobile sensor networks, in order to provide a rapid and systematic road traffic monitoring mechanism. By constructing a discretized space-time network to characterize not only the speed of UAVs but also the time-sensitive impact areas of traffic congestion, we formulate the problem as a linear integer programming model to minimize the detection delay cost and operational cost, subject to feasible flying route constraints. A Lagrangian relaxation solution framework is developed to decompose the original complex problem into a series of computationally efficient time-dependent, least-cost path-finding sub-problems. Several examples are used to demonstrate the results of the proposed models in UAVs' route planning for small and medium-scale networks. PMID:26076404

  14. A Bayesian Approach for Summarizing and Modeling Time-Series Exposure Data with Left Censoring.

    PubMed

    Houseman, E Andres; Virji, M Abbas

    2017-08-01

    Direct reading instruments are valuable tools for measuring exposure as they provide real-time measurements for rapid decision making. However, their use is limited to general survey applications, in part due to issues related to their performance. Moreover, statistical analysis of real-time data is complicated by autocorrelation among successive measurements, non-stationary time series, and the presence of left-censoring due to the limit of detection (LOD). A Bayesian framework is proposed that accounts for non-stationary autocorrelation and LOD issues in exposure time-series data in order to model workplace factors that affect exposure and to estimate summary statistics for tasks or other covariates of interest. A spline-based approach is used to model non-stationary autocorrelation with relatively few assumptions about the autocorrelation structure. Left-censoring is addressed by integrating over the left tail of the distribution. The model is fit using Markov chain Monte Carlo within a Bayesian paradigm. The method can flexibly account for hierarchical relationships, random effects, and fixed effects of covariates. The method is implemented using the rjags package in R, and is illustrated by applying it to real-time exposure data. Estimates for task means and covariates from the Bayesian model are compared to those from conventional frequentist models including linear regression, mixed-effects, and time-series models with different autocorrelation structures. Simulation studies are also conducted to evaluate method performance. Simulation studies with the percentage of measurements below the LOD ranging from 0 to 50% showed the lowest root-mean-squared errors for task means and the least biased standard deviations from the Bayesian model compared to the frequentist models across all levels of LOD. In the application, task means from the Bayesian model were similar to means from the frequentist models, while the standard deviations were different. Parameter estimates for covariates were significant in some frequentist models, but in the Bayesian model their credible intervals contained zero; such discrepancies were observed in multiple datasets. Variance components from the Bayesian model reflected substantial autocorrelation, consistent with the frequentist models, except for the auto-regressive moving-average model. Plots of means from the Bayesian model showed good fit to the observed data. The proposed Bayesian model provides an approach for modeling non-stationary autocorrelation in a hierarchical modeling framework to estimate task means, standard deviations, quantiles, and parameter estimates for covariates that are less biased and have better performance characteristics than some contemporary methods. Published by Oxford University Press on behalf of the British Occupational Hygiene Society 2017.
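
    The left-censoring treatment, integrating over the left tail of the distribution, can be illustrated with the simplified log-likelihood below; an i.i.d. lognormal is assumed for brevity, whereas the paper's model additionally handles non-stationary autocorrelation and hierarchy (and is fit with rjags, not this sketch).

    ```python
    import numpy as np
    from scipy import stats

    def censored_loglik(params, obs, lod):
        """obs: measurements with non-detects recorded as np.nan."""
        mu, sigma = params
        x = np.log(obs[~np.isnan(obs)])          # detected values
        n_cens = int(np.isnan(obs).sum())        # values below the LOD
        # detects contribute the density; non-detects the CDF at the LOD
        return (stats.norm.logpdf(x, mu, sigma).sum()
                + n_cens * stats.norm.logcdf(np.log(lod), mu, sigma))

    obs = np.array([0.8, np.nan, 1.2, np.nan, 2.5])   # nan marks a non-detect
    print(censored_loglik((0.0, 1.0), obs, lod=0.5))
    ```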

  15. Nonlinear degradation of a visible-light communication link: A Volterra-series approach

    NASA Astrophysics Data System (ADS)

    Kamalakis, Thomas; Dede, Georgia

    2018-06-01

    Visible light communications can be used to provide illumination and data communication at the same time. In this paper, a reverse-engineering approach is presented for assessing the impact of nonlinear signal distortion in visible light communication links. The approach is based on the Volterra series expansion and has the advantage of accurately accounting for memory effects, in contrast to the static nonlinear models that are popular in the literature. Volterra kernels describe the end-to-end system response and can be inferred from measurements. Consequently, this approach does not rely on any particular physical models or assumptions regarding the individual link components. We provide the necessary framework for estimating the nonlinear distortion on the symbol estimates of a discrete multitone modulated link. Various design aspects such as waveform clipping and predistortion are also incorporated in the analysis. Using this framework, the nonlinear signal-to-interference ratio is calculated for the system at hand. It is shown that at high signal amplitudes, the nonlinear signal-to-interference ratio can fall below 25 dB.
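
    A minimal numpy sketch of a discrete second-order Volterra model with memory (the kernels below are placeholders; in the paper they are inferred from link measurements):

    ```python
    import numpy as np

    def volterra2(x, h1, h2):
        """x: input samples; h1: (M,) linear kernel; h2: (M, M) quadratic kernel."""
        M = len(h1)
        y = np.zeros(len(x))
        for n in range(M, len(x)):
            past = x[n - M + 1:n + 1][::-1]        # x[n], x[n-1], ..., x[n-M+1]
            y[n] = h1 @ past + past @ h2 @ past    # 1st- + 2nd-order contributions
        return y

    M = 4
    h1 = np.array([0.9, 0.2, 0.05, 0.01])          # placeholder memory taps
    h2 = 0.05 * np.outer(h1, h1)                   # placeholder quadratic kernel
    x = np.random.default_rng(2).normal(size=1000)
    y = volterra2(x, h1, h2)                       # nonlinearly distorted output
    ```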

  16. A Framework to Debug Diagnostic Matrices

    NASA Technical Reports Server (NTRS)

    Kodal, Anuradha; Robinson, Peter; Patterson-Hine, Ann

    2013-01-01

    Diagnostics is an important concept in system health and monitoring of space operations. Many existing diagnostic algorithms utilize system knowledge in the form of a diagnostic matrix (D-matrix, also known as a diagnostic dictionary, fault signature matrix, or reachability matrix) gleaned from physical models. Sometimes, however, this knowledge is not coherent enough to obtain high diagnostic performance. In such a case, it is important to modify the D-matrix based on knowledge obtained from other sources, such as time-series data streams (simulated or maintenance data), within the context of a framework that includes the diagnostic/inference algorithm. A systematic and sequential update procedure, the diagnostic modeling evaluator (DME), is proposed to modify the D-matrix and wrapper logic, considering the least expensive solution first. This iterative procedure includes conditions ranging from modifying 0s and 1s in the matrix to adding/removing rows (failure sources) and columns (tests). We experiment with this framework on datasets from the DX Challenge 2009.

  17. Observing climate change trends in ocean biogeochemistry: when and where.

    PubMed

    Henson, Stephanie A; Beaulieu, Claudie; Lampitt, Richard

    2016-04-01

    Understanding the influence of anthropogenic forcing on the marine biosphere is a high priority. Climate change-driven trends need to be accurately assessed and detected in a timely manner. As part of the effort towards detection of long-term trends, a network of ocean observatories and time series stations provide high quality data for a number of key parameters, such as pH, oxygen concentration or primary production (PP). Here, we use an ensemble of global coupled climate models to assess the temporal and spatial scales over which observations of eight biogeochemically relevant variables must be made to robustly detect a long-term trend. We find that, as a global average, continuous time series are required for between 14 (pH) and 32 (PP) years to distinguish a climate change trend from natural variability. Regional differences are extensive, with low latitudes and the Arctic generally needing shorter time series (<~30 years) to detect trends than other areas. In addition, we quantify the 'footprint' of existing and planned time series stations, that is the area over which a station is representative of a broader region. Footprints are generally largest for pH and sea surface temperature, but nevertheless the existing network of observatories only represents 9-15% of the global ocean surface. Our results present a quantitative framework for assessing the adequacy of current and future ocean observing networks for detection and monitoring of climate change-driven responses in the marine ecosystem. © 2016 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.

  18. Discovering significant evolution patterns from satellite image time series.

    PubMed

    Petitjean, François; Masseglia, Florent; Gançarski, Pierre; Forestier, Germain

    2011-12-01

    Satellite Image Time Series (SITS) provide us with precious information on land cover evolution. By studying these series of images we can both understand the changes of specific areas and discover global phenomena that spread over larger areas. Changes that occur throughout the sensing time can spread over very long periods and may have different start and end times depending on the location, which complicates the mining and analysis of series of images. This work focuses on frequent sequential pattern mining (FSPM) methods, since this family of methods fits the above-mentioned issues: it finds the most frequent evolution behaviors and is able to extract long-term changes as well as short-term ones, whenever the change may start and end. However, applying FSPM methods to SITS implies confronting two main challenges, related to the characteristics of SITS and the domain's constraints. First, satellite images associate multiple measures with a single pixel (the radiometric levels of different wavelengths corresponding to infra-red, red, etc.), which makes the search space multi-dimensional and thus requires specific mining algorithms. Furthermore, the non-evolving regions, which are the vast majority and overwhelm the evolving ones, challenge the discovery of these patterns. We propose a SITS mining framework that enables discovery of these patterns despite these constraints and characteristics. Our proposal is inspired by FSPM and provides a relevant visualization principle. Experiments carried out on 35 images sensed over 20 years show that the proposed approach makes it possible to extract relevant evolution behaviors.

  19. Data-driven Analysis and Prediction of Arctic Sea Ice

    NASA Astrophysics Data System (ADS)

    Kondrashov, D. A.; Chekroun, M.; Ghil, M.; Yuan, X.; Ting, M.

    2015-12-01

    We present results of data-driven predictive analyses of sea ice over the main Arctic regions. Our approach relies on the Multilayer Stochastic Modeling (MSM) framework of Kondrashov, Chekroun and Ghil [Physica D, 2015] and leads to prognostic models of sea ice concentration (SIC) anomalies on seasonal time scales. This approach is applied to monthly time series of leading principal components from the multivariate Empirical Orthogonal Function decomposition of SIC and selected climate variables over the Arctic. We evaluate the predictive skill of MSM models by performing retrospective forecasts with "no look-ahead" for up to six months ahead. It will be shown, in particular, that the memory effects included in our non-Markovian linear MSM models improve predictions of large-amplitude SIC anomalies in certain Arctic regions. Further improvements allowed by the MSM framework will adopt a nonlinear formulation, as well as alternative data-adaptive decompositions.

  20. History-dependent friction and slow slip from time-dependent microscopic junction laws studied in a statistical framework

    NASA Astrophysics Data System (ADS)

    Thøgersen, Kjetil; Trømborg, Jørgen Kjoshagen; Sveinsson, Henrik Andersen; Malthe-Sørenssen, Anders; Scheibert, Julien

    2014-05-01

    To study how macroscopic friction phenomena originate from microscopic junction laws, we introduce a general statistical framework describing the collective behavior of a large number of individual microjunctions forming a macroscopic frictional interface. Each microjunction can switch in time between two states: a pinned state characterized by a displacement-dependent force and a slipping state characterized by a time-dependent force. Instead of tracking each microjunction individually, the state of the interface is described by two coupled distributions for (i) the stretching of pinned junctions and (ii) the time spent in the slipping state. This framework allows for a whole family of microjunction behavior laws, and we show how it represents an overarching structure for many existing models found in the friction literature. We then use this framework to pinpoint the effects of the time scale that controls the duration of the slipping state. First, we show that the model reproduces a series of friction phenomena already observed experimentally. The macroscopic steady-state friction force is velocity dependent, either monotonic (strengthening or weakening) or nonmonotonic (weakening-strengthening), depending on the microscopic behavior of individual junctions. In addition, slow slip, which has been reported in a wide variety of systems, spontaneously occurs in the model if the friction contribution from junctions in the slipping state is time weakening. Next, we show that the model predicts a nontrivial history dependence of the macroscopic static friction force. In particular, the static friction coefficient at the onset of sliding is shown to increase with increasing deceleration during the final phases of the preceding sliding event. We suggest that this form of history dependence of static friction should be investigated in experiments, and we provide the acceleration range in which this effect is expected to be experimentally observable.

  1. History-dependent friction and slow slip from time-dependent microscopic junction laws studied in a statistical framework.

    PubMed

    Thøgersen, Kjetil; Trømborg, Jørgen Kjoshagen; Sveinsson, Henrik Andersen; Malthe-Sørenssen, Anders; Scheibert, Julien

    2014-05-01

    To study how macroscopic friction phenomena originate from microscopic junction laws, we introduce a general statistical framework describing the collective behavior of a large number of individual microjunctions forming a macroscopic frictional interface. Each microjunction can switch in time between two states: a pinned state characterized by a displacement-dependent force and a slipping state characterized by a time-dependent force. Instead of tracking each microjunction individually, the state of the interface is described by two coupled distributions for (i) the stretching of pinned junctions and (ii) the time spent in the slipping state. This framework allows for a whole family of microjunction behavior laws, and we show how it represents an overarching structure for many existing models found in the friction literature. We then use this framework to pinpoint the effects of the time scale that controls the duration of the slipping state. First, we show that the model reproduces a series of friction phenomena already observed experimentally. The macroscopic steady-state friction force is velocity dependent, either monotonic (strengthening or weakening) or nonmonotonic (weakening-strengthening), depending on the microscopic behavior of individual junctions. In addition, slow slip, which has been reported in a wide variety of systems, spontaneously occurs in the model if the friction contribution from junctions in the slipping state is time weakening. Next, we show that the model predicts a nontrivial history dependence of the macroscopic static friction force. In particular, the static friction coefficient at the onset of sliding is shown to increase with increasing deceleration during the final phases of the preceding sliding event. We suggest that this form of history dependence of static friction should be investigated in experiments, and we provide the acceleration range in which this effect is expected to be experimentally observable.

  2. Long term economic relationships from cointegration maps

    NASA Astrophysics Data System (ADS)

    Vicente, Renato; Pereira, Carlos de B.; Leite, Vitor B. P.; Caticha, Nestor

    2007-07-01

    We employ the Bayesian framework to define a cointegration measure aimed to represent long term relationships between time series. For visualization of these relationships we introduce a dissimilarity matrix and a map based on the sorting points into neighborhoods (SPIN) technique, which has been previously used to analyze large data sets from DNA arrays. We exemplify the technique in three data sets: US interest rates (USIR), monthly inflation rates and gross domestic product (GDP) growth rates.
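
    For illustration, a conventional Engle-Granger cointegration test (via statsmodels) on two synthetic series sharing a stochastic trend; the paper's Bayesian cointegration measure differs, but a pairwise screen of this kind is the sort of input a dissimilarity matrix could be built from.

    ```python
    import numpy as np
    from statsmodels.tsa.stattools import coint

    rng = np.random.default_rng(11)
    common = np.cumsum(rng.normal(size=500))     # shared stochastic trend
    rate_a = common + rng.normal(size=500)
    rate_b = 0.8 * common + rng.normal(size=500)

    t_stat, p_value, _ = coint(rate_a, rate_b)
    print(p_value)     # small p-value => evidence of a long-term relationship
    ```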

  3. RadVel: General toolkit for modeling Radial Velocities

    NASA Astrophysics Data System (ADS)

    Fulton, Benjamin J.; Petigura, Erik A.; Blunt, Sarah; Sinukoff, Evan

    2018-01-01

    RadVel models Keplerian orbits in radial velocity (RV) time series. The code is written in Python with a fast Kepler's equation solver written in C. It provides a framework for fitting RVs using maximum a posteriori optimization and computing robust confidence intervals by sampling the posterior probability density via Markov Chain Monte Carlo (MCMC). RadVel can perform Bayesian model comparison and produces publication quality plots and LaTeX tables.

  4. The Superstatistical Nature and Interoccurrence Time of Atmospheric Mercury Concentration Fluctuations

    NASA Astrophysics Data System (ADS)

    Carbone, F.; Bruno, A. G.; Naccarato, A.; De Simone, F.; Gencarelli, C. N.; Sprovieri, F.; Hedgecock, I. M.; Landis, M. S.; Skov, H.; Pfaffhuber, K. A.; Read, K. A.; Martin, L.; Angot, H.; Dommergue, A.; Magand, O.; Pirrone, N.

    2018-01-01

    The probability density function (PDF) of the time intervals between subsequent extreme events in atmospheric Hg0 concentration data series from different latitudes has been investigated. The Hg0 dynamics possess a long-term-memory autocorrelation function. Above a fixed threshold Q in the data, the PDFs of the interoccurrence times of the Hg0 data are well described by a Tsallis q-exponential function. This PDF behavior has been explained in the framework of superstatistics, where the competition between multiple mesoscopic processes affects the macroscopic dynamics. An extensive parameter μ, encompassing all possible fluctuations related to mesoscopic phenomena, has been identified. It follows a χ2 distribution, indicative of the superstatistical nature of the overall process. Shuffling the data series destroys the long-term memory, the distributions become independent of Q, and the PDFs collapse onto the same exponential distribution. The possible central role of atmospheric turbulence in extreme events in the Hg0 data is highlighted.

  5. Emergency material allocation with time-varying supply-demand based on dynamic optimization method for river chemical spills.

    PubMed

    Liu, Jie; Guo, Liang; Jiang, Jiping; Jiang, Dexun; Wang, Peng

    2018-04-13

    Aiming to minimize the damage caused by river chemical spills, efficient emergency material allocation is critical for actual emergency rescue decision-making under quick-response requirements. In this study, an emergency material allocation framework based on time-varying supply-demand constraints is developed to allocate emergency material, minimize the emergency response time, and satisfy the dynamic emergency material requirements in post-accident phases of river chemical spills. First, the theoretically critical emergency response time is obtained, allowing the emergency material allocation system to select a series of appropriate emergency material warehouses as potential support centers. Then, an enumeration method is applied to identify the practically critical emergency response time and the optimal emergency material allocation and replenishment scheme. Finally, the developed framework is applied to a computational experiment based on the South-to-North Water Transfer Project in China. The results illustrate that the proposed methodology is a simple and flexible tool for appropriately allocating emergency material to satisfy time-dynamic demands during emergency decision-making. Therefore, decision-makers can identify an appropriate emergency material allocation scheme balancing time-effective and cost-effective objectives under different emergency pollution conditions.

  6. Monitoring performance of a highly distributed and complex computing infrastructure in LHCb

    NASA Astrophysics Data System (ADS)

    Mathe, Z.; Haen, C.; Stagni, F.

    2017-10-01

    In order to ensure an optimal performance of the LHCb Distributed Computing, based on LHCbDIRAC, it is necessary to be able to inspect the behavior over time of many components: not only the agents and services on which the infrastructure is built, but also all the computing tasks and data transfers that are managed by this infrastructure. This entails recording and then analyzing time series of a large number of observables, for which the usage of SQL relational databases is far from optimal. Therefore, within DIRAC we have been studying novel possibilities based on NoSQL databases (ElasticSearch, OpenTSDB and InfluxDB); as a result of this study, we developed a new monitoring system based on ElasticSearch. It has been deployed on the LHCb Distributed Computing infrastructure, for which it collects data from all the components (agents, services, jobs) and allows creating reports through Kibana and a web user interface based on the DIRAC web framework. In this paper we describe this new implementation of the DIRAC monitoring system. We give details on the ElasticSearch implementation within the general DIRAC framework, as well as an overview of the advantages of the pipeline aggregation used for creating a dynamic bucketing of the time series. We present the advantages of using the ElasticSearch DSL high-level library for creating and running queries. Finally, we present the performance of the system.
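
    A hedged example of the kind of query such pipeline aggregations enable: a date_histogram bucketing with a per-bucket average and a derivative pipeline aggregation on top. Index and field names are invented for illustration; this is generic Elasticsearch DSL, not LHCbDIRAC code.

    ```python
    # Elasticsearch query body (standard aggregation DSL); field names are hypothetical.
    query = {
        "size": 0,
        "aggs": {
            "per_hour": {
                "date_histogram": {"field": "timestamp", "fixed_interval": "1h"},
                "aggs": {
                    "avg_running_jobs": {"avg": {"field": "running_jobs"}},
                    # pipeline aggregation: bucket-to-bucket rate of change
                    "jobs_rate": {"derivative": {"buckets_path": "avg_running_jobs"}},
                },
            }
        },
    }
    # e.g. es.search(index="lhcb-monitoring", body=query) with the elasticsearch client
    ```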

  7. MONITORING AND ASSESSING THE CONDITION OF AQUATIC RESOURCES: ROLE OF COMPLEX SURVEY DESIGN AND ANALYSIS

    EPA Science Inventory

    The National Water Quality Monitoring Council (NWQMC) developed a common framework for aquatic resource monitoring. The framework is described in a series of articles published in Water Resources IMPACT, September, 2003. One objective of the framework is to encourage consistenc...

  8. VizieR Online Data Catalog: RR Lyrae in SDSS Stripe 82 (Suveges+, 2012)

    NASA Astrophysics Data System (ADS)

    Suveges, M.; Sesar, B.; Varadi, M.; Mowlavi, N.; Becker, A. C.; Ivezic, Z.; Beck, M.; Nienartowicz, K.; Rimoldini, L.; Dubath, P.; Bartholdi, P.; Eyer, L.

    2013-05-01

    We propose a robust principal component analysis framework for the exploitation of multiband photometric measurements in large surveys. Period search results are improved using the time-series of the first principal component due to its optimized signal-to-noise ratio. The presence of correlated excess variations in the multivariate time-series enables the detection of weaker variability. Furthermore, the direction of the largest variance differs for certain types of variable stars. This can be used as an efficient attribute for classification. The application of the method to a subsample of Sloan Digital Sky Survey Stripe 82 data yielded 132 high-amplitude delta Scuti variables. We also found 129 new RR Lyrae variables, complementary to the catalogue of Sesar et al., extending the halo area mapped by Stripe 82 RR Lyrae stars towards the Galactic bulge. The sample also comprises 25 multiperiodic or Blazhko RR Lyrae stars. (8 data files).
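
    An illustrative pipeline for the first-principal-component period search (synthetic five-band data; the real analysis operates on calibrated Stripe 82 photometry):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from astropy.timeseries import LombScargle

    rng = np.random.default_rng(8)
    t = np.sort(rng.uniform(0, 100, 300))                       # observation epochs
    signal = np.sin(2 * np.pi * t / 0.55)                       # P = 0.55 d variable
    bands = np.column_stack([a * signal + rng.normal(0, 0.3, t.size)
                             for a in (1.0, 0.9, 0.8, 0.7, 0.6)])  # u,g,r,i,z-like

    pc1 = PCA(n_components=1).fit_transform(bands).ravel()      # optimized-S/N series
    freq, power = LombScargle(t, pc1).autopower()
    print(1.0 / freq[np.argmax(power)])                         # recovered period
    ```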

  9. The Many Hazards of Trend Evaluation

    NASA Astrophysics Data System (ADS)

    Henebry, G. M.; de Beurs, K.; Zhang, X.; Kimball, J. S.; Small, C.

    2014-12-01

    Given the awareness in the scientific community of global scale drivers such as population growth, globalization, and climatic variation and change, many studies seek to identify temporal patterns in data that may be plausibly related to one or more aspect of global change. Here we explore two questions: "What constitutes a trend in a time series?" and "How can a trend be misinterpreted?" There are manifold hazards—both methodological and psychological—in detecting a trend, quantifying its magnitude, assessing its significance, identifying probable causes, and evaluating the implications of the trend. These hazards can combine to elevate the risk of misinterpreting the trend. In contrast, evaluation of multiple trends within a biogeophysical framework can attenuate the risk of misinterpretation. We review and illustrate these hazards and demonstrate the efficacy of an approach using multiple indicators detecting significant trends (MIDST) applied to time series of remote sensing data products.

  10. Framework for a space shuttle main engine health monitoring system

    NASA Technical Reports Server (NTRS)

    Hawman, Michael W.; Galinaitis, William S.; Tulpule, Sharayu; Mattedi, Anita K.; Kamenetz, Jeffrey

    1990-01-01

    A framework developed for a health management system (HMS) which is directed at improving the safety of operation of the Space Shuttle Main Engine (SSME) is summarized. An emphasis was placed on near term technology through requirements to use existing SSME instrumentation and to demonstrate the HMS during SSME ground tests within five years. The HMS framework was developed through an analysis of SSME failure modes, fault detection algorithms, sensor technologies, and hardware architectures. A key feature of the HMS framework design is that a clear path from the ground test system to a flight HMS was maintained. Fault detection techniques based on time series, nonlinear regression, and clustering algorithms were developed and demonstrated on data from SSME ground test failures. The fault detection algorithms exhibited 100 percent detection of faults, had an extremely low false alarm rate, and were robust to sensor loss. These algorithms were incorporated into a hierarchical decision making strategy for overall assessment of SSME health. A preliminary design for a hardware architecture capable of supporting real time operation of the HMS functions was developed. Utilizing modular, commercial off-the-shelf components produced a reliable low cost design with the flexibility to incorporate advances in algorithm and sensor technology as they become available.

  11. Biogeochemical Response to Mesoscale Physical Forcing in the California Current System

    NASA Technical Reports Server (NTRS)

    Niiler, Pearn P.; Letelier, Ricardo; Moisan, John R.; Marra, John A. (Technical Monitor)

    2001-01-01

    In the first part of the project, we investigated the local response of coastal ocean ecosystems (changes in chlorophyll concentration and chlorophyll fluorescence quantum yield) to physical forcing by developing and deploying Autonomous Drifting Ocean Stations (ADOS) within several mesoscale features along the U.S. west coast. We also compared the temporal and spatial variability registered by sensors mounted on the drifters to that registered by sensors mounted on satellites, in order to assess the scales of variability that are not resolved by the ocean color satellite. The second part of the project used the existing WOCE SVP Surface Lagrangian drifters to track individual water parcels through time. The individual drifter tracks were used to generate multivariate time series by interpolating/extracting the biological and physical data fields retrieved by remote sensors (ocean color, SST, wind speed and direction, wind stress curl, and sea level topography). The individual time series of the physical data (AVHRR, TOPEX, NCEP) were analyzed against the ocean color (SeaWiFS) time series to determine the time scale of the biological response to the physical forcing. The results from this part of the research are being used to compare the decorrelation scales of chlorophyll from Lagrangian and Eulerian frameworks. The results from both parts of this research augmented the time series data needed to investigate the interactions between ocean mesoscale features, wind, and biogeochemical processes. Using the historical Lagrangian data sets, we have completed a comparison of the decorrelation scales in both the Eulerian and Lagrangian reference frames for the SeaWiFS data set. We are continuing to investigate how these results might be used in objective mapping efforts.
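
    One standard way to estimate a decorrelation scale of the kind compared here is the lag at which the sample autocorrelation first falls below 1/e. The sketch below applies that definition to a synthetic AR(1) series standing in for a chlorophyll record along a drifter track; the data and the 1/e convention are assumptions, not the authors' exact procedure.

        import numpy as np

        def decorrelation_scale(x, dt=1.0):
            """Lag (in time units) where the autocorrelation first drops below 1/e."""
            x = np.asarray(x, float) - np.mean(x)
            acf = np.correlate(x, x, mode="full")[x.size - 1:]
            acf /= acf[0]
            below = np.nonzero(acf < 1.0 / np.e)[0]
            return below[0] * dt if below.size else np.nan

        rng = np.random.default_rng(2)
        # AR(1) stand-in for a chlorophyll series sampled along a drifter track
        x = np.zeros(500)
        for k in range(1, 500):
            x[k] = 0.9 * x[k - 1] + rng.standard_normal()
        print("decorrelation scale:", decorrelation_scale(x, dt=1.0), "days")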

  12. Literature Frameworks: From Apples to Zoos. Second Edition. Professional Growth Series.

    ERIC Educational Resources Information Center

    McElmeel, Sharron L.

    Intended for use by teachers in preschool through intermediate grade classrooms, library media specialists, and other educators, this book presents 26 literature frameworks that involve students in reading and sharing literature in connection with thematic units. Each framework suggests books and literature-related materials to coincide with…

  13. Literature Frameworks--From Apples to Zoos. Professional Growth Series.

    ERIC Educational Resources Information Center

    McElmeel, Sharron L.

    Intended for use by teachers in preschool through intermediate grade classrooms, library media specialists, and other educators, this book presents 26 literature frameworks that involve students in reading and sharing literature in connection with thematic units. Each framework suggests books and literature-related materials to coincide with…

  14. Frameworks for organizing exposure and toxicity data - the Aggregate Exposure Pathway (AEP) and the Adverse Outcome Pathway (AOP)

    EPA Science Inventory

    The Adverse Outcome Pathway (AOP) framework organizes existing knowledge regarding a series of biological events, starting with a molecular initiating event (MIE) and ending at an adverse outcome. The AOP framework provides a biological context to interpret in vitro toxicity dat...

  15. New anion-templated 3D heterobimetallic open frameworks based on lanthanide-carboxylate layers and copper pillars

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Yunwu; Wang Yonghui; Li Yangguang

    2008-06-15

    A series of new three-dimensional (3D) lanthanide-transition metal (4f-3d) heterobimetallic open frameworks, [Ln2(1,2-bdc)2(H2O)2Cu(inic)2](ClO4) (Ln = Eu (1), Tb (2), Nd (3) and Sm (4); 1,2-bdc = 1,2-benzenedicarboxylate; Hinic = isonicotinic acid), have been hydrothermally synthesized and characterized by elemental analysis, IR, TG and single-crystal X-ray diffraction analysis. Compounds 1-4 are isostructural. They possess a new anion-templated 3D heterobimetallic open framework, which is observed for the first time in the {Ln/TM/bdc/inic} (TM = transition metal) system. Compounds 1 and 2 exhibit the characteristic fluorescent properties of Eu(III) and Tb(III), respectively. - Graphical abstract: A series of new anion-templated 3D heterobimetallic open frameworks based on the lanthanide-carboxylate layers and copper(I)-inic pillars, [Ln2(1,2-bdc)2(H2O)2Cu(inic)2](ClO4) (Ln = Eu (1), Tb (2), Nd (3) and Sm (4); 1,2-bdc = 1,2-benzenedicarboxylate; Hinic = isonicotinic acid), have been hydrothermally synthesized and structurally characterized, among which compounds 1 and 2 exhibit good fluorescent properties.

  16. “Smooth” Semiparametric Regression Analysis for Arbitrarily Censored Time-to-Event Data

    PubMed Central

    Zhang, Min; Davidian, Marie

    2008-01-01

    Summary A general framework for regression analysis of time-to-event data subject to arbitrary patterns of censoring is proposed. The approach is relevant when the analyst is willing to assume that distributions governing model components that are ordinarily left unspecified in popular semiparametric regression models, such as the baseline hazard function in the proportional hazards model, have densities satisfying mild “smoothness” conditions. Densities are approximated by a truncated series expansion that, for fixed degree of truncation, results in a “parametric” representation, which makes likelihood-based inference coupled with adaptive choice of the degree of truncation, and hence flexibility of the model, computationally and conceptually straightforward with data subject to any pattern of censoring. The formulation allows popular models, such as the proportional hazards, proportional odds, and accelerated failure time models, to be placed in a common framework; provides a principled basis for choosing among them; and renders useful extensions of the models straightforward. The utility and performance of the methods are demonstrated via simulations and by application to data from time-to-event studies. PMID:17970813

  17. A spatial assessment framework for evaluating flood risk under extreme climates.

    PubMed

    Chen, Yun; Liu, Rui; Barrett, Damian; Gao, Lei; Zhou, Mingwei; Renzullo, Luigi; Emelyanova, Irina

    2015-12-15

    Australian coal mines have been facing a major challenge of increasing risk of flooding caused by intensive rainfall events in recent years. In light of growing climate change concerns and the predicted escalation of flooding, estimating flood inundation risk is essential for sustainable mine water management in the Australian mining sector. This research develops a spatial multi-criteria decision making prototype for the evaluation of flooding risk at a regional scale, using the Bowen Basin and its surroundings in Queensland as a case study. Spatial gridded data, including climate, hydrology, topography, vegetation and soils, were collected and processed in ArcGIS. Several indices were derived based on time series of observations and spatial modeling, accounting for extreme rainfall, evapotranspiration, stream flow, potential soil water retention, elevation and slope generated from a digital elevation model (DEM), as well as drainage density and proximity extracted from a river network. These spatial indices were weighted using the analytical hierarchy process (AHP) and integrated in an AHP-based suitability assessment (AHP-SA) model under the spatial risk evaluation framework. A regional flooding risk map was delineated to represent the likely impacts of the criterion indices at different risk levels, and was verified against the maximum inundation extent detectable in a time series of remote sensing imagery. The result provides baseline information to help Bowen Basin coal mines identify and assess flooding risk when developing adaptation strategies and implementing mitigation measures in the future. The framework and methodology developed in this research offer the Australian mining industry, and social and environmental studies around the world, an effective way to produce reliable assessments of flood risk for managing uncertainty in water availability under climate change. Copyright © 2015. Published by Elsevier B.V.
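
    As a hedged illustration of the AHP weighting step, the sketch below derives criterion weights from a hypothetical pairwise comparison matrix via its principal eigenvector and checks Saaty's consistency ratio; the criteria and judgments are invented.

        import numpy as np

        # Hypothetical pairwise comparison matrix for three flood-risk criteria
        # (rainfall, drainage density, slope); Saaty's 1-9 scale, reciprocal entries.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                                   # criterion weights

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
        cr = ci / 0.58                                 # random index RI = 0.58 for n = 3
        print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))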

  18. Highly Efficient Cooperative Catalysis by Co(III)(Porphyrin) Pairs in Interpenetrating Metal-Organic Frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Zekai; Zhang, Zhi-Ming; Chen, Yu-Sheng

    2016-12-02

    A series of porous twofold interpenetrated In-Co(III)(porphyrin) metal-organic frameworks (MOFs) were constructed by in situ metalation of porphyrin bridging ligands and used as efficient cooperative catalysts for the hydration of terminal alkynes. The twofold interpenetrating structure brings adjacent Co(III)(porphyrins) in the two networks parallel to each other at a distance of about 8.8 Å, an ideal distance for the simultaneous activation of both substrates in alkyne hydration reactions. As a result, the In-Co(III)(porphyrin) MOFs exhibit much higher (up to 38 times) catalytic activity than either homogeneous catalysts or MOF controls with isolated Co(III)(porphyrin) centers, thus highlighting the potential application of MOFs in cooperative catalysis.

  19. On the space monitoring of the Sassi and Murgia Park (Matera UNESCO site) using LANDSAT time series

    NASA Astrophysics Data System (ADS)

    Lanorte, Antonio; Nole, Gabriele; Manzi, Teresa; Lasaponara, Rosa

    2015-04-01

    This paper focuses on the preliminary results obtained in the framework of the great relevance project "Smart management of cultural heritage sites in Italy and Argentina: Earth Observation and pilot projects", funded by the Ministero degli Affari Esteri e della Cooperazione Internazionale (MAE), 17/04/2014, Prot. nr. 0090692, 2014-2016. A temporal series (1999-2011) of LANDSAT TM data was used to carry out investigations in the Murgia Park enclosed in the Matera UNESCO site. Principal component analysis (PCA) was adopted to enhance regions of localized change in multi-temporal data sets (Lasaponara 2006). Both naturally vegetated areas (forest, shrub-land, herbaceous cover) and agricultural lands have been investigated in order to extract the most prominent natural and/or man-induced alterations affecting soil and vegetation cover. Results from the PCA were compared with independent data sets and field surveys to evaluate the reliability of the obtained maps of the ongoing land degradation phenomenon. Such analyses can provide valuable information for operational monitoring of the status of vegetation, which is an indicator of the degree of stress, namely any disturbance that adversely influences plants, in response to natural hazards and/or anthropogenic activities. Our findings suggest that the Landsat TM time series can provide valuable information for environmental management policies involving biodiversity preservation and the rational exploitation of natural and agricultural resources. Acknowledgement: This research was performed within the framework of the great relevance project "Smart management of cultural heritage sites in Italy and Argentina: Earth Observation and pilot projects", funded by the Ministero degli Affari Esteri e della Cooperazione Internazionale (MAE), 17/04/2014, Prot. nr. 0090692, 2014-2016.

  20. Status on the Development of a Modeling and Simulation Framework for the Economic Assessment of Nuclear Hybrid Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Epiney, Aaron Simon; Chen, Jun; Rabiti, Cristian

    Continued effort to design and build a modeling and simulation framework to assess the economic viability of Nuclear Hybrid Energy Systems (NHES) was undertaken in fiscal year (FY) 2016. The purpose of this report is to document the various tasks associated with the development of such a framework and to provide a status of their progress. Several tasks have been accomplished. First, a synthetic time history generator has been developed in RAVEN, consisting of a Fourier series and an autoregressive moving average (ARMA) model. The former is used to capture the seasonal trend in historical data, while the latter characterizes the autocorrelation in the residual time series (e.g., measurements with seasonal trends subtracted). As a demonstration, both synthetic wind speed and grid demand are generated, showing statistics that match the database. In order to build a design and operations optimizer in RAVEN, a new type of sampler has been developed with a highly object-oriented design. In particular, a simultaneous perturbation stochastic approximation algorithm is implemented. The optimizer is capable of driving the model to optimize a scalar objective function without constraints in the input space; constraint handling is a work in progress and will be implemented to improve the optimization capability. Furthermore, a simplified cash flow model of the performance of an NHES in the electric market has been developed in Python and used as an external model in RAVEN to confirm expectations of RAVEN's capability to provide insight into system economics and to test RAVEN's capability to identify limit surfaces. Finally, an example calculation is performed that shows the integration and proper data passing in RAVEN of the synthetic time history generator, the cash flow model and the optimizer. It has been shown that the developed Python models external to RAVEN are able to communicate with RAVEN and each other through the newly developed RAVEN capability called "EnsembleModel".
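
    The RAVEN generator itself is not reproduced here, but the Fourier-plus-ARMA idea can be sketched in a few lines: fit seasonal harmonics by least squares, then superpose an ARMA process on the de-seasonalized residual. All periods, coefficients and noise levels below are assumptions for the example.

        import numpy as np
        from statsmodels.tsa.arima_process import ArmaProcess

        rng = np.random.default_rng(3)
        t = np.arange(24 * 365)                        # hourly grid demand, one year

        # Step 1: least-squares Fourier fit of the seasonal trend (daily + yearly terms)
        periods = [24.0, 24.0 * 365]
        cols = [np.ones_like(t, dtype=float)]
        for P in periods:
            cols += [np.sin(2 * np.pi * t / P), np.cos(2 * np.pi * t / P)]
        F = np.column_stack(cols)
        demand = 50 + 10 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 2, t.size)
        coef, *_ = np.linalg.lstsq(F, demand, rcond=None)
        residual = demand - F @ coef

        # Step 2: ARMA model for the de-seasonalized residual (coefficients assumed
        # here; in practice they would be estimated from the residual series)
        arma = ArmaProcess(ar=np.r_[1, -0.7], ma=np.r_[1, 0.3])
        synthetic = F @ coef + arma.generate_sample(t.size, scale=residual.std())
        print("synthetic mean/std:", synthetic.mean().round(2), synthetic.std().round(2))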

  1. A state space approach for piecewise-linear recurrent neural networks for identifying computational dynamics from neural measurements.

    PubMed

    Durstewitz, Daniel

    2017-06-01

    The computational and cognitive properties of neural systems are often thought to be implemented in terms of their (stochastic) network dynamics. Hence, recovering the system dynamics from experimentally observed neuronal time series, like multiple single-unit recordings or neuroimaging data, is an important step toward understanding its computations. Ideally, one would not only seek a (lower-dimensional) state space representation of the dynamics, but would wish to have access to its statistical properties and their generative equations for in-depth analysis. Recurrent neural networks (RNNs) are a computationally powerful and dynamically universal formal framework which has been extensively studied from both the computational and the dynamical systems perspective. Here we develop a semi-analytical maximum-likelihood estimation scheme for piecewise-linear RNNs (PLRNNs) within the statistical framework of state space models, which accounts for noise in both the underlying latent dynamics and the observation process. The Expectation-Maximization algorithm is used to infer the latent state distribution, through a global Laplace approximation, and the PLRNN parameters iteratively. After validating the procedure on toy examples, and using inference through particle filters for comparison, the approach is applied to multiple single-unit recordings from the rodent anterior cingulate cortex (ACC) obtained during performance of a classical working memory task, delayed alternation. Models estimated from kernel-smoothed spike time data were able to capture the essential computational dynamics underlying task performance, including stimulus-selective delay activity. The estimated models were rarely multi-stable, however; rather, they were tuned to exhibit slow dynamics in the vicinity of a bifurcation point. In summary, the present work advances a semi-analytical (thus reasonably fast) maximum-likelihood estimation framework for PLRNNs that may enable the recovery of relevant aspects of the nonlinear dynamics underlying observed neuronal time series, and directly link these to computational properties.
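
    The estimation machinery (EM with a global Laplace approximation) is beyond a short example, but the generative form of a piecewise-linear RNN state space model can be sketched as follows; dimensions, parameter values and noise scales are invented.

        import numpy as np

        rng = np.random.default_rng(4)
        dz, dx, T = 3, 10, 200                          # latent dim, observed units, steps

        A = 0.9 * np.eye(dz)                            # linear (diagonal) dynamics
        W = 0.1 * rng.standard_normal((dz, dz))         # piecewise-linear coupling
        np.fill_diagonal(W, 0.0)
        h = 0.1 * rng.standard_normal(dz)               # bias
        B = rng.standard_normal((dx, dz))               # observation loadings

        z = np.zeros((T, dz))
        x = np.zeros((T, dx))
        for k in range(1, T):
            # latent transition: linear part + ReLU-coupled part + process noise
            z[k] = (A @ z[k - 1] + W @ np.maximum(z[k - 1], 0.0) + h
                    + 0.05 * rng.standard_normal(dz))
            # noisy linear readout standing in for smoothed spike counts
            x[k] = B @ z[k] + 0.1 * rng.standard_normal(dx)
        print("simulated observation matrix:", x.shape)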

  2. Bridging the gap between Hydrologic and Atmospheric communities through a standard based framework

    NASA Astrophysics Data System (ADS)

    Boldrini, E.; Salas, F.; Maidment, D. R.; Mazzetti, P.; Santoro, M.; Nativi, S.; Domenico, B.

    2012-04-01

    Data interoperability in the study of Earth sciences is essential to performing interdisciplinary multi-scale multi-dimensional analyses (e.g. hydrologic impacts of global warming, regional urbanization, global population growth etc.). This research aims to bridge the existing gap between the hydrologic and atmospheric communities at both the semantic and technological levels. Within the context of hydrology, scientists are usually concerned with data organized as time series: a time series can be seen as a variable measured at a particular point in space over a period of time (e.g. the stream flow values periodically measured by a buoy sensor in a river); atmospheric scientists instead usually organize their data as coverages: a coverage can be seen as a multidimensional data array (e.g. satellite images acquired through time). These differences make the setup of a common framework for data discovery and access non-trivial. A set of web service specifications and implementations is already in place in both scientific communities to allow data discovery and access in the different domains. The CUAHSI Hydrologic Information System (HIS) service stack lists different service types and implementations: - a metacatalog (implemented as a CSW) used to discover metadata services by distributing the query to a set of catalogs - time series catalogs (implemented as CSW) used to discover datasets published by the feature services - feature services (implemented as WFS) containing features with data access links - sensor observation services (implemented as SOS) enabling access to the stream of acquisitions. Within the Unidata framework, there is a similar service stack for atmospheric data: - the broker service (implemented as a CSW) distributes a user query to a set of heterogeneous services (i.e. catalog services, but also inventory and access services) - the catalog service (implemented as a CSW) is able to harvest the available metadata offered by THREDDS services, and executes complex queries against the available metadata - the inventory service (implemented as a THREDDS) is able to hierarchically organize and publish a local collection of multi-dimensional arrays (e.g. NetCDF, GRIB files), as well as publish auxiliary standard services to realize the actual data access and visualization (e.g. WCS, OPeNDAP, WMS). The approach followed in this research is to build on top of the existing standards and implementations, by setting up a standards-aware interoperable framework able to deal with the existing heterogeneity in an organic way. As a methodology, interoperability tests against real services were performed; existing problems were thus highlighted and possibly solved. The use of flexible tools, able to deal in a smart way with heterogeneity, has proven successful; in particular, experiments were carried out with both the GI-cat broker and ESRI GeoPortal frameworks. The GI-cat discovery broker proved successful at implementing the CSW interface, as well as federating heterogeneous resources, such as THREDDS and WCS services published by Unidata, and HydroServer, WFS and SOS services published by CUAHSI. Experiments with the ESRI GeoPortal were also successful: the GeoPortal was used to deploy a web interface able to distribute searches amongst catalog implementations from both the hydrologic and the atmospheric communities, including HydroServers and GI-cat, combining results from both domains in a seamless way.
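
    As an illustration of the kind of standards-based discovery described above, the sketch below issues a CSW GetRecords query with the OWSLib client. This assumes OWSLib is installed; the endpoint URL is a placeholder and the free-text constraint is invented.

        # A minimal CSW discovery query with OWSLib (endpoint URL is hypothetical);
        # both HydroServer catalogs and broker services such as GI-cat expose this
        # same GetRecords interface, which is what makes federated search possible.
        from owslib.csw import CatalogueServiceWeb
        from owslib.fes import PropertyIsLike

        csw = CatalogueServiceWeb("http://example.org/csw")   # placeholder endpoint
        query = PropertyIsLike("csw:AnyText", "%streamflow%")
        csw.getrecords2(constraints=[query], maxrecords=10)

        for rid, rec in csw.records.items():
            print(rid, "-", rec.title)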

  3. Lessons Learned From Developing A Streaming Data Framework for Scientific Analysis

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.; Allan, Mark; Curry, Charles

    2003-01-01

    We describe the development and usage of a streaming data analysis software framework. The framework is used for three different applications: Earth science hyper-spectral imaging analysis, Electromyograph pattern detection, and Electroencephalogram state determination. In each application the framework was used to answer a series of science questions which evolved with each subsequent answer. This evolution is summarized in the form of lessons learned.

  4. Design Framework for an Adaptive MOOC Enhanced by Blended Learning: Supplementary Training and Personalized Learning for Teacher Professional Development

    ERIC Educational Resources Information Center

    Gynther, Karsten

    2016-01-01

    The research project has developed a design framework for an adaptive MOOC that complements the MOOC format with blended learning. The design framework consists of a design model and a series of learning design principles which can be used to design in-service courses for teacher professional development. The framework has been evaluated by…

  5. A crash course on data analysis in asteroseismology

    NASA Astrophysics Data System (ADS)

    Appourchaux, Thierry

    2014-02-01

    In this course, I try to provide a few of the basics required for performing data analysis in asteroseismology. First, I address how one can properly treat time series: the sampling, the filtering effect, the use of the Fourier transform, and the associated statistics. Second, I address how one can apply statistics for decision making and for parameter estimation, either in a frequentist or a Bayesian framework. Last, I review how these basic principles have been applied (or not) in asteroseismology.
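
    A minimal sketch of the frequentist side of this workflow, assuming astropy is available: a Lomb-Scargle periodogram of an unevenly sampled series and the false alarm probability of its highest peak. The sampling and signal below are synthetic.

        import numpy as np
        from astropy.timeseries import LombScargle

        rng = np.random.default_rng(5)
        t = np.sort(rng.uniform(0, 30, 400))            # irregular sampling (days)
        y = 0.5 * np.sin(2 * np.pi * t / 2.7) + rng.normal(0, 0.5, t.size)

        ls = LombScargle(t, y)
        freq, power = ls.autopower()
        best = power.max()
        # Frequentist decision making: probability of a peak this high arising from noise
        fap = ls.false_alarm_probability(best)
        print("best period:", 1 / freq[np.argmax(power)], "FAP:", fap)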

  6. Validation of High-Fidelity CFD/CAA Framework for Launch Vehicle Acoustic Environment Simulation against Scale Model Test Data

    NASA Technical Reports Server (NTRS)

    Liever, Peter A.; West, Jeffrey S.

    2016-01-01

    A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.

  7. Spatiotemporal Segmentation and Modeling of the Mitral Valve in Real-Time 3D Echocardiographic Images.

    PubMed

    Pouch, Alison M; Aly, Ahmed H; Lai, Eric K; Yushkevich, Natalie; Stoffers, Rutger H; Gorman, Joseph H; Cheung, Albert T; Gorman, Joseph H; Gorman, Robert C; Yushkevich, Paul A

    2017-09-01

    Transesophageal echocardiography is the primary imaging modality for preoperative assessment of mitral valves with ischemic mitral regurgitation (IMR). While there are well known echocardiographic insights into the 3D morphology of mitral valves with IMR, such as annular dilation and leaflet tethering, less is understood about how quantification of valve dynamics can inform surgical treatment of IMR or predict short-term recurrence of the disease. As a step towards filling this knowledge gap, we present a novel framework for 4D segmentation and geometric modeling of the mitral valve in real-time 3D echocardiography (rt-3DE). The framework integrates multi-atlas label fusion and template-based medial modeling to generate quantitatively descriptive models of valve dynamics. The novelty of this work is that temporal consistency in the rt-3DE segmentations is enforced during both the segmentation and modeling stages with the use of groupwise label fusion and Kalman filtering. The algorithm is evaluated on rt-3DE data series from 10 patients: five with normal mitral valve morphology and five with severe IMR. In these 10 data series that total 207 individual 3DE images, each 3DE segmentation is validated against manual tracing and temporal consistency between segmentations is demonstrated. The ultimate goal is to generate accurate and consistent representations of valve dynamics that can both visually and quantitatively provide insight into normal and pathological valve function.
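
    The full groupwise label fusion pipeline is beyond a short example, but the temporal-consistency idea behind the Kalman filtering step can be sketched with a minimal one-dimensional filter smoothing a single landmark coordinate across frames; the motion model, noise levels and measurements below are all invented.

        import numpy as np

        rng = np.random.default_rng(9)
        truth = np.cumsum(0.2 * rng.standard_normal(60))      # slowly moving coordinate
        meas = truth + 0.5 * rng.standard_normal(60)          # frame-wise segmentations

        q, r = 0.05, 0.25          # process and measurement noise variances (assumed)
        x, p = meas[0], 1.0        # state estimate and its variance
        smoothed = [x]
        for z in meas[1:]:
            p = p + q                       # predict (random-walk motion model)
            k = p / (p + r)                 # Kalman gain
            x = x + k * (z - x)             # update with the new measurement
            p = (1 - k) * p
            smoothed.append(x)
        print("raw error std: %.3f, smoothed error std: %.3f"
              % (np.std(meas - truth), np.std(np.array(smoothed) - truth)))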

  8. Magnetic storms and solar flares: can be analysed within similar mathematical framework with other extreme events?

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Potirakis, Stelios M.; Papadimitriou, Constantinos; Zitis, Pavlos I.; Eftaxias, Konstantinos

    2015-04-01

    The field of study of complex systems considers that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches of different types of natural, artificial, and social systems. We apply concepts of nonextensive statistical physics to time-series data of observable manifestations of the underlying complex processes leading up to different extreme events, in order to support the suggestion that a dynamical analogy characterizes the generation of a single magnetic storm, solar flare, earthquake (in terms of pre-seismic electromagnetic signals), epileptic seizure, and economic crisis. The analysis reveals that all the above-mentioned different extreme events can be analyzed within a similar mathematical framework. More precisely, we show that the populations of magnitudes of fluctuations included in all the above-mentioned pulse-like time series follow the traditional Gutenberg-Richter law as well as a nonextensive model for earthquake dynamics, with similar nonextensive q-parameter values. Moreover, based on a multidisciplinary statistical analysis, we show that the extreme events are characterized by crucial common symptoms, namely: (i) high organization, high compressibility, low complexity, high information content; (ii) strong persistency; and (iii) the existence of a clear preferred direction of emerged activities. These symptoms clearly discriminate the appearance of the extreme events under study from the corresponding background noise.

  9. The InSAR Scientific Computing Environment

    NASA Technical Reports Server (NTRS)

    Rosen, Paul A.; Gurrola, Eric; Sacco, Gian Franco; Zebker, Howard

    2012-01-01

    We have developed a flexible and extensible Interferometric SAR (InSAR) Scientific Computing Environment (ISCE) for geodetic image processing. ISCE was designed from the ground up as a geophysics community tool for generating stacks of interferograms that lend themselves to various forms of time-series analysis, with attention paid to accuracy, extensibility, and modularity. The framework is python-based, with code elements rigorously componentized by separating input/output operations from the processing engines. This allows greater flexibility and extensibility in the data models, and creates algorithmic code that is less susceptible to unnecessary modification when new data types and sensors are available. In addition, the components support provenance and checkpointing to facilitate reprocessing and algorithm exploration. The algorithms, based on legacy processing codes, have been adapted to assume a common reference track approach for all images acquired from nearby orbits, simplifying and systematizing the geometry for time-series analysis. The framework is designed to easily allow user contributions, and is distributed for free use by researchers. ISCE can process data from the ALOS, ERS, EnviSAT, Cosmo-SkyMed, RadarSAT-1, RadarSAT-2, and TerraSAR-X platforms, starting from Level 0 or Level 1 as provided from the data source, and going as far as Level 3 geocoded deformation products. With its flexible design, it can be extended with raw/metadata parsers to enable it to work with radar data from other platforms.

  10. Outlier detection for groundwater data in France

    NASA Astrophysics Data System (ADS)

    Valmy, Larissa; de Fouquet, Chantal; Bourgine, Bernard

    2014-05-01

    Water quality and quantity in France have been increasingly monitored since the 1970s. Moreover, in 2000, the EU Water Framework Directive established a framework for community action in the field of water policy for the protection of inland surface waters (rivers and lakes), transitional waters (estuaries), coastal waters and groundwater. It aims to ensure that all aquatic ecosystems and, with regard to their water needs, terrestrial ecosystems and wetlands meet 'good status' by 2015. The Directive requires Member States to establish river basin districts and, for each of these, a river basin management plan. In France, monitoring programs for water status have been implemented in each basin since 2007. The data collected through these programs feed into an information system which contributes to checking the compliance of water environmental legislation implementation, assessing the status of water, guiding management actions (programs of measures) and evaluating their effectiveness, and informing the public. Our work consists in studying groundwater quality and quantity data for some basins in France. We propose a specific mathematical approach in order to detect outliers and study trends in time series. In statistics, an outlier is an observation that lies outside the overall pattern of a distribution. Usually, the presence of an outlier indicates some sort of problem, so it is important to detect it in order to identify the cause. In fact, techniques for temporal data analysis have been developed over several decades in parallel with geostatistical methods. Compared to standard statistical methods, however, geostatistical analysis allows the analysis of incomplete or irregular time series. Moreover, tests carried out by the BRGM showed the potential contribution of geostatistical methods to the characterization of environmental data time series. Our approach is to exploit this potential through the development of specific algorithms, tests and validation of methods. We will introduce and explain our method and approach by considering the case of the Loire-Bretagne basin.
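
    The geostatistical machinery is not reproduced here, but as a hedged illustration of the outlier detection step, the sketch below flags outliers with a robust modified z-score based on the median absolute deviation; the series and threshold are invented.

        import numpy as np

        def mad_outliers(values, threshold=3.5):
            """Flag outliers by modified z-score; robust to the outliers themselves."""
            values = np.asarray(values, float)
            med = np.median(values)
            mad = np.median(np.abs(values - med))
            z = 0.6745 * (values - med) / mad          # 0.6745 scales MAD to sigma
            return np.abs(z) > threshold

        nitrate = np.array([4.1, 4.3, 4.0, 4.2, 19.5, 4.4, 4.1, 4.3])  # mg/L, synthetic
        print("outlier flags:", mad_outliers(nitrate))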

  11. An analytical framework for extracting hydrological information from time series of small reservoirs in a semi-arid region

    NASA Astrophysics Data System (ADS)

    Annor, Frank; van de Giesen, Nick; Bogaard, Thom; Eilander, Dirk

    2013-04-01

    Small water reservoirs for water resources management have the important socio-economic advantage that they bring water close to villages and households. This proximity allows for many water uses in addition to irrigation, such as fisheries, household water, building materials (loam, reeds), tourism and recreation, and cattle watering. These positive aspects are offset by the relatively large evaporative losses in comparison to larger reservoirs, although it is not known exactly how large these losses are. For decision makers, investors and donors, the decision to construct a small reservoir should be multifactored, based on economic, socio-cultural and environmental factors. For the latter, getting the water balance and the energy budget of small reservoirs right is key for any environmental impact analysis. For Northern Ghana, the relation between the volume of a small reservoir and its surface area has been established in a robust equation as Volume = 0.00857 × Area^1.4367, with the surface area explaining more than 95% of the variation in water volume of the reservoirs. This allows the use of remote sensing observations for estimating the water volume of small reservoirs in northern Ghana. Hydrological analysis of time series of small reservoir areas comprises estimates of evaporation fluxes and cumulative surface runoff curves. Once the reservoirs are full, spillage occurs and volumes and surface areas remain stable at their maximum extents. This implies that the time series of reservoir surface area contains information concerning the onset of downstream surface runoff. This onset does not coincide with the onset of the rainy season but largely depends on the distribution of rainfall events and storage capacity in the subsurface. The main requirement for this analysis is that the reservoir has negligible seepage losses or water influx from the underlying subsurface. In our research, we carried out a time series analysis of surface area extent for about 45 small reservoirs in the Upper East Region of Ghana. Reservoirs without obvious large seepage losses (field survey) were selected. To verify this, stable water isotope samples were collected from groundwater upstream and downstream of each reservoir. By looking at possible enrichment of downstream groundwater, a good estimate of seepage can be made in addition to estimates of evaporation. We estimated the evaporative losses and compared them with field measurements using eddy correlation measurements. Lastly, we determined the cumulative surface runoff curves for the small reservoirs. We will present this analytical framework for extracting hydrological information from time series of small reservoirs and show the first results for our study region of northern Ghana.
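
    A worked example of the reported volume-area relation is shown below; the units (area in m^2, volume in m^3) follow common usage of this equation but should be treated as an assumption here.

        # Worked example of the reported volume-area relation for northern Ghana
        # reservoirs: Volume = 0.00857 * Area**1.4367 (units assumed: m^2 and m^3).
        def reservoir_volume(area_m2):
            return 0.00857 * area_m2 ** 1.4367

        for area in (10_000, 50_000, 100_000):         # remote-sensing surface areas
            print(f"area {area:>7} m^2  ->  volume {reservoir_volume(area):,.0f} m^3")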

  12. Characterizability of metabolic pathway systems from time series data.

    PubMed

    Voit, Eberhard O

    2013-12-01

    Over the past decade, the biomathematical community has devoted substantial effort to the complicated challenge of estimating parameter values for biological systems models. An even more difficult issue is the characterization of functional forms for the processes that govern these systems. Most parameter estimation approaches tacitly assume that these forms are known or can be assumed with some validity. However, this assumption is not always true. The recently proposed method of Dynamic Flux Estimation (DFE) addresses this problem in a genuinely novel fashion for metabolic pathway systems. Specifically, DFE allows the characterization of fluxes within such systems through an analysis of metabolic time series data. Its main drawback is the fact that DFE can only directly be applied if the pathway system contains as many metabolites as unknown fluxes. This situation is unfortunately rare. To overcome this roadblock, earlier work in this field had proposed strategies for augmenting the set of unknown fluxes with independent kinetic information, which however is not always available. Employing Moore-Penrose pseudo-inverse methods of linear algebra, the present article discusses an approach for characterizing fluxes from metabolic time series data that is applicable even if the pathway system is underdetermined and contains more fluxes than metabolites. Intriguingly, this approach is independent of a specific modeling framework and unaffected by noise in the experimental time series data. The results reveal whether any fluxes may be characterized and, if so, which subset is characterizable. They also help with the identification of fluxes that, if they could be determined independently, would allow the application of DFE. Copyright © 2013 Elsevier Inc. All rights reserved.
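
    A toy version of the pseudo-inverse step is sketched below: given a stoichiometric-style matrix N with more fluxes than metabolites and slopes of smoothed metabolite time series, the minimum-norm flux estimate is pinv(N) @ dXdt. The pathway and numbers are invented, not taken from the paper.

        import numpy as np

        # Toy pathway: 3 metabolites, 4 fluxes (underdetermined). N maps fluxes to
        # metabolite derivatives: dX/dt = N @ v. Values are illustrative only.
        N = np.array([[ 1, -1,  0,  0],
                      [ 0,  1, -1, -1],
                      [ 0,  0,  1, -1]], dtype=float)

        # Slopes of smoothed metabolite time series at one time point (hypothetical)
        dXdt = np.array([0.4, -0.1, 0.2])

        v = np.linalg.pinv(N) @ dXdt                   # minimum-norm flux estimate
        print("flux estimate:", np.round(v, 3))
        print("residual:", np.round(N @ v - dXdt, 6))  # zero if dXdt lies in range(N)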

  13. Characterizability of Metabolic Pathway Systems from Time Series Data

    PubMed Central

    Voit, Eberhard O.

    2013-01-01

    Over the past decade, the biomathematical community has devoted substantial effort to the complicated challenge of estimating parameter values for biological systems models. An even more difficult issue is the characterization of functional forms for the processes that govern these systems. Most parameter estimation approaches tacitly assume that these forms are known or can be assumed with some validity. However, this assumption is not always true. The recently proposed method of Dynamic Flux Estimation (DFE) addresses this problem in a genuinely novel fashion for metabolic pathway systems. Specifically, DFE allows the characterization of fluxes within such systems through an analysis of metabolic time series data. Its main drawback is the fact that DFE can only directly be applied if the pathway system contains as many metabolites as unknown fluxes. This situation is unfortunately rare. To overcome this roadblock, earlier work in this field had proposed strategies for augmenting the set of unknown fluxes with independent kinetic information, which however is not always available. Employing Moore-Penrose pseudo-inverse methods of linear algebra, the present article discusses an approach for characterizing fluxes from metabolic time series data that is applicable even if the pathway system is underdetermined and contains more fluxes than metabolites. Intriguingly, this approach is independent of a specific modeling framework and unaffected by noise in the experimental time series data. The results reveal whether any fluxes may be characterized and, if so, which subset is characterizable. They also help with the identification of fluxes that, if they could be determined independently, would allow the application of DFE. PMID:23391489

  14. ZWD time series analysis derived from NRT data processing. A regional study of PW in Greece.

    NASA Astrophysics Data System (ADS)

    Pikridas, Christos; Balidakis, Kyriakos; Katsougiannopoulos, Symeon

    2015-04-01

    ZWD (zenith wet/non-hydrostatic delay) estimates have been routinely derived in near real time by the newly established Analysis Center in the Department of Geodesy and Surveying of Aristotle University of Thessaloniki (DGS/AUT-AC), in the framework of E-GVAP (EUMETNET GNSS water vapour project), since October 2014. This process takes place on an hourly basis and yields, among other things, station coordinates and tropospheric parameter estimates for a network of 90+ permanent GNSS (Global Navigation Satellite System) stations distributed across the wider Hellenic region. In this study, the temporal and spatial variability of the ZWD estimates was examined, as well as its relation to coordinate series extracted from both float and fixed solutions of the initial phase ambiguities. For this investigation, Bernese GNSS Software v5.2 was used to process the six-month dataset from the aforementioned network. For time series analysis we employed techniques such as the generalized Lomb-Scargle periodogram and Burg's maximum entropy method, owing to inefficiencies of the discrete Fourier transform when applied to the test dataset. The analysis yielded interesting results for further geophysical interpretation. In addition, the spatial and temporal distributions of precipitable water vapour (PW) obtained from both the ZWD estimates and ERA-Interim reanalysis grids were investigated.

  15. A Framework for Socio-Scientific Issues Based Education

    ERIC Educational Resources Information Center

    Presley, Morgan L.; Sickel, Aaron J.; Muslu, Nilay; Merle-Johnson, Dominike; Witzig, Stephen B.; Izci, Kemal; Sadler, Troy D.

    2013-01-01

    Science instruction based on student exploration of socio-scientific issues (SSI) has been presented as a powerful strategy for supporting science learning and the development of scientific literacy. This paper presents an instructional framework for SSI based education. The framework is based on a series of research studies conducted in a diverse…

  16. Oncogene activation and tumor suppressor gene inactivation find their sites of expression in the changes in time and space of the age-adjusted cancer incidence rate.

    PubMed

    Kodama, M; Kodama, T; Murakami, M

    2000-01-01

    The purpose of the present investigation is to elucidate the relation between the distribution pattern of the age-adjusted incidence rate (AAIR) changes in time and space of 15 tumors of both sexes and the locations of the centers of centripetal (oncogene type) and centrifugal (tumor suppressor gene type) forces. The fitness of the observed log AAIR data sets to the oncogene type and tumor suppressor gene type equilibrium models and the locations of the 2 force centers were calculated by applying the least squares method of Gauss to log AAIR pair data series with and without topological data manipulations, which are so designed as to let the log AAIR pair data series fit 2 variant (x, y) frameworks, the Rect-coordinates and the Para-coordinates. The 2 variant (x, y) coordinates are defined each as an (x, y) framework with its X axis crossed at a right angle to the regression line of the original log AAIR data (the Rect-coordinates) and as another framework with its X axis run in parallel with the regression line of the original log AAIR pair data series (the Para-coordinates). The fitness test of log AAIR data series to either the oncogene activation type equilibrium model (r = -1.000) or the tumor suppressor gene inactivation type (r = 1.000) was conducted for each of the male-female type pair data and the female-male type data, for each of log AAIR changes in space and log AAIR changes in time, and for each of the 3 (x, y) frameworks in a given neoplasia of both sexes. The results obtained are as follows: 1) The positivity rates of the fitness test to the oncogene type equilibrium model and the tumor suppressor gene type model were 63.3% and 56.7%, respectively, with the log AAIR changes in space, and 73.3% and 73.3% with log AAIR changes in time, as tested in 15 human neoplasias of both sexes. 2) Evidence was presented to indicate that the clearance of oncogene activation and tumor suppressor gene inactivation is the sine qua non premise of carcinogenesis. 3) The r profile, in which the correlation coefficient r, a measure of fitness to the 2 equilibrium models, is converted to either + (r > 0) or - (0 > r) for each of the original, the Rect-, and the Para-coordinates, was found to be informative in identifying a group of tumors with sex discrimination of cancer risk (log AAIR changes in space) or another group of environmental hormone-linked tumors (log AAIR changes in time and space) - a finding indicating that the r-profile of a given tumor, when compared with other neoplasias, may provide a clue to investigating the biological behavior of the tumor. 4) The recent risk increase of skin cancer of both sexes, classified as an example of environmental hormone-linked neoplasias, was found to commit its ascension of cancer risk along the direction of the centrifugal forces of the time- and space-linked tumor suppressor gene inactivation plotted in the 2-dimension diagram. In conclusion, the centripetal force of oncogene activation and the centrifugal force of tumor suppressor gene inactivation found their sites of expression in the distribution pattern of a cancer risk parameter, log AAIR, of a given neoplasia of both sexes on the 2-dimension diagram. The application of the least squares method of Gauss to the log AAIR changes in time and space, with and without topological modulations of the original sets, when presented in terms of the r-profile, was found to be informative in understanding the behavioral characteristics of human neoplasias.

  17. Living with extreme weather events - perspectives from climatology, geomorphological analysis, chronicles and opinion polls

    NASA Astrophysics Data System (ADS)

    Auer, I.; Kirchengast, A.; Proske, H.

    2009-09-01

    The ongoing climate change debate focuses more and more on changing extreme events. Information on past events can be derived from a number of sources, such as instrumental data, residual impacts in the landscape, but also chronicles and people's memories. A project called "A Tale of Two Valleys" within the framework of the research program "proVision" allowed us to study past extreme events in two inner-alpine valleys from the sources mentioned before. Instrumental climate time series provided information for the past 200 years; however, great attention had to be given to the homogeneity of the series. To derive homogenized time series of selected climate change indices, methods like HOCLIS and Vincent have been applied. Trend analyses of climate change indices inform about the increase or decrease of extreme events. Traces of major geomorphodynamic processes of the past (e.g. rockfalls, landslides, debris flows) which were triggered or affected by extreme weather events are still apparent in the landscape and could be evaluated by geomorphological analysis using remote sensing and field data. Regional chronicles provided additional knowledge and covered longer periods back in time; however, compared to meteorological time series they involve a high degree of subjectivity, and intermittent recording cannot be ruled out. Finally, questionnaires and oral history complemented our picture of past extreme weather events. People were differently affected and have different memories of events. The joint analysis of these four data sources showed agreement to some extent, but also showed some reasonable differences: meteorological data are point measurements only, with a sometimes too coarse temporal resolution. Due to land-use changes and improved constructional measures, the impact of an extreme meteorological event may be different today compared to earlier times.

  18. A Methodological Framework for Model Selection in Interrupted Time Series Studies.

    PubMed

    Lopez Bernal, J; Soumerai, S; Gasparrini, A

    2018-06-06

    Interrupted time series is a powerful and increasingly popular design for evaluating public health and health service interventions. The design involves analysing trends in the outcome of interest and estimating the change in trend following an intervention relative to the counterfactual (the expected ongoing trend if the intervention had not occurred). There are two key components to modelling this effect: first, defining the counterfactual; second, defining the type of effect that the intervention is expected to have on the outcome, known as the impact model. The counterfactual is defined by extrapolating the underlying trends observed before the intervention to the post-intervention period. In doing this, authors must consider the pre-intervention period that will be included, any time varying confounders, whether trends may vary within different subgroups of the population and whether trends are linear or non-linear. Defining the impact model involves specifying the parameters that model the intervention, including for instance whether to allow for an abrupt level change or a gradual slope change, whether to allow for a lag before any effect on the outcome, whether to allow a transition period during which the intervention is being implemented and whether a ceiling or floor effect might be expected. Inappropriate model specification can bias the results of an interrupted time series analysis and using a model that is not closely tailored to the intervention or testing multiple models increases the risk of false positives being detected. It is important that authors use substantive knowledge to customise their interrupted time series model a priori to the intervention and outcome under study. Where there is uncertainty in model specification, authors should consider using separate data sources to define the intervention, running limited sensitivity analyses or undertaking initial exploratory studies. Copyright © 2018. Published by Elsevier Inc.
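
    A minimal sketch of one common impact model, with a level-change and a slope-change term, fitted by ordinary least squares on synthetic monthly data; the intervention date and effect sizes are invented, and this illustrates the modelling choice rather than the authors' code.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(6)
        t = np.arange(48)                               # monthly counts, 4 years
        intervention = 24                               # policy start (assumed)
        post = (t >= intervention).astype(float)

        # Synthetic outcome: baseline trend, then abrupt level drop plus slope change
        y = (100 + 0.5 * t - 8 * post - 0.4 * post * (t - intervention)
             + rng.normal(0, 2, t.size))

        # Impact model with a level-change and a slope-change term (one common choice;
        # as argued above, the parameterization should be fixed a priori)
        X = sm.add_constant(np.column_stack([t, post, post * (t - intervention)]))
        fit = sm.OLS(y, X).fit()
        print(fit.params.round(2))   # [baseline, trend, level change, slope change]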

  19. A framework for relating the structures and recovery statistics in pressure time-series surveys for dust devils

    NASA Astrophysics Data System (ADS)

    Jackson, Brian; Lorenz, Ralph; Davis, Karan

    2018-01-01

    Dust devils are likely the dominant source of dust for the martian atmosphere, but the amount and frequency of dust-lifting depend on the statistical distribution of dust devil parameters. Dust devils exhibit pressure perturbations and, if they pass near a barometric sensor, they may register as a discernible dip in a pressure time-series. Leveraging this fact, several surveys using barometric sensors on landed spacecraft have revealed dust devil structures and occurrence rates. However powerful they are, such surveys suffer from non-trivial biases that skew the inferred dust devil properties. For example, such surveys are most sensitive to dust devils with the widest and deepest pressure profiles, but the recovered profiles will be distorted, broader and shallower than the actual profiles. In addition, such surveys often do not provide wind speed measurements alongside the pressure time series, so the durations of the dust devil signals in the time series cannot be directly converted to profile widths. Fortunately, simple statistical and geometric considerations can de-bias these surveys, allowing conversion of the duration of dust devil signals into physical widths, given only a distribution of likely translation velocities, and the recovery of the underlying distributions of physical parameters. In this study, we develop a scheme for de-biasing such surveys. Applying our model to an in-situ survey using data from the Phoenix lander suggests a larger dust flux and a dust devil occurrence rate about ten times larger than previously inferred. Comparing our results to dust devil track surveys suggests only about one in five low-pressure cells lifts sufficient dust to leave a visible track.
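
    As a rough illustration of the de-biasing idea (not the paper's scheme), the Monte Carlo sketch below converts dip durations into physical widths using an assumed distribution of translation speeds and a simple mean chord-to-diameter correction for circular profiles; every distribution and parameter here is hypothetical.

        import numpy as np

        rng = np.random.default_rng(7)

        # Observed dip durations from a hypothetical pressure survey (seconds)
        durations = rng.lognormal(mean=1.5, sigma=0.5, size=1000)

        # Assumed distribution of dust devil translation speeds (m/s); in practice
        # this would come from wind statistics for the site
        speeds = rng.weibull(2.0, size=1000) * 5.0

        # A dip of duration tau crossed at speed v corresponds to a chord of length
        # v * tau; random chords undersample the true diameter, so apply a simple
        # geometric correction (pi/4 is the mean chord-to-diameter ratio for a circle)
        widths = speeds * durations / (np.pi / 4)
        print("median recovered width: %.1f m" % np.median(widths))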

  20. New Comprehensive System to Construct Speleothem Fabrics Time Series

    NASA Astrophysics Data System (ADS)

    Frisia, S.; Borsato, A.

    2014-12-01

    Speleothem fabrics record processes that influence the way geochemical proxy data are encoded in speleothems; yet there has been little advance in the use of fabrics as a complement to palaeo-proxy datasets since the fabric classification we proposed in 2010. The systematic use of fabric documentation in speleothem science has been limited by the absence of a comprehensive numerical system that would allow the construction of fabric time series comparable with the widely used geochemical time series. Documentation of speleothem fabrics is fundamental for a robust interpretation of speleothem time series where stable isotopes and trace elements are used as proxies, because fabrics highlight depositional as well as post-depositional processes whose understanding complements reconstructions based on geochemistry. Here we propose a logical system that allows the transformation of microscope observations into numbers tied to acronyms that specify each fabric type and subtype. The rationale for ascribing progressive numbers to fabrics is based on the most up-to-date growth models. In this conceptual framework, the progression reflects hydrological conditions, bio-mediation and diagenesis. The lowest numbers are given to calcite fabrics formed at relatively constant drip rates: the columnar types (compact and open). Higher numbers are ascribed to columnar fabrics characterized by the presence of impurities that cause elongation or lattice distortion (Elongated, Fascicular Optic and Radiaxial calcites). The sequence progresses with the dendritic fabrics, followed by micrite (M), which has been observed in association with microbial films. Microsparite (Ms) and mosaic calcite (Mc) have the highest numbers, being considered diagenetic. Acronyms and suffixes are intended to become universally acknowledged. Thus, fabrics can be plotted vs. age to yield time series, where numbers are replaced by the acronyms. This will result in a visual representation of climate- or environment-related parameters underpinning speleothem crystal growth. The fabric log thus becomes a useful tool providing robustness to the geochemical data or a test of the overall utility of the speleothem record.
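
    A minimal sketch of the coding idea: map fabric acronyms to their rank in the proposed progression so that a fabric log can be plotted against age like any other time series. The acronyms, ranks and ages below are illustrative only, not the paper's actual scheme.

        # Illustrative encoding of a fabric ordering (ranks are assumptions; the
        # actual scheme assigns numbers by the hydrological/diagenetic progression
        # described above).
        FABRIC_CODE = {"C": 1, "Co": 2, "E": 3, "F": 4, "R": 5, "D": 6, "M": 7,
                       "Ms": 8, "Mc": 9}   # columnar ... micrite, microsparite, mosaic

        ages = [1200, 1150, 1100, 1050]           # years BP, hypothetical
        fabrics = ["C", "Co", "M", "Ms"]          # fabric logged at each depth/age

        series = [(a, FABRIC_CODE[f]) for a, f in zip(ages, fabrics)]
        for age, code in series:
            print(f"{age} yr BP -> fabric code {code}")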

  1. The incorporation of focused history in checklist for early recognition and treatment of acute illness and injury.

    PubMed

    Jayaprakash, Namita; Ali, Rashid; Kashyap, Rahul; Bennett, Courtney; Kogan, Alexander; Gajic, Ognjen

    2016-08-31

    Diagnostic error and delay are critical impediments to the safety of critically ill patients. The Checklist for Early Recognition and Treatment of Acute Illness and Injury (CERTAIN) has been developed as a tool that facilitates timely and error-free evaluation of critically ill patients. While the focused history is an essential part of the CERTAIN framework, it is not clear how best to choreograph this step in the process of evaluation and treatment of the acutely decompensating patient. An unblinded crossover clinical simulation study was designed in which volunteer critical care clinicians (fellows and attendings) were randomly assigned to start with obtaining a focused history choreographed either in series (after) or in parallel to the primary survey. A focused history was obtained using the standardized SAMPLE model that is incorporated into Advanced Trauma Life Support (ATLS) and Pediatric Advanced Life Support (PALS). Clinicians were asked to assess six acutely decompensating patients using pre-determined clinical scenarios (three in series choreography, three in parallel). Once the initial choreography was completed, the clinician would cross over to the alternative choreography. The primary outcome was the cognitive burden assessed through the NASA task load index. The secondary outcome was time to completion of a focused history. A total of 84 simulated cases (42 in parallel, 42 in series) were tested on 14 clinicians. Both the overall cognitive load and time to completion improved with each successive practice scenario; however, no difference was observed between the series and parallel choreographies. The median (IQR) overall NASA TLX task load index for series was 39 (17-58) and for parallel 43 (27-52), p = 0.57. The median (IQR) time to completion of the tasks in series was 125 (112-158) seconds and in parallel 122 (108-158) seconds, p = 0.92. In this clinical simulation study assessing the incorporation of a focused history into the primary survey of a non-trauma critically ill patient, there was no difference in cognitive burden or time to task completion when using series choreography (after the exam) compared to parallel choreography (concurrent with the primary survey physical exam). However, with repetition of the task, both overall task load and time to completion improved in each of the choreographies.

  2. Representations of time coordinates in FITS. Time and relative dimension in space

    NASA Astrophysics Data System (ADS)

    Rots, Arnold H.; Bunclark, Peter S.; Calabretta, Mark R.; Allen, Steven L.; Manchester, Richard N.; Thompson, William T.

    2015-02-01

    Context. In a series of three previous papers, formulation and specifics of the representation of world coordinate transformations in FITS data have been presented. This fourth paper deals with encoding time. Aims: Time on all scales and precisions known in astronomical datasets is to be described in an unambiguous, complete, and self-consistent manner. Methods: Employing the well-established World Coordinate System (WCS) framework, and maintaining compatibility with the FITS conventions that are currently in use to specify time, the standard is extended to describe rigorously the time coordinate. Results: World coordinate functions are defined for temporal axes sampled linearly and as specified by a lookup table. The resulting standard is consistent with the existing FITS WCS standards and specifies a metadata set that achieves the aims enunciated above.
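
    As a small illustration, the sketch below writes a few of the time-coordinate keywords discussed in the standard (TIMESYS, MJDREF, TIMEUNIT) into a FITS header with astropy; the values are examples, not a complete or authoritative usage of the standard.

        # Writing core time-coordinate keywords into a FITS header with astropy
        # (keyword names follow the standard; values are illustrative).
        from astropy.io import fits

        hdr = fits.Header()
        hdr["TIMESYS"] = ("TT", "time scale")
        hdr["MJDREF"] = (55197.0, "reference MJD for relative times")
        hdr["TIMEUNIT"] = ("s", "unit of time values")

        hdu = fits.PrimaryHDU(header=hdr)
        hdu.writeto("example_time.fits", overwrite=True)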

  3. IGS14/igs14.atx: a new Framework for the IGS Products

    NASA Astrophysics Data System (ADS)

    Rebischung, P.; Schmid, R.

    2016-12-01

    The International GNSS Service (IGS) is about to switch to a new reference frame (IGS14), based on the latest release of the International Terrestrial Reference Frame (ITRF2014), as the basis for its products. An updated set of satellite and ground antenna calibrations (igs14.atx) will become effective at the same time. IGS14 and igs14.atx will then replace the previous IGS08/igs08.atx framework in use since GPS week 1632 (17 April 2011) and in the second IGS reprocessing campaign (repro2). Despite the negligible scale difference between ITRF2008 and ITRF2014 (0.02 ppb), the radial components of all GPS and GLONASS satellite antenna phase center offsets (z-PCOs) had to be updated in igs14.atx, because of modeling changes recently introduced within the IGS that affect the scale of the IGS products. This was achieved by deriving and averaging time series of satellite z-PCO estimates, consistent with the ITRF2014 scale, from the daily repro2 and latest operational SINEX solutions of seven IGS Analysis Centers (ACs). Compared to igs08.atx, igs14.atx includes robot calibrations for 16 additional ground antenna types, so that the percentage of stations with absolute calibrations in the IGS network will reach 90% after the switch. 19 type-mean robot calibrations were also updated thanks to the availability of calibration results for additional antenna samples. IGS14 is basically an extract of well-suited reference frame stations (i.e., with long and stable position time series) from ITRF2014. However, to make the IGS14 station coordinates consistent with the new igs14.atx ground antenna calibrations, position offsets due to the switch from igs08.atx to igs14.atx were derived for all IGS14 stations affected by ground antenna calibration updates and applied to their ITRF2014 coordinates. This presentation will first detail the different steps of the elaboration of IGS14 and igs14.atx. The impact of the switch on GNSS-derived geodetic parameter time series will then be assessed by re-aligning the daily repro2 and latest operational IGS combined SINEX solutions to IGS14/igs14.atx. A particular focus will finally be given to the biases and trends present in the satellite z-PCO time series derived from the daily AC SINEX solutions, and to their interpretation in terms of scale and scale rate of the terrestrial frame.

  4. Market dynamics and stock price volatility

    NASA Astrophysics Data System (ADS)

    Li, H.; Rosser, J. B., Jr.

    2004-06-01

    This paper presents a possible explanation for some of the empirical properties of asset returns within a heterogeneous-agents framework. It turns out that, even if the input fundamental value is assumed to follow a simple Gaussian distribution lacking both fat tails and volatility dependence, these features can show up in the time series of asset returns. In this model, profit comparison and switching between heterogeneous agents play key roles, building a connection between endogenous market dynamics and the emergence of stylized facts.
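
    A minimal sketch in the spirit of such a model: fundamentalists and chartists switch toward whichever strategy earned higher recent profits, and the resulting returns can develop excess kurtosis even though fundamental news is Gaussian. All parameter values are hypothetical, not the paper's.

    ```python
    # Hedged sketch of a fundamentalist/chartist switching model.
    import numpy as np

    rng = np.random.default_rng(1)
    T, beta, mu = 20000, 2.0, 0.05
    p, f = np.zeros(T), np.zeros(T)         # log price, log fundamental value
    w, prof_f, prof_c = 0.5, 0.0, 0.0       # fundamentalist fraction, profits
    for t in range(1, T):
        f[t] = f[t-1] + rng.normal(0, 0.01)            # Gaussian fundamental news
        d_f = f[t-1] - p[t-1]                          # fundamentalists buy under-priced
        d_c = p[t-1] - p[t-2] if t > 1 else 0.0        # chartists chase the trend
        p[t] = p[t-1] + mu * (w * d_f + (1 - w) * d_c)
        r = p[t] - p[t-1]
        prof_f = 0.95 * prof_f + d_f * r               # discounted realized profits
        prof_c = 0.95 * prof_c + d_c * r
        w = 1 / (1 + np.exp(-beta * (prof_f - prof_c)))  # logistic switching

    ret = np.diff(p)
    kurt = ((ret - ret.mean()) ** 4).mean() / ret.var() ** 2
    print("excess kurtosis of returns:", kurt - 3)     # > 0 signals fat tails
    ```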

  5. A user-friendly, dynamic web environment for remote data browsing and analysis of multiparametric geophysical data within the MULTIMO project

    NASA Astrophysics Data System (ADS)

    Carniel, Roberto; Di Cecca, Mauro; Jaquet, Olivier

    2006-05-01

    In the framework of the EU-funded project "Multi-disciplinary monitoring, modelling and forecasting of volcanic hazard" (MULTIMO), multiparametric data have been recorded at the MULTIMO station in Montserrat. Moreover, several other long time series, recorded at Montserrat and at other volcanoes, have been acquired in order to test stochastic and deterministic methodologies under development. Creating a general framework to handle data efficiently is a considerable task even for homogeneous data. In the case of heterogeneous data, this becomes a major issue. A need therefore arose for a consistent, user-friendly way of browsing such a heterogeneous dataset. Additionally, a framework for applying the calculation of the developed dynamical parameters to the data series was also needed in order to easily keep these parameters under control, e.g. for monitoring, research or forecasting purposes. The solution we present is completely based on Open Source software, including the Linux operating system, the MySQL database management system, the Apache web server, the Zope application server, the Scilab math engine, the Plone content management framework, and the Unified Modelling Language. From the user's point of view the main advantage is the possibility of browsing through datasets recorded on different volcanoes, with different instruments, with different sampling frequencies, and stored in different formats, all via a consistent, user-friendly interface that transparently runs queries to the database, gets the data from the main storage units, generates the graphs, and produces dynamically generated web pages to interact with the user. The involvement of third parties to continue the development in the Open Source philosophy and/or to extend the fields of application is now sought.

  6. Deriving Daily Time Series Evapotranspiration, Evaporation and Transpiration Maps With Landsat Data

    NASA Astrophysics Data System (ADS)

    Paul, G.; Gowda, P. H.; Marek, T.; Xiao, X.; Basara, J. B.

    2014-12-01

    Mapping high-resolution evapotranspiration (ET) over large regions at a daily time step is complex and computationally intensive. High-resolution daily ET maps have broad utility, ranging from crop water management to watershed management. The aim of this work is to generate a daily time series (10 years) of ET and its components, vegetation transpiration (T) and soil water evaporation (E), using Landsat 5 satellite data for the Southern Great Plains forage-rangeland-winter wheat production system in Oklahoma (OK). The framework for generating these products was built around the two-source energy balance (TSEB) algorithm; its other important features were: (a) an atmospheric correction algorithm; (b) spatially interpolated weather inputs; (c) functions for varying the Priestley-Taylor coefficient; and (d) an algorithm for extrapolating ET, E, and T to daily values utilizing reference ET. An extensive network of 140 weather stations managed by the Oklahoma Mesonet was utilized to generate spatially interpolated inputs of air temperature, relative humidity, wind speed, solar radiation, pressure, and reference ET. Validation of the ET maps against eddy covariance data from two grassland sites at El Reno, OK suggested good performance (Table 1). A daily ET map for 18 July 2006 showed distinct differences in ET among land uses such as irrigated cropland, vegetation along drainage, and grassland. Results indicated that the proposed ET mapping framework is suitable for deriving high-resolution daily ET time series at regional scale with Landsat Thematic Mapper data.

    Table 1: Daily actual ET performance statistics for two grassland locations at El Reno, OK for year 2005

    Management Type | Mean (obs) (mm d-1) | Mean (est) (mm d-1) | MBE (mm d-1) | MBE (%) | RMSE (mm d-1) | RMSE (%) | MAE (mm d-1) | MAPD (%) | NSE | R2
    Control | 2.2 | 1.8 | -0.43 | -19.4 | 0.87 | 38.9 | 0.65 | 29.5 | 0.71 | 0.79
    Burnt | 2.0 | 1.8 | -0.15 | -7.7 | 0.80 | 39.8 | 0.62 | 30.7 | 0.73 | 0.77
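
    For step (d), a common approach is to assume that the ratio of actual to reference ET observed at satellite overpass time holds for the whole day; whether the paper uses exactly this ratio method is an assumption, and the numbers below are illustrative.

    ```python
    # Hedged sketch of the reference-ET-fraction extrapolation to daily totals.
    def daily_et(et_inst, etr_inst, etr_daily):
        """Scale instantaneous ET (mm/h) to a daily total (mm/d), assuming the
        fraction ET/ETr seen at overpass time persists through the day."""
        etrf = et_inst / etr_inst          # reference ET fraction (dimensionless)
        return etrf * etr_daily

    print(daily_et(et_inst=0.55, etr_inst=0.70, etr_daily=7.1))  # ~5.6 mm/d
    ```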

  7. Derivation of GNSS derived station velocities for a surface deformation model in the Austrian region

    NASA Astrophysics Data System (ADS)

    Umnig, Elke; Weber, Robert; Maras, Jadre; Brückl, Ewald

    2016-04-01

    This contribution deals with the first comprehensive analysis of GNSS-derived surface velocities computed within an observation network of about 100 stations covering the whole Austrian territory and parts of the neighbouring countries. Coordinate time series are now available spanning a period of 5 years (2010.0-2015.0) for one focus area in East Austria and one and a half years (2013.5-2015.0) for the remaining part of the tracking network. The data series stem from two different GNSS campaigns. The former was set up to investigate intra-plate tectonic movements within the framework of the project ALPAACT (seismological and geodetic monitoring of ALpine-PAnnonian ACtive Tectonics); the latter was designed to support various other needs, e.g. the derivation of GNSS-based water vapour fields, but also to extend the aforementioned tectonic studies. In addition, the activities within the ALPAACT project supplement the educational initiative SCHOOLS & QUAKES, in which scholars contribute to seismological research. Daily solutions have been computed for the whole period of the processed coordinate time series by means of the Bernese software. The processed coordinate time series are tied to the global reference frame ITRF2000 as well as to the frame ITRF2008. Due to the transition of the reference frame from ITRF2000 to ITRF2008 within the processing period, and also due to updates of the Bernese software from version 5.0 to 5.2, the time series were initially not fully consistent and had to be re-aligned to a common frame. The goal of this investigation is therefore to derive a nationwide consistent horizontal motion field on the basis of GNSS reference station data within the ITRF2008 frame, but also with respect to the Eurasian plate. In this presentation we focus on the set-up of the coordinate time series and on the problem of frame alignment. Special attention is also paid to the separation of linear and periodic motion signals originating from tectonic or non-tectonic sources.
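
    A minimal sketch of the final separation step: least-squares fit of a linear trend plus annual and semi-annual harmonics to a daily coordinate component. The synthetic series and its parameters are assumptions.

    ```python
    # Hedged sketch: trend + seasonal harmonics fit to a coordinate time series.
    import numpy as np

    rng = np.random.default_rng(2)
    t = np.arange(0, 5, 1 / 365.25)                     # time in years
    north = (0.012 * t + 0.002 * np.sin(2 * np.pi * t + 0.4)
             + rng.normal(0, 0.001, t.size))            # metres

    w = 2 * np.pi                                       # one cycle per year
    A = np.column_stack([np.ones_like(t), t,
                         np.sin(w * t), np.cos(w * t),        # annual
                         np.sin(2 * w * t), np.cos(2 * w * t)])  # semi-annual
    coef, *_ = np.linalg.lstsq(A, north, rcond=None)
    print("velocity (m/yr):", coef[1])
    print("annual amplitude (m):", np.hypot(coef[2], coef[3]))
    ```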

  8. HYPE: a WFD tool for the identification of significant and sustained upward trends in groundwater time series

    NASA Astrophysics Data System (ADS)

    Lopez, Benjamin; Croiset, Nolwenn; Gourcy, Laurence

    2014-05-01

    Directive 2006/118/EC on the protection of groundwater against pollution and deterioration, adopted under the Water Framework Directive (WFD), asks Member States to identify significant and sustained upward trends in all bodies or groups of bodies of groundwater that are characterised as being at risk in accordance with Annex II to Directive 2000/60/EC. The Directive indicates that the procedure for the identification of significant and sustained upward trends must be based on a statistical method. Moreover, for significant increases of concentrations of pollutants, trend reversals must be identified, which means being able to detect significant trend reversals. A specific tool, named HYPE, has been developed in order to help stakeholders working on groundwater trend assessment. The R-coded tool HYPE provides statistical analysis of groundwater time series. It follows several studies on the relevance of statistical tests for groundwater data series (Lopez et al., 2011) and other case studies on the topic (Bourgine et al., 2012). It integrates the most powerful and robust statistical tests for hydrogeological applications. HYPE is linked to ADES, the French national database on groundwater data, so monitoring data gathered by the Water Agencies can be processed directly. HYPE has two main modules: a characterisation module, which allows users to visualise time series, calculates the main statistical characteristics and provides graphical representations; and a trend module, which identifies significant breaks, trends and trend reversals in time series, providing result tables and graphical representations. Additional modules are also implemented to identify regional and seasonal trends and to sample time series in a relevant way. HYPE was used successfully in 2012 by the French Water Agencies to satisfy requirements of the WFD concerning the characterization of groundwater bodies' qualitative status and the evaluation of the risk of non-achievement of good status. Bourgine B. et al. 2012, Ninth International Geostatistics Congress, Oslo, Norway, June 11-15. Lopez B. et al. 2011, Final Report BRGM/RP-59515-FR, 166p.
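
    A textbook implementation of the kind of statistics such a tool builds on: the Mann-Kendall trend test plus Sen's slope for a groundwater concentration series. This is an illustration, not HYPE itself, and it omits tie corrections.

    ```python
    # Hedged sketch: Mann-Kendall test and Sen's slope (no-ties variance).
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(y):
        y = np.asarray(y, float)
        n = y.size
        s = sum(np.sign(y[j] - y[i])
                for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
        p = 2 * (1 - norm.cdf(abs(z)))
        slope = np.median([(y[j] - y[i]) / (j - i)
                           for i in range(n - 1) for j in range(i + 1, n)])
        return z, p, slope

    rng = np.random.default_rng(3)
    series = 25 + 0.4 * np.arange(30) + rng.normal(0, 2, 30)  # mg/L, upward trend
    z, p, sen = mann_kendall(series)
    print(f"z={z:.2f}, p={p:.4f}, Sen slope={sen:.2f} mg/L per step")
    ```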

  9. Multiscale analysis of information dynamics for linear multivariate processes.

    PubMed

    Faes, Luca; Montalto, Alessandro; Stramaglia, Sebastiano; Nollo, Giandomenico; Marinazzo, Daniele

    2016-08-01

    In the study of complex physical and physiological systems represented by multivariate time series, an issue of great interest is the description of the system dynamics over a range of different temporal scales. While information-theoretic approaches to the multiscale analysis of complex dynamics are being increasingly used, the theoretical properties of the applied measures are poorly understood. This study introduces for the first time a framework for the analytical computation of information dynamics for linear multivariate stochastic processes explored at different time scales. After showing that the multiscale processing of a vector autoregressive (VAR) process introduces a moving average (MA) component, we describe how to represent the resulting VARMA process using state-space (SS) models and how to exploit the SS model parameters to compute analytical measures of information storage and information transfer for the original and rescaled processes. The framework is then used to quantify multiscale information dynamics for simulated unidirectionally and bidirectionally coupled VAR processes, showing that rescaling may lead to insightful patterns of information storage and transfer but also to potentially misleading behaviors.
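
    A numerical illustration (not the paper's analytical state-space method) of how information storage varies with time scale for a linear process. For a Gaussian AR(1), storage = 0.5*ln(process variance / innovation variance) = -0.5*ln(1 - a^2); here rescaling is plain averaging over non-overlapping windows followed by an AR(1) refit, which is only approximate at scales > 1 precisely because averaging introduces an MA component, as the paper shows.

    ```python
    # Hedged sketch: information storage of an averaged AR(1) across scales.
    import numpy as np

    rng = np.random.default_rng(4)
    a, n = 0.8, 200000
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = a * x[t - 1] + rng.normal()

    def storage_ar1(y):
        y = y - y.mean()
        r1 = np.dot(y[:-1], y[1:]) / np.dot(y, y)  # lag-1 autocorrelation
        return -0.5 * np.log(1 - r1 ** 2)

    for tau in (1, 2, 5, 10, 20):
        y = x[: n - n % tau].reshape(-1, tau).mean(axis=1)
        print(f"scale {tau:2d}: storage ~ {storage_ar1(y):.3f}")
    # analytic value at scale 1: -0.5*ln(1 - 0.8**2) ~ 0.511
    ```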

  10. Hybrid Intrusion Forecasting Framework for Early Warning System

    NASA Astrophysics Data System (ADS)

    Kim, Sehun; Shin, Seong-Jun; Kim, Hyunwoo; Kwon, Ki Hoon; Han, Younggoo

    Recently, cyber attacks have become a serious hindrance to the stability of the Internet. These attacks exploit the interconnectivity of networks, propagate in an instant, and have become more sophisticated and evolutionary. Traditional Internet security systems such as firewalls, IDS and IPS are limited in their ability to detect recent cyber attacks in advance, as these systems respond to Internet attacks only after the attacks inflict serious damage. In this paper, we propose a hybrid intrusion forecasting system framework for an early warning system. The proposed system utilizes three types of forecasting methods: time-series analysis, probabilistic modeling, and data mining. By combining these methods, it is possible to take advantage of each forecasting technique while overcoming its drawbacks. Experimental results show that the hybrid intrusion forecasting method outperforms each of the three individual forecasting methods.
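
    One simple way to combine heterogeneous forecasters, shown as a hedged sketch (the paper's combination rule is not specified here): weight each method inversely to its recent squared error. The three "methods" are stubs standing in for the time-series, probabilistic, and data-mining forecasters.

    ```python
    # Hedged sketch: inverse-error weighting of an ensemble of forecasters.
    import numpy as np

    def combine(forecasts, recent_errors, eps=1e-9):
        """forecasts: per-method predictions; recent_errors: per-method mean
        squared errors over a sliding window."""
        w = 1.0 / (np.asarray(recent_errors) + eps)
        w /= w.sum()
        return float(np.dot(w, forecasts))

    preds = np.array([120.0, 95.0, 110.0])  # attacks/hour predicted per method
    mses = np.array([400.0, 90.0, 150.0])   # their recent error estimates
    print(combine(preds, mses))             # leans toward historically better methods
    ```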

  11. High-throughput syntheses of iron phosphite open frameworks in ionic liquids

    NASA Astrophysics Data System (ADS)

    Wang, Zhixiu; Mu, Ying; Wang, Yilin; Bing, Qiming; Su, Tan; Liu, Jingyao

    2017-02-01

    Three open-framework iron phosphites, FeII5(NH4)2(HPO3)6 (1), FeII2FeIII(NH4)(HPO3)4 (2) and FeIII2(HPO3)3 (3), have been synthesized under ionothermal conditions. How different synthesis parameters, such as gel concentration, synthesis time, reaction temperature and solvent, affect the products has been monitored using high-throughput approaches. Within each type of experiment, relevant products have been investigated. The optimal reaction conditions were obtained from a series of high-throughput experiments. All the structures were determined by single-crystal X-ray diffraction analysis and further characterized by PXRD, TGA and FTIR analyses. Magnetic studies reveal that the three compounds show interesting magnetic behavior at low temperature.

  12. A framework for the comparative study of language.

    PubMed

    Uriagereka, Juan; Reggia, James A; Wilkinson, Gerald S

    2013-07-18

    Comparative studies of language are difficult because few language precursors are recognized. In this paper we propose a framework for designing experiments that test for structural and semantic patterns indicative of simple or complex grammars as originally described by Chomsky. We argue that a key issue is whether animals can recognize full recursion, which is the hallmark of context-free grammar. We discuss limitations of recent experiments that have attempted to address this issue, and point out that experiments aimed at detecting patterns that follow a Fibonacci series have advantages over other artificial context-free grammars. We also argue that experiments using complex sequences of behaviors could, in principle, provide evidence for fully recursive thought. Some of these ideas could also be approached using artificial life simulations, which have the potential to reveal the types of evolutionary transitions that could occur over time. Because the framework we propose has specific memory and computational requirements, future experiments could target candidate genes with the goal of revealing the genetic underpinnings of complex cognition.
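
    A minimal sketch of the "Fibonacci grammar" idea referred to above: the L-system-style rewriting A -> AB, B -> A produces strings whose lengths follow the Fibonacci series, giving stimuli that no simple repetition rule generates. (Only the string generator is shown; the experimental stimuli in the literature are an assumption here.)

    ```python
    # Hedged sketch: generate Fibonacci-grammar strings by rewriting.
    def fib_strings(generations):
        s, out = "A", ["A"]
        for _ in range(generations):
            s = "".join("AB" if ch == "A" else "A" for ch in s)
            out.append(s)
        return out

    for s in fib_strings(6):
        print(len(s), s)
    # lengths: 1 2 3 5 8 13 21 -- the Fibonacci series
    ```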

  13. Multifractal diffusion entropy analysis: Optimal bin width of probability histograms

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Korbel, Jan

    2014-11-01

    In the framework of Multifractal Diffusion Entropy Analysis we propose a method for choosing an optimal bin width in histograms generated from underlying probability distributions of interest. The method presented uses techniques of Rényi's entropy and mean squared error analysis to discuss the conditions under which the error in the multifractal spectrum estimation is minimal. We illustrate the utility of our approach by focusing on the scaling behavior of financial time series. In particular, we analyze the S&P500 stock index as sampled at a daily rate in the time period 1950-2013. To demonstrate the strength of the proposed method, we compare the multifractal δ-spectrum for various bin widths and show the robustness of the method, especially for large values of q. For such values, other methods in use, e.g., those based on moment estimation, tend to fail for heavy-tailed data or data with long correlations. The connection between the δ-spectrum and Rényi's q parameter is also discussed and illustrated with a simple example of a multiscale time series.
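
    A minimal sketch of (Shannon) diffusion entropy analysis: sum increments over windows of length t to form diffusion trajectories, estimate the entropy of their histogram, and read the scaling exponent delta from S(t) ~ A + delta*ln t. The bin width is fixed by hand below; choosing it optimally is precisely the question the paper addresses.

    ```python
    # Hedged sketch: diffusion entropy scaling for uncorrelated Gaussian noise.
    import numpy as np

    rng = np.random.default_rng(5)
    xi = rng.normal(size=100000)                # increments; expect delta ~ 0.5

    def diffusion_entropy(xi, t, bin_width=0.5):
        c = np.cumsum(xi)
        traj = c[t:] - c[:-t]                   # overlapping windows of length t
        bins = np.arange(traj.min(), traj.max() + bin_width, bin_width)
        p, _ = np.histogram(traj, bins=bins, density=True)
        p = p[p > 0]
        return -np.sum(p * np.log(p)) * bin_width   # Riemann sum of -p ln p

    ts = np.array([4, 8, 16, 32, 64, 128])
    S = np.array([diffusion_entropy(xi, t) for t in ts])
    delta = np.polyfit(np.log(ts), S, 1)[0]
    print("estimated delta:", delta)            # ~0.5 for this noise
    ```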

  14. Model-Based Clustering of Regression Time Series Data via APECM -- An AECM Algorithm Sung to an Even Faster Beat

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Wei-Chen; Maitra, Ranjan

    2011-01-01

    We propose a model-based approach for clustering time series regression data in an unsupervised machine learning framework to identify groups under the assumption that each mixture component follows a Gaussian autoregressive regression model of order p. Given the number of groups, the traditional maximum likelihood approach of estimating the parameters using the expectation-maximization (EM) algorithm can be employed, although it is computationally demanding. The somewhat fast tune to the EM folk song provided by the Alternating Expectation Conditional Maximization (AECM) algorithm can alleviate the problem to some extent. In this article, we develop an alternative partial expectation conditional maximization algorithm (APECM) that uses an additional data augmentation storage step to efficiently implement AECM for finite mixture models. Results on our simulation experiments show improved performance in terms of both the number of iterations and computation time. The methodology is applied to the problem of clustering mutual funds data on the basis of their average annual per cent returns and in the presence of economic indicators.
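
    A stand-in sketch, not the authors' APECM: summarize each series by its fitted AR(2) coefficients and cluster those features with scikit-learn's EM-based GaussianMixture, on synthetic two-group data.

    ```python
    # Hedged sketch: EM-based clustering of time series via AR(2) features.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(6)

    def simulate_ar(a, n=300):
        x = np.zeros(n)
        for t in range(2, n):
            x[t] = a[0] * x[t - 1] + a[1] * x[t - 2] + rng.normal()
        return x

    series = ([simulate_ar([0.8, -0.2]) for _ in range(20)]
              + [simulate_ar([0.1, 0.5]) for _ in range(20)])

    def ar2_features(x):
        X = np.column_stack([x[1:-1], x[:-2]])
        y = x[2:]
        return np.linalg.lstsq(X, y, rcond=None)[0]   # (a1, a2) estimates

    feats = np.array([ar2_features(x) for x in series])
    labels = GaussianMixture(n_components=2, random_state=0).fit_predict(feats)
    print(labels)   # should split roughly into the two simulated groups
    ```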

  15. An architecture for consolidating multidimensional time-series data onto a common coordinate grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shippert, Tim; Gaustad, Krista

    Consolidating measurement data for use by data models or in inter-comparison studies frequently requires transforming the data onto a common grid. Standard methods for interpolating multidimensional data are often not appropriate for data with non-homogeneous dimensionality, and are hard to implement in a consistent manner for different datastreams. These challenges are increased when dealing with the automated procedures necessary for use with continuous, operational datastreams. In this paper we introduce a method of applying a series of one-dimensional transformations to merge data onto a common grid, examine the challenges of ensuring consistent application of data consolidation methods, present a framework for addressing those challenges, and describe the implementation of such a framework for the Atmospheric Radiation Measurement (ARM) program.
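
    A minimal sketch of consolidating a 2-D (time x height) field by applying one-dimensional interpolations in sequence, in the spirit of the approach described (not ARM's production code): first along time for each height level, then along height for each new time step.

    ```python
    # Hedged sketch: sequential 1-D regridding onto a common grid.
    import numpy as np

    def regrid_2d(values, t_src, z_src, t_dst, z_dst):
        # pass 1: interpolate along the time axis, level by level
        tmp = np.empty((t_dst.size, z_src.size))
        for j in range(z_src.size):
            tmp[:, j] = np.interp(t_dst, t_src, values[:, j])
        # pass 2: interpolate along the height axis, time step by time step
        out = np.empty((t_dst.size, z_dst.size))
        for i in range(t_dst.size):
            out[i, :] = np.interp(z_dst, z_src, tmp[i, :])
        return out

    t_src, z_src = np.linspace(0, 10, 11), np.linspace(0, 2, 5)
    field = np.add.outer(t_src, 10 * z_src)          # synthetic measurements
    t_dst, z_dst = np.linspace(0, 10, 21), np.linspace(0, 2, 9)
    print(regrid_2d(field, t_src, z_src, t_dst, z_dst).shape)   # (21, 9)
    ```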

  16. Decoding divergent series in nonparaxial optics.

    PubMed

    Borghi, Riccardo; Gori, Franco; Guattari, Giorgio; Santarsiero, Massimo

    2011-03-15

    A theoretical analysis aimed at investigating the divergent character of perturbative series involved in the study of free-space nonparaxial propagation of vectorial optical beams is proposed. Our analysis predicts a factorial divergence for such series and provides a theoretical framework within which the results of recently published numerical experiments concerning nonparaxial propagation of vectorial Gaussian beams find a meaningful interpretation in terms of the decoding operated on such series by the Weniger transformation.
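
    The Weniger transformation itself is intricate; as a simpler member of the same family of sequence transformations, this sketch applies the Shanks transformation to the divergent series 1 - 2 + 4 - 8 + ..., whose Abel/Euler sum is 1/3, to show how such a transformation "decodes" a divergent series from its partial sums.

    ```python
    # Hedged sketch: Shanks transformation resumming a divergent geometric series.
    import numpy as np

    def shanks(s):
        s = np.asarray(s, float)
        num = s[2:] * s[:-2] - s[1:-1] ** 2
        den = s[2:] + s[:-2] - 2 * s[1:-1]
        return num / den

    partial = np.cumsum([(-2.0) ** k for k in range(12)])  # wildly divergent
    print(shanks(partial))   # every entry equals 1/3: the transform is exact here
    ```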

  17. Molecular Modeling and Experimental Investigations of Nonlinear Optical Compounds Monosubstituted Derivatives of Dicyanovinylbenzene

    NASA Technical Reports Server (NTRS)

    Timofeeva, Tatiana V.; Nesterov, Vladimir N.; Antipin, Mikhail Yu.; Clark, Ronald D.; Sanghadasa, Mohan; Cardelino, Beatriz H.; Moore, Craig E.; Frazier, Donald O.

    1999-01-01

    A search for potential nonlinear optical compounds was performed using the Cambridge Structure Database and molecular modeling. We investigated a series of monosubstituted derivatives of dicyanovinylbenzene, since the nonlinear optical (NLO) properties of such derivatives (o-methoxy-dicyanovinylbenzene, DIVA) were studied earlier. The molecular geometry of these compounds was investigated with x-ray analysis and discussed along with the results of molecular mechanics and ab initio quantum chemical calculations. The influence of crystal packing on the planarity of the molecules of this series has been revealed. Two new compounds from the series studied, ortho-F and para-Cl-dicyanovinylbenzene, according to powder measurements, were found to be NLO compounds in the crystal state about 10 times more active than urea. The peculiarities of crystal structure formation in the framework of balance between van der Waals and electrostatic interactions have been discussed. The crystal shape of DIVA and two new NLO compounds have been calculated on the basis of the known crystal structure.

  18. Welfare States, Labor Markets, Political Dynamics, and Population Health: A Time-Series Cross-Sectional Analysis Among East and Southeast Asian Nations.

    PubMed

    Ng, Edwin; Muntaner, Carles; Chung, Haejoo

    2016-04-01

    Recent scholarship offers different theories on how macrosocial determinants affect the population health of East and Southeast Asian nations. Dominant theories emphasize the effects of welfare regimes, welfare generosity, and labor market institutions. In this article, we conduct exploratory time-series cross-sectional analyses to generate new evidence on these theories while advancing a political explanation. Using unbalanced data of 7 East Asian countries and 11 Southeast Asian nations from 1960 to 2012, primary findings are 3-fold. First, welfare generosity measured as education and health spending has a positive impact on life expectancy, net of GDP. Second, life expectancy varies significantly by labor markets; however, these differences are explained by differences in welfare generosity. Third, as East and Southeast Asian countries become more democratic, welfare generosity increases, and population health improves. This study provides new evidence on the value of considering politics, welfare states, and labor markets within the same conceptual framework. © 2016 APJPH.

  19. Linking time-series of single-molecule experiments with molecular dynamics simulations by machine learning

    PubMed Central

    Matsunaga, Yasuhiro

    2018-01-01

    Single-molecule experiments and molecular dynamics (MD) simulations are indispensable tools for investigating protein conformational dynamics. The former provide time-series data, such as donor-acceptor distances, whereas the latter give atomistic information, although this information is often biased by model parameters. Here, we devise a machine-learning method to combine the complementary information from the two approaches and construct a consistent model of conformational dynamics. It is applied to the folding dynamics of the formin-binding protein WW domain. MD simulations over 400 μs led to an initial Markov state model (MSM), which was then "refined" using single-molecule Förster resonance energy transfer (FRET) data through hidden Markov modeling. The refined or data-assimilated MSM reproduces the FRET data and features hairpin one in the transition-state ensemble, consistent with mutation experiments. The folding pathway in the data-assimilated MSM suggests interplay between hydrophobic contacts and turn formation. Our method provides a general framework for investigating conformational transitions in other proteins. PMID:29723137

  20. Linking time-series of single-molecule experiments with molecular dynamics simulations by machine learning.

    PubMed

    Matsunaga, Yasuhiro; Sugita, Yuji

    2018-05-03

    Single-molecule experiments and molecular dynamics (MD) simulations are indispensable tools for investigating protein conformational dynamics. The former provide time-series data, such as donor-acceptor distances, whereas the latter give atomistic information, although this information is often biased by model parameters. Here, we devise a machine-learning method to combine the complementary information from the two approaches and construct a consistent model of conformational dynamics. It is applied to the folding dynamics of the formin-binding protein WW domain. MD simulations over 400 μs led to an initial Markov state model (MSM), which was then "refined" using single-molecule Förster resonance energy transfer (FRET) data through hidden Markov modeling. The refined or data-assimilated MSM reproduces the FRET data and features hairpin one in the transition-state ensemble, consistent with mutation experiments. The folding pathway in the data-assimilated MSM suggests interplay between hydrophobic contacts and turn formation. Our method provides a general framework for investigating conformational transitions in other proteins. © 2018, Matsunaga et al.
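
    A minimal sketch of the Markov-state-model step only: estimate a transition matrix by counting transitions at a fixed lag in a discretized trajectory and row-normalizing. The FRET-based refinement via hidden Markov modeling is beyond this sketch, and the 3-state trajectory is a synthetic stand-in for clustered MD frames.

    ```python
    # Hedged sketch: count-based MSM transition matrix estimation.
    import numpy as np

    def msm_transition_matrix(dtraj, n_states, lag=1):
        C = np.zeros((n_states, n_states))
        for i, j in zip(dtraj[:-lag], dtraj[lag:]):
            C[i, j] += 1
        C += C.T                          # simple detailed-balance symmetrization
        return C / C.sum(axis=1, keepdims=True)

    rng = np.random.default_rng(7)
    P_true = np.array([[0.90, 0.08, 0.02],
                       [0.10, 0.85, 0.05],
                       [0.05, 0.15, 0.80]])
    dtraj = [0]
    for _ in range(50000):
        dtraj.append(rng.choice(3, p=P_true[dtraj[-1]]))
    print(np.round(msm_transition_matrix(np.array(dtraj), 3), 2))
    ```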

  1. An evaluation of Dynamic TOPMODEL for low flow simulation

    NASA Astrophysics Data System (ADS)

    Coxon, G.; Freer, J. E.; Quinn, N.; Woods, R. A.; Wagener, T.; Howden, N. J. K.

    2015-12-01

    Hydrological models are essential tools for drought risk management, often providing input to water resource system models, aiding our understanding of low flow processes within catchments and providing low flow predictions. However, simulating low flows and droughts is challenging, as hydrological systems often demonstrate threshold effects in connectivity, non-linear groundwater contributions and a greater influence of water resource system elements during low flow periods. These dynamic processes are typically not well represented in commonly used hydrological models due to data and model limitations. Furthermore, calibrated or behavioural models may not be effectively evaluated during more extreme drought periods. A better understanding of the processes that occur during low flows, and of how these are represented within models, is thus required if we want to be able to provide robust and reliable predictions of future drought events. In this study, we assess the performance of Dynamic TOPMODEL for low flow simulation. Dynamic TOPMODEL was applied to a number of UK catchments in the Thames region using time series of observed rainfall and potential evapotranspiration data that captured multiple historic droughts over a period of several years. The model performance was assessed against the observed discharge time series using a limits of acceptability framework, which included uncertainty in the discharge time series. We evaluate the models against multiple signatures of catchment low-flow behaviour and investigate differences in model performance between catchments, across model diagnostics, and for different low flow periods. We also considered the impact of surface water and groundwater abstractions and discharges on the observed discharge time series and how this affected the model evaluation. From this analysis of model performance, we suggest future improvements to Dynamic TOPMODEL to better represent low flow processes within the model structure.
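
    A minimal sketch of a limits-of-acceptability evaluation: a simulation is judged time step by time step against observation uncertainty bounds and summarized by the fraction of steps inside the limits. The bounds and series are synthetic placeholders, not the Thames-region data.

    ```python
    # Hedged sketch: limits-of-acceptability scoring of a model run.
    import numpy as np

    rng = np.random.default_rng(8)
    obs = np.exp(rng.normal(0, 1, 365))          # observed discharge (m3/s)
    lower, upper = 0.8 * obs, 1.2 * obs          # e.g. rating-curve uncertainty
    sim = obs * rng.normal(1.0, 0.15, obs.size)  # one behavioural model run

    inside = (sim >= lower) & (sim <= upper)
    print("fraction within limits:", inside.mean())
    low_flow = obs < np.percentile(obs, 30)      # evaluate low flows separately
    print("fraction within limits at low flows:", inside[low_flow].mean())
    ```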

  2. Modeling multidecadal surface water inundation dynamics and key drivers on large river basin scale using multiple time series of Earth-observation and river flow data

    NASA Astrophysics Data System (ADS)

    Heimhuber, V.; Tulbure, M. G.; Broich, M.

    2017-02-01

    Periodically inundated floodplain areas are hot spots of biodiversity and provide a broad range of ecosystem services but have suffered alarming declines in recent history. Despite their importance, their long-term surface water (SW) dynamics and hydroclimatic drivers remain poorly quantified on continental scales. In this study, we used a 26 year time series of Landsat-derived SW maps in combination with river flow data from 68 gauges and spatial time series of rainfall, evapotranspiration and soil moisture to statistically model SW dynamics as a function of key drivers across Australia's Murray-Darling Basin (~1 million km2). We fitted generalized additive models for 18,521 individual modeling units made up of 10 × 10 km grid cells, each split into floodplain, floodplain-lake, and nonfloodplain area. Average goodness of fit of models was high across floodplains and floodplain-lakes (r2 > 0.65), which were primarily driven by river flow, and was lower for nonfloodplain areas (r2 > 0.24), which were primarily driven by rainfall. Local climate conditions were more relevant for SW dynamics in the northern compared to the southern basin and had the highest influence in the least regulated and most extended floodplains. We further applied the models of two contrasting floodplain areas to predict SW extents of cloud-affected time steps in the Landsat series during the large 2010 floods with high validated accuracy (r2 > 0.97). Our framework is applicable to other complex river basins across the world and enables a more detailed quantification of large floods and drivers of SW dynamics compared to existing methods.
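
    A minimal sketch of the statistical core: fit a generalized additive model for surface-water extent as a smooth function of river flow and rainfall. The pygam package is an assumption (the authors fit their GAMs per 10 x 10 km unit with their own driver set), and the data are synthetic.

    ```python
    # Hedged sketch: GAM of surface-water extent vs. flow and rainfall.
    import numpy as np
    from pygam import LinearGAM, s

    rng = np.random.default_rng(9)
    flow = rng.gamma(2.0, 50.0, 1000)     # gauge flow (m3/s)
    rain = rng.gamma(1.5, 10.0, 1000)     # rainfall (mm)
    sw = 5 * np.log1p(flow) + 0.3 * rain + rng.normal(0, 2, 1000)  # SW extent (km2)

    X = np.column_stack([flow, rain])
    gam = LinearGAM(s(0) + s(1)).fit(X, sw)   # one smooth term per driver
    gam.summary()
    ```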

  3. Systemic and intensifying drought induces collapse and replacement of native fishes: a time-series approach

    NASA Astrophysics Data System (ADS)

    Ruhi, A.; Olden, J. D.; Sabo, J. L.

    2015-12-01

    In the American Southwest, hydrologic drought has become a new normal as a result of increasing human appropriation of freshwater resources and increased aridity associated with global warming. Although drought has often been touted to threaten freshwater biodiversity, connecting drought to extinction risk of highly-imperiled faunas remains a challenge. Here we combine time-series methods from signal processing and econometrics to analyze a spatially comprehensive and long-term dataset linking discharge variation and community abundance of fish across the American Southwest. This novel time series framework identifies ongoing trends in daily discharge anomalies across the Southwest, quantifies the effect of the historical hydrologic drivers on fish community abundance, and allows us to simulate species trajectories and range-wide risk of decline (quasiextinction) under scenarios of future climate. Spectral anomalies have declined over the last 30 years in at least a quarter of the stream gaging stations across the American Southwest, and these anomalies are robust predictors of the historical abundance of native and non-native fishes. Quasiextinction probabilities are high (>50%) for nearly three-quarters of the native species across several large river basins in the same region, and the negative trend in annual anomalies increases quasiextinction risk for native fishes but reduces it for non-native fishes. These findings suggest that ongoing drought is causing range-wide collapse and replacement of native fish faunas, and that this homogenization of western fish faunas will continue given the prevailing negative trend in discharge anomalies. Additionally, this combination of methods can be applied elsewhere as long as environmental and biological long-term time-series data are available. Collectively, these methods allow identification of the link between hydroclimatic forcing and ecological responses and thus may help anticipate the potential impacts of ongoing and future hydrologic extremes in freshwater ecosystems.
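
    A minimal sketch of a quasiextinction calculation: simulate many trajectories of log abundance from a stochastic decline model and report the probability of falling below a threshold within a horizon. The parameters are illustrative, not estimates from the paper.

    ```python
    # Hedged sketch: quasiextinction probability from simulated trajectories.
    import numpy as np

    rng = np.random.default_rng(10)

    def quasiextinction_prob(n0, mu, sigma, threshold, years, n_sims=10000):
        """Diffusion-style model of log abundance: x_{t+1} = x_t + mu + noise."""
        x = np.full(n_sims, np.log(n0))
        hit = np.zeros(n_sims, dtype=bool)
        for _ in range(years):
            x += mu + rng.normal(0, sigma, n_sims)
            hit |= x < np.log(threshold)
        return hit.mean()

    print(quasiextinction_prob(n0=500, mu=-0.05, sigma=0.3,
                               threshold=50, years=30))
    ```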

  4. Detection and characterization of lightning-based sources using continuous wavelet transform: application to audio-magnetotellurics

    NASA Astrophysics Data System (ADS)

    Larnier, H.; Sailhac, P.; Chambodut, A.

    2018-01-01

    Atmospheric electromagnetic waves created by global lightning activity contain information about electrical processes of the inner and the outer Earth. Large signal-to-noise ratio events are particularly interesting because they convey information about electromagnetic properties along their path. We introduce a new methodology to automatically detect and characterize lightning-based waves using a time-frequency decomposition obtained through the application of the continuous wavelet transform. We focus specifically on three types of sources, namely atmospherics, slow tails and whistlers, that cover the frequency range 10 Hz to 10 kHz. Each wave has distinguishable characteristics in the time-frequency domain due to source shape and dispersion processes. Our methodology allows automatic detection of each type of event in the time-frequency decomposition thanks to their specific signature. Horizontal polarization attributes are also recovered in the time-frequency domain. This procedure is first applied to synthetic extremely low frequency time-series with different signal-to-noise ratios to test for robustness. We then apply it to real data: three stations of audio-magnetotelluric data acquired in Guadeloupe, in the overseas French territories. Most of the analysed atmospherics and slow tails display linear polarization, whereas the analysed whistlers are elliptically polarized. The diversity of lightning activity is finally analysed in an audio-magnetotelluric data processing framework, as used in subsurface prospecting, through estimation of the impedance response functions. We show that audio-magnetotelluric processing results depend mainly on the frequency content of electromagnetic waves observed in the processed time-series, with an emphasis on the difference between morning and afternoon acquisition. Our new methodology based on the time-frequency signature of lightning-induced electromagnetic waves allows automatic detection and characterization of events in audio-magnetotelluric time-series, providing the means to assess the quality of response functions obtained through processing.
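
    A minimal sketch of detecting a transient event in the time-frequency plane via the continuous wavelet transform, using PyWavelets (an assumption; the authors use their own CWT implementation and source-specific signatures, not a simple power threshold).

    ```python
    # Hedged sketch: CWT power thresholding to flag a synthetic burst.
    import numpy as np
    import pywt

    fs = 2000.0
    t = np.arange(0, 2, 1 / fs)
    sig = 0.1 * np.random.default_rng(11).normal(size=t.size)
    burst = (t > 1.0) & (t < 1.02)
    sig[burst] += np.sin(2 * np.pi * 300 * t[burst])   # synthetic "atmospheric"

    scales = np.arange(2, 64)
    coef, freqs = pywt.cwt(sig, scales, "morl", sampling_period=1 / fs)
    power = np.abs(coef) ** 2

    # flag times where wavelet power far exceeds the background at any scale
    thresh = power.mean() + 8 * power.std()
    detected = t[(power > thresh).any(axis=0)]
    print("event span (s):", detected.min(), "-", detected.max())
    ```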

  5. Validation of High-Fidelity CFD/CAA Framework for Launch Vehicle Acoustic Environment Simulation against Scale Model Test Data

    NASA Technical Reports Server (NTRS)

    Liever, Peter A.; West, Jeffrey S.; Harris, Robert E.

    2016-01-01

    A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate Discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured mesh Discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.

  6. Evaluating the effectiveness of risk minimisation measures: the application of a conceptual framework to Danish real-world dabigatran data.

    PubMed

    Nyeland, Martin Erik; Laursen, Mona Vestergaard; Callréus, Torbjörn

    2017-06-01

    For both marketing authorization holders and regulatory authorities, evaluating the effectiveness of risk minimization measures is now an integral part of pharmacovigilance in the European Union. The overall aim of activities in this area is to assess the performance of risk minimization measures implemented in order to ensure a positive benefit-risk balance in patients treated with a medicinal product. Following a review of the relevant literature, we developed a conceptual framework consisting of four domains (data, knowledge, behaviour and outcomes) intended for the evaluation of risk minimization measures put into practice in the Danish health-care system. For the implementation of the framework, four classes of monitoring variables can be named and defined: patient descriptors, performance-related indicators of knowledge, behaviour and outcomes. We reviewed the features of the framework when applied to historical, real-world data following the introduction of dabigatran in Denmark for the prophylactic treatment of patients with non-valvular atrial fibrillation. The application of the framework provided useful graphical displays and an opportunity for a statistical evaluation (interrupted time series analysis) of a regulatory intervention. © 2017 Commonwealth of Australia. Pharmacoepidemiology & Drug Safety © 2017 John Wiley & Sons, Ltd.
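
    A minimal sketch of the interrupted time series (segmented regression) analysis mentioned above: level and slope are allowed to change at the regulatory intervention. The data are synthetic monthly counts, not the Danish dabigatran data.

    ```python
    # Hedged sketch: segmented regression for an interrupted time series.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(12)
    months = np.arange(48)
    t0 = 24                                            # intervention month
    post = (months >= t0).astype(float)
    y = (50 + 0.5 * months - 8 * post - 0.6 * (months - t0) * post
         + rng.normal(0, 2, months.size))

    X = sm.add_constant(np.column_stack([months, post, (months - t0) * post]))
    fit = sm.OLS(y, X).fit()
    print(fit.params)  # [baseline level, baseline slope, level change, slope change]
    ```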

  7. A framework for tracking post-wildfire trajectories and desired future conditions using NDVI time series

    NASA Astrophysics Data System (ADS)

    Norman, S. P.; Hargrove, W. W.; Lee, D. C.; Spruce, J.

    2013-12-01

    Wildfires could provide a cost-effective means to maintain or restore some aspects of fire-adapted landscapes. Yet with the added influence of climate change and invasives, wildfires may also facilitate or accelerate undesired type conversions. As megafires become increasingly common across portions of the US West, managers require a framework for long-term monitoring that integrates the trajectories of fire-prone landscapes and objectives, not just conditions immediately after a burn. Systematic use of satellite data provides an efficient cross-jurisdictional solution to this problem. Since 2000, MODIS has provided high-frequency, 240 m resolution observations of Earth. Using this data stream, the ForWarn system, developed through a partnership of the US Forest Service, NASA-Stennis and others, provides 46 estimates of the Normalized Difference Vegetation Index (NDVI) per year for the conterminous US. From this time series, a variety of secondary metrics have been derived, including median annual NDVI, amplitude, and phenological spikiness. Each is a fire- and recovery-sensitive measure that allows managers to systematically track conditions with respect to either the pre-fire baseline or desired future conditions more adaptively. In dry interior forests where wildfires could be used to thin stands, recovery to untreated conditions may not be desired given fuels objectives or climate change. In more mesic systems, fire effects may be monitored as staged succession. With both coarse filter monitoring and desired conditions in hand, managers can better recognize and prioritize problems in disturbance-prone landscapes.
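
    A minimal sketch of deriving per-year recovery metrics (median NDVI, amplitude, and a simple "spikiness" measure) from a 46-observation-per-year NDVI stream. The spikiness definition used here (mean absolute first difference) is an assumption, not necessarily ForWarn's.

    ```python
    # Hedged sketch: annual NDVI metrics from a 46-obs/yr time series.
    import numpy as np

    rng = np.random.default_rng(13)
    years, per_year = 10, 46
    doy = np.linspace(0, 2 * np.pi, per_year)
    ndvi = np.concatenate([0.5 + 0.25 * np.sin(doy)
                           + rng.normal(0, 0.02, per_year) for _ in range(years)])
    by_year = ndvi.reshape(years, per_year)

    median = np.median(by_year, axis=1)
    amplitude = by_year.max(axis=1) - by_year.min(axis=1)
    spikiness = np.abs(np.diff(by_year, axis=1)).mean(axis=1)  # mean |dNDVI|
    print(median.round(2), amplitude.round(2), spikiness.round(3), sep="\n")
    ```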

  8. Estimating mountain basin-mean precipitation from streamflow using Bayesian inference

    NASA Astrophysics Data System (ADS)

    Henn, Brian; Clark, Martyn P.; Kavetski, Dmitri; Lundquist, Jessica D.

    2015-10-01

    Estimating basin-mean precipitation in complex terrain is difficult due to uncertainty in the topographical representativeness of precipitation gauges relative to the basin. To address this issue, we use Bayesian methodology coupled with a multimodel framework to infer basin-mean precipitation from streamflow observations, and we apply this approach to snow-dominated basins in the Sierra Nevada of California. Using streamflow observations, forcing data from lower-elevation stations, the Bayesian Total Error Analysis (BATEA) methodology and the Framework for Understanding Structural Errors (FUSE), we infer basin-mean precipitation, and compare it to basin-mean precipitation estimated using topographically informed interpolation from gauges (PRISM, the Parameter-elevation Regression on Independent Slopes Model). The BATEA-inferred spatial patterns of precipitation show agreement with PRISM in terms of the rank of basins from wet to dry but differ in absolute values. In some of the basins, these differences may reflect biases in PRISM, because some implied PRISM runoff ratios may be inconsistent with the regional climate. We also infer annual time series of basin precipitation using a two-step calibration approach. Assessment of the precision and robustness of the BATEA approach suggests that uncertainty in the BATEA-inferred precipitation is primarily related to uncertainties in hydrologic model structure. Despite these limitations, time series of inferred annual precipitation under different model and parameter assumptions are strongly correlated with one another, suggesting that this approach is capable of resolving year-to-year variability in basin-mean precipitation.
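
    A compact sketch of the inference idea only (not BATEA or FUSE): infer a basin-mean precipitation multiplier by Metropolis sampling, comparing a toy one-bucket runoff model against "observed" streamflow. Everything here is synthetic and the model is deliberately trivial.

    ```python
    # Hedged sketch: Metropolis inference of a precipitation multiplier.
    import numpy as np

    rng = np.random.default_rng(14)
    p_gauge = rng.gamma(0.6, 8.0, 365)              # lower-elevation gauge (mm/d)

    def simulate_q(mult, p, k=0.1):
        s, q = 100.0, np.empty(p.size)              # storage (mm), runoff (mm/d)
        for i, pi in enumerate(p):
            s += mult * pi
            q[i] = k * s
            s -= q[i]
        return q

    q_obs = simulate_q(1.4, p_gauge) + rng.normal(0, 0.5, 365)

    def log_post(mult):
        if not 0.5 < mult < 3.0:                    # uniform prior bounds
            return -np.inf
        resid = q_obs - simulate_q(mult, p_gauge)
        return -0.5 * np.sum(resid ** 2) / 0.5 ** 2

    chain, m = [], 1.0
    lp = log_post(m)
    for _ in range(5000):
        prop = m + rng.normal(0, 0.02)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:    # Metropolis accept/reject
            m, lp = prop, lp_prop
        chain.append(m)
    print("posterior mean multiplier:", np.mean(chain[1000:]))  # ~1.4
    ```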

  9. Identification of animal behavioral strategies by inverse reinforcement learning.

    PubMed

    Yamaguchi, Shoichiro; Naoki, Honda; Ikeda, Muneki; Tsukada, Yuki; Nakano, Shunji; Mori, Ikue; Ishii, Shin

    2018-05-01

    Animals are able to reach a desired state in an environment by controlling various behavioral patterns. Identification of the behavioral strategy used for this control is important for understanding animals' decision-making and is fundamental to dissecting the information processing done by the nervous system. However, methods for quantifying such behavioral strategies have not been fully established. In this study, we developed an inverse reinforcement-learning (IRL) framework to identify an animal's behavioral strategy from behavioral time-series data. We applied this framework to C. elegans thermotactic behavior; after cultivation at a constant temperature with or without food, fed worms prefer, while starved worms avoid, the cultivation temperature on a thermal gradient. Our IRL approach revealed that the fed worms used both the absolute temperature and its temporal derivative and that their behavior involved two strategies: directed migration (DM) and isothermal migration (IM). With DM, worms efficiently reached specific temperatures, which explains their thermotactic behavior when fed. With IM, worms moved along a constant temperature, which reflects isothermal tracking, well observed in previous studies. In contrast to fed animals, starved worms escaped the cultivation temperature using only the absolute temperature, but not its temporal derivative. We also investigated the neural basis underlying these strategies by applying our method to thermosensory neuron-deficient worms. Thus, our IRL-based approach is useful in identifying animal strategies from behavioral time-series data and could be applied to a wide range of behavioral studies, including decision-making, in other organisms.

  10. Differentiated Responses of Apple Tree Floral Phenology to Global Warming in Contrasting Climatic Regions.

    PubMed

    Legave, Jean-Michel; Guédon, Yann; Malagi, Gustavo; El Yaacoubi, Adnane; Bonhomme, Marc

    2015-01-01

    The responses of flowering phenology to temperature increases in temperate fruit trees have rarely been investigated in contrasting climatic regions. This is an appropriate framework for highlighting varying responses to diverse warming contexts, which would potentially combine chill accumulation (CA) declines and heat accumulation (HA) increases. To examine this issue, a data set was constituted for apple trees from flowering dates collected for two phenological stages of three cultivars in seven climate-contrasting temperate regions of Western Europe and in three mild regions, one in Northern Morocco and two in Southern Brazil. Multiple change-point models were applied to the flowering date series, as well as to the corresponding series of mean temperature during two successive periods that respectively determine the fulfillment of chill and heat requirements. A new overview in space and time of flowering date changes in apple trees is provided, highlighting not only flowering date advances, as in previous studies, but also stationary flowering date series. At the global scale, differentiated flowering time patterns result from varying interactions between contrasting thermal determinisms of flowering dates and contrasting warming contexts. This may explain flowering date advances in most European regions and in Morocco vs. stationary flowering date series in the Brazilian regions. A notable exception in Europe was found in the French Mediterranean region, where the flowering date series was stationary. While the flowering duration series were stationary whatever the region, the flowering durations were far longer in mild regions compared to temperate regions. Our findings suggest a new warming vulnerability in temperate Mediterranean regions, which could shift toward responding more to chill decline and consequently experience late and extended flowering under future warming scenarios.

  11. Informatics approaches in the Biological Characterization of Adverse Outcome Pathways

    EPA Science Inventory

    Adverse Outcome Pathways (AOPs) are a conceptual framework to characterize toxicity pathways by a series of mechanistic steps from a molecular initiating event to population outcomes. This framework helps to direct risk assessment research, for example by aiding in computational ...

  12. Coastal Atmosphere and Sea Time Series (CoASTS)

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Berthon, Jean-Francoise; Zibordi, Giuseppe; Doyle, John P.; Grossi, Stefania; vanderLinde, Dirk; Targa, Cristina; McClain, Charles R. (Technical Monitor)

    2002-01-01

    In this document, the first three years of a time series of bio-optical marine and atmospheric measurements are presented and analyzed. These measurements were performed from an oceanographic tower in the northern Adriatic Sea within the framework of the Coastal Atmosphere and Sea Time Series (CoASTS) project, an ocean color calibration and validation activity. The data set collected includes spectral measurements of the in-water apparent (diffuse attenuation coefficient, reflectance, Q-factor, etc.) and inherent (absorption and scattering coefficients) optical properties, as well as the concentrations of the main optical components (pigment and suspended matter concentrations). Clear seasonal patterns are exhibited by the marine quantities, on which an appreciable short-term variability (on the order of a half day to one day) is superimposed. This short-term variability is well correlated with the changes in salinity at the surface resulting from the southward transport of freshwater coming from the northern rivers. Concentrations of chlorophyll a and total suspended matter span more than two orders of magnitude. The bio-optical characteristics of the measurement site pertain to both Case-I (about 64%) and Case-II (about 36%) waters, based on a relationship between the beam attenuation coefficient at 660 nm and the chlorophyll a concentration. Empirical algorithms relating in-water remote sensing reflectance ratios and optical components or properties of interest (chlorophyll a, total suspended matter, and the diffuse attenuation coefficient) are presented.
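
    A minimal sketch of the kind of empirical band-ratio algorithm such data support: a polynomial in the log of a blue/green reflectance ratio predicting log chlorophyll. The coefficients below are placeholders fit to synthetic data, not the CoASTS regression results.

    ```python
    # Hedged sketch: band-ratio chlorophyll retrieval on synthetic data.
    import numpy as np

    rng = np.random.default_rng(15)
    chl = 10 ** rng.uniform(-1, 1.3, 200)              # 0.1 - 20 mg/m3
    log_ratio = -0.35 * np.log10(chl) + rng.normal(0, 0.03, 200)

    coefs = np.polyfit(log_ratio, np.log10(chl), 2)    # quadratic fit
    predict = lambda r: 10 ** np.polyval(coefs, np.log10(r))
    print(predict(0.8), predict(2.0))                  # chl from reflectance ratio
    ```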

  13. Reconstruction of daily erythemal UV radiation values for the last century - The benefit of modelled ozone

    NASA Astrophysics Data System (ADS)

    Junk, J.; Feister, U.; Rozanov, E.; Krzyścin, J. W.

    2013-05-01

    Solar erythemal UV radiation (UVER) is highly relevant for numerous biological processes that affect plants, animals, and human health. Nevertheless, long-term UVER records are scarce. As significant declines in the column ozone concentration were observed in the past and a recovery of the stratospheric ozone layer is anticipated by the middle of the 21st century, there is a strong interest in the temporal variation of UVER time series. Therefore, we combined ground-based measurements of different meteorological variables with modeled ozone data sets to reconstruct time series of daily totals of UVER at the Meteorological Observatory Potsdam, Germany. Artificial neural networks were trained with measured UVER, sunshine duration, the day of year, measured and modeled total column ozone, as well as the minimum solar zenith angle. This allows for the reconstruction of daily totals of UVER for the period from 1901 to 1999. Additionally, analyses of the long-term variations from 1901 until 1999 of the reconstructed, new UVER data set are presented. The time series of monthly and annual totals of UVER provide a long-term meteorological basis for epidemiological investigations in human health and occupational medicine for the region of Potsdam and Berlin. A strong benefit of our ANN approach is the fact that it can be easily adapted to different geographical locations, as successfully tested in the framework of COST Action 726.
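
    A minimal sketch of the ANN reconstruction idea: train a multilayer perceptron to map (sunshine duration, day of year, total ozone, minimum solar zenith angle) to daily erythemal UV, then apply it to days without UV measurements. The data are synthetic stand-ins for the Potsdam records, with a made-up response function.

    ```python
    # Hedged sketch: MLP reconstruction of daily UVER from meteorological inputs.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(16)
    n = 3000
    doy = rng.integers(1, 366, n)
    sun = rng.uniform(0, 14, n)                        # sunshine duration (h)
    ozone = rng.normal(330, 30, n)                     # total column ozone (DU)
    sza = 52.4 - 23.4 * np.cos(2 * np.pi * (doy - 172) / 365)  # min SZA (deg)
    uver = (12 * np.cos(np.radians(sza)) ** 2 * (0.3 + 0.05 * sun)
            * (330 / ozone) ** 1.2 + rng.normal(0, 0.1, n))    # daily UVER (kJ/m2)

    X = StandardScaler().fit_transform(np.column_stack([sun, doy, ozone, sza]))
    model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000,
                         random_state=0).fit(X[:2500], uver[:2500])
    print("R2 on held-out days:", model.score(X[2500:], uver[2500:]))
    ```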

  14. Uncovering the genetic signature of quantitative trait evolution with replicated time series data.

    PubMed

    Franssen, S U; Kofler, R; Schlötterer, C

    2017-01-01

    The genetic architecture of adaptation in natural populations has not yet been resolved: it is not clear to what extent the spread of beneficial mutations (selective sweeps) or the response of many quantitative trait loci drives adaptation to environmental changes. Although much attention has been given to the genomic footprint of selective sweeps, the importance of selection on quantitative traits is still not well studied, as the associated genomic signature is extremely difficult to detect. We propose 'Evolve and Resequence' as a promising tool to study polygenic adaptation of quantitative traits in evolving populations. Simulating replicated time series data, we show that adaptation to a new intermediate trait optimum has three characteristic phases that are reflected on the genomic level: (1) directional frequency changes towards the new trait optimum, (2) plateauing of allele frequencies when the new trait optimum has been reached and (3) subsequent divergence between replicated trajectories ultimately leading to the loss or fixation of alleles while the trait value does not change. We explore the characteristics of these three phases for relevant population genetic parameters to provide expectations for various experimental evolution designs. Remarkably, over a broad range of parameters the trajectories of selected alleles display a pattern across replicates which differs from both neutrality and directional selection. We conclude that replicated time series data from experimental evolution studies provide a promising framework to study polygenic adaptation from whole-genome population genetics data.

  15. Advanced Computational Framework for Environmental Management ZEM, Version 1.x

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vesselinov, Velimir V.; O'Malley, Daniel; Pandey, Sachin

    2016-11-04

    Typically environmental management problems require analysis of large and complex data sets originating from concurrent data streams with different data collection frequencies and pedigree. These big data sets require on-the-fly integration into a series of models with different complexity for various types of model analyses where the data are applied as soft and hard model constraints. This is needed to provide fast iterative model analyses based on the latest available data to guide decision-making. Furthermore, the data and model are associated with uncertainties. The uncertainties are probabilistic (e.g. measurement errors) and non-probabilistic (unknowns, e.g. alternative conceptual models characterizing site conditions). To address all of these issues, we have developed an integrated framework for real-time data and model analyses for environmental decision-making called ZEM. The framework allows for seamless and on-the-fly integration of data and modeling results for robust and scientifically-defensible decision-making, applying advanced decision analysis tools such as Bayesian Information-Gap Decision Theory (BIG-DT). The framework also includes advanced methods for optimization that are capable of dealing with a large number of unknown model parameters, and surrogate (reduced order) modeling capabilities based on support vector regression techniques. The framework is coded in Julia, a state-of-the-art high-performance programming language (http://julialang.org). The ZEM framework is open-source, released under the GPL v3 license, and can be applied to any environmental management site.

  16. Building a Framework that Supports Project Teams: An Example from a University/Community Collaborative Initiative

    ERIC Educational Resources Information Center

    Kolb, Judith A.; Sandmeyer, Louise E.

    2007-01-01

    In the university initiative described in this article, a series of project teams were funded to work on a variety of collaborative projects. The focus of this piece is on the framework that was developed and executed to select, support, and evaluate these teams. The framework is explained and described using data gathered throughout the study and…

  17. Critically Thinking about Harry Potter: A Framework for Discussing Controversial Works in the English Classroom.

    ERIC Educational Resources Information Center

    Marshall, Joanne M.

    2003-01-01

    Notes that at the school level and in the classroom, English educators can lead the way in open discussion, particularly discussion about books such as the Harry Potter series. Offers a seven-step framework as a way to begin an open discussion. Concludes that the framework acknowledges those concerns and provides a voice for them while also…

  18. A data management infrastructure for bridge monitoring

    NASA Astrophysics Data System (ADS)

    Jeong, Seongwoon; Byun, Jaewook; Kim, Daeyoung; Sohn, Hoon; Bae, In Hwan; Law, Kincho H.

    2015-04-01

    This paper discusses a data management infrastructure framework for bridge monitoring applications. As sensor technologies mature and become economically affordable, their deployment for bridge monitoring will continue to grow. Data management becomes a critical issue, not only for storing the sensor data but also for integrating with the bridge model to support other functions, such as management, maintenance and inspection. The focus of this study is on the effective data management of bridge information and sensor data, which is crucial to structural health monitoring and life cycle management of bridge structures. We review the state of the art of bridge information modeling and sensor data management, and propose a data management framework for bridge monitoring based on NoSQL database technologies, which have been shown useful in handling high-volume, time-series data and in dealing flexibly with unstructured data schemas. Specifically, Apache Cassandra and MongoDB are deployed for the prototype implementation of the framework. This paper describes the database design for an XML-based Bridge Information Modeling (BrIM) schema, and the representation of sensor data using Sensor Model Language (SensorML). The proposed prototype data management framework is validated using data collected from the Yeongjong Bridge in Incheon, Korea.
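
    A minimal sketch of storing and querying time-series sensor readings in MongoDB with pymongo, in the spirit of the framework described. The database and collection names are assumptions, and a local mongod must be running.

    ```python
    # Hedged sketch: time-indexed sensor readings in MongoDB.
    import datetime as dt
    from pymongo import ASCENDING, MongoClient

    client = MongoClient("mongodb://localhost:27017")
    col = client["bridge_monitoring"]["accelerometer_A01"]
    col.create_index([("ts", ASCENDING)])

    col.insert_many([
        {"ts": dt.datetime(2015, 4, 1, 12, 0, 0, 100000 * i), "accel_mg": 0.1 * i}
        for i in range(10)
    ])

    start = dt.datetime(2015, 4, 1, 12, 0, 0, 300000)
    end = dt.datetime(2015, 4, 1, 12, 0, 0, 700000)
    for doc in col.find({"ts": {"$gte": start, "$lt": end}}).sort("ts", ASCENDING):
        print(doc["ts"], doc["accel_mg"])
    ```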

  19. Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition.

    PubMed

    Ordóñez, Francisco Javier; Roggen, Daniel

    2016-01-18

    Human activity recognition (HAR) tasks have traditionally been solved using engineered features obtained by heuristic processes. Current research suggests that deep convolutional neural networks are suited to automate feature extraction from raw sensor inputs. However, human activities are made of complex sequences of motor movements, and capturing this temporal dynamics is fundamental for successful HAR. Based on the recent success of recurrent neural networks for time series domains, we propose a generic deep framework for activity recognition based on convolutional and LSTM recurrent units, which: (i) is suitable for multimodal wearable sensors; (ii) can perform sensor fusion naturally; (iii) does not require expert knowledge in designing features; and (iv) explicitly models the temporal dynamics of feature activations. We evaluate our framework on two datasets, one of which has been used in a public activity recognition challenge. Our results show that our framework outperforms competing deep non-recurrent networks on the challenge dataset by 4% on average, outperforming some of the previously reported results by up to 9%. Our results show that the framework can be applied to homogeneous sensor modalities, but can also fuse multimodal sensors to improve performance. We characterise key architectural hyperparameters' influence on performance to provide insights about their optimisation.
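
    A minimal PyTorch sketch of the convolutional + LSTM architecture family described above. The layer sizes are illustrative, not the paper's exact DeepConvLSTM configuration.

    ```python
    # Hedged sketch: conv layers for temporal features, LSTM for dynamics.
    import torch
    import torch.nn as nn

    class ConvLSTM(nn.Module):
        def __init__(self, n_channels=6, n_classes=5, hidden=64):
            super().__init__()
            self.conv = nn.Sequential(            # temporal feature extraction
                nn.Conv1d(n_channels, 32, kernel_size=5), nn.ReLU(),
                nn.Conv1d(32, 64, kernel_size=5), nn.ReLU(),
            )
            self.lstm = nn.LSTM(64, hidden, num_layers=2, batch_first=True)
            self.head = nn.Linear(hidden, n_classes)

        def forward(self, x):                     # x: (batch, channels, time)
            z = self.conv(x)                      # (batch, 64, time')
            z = z.permute(0, 2, 1)                # (batch, time', 64) for the LSTM
            out, _ = self.lstm(z)
            return self.head(out[:, -1, :])       # classify from the last step

    x = torch.randn(8, 6, 128)                    # 8 windows, 6 sensor channels
    print(ConvLSTM()(x).shape)                    # torch.Size([8, 5])
    ```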

  20. A Framework for Debugging Geoscience Projects in a High Performance Computing Environment

    NASA Astrophysics Data System (ADS)

    Baxter, C.; Matott, L.

    2012-12-01

    High performance computing (HPC) infrastructure has become ubiquitous in today's world with the emergence of commercial cloud computing and academic supercomputing centers. Teams of geoscientists, hydrologists and engineers can take advantage of this infrastructure to undertake large research projects - for example, linking one or more site-specific environmental models with soft computing algorithms, such as heuristic global search procedures, to perform parameter estimation and predictive uncertainty analysis, and/or design least-cost remediation systems. However, the size, complexity and distributed nature of these projects can make identifying failures in the associated numerical experiments using conventional ad-hoc approaches both time-consuming and ineffective. To address these problems a multi-tiered debugging framework has been developed. The framework allows for quickly isolating and remedying a number of potential experimental failures, including: failures in the HPC scheduler; bugs in the soft computing code; bugs in the modeling code; and permissions and access control errors. The utility of the framework is demonstrated via application to a series of over 200,000 numerical experiments involving a suite of 5 heuristic global search algorithms and 15 mathematical test functions serving as cheap analogues for the simulation-based optimization of pump-and-treat subsurface remediation systems.

  1. Near Real-Time Monitoring of Forest Disturbance: A Multi-Sensor Remote Sensing Approach and Assessment Framework

    NASA Astrophysics Data System (ADS)

    Tang, Xiaojing

    Fast and accurate monitoring of tropical forest disturbance is essential for understanding current patterns of deforestation as well as helping eliminate illegal logging. This dissertation explores the use of data from different satellites for near real-time monitoring of forest disturbance in tropical forests, including: development of new monitoring methods; development of new assessment methods; and assessment of the performance and operational readiness of existing methods. Current methods for accuracy assessment of remote sensing products do not address the central priority of near real-time monitoring: detecting disturbance events as early as possible. I introduce a new assessment framework for near real-time products that focuses on the timing and the minimum detectable size of disturbance events. The new framework reveals the relationship between change detection accuracy and the time needed to identify events. In regions that are frequently cloudy, near real-time monitoring using data from a single sensor is difficult. This study extends the work by Xin et al. (2013) and develops a new time series method (Fusion2) based on fusion of Landsat and MODIS (Moderate Resolution Imaging Spectroradiometer) data. Results for three test sites in the Amazon Basin show that Fusion2 can detect 44.4% of the forest disturbance within 13 clear observations (82 days) after the initial disturbance. The smallest event detected by Fusion2 is 6.5 ha. Also, Fusion2 detects disturbance faster and has less commission error than more conventional methods. In a comparison of coarse resolution sensors, the combination of MODIS Terra and Aqua provides faster and more accurate detection of disturbance events than VIIRS (Visible Infrared Imaging Radiometer Suite) or either MODIS sensor alone. The performance of near real-time monitoring using VIIRS is slightly worse than MODIS Terra but significantly better than MODIS Aqua. The new monitoring methods developed in this dissertation give forest protection organizations the capacity to monitor illegal logging events promptly. In the future, combining two Landsat and two Sentinel-2 satellites will provide global coverage at 30 m resolution every 4 days, and routine monitoring may be possible at high resolution. The methods and assessment framework developed in this dissertation are adaptable to newly available datasets.

  2. Spatiotemporal interpolation of discharge across a river network by using synthetic SWOT satellite data

    NASA Astrophysics Data System (ADS)

    Paiva, Rodrigo C. D.; Durand, Michael T.; Hossain, Faisal

    2015-01-01

    Recent efforts have sought to estimate river discharge and other surface water-related quantities using spaceborne sensors, with better spatial coverage but worse temporal sampling compared with in situ measurements. The Surface Water and Ocean Topography (SWOT) mission will provide river discharge estimates globally from space. However, questions remain on how to optimally use the spatially distributed but asynchronous satellite observations to generate continuous fields. This paper presents a statistical model, River Kriging (RK), for estimating discharge time series in a river network in the context of the SWOT mission. RK uses discharge estimates at different locations and times to produce a continuous field using spatiotemporal kriging. A key component of RK is the space-time river discharge covariance, which was derived analytically from the diffusive wave approximation of the Saint-Venant equations. The RK covariance also accounts for the loss of correlation at confluences. The model performed well in a case study on the Ganges-Brahmaputra-Meghna (GBM) river system in Bangladesh using synthetic SWOT observations. The correlation model reproduced empirically derived values. RK (R2=0.83) outperformed other kriging-based methods (R2=0.80), as well as a simple time series linear interpolation (R2=0.72). RK was also used to combine discharge from SWOT and in situ observations, improving estimates when the latter are included (R2=0.91). The proposed statistical concepts may eventually provide a feasible framework to estimate continuous discharge time series across a river network based on SWOT data, other altimetry missions, and/or in situ data.
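
    A minimal numerical sketch of the spatiotemporal kriging step is given below; the separable exponential covariance is an illustrative stand-in for the paper's diffusive-wave-derived covariance, and the observations are synthetic.

        # Sketch: spatiotemporal ordinary kriging of discharge anomalies along
        # a single river reach, using a separable exponential covariance.
        import numpy as np

        def cov(s1, t1, s2, t2, sill=1.0, ls=50.0, lt=10.0):
            """Space-time covariance between points at (chainage km, day)."""
            return sill * np.exp(-abs(s1 - s2) / ls - abs(t1 - t2) / lt)

        # Asynchronous observations: chainage (km), time (days), anomaly (m^3/s).
        s = np.array([10.0, 120.0, 250.0, 400.0])
        t = np.array([0.0, 3.0, 7.0, 11.0])
        z = np.array([150.0, 90.0, 60.0, 130.0])
        n = len(z)

        # Ordinary kriging system: covariances bordered by the unbiasedness row.
        K = np.ones((n + 1, n + 1))
        K[n, n] = 0.0
        for i in range(n):
            for j in range(n):
                K[i, j] = cov(s[i], t[i], s[j], t[j])

        def predict(s0, t0):
            k = np.append([cov(s0, t0, s[i], t[i]) for i in range(n)], 1.0)
            w = np.linalg.solve(K, k)
            return w[:n] @ z        # kriged anomaly at an unobserved (s0, t0)

        print(predict(200.0, 5.0))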

  3. A new method for quantifying and modeling large scale surface water inundation dynamics and key drivers using multiple time series of Earth observation and river flow data. A case study for Australia's Murray-Darling Basin

    NASA Astrophysics Data System (ADS)

    Heimhuber, Valentin; Tulbure, Mirela G.; Broich, Mark

    2017-04-01

    Periodically inundated surface water (SW) areas such as floodplains are hotspots of biodiversity and provide a broad range of ecosystem services, but have suffered alarming declines in recent history. Large-scale flooding events govern the dynamics of these areas and are a critical component of the terrestrial water cycle, but their propagation through river systems and the corresponding long-term SW dynamics remain poorly quantified on continental or global scales. In this research, we used an unprecedented Landsat-based time series of SW maps (1986-2011) to develop statistical inundation models and quantify the role of driver variables across the Murray-Darling Basin (MDB) (1 million km2), which is Australia's bread basket and subject to competing demands over limited water resources. We fitted generalized additive models (GAM) between SW extent as the dependent variable and river flow data from 68 gauges, spatial time series of rainfall (P; interpolated gauge data), evapotranspiration (ET; AWRA-L land surface model) and soil moisture (SM; active-passive microwave satellite remote sensing) as predictor variables. We used a fully directed and connected river network (Australian Geofabric) in combination with ancillary data to develop a spatial modeling framework consisting of 18,521 individual modeling units. We then fitted individual models for all modeling units, which were made up of 10x10 km grid cells split into floodplain, floodplain-lake and non-floodplain areas, depending on the type of water body and its hydrologic connectivity to a gauged river. We applied the framework to quantify flood propagation times for all major river and floodplain systems across the MDB, which were in good agreement with observed travel times. After incorporating these flow lag times into the models, average goodness of fit was high across floodplain and floodplain-lake modeling units (r-squared > 0.65), which were primarily driven by river flow, and lower for non-floodplain areas (r-squared > 0.24), which were primarily driven by local rainfall. Our results indicate that local climate conditions (i.e. P, ET, SM) had more influence on SW dynamics in the northern compared to the southern MDB and were most influential in the least regulated and most extensive floodplains in the north-west. We also applied the statistical models of two floodplain areas with contrasting flooding regimes to predict SW extents for cloud-affected time steps in the Landsat time series during the large 2010 floods, with high validated accuracy (r-squared > 0.97). Our findings illustrate that integrating multi-decadal time series of Earth observation data and in situ measurements with statistical modeling techniques can provide cost-effective tools for improving the management of limited SW resources and floods. The data-driven method is applicable to other large river basins and provides statistical models that can predict SW extent for cloud-affected Landsat observations or during the peak of floods, and hence allows a more detailed quantification of the dynamics of large floods than existing approaches. Future research will investigate the potential of image fusion techniques (e.g. ESTARFM) for improving the quantification of rapid changes in SW distribution by combining MODIS and Landsat imagery.
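
    The statistical-modeling step can be sketched for a single modeling unit as below, using the pygam package; the synthetic data, fixed five-step flow lag, and smoother settings are illustrative assumptions, not values from the study.

        # Sketch: GAM relating surface-water extent to lagged flow and climate.
        import numpy as np
        from pygam import LinearGAM, s

        rng = np.random.default_rng(0)
        n = 300
        flow = rng.gamma(2.0, 50.0, n)       # gauged river flow
        flow_lagged = np.roll(flow, 5)       # 5-step flood propagation lag
                                             # (np.roll wraps; fine for a sketch)
        rain = rng.gamma(1.5, 10.0, n)       # interpolated rainfall
        et = rng.normal(4.0, 1.0, n)         # evapotranspiration
        sw = 0.02 * flow_lagged + 0.5 * rain - 2.0 * et + rng.normal(0, 3, n)

        X = np.column_stack([flow_lagged, rain, et])
        gam = LinearGAM(s(0) + s(1) + s(2)).fit(X, sw)
        gam.summary()                        # per-term smooths and fit statistics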

  4. A framework for quantifying and optimizing the value of seismic monitoring of infrastructure

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr

    2017-04-01

    This paper outlines a framework for quantifying and optimizing the value of information from structural health monitoring (SHM) technology deployed on large infrastructure that may sustain damage in a series of earthquakes (the main shock and its aftershocks). The evolution of the damage state of the infrastructure, without or with SHM, is represented as a time-dependent, stochastic, discrete-state, observable and controllable nonlinear dynamical system. Pre-posterior Bayesian analysis and a decision tree are used for quantifying and optimizing the value of SHM information. An optimization problem is then formulated to decide on the adoption of SHM and to optimally manage the usage and operations of the possibly damaged infrastructure and its repair schedule using the information from SHM. The objective function to minimize is the expected total cost or risk.
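
    The pre-posterior calculation at the core of such a value-of-information analysis can be sketched in a few lines; all probabilities and costs below are illustrative assumptions, and the two-action, two-state setting is a deliberate simplification of the full decision tree.

        # Sketch: expected cost without vs. with SHM for one inspection decision.
        C_REPAIR = 50.0        # cost of precautionary repair/closure
        C_FAIL = 600.0         # expected loss if damaged infrastructure stays open

        def expected_cost(p_dam):
            """Cost of the better action given damage probability p_dam."""
            return min(C_REPAIR, p_dam * C_FAIL)

        p_prior = 0.1                        # damage probability after the shock
        c_without = expected_cost(p_prior)   # decide on the prior alone

        # With SHM: P(alarm | damage) = 0.95, P(alarm | intact) = 0.05.
        p_alarm = 0.95 * p_prior + 0.05 * (1 - p_prior)
        post_alarm = 0.95 * p_prior / p_alarm            # Bayes updates
        post_quiet = 0.05 * p_prior / (1 - p_alarm)
        c_with = (p_alarm * expected_cost(post_alarm)
                  + (1 - p_alarm) * expected_cost(post_quiet))

        print(f"expected value of SHM information: {c_without - c_with:.1f}")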

  5. The UK Earth System Models Marine Biogeochemical Evaluation Toolkit, BGC-val

    NASA Astrophysics Data System (ADS)

    de Mora, Lee

    2017-04-01

    The Biogeochemical Validation toolkit, BGC-val, is a model- and grid-independent, Python-based marine model evaluation framework that automates much of the validation of the marine component of an Earth System Model. BGC-val was initially developed as a flexible and extensible system to evaluate the spin-up of the marine UK Earth System Model (UKESM). However, its grid independence and flexibility make it straightforward to adapt the BGC-val framework to evaluate other marine models. In addition to the marine component of the UKESM, this toolkit has been adapted to compare multiple models, including models from the CMIP5 and iMarNet inter-comparison projects. The BGC-val toolkit produces multiple levels of analysis, which are presented in an easy-to-use interactive HTML5 document. Level 1 contains time series analyses, showing the development over time of many important biogeochemical and physical ocean metrics, such as global primary production or the Drake Passage current. The second level of BGC-val is an in-depth spatial analysis of a single point in time: a series of point-to-point comparisons of model and data in various regions, such as a comparison of surface nitrate in the model against data from the World Ocean Atlas. The third-level analyses are specialised ad-hoc packages that go in depth on a specific question, such as the development of oxygen minimum zones in the Equatorial Pacific. In addition to the three levels, the HTML5 document opens with a Level 0 table showing a summary of the status of the model run. The beta version of this toolkit is available via the Plymouth Marine Laboratory GitLab server and uses the BSD 3-clause license.

  6. Stochasticity, succession, and environmental perturbations in a fluidic ecosystem.

    PubMed

    Zhou, Jizhong; Deng, Ye; Zhang, Ping; Xue, Kai; Liang, Yuting; Van Nostrand, Joy D; Yang, Yunfeng; He, Zhili; Wu, Liyou; Stahl, David A; Hazen, Terry C; Tiedje, James M; Arkin, Adam P

    2014-03-04

    Unraveling the drivers of community structure and succession in response to environmental change is a central goal in ecology. Although the mechanisms shaping community structure have been intensively examined, those controlling ecological succession remain elusive. To understand the relative importance of stochastic and deterministic processes in mediating microbial community succession, a unique framework composed of four different cases was developed for fluidic and nonfluidic ecosystems. The framework was then tested for one fluidic ecosystem: a groundwater system perturbed by adding emulsified vegetable oil (EVO) for uranium immobilization. Our results revealed that the groundwater microbial community diverged substantially from the initial community after EVO amendment and eventually converged to a new community state that clustered closely with the initial state, although the two differed significantly in composition and structure. Null model analysis indicated that both deterministic and stochastic processes played important roles in controlling the assembly and succession of the groundwater microbial community, but their relative importance was time dependent. Additionally, consistent with the proposed conceptual framework but contrary to conventional wisdom, the community succession responding to EVO amendment was primarily controlled by stochastic rather than deterministic processes. During the middle phase of the succession, the role of stochastic processes in controlling community composition increased substantially, ranging from 81.3% to 92.0%. Finally, only limited successional studies are available to support the different cases in the conceptual framework, and further well-replicated, explicit time-series experiments are needed to understand the relative importance of deterministic and stochastic processes in controlling community succession.

  7. Relativistic Fluid Dynamics Far From Local Equilibrium

    NASA Astrophysics Data System (ADS)

    Romatschke, Paul

    2018-01-01

    Fluid dynamics is traditionally thought to apply only to systems near local equilibrium. In this case, the effective theory of fluid dynamics can be constructed as a gradient series. Recent applications of resurgence suggest that this gradient series diverges, but can be Borel resummed, giving rise to a hydrodynamic attractor solution which is well defined even for large gradients. Arbitrary initial data quickly approaches this attractor via nonhydrodynamic mode decay. This suggests the existence of a new theory of far-from-equilibrium fluid dynamics. In this Letter, the framework of fluid dynamics far from local equilibrium for a conformal system is introduced, and the hydrodynamic attractor solutions for resummed Baier-Romatschke-Son-Starinets-Stephanov theory, kinetic theory in the relaxation time approximation, and strongly coupled N =4 super Yang-Mills theory are identified for a system undergoing Bjorken flow.

  8. Experimental Evidence of Negative Linear Compressibility in the MIL-53 Metal-organic Framework Family

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Serra-Crespo, Pablo; Dikhtiarenko, Alla; Stavitski, Eli

    Here we report a series of powder X-ray diffraction experiments performed on the soft porous crystals MIL-53(Al) and NH2-MIL-53(Al) in a diamond anvil cell under different pressurization media. Systematic refinements of the obtained powder patterns demonstrate that these materials expand along a specific direction while undergoing total volume reduction under an increasing hydrostatic pressure. Our results confirm for the first time the negative linear compressibility behaviour of this family of materials, recently predicted from quantum chemical calculations.

  9. Experimental evidence of negative linear compressibility in the MIL-53 metal–organic framework family

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Serra-Crespo, Pablo; Dikhtiarenko, Alla; Stavitski, Eli

    Here we report a series of powder X-ray diffraction experiments performed on the soft porous crystals MIL-53(Al) and NH2-MIL-53(Al) in a diamond anvil cell under different pressurization media. Systematic refinements of the obtained powder patterns demonstrate that these materials expand along a specific direction while undergoing total volume reduction under an increasing hydrostatic pressure. The results confirm for the first time the negative linear compressibility behaviour of this family of materials, recently predicted from quantum chemical calculations.

  10. Experimental Evidence of Negative Linear Compressibility in the MIL-53 Metal-Organic Framework Family

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Serra-Crespo, Pablo; Dikhtiarenko, Alla; Stavitski, Eli

    Here, we report a series of powder X-ray diffraction experiments performed on the soft porous crystals MIL-53(Al) and NH2-MIL-53(Al) in a diamond anvil cell under different pressurization media. Systematic refinements of the obtained powder patterns demonstrate that these materials expand along a specific direction while undergoing total volume reduction under an increasing hydrostatic pressure. Our results confirm for the first time the negative linear compressibility behaviour of this family of materials, recently predicted from quantum chemical calculations.

  11. Experimental evidence of negative linear compressibility in the MIL-53 metal–organic framework family

    DOE PAGES

    Serra-Crespo, Pablo; Dikhtiarenko, Alla; Stavitski, Eli; ...

    2014-03-24

    Here we report a series of powder X-ray diffraction experiments performed on the soft porous crystals MIL-53(Al) and NH2-MIL-53(Al) in a diamond anvil cell under different pressurization media. Systematic refinements of the obtained powder patterns demonstrate that these materials expand along a specific direction while undergoing total volume reduction under an increasing hydrostatic pressure. The results confirm for the first time the negative linear compressibility behaviour of this family of materials, recently predicted from quantum chemical calculations.

  12. Lunar phases and crisis center telephone calls.

    PubMed

    Wilson, J E; Tobacyk, J J

    1990-02-01

    The lunar hypothesis, that is, the notion that lunar phases can directly affect human behavior, was tested by time-series analysis of 4,575 crisis center telephone calls (all calls recorded for a 6-month interval). As expected, the lunar hypothesis was not supported. The 28-day lunar cycle accounted for less than 1% of the variance of the frequency of crisis center calls. Also, as hypothesized from an attribution theory framework, crisis center workers reported significantly greater belief in lunar effects than a non-crisis-center-worker comparison group.

  13. [Update on regulatory framework and certification process].

    PubMed

    Legrand, D

    2008-11-01

    Patients receiving blood transfusion must benefit from the same quality of care, no matter where they are treated. In an effort to enhance transfusion safety, regulations applicable to blood depot facilities have been considerably reinforced over the last few years. At the present time, facilities authorized to store and distribute labile blood products are subject to the same principles of good practice as transfusion centers. A series of regulations published in 2007 established the duties and responsibilities of depot facilities according to category (emergency, intermediary or distribution) and set out the conditions for certification.

  14. Statistical analysis of hydrological response in urbanising catchments based on adaptive sampling using inter-amount times

    NASA Astrophysics Data System (ADS)

    ten Veldhuis, Marie-Claire; Schleiss, Marc

    2017-04-01

    Urban catchments are typically characterised by a more flashy nature of the hydrological response compared to natural catchments. Predicting flow changes associated with urbanisation is not straightforward, as they are influenced by interactions between impervious cover, basin size, drainage connectivity and stormwater management infrastructure. In this study, we present an alternative approach to statistical analysis of hydrological response variability and basin flashiness, based on the distribution of inter-amount times. We analyse inter-amount time distributions of high-resolution streamflow time series for 17 (semi-)urbanised basins in North Carolina, USA, ranging from 13 to 238 km2 in size. We show that in the inter-amount-time framework, sampling frequency is tuned to the local variability of the flow pattern, resulting in a different representation and weighting of high and low flow periods in the statistical distribution. This leads to important differences in the way the distribution quantiles, mean, coefficient of variation and skewness vary across scales and results in lower mean intermittency and improved scaling. Moreover, we show that inter-amount-time distributions can be used to detect regulation effects on flow patterns, identify critical sampling scales and characterise flashiness of hydrological response. The possibility to use both the classical approach and the inter-amount-time framework to identify minimum observable scales and analyse flow data opens up interesting areas for future research.
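
    The inter-amount-time transform itself can be sketched as follows; the synthetic flow series and the fixed volume increment are illustrative assumptions.

        # Sketch: sample the times needed to accumulate fixed volume increments,
        # instead of sampling flow at fixed time steps.
        import numpy as np

        rng = np.random.default_rng(1)
        dt = 0.25                                  # hours between flow samples
        flow = rng.gamma(0.8, 2.0, 2000)           # flashy streamflow (m^3/s)

        cum = np.concatenate([[0.0], np.cumsum(flow) * dt * 3600])  # volume (m^3)
        times = np.arange(len(cum)) * dt

        amount = 5e4                               # fixed volume increment (m^3)
        targets = np.arange(amount, cum[-1], amount)
        # Time at which each successive increment is reached (interpolated).
        crossing_times = np.interp(targets, cum, times)
        inter_amount_times = np.diff(crossing_times)   # the IAT sample

        # Low-flow periods yield long IATs and high-flow periods short ones, so
        # the distribution weights flow regimes differently than fixed-time data.
        print(np.percentile(inter_amount_times, [10, 50, 90]))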

  15. The effects of the Chesapeake Bay impact crater on the geologic framework and the correlation of hydrogeologic units of southeastern Virginia, south of the James River

    USGS Publications Warehouse

    Powars, David S.

    2000-01-01

    About 35 million years ago, a large comet or meteor slammed into the shallow shelf on the western margin of the Atlantic Ocean, creating the Chesapeake Bay impact crater. This report, the second in a series, refines the geologic framework of southeastern Virginia, south of the James River in and near the impact crater, and presents evidence for the existence of a pre-impact James River structural zone. The report includes detailed correlations of core lithologies with borehole geophysical logs; the correlations provide the foundation for the compilation of stratigraphic cross sections. These cross sections are tied into the geologic framework of the lower York-James Peninsula as presented in the first report in the series, Professional Paper 1612.

  16. Liver DCE-MRI Registration in Manifold Space Based on Robust Principal Component Analysis.

    PubMed

    Feng, Qianjin; Zhou, Yujia; Li, Xueli; Mei, Yingjie; Lu, Zhentai; Zhang, Yu; Feng, Yanqiu; Liu, Yaqin; Yang, Wei; Chen, Wufan

    2016-09-29

    A technical challenge in the registration of dynamic contrast-enhanced magnetic resonance (DCE-MR) imaging in the liver is intensity variation caused by contrast agents. Such variations lead to the failure of traditional intensity-based registration methods. To address this problem, a manifold-based registration framework for liver DCE-MR time series is proposed. We assume that liver DCE-MR time series are located on a low-dimensional manifold and determine intrinsic similarities between frames. Based on the obtained manifold, the large deformation between two dissimilar images can be decomposed into a series of small deformations between adjacent images on the manifold, by gradually deforming each frame to the template image along the geodesic path. Furthermore, manifold construction is important in automating the selection of the template image, which is an approximation of the geodesic mean. Robust principal component analysis is performed to separate motion components from intensity changes induced by contrast agents; the components caused by motion are then used to guide registration, eliminating the effect of contrast enhancement. Visual inspection and quantitative assessment are further performed on clinical dataset registrations. Experiments show that the proposed method effectively reduces movement while preserving the topology of contrast-enhancing structures and provides improved registration performance.
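
    The robust PCA step can be sketched with the standard principal component pursuit algorithm; the parameter defaults below are common choices in the RPCA literature and are assumptions rather than the paper's exact settings.

        # Sketch: RPCA via inexact ALM, splitting a frames-by-pixels matrix D
        # into low-rank L (motion/anatomy) and sparse S (contrast enhancement).
        import numpy as np

        def rpca(D, n_iter=200, tol=1e-7):
            m, n = D.shape
            lam = 1.0 / np.sqrt(max(m, n))
            mu = 0.25 * m * n / np.abs(D).sum()
            S = np.zeros_like(D)
            Y = np.zeros_like(D)
            for _ in range(n_iter):
                # Singular-value thresholding for the low-rank component.
                U, sig, Vt = np.linalg.svd(D - S + Y / mu, full_matrices=False)
                L = (U * np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
                # Soft thresholding for the sparse component.
                R = D - L + Y / mu
                S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
                Y += mu * (D - L - S)
                if np.linalg.norm(D - L - S) <= tol * np.linalg.norm(D):
                    break
            return L, S

        # Each row is one frame, flattened; L then guides the registration.
        D = np.random.default_rng(2).normal(size=(30, 500))
        L, S = rpca(D)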

  17. Developing a Complex Independent Component Analysis (CICA) Technique to Extract Non-stationary Patterns from Geophysical Time Series

    NASA Astrophysics Data System (ADS)

    Forootan, Ehsan; Kusche, Jürgen; Talpe, Matthieu; Shum, C. K.; Schmidt, Michael

    2017-12-01

    In recent decades, decomposition techniques have enabled increasingly more applications for dimension reduction, as well as extraction of additional information from geophysical time series. Traditionally, the principal component analysis (PCA)/empirical orthogonal function (EOF) method and, more recently, the independent component analysis (ICA) have been applied to extract statistically orthogonal (uncorrelated) and independent modes, respectively, that represent the maximum variance of time series. PCA and ICA can be classified as stationary signal decomposition techniques since they are based on decomposing the autocovariance matrix and diagonalizing higher (than two) order statistical tensors from centered time series, respectively. However, the stationarity assumption in these techniques is not justified for many geophysical and climate variables, even after removing cyclic components, e.g., the commonly removed dominant seasonal cycles. In this paper, we present a novel decomposition method, the complex independent component analysis (CICA), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA, where (a) we first define a new complex dataset that contains the observed time series in its real part, and their Hilbert transformed series as its imaginary part, (b) an ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex dataset in (a), and finally, (c) the dominant independent complex modes are extracted and used to represent the dominant space and time amplitudes and associated phase propagation patterns. The performance of CICA is examined by analyzing synthetic data constructed from multiple physically meaningful modes in a simulation framework with known truth. Next, global terrestrial water storage (TWS) data from the Gravity Recovery And Climate Experiment (GRACE) gravimetry mission (2003-2016), and satellite radiometric sea surface temperature (SST) data (1982-2016) over the Atlantic and Pacific Oceans, are used with the aim of demonstrating signal separations of the North Atlantic Oscillation (NAO) from the Atlantic Multi-decadal Oscillation (AMO), and the El Niño Southern Oscillation (ENSO) from the Pacific Decadal Oscillation (PDO). CICA results indicate that ENSO-related patterns can be extracted from GRACE TWS with an accuracy of 0.5-1 cm in terms of equivalent water height (EWH). The magnitude of errors in extracting NAO or AMO from SST data using the complex EOF (CEOF) approach reaches up to 50% of the signal itself, while it is reduced to 16% when applying CICA. Larger errors with magnitudes of 100% and 30% of the signal itself are found while separating ENSO from PDO using CEOF and CICA, respectively. We thus conclude that CICA is more effective than CEOF in separating non-stationary patterns.
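
    Step (a) and a complex decomposition can be sketched as below; for brevity the sketch substitutes a complex SVD (i.e. a CEOF decomposition) for step (b), since the fourth-order-cumulant complex ICA itself requires a dedicated implementation not shown here.

        # Sketch: build the analytic (complex) dataset via the Hilbert transform,
        # then extract dominant complex modes carrying amplitude and phase.
        import numpy as np
        from scipy.signal import hilbert

        rng = np.random.default_rng(3)
        t = np.arange(500)
        # Synthetic grid of series sharing a propagating oscillation plus noise.
        X = np.stack([np.sin(2 * np.pi * t / 60 + 0.1 * k)
                      + 0.3 * rng.normal(size=t.size) for k in range(40)])

        Xc = hilbert(X - X.mean(axis=1, keepdims=True), axis=1)  # analytic signal

        # Complex EOFs via SVD: columns of U are spatial modes; rows of Vh carry
        # temporal amplitude and phase, so phase propagation is recoverable.
        U, svals, Vh = np.linalg.svd(Xc, full_matrices=False)
        mode1_amp = np.abs(Vh[0])                     # time-varying amplitude
        mode1_phase = np.unwrap(np.angle(Vh[0]))      # phase propagation in time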

  18. Evaluation of CryoSat-2 Measurements for the Monitoring of Large River Water Levels

    NASA Astrophysics Data System (ADS)

    Bercher, Nicolas; Calmant, Stephane; Picot, Nicolas; Seyler, Frederique; Fleury, Sara

    2013-09-01

    In this study, and perhaps for the first time, we explore the ability of the CryoSat-2 satellite to monitor the water level of large rivers. We focus on a 500 km section of the Madeira river (Amazon basin), around the town of Manicore, cf. Fig. 1. Due to the drifting orbit of the mission, the usual concept of "virtual station" vanishes and data are instead extracted within polygons that delineate the riverbeds. This results in spatio-temporal time series of the river water level, expressed as a function of both space (distance to the ocean) and time. We use CryoSat-2 low resolution mode (LRM) data processed with an Ice2 retracker, i.e., the content of the upcoming IOP/GOP ocean product from ESA [1]. For this study, we use demonstration samples (year 2011 on our validation area), processed by the so-called CryoSat Processing Prototype developed by CNES in the framework of the Sentinel-3 Project from ESA [5] [4]. At the time of this study, the product came with no corrections ("solid earth tide", atmosphere, etc.). Validation is performed on (1) river water level pseudo time series and (2) a river pseudo profile. An overview of the spatio-temporal time series is also given in 2D and 3D plots. Despite the lack of geophysical corrections, the results are very promising (Std 0.51 m) and challenge those obtained by Envisat (Std 0.43 m) and Jason-2 (Std 0.47 m) on nearby virtual stations. We also demonstrate the potential of CryoSat-2 and the appropriateness of its drifting orbit to map river topography and derive water levels "at any time and anywhere", a major topic of interest regarding hydrological propagation models and the preparation of the SWOT mission.

  19. Time series analysis of reference crop evapotranspiration using soft computing techniques for Ganjam District, Odisha, India

    NASA Astrophysics Data System (ADS)

    Patra, S. R.

    2017-12-01

    Evapotranspiration (ET0) influences water resources and is considered a vital process in arid hydrologic systems. It is also one of the most important measures in identifying drought conditions. Therefore, time series forecasting of evapotranspiration is very important for helping decision makers and water system managers build proper systems to sustain and manage water resources. Time series analysis assumes that history repeats itself; hence, by analysing past values, better forecasts can be made for the future. Ten years of ET0 data were used in this study to ensure a satisfactory forecast of monthly values. Three models are presented: an autoregressive integrated moving average (ARIMA) mathematical model, an artificial neural network (ANN) model, and a support vector machine (SVM) model. These three models are used to forecast monthly reference crop evapotranspiration based on ten years of past historical records (1991-2001) of measured evaporation at the Ganjam region, Odisha, India, without considering climate data. The developed models allow water resource managers to predict up to 12 months ahead, making these predictions very useful for optimizing the resources needed for effective water resources management. In this study, multistep-ahead prediction is performed, which is more complex and troublesome than one-step-ahead prediction. Our investigation suggests that nonlinear relationships may exist among the monthly indices, so the ARIMA model might not be able to fully extract the relationships hidden in the historical data. Support vector machines are potentially useful time series forecasting strategies on account of their strong nonlinear mapping capability and resistance to complexity in forecasting data. SVMs have great learning capability in time series modelling compared to ANNs: for instance, SVMs implement the structural risk minimization principle, which allows better generalization than neural networks, which use the empirical risk minimization principle. The reliability of these computational models was analysed in light of simulation results, and it was found that the SVM model produces the best results among the three. Future research should extend the validation data set and check the validity of these results in different areas and with hybrid intelligence techniques.
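
    The model comparison can be sketched on a synthetic monthly series as below, with an ARIMA-family model from statsmodels against a support vector regressor making recursive multistep-ahead forecasts; the data, model orders, and hyperparameters are illustrative assumptions.

        # Sketch: 12-step-ahead forecasts of a monthly ET0-like series.
        import numpy as np
        from sklearn.svm import SVR
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        rng = np.random.default_rng(4)
        months = np.arange(120)                      # ten years, monthly
        et0 = 5 + 2 * np.sin(2 * np.pi * months / 12) + 0.3 * rng.normal(size=120)
        train, test = et0[:108], et0[108:]

        # ARIMA with a seasonal component, forecasting 12 months ahead.
        arima = SARIMAX(train, order=(1, 0, 1),
                        seasonal_order=(1, 0, 0, 12)).fit(disp=False)
        arima_fc = arima.forecast(steps=12)

        # SVR on lagged embeddings; multistep forecasts are made recursively.
        lags = 12
        X = np.stack([train[i:i + lags] for i in range(len(train) - lags)])
        y = train[lags:]
        svr = SVR(kernel="rbf", C=10.0).fit(X, y)
        window, svr_fc = list(train[-lags:]), []
        for _ in range(12):
            pred = svr.predict(np.array(window[-lags:])[None, :])[0]
            svr_fc.append(pred)
            window.append(pred)                      # recursive multistep-ahead

        for name, fc in [("ARIMA", np.asarray(arima_fc)), ("SVR", np.asarray(svr_fc))]:
            print(name, "RMSE:", np.sqrt(np.mean((fc - test) ** 2)))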

  20. New Perspectives for Old Clusters: Anderson-Evans Anions as Building Blocks of Large Polyoxometalate Frameworks in a Series of Heterometallic 3d-4f Species.

    PubMed

    Artetxe, Beñat; Reinoso, Santiago; San Felices, Leire; Lezama, Luis; Gutiérrez-Zorrilla, Juan M; Vicent, Cristian; Haso, Fadi; Liu, Tianbo

    2016-03-18

    A series of nine [Sb7W36O133Ln3M2(OAc)(H2O)8](17-) heterometallic anions (Ln3M2; Ln=La-Gd, M=Co; Ln=Ce, M=Ni and Zn) have been obtained by reacting 3d-metal-disubstituted Krebs-type tungstoantimonates(III) with early lanthanides. Their unique tetrameric structure contains a novel {MW9O33} capping unit formed by a planar {MW6O24} fragment to which three {WO2} groups are condensed to form a tungstate skeleton identical to that of a hypothetical trilacunary derivative of the ɛ-Keggin cluster. It is shown, for the first time, that classical Anderson-Evans {MW6O24} anions can act as building blocks to construct purely inorganic large frameworks. Unprecedented reactivity in the outer ring of these disk-shaped species is also revealed. The Ln3M2 anions possess chirality owing to a {Sb4O4} cluster being encapsulated in left- or right-handed orientations. Their ability to self-associate in blackberry-type vesicles in solution has been assessed for the Ce3Co2 derivative. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Exploring heterogeneous market hypothesis using realized volatility

    NASA Astrophysics Data System (ADS)

    Chin, Wen Cheong; Isa, Zaidi; Mohd Nor, Abu Hassan Shaari

    2013-04-01

    This study investigates the heterogeneous market hypothesis using high-frequency data. The cascaded heterogeneous trading activities with different time durations are modelled by the heterogeneous autoregressive framework. The empirical study indicated the presence of long memory behaviour and predictability elements in the financial time series, which supports the heterogeneous market hypothesis. Besides the common sum-of-squares intraday realized volatility, we also advocate two power-variation realized volatilities for forecast evaluation and risk measurement, in order to overcome the possible abrupt jumps during the credit crisis. Finally, the empirical results are used to determine market risk using the value-at-risk approach. The findings of this study have implications for informational market efficiency analysis, portfolio strategies and risk management.
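
    The heterogeneous autoregressive (HAR-RV) regression underlying this framework can be sketched as follows; the synthetic realized-volatility series is an illustrative assumption.

        # Sketch: regress next-day realized volatility on daily, weekly, and
        # monthly RV averages, mirroring traders with different horizons.
        import numpy as np

        rng = np.random.default_rng(5)
        n = 1000
        log_rv = np.zeros(n)
        for t in range(1, n):               # persistent (long-memory-like) RV
            log_rv[t] = 0.97 * log_rv[t - 1] + 0.2 * rng.normal()
        rv = np.exp(log_rv)

        def trailing_mean(x, w):
            return np.convolve(x, np.ones(w) / w, mode="valid")

        # Align regressors: RV_t, weekly mean over t-4..t, monthly over t-21..t.
        daily = rv[21:-1]
        weekly = trailing_mean(rv, 5)[17:-1]
        monthly = trailing_mean(rv, 22)[:-1]
        target = rv[22:]

        X = np.column_stack([np.ones_like(daily), daily, weekly, monthly])
        beta, *_ = np.linalg.lstsq(X, target, rcond=None)
        print("HAR-RV coefficients (const, daily, weekly, monthly):", beta)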

  2. Crossover from antipersistent to persistent behavior in time series possessing the generalized dynamic scaling law

    NASA Astrophysics Data System (ADS)

    Balankin, Alexander S.; Morales Matamoros, Oswaldo; Gálvez M., Ernesto; Pérez A., Alfonso

    2004-03-01

    The behavior of crude oil price volatility is analyzed within a conceptual framework of kinetic roughening of growing interfaces. We find that the persistent long-horizon volatilities satisfy the Family-Vicsek dynamic scaling ansatz, whereas the mean-reverting short-horizon volatilities obey a generalized scaling law with continuously varying scaling exponents. Furthermore, we find that the crossover from antipersistent to persistent behavior is accompanied by a change in the type of volatility distribution. These phenomena are attributed to the complex avalanche dynamics of crude oil markets, so similar behavior may be observed in a wide variety of physical systems governed by avalanche dynamics.

  3. THE ADVERSE OUTCOME PATHWAY (AOP) FRAMEWORK: A FRAMEWORK FOR ORGANIZING BIOLOGICAL KNOWLEDGE LEADING TO HEALTH RISKS.

    EPA Science Inventory

    An Adverse Outcome Pathway (AOP) represents the organization of current and newly acquired knowledge of biological pathways. These pathways contain a series of nodes (Key Events, KEs) that when sufficiently altered influence the next node on the pathway, beginning from an Molecul...

  4. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, that creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. TSPT uses MODIS metadata to remove and/or correct bad and suspect data. Bad-pixel removal, multiple-satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System could provide the foundation for a large-area crop-surveillance system that could identify a variety of plant phenomena and improve monitoring capabilities.
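
    The kind of temporal processing described can be sketched as below: QA masking, Aqua/Terra fusion, and a maximum-value NDVI composite. The arrays and QA convention are illustrative assumptions, and the sketch is in Python rather than TSPT's MATLAB.

        # Sketch: mask bad pixels, fuse two sensors, and composite over time.
        import numpy as np

        rng = np.random.default_rng(6)
        days, h, w = 16, 50, 50

        def fake_sensor():
            ndvi = 0.6 + 0.05 * rng.normal(size=(days, h, w))
            qa_bad = rng.random((days, h, w)) < 0.4     # clouds/bad retrievals
            return np.where(qa_bad, np.nan, ndvi)

        terra, aqua = fake_sensor(), fake_sensor()
        fused = np.fmax(terra, aqua)          # per-day fusion, ignoring NaNs

        # 16-day maximum-value composite: residual clouds depress NDVI, so the
        # per-pixel maximum over the window is the cleanest estimate.
        composite = np.nanmax(fused, axis=0)
        print("fraction of valid pixels:", np.isfinite(composite).mean())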

  5. Construction of Gene Regulatory Networks Using Recurrent Neural Networks and Swarm Intelligence.

    PubMed

    Khan, Abhinandan; Mandal, Sudip; Pal, Rajat Kumar; Saha, Goutam

    2016-01-01

    We have proposed a methodology for the reverse engineering of biologically plausible gene regulatory networks from temporal genetic expression data. We have used established information and the fundamental mathematical theory for this purpose. We have employed the Recurrent Neural Network formalism to accurately extract the underlying dynamics present in the time series expression data. We have introduced a new hybrid swarm intelligence framework for the accurate training of the model parameters. The proposed methodology has first been applied to a small artificial network, and the results obtained suggest that it can produce the best results available in the contemporary literature, to the best of our knowledge. Subsequently, we have implemented our proposed framework on experimental (in vivo) datasets. Finally, we have investigated two medium-sized genetic networks (in silico) extracted from GeneNetWeaver, to understand how the proposed algorithm scales up with network size. Additionally, we have implemented our proposed algorithm with half the number of time points. The results indicate that halving the number of time points does not significantly affect the accuracy of the proposed methodology, with just over 15% deterioration in the worst case.

  6. Analytically exploiting noise correlations inside the feedback loop to improve locked-oscillator performance.

    PubMed

    Sastrawan, J; Jones, C; Akhalwaya, I; Uys, H; Biercuk, M J

    2016-08-01

    We introduce concepts from optimal estimation to the stabilization of precision frequency standards limited by noisy local oscillators. We develop a theoretical framework casting various measures of frequency standard variance in terms of frequency-domain transfer functions, capturing the effects of feedback stabilization via a time series of Ramsey measurements. Using this framework, we introduce an optimized hybrid predictive feedforward measurement protocol that employs results from multiple past measurements and transfer-function-based calculations of measurement covariance to improve the accuracy of corrections within the feedback loop. In the presence of common non-Markovian noise processes these measurements will be correlated in a calculable manner, providing a means to capture the stochastic evolution of the local oscillator frequency during the measurement cycle. We present analytic calculations and numerical simulations of oscillator performance under competing feedback schemes and demonstrate benefits in both correction accuracy and long-term oscillator stability using hybrid feedforward. Simulations verify that, in the presence of uncompensated dead time and noise with significant spectral weight near the inverse cycle time, predictive feedforward outperforms traditional feedback, providing a path towards developing a class of stabilization software routines for frequency standards limited by noisy local oscillators.

  7. Developing the Framework for an Early Warning System for Ebola based on Environmental Conditions

    NASA Astrophysics Data System (ADS)

    Dartevelle, Sebastien; Nguy-Robertson, Anthony; Bell, Jesse; Chretien, Jean-Paul

    2017-04-01

    The 2014-2016 Ebola outbreak in West Africa indicated that this lethal disease can become a national security issue as it crosses borders and taxes regional health care systems. Ebola symptoms are also similar to those of other endemic diseases. Thus, forewarning of its possible presence can alert local public health facilities and populations, and may thereby reduce response time. Early work by our group has identified local climate (e.g. temperature, precipitation) and vegetation health (e.g. remote sensing using the normalized difference vegetation index, NDVI) variables as leading indicators of known historical Ebola outbreaks. The environmental stress placed on the system as it reaches a climatic tipping point provides optimal conditions for spillover of Ebola virus from the reservoir host (which is unknown but suspected to be bats) to humans. This work outlines a framework for an approach to provide early warning maps based on the present state of the environment. Time series data from the Climate Forecast System ver. 2 and the AVHRR and MODIS satellite sensors are the basis for the early warning models used. These maps can provide policy makers and local health care professionals timely information for disease surveillance and preparation for future Ebola outbreaks.

  8. Constructivists Online: Reimagining Progressive Practice. Occasional Paper Series 34

    ERIC Educational Resources Information Center

    Freidus, Helen, Ed.; Welsh Kruger, Mollie, Ed.; Goss, Steven, Ed.

    2015-01-01

    In this issue of the Occasional Paper Series, the authors reimagine progressive pedagogy within the framework of digital pedagogy and online practice. The issue begins with "Notes from the Special Issue Editors," Helen Freidus, Mollie Welsh Kruger & Steven Goss. In the first set of essays, "Inside the Online Classroom," the…

  9. A robust nonparametric framework for reconstruction of stochastic differential equation models

    NASA Astrophysics Data System (ADS)

    Rajabzadeh, Yalda; Rezaie, Amir Hossein; Amindavar, Hamidreza

    2016-05-01

    In this paper, we employ a nonparametric framework to robustly estimate the functional forms of drift and diffusion terms from discrete stationary time series. The proposed method significantly improves the accuracy of the parameter estimation. In this framework, drift and diffusion coefficients are modeled through orthogonal Legendre polynomials. We employ the least squares regression approach along with the Euler-Maruyama approximation method to learn the coefficients of the stochastic model. Next, a numerical discrete construction of the mean squared prediction error (MSPE) is established to determine the order of the Legendre polynomials in the drift and diffusion terms. We show numerically that the new method is robust against variation in sample size and sampling rate. The performance of our method in comparison with the kernel-based regression (KBR) method is demonstrated through simulation and real data. In the case of real data, we test our method on discriminating healthy electroencephalogram (EEG) signals from epileptic ones. We also demonstrate the efficiency of the method through prediction on financial data. In both simulation and real data, our algorithm outperforms the KBR method.
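
    The estimation procedure can be sketched as follows: simulate an Ornstein-Uhlenbeck series, then fit Legendre expansions of the drift and squared diffusion to Euler-Maruyama increments by least squares; the test process, step size, and polynomial orders are illustrative assumptions.

        # Sketch: nonparametric drift/diffusion estimation with Legendre bases.
        import numpy as np
        from numpy.polynomial import legendre

        rng = np.random.default_rng(7)
        dt, n = 0.01, 50_000
        x = np.zeros(n)
        for t in range(n - 1):              # OU process: a(x) = -x, b = 0.5
            x[t + 1] = x[t] - x[t] * dt + 0.5 * np.sqrt(dt) * rng.normal()

        u = 2 * (x - x.min()) / (x.max() - x.min()) - 1   # map states to [-1, 1]
        dx = np.diff(x)

        def fit(targets, order):
            """Least-squares Legendre coefficients for E[target | state]."""
            V = legendre.legvander(u[:-1], order)   # design matrix P_0..P_order
            coef, *_ = np.linalg.lstsq(V, targets, rcond=None)
            return coef

        drift_coef = fit(dx / dt, order=3)          # a(x) expansion
        diff_coef = fit(dx**2 / dt, order=3)        # b^2(x) expansion

        grid = np.linspace(-1, 1, 5)                # normalized coordinates
        print("estimated drift on grid:", legendre.legval(grid, drift_coef))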

  10. Semiparametric modeling: Correcting low-dimensional model error in parametric models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, Tyrus, E-mail: thb11@psu.edu; Harlim, John, E-mail: jharlim@psu.edu; Department of Meteorology, the Pennsylvania State University, 503 Walker Building, University Park, PA 16802-5013

    2016-03-01

    In this paper, a semiparametric modeling approach is introduced as a paradigm for addressing model error arising from unresolved physical phenomena. Our approach compensates for model error by learning an auxiliary dynamical model for the unknown parameters. Practically, the proposed approach consists of the following steps. Given a physics-based model and a noisy data set of historical observations, a Bayesian filtering algorithm is used to extract a time series of the parameter values. Subsequently, the diffusion forecast algorithm is applied to the retrieved time series in order to construct the auxiliary model for the time-evolving parameters. The semiparametric forecasting algorithm consists of integrating the existing physics-based model with an ensemble of parameters sampled from the probability density function of the diffusion forecast. To specify initial conditions for the diffusion forecast, a Bayesian semiparametric filtering method that extends the Kalman-based filtering framework is introduced. In difficult test examples, which introduce chaotically and stochastically evolving hidden parameters into the Lorenz-96 model, we show that our approach can effectively compensate for model error, with forecasting skill comparable to that of the perfect model.

  11. Monitoring Forest Regrowth Using a Multi-Platform Time Series

    NASA Technical Reports Server (NTRS)

    Sabol, Donald E., Jr.; Smith, Milton O.; Adams, John B.; Gillespie, Alan R.; Tucker, Compton J.

    1996-01-01

    Over the past 50 years, the forests of western Washington and Oregon have been extensively harvested for timber. This has resulted in a heterogeneous mosaic of remaining mature forests, clear-cuts, new plantations, and second-growth stands in areas that formerly were dominated by extensive old-growth forests and younger forests resulting from fire disturbance. Traditionally, determination of seral stage and stand condition has been made using aerial photography and spot field observations, a methodology that is not only time- and resource-intensive but also falls short of providing current information on a regional scale. These limitations may be overcome, in part, through the use of multispectral images, which can cover large areas at spatial resolutions on the order of tens of meters. A time series of multiple images can potentially be used to monitor land use (e.g. cutting and replanting) and to observe natural processes such as regeneration, maturation and phenologic change. These processes are more likely to be spectrally observed in a time series composed of images taken during different seasons over a long period of time. Therefore, for many areas, it may be necessary to use a variety of images taken with different imaging systems. A common framework for interpretation is needed that reduces topographic, atmospheric and instrumental effects, as well as differences in lighting geometry between images. The present state of remote-sensing technology in general use does not realize the full potential of multispectral data in areas of high topographic relief. For example, the primary method for analyzing images of forested landscapes in the Northwest has been statistical classifiers (e.g. parallelepiped, nearest-neighbor, maximum likelihood, etc.), often applied to uncalibrated multispectral data. Although this approach has produced useful information from individual images in some areas, landcover classes defined by these techniques typically are not consistent for the same scene imaged under different illumination conditions, especially in mountainous regions. In addition, it is difficult to correct for atmospheric and instrumental differences between multiple scenes in a time series. In this paper, we present an approach for monitoring forest cutting and regrowth in a semi-mountainous portion of the southern Gifford Pinchot National Forest using a multisensor time series composed of MSS, TM, and AVIRIS images.

  12. Predictive framework for estimating exposure of birds to pharmaceuticals

    USGS Publications Warehouse

    Bean, Thomas G.; Arnold, Kathryn E.; Lane, Julie M.; Bergström, Ed; Thomas-Oates, Jane; Rattner, Barnett A.; Boxall, Alistair B.A.

    2017-01-01

    We present and evaluate a framework for estimating concentrations of pharmaceuticals over time in wildlife feeding at wastewater treatment plants (WWTPs). The framework is composed of a series of predictive steps involving the estimation of pharmaceutical concentration in wastewater, accumulation into wildlife food items, and uptake by wildlife with subsequent distribution into, and elimination from, tissues. Because many pharmacokinetic parameters for wildlife are unavailable for the majority of drugs in use, a read-across approach was employed using either rodent or human data on absorption, distribution, metabolism, and excretion. Comparison of the different steps in the framework against experimental data for the scenario where birds are feeding at a WWTP contaminated with fluoxetine showed that estimated concentrations in wastewater treatment works were lower than measured concentrations; concentrations in food could be reasonably estimated if experimental bioaccumulation data are available; and read-across from rodent data worked better than human to bird read-across. The framework provides adequate predictions of plasma concentrations and of elimination behavior in birds but yields poor predictions of distribution in tissues. The approach holds promise, but it is important that we improve our understanding of the physiological similarities and differences between wild birds and domesticated laboratory mammals used in pharmaceutical efficacy/safety trials, so that the wealth of data available can be applied more effectively in ecological risk assessments.
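
    The chained estimation steps can be sketched numerically as below; every parameter value (consumption, excretion, bioaccumulation factor, intake, and read-across kinetics) is an illustrative assumption, not a value from the study.

        # Sketch: chained exposure estimate for a bird feeding at a WWTP.
        usage_mg_per_capita_d = 0.55      # drug consumption estimate
        excreted_fraction = 0.1
        wastewater_L_per_capita_d = 200.0
        c_ww_ug_L = (usage_mg_per_capita_d * excreted_fraction
                     / wastewater_L_per_capita_d) * 1000.0   # ug/L in wastewater

        baf_L_kg = 300.0                  # invertebrate bioaccumulation factor
        c_food_ug_kg = c_ww_ug_L * baf_L_kg

        intake_kg_d, body_mass_kg = 0.02, 0.08     # starling-like bird
        dose_ug_kg_d = c_food_ug_kg * intake_kg_d / body_mass_kg

        # One-compartment steady state with read-across clearance:
        # C_ss = F * dose_rate / CL.
        f_absorbed, clearance_L_kg_d = 0.7, 20.0
        c_plasma_ug_L = f_absorbed * dose_ug_kg_d / clearance_L_kg_d
        print(f"predicted steady-state plasma conc.: {c_plasma_ug_L:.2f} ug/L")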

  13. Predictive framework for estimating exposure of birds to pharmaceuticals.

    PubMed

    Bean, Thomas G; Arnold, Kathryn E; Lane, Julie M; Bergström, Ed; Thomas-Oates, Jane; Rattner, Barnett A; Boxall, Alistair B A

    2017-09-01

    We present and evaluate a framework for estimating concentrations of pharmaceuticals over time in wildlife feeding at wastewater treatment plants (WWTPs). The framework is composed of a series of predictive steps involving the estimation of pharmaceutical concentration in wastewater, accumulation into wildlife food items, and uptake by wildlife with subsequent distribution into, and elimination from, tissues. Because many pharmacokinetic parameters for wildlife are unavailable for the majority of drugs in use, a read-across approach was employed using either rodent or human data on absorption, distribution, metabolism, and excretion. Comparison of the different steps in the framework against experimental data for the scenario where birds are feeding at a WWTP contaminated with fluoxetine showed that estimated concentrations in wastewater treatment works were lower than measured concentrations; concentrations in food could be reasonably estimated if experimental bioaccumulation data are available; and read-across from rodent data worked better than human to bird read-across. The framework provides adequate predictions of plasma concentrations and of elimination behavior in birds but yields poor predictions of distribution in tissues. The approach holds promise, but it is important that we improve our understanding of the physiological similarities and differences between wild birds and domesticated laboratory mammals used in pharmaceutical efficacy/safety trials, so that the wealth of data available can be applied more effectively in ecological risk assessments. Environ Toxicol Chem 2017;36:2335-2344. © 2017 SETAC.

  14. Markov modulated Poisson process models incorporating covariates for rainfall intensity.

    PubMed

    Thayakaran, R; Ramesh, N I

    2013-01-01

    Time series of rainfall bucket tip times at the Beaufort Park station, Bracknell, in the UK are modelled by a class of Markov modulated Poisson processes (MMPP), which may be thought of as a generalization of the Poisson process. Our main focus in this paper is to investigate the effects of including covariate information in the MMPP model framework on its statistical properties. In particular, we look at three types of time-varying covariates, namely temperature, sea level pressure, and relative humidity, that are thought to affect the rainfall arrival process. Maximum likelihood estimation is used to obtain the parameter estimates, and likelihood ratio tests are employed in model comparison. Simulated data from the fitted model are used to make statistical inferences about the accumulated rainfall in discrete time intervals. The variability of the daily Poisson arrival rates is also studied.
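
    A two-state MMPP is easy to simulate, which clarifies the model class before covariates are added; the generator and tip rates below are illustrative assumptions.

        # Sketch: simulate bucket tips from a hidden dry/wet Markov chain, each
        # state having its own Poisson tip rate.
        import numpy as np

        rng = np.random.default_rng(8)
        q = np.array([[-0.2,  0.2],       # generator of the hidden chain (per h)
                      [ 1.0, -1.0]])
        lam = np.array([0.05, 6.0])       # tip rates per hour: 0 = dry, 1 = wet

        state, t, horizon, tips = 0, 0.0, 500.0, []
        while t < horizon:
            dwell = rng.exponential(1.0 / -q[state, state])  # time to next switch
            # Candidate Poisson arrivals during this dwell (1000 comfortably
            # covers any plausible dwell at these rates).
            t_arr = t + np.cumsum(rng.exponential(1.0 / lam[state], size=1000))
            tips.extend(t_arr[t_arr < min(t + dwell, horizon)])
            t += dwell
            state = 1 - state                                # two states: flip

        print(f"{len(tips)} tips in {horizon} h; mean rate {len(tips)/horizon:.2f}/h")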

  15. An Efficient Pattern Mining Approach for Event Detection in Multivariate Temporal Data

    PubMed Central

    Batal, Iyad; Cooper, Gregory; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2015-01-01

    This work proposes a pattern mining approach to learn event detection models from complex multivariate temporal data, such as electronic health records. We present Recent Temporal Pattern mining, a novel approach for efficiently finding predictive patterns for event detection problems. This approach first converts the time series data into time-interval sequences of temporal abstractions. It then constructs more complex time-interval patterns backward in time using temporal operators. We also present the Minimal Predictive Recent Temporal Patterns framework for selecting a small set of predictive and non-spurious patterns. We apply our methods for predicting adverse medical events in real-world clinical data. The results demonstrate the benefits of our methods in learning accurate event detection models, which is a key step for developing intelligent patient monitoring and decision support systems. PMID:26752800
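
    The first conversion step can be sketched as follows: map a numeric series onto value abstractions and merge consecutive equal states into intervals; the thresholds and lab-value series are illustrative assumptions.

        # Sketch: convert a time series into a time-interval sequence of
        # temporal abstractions (Low / Normal / High).
        import numpy as np

        values = np.array([3.1, 3.0, 4.4, 5.0, 5.2, 7.9, 8.1, 4.6])
        times = np.array([0, 6, 12, 18, 24, 30, 36, 42])      # hours

        def abstract(v, low=3.5, high=7.0):
            return "L" if v < low else ("H" if v > high else "N")

        states = [abstract(v) for v in values]

        # Merge runs of the same state into intervals (state, start, end).
        intervals, start = [], times[0]
        for i in range(1, len(states)):
            if states[i] != states[i - 1]:
                intervals.append((states[i - 1], start, times[i - 1]))
                start = times[i]
        intervals.append((states[-1], start, times[-1]))
        print(intervals)  # [('L', 0, 6), ('N', 12, 24), ('H', 30, 36), ('N', 42, 42)]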

  16. A framework for studying transient dynamics of population projection matrix models.

    PubMed

    Stott, Iain; Townley, Stuart; Hodgson, David James

    2011-09-01

    Empirical models are central to effective conservation and population management, and should be predictive of real-world dynamics. Available modelling methods are diverse, but analysis usually focuses on long-term dynamics that are unable to describe the complicated short-term time series that can arise even from simple models following ecological disturbances or perturbations. Recent interest in such transient dynamics has led to diverse methodologies for their quantification in density-independent, time-invariant population projection matrix (PPM) models, but the fragmented nature of this literature has stifled the widespread analysis of transients. We review the literature on transient analyses of linear PPM models and synthesise a coherent framework. We promote the use of standardised indices, and categorise indices according to their focus on either convergence times or transient population density, and on either transient bounds or case-specific transient dynamics. We use a large database of empirical PPM models to explore relationships between indices of transient dynamics. This analysis promotes the use of population inertia as a simple, versatile and informative predictor of transient population density, but criticises the utility of established indices of convergence times. Our findings should guide further development of analyses of transient population dynamics using PPMs or other empirical modelling techniques. © 2011 Blackwell Publishing Ltd/CNRS.
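
    Computing population inertia from a PPM model takes only a few lines; the projection matrix and post-disturbance stage vector below are illustrative assumptions, and the standardisation follows the inertia definition promoted in this framework.

        # Sketch: asymptotic growth rate and population inertia for a PPM model.
        import numpy as np

        A = np.array([[0.0, 1.2, 3.1],    # stage-structured projection matrix
                      [0.5, 0.0, 0.0],
                      [0.0, 0.6, 0.8]])
        n0 = np.array([1.0, 0.0, 0.0])    # disturbance leaves only first stage

        evals, W = np.linalg.eig(A)
        i = np.argmax(evals.real)
        lam = evals[i].real
        w = np.abs(W[:, i].real); w /= w.sum()          # stable stage structure
        evalsL, V = np.linalg.eig(A.T)
        v = np.abs(V[:, np.argmax(evalsL.real)].real)   # reproductive values

        # Inertia: long-term density relative to an equivalent population that
        # grew at rate lambda from the start.
        inertia = (v @ n0) * w.sum() / ((v @ w) * n0.sum())
        print(f"lambda = {lam:.3f}, inertia = {inertia:.3f}")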

  17. A novel approach for detecting heat waves: the Standardized Heat-Wave Index.

    NASA Astrophysics Data System (ADS)

    Cucchi, Marco; Petitta, Marcello; Calmanti, Sandro

    2016-04-01

    Extreme temperatures have an impact on the energy balance of any living organism and on the operational capabilities of critical infrastructures. The ability to capture the occurrence of extreme temperature events is therefore an essential property of a multi-hazard extreme climate indicator. In this paper we introduce a new index for the detection of such extreme temperature events, called the SHI (Standardized Heat-Wave Index), developed in the context of the XCF project for the construction of a multi-hazard extreme climate indicator (ECI). The SHI is a probabilistic index based on the analysis of maximum daily temperature time series; it is standardized, enabling comparisons over space and time and with other indices, and it is capable of describing both extreme cold and extreme hot events. Given a particular location, the SHI is constructed from the time series of local maximum daily temperatures with the following procedure: three-day cumulated maximum daily temperatures are assigned to each day of the time series; for each of these values, probabilities of occurrence within the month the reference day belongs to are computed; these probability values are then projected onto a standard normal distribution, yielding the standardized indices. In this work we present results obtained using the NCEP Reanalysis dataset for air temperature at the sigma 0.995 level, whose time span ranges from 1948 to 2014. Given the specific framework of this work, the geographical focus of this study is limited to the African continent. We present a validation of the index by showing its use for monitoring heat waves under different climate regimes.
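
    The three-step construction can be sketched as below on a synthetic daily-maximum-temperature series; the data, 360-day calendar, and Weibull plotting position are illustrative assumptions.

        # Sketch: Standardized Heat-Wave Index for one location.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(9)
        n_years = 30
        months = np.tile(np.repeat(np.arange(1, 13), 30), n_years)  # 360-day years
        tmax = (25 + 8 * np.sin(2 * np.pi * np.arange(months.size) / 360)
                + 2 * rng.normal(size=months.size))

        # Step 1: 3-day cumulated maximum temperatures assigned to each day.
        cum3 = np.convolve(tmax, np.ones(3), mode="same")

        # Steps 2-3: month-wise empirical CDF, then standard-normal projection.
        shi = np.empty_like(cum3)
        for m in range(1, 13):
            sel = months == m
            ranks = stats.rankdata(cum3[sel])
            prob = ranks / (sel.sum() + 1)          # Weibull plotting position
            shi[sel] = stats.norm.ppf(prob)

        print("days with SHI > 2 (strong heat events):", int((shi > 2).sum()))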

  18. Deep Learning for real-time gravitational wave detection and parameter estimation: Results with Advanced LIGO data

    NASA Astrophysics Data System (ADS)

    George, Daniel; Huerta, E. A.

    2018-03-01

    The recent Nobel-prize-winning detections of gravitational waves from merging black holes and the subsequent detection of the collision of two neutron stars in coincidence with electromagnetic observations have inaugurated a new era of multimessenger astrophysics. To enhance the scope of this emergent field of science, we pioneered the use of deep learning with convolutional neural networks that take time-series inputs for rapid detection and characterization of gravitational wave signals. This approach, Deep Filtering, was initially demonstrated using simulated LIGO noise. In this article, we present the extension of Deep Filtering to real LIGO data, for both detection and parameter estimation of gravitational waves from binary black hole mergers using continuous data streams from multiple LIGO detectors. We demonstrate for the first time that machine learning can detect and estimate the true parameters of real events observed by LIGO. Our results show that Deep Filtering achieves sensitivities similar to, and errors lower than, matched filtering, while being far more computationally efficient and more resilient to glitches. This allows real-time processing of weak time-series signals in non-stationary, non-Gaussian noise with minimal resources, and enables the detection of new classes of gravitational wave sources that may go unnoticed by existing detection algorithms. This unified framework for data analysis is ideally suited to enable coincident detection campaigns of gravitational waves and their multimessenger counterparts in real time.
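
    A minimal sketch of a CNN classifier for time-series inputs of this kind is shown below in PyTorch; the layer sizes and depths are illustrative and do not reproduce the published Deep Filtering architecture.

    ```python
    import torch
    import torch.nn as nn

    class DeepFilteringLike(nn.Module):
        """Toy 1-D CNN classifier for time-series strain data (illustrative only)."""
        def __init__(self, n_samples: int = 8192):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(1, 16, kernel_size=16), nn.ReLU(), nn.MaxPool1d(4),
                nn.Conv1d(16, 32, kernel_size=8), nn.ReLU(), nn.MaxPool1d(4),
                nn.Conv1d(32, 64, kernel_size=8), nn.ReLU(), nn.MaxPool1d(4),
            )
            with torch.no_grad():                     # infer flattened feature size
                n_flat = self.features(torch.zeros(1, 1, n_samples)).numel()
            self.head = nn.Linear(n_flat, 2)          # logits: signal vs. noise

        def forward(self, x):                         # x: (batch, 1, n_samples)
            return self.head(self.features(x).flatten(1))

    model = DeepFilteringLike()
    strain = torch.randn(4, 1, 8192)                  # fake whitened detector strain
    print(model(strain).shape)                        # torch.Size([4, 2])
    ```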

  19. An improved shuffled frog leaping algorithm based evolutionary framework for currency exchange rate prediction

    NASA Astrophysics Data System (ADS)

    Dash, Rajashree

    2017-11-01

    Forecasting the purchasing power of one currency with respect to another is a perennial topic in financial time series prediction. Despite the existence of several traditional and computational models for currency exchange rate forecasting, there is always a need for a simpler and more efficient model with better predictive capability. In this paper, an evolutionary framework is proposed that couples an improved shuffled frog leaping (ISFL) algorithm with a computationally efficient functional link artificial neural network (CEFLANN) for the prediction of currency exchange rates. The model is validated on monthly prediction measures for three currency exchange data sets (USD/CAD, USD/CHF and USD/JPY) covering the same period. Model performance is also compared with two other evolutionary learning techniques, the shuffled frog leaping algorithm and particle swarm optimization. Analysis of the results suggests that the proposed ISFL-CEFLANN model is a promising predictor for currency exchange rates compared with the other models included in the study.
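
    The sketch below shows the functional-link idea behind a CEFLANN-type predictor: a trigonometric expansion of lagged inputs feeding a single linear output layer. For brevity the weights are fit by least squares rather than by the ISFL search the paper proposes, and the exchange-rate series is simulated.

    ```python
    import numpy as np

    def trig_expand(X, order=2):
        """Trigonometric functional-link expansion used by FLANN-type models:
        each input x is mapped to [x, sin(k*pi*x), cos(k*pi*x), k=1..order]."""
        feats = [X]
        for k in range(1, order + 1):
            feats += [np.sin(k * np.pi * X), np.cos(k * np.pi * X)]
        return np.hstack(feats + [np.ones((len(X), 1))])   # plus a bias column

    # Hypothetical exchange-rate series: predict next value from 5 lags.
    rng = np.random.default_rng(0)
    rate = np.cumsum(rng.normal(0, 0.01, 500)) + 1.3
    lags = np.column_stack([rate[i:i - 6] for i in range(5)])  # 5 lagged inputs
    target = rate[5:-1]
    W = np.linalg.lstsq(trig_expand(lags), target, rcond=None)[0]
    pred = trig_expand(lags) @ W
    print(f"in-sample RMSE: {np.sqrt(np.mean((pred - target) ** 2)):.5f}")
    ```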

  20. Prototype software model for designing intruder detection systems with simulation

    NASA Astrophysics Data System (ADS)

    Smith, Jeffrey S.; Peters, Brett A.; Curry, James C.; Gupta, Dinesh

    1998-08-01

    This article explores using discrete-event simulation for the design and control of defence-oriented, fixed-sensor-based detection systems in a facility housing items of significant interest to enemy forces. The key issues discussed include software development, simulation-based optimization within a modeling framework, and the expansion of the framework to create real-time control tools and training simulations. The software discussed in this article is a flexible simulation environment in which the data for the simulation are stored in an external database and the simulation logic is implemented using a commercial simulation package. The simulation assesses the overall security level of a building against various intruder scenarios. A series of simulation runs with different inputs can determine the change in security level with changes in the sensor configuration, building layout, and intruder/guard strategies. In addition, the simulation model developed for the design stage of the project can be modified to produce a control tool for the testing, training, and real-time control of systems with humans and sensor hardware in the loop.

  1. Data-Driven Modeling and Prediction of Arctic Sea Ice

    NASA Astrophysics Data System (ADS)

    Kondrashov, Dmitri; Chekroun, Mickael; Ghil, Michael

    2016-04-01

    We present results of data-driven predictive analyses of sea ice over the main Arctic regions. Our approach relies on the Multilayer Stochastic Modeling (MSM) framework of Kondrashov, Chekroun and Ghil [Physica D, 2015] and leads to probabilistic prognostic models of sea ice concentration (SIC) anomalies on seasonal time scales. The approach is applied to monthly time series of state-of-the-art data-adaptive decompositions of SIC and selected climate variables over the Arctic. We evaluate the predictive skill of MSM models by performing retrospective forecasts with "no look ahead" for up to six months ahead. In particular, the memory effects included intrinsically in the formulation of our non-Markovian MSM models improve the prediction skill both of large-amplitude SIC anomalies in certain Arctic regions and of September sea ice extent. Further improvements within the MSM framework will come from adopting a nonlinear formulation and from exploring next-generation data-adaptive decompositions, namely modified Principal Oscillation Patterns (POPs) and rotated Multichannel Singular Spectrum Analysis (M-SSA).

  2. Characteristics of Hydrogen Sensors Based on Thin Tin Dioxide Films Modified with Gold

    NASA Astrophysics Data System (ADS)

    Almaev, A. V.; Gaman, V. I.

    2017-11-01

    The effect of hydrogen in the concentration range from 10 to 2000 ppm on the characteristics of sensors based on thin films of tin dioxide modified with gold (Au/SnO2:Sb, Au) is studied in the thermo-cyclic mode at temperatures from 623 to 773 K and absolute humidity from 2.5 to 20 g/m3. Experimental data are discussed using expressions obtained within the framework of a model that takes into account the presence of three types of adsorbed particles (O⁻, OH, and OH⁻) on the surface of SnO2 nanocrystals. The characteristics of sensors based on thin Pt/Pd/SnO2:Sb films (the first series) are compared with those of Au/SnO2:Sb, Au films (the second series). It is found that the degree of dissociation of molecular hydrogen into atoms during adsorption, under interaction with Au particles on the SnO2 surface, is 4 times greater than under interaction with Pt/Pd particles. The degree of dissociation of H2O molecules into hydrogen atoms and hydroxyl groups in pure moist air is 1.6 times greater on the surface of the second-series sensors than on the first-series sensors. Thus, gold is a more effective stimulator of the dissociation of H2 and H2O molecules than platinum and palladium. A formula is obtained that more accurately describes the dependence of the response of both sensor series on the hydrogen concentration and on the temperature of the measuring devices.

  3. Molecular Modeling and Experimental Study of Nonlinear Optical Compounds: Mono-Substituted Derivatives of Dicyanovinylbenzene

    NASA Technical Reports Server (NTRS)

    Timofeeva, Tatyana V.; Nesterov, Vladimir N.; Antipin, Mikhael Y.; Clark, R. D.; Sanghadasa, M.; Cardelino, B. H.; Moore, C. E.; Frazier, Donald O.

    2000-01-01

    A search for potential nonlinear optical (NLO) compounds has been performed using the Cambridge Structural Database and molecular modeling. We have studied a series of mono-substituted derivatives of dicyanovinylbenzene, as the NLO properties of one of its derivatives (o-methoxy-dicyanovinylbenzene, DIVA) were described earlier. The molecular geometry in the series of compounds studied was investigated by X-ray analysis and discussed along with results of molecular mechanics and ab initio quantum chemical calculations. The influence of crystal packing on molecular planarity has been revealed. Two new compounds from the series were found to be active for second harmonic generation (SHG) in powder form. Measurements of SHG efficiency show that the o-F- and p-Cl-derivatives of dicyanovinylbenzene are about 10 and 20 times more active than urea, respectively. The peculiarities of crystal structure formation, in terms of the balance between van der Waals and electrostatic interactions, are discussed. The crystal morphologies of DIVA and the two new SHG-active compounds have been calculated on the basis of their known crystal structures.

  4. High-speed reacting flow simulation using USA-series codes

    NASA Astrophysics Data System (ADS)

    Chakravarthy, S. R.; Palaniswamy, S.

    In this paper, the finite-rate chemistry (FRC) formulation for the USA-series of codes and three sets of validations are presented. USA-series computational fluid dynamics (CFD) codes are based on Unified Solution Algorithms, including explicit and implicit formulations, factorization and relaxation approaches, time-marching and space-marching methodologies, etc., in order to solve a very wide class of CFD problems within a single framework. Euler or Navier-Stokes equations are solved using a finite-volume treatment with upwind Total Variation Diminishing discretization for the inviscid terms. Perfect and real gas options are available, including equilibrium and nonequilibrium chemistry. This capability has been widely used to study various problems including Space Shuttle exhaust plumes, National Aerospace Plane (NASP) designs, etc. (1) Numerical solutions are presented showing the full range of possible solutions to steady detonation wave problems. (2) A comparison between the solution obtained by the USA code and the Generalized Kinetics Analysis Program (GKAP) is shown for supersonic combustion in a duct. (3) Simulation of combustion in a supersonic shear layer is shown to have reasonable agreement with experimental observations.

  5. Time series analysis of personal exposure to ambient air pollution and mortality using an exposure simulator.

    PubMed

    Chang, Howard H; Fuentes, Montserrat; Frey, H Christopher

    2012-09-01

    This paper describes a modeling framework for estimating the acute effects of personal exposure to ambient air pollution in a time series design. First, a spatial hierarchical model is used to relate Census tract-level daily ambient concentrations and simulated exposures for a subset of the study period. The complete exposure time series is then imputed for risk estimation. Modeling exposure via a statistical model considerably reduces the computational burden associated with simulating personal exposures. This allows us to consider personal exposures at a finer spatial resolution, improving exposure assessment, and over a longer study period. The proposed approach is applied to an analysis of fine particulate matter of <2.5 μm in aerodynamic diameter (PM2.5) and daily mortality in the New York City metropolitan area during the period 2001-2005. Personal PM2.5 exposures were simulated from the Stochastic Human Exposure and Dose Simulation. Accounting for exposure uncertainty, the authors estimated a 2.32% (95% posterior interval: 0.68, 3.94) increase in mortality per 10 μg/m³ increase in personal exposure to PM2.5 from outdoor sources on the previous day. The corresponding estimate per 10 μg/m³ increase in ambient PM2.5 concentration was 1.13% (95% confidence interval: 0.27, 2.00). The risks of mortality associated with PM2.5 were also higher during the summer months.
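
    For readers unfamiliar with how such percentage risk estimates arise, the sketch below fits a log-linear Poisson time-series regression on simulated data; it omits the paper's hierarchical exposure model and uncertainty propagation, and all series are fake.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Simulated daily series: deaths, lagged exposure, and a smooth seasonal
    # control. A "% increase per 10 ug/m3" figure corresponds to
    # exp(10 * beta) - 1 for the fitted exposure coefficient beta.
    rng = np.random.default_rng(1)
    days = np.arange(1826)                          # ~5 years
    season = np.column_stack([np.sin(2 * np.pi * days / 365.25),
                              np.cos(2 * np.pi * days / 365.25)])
    pm_lag1 = rng.gamma(4, 3, size=len(days))       # fake exposure, ug/m3
    mu = np.exp(4.6 + 0.0023 * pm_lag1 + season @ [0.05, 0.08])
    deaths = rng.poisson(mu)

    X = sm.add_constant(np.column_stack([pm_lag1, season]))
    fit = sm.GLM(deaths, X, family=sm.families.Poisson()).fit()
    beta = fit.params[1]                            # exposure coefficient
    print(f"% increase per 10 ug/m3: {100 * (np.exp(10 * beta) - 1):.2f}")
    ```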

  6. Bayesian analysis of time-series data under case-crossover designs: posterior equivalence and inference.

    PubMed

    Li, Shi; Mukherjee, Bhramar; Batterman, Stuart; Ghosh, Malay

    2013-12-01

    Case-crossover designs are widely used to study short-term exposure effects on the risk of acute adverse health events. While the frequentist literature on this topic is vast, there is no Bayesian work in this general area. The contribution of this paper is twofold. First, the paper establishes Bayesian equivalence results that require characterization of the set of priors under which the posterior distributions of the risk ratio parameters based on a case-crossover and time-series analysis are identical. Second, the paper studies inferential issues under case-crossover designs in a Bayesian framework. Traditionally, a conditional logistic regression is used for inference on risk-ratio parameters in case-crossover studies. We consider instead a more general full likelihood-based approach which makes less restrictive assumptions on the risk functions. Formulation of a full likelihood leads to growth in the number of parameters proportional to the sample size. We propose a semi-parametric Bayesian approach using a Dirichlet process prior to handle the random nuisance parameters that appear in a full likelihood formulation. We carry out a simulation study to compare the Bayesian methods based on full and conditional likelihood with the standard frequentist approaches for case-crossover and time-series analysis. The proposed methods are illustrated through the Detroit Asthma Morbidity, Air Quality and Traffic study, which examines the association between acute asthma risk and ambient air pollutant concentrations. © 2013, The International Biometric Society.

  7. Use of a time series design to test effectiveness of a theory-based intervention targeting adherence of health professionals to a clinical guideline.

    PubMed

    Hanbury, Andria; Wallace, Louise; Clark, Michael

    2009-09-01

    The aim of this study was to test the effectiveness of a theory of planned behaviour intervention in increasing adherence of community mental health professionals to a national suicide prevention guideline. A Theory of Planned Behaviour (TPB) questionnaire, developed from interview findings, was administered to the health professionals. Subjective norms were found to be the most significant predictor of intention to adhere to the guideline, and were targeted with an interactive educational intervention. Routinely collected audit adherence data from an intervention site and a control site were analysed using time series analysis to test whether the intervention significantly increased adherence; the effects of a local and a national event on adherence were also examined. The TPB accounted for 58% of the variance in intention to adhere, with subjective norms the only significant predictor. The intervention did not significantly increase adherence; however, the national and local events were found to have significantly increased adherence. The TPB was a useful framework for exploring barriers to adherence, but this did not translate into an effective intervention. Future research should seek collaboration with local experts and use this information in combination with the TPB to develop interventions. Collaborative research with experts in pedagogy may also help to develop more effective interventions, particularly education-based interventions that require adult learning.

  8. Spatiotemporal Filtering Using Principal Component Analysis and Karhunen-Loeve Expansion Approaches for Regional GPS Network Analysis

    NASA Technical Reports Server (NTRS)

    Dong, D.; Fang, P.; Bock, F.; Webb, F.; Prawirondirdjo, L.; Kedar, S.; Jamason, P.

    2006-01-01

    Spatial filtering is an effective way to improve the precision of coordinate time series for regional GPS networks by reducing so-called common mode errors, thereby providing better resolution for detecting weak or transient deformation signals. The commonly used approach to regional filtering assumes that the common mode error is spatially uniform, which is a good approximation for networks of hundreds of kilometers extent, but breaks down as the spatial extent increases. A more rigorous approach should remove the assumption of spatially uniform distribution and let the data themselves reveal the spatial distribution of the common mode error. The principal component analysis (PCA) and the Karhunen-Loeve expansion (KLE) both decompose network time series into a set of temporally varying modes and their spatial responses. Therefore they provide a mathematical framework to perform spatiotemporal filtering. We apply the combination of PCA and KLE to daily station coordinate time series of the Southern California Integrated GPS Network (SCIGN) for the period 2000 to 2004. We demonstrate that spatially and temporally correlated common mode errors are the dominant error source in daily GPS solutions. The spatial characteristics of the common mode errors are close to uniform for all east, north, and vertical components, which implies a very long wavelength source for the common mode errors, compared to the spatial extent of the GPS network in southern California. Furthermore, the common mode errors exhibit temporally nonrandom patterns.
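
    A stripped-down version of PCA-based common-mode filtering can be written in a few lines of numpy: stack the network's residual time series into a stations-by-days matrix, remove the leading mode(s), and keep the rest. The network geometry and error model below are simulated, not SCIGN data.

    ```python
    import numpy as np

    def pca_filter(X, n_modes=1):
        """Remove the leading principal components (common-mode error) from a
        stations x days matrix of coordinate residuals (trends already removed)."""
        Xc = X - X.mean(axis=1, keepdims=True)
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        common = U[:, :n_modes] * s[:n_modes] @ Vt[:n_modes]   # rank-k reconstruction
        return Xc - common, common

    # Hypothetical network: 30 stations sharing one temporally correlated error
    # with a non-uniform spatial response, plus white measurement noise.
    rng = np.random.default_rng(2)
    cme = np.cumsum(rng.normal(0, 0.3, 1000))          # common-mode time series
    X = np.outer(0.8 + 0.4 * rng.random(30), cme) + rng.normal(0, 1.0, (30, 1000))
    filtered, common = pca_filter(X)
    print(X.std(), filtered.std())                     # scatter drops after filtering
    ```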

  9. Fractal analyses reveal independent complexity and predictability of gait

    PubMed Central

    Dierick, Frédéric; Nivard, Anne-Laure

    2017-01-01

    Locomotion is a natural task that has been assessed for decades and used as a proxy to highlight impairments of various origins. So far, most studies have adopted classical linear analyses of spatio-temporal gait parameters. Here, we use more advanced, yet no less practical, non-linear techniques to analyse gait time series of healthy subjects. We aimed to find indexes more sensitive than the spatio-temporal gait parameters previously used, in the hope of better identifying abnormal locomotion. We analysed large-scale stride interval time series and mean step width in 34 participants while altering walking direction (forward vs. backward walking) and with or without galvanic vestibular stimulation. The Hurst exponent α and the Minkowski fractal dimension D were computed and interpreted as indexes expressing predictability and complexity of stride interval time series, respectively. These holistic indexes can easily be interpreted in the framework of optimal movement complexity. We show that α and D accurately capture stride interval changes as a function of the experimental condition. Walking forward exhibited maximal complexity (D) and hence, adaptability. In contrast, walking backward and/or stimulation of the vestibular system decreased D. Furthermore, walking backward increased predictability (α) through a more stereotyped pattern of the stride interval, and galvanic vestibular stimulation reduced predictability. The present study demonstrates the complementary power of the Hurst exponent and the fractal dimension to improve walking classification. Our developments may have immediate applications in rehabilitation, diagnosis, and classification procedures. PMID:29182659
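
    As a pointer to how such indices are obtained, the following sketch estimates a scaling exponent α by first-order detrended fluctuation analysis, a common route to Hurst-type exponents for stride-interval series; the paper's companion index D (fractal dimension) is not reproduced here, and the test signals are synthetic.

    ```python
    import numpy as np

    def hurst_dfa(x, scales=(8, 16, 32, 64, 128)):
        """Scaling exponent alpha via first-order DFA; alpha ~ 0.5 for white
        noise, > 0.5 for persistent (more predictable) series."""
        y = np.cumsum(x - np.mean(x))                 # integrated profile
        F = []
        for s in scales:
            n = len(y) // s
            segs = y[:n * s].reshape(n, s)
            t = np.arange(s)
            rms = []
            for seg in segs:                          # detrend each window linearly
                a, b = np.polyfit(t, seg, 1)
                rms.append(np.mean((seg - (a * t + b)) ** 2))
            F.append(np.sqrt(np.mean(rms)))
        return np.polyfit(np.log(scales), np.log(F), 1)[0]

    rng = np.random.default_rng(3)
    print(hurst_dfa(rng.normal(size=4096)))             # ~0.5: uncorrelated noise
    print(hurst_dfa(np.cumsum(rng.normal(size=4096))))  # ~1.5: strongly persistent
    ```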

  10. Concentration-discharge relationships to understand the interplay between hydrological and biogeochemical processes: insights from data analysis and numerical experiments in headwater catchments.

    NASA Astrophysics Data System (ADS)

    De Dreuzy, J. R.; Marçais, J.; Moatar, F.; Minaudo, C.; Courtois, Q.; Thomas, Z.; Longuevergne, L.; Pinay, G.

    2017-12-01

    The integration of hydrological and biogeochemical processes leads to emergent patterns at the catchment scale, and monitoring in rivers reflects the aggregation of these effects. While discharge time series have been measured for decades, high frequency water quality monitoring in rivers now provides prominent measurements for characterizing the interplay between hydrological and biogeochemical processes, especially for inferring the processes that happen in the heterogeneous subsurface. However, we still lack frameworks to relate observed patterns to specific processes, because of the "organized complexity" of hydrological systems. Indeed, it is unclear what controls, for example, patterns in concentration-discharge (C/Q) relationships shaped by non-linear processes and hysteresis effects. Here we develop a computationally non-intensive, process-based model to test how the integration of different landforms (i.e. geological heterogeneities and structures, topographical features) with different biogeochemical reactivity assumptions (e.g. reactive zone locations) can shape the overall water quality time series. With numerical experiments, we investigate typical patterns in high frequency C/Q relationships. In headwater basins, we found that typical hysteretic patterns in C/Q relationships observed in data time series can be attributed to differences in where water and solutes are stored across the hillslope. At the catchment scale, though, these effects tend to average out through the integration of contrasting hillslope landforms. Together these results suggest that the information contained in headwater water quality monitoring can be used to understand how hydrochemical processes determine downstream conditions.

  11. Combined time-series resistivity and geochemical tracer techniques to examine submarine groundwater discharge at Dor Beach, Israel

    USGS Publications Warehouse

    Swarzenski, P.W.; Burnett, W.C.; Greenwood, W.J.; Herut, B.; Peterson, R.; Dimova, N.; Shalem, Y.; Yechieli, Y.; Weinstein, Y.

    2006-01-01

    A high-resolution, stationary geophysical and geochemical survey was conducted at Dor Beach, Israel, to examine the shallow coastal hydrogeology and its control on the exchange of submarine groundwater with the shallow Mediterranean Sea. Time-series resistivity profiles using a new 56-electrode (112-m long) marine cable produced detailed profiles of the fresh water/salt water interface and the subtle response of this interface to tidal excursions and other forcing factors. Such information, when ground-truthed with representative pore water salinities and formation resistivity values, can provide unique information on the extent and rates of submarine groundwater discharge (SGD). Time-series 222Rn measurements of the adjacent coastal water column complemented these geophysical techniques and were modeled to yield integrated advective flow rates across the sediment/water interface, which at Dor Beach ranged from about 0 to 30 cm d-1 (mean = 7.1 cm d-1), depending on the tidal range. Such results suggest that the underlying hydrogeologic framework at Dor is favorable for substantial SGD. Extrapolating these SGD estimates across a 100-m wide coastal zone implies that the Rn-derived SGD rate would equal about 7.1 m3 d-1 per m of shoreline, and that the source of this discharging groundwater is a complex mixture of fresh groundwater derived from the upland Kurkar deposits and locally recycled seawater. Copyright 2006 by the American Geophysical Union.

  12. Transformation of metal-organic frameworks for molecular sieving membranes

    PubMed Central

    Li, Wanbin; Zhang, Yufan; Zhang, Congyang; Meng, Qin; Xu, Zehai; Su, Pengcheng; Li, Qingbiao; Shen, Chong; Fan, Zheng; Qin, Lei; Zhang, Guoliang

    2016-01-01

    The development of simple, versatile strategies for the synthesis of metal-organic framework (MOF)-derived membranes is of increasing scientific interest, but challenges remain in understanding suitable fabrication mechanisms. Here we report a route for the complete transformation of a series of MOF membranes and particles, based on multivalent cation substitution. Through our approach, the effective pore size can be reduced through the immobilization of metal salt residues in the cavities, and appropriate MOF crystal facets can be exposed, to achieve competitive molecular sieving capabilities. The method can also be used more generally for the synthesis of a variety of MOF membranes and particles. Importantly, we design and synthesize promising MOF membrane candidates that are hard to achieve through conventional methods. For example, our CuBTC/MIL-100 membrane exhibits 89, 171, 241 and 336 times higher H2 permeance than that of CO2, O2, N2 and CH4, respectively. PMID:27090597

  13. The Bayesian boom: good thing or bad?

    PubMed Central

    Hahn, Ulrike

    2014-01-01

    A series of high-profile critiques of Bayesian models of cognition have recently sparked controversy. These critiques question the contribution of rational, normative considerations in the study of cognition. The present article takes central claims from these critiques and evaluates them in light of specific models. Closer consideration of actual examples of Bayesian treatments of different cognitive phenomena allows one to defuse these critiques, showing that they cannot be sustained across the diversity of applications of the Bayesian framework for cognitive modeling. More generally, there is nothing in the Bayesian framework that would inherently give rise to the deficits that these critiques perceive, suggesting they have been framed at the wrong level of generality. At the same time, the examples are used to demonstrate the different ways in which consideration of rationality uniquely benefits both theory and practice in the study of cognition. PMID:25152738

  14. An Ethics Framework for Public Health

    PubMed Central

    Kass, Nancy E.

    2001-01-01

    More than 100 years ago, public health began as an organized discipline, its purpose being to improve the health of populations rather than of individuals. Given its population-based focus, however, public health perennially faces dilemmas concerning the appropriate extent of its reach and whether its activities infringe on individual liberties in ethically troublesome ways. In this article a framework for ethics analysis of public health programs is proposed. To advance traditional public health goals while maximizing individual liberties and furthering social justice, public health interventions should reduce morbidity or mortality; data must substantiate that a program (or the series of programs of which a program is a part) will reduce morbidity or mortality; burdens of the program must be identified and minimized; the program must be implemented fairly and must, at times, minimize preexisting social injustices; and fair procedures must be used to determine which burdens are acceptable to a community. PMID:11684600

  15. PsyGlass: Capitalizing on Google Glass for naturalistic data collection.

    PubMed

    Paxton, Alexandra; Rodriguez, Kevin; Dale, Rick

    2015-09-01

    As commercial technology moves further into wearable technologies, cognitive and psychological scientists can capitalize on these devices to facilitate naturalistic research designs while still maintaining strong experimental control. One such wearable technology is Google Glass (Google, Inc.: www.google.com/glass), which can present wearers with audio and visual stimuli while tracking a host of multimodal data. In this article, we introduce PsyGlass, a freely available framework for incorporating Google Glass into experimental work, open to community improvement over time (www.github.com/a-paxton/PsyGlass). As a proof of concept, we use this framework to investigate dual-task pressures on naturalistic interaction. The preliminary study demonstrates how designs from classic experimental psychology may be integrated into naturalistic interactive designs with emerging technologies. We close with a series of recommendations for using PsyGlass and a discussion of how wearable technology more broadly may contribute to new or adapted naturalistic research designs.

  16. Multilayer Stock Forecasting Model Using Fuzzy Time Series

    PubMed Central

    Javedani Sadaei, Hossein; Lee, Muhammad Hisyam

    2014-01-01

    A review of the vast body of literature on using fuzzy time series (FTS) in stock market forecasting reveals certain deficiencies in how its findings have been hybridized. In addition, the lack of a constructive, systematic framework that could indicate directions of growth for FTS forecasting systems as a whole is striking. In this study, we propose a multilayer model for stock market forecasting comprising five logically significant layers. Each layer has its own detailed concern and assists forecast development by resolving a specific set of problems. To verify the model, a large body of data covering the Taiwan Stock Index (TAIEX), the National Association of Securities Dealers Automated Quotations (NASDAQ), the Dow Jones Industrial Average (DJI), and the S&P 500 was chosen as experimental datasets. The results indicate that the proposed methodology has the potential to be accepted as a framework for model development in stock market forecasting using FTS. PMID:24605058
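
    To make the FTS building block concrete, here is a first-order fuzzy-time-series forecaster in the style of Chen (1996) on invented index levels; the paper's five-layer model wraps this core in additional preprocessing, rule-weighting and optimisation layers.

    ```python
    import numpy as np

    def fts_forecast(series, n_intervals=7):
        """First-order FTS forecast: partition the universe of discourse,
        fuzzify, learn fuzzy logical relationship groups, and forecast as the
        mean midpoint of the successor group of the last observed state."""
        lo, hi = series.min() - 1e-9, series.max() + 1e-9
        edges = np.linspace(lo, hi, n_intervals + 1)
        mids = (edges[:-1] + edges[1:]) / 2
        states = np.clip(np.digitize(series, edges) - 1, 0, n_intervals - 1)
        groups = {}                                 # A_i -> set of successor A_j
        for a, b in zip(states[:-1], states[1:]):
            groups.setdefault(a, set()).add(b)
        nxt = groups.get(states[-1], {states[-1]})
        return np.mean([mids[j] for j in nxt])      # one-step-ahead forecast

    closes = np.array([6200, 6250, 6310, 6300, 6280, 6350, 6400, 6380, 6420])
    print(fts_forecast(closes))                     # hypothetical index levels
    ```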

  17. Modeling Urban Scenarios & Experiments: Fort Indiantown Gap Data Collections Summary and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Archer, Daniel E.; Bandstra, Mark S.; Davidson, Gregory G.

    This report summarizes experimental radiation detector, contextual sensor, weather, and global positioning system (GPS) data collected to inform and validate a comprehensive, operational radiation transport modeling framework for evaluating radiation detector system and algorithm performance. This framework will be used to study the influence of systematic effects (such as geometry, background activity, background variability, environmental shielding, etc.) on detector responses and algorithm performance using synthetic time series data. The work consists of data collection campaigns in a canonical, controlled environment, fully characterized radiologically, to help construct and benchmark a high-fidelity model with quantified system geometries, detector response functions, and source terms for background and threat objects. These data also provide an archival benchmark dataset for use by the radiation detection community. The data reported here span four collection campaigns conducted between May 2015 and September 2016.

  18. Pacing of deep marine sedimentation in the middle Eocene synorogenic Ainsa Basin, Spanish Pyrenees: deconvolving a 6 Myr record of tectonic and climate controls

    NASA Astrophysics Data System (ADS)

    Mac Niocaill, C.; Cantalejo, B.; Pickering, K. T.; Grant, M.; Johansen, K.

    2016-12-01

    The Middle Eocene thrust-top Ainsa Basin of Northern Spain preserves world-class exposures of deep-marine submarine fan and related deposits. Detailed paleomagnetic, micropaleontologic, and time-series analyses enable us to deconvolve, for the first time in any ancient deep-marine basin worldwide, the pacing of deposition of both the fine-grained interfan sediments and the main sandbodies (submarine fans) through the history of the deep-marine basin. Our magnetostratigraphy and faunal constraints provide a chronological framework for sedimentation in the basin. We use time-series analysis of a range of geochemical and sedimentologic data to identify likely climatic signals in the sedimentary archive. This has enabled us to test the relative importance of climate versus tectonics in controlling deposition. We show that the fine-grained interfan sedimentation preserves a dominant Milankovitch-like cyclicity, whereas the sandbodies (fans) reflect a complex interplay of controls such as tectonics and climate in the sediment source area, including shallow-marine staging areas for sediment redeposition into deeper water. These results not only provide critical information about the timing of substantial coarse clastic delivery into the Ainsa Basin but also constrain sediment flux over a 6 Myr window.

  19. A general theory on frequency and time-frequency analysis of irregularly sampled time series based on projection methods - Part 1: Frequency analysis

    NASA Astrophysics Data System (ADS)

    Lenoir, Guillaume; Crucifix, Michel

    2018-03-01

    We develop a general framework for the frequency analysis of irregularly sampled time series. It is based on the Lomb-Scargle periodogram, but extended to algebraic operators accounting for the presence of a polynomial trend in the model for the data, in addition to a periodic component and a background noise. Special care is devoted to the correlation between the trend and the periodic component. This new periodogram is then cast into the Welch overlapping segment averaging (WOSA) method in order to reduce its variance. We also design a test of significance for the WOSA periodogram, against the background noise. The model for the background noise is a stationary Gaussian continuous autoregressive-moving-average (CARMA) process, more general than the classical Gaussian white or red noise processes. CARMA parameters are estimated following a Bayesian framework. We provide algorithms that compute the confidence levels for the WOSA periodogram and fully take into account the uncertainty in the CARMA noise parameters. Alternatively, a theory using point estimates of CARMA parameters provides analytical confidence levels for the WOSA periodogram, which are more accurate than Markov chain Monte Carlo (MCMC) confidence levels and, below some threshold for the number of data points, less costly in computing time. We then estimate the amplitude of the periodic component with least-squares methods, and derive an approximate proportionality between the squared amplitude and the periodogram. This proportionality leads to a new extension for the periodogram: the weighted WOSA periodogram, which we recommend for most frequency analyses with irregularly sampled data. The estimated signal amplitude also permits filtering in a frequency band. Our results generalise and unify methods developed in the fields of geosciences, engineering, astronomy and astrophysics. They also constitute the starting point for an extension to the continuous wavelet transform developed in a companion article (Lenoir and Crucifix, 2018). All the methods presented in this paper are available to the reader in the Python package WAVEPAL.
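
    The classical (unweighted, trend-free) core of this machinery is available in SciPy; the sketch below detrends an irregularly sampled series with a polynomial fit and locates its dominant period with the Lomb-Scargle periodogram. The paper's algebraic treatment of the trend, the WOSA averaging and the CARMA significance test are not reproduced here.

    ```python
    import numpy as np
    from scipy.signal import lombscargle

    # Irregularly sampled sinusoid with a linear trend (synthetic data).
    rng = np.random.default_rng(4)
    t = np.sort(rng.uniform(0, 100, 400))              # irregular sampling times
    y = 0.03 * t + np.sin(2 * np.pi * t / 7.3) + 0.5 * rng.normal(size=t.size)
    y = y - np.polyval(np.polyfit(t, y, 1), t)         # remove polynomial trend

    periods = np.linspace(2, 20, 2000)
    omega = 2 * np.pi / periods                        # angular test frequencies
    pgram = lombscargle(t, y, omega, normalize=True)
    print(f"detected period: {periods[np.argmax(pgram)]:.2f} (true 7.30)")
    ```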

  20. SU-E-T-56: A Novel Approach to Computing Expected Value and Variance of Point Dose From Non-Gated Radiotherapy Delivery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, S; Zhu, X; Zhang, M

    Purpose: Randomness in the phase of patient internal organ motion at the beginning of non-gated radiotherapy delivery may introduce uncertainty in the dose received by the patient. Concern about this deviation from the planned dose has motivated many researchers to study the phenomenon, although a unified theoretical framework for computing it is still missing. This study was conducted to develop such a framework. Methods: Two reasonable assumptions were made: (a) patient internal organ motion is stationary and periodic; (b) no special arrangement is made to start a non-gated radiotherapy delivery at any specific phase of patient internal organ motion. A statistical ensemble was formed consisting of the patient's non-gated radiotherapy deliveries at all equally probable initial organ motion phases. To characterize the patient's received dose, the statistical ensemble average method was employed to derive formulae for two quantities: the expected value and the variance of the dose received by an internal point from a non-gated radiotherapy delivery. Fourier series were utilized to facilitate the analysis. Results: According to our formulae, the two quantities can be computed from the dose rate time sequences, generated by the non-gated delivery, at the point's corresponding locations on fixed-phase 3D CT images sampled evenly in time over one organ motion period. The expected value of the point dose is simply the average of the doses to the point's corresponding locations on the fixed-phase CT images. The variance can be determined by time integration in terms of the Fourier series coefficients of the dose rate time sequences on the same fixed-phase 3D CT images. Conclusion: Given a non-gated radiotherapy delivery plan and the patient's 4D CT study, our approach can predict the expected value and variance of the patient's radiation dose. We expect it to play a significant role in determining both the quality and the robustness of non-gated radiotherapy plans.
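
    The ensemble-average idea lends itself to a direct numerical check: evaluate the dose-rate sequence for every equally likely starting phase, integrate each over the delivery, and take the mean and variance across phases. All numbers below are hypothetical, and the discretisation stands in for the paper's Fourier-series derivation.

    ```python
    import numpy as np

    def dose_moments(dose_rates, dt):
        """Mean and variance of total point dose over equally likely start phases.
        dose_rates[p, k]: dose rate at delivery step k for start phase p."""
        doses = dose_rates.sum(axis=1) * dt        # total dose for each start phase
        return doses.mean(), doses.var()

    # Hypothetical: 60-step motion period, delivery of 590 steps (not a whole
    # number of periods), 10 equally spaced candidate start phases.
    k = np.arange(590)
    rates = np.array([1.0 + 0.3 * np.sin(2 * np.pi * (k + 6 * p) / 60)
                      for p in range(10)])
    mean, var = dose_moments(rates, dt=0.1)
    print(f"expected dose {mean:.3f}, variance {var:.3e}")
    ```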

  1. IDCDACS: IDC's Distributed Application Control System

    NASA Astrophysics Data System (ADS)

    Ertl, Martin; Boresch, Alexander; Kianička, Ján; Sudakov, Alexander; Tomuta, Elena

    2015-04-01

    The Preparatory Commission for the CTBTO is an international organization based in Vienna, Austria. Its mission is to establish a global verification regime to monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), which bans all nuclear explosions. For this purpose, time series data from a global network of seismic, hydro-acoustic and infrasound (SHI) sensors are transmitted in near-real time to the International Data Centre (IDC) in Vienna, where they are processed to locate events that may be nuclear explosions. We designed a new distributed application control system that glues together the various components of the automatic waveform data processing system at the IDC (IDCDACS). Our highly scalable solution preserves the existing architecture of the IDC processing system, which proved successful over many years of operational use, but replaces proprietary components with open-source solutions and custom-developed software. Existing code was refactored and extended to obtain a reusable software framework that is flexibly adaptable to different types of processing workflows. Automatic data processing is organized in series of self-contained processing steps, each series being referred to as a processing pipeline. Pipelines process data by time intervals, i.e. the time-series data received from monitoring stations are organized in segments based on the time when the data were recorded. So-called data monitor applications queue the data for processing in each pipeline based on specific conditions, e.g. data availability, elapsed time or completion states of preceding processing pipelines. IDCDACS consists of a configurable number of distributed monitoring and controlling processes, a message broker and a relational database. All processes communicate through message queues hosted on the message broker; persistent state information is stored in the database. A configurable processing controller instantiates and monitors all data processing applications. Owing to the decoupling provided by message queues, the system is highly versatile and failure tolerant. The implementation utilizes the RabbitMQ open-source messaging platform, which is based on the Advanced Message Queuing Protocol (AMQP), an on-the-wire protocol (like HTTP) and open industry standard. IDCDACS uses the high-availability capabilities provided by RabbitMQ and is equipped with failure recovery features to survive network and server outages. It is implemented in C and Python and is operated in a Linux environment at the IDC. Although IDCDACS was specifically designed for the existing IDC processing system, its architecture is generic and reusable for different automatic processing workflows, e.g. similar to those described in Olivieri et al. (2012) and Kværna et al. (2012). Major advantages are its independence of the specific data processing applications used and the possibility to reconfigure IDCDACS for different types of processing, data and trigger logic. A possible future development would be to use the IDCDACS framework in different scientific domains, e.g. for processing Earth observation satellite data, extending the one-dimensional time-series intervals to spatio-temporal data cubes.
    REFERENCES: Olivieri, M., and J. Clinton (2012). An almost fair comparison between Earthworm and SeisComp3. Seismological Research Letters, 83(4), 720-727. Kværna, T., S. J. Gibbons, D. B. Harris, and D. A. Dodge (2012). Adapting pipeline architectures to track developing aftershock sequences and recurrent explosions. Proceedings of the 2012 Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies, 776-785.
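
    A minimal sketch of the messaging pattern described above, using the pika client for RabbitMQ: a worker consumes an "interval ready" message, runs its processing step, and acknowledges. The queue name and message format are hypothetical, not the IDC's actual configuration.

    ```python
    import pika

    def on_interval(channel, method, properties, body):
        interval_id = body.decode()
        print(f"processing interval {interval_id}")    # real code would launch the step
        channel.basic_ack(delivery_tag=method.delivery_tag)

    conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    ch = conn.channel()
    ch.queue_declare(queue="pipeline.seismic.intervals", durable=True)
    ch.basic_qos(prefetch_count=1)                      # one interval per worker at a time
    ch.basic_consume(queue="pipeline.seismic.intervals", on_message_callback=on_interval)
    ch.start_consuming()
    ```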

  2. Conceptual Frameworks in Didactics--Learning and Teaching: Trends, Evolutions and Comparative Challenges

    ERIC Educational Resources Information Center

    Ligozat, Florence; Almqvist, Jonas

    2018-01-01

    This special issue of the "European Educational Research Journal" presents a series of research papers reflecting the trends and evolutions in conceptual frameworks that took place within the EERA 27 "Didactics--Learning and Teaching" network during its first ten years of existence. Most conceptual tools used in this field were…

  3. Conceptual Frameworks for Child Care Decision-Making. White Paper

    ERIC Educational Resources Information Center

    Chaudry, Ajay; Henly, Julia; Meyers, Marcia

    2010-01-01

    This working paper is one in a series of projects initiated by the Administration for Children and Families (ACF) to improve knowledge for child care researchers and policy makers about parental child care decision making. In this paper, the authors identify three distinct conceptual frameworks for understanding child care decisions--a rational…

  4. Using Concrete & Representational Experiences to Understand the Structure of DNA: A Four-Step Instructional Framework

    ERIC Educational Resources Information Center

    Harrell, Pamela Esprivalo; Richards, Debbie; Collins, James; Taylor, Sarah

    2005-01-01

    A description is presented of a learning experience that uses a four-step instructional framework involving concrete and representational experiences to promote conceptual understanding of abstract biological concepts through a series of closely related activities. The students are introduced to the structure and implications of DNA using four…

  5. Computers and Cultural Diversity. Restructuring for School Success. SUNY Series, Computers in Education.

    ERIC Educational Resources Information Center

    DeVillar, Robert A.; Faltis, Christian J.

    This book offers an alternative conceptual framework for effectively incorporating computer use within the heterogeneous classroom. The framework integrates Vygotskian social-learning theory with Allport's contact theory and the principles of cooperative learning. In Part 1 an essential element is identified for each of these areas. These are, in…

  6. An Evaluation Framework and Instrument for Evaluating e-Assessment Tools

    ERIC Educational Resources Information Center

    Singh, Upasana Gitanjali; de Villiers, Mary Ruth

    2017-01-01

    e-Assessment, in the form of tools and systems that deliver and administer multiple choice questions (MCQs), is used increasingly, raising the need for evaluation and validation of such systems. This research uses literature and a series of six empirical action research studies to develop an evaluation framework of categories and criteria called…

  7. Elastic and Inelastic Collisions

    ERIC Educational Resources Information Center

    Gluck, Paul

    2010-01-01

    There have been two articles in this journal that described a pair of collision carts used to demonstrate vividly the difference between elastic and inelastic collisions. One cart had a series of washers mounted rigidly on a rigid wooden framework; the other had washers mounted on rubber bands stretched across a framework. The rigidly…

  8. Short-term prediction of chaotic time series by using RBF network with regression weights.

    PubMed

    Rojas, I; Gonzalez, J; Cañas, A; Diaz, A F; Rojas, F J; Rodriguez, M

    2000-10-01

    We propose a framework for constructing and training a radial basis function (RBF) neural network. The structure of the Gaussian functions is modified using a pseudo-Gaussian function (PG) in which two scaling parameters sigma are introduced; this eliminates the symmetry restriction and gives the neurons in the hidden layer greater flexibility with respect to function approximation. We propose a modified PG-BF (pseudo-Gaussian basis function) network in which regression weights replace the constant weights in the output layer. For this purpose, a sequential learning algorithm is presented that adapts the structure of the network, making it possible to create new hidden units and to detect and remove inactive ones. A salient feature of the network is that the overall output is calculated as the weighted average of the outputs associated with each receptive field. The superior performance of the proposed PG-BF system over the standard RBF is illustrated using the problem of short-term prediction of chaotic time series.
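
    The basic architecture, without the paper's pseudo-Gaussian asymmetry, regression weights, or sequential unit creation and pruning, can be sketched as a fixed-centre Gaussian RBF network trained by least squares on a chaotic logistic-map series:

    ```python
    import numpy as np

    def rbf_design(X, centers, sigma):
        """Gaussian RBF activations for each input row (fixed centres and width)."""
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        return np.exp(-d ** 2 / (2 * sigma ** 2))

    # Chaotic logistic-map series, embedded with 3 lags for one-step prediction.
    x = np.empty(500); x[0] = 0.4
    for i in range(499):
        x[i + 1] = 4.0 * x[i] * (1.0 - x[i])
    X = np.column_stack([x[i:i - 4] for i in range(3)])   # rows: (x_t, x_t+1, x_t+2)
    y = x[3:-1]                                           # target: x_t+3
    centers = X[np.random.default_rng(6).choice(len(X), 25, replace=False)]
    Phi = rbf_design(X, centers, sigma=0.15)
    W = np.linalg.lstsq(Phi, y, rcond=None)[0]            # output-layer weights
    print("one-step RMSE:", np.sqrt(np.mean((Phi @ W - y) ** 2)))
    ```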

  9. Ship Speed Retrieval From Single Channel TerraSAR-X Data

    NASA Astrophysics Data System (ADS)

    Soccorsi, Matteo; Lehner, Susanne

    2010-04-01

    A method to estimate the speed of a moving ship is presented. The technique, introduced in Kirscht (1998), is extended to marine applications and validated on TerraSAR-X High-Resolution (HR) data. The generation of a sequence of single-look SAR images from a single-channel image corresponds to an image time series with reduced resolution. This allows change detection techniques to be applied to the time series to evaluate the ship's velocity components in range and azimuth. The evaluation of the displacement vector of a moving target in consecutive images of the sequence yields the azimuth velocity component, while the range velocity component is estimated from the variation of the signal amplitude during the sequence. To apply the technique to TerraSAR-X Spot Light (SL) data, a further processing step is needed: the phase has to be corrected as presented in Eineder et al. (2009) because of the SL acquisition mode; otherwise the image sequence cannot be generated. The analysis, validated where possible against the Automatic Identification System (AIS), was performed in the framework of the ESA project MARISS.

  10. State-Space Analysis of Granger-Geweke Causality Measures with Application to fMRI.

    PubMed

    Solo, Victor

    2016-05-01

    The recent interest in the dynamics of networks and the advent, across a range of applications, of measuring modalities that operate on different temporal scales have put the spotlight on some significant gaps in the theory of multivariate time series. Fundamental to the description of network dynamics is the direction of interaction between nodes, accompanied by a measure of the strength of such interactions. Granger causality and its associated frequency domain strength measures (GEMs) (due to Geweke) provide a framework for the formulation and analysis of these issues. In pursuing this setup, three significant unresolved issues emerge. First, computing GEMs involves computing submodels of vector time series models, for which reliable methods do not exist. Second, the impact of filtering on GEMs has never been definitively established. Third, the impact of downsampling on GEMs has never been established. In this work, using state-space methods, we resolve all these issues and illustrate the results with some simulations. Our analysis is motivated by some problems in (fMRI) brain imaging, to which we apply it, but it is of general applicability.
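
    For orientation, the following sketch shows plain time-domain Granger causality on two simulated signals using statsmodels; the paper's contribution concerns what happens to such measures under submodelling, filtering and downsampling, which this toy example does not address.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    # Two simulated signals where x drives y with a one-step lag.
    rng = np.random.default_rng(7)
    n = 2000
    x = rng.normal(size=n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()
    data = pd.DataFrame({"x": x, "y": y})

    res = VAR(data).fit(maxlags=5, ic="aic")
    print(res.test_causality("y", ["x"], kind="f").summary())   # x -> y: significant
    print(res.test_causality("x", ["y"], kind="f").summary())   # y -> x: not
    ```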

  12. A Framework to Assess the Cumulative Hydrological Impacts of Dams on flow Regime

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Wang, D.

    2016-12-01

    In this study we propose a framework to assess the cumulative impact of dams on the hydrological regime, and we use it to investigate the impacts of the Three Gorges Dam on the flow regime of the Yangtze River. We reconstructed the unregulated flow series to compare with the regulated flow series over the same period. Eco-surplus, eco-deficit and the Indicators of Hydrologic Alteration (IHA) parameters were used to examine hydrological regime change. Among the IHA parameters, a Wilcoxon signed-rank test and principal component analysis identified the representative indicators of hydrological alteration. Eco-surplus and eco-deficit showed that the reservoir also changed the seasonal flow regime in autumn and winter. Changes in annual extreme flows and October flows have negative ecological implications downstream of the Three Gorges Dam. Ecological operation of the Three Gorges Dam is necessary to mitigate the negative effects on the river ecosystem in the middle reach of the Yangtze River. The framework proposed here could be a robust method to assess the cumulative impacts of reservoir operation.
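
    A simplified reading of the eco-surplus/eco-deficit idea is sketched below: compare regulated flows against the interquartile band of the unregulated daily regime and accumulate exceedances and shortfalls. The literature definition is based on flow duration curves; this day-of-year variant is an approximation for illustration only.

    ```python
    import numpy as np
    import pandas as pd

    def eco_surplus_deficit(pre, post):
        """Eco-surplus / eco-deficit (simplified): total flow above the 75th and
        below the 25th percentile of the unregulated daily regime, scaled by the
        unregulated mean annual flow. `pre` and `post` are daily pd.Series with a
        DatetimeIndex (unregulated and regulated records)."""
        doy = pre.index.dayofyear
        q25 = pre.groupby(doy).quantile(0.25)
        q75 = pre.groupby(doy).quantile(0.75)
        d = post.index.dayofyear
        surplus = np.clip(post.values - q75.reindex(d).values, 0, None).sum()
        deficit = np.clip(q25.reindex(d).values - post.values, 0, None).sum()
        scale = pre.groupby(pre.index.year).sum().mean()
        return surplus / scale, deficit / scale

    # usage (hypothetical daily discharge series, m3/s):
    # surplus, deficit = eco_surplus_deficit(flow_unregulated, flow_regulated)
    ```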

  13. Bayesian wavelet PCA methodology for turbomachinery damage diagnosis under uncertainty

    NASA Astrophysics Data System (ADS)

    Xu, Shengli; Jiang, Xiaomo; Huang, Jinzhi; Yang, Shuhua; Wang, Xiaofang

    2016-12-01

    Centrifugal compressors often suffer various defects, such as impeller cracking, resulting in forced outages of the entire plant. Damage diagnostics and condition monitoring of such turbomachinery systems have become an increasingly important and powerful tool to prevent potential failures in components and reduce unplanned forced outages and further maintenance costs, while improving the reliability, availability and maintainability of a turbomachinery system. This paper presents a probabilistic signal processing methodology for damage diagnostics using multiple time history data collected from different locations of a turbomachine, considering data uncertainty and multivariate correlation. The proposed methodology integrates three advanced state-of-the-art data mining techniques: discrete wavelet packet transform, Bayesian hypothesis testing, and probabilistic principal component analysis. The multiresolution wavelet analysis approach is employed to decompose a time series signal into different levels of wavelet coefficients, representing multiple time-frequency resolutions of the signal. Bayesian hypothesis testing is then applied to each level of wavelet coefficients to remove possible imperfections. The posterior-odds-ratio Bayesian approach provides a direct means to assess whether there is imperfection in the decomposed coefficients, thus avoiding over-denoising. Power spectral density, estimated by the Welch method, is utilized to evaluate the effectiveness of the Bayesian wavelet cleansing. Furthermore, the probabilistic principal component analysis approach is developed to reduce the dimensionality of multiple time series and to address multivariate correlation and data uncertainty for damage diagnostics. The proposed methodology and generalized framework are demonstrated on a set of sensor data collected from a real-world centrifugal compressor with impeller cracks, through both time series and contour analyses of vibration signals and principal components.
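
    A deterministic skeleton of the pipeline (wavelet packet decomposition followed by PCA, without the Bayesian coefficient cleansing or the probabilistic PCA) can be sketched with pywt and scikit-learn; the vibration data below are simulated.

    ```python
    import numpy as np
    import pywt
    from sklearn.decomposition import PCA

    def wavelet_packet_features(signals, wavelet="db4", level=3):
        """Terminal-node energies of a wavelet packet decomposition, one
        feature vector per vibration record."""
        feats = []
        for x in signals:
            wp = pywt.WaveletPacket(data=x, wavelet=wavelet, maxlevel=level)
            nodes = wp.get_level(level, order="freq")
            feats.append([np.sum(n.data ** 2) for n in nodes])   # band energies
        return np.array(feats)

    rng = np.random.default_rng(8)
    healthy = rng.normal(size=(20, 1024))
    cracked = healthy + 0.8 * np.sin(2 * np.pi * 0.31 * np.arange(1024))  # extra tone
    F = wavelet_packet_features(np.vstack([healthy, cracked]))
    scores = PCA(n_components=2).fit_transform(np.log(F))
    print(scores[:20].mean(axis=0), scores[20:].mean(axis=0))   # classes separate
    ```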

  14. Trend Extraction in Functional Data of Amplitudes of R and T Waves in Exercise Electrocardiogram

    NASA Astrophysics Data System (ADS)

    Cammarota, Camillo; Curione, Mario

    The amplitudes of the R and T waves of the electrocardiogram (ECG) recorded during the exercise test show large inter- and intra-individual variability in response to stress. We analyze a dataset of 65 normal subjects undergoing an ambulatory test. We model the R and T series in the framework of functional data, assuming that the individual series are realizations of a non-stationary process centered at the population trend. We test the time variability of this trend by computing a simultaneous confidence band and the zero crossings of its derivative. The analysis shows that the amplitudes of the R and T waves have opposite responses to stress, consisting respectively of a bump and a dip at the early recovery stage. Our findings support the existence of a relationship between the R and T wave amplitudes and, respectively, the diastolic and systolic ventricular volumes.

  15. Spatio-temporal prediction of daily temperatures using time-series of MODIS LST images

    NASA Astrophysics Data System (ADS)

    Hengl, Tomislav; Heuvelink, Gerard B. M.; Perčec Tadić, Melita; Pebesma, Edzer J.

    2012-01-01

    A computational framework for generating daily temperature maps using time series of the publicly available MODIS MOD11A2 Land Surface Temperature (LST) product (1 km resolution; 8-day composites) is illustrated using temperature measurements from the national network of 159 meteorological stations in Croatia. The input data set contains 57,282 ground measurements of daily temperature for the year 2008. Temperature was modeled as a function of latitude, longitude, distance from the sea, elevation, time, insolation, and the MODIS LST images. The original rasters were first converted to principal components to reduce noise and filter missing pixels in the LST images. The residuals were then analyzed for spatio-temporal auto-correlation; sum-metric separable variograms were fitted to account for zonal and geometric space-time anisotropy. The final predictions were generated for time slices of a 3D space-time cube, constructed in the R environment for statistical computing. The results show that the space-time regression model can explain a significant part of the variation in the station data (84%). The MODIS LST 8-day (cloud-free) images are an unbiased estimator of daily temperature, but with relatively low precision (±4.1°C); their added value is that they systematically improve the detection of local changes in land surface temperature due to local meteorological conditions and/or active heat sources (urban areas, land cover classes). The results of 10-fold cross-validation show that the use of spatio-temporal regression-kriging and the incorporation of time series of remote sensing images lead to significantly more accurate maps of temperature than plain spatial techniques. The average (global) accuracy of the temperature maps was ±2.4°C; regression-kriging explained 91% of the variability in daily temperatures, compared with 44% for ordinary kriging. Further software advancements are anticipated: interactive space-time variogram exploration, and automated retrieval, resampling and filtering of MODIS images.
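
    A two-covariate miniature of regression-kriging illustrates the structure of such predictions: a linear drift on covariates plus a Gaussian-process (kriging) model of the spatial residuals. It is purely spatial and entirely synthetic; the paper's space-time variograms and MODIS preprocessing are not reproduced.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Fake stations: coordinates, elevation, an LST-like covariate, temperatures.
    rng = np.random.default_rng(9)
    xy = rng.uniform(0, 100, (150, 2))                   # station coordinates, km
    elev = rng.uniform(0, 1500, 150)
    lst = 30 - 0.0065 * elev + rng.normal(0, 2, 150)     # fake MODIS LST covariate
    temp = 0.8 * lst - 0.001 * elev + np.sin(xy[:, 0] / 15) + rng.normal(0, 0.3, 150)

    X = np.column_stack([lst, elev])
    reg = LinearRegression().fit(X, temp)                # drift (regression part)
    resid = temp - reg.predict(X)
    gp = GaussianProcessRegressor(RBF(10.0) + WhiteKernel(0.1)).fit(xy, resid)

    pt_xy, pt_X = np.array([[50.0, 50.0]]), np.array([[22.0, 800.0]])
    print(reg.predict(pt_X) + gp.predict(pt_xy))         # drift + kriged residual
    ```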

  16. Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition

    PubMed Central

    Ordóñez, Francisco Javier; Roggen, Daniel

    2016-01-01

    Human activity recognition (HAR) tasks have traditionally been solved using engineered features obtained by heuristic processes. Current research suggests that deep convolutional neural networks are suited to automate feature extraction from raw sensor inputs. However, human activities are made of complex sequences of motor movements, and capturing this temporal dynamics is fundamental for successful HAR. Based on the recent success of recurrent neural networks for time series domains, we propose a generic deep framework for activity recognition based on convolutional and LSTM recurrent units, which: (i) is suitable for multimodal wearable sensors; (ii) can perform sensor fusion naturally; (iii) does not require expert knowledge in designing features; and (iv) explicitly models the temporal dynamics of feature activations. We evaluate our framework on two datasets, one of which has been used in a public activity recognition challenge. Our results show that our framework outperforms competing deep non-recurrent networks on the challenge dataset by 4% on average, and outperforms some previously reported results by up to 9%. The framework can be applied to homogeneous sensor modalities, but can also fuse multimodal sensors to improve performance. We characterise the influence of key architectural hyperparameters on performance to provide insights into their optimisation. PMID:26797612
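
    The generic convolutional-plus-LSTM pattern the paper describes can be sketched compactly in PyTorch; the layer sizes below are illustrative rather than the published DeepConvLSTM configuration.

    ```python
    import torch
    import torch.nn as nn

    class DeepConvLSTMLike(nn.Module):
        """Conv + LSTM recogniser in the spirit of the paper's framework."""
        def __init__(self, n_channels=9, n_classes=6):
            super().__init__()
            self.conv = nn.Sequential(                  # per-timestep feature extraction
                nn.Conv1d(n_channels, 64, kernel_size=5, padding=2), nn.ReLU(),
                nn.Conv1d(64, 64, kernel_size=5, padding=2), nn.ReLU(),
            )
            self.lstm = nn.LSTM(64, 128, num_layers=2, batch_first=True)
            self.fc = nn.Linear(128, n_classes)

        def forward(self, x):                           # x: (batch, channels, time)
            h = self.conv(x).permute(0, 2, 1)           # -> (batch, time, features)
            out, _ = self.lstm(h)
            return self.fc(out[:, -1])                  # classify from last time step

    net = DeepConvLSTMLike()
    window = torch.randn(8, 9, 128)                     # 8 windows of 9-sensor data
    print(net(window).shape)                            # torch.Size([8, 6])
    ```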

  17. Arbitrary norm support vector machines.

    PubMed

    Huang, Kaizhu; Zheng, Danian; King, Irwin; Lyu, Michael R

    2009-02-01

    Support vector machines (SVMs) are state-of-the-art classifiers. Typically the L2-norm or the L1-norm is adopted as the regularization term in an SVM, while other norm-based SVMs, for example the L0-norm SVM or even the L(infinity)-norm SVM, are rarely seen in the literature. The major reason is that the L0-norm describes a discontinuous and nonconvex term, leading to a combinatorially NP-hard optimization problem. In this letter, motivated by Bayesian learning, we propose a novel framework that can implement arbitrary norm-based SVMs in polynomial time. One significant feature of this framework is that only a sequence of sequential minimal optimization problems needs to be solved, making it practical in many real applications. The proposed framework is important in the sense that Bayesian priors can be efficiently plugged into most learning methods without knowing their explicit form; this builds a connection between Bayesian learning and kernel machines. We derive the theoretical framework, demonstrate how our approach works on the L0-norm SVM as a typical example, and perform a series of experiments to validate its advantages. Experimental results on nine benchmark data sets are very encouraging. The implemented L0-norm SVM is competitive with or even better than the standard L2-norm SVM in terms of accuracy, while reducing the number of support vectors by 9.46% on average. When compared with another sparse model, the relevance vector machine, our proposed algorithm also demonstrates better sparsity properties, with a training speed over seven times faster.
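
    The paper's Bayesian sequential-minimal-optimization framework is not reproduced here, but the accuracy-versus-sparsity trade-off between norms that motivates it can be illustrated with scikit-learn's linear SVM, which supports L1 and L2 penalties directly (a plainly swapped-in technique, on synthetic data):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=300, n_features=50, n_informative=5,
                           random_state=0)

for penalty in ("l2", "l1"):
    # L1 regularization drives many weights to exactly zero (a sparser model),
    # which is the behaviour the L0-norm pushes to its extreme.
    clf = LinearSVC(penalty=penalty, dual=False, C=1.0, max_iter=10000)
    clf.fit(X, y)
    nnz = np.sum(np.abs(clf.coef_) > 1e-6)
    print(f"{penalty}-norm SVM: accuracy={clf.score(X, y):.3f}, "
          f"nonzero weights={nnz}/{X.shape[1]}")
```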

  18. Stochasticity, succession, and environmental perturbations in a fluidic ecosystem

    PubMed Central

    Zhou, Jizhong; Deng, Ye; Zhang, Ping; Xue, Kai; Liang, Yuting; Van Nostrand, Joy D.; Yang, Yunfeng; He, Zhili; Wu, Liyou; Stahl, David A.; Hazen, Terry C.; Tiedje, James M.; Arkin, Adam P.

    2014-01-01

    Unraveling the drivers of community structure and succession in response to environmental change is a central goal in ecology. Although the mechanisms shaping community structure have been intensively examined, those controlling ecological succession remain elusive. To understand the relative importance of stochastic and deterministic processes in mediating microbial community succession, a unique framework composed of four different cases was developed for fluidic and nonfluidic ecosystems. The framework was then tested for one fluidic ecosystem: a groundwater system perturbed by adding emulsified vegetable oil (EVO) for uranium immobilization. Our results revealed that the groundwater microbial community diverged substantially from the initial community after EVO amendment and eventually converged to a new community state, which clustered closely with the initial state although the two differed significantly in composition and structure. Null model analysis indicated that both deterministic and stochastic processes played important roles in controlling the assembly and succession of the groundwater microbial community, but their relative importance was time dependent. Additionally, consistent with the proposed conceptual framework but contrary to conventional wisdom, the community succession responding to EVO amendment was controlled primarily by stochastic rather than deterministic processes. During the middle phase of the succession, the role of stochastic processes in controlling community composition increased substantially, ranging from 81.3% to 92.0%. Finally, few successional studies are available to support the different cases in the conceptual framework, so further well-replicated, explicit time-series experiments are needed to understand the relative importance of deterministic and stochastic processes in controlling community succession. PMID:24550501

  19. Time-Resolved Transposon Insertion Sequencing Reveals Genome-Wide Fitness Dynamics during Infection.

    PubMed

    Yang, Guanhua; Billings, Gabriel; Hubbard, Troy P; Park, Joseph S; Yin Leung, Ka; Liu, Qin; Davis, Brigid M; Zhang, Yuanxing; Wang, Qiyao; Waldor, Matthew K

    2017-10-03

    Transposon insertion sequencing (TIS) is a powerful high-throughput genetic technique that is transforming functional genomics in prokaryotes, because it enables genome-wide mapping of the determinants of fitness. However, current approaches for analyzing TIS data assume that selective pressures are constant over time and thus do not yield information regarding changes in the genetic requirements for growth in dynamic environments (e.g., during infection). Here, we describe structured analysis of TIS data collected as a time series, termed pattern analysis of conditional essentiality (PACE). From a temporal series of TIS data, PACE derives a quantitative assessment of each mutant's fitness over the course of an experiment and identifies mutants with related fitness profiles. In so doing, PACE circumvents major limitations of existing methodologies, specifically the need for artificial effect size thresholds and enumeration of bacterial population expansion. We used PACE to analyze TIS samples of Edwardsiella piscicida (a fish pathogen) collected over a 2-week infection period from a natural host (the flatfish turbot). PACE uncovered more genes that affect E. piscicida's fitness in vivo than were detected using a cutoff at a terminal sampling point, and it identified subpopulations of mutants with distinct fitness profiles, one of which informed the design of new live vaccine candidates. Overall, PACE enables efficient mining of time series TIS data and enhances the power and sensitivity of TIS-based analyses. IMPORTANCE Transposon insertion sequencing (TIS) enables genome-wide mapping of the genetic determinants of fitness, typically based on observations at a single sampling point. Here, we move beyond analysis of endpoint TIS data to create a framework for analysis of time series TIS data, termed pattern analysis of conditional essentiality (PACE). We applied PACE to identify genes that contribute to colonization of a natural host by the fish pathogen Edwardsiella piscicida. PACE uncovered more genes that affect E. piscicida's fitness in vivo than were detected using a terminal sampling point, and its clustering of mutants with related fitness profiles informed the design of new live vaccine candidates. PACE yields insights into patterns of fitness dynamics and circumvents major limitations of existing methodologies. Finally, the PACE method should be applicable to additional "omic" time series data, including screens based on clustered regularly interspaced short palindromic repeats with Cas9 (CRISPR/Cas9). Copyright © 2017 Yang et al.
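
    A hedged sketch of the core idea, not the published PACE implementation: turn a temporal series of insertion counts into per-mutant fitness profiles and group mutants with related profiles. The counts, the normalisation, and the clustering choice below are illustrative assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical TIS counts: rows = insertion mutants, columns = time points.
rng = np.random.default_rng(1)
counts = rng.poisson(100, size=(200, 5)).astype(float)

# Normalise each time point to library size, then express fitness as the
# log2 change of each mutant's relative abundance versus the input (t0).
freqs = counts / counts.sum(axis=0)
profiles = np.log2((freqs[:, 1:] + 1e-9) / (freqs[:, [0]] + 1e-9))

# Group mutants with related fitness trajectories (cf. PACE's profile clusters).
Z = linkage(profiles, method="ward")
labels = fcluster(Z, t=4, criterion="maxclust")
print("mutants per cluster:", np.bincount(labels)[1:])
```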

  20. Testing and validating environmental models

    USGS Publications Warehouse

    Kirchner, J.W.; Hooper, R.P.; Kendall, C.; Neal, C.; Leavesley, G.

    1996-01-01

    Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series against data time series, and plotting predicted versus observed values) have little diagnostic power. We propose that it may be more useful to statistically extract the relationships of primary interest from the time series, and test the model directly against them.

  1. Assessing the statistical robustness of inter- and intra-basinal carbon isotope chemostratigraphic correlation

    NASA Astrophysics Data System (ADS)

    Hay, C.; Creveling, J. R.; Huybers, P. J.

    2016-12-01

    Excursions in the stable carbon isotopic composition of carbonate rocks (δ13Ccarb) can facilitate correlation of Precambrian and Phanerozoic sedimentary successions at a higher temporal resolution than radiometric and biostratigraphic frameworks typically afford. Within the bounds of litho- and biostratigraphic constraints, stratigraphers often correlate isotopic patterns between distant stratigraphic sections through visual alignment of local maxima and minima of isotopic values. The reproducibility of this method can prove challenging; thus, evaluating the statistical robustness of intrabasinal composite carbon isotope curves, and of global correlations to these reference curves, remains difficult. To assess the reproducibility of stratigraphic alignments of δ13Ccarb data, and of correlations between carbon isotope excursions, we employ a numerical dynamic time warping methodology that stretches and squeezes the time axis of a record to obtain an optimal correlation (in a least-squares sense) between time-uncertain series of data. In particular, we assess various alignments between series of Early Cambrian δ13Ccarb data with respect to plausible matches. We first show that an alignment of these records obtained visually, and published previously, is broadly reproducible using dynamic time warping. Alternative alignments with similar goodness of fit are also obtainable, and their stratigraphic plausibility is discussed. This approach should be generalizable to an algorithm for developing a library of plausible alignments between multiple time-uncertain stratigraphic records.
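
    Dynamic time warping itself is compact enough to sketch. The toy series below stand in for time-uncertain δ13Ccarb records; backtracking through the accumulated-cost matrix (not shown) recovers the stretch-and-squeeze alignment path.

```python
import numpy as np

def dtw_cost(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping with squared-error cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            # stretch one record, squeeze the other, or advance both
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[1:, 1:]  # accumulated-cost matrix; D[-1, -1] is the alignment cost

# Two hypothetical isotope series sampled at different stratigraphic resolutions.
s1 = np.sin(np.linspace(0, 3, 60)) * 4 - 1
s2 = np.sin(np.linspace(0.2, 3.1, 45)) * 4 - 1.2
acc = dtw_cost(s1, s2)
print("optimal alignment cost:", acc[-1, -1])
```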

  2. Interpreting beta-diversity components over time to conserve metacommunities in highly dynamic ecosystems.

    PubMed

    Ruhí, Albert; Datry, Thibault; Sabo, John L

    2017-12-01

    The concept of metacommunity (i.e., a set of local communities linked by dispersal) has gained great popularity among community ecologists. However, metacommunity research mostly addresses questions on spatial patterns of biodiversity at the regional scale, whereas conservation planning requires quantifying temporal variation in those metacommunities and the contributions that individual (local) sites make to regional dynamics. We propose that recent advances in diversity-partitioning methods may allow for a better understanding of metacommunity dynamics and the identification of keystone sites. We used time series of the 2 components of beta diversity (richness and replacement) and the contributions of local sites to these components to examine which sites controlled source-sink dynamics in a highly dynamic model system (an intermittent river). The relative importance of the richness and replacement components of beta diversity fluctuated over time, and sample aggregation led to underestimation of beta diversity by up to 35%. Our literature review revealed that research on intermittent rivers would benefit greatly from examination of beta-diversity components over time. Adequately appraising spatiotemporal variability in community composition and identifying sites that are pivotal for maintaining biodiversity at the landscape scale are key needs for conservation prioritization and planning. Thus, our framework may be used to guide conservation actions in highly dynamic ecosystems when time-series data describing biodiversity across sites connected by dispersal are available. © 2017 Society for Conservation Biology.
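
    One common way to split pairwise beta diversity into replacement and richness-difference components is the Podani/Carvalho partition of Jaccard dissimilarity, sketched below. Whether this matches the exact partition used in the paper is an assumption, and the species sets are invented.

```python
def beta_components(site1, site2):
    """Partition pairwise Jaccard dissimilarity into replacement and
    richness-difference components (Podani/Carvalho-style partition)."""
    s1, s2 = set(site1), set(site2)
    a = len(s1 & s2)                 # shared species
    b = len(s1 - s2)                 # unique to site 1
    c = len(s2 - s1)                 # unique to site 2
    n = a + b + c
    replacement = 2 * min(b, c) / n  # species substitution between sites
    richness = abs(b - c) / n        # richness difference component
    return replacement, richness     # their sum = total Jaccard dissimilarity

# Hypothetical invertebrate communities from two reaches of an intermittent river.
repl, rich = beta_components({"mayfly", "caddisfly", "beetle"},
                             {"mayfly", "midge", "snail", "beetle"})
print(f"replacement={repl:.2f}, richness difference={rich:.2f}")
```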

  3. The migratory impact of minimum wage legislation: Puerto Rico, 1970-1987.

    PubMed

    Santiago, C E

    1993-01-01

    "This study examines the impact of minimum wage setting on labor migration. A multiple time series framework is applied to monthly data for Puerto Rico from 1970-1987. The results show that net emigration from Puerto Rico to the United States fell in response to significant changes in the manner in which minimum wage policy was conducted, particularly after 1974. The extent of commuter type labor migration between Puerto Rico and the United States is influenced by minimum wage policy, with potentially important consequences for human capital investment and long-term standards of living." excerpt

  4. Unsupervised Framework to Monitor Lake Dynamics

    NASA Technical Reports Server (NTRS)

    Chen, Xi C. (Inventor); Boriah, Shyam (Inventor); Khandelwal, Ankush (Inventor); Kumar, Vipin (Inventor)

    2016-01-01

    A method of reducing processing time when assigning geographic areas to land cover labels using satellite sensor values includes a processor receiving a feature value for each pixel in a time series of frames of satellite sensor values, each frame containing multiple pixels and each frame covering a same geographic location. For each sub-area of the geographic location, the sub-area is assigned to one of at least three land cover labels. The processor determines a fraction function for a first sub-area assigned to a first land cover label. The sub-areas that were assigned to the first land cover label are reassigned to one of the second land cover label and the third land cover label based on the fraction functions of the sub-areas.

  5. Optimal trading strategies—a time series approach

    NASA Astrophysics Data System (ADS)

    Bebbington, Peter A.; Kühn, Reimer

    2016-05-01

    Motivated by recent advances in the spectral theory of auto-covariance matrices, we are led to revisit a reformulation of Markowitz’ mean-variance portfolio optimization approach in the time domain. In its simplest incarnation it applies to a single traded asset and allows an optimal trading strategy to be found which—for a given return—is minimally exposed to market price fluctuations. The model is initially investigated for a range of synthetic price processes, taken to be either second order stationary, or to exhibit second order stationary increments. Attention is paid to consequences of estimating auto-covariance matrices from small finite samples, and auto-covariance matrix cleaning strategies to mitigate against these are investigated. Finally we apply our framework to real world data.
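
    The closed-form core of such a time-domain mean-variance problem can be sketched directly: estimate an auto-covariance matrix from the price increments, then solve for the minimum-variance holding schedule under a return constraint. The expected-increment vector and return target below are illustrative assumptions, and the small-sample cleaning step discussed in the paper is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical price increments of a single asset (second-order stationary).
increments = rng.normal(0, 1, size=2000)

# Estimate the auto-covariance matrix over a trading horizon of N steps.
N = 20
acov = np.empty(N)
acov[0] = np.var(increments)
for lag in range(1, N):
    acov[lag] = np.cov(increments[:-lag], increments[lag:])[0, 1]
C = np.array([[acov[abs(i - j)] for j in range(N)] for i in range(N)])  # Toeplitz

# Minimise w' C w subject to mu' w = target; the Lagrangian solution is
# w = (target / mu' C^{-1} mu) * C^{-1} mu.
mu = np.full(N, 0.05)                 # assumed expected increment per step
Cinv_mu = np.linalg.solve(C, mu)
w = (0.5 / (mu @ Cinv_mu)) * Cinv_mu  # illustrative return target of 0.5
print("holding weights:", np.round(w, 3))
```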

  6. Use of the challenge point framework to guide motor learning of stepping reactions for improved balance control in people with stroke: a case series.

    PubMed

    Pollock, Courtney L; Boyd, Lara A; Hunt, Michael A; Garland, S Jayne

    2014-04-01

    Stepping reactions are important for walking balance and community-level mobility. Stepping reactions of people with stroke are characterized by slow reaction times, poor coordination of motor responses, and low amplitude of movements, which may contribute to their decreased ability to recover their balance when challenged. An important aspect of rehabilitation of mobility after stroke is optimizing the motor learning associated with retraining effective stepping reactions. The Challenge Point Framework (CPF) is a model that can be used to promote motor learning through manipulation of conditions of practice to modify task difficulty, that is, the interaction of the skill of the learner and the difficulty of the task to be learned. This case series illustrates how the retraining of multidirectional stepping reactions may be informed by the CPF to improve balance function in people with stroke. Four people (53-68 years of age) with chronic stroke (>1 year) and mild to moderate motor recovery received 4 weeks of multidirectional stepping reaction retraining. Important tenets of motor learning were optimized for each person during retraining in accordance with the CPF. Participants demonstrated improved community-level walking balance, as determined with the Community Balance and Mobility Scale. These improvements were evident 1 year later. Aspects of balance-related self-efficacy and movement kinematics also showed improvements during the course of the intervention. The application of CPF motor learning principles in the retraining of stepping reactions to improve community-level walking balance in people with chronic stroke appears to be promising. The CPF provides a plausible theoretical framework for the progression of functional task training in neurorehabilitation.

  7. Participatory Video: Toward a Method, Advocacy and Voice (MAV) Framework

    ERIC Educational Resources Information Center

    Sitter, Kathleen C.

    2012-01-01

    Using the new conceptual framework of participatory visual media as method, advocacy and voice (MAV), the author explores an action research study using an exemplar in which advocates from the disability community created and distributed a series of videos about love and sexuality as a critical human rights issue in the disability community. The…

  8. Working on the Work: An Action Plan for Teachers, Principals, and Superintendents. The Jossey-Bass Education Series.

    ERIC Educational Resources Information Center

    Schlechty, Phillip C.

    This book, which is a companion to the books, "Shaking up the Schoolhouse" and "Inventing Better Schools," presents the Working on the Work (WOW) framework for improving student performance by improving the quality of schoolwork. Field-tested in schools nationwide, the framework describes the 12 essential components of a WOW…

  9. The Reading Turn-Around: A Five Part Framework for Differentiated Instruction. Practitioners Bookshelf, Language & Literacy Series

    ERIC Educational Resources Information Center

    Jones, Stephanie; Clarke, Lane; Enriquez, Grace

    2009-01-01

    This book demonstrates a five-part framework for teachers, reading specialists, and literacy coaches who want to help their least engaged students become powerful readers. Merging theory and practice, the guide offers successful strategies to reach your "struggling" learners. The authors show how you can "turn-around" your instructional practice,…

  10. Overseeing One-Stops: A WIB Member's Guide to Quality Assurance. Workforce Board Development Series.

    ERIC Educational Resources Information Center

    2000

    This guide is intended to give workforce investment boards (WIBs) a practical framework for overseeing the performance of one-stop career centers. The introduction presents background information on a proposed quality assurance framework and the guide's format and development. The first two sections explain the importance of quality assurance to…

  11. Adult Education Participation Decisions and Barriers: Review of Conceptual Frameworks and Empirical Studies. Working Paper Series.

    ERIC Educational Resources Information Center

    Silva, Tim; Cahalan, Margaret; Lacireno-Paquet, Natalie

    In preparation for the next National Household Education Survey (NHES), the conceptual frameworks of participatory behavior and methods used by other researchers to study factors promoting or inhibiting participation were examined. The following items were reviewed: the adult education (AE) barriers questions included on the 1991 and 1995 editions…

  12. Employability and Higher Education: The Follies of the "Productivity Challenge" in the Teaching Excellence Framework

    ERIC Educational Resources Information Center

    Frankham, Jo

    2017-01-01

    This article considers questions of "employability," a notion foregrounded in the Green and White Papers on the Teaching Excellence Framework (TEF). The paper first questions government imperatives concerning employability and suggests a series of mismatches that are evident in the rhetorics in this area. This summary opens up elements…

  13. Gender Equality and Girls' Education: Investigating Frameworks, Disjunctures and Meanings of Quality Education

    ERIC Educational Resources Information Center

    Aikman, Sheila; Rao, Nitya

    2012-01-01

    The article draws on qualitative educational research across a diversity of low-income countries to examine the gendered inequalities in education as complex, multi-faceted and situated rather than a series of barriers to be overcome through linear input-output processes focused on isolated dimensions of quality. It argues that frameworks for…

  14. A three-tiered approach for linking pharmacokinetic considerations to the adverse outcome pathway framework for chemical-specific risk assessment

    EPA Science Inventory

    The power of the adverse outcome pathway (AOP) framework arises from its utilization of pathway-based data to describe the initial interaction of a chemical with a molecular target (molecular initiating event; (MIE), followed by a progression through a series of key events that l...

  15. Searching for the Signature of Wastewater Injection in continuous GPS Data from The Geysers Geothermal Field

    NASA Astrophysics Data System (ADS)

    Terry, R. L.; Funning, G.; Floyd, M.

    2017-12-01

    The Geysers geothermal field in California, which provides a large portion of northern California's power, has seen declining steam pressures over the past three decades, accompanied by surface subsidence. Together, these two phenomena are likely the result of the exploitation of the reservoir without adequate time for natural restoration. To combat the decline in steam pressures, The Geysers began injecting imported wastewater into the geothermal reservoir in 1997 and expanded injection in 2003. In 2012 and 2013, we installed three continuously recording GPS stations in The Geysers to closely monitor crustal deformation due to both the extraction of steam and the injection of wastewater. To assess the impact of the current injection and extraction activities on the geothermal reservoir, we analyze the position time-series from these GPS stations alongside wastewater injection and steam extraction data. We use common-mode filtering to remove any regionally-correlated noise from our GPS time series, and also estimate and subtract any seasonal signals present. To predict the effect of injection and production on surface movement, we summed the monthly time series of well data within a rectangular grid framework. We then use an array of Mogi sources based on each grid cell's total volume change to calculate the expected surface deformation due to these volume changes at depth. The temporal resolution provided by GPS allows us to characterize more accurately the properties of the subsurface geothermal reservoir related to forcing. For example, based on a similar spatiotemporal relationship between injection and seismicity, we hypothesize that there may be a delayed deformation response following injection, related to the permeability of the reservoir, and are undertaking detailed comparisons of our time series to identify this response. Overall changes in the sense and rate of vertical motion in the field due to injection over time are also expected. We anticipate that the impact of discovering a relationship between injection and surface deformation will be of great importance in maintaining and managing geothermal resources in the future.
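
    The forward model for the array-of-Mogi-sources step can be sketched compactly. The grid, cell volume changes, depths, and Poisson ratio below are invented for illustration; the formula is the standard point-source (Mogi) vertical surface displacement in an elastic half-space.

```python
import numpy as np

def mogi_uz(x, y, x0, y0, depth, dV, nu=0.25):
    """Vertical surface displacement of a Mogi point source with volume
    change dV at (x0, y0, depth), Poisson ratio nu; coordinates in metres."""
    r2 = (x - x0) ** 2 + (y - y0) ** 2
    return (1 - nu) / np.pi * dV * depth / (r2 + depth ** 2) ** 1.5

# Hypothetical grid cells: sum the contribution of each cell's monthly volume
# change, as in the array-of-Mogi-sources approach described above.
gx, gy = np.meshgrid(np.linspace(-5e3, 5e3, 101), np.linspace(-5e3, 5e3, 101))
cells = [(-1e3, 0.0, 2.5e3, 1e5),    # (x0, y0, depth, dV): net injection
         (1.5e3, 5e2, 2.5e3, -4e4)]  # net extraction
uz = sum(mogi_uz(gx, gy, *c) for c in cells)
print("max uplift (m):", uz.max(), " max subsidence (m):", uz.min())
```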

  16. Detecting Forest Disturbance Events from MODIS and Landsat Time Series for the Conterminous United States

    NASA Astrophysics Data System (ADS)

    Zhang, G.; Ganguly, S.; Saatchi, S. S.; Hagen, S. C.; Harris, N.; Yu, Y.; Nemani, R. R.

    2013-12-01

    Spatial and temporal patterns of forest disturbance and regrowth processes are key for understanding aboveground terrestrial vegetation biomass and carbon stocks at regional-to-continental scales. The NASA Carbon Monitoring System (CMS) program seeks key input datasets, especially information related to the impacts of natural and man-made disturbances in forested landscapes of the conterminous U.S. (CONUS), that would reduce uncertainties in current carbon stock estimation and emission models. This study provides an end-to-end forest disturbance detection framework based on pixel time series analysis from MODIS (Moderate Resolution Imaging Spectroradiometer) and Landsat surface spectral reflectance data. We applied the BFAST (Breaks for Additive Seasonal and Trend) algorithm to Normalized Difference Vegetation Index (NDVI) data for the period from 2000 to 2011. A harmonic seasonal model was implemented in BFAST to decompose the time series into seasonal and interannual trend components in order to detect abrupt changes in the magnitude and direction of these components. To apply BFAST to the whole of CONUS, we built a parallel computing setup for processing the massive time-series data using the high performance computing facility of the NASA Earth Exchange (NEX). In the implementation, we extracted the dominant deforestation events from the magnitude of abrupt changes in both the seasonal and interannual components, and estimated dates for the corresponding deforestation events. We estimated the recovery rate for deforested regions through regression models developed between NDVI values and time since disturbance for all pixels. A similar implementation of the BFAST algorithm was performed over selected Landsat scenes (all cloud-free Landsat data were used to generate NDVI from atmospherically corrected spectral reflectances) to demonstrate the spatial coherence of retrieval layers between MODIS and Landsat. In the future, this largely parallel disturbance detection setup will facilitate large-scale processing and wall-to-wall mapping of forest disturbance and regrowth from Landsat data for the whole of CONUS. This exercise will aid in improving the present capabilities of the NASA CMS effort by reducing uncertainties in national-level estimates of biomass and carbon stocks.
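
    A much-simplified stand-in for the BFAST decomposition is sketched below: fit an intercept, linear trend, single step change, and one annual harmonic by least squares, and pick the break date that minimises the residual sum of squares. The real BFAST handles multiple breaks in both components; the synthetic NDVI series and sampling rate here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(2000, 2012, 1 / 23)                     # ~16-day NDVI sampling
ndvi = 0.6 + 0.15 * np.sin(2 * np.pi * t) + rng.normal(0, 0.02, t.size)
ndvi[t > 2006.5] -= 0.25                              # abrupt deforestation event

def rss_with_break(tb):
    """Residual sum of squares of a trend + step + annual-harmonic fit."""
    step = (t > tb).astype(float)
    X = np.column_stack([np.ones_like(t), t - t[0], step,
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    beta = np.linalg.lstsq(X, ndvi, rcond=None)[0]
    return ((ndvi - X @ beta) ** 2).sum()

candidates = t[23:-23]                                # keep a margin at both ends
best = candidates[np.argmin([rss_with_break(tb) for tb in candidates])]
print("estimated break date:", round(best, 2))
```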

  17. Entity-Centric Abstraction and Modeling Framework for Transportation Architectures

    NASA Technical Reports Server (NTRS)

    Lewe, Jung-Ho; DeLaurentis, Daniel A.; Mavris, Dimitri N.; Schrage, Daniel P.

    2007-01-01

    A comprehensive framework for representing transportation architectures is presented. After discussing a series of preceding perspectives and formulations, the intellectual underpinning of the novel framework, an entity-centric abstraction of transportation, is described. The entities include endogenous and exogenous factors, and functional expressions are offered that relate these and their evolution. The end result is a Transportation Architecture Field which permits analysis of future concepts under a holistic perspective. A simulation model which stems from the framework is presented and exercised, producing results which quantify improvements in air transportation due to advanced aircraft technologies. Finally, a modeling hypothesis and its accompanying criteria are proposed to test further use of the framework for evaluating new transportation solutions.

  18. Hawaii: 2002 Economic Census. 2002 Educational Services, Geographic Area Series. EC02-61A-HI.

    ERIC Educational Resources Information Center

    US Department of Commerce, 2005

    2005-01-01

    The economic census furnishes an important part of the framework for such composite measures as the gross domestic product estimates, input/output measures, production and price indexes, and other statistical series that measure short-term changes in economic conditions. Specific uses of economic census data include the following: Policymaking…

  19. Montana: 2002. 2002 Economic Census. Educational Services, Geographic Area Series. EC02-61A-MT

    ERIC Educational Resources Information Center

    US Department of Commerce, 2005

    2005-01-01

    The economic census furnishes an important part of the framework for such composite measures as the gross domestic product estimates, input/output measures, production and price indexes, and other statistical series that measure short-term changes in economic conditions. Specific uses of economic census data include the following: Policymaking…

  20. Utilization and Organization of Visually Presented Information. Final Report.

    ERIC Educational Resources Information Center

    Dick, A. O.

    The experiments discussed in this report do not have a direct relationship to each other but represent work on a series of sub-issues within the general framework of visual processing of information. Because of this discreteness, the report is organized into a series of papers. The first is a general review of tachistoscopic work on iconic memory…

  1. Teaching Mathematical Modelling: Demonstrating Enrichment and Elaboration

    ERIC Educational Resources Information Center

    Warwick, Jon

    2015-01-01

    This paper uses a series of models to illustrate one of the fundamental processes of model building--that of enrichment and elaboration. The paper describes how a problem context is given which allows a series of models to be developed from a simple initial model using a queuing theory framework. The process encourages students to think about the…

  2. Analyzing Crime and Crime Control: A Resource Guide. Economics-Political Science Series.

    ERIC Educational Resources Information Center

    Butterfield, Ruth I.; And Others

    This document, the fourth in a series of resource guides emphasizing economic-political analysis of contemporary public policies and issues, focuses on crime control. Designed as a three-week unit for secondary school students, the guide is presented in three sections. The introduction presents an economic and a political science framework for…

  3. Data Management System for the National Energy-Water System (NEWS) Assessment Framework

    NASA Astrophysics Data System (ADS)

    Corsi, F.; Prousevitch, A.; Glidden, S.; Piasecki, M.; Celicourt, P.; Miara, A.; Fekete, B. M.; Vorosmarty, C. J.; Macknick, J.; Cohen, S. M.

    2015-12-01

    Aiming at providing a comprehensive assessment of the water-energy nexus, the National Energy-Water System (NEWS) project requires the integration of data to support a modeling framework that links climate, hydrological, power production, transmission, and economic models. Large amounts of georeferenced data have to be streamed to the components of the inter-disciplinary model to explore future challenges and tradeoffs in US power production, based on climate scenarios, power plant locations and technologies, available water resources, ecosystem sustainability, and economic demand. We used open source and in-house built software components to build a system that addresses two major data challenges: on-the-fly re-projection, re-gridding, interpolation, extrapolation, nodata patching, merging, and temporal and spatial aggregation of static and time series datasets, in virtually any file format and file structure and for any geographic extent, directly at model run time; and comprehensive data management based on metadata cataloguing and discovery in repositories utilizing the MAGIC Table (Manipulation and Geographic Inquiry Control database). This innovative concept allows models to access data on-the-fly by data ID, irrespective of file path, file structure, and file format, and regardless of GIS specifications. In addition, a web-based information and computational system is being developed to control the I/O of spatially distributed Earth system, climate, hydrological, power grid, and economic data flow within the NEWS framework. The system allows scenario building, data exploration, visualization, querying, and manipulation of any loaded gridded, point, or vector polygon dataset. The system has demonstrated its potential for applications in other fields of Earth science modeling, education, and outreach. Over time, this implementation of the system will provide near real-time assessment of various current and future scenarios of the water-energy nexus.
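
    The on-the-fly re-gridding idea can be illustrated with a small interpolation sketch: resample a source raster onto a requesting model's grid at run time instead of storing pre-converted copies. The grids and values below are invented; this is not the MAGIC Table implementation.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical 0.1-degree source raster (e.g., a climate forcing field).
src_lat = np.linspace(24.0, 50.0, 260)
src_lon = np.linspace(-125.0, -66.0, 590)
src = np.random.default_rng(4).normal(size=(src_lat.size, src_lon.size))

# Build the interpolator once; evaluate it on whatever grid a model requests.
interp = RegularGridInterpolator((src_lat, src_lon), src,
                                 bounds_error=False, fill_value=np.nan)

dst_lat = np.linspace(25.0, 49.0, 97)      # coarser model grid
dst_lon = np.linspace(-124.0, -67.0, 229)
glat, glon = np.meshgrid(dst_lat, dst_lon, indexing="ij")
regridded = interp(np.column_stack([glat.ravel(), glon.ravel()])).reshape(glat.shape)
print(regridded.shape)
```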

  4. SSEL-ADE: A semi-supervised ensemble learning framework for extracting adverse drug events from social media.

    PubMed

    Liu, Jing; Zhao, Songzheng; Wang, Gang

    2018-01-01

    With the development of Web 2.0 technology, social media websites have become lucrative but under-explored data sources for extracting adverse drug events (ADEs), which is a serious health problem. Besides ADE, other semantic relation types (e.g., drug indication and beneficial effect) could hold between the drug and adverse event mentions, making ADE relation extraction - distinguishing ADE relationship from other relation types - necessary. However, conducting ADE relation extraction in social media environment is not a trivial task because of the expertise-dependent, time-consuming and costly annotation process, and the feature space's high-dimensionality attributed to intrinsic characteristics of social media data. This study aims to develop a framework for ADE relation extraction using patient-generated content in social media with better performance than that delivered by previous efforts. To achieve the objective, a general semi-supervised ensemble learning framework, SSEL-ADE, was developed. The framework exploited various lexical, semantic, and syntactic features, and integrated ensemble learning and semi-supervised learning. A series of experiments were conducted to verify the effectiveness of the proposed framework. Empirical results demonstrate the effectiveness of each component of SSEL-ADE and reveal that our proposed framework outperforms most of existing ADE relation extraction methods The SSEL-ADE can facilitate enhanced ADE relation extraction performance, thereby providing more reliable support for pharmacovigilance. Moreover, the proposed semi-supervised ensemble methods have the potential of being applied to effectively deal with other social media-based problems. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Managing time-substitutable electricity usage using dynamic controls

    DOEpatents

    Ghosh, Soumyadip; Hosking, Jonathan R.; Natarajan, Ramesh; Subramaniam, Shivaram; Zhang, Xiaoxuan

    2017-02-07

    A predictive-control approach allows an electricity provider to monitor and proactively manage peak and off-peak residential intra-day electricity usage in an emerging smart energy grid using time-dependent dynamic pricing incentives. The daily load is modeled as time-shifted, but cost-differentiated and substitutable, copies of the continuously-consumed electricity resource, and a consumer-choice prediction model is constructed to forecast the corresponding intra-day shares of total daily load according to this model. This is embedded within an optimization framework for managing the daily electricity usage. A series of transformations are employed, including the reformulation-linearization technique (RLT) to obtain a Mixed-Integer Programming (MIP) model representation of the resulting nonlinear optimization problem. In addition, various regulatory and pricing constraints are incorporated in conjunction with the specified profit and capacity utilization objectives.

  6. Managing time-substitutable electricity usage using dynamic controls

    DOEpatents

    Ghosh, Soumyadip; Hosking, Jonathan R.; Natarajan, Ramesh; Subramaniam, Shivaram; Zhang, Xiaoxuan

    2017-02-21

    A predictive-control approach allows an electricity provider to monitor and proactively manage peak and off-peak residential intra-day electricity usage in an emerging smart energy grid using time-dependent dynamic pricing incentives. The daily load is modeled as time-shifted, but cost-differentiated and substitutable, copies of the continuously-consumed electricity resource, and a consumer-choice prediction model is constructed to forecast the corresponding intra-day shares of total daily load according to this model. This is embedded within an optimization framework for managing the daily electricity usage. A series of transformations are employed, including the reformulation-linearization technique (RLT) to obtain a Mixed-Integer Programming (MIP) model representation of the resulting nonlinear optimization problem. In addition, various regulatory and pricing constraints are incorporated in conjunction with the specified profit and capacity utilization objectives.
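
    A toy linear program conveys the scheduling idea behind these two patents (though not their consumer-choice model or RLT-based MIP reformulation): allocate a fixed daily demand across time-shifted, cost-differentiated copies of the electricity resource, subject to per-period capacity. The prices, capacity, and demand below are assumptions, modelled with the PuLP library.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, value

periods = range(24)
price = [0.10 if 8 <= h < 20 else 0.05 for h in periods]   # peak vs off-peak tariff
capacity = 120.0                                            # per-period limit
daily_demand = 2000.0                                       # total to be served

prob = LpProblem("intraday_load_allocation", LpMinimize)
load = [LpVariable(f"load_{h}", lowBound=0, upBound=capacity) for h in periods]
prob += lpSum(price[h] * load[h] for h in periods)          # minimise total cost
prob += lpSum(load) == daily_demand                          # demand must be met
prob.solve()

print("cost:", value(prob.objective))
print("peak-hour share:",
      sum(value(load[h]) for h in range(8, 20)) / daily_demand)
```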

  7. Dynamic simulation and numerical analysis of hurricane storm surge under sea level rise with geomorphologic changes along the northern Gulf of Mexico

    USGS Publications Warehouse

    Bilskie, Matthew V.; Hagen, S.C.; Alizad, K.A.; Medeiros, S.C.; Passeri, Davina L.; Needham, H.F.; Cox, A.

    2016-01-01

    This work outlines a dynamic modeling framework to examine the effects of global climate change, and sea level rise (SLR) in particular, on tropical cyclone-driven storm surge inundation. The methodology, applied across the northern Gulf of Mexico, adapts a present day large-domain, high resolution, tide, wind-wave, and hurricane storm surge model to characterize the potential outlook of the coastal landscape under four SLR scenarios for the year 2100. The modifications include shoreline and barrier island morphology, marsh migration, and land use land cover change. Hydrodynamics of 10 historic hurricanes were simulated through each of the five model configurations (present day and four SLR scenarios). Under SLR, the total inundated land area increased by 87% and developed and agricultural lands by 138% and 189%, respectively. Peak surge increased by as much as 1 m above the applied SLR in some areas, and other regions were subject to a reduction in peak surge, with respect to the applied SLR, indicating a nonlinear response. Analysis of time-series water surface elevation suggests the interaction between SLR and storm surge is nonlinear in time; SLR increased the time of inundation and caused an earlier arrival of the peak surge, which cannot be addressed using a static (“bathtub”) modeling framework. This work supports the paradigm shift to using a dynamic modeling framework to examine the effects of global climate change on coastal inundation. The outcomes have broad implications and ultimately support a better holistic understanding of the coastal system and aid restoration and long-term coastal sustainability.

  8. Building a Continental Scale Land Cover Monitoring Framework for Australia

    NASA Astrophysics Data System (ADS)

    Thankappan, Medhavy; Lymburner, Leo; Tan, Peter; McIntyre, Alexis; Curnow, Steven; Lewis, Adam

    2012-04-01

    Land cover information is critical for national reporting and decision making in Australia. A review of information requirements for reporting on national environmental indicators identified the need for consistent land cover information to be compared against a baseline. A Dynamic Land Cover Dataset (DLCD) for Australia has recently been developed by Geoscience Australia and the Australian Bureau of Agricultural and Resource Economics and Sciences (ABARES) to provide a comprehensive and consistent land cover information baseline that enables monitoring and reporting for sustainable farming practices, water resource management, soil erosion, and forests at national and regional scales. The DLCD was produced from the analysis of Enhanced Vegetation Index (EVI) data at 250-metre resolution derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) for the period from 2000 to 2008. The EVI time series for each pixel were modelled as 12 coefficients based on their statistical, phenological and seasonal characteristics. The time series were then clustered in coefficient space and labelled using ancillary information on vegetation and land use at the catchment scale. The accuracy of the DLCD was assessed using field survey data over 25,000 locations provided by vegetation and land management agencies in State and Territory jurisdictions, and by ABARES. The DLCD is seen as the first in a series of steps to build a framework for national land cover monitoring in Australia. A robust methodology to provide annual updates to the DLCD is currently being developed at Geoscience Australia. There is also a growing demand from the user community for land cover information at better spatial resolution than is currently available through the DLCD. Global land cover mapping initiatives that rely on Earth observation data offer many opportunities for national and international programs to work in concert and deliver better outcomes by streamlining efforts on development and validation of land cover products. Among the upcoming missions, the Global Monitoring for Environment and Security (GMES) Sentinel-2 satellites are seen as an important source of optical data for updating land cover information in Australia. This paper outlines the DLCD development, key applications that inform nationally significant issues, further work on updating the DLCD that would enable the transition to a national land cover monitoring framework, challenges and approaches to delivering land cover information at higher spatial resolutions on a continental scale, and the potential value of data from the Sentinel-2 mission in supporting land cover monitoring in Australia and globally.
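
    The per-pixel workflow reads as: summarise each EVI time series by a small coefficient vector, then cluster pixels in coefficient space before labelling the clusters with ancillary data. The sketch below uses a reduced six-coefficient summary (the DLCD uses twelve) on synthetic EVI series; all values and sizes are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(8)
n_pixels, n_obs = 1000, 9 * 23                 # ~23 composites/year, 2000-2008
t = np.arange(n_obs)
evi = (0.3 + 0.2 * rng.random((n_pixels, 1)) * np.sin(2 * np.pi * t / 23)
       + rng.normal(0, 0.03, (n_pixels, n_obs)))

def coefficients(series):
    """Reduced coefficient vector: basic statistics plus one seasonal harmonic."""
    harm = np.column_stack([np.sin(2 * np.pi * t / 23), np.cos(2 * np.pi * t / 23)])
    amp = np.linalg.lstsq(harm, series - series.mean(), rcond=None)[0]
    return [series.mean(), series.std(), series.min(), series.max(), *amp]

feats = np.array([coefficients(px) for px in evi])
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(feats)
print("pixels per cluster:", np.bincount(labels))
```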

  9. Generating synthetic wave climates for coastal modelling: a linear mixed modelling approach

    NASA Astrophysics Data System (ADS)

    Thomas, C.; Lark, R. M.

    2013-12-01

    Numerical coastline morphological evolution models require wave climate properties to drive morphological change through time. Wave climate properties (typically wave height, period and direction) may be temporally fixed, culled from real wave buoy data, or allowed to vary in some way defined by a Gaussian or other pdf. However, to examine the sensitivity of coastline morphologies to wave climate change, it seems desirable to be able to modify wave climate time series from a current state to some new state along a trajectory, in a way consistent with, or initially conditioned by, the properties of existing data, or to generate fully synthetic data sets with realistic time series properties. For example, mean or significant wave height time series may have underlying periodicities, as revealed in numerous analyses of wave data. Our motivation is to develop a simple methodology to generate synthetic wave climate time series that can change in some stochastic way through time. We wish to use such time series in a coastline evolution model to test the sensitivity of coastal landforms to changes in wave climate over decadal and centennial scales. We have worked initially on time series of significant wave height, based on data from a Waverider III buoy located off the coast of Yorkshire, England. The statistical framework for the simulation is the linear mixed model. The target variable, perhaps after a Box-Cox transformation, is modelled as multivariate Gaussian, with the mean modelled as a function of a fixed effect and two random components, one of which is independently and identically distributed (iid) and the second of which is temporally correlated. The model was fitted to the data by likelihood methods. We considered the option of a periodic mean, the period either fixed (e.g. at 12 months) or estimated from the data. We considered two possible correlation structures for the second random effect. In one, the correlation decays exponentially with time. In the second (spherical) model, it cuts off at a temporal range. Having fitted the model, multiple realisations were generated: the random effects were simulated by specifying a covariance matrix for the simulated values, with the estimated parameters. The Cholesky factorisation of the covariance matrix was computed, and realisations of the random component of the model were generated by pre-multiplying a vector of iid standard Gaussian variables by the lower triangular factor. The resulting random variate was added to the mean value computed from the fixed effects, and the result back-transformed to the original scale of measurement. Realistic simulations result from the approach described above. Background exploratory data analysis was undertaken on 20-day sets of 30-minute buoy data, selected from days 5-24 of January, April, July and October 2011, to elucidate daily to weekly variations and to keep the numerical analysis computationally tractable. Work remains to develop suitable models for synthetic directional data. We suggest that the general principles of the method will have applications in other geomorphological modelling endeavours requiring time series of stochastically variable environmental parameters.
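
    The simulation step described above is easy to sketch: build a covariance matrix from an exponential temporal correlation plus an iid nugget, take its lower Cholesky factor, pre-multiply iid standard normals, and add the fixed-effect mean. The parameter values and annual-cosine mean below are assumptions, not the fitted Waverider values, and the back-transformation is omitted.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 365
t = np.arange(n)

sill, corr_range, nugget = 0.4, 30.0, 0.1                 # assumed parameters
lags = np.abs(t[:, None] - t[None, :])
C = sill * np.exp(-lags / corr_range) + nugget * np.eye(n)  # exponential + iid
L = np.linalg.cholesky(C)                                   # lower triangular factor

mean = 1.8 + 0.6 * np.cos(2 * np.pi * t / 365.25)         # periodic fixed effect
hs = mean + L @ rng.standard_normal(n)                     # simulated wave heights
print("simulated significant wave height, first week:", np.round(hs[:7], 2))
```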

  10. Marine Research Infrastructure collaboration in the COOPLUS project framework - Promoting synergies for marine ecosystems studies

    NASA Astrophysics Data System (ADS)

    Beranzoli, L.; Best, M.; Embriaco, D.; Favali, P.; Juniper, K.; Lo Bue, N.; Lara-Lopez, A.; Materia, P.; Ó Conchubhair, D.; O'Rourke, E.; Proctor, R.; Weller, R. A.

    2017-12-01

    Understanding effects on marine ecosystems of multiple drivers at various scales; from regional such as climate and ocean circulation, to local, such as seafloor gas emissions and harmful underwater noise, requires long time-series of integrated and standardised datasets. Large-scale research infrastructures for ocean observation are able to provide such time-series for a variety of ocean process physical parameters (mass and energy exchanges among surface, water column and benthic boundary layer) that constitute important and necessary measures of environmental conditions and change/development over time. Information deduced from these data is essential for the study, modelling and prediction of marine ecosystems changes and can reveal and potentially confirm deterioration and threats. The COOPLUS European Commission project brings together research infrastructures with the aim of coordinating multilateral cooperation among RIs and identifying common priorities, actions, instruments, resources. COOPLUS will produce a Strategic Research and Innovation Agenda (SRIA) which will be a shared roadmap for mid to long-term collaboration. In particular, marine RIs collaborating in COOPLUS, namely the European Multidisciplinary Seafloor and water column Observatory: EMSO (Europe), the Ocean Observatories Initiative (OOI, USA), Ocean Networks Canada (ONC), and the Integrated Marine Observing System (IMOS, Australia), can represent a source of important data for researchers of marine ecosystems. The RIs can then, in turn, receive suggestions from researchers for implementing new measurements and stimulating cross-cutting collaborations and data integration and standardisation from their user community. This poster provides a description of EMSO, OOI, ONC and IMOS for the benefit of marine ecosystem studies and presents examples of where the analyses of time-series have revealed noteworthy environmental conditions, temporal trends and events.

  11. Trend analysis of air temperature and precipitation time series over Greece: 1955-2010

    NASA Astrophysics Data System (ADS)

    Marougianni, G.; Melas, D.; Kioutsioukis, I.; Feidas, H.; Zanis, P.; Anandranistakis, E.

    2012-04-01

    In this study, a database of air temperature and precipitation time series from the network of the Hellenic National Meteorological Service has been developed in the framework of the project GEOCLIMA, co-financed by the European Union and Greek national funds through the Operational Program "Competitiveness and Entrepreneurship" of the Research Funding Program COOPERATION 2009. Initially, a quality test was applied to the raw data; missing observations were then imputed with a regularized spatial-temporal expectation-maximization algorithm to complete the climatic record. Next, a quantile-matching algorithm was applied in order to verify the homogeneity of the data. The processed time series were used for the calculation of annual and seasonal trends of air temperature and precipitation. Monthly maximum and minimum surface air temperature and precipitation means at all available stations in Greece were analyzed for temporal trends and spatial variation patterns for the longest common period of homogeneous data (1955-2010), applying the Mann-Kendall test. The majority of the examined stations showed a significant increase in summer maximum and minimum temperatures; this could possibly be physically linked to the Etesian winds, because of the less frequent expansion of the low over the southeastern Mediterranean. Summer minimum temperatures have been increasing at a faster rate than summer maximum temperatures, reflecting an asymmetric change in extreme temperature distributions. Total annual precipitation has decreased significantly at stations located in western Greece, as well as in the southeast, while the remaining areas exhibit a non-significant negative trend. This reduction is very likely linked to the positive phase of the NAO, which resulted in an increase in the frequency and persistence of anticyclones over the Mediterranean.
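
    The Mann-Kendall test at the core of the trend analysis is short enough to sketch. The version below omits the tie correction, and the synthetic temperature series is an assumption.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction): returns S and two-sided p."""
    x = np.asarray(x)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0  # continuity correction
    return s, 2 * (1 - norm.cdf(abs(z)))

# Hypothetical series of summer minimum temperatures, 1955-2010.
rng = np.random.default_rng(6)
tmin = 18 + 0.03 * np.arange(56) + rng.normal(0, 0.5, 56)
s, p = mann_kendall(tmin)
print(f"S={s}, p={p:.4f}")
```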

  12. A novel grey-fuzzy-Markov and pattern recognition model for industrial accident forecasting

    NASA Astrophysics Data System (ADS)

    Edem, Inyeneobong Ekoi; Oke, Sunday Ayoola; Adebiyi, Kazeem Adekunle

    2017-10-01

    Industrial accident forecasting is a top-echelon research domain that has seen lively research discussion over the past several years, and its scope continues to expand as scholars in the area generate new knowledge. Intelligent contributions on current research issues in the accident domain can therefore spark further value-added academic discussion of practical significance to the safety community. In this communication, a new grey-fuzzy-Markov time series model, developed from a nondifferential grey interval analytical framework, is presented for the first time. The instrument forecasts future accident occurrences under a time-invariance assumption. The contribution of the article is to recognise accident occurrence patterns and decompose them into grey-state principal pattern components. The architectural framework of the developed grey-fuzzy-Markov pattern recognition (GFMAPR) model has four stages: fuzzification, smoothening, defuzzification and whitenisation. The results of applying the developed novel model signify that forecasting can be carried out effectively under uncertain conditions, positioning the model as a distinctly superior tool for accident forecasting investigations. The novelty of the work lies in the model's capability to make highly accurate predictions and forecasts from small or incomplete accident data.
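
    The grey component of such a model is typically a GM(1,1) forecaster, sketched below; the fuzzy and Markov layers of GFMAPR are not reproduced, and the accident counts are invented.

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Basic GM(1,1) grey forecasting model, a common building block of
    grey-fuzzy-Markov schemes (fuzzy/Markov stages not shown here)."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                  # accumulated series (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                       # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]    # grey parameters
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a   # response function
    return np.diff(x1_hat, prepend=0.0)                 # restore series (IAGO)

# Hypothetical annual accident counts; forecast the next two years.
counts = [52, 49, 47, 46, 43, 42]
print(np.round(gm11_forecast(counts, steps=2)[-2:], 1))
```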

  13. [GRADE Evidence to Decision (EtD) frameworks: a systematic and transparent approach to making well informed healthcare choices. 2: Clinical practice guidelines].

    PubMed

    Morgano, Gian Paolo; Parmelli, Elena; Amato, Laura; Iannone, Primiano; Marchetti, Marco; Moja, Lorenzo; Davoli, Marina; Schünemann, Holger

    2018-05-01

    In the first article in this series we described the GRADE (Grading of Recommendations Assessment, Development and Evaluation) Evidence to Decision (EtD) frameworks and their rationale for different types of decisions. In this second article, we describe the use of EtD frameworks for clinical recommendations and how it can help clinicians and patients who use those recommendations. EtD frameworks for clinical practice recommendations provide a structured and transparent approach for guideline panels. The framework helps ensure consideration of key criteria that determine whether an intervention should be recommended and that judgments are informed by the best available evidence. Frameworks are also a way for panels to make guideline users aware of the rationale (justification) for their recommendations.

  14. Time series evaluation of landscape dynamics using annual Landsat imagery and spatial statistical modeling: Evidence from the Phoenix metropolitan region

    NASA Astrophysics Data System (ADS)

    Fan, Chao; Myint, Soe W.; Rey, Sergio J.; Li, Wenwen

    2017-06-01

    Urbanization is a natural and social process involving simultaneous changes to the Earth's land systems, energy flow, demographics, and the economy. Understanding the spatiotemporal pattern of urbanization is increasingly important for policy formulation, decision making, and natural resource management. A combination of satellite remote sensing and patch-based models has been widely adopted to characterize landscape changes at various spatial and temporal scales. Nevertheless, the validity of this type of framework in identifying long-term changes, especially subtle or gradual land modifications, is seriously challenged. In this paper, we integrate annual image time series, continuous spatial indices, and non-parametric trend analysis into a spatiotemporal study of landscape dynamics over the Phoenix metropolitan area from 1991 to 2010. We harness local indicators of spatial dependence and the modified Mann-Kendall test to describe the monotonic trends in the quantity and spatial arrangement of two important land use/land cover types: vegetation and built-up areas. Results suggest that declines in vegetation and increases in built-up areas are the two prevalent types of change across the region. Vegetation increases mostly occur at the outskirts, where new residential areas are developed from natural desert. A sizable proportion of the vegetation declines and built-up increases are seen in the central and southeast parts. Extensive land conversion from agricultural fields into urban land use is one important driver of vegetation decline. Xeriscaping also contributes part of the vegetation loss and an increasingly heterogeneous landscape. The quantitative framework proposed in this study provides a pathway to effective landscape mapping and change monitoring from a spatial statistical perspective.
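
    Local indicators of spatial dependence can be sketched with a hand-rolled local Moran's I on a raster with rook (4-neighbour) contiguity; dedicated libraries such as PySAL provide tested implementations, and the grid and embedded cluster below are synthetic.

```python
import numpy as np

def local_morans_i(grid):
    """Local Moran's I with rook contiguity: positive values flag clusters
    of similar values, negative values flag spatial outliers."""
    z = (grid - grid.mean()) / grid.std()
    lag = np.zeros_like(z)
    w = np.zeros_like(z)
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        shifted = np.roll(z, (dr, dc), axis=(0, 1))
        valid = np.ones_like(z, dtype=bool)        # mask wrap-around edges
        if dr: valid[(0 if dr > 0 else -1), :] = False
        if dc: valid[:, (0 if dc > 0 else -1)] = False
        lag += np.where(valid, shifted, 0.0)
        w += valid
    return z * lag / w                             # row-standardised neighbour mean

rng = np.random.default_rng(7)
veg = rng.normal(size=(50, 50))
veg[10:20, 10:20] += 2.0                           # an embedded vegetated cluster
print("mean local I inside cluster:", local_morans_i(veg)[10:20, 10:20].mean())
```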

  15. Counseling Female Offenders and Victims: A Strengths-Restorative Approach. Springer Series on Family Violence.

    ERIC Educational Resources Information Center

    van Wormer, Katherine

    This book considers the many aspects of how the criminal justice system can be reshaped to address the needs of victims of violence and of offenders who are themselves often the victims of abuse. It presents a new model that offers an integrated framework combining tenets of social work's strengths framework with the restorative justice model. It…

  16. The Nature and Nurture of Giftedness: A New Framework for Understanding Gifted Education. Education & Psychology of the Gifted Series

    ERIC Educational Resources Information Center

    Dai, David Yun

    2010-01-01

    With unprecedented scope and vision, Dr. Dai systematically redefines giftedness and proposes a new framework for the field of gifted education. He identifies nine essential tensions, revolving around three core questions: What do we know about the respective roles of natural ability, environment and experiences, and personal effort in talent…

  17. Education, Technology and Connectedness. Global Societal Trends to 2030: Thematic Report 2

    ERIC Educational Resources Information Center

    Horvath, Veronika; Ghez, Jeremy; Khodyakov, Dmitry; Yaqub, Ohid

    2015-01-01

    This Research Report forms part of our series on global societal trends and their impact on the EU in 2030. This analysis is embedded within the framework of the European Strategy and Policy Analysis System (ESPAS) set up to develop a lasting framework to assess global trends and to develop policy responses across EU institutions over the next…

  18. Metrics and Methodologies for Measuring Teaching Quality in Higher Education: Developing the Teaching Excellence Framework (TEF)

    ERIC Educational Resources Information Center

    Gunn, Andrew

    2018-01-01

    The creation of the Teaching Excellence Framework (TEF) represents a significant development concerning the teaching mission of the university in the UK. This paper considers the background to, and the development of, the TEF. It explains the context from which the TEF emerged and unpacks a series of rationales which illustrate the need for, and…

  19. Individual Empowerment. Global Societal Trends to 2030: Thematic Report 3

    ERIC Educational Resources Information Center

    Graf, Marlon; Ghez, Jeremy; Khodyakov, Dmitry; Yaqub, Ohid

    2015-01-01

    This Research Report forms part of our series on global societal trends and their impact on the EU in 2030. This analysis is embedded within the framework of the European Strategy and Policy Analysis System (ESPAS) set up to develop a lasting framework to assess global trends and to develop policy responses across EU institutions over the next…

  20. Two luminescent Zn(II) metal-organic frameworks for exceptionally selective detection of picric acid explosives.

    PubMed

    Shi, Zhi-Qiang; Guo, Zi-Jian; Zheng, He-Gen

    2015-05-14

    Two luminescent Zn(II) metal-organic frameworks were prepared from a π-conjugated thiophene-containing carboxylic acid ligand. Both MOFs show strong luminescence, which can be quenched by a series of nitroaromatic explosives. Importantly, they exhibit highly sensitive and selective detection of picric acid compared to other nitroaromatic explosives.
