Science.gov

Sample records for multivariate event time

  1. Sample Size Determination in Shared Frailty Models for Multivariate Time-to-Event Data

    PubMed Central

    Chen, Liddy M.; Ibrahim, Joseph G.; Chu, Haitao

    2014-01-01

    The frailty model is increasingly popular for analyzing multivariate time-to-event data. The most common model is the shared frailty model. Although study design consideration is as important as analysis strategies, sample size determination methodology in studies with multivariate time-to-event data is greatly lacking in the literature. In this paper, we develop a sample size determination method for the shared frailty model to investigate the treatment effect on multivariate event times. We analyze the data using both a parametric model and a piecewise model with unknown baseline hazard, and compare the empirical power with the calculated power. Lastly, we discuss the formula for testing the treatment effect on recurrent events. PMID:24697252
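
A property worth noting for study planning: under a shared gamma frailty with variance theta, within-cluster event times follow Clayton-type dependence with Kendall's tau = theta / (theta + 2). The simulation below is an illustrative sketch of that mechanism only, not the authors' sample-size method:

```python
import numpy as np

def simulate_shared_frailty(n_clusters, theta, rng):
    """Simulate paired event times under a shared gamma frailty.

    Each cluster draws one frailty w ~ Gamma(shape=1/theta, scale=theta)
    (mean 1, variance theta); both event times in a cluster are then
    exponential with rate w, inducing positive within-cluster dependence
    (Kendall's tau = theta / (theta + 2) for this model).
    """
    w = rng.gamma(shape=1.0 / theta, scale=theta, size=n_clusters)
    t1 = rng.exponential(1.0 / w)
    t2 = rng.exponential(1.0 / w)
    return t1, t2

def kendall_tau(x, y):
    """Plain O(n^2) Kendall's tau (no tie correction)."""
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

rng = np.random.default_rng(0)
t1, t2 = simulate_shared_frailty(500, theta=2.0, rng=rng)
tau = kendall_tau(t1, t2)  # theoretical value: 2 / (2 + 2) = 0.5
```

Simulations of this kind are the usual route to the "empirical power" checks the abstract mentions: fit the model to each simulated data set and count rejections.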

  2. Sample size determination in shared frailty models for multivariate time-to-event data.

    PubMed

    Chen, Liddy M; Ibrahim, Joseph G; Chu, Haitao

    2014-01-01

    The frailty model is increasingly popular for analyzing multivariate time-to-event data. The most common model is the shared frailty model. Although study design consideration is as important as analysis strategies, sample size determination methodology in studies with multivariate time-to-event data is greatly lacking in the literature. In this article, we develop a sample size determination method for the shared frailty model to investigate the treatment effect on multivariate event times. We analyze the data using both a parametric model and a piecewise model with unknown baseline hazard, and compare the empirical power with the calculated power. Lastly, we discuss the formula for testing the treatment effect on recurrent events.

  3. Mining Recent Temporal Patterns for Event Detection in Multivariate Time Series Data

    PubMed Central

    Batal, Iyad; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2015-01-01

    Improving the performance of classifiers using pattern mining techniques has been an active topic of data mining research. In this work we introduce the recent temporal pattern mining framework for finding predictive patterns for monitoring and event detection problems in complex multivariate time series data. This framework first converts time series into time-interval sequences of temporal abstractions. It then constructs more complex temporal patterns backwards in time using temporal operators. We apply our framework to health care data of 13,558 diabetic patients and show its benefits by efficiently finding useful patterns for detecting and diagnosing adverse medical conditions that are associated with diabetes. PMID:25937993
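
The first step of the framework, converting a numeric series into a time-interval sequence of temporal abstractions, can be sketched as follows (a minimal illustration; the thresholds and state labels are hypothetical, not those used on the diabetes data):

```python
def abstract_series(values, low, high):
    """Convert a numeric series into a time-interval sequence of states.

    Each value is mapped to a categorical abstraction ('L', 'N', 'H'),
    and consecutive identical states are merged into intervals
    (state, start_index, end_index) -- the first step of the temporal
    pattern mining framework described above.
    """
    states = ['L' if v < low else 'H' if v > high else 'N' for v in values]
    intervals = []
    for i, s in enumerate(states):
        if intervals and intervals[-1][0] == s:
            intervals[-1] = (s, intervals[-1][1], i)
        else:
            intervals.append((s, i, i))
    return intervals

intervals = abstract_series([1, 2, 9, 9, 3], low=2, high=8)
```

Pattern mining then operates on these (state, start, end) intervals with temporal operators such as "before" and "co-occurs".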

  4. Discrimination-based sample size calculations for multivariable prognostic models for time-to-event data.

    PubMed

    Jinks, Rachel C; Royston, Patrick; Parmar, Mahesh K B

    2015-10-12

    Prognostic studies of time-to-event data, where researchers aim to develop or validate multivariable prognostic models in order to predict survival, are commonly seen in the medical literature; however, most are performed retrospectively and few consider sample size prior to analysis. Events-per-variable rules are sometimes cited, but these are based on bias and coverage of confidence intervals for model terms, which are not of primary interest when developing a model to predict outcome. In this paper we aim to develop sample size recommendations for multivariable models of time-to-event data, based on their prognostic ability. We derive formulae for determining the sample size required for multivariable prognostic models in time-to-event data, based on a measure of discrimination, D, developed by Royston and Sauerbrei. These formulae fall into two categories: either based on the significance of the value of D in a new study compared to a previous estimate, or based on the precision of the estimate of D in a new study in terms of confidence interval width. Using simulation, we show that they give the desired power and type I error and are not affected by random censoring. Additionally, we conduct a literature review to collate published values of D in different disease areas. We illustrate our methods using parameters from a published prognostic study in liver cancer. The resulting sample sizes can be large, and we suggest controlling study size by expressing the desired accuracy in the new study as a relative value as well as an absolute value. To improve usability, we use the values of D obtained from the literature review to develop an equation to approximately convert the commonly reported Harrell's c-index to D. A flow chart is provided to aid decision making when using these methods. We have developed a suite of sample size calculations based on the prognostic ability of a survival model, rather than the magnitude or significance of model coefficients.
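
The precision-based branch of such calculations follows generic normal-approximation logic: if the standard error of the estimated D shrinks as sigma1 / sqrt(events), the number of events needed for a target confidence-interval width can be solved for directly. The sketch below illustrates only this generic logic; sigma1 is a hypothetical per-event standard deviation, not the variance expression derived in the paper:

```python
from math import ceil
from statistics import NormalDist

def events_for_ci_width(sigma1, width, conf=0.95):
    """Events needed so a (conf)-level CI for the discrimination measure D
    has the target width, assuming SE(D) = sigma1 / sqrt(e).
    sigma1 is an illustrative constant, NOT the paper's formula.
    """
    z = NormalDist().inv_cdf((1 + conf) / 2)
    # CI width = 2 * z * sigma1 / sqrt(e); solve for e and round up
    return ceil((2 * z * sigma1 / width) ** 2)

# e.g. a hypothetical sigma1 of 2.0 and a desired CI width of 0.4:
needed = events_for_ci_width(2.0, 0.4)
```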

  5. A Bayesian semiparametric multivariate joint model for multiple longitudinal outcomes and a time-to-event.

    PubMed

    Rizopoulos, Dimitris; Ghosh, Pulak

    2011-05-30

    Motivated by a real data example on renal graft failure, we propose a new semiparametric multivariate joint model that relates multiple longitudinal outcomes to a time-to-event. To allow for greater flexibility, key components of the model are modelled nonparametrically. In particular, for the subject-specific longitudinal evolutions we use a spline-based approach, the baseline risk function is assumed piecewise constant, and the distribution of the latent terms is modelled using a Dirichlet Process prior formulation. Additionally, we discuss the choice of a suitable parameterization, from a practitioner's point of view, to relate the longitudinal process to the survival outcome. Specifically, we present three main families of parameterizations, discuss their features, and present tools to choose between them.

  6. Time-varying associations of suicide with deployments, mental health conditions, and stressful life events among current and former US military personnel: a retrospective multivariate analysis.

    PubMed

    Shen, Yu-Chu; Cunha, Jesse M; Williams, Thomas V

    2016-11-01

    US military suicides have increased substantially over the past decade and currently account for almost 20% of all military deaths. We investigated the associations of a comprehensive set of time-varying risk factors with suicides among current and former military service members. We did a retrospective multivariate analysis of all US military personnel between 2001 and 2011 (n=110 035 573 person-quarter-years, representing 3 795 823 service members). Outcome was death by suicide, either during service or post-separation. We used Cox proportional hazard models at the person-quarter level to examine associations of deployment, mental disorders, history of unlawful activity, stressful life events, and other demographic and service factors with death by suicide. The strongest predictors of death by suicide were current and past diagnoses of self-inflicted injuries, major depression, bipolar disorder, substance use disorder, and other mental health conditions (compared with service members with no history of diagnoses, the hazard ratio [HR] ranged from 1·4 [95% CI 1·14-1·72] to 8·34 [6·71-10·37]). Compared with service members who were never deployed, hazard rates of suicide (which represent the probability of death by suicide in a specific quarter given that the individual was alive in the previous quarter) were lower among the currently deployed (HR 0·50, 95% CI 0·40-0·61) but significantly higher in the quarters following first deployment (HR 1·51 [1·17-1·96] if deployed in the previous three quarters; 1·14 [1·06-1·23] if deployed four or more quarters ago). The hazard rate of suicide increased within the first year of separation from the military (HR 2·49, 95% CI 2·12-2·91), and remained high for those who had separated from the military 6 or more years ago (HR 1·63, 1·45-1·82). The increased hazard rate of death by suicide for military personnel varies by time since exposure to deployment, mental health diagnoses, and other stressful life events.

  7. Two-stage estimation for multivariate recurrent event data with a dependent terminal event.

    PubMed

    Chen, Chyong-Mei; Chuang, Ya-Wen; Shen, Pao-Sheng

    2015-03-01

    Recurrent event data arise in longitudinal follow-up studies, where each subject may experience the same type of events repeatedly. The work in this article is motivated by the data from a study of repeated peritonitis for patients on peritoneal dialysis. For medical and cost reasons, the peritonitis cases were classified into two types: Gram-positive and non-Gram-positive peritonitis. Further, since death and hemodialysis therapy preclude the occurrence of recurrent events, we face multivariate recurrent event data with a dependent terminal event. We propose a flexible marginal model, which has three characteristics: first, we assume marginal proportional hazard and proportional rates models for terminal event time and recurrent event processes, respectively; second, the inter-recurrence dependence and the correlation between the multivariate recurrent event processes and terminal event time are modeled through three multiplicative frailties corresponding to the specified marginal models; third, the rate model with frailties for recurrent events is specified only on the time before the terminal event. We propose a two-stage estimation procedure for estimating unknown parameters. We also establish the consistency of the two-stage estimator. Simulation studies show that the proposed approach is appropriate for practical use. The methodology is applied to the peritonitis cohort data that motivated this study.

  8. Multivariate Voronoi Outlier Detection for Time Series.

    PubMed

    Zwilling, Chris E; Wang, Michelle Yongmei

    2014-10-01

    Outlier detection is a primary step in many data mining and analysis applications, including healthcare and medical research. This paper presents a general method to identify outliers in multivariate time series based on a Voronoi diagram, which we call Multivariate Voronoi Outlier Detection (MVOD). The approach copes with outliers in a multivariate framework by designing and extracting effective attributes or features from the data that can take parametric or nonparametric forms. Voronoi diagrams allow for automatic configuration of the neighborhood relationship of the data points, which facilitates the differentiation of outliers and non-outliers. Experimental evaluation demonstrates that our MVOD is an accurate, sensitive, and robust method for detecting outliers in multivariate time series data.
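
The neighbourhood idea can be sketched with scipy: points whose Voronoi cells share a ridge are natural neighbours, and a point far from all of its neighbours is a candidate outlier. This is a simplified reading of the MVOD construction, not the authors' exact attribute design:

```python
import numpy as np
from scipy.spatial import Voronoi

def voronoi_outlier_scores(points):
    """Score each point by its mean distance to its Voronoi neighbours.

    Voronoi neighbours (points whose cells share a ridge) give an
    automatically configured neighbourhood: points far from all their
    neighbours get large scores.
    """
    vor = Voronoi(points)
    neighbours = {i: [] for i in range(len(points))}
    for i, j in vor.ridge_points:
        neighbours[i].append(j)
        neighbours[j].append(i)
    return np.array([
        np.mean([np.linalg.norm(points[i] - points[j]) for j in neighbours[i]])
        for i in range(len(points))
    ])

rng = np.random.default_rng(1)
pts = rng.normal(size=(30, 2))
pts = np.vstack([pts, [[10.0, 10.0]]])  # append an obvious outlier
scores = voronoi_outlier_scores(pts)
```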

  9. Multivariate Time Series Decomposition into Oscillation Components.

    PubMed

    Matsuda, Takeru; Komaki, Fumiyasu

    2017-08-01

    Many time series are considered to be a superposition of several oscillation components. We have proposed a method for decomposing univariate time series into oscillation components and estimating their phases (Matsuda & Komaki, 2017). In this study, we extend that method to multivariate time series. We assume that several oscillators underlie the given multivariate time series and that each variable corresponds to a superposition of the projections of the oscillators. Thus, the oscillators superpose on each variable with amplitude and phase modulation. Based on this idea, we develop Gaussian linear state-space models and use them to decompose the given multivariate time series. The model parameters are estimated from data using the empirical Bayes method, and the number of oscillators is determined using the Akaike information criterion. Therefore, the proposed method extracts underlying oscillators in a data-driven manner and enables investigation of phase dynamics in a given multivariate time series. Numerical results show the effectiveness of the proposed method. From monthly mean north-south sunspot number data, the proposed method reveals an interesting phase relationship.

  10. Transfer entropy between multivariate time series

    NASA Astrophysics Data System (ADS)

    Mao, Xuegeng; Shang, Pengjian

    2017-06-01

    Identifying the direction and strength of the interdependence between time series in multivariate systems is a crucial topic. In this paper, we propose a method of transfer entropy based on the theory of time-delay reconstruction of a phase space, which is a model-free approach to detect causalities in multivariate time series. This method overcomes the limitation of the original transfer entropy, which can only capture, for scalar time series, how one system drives the transition probabilities of another. Using artificial time series, we show that the driving character is clearly reflected as the coupling strength between two signals increases, and we confirm the effectiveness of the method when noise is added. Furthermore, we apply it to real-world data, namely financial time series, in order to characterize the information flow among different stocks.
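
For intuition, the plug-in (histogram) estimate of transfer entropy for binary series with history length 1 fits in a few lines; the paper's method generalizes this with time-delay phase-space reconstruction, which the sketch below does not include:

```python
import random
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Estimate TE(Y -> X) for discrete series with history length 1.

    TE = sum over states of p(x1, x0, y0) * log2[ p(x1 | x0, y0) / p(x1 | x0) ],
    with all probabilities estimated by plug-in (histogram) counts.
    """
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    singles = Counter(x[:-1])
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_xy[(x0, y0)]
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * log2(p_cond_full / p_cond_x)
    return te

random.seed(0)
y = [random.randint(0, 1) for _ in range(5000)]
# x copies y with a one-step lag (plus occasional flips): Y drives X
x = [0] + [yi if random.random() < 0.9 else 1 - yi for yi in y[:-1]]
```

With this construction the estimated flow from Y to X should clearly exceed the flow from X to Y.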

  11. Multivariate cluster analysis of forest fire events in Portugal

    NASA Astrophysics Data System (ADS)

    Tonini, Marj; Pereira, Mario; Vega Orozco, Carmen; Parente, Joana

    2015-04-01

    Portugal is one of the major fire-prone European countries, mainly due to its favourable climatic, topographic and vegetation conditions. Compared to the other Mediterranean countries, the number of events registered here from 1980 up to the present is the highest; likewise, with respect to the burnt area, Portugal is the third most affected country. Portuguese mapped burnt areas are available from the website of the Institute for the Conservation of Nature and Forests (ICNF). This official geodatabase is the result of satellite measurements starting from the year 1990. The spatial information, delivered in shapefile format, provides a detailed description of the shape and the size of the area burnt by each fire, while the date/time information relating to the fire ignition is restricted to the year of occurrence. In terms of statistical formalism, wildfires can be associated with a stochastic point process, where events are analysed as a set of geographical coordinates corresponding, for example, to the centroid of each burnt area. Analysing the spatio-temporal pattern of stochastic point processes, including cluster analysis, is a basic procedure for discovering predisposing factors as well as for prevention and forecasting purposes. These kinds of studies are primarily focused on investigating the spatial cluster behaviour of environmental data sequences and/or mapping their distribution at different times. To include both dimensions (space and time), a comprehensive spatio-temporal analysis is needed. In the present study the authors attempt to verify whether, in the case of wildfires in Portugal, space and time act independently or whether, conversely, neighbouring events are also closer in time. We present an application of the spatio-temporal K-function to a long dataset (1990-2012) of mapped burnt areas. Moreover, the multivariate K-function allowed us to check for a possibly different distribution between small and large fires. The final objective is to elaborate a 3D
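
The core of the spatio-temporal K-function is a pair count over joint space and time windows. A minimal sketch without edge correction, on synthetic coordinates rather than the ICNF data:

```python
import random

def space_time_K(points, r, t):
    """Unnormalised space-time K statistic: the average number of other
    events within spatial distance r AND time lag t of an event.

    Comparing it with the product of the purely spatial and purely
    temporal versions indicates space-time clustering. Sketch only:
    no edge correction, no intensity normalisation.
    """
    n = len(points)
    count = 0
    for i in range(n):
        xi, yi, ti = points[i]
        for j in range(n):
            if i == j:
                continue
            xj, yj, tj = points[j]
            if (xi - xj) ** 2 + (yi - yj) ** 2 <= r * r and abs(ti - tj) <= t:
                count += 1
    return count / n

random.seed(2)
# synthetic events: (x, y, time) uniform on the unit cube
events = [(random.random(), random.random(), random.random()) for _ in range(200)]
k_small = space_time_K(events, 0.2, 0.5)
k_large = space_time_K(events, 0.4, 0.5)
```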

  12. Multivariate Statistical Modelling of Drought and Heat Wave Events

    NASA Astrophysics Data System (ADS)

    Manning, Colin; Widmann, Martin; Vrac, Mathieu; Maraun, Douglas; Bevaqua, Emanuele

    2016-04-01

    C. Manning (1,2), M. Widmann (1), M. Vrac (2), D. Maraun (3), E. Bevaqua (2,3) -- 1. School of Geography, Earth and Environmental Sciences, University of Birmingham, Edgbaston, Birmingham, UK; 2. Laboratoire des Sciences du Climat et de l'Environnement (LSCE-IPSL), Centre d'Etudes de Saclay, Gif-sur-Yvette, France; 3. Wegener Center for Climate and Global Change, University of Graz, Brandhofgasse 5, 8010 Graz, Austria. Compound extreme events are a combination of two or more contributing events which in themselves may not be extreme but through their joint occurrence produce an extreme impact. Compound events are noted in the latest IPCC report as an important type of extreme event that has been given little attention so far. As part of the CE:LLO project (Compound Events: muLtivariate statisticaL mOdelling) we are developing a multivariate statistical model to gain an understanding of the dependence structure of certain compound events. One focus of this project is the interaction between drought and heat wave events. Soil moisture has both a local and a non-local effect on the occurrence of heat waves, as it strongly controls the latent heat flux affecting the transfer of sensible heat to the atmosphere. These processes can create a feedback whereby a heat wave may be amplified or suppressed by the soil-moisture preconditioning and, vice versa, the heat wave may in turn have an effect on soil conditions. An aim of this project is to capture this dependence in order to correctly describe the joint probabilities of these conditions and the resulting probability of their compound impact. We will show an application of Pair Copula Constructions (PCCs) to study the aforementioned compound event. PCCs allow, in theory, for the formulation of multivariate dependence structures in any dimension, where the PCC is a decomposition of a multivariate distribution into a product of bivariate components modelled using copulas.

  13. Multivariate Recurrent Events in the Presence of Multivariate Informative Censoring with Applications to Bleeding and Transfusion Events in Myelodysplastic Syndrome

    PubMed Central

    Zeng, Donglin; Ibrahim, Joseph G.; Chen, Ming-Hui; Hu, Kuolung; Jia, Catherine

    2014-01-01

    We propose a novel general class of joint models to analyze recurrent events that has a wide variety of applications. The application of focus in this paper is modeling the bleeding and transfusion events in Myelodysplastic Syndrome (MDS) studies, where patients may die or withdraw from the study early due to adverse events or other reasons, such as consent withdrawal or required alternative therapy during the study. The proposed model accommodates multiple recurrent events and multivariate informative censoring through a shared random-effects model. The random-effects model captures both within-subject and within-event dependence simultaneously. We construct the likelihood function for the semi-parametric joint model and develop an EM algorithm for inference. The computational burden does not increase with the number of types of recurrent events. We utilize the MDS clinical trial data to illustrate our proposed methodology. We also conduct a number of simulations to examine the performance of the proposed model. PMID:24605978

  14. Network structure of multivariate time series.

    PubMed

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-10-21

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exist, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows one to extract information on a high-dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow one to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
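
One concrete way to map each channel of a multivariate series to a network layer is a horizontal visibility graph, with one layer per channel forming the multiplex network; whether the original construction uses the horizontal or the natural visibility variant, the idea is the same. A simple O(n^2) sketch of one layer:

```python
def hvg_degrees(series):
    """Node degrees of the horizontal visibility graph of one series.

    Nodes are time points; i and j (i < j) are linked when every value
    strictly between them is lower than min(series[i], series[j]).
    Building one such layer per channel yields a multiplex network
    whose structural descriptors summarise the dynamics.
    """
    n = len(series)
    deg = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if all(series[k] < min(series[i], series[j]) for k in range(i + 1, j)):
                deg[i] += 1
                deg[j] += 1
    return deg
```

For example, in the series [3, 1, 2] every pair is mutually visible, while in the monotone series [1, 2, 3] only adjacent points are.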

  15. Network structure of multivariate time series

    NASA Astrophysics Data System (ADS)

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-10-01

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exist, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows one to extract information on a high-dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow one to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.

  16. Network structure of multivariate time series

    PubMed Central

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-01-01

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exist, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows one to extract information on a high-dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow one to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail. PMID:26487040

  17. Modeling multivariate covariance nonstationary time series and their dependency structure

    SciTech Connect

    Gersch, W.

    1985-08-01

    The parametric modeling of covariance nonstationary time series and the computation of their changing interdependency structure from the fitted model are treated. The nonstationary time series are modeled by a multivariate time-varying autoregressive (AR) model. The time evolution of the AR parameters is expressed as linear combinations of discrete Legendre orthogonal polynomial functions of time. The model is fitted by a Householder transformation, AIC order determination, regression subset selection method. The computation of the instantaneous dependence, feedback and causality structure of the time series from the fitted model is discussed. An example of the modeling and determination of instantaneous causality in a human implanted-electrode seizure-event EEG is shown.
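
The modeling idea (AR coefficients that evolve as linear combinations of Legendre polynomials in time, so the fit reduces to ordinary least squares in an expanded basis) can be sketched for a time-varying AR(1). This sketch uses plain least squares on simulated data, not the Householder/AIC subset-selection procedure of the paper:

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(3)
n = 2000
tau = np.linspace(-1.0, 1.0, n)          # time rescaled to [-1, 1]
a_true = 0.5 + 0.3 * tau                 # slowly varying AR(1) coefficient
x = np.zeros(n)
for t in range(1, n):
    x[t] = a_true[t] * x[t - 1] + 0.05 * rng.standard_normal()

# Expand a(t) in Legendre polynomials P_0, P_1, P_2 and fit by least
# squares: x_t = sum_m c_m * P_m(tau_t) * x_{t-1} + noise.
order = 3
basis = np.column_stack([
    legendre.legval(tau[1:], [0] * m + [1]) * x[:-1] for m in range(order)
])
coef, *_ = np.linalg.lstsq(basis, x[1:], rcond=None)
# coef should be close to (0.5, 0.3, 0.0), since P_0 = 1 and P_1 = tau
```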

  18. Time varying, multivariate volume data reduction

    SciTech Connect

    Ahrens, James P; Fout, Nathaniel; Ma, Kwan - Liu

    2010-01-01

    Large-scale supercomputing is revolutionizing the way science is conducted. A growing challenge, however, is understanding the massive quantities of data produced by large-scale simulations. The data, typically time-varying, multivariate, and volumetric, can occupy from hundreds of gigabytes to several terabytes of storage space. Transferring and processing volume data of such sizes is prohibitively expensive and resource intensive. Although it may not be possible to entirely alleviate these problems, data compression should be considered as part of a viable solution, especially when the primary means of data analysis is volume rendering. In this paper we present our study of multivariate compression, which exploits correlations among related variables, for volume rendering. Two configurations for multidimensional compression based on vector quantization are examined. We emphasize quality reconstruction and interactive rendering, which leads us to a solution using graphics hardware to perform on-the-fly decompression during rendering. We present a solution which addresses the need for data reduction in large supercomputing environments where data resulting from simulations occupies tremendous amounts of storage. Our solution employs a lossy encoding scheme to achieve data reduction with several options in terms of rate-distortion behavior. We focus on encoding of multiple variables together, with optional compression in space and time. The compressed volumes can be rendered directly with commodity graphics cards at interactive frame rates and rendering quality similar to that of static volume renderers. Compression results using a multivariate time-varying data set indicate that encoding multiple variables results in acceptable performance in the case of spatial and temporal encoding as compared to independent compression of variables. The relative performance of spatial vs. temporal compression is data dependent, although temporal compression has the
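
The vector-quantization idea can be illustrated in one dimension: group samples into small blocks, learn a codebook with k-means, and store only the codebook plus one index per block. This is a heavily simplified stand-in for the paper's multivariate, GPU-decoded volume scheme:

```python
import numpy as np

def kmeans(data, k, iters=20, rng=None):
    """Tiny k-means used as a vector-quantiser codebook trainer."""
    rng = rng or np.random.default_rng(0)
    centers = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((data[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = data[labels == c].mean(axis=0)
    return centers, labels

rng = np.random.default_rng(4)
signal = np.sin(np.linspace(0, 20, 1024)) + 0.05 * rng.standard_normal(1024)
blocks = signal.reshape(-1, 4)          # group samples into 4-sample blocks
codebook, labels = kmeans(blocks, k=16, rng=rng)
# lossy reconstruction: each block replaced by its nearest codeword
reconstructed = codebook[labels].reshape(-1)
mse = np.mean((signal - reconstructed) ** 2)
```

Storage drops from 1024 floats to 16 codewords of 4 floats plus 256 small indices, at the cost of the reconstruction error `mse`.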

  19. Visualizing frequent patterns in large multivariate time series

    NASA Astrophysics Data System (ADS)

    Hao, M.; Marwah, M.; Janetzko, H.; Sharma, R.; Keim, D. A.; Dayal, U.; Patnaik, D.; Ramakrishnan, N.

    2011-01-01

    The detection of previously unknown, frequently occurring patterns in time series, often called motifs, has been recognized as an important task. However, it is difficult to discover and visualize these motifs as their numbers increase, especially in large multivariate time series. To find frequent motifs, we use several temporal data mining and event encoding techniques to cluster and convert a multivariate time series to a sequence of events. Then we quantify the efficiency of the discovered motifs by linking them with a performance metric. To visualize frequent patterns in a large time series with potentially hundreds of nested motifs on a single display, we introduce three novel visual analytics methods: (1) motif layout, using colored rectangles for visualizing the occurrences and hierarchical relationships of motifs in a multivariate time series, (2) motif distortion, for enlarging or shrinking motifs as appropriate for easy analysis and (3) motif merging, to combine a number of identical adjacent motif instances without cluttering the display. Analysts can interactively optimize the degree of distortion and merging to get the best possible view. A specific motif (e.g., the most efficient or least efficient motif) can be quickly detected from a large time series for further investigation. We have applied these methods to two real-world data sets: data center cooling and oil well production. The results provide important new insights into the recurring patterns.

  20. Detrended fluctuation analysis of multivariate time series

    NASA Astrophysics Data System (ADS)

    Xiong, Hui; Shang, P.

    2017-01-01

    In this work, we generalize detrended fluctuation analysis (DFA) to the multivariate case, named multivariate DFA (MVDFA). The validity of the proposed MVDFA is illustrated by numerical simulations on synthetic multivariate processes, considering cases in which the initial data are generated independently from the same system or from different systems, as well as correlated variates from one system. Moreover, the proposed MVDFA works well when applied to the multi-scale analysis of the returns of stock indices in the Chinese and US stock markets. Generally, connections between the multivariate system and the individual variates are uncovered, showing the solid performance of MVDFA and the multi-scale MVDFA.
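
One plausible reading of the MVDFA construction (not necessarily the authors' exact definition) averages the squared detrended fluctuations over channels at each scale before extracting the scaling exponent:

```python
import numpy as np

def mvdfa(X, scales):
    """Simplified multivariate DFA.

    For each channel: integrate the mean-removed series, split into
    windows of size s, remove a linear trend per window, and record the
    squared residual fluctuation. Average over channels and windows,
    then return the slope of log F(s) vs log s (the scaling exponent).
    """
    profiles = np.cumsum(X - X.mean(axis=1, keepdims=True), axis=1)
    F = []
    for s in scales:
        n_seg = X.shape[1] // s
        t = np.arange(s)
        f2 = []
        for ch in range(X.shape[0]):
            for seg in range(n_seg):
                w = profiles[ch, seg * s:(seg + 1) * s]
                trend = np.polyval(np.polyfit(t, w, 1), t)
                f2.append(np.mean((w - trend) ** 2))
        F.append(np.sqrt(np.mean(f2)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

rng = np.random.default_rng(5)
alpha = mvdfa(rng.standard_normal((3, 4096)), [8, 16, 32, 64, 128])
# uncorrelated white noise should give a scaling exponent near 0.5
```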

  1. A Multivariate Description of Compound Events of Meteorological Drought and Heat Waves

    NASA Astrophysics Data System (ADS)

    Manning, Colin; Widmann, Martin; Maraun, Douglas; Vrac, Mathieu; Van Loon, Anne; Bevacqua, Emanuele

    2017-04-01

    Compound events (CEs) are extreme impacts driven by multiple events or variables that in themselves may not be extreme but through their joint occurrence produce an extreme impact. We focus here on compound events arising from the concurrence of meteorological drought and heat waves. Meteorological drought can lead to a deficit in moisture availability for evapotranspiration from soil and so induce land-surface feedbacks, whereby reductions and increases may occur in latent and sensible heat fluxes respectively, leading to an amplification of temperature extremes. We take an events-based approach where we define events in time and space, relative to a given location, using characteristics of both meteorological drought and heat waves such as duration, spatial extent and a measure of severity. We employ Pair Copula Constructions (PCC) to define the multivariate distribution of these characteristics. Copulas are multivariate distribution functions that allow one to model the dependence structure of given variables separately from their marginal behaviour. PCCs then allow, in theory, for the formulation of a multivariate distribution of any dimension that is decomposed into a product of marginal probability density functions and two-dimensional copulas, some of which are conditional. We show here the variables used and their dependence structure that comprise the compound event arising from the concurrence of meteorological drought and heat waves relative to a location of interest. We provide physical interpretation of the multivariate distribution defined and show potential applications of this approach.
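
The rank-transform step that precedes any copula fit, here with a bivariate Gaussian copula standing in for the higher-dimensional pair copula construction of the abstract, can be sketched with the standard library alone; the "drought" and "heat" samples below are synthetic and purely illustrative:

```python
import random
from statistics import NormalDist

def normal_scores(values):
    """Rank-transform to uniforms, then map through the standard normal
    inverse CDF: the usual first step when fitting a Gaussian copula,
    separating dependence from marginal behaviour."""
    nd = NormalDist()
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    n = len(values)
    return [nd.inv_cdf(r / (n + 1)) for r in ranks]

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

# Synthetic "drought severity" and "heat-wave severity" pair sharing a
# common driver (true dependence 0.8 on the normal-score scale).
random.seed(6)
z = [random.gauss(0, 1) for _ in range(2000)]
drought = [zi + 0.5 * random.gauss(0, 1) for zi in z]
heat = [zi + 0.5 * random.gauss(0, 1) for zi in z]
rho = pearson(normal_scores(drought), normal_scores(heat))
# rho is the dependence parameter of the fitted bivariate Gaussian copula
```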

  2. Segmentation of biological multivariate time-series data

    NASA Astrophysics Data System (ADS)

    Omranian, Nooshin; Mueller-Roeber, Bernd; Nikoloski, Zoran

    2015-03-01

    Time-series data from multicomponent systems capture the dynamics of the ongoing processes and reflect the interactions between the components. The progression of processes in such systems usually involves check-points and events at which the relationships between the components are altered in response to stimuli. Detecting these events together with the implicated components can help understand the temporal aspects of complex biological systems. Here we propose a regularized regression-based approach for identifying breakpoints and corresponding segments from multivariate time-series data. In combination with techniques from clustering, the approach also allows estimating the significance of the determined breakpoints as well as the key components implicated in the emergence of the breakpoints. Comparative analysis with the existing alternatives demonstrates the power of the approach to identify biologically meaningful breakpoints in diverse time-resolved transcriptomics data sets from the yeast Saccharomyces cerevisiae and the diatom Thalassiosira pseudonana.
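
A heavily simplified stand-in for the segmentation idea: with a single breakpoint and piecewise-constant fits, the breakpoint is the split minimizing total squared error. The regularized-regression approach in the abstract generalizes this to many breakpoints and multivariate data:

```python
import random

def best_breakpoint(y):
    """Single breakpoint minimising total squared error of
    piecewise-constant fits on the two resulting segments."""
    def sse(seg):
        m = sum(seg) / len(seg)
        return sum((v - m) ** 2 for v in seg)
    best_k, best_cost = None, float('inf')
    for k in range(1, len(y)):
        cost = sse(y[:k]) + sse(y[k:])
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

# synthetic series with a mean shift at index 50
random.seed(7)
y = [random.gauss(0, 0.3) for _ in range(50)] + \
    [random.gauss(2, 0.3) for _ in range(50)]
k = best_breakpoint(y)
```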

  3. An Efficient Pattern Mining Approach for Event Detection in Multivariate Temporal Data.

    PubMed

    Batal, Iyad; Cooper, Gregory; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2016-01-01

    This work proposes a pattern mining approach to learn event detection models from complex multivariate temporal data, such as electronic health records. We present Recent Temporal Pattern mining, a novel approach for efficiently finding predictive patterns for event detection problems. This approach first converts the time series data into time-interval sequences of temporal abstractions. It then constructs more complex time-interval patterns backward in time using temporal operators. We also present the Minimal Predictive Recent Temporal Patterns framework for selecting a small set of predictive and non-spurious patterns. We apply our methods for predicting adverse medical events in real-world clinical data. The results demonstrate the benefits of our methods in learning accurate event detection models, which is a key step for developing intelligent patient monitoring and decision support systems.
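The first step of the approach, turning raw series into time-interval sequences of temporal abstractions, can be sketched directly. The thresholds and the creatinine-like example series are hypothetical, and real abstraction alphabets are richer than low/normal/high.

```python
def abstract_series(values, low, high):
    """Convert a numeric series into a sequence of (state, start, end)
    intervals, where state is 'L', 'N' or 'H' relative to the thresholds.
    This mirrors the first step of temporal-pattern mining: raw values
    become time-interval sequences of abstractions."""
    def state(v):
        return 'L' if v < low else ('H' if v > high else 'N')

    intervals = []
    for t, v in enumerate(values):
        s = state(v)
        if intervals and intervals[-1][0] == s:
            intervals[-1] = (s, intervals[-1][1], t)   # extend the open interval
        else:
            intervals.append((s, t, t))                # start a new interval
    return intervals

creatinine = [0.9, 1.0, 1.4, 1.6, 1.5, 1.1, 0.8]
seq = abstract_series(creatinine, low=0.9, high=1.3)
# seq == [('N', 0, 1), ('H', 2, 4), ('N', 5, 5), ('L', 6, 6)]
```

Pattern mining then combines such intervals, most recent first, with temporal operators such as "before" and "co-occurs".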

  4. An Efficient Pattern Mining Approach for Event Detection in Multivariate Temporal Data

    PubMed Central

    Batal, Iyad; Cooper, Gregory; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2015-01-01

    This work proposes a pattern mining approach to learn event detection models from complex multivariate temporal data, such as electronic health records. We present Recent Temporal Pattern mining, a novel approach for efficiently finding predictive patterns for event detection problems. This approach first converts the time series data into time-interval sequences of temporal abstractions. It then constructs more complex time-interval patterns backward in time using temporal operators. We also present the Minimal Predictive Recent Temporal Patterns framework for selecting a small set of predictive and non-spurious patterns. We apply our methods for predicting adverse medical events in real-world clinical data. The results demonstrate the benefits of our methods in learning accurate event detection models, which is a key step for developing intelligent patient monitoring and decision support systems. PMID:26752800

  5. Identification of Multivariate Time Series and Multivariate Input-Output Models

    NASA Astrophysics Data System (ADS)

    Cooper, David M.; Wood, Eric F.

    1982-08-01

    The problem of linear model structure identification for multivariate time series or multiple input-output models is presented and solved. The identification is obtained using canonical correlations to determine model order. The equivalence between state-space model structure and multivariate autoregressive moving average with exogenous inputs (ARMAX) models is presented. The class of models open to analysis includes rainfall-runoff models, multivariate streamflow models, and time invariant state-space models used in Kalman filtering. Examples include a rainfall-runoff model using three precipitation inputs, a four-site monthly streamflow model, and a four-season streamflow model.
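The identification step here rests on canonical correlations between past and future observations; model order is read off from how many of those correlations remain significantly nonzero. Below is a minimal numpy sketch of just the canonical-correlation computation, applied to a simulated AR(1) series; the coefficient 0.9 and the series length are arbitrary choices for the example.

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between two data matrices (n x p, n x q),
    computed as the singular values of Sxx^{-1/2} Sxy Syy^{-1/2}."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Sxx = X.T @ X / len(X)
    Syy = Y.T @ Y / len(Y)
    Sxy = X.T @ Y / len(X)

    def inv_sqrt(S):
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(M, compute_uv=False)

# AR(1) process: the past is strongly correlated with the future,
# so the leading canonical correlation should be near the AR coefficient.
rng = np.random.default_rng(2)
x = np.zeros(2000)
for t in range(1, 2000):
    x[t] = 0.9 * x[t - 1] + rng.normal()
past = x[:-1].reshape(-1, 1)
future = x[1:].reshape(-1, 1)
rho = canonical_correlations(past, future)
```

In the full procedure, the past and future vectors are stacked over several lags and the order is increased until the added canonical correlation is statistically indistinguishable from zero.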

  6. Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models

    ERIC Educational Resources Information Center

    Price, Larry R.

    2012-01-01

    The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…

  7. Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models

    ERIC Educational Resources Information Center

    Price, Larry R.

    2012-01-01

    The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…

  8. Interpretable Early Classification of Multivariate Time Series

    ERIC Educational Resources Information Center

    Ghalwash, Mohamed F.

    2013-01-01

    Recent advances in technology have led to an explosion in data collection over time rather than in a single snapshot. For example, microarray technology allows us to measure gene expression levels in different conditions over time. Such temporal data grants the opportunity for data miners to develop algorithms to address domain-related problems,…

  9. Interpretable Early Classification of Multivariate Time Series

    ERIC Educational Resources Information Center

    Ghalwash, Mohamed F.

    2013-01-01

    Recent advances in technology have led to an explosion in data collection over time rather than in a single snapshot. For example, microarray technology allows us to measure gene expression levels in different conditions over time. Such temporal data grants the opportunity for data miners to develop algorithms to address domain-related problems,…

  10. Distance measure with improved lower bound for multivariate time series

    NASA Astrophysics Data System (ADS)

    Li, Hailin

    2017-02-01

    A lower bound function is an important technique for fast search and indexing of time series data. Multivariate time series are high-dimensional in two respects: the time-based dimension and the variable-based dimension. To handle the influence of the variable-based dimension, a novel method is proposed for computing a lower bound distance for multivariate time series. Like traditional approaches, the proposed method first reduces the dimensionality of the time series and thus does not apply the lower bound function directly to the multivariate time series. In the reduction step, each multivariate time series is reduced to a univariate time series, called a center sequence, according to the principle of piecewise aggregate approximation. In addition, an extended lower bound function is designed to achieve good tightness and quickly measure the distance between any two center sequences. The experimental results demonstrate that the proposed lower bound function has better tightness and improves the performance of similarity search in multivariate time series datasets.
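The reduce-then-bound idea can be illustrated with a simple stand-in. The sketch below is not the paper's extended lower bound function: it reduces each multivariate series to a univariate center sequence by piecewise aggregate approximation and uses the standard Cauchy-Schwarz argument to obtain a distance that provably never exceeds the true Frobenius distance.

```python
import numpy as np

def center_sequence(X, w):
    """Reduce a multivariate series X (T x d) to a length-w univariate
    'center sequence': each entry averages one time segment across all
    variables (a piecewise-aggregate-style reduction)."""
    T, d = X.shape
    seg = T // w
    return np.array([X[k * seg:(k + 1) * seg].mean() for k in range(w)])

def lower_bound(X, Y, w):
    """Lower bound on the Frobenius distance between X and Y computed from
    their center sequences. Averaging a block of seg*d entries shrinks
    differences by at most a factor of sqrt(seg*d) (Cauchy-Schwarz), so
    scaling by that factor keeps the bound valid."""
    T, d = X.shape
    seg = T // w
    cx, cy = center_sequence(X, w), center_sequence(Y, w)
    return float(np.sqrt(seg * d) * np.linalg.norm(cx - cy))

rng = np.random.default_rng(3)
X = rng.normal(size=(64, 3))
Y = rng.normal(size=(64, 3))
lb = lower_bound(X, Y, w=8)
true = float(np.linalg.norm(X - Y))
```

Because the bound is cheap and never overestimates, candidate sequences whose bound already exceeds the search threshold can be discarded without computing the full distance.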

  11. Estimating the ratio of multivariate recurrent event rates with application to a blood transfusion study.

    PubMed

    Ning, Jing; Rahbar, Mohammad H; Choi, Sangbum; Piao, Jin; Hong, Chuan; Del Junco, Deborah J; Rahbar, Elaheh; Fox, Erin E; Holcomb, John B; Wang, Mei-Cheng

    2015-07-09

    In comparative effectiveness studies of multicomponent, sequential interventions like blood product transfusion (plasma, platelets, red blood cells) for trauma and critical care patients, the timing and dynamics of treatment relative to the fragility of a patient's condition is often overlooked and underappreciated. While many hospitals have established massive transfusion protocols to ensure that physiologically optimal combinations of blood products are rapidly available, the period of time required to achieve a specified massive transfusion standard (e.g. a 1:1 or 1:2 ratio of plasma or platelets:red blood cells) has been ignored. To account for the time-varying characteristics of transfusions, we use semiparametric rate models for multivariate recurrent events to estimate blood product ratios. We use latent variables to account for multiple sources of informative censoring (early surgical or endovascular hemorrhage control procedures or death). The major advantage is that the distributions of latent variables and the dependence structure between the multivariate recurrent events and informative censoring need not be specified. Thus, our approach is robust to complex model assumptions. We establish asymptotic properties and evaluate finite sample performance through simulations, and apply the method to data from the PRospective Observational Multicenter Major Trauma Transfusion study.

  12. Drunk driving detection based on classification of multivariate time series.

    PubMed

    Li, Zhenlong; Jin, Xue; Zhao, Xiaohua

    2015-09-01

    This paper addresses the problem of detecting drunk driving based on classification of multivariate time series. First, driving performance measures were collected from a test in a driving simulator located in the Traffic Research Center, Beijing University of Technology. Lateral position and steering angle were used to detect drunk driving. Second, multivariate time series analysis was performed to extract the features. A piecewise linear representation was used to represent multivariate time series. A bottom-up algorithm was then employed to separate multivariate time series. The slope and time interval of each segment were extracted as the features for classification. Third, a support vector machine classifier was used to classify driver's state into two classes (normal or drunk) according to the extracted features. The proposed approach achieved an accuracy of 80.0%. Drunk driving detection based on the analysis of multivariate time series is feasible and effective. The approach has implications for drunk driving detection. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.
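The segmentation step can be sketched as a classic bottom-up merge. This is a simplified univariate illustration, not the paper's implementation (which applies the piecewise linear representation to lateral position and steering angle and feeds slope and interval features to an SVM); the synthetic signal and error threshold are assumptions.

```python
import numpy as np

def fit_cost(y):
    """Squared error and slope of a least-squares line through y."""
    t = np.arange(len(y))
    slope, intercept = np.polyfit(t, y, 1)
    return float(((y - (slope * t + intercept)) ** 2).sum()), float(slope)

def bottom_up(y, max_error):
    """Bottom-up piecewise-linear segmentation: start from fine two-point
    segments and greedily merge the adjacent pair whose merged fit cost is
    lowest, until no merge stays within max_error.
    Returns (start, end, slope) per segment."""
    bounds = [(i, min(i + 2, len(y))) for i in range(0, len(y), 2)]
    while len(bounds) > 1:
        costs = [fit_cost(y[s:bounds[i + 1][1]])[0]
                 for i, (s, _) in enumerate(bounds[:-1])]
        i = int(np.argmin(costs))
        if costs[i] > max_error:
            break
        bounds[i] = (bounds[i][0], bounds[i + 1][1])
        del bounds[i + 1]
    return [(s, e, fit_cost(y[s:e])[1]) for s, e in bounds]

# Steering-angle-like signal: flat, then a steady ramp; the greedy merge
# consolidates each regime and stops at the knee between them.
y = np.concatenate([np.zeros(20), np.arange(20) * 0.5])
segments = bottom_up(y, max_error=0.5)
```

The (slope, interval length) of each resulting segment is exactly the kind of feature vector the record's classifier consumes.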

  13. Regularly timed events amid chaos

    NASA Astrophysics Data System (ADS)

    Blakely, Jonathan N.; Cooper, Roy M.; Corron, Ned J.

    2015-11-01

    We show rigorously that the solutions of a class of chaotic oscillators are characterized by regularly timed events in which the derivative of the solution is instantaneously zero. The perfect regularity of these events is in stark contrast with the well-known unpredictability of chaos. We explore some consequences of these regularly timed events through experiments using chaotic electronic circuits. First, we show that a feedback loop can be implemented to phase lock the regularly timed events to a periodic external signal. In this arrangement the external signal regulates the timing of the chaotic signal but does not strictly lock its phase. That is, phase slips of the chaotic oscillation persist without disturbing the timing of the regular events. Second, we couple the regularly timed events of one chaotic oscillator to those of another. A state of synchronization is observed where the oscillators exhibit synchronized regular events while their chaotic amplitudes and phases evolve independently. Finally, we add additional coupling to synchronize the amplitudes as well, though in the opposite direction, illustrating the independence of the amplitudes from the regularly timed events.

  14. A multivariate heuristic model for fuzzy time-series forecasting.

    PubMed

    Huarng, Kun-Huang; Yu, Tiffany Hui-Kuang; Hsu, Yu Wei

    2007-08-01

    Fuzzy time-series models have been widely applied due to their ability to handle nonlinear data directly and because no rigid assumptions for the data are needed. In addition, many such models have been shown to provide better forecasting results than their conventional counterparts. However, since most of these models require complicated matrix computations, this paper proposes the adoption of a multivariate heuristic function that can be integrated with univariate fuzzy time-series models into multivariate models. Such a multivariate heuristic function can easily be extended and integrated with various univariate models. Furthermore, the integrated model can handle multiple variables to improve forecasting results and, at the same time, avoid complicated computations due to the inclusion of multiple variables.

  15. Optimal model-free prediction from multivariate time series

    NASA Astrophysics Data System (ADS)

    Runge, Jakob; Donner, Reik V.; Kurths, Jürgen

    2015-04-01

    Forecasting a complex system's time evolution constitutes a challenging problem, especially if the governing physical equations are unknown or too complex to be simulated with first-principle models. Here a model-free prediction scheme based on the observed multivariate time series is discussed. It efficiently overcomes the curse of dimensionality in finding good predictors from large data sets and yields information-theoretically optimal predictors. The practical performance of the prediction scheme is demonstrated on multivariate nonlinear stochastic delay processes and in an application to an index of El Niño-Southern Oscillation.

  16. [Anomaly Detection of Multivariate Time Series Based on Riemannian Manifolds].

    PubMed

    Xu, Yonghong; Hou, Xiaoying; Li, Shuting; Cui, Jie

    2015-06-01

    Multivariate time series problems are widespread in production and everyday life. Anomaly detection has provided valuable information in financial, hydrological and meteorological fields, and in research areas such as earthquakes, video surveillance and medicine. In order to find exceptions in a time sequence quickly and efficiently, and to present them in an intuitive way, in this study we combined Riemannian manifolds with statistical process control charts, describing each sliding window of the time sequence by its covariance matrix, to achieve anomaly detection in multivariate time series together with its visualization. We used simulated MA (moving average) data streams and abnormal electrocardiogram data from the MIT-BIH database as experimental objects to verify the anomaly detection method. The results showed that the method was reasonable and effective.
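A minimal sketch of the covariance-on-a-manifold idea, assuming a log-Euclidean distance between the symmetric positive-definite covariance matrices of consecutive sliding windows (the record combines such descriptors with control charts; the simulated decorrelation signal below is illustrative, not the MIT-BIH data).

```python
import numpy as np

def spd_log(S):
    """Matrix logarithm of a symmetric positive-definite matrix via eigh."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.log(w)) @ V.T

def anomaly_scores(X, win):
    """Describe each non-overlapping window of a multivariate series (T x d)
    by its covariance matrix, then score each window by its log-Euclidean
    distance to the previous window's covariance. Jumps flag anomalies."""
    covs = [np.cov(X[t:t + win].T) + 1e-6 * np.eye(X.shape[1])
            for t in range(0, X.shape[0] - win + 1, win)]
    logs = [spd_log(C) for C in covs]
    return [float(np.linalg.norm(logs[i] - logs[i - 1]))
            for i in range(1, len(logs))]

# Two channels that decorrelate halfway through: the covariance structure,
# not the marginal variance, is what changes, so a per-channel detector
# would miss it while the covariance descriptor catches it.
rng = np.random.default_rng(4)
a = rng.normal(size=400)
b = np.concatenate([a[:200] + 0.1 * rng.normal(size=200),  # strongly coupled
                    rng.normal(size=200)])                  # decoupled
X = np.column_stack([a, b])
scores = anomaly_scores(X, win=50)
```

The largest score falls at the window boundary where the coupling breaks, which is exactly the kind of jump a control chart would flag.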

  17. A time domain frequency-selective multivariate Granger causality approach.

    PubMed

    Leistritz, Lutz; Witte, Herbert

    2016-08-01

    The investigation of effective connectivity is one of the major topics in computational neuroscience to understand the interaction between spatially distributed neuronal units of the brain. Thus, a wide variety of methods has been developed during the last decades to investigate functional and effective connectivity in multivariate systems. Their spectrum ranges from model-based to model-free approaches with a clear separation into time and frequency range methods. We present in this simulation study a novel time domain approach based on Granger's principle of predictability, which allows frequency-selective considerations of directed interactions. It is based on a comparison of prediction errors of multivariate autoregressive models fitted to systematically modified time series. These modifications are based on signal decompositions, which enable a targeted cancellation of specific signal components with specific spectral properties. Depending on the embedded signal decomposition method, a frequency-selective or data-driven signal-adaptive Granger Causality Index may be derived.
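The record's approach builds on Granger's principle of predictability: compare the prediction errors of autoregressive models with and without the candidate driver's past. The sketch below implements only the classic time-domain Granger Causality Index that this builds on, not the paper's frequency-selective signal decomposition; the coupling coefficients and lag order are example choices.

```python
import numpy as np

def ar_residual_var(target, predictors, p=2):
    """Least-squares fit of target[t] on p lags of each predictor series;
    returns the residual variance of the fit."""
    T = len(target)
    cols = [np.ones(T - p)]
    for x in predictors:
        cols += [x[p - k: T - k] for k in range(1, p + 1)]  # lag-k column
    A = np.column_stack(cols)
    y = target[p:]
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.var(y - A @ beta))

def granger_index(x, y, p=2):
    """Granger causality index y -> x: log ratio of the restricted
    (own-past-only) to the full (own past + y's past) residual variance.
    Positive values mean y's past improves the prediction of x."""
    restricted = ar_residual_var(x, [x], p)
    full = ar_residual_var(x, [x, y], p)
    return float(np.log(restricted / full))

# y drives x with a one-step lag; the reverse direction carries no signal.
rng = np.random.default_rng(5)
y = rng.normal(size=3000)
x = np.zeros(3000)
for t in range(1, 3000):
    x[t] = 0.5 * x[t - 1] + 0.8 * y[t - 1] + 0.1 * rng.normal()

gci_y_to_x = granger_index(x, y)
gci_x_to_y = granger_index(y, x)
```

The frequency-selective variant in the record computes the same index after cancelling specific spectral components of the series, so that the directed influence can be attributed to particular frequency bands.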

  18. A multivariate time-series approach to marital interaction

    PubMed Central

    Kupfer, Jörg; Brosig, Burkhard; Brähler, Elmar

    2005-01-01

    Time-series analysis (TSA) is frequently used in order to clarify complex structures of mutually interacting panel data. The method helps in understanding how the course of a dependent variable is predicted by independent time-series with no time lag, as well as by previous observations of that dependent variable (autocorrelation) and of independent variables (cross-correlation). The study analyzes the marital interaction of a married couple under clinical conditions over a period of 144 days by means of TSA. The data were collected within a course of couple therapy. The male partner was affected by a severe condition of atopic dermatitis and the woman suffered from bulimia nervosa. Each of the partners completed a mood questionnaire and a body symptom checklist. After the determination of auto- and cross-correlations between and within the parallel data sets, multivariate time-series models were specified. Mutual and individual patterns of emotional reactions explained 14% (skin) and 33% (bulimia) of the total variance in both dependent variables (adj. R², p<0.0001 for the multivariate models). The question was discussed whether multivariate TSA-models represent a suitable approach to the empirical exploration of clinical marital interaction. PMID:19742066

  19. A multivariate time-series approach to marital interaction.

    PubMed

    Kupfer, Jörg; Brosig, Burkhard; Brähler, Elmar

    2005-08-02

    Time-series analysis (TSA) is frequently used in order to clarify complex structures of mutually interacting panel data. The method helps in understanding how the course of a dependent variable is predicted by independent time-series with no time lag, as well as by previous observations of that dependent variable (autocorrelation) and of independent variables (cross-correlation). The study analyzes the marital interaction of a married couple under clinical conditions over a period of 144 days by means of TSA. The data were collected within a course of couple therapy. The male partner was affected by a severe condition of atopic dermatitis and the woman suffered from bulimia nervosa. Each of the partners completed a mood questionnaire and a body symptom checklist. After the determination of auto- and cross-correlations between and within the parallel data sets, multivariate time-series models were specified. Mutual and individual patterns of emotional reactions explained 14% (skin) and 33% (bulimia) of the total variance in both dependent variables (adj. R², p<0.0001 for the multivariate models). The question was discussed whether multivariate TSA-models represent a suitable approach to the empirical exploration of clinical marital interaction.

  20. The LCLS Timing Event System

    SciTech Connect

    Dusatko, John; Allison, S.; Browne, M.; Krejcik, P.; /SLAC

    2012-07-23

    The Linac Coherent Light Source requires precision timing trigger signals for various accelerator diagnostics and controls at the SLAC National Accelerator Laboratory. A new timing system has been developed that meets these requirements. This system is based on COTS hardware with a mixture of custom-designed units. An added challenge has been the requirement that the LCLS Timing System must co-exist with and 'know' about the existing SLC Timing System. This paper describes the architecture, construction and performance of the LCLS timing event system.

  1. Nonlinear independent component analysis and multivariate time series analysis

    NASA Astrophysics Data System (ADS)

    Storck, Jan; Deco, Gustavo

    1997-02-01

    We derive an information-theory-based unsupervised learning paradigm for nonlinear independent component analysis (NICA) with neural networks. We demonstrate that under the constraint of bounded and invertible output transfer functions the two main goals of unsupervised learning, redundancy reduction and maximization of the transmitted information between input and output (the Infomax principle), are equivalent. No assumptions are made concerning the kind of input and output distributions, i.e. the kind of nonlinearity of correlations. An adapted version of the general NICA network is used for the modeling of multivariate time series by unsupervised learning. Given time series of various observables of a dynamical system, our net learns their evolution in time by extracting statistical dependencies between past and present elements of the time series. Multivariate modeling is obtained by making the present value of each time series statistically independent not only of its own past but also of the past of the other series. Therefore, in contrast to univariate methods, the information lying in the couplings between the observables is also used and a detection of higher-order cross correlations is possible. We apply our method to time series of the two-dimensional Hénon map and to experimental time series obtained from measurements of axial velocities at different locations in weakly turbulent Taylor-Couette flow.

  2. Phylogenetic proximity revealed by neurodevelopmental event timings.

    PubMed

    Nagarajan, Radhakrishnan; Clancy, Barbara

    2008-01-01

    Statistical properties such as distribution and correlation signatures were investigated using a temporal database of common neurodevelopmental events in the three species most frequently used in experimental studies, rat, mouse, and macaque. There was a fine nexus between phylogenetic proximity and empirically derived dates of the occurrences of 40 common events including the neurogenesis of cortical layers and outgrowth milestones of developing axonal projections. Exponential and power-law approximations to the distribution of the events reveal strikingly similar decay patterns in rats and mice when compared to macaques. Subsequent hierarchical clustering of the common event timings also captures phylogenetic proximity, an association further supported by multivariate linear regression data. These preliminary results suggest that statistical analyses of the timing of developmental milestones may offer a novel measure of phylogenetic classifications. This may have added pragmatic value in the specific support it offers for the reliability of rat/mouse comparative modeling, as well as in the broader implications for the potential of meta-analyses using databases assembled from the extensive empirical literature.

  3. Time Delay in Microlensing Event

    NASA Image and Video Library

    2015-04-14

    This plot shows data obtained from NASA's Spitzer Space Telescope and the Optical Gravitational Lensing Experiment, or OGLE, telescope located in Chile, during a "microlensing" event. Microlensing events occur when one star passes another, and the gravity of the foreground star causes the distant star's light to magnify and brighten. This magnification is evident in the plot, as both Spitzer and OGLE register an increase in the star's brightness. If the foreground star is circled by a planet, the planet's gravity can alter the magnification over a shorter period, seen in the plot in the form of spikes and a dip. The great distance between Spitzer, in space, and OGLE, on the ground, meant that Spitzer saw this particular microlensing event before OGLE. The offset in the timing can be used to measure the distance to the planet. In this case, the planet, called OGLE-2014-BLG-0124L, was found to be 13,000 light-years away, near the center of our Milky Way galaxy. The finding was the result of fortuitous timing because Spitzer's overall program to observe microlensing events was only just starting up in the week before the planet's effects were visible from Spitzer's vantage point. While Spitzer sees infrared light of 3.6 microns in wavelength, OGLE sees visible light of 0.8 microns. http://photojournal.jpl.nasa.gov/catalog/PIA19331

  4. A Method for Comparing Multivariate Time Series with Different Dimensions

    PubMed Central

    Tapinos, Avraam; Mendes, Pedro

    2013-01-01

    In many situations it is desirable to compare dynamical systems based on their behavior. Similarity of behavior often implies similarity of internal mechanisms or dependency on common extrinsic factors. While there are widely used methods for comparing univariate time series, most dynamical systems are characterized by multivariate time series. Yet, comparison of multivariate time series has been limited to cases where they share a common dimensionality. A semi-metric is a distance function that has the properties of non-negativity, symmetry and reflexivity, but not sub-additivity. Here we develop a semi-metric – SMETS – that can be used for comparing groups of time series that may have different dimensions. To demonstrate its utility, the method is applied to dynamic models of biochemical networks and to portfolios of shares. The former is an example of a case where the dependencies between system variables are known, while in the latter the system is treated (and behaves) as a black box. PMID:23393554

  5. Inverse problem for multivariate time series using dynamical latent variables

    NASA Astrophysics Data System (ADS)

    Zamparo, M.; Stramaglia, S.; Banavar, J. R.; Maritan, A.

    2012-06-01

    Factor analysis is a well known statistical method to describe the variability among observed variables in terms of a smaller number of unobserved latent variables called factors. While dealing with multivariate time series, the temporal correlation structure of data may be modeled by including correlations in latent factors, but a crucial choice is the covariance function to be implemented. We show that analyzing multivariate time series in terms of latent Gaussian processes, which are mutually independent but with each of them being characterized by exponentially decaying temporal correlations, leads to an efficient implementation of the expectation-maximization algorithm for the maximum likelihood estimation of parameters, due to the properties of block-tridiagonal matrices. The proposed approach solves an ambiguity known as the identifiability problem, which renders the solution of factor analysis determined only up to an orthogonal transformation. Samples with just two temporal points are sufficient for the parameter estimation: hence the proposed approach may be applied even in the absence of prior information about the correlation structure of latent variables by fitting the model to pairs of points with varying time delay. Our modeling allows one to make predictions of the future values of time series and we illustrate our method by applying it to an analysis of published gene expression data from cell culture HeLa.

  6. Clustering multivariate time series using Hidden Markov Models.

    PubMed

    Ghassempour, Shima; Girosi, Federico; Maeder, Anthony

    2014-03-06

    In this paper we describe an algorithm for clustering multivariate time series with variables taking both categorical and continuous values. Time series of this type are frequent in health care, where they represent the health trajectories of individuals. The problem is challenging because categorical variables make it difficult to define a meaningful distance between trajectories. We propose an approach based on Hidden Markov Models (HMMs), where we first map each trajectory into an HMM, then define a suitable distance between HMMs and finally proceed to cluster the HMMs with a method based on a distance matrix. We test our approach on a simulated, but realistic, data set of 1,255 trajectories of individuals of age 45 and over, on a synthetic validation set with known clustering structure, and on a smaller set of 268 trajectories extracted from the longitudinal Health and Retirement Survey. The proposed method can be implemented quite simply using standard packages in R and Matlab and may be a good candidate for solving the difficult problem of clustering multivariate time series with categorical variables using tools that do not require advanced statistical knowledge, and therefore are accessible to a wide range of researchers.

  7. Multivariate Statistical Modelling of Compound Events via Pair-Copula Constructions: Analysis of Floods in Ravenna

    NASA Astrophysics Data System (ADS)

    Bevacqua, Emanuele; Maraun, Douglas; Hobæk Haff, Ingrid; Widmann, Martin; Vrac, Mathieu

    2017-04-01

    Compound events (CEs) are multivariate extreme events in which the individual contributing variables may not be extreme themselves, but their joint (dependent) occurrence causes an extreme impact. The conventional univariate statistical analysis cannot give accurate information regarding the multivariate nature of these events. We develop a conceptual model, implemented via pair-copula constructions, which allows for the quantification of the risk associated with compound events in present-day and future climate, as well as the uncertainty estimates around such risk. The model includes meteorological predictors which provide insight into both the involved physical processes, and the temporal variability of CEs. Moreover, this model provides multivariate statistical downscaling of compound events. Downscaling of compound events is required to extend their risk assessment to the past or future climate, where climate models either do not simulate realistic values of the local variables driving the events, or do not simulate them at all. Based on the developed model, we study compound floods, i.e. joint storm surge and high river runoff, in Ravenna (Italy). To explicitly quantify the risk, we define the impact of compound floods as a function of sea and river levels. We use meteorological predictors to extend the analysis to the past, and get a more robust risk analysis. We quantify the uncertainties of the risk analysis, observing that they are very large due to the shortness of the available data, though this may also be the case in other studies where they have not been estimated. Ignoring the dependence between sea and river levels would result in an underestimation of risk; in particular, the expected return period of the highest compound flood observed increases from about 20 to 32 years when switching from the dependent to the independent case.

  8. Fast and Flexible Multivariate Time Series Subsequence Search

    NASA Technical Reports Server (NTRS)

    Bhaduri, Kanishka; Oza, Nikunj C.; Zhu, Qiang; Srivastava, Ashok N.

    2010-01-01

    Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical monitoring, and financial systems. Domain experts are often interested in searching for interesting multivariate patterns from these MTS databases which often contain several gigabytes of data. Surprisingly, research on MTS search is very limited. Most of the existing work only supports queries with the same length of data, or queries on a fixed set of variables. In this paper, we propose an efficient and flexible subsequence search framework for massive MTS databases, that, for the first time, enables querying on any subset of variables with arbitrary time delays between them. We propose two algorithms to solve this problem: (1) a List Based Search (LBS) algorithm which uses sorted lists for indexing, and (2) a R*-tree Based Search (RBS) which uses Minimum Bounding Rectangles (MBR) to organize the subsequences. Both algorithms guarantee that all matching patterns within the specified thresholds will be returned (no false dismissals). The very few false alarms can be removed by a post-processing step. Since our framework is also capable of Univariate Time-Series (UTS) subsequence search, we first demonstrate the efficiency of our algorithms on several UTS datasets previously used in the literature. We follow this up with experiments using two large MTS databases from the aviation domain, each containing several millions of observations. Both these tests show that our algorithms have very high prune rates (>99%), thus requiring actual disk access for less than 1% of the observations. To the best of our knowledge, MTS subsequence search has never been attempted on datasets of the size we have used in this paper.
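The pruning guarantee of the R*-tree based search rests on one invariant: the minimum distance between two Minimum Bounding Rectangles never exceeds the distance between any points inside them, so discarding MBR pairs beyond the search threshold causes no false dismissals. Below is a small sketch of that MBR bound; the query and subsequences are toy data, not the paper's aviation datasets, and none of the indexing machinery is shown.

```python
import numpy as np

def mbr(X):
    """Minimum bounding rectangle of a (sub)sequence: per-variable min/max."""
    return X.min(axis=0), X.max(axis=0)

def mbr_min_dist(q_lo, q_hi, s_lo, s_hi):
    """Lower bound on the distance between any point of one MBR and any
    point of another: the per-dimension gap between the ranges, zero
    where the ranges overlap. If this bound already exceeds the search
    threshold, the subsequence can be pruned without reading its raw data."""
    gap = np.maximum(0.0, np.maximum(s_lo - q_hi, q_lo - s_hi))
    return float(np.linalg.norm(gap))

query = np.array([[0.0, 0.0], [1.0, 1.0]])
near = np.array([[0.5, 0.5], [1.5, 0.8]])     # overlapping MBR: bound is 0
far = np.array([[10.0, 10.0], [11.0, 12.0]])  # distant MBR: large bound
d_near = mbr_min_dist(*mbr(query), *mbr(near))
d_far = mbr_min_dist(*mbr(query), *mbr(far))
```

Because the bound is conservative, only the candidates that survive it (plus a few false alarms removed in post-processing) ever trigger disk access, which is the source of the >99% prune rates reported.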

  9. Optimal model-free prediction from multivariate time series.

    PubMed

    Runge, Jakob; Donner, Reik V; Kurths, Jürgen

    2015-05-01

    Forecasting a time series from multivariate predictors constitutes a challenging problem, especially using model-free approaches. Most techniques, such as nearest-neighbor prediction, quickly suffer from the curse of dimensionality and overfitting for more than a few predictors which has limited their application mostly to the univariate case. Therefore, selection strategies are needed that harness the available information as efficiently as possible. Since often the right combination of predictors matters, ideally all subsets of possible predictors should be tested for their predictive power, but the exponentially growing number of combinations makes such an approach computationally prohibitive. Here a prediction scheme that overcomes this strong limitation is introduced utilizing a causal preselection step which drastically reduces the number of possible predictors to the most predictive set of causal drivers making a globally optimal search scheme tractable. The information-theoretic optimality is derived and practical selection criteria are discussed. As demonstrated for multivariate nonlinear stochastic delay processes, the optimal scheme can even be less computationally expensive than commonly used suboptimal schemes like forward selection. The method suggests a general framework to apply the optimal model-free approach to select variables and subsequently fit a model to further improve a prediction or learn statistical dependencies. The performance of this framework is illustrated on a climatological index of El Niño Southern Oscillation.

  10. Optimal model-free prediction from multivariate time series

    NASA Astrophysics Data System (ADS)

    Runge, Jakob; Donner, Reik V.; Kurths, Jürgen

    2015-05-01

    Forecasting a time series from multivariate predictors constitutes a challenging problem, especially using model-free approaches. Most techniques, such as nearest-neighbor prediction, quickly suffer from the curse of dimensionality and overfitting for more than a few predictors which has limited their application mostly to the univariate case. Therefore, selection strategies are needed that harness the available information as efficiently as possible. Since often the right combination of predictors matters, ideally all subsets of possible predictors should be tested for their predictive power, but the exponentially growing number of combinations makes such an approach computationally prohibitive. Here a prediction scheme that overcomes this strong limitation is introduced utilizing a causal preselection step which drastically reduces the number of possible predictors to the most predictive set of causal drivers making a globally optimal search scheme tractable. The information-theoretic optimality is derived and practical selection criteria are discussed. As demonstrated for multivariate nonlinear stochastic delay processes, the optimal scheme can even be less computationally expensive than commonly used suboptimal schemes like forward selection. The method suggests a general framework to apply the optimal model-free approach to select variables and subsequently fit a model to further improve a prediction or learn statistical dependencies. The performance of this framework is illustrated on a climatological index of El Niño Southern Oscillation.

  11. Multivariate Bayesian modeling of known and unknown causes of events--an application to biosurveillance.

    PubMed

    Shen, Yanna; Cooper, Gregory F

    2012-09-01

    This paper investigates Bayesian modeling of known and unknown causes of events in the context of disease-outbreak detection. We introduce a multivariate Bayesian approach that models multiple evidential features of every person in the population. This approach models and detects (1) known diseases (e.g., influenza and anthrax) by using informative prior probabilities and (2) unknown diseases (e.g., a new, highly contagious respiratory virus that has never been seen before) by using relatively non-informative prior probabilities. We report the results of simulation experiments which indicate that this modeling method can improve the detection of new disease outbreaks in a population. A contribution of this paper is that it introduces a multivariate Bayesian approach for jointly modeling both known and unknown causes of events. Such modeling has general applicability in domains where the space of known causes is incomplete.
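
    The known/unknown-cause idea can be illustrated with a toy posterior (all numbers are invented and this is not the paper's model): known diseases carry informative likelihoods for the observed evidence, the unknown cause carries a flat, relatively non-informative one, and Bayes' rule mixes them.

```python
# Toy posterior over known and unknown causes of an observed event.
# Priors and likelihoods are assumed numbers for illustration only.

def posterior(priors, likelihoods):
    """Normalize prior * likelihood over all candidate causes."""
    joint = {c: priors[c] * likelihoods[c] for c in priors}
    z = sum(joint.values())
    return {c: p / z for c, p in joint.items()}

priors = {"influenza": 0.6, "anthrax": 0.01, "unknown": 0.39}
like   = {"influenza": 0.3, "anthrax": 0.9, "unknown": 0.5}  # P(evidence | cause)
post = posterior(priors, like)
```

    When the evidence fits no known disease well, the flat likelihood of the "unknown" cause lets its posterior mass grow, which is the mechanism that flags a never-before-seen outbreak.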

  12. Optimizing functional network representation of multivariate time series.

    PubMed

    Zanin, Massimiliano; Sousa, Pedro; Papo, David; Bajo, Ricardo; García-Prieto, Juan; del Pozo, Francisco; Menasalvas, Ernestina; Boccaletti, Stefano

    2012-01-01

    By combining complex network theory and data mining techniques, we provide objective criteria for optimization of the functional network representation of generic multivariate time series. In particular, we propose a method for the principled selection of the threshold value for functional network reconstruction from raw data, and for proper identification of the network's indicators that unveil the most discriminative information on the system for classification purposes. We illustrate our method by analysing networks of functional brain activity of healthy subjects, and patients suffering from Mild Cognitive Impairment, an intermediate stage between the expected cognitive decline of normal aging and the more pronounced decline of dementia. We discuss extensions of the scope of the proposed methodology to network engineering purposes, and to other data mining tasks.
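
    The reconstruction step the abstract starts from can be sketched minimally: pairwise correlations are thresholded into an adjacency matrix, and network indicators (here just link density) are computed. The paper's contribution is the principled choice of that threshold; in this toy it is left as an explicit parameter.

```python
# Toy functional-network reconstruction by thresholding a correlation
# matrix. Illustrative sketch only; the threshold is a free parameter.

def corr(a, b):
    """Pearson correlation of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

def adjacency(series, thr):
    """Link i--j whenever |corr| reaches the threshold."""
    n = len(series)
    return [[1 if i != j and abs(corr(series[i], series[j])) >= thr else 0
             for j in range(n)] for i in range(n)]

def link_density(A):
    """Fraction of possible links present in the network."""
    n = len(A)
    return sum(map(sum, A)) / (n * (n - 1))
```

    Sweeping `thr` and tracking indicators such as `link_density` is the raw material on which a principled, classification-driven threshold selection can then operate.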

  13. Optimizing Functional Network Representation of Multivariate Time Series

    PubMed Central

    Zanin, Massimiliano; Sousa, Pedro; Papo, David; Bajo, Ricardo; García-Prieto, Juan; Pozo, Francisco del; Menasalvas, Ernestina; Boccaletti, Stefano

    2012-01-01

    By combining complex network theory and data mining techniques, we provide objective criteria for optimization of the functional network representation of generic multivariate time series. In particular, we propose a method for the principled selection of the threshold value for functional network reconstruction from raw data, and for proper identification of the network's indicators that unveil the most discriminative information on the system for classification purposes. We illustrate our method by analysing networks of functional brain activity of healthy subjects, and patients suffering from Mild Cognitive Impairment, an intermediate stage between the expected cognitive decline of normal aging and the more pronounced decline of dementia. We discuss extensions of the scope of the proposed methodology to network engineering purposes, and to other data mining tasks. PMID:22953051

  14. Gaining Efficiency via Weighted Estimators for Multivariate Failure Time Data*

    PubMed

    Fan, Jianqing; Zhou, Yong; Cai, Jianwen; Chen, Min

    2009-06-01

    Multivariate failure time data arise frequently in survival analysis. A commonly used technique is the working independence estimator for marginal hazard models. Two natural questions are how to improve the efficiency of the working independence estimator and how to identify the situations under which such an estimator has high statistical efficiency. In this paper, three weighted estimators are proposed based on three different optimal criteria in terms of the asymptotic covariance of weighted estimators. Simplified closed-form solutions are found, which always outperform the working independence estimator. We also prove that the working independence estimator has high statistical efficiency when the asymptotic covariance of the derivatives of the partial log-likelihood functions is nearly exchangeable or diagonal. Simulations are conducted to compare the performance of the weighted estimators and the working independence estimator. A data set from the Busselton population health surveys is analyzed using the proposed estimators.

  15. Optimizing Functional Network Representation of Multivariate Time Series

    NASA Astrophysics Data System (ADS)

    Zanin, Massimiliano; Sousa, Pedro; Papo, David; Bajo, Ricardo; García-Prieto, Juan; Pozo, Francisco Del; Menasalvas, Ernestina; Boccaletti, Stefano

    2012-09-01

    By combining complex network theory and data mining techniques, we provide objective criteria for optimization of the functional network representation of generic multivariate time series. In particular, we propose a method for the principled selection of the threshold value for functional network reconstruction from raw data, and for proper identification of the network's indicators that unveil the most discriminative information on the system for classification purposes. We illustrate our method by analysing networks of functional brain activity of healthy subjects, and patients suffering from Mild Cognitive Impairment, an intermediate stage between the expected cognitive decline of normal aging and the more pronounced decline of dementia. We discuss extensions of the scope of the proposed methodology to network engineering purposes, and to other data mining tasks.

  16. F100 multivariable control synthesis program: Evaluation of a multivariable control using a real-time engine simulation

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Soeder, J. F.; Seldner, K.; Cwynar, D. S.

    1977-01-01

    The design, evaluation, and testing of a practical, multivariable, linear quadratic regulator control for the F100 turbofan engine were accomplished. NASA evaluation of the multivariable control logic and implementation are covered. The evaluation utilized a real time, hybrid computer simulation of the engine. Results of the evaluation are presented, and recommendations concerning future engine testing of the control are made. Results indicated that the engine testing of the control should be conducted as planned.

  17. Bayesian inference on risk differences: an application to multivariate meta-analysis of adverse events in clinical trials.

    PubMed

    Chen, Yong; Luo, Sheng; Chu, Haitao; Wei, Peng

    2013-05-01

    Multivariate meta-analysis is useful for combining evidence from independent studies which involve several comparisons among groups based on a single outcome. For binary outcomes, the commonly used statistical models for multivariate meta-analysis are multivariate generalized linear mixed effects models, which assume that risks, after some transformation, follow a multivariate normal distribution with possible correlations. In this article, we consider an alternative model for multivariate meta-analysis where the risks are modeled by the multivariate beta distribution proposed by Sarmanov (1966). This model has several attractive features compared to the conventional multivariate generalized linear mixed effects models, including a simple likelihood function, no need to specify a link function, and a closed-form expression of the distribution functions for study-specific risk differences. We investigate the finite sample performance of this model by simulation studies and illustrate its use with an application to multivariate meta-analysis of adverse events of tricyclic antidepressant treatment in clinical trials.

  18. Algorithms for the analysis of ensemble neural spiking activity using simultaneous-event multivariate point-process models

    PubMed Central

    Ba, Demba; Temereanca, Simona; Brown, Emery N.

    2014-01-01

    Understanding how ensembles of neurons represent and transmit information in the patterns of their joint spiking activity is a fundamental question in computational neuroscience. At present, analyses of spiking activity from neuronal ensembles are limited because multivariate point process (MPP) models cannot represent simultaneous occurrences of spike events at an arbitrarily small time resolution. Solo recently reported a simultaneous-event multivariate point process (SEMPP) model to correct this key limitation. In this paper, we show how Solo's discrete-time formulation of the SEMPP model can be efficiently fit to ensemble neural spiking activity using a multinomial generalized linear model (mGLM). Unlike existing approximate procedures for fitting the discrete-time SEMPP model, the mGLM is an exact algorithm. The MPP time-rescaling theorem can be used to assess model goodness-of-fit. We also derive a new marked point-process (MkPP) representation of the SEMPP model that leads to new thinning and time-rescaling algorithms for simulating an SEMPP stochastic process. These algorithms are much simpler than multivariate extensions of algorithms for simulating a univariate point process, and could not be arrived at without the MkPP representation. We illustrate the versatility of the SEMPP model by analyzing neural spiking activity from pairs of simultaneously-recorded rat thalamic neurons stimulated by periodic whisker deflections, and by simulating SEMPP data. In the data analysis example, the SEMPP model demonstrates that whisker motion significantly modulates simultaneous spiking activity at the 1 ms time scale and that the stimulus effect is more than one order of magnitude greater for simultaneous activity compared with non-simultaneous activity. Together, the mGLM, the MPP time-rescaling theorem and the MkPP representation of the SEMPP model offer a theoretically sound, practical tool for measuring joint spiking propensity in a neuronal ensemble. 
PMID:24575001

  19. Algorithms for the analysis of ensemble neural spiking activity using simultaneous-event multivariate point-process models.

    PubMed

    Ba, Demba; Temereanca, Simona; Brown, Emery N

    2014-01-01

    Understanding how ensembles of neurons represent and transmit information in the patterns of their joint spiking activity is a fundamental question in computational neuroscience. At present, analyses of spiking activity from neuronal ensembles are limited because multivariate point process (MPP) models cannot represent simultaneous occurrences of spike events at an arbitrarily small time resolution. Solo recently reported a simultaneous-event multivariate point process (SEMPP) model to correct this key limitation. In this paper, we show how Solo's discrete-time formulation of the SEMPP model can be efficiently fit to ensemble neural spiking activity using a multinomial generalized linear model (mGLM). Unlike existing approximate procedures for fitting the discrete-time SEMPP model, the mGLM is an exact algorithm. The MPP time-rescaling theorem can be used to assess model goodness-of-fit. We also derive a new marked point-process (MkPP) representation of the SEMPP model that leads to new thinning and time-rescaling algorithms for simulating an SEMPP stochastic process. These algorithms are much simpler than multivariate extensions of algorithms for simulating a univariate point process, and could not be arrived at without the MkPP representation. We illustrate the versatility of the SEMPP model by analyzing neural spiking activity from pairs of simultaneously-recorded rat thalamic neurons stimulated by periodic whisker deflections, and by simulating SEMPP data. In the data analysis example, the SEMPP model demonstrates that whisker motion significantly modulates simultaneous spiking activity at the 1 ms time scale and that the stimulus effect is more than one order of magnitude greater for simultaneous activity compared with non-simultaneous activity. Together, the mGLM, the MPP time-rescaling theorem and the MkPP representation of the SEMPP model offer a theoretically sound, practical tool for measuring joint spiking propensity in a neuronal ensemble.
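
    The discrete-time idea behind the SEMPP model can be illustrated with a toy: in each 1 ms bin, a pair of neurons produces one of four multinomial outcomes (no spike, neuron 1 only, neuron 2 only, simultaneous). The covariate-free multinomial MLE below is just the outcome frequencies; the paper fits these outcome probabilities with a multinomial GLM driven by stimulus and spiking-history covariates, which this sketch omits.

```python
# Toy discrete-time multinomial view of joint spiking for two neurons.
# Each time bin maps to one of four outcomes; with no covariates the
# multinomial MLE reduces to outcome frequencies.

from collections import Counter

def outcome(bin_pair):
    """Map a (neuron1, neuron2) spike indicator pair to an outcome label."""
    return {(0, 0): "none", (1, 0): "n1", (0, 1): "n2", (1, 1): "both"}[bin_pair]

def multinomial_mle(spikes1, spikes2):
    """Empirical probabilities of the four joint-spiking outcomes."""
    counts = Counter(outcome(p) for p in zip(spikes1, spikes2))
    n = len(spikes1)
    return {k: counts.get(k, 0) / n for k in ("none", "n1", "n2", "both")}
```

    Keeping "both" as its own outcome, rather than forbidding simultaneous events, is precisely the limitation of ordinary multivariate point-process models that the SEMPP formulation removes.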

  20. A statistical approach for segregating cognitive task stages from multivariate fMRI BOLD time series.

    PubMed

    Demanuele, Charmaine; Bähner, Florian; Plichta, Michael M; Kirsch, Peter; Tost, Heike; Meyer-Lindenberg, Andreas; Durstewitz, Daniel

    2015-01-01

    Multivariate pattern analysis can reveal new information from neuroimaging data to illuminate human cognition and its disturbances. Here, we develop a methodological approach, based on multivariate statistical/machine learning and time series analysis, to discern cognitive processing stages from functional magnetic resonance imaging (fMRI) blood oxygenation level dependent (BOLD) time series. We apply this method to data recorded from a group of healthy adults whilst performing a virtual reality version of the delayed win-shift radial arm maze (RAM) task. This task has been frequently used to study working memory and decision making in rodents. Using linear classifiers and multivariate test statistics in conjunction with time series bootstraps, we show that different cognitive stages of the task, as defined by the experimenter, namely, the encoding/retrieval, choice, reward and delay stages, can be statistically discriminated from the BOLD time series in brain areas relevant for decision making and working memory. Discrimination of these task stages was significantly reduced during poor behavioral performance in dorsolateral prefrontal cortex (DLPFC), but not in the primary visual cortex (V1). Experimenter-defined dissection of time series into class labels based on task structure was confirmed by an unsupervised, bottom-up approach based on Hidden Markov Models. Furthermore, we show that different groupings of recorded time points into cognitive event classes can be used to test hypotheses about the specific cognitive role of a given brain region during task execution. We found that whilst the DLPFC strongly differentiated between task stages associated with different memory loads, but not between different visual-spatial aspects, the reverse was true for V1. 
Our methodology illustrates how different aspects of cognitive information processing during one and the same task can be separated and attributed to specific brain regions based on information contained in
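
    The stage-discrimination step can be sketched with a stand-in: a nearest-centroid linear classifier over labeled time points plus a label-permutation test. The paper uses linear classifiers with time-series bootstraps; this toy ignores temporal autocorrelation and is illustrative only.

```python
# Toy stage discrimination: nearest-centroid classification of labeled
# multivariate time points, with a label-permutation significance test.

import random

def centroids(X, y):
    """Mean feature vector per class label."""
    cs = {}
    for lbl in set(y):
        pts = [x for x, l in zip(X, y) if l == lbl]
        cs[lbl] = [sum(col) / len(pts) for col in zip(*pts)]
    return cs

def accuracy(X, y):
    """Training accuracy of the nearest-centroid rule."""
    cs = centroids(X, y)
    def nearest(x):
        return min(cs, key=lambda l: sum((a - b) ** 2 for a, b in zip(x, cs[l])))
    return sum(nearest(x) == l for x, l in zip(X, y)) / len(y)

def perm_pvalue(X, y, n_perm=200, seed=0):
    """Fraction of label shuffles scoring at least as well as the data."""
    rng = random.Random(seed)
    obs = accuracy(X, y)
    hits = 0
    for _ in range(n_perm):
        yp = y[:]
        rng.shuffle(yp)
        if accuracy(X, yp) >= obs:
            hits += 1
    return (hits + 1) / (n_perm + 1)
```

    In the paper's setting the shuffling must respect the serial dependence of BOLD samples (hence time-series bootstraps); plain label permutation, as here, would be anti-conservative on real fMRI data.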

  1. A statistical approach for segregating cognitive task stages from multivariate fMRI BOLD time series

    PubMed Central

    Demanuele, Charmaine; Bähner, Florian; Plichta, Michael M.; Kirsch, Peter; Tost, Heike; Meyer-Lindenberg, Andreas; Durstewitz, Daniel

    2015-01-01

    Multivariate pattern analysis can reveal new information from neuroimaging data to illuminate human cognition and its disturbances. Here, we develop a methodological approach, based on multivariate statistical/machine learning and time series analysis, to discern cognitive processing stages from functional magnetic resonance imaging (fMRI) blood oxygenation level dependent (BOLD) time series. We apply this method to data recorded from a group of healthy adults whilst performing a virtual reality version of the delayed win-shift radial arm maze (RAM) task. This task has been frequently used to study working memory and decision making in rodents. Using linear classifiers and multivariate test statistics in conjunction with time series bootstraps, we show that different cognitive stages of the task, as defined by the experimenter, namely, the encoding/retrieval, choice, reward and delay stages, can be statistically discriminated from the BOLD time series in brain areas relevant for decision making and working memory. Discrimination of these task stages was significantly reduced during poor behavioral performance in dorsolateral prefrontal cortex (DLPFC), but not in the primary visual cortex (V1). Experimenter-defined dissection of time series into class labels based on task structure was confirmed by an unsupervised, bottom-up approach based on Hidden Markov Models. Furthermore, we show that different groupings of recorded time points into cognitive event classes can be used to test hypotheses about the specific cognitive role of a given brain region during task execution. We found that whilst the DLPFC strongly differentiated between task stages associated with different memory loads, but not between different visual-spatial aspects, the reverse was true for V1. 
Our methodology illustrates how different aspects of cognitive information processing during one and the same task can be separated and attributed to specific brain regions based on information contained in

  2. A Bayesian approach to joint analysis of multivariate longitudinal data and parametric accelerated failure time.

    PubMed

    Luo, Sheng

    2014-02-20

    Impairment caused by Parkinson's disease (PD) is multidimensional (e.g., sensoria, functions, and cognition) and progressive. Its multidimensional nature precludes a single outcome to measure disease progression. Clinical trials of PD use multiple categorical and continuous longitudinal outcomes to assess the treatment effects on overall improvement. A terminal event such as death or dropout can stop the follow-up process. Moreover, the time to the terminal event may be dependent on the multivariate longitudinal measurements. In this article, we consider a joint random-effects model for the correlated outcomes. A multilevel item response theory model is used for the multivariate longitudinal outcomes and a parametric accelerated failure time model is used for the failure time because of the violation of proportional hazard assumption. These two models are linked via random effects. The Bayesian inference via MCMC is implemented in 'BUGS' language. Our proposed method is evaluated by a simulation study and is applied to DATATOP study, a motivating clinical trial to determine if deprenyl slows the progression of PD. © 2013 The authors. Statistics in Medicine published by John Wiley & Sons, Ltd.

  3. A Framework and Algorithms for Multivariate Time Series Analytics (MTSA): Learning, Monitoring, and Recommendation

    ERIC Educational Resources Information Center

    Ngan, Chun-Kit

    2013-01-01

    Making decisions over multivariate time series is an important topic which has gained significant interest in the past decade. A time series is a sequence of data points which are measured and ordered over uniform time intervals. A multivariate time series is a set of multiple, related time series in a particular domain in which domain experts…

  4. A Framework and Algorithms for Multivariate Time Series Analytics (MTSA): Learning, Monitoring, and Recommendation

    ERIC Educational Resources Information Center

    Ngan, Chun-Kit

    2013-01-01

    Making decisions over multivariate time series is an important topic which has gained significant interest in the past decade. A time series is a sequence of data points which are measured and ordered over uniform time intervals. A multivariate time series is a set of multiple, related time series in a particular domain in which domain experts…

  5. Narrative event boundaries, reading times, and expectation.

    PubMed

    Pettijohn, Kyle A; Radvansky, Gabriel A

    2016-10-01

    During text comprehension, readers create mental representations of the described events, called situation models. When new information is encountered, these models must be updated or new ones created. Consistent with the event indexing model, previous studies have shown that when readers encounter an event shift, reading times often increase. However, such increases are not consistently observed. This paper addresses this inconsistency by examining the extent to which reading-time differences observed at event shifts reflect an unexpectedness in the narrative rather than processes involved in model updating. In two reassessments of prior work, event shifts known to increase reading time were rated as less expected, and expectedness ratings significantly predicted reading time. In three new experiments, participants read stories in which an event shift was or was not foreshadowed, thereby influencing expectedness of the shift. Experiment 1 revealed that readers do not expect event shifts, but foreshadowing eliminates this. Experiment 2 showed that foreshadowing does not affect identification of event shifts. Finally, Experiment 3 found that, although reading times increased when an event shift was not foreshadowed, they were not different from controls when it was. Moreover, responses to memory probes were slower following an event shift regardless of foreshadowing, suggesting that situation model updating had taken place. Overall, the results support the idea that previously observed reading time increases at event shifts reflect, at least in part, a reader's unexpected encounter with a shift rather than an increase in processing effort required to update a situation model.

  6. A climate-based multivariate extreme emulator of met-ocean-hydrological events for coastal flooding

    NASA Astrophysics Data System (ADS)

    Camus, Paula; Rueda, Ana; Mendez, Fernando J.; Tomas, Antonio; Del Jesus, Manuel; Losada, Iñigo J.

    2015-04-01

    Atmosphere-ocean general circulation models (AOGCMs) are useful for analyzing large-scale climate variability (long-term historical periods, future climate projections). However, applications such as coastal flood modeling require climate information at a finer scale. Moreover, flooding events depend on multiple climate conditions: waves, surge levels from the open ocean, and river discharge caused by precipitation. Therefore, a multivariate statistical downscaling approach is adopted because it reproduces the relationships between variables and has a low computational cost. The proposed method can be considered a hybrid approach which combines a probabilistic weather-type downscaling model with a stochastic weather generator component. Predictand distributions are reproduced by modeling their relationship with AOGCM predictors based on a physical division in weather types (Camus et al., 2012). The multivariate dependence structure of the predictand (extreme events) is introduced by linking the independent marginal distributions of the variables through a probabilistic copula regression (Ben Ayala et al., 2014). This hybrid approach is applied for the downscaling of AOGCM data to daily precipitation, maximum significant wave height, and storm surge at different locations along the Spanish coast. Reanalysis data are used to assess the proposed method. A common predictor for the three variables involved is classified using a regression-guided clustering algorithm. The most appropriate statistical model (generalized extreme value distribution, Pareto distribution) for daily conditions is fitted. Stochastic simulation of the present climate is performed, obtaining the set of hydraulic boundary conditions needed for high-resolution coastal flood modeling. References: Camus, P., Menéndez, M., Méndez, F.J., Izaguirre, C., Espejo, A., Cánovas, V., Pérez, J., Rueda, A., Losada, I.J., Medina, R. (2014b). A weather-type statistical downscaling framework for ocean wave climate. Journal of

  7. Multivariate space - time analysis of PRE-STORM precipitation

    NASA Technical Reports Server (NTRS)

    Polyak, Ilya; North, Gerald R.; Valdes, Juan B.

    1994-01-01

    This paper presents the methodologies and results of the multivariate modeling and two-dimensional spectral and correlation analysis of PRE-STORM rainfall gauge data. Estimated parameters of the models for the specific spatial averages clearly indicate the eastward and southeastward wave propagation of rainfall fluctuations. A relationship between the coefficients of the diffusion equation and the parameters of the stochastic model of rainfall fluctuations is derived that leads directly to the exclusive use of rainfall data to estimate advection speed (about 12 m/s) as well as other coefficients of the diffusion equation of the corresponding fields. The statistical methodology developed here can be used for confirmation of physical models by comparison of the corresponding second-moment statistics of the observed and simulated data, for generating multiple samples of any size, for solving the inverse problem of the hydrodynamic equations, and for application in some other areas of meteorological and climatological data analysis and modeling.

  8. Multivariate space - time analysis of PRE-STORM precipitation

    NASA Technical Reports Server (NTRS)

    Polyak, Ilya; North, Gerald R.; Valdes, Juan B.

    1994-01-01

    This paper presents the methodologies and results of the multivariate modeling and two-dimensional spectral and correlation analysis of PRE-STORM rainfall gauge data. Estimated parameters of the models for the specific spatial averages clearly indicate the eastward and southeastward wave propagation of rainfall fluctuations. A relationship between the coefficients of the diffusion equation and the parameters of the stochastic model of rainfall fluctuations is derived that leads directly to the exclusive use of rainfall data to estimate advection speed (about 12 m/s) as well as other coefficients of the diffusion equation of the corresponding fields. The statistical methodology developed here can be used for confirmation of physical models by comparison of the corresponding second-moment statistics of the observed and simulated data, for generating multiple samples of any size, for solving the inverse problem of the hydrodynamic equations, and for application in some other areas of meteorological and climatological data analysis and modeling.

  9. Bayesian inference on risk differences: an application to multivariate meta-analysis of adverse events in clinical trials

    PubMed Central

    Chen, Yong; Luo, Sheng; Chu, Haitao; Wei, Peng

    2013-01-01

    Multivariate meta-analysis is useful for combining evidence from independent studies which involve several comparisons among groups based on a single outcome. For binary outcomes, the commonly used statistical models for multivariate meta-analysis are multivariate generalized linear mixed effects models, which assume that risks, after some transformation, follow a multivariate normal distribution with possible correlations. In this article, we consider an alternative model for multivariate meta-analysis where the risks are modeled by the multivariate beta distribution proposed by Sarmanov (1966). This model has several attractive features compared to the conventional multivariate generalized linear mixed effects models, including a simple likelihood function, no need to specify a link function, and a closed-form expression of the distribution functions for study-specific risk differences. We investigate the finite sample performance of this model by simulation studies and illustrate its use with an application to multivariate meta-analysis of adverse events of tricyclic antidepressant treatment in clinical trials. PMID:23853700

  10. Visual pattern discovery in timed event data

    NASA Astrophysics Data System (ADS)

    Schaefer, Matthias; Wanner, Franz; Mansmann, Florian; Scheible, Christian; Stennett, Verity; Hasselrot, Anders T.; Keim, Daniel A.

    2011-01-01

    Business processes have tremendously changed the way large companies conduct their business: the integration of information systems into the workflows of their employees ensures a high service level and thus high customer satisfaction. One core aspect of business process engineering is events that steer the workflows and trigger internal processes. Strict requirements on interval-scaled temporal patterns, which are common in time series, are thereby relaxed through the ordinal character of such events. It is this additional degree of freedom that opens unexplored possibilities for visualizing event data. In this paper, we present a flexible and novel system to find significant events, event clusters, and event patterns. Each event is represented as a small rectangle, which is colored according to categorical, ordinal, or interval-scaled metadata. Depending on the analysis task, different layout functions are used to highlight either the ordinal character of the data or temporal correlations. The system has built-in features for ordering customers or event groups according to the similarity of their event sequences, temporal gap alignment, and stacking of co-occurring events. Two characteristically different case studies dealing with business process events and news articles demonstrate the capabilities of our system to explore event data.

  11. Multivariate Prediction Equations for HbA1c Lowering, Weight Change, and Hypoglycemic Events Associated with Insulin Rescue Medication in Type 2 Diabetes Mellitus: Informing Economic Modeling.

    PubMed

    Willis, Michael; Asseburg, Christian; Nilsson, Andreas; Johnsson, Kristina; Kartman, Bernt

    2017-03-01

    Type 2 diabetes mellitus (T2DM) is chronic and progressive, and the cost-effectiveness of new treatment interventions must be established over long time horizons. Given the limited durability of drugs, assumptions regarding downstream rescue medication can drive results. Especially for insulin, for which treatment effects and adverse events are known to depend on patient characteristics, this can be problematic for health economic evaluation involving modeling. To estimate parsimonious multivariate equations of treatment effects and hypoglycemic event risks for use in parameterizing insulin rescue therapy in model-based cost-effectiveness analysis. Clinical evidence for insulin use in T2DM was identified in PubMed and from published reviews and meta-analyses. Study and patient characteristics, treatment effects, and adverse event rates were extracted, and the data were used to estimate parsimonious treatment effect and hypoglycemic event risk equations using multivariate regression analysis. Data from 91 studies featuring 171 usable study arms were identified, mostly for premix and basal insulin types. Multivariate prediction equations for glycated hemoglobin A1c lowering and weight change were estimated separately for insulin-naive and insulin-experienced patients. Goodness of fit (R²) was generally good for both outcomes, ranging from 0.44 to 0.84. Multivariate prediction equations for symptomatic, nocturnal, and severe hypoglycemic events were also estimated, though considerable heterogeneity in definitions limits their usefulness. Parsimonious and robust multivariate prediction equations were estimated for glycated hemoglobin A1c lowering and weight change, separately for insulin-naive and insulin-experienced patients. Using these in economic simulation modeling in T2DM can improve realism and flexibility in modeling insulin rescue medication. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  12. Multivariate real-time assessment of droughts via copula-based multi-site Hazard Trajectories and Fans

    NASA Astrophysics Data System (ADS)

    Salvadori, G.; De Michele, C.

    2015-07-01

    Droughts, like floods, are among the most dangerous and costly expressions of the water cycle, with huge impacts on society and the built environment. Droughts are events occurring over a certain region, lasting several weeks or months, and involving multiple variables: thus, a multivariate, multi-site approach is most appropriate for their statistical characterization. In this methodological work, hydrological droughts are considered, and a multivariate approach is proposed, taking duration and average intensity as the relevant variables. A multivariate, multi-site frequency analysis is presented, based on the Theory of Copulas and the joint Survival Kendall's Return Periods, by investigating the historical drought episodes that occurred at five main river sections of the Po river (Northern Italy), the most important Italian basin. The tool of the Dynamic Return Period is used, and the new concepts of Hazard Trajectories and Fans are introduced, in order to provide useful indications for a valuable multi-site real-time assessment of droughts.

  13. A multi-variance analysis in the time domain

    NASA Technical Reports Server (NTRS)

    Walter, Todd

    1993-01-01

    Recently a new technique for characterizing the noise processes affecting oscillators was introduced. This technique minimizes the difference between the estimates of several different variances and their values as predicted by the standard power law model of noise. The method outlined makes two significant advancements: it uses exclusively time domain variances so that deterministic parameters such as linear frequency drift may be estimated, and it correctly fits the estimates using the chi-square distribution. These changes permit a more accurate fitting at long time intervals where there is the least information. This technique was applied to both simulated and real data with excellent results.
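
    The multi-variance technique above fits several time-domain variance estimates against the standard power-law noise model. As a hedged illustration (the overlapping-estimator choice and the names below are mine, not from the paper), one such time-domain variance, the overlapping Allan variance, can be sketched as:

```python
def allan_variance(y, m):
    """Overlapping Allan variance of fractional-frequency data y
    at averaging factor m (in samples)."""
    # means of all overlapping windows of length m
    means = [sum(y[i:i + m]) / m for i in range(len(y) - m + 1)]
    # first differences of window means, m samples apart
    diffs = [means[i + m] - means[i] for i in range(len(means) - m)]
    return sum(d * d for d in diffs) / (2 * len(diffs))
```

    For white frequency noise the estimate decays roughly as 1/m across averaging factors, which is the kind of power-law signature such a chi-square fit would exploit.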

  14. Decoupling in linear time-varying multivariable systems

    NASA Technical Reports Server (NTRS)

    Sankaran, V.

    1973-01-01

    The necessary and sufficient conditions for the decoupling of an m-input, m-output, linear time-varying dynamical system by state variable feedback are described. The class of feedback matrices which decouple the system is illustrated. Systems which do not satisfy these conditions are described, and systems with disturbances are considered. Some examples are given to clarify the results.

  15. Dynamic Factor Analysis of Nonstationary Multivariate Time Series.

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; And Others

    1992-01-01

    The dynamic factor model proposed by P. C. Molenaar (1985) is exhibited, and a dynamic nonstationary factor model (DNFM) is constructed with latent factor series that have time-varying mean functions. The use of a DNFM is illustrated using data from a television viewing habits study. (SLD)

  16. When univariate model-free time series prediction is better than multivariate

    NASA Astrophysics Data System (ADS)

    Chayama, Masayoshi; Hirata, Yoshito

    2016-07-01

    The delay coordinate method is known to be a practically useful technique for reconstructing the states of an observed system. While this method is theoretically supported by Takens' embedding theorem concerning observations of a scalar time series, we can extend the method to include a multivariate time series. It is often assumed that a better prediction can be obtained using a multivariate time series than by using a scalar time series. However, a multivariate time series contains various types of information, and it may be difficult to extract the information that is useful for predicting the states. Thus, univariate prediction may sometimes be superior to multivariate prediction. Here, we compare univariate model-free time series predictions with multivariate ones, and demonstrate that univariate model-free prediction is better than the multivariate one when the prediction steps are small, while multivariate prediction performs better when the prediction steps become larger. We show the validity of the former finding by using artificial datasets generated from the Lorenz 96 models and a real solar irradiance dataset. The results indicate that it is possible to determine which method is the best choice by considering how far into the future we want to predict.
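
    As a rough sketch of the univariate, model-free approach discussed above (the names and the nearest-neighbor averaging scheme are illustrative assumptions, not the authors' exact algorithm), delay-coordinate prediction can be implemented as:

```python
import math

def delay_embed(series, dim, tau):
    """Reconstruct state vectors from a scalar series via delay coordinates."""
    return [tuple(series[i - k * tau] for k in range(dim))
            for i in range((dim - 1) * tau, len(series))]

def predict_next(series, dim=3, tau=1, k=3):
    """Model-free one-step prediction: average the successors of the
    k nearest embedded neighbors of the current state."""
    states = delay_embed(series, dim, tau)
    query = states[-1]
    candidates = states[:-1]          # states with a known successor
    offset = (dim - 1) * tau          # series index of each state's head
    dists = sorted((math.dist(s, query), i)
                   for i, s in enumerate(candidates))
    nearest = [i for _, i in dists[:k]]
    return sum(series[offset + i + 1] for i in nearest) / k
```

    On a smooth, densely sampled signal the nearest embedded states have successors close to the true next value, so the short-horizon forecast is accurate without any explicit model.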

  17. A multivariate time series approach to modeling and forecasting demand in the emergency department.

    PubMed

    Jones, Spencer S; Evans, R Scott; Allen, Todd L; Thomas, Alun; Haug, Peter J; Welch, Shari J; Snow, Gregory L

    2009-02-01

    The goals of this investigation were to study the temporal relationships between the demands for key resources in the emergency department (ED) and the inpatient hospital, and to develop multivariate forecasting models. Hourly data were collected from three diverse hospitals for the year 2006. Descriptive analysis and model fitting were carried out using graphical and multivariate time series methods. Multivariate models were compared to a univariate benchmark model in terms of their ability to provide out-of-sample forecasts of ED census and the demands for diagnostic resources. Descriptive analyses revealed little temporal interaction between the demand for inpatient resources and the demand for ED resources at the facilities considered. Multivariate models provided more accurate forecasts of ED census and of the demands for diagnostic resources. Our results suggest that multivariate time series models can be used to reliably forecast ED patient census; however, forecasts of the demands for diagnostic resources were not sufficiently reliable to be useful in the clinical setting.

  18. Multivariable testing cuts door-to-doc times by 24%.

    PubMed

    2007-01-01

    You can institute several process improvements in your ED in a relatively short time by testing several new ideas at once and determining those most likely to be successful. Brainstorm any and all ideas with your staff and then reduce the list to those most likely to be of benefit. Be sure to test ideas in combination as well as separately; sometimes the synergy of two ideas will make both stronger. Assign the charge nurse for each shift with the responsibility of seeing that the new ideas are carried out correctly.

  19. Detecting unitary events without discretization of time.

    PubMed

    Grün, S; Diesmann, M; Grammont, F; Riehle, A; Aertsen, A

    1999-12-15

    In earlier studies we developed the 'Unitary Events' analysis (Grün S. Unitary Joint-Events in Multiple-Neuron Spiking Activity: Detection, Significance and Interpretation. Reihe Physik, Band 60. Thun, Frankfurt/Main: Verlag Harri Deutsch, 1996.) to detect the presence of conspicuous spike coincidences in multiple single unit recordings and to evaluate their statistical significance. The method enabled us to study the relation between spike synchronization and behavioral events (Riehle A, Grün S, Diesmann M, Aertsen A. Spike synchronization and rate modulation differentially involved in motor cortical function. Science 1997;278:1950-1953.). There is recent experimental evidence that the timing accuracy of coincident spiking events, which might be relevant for higher brain function, may be in the range of 1-5 ms. To detect coincidences on that time scale, we sectioned the observation interval into short disjunct time slices ('bins'). Unitary Events analysis of this discretized process demonstrated that coincident events can indeed be reliably detected. However, the method loses sensitivity for higher temporal jitter of the events constituting the coincidences (Grün S. Unitary Joint-Events in Multiple-Neuron Spiking Activity: Detection, Significance and Interpretation. Reihe Physik, Band 60. Thun, Frankfurt/Main: Verlag Harri Deutsch, 1996.). Here we present a new approach, the 'multiple shift' method (MS), which overcomes the need for binning and treats the data at their (original) high time resolution (typically 1 ms, or better). Technically, coincidences are detected by shifting the spike trains against each other over the range of allowed coincidence width and integrating the number of exact coincidences (on the time resolution of the data) over all shifts. We found that the new method enhances the sensitivity for coincidences with temporal jitter. Both methods are outlined and compared on the basis of their analytical description and their application on
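
    The shifting-and-counting core of the multiple shift method can be sketched as follows; spike times are assumed to be integers on the recording's time grid, and the function name is mine, not from the paper:

```python
def multiple_shift_coincidences(train_a, train_b, max_shift):
    """'Multiple shift' coincidence count: shift train_b against train_a
    over all lags within +/- max_shift (in time-resolution steps) and sum
    the exact coincidences, i.e. count spike pairs within the allowed
    coincidence width without any binning."""
    b = set(train_b)
    return sum(1 for s in range(-max_shift, max_shift + 1)
               for t in train_a if t + s in b)
```

    For example, trains [10, 50, 90] and [12, 49, 200] with an allowed width of 3 steps yield two coincidences (the 10/12 and 50/49 pairs), and the count is symmetric in the two trains.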

  20. Tidal Disruption Events Across Cosmic Time

    NASA Astrophysics Data System (ADS)

    Fialkov, Anastasia; Loeb, Abraham

    2017-01-01

    Tidal disruption events (TDEs) of stars by single or binary super-massive black holes illuminate the environment around quiescent black holes in galactic nuclei, allowing us to probe dormant black holes. We predict the TDE rates expected to be detected by next-generation X-ray surveys. We include events sourced by both single and binary super-massive black holes, assuming that 10% of TDEs lead to the formation of relativistic jets and are therefore observable to higher redshifts. Assigning the Eddington luminosity to each event, we show that if the occupation fraction of intermediate-mass black holes is high, more than 90% of the brightest TDEs might be associated with merging black holes, which are potential sources for eLISA. Next-generation telescopes with improved sensitivities should probe dim local TDEs as well as bright events at high redshifts. We show that an instrument which is 50 times more sensitive than the Swift Burst Alert Telescope (BAT) is expected to trigger ~10 times more events than BAT. The majority of these events originate at low redshifts (z<0.5) if the occupation fraction of IMBHs is high, and at high redshift (z>2) if it is low.

  1. Discrete Events as Units of Perceived Time

    ERIC Educational Resources Information Center

    Liverence, Brandon M.; Scholl, Brian J.

    2012-01-01

    In visual images, we perceive both space (as a continuous visual medium) and objects (that inhabit space). Similarly, in dynamic visual experience, we perceive both continuous time and discrete events. What is the relationship between these units of experience? The most intuitive answer may be similar to the spatial case: time is perceived as an…

  3. Multivariate statistical modelling of compound events via pair-copula constructions: analysis of floods in Ravenna (Italy)

    NASA Astrophysics Data System (ADS)

    Bevacqua, Emanuele; Maraun, Douglas; Hobæk Haff, Ingrid; Widmann, Martin; Vrac, Mathieu

    2017-06-01

    Compound events (CEs) are multivariate extreme events in which the individual contributing variables may not be extreme themselves, but their joint - dependent - occurrence causes an extreme impact. Conventional univariate statistical analysis cannot give accurate information regarding the multivariate nature of these events. We develop a conceptual model, implemented via pair-copula constructions, which allows for the quantification of the risk associated with compound events in present-day and future climate, as well as the uncertainty estimates around such risk. The model includes predictors, which could represent for instance meteorological processes that provide insight into both the involved physical mechanisms and the temporal variability of compound events. Moreover, this model enables multivariate statistical downscaling of compound events. Downscaling is required to extend the compound events' risk assessment to the past or future climate, where climate models either do not simulate realistic values of the local variables driving the events or do not simulate them at all. Based on the developed model, we study compound floods, i.e. joint storm surge and high river runoff, in Ravenna (Italy). To explicitly quantify the risk, we define the impact of compound floods as a function of sea and river levels. We use meteorological predictors to extend the analysis to the past, and get a more robust risk analysis. We quantify the uncertainties of the risk analysis, observing that they are very large due to the shortness of the available data, though this may also be the case in other studies where they have not been estimated. Ignoring the dependence between sea and river levels would result in an underestimation of risk; in particular, the expected return period of the highest compound flood observed increases from about 20 to 32 years when switching from the dependent to the independent case.
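
    To illustrate the paper's point that ignoring dependence underestimates risk, here is a hedged sketch (the function names and the choice of the Clayton family are mine; the paper uses pair-copula constructions) that samples a dependent pair of uniform margins and estimates the probability of a joint exceedance:

```python
import random

def clayton_sample(theta, n, seed=42):
    """Draw n pairs (u, v) from a Clayton copula via conditional inversion:
    v = (u^-theta * (t^(-theta/(1+theta)) - 1) + 1)^(-1/theta), t ~ U(0,1)."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        u, t = rng.random(), rng.random()
        v = (u ** -theta * (t ** (-theta / (1 + theta)) - 1) + 1) ** (-1 / theta)
        pairs.append((u, v))
    return pairs

def p_joint_exceed(pairs, q=0.9):
    """Empirical probability that both margins exceed their q-quantile."""
    return sum(1 for u, v in pairs if u > q and v > q) / len(pairs)
```

    With theta = 3 the empirical joint exceedance probability above the 0.9 quantiles comes out near 0.03, roughly three times the 0.01 that an independence assumption would give, which is exactly the kind of risk underestimation the abstract warns about.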

  4. Globally disruptive events show predictable timing patterns

    NASA Astrophysics Data System (ADS)

    Gillman, Michael P.; Erenler, Hilary E.

    2017-01-01

    Globally disruptive events include asteroid/comet impacts, large igneous provinces and glaciations, all of which have been considered as contributors to mass extinctions. Understanding the overall relationship between the timings of the largest extinctions and their potential proximal causes remains one of science's great unsolved mysteries. Cycles of about 60 Myr in both fossil diversity and environmental data suggest external drivers such as the passage of the Solar System through the galactic plane. While cyclic phenomena are recognized statistically, a lack of coherent mechanisms and a failure to link key events has hampered wider acceptance of multi-million year periodicity and its relevance to earth science and evolution. The generation of a robust predictive model of timings, with a clear plausible primary mechanism, would signal a paradigm shift. Here, we present a model of the timings of globally disruptive events and a possible explanation of their ultimate cause. The proposed model is a symmetrical pattern of 63 Myr sequences around a central value, interpreted as the occurrence of events along, and parallel to, the galactic midplane. The symmetry is consistent with multiple dark matter disks, aligned parallel to the midplane. One implication of the precise pattern of timings and the underlying physical model is the ability to predict future events, such as a major extinction in 1-2 Myr.

  5. Risk factors for ventilator-associated events: a case-control multivariable analysis.

    PubMed

    Lewis, Sarah C; Li, Lingling; Murphy, Michael V; Klompas, Michael

    2014-08-01

    The Centers for Disease Control and Prevention recently released new surveillance definitions for ventilator-associated events, including the new entities of ventilator-associated conditions and infection-related ventilator-associated complications. Both ventilator-associated conditions and infection-related ventilator-associated complications are associated with prolonged mechanical ventilation and hospital death, but little is known about their risk factors and how best to prevent them. We sought to identify risk factors for ventilator-associated conditions and infection-related ventilator-associated complications. Retrospective case-control study. Medical, surgical, cardiac, and neuroscience units of a tertiary care teaching hospital. One hundred ten patients with ventilator-associated conditions matched to 110 controls without ventilator-associated conditions on the basis of age, sex, ICU type, comorbidities, and duration of mechanical ventilation prior to ventilator-associated conditions. None. We compared cases with controls with regard to demographics, comorbidities, ventilator bundle adherence rates, sedative exposures, routes of nutrition, blood products, fluid balance, and modes of ventilatory support. We repeated the analysis for the subset of patients with infection-related ventilator-associated complications and their controls. Case and control patients were well matched on baseline characteristics. On multivariable logistic regression, significant risk factors for ventilator-associated conditions were mandatory modes of ventilation (odds ratio, 3.4; 95% CI, 1.6-8.0) and positive fluid balances (odds ratio, 1.2 per L positive; 95% CI, 1.0-1.4). Possible risk factors for infection-related ventilator-associated complications were starting benzodiazepines prior to intubation (odds ratio, 5.0; 95% CI, 1.3-29), total opioid exposures (odds ratio, 3.3 per 100 μg fentanyl equivalent/kg; 95% CI, 0.90-16), and paralytic medications (odds ratio, 2.3; 95% CI, 0

  6. It's T time: A study on the return period of multivariate problems

    NASA Astrophysics Data System (ADS)

    Michailidi, Eleni Maria; Balistrocchi, Matteo; Bacchi, Baldassare

    2016-04-01

    variables: hydrograph's peak flow, volume and shape. Consequently, a multivariate framework is needed for a more realistic view of the matter at hand. In recent years, the application of copula functions has facilitated overcoming the inadequacies of multivariate distributions as the problem is handled from two non-interwinding aspects: the dependence structure of the pair of variables and the marginal distributions. The main objective of this study is to investigate whether it is possible to find, in a multivariate space, a region where all the multivariate events produce 'risk' lower or greater than a fixed mean inter-occurrence of failures of one time every T-years. Preliminary results seem to confirm that it is impossible to obtain uniqueness in the definition.

  7. Granger Causality in Multivariate Time Series Using a Time-Ordered Restricted Vector Autoregressive Model

    NASA Astrophysics Data System (ADS)

    Siggiridou, Elsa; Kugiumtzis, Dimitris

    2016-04-01

    Granger causality has been used for the investigation of the interdependence structure of the underlying systems of multivariate time series. In particular, the direct causal effects are commonly estimated by the conditional Granger causality index (CGCI). In the presence of many observed variables and relatively short time series, CGCI may fail because it is based on vector autoregressive models (VAR) involving a large number of coefficients to be estimated. In this work, the VAR is restricted by a scheme that modifies the recently developed method of backward-in-time selection (BTS) of the lagged variables, and the CGCI is combined with BTS. Further, the proposed approach is compared favorably to other restricted VAR representations, such as the top-down strategy, the bottom-up strategy, and the least absolute shrinkage and selection operator (LASSO), in terms of sensitivity and specificity of CGCI. This is shown by using simulations of linear and nonlinear, low- and high-dimensional systems and different time series lengths. For nonlinear systems, CGCI from the restricted VAR representations are compared with analogous nonlinear causality indices. Further, CGCI in conjunction with BTS and other restricted VAR representations is applied to multi-channel scalp electroencephalogram (EEG) recordings of epileptic patients containing epileptiform discharges. CGCI on the restricted VAR, and BTS in particular, could track the changes in brain connectivity before, during and after epileptiform discharges, which was not possible using the full VAR representation.
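
    A minimal order-1 sketch of the Granger causality idea underlying CGCI (this is not the restricted-VAR/BTS method itself, and the names are illustrative): fit an autoregression of x with and without the past of y, and compare residual variances.

```python
import math

def ols_rss(X, y):
    """Residual sum of squares of ordinary least squares, solving the
    normal equations X'X b = X'y by Gaussian elimination with pivoting."""
    p = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(p)] for i in range(p)]
    b = [sum(row[i] * t for row, t in zip(X, y)) for i in range(p)]
    for i in range(p):
        piv = max(range(i, p), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, p):
            f = A[r][i] / A[i][i]
            for c in range(i, p):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coef = [0.0] * p
    for i in reversed(range(p)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, p))) / A[i][i]
    return sum((t - sum(c * v for c, v in zip(coef, row))) ** 2
               for row, t in zip(X, y))

def granger_index(x, y):
    """log(RSS_restricted / RSS_full) for predicting x from its own past,
    with vs. without the past of y; values well above 0 suggest that
    y Granger-causes x."""
    rows_r = [[1.0, x[t - 1]] for t in range(1, len(x))]
    rows_f = [[1.0, x[t - 1], y[t - 1]] for t in range(1, len(x))]
    target = x[1:]
    return math.log(ols_rss(rows_r, target) / ols_rss(rows_f, target))
```

    On data where y drives x but not vice versa, the index is large in the causal direction and near zero in the reverse direction.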

  8. Discrete events as units of perceived time.

    PubMed

    Liverence, Brandon M; Scholl, Brian J

    2012-06-01

    In visual images, we perceive both space (as a continuous visual medium) and objects (that inhabit space). Similarly, in dynamic visual experience, we perceive both continuous time and discrete events. What is the relationship between these units of experience? The most intuitive answer may be similar to the spatial case: time is perceived as an underlying medium, which is later segmented into discrete event representations. Here we explore the opposite possibility--that our subjective experience of time itself can be influenced by how durations are temporally segmented, beyond more general effects of change and complexity. We show that the way in which a continuous dynamic display is segmented into discrete units (via a path shuffling manipulation) greatly influences duration judgments, independent of psychophysical factors previously implicated in time perception, such as overall stimulus energy, attention and predictability. It seems that we may use the passage of discrete events--and the boundaries between them--in our subjective experience as part of the raw material for inferring the strength of the underlying "current" of time.

  9. Learning a Mahalanobis Distance-Based Dynamic Time Warping Measure for Multivariate Time Series Classification.

    PubMed

    Mei, Jiangyuan; Liu, Meizhu; Wang, Yuan-Fang; Gao, Huijun

    2016-06-01

    Multivariate time series (MTS) datasets broadly exist in numerous fields, including health care, multimedia, finance, and biometrics. How to classify MTS accurately has become a hot research topic since it is an important element in many computer vision and pattern recognition applications. In this paper, we propose a Mahalanobis distance-based dynamic time warping (DTW) measure for MTS classification. The Mahalanobis distance builds an accurate relationship between each variable and its corresponding category. It is utilized to calculate the local distance between vectors in MTS. Then we use DTW to align those MTS which are out of synchronization or with different lengths. After that, how to learn an accurate Mahalanobis distance function becomes another key problem. This paper establishes a LogDet divergence-based metric learning with triplet constraint model which can learn Mahalanobis matrix with high precision and robustness. Furthermore, the proposed method is applied on nine MTS datasets selected from the University of California, Irvine machine learning repository and Robert T. Olszewski's homepage, and the results demonstrate the improved performance of the proposed approach.
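
    A hedged sketch of the core computation described above (the paper additionally learns the Mahalanobis matrix via LogDet divergence-based metric learning; here the inverse covariance is assumed to be given, and the names are mine):

```python
def mahalanobis(x, y, inv_cov):
    """Mahalanobis distance between two feature vectors,
    sqrt((x - y)' * inv_cov * (x - y))."""
    d = [a - b for a, b in zip(x, y)]
    s = sum(d[i] * inv_cov[i][j] * d[j]
            for i in range(len(d)) for j in range(len(d)))
    return s ** 0.5

def dtw(seq_a, seq_b, inv_cov):
    """Dynamic time warping alignment cost between two multivariate
    sequences, using a Mahalanobis local distance between vectors."""
    n, m = len(seq_a), len(seq_b)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = mahalanobis(seq_a[i - 1], seq_b[j - 1], inv_cov)
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]
```

    With the identity matrix as inv_cov this reduces to Euclidean DTW; a time-warped copy of a sequence (e.g. with a repeated frame) still aligns at zero cost, which is the property that makes DTW suitable for out-of-synchronization MTS.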

  10. NavyTime: Event and Time Ordering from Raw Text

    DTIC Science & Technology

    2013-06-01

    completely labeled graph of events and times, it is not about true extraction, but matching human labeling decisions that were constrained by time and...relation ID and labeling. Results are shown in Table 3. Our system ranked 2nd of 4 systems. Our best performing setup uses trained classifiers for

  11. A Multivariate Model for the Meta-Analysis of Study Level Survival Data at Multiple Times

    ERIC Educational Resources Information Center

    Jackson, Dan; Rollins, Katie; Coughlin, Patrick

    2014-01-01

    Motivated by our meta-analytic dataset involving survival rates after treatment for critical leg ischemia, we develop and apply a new multivariate model for the meta-analysis of study level survival data at multiple times. Our data set involves 50 studies that provide mortality rates at up to seven time points, which we model simultaneously, and…

  12. A Sandwich-Type Standard Error Estimator of SEM Models with Multivariate Time Series

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Chow, Sy-Miin; Ong, Anthony D.

    2011-01-01

    Structural equation models are increasingly used as a modeling tool for multivariate time series data in the social and behavioral sciences. Standard error estimators of SEM models, originally developed for independent data, require modifications to accommodate the fact that time series data are inherently dependent. In this article, we extend a…

  14. Non-parametric estimation of gap time survival functions for ordered multivariate failure time data.

    PubMed

    Schaubel, Douglas E; Cai, Jianwen

    2004-06-30

    Times between sequentially ordered events (gap times) are often of interest in biomedical studies. For example, in a cancer study, the gap times from incidence-to-remission and remission-to-recurrence may be examined. Such data are usually subject to right censoring, and within-subject failure times are generally not independent. Statistical challenges in the analysis of the second and subsequent gap times include induced dependent censoring and non-identifiability of the marginal distributions. We propose a non-parametric method for constructing one-sample estimators of conditional gap-time specific survival functions. The estimators are uniformly consistent and, upon standardization, converge weakly to a zero-mean Gaussian process, with a covariance function which can be consistently estimated. Simulation studies reveal that the asymptotic approximations are appropriate for finite samples. Methods for confidence bands are provided. The proposed methods are illustrated on a renal failure data set, where the probabilities of transplant wait-listing and kidney transplantation are of interest.
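
    The standard one-sample Kaplan-Meier estimator is the building block that such gap-time methods extend; the sketch below is that standard estimator (not the paper's conditional gap-time estimator, which additionally handles induced dependent censoring), with names of my choosing:

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survival function from right-censored
    data; events[i] is True for an observed failure, False for censoring.
    Returns (time, S(time)) at each distinct failure time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    for t, grp in groupby(data, key=lambda pair: pair[0]):
        grp = list(grp)
        deaths = sum(1 for _, e in grp if e)
        if deaths:
            surv *= 1.0 - deaths / n_at_risk  # product-limit step
            curve.append((t, surv))
        n_at_risk -= len(grp)                 # failures and censorings leave
    return curve
```

    Censored observations reduce the risk set without producing a step, which is why naive empirical frequencies (and, in the gap-time setting, naive second-gap estimates) are biased.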

  15. Multivariate spatial analysis of a heavy rain event in a densely populated delta city

    NASA Astrophysics Data System (ADS)

    Gaitan, Santiago; ten Veldhuis, Marie-claire; Bruni, Guenda; van de Giesen, Nick

    2014-05-01

    Delta cities account for half of the world's population and host key infrastructure and services for the global economic growth. Due to the characteristic geography of delta areas, these cities face high vulnerability to extreme weather and pluvial flooding risks, which are expected to increase as climate change drives heavier rain events. Besides, delta cities are subjected to fast urban densification processes that progressively make them more vulnerable to pluvial flooding. Delta cities need to be adapted to better cope with this threat. The mechanism leading to damage after heavy rains is not completely understood. For instance, current research has shown that rain intensities and volumes can only partially explain the occurrence and localization of rain-related insurance claims (Spekkers et al., 2013). The goal of this paper is to provide further insights into spatial characteristics of the urban environment that can significantly be linked to pluvial-related flooding impacts. To that end, a study-case has been selected: on October 12 to 14 2013, a heavy rain event triggered pluvial floods in Rotterdam, a densely populated city which is undergoing multiple climate adaptation efforts and is located in the Meuse river Delta. While the average yearly precipitation in this city is around 800 mm, local rain gauge measurements ranged from approx. 60 to 130 mm just during these three days. More than 600 telephone complaints from citizens reported impacts related to rainfall. The registry of those complaints, which comprises around 300 calls made to the municipality and another 300 to the fire brigade, was made available for research. Other accessible information about this city includes a series of rainfall measurements with up to 1 min time-step at 7 different locations around the city, ground-based radar rainfall data (1 Km^2 spatial resolution and 5 min time-step), a digital elevation model (50 cm of horizontal resolution), a model of overland-flow paths, cadastral

  16. Reinforcement genetic approach to coefficient estimation for multivariable nonlinear discrete-time dynamical systems

    NASA Astrophysics Data System (ADS)

    Chang, Wei-Der; Yan, Jun-Juh

    2006-10-01

    In this paper, we propose a novel genetic algorithm (GA) with a multi-crossover fashion to estimate the associated coefficients for a class of nonlinear discrete-time multivariable dynamical systems. Unlike the traditional crossover method of using two chromosomes, the proposed method uses three chromosomes to achieve a crossover. By adjusting the search direction through crossing three chromosomes, fitter offspring can be produced. To solve the identification problem of multivariable nonlinear discrete-time systems, each estimated system coefficient represents a gene, and a collection of genes is referred to as a chromosome in the GA framework. The chromosomes in the population are then evolved using the proposed multi-crossover method. An illustrative example of multivariable nonlinear systems is given to demonstrate the effectiveness of the proposed method, as compared with the traditional crossover method.
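
    A hedged sketch of the three-chromosome crossover idea (the exact adjustment rule, the mutation term, and all names below are my assumptions, not the paper's operator), applied to a toy coefficient-estimation problem:

```python
import random

def multi_crossover(p1, p2, p3, rate=0.8, sigma=0.02):
    """Three-chromosome crossover sketch: move each gene of a base parent
    along the direction given by the other two parents, plus a small
    Gaussian mutation to keep the population from collapsing."""
    return [a + rate * random.random() * (b - c) + random.gauss(0, sigma)
            for a, b, c in zip(p1, p2, p3)]

def estimate_coefficients(data, pop_size=30, generations=200):
    """Toy GA estimating (a, b) of y = a*x + b from (x, y) samples,
    minimizing squared error with elitism plus three-parent crossover."""
    def sse(ch):
        return sum((y - (ch[0] * x + ch[1])) ** 2 for x, y in data)
    pop = [[random.uniform(-5, 5), random.uniform(-5, 5)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=sse)
        elite = pop[:pop_size // 3]
        children = [multi_crossover(random.choice(elite),
                                    random.choice(pop), random.choice(pop))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=sse)
```

    Using an elite chromosome as the base and the difference of two others as the adjusting direction is one plausible reading of the abstract's "adjusting direction by crossing three chromosomes"; it resembles differential evolution.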

  17. An Investigation of Multivariate Adaptive Regression Splines for Modeling and Analysis of Univariate and Semi-Multivariate Time Series Systems

    DTIC Science & Technology

    1991-09-01

    GRAFSTAT from IBM Research; I am grateful to Dr. Peter Welch for supplying GRAFSTAT. To P.A.W. Lewis, Thank you for your support, confidence and..."Multivariate Adaptive Regression Splines", Annals of Statistics, v. 19, no. 2, pp. 1-142, 1991. Gelb, A., Applied Optimal Estimation, M.I.T. Press, Cambridge

  18. Multivariable Model for Time to First Treatment in Patients With Chronic Lymphocytic Leukemia

    PubMed Central

    Wierda, William G.; O'Brien, Susan; Wang, Xuemei; Faderl, Stefan; Ferrajoli, Alessandra; Do, Kim-Anh; Garcia-Manero, Guillermo; Cortes, Jorge; Thomas, Deborah; Koller, Charles A.; Burger, Jan A.; Lerner, Susan; Schlette, Ellen; Abruzzo, Lynne; Kantarjian, Hagop M.; Keating, Michael J.

    2011-01-01

    Purpose The clinical course for patients with chronic lymphocytic leukemia (CLL) is diverse; some patients have indolent disease, never needing treatment, whereas others have aggressive disease requiring early treatment. We continue to use criteria for active disease to initiate therapy. Multivariable analysis was performed to identify prognostic factors independently associated with time to first treatment for patients with CLL. Patients and Methods Traditional laboratory, clinical prognostic, and newer prognostic factors such as fluorescent in situ hybridization (FISH), IGHV mutation status, and ZAP-70 expression evaluated at first patient visit to MD Anderson Cancer Center were correlated by multivariable analysis with time to first treatment. This multivariable model was used to develop a nomogram—a weighted tool to calculate 2- and 4-year probability of treatment and estimate median time to first treatment. Results There were 930 previously untreated patients who had traditional and new prognostic factors evaluated; they did not have active CLL requiring initiation of treatment within 3 months of first visit and were observed for time to first treatment. The following were independently associated with shorter time to first treatment: three involved lymph node sites, increased size of cervical lymph nodes, presence of 17p deletion or 11q deletion by FISH, increased serum lactate dehydrogenase, and unmutated IGHV mutation status. Conclusion We developed a multivariable model that incorporates traditional and newer prognostic factors to identify patients at high risk for progression to treatment. This model may be useful to identify patients for early interventional trials. PMID:21969505

  19. Analyzing Multiple Multivariate Time Series Data Using Multilevel Dynamic Factor Models.

    PubMed

    Song, Hairong; Zhang, Zhiyong

    2014-01-01

    Multivariate time series data offer researchers opportunities to study dynamics of various systems in social and behavioral sciences. Dynamic factor model (DFM), as an idiographic approach for studying intraindividual variability and dynamics, has typically been applied to time series data obtained from a single unit. When multivariate time series data are collected from multiple units, how to synchronize dynamical information becomes a salient issue. To address this issue, the current study presented a multilevel dynamic factor model (MDFM) that analyzes multiple multivariate time series in multilevel SEM frameworks. MDFM not only disentangles within- and between-person variability but also models dynamics of the intraindividual processes. To illustrate the uses of MDFMs, we applied lag0, lag1, and lag2 MDFMs to empirical data on affect collected from 205 dating couples who had at least 50 consecutive days of observations. We also considered a model extension where the dynamical coefficients were allowed to be randomly varying in the population. The empirical analysis yielded interesting findings regarding affect regulation and coregulation within couples, demonstrating promising uses of MDFMs in analyzing multiple multivariate time series. In the end, we discussed a number of methodological issues in the applications of MDFMs and pointed out possible directions for future research.

  20. Multivariate hydrological frequency analysis for extreme events using Archimedean copula. Case study: Lower Tunjuelo River basin (Colombia)

    NASA Astrophysics Data System (ADS)

    Gómez, Wilmar

    2017-04-01

    By analyzing the spatial and temporal variability of extreme precipitation events we can prevent or reduce the associated threat and risk. Many water resources projects require joint probability distributions of random variables such as precipitation intensity and duration, which cannot be assumed independent of each other. The problem of defining a probability model for observations of several dependent variables is greatly simplified by expressing the joint distribution in terms of the marginals through copulas. This document presents a general framework for bivariate and multivariate frequency analysis using Archimedean copulas for extreme hydroclimatological events such as severe storms. The analysis was conducted for precipitation events in the lower Tunjuelo River basin in Colombia. The results show that, for a joint study of intensity-duration-frequency, IDF curves can be obtained through copulas, providing more accurate and reliable information on design storms and the associated risks. They also show how copulas greatly simplify the study of multivariate distributions and introduce the concept of joint return period, used to represent hydrological design requirements properly in frequency analysis.
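    The joint return period central to this kind of analysis can be made concrete with a small sketch. The following stdlib-Python snippet (an illustration, not the paper's code; the dependence parameter theta = 2 and the 0.99 marginal quantiles are arbitrary assumptions) evaluates the Gumbel-Hougaard copula, a common Archimedean family for positively dependent intensity-duration pairs, and derives the "OR" and "AND" joint return periods from it:

    ```python
    import math

    def gumbel_copula_cdf(u, v, theta):
        """Gumbel-Hougaard copula C(u, v) with dependence parameter theta >= 1
        (theta = 1 reduces to independence, C = u * v)."""
        return math.exp(-(((-math.log(u)) ** theta + (-math.log(v)) ** theta) ** (1.0 / theta)))

    def joint_return_period_or(u, v, theta, mu=1.0):
        """'OR' joint return period: either variable exceeds its quantile.
        mu is the mean interarrival time of events (e.g. in years)."""
        return mu / (1.0 - gumbel_copula_cdf(u, v, theta))

    def joint_return_period_and(u, v, theta, mu=1.0):
        """'AND' joint return period: both variables exceed their quantiles."""
        return mu / (1.0 - u - v + gumbel_copula_cdf(u, v, theta))

    # 0.99 marginal non-exceedance for both intensity and duration,
    # moderate positive dependence (theta = 2).
    t_or = joint_return_period_or(0.99, 0.99, 2.0)
    t_and = joint_return_period_and(0.99, 0.99, 2.0)
    ```

    For 0.99 marginals the univariate return period is 100 years; the "OR" event recurs more often and the "AND" event less often, the bracketing property exploited when building copula-based IDF curves.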

  1. Detecting event-related changes of multivariate phase coupling in dynamic brain networks.

    PubMed

    Canolty, Ryan T; Cadieu, Charles F; Koepsell, Kilian; Ganguly, Karunesh; Knight, Robert T; Carmena, Jose M

    2012-04-01

    Oscillatory phase coupling within large-scale brain networks is a topic of increasing interest within systems, cognitive, and theoretical neuroscience. Evidence shows that brain rhythms play a role in controlling neuronal excitability and response modulation (Haider B, McCormick D. Neuron 62: 171-189, 2009) and regulate the efficacy of communication between cortical regions (Fries P. Trends Cogn Sci 9: 474-480, 2005) and distinct spatiotemporal scales (Canolty RT, Knight RT. Trends Cogn Sci 14: 506-515, 2010). In this view, anatomically connected brain areas form the scaffolding upon which neuronal oscillations rapidly create and dissolve transient functional networks (Lakatos P, Karmos G, Mehta A, Ulbert I, Schroeder C. Science 320: 110-113, 2008). Importantly, testing these hypotheses requires methods designed to accurately reflect dynamic changes in multivariate phase coupling within brain networks. Unfortunately, phase coupling between neurophysiological signals is commonly investigated using suboptimal techniques. Here we describe how a recently developed probabilistic model, phase coupling estimation (PCE; Cadieu C, Koepsell K. Neural Comput 44: 3107-3126, 2010), can be used to investigate changes in multivariate phase coupling, and we detail the advantages of this model over the commonly employed phase-locking value (PLV; Lachaux JP, Rodriguez E, Martinerie J, Varela F. Human Brain Map 8: 194-208, 1999). We show that the N-dimensional PCE is a natural generalization of the inherently bivariate PLV. Using simulations, we show that PCE accurately captures both direct and indirect (network mediated) coupling between network elements in situations where PLV produces erroneous results. We present empirical results on recordings from humans and nonhuman primates and show that the PCE-estimated coupling values are different from those using the bivariate PLV. Critically, on these empirical recordings, PCE output tends to be sparser than the PLVs, indicating fewer
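    For reference, the bivariate PLV that PCE generalizes is straightforward to compute: it is the magnitude of the mean phase-difference unit vector. A minimal stdlib-Python sketch (illustrative only; the phase series below are synthetic, not recorded data):

    ```python
    import cmath
    import math
    import random

    def phase_locking_value(phases_a, phases_b):
        """Bivariate PLV: magnitude of the mean phase-difference unit vector.
        1.0 means perfectly locked; values near 0 mean no consistent relation."""
        n = len(phases_a)
        mean_vec = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b)) / n
        return abs(mean_vec)

    # Perfectly coupled pair: a constant phase lag of pi/4.
    p1 = [0.1 * k for k in range(200)]
    p2 = [p + math.pi / 4 for p in p1]
    plv_locked = phase_locking_value(p1, p2)

    # Uncoupled pair: independent random phases give a PLV near 0.
    random.seed(0)
    r1 = [random.uniform(0.0, 2.0 * math.pi) for _ in range(2000)]
    r2 = [random.uniform(0.0, 2.0 * math.pi) for _ in range(2000)]
    plv_random = phase_locking_value(r1, r2)
    ```

    Because the PLV is inherently pairwise, it cannot distinguish direct coupling from coupling mediated through a third node, which is the gap the multivariate PCE model addresses.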

  2. Rotation in the Dynamic Factor Modeling of Multivariate Stationary Time Series.

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    2001-01-01

    Proposes a special rotation procedure for the exploratory dynamic factor model for stationary multivariate time series. The rotation procedure applies separately to each univariate component series of a q-variate latent factor series and transforms such a component, initially represented as white noise, into a univariate moving-average.…

  4. Comparison of procedures to assess non-linear and time-varying effects in multivariable models for survival data.

    PubMed

    Buchholz, Anika; Sauerbrei, Willi

    2011-03-01

    The focus of many medical applications is to model the impact of several factors on time to an event. A standard approach for such analyses is the Cox proportional hazards model. It assumes that the factors act linearly on the log hazard function (linearity assumption) and that their effects are constant over time (proportional hazards (PH) assumption). Variable selection is often required to specify a more parsimonious model aiming to include only variables with an influence on the outcome. As follow-up increases the effect of a variable often gets weaker, which means that it varies in time. However, spurious time-varying effects may also be introduced by mismodelling other parts of the multivariable model, such as omission of an important covariate or an incorrect functional form of a continuous covariate. These issues interact. To check whether the effect of a variable varies in time several tests for non-PH have been proposed. However, they are not sufficient to derive a model, as appropriate modelling of the shape of time-varying effects is required. In three examples we will compare five recently published strategies to assess whether and how the effects of covariates from a multivariable model vary in time. For practical use we will give some recommendations.

  5. Multivariate prediction of major adverse cardiac events after 9914 percutaneous coronary interventions in the north west of England

    PubMed Central

    Grayson, A D; Moore, R K; Jackson, M; Rathore, S; Sastry, S; Gray, T P; Schofield, I; Chauhan, A; Ordoubadi, F F; Prendergast, B; Stables, R H

    2006-01-01

    Objective To develop a multivariate prediction model for major adverse cardiac events (MACE) after percutaneous coronary interventions (PCIs) by using the North West Quality Improvement Programme in Cardiac Interventions (NWQIP) PCI Registry. Setting All NHS centres undertaking adult PCIs in north west England. Methods Retrospective analysis of prospectively collected data on 9914 consecutive patients undergoing adult PCI between 1 August 2001 and 31 December 2003. A multivariate logistic regression analysis was undertaken, with the forward stepwise technique, to identify independent risk factors for MACE. The area under the receiver operating characteristic (ROC) curve and the Hosmer‐Lemeshow goodness of fit statistic were calculated to assess the performance and calibration of the model, respectively. The statistical model was internally validated by using the technique of bootstrap resampling. Main outcome measures MACE, which were in‐hospital mortality, Q wave myocardial infarction, emergency coronary artery bypass graft surgery, and cerebrovascular accidents. Results Independent variables identified with an increased risk of developing MACE were advanced age, female sex, cerebrovascular disease, cardiogenic shock, priority, and treatment of the left main stem or graft lesions during PCI. The ROC curve for the predicted probability of MACE was 0.76, indicating a good discrimination power. The prediction equation was well calibrated, predicting well at all levels of risk. Bootstrapping showed that estimates were stable. Conclusions A contemporaneous multivariate prediction model for MACE after PCI was developed. The NWQIP tool allows calculation of the risk of MACE permitting meaningful risk adjusted comparisons of performance between hospitals and operators. PMID:16159983

  6. A discrete-time multiple event process survival mixture (MEPSUM) model

    PubMed Central

    Dean, Danielle O.; Bauer, Daniel J.; Shanahan, Michael J.

    2014-01-01

    Traditional survival analysis was developed to investigate the occurrence and timing of a single event, but researchers have recently begun to ask questions about the order and timing of multiple events. A multiple event process survival mixture model is developed here to analyze non-repeatable events measured in discrete-time that may occur at the same point in time. Building on both traditional univariate survival analysis and univariate survival mixture analysis, the model approximates the underlying multivariate distribution of hazard functions via a discrete-point finite mixture in which the mixing components represent prototypical patterns of event occurrence. The model is applied in an empirical analysis concerning transitions to adulthood, where the events under study include parenthood, marriage, beginning full-time work, and obtaining a college degree. Promising opportunities, as well as possible limitations of the model and future directions for research are discussed. PMID:24079930

  7. Multivariate time series modeling of short-term system scale irrigation demand

    NASA Astrophysics Data System (ADS)

    Perera, Kushan C.; Western, Andrew W.; George, Biju; Nawarathna, Bandara

    2015-12-01

    Travel time limits the ability of irrigation system operators to react to short-term irrigation demand fluctuations that result from variations in weather, including very hot periods and rainfall events, as well as the various other pressures and opportunities that farmers face. Short-term system-wide irrigation demand forecasts can assist in system operation. Here we developed a multivariate time series (ARMAX) model to forecast irrigation demands with respect to aggregated service point flows (IDCGi, ASP) and off-take regulator flows (IDCGi, OTR) across 5 command areas, which included the areas covered by four irrigation channels and the whole study area. These command-area-specific ARMAX models forecast 1-5 days ahead daily IDCGi, ASP and IDCGi, OTR using the real-time flow data recorded at the service points and the uppermost regulators and observed meteorological data collected from automatic weather stations. The model efficiency and the predictive performance were quantified using the root mean squared error (RMSE), Nash-Sutcliffe model efficiency coefficient (NSE), anomaly correlation coefficient (ACC) and mean square skill score (MSSS). During the evaluation period, NSE for IDCGi, ASP and IDCGi, OTR across the 5 command areas ranged from 0.78 to 0.98. These models were capable of generating skillful forecasts (MSSS ⩾ 0.5 and ACC ⩾ 0.6) of IDCGi, ASP and IDCGi, OTR for all 5 lead days, and the IDCGi, ASP and IDCGi, OTR forecasts were better than using the long-term monthly mean irrigation demand. Overall, the predictive performance of these ARMAX time series models was higher than that reported in almost all previous studies we are aware of. Further, the IDCGi, ASP and IDCGi, OTR forecasts have improved the operators' ability to react to near-future irrigation demand fluctuations, as the developed ARMAX time series models are self-adaptive and reflect short-term changes in irrigation demand with respect to the various pressures and opportunities that farmers face, such as
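    Two of the skill measures used above are simple to compute from paired observed and forecast series. A minimal stdlib-Python sketch (generic definitions, not the authors' code; the MSSS is written against an arbitrary reference forecast such as the long-term monthly mean demand):

    ```python
    def nse(obs, sim):
        """Nash-Sutcliffe efficiency: 1 = perfect forecast,
        0 = no better than the mean of the observations."""
        mean_obs = sum(obs) / len(obs)
        ss_err = sum((o - s) ** 2 for o, s in zip(obs, sim))
        ss_tot = sum((o - mean_obs) ** 2 for o in obs)
        return 1.0 - ss_err / ss_tot

    def msss(obs, sim, ref):
        """Mean square skill score of forecast `sim` relative to a reference
        forecast `ref` (e.g. the long-term monthly mean demand)."""
        def mse(forecast):
            return sum((o - f) ** 2 for o, f in zip(obs, forecast)) / len(obs)
        return 1.0 - mse(sim) / mse(ref)
    ```

    With the climatological mean as the reference forecast, MSSS coincides with NSE, which is why the two measures are often reported together.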

  8. Development and validation of multivariable predictive model for thromboembolic events in lymphoma patients.

    PubMed

    Antic, Darko; Milic, Natasa; Nikolovski, Srdjan; Todorovic, Milena; Bila, Jelena; Djurdjevic, Predrag; Andjelic, Bosko; Djurasinovic, Vladislava; Sretenovic, Aleksandra; Vukovic, Vojin; Jelicic, Jelena; Hayman, Suzanne; Mihaljevic, Biljana

    2016-10-01

    Lymphoma patients are at increased risk of thromboembolic events but thromboprophylaxis in these patients is largely underused. We sought to develop and validate a simple model, based on individual clinical and laboratory patient characteristics, that would designate lymphoma patients at risk for thromboembolic events. The study population included 1,820 lymphoma patients who were treated in the Lymphoma Departments at the Clinics of Hematology, Clinical Center of Serbia and Clinical Center Kragujevac. The model was developed using data from a derivation cohort (n = 1,236), and further assessed in the validation cohort (n = 584). Sixty-five patients (5.3%) in the derivation cohort and 34 (5.8%) patients in the validation cohort developed thromboembolic events. The variables independently associated with risk for thromboembolism were: previous venous and/or arterial events, mediastinal involvement, BMI > 30 kg/m(2), reduced mobility, extranodal localization, development of neutropenia, and hemoglobin level < 100 g/L. Based on the risk model score, the population was divided into the following risk categories: low (score 0-1), intermediate (score 2-3), and high (score >3). For patients classified at risk (intermediate and high-risk scores), the model produced a negative predictive value of 98.5%, positive predictive value of 25.1%, sensitivity of 75.4%, and specificity of 87.5%. A high-risk score had a positive predictive value of 65.2%. The diagnostic performance measures retained similar values in the validation cohort. The developed prognostic Thrombosis Lymphoma (ThroLy) score is more specific for lymphoma patients than any other available score targeting thrombosis in cancer patients. Am. J. Hematol. 91:1014-1019, 2016. © 2016 Wiley Periodicals, Inc.

  9. Multivariate pattern analysis of MEG and EEG: A comparison of representational structure in time and space.

    PubMed

    Cichy, Radoslaw Martin; Pantazis, Dimitrios

    2017-09-01

    Multivariate pattern analysis of magnetoencephalography (MEG) and electroencephalography (EEG) data can reveal the rapid neural dynamics underlying cognition. However, MEG and EEG have systematic differences in sampling neural activity. This poses the question to what degree such measurement differences consistently bias the results of multivariate analysis applied to MEG and EEG activation patterns. To investigate, we conducted a concurrent MEG/EEG study while participants viewed images of everyday objects. We applied multivariate classification analyses to MEG and EEG data, and compared the resulting time courses to each other, and to fMRI data for an independent evaluation in space. We found that both MEG and EEG revealed the millisecond spatio-temporal dynamics of visual processing with largely equivalent results. Beyond yielding convergent results, we found that MEG and EEG also captured partly unique aspects of visual representations. Those unique components emerged earlier in time for MEG than for EEG. Identifying the sources of those unique components with fMRI, we found the locus for both MEG and EEG in high-level visual cortex, and in addition for MEG in low-level visual cortex. Together, our results show that multivariate analyses of MEG and EEG data offer a convergent and complementary view on neural processing, and motivate the wider adoption of these methods in both MEG and EEG research. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. A multivariate based event detection method and performance comparison with two baseline methods.

    PubMed

    Liu, Shuming; Smith, Kate; Che, Han

    2015-09-01

    Early warning systems have been widely deployed to protect water systems from accidental and intentional contamination events. Conventional detection algorithms are often criticized for having high false positive rates and low true positive rates. This mainly stems from the inability of these methods to determine whether variation in sensor measurements is caused by equipment noise or the presence of contamination. This paper presents a new detection method that identifies the existence of contamination by comparing Euclidean distances of correlation indicators, which are derived from the correlation coefficients of multiple water quality sensors. The performance of the proposed method was evaluated using data from a contaminant injection experiment and compared with two baseline detection methods. The results show that the proposed method can differentiate between fluctuations caused by equipment noise and those due to the presence of contamination. It yielded a higher probability of detection and a lower false alarm rate than the two baseline methods. With optimized parameter values, the proposed method can correctly detect 95% of all contamination events with a 2% false alarm rate.
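    The core idea, deriving an indicator vector from pairwise correlation coefficients of the sensors and flagging events by its Euclidean distance from a baseline, can be sketched as follows (stdlib Python; an illustration of the general approach, not the published algorithm or its tuned thresholds):

    ```python
    import math
    from itertools import combinations

    def pearson(x, y):
        """Pearson correlation coefficient of two equal-length series."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    def correlation_indicator(window):
        """Indicator vector: pairwise correlations over a window of
        multiple sensor series (one list per sensor)."""
        return [pearson(a, b) for a, b in combinations(window, 2)]

    def distance_to_baseline(indicator, baseline):
        """Euclidean distance between the current indicator vector and a
        baseline vector estimated from uncontaminated operation; a large
        distance suggests contamination rather than equipment noise."""
        return math.sqrt(sum((i - b) ** 2 for i, b in zip(indicator, baseline)))
    ```

    The rationale is that equipment noise perturbs individual sensors without changing their mutual correlation structure, whereas a contaminant shifts several water quality parameters together.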

  11. Applying the multivariate time-rescaling theorem to neural population models

    PubMed Central

    Gerhard, Felipe; Haslinger, Robert; Pipa, Gordon

    2011-01-01

    Statistical models of neural activity are integral to modern neuroscience. Recently, interest has grown in modeling the spiking activity of populations of simultaneously recorded neurons to study the effects of correlations and functional connectivity on neural information processing. However, any statistical model must be validated by an appropriate goodness-of-fit test. Kolmogorov-Smirnov tests based upon the time-rescaling theorem have proven to be useful for evaluating point-process-based statistical models of single-neuron spike trains. Here we discuss the extension of the time-rescaling theorem to the multivariate (neural population) case. We show that even in the presence of strong correlations between spike trains, models which neglect couplings between neurons can be erroneously passed by the univariate time-rescaling test. We present the multivariate version of the time-rescaling theorem, and provide a practical step-by-step procedure for applying it towards testing the sufficiency of neural population models. Using several simple analytically tractable models and also more complex simulated and real data sets, we demonstrate that important features of the population activity can only be detected using the multivariate extension of the test. PMID:21395436
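    The univariate version of the test can be sketched in a few lines of stdlib Python (a constant-rate illustration, not the authors' code): under a correct model the rescaled interspike intervals are Exp(1), so their probability-integral transforms should pass a Kolmogorov-Smirnov test against Uniform(0, 1):

    ```python
    import math
    import random

    def rescaled_uniforms(spike_times, rate):
        """Time-rescaling for a constant-rate model: tau = rate * ISI is Exp(1)
        under the correct model, so u = 1 - exp(-tau) should be Uniform(0, 1)."""
        isis = [b - a for a, b in zip(spike_times, spike_times[1:])]
        return [1.0 - math.exp(-rate * isi) for isi in isis]

    def ks_statistic_uniform(samples):
        """Kolmogorov-Smirnov distance between the empirical CDF and Uniform(0, 1)."""
        u = sorted(samples)
        n = len(u)
        return max(max((k + 1) / n - x, x - k / n) for k, x in enumerate(u))

    # Simulate a homogeneous Poisson spike train with true rate 5.
    random.seed(1)
    t, spikes = 0.0, []
    while t < 1000.0:
        t += random.expovariate(5.0)
        spikes.append(t)

    ks_good = ks_statistic_uniform(rescaled_uniforms(spikes, 5.0))  # correct model
    ks_bad = ks_statistic_uniform(rescaled_uniforms(spikes, 1.0))   # mis-specified rate
    ```

    For a general conditional intensity, tau is the integral of the fitted intensity between consecutive spikes; the point of the record above is that this per-neuron check must be extended jointly across neurons to catch models that ignore couplings.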

  12. Genetic basis of adult migration timing in anadromous steelhead discovered through multivariate association testing.

    PubMed

    Hess, Jon E; Zendt, Joseph S; Matala, Amanda R; Narum, Shawn R

    2016-05-11

    Migration traits are presumed to be complex and to involve interaction among multiple genes. We used both univariate analyses and a multivariate random forest (RF) machine learning algorithm to conduct association mapping of 15 239 single nucleotide polymorphisms (SNPs) for adult migration-timing phenotype in steelhead (Oncorhynchus mykiss). Our study focused on a model natural population of steelhead that exhibits two distinct migration-timing life histories with high levels of admixture in nature. Neutral divergence was limited between fish exhibiting summer- and winter-run migration owing to high levels of interbreeding, but a univariate mixed linear model found three SNPs from a major effect gene to be significantly associated with migration timing (p < 0.000005) that explained 46% of trait variation. Alignment to the annotated Salmo salar genome provided evidence that all three SNPs localize within a 46 kb region overlapping GREB1-like (an oestrogen target gene) on chromosome Ssa03. Additionally, multivariate analyses with RF identified that these three SNPs plus 15 additional SNPs explained up to 60% of trait variation. These candidate SNPs may provide the ability to predict adult migration timing of steelhead to facilitate conservation management of this species, and this study demonstrates the benefit of multivariate analyses for association studies. © 2016 The Author(s).

  13. Multivariate stochastic analysis for Monthly hydrological time series at Cuyahoga River Basin

    NASA Astrophysics Data System (ADS)

    zhang, L.

    2011-12-01

    Copulas have become a very powerful statistical and stochastic methodology for multivariate analysis in environmental and water resources engineering. In recent years, the popular one-parameter Archimedean copulas, e.g. the Gumbel-Hougaard copula, Cook-Johnson copula and Frank copula, and the meta-elliptical copulas, e.g. the Gaussian copula and Student-T copula, have been applied in multivariate hydrological analyses, e.g. multivariate rainfall (rainfall intensity, duration and depth), flood (peak discharge, duration and volume), and drought analyses (drought length, mean and minimum SPI values, and drought mean areal extent). Copulas have also been applied in flood frequency analysis at the confluences of river systems by taking into account the dependence among upstream gauge stations rather than by using the hydrological routing technique. In most of the studies above, the annual time series have been considered as stationary signals, i.e. the observations have been assumed to be independent identically distributed (i.i.d.) random variables. But in reality, hydrological time series, especially daily and monthly hydrological time series, cannot be considered as i.i.d. random variables due to the periodicity present in the data structure. The stationarity assumption is also in question due to climate change and land use and land cover (LULC) change in the past years. To this end, it is necessary to re-evaluate the classic approach to the study of hydrological time series by relaxing the stationarity assumption through a nonstationary approach. As to the study of the dependence structure of the hydrological time series, the assumption of the same type of univariate distribution also needs to be relaxed by adopting the copula theory. In this paper, the univariate monthly hydrological time series will be studied through the nonstationary time series analysis approach. The dependence structure of the multivariate monthly hydrological time series will be

  14. Uni- and multi-variable modelling of flood losses: experiences gained from the Secchia river inundation event.

    NASA Astrophysics Data System (ADS)

    Carisi, Francesca; Domeneghetti, Alessio; Kreibich, Heidi; Schröter, Kai; Castellarin, Attilio

    2017-04-01

    Flood risk is a function of flood hazard and vulnerability, and its accurate assessment therefore depends on a reliable quantification of both factors. The scientific literature proposes a number of objective and reliable methods for assessing flood hazard, yet it highlights a limited understanding of the fundamental damage processes. Loss modelling is associated with large uncertainty which is, among other factors, due to a lack of standard procedures; for instance, flood losses are often estimated based on damage models derived in completely different contexts (i.e. different countries or geographical regions) without checking their applicability, or by considering only one explanatory variable (typically water depth). We consider the Secchia river flood event of January 2014, when a sudden levee breach caused the inundation of nearly 200 km2 in Northern Italy. In the aftermath of this event, local authorities collected flood loss data, together with additional information on affected private households and industrial activities (e.g. building surface area and economic value, number of employees, and others). Based on these data we implemented and compared a quadratic-regression damage function, with water depth as the only explanatory variable, and a multi-variable model that combines multiple regression trees and considers several explanatory variables (i.e. bagging decision trees). Our results show the importance of data collection, revealing that (1) a simple quadratic regression damage function based on empirical data from the study area can be significantly more accurate than literature damage models derived for a different context and (2) multi-variable modelling may outperform the uni-variable approach, yet it is more difficult to develop and apply due to a much higher demand for detailed data.
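    A uni-variable quadratic depth-damage function of the kind compared above can be fitted with ordinary least squares. The sketch below (stdlib Python, illustrative only; real applications would use a regression library and the observed loss data) solves the 3x3 normal equations for loss = a + b*d + c*d^2 directly:

    ```python
    def fit_quadratic(depths, losses):
        """Least-squares fit of loss = a + b*d + c*d^2 via the normal equations
        for the design matrix [1, d, d^2]."""
        # Power sums needed for X^T X and X^T y.
        s = [sum(d ** k for d in depths) for k in range(5)]
        A = [[s[0], s[1], s[2]], [s[1], s[2], s[3]], [s[2], s[3], s[4]]]
        b = [sum(l * d ** k for d, l in zip(depths, losses)) for k in range(3)]
        # Gaussian elimination with partial pivoting.
        for i in range(3):
            p = max(range(i, 3), key=lambda r: abs(A[r][i]))
            A[i], A[p] = A[p], A[i]
            b[i], b[p] = b[p], b[i]
            for r in range(i + 1, 3):
                f = A[r][i] / A[i][i]
                for c in range(i, 3):
                    A[r][c] -= f * A[i][c]
                b[r] -= f * b[i]
        coef = [0.0, 0.0, 0.0]
        for i in range(2, -1, -1):
            coef[i] = (b[i] - sum(A[i][c] * coef[c] for c in range(i + 1, 3))) / A[i][i]
        return coef  # [a, b, c]
    ```

    A multi-variable alternative such as bagged regression trees trades this closed form for the ability to use several predictors, which is the comparison the record above reports.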

  15. Supporting the Process of Exploring and Interpreting Space–Time Multivariate Patterns: The Visual Inquiry Toolkit

    PubMed Central

    Chen, Jin; MacEachren, Alan M.; Guo, Diansheng

    2009-01-01

    While many data sets carry geographic and temporal references, our ability to analyze these datasets lags behind our ability to collect them because of the challenges posed by both data complexity and tool scalability issues. This study develops a visual analytics approach that leverages human expertise with visual, computational, and cartographic methods to support the application of visual analytics to relatively large spatio-temporal, multivariate data sets. We develop and apply a variety of methods for data clustering, pattern searching, information visualization, and synthesis. By combining both human and machine strengths, this approach has a better chance to discover novel, relevant, and potentially useful information that is difficult to detect by any of the methods used in isolation. We demonstrate the effectiveness of the approach by applying the Visual Inquiry Toolkit we developed to analyze a data set containing geographically referenced, time-varying and multivariate data for U.S. technology industries. PMID:19960096

  16. Constructing networks from a dynamical system perspective for multivariate nonlinear time series.

    PubMed

    Nakamura, Tomomichi; Tanizawa, Toshihiro; Small, Michael

    2016-03-01

    We describe a method for constructing networks for multivariate nonlinear time series. We approach the interaction between the various scalar time series from a deterministic dynamical system perspective and provide a generic and algorithmic test for whether the interaction between two measured time series is statistically significant. The method can be applied even when the data exhibit no obvious qualitative similarity: a situation in which the naive method utilizing the cross correlation function directly cannot correctly identify connectivity. To establish the connectivity between nodes we apply the previously proposed small-shuffle surrogate (SSS) method, which can investigate whether there are correlation structures in short-term variabilities (irregular fluctuations) between two data sets from the viewpoint of deterministic dynamical systems. The procedure to construct networks based on this idea is composed of three steps: (i) each time series is considered as a basic node of a network, (ii) the SSS method is applied to verify the connectivity between each pair of time series taken from the whole multivariate time series, and (iii) the pair of nodes is connected with an undirected edge when the null hypothesis cannot be rejected. The network constructed by the proposed method indicates the intrinsic (essential) connectivity of the elements included in the system or the underlying (assumed) system. The method is demonstrated for numerical data sets generated by known systems and applied to several experimental time series.
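    Step (ii) relies on the small-shuffle surrogate: each time index is perturbed by a Gaussian amount and the data are reordered accordingly, destroying short-term correlation structure while leaving slow trends almost intact. A minimal stdlib-Python sketch (the amplitude value is an assumption; the original method additionally prescribes how an ensemble of such surrogates is used for hypothesis testing):

    ```python
    import random

    def small_shuffle_surrogate(series, amplitude=1.0, seed=None):
        """Small-shuffle surrogate: reorder the series by the perturbed
        indices i + amplitude * g_i, with g_i standard Gaussian. Small
        amplitudes move points only locally, so large-scale structure
        is preserved while irregular short-term fluctuations are shuffled."""
        rng = random.Random(seed)
        order = sorted(range(len(series)),
                       key=lambda i: i + amplitude * rng.gauss(0.0, 1.0))
        return [series[i] for i in order]
    ```

    Comparing a test statistic (e.g. a short-lag cross correlation between two series) on the original data against its distribution over many surrogates then gives the significance test used to decide whether to draw an edge between two nodes.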

  17. A Visualization System for Space-Time and Multivariate Patterns (VIS-STAMP)

    PubMed Central

    Guo, Diansheng; Chen, Jin; MacEachren, Alan M.; Liao, Ke

    2011-01-01

    The research reported here integrates computational, visual, and cartographic methods to develop a geovisual analytic approach for exploring and understanding spatio-temporal and multivariate patterns. The developed methodology and tools can help analysts investigate complex patterns across multivariate, spatial, and temporal dimensions via clustering, sorting, and visualization. Specifically, the approach involves a self-organizing map, a parallel coordinate plot, several forms of reorderable matrices (including several ordering methods), a geographic small multiple display, and a 2-dimensional cartographic color design method. The coupling among these methods leverages their independent strengths and facilitates a visual exploration of patterns that are difficult to discover otherwise. The visualization system we developed supports overview of complex patterns and, through a variety of interactions, enables users to focus on specific patterns and examine detailed views. We demonstrate the system with an application to the IEEE InfoVis 2005 Contest data set, which contains time-varying, geographically referenced, and multivariate data for technology companies in the US. PMID:17073369

  18. Nonlinear Multivariate and Time Series Analysis by Neural Network Methods, with Applications to ENSO

    NASA Astrophysics Data System (ADS)

    Hsieh, W. W.

    2003-12-01

    Methods in multivariate statistical analysis are essential for working with large amounts of geophysical data: data from observational arrays, from satellites, or from numerical model output. In classical multivariate statistical analysis, there is a hierarchy of methods, starting with linear regression (LR) at the base, followed by principal component analysis (PCA), and finally canonical correlation analysis (CCA). A multivariate time series method, the singular spectrum analysis (SSA), has been a fruitful extension of the PCA technique. The common drawback of these classical methods is that only linear structures can be correctly extracted from the data. Since the late 1980s, neural network methods have become popular for performing nonlinear regression (NLR) and classification. More recently, multi-layer perceptron neural network methods have been extended to perform nonlinear PCA (NLPCA), nonlinear CCA (NLCCA) and nonlinear SSA (NLSSA). This paper presents a unified view of the NLPCA, NLCCA and NLSSA techniques, and their applications to various datasets of the atmosphere and the ocean, especially in the nonlinear study of the El Niño-Southern Oscillation (ENSO) phenomenon.

  19. Bayesian profiling of molecular signatures to predict event times

    PubMed Central

    Zhang, Dabao; Zhang, Min

    2007-01-01

    Background It is of particular interest to identify cancer-specific molecular signatures for early diagnosis, monitoring effects of treatment and predicting patient survival time. Molecular information about patients is usually generated from high throughput technologies such as microarray and mass spectrometry. Statistically, we are challenged by the large number of candidates but only a small number of patients in the study, and the right-censored clinical data further complicate the analysis. Results We present a two-stage procedure to profile molecular signatures for survival outcomes. Firstly, we group closely-related molecular features into linkage clusters, each portraying either similar or opposite functions and playing similar roles in prognosis; secondly, a Bayesian approach is developed to rank the centroids of these linkage clusters and provide a list of the main molecular features closely related to the outcome of interest. A simulation study showed the superior performance of our approach. When it was applied to data on diffuse large B-cell lymphoma (DLBCL), we were able to identify some new candidate signatures for disease prognosis. Conclusion This multivariate approach provides researchers with a more reliable list of molecular features profiled in terms of their prognostic relationship to the event times, and generates dependable information for subsequent identification of prognostic molecular signatures through either biological procedures or further data analysis. PMID:17239251

  20. Evaluation of an F100 multivariable control using a real-time engine simulation

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Soeder, J. F.; Skira, C.

    1977-01-01

    The control evaluated was designed for the F100-PW-100 turbofan engine. The F100 engine represents the current state-of-the-art in aircraft gas turbine technology. The control makes use of a multivariable linear quadratic regulator. The evaluation procedure utilized a real-time hybrid computer simulation of the F100 engine and an implementation of the control logic on the NASA LeRC digital computer/controller. The results of the evaluation indicated that the control logic and its implementation will be capable of controlling the engine throughout its operating range.
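
    The multivariable linear quadratic regulator underlying such controls can be illustrated generically. The discrete-time Riccati iteration below is a textbook construction, not the actual F100 control law, and the double-integrator plant is an invented stand-in.

```python
import numpy as np

def dlqr(A, B, Q, R, iters=500):
    """Discrete-time LQR: iterate the Riccati difference equation to a
    fixed point and return the state-feedback gain K (u = -K x)."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

# Toy plant: a discretized double integrator (not an engine model)
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
K = dlqr(A, B, Q=np.eye(2), R=np.array([[1.0]]))
closed_loop = A - B @ K   # all eigenvalues should lie inside the unit circle
```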

  1. A multivariate model for the meta-analysis of study level survival data at multiple times.

    PubMed

    Jackson, Dan; Rollins, Katie; Coughlin, Patrick

    2014-09-01

    Motivated by our meta-analytic dataset involving survival rates after treatment for critical leg ischemia, we develop and apply a new multivariate model for the meta-analysis of study level survival data at multiple times. Our data set involves 50 studies that provide mortality rates at up to seven time points, which we model simultaneously, and we compare the results to those obtained from standard methodologies. Our method uses exact binomial within-study distributions and enforces the constraints that both the study specific and the overall mortality rates must not decrease over time. We directly model the probabilities of mortality at each time point, which are the quantities of primary clinical interest. We also present I(2) statistics that quantify the impact of the between-study heterogeneity, which is very considerable in our data set.
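
    For a single time point, the reported I(2) statistic can be computed from Cochran's Q under inverse-variance weighting; the authors' multivariate version generalizes this univariate form. The effect sizes and variances below are invented for illustration.

```python
import numpy as np

def i_squared(effects, variances):
    """I^2 = max(0, (Q - df) / Q) * 100, where Q is Cochran's
    heterogeneity statistic under inverse-variance weighting."""
    effects = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)
    pooled = np.sum(w * effects) / np.sum(w)
    Q = np.sum(w * (effects - pooled) ** 2)
    df = len(effects) - 1
    return 100.0 * max(0.0, (Q - df) / Q) if Q > 0 else 0.0

low = i_squared([0.10, 0.12, 0.11], [0.01, 0.01, 0.01])   # homogeneous studies
high = i_squared([0.10, 0.50, 0.90], [0.01, 0.01, 0.01])  # heterogeneous studies
```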

  2. Multivariable time series prediction for the icing process on overhead power transmission line.

    PubMed

    Li, Peng; Zhao, Na; Zhou, Donghua; Cao, Min; Li, Jingjie; Shi, Xinling

    2014-01-01

    Monitoring and predictive alarm systems are necessary to manage icing on overhead power transmission lines. Given the complexity, nonlinearity, and intermittency of the line icing process, a model based on multivariable time series is presented here to predict the icing load of a transmission line. In this model, the time effects of micrometeorological parameters on the icing process were analyzed. Phase-space reconstruction theory and a machine learning method were then applied to establish the prediction model, which fully utilizes the history of multivariable time series data from local monitoring systems to represent the mapping between icing load and micrometeorological factors. To account for the intermittent character of line icing, simulations were carried out both within a single icing process and across different processes to test the model's prediction precision and robustness. Simulation results for the Tao-Luo-Xiong Transmission Line show that the model achieves good prediction accuracy across different processes when the prediction horizon is under two hours, and it would help power grid departments decide to act in advance against potential icing disasters.
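
    The phase-space reconstruction step can be sketched with a Takens delay embedding followed by a nearest-neighbour forecast; the paper's actual machine learning model is not specified here, so the predictor below is a generic stand-in and all parameters (embedding dimension, delay, horizon) are illustrative.

```python
import numpy as np

def delay_embed(series, dim, tau):
    """Takens delay embedding: row i is (x_i, x_{i+tau}, ..., x_{i+(dim-1)tau})."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau: i * tau + n] for i in range(dim)])

def nn_forecast(series, dim, tau, horizon):
    """Predict `horizon` steps ahead: find the historical embedded state
    closest to the current one and follow its observed future."""
    emb = delay_embed(series, dim, tau)
    query = emb[-1]
    candidates = emb[:-horizon]          # states whose future is observed
    best = np.argmin(np.linalg.norm(candidates - query, axis=1))
    return series[best + (dim - 1) * tau + horizon]

rng = np.random.default_rng(1)
t = np.arange(500)
x = np.sin(2 * np.pi * t / 25) + 0.05 * rng.normal(size=500)
pred = nn_forecast(x, dim=3, tau=5, horizon=5)   # forecast for t = 504
```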

  3. Discrimination of coupling structures using causality networks from multivariate time series

    NASA Astrophysics Data System (ADS)

    Koutlis, Christos; Kugiumtzis, Dimitris

    2016-09-01

    Measures of Granger causality on multivariate time series have been used to form so-called causality networks. A causality network represents the interdependence structure of the underlying dynamical system or coupled dynamical systems, and its properties are quantified by network indices. In this work, it is investigated whether network indices computed on networks generated by an appropriate Granger causality measure can discriminate different coupling structures. The information-based Granger causality measure of partial mutual information from mixed embedding (PMIME) is used to form causality networks, and a large number of network indices are ranked according to their ability to discriminate the different coupling structures. The evaluation of the network indices is done with a simulation study based on two dynamical systems, the coupled Mackey-Glass delay differential equations and the neural mass model, both of 25 variables, and three prototypes of coupling structures, i.e., random, small-world, and scale-free. It is concluded that PMIME combined with a network index attains a high level of discrimination among the coupling structures solely on the basis of the observed multivariate time series. This approach is demonstrated to identify epileptic seizures emerging during electroencephalogram recordings.
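
    PMIME is information-based; as a much simpler stand-in, a directed causality network can be built from pairwise linear Granger-style tests, here using a crude residual-variance-reduction criterion instead of a formal significance test. Node out-degree then serves as an elementary network index.

```python
import numpy as np

def granger_network(data, improvement=0.05):
    """Directed network from pairwise lag-1 Granger-style tests: edge
    i -> j exists if adding x_i(t-1) to an AR(1) model of x_j reduces
    the residual variance by more than `improvement` (fraction)."""
    n, m = data.shape
    past, present = data[:-1], data[1:]
    adj = np.zeros((m, m), dtype=int)
    for j in range(m):
        # Restricted model: x_j(t) regressed on its own past only
        Xr = np.column_stack([past[:, j], np.ones(n - 1)])
        rr = present[:, j] - Xr @ np.linalg.lstsq(Xr, present[:, j], rcond=None)[0]
        for i in range(m):
            if i == j:
                continue
            Xf = np.column_stack([past[:, j], past[:, i], np.ones(n - 1)])
            rf = present[:, j] - Xf @ np.linalg.lstsq(Xf, present[:, j], rcond=None)[0]
            if 1 - rf.var() / rr.var() > improvement:
                adj[i, j] = 1
    return adj

# Toy system: x0 drives x1; x2 is independent noise.
rng = np.random.default_rng(2)
x = np.zeros((600, 3))
for t in range(1, 600):
    x[t, 0] = 0.6 * x[t - 1, 0] + rng.normal()
    x[t, 1] = 0.5 * x[t - 1, 1] + 0.8 * x[t - 1, 0] + rng.normal()
    x[t, 2] = rng.normal()
A = granger_network(x)
out_degree = A.sum(axis=1)   # a simple per-node network index
```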

  4. [Psychosocial predictors of metabolic instability in brittle diabetes--a multivariate time series analysis].

    PubMed

    Brosig, B; Leweke, F; Milch, W; Eckhard, M; Reimer, C

    2001-06-01

    The term "brittle diabetes" denotes the unstable course of an insulin-dependent diabetes characterised by frequent hypo- or hyperglycaemic crises. The aim of this study is to demonstrate empirically how psychosocial parameters interact with metabolic instability in a paradigmatic case of juvenile brittle diabetes. By means of a structured diary study, blood sugar values, moods (SAM), body symptoms (GBB), daily hassles, the helping alliance (HAQ) and aspects of the therapeutic setting were registered. The resulting time series (112 days each) were analysed with a multivariate ARIMA approach. It could be shown that the mean variance of daily blood sugar values, as an indicator of brittleness, was predicted by moods, body complaints and by a family session as a setting factor (p < 0.05 for the corresponding predictors). Feelings of dominance preceded an increase in blood sugar variance, whereas depressive moods, anger and body symptoms were associated with metabolic instability. A family therapy session also resulted in an increase of the mean blood sugar variance. The model accounted for almost 30% of the total variance of the dependent variable (adjusted R-square, p < 0.0001). The potential of multivariate time series as a means of demonstrating psychosomatic interrelations is discussed. We believe that the results may also contribute to an empirically rooted understanding of psychodynamic processes in psychosomatoses.
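
    A full multivariate ARIMA transfer-function analysis is beyond a short sketch, but the core idea, yesterday's mood predicting today's glycaemic instability while controlling for the outcome's own past, can be illustrated with a lagged least-squares regression on simulated diary data (all series and coefficients below are invented).

```python
import numpy as np

def lagged_effect(outcome, predictor):
    """OLS slope of today's outcome on yesterday's predictor,
    controlling for yesterday's outcome (AR(1) term)."""
    X = np.column_stack([predictor[:-1], outcome[:-1], np.ones(len(outcome) - 1)])
    beta, *_ = np.linalg.lstsq(X, outcome[1:], rcond=None)
    return beta[0]

rng = np.random.default_rng(3)
days = 500
mood = rng.normal(size=days)
bg_var = np.zeros(days)                 # simulated daily blood-sugar variance
for d in range(1, days):
    bg_var[d] = 0.3 * bg_var[d - 1] + 0.5 * mood[d - 1] + 0.5 * rng.normal()
effect = lagged_effect(bg_var, mood)    # recovers roughly 0.5
```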

  5. Understanding characteristics in multivariate traffic flow time series from complex network structure

    NASA Astrophysics Data System (ADS)

    Yan, Ying; Zhang, Shen; Tang, Jinjun; Wang, Xiaofei

    2017-07-01

    Discovering dynamic characteristics in traffic flow is a significant step toward designing effective traffic management and control strategies for relieving congestion in urban cities. A new method based on complex network theory is proposed to study multivariate traffic flow time series. The data were collected from loop detectors on a freeway over one year. To construct a complex network from the original traffic flow, a weighted Frobenius norm is adopted to estimate the similarity between multivariate time series, and Principal Component Analysis is implemented to determine the weights. We discuss how to select the optimal critical threshold for networks at different hours in terms of the cumulative probability distribution of degree. Furthermore, two statistical properties of the networks, normalized network structure entropy and cumulative probability of degree, are utilized to explore hourly variation in traffic flow. The results demonstrate that these two statistical quantities exhibit patterns similar to those of the traffic flow parameters, with morning and evening peak hours. Accordingly, we detect three traffic states, trough, peak and transitional hours, according to the correlation between the two aforementioned properties. The classification of states represents the hourly fluctuation in traffic flow, as confirmed by analyzing annual average hourly values of traffic volume, occupancy and speed in the corresponding hours.
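
    The construction can be sketched without the PCA weighting step: data matrices are compared with a plain Frobenius norm, thresholded into an adjacency matrix, and summarized by the normalized network structure entropy. The threshold and toy data below are invented for illustration.

```python
import numpy as np

def similarity_network(series_list, threshold):
    """Connect two nodes when the Frobenius norm of the difference of
    their multivariate data matrices falls below `threshold`
    (the paper additionally weights variables via PCA)."""
    n = len(series_list)
    adj = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(series_list[i] - series_list[j]) < threshold:
                adj[i, j] = adj[j, i] = 1
    return adj

def structure_entropy(adj):
    """Normalized network structure entropy of the degree distribution."""
    k = adj.sum(axis=0).astype(float)
    p = k / k.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum() / np.log(len(adj))

rng = np.random.default_rng(4)
cluster_a = [rng.normal(0.0, 0.1, size=(50, 3)) for _ in range(5)]
cluster_b = [5.0 + rng.normal(0.0, 0.1, size=(50, 3)) for _ in range(5)]
A = similarity_network(cluster_a + cluster_b, threshold=5.0)
entropy = structure_entropy(A)
```

    With two equal clusters every node has the same degree, so the normalized entropy is at its maximum of 1; more heterogeneous degree sequences give lower values.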

  6. Multivariate weather regimes in the Mediterranean, a perspective to increase Heavy Precipitating Events predictability using medium range ensemble forecasting?

    NASA Astrophysics Data System (ADS)

    Joly, B.; Arbogast, P.; Descamps, L.; Labadie, C.

    2009-09-01

    South-eastern France is a region subject to Heavy Precipitating Events (HPEs). These events have been found to occur often within certain recurrent Large Scale Circulations (LSCs), which may play a significant role in triggering or maintaining the extreme convective processes (Nuissier et al., 2007). A previous study (within the French national CYPRIM Project, ACI-INSU), based on the classification of the geopotential height for a thousand rainy days extracted from the French south-eastern regional rain gauge network, showed the existence of two different patterns associated with the HPEs and the importance of the coincidence of low-level ingredients. However, by design these patterns cannot be considered objective features describing the whole large-scale variability in the way that weather regime methods can. We therefore intend to generalize these results by investigating a classification based on a multivariate atmospheric state vector rather than on a single parameter (the geopotential height at 500 hPa). This is also motivated by previous studies (Vautard et al. 1988, Vautard 1990) which have shown that weather regimes are linked with sources of low-frequency variability and could thus set up a framework to explain nonlinear transitions from low-frequency variability to high-variability events. We build a pseudo-state vector of 25 parameters, selected as those most correlated with daily rainfall. The number of classes is chosen using a hybrid method combining dynamical and hierarchical clustering, which leads to a classification with eight classes. The connections between the clusters and the HPEs then show that two clusters concentrate more than 70% of the HPEs. The composite analysis at different levels shows good agreement with the CYPRIM patterns. Furthermore, a simple correlation analysis relative to the centroids of these two clusters shows that they significantly discriminate the HPEs from the non-HPE part of the data. Thus we explore the opportunity to determine

  7. Investigation of time and weather effects on crash types using full Bayesian multivariate Poisson lognormal models.

    PubMed

    El-Basyouny, Karim; Barua, Sudip; Islam, Md Tazul

    2014-12-01

    Previous research shows that various weather elements have significant effects on crash occurrence and risk; however, little is known about how these elements affect different crash types. Consequently, this study investigates the impact of weather elements and sudden extreme snow or rain weather changes on crash type. Multivariate models were used for seven crash types using five years of daily weather and crash data collected for the entire City of Edmonton. In addition, the yearly trend and random variation of parameters across the years were analyzed by using four different modeling formulations. The proposed models were estimated in a full Bayesian context via Markov Chain Monte Carlo simulation. The multivariate Poisson lognormal model with yearly varying coefficients provided the best fit for the data according to Deviance Information Criteria. Overall, results showed that temperature and snowfall were statistically significant with intuitive signs (crashes decrease with increasing temperature; crashes increase as snowfall intensity increases) for all crash types, while rainfall was mostly insignificant. Previous snow showed mixed results, being statistically significant and positively related to certain crash types, while negatively related or insignificant in other cases. Maximum wind gust speed was found mostly insignificant with a few exceptions that were positively related to crash type. Major snow or rain events following a dry weather condition were highly significant and positively related to three crash types: Follow-Too-Close, Stop-Sign-Violation, and Ran-Off-Road crashes. The day-of-the-week dummy variables were statistically significant, indicating a possible weekly variation in exposure. Transportation authorities might use the above results to improve road safety by providing drivers with information regarding the risk of certain crash types for a particular weather condition.

  8. Ecological prediction with nonlinear multivariate time-frequency functional data models

    USGS Publications Warehouse

    Yang, Wen-Hsi; Wikle, Christopher K.; Holan, Scott H.; Wildhaber, Mark L.

    2013-01-01

    Time-frequency analysis has become a fundamental component of many scientific inquiries. Due to improvements in technology, the amount of high-frequency signals that are collected for ecological and other scientific processes is increasing at a dramatic rate. In order to facilitate the use of these data in ecological prediction, we introduce a class of nonlinear multivariate time-frequency functional models that can identify important features of each signal as well as the interaction of signals corresponding to the response variable of interest. Our methodology is of independent interest and utilizes stochastic search variable selection to improve model selection and performs model averaging to enhance prediction. We illustrate the effectiveness of our approach through simulation and by application to predicting spawning success of shovelnose sturgeon in the Lower Missouri River.

  9. Detecting a currency’s dominance using multivariate time series analysis

    NASA Astrophysics Data System (ADS)

    Syahidah Yusoff, Nur; Sharif, Shamshuritawati

    2017-09-01

    A currency exchange rate is the price of one country’s currency in terms of another country’s currency. Four different prices, opening, closing, highest and lowest, can be obtained from daily trading activities. In the past, many studies were carried out using the closing price only. However, these four prices are interrelated, so multivariate time series can provide more information than univariate time series. The aim of this paper is therefore to compare the results of two different approaches, the mean vector and Escoufier’s RV coefficient, for constructing similarity matrices of 20 world currencies. Both matrices are then used in place of the correlation matrix required by the network topology. With the help of the degree centrality measure, we can detect the currency’s dominance in both networks. The pros and cons of both approaches are presented at the end of this paper.
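
    As a simple stand-in for the mean-vector and Escoufier RV constructions, one can threshold a plain correlation matrix of returns and rank currencies by degree centrality. The factor-driven toy returns below are invented.

```python
import numpy as np

def degree_centrality(returns, threshold=0.6):
    """Adjacency from |correlation| > threshold; degree counts how many
    other currencies a currency co-moves with."""
    corr = np.corrcoef(returns.T)
    adj = (np.abs(corr) > threshold).astype(int)
    np.fill_diagonal(adj, 0)
    return adj.sum(axis=0)

rng = np.random.default_rng(5)
common = rng.normal(size=300)                 # shared market factor
returns = np.column_stack([
    common + 0.5 * rng.normal(size=300),      # currency 0
    common + 0.5 * rng.normal(size=300),      # currency 1
    common + 0.5 * rng.normal(size=300),      # currency 2
    rng.normal(size=300),                     # currency 3: independent
])
degree = degree_centrality(returns)           # currencies 0-2 dominate
```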

  10. Reconstructing the times of past and future personal events.

    PubMed

    Ben Malek, Hédi; Berna, Fabrice; D'Argembeau, Arnaud

    2017-04-11

    Humans have the remarkable ability to mentally travel through past and future times. However, while memory for the times of past events has been much investigated, little is known about how imagined future events are temporally located. Using a think-aloud protocol, we found that the temporal location of past and future events is rarely directly accessed, but instead mostly relies on reconstructive and inferential strategies. References to lifetime periods and factual knowledge (about the self, others, and the world) were most frequently used to determine the temporal location of both past and future events. Event details (e.g., places, persons, or weather conditions) were also used, but mainly for past events. Finally, the results showed that events whose temporal location was directly accessed were judged more important for personal goals. Together, these findings shed new light on the mechanisms involved in locating personal events in past and future times.

  11. Circuit for measuring time differences among events

    DOEpatents

    Romrell, Delwin M.

    1977-01-01

    An electronic circuit has a plurality of input terminals. Application of a first input signal to any one of the terminals initiates a timing sequence. Later inputs to the same terminal are ignored but a later input to any other terminal of the plurality generates a signal which can be used to measure the time difference between the later input and the first input signal. Also, such time differences may be measured between the first input signal and an input signal to any other terminal of the plurality or the circuit may be reset at any time by an external reset signal.
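
    In software terms, the described logic, latch the first signal per terminal, ignore repeats, and report time differences relative to the first event, might look like the following hypothetical sketch (the patent describes an electronic circuit, not code).

```python
class EventTimer:
    """First signal on any terminal starts the timing sequence; later
    signals on the same terminal are ignored; the first signal on any
    other terminal yields its time difference from the first event."""

    def __init__(self):
        self.first = {}                  # terminal -> time of its first signal

    def signal(self, terminal, t):
        if terminal in self.first:
            return None                  # repeats on a terminal are ignored
        self.first[terminal] = t
        if len(self.first) == 1:
            return None                  # this signal started the sequence
        return t - min(self.first.values())

    def reset(self):
        """External reset: clear all latched events."""
        self.first.clear()

timer = EventTimer()
timer.signal("A", 0.0)                   # starts the timing sequence
delta = timer.signal("B", 2.5)           # 2.5 units after the first event
```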

  12. Reconstructing causal pathways and optimal prediction from multivariate time series using the Tigramite package

    NASA Astrophysics Data System (ADS)

    Runge, Jakob

    2016-04-01

    Causal reconstruction techniques from multivariate time series have become a popular approach to analyze interactions in complex systems such as the Earth. These approaches make it possible to exclude effects of common drivers and indirect influences. Practical applications are, however, especially challenging if nonlinear interactions are taken into account and for the typically strongly autocorrelated climate time series. Here we discuss a new reconstruction approach with an accompanying software package (Tigramite) and focus on two applications: (1) Information or perturbation transfer along causal pathways. This method detects and quantifies which intermediate nodes are important mediators of an interaction mechanism, and is illustrated by disentangling pathways of atmospheric flow over Europe and the ENSO - Indian Monsoon interaction mechanism. (2) A nonlinear, model-free prediction technique that efficiently utilizes causal drivers and can be shown to yield information-theoretically optimal predictors while avoiding over-fitting. The performance of this framework is illustrated on a climatological index of the El Niño Southern Oscillation. References: Runge, J. (2015). Quantifying information transfer and mediation along causal pathways in complex systems. Phys. Rev. E, 92(6), 062829. doi:10.1103/PhysRevE.92.062829 Runge, J., Donner, R. V., & Kurths, J. (2015). Optimal model-free prediction from multivariate time series. Phys. Rev. E, 91(5), 052909. doi:10.1103/PhysRevE.91.052909 Runge, J., Petoukhov, V., Donges, J. F., Hlinka, J., Jajcay, N., Vejmelka, M., … Kurths, J. (2015). Identifying causal gateways and mediators in complex spatio-temporal systems. Nature Communications, 6, 8502. doi:10.1038/ncomms9502

  13. Time-varying nonstationary multivariate risk analysis using a dynamic Bayesian copula

    NASA Astrophysics Data System (ADS)

    Sarhadi, Ali; Burn, Donald H.; Concepción Ausín, María.; Wiper, Michael P.

    2016-03-01

    A time-varying risk analysis is proposed for an adaptive design framework in nonstationary conditions arising from climate change. A Bayesian, dynamic conditional copula is developed for modeling the time-varying dependence structure between mixed continuous and discrete multiattributes of multidimensional hydrometeorological phenomena. Joint Bayesian inference is carried out to fit the marginals and copula in an illustrative example using an adaptive, Gibbs Markov Chain Monte Carlo (MCMC) sampler. Posterior mean estimates and credible intervals are provided for the model parameters, and the Deviance Information Criterion (DIC) is used to select the model that best captures different forms of nonstationarity over time. This study also introduces a fully Bayesian, time-varying joint return period for multivariate time-dependent risk analysis in nonstationary environments. The results demonstrate that the nature and the risk of extreme-climate multidimensional processes change over time under the impact of climate change, and accordingly long-term decision-making strategies should be updated based on the anomalies of the nonstationary environment.

  14. Forecasting electric vehicles sales with univariate and multivariate time series models: The case of China.

    PubMed

    Zhang, Yong; Zhong, Miner; Geng, Nana; Jiang, Yunjian

    2017-01-01

    The market demand for electric vehicles (EVs) has increased in recent years. Suitable models are necessary to understand and forecast EV sales. This study presents singular spectrum analysis (SSA) as a univariate time-series model and the vector autoregressive model (VAR) as a multivariate model. Empirical results suggest that SSA satisfactorily captures the evolving trend and provides reasonable results. The VAR model, which incorporates exogenous market-related parameters on a monthly basis, significantly improves the prediction accuracy. EV sales in China, categorized into battery and plug-in EVs, are predicted in both the short term (up to December 2017) and the long term (up to 2020), providing statistical evidence of the growth of the Chinese EV industry.
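
    The VAR component can be sketched with ordinary least squares; the minimal VAR(1) below omits the exogenous market variables the authors include and runs on simulated data, so it is an illustration of the model class rather than their fitted model.

```python
import numpy as np

def fit_var1(y):
    """Least-squares VAR(1): y_t = A @ y_{t-1} + c + e_t."""
    X = np.column_stack([y[:-1], np.ones(len(y) - 1)])
    B, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return B[:-1].T, B[-1]               # coefficient matrix A, intercept c

def forecast(A, c, y_last, steps):
    """Iterate the fitted recursion to produce multi-step forecasts."""
    out, y = [], y_last
    for _ in range(steps):
        y = A @ y + c
        out.append(y)
    return np.array(out)

rng = np.random.default_rng(6)
A_true = np.array([[0.5, 0.2], [0.0, 0.4]])
y = np.zeros((2000, 2))
for t in range(1, 2000):
    y[t] = A_true @ y[t - 1] + 0.1 * rng.normal(size=2)
A_hat, c_hat = fit_var1(y)
path = forecast(A_hat, c_hat, y[-1], steps=12)   # 12-step-ahead forecast
```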

  15. Forecasting electric vehicles sales with univariate and multivariate time series models: The case of China

    PubMed Central

    Zhang, Yong; Zhong, Miner; Geng, Nana; Jiang, Yunjian

    2017-01-01

    The market demand for electric vehicles (EVs) has increased in recent years. Suitable models are necessary to understand and forecast EV sales. This study presents singular spectrum analysis (SSA) as a univariate time-series model and the vector autoregressive model (VAR) as a multivariate model. Empirical results suggest that SSA satisfactorily captures the evolving trend and provides reasonable results. The VAR model, which incorporates exogenous market-related parameters on a monthly basis, significantly improves the prediction accuracy. EV sales in China, categorized into battery and plug-in EVs, are predicted in both the short term (up to December 2017) and the long term (up to 2020), providing statistical evidence of the growth of the Chinese EV industry. PMID:28459872

  16. Time-varying correlations in global real estate markets: A multivariate GARCH with spatial effects approach

    NASA Astrophysics Data System (ADS)

    Gu, Huaying; Liu, Zhixue; Weng, Yingliang

    2017-04-01

    The present study applies the multivariate generalized autoregressive conditional heteroscedasticity (MGARCH) with spatial effects approach for the analysis of the time-varying conditional correlations and contagion effects among global real estate markets. A distinguishing feature of the proposed model is that it can simultaneously capture the spatial interactions and the dynamic conditional correlations compared with the traditional MGARCH models. Results reveal that the estimated dynamic conditional correlations have exhibited significant increases during the global financial crisis from 2007 to 2009, thereby suggesting contagion effects among global real estate markets. The analysis further indicates that the returns of the regional real estate markets that are in close geographic and economic proximities exhibit strong co-movement. In addition, evidence of significantly positive leverage effects in global real estate markets is also determined. The findings have significant implications on global portfolio diversification opportunities and risk management practices.

  17. Denoising and Multivariate Analysis of Time-Of-Flight SIMS Images

    SciTech Connect

    Wickes, Bronwyn; Kim, Y.; Castner, David G.

    2003-08-30

    Time-of-flight SIMS (ToF-SIMS) imaging offers a modality for simultaneously visualizing the spatial distribution of different surface species. However, the utility of ToF-SIMS datasets may be limited by their large size, degraded mass resolution and low ion counts per pixel. Through denoising and multivariate image analysis, regions of similar chemistries may be differentiated more readily in ToF-SIMS image data. Three established denoising algorithms (down-binning, boxcar and wavelet filtering) were applied to ToF-SIMS images of different surface geometries and chemistries. The effect of these filters on the performance of principal component analysis (PCA) was evaluated in terms of the capture of important chemical image features in the principal component score images, the quality of the principal component

  18. Multivariate Analyses of Small Theropod Dinosaur Teeth and Implications for Paleoecological Turnover through Time

    PubMed Central

    Larson, Derek W.; Currie, Philip J.

    2013-01-01

    Isolated small theropod teeth are abundant in vertebrate microfossil assemblages, and are frequently used in studies of species diversity in ancient ecosystems. However, determining the taxonomic affinities of these teeth is problematic due to an absence of associated diagnostic skeletal material. Species such as Dromaeosaurus albertensis, Richardoestesia gilmorei, and Saurornitholestes langstoni are known from skeletal remains that have been recovered exclusively from the Dinosaur Park Formation (Campanian). It is therefore likely that teeth from different formations widely disparate in age or geographic position are not referable to these species. Tooth taxa without any associated skeletal material, such as Paronychodon lacustris and Richardoestesia isosceles, have also been identified from multiple localities of disparate ages throughout the Late Cretaceous. To address this problem, a dataset of measurements of 1183 small theropod teeth (the most specimen-rich theropod tooth dataset ever constructed) from North America ranging in age from Santonian through Maastrichtian were analyzed using multivariate statistical methods: canonical variate analysis, pairwise discriminant function analysis, and multivariate analysis of variance. The results indicate that teeth referred to the same taxon from different formations are often quantitatively distinct. In contrast, isolated teeth found in time equivalent formations are not quantitatively distinguishable from each other. These results support the hypothesis that small theropod taxa, like other dinosaurs in the Late Cretaceous, tend to be exclusive to discrete host formations. The methods outlined have great potential for future studies of isolated teeth worldwide, and may be the most useful non-destructive technique known of extracting the most data possible from isolated and fragmentary specimens. The ability to accurately assess species diversity and turnover through time based on isolated teeth will help illuminate

  19. Multivariate analyses of small theropod dinosaur teeth and implications for paleoecological turnover through time.

    PubMed

    Larson, Derek W; Currie, Philip J

    2013-01-01

    Isolated small theropod teeth are abundant in vertebrate microfossil assemblages, and are frequently used in studies of species diversity in ancient ecosystems. However, determining the taxonomic affinities of these teeth is problematic due to an absence of associated diagnostic skeletal material. Species such as Dromaeosaurus albertensis, Richardoestesia gilmorei, and Saurornitholestes langstoni are known from skeletal remains that have been recovered exclusively from the Dinosaur Park Formation (Campanian). It is therefore likely that teeth from different formations widely disparate in age or geographic position are not referable to these species. Tooth taxa without any associated skeletal material, such as Paronychodon lacustris and Richardoestesia isosceles, have also been identified from multiple localities of disparate ages throughout the Late Cretaceous. To address this problem, a dataset of measurements of 1183 small theropod teeth (the most specimen-rich theropod tooth dataset ever constructed) from North America ranging in age from Santonian through Maastrichtian were analyzed using multivariate statistical methods: canonical variate analysis, pairwise discriminant function analysis, and multivariate analysis of variance. The results indicate that teeth referred to the same taxon from different formations are often quantitatively distinct. In contrast, isolated teeth found in time equivalent formations are not quantitatively distinguishable from each other. These results support the hypothesis that small theropod taxa, like other dinosaurs in the Late Cretaceous, tend to be exclusive to discrete host formations. The methods outlined have great potential for future studies of isolated teeth worldwide, and may be the most useful non-destructive technique known of extracting the most data possible from isolated and fragmentary specimens. The ability to accurately assess species diversity and turnover through time based on isolated teeth will help illuminate

  20. Environmental Events and the Timing of Death.

    ERIC Educational Resources Information Center

    Marriott, Cindy

    There is some evidence that the timing of death may not be random. Taking into consideration some of the variables which possibly affect death, this paper reviews intervention techniques with the possible goal of saving lives. Knowing that the elderly respond to the environment, society should accept as its responsibility the provision of support…

  1. [Near infrared spectroscopy and multivariate statistical process analysis for real-time monitoring of production process].

    PubMed

    Wang, Yi; Ma, Xiang; Wen, Ya-Dong; Zou, Quan; Wang, Jun; Tu, Jia-Run; Cai, Wen-Sheng; Shao, Xue-Guang

    2013-05-01

    Near infrared diffuse reflectance spectroscopy has been applied in on-site and on-line analysis owing to its speed, non-destructiveness, and suitability for the analysis of real, complex samples. The present work reports a real-time monitoring method for industrial production using near infrared spectroscopy and multivariate statistical process analysis. In the method, real-time near infrared spectra of the materials are collected on the production line, and the production process is then evaluated through a Hotelling T2 statistic calculated with an established model. In this work, principal component analysis (PCA) is adopted for building the model, and the statistic is calculated by projecting the real-time spectra onto the PCA model. In an application to a practical production process, it was demonstrated that variations in the production can be evaluated in real time by tracking changes in the statistic, and that products from different batches can be compared through further summaries of the statistic. The proposed method may therefore provide a practical means of quality assurance for production processes.
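
    The monitoring scheme can be sketched as follows: fit PCA on in-control spectra, then score each new spectrum with Hotelling's T2 in the retained subspace. The two-factor synthetic "spectra" below are invented for illustration.

```python
import numpy as np

def fit_pca_monitor(X, n_comp):
    """PCA model of in-control data: mean, loadings, score variances."""
    mu = X.mean(axis=0)
    u, s, vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, vt[:n_comp].T, s[:n_comp] ** 2 / (len(X) - 1)

def hotelling_t2(x, mu, P, score_var):
    """T^2: squared PCA scores of x, scaled by their training variances."""
    t = (x - mu) @ P
    return float(np.sum(t ** 2 / score_var))

rng = np.random.default_rng(7)
W = rng.normal(size=(2, 20))                     # latent spectral profiles
scores = rng.normal(size=(100, 2))
X = scores @ W + 0.05 * rng.normal(size=(100, 20))   # in-control spectra
mu, P, score_var = fit_pca_monitor(X, n_comp=2)

in_control = np.array([0.5, -0.3]) @ W + 0.05 * rng.normal(size=20)
drifted = np.array([6.0, 0.0]) @ W + 0.05 * rng.normal(size=20)
t2_ok = hotelling_t2(in_control, mu, P, score_var)
t2_bad = hotelling_t2(drifted, mu, P, score_var)   # far above control limit
```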

  2. Multivariate change point analysis in time series for volcano unrest detection

    NASA Astrophysics Data System (ADS)

    Aliotta, M. A.; Cassisi, C.; Fiumara, S.; Montalto, P.

    2016-12-01

    The detection of unrest in volcanic areas represents a key task for civil protection purposes. Nowadays, large networks for different kinds of measurements deployed on most active volcanoes supply huge amounts of data, mainly in the form of time series. Automatic techniques are needed to analyse such volumes of data. In this sense, time series analysis techniques can help exploit the information coming from the measurements to identify possible changes in volcanic behaviour. In particular, change point analysis can be used to this aim. Change point analysis is the process of detecting distributional changes within time-ordered observations. Among the different techniques proposed for this kind of analysis, we chose the SeqDrift technique (Sakthithasan et al., 2013) for its ability to deal with real-time data. The algorithm iteratively compares two consecutive sliding windows coming from the data stream to decide whether the boundary point between the two windows is a change point. The check is carried out by a non-parametric statistical test. We applied the proposed approach to a test case on Mt. Etna using a large multivariate dataset from 2011 to 2015. The results indicate that the technique is effective in detecting volcanic state changes. Sakthithasan, S., Pears, R., Koh, Y. S. (2013). One Pass Concept Change Detection for Data Streams. PAKDD (2): 461-472.
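
    The two-window scheme can be sketched with a Kolmogorov-Smirnov statistic standing in for SeqDrift's own bound: compare adjacent windows and flag the boundary when the ECDF distance exceeds the large-sample critical value. Window size, critical constant, and the simulated level shift are illustrative choices.

```python
import numpy as np

def ks_stat(a, b):
    """Two-sample KS statistic: maximum distance between the ECDFs."""
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return np.max(np.abs(cdf_a - cdf_b))

def detect_changes(x, window=100, c=1.36):
    """Flag boundary t as a change point when the KS statistic of the two
    adjacent windows exceeds c * sqrt(2 / window) (c = 1.36 ~ 5% level)."""
    crit = c * np.sqrt(2.0 / window)
    return [t for t in range(window, len(x) - window + 1, window)
            if ks_stat(x[t - window:t], x[t:t + window]) > crit]

rng = np.random.default_rng(8)
x = np.concatenate([rng.normal(0.0, 1.0, 300),    # quiet state
                    rng.normal(2.0, 1.0, 300)])   # unrest: level shift
changes = detect_changes(x)                       # should flag t = 300
```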

  3. A Hydraulic Tomographic Approach: Coupling of Travel Time and Amplitude Inversion Using Multivariate Statistics

    NASA Astrophysics Data System (ADS)

    Brauchler, R.; Cheng, J.; Dietrich, P.; Everett, M.; Johnson, B.; Sauter, M.

    2005-12-01

    Knowledge of the spatial variation in hydraulic properties plays an important role in controlling solute movement in saturated flow systems, and traditional hydrogeological approaches have difficulty providing high-resolution parameter estimates. We therefore developed an approach coupling the two existing hydraulic tomographic approaches: (a) inversion of the drawdown as a function of time (amplitude inversion) and (b) inversion of travel times of the pressure disturbance. The advantages of hydraulic travel time tomography are its high structural resolution and computational efficiency; however, travel times are primarily controlled by the aquifer diffusivity, making it difficult to determine hydraulic conductivity and storage separately. Amplitude inversion, on the other hand, can determine hydraulic conductivity and storage separately, but its heavy computational burden is often a shortcoming, especially for larger data sets. Our coupled inversion approach was developed and tested using synthetic data sets. The database for the inversion comprises simulated slug tests in which the positions of the sources (injection ports), isolated with packers, are varied between tests. The first step was the inversion of several characteristic travel times (e.g., early, intermediate and late travel times) to determine the diffusivity distribution. Second, the resulting diffusivity distributions were classified into homogeneous groups, using multivariate statistics, in order to differentiate between hydrogeological units characterized by a significant diffusivity contrast. In a final step, the amplitude inversion was performed with a numerical flow model and an automatic parameter estimator. The classified diffusivity distribution is an excellent starting model for the amplitude inversion and strongly reduces the calculation time. The final amplitude inversion overcomes

  4. A simple ergonomic measure reduces fluoroscopy time during ERCP: A multivariate analysis.

    PubMed

    Jowhari, Fahd; Hopman, Wilma M; Hookey, Lawrence

    2017-03-01

    Background and study aims Endoscopic retrograde cholangiopancreatography (ERCP) carries a radiation risk to patients undergoing the procedure and the team performing it. Fluoroscopy time (FT) has been shown to have a linear relationship with radiation exposure during ERCP. Recent modifications to our ERCP suite design were felt to impact fluoroscopy time and ergonomics. This multivariate analysis was therefore undertaken to investigate these effects, and to identify and validate various clinical, procedural and ergonomic factors influencing the total fluoroscopy time during ERCP. This would better assist clinicians with predicting prolonged fluoroscopic durations and undertaking relevant precautions accordingly. Patients and methods A retrospective analysis of 299 ERCPs performed by 4 endoscopists over an 18-month period at a single tertiary care center was conducted. All inpatients/outpatients (121 males, 178 females) undergoing ERCP for any clinical indication from January 2012 to June 2013 in the chosen ERCP suite were included in the study. Various predetermined clinical, procedural and ergonomic factors were obtained via chart review. Univariate analyses identified factors to be included in the multivariate regression model with FT as the dependent variable. Results Bringing the endoscopy and fluoroscopy screens next to each other was associated with significantly less FT than when the screens were separated further (-1.4 min, P = 0.026). Other significant factors associated with a prolonged FT included having a prior ERCP (+ 1.4 min, P = 0.031), and more difficult procedures (+ 4.2 min for each level of difficulty, P < 0.001). ERCPs performed by high-volume endoscopists used less FT than those by low-volume endoscopists (-1.82 min, P = 0.015). Conclusions Our study has identified and validated various factors that affect the total fluoroscopy time during ERCP. This is the first study to show that decreasing the distance between the

  5. A simple ergonomic measure reduces fluoroscopy time during ERCP: A multivariate analysis

    PubMed Central

    Jowhari, Fahd; Hopman, Wilma M.; Hookey, Lawrence

    2017-01-01

    Background and study aims Endoscopic retrograde cholangiopancreatography (ERCP) carries a radiation risk to patients undergoing the procedure and the team performing it. Fluoroscopy time (FT) has been shown to have a linear relationship with radiation exposure during ERCP. Recent modifications to our ERCP suite design were felt to impact fluoroscopy time and ergonomics. This multivariate analysis was therefore undertaken to investigate these effects, and to identify and validate various clinical, procedural and ergonomic factors influencing the total fluoroscopy time during ERCP. This would better assist clinicians with predicting prolonged fluoroscopic durations and undertaking relevant precautions accordingly. Patients and methods A retrospective analysis of 299 ERCPs performed by 4 endoscopists over an 18-month period at a single tertiary care center was conducted. All inpatients/outpatients (121 males, 178 females) undergoing ERCP for any clinical indication from January 2012 to June 2013 in the chosen ERCP suite were included in the study. Various predetermined clinical, procedural and ergonomic factors were obtained via chart review. Univariate analyses identified factors to be included in the multivariate regression model with FT as the dependent variable. Results Bringing the endoscopy and fluoroscopy screens next to each other was associated with significantly less FT than when the screens were separated further (–1.4 min, P = 0.026). Other significant factors associated with a prolonged FT included having a prior ERCP (+ 1.4 min, P = 0.031), and more difficult procedures (+ 4.2 min for each level of difficulty, P < 0.001). ERCPs performed by high-volume endoscopists used less FT than those by low-volume endoscopists (–1.82 min, P = 0.015). Conclusions Our study has identified and validated various factors that affect the total fluoroscopy time during ERCP. This is the first study to show that decreasing the distance between

  6. A Multitaper, Causal Decomposition for Stochastic, Multivariate Time Series: Application to High-Frequency Calcium Imaging Data.

    PubMed

    Sornborger, Andrew T; Lauderdale, James D

    2016-11-01

    Neural data analysis has increasingly incorporated causal information to study circuit connectivity. Dimensional reduction forms the basis of most analyses of large multivariate time series. Here, we present a new, multitaper-based decomposition for stochastic, multivariate time series that acts on the covariance of the time series at all lags, C(τ), as opposed to standard methods that decompose the time series, X(t), using only information at zero-lag. In both simulated and neural imaging examples, we demonstrate that methods that neglect the full causal structure may be discarding important dynamical information in a time series.
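As a concrete illustration of the quantity this decomposition acts on, the entries of C(τ) are sample cross-covariances at non-zero lags between pairs of channels. A minimal sketch of one such entry (this is plain lagged covariance, not the authors' multitaper estimator):

```python
def cross_cov(x, y, lag):
    """Sample cross-covariance between series x and y at a given lag:
    cov(x[t], y[t+lag]), computed over the overlapping range."""
    n = len(x) - abs(lag)
    if lag >= 0:
        xs, ys = x[:n], y[lag:lag + n]
    else:
        xs, ys = x[-lag:-lag + n], y[:n]
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
```

Stacking these values over all channel pairs and lags gives the lag-covariance object that zero-lag methods discard.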

  7. Young Children's Memory for the Times of Personal Past Events

    PubMed Central

    Pathman, Thanujeni; Larkina, Marina; Burch, Melissa; Bauer, Patricia J.

    2012-01-01

    Remembering the temporal information associated with personal past events is critical for autobiographical memory, yet we know relatively little about the development of this capacity. In the present research, we investigated temporal memory for naturally occurring personal events in 4-, 6-, and 8-year-old children. Parents recorded unique events in which their children participated during a 4-month period. At test, children made relative recency judgments and estimated the time of each event using conventional time-scales (time of day, day of week, month of year, and season). Children also were asked to provide justifications for their time-scale judgments. Six- and 8-year-olds, but not 4-year-olds, accurately judged the order of two distinct events. There were age-related improvements in children's estimation of the time of events using conventional time-scales. Older children provided more justifications for their time-scale judgments compared to younger children. Relations between correct responding on the time-scale judgments and provision of meaningful justifications suggest that children may use that information to reconstruct the times associated with past events. The findings can be used to chart a developmental trajectory of performance in temporal memory for personal past events, and have implications for our understanding of autobiographical memory development. PMID:23687467

  8. ETARA - EVENT TIME AVAILABILITY, RELIABILITY ANALYSIS

    NASA Technical Reports Server (NTRS)

    Viterna, L. A.

    1994-01-01

    The ETARA system was written to evaluate the performance of the Space Station Freedom Electrical Power System, but the methodology and software can be modified to simulate any system that can be represented by a block diagram. ETARA is an interactive, menu-driven reliability, availability, and maintainability (RAM) simulation program. Given a Reliability Block Diagram representation of a system, the program simulates the behavior of the system over a specified period of time using Monte Carlo methods to generate block failure and repair times as a function of exponential and/or Weibull distributions. ETARA can calculate availability parameters such as equivalent availability, state availability (percentage of time at a particular output state capability), continuous state duration and number of state occurrences. The program can simulate initial spares allotment and spares replenishment for a resupply cycle. The number of block failures are tabulated both individually and by block type. ETARA also records total downtime, repair time, and time waiting for spares. Maintenance man-hours per year and system reliability, with or without repair, at or above a particular output capability can also be calculated. The key to using ETARA is the development of a reliability or availability block diagram. The block diagram is a logical graphical illustration depicting the block configuration necessary for a function to be successfully accomplished. Each block can represent a component, a subsystem, or a system. The function attributed to each block is considered for modeling purposes to be either available or unavailable; there are no degraded modes of block performance. A block does not have to represent physically connected hardware in the actual system to be connected in the block diagram. The block needs only to have a role in contributing to an available system function. 
ETARA can model the RAM characteristics of systems represented by multilayered, nesting block diagrams
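The core Monte Carlo loop behind ETARA-style RAM simulation can be sketched as follows: draw alternating failure and repair times from exponential (or Weibull) distributions and accumulate uptime. This is a single-block illustration only, with parameter names of my own choosing; ETARA itself operates on full reliability block diagrams with spares and degraded-capability states:

```python
import random

def simulate_block(mtbf, mttr, horizon, rng, weibull_shape=None):
    """Simulate one block's up/down history over `horizon` time units and
    return its availability (fraction of time operational). Failure times
    are exponential with mean `mtbf` (or Weibull with scale `mtbf` if a
    shape is given); repair times are exponential with mean `mttr`."""
    t, uptime = 0.0, 0.0
    while t < horizon:
        ttf = (rng.weibullvariate(mtbf, weibull_shape) if weibull_shape
               else rng.expovariate(1.0 / mtbf))
        up_end = min(t + ttf, horizon)
        uptime += up_end - t
        t = up_end
        if t >= horizon:
            break
        t += rng.expovariate(1.0 / mttr)  # block is down while under repair
    return uptime / horizon

def equivalent_availability(mtbf, mttr, horizon, runs=2000, seed=42):
    """Average availability over many Monte Carlo histories."""
    rng = random.Random(seed)
    return sum(simulate_block(mtbf, mttr, horizon, rng)
               for _ in range(runs)) / runs
```

For exponential failures and repairs, the simulated value should approach the steady-state availability MTBF / (MTBF + MTTR).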

  10. A distributed computing system for multivariate time series analyses of multichannel neurophysiological data.

    PubMed

    Müller, Andy; Osterhage, Hannes; Sowa, Robert; Andrzejak, Ralph G; Mormann, Florian; Lehnertz, Klaus

    2006-04-15

    We present a client-server application for the distributed multivariate analysis of time series using standard PCs. We here concentrate on analyses of multichannel EEG/MEG data, but our method can easily be adapted to other time series. Due to the rapid development of new analysis techniques, the focus in the design of our application was not only on computational performance, but also on high flexibility and expandability of both the client and the server programs. For this purpose, the communication between the server and the clients as well as the building of the computational tasks has been realized via the Extensible Markup Language (XML). Running our newly developed method in an asynchronous distributed environment with random availability of remote and heterogeneous resources, we tested the system's performance for a number of different univariate and bivariate analysis techniques. Results indicate that for most of the currently available analysis techniques, calculations can be performed in real time, which, in principle, allows on-line analyses at relatively low cost.
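As an illustration of XML-encoded task descriptions of the kind the abstract mentions, the sketch below serializes and parses a toy analysis task with Python's standard library. The element and attribute names are invented for illustration and are not the schema of the cited system:

```python
import xml.etree.ElementTree as ET

def build_task(task_id, method, channels, params):
    """Serialize an analysis task (method, channel list, parameters) as XML."""
    task = ET.Element("task", id=str(task_id), method=method)
    ch = ET.SubElement(task, "channels")
    ch.text = ",".join(channels)
    for name, value in params.items():
        ET.SubElement(task, "param", name=name).text = str(value)
    return ET.tostring(task, encoding="unicode")

def parse_task(xml_string):
    """Recover the task description a client would act on."""
    task = ET.fromstring(xml_string)
    return {
        "id": task.get("id"),
        "method": task.get("method"),
        "channels": task.find("channels").text.split(","),
        "params": {p.get("name"): p.text for p in task.findall("param")},
    }
```

The appeal of this design, as in the paper, is that adding a new analysis technique only requires a new task description, not a change to the client-server protocol.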

  11. Learning Linear Dynamical Systems from Multivariate Time Series: A Matrix Factorization Based Framework

    PubMed Central

    Liu, Zitao; Hauskrecht, Milos

    2016-01-01

    The linear dynamical system (LDS) model is arguably the most commonly used time series model for real-world engineering and financial applications due to its relative simplicity, mathematically predictable behavior, and the fact that exact inference and predictions for the model can be done efficiently. In this work, we propose a new generalized LDS framework, gLDS, for learning LDS models from a collection of multivariate time series (MTS) data based on matrix factorization, which is different from traditional EM learning and spectral learning algorithms. In gLDS, each MTS sequence is factorized as a product of a shared emission matrix and a sequence-specific (hidden) state dynamics, where an individual hidden state sequence is represented with the help of a shared transition matrix. One advantage of our generalized formulation is that various types of constraints can be easily incorporated into the learning process. Furthermore, we propose a novel temporal smoothing regularization approach for learning the LDS model, which stabilizes the model, its learning algorithm and predictions it makes. Experiments on several real-world MTS data show that (1) regular LDS models learned from gLDS are able to achieve better time series predictive performance than other LDS learning algorithms; (2) constraints can be directly integrated into the learning process to achieve special properties such as stability, low-rankness; and (3) the proposed temporal smoothing regularization encourages more stable and accurate predictions. PMID:27830108
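To convey the flavor of LDS learning in the simplest possible setting, the sketch below simulates a one-dimensional LDS and recovers its transition coefficient by least squares. The actual gLDS framework factorizes multivariate sequences into a shared emission matrix and sequence-specific hidden state dynamics; this scalar example only illustrates the underlying model class:

```python
def simulate_lds(a, c, x0, n, noise=None):
    """Simulate a 1-D linear dynamical system:
    x[t+1] = a * x[t] (+ optional noise), y[t] = c * x[t]."""
    ys, x = [], x0
    for t in range(n):
        ys.append(c * x)
        x = a * x + (noise(t) if noise else 0.0)
    return ys

def fit_transition(y):
    """Least-squares estimate of the scalar transition coefficient:
    argmin_a sum_t (y[t+1] - a * y[t])^2."""
    num = sum(y[t] * y[t + 1] for t in range(len(y) - 1))
    den = sum(y[t] * y[t] for t in range(len(y) - 1))
    return num / den
```

In the noiseless case the estimate is exact; with process noise it remains the standard regression estimate of the dynamics.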

  12. Using timed event sequential data in nursing research.

    PubMed

    Pecanac, Kristen E; Doherty-King, Barbara; Yoon, Ju Young; Brown, Roger; Schiefelbein, Tony

    2015-01-01

    Measuring behavior is important in nursing research, and innovative technologies are needed to capture the "real-life" complexity of behaviors and events. The purpose of this article is to describe the use of timed event sequential data in nursing research and to demonstrate the use of this data in a research study. Timed event sequencing allows the researcher to capture the frequency, duration, and sequence of behaviors as they occur in an observation period and to link the behaviors to contextual details. Timed event sequential data can easily be collected with handheld computers, loaded with a software program designed for capturing observations in real time. Timed event sequential data add considerable strength to analysis of any nursing behavior of interest, which can enhance understanding and lead to improvement in nursing practice.

  13. A Regularized Linear Dynamical System Framework for Multivariate Time Series Analysis

    PubMed Central

    Liu, Zitao; Hauskrecht, Milos

    2015-01-01

    Linear Dynamical System (LDS) is an elegant mathematical framework for modeling and learning Multivariate Time Series (MTS). However, in general, it is difficult to set the dimension of an LDS’s hidden state space. A small number of hidden states may not be able to model the complexities of an MTS, while a large number of hidden states can lead to overfitting. In this paper, we study learning methods that impose various regularization penalties on the transition matrix of the LDS model and propose a regularized LDS learning framework (rLDS) which aims to (1) automatically shut down LDSs’ spurious and unnecessary dimensions, and consequently, address the problem of choosing the optimal number of hidden states; (2) prevent the overfitting problem given a small amount of MTS data; and (3) support accurate MTS forecasting. To learn the regularized LDS from data we incorporate a second order cone program and a generalized gradient descent method into the Maximum a Posteriori framework and use Expectation Maximization to obtain a low-rank transition matrix of the LDS model. We propose two priors for modeling the matrix which lead to two instances of our rLDS. We show that our rLDS is able to recover well the intrinsic dimensionality of the time series dynamics and it improves the predictive performance when compared to baselines on both synthetic and real-world MTS datasets. PMID:25905027

  14. Multivariate or Multivariable Regression?

    PubMed Central

    Goodman, Melody

    2013-01-01

    The terms multivariate and multivariable are often used interchangeably in the public health literature. However, these terms actually represent 2 very distinct types of analyses. We define the 2 types of analysis and assess the prevalence of use of the statistical term multivariate in a 1-year span of articles published in the American Journal of Public Health. Our goal is to make a clear distinction and to identify the nuances that make these types of analyses so distinct from one another. PMID:23153131

  15. Nuclear event zero-time calculation and uncertainty evaluation.

    PubMed

    Pan, Pujing; Ungar, R Kurt

    2012-04-01

    It is important to know the initial time, or zero-time, of a nuclear event such as a nuclear weapon's test, a nuclear power plant accident or a nuclear terrorist attack (e.g. with an improvised nuclear device, IND). Together with relevant meteorological information, the calculated zero-time is used to help locate the origin of a nuclear event. The zero-time of a nuclear event can be derived from measured activity ratios of two nuclides. The calculated zero-time of a nuclear event would not be complete without an appropriately evaluated uncertainty term. In this paper, analytical equations for zero-time and the associated uncertainty calculations are derived using a measured activity ratio of two nuclides. Application of the derived equations is illustrated in a realistic example using data from the last Chinese thermonuclear test in 1980. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
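The zero-time calculation from a two-nuclide activity ratio follows from A1(t)/A2(t) = R0 * exp(-(lam1 - lam2) * t), which inverts to t = ln(R0 / R(t)) / (lam1 - lam2), with first-order propagation of the ratio measurement uncertainty. A minimal sketch under those standard decay relations (variable names are mine; the paper's full treatment carries additional uncertainty terms such as those on the half-lives):

```python
import math

def decay_const(half_life):
    """Decay constant lambda = ln(2) / T_half, in 1/time units of T_half."""
    return math.log(2) / half_life

def zero_time(ratio_measured, ratio_initial, t_half_1, t_half_2):
    """Elapsed time since the event, from the measured activity ratio
    A1/A2, the (known) ratio at zero-time, and the two half-lives."""
    dlam = decay_const(t_half_1) - decay_const(t_half_2)
    return math.log(ratio_initial / ratio_measured) / dlam

def zero_time_uncertainty(ratio_measured, sigma_ratio, t_half_1, t_half_2):
    """First-order propagation of the ratio uncertainty:
    sigma_t = sigma_R / (R * |lam1 - lam2|)."""
    dlam = abs(decay_const(t_half_1) - decay_const(t_half_2))
    return sigma_ratio / (ratio_measured * dlam)
```

A forward check makes the inversion transparent: compute the ratio a known elapsed time would produce, then recover that time from it.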

  16. Predicting analysis time in event-driven clinical trials with event-reporting lag.

    PubMed

    Wang, Jianming; Ke, Chunlei; Jiang, Qi; Zhang, Charlie; Snapinn, Steven

    2012-04-30

    For a clinical trial with a time-to-event primary endpoint, the rate of accrual of the event of interest determines the timing of the analysis, upon which significant resources and strategic planning depend. It is important to be able to predict the analysis time early and accurately. Currently available methods use either parametric or nonparametric models to predict the analysis time based on accumulating information about enrollment, event, and study withdrawal rates and implicitly assume that the available data are completely reported at the time of performing the prediction. This assumption, however, may not be true when it takes a certain amount of time (i.e., event-reporting lag) for an event to be reported, in which case, the data are incomplete for prediction. Ignoring the event-reporting lag could substantially impact the accuracy of the prediction. In this paper, we describe a general parametric model to incorporate event-reporting lag into analysis time prediction. We develop a prediction procedure using a Bayesian method and provide detailed implementations for exponential distributions. Some simulations were performed to evaluate the performance of the proposed method. An application to an on-going clinical trial is also described. Copyright © 2012 John Wiley & Sons, Ltd.
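The basic event-driven prediction problem can be illustrated by Monte Carlo: simulate accrual and exponential event times, then read off when the target event count is reached. The sketch below deliberately omits the paper's key ingredient, the event-reporting lag and its Bayesian treatment, which would shift each simulated event by a reporting delay; all parameter names are illustrative:

```python
import random

def predicted_analysis_time(n_patients, accrual_end, hazard, target_events,
                            runs=200, seed=7):
    """Monte Carlo prediction of the calendar time at which the target
    number of events occurs: uniform accrual on [0, accrual_end] and
    exponential time-to-event with the given hazard. Returns the median
    predicted analysis time over the simulated trials."""
    rng = random.Random(seed)
    times = []
    for _ in range(runs):
        # calendar time of each patient's event = entry time + event time
        event_calendar = sorted(rng.uniform(0, accrual_end) +
                                rng.expovariate(hazard)
                                for _ in range(n_patients))
        times.append(event_calendar[target_events - 1])
    times.sort()
    return times[len(times) // 2]
```

With accumulating trial data, the enrollment and hazard inputs would be replaced by estimates updated as the trial proceeds.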

  17. Multivariate analysis of GPS position time series of JPL second reprocessing campaign

    NASA Astrophysics Data System (ADS)

    Amiri-Simkooei, A. R.; Mohammadloo, T. H.; Argus, D. F.

    2017-01-01

    The second reprocessing of all GPS data gathered by the Analysis Centers of IGS was conducted in late 2013 using the latest models and methodologies. Improved models of antenna phase center variations and solar radiation pressure in JPL's reanalysis are expected to significantly reduce errors. In an earlier work, JPL estimates of position time series, termed first reprocessing campaign, were examined in terms of their spatial and temporal correlation, power spectra, and draconitic signal. Similar analyses are applied to GPS time series at 89 and 66 sites of the second reanalysis with the time span of 7 and 21 years, respectively, to study possible improvements. Our results indicate that the spatial correlations are reduced on average by a factor of 1.25. While the white and flicker noise amplitudes for all components are reduced by 29-56%, the random walk amplitude is enlarged. The white, flicker, and random walk noise amount to rate errors of, respectively, 0.01, 0.12, and 0.09 mm/yr in the horizontal and 0.04, 0.41 and 0.3 mm/yr in the vertical. Signals reported previously, such as those with periods of 13.63, 14.76, 5.5, and 351.4/n days for n = 1, 2, ..., 8, are identified in multivariate spectra of both data sets. The oscillation of the draconitic signal is reduced by factors of 1.87, 1.87, and 1.68 in the east, north and up components, respectively. Two other signals with a Chandlerian period and a period of 380 days can also be detected.

  18. Multivariate analysis of GPS position time series of JPL second reprocessing campaign

    NASA Astrophysics Data System (ADS)

    Amiri-Simkooei, A. R.; Mohammadloo, T. H.; Argus, D. F.

    2017-06-01

    The second reprocessing of all GPS data gathered by the Analysis Centers of IGS was conducted in late 2013 using the latest models and methodologies. Improved models of antenna phase center variations and solar radiation pressure in JPL's reanalysis are expected to significantly reduce errors. In an earlier work, JPL estimates of position time series, termed first reprocessing campaign, were examined in terms of their spatial and temporal correlation, power spectra, and draconitic signal. Similar analyses are applied to GPS time series at 89 and 66 sites of the second reanalysis with the time span of 7 and 21 years, respectively, to study possible improvements. Our results indicate that the spatial correlations are reduced on average by a factor of 1.25. While the white and flicker noise amplitudes for all components are reduced by 29-56%, the random walk amplitude is enlarged. The white, flicker, and random walk noise amount to rate errors of, respectively, 0.01, 0.12, and 0.09 mm/yr in the horizontal and 0.04, 0.41 and 0.3 mm/yr in the vertical. Signals reported previously, such as those with periods of 13.63, 14.76, 5.5, and 351.4/n days for n = 1, 2, ..., 8, are identified in multivariate spectra of both data sets. The oscillation of the draconitic signal is reduced by factors of 1.87, 1.87, and 1.68 in the east, north and up components, respectively. Two other signals with a Chandlerian period and a period of 380 days can also be detected.

  19. Bicomponent Trend Maps: A Multivariate Approach to Visualizing Geographic Time Series

    PubMed Central

    Schroeder, Jonathan P.

    2012-01-01

    The most straightforward approaches to temporal mapping cannot effectively illustrate all potentially significant aspects of spatio-temporal patterns across many regions and times. This paper introduces an alternative approach, bicomponent trend mapping, which employs a combination of principal component analysis and bivariate choropleth mapping to illustrate two distinct dimensions of long-term trend variations. The approach also employs a bicomponent trend matrix, a graphic that illustrates an array of typical trend types corresponding to different combinations of scores on two principal components. This matrix is useful not only as a legend for bicomponent trend maps but also as a general means of visualizing principal components. To demonstrate and assess the new approach, the paper focuses on the task of illustrating population trends from 1950 to 2000 in census tracts throughout major U.S. urban cores. In a single static display, bicomponent trend mapping is not able to depict as wide a variety of trend properties as some other multivariate mapping approaches, but it can make relationships among trend classes easier to interpret, and it offers some unique flexibility in classification that could be particularly useful in an interactive data exploration environment. PMID:23504193

  20. Analysis of heterogeneous dengue transmission in Guangdong in 2014 with multivariate time series model

    PubMed Central

    Cheng, Qing; Lu, Xin; Wu, Joseph T.; Liu, Zhong; Huang, Jincai

    2016-01-01

    Guangdong experienced the largest dengue epidemic in recent history. In 2014, the number of dengue cases was the highest in the previous 10 years and comprised more than 90% of all cases. In order to analyze heterogeneous transmission of dengue, a multivariate time series model decomposing dengue risk additively into endemic, autoregressive and spatiotemporal components was used to model dengue transmission. Moreover, random effects were introduced in the model to deal with heterogeneous dengue transmission and incidence levels, and a power law approach was embedded into the model to account for spatial interaction. There was little spatial variation in the autoregressive component. In contrast, for the endemic component, there was a pronounced heterogeneity between the Pearl River Delta area and the remaining districts. For the spatiotemporal component, there was considerable heterogeneity across districts, with the highest values in some western and eastern districts. Clustering analysis revealed the patterns driving dengue transmission: the endemic component appears to be important in the Pearl River Delta area, where incidence is high (95 per 100,000), while areas with relatively low incidence (4 per 100,000) are highly dependent on spatiotemporal spread and local autoregression. PMID:27666657

  1. A multivariate probabilistic graphical model for real-time volcano monitoring on Mount Etna

    NASA Astrophysics Data System (ADS)

    Cannavò, Flavio; Cannata, Andrea; Cassisi, Carmelo; Di Grazia, Giuseppe; Montalto, Placido; Prestifilippo, Michele; Privitera, Eugenio; Coltelli, Mauro; Gambino, Salvatore

    2017-05-01

    Real-time assessment of the state of a volcano plays a key role for civil protection purposes. Unfortunately, because of the coupling of highly nonlinear and partially known complex volcanic processes, and the intrinsic uncertainties in measured parameters, the state of a volcano needs to be expressed in probabilistic terms, thus making any rapid assessment sometimes impractical. With the aim of aiding on-duty personnel in volcano-monitoring roles, we present an expert system approach to automatically estimate the ongoing state of a volcano from all available measurements. The system consists of a probabilistic model that encodes the conditional dependencies between measurements and volcanic states in a directed acyclic graph and renders an estimation of the probability distribution of the feasible volcanic states. We test the model with Mount Etna (Italy) as a case study by considering a long record of multivariate data. Results indicate that the proposed model is effective for early warning and has considerable potential for decision-making purposes.
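The kind of probabilistic state estimate described above can be illustrated, in drastically simplified form, with Bayes' rule over a handful of discrete states. The real system encodes conditional dependencies among measurements in a directed acyclic graph; the naive-Bayes sketch below, with invented state and observation names, only shows how measurements update a probability distribution over volcanic states:

```python
def posterior_states(prior, likelihood, observations):
    """Posterior over hidden states given conditionally independent
    observations, via Bayes' rule:
    P(s | o1..on) proportional to P(s) * product_i P(oi | s).
    `likelihood[s][o]` gives P(o | s)."""
    post = dict(prior)
    for obs in observations:
        for s in post:
            post[s] *= likelihood[s][obs]
    z = sum(post.values())  # normalize to a probability distribution
    return {s: p / z for s, p in post.items()}
```

For example, repeated high-tremor readings should shift probability mass from a quiet state toward unrest, which is the kind of early-warning signal the abstract describes.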

  2. Analysis of heterogeneous dengue transmission in Guangdong in 2014 with multivariate time series model.

    PubMed

    Cheng, Qing; Lu, Xin; Wu, Joseph T; Liu, Zhong; Huang, Jincai

    2016-09-26

    Guangdong experienced the largest dengue epidemic in recent history. In 2014, the number of dengue cases was the highest in the previous 10 years and comprised more than 90% of all cases. In order to analyze heterogeneous transmission of dengue, a multivariate time series model decomposing dengue risk additively into endemic, autoregressive and spatiotemporal components was used to model dengue transmission. Moreover, random effects were introduced in the model to deal with heterogeneous dengue transmission and incidence levels, and a power law approach was embedded into the model to account for spatial interaction. There was little spatial variation in the autoregressive component. In contrast, for the endemic component, there was a pronounced heterogeneity between the Pearl River Delta area and the remaining districts. For the spatiotemporal component, there was considerable heterogeneity across districts, with the highest values in some western and eastern districts. Clustering analysis revealed the patterns driving dengue transmission: the endemic component appears to be important in the Pearl River Delta area, where incidence is high (95 per 100,000), while areas with relatively low incidence (4 per 100,000) are highly dependent on spatiotemporal spread and local autoregression.

  3. Predicting proton event time characteristics from radio burst data

    SciTech Connect

    Bakshi, P.; Nguyen, T.

    1981-06-01

    For events originating on the Western hemisphere, the delay before onset of the solar flare protons is shown to be well correlated (r about 0.80) with the rise time of the associated radio burst at 2-3 GHz or the rise time of the Hα flare. The peak flux time of the protons is shown to be very well correlated (r about 0.90) with the delay before onset, and fairly well correlated (r about 0.70) with the flare or radio rise time. These results allow a prediction of the proton event time characteristics from real-time radio burst data.

  4. Family Events and the Timing of Intergenerational Transfers

    ERIC Educational Resources Information Center

    Leopold, Thomas; Schneider, Thorsten

    2011-01-01

    This research investigates how family events in adult children's lives influence the timing of their parents' financial transfers. We draw on retrospective data collected by the German Socio-Economic Panel Study and use event history models to study the effects of marriage, divorce and childbirth on the receipt of large gifts from parents. We find…

  6. Sensor-Generated Time Series Events: A Definition Language

    PubMed Central

    Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan

    2012-01-01

    There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called a posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is, generally applicable and accurate, for identifying the events contained in the time series.
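
    A rule in an events definition language of this kind ultimately reduces to a predicate over samples of the time series. A minimal sketch of one such rule, a threshold crossing that must persist for a minimum duration (an illustrative simplification, not the language proposed in the paper):

```python
def find_events(series, threshold, min_len):
    """Return (start, end) index pairs for maximal runs of samples
    strictly above `threshold` that last at least `min_len` samples.
    `end` is exclusive, so series[start:end] is the event region."""
    events, start = [], None
    for i, v in enumerate(series):
        if v > threshold:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_len:
                events.append((start, i))
            start = None
    if start is not None and len(series) - start >= min_len:
        events.append((start, len(series)))
    return events

signal = [0.1, 0.2, 1.5, 1.7, 1.6, 0.3, 0.2, 2.0, 0.1, 1.8, 1.9, 1.7, 1.2, 0.0]
events = find_events(signal, threshold=1.0, min_len=2)
# Runs at indices 2-4 and 9-12 qualify; the lone spike at index 7 is too short.
```

    A real definition language would let the user compose such predicates (on amplitude, slope, duration, or relations between the four posturograph channels) rather than hard-coding one.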

  7. Event-by-event study of space-time dynamics in flux-tube fragmentation

    NASA Astrophysics Data System (ADS)

    Wong, Cheuk-Yin

    2017-07-01

    In the semi-classical description of the flux-tube fragmentation process for hadron production and hadronization in high-energy e+e− annihilations and pp collisions, the rapidity-space-time ordering and the local conservation laws of charge, flavor, and momentum provide a set of powerful tools that may allow the reconstruction of the space-time dynamics of quarks and mesons in exclusive measurements of produced hadrons, on an event-by-event basis. We propose procedures to reconstruct the space-time dynamics from event-by-event exclusive hadron data to exhibit explicitly the ordered chain of hadrons produced in a flux-tube fragmentation. As a supplementary tool, we infer the average space-time coordinates of the q-q̄ pair production vertices from the π− rapidity distribution data obtained by the NA61/SHINE Collaboration in pp collisions at √s = 6.3 to 17.3 GeV.

  8. Exploring the statistical convergence of earthquake inter-event times

    NASA Astrophysics Data System (ADS)

    Naylor, M.; Main, I.; Touati, S.

    2008-12-01

    Seismic activity is routinely quantified using mean event rates or mean inter-event times. Standard estimates of the error on such mean values implicitly assume that the events that are used to calculate the mean are independent. However, earthquakes can be triggered by other events and are thus not necessarily independent. As a result, the errors on mean earthquake inter-event times do not exhibit Gaussian convergence with increasing sample size according to the Central Limit Theorem [1]. In this presentation we investigate how the errors decay with sample size in earthquake catalogues. We demonstrate that the errors on mean inter-event times, as a function of sample size, are well estimated by defining an effective sample size using the autocorrelation function to estimate the number of pieces of independent data that exist in samples of different length. This allows us to accurately project estimates of error as a function of sample size, which are further verified using extended simulations of the ETAS model. This is a generic technique that can be used to assess errors on a wide variety of correlated datasets. [1] Naylor, M., Main, I.G. and Touati, S., (In press) Quantifying uncertainties on mean earthquake inter-event times, JGR.
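
    The effective-sample-size idea described above can be sketched as follows, using the common truncated estimator n_eff = n / (1 + 2·Σ ρ_k), with the sum over positive-lag autocorrelations cut off at the first non-positive value. This is an illustration with synthetic AR(1) data, not the authors' implementation:

```python
import random

def autocorr(xs, lag):
    """Sample autocorrelation of xs at the given lag."""
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs)
    cov = sum((xs[i] - m) * (xs[i + lag] - m) for i in range(n - lag))
    return cov / var

def effective_sample_size(xs, max_lag=None):
    """n_eff = n / (1 + 2 * sum of positive-lag autocorrelations),
    truncating the sum at the first non-positive autocorrelation."""
    n = len(xs)
    if max_lag is None:
        max_lag = n // 4
    s = 0.0
    for lag in range(1, max_lag + 1):
        rho = autocorr(xs, lag)
        if rho <= 0:
            break
        s += rho
    return n / (1 + 2 * s)

# Synthetic correlated "inter-event times": an AR(1) process with phi = 0.8,
# standing in for a triggered (non-independent) earthquake sequence.
random.seed(0)
xs, prev = [], 0.0
for _ in range(2000):
    prev = 0.8 * prev + random.gauss(0, 1)
    xs.append(prev)
n_eff = effective_sample_size(xs)
# For AR(1) with phi = 0.8, theory gives n_eff ~ n * (1 - phi) / (1 + phi),
# i.e. roughly n / 9: far fewer independent data than raw samples.
```

    The standard error on the mean then scales as 1/sqrt(n_eff) rather than 1/sqrt(n), which is why naive Gaussian error bars on mean inter-event times are too optimistic.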

  9. Intraoperative imaging of cortical cerebral perfusion by time-resolved thermography and multivariate data analysis.

    PubMed

    Steiner, Gerald; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Kirsch, Matthias

    2011-01-01

    A new approach to cortical perfusion imaging is demonstrated using high-sensitivity thermography in conjunction with multivariate statistical data analysis. Local temperature changes caused by a cold bolus are imaged and transferred to a false-color image. A cold bolus of 10 ml of saline at ice temperature is injected systemically via a central venous access. During the injection, a sequence of 735 thermographic images is recorded within 2 min. The recorded data cube is subjected to a principal component analysis (PCA) to detect slight changes in cortical temperature caused by the cold bolus. PCA reveals that 11 s after injection the temperature of the blood vessels briefly decreases, followed by an increase back to the temperature before the cold bolus was injected. We demonstrate the potential of intraoperative thermography in combination with multivariate data analysis to image cortical cerebral perfusion without any markers, and provide the first in vivo application of multivariate thermographic imaging.
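
    The PCA step can be illustrated on a synthetic "data cube" flattened to a (frames × pixels) matrix: power iteration recovers the leading component, whose spatial loadings mark the pixels carrying the transient and whose temporal scores trace the cold-bolus dip. A minimal sketch with made-up data, not the authors' pipeline:

```python
import math, random

def first_principal_component(rows, n_iter=200):
    """Power iteration for the leading right singular vector (spatial
    'footprint') of a mean-centered (frames x pixels) data matrix.
    Returns (spatial_footprint, temporal_scores)."""
    n, p = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(p)]
    x = [[r[j] - means[j] for j in range(p)] for r in rows]
    v = [random.gauss(0, 1) for _ in range(p)]
    for _ in range(n_iter):
        # w = X v: project each frame onto the current component.
        w = [sum(xi[j] * v[j] for j in range(p)) for xi in x]
        # v = X^T w, then normalize: one power-iteration step on X^T X.
        v = [sum(x[i][j] * w[i] for i in range(n)) for j in range(p)]
        norm = math.sqrt(sum(c * c for c in v))
        v = [c / norm for c in v]
    scores = [sum(xi[j] * v[j] for j in range(p)) for xi in x]
    return v, scores

# Synthetic "cold bolus": pixels 0 and 1 dip in temperature mid-sequence,
# pixels 2 and 3 carry only noise.
random.seed(1)
frames = []
for t in range(60):
    dip = -1.0 if 20 <= t < 30 else 0.0
    frames.append([dip + random.gauss(0, 0.05),
                   dip + random.gauss(0, 0.05),
                   random.gauss(0, 0.05),
                   random.gauss(0, 0.05)])
footprint, scores = first_principal_component(frames)
# The footprint loads on pixels 0-1; the dip frames stand out in the scores.
```

    In the real application the "pixels" number in the tens of thousands and there are 735 frames, but the principle is the same: the dominant component isolates the small, spatially coherent temperature transient from the background.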

  10. Semiparametric time-to-event modeling in the presence of a latent progression event.

    PubMed

    Rice, John D; Tsodikov, Alex

    2016-08-24

    In cancer research, interest frequently centers on factors influencing a latent event that must precede a terminal event. In practice it is often impossible to observe the latent event precisely, making inference about this process difficult. To address this problem, we propose a joint model for the unobserved time to the latent and terminal events, with the two events linked by the baseline hazard. Covariates enter the model parametrically as linear combinations that multiply, respectively, the hazard for the latent event and the hazard for the terminal event conditional on the latent one. We derive the partial likelihood estimators for this problem assuming the latent event is observed, and propose a profile likelihood-based method for estimation when the latent event is unobserved. The baseline hazard in this case is estimated nonparametrically using the EM algorithm, which allows for closed-form Breslow-type estimators at each iteration, bringing improved computational efficiency and stability compared with maximizing the marginal likelihood directly. We present simulation studies to illustrate the finite-sample properties of the method; its use in practice is demonstrated in the analysis of a prostate cancer data set.

  11. Real-Time Event Detection for Monitoring Natural and Source ...

    EPA Pesticide Factsheets

    The use of event detection systems in finished drinking water systems is increasing in order to monitor water quality in both operational and security contexts. Recent incidents involving harmful algal blooms and chemical spills into watersheds have increased interest in monitoring source water quality prior to treatment. This work highlights the use of the CANARY event detection software in detecting suspected illicit events in an actively monitored watershed in South Carolina. CANARY is an open source event detection software package developed by USEPA and Sandia National Laboratories. The software works with any type of sensor, utilizes multiple detection algorithms and approaches, and can incorporate operational information as needed. Monitoring has been underway for several years to detect events related to intentional or unintentional dumping of materials into the monitored watershed. This work evaluates the feasibility of using CANARY to enhance the detection of events in this watershed. This presentation will describe the real-time monitoring approach used in this watershed, the selection of CANARY configuration parameters that optimize detection for this watershed and monitoring application, and the performance of CANARY during the time frame analyzed. Further, this work will highlight how rainfall events impacted the analysis, and the innovative application of CANARY used to effectively detect the suspected illicit events.

  12. Analysing adverse events by time-to-event models: the CLEOPATRA study.

    PubMed

    Proctor, Tanja; Schumacher, Martin

    2016-07-01

    When analysing primary and secondary endpoints in a clinical trial with patients suffering from a chronic disease, statistical models for time-to-event data are commonly used and accepted. This is in contrast to the analysis of data on adverse events where often only a table with observed frequencies and corresponding test statistics is reported. An example is the recently published CLEOPATRA study where a three-drug regimen is compared with a two-drug regimen in patients with HER2-positive first-line metastatic breast cancer. Here, as described earlier, primary and secondary endpoints (progression-free and overall survival) are analysed using time-to-event models, whereas adverse events are summarized in a simple frequency table, although the duration of study treatment differs substantially. In this paper, we demonstrate the application of time-to-event models to first serious adverse events using the data of the CLEOPATRA study. This will cover the broad range between a simple incidence rate approach over survival and competing risks models (with death as a competing event) to multi-state models. We illustrate all approaches by means of graphical displays highlighting the temporal dynamics and compare the obtained results. For the CLEOPATRA study, the resulting hazard ratios are all in the same order of magnitude. But the use of time-to-event models provides valuable and additional information that would potentially be overlooked by only presenting incidence proportions. These models adequately address the temporal dynamics of serious adverse events as well as death of patients. Copyright © 2016 John Wiley & Sons, Ltd.
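
    The simplest model in the spectrum described above, the incidence rate approach, divides the number of first events by the accumulated person-time at risk, so that differing treatment durations enter the denominator rather than being ignored. A sketch with hypothetical arm-level numbers (not the CLEOPATRA data):

```python
def incidence_rate(n_events, person_years):
    """First-event incidence rate: events per unit of person-time at risk."""
    return n_events / person_years

# Hypothetical summaries for two treatment arms with unequal follow-up.
rate_a = incidence_rate(120, 800.0)   # arm A: 120 first SAEs over 800 person-years
rate_b = incidence_rate(90, 500.0)    # arm B: 90 first SAEs over 500 person-years
rate_ratio = rate_a / rate_b
# rate_a = 0.15/py, rate_b = 0.18/py, rate_ratio ~ 0.83: arm A has MORE raw
# events (120 vs 90) yet a LOWER rate once exposure time is accounted for,
# which is exactly the distortion a naive frequency table would hide.
```

    Survival, competing-risks, and multi-state models refine this further by modeling when events happen and what (e.g. death) can preclude them, but the person-time denominator is the first correction.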

  13. Population assessment using multivariate time-series analysis: A case study of rockfishes in Puget Sound.

    PubMed

    Tolimieri, Nick; Holmes, Elizabeth E; Williams, Gregory D; Pacunski, Robert; Lowry, Dayv

    2017-04-01

    Estimating a population's growth rate and year-to-year variance is a key component of population viability analysis (PVA). However, standard PVA methods require time series of counts obtained using consistent survey methods over many years. In addition, it can be difficult to separate observation and process variance, which is critical for PVA. Time-series analysis performed with multivariate autoregressive state-space (MARSS) models is a flexible statistical framework that allows one to address many of these limitations. MARSS models allow one to combine surveys with different gears and across different sites for estimation of PVA parameters, and to implement replication, which reduces the variance-separation problem and maximizes informational input for mean trend estimation. Even data that are fragmented with unknown error levels can be accommodated. We present a practical case study that illustrates MARSS analysis steps: data choice, model set-up, model selection, and parameter estimation. Our case study is an analysis of the long-term trends of rockfish in Puget Sound, Washington, based on citizen science scuba surveys, a fishery-independent trawl survey, and recreational fishery surveys affected by bag-limit reductions. The best-supported models indicated that the recreational and trawl surveys tracked different, temporally independent assemblages that declined at similar rates (an average of -3.8% to -3.9% per year). The scuba survey tracked a separate increasing and temporally independent assemblage (an average of 4.1% per year). Three rockfishes (bocaccio, canary, and yelloweye) are listed in Puget Sound under the US Endangered Species Act (ESA). These species are associated with deep water, which the recreational and trawl surveys sample better than the scuba survey. All three ESA-listed rockfishes declined as a proportion of recreational catch between the 1970s and 2010s, suggesting that they experienced similar or more severe reductions in abundance
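
    The core MARSS idea, several observation series tracking shared latent states, can be illustrated in its simplest form: one latent random-walk trend observed by two surveys with different error variances, filtered with a scalar Kalman filter. This is a hedged one-state sketch with simulated data, a stand-in for the full multivariate MARSS machinery, not the analysis in the paper:

```python
import random

def kalman_filter_two_surveys(obs1, obs2, q, r1, r2, x0=0.0, p0=10.0):
    """Univariate random-walk state x_t = x_{t-1} + w_t with Var(w) = q,
    observed by two surveys y_{i,t} = x_t + v_{i,t} with Var(v_i) = r_i.
    Returns filtered state estimates: one latent trend pooled from
    two observation series."""
    x, p, filtered = x0, p0, []
    for y1, y2 in zip(obs1, obs2):
        p = p + q                      # predict: state uncertainty grows
        for y, r in ((y1, r1), (y2, r2)):
            k = p / (p + r)            # Kalman gain for this survey
            x = x + k * (y - x)        # update toward the observation
            p = (1 - k) * p
        filtered.append(x)
    return filtered

# Simulate a declining latent abundance index seen by two noisy surveys.
random.seed(2)
truth, x = [], 5.0
for t in range(100):
    x += -0.04 + random.gauss(0, 0.05)   # downward trend plus process noise
    truth.append(x)
obs1 = [v + random.gauss(0, 0.5) for v in truth]   # lower-noise survey
obs2 = [v + random.gauss(0, 0.8) for v in truth]   # higher-noise survey
est = kalman_filter_two_surveys(obs1, obs2, q=0.05**2, r1=0.5**2, r2=0.8**2)
```

    Pooling the two surveys through a shared state is what lets MARSS separate observation error from process variance; the full framework additionally estimates the variances and allows multiple latent trends, which is how the analysis above distinguished the declining and increasing assemblages.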

  14. Real-time measurements, rare events and photon economics

    NASA Astrophysics Data System (ADS)

    Jalali, B.; Solli, D. R.; Goda, K.; Tsia, K.; Ropers, C.

    2010-07-01

    Rogue events, otherwise known as outliers and black swans, are singular, rare events that carry dramatic impact. They appear in seemingly unconnected systems in the form of oceanic rogue waves, stock market crashes, evolution, and communication systems. Attempts to understand the underlying dynamics of such complex systems, which lead to spectacular and often cataclysmic outcomes, have been frustrated by the scarcity of events, resulting in insufficient statistical data, and by the inability to perform experiments under controlled conditions. Extreme rare events also occur in the ultrafast physical sciences, where it is possible to collect large data sets, even for rare events, in a short time period. The knowledge gained from observing rare events in ultrafast systems may provide valuable insight into extreme value phenomena that occur over a much slower timescale and that have a closer connection with human experience. One solution is a real-time ultrafast instrument capable of capturing singular and randomly occurring non-repetitive events. The time stretch technology developed during the past 13 years provides a powerful toolbox for reaching this goal. This paper reviews this technology and discusses its use in capturing rogue events in electronic signals, spectroscopy, and imaging. We show an example in nonlinear optics where it was possible to capture rare and random solitons whose unusual statistical distributions resemble those observed in financial markets. The ability to observe the true spectrum of each event in real time has led to important insight into the underlying process, which in turn has made it possible to control soliton generation, leading to improved coherence of supercontinuum light. We also show a new class of fast imagers that are being considered for early detection of cancer because of their potential ability to detect rare diseased cells (so-called rogue cells) in a large population of healthy cells.

  15. The Carrington Event: Possible Solar Proton Intensity-Time Profile

    NASA Astrophysics Data System (ADS)

    Smart, D. F.; Shea, M. A.; McCracken, K. G.

    2004-05-01

    We evaluate the >30 MeV proton fluence associated with the Carrington event as 1.9 x 10^10 protons per cm^2, based on the analysis of solar-proton-generated NO(y) radicals deposited in polar ice (see McCracken et al., JGR, 106, 21,585, 2001). We construct a possible intensity-time profile of the solar particle flux for this event by assuming that it belongs to the class of interplanetary shock dominated events in which the maximum particle flux is observed as the shock passes the Earth. We show that most of the very large solar proton fluence events (those with >30 MeV omnidirectional fluence exceeding 1 x 10^9 protons per cm^2) observed at the Earth during the last 50 years belong to this class of event.

  16. Balance characteristics of multivariate background error covariance for rainy and dry seasons and their impact on precipitation forecasts of two rainfall events

    NASA Astrophysics Data System (ADS)

    Chen, Yaodeng; Xia, Xue; Min, Jinzhong; Huang, Xiang-Yu; Rizvi, Syed R. H.

    2016-10-01

    Atmospheric moisture content, or humidity, is an important analysis variable in any meteorological data assimilation system. The humidity analysis can be univariate, using a humidity background (normally short-range numerical forecasts) and humidity observations. However, more and more data assimilation systems are multivariate, analyzing humidity together with wind, temperature and pressure. Background error covariances, with unbalanced velocity potential and humidity in the multivariate formulation, are generated from Weather Research and Forecasting model forecasts collected over a summer rainy season and a winter dry season. The correlations involving unbalanced velocity potential and humidity are shown to be significantly larger in the rainy season than in the dry season, indicating that these variables play more important roles in the rainy season. Three cycling data assimilation experiments, differing in the formulation of the background error covariances, are carried out for two rainfall events in the middle and lower reaches of the Yangtze River. Results indicate that including only unbalanced velocity potential in the multivariate background error covariance improves wind analyses but has little impact on temperature and humidity analyses. In contrast, further including humidity has a slightly negative effect on wind analyses and a neutral effect on temperature analyses, but significantly improves humidity analyses, leading to precipitation forecasts that are more consistent with the China Hourly Merged Precipitation Analysis.

  17. Moving Events in Time: Time-Referent Hand-Arm Movements Influence Perceived Temporal Distance to Past Events

    ERIC Educational Resources Information Center

    Blom, Stephanie S. A. H.; Semin, Gun R.

    2013-01-01

    We examine and find support for the hypothesis that time-referent hand-arm movements influence temporal judgments. In line with the concept of "left is associated with earlier times, and right is associated with later times," we show that performing left (right) hand-arm movements while thinking about a past event increases (decreases) the…

  19. Effects of alcohol intake on time-based event expectations.

    PubMed

    Kunchulia, Marina; Thomaschke, Roland

    2016-04-01

    Previous evidence suggests that alcohol affects various forms of temporal cognition. However, there are presently no studies investigating whether and how alcohol affects time-based event expectations. Here, we investigated the effects of alcohol on time-based event expectations. Seventeen healthy volunteers, aged between 19 and 36 years, participated. We employed a variable foreperiod paradigm with temporally predictable events, mimicking a computer game. Error rate and reaction time were analyzed in placebo (0 g/kg), low-dose (0.2 g/kg) and high-dose (0.6 g/kg) conditions. We found that alcohol intake did not eliminate, but substantially reduced, the formation of time-based expectancy. This effect was stronger for high doses than for low doses of alcohol. Our results thus provide evidence that alcohol intake impairs time-based event expectations; the mechanism by which it does so needs to be clarified by future research.

  20. Asynchronous visual event-based time-to-contact.

    PubMed

    Clady, Xavier; Clercq, Charles; Ieng, Sio-Hoi; Houseini, Fouzhan; Randazzo, Marco; Natale, Lorenzo; Bartolozzi, Chiara; Benosman, Ryad

    2014-01-01

    Reliable and fast sensing of the environment is a fundamental requirement for autonomous mobile robotic platforms. Unfortunately, the frame-based acquisition paradigm at the basis of mainstream artificial perceptive systems is limited by low temporal dynamics and redundant data flow, leading to high computational costs. Hence, conventional sensing and the associated computation are incompatible with the design of high-speed sensor-based reactive control for mobile applications, which pose strict limits on energy consumption and computational load. This paper introduces a fast obstacle avoidance method based on the output of an asynchronous event-based time-encoded imaging sensor. The proposed method relies on an event-based Time To Contact (TTC) computation using visual event-based motion flows. The approach is event-based in the sense that every incoming event adds to the computation process, thus allowing fast avoidance responses. The method is validated indoors on a mobile robot by comparing the event-based TTC with a laser range finder TTC, showing that event-based sensing offers new perspectives for mobile robotics sensing.

  2. Sharing Time: A Student-Run Speech Event.

    ERIC Educational Resources Information Center

    Foster, Michele

    An ethnographic study was made of a student-led speech event in an ethnically mixed combined first- and second-grade classroom. In an activity called "Sharing Time," general rules governed appropriate ways of behaving, but no teacher rules governed ways of speaking, topic, or amount of time at talk. Collected over a 5-month period, data…

  3. Bayesian joint modeling of longitudinal measurements and time-to-event data using robust distributions.

    PubMed

    Baghfalaki, T; Ganjali, M; Hashemi, R

    2014-01-01

    Distributional assumptions of most existing methods for joint modeling of longitudinal measurements and time-to-event data do not allow for outlier robustness. In this article, we develop and implement a joint model of longitudinal and time-to-event data using a family of distributions well suited to robust analysis, known as normal/independent distributions. These include univariate and multivariate versions of the Student's t, the slash, and the contaminated normal distributions. The proposed model uses a linear mixed effects model under a normal/independent distribution assumption for both random effects and residuals of the longitudinal process. For the time-to-event process, a parametric proportional hazards model with a Weibull baseline hazard is used. A Bayesian approach using Markov chain Monte Carlo methods is adopted for parameter estimation. Simulation studies investigate the performance of the proposed method in the presence and absence of outliers. The proposed methods are also applied to a real AIDS clinical trial comparing the efficiency and safety of two antiretroviral drugs, in which CD4 count measurements are gathered as longitudinal outcomes and time to death or dropout is the time-to-event outcome of interest. Different model structures are developed for analyzing these data, with model selection performed by the deviance information criterion (DIC), the expected Akaike information criterion (EAIC), and the expected Bayesian information criterion (EBIC).

  4. Poor physical health predicts time to additional breast cancer events and mortality in breast cancer survivors.

    PubMed

    Saquib, Nazmus; Pierce, John P; Saquib, Juliann; Flatt, Shirley W; Natarajan, Loki; Bardwell, Wayne A; Patterson, Ruth E; Stefanick, Marcia L; Thomson, Cynthia A; Rock, Cheryl L; Jones, Lovell A; Gold, Ellen B; Karanja, Njeri; Parker, Barbara A

    2011-03-01

    Health-related quality of life has been hypothesized to predict time to additional breast cancer events and all-cause mortality in breast cancer survivors. Women with early-stage breast cancer (n=2967) completed the SF-36 (mental and physical health-related quality of life) and standardized psychosocial questionnaires to assess social support, optimism, hostility, and depression prior to randomization into a dietary trial. Cox regression was performed to assess whether these measures of quality of life and psychosocial functioning predicted time to additional breast cancer events and all-cause mortality; hazard ratios were the measure of association. There were 492 additional breast cancer events and 301 deaths over a median 7.3 years (range: 0.01-10.8 years) of follow-up. In multivariate models, poorer physical health was associated with both decreased time to additional breast cancer events and all-cause mortality (p trend=0.005 and 0.004, respectively), while greater hostility predicted additional breast cancer events only (p trend=0.03). None of the other psychosocial variables predicted either outcome. The hazard ratios comparing persons with poor (bottom two quintiles) to better (top three quintiles) physical health were 1.42 (95% CI: 1.16, 1.75) for decreased time to additional breast cancer events and 1.37 (95% CI: 1.08, 1.74) for all-cause mortality. Potentially modifiable factors associated with poor physical health included higher body mass index, lower physical activity, lower alcohol consumption, and more insomnia (p<0.05 for all). Interventions to improve physical health should be tested as a means to delay additional breast cancer events and death among breast cancer survivors. Copyright © 2010 John Wiley & Sons, Ltd.

  5. Reciprocal Benefits of Mass-Univariate and Multivariate Modeling in Brain Mapping: Applications to Event-Related Functional MRI, H2 15O-, and FDG-PET

    PubMed Central

    Habeck, Christian G.

    2006-01-01

    In brain mapping studies of sensory, cognitive, and motor operations, specific waveforms of dynamic neural activity are predicted based on theoretical models of human information processing. For example, in event-related functional MRI (fMRI), the general linear model (GLM) is employed in mass-univariate analyses to identify the regions whose dynamic activity closely matches the expected waveforms. By comparison, multivariate analyses based on PCA or ICA provide greater flexibility in detecting spatiotemporal properties of experimental data that may strongly support alternative neuroscientific explanations. We investigated conjoint multivariate and mass-univariate analyses that combine the capabilities to (1) verify activation of neural machinery we already understand and (2) discover reliable signatures of new neural machinery. We examined combinations of GLM and PCA that recover latent neural signals (waveforms and footprints) with greater accuracy than either method alone. Comparative results are illustrated with analyses of real fMRI data, adding to Monte Carlo simulation support. PMID:23165047

  6. Inferring direct directed-information flow from multivariate nonlinear time series

    NASA Astrophysics Data System (ADS)

    Jachan, Michael; Henschel, Kathrin; Nawrath, Jakob; Schad, Ariane; Timmer, Jens; Schelter, Björn

    2009-07-01

    Estimating the functional topology of a network from multivariate observations is an important task in nonlinear dynamics. We introduce the nonparametric partial directed coherence that allows disentanglement of direct and indirect connections and their directions. We illustrate the performance of the nonparametric partial directed coherence by means of a simulation with data from synchronized nonlinear oscillators and apply it to real-world data from a patient suffering from essential tremor.

  7. Time Separation Between Events in a Sequence: a Regional Property?

    NASA Astrophysics Data System (ADS)

    Muirwood, R.; Fitzenz, D. D.

    2013-12-01

    Earthquake sequences are loosely defined as events occurring too closely in time and space to appear unrelated. Depending on the declustering method, several, all, or no event(s) after the first large event might be recognized as independent mainshocks. It can therefore be argued that a probabilistic seismic hazard assessment (PSHA, traditionally dealing with mainshocks only) might already include the ground shaking effects of such sequences. Alternatively, all but the largest event could be classified as an 'aftershock' and removed from the earthquake catalog. While in PSHA the question is only whether to keep or remove the events from the catalog, for risk management purposes the community response to the earthquakes, as well as insurance risk transfer mechanisms, can be profoundly affected by the actual timing of events in such a sequence. In particular, the repetition of damaging earthquakes over a period of weeks to months can lead to businesses closing and families evacuating from the region (as happened in Christchurch, New Zealand in 2011). Buildings that are damaged in the first earthquake may go on to be damaged again, even while they are being repaired. Insurance also functions around a set of critical timeframes, including the definition of a single 'event loss' for reinsurance recoveries within the 192-hour 'hours clause', the 6-18 month pace at which insurance claims are settled, and the annual renewal of insurance and reinsurance contracts. We show how temporal aspects of earthquake sequences need to be taken into account within models for risk management, and which time separations between events are most sensitive, both in terms of the modeled disruptions to lifelines and business activity and in terms of the losses to different parties (such as insureds, insurers and reinsurers). We also explore the time separation between all events and between loss-causing events for a collection of sequences from across the world, and we point to the need to

  8. Does time really slow down during a frightening event?

    PubMed

    Stetson, Chess; Fiesta, Matthew P; Eagleman, David M

    2007-12-12

    Observers commonly report that time seems to have moved in slow motion during a life-threatening event. It is unknown whether this is a function of increased time resolution during the event, or instead an illusion of remembering an emotionally salient event. Using a hand-held device to measure speed of visual perception, participants experienced free fall for 31 m before landing safely in a net. We found no evidence of increased temporal resolution, in apparent conflict with the fact that participants retrospectively estimated their own fall to last 36% longer than others' falls. The duration dilation during a frightening event, and the lack of concomitant increase in temporal resolution, indicate that subjective time is not a single entity that speeds or slows, but instead is composed of separable subcomponents. Our findings suggest that time-slowing is a function of recollection, not perception: a richer encoding of memory may cause a salient event to appear, retrospectively, as though it lasted longer.

  9. Does Time Really Slow Down during a Frightening Event?

    PubMed Central

    Stetson, Chess; Fiesta, Matthew P.; Eagleman, David M.

    2007-01-01

    Observers commonly report that time seems to have moved in slow motion during a life-threatening event. It is unknown whether this is a function of increased time resolution during the event, or instead an illusion of remembering an emotionally salient event. Using a hand-held device to measure speed of visual perception, participants experienced free fall for 31 m before landing safely in a net. We found no evidence of increased temporal resolution, in apparent conflict with the fact that participants retrospectively estimated their own fall to last 36% longer than others' falls. The duration dilation during a frightening event, and the lack of concomitant increase in temporal resolution, indicate that subjective time is not a single entity that speeds or slows, but instead is composed of separable subcomponents. Our findings suggest that time-slowing is a function of recollection, not perception: a richer encoding of memory may cause a salient event to appear, retrospectively, as though it lasted longer. PMID:18074019

  10. A hybrid approach to predicting events in clinical trials with time-to-event outcomes.

    PubMed

    Fang, Liang; Su, Zheng

    2011-09-01

In many clinical trials with time-to-event outcomes there are interim analyses planned at pre-specified event counts. It is of great value to predict when the pre-specified event milestones can be reached based on the available data as the timeline for a study is essential to the study sponsors and data monitoring committees for logistic planning purposes. Both parametric and non-parametric approaches exist in the literature for estimating the underlying survival function, based on which the predictions of future event times can be determined. The parametric approaches assume that the survival function is smooth, which is often not the case as the survival function usually has one or multiple change points and the hazard functions can differ significantly before and after a change point. The existing non-parametric method bases predictions on the Kaplan-Meier survival curve appended with a parametric tail to the largest observation, and all of the available data is used to estimate the parametric tail. This approach also requires a smooth survival function in order to achieve an accurate estimate of the tail distribution. In this article, we propose a hybrid parametric, non-parametric approach to predicting events in clinical trials with time-to-event outcomes. The change points in the survival function are first detected, and the survival function before the last change point is estimated non-parametrically and the tail distribution beyond the last change point is estimated parametrically. Numerical results show that the proposed approach provides accurate predictions for future event times and outperforms the existing approaches. Copyright © 2011 Elsevier Inc. All rights reserved.
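The hybrid estimator described in this abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the change point is assumed known (they detect it from the data), the pre-change part is a plain Kaplan-Meier product-limit estimate, and the tail is a single exponential whose hazard is fitted to the exposure of subjects observed beyond the change point. All names are hypothetical.

```python
import numpy as np

def km_with_exponential_tail(times, events, change_point):
    """Hybrid survival estimate: Kaplan-Meier up to an (assumed known)
    change point, exponential tail beyond it. Sketch only; times are
    assumed distinct, events[i] = 1 if failure, 0 if censored."""
    order = np.argsort(times)
    t = np.asarray(times, dtype=float)[order]
    d = np.asarray(events)[order]

    # Kaplan-Meier product-limit estimate up to the change point
    n_at_risk = len(t)
    surv, km_t, km_s = 1.0, [0.0], [1.0]
    for ti, di in zip(t, d):
        if ti > change_point:
            break
        if di:
            surv *= 1.0 - 1.0 / n_at_risk
            km_t.append(ti)
            km_s.append(surv)
        n_at_risk -= 1
    s_cp = surv  # survival at the change point

    # exponential hazard beyond the change point: events / exposure
    tail = t > change_point
    exposure = np.sum(t[tail] - change_point)
    lam = d[tail].sum() / exposure if exposure > 0 else 0.0

    def S(x):
        if x <= change_point:
            # step function: last KM value at or before x
            return km_s[np.searchsorted(km_t, x, side="right") - 1]
        return s_cp * np.exp(-lam * (x - change_point))
    return S

# toy data: 7 subjects, all observed to fail, change point at t = 5
S = km_with_exponential_tail([1, 2, 3, 4, 10, 12, 14], [1] * 7, 5.0)
```

With these toy inputs the KM part steps down to 3/7 by t = 4, and the tail hazard is 3 events over 21 units of post-change-point exposure.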

  11. A real-time assessment of factors influencing medication events.

    PubMed

    Dollarhide, Adrian W; Rutledge, Thomas; Weinger, Matthew B; Fisher, Erin Stucky; Jain, Sonia; Wolfson, Tanya; Dresselhaus, Timothy R

    2014-01-01

    Reducing medical error is critical to improving the safety and quality of healthcare. Physician stress, fatigue, and excessive workload are performance-shaping factors (PSFs) that may influence medical events (actual administration errors and near misses), but direct relationships between these factors and patient safety have not been clearly defined. This study assessed the real-time influence of emotional stress, workload, and sleep deprivation on self-reported medication events by physicians in academic hospitals. During an 18-month study period, 185 physician participants working at four university-affiliated teaching hospitals reported medication events using a confidential reporting application on handheld computers. Emotional stress scores, perceived workload, patient case volume, clinical experience, total sleep, and demographic variables were also captured via the handheld computers. Medication event reports (n = 11) were then correlated with these demographic and PSFs. Medication events were associated with 36.1% higher perceived workload (p < .05), 38.6% higher inpatient caseloads (p < .01), and 55.9% higher emotional stress scores (p < .01). There was a trend for reported events to also be associated with less sleep (p = .10). These results confirm the effect of factors influencing medication events, and support attention to both provider and hospital environmental characteristics for improving patient safety.

  12. The sensitivity and specificity of markers for event times.

    PubMed

    Cai, Tianxi; Pepe, Margaret Sullivan; Zheng, Yingye; Lumley, Thomas; Jenny, Nancy Swords

    2006-04-01

    The statistical literature on assessing the accuracy of risk factors or disease markers as diagnostic tests deals almost exclusively with settings where the test, Y, is measured concurrently with disease status D. In practice, however, disease status may vary over time and there is often a time lag between when the marker is measured and the occurrence of disease. One example concerns the Framingham risk score (FR-score) as a marker for the future risk of cardiovascular events, events that occur after the score is ascertained. To evaluate such a marker, one needs to take the time lag into account since the predictive accuracy may be higher when the marker is measured closer to the time of disease occurrence. We therefore consider inference for sensitivity and specificity functions that are defined as functions of time. Semiparametric regression models are proposed. Data from a cohort study are used to estimate model parameters. One issue that arises in practice is that event times may be censored. In this research, we extend in several respects the work by Leisenring et al. (1997) that dealt only with parametric models for binary tests and uncensored data. We propose semiparametric models that accommodate continuous tests and censoring. Asymptotic distribution theory for parameter estimates is developed and procedures for making statistical inference are evaluated with simulation studies. We illustrate our methods with data from the Cardiovascular Health Study, relating the FR-score measured at enrollment to subsequent risk of cardiovascular events.

  13. Events in time: Basic analysis of Poisson data

    SciTech Connect

    Engelhardt, M.E.

    1994-09-01

The report presents basic statistical methods for analyzing Poisson data, such as the number of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model for the case when the rate of occurrence varies randomly. Examples and SAS programs are given.
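The point estimate and exact confidence interval for a Poisson rate mentioned above can be computed directly from chi-square quantiles. The report itself provides SAS programs; this is an equivalent, hypothetical Python sketch of the standard (Garwood) interval:

```python
from scipy.stats import chi2

def poisson_rate_ci(n_events, exposure, alpha=0.05):
    """Point estimate and exact (Garwood) two-sided confidence interval
    for a Poisson occurrence rate, given n_events observed over a total
    exposure (e.g., component-years or operating hours)."""
    rate = n_events / exposure
    lower = (chi2.ppf(alpha / 2, 2 * n_events) / (2 * exposure)
             if n_events > 0 else 0.0)
    upper = chi2.ppf(1 - alpha / 2, 2 * n_events + 2) / (2 * exposure)
    return rate, lower, upper

# 10 events observed over 5 time units
rate, lo, hi = poisson_rate_ci(10, 5.0)
```

The interval is exact in the sense that its coverage is at least the nominal level for any true rate, which matters for the small event counts typical of reliability data.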

  14. Deconstructing events: The neural bases for space, time, and causality

    PubMed Central

    Kranjec, Alexander; Cardillo, Eileen R.; Lehet, Matthew; Chatterjee, Anjan

    2013-01-01

Space, time, and causality provide a natural structure for organizing our experience. These abstract categories allow us to think relationally in the most basic sense; understanding simple events requires one to represent the spatial relations among objects, the relative durations of actions or movements, and links between causes and effects. The present fMRI study investigates the extent to which the brain distinguishes between these fundamental conceptual domains. Participants performed a one-back task with three conditions of interest (SPACE, TIME and CAUSALITY). Each condition required comparing relations between events in a simple verbal narrative. Depending on the condition, participants were instructed to either attend to the spatial, temporal, or causal characteristics of events, but between participants, each particular event relation appeared in all three conditions. Contrasts compared neural activity during each condition against the remaining two and revealed how thinking about events is deconstructed neurally. Space trials recruited neural areas traditionally associated with visuospatial processing, primarily bilateral frontal and occipitoparietal networks. Causality trials activated areas previously found to underlie causal thinking and thematic role assignment, such as left medial frontal and left middle temporal gyri, respectively. Causality trials also produced activations in SMA, caudate, and cerebellum; cortical and subcortical regions associated with the perception of time at different timescales. The TIME contrast, however, produced no significant effects. This pattern, indicating negative results for TIME trials, but positive effects for CAUSALITY trials in areas important for time perception, motivated additional overlap analyses to further probe relations between domains. The results of these analyses suggest a closer correspondence between time and causality than between time and space. PMID:21861674

  15. Deconstructing events: the neural bases for space, time, and causality.

    PubMed

    Kranjec, Alexander; Cardillo, Eileen R; Schmidt, Gwenda L; Lehet, Matthew; Chatterjee, Anjan

    2012-01-01

    Space, time, and causality provide a natural structure for organizing our experience. These abstract categories allow us to think relationally in the most basic sense; understanding simple events requires one to represent the spatial relations among objects, the relative durations of actions or movements, and the links between causes and effects. The present fMRI study investigates the extent to which the brain distinguishes between these fundamental conceptual domains. Participants performed a 1-back task with three conditions of interest (space, time, and causality). Each condition required comparing relations between events in a simple verbal narrative. Depending on the condition, participants were instructed to either attend to the spatial, temporal, or causal characteristics of events, but between participants each particular event relation appeared in all three conditions. Contrasts compared neural activity during each condition against the remaining two and revealed how thinking about events is deconstructed neurally. Space trials recruited neural areas traditionally associated with visuospatial processing, primarily bilateral frontal and occipitoparietal networks. Causality trials activated areas previously found to underlie causal thinking and thematic role assignment, such as left medial frontal and left middle temporal gyri, respectively. Causality trials also produced activations in SMA, caudate, and cerebellum; cortical and subcortical regions associated with the perception of time at different timescales. The time contrast, however, produced no significant effects. This pattern, indicating negative results for time trials but positive effects for causality trials in areas important for time perception, motivated additional overlap analyses to further probe relations between domains. The results of these analyses suggest a closer correspondence between time and causality than between time and space.

  16. New strategy to identify radicals in a time evolving EPR data set by multivariate curve resolution-alternating least squares.

    PubMed

    Fadel, Maya Abou; de Juan, Anna; Vezin, Hervé; Duponchel, Ludovic

    2016-12-01

Electron paramagnetic resonance (EPR) spectroscopy is a powerful technique that is able to characterize radicals formed in kinetic reactions. However, spectral characterization of individual chemical species is often limited or even unmanageable due to the severe kinetic and spectral overlap among species in kinetic processes. Therefore, we applied, for the first time, the multivariate curve resolution-alternating least squares (MCR-ALS) method to time-evolving EPR data sets to model and characterize the different constituents in a kinetic reaction. Here we demonstrate the advantage of multivariate analysis in the investigation of radicals formed along the kinetic process of hydroxycoumarin in alkaline medium. Multiset analysis of several EPR-monitored kinetic experiments performed under different conditions revealed the individual paramagnetic centres as well as their kinetic profiles. The results obtained by the MCR-ALS method demonstrate its prominent potential in the analysis of time-evolved EPR spectra.
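The core of MCR-ALS is a bilinear factorization of the data matrix D (time x spectral channel) into concentration profiles C and component spectra S, refined by alternating least squares under non-negativity. A bare-bones sketch under simplifying assumptions (no closure or unimodality constraints, initialization from known profiles; all names and the synthetic data are illustrative):

```python
import numpy as np

def mcr_als(D, C0, n_iter=200):
    """Bare-bones MCR-ALS: factor D (times x channels) as D ~ C @ S with
    non-negative concentration profiles C and spectra S, by alternating
    least squares. Real MCR-ALS adds further constraints (closure,
    unimodality, spectral equality) omitted in this sketch."""
    C = C0.copy()
    for _ in range(n_iter):
        # solve C @ S = D for the spectra, then clip to non-negative
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0], 0.0, None)
        # solve for the concentrations given the current spectra
        C = np.clip(np.linalg.lstsq(S.T, D.T, rcond=None)[0].T, 0.0, None)
    return C, S

# synthetic two-species kinetic data (profiles and spectra are made up)
t = np.linspace(0.0, 1.0, 50)
C_true = np.column_stack([np.exp(-5.0 * t), 1.0 - np.exp(-5.0 * t)])
S_true = np.abs(np.random.default_rng(1).normal(size=(2, 30)))
D = C_true @ S_true
C_fit, S_fit = mcr_als(D, C_true)  # initialized at the true profiles
```

On noise-free data with a good initialization the factorization reproduces D essentially exactly; real EPR multiset analysis stacks several experiments into one D so the kinetic profiles are better constrained.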

  17. Estimating differences and ratios in median times to event

    PubMed Central

    Rogawski, Elizabeth T.; Westreich, Daniel J.; Kang, Gagandeep; Ward, Honorine D.; Cole, Stephen R.

    2016-01-01

    Time differences and time ratios are often more interpretable estimates of effect than hazard ratios for time-to-event data, especially for common outcomes. We developed a SAS macro for estimating time differences and time ratios between baseline-fixed binary exposure groups based on inverse probability weighted Kaplan-Meier curves. The macro uses pooled logistic regression to calculate inverse probability of censoring and exposure weights, draws Kaplan-Meier curves based on the weighted data, and estimates the time difference and time ratio at a user-defined survival proportion. The macro also calculates the risk difference and risk ratio at a user-specified time. Confidence intervals are constructed by bootstrap. We provide an example assessing the effect of exclusive breastfeeding during diarrhea on the incidence of subsequent diarrhea in children followed from birth to 3 years in Vellore, India. The SAS macro provided here should facilitate the wider reporting of time differences and time ratios. PMID:27465526
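The time-difference estimand in this abstract can be illustrated with unweighted Kaplan-Meier curves. The macro additionally applies inverse probability of censoring and exposure weights and bootstraps the confidence interval; this hypothetical Python sketch omits both and assumes distinct event times:

```python
import numpy as np

def km_curve(times, events):
    """Unweighted Kaplan-Meier curve (the SAS macro uses inverse
    probability weighted curves; weights are omitted here).
    Returns event times and the survival values just after them."""
    order = np.argsort(times)
    t = np.asarray(times, dtype=float)[order]
    d = np.asarray(events)[order]
    n = len(t)
    surv, ts, ss = 1.0, [], []
    for i, (ti, di) in enumerate(zip(t, d)):
        if di:  # censored observations only shrink the risk set
            surv *= 1.0 - 1.0 / (n - i)
            ts.append(ti)
            ss.append(surv)
    return np.array(ts), np.array(ss)

def time_at_survival(ts, ss, p):
    """Earliest time at which the survival curve drops to <= p."""
    idx = np.nonzero(ss <= p)[0]
    return ts[idx[0]] if idx.size else np.nan

# toy two-group comparison at the median (p = 0.5)
exposed = km_curve([2, 4, 6, 8, 10], [1, 1, 1, 1, 1])
unexposed = km_curve([1, 2, 3, 4, 5], [1, 1, 1, 1, 1])
time_diff = time_at_survival(*exposed, 0.5) - time_at_survival(*unexposed, 0.5)
```

Here the exposed group reaches 50% survival at t = 6 and the unexposed at t = 3, so the median time difference is 3 time units, which is directly interpretable where a hazard ratio is not.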

  18. A multivariate time-warping based classifier for gesture recognition with wearable strain sensors.

    PubMed

    Giorgino, Toni; Tormene, Paolo; Quaglini, Silvana

    2007-01-01

    Conductive elastomer elements can be industrially embedded into garments to form unobtrusive strain sensing stripes. The present article outlines the structure of a strain-sensor based gesture detection algorithm. Current sensing prototypes include several dozens of sensors; their redundancy with respect to the limb's degrees of freedom, and other artifacts implied by this measurement technique, call for the development of novel robust multivariate pattern-matching techniques. The algorithm's construction is explained, and its performances are evaluated in the context of motor rehabilitation exercises for both two-class and multi-class tasks.

  19. Space-Time Event Sparse Penalization for Magneto-/Electroencephalography

    PubMed Central

    Bolstad, Andrew; Van Veen, Barry; Nowak, Robert

    2009-01-01

    This article presents a new spatio-temporal method for M/EEG source reconstruction based on the assumption that only a small number of events, localized in space and/or time, are responsible for the measured signal. Each space-time event is represented using a basis function expansion which reflects the most relevant (or measurable) features of the signal. This model of neural activity leads naturally to a Bayesian likelihood function which balances the model fit to the data with the complexity of the model, where the complexity is related to the number of included events. A novel Expectation-Maximization algorithm which maximizes the likelihood function is presented. The new method is shown to be effective on several MEG simulations of neurological activity as well as data from a self-paced finger tapping experiment. PMID:19457366

  20. Event-by-Event Study of Space-Time Dynamics in Flux-Tube Fragmentation

    DOE PAGES

    Wong, Cheuk-Yin

    2017-05-25

In the semi-classical description of the flux-tube fragmentation process for hadron production and hadronization in high-energy $e^+e^-$ annihilations and $pp$ collisions, the rapidity-space-time ordering and the local conservation laws of charge, flavor, and momentum provide a set of powerful tools that may allow the reconstruction of the space-time dynamics of quarks and mesons in exclusive measurements of produced hadrons, on an event-by-event basis. We propose procedures to reconstruct the space-time dynamics from event-by-event exclusive hadron data to exhibit explicitly the ordered chain of hadrons produced in a flux tube fragmentation. As a supplementary tool, we infer the average space-time coordinates of the $q$-$\bar q$ pair production vertices from the $\pi^-$ rapidity distribution data obtained by the NA61/SHINE Collaboration in $pp$ collisions at $\sqrt{s}$ = 6.3 to 17.3 GeV.

  1. Interpretability of Multivariate Brain Maps in Linear Brain Decoding: Definition, and Heuristic Quantification in Multivariate Analysis of MEG Time-Locked Effects.

    PubMed

    Kia, Seyed Mostafa; Vega Pons, Sandro; Weisz, Nathan; Passerini, Andrea

    2016-01-01

Brain decoding is a popular multivariate approach for hypothesis testing in neuroimaging. Linear classifiers are widely employed in the brain decoding paradigm to discriminate among experimental conditions. Then, the derived linear weights are visualized in the form of multivariate brain maps to further study spatio-temporal patterns of underlying neural activities. It is well known that the brain maps derived from weights of linear classifiers are hard to interpret because of high correlations between predictors, low signal to noise ratios, and the high dimensionality of neuroimaging data. Therefore, improving the interpretability of brain decoding approaches is of primary interest in many neuroimaging studies. Despite extensive studies of this type, at present, there is no formal definition for interpretability of multivariate brain maps. As a consequence, there is no quantitative measure for evaluating the interpretability of different brain decoding methods. In this paper, first, we present a theoretical definition of interpretability in brain decoding; we show that the interpretability of multivariate brain maps can be decomposed into their reproducibility and representativeness. Second, as an application of the proposed definition, we exemplify a heuristic for approximating the interpretability in multivariate analysis of evoked magnetoencephalography (MEG) responses. Third, we propose to combine the approximated interpretability and the generalization performance of the brain decoding into a new multi-objective criterion for model selection. Our results, for the simulated and real MEG data, show that optimizing the hyper-parameters of the regularized linear classifier based on the proposed criterion results in more informative multivariate brain maps. More importantly, the presented definition provides the theoretical background for quantitative evaluation of interpretability, and hence, facilitates the development of more effective brain decoding algorithms.

  2. Interpretability of Multivariate Brain Maps in Linear Brain Decoding: Definition, and Heuristic Quantification in Multivariate Analysis of MEG Time-Locked Effects

    PubMed Central

    Kia, Seyed Mostafa; Vega Pons, Sandro; Weisz, Nathan; Passerini, Andrea

    2017-01-01

Brain decoding is a popular multivariate approach for hypothesis testing in neuroimaging. Linear classifiers are widely employed in the brain decoding paradigm to discriminate among experimental conditions. Then, the derived linear weights are visualized in the form of multivariate brain maps to further study spatio-temporal patterns of underlying neural activities. It is well known that the brain maps derived from weights of linear classifiers are hard to interpret because of high correlations between predictors, low signal to noise ratios, and the high dimensionality of neuroimaging data. Therefore, improving the interpretability of brain decoding approaches is of primary interest in many neuroimaging studies. Despite extensive studies of this type, at present, there is no formal definition for interpretability of multivariate brain maps. As a consequence, there is no quantitative measure for evaluating the interpretability of different brain decoding methods. In this paper, first, we present a theoretical definition of interpretability in brain decoding; we show that the interpretability of multivariate brain maps can be decomposed into their reproducibility and representativeness. Second, as an application of the proposed definition, we exemplify a heuristic for approximating the interpretability in multivariate analysis of evoked magnetoencephalography (MEG) responses. Third, we propose to combine the approximated interpretability and the generalization performance of the brain decoding into a new multi-objective criterion for model selection. Our results, for the simulated and real MEG data, show that optimizing the hyper-parameters of the regularized linear classifier based on the proposed criterion results in more informative multivariate brain maps. More importantly, the presented definition provides the theoretical background for quantitative evaluation of interpretability, and hence, facilitates the development of more effective brain decoding algorithms.

  3. Hearing flashes and seeing beeps: Timing audiovisual events

    PubMed Central

    2017-01-01

    Many events from daily life are audiovisual (AV). Handclaps produce both visual and acoustic signals that are transmitted in air and processed by our sensory systems at different speeds, reaching the brain multisensory integration areas at different moments. Signals must somehow be associated in time to correctly perceive synchrony. This project aims at quantifying the mutual temporal attraction between senses and characterizing the different interaction modes depending on the offset. In every trial participants saw four beep-flash pairs regularly spaced in time, followed after a variable delay by a fifth event in the test modality (auditory or visual). A large range of AV offsets was tested. The task was to judge whether the last event came before/after what was expected given the perceived rhythm, while attending only to the test modality. Flashes were perceptually shifted in time toward beeps, the attraction being stronger for lagging than leading beeps. Conversely, beeps were not shifted toward flashes, indicating a nearly total auditory capture. The subjective timing of the visual component resulting from the AV interaction could easily be forward but not backward in time, an intuitive constraint stemming from minimum visual processing delays. Finally, matching auditory and visual time-sensitivity with beeps embedded in pink noise produced very similar mutual attractions of beeps and flashes. Breaking the natural auditory preference for timing allowed vision to take over as well, showing that this preference is not hardwired. PMID:28207786

  4. Making Story Time a Literacy Event for the Young Child.

    ERIC Educational Resources Information Center

    Weir, Beth

    1989-01-01

    Reviews research and anecdotal accounts which present instructional techniques and which suggest that the quality of instruction, quality of time, and quality of books are significant factors in ensuring that story reading is a true literacy event. Argues that consistent story readings facilitate the acquisition of the reading process. (RS)

  5. Automatic seismic event tracking using a dynamic time warping algorithm

    NASA Astrophysics Data System (ADS)

    Jin, Song; Chen, ShuangQuan; Wei, Jianxin; Li, Xiang-Yang

    2017-10-01

    For seismic data interpretation, horizon picking based on seismic events in stacked or migrated seismic sections is essential for obtaining information on subsurface structures. This conventional work is time-consuming via manual implementation. In this paper, we develop an automatic seismic event tracking method of horizon interpretation using the dynamic time warping (DTW) algorithm. The proposed method consists of two steps: calculating local time shifts between adjacent traces through a pilot trace and then event tracking. In the method, the DTW algorithm is applied to calculate time shifts between two adjacent traces, and an improved multitrace DTW strategy is proposed to improve the robustness. One synthetic seismic trace is used to demonstrate the DTW algorithm, and a synthetic seismic section is used to verify the feasibility of the proposed method handling contaminated seismic data with noise. Finally, we apply the method to a 3D physical model dataset. The result indicates that the proposed method is quantitatively feasible for seismic event automatic tracking and is reasonably stable for noisy seismic section flattening, which also has the potential to extract seismic horizon slices effectively.
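The pairwise trace alignment at the heart of the method can be illustrated with the textbook DTW dynamic program. This is a generic sketch, not the paper's improved multitrace strategy, and all names are illustrative:

```python
import numpy as np

def dtw_align(ref, trace):
    """Textbook dynamic time warping between a reference trace and an
    adjacent trace; returns the optimal alignment path as (i, j) index
    pairs, from which local time shifts j - i can be read off."""
    n, m = len(ref), len(trace)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = (ref[i - 1] - trace[j - 1]) ** 2
            cost[i, j] = d + min(cost[i - 1, j - 1],
                                 cost[i - 1, j],
                                 cost[i, j - 1])
    # backtrack from the end to recover the warping path
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = int(np.argmin([cost[i - 1, j - 1],
                              cost[i - 1, j],
                              cost[i, j - 1]]))
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

# a reference trace and a phase-shifted copy of it
ref = np.sin(np.linspace(0.0, 6.0, 50))
trace = np.sin(np.linspace(0.0, 6.0, 50) + 0.4)
path = dtw_align(ref, trace)
```

Accumulating the per-pair shifts across a section, seeded from a pilot trace as the abstract describes, is what turns these local alignments into a tracked horizon.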

  6. Time Evolution of Elemental Ratios in Solar Energetic Particle Events

    NASA Astrophysics Data System (ADS)

    Zelina, P.; Dalla, S.; Cohen, C. M. S.; Mewaldt, R. A.

    2017-01-01

Heavy ion ratio abundances in solar energetic particle (SEP) events, e.g., Fe/O, often exhibit decreases over time. Using particle instruments on the Advanced Composition Explorer, Solar and Heliospheric Observatory and Solar Terrestrial Relations Observatory spacecraft, we analyzed heavy ion data from 4 SEP events taking place between 2006 December and 2014 December. We constructed 36 different ionic pairs and studied their time evolution in each event. We quantified the temporal behavior of abundant SEP ratios by fitting the data to derive a decay time constant B. We also considered the ratio of ionic mass-to-charge for each pair, the S value given, e.g., for Fe/O by S_Fe/O = (M/Q)_Fe / (M/Q)_O. We found that the temporal behavior of SEP ratios is ordered by the value of S: ratios with S > 1 showed decreases over time (i.e., B < 0) and those with S < 1 showed increases (B > 0). We plotted B as a function of S and observed a clear monotonic dependence: ratios with a large S decayed at a higher rate. A prominent discontinuity at S = 2.0 (corresponding to He/H) was found in three of the four events, suggesting anomalous behavior of protons. The X/H ratios often show an initial increase followed by a decrease, and decay at a slower rate. We discuss possible causes of the observed B versus S trends within current understanding of SEP propagation.
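Deriving a decay time constant B as described above amounts to fitting R(t) = R0 * exp(B t) to a ratio time series, which becomes a straight-line fit in log space. A minimal sketch on synthetic data (the authors' exact fitting procedure may differ):

```python
import numpy as np

def decay_constant(t, ratio):
    """Estimate B in R(t) = R0 * exp(B * t) by a least-squares
    straight-line fit to log R(t); B < 0 means the ratio decays."""
    B, _log_r0 = np.polyfit(t, np.log(ratio), 1)
    return B

# synthetic Fe/O-like ratio decaying with a true B of -0.3 per time unit
t = np.linspace(0.0, 10.0, 20)
ratio = 0.5 * np.exp(-0.3 * t)
B = decay_constant(t, ratio)
```

Repeating this fit for each of the 36 ionic pairs and plotting B against each pair's mass-to-charge ratio S is what exposes the monotonic B versus S trend reported in the abstract.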

  7. A hybrid clustering approach for multivariate time series - A case study applied to failure analysis in a gas turbine.

    PubMed

    Fontes, Cristiano Hora; Budman, Hector

    2017-09-16

    A clustering problem involving multivariate time series (MTS) requires the selection of similarity metrics. This paper shows the limitations of the PCA similarity factor (SPCA) as a single metric in nonlinear problems where there are differences in magnitude of the same process variables due to expected changes in operation conditions. A novel method for clustering MTS based on a combination between SPCA and the average-based Euclidean distance (AED) within a fuzzy clustering approach is proposed. Case studies involving either simulated or real industrial data collected from a large scale gas turbine are used to illustrate that the hybrid approach enhances the ability to recognize normal and fault operating patterns. This paper also proposes an oversampling procedure to create synthetic multivariate time series that can be useful in commonly occurring situations involving unbalanced data sets. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
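The PCA similarity factor SPCA between two multivariate time-series windows compares the angles between their principal-component subspaces. A sketch of the generic (Krzanowski) form the abstract builds on; names and the random demo data are illustrative:

```python
import numpy as np

def pca_similarity(X1, X2, k=2):
    """PCA similarity factor S_PCA between two MTS windows
    (rows = time samples, columns = variables): the mean squared
    cosine of the angles between the top-k principal subspaces.
    Returns a value in [0, 1]; 1 means identical subspaces."""
    def loadings(X):
        Xc = X - X.mean(axis=0)                # center each variable
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Vt[:k].T                        # variables x k, orthonormal
    M = loadings(X1).T @ loadings(X2)
    return float(np.trace(M @ M.T)) / k

# identical windows have similarity exactly 1
rng = np.random.default_rng(0)
window = rng.normal(size=(100, 4))
s_self = pca_similarity(window, window)
```

Because SPCA compares directions but not magnitudes, the paper pairs it with the average-based Euclidean distance inside a fuzzy clustering objective; this sketch shows only the SPCA half.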

  8. Established time series measure occurrence and frequency of episodic events.

    NASA Astrophysics Data System (ADS)

    Pebody, Corinne; Lampitt, Richard

    2015-04-01

Episodic flux events occur in open oceans. Time series making measurements over significant time scales are one of the few methods that can capture these events and compare their impact with 'normal' flux. Seemingly rare events may be significant on local scales, but without the ability to measure the extent of flux on spatial and temporal scales, combined with the frequency of occurrence, it is difficult to constrain their impact. The Porcupine Abyssal Plain Sustained Observatory (PAP-SO) in the Northeast Atlantic (49 °N 16 °W, 5000 m water depth) has measured particle flux since 1989 and zooplankton swimmers since 2000. Sediment traps at 3000 m and 100 metres above bottom collect material year round, and we have identified close links between zooplankton and particle flux. Some of these larger animals, for example Diacria trispinosa, make a significant contribution to carbon flux through episodic flux events. D. trispinosa is a euthecosome mollusc which occurs in the Northeast Atlantic, though the PAP-SO is towards the northern limit of its distribution. Pteropods comprise an aragonite shell containing the soft body parts, excepting the muscular foot, which extends beyond the mouth of the living animal. Both live-on-entry pteropods and empty shells are found year round in the 3000 m trap. Generally the abundance varies with particle flux, but within that general pattern there are episodic events where significant numbers of these animals, containing both organic and inorganic carbon, are captured at depth and therefore could be defined as contributing to export flux. Whether the pulse of animals is a result of the life cycle of D. trispinosa or the effects of the physics of the water column is unclear, but the complexity of the PAP-SO enables us not only to collect these animals but to examine them in parallel with the biogeochemical and physical elements measured by the

  9. Estimating marginal effects in accelerated failure time models for serial sojourn times among repeated events.

    PubMed

    Chang, Shu-Hui

    2004-06-01

Recurrent event data are commonly encountered in longitudinal studies when events occur repeatedly over time for each study subject. An accelerated failure time (AFT) model on the sojourn time between recurrent events is considered in this article. This model assumes that the covariate effect and the subject-specific frailty are additive on the logarithm of sojourn time, and that the covariate effect remains the same over distinct episodes, while the distributions of the frailty and the random error in the model are unspecified. With the ordinal nature of recurrent events, two scale transformations of the sojourn times are derived to construct semiparametric methods of log-rank type for estimating the marginal covariate effects in the model. The proposed estimation approaches/inference procedures can also be extended to bivariate events that alternate over time. Examples and comparisons are presented to illustrate the performance of the proposed methods.

  10. Initial Time Dependence of Abundances in Solar Energetic Particle Events

    NASA Technical Reports Server (NTRS)

    Reames, Donald V.; Ng, C. K.; Tylka, A. J.

    1999-01-01

    We compare the initial behavior of Fe/O and He/H abundance ratios and their relationship to the evolution of the proton energy spectra in "small" and "large" gradual solar energetic particle (SEP) events. The results are qualitatively consistent with the behavior predicted by the theory of Ng et al. (1999a, b). He/H ratios that initially rise with time are a signature of scattering by non-Kolmogorov Alfven wave spectra generated by intense beams of shock-accelerated protons streaming outward in large gradual SEP events.

  11. iVAR: a program for imputing missing data in multivariate time series using vector autoregressive models.

    PubMed

    Liu, Siwei; Molenaar, Peter C M

    2014-12-01

    This article introduces iVAR, an R program for imputing missing data in multivariate time series on the basis of vector autoregressive (VAR) models. We conducted a simulation study to compare iVAR with three methods for handling missing data: listwise deletion, imputation with sample means and variances, and multiple imputation ignoring time dependency. The results showed that iVAR produces better estimates for the cross-lagged coefficients than do the other three methods. We demonstrate the use of iVAR with an empirical example of time series electrodermal activity data and discuss the advantages and limitations of the program.
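
    As a rough illustration of the underlying idea (not the iVAR package itself), a VAR(1) model can be fitted by ordinary least squares and then used to impute a missing observation from its predecessor; the data and coefficients below are synthetic.

```python
# Minimal sketch: fit x_t = A x_{t-1} + noise by OLS on a bivariate series,
# then impute a "missing" observation from the previous time point.
import random

random.seed(1)

# Simulate a stable bivariate VAR(1)
A = [[0.6, 0.2],
     [0.1, 0.7]]
series = [[1.0, -1.0]]
for _ in range(200):
    prev = series[-1]
    series.append([
        A[0][0] * prev[0] + A[0][1] * prev[1] + random.gauss(0, 0.1),
        A[1][0] * prev[0] + A[1][1] * prev[1] + random.gauss(0, 0.1),
    ])

def fit_var1(data):
    """OLS fit of x_t = A x_{t-1}; one row of A per output variable."""
    X, Y = data[:-1], data[1:]
    # Normal equations: A_row_k = (X'X)^{-1} X'y_k
    sxx = [[sum(x[i] * x[j] for x in X) for j in range(2)] for i in range(2)]
    det = sxx[0][0] * sxx[1][1] - sxx[0][1] * sxx[1][0]
    inv = [[ sxx[1][1] / det, -sxx[0][1] / det],
           [-sxx[1][0] / det,  sxx[0][0] / det]]
    A_hat = []
    for k in range(2):
        sxy = [sum(x[i] * y[k] for x, y in zip(X, Y)) for i in range(2)]
        A_hat.append([inv[i][0] * sxy[0] + inv[i][1] * sxy[1]
                      for i in range(2)])
    return A_hat

A_hat = fit_var1(series)
# Pretend series[101] is missing: impute it from series[100]
prev = series[100]
imputed = [sum(A_hat[k][i] * prev[i] for i in range(2)) for k in range(2)]
print(A_hat, imputed)
```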

  12. The Time of Our Lives: Life Span Development of Timing and Event Tracking

    ERIC Educational Resources Information Center

    McAuley, J. Devin; Jones, Mari Riess; Holub, Shayla; Johnston, Heather M.; Miller, Nathaniel S.

    2006-01-01

    Life span developmental profiles were constructed for 305 participants (ages 4-95) for a battery of paced and unpaced perceptual-motor timing tasks that included synchronize-continue tapping at a wide range of target event rates. Two life span hypotheses, derived from an entrainment theory of timing and event tracking, were tested. A preferred…

  14. [Multivariate Adaptive Regression Splines (MARS), an alternative for the analysis of time series].

    PubMed

    Vanegas, Jairo; Vásquez, Fabián

    Multivariate Adaptive Regression Splines (MARS) is a non-parametric modelling method that extends the linear model by incorporating nonlinearities and interactions between variables. It is a flexible tool that automates the construction of predictive models: selecting relevant variables, transforming the predictor variables, processing missing values and preventing overfitting by self-testing. It is also able to predict, taking into account structural factors that might influence the outcome variable, thereby generating hypothetical models. The end result can identify relevant cut-off points in data series. MARS is rarely used in health, so it is proposed here as a tool for the evaluation of relevant public health indicators. For demonstrative purposes, data series on the mortality of children under 5 years of age in Costa Rica were used, covering the period 1978-2008. Copyright © 2016 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  15. Events per person-time (incidence rate): a misleading statistic?

    PubMed

    Kraemer, Helena Chmura

    2009-03-15

    Results in risk studies based on events per person-time (EPT, technically 'incidence rate') often prove non-confirmable. The circumstances in which the EPT-ratio is unquestionably both valid and optimal to compare a high- and low-risk group, a constant hazards situation, are discussed. However, the constant hazards situation seldom applies in medical research. When the constant hazards situation does not apply, even under optimal circumstances, with fixed entry time and follow-up time for all those not experiencing the event and absence of censoring, EPT-ratio yields at best ambiguous, at worst misleading, results. More careful design and survival analyses are recommended in place of use of EPTs.
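
    For concreteness, the statistic in question is simply events divided by accumulated person-time, and the EPT ratio compares two groups; the numbers below are hypothetical.

```python
# Illustrative calculation with made-up data: events per person-time and
# the EPT ratio for a high- and a low-risk group.
def incidence_rate(events, person_time):
    return events / person_time

high = incidence_rate(events=30, person_time=150.0)  # 0.2 events/person-year
low = incidence_rate(events=10, person_time=200.0)   # 0.05 events/person-year
ept_ratio = high / low
# The ratio is a valid group comparison only under constant hazards,
# which, as the paper argues, seldom holds in medical research.
print(ept_ratio)
```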

  16. Exponentiated Weibull regression for time-to-event data.

    PubMed

    Khan, Shahedul A

    2017-03-27

    The Weibull, log-logistic and log-normal distributions are extensively used to model time-to-event data. The Weibull family accommodates only monotone hazard rates, whereas the log-logistic and log-normal are widely used to model unimodal hazard functions. The increasing availability of lifetime data with a wide range of characteristics motivates us to develop more flexible models that accommodate both monotone and nonmonotone hazard functions. One such model is the exponentiated Weibull distribution, which not only accommodates monotone hazard functions but also allows for unimodal and bathtub-shaped hazard rates. This distribution has demonstrated considerable potential in univariate analysis of time-to-event data. However, the primary focus of many studies is rather on understanding the relationship between the time to the occurrence of an event and one or more covariates. This leads to a consideration of regression models that can be formulated in different ways in survival analysis. One such strategy involves formulating models for the accelerated failure time family of distributions. The most commonly used distributions serving this purpose are the Weibull, log-logistic and log-normal distributions. In this study, we show that the exponentiated Weibull distribution is closed under the accelerated failure time family. We then formulate a regression model based on the exponentiated Weibull distribution, and develop large sample theory for statistical inference. We also describe a Bayesian approach for inference. Two comparative studies based on real and simulated data sets reveal that the exponentiated Weibull regression can be valuable in adequately describing different types of time-to-event data.
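
    A minimal sketch of the distribution's hazard, h(t) = f(t)/(1 − F(t)) with F(t) = (1 − exp(−(t/s)^k))^g, may help: setting g = 1 recovers the ordinary Weibull hazard, which the code checks numerically. Parameter names are illustrative, not the paper's notation.

```python
# Hazard of the exponentiated Weibull: h(t) = f(t) / (1 - F(t)),
# with CDF F(t) = (1 - exp(-(t/s)^k))^g.
import math

def ew_hazard(t, k, s, g):
    u = math.exp(-(t / s) ** k)
    F = (1.0 - u) ** g
    f = g * (k / s) * (t / s) ** (k - 1) * u * (1.0 - u) ** (g - 1)
    return f / (1.0 - F)

# With g = 1 this reduces to the Weibull hazard (k/s) * (t/s)^(k-1)
t, k, s = 2.0, 1.5, 3.0
weibull_h = (k / s) * (t / s) ** (k - 1)
print(ew_hazard(t, k, s, 1.0), weibull_h)  # the two values agree
```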

  17. Discrete mixture modeling to address genetic heterogeneity in time-to-event regression

    PubMed Central

    Eng, Kevin H.; Hanlon, Bret M.

    2014-01-01

    Motivation: Time-to-event regression models are a critical tool for associating survival time outcomes with molecular data. Despite mounting evidence that genetic subgroups of the same clinical disease exist, little attention has been given to exploring how this heterogeneity affects time-to-event model building and how to accommodate it. Methods able to diagnose and model heterogeneity should be valuable additions to the biomarker discovery toolset. Results: We propose a mixture of survival functions that classifies subjects with similar relationships to a time-to-event response. This model incorporates multivariate regression and model selection and can be fit with an expectation-maximization algorithm, which we call Cox-assisted clustering (CAC). We illustrate a likely manifestation of genetic heterogeneity and demonstrate how it may affect survival models with little warning. An application to gene expression in ovarian cancer DNA repair pathways illustrates how the model may be used to learn new genetic subsets for risk stratification. We explore the implications of this model for censored observations and the effect on genomic predictors and diagnostic analysis. Availability and implementation: R implementation of CAC using standard packages is available at https://gist.github.com/programeng/8620b85146b14b6edf8f Data used in the analysis are publicly available. Contact: kevin.eng@roswellpark.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24532723

  18. Time-frequency analysis of neuronal populations with instantaneous resolution based on noise-assisted multivariate empirical mode decomposition.

    PubMed

    Alegre-Cortés, J; Soto-Sánchez, C; Pizá, Á G; Albarracín, A L; Farfán, F D; Felice, C J; Fernández, E

    2016-07-15

    Linear analysis has classically provided powerful tools for understanding the behavior of neural populations, but neuronal responses to real-world stimulation are nonlinear under some conditions, and many neuronal components demonstrate strong nonlinear behavior. In spite of this, the temporal and frequency dynamics of neural populations under sensory stimulation have usually been analyzed with linear approaches. In this paper, we propose the use of Noise-Assisted Multivariate Empirical Mode Decomposition (NA-MEMD), a data-driven template-free algorithm, plus the Hilbert transform as a suitable tool for analyzing population oscillatory dynamics in a multi-dimensional space with instantaneous frequency (IF) resolution. The proposed approach was able to extract oscillatory information from neurophysiological data of deep vibrissal nerve and visual cortex multiunit recordings that was not evidenced using linear approaches with fixed bases such as Fourier analysis. Texture discrimination performance increased when NA-MEMD plus the Hilbert transform was implemented, compared to linear techniques, and NA-MEMD provided increased time-frequency resolution of cortical oscillatory population activity. NA-MEMD plus the Hilbert transform is thus an improved method for analyzing neuronal population oscillatory dynamics, overcoming the linear and stationary assumptions of classical methods. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Prediction problem for target events based on the inter-event waiting time

    NASA Astrophysics Data System (ADS)

    Shapoval, A.

    2010-11-01

    In this paper we address the problem of forecasting the target events of a time series given the distribution ξ of time gaps between target events. Strong earthquakes and stock market crashes are the two types of such events that we are focusing on. In the series of earthquakes, as McCann et al. show [W.R. Mc Cann, S.P. Nishenko, L.R. Sykes, J. Krause, Seismic gaps and plate tectonics: seismic potential for major boundaries, Pure and Applied Geophysics 117 (1979) 1082-1147], there are well-defined gaps (called seismic gaps) between strong earthquakes. On the other hand, usually there are no regular gaps in the series of stock market crashes [M. Raberto, E. Scalas, F. Mainardi, Waiting-times and returns in high-frequency financial data: an empirical study, Physica A 314 (2002) 749-755]. For the case of seismic gaps, we analytically derive an upper bound of prediction efficiency given the coefficient of variation of the distribution ξ. For the case of stock market crashes, we develop an algorithm that predicts the next crash within a certain time interval after the previous one. We show that this algorithm outperforms random prediction. The efficiency of our algorithm sets up a lower bound of efficiency for effective prediction of stock market crashes.
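
    The key summary feeding the analytical bound above is the coefficient of variation of the inter-event gaps; a minimal illustration with made-up event times (not the seismic or market series used in the paper) follows.

```python
# Coefficient of variation of inter-event waiting times: a low CV means
# quasi-periodic gaps, which is what makes prediction feasible.
import math

event_times = [0.0, 10.0, 19.0, 31.0, 40.0, 52.0]  # illustrative data
gaps = [b - a for a, b in zip(event_times, event_times[1:])]
mean = sum(gaps) / len(gaps)
var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
cv = math.sqrt(var) / mean
print(cv)
```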

  20. Return time statistic of wind power ramp events

    NASA Astrophysics Data System (ADS)

    Calif, Rudy; Schmitt, François G.

    2015-04-01

    Detection and forecasting of wind power ramp events is a critical issue for the management of power generated by a wind turbine or a cluster of wind turbines. Wind power ramp events occur suddenly, with large changes (increases or decreases) of wind power output. In this work, the statistics and dynamics of wind power ramp events are examined. To this end, we analyze several datasets of wind power output with different sampling rates and durations. The data considered are delivered by five wind farms and two single turbines, located at different geographic locations. From these datasets, we extract the return time series τ_r of wind power ramp events, i.e., the times between two successive ramps above a given threshold Δp. The return time statistic is investigated by plotting the complementary cumulative distribution C(τ_r) in log-log representation. Using a robust method developed by Clauset et al., combining maximum-likelihood fitting methods with goodness-of-fit tests based on the Kolmogorov-Smirnov statistic, we show a scaling behavior of the return time statistic of the form C(τ_r) ~ k τ_r^(-α), where k is a positive constant and the exponent α is called the tail exponent of the distribution. In this study, the value of α ranges from 1.68 to 2.20. This result provides potential information for risk estimation in wind power generation based on the return time series. Clauset A, Shalizi CR, Newman MEJ. Power-law distributions in empirical data. SIAM Review 2009;51(4):661-703.
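
    A simplified sketch of the tail fit: the continuous power-law MLE is α̂ = 1 + n / Σ ln(x_i / x_min). The full Clauset et al. procedure also searches over x_min with a Kolmogorov-Smirnov criterion; here x_min is fixed and the data are synthetic.

```python
# Fit the tail exponent of synthetic power-law data with the continuous MLE.
import math
import random

random.seed(0)
alpha, x_min, n = 2.0, 1.0, 5000
# Inverse-CDF sampling from p(x) ~ x^{-alpha} for x >= x_min
data = [x_min * (1.0 - random.random()) ** (-1.0 / (alpha - 1.0))
        for _ in range(n)]

alpha_hat = 1.0 + n / sum(math.log(x / x_min) for x in data)
print(alpha_hat)  # close to the true tail exponent 2.0
```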

  1. Estimating time-varying effects for overdispersed recurrent events data with treatment switching

    PubMed Central

    CHEN, QINGXIA; ZENG, DONGLIN; IBRAHIM, JOSEPH G.; AKACHA, MOUNA; SCHMIDLI, HEINZ

    2014-01-01

    Summary In the analysis of multivariate event times, frailty models assuming time-independent regression coefficients are often considered, mainly due to their mathematical convenience. In practice, regression coefficients are often time dependent and the temporal effects are of clinical interest. Motivated by a phase III clinical trial in multiple sclerosis, we develop a semiparametric frailty modelling approach to estimate time-varying effects for overdispersed recurrent events data with treatment switching. The proposed model incorporates the treatment switching time in the time-varying coefficients. Theoretical properties of the proposed model are established and an efficient expectation-maximization algorithm is derived to obtain the maximum likelihood estimates. Simulation studies evaluate the numerical performance of the proposed model under various temporal treatment effect curves. The ideas in this paper can also be used for time-varying coefficient frailty models without treatment switching as well as for alternative models when the proportional hazard assumption is violated. A multiple sclerosis dataset is analysed to illustrate our methodology. PMID:24465031

  2. First-passage time approach to controlling noise in the timing of intracellular events

    PubMed Central

    Ghusinga, Khem Raj; Dennehy, John J.; Singh, Abhyudai

    2017-01-01

    In the noisy cellular environment, gene products are subject to inherent random fluctuations in copy numbers over time. How cells ensure precision in the timing of key intracellular events despite such stochasticity is an intriguing fundamental problem. We formulate event timing as a first-passage time problem, where an event is triggered when the level of a protein crosses a critical threshold for the first time. Analytical calculations are performed for the first-passage time distribution in stochastic models of gene expression. Derivation of these formulas motivates an interesting question: Is there an optimal feedback strategy to regulate the synthesis of a protein to ensure that an event will occur at a precise time, while minimizing deviations or noise about the mean? Counterintuitively, results show that for a stable long-lived protein, the optimal strategy is to express the protein at a constant rate without any feedback regulation, and any form of feedback (positive, negative, or any combination of them) will always amplify noise in event timing. In contrast, a positive feedback mechanism provides the highest precision in timing for an unstable protein. These theoretical results explain recent experimental observations of single-cell lysis times in bacteriophage λ. Here, lysis of an infected bacterial cell is orchestrated by the expression and accumulation of a stable λ protein up to a threshold, and precision in timing is achieved via feedforward rather than feedback control. Our results have broad implications for diverse cellular processes that rely on precise temporal triggering of events. PMID:28069947
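
    The no-feedback case described above can be sketched directly: with synthesis events arriving at a constant rate and a stable protein, the first-passage time to a threshold count is Erlang-distributed. The rate and threshold below are arbitrary illustrative choices.

```python
# Toy first-passage-time simulation: a protein is made in unit steps at
# constant rate r (no feedback, no degradation); the event fires when the
# count first reaches threshold N, so the FPT is Erlang(N, r).
import random

random.seed(2)

def first_passage_time(rate, threshold):
    t, count = 0.0, 0
    while count < threshold:
        t += random.expovariate(rate)  # waiting time to next synthesis event
        count += 1
    return t

rate, N = 5.0, 50
samples = [first_passage_time(rate, N) for _ in range(2000)]
mean_fpt = sum(samples) / len(samples)
print(mean_fpt)  # close to N / rate = 10
```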

  3. Predicting the timing of dynamic events through sound: Bouncing balls.

    PubMed

    Gygi, Brian; Giordano, Bruno L; Shafiro, Valeriy; Kharkhurin, Anatoliy; Zhang, Peter Xinya

    2015-07-01

    Dynamic information in acoustical signals produced by bouncing objects is often used by listeners to predict the objects' future behavior (e.g., hitting a ball). This study examined factors that affect the accuracy of motor responses to sounds of real-world dynamic events. In experiment 1, listeners heard 2-5 bounces from a tennis ball, ping-pong ball, basketball, or wiffle ball and would tap to indicate the time of the next bounce in the series. Across ball types and numbers of bounces, listeners were extremely accurate in predicting the correct bounce time (CT), with a mean prediction error of only 2.58% of the CT. Prediction based on a physical model of bouncing events indicated that listeners relied primarily on temporal cues when estimating the timing of the next bounce, and to a lesser extent on loudness and spectral cues. In experiment 2, the timing of each bounce pattern was altered to correspond to the bounce timing pattern of another ball, producing stimuli with contradictory acoustic cues. Nevertheless, listeners remained highly accurate in their estimates of bounce timing. This suggests that listeners can adapt their estimates of bouncing-object timing based on acoustic cues that provide the most veridical information about dynamic aspects of object behavior.

  4. Life Events and Depressive Symptoms in African American Adolescents: Do Ecological Domains and Timing of Life Events Matter?

    ERIC Educational Resources Information Center

    Sanchez, Yadira M.; Lambert, Sharon F.; Ialongo, Nicholas S.

    2012-01-01

    Considerable research has documented associations between adverse life events and internalizing symptoms in adolescents, but much of this research has focused on the number of events experienced, with less attention to the ecological context or timing of events. This study examined life events in three ecological domains relevant to adolescents…

  5. Real-time prediction of the occurrence of GLE events

    NASA Astrophysics Data System (ADS)

    Núñez, Marlon; Reyes-Santiago, Pedro J.; Malandraki, Olga E.

    2017-07-01

    A tool for predicting the occurrence of Ground Level Enhancement (GLE) events using the UMASEP scheme is presented. This real-time tool, called HESPERIA UMASEP-500, is based on the detection of the magnetic connection, along which protons arrive in the near-Earth environment, by estimating the lag correlation between the time derivatives of 1 min soft X-ray flux (SXR) and 1 min near-Earth proton fluxes observed by the GOES satellites. Unlike current GLE warning systems, this tool can predict GLE events before the detection by any neutron monitor (NM) station. The prediction performance measured for the period from 1986 to 2016 is presented for two consecutive periods, because of their notable difference in performance. For the 2000-2016 period, this prediction tool obtained a probability of detection (POD) of 53.8% (7 of 13 GLE events), a false alarm ratio (FAR) of 30.0%, and average warning times (AWT) of 8 min with respect to the first NM station's alert and 15 min to the GLE Alert Plus's warning. We have tested the model by replacing the GOES proton data with SOHO/EPHIN proton data, and the results are similar in terms of POD, FAR, and AWT for the same period. The paper also presents a comparison with a GLE warning system.
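
    The magnetic-connection test rests on a lag correlation between the time derivatives of two flux channels. The sketch below estimates such a lag on synthetic stand-ins for the SXR and proton series; the function name and data are illustrative assumptions, not the HESPERIA code.

```python
# Find the lag that maximises the Pearson correlation between the time
# derivatives of two series (here, one series is a 5-step delayed copy).
def best_lag(x, y, max_lag):
    dx = [b - a for a, b in zip(x, x[1:])]
    dy = [b - a for a, b in zip(y, y[1:])]

    def corr(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((u - ma) * (v - mb) for u, v in zip(a, b))
        da = sum((u - ma) ** 2 for u in a) ** 0.5
        db = sum((v - mb) ** 2 for v in b) ** 0.5
        return num / (da * db)

    scores = {L: corr(dx[:len(dx) - L], dy[L:]) for L in range(max_lag + 1)}
    return max(scores, key=scores.get)

x = [0, 0, 1, 4, 9, 16, 20, 22, 23, 23, 23, 23, 23, 23, 23]
y = [0] * 5 + x[:-5]  # "proton" channel responds 5 steps after the "SXR" one
lag = best_lag([float(v) for v in x], [float(v) for v in y], max_lag=8)
print(lag)  # recovers the 5-step delay
```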

  6. Time-quefrency analysis of overlapping similar microseismic events

    NASA Astrophysics Data System (ADS)

    Nagano, Koji

    2016-05-01

    In this paper, I describe a new technique to determine the interval between P-waves in similar, overlapping microseismic events. The similar microseismic events that occur with overlapping waveforms are called `proximate microseismic doublets' herein. Proximate microseismic doublets had been discarded in previous studies because we had not noticed their usefulness. Analysis of similar events can show relative locations of sources between them. Analysis of proximate microseismic doublets can provide more precise relative source locations because variation in the velocity structure has little influence on their relative travel times. It is necessary to measure the interval between the P-waves in the proximate microseismic doublets to determine their relative source locations. A `proximate microseismic doublet' is a pair of microseismic events in which the second event arrives before the attenuation of the first event. Cepstrum analysis can provide the interval even though the second event overlaps the first event. However, a cepstrum of a proximate microseismic doublet generally has two peaks, one representing the interval between the arrivals of the two P-waves, and the other representing the interval between the arrivals of the two S-waves. It is therefore difficult to determine the peak that represents the P-wave interval from the cepstrum alone. I used window functions in cepstrum analysis to isolate the first and second P-waves and to suppress the second S-wave. I change the length of the window function and calculate the cepstrum for each window length. The result is represented in a three-dimensional contour plot of length-quefrency-cepstrum data. The contour plot allows me to identify the cepstrum peak that represents the P-wave interval. The precise quefrency can be determined from a two-dimensional quefrency-cepstrum graph, provided that the length of the window is appropriately chosen. I have used both synthetic and field data to demonstrate that this
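
    The cepstrum step can be illustrated on an idealized "doublet": an impulse plus a delayed, scaled copy. This is a sketch of the general technique, not the paper's windowed procedure, and a naive DFT keeps it dependency-free.

```python
# Real cepstrum of an impulse plus an echo: the echo at lag d produces a
# peak at quefrency d, recovering the interval between the two arrivals.
import cmath
import math

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

n, d, a = 128, 20, 0.5
signal = [0.0] * n
signal[0] = 1.0  # first arrival
signal[d] = a    # second, overlapping arrival (the "doublet")

spectrum = dft(signal)
log_mag = [math.log(abs(c)) for c in spectrum]
# Real cepstrum = inverse DFT of the log-magnitude spectrum; log_mag is
# real and symmetric here, so the forward DFT divided by n equals it.
cepstrum = [c.real / n for c in dft(log_mag)]
peak = max(range(1, n // 2), key=lambda q: cepstrum[q])
print(peak)  # the quefrency of the peak equals the arrival interval d = 20
```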

  7. Real-Time Multimission Event Notification System for Mars Relay

    NASA Technical Reports Server (NTRS)

    Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Wang, Paul; Hy, Franklin H.

    2013-01-01

    As the Mars Relay Network is in constant flux (missions and teams going through their daily workflow), it is imperative that users are aware of such state changes. For example, a change by an orbiter team can affect operations on a lander team. This software provides an ambient view of the real-time status of the Mars network. The Mars Relay Operations Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay Network. As part of MaROS, a feature set was developed that operates on several levels of the software architecture. These levels include a Web-based user interface, a back-end "ReSTlet" built in Java, and databases that store the data as it is received from the network. The result is a real-time event notification and management system, so mission teams can track and act upon events on a moment-by-moment basis. This software retrieves events from MaROS and displays them to the end user. Updates happen in real time, i.e., messages are pushed to the user while logged into the system, and queued when the user is not online for later viewing. The software does not do away with the email notifications, but augments them with in-line notifications. Further, this software expands the events that can generate a notification, and allows user-generated notifications. Existing software sends a smaller subset of mission-generated notifications via email. A common complaint of users was that the system-generated e-mails often "get lost" with other e-mail that comes in. This software allows for an expanded set (including user-generated) of notifications displayed in-line of the program. By separating notifications, this can improve a user's workflow.

  8. Harmonic spectral components in time sequences of Markov correlated events

    NASA Astrophysics Data System (ADS)

    Mazzetti, Piero; Carbone, Anna

    2017-07-01

    The paper concerns the analysis of the conditions allowing time sequences of Markov correlated events to give rise to a line power spectrum of relevant physical interest. It is found that by specializing the Markov matrix to represent closed-loop sequences of events with arbitrary distribution, generated in a steady physical condition, a large set of line spectra, covering all possible frequency values, is obtained. The amplitude of the spectral lines is given by a matrix equation based on a generalized Markov matrix involving the Fourier transform of the distribution functions representing the time intervals between successive events of the sequence. The paper is a complement to a previous work where a general expression for the continuous power spectrum was given. In that case the Markov matrix was left in a more general form, thus preventing the possibility of finding line spectra of physical interest. The present extension is also suggested by the interest in explaining the emergence of a broad set of waves found in electro- and magneto-encephalograms, whose frequencies range from 0.5 to about 40 Hz, in terms of the effects produced by chains of firing neurons within the complex neural network of the brain. An original model based on synchronized closed-loop sequences of firing neurons is proposed, and a few numerical simulations are reported as an application of the above-cited equation.

  9. Model predictive control of P-time event graphs

    NASA Astrophysics Data System (ADS)

    Hamri, H.; Kara, R.; Amari, S.

    2016-12-01

    This paper deals with model predictive control of discrete event systems modelled by P-time event graphs. First, the model is obtained by using the dater evolution model written in the standard algebra. Then, for the control law, we use finite-horizon model predictive control. For the closed-loop control, we use infinite-horizon model predictive control (IH-MPC), an approach that calculates static feedback gains ensuring the stability of the closed-loop system while respecting the constraints on the control vector. The IH-MPC problem is formulated as a convex linear program subject to a linear matrix inequality. Finally, the proposed methodology is applied to a transportation system.

  10. Interpreting the estimated timing of migration events between hybridizing species.

    PubMed

    Strasburg, Jared L; Rieseberg, Loren H

    2011-06-01

    The question of whether speciation can occur in the presence of gene flow has long been a contentious one. However, measuring the amount and timing of gene flow remains challenging. The computer program IMa2 allows researchers to estimate the timing of migration events for each locus during analyses, and these estimates have been used to infer the timing of introgression and mode of speciation. We use simulated data sets to examine the degree to which gene-flow timing estimates can be used for these purposes, and what demographic conditions and data sets may be most amenable to gene-flow timing estimation. We find that the 90% highest posterior density (HPD) interval of gene-flow timing is almost always substantially wider than the actual window of gene flow, and increasing the information content of the data set in terms of number of loci, number of sequences sampled or locus length (and thus number of variable sites) has little impact on the posterior distribution over the range of values we tested. Even when simulated gene flow only occurred over the most recent 0.01% of the species' history, the HPD interval usually encompasses the inferred divergence time. Our results indicate that gene-flow timing estimates made using the method currently implemented in IMa2 cannot reliably be used to make inferences about the timing of introgression between diverged species or to distinguish between speciation with gene flow and allopatric speciation followed by one or more episodes of gene flow.

  11. A rank test for bivariate time-to-event outcomes when one event is a surrogate.

    PubMed

    Shaw, Pamela A; Fay, Michael P

    2016-08-30

    In many clinical settings, improving patient survival is of interest but a practical surrogate, such as time to disease progression, is instead used as a clinical trial's primary endpoint. A time-to-first endpoint (e.g., death or disease progression) is commonly analyzed but may not be adequate to summarize patient outcomes if a subsequent event contains important additional information. We consider a surrogate outcome very generally as one correlated with the true endpoint of interest. Settings of interest include those where the surrogate indicates a beneficial outcome so that the usual time-to-first endpoint of death or surrogate event is nonsensical. We present a new two-sample test for bivariate, interval-censored time-to-event data, where one endpoint is a surrogate for the second, less frequently observed endpoint of true interest. This test examines whether patient groups have equal clinical severity. If the true endpoint rarely occurs, the proposed test acts like a weighted logrank test on the surrogate; if it occurs for most individuals, then our test acts like a weighted logrank test on the true endpoint. If the surrogate is a useful statistical surrogate, our test can have better power than tests based on the surrogate that naively handle the true endpoint. In settings where the surrogate is not valid (treatment affects the surrogate but not the true endpoint), our test incorporates the information regarding the lack of treatment effect from the observed true endpoints and hence is expected to have a dampened treatment effect compared with tests based on the surrogate alone. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.

  12. Detecting and characterising ramp events in wind power time series

    NASA Astrophysics Data System (ADS)

    Gallego, Cristóbal; Cuerva, Álvaro; Costa, Alexandre

    2014-12-01

    In order to implement accurate models for wind power ramp forecasting, ramps need to be previously characterised. This issue has typically been addressed by performing binary ramp/non-ramp classifications based on ad-hoc assessed thresholds. However, recent works question this approach. This paper presents the ramp function, an innovative wavelet-based tool which detects and characterises ramp events in wind power time series. The underlying idea is to assess a continuous index related to the ramp intensity at each time step, obtained by considering large power output gradients evaluated under different time scales (up to typical ramp durations). The ramp function overcomes some of the drawbacks of the aforementioned binary classification and permits forecasters to easily reveal specific features of the ramp behaviour observed at a wind farm. As an example, the daily profiles of the ramp-up and ramp-down intensities are obtained for the case of a wind farm located in Spain.
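
    A hedged sketch of the idea (not the paper's wavelet filter): a continuous ramp index can be taken as the largest absolute power gradient over a set of time scales. The series and scales below are toy values.

```python
# Multi-scale ramp index: at each time step, the largest absolute power
# change per unit time over a set of scales up to a typical ramp duration.
def ramp_index(power, scales):
    n = len(power)
    out = []
    for t in range(n):
        vals = [abs(power[min(t + s, n - 1)] - power[t]) / s for s in scales]
        out.append(max(vals))
    return out

# Toy wind-power series: flat, then a sharp ramp-up between t=5 and t=7
power = [0.1] * 5 + [0.3, 0.6, 0.9] + [0.9] * 5
idx = ramp_index(power, scales=[1, 2, 3])
print(idx.index(max(idx)))  # the index peaks inside the ramp
```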

  13. Space-Time Characteristic Functions in Multivariate Logic and Possible Interpretation of Entanglement

    NASA Astrophysics Data System (ADS)

    Gaudeau de Gerlicz, Claude; Sechpine, Pierre; Bobola, Philippe; Antoine, Mathias

    In theories of hidden variables in physics (the Bohr-Schrödinger line of thought) and their developments, boundaries seem more and more fuzzy at physical scales. Some newer theories attribute a comparable fuzziness to both time and space. The classical Copenhagen interpretation, together with Heisenberg and Louis de Broglie, gives us the idea of wave-particle duality as we observe it. The Pondichery interpretation recently developed by Cramer et al. extends this duality to time. According to Cramer, there could be more to this duality: retarded and advanced waves in time, which are admitted as possible solutions of Maxwell's equations. We develop here a possible pattern matching, in sequence, space with both the retarded and advanced time waves of the "Cramer handshake", in which everything becomes local in the present at the moment the observation is made.

  14. Empirical reconstruction of storm-time steady magnetospheric convection events

    NASA Astrophysics Data System (ADS)

    Stephens, G. K.; Sitnov, M. I.; Kissinger, J.; Tsyganenko, N. A.; McPherron, R. L.; Korth, H.; Anderson, B. J.

    2013-12-01

    We investigate the storm-scale morphology of the magnetospheric magnetic field as well as the underlying distributions of electric currents, equatorial plasma pressure and entropy for four Steady Magnetospheric Convection (SMC) events that occurred during the May 2000 and October 2011 magnetic storms. The analysis is made using the empirical geomagnetic field model TS07D, in which the structure of the equatorial currents is not predefined but is instead dictated by the data. The model also combines the strengths of statistical and event-oriented approaches in mining data for the reconstruction of the magnetic field. The formation of a near-Earth minimum of the equatorial magnetic field in the midnight sector is inferred from data without ad hoc assumptions of a special current system postulated in earlier empirical reconstructions. In addition, a new SMC class is discovered in which the minimum equatorial field is substantially larger and located closer to Earth. The magnetic field tailward of the minimum is also much larger, and the corresponding region of accumulated magnetic flux may occupy a very short tail region. The equatorial current and plasma pressure are found to be strongly enhanced far beyond geosynchronous orbit and in a broad local time interval covering the whole nightside region. This picture is consistent with independent recent statistical studies of SMC pressure distributions, global MHD, and kinetic RCM-E simulations. Distributions of the flux tube volume and entropy inferred from data reveal different mechanisms of resolution of the magnetotail convection crisis for the two classes of SMC events.

  15. An introduction to real-time graphical techniques for analyzing multivariate data

    NASA Astrophysics Data System (ADS)

    Friedman, Jerome H.; McDonald, John Alan; Stuetzle, Werner

    1987-08-01

    Orion I is a graphics system used to study applications of computer graphics - especially interactive motion graphics - in statistics. Orion I is the newest of a family of "Prim" systems, whose most striking common feature is the use of real-time motion graphics to display three dimensional scatterplots. Orion I differs from earlier Prim systems through the use of modern and relatively inexpensive raster graphics and microprocessor technology. It also delivers more computing power to its user; Orion I can perform more sophisticated real-time computations than were possible on previous such systems. We demonstrate some of Orion I's capabilities in our film: "Exploring data with Orion I".

  16. Recurrent-Neural-Network-Based Multivariable Adaptive Control for a Class of Nonlinear Dynamic Systems With Time-Varying Delay.

    PubMed

    Hwang, Chih-Lyang; Jan, Chau

    2016-02-01

    At the beginning, an approximate nonlinear autoregressive moving average (NARMA) model is employed to represent a class of multivariable nonlinear dynamic systems with time-varying delay. It is known that the disadvantages of robust control for the NARMA model are as follows: 1) suitable control parameters for larger time delay are more sensitive to achieving desirable performance; 2) it only deals with bounded uncertainty; and 3) the nominal NARMA model must be learned in advance. Due to the dynamic feature of the NARMA model, a recurrent neural network (RNN) is applied online to learn it. However, the system performance deteriorates due to poor learning of larger variations of the system vector functions. In this situation, a simple network is employed to compensate for the upper bound of the residue caused by the linear parameterization of the approximation error of the RNN. An e-modification learning law with a projection for the weight matrix is applied to guarantee its boundedness without persistent excitation. Under suitable conditions, semiglobally ultimately bounded tracking with boundedness of the estimated weight matrix is obtained by the proposed RNN-based multivariable adaptive control. Finally, simulations are presented to verify the effectiveness and robustness of the proposed control.

  17. Time use choices and healthy body weight: A multivariate analysis of data from the American Time use Survey

    PubMed Central

    2011-01-01

    Background We examine the relationship between time use choices and healthy body weight as measured by survey respondents' body mass index (BMI). Using data from the 2006 and 2007 American Time Use Surveys, we expand upon earlier research by including more detailed measures of time spent eating as well as measures of physical activity time and sedentary time. We also estimate three alternative models that relate time use to BMI. Results Our results suggest that time use and BMI are simultaneously determined. The preferred empirical model reveals evidence of an inverse relationship between time spent eating and BMI for women and men. In contrast, time spent drinking beverages while simultaneously doing other things and time spent watching television/videos are positively linked to BMI. For women only, time spent in food preparation and clean-up is inversely related to BMI, while for men only, time spent sleeping is inversely related to BMI. Models that include grocery prices, opportunity costs of time, and nonwage income reveal that as these economic variables increase, BMI declines. Conclusions In this large, nationally representative data set, our analyses that correct for time use endogeneity reveal that Americans' time use decisions have implications for their BMI. The analyses suggest that both eating time and eating context (i.e., eating while doing other tasks simultaneously) matter, as do time spent in food preparation and time spent in sedentary activities. Reduced form models suggest that shifts in grocery prices, opportunity costs of time, and nonwage income may be contributing to alterations in time use patterns and food choices that have implications for BMI. PMID:21810246

  18. MULTIVARIATE STATISTICAL MODELS FOR EFFECTS OF PM AND COPOLLUTANTS IN A DAILY TIME SERIES EPIDEMIOLOGY STUDY

    EPA Science Inventory

    Most analyses of daily time series epidemiology data relate mortality or morbidity counts to PM and other air pollutants by means of single-outcome regression models using multiple predictors, without taking into account the complex statistical structure of the predictor variable...

  19. Higher Dimensional Clayton–Oakes Models for Multivariate Failure Time Data

    PubMed Central

    Prentice, R. L.

    2016-01-01

    Summary The Clayton–Oakes bivariate failure time model is extended to dimensions m > 2 in a manner that allows unspecified marginal survivor functions for all dimensions less than m. Special cases that allow unspecified marginal survivor functions of dimension q with q < m, while making some provisions for dependencies of dimension greater than q, are also described. PMID:27738350

  20. Falcon: Visual analysis of large, irregularly sampled, and multivariate time series data in additive manufacturing

    DOE PAGES

    Steed, Chad A.; Halsey, William; Dehoff, Ryan; ...

    2017-02-16

    Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.

  1. Classification of broiler breast filets according to deboning time using near infrared spectroscopy and multivariate analysis

    USDA-ARS?s Scientific Manuscript database

    Chicken breast filets were deboned and NIR spectra were collected after 2, 4, and 24 hours. The deboning was performed on pairs of filets to minimize differences due only to the meat and not the deboning time (i.e. right at 2 hours, left at 24; right at 2, left at 4; right at 4, left at 24 hrs). The...

  2. Timing and tempo of the Great Oxidation Event

    NASA Astrophysics Data System (ADS)

    Gumsley, Ashley P.; Chamberlain, Kevin R.; Bleeker, Wouter; Söderlund, Ulf; de Kock, Michiel O.; Larsson, Emilie R.; Bekker, Andrey

    2017-02-01

    The first significant buildup in atmospheric oxygen, the Great Oxidation Event (GOE), began in the early Paleoproterozoic in association with global glaciations and continued until the end of the Lomagundi carbon isotope excursion ca. 2,060 Ma. The exact timing of and relationships among these events are debated because of poor age constraints and contradictory stratigraphic correlations. Here, we show that the first Paleoproterozoic global glaciation and the onset of the GOE occurred between ca. 2,460 and 2,426 Ma, ∼100 My earlier than previously estimated, based on an age of 2,426 ± 3 Ma for Ongeluk Formation magmatism from the Kaapvaal Craton of southern Africa. This age helps define a key paleomagnetic pole that positions the Kaapvaal Craton at equatorial latitudes of 11° ± 6° at this time. Furthermore, the rise of atmospheric oxygen was not monotonic, but was instead characterized by oscillations, which together with climatic instabilities may have continued over the next ∼200 My until ≤2,250-2,240 Ma. Ongeluk Formation volcanism at ca. 2,426 Ma was part of a large igneous province (LIP) and represents a waning stage in the emplacement of several temporally discrete LIPs across a large low-latitude continental landmass. These LIPs played critical, albeit complex, roles in the rise of oxygen and in both initiating and terminating global glaciations. This series of events invites comparison with the Neoproterozoic oxygen increase and Sturtian Snowball Earth glaciation, which accompanied emplacement of LIPs across supercontinent Rodinia, also positioned at low latitude.

  3. Timing and tempo of the Great Oxidation Event

    PubMed Central

    Chamberlain, Kevin R.; Bleeker, Wouter; Söderlund, Ulf; de Kock, Michiel O.; Larsson, Emilie R.; Bekker, Andrey

    2017-01-01

    The first significant buildup in atmospheric oxygen, the Great Oxidation Event (GOE), began in the early Paleoproterozoic in association with global glaciations and continued until the end of the Lomagundi carbon isotope excursion ca. 2,060 Ma. The exact timing of and relationships among these events are debated because of poor age constraints and contradictory stratigraphic correlations. Here, we show that the first Paleoproterozoic global glaciation and the onset of the GOE occurred between ca. 2,460 and 2,426 Ma, ∼100 My earlier than previously estimated, based on an age of 2,426 ± 3 Ma for Ongeluk Formation magmatism from the Kaapvaal Craton of southern Africa. This age helps define a key paleomagnetic pole that positions the Kaapvaal Craton at equatorial latitudes of 11° ± 6° at this time. Furthermore, the rise of atmospheric oxygen was not monotonic, but was instead characterized by oscillations, which together with climatic instabilities may have continued over the next ∼200 My until ≤2,250–2,240 Ma. Ongeluk Formation volcanism at ca. 2,426 Ma was part of a large igneous province (LIP) and represents a waning stage in the emplacement of several temporally discrete LIPs across a large low-latitude continental landmass. These LIPs played critical, albeit complex, roles in the rise of oxygen and in both initiating and terminating global glaciations. This series of events invites comparison with the Neoproterozoic oxygen increase and Sturtian Snowball Earth glaciation, which accompanied emplacement of LIPs across supercontinent Rodinia, also positioned at low latitude. PMID:28167763

  4. What controls the local time extent of flux transfer events?

    NASA Astrophysics Data System (ADS)

    Milan, S. E.; Imber, S. M.; Carter, J. A.; Walach, M.-T.; Hubert, B.

    2016-02-01

    Flux transfer events (FTEs) are the manifestation of bursty and/or patchy magnetic reconnection at the magnetopause. We compare two sequences of the ionospheric signatures of flux transfer events observed in global auroral imagery and coherent ionospheric radar measurements. Both sequences were observed during very similar seasonal and interplanetary magnetic field (IMF) conditions, though with differing solar wind speed. A key observation is that the signatures differed considerably in their local time extent. The two periods are 26 August 1998, when the IMF had components BZ≈-10 nT and BY≈9 nT and the solar wind speed was VX≈650 km s⁻¹, and 31 August 2005, IMF BZ≈-7 nT, BY≈17 nT, and VX≈380 km s⁻¹. In the first case, the reconnection rate was estimated to be near 160 kV, and the FTE signatures extended across at least 7 h of magnetic local time (MLT) of the dayside polar cap boundary. In the second, a reconnection rate close to 80 kV was estimated, and the FTEs had an MLT extent of roughly 2 h. We discuss the ramifications of these differences for solar wind-magnetosphere coupling.

  5. Real-time performance analysis of wireless multimedia networks based on partially observed multivariate point processes

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    2000-07-01

    Third-generation (3G) wireless networks will support integrated multimedia services based on a cellular extension of a packet-switched architecture using variants of the Internet protocol (IP). Services can be categorized as real-time and delay-sensitive, or non-real-time and delay-insensitive. Each call, arriving to or active within the network, carries demand for one or more services in parallel, each service type with a guaranteed quality of service (QoS). Admission of new calls to the wireless IP network (WIN) from the gateway of a wired network or from a mobile subscriber (MS) is allowed by call admission control procedures. Roaming of the MSs among the nodes of the WIN is controlled by handoff procedures between base stations (BSs), or BS controllers, and the MSs. Metrics such as the probabilities of call blocking and dropping, handoff transition time, processing latency of a call, throughput, and capacity are used to evaluate the performance of network control procedures. The metrics are directly related to the network resources required to provide the QoS for the integrated services.
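    Among the metrics listed above is the probability of call blocking. One classical benchmark for this quantity, not given in the abstract but standard in teletraffic engineering, is the Erlang-B formula, computable by a numerically stable recursion:

```python
def erlang_b(traffic_erlangs, servers):
    """Erlang-B blocking probability via the standard recursion:
    B(E, 0) = 1;  B(E, m) = E*B(E, m-1) / (m + E*B(E, m-1))."""
    b = 1.0
    for m in range(1, servers + 1):
        b = traffic_erlangs * b / (m + traffic_erlangs * b)
    return b
```

    For example, 2 Erlangs of offered traffic on 2 circuits blocks 40% of calls, while the same load on 10 circuits blocks almost none.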

  6. Intelligent fuzzy controller for event-driven real time systems

    NASA Technical Reports Server (NTRS)

    Grantner, Janos; Patyra, Marek; Stachowicz, Marian S.

    1992-01-01

    Most of the known linguistic models are essentially static, that is, time is not a parameter in describing the behavior of the object's model. In this paper we show a model for synchronous finite state machines based on fuzzy logic. Such finite state machines can be used to build both event-driven, time-varying, rule-based systems and the control unit section of a fuzzy logic computer. The architecture of a pipelined intelligent fuzzy controller is presented, and the linguistic model is represented by an overall fuzzy relation stored in a single rule memory. A VLSI integrated circuit implementation of the fuzzy controller is suggested. At a clock rate of 30 MHz, the controller can perform 3 MFLIPS on multi-dimensional fuzzy data.
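    The linguistic model above is stored as a single overall fuzzy relation in rule memory. The abstract does not specify the inference operator; a common choice, assumed here, is max-min composition of an input membership vector with the relation matrix:

```python
import numpy as np

def max_min_compose(a, R):
    """Fuzzy inference by max-min composition:
    B(y) = max over x of min(A(x), R(x, y))."""
    a = np.asarray(a, dtype=float)
    R = np.asarray(R, dtype=float)
    return np.minimum(a[:, None], R).max(axis=0)
```

    Each output membership grade is the strongest rule firing that the input supports.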

  7. A hybrid framework for downscaling time-dependent multivariate coastal boundary conditions

    NASA Astrophysics Data System (ADS)

    Alvarez Antolinez, J. A.; Murray, A. B.; Moore, L. J.; Wood, J.; Mendez, F. J.

    2016-12-01

    Long-term coastal evolution models can be forced with long-term time series of sea state parameters or with a representative angular distribution of alongshore sediment transport. The use of observational data from buoys is usually restricted to 15-20 years, requires transformation of the waves, and in most cases the time series contain gaps. The development of atmospheric reanalyses and wave hindcasts can dramatically extend the period to more than 100 years, depending on the resolution of the wind forcing (0.3º - 2º). Global and regional wave hindcasts are nowadays available at spatial scales of 0.5-1º. However, this spatial resolution is not sufficient for coastal applications, and numerical modelling of the wave transformation processes at very high spatial resolution (usually less than 1 km) demands a huge CPU effort to obtain long-term time series of the triplet (significant wave height, SWH; peak period, Tp; mean direction, MWD) at the shoreface (∼20 m depth). To address this problem, we propose an efficient hybrid approach combining: (a) the use of long-term time series of triplets based on regional wave hindcasts; (b) a hybrid wave transformation to the shoreface with the SWAN model and data mining techniques (Camus et al., 2011); (c) a statistical downscaling model based on daily sea level pressure (SLP) weather types that allows the triplets to be downscaled (Camus et al., 2014) for each specific synoptic-scale SLP pattern; (d) reconstruction of daily values of SWH, Tp and MWD in the period 1871-2010 using the 20CRv2 reanalysis; (e) use of the Coastline Evolution Model, CEM (Ashton and Murray, 2006), forced with the daily values of the forcing, for the analysis of the multidecadal variability of coastline response in the period 1900-2010. In comparison with a fully dynamic downscaling, this framework saves a considerable amount of CPU time (1000X faster) and allows analysis of the multidecadal variability of the coastline.

  8. Detection of intermittent events in atmospheric time series

    NASA Astrophysics Data System (ADS)

    Paradisi, P.; Cesari, R.; Palatella, L.; Contini, D.; Donateo, A.

    2009-04-01

    The modeling approach in atmospheric sciences is based on the assumption that local fluxes of mass, momentum, heat, etc. can be described as linear functions of the local gradient of some intensive property (concentration, flow strain, temperature, ...). This is essentially associated with Gaussian statistics and short-range (exponential) correlations. However, the atmosphere is a complex dynamical system displaying a wide range of spatial and temporal scales. A global description of the atmospheric dynamics should include a great number of degrees of freedom, strongly interacting on several temporal and spatial scales, thus generating long-range (power-law) correlations and non-Gaussian distributions of fluctuations (Lévy flights, Lévy walks, Continuous Time Random Walks) [1]. This is typically associated with anomalous diffusion and scaling, non-trivial memory features and correlation decays and, especially, with the emergence of flux-gradient relationships that are non-linear and/or non-local in time and/or space. The local flux-gradient relationship is nonetheless greatly preferred because of its clearer physical meaning, which allows direct comparisons with experimental data, and, especially, because of its smaller computational cost in numerical models. In particular, the linearity of this relationship allows a transport coefficient (e.g., turbulent diffusivity) to be defined, and the modeling effort is usually focused on this coefficient. However, the validity of the local (and linear) flux-gradient model depends strongly on the range of spatial and temporal scales represented by the model and, consequently, on the sub-grid processes included in the flux-gradient relationship. In this work, in order to check the validity of local and linear flux-gradient relationships, an approach based on the concept of renewal critical events [2] is introduced. In fact, in renewal theory [2], the dynamical origin of anomalous behaviour and non-local flux-gradient relation is
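    The local, linear flux-gradient closure whose validity the work examines can be written F = -K dC/dx, with K the transport coefficient (e.g., turbulent diffusivity). A minimal numerical sketch (function name hypothetical):

```python
import numpy as np

def local_flux(conc, dx, K):
    """Local (Fickian) flux-gradient closure F = -K * dC/dx.
    The anomalous, non-local transport discussed in the text is
    precisely what this closure fails to capture."""
    return -K * np.gradient(np.asarray(conc, dtype=float), dx)
```

    For a linear concentration profile the computed flux is uniform, as the closure predicts.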

  9. Event coincidence analysis for quantifying statistical interrelationships between event time series. On the role of flood events as triggers of epidemic outbreaks

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.

    2016-05-01

    Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis, which provides a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
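    A minimal sketch of the precursor-coincidence idea: count how often an event in one series is followed, within a tolerance window (and optional lag), by an event in the other, and compare the observed rate against Poisson surrogates. The function names and the Monte Carlo test are illustrative simplifications of the framework described above.

```python
import numpy as np

def coincidence_rate(ta, tb, delta_t, tau=0.0):
    """Fraction of events in `ta` followed by at least one event in `tb`
    within the window [t + tau, t + tau + delta_t]."""
    ta = np.asarray(ta, dtype=float)
    tb = np.asarray(tb, dtype=float)
    hits = sum(np.any((tb >= t + tau) & (tb <= t + tau + delta_t)) for t in ta)
    return hits / len(ta)

def poisson_surrogate_pvalue(ta, tb, delta_t, t_max, n_surr=1000, seed=0):
    """Monte Carlo p-value under a homogeneous Poisson null: how often do
    uniformly redrawn `tb` events coincide at least as strongly?"""
    rng = np.random.default_rng(seed)
    observed = coincidence_rate(ta, tb, delta_t)
    null_rates = [coincidence_rate(ta, rng.uniform(0, t_max, len(tb)), delta_t)
                  for _ in range(n_surr)]
    return float(np.mean([r >= observed for r in null_rates]))
```

    A small p-value indicates that the observed coincidence rate is unlikely under independent Poisson event timing.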

  10. Time course of salinity adaptation in a strongly euryhaline estuarine teleost, Fundulus heteroclitus: A multivariable approach

    USGS Publications Warehouse

    Marshall, W.S.; Emberley, T.R.; Singer, T.D.; Bryson, S.E.; McCormick, S.D.

    1999-01-01

    Freshwater-adapted killifish (Fundulus heteroclitus) were transferred directly from soft fresh water to full-strength sea water for periods of 1h, 3h, 8h and 1, 2, 7, 14 and 30 days. Controls were transferred to fresh water for 24 h. Measured variables included: blood [Na+], osmolality, glucose and cortisol levels, basal and stimulated rates of ion transport and permeability of in vitro opercular epithelium, gill Na+/K+-ATPase and citrate synthase activity and chloride cell ultrastructure. These data were compared with previously published killifish cystic fibrosis transmembrane conductance regulator (kfCFTR) expression in the gills measured over a similar time course. Plasma cortisol levels peaked at 1 h, coincident with a rise in plasma [Na+]. At 8 h after transfer to sea water, a time at which previous work has shown kfCFTR expression to be elevated, blood osmolality and [Na+] were high, and cortisol levels and opercular membrane short-circuit current (I(SC); a measure of Cl- secretion rate) were low. The 24h group, which showed the highest level of kfCFTR expression, had the highest plasma [Na+] and osmolality, elevated plasma cortisol levels, significantly lower opercular membrane resistance, an increased opercular membrane ion secretion rate and collapsed tubule inclusions in mitochondria-rich cells, but no change in gill Na+/K+-ATPase and citrate synthase activity or plasma glucose levels. Apparently, killifish have a rapid (<1h) cortisol response to salinity coupled to subsequent (8-48 h) expression of kfCFTR anion channel proteins in existing mitochondria-rich cells that convert transport from ion uptake to ion secretion.

  11. Generalized survival models for correlated time-to-event data.

    PubMed

    Liu, Xing-Rong; Pawitan, Yudi; Clements, Mark S

    2017-09-14

    Our aim is to develop a rich and coherent framework for modeling correlated time-to-event data, including (1) survival regression models with different links and (2) flexible modeling for time-dependent and nonlinear effects with rich postestimation. We extend the class of generalized survival models, which expresses a transformed survival in terms of a linear predictor, by incorporating a shared frailty or random effects for correlated survival data. The proposed approach can include parametric or penalized smooth functions for time, time-dependent effects, nonlinear effects, and their interactions. The maximum (penalized) marginal likelihood method is used to estimate the regression coefficients and the variance for the frailty or random effects. The optimal smoothing parameters for the penalized marginal likelihood estimation can be automatically selected by a likelihood-based cross-validation criterion. For models with normal random effects, Gauss-Hermite quadrature can be used to obtain the cluster-level marginal likelihoods. The Akaike Information Criterion can be used to compare models and select the link function. We have implemented these methods in the R package rstpm2. In simulations with both small and large clusters, we find that this approach performs well. Through 2 applications, we demonstrate (1) a comparison of proportional hazards and proportional odds models with random effects for clustered survival data and (2) the estimation of time-varying effects on the log-time scale, age-varying effects for a specific treatment, and two-dimensional splines for time and age. Copyright © 2017 John Wiley & Sons, Ltd.
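    For normal random effects, the cluster-level marginal likelihood is an integral over the frailty, which Gauss-Hermite quadrature handles well. A toy sketch for one cluster of uncensored exponential event times with a shared log-normal frailty (this particular model is illustrative, not the paper's generalized survival model):

```python
import numpy as np

def cluster_marginal_loglik(times, lam, sigma, n_nodes=32):
    """Marginal log-likelihood of one cluster of (uncensored) exponential
    event times with hazard lam * exp(b) and shared frailty b ~ N(0, sigma^2).
    The integral over b uses Gauss-Hermite quadrature with the usual
    change of variables b = sqrt(2) * sigma * x."""
    x, w = np.polynomial.hermite.hermgauss(n_nodes)
    b = np.sqrt(2.0) * sigma * x
    t = np.asarray(times, dtype=float)
    # conditional log-likelihood of the whole cluster at each quadrature node
    cond = len(t) * (np.log(lam) + b) - lam * np.exp(b) * t.sum()
    return float(np.log(np.dot(w, np.exp(cond)) / np.sqrt(np.pi)))
```

    With sigma = 0 the frailty vanishes and the result reduces to the plain exponential log-likelihood, a convenient correctness check.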

  12. Association of cardiomyopathy with adverse cardiac events in pregnant women at the time of delivery.

    PubMed

    Lima, Fabio V; Parikh, Puja B; Zhu, Jiawen; Yang, Jie; Stergiopoulos, Kathleen

    2015-03-01

    The aim of this study was to determine the predictors of adverse events in pregnant women with cardiomyopathy (CDM) and CDM subtypes at the time of delivery. Investigation of patients' characteristics and outcomes in women with CDM at the time of delivery has been limited. The Healthcare Cost and Utilization Project's National Inpatient Sample was screened for hospital admissions for delivery in pregnant women with CDM from 2006 to 2010. Clinical characteristics and maternal outcomes were identified in women with and without CDM and in CDM subtypes. The primary outcome of interest was major adverse clinical events (MACE), a composite of in-hospital death, acute myocardial infarction, heart failure, arrhythmia, cerebrovascular event, or embolic event. Our study population comprised 2,078 patients with CDM and 4,438,439 patients without CDM. Of those with CDM, 52 (2.5%) were hypertrophic, 1,039 (50.0%) were peripartum, and 987 (47.5%) were classified as other. Women with CDM were older, white, and insured by Medicaid. MACE rates were significantly higher in women with peripartum CDM (46%), compared with hypertrophic CDM (23%) and all others (39%) (p < 0.001). In multivariable analysis, the presence of peripartum cardiomyopathy (odds ratio [OR]: 2.2; 95% confidence interval [CI]: 1.1 to 4.6), valvular disease (OR: 2.11; 95% CI: 1.6 to 2.9), and eclampsia (OR: 5.0; 95% CI: 1.6 to 1.9) was independently associated with MACE. Presence of CDM is independently predictive of MACE during hospitalization for delivery. Patients with peripartum CDM had the highest likelihood of MACE compared with other CDM subtypes. Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  13. Two step transfer entropy - An estimator of delayed directional couplings between multivariate EEG time series.

    PubMed

    Songhorzadeh, Maryam; Ansari-Asl, Karim; Mahmoudi, Alimorad

    2016-12-01

    Quantifying delayed directional couplings between electroencephalographic (EEG) time series requires an efficient method of causal network inference, especially given the limited knowledge about the underlying dynamics of brain activity. Recent methods based on information theoretic measures such as Transfer Entropy (TE) have made significant progress on this issue by providing a model-free framework for causality detection. However, TE estimation from observed data is not a trivial task, especially when the number of variables is large, which is the case in a highly complex system like the human brain. Here we propose a computationally efficient procedure for TE estimation based on sets of the Most Informative Variables that effectively contribute to resolving the uncertainty of the destination. In the first step of this method, conditioning sets are determined through a nonlinear state space reconstruction; in the second step, TE is optimally estimated based on these sets. Validation of the proposed method using synthetic data and neurophysiological signals demonstrates its computational efficiency in quantifying delayed directional couplings compared with common TE analysis.
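    For reference, the basic first-order transfer entropy on which such two-step estimators build can be computed for discrete symbol sequences with plug-in probabilities. This sketch is the standard estimator, not the authors' Most-Informative-Variables procedure:

```python
from collections import Counter
import numpy as np

def transfer_entropy(x, y):
    """Plug-in estimate of TE(Y -> X) in bits for discrete sequences:
    the reduction in uncertainty about x[t+1] from knowing y[t]
    in addition to x[t]."""
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    singles = Counter(x[:-1])
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_xy[(x0, y0)]   # p(x1 | x0, y0)
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]   # p(x1 | x0)
        te += p_joint * np.log2(p_cond_full / p_cond_x)
    return te
```

    When x simply copies y with a one-step delay and y is an unpredictable binary source, TE(Y -> X) approaches one bit.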

  14. Investigating Progression in Substance Use Initiation Using a Discrete-Time Multiple Event Process Survival Mixture (MEPSUM) Approach

    PubMed Central

    Richmond-Rakerd, Leah S.; Fleming, Kimberly A.; Slutske, Wendy S.

    2015-01-01

    The order and timing of substance initiation have significant implications for later problematic patterns of use. Despite the need to study initiation within a multivariate framework, survival analytic methods typically cannot accommodate more than two substances in one model. The Discrete-Time Multiple Event Process Survival Mixture (MEPSUM; Dean, Bauer, & Shanahan, 2014) model represents an advance by incorporating more than two outcomes and enabling the establishment of latent classes within a multivariate hazard distribution. Employing a MEPSUM approach, we evaluated patterns of tobacco, alcohol, and cannabis initiation in the National Longitudinal Study of Adolescent to Adult Health (N=18,923). We found four classes that differed in their ages and ordering of peak initiation risk. Demographics, externalizing psychopathology, and personality significantly predicted class membership. Sex differences in the association between delinquency and initiation patterns also emerged. Findings support the utility of the MEPSUM approach in elucidating developmental pathways underlying clinically relevant phenomena. PMID:27127730
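    The building block of a discrete-time survival model such as MEPSUM is the age-specific hazard: among respondents still at risk at age t, the fraction who initiate at t. A nonparametric sketch of that building block (MEPSUM itself fits mixtures of multivariate hazards across several substances, which this does not attempt):

```python
import numpy as np

def discrete_time_hazard(ages, initiated, t_max):
    """Discrete-time hazard of initiation at ages 1..t_max.
    `ages` holds the age of initiation (or of censoring); `initiated`
    flags whether the event was observed rather than censored."""
    ages = np.asarray(ages)
    initiated = np.asarray(initiated, dtype=bool)
    hazard = []
    for t in range(1, t_max + 1):
        at_risk = int((ages >= t).sum())                 # not yet initiated or censored
        events = int(((ages == t) & initiated).sum())    # initiations at age t
        hazard.append(events / at_risk if at_risk else 0.0)
    return np.array(hazard)
```

    Running one such hazard per substance and clustering the resulting multivariate profiles is, in spirit, what the mixture model formalizes.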

  15. Four Simultaneous Component Models for the Analysis of Multivariate Time Series from More Than One Subject To Model Intraindividual and Interindividual Differences.

    ERIC Educational Resources Information Center

    Timmerman, Marieke E.; Kiers, Henk A. L.

    2003-01-01

    Discusses a class of four simultaneous component models for the explanatory analysis of multivariate time series collected from more than one subject simultaneously. Shows how the models can be ordered hierarchically and illustrates their use through an empirical example. (SLD)

  16. Putting Predictive Models to Use: Scoring of Unseen Streaming Data using a Multivariate Time Series Classification Tool

    NASA Astrophysics Data System (ADS)

    Sipes, T.; Karimabadi, H.; Imber, S. M.; Slavin, J. A.; Pothier, N. M.; Coeli, R.

    2013-12-01

    Advances in data collection and data storage technologies have made the assembly of multivariate time series data more common. Data analysis and extraction of knowledge from the massive and complex datasets encountered in space physics today present a major obstacle to fully utilizing our vast data repositories and to scientific progress. In previous years, we introduced the time series classification tool MineTool-TS [Karimabadi et al., 2009] and its extension to simulation and streaming data [Sipes & Karimabadi, 2012, 2013]. In this work we demonstrate the applicability and real-world utility of the predictive models created with the tool for scoring and labeling a large dataset of unseen, streaming data. Predictive models are created under the assumption that the training data used to build them are representative of the population. Multivariate time series datasets are also characterized by large amounts of variability and potential background noise. Moreover, the streaming nature of the data raises multiple issues of its own. In this work we illustrate how we dealt with these challenges and demonstrate the results in a study of flux ropes in the plasma sheet. We used an iterative process: we built a predictive model using the original labeled training set, tested it on a week's worth of streaming data, had the results checked by a scientific expert in the domain, and fed the results and labels back into the training set, creating a larger training set from which the final model was produced. This final model was then used to predict a very large, unseen, six-month period of streaming data. We present the results of our machine learning approach to automatically detecting flux ropes in spacecraft data.

  17. Chemical fingerprinting of petroleum biomarkers in biota samples using retention-time locking chromatography and multivariate analysis.

    PubMed

    Bartolomé, Luis; Deusto, Miren; Etxebarria, Nestor; Navarro, Patricia; Usobiaga, Aresatz; Zuloaga, Olatz

    2007-07-20

    This work presents a new separation and evaluation approach for the chemical fingerprinting of petroleum biomarkers in biota samples. Its final aim was to study the correlation between the observed effects in shore habitats (mussels and limpets) and one pollution source: the oil spill of the Prestige tanker. The method combined a clean-up step for the biota extracts, retention-time locking of the gas chromatographic setup, and multivariate analysis of the chromatograms. For clean-up, solid-phase extraction and gel permeation chromatography were compared; 5 g Florisil cartridges ensured the absence of interfering compounds in the final extracts. To obtain reproducible retention times and avoid realigning the chromatograms, the retention-time locking feature of the gas chromatography-mass spectrometry (GC-MS) setup was used. Finally, for the multivariate analysis, the GC-MS chromatograms were pre-treated, essentially by derivatization and normalization, and the chromatograms at m/z 191 (terpanes), m/z 217-218 (steranes and diasteranes) and m/z 231 (triaromatic steranes) were subjected to principal component analysis. Furthermore, four slightly different oil samples from the Prestige oil spill were analyzed following the Nordtest method, and their GC-MS chromatograms were taken as the reference chemical fingerprints of the sources. The correlation between the studied samples, including sediments and biota, and the candidate source was then established by means of a supervised pattern recognition method. The proposed method proved useful in identifying the Prestige oil spill as the source of many of the analyzed samples.
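
    The chromatogram PCA step can be sketched with synthetic two-peak traces standing in for the m/z-specific chromatograms; the normalisation and SVD-based scores follow the abstract's outline, while the profiles themselves are invented:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Mean-centre rows (chromatograms) and project onto top principal axes."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 200)

def peak(mu):
    return np.exp(-((t - mu) ** 2) / 2e-4)

# two "sources": samples matching the spill fingerprint vs. a different profile
spill = np.array([peak(0.3) + 0.5 * peak(0.7) + 0.02 * rng.normal(size=200)
                  for _ in range(6)])
other = np.array([peak(0.4) + 0.9 * peak(0.6) + 0.02 * rng.normal(size=200)
                  for _ in range(6)])
X = np.vstack([spill, other])
X = X / np.linalg.norm(X, axis=1, keepdims=True)   # normalise each chromatogram
scores = pca_scores(X)
# the first principal component separates the two fingerprints
print(scores[:6, 0], scores[6:, 0])
```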

  18. Detecting Rare Events in the Time-Domain

    SciTech Connect

    Rest, A; Garg, A

    2008-10-31

    One of the biggest challenges in current and future time-domain surveys is extracting the objects of interest from the immense data stream. There are two aspects to achieving this goal: detecting variable sources and classifying them. Difference imaging provides an elegant technique for identifying new transients or changes in source brightness, and much progress has been made in recent years toward refining the process. We discuss a selection of pitfalls that can afflict an automated difference imaging pipeline and describe some solutions. After identifying true astrophysical variables, we face the challenge of classifying them. For rare events, such as supernovae and microlensing, this challenge is magnified because selection criteria that capture as many objects of interest as possible must be balanced against a high contamination rate. We discuss considerations and techniques for developing classification schemes.
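
    At its core, difference imaging reduces to subtracting a reference frame and thresholding the residuals; a minimal sketch with a synthetic star field and one injected transient (real pipelines add PSF matching, astrometric alignment, and much more):

```python
import numpy as np

def detect_transients(reference, science, nsigma=5.0):
    """Subtract a reference image from a science image and flag pixels
    deviating by more than nsigma times the residual scatter."""
    diff = science - reference
    sigma = np.std(diff)
    return np.argwhere(np.abs(diff) > nsigma * sigma)

rng = np.random.default_rng(2)
reference = rng.normal(100.0, 1.0, (64, 64))        # static field + noise
science = reference + rng.normal(0.0, 1.0, (64, 64))
science[20, 30] += 50.0                             # injected transient
candidates = detect_transients(reference, science)
print(candidates)
```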

  19. Relative timing of deglacial climate events in Antarctica and Greenland.

    PubMed

    Morgan, Vin; Delmotte, Marc; van Ommen, Tas; Jouzel, Jean; Chappellaz, Jérôme; Woon, Suenor; Masson-Delmotte, Valérie; Raynaud, Dominique

    2002-09-13

    The last deglaciation was marked by large, hemispheric, millennial-scale climate variations: the Bølling-Allerød and Younger Dryas periods in the north, and the Antarctic Cold Reversal in the south. A chronology from the high-accumulation Law Dome East Antarctic ice core constrains the relative timing of these two events and provides strong evidence that the cooling at the start of the Antarctic Cold Reversal did not follow the abrupt warming during the northern Bølling transition around 14,500 years ago. This result suggests that southern changes are not a direct response to abrupt changes in North Atlantic thermohaline circulation, as is assumed in the conventional picture of a hemispheric temperature seesaw.

  20. Simplifying Facility and Event Scheduling: Saving Time and Money.

    ERIC Educational Resources Information Center

    Raasch, Kevin

    2003-01-01

    Describes a product called the Event Management System (EMS), a computer software program for managing facility and event scheduling. Provides examples of school district and university uses of EMS. Describes steps in selecting a scheduling-management system. (PKP)

  2. Time in Language: Event Duration in Language Comprehension

    ERIC Educational Resources Information Center

    Coll-Florit, Marta; Gennari, Silvia P.

    2011-01-01

    This work investigates how we process and represent event duration in on-line language comprehension. Specifically, it examines how events of different duration are processed and what type of knowledge underlies their representations. Studies 1-4 examined verbs and phrases in different contexts. They showed that durative events took longer to…

  3. DeCon: A tool to detect emotional concordance in multivariate time series data of emotional responding

    PubMed Central

    Bulteel, Kirsten; Ceulemans, Eva; Thompson, Renee J.; Waugh, Christian E.; Gotlib, Ian H.; Tuerlinckx, Francis; Kuppens, Peter

    2013-01-01

    The occurrence of concordance among different response components during an emotional episode is a key feature of several contemporary accounts and definitions of emotion. Yet, capturing such response concordance in empirical data has proven to be elusive, in large part because of a lack of appropriate statistical tools that are tailored to measure the intricacies of response concordance in the context of data on emotional responding. In this article, we present a tool we developed to detect two different forms of response concordance—response patterning and synchronization—in multivariate time series data of emotional responding, and apply this tool to data concerning physiological responding to emotional stimuli. While the findings provide partial evidence for both response patterning and synchronization, they also show that the presence and nature of such patterning and synchronization is strongly person-dependent. PMID:24220647

  4. Placebo group improvement in trials of pharmacotherapies for alcohol use disorders: a multivariate meta-analysis examining change over time.

    PubMed

    Del Re, A C; Maisel, Natalya; Blodgett, Janet C; Wilbourne, Paula; Finney, John W

    2013-10-01

    Placebo group improvement in pharmacotherapy trials has been increasing over time across several pharmacological treatment areas. However, it is unknown to what degree increasing improvement has occurred in pharmacotherapy trials for alcohol use disorders or what factors may account for placebo group improvement. This meta-analysis of 47 alcohol pharmacotherapy trials evaluated (1) the magnitude of placebo group improvement, (2) the extent to which placebo group improvement has been increasing over time, and (3) several potential moderators that might account for variation in placebo group improvement. Random-effects univariate and multivariate analyses were conducted that examined the magnitude of placebo group improvement in the 47 studies and several potential moderators of improvement: (a) publication year, (b) country in which the study was conducted, (c) outcome data source/type, (d) number of placebo administrations, (e) overall severity of study participants, and (f) additional psychosocial treatment. Substantial placebo group improvement was found overall and improvement was larger in more recent studies. Greater improvement was found on moderately subjective outcomes, with more frequent administrations of the placebo, and in studies with greater participant severity of illness. However, even after controlling for these moderators, placebo group improvement remained significant, as did placebo group improvement over time. Similar to previous pharmacotherapy placebo research, substantial pretest to posttest placebo group improvement has occurred in alcohol pharmacotherapy trials, an effect that has been increasing over time. However, several plausible moderator variables were not able to explain why placebo group improvement has been increasing over time.
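
    A random-effects pooled estimate of this kind is often computed with the DerSimonian-Laird estimator; a sketch with hypothetical placebo-group effect sizes and variances (the abstract does not state which estimator the authors used):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate of pre-post improvement effect sizes."""
    w = 1.0 / variances
    mu_fe = np.sum(w * effects) / w.sum()            # fixed-effect mean
    q = np.sum(w * (effects - mu_fe) ** 2)           # heterogeneity statistic
    df = len(effects) - 1
    c = w.sum() - np.sum(w ** 2) / w.sum()
    tau2 = max(0.0, (q - df) / c)                    # between-study variance
    w_star = 1.0 / (variances + tau2)
    pooled = np.sum(w_star * effects) / w_star.sum()
    se = np.sqrt(1.0 / w_star.sum())
    return pooled, se, tau2

# hypothetical placebo-group pre-post effect sizes and sampling variances
effects = np.array([0.45, 0.60, 0.30, 0.80, 0.55])
variances = np.array([0.02, 0.03, 0.015, 0.05, 0.025])
pooled, se, tau2 = dersimonian_laird(effects, variances)
print(round(pooled, 3), round(se, 3), round(tau2, 4))
```

    Moderators such as publication year would then enter as covariates in a meta-regression on top of these weights.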

  5. Decoding Dynamic Brain Patterns from Evoked Responses: A Tutorial on Multivariate Pattern Analysis Applied to Time Series Neuroimaging Data.

    PubMed

    Grootswagers, Tijl; Wardle, Susan G; Carlson, Thomas A

    2017-04-01

    Multivariate pattern analysis (MVPA) or brain decoding methods have become standard practice in analyzing fMRI data. Although decoding methods have been extensively applied in brain-computer interfaces, these methods have only recently been applied to time series neuroimaging data such as MEG and EEG to address experimental questions in cognitive neuroscience. In a tutorial style review, we describe a broad set of options to inform future time series decoding studies from a cognitive neuroscience perspective. Using example MEG data, we illustrate the effects that different options in the decoding analysis pipeline can have on experimental results where the aim is to "decode" different perceptual stimuli or cognitive states over time from dynamic brain activation patterns. We show that decisions made at both preprocessing (e.g., dimensionality reduction, subsampling, trial averaging) and decoding (e.g., classifier selection, cross-validation design) stages of the analysis can significantly affect the results. In addition to standard decoding, we describe extensions to MVPA for time-varying neuroimaging data including representational similarity analysis, temporal generalization, and the interpretation of classifier weight maps. Finally, we outline important caveats in the design and interpretation of time series decoding experiments.
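
    The time-resolved decoding pipeline described here can be sketched with a cross-validated nearest-centroid classifier applied independently at each time point; the trials, sensors, and injected condition effect are all synthetic:

```python
import numpy as np

def decode_over_time(X, y, n_folds=5):
    """Cross-validated decoding accuracy at each time point.
    X: trials x channels x timepoints; y: binary condition labels."""
    n_trials, _, n_times = X.shape
    folds = np.arange(n_trials) % n_folds
    acc = np.zeros(n_times)
    for t in range(n_times):
        correct = 0
        for f in range(n_folds):
            train, test = folds != f, folds == f
            m0 = X[train & (y == 0), :, t].mean(axis=0)
            m1 = X[train & (y == 1), :, t].mean(axis=0)
            d0 = np.linalg.norm(X[test, :, t] - m0, axis=1)
            d1 = np.linalg.norm(X[test, :, t] - m1, axis=1)
            correct += np.sum((d1 < d0) == (y[test] == 1))
        acc[t] = correct / n_trials
    return acc

rng = np.random.default_rng(3)
X = rng.normal(size=(40, 8, 20))          # 40 trials, 8 sensors, 20 time points
y = np.array([0, 1] * 20)
X[y == 1, :, 10:] += 1.5                  # condition difference emerges at t=10
acc = decode_over_time(X, y)
print(acc[:10].mean(), acc[10:].mean())   # chance before onset, high after
```

    The review's point about pipeline choices shows up even here: swapping the classifier, fold scheme, or averaging strategy changes the accuracy curve.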

  6. A new proposal for multivariable modelling of time-varying effects in survival data based on fractional polynomial time-transformation.

    PubMed

    Sauerbrei, Willi; Royston, Patrick; Look, Maxime

    2007-06-01

    The Cox proportional hazards model has become the standard for the analysis of survival time data in cancer and other chronic diseases. In most studies, proportional hazards (PH) are assumed for covariate effects. With long-term follow-up, the PH assumption may be violated, leading to poor model fit. To accommodate non-PH effects, we introduce a new procedure, MFPT, an extension of the multivariable fractional polynomial (MFP) approach, to do the following: (1) select influential variables; (2) determine a sensible dose-response function for continuous variables; (3) investigate time-varying effects; (4) model such time-varying effects on a continuous scale. Assuming PH initially, we start with a detailed model-building step, including a search for possible non-linear functions for continuous covariates. Sometimes a variable with a strong short-term effect may appear weak or non-influential if 'averaged' over time under the PH assumption. To protect against omitting such variables, we repeat the analysis over a restricted time-interval. Any additional prognostic variables identified by this second analysis are added to create our final time-fixed multivariable model. Using a forward-selection algorithm we search for possible improvements in fit by adding time-varying covariates. The first part to create a final time-fixed model does not require the use of MFP. A model may be given from 'outside' or a different strategy may be preferred for this part. This broadens the scope of the time-varying part. To motivate and illustrate the methodology, we create prognostic models from a large database of patients with primary breast cancer. Non-linear time-fixed effects are found for progesterone receptor status and number of positive lymph nodes. Highly statistically significant time-varying effects are present for progesterone receptor status and tumour size.
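
    The fractional polynomial time-transformation at the heart of MFPT can be illustrated by searching the conventional FP(1) power set for the transform that best tracks a time-varying effect; the decaying log hazard ratio below is hypothetical, and the full MFPT algorithm is considerably richer:

```python
import numpy as np

# fractional polynomial powers conventionally searched (p = 0 denotes log t)
POWERS = [-2, -1, -0.5, 0, 0.5, 1, 2, 3]

def fp1(t, p):
    return np.log(t) if p == 0 else t ** float(p)

def best_fp1(t, beta_t):
    """Least-squares fit of beta(t) = b0 + b1 * fp1(t, p) over the FP(1) set."""
    best = None
    for p in POWERS:
        A = np.column_stack([np.ones_like(t), fp1(t, p)])
        coef, *_ = np.linalg.lstsq(A, beta_t, rcond=None)
        sse = np.sum((A @ coef - beta_t) ** 2)
        if best is None or sse < best[0]:
            best = (sse, p, coef)
    return best

t = np.linspace(0.5, 10, 50)            # years of follow-up
beta_t = 1.2 - 0.8 * np.log(t)          # a covariate effect decaying over time
sse, p, coef = best_fp1(t, beta_t)
print(p, np.round(coef, 2))
```

    In the actual procedure the fit criterion is the partial likelihood of the Cox model rather than least squares, but the power-set search has the same shape.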

  7. Life events and depressive symptoms in African American adolescents: do ecological domains and timing of life events matter?

    PubMed

    Sanchez, Yadira M; Lambert, Sharon F; Ialongo, Nicholas S

    2012-04-01

    Considerable research has documented associations between adverse life events and internalizing symptoms in adolescents, but much of this research has focused on the number of events experienced, with less attention to the ecological context or timing of events. This study examined life events in three ecological domains relevant to adolescents (i.e., family, peers, themselves) as predictors of the course of depressive symptoms among a community epidemiologically defined sample of 419 (47.2% females) urban African American adolescents. Given that youth depressive symptoms change over time, grade level was examined as a moderator. For males, the strength of associations between life events happening to participants, family life events, and peer life events and depressive symptoms did not change from grades 6-9. For females, the strength of the association between peer life events and depressive symptoms did not change over time, but the strength of associations between life events happening to participants and family life events and females' depressive symptoms decreased over time. Implications of the findings and directions for future research are discussed.

  8. Life Events and Depressive Symptoms in African American Adolescents: Do Ecological Domains and Timing of Life Events Matter?

    PubMed Central

    Lambert, Sharon F.; Ialongo, Nicholas S.

    2013-01-01

    Considerable research has documented associations between adverse life events and internalizing symptoms in adolescents, but much of this research has focused on the number of events experienced, with less attention to the ecological context or timing of events. This study examined life events in three ecological domains relevant to adolescents (i.e., family, peers, themselves) as predictors of the course of depressive symptoms among a community epidemiologically defined sample of 419 (47.2% females) urban African American adolescents. Given that youth depressive symptoms change over time, grade level was examined as a moderator. For males, the strength of associations between life events happening to participants, family life events, and peer life events and depressive symptoms did not change from grades 6–9. For females, the strength of the association between peer life events and depressive symptoms did not change over time, but the strength of associations between life events happening to participants and family life events and females’ depressive symptoms decreased over time. Implications of the findings and directions for future research are discussed. PMID:21706385

  9. Time to Tenure in Spanish Universities: An Event History Analysis

    PubMed Central

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of obtaining a permanent position and the relevant factors that account for this transition, in the context of the dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of relevant covariates associated with academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. Variations in the length of time to promotion across scientific domains are confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority and rewards to loyalty, in addition to some measures of performance and of the quality of the university granting the PhD, as key elements speeding up career advancement. The findings suggest a system based on granting early permanent jobs to those who combine social embeddedness and team integration with good credentials regarding past and potential future performance, rather than high levels of mobility. PMID:24116199
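
    Event history analysis of this kind rests on survival-curve machinery; a minimal Kaplan-Meier sketch for hypothetical time-to-tenure data with right censoring (the paper's covariate modelling would sit on top of this):

```python
import numpy as np

def kaplan_meier(times, observed):
    """Kaplan-Meier survival curve for time-to-tenure with right censoring."""
    order = np.argsort(times)
    times, observed = times[order], observed[order]
    surv, curve = 1.0, []
    for t in np.unique(times[observed == 1]):
        at_risk = np.sum(times >= t)                       # still untenured at t
        events = np.sum((times == t) & (observed == 1))    # tenured exactly at t
        surv *= 1.0 - events / at_risk
        curve.append((t, surv))
    return curve

# years from PhD to tenure; observed = 0 marks faculty not yet tenured (censored)
times = np.array([3.0, 4.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0])
observed = np.array([1, 1, 0, 1, 1, 0, 1, 1])
curve = kaplan_meier(times, observed)
print(curve)
```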

  10. Time to tenure in Spanish universities: an event history analysis.

    PubMed

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of obtaining a permanent position and the relevant factors that account for this transition, in the context of the dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of relevant covariates associated with academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. Variations in the length of time to promotion across scientific domains are confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority and rewards to loyalty, in addition to some measures of performance and of the quality of the university granting the PhD, as key elements speeding up career advancement. The findings suggest a system based on granting early permanent jobs to those who combine social embeddedness and team integration with good credentials regarding past and potential future performance, rather than high levels of mobility.

  11. Time warp: authorship shapes the perceived timing of actions and events.

    PubMed

    Ebert, Jeffrey P; Wegner, Daniel M

    2010-03-01

    It has been proposed that inferring personal authorship for an event gives rise to intentional binding, a perceptual illusion in which one's action and inferred effect seem closer in time than they otherwise would (Haggard, Clark, & Kalogeras, 2002). Using a novel, naturalistic paradigm, we conducted two experiments to test this hypothesis and examine the relationship between binding and self-reported authorship. In both experiments, an important authorship indicator - consistency between one's action and a subsequent event - was manipulated, and its effects on binding and self-reported authorship were measured. Results showed that action-event consistency enhanced both binding and self-reported authorship, supporting the hypothesis that binding arises from an inference of authorship. At the same time, evidence for a dissociation emerged, with consistency having a more robust effect on self-reports than on binding. Taken together, these results suggest that binding and self-reports reveal different aspects of the sense of authorship.

  12. UNCERTAINTY IN PHASE ARRIVAL TIME PICKS FOR REGIONAL SEISMIC EVENTS: AN EXPERIMENTAL DESIGN

    SciTech Connect

    A. VELASCO; ET AL

    2001-02-01

    The detection and timing of seismic arrivals play a critical role in the ability to locate seismic events, especially at low magnitude. Errors can occur in determining the timing of the arrivals, whether made by automated processing or by an analyst. One of the major obstacles to properly estimating travel-time picking error is the lack of a clear and comprehensive discussion of all of the factors that influence phase picks. This report discusses the factors that need to be modeled to properly study phase arrival time picking errors. We have developed a multivariate statistical model, experimental design, and analysis strategy for this study, and have embedded a general form of the International Data Center (IDC)/U.S. National Data Center (USNDC) phase pick measurement error model into our statistical model. This statistical model can be used to optimally calibrate a picking error model to regional data. A follow-on report will present the results of applying this analysis plan to an experiment/data-gathering task.

  13. Frey syndrome: factors influencing the time to event.

    PubMed

    Lafont, M; Whyte, A; Whyte, J; Saura, E; Tejedor, M T

    2015-07-01

    Frey syndrome is a common complication after parotidectomy. The time from surgery to disease onset may be quite long; therefore, a time-to-event analysis was performed for the occurrence of this syndrome post-parotidectomy. Three hundred and thirty-four patients who underwent a parotidectomy between January 2002 and November 2012 were identified (retrospective study). Of these patients, 102 developed Frey syndrome post-surgery and 232 did not. The time-to-onset analysis enabled us to estimate the risk ratio associated with different types of parotid gland tumours, various parotidectomy procedures, and repeat parotidectomy, which is useful for predicting preoperative and surgical risk. The risk of developing Frey syndrome was lower in patients with malignant tumours than in those with benign tumours (risk ratio 0.351, 95% confidence interval (CI) 0.155-0.594). Risk ratios for lumpectomy PA (pre-auricular area), superficial parotidectomy, and total parotidectomy with respect to lumpectomy T (tail) were 4.378 (95% CI 1.168-16.410), 8.040 (95% CI 3.286-19.670), and 8.174 (95% CI 3.076-21.723), respectively. Repeat parotidectomy also increased the risk of developing Frey syndrome (risk ratio 3.214, 95% CI 1.547-6.678). No effect of the use of a superficial muscular aponeurotic system (SMAS) flap on the risk of developing Frey syndrome was detected (P=0.888). Copyright © 2015 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  14. Supervised Time Series Event Detector for Building Data

    SciTech Connect

    2016-04-13

    A machine learning based approach is developed to detect events that have rarely been seen in the historical data. The data can include building energy consumption, sensor data, environmental data, and any other data that may affect the building's energy consumption. The algorithm is a modified nonlinear Bayesian support vector machine, which examines daily energy consumption profiles, detects days with abnormal events, and diagnoses the cause of those events.
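
    The modified Bayesian support vector machine itself is not described in the record; as a stand-in, a robust-distance detector over daily profiles illustrates the rare-day detection task (the data and threshold are hypothetical):

```python
import numpy as np

def abnormal_days(profiles, threshold=3.5):
    """Flag daily consumption profiles far from the median profile,
    using a robust (MAD-based) z-score on profile distances."""
    median_profile = np.median(profiles, axis=0)
    dist = np.linalg.norm(profiles - median_profile, axis=1)
    mad = np.median(np.abs(dist - np.median(dist)))
    z = 0.6745 * (dist - np.median(dist)) / mad
    return np.where(z > threshold)[0]

rng = np.random.default_rng(4)
profiles = rng.normal(50.0, 2.0, (30, 24))   # 30 days x 24 hourly kWh readings
profiles[12] += 15.0                         # simulated equipment fault on day 12
flagged = abnormal_days(profiles)
print(flagged)
```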

  15. Prediction of beef color using time-domain nuclear magnetic resonance (TD-NMR) relaxometry data and multivariate analyses.

    PubMed

    Moreira, Luiz Felipe Pompeu Prado; Ferrari, Adriana Cristina; Moraes, Tiago Bueno; Reis, Ricardo Andrade; Colnago, Luiz Alberto; Pereira, Fabíola Manhas Verbi

    2016-05-19

    Time-domain nuclear magnetic resonance and chemometrics were used to predict color parameters, such as lightness (L*), redness (a*), and yellowness (b*), of beef (Longissimus dorsi muscle) samples. Color quality parameters were predicted by analyzing the relaxation decays with multivariate models built using partial least-squares regression. The partial least-squares models showed low errors regardless of sample size, indicating the potential of the method. Mincing and weighing were not necessary to improve the predictive performance of the models. The reduction of the transverse relaxation time (T2), measured by the Carr-Purcell-Meiboom-Gill pulse sequence, in darker beef compared with lighter beef can be explained by the lower relaxivity of the Fe(2+) present in deoxymyoglobin and oxymyoglobin (red beef) relative to the higher relaxivity of the Fe(3+) present in metmyoglobin (brown beef). These results indicate that time-domain nuclear magnetic resonance spectroscopy can become a useful tool for quality assessment of beef on the bulk sample and through packaging, because this technique is also widely applied to measure sensorial parameters, such as flavor, juiciness and tenderness, and physicochemical parameters, such as cooking loss, fat and moisture content, and instrumental tenderness using Warner-Bratzler shear force. Copyright © 2016 John Wiley & Sons, Ltd.
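
    The PLS modelling step can be sketched with a hand-rolled single-response NIPALS PLS (PLS1) on synthetic relaxation decays; the decay matrix and the dependence of L* on two echo amplitudes are assumptions for illustration only:

```python
import numpy as np

def pls1_fit(X, y, n_components=2):
    """NIPALS PLS1: latent variables relating relaxation decays (X) to L* (y)."""
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)              # weight vector
        t = Xc @ w                          # scores
        tt = t @ t
        p = Xc.T @ t / tt                   # X loadings
        qk = yc @ t / tt                    # y loading
        Xc = Xc - np.outer(t, p)            # deflate
        yc = yc - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)     # regression vector in X space
    return X.mean(0), y.mean(), B

def pls1_predict(model, X):
    xm, ym, B = model
    return ym + (X - xm) @ B

rng = np.random.default_rng(5)
decays = rng.normal(size=(30, 50))          # 30 samples x 50 echo amplitudes
L_star = decays[:, 5] * 2.0 - decays[:, 20] + rng.normal(0, 0.1, 30)
model = pls1_fit(decays, L_star, n_components=3)
pred = pls1_predict(model, decays)
r2 = 1 - np.sum((L_star - pred) ** 2) / np.sum((L_star - L_star.mean()) ** 2)
print(round(r2, 3))
```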

  16. Generalization of the mechanisms of cross-correlation analysis in the case of a multivariate time series

    NASA Astrophysics Data System (ADS)

    Kravets, O. Ja; Abramov, G. V.; Beletskaja, S. Ju

    2017-02-01

    The article describes a generalization of the mechanisms of cross-correlation analysis to the case of a multivariate time series, and shows how this allows the optimal lag to be identified for each independent variable (IV) using a number of algorithms. The generalized mechanisms allow variables to be analysed and predicted based on retrospective analysis of multidimensional data. In the available literature, cross-correlation has been defined only for pairs of time series; however, studying the dependence of a dependent variable (DV) on multidimensional independent variables, taking into account a vector of specially selected time lags, can significantly improve the quality of models based on multiple regression. The idea of multiple cross-correlation is to shift each IV series forward with respect to the DV (i.e., the DV is treated as delayed relative to each IV) until a minimum error, or the best multiple-regression fit, is obtained. Once all stages of the multiple cross-correlation are complete, synthesizing the model is straightforward.
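
    The optimal-lag search described above can be sketched directly: shift each IV forward against the DV and keep the lag with the highest absolute correlation (the series and their true lags below are synthetic):

```python
import numpy as np

def best_lags(dv, ivs, max_lag=10):
    """For each independent variable, find the forward shift (lag) that
    maximises its absolute correlation with the dependent variable."""
    lags = []
    for x in ivs:
        scores = []
        for lag in range(max_lag + 1):
            a = x[:len(x) - lag] if lag else x
            b = dv[lag:]
            scores.append(abs(np.corrcoef(a, b)[0, 1]))
        lags.append(int(np.argmax(scores)))
    return lags

rng = np.random.default_rng(6)
n = 300
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
dv = rng.normal(size=n) * 0.1
dv[3:] += 1.0 * x1[:-3]          # dv lags x1 by 3 steps
dv[7:] += 0.8 * x2[:-7]          # dv lags x2 by 7 steps
lags = best_lags(dv, [x1, x2])
print(lags)
```

    The selected lag vector would then feed a multiple regression of the DV on the shifted IVs, as the abstract describes.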

  17. Real-time interpretation of novel events across childhood

    PubMed Central

    Borovsky, Arielle; Sweeney, Kim; Elman, Jeffrey L.; Fernald, Anne

    2014-01-01

    Despite extensive evidence that adults and children rapidly integrate world knowledge to generate expectancies for upcoming language, little work has explored how this knowledge is initially acquired and used. We explore this question in 3- to 10-year-old children and adults by measuring the degree to which sentences depicting recently learned connections between agents, actions and objects lead to anticipatory eye-movements to the objects. Combinatory information in sentences about agent and action elicited anticipatory eye-movements to the Target object in adults and older children. Our findings suggest that adults and school-aged children can quickly activate information about recently exposed novel event relationships in real-time language processing. However, there were important developmental differences in the use of this knowledge. Adults and school-aged children used the sentential agent and action to predict the sentence final theme, while preschool children’s fixations reflected a simple association to the currently spoken item. We consider several reasons for this developmental difference and possible extensions of this paradigm. PMID:24976677

  18. Time line of redox events in aging postmitotic cells

    PubMed Central

    Brandes, Nicolas; Tienson, Heather; Lindemann, Antje; Vitvitsky, Victor; Reichmann, Dana; Banerjee, Ruma; Jakob, Ursula

    2013-01-01

    The precise roles that oxidants play in lifespan and aging are still unknown. Here, we report the discovery that chronologically aging yeast cells undergo a sudden redox collapse, which affects over 80% of identified thiol-containing proteins. We present evidence that this redox collapse is not triggered by an increase in endogenous oxidants as would have been postulated by the free radical theory of aging. Instead it appears to be instigated by a substantial drop in cellular NADPH, which normally provides the electron source for maintaining cellular redox homeostasis. This decrease in NADPH levels occurs very early during lifespan and sets into motion a cascade that is predicted to down-regulate most cellular processes. Caloric restriction, a near-universal lifespan extending measure, increases NADPH levels and delays each facet of the cascade. Our studies reveal a time line of events leading up to the system-wide oxidation of the proteome days before cell death. DOI: http://dx.doi.org/10.7554/eLife.00306.001 PMID:23390587

  19. Young Children's Memory for the Times of Personal Past Events

    ERIC Educational Resources Information Center

    Pathman, Thanujeni; Larkina, Marina; Burch, Melissa M.; Bauer, Patricia J.

    2013-01-01

    Remembering the temporal information associated with personal past events is critical for autobiographical memory, yet we know relatively little about the development of this capacity. In the present research, we investigated temporal memory for naturally occurring personal events in 4-, 6-, and 8-year-old children. Parents recorded unique events…

  1. Placebo group improvement in trials of pharmacotherapies for alcohol use disorders: A multivariate meta-analysis examining change over time

    PubMed Central

    Del Re, AC; Maisel, Natalya; Blodgett, Janet; Wilbourne, Paula; Finney, John

    2014-01-01

    Objective Placebo group improvement in pharmacotherapy trials has been increasing over time across several pharmacological treatment areas. However, it is unknown to what degree increasing improvement has occurred in pharmacotherapy trials for alcohol use disorders or what factors may account for placebo group improvement. This meta-analysis of 47 alcohol pharmacotherapy trials evaluated (1) the magnitude of placebo group improvement, (2) the extent to which placebo group improvement has been increasing over time, and (3) several potential moderators that might account for variation in placebo group improvement. Method Random-effects univariate and multivariate analyses were conducted that examined the magnitude of placebo group improvement in the 47 studies and several potential moderators of improvement: (a) publication year, (b) country in which the study was conducted, (c) outcome data source/type, (d) number of placebo administrations, (e) overall severity of study participants, and (f) additional psychosocial treatment. Results Substantial placebo group improvement was found overall and improvement was larger in more recent studies. Greater improvement was found on moderately subjective outcomes, with more frequent administrations of the placebo, and in studies with greater participant severity of illness. However, even after controlling for these moderators, placebo group improvement remained significant, as did placebo group improvement over time. Conclusion Similar to previous pharmacotherapy placebo research, substantial pre- to post-test placebo group improvement has occurred in alcohol pharmacotherapy trials, an effect that has been increasing over time. However, several plausible moderator variables were not able to explain why placebo group improvement has been increasing over time. PMID:23857312

  2. Time since discharge of 9mm cartridges by headspace analysis, part 2: Ageing study and estimation of the time since discharge using multivariate regression.

    PubMed

    Gallidabino, M; Romolo, F S; Weyermann, C

    2017-03-01

    Estimating the time since discharge of spent cartridges can be a valuable tool in the forensic investigation of firearm-related crimes. To reach this aim, it was previously proposed that the decrease of volatile organic compounds released during discharge is monitored over time using non-destructive headspace extraction techniques. While promising results were obtained for large-calibre cartridges (e.g., shotgun shells), handgun calibres yielded unsatisfying results. In addition to the natural complexity of the specimen itself, these can also be attributed to some selective choices in the methods development. Thus, the present series of papers aimed to systematically evaluate the potential of headspace analysis to estimate the time since discharge of cartridges through the use of more comprehensive analytical and interpretative techniques. Following the comprehensive optimisation and validation of an exhaustive headspace sorptive extraction (HSSE) method in the first part of this work, the present paper addresses the application of chemometric tools in order to systematically evaluate the potential of applying headspace analysis to estimate the time since discharge of 9mm Geco cartridges. Several multivariate regression and pre-treatment methods were tested and compared to univariate models based on non-linear regression. Random forests (RF) and partial least squares (PLS) proceeded by pairwise log-ratios normalisation (PLR) showed the best results, and allowed to estimate time since discharge up to 48h of ageing and to differentiate recently fired from older cartridges (e.g., less than 5h compared to more than 1-2 days). The proposed multivariate approaches showed significant improvement compared to univariate models. The effects of storage conditions were also tested and results demonstrated that temperature, humidity and cartridge position should be taken into account when estimating the time since discharge.

  3. Predictive modeling in Clostridium acetobutylicum fermentations employing Raman spectroscopy and multivariate data analysis for real-time culture monitoring

    NASA Astrophysics Data System (ADS)

    Zu, Theresah N. K.; Liu, Sanchao; Germane, Katherine L.; Servinsky, Matthew D.; Gerlach, Elliot S.; Mackie, David M.; Sund, Christian J.

    2016-05-01

    The coupling of optical fibers with Raman instrumentation has proven to be effective for real-time monitoring of chemical reactions and fermentations when combined with multivariate statistical data analysis. Raman spectroscopy is relatively fast, with little interference from the water peak present in fermentation media. Medical research has explored this technique for analysis of mammalian cultures for potential diagnosis of some cancers. Other organisms studied via this route include Escherichia coli, Saccharomyces cerevisiae, and some Bacillus sp., though very little work has been performed on Clostridium acetobutylicum cultures. C. acetobutylicum is a gram-positive anaerobic bacterium, which is highly sought after due to its ability to use a broad spectrum of substrates and produce useful byproducts through the well-known Acetone-Butanol-Ethanol (ABE) fermentation. In this work, real-time Raman data was acquired from C. acetobutylicum cultures grown on glucose. Samples were collected concurrently for comparative off-line product analysis. Partial-least squares (PLS) models were built both for agitated cultures and for static cultures from both datasets. Media components and metabolites monitored include glucose, butyric acid, acetic acid, and butanol. Models were cross-validated with independent datasets. Experiments with agitation were more favorable for modeling with goodness of fit (QY) values of 0.99 and goodness of prediction (Q2Y) values of 0.98. Static experiments did not model as well as agitated experiments. Raman results showed the static experiments were chaotic, especially during and shortly after manual sampling.
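    The PLS regression step can be sketched in numpy alone (a NIPALS PLS1 fit on synthetic data; this is an illustrative stand-in for, not a reproduction of, the authors' Raman modeling pipeline):

```python
import numpy as np

def pls1(X, y, n_components=2):
    """PLS1 regression via NIPALS; returns coefficients for centred X and y."""
    Xd = X - X.mean(axis=0)
    yd = y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xd.T @ yd
        w = w / np.linalg.norm(w)     # weight vector: direction of max covariance
        t = Xd @ w                    # scores
        p = Xd.T @ t / (t @ t)        # X loadings
        q = (yd @ t) / (t @ t)        # y loading
        Xd = Xd - np.outer(t, p)      # deflate both blocks
        yd = yd - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    return W @ np.linalg.solve(P.T @ W, Q)  # coefficients in centred space

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))          # e.g. 60 spectra, 5 spectral channels
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 0.0]) + 0.05 * rng.normal(size=60)
b = pls1(X, y, n_components=3)
pred = (X - X.mean(axis=0)) @ b + y.mean()
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)  # fit quality
```

    In practice the abstract's QY/Q2Y figures come from cross-validation against held-out fermentation runs rather than the in-sample fit computed here.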

  4. Multivariate time series analysis on the dynamic relationship between Class B notifiable diseases and gross domestic product (GDP) in China.

    PubMed

    Zhang, Tao; Yin, Fei; Zhou, Ting; Zhang, Xing-Yu; Li, Xiao-Song

    2016-12-23

    The surveillance of infectious diseases is of great importance for disease control and prevention, and more attention should be paid to the Class B notifiable diseases in China. Meanwhile, according to the International Monetary Fund (IMF), the annual growth of Chinese gross domestic product (GDP) would decelerate below 7% after many years of soaring. Under such circumstances, this study aimed to answer what would happen to the incidence rates of infectious diseases in China if Chinese GDP growth remained below 7% in the next five years. Firstly, time plots and cross-correlation matrices were presented to illustrate the characteristics of data. Then, the multivariate time series (MTS) models were proposed to explore the dynamic relationship between incidence rates and GDP. Three kinds of MTS models, i.e., vector auto-regressive (VAR) model for original series, VAR model for differenced series and error-correction model (ECM), were considered in this study. The rank of error-correction term was taken as an indicator for model selection. Finally, our results suggested that four kinds of infectious diseases (epidemic hemorrhagic fever, pertussis, scarlet fever and syphilis) might need attention in China because their incidence rates have increased since the year 2010.
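    The core VAR estimation step can be sketched as a least-squares fit of a two-variable VAR(1) on simulated data (the paper's full analysis also considers differenced series and error-correction terms, omitted here; the coefficient values are made up):

```python
import numpy as np

rng = np.random.default_rng(1)
A_true = np.array([[0.6, 0.2],      # e.g. an incidence-rate series
                   [0.0, 0.5]])     # e.g. a GDP-growth series
n = 500
x = np.zeros((n, 2))
for t in range(1, n):
    x[t] = A_true @ x[t - 1] + rng.normal(scale=0.1, size=2)

# VAR(1) estimation: regress x_t on x_{t-1}, one OLS fit per equation
Y, Z = x[1:], x[:-1]
A_hat = np.linalg.lstsq(Z, Y, rcond=None)[0].T
```

    The off-diagonal entries of `A_hat` carry the cross-series dynamics that the abstract's models use to relate incidence rates to GDP.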

  6. Large Time Projection Chambers for Rare Event Detection

    SciTech Connect

    Heffner, M

    2009-11-03

    The Time Projection Chamber (TPC) concept [add ref to TPC section] has been applied to many projects outside of particle physics and the accelerator-based experiments where it was initially developed. TPCs in non-accelerator particle physics experiments are principally focused on rare event detection (e.g., neutrino and dark matter experiments), and the physics of these experiments can place dramatically different constraints on the TPC design (only extensions to the traditional TPCs are discussed here). The drift gas, or liquid, is usually the target or matter under observation, and due to very low signal rates a TPC with the largest possible active mass is desired. The large mass complicates particle tracking of short and sometimes very low energy particles. Other special design issues include efficient light collection, background rejection, internal triggering, and optimal energy resolution. Backgrounds from gamma-rays and neutrons are significant design issues in the construction of these TPCs. They are generally placed deep underground to shield from cosmogenic particles and surrounded with shielding to reduce radiation from the local surroundings. The construction materials have to be carefully screened for radiopurity, as they are in close contact with the active mass and can be a significant source of background events. The TPC excels in reducing this internal background because the mass inside the field cage forms one monolithic volume from which fiducial cuts can be made ex post facto to isolate quiet drift mass, and the medium can be circulated and purified to a very high level. Self-shielding in these large mass systems can be significant, and the effect improves with density. The liquid-phase TPC can obtain a high density at low pressure, which results in very good self-shielding and a compact installation with lightweight containment. The downsides are the need for cryogenics, slower charge drift, tracks shorter than the typical electron diffusion, and lower energy resolution (e

  7. Detecting correlation changes in multivariate time series: A comparison of four non-parametric change point detection methods.

    PubMed

    Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Grassmann, Mariel; Ceulemans, Eva

    2017-06-01

    Change point detection in multivariate time series is a complex task since next to the mean, the correlation structure of the monitored variables may also alter when change occurs. DeCon was recently developed to detect such changes in mean and/or correlation by combining a moving windows approach and robust PCA. However, in the literature, several other methods have been proposed that employ other non-parametric tools: E-divisive, Multirank, and KCP. Since these methods use different statistical approaches, two issues need to be tackled. First, applied researchers may find it hard to appraise the differences between the methods. Second, a direct comparison of the relative performance of all these methods for capturing change points signaling correlation changes is still lacking. Therefore, we present the basic principles behind DeCon, E-divisive, Multirank, and KCP and the corresponding algorithms, to make them more accessible to readers. We further compared their performance through extensive simulations using the settings of Bulteel et al. (Biological Psychology, 98 (1), 29-42, 2014) implying changes in mean and in correlation structure and those of Matteson and James (Journal of the American Statistical Association, 109 (505), 334-345, 2014) implying different numbers of (noise) variables. KCP emerged as the best method in almost all settings. However, in case of more than two noise variables, only DeCon performed adequately in detecting correlation changes.
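    A toy version of the moving-window idea behind such detectors can be sketched as follows (this is not DeCon, which additionally uses robust PCA; the series, window width, and change location are invented for illustration):

```python
import numpy as np

def correlation_change_point(x, y, window=50):
    """Locate the largest shift in a moving-window correlation of two series."""
    n = len(x)
    r = np.array([np.corrcoef(x[i:i + window], y[i:i + window])[0, 1]
                  for i in range(n - window + 1)])
    jumps = np.abs(r[window:] - r[:-window])   # compare windows one width apart
    return int(np.argmax(jumps)) + window      # boundary between the two regimes

rng = np.random.default_rng(2)
n = 400
x = rng.normal(size=n)
# correlation flips sign at t = 200 while the means stay constant
y = 0.9 * np.where(np.arange(n) < 200, x, -x) + 0.4 * rng.normal(size=n)
cp = correlation_change_point(x, y)
```

    A mean-shift detector would miss this change entirely, which is the abstract's point about monitoring the correlation structure as well as the mean.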

  8. Dynamic ultrasound imaging--a multivariate approach for the analysis and comparison of time-dependent musculoskeletal movements.

    PubMed

    Löfstedt, Tommy; Ahnlund, Olof; Peolsson, Michael; Trygg, Johan

    2012-09-27

    Muscle functions are generally assumed to affect a wide variety of conditions and activities, including pain, ischemic and neurological disorders, exercise and injury. It is therefore very desirable to obtain more information on musculoskeletal contributions to and activity during clinical processes such as the treatment of muscle injuries, post-surgery evaluations, and the monitoring of progressive degeneration in neuromuscular disorders. The spatial image resolution achievable with ultrasound systems has improved tremendously in the last few years and it is nowadays possible to study skeletal muscles in real-time during activity. However, ultrasound imaging has an inherent problem that makes it difficult to compare different measurement series or image sequences from two or more subjects. Due to physiological differences between different subjects, the ultrasound sequences will be visually different - partly because of variation in probe placement and partly because of the difficulty of perfectly reproducing any given movement. Ultrasound images of the biceps and calf of a single subject were transformed to achieve congruence and then efficiently compressed and stacked to facilitate analysis using a multivariate method known as O2PLS. O2PLS identifies related and unrelated variation in and between two sets of data such that different phases of the studied movements can be analysed. The methodology was used to study the dynamics of the Achilles tendon and the calf and also the Biceps brachii and upper arm. The movements of these parts of the body are both of interest in clinical orthopaedic research. This study extends the novel method of multivariate analysis of congruent images (MACI) to facilitate comparisons between two series of ultrasound images. This increases its potential range of medical applications and its utility for detecting, visualising and quantifying the dynamics and functions of skeletal muscle. 
The most important results of this study are that

  9. Events and children’s sense of time: a perspective on the origins of everyday time-keeping

    PubMed Central

    Forman, Helen

    2015-01-01

    In this article I discuss abstract or pure time versus the content of time (i.e., events, activities, and other goings-on). Or, more specifically, the utility of these two sorts of time in time-keeping or temporal organization. It is often assumed that abstract, uniform, and objective time is a universal physical entity out there, which humans may perceive. However, this sort of evenly flowing time was only recently introduced to the human community, together with the mechanical clock. Before the introduction of mechanical clock-time, there were only events available to denote the extent of time. Events defined time, unlike the way time may define events in our present day culture. It is therefore conceivable that our primeval or natural mode of time-keeping involves the perception, estimation, and coordination of events. I find it likely that events continue to subserve our sense of time and time-keeping efforts, especially for children who have not yet mastered the use of clock-time. Instead of seeing events as a distraction to our perception of time, I suggest that our experience and understanding of time emerges from our perception of events. PMID:25814969

  10. Off-Time Events and Life Quality of Older Adults.

    ERIC Educational Resources Information Center

    Goodhart, Darlene; Zautra, Alex

    Many previous studies have found that daily life events influence community residents' perceived quality of life, which refers to the relative goodness of life as evaluated subjectively. A subsample population of 539 older residents, aged 55 and over, was interviewed in their homes. A 60-item scale was devised to measure the effects of…

  11. New Tools for Comparing Beliefs about the Timing of Recurrent Events with Climate Time Series Datasets

    NASA Astrophysics Data System (ADS)

    Stiller-Reeve, Mathew; Stephenson, David; Spengler, Thomas

    2017-04-01

    For climate services to be relevant and informative for users, scientific data definitions need to match users' perceptions or beliefs. This study proposes and tests novel yet simple methods to compare beliefs of timing of recurrent climatic events with empirical evidence from multiple historical time series. The methods are tested by applying them to the onset date of the monsoon in Bangladesh, where several scientific monsoon definitions can be applied, yielding different results for monsoon onset dates. It is a challenge to know which monsoon definition compares best with people's beliefs. Time series from eight different scientific monsoon definitions in six regions are compared with respondent beliefs from a previously completed survey concerning the monsoon onset. Beliefs about the timing of the monsoon onset are represented probabilistically for each respondent by constructing a probability mass function (PMF) from elicited responses about the earliest, normal, and latest dates for the event. A three-parameter circular modified triangular distribution (CMTD) is used to allow for the possibility (albeit small) of the onset at any time of the year. These distributions are then compared to the historical time series using two approaches: likelihood scores, and the mean and standard deviation of time series of dates simulated from each belief distribution. The methods proposed give the basis for further iterative discussion with decision-makers in the development of eventual climate services. This study uses Jessore, Bangladesh, as an example and finds that a rainfall definition, applying a 10 mm day-1 threshold to NCEP-NCAR reanalysis (Reanalysis-1) data, best matches the survey respondents' beliefs about monsoon onset.
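    The belief-scoring idea can be sketched with a plain (non-circular) triangular PMF and a mean log-likelihood score; the day-of-year bounds and the two candidate onset-date sets below are invented for illustration, and the paper's CMTD additionally wraps around the calendar year:

```python
import numpy as np

def triangular_pmf(days, earliest, normal, latest):
    """Discrete triangular belief over day-of-year, peaking at `normal`."""
    d = days.astype(float)
    up = (d - earliest) / (normal - earliest)
    down = (latest - d) / (latest - normal)
    p = np.clip(np.minimum(up, down), 0.0, None)   # zero outside the bounds
    return p / p.sum()

def log_score(onsets, pmf, days, eps=1e-12):
    """Mean log-likelihood of observed onset dates under the belief PMF."""
    idx = np.searchsorted(days, onsets)
    return float(np.mean(np.log(pmf[idx] + eps)))

days = np.arange(1, 366)
belief = triangular_pmf(days, earliest=150, normal=165, latest=185)
onsets_a = np.array([160, 166, 170, 158, 172])   # definition matching the belief
onsets_b = np.array([120, 125, 130, 118, 127])   # systematically earlier definition
score_a = log_score(onsets_a, belief, days)
score_b = log_score(onsets_b, belief, days)
```

    A higher mean log score identifies the scientific definition whose historical onset dates best match the elicited belief, which is how the rainfall-threshold definition is singled out in the study.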

  12. Quantifying causal coupling strength: A lag-specific measure for multivariate time series related to transfer entropy

    NASA Astrophysics Data System (ADS)

    Runge, Jakob; Heitzig, Jobst; Marwan, Norbert; Kurths, Jürgen

    2012-12-01

    While it is an important problem to identify the existence of causal associations between two components of a multivariate time series, a topic addressed in Runge, Heitzig, Petoukhov, and Kurths [Phys. Rev. Lett. 108, 258701 (2012)], it is even more important to assess the strength of their association in a meaningful way. In the present article we focus on the problem of defining a meaningful coupling strength using information-theoretic measures and demonstrate the shortcomings of the well-known mutual information and transfer entropy. Instead, we propose a certain time-delayed conditional mutual information, the momentary information transfer (MIT), as a lag-specific measure of association that is general, causal, reflects a well interpretable notion of coupling strength, and is practically computable. Rooted in information theory, MIT is general in that it does not assume a certain model class underlying the process that generates the time series. As discussed in a previous paper [Runge, Heitzig, Petoukhov, and Kurths, Phys. Rev. Lett. 108, 258701 (2012)], the general framework of graphical models makes MIT causal in that it gives a nonzero value only to lagged components that are not independent conditional on the remaining process. Further, graphical models admit a low-dimensional formulation of conditions, which is important for a reliable estimation of conditional mutual information and, thus, makes MIT practically computable. MIT is based on the fundamental concept of source entropy, which we utilize to yield a notion of coupling strength that is, compared to mutual information and transfer entropy, well interpretable in that, for many cases, it solely depends on the interaction of the two components at a certain lag. In particular, MIT is thus in many cases able to exclude the misleading influence of autodependency within a process in an information-theoretic way.
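    The lag-specific idea can be illustrated with a plug-in conditional mutual information estimator for small discrete alphabets. Note this is a simplification: MIT conditions on the parents identified in a graphical model, whereas here we condition only on the target's own immediate past; the coupled binary process is synthetic:

```python
import numpy as np

def cmi_discrete(x, y, z):
    """Plug-in estimate of I(X;Y|Z) in nats, for small discrete alphabets."""
    cmi = 0.0
    for zv in np.unique(z):
        m = z == zv
        pz = m.mean()
        xs, ys = x[m], y[m]
        for xv in np.unique(xs):
            for yv in np.unique(ys):
                pxy = np.mean((xs == xv) & (ys == yv))   # p(x, y | z)
                if pxy == 0.0:
                    continue
                px, py = np.mean(xs == xv), np.mean(ys == yv)
                cmi += pz * pxy * np.log(pxy / (px * py))
    return cmi

rng = np.random.default_rng(3)
n = 20000
x = rng.integers(0, 2, size=n)
y = np.zeros(n, dtype=int)
# y_t copies x_{t-1} with probability 0.9: a pure lag-1 coupling
y[1:] = np.where(rng.random(n - 1) < 0.9, x[:-1], 1 - x[:-1])

cmi_lag1 = cmi_discrete(x[1:-1], y[2:], y[1:-1])   # I(X_{t-1}; Y_t | Y_{t-1})
cmi_null = cmi_discrete(rng.permutation(x[1:-1]), y[2:], y[1:-1])  # reference
```

    The conditioning on `y[1:-1]` is what strips out the target's autodependency, which is the property the abstract credits to MIT over plain transfer entropy.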

  13. Can Granger causality delineate natural versus anthropogenic drivers of climate change from global-average multivariate time series?

    NASA Astrophysics Data System (ADS)

    Kodra, E. A.; Chatterjee, S.; Ganguly, A. R.

    2009-12-01

    The Fourth Assessment Report (AR4) of the Intergovernmental Panel on Climate Change (IPCC) notes with a high degree of certainty that global warming can be attributed to anthropogenic emissions. Detection and attribution studies, which attempt to delineate human influences on regional- and decadal-scale climate change or its impacts, use a variety of techniques, including Granger causality. Recently, Granger causality was used as a tool for detection and attribution in climate based on a spatio-temporal data mining approach. However, the degree to which Granger causality may be able to delineate natural versus anthropogenic drivers of change in these situations needs to be thoroughly investigated. As a first step, we use multivariate global-average time series of observations to test the performance of Granger causality. We apply the popular Granger F-tests to Radiative Forcing (RF), which is a transformation of carbon dioxide (CO2), and Global land surface Temperature anomalies (GT). Our preliminary results with observations appear to suggest that RF Granger-causes GT, which seems to become more apparent with more data. However, carefully designed simulations indicate that these results are not reliable and may, in fact, be misleading. On the other hand, the same observation- and simulation-driven methodologies, when applied to the El Niño Southern Oscillation (ENSO) index, clearly show reliable Granger-causality from ENSO to GT. We develop and test several hypotheses to explain why the Granger causality tests between RF and GT are not reliable. We conclude that the form of Granger causality used in this study, and in past studies reported in the literature, is sensitive to data availability, random variability, and especially whether the variables arise from a deterministic or stochastic process. Simulations indicate that Granger causality in this form performs poorly, even in simple linear effect cases, when applied to one deterministic and one stochastic time
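    The bivariate Granger F-test used in such analyses compares a restricted autoregression (own lags only) against an unrestricted one that adds lags of the candidate driver. A minimal numpy sketch on a simulated stochastic pair (the series and coefficients are invented, not the RF/GT data):

```python
import numpy as np

def granger_f(y, x, p=2):
    """F-statistic for 'x Granger-causes y' with p lags, via nested OLS fits."""
    n = len(y)
    Y = y[p:]
    lags_y = [y[p - k:n - k] for k in range(1, p + 1)]          # own lags
    lags_x = [x[p - k:n - k] for k in range(1, p + 1)]          # driver lags
    Z_r = np.column_stack([np.ones(len(Y))] + lags_y)           # restricted
    Z_f = np.column_stack([np.ones(len(Y))] + lags_y + lags_x)  # unrestricted
    rss = lambda Z: np.sum((Y - Z @ np.linalg.lstsq(Z, Y, rcond=None)[0]) ** 2)
    rss_r, rss_f = rss(Z_r), rss(Z_f)
    df2 = len(Y) - Z_f.shape[1]
    return ((rss_r - rss_f) / p) / (rss_f / df2)

rng = np.random.default_rng(4)
n = 400
x = rng.normal(size=n)                      # exogenous driver
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.normal()

f_causal = granger_f(y, x)    # true direction: should be large
f_reverse = granger_f(x, y)   # no feedback: should be near the null F mean
```

    The abstract's warning applies precisely here: this machinery behaves well for stochastic pairs like the one simulated, but becomes unreliable when one series is effectively deterministic.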

  14. Time-varying maximal proteinuria correlates with adverse cardiovascular events and graft failure in kidney transplant recipients.

    PubMed

    Jeon, Hee Jung; Kim, Clara Tammy; An, Jung Nam; Lee, Hajeong; Kim, Hyosang; Park, Su-Kil; Joo, Kwon Wook; Lim, Chun Soo; Jung, In Mok; Ahn, Curie; Kim, Yon Su; Kim, Young Hoon; Lee, Jung Pyo

    2015-12-01

    In the general population, proteinuria is associated with progression to kidney failure, cardiovascular disease, and mortality. Here, we analyzed the effects of proteinuria on outcomes in kidney transplant recipients. We performed a retrospective, multi-centre cohort study involving 2047 recipients to evaluate the effects of post-transplant proteinuria on adverse cardiovascular events, graft failure, and mortality. Patients were classified into two groups according to their levels of proteinuria: patients without proteinuria (<150 mg/day, n = 1113) and proteinuric patients (≥ 150 mg/day, n = 934). A multivariate Cox hazards model was fitted using maximal proteinuria as a time-varying covariate. During a median 55.3-month (range, 0.6-167.1) follow-up, there were 50 cases of major adverse cardiac events (cardiac death, nonfatal myocardial infarction, or coronary revascularization), 115 cases of graft failure, and 52 patient deaths. In multivariate Cox regression with time-varying covariate, proteinuric recipients were significantly associated with major adverse cardiac events (hazard ratio [HR] 8.689, 95% confidence interval [CI] 2.929-25.774, P < 0.001) compared to those without proteinuria. Recipients with proteinuria showed significantly higher incidences of acute rejection (23.1% vs. 9.4%, P < 0.001) and graft failure rate (HR 6.910, 95% CI 3.270-14.601, P < 0.001). In addition, mortality rate was also significantly higher in patients with proteinuria (HR 6.815, 95% CI 2.164-21.467, P = 0.001). Post-transplant proteinuria correlates with adverse cardiovascular events, graft failure, and mortality. Therefore, proteinuria should be evaluated and managed to improve the outcomes of renal recipients. © 2015 Asian Pacific Society of Nephrology.

  15. Number and appraisal of daily hassles and life events in young adulthood: the association with physical activity and screen time: a longitudinal cohort study.

    PubMed

    Uijtdewilligen, Léonie; Singh, Amika S; Chinapaw, Mai J; Koppes, Lando L J; van Mechelen, Willem; Twisk, Jos W R

    2014-10-13

    Young adults face radical life changes regarding residence, marriage, family and work that may negatively impact their health behaviours. Therefore, we investigated the associations of the number of daily hassles and life events and their subjective appraisal with physical activity and screen time in young adulthood. Data came from participants of the Amsterdam Growth and Health Longitudinal Study (AGAHLS). Self-reported physical activity (min/wk) was used from wave 6 (1991; mean age 27), wave 7 (1993; mean age 29), wave 8 (1996/1997; mean age 32) and 9 (2000; mean age 36). Self-reported screen time (h/wk) was assessed in waves 8 and 9. The number and the appraisal of daily hassles and major life events were assessed with the Everyday Problem Checklist and Life Events List, respectively (including five life event domains, i.e.: health, work, home/family, personal/social relations, and finances). The final sample included 474 participants for the physical activity analyses and 475 participants for the screen time analyses. To test the longitudinal associations of daily hassles and life events with physical activity and screen time, univariable and multivariable Generalised Estimating Equations were performed. Effect modification by gender was tested. Physical activity levels were higher in those who had experienced more daily hassles. People who reported higher subjective appraisal in the work and finances life event domains also had higher levels of physical activity, although only the subjective appraisal in the finances domain remained significant in the multivariable model. No significant associations between number and subjective appraisal of daily hassles and life events and screen time were observed. The occurrence of specific life events may be more influential for people's physical activity behaviour than their respective sum or emotional tone. Still, the assessment of daily hassles may be a relevant addition in this research field. 
Finally, we suggest that

  16. Near-real-time attribution of extreme weather events

    NASA Astrophysics Data System (ADS)

    Allen, M. R.; Pall, P.; Stone, D.; Stott, P.; Lohmann, D.

    2007-12-01

    As the impacts of global climate change become increasingly evident, there is growing demand for a quantitative and objective answer to the question of what is "to blame" for observed extreme weather phenomena. In addition to considerable public interest, understanding how external drivers, particularly secular trends such as anthropogenic greenhouse gas forcing, contribute is important for the correct quantification of current weather-related risks for the insurance industry. We propose a method of quantifying the contribution of external drivers to weather-related risks based on a twinned ensemble design. Under this approach, a large ensemble of simulations with a forecast-resolution atmospheric model is driven with observed sea surface temperatures and atmospheric composition over the period of interest. A second ensemble is then generated with the influence of a particular external agent, such as anthropogenic greenhouse gases, removed through modification of composition and surface temperatures. Conventional detection and attribution techniques are used to allow for uncertainty in the magnitude and pattern of the signal removed. The frequency of occurrence of the weather event in question can then be compared between the two ensembles. For the exploration of changing risks of the most extreme events, very large ensembles (thousands of members, unprecedented for a model of this resolution) are needed, requiring a novel distributed computing approach, relying on computing resources donated by the general public: see http://attribution.cpdn.org. We focus as an example on the events of Autumn 2000 which brought widespread flooding to many regions of the UK. Precipitation from the twin ensembles is used to force an empirical run-off model to provide an estimate of its contribution to flood risk. Results are summarized in the form of an estimated fraction attributable risk for the anthropogenic contribution to the flooding events of that year.
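    The fraction-attributable-risk computation from the twinned ensembles reduces to comparing exceedance probabilities. A sketch on synthetic stand-in values (the distributions, threshold, and ensemble sizes below are placeholders, not output from the actual model runs):

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic stand-ins for seasonal precipitation totals (mm) from twin ensembles
all_forcings = rng.normal(loc=520.0, scale=60.0, size=2000)   # observed composition
natural_only = rng.normal(loc=480.0, scale=60.0, size=2000)   # GHG influence removed
threshold = 600.0                                             # flood-inducing total

p1 = float(np.mean(all_forcings > threshold))  # event probability, all forcings
p0 = float(np.mean(natural_only > threshold))  # probability in counterfactual runs
far = 1.0 - p0 / p1                            # fraction of attributable risk
```

    A FAR of, say, 0.75 would read as "three quarters of the current risk of exceeding the threshold is attributable to the removed forcing"; very large ensembles are needed because both tail probabilities must be estimated from counts.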

  17. A multivariate auto-regressive combined-harmonics analysis and its application to ozone time series data

    NASA Astrophysics Data System (ADS)

    Yang, Eun-Su

    2001-07-01

    A new statistical approach is used to analyze Dobson Umkehr layer-ozone measurements at Arosa for 1979-1996 and Total Ozone Mapping Spectrometer (TOMS) Version 7 zonal mean ozone for 1979-1993, accounting for stratospheric aerosol optical depth (SAOD), quasi-biennial oscillation (QBO), and solar flux effects. A stepwise regression scheme selects statistically significant periodicities caused by season, SAOD, QBO, and solar variations and filters them out. Auto-regressive (AR) terms are included in ozone residuals and time lags are assumed for the residuals of exogenous variables. Then, the magnitudes of responses of ozone to the SAOD, QBO, and solar index (SI) series are derived from the stationary time series of the residuals. These Multivariate Auto-Regressive Combined Harmonics (MARCH) processes possess the following significant advantages: (1) the ozone trends are estimated more precisely than by the previous methods; (2) the influences of the exogenous SAOD, QBO, and solar variations are clearly separated at various time lags; (3) the collinearity of the exogenous variables in the regression is significantly reduced; and (4) the probability of obtaining misleading correlations between ozone and exogenous time series is reduced. The MARCH results indicate that the Umkehr ozone response to SAOD (not a real ozone response but rather an optical interference effect), QBO, and solar effects is driven by combined dynamical radiative-chemical processes. These results are independently confirmed using the revised Standard models that include aerosol and solar forcing mechanisms with all possible time lags but not by the Standard model when restricted to a zero time lag in aerosol and solar ozone forcings. As for Dobson Umkehr ozone measurements at Arosa, the aerosol effects are most significant in layers 8, 7, and 6 with no time lag, as is to be expected due to the optical contamination of Umkehr measurements by SAOD. The QBO and solar UV effects appear in all layers 4
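    The core of such a scheme, a harmonic-plus-trend regression with an AR coefficient fitted to the residuals, can be sketched on synthetic monthly data (a deliberate simplification of the stepwise MARCH procedure, with the SAOD/QBO/solar regressors omitted and all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 216                                   # 18 years of monthly values
t = np.arange(n)
season = 10.0 * np.sin(2 * np.pi * t / 12)
trend = -0.05 * t                         # slow decline, e.g. in Dobson units
e = np.zeros(n)
for i in range(1, n):                     # AR(1) residual noise
    e[i] = 0.6 * e[i - 1] + rng.normal()
ozone = 300.0 + trend + season + e

# Harmonic regression: constant, linear trend, annual sine/cosine pair
H = np.column_stack([np.ones(n), t,
                     np.sin(2 * np.pi * t / 12), np.cos(2 * np.pi * t / 12)])
beta = np.linalg.lstsq(H, ozone, rcond=None)[0]
resid = ozone - H @ beta
phi = np.sum(resid[1:] * resid[:-1]) / np.sum(resid[:-1] ** 2)  # AR(1) estimate
```

    Filtering the harmonics first and modeling the AR structure of what remains is what lets the trend and the lagged exogenous responses be estimated from a (near-)stationary residual series.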

  18. Handling time misalignment and rank deficiency in liquid chromatography by multivariate curve resolution: Quantitation of five biogenic amines in fish.

    PubMed

    Pinto, Licarion; Díaz Nieto, César Horacio; Zón, María Alicia; Fernández, Héctor; de Araujo, Mario Cesar Ugulino

    2016-01-01

    Biogenic amines (BAs) are used for identifying spoilage in food. The most common are tryptamine (TRY), 2-phenylethylamine (PHE), putrescine (PUT), cadaverine (CAD) and histamine (HIS). Due to lack of chromophores, chemical derivatization with dansyl was employed to analyze these BAs using high performance liquid chromatography with a diode array detector (HPLC-DAD). However, the derivatization reaction occurs with any primary or secondary amine, leading to co-elution of analytes and interferents with identical spectral profiles, and thus causing rank deficiency. When the spectral profile is the same and peak misalignment is present on the chromatographic runs, it is not possible to handle the data only with Multivariate Curve Resolution and Alternating Least Squares (MCR-ALS), by augmenting the time or the spectral mode. A way to circumvent this drawback is to receive information from another detector that leads to a selective profile for the analyte. To overcome both problems (trilinearity breaking in the time mode and rank deficiency in the spectral mode), this paper proposes a new analytical methodology for fast quantitation of these BAs in fish with HPLC-DAD by using the icoshift algorithm for temporal misalignment correction before MCR-ALS spectral mode augmented treatment. Limits of detection, relative errors of prediction (REP) and average recoveries ranged from 0.14 to 0.50 µg mL(-1), 3.5-8.8% and 88.08-99.68%, respectively. These are outstanding results, reaching quantification limits for the five BAs much lower than those established by the Food and Agriculture Organization of the United Nations and World Health Organization (FAO/WHO), and the European Food Safety Authority (EFSA), all without any pre-concentration steps. The concentrations of BAs in fish samples ranged from 7.82 to 29.41 µg g(-1), 8.68-25.95 µg g(-1), 4.76-28.54 µg g(-1), 5.18-39.95 µg g(-1) and 1.45-52.62 µg g(-1) for TRY, PHE, PUT, CAD, and HIS, respectively. In addition, the proposed method spends
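    The time-misalignment correction step can be sketched with a simple integer-shift search (a toy stand-in for the icoshift algorithm, applied to a synthetic two-peak chromatogram; peak positions and widths are invented):

```python
import numpy as np

def align_shift(reference, signal, max_shift=20):
    """Estimate and undo an integer retention-time shift by maximizing correlation."""
    shifts = list(range(-max_shift, max_shift + 1))
    scores = [float(np.dot(reference, np.roll(signal, s))) for s in shifts]
    best = shifts[int(np.argmax(scores))]
    return np.roll(signal, best), best

t = np.linspace(0, 10, 500)
peak = lambda c: np.exp(-0.5 * ((t - c) / 0.15) ** 2)   # Gaussian elution peak
reference = peak(3.0) + 0.6 * peak(6.5)                 # reference chromatogram
shifted = np.roll(reference, 12)                        # run-to-run misalignment
aligned, found = align_shift(reference, shifted)
```

    Aligning runs before stacking them for MCR-ALS is what restores the trilinearity in the time mode that the abstract says plain mode augmentation cannot recover; icoshift does this interval by interval rather than with one global shift.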

  19. Validation of cross-sectional time series and multivariate adaptive regression splines models for the prediction of energy expenditure in children and adolescents using doubly labeled water

    USDA-ARS?s Scientific Manuscript database

    Accurate, nonintrusive, and inexpensive techniques are needed to measure energy expenditure (EE) in free-living populations. Our primary aim in this study was to validate cross-sectional time series (CSTS) and multivariate adaptive regression splines (MARS) models based on observable participant cha...

  20. Geological Time, Biological Events and the Learning Transfer Problem

    ERIC Educational Resources Information Center

    Johnson, Claudia C.; Middendorf, Joan; Rehrey, George; Dalkilic, Mehmet M.; Cassidy, Keely

    2014-01-01

    Comprehension of geologic time does not come easily, especially for students who are studying the earth sciences for the first time. This project investigated the potential success of two teaching interventions that were designed to help non-science majors enrolled in an introductory geology class gain a richer conceptual understanding of the…

  1. Space-time variation of respiratory cancers in South Carolina: a flexible multivariate mixture modeling approach to risk estimation.

    PubMed

    Carroll, Rachel; Lawson, Andrew B; Kirby, Russell S; Faes, Christel; Aregay, Mehreteab; Watjou, Kevin

    2017-01-01

    Many types of cancer have an underlying spatiotemporal distribution. Spatiotemporal mixture modeling can offer a flexible approach to risk estimation via the inclusion of latent variables. In this article, we examine the application and benefits of using four different spatiotemporal mixture modeling methods in the modeling of cancer of the lung and bronchus as well as "other" respiratory cancer incidences in the state of South Carolina. Of the methods tested, no single method outperforms the other methods; which method is best depends on the cancer under consideration. The lung and bronchus cancer incidence outcome is best described by the univariate modeling formulation, whereas the "other" respiratory cancer incidence outcome is best described by the multivariate modeling formulation. Spatiotemporal multivariate mixture methods can aid in the modeling of cancers with small and sparse incidences when including information from a related, more common type of cancer. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Reporting of Life Events Over Time: Methodological Issues in a Longitudinal Sample of Women

    ERIC Educational Resources Information Center

    Pachana, Nancy A.; Brilleman, Sam L.; Dobson, Annette J.

    2011-01-01

    The number of life events reported by study participants is sensitive to the method of data collection and time intervals under consideration. Individual characteristics also influence reporting; respondents with poor mental health report more life events. Much current research on life events is cross-sectional. Data from a longitudinal study of…

  4. A random time interval approach for analysing the impact of a possible intermediate event on a terminal event.

    PubMed

    Beyersmann, Jan

    2007-08-01

    We consider the impact of a possible intermediate event on a terminal event in an illness-death model with states 'initial', 'intermediate' and 'terminal'. One aim is to unambiguously describe the occurrence of the intermediate event in terms of the observable data, the problem being that the intermediate event may not occur. We propose to consider a random time interval, whose length is the time spent in the intermediate state. We derive an estimator of the joint distribution of the left and right limit of the random time interval from the Aalen-Johansen estimator of the matrix of transition probabilities and study its asymptotic properties. We apply our approach to hospital infection data. Estimating the distribution of the random time interval will usually be only a first step of an analysis. We illustrate this by analysing change in length of hospital stay following an infection and derive the large sample properties of the respective estimator. ((c) 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim).

  5. Travel time classification of extreme solar events: Two families and an outlier

    NASA Astrophysics Data System (ADS)

    Freed, A. J.; Russell, C. T.

    2014-10-01

    Extreme solar events are of great interest because of the extensive damage that could be experienced by technological systems such as electrical transformers during such periods. In studying geophysical phenomena, it is helpful to have a quantitative measure of event strength so that similar events can be intercompared. Such a measure also allows the calculation of the occurrence rates of events with varying strength. We use historical fast travel time solar events to develop a measure of strength based on the Sun-Earth trip time. We find that these fast events can be grouped into two distinct families with one even faster outlier. That outlier is not the Carrington event of 1859 but the extremely intense solar particle event of August 1972.

  6. Reliability of travel time data computed from interpreted migrated events

    NASA Astrophysics Data System (ADS)

    Jannaud, L. R.

    1995-02-01

    In the Sequential Migration Aided Reflection Tomography (SMART) method, travel times used by reflection tomography are computed by tracing rays which propagate with the migration velocity and reflect from reflectors picked on migrated images. Because of limits of migration resolution, this picking involves inaccuracies, to which computed travel times are unfortunately very sensitive. The objective of this paper is to predict a priori the confidence we can have in emergence data, i.e., emergence point location and travel time, from the statistical information that describes the uncertainties of the reflectors. (These reflectors can be obtained by picking on migrated images as explained above or by any other method). The proposed method relies on a linearization of each step of the ray computation, allowing one to deduce, from the statistical properties of reflector fluctuations, the statistical properties of ray-tracing outputs. The computed confidences and correlations give access to a more realistic analysis of emergence data. Moreover, they can be used as inputs for reflection tomography to compute models that match travel times according to the confidence we have in the reflector. Applications on real data show that the uncertainties are generally large and, what is much more interesting, strongly varying from one ray to another. Taking them into account is therefore very important for both a better understanding of the kinematic information in the data and the computation of a model that matches these travel times.

  7. Parent–offspring similarity in the timing of developmental events: an origin of heterochrony?

    PubMed Central

    Tills, Oliver; Rundle, Simon D.; Spicer, John I.

    2013-01-01

    Understanding the link between ontogeny (development) and phylogeny (evolution) remains a key aim of biology. Heterochrony, the altered timing of developmental events between ancestors and descendants, could be such a link, although the processes responsible for producing heterochrony, widely viewed as an interspecific phenomenon, are still unclear. However, intraspecific variation in developmental event timing, if heritable, could provide the raw material from which heterochronies originate. To date, however, heritable developmental event timing has not been demonstrated, although recent work did suggest a genetic basis for intraspecific differences in event timing in the embryonic development of the pond snail, Radix balthica. Consequently, here we used high-resolution (temporal and spatial) imaging of the entire embryonic development of R. balthica to perform a parent–offspring comparison of the timing of twelve physiological and morphological developmental events. Between-parent differences in the timing of all events were good predictors of such timing differences between their offspring, and heritability was demonstrated for two of these events (foot attachment and crawling). Such heritable intraspecific variation in developmental event timing could be the raw material for speciation events, providing a fundamental link between ontogeny and phylogeny, via heterochrony. PMID:23966639

  8. Causal Context Presented in Subsequent Event Modifies the Perceived Timing of Cause and Effect

    PubMed Central

    Umemura, Hiroyuki

    2017-01-01

    The effect of perceived causality on other aspects of perception, such as temporal or spatial perception, has interested many researchers. Previous studies have shown that the perceived timing of two events is modulated when the events are intentionally produced or the causal link between the two events is known in advance. However, little research has directly supported the idea that causality alone can modulate the perceived timing of two events without advance knowledge of the causal links. In this study, I used novel causal displays in which various types of causal contexts could be presented in subsequent events (movement or color change of objects). In these displays, the preceding event was always the same (a ball falling from above), so observers could not predict which subsequent event would be displayed. The results showed that the perceived causal context modulated the temporal relationship of the two serial events so as to be consistent with the causal order implied by the subsequent event: the ball hit the floor, then the objects moved. These modulations were smaller when the movements implied a preceding effect of the falling ball (e.g., wind pressure). These results are well suited to the Bayesian framework, in which the perceived timing of events is reconstructed through the observers' prior experiences, and suggest that multiple prior experiences contribute competitively to the estimation of the timing of events. PMID:28326051

  10. Pipeline Implementation of Real Time Event Cross Correlation for Nuclear Treaty Monitoring

    NASA Astrophysics Data System (ADS)

    Junek, W. N.; Wehlen, J. A., III

    2014-12-01

    The United States National Data Center (US NDC) is responsible for monitoring international compliance with nuclear test ban treaties. This mission is performed through real time acquisition, processing, and evaluation of data acquired by a global network of seismic, hydroacoustic, and infrasonic sensors. Automatic and human-reviewed event solutions are stored in a data warehouse which contains over 15 years of alphanumeric information and waveform data. A significant effort is underway to employ the data warehouse in real time processing to improve the quality of automatic event solutions, reduce analyst burden, and supply decision makers with information regarding relevant historic events. To this end, the US NDC processing pipeline has been modified to automatically recognize events built in the past. Event similarity information and the most relevant historic solution are passed to the human analyst to assist their evaluation of automatically formed events. This is achieved through real time cross correlation of selected seismograms from automatically formed events against those stored in the data warehouse. Historic events used in correlation analysis are selected based on a set of user-defined parameters, which are tuned to maintain pipeline timeliness requirements. The software architecture and database infrastructure were modified using a multithreaded design for increased processing speed, database connection pools for parallel queries, and Oracle spatial indexing to enhance query efficiency. This functionality allows the human analyst to spend more time studying anomalous events and less time rebuilding routine events.

  11. Multivariate statistical process control (MSPC) using Raman spectroscopy for in-line culture cell monitoring considering time-varying batches synchronized with correlation optimized warping (COW).

    PubMed

    Liu, Ya-Juan; André, Silvère; Saint Cristau, Lydia; Lagresle, Sylvain; Hannas, Zahia; Calvosa, Éric; Devos, Olivier; Duponchel, Ludovic

    2017-02-01

    Multivariate statistical process control (MSPC) is increasingly popular given the challenge posed by the large multivariate datasets that analytical instruments such as Raman spectroscopy produce when monitoring complex cell cultures in the biopharmaceutical industry. However, Raman spectroscopy for in-line monitoring often produces unsynchronized data sets, resulting in time-varying batches. Moreover, unsynchronized data sets are common in cell culture monitoring because spectroscopic measurements are generally recorded in an alternating way, with more than one optical probe connected in parallel to the same spectrometer. Synchronized batches are a prerequisite for the application of multivariate analyses such as multi-way principal component analysis (MPCA) in MSPC monitoring. Correlation optimized warping (COW) is a popular method for data alignment with satisfactory performance; however, it had never before been applied to synchronize the acquisition times of spectroscopic datasets in an MSPC application. In this paper we propose, for the first time, to use COW to synchronize batches of varying duration analyzed with Raman spectroscopy. In a second step, we developed MPCA models at different time intervals based on the normal operation condition (NOC) batches synchronized by COW. New batches are finally projected onto the corresponding MPCA model. We monitored the evolution of the batches using two multivariate control charts based on Hotelling's T(2) and Q statistics. As the results illustrate, the MSPC model was able to identify abnormal operation conditions, including contaminated batches, which is of prime importance in cell culture monitoring. We showed that Raman-based MSPC monitoring can be used to diagnose batches deviating from the normal condition with higher efficacy than traditional diagnosis, which would save time and money in the biopharmaceutical industry. Copyright © 2016 Elsevier B.V. All rights reserved.
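    The Hotelling's T(2) and Q monitoring described in this abstract can be sketched with a minimal PCA model. The synthetic data, component count, and helper names below are illustrative stand-ins, not the paper's actual Raman pipeline:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Normal-operation "spectra": 50 NOC batches x 100 channels (synthetic stand-in).
    noc = rng.normal(size=(50, 100))
    mu = noc.mean(axis=0)
    Xc = noc - mu

    # PCA via SVD; keep a few components (k is arbitrary here).
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    k = 3
    scores = U[:, :k] * s[:k]
    var = scores.var(axis=0, ddof=1)

    def t2_and_q(x):
        """Hotelling's T2 (in the model plane) and Q (squared residual) for one observation."""
        t = (x - mu) @ Vt[:k].T
        t2 = np.sum(t**2 / var)
        resid = (x - mu) - t @ Vt[:k]
        return t2, resid @ resid

    t2_ok, q_ok = t2_and_q(noc[0])            # a batch from the NOC set
    t2_bad, q_bad = t2_and_q(noc[0] + 5.0)    # grossly shifted batch
    ```

    A batch deviating from the NOC model inflates Q (and usually T2), which is what the control charts flag; in practice the limits are set from the NOC distribution.
    
    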

  12. fixedTimeEvents: An R package for the distribution of distances between discrete events in fixed time

    NASA Astrophysics Data System (ADS)

    Liland, Kristian Hovde; Snipen, Lars

    When a series of Bernoulli trials occur within a fixed time frame or limited space, it is often interesting to assess if the successful outcomes have occurred completely at random, or if they tend to group together. One example, in genetics, is detecting grouping of genes within a genome. Approximations of the distribution of successes are possible, but they become inaccurate for small sample sizes. In this article, we describe the exact distribution of time between random, non-overlapping successes in discrete time of fixed length. A complete description of the probability mass function, the cumulative distribution function, mean, variance and recurrence relation is included. We propose an associated test for the over-representation of short distances and illustrate the methodology through relevant examples. The theory is implemented in an R package including probability mass, cumulative distribution, quantile function, random number generator, simulation functions, and functions for testing.
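    The kind of test the package implements can be sketched with a Monte Carlo approximation (the package itself uses the exact distribution; all names below are hypothetical):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def success_distances(seq):
        """Distances between consecutive successes in a 0/1 sequence."""
        pos = np.flatnonzero(seq)
        return np.diff(pos)

    def short_distance_pvalue(seq, d_max, n_sim=10_000):
        """Monte Carlo p-value for over-representation of distances <= d_max,
        conditioning on the observed number of successes."""
        n, k = len(seq), int(seq.sum())
        observed = np.sum(success_distances(seq) <= d_max)
        count = 0
        for _ in range(n_sim):
            sim = np.zeros(n, dtype=int)
            sim[rng.choice(n, size=k, replace=False)] = 1
            if np.sum(success_distances(sim) <= d_max) >= observed:
                count += 1
        return count / n_sim

    # A sequence with clustered successes should give a small p-value.
    clustered = np.zeros(200, dtype=int)
    clustered[[10, 11, 12, 13, 100]] = 1
    p = short_distance_pvalue(clustered, d_max=2)
    ```

    For small samples the exact distribution in the article replaces the simulation; the conditioning on the number of successes is the same idea.
    
    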

  13. Qualitative and event-specific real-time PCR detection methods for Bt brinjal event EE-1.

    PubMed

    Randhawa, Gurinder Jit; Sharma, Ruchi; Singh, Monika

    2012-01-01

    Bt brinjal event EE-1, with the cry1Ac gene expressing an insecticidal protein against fruit and shoot borer, is the first genetically modified food crop in the pipeline for commercialization in India. Qualitative polymerase chain reaction (PCR) along with event-specific conventional as well as real-time PCR methods to characterize the event EE-1 is reported. A multiplex (pentaplex) PCR system simultaneously amplifying the cry1Ac transgene, the Cauliflower Mosaic Virus (CaMV) 35S promoter, the nopaline synthase (nos) terminator, the aminoglycoside adenyltransferase (aadA) marker gene, and a taxon-specific beta-fructosidase gene in event EE-1 has been developed. Furthermore, a construct-specific PCR, targeting an approximately 1.8 kb region of the inserted gene construct comprising the CaMV 35S promoter region and the cry1Ac gene, has also been developed. The LOD of the developed EE-1-specific conventional PCR assay is 0.01%. The performance of the reported real-time PCR assay was consistent with the acceptance criteria of Codex Alimentarius Commission ALINORM 10/33/23, with LOD and LOQ values of 0.05%. The developed detection methods would not only facilitate effective regulatory compliance for identification of genetic traits, risk assessment, management, and post-release monitoring, but also address consumer concerns and the resolution of legal disputes.

  14. Developmental and Cognitive Perspectives on Humans' Sense of the Times of Past and Future Events

    ERIC Educational Resources Information Center

    Friedman, W.J.

    2005-01-01

    Mental time travel in human adults includes a sense of when past events occurred and future events are expected to occur. Studies with adults and children reveal that a number of distinct psychological processes contribute to a temporally differentiated sense of the past and future. Adults possess representations of multiple time patterns, and…

  15. The Roles of Prior Experience and the Timing of Misinformation Presentation on Young Children's Event Memories

    ERIC Educational Resources Information Center

    Roberts, Kim P.; Powell, Martine B.

    2007-01-01

    The current study addressed how the timing of interviews affected children's memories of unique and repeated events. Five- to six-year-olds (N = 125) participated in activities 1 or 4 times and were misinformed either 3 or 21 days after the only or last event. Although single-experience children were subsequently less accurate in the 21- versus…

  17. Time sequence of events leading to chromosomal aberration formation

    SciTech Connect

    Moore, R.C.; Bender, M.A.

    1993-05-01

    Investigations have been carried out on the influence of the repair polymerases on the yield of different types of chromosomal aberrations. The studies were mainly concerned with the effect of inhibiting the polymerases on the yield of aberrations. The polymerases fill in single-strand regions, and the fact that their inhibition affects the yield of aberrations suggests that single-strand lesions are influential in aberration formation. The results indicate that there are two actions of polymerases in clastogenesis. One is in their involvement in a G2 repair system, in which either of the two chromatids is concerned, and which does not yield aberrations unless the inhibition is still operating when the cells enter mitosis. The second is such that when repair is inhibited, further damage accrues. The second action is affected by inhibiting polymerase repair, but also operates even when the repair enzymes are active. The production of chromosomal exchanges involves a series of reactions, some of which are reversible. The time span over which the reactions occur is much longer than has been envisaged previously.

  19. Modeling Potential Time to Event Data with Competing Risks

    PubMed Central

    Li, Liang; Hu, Bo; Kattan, Michael W.

    2014-01-01

    Patients receiving radical prostatectomy are at risk of metastasis or prostate cancer-related death, and often need repeated clinical evaluations to determine whether additional adjuvant or salvage therapies are needed. Since prostate cancer is a slowly progressing disease, and these additional therapies come with significant side effects, it is important for clinical decision-making purposes to estimate a patient's risk of cancer metastasis, in the presence of a competing risk of death, under the hypothetical condition that the patient does not receive any additional therapy. In observational studies, patients may receive additional therapy by choice; the time to metastasis without any therapy is therefore a potential outcome and not always observed. We study the competing risks model of Fine and Gray (1999) with adjustment for treatment choice by inverse probability of censoring weighting (IPCW). The model can be fit using standard software for partial likelihood with double IPCW weights. The proposed methodology is used in a prostate cancer study to predict the post-prostatectomy cumulative incidence probability of cancer metastasis without additional adjuvant or salvage therapies. PMID:24061908

  20. Human error and time of occurrence in hazardous material events in mining and manufacturing.

    PubMed

    Ruckart, Perri Zeitz; Burgess, Paula A

    2007-04-11

    Human error has played a role in several large-scale hazardous materials events. To assess how human error and time of occurrence may have contributed to acute chemical releases, data from the Hazardous Substances Emergency Events Surveillance (HSEES) system for 1996-2003 were analyzed. Analyses were restricted to events in mining or manufacturing where human error was a contributing factor. The temporal distribution of releases was also evaluated to determine if the night shift impacted releases due to human error. Human error-related events in mining and manufacturing resulted in almost four times as many events with victims and almost three times as many events with evacuations compared with events in these industries where human error was not a contributing factor (10.3% versus 2.7% and 11.8% versus 4.5%, respectively). Time of occurrence of events attributable to human error in mining and manufacturing showed a widespread distribution for number of events, events with victims and evacuations, and hospitalizations and deaths, without apparent increased occurrence during the night shift. Utilizing human factor engineering in both front-end ergonomic design and retrospective incident investigation provides one potential systematic approach that may help minimize human error in workplace-related acute chemical releases and their resulting injuries.

  1. Time-varying exposure and the impact of stressful life events on onset of affective disorder.

    PubMed

    Wainwright, Nicholas W J; Surtees, Paul G

    2002-07-30

    Stressful life events are now established as risk factors for the onset of affective disorder, but few studies have investigated time-varying exposure effects. Discrete (grouped) time survival methods provide a flexible framework for evaluating multiple time-dependent covariates and time-varying covariate effects. Here, we use these methods to investigate the time-varying influence of life events on the onset of affective disorder. Several straightforward time-varying exposure models are compared, involving one or more (stepped) time-dependent covariates and time-dependent covariates constructed or estimated according to exponential decay. These models are applied to data from two quite different studies. The first was a small-scale, interviewer-based longitudinal study (n = 180) concerned with affective disorder onset following loss (or threat of loss) event experiences. The second, a questionnaire assessment conducted as part of an ongoing population study (n = 3353), provides a history of marital loss events and of depressive disorder onset. From the first study, the initial impact of loss events was found to decay with a half-life of 5 weeks. Psychological coping strategy was found to modify vulnerability to the adverse effects of these events. The second study revealed that while men had a lower immediate risk of disorder onset following a loss event, their risk period was longer than for women. Time-varying exposure effects were well described by the appropriate use of simple time-dependent covariates.
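    An exponentially decaying exposure with the reported 5-week half-life can be written as a simple time-dependent covariate; the helper below is an illustration, not the authors' code:

    ```python
    import math

    def decayed_exposure(weeks_since_event, half_life=5.0):
        """Exponentially decaying exposure covariate for a discrete-time survival model.

        weeks_since_event: time elapsed since the life event, in weeks.
        half_life: weeks for the covariate to fall to half its initial value
        (5 weeks, as estimated in the first study).
        """
        return math.exp(-math.log(2.0) * weeks_since_event / half_life)

    # At the event the covariate is 1; after one half-life it is 0.5.
    ```

    In a grouped-time model this value would be recomputed for each person-interval and entered as a covariate in, e.g., a complementary log-log regression.
    
    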

  2. Predicting dose-time profiles of solar energetic particle events using Bayesian forecasting methods.

    PubMed

    Neal, J S; Townsend, L W

    2001-12-01

    Bayesian inference techniques, coupled with Markov chain Monte Carlo sampling methods, are used to predict dose-time profiles for energetic solar particle events. Inputs into the predictive methodology are dose and dose-rate measurements obtained early in the event. Surrogate dose values are grouped in hierarchical models to express relationships among similar solar particle events. Models assume nonlinear, sigmoidal growth for dose throughout an event. Markov chain Monte Carlo methods are used to sample from Bayesian posterior predictive distributions for dose and dose rate. Example predictions are provided for the November 8, 2000, and August 12, 1989, solar particle events.
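    A common parametrization of such sigmoidal dose growth is the logistic curve; the function below is an assumed stand-in for the paper's growth model, with hypothetical parameter names:

    ```python
    import math

    def sigmoid_dose(t, d_max, k, t0):
        """Logistic dose-time profile D(t) = d_max / (1 + exp(-k * (t - t0))).

        d_max: asymptotic total event dose; k: growth rate; t0: time of
        steepest dose accumulation. All parameters illustrative.
        """
        return d_max / (1.0 + math.exp(-k * (t - t0)))
    ```

    In the Bayesian scheme, priors would be placed on (d_max, k, t0) and updated with early-event dose measurements via MCMC to predict the rest of the profile.
    
    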

  3. Multivariate normality

    NASA Technical Reports Server (NTRS)

    Crutcher, H. L.; Falls, L. W.

    1976-01-01

    Sets of experimentally determined or routinely observed data provide information about the past, present and, hopefully, future sets of similarly produced data. An infinite set of statistical models exists which may be used to describe the data sets. The normal distribution is one model. If it serves at all, it serves well. If a data set, or a transformation of the set, representative of a larger population can be described by the normal distribution, then valid statistical inferences can be drawn. There are several tests which may be applied to a data set to determine whether the univariate normal model adequately describes the set. The chi-square test based on Pearson's work in the late nineteenth and early twentieth centuries is often used. Like all tests, it has some weaknesses which are discussed in elementary texts. Extension of the chi-square test to the multivariate normal model is provided. Tables and graphs permit easier application of the test in the higher dimensions. Several examples, using recorded data, illustrate the procedures. Tests of maximum absolute differences, mean sum of squares of residuals, runs and changes of sign are included in these tests. Dimensions one through five with selected sample sizes 11 to 101 are used to illustrate the statistical tests developed.
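    One standard route from the univariate chi-square test to the multivariate normal model is to check that squared Mahalanobis distances follow a chi-square distribution with p degrees of freedom. The sketch below (synthetic data, and a Kolmogorov-Smirnov comparison standing in for the tables and graphs mentioned above) illustrates the idea:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    p, n = 3, 500
    X = rng.multivariate_normal(np.zeros(p), np.eye(p), size=n)

    # Squared Mahalanobis distances from the sample mean, using the sample covariance.
    mu = X.mean(axis=0)
    S = np.cov(X, rowvar=False)
    d2 = np.einsum('ij,jk,ik->i', X - mu, np.linalg.inv(S), X - mu)

    # Under multivariate normality d2 is approximately chi-square with p df.
    stat, pval = stats.kstest(d2, 'chi2', args=(p,))
    ```

    A small p-value would indicate departure from multivariate normality; here the data are normal by construction, so the test should not reject.
    
    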

  4. Analysis and modeling of quasi-stationary multivariate time series and their application to middle latency auditory evoked potentials

    NASA Astrophysics Data System (ADS)

    Hutt, A.; Riedel, H.

    2003-03-01

    A methodological framework for analyzing and modeling multivariate data is introduced. In a first step, a cluster method extracts data segments of quasi-stationary states. A novel cluster criterion for segment borders is introduced, which is independent of the number of clusters. Its assessment reveals additional robustness towards initial conditions. A subsequent dynamical systems based modeling (DSBM) approach focuses on the data segments and fits low-dimensional dynamical systems for each segment. Applications to middle latency auditory evoked potentials yield data segments that are equivalent to well-known waves from electroencephalography studies. Focusing on wave Pa, two-dimensional dynamical systems with common topological properties are extracted. These findings reveal the common underlying dynamics of Pa and indicate self-organized brain activity.

  5. A new approach in space-time analysis of multivariate hydrological data: Application to Brazil's Nordeste region rainfall

    NASA Astrophysics Data System (ADS)

    Sicard, Emeline; Sabatier, Robert; Niel, Hélène; Cadier, Eric

    2002-12-01

    The objective of this paper is to implement an original method for spatial and multivariate data, combining a method of three-way array analysis (STATIS) with geostatistical tools. The variables of interest are the monthly amounts of rainfall in the Nordeste region of Brazil, recorded from 1937 to 1975. The principle of the technique is the calculation of a linear combination of the initial variables, containing a large part of the initial variability and taking into account the spatial dependencies. It is a promising method that is able to analyze triple variability: spatial, seasonal, and interannual. In our case, the first component obtained discriminates a group of rain gauges, corresponding approximately to the Agreste, from all the others. The monthly variables of July and August strongly influence this separation. Furthermore, an annual study brings out the stability of the spatial structure of components calculated for each year.

  6. Cognitive tasks in information analysis: Use of event dwell time to characterize component activities

    SciTech Connect

    Sanquist, Thomas F.; Greitzer, Frank L.; Slavich, Antoinette L.; Littlefield, Rik J.; Littlefield, Janis S.; Cowley, Paula J.

    2004-09-28

    Technology-based enhancement of information analysis requires a detailed understanding of the cognitive tasks involved in the process. The information search and report production tasks of the information analysis process were investigated through evaluation of time-stamped workstation data gathered with custom software. Model tasks simulated the search and production activities, and a sample of actual analyst data were also evaluated. Task event durations were calculated on the basis of millisecond-level time stamps, and distributions were plotted for analysis. The data indicate that task event time shows a cyclic pattern of variation, with shorter event durations (< 2 sec) reflecting information search and filtering, and longer event durations (> 10 sec) reflecting information evaluation. Application of cognitive principles to the interpretation of task event time data provides a basis for developing “cognitive signatures” of complex activities, and can facilitate the development of technology aids for information intensive tasks.
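    The dwell-time thresholds reported above (< 2 s for search/filtering, > 10 s for evaluation) suggest a simple binning of event durations derived from consecutive time stamps. The time stamps below are invented for illustration.

    ```python
    # Classify workstation events into cognitive categories by dwell time,
    # using the thresholds from the abstract (<2 s: search/filtering,
    # >10 s: evaluation). Millisecond time stamps here are invented.

    def classify_dwell(durations_ms):
        """Bin event durations (milliseconds) into cognitive categories."""
        categories = {"search/filtering": 0, "intermediate": 0, "evaluation": 0}
        for d in durations_ms:
            if d < 2_000:
                categories["search/filtering"] += 1
            elif d > 10_000:
                categories["evaluation"] += 1
            else:
                categories["intermediate"] += 1
        return categories

    # Dwell times are differences of consecutive millisecond time stamps:
    stamps = [0, 350, 900, 14_200, 15_100, 27_800, 28_300]
    dwells = [b - a for a, b in zip(stamps, stamps[1:])]
    print(classify_dwell(dwells))
    # → {'search/filtering': 4, 'intermediate': 0, 'evaluation': 2}
    ```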

  7. Multivariate time-dependent comparison of the impact of lenalidomide in lower-risk myelodysplastic syndromes with chromosome 5q deletion.

    PubMed

    Sánchez-García, Joaquín; Del Cañizo, Consuelo; Lorenzo, Ignacio; Nomdedeu, Benet; Luño, Elisa; de Paz, Raquel; Xicoy, Blanca; Valcárcel, David; Brunet, Salut; Marco-Betes, Victor; García-Pintos, Marta; Osorio, Santiago; Tormo, Mar; Bailén, Alicia; Cerveró, Carlos; Ramos, Fernando; Diez-Campelo, María; Such, Esperanza; Arrizabalaga, Beatriz; Azaceta, Gemma; Bargay, Joan; Arilla, María J; Falantes, José; Serrano-López, Josefina; Sanz, Guillermo F

    2014-07-01

    The impact of lenalidomide treatment on long-term outcomes of patients with lower risk myelodysplastic syndromes (MDS) and chromosome 5q deletion (del(5q)) is unclear. This study used time-dependent multivariate methodology to analyse the influence of lenalidomide therapy on overall survival (OS) and acute myeloblastic leukaemia (AML) progression in 215 patients with International Prognostic Scoring System (IPSS) low or intermediate-1 risk and del(5q). There were significant differences in several relevant characteristics at presentation between patients receiving (n = 86) or not receiving lenalidomide (n = 129). The 5-year time-dependent probabilities of OS and progression to AML were 62% and 31% for patients receiving lenalidomide and 42% and 25% for patients not receiving lenalidomide; differences were not statistically significant in multivariate analysis that included all variables independently associated with those outcomes (OS, P = 0·45; risk of AML, P = 0·31, respectively). Achievement of RBC transfusion independency (P = 0·069) or cytogenetic response (P = 0·021) after lenalidomide was associated with longer OS in multivariate analysis. These data clearly show that response to lenalidomide results in a substantial clinical benefit in lower risk MDS patients with del(5q). Lenalidomide treatment does not appear to increase AML risk in this population of patients.

  8. Time, space, and events in language and cognition: a comparative view.

    PubMed

    Sinha, Chris; Gärdenfors, Peter

    2014-10-01

    We propose an event-based account of the cognitive and linguistic representation of time and temporal relations. Human beings differ from nonhuman animals in entertaining and communicating elaborate detached (as opposed to cued) event representations and temporal relational schemas. We distinguish deictically based (D-time) from sequentially based (S-time) representations, identifying these with the philosophical categories of A-series and B-series time. On the basis of cross-linguistic data, we claim that all cultures employ both D-time and S-time representations. We outline a cognitive model of event structure, emphasizing that this does not entail an explicit, separate representation of a time dimension. We propose that the notion of an event-independent, metric "time as such" is not universal, but a cultural and historical construction based on cognitive technologies for measuring time intervals. We critically examine claims that time is universally conceptualized in terms of spatial metaphors, and hypothesize that systematic space-time metaphor is only found in languages and cultures that have constructed the notion of time as a separate dimension. We emphasize the importance of distinguishing what is universal from what is variable in cultural and linguistic representations of time, and speculate on the general implications of an event-based understanding of time. © 2014 New York Academy of Sciences.

  9. Joint monitoring and prediction of accrual and event times in clinical trials.

    PubMed

    Zhang, Xiaoxi; Long, Qi

    2012-11-01

    In many clinical trials, the primary endpoint is time to an event of interest, for example, time to cardiac attack or tumor progression, and the statistical power of these trials is primarily driven by the number of events observed during the trials. In such trials, the number of events observed is impacted not only by the number of subjects enrolled but also by other factors including the event rate and the follow-up duration. Consequently, it is important for investigators to be able to monitor and predict accurately patient accrual and event times so as to predict the times of interim and final analyses and enable efficient allocation of research resources, which have long been recognized as important aspects of trial design and conduct. The existing methods for prediction of event times all assume that patient accrual follows a Poisson process with a constant Poisson rate over time; however, it is fairly common in real-life clinical trials that the Poisson rate changes over time. In this paper, we propose a Bayesian joint modeling approach for monitoring and prediction of accrual and event times in clinical trials. We employ a nonhomogeneous Poisson process to model patient accrual and a parametric or nonparametric model for the event and loss to follow-up processes. Compared to existing methods, our proposed methods are more flexible and robust in that we model accrual and event/loss-to-follow-up times jointly and allow the underlying accrual rates to change over time. We evaluate the performance of the proposed methods through simulation studies and illustrate the methods using data from a real oncology trial. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
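    The nonhomogeneous Poisson accrual component can be simulated with the standard Lewis-Shedler thinning algorithm; the ramp-shaped rate function below is an assumption for illustration, not the model fitted in the paper.

    ```python
    import random

    def simulate_nhpp(rate_fn, rate_max, horizon, rng):
        """Simulate arrival times of a nonhomogeneous Poisson process on
        [0, horizon] by thinning a homogeneous process of rate rate_max."""
        t, arrivals = 0.0, []
        while True:
            t += rng.expovariate(rate_max)            # candidate inter-arrival
            if t > horizon:
                return arrivals
            if rng.random() < rate_fn(t) / rate_max:  # accept w.p. lambda(t)/lambda_max
                arrivals.append(t)

    # Assumed ramp-up accrual: 2 patients/month rising to 10 by month 6.
    ramp = lambda t: min(10.0, 2.0 + (8.0 / 6.0) * t)
    rng = random.Random(42)
    times = simulate_nhpp(ramp, rate_max=10.0, horizon=24.0, rng=rng)
    print(f"{len(times)} patients accrued over 24 months")
    ```

    Repeating the simulation and overlaying an event-time model on each accrued patient gives Monte-Carlo predictive bounds on the number of observed events at any calendar time.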

  10. Modeling and prediction of subject accrual and event times in clinical trials: a systematic review.

    PubMed

    Zhang, Xiaoxi; Long, Qi

    2012-12-01

    Modeling and prediction of subject accrual and event times in clinical trials has been a topic of considerable interest for important practical reasons. It has implications not only at the initial planning stage of a trial but also for its ongoing monitoring. This review provides a systematic view of recent research in the field of modeling and prediction of subject accrual and event times in clinical trials. Two classes of methods for modeling and prediction of subject accrual are reviewed, namely, one that uses the Brownian motion and the other uses the Poisson process. Extensions of the accrual models in multicenter clinical trials are also discussed. Trials with survival endpoints require proper joint modeling of subject accrual and event/lost-to-follow-up (LTFU) times, the latter of which can be modeled either parametrically (e.g., exponential and Weibull) or nonparametrically. Flexible stochastic models are better suited for modeling real trials that do not follow a constant underlying enrollment rate. The accrual model generally improves as center-specific information is accounted for in multicenter trials. The choice between parametric and nonparametric event models can depend on confidence in the underlying event rates. All methods reviewed in event modeling assume noninformative censoring, which cannot be tested. We recommend using proper stochastic accrual models, in combination with flexible event time models when applicable, for modeling and prediction of subject enrollment and event times in clinical trials.

  11. Monitoring Natural Events Globally in Near Real-Time Using NASA's Open Web Services and Tools

    NASA Technical Reports Server (NTRS)

    Boller, Ryan A.; Ward, Kevin Alan; Murphy, Kevin J.

    2015-01-01

    Since 1960, NASA has been making global measurements of the Earth from a multitude of space-based missions, many of which can be useful for monitoring natural events. In recent years, these measurements have been made available in near real-time, making it possible to use them to also aid in managing the response to natural events. We present the challenges and ongoing solutions to using NASA satellite data for monitoring and managing these events.

  12. Neural Network-Based Event-Triggered State Feedback Control of Nonlinear Continuous-Time Systems.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-03-01

    This paper presents a novel approximation-based event-triggered control of multi-input multi-output uncertain nonlinear continuous-time systems in affine form. The controller is approximated using a linearly parameterized neural network (NN) in the context of event-based sampling. After revisiting the NN approximation property in the context of event-based sampling, an event-triggered condition is proposed using the Lyapunov technique to reduce the network resource utilization and to generate the required number of events for the NN approximation. In addition, a novel weight update law for aperiodic tuning of the NN weights at triggered instants is proposed to relax the knowledge of complete system dynamics and to reduce the computation when compared with the traditional NN-based control. Nonetheless, a nonzero positive lower bound for the inter-event times is guaranteed to avoid the accumulation of events or Zeno behavior. For analyzing the stability, the event-triggered system is modeled as a nonlinear impulsive dynamical system and the Lyapunov technique is used to show local ultimate boundedness of all signals. Furthermore, in order to overcome the unnecessary triggered events when the system states are inside the ultimate bound, a dead-zone operator is used to reset the event-trigger errors to zero. Finally, the analytical design is substantiated with numerical results.
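    The paper's NN-based, Lyapunov-derived trigger is not reproduced here, but the generic event-triggered mechanism it builds on can be sketched: the controller holds its last value and is recomputed only when the gap between the current state and the last-sampled state violates a threshold. The scalar plant, gain, and threshold below are invented for illustration.

    ```python
    # Generic event-triggered state-feedback loop on a scalar linear plant.
    # The control is recomputed only when the event-trigger condition is
    # violated; between events a zero-order hold keeps the last input.
    # All numbers are illustrative, not the paper's NN-based design.

    def run(x0=5.0, a=0.9, b=1.0, k=-0.5, thresh=0.3, steps=50):
        x, x_sampled = x0, x0
        u = k * x_sampled
        events = 0
        for _ in range(steps):
            if abs(x - x_sampled) > thresh:   # trigger condition violated
                x_sampled = x                 # sample state, update controller
                u = k * x_sampled
                events += 1
            x = a * x + b * u                 # plant update with held input
        return x, events

    x_final, n_events = run()
    print(f"final state {x_final:.4f} after {n_events} controller updates (of 50 steps)")
    ```

    The run settles into a small neighborhood of the origin with only a handful of controller updates, mirroring the ultimate boundedness result; once the state stays inside the bound, no further events fire, which is the situation the paper's dead-zone operator addresses.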

  13. Joint modelling of longitudinal and time-to-event data with application to predicting abdominal aortic aneurysm growth and rupture.

    PubMed

    Sweeting, Michael J; Thompson, Simon G

    2011-09-01

    Shared random effects joint models are becoming increasingly popular for investigating the relationship between longitudinal and time-to-event data. Although appealing, such complex models are computationally intensive, and quick, approximate methods may provide a reasonable alternative. In this paper, we first compare the shared random effects model with two approximate approaches: a naïve proportional hazards model with time-dependent covariate and a two-stage joint model, which uses plug-in estimates of the fitted values from a longitudinal analysis as covariates in a survival model. We show that the approximate approaches should be avoided since they can severely underestimate any association between the current underlying longitudinal value and the event hazard. We present classical and Bayesian implementations of the shared random effects model and highlight the advantages of the latter for making predictions. We then apply the models described to a study of abdominal aortic aneurysms (AAA) to investigate the association between AAA diameter and the hazard of AAA rupture. Out-of-sample predictions of future AAA growth and hazard of rupture are derived from Bayesian posterior predictive distributions, which are easily calculated within an MCMC framework. Finally, using a multivariate survival sub-model we show that underlying diameter rather than the rate of growth is the most important predictor of AAA rupture.

  14. Predictive event modelling in multicenter clinical trials with waiting time to response.

    PubMed

    Anisimov, Vladimir V

    2011-01-01

    A new analytic statistical technique for predictive event modeling in ongoing multicenter clinical trials with waiting time to response is developed. It allows for the predictive mean and predictive bounds for the number of events to be constructed over time, accounting for the newly recruited patients and patients already at risk in the trial, and for different recruitment scenarios. For modeling patient recruitment, an advanced Poisson-gamma model is used, which accounts for the variation in recruitment over time, the variation in recruitment rates between different centers and the opening or closing of some centers in the future. A few models for event appearance allowing for 'recurrence', 'death' and 'lost-to-follow-up' events and using finite Markov chains in continuous time are considered. To predict the number of future events over time for an ongoing trial at some interim time, the parameters of the recruitment and event models are estimated using current data and then the predictive recruitment rates in each center are adjusted using individual data and Bayesian re-estimation. For a typical scenario (continue to recruit during some time interval, then stop recruitment and wait until a particular number of events happens), the closed-form expressions for the predictive mean and predictive bounds of the number of events at any future time point are derived under the assumptions of Markovian behavior of the event progression. The technique is efficiently applied to modeling different scenarios for some ongoing oncology trials. Case studies are considered. Copyright © 2011 John Wiley & Sons, Ltd.
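    The Poisson-gamma recruitment idea can be sketched by Monte Carlo rather than the paper's closed-form expressions: each center's rate is drawn from a Gamma distribution (so counts over a fixed exposure are marginally negative binomial), and totals are averaged over simulations. The shape/rate parameters and center exposures below are invented.

    ```python
    import math
    import random

    # Poisson-gamma recruitment sketch: each centre's rate ~ Gamma(alpha, beta)
    # (shape / rate), so the count over exposure T is marginally negative
    # binomial. alpha, beta, and exposures are illustrative assumptions.

    def sample_poisson(mu, rng):
        """Sample Poisson(mu) by inversion (adequate for moderate mu)."""
        u, p, k = rng.random(), math.exp(-mu), 0
        cdf = p
        while u > cdf:
            k += 1
            p *= mu / k
            cdf += p
        return k

    def predict_total(alpha, beta, exposures, n_sim, rng):
        """Monte-Carlo predictive mean of total enrolment across centres."""
        totals = []
        for _ in range(n_sim):
            total = 0
            for T in exposures:
                lam = rng.gammavariate(alpha, 1.0 / beta)  # centre-specific rate
                total += sample_poisson(lam * T, rng)
            totals.append(total)
        return sum(totals) / n_sim

    rng = random.Random(7)
    mean_total = predict_total(alpha=2.0, beta=1.0, exposures=[6.0] * 10,
                               n_sim=2000, rng=rng)
    print(f"predictive mean enrolment: {mean_total:.1f}  (theory: 120)")
    ```

    The theoretical mean is alpha/beta × T per center (here 2 × 6 = 12) times the number of centers; the paper additionally updates each center's rate by Bayesian re-estimation from interim data.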

  15. Summarizing the incidence of adverse events using volcano plots and time intervals.

    PubMed

    Zink, Richard C; Wolfinger, Russell D; Mann, Geoffrey

    2013-01-01

    Adverse event incidence analyses are a critical component for describing the safety profile of any new intervention. The results typically are presented in lengthy summary tables. For therapeutic areas where patients have frequent adverse events, analysis and interpretation are made more difficult by the sheer number and variety of events that occur. Understanding the risk in these instances becomes even more crucial. We describe a space-saving graphical summary that overcomes the limitations of traditional presentations of adverse events and improves interpretability of the safety profile. We present incidence analyses of adverse events graphically using volcano plots to highlight treatment differences. Data from a clinical trial of patients experiencing an aneurysmal subarachnoid hemorrhage are used for illustration. Adjustments for multiplicity are illustrated. Color is used to indicate the treatment with higher incidence; bubble size represents the total number of events that occur in the treatment arms combined. Adjustments for multiple comparisons are displayed in a manner to indicate clearly those events for which the difference between treatment arms is statistically significant. Furthermore, adverse events can be displayed by time intervals, with multiple volcano plots or animation to appreciate changes in adverse event risk over time. Such presentations can emphasize early differences across treatments that may resolve later or highlight events for which treatment differences may become more substantial with longer follow-up. Treatment arms are compared in a pairwise fashion. Volcano plots are space-saving tools that emphasize important differences between the adverse event profiles of two treatment arms. They can incorporate multiplicity adjustments in a manner that is straightforward to interpret and, by using time intervals, can illustrate how adverse event risk changes over the course of a clinical trial.
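    The coordinates behind such a volcano plot can be sketched directly: for each adverse-event term, x is the risk difference between arms, y is -log10 of a p-value (Fisher's exact test is one common choice; the paper does not prescribe it), and bubble size is the total event count. The counts below are invented; a matplotlib scatter of (x, y, size) would render the plot.

    ```python
    import math
    from scipy.stats import fisher_exact

    # Volcano-plot coordinates for adverse-event incidence. Counts invented.
    n_trt, n_ctl = 120, 118
    events = {               # AE term: (events in treatment, events in control)
        "headache":    (30, 12),
        "nausea":      (14, 15),
        "hypotension": (22,  8),
    }

    coords = {}
    for term, (e_trt, e_ctl) in events.items():
        table = [[e_trt, n_trt - e_trt], [e_ctl, n_ctl - e_ctl]]
        _, p = fisher_exact(table)                 # two-sided exact test
        risk_diff = e_trt / n_trt - e_ctl / n_ctl  # x-axis
        coords[term] = (risk_diff, -math.log10(p), e_trt + e_ctl)
        print(f"{term:12s} x={risk_diff:+.3f}  y={-math.log10(p):.2f}  "
              f"size={e_trt + e_ctl}")
    ```

    Points far from x = 0 and high on the y-axis flag events with large, statistically significant treatment differences; computing the same coordinates within successive time intervals gives the animated variant described above.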

  16. Seizure-Onset Mapping Based on Time-Variant Multivariate Functional Connectivity Analysis of High-Dimensional Intracranial EEG: A Kalman Filter Approach.

    PubMed

    Lie, Octavian V; van Mierlo, Pieter

    2017-01-01

    The visual interpretation of intracranial EEG (iEEG) is the standard method used in complex epilepsy surgery cases to map the regions of seizure onset targeted for resection. Still, visual iEEG analysis is labor-intensive and biased due to interpreter dependency. Multivariate parametric functional connectivity measures using adaptive autoregressive (AR) modeling of the iEEG signals based on the Kalman filter algorithm have been used successfully to localize the electrographic seizure onsets. Due to their high computational cost, these methods have been applied to a limited number of iEEG time-series (<60). The aim of this study was to test two Kalman filter implementations, a well-known multivariate adaptive AR model (Arnold et al. 1998) and a simplified, computationally efficient derivation of it, for their potential application to connectivity analysis of high-dimensional (up to 192 channels) iEEG data. When used on simulated seizures together with a multivariate connectivity estimator, the partial directed coherence, the two AR models were compared for their ability to reconstitute the designed seizure signal connections from noisy data. Next, focal seizures from iEEG recordings (73-113 channels) in three patients rendered seizure-free after surgery were mapped with the outdegree, a graph-theory index of outward directed connectivity. Simulation results indicated high levels of mapping accuracy for the two models in the presence of low-to-moderate noise cross-correlation. Accordingly, both AR models correctly mapped the real seizure onset to the resection volume. This study supports the possibility of conducting fully data-driven multivariate connectivity estimations on high-dimensional iEEG datasets using the Kalman filter approach.
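    A scalar stand-in for the adaptive AR estimation can illustrate the Kalman filter mechanics: the AR coefficient is treated as a random-walk state and the signal sample as the observation. The multivariate AR model used for iEEG connectivity generalizes this to coefficient matrices; the signal, noise levels, and regime switch below are synthetic.

    ```python
    import numpy as np

    # Random-walk Kalman filter tracking a time-varying AR(1) coefficient
    # a_t in x_t = a_t * x_{t-1} + noise. Synthetic signal with one switch.
    rng = np.random.default_rng(0)
    T = 400
    a_true = np.concatenate([np.full(T // 2, 0.5), np.full(T // 2, -0.4)])
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = a_true[t] * x[t - 1] + rng.normal(scale=0.5)

    a_hat, P = 0.0, 1.0          # state estimate and its variance
    q, r = 1e-3, 0.25            # random-walk and observation noise variances
    est = np.zeros(T)
    for t in range(1, T):
        P += q                    # predict: a_t = a_{t-1} + w_t
        H = x[t - 1]              # observation model: x_t = H * a_t + v_t
        K = P * H / (H * H * P + r)
        a_hat += K * (x[t] - H * a_hat)
        P *= 1 - K * H
        est[t] = a_hat

    print(f"estimate near end of first regime: {est[T // 2 - 1]:+.2f} (true +0.50)")
    print(f"estimate at end of second regime:  {est[-1]:+.2f} (true -0.40)")
    ```

    In the multivariate setting the same predict/update cycle runs over stacked coefficient matrices, and connectivity measures such as the partial directed coherence are then computed from the fitted time-varying AR coefficients.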

  17. Schizophrenia Spectrum Disorders Show Reduced Specificity and Less Positive Events in Mental Time Travel.

    PubMed

    Chen, Xing-Jie; Liu, Lu-Lu; Cui, Ji-Fang; Wang, Ya; Chen, An-Tao; Li, Feng-Hua; Wang, Wei-Hong; Zheng, Han-Feng; Gan, Ming-Yuan; Li, Chun-Qiu; Shum, David H K; Chan, Raymond C K

    2016-01-01

    Mental time travel refers to the ability to recall past events and to imagine possible future events. Schizophrenia (SCZ) patients have problems in remembering specific personal experiences in the past and imagining what will happen in the future. This study aimed to examine episodic past and future thinking in SCZ spectrum disorders including SCZ patients and individuals with schizotypal personality disorder (SPD) proneness who are at risk for developing SCZ. Thirty-two SCZ patients, 30 SPD proneness individuals, and 33 healthy controls participated in the study. The Sentence Completion for Events from the Past Test (SCEPT) and the Sentence Completion for Events in the Future Test were used to measure past and future thinking abilities. Results showed that SCZ patients showed significantly reduced specificity in recalling past and imagining future events; they generated a smaller proportion of specific and extended events compared to healthy controls. SPD proneness individuals only generated fewer extended events compared to healthy controls. The reduced specificity was mainly manifested in imagining future events. Both SCZ patients and SPD proneness individuals generated fewer positive events than controls. These results suggest mental time travel impairments in SCZ spectrum disorders and have implications for understanding their cognitive and emotional deficits.

  18. Schizophrenia Spectrum Disorders Show Reduced Specificity and Less Positive Events in Mental Time Travel

    PubMed Central

    Chen, Xing-jie; Liu, Lu-lu; Cui, Ji-fang; Wang, Ya; Chen, An-tao; Li, Feng-hua; Wang, Wei-hong; Zheng, Han-feng; Gan, Ming-yuan; Li, Chun-qiu; Shum, David H. K.; Chan, Raymond C. K.

    2016-01-01

    Mental time travel refers to the ability to recall past events and to imagine possible future events. Schizophrenia (SCZ) patients have problems in remembering specific personal experiences in the past and imagining what will happen in the future. This study aimed to examine episodic past and future thinking in SCZ spectrum disorders including SCZ patients and individuals with schizotypal personality disorder (SPD) proneness who are at risk for developing SCZ. Thirty-two SCZ patients, 30 SPD proneness individuals, and 33 healthy controls participated in the study. The Sentence Completion for Events from the Past Test (SCEPT) and the Sentence Completion for Events in the Future Test were used to measure past and future thinking abilities. Results showed that SCZ patients showed significantly reduced specificity in recalling past and imagining future events; they generated a smaller proportion of specific and extended events compared to healthy controls. SPD proneness individuals only generated fewer extended events compared to healthy controls. The reduced specificity was mainly manifested in imagining future events. Both SCZ patients and SPD proneness individuals generated fewer positive events than controls. These results suggest mental time travel impairments in SCZ spectrum disorders and have implications for understanding their cognitive and emotional deficits. PMID:27507958

  19. Tracking precipitation events in time and space in gridded observational data

    NASA Astrophysics Data System (ADS)

    White, R. H.; Battisti, D. S.; Skok, G.

    2017-08-01

    Novel precipitation event data sets are created by tracking 3-hourly observed rainfall in both time and space in the TRMM 3B42 and ERA-Interim data sets. Relative to TRMM, ERA-Interim underestimates the total number of events (by a factor of ˜0.5) and overestimates the frequency of events lasting >5 days (by a factor of 1.6). Longer-lasting events tend to have larger spatial footprints and higher intensity precipitation at any point in their lifetime, and thus contribute significantly more to total precipitation than shorter events. Precipitation changes in selected tropical and subtropical regions are attributed to different event characteristics: some to changes in event rainfall intensity, others to changes in the number of events. In the 40°S-40°N spatial average, the number of events lasting 1-5 days significantly increased from 1998 to 2014 in both the TRMM 3B42 and ERA-Interim data. The event data sets, analysis scripts, and selected processed data are freely available online.
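    Tracking contiguous rainfall in time and space can be sketched as connected-component labeling on a thresholded (time, lat, lon) array: connectivity along the time axis strings the 3-hourly snapshots of one event together. The field and threshold below are synthetic, and the paper's actual tracking algorithm may differ in detail.

    ```python
    import numpy as np
    from scipy import ndimage

    # Sketch: identify "events" as connected wet regions in a (time, y, x)
    # precipitation array. Synthetic field and threshold, for illustration.
    rng = np.random.default_rng(1)
    precip = rng.gamma(shape=0.3, scale=2.0, size=(16, 20, 20))  # mm / 3 h
    wet = precip > 4.0                                           # event threshold

    labels, n_events = ndimage.label(wet)    # 3-D connectivity: space + time
    durations = {}                           # event id -> lifetime in steps
    for ev in range(1, n_events + 1):
        t_idx = np.nonzero(labels == ev)[0]
        durations[ev] = int(t_idx.max() - t_idx.min() + 1)

    long_lived = sum(1 for d in durations.values() if d > 1)
    print(f"{n_events} events, {long_lived} lasting more than one 3-hour step")
    ```

    Event statistics such as the duration distribution or per-event accumulated rainfall then follow by aggregating `precip` over each labeled region.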

  20. The Time-Scaling Issue in the Frequency Analysis of Multidimensional Extreme Events

    NASA Astrophysics Data System (ADS)

    Gonzalez, J.; Valdes, J. B.

    2004-05-01

    Extreme events, such as droughts, appear as periods during which water availability departs exceptionally from normal conditions. Several characteristics of this departure are important in analyzing drought recurrence frequency (e.g., magnitude, maximum intensity, duration, and severity). In such problems, the time scale applied in the analysis becomes an issue when conventional frequency-analysis approaches, generally based on run theory, are applied. Usually only one or two main event characteristics are used, and when the time scale changes by orders of magnitude the derived frequency changes significantly, so the characterization is poor. For example, a short time scale emphasizes characteristics such as intensity, whereas a long time scale emphasizes magnitude. This variability may be overcome with a new approach in which events are treated as multidimensional in time. This is studied here by comparing analyses that apply the conventional approach and the new multidimensional approach, using time scales from daily to decadal. The main outcome of the study is the improved performance of the multidimensional technique, with which the frequency remains well characterized even across time scales differing by orders of magnitude. The ability to incorporate all event features implicitly in the time distribution makes it possible to characterize events independently of the time scale, provided the scale does not hide the extreme features.

  1. Event-Based Tone Mapping for Asynchronous Time-Based Image Sensor

    PubMed Central

    Simon Chane, Camille; Ieng, Sio-Hoi; Posch, Christoph; Benosman, Ryad B.

    2016-01-01

    The asynchronous time-based neuromorphic image sensor ATIS is an array of autonomously operating pixels able to encode luminance information with an exceptionally high dynamic range (>143 dB). This paper introduces an event-based methodology to display data from this type of event-based imagers, taking into account the large dynamic range and high temporal accuracy that go beyond available mainstream display technologies. We introduce an event-based tone mapping methodology for asynchronously acquired time encoded gray-level data. A global and a local tone mapping operator are proposed. Both are designed to operate on a stream of incoming events rather than on time frame windows. Experimental results on real outdoor scenes are presented to evaluate the performance of the tone mapping operators in terms of quality, temporal stability, adaptation capability, and computational time. PMID:27642275

  2. Adaptive Event-Triggered Control Based on Heuristic Dynamic Programming for Nonlinear Discrete-Time Systems.

    PubMed

    Dong, Lu; Zhong, Xiangnan; Sun, Changyin; He, Haibo

    2016-04-08

    This paper presents the design of a novel adaptive event-triggered control method based on the heuristic dynamic programming (HDP) technique for nonlinear discrete-time systems with unknown system dynamics. In the proposed method, the control law is only updated when the event-triggered condition is violated. Compared with the periodic updates in the traditional adaptive dynamic programming (ADP) control, the proposed method can reduce the computation and transmission cost. An actor-critic framework is used to learn the optimal event-triggered control law and the value function. Furthermore, a model network is designed to estimate the system state vector. The main contribution of this paper is to design a new trigger threshold for discrete-time systems. A detailed Lyapunov stability analysis shows that our proposed event-triggered controller can asymptotically stabilize the discrete-time systems. Finally, we test our method on two different discrete-time systems, and the simulation results are included.

  3. Time-transgressive late cenozoic radiolarian events of the equatorial indo-pacific.

    PubMed

    Johnson, D A; Nigrini, C A

    1985-11-01

    A biostratigraphic study of late Cenozoic Radiolaria in the equatorial Indo-Pacific shows an asymmetrical distribution between synchronous and diachronous events. A majority of synchronous events (15 out of 19) are last occurrences; the majority of diachronous events (10 out of 13) are first occurrences. Extinctions may therefore be preferable to first occurrences in the selection of datum levels for the definition of biostratigraphic zonations and for correlation control within global time scales. Diachronous equatorial radiolarian events span 1 to 2 million years, several orders of magnitude longer than the nominal mixing time of the oceans, suggesting that the biological and physical exchange processes associated with speciation events may not follow simple advective mixing models.

  4. Angles of multivariable root loci

    NASA Technical Reports Server (NTRS)

    Thompson, P. M.; Stein, G.; Laub, A. J.

    1982-01-01

    A generalized eigenvalue problem is demonstrated to be useful for computing the multivariable root locus, particularly when obtaining the arrival angles to finite transmission zeros. The multivariable root loci are found for a linear, time-invariant output feedback problem. The problem is then employed to compute a closed-loop eigenstructure. The method of computing angles on the root locus is demonstrated, and the method is extended to a multivariable optimal root locus.
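    The generalized eigenvalue formulation mentioned above can be sketched for the zero-finding part: the finite transmission zeros of a square state-space system (A, B, C, D) are the finite generalized eigenvalues of the Rosenbrock pencil. The example system is invented; its single zero can be read off the transfer function.

    ```python
    import numpy as np
    from scipy.linalg import eig

    # Transmission zeros of (A, B, C, D) as the finite generalized
    # eigenvalues of the Rosenbrock pencil:
    #   M = [[A, B], [C, D]],   N = [[I, 0], [0, 0]].

    def transmission_zeros(A, B, C, D):
        n = A.shape[0]
        M = np.block([[A, B], [C, D]])
        N = np.zeros_like(M)
        N[:n, :n] = np.eye(n)
        vals = eig(M, N, right=False)         # generalized eigenvalues
        return vals[np.isfinite(vals)]        # drop the infinite ones

    # SISO example: G(s) = (s + 3) / (s^2 + 3 s + 2)  ->  one zero at s = -3.
    A = np.array([[0.0, 1.0], [-2.0, -3.0]])
    B = np.array([[0.0], [1.0]])
    C = np.array([[3.0, 1.0]])
    D = np.array([[0.0]])
    print(transmission_zeros(A, B, C, D))     # ≈ [-3.]
    ```

    The arrival angles of the root-locus branches at these finite zeros are then obtained from the associated generalized eigenvectors, which is the computation the abstract describes.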

  5. Analysis of intensity time profiles of solar proton events in solar cycle 23

    NASA Astrophysics Data System (ADS)

    Ochelkov, Yurij

    The intensity time profiles of proton fluxes in SEP events of solar cycle 23 were studied using data from GOES. In our study, the time interval during which the proton flux intensity changes from 1/10 of the peak intensity to the peak intensity, in both the rise and decay stages, is considered the main phase of the time evolution of an SEP event. The study of time profiles in this phase is very important to help us understand which propagation mechanism determines the proton peak intensities in different events and to establish an SEP event classification. We propose the following parameters for quantitative analysis of time profiles: 1) time width parameters, such as the mean width of the time profile for intensity on a logarithmic scale for the main phase and for the rise and decay stages of the main phase, and the same parameters for the stage in which the flux changes from 1/3.16 of the peak intensity to the peak intensity; 2) propagation parameters, such as the ratio of the time to maximum (the interval between the injection onset time and the peak intensity time) to each of the time width parameters. We calculated the theoretical values of all these parameters for diffusive propagation with instant and long injection and for different dependencies of the diffusion coefficient on distance. We constructed the distribution of the propagation parameter for the main phase for all events of solar cycle 23 and found that it has a peak near the value 0.43 for fluxes with proton energies greater than 60 and 100 MeV. Roughly 20% of events have a propagation parameter equal to 0.43 to within 10%. We propose to consider such a time profile as a basic time profile; it is formed by a diffusive process with instant injection and a diffusion coefficient depending linearly on distance. In our view, the deviation from the basic profile for the great bulk of events depends on long injection from shock waves and on proton trapping in structures of the heliosphere.
Practically all profiles for proton energy greater than 10 MeV are

  6. Asynchronous Periodic Edge-Event Triggered Control for Double-Integrator Networks With Communication Time Delays.

    PubMed

    Duan, Gaopeng; Xiao, Feng; Wang, Long

    2017-01-23

    This paper focuses on the average consensus of double-integrator networked systems based on the asynchronous periodic edge-event triggered control. The asynchronous property lies in the edge event-detecting procedure. For different edges, their event detections are performed at different times and the corresponding events occur independently of each other. When an event is activated, the two adjacent agents connected by the corresponding link sample their relative state information and update their controllers. The application of incidence matrix facilitates the transformation of control objects from the agent-based to the edge-based. Practically, due to the constraints of network bandwidth and communication distance, agents usually cannot receive the instantaneous information of some others, which has an impact on the system performance. Hence, it is necessary to investigate the presence of communication time delays. For double-integrator multiagent systems with and without communication time delays, the average state consensus can be asynchronously achieved by designing appropriate parameters under the proposed event-detecting rules. The presented results specify the relationship among the maximum allowable time delays, interaction topologies, and event-detecting periods. Furthermore, the proposed protocols have the advantages of reduced communication costs and controller-updating costs. Simulation examples are given to illustrate the proposed theoretical results.

  7. Shared parameter models for the joint analysis of longitudinal data and event times.

    PubMed

    Vonesh, Edward F; Greene, Tom; Schluchter, Mark D

    2006-01-15

    Longitudinal studies often gather joint information on time to some event (survival analysis, time to dropout) and serial outcome measures (repeated measures, growth curves). Depending on the purpose of the study, one may wish to estimate and compare serial trends over time while accounting for possibly non-ignorable dropout or one may wish to investigate any associations that may exist between the event time of interest and various longitudinal trends. In this paper, we consider a class of random-effects models known as shared parameter models that are particularly useful for jointly analysing such data; namely repeated measurements and event time data. Specific attention will be given to the longitudinal setting where the primary goal is to estimate and compare serial trends over time while adjusting for possible informative censoring due to patient dropout. Parametric and semi-parametric survival models for event times together with generalized linear or non-linear mixed-effects models for repeated measurements are proposed for jointly modelling serial outcome measures and event times. Methods of estimation are based on a generalized non-linear mixed-effects model that may be easily implemented using existing software. This approach allows for flexible modelling of both the distribution of event times and of the relationship of the longitudinal response variable to the event time of interest. The model and methods are illustrated using data from a multi-centre study of the effects of diet and blood pressure control on progression of renal disease, the modification of diet in renal disease study.
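
A shared parameter model ties the longitudinal trajectory and the event time through common random effects. A minimal data-generating sketch (all parameter values are illustrative assumptions, not from the study) makes the induced dependence concrete: a subject-level random intercept enters both the repeated measurements and the log-hazard, so subjects with higher random effects have systematically shorter event times.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
b = rng.normal(0.0, 1.0, n)        # shared subject-level random effect
gamma = 0.8                        # association parameter (assumed)

# longitudinal outcome at times t = 0..4: y = 2 + 0.5*t + b + noise
t_obs = np.arange(5.0)
y = (2.0 + 0.5 * t_obs[None, :] + b[:, None]
     + rng.normal(0.0, 0.5, (n, t_obs.size)))

# event time: exponential with log-hazard proportional to gamma * b
rate = 0.1 * np.exp(gamma * b)
event_time = rng.exponential(1.0 / rate)

# higher random effect -> higher hazard -> shorter event time
corr = np.corrcoef(b, np.log(event_time))[0, 1]
```

With gamma > 0 the correlation between the random effect and log event time is negative, which is exactly the informative-dropout mechanism the joint analysis is designed to adjust for.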

  8. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm

    PubMed Central

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-01-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses. PMID:27706086
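
The jerk-plus-peak-heuristics idea can be illustrated with a toy detector (the threshold rule, refractory gap, and synthetic signal below are assumptions, not the authors' algorithm): differentiate acceleration to obtain jerk, then keep local maxima of |jerk| that exceed an adaptive threshold, separated by a minimum gap.

```python
import numpy as np

def detect_jerk_peaks(accel, fs, min_gap=0.3, k=2.0):
    """Toy peak-heuristic detector on the jerk (d/dt of acceleration).

    Flags local maxima of |jerk| above mean + k*std, enforcing a
    refractory gap (in seconds) between successive detections.
    """
    jerk = np.gradient(accel) * fs
    mag = np.abs(jerk)
    thresh = mag.mean() + k * mag.std()
    gap = int(min_gap * fs)
    peaks, last = [], -gap
    for i in range(1, len(mag) - 1):
        if mag[i] >= thresh and mag[i] >= mag[i - 1] and mag[i] >= mag[i + 1]:
            if i - last >= gap:
                peaks.append(i)
                last = i
    return np.array(peaks)

# synthetic acceleration: two abrupt steps (large jerk) on a smooth background
fs = 100.0
t = np.arange(0, 5, 1 / fs)
accel = 0.05 * np.sin(2 * np.pi * 1.0 * t)
accel[150:] += 1.0   # abrupt change at t = 1.5 s
accel[350:] += 1.0   # abrupt change at t = 3.5 s
peaks = detect_jerk_peaks(accel, fs)
```

A real gait detector would add the time-frequency step for per-subject gait parameters; this sketch only shows the peak-heuristics stage.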

  9. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm.

    PubMed

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-10-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses.

  10. On the estimation of intracluster correlation for time-to-event outcomes in cluster randomized trials.

    PubMed

    Kalia, Sumeet; Klar, Neil; Donner, Allan

    2016-12-30

    Cluster randomized trials (CRTs) involve the random assignment of intact social units rather than independent subjects to intervention groups. Time-to-event outcomes often are endpoints in CRTs. Analyses of such data need to account for the correlation among cluster members. The intracluster correlation coefficient (ICC) is used to assess the similarity among binary and continuous outcomes that belong to the same cluster. However, estimating the ICC in CRTs with time-to-event outcomes is a challenge because of the presence of censored observations. The literature suggests that the ICC may be estimated using either censoring indicators or observed event times. A simulation study explores the effect of administrative censoring on estimating the ICC. Results show that ICC estimators derived from censoring indicators or observed event times are negatively biased. Analytic work further supports these results. Observed event times are preferred for estimating the ICC when the frequency of administrative censoring is minimal. To our knowledge, the existing literature provides no practical guidance on the estimation of the ICC when a substantial amount of administrative censoring is present. The results from this study corroborate the need for further methodological research on estimating the ICC for correlated time-to-event outcomes. Copyright © 2016 John Wiley & Sons, Ltd.
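
An ANOVA-type ICC estimator can be sketched on simulated clustered time-to-event data without censoring (the shared-gamma-frailty data model and parameter values below are assumptions for illustration, not the study's design): for exponential event times divided by a mean-one Gamma(a, 1/a) cluster frailty, the true ICC works out to 1/a, which the one-way ANOVA estimator should roughly recover.

```python
import numpy as np

rng = np.random.default_rng(7)
K, m, a = 2000, 10, 5.0          # clusters, cluster size, frailty shape

# shared gamma frailty per cluster; exponential event times within
u = rng.gamma(a, 1.0 / a, K)               # mean-1 frailty
times = rng.exponential(1.0, (K, m)) / u[:, None]

# one-way ANOVA estimator of the intracluster correlation
grand = times.mean()
cm = times.mean(axis=1)
msb = m * ((cm - grand) ** 2).sum() / (K - 1)
msw = ((times - cm[:, None]) ** 2).sum() / (K * (m - 1))
icc = (msb - msw) / (msb + (m - 1) * msw)
```

Introducing administrative censoring (truncating the large times) and re-running the same estimator on the censored values would reproduce the negative bias the abstract describes.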

  11. WAITING TIME DISTRIBUTION OF SOLAR ENERGETIC PARTICLE EVENTS MODELED WITH A NON-STATIONARY POISSON PROCESS

    SciTech Connect

    Li, C.; Su, W.; Fang, C.; Zhong, S. J.; Wang, L.

    2014-09-10

    We present a study of the waiting time distributions (WTDs) of solar energetic particle (SEP) events observed with the spacecraft WIND and GOES. The WTDs of both solar electron events (SEEs) and solar proton events (SPEs) display a power-law tail of ∼Δt^(−γ). The SEEs display a broken power-law WTD. The power-law index is γ1 = 0.99 for short waiting times (<70 hr) and γ2 = 1.92 for large waiting times (>100 hr). The break of the WTD of SEEs is probably due to the modulation of the corotating interaction regions. The power-law index γ ∼ 1.82 is derived for the WTD of the SPEs, which is consistent with the WTD of type II radio bursts, indicating a close relationship between the shock wave and the production of energetic protons. The WTDs of SEP events can be modeled with a non-stationary Poisson process, which was proposed to understand the waiting time statistics of solar flares. We generalize the method and find that, if the SEP event rate λ = 1/Δt varies such that the time distribution of the event rate is f(λ) = Aλ^(−α)exp(−βλ), the time-dependent Poisson distribution can produce a power-law tail WTD of ∼Δt^(α−3), where 0 ≤ α < 2.
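
The generalized result can be checked by Monte Carlo. A minimal numpy sketch (the values α = β = 1 and the sampling construction are assumptions for illustration, not taken from the paper): in a piecewise-constant-rate Poisson process, an observed waiting time comes from a rate λ drawn with density proportional to λ f(λ), here a Gamma(2 − α, 1/β), followed by an Exp(λ) wait. That mixture has survival function P(Δt > t) = (1 + t/β)^−(2−α), i.e. the ∼Δt^(α−3) density tail quoted above.

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta, n = 1.0, 1.0, 200_000

# rate-weighted sampling: lambda ~ Gamma(2 - alpha, 1/beta), then Exp(lambda)
lam = rng.gamma(2.0 - alpha, 1.0 / beta, n)
dt = rng.exponential(1.0 / lam)

def survival(t):
    """Empirical P(dt > t)."""
    return np.mean(dt > t)

# empirical tail exponent between t = 10 and t = 100; analytically
# P(dt > t) = (1 + t/beta)**-(2 - alpha), so the slope should be 2 - alpha
slope = np.log(survival(10.0) / survival(100.0)) / np.log(101.0 / 11.0)
```

For α = 1 the fitted survival-function slope should sit near 2 − α = 1, matching the Δt^(α−3) = Δt^(−2) density tail.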

  12. L^p stability (1 ≤ p ≤ ∞) of multivariable non-linear time-varying feedback systems that are open-loop unstable

    NASA Technical Reports Server (NTRS)

    Callier, F. M.; Desoer, C. A.

    1974-01-01

    The loop transformation technique (Sandberg, 1965; Zames, 1966; Willems, 1971) and the fixed point theorem (Schwartz, 1970) are used to derive the L^p stability of a class of multivariable nonlinear time-varying feedback systems which are open-loop unstable. The application of the fixed point theorem in L^p shows that the nonlinear feedback system has one and only one solution for any pair of inputs in L^p, that the solutions depend continuously on the inputs, and that the closed-loop system is L^p-stable for any p from 1 to infinity.

  13. Spatial Cueing in Time-Space Synesthetes: An Event-Related Brain Potential Study

    ERIC Educational Resources Information Center

    Teuscher, Ursina; Brang, David; Ramachandran, Vilayanur S.; Coulson, Seana

    2010-01-01

    Some people report that they consistently and involuntarily associate time events, such as months of the year, with specific spatial locations; a condition referred to as time-space synesthesia. The present study investigated the manner in which such synesthetic time-space associations affect visuo-spatial attention via an endogenous cuing…

  15. Stochastic Generation of Drought Events using Reconstructed Annual Streamflow Time Series from Tree Ring Analysis

    NASA Astrophysics Data System (ADS)

    Lopes, A.; Dracup, J. A.

    2011-12-01

    The statistical analysis of multiyear drought events in streamflow records is often restricted by sample size, since only a small number of drought events can be extracted from common river flow time series. An alternative to those conventional datasets is paleohydrologic data, such as streamflow time series reconstructed from tree ring analysis. In this study, we analyze the statistical characteristics of drought events present in a 1439-year time series of reconstructed annual streamflow at the Feather River inflow to the Oroville reservoir, California. Probabilistic distributions were used to describe the duration and severity of drought events, and the results were compared with previous studies that used only the observed streamflow data. Finally, a stochastic simulation model was developed to synthetically generate sequences of drought and high-flow events with the same characteristics as the paleohydrologic record. The long-term mean flow was used as the single truncation level to define 248 drought events and 248 high-flow events with specific duration and severity. The longest drought and high-flow events had durations of 13 years (1471 to 1483) and 9 years (1903 to 1911), respectively. A strong relationship between event duration and severity was found for both drought and high-flow events, so the longest droughts also corresponded to the most severe ones. The events were therefore classified by duration (in years), and probability distributions were fitted to the frequency distribution of drought and high-flow severity for each duration. As a result, it was found that the gamma distribution describes the frequency distribution of drought severities well for all durations. For high-flow events, the exponential distribution is more adequate for one-year events, while the gamma distribution is better suited for longer events.
Those distributions can be used to estimate the recurrence time of drought events according to
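
A minimal sketch of the kind of gamma fit reported above, using a numpy-only method-of-moments estimator on synthetic severities (the parameter values and variable names are illustrative assumptions, not the study's data):

```python
import numpy as np

def gamma_mom(x):
    """Method-of-moments fit of a gamma distribution (shape k, scale theta).

    For a gamma distribution, mean = k*theta and var = k*theta**2,
    so k = mean**2/var and theta = var/mean.
    """
    mean, var = x.mean(), x.var()
    return mean**2 / var, var / mean

rng = np.random.default_rng(0)
# stand-in for drought-severity samples within one duration class
severities = rng.gamma(2.0, 3.0, 100_000)
k_hat, theta_hat = gamma_mom(severities)
```

The fitted distribution's survival function then gives an exceedance probability per event, which is the ingredient needed for recurrence-time estimates of a severity threshold.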

  16. Improving linear accelerator service response with a real- time electronic event reporting system.

    PubMed

    Hoisak, Jeremy D P; Pawlicki, Todd; Kim, Gwe-Ya; Fletcher, Richard; Moore, Kevin L

    2014-09-08

    To track linear accelerator performance issues, an online event recording system was developed in-house for use by therapists and physicists to log the details of technical problems arising on our institution's four linear accelerators. In use since October 2010, the system was designed so that all clinical physicists would receive email notification when an event was logged. Starting in October 2012, we initiated a pilot project in collaboration with our linear accelerator vendor to explore a new model of service and support, in which event notifications were also sent electronically directly to dedicated engineers at the vendor's technical help desk, who then initiated a response to technical issues. Previously, technical issues were reported by telephone to the vendor's call center, which then disseminated information and coordinated a response with the Technical Support help desk and local service engineers. The purpose of this work was to investigate the improvements to clinical operations resulting from this new service model. The new and old service models were quantitatively compared by reviewing event logs and the oncology information system database in the nine months prior to and after initiation of the project. Here, we focus on events that resulted in an inoperative linear accelerator ("down" machine). Machine downtime, vendor response time, treatment cancellations, and event resolution were evaluated and compared over two equivalent time periods. In 389 clinical days, there were 119 machine-down events: 59 events before and 60 after introduction of the new model. In the new model, median time to service response decreased from 45 to 8 min, service engineer dispatch time decreased 44%, downtime per event decreased from 45 to 20 min, and treatment cancellations decreased 68%. The decreased vendor response time and reduced number of on-site visits by a service engineer resulted in decreased downtime and decreased patient treatment cancellations.

  17. Improving linear accelerator service response with a real-time electronic event reporting system.

    PubMed

    Hoisak, Jeremy D P; Pawlicki, Todd; Kim, Gwe-Ya; Fletcher, Richard; Moore, Kevin L

    2014-09-01

    To track linear accelerator performance issues, an online event recording system was developed in-house for use by therapists and physicists to log the details of technical problems arising on our institution's four linear accelerators. In use since October 2010, the system was designed so that all clinical physicists would receive email notification when an event was logged. Starting in October 2012, we initiated a pilot project in collaboration with our linear accelerator vendor to explore a new model of service and support, in which event notifications were also sent electronically directly to dedicated engineers at the vendor's technical help desk, who then initiated a response to technical issues. Previously, technical issues were reported by telephone to the vendor's call center, which then disseminated information and coordinated a response with the Technical Support help desk and local service engineers. The purpose of this work was to investigate the improvements to clinical operations resulting from this new service model. The new and old service models were quantitatively compared by reviewing event logs and the oncology information system database in the nine months prior to and after initiation of the project. Here, we focus on events that resulted in an inoperative linear accelerator ("down" machine). Machine downtime, vendor response time, treatment cancellations, and event resolution were evaluated and compared over two equivalent time periods. In 389 clinical days, there were 119 machine-down events: 59 events before and 60 after introduction of the new model. In the new model, median time to service response decreased from 45 to 8 min, service engineer dispatch time decreased 44%, downtime per event decreased from 45 to 20 min, and treatment cancellations decreased 68%. The decreased vendor response time and reduced number of on-site visits by a service engineer resulted in decreased downtime and decreased patient treatment cancellations.

  18. Near Optimal Event-Triggered Control of Nonlinear Discrete-Time Systems Using Neurodynamic Programming.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-09-01

    This paper presents an event-triggered near optimal control of uncertain nonlinear discrete-time systems. Event-driven neurodynamic programming (NDP) is utilized to design the control policy. A neural network (NN)-based identifier, with event-based state and input vectors, is utilized to learn the system dynamics. An actor-critic framework is used to learn the cost function and the optimal control input. The NN weights of the identifier, the critic, and the actor NNs are tuned aperiodically once every triggered instant. An adaptive event-trigger condition to decide the trigger instants is derived. Thus, a suitable number of events are generated to ensure a desired accuracy of approximation. A near optimal performance is achieved without using value and/or policy iterations. A detailed analysis of nontrivial inter-event times with an explicit formula to show the reduction in computation is also derived. The Lyapunov technique is used in conjunction with the event-trigger condition to guarantee the ultimate boundedness of the closed-loop system. The simulation results are included to verify the performance of the controller. The net result is the development of event-driven NDP.

  19. A novel way to detect correlations on multi-time scales, with temporal evolution and for multi-variables

    NASA Astrophysics Data System (ADS)

    Yuan, Naiming; Xoplaki, Elena; Zhu, Congwen; Luterbacher, Juerg

    2016-06-01

    In this paper, two new methods, Temporal evolution of Detrended Cross-Correlation Analysis (TDCCA) and Temporal evolution of Detrended Partial-Cross-Correlation Analysis (TDPCCA), are proposed by generalizing DCCA and DPCCA. Applying TDCCA/TDPCCA, it is possible to study correlations on multi-time scales and over different periods. To illustrate their properties, we used two climatological examples: i) Global Sea Level (GSL) versus North Atlantic Oscillation (NAO); and ii) Summer Rainfall over Yangtze River (SRYR) versus previous winter Pacific Decadal Oscillation (PDO). We find significant correlations between GSL and NAO on time scales of 60 to 140 years, but the correlations are non-significant between 1865-1875. As for SRYR and PDO, significant correlations are found on time scales of 30 to 35 years, but the correlations are more pronounced during the recent 30 years. By combining TDCCA/TDPCCA and DCCA/DPCCA, we proposed a new correlation-detection system, which compared to traditional methods, can objectively show how two time series are related (on which time scale, during which time period). These are important not only for diagnosis of complex system, but also for better designs of prediction models. Therefore, the new methods offer new opportunities for applications in natural sciences, such as ecology, economy, sociology and other research fields.
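
The temporal-evolution variants generalize DCCA, but the underlying detrended cross-correlation coefficient at a single window scale can be sketched compactly. Below is a numpy-only illustration in the standard detrended-covariance style (the window handling and test series are assumptions, not the authors' implementation):

```python
import numpy as np

def rho_dcca(x, y, s):
    """Detrended cross-correlation coefficient at window scale s."""
    def fluct(a, b):
        # integrate the demeaned series into profiles
        pa = np.cumsum(a - a.mean())
        pb = np.cumsum(b - b.mean())
        n = len(pa) // s
        t = np.arange(s, dtype=float)
        cov = 0.0
        for i in range(n):
            wa, wb = pa[i*s:(i+1)*s], pb[i*s:(i+1)*s]
            # remove a linear trend from each window, then covary residuals
            ra = wa - np.polyval(np.polyfit(t, wa, 1), t)
            rb = wb - np.polyval(np.polyfit(t, wb, 1), t)
            cov += (ra * rb).mean()
        return cov / n
    return fluct(x, y) / np.sqrt(fluct(x, x) * fluct(y, y))

rng = np.random.default_rng(3)
noise = rng.normal(size=2000)
x = noise + 0.1 * rng.normal(size=2000)
y = noise + 0.1 * rng.normal(size=2000)  # shares the common component
rho = rho_dcca(x, y, 50)
```

Sliding this computation over sub-periods is essentially what the temporal-evolution extension adds, yielding a correlation as a function of both scale and time.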

  20. A novel way to detect correlations on multi-time scales, with temporal evolution and for multi-variables

    PubMed Central

    Yuan, Naiming; Xoplaki, Elena; Zhu, Congwen; Luterbacher, Juerg

    2016-01-01

    In this paper, two new methods, Temporal evolution of Detrended Cross-Correlation Analysis (TDCCA) and Temporal evolution of Detrended Partial-Cross-Correlation Analysis (TDPCCA), are proposed by generalizing DCCA and DPCCA. Applying TDCCA/TDPCCA, it is possible to study correlations on multi-time scales and over different periods. To illustrate their properties, we used two climatological examples: i) Global Sea Level (GSL) versus North Atlantic Oscillation (NAO); and ii) Summer Rainfall over Yangtze River (SRYR) versus previous winter Pacific Decadal Oscillation (PDO). We find significant correlations between GSL and NAO on time scales of 60 to 140 years, but the correlations are non-significant between 1865–1875. As for SRYR and PDO, significant correlations are found on time scales of 30 to 35 years, but the correlations are more pronounced during the recent 30 years. By combining TDCCA/TDPCCA and DCCA/DPCCA, we proposed a new correlation-detection system, which compared to traditional methods, can objectively show how two time series are related (on which time scale, during which time period). These are important not only for diagnosis of complex system, but also for better designs of prediction models. Therefore, the new methods offer new opportunities for applications in natural sciences, such as ecology, economy, sociology and other research fields. PMID:27293028

  1. Position-gram - A Visual Method for Detecting Transient Events in Continuous GPS Time Series

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Wdowinski, S.

    2008-12-01

    Continuous Global Positioning System (CGPS) time series provide excellent observations for detecting crustal deformation at various length and time-scales. With the increasing precision and length of the time series, new modes of deformation, such as slow slip events and sub-continental scale changes in crustal velocities, can be detected. However, non-tectonic surface movements and measurement noise limit our ability to detect and quantify tectonic-induced transient deformation. Two common methods for reducing noise level in CGPS time series, spatial filtering and periodic seasonal fitting, significantly improve the secular tectonic signal, but fail when transient deformation events are embedded in the time series. We developed a new visually-based method for detecting transient events in CGPS time series. The development was inspired by wavelet analysis presentations that use color to present quantitative information about relationships between time and frequency domains. Here we explore the relationship between time and space domains. The displacement information is color coded according to spline fitting of each time series. This 3-D information (time, space, and displacement in color) allows easy detection of spatio-temporal patterns, which can serve as indicators for transient deformation events. We tested the new method with CGPS time series from three regions with different spatial scales: the Pacific Northwest, Southern California, and the entire continental US. The Pacific Northwest study confirmed that our proposed methodology is capable of detecting transient events and mapping their lateral distribution. The Southern California study detected a new transient event near the intersection of the San Andreas and San Jacinto faults, far from any known creeping fault segments. Finally the continental scale analysis revealed regionally correlated crustal movements in the Basin and Range and California, but uncorrelated with sites in eastern US. Such signal

  2. Are Time- and Event-based Prospective Memory Comparably Affected in HIV Infection?†

    PubMed Central

    Zogg, Jennifer B.; Woods, Steven Paul; Weber, Erica; Doyle, Katie; Grant, Igor; Atkinson, J. Hampton; Ellis, Ronald J.; McCutchan, J. Allen; Marcotte, Thomas D.; Hale, Braden R.; Ellis, Ronald J.; McCutchan, J. Allen; Letendre, Scott; Capparelli, Edmund; Schrier, Rachel; Heaton, Robert K.; Cherner, Mariana; Moore, David J.; Jernigan, Terry; Fennema-Notestine, Christine; Archibald, Sarah L.; Hesselink, John; Annese, Jacopo; Taylor, Michael J.; Masliah, Eliezer; Everall, Ian; Langford, T. Dianne; Richman, Douglas; Smith, David M.; McCutchan, J. Allen; Everall, Ian; Lipton, Stuart; McCutchan, J. Allen; Atkinson, J. Hampton; Ellis, Ronald J.; Letendre, Scott; Atkinson, J. Hampton; von Jaeger, Rodney; Gamst, Anthony C.; Cushman, Clint; Masys, Daniel R.; Abramson, Ian; Ake, Christopher; Vaida, Florin

    2011-01-01

    According to the multi-process theory of prospective memory (ProM), time-based tasks rely more heavily on strategic processes dependent on prefrontal systems than do event-based tasks. Given the prominent frontostriatal pathophysiology of HIV infection, one would expect HIV-infected individuals to demonstrate greater deficits in time-based versus event-based ProM. However, the two prior studies examining this question have produced variable results. We evaluated this hypothesis in 143 individuals with HIV infection and 43 demographically similar seronegative adults (HIV−) who completed the research version of the Memory for Intentions Screening Test, which yields parallel subscales of time- and event-based ProM. Results showed main effects of HIV serostatus and cue type, but no interaction between serostatus and cue. Planned pair-wise comparisons showed a significant effect of HIV on time-based ProM and a trend-level effect on event-based ProM that was driven primarily by the subset of participants with HIV-associated neurocognitive disorders. Nevertheless, time-based ProM was more strongly correlated with measures of executive functions, attention/working memory, and verbal fluency in HIV-infected persons. Although HIV-associated deficits in time- and event-based ProM appear to be of comparable severity, the cognitive architecture of time-based ProM may be more strongly influenced by strategic monitoring and retrieval processes. PMID:21459901

  3. Case-based damage assessment of storm events in near real-time

    NASA Astrophysics Data System (ADS)

    Möhrle, Stella; Mühr, Bernhard

    2015-04-01

    Damage assessment in times of crisis is complex due to a highly dynamic environment and uncertainty with respect to the available information. In order to assess the extent of a disaster in near real-time, historic events and their consequences may facilitate first estimations. Events of the past which are in the same category, or which have frame conditions similar to imminent or just-occurring storms, may give preliminary information about possible damages. The challenge here is to identify useful historic events based on little information about the current event. This work investigates the potential of drawing conclusions about a current event from similar historic disasters, exemplarily for storm events in Germany. Predicted wind speed and affected area can be used to roughly classify a storm event. For this purpose, a grid of equidistant points can be used to split up the area of Germany; in combination with the predicted wind speed at these points and the predicted number of points affected, a storm can be categorized quickly. In contrast to investigating only data taken by the observation network, the grid approach is more objective, since stations are not equally distributed. Based on model data, the determined storm class provides one key factor for identifying similar historic events. Further aspects, such as region or specific event characteristics, complete the knowledge about the potential storm scale and result in a similarity function which automatically identifies useful events from the past. This work presents a case-based approach to estimate damages in the event of an extreme storm in Germany. The focus is on the similarity function, which is based on modeled storm classes, particularly wind speed and affected area. In order to determine possible damages more precisely, event-specific characteristics and region will be included. In the frame of determining similar storm events, neighboring storm classes will be
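
A toy sketch of such a similarity function: weighted nearest-neighbour retrieval over normalized storm features. The feature set, case values, and damage indices below are invented placeholders, not data from the study:

```python
import numpy as np

# toy historic cases: (peak wind speed m/s, grid points affected, damage index)
history = np.array([
    [25.0, 120.0, 1.0],
    [32.0, 400.0, 4.0],
    [38.0, 650.0, 9.0],
    [29.0, 300.0, 3.0],
])

def estimate_damage(wind, area, k=2):
    """Distance-weighted k-nearest-neighbour damage estimate."""
    feats = history[:, :2]
    mu, sd = feats.mean(axis=0), feats.std(axis=0)
    # normalise features so wind and area contribute comparably
    z = (feats - mu) / sd
    q = (np.array([wind, area]) - mu) / sd
    d = np.linalg.norm(z - q, axis=1)
    nearest = np.argsort(d)[:k]
    w = 1.0 / (d[nearest] + 1e-9)
    return float((w * history[nearest, 2]).sum() / w.sum())

est = estimate_damage(37.0, 600.0)
```

A fuller similarity function would add region and event-specific characteristics as extra (possibly categorical) features with their own weights, as the abstract outlines.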

  4. Element analysis: a wavelet-based method for analysing time-localized events in noisy time series

    NASA Astrophysics Data System (ADS)

    Lilly, Jonathan M.

    2017-04-01

    A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized `events'. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event's `region of influence' within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry.

  5. Combined multivariate data analysis of high-performance thin-layer chromatography fingerprints and direct analysis in real time mass spectra for profiling of natural products like propolis.

    PubMed

    Morlock, Gertrud E; Ristivojevic, Petar; Chernetsova, Elena S

    2014-02-07

    Sophisticated statistical tools are required to extract the full analytical power from high-performance thin-layer chromatography (HPTLC). Especially, the combination of HPTLC fingerprints (image) with chemometrics is rarely used so far. Also, the newly developed, instantaneous direct analysis in real time mass spectrometry (DART-MS) method is perspective for sample characterization and differentiation by multivariate data analysis. This is a first novel study on the differentiation of natural products using a combination of fast fingerprint techniques, like HPTLC and DART-MS, for multivariate data analysis. The results obtained by the chemometric evaluation of HPTLC and DART-MS data provided complementary information. The complexity, expense, and analysis time were significantly reduced due to the use of statistical tools for evaluation of fingerprints. The approach allowed categorizing 91 propolis samples from Germany and other locations based on their phenolic compound profile. A high level of confidence was obtained when combining orthogonal approaches (HPTLC and DART-MS) for ultrafast sample characterization. HPTLC with selective post-chromatographic derivatization provided information on polarity, functional groups and spectral properties of marker compounds, while information on possible elemental formulae of principal components (phenolic markers) was obtained by DART-MS. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Approximate Optimal Control of Affine Nonlinear Continuous-Time Systems Using Event-Sampled Neurodynamic Programming.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2017-03-01

    This paper presents an approximate optimal control of nonlinear continuous-time systems in affine form by using the adaptive dynamic programming (ADP) with event-sampled state and input vectors. The need for knowledge of the system dynamics is relaxed by using a neural network (NN) identifier with event-sampled inputs. The value function, which becomes an approximate solution to the Hamilton-Jacobi-Bellman equation, is generated by using an event-sampled NN approximator. Subsequently, the NN identifier and the approximated value function are utilized to obtain the optimal control policy. Both the identifier and value function approximator weights are tuned only at the event-sampled instants, leading to an aperiodic update scheme. A novel adaptive event-sampling condition is designed to determine the sampling instants, such that the approximation accuracy and the stability are maintained. A positive lower bound on the minimum inter-sample time is guaranteed to avoid an accumulation point, and the dependence of the inter-sample time upon the NN weight estimates is analyzed. Local ultimate boundedness of the resulting nonlinear impulsive dynamical closed-loop system is shown. Finally, a numerical example is utilized to evaluate the performance of the near-optimal design. The net result is the design of an event-sampled ADP-based controller for nonlinear continuous-time systems.
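
    The event-sampling idea at the heart of such schemes (update the controller only when the measurement error crosses a threshold) can be illustrated on a scalar linear plant. This is a toy sketch with a fixed threshold rather than the paper's adaptive NN-based condition; the plant and gain values are arbitrary.

```python
# Scalar plant dx/dt = a*x + b*u with event-sampled feedback u = -k*x_hat:
# x_hat is the state at the last sampling event, refreshed only when the
# gap |x - x_hat| exceeds the threshold delta.
a, b, k, delta, dt = 1.0, 1.0, 3.0, 0.05, 0.001
x, x_hat, samples = 1.0, 1.0, 0
for _ in range(20000):                # 20 time units of Euler integration
    if abs(x - x_hat) > delta:        # event-sampling condition
        x_hat, samples = x, samples + 1
    x += dt * (a * x + b * (-k * x_hat))
```

    The unstable open loop (a > 0) is regulated to a small neighbourhood of the origin while sampling only a tiny fraction of the 20000 integration steps, mirroring the aperiodic update scheme described above.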

  7. A LORETA study of mental time travel: similar and distinct electrophysiological correlates of re-experiencing past events and pre-experiencing future events.

    PubMed

    Lavallee, Christina F; Persinger, Michael A

    2010-12-01

    Previous studies exploring mental time travel paradigms with functional neuroimaging techniques have uncovered both common and distinct neural correlates of re-experiencing past events and pre-experiencing future events. A gap in the mental time travel literature exists, as paradigms have not explored the affective component of re-experiencing past episodic events; this study explored this sparsely researched area. The present study employed standardized low resolution electromagnetic tomography (sLORETA) to identify electrophysiological correlates of re-experiencing affect-laden and non-affective past events, as well as pre-experiencing a future anticipated event. Our results confirm previous research and are also novel in that we illustrate common and distinct electrophysiological correlates of re-experiencing affective episodic events. Furthermore, this experiment yields results outlining a pattern of activation in frontal and temporal regions that is correlated with the time frame of the past or future events subjects imagined.

  8. The origin and non-universality of the earthquake inter-event time distribution

    NASA Astrophysics Data System (ADS)

    Touati, S.; Naylor, M.; Main, I. G.

    2009-04-01

    Understanding the form and origin of the earthquake inter-event time distribution is vital for both the advancement of seismic hazard assessment models and the development of physically-based models of earthquake dynamics. Many authors have modelled regional earthquake inter-event times using a gamma distribution, whereby data collapse occurs under a simple rescaling of the data from different regions or time periods. We use earthquake data and simulations to present a new understanding of the form of the earthquake inter-event time distribution as essentially bimodal, and a physically-motivated explanation for its origin in terms of the interaction of separate aftershock sequences within the earthquake time series. Our insight into the origin of the bimodality is through stochastic simulations of the Epidemic-Type Aftershock Sequences (ETAS) model, a point process model based on well-known empirical laws of seismicity, in which we are able to keep track of the triggering "family" structure in the catalogue unlike with real seismicity. We explain the variation of the distribution shape with region size and show that it is not universal under rescaling by the mean event rate. The power-law segment in the gamma distribution usually used to model inter-earthquake times arises under some conditions as a crossover between the two peaks; the previous results supporting universality can be explained by strong data selection criteria in the form of a requirement for short-term stationarity in the event rate.
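
    The triggering "family" structure described above can be mimicked with the simplest self-exciting (Hawkes) point process: a constant background rate plus an exponentially decaying aftershock rate after each event, simulated with Ogata's thinning algorithm. This is a schematic stand-in for ETAS (no magnitudes, exponential rather than Omori decay); all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_hawkes(mu, alpha, beta, T):
    """Ogata thinning for a self-exciting point process with intensity
    lambda(t) = mu + alpha*beta*sum_i exp(-beta*(t - t_i)) over past events."""
    def intensity(t, past):
        if not past:
            return mu
        return mu + alpha * beta * np.sum(np.exp(-beta * (t - np.array(past))))

    t, events = 0.0, []
    while True:
        lam_bar = intensity(t, events)      # valid bound: intensity decays between events
        t += rng.exponential(1.0 / lam_bar)
        if t >= T:
            break
        if rng.uniform() * lam_bar <= intensity(t, events):
            events.append(t)
    return np.array(events)

# background rate 0.2; each event triggers on average alpha = 0.5 direct
# aftershocks, so the total rate is roughly mu / (1 - alpha)
events = simulate_hawkes(mu=0.2, alpha=0.5, beta=2.0, T=500.0)
waits = np.diff(events)
```

    Short intra-cluster waits and long between-cluster waits coexist in `waits`, which is the mixture behind the essentially bimodal inter-event time distribution discussed above.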

  9. A multivariate time-frequency method to characterize the influence of respiration over heart period and arterial pressure

    NASA Astrophysics Data System (ADS)

    Orini, Michele; Bailón, Raquel; Laguna, Pablo; Mainardi, Luca T.; Barbieri, Riccardo

    2012-12-01

    Respiratory activity introduces oscillations both in arterial pressure and heart period, through mechanical and autonomic mechanisms. Respiration, arterial pressure, and heart period are, generally, non-stationary processes and the interactions between them are dynamic. In this study we present a methodology to robustly estimate the time course of cross spectral indices to characterize dynamic interactions between respiratory oscillations of heart period and blood pressure, as well as their interactions with respiratory activity. Time-frequency distributions belonging to Cohen's class are used to estimate time-frequency (TF) representations of coherence, partial coherence and phase difference. The characterization is based on the estimation of the time course of cross spectral indices estimated in specific TF regions around the respiratory frequency. We used this methodology to describe the interactions between respiration, heart period variability (HPV) and systolic arterial pressure variability (SAPV) during tilt table test with both spontaneous and controlled respiratory patterns. The effect of selective autonomic blockade was also studied. Results suggest the presence of common underlying mechanisms of regulation between cardiovascular signals, whose interactions are time-varying. SAPV changes followed respiratory flow both in supine and standing positions and even after selective autonomic blockade. During head-up tilt, phase differences between respiration and SAPV increased. Phase differences between respiration and HPV were comparable to those between respiration and SAPV during supine position, and significantly increased during standing. As a result, respiratory oscillations in SAPV preceded respiratory oscillations in HPV during standing. Partial coherence was the most sensitive index to orthostatic stress. Phase difference estimates were consistent among spontaneous and controlled breathing patterns, whereas coherence was higher in spontaneous breathing.
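
    A static analogue of the cross-spectral part of this method can be sketched with ordinary Welch coherence between a respiration-like signal and a heart-period-like signal that share a 0.25 Hz oscillation. The sampling rate, amplitudes, and noise level are invented, and scipy's stationary coherence stands in for the authors' Cohen's-class time-frequency estimator.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 4.0                                   # Hz, plausible for resampled cardiovascular series
t = np.arange(0, 300, 1 / fs)
resp = np.sin(2 * np.pi * 0.25 * t)        # respiration at 0.25 Hz
hpv = 0.8 * np.sin(2 * np.pi * 0.25 * t + 0.6) + 0.3 * rng.standard_normal(t.size)

f, Cxy = coherence(resp, hpv, fs=fs, nperseg=256)
peak = Cxy[np.argmin(np.abs(f - 0.25))]    # coherence at the respiratory frequency
```

    Coherence near 1 at the respiratory frequency indicates strong linear coupling there; the time-frequency version estimated in the paper tracks how this quantity evolves during the tilt test.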

  10. Classification and Space-Time Analysis of Precipitation Events in Manizales, Caldas, Colombia.

    NASA Astrophysics Data System (ADS)

    Suarez Hincapie, J. N.; Vélez, J.; Romo Melo, L.; Chang, P.

    2015-12-01

    Manizales is a mid-mountain Andean city located near the Nevado del Ruiz volcano in west-central Colombia; this location exposes it to earthquakes, floods, landslides, and volcanic eruptions. It lies in the intertropical convergence zone (ITCZ) and has a climate with a bimodal rainfall regime (Cortés, 2010). Its mean annual rainfall is 2000 mm, and precipitation is observed on 70% of the days in a year. This rain, which favors the formation of large masses of clouds, together with macroclimatic phenomena such as the "El Niño Southern Oscillation", has historically caused great impacts in the region (Vélez et al., 2012). For example, the geographical location coupled with rain events results in a high risk of landslides in the city. Manizales has a hydrometeorological network of 40 stations that measure and transmit data on up to eight climate variables. Some of these stations keep 10 years of historical data. However, until now this information has not been used for space-time classification of precipitation events, nor have the meteorological variables that influence them been thoroughly researched. The purpose of this study was to classify historical rain events in an urban area of Manizales and to investigate patterns of atmospheric behavior that influence or trigger such events. Classification of events was performed by calculating the "n" index of heavy rainfall, which describes the behavior of precipitation as a function of time throughout the event (Monjo, 2009). The analysis of meteorological variables was performed using statistical quantification over variable time periods before each event. The proposed classification allowed for an analysis of the evolution of rainfall events. In particular, it helped to look for the influence of different meteorological variables triggering rainfall events in hazardous areas such as the city of Manizales.
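
    The "n" index used for the classification can be computed from an event's cumulative rainfall curve. This sketch assumes Monjo's functional form P(t)/P_total = (t/T)**(1 - n) and fits n by log-log least squares; the synthetic event is invented.

```python
import numpy as np

def n_index(rain):
    """Fit the heavy-rainfall n index, assuming the cumulative curve follows
    P(t)/P_total = (t/T)**(1 - n)."""
    P = np.cumsum(rain)
    frac_P = P / P[-1]
    frac_t = np.arange(1, len(rain) + 1) / len(rain)
    # log-log least squares: log(P/P_total) = (1 - n) * log(t/T)
    slope = np.polyfit(np.log(frac_t[:-1]), np.log(frac_P[:-1]), 1)[0]
    return 1.0 - slope

# synthetic 120-step event generated exactly from the model with n = 0.6
T = 120
frac_t = np.arange(1, T + 1) / T
cum = frac_t ** (1 - 0.6)
rain = np.diff(np.concatenate([[0.0], cum]))
n_hat = n_index(rain)
```

    Larger n means the rainfall is more concentrated in a short burst, which is what makes the index useful for classifying events by intensity profile.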

  11. Gunbarrel mafic magmatic event: A key 780 Ma time marker for Rodinia plate reconstructions

    USGS Publications Warehouse

    Harlan, S.S.; Heaman, L.; LeCheminant, A.N.; Premo, W.R.

    2003-01-01

    Precise U-Pb baddeleyite dating of mafic igneous rocks provides evidence for a widespread and synchronous magmatic event that extended for >2400 km along the western margin of the Neoproterozoic Laurentian craton. U-Pb baddeleyite analyses for eight intrusions from seven localities ranging from the northern Canadian Shield to northwestern Wyoming-southwestern Montana are statistically indistinguishable and yield a composite U-Pb concordia age for this event of 780.3 ± 1.4 Ma (95% confidence level). This 780 Ma event is herein termed the Gunbarrel magmatic event. The mafic magmatism of the Gunbarrel event represents the largest mafic dike swarm yet identified along the Neoproterozoic margin of Laurentia. The origin of the mafic magmatism is not clear, but may be related to mantle-plume activity or upwelling asthenosphere leading to crustal extension accompanying initial breakup of the supercontinent Rodinia and development of the proto-Pacific Ocean. The mafic magmatism of the Gunbarrel magmatic event at 780 Ma predates the voluminous magmatism of the 723 Ma Franklin igneous event of the northwestern Canadian Shield by ~60 m.y. The precise dating of the extensive Neoproterozoic Gunbarrel and Franklin magmatic events provides unique time markers that can ultimately be used for robust testing of Neoproterozoic continental reconstructions.

  12. Designing group sequential randomized clinical trials with time to event end points using a R function.

    PubMed

    Filleron, Thomas; Gal, Jocelyn; Kramar, Andrew

    2012-10-01

    Designing clinical trials with a time-to-event endpoint is a major and difficult task: it is necessary to compute the number of events and, in a second step, the required number of patients. Several commercial software packages are available for computing sample size in clinical trials with sequential designs and time-to-event endpoints, but few R functions are implemented. The purpose of this paper is to describe the features and use of the R function plansurvct.func, an add-on to the gsDesign package which, in one run of the program, calculates the number of events and required sample size, as well as the boundaries and corresponding p-values for a group sequential design. The use of plansurvct.func is illustrated by several examples and validated using East software. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
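
    The first step the abstract mentions, computing the number of events, is commonly done with Schoenfeld's approximation for the log-rank test, which is easy to state on its own. This is a sketch of that formula only, not the plansurvct.func or gsDesign code:

```python
from math import ceil, log
from statistics import NormalDist

def schoenfeld_events(hr, alpha=0.05, power=0.8, p1=0.5):
    """Approximate number of events for a two-arm log-rank test of hazard
    ratio `hr`; p1 is the allocation fraction in the first arm."""
    z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
    return ceil(z ** 2 / (p1 * (1 - p1) * log(hr) ** 2))

d = schoenfeld_events(hr=0.7)
```

    For a two-sided 5% test with 80% power and hazard ratio 0.7 under 1:1 allocation this gives 247 events; the required number of patients then follows from the anticipated event probability during follow-up.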

  13. The role of musical training in emergent and event-based timing.

    PubMed

    Baer, L H; Thibodeau, J L N; Gralnick, T M; Li, K Z H; Penhune, V B

    2013-01-01

    Musical performance is thought to rely predominantly on event-based timing involving a clock-like neural process and an explicit internal representation of the time interval. Some aspects of musical performance may rely on emergent timing, which is established through the optimization of movement kinematics, and can be maintained without reference to any explicit representation of the time interval. We predicted that musical training would have its largest effect on event-based timing, supporting the dissociability of these timing processes and the dominance of event-based timing in musical performance. We compared 22 musicians and 17 non-musicians on the prototypical event-based timing task of finger tapping and on the typically emergently timed task of circle drawing. For each task, participants first responded in synchrony with a metronome (Paced) and then responded at the same rate without the metronome (Unpaced). Analyses of the Unpaced phase revealed that non-musicians were more variable in their inter-response intervals for finger tapping compared to circle drawing. Musicians did not differ between the two tasks. Between groups, non-musicians were more variable than musicians for tapping but not for drawing. We were able to show that the differences were due to less timer variability in musicians on the tapping task. Correlational analyses of movement jerk and inter-response interval variability revealed a negative association for tapping and a positive association for drawing in non-musicians only. These results suggest that musical training affects temporal variability in tapping but not drawing. Additionally, musicians and non-musicians may be employing different movement strategies to maintain accurate timing in the two tasks. These findings add to our understanding of how musical training affects timing and support the dissociability of event-based and emergent timing modes.

  14. The role of musical training in emergent and event-based timing

    PubMed Central

    Baer, L. H.; Thibodeau, J. L. N.; Gralnick, T. M.; Li, K. Z. H.; Penhune, V. B.

    2013-01-01

    Introduction: Musical performance is thought to rely predominantly on event-based timing involving a clock-like neural process and an explicit internal representation of the time interval. Some aspects of musical performance may rely on emergent timing, which is established through the optimization of movement kinematics, and can be maintained without reference to any explicit representation of the time interval. We predicted that musical training would have its largest effect on event-based timing, supporting the dissociability of these timing processes and the dominance of event-based timing in musical performance. Materials and Methods: We compared 22 musicians and 17 non-musicians on the prototypical event-based timing task of finger tapping and on the typically emergently timed task of circle drawing. For each task, participants first responded in synchrony with a metronome (Paced) and then responded at the same rate without the metronome (Unpaced). Results: Analyses of the Unpaced phase revealed that non-musicians were more variable in their inter-response intervals for finger tapping compared to circle drawing. Musicians did not differ between the two tasks. Between groups, non-musicians were more variable than musicians for tapping but not for drawing. We were able to show that the differences were due to less timer variability in musicians on the tapping task. Correlational analyses of movement jerk and inter-response interval variability revealed a negative association for tapping and a positive association for drawing in non-musicians only. Discussion: These results suggest that musical training affects temporal variability in tapping but not drawing. Additionally, musicians and non-musicians may be employing different movement strategies to maintain accurate timing in the two tasks. These findings add to our understanding of how musical training affects timing and support the dissociability of event-based and emergent timing modes. PMID:23717275

  15. Timing of the most recent surface rupture event on the Ohariu Fault near Paraparaumu, New Zealand

    USGS Publications Warehouse

    Litchfield, N.; Van Dissen, R.; Langridge, Rob; Heron, D.; Prentice, C.

    2004-01-01

    Thirteen radiocarbon ages from three trenches across the Ohariu Fault tightly constrain the timing of the most recent surface rupture event at Muaupoko Stream valley, c. 2 km east of Paraparaumu, to between 930 and 1050 cal. yr BP. This age overlaps with previously published ages of the most recent event on the Ohariu Fault and together they further constrain the event to 1000-1050 cal. yr BP. Two trenches provide loose constraints on the maximum recurrence interval at 3-7000 yr. Tephra, most probably the Kawakawa Tephra, was found within alluvial fan deposits in two of the trenches. © The Royal Society of New Zealand 2004.

  16. Peri-event cross-correlation over time for analysis of interactions in neuronal firing.

    PubMed

    Paiva, António R C; Park, Il; Sanchez, Justin C; Príncipe, José C

    2008-01-01

    Several methods have been described in the literature to verify the presence of couplings between neurons in the brain. In this paper we introduce the peri-event cross-correlation over time (PECCOT) to describe the interaction between two neurons as a function of the event onset. Instead of averaging over time, the PECCOT averages the cross-correlation over instances of the event. As a consequence, the PECCOT is able to characterize with high temporal resolution the interactions over time between the neurons. To illustrate the method, the PECCOT is applied to a simulated dataset and to the analysis of synchrony in recordings of a rat performing a go/no go behavioral lever press task. We verify the presence of synchrony before the lever press time and its suppression afterwards.
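
    The defining trick, averaging the cross-correlation over event instances rather than over time, can be sketched directly. The window length, lag range, and synthetic spike trains below are invented for illustration.

```python
import numpy as np

def peccot(spike_a, spike_b, event_onsets, window=20, max_lag=5):
    """Peri-event cross-correlation: correlate the two trains inside a window
    after each event onset, then average over events (not over time)."""
    lags = np.arange(-max_lag, max_lag + 1)
    out = np.zeros(lags.size)
    for onset in event_onsets:
        a = spike_a[onset:onset + window]
        b = spike_b[onset:onset + window]
        # sum_t a[t] * b[t + lag] for every lag
        out += np.array([np.sum(a * np.roll(b, -lag)) for lag in lags])
    return lags, out / len(event_onsets)

# synthetic trains: around every event, neuron B fires 2 samples after neuron A
n = 1000
a = np.zeros(n)
b = np.zeros(n)
events = [100, 300, 500, 700]
for e in events:
    a[e + 3] = 1
    b[e + 5] = 1          # fixed 2-sample delay

lags, cc = peccot(a, b, events)
```

    The averaged correlogram peaks at lag +2, recovering the fixed delay between the two neurons around the event with single-sample resolution.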

  17. Diagnosis of Discrete Event System with Linear-Time Temporal Logic Proposition

    NASA Astrophysics Data System (ADS)

    Zanma, Tadanao; Aoyama, Shigeru; Ishida, Muneaki

    Diagnosis of discrete event systems has been widely investigated. In this paper, the authors examine a state estimation problem for a system modeled by a finite state automaton in which each state has its corresponding logical formulas. We formalize a diagnosis problem for the truth values of the atomic propositions which constitute the logical formulas. Our approach to the problem is based on discrete event system theory and the use of linear-time temporal logic.
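
    The state-estimation core of such a diagnoser can be sketched as a set-valued observer over a finite automaton; the LTL layer (evaluating the logical formulas attached to each estimated state) is omitted, and the toy automaton below is invented.

```python
# Set-valued state estimation for a finite automaton: after each observed
# event, the estimate is the set of states reachable from the previous
# estimate by a transition labelled with that event.
def observe(estimate, event, delta):
    return {s2 for s in estimate for (e, s2) in delta.get(s, []) if e == event}

# toy automaton: delta maps state -> list of (event, next_state)
delta = {
    "q0": [("a", "q1"), ("b", "q2")],
    "q1": [("a", "q2")],
    "q2": [("b", "q0")],
}

est = {"q0", "q1"}               # initial uncertainty: q0 or q1
est = observe(est, "a", delta)   # -> {"q1", "q2"}
est = observe(est, "b", delta)   # -> {"q0"}
```

    Once the estimate narrows to states whose attached formulas all agree on an atomic proposition, its truth value is diagnosed.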

  18. Simulations of barrier traversal and reflection times based on event enhanced quantum theory

    NASA Astrophysics Data System (ADS)

    Ruschhaupt, Andreas

    1998-12-01

    The formalism of event enhanced quantum theory is used to simulate traversal and reflection times of electrons through a one-dimensional barrier. The dependence of these times on the parameters of the barrier and the detectors is examined and the results are compared with those of selected approaches.

  19. Development of a time-oriented data warehouse based on a medical information event model.

    PubMed

    Yamamoto, Yuichiro; Namikawa, Hirokazu; Inamura, Kiyonari

    2002-01-01

    We designed a new medical information event model and developed a time-oriented data warehouse based on it. Here, the medical information event is the basic data unit handled by a medical information system. The timing of decision making and treatment for a patient is sometimes very critical, so the time-oriented data warehouse was developed to provide a search feature on the time axis. Our medical information event model has a uniquely simple data structure. PC-ORDERING2000, developed by NEC on Oracle, had about 600 pages of tables; we reduced these 600 complicated data structures to a single, simple event model. By shifting clinical data from the old order entry system into the new order entry system based on the medical information event model, we produced a simple and flexible system and realized easy secondary use of patients' clinical data. Evaluation of our system revealed improved data retrieval efficiency and a 600-fold shorter response time at a terminal, owing to the 600-fold reduction in the number of tables mentioned above.
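
    The appeal of a single event model is that every record reduces to the same few fields and a time-axis search becomes one filter over one table. A minimal sketch; the field names and example events are invented, not the paper's schema.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class MedicalEvent:
    """One row of the single event table: who, when, what kind, what value."""
    patient_id: str
    timestamp: datetime
    event_type: str      # e.g. "order", "lab", "medication"
    value: str

def events_in_range(events: List[MedicalEvent], patient_id: str,
                    start: datetime, end: datetime) -> List[MedicalEvent]:
    """Time-axis search: every event for one patient inside [start, end)."""
    return sorted((e for e in events
                   if e.patient_id == patient_id and start <= e.timestamp < end),
                  key=lambda e: e.timestamp)

log = [
    MedicalEvent("p1", datetime(2002, 1, 5, 9), "lab", "WBC 6.2"),
    MedicalEvent("p1", datetime(2002, 1, 7, 14), "order", "chest X-ray"),
    MedicalEvent("p2", datetime(2002, 1, 6, 8), "lab", "WBC 11.0"),
]
hits = events_in_range(log, "p1", datetime(2002, 1, 1), datetime(2002, 1, 6))
```

    Secondary use of clinical data then needs no per-table logic: any analysis iterates over one uniform event stream.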

  20. Time accuracy of a barcode system for recording resuscitation events: laboratory trials.

    PubMed

    Stewart, J A; Short, F A

    1999-11-01

    Barcode systems for recording clinical data from resuscitation attempts offer the prospect of more complete and time-accurate data collection; in addition, collection of data in digital form and the resulting ease of computer processing promise to facilitate data analysis for quality improvement and research. We conducted trials of such a barcode system, recording events during a videotaped, simulated in-hospital resuscitation, with particular attention to time accuracy. Nine subjects watched a videotape of a simulated cardiac resuscitation, recording events first with the barcode system and then with a conventional handwritten form. Recorded times were compared to an accurate record of events (gold standard) from the videotape. Mean absolute errors and standard deviations of errors from the gold standard were significantly smaller with the barcode system (P < 0.01 for both). Numbers of event omissions did not differ significantly. The barcode system is more accurate than conventional handwritten recording in capturing event times from a simulated resuscitation. The system shows promise as a means to improve time accuracy of resuscitation records.

  1. November 2004 space weather events: Real-time observations and forecasts

    NASA Astrophysics Data System (ADS)

    Trichtchenko, L.; Zhukov, A.; van der Linden, R.; Stankov, S. M.; Jakowski, N.; Stanisławska, I.; Juchnikowski, G.; Wilkinson, P.; Patterson, G.; Thomson, A. W. P.

    2007-06-01

    Space weather events with their solar origin and their distribution through the heliosphere affect the whole magnetosphere-ionosphere-Earth system. Their real-time monitoring and forecasting are important for science and technology. Here we discuss one of the largest space weather events of Solar Cycle 23, in November 2004, which was also one of the most difficult periods to forecast. Nine halo coronal mass ejections (CMEs), interacting on their way through the interplanetary medium and forming two geoeffective interplanetary structures, exemplify the complexity of the event. Real-time and quasi-real-time observations of the ground geomagnetic field show rapid and extensive expansion of the auroral oval to 55° in geomagnetic latitude accompanied by great variability of the ionosphere. Geomagnetically induced currents (GICs) seen in ground networks, such as power grids and pipelines, were significant during the event, although no problems were reported. Forecasts of the CME propagation, global and local ground geomagnetic activity, and ionospheric parameters, issued by several regional warning centers, revealed certain deficiencies in predictions of the interplanetary characteristics of the CME, size of the geomagnetic disturbances, and complexity of the ionospheric variations produced by this event. This paper is a collective report based on the materials presented at the splinter session on November 2004 events during the first European Space Weather Week.

  2. Relaxation times in single event electrospraying controlled by nozzle front surface modification.

    PubMed

    Stachewicz, Urszula; Dijksman, J Frits; Burdinski, Dirk; Yurteri, Caner U; Marijnissen, Jan C M

    2009-02-17

    Single event electrospraying (SEE) is a method for on-demand deposition of femtoliter to picoliter volumes of fluids. To determine the influence of the size of the meniscus on the characteristics of the single event electrospraying process, glass capillaries were used with and without an antiwetting coating comprising a self-assembled 1H,1H,2H,2H-perfluorodecyltrichlorosilane-based monolayer to control the meniscus size. A large difference was found between the driving conditions needed for single event electrospraying from a small meniscus and those needed for a large meniscus. Furthermore, after studying the different time constants related to the electrical and the hydrodynamic phenomena, we are able to explain the timing limitations of the deposition process from both a small and a large meniscus. The hydrodynamic relaxation time is significantly reduced in the case of the modified capillary, and the timing of SEE, which determines the deposition time, is limited by the resistor-capacitor (RC) time of the electrical circuit needed to drive the SEE. We have built a model that describes the almost one-dimensional motion of the liquid in the capillary during pulsing. The model has been used to estimate the hydrodynamic relaxation times related to the meniscus-to-cone and cone-to-meniscus transitions during SEE. By confining the meniscus to the inner diameter of the nozzle, we are able to deposit a volume smaller than 5 pL per SEE.

  3. Unbiased metabolite profiling by liquid chromatography-quadrupole time-of-flight mass spectrometry and multivariate data analysis for herbal authentication: classification of seven Lonicera species flower buds.

    PubMed

    Gao, Wen; Yang, Hua; Qi, Lian-Wen; Liu, E-Hu; Ren, Mei-Ting; Yan, Yu-Ting; Chen, Jun; Li, Ping

    2012-07-06

    Plant-based medicines are becoming increasingly popular around the world. Authentication of herbal raw materials is important to ensure their safety and efficacy. Some herbs belonging to closely related species but differing in medicinal properties are difficult to identify because of similar morphological and microscopic characteristics. Chromatographic fingerprinting is an alternative method to distinguish them, but existing approaches do not allow a comprehensive analysis for herbal authentication. We have now developed a strategy consisting of (1) full metabolic profiling of herbal medicines by rapid resolution liquid chromatography (RRLC) combined with quadrupole time-of-flight mass spectrometry (QTOF MS), (2) global analysis of non-targeted compounds by a molecular feature extraction algorithm, (3) multivariate statistical analysis for classification and prediction, and (4) marker compound characterization. This approach has provided a fast and unbiased comparative multivariate analysis of the metabolite composition of 33 batches of samples covering seven Lonicera species. Individual metabolic profiles are obtained at the level of molecular fragments without prior structural assignment. In the entire set, the obtained classifier for the flower buds of seven Lonicera species showed good prediction performance, and a total of 82 statistically different components were rapidly obtained by the strategy. The elemental compositions of discriminative metabolites were characterized by accurate mass measurement of the pseudomolecular ions, and their chemical types were assigned from the MS/MS spectra. The high-resolution, comprehensive and unbiased strategy for metabolite data analysis presented here is powerful and opens a new direction for authentication in herbal analysis.
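
    The multivariate step, separating species from full fingerprints, reduces to projecting a samples-by-features intensity matrix onto its principal axes. This numpy sketch uses synthetic data (two species, five invented "marker" features) rather than real RRLC-QTOF profiles, and simple PCA stands in for whatever classifier the authors trained.

```python
import numpy as np

rng = np.random.default_rng(7)

# synthetic "fingerprints": 2 species x 10 samples x 50 molecular features;
# the species differ in the mean intensity of the first 5 features
base = rng.normal(0.0, 1.0, (20, 50))
base[:10, :5] += 4.0                 # species A marker compounds
X = base - base.mean(axis=0)         # mean-centre before PCA

# PCA via SVD: rows of Vt are the principal axes, X @ Vt[0] the PC1 scores
U, S, Vt = np.linalg.svd(X, full_matrices=False)
pc1 = X @ Vt[0]

labels = np.array([0] * 10 + [1] * 10)
```

    Because the between-species difference dominates the variance, the two groups separate cleanly along the first principal component; the discriminative loadings in Vt[0] point back to the marker features.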

  4. Estimation of intervention effects using recurrent event time data in the presence of event dependence and a cured fraction.

    PubMed

    Xu, Ying; Lam, K F; Cheung, Yin Bun

    2014-06-15

    Recurrent event data with a fraction of subjects having zero events are often seen in randomized clinical trials. Those with zero events may belong to a cured (or non-susceptible) fraction. Event dependence refers to the situation that a person's past event history affects his future event occurrences. In the presence of event dependence, an intervention may have an impact on the event rate in the non-cured through two pathways: a primary effect directly on the outcome event and a secondary effect mediated through event dependence. The primary effect combined with the secondary effect is the total effect. We propose a frailty mixture model and a two-step estimation procedure for the estimation of the effect of an intervention on the probability of cure and the total effect on the event rate in the non-cured. A summary measure of intervention effects is derived. The performance of the proposed model is evaluated by simulation. Data on respiratory exacerbations from a randomized, placebo-controlled trial are re-analyzed for illustration.
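
    The data structure this model targets can be generated directly: a cure indicator plus a shared gamma frailty that induces within-subject event dependence in a Poisson count. This is a data-generating sketch, not the authors' two-step estimator; all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(n, cure_prob, base_rate, frailty_var, follow_up):
    """Recurrent event counts under a frailty mixture: a cured fraction has
    zero events; the rest share a gamma frailty z (mean 1, variance
    frailty_var) multiplying a Poisson event rate."""
    cured = rng.uniform(size=n) < cure_prob
    shape = 1.0 / frailty_var
    z = rng.gamma(shape, frailty_var, size=n)       # E[z] = 1, Var[z] = frailty_var
    counts = rng.poisson(z * base_rate * follow_up)
    counts[cured] = 0
    return cured, counts

cured, counts = simulate(n=2000, cure_prob=0.3, base_rate=0.5,
                         frailty_var=0.8, follow_up=2.0)
```

    The frailty inflates the count variance above the Poisson mean, which is the overdispersion signature of event dependence that the mixture model separates from the zero-inflation due to cure.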

  5. Prospective memory while driving: comparison of time- and event-based intentions.

    PubMed

    Trawley, Steven L; Stephens, Amanda N; Rendell, Peter G; Groeger, John A

    2017-06-01

    Prospective memories can divert attentional resources from ongoing activities. However, it is unclear whether these effects, and the theoretical accounts that seek to explain them, will generalise to a complex real-world task such as driving. Twenty-four participants drove two simulated routes while maintaining a fixed headway with a lead vehicle. Drivers were given either event-based (e.g. arriving at a filling station) or time-based errands (e.g. on-board clock shows 3:30). In contrast to the predominant view in the literature, which suggests time-based tasks are more demanding, drivers given event-based errands showed greater difficulty in mirroring lead vehicle speed changes compared to the time-based group. Results suggest that common everyday secondary tasks, such as scouting the roadside for a bank, may have a detrimental impact on driving performance. The additional finding that this cost was only evident with the event-based task highlights a potential area of both theoretical and practical interest. Practitioner Summary: Drivers were given either time- or event-based errands whilst engaged in a simulated drive. We examined the effect of errands on an ongoing vehicle-follow task. In contrast to previous non-driving studies, event-based errands are more disruptive. Common everyday errands may have a detrimental impact on driving performance.


  6. Number needed to treat for time-to-event data with competing risks.

    PubMed

    Gouskova, Natalia A; Kundu, Suprateek; Imrey, Peter B; Fine, Jason P

    2014-01-30

    The number needed to treat is a tool often used in clinical settings to illustrate the effect of a treatment. It has been widely adopted in the communication of risks to both clinicians and non-clinicians, such as patients, who are better able to understand this measure than absolute risk or rate reductions. The concept was introduced by Laupacis, Sackett, and Roberts in 1988 for binary data, and extended to time-to-event data by Altman and Andersen in 1999. However, up to the present, there is no definition of the number needed to treat for time-to-event data with competing risks. This paper introduces such a definition using the cumulative incidence function and suggests non-parametric and semi-parametric inferential methods for right-censored time-to-event data in the presence of competing risks. The procedures are illustrated using the data from a breast cancer clinical trial. Copyright © 2013 John Wiley & Sons, Ltd.
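
    With competing risks, the number needed to treat at a time horizon becomes the reciprocal of a difference in cumulative incidence functions. In this toy, uncensored example the CIF is a simple empirical proportion (under censoring one would need the Aalen-Johansen estimator, and the paper's inferential methods); the two arms' data are invented.

```python
import numpy as np

def cif(times, causes, t, cause=1):
    """Empirical cumulative incidence of `cause` by time t (no censoring):
    the fraction of subjects with an event of that cause at or before t."""
    times, causes = np.asarray(times), np.asarray(causes)
    return np.mean((times <= t) & (causes == cause))

# toy arms: cause 1 is the event of interest, cause 2 the competing risk
control_t = [1, 2, 2, 3, 4, 5, 6, 8, 9, 12]
control_c = [1, 1, 2, 1, 1, 2, 1, 1, 1, 1]
treat_t   = [2, 3, 4, 5, 6, 7, 8, 9, 11, 12]
treat_c   = [1, 2, 1, 2, 2, 1, 2, 1, 2, 1]

t_star = 6.0
risk_diff = cif(control_t, control_c, t_star) - cif(treat_t, treat_c, t_star)
nnt = 1.0 / risk_diff          # number needed to treat at t* = 6
```

    Here CIF_control(6) = 0.5 and CIF_treat(6) = 0.2, so roughly 10/3 patients must be treated to prevent one cause-1 event by time 6; the competing-risk events are counted in neither numerator.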

  7. BACKWARD ESTIMATION OF STOCHASTIC PROCESSES WITH FAILURE EVENTS AS TIME ORIGINS

    PubMed Central

    Gary Chan, Kwun Chuen; Wang, Mei-Cheng

    2011-01-01

    Stochastic processes often exhibit sudden systematic changes in pattern a short time before certain failure events. Examples include increases in medical costs before death and decreases in CD4 counts before AIDS diagnosis. To study such terminal behavior of stochastic processes, a natural and direct way is to align the processes using failure events as time origins. This paper studies backward stochastic processes counting time backward from failure events, and proposes one-sample nonparametric estimation of the mean of backward processes when follow-up is subject to left truncation and right censoring. We discuss the benefits of including prevalent cohort data to enlarge the identifiable region, as well as large sample properties of the proposed estimator and related extensions. A SEER–Medicare linked data set is used to illustrate the proposed methodologies. PMID:21359167
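
    Aligning processes at their failure times and counting time backward can be sketched in a few lines. The toy cost trajectories are invented, and there is no truncation or censoring here, which is where the paper's actual methodology lies.

```python
import numpy as np

def backward_mean(trajectories, death_idx, span):
    """Mean of the backward processes X(T - s), s = 0..span-1, where each
    trajectory is aligned at its own failure time T (index death_idx)."""
    aligned = np.array([traj[d - span + 1:d + 1][::-1]
                        for traj, d in zip(trajectories, death_idx)])
    return aligned.mean(axis=0)    # element s = mean value s steps before failure

# toy monthly costs that ramp up before deaths at different calendar times
costs = [np.concatenate([np.ones(10), np.array([2.0, 4.0, 8.0])]),
         np.concatenate([np.ones(6), np.array([2.0, 4.0, 8.0])])]
death = [12, 8]
m = backward_mean(costs, death, span=3)
```

    Although the two subjects die at different calendar times, the backward alignment recovers the common terminal pattern (8 at death, 4 one step before, 2 two steps before), which forward, enrollment-aligned averaging would smear out.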

  8. Multivariate Analyses and Classification of Inertial Sensor Data to Identify Aging Effects on the Timed-Up-and-Go Test.

    PubMed

    Vervoort, Danique; Vuillerme, Nicolas; Kosse, Nienke; Hortobágyi, Tibor; Lamoth, Claudine J C

    2016-01-01

    Many tests can crudely quantify age-related mobility decline, but instrumented versions of mobility tests could increase their specificity and sensitivity. The Timed-Up-and-Go (TUG) test includes several elements that people use in daily life. The test has different transition phases: rise from a chair, walk, 180° turn, walk back, turn, and sit down on a chair. For this reason, the TUG is an often-used test for evaluating, in a standardized way, possible decline in balance and walking ability due to age and/or pathology. Using inertial sensors, qualitative information about the performance of the sub-phases can provide more specific information about a decline in balance and walking ability. The first aim of our study was to identify variables extracted from the instrumented timed-up-and-go (iTUG) that most effectively distinguished performance differences across age (age 18-75). Second, we determined the discriminative ability of those identified variables to classify a younger (age 18-45) and an older age group (age 46-75). From healthy adults (n = 59), trunk accelerations and angular velocities were recorded during iTUG performance. iTUG phases were detected with wavelet analysis. Using a Partial Least Squares (PLS) model, the variables that explained most of the covariance between the 72 iTUG variables calculated across phases and age were extracted. Subsequently, a PLS discriminant analysis (PLS-DA) assessed the classification power of the identified iTUG variables to discriminate the age groups. Twenty-seven variables, related to turning, walking, and the stand-to-sit movement, explained 71% of the variation in age. The PLS-DA with these 27 variables showed a sensitivity of 90% and a specificity of 85%. Based on this model, the iTUG can accurately distinguish young and older adults. Such data can serve as a reference for pathological aging with respect to a widely used mobility test. Mobility tests like the TUG, supplemented with smart technology, could be used in clinical practice.

  9. Multivariate Analyses and Classification of Inertial Sensor Data to Identify Aging Effects on the Timed-Up-and-Go Test

    PubMed Central

    Vervoort, Danique; Vuillerme, Nicolas; Kosse, Nienke; Hortobágyi, Tibor; Lamoth, Claudine J. C.

    2016-01-01

    Many tests can crudely quantify age-related mobility decline, but instrumented versions of mobility tests could increase their specificity and sensitivity. The Timed-Up-and-Go (TUG) test includes several elements that people use in daily life. The test has different transition phases: rise from a chair, walk, 180° turn, walk back, turn, and sit down on a chair. For this reason, the TUG is an often-used test for evaluating, in a standardized way, possible decline in balance and walking ability due to age and/or pathology. Using inertial sensors, qualitative information about the performance of the sub-phases can provide more specific information about a decline in balance and walking ability. The first aim of our study was to identify variables extracted from the instrumented timed-up-and-go (iTUG) that most effectively distinguished performance differences across age (age 18–75). Second, we determined the discriminative ability of those identified variables to classify a younger (age 18–45) and an older age group (age 46–75). From healthy adults (n = 59), trunk accelerations and angular velocities were recorded during iTUG performance. iTUG phases were detected with wavelet analysis. Using a Partial Least Squares (PLS) model, the variables that explained most of the covariance between the 72 iTUG variables calculated across phases and age were extracted. Subsequently, a PLS discriminant analysis (PLS-DA) assessed the classification power of the identified iTUG variables to discriminate the age groups. Twenty-seven variables, related to turning, walking, and the stand-to-sit movement, explained 71% of the variation in age. The PLS-DA with these 27 variables showed a sensitivity of 90% and a specificity of 85%. Based on this model, the iTUG can accurately distinguish young and older adults. Such data can serve as a reference for pathological aging with respect to a widely used mobility test. Mobility tests like the TUG, supplemented with smart technology, could be used in clinical

  10. Neural Correlates of the Time Marker for the Perception of Event Timing

    PubMed Central

    Qi, Liang; Terada, Yoshikazu; Nishida, Shin’ya

    2016-01-01

    While sensory processing latency, inferred from the manual reaction time (RT), is substantially affected by diverse stimulus parameters, subjective temporal judgments are relatively accurate. The neural mechanisms underlying this timing perception remain obscure. Here, we measured human neural activity by magnetoencephalography while participants performed a simultaneity judgment task between the onset of random-dot coherent motion and a beep. In a separate session, participants performed an RT task for the same stimuli. We analyzed the relationship between neural activity evoked by motion onset and point of subjective simultaneity (PSS) or RT. The effect of motion coherence was smaller for PSS than RT, but changes in RT and PSS could both be predicted by the time at which an integrated sensory response crossed a threshold. The task differences could be ascribed to the lower threshold for PSS than for RT. In agreement with the psychophysical threshold difference, the participants reported longer delays in their motor response from the subjective motion onset for weaker stimuli. However, they could not judge the timing of stimuli weaker than the detection threshold. A possible interpretation of the present findings is that the brain assigns the time marker for timing perception prior to stimulus detection, but the time marker is available only after stimulus detection. PMID:27679810

  11. Joint modelling of repeated measurements and time-to-event outcomes: the fourth Armitage lecture.

    PubMed

    Diggle, Peter J; Sousa, Inês; Chetwynd, Amanda G

    2008-07-20

    In many longitudinal studies, the outcomes recorded on each subject include both a sequence of repeated measurements at pre-specified times and the time at which an event of particular interest occurs: for example, death, recurrence of symptoms or drop out from the study. The event time for each subject may be recorded exactly, interval censored or right censored. The term joint modelling refers to the statistical analysis of the resulting data while taking account of any association between the repeated measurement and time-to-event outcomes. In this paper, we first discuss different approaches to joint modelling and argue that the analysis strategy should depend on the scientific focus of the study. We then describe in detail a particularly simple, fully parametric approach. Finally, we use this approach to re-analyse data from a clinical trial of drug therapies for schizophrenic patients, in which the event time is an interval-censored or right-censored time to withdrawal from the study due to adverse side effects.

  12. Using a pictorial timeline to assess age-related changes in time estimation of daily events.

    PubMed

    Yu, Jing; Cheng, Heben; Peng, Peng

    2016-02-01

    How do older adults compare with younger adults in estimating the timing of daily events, such as heating a meal, keeping an appointment, or taking medication? In Experiment 1, we used a pictorial timeline method to examine age-related changes in how people estimate the time involved in daily events. We also conducted a spatial processing task to control for possible age-related bias in spatial processing. Findings showed that older adults projected smaller windows of time on the timeline to represent the duration of events than did younger adults, which indicates that older adults underestimate time duration. However, older adults also projected smaller windows in the spatial task, which creates ambiguity in interpreting the reduced duration estimates among older adults. In Experiment 2, we administered an improved timeline task and spatial task that were comparable in difficulty between age groups and used defined endpoints of the reference line. Consistent with findings from Experiment 1, older adults projected a smaller time window than their younger counterparts, whereas the two age groups showed no differences in estimating spatial distances in the improved spatial experiment. Taken together, our findings suggest that older adults make shorter estimates of the duration of an event than younger adults, and that these age differences are due to age-related differences in orientation to time rather than to a general bias in spatial processing. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Visual and Real-Time Event-Specific Loop-Mediated Isothermal Amplification Based Detection Assays for Bt Cotton Events MON531 and MON15985.

    PubMed

    Randhawa, Gurinder Jit; Chhabra, Rashmi; Bhoge, Rajesh K; Singh, Monika

    2015-01-01

    Bt cotton events MON531 and MON15985 are authorized for commercial cultivation in more than 18 countries. In India, four Bt cotton events have been commercialized; more than 95% of the total area under genetically modified (GM) cotton cultivation comprises events MON531 and MON15985. The present study reports the development of efficient event-specific visual and real-time loop-mediated isothermal amplification (LAMP) assays for detection and identification of cotton events MON531 and MON15985. The efficiency of the LAMP assays was compared with conventional and real-time PCR assays. The real-time LAMP assay was the most time-efficient and sensitive, detecting as few as two target copies within 35 min. The developed real-time LAMP assays, when combined with an efficient DNA extraction kit/protocol, may facilitate onsite GM detection to check the authenticity of Bt cotton seeds.

  14. Time difference of arrival to blast localization of potential chemical/biological event on the move

    NASA Astrophysics Data System (ADS)

    Morcos, Amir; Desai, Sachi; Peltzer, Brian; Hohil, Myron E.

    2007-10-01

    By integrating a sensor suite able to discriminate potential chemical/biological (CB) events from high-explosive (HE) events, employing a standalone acoustic sensor with a Time Difference of Arrival (TDOA) algorithm, we developed a cueing mechanism for more power-intensive and range-limited sensing techniques. Once the event detection algorithm localizes a blast event using TDOA, we provide further information on the event: launch or impact, and CB or HE. This added information is provided to a range-limited chemical sensing system that exploits spectroscopy to determine the contents of the chemical event. The main innovation of this sensor suite is that the system provides this information on the move, while the chemical sensor has adequate time to determine the contents of the event from a safe stand-off distance. The CB/HE discrimination algorithm exploits acoustic sensors to provide early detection and identification of CB attacks. Distinct characteristics arise within the different airburst signatures because HE warheads emphasize concussive and shrapnel effects, while CB warheads are designed to disperse their contents over large areas, and therefore employ a slower-burning, less intense explosive to mix and spread their contents. The differences are characterized by variations in the corresponding peak pressure and rise time of the blast, differences in the ratio of positive pressure amplitude to negative amplitude, and variations in the overall duration of the resulting waveform. The discrete wavelet transform (DWT) is used to extract the predominant components of these characteristics from airburst signatures at ranges exceeding 3 km. Highly reliable discrimination is achieved with a feed-forward neural network classifier trained on a feature space derived from the distribution of wavelet coefficients and the higher-frequency details found within different levels of the multiresolution decomposition. The development of an adaptive noise
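
    The TDOA idea underlying the cueing mechanism can be sketched with a standard cross-correlation delay estimate between two sensor recordings; the synthetic pulse, sampling rate, and sensor names below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def estimate_tdoa(sig_a, sig_b, fs):
    """Estimate the time difference of arrival of the same transient recorded
    by two sensors, via the peak of the full cross-correlation.
    A positive result means the event reaches sensor A *after* sensor B."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)   # lag in samples
    return lag / fs                            # lag in seconds

# Synthetic blast: the same decaying pulse arrives 25 samples later at sensor A.
fs = 1000.0
pulse = np.exp(-np.arange(50) / 5.0)
sig_b = np.zeros(500); sig_b[100:150] = pulse
sig_a = np.zeros(500); sig_a[125:175] = pulse
print(estimate_tdoa(sig_a, sig_b, fs))  # 0.025 s
```

    With three or more sensors, the pairwise delays constrain the source to intersecting hyperbolae, which is how a blast can be localized from acoustic data alone.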

  15. Low time resolution analysis of polar ice cores cannot detect impulsive nitrate events

    NASA Astrophysics Data System (ADS)

    Smart, D. F.; Shea, M. A.; Melott, A. L.; Laird, C. M.

    2014-12-01

    Ice cores are archives of climate change and possibly large solar proton events (SPEs). Wolff et al. (2012) used a single event, a nitrate peak in the GISP2-H core, which McCracken et al. (2001a) associated in time with the poorly quantified 1859 Carrington event, to discredit SPE-produced, impulsive nitrate deposition in polar ice. This is not the ideal test case. We critique the Wolff et al. analysis and demonstrate that the data they used cannot detect impulsive nitrate events because of resolution limitations. We suggest reexamination of the top of the Greenland ice sheet at key intervals over the last two millennia with attention to fine resolution and replicate sampling of multiple species. This will allow further insight into polar depositional processes on a subseasonal scale, including atmospheric sources, transport mechanisms to the ice sheet, postdepositional interactions, and a potential SPE association.

  16. Effective target binarization method for linear timed address-event vision system

    NASA Astrophysics Data System (ADS)

    Xu, Jiangtao; Zou, Jiawei; Yan, Shi; Gao, Zhiyuan

    2016-06-01

    This paper presents an effective target binarization method for a linear timed address-event (TAE) vision system. In the preprocessing phase, TAE data are processed sequentially by denoising, thinning, and edge-connection methods to obtain denoised, clear event contours. Then, the object region is confirmed by an event-pair matching method. Finally, the open and close operations of morphology are introduced to remove the artifacts generated by event-pair mismatching. Several degraded images were processed by our method and by some traditional binarization methods, and the experimental results are provided. Compared with other methods, the proposed method extracts the target region efficiently and obtains satisfactory binarization results from object images with low contrast and nonuniform illumination.

  17. Markov chains and semi-Markov models in time-to-event analysis

    PubMed Central

    Abner, Erin L.; Charnigo, Richard J.; Kryscio, Richard J.

    2014-01-01

    A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields. PMID:24818062
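
    As a concrete illustration of how a Markov chain accommodates multiple states, competing risks, and an absorbing outcome, here is a minimal three-state illness-death chain; the transition probabilities are invented for illustration, not estimated from any study:

```python
import numpy as np

# Three-state illness-death chain: 0 = healthy, 1 = ill, 2 = dead (absorbing).
# Monthly transition probabilities; each row sums to 1. Values are illustrative.
P = np.array([[0.92, 0.06, 0.02],
              [0.10, 0.80, 0.10],
              [0.00, 0.00, 1.00]])

def state_distribution(p0, P, n_steps):
    """Distribution over states after n_steps transitions of the chain."""
    p = np.asarray(p0, dtype=float)
    for _ in range(n_steps):
        p = p @ P
    return p

p12 = state_distribution([1.0, 0.0, 0.0], P, 12)
print(p12)            # P(healthy), P(ill), P(dead) at 12 months
print(1 - p12[2])     # 12-month survival probability
```

    Non-constant survival probabilities, recurrent transitions (ill back to healthy), and informative censoring all fall out of the same matrix machinery, which is what makes the chain formulation flexible.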

  18. Multi-variable X-band radar observation and tracking of ash plume from Mt. Etna volcano on November 23, 2013 event

    NASA Astrophysics Data System (ADS)

    Montopoli, Mario; Vulpiani, Gianfranco; Ricci, Matteo; Corradini, Stefano; Merucci, Luca; Marzano, Frank S.

    2015-04-01

    Ground-based weather radar observations of volcanic ash clouds are gaining momentum after recent works demonstrated their potential use either as a stand-alone tool or in combination with satellite retrievals. From an operational standpoint, radar data have been mainly exploited to derive the height of the ash plume and its temporal-spatial development, taking into account the radar limitation of detecting coarse ash particles (from approximately 20 microns to 10 millimeters and above in particle radius). More sophisticated radar retrievals can include airborne ash concentration, ash fall rate, and out-flux rate. Marzano et al. developed several volcanic ash radar retrieval (VARR) schemes, even though their practical use is still subject to a robust validation activity. The latter is made particularly difficult by the lack of field campaigns with multiple observations and the scarce repetition of volcanic events. The radar variable most often used to infer the physical features of actual ash clouds is the radar reflectivity, ZHH. It is related to the ash particle size distribution and shows a power-law relationship with ash concentration, which makes ZHH widely used in radar-volcanology studies. However, weather radars are often able to detect Doppler frequency shifts and, increasingly, they have a polarization-diversity capability. The former means that the wind speed spectrum of the ash cloud is potentially inferable, whereas the latter implies that variables other than ZHH are available. Theoretically, these additional radar variables are linked to the degree of eccentricity of ash particles, their orientation and density, as well as the presence of strong turbulence effects. Thus, the opportunity to refine the ash radar estimates developed so far can benefit from a thorough analysis of radar Doppler and polarization diversity. In this work we show a detailed analysis of Doppler shifts and polarization variables measured by the X band radar

  19. Onset Time of Ischemic Events and Antiplatelet Therapy after Intracranial Stent-assisted Coil Embolization.

    PubMed

    Matsumoto, Yoshihisa; Nakai, Kanji; Tsutsumi, Masanori; Iko, Minoru; Nii, Kouhei; Narita, Sumito; Eto, Ayumu; Mitsutake, Takahumi; Aikawa, Hiroshi; Kazekawa, Kiyoshi

    2014-04-01

    Stent-assisted coil embolization is effective for intracranial aneurysms, especially wide-necked aneurysms; however, the optimal antiplatelet regimens for ischemic events that develop after coil embolization have not yet been established. We aimed to determine the onset time of such postoperative ischemic events and the relationship between these events and antiplatelet therapy. We performed coil embolization using a vascular reconstruction stent for 43 cases of intracranial aneurysms and evaluated the incidence of postoperative ischemic events in these cases. Nine patients showed postoperative ischemic events during the follow-up period (13 ± 7 months). Two patients developed cerebral infarction within 24 hours. Five patients developed transient ischemic attack within 40 days while they were receiving dual antiplatelet therapy. In addition, 1 patient showed cerebral infarction 143 days postoperatively during single antiplatelet therapy, and a case of transient visual disturbance was reported 191 days postoperatively (49 days after antiplatelet therapy had been discontinued). We increased the number of antiplatelet agents in 4 of these patients. The other 5 patients were under strict observation with dual antiplatelet therapy. All these patients were shifted to single antiplatelet therapy 3-13 months postoperatively. No recurrence of ischemic events was noted. Postoperative ischemic events are most likely to occur within 40 days postoperatively. For patients with postoperative ischemic events, additional ischemic events can be prevented by increasing the number of antiplatelet agents; subsequently, they can be shifted to single antiplatelet therapy after the risk of recurrence has decreased. Copyright © 2014 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  20. Space-time patterns of meteorological drought events in the European Greater Alpine Region.

    NASA Astrophysics Data System (ADS)

    Haslinger, Klaus; Blöschl, Günter

    2017-04-01

    Drought is a natural hazard with tremendous impacts on human systems, as recent events, such as the summer 2015 drought in Central and Eastern Europe, show. However, extreme events are rare, and only a few of the most recent ones have been analyzed in detail concerning their emergence in space and time. With the study presented here, we aim to investigate spatiotemporal drought characteristics on a much larger sample of events covering the past 200+ years. The area of interest is the European Greater Alpine Region, located at the intersection of three major climate divisions in Europe: Mediterranean climate as well as temperate oceanic and continental climate. We use gridded data of 3-month moving-average precipitation sums, which are transformed into percentile values for each grid point and each month individually to ensure comparability across regions and seasons. A threshold is applied to detect contiguous areas within the gridded fields below a given percentile (the 20th percentile). Subsequently, all areas overlapping to a certain degree along time are treated as the same space-time object, forming a meteorological drought event. Distinct attributes are derived for every event, such as duration, spatial extent, intensity, overall severity, season of peak intensity, temperature anomaly, etc., which are analyzed in terms of their long-term temporal evolution, seasonality, spatial clustering, and temperature characteristics. Our results indicate, on one hand, only minor changes in drought frequency and duration over the last 200 years, but on the other hand, large variations in drought intensity and overall severity on multidecadal time scales. The time period from 1850-1880 shows the highest drought intensities and severities, followed by a second peak from around 1930-1950. Furthermore, the top 10% of events in terms of their severity reveal a general shift in seasonality from winter/spring events in the late 19th century towards autumn events during the last decades of

  1. Database for temporal events and spatial object features in time-lapse images

    NASA Astrophysics Data System (ADS)

    Eggers, Charles E.; Trivedi, Mohan M.

    2000-04-01

    We present an image database system with the capability to locate specified object-level merge and separation events in a sequence of time-lapse images. Specifically, the objects of interest are live cells in phase-contrast images acquired by scanning cytometry. The system is named TERSIS and resides on a workstation accessing time-lapse images on CD-ROM. The cell objects are segmented and the resulting data are processed to extract a time series and its time-derivative series for each spatial feature. Cell objects are tracked through the image sequence by applying similarity metrics to the cell object feature vectors, and cell merge and separation events are located using global image statistics. Multiple hypotheses are generated and scored to determine the participating cell objects in merge/separation events. The cell association and time-varying spatial data are stored in a database. A graphical user interface provides the user with tools to specify queries for specific cellular states and events for recall and display. Primary limitations include the need for an automatic front-end segmenter and increased cell-tracking volume. The design of this system is extensible to other object types and forms of sequential image input, including video.

  2. Discrete-event requirements model for sensor fusion to provide real-time diagnostic feedback

    NASA Astrophysics Data System (ADS)

    Rokonuzzaman, Mohd; Gosine, Raymond G.

    1998-06-01

    Minimally invasive surgical techniques reduce the size of the access corridor and the affected zones, limiting the real-time perceptual information available to practitioners. A real-time feedback system is required to offset these deficiencies in perceptual information. This feedback system acquires data from multiple sensors and fuses these data to extract pertinent information within defined time windows. To perform this task, a set of computing components interact with each other, resulting in a discrete event dynamic system. In this work, a new discrete event requirements model for sensor fusion is proposed to ensure the logical and temporal correctness of the operation of the real-time diagnostic feedback system. The proposed scheme models system requirements as a Petri net based discrete event dynamic machine. A graphical representation and quantitative analysis of this model have been developed. Being inherently graphical, this Petri net based model enables the requirements engineer to communicate intuitively with the client and avoid faults in the early phase of the development process. The quantitative analysis helps justify the logical and temporal correctness of the operation of the system. It has been shown that this model can be analyzed to check for deadlock, reachability, and repetitiveness of the operation of the sensor fusion system. This novel technique of modeling sensor fusion requirements as a discrete event dynamic system has the potential to realize highly reliable real-time diagnostic feedback systems for many applications, such as minimally invasive instrumentation.
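
    The kind of reachability and deadlock analysis the abstract describes can be sketched with a toy Petri net; the places, transitions, and initial marking below are hypothetical illustrations (two sensors each deposit a token that a "fuse" transition consumes), not the paper's actual requirements model:

```python
from collections import deque

# Each transition maps input places (tokens consumed) to output places (produced).
TRANSITIONS = {
    "sample_a": ({"idle_a": 1}, {"data_a": 1}),
    "sample_b": ({"idle_b": 1}, {"data_b": 1}),
    "fuse":     ({"data_a": 1, "data_b": 1}, {"report": 1}),
}

def enabled(marking, inputs):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in inputs.items())

def fire(marking, inputs, outputs):
    """Fire a transition: consume input tokens, produce output tokens."""
    m = dict(marking)
    for p, n in inputs.items():
        m[p] = m.get(p, 0) - n
    for p, n in outputs.items():
        m[p] = m.get(p, 0) + n
    return m

def reachable(initial, max_states=10000):
    """Breadth-first enumeration of reachable markings (bounded for safety)."""
    key = lambda m: frozenset((p, n) for p, n in m.items() if n)
    seen = {key(initial)}
    queue = deque([initial])
    markings = [initial]
    while queue and len(markings) < max_states:
        m = queue.popleft()
        for inputs, outputs in TRANSITIONS.values():
            if enabled(m, inputs):
                nxt = fire(m, inputs, outputs)
                k = key(nxt)
                if k not in seen:
                    seen.add(k)
                    queue.append(nxt)
                    markings.append(nxt)
    return markings

start = {"idle_a": 1, "idle_b": 1}
states = reachable(start)
print(any(m.get("report", 0) >= 1 for m in states))  # True: a fused report is reachable
```

    Over the enumerated markings one can also check for deadlocks (markings where no transition is enabled) and repetitiveness, which is the style of quantitative analysis the paper performs on its requirements model.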

  3. From sensation to perception: Using multivariate classification of visual illusions to identify neural correlates of conscious awareness in space and time.

    PubMed

    Hogendoorn, Hinze

    2015-01-01

    An important goal of cognitive neuroscience is understanding the neural underpinnings of conscious awareness. Although the low-level processing of sensory input is well understood in most modalities, it remains a challenge to understand how the brain translates such input into conscious awareness. Here, I argue that the application of multivariate pattern classification techniques to neuroimaging data acquired while observers experience perceptual illusions provides a unique way to dissociate sensory mechanisms from mechanisms underlying conscious awareness. Using this approach, it is possible to directly compare patterns of neural activity that correspond to the contents of awareness, independent from changes in sensory input, and to track these neural representations over time at high temporal resolution. I highlight five recent studies using this approach, and provide practical considerations and limitations for future implementations.

  4. Multiple imputation for multivariate data with missing and below-threshold measurements: time-series concentrations of pollutants in the Arctic.

    PubMed

    Hopke, P K; Liu, C; Rubin, D B

    2001-03-01

    Many chemical and environmental data sets are complicated by the existence of fully missing values or censored values known to lie below detection thresholds. For example, week-long samples of airborne particulate matter were obtained at Alert, NWT, Canada, between 1980 and 1991, where some of the concentrations of 24 particulate constituents were coarsened in the sense of being either fully missing or below detection limits. To facilitate scientific analysis, it is appealing to create complete data by filling in missing values so that standard complete-data methods can be applied. We briefly review commonly used strategies for handling missing values and focus on the multiple-imputation approach, which generally leads to valid inferences when faced with missing data. Three statistical models are developed for multiply imputing the missing values of airborne particulate matter. We expect that these models are useful for creating multiple imputations in a variety of incomplete multivariate time series data sets.
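
    The multiple-imputation workflow can be sketched crudely: replace each below-detection-limit value several times with a random draw, then pool estimates across the completed datasets. The uniform imputation model below is a deliberately simple stand-in for the three time-series models the paper develops; the variable names and data are illustrative:

```python
import random
import statistics

def multiply_impute_below_dl(values, detection_limit, m, rng):
    """Crude multiple-imputation sketch for a series with values censored below
    a detection limit (recorded here as None). Each of the m completed datasets
    replaces a censored value with a uniform draw on (0, detection_limit); point
    estimates are then pooled as in Rubin's rules (mean of the per-dataset
    estimates), and the between-imputation variance is reported alongside."""
    estimates = []
    for _ in range(m):
        completed = [v if v is not None else rng.uniform(0.0, detection_limit)
                     for v in values]
        estimates.append(statistics.mean(completed))
    pooled = statistics.mean(estimates)
    between = statistics.variance(estimates) if m > 1 else 0.0
    return pooled, between

rng = random.Random(0)
weekly = [3.2, None, 4.1, 0.9, None, 2.7]   # None = below the 0.5 detection limit
pooled, between = multiply_impute_below_dl(weekly, 0.5, m=20, rng=rng)
print(pooled, between)
```

    A full Rubin's-rules analysis would also carry the within-imputation variance of each estimate so that total variance reflects both sampling and imputation uncertainty.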

  5. Ants Can Expect the Time of an Event on Basis of Previous Experiences

    PubMed Central

    Cammaerts, Roger

    2016-01-01

    Working on three ant species of the genus Myrmica, M. ruginodis, M. rubra, and M. sabuleti, we showed that foragers can expect the subsequent time at which food will be available on the basis of the previous times at which food was present. The ants acquired this expectative ability right after having experienced two time shifts of food delivery. Moreover, the ants' learning score appeared to be a logarithmic function of time (i.e., of the number of training days). This ability to expect subsequent times at which an event will occur may be an advantageous ethological trait. PMID:27403457

  6. Multicomponent seismic noise attenuation with multivariate order statistic filters

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Wang, Yun; Wang, Xiaokai; Xun, Chao

    2016-10-01

    The vector relationship between multicomponent seismic data is highly important for multicomponent processing and interpretation, but this vector relationship could be damaged when each component is processed individually. To overcome the drawback of standard component-by-component filtering, multivariate order statistic filters are introduced and extended to attenuate the noise of multicomponent seismic data by treating such a dataset as a vector wavefield rather than a set of scalar fields. According to the characteristics of seismic signals, we implement this type of multivariate filtering along local events. First, the optimal local events are recognized according to the similarity between vector signals windowed from neighbouring seismic traces with a sliding time window along each trial trajectory. An efficient strategy is used to reduce the computational cost of similarity measurement for vector signals. Next, one vector sample from each of the neighbouring traces is extracted along the optimal local event as the input data for a multivariate filter. Different multivariate filters are optimal for different noise. The multichannel modified trimmed mean (MTM) filter, as one of the multivariate order statistic filters, is applied to synthetic and field multicomponent seismic data to test its performance for attenuating white Gaussian noise. The results indicate that the multichannel MTM filter can attenuate noise while preserving the relative amplitude information of multicomponent seismic data more effectively than a single-channel filter.
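
    A minimal sketch of a multichannel modified trimmed mean (MTM) filter applied to one window of vector samples gathered along a local event; the window data and threshold are illustrative, and the vector-median trimming rule used here is one common variant of the MTM idea, not necessarily the authors' exact formulation:

```python
import numpy as np

def mtm_filter(traces, threshold):
    """Multichannel modified trimmed mean filter for one window of vector samples.

    traces : (n_samples, n_components) array -- one multicomponent vector sample
             per neighbouring trace, aligned along the local event.
    The vector median (the sample minimizing total L2 distance to the others)
    anchors the trim: samples farther than `threshold` from it are discarded and
    the remainder averaged, rejecting outliers while keeping all components of
    each sample together so the vector relationship is preserved."""
    x = np.asarray(traces, dtype=float)
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)  # pairwise distances
    median = x[np.argmin(d.sum(axis=1))]                        # vector median
    keep = np.linalg.norm(x - median, axis=1) <= threshold
    return x[keep].mean(axis=0)

# Five 3-component samples along one event; one trace carries a burst of noise.
window = np.array([[1.0, 2.0, 0.5],
                   [1.1, 2.1, 0.4],
                   [0.9, 1.9, 0.6],
                   [9.0, -5.0, 7.0],   # noisy outlier trace
                   [1.0, 2.0, 0.5]])
print(mtm_filter(window, threshold=1.0))  # close to [1.0, 2.0, 0.5]
```

    Filtering component-by-component would have trimmed different samples in each component, distorting the vector wavefield; trimming whole vector samples avoids that.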

  7. Event-Triggered Adaptive Dynamic Programming for Continuous-Time Systems With Control Constraints.

    PubMed

    Dong, Lu; Zhong, Xiangnan; Sun, Changyin; He, Haibo

    2016-08-31

    In this paper, an event-triggered near-optimal control structure is developed for nonlinear continuous-time systems with control constraints. Due to the saturating actuators, a nonquadratic cost function is introduced and the Hamilton-Jacobi-Bellman (HJB) equation for constrained nonlinear continuous-time systems is formulated. In order to solve the HJB equation, an actor-critic framework is presented. The critic network is used to approximate the cost function and the action network is used to estimate the optimal control law. In addition, in the proposed method, the control signal is transmitted in an aperiodic manner to reduce the computational and transmission cost. Both networks are updated only at the trigger instants decided by the event-triggered condition. Detailed Lyapunov analysis is provided to guarantee that the closed-loop event-triggered system is ultimately bounded. Three case studies are used to demonstrate the effectiveness of the proposed method.

  8. Precipitation-Snowmelt Timing and Snowmelt Augmentation of Large Peak Flow Events, Western Cascades, Oregon

    NASA Astrophysics Data System (ADS)

    Jennings, K. S.; Jones, J. A.

    2014-12-01

    Extreme rain-on-snow floods are known to result from snowmelt coincident with precipitation, but comparatively little is known about the relative timing of these factors within storm events. Cumulative net snowmelt (hourly, from a snowmelt lysimeter) was plotted against precipitation for 26 large storms (> 1-yr return period) over the period 1991-2012 in the transient snow zone of the H.J. Andrews Experimental Forest in the western Cascades of Oregon. The relative timing of precipitation and net snowmelt at the hourly time scale was assessed with wavelet coherence. Five precipitation-net snowmelt response categories were identified: flat; persistent melt; persistent snow accumulation; late melt; and late snow accumulation. Persistent melt events were characterized by increasing cumulative net snowmelt and precipitation and had the highest mean peak flow and water available for runoff. Both the persistent melt and persistent snow accumulation categories had large, contiguous regions of significant wavelet coherence at multiple temporal scales, but pulses of precipitation preceded pulses of snowmelt in the persistent melt events, whereas precipitation was absorbed by the snowpack in the persistent accumulation category. A dewpoint temperature consistently above 0.5°C, elevated wind speeds, and a high fraction of precipitation falling as rain in the persistent melt category facilitated rapid snowmelt rates. During the two extreme rain-on-snow events in the sample, snowmelt was significantly synchronized with precipitation at 1-h to 64-h time scales throughout the 10-day event duration. Event categorization and analysis of wavelet coherence between precipitation and snowmelt can help predict peak discharge magnitude.

  9. Simulating recurrent event data with hazard functions defined on a total time scale.

    PubMed

    Jahn-Eimermacher, Antje; Ingel, Katharina; Ozga, Ann-Kathrin; Preussler, Stella; Binder, Harald

    2015-03-08

    In medical studies with recurrent event data, a total time scale perspective is often needed to adequately reflect disease mechanisms. This means that the hazard process is defined on the time since some starting point, e.g. the beginning of some disease, in contrast to a gap time scale where the hazard process restarts after each event. While techniques such as the Andersen-Gill model have been developed for analyzing data from a total time perspective, techniques for the simulation of such data, e.g. for sample size planning, have not been investigated so far. We have derived a simulation algorithm covering the Andersen-Gill model that can be used for sample size planning in clinical trials as well as the investigation of modeling techniques. Specifically, we allow for fixed and/or random covariates and an arbitrary hazard function defined on a total time scale. Furthermore we take into account that individuals may be temporarily insusceptible to a recurrent incidence of the event. The methods are based on conditional distributions of the inter-event times, conditional on the total time of the preceding event or study start. Closed form solutions are provided for common distributions. The derived methods have been implemented in a readily accessible R script. The proposed techniques are illustrated by planning the sample size for a clinical trial with complex recurrent event data. The required sample size is shown to be affected not only by censoring and intra-patient correlation, but also by the presence of risk-free intervals. This demonstrates the need for a simulation algorithm that particularly allows for complex study designs where no analytical sample size formulas might exist. The derived simulation algorithm is seen to be useful for the simulation of recurrent event data that follow an Andersen-Gill model. In addition to the use of a total time scale, it allows for intra-patient correlation and risk-free intervals as are often observed in clinical trial data.
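As a minimal illustration of this kind of simulation (assuming a constant baseline hazard, so the conditional inter-event distributions reduce to exponentials; all parameter values below are invented), one can draw waiting times on the total time scale and insert a risk-free interval after each event:

```python
import numpy as np

def simulate_recurrent(beta=0.7, lam=0.5, c_time=10.0, risk_free=0.5,
                       x=1.0, seed=42):
    """Toy recurrent-event simulation on a total time scale with an
    Andersen-Gill-style intensity lam * exp(beta * x) while at risk.
    After each event the subject is insusceptible for `risk_free` time
    units; follow-up is censored at `c_time`."""
    rng = np.random.default_rng(seed)
    rate = lam * np.exp(beta * x)
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate)  # waiting time while at risk
        if t > c_time:
            return events
        events.append(t)
        t += risk_free                    # temporarily not susceptible
```

A real sample size study would wrap this in a loop over simulated patients and fit the planned analysis model to each replicate.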

  10. The effect of time constraints and running phases on combined event pistol shooting performance.

    PubMed

    Dadswell, Clare; Payton, Carl; Holmes, Paul; Burden, Adrian

    2016-01-01

    The combined event is a crucial aspect of the modern pentathlon competition, but little is known about how shooting performance changes through the event. This study aimed to identify (i) how performance-related variables changed within each shooting series and (ii) how performance-related variables changed between each shooting series. Seventeen modern pentathletes completed combined event trials. An optoelectronic shooting system recorded score and pistol movement, and force platforms recorded centre of pressure movement 1 s prior to every shot. Heart rate and blood lactate values were recorded throughout the event. Whilst heart rate and blood lactate significantly increased between series (P < 0.05), there were no accompanying changes in the time period that participants spent aiming at the target, shot score, pistol movement or centre of pressure movement (P > 0.05). Thus, combined event shooting performance following each running phase appears similar to shooting performance following only 20 m of running. This finding has potential implications for the way in which modern pentathletes train for combined event shooting, and highlights the need for modern pentathletes to establish new methods with which to enhance shooting accuracy.

  11. Real-time extreme weather event attribution with forecast seasonal SSTs

    NASA Astrophysics Data System (ADS)

    Haustein, K.; Otto, F. E. L.; Uhe, P.; Schaller, N.; Allen, M. R.; Hermanson, L.; Christidis, N.; McLean, P.; Cullen, H.

    2016-06-01

    Within the last decade, extreme weather event attribution has emerged as a new field of science and garnered increasing attention from the wider scientific community and the public. Numerous methods have been put forward to determine the contribution of anthropogenic climate change to individual extreme weather events. So far, nearly all such analyses have been done months after an event happened. Here we present a new method which can assess the fraction of attributable risk of a severe weather event due to an external driver in real-time. The method builds on a large ensemble of atmosphere-only general circulation model simulations forced by seasonal forecast sea surface temperatures (SSTs). Taking the England 2013/14 winter floods as an example, we demonstrate that the change in risk for heavy rainfall during the England floods due to anthropogenic climate change is of similar magnitude using either observed or seasonal forecast SSTs. Testing the dynamic response of the model to the anomalous ocean state for January 2014, we find that observed SSTs are required to establish a discernible link between a particular SST pattern and an atmospheric response such as a shift in the jetstream in the model. For extreme events occurring under strongly anomalous SST patterns associated with known low-frequency climate modes, however, forecast SSTs can provide sufficient guidance to determine the dynamic contribution to the event.

  12. Time-Frequency Characteristics of Tsunami Magnetic Signals from Four Pacific Ocean Events

    NASA Astrophysics Data System (ADS)

    Schnepf, N. R.; Manoj, C.; An, C.; Sugioka, H.; Toh, H.

    2016-12-01

    The recent deployment of highly sensitive seafloor magnetometers coinciding with the deep solar minimum has provided excellent opportunities for observing tsunami electromagnetic signals. These fluctuating signals (periods ranging from 10 to 20 min) are generally found to be within approximately ±1 nT and coincide with the arrival of the tsunami waves. Previous studies focused on tsunami electromagnetic characteristics, as well as modeling the signal for individual events. This study instead aims to provide the time-frequency characteristics for a range of tsunami signals and a method to separate the data's noise using additional data from a remote observatory. We focus on four Pacific Ocean events of varying tsunami signal amplitude: (1) the 2011 Tohoku, Japan event (M9.0), (2) the 2010 Chile event (M8.8), (3) the 2009 Samoa event (M8.0) and, (4) the 2007 Kuril Islands event (M8.1). We find possible tsunami signals in high-pass filtered data and successfully isolate the signals from noise using a cross-wavelet analysis. The cross-wavelet analysis reveals that the longer period signals precede the stronger, shorter period signals. Our results are very encouraging for using tsunami magnetic signals in warning systems.

  13. A joint model for nonparametric functional mapping of longitudinal trajectory and time-to-event

    PubMed Central

    Lin, Min; Wu, Rongling

    2006-01-01

    Background The characterization of the relationship between a longitudinal response process and a time-to-event has been a pressing challenge in biostatistical research. This has emerged as an important issue in genetic studies when one attempts to detect the common genes or quantitative trait loci (QTL) that govern both a longitudinal trajectory and developmental event. Results We present a joint statistical model for functional mapping of dynamic traits in which the event times and longitudinal traits are taken to depend on a common set of genetic mechanisms. By fitting orthogonal Legendre polynomials to the time-dependent mean vector, our model does not rely on any predefined curve form, which distinguishes it from earlier parametric models of functional mapping. This newly developed nonparametric model is demonstrated and validated with an example from a forest tree in which stemwood growth and the time to first flower are jointly modelled. Conclusion Our model allows for the detection of specific QTL that govern both longitudinal traits and developmental processes through either pleiotropic effects or close linkage, or both. This model will have great implications for integrating longitudinal and event data to gain better insights into comprehensive biology and biomedicine. PMID:16539724
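The Legendre-polynomial device for the mean trajectory can be sketched as a plain least-squares projection (a generic fit, not the authors' full estimation procedure; observation times are first mapped to [-1, 1], the interval on which Legendre polynomials are orthogonal):

```python
import numpy as np

def legendre_mean(t, y, order=3):
    """Fit the time-dependent mean curve as a Legendre series of the
    given order, after rescaling observation times to [-1, 1]."""
    s = 2.0 * (t - t.min()) / (t.max() - t.min()) - 1.0
    coef = np.polynomial.legendre.legfit(s, y, order)
    return np.polynomial.legendre.legval(s, coef), coef
```

Because the basis is polynomial, any smooth low-order trend is captured without committing to a specific growth-curve family.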

  14. "Anniversary Reaction": Important Events and Timing of Death in a Group of Roman Catholic Priests.

    ERIC Educational Resources Information Center

    Walker, Lee; Walker, Lawrence D.

    1990-01-01

    Compared death dates of 1,038 Roman Catholic priests with dates of Christmas, Easter, birthday, and day of ordination. Found no meaningful patterns of death around any anniversary, suggesting either no association between time of death and important anniversaries or that important events may be so extraordinary to each individual that it is not…

  15. Modeling Repeatable Events Using Discrete-Time Data: Predicting Marital Dissolution

    ERIC Educational Resources Information Center

    Teachman, Jay

    2011-01-01

    I join two methodologies by illustrating the application of multilevel modeling principles to hazard-rate models with an emphasis on procedures for discrete-time data that contain repeatable events. I demonstrate this application using data taken from the 1995 National Survey of Family Growth (NSFG) to ascertain the relationship between multiple…
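A discrete-time hazard analysis of this kind starts from a person-period expansion of the duration data. A minimal sketch (variable names are hypothetical; a real analysis would then fit a logistic model to the event indicator, with one block of rows per episode when events are repeatable):

```python
import numpy as np

def person_period(durations, events, covariates):
    """Expand one row per episode into one row per discrete time unit
    at risk: columns are (period, covariate, event indicator). A
    censored episode contributes rows with the indicator always 0."""
    rows = []
    for dur, ev, cov in zip(durations, events, covariates):
        for period in range(1, dur + 1):
            rows.append((period, cov, int(ev and period == dur)))
    return np.array(rows, dtype=float)
```

The resulting table is exactly the unit of analysis for discrete-time hazard-rate models: each row is a Bernoulli trial for "did the event occur in this period, given survival so far".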

  16. Individual Change and the Timing and Onset of Important Life Events: Methods, Models, and Assumptions

    ERIC Educational Resources Information Center

    Grimm, Kevin; Marcoulides, Katerina

    2016-01-01

    Researchers are often interested in studying how the timing of a specific event affects concurrent and future development. When faced with such research questions there are multiple statistical models to consider and those models are the focus of this paper as well as their theoretical underpinnings and assumptions regarding the nature of the…

  18. Intensity/time profiles of solar particle events at one astronomical unit

    NASA Technical Reports Server (NTRS)

    Shea, M. A.

    1988-01-01

    A description of the intensity-time profiles of solar proton events observed at the orbit of the earth is presented. The discussion, which includes descriptive figures, presents a general overview of the subject without the detailed mathematical description of the physical processes which usually accompany most reviews.

  19. The influence of pubertal timing and stressful life events on depression and delinquency among Chinese adolescents.

    PubMed

    Chen, Jie; Yu, Jing; Wu, Yun; Zhang, Jianxin

    2015-06-01

    This study aimed to investigate the influences of pubertal timing and stressful life events on Chinese adolescents' depression and delinquency. Sex differences in these influences were also examined. A large sample with 4,228 participants aged 12-15 years (53% girls) was recruited in Beijing, China. Participants' pubertal development, stressful life events, depressive symptoms, and delinquency were measured using self-reported questionnaires. Both early maturing girls and boys displayed more delinquency than their same-sex on-time and late maturing peers. Early maturing girls displayed more depressive symptoms than on-time and late maturing girls, but boys in the three maturation groups showed similar levels of depressive symptoms. The interactive effects between early pubertal timing and stressful life events were significant in predicting depression and delinquency, particularly for girls. Early pubertal maturation is an important risk factor for Chinese adolescents' depression and delinquency. Stressful life events intensified the detrimental effects of early pubertal maturation on adolescents' depression and delinquency, particularly for girls.

  20. Non-parametric estimation and model checking procedures for marginal gap time distributions for recurrent events.

    PubMed

    Kvist, Kajsa; Gerster, Mette; Andersen, Per Kragh; Kessing, Lars Vedel

    2007-12-30

    For recurrent events there is evidence that misspecification of the frailty distribution can cause severe bias in estimated regression coefficients (Am. J. Epidemiol 1998; 149:404-411; Statist. Med. 2006; 25:1672-1684). In this paper we adapt to recurrent events a procedure for checking the gamma frailty that was originally suggested in (Biometrika 1999; 86:381-393) for parallel data. To apply the model checking procedure, a consistent non-parametric estimator for the marginal gap time distributions is needed. This is in general not possible due to induced dependent censoring in the recurrent events setting; however, (Biometrika 1999; 86:59-70) suggests a non-parametric estimator for the joint gap time distributions based on the principle of inverse probability of censoring weights. Here, we attempt to apply this estimator in the model checking procedure, and the performance of the method is investigated with simulations and applied to Danish registry data. The method is further investigated using the usual Kaplan-Meier estimator and a marginalized estimator for the marginal gap time distributions. We conclude that the procedure only works when the recurrent event is common and when the intra-individual association between gap times is weak.
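The inverse-probability-of-censoring-weights idea for a single gap time can be sketched as follows. This is a generic IPCW estimator of a marginal distribution under right censoring, not the joint-gap-time estimator of the cited paper; ties are ignored for simplicity. Each uncensored time gets weight 1/Ĝ(T−), where Ĝ is the Kaplan-Meier estimate of the censoring survival function:

```python
import numpy as np

def ipcw_ecdf(times, delta, grid):
    """IPCW-weighted ECDF of an event time under right censoring.
    `delta` is 1 for observed events, 0 for censored observations;
    the censoring distribution is estimated by Kaplan-Meier with the
    roles of event and censoring reversed."""
    order = np.argsort(times)
    t, d = times[order], delta[order]
    n = len(t)
    at_risk = n - np.arange(n)
    # Kaplan-Meier for censoring: a "failure" here is a censoring (d == 0)
    factors = 1.0 - (1 - d) / at_risk
    G = np.cumprod(factors)
    G_minus = np.concatenate(([1.0], G[:-1]))  # G-hat just before t_i
    w = np.where(d == 1, 1.0 / G_minus, 0.0) / n
    return np.array([(w * (t <= g)).sum() for g in grid])
```

With no censoring at all, every weight reduces to 1/n and the estimator collapses to the ordinary empirical distribution function, which is a useful sanity check.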

  2. Multivariate Analysis of the Effect of Source of Supply and Carrier on Shipping Times for Issue Priority Group One (IPG-1) Requisitions

    DTIC Science & Technology

    2003-09-01

    MULTIVARIATE ANALYSIS, *SUPPLIES, *SHIPPING, MATHEMATICAL MODELS, SOURCES, THEATER LEVEL OPERATIONS, DEFENSE SYSTEMS, NAVY, THESES, NONPARAMETRIC STATISTICS, SUPPLY DEPOTS, LINEARITY, LEAST SQUARES METHOD.

  3. Global grid of master events for waveform cross-correlation: from testing to real time processing

    NASA Astrophysics Data System (ADS)

    Bobrov, Dmitry; Rozhkov, Mikhail; Kitov, Ivan

    2014-05-01

    Seismic monitoring of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) requires a globally uniform detection threshold, which is provided by the geographical distribution of the Primary Seismic Network of the International Monitoring System (IMS). This detection threshold has to be as low as allowed by the entire set of real time and historical data recorded by the IMS. The International Data Centre (IDC) analyzes all relevant data in automatic processing and interactive review to issue a Reviewed Event Bulletin (REB), which includes all qualified events as obtained for the purpose of nuclear test monitoring. Since 2000, raw data, individual detections, and created events have been saved in the IDC archive, currently reaching tens of terabytes. In order to use this archive effectively in global monitoring, we introduced the waveform cross correlation (matched filter) technique. Cross correlation between real time records at IMS stations and template waveforms is calculated for a dense (spacing of ~140 km) and regular grid of master events uniformly covering the globe. There are approximately 25,000 master events with 3 to 10 templates at IMS stations. In seismically active zones, we populate masters with real waveforms. For aseismic zones, we develop an extended set of synthetic templates for virtual master events. For optimal performance of cross correlation, Principal and Independent Component Analysis are applied to the historical (from earthquakes and underground nuclear tests) and synthetic waveforms. Real waveform templates and selected PCA/ICA components are used in automatic processing for the production of a tentative cross-correlation standard event list (XSEL).
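The station-level matched-filter operation can be sketched as a normalized cross-correlation scan (a toy single-channel version with an invented threshold; the IDC pipeline's multichannel detection statistic is beyond this sketch):

```python
import numpy as np

def match_template(signal, template, thresh=0.8):
    """Slide a waveform template over a record and return the sample
    offsets where the normalized cross-correlation exceeds `thresh`."""
    n = len(template)
    t = template - template.mean()
    t /= np.linalg.norm(t)
    hits = []
    for i in range(len(signal) - n + 1):
        w = signal[i:i + n] - signal[i:i + n].mean()
        nrm = np.linalg.norm(w)
        if nrm == 0.0:
            continue                    # flat window, undefined CC
        cc = float(t @ (w / nrm))       # normalized correlation in [-1, 1]
        if cc > thresh:
            hits.append(i)
    return hits
```

Because the statistic is amplitude-normalized, a weak repeat of a master event correlates just as strongly as a large one, which is what lowers the effective detection threshold.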

  4. Event-Triggered Fault Detection Filter Design for a Continuous-Time Networked Control System.

    PubMed

    Wang, Yu-Long; Shi, Peng; Lim, Cheng-Chew; Liu, Yuan

    2016-12-01

    This paper studies the problem of event-triggered fault detection filter (FDF) and controller coordinated design for a continuous-time networked control system (NCS) with biased sensor faults. By considering sensor-to-FDF network-induced delays and packet dropouts, which do not impose a constraint on the event-triggering mechanism, and proposing the simultaneous network bandwidth utilization ratio and fault occurrence probability-based event-triggering mechanism, a new closed-loop model for the considered NCS is established. Based on the established model, the event-triggered H ∞ performance analysis, and FDF and controller coordinated design are presented. The combined mutually exclusive distribution and Wirtinger-based integral inequality approach is proposed for the first time to deal with integral inequalities for products of vectors. This approach is proved to be less conservative than the existing Wirtinger-based integral inequality approach. The designed FDF and controller can guarantee the sensitivity of the residual signal to faults and the robustness of the NCS to external disturbances. The simulation results verify the effectiveness of the proposed event-triggering mechanism, and the FDF and controller coordinated design.

  5. Event-Triggered Generalized Dissipativity Filtering for Neural Networks With Time-Varying Delays.

    PubMed

    Wang, Jia; Zhang, Xian-Ming; Han, Qing-Long

    2016-01-01

    This paper is concerned with event-triggered generalized dissipativity filtering for a neural network (NN) with a time-varying delay. The signal transmission from the NN to its filter is completed through a communication channel. It is assumed that the network measurement of the NN is sampled periodically. An event-triggered communication scheme is introduced to design a suitable filter such that precious communication resources can be saved significantly while certain filtering performance can be ensured. On the one hand, the event-triggered communication scheme is devised to select only those sampled signals violating a certain threshold to be transmitted, which directly leads to saving of precious communication resources. On the other hand, the filtering error system is modeled as a time-delay system closely dependent on the parameters of the event-triggered scheme. Based on this model, a suitable filter is designed such that certain filtering performance can be ensured, provided that a set of linear matrix inequalities are satisfied. Furthermore, since a generalized dissipativity performance index is introduced, several kinds of event-triggered filtering issues, such as H∞ filtering, passive filtering, mixed H∞ and passive filtering, (Q,S,R) -dissipative filtering, and L2 - L∞ filtering, are solved in a unified framework. Finally, two examples are given to illustrate the effectiveness of the proposed method.

  6. Time period and lesbian identity events: a comparison of Norwegian lesbians across 1986 and 2005.

    PubMed

    Giertsen, Merethe; Anderssen, Norman

    2007-11-01

    The purpose of the present work was to investigate the assumption that the lives of lesbians are easier today. When exploring the hypothesis that identity events (e.g., coming out to parents) among lesbian women have changed over time and happen earlier in life today, we expected to find several time period effects. Two national samples obtained through mailed questionnaires were compared, 1986 (n = 123) and 2005 (n = 236), age range 20-49. Time period effects were found, including informants reporting identifying as lesbian earlier in life. Time period effects, however, were not found regarding relational identity events such as informing others about one's identity status. The findings did not reveal any conclusive evidence that it is easier to establish a lesbian lifestyle today.

  7. Renewal stochastic processes with correlated events: phase transitions along time evolution.

    PubMed

    Velázquez, Jorge; Robledo, Alberto

    2011-03-01

    We consider renewal stochastic processes generated by nonindependent events from the perspective that their basic distribution and associated generating functions obey the statistical-mechanical structure of systems with interacting degrees of freedom. Based on this fact we look briefly into the less-known case of processes that display phase transitions along time. When the density distribution ψ_n(t) for the occurrence of the nth event at time t is considered to be a partition function, of a "microcanonical" type for n "degrees of freedom" at fixed "energy" t, one obtains a set of four partition functions of which that for the generating function variable z and Laplace transform variable ε, conjugate to n and t, respectively, plays a central role. These partition functions relate to each other in the customary way and in accordance to the precepts of large deviations theory, while the entropy, or Massieu potential, derived from ψ_n(t) satisfies an Euler relation. We illustrate this scheme first for an ordinary renewal process of events generated by a simple exponential waiting-time distribution ψ(t). Then we examine a process modeled after the so-called Hamiltonian mean-field model that is representative of agents that perform a repeated task with an associated outcome, such as an opinion poll. When a sequence of (many) events takes place in a sufficiently short time the process exhibits clustering of the outcome, but for larger times the process resembles that of independent events. The two regimes are separated by a sharp transition, technically of the second order. Finally we point out the existence of a similar scheme for random-walk processes.
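For the ordinary renewal case mentioned above, a quick simulation confirms the textbook fact that an exponential waiting-time density ψ(t) makes the event count by time t Poisson distributed (the rate, horizon, and replicate count below are arbitrary):

```python
import numpy as np

def count_by_time(rate, t_max, rng):
    """Number of renewal events by t_max when waiting times are
    i.i.d. exponential with the given rate (a Poisson process)."""
    t, n = 0.0, 0
    while True:
        t += rng.exponential(1.0 / rate)
        if t > t_max:
            return n
        n += 1

rng = np.random.default_rng(0)
counts = [count_by_time(2.0, 10.0, rng) for _ in range(2000)]
```

With rate 2.0 and horizon 10.0, the sample mean and variance of `counts` should both sit near 20, the Poisson signature; correlated events would break this equality.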

  8. Verifying Ptolemy II Discrete-Event Models Using Real-Time Maude

    NASA Astrophysics Data System (ADS)

    Bae, Kyungmin; Ölveczky, Peter Csaba; Feng, Thomas Huining; Tripakis, Stavros

    This paper shows how Ptolemy II discrete-event (DE) models can be formally analyzed using Real-Time Maude. We formalize in Real-Time Maude the semantics of a subset of hierarchical Ptolemy II DE models, and explain how the code generation infrastructure of Ptolemy II has been used to automatically synthesize a Real-Time Maude verification model from a Ptolemy II design model. This enables a model-engineering process that combines the convenience of Ptolemy II DE modeling and simulation with formal verification in Real-Time Maude.

  9. Analysis of inter-event times for avalanches on a conical bead pile with cohesion

    NASA Astrophysics Data System (ADS)

    Lehman, Susan; Johnson, Nathan; Tieman, Catherine; Wainwright, Elliot

    2015-03-01

    We investigate the critical behavior of a 3D conical bead pile built from uniform 3 mm steel spheres. Beads are added to the pile by dropping them onto the apex one at a time; avalanches are measured through changes in pile mass. We investigate the dynamic response of the pile by recording avalanches from the pile over tens of thousands of bead drops. We have previously shown that the avalanche size distribution follows a power law for beads dropped onto the pile apex from a low drop height. We are now tuning the critical behavior of the system by adding cohesion from a uniform magnetic field and find an increase in both size and number for very large avalanches and decreases in the mid-size avalanches. The resulting bump in the avalanche distribution moves to larger avalanche size as the cohesion in the system is increased. We compare the experimental inter-event time distribution to both the Brownian passage-time and Weibull distributions, and observe a shift from the Weibull to Brownian passage-time as we raise the threshold from measuring time between events of all sizes to time between only the largest system-spanning events. These results are both consistent with those from a mean-field model of slip avalanches in a shear system [Dahmen, Nat Phys 7, 554 (2011)].
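The thresholding step described above, i.e. moving from the time between all events to the time between only the largest events, is simple to express (arrays below are invented for illustration):

```python
import numpy as np

def inter_event_times(times, sizes, threshold):
    """Waiting times between successive avalanches whose size is at
    least `threshold`; raising the threshold restricts the sequence
    to the largest, system-spanning events."""
    return np.diff(times[sizes >= threshold])
```

The empirical distribution of the returned gaps can then be compared against Weibull and Brownian passage-time fits at each threshold, as in the analysis above.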

  10. Correlation between night time VLF amplitude fluctuations and seismic events in Indian sub-continent

    NASA Astrophysics Data System (ADS)

    Ray, Suman; Chakrabarti, Sandip Kumar; Sasmal, Sudipta

    We present the results of an analysis of yearlong (2007) monitoring of night time data of the VLF signal amplitude. We use the VLF signals, transmitted from the Indian Navy station VTX (latitude 8.43° N, longitude 77.73° E) at 18.2 kHz and received at the Indian Centre for Space Physics, Kolkata (latitude 22.5° N, 87.5° E). We analyzed this data to find out the correlation between night time amplitude fluctuation and seismic events. We found, analyzing individual earthquakes (with magnitudes >5) as well as from statistical analysis (of all the events with effective magnitudes greater than 3.5), that night time fluctuation of the signal amplitude has the highest probability to be beyond the 2σ levels about three days prior to the seismic events. Recently an earthquake of magnitude 7.4 occurred in south-western Pakistan (latitude 28.9° N, 64° E). We analyze the night time VLF signals for two weeks around this earthquake day to see if there were any precursory effects of this earthquake. We find that the amplitude of the night time VLF signals anomalously fluctuated four days before this earthquake. Thus, the night time fluctuation could be considered as a precursor to enhanced seismic activities.
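The 2σ anomaly criterion is straightforward to implement; a sketch (using the whole record to set the band, which is one of several reasonable choices, with an invented test series):

```python
import numpy as np

def nights_beyond_2sigma(fluct):
    """Indices of nights whose amplitude fluctuation lies outside the
    mean +/- 2*sigma band computed over the full record."""
    mu, sd = fluct.mean(), fluct.std()
    return np.where(np.abs(fluct - mu) > 2.0 * sd)[0]
```

A precursor study would then check whether flagged nights cluster in the days preceding catalogued earthquakes.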

  11. Real-time gait event detection for normal subjects from lower trunk accelerations.

    PubMed

    González, Rafael C; López, Antonio M; Rodriguez-Uría, Javier; Alvarez, Diego; Alvarez, Juan C

    2010-03-01

    In this paper we report on a novel algorithm for the real-time detection and timing of initial (IC) and final contact (FC) gait events. We process the vertical and antero-posterior accelerations registered at the lower trunk (L3 vertebra). The algorithm is based on a set of heuristic rules extracted from a set of 1719 steps. An independent experiment was conducted to compare the results of our algorithms with those obtained from a Digimax force platform. The results show small deviations from times of occurrence of events recorded from the platform (13 ± 35 ms for IC and 9 ± 54 ms for FC). Results for the FC timing are especially relevant in this field, as no previous work has addressed its temporal location through the processing of lower trunk accelerations. The delay in the real-time detection of the IC is 117 ± 39 ms and 34 ± 72 ms for the FC, improving previously reported results for real-time detection of events from lower trunk accelerations.
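A rule of this general flavor can be sketched as a thresholded local-maximum search with a refractory period (the threshold, minimum gap, and synthetic test signal below are invented; the paper's actual rule set is more elaborate):

```python
import numpy as np

def detect_initial_contacts(acc_v, fs, min_gap_s=0.4, thresh=0.5):
    """Flag initial-contact candidates as local maxima of the vertical
    acceleration above `thresh`, discarding candidates closer than
    `min_gap_s` seconds to the previously accepted event."""
    ics, last = [], -np.inf
    for i in range(1, len(acc_v) - 1):
        is_peak = acc_v[i] >= acc_v[i - 1] and acc_v[i] > acc_v[i + 1]
        if is_peak and acc_v[i] > thresh and i / fs - last >= min_gap_s:
            ics.append(i / fs)              # event time in seconds
            last = i / fs
    return ics
```

The refractory period enforces a physiologically plausible minimum step time, which suppresses double detections on noisy peaks.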

  12. Systematic investigation of time windows for adverse event data mining for recently approved drugs.

    PubMed

    Hochberg, Alan M; Hauben, Manfred; Pearson, Ronald K; O'Hara, Donald J; Reisinger, Stephanie J

    2009-06-01

    The optimum timing of drug safety data mining for a new drug is uncertain. The objective of this study was to compare cumulative data mining versus mining with sliding time windows. Adverse Event Reporting System data (2001-2005) were studied for 27 drugs. A literature database was used to evaluate signals of disproportionate reporting (SDRs) from an urn model data-mining algorithm. Data mining was applied cumulatively and with sliding time windows from 1 to 4 years in width. Time from SDR generation to the appearance of a publication describing the corresponding adverse event was calculated. Cumulative data mining and 1- to 2-year sliding windows produced the most SDRs for recently approved drugs. In the first postmarketing year, data mining produced SDRs an average of 800 days in advance of publications regarding the corresponding drug-event combination. However, this timing advantage reduced to zero by year 4. The optimum window width for sliding windows should increase with time on the market. Data mining may be most useful for early signal detection during the first 3 years of a drug's postmarketing life. Beyond that, it may be most useful for supporting or weakening hypotheses.

  13. Sample size adjustment designs with time-to-event outcomes: A caution.

    PubMed

    Freidlin, Boris; Korn, Edward L

    2017-08-01

    Sample size adjustment designs, which allow increasing the study sample size based on interim analysis of outcome data from a randomized clinical trial, have been increasingly promoted in the biostatistical literature. Although it is recognized that group sequential designs can be at least as efficient as sample size adjustment designs, many authors argue that a key advantage of these designs is their flexibility; interim sample size adjustment decisions can incorporate information and business interests external to the trial. Recently, Chen et al. (Clinical Trials 2015) considered sample size adjustment applications in the time-to-event setting using a design (CDL) that limits adjustments to situations where the interim results are promising. The authors demonstrated that while CDL provides little gain in unconditional power (versus fixed-sample-size designs), there is a considerable increase in conditional power for trials in which the sample size is adjusted. In time-to-event settings, sample size adjustment allows an increase in the number of events required for the final analysis. This can be achieved by either (a) following the original study population until the additional events are observed thus focusing on the tail of the survival curves or (b) enrolling a potentially large number of additional patients thus focusing on the early differences in survival curves. We use the CDL approach to investigate performance of sample size adjustment designs in time-to-event trials. Through simulations, we demonstrate that when the magnitude of the true treatment effect changes over time, interim information on the shape of the survival curves can be used to enrich the final analysis with events from the time period with the strongest treatment effect. In particular, interested parties have the ability to make the end-of-trial treatment effect larger (on average) based on decisions using interim outcome data. Furthermore, in "clinical null" cases where there is no

  14. gPhoton: Time-tagged GALEX photon events analysis tools

    NASA Astrophysics Data System (ADS)

    Million, Chase C.; Fleming, S. W.; Shiao, B.; Loyd, P.; Seibert, M.; Smith, M.

    2016-03-01

    Written in Python, gPhoton calibrates and sky-projects the ~1.1 trillion ultraviolet photon events detected by the microchannel plates on the Galaxy Evolution Explorer Spacecraft (GALEX), archives these events in a publicly accessible database at the Mikulski Archive for Space Telescopes (MAST), and provides tools for working with the database to extract scientific results, particularly over short time domains. The software includes a re-implementation of core functionality of the GALEX mission calibration pipeline to produce photon list files from raw spacecraft data as well as a suite of command line tools to generate calibrated light curves, images, and movies from the MAST database.

  15. Convolutional neural networks applied to neutrino events in a liquid argon time projection chamber

    NASA Astrophysics Data System (ADS)

    Acciarri, R.; Adams, C.; An, R.; Asaadi, J.; Auger, M.; Bagby, L.; Baller, B.; Barr, G.; Bass, M.; Bay, F.; Bishai, M.; Blake, A.; Bolton, T.; Bugel, L.; Camilleri, L.; Caratelli, D.; Carls, B.; Castillo Fernandez, R.; Cavanna, F.; Chen, H.; Church, E.; Cianci, D.; Collin, G. H.; Conrad, J. M.; Convery, M.; Crespo-Anadón, J. I.; Del Tutto, M.; Devitt, D.; Dytman, S.; Eberly, B.; Ereditato, A.; Escudero Sanchez, L.; Esquivel, J.; Fleming, B. T.; Foreman, W.; Furmanski, A. P.; Garvey, G. T.; Genty, V.; Goeldi, D.; Gollapinni, S.; Graf, N.; Gramellini, E.; Greenlee, H.; Grosso, R.; Guenette, R.; Hackenburg, A.; Hamilton, P.; Hen, O.; Hewes, J.; Hill, C.; Ho, J.; Horton-Smith, G.; James, C.; de Vries, J. Jan; Jen, C.-M.; Jiang, L.; Johnson, R. A.; Jones, B. J. P.; Joshi, J.; Jostlein, H.; Kaleko, D.; Karagiorgi, G.; Ketchum, W.; Kirby, B.; Kirby, M.; Kobilarcik, T.; Kreslo, I.; Laube, A.; Li, Y.; Lister, A.; Littlejohn, B. R.; Lockwitz, S.; Lorca, D.; Louis, W. C.; Luethi, M.; Lundberg, B.; Luo, X.; Marchionni, A.; Mariani, C.; Marshall, J.; Martinez Caicedo, D. A.; Meddage, V.; Miceli, T.; Mills, G. B.; Moon, J.; Mooney, M.; Moore, C. D.; Mousseau, J.; Murrells, R.; Naples, D.; Nienaber, P.; Nowak, J.; Palamara, O.; Paolone, V.; Papavassiliou, V.; Pate, S. F.; Pavlovic, Z.; Porzio, D.; Pulliam, G.; Qian, X.; Raaf, J. L.; Rafique, A.; Rochester, L.; von Rohr, C. Rudolf; Russell, B.; Schmitz, D. W.; Schukraft, A.; Seligman, W.; Shaevitz, M. H.; Sinclair, J.; Snider, E. L.; Soderberg, M.; Söldner-Rembold, S.; Soleti, S. R.; Spentzouris, P.; Spitz, J.; St. John, J.; Strauss, T.; Szelc, A. M.; Tagg, N.; Terao, K.; Thomson, M.; Toups, M.; Tsai, Y.-T.; Tufanli, S.; Usher, T.; Van de Water, R. G.; Viren, B.; Weber, M.; Weston, J.; Wickremasinghe, D. A.; Wolbers, S.; Wongjirad, T.; Woodruff, K.; Yang, T.; Zeller, G. P.; Zennamo, J.; Zhang, C.

    2017-03-01

    We present several studies of convolutional neural networks applied to data coming from the MicroBooNE detector, a liquid argon time projection chamber (LArTPC). The algorithms studied include the classification of single particle images, the localization of single particle and neutrino interactions in an image, and the detection of a simulated neutrino event overlaid with cosmic ray backgrounds taken from real detector data. These studies demonstrate the potential of convolutional neural networks for particle identification or event detection on simulated neutrino interactions. We also address technical issues that arise when applying this technique to data from a large LArTPC at or near ground level.

  16. Relative Time-scale for Channeling Events Within Chaotic Terrains, Margaritifer Sinus, Mars

    NASA Technical Reports Server (NTRS)

    Janke, D.

    1985-01-01

A relative time scale for ordering channel- and chaos-forming events was constructed for areas within the Margaritifer Sinus region of Mars. Transection and superposition relationships of channels, chaotic terrain, and the surfaces surrounding them were used to create the relative time scale; crater density studies were not used. Channels and chaos in contact with one another were treated as systems. These systems were in turn treated both separately (in order to understand internal relationships) and as members of the suite of Martian erosional forms (in order to produce a combined, master time scale). Channeling events associated with chaotic terrain development occurred over an extended geomorphic period. The channels can be divided into three convenient groups: those that pre-date intercrater plains development; post-plains, pre-chasma systems; and those associated with the development of the Valles Marineris chasmata. No correlations with cyclic climatic changes, major geologic events in other regions on Mars, or triggering phenomena (for example, specific impact events) were found.

  17. The Dependence of Characteristic Times of Gradual SEP Events on Their Associated CME Properties

    NASA Astrophysics Data System (ADS)

    Pan, Z. H.; Wang, C. B.; Xue, X. H.; Wang, Y. M.

It is generally believed that coronal mass ejections (CMEs) are the drivers of shocks that accelerate gradual solar energetic particles (SEPs). One might expect that the characteristics of the SEP intensity time profiles observed at 1 AU are determined by properties of the associated CMEs, such as the radial speed and the angular width. Recently, Kahler statistically investigated the characteristic times of gradual SEP events observed from 1998-2002 and their associated coronal mass ejection properties (Astrophys. J. 628, 1014-1022, 2005). Three characteristic times of gradual SEP events are determined as functions of solar source longitude: (1) T0, the time from associated CME launch to SEP onset at 1 AU; (2) TR, the rise time from SEP onset to the time when the SEP intensity is a factor of 2 below peak intensity; and (3) TD, the duration over which the SEP intensity is within a factor of 2 of the peak intensity. However, in his study the CME speeds and angular widths are directly taken from the LASCO CME catalog. In this study, we analyze the radial speeds and angular widths of CMEs with an ice-cream cone model and re-investigate their correlations with the characteristic times of the corresponding SEP events. We find that TR and TD are significantly correlated with radial speed for SEP events in the best-connected longitude range, and that there is no correlation between T0 and CME radial speed or angular width, which is consistent with Kahler's results. On the other hand, it is found that TR and TD also have

  18. Timing and climatic expression of Dansgaard-Oeschger events in stalagmites from Turkey

    NASA Astrophysics Data System (ADS)

    Fleitmann, D.; Badertscher, S.; Cheng, H.; Edwards, R.

    2011-12-01

The timing of Dansgaard-Oeschger events (D-O events), also known as Greenland interstadials (GIS), and their climatic and environmental impact in the Eastern Mediterranean is not well constrained. A set of highly resolved and precisely dated stalagmite oxygen and carbon isotope records from Sofular Cave in northern Turkey covers the last 130 kyr before present almost continuously and shows D-O events 1-25 in great detail. Rapid climatic changes at the transition into D-O events are marked by a fast ecosystem response within a few decades. The timing of most D-O events in the Sofular time series is broadly consistent with the Hulu Cave (Wang et al., 2001) and other absolutely dated speleothem records, such as the Kleegruben Cave (Spötl et al., 2006) or Fort Stanton Cave (Asmerom et al., 2010) records. Importantly, the timing is also consistent, within age uncertainties, with the most recent NGRIP ice core chronology (GICC05; Svensson et al., 2006; 2008). References Asmerom, Y., V. J. Polyak, et al. (2010). "Variable winter moisture in the southwestern United States linked to rapid glacial climate shifts." Nature Geoscience 3(2): 114-117. Spötl, C., A. Mangini, et al. (2006). "Chronology and paleoenvironment of Marine Isotope Stage 3 from two high-elevation speleothems, Austrian Alps." Quaternary Science Reviews 25(9-10): 1127-1136. Svensson, A., K. K. Andersen, et al. (2006). "The Greenland Ice Core Chronology 2005, 15-42 ka. Part 2: comparison to other records." Quaternary Science Reviews 25(23-24): 3258-3267. Svensson, A., K. K. Andersen, et al. (2008). "A 60 000 year Greenland stratigraphic ice core chronology." Climate of the Past 4(1): 47-57.

  19. Lessons Learned from Real-Time, Event-Based Internet Science Communications

    NASA Technical Reports Server (NTRS)

    Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not. All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material alone. The least popular event-based activity attracted more interest than the best written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and lessons learned will be discussed.

  1. Statistical Property and Model for the Inter-Event Time of Terrorism Attacks

    NASA Astrophysics Data System (ADS)

    Zhu, Jun-Fang; Han, Xiao-Pu; Wang, Bing-Hong

    2010-06-01

The inter-event time of terrorism attack events is investigated by empirical data and model analysis. Empirical evidence shows that it follows a scale-free property. In order to understand the dynamic mechanism of such a statistical feature, an opinion dynamic model with a memory effect is proposed on a two-dimensional lattice network. The model mainly highlights the role of individual social conformity and self-affirmation psychology. An attack event occurs when the order parameter indicating the strength of public opposition opinion is smaller than a critical value. Ultimately, the model can reproduce the same statistical property as the empirical data and gives a good understanding of the possible dynamic mechanism of terrorism attacks.

  2. Regression with incomplete covariates and left-truncated time-to-event data.

    PubMed

    Shen, Hua; Cook, Richard J

    2013-03-15

    Studies of chronic diseases routinely sample individuals subject to conditions on an event time of interest. In epidemiology, for example, prevalent cohort studies aiming to evaluate risk factors for survival following onset of dementia require subjects to have survived to the point of screening. In clinical trials designed to assess the effect of experimental cancer treatments on survival, patients are required to survive from the time of cancer diagnosis to recruitment. Such conditions yield samples featuring left-truncated event time distributions. Incomplete covariate data often arise in such settings, but standard methods do not deal with the fact that individuals' covariate distributions are also affected by left truncation. We describe an expectation-maximization algorithm for dealing with incomplete covariate data in such settings, which uses the covariate distribution conditional on the selection criterion. We describe an extension to deal with subgroup analyses in clinical trials for the case in which the stratification variable is incompletely observed.

  3. Time forecast of a break-off event from a hanging glacier

    NASA Astrophysics Data System (ADS)

    Faillettaz, J.; Funk, M.; Vagliasindi, M.

    2015-09-01

    A cold hanging glacier located on the south face of the Grandes Jorasses (Mont Blanc, Italy) broke off on the 23 and 29 September 2014 with a total estimated ice volume of 105 000 m3. Thanks to very accurate surface displacement measurements taken right up to the final break-off, this event could be successfully predicted 10 days in advance, enabling local authorities to take the necessary safety measures. The break-off event also confirmed that surface displacements experience a power law acceleration along with superimposed log-periodic oscillations prior to the final rupture. This paper describes the methods used to achieve a satisfactory time forecast in real time and demonstrates, using a retrospective analysis, their potential for the development of early-warning systems in real time.

  4. Time forecast of a break-off event from a hanging glacier

    NASA Astrophysics Data System (ADS)

    Faillettaz, Jérome; Funk, Martin; Vagliasindi, Marco

    2016-06-01

    A cold hanging glacier located on the south face of the Grandes Jorasses (Mont Blanc, Italy) broke off on the 23 and 29 September 2014 with a total estimated ice volume of 105 000 m3. Thanks to accurate surface displacement measurements taken up to the final break-off, this event was successfully predicted 10 days in advance, enabling local authorities to take the necessary safety measures. The break-off event also confirmed that surface displacements experienced a power law acceleration along with superimposed log-periodic oscillations prior to the final rupture. This paper describes the methods used to achieve a satisfactory time forecast in real time and demonstrates, using a retrospective analysis, their potential for the development of early-warning systems in real time.

  5. Adults’ reports of their earliest memories: Consistency in events, ages, and narrative characteristics over time

    PubMed Central

    Bauer, Patricia J.; Tasdemir-Ozdes, Aylin; Larkina, Marina

    2014-01-01

    Earliest memories have been of interest since the late 1800s, when it was first noted that most adults do not have memories from the first years of life (so-called childhood amnesia). Several characteristics of adults’ earliest memories have been investigated, including emotional content, the perspective from which they are recalled, and vividness. The focus of the present research was a feature of early memories heretofore relatively neglected in the literature, namely, their consistency. Adults reported their earliest memories 2 to 4 times over a 4-year period. Reports of earliest memories were highly consistent in the events identified as the bases for earliest memories, the reported age at the time of the event, and in terms of qualities of the narrative descriptions. These findings imply stability in the boundary that marks the offset of childhood amnesia, as well as in the beginning of a continuous sense of self over time. PMID:24836979

  6. Music, clicks, and their imaginations favor differently the event-based timing component for rhythmic movements.

    PubMed

    Bravi, Riccardo; Quarta, Eros; Del Tongo, Claudia; Carbonaro, Nicola; Tognetti, Alessandro; Minciacchi, Diego

    2015-06-01

The involvement or noninvolvement of a clock-like neural process, an effector-independent representation of the time intervals to produce, is described as the essential difference between event-based and emergent timing. In a previous work (Bravi et al. in Exp Brain Res 232:1663-1675, 2014a. doi: 10.1007/s00221-014-3845-9), we studied repetitive isochronous wrist flexion-extensions (IWFEs), performed while minimizing visual and tactile information, to clarify whether non-temporal and temporal characteristics of paced auditory stimuli affect the precision and accuracy of the rhythmic motor performance. Here, with the inclusion of new recordings, we expand the examination of the dataset described in our previous study to investigate whether simple and complex paced auditory stimuli (clicks and music) and their imaginations influence the timing mechanisms for repetitive IWFEs in different ways. Sets of IWFEs were analyzed with the windowed (lag one) autocorrelation, wγ(1), a statistical method recently introduced for the distinction between event-based and emergent timing. Our findings provide evidence that paced auditory information and its imagination favor the engagement of a clock-like neural process, and specifically that music, unlike clicks, lacks the power to elicit event-based timing, not counteracting the natural shift of wγ(1) toward positive values as the frequency of movements increases.
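The windowed lag-one autocorrelation statistic can be sketched as follows; this is an illustrative reconstruction, not the authors' implementation, and the window length is a hypothetical choice. Negative values are the signature of event-based (clock-like) timing, while values near or above zero point to emergent timing:

```python
import numpy as np

def windowed_lag1_autocorr(intervals, window=30):
    """Mean lag-one autocorrelation over non-overlapping windows of an
    inter-response-interval series.

    Windowing limits the influence of slow drift on the estimate.
    Negative values suggest event-based (clock-like) timing; values
    near or above zero suggest emergent timing.
    """
    intervals = np.asarray(intervals, dtype=float)
    vals = []
    for start in range(0, len(intervals) - window + 1, window):
        w = intervals[start:start + window]
        w = w - w.mean()          # demean within the window
        denom = np.dot(w, w)
        if denom > 0.0:
            vals.append(np.dot(w[:-1], w[1:]) / denom)
    return float(np.mean(vals))
```

In the classic two-level timing model, interval n is clock interval plus a difference of motor delays, I_n = C_n + M_{n+1} - M_n, and that differencing is what drives the lag-one correlation negative for event-based performance.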

  7. A New Characteristic Function for Fast Time-Reverse Seismic Event Location

    NASA Astrophysics Data System (ADS)

    Hendriyana, Andri; Bauer, Klaus; Weber, Michael; Jaya, Makky; Muksin, Muksin

    2015-04-01

Microseismicity produced by natural activity is usually characterized by low signal-to-noise ratios and large data volumes, as recording is conducted over long periods of time. Locating microseismic events is preferably carried out using migration-based methods such as time-reverse modeling (TRM). The original TRM is based on backpropagating the wavefield from the receivers down to the source location. Alternatively, we use a characteristic function (CF) derived from the measured wavefield as input for the TRM. The motivation for such a strategy is to avoid undesired contributions from secondary arrivals, which may generate artifacts in the final images. In this presentation, we introduce a new CF as input for the TRM method. To obtain this CF, we first apply kurtosis-based automatic onset detection and then convolve with a given wavelet. The convolution with low-frequency wavelets allows us to conduct time-reverse modeling using coarser sampling, which reduces computing time. We apply the method to locate seismic events measured along an active part of the Sumatran Fault around the Tarutung pull-apart basin (North Sumatra, Indonesia). The results show that the seismic events are well determined, as they are concentrated along the Sumatran Fault. Internal details of the Tarutung basin structure could be derived. Our results are consistent with those obtained from inversion of manually picked travel-time data.
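A kurtosis-based onset function of the kind described can be sketched as follows; this is an illustrative reconstruction, with a hypothetical window length and wavelet. Sliding-window kurtosis rises sharply when impulsive, non-Gaussian energy (a P or S arrival) enters the window; keeping only its positive gradient and convolving with a low-frequency wavelet yields a smooth CF suitable for coarsely sampled time-reverse modeling:

```python
import numpy as np

def kurtosis_cf(trace, win=100):
    """Sliding-window excess kurtosis, reduced to its positive gradient.

    The kurtosis of a trailing window jumps when an impulsive arrival
    enters it; keeping only increases turns this into an onset-sensitive
    characteristic function.
    """
    trace = np.asarray(trace, dtype=float)
    cf = np.zeros(len(trace))
    for i in range(win, len(trace)):
        w = trace[i - win:i]
        m = w.mean()
        s2 = ((w - m) ** 2).mean()
        if s2 > 0.0:
            cf[i] = ((w - m) ** 4).mean() / s2 ** 2 - 3.0
    return np.maximum(np.diff(cf, prepend=cf[0]), 0.0)

def smooth_cf(cf, wavelet):
    """Convolve the onset function with a low-frequency wavelet so the
    result varies slowly enough for a coarser modeling grid."""
    return np.convolve(cf, wavelet, mode="same")
```

On a synthetic trace, the maximum of the onset function lines up with the sample at which the impulsive energy first enters the kurtosis window.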

  8. APNEA list mode data acquisition and real-time event processing

    SciTech Connect

    Hogle, R.A.; Miller, P.; Bramblett, R.L.

    1997-11-01

The LMSC Active Passive Neutron Examinations and Assay (APNEA) Data Logger is a VME-based data acquisition system using commercial off-the-shelf hardware with application-specific software. It receives TTL inputs from eighty-eight ³He detector tubes and eight timing signals. Two data sets are generated concurrently for each acquisition session: (1) a list-mode recording of all detector and timing signals, timestamped to 3-microsecond resolution; (2) event accumulations generated in real time by counting events into short (tens of microseconds) and long (seconds) time bins following repetitive triggers. List-mode data sets can be post-processed to (1) determine the optimum time bins for TRU assay of waste drums, (2) analyze a given data set in several ways to match different assay requirements and conditions, and (3) confirm assay results by examining details of the raw data. Data Logger events are processed and timestamped by an array of 15 TMS320C40 DSPs and delivered to an embedded controller (PowerPC 604) for interim disk storage. Three acquisition modes, corresponding to different trigger sources, are provided. A standard network interface to a remote host system (Windows NT or SunOS) provides for system control, status, and transfer of previously acquired data. 6 figs.
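The post-processing step described, turning list-mode timestamps into trigger-relative time-bin accumulations, amounts to histogramming each event time relative to each trigger; a minimal sketch (function name and bin edges are hypothetical, not from the system):

```python
import numpy as np

def accumulate_events(timestamps, triggers, bin_edges):
    """Histogram list-mode event timestamps relative to each trigger.

    Summing the trigger-relative histograms reproduces the kind of
    short/long time-bin event accumulations described for the Data Logger,
    and lets the bin layout be re-chosen after acquisition.
    """
    timestamps = np.asarray(timestamps, dtype=float)
    counts = np.zeros(len(bin_edges) - 1, dtype=int)
    for t0 in triggers:
        rel = timestamps - t0                     # time since this trigger
        in_range = (rel >= bin_edges[0]) & (rel < bin_edges[-1])
        counts += np.histogram(rel[in_range], bins=bin_edges)[0]
    return counts
```

Because the raw list-mode record is kept, the same data can be re-binned with different edges to match different assay requirements.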

  9. Discovery, classification, and scientific exploration of transient events from the Catalina Real-time Transient Survey

    NASA Astrophysics Data System (ADS)

    Mahabal, A. A.; Djorgovski, S. G.; Drake, A. J.; Donalek, C.; Graham, M. J.; Williams, R. D.; Chen, Y.; Moghaddam, B.; Turmon, M.; Beshore, E.; Larson, S.

    2011-09-01

Exploration of the time domain -- variable and transient objects and phenomena -- is rapidly becoming a vibrant research frontier, touching on essentially every field of astronomy and astrophysics, from the Solar system to cosmology. Time-domain astronomy is being enabled by the advent of a new generation of synoptic sky surveys that cover large areas of the sky repeatedly, generating massive data streams. Their scientific exploration poses many challenges, driven mainly by the need for real-time discovery, classification, and follow-up of the interesting events. Here we describe the Catalina Real-Time Transient Survey (CRTS), which discovers and publishes transient events at optical wavelengths in real time, thus benefiting the entire community. We describe some of the scientific results to date, and then focus on the challenges of automated classification and prioritization of transient events. CRTS represents a scientific and technological testbed and precursor for larger surveys in the future, including the Large Synoptic Survey Telescope (LSST) and the Square Kilometre Array (SKA).

  10. Foreshocks and aftershocks of Pisagua 2014 earthquake: time and space evolution of megathrust event.

    NASA Astrophysics Data System (ADS)

    Fuenzalida Velasco, Amaya; Rietbrock, Andreas; Wollam, Jack; Thomas, Reece; de Lima Neto, Oscar; Tavera, Hernando; Garth, Thomas; Ruiz, Sergio

    2016-04-01

The 2014 Pisagua earthquake of magnitude 8.2 is the first case in Chile where a foreshock sequence was clearly recorded by a local network, together with the complete sequence including the mainshock and its aftershocks. The seismicity of the last year before the mainshock includes numerous clusters close to the epicentral zone (Ruiz et al., 2014), but it was on 16 March that this activity became stronger, with the Mw 6.7 precursory event taking place off the coast of Iquique at 12 km depth. The Pisagua earthquake arrived on 1 April 2014, breaking almost 120 km N-S, and two days later a 7.6 aftershock occurred in the south of the rupture, enlarging the zone affected by this sequence. In this work, we analyse the foreshock and aftershock sequences of the Pisagua earthquake, from the spatial and temporal evolution of a total of 15,764 events recorded from 1 March to 31 May 2014. This event catalogue was obtained from automatic analysis of raw seismic data from more than 50 stations installed in northern Chile and southern Peru. We used the STA/LTA algorithm for the detection of P and S arrival times on the vertical components, and then a back-propagation method in a 1D velocity model for event association and preliminary location of hypocenters, following the algorithm outlined by Rietbrock et al. (2012). These results were then improved by locating with the NonLinLoc software using a regional velocity model. We selected the larger events to analyse their moment tensor solutions by full-waveform inversion using the ISOLA software. In order to understand the process of nucleation and propagation of the Pisagua earthquake, we also analysed the evolution in time of the seismicity over the three months of data. The zone where the precursory events took place was strongly activated two weeks before the mainshock and remained very active until the end of the analysed period, with an important fraction of the seismicity located in the upper plate and having
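The STA/LTA detector used for phase picking compares short-term to long-term average signal energy and triggers when the ratio spikes; a minimal sketch (window lengths and threshold are illustrative choices, not the values used in the study):

```python
import numpy as np

def sta_lta(trace, nsta=20, nlta=200):
    """Ratio of short-term to long-term average power in trailing windows.

    The ratio stays near 1 on stationary noise and spikes when impulsive
    energy (a P or S arrival) enters the short window.
    """
    x = np.asarray(trace, dtype=float) ** 2
    c = np.concatenate(([0.0], np.cumsum(x)))   # cumulative power
    ratio = np.zeros(len(x))
    for i in range(nlta, len(x)):
        sta = (c[i + 1] - c[i + 1 - nsta]) / nsta
        lta = (c[i + 1] - c[i + 1 - nlta]) / nlta
        if lta > 0.0:
            ratio[i] = sta / lta
    return ratio

def pick_onset(ratio, threshold=3.0):
    """Index of the first sample whose ratio exceeds the trigger threshold, or -1."""
    above = ratio > threshold
    idx = int(np.argmax(above))
    return idx if above[idx] else -1
```

In practice the picks from such a detector would feed the event-association and location steps (back-propagation, then NonLinLoc) described in the abstract.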

  11. Trends of spring time frost events and phenological dates in Central Europe

    NASA Astrophysics Data System (ADS)

    Scheifinger, H.; Menzel, A.; Koch, E.; Peter, Ch.

Over large parts of the Northern Hemisphere's continents, temperature has been increasing during the last century. Minimum temperatures in particular show a more pronounced increase than maximum temperatures. Not only the phenological seasons but also potentially plant-damaging late frost events are governed by the atmosphere. Given a rise in minimum temperatures, one would expect phenological phases and late spring frost events to occur earlier. This work examines whether plant phenology shifts towards earlier occurrence at a higher or lower rate than potentially plant-damaging events such as late spring frosts. Frost events, defined by the last occurrence of daily minimum temperatures below a certain threshold, have been moving to earlier occurrence dates faster than phenological phases during the last decades at 50 climate stations in Central Europe. Trend values of the frost time series range around -0.2 days/year, and those of the phenological time series lie between -0.2 and 0.0 days/year over the period 1951-1997. 'Corylus avellana beginning of pollination' is the only one of the 13 phases considered here with a lower trend value, of -0.28 days/year. Early phases are better adapted to below-zero temperatures and therefore follow the temperature variability more closely. Later phases seem to have more reason to be concerned about possible late frost events and react more cautiously towards higher spring temperatures and earlier last frost dates. The risk of late frost damage to plants should therefore have been lower during the last decade compared with previous decades.

  12. Relative timing of last glacial maximum and late-glacial events in the central tropical Andes

    NASA Astrophysics Data System (ADS)

    Bromley, Gordon R. M.; Schaefer, Joerg M.; Winckler, Gisela; Hall, Brenda L.; Todd, Claire E.; Rademaker, Kurt M.

    2009-11-01

    Whether or not tropical climate fluctuated in synchrony with global events during the Late Pleistocene is a key problem in climate research. However, the timing of past climate changes in the tropics remains controversial, with a number of recent studies reporting that tropical ice age climate is out of phase with global events. Here, we present geomorphic evidence and an in-situ cosmogenic 3He surface-exposure chronology from Nevado Coropuna, southern Peru, showing that glaciers underwent at least two significant advances during the Late Pleistocene prior to Holocene warming. Comparison of our glacial-geomorphic map at Nevado Coropuna to mid-latitude reconstructions yields a striking similarity between Last Glacial Maximum (LGM) and Late-Glacial sequences in tropical and temperate regions. Exposure ages constraining the maximum and end of the older advance at Nevado Coropuna range between 24.5 and 25.3 ka, and between 16.7 and 21.1 ka, respectively, depending on the cosmogenic production rate scaling model used. Similarly, the mean age of the younger event ranges from 10 to 13 ka. This implies that (1) the LGM and the onset of deglaciation in southern Peru occurred no earlier than at higher latitudes and (2) that a significant Late-Glacial event occurred, most likely prior to the Holocene, coherent with the glacial record from mid and high latitudes. The time elapsed between the end of the LGM and the Late-Glacial event at Nevado Coropuna is independent of scaling model and matches the period between the LGM termination and Late-Glacial reversal in classic mid-latitude records, suggesting that these events in both tropical and temperate regions were in phase.

  13. Real-time gesture interface based on event-driven processing from stereo silicon retinas.

    PubMed

    Lee, Jun Haeng; Delbruck, Tobi; Pfeiffer, Michael; Park, Paul K J; Shin, Chang-Woo; Ryu, Hyunsurk Eric; Kang, Byung Chang

    2014-12-01

    We propose a real-time hand gesture interface based on combining a stereo pair of biologically inspired event-based dynamic vision sensor (DVS) silicon retinas with neuromorphic event-driven postprocessing. Compared with conventional vision or 3-D sensors, the use of DVSs, which output asynchronous and sparse events in response to motion, eliminates the need to extract movements from sequences of video frames, and allows significantly faster and more energy-efficient processing. In addition, the rate of input events depends on the observed movements, and thus provides an additional cue for solving the gesture spotting problem, i.e., finding the onsets and offsets of gestures. We propose a postprocessing framework based on spiking neural networks that can process the events received from the DVSs in real time, and provides an architecture for future implementation in neuromorphic hardware devices. The motion trajectories of moving hands are detected by spatiotemporally correlating the stereoscopically verged asynchronous events from the DVSs by using leaky integrate-and-fire (LIF) neurons. Adaptive thresholds of the LIF neurons achieve the segmentation of trajectories, which are then translated into discrete and finite feature vectors. The feature vectors are classified with hidden Markov models, using a separate Gaussian mixture model for spotting irrelevant transition gestures. The disparity information from stereovision is used to adapt LIF neuron parameters to achieve recognition invariant of the distance of the user to the sensor, and also helps to filter out movements in the background of the user. Exploiting the high dynamic range of DVSs, furthermore, allows gesture recognition over a 60-dB range of scene illuminance. The system achieves recognition rates well over 90% under a variety of variable conditions with static and dynamic backgrounds with naïve users.
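The leaky integrate-and-fire dynamics at the core of this pipeline can be sketched with purely event-driven updates, where the membrane potential decays between asynchronous input events rather than on a fixed clock; the parameter values here are hypothetical:

```python
from math import exp

def lif_response(event_times, tau=0.05, threshold=1.0, w=0.4):
    """Event-driven leaky integrate-and-fire neuron.

    Between input events the membrane potential decays exponentially with
    time constant tau; each event adds synaptic weight w; crossing the
    threshold emits a spike and resets the potential. Only temporally
    dense event streams (correlated motion) drive the neuron to spike.
    """
    v = 0.0
    t_prev = None
    spikes = []
    for t in event_times:       # event_times in seconds, ascending
        if t_prev is not None:
            v *= exp(-(t - t_prev) / tau)   # leak since last event
        v += w
        if v >= threshold:
            spikes.append(t)
            v = 0.0                         # reset after spiking
        t_prev = t
    return spikes
```

This is why such neurons can spatiotemporally correlate DVS events: bursts of events from a moving hand cross threshold, while sparse background events leak away before accumulating.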

  14. Network meta-analysis of (individual patient) time to event data alongside (aggregate) count data.

    PubMed

    Saramago, Pedro; Chuang, Ling-Hsiang; Soares, Marta O

    2014-09-10

    Network meta-analysis methods extend the standard pair-wise framework to allow simultaneous comparison of multiple interventions in a single statistical model. Despite published work on network meta-analysis mainly focussing on the synthesis of aggregate data, methods have been developed that allow the use of individual patient-level data specifically when outcomes are dichotomous or continuous. This paper focuses on the synthesis of individual patient-level and summary time to event data, motivated by a real data example looking at the effectiveness of high compression treatments on the healing of venous leg ulcers. This paper introduces a novel network meta-analysis modelling approach that allows individual patient-level (time to event with censoring) and summary-level data (event count for a given follow-up time) to be synthesised jointly by assuming an underlying, common, distribution of time to healing. Alternative model assumptions were tested within the motivating example. Model fit and adequacy measures were used to compare and select models. Due to the availability of individual patient-level data in our example we were able to use a Weibull distribution to describe time to healing; otherwise, we would have been limited to specifying a uniparametric distribution. Absolute effectiveness estimates were more sensitive than relative effectiveness estimates to a range of alternative specifications for the model. The synthesis of time to event data considering individual patient-level data provides modelling flexibility, and can be particularly important when absolute effectiveness estimates, and not just relative effect estimates, are of interest.

  15. Network meta-analysis of (individual patient) time to event data alongside (aggregate) count data

    PubMed Central

    2014-01-01

    Background Network meta-analysis methods extend the standard pair-wise framework to allow simultaneous comparison of multiple interventions in a single statistical model. Despite published work on network meta-analysis mainly focussing on the synthesis of aggregate data, methods have been developed that allow the use of individual patient-level data specifically when outcomes are dichotomous or continuous. This paper focuses on the synthesis of individual patient-level and summary time to event data, motivated by a real data example looking at the effectiveness of high compression treatments on the healing of venous leg ulcers. Methods This paper introduces a novel network meta-analysis modelling approach that allows individual patient-level (time to event with censoring) and summary-level data (event count for a given follow-up time) to be synthesised jointly by assuming an underlying, common, distribution of time to healing. Alternative model assumptions were tested within the motivating example. Model fit and adequacy measures were used to compare and select models. Results Due to the availability of individual patient-level data in our example we were able to use a Weibull distribution to describe time to healing; otherwise, we would have been limited to specifying a uniparametric distribution. Absolute effectiveness estimates were more sensitive than relative effectiveness estimates to a range of alternative specifications for the model. Conclusions The synthesis of time to event data considering individual patient-level data provides modelling flexibility, and can be particularly important when absolute effectiveness estimates, and not just relative effect estimates, are of interest. PMID:25209121

  16. Multitarget real-time PCR-based system: monitoring for unauthorized genetically modified events in India.

    PubMed

    Randhawa, Gurinder Jit; Singh, Monika; Sood, Payal; Bhoge, Rajesh K

    2014-07-23

    A multitarget TaqMan real-time PCR (RTi-PCR) based system was developed to monitor unauthorized genetically modified (GM) events in India. Most of the GM events included in this study are either authorized for commercial cultivation or field trials, having been indigenously developed or imported for research purposes. The developed system consists of a 96-well prespotted plate with lyophilized primers and probes for simultaneous detection of 47 targets in duplicate, including 21 event-specific sequences, 5 construct regions, 15 transgenic elements, and 6 taxon-specific targets for cotton, eggplant, maize, potato, rice, and soybean. The limit of detection (LOD) of the assays ranged from 0.1 to 0.01% GM content for different targets. Applicability, robustness, and practical utility of the developed system were verified with a stacked GM cotton event, powdered proficiency-testing samples, and two unknown test samples. This user-friendly multitarget approach can be efficiently utilized for monitoring unauthorized GM events in the Indian context.

  17. Extreme event return times in long-term memory processes near 1/f

    NASA Astrophysics Data System (ADS)

    Blender, R.; Fraedrich, K.; Sienz, F.

    2008-07-01

    The distribution of extreme event return times and their correlations are analyzed in observed and simulated long-term memory (LTM) time series with 1/f power spectra. The analysis is based on tropical temperature and mixing ratio (specific humidity) time series from TOGA COARE with 1 min resolution and an approximate 1/f power spectrum. Extreme events are determined by Peak-Over-Threshold (POT) crossing. The Weibull distribution represents a reasonable fit to the return time distributions, while the power-law predicted by the stretched exponential for 1/f deviates considerably. For a comparison and an analysis of the return time predictability, a very long simulated time series with an approximate 1/f spectrum is produced by a fractionally differenced (FD) process. These simulated data confirm the Weibull distribution (a power law can be excluded). The return time sequences show distinctly weaker long-term correlations than the original time series (correlation exponent γ ≈ 0.56).
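    The peak-over-threshold extraction of return times can be sketched directly; a minimal illustration (names are mine, not the paper's), together with the Weibull (stretched-exponential) survival function one would fit to the resulting intervals:

    ```python
    import math

    def pot_return_times(series, threshold):
        """Times between successive peak-over-threshold (POT) exceedances."""
        exceedances = [i for i, x in enumerate(series) if x > threshold]
        return [b - a for a, b in zip(exceedances, exceedances[1:])]

    def weibull_survival(r, shape, scale):
        """Stretched-exponential (Weibull) survival P(return time > r)."""
        return math.exp(-((r / scale) ** shape))
    ```

    Comparing the empirical survival function of `pot_return_times(...)` against `weibull_survival` over a range of shapes is one simple way to reproduce the kind of fit the abstract reports; the actual study of course uses the TOGA COARE and FD-process series.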

  18. Implications of small-bowel transit time in the detection rate of capsule endoscopy: A multivariable multicenter study of patients with obscure gastrointestinal bleeding

    PubMed Central

    Girelli, Carlo Maria; Soncini, Marco; Rondonotti, Emanuele

    2017-01-01

    AIM To define the role of small-bowel transit time in the detection rate of significant small-bowel lesions. METHODS Small-bowel capsule endoscopy records, prospectively collected from 30 participating centers in the Lombardy Registry from October 2011 to December 2013, were included in the study if the clinical indication was obscure gastrointestinal bleeding and the capsule reached the cecum. Based on capsule findings, we created two groups: P2 (significant findings) and P0-1 (normal/negligible findings). Groups were compared for age, gender, small-bowel transit time, type of instrument, modality of capsule performance (outpatients vs inpatients), bowel cleanliness, and center volume. RESULTS We retrieved and scrutinized 1,433 out of 2,295 capsule endoscopy records (62.4%) fulfilling the inclusion criteria. Patients were 67 ± 15 years old, and 815 (57%) were males. In comparison with patients in the P0-1 group, those in the P2 group (n = 776, 54%) were older (P < 0.0001), had a longer small-bowel transit time (P = 0.0015), and were more frequently examined in low-volume centers (P < 0.001). Age and small-bowel transit time were correlated (P < 0.001), with age as the sole independent predictor on multivariable analysis. Findings of the P2 group were arteriovenous malformations (54.5%), inflammatory (23.6%) and protruding (10.4%) lesions, and luminal blood (11.5%). CONCLUSION In this selected, prospectively collected cohort of small-bowel capsule endoscopy performed for obscure gastrointestinal bleeding, a longer small-bowel transit time was associated with a higher detection rate of significant lesions, along with age and a low center volume, with age serving as an independent predictor. PMID:28216977

  19. Implications of small-bowel transit time in the detection rate of capsule endoscopy: A multivariable multicenter study of patients with obscure gastrointestinal bleeding.

    PubMed

    Girelli, Carlo Maria; Soncini, Marco; Rondonotti, Emanuele

    2017-01-28

    To define the role of small-bowel transit time in the detection rate of significant small-bowel lesions. Small-bowel capsule endoscopy records, prospectively collected from 30 participating centers in the Lombardy Registry from October 2011 to December 2013, were included in the study if the clinical indication was obscure gastrointestinal bleeding and the capsule reached the cecum. Based on capsule findings, we created two groups: P2 (significant findings) and P0-1 (normal/negligible findings). Groups were compared for age, gender, small-bowel transit time, type of instrument, modality of capsule performance (outpatients vs inpatients), bowel cleanliness, and center volume. We retrieved and scrutinized 1,433 out of 2,295 capsule endoscopy records (62.4%) fulfilling the inclusion criteria. Patients were 67 ± 15 years old, and 815 (57%) were males. In comparison with patients in the P0-1 group, those in the P2 group (n = 776, 54%) were older (P < 0.0001), had a longer small-bowel transit time (P = 0.0015), and were more frequently examined in low-volume centers (P < 0.001). Age and small-bowel transit time were correlated (P < 0.001), with age as the sole independent predictor on multivariable analysis. Findings of the P2 group were arteriovenous malformations (54.5%), inflammatory (23.6%) and protruding (10.4%) lesions, and luminal blood (11.5%). In this selected, prospectively collected cohort of small-bowel capsule endoscopy performed for obscure gastrointestinal bleeding, a longer small-bowel transit time was associated with a higher detection rate of significant lesions, along with age and a low center volume, with age serving as an independent predictor.

  20. Solar Demon: near real-time solar eruptive event detection on SDO/AIA images

    NASA Astrophysics Data System (ADS)

    Kraaikamp, Emil; Verbeeck, Cis

    Solar flares, dimmings and EUV waves have been observed routinely in extreme ultraviolet (EUV) images of the Sun since 1996. These events are closely associated with coronal mass ejections (CMEs), and therefore provide useful information for early space weather alerts. The Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA) generates such a massive dataset that it becomes impossible to find most of these eruptive events manually. Solar Demon is a set of automatic detection algorithms that attempts to solve this problem by providing both near real-time warnings of eruptive events and a catalog of characterized events. Solar Demon has been designed to detect and characterize dimmings, EUV waves, and solar flares in near real-time on SDO/AIA data. The detection modules are running continuously at the Royal Observatory of Belgium on both quick-look data and synoptic science data. The output of Solar Demon can be accessed in near real-time on the Solar Demon website, and includes images, movies, light curves, and the numerical evolution of several parameters. Solar Demon is the result of a collaboration between the FP7 projects AFFECTS and COMESEP. Flare detections of Solar Demon are integrated into the COMESEP alert system. Here we present the Solar Demon detection algorithms and their output. We will focus on the algorithm and its operational implementation. Examples of interesting flare, dimming and EUV wave events, and general statistics of the detections made so far during solar cycle 24 will be presented as well.

  1. The development and validation of a multivariable model to predict whether patients referred for total knee replacement are suitable surgical candidates at the time of initial consultation.

    PubMed

    Churchill, Laura; Malian, Samuel J; Chesworth, Bert M; Bryant, Dianne; MacDonald, Steven J; Marsh, Jacquelyn D; Giffin, J Robert

    2016-12-01

    In previous studies, 50%-70% of patients referred to orthopedic surgeons for total knee replacement (TKR) were not surgical candidates at the time of initial assessment. The purpose of our study was to identify and cross-validate patient self-reported predictors of suitability for TKR and to determine the clinical utility of a predictive model to guide the timing and appropriateness of referral to a surgeon. We assessed pre-consultation patient data as well as the surgeon's findings and post-consultation recommendations. We used multivariate logistic regression to detect self-reported items that could identify suitable surgical candidates. Patients' willingness to undergo surgery, higher rating of pain, greater physical function, previous intra-articular injections and patient age were the factors predictive of patients being offered and electing to undergo TKR. The application of the model developed in our study would effectively reduce the proportion of nonsurgical referrals by 25%, while identifying the vast majority of surgical candidates (> 90%). Using patient-reported information, we can correctly predict the outcome of specialist consultation for TKR in 70% of cases. To reduce long waits for first consultation with a surgeon, it may be possible to use these items to educate and guide referring clinicians and patients to understand when specialist consultation is the next step in managing the patient with severe osteoarthritis of the knee.
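    As a rough illustration of the kind of model the study describes — a logistic regression over patient self-reported items predicting surgical suitability — here is a plain gradient-descent sketch. It is generic, not the authors' fitted model, and all names and data are hypothetical:

    ```python
    import math

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    def fit_logistic(X, y, lr=0.1, epochs=2000):
        """Plain per-sample gradient-descent logistic regression.

        Returns weights [intercept, w1, ..., wp].
        """
        w = [0.0] * (len(X[0]) + 1)
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
                err = sigmoid(z) - yi  # gradient of the log-loss
                w[0] -= lr * err
                for j, xj in enumerate(xi):
                    w[j + 1] -= lr * err * xj
        return w

    def predict(w, x):
        """Predicted probability of being a suitable surgical candidate."""
        return sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], x)))
    ```

    In the study's setting each row of `X` would hold items such as willingness to undergo surgery, pain rating, function, injection history, and age; a probability cutoff on `predict` then drives the referral decision.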

  2. The development and validation of a multivariable model to predict whether patients referred for total knee replacement are suitable surgical candidates at the time of initial consultation

    PubMed Central

    Churchill, Laura; Malian, Samuel J.; Chesworth, Bert M.; Bryant, Dianne; MacDonald, Steven J.; Marsh, Jacquelyn D.; Giffin, J. Robert

    2016-01-01

    Background In previous studies, 50%–70% of patients referred to orthopedic surgeons for total knee replacement (TKR) were not surgical candidates at the time of initial assessment. The purpose of our study was to identify and cross-validate patient self-reported predictors of suitability for TKR and to determine the clinical utility of a predictive model to guide the timing and appropriateness of referral to a surgeon. Methods We assessed pre-consultation patient data as well as the surgeon’s findings and post-consultation recommendations. We used multivariate logistic regression to detect self-reported items that could identify suitable surgical candidates. Results Patients’ willingness to undergo surgery, higher rating of pain, greater physical function, previous intra-articular injections and patient age were the factors predictive of patients being offered and electing to undergo TKR. Conclusion The application of the model developed in our study would effectively reduce the proportion of nonsurgical referrals by 25%, while identifying the vast majority of surgical candidates (> 90%). Using patient-reported information, we can correctly predict the outcome of specialist consultation for TKR in 70% of cases. To reduce long waits for first consultation with a surgeon, it may be possible to use these items to educate and guide referring clinicians and patients to understand when specialist consultation is the next step in managing the patient with severe osteoarthritis of the knee. PMID:28234616

  3. A novel multivariate approach using science-based calibration for direct coating thickness determination in real-time NIR process monitoring.

    PubMed

    Möltgen, C-V; Herdling, T; Reich, G

    2013-11-01

    This study demonstrates an approach, using science-based calibration (SBC), for direct coating thickness determination on heart-shaped tablets in real-time. Near-Infrared (NIR) spectra were collected during four full industrial pan coating operations. The tablets were coated with a thin hydroxypropyl methylcellulose (HPMC) film up to a film thickness of 28 μm. The application of SBC permits the calibration of the NIR spectral data without using costly determined reference values, because SBC combines classical methods to estimate the coating signal with statistical methods to estimate the noise. The approach enabled the use of NIR for the measurement of the film thickness increase from around 8 to 28 μm of four independent batches in real-time. The developed model provided a spectroscopic limit of detection for the coating thickness of 0.64 ± 0.03 μm root-mean square (RMS). Commonly used statistical calibration methods, such as Partial Least Squares (PLS), require sufficiently varying reference values; for thin non-functional coatings this is a challenge because the quality of the model depends on the accuracy of the selected calibration standards. The straightforward approach of SBC eliminates many of the problems associated with the conventional statistical methods and offers an alternative for multivariate calibration.

  4. Bootstrap-based methods for estimating standard errors in Cox's regression analyses of clustered event times.

    PubMed

    Xiao, Yongling; Abrahamowicz, Michal

    2010-03-30

    We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data, with latent cluster-level random effects, which are ignored in the conventional Cox's model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs, correct type I error rates, and acceptable coverage rates, regardless of the true random-effects distribution, and avoid the serious variance underestimation of conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of clustered event times.
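    The cluster-bootstrap resampling scheme itself is simple to sketch. Here the Cox fit is replaced by an arbitrary scalar estimator (e.g., a mean), purely to keep the resampling logic visible; this is a generic sketch, not the authors' implementation:

    ```python
    import random
    import statistics

    def cluster_bootstrap_se(clusters, estimator, n_boot=500, seed=12345):
        """Bootstrap SE that resamples whole clusters with replacement.

        clusters:  list of clusters, each a list of observations
        estimator: function mapping a flat list of observations to a scalar
                   (in the paper this would be a Cox regression coefficient)
        """
        rng = random.Random(seed)
        estimates = []
        for _ in range(n_boot):
            # Draw as many clusters as in the original data, with replacement,
            # keeping each selected cluster intact.
            resampled = [rng.choice(clusters) for _ in clusters]
            flat = [obs for cluster in resampled for obs in cluster]
            estimates.append(estimator(flat))
        return statistics.stdev(estimates)
    ```

    Keeping clusters intact is the point: the within-cluster correlation structure is preserved in every bootstrap replicate, which is what allows the resulting SE to reflect it.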

  5. Sample size and robust marginal methods for cluster-randomized trials with censored event times.

    PubMed

    Zhong, Yujie; Cook, Richard J

    2015-03-15

    In cluster-randomized trials, intervention effects are often formulated by specifying marginal models, fitting them under a working independence assumption, and using robust variance estimates to address the association in the responses within clusters. We develop sample size criteria within this framework, with analyses based on semiparametric Cox regression models fitted with event times subject to right censoring. At the design stage, copula models are specified to enable derivation of the asymptotic variance of estimators from a marginal Cox regression model and to compute the number of clusters necessary to satisfy power requirements. Simulation studies demonstrate the validity of the sample size formula in finite samples for a range of cluster sizes, censoring rates, and degrees of within-cluster association among event times. The power and relative efficiency implications of copula misspecification are studied, as well as the effect of within-cluster dependence in the censoring times. Sample size criteria and other design issues are also addressed for the setting where the event status is only ascertained at periodic assessments and the times are interval-censored.

  6. Value of unstructured time (breaks) during formal continuing medical education events.

    PubMed

    Tipping, J; Donahue, J; Hannah, E

    2001-01-01

    Unstructured time (breaks) at formal continuing medical education (CME) events is nonaccredited in some jurisdictions. Program participants, however, perceive this time as valuable to their learning. The purpose of this research was to determine what activities occur during unstructured time in formal CME events and how these activities impact learning for physicians. A qualitative method based on grounded theory was used to determine themes of behavior. Both individual and focus group interviews were conducted. Data were analyzed and coded into themes, which were then further explored and validated by the use of a questionnaire survey. One hundred ninety-seven family physicians were involved in the study. Several activities related to the enhancement of learning were identified and grouped into themes. There were few differences in the ranking of importance between the themes identified, nor were differences determined based on gender or type of CME in which the break occurred. The results suggest that unstructured time (breaks) should be included in formal CME events to help physician learners integrate new material, solve individual practice problems, and make new meaning out of their experience. The interaction between colleagues that occurs as a result of the provision of breaks is perceived as crucial in aiding the process of applying knowledge to practice.

  7. Monotonic continuous-time random walks with drift and stochastic reset events

    NASA Astrophysics Data System (ADS)

    Montero, Miquel; Villarroel, Javier

    2013-01-01

    In this paper we consider a stochastic process that may experience random reset events which suddenly bring the system to the starting value, and analyze the relevant statistical magnitudes. We focus our attention on monotonic continuous-time random walks with a constant drift: the process increases between the reset events, either by the effect of the random jumps, or by the action of the deterministic drift. As a result of all these combined factors, interesting properties emerge, like the existence (for any drift strength) of a stationary transition probability density function, or the faculty of the model to reproduce power-law-like behavior. General formulas for two extreme statistics, the survival probability and the mean exit time, are also derived. To independently corroborate these results, Monte Carlo methods were used. These numerical estimations are in full agreement with the analytical predictions.
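    The kind of Monte Carlo check the abstract mentions is easy to sketch: simulate a monotonic walk with constant drift, Poisson-arriving jumps, and random resets to the origin, and estimate the mean exit time through a level. All parameter names and values here are illustrative, not the paper's:

    ```python
    import random

    def mean_exit_time(level, drift=1.0, jump_rate=1.0, jump_mean=1.0,
                       reset_rate=0.2, n_paths=2000, dt=0.01, t_max=1000.0,
                       seed=7):
        """Monte Carlo mean exit time of a drifting monotonic walk with resets.

        Between resets the walker moves up by drift*dt plus Poisson-arriving
        exponential jumps; a reset event returns it to the origin.
        """
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n_paths):
            x, t = 0.0, 0.0
            while x < level and t < t_max:
                t += dt
                x += drift * dt                       # deterministic drift
                if rng.random() < jump_rate * dt:     # random upward jump
                    x += rng.expovariate(1.0 / jump_mean)
                if rng.random() < reset_rate * dt:    # reset to the origin
                    x = 0.0
            total += t
        return total / n_paths
    ```

    With resets switched off and no jumps, the exit time reduces to the deterministic level/drift; switching resets on lengthens it, which gives a quick sanity check against analytical mean-exit-time formulas of the type derived in the paper.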

  8. Continuous-time random walks with reset events. Historical background and new perspectives

    NASA Astrophysics Data System (ADS)

    Montero, Miquel; Masó-Puigdellosas, Axel; Villarroel, Javier

    2017-09-01

    In this paper, we consider a stochastic process that may experience random reset events which relocate the system to its starting position. We focus our attention on a one-dimensional, monotonic continuous-time random walk with a constant drift: the process moves in a fixed direction between the reset events, either by the effect of the random jumps, or by the action of a deterministic bias. However, the orientation of its motion is randomly determined after each restart. As a result of these alternating dynamics, interesting properties do emerge. General formulas for the propagator as well as for two extreme statistics, the survival probability and the mean first-passage time, are also derived. The rigor of these analytical results is verified by numerical estimations, for particular but illuminating examples.

  9. Covert signs of expectancy in serial reaction time tasks revealed by event-related potentials.

    PubMed

    Sommer, W; Leuthold, H; Soetens, E

    1999-02-01

    Choice reaction time is strongly determined by the sequence of preceding stimuli. With long response-stimulus intervals (RSIs), a cost-benefit pattern is observed, which has been related to expectancy, whereas with short RSIs a benefit-only pattern emerges, possibly because of automatic facilitation. In the present study, event-related potentials were recorded while subjects performed serial choice responses to visual and auditory stimuli at long and short RSIs. As expected, reaction times displayed cost-benefit and benefit-only patterns at long and short RSIs, respectively. In contrast, sequential effects in event-related potential amplitudes displayed a cost-benefit pattern, unaffected by the RSI. The results demonstrate that an expectancy-like mechanism is always active in serial tasks but appears to influence performance only when the RSI is long.

  10. A Bayesian Approach for Instrumental Variable Analysis with Censored Time-to-Event Outcome

    PubMed Central

    Li, Gang; Lu, Xuyang

    2014-01-01

    Instrumental variable (IV) analysis has been widely used in economics, epidemiology, and other fields to estimate the causal effects of covariates on outcomes, in the presence of unobserved confounders and/or measurement errors in covariates. However, IV methods for time-to-event outcome with censored data remain underdeveloped. This paper proposes a Bayesian approach for IV analysis with censored time-to-event outcome by using a two-stage linear model. A Markov Chain Monte Carlo sampling method is developed for parameter estimation for both normal and non-normal linear models with elliptically contoured error distributions. Performance of our method is examined by simulation studies. Our method largely reduces bias and greatly improves coverage probability of the estimated causal effect, compared to the method that ignores the unobserved confounders and measurement errors. We illustrate our method on the Women's Health Initiative Observational Study and the Atherosclerosis Risk in Communities Study. PMID:25393617

  11. Prediction of a time-to-event trait using genome wide SNP data

    PubMed Central

    2013-01-01

    Background A popular objective of many high-throughput genome projects is to discover various genomic markers associated with traits and develop statistical models to predict traits of future patients based on marker values. Results In this paper, we present a prediction method for time-to-event traits using genome-wide single-nucleotide polymorphisms (SNPs). We also propose a MaxTest associating between a time-to-event trait and a SNP accounting for its possible genetic models. The proposed MaxTest can help screen out nonprognostic SNPs and identify genetic models of prognostic SNPs. The performance of the proposed method is evaluated through simulations. Conclusions In conjunction with the MaxTest, the proposed method provides more parsimonious prediction models but includes more prognostic SNPs than some naive prediction methods. The proposed method is demonstrated with real GWAS data. PMID:23418752

  12. Dietary patterns associated with overweight and obesity among Brazilian schoolchildren: an approach based on the time-of-day of eating events.

    PubMed

    Kupek, Emil; Lobo, Adriana S; Leal, Danielle B; Bellisle, France; de Assis, Maria Alice A

    2016-12-01

    Several studies reported that the timing of eating events has critical implications in the prevention of obesity, but dietary patterns regarding the time-of-day have not been explored in children. The aim of this study was to derive latent food patterns of daily eating events and to examine their associations with overweight/obesity among schoolchildren. A population-based cross-sectional study was conducted with 7-10-year-old Brazilian schoolchildren (n = 1232) who completed the Previous Day Food Questionnaire, illustrated with twenty-one foods/beverages in six daily eating events. Latent class analysis was used to derive dietary patterns whose association with child weight status was evaluated by multivariate multinomial regression. Four mutually exclusive latent classes of dietary patterns were identified and labelled according to the time-of-day of eating events and food intake probability (FIP): (A) higher FIP only at lunch; (B) lower FIP at all eating events; (C) higher FIP at lunch, afternoon and evening snacks; (D) lower FIP at breakfast and at evening snack, higher FIP at other meals/snacks. The percentages of children within these classes were 32.3, 48.6, 15.1, and 4.0%, respectively. After controlling for potential confounders, the mean probabilities of obesity for these classes were 6% (95% CI 3.0, 9.0), 13% (95% CI 9.0, 17.0), 12% (95% CI 6.0, 19.0) and 11% (95% CI 5.0, 17.0), in the same order. In conclusion, the children eating traditional lunch with rice and beans as the main meal of the day (class A) had the lowest obesity risk, thus reinforcing the importance of both the food type and the time-of-day of its intake for weight status.

  13. Racial Disparities for Age at Time of Cardiovascular Events and Cardiovascular Death in SLE Patients

    PubMed Central

    Scalzi, Lisabeth V.; Hollenbeak, Christopher S.; Wang, Li

    2010-01-01

    Objective The aim of this study was to determine if there are racial disparities in regard to the age at which SLE patients experience CVD and CVD-associated death. Methods Using the 2003–2006 National Inpatient Sample, we calculated the age difference between SLE patients and their race- and gender-matched controls at the time of hospitalization for a cardiovascular (CVD) event and for CVD-associated death. In addition, we also calculated the age difference for the same outcomes between White SLE patients and gender-matched controls for each minority group. Results The mean age difference at the time of CVD event between women with and without SLE was 10.5 years. All age differences between women with SLE (n=3,625) and women without SLE admitted for CVD were significant (p<0.0001). Black women were the youngest female SLE racial group to be admitted with CVD (53.9 years) and have a CVD-associated in-hospital mortality (52.8 years; n=218). Black SLE women were 19.8 years younger than race- and gender-matched controls at the time of CVD-associated death. Admission trends for CVD were reversed for Black women such that the highest proportions of these patients were admitted before age 55 and then steadily decreased across age categories. There were 805 men with SLE admitted with a CVD event, with Black and Hispanic groups being the youngest. Conclusions There are significant racial disparities with regard to age at the time of hospital admission for CVD events and a CVD-related hospitalization resulting in death in patients with SLE. PMID:20506536

  14. Not ready for prime time: transitional events in the extremely preterm infant.

    PubMed

    Armentrout, Debra

    2014-01-01

    Successful transition from intrauterine to extrauterine life involves significant physiologic changes. The majority of these changes occur relatively quickly during those first moments following delivery; however, transition for the extremely preterm infant occurs over a longer period of time. Careful assessment and perceptive interventions on the part of neonatal care providers are essential as the extremely preterm infant adjusts to life outside the womb. This article will focus on respiratory, cardiovascular, gastrointestinal, and neurologic transitional events experienced by the extremely premature infant.

  15. The timing of the Black Sea flood event: Insights from modeling of glacial isostatic adjustment

    NASA Astrophysics Data System (ADS)

    Goldberg, Samuel L.; Lau, Harriet C. P.; Mitrovica, Jerry X.; Latychev, Konstantin

    2016-10-01

    We present a suite of gravitationally self-consistent predictions of sea-level change since Last Glacial Maximum (LGM) in the vicinity of the Bosphorus and Dardanelles straits that combine signals associated with glacial isostatic adjustment (GIA) and the flooding of the Black Sea. Our predictions are tuned to fit