Sample records for multivariate event time

  1. Mining Recent Temporal Patterns for Event Detection in Multivariate Time Series Data

    PubMed Central

    Batal, Iyad; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2015-01-01

    Improving the performance of classifiers using pattern mining techniques has been an active topic of data mining research. In this work we introduce the recent temporal pattern mining framework for finding predictive patterns for monitoring and event detection problems in complex multivariate time series data. This framework first converts time series into time-interval sequences of temporal abstractions. It then constructs more complex temporal patterns backwards in time using temporal operators. We apply our framework to health care data of 13,558 diabetic patients and show its benefits by efficiently finding useful patterns for detecting and diagnosing adverse medical conditions that are associated with diabetes. PMID:25937993
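
    The first step described here (turning a numeric series into a time-interval sequence of temporal abstractions) can be illustrated with a small sketch. This is not the authors' implementation: the thresholds, the L/N/H state labels, and the `abstract_series` helper are hypothetical choices, shown only to make the conversion concrete.

    ```python
    # Minimal sketch: convert a numeric series into (state, start, end)
    # intervals -- the symbolic "value abstractions" that a temporal
    # pattern miner would then combine with temporal operators.
    # Thresholds and labels are illustrative assumptions.

    def abstract_series(times, values, low=4.0, high=7.0):
        """Return a list of (state, start_time, end_time) intervals."""
        def state(v):
            return "L" if v < low else ("H" if v > high else "N")

        intervals = []
        start, current = times[0], state(values[0])
        for t, v in zip(times[1:], values[1:]):
            s = state(v)
            if s != current:              # a state change closes the open interval
                intervals.append((current, start, t))
                current, start = s, t
        intervals.append((current, start, times[-1]))
        return intervals

    # Toy glucose-like series (numbers are made up)
    times  = [0, 1, 2, 3, 4, 5, 6]
    values = [5.1, 5.3, 8.2, 8.9, 6.4, 3.8, 3.5]
    print(abstract_series(times, values))
    # -> [('N', 0, 2), ('H', 2, 4), ('N', 4, 5), ('L', 5, 6)]
    ```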

  2. Contamination Event Detection with Multivariate Time-Series Data in Agricultural Water Monitoring †

    PubMed Central

    Mao, Yingchi; Qi, Hai; Ping, Ping; Li, Xiaofang

    2017-01-01

    Time series data for multiple water quality parameters are obtained from water sensor networks deployed in an agricultural water supply network. Accurate and efficient detection of, and warning about, contamination events is one of the most important issues for preventing pollution from spreading once it occurs. To reduce event detection error, a spatio-temporal event detection approach for multivariate time-series water quality data (M-STED) was proposed. The M-STED approach has three parts. First, M-STED adopts a Rule K algorithm to select backbone nodes, i.e., the nodes of a connected dominating set (CDS), which forward the sensed data for the multiple water parameters. Second, the state of each backbone node at the current timestamp is determined with back-propagation neural network models and sequential Bayesian analysis. Third, a spatial model built with Bayesian networks estimates the state of the backbone nodes at the next timestamp and traces an "outlier" node to its neighborhood to detect a contamination event. The experimental results indicate that the average detection rate with M-STED is above 80% and the false detection rate is below 9%. Compared with S-STED, an event detection algorithm using a single water parameter, M-STED improves the detection rate by about 40% and reduces the false alarm rate by about 45%. Moreover, M-STED exhibits better performance in terms of detection delay and scalability. PMID:29207535

  3. Visualizing frequent patterns in large multivariate time series

    NASA Astrophysics Data System (ADS)

    Hao, M.; Marwah, M.; Janetzko, H.; Sharma, R.; Keim, D. A.; Dayal, U.; Patnaik, D.; Ramakrishnan, N.

    2011-01-01

    The detection of previously unknown, frequently occurring patterns in time series, often called motifs, has been recognized as an important task. However, it is difficult to discover and visualize these motifs as their numbers increase, especially in large multivariate time series. To find frequent motifs, we use several temporal data mining and event encoding techniques to cluster and convert a multivariate time series to a sequence of events. Then we quantify the efficiency of the discovered motifs by linking them with a performance metric. To visualize frequent patterns in a large time series with potentially hundreds of nested motifs on a single display, we introduce three novel visual analytics methods: (1) motif layout, using colored rectangles for visualizing the occurrences and hierarchical relationships of motifs in a multivariate time series, (2) motif distortion, for enlarging or shrinking motifs as appropriate for easy analysis and (3) motif merging, to combine a number of identical adjacent motif instances without cluttering the display. Analysts can interactively optimize the degree of distortion and merging to get the best possible view. A specific motif (e.g., the most efficient or least efficient motif) can be quickly detected from a large time series for further investigation. We have applied these methods to two real-world data sets: data center cooling and oil well production. The results provide important new insights into the recurring patterns.

  4. Multivariate Statistical Modelling of Drought and Heat Wave Events

    NASA Astrophysics Data System (ADS)

    Manning, Colin; Widmann, Martin; Vrac, Mathieu; Maraun, Douglas; Bevaqua, Emanuele

    2016-04-01

    Compound extreme events are a combination of two or more contributing events which in themselves may not be extreme but through their joint occurrence produce an extreme impact. Compound events are noted in the latest IPCC report as an important type of extreme event that has so far been given little attention. As part of the CE:LLO project (Compound Events: muLtivariate statisticaL mOdelling) we are developing a multivariate statistical model to gain an understanding of the dependence structure of certain compound events. One focus of this project is the interaction between drought and heat wave events. Soil moisture has both a local and a non-local effect on the occurrence of heat waves, as it strongly controls the latent heat flux and thereby the transfer of sensible heat to the atmosphere. These processes can create a feedback whereby a heat wave may be amplified or suppressed by the soil moisture preconditioning and, vice versa, the heat wave may in turn have an effect on soil conditions. An aim of this project is to capture this dependence in order to correctly describe the joint probabilities of these conditions and the resulting probability of their compound impact. We will show an application of Pair Copula Constructions (PCCs) to study the aforementioned compound event. PCCs allow, in theory, for the formulation of multivariate dependence structures in any dimension, where the PCC is a decomposition of a multivariate distribution into a product of bivariate components modelled using copulas.

  5. An Efficient Pattern Mining Approach for Event Detection in Multivariate Temporal Data

    PubMed Central

    Batal, Iyad; Cooper, Gregory; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2015-01-01

    This work proposes a pattern mining approach to learn event detection models from complex multivariate temporal data, such as electronic health records. We present Recent Temporal Pattern mining, a novel approach for efficiently finding predictive patterns for event detection problems. This approach first converts the time series data into time-interval sequences of temporal abstractions. It then constructs more complex time-interval patterns backward in time using temporal operators. We also present the Minimal Predictive Recent Temporal Patterns framework for selecting a small set of predictive and non-spurious patterns. We apply our methods for predicting adverse medical events in real-world clinical data. The results demonstrate the benefits of our methods in learning accurate event detection models, which is a key step for developing intelligent patient monitoring and decision support systems. PMID:26752800

  6. Multivariate assessment of event-related potentials with the t-CWT method.

    PubMed

    Bostanov, Vladimir

    2015-11-05

    Event-related brain potentials (ERPs) are usually assessed with univariate statistical tests although they are essentially multivariate objects. Brain-computer interface applications are a notable exception to this practice, because they are based on multivariate classification of single-trial ERPs. Multivariate ERP assessment can be facilitated by feature extraction methods. One such method is t-CWT, a mathematical-statistical algorithm based on the continuous wavelet transform (CWT) and Student's t-test. This article begins with a geometric primer on some basic concepts of multivariate statistics as applied to ERP assessment in general and to the t-CWT method in particular. Further, it presents for the first time a detailed, step-by-step, formal mathematical description of the t-CWT algorithm. A new multivariate outlier rejection procedure based on principal component analysis in the frequency domain is presented as an important pre-processing step. The MATLAB and GNU Octave implementation of t-CWT is also made publicly available for the first time as free and open source code. The method is demonstrated on some example ERP data obtained in a passive oddball paradigm. Finally, some conceptually novel applications of the multivariate approach in general and of the t-CWT method in particular are suggested and discussed. Hopefully, the publication of both the t-CWT source code and its underlying mathematical algorithm along with a didactic geometric introduction to some basic concepts of multivariate statistics would make t-CWT more accessible to both users and developers in the field of neuroscience research.
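
    A rough sketch of the core t-CWT idea follows: wavelet-transform each single trial, then run pointwise t-tests between conditions to locate discriminative time-scale features. This is not the published MATLAB/GNU Octave implementation; it assumes the PyWavelets (`pywt`) and SciPy packages are available, and the Morlet wavelet, scale grid, and simulated data are illustrative choices only.

    ```python
    # Sketch of the t-CWT idea: CWT of each single trial, then a pointwise
    # Student t map between two conditions; the largest |t| cells mark
    # candidate time-scale features.  Not the reference implementation.
    import numpy as np
    import pywt
    from scipy import stats

    rng = np.random.default_rng(0)
    fs, n_time, n_trials = 250, 300, 40          # assumed sampling rate and sizes

    # Fake single-trial ERPs: condition B gets a small "P300-like" bump.
    t = np.arange(n_time) / fs
    bump = np.exp(-((t - 0.5) ** 2) / 0.01)
    cond_a = rng.normal(0, 1, (n_trials, n_time))
    cond_b = rng.normal(0, 1, (n_trials, n_time)) + 0.8 * bump

    scales = np.arange(2, 40)                    # illustrative scale grid

    def trial_cwt(trials):
        # -> array of shape (n_trials, n_scales, n_time)
        return np.stack([pywt.cwt(x, scales, 'morl')[0] for x in trials])

    cwt_a, cwt_b = trial_cwt(cond_a), trial_cwt(cond_b)

    # Pointwise two-sample t-test over trials for every (scale, time) cell.
    t_map, _ = stats.ttest_ind(cwt_a, cwt_b, axis=0)

    s_idx, time_idx = np.unravel_index(np.abs(t_map).argmax(), t_map.shape)
    print(f"strongest feature: scale={scales[s_idx]}, "
          f"time={t[time_idx]:.3f}s, t={t_map[s_idx, time_idx]:.2f}")
    ```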

  7. Joint Models of Longitudinal and Time-to-Event Data with More Than One Event Time Outcome: A Review.

    PubMed

    Hickey, Graeme L; Philipson, Pete; Jorgensen, Andrea; Kolamunnage-Dona, Ruwanthi

    2018-01-31

    Methodological development and clinical application of joint models of longitudinal and time-to-event outcomes have grown substantially over the past two decades. However, much of this research has concentrated on a single longitudinal outcome and a single event time outcome. In clinical and public health research, patients who are followed up over time may often experience multiple, recurrent, or a succession of clinical events. Models that utilise such multivariate event time outcomes are quite valuable in clinical decision-making. We comprehensively review the literature for implementation of joint models involving more than a single event time per subject. We consider the distributional and modelling assumptions, including the association structure, estimation approaches, software implementations, and clinical applications. Research into this area is proving highly promising, but to date remains in its infancy.

  8. Estimating the ratio of multivariate recurrent event rates with application to a blood transfusion study.

    PubMed

    Ning, Jing; Rahbar, Mohammad H; Choi, Sangbum; Piao, Jin; Hong, Chuan; Del Junco, Deborah J; Rahbar, Elaheh; Fox, Erin E; Holcomb, John B; Wang, Mei-Cheng

    2017-08-01

    In comparative effectiveness studies of multicomponent, sequential interventions like blood product transfusion (plasma, platelets, red blood cells) for trauma and critical care patients, the timing and dynamics of treatment relative to the fragility of a patient's condition is often overlooked and underappreciated. While many hospitals have established massive transfusion protocols to ensure that physiologically optimal combinations of blood products are rapidly available, the period of time required to achieve a specified massive transfusion standard (e.g. a 1:1 or 1:2 ratio of plasma or platelets:red blood cells) has been ignored. To account for the time-varying characteristics of transfusions, we use semiparametric rate models for multivariate recurrent events to estimate blood product ratios. We use latent variables to account for multiple sources of informative censoring (early surgical or endovascular hemorrhage control procedures or death). The major advantage is that the distributions of latent variables and the dependence structure between the multivariate recurrent events and informative censoring need not be specified. Thus, our approach is robust to complex model assumptions. We establish asymptotic properties and evaluate finite sample performance through simulations, and apply the method to data from the PRospective Observational Multicenter Major Trauma Transfusion study.

  9. Multivariate Time Series Decomposition into Oscillation Components.

    PubMed

    Matsuda, Takeru; Komaki, Fumiyasu

    2017-08-01

    Many time series are considered to be a superposition of several oscillation components. We have proposed a method for decomposing univariate time series into oscillation components and estimating their phases (Matsuda & Komaki, 2017). In this study, we extend that method to multivariate time series. We assume that several oscillators underlie the given multivariate time series and that each variable corresponds to a superposition of the projections of the oscillators. Thus, the oscillators superpose on each variable with amplitude and phase modulation. Based on this idea, we develop gaussian linear state-space models and use them to decompose the given multivariate time series. The model parameters are estimated from data using the empirical Bayes method, and the number of oscillators is determined using the Akaike information criterion. Therefore, the proposed method extracts underlying oscillators in a data-driven manner and enables investigation of phase dynamics in a given multivariate time series. Numerical results show the effectiveness of the proposed method. From monthly mean north-south sunspot number data, the proposed method reveals an interesting phase relationship.

  10. Measuring agreement of multivariate discrete survival times using a modified weighted kappa coefficient.

    PubMed

    Guo, Ying; Manatunga, Amita K

    2009-03-01

    Assessing agreement is often of interest in clinical studies to evaluate the similarity of measurements produced by different raters or methods on the same subjects. We present a modified weighted kappa coefficient to measure agreement between bivariate discrete survival times. The proposed kappa coefficient accommodates censoring by redistributing the mass of censored observations within the grid where the unobserved events may potentially happen. A generalized modified weighted kappa is proposed for multivariate discrete survival times. We estimate the modified kappa coefficients nonparametrically through a multivariate survival function estimator. The asymptotic properties of the kappa estimators are established and the performance of the estimators are examined through simulation studies of bivariate and trivariate survival times. We illustrate the application of the modified kappa coefficient in the presence of censored observations with data from a prostate cancer study.

  11. Drunk driving detection based on classification of multivariate time series.

    PubMed

    Li, Zhenlong; Jin, Xue; Zhao, Xiaohua

    2015-09-01

    This paper addresses the problem of detecting drunk driving based on classification of multivariate time series. First, driving performance measures were collected from a test in a driving simulator located in the Traffic Research Center, Beijing University of Technology. Lateral position and steering angle were used to detect drunk driving. Second, multivariate time series analysis was performed to extract the features. A piecewise linear representation was used to represent multivariate time series. A bottom-up algorithm was then employed to separate multivariate time series. The slope and time interval of each segment were extracted as the features for classification. Third, a support vector machine classifier was used to classify driver's state into two classes (normal or drunk) according to the extracted features. The proposed approach achieved an accuracy of 80.0%. Drunk driving detection based on the analysis of multivariate time series is feasible and effective. The approach has implications for drunk driving detection. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.
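
    A simplified sketch of the pipeline described above follows: piecewise-linear features (slope per segment) extracted from lateral position and steering angle, then an SVM classifier. It uses fixed-length segments instead of the paper's bottom-up segmentation, synthetic data, and made-up noise levels, so it illustrates the shape of the approach rather than reproducing it.

    ```python
    # Sketch: segment two driving signals, use per-segment slopes as features,
    # classify normal vs. "drunk" drives with an SVM.  Fixed-length segments
    # and synthetic data are simplifying assumptions.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)

    def segment_slopes(series, seg_len=25):
        """Least-squares slope of each fixed-length segment of a 1-D series."""
        n_seg = len(series) // seg_len
        x = np.arange(seg_len)
        return np.array([np.polyfit(x, series[i * seg_len:(i + 1) * seg_len], 1)[0]
                         for i in range(n_seg)])

    def make_drive(drunk, n=500):
        """Synthetic drive: lateral position and steering wander more when 'drunk'."""
        noise = 0.6 if drunk else 0.2
        lateral  = np.cumsum(rng.normal(0, noise, n))
        steering = np.cumsum(rng.normal(0, noise, n))
        return np.concatenate([segment_slopes(lateral), segment_slopes(steering)])

    X = np.array([make_drive(drunk=(i % 2 == 1)) for i in range(200)])
    y = np.array([i % 2 for i in range(200)])

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = SVC(kernel="rbf").fit(X_tr, y_tr)
    print("hold-out accuracy:", clf.score(X_te, y_te))
    ```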

  12. Network structure of multivariate time series.

    PubMed

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-10-21

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exist, the increasing availability of massive data structures calls for new approaches to multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on mapping a multidimensional time series into a multilayer network, which makes it possible to extract information on a high-dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow us to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
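
    A small sketch of one common way such a time-series-to-multilayer-network mapping is built: each variable becomes one layer, here constructed as a horizontal visibility graph (HVG), and a simple per-layer descriptor is then read off. The HVG rule is standard, but treating mean degree as the structural descriptor is an illustrative assumption rather than the exact quantities used in the article. Assumes `networkx`.

    ```python
    # Sketch: one HVG layer per variable of a multivariate series, forming a
    # multiplex network; then a simple structural descriptor per layer.
    # O(n^2) HVG construction, written for clarity rather than speed.
    import numpy as np
    import networkx as nx

    def horizontal_visibility_graph(x):
        """i ~ j iff every value strictly between them is below min(x[i], x[j])."""
        g = nx.Graph()
        n = len(x)
        g.add_nodes_from(range(n))
        for i in range(n):
            for j in range(i + 1, n):
                if j == i + 1 or max(x[i + 1:j]) < min(x[i], x[j]):
                    g.add_edge(i, j)
        return g

    rng = np.random.default_rng(2)
    n = 300
    x1 = np.sin(0.2 * np.arange(n)) + 0.3 * rng.normal(size=n)   # quasi-periodic
    x2 = 0.8 * x1 + 0.3 * rng.normal(size=n)                     # coupled to x1
    x3 = rng.normal(size=n)                                      # independent noise

    layers = {name: horizontal_visibility_graph(x)
              for name, x in (("x1", x1), ("x2", x2), ("x3", x3))}
    for name, g in layers.items():
        mean_degree = np.mean([d for _, d in g.degree()])
        print(f"layer {name}: mean degree = {mean_degree:.2f}")
    ```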

  13. Multivariate Bayesian modeling of known and unknown causes of events--an application to biosurveillance.

    PubMed

    Shen, Yanna; Cooper, Gregory F

    2012-09-01

    This paper investigates Bayesian modeling of known and unknown causes of events in the context of disease-outbreak detection. We introduce a multivariate Bayesian approach that models multiple evidential features of every person in the population. This approach models and detects (1) known diseases (e.g., influenza and anthrax) by using informative prior probabilities and (2) unknown diseases (e.g., a new, highly contagious respiratory virus that has never been seen before) by using relatively non-informative prior probabilities. We report the results of simulation experiments which support that this modeling method can improve the detection of new disease outbreaks in a population. A contribution of this paper is that it introduces a multivariate Bayesian approach for jointly modeling both known and unknown causes of events. Such modeling has general applicability in domains where the space of known causes is incomplete. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  14. Nonparametric Bayesian Segmentation of a Multivariate Inhomogeneous Space-Time Poisson Process.

    PubMed

    Ding, Mingtao; He, Lihan; Dunson, David; Carin, Lawrence

    2012-12-01

    A nonparametric Bayesian model is proposed for segmenting time-evolving multivariate spatial point process data. An inhomogeneous Poisson process is assumed, with a logistic stick-breaking process (LSBP) used to encourage piecewise-constant spatial Poisson intensities. The LSBP explicitly favors spatially contiguous segments, and infers the number of segments based on the observed data. The temporal dynamics of the segmentation and of the Poisson intensities are modeled with exponential correlation in time, implemented in the form of a first-order autoregressive model for uniformly sampled discrete data, and via a Gaussian process with an exponential kernel for general temporal sampling. We consider and compare two different inference techniques: a Markov chain Monte Carlo sampler, which has relatively high computational complexity; and an approximate and efficient variational Bayesian analysis. The model is demonstrated with a simulated example and a real example of space-time crime events in Cincinnati, Ohio, USA.

  15. Bayesian inference on risk differences: an application to multivariate meta-analysis of adverse events in clinical trials.

    PubMed

    Chen, Yong; Luo, Sheng; Chu, Haitao; Wei, Peng

    2013-05-01

    Multivariate meta-analysis is useful in combining evidence from independent studies which involve several comparisons among groups based on a single outcome. For binary outcomes, the commonly used statistical models for multivariate meta-analysis are multivariate generalized linear mixed effects models which assume risks, after some transformation, follow a multivariate normal distribution with possible correlations. In this article, we consider an alternative model for multivariate meta-analysis where the risks are modeled by the multivariate beta distribution proposed by Sarmanov (1966). This model has several attractive features compared to the conventional multivariate generalized linear mixed effects models, including a simple likelihood function, no need to specify a link function, and a closed-form expression for the distribution functions of study-specific risk differences. We investigate the finite sample performance of this model by simulation studies and illustrate its use with an application to multivariate meta-analysis of adverse events of tricyclic antidepressants treatment in clinical trials.
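
    A deliberately simplified sketch of the Bayesian risk-difference idea for a single two-arm study: independent Beta posteriors per arm and Monte Carlo draws of the risk difference. The Sarmanov multivariate beta model of the article induces correlation between the arm-specific risks, which this sketch does not capture; the counts and uniform priors are made up.

    ```python
    # Simplified sketch: posterior of a risk difference from one 2-arm study
    # using independent Beta posteriors (NOT the correlated Sarmanov model).
    import numpy as np

    rng = np.random.default_rng(3)

    events_trt, n_trt = 12, 150      # adverse events / patients, treatment arm
    events_ctl, n_ctl = 5, 148       # adverse events / patients, control arm

    # Beta(1,1) priors -> Beta posteriors for each arm's event risk
    p_trt = rng.beta(1 + events_trt, 1 + n_trt - events_trt, size=100_000)
    p_ctl = rng.beta(1 + events_ctl, 1 + n_ctl - events_ctl, size=100_000)

    rd = p_trt - p_ctl               # posterior draws of the risk difference
    lo, hi = np.percentile(rd, [2.5, 97.5])
    print(f"posterior mean RD = {rd.mean():.3f}, 95% CrI = ({lo:.3f}, {hi:.3f})")
    print("P(RD > 0) =", float((rd > 0).mean()))
    ```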

  16. Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models

    ERIC Educational Resources Information Center

    Price, Larry R.

    2012-01-01

    The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…

  17. A statistical approach for segregating cognitive task stages from multivariate fMRI BOLD time series.

    PubMed

    Demanuele, Charmaine; Bähner, Florian; Plichta, Michael M; Kirsch, Peter; Tost, Heike; Meyer-Lindenberg, Andreas; Durstewitz, Daniel

    2015-01-01

    Multivariate pattern analysis can reveal new information from neuroimaging data to illuminate human cognition and its disturbances. Here, we develop a methodological approach, based on multivariate statistical/machine learning and time series analysis, to discern cognitive processing stages from functional magnetic resonance imaging (fMRI) blood oxygenation level dependent (BOLD) time series. We apply this method to data recorded from a group of healthy adults whilst performing a virtual reality version of the delayed win-shift radial arm maze (RAM) task. This task has been frequently used to study working memory and decision making in rodents. Using linear classifiers and multivariate test statistics in conjunction with time series bootstraps, we show that different cognitive stages of the task, as defined by the experimenter, namely, the encoding/retrieval, choice, reward and delay stages, can be statistically discriminated from the BOLD time series in brain areas relevant for decision making and working memory. Discrimination of these task stages was significantly reduced during poor behavioral performance in dorsolateral prefrontal cortex (DLPFC), but not in the primary visual cortex (V1). Experimenter-defined dissection of time series into class labels based on task structure was confirmed by an unsupervised, bottom-up approach based on Hidden Markov Models. Furthermore, we show that different groupings of recorded time points into cognitive event classes can be used to test hypotheses about the specific cognitive role of a given brain region during task execution. We found that whilst the DLPFC strongly differentiated between task stages associated with different memory loads, but not between different visual-spatial aspects, the reverse was true for V1. Our methodology illustrates how different aspects of cognitive information processing during one and the same task can be separated and attributed to specific brain regions based on information contained in

  18. Multivariable nonlinear analysis of foreign exchange rates

    NASA Astrophysics Data System (ADS)

    Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo

    2003-05-01

    We analyze the multivariable time series of foreign exchange rates: price movements, which have often been analyzed, together with dealing time intervals and the spreads between bid and ask prices. Considering dealing time intervals as event timing, analogous to neurons' firings, we use raster plots (RPs) and peri-stimulus time histograms (PSTHs), which are popular methods in the field of neurophysiology. Introducing special processing to obtain RPs and PSTHs for exchange rate time series, we discover that there exists dynamical interaction among the three variables. We also find that adopting multiple variables leads to improvements in prediction accuracy.
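
    For readers unfamiliar with the neurophysiology tools mentioned here, a tiny sketch of a PSTH computed from event times follows: deal times play the role of spikes and are binned around a set of reference ("stimulus") times. The data, window and bin width are arbitrary illustrative choices, not the paper's processing.

    ```python
    # Sketch: peri-stimulus time histogram (PSTH) of event times (e.g. deals)
    # aligned to reference times (e.g. large price moves).  Toy data only.
    import numpy as np

    rng = np.random.default_rng(9)

    deal_times = np.sort(rng.uniform(0, 3600, 5000))     # deal timestamps (s) in one hour
    stimuli    = np.arange(300, 3600, 300.0)             # reference ("stimulus") times

    window, bin_w = 60.0, 5.0                            # +/- 60 s window, 5 s bins
    edges = np.arange(-window, window + bin_w, bin_w)

    counts = np.zeros(len(edges) - 1)
    for s in stimuli:
        rel = deal_times[(deal_times >= s - window) & (deal_times <= s + window)] - s
        counts += np.histogram(rel, bins=edges)[0]

    rate = counts / (len(stimuli) * bin_w)               # average event rate per bin (1/s)
    print("peri-stimulus event rate by bin:", np.round(rate, 3))
    ```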

  19. Reconstructing multi-mode networks from multivariate time series

    NASA Astrophysics Data System (ADS)

    Gao, Zhong-Ke; Yang, Yu-Xuan; Dang, Wei-Dong; Cai, Qing; Wang, Zhen; Marwan, Norbert; Boccaletti, Stefano; Kurths, Jürgen

    2017-09-01

    Unveiling the dynamics hidden in multivariate time series is a task of the utmost importance in a broad variety of areas in physics. We here propose a method that leads to the construction of a novel functional network, a multi-mode weighted graph combined with an empirical mode decomposition, and to the realization of multi-information fusion of multivariate time series. The method is illustrated in a couple of successful applications (a multi-phase flow and an epileptic electro-encephalogram), which demonstrate its power in revealing the dynamical behaviors underlying the transitions between different flow patterns, and in differentiating brain states of seizure and non-seizure.

  20. Multivariate Prediction Equations for HbA1c Lowering, Weight Change, and Hypoglycemic Events Associated with Insulin Rescue Medication in Type 2 Diabetes Mellitus: Informing Economic Modeling.

    PubMed

    Willis, Michael; Asseburg, Christian; Nilsson, Andreas; Johnsson, Kristina; Kartman, Bernt

    2017-03-01

    Type 2 diabetes mellitus (T2DM) is chronic and progressive and the cost-effectiveness of new treatment interventions must be established over long time horizons. Given the limited durability of drugs, assumptions regarding downstream rescue medication can drive results. Especially for insulin, for which treatment effects and adverse events are known to depend on patient characteristics, this can be problematic for health economic evaluation involving modeling. To estimate parsimonious multivariate equations of treatment effects and hypoglycemic event risks for use in parameterizing insulin rescue therapy in model-based cost-effectiveness analysis. Clinical evidence for insulin use in T2DM was identified in PubMed and from published reviews and meta-analyses. Study and patient characteristics and treatment effects and adverse event rates were extracted and the data used to estimate parsimonious treatment effect and hypoglycemic event risk equations using multivariate regression analysis. Data from 91 studies featuring 171 usable study arms were identified, mostly for premix and basal insulin types. Multivariate prediction equations for glycated hemoglobin A1c lowering and weight change were estimated separately for insulin-naive and insulin-experienced patients. Goodness of fit (R²) for both outcomes was generally good, ranging from 0.44 to 0.84. Multivariate prediction equations for symptomatic, nocturnal, and severe hypoglycemic events were also estimated, though considerable heterogeneity in definitions limits their usefulness. Parsimonious and robust multivariate prediction equations were estimated for glycated hemoglobin A1c and weight change, separately for insulin-naive and insulin-experienced patients. Using these in economic simulation modeling in T2DM can improve realism and flexibility in modeling insulin rescue medication. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  1. Multivariate hydrological frequency analysis for extreme events using Archimedean copula. Case study: Lower Tunjuelo River basin (Colombia)

    NASA Astrophysics Data System (ADS)

    Gómez, Wilmar

    2017-04-01

    By analyzing the spatial and temporal variability of extreme precipitation events we can prevent or reduce the associated threat and risk. Many water resources projects require joint probability distributions of random variables such as precipitation intensity and duration, which cannot be assumed independent of each other. The problem of defining a probability model for observations of several dependent variables is greatly simplified by expressing the joint distribution in terms of its marginals using copulas. This paper presents a general framework for bivariate and multivariate frequency analysis of extreme hydroclimatological events, such as severe storms, using Archimedean copulas. The analysis was conducted for precipitation events in the lower Tunjuelo River basin in Colombia. The results show that, for a joint study of intensity-duration-frequency, IDF curves can be obtained through copulas, providing more accurate and reliable information on design storms and associated risks. The study also shows how copulas greatly simplify the handling of multivariate distributions and introduce the concept of the joint return period, which properly represents the needs of hydrological design in frequency analysis.
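
    A small sketch of the copula-based joint-return-period calculation that such IDF analyses rely on, using the Gumbel-Hougaard copula (one Archimedean family) fitted by Kendall's tau inversion. The storm data are synthetic, the marginal distributions are not modelled, and the "OR" return-period definition and assumed event frequency are illustrative choices, not results from the Tunjuelo basin study.

    ```python
    # Sketch: fit a Gumbel-Hougaard copula to (intensity, duration) pairs via
    # Kendall's tau inversion, then compute the "OR" joint return period
    # T = mu / (1 - C(u0, v0)).  Synthetic data; tau is rank-based, so the
    # raw values suffice for estimating theta.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    n = 200                                        # e.g. 200 storm events
    duration  = stats.gamma.rvs(2.0, scale=3.0, size=n, random_state=rng)
    intensity = 5 + 0.8 * duration + stats.gamma.rvs(2.0, size=n, random_state=rng)

    tau, _ = stats.kendalltau(intensity, duration)
    theta = 1.0 / (1.0 - tau)                      # Gumbel-Hougaard parameter

    def gumbel_copula(u, v, theta):
        return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

    mu = 0.1                                       # mean inter-event time in years (assumed)
    u0, v0 = 0.95, 0.95                            # marginal non-exceedance probabilities
    T_or = mu / (1.0 - gumbel_copula(u0, v0, theta))
    print(f"theta = {theta:.2f}, joint 'OR' return period ~ {T_or:.1f} years")
    ```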

  2. F100 multivariable control synthesis program: Evaluation of a multivariable control using a real-time engine simulation

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Soeder, J. F.; Seldner, K.; Cwynar, D. S.

    1977-01-01

    The design, evaluation, and testing of a practical, multivariable, linear quadratic regulator control for the F100 turbofan engine were accomplished. NASA evaluation of the multivariable control logic and implementation is covered. The evaluation utilized a real-time, hybrid computer simulation of the engine. Results of the evaluation are presented, and recommendations concerning future engine testing of the control are made. Results indicated that the engine testing of the control should be conducted as planned.

  3. Applying the multivariate time-rescaling theorem to neural population models

    PubMed Central

    Gerhard, Felipe; Haslinger, Robert; Pipa, Gordon

    2011-01-01

    Statistical models of neural activity are integral to modern neuroscience. Recently, interest has grown in modeling the spiking activity of populations of simultaneously recorded neurons to study the effects of correlations and functional connectivity on neural information processing. However, any statistical model must be validated by an appropriate goodness-of-fit test. Kolmogorov-Smirnov tests based upon the time-rescaling theorem have proven to be useful for evaluating point-process-based statistical models of single-neuron spike trains. Here we discuss the extension of the time-rescaling theorem to the multivariate (neural population) case. We show that even in the presence of strong correlations between spike trains, models which neglect couplings between neurons can be erroneously passed by the univariate time-rescaling test. We present the multivariate version of the time-rescaling theorem, and provide a practical step-by-step procedure for applying it towards testing the sufficiency of neural population models. Using several simple analytically tractable models and also more complex simulated and real data sets, we demonstrate that important features of the population activity can only be detected using the multivariate extension of the test. PMID:21395436

  4. Experience with Event Timing Does not Alter Emergent Timing: Further Evidence for Robustness of Event and Emergent Timing.

    PubMed

    Pope, Megan A; Studenka, Breanna E

    2018-02-15

    Although event and emergent timing are thought of as mutually exclusive, significant correlations between tapping and circle drawing (Baer, Thibodeau, Gralnick, Li, & Penhune, 2013; Studenka, Zelaznik, & Balasubramaniam, 2012; Zelaznik & Rosenbaum, 2010) suggest that emergent timing may not be as robust as once thought. We aimed to test this hypothesis in both a younger (18-25) and an older (55-100) population. Participants performed one block of circle drawing as a baseline, then six blocks of tapping, followed by circle drawing. We examined the use of event timing. Our hypothesis that acute experience with event timing would bias an individual to use event timing during an emergent task was not supported. We instead support the robustness of event and emergent timing as independent timing modes.

  5. Clustering Multivariate Time Series Using Hidden Markov Models

    PubMed Central

    Ghassempour, Shima; Girosi, Federico; Maeder, Anthony

    2014-01-01

    In this paper we describe an algorithm for clustering multivariate time series with variables taking both categorical and continuous values. Time series of this type are frequent in health care, where they represent the health trajectories of individuals. The problem is challenging because categorical variables make it difficult to define a meaningful distance between trajectories. We propose an approach based on Hidden Markov Models (HMMs), where we first map each trajectory into an HMM, then define a suitable distance between HMMs and finally proceed to cluster the HMMs with a method based on a distance matrix. We test our approach on a simulated, but realistic, data set of 1,255 trajectories of individuals of age 45 and over, on a synthetic validation set with known clustering structure, and on a smaller set of 268 trajectories extracted from the longitudinal Health and Retirement Survey. The proposed method can be implemented quite simply using standard packages in R and Matlab and may be a good candidate for solving the difficult problem of clustering multivariate time series with categorical variables using tools that do not require advanced statistical knowledge, and therefore are accessible to a wide range of researchers. PMID:24662996
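
    A compact sketch of the clustering recipe described above (one HMM per trajectory, a symmetrized log-likelihood distance, hierarchical clustering of the resulting distance matrix). It assumes the `hmmlearn` package, uses a standard Juang-Rabiner-style distance, and handles continuous variables only, whereas the article's method also covers categorical ones; data sizes and model settings are made up.

    ```python
    # Sketch: fit one Gaussian HMM per trajectory, build a symmetrized
    # log-likelihood distance matrix, then cluster it hierarchically.
    import numpy as np
    from hmmlearn import hmm
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    rng = np.random.default_rng(5)

    def make_trajectory(drift, n=80):
        """Toy 2-variable health trajectory; `drift` separates the two groups."""
        return np.cumsum(rng.normal(drift, 1.0, (n, 2)), axis=0)

    trajs = [make_trajectory(0.05) for _ in range(6)] + \
            [make_trajectory(-0.05) for _ in range(6)]

    models = []
    for x in trajs:
        m = hmm.GaussianHMM(n_components=2, covariance_type="diag", n_iter=50)
        m.fit(x)
        models.append(m)

    def hmm_distance(i, j):
        """Symmetrized, length-normalized log-likelihood distance between two HMMs."""
        xi, xj = trajs[i], trajs[j]
        d_ij = (models[i].score(xi) - models[j].score(xi)) / len(xi)
        d_ji = (models[j].score(xj) - models[i].score(xj)) / len(xj)
        return max((d_ij + d_ji) / 2.0, 0.0)

    k = len(trajs)
    D = np.zeros((k, k))
    for i in range(k):
        for j in range(i + 1, k):
            D[i, j] = D[j, i] = hmm_distance(i, j)

    labels = fcluster(linkage(squareform(D), method="average"), t=2, criterion="maxclust")
    print("cluster labels:", labels)
    ```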

  6. A time domain frequency-selective multivariate Granger causality approach.

    PubMed

    Leistritz, Lutz; Witte, Herbert

    2016-08-01

    The investigation of effective connectivity is one of the major topics in computational neuroscience to understand the interaction between spatially distributed neuronal units of the brain. Thus, a wide variety of methods has been developed during the last decades to investigate functional and effective connectivity in multivariate systems. Their spectrum ranges from model-based to model-free approaches with a clear separation into time and frequency range methods. We present in this simulation study a novel time domain approach based on Granger's principle of predictability, which allows frequency-selective considerations of directed interactions. It is based on a comparison of prediction errors of multivariate autoregressive models fitted to systematically modified time series. These modifications are based on signal decompositions, which enable a targeted cancellation of specific signal components with specific spectral properties. Depending on the embedded signal decomposition method, a frequency-selective or data-driven signal-adaptive Granger Causality Index may be derived.

  7. A Framework and Algorithms for Multivariate Time Series Analytics (MTSA): Learning, Monitoring, and Recommendation

    ERIC Educational Resources Information Center

    Ngan, Chun-Kit

    2013-01-01

    Making decisions over multivariate time series is an important topic which has gained significant interest in the past decade. A time series is a sequence of data points which are measured and ordered over uniform time intervals. A multivariate time series is a set of multiple, related time series in a particular domain in which domain experts…

  8. Copula based flexible modeling of associations between clustered event times.

    PubMed

    Geerdens, Candida; Claeskens, Gerda; Janssen, Paul

    2016-07-01

    Multivariate survival data are characterized by the presence of correlation between event times within the same cluster. First, we build multi-dimensional copulas with flexible and possibly symmetric dependence structures for such data. In particular, clustered right-censored survival data are modeled using mixtures of max-infinitely divisible bivariate copulas. Second, these copulas are fit by a likelihood approach where the vast amount of copula derivatives present in the likelihood is approximated by finite differences. Third, we formulate conditions for clustered right-censored survival data under which an information criterion for model selection is either weakly consistent or consistent. Several of the familiar selection criteria are included. A set of four-dimensional data on time-to-mastitis is used to demonstrate the developed methodology.

  9. Measures of dependence for multivariate Lévy distributions

    NASA Astrophysics Data System (ADS)

    Boland, J.; Hurd, T. R.; Pivato, M.; Seco, L.

    2001-02-01

    Recent statistical analysis of a number of financial databases is summarized. Increasing agreement is found that logarithmic equity returns show a certain type of asymptotic behavior of the largest events, namely that the probability density functions have power law tails with an exponent α≈3.0. This behavior does not vary much over different stock exchanges or over time, despite large variations in trading environments. The present paper proposes a class of multivariate distributions which generalizes the observed qualities of univariate time series. A new consequence of the proposed class is the "spectral measure" which completely characterizes the multivariate dependences of the extreme tails of the distribution. This measure on the unit sphere in M-dimensions, in principle completely general, can be determined empirically by looking at extreme events. If it can be observed and determined, it will prove to be of importance for scenario generation in portfolio risk management.

  10. A Method for Comparing Multivariate Time Series with Different Dimensions

    PubMed Central

    Tapinos, Avraam; Mendes, Pedro

    2013-01-01

    In many situations it is desirable to compare dynamical systems based on their behavior. Similarity of behavior often implies similarity of internal mechanisms or dependency on common extrinsic factors. While there are widely used methods for comparing univariate time series, most dynamical systems are characterized by multivariate time series. Yet, comparison of multivariate time series has been limited to cases where they share a common dimensionality. A semi-metric is a distance function that has the properties of non-negativity, symmetry and reflexivity, but not sub-additivity. Here we develop a semi-metric – SMETS – that can be used for comparing groups of time series that may have different dimensions. To demonstrate its utility, the method is applied to dynamic models of biochemical networks and to portfolios of shares. The former is an example of a case where the dependencies between system variables are known, while in the latter the system is treated (and behaves) as a black box. PMID:23393554

  11. Causality networks from multivariate time series and application to epilepsy.

    PubMed

    Siggiridou, Elsa; Koutlis, Christos; Tsimpiris, Alkiviadis; Kimiskidis, Vasilios K; Kugiumtzis, Dimitris

    2015-08-01

    Granger causality and variants of this concept allow the study of complex dynamical systems as networks constructed from multivariate time series. In this work, a large number of Granger causality measures used to form causality networks from multivariate time series are assessed. For this, realizations on high dimensional coupled dynamical systems are considered and the performance of the Granger causality measures is evaluated, seeking for the measures that form networks closest to the true network of the dynamical system. In particular, the comparison focuses on Granger causality measures that reduce the state space dimension when many variables are observed. Further, the linear and nonlinear Granger causality measures of dimension reduction are compared to a standard Granger causality measure on electroencephalographic (EEG) recordings containing episodes of epileptiform discharges.
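
    A minimal sketch of the prediction-error comparison that underlies all such measures: a bivariate Granger causality index computed from ordinary least squares fits of restricted and full autoregressive models. It is not one of the dimension-reduction measures compared in the article, and the coupled toy system is made up.

    ```python
    # Sketch: bivariate Granger causality index GCI(x -> y) =
    # log(residual variance without x-lags / residual variance with x-lags).
    # Clearly positive values indicate that x helps predict y.
    import numpy as np

    rng = np.random.default_rng(6)

    def gci(y, x, p=2):
        """Granger causality index for x -> y with p lags, via OLS."""
        n = len(y)
        rows = range(p, n)
        Xr = np.array([y[t - p:t][::-1] for t in rows])                           # y-lags only
        Xf = np.array([np.r_[y[t - p:t][::-1], x[t - p:t][::-1]] for t in rows])  # + x-lags
        target = y[p:]

        def resid_var(X):
            X1 = np.column_stack([np.ones(len(X)), X])
            beta, *_ = np.linalg.lstsq(X1, target, rcond=None)
            return np.var(target - X1 @ beta)

        return float(np.log(resid_var(Xr) / resid_var(Xf)))

    # Toy coupled system: x1 drives x2, x2 drives x3
    n = 2000
    x1, x2, x3 = np.zeros(n), np.zeros(n), np.zeros(n)
    e = rng.normal(0, 1, (n, 3))
    for t in range(1, n):
        x1[t] = 0.6 * x1[t - 1] + e[t, 0]
        x2[t] = 0.5 * x2[t - 1] + 0.5 * x1[t - 1] + e[t, 1]
        x3[t] = 0.5 * x3[t - 1] + 0.5 * x2[t - 1] + e[t, 2]

    series = {"x1": x1, "x2": x2, "x3": x3}
    for src in series:
        for dst in series:
            if src != dst:
                print(f"{src} -> {dst}: GCI = {gci(series[dst], series[src]):.3f}")
    # x1 -> x2 and x2 -> x3 (and, indirectly, x1 -> x3) should stand out.
    ```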

  12. A climate-based multivariate extreme emulator of met-ocean-hydrological events for coastal flooding

    NASA Astrophysics Data System (ADS)

    Camus, Paula; Rueda, Ana; Mendez, Fernando J.; Tomas, Antonio; Del Jesus, Manuel; Losada, Iñigo J.

    2015-04-01

    Atmosphere-ocean general circulation models (AOGCMs) are useful for analyzing large-scale climate variability (long-term historical periods, future climate projections). However, applications such as coastal flood modeling require climate information at a finer scale. Moreover, flooding events depend on multiple climate conditions: waves, surge levels from the open ocean, and river discharge caused by precipitation. Therefore, a multivariate statistical downscaling approach is adopted, both to reproduce the relationships between variables and because of its low computational cost. The proposed method can be considered a hybrid approach that combines a probabilistic weather-type downscaling model with a stochastic weather generator component. Predictand distributions are reproduced by modeling their relationship with AOGCM predictors based on a physical division into weather types (Camus et al., 2012). The multivariate dependence structure of the predictand (extreme events) is introduced by linking the independent marginal distributions of the variables through a probabilistic copula regression (Ben Ayala et al., 2014). This hybrid approach is applied to downscale AOGCM data to daily precipitation, maximum significant wave height and storm surge at different locations along the Spanish coast. Reanalysis data are used to assess the proposed method. A common predictor for the three variables involved is classified using a regression-guided clustering algorithm. The most appropriate statistical model (generalized extreme value distribution, Pareto distribution) for daily conditions is fitted. Stochastic simulation of the present climate is performed, obtaining the set of hydraulic boundary conditions needed for high-resolution coastal flood modeling. References: Camus, P., Menéndez, M., Méndez, F.J., Izaguirre, C., Espejo, A., Cánovas, V., Pérez, J., Rueda, A., Losada, I.J., Medina, R. (2014b). A weather-type statistical downscaling framework for ocean wave climate. Journal of

  13. Comparing and combining biomarkers as principal surrogates for time-to-event clinical endpoints.

    PubMed

    Gabriel, Erin E; Sachs, Michael C; Gilbert, Peter B

    2015-02-10

    Principal surrogate endpoints are useful as targets for phase I and II trials. In many recent trials, multiple post-randomization biomarkers are measured. However, few statistical methods exist for comparison of or combination of biomarkers as principal surrogates, and none of these methods to our knowledge utilize time-to-event clinical endpoint information. We propose a Weibull model extension of the semi-parametric estimated maximum likelihood method that allows for the inclusion of multiple biomarkers in the same risk model as multivariate candidate principal surrogates. We propose several methods for comparing candidate principal surrogates and evaluating multivariate principal surrogates. These include the time-dependent and surrogate-dependent true and false positive fraction, the time-dependent and the integrated standardized total gain, and the cumulative distribution function of the risk difference. We illustrate the operating characteristics of our proposed methods in simulations and outline how these statistics can be used to evaluate and compare candidate principal surrogates. We use these methods to investigate candidate surrogates in the Diabetes Control and Complications Trial. Copyright © 2014 John Wiley & Sons, Ltd.

  14. Discrete mixture modeling to address genetic heterogeneity in time-to-event regression

    PubMed Central

    Eng, Kevin H.; Hanlon, Bret M.

    2014-01-01

    Motivation: Time-to-event regression models are a critical tool for associating survival time outcomes with molecular data. Despite mounting evidence that genetic subgroups of the same clinical disease exist, little attention has been given to exploring how this heterogeneity affects time-to-event model building and how to accommodate it. Methods able to diagnose and model heterogeneity should be valuable additions to the biomarker discovery toolset. Results: We propose a mixture of survival functions that classifies subjects with similar relationships to a time-to-event response. This model incorporates multivariate regression and model selection and can be fit with an expectation maximization algorithm, which we call Cox-assisted clustering. We illustrate a likely manifestation of genetic heterogeneity and demonstrate how it may affect survival models with little warning. An application to gene expression in ovarian cancer DNA repair pathways illustrates how the model may be used to learn new genetic subsets for risk stratification. We explore the implications of this model for censored observations and the effect on genomic predictors and diagnostic analysis. Availability and implementation: R implementation of CAC using standard packages is available at https://gist.github.com/programeng/8620b85146b14b6edf8f Data used in the analysis are publicly available. Contact: kevin.eng@roswellpark.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24532723

  15. A multivariate time series approach to modeling and forecasting demand in the emergency department.

    PubMed

    Jones, Spencer S; Evans, R Scott; Allen, Todd L; Thomas, Alun; Haug, Peter J; Welch, Shari J; Snow, Gregory L

    2009-02-01

    The goals of this investigation were to study the temporal relationships between the demands for key resources in the emergency department (ED) and the inpatient hospital, and to develop multivariate forecasting models. Hourly data were collected from three diverse hospitals for the year 2006. Descriptive analysis and model fitting were carried out using graphical and multivariate time series methods. Multivariate models were compared to a univariate benchmark model in terms of their ability to provide out-of-sample forecasts of ED census and the demands for diagnostic resources. Descriptive analyses revealed little temporal interaction between the demand for inpatient resources and the demand for ED resources at the facilities considered. Multivariate models provided more accurate forecasts of ED census and of the demands for diagnostic resources. Our results suggest that multivariate time series models can be used to reliably forecast ED patient census; however, forecasts of the demands for diagnostic resources were not sufficiently reliable to be useful in the clinical setting.
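
    A sketch of one standard way to build such multivariate forecasts, a vector autoregression (VAR), follows. The hourly data are synthetic, the variable names are invented, and statsmodels' VAR is an assumed tool; this is not the study's model or data.

    ```python
    # Sketch: fit a VAR to synthetic hourly ED data and forecast the next day.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    rng = np.random.default_rng(7)
    hours = pd.date_range("2006-01-01", periods=24 * 120, freq="H")
    base = 20 + 8 * np.sin(2 * np.pi * (hours.hour - 14) / 24)      # daily cycle
    census = base + rng.normal(0, 2, len(hours))
    labs   = 0.6 * census + rng.normal(0, 2, len(hours))            # lab demand tracks census
    radiol = 0.3 * census + rng.normal(0, 1, len(hours))

    df = pd.DataFrame({"ed_census": census, "lab_orders": labs, "radiology": radiol},
                      index=hours)

    train, test = df.iloc[:-24], df.iloc[-24:]         # hold out the final day
    res = VAR(train).fit(maxlags=48, ic="aic")         # lag order chosen by AIC
    fcst = res.forecast(train.values[-res.k_ar:], steps=24)

    mae = np.abs(fcst[:, 0] - test["ed_census"].values).mean()
    print(f"chosen lag order: {res.k_ar}, 24-h ED census MAE: {mae:.2f}")
    ```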

  16. Hierarchy of temporal responses of multivariate self-excited epidemic processes

    NASA Astrophysics Data System (ADS)

    Saichev, Alexander; Maillart, Thomas; Sornette, Didier

    2013-04-01

    Many natural and social systems are characterized by bursty dynamics, for which past events trigger future activity. These systems can be modelled by so-called self-excited Hawkes conditional Poisson processes. It is generally assumed that all events have similar triggering abilities. However, some systems exhibit heterogeneity and clusters with possibly different intra- and inter-triggering, which can be accounted for by generalization into the "multivariate" self-excited Hawkes conditional Poisson processes. We develop the general formalism of the multivariate moment generating function for the cumulative number of first-generation and of all generation events triggered by a given mother event (the "shock") as a function of the current time t. This corresponds to studying the response function of the process. A variety of different systems have been analyzed. In particular, for systems in which triggering between events of different types proceeds through a one-dimensional directed or symmetric chain of influence in type space, we report a novel hierarchy of intermediate asymptotic power law decays ~ 1/t^{1-(m+1)θ} of the rate of triggered events as a function of the distance m of the events to the initial shock in the type space, where 0 < θ < 1 for the relevant long-memory processes characterizing many natural and social systems. The richness of the generated time dynamics comes from the cascades of intermediate events of possibly different kinds, unfolding via random changes of types genealogy.
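
    To make the "multivariate self-excited" notion concrete, a small sketch of the conditional intensity of a two-type Hawkes process follows. It uses an exponential memory kernel and made-up parameters purely for illustration; the article's asymptotic results concern long-memory (power-law) kernels.

    ```python
    # Sketch: conditional intensity of a 2-type Hawkes process,
    # lambda_i(t) = mu_i + sum over past events (t_k, j) of
    #               alpha[i, j] * exp(-beta * (t - t_k)).
    # Exponential kernel and all parameter values are illustrative only.
    import numpy as np

    mu    = np.array([0.2, 0.1])                 # baseline rates of types 0 and 1
    alpha = np.array([[0.3, 0.0],                # type 0 excites itself
                      [0.5, 0.2]])               # type 0 strongly excites type 1
    beta  = 1.5                                  # decay rate of the memory kernel

    events = [(1.0, 0), (1.4, 0), (2.0, 1), (3.1, 0)]   # toy history: (time, type)

    def intensity(t, events, mu, alpha, beta):
        lam = mu.copy()
        for s, j in events:
            if s < t:
                lam += alpha[:, j] * np.exp(-beta * (t - s))
        return lam

    for t in (1.5, 2.5, 4.0, 8.0):
        lam0, lam1 = intensity(t, events, mu, alpha, beta)
        print(f"t={t:>4}: lambda_0={lam0:.3f}, lambda_1={lam1:.3f}")
    # Intensities jump right after events that excite each type and relax
    # back towards the baselines as the exponential memory decays.
    ```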

  17. A lengthy look at the daily grind: time series analysis of events, mood, stress, and satisfaction.

    PubMed

    Fuller, Julie A; Stanton, Jeffrey M; Fisher, Gwenith G; Spitzmuller, Christiane; Russell, Steven S; Smith, Patricia C

    2003-12-01

    The present study investigated processes by which job stress and satisfaction unfold over time by examining the relations between daily stressful events, mood, and these variables. Using a Web-based daily survey of stressor events, perceived strain, mood, and job satisfaction completed by 14 university workers, 1,060 occasions of data were collected. Transfer function analysis, a multivariate version of time series analysis, was used to examine the data for relationships among the measured variables after factoring out the contaminating influences of serial dependency. Results revealed a contrast effect in which a stressful event associated positively with higher strain on the same day and associated negatively with strain on the following day. Perceived strain increased over the course of a semester for a majority of participants, suggesting that effects of stress build over time. Finally, the data were consistent with the notion that job satisfaction is a distal outcome that is mediated by perceived strain. ((c) 2003 APA, all rights reserved)

  18. Multivariate stochastic analysis for Monthly hydrological time series at Cuyahoga River Basin

    NASA Astrophysics Data System (ADS)

    zhang, L.

    2011-12-01

    Copulas have become a very powerful statistical and stochastic methodology for multivariate analysis in environmental and water resources engineering. In recent years, the popular one-parameter Archimedean copulas (e.g., the Gumbel-Hougaard, Cook-Johnson and Frank copulas) and the meta-elliptical copulas (e.g., the Gaussian and Student-t copulas) have been applied in multivariate hydrological analyses, e.g., multivariate rainfall (rainfall intensity, duration and depth), flood (peak discharge, duration and volume), and drought analyses (drought length, mean and minimum SPI values, and drought mean areal extent). Copulas have also been applied to flood frequency analysis at the confluences of river systems by taking into account the dependence among upstream gauge stations rather than by using the hydrological routing technique. In most of the studies above, the annual time series have been treated as stationary signals in which the values are assumed to be independent and identically distributed (i.i.d.) random variables. In reality, hydrological time series, especially daily and monthly series, cannot be considered i.i.d. random variables because of the periodicity present in the data structure. The stationarity assumption is also in question because of climate change and land use and land cover (LULC) change in the past years. To this end, it is necessary to re-evaluate the classic approach to the study of hydrological time series by relaxing the stationarity assumption through a nonstationary approach. As to the dependence structure of hydrological time series, the assumption of a single type of univariate distribution also needs to be relaxed by adopting copula theory. In this paper, the univariate monthly hydrological time series will be studied through a nonstationary time series analysis approach. The dependence structure of the multivariate monthly hydrological time series will be

  19. Fast and Flexible Multivariate Time Series Subsequence Search

    NASA Technical Reports Server (NTRS)

    Bhaduri, Kanishka; Oza, Nikunj C.; Zhu, Qiang; Srivastava, Ashok N.

    2010-01-01

    Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical monitoring, and financial systems. Domain experts are often interested in searching for interesting multivariate patterns from these MTS databases which often contain several gigabytes of data. Surprisingly, research on MTS search is very limited. Most of the existing work only supports queries with the same length of data, or queries on a fixed set of variables. In this paper, we propose an efficient and flexible subsequence search framework for massive MTS databases, that, for the first time, enables querying on any subset of variables with arbitrary time delays between them. We propose two algorithms to solve this problem (1) a List Based Search (LBS) algorithm which uses sorted lists for indexing, and (2) a R*-tree Based Search (RBS) which uses Minimum Bounding Rectangles (MBR) to organize the subsequences. Both algorithms guarantee that all matching patterns within the specified thresholds will be returned (no false dismissals). The very few false alarms can be removed by a post-processing step. Since our framework is also capable of Univariate Time-Series (UTS) subsequence search, we first demonstrate the efficiency of our algorithms on several UTS datasets previously used in the literature. We follow this up with experiments using two large MTS databases from the aviation domain, each containing several millions of observations. Both these tests show that our algorithms have very high prune rates (>99%) thus needing actual disk access for only less than 1% of the observations. To the best of our knowledge, MTS subsequence search has never been attempted on datasets of the size we have used in this paper.
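
    As a point of reference for what the LBS and RBS indexes accelerate, here is a brute-force sketch of subsequence search over a chosen subset of variables. The data, variable names, z-normalized distance and threshold are illustrative assumptions, and unlike the paper's framework this sketch does not handle time delays between variables.

    ```python
    # Sketch: brute-force multivariate subsequence search on a subset of
    # variables; returns every window whose z-normalized Euclidean distance
    # to the query, summed over the selected variables, is within a threshold.
    import numpy as np

    def znorm(a):
        s = a.std()
        return (a - a.mean()) / s if s > 0 else a * 0.0

    def search(db, query, variables, threshold):
        """db / query: dict of variable name -> 1-D array; query arrays share a length."""
        m = len(next(iter(query.values())))
        n = len(next(iter(db.values())))
        hits = []
        for start in range(n - m + 1):
            d = 0.0
            for v in variables:
                window = znorm(db[v][start:start + m])
                d += float(np.linalg.norm(window - znorm(query[v])))
            if d <= threshold:
                hits.append((start, d))
        return sorted(hits, key=lambda h: h[1])

    rng = np.random.default_rng(8)
    n = 5000
    db = {"altitude": np.cumsum(rng.normal(0, 1, n)),
          "airspeed": np.cumsum(rng.normal(0, 1, n)),
          "pitch":    np.cumsum(rng.normal(0, 1, n))}

    # Query pattern cut from the database itself, so at least one exact hit exists
    q_start, m = 1200, 100
    query = {v: db[v][q_start:q_start + m].copy() for v in ("altitude", "pitch")}

    print(search(db, query, variables=("altitude", "pitch"), threshold=1.0)[:3])
    ```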

  20. Multivariate time series analysis of neuroscience data: some challenges and opportunities.

    PubMed

    Pourahmadi, Mohsen; Noorbaloochi, Siamak

    2016-04-01

    Neuroimaging data may be viewed as high-dimensional multivariate time series, and analyzed using techniques from regression analysis, time series analysis and spatiotemporal analysis. We discuss issues related to data quality, model specification, estimation, interpretation, dimensionality and causality. Some recent research areas addressing aspects of some recurring challenges are introduced. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Evaluation of an F100 multivariable control using a real-time engine simulation

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Skira, C.; Soeder, J. F.

    1977-01-01

    A multivariable control design for the F100 turbofan engine was evaluated, as part of the F100 multivariable control synthesis (MVCS) program. The evaluation utilized a real-time, hybrid computer simulation of the engine and a digital computer implementation of the control. Significant results of the evaluation are presented and recommendations concerning future engine testing of the control are made.

  2. Stressful Life Events Around the Time of Unplanned Pregnancy and Women's Health: Exploratory Findings from a National Sample.

    PubMed

    Hall, Kelli Stidham; Dalton, Vanessa K; Zochowski, Melissa; Johnson, Timothy R B; Harris, Lisa H

    2017-06-01

    Objective Little is known about how women's social context of unintended pregnancy, particularly adverse social circumstances, relates to their general health and wellbeing. We explored associations between stressful life events around the time of unintended pregnancy and physical and mental health. Methods Data are drawn from a national probability study of 1078 U.S. women aged 18-55. Our internet-based survey measured 14 different stressful life events occurring at the time of unintended pregnancy (operationalized as an additive index score), chronic disease and mental health conditions, and current health and wellbeing symptoms (standardized perceived health, depression, stress, and discrimination scales). Multivariable regression modeled relationships between stressful life events and health conditions/symptoms while controlling for sociodemographic and reproductive covariates. Results Among ever-pregnant women (N = 695), stressful life events were associated with all adverse health outcomes/symptoms in unadjusted analyses. In multivariable models, higher stressful life event scores were positively associated with chronic disease (aOR 1.21, CI 1.03-1.41) and mental health (aOR 1.42, CI 1.23-1.64) conditions, higher depression (B 0.37, CI 0.19-0.55), stress (B 0.32, CI 0.22-0.42), and discrimination (B 0.74, CI 0.45-1.04) scores, and negatively associated with ≥ very good perceived health (aOR 0.84, CI 0.73-0.97). Stressful life event effects were strongest for emotional and partner-related sub-scores. Conclusion Women with adverse social circumstances surrounding their unintended pregnancy experienced poorer health. Findings suggest that reproductive health should be considered in the broader context of women's health and wellbeing and have implications for integrated models of care that address women's family planning needs, mental and physical health, and social environments.

  3. Narrative event boundaries, reading times, and expectation.

    PubMed

    Pettijohn, Kyle A; Radvansky, Gabriel A

    2016-10-01

    During text comprehension, readers create mental representations of the described events, called situation models. When new information is encountered, these models must be updated or new ones created. Consistent with the event indexing model, previous studies have shown that when readers encounter an event shift, reading times often increase. However, such increases are not consistently observed. This paper addresses this inconsistency by examining the extent to which reading-time differences observed at event shifts reflect an unexpectedness in the narrative rather than processes involved in model updating. In two reassessments of prior work, event shifts known to increase reading time were rated as less expected, and expectedness ratings significantly predicted reading time. In three new experiments, participants read stories in which an event shift was or was not foreshadowed, thereby influencing expectedness of the shift. Experiment 1 revealed that readers do not expect event shifts, but foreshadowing eliminates this. Experiment 2 showed that foreshadowing does not affect identification of event shifts. Finally, Experiment 3 found that, although reading times increased when an event shift was not foreshadowed, they were not different from controls when it was. Moreover, responses to memory probes were slower following an event shift regardless of foreshadowing, suggesting that situation model updating had taken place. Overall, the results support the idea that previously observed reading time increases at event shifts reflect, at least in part, a reader's unexpected encounter with a shift rather than an increase in processing effort required to update a situation model.

  4. A flexible model for multivariate interval-censored survival times with complex correlation structure.

    PubMed

    Falcaro, Milena; Pickles, Andrew

    2007-02-10

    We focus on the analysis of multivariate survival times with highly structured interdependency and subject to interval censoring. Such data are common in developmental genetics and genetic epidemiology. We propose a flexible mixed probit model that deals naturally with complex but uninformative censoring. The recorded ages of onset are treated as possibly censored ordinal outcomes with the interval censoring mechanism seen as arising from a coarsened measurement of a continuous variable observed as falling between subject-specific thresholds. This bypasses the requirement for the failure times to be observed as falling into non-overlapping intervals. The assumption of a normal age-of-onset distribution of the standard probit model is relaxed by embedding within it a multivariate Box-Cox transformation whose parameters are jointly estimated with the other parameters of the model. Complex decompositions of the underlying multivariate normal covariance matrix of the transformed ages of onset become possible. The new methodology is here applied to a multivariate study of the ages of first use of tobacco and first consumption of alcohol without parental permission in twins. The proposed model allows estimation of the genetic and environmental effects that are shared by both of these risk behaviours as well as those that are specific. 2006 John Wiley & Sons, Ltd.
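
    For readers unfamiliar with the transformation mentioned above, the univariate Box-Cox form (applied here, per the abstract, within a multivariate version whose parameters are estimated jointly with the rest of the model) is

        g_\lambda(y) = \begin{cases} (y^\lambda - 1)/\lambda, & \lambda \neq 0 \\ \log y, & \lambda = 0 \end{cases}

    so that a suitable choice of \lambda can bring the transformed ages of onset closer to normality, relaxing the standard probit assumption.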

  5. Multivariate time series clustering on geophysical data recorded at Mt. Etna from 1996 to 2003

    NASA Astrophysics Data System (ADS)

    Di Salvo, Roberto; Montalto, Placido; Nunnari, Giuseppe; Neri, Marco; Puglisi, Giuseppe

    2013-02-01

    Time series clustering is an important task in data analysis for extracting implicit, previously unknown, and potentially useful information from large collections of data. Finding useful similar trends in multivariate time series represents a challenge in several areas, including geophysical and environmental research. While traditional time series analysis methods deal only with univariate time series, multivariate time series analysis is more suitable in fields of research where different kinds of data are available. Moreover, conventional time series clustering techniques do not provide the desired results for geophysical datasets, owing to the huge amount of data and to sampling rates that differ according to the nature of each signal. In this paper, a novel approach to geophysical multivariate time series clustering is proposed using dynamic time series segmentation and Self-Organizing Map techniques. This method allows finding couplings among trends of different geophysical data recorded by monitoring networks at Mt. Etna from 1996 to 2003, when the transition from summit eruptions to flank eruptions occurred. This information can be used to carry out a more careful evaluation of the state of the volcano and to support hazard assessment at Mt. Etna.
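
    As a purely illustrative sketch (not the authors' implementation, with the dynamic segmentation and feature-extraction steps omitted and all names assumed), a minimal Self-Organizing Map over segment feature vectors can be written in NumPy roughly as follows; each sample is then labelled by its winning map unit, which plays the role of a cluster.

        import numpy as np

        def train_som(data, grid=(6, 6), epochs=50, lr0=0.5, sigma0=2.0, seed=0):
            # data: (n_samples, n_features) feature vectors, e.g. one per segment
            # of the multivariate geophysical series.
            rng = np.random.default_rng(seed)
            n, d = data.shape
            w = rng.normal(size=(grid[0], grid[1], d))        # codebook vectors
            gy, gx = np.mgrid[0:grid[0], 0:grid[1]]
            for t in range(epochs):
                lr = lr0 * (1 - t / epochs)                   # decaying learning rate
                sigma = sigma0 * (1 - t / epochs) + 0.5       # shrinking neighbourhood
                for x in data[rng.permutation(n)]:
                    dist = ((w - x) ** 2).sum(axis=2)
                    by, bx = np.unravel_index(dist.argmin(), dist.shape)
                    h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma ** 2))
                    w += lr * h[..., None] * (x - w)          # pull neighbourhood toward x
            return w

        def cluster_labels(data, w):
            # Cluster label = index of the best-matching map unit for each sample.
            flat = w.reshape(-1, w.shape[-1])
            return ((data[:, None, :] - flat[None]) ** 2).sum(-1).argmin(1)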

  6. Nonlinear multivariate and time series analysis by neural network methods

    NASA Astrophysics Data System (ADS)

    Hsieh, William W.

    2004-03-01

    Methods in multivariate statistical analysis are essential for working with large amounts of geophysical data, data from observational arrays, from satellites, or from numerical model output. In classical multivariate statistical analysis, there is a hierarchy of methods, starting with linear regression at the base, followed by principal component analysis (PCA) and finally canonical correlation analysis (CCA). A multivariate time series method, the singular spectrum analysis (SSA), has been a fruitful extension of the PCA technique. The common drawback of these classical methods is that only linear structures can be correctly extracted from the data. Since the late 1980s, neural network methods have become popular for performing nonlinear regression and classification. More recently, neural network methods have been extended to perform nonlinear PCA (NLPCA), nonlinear CCA (NLCCA), and nonlinear SSA (NLSSA). This paper presents a unified view of the NLPCA, NLCCA, and NLSSA techniques and their applications to various data sets of the atmosphere and the ocean (especially for the El Niño-Southern Oscillation and the stratospheric quasi-biennial oscillation). These data sets reveal that the linear methods are often too simplistic to describe real-world systems, with a tendency to scatter a single oscillatory phenomenon into numerous unphysical modes or higher harmonics, which can be largely alleviated in the new nonlinear paradigm.
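
    The nonlinear PCA (NLPCA) described above is usually realised as a bottleneck autoencoder. The sketch below, written with PyTorch purely for illustration (layer sizes and training settings are arbitrary assumptions), shows the idea: the one-unit bottleneck plays the role of the leading nonlinear principal component.

        import torch
        import torch.nn as nn

        class NLPCA(nn.Module):
            # Autoencoder view of nonlinear PCA: encode to a 1-unit bottleneck,
            # decode back to the original variables.
            def __init__(self, n_vars, hidden=8):
                super().__init__()
                self.encode = nn.Sequential(nn.Linear(n_vars, hidden), nn.Tanh(),
                                            nn.Linear(hidden, 1))
                self.decode = nn.Sequential(nn.Linear(1, hidden), nn.Tanh(),
                                            nn.Linear(hidden, n_vars))
            def forward(self, x):
                return self.decode(self.encode(x))

        def fit_nlpca(X, epochs=500, lr=1e-2):
            # X: (n_times, n_vars) tensor, e.g. gridded anomaly fields.
            model = NLPCA(X.shape[1])
            opt = torch.optim.Adam(model.parameters(), lr=lr)
            for _ in range(epochs):
                opt.zero_grad()
                loss = nn.functional.mse_loss(model(X), X)    # reconstruction error
                loss.backward()
                opt.step()
            return model    # model.encode(X) gives the nonlinear PC time series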

  7. Rotation in the Dynamic Factor Modeling of Multivariate Stationary Time Series.

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    2001-01-01

    Proposes a special rotation procedure for the exploratory dynamic factor model for stationary multivariate time series. The rotation procedure applies separately to each univariate component series of a q-variate latent factor series and transforms such a component, initially represented as white noise, into a univariate moving-average.…

  8. Inferring phase equations from multivariate time series.

    PubMed

    Tokuda, Isao T; Jain, Swati; Kiss, István Z; Hudson, John L

    2007-08-10

    An approach is presented for extracting phase equations from multivariate time series data recorded from a network of weakly coupled limit cycle oscillators. Our aim is to estimate important properties of the phase equations including natural frequencies and interaction functions between the oscillators. Our approach requires the measurement of an experimental observable of the oscillators; in contrast with previous methods it does not require measurements in isolated single or two-oscillator setups. This noninvasive technique can be advantageous in biological systems, where extraction of few oscillators may be a difficult task. The method is most efficient when data are taken from the nonsynchronized regime. Applicability to experimental systems is demonstrated by using a network of electrochemical oscillators; the obtained phase model is utilized to predict the synchronization diagram of the system.
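
    For reference, the phase equations estimated by such approaches are typically of the standard weakly coupled oscillator form (notation assumed here, not taken from the paper):

        \dot{\phi}_i(t) = \omega_i + \sum_{j \ne i} H_{ij}\big(\phi_j(t) - \phi_i(t)\big) + \eta_i(t),

    where \omega_i are the natural frequencies, H_{ij} the pairwise interaction functions, and \eta_i a noise term; the method infers \omega_i and H_{ij} directly from the measured multivariate data.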

  9. NavyTime: Event and Time Ordering from Raw Text

    DTIC Science & Technology

    2013-06-01

    [Extraction fragment: the abstract is cut off after listing the relation types event-event, event-time, time-time, and event-DCT (DCT is the document creation time); the remainder is residue of the report's results tables. Recoverable values include an Event Extraction F1 ranking (ATT-1 81.05, NavyTime 80.30, KUL 79.32, cleartk ...), a second F1 ranking (... 71.88, KUL 70.17, cleartk 67.87, NavyTime 67.48, Temp:ESA 54.55, JU-CSE 52.69, Temp:WNet 50.00, FSS-TimEx 42.94), and a Tense and Aspect Attributes table with Tense F1 / Aspect F1 of cleartk 62.18/70.40, NavyTime 61.67/72.43, ATT 59.47/73.50, JU-CSE 58.62/72.14, KUL 49.70/63.20 (not all systems participated), followed by a truncated "Figure 1: Complete ..." caption.]

  10. Multivariate time series modeling of short-term system scale irrigation demand

    NASA Astrophysics Data System (ADS)

    Perera, Kushan C.; Western, Andrew W.; George, Biju; Nawarathna, Bandara

    2015-12-01

    Travel time limits the ability of irrigation system operators to react to short-term irrigation demand fluctuations that result from variations in weather, including very hot periods and rainfall events, as well as the various other pressures and opportunities that farmers face. Short-term system-wide irrigation demand forecasts can assist in system operation. Here we developed a multivariate time series (ARMAX) model to forecast irrigation demand in terms of aggregated service point flows (IDCGi, ASP) and offtake regulator flows (IDCGi, OTR) across 5 command areas, which comprised the areas covered by four irrigation channels and the study area as a whole. These command-area-specific ARMAX models forecast daily IDCGi, ASP and IDCGi, OTR 1-5 days ahead using real-time flow data recorded at the service points and the uppermost regulators, together with observed meteorological data collected from automatic weather stations. The model efficiency and the predictive performance were quantified using the root mean squared error (RMSE), Nash-Sutcliffe model efficiency coefficient (NSE), anomaly correlation coefficient (ACC) and mean square skill score (MSSS). During the evaluation period, NSE for IDCGi, ASP and IDCGi, OTR across the 5 command areas ranged from 0.78 to 0.98. These models were capable of generating skillful forecasts (MSSS ⩾ 0.5 and ACC ⩾ 0.6) of IDCGi, ASP and IDCGi, OTR for all 5 lead days, and the forecasts were better than using the long-term monthly mean irrigation demand. Overall, the predictive performance of these ARMAX time series models was higher than in almost all previous studies we are aware of. Further, the IDCGi, ASP and IDCGi, OTR forecasts improved the operators' ability to react to near-future irrigation demand fluctuations, as the developed ARMAX time series models were self-adaptive and reflected short-term changes in irrigation demand with respect to the various pressures and opportunities that farmers face, such as
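
    As an illustration of the model class only (variable names, model order and the use of statsmodels are assumptions on our part, not the study's code), an ARMAX model with exogenous weather inputs can be fitted and used for 1-5 day ahead forecasts along these lines:

        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        def fit_armax(demand: pd.Series, weather: pd.DataFrame, order=(2, 0, 1)):
            # demand: daily irrigation demand; weather: exogenous inputs
            # (e.g. temperature, rainfall) aligned to the same daily index.
            return SARIMAX(demand, exog=weather, order=order).fit(disp=False)

        def forecast_demand(results, future_weather: pd.DataFrame, steps=5):
            # 1-5 day ahead forecasts conditioned on forecast weather inputs.
            return results.forecast(steps=steps, exog=future_weather)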

  11. Uni- and multi-variable modelling of flood losses: experiences gained from the Secchia river inundation event.

    NASA Astrophysics Data System (ADS)

    Carisi, Francesca; Domeneghetti, Alessio; Kreibich, Heidi; Schröter, Kai; Castellarin, Attilio

    2017-04-01

    Flood risk is a function of flood hazard and vulnerability; therefore, its accurate assessment depends on a reliable quantification of both factors. The scientific literature proposes a number of objective and reliable methods for assessing flood hazard, yet it highlights a limited understanding of the fundamental damage processes. Loss modelling is associated with large uncertainty which is, among other factors, due to a lack of standard procedures; for instance, flood losses are often estimated based on damage models derived in completely different contexts (i.e. different countries or geographical regions) without checking their applicability, or by considering only one explanatory variable (typically water depth). We consider the Secchia river flood event of January 2014, when a sudden levee breach caused the inundation of nearly 200 km² in Northern Italy. In the aftermath of this event, local authorities collected flood loss data, together with additional information on affected private households and industrial activities (e.g. building surface area and economic value, number of company employees, and others). Based on these data we implemented and compared a quadratic-regression damage function, with water depth as the only explanatory variable, and a multi-variable model that combines multiple regression trees and considers several explanatory variables (i.e. bagging decision trees). Our results show the importance of data collection, revealing that (1) a simple quadratic regression damage function based on empirical data from the study area can be significantly more accurate than literature damage models derived for a different context, and (2) multi-variable modelling may outperform the uni-variable approach, yet it is more difficult to develop and apply due to a much higher demand for detailed data.
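
    A minimal sketch of the two model types compared above, using NumPy and scikit-learn (settings and names are illustrative assumptions, not taken from the study):

        import numpy as np
        from sklearn.ensemble import BaggingRegressor

        def univariable_model(depth, loss):
            # Quadratic stage-damage function: loss ~ a*depth^2 + b*depth + c.
            return np.poly1d(np.polyfit(depth, loss, deg=2))

        def multivariable_model(X, loss, n_trees=100, seed=0):
            # Bagged regression trees over several explanatory variables
            # (water depth, building surface, economic value, ...).
            return BaggingRegressor(n_estimators=n_trees, random_state=seed).fit(X, loss)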

  12. A Multitaper, Causal Decomposition for Stochastic, Multivariate Time Series: Application to High-Frequency Calcium Imaging Data.

    PubMed

    Sornborger, Andrew T; Lauderdale, James D

    2016-11-01

    Neural data analysis has increasingly incorporated causal information to study circuit connectivity. Dimensional reduction forms the basis of most analyses of large multivariate time series. Here, we present a new, multitaper-based decomposition for stochastic, multivariate time series that acts on the covariance of the time series at all lags, C(τ), as opposed to standard methods that decompose the time series, X(t), using only information at zero lag. In both simulated and neural imaging examples, we demonstrate that methods that neglect the full causal structure may be discarding important dynamical information in a time series.
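
    In the notation above (symbols assumed, zero-mean series for simplicity), the object being decomposed is the full lagged covariance

        C(\tau) = \mathrm{E}\big[ X(t)\, X(t+\tau)^{\top} \big],

    whereas standard zero-lag methods such as PCA work only with C(0).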

  13. Constructing networks from a dynamical system perspective for multivariate nonlinear time series.

    PubMed

    Nakamura, Tomomichi; Tanizawa, Toshihiro; Small, Michael

    2016-03-01

    We describe a method for constructing networks for multivariate nonlinear time series. We approach the interaction between the various scalar time series from a deterministic dynamical system perspective and provide a generic and algorithmic test for whether the interaction between two measured time series is statistically significant. The method can be applied even when the data exhibit no obvious qualitative similarity: a situation in which the naive method utilizing the cross correlation function directly cannot correctly identify connectivity. To establish the connectivity between nodes we apply the previously proposed small-shuffle surrogate (SSS) method, which can investigate whether there are correlation structures in short-term variabilities (irregular fluctuations) between two data sets from the viewpoint of deterministic dynamical systems. The procedure to construct networks based on this idea is composed of three steps: (i) each time series is considered as a basic node of a network, (ii) the SSS method is applied to verify the connectivity between each pair of time series taken from the whole multivariate time series, and (iii) the pair of nodes is connected with an undirected edge when the null hypothesis cannot be rejected. The network constructed by the proposed method indicates the intrinsic (essential) connectivity of the elements included in the system or the underlying (assumed) system. The method is demonstrated for numerical data sets generated by known systems and applied to several experimental time series.

  14. A Sandwich-Type Standard Error Estimator of SEM Models with Multivariate Time Series

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Chow, Sy-Miin; Ong, Anthony D.

    2011-01-01

    Structural equation models are increasingly used as a modeling tool for multivariate time series data in the social and behavioral sciences. Standard error estimators of SEM models, originally developed for independent data, require modifications to accommodate the fact that time series data are inherently dependent. In this article, we extend a…

  15. Time-varying associations of suicide with deployments, mental health conditions, and stressful life events among current and former US military personnel: a retrospective multivariate analysis.

    PubMed

    Shen, Yu-Chu; Cunha, Jesse M; Williams, Thomas V

    2016-11-01

    US military suicides have increased substantially over the past decade and currently account for almost 20% of all military deaths. We investigated the associations of a comprehensive set of time-varying risk factors with suicides among current and former military service members. We did a retrospective multivariate analysis of all US military personnel between 2001 and 2011 (n=110 035 573 person-quarter-years, representing 3 795 823 service members). Outcome was death by suicide, either during service or post-separation. We used Cox proportional hazard models at the person-quarter level to examine associations of deployment, mental disorders, history of unlawful activity, stressful life events, and other demographic and service factors with death by suicide. The strongest predictors of death by suicide were current and past diagnoses of self-inflicted injuries, major depression, bipolar disorder, substance use disorder, and other mental health conditions (compared with service members with no history of diagnoses, the hazard ratio [HR] ranged from 1·4 [95% CI 1·14-1·72] to 8·34 [6·71-10·37]). Compared with service members who were never deployed, hazard rates of suicide (which represent the probability of death by suicide in a specific quarter given that the individual was alive in the previous quarter) were lower among the currently deployed (HR 0·50, 95% CI 0·40-0·61) but significantly higher in the quarters following first deployment (HR 1·51 [1·17-1·96] if deployed in the previous three quarters; 1·14 [1·06-1·23] if deployed four or more quarters ago). The hazard rate of suicide increased within the first year of separation from the military (HR 2·49, 95% CI 2·12-2·91), and remained high for those who had separated from the military 6 or more years ago (HR 1·63, 1·45-1·82). The increased hazard rate of death by suicide for military personnel varies by time since exposure to deployment, mental health diagnoses, and other stressful

  16. Tidal Disruption Events Across Cosmic Time

    NASA Astrophysics Data System (ADS)

    Fialkov, Anastasia; Loeb, Abraham

    2017-01-01

    Tidal disruption events (TDEs) of stars by single or binary super-massive black holes illuminate the environment around quiescent black holes in galactic nuclei, allowing us to probe dormant black holes. We predict the TDE rates expected to be detected by next-generation X-ray surveys. We include events sourced by both single and binary super-massive black holes, assuming that 10% of TDEs lead to the formation of relativistic jets and are therefore observable to higher redshifts. Assigning the Eddington luminosity to each event, we show that if the occupation fraction of intermediate-mass black holes (IMBHs) is high, more than 90% of the brightest TDEs might be associated with merging black holes, which are potential sources for eLISA. Next-generation telescopes with improved sensitivities should probe dim local TDE events as well as bright events at high redshifts. We show that an instrument which is 50 times more sensitive than the Swift Burst Alert Telescope (BAT) is expected to trigger ~10 times more events than BAT. The majority of these events originate at low redshifts (z<0.5) if the occupation fraction of IMBHs is high, and at high redshifts (z>2) if it is low.

  17. Optimizing Functional Network Representation of Multivariate Time Series

    PubMed Central

    Zanin, Massimiliano; Sousa, Pedro; Papo, David; Bajo, Ricardo; García-Prieto, Juan; Pozo, Francisco del; Menasalvas, Ernestina; Boccaletti, Stefano

    2012-01-01

    By combining complex network theory and data mining techniques, we provide objective criteria for optimization of the functional network representation of generic multivariate time series. In particular, we propose a method for the principled selection of the threshold value for functional network reconstruction from raw data, and for proper identification of the network's indicators that unveil the most discriminative information on the system for classification purposes. We illustrate our method by analysing networks of functional brain activity of healthy subjects, and patients suffering from Mild Cognitive Impairment, an intermediate stage between the expected cognitive decline of normal aging and the more pronounced decline of dementia. We discuss extensions of the scope of the proposed methodology to network engineering purposes, and to other data mining tasks. PMID:22953051

  18. Optimizing Functional Network Representation of Multivariate Time Series

    NASA Astrophysics Data System (ADS)

    Zanin, Massimiliano; Sousa, Pedro; Papo, David; Bajo, Ricardo; García-Prieto, Juan; Pozo, Francisco Del; Menasalvas, Ernestina; Boccaletti, Stefano

    2012-09-01

    By combining complex network theory and data mining techniques, we provide objective criteria for optimization of the functional network representation of generic multivariate time series. In particular, we propose a method for the principled selection of the threshold value for functional network reconstruction from raw data, and for proper identification of the network's indicators that unveil the most discriminative information on the system for classification purposes. We illustrate our method by analysing networks of functional brain activity of healthy subjects, and patients suffering from Mild Cognitive Impairment, an intermediate stage between the expected cognitive decline of normal aging and the more pronounced decline of dementia. We discuss extensions of the scope of the proposed methodology to network engineering purposes, and to other data mining tasks.

  19. Using timed event sequential data in nursing research.

    PubMed

    Pecanac, Kristen E; Doherty-King, Barbara; Yoon, Ju Young; Brown, Roger; Schiefelbein, Tony

    2015-01-01

    Measuring behavior is important in nursing research, and innovative technologies are needed to capture the "real-life" complexity of behaviors and events. The purpose of this article is to describe the use of timed event sequential data in nursing research and to demonstrate the use of this data in a research study. Timed event sequencing allows the researcher to capture the frequency, duration, and sequence of behaviors as they occur in an observation period and to link the behaviors to contextual details. Timed event sequential data can easily be collected with handheld computers, loaded with a software program designed for capturing observations in real time. Timed event sequential data add considerable strength to analysis of any nursing behavior of interest, which can enhance understanding and lead to improvement in nursing practice.

  20. Event-by-Event Study of Space-Time Dynamics in Flux-Tube Fragmentation

    DOE PAGES

    Wong, Cheuk-Yin

    2017-05-25

    In the semi-classical description of the flux-tube fragmentation process for hadron production and hadronization in high-energy $e^+e^-$ annihilations and $pp$ collisions, the rapidity-space-time ordering and the local conservation laws of charge, flavor, and momentum provide a set of powerful tools that may allow the reconstruction of the space-time dynamics of quarks and mesons in exclusive measurements of produced hadrons, on an event-by-event basis. We propose procedures to reconstruct the space-time dynamics from event-by-event exclusive hadron data to exhibit explicitly the ordered chain of hadrons produced in a flux tube fragmentation. As a supplementary tool, we infer the average space-time coordinates of the $q$-$\bar{q}$ pair production vertices from the $\pi^-$ rapidity distribution data obtained by the NA61/SHINE Collaboration in $pp$ collisions at $\sqrt{s}$ = 6.3 to 17.3 GeV.

  1. Event-by-Event Study of Space-Time Dynamics in Flux-Tube Fragmentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Cheuk-Yin

    In the semi-classical description of the flux-tube fragmentation process for hadron production and hadronization in high-energy $e^+e^-$ annihilations and $pp$ collisions, the rapidity-space-time ordering and the local conservation laws of charge, flavor, and momentum provide a set of powerful tools that may allow the reconstruction of the space-time dynamics of quarks and mesons in exclusive measurements of produced hadrons, on an event-by-event basis. We propose procedures to reconstruct the space-time dynamics from event-by-event exclusive hadron data to exhibit explicitly the ordered chain of hadrons produced in a flux tube fragmentation. As a supplementary tool, we infer the average space-time coordinates of the $q$-$\bar{q}$ pair production vertices from the $\pi^-$ rapidity distribution data obtained by the NA61/SHINE Collaboration in $pp$ collisions at $\sqrt{s}$ = 6.3 to 17.3 GeV.

  2. Sensor-Generated Time Series Events: A Definition Language

    PubMed Central

    Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan

    2012-01-01

    There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called a posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is, generally applicable and accurate, for identifying the events contained in the time series.

  3. Analyzing time-ordered event data with missed observations.

    PubMed

    Dokter, Adriaan M; van Loon, E Emiel; Fokkema, Wimke; Lameris, Thomas K; Nolet, Bart A; van der Jeugd, Henk P

    2017-09-01

    A common problem with observational datasets is that not all events of interest may be detected. For example, observing animals in the wild can be difficult when animals move, hide, or cannot be closely approached. We consider time series of events recorded in conditions where events are occasionally missed by observers or observational devices. These time series are not restricted to behavioral protocols, but can be any cyclic or recurring process where discrete outcomes are observed. Undetected events cause biased inferences on the process of interest, and statistical analyses are needed that can identify and correct the compromised detection processes. Missed observations in time series lead to observed time intervals between events at multiples of the true inter-event time, which conveys information on their detection probability. We derive the theoretical probability density function for observed intervals between events that includes a probability of missed detection. Methodology and software tools are provided for analysis of event data with potential observation bias and its removal. The methodology was applied to simulation data and a case study of defecation rate estimation in geese, which is commonly used to estimate their digestive throughput and energetic uptake, or to calculate goose usage of a feeding site from dropping density. Simulations indicate that at a moderate chance of missing arrival events (p = 0.3), uncorrected arrival intervals were biased upward by up to a factor of 3, while parameter values corrected for missed observations were within 1% of their true simulated value. A field case study shows that not accounting for missed observations leads to substantial underestimates of the true defecation rate in geese, and spurious rate differences between sites, which are introduced by differences in observational conditions. These results show that the derived methodology can be used to effectively remove observational biases in time-ordered event
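
    The core effect described above is easy to reproduce numerically. The short simulation below (our own illustration, not the authors' software) draws exponential inter-event times, drops each event with probability p, and shows that the mean observed interval is inflated by roughly 1/(1-p), so rates estimated from uncorrected intervals are biased downward.

        import numpy as np

        rng = np.random.default_rng(1)
        p_miss, n_events, mean_true = 0.3, 100_000, 5.0

        true_intervals = rng.exponential(mean_true, size=n_events)
        event_times = np.cumsum(true_intervals)                 # true event times
        detected = event_times[rng.random(n_events) >= p_miss]  # each event missed with prob p_miss
        observed_intervals = np.diff(detected)                  # what the observer records

        # Observed intervals are sums of one or more true intervals (multiples of
        # the true inter-event time on average), hence the upward bias.
        print(true_intervals.mean(), observed_intervals.mean())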

  4. Multivariate flood risk assessment: reinsurance perspective

    NASA Astrophysics Data System (ADS)

    Ghizzoni, Tatiana; Ellenrieder, Tobias

    2013-04-01

    For insurance and re-insurance purposes, knowledge of the spatial characteristics of fluvial flooding is fundamental. The probability of simultaneous flooding at different locations during one event, and the associated severity and losses, have to be estimated in order to assess premiums and for accumulation control (Probable Maximum Losses calculation). Therefore, the identification of a statistical model able to describe the multivariate joint distribution of flood events at multiple locations is necessary. In this context, copulas can be viewed as alternative tools for dealing with multivariate simulations, as they make it possible to formalize the dependence structures of random vectors. An application of copula functions for flood scenario generation is presented for Australia (Queensland, New South Wales and Victoria), where 100,000 possible flood scenarios covering approximately 15,000 years were simulated.
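
    The abstract does not state which copula family was used; as a generic illustration of the simulation step only, a Gaussian copula with Gumbel margins (a common choice for flood peaks, and an assumption here) can be sampled as follows. Each row of the output is one joint scenario of simultaneous peaks at all locations, from which event losses can be accumulated.

        import numpy as np
        from scipy.stats import norm, gumbel_r

        def simulate_flood_scenarios(corr, marg_params, n_scenarios=100_000, seed=0):
            # corr: correlation matrix between gauging locations (fitted to data);
            # marg_params: (loc, scale) of a Gumbel marginal per location.
            rng = np.random.default_rng(seed)
            z = rng.multivariate_normal(np.zeros(len(corr)), corr, size=n_scenarios)
            u = norm.cdf(z)                 # uniform margins, dependence preserved
            return np.column_stack([gumbel_r.ppf(u[:, i], loc=m, scale=s)
                                    for i, (m, s) in enumerate(marg_params)])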

  5. Comparison of Time-to-First Event and Recurrent Event Methods in Randomized Clinical Trials.

    PubMed

    Claggett, Brian; Pocock, Stuart; Wei, L J; Pfeffer, Marc A; McMurray, John J V; Solomon, Scott D

    2018-03-27

    Background - Most Phase-3 trials feature time-to-first event endpoints for their primary and/or secondary analyses. In chronic diseases where a clinical event can occur more than once, recurrent-event methods have been proposed to more fully capture disease burden and have been assumed to improve statistical precision and power compared to conventional "time-to-first" methods. Methods - To better characterize factors that influence statistical properties of recurrent-events and time-to-first methods in the evaluation of randomized therapy, we repeatedly simulated trials with 1:1 randomization of 4000 patients to active vs control therapy, with true patient-level risk reduction of 20% (i.e. RR=0.80). For patients who discontinued active therapy after a first event, we assumed their risk reverted subsequently to their original placebo-level risk. Through simulation, we varied a) the degree of between-patient heterogeneity of risk and b) the extent of treatment discontinuation. Findings were compared with those from actual randomized clinical trials. Results - As the degree of between-patient heterogeneity of risk was increased, both time-to-first and recurrent-events methods lost statistical power to detect a true risk reduction and confidence intervals widened. The recurrent-events analyses continued to estimate the true RR=0.80 as heterogeneity increased, while the Cox model produced estimates that were attenuated. The power of recurrent-events methods declined as the rate of study drug discontinuation post-event increased. Recurrent-events methods provided greater power than time-to-first methods in scenarios where drug discontinuation was ≤30% following a first event, lesser power with drug discontinuation rates of ≥60%, and comparable power otherwise. We confirmed in several actual trials in chronic heart failure that treatment effect estimates were attenuated when estimated via the Cox model and that increased statistical power from recurrent-events methods

  6. Nuclear event zero-time calculation and uncertainty evaluation.

    PubMed

    Pan, Pujing; Ungar, R Kurt

    2012-04-01

    It is important to know the initial time, or zero-time, of a nuclear event such as a nuclear weapon's test, a nuclear power plant accident or a nuclear terrorist attack (e.g. with an improvised nuclear device, IND). Together with relevant meteorological information, the calculated zero-time is used to help locate the origin of a nuclear event. The zero-time of a nuclear event can be derived from measured activity ratios of two nuclides. The calculated zero-time of a nuclear event would not be complete without an appropriately evaluated uncertainty term. In this paper, analytical equations for zero-time and the associated uncertainty calculations are derived using a measured activity ratio of two nuclides. Application of the derived equations is illustrated in a realistic example using data from the last Chinese thermonuclear test in 1980. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
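
    For two nuclides that decay independently (no parent-daughter ingrowth, an assumption made here purely for illustration) with decay constants \lambda_1 and \lambda_2 and a known activity ratio at zero-time, the measured ratio fixes the elapsed time:

        \frac{A_1(t)}{A_2(t)} = \frac{A_1(0)}{A_2(0)}\, e^{-(\lambda_1 - \lambda_2) t}
        \quad\Longrightarrow\quad
        t = \frac{1}{\lambda_1 - \lambda_2}\,
            \ln\!\left( \frac{A_1(0)/A_2(0)}{A_1(t)/A_2(t)} \right).

    The uncertainty in t then follows by propagating the uncertainties of the measured ratio and of the decay constants through this expression.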

  7. Multivariate multiscale entropy of financial markets

    NASA Astrophysics Data System (ADS)

    Lu, Yunfan; Wang, Jun

    2017-11-01

    In quantifying the dynamical properties of complex phenomena in financial market systems, multivariate financial time series have attracted wide attention. In this work, considering the shortcomings and limitations of univariate multiscale entropy in analyzing multivariate time series, the multivariate multiscale sample entropy (MMSE), which can evaluate the complexity in multiple data channels over different timescales, is applied to quantify the complexity of financial markets. Its effectiveness and advantages have been demonstrated with numerical simulations on two well-known synthetic noise signals. For the first time, the complexity of four generated trivariate return series for each stock trading hour in the Chinese stock markets is quantified thanks to the interdisciplinary application of this method. We find that the complexity of the trivariate return series in each hour shows a significant decreasing trend as stock trading time progresses. Further, the shuffled multivariate return series and the absolute multivariate return series are also analyzed. As another new attempt, quantifying the complexity of global stock markets (Asia, Europe and America) is carried out by analyzing the multivariate returns from them. Finally, we utilize the multivariate multiscale entropy to assess the relative complexity of normalized multivariate return volatility series with different degrees.
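
    The multiscale part of the method rests on coarse-graining each channel before computing (multivariate) sample entropy. A minimal version of that step, shown only as an illustration:

        import numpy as np

        def coarse_grain(x, scale):
            # x: (n_samples, n_channels) multivariate series.  Average
            # non-overlapping windows of length `scale` in every channel;
            # sample entropy is then computed on each coarse-grained series.
            n = (x.shape[0] // scale) * scale
            return x[:n].reshape(-1, scale, x.shape[1]).mean(axis=1)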

  8. Time Delay in Microlensing Event

    NASA Image and Video Library

    2015-04-14

    This plot shows data obtained from NASA's Spitzer Space Telescope and the Optical Gravitational Lensing Experiment, or OGLE, telescope located in Chile, during a "microlensing" event. Microlensing events occur when one star passes another, and the gravity of the foreground star causes the distant star's light to magnify and brighten. This magnification is evident in the plot, as both Spitzer and OGLE register an increase in the star's brightness. If the foreground star is circled by a planet, the planet's gravity can alter the magnification over a shorter period, seen in the plot in the form of spikes and a dip. The great distance between Spitzer, in space, and OGLE, on the ground, meant that Spitzer saw this particular microlensing event before OGLE. The offset in the timing can be used to measure the distance to the planet. In this case, the planet, called OGLE-2014-BLG-0124L, was found to be 13,000 light-years away, near the center of our Milky Way galaxy. The finding was the result of fortuitous timing because Spitzer's overall program to observe microlensing events was only just starting up in the week before the planet's effects were visible from Spitzer's vantage point. While Spitzer sees infrared light of 3.6 microns in wavelength, OGLE sees visible light of 0.8 microns. http://photojournal.jpl.nasa.gov/catalog/PIA19331

  9. Tracking the time-varying cortical connectivity patterns by adaptive multivariate estimators.

    PubMed

    Astolfi, L; Cincotti, F; Mattia, D; De Vico Fallani, F; Tocci, A; Colosimo, A; Salinari, S; Marciani, M G; Hesse, W; Witte, H; Ursino, M; Zavaglia, M; Babiloni, F

    2008-03-01

    The directed transfer function (DTF) and the partial directed coherence (PDC) are frequency-domain estimators that are able to describe interactions between cortical areas in terms of the concept of Granger causality. However, the classical estimation of these methods is based on the multivariate autoregressive modelling (MVAR) of time series, which requires the stationarity of the signals. In this way, transient pathways of information transfer remain hidden. The objective of this study is to test a time-varying multivariate method for the estimation of rapidly changing connectivity relationships between cortical areas of the human brain, based on DTF/PDC and on the use of adaptive MVAR modelling (AMVAR), and to apply it to a set of real high-resolution EEG data. This approach will allow the observation of rapidly changing influences between the cortical areas during the execution of a task. The simulation results indicated that time-varying DTF and PDC are able to correctly estimate the imposed connectivity patterns under reasonable operating conditions of signal-to-noise ratio (SNR) and number of trials. An SNR of five and a number of trials of at least 20 provide good accuracy in the estimation. After testing the method by the simulation study, we provide an application to the cortical estimations obtained from high-resolution EEG data recorded from a group of healthy subjects during a combined foot-lips movement and present the time-varying connectivity patterns resulting from the application of both DTF and PDC. Two different cortical networks were detected with the proposed methods, one constant across the task and the other evolving during the preparation of the joint movement.
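
    For reference, in the stationary case the DTF is obtained from the fitted MVAR model as follows (the time-varying variant replaces the fixed coefficient matrices with adaptively estimated, time-dependent ones):

        X(t) = \sum_{k=1}^{p} A_k X(t-k) + E(t), \qquad
        H(f) = \Big( I - \sum_{k=1}^{p} A_k e^{-2\pi i f k \Delta t} \Big)^{-1}, \qquad
        \mathrm{DTF}_{ij}^2(f) = \frac{|H_{ij}(f)|^2}{\sum_{m} |H_{im}(f)|^2}.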

  10. Ecological prediction with nonlinear multivariate time-frequency functional data models

    USGS Publications Warehouse

    Yang, Wen-Hsi; Wikle, Christopher K.; Holan, Scott H.; Wildhaber, Mark L.

    2013-01-01

    Time-frequency analysis has become a fundamental component of many scientific inquiries. Due to improvements in technology, the amount of high-frequency signals that are collected for ecological and other scientific processes is increasing at a dramatic rate. In order to facilitate the use of these data in ecological prediction, we introduce a class of nonlinear multivariate time-frequency functional models that can identify important features of each signal as well as the interaction of signals corresponding to the response variable of interest. Our methodology is of independent interest and utilizes stochastic search variable selection to improve model selection and performs model averaging to enhance prediction. We illustrate the effectiveness of our approach through simulation and by application to predicting spawning success of shovelnose sturgeon in the Lower Missouri River.

  11. Understanding characteristics in multivariate traffic flow time series from complex network structure

    NASA Astrophysics Data System (ADS)

    Yan, Ying; Zhang, Shen; Tang, Jinjun; Wang, Xiaofei

    2017-07-01

    Discovering dynamic characteristics in traffic flow is a significant step toward designing effective traffic management and control strategies for relieving congestion in urban areas. A new method based on complex network theory is proposed to study multivariate traffic flow time series. The data were collected from loop detectors on a freeway over one year. To construct a complex network from the original traffic flow, a weighted Frobenius norm is adopted to estimate the similarity between multivariate time series, and Principal Component Analysis is used to determine the weights. We discuss how to select the optimal critical threshold for the networks at different hours in terms of the cumulative probability distribution of degree. Furthermore, two statistical properties of the networks, normalized network structure entropy and cumulative probability of degree, are used to explore hourly variation in traffic flow. The results demonstrate that these two statistical quantities exhibit patterns similar to traffic flow parameters, with morning and evening peak hours. Accordingly, we detect three traffic states, trough, peak and transitional hours, according to the correlation between the two aforementioned properties. The resulting state classification closely reflects hourly fluctuations in traffic flow, as confirmed by analyzing annual average hourly values of traffic volume, occupancy and speed in the corresponding hours.

  12. Bayesian transformation cure frailty models with multivariate failure time data.

    PubMed

    Yin, Guosheng

    2008-12-10

    We propose a class of transformation cure frailty models to accommodate a survival fraction in multivariate failure time data. Established through a general power transformation, this family of cure frailty models includes the proportional hazards and the proportional odds modeling structures as two special cases. Within the Bayesian paradigm, we obtain the joint posterior distribution and the corresponding full conditional distributions of the model parameters for the implementation of Gibbs sampling. Model selection is based on the conditional predictive ordinate statistic and deviance information criterion. As an illustration, we apply the proposed method to a real data set from dentistry.

  13. Moving Events in Time: Time-Referent Hand-Arm Movements Influence Perceived Temporal Distance to Past Events

    ERIC Educational Resources Information Center

    Blom, Stephanie S. A. H.; Semin, Gun R.

    2013-01-01

    We examine and find support for the hypothesis that time-referent hand-arm movements influence temporal judgments. In line with the concept of "left is associated with earlier times, and right is associated with later times," we show that performing left (right) hand-arm movements while thinking about a past event increases (decreases) the…

  14. Modeling multivariate time series on manifolds with skew radial basis functions.

    PubMed

    Jamshidi, Arta A; Kirby, Michael J

    2011-01-01

    We present an approach for constructing nonlinear empirical mappings from high-dimensional domains to multivariate ranges. We employ radial basis functions and skew radial basis functions for constructing a model using data that are potentially scattered or sparse. The algorithm progresses iteratively, adding a new function at each step to refine the model. The placement of the functions is driven by a statistical hypothesis test that accounts for correlation in the multivariate range variables. The test is applied on training and validation data and reveals nonstatistical or geometric structure when it fails. At each step, the added function is fit to data contained in a spatiotemporally defined local region to determine the parameters--in particular, the scale of the local model. The scale of the function is determined by the zero crossings of the autocorrelation function of the residuals. The model parameters and the number of basis functions are determined automatically from the given data, and there is no need to initialize any ad hoc parameters save for the selection of the skew radial basis functions. Compactly supported skew radial basis functions are employed to improve model accuracy, order, and convergence properties. The extension of the algorithm to higher-dimensional ranges produces reduced-order models by exploiting the existence of correlation in the range variable data. Structure is tested not just in a single time series but between all pairs of time series. We illustrate the new methodologies using several illustrative problems, including modeling data on manifolds and the prediction of chaotic time series.

  15. Multivariable Time Series Prediction for the Icing Process on Overhead Power Transmission Line

    PubMed Central

    Li, Peng; Zhao, Na; Zhou, Donghua; Cao, Min; Li, Jingjie; Shi, Xinling

    2014-01-01

    The design of monitoring and predictive alarm systems is necessary for managing icing on overhead power transmission lines. Given the complexity, nonlinearity, and fitfulness of the line icing process, a model based on multivariable time series is presented here to predict the icing load of a transmission line. In this model, the time effects of micrometeorological parameters on the icing process have been analyzed. Phase-space reconstruction theory and a machine learning method were then applied to establish the prediction model, which fully utilizes the history of multivariable time series data in local monitoring systems to represent the mapping relationship between icing load and micrometeorological factors. Relevant to the fitful character of line icing, simulations were carried out during the same icing process and during different processes to test the model's prediction precision and robustness. According to the simulation results for the Tao-Luo-Xiong Transmission Line, the model demonstrates good prediction accuracy across different processes if the prediction length is less than two hours, and would help power grid departments decide to take action in advance to address potential icing disasters. PMID:25136653

  16. Time-varying nonstationary multivariate risk analysis using a dynamic Bayesian copula

    NASA Astrophysics Data System (ADS)

    Sarhadi, Ali; Burn, Donald H.; Concepción Ausín, María.; Wiper, Michael P.

    2016-03-01

    A time-varying risk analysis is proposed for an adaptive design framework in nonstationary conditions arising from climate change. A Bayesian, dynamic conditional copula is developed for modeling the time-varying dependence structure between mixed continuous and discrete multiattributes of multidimensional hydrometeorological phenomena. Joint Bayesian inference is carried out to fit the marginals and copula in an illustrative example using an adaptive, Gibbs Markov Chain Monte Carlo (MCMC) sampler. Posterior mean estimates and credible intervals are provided for the model parameters and the Deviance Information Criterion (DIC) is used to select the model that best captures different forms of nonstationarity over time. This study also introduces a fully Bayesian, time-varying joint return period for multivariate time-dependent risk analysis in nonstationary environments. The results demonstrate that the nature and the risk of extreme-climate multidimensional processes are changed over time under the impact of climate change, and accordingly the long-term decision making strategies should be updated based on the anomalies of the nonstationary environment.

  17. Comparing of Cox model and parametric models in analysis of effective factors on event time of neuropathy in patients with type 2 diabetes.

    PubMed

    Kargarian-Marvasti, Sadegh; Rimaz, Shahnaz; Abolghasemi, Jamileh; Heydari, Iraj

    2017-01-01

    The Cox proportional hazards model is the most common method for analyzing the effects of several variables on survival time. However, under certain circumstances, parametric models give more precise estimates than Cox regression when analyzing survival data. The purpose of this study was to investigate the comparative performance of Cox and parametric models in a survival analysis of factors affecting the event time of neuropathy in patients with type 2 diabetes. This study included 371 patients with type 2 diabetes without neuropathy who were registered at the Fereydunshahr diabetes clinic. Subjects were followed up for the development of neuropathy from 2006 to March 2016. To investigate the factors influencing the event time of neuropathy, significant variables in the univariate model (P < 0.20) were entered into the multivariate Cox and parametric models (P < 0.05). In addition, the Akaike information criterion (AIC) and the area under ROC curves were used to evaluate the relative goodness of fit of the models and the efficiency of each procedure, respectively. Statistical computing was performed using R software version 3.2.3 (UNIX platforms, Windows and MacOS). Using the Kaplan-Meier method, the survival time to neuropathy was estimated at 76.6 ± 5 months after the initial diagnosis of diabetes. After multivariate analysis with the Cox and parametric models, ethnicity, high-density lipoprotein and family history of diabetes were identified as predictors of the event time of neuropathy (P < 0.05). According to AIC, the log-normal model, with the lowest value, was the best-fitting model among the Cox and parametric models. According to the comparison of survival receiver operating characteristic curves, the log-normal model was considered the most efficient and best-fitting model.
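
    The study's analysis was done in R; a rough Python equivalent of fitting and comparing a Cox model with a log-normal (accelerated failure time) alternative, using the third-party lifelines package and illustrative column names, might look like this:

        import pandas as pd
        from lifelines import CoxPHFitter, LogNormalAFTFitter

        def compare_models(df: pd.DataFrame):
            # df: one row per patient with follow-up time, neuropathy indicator
            # and covariates (ethnicity, HDL, family history of diabetes, ...).
            cox = CoxPHFitter().fit(df, duration_col="time", event_col="neuropathy")
            lognormal = LogNormalAFTFitter().fit(df, duration_col="time",
                                                 event_col="neuropathy")
            cox.print_summary()          # hazard ratios
            lognormal.print_summary()    # accelerated failure time coefficients
            return cox, lognormal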

  18. Methodological challenges to multivariate syndromic surveillance: a case study using Swiss animal health data.

    PubMed

    Vial, Flavie; Wei, Wei; Held, Leonhard

    2016-12-20

    In an era of ubiquitous electronic collection of animal health data, multivariate surveillance systems (which concurrently monitor several data streams) should have a greater probability of detecting disease events than univariate systems. However, despite their limitations, univariate aberration detection algorithms are used in most active syndromic surveillance (SyS) systems because of their ease of application and interpretation. On the other hand, a stochastic modelling-based approach to multivariate surveillance offers more flexibility, allowing for the retention of historical outbreaks, for overdispersion and for non-stationarity. While such methods are not new, they are yet to be applied to animal health surveillance data. We applied an example of such a stochastic model, Held and colleagues' two-component model, to two multivariate animal health datasets from Switzerland. In our first application, multivariate time series of the number of laboratory test requests were derived from Swiss animal diagnostic laboratories. We compared the performance of the two-component model to parallel monitoring using an improved Farrington algorithm and found that both methods yield a satisfactorily low false alarm rate. However, the calibration test of the two-component model on the one-step ahead predictions proved satisfactory, making such an approach suitable for outbreak prediction. In our second application, the two-component model was applied to the multivariate time series of the number of cattle abortions and the number of test requests for bovine viral diarrhea (a disease that often results in abortions). We found that there is a two-day lagged effect from the number of abortions to the number of test requests. We further compared the joint and univariate modelling of the laboratory test request time series. The joint modelling approach showed evidence of superiority in terms of forecasting ability. Stochastic modelling approaches offer the

  19. Recurrent Neural Networks for Multivariate Time Series with Missing Values.

    PubMed

    Che, Zhengping; Purushotham, Sanjay; Cho, Kyunghyun; Sontag, David; Liu, Yan

    2018-04-17

    Multivariate time series data in practical applications, such as health care, geoscience, and biology, are characterized by a variety of missing values. In time series prediction and other related tasks, it has been noted that missing values and their missing patterns are often correlated with the target labels, a.k.a., informative missingness. There is very limited work on exploiting the missing patterns for effective imputation and improving prediction performance. In this paper, we develop novel deep learning models, namely GRU-D, as one of the early attempts. GRU-D is based on Gated Recurrent Unit (GRU), a state-of-the-art recurrent neural network. It takes two representations of missing patterns, i.e., masking and time interval, and effectively incorporates them into a deep model architecture so that it not only captures the long-term temporal dependencies in time series, but also utilizes the missing patterns to achieve better prediction results. Experiments of time series classification tasks on real-world clinical datasets (MIMIC-III, PhysioNet) and synthetic datasets demonstrate that our models achieve state-of-the-art performance and provide useful insights for better understanding and utilization of missing values in time series analysis.
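
    A heavily simplified sketch of the decay idea behind GRU-D (not the authors' exact architecture; layer sizes, tensor layout and names are assumptions), written with PyTorch: missing inputs decay from the last observed value toward the empirical mean as the time gap grows, and the mask and decay are fed to the recurrent cell alongside the values.

        import torch
        import torch.nn as nn

        class DecayGRUCell(nn.Module):
            def __init__(self, n_vars, hidden):
                super().__init__()
                self.gamma = nn.Linear(n_vars, n_vars)       # per-variable decay rates
                self.cell = nn.GRUCell(3 * n_vars, hidden)   # [imputed values, mask, decay]

            def forward(self, x_last, mask, delta, x_mean, h):
                # x_last: last observed value carried forward; mask: 1 if observed now;
                # delta: time since each variable was last observed; h: hidden state.
                decay = torch.exp(-torch.relu(self.gamma(delta)))
                x_hat = mask * x_last + (1 - mask) * (decay * x_last + (1 - decay) * x_mean)
                return self.cell(torch.cat([x_hat, mask, decay], dim=-1), h)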

  20. Quantifying space-time dynamics of flood event types

    NASA Astrophysics Data System (ADS)

    Viglione, Alberto; Chirico, Giovanni Battista; Komma, Jürgen; Woods, Ross; Borga, Marco; Blöschl, Günter

    2010-11-01

    Summary: A generalised framework of space-time variability in flood response is used to characterise five flood events of different types in the Kamp area in Austria: one long-rain event, two short-rain events, one rain-on-snow event and one snowmelt event. Specifically, the framework quantifies the contributions of the space-time variability of rainfall/snowmelt, runoff coefficient, hillslope and channel routing to the flood runoff volume and the delay and spread of the resulting hydrograph. The results indicate that the components obtained by the framework clearly reflect the individual processes which characterise the event types. For the short-rain events, temporal, spatial and movement components can all be important in runoff generation and routing, which would be expected because of their local nature in time and, particularly, in space. For the long-rain event, the temporal components tend to be more important for runoff generation, because of the more uniform spatial coverage of rainfall, while for routing the spatial distribution of the produced runoff, which is not uniform, is also important. For the rain-on-snow and snowmelt events, the spatio-temporal variability terms typically do not play much role in runoff generation and the spread of the hydrograph is mainly due to the duration of the event. As an outcome of the framework, a dimensionless response number is proposed that represents the joint effect of runoff coefficient and hydrograph peakedness and captures the absolute magnitudes of the observed flood peaks.

  1. Comprehensive drought characteristics analysis based on a nonlinear multivariate drought index

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Chang, Jianxia; Wang, Yimin; Li, Yunyun; Hu, Hui; Chen, Yutong; Huang, Qiang; Yao, Jun

    2018-02-01

    It is vital to identify drought events and to evaluate multivariate drought characteristics based on a composite drought index for better drought risk assessment and sustainable development of water resources. However, most composite drought indices are constructed by linear combination, principal component analysis or the entropy weight method, assuming a linear relationship among different drought indices. In this study, a multidimensional copula function was applied to construct a nonlinear multivariate drought index (NMDI) to handle the complicated, nonlinear relationships, owing to its dependence structure and flexibility. The NMDI was constructed by combining meteorological, hydrological, and agricultural variables (precipitation, runoff, and soil moisture) to better reflect the multivariate variables simultaneously. Based on the constructed NMDI and runs theory, drought events for a particular area were identified with respect to three drought characteristics: duration, peak, and severity. Finally, multivariate drought risk was analyzed as a tool for providing reliable support in drought decision-making. The results indicate that: (1) multidimensional copulas can effectively handle the complicated, nonlinear relationships among multivariate variables; (2) compared with single and other composite drought indices, the NMDI is slightly more sensitive in capturing recorded drought events; and (3) drought risk shows spatial variation; out of the five partitions studied, the Jing River Basin as well as the upstream and midstream of the Wei River Basin are characterized by a higher multivariate drought risk. In general, multidimensional copulas provide a reliable way to handle nonlinear relationships when constructing a comprehensive drought index and evaluating multivariate drought characteristics.

  2. Young Children's Memory for the Times of Personal Past Events

    PubMed Central

    Pathman, Thanujeni; Larkina, Marina; Burch, Melissa; Bauer, Patricia J.

    2012-01-01

    Remembering the temporal information associated with personal past events is critical for autobiographical memory, yet we know relatively little about the development of this capacity. In the present research, we investigated temporal memory for naturally occurring personal events in 4-, 6-, and 8-year-old children. Parents recorded unique events in which their children participated during a 4-month period. At test, children made relative recency judgments and estimated the time of each event using conventional time-scales (time of day, day of week, month of year, and season). Children also were asked to provide justifications for their time-scale judgments. Six- and 8-year-olds, but not 4-year-olds, accurately judged the order of two distinct events. There were age-related improvements in children's estimation of the time of events using conventional time-scales. Older children provided more justifications for their time-scale judgments compared to younger children. Relations between correct responding on the time-scale judgments and provision of meaningful justifications suggest that children may use that information to reconstruct the times associated with past events. The findings can be used to chart a developmental trajectory of performance in temporal memory for personal past events, and have implications for our understanding of autobiographical memory development. PMID:23687467

  3. Multivariate meta-analysis: potential and promise.

    PubMed

    Jackson, Dan; Riley, Richard; White, Ian R

    2011-09-10

    The multivariate random effects model is a generalization of the standard univariate model. Multivariate meta-analysis is becoming more commonly used and the techniques and related computer software, although continually under development, are now in place. In order to raise awareness of the multivariate methods, and discuss their advantages and disadvantages, we organized a one day 'Multivariate meta-analysis' event at the Royal Statistical Society. In addition to disseminating the most recent developments, we also received an abundance of comments, concerns, insights, critiques and encouragement. This article provides a balanced account of the day's discourse. By giving others the opportunity to respond to our assessment, we hope to ensure that the various view points and opinions are aired before multivariate meta-analysis simply becomes another widely used de facto method without any proper consideration of it by the medical statistics community. We describe the areas of application that multivariate meta-analysis has found, the methods available, the difficulties typically encountered and the arguments for and against the multivariate methods, using four representative but contrasting examples. We conclude that the multivariate methods can be useful, and in particular can provide estimates with better statistical properties, but also that these benefits come at the price of making more assumptions which do not result in better inference in every case. Although there is evidence that multivariate meta-analysis has considerable potential, it must be even more carefully applied than its univariate counterpart in practice. Copyright © 2011 John Wiley & Sons, Ltd.

  4. Multivariate meta-analysis: Potential and promise

    PubMed Central

    Jackson, Dan; Riley, Richard; White, Ian R

    2011-01-01

    The multivariate random effects model is a generalization of the standard univariate model. Multivariate meta-analysis is becoming more commonly used and the techniques and related computer software, although continually under development, are now in place. In order to raise awareness of the multivariate methods, and discuss their advantages and disadvantages, we organized a one day ‘Multivariate meta-analysis’ event at the Royal Statistical Society. In addition to disseminating the most recent developments, we also received an abundance of comments, concerns, insights, critiques and encouragement. This article provides a balanced account of the day's discourse. By giving others the opportunity to respond to our assessment, we hope to ensure that the various view points and opinions are aired before multivariate meta-analysis simply becomes another widely used de facto method without any proper consideration of it by the medical statistics community. We describe the areas of application that multivariate meta-analysis has found, the methods available, the difficulties typically encountered and the arguments for and against the multivariate methods, using four representative but contrasting examples. We conclude that the multivariate methods can be useful, and in particular can provide estimates with better statistical properties, but also that these benefits come at the price of making more assumptions which do not result in better inference in every case. Although there is evidence that multivariate meta-analysis has considerable potential, it must be even more carefully applied than its univariate counterpart in practice. Copyright © 2011 John Wiley & Sons, Ltd. PMID:21268052

  5. Using Statistical Process Control for detecting anomalies in multivariate spatiotemporal Earth Observations

    NASA Astrophysics Data System (ADS)

    Flach, Milan; Mahecha, Miguel; Gans, Fabian; Rodner, Erik; Bodesheim, Paul; Guanche-Garcia, Yanira; Brenning, Alexander; Denzler, Joachim; Reichstein, Markus

    2016-04-01

    The number of available Earth observations (EOs) is currently increasing substantially. Detecting anomalous patterns in these multivariate time series is an important step in identifying changes in the underlying dynamical system. Likewise, data quality issues might result in anomalous multivariate data constellations and have to be identified before corrupting subsequent analyses. In industrial applications, a common strategy is to monitor production chains with several sensors coupled to some statistical process control (SPC) algorithm. The basic idea is to raise an alarm when these sensor data depict some anomalous pattern according to the SPC, i.e. the production chain is considered 'out of control'. In fact, the industrial applications are conceptually similar to the on-line monitoring of EOs. However, algorithms used in the context of SPC or process monitoring are rarely considered for supervising multivariate spatio-temporal Earth observations. The objective of this study is to exploit the potential and transferability of SPC concepts to Earth system applications. We compare a range of different algorithms typically applied by SPC systems and evaluate their capability to detect, for example, known extreme events in land surface processes. Specifically, two main issues are addressed: (1) identifying the most suitable combination of data pre-processing and detection algorithm for a specific type of event and (2) analyzing the limits of the individual approaches with respect to the magnitude and spatio-temporal extent of the event, as well as the data's signal-to-noise ratio. Extensive artificial data sets that represent the typical properties of Earth observations are used in this study. Our results show that the majority of the algorithms used can be considered for the detection of multivariate spatiotemporal events and directly transferred to real Earth observation data as currently assembled in different projects at the European scale, e.g. http://baci-h2020.eu
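
    One classical SPC detector of the kind compared in such studies is Hotelling's T² control chart, sketched below for a multivariate data stream. This is a generic illustration under assumed variable names and a synthetic data set, not the study's actual pre-processing or algorithm selection.

      import numpy as np
      from scipy.stats import chi2

      def hotelling_t2_alarms(X_train, X_monitor, alpha=0.001):
          """Flag observations whose Hotelling T^2 statistic exceeds a chi-square
          control limit estimated from an in-control training period."""
          mu = X_train.mean(axis=0)
          cov_inv = np.linalg.pinv(np.cov(X_train, rowvar=False))
          diffs = X_monitor - mu
          t2 = np.einsum("ij,jk,ik->i", diffs, cov_inv, diffs)   # T^2 per observation
          limit = chi2.ppf(1.0 - alpha, df=X_train.shape[1])     # asymptotic control limit
          return t2 > limit, t2

      # toy example: five correlated "EO variables" with an injected extreme event
      rng = np.random.default_rng(1)
      train = rng.normal(size=(1000, 5))
      monitor = rng.normal(size=(200, 5))
      monitor[120:125] += 4.0                                    # synthetic anomaly
      alarms, scores = hotelling_t2_alarms(train, monitor)
      print(np.where(alarms)[0])                                 # indices of alarmed time steps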

  6. Real-time measurements, rare events and photon economics

    NASA Astrophysics Data System (ADS)

    Jalali, B.; Solli, D. R.; Goda, K.; Tsia, K.; Ropers, C.

    2010-07-01

    Rogue events otherwise known as outliers and black swans are singular, rare, events that carry dramatic impact. They appear in seemingly unconnected systems in the form of oceanic rogue waves, stock market crashes, evolution, and communication systems. Attempts to understand the underlying dynamics of such complex systems that lead to spectacular and often cataclysmic outcomes have been frustrated by the scarcity of events, resulting in insufficient statistical data, and by the inability to perform experiments under controlled conditions. Extreme rare events also occur in ultrafast physical sciences where it is possible to collect large data sets, even for rare events, in a short time period. The knowledge gained from observing rare events in ultrafast systems may provide valuable insight into extreme value phenomena that occur over a much slower timescale and that have a closer connection with human experience. One solution is a real-time ultrafast instrument that is capable of capturing singular and randomly occurring non-repetitive events. The time stretch technology developed during the past 13 years is providing a powerful tool box for reaching this goal. This paper reviews this technology and discusses its use in capturing rogue events in electronic signals, spectroscopy, and imaging. We show an example in nonlinear optics where it was possible to capture rare and random solitons whose unusual statistical distribution resemble those observed in financial markets. The ability to observe the true spectrum of each event in real time has led to important insight in understanding the underlying process, which in turn has made it possible to control soliton generation leading to improvement in the coherence of supercontinuum light. We also show a new class of fast imagers which are being considered for early detection of cancer because of their potential ability to detect rare diseased cells (so called rogue cells) in a large population of healthy cells.

  7. Multivariate analysis in thoracic research.

    PubMed

    Mengual-Macenlle, Noemí; Marcos, Pedro J; Golpe, Rafael; González-Rivas, Diego

    2015-03-01

    Multivariate analysis is based on the observation and analysis of more than one statistical outcome variable at a time. In design and analysis, the technique is used to perform trade studies across multiple dimensions while taking into account the effects of all variables on the responses of interest. Multivariate methods emerged to analyze large databases and increasingly complex data. Since modeling is the best way to represent knowledge of reality, we should use multivariate statistical methods. Multivariate methods are designed to simultaneously analyze data sets, i.e., the analysis of different variables for each person or object studied. Keep in mind at all times that all variables must be treated in a way that accurately reflects the reality of the problem addressed. There are different types of multivariate analysis, and each one should be employed according to the type of variables to analyze: dependent, interdependence and structural methods. In conclusion, multivariate methods are ideal for the analysis of large data sets and for finding cause-and-effect relationships between variables; there is a wide range of analysis types that we can use.

  8. An unjustified benefit: immortal time bias in the analysis of time-dependent events.

    PubMed

    Gleiss, Andreas; Oberbauer, Rainer; Heinze, Georg

    2018-02-01

    Immortal time bias is a problem arising from methodologically flawed analyses of time-dependent events in survival analyses. We illustrate the problem by analysis of a kidney transplantation study. Following patients from transplantation to death, groups defined by the occurrence or nonoccurrence of graft failure during follow-up seemingly had equal overall mortality. Such a naive analysis assumes that patients were assigned to the two groups at the time of transplantation, whereas group membership is actually a consequence of a time-dependent event occurring later during follow-up. We introduce landmark analysis as the method of choice to avoid immortal time bias. Landmark analysis splits the follow-up time at a common, prespecified time point, the so-called landmark. Groups are then defined by time-dependent events having occurred before the landmark, and outcome events are only considered if occurring after the landmark. Landmark analysis can be easily implemented with common statistical software. In our kidney transplantation example, landmark analyses with landmarks set at 30 and 60 months clearly identified graft failure as a risk factor for overall mortality. We give further typical examples from transplantation research and discuss strengths and limitations of landmark analysis and other methods to address immortal time bias, such as Cox regression with time-dependent covariables. © 2017 Steunstichting ESOT.
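
    A minimal sketch of the landmark construction described above is given below, using pandas and the lifelines package on made-up data; the column names, toy values and 30-month landmark are illustrative assumptions, not the study's data.

      import pandas as pd
      from lifelines.statistics import logrank_test

      # Illustrative columns (not the study's data); follow-up starts at transplantation:
      #   time_to_death       months from transplantation to death or censoring
      #   death               1 = died, 0 = censored
      #   time_to_graft_fail  months to graft failure (NaN if it never occurred)
      df = pd.DataFrame({
          "time_to_death":      [12, 80, 45, 95, 30, 70, 55, 100],
          "death":              [1, 0, 1, 0, 1, 1, 1, 0],
          "time_to_graft_fail": [5, None, 40, 70, None, 20, None, None],
      })

      landmark = 30  # months

      # 1) keep only patients still under follow-up (alive) at the landmark
      at_risk = df[df["time_to_death"] > landmark].copy()

      # 2) group membership is fixed by what happened *before* the landmark only
      at_risk["graft_failed"] = at_risk["time_to_graft_fail"].notna() & \
                                (at_risk["time_to_graft_fail"] <= landmark)

      # 3) outcome time is measured from the landmark onwards
      at_risk["time_from_landmark"] = at_risk["time_to_death"] - landmark

      g1 = at_risk[at_risk["graft_failed"]]
      g0 = at_risk[~at_risk["graft_failed"]]
      result = logrank_test(g1["time_from_landmark"], g0["time_from_landmark"],
                            event_observed_A=g1["death"], event_observed_B=g0["death"])
      print(result.p_value)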

  9. Multivariate pattern analysis of MEG and EEG: A comparison of representational structure in time and space.

    PubMed

    Cichy, Radoslaw Martin; Pantazis, Dimitrios

    2017-09-01

    Multivariate pattern analysis of magnetoencephalography (MEG) and electroencephalography (EEG) data can reveal the rapid neural dynamics underlying cognition. However, MEG and EEG have systematic differences in sampling neural activity. This poses the question to which degree such measurement differences consistently bias the results of multivariate analysis applied to MEG and EEG activation patterns. To investigate, we conducted a concurrent MEG/EEG study while participants viewed images of everyday objects. We applied multivariate classification analyses to MEG and EEG data, and compared the resulting time courses to each other, and to fMRI data for an independent evaluation in space. We found that both MEG and EEG revealed the millisecond spatio-temporal dynamics of visual processing with largely equivalent results. Beyond yielding convergent results, we found that MEG and EEG also captured partly unique aspects of visual representations. Those unique components emerged earlier in time for MEG than for EEG. Identifying the sources of those unique components with fMRI, we found the locus for both MEG and EEG in high-level visual cortex, and in addition for MEG in low-level visual cortex. Together, our results show that multivariate analyses of MEG and EEG data offer a convergent and complimentary view on neural processing, and motivate the wider adoption of these methods in both MEG and EEG research. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. A Multivariate Model for the Meta-Analysis of Study Level Survival Data at Multiple Times

    ERIC Educational Resources Information Center

    Jackson, Dan; Rollins, Katie; Coughlin, Patrick

    2014-01-01

    Motivated by our meta-analytic dataset involving survival rates after treatment for critical leg ischemia, we develop and apply a new multivariate model for the meta-analysis of study level survival data at multiple times. Our data set involves 50 studies that provide mortality rates at up to seven time points, which we model simultaneously, and…

  11. A general framework for multivariate multi-index drought prediction based on Multivariate Ensemble Streamflow Prediction (MESP)

    NASA Astrophysics Data System (ADS)

    Hao, Zengchao; Hao, Fanghua; Singh, Vijay P.

    2016-08-01

    Drought is among the costliest natural hazards worldwide, and extreme drought events in recent years have caused huge losses to various sectors. Drought prediction is therefore critically important for providing early warning information to aid decision making to cope with drought. Due to the complicated nature of drought, it has been recognized that a univariate drought indicator may not be sufficient for drought characterization, and hence multivariate drought indices have been developed for drought monitoring. Alongside the substantial effort in drought monitoring with multivariate drought indices, it is of equal importance to develop a drought prediction method with multivariate drought indices to integrate drought information from various sources. This study proposes a general framework for multivariate multi-index drought prediction that is capable of integrating complementary prediction skills from multiple drought indices. The Multivariate Ensemble Streamflow Prediction (MESP) is employed to sample from historical records to obtain statistical predictions of multiple variables, which are then used as inputs to achieve multivariate prediction. The framework is illustrated with a linearly combined drought index (LDI), which is a commonly used multivariate drought index, based on climate division data in California and New York in the United States with different seasonality of precipitation. The predictive skill of LDI (represented with persistence) is assessed by comparison with the univariate drought index, and results show that the LDI prediction skill is less affected by seasonality than the meteorological drought prediction based on SPI. Prediction results from the case study show that the proposed multivariate drought prediction outperforms the persistence prediction, implying a satisfactory performance of multivariate drought prediction. The proposed method would be useful for drought prediction to integrate drought information from various sources.

  12. Predicting analysis time in events-driven clinical trials using accumulating time-to-event surrogate information.

    PubMed

    Wang, Jianming; Ke, Chunlei; Yu, Zhinuan; Fu, Lei; Dornseif, Bruce

    2016-05-01

    For clinical trials with time-to-event endpoints, predicting the accrual of the events of interest with precision is critical in determining the timing of interim and final analyses. For example, overall survival (OS) is often chosen as the primary efficacy endpoint in oncology studies, with planned interim and final analyses at a pre-specified number of deaths. Often, correlated surrogate information, such as time-to-progression (TTP) and progression-free survival, are also collected as secondary efficacy endpoints. It would be appealing to borrow strength from the surrogate information to improve the precision of the analysis time prediction. Currently available methods in the literature for predicting analysis timings do not consider utilizing the surrogate information. In this article, using OS and TTP as an example, a general parametric model for OS and TTP is proposed, with the assumption that disease progression could change the course of the overall survival. Progression-free survival, related both to OS and TTP, will be handled separately, as it can be derived from OS and TTP. The authors seek to develop a prediction procedure using a Bayesian method and provide detailed implementation strategies under certain assumptions. Simulations are performed to evaluate the performance of the proposed method. An application to a real study is also provided. Copyright © 2015 John Wiley & Sons, Ltd.

  13. Extensions to Multivariate Space Time Mixture Modeling of Small Area Cancer Data.

    PubMed

    Carroll, Rachel; Lawson, Andrew B; Faes, Christel; Kirby, Russell S; Aregay, Mehreteab; Watjou, Kevin

    2017-05-09

    Oral cavity and pharynx cancer, even when considered together, is a fairly rare disease. Implementation of multivariate modeling with lung and bronchus cancer, as well as melanoma cancer of the skin, could lead to better inference for oral cavity and pharynx cancer. The multivariate structure of these models is accomplished via the use of shared random effects, as well as other multivariate prior distributions. The results in this paper indicate that care should be taken when executing these types of models, and that multivariate mixture models may not always be the ideal option, depending on the data of interest.

  14. Event coincidence analysis for quantifying statistical interrelationships between event time series. On the role of flood events as triggers of epidemic outbreaks

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.

    2016-05-01

    Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis, which provides a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
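
    The basic quantity behind event coincidence analysis is a coincidence rate: the fraction of events in one series that are preceded, within a tolerance window and an optional lag, by at least one event of the other series. The sketch below is an illustrative implementation under assumed parameter names and toy event times; it is not the authors' reference implementation.

      import numpy as np

      def precursor_coincidence_rate(events_a, events_b, delta_t, tau=0.0):
          """Fraction of events in series A preceded, within the window [tau, tau + delta_t],
          by at least one event of series B (the basic event coincidence measure)."""
          events_a = np.asarray(events_a, dtype=float)
          events_b = np.asarray(events_b, dtype=float)
          hits = 0
          for t_a in events_a:
              lags = t_a - events_b
              if np.any((lags >= tau) & (lags <= tau + delta_t)):
                  hits += 1
          return hits / len(events_a)

      # toy example: outbreak onset times vs. flood times (arbitrary time units)
      floods = [3.0, 10.0, 17.5, 24.0]
      outbreaks = [4.0, 11.0, 30.0]
      print(precursor_coincidence_rate(outbreaks, floods, delta_t=2.0))   # 2 of 3 outbreaks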

  15. Generating functions and stability study of multivariate self-excited epidemic processes

    NASA Astrophysics Data System (ADS)

    Saichev, A. I.; Sornette, D.

    2011-09-01

    We present a stability study of the class of multivariate self-excited Hawkes point processes, which can model natural and social systems, including earthquakes, epileptic seizures and the dynamics of neuron assemblies, bursts of exchanges in social communities, interactions between Internet bloggers, bank network fragility and cascading of failures, national sovereign default contagion, and so on. We present the general theory of multivariate generating functions to derive the number of events over all generations of various types that are triggered by a mother event of a given type. We obtain the stability domains of various systems as a function of the topological structure of the mutual excitations across different event types. We find that mutual triggering tends to provide a significant extension of the stability (or subcritical) domain compared with the case where event types are decoupled, that is, when an event of a given type can only trigger events of the same type.
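
    In this setting, subcriticality is commonly checked through the branching matrix, whose entry (i, j) is the mean number of first-generation events of type i directly triggered by a single event of type j: the process is stable when the spectral radius of this matrix is below one. The snippet below is a minimal numerical check with made-up coupling values, not the paper's analysis.

      import numpy as np

      def is_subcritical(branching_matrix):
          """A multivariate Hawkes process is subcritical (stable) when the spectral
          radius of its branching matrix is strictly below one."""
          rho = max(abs(np.linalg.eigvals(branching_matrix)))
          return rho < 1.0, rho

      # toy three-type example: entry (i, j) = mean number of type-i events directly
      # triggered by one type-j event (illustrative numbers only)
      n = np.array([[0.4, 0.2, 0.0],
                    [0.1, 0.3, 0.2],
                    [0.0, 0.2, 0.5]])
      print(is_subcritical(n))   # (True, ...) -> stable / subcritical regime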

  16. Assessment of Critical Events Corridors through Multivariate Cascading Outages Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, Yuri V.; Samaan, Nader A.; Diao, Ruisheng

    2011-10-17

    Massive blackouts of electrical power systems in North America over the past decade have focused increasing attention upon ways to identify and simulate network events that may potentially lead to widespread network collapse. This paper summarizes a method to simulate power-system vulnerability to cascading failures from a supplied set of initiating events, termed extreme events. The implemented simulation method is currently confined to simulating the steady-state power-system response to a set of extreme events. The outlined method of simulation is meant to augment and provide new insight into bulk power transmission network planning, which at present remains mainly confined to maintaining power system security for single and double component outages under a number of projected future network operating conditions. Although one of the aims of this paper is to demonstrate the feasibility of simulating network vulnerability to cascading outages, a more important goal has been to determine vulnerable parts of the network that may potentially be strengthened in practice so as to mitigate system susceptibility to cascading failures. This paper demonstrates a systematic approach to analyze extreme events and identify vulnerable system elements that may be contributing to cascading outages. The hypothesis of critical events corridors is proposed to represent repeating sequential outages that can occur in the system for multiple initiating events. The new concept helps to identify system reinforcements that planners could engineer in order to 'break' the critical events sequences and therefore lessen the likelihood of cascading outages. This hypothesis has been successfully validated with a California power system model.

  17. Detecting a currency’s dominance using multivariate time series analysis

    NASA Astrophysics Data System (ADS)

    Syahidah Yusoff, Nur; Sharif, Shamshuritawati

    2017-09-01

    A currency exchange rate is the price of one country’s currency in terms of another country’s currency. Four different prices (opening, closing, highest, and lowest) are obtained from daily trading activities. In the past, many studies have been carried out using the closing price only. However, those four prices are interrelated, so a multivariate time series can provide more information than a univariate one. The aim of this paper is therefore to compare the results of two different approaches, the mean vector and Escoufier’s RV coefficient, in constructing similarity matrices of 20 world currencies. Both matrices are then used in place of the correlation matrix required by network topology. With the help of the degree centrality measure, we can detect the currency’s dominance for both networks. The pros and cons of both approaches are presented at the end of this paper.
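
    Escoufier's RV coefficient measures the similarity between two whole data matrices observed on the same time points, which is what allows the four interrelated prices of each currency to be compared jointly rather than one series at a time. Below is a minimal numpy sketch with synthetic price matrices; the variable names and data are assumptions, not the study's currencies.

      import numpy as np

      def rv_coefficient(X, Y):
          """Escoufier's RV coefficient between two data matrices observed on the same
          time points (0 = unrelated configurations, 1 = identical configurations)."""
          X = X - X.mean(axis=0)
          Y = Y - Y.mean(axis=0)
          Sxx, Syy, Sxy = X.T @ X, Y.T @ Y, X.T @ Y
          return np.trace(Sxy @ Sxy.T) / np.sqrt(np.trace(Sxx @ Sxx) * np.trace(Syy @ Syy))

      # toy example: 100 trading days x 4 prices (open, close, high, low) per currency
      rng = np.random.default_rng(2)
      base = rng.normal(size=(100, 4))
      currency_a = base + 0.1 * rng.normal(size=(100, 4))   # strongly related to base
      currency_b = rng.normal(size=(100, 4))                # unrelated to base
      print(rv_coefficient(currency_a, base), rv_coefficient(currency_b, base))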

  18. Survival analysis: Part I — analysis of time-to-event

    PubMed Central

    2018-01-01

    Length of time is a variable often encountered during data analysis. Survival analysis provides simple, intuitive results concerning time-to-event for events of interest, which are not confined to death. This review introduces methods of analyzing time-to-event. The Kaplan-Meier survival analysis, log-rank test, and Cox proportional hazards regression modeling method are described with examples of hypothetical data. PMID:29768911
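
    For readers who want to try the three techniques named above on their own data, the hedged sketch below uses the lifelines Python package on a small, entirely hypothetical data set (column names and values are illustrative, not from the review).

      import pandas as pd
      from lifelines import CoxPHFitter, KaplanMeierFitter
      from lifelines.statistics import logrank_test

      # hypothetical data: follow-up time in months, event indicator, one binary covariate
      df = pd.DataFrame({
          "time":  [6, 13, 13, 7, 22, 30, 14, 18, 25, 9],
          "event": [1, 1, 0, 1, 0, 1, 1, 0, 1, 1],
          "group": [0, 0, 0, 0, 0, 1, 1, 1, 1, 1],
      })

      km = KaplanMeierFitter().fit(df["time"], event_observed=df["event"])
      print(km.median_survival_time_)                 # Kaplan-Meier median survival

      a, b = df[df["group"] == 0], df[df["group"] == 1]
      print(logrank_test(a["time"], b["time"],        # log-rank comparison of the two groups
                         event_observed_A=a["event"], event_observed_B=b["event"]).p_value)

      cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
      cph.print_summary()                             # Cox proportional hazards regression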

  19. Soil erosion under multiple time-varying rainfall events

    NASA Astrophysics Data System (ADS)

    Heng, B. C. Peter; Barry, D. Andrew; Jomaa, Seifeddine; Sander, Graham C.

    2010-05-01

    Soil erosion is a function of many factors and process interactions. An erosion event produces changes in surface soil properties such as texture and hydraulic conductivity. These changes in turn alter the erosion response to subsequent events. Laboratory-scale soil erosion studies have typically focused on single independent rainfall events with constant rainfall intensities. This study investigates the effect of multiple time-varying rainfall events on soil erosion using the EPFL erosion flume. The rainfall simulator comprises ten Veejet nozzles mounted on oscillating bars 3 m above a 6 m × 2 m flume. Spray from the nozzles is applied onto the soil surface in sweeps; rainfall intensity is thus controlled by varying the sweeping frequency. Freshly-prepared soil with a uniform slope was subjected to five rainfall events at daily intervals. In each 3-h event, rainfall intensity was ramped up linearly to a maximum of 60 mm/h and then stepped down to zero. Runoff samples were collected and analysed for particle size distribution (PSD) as well as total sediment concentration. We investigate whether there is a hysteretic relationship between sediment concentration and discharge within each event and how this relationship changes from event to event. Trends in the PSD of the eroded sediment are discussed and correlated with changes in sediment concentration. Close-up imagery of the soil surface following each event highlights changes in surface soil structure with time. This study enhances our understanding of erosion processes in the field, with corresponding implications for soil erosion modelling.

  20. Computing return times or return periods with rare event algorithms

    NASA Astrophysics Data System (ADS)

    Lestang, Thibault; Ragone, Francesco; Bréhier, Charles-Edouard; Herbert, Corentin; Bouchet, Freddy

    2018-04-01

    The average time between two occurrences of the same event, referred to as its return time (or return period), is a useful statistical concept for practical applications. For instance, insurers or public agencies may be interested in the return time of a 10 m flood of the Seine river in Paris. However, due to their scarcity, reliably estimating return times for rare events is very difficult using either observational data or direct numerical simulations. For rare events, an estimator for return times can be built from the extrema of the observable on trajectory blocks. Here, we show that this estimator can be improved to remain accurate for return times of the order of the block size. More importantly, we show that this approach can be generalised to estimate return times from numerical algorithms specifically designed to sample rare events, which so far have often computed probabilities rather than return times. The approach we propose provides a computationally extremely efficient way to estimate numerically the return times of rare events for a dynamical system, gaining several orders of magnitude in computational cost. We illustrate the method on two kinds of observables, instantaneous and time-averaged, using two different rare event algorithms, for a simple stochastic process, the Ornstein–Uhlenbeck process. As an example of realistic applications to complex systems, we finally discuss extreme values of the drag on an object in a turbulent flow.
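
    One way to make the block-based estimator concrete: split a long trajectory into blocks of duration T_b and, for each threshold a, estimate the return time as r(a) = -T_b / ln(1 - q(a)), where q(a) is the fraction of blocks whose maximum exceeds a. The sketch below applies this to a simulated Ornstein–Uhlenbeck process; the discretisation, parameters and block length are illustrative assumptions rather than the paper's exact setup.

      import numpy as np

      def simulate_ou(n_steps, dt=0.01, theta=1.0, sigma=1.0, seed=0):
          """Euler-Maruyama simulation of dx = -theta*x dt + sigma dW (Ornstein-Uhlenbeck)."""
          rng = np.random.default_rng(seed)
          x = np.zeros(n_steps)
          for i in range(1, n_steps):
              x[i] = x[i - 1] - theta * x[i - 1] * dt + sigma * np.sqrt(dt) * rng.normal()
          return x

      def return_times_from_blocks(x, dt, block_len, thresholds):
          """Estimate r(a) = -T_b / log(1 - q(a)), with q(a) the fraction of blocks
          of duration T_b whose maximum exceeds the threshold a."""
          steps = int(round(block_len / dt))
          n_blocks = len(x) // steps
          block_max = x[:n_blocks * steps].reshape(n_blocks, steps).max(axis=1)
          q = np.array([(block_max > a).mean() for a in thresholds])
          with np.errstate(divide="ignore"):
              return -block_len / np.log(1.0 - q)

      x = simulate_ou(1_000_000)                      # 10,000 time units of trajectory
      print(return_times_from_blocks(x, dt=0.01, block_len=50.0,
                                     thresholds=np.linspace(0.5, 2.0, 4)))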

  1. Learning a Mahalanobis Distance-Based Dynamic Time Warping Measure for Multivariate Time Series Classification.

    PubMed

    Mei, Jiangyuan; Liu, Meizhu; Wang, Yuan-Fang; Gao, Huijun

    2016-06-01

    Multivariate time series (MTS) datasets broadly exist in numerous fields, including health care, multimedia, finance, and biometrics. How to classify MTS accurately has become a hot research topic, since it is an important element in many computer vision and pattern recognition applications. In this paper, we propose a Mahalanobis distance-based dynamic time warping (DTW) measure for MTS classification. The Mahalanobis distance builds an accurate relationship between each variable and its corresponding category. It is utilized to calculate the local distance between vectors in MTS. We then use DTW to align those MTS which are out of synchronization or of different lengths. After that, how to learn an accurate Mahalanobis distance function becomes another key problem. This paper establishes a LogDet divergence-based metric learning model with triplet constraints, which can learn the Mahalanobis matrix with high precision and robustness. Furthermore, the proposed method is applied to nine MTS datasets selected from the University of California, Irvine machine learning repository and Robert T. Olszewski's homepage, and the results demonstrate the improved performance of the proposed approach.
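
    The core of the measure, DTW with a Mahalanobis local distance, can be sketched in a few lines, as below. The matrix M is left as an input (identity by default, i.e. ordinary multivariate DTW); the LogDet metric-learning step that the paper uses to learn M is not reproduced, and all names and toy series are illustrative.

      import numpy as np

      def mahalanobis_dtw(A, B, M=None):
          """DTW distance between multivariate series A (n x d) and B (m x d), using
          d_M(a, b) = sqrt((a-b)^T M (a-b)) as the local distance. M defaults to the
          identity (ordinary multivariate DTW); in the paper M is learned beforehand."""
          A, B = np.asarray(A, float), np.asarray(B, float)
          n, m, d = A.shape[0], B.shape[0], A.shape[1]
          M = np.eye(d) if M is None else M
          diff = A[:, None, :] - B[None, :, :]                      # pairwise differences
          local = np.sqrt(np.einsum("ijk,kl,ijl->ij", diff, M, diff))
          D = np.full((n + 1, m + 1), np.inf)                       # accumulated cost
          D[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  D[i, j] = local[i - 1, j - 1] + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
          return D[n, m]

      # toy example: two 2-variable series of different lengths
      a = np.array([[0, 0], [1, 1], [2, 2], [3, 3]], float)
      b = np.array([[0, 0], [2, 2], [3, 3]], float)
      print(mahalanobis_dtw(a, b))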

  2. Scalable Joint Models for Reliable Uncertainty-Aware Event Prediction.

    PubMed

    Soleimani, Hossein; Hensman, James; Saria, Suchi

    2017-08-21

    Missing data and noisy observations pose significant challenges for reliably predicting events from irregularly sampled multivariate time series (longitudinal) data. Imputation methods, which are typically used for completing the data prior to event prediction, lack a principled mechanism to account for the uncertainty due to missingness. Alternatively, state-of-the-art joint modeling techniques can be used for jointly modeling the longitudinal and event data and compute event probabilities conditioned on the longitudinal observations. These approaches, however, make strong parametric assumptions and do not easily scale to multivariate signals with many observations. Our proposed approach consists of several key innovations. First, we develop a flexible and scalable joint model based upon sparse multiple-output Gaussian processes. Unlike state-of-the-art joint models, the proposed model can explain highly challenging structure including non-Gaussian noise while scaling to large data. Second, we derive an optimal policy for predicting events using the distribution of the event occurrence estimated by the joint model. The derived policy trades off the cost of a delayed detection versus incorrect assessments and abstains from making decisions when the estimated event probability does not satisfy the derived confidence criteria. Experiments on a large dataset show that the proposed framework significantly outperforms state-of-the-art techniques in event prediction.

  3. A simple ergonomic measure reduces fluoroscopy time during ERCP: A multivariate analysis.

    PubMed

    Jowhari, Fahd; Hopman, Wilma M; Hookey, Lawrence

    2017-03-01

    Background and study aims: Endoscopic retrograde cholangiopancreatography (ERCP) carries a radiation risk to patients undergoing the procedure and the team performing it. Fluoroscopy time (FT) has been shown to have a linear relationship with radiation exposure during ERCP. Recent modifications to our ERCP suite design were felt to impact fluoroscopy time and ergonomics. This multivariate analysis was therefore undertaken to investigate these effects, and to identify and validate various clinical, procedural and ergonomic factors influencing the total fluoroscopy time during ERCP. Patients and methods: A retrospective analysis of 299 ERCPs performed by 4 endoscopists over an 18-month period at a single tertiary care center was conducted. All inpatients/outpatients (121 males, 178 females) undergoing ERCP for any clinical indication from January 2012 to June 2013 in the chosen ERCP suite were included in the study. Various predetermined clinical, procedural and ergonomic factors were obtained via chart review. Univariate analyses identified factors to be included in the multivariate regression model with FT as the dependent variable. Results: Bringing the endoscopy and fluoroscopy screens next to each other was associated with a significantly lower FT than when the screens were separated further (-1.4 min, P = 0.026). Other significant factors associated with a prolonged FT included having had a prior ERCP (+1.4 min, P = 0.031) and more difficult procedures (+4.2 min for each level of difficulty, P < 0.001). ERCPs performed by high-volume endoscopists used less FT than those performed by low-volume endoscopists (-1.82, P = 0.015). Conclusions: Our study has identified and validated various factors that affect the total fluoroscopy time during ERCP. This is the first study to show that decreasing the distance between the endoscopy and fluoroscopy screens is associated with a reduction in fluoroscopy time.

  4. Semiparametric Time-to-Event Modeling in the Presence of a Latent Progression Event

    PubMed Central

    Rice, John D.; Tsodikov, Alex

    2017-01-01

    Summary: In cancer research, interest frequently centers on factors influencing a latent event that must precede a terminal event. In practice it is often impossible to observe the latent event precisely, making inference about this process difficult. To address this problem, we propose a joint model for the unobserved time to the latent and terminal events, with the two events linked by the baseline hazard. Covariates enter the model parametrically as linear combinations that multiply, respectively, the hazard for the latent event and the hazard for the terminal event conditional on the latent one. We derive the partial likelihood estimators for this problem assuming the latent event is observed, and propose a profile likelihood–based method for estimation when the latent event is unobserved. The baseline hazard in this case is estimated nonparametrically using the EM algorithm, which allows for closed-form Breslow-type estimators at each iteration, bringing improved computational efficiency and stability compared with maximizing the marginal likelihood directly. We present simulation studies to illustrate the finite-sample properties of the method; its use in practice is demonstrated in the analysis of a prostate cancer data set. PMID:27556886

  5. Semiparametric time-to-event modeling in the presence of a latent progression event.

    PubMed

    Rice, John D; Tsodikov, Alex

    2017-06-01

    In cancer research, interest frequently centers on factors influencing a latent event that must precede a terminal event. In practice it is often impossible to observe the latent event precisely, making inference about this process difficult. To address this problem, we propose a joint model for the unobserved time to the latent and terminal events, with the two events linked by the baseline hazard. Covariates enter the model parametrically as linear combinations that multiply, respectively, the hazard for the latent event and the hazard for the terminal event conditional on the latent one. We derive the partial likelihood estimators for this problem assuming the latent event is observed, and propose a profile likelihood-based method for estimation when the latent event is unobserved. The baseline hazard in this case is estimated nonparametrically using the EM algorithm, which allows for closed-form Breslow-type estimators at each iteration, bringing improved computational efficiency and stability compared with maximizing the marginal likelihood directly. We present simulation studies to illustrate the finite-sample properties of the method; its use in practice is demonstrated in the analysis of a prostate cancer data set. © 2016, The International Biometric Society.

  6. Piecing together the puzzle: Improving event content coverage for real-time sub-event detection using adaptive microblog crawling

    PubMed Central

    Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan

    2017-01-01

    In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user-generated content from social media. Compared with the traditional news media, social media services such as Twitter can provide more complete and timely information about real-world events. However, events are often like a puzzle: in order to solve the puzzle and understand the event, we must identify all the sub-events, or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization currently typically analyse events based on partial data, as conventional data collection methodologies are unable to collect comprehensive event data. This results in existing systems often being unable to report sub-events in real time, and often in completely missing sub-events or pieces in the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal feature of an expanded set of news-worthy event content. In order to more comprehensively and accurately identify sub-events, this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, the content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contributes to better event detection by identifying additional valid sub-events. The novel combination of

  7. Piecing together the puzzle: Improving event content coverage for real-time sub-event detection using adaptive microblog crawling.

    PubMed

    Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan

    2017-01-01

    In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user-generated content from social media. Compared with the traditional news media, social media services such as Twitter can provide more complete and timely information about real-world events. However, events are often like a puzzle: in order to solve the puzzle and understand the event, we must identify all the sub-events, or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization currently typically analyse events based on partial data, as conventional data collection methodologies are unable to collect comprehensive event data. This results in existing systems often being unable to report sub-events in real time, and often in completely missing sub-events or pieces in the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal feature of an expanded set of news-worthy event content. In order to more comprehensively and accurately identify sub-events, this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, the content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contributes to better event detection by identifying additional valid sub-events. The novel combination of

  8. Time-series panel analysis (TSPA): multivariate modeling of temporal associations in psychotherapy process.

    PubMed

    Ramseyer, Fabian; Kupper, Zeno; Caspar, Franz; Znoj, Hansjörg; Tschacher, Wolfgang

    2014-10-01

    Processes occurring in the course of psychotherapy are characterized by the simple fact that they unfold in time and that the multiple factors engaged in change processes vary highly between individuals (idiographic phenomena). Previous research, however, has neglected the temporal perspective by its traditional focus on static phenomena, which were mainly assessed at the group level (nomothetic phenomena). To support a temporal approach, the authors introduce time-series panel analysis (TSPA), a statistical methodology explicitly focusing on the quantification of temporal, session-to-session aspects of change in psychotherapy. TSPA models are initially built at the level of individuals and are subsequently aggregated at the group level, thus allowing the exploration of prototypical models. TSPA is based on vector auto-regression (VAR), an extension of univariate auto-regression models to multivariate time-series data. The application of TSPA is demonstrated in a sample of 87 outpatient psychotherapy patients who were monitored by postsession questionnaires. Prototypical mechanisms of change were derived from the aggregation of individual multivariate models of psychotherapy process. In a second step, the associations between mechanisms of change (TSPA) and pre-to-post symptom change were explored. TSPA allowed a prototypical process pattern to be identified, in which the patient's alliance and self-efficacy were linked by a temporal feedback loop. Furthermore, the therapist's stability over time in both mastery and clarification interventions was positively associated with better outcomes. TSPA is a statistical tool that sheds new light on temporal mechanisms of change. Through this approach, clinicians may gain insight into prototypical patterns of change in psychotherapy. PsycINFO Database Record (c) 2014 APA, all rights reserved.
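
    The individual-level building block of TSPA is a vector autoregression fitted to one patient's session-to-session ratings. The sketch below fits a lag-1 VAR with statsmodels to simulated ratings; the variable names and data are illustrative assumptions, not the study's questionnaire items.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.api import VAR

      # hypothetical session-by-session ratings for a single patient
      rng = np.random.default_rng(3)
      n_sessions = 40
      alliance = np.cumsum(rng.normal(0.0, 0.3, n_sessions)) + 5.0
      self_efficacy = 0.5 * np.roll(alliance, 1) + rng.normal(0.0, 0.3, n_sessions)
      data = pd.DataFrame({"alliance": alliance,
                           "self_efficacy": self_efficacy}).iloc[1:]   # drop wrapped first row

      model = VAR(data)
      results = model.fit(1)     # lag-1 VAR: each rating regressed on both lagged ratings
      print(results.params)      # cross-lagged coefficients ~ session-to-session influences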

  9. Interpretability of Multivariate Brain Maps in Linear Brain Decoding: Definition, and Heuristic Quantification in Multivariate Analysis of MEG Time-Locked Effects.

    PubMed

    Kia, Seyed Mostafa; Vega Pons, Sandro; Weisz, Nathan; Passerini, Andrea

    2016-01-01

    Brain decoding is a popular multivariate approach for hypothesis testing in neuroimaging. Linear classifiers are widely employed in the brain decoding paradigm to discriminate among experimental conditions. Then, the derived linear weights are visualized in the form of multivariate brain maps to further study spatio-temporal patterns of underlying neural activities. It is well known that the brain maps derived from weights of linear classifiers are hard to interpret because of high correlations between predictors, low signal to noise ratios, and the high dimensionality of neuroimaging data. Therefore, improving the interpretability of brain decoding approaches is of primary interest in many neuroimaging studies. Despite extensive studies of this type, at present, there is no formal definition for interpretability of multivariate brain maps. As a consequence, there is no quantitative measure for evaluating the interpretability of different brain decoding methods. In this paper, first, we present a theoretical definition of interpretability in brain decoding; we show that the interpretability of multivariate brain maps can be decomposed into their reproducibility and representativeness. Second, as an application of the proposed definition, we exemplify a heuristic for approximating the interpretability in multivariate analysis of evoked magnetoencephalography (MEG) responses. Third, we propose to combine the approximated interpretability and the generalization performance of the brain decoding into a new multi-objective criterion for model selection. Our results, for the simulated and real MEG data, show that optimizing the hyper-parameters of the regularized linear classifier based on the proposed criterion results in more informative multivariate brain maps. More importantly, the presented definition provides the theoretical background for quantitative evaluation of interpretability, and hence, facilitates the development of more effective brain decoding algorithms

  10. Time-resolved gamma spectroscopy of single events

    NASA Astrophysics Data System (ADS)

    Wolszczak, W.; Dorenbos, P.

    2018-04-01

    In this article we present a method of characterizing scintillating materials by digitization of each individual scintillation pulse followed by digital signal processing. With this technique it is possible to measure the pulse shape and the energy of an absorbed gamma photon on an event-by-event basis. In contrast to time-correlated single photon counting technique, the digital approach provides a faster measurement, an active noise suppression, and enables characterization of scintillation pulses simultaneously in two domains: time and energy. We applied this method to study the pulse shape change of a CsI(Tl) scintillator with energy of gamma excitation. We confirmed previously published results and revealed new details of the phenomenon.

  11. Multivariate detrending of fMRI signal drifts for real-time multiclass pattern classification.

    PubMed

    Lee, Dongha; Jang, Changwon; Park, Hae-Jeong

    2015-03-01

    Signal drift in functional magnetic resonance imaging (fMRI) is an unavoidable artifact that limits classification performance in multi-voxel pattern analysis of fMRI. As conventional methods to reduce signal drift, global demeaning or proportional scaling disregards regional variations of drift, whereas voxel-wise univariate detrending is too sensitive to noisy fluctuations. To overcome these drawbacks, we propose a multivariate real-time detrending method for multiclass classification that involves spatial demeaning at each scan and the recursive detrending of drifts in the classifier outputs driven by a multiclass linear support vector machine. Experiments using binary and multiclass data showed that the linear trend estimation of the classifier output drift for each class (a weighted sum of drifts in the class-specific voxels) was more robust against voxel-wise artifacts that lead to inconsistent spatial patterns and the effect of online processing than voxel-wise detrending. The classification performance of the proposed method was significantly better, especially for multiclass data, than that of voxel-wise linear detrending, global demeaning, and classifier output detrending without demeaning. We concluded that the multivariate approach using classifier output detrending of fMRI signals with spatial demeaning preserves spatial patterns, is less sensitive than conventional methods to sample size, and increases classification performance, which is a useful feature for real-time fMRI classification. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Timing Processes Are Correlated when Tasks Share a Salient Event

    ERIC Educational Resources Information Center

    Zelaznik, Howard N.; Rosenbaum, David A.

    2010-01-01

    Event timing is manifested when participants make discrete movements such as repeatedly tapping a key. Emergent timing is manifested when participants make continuous movements such as repeatedly drawing a circle. Here we pursued the possibility that providing salient perceptual events to mark the completion of time intervals could allow circle…

  13. Real-Time Event Detection for Monitoring Natural and Source ...

    EPA Pesticide Factsheets

    The use of event detection systems in finished drinking water systems is increasing in order to monitor water quality in both operational and security contexts. Recent incidents involving harmful algal blooms and chemical spills into watersheds have increased interest in monitoring source water quality prior to treatment. This work highlights the use of the CANARY event detection software in detecting suspected illicit events in an actively monitored watershed in South Carolina. CANARY is an open source event detection software that was developed by USEPA and Sandia National Laboratories. The software works with any type of sensor, utilizes multiple detection algorithms and approaches, and can incorporate operational information as needed. Monitoring has been underway for several years to detect events related to intentional or unintentional dumping of materials into the monitored watershed. This work evaluates the feasibility of using CANARY to enhance the detection of events in this watershed. This presentation will describe the real-time monitoring approach used in this watershed, the selection of CANARY configuration parameters that optimize detection for this watershed and monitoring application, and the performance of CANARY during the time frame analyzed. Further, this work will highlight how rainfall events impacted analysis, and the innovative application of CANARY taken in order to effectively detect the suspected illicit events. This presentation d

  14. Event-Based Tone Mapping for Asynchronous Time-Based Image Sensor

    PubMed Central

    Simon Chane, Camille; Ieng, Sio-Hoi; Posch, Christoph; Benosman, Ryad B.

    2016-01-01

    The asynchronous time-based neuromorphic image sensor ATIS is an array of autonomously operating pixels able to encode luminance information with an exceptionally high dynamic range (>143 dB). This paper introduces an event-based methodology to display data from this type of event-based imagers, taking into account the large dynamic range and high temporal accuracy that go beyond available mainstream display technologies. We introduce an event-based tone mapping methodology for asynchronously acquired time encoded gray-level data. A global and a local tone mapping operator are proposed. Both are designed to operate on a stream of incoming events rather than on time frame windows. Experimental results on real outdoor scenes are presented to evaluate the performance of the tone mapping operators in terms of quality, temporal stability, adaptation capability, and computational time. PMID:27642275

  15. Problems with Multivariate Normality: Can the Multivariate Bootstrap Help?

    ERIC Educational Resources Information Center

    Thompson, Bruce

    Multivariate normality is required for some statistical tests. This paper explores the implications of violating the assumption of multivariate normality and illustrates a graphical procedure for evaluating multivariate normality. The logic for using the multivariate bootstrap is presented. The multivariate bootstrap can be used when distribution…

  16. Interpretability of Multivariate Brain Maps in Linear Brain Decoding: Definition, and Heuristic Quantification in Multivariate Analysis of MEG Time-Locked Effects

    PubMed Central

    Kia, Seyed Mostafa; Vega Pons, Sandro; Weisz, Nathan; Passerini, Andrea

    2017-01-01

    Brain decoding is a popular multivariate approach for hypothesis testing in neuroimaging. Linear classifiers are widely employed in the brain decoding paradigm to discriminate among experimental conditions. Then, the derived linear weights are visualized in the form of multivariate brain maps to further study spatio-temporal patterns of underlying neural activities. It is well known that the brain maps derived from weights of linear classifiers are hard to interpret because of high correlations between predictors, low signal to noise ratios, and the high dimensionality of neuroimaging data. Therefore, improving the interpretability of brain decoding approaches is of primary interest in many neuroimaging studies. Despite extensive studies of this type, at present, there is no formal definition for interpretability of multivariate brain maps. As a consequence, there is no quantitative measure for evaluating the interpretability of different brain decoding methods. In this paper, first, we present a theoretical definition of interpretability in brain decoding; we show that the interpretability of multivariate brain maps can be decomposed into their reproducibility and representativeness. Second, as an application of the proposed definition, we exemplify a heuristic for approximating the interpretability in multivariate analysis of evoked magnetoencephalography (MEG) responses. Third, we propose to combine the approximated interpretability and the generalization performance of the brain decoding into a new multi-objective criterion for model selection. Our results, for the simulated and real MEG data, show that optimizing the hyper-parameters of the regularized linear classifier based on the proposed criterion results in more informative multivariate brain maps. More importantly, the presented definition provides the theoretical background for quantitative evaluation of interpretability, and hence, facilitates the development of more effective brain decoding algorithms

  17. Conjugate LEP Events at Palmer Station, Antarctica: Hemisphere-Dependent Timing

    NASA Astrophysics Data System (ADS)

    Kim, D.; Moore, R. C.

    2016-12-01

    During March 2015, a large number of lightning-induced electron precipitation (LEP) events were simultaneously observed using very low frequency (VLF) receivers in both the northern and southern hemispheres. After removing overlapping events and unclear (or not well-defined) events, 22 conjugate LEP events remain and are used to statistically analyze the hemispheric dependence of LEP onset time. LEP events were detected in the northern hemisphere using the VLF remote sensing method by tracking the NAA transmitter signal (24.0 kHz, Cutler, Maine) at Tuscaloosa, Alabama. In the southern hemisphere, the NPM transmitter signal (21.4 kHz, Lualualei, Hawaii) is tracked at Palmer Station, Antarctica. In each case, the GLD360 dataset from Vaisala is used to determine the hemisphere of the causative lightning flash, and this is compared with the hemisphere in which the LEP event is detected first. The onset times and onset durations can, however, be calculated using a number of different methods. In this paper, we compare and contrast the onset times and durations calculated using multiple different methods, with each method applied to the same 22 conjugate LEP events.

  18. Family Events and the Timing of Intergenerational Transfers

    ERIC Educational Resources Information Center

    Leopold, Thomas; Schneider, Thorsten

    2011-01-01

    This research investigates how family events in adult children's lives influence the timing of their parents' financial transfers. We draw on retrospective data collected by the German Socio-Economic Panel Study and use event history models to study the effects of marriage, divorce and childbirth on the receipt of large gifts from parents. We find…

  19. Timing and documentation of key events in neonatal resuscitation.

    PubMed

    Heathcote, Adam Charles; Jones, Jacqueline; Clarke, Paul

    2018-04-30

    Only a minority of babies require extended resuscitation at birth. Resuscitations concerning babies who die or who survive with adverse outcomes are increasingly subject to medicolegal scrutiny. Our aim was to describe real-life timings of key resuscitation events observed in a historical series of newborns who required full resuscitation at birth. Twenty-seven babies born in our centre over a 10-year period had an Apgar score of 0 at 1 min and required full resuscitation. The median (95% confidence interval) postnatal ages at achieving key events were: commencing cardiac compressions, 2.0 (1.5-4.0) min; endotracheal intubation, 3.8 (2.0-6.0) min; umbilical venous catheterisation, 9.0 (7.5-12.0) min; and administration of the first adrenaline dose, 10.0 (8.0-14.0) min. The wide range of timings presented from real-life cases may prove useful to clinicians involved in medical negligence claims and provide a baseline for quality improvements in resuscitation training. What is Known: • Only a minority of babies require extended resuscitation at birth; these cases are often subject to medicolegal interrogation • Timings of key resuscitation events are poorly described and documentation of resuscitation events is often lacking yet is open to medicolegal scrutiny What is New: • We present a wide range of real-life timings of key resuscitation events during the era of routine newborn life support training • These timings may prove useful to clinicians involved in medical negligence claims and provide a baseline for quality improvements in resuscitation training.

  20. Discrete Events as Units of Perceived Time

    ERIC Educational Resources Information Center

    Liverence, Brandon M.; Scholl, Brian J.

    2012-01-01

    In visual images, we perceive both space (as a continuous visual medium) and objects (that inhabit space). Similarly, in dynamic visual experience, we perceive both continuous time and discrete events. What is the relationship between these units of experience? The most intuitive answer may be similar to the spatial case: time is perceived as an…

  1. Time Separation Between Events in a Sequence: a Regional Property?

    NASA Astrophysics Data System (ADS)

    Muirwood, R.; Fitzenz, D. D.

    2013-12-01

    Earthquake sequences are loosely defined as events occurring too closely in time and space to appear unrelated. Depending on the declustering method, several, all, or no event(s) after the first large event might be recognized as independent mainshocks. It can therefore be argued that a probabilistic seismic hazard assessment (PSHA, traditionally dealing with mainshocks only) might already include the ground shaking effects of such sequences. Alternatively all but the largest event could be classified as an 'aftershock' and removed from the earthquake catalog. While in PSHA the question is only whether to keep or remove the events from the catalog, for Risk Management purposes, the community response to the earthquakes, as well as insurance risk transfer mechanisms, can be profoundly affected by the actual timing of events in such a sequence. In particular the repetition of damaging earthquakes over a period of weeks to months can lead to businesses closing and families evacuating from the region (as happened in Christchurch, New Zealand in 2011). Buildings that are damaged in the first earthquake may go on to be damaged again, even while they are being repaired. Insurance also functions around a set of critical timeframes - including the definition of a single 'event loss' for reinsurance recoveries within the 192 hour 'hours clause', the 6-18 month pace at which insurance claims are settled, and the annual renewal of insurance and reinsurance contracts. We show how temporal aspects of earthquake sequences need to be taken into account within models for Risk Management, and which time separations between events are most sensitive, both in terms of the modeled disruptions to lifelines and business activity as well as in the losses to different parties (such as insureds, insurers and reinsurers). We also explore the time separation between all events and between loss causing events for a collection of sequences from across the world and we point to the need to

  2. Time Variations in Forecasts and Occurrences of Large Solar Energetic Particle Events

    NASA Astrophysics Data System (ADS)

    Kahler, S. W.

    2015-12-01

    The onsets and development of large solar energetic (E > 10 MeV) particle (SEP) events have been characterized in many studies. The statistics of SEP event onset delay times from associated solar flares and coronal mass ejections (CMEs), which depend on solar source longitudes, can be used to provide better predictions of whether a SEP event will occur following a large flare or fast CME. In addition, size distributions of peak SEP event intensities provide a means for a probabilistic forecast of peak intensities attained in observed SEP increases. SEP event peak intensities have been compared with their rise and decay times for insight into the acceleration and transport processes. These two time scales are generally treated as independent parameters describing the development of a SEP event, but we can invoke an alternative two-parameter description based on the assumption that decay times exceed rise times for all events. These two parameters, from the well known Weibull distribution, provide an event description in terms of its basic shape and duration. We apply this distribution to several large SEP events and ask what the characteristic parameters and their dependence on source longitudes can tell us about the origins of these important events.

  3. Dietary patterns associated with overweight and obesity among Brazilian schoolchildren: an approach based on the time-of-day of eating events.

    PubMed

    Kupek, Emil; Lobo, Adriana S; Leal, Danielle B; Bellisle, France; de Assis, Maria Alice A

    2016-12-01

    Several studies reported that the timing of eating events has critical implications in the prevention of obesity, but dietary patterns regarding the time-of-day have not been explored in children. The aim of this study was to derive latent food patterns of daily eating events and to examine their associations with overweight/obesity among schoolchildren. A population-based cross-sectional study was conducted with 7-10-year-old Brazilian schoolchildren (n 1232) who completed the Previous Day Food Questionnaire, illustrated with twenty-one foods/beverages in six daily eating events. Latent class analysis was used to derive dietary patterns whose association with child weight status was evaluated by multivariate multinomial regression. Four mutually exclusive latent classes of dietary patterns were identified and labelled according to the time-of-day of eating events and food intake probability (FIP): (A) higher FIP only at lunch; (B) lower FIP at all eating events; (C) higher FIP at lunch, afternoon and evening snacks; (D) lower FIP at breakfast and at evening snack, higher FIP at other meals/snacks. The percentages of children within these classes were 32·3, 48·6, 15·1 and 4·0 %, respectively. After controlling for potential confounders, the mean probabilities of obesity for these classes were 6 % (95 % CI 3·0, 9·0), 13 % (95 % CI 9·0, 17·0), 12 % (95 % CI 6·0, 19) and 11 % (95 % CI 5·0, 17·0), in the same order. In conclusion, the children eating traditional lunch with rice and beans as the main meal of the day (class A) had the lowest obesity risk, thus reinforcing the importance of both the food type and the time-of-day of its intake for weight status.

  4. Evaluation of an F100 multivariable control using a real-time engine simulation

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Soeder, J. F.; Skira, C.

    1977-01-01

    The control evaluated has been designed for the F100-PW-100 turbofan engine. The F100 engine represents the current state-of-the-art in aircraft gas turbine technology. The control makes use of a multivariable, linear quadratic regulator. The evaluation procedure utilized a real-time hybrid computer simulation of the F100 engine and an implementation of the control logic on the NASA LeRC digital computer/controller. The results of the evaluation indicated that the control logic and its implementation will be capable of controlling the engine throughout its operating range.

  5. An improvement of drought monitoring through the use of a multivariate magnitude index

    NASA Astrophysics Data System (ADS)

    Real-Rangel, R. A.; Alcocer-Yamanaka, V. H.; Pedrozo-Acuña, A.; Breña-Naranjo, J. A.; Ocón-Gutiérrez, A. R.

    2017-12-01

    In drought monitoring activities it is widely acknowledged that the severity of an event is determined in relation to monthly values of univariate indices of one or more hydrological variables. Normally, these indices are estimated using temporal windows from 1 to 12 months or more to aggregate the effects of deficits in the variable of interest. However, the use of these temporal windows may lead to a misperception of both the drought event intensity and the timing of its occurrence. In this context, this work presents the implementation of a trivariate drought magnitude index, considering key hydrological variables (e.g., precipitation, soil moisture and runoff) within the framework of the Multivariate Standardized Drought Index (MSDI). Despite the popularity and simplicity of the concept of drought magnitude for standardized drought indices, its implementation in drought monitoring and early warning systems has not been reported. This approach has been tested for operational purposes in the recently launched Multivariate Drought Monitor of Mexico (MOSEMM) and the results show that the inclusion of a magnitude index facilitates drought detection and thus improves the decision-making process for emergency managers.
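
    The core of a standardized multivariate index can be illustrated compactly: the joint non-exceedance probability of the variables is mapped through the inverse standard normal. The sketch below is a simplified, empirical-probability stand-in for the MSDI framework cited above (not the MOSEMM implementation); the variable names and the synthetic monthly data are assumptions for illustration.

```python
import numpy as np
from scipy.stats import norm

def msdi(precip, soil_moisture):
    """Bivariate standardized drought index from two monthly series.

    Uses the empirical (Gringorten) joint non-exceedance probability
    P(X <= x, Y <= y) and transforms it with the inverse standard normal,
    following the general MSDI idea.
    """
    x = np.asarray(precip, float)
    y = np.asarray(soil_moisture, float)
    n = len(x)
    # Joint empirical probability: fraction of months with both variables
    # at or below the current values (Gringorten plotting position).
    joint = np.array([np.sum((x <= xi) & (y <= yi)) for xi, yi in zip(x, y)])
    p = (joint - 0.44) / (n + 0.12)
    return norm.ppf(p)  # negative values indicate drier-than-normal joint conditions

# Example with synthetic monthly data
rng = np.random.default_rng(0)
pr = rng.gamma(2.0, 50.0, size=240)      # hypothetical monthly precipitation
sm = 0.6 * pr + rng.normal(0, 20, 240)   # hypothetical correlated soil moisture
print(msdi(pr, sm)[:12])
```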

  6. Deconstructing events: The neural bases for space, time, and causality

    PubMed Central

    Kranjec, Alexander; Cardillo, Eileen R.; Lehet, Matthew; Chatterjee, Anjan

    2013-01-01

    Space, time, and causality provide a natural structure for organizing our experience. These abstract categories allow us to think relationally in the most basic sense; understanding simple events requires one to represent the spatial relations among objects, the relative durations of actions or movements, and links between causes and effects. The present fMRI study investigates the extent to which the brain distinguishes between these fundamental conceptual domains. Participants performed a one-back task with three conditions of interest (SPACE, TIME and CAUSALITY). Each condition required comparing relations between events in a simple verbal narrative. Depending on the condition, participants were instructed to either attend to the spatial, temporal, or causal characteristics of events, but between participants, each particular event relation appeared in all three conditions. Contrasts compared neural activity during each condition against the remaining two and revealed how thinking about events is deconstructed neurally. Space trials recruited neural areas traditionally associated with visuospatial processing, primarily bilateral frontal and occipitoparietal networks. Causality trials activated areas previously found to underlie causal thinking and thematic role assignment, such as left medial frontal, and left middle temporal gyri, respectively. Causality trials also produced activations in SMA, caudate, and cerebellum, cortical and subcortical regions associated with the perception of time at different timescales. The TIME contrast, however, produced no significant effects. This pattern, indicating negative results for TIME trials, but positive effects for CAUSALITY trials in areas important for time perception, motivated additional overlap analyses to further probe relations between domains. The results of these analyses suggest a closer correspondence between time and causality than between time and space. PMID:21861674

  7. Models and analysis for multivariate failure time data

    NASA Astrophysics Data System (ADS)

    Shih, Joanna Huang

    The goal of this research is to develop and investigate models and analytic methods for multivariate failure time data. We compare models in terms of direct modeling of the margins, flexibility of dependency structure, local vs. global measures of association, and ease of implementation. In particular, we study copula models, and models produced by right neutral cumulative hazard functions and right neutral hazard functions. We examine the changes of association over time for families of bivariate distributions induced from these models by displaying their density contour plots, conditional density plots, correlation curves of Doksum et al, and local cross ratios of Oakes. We know that bivariate distributions with the same margins might exhibit quite different dependency structures. In addition to modeling, we study estimation procedures. For copula models, we investigate three estimation procedures. The first procedure is full maximum likelihood. The second procedure is two-stage maximum likelihood. At stage 1, we estimate the parameters in the margins by maximizing the marginal likelihood. At stage 2, we estimate the dependency structure by fixing the margins at the estimated ones. The third procedure is two-stage partially parametric maximum likelihood. It is similar to the second procedure, but we estimate the margins by the Kaplan-Meier estimate. We derive asymptotic properties for these three estimation procedures and compare their efficiency by Monte-Carlo simulations and direct computations. For models produced by right neutral cumulative hazards and right neutral hazards, we derive the likelihood and investigate the properties of the maximum likelihood estimates. Finally, we develop goodness of fit tests for the dependency structure in the copula models. We derive a test statistic and its asymptotic properties based on the test of homogeneity of Zelterman and Chen (1988), and a graphical diagnostic procedure based on the empirical Bayes approach. We study the

  8. Comparison of algorithms to generate event times conditional on time-dependent covariates.

    PubMed

    Sylvestre, Marie-Pierre; Abrahamowicz, Michal

    2008-06-30

    The Cox proportional hazards model with time-dependent covariates (TDC) is now a part of the standard statistical analysis toolbox in medical research. As new methods involving more complex modeling of time-dependent variables are developed, simulations could often be used to systematically assess the performance of these models. Yet, generating event times conditional on TDC requires well-designed and efficient algorithms. We compare two classes of such algorithms: permutational algorithms (PAs) and algorithms based on a binomial model. We also propose a modification of the PA to incorporate a rejection sampler. We performed a simulation study to assess the accuracy, stability, and speed of these algorithms in several scenarios. Both classes of algorithms generated data sets that, once analyzed, provided virtually unbiased estimates with comparable variances. In terms of computational efficiency, the PA with the rejection sampler reduced the time necessary to generate data by more than 50 per cent relative to alternative methods. The PAs also allowed more flexibility in the specification of the marginal distributions of event times and required less calibration.
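
    The binomial-model class of algorithms mentioned above discretizes follow-up time and draws the event indicator in each interval with a probability driven by the current covariate value. The sketch below illustrates that idea only; it is not the authors' permutational algorithm, and the hazard form, parameter values, and covariate paths are assumptions for illustration.

```python
import numpy as np

def simulate_event_times(tdc, baseline_hazard=0.01, beta=0.7, rng=None):
    """Discrete-time ("binomial model") generation of event times.

    tdc: array of shape (n_subjects, n_intervals) holding each subject's
    time-dependent covariate value in each interval. In every interval the
    event indicator is drawn with the current interval hazard, here derived
    from a proportional-hazards form exp(beta * x).
    """
    rng = rng or np.random.default_rng()
    n, k = tdc.shape
    times = np.full(n, k, dtype=float)   # administratively censored at end of follow-up
    events = np.zeros(n, dtype=int)
    for t in range(k):
        at_risk = events == 0
        p = 1.0 - np.exp(-baseline_hazard * np.exp(beta * tdc[at_risk, t]))
        hit = rng.random(p.size) < p
        idx = np.flatnonzero(at_risk)[hit]
        times[idx] = t + 1
        events[idx] = 1
    return times, events

tdc = np.random.default_rng(1).normal(size=(500, 60))  # hypothetical covariate paths
t, e = simulate_event_times(tdc)
print(e.mean(), t[:5])
```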

  9. Life Events and Depressive Symptoms in African American Adolescents: Do Ecological Domains and Timing of Life Events Matter?

    ERIC Educational Resources Information Center

    Sanchez, Yadira M.; Lambert, Sharon F.; Ialongo, Nicholas S.

    2012-01-01

    Considerable research has documented associations between adverse life events and internalizing symptoms in adolescents, but much of this research has focused on the number of events experienced, with less attention to the ecological context or timing of events. This study examined life events in three ecological domains relevant to adolescents…

  10. Multivariate postprocessing techniques for probabilistic hydrological forecasting

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian

    2016-04-01

    Hydrologic ensemble forecasts driven by atmospheric ensemble prediction systems need statistical postprocessing in order to account for systematic errors in terms of both mean and spread. Runoff is an inherently multivariate process with typical events lasting from hours in case of floods to weeks or even months in case of droughts. This calls for multivariate postprocessing techniques that yield well calibrated forecasts in univariate terms and ensure a realistic temporal dependence structure at the same time. To this end, the univariate ensemble model output statistics (EMOS; Gneiting et al., 2005) postprocessing method is combined with two different copula approaches that ensure multivariate calibration throughout the entire forecast horizon. These approaches comprise ensemble copula coupling (ECC; Schefzik et al., 2013), which preserves the dependence structure of the raw ensemble, and a Gaussian copula approach (GCA; Pinson and Girard, 2012), which estimates the temporal correlations from training observations. Both methods are tested in a case study covering three subcatchments of the river Rhine that represent different sizes and hydrological regimes: the Upper Rhine up to the gauge Maxau, the river Moselle up to the gauge Trier, and the river Lahn up to the gauge Kalkofen. The results indicate that both ECC and GCA are suitable for modelling the temporal dependences of probabilistic hydrologic forecasts (Hemri et al., 2015). References Gneiting, T., A. E. Raftery, A. H. Westveld, and T. Goldman (2005), Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation, Monthly Weather Review, 133(5), 1098-1118, DOI: 10.1175/MWR2904.1. Hemri, S., D. Lisniak, and B. Klein, Multivariate postprocessing techniques for probabilistic hydrological forecasting, Water Resources Research, 51(9), 7436-7451, DOI: 10.1002/2014WR016473. Pinson, P., and R. Girard (2012), Evaluating the quality of scenarios of short-term wind power
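
    The ECC step described above amounts to reordering samples from the calibrated univariate distributions so that they inherit the rank structure, and hence the temporal dependence, of the raw ensemble. A minimal sketch, assuming equally sized raw and calibrated samples per lead time (the arrays below are synthetic, not the Rhine case-study data):

```python
import numpy as np

def ecc_reorder(raw_ensemble, calibrated_samples):
    """Ensemble copula coupling (reordering step).

    raw_ensemble, calibrated_samples: arrays of shape (n_members, n_leadtimes).
    For each lead time, the sorted calibrated samples are rearranged so that
    member ranks match those of the raw ensemble, preserving its temporal
    dependence structure.
    """
    raw = np.asarray(raw_ensemble, float)
    cal = np.sort(np.asarray(calibrated_samples, float), axis=0)
    out = np.empty_like(cal)
    for j in range(raw.shape[1]):
        ranks = np.argsort(np.argsort(raw[:, j]))  # rank of each raw member at lead time j
        out[:, j] = cal[ranks, j]
    return out

rng = np.random.default_rng(2)
raw = rng.normal(5.0, 1.0, size=(11, 48))   # hypothetical raw runoff ensemble
cal = rng.normal(5.3, 0.8, size=(11, 48))   # hypothetical EMOS-calibrated samples
print(ecc_reorder(raw, cal).shape)
```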

  11. Time-frequency analysis of neuronal populations with instantaneous resolution based on noise-assisted multivariate empirical mode decomposition.

    PubMed

    Alegre-Cortés, J; Soto-Sánchez, C; Pizá, Á G; Albarracín, A L; Farfán, F D; Felice, C J; Fernández, E

    2016-07-15

    Linear analysis has classically provided powerful tools for understanding the behavior of neural populations, but the neuron responses to real-world stimulation are nonlinear under some conditions, and many neuronal components demonstrate strong nonlinear behavior. In spite of this, temporal and frequency dynamics of neural populations to sensory stimulation have been usually analyzed with linear approaches. In this paper, we propose the use of Noise-Assisted Multivariate Empirical Mode Decomposition (NA-MEMD), a data-driven template-free algorithm, plus the Hilbert transform as a suitable tool for analyzing population oscillatory dynamics in a multi-dimensional space with instantaneous frequency (IF) resolution. The proposed approach was able to extract oscillatory information of neurophysiological data of deep vibrissal nerve and visual cortex multiunit recordings that were not evidenced using linear approaches with fixed bases such as the Fourier analysis. Texture discrimination analysis performance was increased when Noise-Assisted Multivariate Empirical Mode Decomposition plus Hilbert transform was implemented, compared to linear techniques. Similarly, NA-MEMD provided increased time-frequency resolution in the analysis of cortical oscillatory population activity. Noise-Assisted Multivariate Empirical Mode Decomposition plus Hilbert transform is an improved method to analyze neuronal population oscillatory dynamics, overcoming linear and stationary assumptions of classical methods. Copyright © 2016 Elsevier B.V. All rights reserved.
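
    The Hilbert-transform stage of this pipeline, which yields instantaneous amplitude and frequency for each extracted mode, can be sketched as follows. A synthetic chirp stands in for a NA-MEMD mode, since MEMD itself has no standard SciPy implementation; the sampling rate and signal are assumptions.

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
imf = np.sin(2 * np.pi * (5 * t + 4 * t**2))   # synthetic mode with drifting frequency

analytic = hilbert(imf)                         # analytic signal
amplitude = np.abs(analytic)                    # instantaneous amplitude envelope
phase = np.unwrap(np.angle(analytic))           # instantaneous phase
inst_freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency in Hz

print(inst_freq[:5], inst_freq[-5:])            # drifts from ~5 Hz upward over the record
```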

  12. Element analysis: a wavelet-based method for analysing time-localized events in noisy time series.

    PubMed

    Lilly, Jonathan M

    2017-04-01

    A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized 'events'. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event's 'region of influence' within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry.

  13. Cross-country transferability of multi-variable damage models

    NASA Astrophysics Data System (ADS)

    Wagenaar, Dennis; Lüdtke, Stefan; Kreibich, Heidi; Bouwer, Laurens

    2017-04-01

    Flood damage assessment is often done with simple damage curves based only on flood water depth. Additionally, damage models are often transferred in space and time, e.g. from region to region or from one flood event to another. Validation has shown that depth-damage curve estimates are associated with high uncertainties, particularly when applied in regions outside the area where the data for curve development was collected. Recently, progress has been made with multi-variable damage models created with data-mining techniques, i.e. Bayesian Networks and random forest. However, it is still unknown to what extent and under which conditions model transfers are possible and reliable. Model validations in different countries will provide valuable insights into the transferability of multi-variable damage models. In this study we compare multi-variable models developed on basis of flood damage datasets from Germany as well as from The Netherlands. Data from several German floods was collected using computer aided telephone interviews. Data from the 1993 Meuse flood in the Netherlands is available, based on compensations paid by the government. The Bayesian network and random forest based models are applied and validated in both countries on basis of the individual datasets. A major challenge was the harmonization of the variables between both datasets due to factors like differences in variable definitions, and regional and temporal differences in flood hazard and exposure characteristics. Results of model validations and comparisons in both countries are discussed, particularly in respect to encountered challenges and possible solutions for an improvement of model transferability.

  14. Leveraging Long-term Seismic Catalogs for Automated Real-time Event Classification

    NASA Astrophysics Data System (ADS)

    Linville, L.; Draelos, T.; Pankow, K. L.; Young, C. J.; Alvarez, S.

    2017-12-01

    We investigate the use of labeled event types available through reviewed seismic catalogs to produce automated event labels on new incoming data from the crustal region spanned by the cataloged events. Using events cataloged by the University of Utah Seismograph Stations between October, 2012 and June, 2017, we calculate the spectrogram for a time window that spans the duration of each event as seen on individual stations, resulting in 110k event spectrograms (50% local earthquake examples, 50% quarry blast examples). Using 80% of the randomized example events (about 90k), a classifier is trained to distinguish between local earthquakes and quarry blasts. We explore variations of deep learning classifiers, incorporating elements of convolutional and recurrent neural networks. Using a single-layer Long Short Term Memory recurrent neural network, we achieve 92% accuracy on the classification task on the remaining 20k test examples. Leveraging the decisions from a group of stations that detected the same event by using the median of all classifications in the group increases the model accuracy to 96%. Additional data with equivalent processing from 500 more recently cataloged events (July 2017) achieves the same accuracy as our test data on both single-station examples and multi-station medians, suggesting that the model can maintain accurate and stable classification rates on real-time automated events local to the University of Utah Seismograph Stations, with potentially minimal levels of re-training through time.
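
    A minimal sketch of a single-layer LSTM classifier of the kind described above, treating each spectrogram as a time sequence of frequency-bin vectors. Shapes, hyper-parameters, and the synthetic inputs are illustrative assumptions, not the configuration used in the Utah study.

```python
import numpy as np
import tensorflow as tf

n_frames, n_freq_bins = 128, 64   # hypothetical spectrogram shape (time x frequency)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(n_frames, n_freq_bins)),
    tf.keras.layers.LSTM(64),                        # single LSTM layer over time frames
    tf.keras.layers.Dense(1, activation="sigmoid"),  # earthquake (0) vs quarry blast (1)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Synthetic stand-in data; real inputs would be per-station event spectrograms.
x = np.random.rand(256, n_frames, n_freq_bins).astype("float32")
y = np.random.randint(0, 2, size=(256, 1))
model.fit(x, y, epochs=2, batch_size=32, verbose=0)

# Station-level predictions for one event could then be fused by taking the
# median over all stations that detected it, as described in the abstract.
```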

  15. A wrinkle in time: asymmetric valuation of past and future events.

    PubMed

    Caruso, Eugene M; Gilbert, Daniel T; Wilson, Timothy D

    2008-08-01

    A series of studies shows that people value future events more than equivalent events in the equidistant past. Whether people imagined being compensated or compensating others, they required and offered more compensation for events that would take place in the future than for identical events that had taken place in the past. This temporal value asymmetry (TVA) was robust in between-persons comparisons and absent in within-persons comparisons, which suggests that participants considered the TVA irrational. Contemplating future events produced greater affect than did contemplating past events, and this difference mediated the TVA. We suggest that the TVA, the gain-loss asymmetry, and hyperbolic time discounting can be unified in a three-dimensional value function that describes how people value gains and losses of different magnitudes at different moments in time.

  16. iVAR: a program for imputing missing data in multivariate time series using vector autoregressive models.

    PubMed

    Liu, Siwei; Molenaar, Peter C M

    2014-12-01

    This article introduces iVAR, an R program for imputing missing data in multivariate time series on the basis of vector autoregressive (VAR) models. We conducted a simulation study to compare iVAR with three methods for handling missing data: listwise deletion, imputation with sample means and variances, and multiple imputation ignoring time dependency. The results showed that iVAR produces better estimates for the cross-lagged coefficients than do the other three methods. We demonstrate the use of iVAR with an empirical example of time series electrodermal activity data and discuss the advantages and limitations of the program.
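
    iVAR itself is an R program; the underlying idea, however, can be sketched in a few lines: initialize missing cells, fit a lag-1 vector autoregression by least squares, refill the missing cells from the one-step predictions, and iterate. The function below is a simplified stand-in, not the iVAR implementation.

```python
import numpy as np

def var1_impute(data, n_iter=20):
    """Iterative VAR(1)-based imputation of a multivariate time series.

    data: (T, k) array with np.nan marking missing entries. Missing values
    are initialized with column means, then repeatedly re-estimated from a
    lag-1 vector autoregression fitted by least squares.
    """
    x = np.array(data, float)
    miss = np.isnan(x)
    col_means = np.nanmean(x, axis=0)
    x[miss] = np.take(col_means, np.where(miss)[1])
    for _ in range(n_iter):
        y, z = x[1:], x[:-1]
        z1 = np.column_stack([np.ones(len(z)), z])      # intercept plus lagged values
        coef, *_ = np.linalg.lstsq(z1, y, rcond=None)   # (k+1, k) VAR(1) coefficients
        pred = z1 @ coef                                # one-step-ahead predictions
        x[1:][miss[1:]] = pred[miss[1:]]                # refill only the missing cells
    return x

rng = np.random.default_rng(3)
series = np.cumsum(rng.normal(size=(200, 3)), axis=0)
series[rng.random(series.shape) < 0.1] = np.nan         # 10% missing at random
print(np.isnan(var1_impute(series)).any())              # -> False
```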

  17. Toeplitz Inverse Covariance-Based Clustering of Multivariate Time Series Data

    PubMed Central

    Hallac, David; Vare, Sagar; Boyd, Stephen; Leskovec, Jure

    2018-01-01

    Subsequence clustering of multivariate time series is a useful tool for discovering repeated patterns in temporal data. Once these patterns have been discovered, seemingly complicated datasets can be interpreted as a temporal sequence of only a small number of states, or clusters. For example, raw sensor data from a fitness-tracking application can be expressed as a timeline of a select few actions (i.e., walking, sitting, running). However, discovering these patterns is challenging because it requires simultaneous segmentation and clustering of the time series. Furthermore, interpreting the resulting clusters is difficult, especially when the data is high-dimensional. Here we propose a new method of model-based clustering, which we call Toeplitz Inverse Covariance-based Clustering (TICC). Each cluster in the TICC method is defined by a correlation network, or Markov random field (MRF), characterizing the interdependencies between different observations in a typical subsequence of that cluster. Based on this graphical representation, TICC simultaneously segments and clusters the time series data. We solve the TICC problem through alternating minimization, using a variation of the expectation maximization (EM) algorithm. We derive closed-form solutions to efficiently solve the two resulting subproblems in a scalable way, through dynamic programming and the alternating direction method of multipliers (ADMM), respectively. We validate our approach by comparing TICC to several state-of-the-art baselines in a series of synthetic experiments, and we then demonstrate on an automobile sensor dataset how TICC can be used to learn interpretable clusters in real-world scenarios. PMID:29770257

  18. New strategy to identify radicals in a time evolving EPR data set by multivariate curve resolution-alternating least squares.

    PubMed

    Fadel, Maya Abou; de Juan, Anna; Vezin, Hervé; Duponchel, Ludovic

    2016-12-01

    Electron paramagnetic resonance (EPR) spectroscopy is a powerful technique that is able to characterize radicals formed in kinetic reactions. However, spectral characterization of individual chemical species is often limited or even unmanageable due to the severe kinetic and spectral overlap among species in kinetic processes. Therefore, we applied, for the first time, the multivariate curve resolution-alternating least squares (MCR-ALS) method to EPR time-evolving data sets to model and characterize the different constituents in a kinetic reaction. Here we demonstrate the advantage of multivariate analysis in the investigation of radicals formed during the kinetic process of hydroxycoumarin in alkaline medium. Multiset analysis of several EPR-monitored kinetic experiments performed in different conditions revealed the individual paramagnetic centres as well as their kinetic profiles. The results obtained by the MCR-ALS method demonstrate its prominent potential in the analysis of time-evolved EPR spectra. Copyright © 2016 Elsevier B.V. All rights reserved.
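
    The alternating-least-squares core of MCR-ALS can be sketched with a non-negativity constraint only; full implementations add closure, unimodality, and convergence checks. The data matrix, component count, and synthetic kinetic profiles below are assumptions for illustration.

```python
import numpy as np

def mcr_als(D, n_components, n_iter=100, seed=0):
    """Minimal MCR-ALS: D (n_times, n_channels) ~ C @ S.T.

    Alternates least-squares updates of the kinetic profiles C and the pure
    component spectra S, clipping both to be non-negative.
    """
    rng = np.random.default_rng(seed)
    S = rng.random((D.shape[1], n_components))        # initial guess for spectra
    for _ in range(n_iter):
        C = np.linalg.lstsq(S, D.T, rcond=None)[0].T  # update concentration profiles
        C = np.clip(C, 0, None)
        S = np.linalg.lstsq(C, D, rcond=None)[0].T    # update pure spectra
        S = np.clip(S, 0, None)
    return C, S

# Synthetic two-species kinetic data (A -> B decay/growth profiles)
t = np.linspace(0, 1, 50)
C_true = np.column_stack([np.exp(-3 * t), 1 - np.exp(-3 * t)])
S_true = np.abs(np.random.default_rng(1).normal(size=(200, 2)))
D = C_true @ S_true.T + 0.01 * np.random.default_rng(2).normal(size=(50, 200))
C_est, S_est = mcr_als(D, 2)
print(C_est.shape, S_est.shape)   # (50, 2) (200, 2)
```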

  19. Envisioning the times of future events: The role of personal goals.

    PubMed

    Ben Malek, Hédi; Berna, Fabrice; D'Argembeau, Arnaud

    2018-05-25

    Episodic future thinking refers to the human capacity to imagine or simulate events that might occur in one's personal future. Previous studies have shown that personal goals guide the construction and organization of episodic future thoughts, and here we sought to investigate the role of personal goals in the process of locating imagined events in time. Using a think-aloud protocol, we found that dates were directly accessed more frequently for goal-related than goal-unrelated future events, and the goal-relevance of events was a significant predictor of direct access to temporal information on a trial-by-trial basis. Furthermore, when an event was not directly dated, references to anticipated lifetime periods were more frequently used as a strategy to determine when a goal-related event might occur. Together, these findings shed new light on the mechanisms by which personal goals contribute to the location of imagined events in future times. Copyright © 2018 Elsevier Inc. All rights reserved.

  20. The role of musical training in emergent and event-based timing.

    PubMed

    Baer, L H; Thibodeau, J L N; Gralnick, T M; Li, K Z H; Penhune, V B

    2013-01-01

    Musical performance is thought to rely predominantly on event-based timing involving a clock-like neural process and an explicit internal representation of the time interval. Some aspects of musical performance may rely on emergent timing, which is established through the optimization of movement kinematics, and can be maintained without reference to any explicit representation of the time interval. We predicted that musical training would have its largest effect on event-based timing, supporting the dissociability of these timing processes and the dominance of event-based timing in musical performance. We compared 22 musicians and 17 non-musicians on the prototypical event-based timing task of finger tapping and on the typically emergently timed task of circle drawing. For each task, participants first responded in synchrony with a metronome (Paced) and then responded at the same rate without the metronome (Unpaced). Analyses of the Unpaced phase revealed that non-musicians were more variable in their inter-response intervals for finger tapping compared to circle drawing. Musicians did not differ between the two tasks. Between groups, non-musicians were more variable than musicians for tapping but not for drawing. We were able to show that the differences were due to less timer variability in musicians on the tapping task. Correlational analyses of movement jerk and inter-response interval variability revealed a negative association for tapping and a positive association for drawing in non-musicians only. These results suggest that musical training affects temporal variability in tapping but not drawing. Additionally, musicians and non-musicians may be employing different movement strategies to maintain accurate timing in the two tasks. These findings add to our understanding of how musical training affects timing and support the dissociability of event-based and emergent timing modes.

  1. A Personalized Predictive Framework for Multivariate Clinical Time Series via Adaptive Model Selection.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2017-11-01

    Building of an accurate predictive model of clinical time series for a patient is critical for understanding of the patient condition, its dynamics, and optimal patient management. Unfortunately, this process is not straightforward. First, patient-specific variations are typically large and population-based models derived or learned from many different patients are often unable to support accurate predictions for each individual patient. Moreover, time series observed for one patient at any point in time may be too short and insufficient to learn a high-quality patient-specific model just from the patient's own data. To address these problems we propose, develop and experiment with a new adaptive forecasting framework for building multivariate clinical time series models for a patient and for supporting patient-specific predictions. The framework relies on the adaptive model switching approach that at any point in time selects the most promising time series model out of the pool of many possible models, and consequently, combines advantages of the population, patient-specific and short-term individualized predictive models. We demonstrate that the adaptive model switching framework is a very promising approach to support personalized time series prediction, and that it is able to outperform predictions based on pure population and patient-specific models, as well as other patient-specific model adaptation strategies.

  2. Some Recent Developments on Complex Multivariate Distributions

    ERIC Educational Resources Information Center

    Krishnaiah, P. R.

    1976-01-01

    In this paper, the author gives a review of the literature on complex multivariate distributions. Some new results on these distributions are also given. Finally, the author discusses the applications of the complex multivariate distributions in the area of the inference on multiple time series. (Author)

  3. FABP4 and Cardiovascular Events in Peripheral Arterial Disease.

    PubMed

    Höbaus, Clemens; Herz, Carsten Thilo; Pesau, Gerfried; Wrba, Thomas; Koppensteiner, Renate; Schernthaner, Gerit-Holger

    2018-05-01

    Fatty acid-binding protein 4 (FABP4) is a possible biomarker of atherosclerosis. We evaluated FABP4 levels, for the first time, in patients with peripheral artery disease (PAD) and the possible association between baseline FABP4 levels and cardiovascular events over time. Patients (n = 327; mean age 69 ± 10 years) with stable PAD were enrolled in this study. Serum FABP4 was measured by bead-based multiplex assay. Cardiovascular events were analyzed by FABP4 tertiles using Kaplan-Meier and Cox regression analyses after 5 years. Serum FABP4 levels showed a significant association with the classical 3-point major adverse cardiovascular event (MACE) end point (including death, nonlethal myocardial infarction, or nonfatal stroke) in patients with PAD (P = .038). A standard deviation increase of FABP4 resulted in a hazard ratio (HR) of 1.33 (95% confidence interval [95% CI]: 1.03-1.71) for MACE. This association increased (HR: 1.47, 95% CI: 1.03-1.71) after multivariable adjustment (P = .020). Additionally, in multivariable linear regression analysis, FABP4 was linked to estimated glomerular filtration rate (P < .001), gender (P = .005), fasting triglycerides (P = .048), and body mass index (P < .001). Circulating FABP4 may be a useful additional biomarker to evaluate patients with stable PAD at risk of major cardiovascular complications.
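
    The tertile-based Kaplan-Meier and Cox workflow described above can be sketched with the lifelines package; the data frame, column names, and simulated values below are hypothetical stand-ins for the PAD cohort.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

rng = np.random.default_rng(0)
n = 327
df = pd.DataFrame({
    "time_years": rng.exponential(4.0, n).clip(max=5.0),  # follow-up time, capped at 5 years
    "mace": rng.integers(0, 2, n),                        # 3-point MACE indicator (synthetic)
    "fabp4": rng.lognormal(2.8, 0.5, n),                  # hypothetical FABP4 level
    "age": rng.normal(69, 10, n),
    "egfr": rng.normal(70, 15, n),
})
# Standardize FABP4 so the hazard ratio refers to a one-SD increment.
df["fabp4_sd"] = (df["fabp4"] - df["fabp4"].mean()) / df["fabp4"].std()

cph = CoxPHFitter()
cph.fit(df[["time_years", "mace", "fabp4_sd", "age", "egfr"]],
        duration_col="time_years", event_col="mace")
cph.print_summary()   # exp(coef) of fabp4_sd approximates the HR per SD increase

# Kaplan-Meier curves by FABP4 tertile (plot in an interactive session)
df["tertile"] = pd.qcut(df["fabp4"], 3, labels=["low", "mid", "high"])
for name, grp in df.groupby("tertile", observed=True):
    KaplanMeierFitter().fit(grp["time_years"], grp["mace"], label=str(name))
```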

  4. Time distributions of solar energetic particle events: Are SEPEs really random?

    NASA Astrophysics Data System (ADS)

    Jiggens, P. T. A.; Gabriel, S. B.

    2009-10-01

    Solar energetic particle events (SEPEs) can exhibit flux increases of several orders of magnitude over background levels and have always been considered to be random in nature in statistical models with no dependence of any one event on the occurrence of previous events. We examine whether this assumption of randomness in time is correct. Engineering modeling of SEPEs is important to enable reliable and efficient design of both Earth-orbiting and interplanetary spacecraft and future manned missions to Mars and the Moon. All existing engineering models assume that the frequency of SEPEs follows a Poisson process. We present analysis of the event waiting times using alternative distributions described by Lévy and time-dependent Poisson processes and compared these with the usual Poisson distribution. The results show significant deviation from a Poisson process and indicate that the underlying physical processes might be more closely related to a Lévy-type process, suggesting that there is some inherent “memory” in the system. Inherent Poisson assumptions of stationarity and event independence are investigated, and it appears that they do not hold and can be dependent upon the event definition used. SEPEs appear to have some memory indicating that events are not completely random with activity levels varying even during solar active periods and are characterized by clusters of events. This could have significant ramifications for engineering models of the SEP environment, and it is recommended that current statistical engineering models of the SEP environment should be modified to incorporate long-term event dependency and short-term system memory.

  5. Detecting event-related changes of multivariate phase coupling in dynamic brain networks.

    PubMed

    Canolty, Ryan T; Cadieu, Charles F; Koepsell, Kilian; Ganguly, Karunesh; Knight, Robert T; Carmena, Jose M

    2012-04-01

    Oscillatory phase coupling within large-scale brain networks is a topic of increasing interest within systems, cognitive, and theoretical neuroscience. Evidence shows that brain rhythms play a role in controlling neuronal excitability and response modulation (Haider B, McCormick D. Neuron 62: 171-189, 2009) and regulate the efficacy of communication between cortical regions (Fries P. Trends Cogn Sci 9: 474-480, 2005) and distinct spatiotemporal scales (Canolty RT, Knight RT. Trends Cogn Sci 14: 506-515, 2010). In this view, anatomically connected brain areas form the scaffolding upon which neuronal oscillations rapidly create and dissolve transient functional networks (Lakatos P, Karmos G, Mehta A, Ulbert I, Schroeder C. Science 320: 110-113, 2008). Importantly, testing these hypotheses requires methods designed to accurately reflect dynamic changes in multivariate phase coupling within brain networks. Unfortunately, phase coupling between neurophysiological signals is commonly investigated using suboptimal techniques. Here we describe how a recently developed probabilistic model, phase coupling estimation (PCE; Cadieu C, Koepsell K Neural Comput 44: 3107-3126, 2010), can be used to investigate changes in multivariate phase coupling, and we detail the advantages of this model over the commonly employed phase-locking value (PLV; Lachaux JP, Rodriguez E, Martinerie J, Varela F. Human Brain Map 8: 194-208, 1999). We show that the N-dimensional PCE is a natural generalization of the inherently bivariate PLV. Using simulations, we show that PCE accurately captures both direct and indirect (network mediated) coupling between network elements in situations where PLV produces erroneous results. We present empirical results on recordings from humans and nonhuman primates and show that the PCE-estimated coupling values are different from those using the bivariate PLV. Critically on these empirical recordings, PCE output tends to be sparser than the PLVs, indicating fewer
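
    For comparison with PCE, the bivariate phase-locking value criticized above is straightforward to compute from the analytic signals of two band-limited recordings. A minimal sketch on synthetic, phase-lagged signals (not the human or primate recordings):

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Bivariate PLV: |time average of exp(i * (phi_x - phi_y))|."""
    phi_x = np.angle(hilbert(x))
    phi_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

fs = 1000.0
t = np.arange(0, 5.0, 1 / fs)
theta = 2 * np.pi * 8 * t                                   # 8 Hz oscillation
x = np.sin(theta) + 0.3 * np.random.default_rng(0).normal(size=t.size)
y = np.sin(theta + 0.8) + 0.3 * np.random.default_rng(1).normal(size=t.size)  # phase-lagged copy
print(round(phase_locking_value(x, y), 3))                  # close to 1 for consistent lag
```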

  6. ASSOCIATIONS BETWEEN TRAUMATIC EVENTS AND SUICIDAL BEHAVIOUR IN SOUTH AFRICA

    PubMed Central

    Sorsdahl, Katherine; Stein, Dan J.; Williams, David R.; Nock, Matthew K.

    2011-01-01

    Research conducted predominantly in the developed world suggests that there is an association between trauma exposure and suicidal behaviour. However, there are limited data available investigating whether specific traumas are uniquely predictive of suicidal behaviour, or the extent to which traumatic events predict the progression from suicide ideation to plans and attempts. A national survey was conducted with 4351 adult South Africans between 2002 and 2004 as part of the WHO World Mental Health Surveys. Data on trauma exposure and subsequent suicidal behaviour were collected. Bivariate and multivariate survival models tested the relationship between the type and number of traumatic events and lifetime suicidal behaviour. A range of traumatic events are associated with lifetime suicide ideation and attempt; however, after controlling for all traumatic events in a multivariate model, only sexual violence (OR=4.7, CI 2.3-9.4) and having witnessed violence (OR=1.8, 1.1-2.9) remained significant predictors of life-time suicide attempts. Disaggregation of the associations between traumatic events and suicide attempts indicates that they are largely due to traumatic events predicting suicide ideation rather than to the progression from suicide ideation to attempt. This paper highlights the importance of traumatic life events in the occurrence of suicidal thoughts and behaviours and provides important information about the nature of this association. Future research is needed to better understand how and why such experiences increase the risk of suicidal outcomes. PMID:22134450

  7. Lessons Learned from Real-Time, Event-Based Internet Science Communications

    NASA Technical Reports Server (NTRS)

    Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real-time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not. All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material exclusively. The worst event-based activity attracted more interest than the best written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and lessons learned will be discussed.

  8. Time-lag and Correlation between ACE and RBSPICE Injection Event Observations during Storm Times

    NASA Astrophysics Data System (ADS)

    Madanian, H.; Patterson, J. D.; Manweiler, J. W.; Soto-chavez, A. R.; Gerrard, A. J.; Lanzerotti, L. J.

    2017-12-01

    The Radiation Belt Storm Probes Ion Composition Experiment (RBSPICE) on the Van Allen Probes mission measures energetic charged particles [20 keV to 1 MeV] in the inner magnetosphere and ring current. During geomagnetic storms, injections of energetic ions into the ring current change the ion population and produce geomagnetic field depressions on Earth's surface. We analyzed the magnetic field strength and particle composition in the interplanetary medium measured by instruments on the Advanced Composition Explorer (ACE) spacecraft near the inner Lagrangian point. The Electron, Proton, and Alpha Monitor-Low Energy Magnetic Spectrometer (EPAM-LEMS) sensor on ACE measures energetic particles [50 keV to 5 MeV] in interplanetary space. The SYM-H index is utilized to classify the storm events by magnitude and to select more than 60 storm events between 2013 and 2017. We cross-compared ACE observations at storm times with the RBSPICE ion measurements at dusk to midnight magnetic local time and over the 3-6 L-shell range. We report on the relative composition of the solar particles and the relative composition of the inner magnetospheric hot plasma during storm times. The data correlation is accomplished by shifting the observation time from ACE to RBSPICE using the solar wind velocity at the time of the observation. We will discuss time lags between storm onset at the magnetopause and injection events measured for each storm.

  9. Event time analysis of longitudinal neuroimage data.

    PubMed

    Sabuncu, Mert R; Bernal-Rusiel, Jorge L; Reuter, Martin; Greve, Douglas N; Fischl, Bruce

    2014-08-15

    This paper presents a method for the statistical analysis of the associations between longitudinal neuroimaging measurements, e.g., of cortical thickness, and the timing of a clinical event of interest, e.g., disease onset. The proposed approach consists of two steps, the first of which employs a linear mixed effects (LME) model to capture temporal variation in serial imaging data. The second step utilizes the extended Cox regression model to examine the relationship between time-dependent imaging measurements and the timing of the event of interest. We demonstrate the proposed method both for the univariate analysis of image-derived biomarkers, e.g., the volume of a structure of interest, and the exploratory mass-univariate analysis of measurements contained in maps, such as cortical thickness and gray matter density. The mass-univariate method employs a recently developed spatial extension of the LME model. We applied our method to analyze structural measurements computed using FreeSurfer, a widely used brain Magnetic Resonance Image (MRI) analysis software package. We provide a quantitative and objective empirical evaluation of the statistical performance of the proposed method on longitudinal data from subjects suffering from Mild Cognitive Impairment (MCI) at baseline. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. HOTS: A Hierarchy of Event-Based Time-Surfaces for Pattern Recognition.

    PubMed

    Lagorce, Xavier; Orchard, Garrick; Galluppi, Francesco; Shi, Bertram E; Benosman, Ryad B

    2017-07-01

    This paper describes novel event-based spatio-temporal features called time-surfaces and how they can be used to create a hierarchical event-based pattern recognition architecture. Unlike existing hierarchical architectures for pattern recognition, the presented model relies on a time oriented approach to extract spatio-temporal features from the asynchronously acquired dynamics of a visual scene. These dynamics are acquired using biologically inspired frameless asynchronous event-driven vision sensors. Similarly to cortical structures, subsequent layers in our hierarchy extract increasingly abstract features using increasingly large spatio-temporal windows. The central concept is to use the rich temporal information provided by events to create contexts in the form of time-surfaces which represent the recent temporal activity within a local spatial neighborhood. We demonstrate that this concept can robustly be used at all stages of an event-based hierarchical model. First layer feature units operate on groups of pixels, while subsequent layer feature units operate on the output of lower level feature units. We report results on a previously published 36 class character recognition task and a four class canonical dynamic card pip task, achieving near 100 percent accuracy on each. We introduce a new seven class moving face recognition task, achieving 79 percent accuracy.
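
    The central time-surface computation, an exponentially decayed map of the most recent event times in a spatial neighbourhood around each incoming event, can be sketched as follows. Sensor size, neighbourhood radius, time constant, and the toy event stream are assumptions, not the authors' parameters.

```python
import numpy as np

def time_surfaces(events, sensor_shape=(32, 32), radius=3, tau=50e-3):
    """Compute a time-surface for each event from an event-based vision sensor.

    events: iterable of (t, x, y) with t in seconds, sorted by time.
    For every event, the surface is exp(-(t - t_last) / tau) evaluated over a
    (2*radius+1)^2 neighbourhood of last-event times centred on the event.
    """
    last_t = np.full(sensor_shape, -np.inf)    # most recent event time per pixel
    surfaces = []
    for t, x, y in events:
        last_t[y, x] = t
        y0, y1 = max(0, y - radius), min(sensor_shape[0], y + radius + 1)
        x0, x1 = max(0, x - radius), min(sensor_shape[1], x + radius + 1)
        patch = np.exp(-(t - last_t[y0:y1, x0:x1]) / tau)  # 0 where no event has occurred yet
        surfaces.append(patch)
    return surfaces

# Three synthetic events on a 32x32 sensor
evts = [(0.000, 10, 10), (0.010, 11, 10), (0.020, 12, 11)]
print([s.max() for s in time_surfaces(evts)])   # the current event always contributes 1.0
```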

  11. A multivariate cure model for left-censored and right-censored data with application to colorectal cancer screening patterns.

    PubMed

    Hagar, Yolanda C; Harvey, Danielle J; Beckett, Laurel A

    2016-08-30

    We develop a multivariate cure survival model to estimate lifetime patterns of colorectal cancer screening. Screening data cover long periods of time, with sparse observations for each person. Some events may occur before the study begins or after the study ends, so the data are both left-censored and right-censored, and some individuals are never screened (the 'cured' population). We propose a multivariate parametric cure model that can be used with left-censored and right-censored data. Our model allows for the estimation of the time to screening as well as the average number of times individuals will be screened. We calculate likelihood functions based on the observations for each subject using a distribution that accounts for within-subject correlation and estimate parameters using Markov chain Monte Carlo methods. We apply our methods to the estimation of lifetime colorectal cancer screening behavior in the SEER-Medicare data set. Copyright © 2016 John Wiley & Sons, Ltd.

  12. Time-Symmetric Quantization in Spacetimes with Event Horizons

    NASA Astrophysics Data System (ADS)

    Kobakhidze, Archil; Rodd, Nicholas

    2013-08-01

    The standard quantization formalism in spacetimes with event horizons implies a non-unitary evolution of quantum states, as initial pure states may evolve into thermal states. This phenomenon is behind the famous black hole information loss paradox which provoked long-standing debates on the compatibility of quantum mechanics and gravity. In this paper we demonstrate that within an alternative time-symmetric quantization formalism thermal radiation is absent and states evolve unitarily in spacetimes with event horizons. We also discuss the theoretical consistency of the proposed formalism. We explicitly demonstrate that the theory preserves the microcausality condition and suggest a "reinterpretation postulate" to resolve other apparent pathologies associated with negative energy states. Accordingly as there is a consistent alternative, we argue that choosing to use time-asymmetric quantization is a necessary condition for the black hole information loss paradox.

  13. Multivariate Genetic Correlates of the Auditory Paired Stimuli-Based P2 Event-Related Potential in the Psychosis Dimension From the BSNIP Study.

    PubMed

    Mokhtari, Mohammadreza; Narayanan, Balaji; Hamm, Jordan P; Soh, Pauline; Calhoun, Vince D; Ruaño, Gualberto; Kocherla, Mohan; Windemuth, Andreas; Clementz, Brett A; Tamminga, Carol A; Sweeney, John A; Keshavan, Matcheri S; Pearlson, Godfrey D

    2016-05-01

    The complex molecular etiology of psychosis in schizophrenia (SZ) and psychotic bipolar disorder (PBP) is not well defined, presumably due to their multifactorial genetic architecture. Neurobiological correlates of psychosis can be identified through genetic associations of intermediate phenotypes such as event-related potential (ERP) from auditory paired stimulus processing (APSP). Various ERP components of APSP are heritable and aberrant in SZ, PBP and their relatives, but their multivariate genetic factors are less explored. We investigated the multivariate polygenic association of ERP from 64-sensor auditory paired stimulus data in 149 SZ, 209 PBP probands, and 99 healthy individuals from the multisite Bipolar-Schizophrenia Network on Intermediate Phenotypes study. Multivariate association of 64-channel APSP waveforms with a subset of 16 999 single nucleotide polymorphisms (SNPs) (reduced from 1 million SNP array) was examined using parallel independent component analysis (Para-ICA). Biological pathways associated with the genes were assessed using enrichment-based analysis tools. Para-ICA identified 2 ERP components, of which one was significantly correlated with a genetic network comprising multiple linearly coupled gene variants that explained ~4% of the ERP phenotype variance. Enrichment analysis revealed epidermal growth factor, endocannabinoid signaling, glutamatergic synapse and maltohexaose transport associated with P2 component of the N1-P2 ERP waveform. This ERP component also showed deficits in SZ and PBP. Aberrant P2 component in psychosis was associated with gene networks regulating several fundamental biologic functions, either general or specific to nervous system development. The pathways and processes underlying the gene clusters play a crucial role in brain function, plausibly implicated in psychosis. © The Author 2015. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For

  14. Negative Emotional Events that People Ruminate about Feel Closer in Time

    PubMed Central

    Siedlecka, Ewa; Capper, Miriam M.; Denson, Thomas F.

    2015-01-01

    Rumination is intrusive, perseverative cognition. We suggest that one psychological consequence of ruminating about negative emotional events is that the events feel as though they happened metaphorically “just yesterday”. Results from three studies showed that ruminating about real world anger provocations, guilt-inducing events, and sad times in the last year made these past events feel as though they happened more recently. The relationship between rumination and reduced temporal psychological distance persisted even when controlling for when the event occurred and the emotional intensity of the event. Moreover, angry rumination was correlated with enhanced approach motivation, which mediated the rumination-distance relationship. The relationship between guilty rumination and distance was mediated by enhanced vividness. Construal level and taking a 3rd person perspective contributed to the sense of distance when participants were prompted to think about less emotionally charged situations. A meta-analysis of the data showed that the relationship between rumination and reduced distance was significant and twice as large as the same relationship for neutral events. These findings have implications for understanding the role of emotional rumination on memory processes in clinical populations and people prone to rumination. This research suggests that rumination may be a critical mechanism that keeps negative events close in the heart, mind, and time. PMID:25714395

  15. Piecewise multivariate modelling of sequential metabolic profiling data.

    PubMed

    Rantalainen, Mattias; Cloarec, Olivier; Ebbels, Timothy M D; Lundstedt, Torbjörn; Nicholson, Jeremy K; Holmes, Elaine; Trygg, Johan

    2008-02-19

    Modelling the time-related behaviour of biological systems is essential for understanding their dynamic responses to perturbations. In metabolic profiling studies, the sampling rate and number of sampling points are often restricted due to experimental and biological constraints. A supervised multivariate modelling approach with the objective to model the time-related variation in the data for short and sparsely sampled time-series is described. A set of piecewise Orthogonal Projections to Latent Structures (OPLS) models are estimated, describing changes between successive time points. The individual OPLS models are linear, but the piecewise combination of several models accommodates modelling and prediction of changes which are non-linear with respect to the time course. We demonstrate the method on both simulated and metabolic profiling data, illustrating how time related changes are successfully modelled and predicted. The proposed method is effective for modelling and prediction of short and multivariate time series data. A key advantage of the method is model transparency, allowing easy interpretation of time-related variation in the data. The method provides a competitive complement to commonly applied multivariate methods such as OPLS and Principal Component Analysis (PCA) for modelling and analysis of short time-series data.
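
    The piecewise idea described above can be prototyped with off-the-shelf tools. The sketch below is an assumption-laden illustration rather than the authors' implementation: scikit-learn has no OPLS, so ordinary PLS regression is used as a stand-in, one model is fitted per pair of successive time points, and the data shapes and variable names are invented.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    # profiles[t] is an (n_subjects, n_metabolites) matrix for time point t (synthetic here)
    rng = np.random.default_rng(0)
    profiles = [rng.normal(size=(20, 50)) + t for t in range(4)]

    piecewise_models = []
    for t in range(len(profiles) - 1):
        X = np.vstack([profiles[t], profiles[t + 1]])                          # samples from both time points
        y = np.r_[np.zeros(len(profiles[t])), np.ones(len(profiles[t + 1]))]   # time-point label
        pls = PLSRegression(n_components=2).fit(X, y)
        piecewise_models.append(pls)                                           # captures the t -> t+1 change

    # The x_loadings_ of each model indicate which variables drive the change
    # between the two consecutive time points covered by that model.
    print(piecewise_models[0].x_loadings_.shape)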

  16. Predictive value of night-time heart rate for cardiovascular events in hypertension. The ABP-International study.

    PubMed

    Palatini, Paolo; Reboldi, Gianpaolo; Beilin, Lawrence J; Eguchi, Kazuo; Imai, Yutaka; Kario, Kazuomi; Ohkubo, Takayoshi; Pierdomenico, Sante D; Saladini, Francesca; Schwartz, Joseph E; Wing, Lindon; Verdecchia, Paolo

    2013-09-30

    Data from prospective cohort studies regarding the association between ambulatory heart rate (HR) and cardiovascular events (CVE) are conflicting. To investigate whether ambulatory HR predicts CVE in hypertension, we performed 24-hour ambulatory blood pressure and HR monitoring in 7600 hypertensive patients aged 52 ± 16 years from Italy, U.S.A., Japan, and Australia, included in the 'ABP-International' registry. All were untreated at baseline examination. Standardized hazard ratios for ambulatory HRs were computed, stratifying for cohort, and adjusting for age, gender, blood pressure, smoking, diabetes, serum total cholesterol and serum creatinine. During a median follow-up of 5.0 years there were 639 fatal and nonfatal CVE. In a multivariable Cox model, night-time HR predicted fatal combined with nonfatal CVE more closely than 24h HR (p=0.007 and =0.03, respectively). Daytime HR and the night:day HR ratio were not associated with CVE (p=0.07 and =0.18, respectively). The hazard ratio of the fatal combined with nonfatal CVE for a 10-beats/min increment of the night-time HR was 1.13 (95% CI, 1.04-1.22). This relationship remained significant when subjects taking beta-blockers during the follow-up (hazard ratio, 1.15; 95% CI, 1.05-1.25) or subjects who had an event within 5 years after enrollment (hazard ratio, 1.23; 95% CI, 1.05-1.45) were excluded from analysis. At variance with previous data obtained from general populations, ambulatory HR added to the risk stratification for fatal combined with nonfatal CVE in the hypertensive patients from the ABP-International study. Night-time HR was a better predictor of CVE than daytime HR. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
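
    For readers who want to reproduce the style of effect estimate reported above (a hazard ratio per 10-beats/min increment of night-time heart rate), the fragment below is a minimal sketch using the lifelines package on synthetic data; the column names, covariates and follow-up distribution are assumptions, not the ABP-International data.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(1)
    n = 500
    df = pd.DataFrame({
        "night_hr_per10": rng.normal(6.5, 1.0, n),   # night-time HR expressed in 10-bpm units
        "age": rng.normal(52, 16, n),
        "followup_years": rng.exponential(5.0, n),
        "cve": rng.integers(0, 2, n),                # fatal or nonfatal cardiovascular event indicator
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="followup_years", event_col="cve")
    # exp(coef) for night_hr_per10 is the hazard ratio per 10-beats/min increment
    print(cph.summary)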

  17. A hybrid clustering approach for multivariate time series - A case study applied to failure analysis in a gas turbine.

    PubMed

    Fontes, Cristiano Hora; Budman, Hector

    2017-11-01

    A clustering problem involving multivariate time series (MTS) requires the selection of similarity metrics. This paper shows the limitations of the PCA similarity factor (SPCA) as a single metric in nonlinear problems where there are differences in magnitude of the same process variables due to expected changes in operation conditions. A novel method for clustering MTS based on a combination between SPCA and the average-based Euclidean distance (AED) within a fuzzy clustering approach is proposed. Case studies involving either simulated or real industrial data collected from a large scale gas turbine are used to illustrate that the hybrid approach enhances the ability to recognize normal and fault operating patterns. This paper also proposes an oversampling procedure to create synthetic multivariate time series that can be useful in commonly occurring situations involving unbalanced data sets. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
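
    A minimal sketch of the two ingredients named above, the PCA similarity factor and an average-based Euclidean distance, is given below; the number of retained components, the AED definition (distance between time-averaged variable vectors) and the toy data are assumptions, and the fuzzy-clustering step that combines the two metrics is not reproduced.

    import numpy as np
    from sklearn.decomposition import PCA

    def pca_similarity(X1, X2, k=2):
        """S_PCA in [0, 1], built from the first k principal directions of each series."""
        L1 = PCA(n_components=k).fit(X1).components_.T   # (n_vars, k), orthonormal columns
        L2 = PCA(n_components=k).fit(X2).components_.T
        return np.trace(L1.T @ L2 @ L2.T @ L1) / k

    def average_euclidean_distance(X1, X2):
        """Euclidean distance between the time-averaged variable vectors."""
        return np.linalg.norm(X1.mean(axis=0) - X2.mean(axis=0))

    rng = np.random.default_rng(2)
    A = rng.normal(size=(300, 5))                            # one MTS: rows = time, columns = variables
    B = 3.0 * A + 5.0 + rng.normal(scale=0.1, size=A.shape)  # same correlation structure, different magnitude

    print(pca_similarity(A, B))              # near 1: the correlation structure matches
    print(average_euclidean_distance(A, B))  # large: the magnitudes differ, which S_PCA alone misses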

  18. Testing the causality of Hawkes processes with time reversal

    NASA Astrophysics Data System (ADS)

    Cordi, Marcus; Challet, Damien; Muni Toke, Ioane

    2018-03-01

    We show that univariate and symmetric multivariate Hawkes processes are only weakly causal: the true log-likelihoods of real and reversed event time vectors are almost equal, thus parameter estimation via maximum likelihood only weakly depends on the direction of the arrow of time. In ideal (synthetic) conditions, tests of goodness of parametric fit unambiguously reject backward event times, which implies that inferring kernels from time-symmetric quantities, such as the autocovariance of the event rate, only rarely produces statistically significant fits. Finally, we find that fitting financial data with many-parameter kernels may yield significant fits for both arrows of time for the same event time vector, sometimes favouring the backward time direction. This goes to show that a significant fit of Hawkes processes to real data with flexible kernels does not imply a definite arrow of time unless one tests it.
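
    The core comparison, evaluating the same parametric log-likelihood on the original and on the time-reversed event times, can be sketched for a univariate exponential-kernel Hawkes process as below; the parameter values and the placeholder event times are assumptions, and no maximum-likelihood fitting is performed.

    import numpy as np

    def hawkes_loglik(times, mu, alpha, beta, T):
        """Exact log-likelihood of lambda(t) = mu + sum_i alpha*exp(-beta*(t - t_i)) on [0, T]."""
        times = np.sort(times)
        R, ll, prev = 0.0, 0.0, None
        for t in times:
            if prev is not None:
                R = np.exp(-beta * (t - prev)) * (R + alpha)   # recursive excitation term
            ll += np.log(mu + R)
            prev = t
        # compensator: integral of the intensity over the observation window
        ll -= mu * T + (alpha / beta) * np.sum(1.0 - np.exp(-beta * (T - times)))
        return ll

    rng = np.random.default_rng(3)
    T = 1000.0
    events = np.sort(rng.uniform(0, T, size=500))   # placeholder event times
    reversed_events = np.sort(T - events)           # arrow of time reversed

    print(hawkes_loglik(events, mu=0.4, alpha=0.5, beta=1.0, T=T))
    print(hawkes_loglik(reversed_events, mu=0.4, alpha=0.5, beta=1.0, T=T))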

  19. APNEA list mode data acquisition and real-time event processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogle, R.A.; Miller, P.; Bramblett, R.L.

    1997-11-01

    The LMSC Active Passive Neutron Examinations and Assay (APNEA) Data Logger is a VME-based data acquisition system using commercial-off-the-shelf hardware with application-specific software. It receives TTL inputs from eighty-eight ³He detector tubes and eight timing signals. Two data sets are generated concurrently for each acquisition session: (1) List Mode recording of all detector and timing signals, timestamped to 3 microsecond resolution; (2) Event Accumulations generated in real-time by counting events into short (tens of microseconds) and long (seconds) time bins following repetitive triggers. List Mode data sets can be post-processed to: (1) determine the optimum time bins for TRU assay of waste drums, (2) analyze a given data set in several ways to match different assay requirements and conditions and (3) confirm assay results by examining details of the raw data. Data Logger events are processed and timestamped by an array of 15 TMS320C40 DSPs and delivered to an embedded controller (PowerPC604) for interim disk storage. Three acquisition modes, corresponding to different trigger sources, are provided. A standard network interface to a remote host system (Windows NT or SunOS) provides for system control, status, and transfer of previously acquired data. 6 figs.

  20. Real-time prediction of the occurrence of GLE events

    NASA Astrophysics Data System (ADS)

    Núñez, Marlon; Reyes-Santiago, Pedro J.; Malandraki, Olga E.

    2017-07-01

    A tool for predicting the occurrence of Ground Level Enhancement (GLE) events using the UMASEP scheme is presented. This real-time tool, called HESPERIA UMASEP-500, is based on the detection of the magnetic connection, along which protons arrive in the near-Earth environment, by estimating the lag correlation between the time derivatives of 1 min soft X-ray flux (SXR) and 1 min near-Earth proton fluxes observed by the GOES satellites. Unlike current GLE warning systems, this tool can predict GLE events before the detection by any neutron monitor (NM) station. The prediction performance measured for the period from 1986 to 2016 is presented for two consecutive periods, because of their notable difference in performance. For the 2000-2016 period, this prediction tool obtained a probability of detection (POD) of 53.8% (7 of 13 GLE events), a false alarm ratio (FAR) of 30.0%, and average warning times (AWT) of 8 min with respect to the first NM station's alert and 15 min to the GLE Alert Plus's warning. We have tested the model by replacing the GOES proton data with SOHO/EPHIN proton data, and the results are similar in terms of POD, FAR, and AWT for the same period. The paper also presents a comparison with a GLE warning system.
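
    The magnetic-connection detection step rests on a lag correlation between the time derivatives of the two 1-min flux series. The fragment below is only a schematic numpy illustration of that idea (it is not the HESPERIA UMASEP-500 code); the synthetic series and the maximum lag are assumptions.

    import numpy as np

    rng = np.random.default_rng(4)
    n = 600                                                      # ten hours of 1-minute samples
    sxr = np.cumsum(rng.normal(size=n))                          # placeholder soft X-ray flux series
    protons = np.roll(sxr, 25) + rng.normal(scale=0.5, size=n)   # delayed, noisy proton-flux response

    d_sxr = np.diff(sxr)                                         # 1-min time derivatives
    d_protons = np.diff(protons)

    max_lag = 60
    lags = np.arange(0, max_lag + 1)
    corrs = [np.corrcoef(d_sxr[:len(d_sxr) - lag], d_protons[lag:])[0, 1] for lag in lags]

    best = lags[int(np.argmax(corrs))]
    print(f"best lag: {best} min, correlation: {max(corrs):.2f}")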

  1. Summarizing the incidence of adverse events using volcano plots and time intervals.

    PubMed

    Zink, Richard C; Wolfinger, Russell D; Mann, Geoffrey

    2013-01-01

    Adverse event incidence analyses are a critical component for describing the safety profile of any new intervention. The results typically are presented in lengthy summary tables. For therapeutic areas where patients have frequent adverse events, analysis and interpretation are made more difficult by the sheer number and variety of events that occur. Understanding the risk in these instances becomes even more crucial. We describe a space-saving graphical summary that overcomes the limitations of traditional presentations of adverse events and improves interpretability of the safety profile. We present incidence analyses of adverse events graphically using volcano plots to highlight treatment differences. Data from a clinical trial of patients experiencing an aneurysmal subarachnoid hemorrhage are used for illustration. Adjustments for multiplicity are illustrated. Color is used to indicate the treatment with higher incidence; bubble size represents the total number of events that occur in the treatment arms combined. Adjustments for multiple comparisons are displayed in a manner to indicate clearly those events for which the difference between treatment arms is statistically significant. Furthermore, adverse events can be displayed by time intervals, with multiple volcano plots or animation to appreciate changes in adverse event risk over time. Such presentations can emphasize early differences across treatments that may resolve later or highlight events for which treatment differences may become more substantial with longer follow-up. Treatment arms are compared in a pairwise fashion. Volcano plots are space-saving tools that emphasize important differences between the adverse event profiles of two treatment arms. They can incorporate multiplicity adjustments in a manner that is straightforward to interpret and, by using time intervals, can illustrate how adverse event risk changes over the course of a clinical trial.
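
    As a rough illustration of the display described above, the following matplotlib sketch places one bubble per adverse event term at (risk difference, -log10 p-value), sizes it by the total event count and colours it by the arm with the higher incidence; the data frame, column names and the unadjusted 0.05 reference line are assumptions.

    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(5)
    n_terms = 40
    ae = pd.DataFrame({
        "risk_diff": rng.normal(0, 0.05, n_terms),      # incidence(arm A) - incidence(arm B)
        "p_value": rng.uniform(0.001, 1.0, n_terms),
        "total_events": rng.integers(5, 200, n_terms),  # events in both arms combined
    })

    colors = np.where(ae["risk_diff"] > 0, "tab:red", "tab:blue")
    plt.scatter(ae["risk_diff"], -np.log10(ae["p_value"]),
                s=ae["total_events"], c=colors, alpha=0.6, edgecolors="k")
    plt.axhline(-np.log10(0.05), linestyle="--", color="gray")   # unadjusted significance threshold
    plt.xlabel("Risk difference (arm A - arm B)")
    plt.ylabel("-log10(p-value)")
    plt.title("Adverse event volcano plot (one time interval)")
    plt.show()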

  2. A Cyber-Attack Detection Model Based on Multivariate Analyses

    NASA Astrophysics Data System (ADS)

    Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi

    In the present paper, we propose a novel cyber-attack detection model based on applying two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are Hayashi's well-known quantification method IV and cluster analysis. We quantify the observed qualitative audit event sequence via the quantification method IV, and collect similar audit event sequences into the same groups based on the cluster analysis. It is shown in simulation experiments that our model can improve the cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.

  3. The impact of operative time on complications after plastic surgery: a multivariate regression analysis of 1753 cases.

    PubMed

    Hardy, Krista L; Davis, Kathryn E; Constantine, Ryan S; Chen, Mo; Hein, Rachel; Jewell, James L; Dirisala, Karunakar; Lysikowski, Jerzy; Reed, Gary; Kenkel, Jeffrey M

    2014-05-01

    Little evidence within plastic surgery literature supports the precept that longer operative times lead to greater morbidity. The authors investigate surgery duration as a determinant of morbidity, with the goal of defining a clinically relevant time for increased risk. A retrospective chart review was conducted of patients who underwent a broad range of complex plastic surgical procedures (n = 1801 procedures) at UT Southwestern Medical Center in Dallas, Texas, from January 1, 2008 to January 31, 2012. Adjusting for possible confounders, multivariate logistic regression assessed surgery duration as an independent predictor of morbidity. To define a cutoff for increased risk, incidence of complications was compared among quintiles of surgery duration. Stratification by type of surgery controlled for procedural complexity. A total of 1753 cases were included in multivariate analyses with an overall complication rate of 27.8%. Most operations were combined (75.8%), averaging 4.9 concurrent procedures. Each hour increase in surgery duration was associated with a 21% rise in odds of morbidity (P < .0001). Compared with the first quintile of operative time (<2.0 hours), there was no change in complications until after 3.1 hours of surgery (odds ratio, 1.6; P = .017), with progressively greater odds increases of 3.1 times after 4.5 hours (P < .0001) and 4.7 times after 6.8 hours (P < .0001). When stratified by type of surgery, longer operations continued to be associated with greater morbidity. Surgery duration is an independent predictor of complications, with a significantly increased risk above 3 hours. Although procedural complexity undoubtedly affects morbidity, operative time should factor into surgical decision making.

  4. A rank test for bivariate time-to-event outcomes when one event is a surrogate

    PubMed Central

    Shaw, Pamela A.; Fay, Michael P.

    2016-01-01

    In many clinical settings, improving patient survival is of interest but a practical surrogate, such as time to disease progression, is instead used as a clinical trial’s primary endpoint. A time-to-first endpoint (e.g. death or disease progression) is commonly analyzed but may not be adequate to summarize patient outcomes if a subsequent event contains important additional information. We consider a surrogate outcome very generally, as one correlated with the true endpoint of interest. Settings of interest include those where the surrogate indicates a beneficial outcome so that the usual time-to-first endpoint of death or surrogate event is nonsensical. We present a new two-sample test for bivariate, interval-censored time-to-event data, where one endpoint is a surrogate for the second, less frequently observed endpoint of true interest. This test examines whether patient groups have equal clinical severity. If the true endpoint rarely occurs, the proposed test acts like a weighted logrank test on the surrogate; if it occurs for most individuals, then our test acts like a weighted logrank test on the true endpoint. If the surrogate is a useful statistical surrogate, our test can have better power than tests based on the surrogate that naively handle the true endpoint. In settings where the surrogate is not valid (treatment affects the surrogate but not the true endpoint), our test incorporates the information regarding the lack of treatment effect from the observed true endpoints and hence is expected to have a dampened treatment effect compared to tests based on the surrogate alone. PMID:27059817

  5. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm.

    PubMed

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-10-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses.

  6. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm

    PubMed Central

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-01-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses. PMID:27706086
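
    The peak-picking stage of the algorithm described in the two records above can be sketched as follows: differentiate the acceleration to obtain a jerk signal and keep its prominent peaks as candidate gait events. The sampling rate, thresholds and synthetic signal are assumptions, and the published time-frequency step is not reproduced.

    import numpy as np
    from scipy.signal import find_peaks

    fs = 100.0                                         # assumed sampling rate, Hz
    t = np.arange(0, 10, 1 / fs)
    # placeholder gait-like acceleration; in practice the signal would be filtered first
    accel = np.sin(2 * np.pi * 1.0 * t) + 0.01 * np.random.default_rng(6).normal(size=t.size)

    jerk = np.gradient(accel, 1 / fs)                  # time derivative of acceleration
    peaks, props = find_peaks(jerk, height=4.0, distance=int(0.4 * fs))

    candidate_event_times = t[peaks]                   # candidate HS/TO instants for further rules
    print(candidate_event_times[:5])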

  7. A LORETA study of mental time travel: similar and distinct electrophysiological correlates of re-experiencing past events and pre-experiencing future events.

    PubMed

    Lavallee, Christina F; Persinger, Michael A

    2010-12-01

    Previous studies exploring mental time travel paradigms with functional neuroimaging techniques have uncovered both common and distinct neural correlates of re-experiencing past events or pre-experiencing future events. A gap in the mental time travel literature exists, as paradigms have not explored the affective component of re-experiencing past episodic events; this study explored this sparsely researched area. The present study employed standardized low resolution electromagnetic tomography (sLORETA) to identify electrophysiological correlates of re-experiencing affect-laden and non-affective past events, as well as pre-experiencing a future anticipated event. Our results confirm previous research and are also novel in that we illustrate common and distinct electrophysiological correlates of re-experiencing affective episodic events. Furthermore, research from this experiment yields results showing that a pattern of activation in the frontal and temporal regions is correlated with the time frame of the past or future events subjects imagined. Copyright © 2010 Elsevier Inc. All rights reserved.

  8. A Regularized Linear Dynamical System Framework for Multivariate Time Series Analysis.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2015-01-01

    Linear Dynamical System (LDS) is an elegant mathematical framework for modeling and learning Multivariate Time Series (MTS). However, in general, it is difficult to set the dimension of an LDS's hidden state space. A small number of hidden states may not be able to model the complexities of a MTS, while a large number of hidden states can lead to overfitting. In this paper, we study learning methods that impose various regularization penalties on the transition matrix of the LDS model and propose a regularized LDS learning framework (rLDS) which aims to (1) automatically shut down LDSs' spurious and unnecessary dimensions, and consequently, address the problem of choosing the optimal number of hidden states; (2) prevent the overfitting problem given a small amount of MTS data; and (3) support accurate MTS forecasting. To learn the regularized LDS from data we incorporate a second order cone program and a generalized gradient descent method into the Maximum a Posteriori framework and use Expectation Maximization to obtain a low-rank transition matrix of the LDS model. We propose two priors for modeling the matrix which lead to two instances of our rLDS. We show that our rLDS is able to recover well the intrinsic dimensionality of the time series dynamics and it improves the predictive performance when compared to baselines on both synthetic and real-world MTS datasets.

  9. Improving linear accelerator service response with a real-time electronic event reporting system.

    PubMed

    Hoisak, Jeremy D P; Pawlicki, Todd; Kim, Gwe-Ya; Fletcher, Richard; Moore, Kevin L

    2014-09-08

    To track linear accelerator performance issues, an online event recording system was developed in-house for use by therapists and physicists to log the details of technical problems arising on our institution's four linear accelerators. In use since October 2010, the system was designed so that all clinical physicists would receive email notification when an event was logged. Starting in October 2012, we initiated a pilot project in collaboration with our linear accelerator vendor to explore a new model of service and support, in which event notifications were also sent electronically directly to dedicated engineers at the vendor's technical help desk, who then initiated a response to technical issues. Previously, technical issues were reported by telephone to the vendor's call center, which then disseminated information and coordinated a response with the Technical Support help desk and local service engineers. The purpose of this work was to investigate the improvements to clinical operations resulting from this new service model. The new and old service models were quantitatively compared by reviewing event logs and the oncology information system database in the nine months prior to and after initiation of the project. Here, we focus on events that resulted in an inoperative linear accelerator ("down" machine). Machine downtime, vendor response time, treatment cancellations, and event resolution were evaluated and compared over two equivalent time periods. In 389 clinical days, there were 119 machine-down events: 59 events before and 60 after introduction of the new model. In the new model, median time to service response decreased from 45 to 8 min, service engineer dispatch time decreased 44%, downtime per event decreased from 45 to 20 min, and treatment cancellations decreased 68%. The decreased vendor response time and reduced number of on-site visits by a service engineer resulted in decreased downtime and decreased patient treatment cancellations.

  10. Lower Performance in Orientation to Time and Place Associates with Greater Risk of Cardiovascular Events and Mortality in the Oldest Old: Leiden 85-Plus Study.

    PubMed

    Rostamian, Somayeh; van Buchem, Mark A; Jukema, J Wouter; Gussekloo, Jacobijn; Poortvliet, Rosalinde K E; de Craen, Anton J M; Sabayan, Behnam

    2017-01-01

    Background: Impairment in orientation to time and place is commonly observed in community-dwelling older individuals. Nevertheless, the clinical significance of this has been not fully explored. In this study, we investigated the link between performance in orientation domains and future risk of cardiovascular events and mortality in a non-hospital setting of the oldest old adults. Methods: We included 528 subjects free of myocardial infarction (Group A), 477 individuals free of stroke/transient ischemic attack (Group B), and 432 subjects free of both myocardial infarction and stroke/transient ischemic attack (Group C) at baseline from the population-based Leiden 85-plus cohort study. Participants were asked to answer five questions related to orientation to time and five questions related to orientation to place. 5-year risks of first-time fatal and non-fatal myocardial infarction, fatal and non-fatal stroke, as well as cardiovascular and non-cardiovascular mortality, were estimated using the multivariate Cox regression analysis. Results: In the multivariable analyses, adjusted for sociodemographic characteristics and cardiovascular risk factors, each point lower performance in "orientation to time" was significantly associated with higher risk of first-time myocardial infarction (hazard ratio [HR] 1.35, 95% confidence interval [CI] 1.09-1.67, P = 0.007), first-time stroke (HR 1.35, 95% CI 1.12-1.64, P = 0.002), cardiovascular mortality (HR 1.28, 95% CI 1.06-1.54, P = 0.009) and non-cardiovascular mortality (HR 1.37, 95% CI 1.20-1.56, P < 0.001). Similarly, each point lower performance in "orientation to place" was significantly associated with higher risk of first-time myocardial infarction (HR 1.67, 95% CI 1.25-2.22, P = 0.001), first-time stroke (HR 1.39, 95% CI 1.05-1.82, P = 0.016), cardiovascular mortality (HR 1.35, 95% CI 1.00-1.82, P = 0.054) and non-cardiovascular mortality (HR 1.45, 95% CI 1.20-1.77, P < 0.001). Conclusions: Lower performance in

  11. Element analysis: a wavelet-based method for analysing time-localized events in noisy time series

    PubMed Central

    2017-01-01

    A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized ‘events’. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event’s ‘region of influence’ within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry. PMID:28484325

  12. Cognitive load and task condition in event- and time-based prospective memory: an experimental investigation.

    PubMed

    Khan, Azizuddin; Sharma, Narendra K; Dixit, Shikha

    2008-09-01

    Prospective memory is memory for the realization of delayed intention. Researchers distinguish 2 kinds of prospective memory: event- and time-based (G. O. Einstein & M. A. McDaniel, 1990). Taking that distinction into account, the present authors explored participants' comparative performance under event- and time-based tasks. In an experimental study of 80 participants, the authors investigated the roles of cognitive load and task condition in prospective memory. Cognitive load (low vs. high) and task condition (event- vs. time-based task) were the independent variables. Accuracy in prospective memory was the dependent variable. Results showed significant differential effects under event- and time-based tasks. However, the effect of cognitive load was more detrimental in time-based prospective memory. Results also revealed that time monitoring is critical in successful performance of time estimation and so in time-based prospective memory. Similarly, participants' better performance on the event-based prospective memory task showed that they acted on the basis of environment cues. Event-based prospective memory was environmentally cued; time-based prospective memory required self-initiation.

  13. Nonparametric estimation of the multivariate survivor function: the multivariate Kaplan-Meier estimator.

    PubMed

    Prentice, Ross L; Zhao, Shanshan

    2018-01-01

    The Dabrowska (Ann Stat 16:1475-1489, 1988) product integral representation of the multivariate survivor function is extended, leading to a nonparametric survivor function estimator for an arbitrary number of failure time variates that has a simple recursive formula for its calculation. Empirical process methods are used to sketch proofs for this estimator's strong consistency and weak convergence properties. Summary measures of pairwise and higher-order dependencies are also defined and nonparametrically estimated. Simulation evaluation is given for the special case of three failure time variates.

  14. Sequential parallel comparison design with binary and time-to-event outcomes.

    PubMed

    Silverman, Rachel Kloss; Ivanova, Anastasia; Fine, Jason

    2018-04-30

    Sequential parallel comparison design (SPCD) has been proposed to increase the likelihood of success of clinical trials especially trials with possibly high placebo effect. Sequential parallel comparison design is conducted with 2 stages. Participants are randomized between active therapy and placebo in stage 1. Then, stage 1 placebo nonresponders are rerandomized between active therapy and placebo. Data from the 2 stages are pooled to yield a single P value. We consider SPCD with binary and with time-to-event outcomes. For time-to-event outcomes, response is defined as a favorable event prior to the end of follow-up for a given stage of SPCD. We show that for these cases, the usual test statistics from stages 1 and 2 are asymptotically normal and uncorrelated under the null hypothesis, leading to a straightforward combined testing procedure. In addition, we show that the estimators of the treatment effects from the 2 stages are asymptotically normal and uncorrelated under the null and alternative hypothesis, yielding confidence interval procedures with correct coverage. Simulations and real data analysis demonstrate the utility of the binary and time-to-event SPCD. Copyright © 2018 John Wiley & Sons, Ltd.
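
    Because the two stage-wise statistics are asymptotically standard normal and uncorrelated under the null, a single SPCD p value can be obtained from a rescaled weighted sum, as in the sketch below; the weight w is a design choice assumed here for illustration, not a value from the paper.

    import numpy as np
    from scipy.stats import norm

    def spcd_combined_z(z1, z2, w=0.6):
        """Combine uncorrelated stage-1 and stage-2 z-statistics into one test."""
        z = (w * z1 + (1 - w) * z2) / np.sqrt(w**2 + (1 - w)**2)   # standard normal under the null
        p_one_sided = norm.sf(z)
        return z, p_one_sided

    z, p = spcd_combined_z(z1=1.8, z2=1.1)
    print(f"combined z = {z:.2f}, one-sided p = {p:.3f}")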

  15. Multivariate Analyses of Small Theropod Dinosaur Teeth and Implications for Paleoecological Turnover through Time

    PubMed Central

    Larson, Derek W.; Currie, Philip J.

    2013-01-01

    Isolated small theropod teeth are abundant in vertebrate microfossil assemblages, and are frequently used in studies of species diversity in ancient ecosystems. However, determining the taxonomic affinities of these teeth is problematic due to an absence of associated diagnostic skeletal material. Species such as Dromaeosaurus albertensis, Richardoestesia gilmorei, and Saurornitholestes langstoni are known from skeletal remains that have been recovered exclusively from the Dinosaur Park Formation (Campanian). It is therefore likely that teeth from different formations widely disparate in age or geographic position are not referable to these species. Tooth taxa without any associated skeletal material, such as Paronychodon lacustris and Richardoestesia isosceles, have also been identified from multiple localities of disparate ages throughout the Late Cretaceous. To address this problem, a dataset of measurements of 1183 small theropod teeth (the most specimen-rich theropod tooth dataset ever constructed) from North America ranging in age from Santonian through Maastrichtian were analyzed using multivariate statistical methods: canonical variate analysis, pairwise discriminant function analysis, and multivariate analysis of variance. The results indicate that teeth referred to the same taxon from different formations are often quantitatively distinct. In contrast, isolated teeth found in time equivalent formations are not quantitatively distinguishable from each other. These results support the hypothesis that small theropod taxa, like other dinosaurs in the Late Cretaceous, tend to be exclusive to discrete host formations. The methods outlined have great potential for future studies of isolated teeth worldwide, and may be the most useful non-destructive technique known of extracting the most data possible from isolated and fragmentary specimens. The ability to accurately assess species diversity and turnover through time based on isolated teeth will help illuminate
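
    The canonical variate analysis step described above is closely related to linear discriminant analysis, so a rough sketch with scikit-learn looks like the following; the morphotype labels, the four tooth measurements and the synthetic data are purely illustrative assumptions.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(7)
    # three hypothetical tooth morphotypes, four measurements per tooth (e.g., crown height, base length)
    X = np.vstack([rng.normal(loc=m, scale=1.0, size=(50, 4)) for m in (0.0, 1.5, 3.0)])
    y = np.repeat(["morphotype_A", "morphotype_B", "morphotype_C"], 50)

    lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)
    canonical_scores = lda.transform(X)       # axes analogous to canonical variates
    print(lda.explained_variance_ratio_)      # share of between-group variance per axis
    print(lda.score(X, y))                    # resubstitution classification accuracy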

  16. Real-Time Multimission Event Notification System for Mars Relay

    NASA Technical Reports Server (NTRS)

    Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Wang, Paul; Hy, Franklin H.

    2013-01-01

    As the Mars Relay Network is in constant flux (missions and teams going through their daily workflow), it is imperative that users are aware of such state changes. For example, a change by an orbiter team can affect operations on a lander team. This software provides an ambient view of the real-time status of the Mars network. The Mars Relay Operations Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay Network. As part of MaROS, a feature set was developed that operates on several levels of the software architecture. These levels include a Web-based user interface, a back-end "ReSTlet" built in Java, and databases that store the data as it is received from the network. The result is a real-time event notification and management system, so mission teams can track and act upon events on a moment-by-moment basis. This software retrieves events from MaROS and displays them to the end user. Updates happen in real time, i.e., messages are pushed to the user while logged into the system, and queued when the user is not online for later viewing. The software does not do away with the email notifications, but augments them with in-line notifications. Further, this software expands the events that can generate a notification, and allows user-generated notifications. Existing software sends a smaller subset of mission-generated notifications via email. A common complaint of users was that the system-generated e-mails often "get lost" with other e-mail that comes in. This software allows for an expanded set (including user-generated) of notifications displayed in-line of the program. By separating notifications, this can improve a user's workflow.

  17. A log-Weibull spatial scan statistic for time to event data.

    PubMed

    Usman, Iram; Rosychuk, Rhonda J

    2018-06-13

    Spatial scan statistics have been used for the identification of geographic clusters of elevated numbers of cases of a condition such as disease outbreaks. These statistics accompanied by the appropriate distribution can also identify geographic areas with either longer or shorter time to events. Other authors have proposed the spatial scan statistics based on the exponential and Weibull distributions. We propose the log-Weibull as an alternative distribution for the spatial scan statistic for time to events data and compare and contrast the log-Weibull and Weibull distributions through simulation studies. The effect of type I differential censoring and power have been investigated through simulated data. Methods are also illustrated on time to specialist visit data for discharged patients presenting to emergency departments for atrial fibrillation and flutter in Alberta during 2010-2011. We found northern regions of Alberta had longer times to specialist visit than other areas. We proposed the spatial scan statistic for the log-Weibull distribution as a new approach for detecting spatial clusters for time to event data. The simulation studies suggest that the test performs well for log-Weibull data.

  18. Long-term changes in regular and low-frequency earthquake inter-event times near Parkfield, CA

    NASA Astrophysics Data System (ADS)

    Wu, C.; Shelly, D. R.; Johnson, P. A.; Gomberg, J. S.; Peng, Z.

    2012-12-01

    The temporal evolution of earthquake inter-event time may provide important clues for the timing of future events and underlying physical mechanisms of earthquake nucleation. In this study, we examine inter-event times from 12-yr catalogs of ~50,000 earthquakes and ~730,000 LFEs in the vicinity of the Parkfield section of the San Andreas Fault. We focus on the long-term evolution of inter-event times after the 2003 Mw6.5 San Simeon and 2004 Mw6.0 Parkfield earthquakes. We find that inter-event times decrease by ~4 orders of magnitude after the Parkfield and San Simeon earthquakes and are followed by a long-term recovery with time scales of ~3 years and more than 8 years for earthquakes along and to the southwest of the San Andreas fault, respectively. The differing long-term recovery of the earthquake inter-event times is likely a manifestation of different aftershock recovery time scales that reflect the different tectonic loading rates in the two regions. We also observe a possible decrease of LFE inter-event times in some LFE families, followed by a recovery with time scales of ~4 months to several years. The drop in the recurrence time of LFE after the Parkfield earthquake is likely caused by a combination of the dynamic and positive static stress induced by the Parkfield earthquake, and the long-term recovery in LFE recurrence time could be due to post-seismic relaxation or gradual recovery of the fault zone material properties. Our on-going work includes better constraining and understanding the physical mechanisms responsible for the observed long-term recovery in earthquake and LFE inter-event times.

  19. Model predictive control of P-time event graphs

    NASA Astrophysics Data System (ADS)

    Hamri, H.; Kara, R.; Amari, S.

    2016-12-01

    This paper deals with model predictive control of discrete event systems modelled by P-time event graphs. First, the model is obtained by using the dater evolution model written in the standard algebra. Then, for the control law, we used the finite-horizon model predictive control. For the closed-loop control, we used the infinite-horizon model predictive control (IH-MPC). The latter is an approach that calculates static feedback gains which allows the stability of the closed-loop system while respecting the constraints on the control vector. The problem of IH-MPC is formulated as a linear convex programming subject to a linear matrix inequality problem. Finally, the proposed methodology is applied to a transportation system.

  20. Relative Time-scale for Channeling Events Within Chaotic Terrains, Margaritifer Sinus, Mars

    NASA Technical Reports Server (NTRS)

    Janke, D.

    1985-01-01

    A relative time scale for ordering channel and chaos forming events was constructed for areas within the Margaritifer Sinus region of Mars. Transection and superposition relationships of channels, chaotic terrain, and the surfaces surrounding them were used to create the relative time scale; crater density studies were not used. Channels and chaos in contact with one another were treated as systems. These systems were in turn treated both separately (in order to understand internal relationships) and as members of the suite of Martian erosional forms (in order to produce a combined, master time scale). Channeling events associated with chaotic terrain development occurred over an extended geomorphic period. The channels can be divided into three convenient groups: those that pre-date intercrater plains development; post-plains, pre-chasma systems; and those associated with the development of the Vallis Marineris chasmata. No correlations with cyclic climatic changes, major geologic events in other regions on Mars, or triggering phenomena (for example, specific impact events) were found.

  1. Full-waveform detection of non-impulsive seismic events based on time-reversal methods

    NASA Astrophysics Data System (ADS)

    Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya

    2017-12-01

    We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar-valued function, which depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time-windows with different filters. It is expected to have an advantage compared to traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events and transform-fault earthquakes, for events for which the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template-matching methods, as it does not depend on records from previously detected events, and therefore is not limited to events occurring in similar regions and with similar focal mechanisms as these events. The method is not specific to any particular way of calculating the synthetic seismograms, and therefore complicated structural models can be used. This is particularly beneficial for intermediate-size events that are registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two different areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported. The first study area is between Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms, occurring in March 2012 and May

  2. Testing the structure of earthquake networks from multivariate time series of successive main shocks in Greece

    NASA Astrophysics Data System (ADS)

    Chorozoglou, D.; Kugiumtzis, D.; Papadimitriou, E.

    2018-06-01

    The seismic hazard assessment in the area of Greece is attempted by studying the structure of the earthquake network, for example whether it is small-world or random. In this network, a node represents a seismic zone in the study area and a connection between two nodes is given by the correlation of the seismic activity of two zones. To investigate the network structure, and particularly the small-world property, the earthquake correlation network is compared with randomized ones. Simulations on multivariate time series of different length and number of variables show that for the construction of randomized networks the method randomizing the time series performs better than methods that directly randomize the original network connections. Based on the appropriate randomization method, the network approach is applied to time series of earthquakes that occurred between main shocks in the territory of Greece spanning the period 1999-2015. The characterization of networks on sliding time windows revealed that small-world structure emerges in the last time interval, shortly before the main shock.
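
    A toy version of the network construction and the time-series randomization described above is sketched below with numpy and networkx; the correlation threshold, the zone count and the placeholder seismicity series are assumptions, and only the clustering coefficient is compared.

    import numpy as np
    import networkx as nx

    def correlation_network(X, threshold=0.5):
        """X: (time, zones) activity matrix -> graph with edges where |correlation| >= threshold."""
        C = np.corrcoef(X.T)
        G = nx.Graph()
        G.add_nodes_from(range(X.shape[1]))
        for i in range(X.shape[1]):
            for j in range(i + 1, X.shape[1]):
                if abs(C[i, j]) >= threshold:
                    G.add_edge(i, j)
        return G

    rng = np.random.default_rng(8)
    common = rng.normal(size=(200, 1))                            # shared regional driver
    activity = rng.poisson(3.0, size=(200, 15)) + 2.0 * common    # placeholder zone seismicity series

    G = correlation_network(activity)
    G_rand = correlation_network(rng.permuted(activity, axis=0))  # each zone's series shuffled in time

    for name, g in [("observed", G), ("randomized", G_rand)]:
        print(name, "clustering:", round(nx.average_clustering(g), 3),
              "edges:", g.number_of_edges())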

  3. Classification and Space-Time Analysis of Precipitation Events in Manizales, Caldas, Colombia.

    NASA Astrophysics Data System (ADS)

    Suarez Hincapie, J. N.; Vélez, J.; Romo Melo, L.; Chang, P.

    2015-12-01

    Manizales is a mid-mountain Andean city located near the Nevado del Ruiz volcano in west-central Colombia; this location exposes it to earthquakes, floods, landslides and volcanic eruptions. It is located in the intertropical convergence zone (ITCZ) and presents a climate with a bimodal rainfall regime (Cortés, 2010). Its mean annual rainfall is 2000 mm, and one may observe precipitation on 70% of the days over a year. This rain, which favors the formation of large masses of clouds and accompanies macroclimatic phenomena such as the "El Niño Southern Oscillation", has historically caused great impacts in the region (Vélez et al, 2012). For example, the geographical location coupled with rain events results in a high risk of landslides in the city. Manizales has a hydrometeorological network of 40 stations that measure and transmit data of up to eight climate variables. Some of these stations keep 10 years of historical data. However, until now this information has not been used for space-time classification of precipitation events, nor have the meteorological variables that influence them been thoroughly researched. The purpose of this study was to classify historical events of rain in an urban area of Manizales and investigate patterns of atmospheric behavior that influence or trigger such events. Classification of events was performed by calculating the "n" index of the heavy rainfall, describing the behavior of precipitation as a function of time throughout the event (Monjo, 2009). The analysis of meteorological variables was performed using statistical quantification over variable time periods before each event. The proposed classification allowed for an analysis of the evolution of rainfall events. Specifically, it helped to look for the influence of different meteorological variables triggering rainfall events in hazardous areas such as the city of Manizales.

  4. F100 Multivariable Control Synthesis Program. Computer Implementation of the F100 Multivariable Control Algorithm

    NASA Technical Reports Server (NTRS)

    Soeder, J. F.

    1983-01-01

    As turbofan engines become more complex, the development of controls necessitates the use of multivariable control techniques. A control developed for the F100-PW-100(3) turbofan engine by using linear quadratic regulator theory and other modern multivariable control synthesis techniques is described. The assembly language implementation of this control on an SEL 810B minicomputer is described. This implementation was then evaluated by using a real-time hybrid simulation of the engine. The control software was modified to run with a real engine. These modifications, in the form of sensor and actuator failure checks and control executive sequencing, are discussed. Finally, recommendations for control software implementations are presented.

  5. Multivariate analysis of longitudinal rates of change.

    PubMed

    Bryan, Matthew; Heagerty, Patrick J

    2016-12-10

    Longitudinal data allow direct comparison of the change in patient outcomes associated with treatment or exposure. Frequently, several longitudinal measures are collected that either reflect a common underlying health status, or characterize processes that are influenced in a similar way by covariates such as exposure or demographic characteristics. Statistical methods that can combine multivariate response variables into common measures of covariate effects have been proposed in the literature. Current methods for characterizing the relationship between covariates and the rate of change in multivariate outcomes are limited to select models. For example, 'accelerated time' methods have been developed which assume that covariates rescale time in longitudinal models for disease progression. In this manuscript, we detail an alternative multivariate model formulation that directly structures longitudinal rates of change and that permits a common covariate effect across multiple outcomes. We detail maximum likelihood estimation for a multivariate longitudinal mixed model. We show via asymptotic calculations the potential gain in power that may be achieved with a common analysis of multiple outcomes. We apply the proposed methods to the analysis of a trivariate outcome for infant growth and compare rates of change for HIV infected and uninfected infants. Copyright © 2016 John Wiley & Sons, Ltd.

  6. Screen-based entertainment time, all-cause mortality, and cardiovascular events: population-based study with ongoing mortality and hospital events follow-up.

    PubMed

    Stamatakis, Emmanuel; Hamer, Mark; Dunstan, David W

    2011-01-18

    The aim of this study was to examine the independent relationships of television viewing or other screen-based entertainment ("screen time") with all-cause mortality and clinically confirmed cardiovascular disease (CVD) events. A secondary objective was to examine the extent to which metabolic (body mass index, high-density lipoprotein and total cholesterol) and inflammatory (C-reactive protein) markers mediate the relationship between screen time and CVD events. Although some evidence suggests that prolonged sitting is linked to CVD risk factor development regardless of physical activity participation, studies with hard outcomes are scarce. A population sample of 4,512 (1,945 men) Scottish Health Survey 2003 respondents (≥35 years) were followed up to 2007 for all-cause mortality and CVD events (fatal and nonfatal combined). Main exposures were interviewer-assessed screen time (<2 h/day; 2 to <4 h/day; and ≥4 h/day) and moderate to vigorous intensity physical activity. Two hundred fifteen CVD events and 325 any-cause deaths occurred during 19,364 follow-up person-years. The covariable (age, sex, ethnicity, obesity, smoking, social class, long-standing illness, marital status, diabetes, hypertension)-adjusted hazard ratio (HR) for all-cause mortality was 1.52 (95% confidence interval [CI]: 1.06 to 2.16) and for CVD events was 2.30 (95% CI: 1.33 to 3.96) for participants engaging in ≥4 h/day of screen time relative to <2 h/day. Adjusting for physical activity attenuated these associations only slightly (all-cause mortality: HR: 1.48, 95% CI: 1.04 to 2.13; CVD events: HR: 2.25, 95% CI: 1.30 to 3.89). Exclusion of participants with CVD events in the first 2 years of follow-up and previous cancer registrations did not change these results appreciably. Approximately 25% of the association between screen time and CVD events was explained collectively by C-reactive protein, body mass index, and high-density lipoprotein cholesterol. Recreational sitting, as reflected

  7. BACKWARD ESTIMATION OF STOCHASTIC PROCESSES WITH FAILURE EVENTS AS TIME ORIGINS

    PubMed Central

    Gary Chan, Kwun Chuen; Wang, Mei-Cheng

    2011-01-01

    Stochastic processes often exhibit sudden systematic changes in pattern a short time before certain failure events. Examples include increase in medical costs before death and decrease in CD4 counts before AIDS diagnosis. To study such terminal behavior of stochastic processes, a natural and direct way is to align the processes using failure events as time origins. This paper studies backward stochastic processes counting time backward from failure events, and proposes one-sample nonparametric estimation of the mean of backward processes when follow-up is subject to left truncation and right censoring. We will discuss benefits of including prevalent cohort data to enlarge the identifiable region and large sample properties of the proposed estimator with related extensions. A SEER–Medicare linked data set is used to illustrate the proposed methodologies. PMID:21359167

  8. Immortal time bias in observational studies of time-to-event outcomes.

    PubMed

    Jones, Mark; Fowler, Robert

    2016-12-01

    The purpose of the study is to show, through simulation and example, the magnitude and direction of immortal time bias when an inappropriate analysis is used. We compare 4 methods of analysis for observational studies of time-to-event outcomes: logistic regression, standard Cox model, landmark analysis, and time-dependent Cox model using an example data set of patients critically ill with influenza and a simulation study. For the example data set, logistic regression, standard Cox model, and landmark analysis all showed some evidence that treatment with oseltamivir provides protection from mortality in patients critically ill with influenza. However, when the time-dependent nature of treatment exposure is taken account of using a time-dependent Cox model, there is no longer evidence of a protective effect of treatment. The simulation study showed that, under various scenarios, the time-dependent Cox model consistently provides unbiased treatment effect estimates, whereas standard Cox model leads to bias in favor of treatment. Logistic regression and landmark analysis may also lead to bias. To minimize the risk of immortal time bias in observational studies of survival outcomes, we strongly suggest time-dependent exposures be included as time-dependent variables in hazard-based analyses. Copyright © 2016 Elsevier Inc. All rights reserved.
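
    The remedy recommended above, entering treatment as a time-dependent covariate, can be sketched with the lifelines package by splitting each subject's follow-up into pre- and post-treatment intervals in long format; the synthetic data (with no true treatment effect) and the column names are assumptions.

    import numpy as np
    import pandas as pd
    from lifelines import CoxTimeVaryingFitter

    rng = np.random.default_rng(9)
    rows = []
    for pid in range(200):
        t_treat = rng.uniform(0, 5) if rng.random() < 0.5 else np.inf   # treatment start, if ever
        t_event = rng.exponential(8.0)                                  # event time (no true effect)
        t_end = min(t_event, 10.0)                                      # administrative censoring at 10
        event = int(t_event <= 10.0)
        if t_treat < t_end:
            rows.append((pid, 0.0, t_treat, 0, 0))                      # untreated person-time
            rows.append((pid, t_treat, t_end, 1, event))                # treated person-time
        else:
            rows.append((pid, 0.0, t_end, 0, event))
    long_df = pd.DataFrame(rows, columns=["id", "start", "stop", "treated", "event"])

    ctv = CoxTimeVaryingFitter()
    ctv.fit(long_df, id_col="id", event_col="event", start_col="start", stop_col="stop")
    ctv.print_summary()   # hazard ratio for `treated`, free of immortal time bias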

  9. Identification of unusual events in multi-channel bridge monitoring data

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr; Brownjohn, James Mark William; Moyo, Pilate

    2004-03-01

    Continuously operating instrumented structural health monitoring (SHM) systems are becoming a practical alternative to replace visual inspection for assessment of condition and soundness of civil infrastructure such as bridges. However, converting large amounts of data from an SHM system into usable information is a great challenge to which special signal processing techniques must be applied. This study is devoted to identification of abrupt, anomalous and potentially onerous events in the time histories of static, hourly sampled strains recorded by a multi-sensor SHM system installed in a major bridge structure and operating continuously for a long time. Such events may result, among other causes, from sudden settlement of foundation, ground movement, excessive traffic load or failure of post-tensioning cables. A method of outlier detection in multivariate data has been applied to the problem of finding and localising sudden events in the strain data. For sharp discrimination of abrupt strain changes from slowly varying ones wavelet transform has been used. The proposed method has been successfully tested using known events recorded during construction of the bridge, and later effectively used for detection of anomalous post-construction events.
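
    One plausible reading of the pipeline above, per-channel wavelet details to isolate abrupt changes followed by a multivariate outlier score, is sketched below; the wavelet, the Mahalanobis-type score, the flagging threshold and the synthetic strain data are all assumptions rather than the exact published procedure.

    import numpy as np
    import pywt

    rng = np.random.default_rng(10)
    n, n_sensors = 512, 8
    strains = np.cumsum(rng.normal(scale=0.05, size=(n, n_sensors)), axis=0)  # slowly varying strains
    strains[300:, :3] += 2.0                                                  # abrupt event on three sensors

    # One-level DWT per channel; the detail coefficients respond to abrupt changes only
    details = np.column_stack([pywt.dwt(strains[:, j], "db4")[1] for j in range(n_sensors)])

    mean = details.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(details, rowvar=False))
    diff = details - mean
    scores = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)   # squared Mahalanobis distance per index

    threshold = np.percentile(scores, 99)
    print("flagged detail indices (roughly time/2):", np.where(scores > threshold)[0])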

  10. Timing and tempo of the Great Oxidation Event

    PubMed Central

    Chamberlain, Kevin R.; Bleeker, Wouter; Söderlund, Ulf; de Kock, Michiel O.; Larsson, Emilie R.; Bekker, Andrey

    2017-01-01

    The first significant buildup in atmospheric oxygen, the Great Oxidation Event (GOE), began in the early Paleoproterozoic in association with global glaciations and continued until the end of the Lomagundi carbon isotope excursion ca. 2,060 Ma. The exact timing of and relationships among these events are debated because of poor age constraints and contradictory stratigraphic correlations. Here, we show that the first Paleoproterozoic global glaciation and the onset of the GOE occurred between ca. 2,460 and 2,426 Ma, ∼100 My earlier than previously estimated, based on an age of 2,426 ± 3 Ma for Ongeluk Formation magmatism from the Kaapvaal Craton of southern Africa. This age helps define a key paleomagnetic pole that positions the Kaapvaal Craton at equatorial latitudes of 11° ± 6° at this time. Furthermore, the rise of atmospheric oxygen was not monotonic, but was instead characterized by oscillations, which together with climatic instabilities may have continued over the next ∼200 My until ≤2,250–2,240 Ma. Ongeluk Formation volcanism at ca. 2,426 Ma was part of a large igneous province (LIP) and represents a waning stage in the emplacement of several temporally discrete LIPs across a large low-latitude continental landmass. These LIPs played critical, albeit complex, roles in the rise of oxygen and in both initiating and terminating global glaciations. This series of events invites comparison with the Neoproterozoic oxygen increase and Sturtian Snowball Earth glaciation, which accompanied emplacement of LIPs across supercontinent Rodinia, also positioned at low latitude. PMID:28167763

  11. Boosted Multivariate Trees for Longitudinal Data

    PubMed Central

    Pande, Amol; Li, Liang; Rajeswaran, Jeevanantham; Ehrlinger, John; Kogalur, Udaya B.; Blackstone, Eugene H.; Ishwaran, Hemant

    2017-01-01

    Machine learning methods provide a powerful approach for analyzing longitudinal data in which repeated measurements are observed for a subject over time. We boost multivariate trees to fit a novel flexible semi-nonparametric marginal model for longitudinal data. In this model, features are assumed to be nonparametric, while feature-time interactions are modeled semi-nonparametrically utilizing P-splines with an estimated smoothing parameter. In order to avoid overfitting, we describe a relatively simple in-sample cross-validation method which can be used to estimate the optimal boosting iteration and which has the surprising added benefit of stabilizing certain parameter estimates. Our new multivariate tree boosting method is shown to be highly flexible, robust to covariance misspecification and unbalanced designs, and resistant to overfitting in high dimensions. Feature selection can be used to identify important features and feature-time interactions. An application to longitudinal data of forced 1-second lung expiratory volume (FEV1) for lung transplant patients identifies an important feature-time interaction and illustrates the ease with which our method can find complex relationships in longitudinal data. PMID:29249866

  12. Survival Outcomes and Effect of Early vs. Deferred cART Among HIV-Infected Patients Diagnosed at the Time of an AIDS-Defining Event: A Cohort Analysis

    PubMed Central

    Mussini, Cristina; Johnson, Margaret; d'Arminio Monforte, Antonella; Antinori, Andrea; Gill, M. John; Sighinolfi, Laura; Uberti-Foppa, Caterina; Borghi, Vanni; Sabin, Caroline

    2011-01-01

    Objectives We analyzed clinical progression among persons diagnosed with HIV at the time of an AIDS-defining event, and assessed the impact on outcome of timing of combined antiretroviral treatment (cART). Methods Retrospective, European and Canadian multicohort study. Patients were diagnosed with HIV from 1997–2004 and had clinical AIDS from 30 days before to 14 days after diagnosis. Clinical progression (new AIDS event, death) was described using Kaplan-Meier analysis stratifying by type of AIDS event. Factors associated with progression were identified with multivariable Cox regression. Progression rates were compared between those starting early (<30 days after AIDS event) or deferred (30–270 days after AIDS event) cART. Results The median (interquartile range) CD4 count and viral load (VL) at diagnosis of the 584 patients were 42 (16, 119) cells/µL and 5.2 (4.5, 5.7) log10 copies/mL. Clinical progression was observed in 165 (28.3%) patients. Older age, a higher VL at diagnosis, and a diagnosis of non-Hodgkin lymphoma (NHL) (vs. other AIDS events) were independently associated with disease progression. Of 366 patients with an opportunistic infection, 178 (48.6%) received early cART. There was no significant difference in clinical progression between those initiating cART early and those deferring treatment (adjusted hazard ratio 1.32 [95% confidence interval 0.87, 2.00], p = 0.20). Conclusions Older patients and patients with high VL or NHL at diagnosis had a worse outcome. Our data suggest that earlier initiation of cART may be beneficial among HIV-infected patients diagnosed with clinical AIDS in our setting. PMID:22043301

  13. Unchanged Levels of Soluble CD14 and IL-6 Over Time Predict Serious Non-AIDS Events in HIV-1-Infected People

    PubMed Central

    Sunil, Meena; Nigalye, Maitreyee; Somasunderam, Anoma; Martinez, Maria Laura; Yu, Xiaoying; Arduino, Roberto C.; Bell, Tanvir K.

    2016-01-01

    Abstract HIV-1-infected persons have increased risk of serious non-AIDS events (SNAEs) despite suppressive antiretroviral therapy. Increased circulating levels of soluble CD14 (sCD14), soluble CD163 (sCD163), and interleukin-6 (IL-6) at a single time point have been associated with SNAEs. However, whether changes in these biomarker levels predict SNAEs in HIV-1-infected persons is unknown. We hypothesized that greater decreases in inflammatory biomarkers would be associated with fewer SNAEs. We identified 39 patients with SNAEs, including major cardiovascular events, end stage renal disease, decompensated cirrhosis, non-AIDS-defining malignancies, and death of unknown cause, and age- and sex-matched HIV-1-infected controls. sCD14, sCD163, and IL-6 were measured at study enrollment (T1) and proximal to the event (T2) or equivalent duration in matched controls. Over ∼34 months, unchanged rather than decreasing levels of sCD14 and IL-6 predicted SNAEs. Older age and current illicit substance abuse, but not HCV coinfection, were associated with SNAEs. In a multivariate analysis, older age, illicit substance use, and unchanged IL-6 levels remained significantly associated with SNAEs. Thus, the trajectories of sCD14 and IL-6 levels predict SNAEs. Interventions to decrease illicit substance use may decrease the risk of SNAEs in HIV-1-infected persons. PMID:27344921

  14. Simulating recurrent event data with hazard functions defined on a total time scale.

    PubMed

    Jahn-Eimermacher, Antje; Ingel, Katharina; Ozga, Ann-Kathrin; Preussler, Stella; Binder, Harald

    2015-03-08

    In medical studies with recurrent event data a total time scale perspective is often needed to adequately reflect disease mechanisms. This means that the hazard process is defined on the time since some starting point, e.g. the beginning of some disease, in contrast to a gap time scale where the hazard process restarts after each event. While techniques such as the Andersen-Gill model have been developed for analyzing data from a total time perspective, techniques for the simulation of such data, e.g. for sample size planning, have not been investigated so far. We have derived a simulation algorithm covering the Andersen-Gill model that can be used for sample size planning in clinical trials as well as the investigation of modeling techniques. Specifically, we allow for fixed and/or random covariates and an arbitrary hazard function defined on a total time scale. Furthermore we take into account that individuals may be temporarily insusceptible to a recurrent incidence of the event. The methods are based on conditional distributions of the inter-event times conditional on the total time of the preceding event or study start. Closed form solutions are provided for common distributions. The derived methods have been implemented in a readily accessible R script. The proposed techniques are illustrated by planning the sample size for a clinical trial with complex recurrent event data. The required sample size is shown to be affected not only by censoring and intra-patient correlation, but also by the presence of risk-free intervals. This demonstrates the need for a simulation algorithm that particularly allows for complex study designs where no analytical sample size formulas might exist. The derived simulation algorithm is seen to be useful for the simulation of recurrent event data that follow an Andersen-Gill model. Next to the use of a total time scale, it allows for intra-patient correlation and risk-free intervals as are often observed in clinical trial data.
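
    A stripped-down version of the inversion idea behind such a simulation, assuming a Weibull cumulative hazard on the total time scale, a gamma frailty, and administrative censoring; covariates and the paper's risk-free intervals are omitted here for brevity:

```python
# The next event time solves z * (Lambda(t_next) - Lambda(t_prev)) = -log(U),
# i.e. each inter-event time is drawn conditional on the total time of the
# preceding event, as in a total-time-scale (Andersen-Gill type) model.
import numpy as np

rng = np.random.default_rng(1)

def cum_hazard(t, shape=1.3, scale=2.0):          # Weibull Lambda(t)
    return (t / scale) ** shape

def inv_cum_hazard(h, shape=1.3, scale=2.0):      # Lambda^{-1}(h)
    return scale * h ** (1.0 / shape)

def simulate_subject(frailty_var=0.5, censor=5.0):
    z = rng.gamma(1.0 / frailty_var, frailty_var)  # mean-1 gamma frailty
    times, t = [], 0.0
    while True:
        u = rng.uniform()
        t = inv_cum_hazard(cum_hazard(t) - np.log(u) / z)
        if t > censor:
            return times
        times.append(t)

events_per_subject = [len(simulate_subject()) for _ in range(1000)]
print("mean number of recurrent events:", np.mean(events_per_subject))
```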

  15. Multivariate statistical process control (MSPC) using Raman spectroscopy for in-line culture cell monitoring considering time-varying batches synchronized with correlation optimized warping (COW).

    PubMed

    Liu, Ya-Juan; André, Silvère; Saint Cristau, Lydia; Lagresle, Sylvain; Hannas, Zahia; Calvosa, Éric; Devos, Olivier; Duponchel, Ludovic

    2017-02-01

    Multivariate statistical process control (MSPC) is increasingly popular, driven by the challenge posed by large multivariate datasets from analytical instruments such as Raman spectroscopy for the monitoring of complex cell cultures in the biopharmaceutical industry. However, Raman spectroscopy for in-line monitoring often produces unsynchronized data sets, resulting in time-varying batches. Moreover, unsynchronized data sets are common in cell culture monitoring because spectroscopic measurements are generally recorded in an alternating way, with more than one optical probe connected in parallel to the same spectrometer. Synchronized batches are a prerequisite for the application of multivariate analyses such as multi-way principal component analysis (MPCA) for MSPC monitoring. Correlation optimized warping (COW) is a popular data alignment method with satisfactory performance; however, it had not previously been applied to synchronize the acquisition times of spectroscopic datasets in an MSPC application. In this paper we propose, for the first time, to use COW to synchronize batches of varying duration analyzed with Raman spectroscopy. In a second step, we developed MPCA models at different time intervals based on the normal operation condition (NOC) batches synchronized by COW. New batches are finally projected onto the corresponding MPCA model. We monitored the evolution of the batches using two multivariate control charts based on Hotelling's T² and Q. As the results illustrate, the MSPC model was able to identify abnormal operating conditions, including contaminated batches, which is of prime importance in cell culture monitoring. We showed that Raman-based MSPC monitoring can be used to diagnose batches deviating from the normal condition, with higher efficacy than traditional diagnosis, which would save time and money in the biopharmaceutical industry. Copyright © 2016 Elsevier B.V. All rights reserved.
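
    The monitoring step (after COW synchronisation, which is not shown) can be sketched with an ordinary PCA standing in for the batch-wise MPCA model; the component count, scaling, and control limits below are placeholders rather than the authors' settings:

```python
# Fit a PCA model on synchronised normal-operation-condition (NOC) batches and
# compute Hotelling's T^2 and Q (squared prediction error) for a new batch.
import numpy as np
from sklearn.decomposition import PCA

def fit_mspc(noc, n_components=3):
    """noc: (n_batches, n_features) unfolded, synchronised NOC spectra."""
    mean, std = noc.mean(axis=0), noc.std(axis=0) + 1e-12
    pca = PCA(n_components=n_components).fit((noc - mean) / std)
    return {"mean": mean, "std": std, "pca": pca}

def t2_and_q(model, x_new):
    z = (x_new - model["mean"]) / model["std"]
    scores = model["pca"].transform(z[None, :])[0]
    t2 = np.sum(scores ** 2 / model["pca"].explained_variance_)
    q = np.sum((z - model["pca"].inverse_transform(scores[None, :])[0]) ** 2)
    return t2, q   # compare against control limits derived from NOC batches
```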

  16. SQL Triggers Reacting on Time Events: An Extension Proposal

    NASA Astrophysics Data System (ADS)

    Behrend, Andreas; Dorau, Christian; Manthey, Rainer

    Being able to activate triggers when timepoints are reached or after time intervals have elapsed has been acknowledged by many authors as a valuable functionality of a DBMS. Recently, the interest in time-based triggers has been renewed in the context of data stream monitoring. However, until now SQL triggers react to data changes only, even though research proposals and prototypes have long supported several other event types, in particular time-based ones. We therefore propose a seamless extension of the SQL trigger concept by time-based triggers, focussing on semantic issues arising from such an extension.

  17. Multivariate space - time analysis of PRE-STORM precipitation

    NASA Technical Reports Server (NTRS)

    Polyak, Ilya; North, Gerald R.; Valdes, Juan B.

    1994-01-01

    This paper presents the methodologies and results of the multivariate modeling and two-dimensional spectral and correlation analysis of PRE-STORM rainfall gauge data. Estimated parameters of the models for the specific spatial averages clearly indicate the eastward and southeastward wave propagation of rainfall fluctuations. A relationship between the coefficients of the diffusion equation and the parameters of the stochastic model of rainfall fluctuations is derived that leads directly to the exclusive use of rainfall data to estimate advection speed (about 12 m/s) as well as other coefficients of the diffusion equation of the corresponding fields. The statistical methodology developed here can be used for confirmation of physical models by comparison of the corresponding second-moment statistics of the observed and simulated data, for generating multiple samples of any size, for solving the inverse problem of the hydrodynamic equations, and for application in some other areas of meteorological and climatological data analysis and modeling.

  18. Assessing Adverse Events of Postprostatectomy Radiation Therapy for Prostate Cancer: Evaluation of Outcomes in the Regione Emilia-Romagna, Italy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Showalter, Timothy N., E-mail: tns3b@virginia.edu; Hegarty, Sarah E.; Division of Biostatistics, Department of Pharmacology and Experimental Therapeutics, Sidney Kimmel Medical College at Thomas Jefferson University, Philadelphia, Pennsylvania

    Purpose: Although the likelihood of radiation-related adverse events influences treatment decisions regarding radiation therapy after prostatectomy for eligible patients, the data available to inform decisions are limited. This study was designed to evaluate the genitourinary, gastrointestinal, and sexual adverse events associated with postprostatectomy radiation therapy and to assess the influence of radiation timing on the risk of adverse events. Methods: The Regione Emilia-Romagna Italian Longitudinal Health Care Utilization Database was queried to identify a cohort of men who received radical prostatectomy for prostate cancer during 2003 to 2009, including patients who received postprostatectomy radiation therapy. Patients with prior radiation therapy were excluded. Outcome measures were genitourinary, gastrointestinal, and sexual adverse events after prostatectomy. Rates of adverse events were compared between the cohorts who did and did not receive postoperative radiation therapy. Multivariable Cox proportional hazards models were developed for each class of adverse events, including models with radiation therapy as a time-varying covariate. Results: A total of 9876 men were included in the analyses: 2176 (22%) who received radiation therapy and 7700 (78%) treated with prostatectomy alone. In multivariable Cox proportional hazards models, the additional exposure to radiation therapy after prostatectomy was associated with increased rates of gastrointestinal (rate ratio [RR] 1.81; 95% confidence interval [CI] 1.44-2.27; P<.001) and urinary nonincontinence events (RR 1.83; 95% CI 1.83-2.80; P<.001) but not urinary incontinence events or erectile dysfunction. The addition of the time from prostatectomy to radiation therapy interaction term was not significant for any of the adverse event outcomes (P>.1 for all outcomes). Conclusion: Radiation therapy after prostatectomy is associated with an increase in gastrointestinal and genitourinary adverse events.

  19. Near Optimal Event-Triggered Control of Nonlinear Discrete-Time Systems Using Neurodynamic Programming.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-09-01

    This paper presents an event-triggered near optimal control of uncertain nonlinear discrete-time systems. Event-driven neurodynamic programming (NDP) is utilized to design the control policy. A neural network (NN)-based identifier, with event-based state and input vectors, is utilized to learn the system dynamics. An actor-critic framework is used to learn the cost function and the optimal control input. The NN weights of the identifier, the critic, and the actor NNs are tuned aperiodically once every triggered instant. An adaptive event-trigger condition to decide the trigger instants is derived. Thus, a suitable number of events are generated to ensure a desired accuracy of approximation. A near optimal performance is achieved without using value and/or policy iterations. A detailed analysis of nontrivial inter-event times with an explicit formula to show the reduction in computation is also derived. The Lyapunov technique is used in conjunction with the event-trigger condition to guarantee the ultimate boundedness of the closed-loop system. The simulation results are included to verify the performance of the controller. The net result is the development of event-driven NDP.

  20. Hot spots of multivariate extreme anomalies in Earth observations

    NASA Astrophysics Data System (ADS)

    Flach, M.; Sippel, S.; Bodesheim, P.; Brenning, A.; Denzler, J.; Gans, F.; Guanche, Y.; Reichstein, M.; Rodner, E.; Mahecha, M. D.

    2016-12-01

    Anomalies in Earth observations might indicate data quality issues, extremes or the change of underlying processes within a highly multivariate system. Thus, considering the multivariate constellation of variables for extreme detection yields crucial additional information over conventional univariate approaches. We highlight areas in which multivariate extreme anomalies are more likely to occur, i.e. hot spots of extremes in global atmospheric Earth observations that impact the Biosphere. In addition, we present the year of the most unusual multivariate extreme between 2001 and 2013 and show that these coincide with well-known high-impact extremes. Technically speaking, we account for multivariate extremes by using three sophisticated algorithms adapted from computer science applications, namely an ensemble of the k-nearest neighbours mean distance, kernel density estimation and an approach based on recurrences. However, the impact of atmosphere extremes on the Biosphere might largely depend on what is considered to be normal, i.e. the shape of the mean seasonal cycle and its inter-annual variability. We identify regions with similar mean seasonality by means of dimensionality reduction in order to estimate in each region both the 'normal' variance and robust thresholds for detecting the extremes. In addition, we account for challenges like heteroscedasticity in Northern latitudes. Apart from hot spot areas, anomalies in the atmospheric time series that can only be detected by a multivariate approach, and not by a simple univariate one, are of particular interest. Such an anomalous constellation of atmosphere variables is of interest if it impacts the Biosphere. The multivariate constellation of such an anomalous part of a time series is shown in one case study, indicating that multivariate anomaly detection can provide novel insights into Earth observations.
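
    Two of the three detectors named above, the k-nearest-neighbour mean distance and kernel density estimation, can be sketched with scikit-learn as follows; the pre-processing, parameter values, and percentile threshold are assumptions rather than the study's configuration:

```python
# X is assumed to be a (time, variables) matrix of standardised, deseasonalised
# atmospheric anomalies; higher scores mean "more unusual" time steps.
import numpy as np
from sklearn.neighbors import NearestNeighbors, KernelDensity

def knn_score(X, k=10):
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    dist, _ = nn.kneighbors(X)
    return dist[:, 1:].mean(axis=1)          # drop the zero self-distance

def kde_score(X, bandwidth=0.5):
    kde = KernelDensity(bandwidth=bandwidth).fit(X)
    return -kde.score_samples(X)             # negative log-density

X = np.random.default_rng(0).normal(size=(500, 6))   # placeholder data
score = knn_score(X)
extreme_steps = np.where(score > np.percentile(score, 99))[0]
print("candidate multivariate extremes at time steps:", extreme_steps)
```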

  1. A diary after dinner: How the time of event recording influences later accessibility of diary events.

    PubMed

    Szőllősi, Ágnes; Keresztes, Attila; Conway, Martin A; Racsmány, Mihály

    2015-01-01

    Recording the events of a day in a diary may help improve their later accessibility. An interesting question is whether improvements in long-term accessibility will be greater if the diary is completed at the end of the day, or after a period of sleep, the following morning. We investigated this question using an internet-based diary method. On each of five days, participants (n = 109) recorded autobiographical memories for that day or for the previous day. Recording took place either in the morning or in the evening. Following a 30-day retention interval, the diary events were free-recalled. We found that participants who recorded their memories in the evening, before sleep, had the best memory performance. These results suggest that the time of reactivation and recording of recent autobiographical events has a significant effect on the later accessibility of those diary events. We discuss our results in the light of related findings that show a beneficial effect of reduced interference during sleep on memory consolidation and reconsolidation.

  2. Central sleep apnea detection from ECG-derived respiratory signals. Application of multivariate recurrence plot analysis.

    PubMed

    Maier, C; Dickhaus, H

    2010-01-01

    This study examines the suitability of recurrence plot analysis for the problem of central sleep apnea (CSA) detection and delineation from ECG-derived respiratory (EDR) signals. A parameter describing the average length of vertical line structures in recurrence plots is calculated at a time resolution of 1 s as 'instantaneous trapping time'. Threshold comparison of this parameter is used to detect ongoing CSA. In data from 26 patients (duration 208 h) we assessed sensitivity for detection of CSA and mixed apnea (MSA) events by comparing the results obtained from 8-channel Holter ECGs to the annotations (860 CSA, 480 MSA) of simultaneously registered polysomnograms. Multivariate combination of the EDR from different ECG leads improved the detection accuracy significantly. When all eight leads were considered, an average instantaneous vertical line length above 5 correctly identified 1126 of the 1340 events (sensitivity 84%) with a total number of 1881 positive detections. We conclude that recurrence plot analysis is a promising tool for detection and delineation of CSA epochs from EDR signals with high time resolution. Moreover, the approach is likewise applicable to directly measured respiratory signals.
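
    A rough sketch of the 'instantaneous trapping time' idea described above, for a one-dimensional EDR signal: build a recurrence matrix and average the vertical line lengths through each column. The recurrence threshold, minimum line length, and detection cut-off are placeholders, not the study's tuned values:

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix of a 1-D signal (use a norm for vector states)."""
    return (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

def instantaneous_trapping_time(R, min_len=2):
    """Mean vertical line length in each column of the recurrence matrix R."""
    n = R.shape[0]
    itt = np.zeros(n)
    for j in range(n):
        lengths, run = [], 0
        for v in R[:, j]:
            if v:
                run += 1
            else:
                if run >= min_len:
                    lengths.append(run)
                run = 0
        if run >= min_len:
            lengths.append(run)
        itt[j] = np.mean(lengths) if lengths else 0.0
    return itt   # e.g. itt > 5 would flag candidate central apnea epochs
```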

  3. Automated multivariate analysis of multi-sensor data submitted online: Real-time environmental monitoring.

    PubMed

    Eide, Ingvar; Westad, Frank

    2018-01-01

    A pilot study demonstrating real-time environmental monitoring with automated multivariate analysis of multi-sensor data submitted online has been performed at the cabled LoVe Ocean Observatory located at 258 m depth 20 km off the coast of Lofoten-Vesterålen, Norway. The major purpose was efficient monitoring of many variables simultaneously and early detection of changes and time-trends in the overall response pattern before changes were evident in individual variables. The pilot study was performed with 12 sensors from May 16 to August 31, 2015. The sensors provided data for chlorophyll, turbidity, conductivity, temperature (three sensors), salinity (calculated from temperature and conductivity), biomass at three different depth intervals (5-50, 50-120, 120-250 m), and current speed measured in two directions (east and north) using two sensors covering different depths with overlap. A total of 88 variables were monitored, 78 from the two current speed sensors. The time-resolution varied, thus the data had to be aligned to a common time resolution. After alignment, the data were interpreted using principal component analysis (PCA). Initially, a calibration model was established using data from May 16 to July 31. The data on current speed from two sensors were subject to two separate PCA models and the score vectors from these two models were combined with the other 10 variables in a multi-block PCA model. The observations from August were projected on the calibration model consecutively one at a time and the result was visualized in a score plot. Automated PCA of multi-sensor data submitted online is illustrated with an attached time-lapse video covering the relative short time period used in the pilot study. Methods for statistical validation, and warning and alarm limits are described. Redundant sensors enable sensor diagnostics and quality assurance. In a future perspective, the concept may be used in integrated environmental monitoring.

  4. gPhoton: Time-tagged GALEX photon events analysis tools

    NASA Astrophysics Data System (ADS)

    Million, Chase C.; Fleming, S. W.; Shiao, B.; Loyd, P.; Seibert, M.; Smith, M.

    2016-03-01

    Written in Python, gPhoton calibrates and sky-projects the ~1.1 trillion ultraviolet photon events detected by the microchannel plates on the Galaxy Evolution Explorer Spacecraft (GALEX), archives these events in a publicly accessible database at the Mikulski Archive for Space Telescopes (MAST), and provides tools for working with the database to extract scientific results, particularly over short time domains. The software includes a re-implementation of core functionality of the GALEX mission calibration pipeline to produce photon list files from raw spacecraft data as well as a suite of command line tools to generate calibrated light curves, images, and movies from the MAST database.

  5. Life Events: A Complex Role In The Timing Of Suicidal Behavior Among Depressed Patients

    PubMed Central

    Oquendo, Maria A.; Perez-Rodriguez, M. Mercedes; Poh, Ernest; Sullivan, Gregory; Burke, Ainsley K.; Sublette, M. Elizabeth; Mann, J. John; Galfalvy, Hanga

    2013-01-01

    Suicidal behavior is often conceptualized as a response to overwhelming stress. Our model posits that given a propensity for acting on suicidal urges, stressors such as life events or major depressive episodes (MDEs) determine the timing of suicidal acts. Depressed patients (n=415) were assessed prospectively for suicide attempts and suicide, life events and MDE over 2 years. Longitudinal data was divided into 1-month intervals characterized by MDE (yes/no), suicidal behavior (yes/no), and life event scores. Marginal logistic regression models were fit, with suicidal behavior as the response variable and MDE and life event score in either the same or previous month, respectively, as time-varying covariates. Among 7843 person-months, 33% had MDE and 73% had life events. MDE increased risk for suicidal behavior (OR=4.83, p< 0.0001). Life event scores were unrelated to the timing of suicidal behavior (OR=1.06 per 100 point increase, p=0.32), even during an MDE (OR=1.12, p=0.15). However, among those without Borderline Personality Disorders (BPD), both health and work related life events were key precipitants, as was recurrent MDE, with a 13-fold effect. The relationship of life events to suicidal behavior among those with BPD was more complex. Recurrent MDE was a robust precipitant for suicidal behavior, regardless of BPD comorbidity. The specific nature of life events is key to understanding the timing of suicidal behavior. Given unanticipated results regarding the role of BPD and study limitations, these findings require replication. Of note, that MDE, a treatable risk factor, strongly predicts suicidal behaviors is cause for hope. PMID:24126928

  6. On the estimation of intracluster correlation for time-to-event outcomes in cluster randomized trials.

    PubMed

    Kalia, Sumeet; Klar, Neil; Donner, Allan

    2016-12-30

    Cluster randomized trials (CRTs) involve the random assignment of intact social units rather than independent subjects to intervention groups. Time-to-event outcomes often are endpoints in CRTs. Analyses of such data need to account for the correlation among cluster members. The intracluster correlation coefficient (ICC) is used to assess the similarity among binary and continuous outcomes that belong to the same cluster. However, estimating the ICC in CRTs with time-to-event outcomes is a challenge because of the presence of censored observations. The literature suggests that the ICC may be estimated using either censoring indicators or observed event times. A simulation study explores the effect of administrative censoring on estimating the ICC. Results show that ICC estimators derived from censoring indicators or observed event times are negatively biased. Analytic work further supports these results. Observed event times are preferred to estimate the ICC under minimum frequency of administrative censoring. To our knowledge, the existing literature provides no practical guidance on the estimation of the ICC when a substantial amount of administrative censoring is present. The results from this study corroborate the need for further methodological research on estimating the ICC for correlated time-to-event outcomes. Copyright © 2016 John Wiley & Sons, Ltd.

  7. Meteorological factors and timing of the initiating event of human parturition

    NASA Astrophysics Data System (ADS)

    Hirsch, Emmet; Lim, Courtney; Dobrez, Deborah; Adams, Marci G.; Noble, William

    2011-03-01

    The aim of this study was to determine whether meteorological factors are associated with the timing of either onset of labor with intact membranes or rupture of membranes prior to labor—together referred to as 'the initiating event' of parturition. All patients delivering at Evanston Hospital after spontaneous labor or rupture of membranes at ≥20 weeks of gestation over a 6-month period were studied. Logistic regression models of the initiating event of parturition using clinical variables (maternal age, gestational age, parity, multiple gestation and intrauterine infection) with and without the addition of meteorological variables (barometric pressure, temperature and humidity) were compared. A total of 1,088 patients met the inclusion criteria. Gestational age, multiple gestation and chorioamnionitis were associated with timing of initiation of parturition (P < 0.01). The addition of meteorological to clinical variables generated a statistically significant improvement in prediction of the initiating event; however, the magnitude of this improvement was small (less than 2% difference in receiver-operating characteristic score). These observations held regardless of parity, fetal number and gestational age. Meteorological factors are associated with the timing of parturition, but the magnitude of this association is small.

  8. Spatio-temporal Event Classification using Time-series Kernel based Structured Sparsity

    PubMed Central

    Jeni, László A.; Lőrincz, András; Szabó, Zoltán; Cohn, Jeffrey F.; Kanade, Takeo

    2016-01-01

    In many behavioral domains, such as facial expression and gesture, sparse structure is prevalent. This sparsity would be well suited for event detection but for one problem. Features typically are confounded by alignment error in space and time. As a consequence, high-dimensional representations such as SIFT and Gabor features have been favored despite their much greater computational cost and potential loss of information. We propose a Kernel Structured Sparsity (KSS) method that can handle both the temporal alignment problem and the structured sparse reconstruction within a common framework, and it can rely on simple features. We characterize spatio-temporal events as time-series of motion patterns and by utilizing time-series kernels we apply standard structured-sparse coding techniques to tackle this important problem. We evaluated the KSS method using both gesture and facial expression datasets that include spontaneous behavior and differ in degree of difficulty and type of ground truth coding. KSS outperformed both sparse and non-sparse methods that utilize complex image features and their temporal extensions. In the case of early facial event classification KSS had 10% higher accuracy as measured by F1 score over kernel SVM methods. PMID:27830214

  9. Joint scale-change models for recurrent events and failure time.

    PubMed

    Xu, Gongjun; Chiou, Sy Han; Huang, Chiung-Yu; Wang, Mei-Cheng; Yan, Jun

    2017-01-01

    Recurrent event data arise frequently in various fields such as biomedical sciences, public health, engineering, and social sciences. In many instances, the observation of the recurrent event process can be stopped by the occurrence of a correlated failure event, such as treatment failure and death. In this article, we propose a joint scale-change model for the recurrent event process and the failure time, where a shared frailty variable is used to model the association between the two types of outcomes. In contrast to the popular Cox-type joint modeling approaches, the regression parameters in the proposed joint scale-change model have marginal interpretations. The proposed approach is robust in the sense that no parametric assumption is imposed on the distribution of the unobserved frailty and that we do not need the strong Poisson-type assumption for the recurrent event process. We establish consistency and asymptotic normality of the proposed semiparametric estimators under suitable regularity conditions. To estimate the corresponding variances of the estimators, we develop a computationally efficient resampling-based procedure. Simulation studies and an analysis of hospitalization data from the Danish Psychiatric Central Register illustrate the performance of the proposed method.

  10. Markov chains and semi-Markov models in time-to-event analysis.

    PubMed

    Abner, Erin L; Charnigo, Richard J; Kryscio, Richard J

    2013-10-25

    A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields.

  11. Markov chains and semi-Markov models in time-to-event analysis

    PubMed Central

    Abner, Erin L.; Charnigo, Richard J.; Kryscio, Richard J.

    2014-01-01

    A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields. PMID:24818062
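
    As a toy illustration of the Markov chain approach described in the two records above, a three-state discrete-time chain with an absorbing event of interest and an absorbing competing death can be iterated directly; the transition probabilities are invented for illustration only:

```python
import numpy as np

# States: 0 = alive and event-free, 1 = event of interest, 2 = competing death.
P = np.array([
    [0.90, 0.07, 0.03],   # alive -> alive / event / competing death
    [0.00, 1.00, 0.00],   # event is absorbing
    [0.00, 0.00, 1.00],   # competing death is absorbing
])

state = np.array([1.0, 0.0, 0.0])          # whole cohort starts event-free
for cycle in range(1, 11):                 # e.g. ten yearly cycles
    state = state @ P
    print(f"year {cycle}: event-free = {state[0]:.3f}, "
          f"cumulative incidence of event = {state[1]:.3f}")
```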

  12. Detecting synchronization clusters in multivariate time series via coarse-graining of Markov chains.

    PubMed

    Allefeld, Carsten; Bialonski, Stephan

    2007-12-01

    Synchronization cluster analysis is an approach to the detection of underlying structures in data sets of multivariate time series, starting from a matrix R of bivariate synchronization indices. A previous method utilized the eigenvectors of R for cluster identification, analogous to several recent attempts at group identification using eigenvectors of the correlation matrix. All of these approaches assumed a one-to-one correspondence of dominant eigenvectors and clusters, which has however been shown to be wrong in important cases. We clarify the usefulness of eigenvalue decomposition for synchronization cluster analysis by translating the problem into the language of stochastic processes, and derive an enhanced clustering method harnessing recent insights from the coarse-graining of finite-state Markov processes. We illustrate the operation of our method using a simulated system of coupled Lorenz oscillators, and we demonstrate its superior performance over the previous approach. Finally we investigate the question of robustness of the algorithm against small sample size, which is important with regard to field applications.

  13. Adaptive Event-Triggered Control Based on Heuristic Dynamic Programming for Nonlinear Discrete-Time Systems.

    PubMed

    Dong, Lu; Zhong, Xiangnan; Sun, Changyin; He, Haibo

    2017-07-01

    This paper presents the design of a novel adaptive event-triggered control method based on the heuristic dynamic programming (HDP) technique for nonlinear discrete-time systems with unknown system dynamics. In the proposed method, the control law is only updated when the event-triggered condition is violated. Compared with the periodic updates in the traditional adaptive dynamic programming (ADP) control, the proposed method can reduce the computation and transmission cost. An actor-critic framework is used to learn the optimal event-triggered control law and the value function. Furthermore, a model network is designed to estimate the system state vector. The main contribution of this paper is to design a new trigger threshold for discrete-time systems. A detailed Lyapunov stability analysis shows that our proposed event-triggered controller can asymptotically stabilize the discrete-time systems. Finally, we test our method on two different discrete-time systems, and the simulation results are included.

  14. Time scales of biogeochemical and organismal responses to individual precipitation events

    NASA Astrophysics Data System (ADS)

    von Fischer, J. C.; Angert, A. L.; Augustine, D. J.; Brown, C.; Dijkstra, F. A.; Derner, J. D.; Hufbauer, R. A.; Fierer, N.; Milchunas, D. G.; Moore, J. C.; Steltzer, H.; Wallenstein, M. D.

    2010-12-01

    In temperate grasslands, spatial and intra-annual variability in the activity of plants and microbes are structured by patterns in the precipitation regime. While the effects of total annual precipitation have been well-explored, the ecological dynamics associated with individual precipitation events have not. Rainfall events induce a short-term pulse of soil respiration that may or may not be followed by stimulation of plant photosynthetic activity and growth. Because the underlying heterotrophic and autotrophic responses are interactive, respond over unique timescales and are sensitive to precipitation magnitude, it remains difficult to predict the hydrologic effects on net CO2 exchange. To develop a better mechanistic understanding of these processes, we conducted a synthetic, multi-investigator experiment to characterize the ecosystem responses to rainfall events of different sizes. Our work was conducted on the Shortgrass Steppe (SGS) LTER site over 7 days in June 2009, using 1cm and 2cm rainfall events, with controls and each treatment replicated 5 times in 2m x 2m plots. Our observations revealed both expected responses of plant activity and soil respiration, and surprising patterns in microbial enzyme activity and soil fauna population densities. Coupled with observed dynamics in 15N partitioning and kinetics, our findings provide empirical timescales for the complex ecological interactions that underlie the ecosystem responses to rainfall events. These results can be used to inform a new generation of ecosystem simulation models to more explicitly consider the time lags and interactions of different functional groups.

  15. Time-to-first-event versus recurrent-event analysis: points to consider for selecting a meaningful analysis strategy in clinical trials with composite endpoints.

    PubMed

    Rauch, Geraldine; Kieser, Meinhard; Binder, Harald; Bayes-Genis, Antoni; Jahn-Eimermacher, Antje

    2018-05-01

    Composite endpoints combining several event types of clinical interest often define the primary efficacy outcome in cardiologic trials. They are commonly evaluated as time-to-first-event, thereby following the recommendations of regulatory agencies. However, to assess the patient's full disease burden and to identify preventive factors or interventions, subsequent events following the first one should be considered as well. This is especially important in cohort studies and RCTs with a long follow-up leading to a higher number of observed events per patient. So far, there exist no recommendations on which approach should be preferred. Recently, the Cardiovascular Round Table of the European Society of Cardiology indicated the need to investigate "how to interpret results if recurrent-event analysis results differ […] from time-to-first-event analysis" (Anker et al., Eur J Heart Fail 18:482-489, 2016). This work addresses this topic by means of a systematic simulation study. This paper compares two common analysis strategies for composite endpoints, differing with respect to the incorporation of recurrent events, for typical data scenarios motivated by a clinical trial. We show that the treatment effects estimated from a time-to-first-event analysis (Cox model) and a recurrent-event analysis (Andersen-Gill model) can systematically differ, particularly in cardiovascular trials. Moreover, we provide guidance on how to interpret these results and recommend points to consider for the choice of a meaningful analysis strategy. When planning trials with a composite endpoint, researchers and regulatory agencies should be aware that the model choice affects the estimated treatment effect and its interpretation.
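
    The contrast between the two strategies can be sketched with the lifelines library (an assumption; the paper compares the Cox and Andersen-Gill models but does not mandate software). The toy counting-process data below are invented, and in practice the recurrent-event fit should be paired with a robust (cluster) variance:

```python
import pandas as pd
from lifelines import CoxPHFitter, CoxTimeVaryingFitter

# One row per at-risk interval per subject, on the total time scale.
df = pd.DataFrame([
    # id, start, stop, event, treat
    (1, 0, 4, 1, 1), (1, 4, 9, 1, 1), (1, 9, 12, 0, 1),
    (2, 0, 6, 1, 0), (2, 6, 12, 0, 0),
    (3, 0, 12, 0, 1),
    (4, 0, 3, 1, 0), (4, 3, 7, 1, 0), (4, 7, 12, 0, 0),
], columns=["id", "start", "stop", "event", "treat"])

# (1) Time-to-first-event: follow each subject to the first event or censoring.
def to_first_event(g):
    hit = g[g["event"] == 1]
    row = hit.iloc[0] if len(hit) else g.iloc[-1]
    return pd.Series({"time": row["stop"], "event": row["event"],
                      "treat": row["treat"]})

first = df.sort_values("stop").groupby("id").apply(to_first_event)
CoxPHFitter().fit(first, duration_col="time", event_col="event").print_summary()

# (2) Recurrent events, Andersen-Gill style: all intervals enter the model.
CoxTimeVaryingFitter().fit(df, id_col="id", event_col="event",
                           start_col="start", stop_col="stop").print_summary()
```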

  16. Real-time monitoring of clinical processes using complex event processing and transition systems.

    PubMed

    Meinecke, Sebastian

    2014-01-01

    Dependencies between tasks in clinical processes are often complex and error-prone. Our aim is to describe a new approach for the automatic derivation of clinical events identified via the behaviour of IT systems using Complex Event Processing. Furthermore we map these events on transition systems to monitor crucial clinical processes in real-time for preventing and detecting erroneous situations.

  17. Biases in the subjective timing of perceptual events: Libet et al. (1983) revisited.

    PubMed

    Danquah, Adam N; Farrell, Martin J; O'Boyle, Donald J

    2008-09-01

    We report two experiments in which participants had to judge the time of occurrence of a stimulus relative to a clock. The experiments were based on the control condition used by Libet, Gleason, Wright, and Pearl [Libet, B., Gleason, C. A., Wright, E. W., & Pearl, D. K. (1983). Time of conscious intention to act in relation to onset of cerebral activities (readiness-potential): The unconscious initiation of a freely voluntary act. Brain 106, 623-642] to correct for any bias in the estimation of the time at which an endogenous event, the conscious intention to perform a movement, occurred. Participants' responses were affected systematically by the sensory modality of the stimulus and by the speed of the clock. Such findings demonstrate the variability in judging the time at which an exogenous event occurs and, by extension, suggest that such variability may also apply to the judging the time of occurrence of endogenous events. The reliability of participants' estimations of when they formed the conscious intention to perform a movement in Libet et al.'s (1983) study is therefore questionable.

  18. 1995 feels so close yet so far: the effect of event markers on subjective feelings of elapsed time.

    PubMed

    Zauberman, Gal; Levav, Jonathan; Diehl, Kristin; Bhargave, Rajesh

    2010-01-01

    Why does an event feel more or less distant than another event that occurred around the same time? Prior research suggests that characteristics of an event itself can affect the estimated date of its occurrence. Our work differs in that we focused on how characteristics of the time interval following an event affect people's feelings of elapsed time (i.e., their feelings of how distant an event seems). We argue that a time interval that is punctuated by a greater number of accessible intervening events related to the target event (event markers) will make the target event feel more distant, but that unrelated intervening events will not have this effect. In three studies, we found support for the systematic effect of event markers. The effect of markers was independent of other characteristics of the event, such as its memorability, emotionality, importance, and estimated date, a result suggesting that this effect is distinct from established dating biases.

  19. Forecasting electric vehicles sales with univariate and multivariate time series models: The case of China.

    PubMed

    Zhang, Yong; Zhong, Miner; Geng, Nana; Jiang, Yunjian

    2017-01-01

    The market demand for electric vehicles (EVs) has increased in recent years. Suitable models are necessary to understand and forecast EV sales. This study presents singular spectrum analysis (SSA) as a univariate time-series model and a vector autoregressive (VAR) model as a multivariate model. Empirical results suggest that SSA satisfactorily indicates the evolving trend and provides reasonable results. The VAR model, which incorporated exogenous parameters related to the market on a monthly basis, can significantly improve the prediction accuracy. The EV sales in China, which are categorized into battery and plug-in EVs, are predicted in both the short term (up to December 2017) and the long term (up to 2020), providing statistical evidence of the growth of the Chinese EV industry.

  20. Forecasting electric vehicles sales with univariate and multivariate time series models: The case of China

    PubMed Central

    Zhang, Yong; Zhong, Miner; Geng, Nana; Jiang, Yunjian

    2017-01-01

    The market demand for electric vehicles (EVs) has increased in recent years. Suitable models are necessary to understand and forecast EV sales. This study presents singular spectrum analysis (SSA) as a univariate time-series model and a vector autoregressive (VAR) model as a multivariate model. Empirical results suggest that SSA satisfactorily indicates the evolving trend and provides reasonable results. The VAR model, which incorporated exogenous parameters related to the market on a monthly basis, can significantly improve the prediction accuracy. The EV sales in China, which are categorized into battery and plug-in EVs, are predicted in both the short term (up to December 2017) and the long term (up to 2020), providing statistical evidence of the growth of the Chinese EV industry. PMID:28459872
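
    A minimal VAR forecasting sketch with statsmodels, using synthetic monthly series in place of the actual sales and exogenous market variables (which are not reproduced in the abstract):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 60                                    # five years of monthly data
data = pd.DataFrame({
    "ev_sales":       np.cumsum(rng.normal(100, 20, n)),
    "charging_piles": np.cumsum(rng.normal(80, 15, n)),
    "oil_price":      100 + np.cumsum(rng.normal(0, 2, n)),
})

fitted = VAR(data).fit(2)                 # fixed lag order of 2 for the sketch
forecast = fitted.forecast(data.values[-fitted.k_ar:], steps=12)
print(pd.DataFrame(forecast, columns=data.columns).round(1))
```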

  1. Stochastic Generation of Spatiotemporal Rainfall Events for Flood Risk Assessment

    NASA Astrophysics Data System (ADS)

    Diederen, D.; Liu, Y.; Gouldby, B.; Diermanse, F.

    2017-12-01

    Current flood risk analyses that only consider peaks of hydrometeorological forcing variables have limitations regarding their representation of reality. Simplistic assumptions regarding antecedent conditions are required, often different sources of flooding are considered in isolation, and the complex temporal and spatial evolution of the events is not considered. Mid-latitude storms, governed by large scale climatic conditions, often exhibit a high degree of temporal dependency, for example. For sustainable flood risk management, that accounts appropriately for climate change, it is desirable for flood risk analyses to reflect reality more appropriately. Analysis of risk mitigation measures and comparison of their relative performance is therefore likely to be more robust and lead to improved solutions. We provide a new framework for the provision of boundary conditions to flood risk analyses that more appropriately reflects reality. The boundary conditions capture the temporal dependencies of complex storms whilst preserving the extreme values and associated spatial dependencies. We demonstrate the application of this framework to generate a synthetic rainfall events time series boundary condition set from reanalysis rainfall data (CFSR) on the continental scale. We define spatiotemporal clusters of rainfall as events, extract hydrological parameters for each event, generate synthetic parameter sets with a multivariate distribution with a focus on the joint tail probability [Heffernan and Tawn, 2004], and finally create synthetic events from the generated synthetic parameters. We highlight the stochastic integration of (a) spatiotemporal features, e.g. event occurrence intensity over space-time, or time to previous event, which we use for the spatial placement and sequencing of the synthetic events, and (b) value-specific parameters, e.g. peak intensity and event extent. We contrast this to more traditional approaches to highlight the significant improvements in

  2. Multivariate Cluster Analysis.

    ERIC Educational Resources Information Center

    McRae, Douglas J.

    Procedures for grouping students into homogeneous subsets have long interested educational researchers. The research reported in this paper is an investigation of a set of objective grouping procedures based on multivariate analysis considerations. Four multivariate functions that might serve as criteria for adequate grouping are given and…

  3. WAITING TIME DISTRIBUTION OF SOLAR ENERGETIC PARTICLE EVENTS MODELED WITH A NON-STATIONARY POISSON PROCESS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, C.; Su, W.; Fang, C.

    2014-09-10

    We present a study of the waiting time distributions (WTDs) of solar energetic particle (SEP) events observed with the spacecraft WIND and GOES. The WTDs of both solar electron events (SEEs) and solar proton events (SPEs) display a power-law tail of ∼Δt^(−γ). The SEEs display a broken power-law WTD. The power-law index is γ₁ = 0.99 for short waiting times (<70 hr) and γ₂ = 1.92 for large waiting times (>100 hr). The break of the WTD of SEEs is probably due to the modulation of the corotating interaction regions. The power-law index γ ∼ 1.82 is derived for the WTD of the SPEs, which is consistent with the WTD of type II radio bursts, indicating a close relationship between the shock wave and the production of energetic protons. The WTDs of SEP events can be modeled with a non-stationary Poisson process, which was proposed to understand the waiting time statistics of solar flares. We generalize the method and find that, if the SEP event rate λ = 1/Δt varies as the time distribution of event rate f(λ) = Aλ^(−α)exp(−βλ), the time-dependent Poisson distribution can produce a power-law tail WTD of ∼Δt^(α−3), where 0 ≤ α < 2.
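
    The closing claim can be reconstructed from the standard waiting-time integral for a piecewise-stationary (non-stationary) Poisson process; the derivation below is a sketch consistent with the stated form of f(λ), not a quotation from the paper:

```latex
% Waiting-time distribution of a non-stationary Poisson process with rate
% distribution f(\lambda):
\[
  P(\Delta t) \;=\;
  \frac{\int_{0}^{\infty} f(\lambda)\,\lambda^{2}\,e^{-\lambda\,\Delta t}\,d\lambda}
       {\int_{0}^{\infty} f(\lambda)\,\lambda\,d\lambda},
  \qquad
  f(\lambda) \;=\; A\,\lambda^{-\alpha} e^{-\beta\lambda}.
\]
% The numerator reduces to a gamma integral:
\[
  \int_{0}^{\infty} \lambda^{\,2-\alpha}\,e^{-\lambda(\Delta t+\beta)}\,d\lambda
  \;=\; \Gamma(3-\alpha)\,(\Delta t+\beta)^{\alpha-3}
  \;\;\propto\;\; \Delta t^{\,\alpha-3}
  \quad (\Delta t \gg \beta,\ \alpha < 3),
\]
% reproducing the quoted power-law tail for the stated range 0 <= alpha < 2.
```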

  4. Schizophrenia Spectrum Disorders Show Reduced Specificity and Less Positive Events in Mental Time Travel

    PubMed Central

    Chen, Xing-jie; Liu, Lu-lu; Cui, Ji-fang; Wang, Ya; Chen, An-tao; Li, Feng-hua; Wang, Wei-hong; Zheng, Han-feng; Gan, Ming-yuan; Li, Chun-qiu; Shum, David H. K.; Chan, Raymond C. K.

    2016-01-01

    Mental time travel refers to the ability to recall past events and to imagine possible future events. Schizophrenia (SCZ) patients have problems in remembering specific personal experiences in the past and imagining what will happen in the future. This study aimed to examine episodic past and future thinking in SCZ spectrum disorders including SCZ patients and individuals with schizotypal personality disorder (SPD) proneness who are at risk for developing SCZ. Thirty-two SCZ patients, 30 SPD proneness individuals, and 33 healthy controls participated in the study. The Sentence Completion for Events from the Past Test (SCEPT) and the Sentence Completion for Events in the Future Test were used to measure past and future thinking abilities. Results showed that SCZ patients had significantly reduced specificity in recalling past and imagining future events; they generated a smaller proportion of specific and extended events compared to healthy controls. SPD proneness individuals differed from healthy controls only in generating fewer extended events. The reduced specificity was mainly manifested in imagining future events. Both SCZ patients and SPD proneness individuals generated fewer positive events than controls. These results suggest that mental time travel is impaired across SCZ spectrum disorders, with implications for understanding their cognitive and emotional deficits. PMID:27507958

  5. Joint pattern of seasonal hydrological droughts and floods alternation in China's Huai River Basin using the multivariate L-moments

    NASA Astrophysics Data System (ADS)

    Wu, ShaoFei; Zhang, Xiang; She, DunXian

    2017-06-01

    Under the current condition of climate change, droughts and floods occur more frequently, and events in which flooding occurs after a prolonged drought or a drought occurs after an extreme flood may have a more severe impact on natural systems and human lives. This challenges the traditional approach wherein droughts and floods are considered separately, which may largely underestimate the risk of the disasters. In our study, the sudden alternation of droughts and flood events (ADFEs) between adjacent seasons is studied using the multivariate L-moments theory and the bivariate copula functions in the Huai River Basin (HRB) of China with monthly streamflow data at 32 hydrological stations from 1956 to 2012. The dry and wet conditions are characterized by the standardized streamflow index (SSI) at a 3-month time scale. The results show that: (1) The summer streamflow makes the largest contribution to the annual streamflow, followed by the autumn streamflow and spring streamflow. (2) The entire study area can be divided into five homogeneous sub-regions using the multivariate regional homogeneity test. The generalized logistic distribution (GLO) and log-normal distribution (LN3) are acceptable to be the optimal marginal distributions under most conditions, and the Frank copula is more appropriate for spring-summer and summer-autumn SSI series. Continuous flood events dominate at most sites both in spring-summer and summer-autumn (with an average frequency of 13.78% and 17.06%, respectively), while continuous drought events come second (with an average frequency of 11.27% and 13.79%, respectively). Moreover, seasonal ADFEs most probably occurred near the mainstream of HRB, and drought and flood events are more likely to occur in summer-autumn than in spring-summer.

  6. Empirical performance of the multivariate normal universal portfolio

    NASA Astrophysics Data System (ADS)

    Tan, Choon Peng; Pang, Sook Theng

    2013-09-01

    Universal portfolios generated by the multivariate normal distribution are studied with emphasis on the case where variables are dependent, namely, the covariance matrix is not diagonal. The moving-order multivariate normal universal portfolio requires very long implementation time and large computer memory in its implementation. With the objective of reducing memory and implementation time, the finite-order universal portfolio is introduced. Some stock-price data sets are selected from the local stock exchange and the finite-order universal portfolio is run on the data sets, for small finite order. Empirically, it is shown that the portfolio can outperform the moving-order Dirichlet universal portfolio of Cover and Ordentlich[2] for certain parameters in the selected data sets.

  7. Event-driven time-optimal control for a class of discontinuous bioreactors.

    PubMed

    Moreno, Jaime A; Betancur, Manuel J; Buitrón, Germán; Moreno-Andrade, Iván

    2006-07-05

    Discontinuous bioreactors may be further optimized for processing inhibitory substrates using a convenient fed-batch mode. To do so, the filling rate must be controlled in such a way as to push the reaction rate to its maximum value, by increasing the substrate concentration just up to the point where inhibition begins. However, an exact optimal controller requires measuring several variables (e.g., substrate concentrations in the feed and in the tank) and also good model knowledge (e.g., yield and kinetic parameters), requirements rarely satisfied in real applications. An environmentally important case, that exemplifies all these handicaps, is toxicant wastewater treatment. There the lack of online practical pollutant sensors may allow unforeseen high shock loads to be fed to the bioreactor, causing biomass inhibition that slows down the treatment process and, in extreme cases, even renders the biological process useless. In this work an event-driven time-optimal control (ED-TOC) is proposed to circumvent these limitations. We show how to detect a "there is inhibition" event by using some computable function of the available measurements. This event drives the ED-TOC to stop the filling. Later, by detecting the symmetric event, "there is no inhibition," the ED-TOC may restart the filling. A fill-react cycling then maintains the process safely hovering near its maximum reaction rate, allowing a robust and practically time-optimal operation of the bioreactor. An experimental case study of a wastewater treatment process application is presented. There the dissolved oxygen concentration was used to detect the events needed to drive the controller. (c) 2006 Wiley Periodicals, Inc.

  8. Adaptive Neural Network-Based Event-Triggered Control of Single-Input Single-Output Nonlinear Discrete-Time Systems.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-01-01

    This paper presents a novel adaptive neural network (NN) control of single-input and single-output uncertain nonlinear discrete-time systems under event sampled NN inputs. In this control scheme, the feedback signals are transmitted, and the NN weights are tuned in an aperiodic manner at the event sampled instants. After reviewing the NN approximation property with event sampled inputs, an adaptive state estimator (SE), consisting of linearly parameterized NNs, is utilized to approximate the unknown system dynamics in an event sampled context. The SE is viewed as a model and its approximated dynamics and the state vector, during any two events, are utilized for the event-triggered controller design. An adaptive event-trigger condition is derived by using both the estimated NN weights and a dead-zone operator to determine the event sampling instants. This condition both facilitates the NN approximation and reduces the transmission of feedback signals. The ultimate boundedness of both the NN weight estimation error and the system state vector is demonstrated through the Lyapunov approach. As expected, during an initial online learning phase, events are observed more frequently. Over time with the convergence of the NN weights, the inter-event times increase, thereby lowering the number of triggered events. These claims are illustrated through the simulation results.

  9. The time to remember: Temporal compression and duration judgements in memory for real-life events.

    PubMed

    Jeunehomme, Olivier; D'Argembeau, Arnaud

    2018-05-01

    Recent studies suggest that the continuous flow of information that constitutes daily life events is temporally compressed in episodic memory, yet the characteristics and determinants of this compression mechanism remain unclear. This study examined this question using an experimental paradigm incorporating wearable camera technology. Participants experienced a series of real-life events and were later asked to mentally replay various event sequences that were cued by pictures taken during the original events. Estimates of temporal compression (the ratio of the time needed to mentally re-experience an event to the actual event duration) showed that events were replayed, on average, about eight times faster than the original experiences. This compression mechanism seemed to operate by representing events as a succession of moments or slices of prior experience separated by temporal discontinuities. Importantly, however, rates of temporal compression were not constant and were lower for events involving goal-directed actions. The results also showed that the perceived duration of events increased with the density of recalled moments of prior experience. Taken together, these data extend our understanding of the mechanisms underlying the temporal compression and perceived duration of real-life events in episodic memory.

  10. Hazard ratio estimation and inference in clinical trials with many tied event times.

    PubMed

    Mehrotra, Devan V; Zhang, Yiwei

    2018-06-13

    The medical literature contains numerous examples of randomized clinical trials with time-to-event endpoints in which large numbers of events accrued over relatively short follow-up periods, resulting in many tied event times. A generally common feature across such examples was that the logrank test was used for hypothesis testing and the Cox proportional hazards model was used for hazard ratio estimation. We caution that this common practice is particularly risky in the setting of many tied event times for two reasons. First, the estimator of the hazard ratio can be severely biased if the Breslow tie-handling approximation for the Cox model (the default in SAS and Stata software) is used. Second, the 95% confidence interval for the hazard ratio can include one even when the corresponding logrank test p-value is less than 0.05. To help establish a better practice, with applicability for both superiority and noninferiority trials, we use theory and simulations to contrast Wald and score tests based on well-known tie-handling approximations for the Cox model. Our recommendation is to report the Wald test p-value and corresponding confidence interval based on the Efron approximation. The recommended test is essentially as powerful as the logrank test, the accompanying point and interval estimates of the hazard ratio have excellent statistical properties even in settings with many tied event times, inferential alignment between the p-value and confidence interval is guaranteed, and implementation is straightforward using commonly used software. Copyright © 2018 John Wiley & Sons, Ltd.
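
    A small illustration of the recommended practice (Wald inference for the hazard ratio with the Efron tie-handling approximation), sketched here with statsmodels' PHReg on a hypothetical toy dataset with many tied event times; the data and the 1.96 normal quantile are illustrative assumptions.

    ```python
    import numpy as np
    from statsmodels.duration.hazard_regression import PHReg

    # hypothetical trial data with many tied event times (time, event indicator, treatment arm)
    time   = np.array([4, 4, 4, 4, 8, 8, 8, 12, 12, 16, 16, 20])
    status = np.array([1, 1, 1, 0, 1, 1, 0, 1,  1,  1,  0,  0])
    arm    = np.array([1, 0, 1, 1, 0, 1, 0, 0,  1,  0,  1,  0]).reshape(-1, 1)

    res = PHReg(time, arm, status=status, ties="efron").fit()   # Cox model, Efron ties
    hr = np.exp(res.params[0])
    ci = np.exp(res.params[0] + np.array([-1.96, 1.96]) * res.bse[0])   # Wald 95% CI
    print(f"HR = {hr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), Wald p = {res.pvalues[0]:.3f}")
    ```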

  11. Time until diagnosis of clinical events with different remote monitoring systems in Implantable Cardioverter-Defibrillator patients.

    PubMed

    Söth-Hansen, Malene; Witt, Christoffer Tobias; Rasmussen, Mathis; Kristensen, Jens; Gerdes, Christian; Nielsen, Jens Cosedis

    2018-05-24

    Remote monitoring (RM) is an established technology integrated into routine follow-up of patients with implantable cardioverter-defibrillator (ICD). Current RM systems differ according to transmission frequency and alert definition. We aimed to compare time difference between detection and acknowledgement of clinically relevant events between four RM systems. We analyzed time delay between detection of ventricular arrhythmic and technical events by the ICD and acknowledgement by hospital staff in 1,802 consecutive patients followed with RM during September 2014 - August 2016. Devices from Biotronik (BIO, n=374), Boston Scientific (BSC, n=196), Medtronic (MDT, n=468) and St Jude Medical (SJM, n=764) were included. We identified all events from RM webpages and their acknowledgement with RM or at in-clinic follow-up. Events occurring during weekends were excluded. We included 3,472 events. Proportion of events acknowledged within 24 hours was 72%, 23%, 18% and 65% with BIO, BSC, MDT and SJM, respectively, with median times of 13, 222, 163 and 18 hours from detection to acknowledgement (p<0.001 for both comparisons between manufacturers). Including only events transmitted as alerts by RM, 72%, 68%, 61% and 65% for BIO, BSC, MDT and SJM, respectively, were acknowledged within 24 hours. Variation in time to acknowledgement of ventricular tachyarrhythmia episodes not treated with shock therapy was the primary cause for the difference between manufacturers. Significant and clinically relevant differences in time delay from event detection to acknowledgement exist between RM systems. Varying definitions of which events RM transmits as alerts are important for the differences observed. Copyright © 2018. Published by Elsevier Inc.

  12. Functional MRI and Multivariate Autoregressive Models

    PubMed Central

    Rogers, Baxter P.; Katwal, Santosh B.; Morgan, Victoria L.; Asplund, Christopher L.; Gore, John C.

    2010-01-01

    Connectivity refers to the relationships that exist between different regions of the brain. In the context of functional magnetic resonance imaging (fMRI), it implies a quantifiable relationship between hemodynamic signals from different regions. One aspect of this relationship is the existence of small timing differences in the signals in different regions. Delays of 100 ms or less may be measured with fMRI, and these may reflect important aspects of the manner in which brain circuits respond as well as the overall functional organization of the brain. The multivariate autoregressive time series model has features to recommend it for measuring these delays, and is straightforward to apply to hemodynamic data. In this review, we describe the current usage of the multivariate autoregressive model for fMRI, discuss the issues that arise when it is applied to hemodynamic time series, and consider several extensions. Connectivity measures like Granger causality that are based on the autoregressive model do not always reflect true neuronal connectivity; however, we conclude that careful experimental design could make this methodology quite useful in extending the information obtainable using fMRI. PMID:20444566
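
    A minimal sketch of fitting a multivariate autoregressive model to two simulated "regional" time series and running a Granger causality test with statsmodels; the simulated coupling, noise levels, and lag settings are assumptions for illustration only.

    ```python
    import numpy as np
    from statsmodels.tsa.api import VAR

    # hypothetical two-region hemodynamic series in which region 0 leads region 1
    rng = np.random.default_rng(0)
    T = 400
    x = np.zeros((T, 2))
    for t in range(1, T):
        x[t, 0] = 0.5 * x[t - 1, 0] + rng.normal(scale=0.1)
        x[t, 1] = 0.4 * x[t - 1, 1] + 0.3 * x[t - 1, 0] + rng.normal(scale=0.1)

    res = VAR(x).fit(maxlags=5, ic="aic")               # multivariate autoregressive fit
    gc = res.test_causality(caused=1, causing=0, kind="f")
    print(res.k_ar, round(gc.pvalue, 4))                 # chosen lag order, Granger p-value
    ```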

  13. Event-Triggered Adaptive Dynamic Programming for Continuous-Time Systems With Control Constraints.

    PubMed

    Dong, Lu; Zhong, Xiangnan; Sun, Changyin; He, Haibo

    2016-08-31

    In this paper, an event-triggered near optimal control structure is developed for nonlinear continuous-time systems with control constraints. Due to the saturating actuators, a nonquadratic cost function is introduced and the Hamilton-Jacobi-Bellman (HJB) equation for constrained nonlinear continuous-time systems is formulated. In order to solve the HJB equation, an actor-critic framework is presented. The critic network is used to approximate the cost function and the action network is used to estimate the optimal control law. In addition, in the proposed method, the control signal is transmitted in an aperiodic manner to reduce the computational and the transmission cost. Both the networks are only updated at the trigger instants decided by the event-triggered condition. Detailed Lyapunov analysis is provided to guarantee that the closed-loop event-triggered system is ultimately bounded. Three case studies are used to demonstrate the effectiveness of the proposed method.

  14. Estimating the effect of a rare time-dependent treatment on the recurrent event rate.

    PubMed

    Smith, Abigail R; Zhu, Danting; Goodrich, Nathan P; Merion, Robert M; Schaubel, Douglas E

    2018-05-30

    In many observational studies, the objective is to estimate the effect of treatment or state-change on the recurrent event rate. If treatment is assigned after the start of follow-up, traditional methods (eg, adjustment for baseline-only covariates or fully conditional adjustment for time-dependent covariates) may give biased results. We propose a two-stage modeling approach using the method of sequential stratification to accurately estimate the effect of a time-dependent treatment on the recurrent event rate. At the first stage, we estimate the pretreatment recurrent event trajectory using a proportional rates model censored at the time of treatment. Prognostic scores are estimated from the linear predictor of this model and used to match treated patients to as yet untreated controls based on prognostic score at the time of treatment for the index patient. The final model is stratified on matched sets and compares the posttreatment recurrent event rate to the recurrent event rate of the matched controls. We demonstrate through simulation that bias due to dependent censoring is negligible, provided the treatment frequency is low, and we investigate a threshold at which correction for dependent censoring is needed. The method is applied to liver transplant (LT), where we estimate the effect of development of post-LT End Stage Renal Disease (ESRD) on rate of days hospitalized. Copyright © 2018 John Wiley & Sons, Ltd.

  15. Seizure-Onset Mapping Based on Time-Variant Multivariate Functional Connectivity Analysis of High-Dimensional Intracranial EEG: A Kalman Filter Approach.

    PubMed

    Lie, Octavian V; van Mierlo, Pieter

    2017-01-01

    The visual interpretation of intracranial EEG (iEEG) is the standard method used in complex epilepsy surgery cases to map the regions of seizure onset targeted for resection. Still, visual iEEG analysis is labor-intensive and biased due to interpreter dependency. Multivariate parametric functional connectivity measures using adaptive autoregressive (AR) modeling of the iEEG signals based on the Kalman filter algorithm have been used successfully to localize the electrographic seizure onsets. Due to their high computational cost, these methods have been applied to a limited number of iEEG time-series (<60). The aim of this study was to test two Kalman filter implementations, a well-known multivariate adaptive AR model (Arnold et al. 1998) and a simplified, computationally efficient derivation of it, for their potential application to connectivity analysis of high-dimensional (up to 192 channels) iEEG data. When used on simulated seizures together with a multivariate connectivity estimator, the partial directed coherence, the two AR models were compared for their ability to reconstitute the designed seizure signal connections from noisy data. Next, focal seizures from iEEG recordings (73-113 channels) in three patients rendered seizure-free after surgery were mapped with the outdegree, a graph-theory index of outward directed connectivity. Simulation results indicated high levels of mapping accuracy for the two models in the presence of low-to-moderate noise cross-correlation. Accordingly, both AR models correctly mapped the real seizure onset to the resection volume. This study supports the possibility of conducting fully data-driven multivariate connectivity estimations on high-dimensional iEEG datasets using the Kalman filter approach.
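
    A simplified sketch of the Kalman filter idea behind adaptive AR modelling: here a single time-varying AR(1) coefficient is tracked for one channel with a random-walk state model. The full multivariate implementations (and the partial directed coherence computed from them) extend this to matrices of coefficients across all channels; the noise variances q and r are illustrative.

    ```python
    import numpy as np

    def kalman_tvar1(y, q=1e-4, r=1e-2):
        """Track a time-varying AR(1) coefficient a_t:
        state model  a_t = a_{t-1} + w_t,  observation  y_t = a_t * y_{t-1} + v_t."""
        a, p = 0.0, 1.0
        coeffs = np.zeros(len(y))
        for t in range(1, len(y)):
            p = p + q                          # predict step (random-walk state)
            h = y[t - 1]                       # observation "matrix" is the previous sample
            k = p * h / (h * p * h + r)        # Kalman gain
            a = a + k * (y[t] - h * a)         # update with the prediction error
            p = (1.0 - k * h) * p
            coeffs[t] = a
        return coeffs
    ```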

  16. Random-Effects Meta-Analysis of Time-to-Event Data Using the Expectation-Maximisation Algorithm and Shrinkage Estimators

    ERIC Educational Resources Information Center

    Simmonds, Mark C.; Higgins, Julian P. T.; Stewart, Lesley A.

    2013-01-01

    Meta-analysis of time-to-event data has proved difficult in the past because consistent summary statistics often cannot be extracted from published results. The use of individual patient data allows for the re-analysis of each study in a consistent fashion and thus makes meta-analysis of time-to-event data feasible. Time-to-event data can be…

  17. Generalised synthesis of space-time variability in flood response: Dynamics of flood event types

    NASA Astrophysics Data System (ADS)

    Viglione, Alberto; Battista Chirico, Giovanni; Komma, Jürgen; Woods, Ross; Borga, Marco; Blöschl, Günter

    2010-05-01

    An analytical framework is used to characterise five flood events of different types in the Kamp area in Austria: one long-rain event, two short-rain events, one rain-on-snow event and one snowmelt event. Specifically, the framework quantifies the contributions of the space-time variability of rainfall/snowmelt, runoff coefficient, hillslope and channel routing to the flood runoff volume and the delay and spread of the resulting hydrograph. The results indicate that the components obtained by the framework clearly reflect the individual processes which characterise the event types. For the short-rain events, temporal, spatial and movement components can all be important in runoff generation and routing, which would be expected because of their local nature in time and, particularly, in space. For the long-rain event, the temporal components tend to be more important for runoff generation, because of the more uniform spatial coverage of rainfall, while for routing the spatial distribution of the produced runoff, which is not uniform, is also important. For the rain-on-snow and snowmelt events, the spatio-temporal variability terms typically do not play much role in runoff generation and the spread of the hydrograph is mainly due to the duration of the event. As an outcome of the framework, a dimensionless response number is proposed that represents the joint effect of runoff coefficient and hydrograph peakedness and captures the absolute magnitudes of the observed flood peaks.

  18. Estimating the decomposition of predictive information in multivariate systems

    NASA Astrophysics Data System (ADS)

    Faes, Luca; Kugiumtzis, Dimitris; Nollo, Giandomenico; Jurysta, Fabrice; Marinazzo, Daniele

    2015-03-01

    In the study of complex systems from observed multivariate time series, the evolution of one system of interest can be explained in terms of the information storage of that system and the information transfer from other interacting systems. We present a framework for the model-free estimation of information storage and information transfer computed as the terms composing the predictive information about the target of a multivariate dynamical process. The approach tackles the curse of dimensionality employing a nonuniform embedding scheme that selects progressively, among the past components of the multivariate process, only those that contribute most, in terms of conditional mutual information, to the present target process. Moreover, it computes all information-theoretic quantities using a nearest-neighbor technique designed to compensate the bias due to the different dimensionality of individual entropy terms. The resulting estimators of prediction entropy, storage entropy, transfer entropy, and partial transfer entropy are tested on simulations of coupled linear stochastic and nonlinear deterministic dynamic processes, demonstrating the superiority of the proposed approach over the traditional estimators based on uniform embedding. The framework is then applied to multivariate physiologic time series, resulting in physiologically well-interpretable information decompositions of cardiovascular and cardiorespiratory interactions during head-up tilt and of joint brain-heart dynamics during sleep.

  19. Time-based and event-based prospective memory in autism spectrum disorder: the roles of executive function and theory of mind, and time-estimation.

    PubMed

    Williams, David; Boucher, Jill; Lind, Sophie; Jarrold, Christopher

    2013-07-01

    Prospective memory (remembering to carry out an action in the future) has been studied relatively little in ASD. We explored time-based (carry out an action at a pre-specified time) and event-based (carry out an action upon the occurrence of a pre-specified event) prospective memory, as well as possible cognitive correlates, among 21 intellectually high-functioning children with ASD, and 21 age- and IQ-matched neurotypical comparison children. We found impaired time-based, but undiminished event-based, prospective memory among children with ASD. In the ASD group, time-based prospective memory performance was associated significantly with diminished theory of mind, but not with diminished cognitive flexibility. There was no evidence that time-estimation ability contributed to time-based prospective memory impairment in ASD.

  20. Estimation of typhoon rainfall in GaoPing River: A Multivariate Maximum Entropy Method

    NASA Astrophysics Data System (ADS)

    Pei-Jui, Wu; Hwa-Lung, Yu

    2016-04-01

    Heavy rainfall from typhoons is the main driver of natural disasters in Taiwan, causing significant loss of human lives and property. On average, 3.5 typhoons strike Taiwan every year; the most severe on record, Typhoon Morakot in 2009, had a major impact on the island. Because the duration, path and intensity of a typhoon also affect the temporal and spatial rainfall pattern in a specific region, characterising the typhoon rainfall type is advantageous when estimating rainfall amounts. This study develops a rainfall prediction model in three parts. First, the EEOF (extended empirical orthogonal function) is used to classify typhoon events: the standardised rainfall pattern of all stations for each typhoon event is decomposed into EOFs and PCs (principal components), so that events which vary similarly in time and space can be grouped into the same typhoon type. Next, according to this classification, a PDF (probability density function) is constructed in space and time by means of multivariate maximum entropy using the first to fourth statistical moments, giving the probability at each station and each time. Finally, the BME (Bayesian Maximum Entropy) method is used to build the typhoon rainfall prediction model and to estimate rainfall for the case of the GaoPing River in southern Taiwan. This study could be useful for future typhoon rainfall prediction and could support government typhoon disaster prevention.
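
    A minimal sketch of a standard EOF/PC decomposition of station rainfall via SVD, the building block of the classification step described above (the study uses an extended EOF, which additionally stacks time-lagged copies of the data; that extension is omitted here and the array shapes are assumptions).

    ```python
    import numpy as np

    def eof_decompose(rainfall, n_modes=3):
        """rainfall: (time, stations) array for one typhoon event or a set of events."""
        anomalies = rainfall - rainfall.mean(axis=0)
        u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
        pcs = u[:, :n_modes] * s[:n_modes]              # principal components (temporal)
        eofs = vt[:n_modes]                             # spatial patterns (one row per mode)
        explained = (s ** 2 / np.sum(s ** 2))[:n_modes] # fraction of variance per mode
        return eofs, pcs, explained
    ```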

  1. Development and validation of multivariable predictive model for thromboembolic events in lymphoma patients.

    PubMed

    Antic, Darko; Milic, Natasa; Nikolovski, Srdjan; Todorovic, Milena; Bila, Jelena; Djurdjevic, Predrag; Andjelic, Bosko; Djurasinovic, Vladislava; Sretenovic, Aleksandra; Vukovic, Vojin; Jelicic, Jelena; Hayman, Suzanne; Mihaljevic, Biljana

    2016-10-01

    Lymphoma patients are at increased risk of thromboembolic events but thromboprophylaxis in these patients is largely underused. We sought to develop and validate a simple model, based on individual clinical and laboratory patient characteristics, that would designate lymphoma patients at risk for thromboembolic events. The study population included 1,820 lymphoma patients who were treated in the Lymphoma Departments at the Clinics of Hematology, Clinical Center of Serbia and Clinical Center Kragujevac. The model was developed using data from a derivation cohort (n = 1,236), and further assessed in the validation cohort (n = 584). Sixty-five patients (5.3%) in the derivation cohort and 34 (5.8%) patients in the validation cohort developed thromboembolic events. The variables independently associated with risk for thromboembolism were: previous venous and/or arterial events, mediastinal involvement, BMI > 30 kg/m², reduced mobility, extranodal localization, development of neutropenia and hemoglobin level < 100 g/L. Based on the risk model score, the population was divided into the following risk categories: low (score 0-1), intermediate (score 2-3), and high (score >3). For patients classified at risk (intermediate and high-risk scores), the model produced negative predictive value of 98.5%, positive predictive value of 25.1%, sensitivity of 75.4%, and specificity of 87.5%. A high-risk score had positive predictive value of 65.2%. The diagnostic performance measures retained similar values in the validation cohort. The developed prognostic Thrombosis Lymphoma (ThroLy) score is more specific for lymphoma patients than any other available score targeting thrombosis in cancer patients. Am. J. Hematol. 91:1014-1019, 2016. © 2016 Wiley Periodicals, Inc.
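
    A purely illustrative sketch of how a point-based risk score with the published cut-offs (0-1 low, 2-3 intermediate, >3 high) could be applied. The equal one-point weights per risk factor are a hypothetical simplification and are not the actual ThroLy weights, which are not reproduced in the abstract.

    ```python
    def throly_style_score(prior_thromboembolism, mediastinal_involvement, bmi_over_30,
                           reduced_mobility, extranodal_disease, neutropenia, hemoglobin_lt_100):
        """Return (score, risk category) using hypothetical one-point weights."""
        score = sum([prior_thromboembolism, mediastinal_involvement, bmi_over_30,
                     reduced_mobility, extranodal_disease, neutropenia, hemoglobin_lt_100])
        if score <= 1:
            category = "low"
        elif score <= 3:
            category = "intermediate"
        else:
            category = "high"
        return score, category

    print(throly_style_score(1, 0, 1, 0, 1, 1, 0))   # -> (4, 'high')
    ```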

  2. A coupled classification - evolutionary optimization model for contamination event detection in water distribution systems.

    PubMed

    Oliker, Nurit; Ostfeld, Avi

    2014-03-15

    This study describes a decision support system that alerts for contamination events in water distribution systems. The developed model comprises a weighted support vector machine (SVM) for the detection of outliers, followed by a sequence analysis for the classification of contamination events. The contribution of this study is an improvement in contamination event detection ability and a multi-dimensional analysis of the data, differing from the parallel one-dimensional analysis conducted so far. The multivariate analysis examines the relationships between water quality parameters and detects changes in their mutual patterns. The weights of the SVM model accomplish two goals: blurring the difference between the sizes of the two classes' data sets (as there are far more normal/regular measurements than event-time measurements), and incorporating the time factor through a time decay coefficient, ascribing higher importance to recent observations when classifying a time step measurement. All model parameters were determined by data-driven optimization, so the calibration of the model was completely autonomic. The model was trained and tested on a real water distribution system (WDS) data set with randomly simulated events superimposed on the original measurements. The model is prominent in its ability to detect events that were only partly expressed in the data (i.e., affecting only some of the measured parameters). The model showed high accuracy and better detection ability as compared to previous modeling attempts of contamination event detection. Copyright © 2013 Elsevier Ltd. All rights reserved.
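
    A minimal sketch of the weighted-SVM idea using scikit-learn: sample weights combine class balancing (so the rare event class is not swamped by normal measurements) with an exponential time decay giving recent observations more influence. The decay constant and kernel choice here are illustrative; in the paper all parameters are set by data-driven optimization.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def fit_weighted_event_classifier(X, y, timestamps, decay=0.01):
        """X: (samples, water-quality parameters), y: 1 = contamination event, 0 = normal."""
        time_weight = np.exp(-decay * (timestamps.max() - timestamps))   # favour recent samples
        n_event = max((y == 1).sum(), 1)
        class_weight = np.where(y == 1, (y == 0).sum() / n_event, 1.0)   # balance rare class
        clf = SVC(kernel="rbf", gamma="scale")
        clf.fit(X, y, sample_weight=time_weight * class_weight)
        return clf
    ```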

  3. Dividing time: concurrent timing of auditory and visual events by young and elderly adults.

    PubMed

    McAuley, J Devin; Miller, Jonathan P; Wang, Mo; Pang, Kevin C H

    2010-07-01

    This article examines age differences in individuals' ability to produce the durations of learned auditory and visual target events either in isolation (focused attention) or concurrently (divided attention). Young adults produced learned target durations equally well in focused and divided attention conditions. Older adults, in contrast, showed an age-related increase in timing variability in divided attention conditions that tended to be more pronounced for visual targets than for auditory targets. Age-related impairments were associated with a decrease in working memory span; moreover, the relationship between working memory and timing performance was largest for visual targets in divided attention conditions.

  4. Making Story Time a Literacy Event for the Young Child.

    ERIC Educational Resources Information Center

    Weir, Beth

    1989-01-01

    Reviews research and anecdotal accounts which present instructional techniques and which suggest that the quality of instruction, quality of time, and quality of books are significant factors in ensuring that story reading is a true literacy event. Argues that consistent story readings facilitate the acquisition of the reading process. (RS)

  5. Modeling Repeatable Events Using Discrete-Time Data: Predicting Marital Dissolution

    ERIC Educational Resources Information Center

    Teachman, Jay

    2011-01-01

    I join two methodologies by illustrating the application of multilevel modeling principles to hazard-rate models with an emphasis on procedures for discrete-time data that contain repeatable events. I demonstrate this application using data taken from the 1995 National Survey of Family Growth (NSFG) to ascertain the relationship between multiple…

  6. Developmental and Cognitive Perspectives on Humans' Sense of the Times of Past and Future Events

    ERIC Educational Resources Information Center

    Friedman, W.J.

    2005-01-01

    Mental time travel in human adults includes a sense of when past events occurred and future events are expected to occur. Studies with adults and children reveal that a number of distinct psychological processes contribute to a temporally differentiated sense of the past and future. Adults possess representations of multiple time patterns, and…

  7. Conceptualization of Collective Behavior Events in the New York "Times."

    ERIC Educational Resources Information Center

    Blake, Joseph A.; And Others

    1978-01-01

    Reports that most collective behavior events reported in the New York "Times" are described in terms of emotionality and anonymity of membership and are alleged to be violent and spontaneous, and that there are significant rank-order correlations between the reported presence of control agents, reported violence, and attributions of spontaneity.…

  8. Diagnosis of delay-deadline failures in real time discrete event models.

    PubMed

    Biswas, Santosh; Sarkar, Dipankar; Bhowal, Prodip; Mukhopadhyay, Siddhartha

    2007-10-01

    In this paper, a method for fault detection and diagnosis (FDD) of real-time systems is developed. A modeling framework termed the real-time discrete event system (RTDES) model is presented, and a mechanism for FDD of such models is developed. The use of the RTDES framework for FDD is an extension of the works reported in the discrete event system (DES) literature, which are based on finite state machines (FSM). FDD of RTDES models is suited to real-time systems because of their capability of representing timing faults leading to failures in terms of erroneous delays and deadlines, which FSM-based ones cannot address. The concept of measurement restriction of variables is introduced for RTDES and the consequent equivalence of states and indistinguishability of transitions have been characterized. Faults are modeled in terms of an unmeasurable condition variable in the state map. Diagnosability is defined and the procedure of constructing a diagnoser is provided. A checkable property of the diagnoser is shown to be a necessary and sufficient condition for diagnosability. The methodology is illustrated with an example of a hydraulic cylinder.
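
    A simplified sketch of the delay-deadline idea: every trigger event must be followed by its response within a time window, otherwise a timing failure is reported. This is only an event-log check in the spirit of the faults considered above; the full RTDES diagnoser additionally reasons over unmeasurable condition variables, state equivalence, and indistinguishable transitions.

    ```python
    def delay_deadline_failures(events, trigger, response, d_min, d_max):
        """events: list of (timestamp, label). Report trigger timestamps whose response
        does not arrive within the delay window [d_min, d_max]."""
        failures = []
        for t, label in events:
            if label != trigger:
                continue
            met = any(lbl == response and d_min <= t2 - t <= d_max
                      for t2, lbl in events if t2 > t)
            if not met:
                failures.append(t)
        return failures

    # e.g., a hypothetical valve-open command must be acknowledged within 2 to 5 seconds
    print(delay_deadline_failures([(0, "open_cmd"), (3, "open_ack"), (10, "open_cmd")],
                                  "open_cmd", "open_ack", 2, 5))   # -> [10]
    ```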

  9. Solar Demon: near real-time solar eruptive event detection on SDO/AIA images

    NASA Astrophysics Data System (ADS)

    Kraaikamp, Emil; Verbeeck, Cis

    Solar flares, dimmings and EUV waves have been observed routinely in extreme ultra-violet (EUV) images of the Sun since 1996. These events are closely associated with coronal mass ejections (CMEs), and therefore provide useful information for early space weather alerts. The Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA) generates such a massive dataset that it becomes impossible to find most of these eruptive events manually. Solar Demon is a set of automatic detection algorithms that attempts to solve this problem by providing both near real-time warnings of eruptive events and a catalog of characterized events. Solar Demon has been designed to detect and characterize dimmings, EUV waves, as well as solar flares in near real-time on SDO/AIA data. The detection modules are running continuously at the Royal Observatory of Belgium on both quick-look data and synoptic science data. The output of Solar Demon can be accessed in near real-time on the Solar Demon website, and includes images, movies, light curves, and the numerical evolution of several parameters. Solar Demon is the result of collaboration between the FP7 projects AFFECTS and COMESEP. Flare detections of Solar Demon are integrated into the COMESEP alert system. Here we present the Solar Demon detection algorithms and their output. We will focus on the algorithm and its operational implementation. Examples of interesting flare, dimming and EUV wave events, and general statistics of the detections made so far during solar cycle 24 will be presented as well.

  10. Negative Events in Childhood Predict Trajectories of Internalizing Symptoms Up to Young Adulthood: An 18-Year Longitudinal Study

    PubMed Central

    Melchior, Maria; Touchette, Évelyne; Prokofyeva, Elena; Chollet, Aude; Fombonne, Eric; Elidemir, Gulizar; Galéra, Cédric

    2014-01-01

    Background: Common negative events can precipitate the onset of internalizing symptoms. We studied whether their occurrence in childhood is associated with mental health trajectories over the course of development. Methods: Using data from the TEMPO study, a French community-based cohort study of youths, we studied the association between negative events in 1991 (when participants were aged 4–16 years) and internalizing symptoms, assessed by the ASEBA family of instruments in 1991, 1999, and 2009 (n = 1503). Participants' trajectories of internalizing symptoms were estimated with semi-parametric regression methods (PROC TRAJ). Data were analyzed using multinomial regression models controlled for participants' sex, age, parental family status, socio-economic position, and parental history of depression. Results: Negative childhood events were associated with an increased likelihood of concurrent internalizing symptoms which sometimes persisted into adulthood (multivariate ORs associated with ≥ 3 negative events respectively: high and decreasing internalizing symptoms: 5.54, 95% CI: 3.20–9.58; persistently high internalizing symptoms: 8.94, 95% CI: 2.82–28.31). Specific negative events most strongly associated with youths' persistent internalizing symptoms included: school difficulties (multivariate OR: 5.31, 95% CI: 2.24–12.59), parental stress (multivariate OR: 4.69, 95% CI: 2.02–10.87), serious illness/health problems (multivariate OR: 4.13, 95% CI: 1.76–9.70), and social isolation (multivariate OR: 2.24, 95% CI: 1.00–5.08). Conclusions: Common negative events can contribute to the onset of children's lasting psychological difficulties. PMID:25485875

  11. Sports-Related Concussion Occurrence at Various Time Points During High School Athletic Events: Part 2.

    PubMed

    Covassin, Tracey; Petit, Kyle M; Savage, Jennifer L; Bretzin, Abigail C; Fox, Meghan E; Walker, Lauren F; Gould, Daniel

    2018-06-01

    Sports-related concussion (SRC) injury rates, and identifying those athletes at the highest risk, have been a primary research focus. However, no studies have evaluated at which time point during an athletic event athletes are most susceptible to SRCs. To determine the clinical incidence of SRCs during the start, middle, and end of practice and competition among high school male and female athletes in the state of Michigan. Descriptive epidemiological study. There were 110,774 male and 71,945 female student-athletes in grades 9 through 12 (mean time in high school, 2.32 ± 1.1 years) who participated in sponsored athletic activities (13 sports) during the 2015-2016 academic year. An SRC was diagnosed and managed by a medical professional (ie, MD, DO, PA, NP). SRC injuries were reported by certified athletic trainers, athletic administrators, and coaches using the Michigan High School Athletic Association Head Injury Reporting System. Time of SRC was defined as the beginning, middle, or end of practice/competition. Clinical incidence was calculated by dividing the number of SRCs in a time point (eg, beginning) by the total number of participants in a sport per 100 student-athletes (95% CI). Risk ratios were calculated by dividing one time point by another time point. There were 4314 SRCs reported, with the highest in football, women's basketball, and women's soccer. The total clinical incidence for all sports was 2.36 (95% CI, 2.29-2.43) per 100 student-athletes. The most common time for SRCs was the middle, followed by the end of all events. Athletes had a 4.90 (95% CI, 4.44-5.41) and 1.50 (95% CI, 1.40-1.60) times greater risk during the middle of all events when compared with the beginning and end, respectively. There was a 3.28 (95% CI, 2.96-3.63) times greater risk at the end of all events when compared with the beginning. Athletes were at the greatest risk for SRCs at the middle of practice and competition when compared with the beginning and end. The current
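
    A tiny worked example of the two calculations named above (clinical incidence per 100 student-athletes and the risk ratio between time points); the overall figure reproduces the 2.36 per 100 reported in the abstract, and the function names are ours.

    ```python
    def clinical_incidence_per_100(n_src, n_participants):
        """SRCs per 100 student-athletes for one time point (beginning, middle, or end)."""
        return 100.0 * n_src / n_participants

    def risk_ratio(incidence_a, incidence_b):
        """Risk at one time point relative to another."""
        return incidence_a / incidence_b

    overall = clinical_incidence_per_100(4314, 110774 + 71945)   # all sports, all time points
    print(round(overall, 2))   # -> 2.36
    ```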

  12. Analyzing the posting behaviors in news forums with incremental inter-event time

    NASA Astrophysics Data System (ADS)

    Sun, Zhi; Peng, Qinke; Lv, Jia; Zhong, Tao

    2017-08-01

    Online human behaviors are widely discussed in various fields. Three key factors, named priority, interest and memory are found crucial in human behaviors. Existing research mainly focuses on the identified and active users. However, the anonymous users and the inactive ones exist widely in news forums, whose behaviors do not receive enough attention. They cannot offer abundant postings like the others. It requires us to study posting behaviors of all the users including anonymous ones, identified ones, active ones and inactive ones in news forums only at the collective level. In this paper, the memory effects of the posting behaviors in news forums are investigated at the collective level. On the basis of the incremental inter-event time, a new model is proposed to describe the posting behaviors at the collective level. The results on twelve actual news events demonstrate the good performance of our model to describe the posting behaviors at the collective level in news forums. In addition, we find the symmetric incremental inter-event time distribution and the similar posting patterns in different durations.

  13. Fixed order dynamic compensation for multivariable linear systems

    NASA Technical Reports Server (NTRS)

    Kramer, F. S.; Calise, A. J.

    1986-01-01

    This paper considers the design of fixed order dynamic compensators for multivariable time invariant linear systems, minimizing a linear quadratic performance cost functional. Attention is given to robustness issues in terms of multivariable frequency domain specifications. An output feedback formulation is adopted by suitably augmenting the system description to include the compensator states. Either a controller or observer canonical form is imposed on the compensator description to reduce the number of free parameters to its minimal number. The internal structure of the compensator is prespecified by assigning a set of ascending feedback invariant indices, thus forming a Brunovsky structure for the nominal compensator.

  14. The Dependence of Characteristic Times of Gradual SEP Events on Their Associated CME Properties

    NASA Astrophysics Data System (ADS)

    Pan, Z. H.; Wang, C. B.; Xue, X. H.; Wang, Y. M.

    It is generally believed that coronal mass ejections (CMEs) are the drivers of shocks that accelerate gradual solar energetic particles (SEPs). One might expect that the characteristics of the SEP intensity-time profiles observed at 1 AU are determined by properties of the associated CMEs, such as the radial speed and the angular width. Recently, Kahler statistically investigated the characteristic times of gradual SEP events observed from 1998-2002 and their associated coronal mass ejection properties (Astrophys. J. 628, 1014-1022, 2005). Three characteristic times of gradual SEP events are determined as functions of solar source longitude: (1) T0, the time from associated CME launch to SEP onset at 1 AU; (2) TR, the rise time from SEP onset to the time when the SEP intensity is a factor of 2 below peak intensity; and (3) TD, the duration over which the SEP intensity is within a factor of 2 of the peak intensity. However, in his study the CME speeds and angular widths are taken directly from the LASCO CME catalog. In this study, we analyze the radial speeds and the angular widths of CMEs with an ice-cream cone model and re-investigate their correlations with the characteristic times of the corresponding SEP events. We find that TR and TD are significantly correlated with radial speed for SEP events in the best-connected longitude range, and that there is no correlation between T0 and CME radial speed or angular width, which is consistent with Kahler's results. On the other hand, it is found that TR and TD also have

  15. Multivariate frequency domain analysis of protein dynamics

    NASA Astrophysics Data System (ADS)

    Matsunaga, Yasuhiro; Fuchigami, Sotaro; Kidera, Akinori

    2009-03-01

    Multivariate frequency domain analysis (MFDA) is proposed to characterize collective vibrational dynamics of protein obtained by a molecular dynamics (MD) simulation. MFDA performs principal component analysis (PCA) for a bandpass filtered multivariate time series using the multitaper method of spectral estimation. By applying MFDA to MD trajectories of bovine pancreatic trypsin inhibitor, we determined the collective vibrational modes in the frequency domain, which were identified by their vibrational frequencies and eigenvectors. At near zero temperature, the vibrational modes determined by MFDA agreed well with those calculated by normal mode analysis. At 300 K, the vibrational modes exhibited characteristic features that were considerably different from the principal modes of the static distribution given by the standard PCA. The influences of aqueous environments were discussed based on two different sets of vibrational modes, one derived from a MD simulation in water and the other from a simulation in vacuum. Using the varimax rotation, an algorithm of the multivariate statistical analysis, the representative orthogonal set of eigenmodes was determined at each vibrational frequency.
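
    A minimal sketch of the MFDA idea: band-pass filter each atomic coordinate, then run PCA on the filtered multivariate series so the leading eigenvectors describe collective vibrational modes in that frequency band. A Butterworth filter is used here for simplicity; the paper uses multitaper spectral estimation, and the filter order and band edges are illustrative parameters.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def bandpass_modes(X, fs, f_lo, f_hi, order=4):
        """X: (time, coordinates) MD trajectory. Return mode variances and eigenvectors
        of the band-pass filtered fluctuations."""
        nyq = fs / 2.0
        b, a = butter(order, [f_lo / nyq, f_hi / nyq], btype="band")
        Xf = filtfilt(b, a, X - X.mean(axis=0), axis=0)   # zero-phase band-pass filtering
        cov = np.cov(Xf, rowvar=False)                    # covariance of filtered fluctuations
        evals, evecs = np.linalg.eigh(cov)
        idx = np.argsort(evals)[::-1]                     # sort modes by variance
        return evals[idx], evecs[:, idx]
    ```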

  16. Multivariate Analysis of Longitudinal Rates of Change

    PubMed Central

    Bryan, Matthew; Heagerty, Patrick J.

    2016-01-01

    Longitudinal data allow direct comparison of the change in patient outcomes associated with treatment or exposure. Frequently, several longitudinal measures are collected that either reflect a common underlying health status, or characterize processes that are influenced in a similar way by covariates such as exposure or demographic characteristics. Statistical methods that can combine multivariate response variables into common measures of covariate effects have been proposed by Roy and Lin [1]; Proust-Lima, Letenneur and Jacqmin-Gadda [2]; and Gray and Brookmeyer [3] among others. Current methods for characterizing the relationship between covariates and the rate of change in multivariate outcomes are limited to select models. For example, Gray and Brookmeyer [3] introduce an “accelerated time” method which assumes that covariates rescale time in longitudinal models for disease progression. In this manuscript we detail an alternative multivariate model formulation that directly structures longitudinal rates of change, and that permits a common covariate effect across multiple outcomes. We detail maximum likelihood estimation for a multivariate longitudinal mixed model. We show via asymptotic calculations the potential gain in power that may be achieved with a common analysis of multiple outcomes. We apply the proposed methods to the analysis of a trivariate outcome for infant growth and compare rates of change for HIV infected and uninfected infants. PMID:27417129

  17. Monitoring and Identifying in Real time Critical Patients Events.

    PubMed

    Chavez Mora, Emma

    2014-01-01

    Nowadays, pervasive health care monitoring environments, as well as business activity monitoring environments, gather information from a variety of data sources. However, this introduces new challenges because of the use of body and wireless sensors and nontraditional operational and transactional sources, which makes health data more difficult to monitor. Decision making in this environment is typically complex and unstructured, as clinical work is essentially interpretative, multitasking, collaborative, distributed and reactive. Thus, the health care arena requires real-time data management in areas such as patient monitoring, detection of adverse events and adaptive responses to operational failures. This research presents a new architecture that enables real-time patient data management through the use of intelligent data sources.

  18. Impact of biliary stent-related events in patients diagnosed with advanced pancreatobiliary tumours receiving palliative chemotherapy.

    PubMed

    Lamarca, Angela; Rigby, Christina; McNamara, Mairéad G; Hubner, Richard A; Valle, Juan W

    2016-07-14

    To determine the impact (morbidity/mortality) of biliary stent-related events (SRE) (cholangitis or stent obstruction) in chemotherapy-treated pancreatico-biliary patients. All consecutive patients with advanced pancreatobiliary cancer and a biliary stent in-situ prior to starting palliative chemotherapy were identified retrospectively from local electronic case-note records (Jan 13 to Jan 15). The primary end-point was SRE rate and the time-to-SRE (defined as time from first stenting before chemotherapy to date of SRE). Progression-free survival and overall survival were measured from the time of starting chemotherapy. Kaplan-Meier, Cox and Fine-Gray regression (univariate and multivariable) analyses were employed, as appropriate. For the analysis of time-to-SRE, death was considered as a competing event. Ninety-six out of 693 screened patients were eligible; 89% had a metal stent (the remainder were plastic). The median time of follow-up was 9.6 mo (range 2.2 to 26.4). Forty-one patients (43%) developed a SRE during follow-up [cholangitis (39%), stent obstruction (29%), both (32%)]. There were no significant differences in baseline characteristics between the SRE group and no-SRE groups. Recorded SRE-consequences were: none (37%), chemotherapy delay (24%), discontinuation (17%) and death (22%). The median time-to-SRE was 4.4 mo (95%CI: 3.6-5.5). Patients with severe comorbidities (P < 0.001) and patients with ≥ 2 baseline stents/biliary procedures [HR = 2.3 (95%CI: 1.2-4.44), P = 0.010] had a shorter time-to-SRE on multivariable analysis. Stage was an independent prognostic factor for overall survival (P = 0.029) in the multivariable analysis adjusted for primary tumour site, performance status and development of SRE (SRE group vs no-SRE group). SREs are common and impact on patient's morbidity. Our results highlight the need for prospective studies exploring the role of prophylactic strategies to prevent/delay SREs.

  19. Impact of biliary stent-related events in patients diagnosed with advanced pancreatobiliary tumours receiving palliative chemotherapy

    PubMed Central

    Lamarca, Angela; Rigby, Christina; McNamara, Mairéad G; Hubner, Richard A; Valle, Juan W

    2016-01-01

    AIM: To determine the impact (morbidity/mortality) of biliary stent-related events (SRE) (cholangitis or stent obstruction) in chemotherapy-treated pancreatico-biliary patients. METHODS: All consecutive patients with advanced pancreatobiliary cancer and a biliary stent in-situ prior to starting palliative chemotherapy were identified retrospectively from local electronic case-note records (Jan 13 to Jan 15). The primary end-point was SRE rate and the time-to-SRE (defined as time from first stenting before chemotherapy to date of SRE). Progression-free survival and overall survival were measured from the time of starting chemotherapy. Kaplan-Meier, Cox and Fine-Gray regression (univariate and multivariable) analyses were employed, as appropriate. For the analysis of time-to-SRE, death was considered as a competing event. RESULTS: Ninety-six out of 693 screened patients were eligible; 89% had a metal stent (the remainder were plastic). The median time of follow-up was 9.6 mo (range 2.2 to 26.4). Forty-one patients (43%) developed a SRE during follow-up [cholangitis (39%), stent obstruction (29%), both (32%)]. There were no significant differences in baseline characteristics between the SRE group and no-SRE groups. Recorded SRE-consequences were: none (37%), chemotherapy delay (24%), discontinuation (17%) and death (22%). The median time-to-SRE was 4.4 mo (95%CI: 3.6-5.5). Patients with severe comorbidities (P < 0.001) and patients with ≥ 2 baseline stents/biliary procedures [HR = 2.3 (95%CI: 1.2-4.44), P = 0.010] had a shorter time-to-SRE on multivariable analysis. Stage was an independent prognostic factor for overall survival (P = 0.029) in the multivariable analysis adjusted for primary tumour site, performance status and development of SRE (SRE group vs no-SRE group). CONCLUSION: SREs are common and impact on patient’s morbidity. Our results highlight the need for prospective studies exploring the role of prophylactic strategies to prevent/delay SREs. PMID

  20. Time-varying correlations in global real estate markets: A multivariate GARCH with spatial effects approach

    NASA Astrophysics Data System (ADS)

    Gu, Huaying; Liu, Zhixue; Weng, Yingliang

    2017-04-01

    The present study applies the multivariate generalized autoregressive conditional heteroscedasticity (MGARCH) with spatial effects approach for the analysis of the time-varying conditional correlations and contagion effects among global real estate markets. A distinguishing feature of the proposed model is that it can simultaneously capture the spatial interactions and the dynamic conditional correlations compared with the traditional MGARCH models. Results reveal that the estimated dynamic conditional correlations have exhibited significant increases during the global financial crisis from 2007 to 2009, thereby suggesting contagion effects among global real estate markets. The analysis further indicates that the returns of the regional real estate markets that are in close geographic and economic proximities exhibit strong co-movement. In addition, evidence of significantly positive leverage effects in global real estate markets is also determined. The findings have significant implications on global portfolio diversification opportunities and risk management practices.
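
    A simplified stand-in for the dynamic correlations discussed above: a RiskMetrics-style EWMA covariance recursion over market return series, from which time-varying correlation matrices are read off. An MGARCH/DCC model with spatial effects, as used in the paper, replaces the single smoothing constant below with estimated univariate GARCH and correlation dynamics; lam is an illustrative value.

    ```python
    import numpy as np

    def ewma_correlations(returns, lam=0.94):
        """returns: (T, N) matrix of market returns; yields (T, N, N) correlation matrices."""
        T, n = returns.shape
        cov = np.cov(returns, rowvar=False)          # initialise with the sample covariance
        corr_path = np.empty((T, n, n))
        for t in range(T):
            r = returns[t][:, None]
            cov = lam * cov + (1.0 - lam) * (r @ r.T)   # exponentially weighted update
            d = np.sqrt(np.diag(cov))
            corr_path[t] = cov / np.outer(d, d)          # convert covariance to correlation
        return corr_path
    ```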

  1. Gravitational Wave Detection of Compact Binaries Through Multivariate Analysis

    NASA Astrophysics Data System (ADS)

    Atallah, Dany Victor; Dorrington, Iain; Sutton, Patrick

    2017-01-01

    The first detection of gravitational waves (GW), GW150914, as produced by a binary black hole merger, has ushered in the era of GW astronomy. The detection technique used to find GW150914 considered only a fraction of the information available describing the candidate event: mainly the detector signal to noise ratios and chi-squared values. In hopes of greatly increasing detection rates, we want to take advantage of all the information available about candidate events. We employ a technique called Multivariate Analysis (MVA) to improve LIGO sensitivity to GW signals. MVA techniques are efficient ways to scan high dimensional data spaces for signal/noise classification. Our goal is to use MVA to classify compact-object binary coalescence (CBC) events composed of any combination of black holes and neutron stars. CBC waveforms are modeled through numerical relativity. Templates of the modeled waveforms are used to search for CBCs and quantify candidate events. Different MVA pipelines are under investigation to look for CBC signals and un-modelled signals, with promising results. One such MVA pipeline used for the un-modelled search can theoretically analyze far more data than the MVA pipelines currently explored for CBCs, potentially making a more powerful classifier. In principle, this extra information could improve the sensitivity to GW signals. We will present the results from our efforts to adapt an MVA pipeline used in the un-modelled search to classify candidate events from the CBC search.

  2. Fast Multivariate Search on Large Aviation Datasets

    NASA Technical Reports Server (NTRS)

    Bhaduri, Kanishka; Zhu, Qiang; Oza, Nikunj C.; Srivastava, Ashok N.

    2010-01-01

    Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical monitoring, and financial systems. Domain experts are often interested in searching for interesting multivariate patterns from these MTS databases which can contain up to several gigabytes of data. Surprisingly, research on MTS search is very limited. Most existing work only supports queries with the same length of data, or queries on a fixed set of variables. In this paper, we propose an efficient and flexible subsequence search framework for massive MTS databases, that, for the first time, enables querying on any subset of variables with arbitrary time delays between them. We propose two provably correct algorithms to solve this problem (1) an R-tree Based Search (RBS) which uses Minimum Bounding Rectangles (MBR) to organize the subsequences, and (2) a List Based Search (LBS) algorithm which uses sorted lists for indexing. We demonstrate the performance of these algorithms using two large MTS databases from the aviation domain, each containing several millions of observations Both these tests show that our algorithms have very high prune rates (>95%) thus needing actual

  3. Testing for the Presence of Correlation Changes in a Multivariate Time Series: A Permutation Based Approach.

    PubMed

    Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Hunyadi, Borbála; Ceulemans, Eva

    2018-01-15

    Detecting abrupt correlation changes in multivariate time series is crucial in many application fields such as signal processing, functional neuroimaging, climate studies, and financial analysis. To detect such changes, several promising correlation change tests exist, but they may suffer from severe loss of power when there is actually more than one change point underlying the data. To deal with this drawback, we propose a permutation-based significance test for Kernel Change Point (KCP) detection on the running correlations. Given a requested number of change points K, KCP divides the time series into K + 1 phases by minimizing the within-phase variance. The new permutation test looks at how the average within-phase variance decreases when K increases and compares this to the results for permuted data. The results of an extensive simulation study and applications to several real data sets show that, depending on the setting, the new test performs either on par with or better than the state-of-the-art significance tests for detecting the presence of correlation changes, implying that its use can be generally recommended.
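
    A rough sketch of the procedure, assuming the ruptures package for kernel change point detection: running correlations are computed in a sliding window, segmented into K + 1 phases, and the within-phase variance reduction is compared against permuted data. Permuting the running-correlation windows directly is a simplification of the paper's test, which permutes the raw time points and recomputes the running correlations; the window length and number of permutations are illustrative.

    ```python
    import numpy as np
    import ruptures as rpt   # kernel change point detection (assumed installed)

    def kcp_variance_drop(series, n_bkps):
        """Within-phase variance reduction achieved by the best K-change-point segmentation."""
        bkps = rpt.KernelCPD(kernel="linear").fit(series).predict(n_bkps=n_bkps)
        segments = np.split(series, bkps[:-1])
        within = sum(((s - s.mean(axis=0)) ** 2).sum() for s in segments)
        total = ((series - series.mean(axis=0)) ** 2).sum()
        return 1.0 - within / total

    def kcp_permutation_test(X, window=20, n_bkps=2, n_perm=199, seed=0):
        """X: (time, variables). Return observed variance drop and a permutation p-value."""
        rng = np.random.default_rng(seed)
        iu = np.triu_indices(X.shape[1], k=1)
        run_corr = np.array([np.corrcoef(X[t:t + window], rowvar=False)[iu]
                             for t in range(X.shape[0] - window + 1)])
        observed = kcp_variance_drop(run_corr, n_bkps)
        null = [kcp_variance_drop(run_corr[rng.permutation(len(run_corr))], n_bkps)
                for _ in range(n_perm)]
        return observed, (1 + sum(v >= observed for v in null)) / (n_perm + 1)
    ```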

  4. New Tools for Comparing Beliefs about the Timing of Recurrent Events with Climate Time Series Datasets

    NASA Astrophysics Data System (ADS)

    Stiller-Reeve, Mathew; Stephenson, David; Spengler, Thomas

    2017-04-01

    For climate services to be relevant and informative for users, scientific data definitions need to match users' perceptions or beliefs. This study proposes and tests novel yet simple methods to compare beliefs of timing of recurrent climatic events with empirical evidence from multiple historical time series. The methods are tested by applying them to the onset date of the monsoon in Bangladesh, where several scientific monsoon definitions can be applied, yielding different results for monsoon onset dates. It is a challenge to know which monsoon definition compares best with people's beliefs. Time series from eight different scientific monsoon definitions in six regions are compared with respondent beliefs from a previously completed survey concerning the monsoon onset. Beliefs about the timing of the monsoon onset are represented probabilistically for each respondent by constructing a probability mass function (PMF) from elicited responses about the earliest, normal, and latest dates for the event. A three-parameter circular modified triangular distribution (CMTD) is used to allow for the possibility (albeit small) of the onset at any time of the year. These distributions are then compared to the historical time series using two approaches: likelihood scores, and the mean and standard deviation of time series of dates simulated from each belief distribution. The methods proposed give the basis for further iterative discussion with decision-makers in the development of eventual climate services. This study uses Jessore, Bangladesh, as an example and finds that a rainfall definition, applying a 10 mm day-1 threshold to NCEP-NCAR reanalysis (Reanalysis-1) data, best matches the survey respondents' beliefs about monsoon onset.
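
    A small sketch of the likelihood-score idea: a respondent's belief about onset timing is modelled as a triangular distribution over day-of-year built from the elicited earliest, normal, and latest dates, and scored against onset dates from one scientific definition. The paper's circular modified triangular distribution also allows a small probability outside this range; that refinement is omitted here, and the example dates are hypothetical.

    ```python
    import numpy as np
    from scipy import stats

    def belief_log_likelihood(earliest, normal, latest, onset_days):
        """Log-likelihood of observed onset days-of-year under the triangular belief distribution."""
        scale = latest - earliest
        c = (normal - earliest) / scale               # mode position as a fraction of the range
        belief = stats.triang(c, loc=earliest, scale=scale)
        return np.sum(belief.logpdf(onset_days))

    # hypothetical respondent (earliest day 150, normal 165, latest 185) scored against
    # onset dates extracted from one monsoon definition's historical time series
    print(belief_log_likelihood(150, 165, 185, np.array([158, 160, 162, 168, 170])))
    ```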

  5. Extreme rainfall events: Learning from raingauge time series

    NASA Astrophysics Data System (ADS)

    Boni, G.; Parodi, A.; Rudari, R.

    2006-08-01

    This study analyzes the historical records of annual rainfall maxima recorded in Northern Italy, cumulated over time windows (durations) of 1 and 24 h and considered paradigmatic descriptions of storms of both short and long duration. Three large areas are studied: Liguria, Piedmont and Triveneto (Triveneto includes the Regions of Veneto, Trentino Alto Adige and Friuli Venezia Giulia). A regional frequency analysis of annual rainfall maxima is carried out through the Two-Component Extreme Value (TCEV) distribution. A hierarchical approach is used to define statistically homogeneous areas so that the definition of a regional distribution becomes possible. Thanks to the peculiar nature of the TCEV distribution, a frequency-based threshold criterion is proposed. This criterion makes it possible to distinguish the observed ordinary values from the observed extra-ordinary values of annual rainfall maxima. A second step of this study focuses on the analysis of the probability of occurrence of extra-ordinary events over a period of one year. Results show the existence of a four-month dominant season that maximizes the number of occurrences of annual rainfall maxima. Such results also show how the seasonality of extra-ordinary events changes whenever a different duration of events is considered. The joint probability of occurrence of extreme storms of short and long duration is also analyzed. Such analysis demonstrates how the joint probability of occurrence significantly changes when all rainfall maxima or only extra-ordinary maxima are used. All results undergo a critical discussion, which suggests that the identified statistical characteristics may be a signature of the mechanisms causing heavy precipitation in the analyzed regions.
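
    A minimal sketch of the TCEV distribution and a frequency-based threshold. The TCEV cumulative distribution mixes an "ordinary" component (frequent, moderate maxima) and an "outlying" component (rare, severe maxima); the threshold function below simply inverts the CDF at a chosen non-exceedance probability, which is an illustrative criterion and not necessarily the exact one used in the paper.

    ```python
    import numpy as np

    def tcev_cdf(x, lam1, theta1, lam2, theta2):
        """F(x) = exp(-lam1*exp(-x/theta1) - lam2*exp(-x/theta2)) for annual rainfall maxima."""
        x = np.asarray(x, dtype=float)
        return np.exp(-lam1 * np.exp(-x / theta1) - lam2 * np.exp(-x / theta2))

    def nonexceedance_threshold(lam1, theta1, lam2, theta2, prob=0.9):
        """Smallest rainfall depth whose non-exceedance probability reaches `prob`."""
        grid = np.linspace(0.0, 10.0 * max(theta1, theta2), 100001)
        return grid[np.searchsorted(tcev_cdf(grid, lam1, theta1, lam2, theta2), prob)]
    ```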

  6. Intensity/time profiles of solar particle events at one astronomical unit

    NASA Technical Reports Server (NTRS)

    Shea, M. A.

    1988-01-01

    A description of the intensity-time profiles of solar proton events observed at the orbit of the earth is presented. The discussion, which includes descriptive figures, presents a general overview of the subject without the detailed mathematical description of the physical processes which usually accompany most reviews.

  7. Building a Catalog of Time-Dependent Inversions for Cascadia ETS Events

    NASA Astrophysics Data System (ADS)

    Bartlow, N. M.; Williams, C. A.; Wallace, L. M.

    2017-12-01

    Episodic Tremor and Slip (ETS), composed of periodically occurring slow slip events accompanied by tectonic tremor, has been recognized in Cascadia since 1999. While the tremor has been continuously and automatically monitored for a few years (Wech et al., SRL, 2010; pnsn.org/tremor), the geodetically-derived slip has not been systematically monitored in the same way. Instead, numerous time-dependent and static inversions of the geodetic data have been performed for individual ETS events, with many events going unstudied. Careful study of, and monitoring of, ETS is important both to advance the scientific understanding of fault mechanics and to improve earthquake hazard forecasting in Cascadia. Here we present the results of initial efforts to standardize geodetic inversions of slow slip during Cascadia ETS. We use the Network Inversion Filter (NIF, Segall and Matthews, 1997; McGuire and Segall, 2003; Miyazaki et al., 2006), applied evenly to an extended time period, to detect and catalog slow slip transients. Bartlow et al. (2014) conducted a similar study for the Hikurangi subduction zone, covering a 2.5-year period. Additionally, we generate Green's functions using the PyLith finite element code (Aagaard et al., 2013) to allow consideration of elastic property variations derived from a Cascadia-wide seismic velocity model (Stephenson, USGS pub., 2007). These Green's functions are then integrated to provide Green's functions compatible with the Network Inversion Filter. The use of heterogeneous elastic Green's functions allows for a more accurate estimation of slip amplitudes, both during individual ETS events and averaged over multiple events. This is useful for constraining the total slip budget in Cascadia, including whether ETS takes up the entire plate motion on the deeper extent of the plate interface where it occurs. The recent study of Williams and Wallace (GRL, 2015) demonstrated that the use of heterogeneous elastic Green's functions in inversions can make a

  8. Time-Based and Event-Based Prospective Memory in Autism Spectrum Disorder: The Roles of Executive Function and Theory of Mind, and Time-Estimation

    ERIC Educational Resources Information Center

    Williams, David; Boucher, Jill; Lind, Sophie; Jarrold, Christopher

    2013-01-01

    Prospective memory (remembering to carry out an action in the future) has been studied relatively little in ASD. We explored time-based (carry out an action at a pre-specified time) and event-based (carry out an action upon the occurrence of a pre-specified event) prospective memory, as well as possible cognitive correlates, among 21…

  9. Prediction of UT1-UTC, LOD and AAM χ3 by combination of least-squares and multivariate stochastic methods

    NASA Astrophysics Data System (ADS)

    Niedzielski, Tomasz; Kosek, Wiesław

    2008-02-01

    This article presents the application of a multivariate prediction technique for predicting universal time (UT1-UTC), length of day (LOD) and the axial component of atmospheric angular momentum (AAM χ3). The multivariate predictions of LOD and UT1-UTC are generated by means of the combination of (1) least-squares (LS) extrapolation of models for annual, semiannual, 18.6-year, 9.3-year oscillations and for the linear trend, and (2) multivariate autoregressive (MAR) stochastic prediction of LS residuals (LS + MAR). The MAR technique enables the use of the AAM χ3 time-series as the explanatory variable for the computation of LOD or UT1-UTC predictions. In order to evaluate the performance of this approach, two other prediction schemes are also applied: (1) LS extrapolation, (2) combination of LS extrapolation and univariate autoregressive (AR) prediction of LS residuals (LS + AR). The multivariate predictions of AAM χ3 data, however, are computed as a combination of the extrapolation of the LS model for annual and semiannual oscillations and the LS + MAR. The AAM χ3 predictions are also compared with LS extrapolation and LS + AR prediction. It is shown that the predictions of LOD and UT1-UTC based on LS + MAR taking into account the axial component of AAM are more accurate than the predictions of LOD and UT1-UTC based on LS extrapolation or on LS + AR. In particular, the UT1-UTC predictions based on LS + MAR during El Niño/La Niña events exhibit considerably smaller prediction errors than those calculated by means of LS or LS + AR. The AAM χ3 time-series is predicted using LS + MAR with higher accuracy than applying LS extrapolation itself in the case of medium-term predictions (up to 100 days in the future). However, the predictions of AAM χ3 reveal the best accuracy for LS + AR.
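
    Illustrative sketch (not from the record above): the LS + AR combination can be approximated by fitting a harmonic-plus-trend least-squares model and then forecasting its residuals with an autoregressive model whose prediction is added back to the extrapolated LS part. The code below is a minimal Python stand-in with synthetic data, annual and semiannual terms only, and an AR(2) residual model; the periods, orders, and data are hypothetical, and the multivariate (MAR) extension is not shown.

      import numpy as np

      def ls_ar_forecast(t, y, horizon, periods=(365.25, 182.625), ar_order=2):
          """Least-squares harmonic/trend fit plus an AR forecast of the residuals."""
          # Design matrix: intercept, linear trend, sine/cosine pair for each period.
          cols = [np.ones_like(t), t]
          for p in periods:
              cols += [np.sin(2 * np.pi * t / p), np.cos(2 * np.pi * t / p)]
          X = np.column_stack(cols)
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          resid = y - X @ beta

          # AR coefficients for the residuals, estimated by ordinary least squares.
          lagged = np.column_stack([resid[ar_order - k - 1:len(resid) - k - 1]
                                    for k in range(ar_order)])
          phi, *_ = np.linalg.lstsq(lagged, resid[ar_order:], rcond=None)

          # Extrapolate the LS part and recursively extend the AR residual forecast.
          t_new = t[-1] + np.arange(1, horizon + 1)
          cols_new = [np.ones_like(t_new), t_new]
          for p in periods:
              cols_new += [np.sin(2 * np.pi * t_new / p), np.cos(2 * np.pi * t_new / p)]
          ls_part = np.column_stack(cols_new) @ beta

          history = list(resid[-ar_order:])
          ar_part = []
          for _ in range(horizon):
              nxt = sum(phi[k] * history[-k - 1] for k in range(ar_order))
              ar_part.append(nxt)
              history.append(nxt)
          return ls_part + np.array(ar_part)

      # Toy usage with synthetic LOD-like data (arbitrary units, daily sampling).
      t = np.arange(3000.0)
      y = (2.0 + 0.001 * t + 0.3 * np.sin(2 * np.pi * t / 365.25)
           + np.random.normal(0, 0.05, t.size))
      print(ls_ar_forecast(t, y, horizon=10)[:3])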

  10. What controls the local time extent of flux transfer events?

    NASA Astrophysics Data System (ADS)

    Milan, S. E.; Imber, S. M.; Carter, J. A.; Walach, M.-T.; Hubert, B.

    2016-02-01

    Flux transfer events (FTEs) are the manifestation of bursty and/or patchy magnetic reconnection at the magnetopause. We compare two sequences of the ionospheric signatures of flux transfer events observed in global auroral imagery and coherent ionospheric radar measurements. Both sequences were observed during very similar seasonal and interplanetary magnetic field (IMF) conditions, though with differing solar wind speed. A key observation is that the signatures differed considerably in their local time extent. The two periods are 26 August 1998, when the IMF had components BZ≈-10 nT and BY≈9 nT and the solar wind speed was VX≈650 km s-1, and 31 August 2005, IMF BZ≈-7 nT, BY≈17 nT, and VX≈380 km s-1. In the first case, the reconnection rate was estimated to be near 160 kV, and the FTE signatures extended across at least 7 h of magnetic local time (MLT) of the dayside polar cap boundary. In the second, a reconnection rate close to 80 kV was estimated, and the FTEs had an MLT extent of roughly 2 h. We discuss the ramifications of these differences for solar wind-magnetosphere coupling.

  11. Modeling time-to-event (survival) data using classification tree analysis.

    PubMed

    Linden, Ariel; Yarnold, Paul R

    2017-12-01

    Time to the occurrence of an event is often studied in health research. Survival analysis differs from other designs in that follow-up times for individuals who do not experience the event by the end of the study (called censored) are accounted for in the analysis. Cox regression is the standard method for analysing censored data, but the assumptions required of these models are easily violated. In this paper, we introduce classification tree analysis (CTA) as a flexible alternative for modelling censored data. Classification tree analysis is a "decision-tree"-like classification model that provides parsimonious, transparent (i.e., easy to visually display and interpret) decision rules that maximize predictive accuracy, derives exact P values via permutation tests, and evaluates model cross-generalizability. Using empirical data, we identify all statistically valid, reproducible, longitudinally consistent, and cross-generalizable CTA survival models and then compare their predictive accuracy to estimates derived via Cox regression and an unadjusted naïve model. Model performance is assessed using integrated Brier scores and a comparison between estimated survival curves. The Cox regression model best predicts average incidence of the outcome over time, whereas CTA survival models best predict either relatively high or low incidence of the outcome over time. Classification tree analysis survival models offer many advantages over Cox regression, such as explicit maximization of predictive accuracy, parsimony, statistical robustness, and transparency. Therefore, researchers interested in accurate prognoses and clear decision rules should consider developing models using the CTA-survival framework. © 2017 John Wiley & Sons, Ltd.

  12. Identification of unusual events in multichannel bridge monitoring data using wavelet transform and outlier analysis

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr; Brownjohn, James M. W.; Moyo, Pilate

    2003-08-01

    Continuously operating instrumented structural health monitoring (SHM) systems are becoming a practical alternative to visual inspection for assessing the condition and soundness of civil infrastructure. However, converting the large amount of data from an SHM system into usable information is a great challenge to which special signal processing techniques must be applied. This study is devoted to the identification of abrupt, anomalous and potentially onerous events in the time histories of static, hourly sampled strains recorded by a multi-sensor SHM system installed in a major bridge structure in Singapore and operating continuously for a long time. Such events may result, among other causes, from sudden settlement of foundation, ground movement, excessive traffic load or failure of post-tensioning cables. A method of outlier detection in multivariate data has been applied to the problem of finding and localizing sudden events in the strain data. For sharp discrimination of abrupt strain changes from slowly varying ones, the wavelet transform has been used. The proposed method has been successfully tested using known events recorded during construction of the bridge, and later effectively used for detection of anomalous post-construction events.
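
    Illustrative sketch (not the authors' implementation): a single-level Haar detail filter is used here as a crude stand-in for the wavelet transform to isolate abrupt changes in each strain channel, and multivariate outliers among the detail coefficients are flagged with a Mahalanobis distance. The sensor count, data, and threshold are hypothetical.

      import numpy as np

      def haar_detail(x):
          """Single-level Haar detail coefficients; large values mark abrupt jumps."""
          n = len(x) - (len(x) % 2)
          return (x[1:n:2] - x[0:n:2]) / np.sqrt(2.0)

      def flag_events(strains, threshold=3.0):
          """strains: (n_samples, n_sensors). Returns indices of outlying coefficient pairs."""
          details = np.column_stack([haar_detail(strains[:, j]) for j in range(strains.shape[1])])
          diff = details - details.mean(axis=0)
          inv_cov = np.linalg.pinv(np.cov(details, rowvar=False))
          d2 = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)   # squared Mahalanobis distance
          return np.where(np.sqrt(d2) > threshold)[0]

      # Toy data: slowly varying strains on 4 sensors with a sudden jump between samples 500 and 501.
      rng = np.random.default_rng(0)
      strains = np.cumsum(rng.normal(0, 0.01, size=(1000, 4)), axis=0)
      strains[501:, :] += 2.0
      print(flag_events(strains))     # expected to flag coefficient pair 250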

  13. Continuous-time random walks with reset events. Historical background and new perspectives

    NASA Astrophysics Data System (ADS)

    Montero, Miquel; Masó-Puigdellosas, Axel; Villarroel, Javier

    2017-09-01

    In this paper, we consider a stochastic process that may experience random reset events which relocate the system to its starting position. We focus our attention on a one-dimensional, monotonic continuous-time random walk with a constant drift: the process moves in a fixed direction between the reset events, either by the effect of the random jumps, or by the action of a deterministic bias. However, the orientation of its motion is randomly determined after each restart. As a result of these alternating dynamics, interesting properties do emerge. General formulas for the propagator as well as for two extreme statistics, the survival probability and the mean first-passage time, are also derived. The rigor of these analytical results is verified by numerical estimations, for particular but illuminating examples.
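
    Illustrative Monte Carlo sketch (the paper derives analytical formulas; this is only a numerical companion under hypothetical parameters): a drifted, monotonic jump process whose orientation is redrawn after Poissonian reset events that relocate it to the origin, used here to estimate a survival probability.

      import numpy as np

      def simulate_path(t_max, jump_rate=1.0, reset_rate=0.1, drift=0.5, jump_scale=1.0, seed=0):
          """Simulate one trajectory; return arrays of event times and positions."""
          rng = np.random.default_rng(seed)
          t, x = 0.0, 0.0
          direction = rng.choice([-1.0, 1.0])        # orientation redrawn after each restart
          times, positions = [0.0], [0.0]
          while t < t_max:
              dt_jump = rng.exponential(1.0 / jump_rate)
              dt_reset = rng.exponential(1.0 / reset_rate)
              if dt_reset < dt_jump:                 # reset event: relocate to the origin
                  t += dt_reset
                  x = 0.0
                  direction = rng.choice([-1.0, 1.0])
              else:                                  # random jump plus deterministic drift
                  t += dt_jump
                  x += direction * (rng.exponential(jump_scale) + drift * dt_jump)
              times.append(t)
              positions.append(x)
          return np.array(times), np.array(positions)

      # Survival probability: fraction of paths that never cross a barrier before t_max.
      barrier, n_paths = 5.0, 1000
      survived = sum(np.all(simulate_path(50.0, seed=i)[1] < barrier) for i in range(n_paths))
      print("estimated survival probability:", survived / n_paths)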

  14. Real-Time Gait Event Detection Based on Kinematic Data Coupled to a Biomechanical Model.

    PubMed

    Lambrecht, Stefan; Harutyunyan, Anna; Tanghe, Kevin; Afschrift, Maarten; De Schutter, Joris; Jonkers, Ilse

    2017-03-24

    Real-time detection of multiple stance events, more specifically initial contact (IC), foot flat (FF), heel off (HO), and toe off (TO), could greatly benefit neurorobotic (NR) and neuroprosthetic (NP) control. Three real-time threshold-based algorithms have been developed, detecting the aforementioned events based on kinematic data in combination with a biomechanical model. Data from seven subjects walking at three speeds on an instrumented treadmill were used to validate the presented algorithms, accumulating to a total of 558 steps. The reference for the gait events was obtained using marker and force plate data. All algorithms had excellent precision and no false positives were observed. Timing delays of the presented algorithms were similar to current state-of-the-art algorithms for the detection of IC and TO, whereas smaller delays were achieved for the detection of FF. Our results indicate that, based on their high precision and low delays, these algorithms can be used for the control of an NR/NP, with the exception of the HO event. Kinematic data is used in most NR/NP control schemes and is thus available at no additional cost, resulting in a minimal computational burden. The presented methods can also be applied for screening pathological gait or gait analysis in general in/outside of the laboratory.

  15. Multivariate generalized multifactor dimensionality reduction to detect gene-gene interactions

    PubMed Central

    2013-01-01

    Background Recently, one of the greatest challenges in genome-wide association studies is to detect gene-gene and/or gene-environment interactions for common complex human diseases. Ritchie et al. (2001) proposed multifactor dimensionality reduction (MDR) method for interaction analysis. MDR is a combinatorial approach to reduce multi-locus genotypes into high-risk and low-risk groups. Although MDR has been widely used for case-control studies with binary phenotypes, several extensions have been proposed. One of these methods, a generalized MDR (GMDR) proposed by Lou et al. (2007), allows adjusting for covariates and applying to both dichotomous and continuous phenotypes. GMDR uses the residual score of a generalized linear model of phenotypes to assign either high-risk or low-risk group, while MDR uses the ratio of cases to controls. Methods In this study, we propose multivariate GMDR, an extension of GMDR for multivariate phenotypes. Jointly analysing correlated multivariate phenotypes may have more power to detect susceptible genes and gene-gene interactions. We construct generalized estimating equations (GEE) with multivariate phenotypes to extend generalized linear models. Using the score vectors from GEE we discriminate high-risk from low-risk groups. We applied the multivariate GMDR method to the blood pressure data of the 7,546 subjects from the Korean Association Resource study: systolic blood pressure (SBP) and diastolic blood pressure (DBP). We compare the results of multivariate GMDR for SBP and DBP to the results from separate univariate GMDR for SBP and DBP, respectively. We also applied the multivariate GMDR method to the repeatedly measured hypertension status from 5,466 subjects and compared its result with those of univariate GMDR at each time point. Results Results from the univariate GMDR and multivariate GMDR in two-locus model with both blood pressures and hypertension phenotypes indicate best combinations of SNPs whose interaction has

  16. Intelligent fuzzy controller for event-driven real time systems

    NASA Technical Reports Server (NTRS)

    Grantner, Janos; Patyra, Marek; Stachowicz, Marian S.

    1992-01-01

    Most of the known linguistic models are essentially static, that is, time is not a parameter in describing the behavior of the object's model. In this paper we show a model for synchronous finite state machines based on fuzzy logic. Such finite state machines can be used to build both event-driven, time-varying, rule-based systems and the control unit section of a fuzzy logic computer. The architecture of a pipelined intelligent fuzzy controller is presented, and the linguistic model is represented by an overall fuzzy relation stored in a single rule memory. A VLSI integrated circuit implementation of the fuzzy controller is suggested. At a clock rate of 30 MHz, the controller can perform 3 MFLIPS on multi-dimensional fuzzy data.

  17. Implementation Challenges for Multivariable Control: What You Did Not Learn in School

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay

    2008-01-01

    Multivariable control allows controller designs that can provide decoupled command tracking and robust performance in the presence of modeling uncertainties. Although the last two decades have seen extensive development of multivariable control theory and example applications to complex systems in software/hardware simulations, there are no production flying systems, aircraft or spacecraft, that use multivariable control. This is because of the tremendous challenges associated with implementation of such multivariable control designs. Unfortunately, the curriculum in schools does not provide sufficient time to expose students to such implementation challenges. The objective of this paper is to share the lessons learned by a practitioner of multivariable control in the process of applying some of the modern control theory to the Integrated Flight Propulsion Control (IFPC) design for an advanced Short Take-Off Vertical Landing (STOVL) aircraft simulation.

  18. Single Particle Analysis by Combined Chemical Imaging to Study Episodic Air Pollution Events in Vienna

    NASA Astrophysics Data System (ADS)

    Ofner, Johannes; Eitenberger, Elisabeth; Friedbacher, Gernot; Brenner, Florian; Hutter, Herbert; Schauer, Gerhard; Kistler, Magdalena; Greilinger, Marion; Lohninger, Hans; Lendl, Bernhard; Kasper-Giebl, Anne

    2017-04-01

    The aerosol composition of a city like Vienna is characterized by a complex interaction of local emissions and atmospheric input on a regional and continental scale. The identification of major aerosol constituents for basic source apportionment and air quality issues requires a high analytical effort. Exceptional episodic air pollution events strongly change the typical aerosol composition of a city like Vienna on a time scale of a few hours to several days. Analyzing the chemistry of particulate matter from these events is often hampered by the sampling time and related sample amount necessary to apply the full range of bulk analytical methods needed for chemical characterization. Additionally, morphological and single-particle features are hardly accessible. Chemical imaging has evolved into a powerful tool for image-based chemical analysis of complex samples. As a complementary technique to bulk analytical methods, chemical imaging offers a new way to study air pollution events by obtaining major aerosol constituents together with single-particle features at high temporal resolution and with small sample volumes. The analysis of the chemical imaging datasets is assisted by multivariate statistics, with the benefit of image-based chemical structure determination for direct aerosol source apportionment. A novel approach in chemical imaging is combined chemical imaging, or so-called multisensor hyperspectral imaging, involving elemental imaging (electron microscopy-based energy dispersive X-ray imaging), vibrational imaging (Raman micro-spectroscopy) and mass spectrometric imaging (Time-of-Flight Secondary Ion Mass Spectrometry) with subsequent combined multivariate analytics. Combined chemical imaging of precipitated aerosol particles will be demonstrated by the following examples of air pollution events in Vienna: Exceptional episodic events like the transformation of Saharan dust by the impact of the city of Vienna will be discussed and compared to samples obtained at a high alpine

  19. Hybrid High-Fidelity Modeling of Radar Scenarios Using Atemporal, Discrete-Event, and Time-Step Simulation

    DTIC Science & Technology

    2016-12-01

    time T1 for the mover to travel from the current position to the next waypoint is calculated as T1 = Distance / MaxSpeed. The "EndMove" event will...speed of light in a real atmosphere. The factor of 1/2 is the result of the round-trip travel time of the signal. The maximum detection range (Rmax) is..."EnterRange" event is triggered by the referee, the time for the target traveling to the midpoint towards its waypoint tm is calculated and applied

  20. Event- and time-triggered remembering: the impact of attention deficit hyperactivity disorder on prospective memory performance in children.

    PubMed

    Talbot, Karley-Dale S; Kerns, Kimberly A

    2014-11-01

    The current study examined prospective memory (PM, both time-based and event-based) and time estimation (TR, a time reproduction task) in children with and without attention deficit hyperactivity disorder (ADHD). This study also investigated the influence of task performance and TR on time-based PM in children with ADHD relative to controls. A sample of 69 children, aged 8 to 13 years, completed the CyberCruiser-II time-based PM task, a TR task, and the Super Little Fisherman event-based PM task. PM performance was compared with children's TR abilities, parental reports of daily prospective memory disturbances (Prospective and Retrospective Memory Questionnaire for Children, PRMQC), and ADHD symptomatology (Conner's rating scales). Children with ADHD scored more poorly on event-based PM, time-based PM, and TR; interestingly, TR did not appear related to performance on time-based PM. In addition, it was found that PRMQC scores and ADHD symptom severity were related to performance on the time-based PM task but not to performance on the event-based PM task. These results provide some limited support for theories that propose a distinction between event-based PM and time-based PM. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. The influence of pubertal timing and stressful life events on depression and delinquency among Chinese adolescents.

    PubMed

    Chen, Jie; Yu, Jing; Wu, Yun; Zhang, Jianxin

    2015-06-01

    This study aimed to investigate the influences of pubertal timing and stressful life events on Chinese adolescents' depression and delinquency. Sex differences in these influences were also examined. A large sample with 4,228 participants aged 12-15 years (53% girls) was recruited in Beijing, China. Participants' pubertal development, stressful life events, depressive symptoms, and delinquency were measured using self-reported questionnaires. Both early maturing girls and boys displayed more delinquency than their same-sex on-time and late maturing peers. Early maturing girls displayed more depressive symptoms than on-time and late maturing girls, but boys in the three maturation groups showed similar levels of depressive symptoms. The interactive effects between early pubertal timing and stressful life events were significant in predicting depression and delinquency, particularly for girls. Early pubertal maturation is an important risk factor for Chinese adolescents' depression and delinquency. Stressful life events intensified the detrimental effects of early pubertal maturation on adolescents' depression and delinquency, particularly for girls. © 2015 The Institute of Psychology, Chinese Academy of Sciences and Wiley Publishing Asia Pty Ltd.

  2. Synthetic-Type Control Charts for Time-Between-Events Monitoring

    PubMed Central

    Yen, Fang Yen; Chong, Khoo Michael Boon; Ha, Lee Ming

    2013-01-01

    This paper proposes three synthetic-type control charts to monitor the mean time-between-events of a homogeneous Poisson process. The first proposed chart combines an Erlang (cumulative time between events, Tr) chart and a conforming run length (CRL) chart, denoted as Synth-Tr chart. The second proposed chart combines an exponential (or T) chart and a group conforming run length (GCRL) chart, denoted as GR-T chart. The third proposed chart combines an Erlang chart and a GCRL chart, denoted as GR-Tr chart. By using a Markov chain approach, the zero- and steady-state average number of observations to signal (ANOS) of the proposed charts are obtained, in order to evaluate the performance of the three charts. The optimal design of the proposed charts is shown in this paper. The proposed charts are superior to the existing T chart, Tr chart, and Synth-T chart. As compared to the EWMA-T chart, the GR-T chart performs better in detecting large shifts, in terms of the zero- and steady-state performances. The zero-state Synth-T4 and GR-Tr (r = 3 or 4) charts outperform the EWMA-T chart for all shifts, whereas the Synth-Tr (r = 2 or 3) and GR-T2 charts perform better for moderate to large shifts. For the steady-state process, the Synth-Tr and GR-Tr charts are more efficient than the EWMA-T chart in detecting small to moderate shifts. PMID:23755231

  3. Marginal regression analysis of recurrent events with coarsened censoring times.

    PubMed

    Hu, X Joan; Rosychuk, Rhonda J

    2016-12-01

    Motivated by an ongoing pediatric mental health care (PMHC) study, this article presents weakly structured methods for analyzing doubly censored recurrent event data where only coarsened information on censoring is available. The study extracted administrative records of emergency department visits from provincial health administrative databases. The available information of each individual subject is limited to a subject-specific time window determined up to concealed data. To evaluate time-dependent effect of exposures, we adapt the local linear estimation with right censored survival times under the Cox regression model with time-varying coefficients (cf. Cai and Sun, Scandinavian Journal of Statistics 2003, 30, 93-111). We establish the pointwise consistency and asymptotic normality of the regression parameter estimator, and examine its performance by simulation. The PMHC study illustrates the proposed approach throughout the article. © 2016, The International Biometric Society.

  4. Perceiving Control Over Aversive and Fearful Events Can Alter How We Experience Those Events: An Investigation of Time Perception in Spider-Fearful Individuals

    PubMed Central

    Buetti, Simona; Lleras, Alejandro

    2012-01-01

    We used a time perception task to study the effects of the subjective experience of control on emotion and cognitive processing. This task is uniquely sensitive to the emotionality of the stimuli: high-arousing negative stimuli are perceived as lasting longer than high-arousing positive events, while the opposite pattern is observed for low-arousing stimuli. We evaluated the temporal distortions of emotionally charged events in non-anxious (Experiments 1 and 5) and spider-fearful individuals (Experiments 2–4). Participants were shown images of varying durations between 400 and 1600 ms and were asked to report if the perceived duration of the image seemed closer to a short (400 ms) or to a long (1600 ms) standard duration. Our results replicate previous findings showing that the emotional content of the image modulated the perceived duration of that image. More importantly, we studied whether giving participants the illusion that they have some control over the emotional content of the images could eliminate this temporal distortion. Results confirmed this hypothesis, even though our participant population was composed of highly reactive emotional individuals (spider-fearful) facing fear-related images (spiders). Further, we also showed that under conditions of little-to-no control, spider-fearful individuals perceive temporal distortions in a distinct manner from non-anxious participants: the duration of events was entirely determined by the valence of the events, rather than by the typical valence × arousal interaction. That is, spider-fearful participants perceived negative events as lasting longer than positive events, regardless of their level of arousal. Finally, we also showed that under conditions of cognitive dissonance, control can eliminate temporal distortions of low arousal events, but not of high-arousing events, providing an important boundary condition to the otherwise positive effects of control on time estimation. PMID:23060824

  5. Continuous Real-time Measurements of δ-values of Precipitation during Rain Events: Insights into Tropical Convection

    NASA Astrophysics Data System (ADS)

    He, S.; Goodkin, N.; Jackisch, D.; Ong, M. R.

    2017-12-01

    Studying how tropical convection affects stable isotopes in precipitation can help us understand the evolution of precipitation isotopes over time and improve the interpretation of paleoclimate records in the tropical region. We have been continuously monitoring δ-values of precipitation during rain events in Singapore for the past three years (2014-2017) using a diffusion sampler-cavity ring-down spectrometer (DS-CRDS) system. This period of time spans the most recent El Niño and La Niña, and thus affords us the opportunity to use our ultra-high temporal resolution data to examine the El Niño-Southern Oscillation (ENSO) impact on the precipitation isotopes during convection and the intra-annual variability in the region. δ-values of precipitation can change significantly during a single event, and mainly exhibit "V" (or "U") shape or "W" shape patterns. The mesoscale subsidence and rain re-evaporation are two processes that largely shape the isotopes of precipitation during events. Time series of the initial, highest and lowest δ-values of individual events, and of the absolute change in δ-values during these events, show clear evolution over time. Events with low δ-values occurred less frequently in 2015 than in the other years. Likewise, the frequency of events with a larger magnitude change in δ-values and low initial values is also lower in 2015. The events with low averaged δ-values usually have very low initial δ-values, and are closely associated with organized regional convection, indicating that convective activities in the upwind area can significantly influence the δ-values of precipitation. All these observations suggest lower intensity and frequency of regional organized convection in 2015. The ENSO event in 2015 was likely responsible for these changes. During an ENSO event, convection over the central and eastern Pacific is strengthened while that of the western Pacific and Southeast Asia is suppressed, resulting in a weakened

  6. Multivariate pattern analysis of fMRI: the early beginnings.

    PubMed

    Haxby, James V

    2012-08-15

    In 2001, we published a paper on the representation of faces and objects in ventral temporal cortex that introduced a new method for fMRI analysis, which subsequently came to be called multivariate pattern analysis (MVPA). MVPA now refers to a diverse set of methods that analyze neural responses as patterns of activity that reflect the varying brain states that a cortical field or system can produce. This paper recounts the circumstances and events that led to the original study and later developments and innovations that have greatly expanded this approach to fMRI data analysis, leading to its widespread application. Copyright © 2012 Elsevier Inc. All rights reserved.

  7. Infants’ Looking to Surprising Events: When Eye-Tracking Reveals More than Looking Time

    PubMed Central

    Yeung, H. Henny; Denison, Stephanie; Johnson, Scott P.

    2016-01-01

    Research on infants’ reasoning abilities often relies on looking times, which are longer for surprising and unexpected visual scenes than for unsurprising and expected ones. Few researchers have examined more precise visual scanning patterns in these scenes, and so, here, we recorded 8- to 11-month-olds’ gaze with an eye tracker as we presented a sampling event whose outcome was either surprising, neutral, or unsurprising: A red (or yellow) ball was drawn from one of three visible containers populated 0%, 50%, or 100% with identically colored balls. When measuring looking time to the whole scene, infants were insensitive to the likelihood of the sampling event, replicating failures in similar paradigms. Nevertheless, a new analysis of visual scanning showed that infants did spend more time fixating specific areas-of-interest as a function of the event likelihood. The drawn ball and its associated container attracted more looking than the other containers in the 0% condition, but this pattern was weaker in the 50% condition, and even less strong in the 100% condition. Results suggest that measuring where infants look may be more sensitive than simply how much looking there is to the whole scene. The advantages of eye tracking measures over traditional looking measures are discussed. PMID:27926920

  8. A Differential Deficit in Time- versus Event-based Prospective Memory in Parkinson's Disease

    PubMed Central

    Raskin, Sarah A.; Woods, Steven Paul; Poquette, Amelia J.; McTaggart, April B.; Sethna, Jim; Williams, Rebecca C.; Tröster, Alexander I.

    2010-01-01

    Objective The aim of the current study was to clarify the nature and extent of impairment in time- versus event-based prospective memory in Parkinson's disease (PD). Prospective memory is thought to involve cognitive processes that are mediated by prefrontal systems and are executive in nature. Given that individuals with PD frequently show executive dysfunction, it is important to determine whether these individuals may have deficits in prospective memory that could impact daily functions, such as taking medications. Although it has been reported that individuals with PD evidence impairment in prospective memory, it is still unclear whether they show a greater deficit for time- versus event-based cues. Method Fifty-four individuals with PD and 34 demographically similar healthy adults were administered a standardized measure of prospective memory that allows for a direct comparison of time-based and event-based cues. In addition, participants were administered a series of standardized measures of retrospective memory and executive functions. Results Individuals with PD demonstrated impaired prospective memory performance compared to the healthy adults, with a greater impairment demonstrated for the time-based tasks. Time-based prospective memory performance was moderately correlated with measures of executive functioning, but only the Stroop Neuropsychological Screening Test emerged as a unique predictor in a linear regression. Conclusions Findings are interpreted within the context of McDaniel and Einstein's (2000) multi-process theory to suggest that individuals with PD experience particular difficulty executing a future intention when the cue to execute the prescribed intention requires higher levels of executive control. PMID:21090895

  9. Reducing ambulance response times using discrete event simulation.

    PubMed

    Wei Lam, Sean Shao; Zhang, Zhong Cheng; Oh, Hong Choon; Ng, Yih Ying; Wah, Win; Hock Ong, Marcus Eng

    2014-01-01

    The objectives of this study are to develop a discrete-event simulation (DES) model for the Singapore Emergency Medical Services (EMS), and to demonstrate the utility of this DES model for the evaluation of different policy alternatives to improve ambulance response times. A DES model was developed based on retrospective emergency call data over a continuous 6-month period in Singapore. The main outcome measure is the distribution of response times. The secondary outcome measure is ambulance utilization levels based on unit hour utilization (UHU) ratios. The DES model was used to evaluate different policy options in order to improve the response times, while maintaining reasonable fleet utilization. Three policy alternatives looking at the reallocation of ambulances, the addition of new ambulances, and alternative dispatch policies were evaluated. Modifications of dispatch policy combined with the reallocation of existing ambulances were able to achieve response time performance equivalent to that of adding 10 ambulances. The median (90th percentile) response time was 7.08 minutes (12.69 minutes). Overall, this combined strategy managed to narrow the gap between the ideal and existing response time distribution by 11-13%. Furthermore, the median UHU under this combined strategy was 0.324 with an interquartile range (IQR) of 0.047 versus a median utilization of 0.285 (IQR of 0.051) resulting from the introduction of additional ambulances. Response times were shown to be improved via a more effective reallocation of ambulances and dispatch policy. More importantly, the response time improvements were achieved without a reduction in the utilization levels and additional costs associated with the addition of ambulances. We demonstrated the effective use of DES as a versatile platform to model the dynamic system complexities of Singapore's national EMS systems for the evaluation of operational strategies to improve ambulance response times.
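
    Illustrative sketch (far simpler than the study's model, standard library only): a discrete-event simulation of a multi-ambulance dispatch queue driven by a time-ordered event heap, reporting the median and 90th-percentile response delay. The arrival rate, service time, and fleet size are hypothetical.

      import heapq
      import random

      def simulate_ems(n_ambulances=3, call_rate=0.1, mean_service=25.0, horizon=10_000.0, seed=1):
          """Return (median, 90th percentile) dispatch waiting times in minutes."""
          random.seed(seed)
          events = [(random.expovariate(call_rate), "call")]   # (time, kind) min-heap
          free, queue, waits = n_ambulances, [], []
          while events:
              t, kind = heapq.heappop(events)
              if t > horizon:
                  break
              if kind == "call":
                  queue.append(t)                              # call waits for an ambulance
                  heapq.heappush(events, (t + random.expovariate(call_rate), "call"))
              else:                                            # an ambulance becomes free
                  free += 1
              while free > 0 and queue:                        # dispatch whenever possible
                  free -= 1
                  waits.append(t - queue.pop(0))
                  heapq.heappush(events, (t + random.expovariate(1.0 / mean_service), "done"))
          waits.sort()
          return waits[len(waits) // 2], waits[int(0.9 * len(waits))]

      print("median / 90th percentile wait (min):", simulate_ems())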

  10. “Smooth” Semiparametric Regression Analysis for Arbitrarily Censored Time-to-Event Data

    PubMed Central

    Zhang, Min; Davidian, Marie

    2008-01-01

    Summary A general framework for regression analysis of time-to-event data subject to arbitrary patterns of censoring is proposed. The approach is relevant when the analyst is willing to assume that distributions governing model components that are ordinarily left unspecified in popular semiparametric regression models, such as the baseline hazard function in the proportional hazards model, have densities satisfying mild “smoothness” conditions. Densities are approximated by a truncated series expansion that, for fixed degree of truncation, results in a “parametric” representation, which makes likelihood-based inference coupled with adaptive choice of the degree of truncation, and hence flexibility of the model, computationally and conceptually straightforward with data subject to any pattern of censoring. The formulation allows popular models, such as the proportional hazards, proportional odds, and accelerated failure time models, to be placed in a common framework; provides a principled basis for choosing among them; and renders useful extensions of the models straightforward. The utility and performance of the methods are demonstrated via simulations and by application to data from time-to-event studies. PMID:17970813

  11. Multivariate pattern dependence

    PubMed Central

    Saxe, Rebecca

    2017-01-01

    When we perform a cognitive task, multiple brain regions are engaged. Understanding how these regions interact is a fundamental step to uncover the neural bases of behavior. Most research on the interactions between brain regions has focused on the univariate responses in the regions. However, fine grained patterns of response encode important information, as shown by multivariate pattern analysis. In the present article, we introduce and apply multivariate pattern dependence (MVPD): a technique to study the statistical dependence between brain regions in humans in terms of the multivariate relations between their patterns of responses. MVPD characterizes the responses in each brain region as trajectories in region-specific multidimensional spaces, and models the multivariate relationship between these trajectories. We applied MVPD to the posterior superior temporal sulcus (pSTS) and to the fusiform face area (FFA), using a searchlight approach to reveal interactions between these seed regions and the rest of the brain. Across two different experiments, MVPD identified significant statistical dependence not detected by standard functional connectivity. Additionally, MVPD outperformed univariate connectivity in its ability to explain independent variance in the responses of individual voxels. In the end, MVPD uncovered different connectivity profiles associated with different representational subspaces of FFA: the first principal component of FFA shows differential connectivity with occipital and parietal regions implicated in the processing of low-level properties of faces, while the second and third components show differential connectivity with anterior temporal regions implicated in the processing of invariant representations of face identity. PMID:29155809
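
    Illustrative sketch (a simplified stand-in, not the authors' code): the seed region's response patterns are reduced to a low-dimensional trajectory with PCA, a multivariate linear mapping predicts the target region's patterns from that trajectory, and the variance explained is compared with a mean-signal (univariate) predictor. Region sizes and the latent-signal data are hypothetical, and the fit is evaluated in-sample for brevity.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(0)
      n_time, n_vox_seed, n_vox_target, n_latent = 300, 50, 40, 3

      # Hypothetical shared latent signals driving the response patterns of two regions.
      latent = rng.normal(size=(n_time, n_latent))
      seed = latent @ rng.normal(size=(n_latent, n_vox_seed)) + rng.normal(size=(n_time, n_vox_seed))
      target = latent @ rng.normal(size=(n_latent, n_vox_target)) + rng.normal(size=(n_time, n_vox_target))

      # MVPD-style model: low-dimensional seed trajectory predicting target patterns.
      seed_traj = PCA(n_components=n_latent).fit_transform(seed)
      r2_mvpd = LinearRegression().fit(seed_traj, target).score(seed_traj, target)

      # Univariate baseline: mean seed time course predicting the mean target time course.
      mean_seed = seed.mean(axis=1, keepdims=True)
      r2_uni = LinearRegression().fit(mean_seed, target.mean(axis=1)).score(mean_seed, target.mean(axis=1))

      print(f"MVPD R^2 = {r2_mvpd:.3f}   univariate R^2 = {r2_uni:.3f}")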

  12. Bootstrap-based methods for estimating standard errors in Cox's regression analyses of clustered event times.

    PubMed

    Xiao, Yongling; Abrahamowicz, Michal

    2010-03-30

    We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data, with latent cluster-level random effects, which are ignored in the conventional Cox's model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs, and type I error rates, and acceptable coverage rates, regardless of the true random effects distribution, and avoid serious variance under-estimation by conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of cluster event times.
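
    Illustrative sketch of the cluster-bootstrap idea using the lifelines package (a software choice assumed here, not prescribed by the paper): whole clusters are resampled with replacement, Cox's model is refit on each resample, and the standard deviation of the coefficient estimates serves as the corrected SE. The data, column names, and number of resamples are hypothetical.

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      def cluster_bootstrap_se(df, cluster_col, n_boot=200, seed=0):
          """Cluster-bootstrap standard errors for Cox regression coefficients."""
          rng = np.random.default_rng(seed)
          clusters = df[cluster_col].unique()
          estimates = []
          for _ in range(n_boot):
              sampled = rng.choice(clusters, size=len(clusters), replace=True)
              boot_df = pd.concat([df[df[cluster_col] == c] for c in sampled], ignore_index=True)
              cph = CoxPHFitter()
              cph.fit(boot_df.drop(columns=[cluster_col]), duration_col="time", event_col="event")
              estimates.append(cph.params_.values)
          return np.std(np.vstack(estimates), axis=0, ddof=1)

      # Hypothetical clustered event times with a binary cluster-level covariate and frailty.
      rng = np.random.default_rng(1)
      rows = []
      for c in range(40):
          x, frailty = c % 2, rng.normal(0, 0.5)
          for _ in range(10):
              rows.append({"cluster": c, "x": x, "event": 1,
                           "time": rng.exponential(np.exp(-(0.7 * x + frailty)))})
      df = pd.DataFrame(rows)
      print("cluster-bootstrap SE for x:", cluster_bootstrap_se(df, "cluster", n_boot=50))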

  13. Time difference of arrival to blast localization of potential chemical/biological event on the move

    NASA Astrophysics Data System (ADS)

    Morcos, Amir; Desai, Sachi; Peltzer, Brian; Hohil, Myron E.

    2007-10-01

    By integrating a sensor suite able to discriminate potential Chemical/Biological (CB) events from high-explosive (HE) events, using a standalone acoustic sensor with a Time Difference of Arrival (TDOA) algorithm, we developed a cueing mechanism for more power-intensive and range-limited sensing techniques. Once the event detection algorithm localizes a blast event using TDOA, we provide further information on the event, classifying it as launch or impact and as CB or HE. The added information is provided to a range-limited chemical sensing system that exploits spectroscopy to determine the contents of the chemical event. The main innovation of this sensor suite is that the system provides this information on the move, while the chemical sensor has adequate time to determine the contents of the event from a safe stand-off distance. The CB/HE discrimination algorithm exploits acoustic sensors to provide early detection and identification of CB attacks. Distinct characteristics arise within the different airburst signatures because HE warheads emphasize concussive and shrapnel effects, while CB warheads are designed to disperse their contents over large areas, therefore employing a slower burning, less intense explosive to mix and spread their contents. These differences are characterized by variations in the corresponding peak pressure and rise time of the blast, differences in the ratio of positive pressure amplitude to negative amplitude, and variations in the overall duration of the resulting waveform. The discrete wavelet transform (DWT) is used to extract the predominant components of these characteristics from air burst signatures at ranges exceeding 3 km. Highly reliable discrimination is achieved with a feed-forward neural network classifier trained on a feature space derived from the distribution of wavelet coefficients and higher-frequency details found within different levels of the multiresolution decomposition. The development of an adaptive noise
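
    Illustrative sketch of the TDOA step alone (a generic cross-correlation approach, not the fielded algorithm): the relative arrival time between two acoustic channels is estimated from the peak of their cross-correlation and converted to a path-length difference. The sampling rate, sensor signals, and assumed speed of sound are hypothetical.

      import numpy as np

      def tdoa(sig_a, sig_b, fs):
          """Arrival-time difference in seconds (positive: sig_a arrives later than sig_b)."""
          corr = np.correlate(sig_a, sig_b, mode="full")
          lag = np.argmax(corr) - (len(sig_b) - 1)        # lag in samples
          return lag / fs

      # Hypothetical blast pulse arriving 5 ms later at sensor B than at sensor A.
      fs = 10_000                                         # Hz
      t = np.arange(0, 1.0, 1.0 / fs)
      pulse = np.exp(-((t - 0.2) / 0.01) ** 2)            # idealized pressure pulse at 0.2 s
      sensor_a = pulse + np.random.normal(0, 0.01, t.size)
      sensor_b = np.roll(pulse, int(0.005 * fs)) + np.random.normal(0, 0.01, t.size)

      dt = tdoa(sensor_b, sensor_a, fs)
      print(f"estimated delay: {dt * 1e3:.2f} ms, path difference: {dt * 343.0:.2f} m")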

  14. Monitoring Natural Events Globally in Near Real-Time Using NASA's Open Web Services and Tools

    NASA Technical Reports Server (NTRS)

    Boller, Ryan A.; Ward, Kevin Alan; Murphy, Kevin J.

    2015-01-01

    Since 1960, NASA has been making global measurements of the Earth from a multitude of space-based missions, many of which can be useful for monitoring natural events. In recent years, these measurements have been made available in near real-time, making it possible to use them to also aid in managing the response to natural events. We present the challenges and ongoing solutions to using NASA satellite data for monitoring and managing these events.

  15. Supervised Time Series Event Detector for Building Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-04-13

    A machine-learning-based approach is developed to detect events that have rarely been seen in the historical data. The data can include building energy consumption, sensor data, environmental data and any data that may affect the building's energy consumption. The algorithm is a modified nonlinear Bayesian support vector machine, which examines daily energy consumption profiles, detects days with abnormal events, and diagnoses the cause of those events.
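
    Illustrative sketch (the modified Bayesian SVM itself is not described in this record, so an ordinary scikit-learn SVM stands in for it): daily 24-hour consumption profiles are labeled normal or abnormal and a kernel SVM is trained to flag event days. The profiles, labels, and hyperparameters are hypothetical.

      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)

      # Hypothetical daily profiles: 24 hourly consumption values per day (kWh).
      normal_days = rng.normal(50, 5, size=(300, 24))
      event_days = rng.normal(50, 5, size=(30, 24))
      event_days[:, 12:18] += 40                     # afternoon spike marks an abnormal event
      X = np.vstack([normal_days, event_days])
      y = np.array([0] * 300 + [1] * 30)             # 1 = day with an abnormal event

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

      # RBF-kernel SVM with class weighting to cope with the rarity of event days.
      clf = SVC(kernel="rbf", C=10.0, gamma="scale", class_weight="balanced", probability=True)
      clf.fit(X_tr, y_tr)
      print("held-out accuracy:", clf.score(X_te, y_te))
      print("event probability of first test day:", clf.predict_proba(X_te[:1])[0, 1])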

  16. Multiscale Characterization of PM2.5 in Southern Taiwan based on Noise-assisted Multivariate Empirical Mode Decomposition and Time-dependent Intrinsic Correlation

    NASA Astrophysics Data System (ADS)

    Hsiao, Y. R.; Tsai, C.

    2017-12-01

    As the WHO Air Quality Guideline indicates, ambient air pollution places world populations under threat of fatal diseases (e.g., heart disease, lung cancer, asthma), raising concerns about air pollution sources and related factors. This study presents a novel approach to investigating the multiscale variations of PM2.5 in southern Taiwan over the past decade, together with four meteorological influencing factors (temperature, relative humidity, precipitation and wind speed), based on the Noise-assisted Multivariate Empirical Mode Decomposition (NAMEMD) algorithm, Hilbert Spectral Analysis (HSA) and the Time-dependent Intrinsic Correlation (TDIC) method. The NAMEMD algorithm is a fully data-driven approach designed for nonlinear and nonstationary multivariate signals, and is used to decompose multivariate signals into a collection of channels of Intrinsic Mode Functions (IMFs). The TDIC method is an EMD-based method that uses a set of sliding window sizes to quantify localized correlation coefficients for multiscale signals. With the alignment property and quasi-dyadic filter bank of the NAMEMD algorithm, one is able to produce the same number of IMFs for all variables and to estimate the cross correlation more accurately. The spectral representation obtained with the NAMEMD-HSA method is compared with Complementary Ensemble Empirical Mode Decomposition/Hilbert Spectral Analysis (CEEMD-HSA) and wavelet analysis. The NAMEMD-based TDIC analysis is then compared with CEEMD-based TDIC analysis and traditional correlation analysis.

  17. TATES: Efficient Multivariate Genotype-Phenotype Analysis for Genome-Wide Association Studies

    PubMed Central

    van der Sluis, Sophie; Posthuma, Danielle; Dolan, Conor V.

    2013-01-01

    To date, the genome-wide association study (GWAS) is the primary tool to identify genetic variants that cause phenotypic variation. As GWAS analyses are generally univariate in nature, multivariate phenotypic information is usually reduced to a single composite score. This practice often results in loss of statistical power to detect causal variants. Multivariate genotype–phenotype methods do exist but attain maximal power only in special circumstances. Here, we present a new multivariate method that we refer to as TATES (Trait-based Association Test that uses Extended Simes procedure), inspired by the GATES procedure proposed by Li et al (2011). For each component of a multivariate trait, TATES combines p-values obtained in standard univariate GWAS to acquire one trait-based p-value, while correcting for correlations between components. Extensive simulations, probing a wide variety of genotype–phenotype models, show that TATES's false positive rate is correct, and that TATES's statistical power to detect causal variants explaining 0.5% of the variance can be 2.5–9 times higher than the power of univariate tests based on composite scores and 1.5–2 times higher than the power of the standard MANOVA. Unlike other multivariate methods, TATES detects both genetic variants that are common to multiple phenotypes and genetic variants that are specific to a single phenotype, i.e. TATES provides a more complete view of the genetic architecture of complex traits. As the actual causal genotype–phenotype model is usually unknown and probably phenotypically and genetically complex, TATES, available as an open source program, constitutes a powerful new multivariate strategy that allows researchers to identify novel causal variants, while the complexity of traits is no longer a limiting factor. PMID:23359524
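
    Illustrative sketch of the underlying Simes combination (a plain Simes test only; the full TATES procedure additionally replaces the number of tests with effective numbers derived from the phenotypic correlation matrix, which is omitted here). The p-values are hypothetical.

      import numpy as np

      def simes_combined_p(p_values):
          """Simes combination of per-phenotype GWAS p-values for a single variant."""
          p = np.sort(np.asarray(p_values, dtype=float))
          m = len(p)
          return float(np.min(m * p / np.arange(1, m + 1)))

      # Hypothetical univariate p-values of one SNP for four correlated phenotypes.
      print(simes_combined_p([0.002, 0.04, 0.30, 0.75]))   # trait-based p-value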

  18. Aperiodic Robust Model Predictive Control for Constrained Continuous-Time Nonlinear Systems: An Event-Triggered Approach.

    PubMed

    Liu, Changxin; Gao, Jian; Li, Huiping; Xu, Demin

    2018-05-01

    The event-triggered control is a promising solution to cyber-physical systems, such as networked control systems, multiagent systems, and large-scale intelligent systems. In this paper, we propose an event-triggered model predictive control (MPC) scheme for constrained continuous-time nonlinear systems with bounded disturbances. First, a time-varying tightened state constraint is computed to achieve robust constraint satisfaction, and an event-triggered scheduling strategy is designed in the framework of dual-mode MPC. Second, the sufficient conditions for ensuring feasibility and closed-loop robust stability are developed, respectively. We show that robust stability can be ensured and communication load can be reduced with the proposed MPC algorithm. Finally, numerical simulations and comparison studies are performed to verify the theoretical results.

  19. Event-specific real-time detection and quantification of genetically modified Roundup Ready soybean.

    PubMed

    Huang, Chia-Chia; Pan, Tzu-Ming

    2005-05-18

    The event-specific real-time detection and quantification of Roundup Ready soybean (RRS) using an ABI PRISM 7700 sequence detection system with light upon extension (LUX) primer was developed in this study. The event-specific primers were designed, targeting the junction of the RRS 5' integration site and the endogenous gene lectin1. Then, a standard reference plasmid was constructed that carried both of the targeted sequences for quantitative analysis. The detection limit of the LUX real-time PCR system was 0.05 ng of 100% RRS genomic DNA, which was equal to 20.5 copies. The range of quantification was from 0.1 to 100%. The sensitivity and range of quantification successfully met the requirement of the labeling rules in the European Union and Taiwan.

  20. Quantifying and estimating the predictive accuracy for censored time-to-event data with competing risks.

    PubMed

    Wu, Cai; Li, Liang

    2018-05-15

    This paper focuses on quantifying and estimating the predictive accuracy of prognostic models for time-to-event outcomes with competing events. We consider the time-dependent discrimination and calibration metrics, including the receiver operating characteristics curve and the Brier score, in the context of competing risks. To address censoring, we propose a unified nonparametric estimation framework for both discrimination and calibration measures, by weighting the censored subjects with the conditional probability of the event of interest given the observed data. The proposed method can be extended to time-dependent predictive accuracy metrics constructed from a general class of loss functions. We apply the methodology to a data set from the African American Study of Kidney Disease and Hypertension to evaluate the predictive accuracy of a prognostic risk score in predicting end-stage renal disease, accounting for the competing risk of pre-end-stage renal disease death, and evaluate its numerical performance in extensive simulation studies. Copyright © 2018 John Wiley & Sons, Ltd.
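
    Illustrative sketch of censoring weighting for a single-horizon Brier score (a generic inverse-probability-of-censoring-weighted construction that ignores the competing event, so it is not the authors' competing-risks estimator): subjects censored before the horizon contribute only through the weights of the remaining subjects. The data and risk model are hypothetical.

      import numpy as np

      def km_survival(times, events, t):
          """Kaplan-Meier estimate of P(T > t) for the supplied event indicator."""
          order = np.argsort(times)
          times, events = times[order], events[order]
          at_risk, surv = len(times), 1.0
          for ti, ei in zip(times, events):
              if ti > t:
                  break
              if ei:
                  surv *= 1.0 - 1.0 / at_risk
              at_risk -= 1
          return surv

      def ipcw_brier(times, events, risk_at_t, t):
          """IPCW Brier score at horizon t (events: 1 = event of interest, 0 = censored)."""
          score = 0.0
          for ti, ei, pi in zip(times, events, risk_at_t):
              if ti <= t and ei == 1:                # event observed before t: outcome = 1
                  score += (1.0 - pi) ** 2 / max(km_survival(times, 1 - events, ti), 1e-8)
              elif ti > t:                           # still event-free at t: outcome = 0
                  score += pi ** 2 / max(km_survival(times, 1 - events, t), 1e-8)
              # subjects censored before t enter only through the censoring weights
          return score / len(times)

      # Hypothetical data: exponential event times, independent censoring, a crude risk score.
      rng = np.random.default_rng(0)
      x = rng.normal(size=200)
      event_time = rng.exponential(np.exp(-x))
      cens_time = rng.exponential(2.0, size=200)
      times = np.minimum(event_time, cens_time)
      events = (event_time <= cens_time).astype(int)
      risk = 1.0 - np.exp(-np.exp(x))                # model-based P(T <= 1 | x)
      print("IPCW Brier score at t = 1:", ipcw_brier(times, events, risk, 1.0))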

  1. Precipitation-snowmelt timing and snowmelt augmentation of large peak flow events, western Cascades, Oregon

    NASA Astrophysics Data System (ADS)

    Jennings, Keith; Jones, Julia A.

    2015-09-01

    This study tested multiple hydrologic mechanisms to explain snowpack dynamics in extreme rain-on-snow floods, which occur widely in the temperate and polar regions. We examined 26 10-day large storm events over the period 1992-2012 in the H.J. Andrews Experimental Forest in western Oregon, using statistical analyses (regression, ANOVA, and wavelet coherence) of hourly snowmelt lysimeter, air and dewpoint temperature, wind speed, precipitation, and discharge data. All events involved snowpack outflow, but only seven events had continuous net snowpack outflow, including three of the five top-ranked peak discharge events. Peak discharge was not related to precipitation rate, but it was related to the 10-day sum of precipitation and net snowpack outflow, indicating an increased flood response to continuously melting snowpacks. The two largest peak discharge events in the study had significant wavelet coherence at multiple time scales over several days; a distribution of phase differences between precipitation and net snowpack outflow at the 12-32 h time scale with a sharp peak at π/2 radians; and strongly correlated snowpack outflow among lysimeters representing 42% of basin area. The recipe for an extreme rain-on-snow event includes persistent, slow melt within the snowpack, which appears to produce a near-saturated zone within the snowpack throughout the landscape, such that the snowpack may transmit pressure waves of precipitation directly to streams, and this process is synchronized across the landscape. Further work is needed to understand the internal dynamics of a melting snowpack throughout a snow-covered landscape and its contribution to extreme rain-on-snow floods.

  2. Combining techniques for screening and evaluating interaction terms on high-dimensional time-to-event data.

    PubMed

    Sariyar, Murat; Hoffmann, Isabell; Binder, Harald

    2014-02-26

    Molecular data, e.g., arising from microarray technology, is often used for predicting survival probabilities of patients. For multivariate risk prediction models on such high-dimensional data, there are established techniques that combine parameter estimation and variable selection. One big challenge is to incorporate interactions into such prediction models. In this feasibility study, we present building blocks for evaluating and incorporating interaction terms in high-dimensional time-to-event settings, especially for settings in which it is computationally too expensive to check all possible interactions. We use a boosting technique for estimation of effects and the following building blocks for pre-selecting interactions: (1) resampling, (2) random forests and (3) orthogonalization as a data pre-processing step. In a simulation study, the strategy that uses all building blocks is able to detect true main effects and interactions with high sensitivity in different kinds of scenarios. The main challenge is interactions composed of variables that do not represent main effects, but our findings are also promising in this regard. Results on real-world data illustrate that effect sizes of interactions frequently may not be large enough to improve prediction performance, even though the interactions are potentially of biological relevance. Screening interactions through random forests is feasible and useful when one is interested in finding relevant two-way interactions. The other building blocks also contribute considerably to an enhanced pre-selection of interactions. We determined the limits of interaction detection in terms of necessary effect sizes. Our study emphasizes the importance of making full use of existing methods in addition to establishing new ones.

  3. Event generators for address event representation transmitters

    NASA Astrophysics Data System (ADS)

    Serrano-Gotarredona, Rafael; Serrano-Gotarredona, Teresa; Linares Barranco, Bernabe

    2005-06-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows for real-time virtual massive connectivity between huge numbers of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Also, neurons generate 'events' according to their activity levels. More active neurons generate more events per unit time, and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. In a typical AER transmitter chip, there is an array of neurons that generate events. They send events to a peripheral circuit (let's call it the "AER Generator") that transforms those events into neuron coordinates (addresses), which are put sequentially on an interchip high-speed digital bus. This bus includes a parallel multi-bit address word plus Rqst (request) and Ack (acknowledge) handshaking signals for asynchronous data exchange. There have been two main approaches published in the literature for implementing such "AER Generator" circuits. They differ in the way they handle event collisions coming from the array of neurons. One approach is based on detecting and discarding collisions, while the other incorporates arbitration for sequencing colliding events. The first approach is supposed to be simpler and faster, while the second is able to handle much higher event traffic. In this article we will concentrate on the second, arbiter-based approach. Boahen has published several techniques for implementing and improving the arbiter-based approach. Originally, he proposed an arbitration scheme by rows, followed by a column arbitration. In this scheme, while one neuron was selected by the arbiters to transmit its event out of the chip, the rest of the neurons in the array were

  4. Evaluating principal surrogate endpoints with time-to-event data accounting for time-varying treatment efficacy.

    PubMed

    Gabriel, Erin E; Gilbert, Peter B

    2014-04-01

    Principal surrogate (PS) endpoints are relatively inexpensive and easy to measure study outcomes that can be used to reliably predict treatment effects on clinical endpoints of interest. Few statistical methods for assessing the validity of potential PSs utilize time-to-event clinical endpoint information and to our knowledge none allow for the characterization of time-varying treatment effects. We introduce the time-dependent and surrogate-dependent treatment efficacy curve, TE(t|s), and a new augmented trial design for assessing the quality of a biomarker as a PS. We propose a novel Weibull model and an estimated maximum likelihood method for estimation of the TE(t|s) curve. We describe the operating characteristics of our methods via simulations. We analyze data from the Diabetes Control and Complications Trial, in which we find evidence of a biomarker with value as a PS.

  5. Validation of cross-sectional time series and multivariate adaptive regression splines models for the prediction of energy expenditure in children and adolescents using doubly labeled water

    USDA-ARS?s Scientific Manuscript database

    Accurate, nonintrusive, and inexpensive techniques are needed to measure energy expenditure (EE) in free-living populations. Our primary aim in this study was to validate cross-sectional time series (CSTS) and multivariate adaptive regression splines (MARS) models based on observable participant cha...

  6. wayGoo recommender system: personalized recommendations for events scheduling, based on static and real-time information

    NASA Astrophysics Data System (ADS)

    Thanos, Konstantinos-Georgios; Thomopoulos, Stelios C. A.

    2016-05-01

    wayGoo is a fully functional application whose main functionalities include content geolocation, event scheduling, and indoor navigation. However, significant information about events does not reach users' attention, either because of the sheer amount of this information or because some of it comes from real-time data sources. The purpose of this work is to facilitate event management operations by prioritizing the presented events based on users' interests, using both static and real-time data. Through the wayGoo interface, users select conceptual topics that are interesting to them. These topics constitute a browsing behavior vector which is used for learning users' interests implicitly, without being intrusive. The system then estimates user preferences and returns an event list sorted from the most preferred to the least preferred. User preferences are modeled via a Naïve Bayesian Network which consists of: a) the 'decision' random variable, corresponding to users' decision on attending an event; b) the 'distance' random variable, modeled by a linear regression that estimates the probability that the distance between a user and each event destination is not discouraging; c) the 'seat availability' random variable, modeled by a linear regression, which estimates the probability that the seat availability is encouraging; and d) the 'relevance' random variable, modeled by clustering-based collaborative filtering, which determines the relevance of each event to the users' interests. Finally, experimental results show that the proposed system contributes substantially to assisting users in browsing and selecting events to attend.
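
    The ranking step the abstract describes can be illustrated with a naive-Bayes-style score that multiplies per-factor probabilities under a conditional-independence assumption. The probabilities, event names, and function below are hypothetical placeholders, not the wayGoo implementation.

```python
# Hedged sketch: rank events by a naive-Bayes-style score that multiplies
# per-factor probabilities (distance, seat availability, relevance), assuming
# conditional independence given the attend/not-attend decision.
# All numbers and names below are illustrative, not from the wayGoo system.

def attendance_score(p_distance_ok: float, p_seats_ok: float, p_relevant: float,
                     prior_attend: float = 0.5) -> float:
    return prior_attend * p_distance_ok * p_seats_ok * p_relevant

events = {
    "concert": attendance_score(0.8, 0.9, 0.7),
    "lecture": attendance_score(0.6, 0.95, 0.4),
    "expo":    attendance_score(0.3, 0.5, 0.9),
}

# Sort events from most to least preferred, as the recommender would present them.
for name, score in sorted(events.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {score:.3f}")
```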

  7. Multivariate stochastic simulation with subjective multivariate normal distributions

    Treesearch

    P. J. Ince; J. Buongiorno

    1991-01-01

    In many applications of Monte Carlo simulation in forestry or forest products, it may be known that some variables are correlated. However, for simplicity, in most simulations it has been assumed that random variables are independently distributed. This report describes an alternative Monte Carlo simulation technique for subjectively assessed multivariate normal...

  8. Multivariate Analysis and Prediction of Dioxin-Furan ...

    EPA Pesticide Factsheets

    Peer Review Draft of Regional Methods Initiative Final Report. Dioxins, which are bioaccumulative and environmentally persistent, pose an ongoing risk to human and ecosystem health. Fish constitute a significant source of dioxin exposure for humans and fish-eating wildlife. Current dioxin analytical methods are costly and time-consuming, and produce hazardous by-products. A Danish team developed a novel, multivariate statistical methodology based on the covariance of dioxin-furan congener Toxic Equivalences (TEQs) and fatty acid methyl esters (FAMEs) and applied it to North Atlantic Ocean fishmeal samples. The goal of the current study was to attempt to extend this Danish methodology to 77 whole and composite fish samples from three trophic groups: predator (whole largemouth bass), benthic (whole flathead and channel catfish) and forage fish (composite bluegill, pumpkinseed and green sunfish) from two dioxin contaminated rivers (Pocatalico R. and Kanawha R.) in West Virginia, USA. Multivariate statistical analyses, including Principal Components Analysis (PCA), Hierarchical Clustering, and Partial Least Squares Regression (PLS), were used to assess the relationship between the FAMEs and TEQs in these dioxin contaminated freshwater fish from the Kanawha and Pocatalico Rivers. These three multivariate statistical methods all confirm that the pattern of Fatty Acid Methyl Esters (FAMEs) in these freshwater fish covaries with and is predictive of the WHO TE

  9. Decoding Dynamic Brain Patterns from Evoked Responses: A Tutorial on Multivariate Pattern Analysis Applied to Time Series Neuroimaging Data.

    PubMed

    Grootswagers, Tijl; Wardle, Susan G; Carlson, Thomas A

    2017-04-01

    Multivariate pattern analysis (MVPA) or brain decoding methods have become standard practice in analyzing fMRI data. Although decoding methods have been extensively applied in brain-computer interfaces, these methods have only recently been applied to time series neuroimaging data such as MEG and EEG to address experimental questions in cognitive neuroscience. In a tutorial style review, we describe a broad set of options to inform future time series decoding studies from a cognitive neuroscience perspective. Using example MEG data, we illustrate the effects that different options in the decoding analysis pipeline can have on experimental results where the aim is to "decode" different perceptual stimuli or cognitive states over time from dynamic brain activation patterns. We show that decisions made at both preprocessing (e.g., dimensionality reduction, subsampling, trial averaging) and decoding (e.g., classifier selection, cross-validation design) stages of the analysis can significantly affect the results. In addition to standard decoding, we describe extensions to MVPA for time-varying neuroimaging data including representational similarity analysis, temporal generalization, and the interpretation of classifier weight maps. Finally, we outline important caveats in the design and interpretation of time series decoding experiments.

  10. A threshold-free summary index of prediction accuracy for censored time to event data.

    PubMed

    Yuan, Yan; Zhou, Qian M; Li, Bingying; Cai, Hengrui; Chow, Eric J; Armstrong, Gregory T

    2018-05-10

    Prediction performance of a risk scoring system needs to be carefully assessed before its adoption in clinical practice. Clinical preventive care often uses risk scores to screen asymptomatic population. The primary clinical interest is to predict the risk of having an event by a prespecified future time t 0 . Accuracy measures such as positive predictive values have been recommended for evaluating the predictive performance. However, for commonly used continuous or ordinal risk score systems, these measures require a subjective cutoff threshold value that dichotomizes the risk scores. The need for a cutoff value created barriers for practitioners and researchers. In this paper, we propose a threshold-free summary index of positive predictive values that accommodates time-dependent event status and competing risks. We develop a nonparametric estimator and provide an inference procedure for comparing this summary measure between 2 risk scores for censored time to event data. We conduct a simulation study to examine the finite-sample performance of the proposed estimation and inference procedures. Lastly, we illustrate the use of this measure on a real data example, comparing 2 risk score systems for predicting heart failure in childhood cancer survivors. Copyright © 2018 John Wiley & Sons, Ltd.

  11. A Web Portal-Based Time-Aware KML Animation Tool for Exploring Spatiotemporal Dynamics of Hydrological Events

    NASA Astrophysics Data System (ADS)

    Bao, X.; Cai, X.; Liu, Y.

    2009-12-01

    Understanding the spatiotemporal dynamics of hydrological events such as storms and droughts is highly valuable for decision making on disaster mitigation and recovery. Virtual Globe-based technologies such as Google Earth and the Open Geospatial Consortium KML standards show great promise for collaborative exploration of such events using visual analytical approaches. However, currently there are two barriers to wider usage of such approaches. First, there is no easy way to use open source tools to convert legacy or existing data formats such as shapefiles, GeoTIFF, or web services-based data sources to KML and to produce time-aware KML files. Second, an integrated web portal-based time-aware animation tool is currently not available. Thus users usually share their files in the portal but have no means to visually explore them without leaving the portal environment with which they are familiar. We develop a web portal-based time-aware KML animation tool for viewing extreme hydrologic events. The tool is based on the Google Earth JavaScript API and the Java Portlet standard 2.0 JSR-286, and it is currently deployable in one of the most popular open source portal frameworks, namely Liferay. We have also developed an open source toolkit, kml-soc-ncsa (http://code.google.com/p/kml-soc-ncsa/), to facilitate the conversion of multiple formats into KML and the creation of time-aware KML files. We illustrate our tool using some example cases, in which drought and storm events with both time and space dimensions can be explored in this web-based KML animation portlet. The tool provides an easy-to-use web browser-based portal environment for multiple users to collaboratively share and explore their time-aware KML files as well as improving the understanding of the spatiotemporal dynamics of the hydrological events.

  12. FREQ: A computational package for multivariable system loop-shaping procedures

    NASA Technical Reports Server (NTRS)

    Giesy, Daniel P.; Armstrong, Ernest S.

    1989-01-01

    Many approaches in the field of linear, multivariable time-invariant systems analysis and controller synthesis employ loop-shaping procedures wherein design parameters are chosen to shape frequency-response singular value plots of selected transfer matrices. A software package, FREQ, is documented for computing, within one unified framework, many of the most used multivariable transfer matrices for both continuous and discrete systems. The matrices are evaluated at user-selected frequency-response values, and singular values are plotted against frequency. Example computations are presented to demonstrate the use of the FREQ code.
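
    A rough sketch of the kind of computation such a package performs (not the FREQ code itself): evaluate the transfer matrix G(jω) = C(jωI − A)⁻¹B + D over a frequency grid and collect its singular values. The state-space matrices below are arbitrary example data.

```python
import numpy as np

# Hedged sketch (not the FREQ package itself): evaluate a continuous-time
# transfer matrix G(jw) = C (jwI - A)^-1 B + D on a frequency grid and
# collect its singular values, as a loop-shaping plot would require.
A = np.array([[0.0, 1.0], [-2.0, -0.5]])   # example state-space data (assumed)
B = np.array([[0.0, 0.0], [1.0, 0.5]])
C = np.eye(2)
D = np.zeros((2, 2))

freqs = np.logspace(-2, 2, 200)            # rad/s
sigma = np.empty((freqs.size, min(C.shape[0], B.shape[1])))
for k, w in enumerate(freqs):
    G = C @ np.linalg.solve(1j * w * np.eye(A.shape[0]) - A, B) + D
    sigma[k] = np.linalg.svd(G, compute_uv=False)

print("largest singular value over the grid:", sigma.max())
```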

  13. Gunbarrel mafic magmatic event: A key 780 Ma time marker for Rodinia plate reconstructions

    USGS Publications Warehouse

    Harlan, S.S.; Heaman, L.; LeCheminant, A.N.; Premo, W.R.

    2003-01-01

    Precise U-Pb baddeleyite dating of mafic igneous rocks provides evidence for a widespread and synchronous magmatic event that extended for >2400 km along the western margin of the Neoproterozoic Laurentian craton. U-Pb baddeleyite analyses for eight intrusions from seven localities ranging from the northern Canadian Shield to northwestern Wyoming-southwestern Montana are statistically indistinguishable and yield a composite U-Pb concordia age for this event of 780.3 ± 1.4 Ma (95% confidence level). This 780 Ma event is herein termed the Gunbarrel magmatic event. The mafic magmatism of the Gunbarrel event represents the largest mafic dike swarm yet identified along the Neoproterozoic margin of Laurentia. The origin of the mafic magmatism is not clear, but may be related to mantle-plume activity or upwelling asthenosphere leading to crustal extension accompanying initial breakup of the supercontinent Rodinia and development of the proto-Pacific Ocean. The mafic magmatism of the Gunbarrel magmatic event at 780 Ma predates the voluminous magmatism of the 723 Ma Franklin igneous event of the northwestern Canadian Shield by ~60 m.y. The precise dating of the extensive Neoproterozoic Gunbarrel and Franklin magmatic events provides unique time markers that can ultimately be used for robust testing of Neoproterozoic continental reconstructions.

  14. Event Reconstruction in the PandaRoot framework

    NASA Astrophysics Data System (ADS)

    Spataro, Stefano

    2012-12-01

    The PANDA experiment will study the collisions of beams of anti-protons, with momenta ranging from 2-15 GeV/c, with fixed proton and nuclear targets in the charm energy range, and will be built at the FAIR facility. In preparation for the experiment, the PandaRoot software framework is under development for detector simulation, reconstruction and data analysis, running on an Alien2-based grid. The basic features are handled by the FairRoot framework, based on ROOT and Virtual Monte Carlo, while the PANDA detector specifics and reconstruction code are implemented inside PandaRoot. The realization of Technical Design Reports for the tracking detectors has pushed the finalization of the tracking reconstruction code, which is complete for the Target Spectrometer, and of the analysis tools. Particle Identification algorithms are currently implemented using Bayesian approach and compared to Multivariate Analysis methods. Moreover, the PANDA data acquisition foresees a triggerless operation in which events are not defined by a hardware 1st level trigger decision, but all the signals are stored with time stamps requiring a deconvolution by the software. This has led to a redesign of the software from an event basis to a time-ordered structure. In this contribution, the reconstruction capabilities of the Panda spectrometer will be reported, focusing on the performances of the tracking system and the results for the analysis of physics benchmark channels, as well as the new (and challenging) concept of time-based simulation and its implementation.

  15. Prediction of beef color using time-domain nuclear magnetic resonance (TD-NMR) relaxometry data and multivariate analyses.

    PubMed

    Moreira, Luiz Felipe Pompeu Prado; Ferrari, Adriana Cristina; Moraes, Tiago Bueno; Reis, Ricardo Andrade; Colnago, Luiz Alberto; Pereira, Fabíola Manhas Verbi

    2016-05-19

    Time-domain nuclear magnetic resonance and chemometrics were used to predict color parameters, such as lightness (L*), redness (a*), and yellowness (b*), of beef (Longissimus dorsi muscle) samples. Analyzing the relaxation decays with multivariate models built with partial least-squares regression, color quality parameters were predicted. The partial least-squares models showed low errors independent of the sample size, indicating the potential of the method. Mincing and weighing were not necessary to improve the predictive performance of the models. The reduction of the transverse relaxation time (T2), measured by the Carr-Purcell-Meiboom-Gill pulse sequence, in darker beef compared with lighter beef can be explained by the lower relaxivity of Fe2+ present in deoxymyoglobin and oxymyoglobin (red beef) relative to the higher relaxivity of Fe3+ present in metmyoglobin (brown beef). These results indicate that time-domain nuclear magnetic resonance spectroscopy can become a useful tool for quality assessment of beef cattle on the bulk of the sample and through packages, because this technique is also widely applied to measure sensorial parameters, such as flavor, juiciness and tenderness, and physicochemical parameters, such as cooking loss, fat and moisture content, and instrumental tenderness using Warner-Bratzler shear force. Copyright © 2016 John Wiley & Sons, Ltd.
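
    The general workflow, relaxation decays in and color parameters out via partial least squares, can be sketched as follows on synthetic data; the decay model, sample sizes, and color relationships are invented for illustration and do not reproduce the paper's dataset.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Hedged sketch of the general workflow (synthetic data, not the paper's):
# predict CIELAB color parameters (L*, a*, b*) from CPMG-like relaxation decays
# with partial least-squares regression.
rng = np.random.default_rng(0)
n_samples, n_echoes = 40, 200
t = np.linspace(0.01, 2.0, n_echoes)                 # echo times (s), assumed
T2 = rng.uniform(0.03, 0.12, size=n_samples)         # transverse relaxation times
decays = np.exp(-t[None, :] / T2[:, None]) + 0.01 * rng.normal(size=(n_samples, n_echoes))
color = np.column_stack([                             # synthetic L*, a*, b* linked to T2
    30 + 200 * T2, 10 + 50 * T2, 5 + 30 * T2
]) + rng.normal(scale=0.5, size=(n_samples, 3))

pls = PLSRegression(n_components=3)
predicted = cross_val_predict(pls, decays, color, cv=5)
rmse = np.sqrt(((predicted - color) ** 2).mean(axis=0))
print("cross-validated RMSE for L*, a*, b*:", rmse)
```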

  16. Time evolution of atmospheric particle number concentration during high-intensity pyrotechnic events

    NASA Astrophysics Data System (ADS)

    Crespo, Javier; Yubero, Eduardo; Nicolás, Jose F.; Caballero, Sandra; Galindo, Nuria

    2014-10-01

    The Mascletàs are high-intensity pyrotechnic events, typical of eastern Spanish festivals, in which thousands of firecrackers are burnt at ground level in an intense, short (<8 min) deafening spectacle that generates short-lived, thick aerosol clouds. In this study, the impact of such events on air quality has been evaluated by means of particle number concentration measurements performed close to the venue during the June festival in Alicante (southeastern Spain). Peak concentrations and dilution times observed throughout the Mascletàs have been compared to those measured when conventional aerial fireworks were launched 2 km away from the monitoring site. The impact of the Mascletàs on the total number concentration of particles larger than 0.3 μm was higher (maximum ∼2·10⁴ cm⁻³) than that of fireworks (maximum ∼2·10³ cm⁻³). The effect of fireworks depended on whether the dominant meteorological conditions favoured the transport of the plume to the measurement location. However, the time required for particle concentrations to return to background levels is longer and more variable for firework displays (minutes to hours) than for the Mascletàs (<25 min).

  17. Comparing Within-Person Effects from Multivariate Longitudinal Models

    ERIC Educational Resources Information Center

    Bainter, Sierra A.; Howard, Andrea L.

    2016-01-01

    Several multivariate models are motivated to answer similar developmental questions regarding within-person (intraindividual) effects between 2 or more constructs over time, yet the within-person effects tested by each model are distinct. In this article, the authors clarify the types of within-person inferences that can be made from each model.…

  18. Distinct and shared cognitive functions mediate event- and time-based prospective memory impairment in normal ageing

    PubMed Central

    Gonneaud, Julie; Kalpouzos, Grégoria; Bon, Laetitia; Viader, Fausto; Eustache, Francis; Desgranges, Béatrice

    2011-01-01

    Prospective memory (PM) is the ability to remember to perform an action at a specific point in the future. Regarded as multidimensional, PM involves several cognitive functions that are known to be impaired in normal aging. In the present study, we set out to investigate the cognitive correlates of PM impairment in normal aging. Manipulating cognitive load, we assessed event- and time-based PM, as well as several cognitive functions, including executive functions, working memory and retrospective episodic memory, in healthy subjects covering the entire adult age range. We found that normal aging was characterized by PM decline in all conditions and that event-based PM was more sensitive to the effects of aging than time-based PM. Whatever the conditions, PM was linked to inhibition and processing speed. However, while event-based PM was mainly mediated by binding and retrospective memory processes, time-based PM was mainly related to inhibition. The only distinction between the cognitive correlates of high- and low-load PM lay in an additional, but marginal, correlation between updating and the high-load PM condition. The association of distinct cognitive functions, as well as shared mechanisms, with event- and time-based PM confirms that each type of PM relies on a different set of processes. PMID:21678154

  19. Multivariate exploration of non-intrusive load monitoring via spatiotemporal pattern network

    DOE PAGES

    Liu, Chao; Akintayo, Adedotun; Jiang, Zhanhong; ...

    2017-12-18

    Non-intrusive load monitoring (NILM) of electrical demand for the purpose of identifying load components has thus far mostly been studied using univariate data, e.g., using only whole building electricity consumption time series to identify a certain type of end-use such as lighting load. However, using additional variables in the form of multivariate time series data may provide more information in terms of extracting distinguishable features in the context of energy disaggregation. In this work, a novel probabilistic graphical modeling approach, namely the spatiotemporal pattern network (STPN), is proposed for energy disaggregation using multivariate time-series data. The STPN framework is shown to be capable of handling diverse types of multivariate time-series to improve the energy disaggregation performance. The technique outperforms the state-of-the-art factorial hidden Markov models (FHMM) and combinatorial optimization (CO) techniques in multiple real-life test cases. Furthermore, based on two homes' aggregate electric consumption data, a similarity metric is defined for the energy disaggregation of one home using a trained model based on the other home (i.e., out-of-sample case). The proposed similarity metric allows us to enhance scalability via learning supervised models for a few homes and deploying such models to many other similar but unmodeled homes with significantly high disaggregation accuracy.

  20. Multivariate exploration of non-intrusive load monitoring via spatiotemporal pattern network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Chao; Akintayo, Adedotun; Jiang, Zhanhong

    Non-intrusive load monitoring (NILM) of electrical demand for the purpose of identifying load components has thus far mostly been studied using univariate data, e.g., using only whole building electricity consumption time series to identify a certain type of end-use such as lighting load. However, using additional variables in the form of multivariate time series data may provide more information in terms of extracting distinguishable features in the context of energy disaggregation. In this work, a novel probabilistic graphical modeling approach, namely the spatiotemporal pattern network (STPN), is proposed for energy disaggregation using multivariate time-series data. The STPN framework is shown to be capable of handling diverse types of multivariate time-series to improve the energy disaggregation performance. The technique outperforms the state-of-the-art factorial hidden Markov models (FHMM) and combinatorial optimization (CO) techniques in multiple real-life test cases. Furthermore, based on two homes' aggregate electric consumption data, a similarity metric is defined for the energy disaggregation of one home using a trained model based on the other home (i.e., out-of-sample case). The proposed similarity metric allows us to enhance scalability via learning supervised models for a few homes and deploying such models to many other similar but unmodeled homes with significantly high disaggregation accuracy.

  1. Multiscale analysis of information dynamics for linear multivariate processes.

    PubMed

    Faes, Luca; Montalto, Alessandro; Stramaglia, Sebastiano; Nollo, Giandomenico; Marinazzo, Daniele

    2016-08-01

    In the study of complex physical and physiological systems represented by multivariate time series, an issue of great interest is the description of the system dynamics over a range of different temporal scales. While information-theoretic approaches to the multiscale analysis of complex dynamics are being increasingly used, the theoretical properties of the applied measures are poorly understood. This study introduces for the first time a framework for the analytical computation of information dynamics for linear multivariate stochastic processes explored at different time scales. After showing that the multiscale processing of a vector autoregressive (VAR) process introduces a moving average (MA) component, we describe how to represent the resulting VARMA process using state-space (SS) models and how to exploit the SS model parameters to compute analytical measures of information storage and information transfer for the original and rescaled processes. The framework is then used to quantify multiscale information dynamics for simulated unidirectionally and bidirectionally coupled VAR processes, showing that rescaling may lead to insightful patterns of information storage and transfer but also to potentially misleading behaviors.
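
    The coarse-graining step that turns a VAR process into a VARMA process can be illustrated with a short sketch: averaging over non-overlapping windows of length tau is a moving-average filter followed by downsampling. The VAR coefficients and window length below are arbitrary assumptions.

```python
import numpy as np

# Hedged sketch of the multiscale (coarse-graining) step discussed in the
# abstract: averaging a VAR process over non-overlapping windows of length
# tau acts as a moving-average filter followed by downsampling, which is why
# the rescaled process has a VARMA rather than a pure VAR structure.
rng = np.random.default_rng(5)
A = np.array([[0.5, 0.2], [0.0, 0.4]])          # VAR(1) coefficients (assumed)
n, tau = 3000, 4
x = np.zeros((n, 2))
for t in range(1, n):
    x[t] = A @ x[t - 1] + rng.normal(size=2)

# Coarse-grain: average within non-overlapping windows of length tau.
coarse = x[: n - n % tau].reshape(-1, tau, 2).mean(axis=1)
print("original shape:", x.shape, "-> rescaled shape:", coarse.shape)
```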

  2. A general framework for time series data mining based on event analysis: application to the medical domains of electroencephalography and stabilometry.

    PubMed

    Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P

    2014-10-01

    There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed scheme has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as, for example, electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.
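
    As an illustration of the general idea (not the authors' algorithm), events can be extracted as contiguous supra-threshold regions and two series compared by the events they share in time; the thresholds and signals below are synthetic.

```python
import numpy as np

# Hedged sketch of the general idea (not the authors' method): treat contiguous
# regions where the signal exceeds a threshold as "events", then compare two
# series by how many events overlap in time.

def extract_events(x: np.ndarray, threshold: float) -> list:
    """Return (start, end) index pairs of contiguous supra-threshold regions."""
    above = x > threshold
    edges = np.diff(above.astype(int))
    starts = list(np.where(edges == 1)[0] + 1)
    ends = list(np.where(edges == -1)[0] + 1)
    if above[0]:
        starts.insert(0, 0)
    if above[-1]:
        ends.append(len(x))
    return list(zip(starts, ends))

def shared_events(a, b) -> int:
    """Count events of series a that overlap at least one event of series b."""
    return sum(any(s < e2 and s2 < e for s2, e2 in b) for s, e in a)

rng = np.random.default_rng(1)
sig1, sig2 = rng.normal(size=500), rng.normal(size=500)
ev1, ev2 = extract_events(sig1, 2.0), extract_events(sig2, 2.0)
print(len(ev1), "events in series 1;", shared_events(ev1, ev2), "overlap series 2")
```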

  3. Interpretable Early Classification of Multivariate Time Series

    ERIC Educational Resources Information Center

    Ghalwash, Mohamed F.

    2013-01-01

    Recent advances in technology have led to an explosion in data collection over time rather than in a single snapshot. For example, microarray technology allows us to measure gene expression levels in different conditions over time. Such temporal data grants the opportunity for data miners to develop algorithms to address domain-related problems,…

  4. Visualization-by-Sketching: An Artist's Interface for Creating Multivariate Time-Varying Data Visualizations.

    PubMed

    Schroeder, David; Keefe, Daniel F

    2016-01-01

    We present Visualization-by-Sketching, a direct-manipulation user interface for designing new data visualizations. The goals are twofold: First, make the process of creating real, animated, data-driven visualizations of complex information more accessible to artists, graphic designers, and other visual experts with traditional, non-technical training. Second, support and enhance the role of human creativity in visualization design, enabling visual experimentation and workflows similar to what is possible with traditional artistic media. The approach is to conceive of visualization design as a combination of processes that are already closely linked with visual creativity: sketching, digital painting, image editing, and reacting to exemplars. Rather than studying and tweaking low-level algorithms and their parameters, designers create new visualizations by painting directly on top of a digital data canvas, sketching data glyphs, and arranging and blending together multiple layers of animated 2D graphics. This requires new algorithms and techniques to interpret painterly user input relative to data "under" the canvas, balance artistic freedom with the need to produce accurate data visualizations, and interactively explore large (e.g., terabyte-sized) multivariate datasets. Results demonstrate a variety of multivariate data visualization techniques can be rapidly recreated using the interface. More importantly, results and feedback from artists support the potential for interfaces in this style to attract new, creative users to the challenging task of designing more effective data visualizations and to help these users stay "in the creative zone" as they work.

  5. Bayesian regression model for recurrent event data with event-varying covariate effects and event effect.

    PubMed

    Lin, Li-An; Luo, Sheng; Davis, Barry R

    2018-01-01

    In the course of hypertension, cardiovascular disease events (e.g., stroke, heart failure) occur frequently and recurrently. The scientific interest in such a study may lie in the estimation of treatment effect while accounting for the correlation among event times. The correlation among recurrent event times comes from two sources: subject-specific heterogeneity (e.g., varied lifestyles, genetic variations, and other unmeasurable effects) and event dependence (i.e., event incidences may change the risk of future recurrent events). Moreover, event incidences may change the disease progression, so that there may exist event-varying covariate effects (the covariate effects may change after each event) and an event effect (the effect of prior events on future events). In this article, we propose a Bayesian regression model that not only accommodates correlation among recurrent events from both sources, but also explicitly characterizes the event-varying covariate effects and the event effect. This model is especially useful in quantifying how the incidences of events change the effects of covariates and the risk of future events. We compare the proposed model with several commonly used recurrent event models and apply our model to the motivating lipid-lowering trial (LLT) component of the Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial (ALLHAT) (ALLHAT-LLT).

  6. Bayesian regression model for recurrent event data with event-varying covariate effects and event effect

    PubMed Central

    Lin, Li-An; Luo, Sheng; Davis, Barry R.

    2017-01-01

    In the course of hypertension, cardiovascular disease events (e.g., stroke, heart failure) occur frequently and recurrently. The scientific interest in such a study may lie in the estimation of treatment effect while accounting for the correlation among event times. The correlation among recurrent event times comes from two sources: subject-specific heterogeneity (e.g., varied lifestyles, genetic variations, and other unmeasurable effects) and event dependence (i.e., event incidences may change the risk of future recurrent events). Moreover, event incidences may change the disease progression, so that there may exist event-varying covariate effects (the covariate effects may change after each event) and an event effect (the effect of prior events on future events). In this article, we propose a Bayesian regression model that not only accommodates correlation among recurrent events from both sources, but also explicitly characterizes the event-varying covariate effects and the event effect. This model is especially useful in quantifying how the incidences of events change the effects of covariates and the risk of future events. We compare the proposed model with several commonly used recurrent event models and apply our model to the motivating lipid-lowering trial (LLT) component of the Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial (ALLHAT) (ALLHAT-LLT). PMID:29755162

  7. Interaction between Gender and Skill on Competitive State Anxiety Using the Time-to-Event Paradigm: What Roles Do Intensity, Direction, and Frequency Dimensions Play?

    PubMed Central

    Hagan, John E.; Pollmann, Dietmar; Schack, Thomas

    2017-01-01

    Background and purpose: The functional understanding and examination of competitive anxiety responses as temporal events that unfold as time-to-competition moves closer has emerged as a topical research area within the domains of sport psychology. However, little is known from an inclusive and interaction oriented perspective. Using the multidimensional anxiety theory as a framework, the present study examined the temporal patterning of competitive anxiety, focusing on the dimensions of intensity, direction, and frequency of intrusions in athletes across gender and skill level. Methods: Elite and semi-elite table tennis athletes from the Ghanaian league (N = 90) completed a modified version of Competitive State Anxiety Inventory-2 (CSAI-2) with the inclusion of the directional and frequency of intrusion scales at three temporal phases (7 days, 2 days, and 1 h) prior to a competitive fixture. Results: Multivariate Analyses of Variance repeated measures with follow-up analyses revealed significant interactions for between-subjects factors on all anxiety dimensions (intensity, direction, and frequency). Notably, elite (international) female athletes were less cognitively anxious, showed more facilitative interpretation toward somatic anxiety symptoms and experienced less frequency of somatic anxiety symptoms than their male counterparts. However, both elite groups displayed appreciable level of self-confidence. For time-to-event effects, both cognitive and somatic anxiety intensity fluctuated whereas self-confidence showed a steady rise as competition neared. Somatic anxiety debilitative interpretation slightly improved 1 h before competition whereas cognitive anxiety frequencies also increased progressively during the entire preparatory phase. Conclusion: Findings suggest a more dynamic image of elite athletes’ pre-competitive anxiety responses than suggested by former studies, potentially influenced by cultural differences. The use of psychological skills

  8. Designing group sequential randomized clinical trials with time to event end points using a R function.

    PubMed

    Filleron, Thomas; Gal, Jocelyn; Kramar, Andrew

    2012-10-01

    A major and difficult task is the design of clinical trials with a time-to-event endpoint. In fact, it is necessary to compute the number of events and, in a second step, the required number of patients. Several commercial software packages are available for computing sample size in clinical trials with sequential designs and time-to-event endpoints, but only a few R functions have been implemented. The purpose of this paper is to describe the features and use of the R function plansurvct.func, an add-on function to the gsDesign package, which in a single run calculates the number of events and the required sample size, as well as the boundaries and corresponding p-values for a group sequential design. The use of the function plansurvct.func is illustrated by several examples and validated using East software. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  9. Sentinel events predicting later unwanted sex among girls: A national survey in Haiti, 2012.

    PubMed

    Sumner, Steven A; Marcelin, Louis H; Cela, Toni; Mercy, James A; Lea, Veronica; Kress, Howard; Hillis, Susan D

    2015-12-01

    Sexual violence against children is a significant global public health problem, yet limited studies exist from low-resource settings. In Haiti we conducted the country's first, nationally representative survey focused on childhood violence to help inform the development of a national action plan for violence against children. The Haiti Violence Against Children Survey was a household-level, multistage, cluster survey among youth age 13-24. In this analysis we sought to determine whether sexual violence sentinel events (unwanted sexual touching or unwanted attempted sex) were predictive of later unwanted, completed, penetrative sex in Haiti. We also sought to explore characteristics of sentinel events and help-seeking behavior among Haitian children. Multivariable logistic regression was used to test associations between sentinel events and later unwanted, completed, penetrative sex. Overall, 1,457 females reported on experiences of sexual violence occurring in childhood (before age 18). A sentinel event occurred in 40.4% of females who experienced subsequent unwanted completed sex. Females experiencing a sentinel event were approximately two and a half times more likely to experience later unwanted completed sex (adjusted odds ratio=2.40, p=.004) compared to individuals who did not experience a sentinel event. The mean lag time from first sentinel event to first unwanted completed sex was 2.3 years. Only half (54.6%) of children experiencing a sentinel event told someone about their experience of sexual violence. Among children, sentinel events occur frequently before later acts of completed unwanted sex and may represent a useful point of intervention. Reporting of sexual violence by children in Haiti is low and can be improved to better act on sentinel events. Published by Elsevier Ltd.

  10. Multivariate spatial analysis of a heavy rain event in a densely populated delta city

    NASA Astrophysics Data System (ADS)

    Gaitan, Santiago; ten Veldhuis, Marie-claire; Bruni, Guenda; van de Giesen, Nick

    2014-05-01

    Delta cities account for half of the world's population and host key infrastructure and services for global economic growth. Due to the characteristic geography of delta areas, these cities face high vulnerability to extreme weather and pluvial flooding risks, which are expected to increase as climate change drives heavier rain events. Besides, delta cities are subject to fast urban densification processes that progressively make them more vulnerable to pluvial flooding. Delta cities need to be adapted to better cope with this threat. The mechanism leading to damage after heavy rains is not completely understood. For instance, current research has shown that rain intensities and volumes can only partially explain the occurrence and localization of rain-related insurance claims (Spekkers et al., 2013). The goal of this paper is to provide further insights into spatial characteristics of the urban environment that can significantly be linked to pluvial-flooding impacts. To that end, a case study was selected: on October 12 to 14, 2013, a heavy rain event triggered pluvial floods in Rotterdam, a densely populated city which is undergoing multiple climate adaptation efforts and is located in the Meuse river delta. While the average yearly precipitation in this city is around 800 mm, local rain gauge measurements ranged from approximately 60 to 130 mm during just these three days. More than 600 telephone complaints from citizens reported impacts related to the rainfall. The registry of those complaints, which comprises around 300 calls made to the municipality and another 300 to the fire brigade, was made available for research. Other accessible information about this city includes a series of rainfall measurements with up to 1 min time-step at 7 different locations around the city, ground-based radar rainfall data (1 km² spatial resolution and 5 min time-step), a digital elevation model (50 cm horizontal resolution), a model of overland-flow paths, cadastral

  11. Foreshocks and aftershocks of Pisagua 2014 earthquake: time and space evolution of megathrust event.

    NASA Astrophysics Data System (ADS)

    Fuenzalida Velasco, Amaya; Rietbrock, Andreas; Wollam, Jack; Thomas, Reece; de Lima Neto, Oscar; Tavera, Hernando; Garth, Thomas; Ruiz, Sergio

    2016-04-01

    The 2014 Pisagua earthquake of magnitude 8.2 is the first case in Chile where a foreshock sequence was clearly recorded by a local network, along with the complete sequence including the mainshock and its aftershocks. The seismicity of the last year before the mainshock included numerous clusters close to the epicentral zone (Ruiz et al., 2014), but it was on 16th March that this activity became stronger, with the Mw 6.7 precursory event taking place off the coast of Iquique at 12 km depth. The Pisagua earthquake arrived on 1st April 2014, breaking almost 120 km N-S, and two days later a 7.6 aftershock occurred in the south of the rupture, enlarging the zone affected by this sequence. In this work, we analyse the foreshock and aftershock sequence of the Pisagua earthquake, from the spatial and temporal evolution of a total of 15,764 events that were recorded from 1st March to 31st May 2014. This event catalogue was obtained from automatic analysis of raw seismic data from more than 50 stations installed in the north of Chile and the south of Peru. We used the STA/LTA algorithm for the detection of P and S arrival times on the vertical components and then a method of back propagation in a 1D velocity model for event association and preliminary location of hypocenters, following the algorithm outlined by Rietbrock et al. (2012). These results were then improved by locating with the NonLinLoc software using a regional velocity model. We selected the larger events to analyse their moment tensor solutions by full waveform inversion using the ISOLA software. In order to understand the process of nucleation and propagation of the Pisagua earthquake, we also analysed the evolution in time of the seismicity over the three months of data. The zone where the precursory events took place was strongly activated two weeks before the mainshock and remained very active until the end of the analysed period, with an important part of the seismicity located in the upper plate and having

  12. MOBBED: a computational data infrastructure for handling large collections of event-rich time series datasets in MATLAB

    PubMed Central

    Cockfield, Jeremy; Su, Kyungmin; Robbins, Kay A.

    2013-01-01

    Experiments to monitor human brain activity during active behavior record a variety of modalities (e.g., EEG, eye tracking, motion capture, respiration monitoring) and capture a complex environmental context leading to large, event-rich time series datasets. The considerable variability of responses within and among subjects in more realistic behavioral scenarios requires experiments to assess many more subjects over longer periods of time. This explosion of data requires better computational infrastructure to more systematically explore and process these collections. MOBBED is a lightweight, easy-to-use, extensible toolkit that allows users to incorporate a computational database into their normal MATLAB workflow. Although capable of storing quite general types of annotated data, MOBBED is particularly oriented to multichannel time series such as EEG that have event streams overlaid with sensor data. MOBBED directly supports access to individual events, data frames, and time-stamped feature vectors, allowing users to ask questions such as what types of events or features co-occur under various experimental conditions. A database provides several advantages not available to users who process one dataset at a time from the local file system. In addition to archiving primary data in a central place to save space and avoid inconsistencies, such a database allows users to manage, search, and retrieve events across multiple datasets without reading the entire dataset. The database also provides infrastructure for handling more complex event patterns that include environmental and contextual conditions. The database can also be used as a cache for expensive intermediate results that are reused in such activities as cross-validation of machine learning algorithms. MOBBED is implemented over PostgreSQL, a widely used open source database, and is freely available under the GNU general public license at http://visual.cs.utsa.edu/mobbed. Source and issue reports for MOBBED

  13. MOBBED: a computational data infrastructure for handling large collections of event-rich time series datasets in MATLAB.

    PubMed

    Cockfield, Jeremy; Su, Kyungmin; Robbins, Kay A

    2013-01-01

    Experiments to monitor human brain activity during active behavior record a variety of modalities (e.g., EEG, eye tracking, motion capture, respiration monitoring) and capture a complex environmental context leading to large, event-rich time series datasets. The considerable variability of responses within and among subjects in more realistic behavioral scenarios requires experiments to assess many more subjects over longer periods of time. This explosion of data requires better computational infrastructure to more systematically explore and process these collections. MOBBED is a lightweight, easy-to-use, extensible toolkit that allows users to incorporate a computational database into their normal MATLAB workflow. Although capable of storing quite general types of annotated data, MOBBED is particularly oriented to multichannel time series such as EEG that have event streams overlaid with sensor data. MOBBED directly supports access to individual events, data frames, and time-stamped feature vectors, allowing users to ask questions such as what types of events or features co-occur under various experimental conditions. A database provides several advantages not available to users who process one dataset at a time from the local file system. In addition to archiving primary data in a central place to save space and avoid inconsistencies, such a database allows users to manage, search, and retrieve events across multiple datasets without reading the entire dataset. The database also provides infrastructure for handling more complex event patterns that include environmental and contextual conditions. The database can also be used as a cache for expensive intermediate results that are reused in such activities as cross-validation of machine learning algorithms. MOBBED is implemented over PostgreSQL, a widely used open source database, and is freely available under the GNU general public license at http://visual.cs.utsa.edu/mobbed. Source and issue reports for MOBBED

  14. Optical Spectroscopy and Multivariate Analysis for Biodosimetry and Monitoring of Radiation Injury to the Skin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levitskaia, Tatiana G.; Bryan, Samuel A.; Creim, Jeffrey A.

    2012-08-01

    In the event of an intentional or accidental release of ionizing radiation in a densely populated area, timely assessment and triage of the general population for radiation exposure is critical. In particular, a significant number of the victims may sustain cutaneous radiation injury, which increases mortality and worsens the overall prognosis of victims suffering from combined thermal/mechanical and radiation trauma. Diagnosis of cutaneous radiation injury is challenging, and established methods largely rely on visual manifestations, the presence of skin contamination, and a high degree of recall by the victim. The availability of a high-throughput, non-invasive in vivo biodosimetry tool for assessment of the radiation exposure of the skin is of particular importance for the timely diagnosis of cutaneous injury. In the reported investigation, we tested the potential of optical reflectance spectroscopy for the evaluation of radiation injury to the skin. This is technically attractive because optical spectroscopy relies on well-established instrumentation routinely used for various applications, one example being pulse oximetry, which uses selected wavelengths for the quantification of blood oxygenation. Our method relies on a broad spectral region ranging from locally absorbed, shallow-penetrating ultraviolet and visible (250 to 800 nm) to more deeply penetrating near-infrared (800-1600 nm) light for the monitoring of multiple physiological changes in the skin upon irradiation. Chemometrics is a multivariate methodology that allows the information from the entire spectral region to be used to generate predictive regression models. In this report we demonstrate that a simple spectroscopic method, such as optical reflectance spectroscopy, in combination with multivariate data analysis, offers the promise of rapid and non-invasive in vivo diagnosis and monitoring of cutaneous radiation exposure, and is able accurately

  15. Construction and updating of event models in auditory event processing.

    PubMed

    Huff, Markus; Maurer, Annika E; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank

    2018-02-01

    Humans segment the continuous stream of sensory information into distinct events at points of change. Between 2 events, humans perceive an event boundary. Present theories propose that changes in the sensory information trigger updating processes of the current event model. Increased encoding effort finally leads to a memory benefit at event boundaries. Evidence from reading time studies (increased reading times with increasing amount of change) suggests that updating of event models is incremental. We present results from 5 experiments that studied event processing (including memory formation processes and reading times) using an audio drama as well as a transcript thereof as stimulus material. Experiments 1a and 1b replicated the event boundary advantage effect for memory. In contrast to recent evidence from studies using visual stimulus material, Experiments 2a and 2b found no support for incremental updating with normally sighted and blind participants for recognition memory. In Experiment 3, we replicated Experiment 2a using a written transcript of the audio drama as stimulus material, allowing us to disentangle encoding and retrieval processes. Our results indicate incremental updating processes at encoding (as measured with reading times). At the same time, we again found recognition performance to be unaffected by the amount of change. We discuss these findings in light of current event cognition theories. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  16. Future versus present: time perspective and pupillary response in a relatedness judgment task investigating temporal event knowledge.

    PubMed

    Nowack, Kati; Milfont, Taciano L; van der Meer, Elke

    2013-02-01

    Mental representations of events contain many components such as typical agents, instruments, objects as well as a temporal dimension that is directed towards the future. While the role of temporal orientation (chronological, reverse) in event knowledge has been demonstrated by numerous studies, little is known about the influence of time perspective (present or future) as source of individual differences affecting event knowledge. The present study combined behavioral data with task-evoked pupil dilation to examine the impact of time perspective on cognitive resource allocation. In a relatedness judgment task, everyday events like raining were paired with an object feature like wet. Chronological items were processed more easily than reverse items regardless of time perspective. When more automatic processes were applied, greater scores on future time perspective were associated with lower error rates for chronological items. This suggests that a match between a strong focus on future consequences and items with a temporal orientation directed toward the future serves to enhance responding accuracy. Indexed by pupillary data, future-oriented participants invested more cognitive resources while outperforming present-oriented participants in reaction times across all conditions. This result was supported by a principal component analysis on the pupil data, which demonstrated the same impact of time perspective on the factor associated with more general aspects of cognitive effort. These findings suggest that future time perspective may be linked to a more general cognitive performance characteristic that improves overall task performance. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. Multivariate Strategies in Functional Magnetic Resonance Imaging

    ERIC Educational Resources Information Center

    Hansen, Lars Kai

    2007-01-01

    We discuss aspects of multivariate fMRI modeling, including the statistical evaluation of multivariate models and means for dimensional reduction. In a case study we analyze linear and non-linear dimensional reduction tools in the context of a "mind reading" predictive multivariate fMRI model.

  18. Comparison of design strategies for a three-arm clinical trial with time-to-event endpoint: Power, time-to-analysis, and operational aspects.

    PubMed

    Asikanius, Elina; Rufibach, Kaspar; Bahlo, Jasmin; Bieska, Gabriele; Burger, Hans Ulrich

    2016-11-01

    To optimize resources, randomized clinical trials with multiple arms can be an attractive option to simultaneously test various treatment regimens in pharmaceutical drug development. The motivation for this work was the successful conduct and positive final outcome of a three-arm randomized clinical trial primarily assessing whether obinutuzumab plus chlorambucil in patients with chronic lymphocytic leukemia and coexisting conditions is superior to chlorambucil alone based on a time-to-event endpoint. The inference strategy of this trial was based on a closed testing procedure. We compare this strategy to three potential alternatives for running a three-arm clinical trial with a time-to-event endpoint. The primary goal is to quantify the differences between these strategies in terms of the time it takes until the first analysis and thus potential approval of a new drug, the number of required events, and power. Operational aspects of implementing the various strategies are discussed. In conclusion, using a closed testing procedure results in the shortest time to the first analysis with a minimal loss in power. Therefore, closed testing procedures should be part of the statistician's standard clinical trials toolbox when planning multiarm clinical trials. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Music, clicks, and their imaginations favor differently the event-based timing component for rhythmic movements.

    PubMed

    Bravi, Riccardo; Quarta, Eros; Del Tongo, Claudia; Carbonaro, Nicola; Tognetti, Alessandro; Minciacchi, Diego

    2015-06-01

    The involvement or noninvolvement of a clock-like neural process, an effector-independent representation of the time intervals to produce, is described as the essential difference between event-based and emergent timing. In a previous work (Bravi et al. in Exp Brain Res 232:1663-1675, 2014a. doi: 10.1007/s00221-014-3845-9), we studied repetitive isochronous wrist flexion-extensions (IWFEs), performed while minimizing visual and tactile information, to clarify whether non-temporal and temporal characteristics of paced auditory stimuli affect the precision and accuracy of the rhythmic motor performance. Here, with the inclusion of new recordings, we expand the examination of the dataset described in our previous study to investigate whether simple and complex paced auditory stimuli (clicks and music) and their imaginations influence in a different way the timing mechanisms for repetitive IWFEs. Sets of IWFEs were analyzed by the windowed (lag one) autocorrelation, wγ(1), a statistical method recently introduced for the distinction between event-based and emergent timing. Our findings provide evidence that paced auditory information and its imagination favor the engagement of a clock-like neural process, and specifically that music, unlike clicks, lacks the power to elicit event-based timing, not counteracting the natural shift of wγ(1) toward positive values as the frequency of movements increases.
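
    A hedged sketch of the statistic may clarify the method: here wγ(1) is assumed to be the lag-one autocorrelation of inter-response intervals averaged over sliding windows, with negative values read as event-based (clock-like) timing and positive values as emergent timing; the window length and simulated tapping data are illustrative only.

```python
import numpy as np

# Hedged sketch, assuming w-gamma(1) is computed as the mean lag-one
# autocorrelation over sliding windows of the inter-response-interval series.
# Negative values are read as evidence of event-based (clock-like) timing,
# positive values as emergent timing; the window length here is illustrative.

def windowed_lag1_autocorr(intervals: np.ndarray, window: int = 30) -> float:
    acs = []
    for start in range(len(intervals) - window + 1):
        w = intervals[start:start + window]
        w = w - w.mean()
        denom = (w ** 2).sum()
        if denom > 0:
            acs.append((w[:-1] * w[1:]).sum() / denom)
    return float(np.mean(acs))

rng = np.random.default_rng(2)
# Event-based-like series: responses follow an internal clock plus independent
# motor delays, which induces a negative lag-one autocorrelation in intervals.
clock_ticks = np.cumsum(rng.normal(500, 20, size=200))        # ms
motor_delay = rng.normal(0, 15, size=200)
intervals = np.diff(clock_ticks + motor_delay)
print("w_gamma(1) ~", round(windowed_lag1_autocorr(intervals), 3))
```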

  20. Continuous robust sound event classification using time-frequency features and deep learning

    PubMed Central

    Song, Yan; Xiao, Wei; Phan, Huy

    2017-01-01

    The automatic detection and recognition of sound events by computers is a requirement for a number of emerging sensing and human computer interaction technologies. Recent advances in this field have been achieved by machine learning classifiers working in conjunction with time-frequency feature representations. This combination has achieved excellent accuracy for classification of discrete sounds. The ability to recognise sounds under real-world noisy conditions, called robust sound event classification, is an especially challenging task that has attracted recent research attention. Another aspect of real-world conditions is the classification of continuous, occluded or overlapping sounds, rather than classification of short isolated sound recordings. This paper addresses the classification of noise-corrupted, occluded, overlapped, continuous sound recordings. It first proposes a standard evaluation task for such sounds based upon a common existing method for evaluating isolated sound classification. It then benchmarks several high performing isolated sound classifiers to operate with continuous sound data by incorporating an energy-based event detection front end. Results are reported for each tested system using the new task, to provide the first analysis of their performance for continuous sound event detection. In addition it proposes and evaluates a novel Bayesian-inspired front end for the segmentation and detection of continuous sound recordings prior to classification. PMID:28892478

  1. Continuous robust sound event classification using time-frequency features and deep learning.

    PubMed

    McLoughlin, Ian; Zhang, Haomin; Xie, Zhipeng; Song, Yan; Xiao, Wei; Phan, Huy

    2017-01-01

    The automatic detection and recognition of sound events by computers is a requirement for a number of emerging sensing and human computer interaction technologies. Recent advances in this field have been achieved by machine learning classifiers working in conjunction with time-frequency feature representations. This combination has achieved excellent accuracy for classification of discrete sounds. The ability to recognise sounds under real-world noisy conditions, called robust sound event classification, is an especially challenging task that has attracted recent research attention. Another aspect of real-world conditions is the classification of continuous, occluded or overlapping sounds, rather than classification of short isolated sound recordings. This paper addresses the classification of noise-corrupted, occluded, overlapped, continuous sound recordings. It first proposes a standard evaluation task for such sounds based upon a common existing method for evaluating isolated sound classification. It then benchmarks several high performing isolated sound classifiers to operate with continuous sound data by incorporating an energy-based event detection front end. Results are reported for each tested system using the new task, to provide the first analysis of their performance for continuous sound event detection. In addition it proposes and evaluates a novel Bayesian-inspired front end for the segmentation and detection of continuous sound recordings prior to classification.
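
    The energy-based event detection front end mentioned above can be illustrated with a simple sketch: frame the signal, compute per-frame energy, and flag frames whose energy rises a fixed margin above an estimated noise floor. The frame sizes, thresholds, and synthetic audio below are assumptions, not the authors' settings.

```python
import numpy as np

# Hedged sketch of an energy-based event detection front end (not the
# authors' exact implementation): split audio into frames, compute frame
# energy, and mark frames whose energy exceeds a noise-relative threshold
# as candidate sound-event segments to pass to a classifier.

def detect_active_frames(signal: np.ndarray, frame_len: int = 1024,
                         hop: int = 512, threshold_db: float = 10.0) -> np.ndarray:
    n_frames = 1 + (len(signal) - frame_len) // hop
    energy = np.array([
        np.sum(signal[i * hop:i * hop + frame_len] ** 2) for i in range(n_frames)
    ])
    energy_db = 10 * np.log10(energy + 1e-12)
    noise_floor = np.percentile(energy_db, 10)     # crude noise estimate
    return energy_db > noise_floor + threshold_db  # boolean mask of "event" frames

rng = np.random.default_rng(3)
audio = 0.01 * rng.normal(size=16000)              # 1 s of noise at 16 kHz (assumed)
audio[6000:9000] += np.sin(2 * np.pi * 440 * np.arange(3000) / 16000)  # a tone "event"
active = detect_active_frames(audio)
print(f"{active.sum()} of {active.size} frames flagged as containing sound events")
```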

  2. Real-time monitoring of a coffee roasting process with near infrared spectroscopy using multivariate statistical analysis: A feasibility study.

    PubMed

    Catelani, Tiago A; Santos, João Rodrigo; Páscoa, Ricardo N M J; Pezza, Leonardo; Pezza, Helena R; Lopes, João A

    2018-03-01

    This work proposes the use of near infrared (NIR) spectroscopy in diffuse reflectance mode and multivariate statistical process control (MSPC) based on principal component analysis (PCA) for real-time monitoring of the coffee roasting process. The main objective was the development of an MSPC methodology able to detect disturbances to the roasting process early, drawing on real-time acquisition of NIR spectra. A total of fifteen roasting batches were defined according to an experimental design to develop the MSPC models. This methodology was tested on a set of five batches in which disturbances of different natures were imposed to simulate real faulty situations. Some of these batches were used to optimize the model while the remaining ones were used to test the methodology. A modelling strategy based on a time sliding window provided the best results in terms of distinguishing batches with and without disturbances, using typical MSPC charts: Hotelling's T² and squared prediction error (SPE) statistics. A PCA model encompassing a time window of four minutes with three principal components was able to efficiently detect all disturbances assayed. NIR spectroscopy combined with the MSPC approach proved to be an adequate auxiliary tool for coffee roasters to detect faults in a conventional roasting process in real time. Copyright © 2017 Elsevier B.V. All rights reserved.
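
    A minimal sketch of the two MSPC monitoring statistics mentioned above, assuming a PCA model fitted on spectra collected under normal operating conditions. scikit-learn is used for the PCA; the number of components, the synthetic data, and the omission of control-limit estimation are simplifications for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA

# Fit the PCA model on spectra acquired under normal operating conditions
# (rows = spectra over time, columns = wavelengths); data here are synthetic.
rng = np.random.default_rng(1)
noc_spectra = rng.normal(size=(200, 50))
pca = PCA(n_components=3).fit(noc_spectra)

def t2_and_spe(x):
    """Hotelling's T^2 and squared prediction error (SPE) for one spectrum."""
    scores = pca.transform(x.reshape(1, -1))[0]
    t2 = np.sum(scores ** 2 / pca.explained_variance_)
    residual = x - pca.mean_ - scores @ pca.components_
    spe = np.sum(residual ** 2)
    return t2, spe

# A disturbed spectrum should push T^2 and/or SPE above the control limits
# estimated from the normal batches (limit estimation not shown here).
disturbed = rng.normal(size=50) + 0.8
print(t2_and_spe(disturbed))
```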

  3. Workshop on Algorithms for Time-Series Analysis

    NASA Astrophysics Data System (ADS)

    Protopapas, Pavlos

    2012-04-01

    This Workshop covered the four major subjects listed below in two 90-minute sessions. Each talk or tutorial allowed questions, and concluded with a discussion. Classification: Automatic classification using machine-learning methods is becoming a standard in surveys that generate large datasets. Ashish Mahabal (Caltech) reviewed various methods, and presented examples of several applications. Time-Series Modelling: Suzanne Aigrain (Oxford University) discussed autoregressive models and multivariate approaches such as Gaussian Processes. Meta-classification/mixture of expert models: Karim Pichara (Pontificia Universidad Católica, Chile) described the substantial promise which machine-learning classification methods are now showing in automatic classification, and discussed how the various methods can be combined together. Event Detection: Pavlos Protopapas (Harvard) addressed methods of fast identification of events with low signal-to-noise ratios, enlarging on the characterization and statistical issues of low signal-to-noise ratios and rare events.

  4. Multivariate Longitudinal Analysis with Bivariate Correlation Test

    PubMed Central

    Adjakossa, Eric Houngla; Sadissou, Ibrahim; Hounkonnou, Mahouton Norbert; Nuel, Gregory

    2016-01-01

    In the context of multivariate multilevel data analysis, this paper focuses on the multivariate linear mixed-effects model, including all the correlations between the random effects when the dimensional residual terms are assumed uncorrelated. Using the EM algorithm, we suggest more general expressions for the model’s parameter estimators. These estimators can be used in the framework of multivariate longitudinal data analysis as well as in the more general context of the analysis of multivariate multilevel data. By using a likelihood ratio test, we test the significance of the correlations between the random effects of two dependent variables of the model, in order to investigate whether or not it is useful to model these dependent variables jointly. Simulation studies are done to assess both the parameter recovery performance of the EM estimators and the power of the test. Using two empirical data sets, one of longitudinal multivariate type and one of multivariate multilevel type, the usefulness of the test is illustrated. PMID:27537692

  5. Multivariate Longitudinal Analysis with Bivariate Correlation Test.

    PubMed

    Adjakossa, Eric Houngla; Sadissou, Ibrahim; Hounkonnou, Mahouton Norbert; Nuel, Gregory

    2016-01-01

    In the context of multivariate multilevel data analysis, this paper focuses on the multivariate linear mixed-effects model, including all the correlations between the random effects when the dimensional residual terms are assumed uncorrelated. Using the EM algorithm, we suggest more general expressions for the model's parameter estimators. These estimators can be used in the framework of multivariate longitudinal data analysis as well as in the more general context of the analysis of multivariate multilevel data. By using a likelihood ratio test, we test the significance of the correlations between the random effects of two dependent variables of the model, in order to investigate whether or not it is useful to model these dependent variables jointly. Simulation studies are done to assess both the parameter recovery performance of the EM estimators and the power of the test. Using two empirical data sets, one of longitudinal multivariate type and one of multivariate multilevel type, the usefulness of the test is illustrated.
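
    In practice, the likelihood ratio test described above comes down to comparing the maximized log-likelihoods of the joint model (random effects of the two outcomes allowed to correlate) and of the restricted model (those correlations fixed at zero). A minimal sketch, assuming the two log-likelihood values and the number of constrained correlation parameters are already available from the fitted models; the numbers are hypothetical.

```python
from scipy.stats import chi2

def lr_test(loglik_full, loglik_restricted, n_constrained):
    """Likelihood ratio test comparing the joint model (correlated random
    effects) with the restricted model (those correlations set to zero)."""
    stat = 2.0 * (loglik_full - loglik_restricted)
    p_value = chi2.sf(stat, df=n_constrained)
    return stat, p_value

# Hypothetical maximized log-likelihoods from the two fitted models,
# with 2 cross-outcome correlation parameters constrained under the null.
print(lr_test(loglik_full=-1520.3, loglik_restricted=-1524.9, n_constrained=2))
```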

  6. Modeling associations between latent event processes governing time series of pulsing hormones.

    PubMed

    Liu, Huayu; Carlson, Nichole E; Grunwald, Gary K; Polotsky, Alex J

    2017-10-31

    This work is motivated by a desire to quantify relationships between two time series of pulsing hormone concentrations. The locations of pulses are not directly observed and may be considered latent event processes. The latent event processes of pulsing hormones are often associated. It is this joint relationship we model. Current approaches to jointly modeling pulsing hormone data generally assume that a pulse in one hormone is coupled with a pulse in another hormone (one-to-one association). However, pulse coupling is often imperfect. Existing joint models are not flexible enough for imperfect systems. In this article, we develop a more flexible class of pulse association models that incorporate parameters quantifying imperfect pulse associations. We propose a novel use of the Cox process model as a model of how pulse events co-occur in time. We embed the Cox process model into a hormone concentration model. Hormone concentration is the observed data. Spatial birth and death Markov chain Monte Carlo is used for estimation. Simulations show the joint model works well for quantifying both perfect and imperfect associations and offers estimation improvements over single hormone analyses. We apply this model to luteinizing hormone (LH) and follicle stimulating hormone (FSH), two reproductive hormones. Use of our joint model results in an ability to investigate novel hypotheses regarding associations between LH and FSH secretion in obese and non-obese women. © 2017, The International Biometric Society.

  7. Event- and Time-Driven Techniques Using Parallel CPU-GPU Co-processing for Spiking Neural Networks

    PubMed Central

    Naveros, Francisco; Garrido, Jesus A.; Carrillo, Richard R.; Ros, Eduardo; Luque, Niceto R.

    2017-01-01

    Modeling and simulating the neural structures which make up our central neural system is instrumental for deciphering the computational neural cues beneath. Higher levels of biological plausibility usually impose higher levels of complexity in mathematical modeling, from neural to behavioral levels. This paper focuses on overcoming the simulation problems (accuracy and performance) derived from using higher levels of mathematical complexity at a neural level. This study proposes different techniques for simulating neural models that hold incremental levels of mathematical complexity: leaky integrate-and-fire (LIF), adaptive exponential integrate-and-fire (AdEx), and Hodgkin-Huxley (HH) neural models (ranged from low to high neural complexity). The studied techniques are classified into two main families depending on how the neural-model dynamic evaluation is computed: the event-driven or the time-driven families. Whilst event-driven techniques pre-compile and store the neural dynamics within look-up tables, time-driven techniques compute the neural dynamics iteratively during the simulation time. We propose two modifications for the event-driven family: a look-up table recombination to better cope with the incremental neural complexity together with a better handling of the synchronous input activity. Regarding the time-driven family, we propose a modification in computing the neural dynamics: the bi-fixed-step integration method. This method automatically adjusts the simulation step size to better cope with the stiffness of the neural model dynamics running on CPU platforms. One version of this method is also implemented for hybrid CPU-GPU platforms. Finally, we analyze how the performance and accuracy of these modifications evolve with increasing levels of neural complexity. We also demonstrate how the proposed modifications, which constitute the main contribution of this study, systematically outperform the traditional event- and time-driven techniques under
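
    The contrast between coarse and fine integration steps can be illustrated with a toy time-driven leaky integrate-and-fire (LIF) simulation that switches to a smaller step when the membrane potential approaches threshold. This is only loosely in the spirit of the bi-fixed-step method described above; the switching rule, step sizes, and neuron parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def simulate_lif(i_ext, t_end=0.2, dt_coarse=1e-3, dt_fine=1e-4, tau=20e-3,
                 v_rest=-70e-3, v_thresh=-54e-3, v_reset=-70e-3, r_m=10e6):
    """Time-driven LIF integration that switches to a finer step when the
    membrane potential gets within 2 mV of threshold (illustrative only)."""
    t, v, spike_times = 0.0, v_rest, []
    while t < t_end:
        dt = dt_fine if v > v_thresh - 2e-3 else dt_coarse
        v += (-(v - v_rest) + r_m * i_ext) / tau * dt   # forward Euler step
        t += dt
        if v >= v_thresh:
            spike_times.append(t)
            v = v_reset
    return spike_times

print(simulate_lif(i_ext=2e-9))   # constant 2 nA input current
```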

  8. How wide in magnetic local time is the cusp? An event study

    NASA Astrophysics Data System (ADS)

    Maynard, N. C.; Weber, E. J.; Weimer, D. R.; Moen, J.; Onsager, T.; Heelis, R. A.; Egeland, A.

    1997-03-01

    A unique pass of the DMSP F11 satellite, longitudinally cutting through the cusp and mantle, combined with simultaneous optical measurements of the dayside cusp from Svalbard has been used to determine the width in local time of the cusp. We have shown from this event study that the cusp was at least 3.7 hours wide in magnetic local time. These measurements provide a lower limit for the cusp width. The observed cusp optical emissions are relatively constant, considering the processes which lead to the 630.0 nm emissions, and require precipitating electron flux to be added each minute during the DMSP pass throughout the local time extent observed by the imaging photometer and probably over the whole extent of the cusp defined by DMSP data. We conclude that the electron fluxes which produce the cusp aurora are from a process which must have been operable sometime during each minute but could have had both temporal and spatial variations. The measured width along with models of cusp precipitation provide the rationale to conclude that the region of flux tube opening in the dayside merging process involves the whole frontside magnetopause and can extend beyond the dawn-dusk terminator. The merging process for this event was found to be continuous, although spatially and temporally variable.

  9. Non-fragile ℓ2-ℓ∞ control for discrete-time stochastic nonlinear systems under event-triggered protocols

    NASA Astrophysics Data System (ADS)

    Sun, Ying; Ding, Derui; Zhang, Sunjie; Wei, Guoliang; Liu, Hongjian

    2018-07-01

    In this paper, the non-fragile ℓ2-ℓ∞ control problem is investigated for a class of discrete-time stochastic nonlinear systems under event-triggered communication protocols, which determine whether the measurement output should be transmitted to the controller or not. The main purpose of the addressed problem is to design an event-based output feedback controller subject to gain variations guaranteeing the prescribed disturbance attenuation level described by the ℓ2-ℓ∞ performance index. By utilizing the Lyapunov stability theory combined with the S-procedure, a sufficient condition is established to guarantee both the exponential mean-square stability and the ℓ2-ℓ∞ performance for the closed-loop system. In addition, with the help of the orthogonal decomposition, the desired controller parameter is obtained in terms of the solution to certain linear matrix inequalities. Finally, a simulation example is exploited to demonstrate the effectiveness of the proposed event-based controller design scheme.
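
    For orientation, the sketch below implements a common relative (quadratic) event-triggering rule of the kind used in event-triggered communication protocols: the current measurement is transmitted only when its deviation from the last transmitted measurement exceeds a fraction of its own size. The specific triggering condition, the weighting (identity here), and the threshold sigma are illustrative and may differ from the condition analysed in the paper.

```python
import numpy as np

def event_triggered_transmissions(measurements, sigma=0.2):
    """Indices of samples transmitted under a relative quadratic triggering
    rule: send y_k when ||y_k - y_last||^2 > sigma * ||y_k||^2."""
    sent, y_last = [], None
    for k, y in enumerate(np.atleast_2d(measurements)):
        if y_last is None or np.sum((y - y_last) ** 2) > sigma * np.sum(y ** 2):
            sent.append(k)
            y_last = y.copy()
        # otherwise the controller keeps using y_last (no transmission)
    return sent

# A slowly varying two-dimensional measurement: only a subset of samples
# triggers a transmission to the controller.
t = np.linspace(0, 10, 200)
y = np.column_stack([np.sin(t), 0.5 * np.cos(t)]) + 0.01 * np.random.randn(200, 2)
idx = event_triggered_transmissions(y)
print(f"{len(idx)} of {len(y)} samples transmitted")
```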

  10. A stochastical event-based continuous time step rainfall generator based on Poisson rectangular pulse and microcanonical random cascade models

    NASA Astrophysics Data System (ADS)

    Pohle, Ina; Niebisch, Michael; Zha, Tingting; Schümberg, Sabine; Müller, Hannes; Maurer, Thomas; Hinz, Christoph

    2017-04-01

    Rainfall variability within a storm is of major importance for fast hydrological processes, e.g. surface runoff, erosion and solute dissipation from surface soils. To investigate and simulate the impacts of within-storm variabilities on these processes, long time series of rainfall with high resolution are required. Yet, observed precipitation records of hourly or higher resolution are in most cases available only for a small number of stations and only for a few years. To obtain long time series of alternating rainfall events and interstorm periods while conserving the statistics of observed rainfall events, the Poisson model can be used. Multiplicative microcanonical random cascades have been widely applied to disaggregate rainfall time series from coarse to fine temporal resolution. We present a new coupling approach of the Poisson rectangular pulse model and the multiplicative microcanonical random cascade model that preserves the characteristics of rainfall events as well as inter-storm periods. In the first step, a Poisson rectangular pulse model is applied to generate discrete rainfall events (duration and mean intensity) and inter-storm periods (duration). The rainfall events are subsequently disaggregated to high-resolution time series (user-specified, e.g. 10 min resolution) by a multiplicative microcanonical random cascade model. One of the challenges of coupling these models is to parameterize the cascade model for the event durations generated by the Poisson model. In fact, the cascade model is best suited to downscale rainfall data with constant time step such as daily precipitation data. Without starting from a fixed time step duration (e.g. daily), the disaggregation of events requires some modifications of the multiplicative microcanonical random cascade model proposed by Olsson (1998): Firstly, the parameterization of the cascade model for events of different durations requires continuous functions for the probabilities of the multiplicative
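
    The first stage of the coupled generator can be sketched as a Poisson rectangular pulse process: dry-spell durations, event durations, and event mean intensities drawn from exponential distributions, producing an alternating sequence of inter-storm periods and rectangular rainfall events. The distributional choices and parameter values below are standard illustrative assumptions; the cascade disaggregation step is not shown.

```python
import numpy as np

def poisson_rectangular_pulses(n_events, mean_interstorm_h=30.0,
                               mean_duration_h=6.0, mean_intensity_mmh=1.5,
                               seed=0):
    """Alternating inter-storm periods and rectangular rainfall events
    (start time, duration, mean intensity), all drawn from exponentials."""
    rng = np.random.default_rng(seed)
    t, events = 0.0, []
    for _ in range(n_events):
        t += rng.exponential(mean_interstorm_h)           # dry spell [h]
        duration = rng.exponential(mean_duration_h)       # event duration [h]
        intensity = rng.exponential(mean_intensity_mmh)   # mean intensity [mm/h]
        events.append((t, duration, intensity))
        t += duration
    return events

for start, dur, inten in poisson_rectangular_pulses(5):
    print(f"start={start:7.1f} h  duration={dur:5.1f} h  depth={dur * inten:6.1f} mm")
```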

  11. Children's Use of Morphological Cues in Real-Time Event Representation

    ERIC Educational Resources Information Center

    Zhou, Peng; Ma, Weiyi

    2018-01-01

    The present study investigated whether and how fast young children can use information encoded in morphological markers during real-time event representation. Using the visual world paradigm, we tested 35 adults, 34 5-year-olds and 33 3-year-olds. The results showed that the adults, the 5-year-olds and the 3-year-olds all exhibited eye gaze…

  12. Evaluation of a Multivariate Syndromic Surveillance System for West Nile Virus.

    PubMed

    Faverjon, Céline; Andersson, M Gunnar; Decors, Anouk; Tapprest, Jackie; Tritz, Pierre; Sandoz, Alain; Kutasi, Orsolya; Sala, Carole; Leblond, Agnès

    2016-06-01

    Various methods are currently used for the early detection of West Nile virus (WNV) but their outputs are not quantitative and/or do not take into account all available information. Our study aimed to test a multivariate syndromic surveillance system to evaluate if the sensitivity and the specificity of detection of WNV could be improved. Weekly time series data on nervous syndromes in horses and mortality in both horses and wild birds were used. Baselines were fitted to the three time series and used to simulate 100 years of surveillance data. WNV outbreaks were simulated and inserted into the baselines based on historical data and expert opinion. Univariate and multivariate syndromic surveillance systems were tested to gauge how well they detected the outbreaks; detection was based on an empirical Bayesian approach. The systems' performances were compared using measures of sensitivity, specificity, and area under receiver operating characteristic curve (AUC). When data sources were considered separately (i.e., univariate systems), the best detection performance was obtained using the data set of nervous symptoms in horses compared to those of bird and horse mortality (AUCs equal to 0.80, 0.75, and 0.50, respectively). A multivariate outbreak detection system that used nervous symptoms in horses and bird mortality generated the best performance (AUC = 0.87). The proposed approach is suitable for performing multivariate syndromic surveillance of WNV outbreaks. This is particularly relevant, given that a multivariate surveillance system performed better than a univariate approach. Such a surveillance system could be especially useful in serving as an alert for the possibility of human viral infections. This approach can be also used for other diseases for which multiple sources of evidence are available.

  13. Multi-state model for studying an intermediate event using time-dependent covariates: application to breast cancer.

    PubMed

    Meier-Hirmer, Carolina; Schumacher, Martin

    2013-06-20

    The aim of this article is to propose several methods that make it possible to investigate how and whether the shape of the hazard ratio after an intermediate event depends on the waiting time to occurrence of this event and/or the sojourn time in this state. A simple multi-state model, the illness-death model, is used as a framework to investigate the occurrence of this intermediate event. Several approaches are shown and their advantages and disadvantages are discussed. All these approaches are based on Cox regression. As different time-scales are used, these models go beyond Markov models. Different estimation methods for the transition hazards are presented. Additionally, time-varying covariates are included in the model using an approach based on fractional polynomials. The different methods of this article are then applied to a dataset consisting of four studies conducted by the German Breast Cancer Study Group (GBSG). The occurrence of the first isolated locoregional recurrence (ILRR) is studied. The results contribute to the debate on the role of the ILRR with respect to the course of the breast cancer disease and the resulting prognosis. We have investigated different modelling strategies for the transition hazard after ILRR or, in general, after an intermediate event. Including time-dependent structures altered the resulting hazard functions considerably and it was shown that this time-dependent structure has to be taken into account in the case of our breast cancer dataset. The results indicate that an early recurrence increases the risk of death. A late ILRR increases the hazard function much less, and after the successful removal of the second tumour the risk of death is almost the same as before the recurrence. With respect to distant disease, the appearance of the ILRR only slightly increases the risk of death if the recurrence was treated successfully. It is important to realize that there are several modelling strategies for the intermediate event and that

  14. Quality by design case study: an integrated multivariate approach to drug product and process development.

    PubMed

    Huang, Jun; Kaul, Goldi; Cai, Chunsheng; Chatlapalli, Ramarao; Hernandez-Abad, Pedro; Ghosh, Krishnendu; Nagi, Arwinder

    2009-12-01

    To facilitate an in-depth process understanding, and offer opportunities for developing control strategies to ensure product quality, a combination of experimental design, optimization and multivariate techniques was integrated into the process development of a drug product. A process DOE was used to evaluate effects of the design factors on manufacturability and final product CQAs, and establish design space to ensure desired CQAs. Two types of analyses were performed to extract maximal information, DOE effect & response surface analysis and multivariate analysis (PCA and PLS). The DOE effect analysis was used to evaluate the interactions and effects of three design factors (water amount, wet massing time and lubrication time), on response variables (blend flow, compressibility and tablet dissolution). The design space was established by the combined use of DOE, optimization and multivariate analysis to ensure desired CQAs. Multivariate analysis of all variables from the DOE batches was conducted to study relationships between the variables and to evaluate the impact of material attributes/process parameters on manufacturability and final product CQAs. The integrated multivariate approach exemplifies application of QbD principles and tools to drug product and process development.

  15. Effects of stressful life events in young black men with high blood pressure.

    PubMed

    Han, Hae-Ra; Kim, Miyong T; Rose, Linda; Dennison, Cheryl; Bone, Lee; Hill, Martha N

    2006-01-01

    The aims were 1) to describe stressful life events as experienced by a sample of young Black men with high blood pressure (HBP) living in inner-city Baltimore, Maryland; and 2) to examine the effect of cumulative stressful life events on substance use, depression, and quality of life. Data were obtained over 48 months by interview from 210 men in an HBP management study. Stressors repeatedly occurring over time included death of a family member or close friend (65.2%), having a new family member (32.9%), change in residence (31.4%), difficulty finding a job (24.3%), and being fired or laid off from work (17.6%). Involvement with crime or legal matters was reported at least twice during the 48 months by 33.3% of men. When a cumulative stressful life events score was calculated by summing the number of events experienced at 6-month points over 48 months and tested for its relationship with the health outcomes, the findings of multivariate analyses revealed significant associations between cumulative life stressors and both depression and quality of life. No significant relationship was found between stressful life events and substance use. The results suggest that cumulative stressful life events have a negative effect on mental health and quality of life in young Black men with HBP. Future study should focus on developing interventions to assist individuals in managing distress related to stressful events with necessary community resources.

  16. Investigation of 2-stage meta-analysis methods for joint longitudinal and time-to-event data through simulation and real data application.

    PubMed

    Sudell, Maria; Tudur Smith, Catrin; Gueyffier, François; Kolamunnage-Dona, Ruwanthi

    2018-04-15

    Joint modelling of longitudinal and time-to-event data is often preferred over separate longitudinal or time-to-event analyses as it can account for study dropout, error in longitudinally measured covariates, and correlation between longitudinal and time-to-event outcomes. The joint modelling literature focuses mainly on the analysis of single studies, with no methods currently available for the meta-analysis of joint model estimates from multiple studies. We propose a 2-stage method for meta-analysis of joint model estimates. This method is applied to the INDANA dataset to combine joint model estimates of systolic blood pressure with time to death, time to myocardial infarction, and time to stroke. Results are compared to meta-analyses of separate longitudinal or time-to-event models. A simulation study is conducted to contrast separate versus joint analyses over a range of scenarios. Using the real dataset, similar results were obtained by using the separate and joint analyses. However, the simulation study indicated a benefit of using joint rather than separate methods in a meta-analytic setting where association exists between the longitudinal and time-to-event outcomes. Where evidence of association between longitudinal and time-to-event outcomes exists, results from joint models, rather than from standalone analyses, should be pooled in 2-stage meta-analyses. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
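
    The second stage of such a 2-stage meta-analysis typically pools the study-specific estimates (for example, log hazard ratios for the longitudinal-survival association from each study's joint model) with an inverse-variance random-effects model. The sketch below uses the DerSimonian-Laird estimator with made-up stage-one numbers; it is a generic illustration, not the paper's exact pooling procedure.

```python
import numpy as np

def random_effects_pool(estimates, variances):
    """DerSimonian-Laird random-effects pooling of per-study estimates
    (e.g. log hazard ratios from study-specific joint models)."""
    y = np.asarray(estimates, float)
    v = np.asarray(variances, float)
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)                 # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)            # between-study variance
    w_star = 1.0 / (v + tau2)
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Hypothetical stage-one output: per-study estimates and their variances.
print(random_effects_pool([0.12, 0.25, 0.08, 0.30], [0.010, 0.020, 0.015, 0.030]))
```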

  17. Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Feng, E-mail: fwang@unu.edu; Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft; Huisman, Jaco

    2013-11-15

    Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study have been provided. • Accurate modeling of time-variant lifespan distributions is critical for estimates. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, which encompasses a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to a lack of high-quality data on market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool. This can consequently increase the reliability of e-waste estimates compared to the approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time ever, complete datasets of all three variables for estimating all types of e-waste have been obtained. The result of this study also demonstrates significant disparity between various estimation models, arising from the use of data under different conditions. It shows the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e
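
    The core sales-stock-lifespan relationship in such an Input-Output Analysis can be sketched as a convolution: units discarded in year t are past sales weighted by the probability that a unit sold in year τ reaches end of life in year t. The Weibull lifespan profile (kept time-invariant here for simplicity, whereas the paper stresses time-varying lifespans) and all numbers are illustrative assumptions.

```python
import numpy as np
from scipy.stats import weibull_min

def ewaste_generated(sales, shape=2.0, scale=8.0):
    """Yearly e-waste outflow from historical sales and a Weibull lifespan
    profile; sales[i] = units put on the market in year i (sketch)."""
    n = len(sales)
    outflow = np.zeros(n)
    for tau, sold in enumerate(sales):
        for t in range(tau, n):
            # probability that a unit sold in year tau is discarded in year t
            p = (weibull_min.cdf(t - tau + 1, shape, scale=scale)
                 - weibull_min.cdf(t - tau, shape, scale=scale))
            outflow[t] += sold * p
    return outflow

sales = np.array([100, 120, 150, 180, 200, 220, 240, 260, 280, 300], float)
print(np.round(ewaste_generated(sales), 1))
# The in-use stock then follows as cumulative sales minus cumulative outflow.
```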

  18. EventThread: Visual Summarization and Stage Analysis of Event Sequence Data.

    PubMed

    Guo, Shunan; Xu, Ke; Zhao, Rongwen; Gotz, David; Zha, Hongyuan; Cao, Nan

    2018-01-01

    Event sequence data such as electronic health records, a person's academic records, or car service records, are ordered series of events which have occurred over a period of time. Analyzing collections of event sequences can reveal common or semantically important sequential patterns. For example, event sequence analysis might reveal frequently used care plans for treating a disease, typical publishing patterns of professors, and the patterns of service that result in a well-maintained car. It is challenging, however, to visually explore large numbers of event sequences, or sequences with large numbers of event types. Existing methods focus on extracting explicitly matching patterns of events using statistical analysis to create stages of event progression over time. However, these methods fail to capture latent clusters of similar but not identical evolutions of event sequences. In this paper, we introduce a novel visualization system named EventThread which clusters event sequences into threads based on tensor analysis and visualizes the latent stage categories and evolution patterns by interactively grouping the threads by similarity into time-specific clusters. We demonstrate the effectiveness of EventThread through usage scenarios in three different application domains and via interviews with an expert user.

  19. Multivariate normative comparisons using an aggregated database

    PubMed Central

    Murre, Jaap M. J.; Huizenga, Hilde M.

    2017-01-01

    In multivariate normative comparisons, a patient’s profile of test scores is compared to those in a normative sample. Recently, it has been shown that these multivariate normative comparisons enhance the sensitivity of neuropsychological assessment. However, multivariate normative comparisons require multivariate normative data, which are often unavailable. In this paper, we show how a multivariate normative database can be constructed by combining healthy control group data from published neuropsychological studies. We show that three issues should be addressed to construct a multivariate normative database. First, the database may have a multilevel structure, with participants nested within studies. Second, not all tests are administered in every study, so many data may be missing. Third, a patient should be compared to controls of similar age, gender and educational background rather than to the entire normative sample. To address these issues, we propose a multilevel approach for multivariate normative comparisons that accounts for missing data and includes covariates for age, gender and educational background. Simulations show that this approach controls the number of false positives and has high sensitivity to detect genuine deviations from the norm. An empirical example is provided. Implications for other domains than neuropsychology are also discussed. To facilitate broader adoption of these methods, we provide code implementing the entire analysis in the open source software package R. PMID:28267796
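
    At its core, a multivariate normative comparison evaluates a Hotelling's T²-type statistic for the distance between a patient's profile and the normative mean, relative to the normative covariance. The sketch below shows that basic comparison only, without the multilevel structure, missing-data handling, or demographic covariates the paper adds; the data and scores are synthetic.

```python
import numpy as np
from scipy.stats import f as f_dist

def normative_comparison(patient_profile, normative_data):
    """Hotelling's T^2-type comparison of one patient's profile against a
    normative sample (no multilevel or missing-data handling here)."""
    norm = np.asarray(normative_data, float)
    n, p = norm.shape
    diff = np.asarray(patient_profile, float) - norm.mean(axis=0)
    cov = np.cov(norm, rowvar=False)
    t2 = (n / (n + 1.0)) * diff @ np.linalg.solve(cov, diff)
    f_stat = t2 * (n - p) / (p * (n - 1.0))
    p_value = f_dist.sf(f_stat, p, n - p)
    return t2, p_value

rng = np.random.default_rng(2)
controls = rng.multivariate_normal([10.0, 50.0, 30.0], np.diag([4.0, 25.0, 9.0]), size=80)
patient = [7.5, 40.0, 25.0]   # hypothetical low scores on three tests
print(normative_comparison(patient, controls))
```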

  20. Correlation Analyses Between the Characteristic Times of Gradual Solar Energetic Particle Events and the Properties of Associated Coronal Mass Ejections

    NASA Astrophysics Data System (ADS)

    Pan, Z. H.; Wang, C. B.; Wang, Yuming; Xue, X. H.

    2011-06-01

    It is generally believed that gradual solar energetic particles (SEPs) are accelerated by shocks associated with coronal mass ejections (CMEs). Using an ice-cream cone model, the radial speed and angular width of 95 CMEs associated with SEP events during 1998 - 2002 are calculated from SOHO/LASCO observations. Then, we investigate the relationships between the kinematic properties of these CMEs and the characteristic times of the intensity-time profile of their accompanied SEP events observed at 1 AU. These characteristic times of SEP are i) the onset time from the accompanying CME eruption at the Sun to the SEP arrival at 1 AU, ii) the rise time from the SEP onset to the time when the SEP intensity is one-half of peak intensity, and iii) the duration over which the SEP intensity is within a factor of two of the peak intensity. It is found that the onset time has neither significant correlation with the radial speed nor with the angular width of the accompanying CME. For events that are poorly connected to the Earth, the SEP rise time and duration have no significant correlation with the radial speed and angular width of the associated CMEs. However, for events that are magnetically well connected to the Earth, the SEP rise time and duration have significantly positive correlations with the radial speed and angular width of the associated CMEs. This indicates that a CME event with wider angular width and higher speed may more easily drive a strong and wide shock near to the Earth-connected interplanetary magnetic field lines, may trap and accelerate particles for a longer time, and may lead to longer rise time and duration of the ensuing SEP event.

  1. Potential of turbidity monitoring for real time control of pollutant discharge in sewers during rainfall events.

    PubMed

    Lacour, C; Joannis, C; Gromaire, M-C; Chebbo, G

    2009-01-01

    Turbidity sensors can be used to continuously monitor the evolution of pollutant mass discharge. For two sites within the Paris combined sewer system, continuous turbidity, conductivity and flow data were recorded at one-minute time intervals over a one-year period. This paper is intended to highlight the variability in turbidity dynamics during wet weather. For each storm event, turbidity response aspects were analysed through different classifications. The correlation between classification and common parameters, such as the antecedent dry weather period, total event volume per impervious hectare and both the mean and maximum hydraulic flow for each event, was also studied. Moreover, the dynamics of flow and turbidity signals were compared at the event scale. No simple relation between turbidity responses, hydraulic flow dynamics and the chosen parameters was derived from this effort. Knowledge of turbidity dynamics could therefore potentially improve wet weather management, especially when using pollution-based real-time control (P-RTC) since turbidity contains information not included in hydraulic flow dynamics and not readily predictable from such dynamics.

  2. Integrated survival analysis using an event-time approach in a Bayesian framework

    USGS Publications Warehouse

    Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly within the ecological field, where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, and whose times may be interval-censored, with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the

  3. Integrated survival analysis using an event-time approach in a Bayesian framework.

    PubMed

    Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M

    2015-02-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly within the ecological field, where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, and whose times may be interval-censored, with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the
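
    Under the piecewise constant hazard used in the application, the survival function has a simple closed form, S(t) = exp(-Σ_j λ_j Δ_j(t)), where Δ_j(t) is the time spent in interval j before t. The cutpoints and hazard rates below are illustrative, not the paper's estimates for mountain plover chicks.

```python
import numpy as np

def survival_piecewise_constant(t, cutpoints, hazards):
    """S(t) = exp(-sum_j lambda_j * time spent in interval j before t)."""
    edges = np.concatenate(([0.0], np.asarray(cutpoints, float), [np.inf]))
    cumulative = 0.0
    for lam, lo, hi in zip(hazards, edges[:-1], edges[1:]):
        cumulative += lam * max(0.0, min(t, hi) - lo)
        if t <= hi:
            break
    return np.exp(-cumulative)

# Illustrative chick mortality: hazard highest before day 5, lower afterwards.
cutpoints = [5.0, 15.0]          # days
hazards = [0.08, 0.03, 0.01]     # per-day rates for the three age intervals
for day in (3, 10, 30):
    print(day, round(survival_piecewise_constant(day, cutpoints, hazards), 3))
```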

  4. Time-to-event methodology improved statistical evaluation in register-based health services research.

    PubMed

    Bluhmki, Tobias; Bramlage, Peter; Volk, Michael; Kaltheuner, Matthias; Danne, Thomas; Rathmann, Wolfgang; Beyersmann, Jan

    2017-02-01

    Complex longitudinal sampling and the observational structure of patient registers in health services research are associated with methodological challenges regarding data management and statistical evaluation. We exemplify common pitfalls and want to stimulate discussions on the design, development, and deployment of future longitudinal patient registers and register-based studies. For illustrative purposes, we use data from the prospective, observational, German DIabetes Versorgungs-Evaluation register. One aim was to explore predictors for the initiation of a basal insulin-supported therapy in patients with type 2 diabetes initially prescribed glucose-lowering drugs alone. Major challenges are missing mortality information, time-dependent outcomes, delayed study entries, different follow-up times, and competing events. We show that time-to-event methodology is a valuable tool for improved statistical evaluation of register data and should be preferred to simple case-control approaches. Patient registers provide rich data sources for health services research. Analyses are accompanied by the trade-off between data availability, clinical plausibility, and statistical feasibility. Cox's proportional hazards model allows for the evaluation of the outcome-specific hazards, but prediction of outcome probabilities is compromised by missing mortality information. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Relative timing of last glacial maximum and late-glacial events in the central tropical Andes

    NASA Astrophysics Data System (ADS)

    Bromley, Gordon R. M.; Schaefer, Joerg M.; Winckler, Gisela; Hall, Brenda L.; Todd, Claire E.; Rademaker, Kurt M.

    2009-11-01

    Whether or not tropical climate fluctuated in synchrony with global events during the Late Pleistocene is a key problem in climate research. However, the timing of past climate changes in the tropics remains controversial, with a number of recent studies reporting that tropical ice age climate is out of phase with global events. Here, we present geomorphic evidence and an in-situ cosmogenic 3He surface-exposure chronology from Nevado Coropuna, southern Peru, showing that glaciers underwent at least two significant advances during the Late Pleistocene prior to Holocene warming. Comparison of our glacial-geomorphic map at Nevado Coropuna to mid-latitude reconstructions yields a striking similarity between Last Glacial Maximum (LGM) and Late-Glacial sequences in tropical and temperate regions. Exposure ages constraining the maximum and end of the older advance at Nevado Coropuna range between 24.5 and 25.3 ka, and between 16.7 and 21.1 ka, respectively, depending on the cosmogenic production rate scaling model used. Similarly, the mean age of the younger event ranges from 10 to 13 ka. This implies that (1) the LGM and the onset of deglaciation in southern Peru occurred no earlier than at higher latitudes and (2) that a significant Late-Glacial event occurred, most likely prior to the Holocene, coherent with the glacial record from mid and high latitudes. The time elapsed between the end of the LGM and the Late-Glacial event at Nevado Coropuna is independent of scaling model and matches the period between the LGM termination and Late-Glacial reversal in classic mid-latitude records, suggesting that these events in both tropical and temperate regions were in phase.

  6. Evidence for the timing of sea-level events during MIS 3

    NASA Astrophysics Data System (ADS)

    Siddall, M.

    2005-12-01

    Four large sea-level peaks of millennial-scale duration occur during MIS 3. In addition smaller peaks may exist close to the sensitivity of existing methods to derive sea level during these periods. Millennial-scale changes in temperature during MIS 3 are well documented across much of the planet and are linked in some unknown, yet fundamental way to changes in ice volume / sea level. It is therefore highly likely that the timing of the sea level events during MIS 3 will prove to be a `Rosetta Stone' for understanding millennial scale climate variability. I will review observational and mechanistic arguments for the variation of sea level on Antarctic, Greenland and absolute time scales.

  7. Delay-time distribution of core-collapse supernovae with late events resulting from binary interaction

    NASA Astrophysics Data System (ADS)

    Zapartas, E.; de Mink, S. E.; Izzard, R. G.; Yoon, S.-C.; Badenes, C.; Götberg, Y.; de Koter, A.; Neijssel, C. J.; Renzo, M.; Schootemeijer, A.; Shrotriya, T. S.

    2017-05-01

    Most massive stars, the progenitors of core-collapse supernovae, are in close binary systems and may interact with their companion through mass transfer or merging. We undertake a population synthesis study to compute the delay-time distribution of core-collapse supernovae, that is, the supernova rate versus time following a starburst, taking into account binary interactions. We test the systematic robustness of our results by running various simulations to account for the uncertainties in our standard assumptions. We find that a significant fraction, %, of core-collapse supernovae are "late", that is, they occur 50-200 Myr after birth, when all massive single stars have already exploded. These late events originate predominantly from binary systems with at least one, or, in most cases, with both stars initially being of intermediate mass (4-8 M⊙). The main evolutionary channels that contribute often involve either the merging of the initially more massive primary star with its companion or the engulfment of the remaining core of the primary by the expanding secondary that has accreted mass at an earlier evolutionary stage. Also, the total number of core-collapse supernovae increases by % because of binarity for the same initial stellar mass. The high rate implies that we should have already observed such late core-collapse supernovae, but have not recognized them as such. We argue that φ Persei is a likely progenitor and that eccentric neutron star - white dwarf systems are likely descendants. Late events can help explain the discrepancy in the delay-time distributions derived from supernova remnants in the Magellanic Clouds and extragalactic type Ia events, lowering the contribution of prompt Ia events. We discuss ways to test these predictions and speculate on the implications for supernova feedback in simulations of galaxy evolution.

  8. Genomic Variation by Whole-Genome SNP Mapping Arrays Predicts Time-to-Event Outcome in Patients with Chronic Lymphocytic Leukemia

    PubMed Central

    Schweighofer, Carmen D.; Coombes, Kevin R.; Majewski, Tadeusz; Barron, Lynn L.; Lerner, Susan; Sargent, Rachel L.; O'Brien, Susan; Ferrajoli, Alessandra; Wierda, William G.; Czerniak, Bogdan A.; Medeiros, L. Jeffrey; Keating, Michael J.; Abruzzo, Lynne V.

    2013-01-01

    Genomic abnormalities, such as deletions in 11q22 or 17p13, are associated with poorer prognosis in patients with chronic lymphocytic leukemia (CLL). We hypothesized that unknown regions of copy number variation (CNV) affect clinical outcome and can be detected by array-based single-nucleotide polymorphism (SNP) genotyping. We compared SNP genotypes from 168 untreated patients with CLL with genotypes from 73 white HapMap controls. We identified 322 regions of recurrent CNV, 82 of which occurred significantly more often in CLL than in HapMap (CLL-specific CNV), including regions typically aberrant in CLL: deletions in 6q21, 11q22, 13q14, and 17p13 and trisomy 12. In univariate analyses, 35 of total and 11 of CLL-specific CNVs were associated with unfavorable time-to-event outcomes, including gains or losses in chromosomes 2p, 4p, 4q, 6p, 6q, 7q, 11p, 11q, and 17p. In multivariate analyses, six CNVs (ie, CLL-specific variations in 11p15.1-15.4 or 6q27) predicted time-to-treatment or overall survival independently of established markers of prognosis. Moreover, genotypic complexity (ie, the number of independent CNVs per patient) significantly predicted prognosis, with a median time-to-treatment of 64 months versus 23 months in patients with zero to one versus two or more CNVs, respectively (P = 3.3 × 10−8). In summary, a comparison of SNP genotypes from patients with CLL with HapMap controls allowed us to identify known and unknown recurrent CNVs and to determine regions and rates of CNV that predict poorer prognosis in patients with CLL. PMID:23273604

  9. Time compression of soil erosion by the effect of largest daily event. A regional analysis of USLE database.

    NASA Astrophysics Data System (ADS)

    Gonzalez-Hidalgo, J. C.; Batalla, R.; Cerda, A.; de Luis, M.

    2009-04-01

    When Thornes and Brunsden wrote in 1977 "How often one hears the researcher (and no less the undergraduate) complain that after weeks of observation "nothing happened" only to learn that, the day after his departure, a flood caused unprecedent erosion and channel changes!" (Thornes and Brunsden, 1977, p. 57), they focussed on two different problems in geomorphological research: the effects of extreme events and the temporal compression of geomorphological processes. The time compression is one of the main characteristic of erosion processes. It means that an important amount of the total soil eroded is produced in very short temporal intervals, i.e. few events mostly related to extreme events. From magnitude-frequency analysis we know that few events, not necessarily extreme by magnitude, produce high amount of geomorphological work. Last but not least, extreme isolated events are a classical issue in geomorphology by their specific effects, and they are receiving permanent attention, increased at present because of scenarios of global change. Notwithstanding, the time compression of geomorphological processes could be focused not only on the analysis of extreme events and the traditional magnitude-frequency approach, but on new complementary approach based on the effects of largest events. The classical approach define extreme event as a rare event (identified by its magnitude and quantified by some deviation from central value), while we define largest events by the rank, whatever their magnitude. In a previous research on time compression of soil erosion, using USLE soil erosion database (Gonzalez-Hidalgo et al., EGU 2007), we described a relationship between the total amount of daily erosive events recorded by plot and the percentage contribution to total soil erosion of n-largest aggregated daily events. Now we offer a further refined analysis comparing different agricultural regions in USA. To do that we have analyzed data from 594 erosion plots from USLE
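
    The time-compression idea sketched above can be quantified directly: rank the daily erosion amounts on a plot and compute the share of total soil loss contributed by the n largest daily events. The heavy-tailed synthetic record below is purely illustrative; with USLE-type plot data the same function would be applied to the recorded daily event series.

```python
import numpy as np

def largest_event_contribution(daily_erosion, n_largest):
    """Fraction of total soil loss contributed by the n largest daily events."""
    ranked = np.sort(np.asarray(daily_erosion, float))[::-1]
    return ranked[:n_largest].sum() / ranked.sum()

# Synthetic heavy-tailed record of daily erosive events on one plot (t/ha).
rng = np.random.default_rng(3)
events = rng.pareto(1.5, size=300) * 0.1
for n in (1, 3, 10):
    share = largest_event_contribution(events, n)
    print(f"top {n:2d} events account for {100 * share:.1f}% of total erosion")
```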

  10. Memory Reactivation Predicts Resistance to Retroactive Interference: Evidence from Multivariate Classification and Pattern Similarity Analyses

    PubMed Central

    Rugg, Michael D.

    2016-01-01

    Memory reactivation—the reinstatement of processes and representations engaged when an event is initially experienced—is believed to play an important role in strengthening and updating episodic memory. The present study examines how memory reactivation during a potentially interfering event influences memory for a previously experienced event. Participants underwent fMRI during the encoding phase of an AB/AC interference task in which some words were presented twice in association with two different encoding tasks (AB and AC trials) and other words were presented once (DE trials). The later memory test required retrieval of the encoding tasks associated with each of the study words. Retroactive interference was evident for the AB encoding task and was particularly strong when the AC encoding task was remembered rather than forgotten. We used multivariate classification and pattern similarity analysis (PSA) to measure reactivation of the AB encoding task during AC trials. The results demonstrated that reactivation of generic task information measured with multivariate classification predicted subsequent memory for the AB encoding task regardless of whether interference was strong or weak (trials for which the AC encoding task was remembered or forgotten, respectively). In contrast, reactivation of neural patterns idiosyncratic to a given AB trial measured with PSA only predicted memory when the strength of interference was low. These results suggest that reactivation of features of an initial experience shared across numerous events in the same category, but not of features idiosyncratic to a particular event, is important in resisting retroactive interference caused by new learning. SIGNIFICANCE STATEMENT Reactivating a previously encoded memory is believed to provide an opportunity to strengthen the memory, but also to return the memory to a labile state, making it susceptible to interference. However, there is debate as to how memory reactivation elicited by

  11. Temporal event structure and timing in schizophrenia: preserved binding in a longer "now".

    PubMed

    Martin, Brice; Giersch, Anne; Huron, Caroline; van Wassenhove, Virginie

    2013-01-01

    Patients with schizophrenia experience a loss of temporal continuity or subjective fragmentation along the temporal dimension. Here, we develop the hypothesis that impaired temporal awareness results from a perturbed structuring of events in time, i.e., perturbed canonical neural dynamics. To address this, 26 patients and their matched controls took part in two psychophysical studies using desynchronized audiovisual speech. Two tasks were used and compared: first, an identification task testing for multisensory binding impairments in which participants reported what they heard while looking at a speaker's face; in a second task, we tested the perceived simultaneity of the same audiovisual speech stimuli. In both tasks, we used McGurk fusion and combination, which are classic ecologically valid multisensory illusions. First, and contrary to previous reports, our results show that patients do not significantly differ from controls in their rate of illusory reports. Second, the illusory reports of patients in the identification task were more sensitive to audiovisual speech desynchronies than those of controls. Third, and surprisingly, patients considered audiovisual speech to be synchronized for longer delays than controls. As such, the temporal tolerance profile observed in a temporal judgement task was less of a predictor for sensory binding in schizophrenia than for that obtained in controls. We interpret our results as an impairment of temporal event structuring in schizophrenia which does not specifically affect sensory binding operations but rather the explicit access to timing information associated here with audiovisual speech processing. Our findings are discussed in the context of current neurophysiological frameworks for the binding and the structuring of sensory events in time. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Using Real-time Event Tracking Sensitivity Analysis to Overcome Sensor Measurement Uncertainties of Geo-Information Management in Drilling Disasters

    NASA Astrophysics Data System (ADS)

    Tavakoli, S.; Poslad, S.; Fruhwirth, R.; Winter, M.

    2012-04-01

    This paper introduces an application of a novel EventTracker platform for instantaneous Sensitivity Analysis (SA) of large-scale real-time geo-information. Earth disaster management systems demand high-quality information to aid a quick and timely response to their evolving environments. The idea behind the proposed EventTracker platform is the assumption that modern information management systems are able to capture data in real time and have the technological flexibility to adjust their services to work with specific sources of data/information. However, to assure this adaptation in real time, the online data should be collected, interpreted, and translated into corrective actions in a concise and timely manner. This can hardly be handled by existing sensitivity analysis methods because they rely on historical data and lazy processing algorithms. In event-driven systems, the effect of system inputs on its state is of value, as events could cause this state to change. This 'event triggering' situation underpins the logic of the proposed approach. The event tracking sensitivity analysis method describes the system variables and states as a collection of events. The higher the occurrence of an input variable during the trigger of an event, the greater its potential impact will be on the final analysis of the system state. Experiments were designed to compare the proposed event tracking sensitivity analysis with existing entropy-based sensitivity analysis methods. The results have shown a 10% improvement in computational efficiency with no compromise in accuracy. They have also shown that the computational time to perform the sensitivity analysis is 0.5% of that required by the entropy-based method. The proposed method has been applied to real-world data in the context of preventing emerging crises at drilling rigs. One of the major purposes of such rigs is to drill boreholes to explore oil or gas reservoirs with the final scope of recovering the content

  13. The HAWC Real-time Flare Monitor for Rapid Detection of Transient Events

    NASA Astrophysics Data System (ADS)

    Abeysekara, A. U.; Alfaro, R.; Alvarez, C.; Álvarez, J. D.; Arceo, R.; Arteaga-Velázquez, J. C.; Avila Rojas, D.; Ayala Solares, H. A.; Barber, A. S.; Bautista-Elivar, N.; Becerra Gonzalez, J.; Becerril, A.; Belmont-Moreno, E.; BenZvi, S. Y.; Bernal, A.; Braun, J.; Brisbois, C.; Caballero-Mora, K. S.; Capistrán, T.; Carramiñana, A.; Casanova, S.; Castillo, M.; Cotti, U.; Cotzomi, J.; Coutiño de León, S.; De la Fuente, E.; De León, C.; Díaz-Vélez, J. C.; Dingus, B. L.; DuVernois, M. A.; Ellsworth, R. W.; Engel, K.; Fiorino, D. W.; Fraija, N.; García-González, J. A.; Garfias, F.; Gerhardt, M.; González, M. M.; González Muñoz, A.; Goodman, J. A.; Hampel-Arias, Z.; Harding, J. P.; Hernandez, S.; Hernandez-Almada, A.; Hona, B.; Hui, C. M.; Hüntemeyer, P.; Iriarte, A.; Jardin-Blicq, A.; Joshi, V.; Kaufmann, S.; Kieda, D.; Lauer, R. J.; Lee, W. H.; Lennarz, D.; León Vargas, H.; Linnemann, J. T.; Longinotti, A. L.; López-Cámara, D.; López-Coto, R.; Raya, G. Luis; Luna-García, R.; Malone, K.; Marinelli, S. S.; Martinez, O.; Martinez-Castellanos, I.; Martínez-Castro, J.; Martínez-Huerta, H.; Matthews, J. A.; Miranda-Romagnoli, P.; Moreno, E.; Mostafá, M.; Nellen, L.; Newbold, M.; Nisa, M. U.; Noriega-Papaqui, R.; Pelayo, R.; Pérez-Pérez, E. G.; Pretz, J.; Ren, Z.; Rho, C. D.; Rivière, C.; Rosa-González, D.; Rosenberg, M.; Ruiz-Velasco, E.; Salazar, H.; Salesa Greus, F.; Sandoval, A.; Schneider, M.; Schoorlemmer, H.; Sinnis, G.; Smith, A. J.; Springer, R. W.; Surajbali, P.; Taboada, I.; Tibolla, O.; Tollefson, K.; Torres, I.; Ukwatta, T. N.; Vianello, G.; Weisgarber, T.; Westerhoff, S.; Wisher, I. G.; Wood, J.; Yapici, T.; Younk, P. W.; Zepeda, A.; Zhou, H.

    2017-07-01

    We present the development of a real-time flare monitor for the High Altitude Water Cherenkov (HAWC) observatory. The flare monitor has been fully operational since 2017 January and is designed to detect very high energy (VHE; E ≳ 100 GeV) transient events from blazars on timescales lasting from 2 minutes to 10 hr in order to facilitate multiwavelength and multimessenger studies. These flares provide information for investigations into the mechanisms that power the blazars’ relativistic jets and accelerate particles within them, and they may also serve as probes of the populations of particles and fields in intergalactic space. To date, the detection of blazar flares in the VHE range has relied primarily on pointed observations by imaging atmospheric Cherenkov telescopes. The recently completed HAWC observatory offers the opportunity to study VHE flares in survey mode, scanning two-thirds of the entire sky every day with a field of view of ˜1.8 steradians. In this work, we report on the sensitivity of the HAWC real-time flare monitor and demonstrate its capabilities via the detection of three high-confidence VHE events in the blazars Markarian 421 and Markarian 501.

  14. A Bayesian model for time-to-event data with informative censoring

    PubMed Central

    Kaciroti, Niko A.; Raghunathan, Trivellore E.; Taylor, Jeremy M. G.; Julius, Stevo

    2012-01-01

    Randomized trials with dropouts or censored data and discrete time-to-event type outcomes are frequently analyzed using the Kaplan–Meier or product limit (PL) estimation method. However, the PL method assumes that the censoring mechanism is noninformative, and when this assumption is violated, the inferences may not be valid. We propose an expanded PL method using a Bayesian framework to incorporate an informative censoring mechanism and perform sensitivity analysis on estimates of the cumulative incidence curves. The expanded method uses a model, which can be viewed as a pattern mixture model, where the odds of having an event during the follow-up interval (tk−1,tk], conditional on being at risk at tk−1, differ across the patterns of missing data. The sensitivity parameters relate the odds of an event between subjects from a missing-data pattern and the observed subjects, for each interval. The large number of sensitivity parameters is reduced by treating them as random, assumed to follow a log-normal distribution with prespecified mean and variance. We then vary the mean and variance to explore the sensitivity of inferences. The missing at random (MAR) mechanism is a special case of the expanded model, thus allowing exploration of the sensitivity of inferences to departures from the MAR assumption. The proposed approach is applied to data from the TRial Of Preventing HYpertension. PMID:22223746
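
    To make the pattern-mixture idea above concrete, the sketch below rebuilds a cumulative incidence curve after multiplying the per-interval odds of an event for dropouts by a sensitivity factor exp(delta), with delta = 0 recovering the MAR special case. This is an illustrative simplification under assumed names and values, not the authors' full Bayesian model.

```python
# Illustrative pattern-mixture adjustment: dropouts' per-interval odds of an
# event are the observed odds times exp(delta); the mixed discrete hazards are
# then cumulated into an incidence curve. Not the paper's full Bayesian model.
import numpy as np

def cumulative_incidence(hazard_obs, frac_dropout, delta):
    """hazard_obs: per-interval event probability among observed subjects.
    frac_dropout: fraction of the risk set in the dropout pattern.
    delta: log odds ratio relating dropouts to observed (delta = 0 is MAR)."""
    odds_obs = hazard_obs / (1.0 - hazard_obs)
    odds_mis = odds_obs * np.exp(delta)
    hazard_mis = odds_mis / (1.0 + odds_mis)
    hazard_mix = (1.0 - frac_dropout) * hazard_obs + frac_dropout * hazard_mis
    return 1.0 - np.cumprod(1.0 - hazard_mix)   # cumulative incidence over intervals

hz = np.array([0.02, 0.03, 0.04, 0.05])
print(cumulative_incidence(hz, frac_dropout=0.2, delta=0.0))   # MAR benchmark
print(cumulative_incidence(hz, frac_dropout=0.2, delta=0.7))   # dropouts at higher risk
```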

  15. Univariate and multivariate skewness and kurtosis for measuring nonnormality: Prevalence, influence and estimation.

    PubMed

    Cain, Meghan K; Zhang, Zhiyong; Yuan, Ke-Hai

    2017-10-01

    Nonnormality of univariate data has been extensively examined previously (Blanca et al., Methodology: European Journal of Research Methods for the Behavioral and Social Sciences, 9(2), 78-84, 2013; Micceri, Psychological Bulletin, 105(1), 156, 1989). However, less is known about the potential nonnormality of multivariate data, although multivariate analysis is commonly used in psychological and educational research. Using univariate and multivariate skewness and kurtosis as measures of nonnormality, this study examined 1,567 univariate distributions and 254 multivariate distributions collected from authors of articles published in Psychological Science and the American Education Research Journal. We found that 74% of univariate distributions and 68% of multivariate distributions deviated from normal distributions. In a simulation study using typical values of skewness and kurtosis that we collected, we found that the resulting type I error rates were 17% in a t-test and 30% in a factor analysis under some conditions. Hence, we argue that it is time to routinely report skewness and kurtosis along with other summary statistics such as means and variances. To facilitate the future reporting of skewness and kurtosis, we provide a tutorial on how to compute univariate and multivariate skewness and kurtosis in SAS, SPSS, R, and a newly developed Web application.
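
    For readers who want to compute such measures themselves, a minimal NumPy sketch of Mardia's multivariate skewness and kurtosis (one standard definition of the quantities discussed above) is shown below; it is an illustration, not the authors' own tutorial or Web application.

```python
# Mardia's multivariate skewness (b1,p) and kurtosis (b2,p) from a data matrix.
import numpy as np

def mardia(X):
    """X: (n, p) data matrix. Returns (skewness b1p, kurtosis b2p)."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    centered = X - X.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(X, rowvar=False, bias=True))  # MLE covariance
    D = centered @ S_inv @ centered.T          # Mahalanobis cross-products
    b1p = (D ** 3).sum() / n ** 2              # multivariate skewness
    b2p = (np.diag(D) ** 2).sum() / n          # multivariate kurtosis
    return b1p, b2p

rng = np.random.default_rng(0)
# For multivariate normal data, b1p is near 0 and b2p is near p*(p+2) = 15.
print(mardia(rng.normal(size=(500, 3))))
```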

  16. A stochastic storm surge generator for the German North Sea and the multivariate statistical assessment of the simulation results

    NASA Astrophysics Data System (ADS)

    Wahl, Thomas; Jensen, Jürgen; Mudersbach, Christoph

    2010-05-01

    Storm surges along the German North Sea coastline have led to major damage in the past, and the risk of inundation is expected to increase in the course of ongoing climate change. Knowledge of the characteristics of possible storm surges is essential for performing integrated risk analyses, e.g. based on the source-pathway-receptor concept. The latter includes the storm surge simulation/analyses (source), modelling of dike/dune breach scenarios (pathway) and the quantification of potential losses (receptor). In subproject 1b of the German joint research project XtremRisK (www.xtremrisk.de), a stochastic storm surge generator for the south-eastern North Sea area is developed. The input data for the multivariate model are high-resolution sea level observations from tide gauges during extreme events. Based on 25 parameters (19 sea level parameters and 6 time parameters), observed storm surge hydrographs consisting of three tides are parameterised. After fitting common parametric probability distributions and running a large number of Monte Carlo simulations, the final reconstruction yields a set of 100,000 (default) synthetic storm surge events with a one-minute resolution. Such a data set can potentially serve as the basis for a large number of applications. For risk analyses, storm surges with peak water levels exceeding the design water levels are of special interest. The occurrence probabilities of the simulated extreme events are estimated based on multivariate statistics, considering the parameters "peak water level" and "fullness/intensity". In the past, most studies considered only the peak water levels during extreme events, which might not be the most important parameter in all cases. Here, a 2D Archimedean copula model is used for the estimation of the joint probabilities of the selected parameters, accounting for the dependence structure separately from the margins. In coordination with subproject 1a, the results will be used as the input
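
    The copula step can be illustrated with a small self-contained example. The sketch below assumes a Gumbel (Archimedean) copula with an illustrative dependence parameter and converts the marginal non-exceedance probabilities of two surge parameters into a joint exceedance probability; the actual copula family and parameter values used in the project may differ.

```python
# Joint exceedance probability of two surge parameters (e.g. peak water level
# and fullness/intensity) under an assumed Gumbel copula; values illustrative.
import math

def gumbel_copula(u, v, theta):
    """Gumbel (Archimedean) copula C(u, v); theta >= 1 controls dependence."""
    return math.exp(-(((-math.log(u)) ** theta + (-math.log(v)) ** theta) ** (1.0 / theta)))

def joint_exceedance(u, v, theta):
    """P(U > u and V > v) = 1 - u - v + C(u, v)."""
    return 1.0 - u - v + gumbel_copula(u, v, theta)

# Both parameters at their marginal 99th percentile, moderate upper-tail dependence:
# the joint exceedance probability is far larger than the independent 0.01 * 0.01.
print(joint_exceedance(0.99, 0.99, theta=2.0))
```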

  17. Multivariate Autoregressive Modeling and Granger Causality Analysis of Multiple Spike Trains

    PubMed Central

    Krumin, Michael; Shoham, Shy

    2010-01-01

    Recent years have seen the emergence of microelectrode arrays and optical methods allowing simultaneous recording of spiking activity from populations of neurons in various parts of the nervous system. The analysis of multiple neural spike train data could benefit significantly from existing methods for multivariate time-series analysis which have proven to be very powerful in the modeling and analysis of continuous neural signals like EEG signals. However, those methods have not generally been well adapted to point processes. Here, we use our recent results on correlation distortions in multivariate Linear-Nonlinear-Poisson spiking neuron models to derive generalized Yule-Walker-type equations for fitting "hidden" Multivariate Autoregressive models. We use this new framework to perform Granger causality analysis in order to extract the directed information flow pattern in networks of simulated spiking neurons. We discuss the relative merits and limitations of the new method. PMID:20454705
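
    As background for the generalized Yule-Walker equations mentioned above, the sketch below shows the classical, fully observed multivariate case: for a VAR(1) process the coefficient matrix solves A = C1 C0^-1, with C0 and C1 the lag-0 and lag-1 covariance matrices. The paper's contribution, replacing these covariances with versions corrected for Linear-Nonlinear-Poisson distortions, is not reproduced here.

```python
# Classical multivariate Yule-Walker fit of a VAR(1) model x_t = A x_{t-1} + e_t.
import numpy as np

def yule_walker_var1(X):
    """X: (T, d) multivariate time series. Returns the VAR(1) coefficient matrix A."""
    X = X - X.mean(axis=0)
    T = X.shape[0]
    C0 = X[:-1].T @ X[:-1] / (T - 1)   # lag-0 covariance
    C1 = X[1:].T @ X[:-1] / (T - 1)    # lag-1 cross-covariance E[x_t x_{t-1}^T]
    return C1 @ np.linalg.inv(C0)

# Recover a known coefficient matrix from simulated data.
rng = np.random.default_rng(1)
A_true = np.array([[0.5, 0.2], [0.0, 0.3]])
x = np.zeros((5000, 2))
for t in range(1, 5000):
    x[t] = A_true @ x[t - 1] + rng.normal(scale=0.1, size=2)
print(np.round(yule_walker_var1(x), 2))
```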

  18. The timing of life history events in the presence of soft disturbances.

    PubMed

    Bertacchi, Daniela; Zucca, Fabio; Ambrosini, Roberto

    2016-01-21

    We study a model for the evolutionarily stable strategy (ESS) used by biological populations for choosing the time of life-history events, such as arrival from migration and breeding. In our model we account for both intra-species competition (early individuals have a competitive advantage) and a disturbance which strikes at a random time, killing a fraction 1-p of the population. Disturbances include spells of bad weather, such as freezing or heavily raining days. It has been shown by Iwasa and Levin (1995) that when the disturbance is so strong that it kills any individual present when it strikes (hard disturbance, p=0), then the ESS is a mixed strategy (individuals choose their arrival date in an interval of possible dates, according to a certain probability distribution). In this case, individuals wait for a certain time and afterwards start arriving (or breeding) every day. In this paper we explore a biologically more realistic situation whereby the disturbance kills only a fraction of the individuals (soft disturbance, p>0). We also remove some technical assumptions which Iwasa and Levin made on the distribution of the disturbance. We prove that the ESS is still a mixed choice of times, however with respect to the case of hard disturbance, a new phenomenon arises: whenever the disturbance is soft, if the competition is sufficiently strong, the waiting time disappears and a fraction of the population arrives at the earliest day possible, while the rest will arrive throughout the whole period during which the disturbance may occur. This means that under strong competition, the payoff of early arrival balances the increased risk of being killed by the disturbance. We study the behaviour of the ESS and of the average fitness of the population, depending on the parameters involved. We also investigate how the population may be affected by climate change: namely the occurrence of more extreme weather events, which may kill a larger fraction of the population, and

  19. Direct generation of event-timing equations for generalized flow shop systems

    NASA Astrophysics Data System (ADS)

    Doustmohammadi, Ali; Kamen, Edward W.

    1995-11-01

    Flow shop production lines are very common in manufacturing systems such as car assemblies, manufacturing of electronic circuits, etc. In this paper, a systematic procedure is given for generating event-timing equations directly from the machine interconnections for a generalized flow shop system. The events considered here correspond to completion times of machine operations. It is assumed that the scheduling policy is cyclic (periodic). For a given flow shop system, the open connection dynamics of the machines are derived first. Then interconnection matrices characterizing the routing of parts in the system are obtained from the given system configuration. The open connection dynamics of the machines and the interconnection matrices are then combined together to obtain the overall system dynamics given by an equation of the form X(k+1) = A(k)X(k) ⊕ B(k)V(k+1) defined over the max-plus algebra. Here the state X(k) is the vector of completion times and V(k+1) is an external input vector consisting of the arrival times of parts. It is shown that if the machines are numbered in an appropriate way and the states are selected according to certain rules, the matrix A(k) will be in a special (canonical) form. The model obtained here is useful for the analysis of system behavior and for carrying out simulations. In particular, the canonical form of A(k) enables one to study system bottlenecks and the minimal cycle time during steady-state operation. The approach presented in this paper is believed to be more straightforward compared to existing max-plus algebra formulations of flow shop systems. In particular, three advantages of the proposed approach are: (1) it yields timing equations directly from the system configuration and hence there is no need to first derive a Petri net or a digraph equivalent of the system; (2) a change in the system configuration only affects the interconnection matrices and hence does not require rederiving the entire set of equations; (3
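
    The max-plus recursion quoted above can be written out in a few lines: matrix-vector "multiplication" takes the maximum of sums, and ⊕ is the entrywise maximum. The two-machine matrices below are invented for illustration, not taken from the paper.

```python
# Minimal sketch of the event-timing recursion X(k+1) = A(k) X(k) ⊕ B(k) V(k+1)
# over the max-plus algebra: products become sums, sums become maxima.
import numpy as np

NEG_INF = -np.inf   # the max-plus "zero" element (no connection)

def maxplus_matvec(M, x):
    """(M ⊗ x)_i = max_j (M[i, j] + x[j])."""
    return np.max(M + x, axis=1)

def next_completion_times(A, B, x, v):
    """x(k+1) = A ⊗ x(k) ⊕ B ⊗ v(k+1): completion times from the previous cycle and new arrivals."""
    return np.maximum(maxplus_matvec(A, x), maxplus_matvec(B, v))

A = np.array([[4.0, NEG_INF],     # machine 1 depends only on its own previous job (4 time units)
              [7.0, 2.0]])        # machine 2 waits for machine 1 (7) or its own previous job (2)
B = np.array([[0.0], [3.0]])      # arrivals reach machine 1 immediately, machine 2 after 3 units
print(next_completion_times(A, B, x=np.array([0.0, 3.0]), v=np.array([5.0])))
```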

  20. Estimating correlation between multivariate longitudinal data in the presence of heterogeneity.

    PubMed

    Gao, Feng; Philip Miller, J; Xiong, Chengjie; Luo, Jingqin; Beiser, Julia A; Chen, Ling; Gordon, Mae O

    2017-08-17

    Estimating correlation coefficients among outcomes is one of the most important analytical tasks in epidemiological and clinical research. Availability of multivariate longitudinal data presents a unique opportunity to assess joint evolution of outcomes over time. Bivariate linear mixed model (BLMM) provides a versatile tool with regard to assessing correlation. However, BLMMs often assume that all individuals are drawn from a single homogenous population where the individual trajectories are distributed smoothly around population average. Using longitudinal mean deviation (MD) and visual acuity (VA) from the Ocular Hypertension Treatment Study (OHTS), we demonstrated strategies to better understand the correlation between multivariate longitudinal data in the presence of potential heterogeneity. Conditional correlation (i.e., marginal correlation given random effects) was calculated to describe how the association between longitudinal outcomes evolved over time within specific subpopulation. The impact of heterogeneity on correlation was also assessed by simulated data. There was a significant positive correlation in both random intercepts (ρ = 0.278, 95% CI: 0.121-0.420) and random slopes (ρ = 0.579, 95% CI: 0.349-0.810) between longitudinal MD and VA, and the strength of correlation constantly increased over time. However, conditional correlation and simulation studies revealed that the correlation was induced primarily by participants with rapid deteriorating MD who only accounted for a small fraction of total samples. Conditional correlation given random effects provides a robust estimate to describe the correlation between multivariate longitudinal data in the presence of unobserved heterogeneity (NCT00000125).

  1. Twitter data analysis: temporal and term frequency analysis with real-time event

    NASA Astrophysics Data System (ADS)

    Yadav, Garima; Joshi, Mansi; Sasikala, R.

    2017-11-01

    Over the past few years, the World Wide Web (WWW) has become a prominent and huge source of user-generated content and opinionated data. Among the various social media, Twitter has gained popularity as it offers a fast and effective way for users to share their perspectives on critical and other issues in different domains. As the data are generated in huge volumes in the cloud, this has opened doors for researchers in the field of data science and analysis. There are various domains, such as the 'Political', 'Entertainment' and 'Business' domains. Twitter also provides various APIs for developers: 1) the Search API, which focuses on old tweets; 2) the REST API, which focuses on user details and allows collection of user profiles, friends and followers; and 3) the Streaming API, which collects details such as tweets, hashtags and geo-locations. In our work we access the Streaming API in order to fetch real-time tweets for a dynamically unfolding event. For this we focus on the 'Entertainment' domain, especially 'Sports', as IPL-T20 is currently the trending ongoing event. We collect these numerous tweets and store them in a MongoDB database, where the tweets are stored in JSON document format. On these documents we perform time-series analysis and term-frequency analysis using techniques such as filtering and information extraction for text mining, which fulfils our objectives of finding interesting moments in the temporal data of the event and ranking players or teams by popularity, helping people understand key influencers on the social media platform.
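
    A minimal sketch of the term-frequency step is shown below, operating on tweets that have already been fetched (for example from the MongoDB collection) as JSON-like dictionaries with "text" fields; the tokenisation and stop-word list are illustrative simplifications rather than the authors' exact pipeline.

```python
# Term-frequency analysis over already-fetched tweet documents (illustrative).
from collections import Counter
import re

STOPWORDS = {"the", "a", "an", "is", "to", "of", "and", "in", "for", "rt"}

def term_frequencies(tweets, top_n=10):
    counts = Counter()
    for tweet in tweets:
        # Keep words, hashtags and mentions; drop common stop words.
        words = re.findall(r"[#@]?\w+", tweet["text"].lower())
        counts.update(w for w in words if w not in STOPWORDS)
    return counts.most_common(top_n)

tweets = [
    {"created_at": "2017-05-10T19:01:00", "text": "What a catch! #IPL #CSKvMI"},
    {"created_at": "2017-05-10T19:02:30", "text": "Huge six in the last over #IPL"},
]
print(term_frequencies(tweets))
```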

  2. Low-Dose Aspirin Discontinuation and Risk of Cardiovascular Events: A Swedish Nationwide, Population-Based Cohort Study.

    PubMed

    Sundström, Johan; Hedberg, Jakob; Thuresson, Marcus; Aarskog, Pernilla; Johannesen, Kasper Munk; Oldgren, Jonas

    2017-09-26

    There are increasing concerns about risks associated with aspirin discontinuation in the absence of major surgery or bleeding. We investigated whether long-term low-dose aspirin discontinuation and treatment gaps increase the risk of cardiovascular events. We performed a cohort study of 601 527 users of low-dose aspirin for primary or secondary prevention in the Swedish prescription register between 2005 and 2009 who were >40 years of age, were free from previous cancer, and had ≥80% adherence during the first observed year of treatment. Cardiovascular events were identified with the Swedish inpatient and cause-of-death registers. The first 3 months after a major bleeding or surgical procedure were excluded from the time at risk. During a median of 3.0 years of follow-up, 62 690 cardiovascular events occurred. Patients who discontinued aspirin had a higher rate of cardiovascular events than those who continued (multivariable-adjusted hazard ratio, 1.37; 95% confidence interval, 1.34-1.41), corresponding to an additional cardiovascular event observed per year in 1 of every 74 patients who discontinue aspirin. The risk increased shortly after discontinuation and did not appear to diminish over time. In long-term users, discontinuation of low-dose aspirin in the absence of major surgery or bleeding was associated with a >30% increased risk of cardiovascular events. Adherence to low-dose aspirin treatment in the absence of major surgery or bleeding is likely an important treatment goal. © 2017 American Heart Association, Inc.

  3. What can neuromorphic event-driven precise timing add to spike-based pattern recognition?

    PubMed

    Akolkar, Himanshu; Meyer, Cedric; Clady, Xavier; Marre, Olivier; Bartolozzi, Chiara; Panzeri, Stefano; Benosman, Ryad

    2015-03-01

    This letter introduces a study to precisely measure what an increase in spike timing precision can add to spike-driven pattern recognition algorithms. The concept of generating spikes from images by converting gray levels into spike timings is currently at the basis of almost every spike-based modeling of biological visual systems. The use of images naturally leads to generating incorrect, artificial, and redundant spike timings and, more importantly, also contradicts biological findings indicating that visual processing is massively parallel and asynchronous with high temporal resolution. A new concept for acquiring visual information through pixel-individual asynchronous level-crossing sampling has been proposed in a recent generation of asynchronous neuromorphic visual sensors. Unlike conventional cameras, these sensors acquire data not at fixed points in time for the entire array but at fixed amplitude changes of their input, resulting in data that are optimally sparse in space and time, pixel-individual and precisely timed, produced only if new (previously unknown) information is available (event based). This letter uses the high temporal resolution spiking output of neuromorphic event-based visual sensors to show that lowering time precision degrades performance on several recognition tasks, specifically when reaching the conventional range of machine vision acquisition frequencies (30-60 Hz). The use of information theory to characterize separability between classes for each temporal resolution shows that high temporal acquisition provides up to 70% more information than conventional spikes generated from frame-based acquisition as used in standard artificial vision, thus drastically increasing the separability between classes of objects. Experiments on real data show that the amount of information loss is correlated with temporal precision. Our information-theoretic study highlights the potentials of neuromorphic asynchronous visual sensors for both practical applications and theoretical

  4. Simplifying Facility and Event Scheduling: Saving Time and Money.

    ERIC Educational Resources Information Center

    Raasch, Kevin

    2003-01-01

    Describes a product called the Event Management System (EMS), a computer software program to manage facility and event scheduling. Provides example of the school district and university uses of EMS. Describes steps in selecting a scheduling-management system. (PKP)

  5. An investigation into the two-stage meta-analytic copula modelling approach for evaluating time-to-event surrogate endpoints which comprise of one or more events of interest.

    PubMed

    Dimier, Natalie; Todd, Susan

    2017-09-01

    Clinical trials of experimental treatments must be designed with primary endpoints that directly measure clinical benefit for patients. In many disease areas, the recognised gold standard primary endpoint can take many years to mature, leading to challenges in the conduct and quality of clinical studies. There is increasing interest in using shorter-term surrogate endpoints as substitutes for costly long-term clinical trial endpoints; such surrogates need to be selected according to biological plausibility, as well as the ability to reliably predict the unobserved treatment effect on the long-term endpoint. A number of statistical methods to evaluate this prediction have been proposed; this paper uses a simulation study to explore one such method in the context of time-to-event surrogates for a time-to-event true endpoint. This two-stage meta-analytic copula method has been extensively studied for time-to-event surrogate endpoints with one event of interest, but thus far has not been explored for the assessment of surrogates which have multiple events of interest, such as those incorporating information directly from the true clinical endpoint. We assess the sensitivity of the method to various factors including strength of association between endpoints, the quantity of data available, and the effect of censoring. In particular, we consider scenarios where there exist very little data on which to assess surrogacy. Results show that the two-stage meta-analytic copula method performs well under certain circumstances and could be considered useful in practice, but demonstrates limitations that may prevent universal use. Copyright © 2017 John Wiley & Sons, Ltd.

  6. Exercise electrocardiogram in middle-aged and older leisure time sportsmen: 100 exercise tests would be enough to identify one silent myocardial ischemia at risk for cardiac event.

    PubMed

    Hupin, David; Edouard, Pascal; Oriol, Mathieu; Laukkanen, Jari; Abraham, Pierre; Doutreleau, Stéphane; Guy, Jean-Michel; Carré, François; Barthélémy, Jean-Claude; Roche, Frédéric; Chatard, Jean-Claude

    2018-04-15

    The importance of the exercise electrocardiogram (ECG) in the prevention of cardiac events among sportsmen has been controversial. The aim of this study was to determine the frequency of silent myocardial ischemia (SMI) from an exercise ECG and its relationship with induced coronary angiographic assessment and potentially preventable cardiac events. This prospective cohort study included leisure-time asymptomatic sportsmen over 35 years old, referred from 2011 to 2014 to the Sports Medicine Unit of the University Hospital of Saint-Etienne. Of the cohort of 1500 sportsmen (1205 men; mean age 50.7±9.4 years; physical activity level 32.8±26.8 MET-h/week), 951 (63%) had at least one cardiovascular disease (CVD) risk factor. Family history, medical examination and a standard resting 12-lead ECG were collected. A total of 163 exercise ECGs (10.9%) were defined as positive, most of them due to SMI (n=129, 8.6%). SMI was an indication for coronary angiography in 23 cases, leading to 17 documented SMIs (1.1%), including 11 significant stenoses requiring revascularization. In multivariate logistic regression analysis, a high risk of CVD (OR=2.65 [95% CI: 1.33-5.27], p=0.005) and an age >50 years (OR=2.71 [95% CI: 1.65-4.44], p<0.0001) were independently associated with confirmed SMI. The association of a positive exercise ECG with significant coronary stenosis was stronger among sportsmen with CVD risk factors and older than 50 years. Screening by exercise ECG can lower the risk of cardiac events in middle-aged and older sportsmen. One hundred tests would be enough to detect one silent myocardial ischemia at risk for cardiac event. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Effect of hyperlipidemia on the incidence of cardio-cerebrovascular events in patients with type 2 diabetes.

    PubMed

    Fan, Dabei; Li, Li; Li, Zhizhen; Zhang, Ying; Ma, Xiaojun; Wu, Lina; Qin, Guijun

    2018-05-08

    This study explored the effect of hyperlipidemia on the incidence of cardio-cerebrovascular diseases in patients with type 2 diabetes. Three hundred ninety-five patients with type 2 diabetes in our hospital from January 2012 to January 2016 were followed up for an average of 3.8 years. The incidence of cardio-cerebrovascular diseases was compared between the diabetes combined with hyperlipidemia group (195 patients) and the diabetes group (200 patients). A multivariable Cox proportional hazards regression model was used to analyze the effect of hyperlipidemia on the incidence of cardio-cerebrovascular diseases in patients with type 2 diabetes. Diastolic blood pressure, systolic blood pressure, high-density lipoprotein, low-density lipoprotein, body mass index and hyper-sensitive C-reactive protein were higher in the diabetes combined with hyperlipidemia group than in the diabetes group (P < 0.05). At the end of the follow-up period, all-cause mortality, cardio-cerebrovascular disease mortality, and the incidence of myocardial infarction, cerebral infarction, cerebral hemorrhage and total cardiovascular events were significantly higher in the diabetes combined with hyperlipidemia group than in the diabetes group (P < 0.05). The multivariable Cox proportional hazards regression analysis showed that the risks of myocardial infarction and total cardiovascular events in the diabetes combined with hyperlipidemia group were respectively 1.54 times (95% CI 1.13-2.07) and 1.68 times (95% CI 1.23-2.24) those in the diabetes group. The population attributable risk percent of all-cause mortality and total cardiovascular events in patients with type 2 diabetes combined with hyperlipidemia was 9.6% and 26.8%, respectively. Hyperlipidemia may promote vascular endothelial injury, increasing the risk of cardio-cerebrovascular diseases in patients with type 2 diabetes. Medical staff should pay attention to the control of blood lipids in patients with type 2

  8. Reducing uncertainty in Climate Response Time Scale by Bayesian Analysis of the 8.2 ka event

    NASA Astrophysics Data System (ADS)

    Lorenz, A.; Held, H.; Bauer, E.; Schneider von Deimling, T.

    2009-04-01

    We analyze the possibility of uncertainty reduction in the climate response time scale by utilizing Greenland ice-core data that contain the 8.2 ka event within a Bayesian model-data intercomparison with the Earth system model of intermediate complexity CLIMBER-2.3. Within a stochastic version of the model it has been possible to mimic the 8.2 ka event within a plausible experimental setting and with relatively good accuracy concerning the timing of the event, in comparison to other modeling exercises [1]. The simulation of the centennial cold event is effectively determined by the oceanic cooling rate, which depends largely on the ocean diffusivity described by diffusion coefficients with relatively wide uncertainty ranges. The idea is to discriminate between the different values of diffusivity according to their likelihood of correctly representing the duration of the 8.2 ka event, and thus to exploit the paleo data to constrain uncertainty in model parameters in analogy to [2]. In implementing this inverse Bayesian analysis with this model, the technical difficulty arises of establishing the related likelihood numerically in addition to the uncertain model parameters: while mainstream uncertainty analyses can assume a quasi-Gaussian shape of the likelihood, with weather fluctuating around a long-term mean, the 8.2 ka event as a highly nonlinear effect precludes such an a priori assumption. As a result of this study [3], the Bayesian analysis showed a reduction of uncertainty in the vertical ocean diffusivity parameters by a factor of 2 compared to prior knowledge. This learning effect on the model parameters is propagated to other model outputs of interest, e.g. the inverse ocean heat capacity, which is important for the dominant time scale of the climate response to anthropogenic forcing and which, in combination with climate sensitivity, strongly influences the climate system's reaction in the near- and medium-term future. References: [1] E. Bauer, A. Ganopolski, M. Montoya: Simulation of the

  9. An Event-Based Verification Scheme for the Real-Time Flare Detection System at Kanzelhöhe Observatory

    NASA Astrophysics Data System (ADS)

    Pötzi, W.; Veronig, A. M.; Temmer, M.

    2018-06-01

    In the framework of the Space Situational Awareness program of the European Space Agency (ESA/SSA), an automatic flare detection system was developed at Kanzelhöhe Observatory (KSO). The system has been in operation since mid-2013. The event detection algorithm was upgraded in September 2017, and all data back to 2014 were reprocessed using the new algorithm. In order to evaluate both algorithms, we apply verification measures that are commonly used for forecast validation. To overcome the problem of rare events, which biases the verification measures, we introduce a new event-based method: we divide the timeline of the Hα observations into positive events (flaring periods) and negative events (quiet periods), independent of the length of each event. In total, 329 positive and negative events were detected between 2014 and 2016. The hit rate for the new algorithm reached 96% (just five events were missed), with a false-alarm ratio of 17%. This is a significant improvement of the algorithm, as the original system had a hit rate of 85% and a false-alarm ratio of 33%. The true skill score and the Heidke skill score both reach values of 0.8 for the new algorithm; originally, they were at 0.5. The mean flare positions are accurate to within ±1 heliographic degree for both algorithms, and the peak times improve from a mean difference of 1.7±2.9 minutes to 1.3±2.3 minutes. The flare start times, which had been systematically late by about 3 minutes as determined by the original algorithm, now match the visual inspection within -0.47±4.10 minutes.
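
    For reference, the verification measures quoted above can all be computed from a 2 × 2 contingency table of detected versus observed flaring periods, as in the sketch below; the counts are illustrative values chosen to be roughly consistent with the rates reported in the abstract, not the paper's actual table.

```python
# Standard event-based verification scores from a 2x2 contingency table.
def verification_scores(hits, false_alarms, misses, correct_negatives):
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    pod = a / (a + c)                                   # hit rate (probability of detection)
    far = b / (a + b)                                   # false-alarm ratio
    tss = a / (a + c) - b / (b + d)                     # true skill score
    hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))  # Heidke skill score
    return {"hit rate": pod, "FAR": far, "TSS": tss, "HSS": hss}

# Illustrative counts only.
print(verification_scores(hits=158, false_alarms=32, misses=5, correct_negatives=134))
```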

  10. Evaluating principal surrogate endpoints with time-to-event data accounting for time-varying treatment efficacy

    PubMed Central

    Gabriel, Erin E.; Gilbert, Peter B.

    2014-01-01

    Principal surrogate (PS) endpoints are relatively inexpensive and easy to measure study outcomes that can be used to reliably predict treatment effects on clinical endpoints of interest. Few statistical methods for assessing the validity of potential PSs utilize time-to-event clinical endpoint information and to our knowledge none allow for the characterization of time-varying treatment effects. We introduce the time-dependent and surrogate-dependent treatment efficacy curve, TE(t|s), and a new augmented trial design for assessing the quality of a biomarker as a PS. We propose a novel Weibull model and an estimated maximum likelihood method for estimation of the TE(t|s) curve. We describe the operating characteristics of our methods via simulations. We analyze data from the Diabetes Control and Complications Trial, in which we find evidence of a biomarker with value as a PS. PMID:24337534

  11. Multivariate meta-analysis for non-linear and other multi-parameter associations

    PubMed Central

    Gasparrini, A; Armstrong, B; Kenward, M G

    2012-01-01

    In this paper, we formalize the application of multivariate meta-analysis and meta-regression to synthesize estimates of multi-parameter associations obtained from different studies. This modelling approach extends the standard two-stage analysis used to combine results across different sub-groups or populations. The most straightforward application is for the meta-analysis of non-linear relationships, described for example by regression coefficients of splines or other functions, but the methodology easily generalizes to any setting where complex associations are described by multiple correlated parameters. The modelling framework of multivariate meta-analysis is implemented in the package mvmeta within the statistical environment R. As an illustrative example, we propose a two-stage analysis for investigating the non-linear exposure–response relationship between temperature and non-accidental mortality using time-series data from multiple cities. Multivariate meta-analysis represents a useful analytical tool for studying complex associations through a two-stage procedure. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22807043

  12. Practical robustness measures in multivariable control system analysis. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Lehtomaki, N. A.

    1981-01-01

    The robustness of the stability of multivariable linear time-invariant feedback control systems with respect to model uncertainty is considered using frequency domain criteria. Available robustness tests are unified under a common framework based on the nature and structure of model errors. These results are derived using a multivariable version of Nyquist's stability theorem, in which the minimum singular value of the return difference transfer matrix is shown to be the multivariable generalization of the distance to the critical point on a single-input, single-output Nyquist diagram. Using the return difference transfer matrix, a very general robustness theorem is presented from which all of the robustness tests dealing with specific model errors may be derived. Robustness tests that explicitly utilize model error structure are able to guarantee feedback system stability in the face of model errors of larger magnitude than those that do not. The robustness of linear quadratic Gaussian control systems is analyzed.
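
    A small numerical illustration of the central quantity described above, the minimum singular value of the return difference matrix I + L(jω) swept over frequency, is given below; the 2 × 2 loop transfer matrix is purely illustrative and not taken from the thesis.

```python
# Frequency-domain robustness margin: min over frequency of sigma_min(I + L(jw)).
import numpy as np

def loop_tf(w):
    """Illustrative 2x2 loop transfer matrix L(jw)."""
    s = 1j * w
    return np.array([[1.0 / (s + 1.0), 0.2 / (s + 2.0)],
                     [0.1 / (s + 1.0), 2.0 / (s + 3.0)]])

def robustness_margin(freqs):
    """Smallest minimum singular value of I + L(jw) over the grid; larger means more robust."""
    margins = [np.linalg.svd(np.eye(2) + loop_tf(w), compute_uv=False)[-1] for w in freqs]
    return min(margins)

print(robustness_margin(np.logspace(-2, 2, 400)))
```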

  13. Statistical Searches for Microlensing Events in Large, Non-uniformly Sampled Time-Domain Surveys: A Test Using Palomar Transient Factory Data

    NASA Astrophysics Data System (ADS)

    Price-Whelan, Adrian M.; Agüeros, Marcel A.; Fournier, Amanda P.; Street, Rachel; Ofek, Eran O.; Covey, Kevin R.; Levitan, David; Laher, Russ R.; Sesar, Branimir; Surace, Jason

    2014-01-01

    Many photometric time-domain surveys are driven by specific goals, such as searches for supernovae or transiting exoplanets, which set the cadence with which fields are re-imaged. In the case of the Palomar Transient Factory (PTF), several sub-surveys are conducted in parallel, leading to non-uniform sampling over its ~20,000 deg² footprint. While the median 7.26 deg² PTF field has been imaged ~40 times in the R band, ~2300 deg² have been observed >100 times. We use PTF data to study the trade-off between searching for microlensing events in a survey whose footprint is much larger than that of typical microlensing searches, but with far-from-optimal time sampling. To examine the probability that microlensing events can be recovered in these data, we test statistics used on uniformly sampled data to identify variables and transients. We find that the von Neumann ratio performs best for identifying simulated microlensing events in our data. We develop a selection method using this statistic and apply it to data from fields with >10 R-band observations, 1.1 × 10⁹ light curves, uncovering three candidate microlensing events. We lack simultaneous, multi-color photometry to confirm these as microlensing events. However, their number is consistent with predictions for the event rate in the PTF footprint over the survey's three years of operations, as estimated from near-field microlensing models. This work can help constrain all-sky event rate predictions and tests microlensing signal recovery in large data sets, which will be useful to future time-domain surveys, such as that planned with the Large Synoptic Survey Telescope.
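
    The von Neumann ratio mentioned above has a one-line definition: the mean squared successive difference of the light curve divided by its variance. The sketch below, on synthetic data, shows how a smooth, microlensing-like brightening pulls the ratio well below the value of about 2 expected for uncorrelated scatter.

```python
# Von Neumann ratio of a light curve: smooth excursions drive it below ~2.
import numpy as np

def von_neumann_ratio(flux):
    flux = np.asarray(flux, dtype=float)
    delta2 = np.mean(np.diff(flux) ** 2)     # mean squared successive difference
    return delta2 / np.var(flux)

rng = np.random.default_rng(2)
t = np.linspace(-3, 3, 80)
noise = rng.normal(scale=0.05, size=t.size)
bump = 0.6 * np.exp(-t ** 2)                 # smooth, microlensing-like brightening
print(von_neumann_ratio(noise), von_neumann_ratio(noise + bump))
```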

  14. Deconstructing multivariate decoding for the study of brain function.

    PubMed

    Hebart, Martin N; Baker, Chris I

    2017-08-04

    Multivariate decoding methods were developed originally as tools to enable accurate predictions in real-world applications. The realization that these methods can also be employed to study brain function has led to their widespread adoption in the neurosciences. However, prior to the rise of multivariate decoding, the study of brain function was firmly embedded in a statistical philosophy grounded on univariate methods of data analysis. In this way, multivariate decoding for brain interpretation grew out of two established frameworks: multivariate decoding for predictions in real-world applications, and classical univariate analysis based on the study and interpretation of brain activation. We argue that this led to two confusions, one reflecting a mixture of multivariate decoding for prediction or interpretation, and the other a mixture of the conceptual and statistical philosophies underlying multivariate decoding and classical univariate analysis. Here we attempt to systematically disambiguate multivariate decoding for the study of brain function from the frameworks it grew out of. After elaborating these confusions and their consequences, we describe six, often unappreciated, differences between classical univariate analysis and multivariate decoding. We then focus on how the common interpretation of what is signal and noise changes in multivariate decoding. Finally, we use four examples to illustrate where these confusions may impact the interpretation of neuroimaging data. We conclude with a discussion of potential strategies to help resolve these confusions in interpreting multivariate decoding results, including the potential departure from multivariate decoding methods for the study of brain function. Copyright © 2017. Published by Elsevier Inc.

  15. MEMD-enhanced multivariate fuzzy entropy for the evaluation of complexity in biomedical signals.

    PubMed

    Azami, Hamed; Smith, Keith; Escudero, Javier

    2016-08-01

    Multivariate multiscale entropy (mvMSE) has been proposed as a combination of the coarse-graining process and multivariate sample entropy (mvSE) to quantify the irregularity of multivariate signals. However, both the coarse-graining process and mvSE may not be reliable for short signals. Although the coarse-graining process can be replaced with multivariate empirical mode decomposition (MEMD), the relative instability of mvSE for short signals remains a problem. Here, we address this issue by proposing the multivariate fuzzy entropy (mvFE) with a new fuzzy membership function. The results using white Gaussian noise show that the mvFE leads to more reliable and stable results, especially for short signals, in comparison with mvSE. Accordingly, we propose MEMD-enhanced mvFE to quantify the complexity of signals. The characteristics of brain regions influenced by partial epilepsy are investigated by focal and non-focal electroencephalogram (EEG) time series. In this sense, the proposed MEMD-enhanced mvFE and mvSE are employed to discriminate focal EEG signals from non-focal ones. The results demonstrate the MEMD-enhanced mvFE values have a smaller coefficient of variation in comparison with those obtained by the MEMD-enhanced mvSE, even for long signals. The results also show that the MEMD-enhanced mvFE has better performance to quantify focal and non-focal signals compared with multivariate multiscale permutation entropy.

  16. Considerations for analysis of time-to-event outcomes measured with error: Bias and correction with SIMEX.

    PubMed

    Oh, Eric J; Shepherd, Bryan E; Lumley, Thomas; Shaw, Pamela A

    2018-04-15

    For time-to-event outcomes, a rich literature exists on the bias introduced by covariate measurement error in regression models, such as the Cox model, and methods of analysis to address this bias. By comparison, less attention has been given to understanding the impact or addressing errors in the failure time outcome. For many diseases, the timing of an event of interest (such as progression-free survival or time to AIDS progression) can be difficult to assess or reliant on self-report and therefore prone to measurement error. For linear models, it is well known that random errors in the outcome variable do not bias regression estimates. With nonlinear models, however, even random error or misclassification can introduce bias into estimated parameters. We compare the performance of 2 common regression models, the Cox and Weibull models, in the setting of measurement error in the failure time outcome. We introduce an extension of the SIMEX method to correct for bias in hazard ratio estimates from the Cox model and discuss other analysis options to address measurement error in the response. A formula to estimate the bias induced into the hazard ratio by classical measurement error in the event time for a log-linear survival model is presented. Detailed numerical studies are presented to examine the performance of the proposed SIMEX method under varying levels and parametric forms of the error in the outcome. We further illustrate the method with observational data on HIV outcomes from the Vanderbilt Comprehensive Care Clinic. Copyright © 2017 John Wiley & Sons, Ltd.
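
    The generic SIMEX recipe referred to above, adding extra simulated error at increasing scales and extrapolating the estimate back to a no-error scale of λ = -1, is sketched below on the classical covariate-measurement-error slope-attenuation problem rather than the paper's Cox/Weibull survival setting; the quadratic extrapolant and all parameter values are illustrative assumptions.

```python
# SIMEX sketch: inflate the measurement error at scales lambda, track the naive
# estimate, and extrapolate back to lambda = -1 (the error-free case).
import numpy as np

rng = np.random.default_rng(3)
n, beta, sigma_u = 2000, 1.0, 0.5
x = rng.normal(size=n)
y = beta * x + rng.normal(scale=0.3, size=n)
w = x + rng.normal(scale=sigma_u, size=n)        # error-prone covariate actually observed

def slope(w_obs):
    return np.polyfit(w_obs, y, 1)[0]            # naive least-squares slope

lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
estimates = [np.mean([slope(w + rng.normal(scale=np.sqrt(lam) * sigma_u, size=n))
                      for _ in range(50)]) for lam in lambdas]
coef = np.polyfit(lambdas, estimates, 2)         # quadratic extrapolant in lambda
print("naive:", round(estimates[0], 3), "SIMEX:", round(np.polyval(coef, -1.0), 3))
```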

  17. Multivariate missing data in hydrology - Review and applications

    NASA Astrophysics Data System (ADS)

    Ben Aissia, Mohamed-Aymen; Chebana, Fateh; Ouarda, Taha B. M. J.

    2017-12-01

    Water resources planning and management require complete data sets for a number of hydrological variables, such as flood peaks and volumes. However, hydrologists are often faced with the problem of missing data (MD) in hydrological databases. Several methods are used to deal with the imputation of MD. During the last decade, multivariate approaches have gained popularity in the field of hydrology, especially in hydrological frequency analysis (HFA). However, the treatment of MD remains neglected in the multivariate HFA literature, where the focus has been mainly on the modeling component. For a complete analysis and in order to optimize the use of data, MD should also be treated in the multivariate setting prior to modeling and inference. Imputation of MD in the multivariate hydrological framework can have direct implications on the quality of the estimation. Indeed, the dependence between the series represents important additional information that can be included in the imputation process. The objective of the present paper is to highlight the importance of treating MD in multivariate hydrological frequency analysis by reviewing and applying multivariate imputation methods and by comparing univariate and multivariate imputation methods. An application is carried out for multiple flood attributes on three sites in order to evaluate the performance of the different methods based on the leave-one-out procedure. The results indicate that the performance of imputation methods can be improved by adopting the multivariate setting, compared to mean substitution and interpolation methods, especially when using the copula-based approach.

  18. BN-FLEMOps pluvial - A probabilistic multi-variable loss estimation model for pluvial floods

    NASA Astrophysics Data System (ADS)

    Roezer, V.; Kreibich, H.; Schroeter, K.; Doss-Gollin, J.; Lall, U.; Merz, B.

    2017-12-01

    Pluvial flood events, such as in Copenhagen (Denmark) in 2011, Beijing (China) in 2012 or Houston (USA) in 2016, have caused severe losses to urban dwellings in recent years. These floods are caused by storm events with high rainfall rates well above the design levels of urban drainage systems, which lead to inundation of streets and buildings. A projected increase in frequency and intensity of heavy rainfall events in many areas and an ongoing urbanization may increase pluvial flood losses in the future. For an efficient risk assessment and adaptation to pluvial floods, a quantification of the flood risk is needed. Few loss models have been developed particularly for pluvial floods. These models usually use simple waterlevel- or rainfall-loss functions and come with very high uncertainties. To account for these uncertainties and improve the loss estimation, we present a probabilistic multi-variable loss estimation model for pluvial floods based on empirical data. The model was developed in a two-step process using a machine learning approach and a comprehensive database comprising 783 records of direct building and content damage of private households. The data was gathered through surveys after four different pluvial flood events in Germany between 2005 and 2014. In a first step, linear and non-linear machine learning algorithms, such as tree-based and penalized regression models were used to identify the most important loss influencing factors among a set of 55 candidate variables. These variables comprise hydrological and hydraulic aspects, early warning, precaution, building characteristics and the socio-economic status of the household. In a second step, the most important loss influencing variables were used to derive a probabilistic multi-variable pluvial flood loss estimation model based on Bayesian Networks. Two different networks were tested: a score-based network learned from the data and a network based on expert knowledge. Loss predictions are made

  19. Replica analysis of overfitting in regression models for time-to-event data

    NASA Astrophysics Data System (ADS)

    Coolen, A. C. C.; Barrett, J. E.; Paga, P.; Perez-Vicente, C. J.

    2017-09-01

    Overfitting, which happens when the number of parameters in a model is too large compared to the number of data points available for determining these parameters, is a serious and growing problem in survival analysis. While modern medicine presents us with data of unprecedented dimensionality, these data cannot yet be used effectively for clinical outcome prediction. Standard error measures in maximum likelihood regression, such as p-values and z-scores, are blind to overfitting, and even for Cox’s proportional hazards model (the main tool of medical statisticians), one finds in literature only rules of thumb on the number of samples required to avoid overfitting. In this paper we present a mathematical theory of overfitting in regression models for time-to-event data, which aims to increase our quantitative understanding of the problem and provide practical tools with which to correct regression outcomes for the impact of overfitting. It is based on the replica method, a statistical mechanical technique for the analysis of heterogeneous many-variable systems that has been used successfully for several decades in physics, biology, and computer science, but not yet in medical statistics. We develop the theory initially for arbitrary regression models for time-to-event data, and verify its predictions in detail for the popular Cox model.

  20. Rare events modeling with support vector machine: Application to forecasting large-amplitude geomagnetic substorms and extreme events in financial markets.

    NASA Astrophysics Data System (ADS)

    Gavrishchaka, V. V.; Ganguli, S. B.

    2001-12-01

    Reliable forecasting of rare events in a complex dynamical system is a challenging problem that is important for many practical applications. Due to the nature of rare events, the data set available for construction of a statistical and/or machine learning model is often very limited and incomplete. Therefore many widely used approaches, including such robust algorithms as neural networks, can easily become inadequate for rare event prediction. Moreover, in many practical cases models with high-dimensional inputs are required. This limits applications of the existing rare event modeling techniques (e.g., extreme value theory) that focus on univariate cases, as these approaches are not easily extended to multivariate cases. A support vector machine (SVM) is a machine learning system that can provide optimal generalization using very limited and incomplete training data sets and can efficiently handle high-dimensional data. These features may make it possible to use SVMs to model rare events in some applications. We have applied an SVM-based system to the problem of large-amplitude substorm prediction and extreme event forecasting in stock and currency exchange markets. Encouraging preliminary results will be presented and other possible applications of the system will be discussed.
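
    As a toy illustration of the idea (not the authors' system), the sketch below fits a kernel support vector machine to a heavily imbalanced, 20-dimensional synthetic data set, using class weighting so the rare class is not ignored; it assumes scikit-learn is available and all data are synthetic.

```python
# Kernel SVM on an imbalanced, high-dimensional synthetic rare-event problem.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(4)
X_common = rng.normal(0.0, 1.0, size=(980, 20))
X_rare = rng.normal(2.0, 1.0, size=(20, 20))       # rare "extreme event" class
X = np.vstack([X_common, X_rare])
y = np.array([0] * 980 + [1] * 20)

# class_weight="balanced" reweights the loss so the 2% rare class still matters.
clf = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
print(clf.predict(rng.normal(2.0, 1.0, size=(5, 20))))   # should mostly flag the rare class
```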

  1. Towards real-time regional earthquake simulation I: real-time moment tensor monitoring (RMT) for regional events in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong; Liang, Wen-Tzong; Cheng, Hui-Wen; Tu, Feng-Shan; Ma, Kuo-Fong; Tsuruoka, Hiroshi; Kawakatsu, Hitoshi; Huang, Bor-Shouh; Liu, Chun-Chi

    2014-01-01

    We have developed a real-time moment tensor monitoring system (RMT) which takes advantage of a grid-based moment tensor inversion technique and real-time broad-band seismic recordings to automatically monitor earthquake activities in the vicinity of Taiwan. The centroid moment tensor (CMT) inversion technique and a grid search scheme are applied to obtain the information of earthquake source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism. All of these source parameters can be determined simultaneously within 117 s after the occurrence of an earthquake. The monitoring area involves the entire Taiwan Island and the offshore region, which covers the area of 119.3°E to 123.0°E and 21.0°N to 26.0°N, with a depth from 6 to 136 km. A 3-D grid system is implemented in the monitoring area with a uniform horizontal interval of 0.1° and a vertical interval of 10 km. The inversion procedure is based on a 1-D Green's function database calculated by the frequency-wavenumber (fk) method. We compare our results with the Central Weather Bureau (CWB) catalogue data for earthquakes occurred between 2010 and 2012. The average differences between event origin time and hypocentral location are less than 2 s and 10 km, respectively. The focal mechanisms determined by RMT are also comparable with the Broadband Array in Taiwan for Seismology (BATS) CMT solutions. These results indicate that the RMT system is realizable and efficient to monitor local seismic activities. In addition, the time needed to obtain all the point source parameters is reduced substantially compared to routine earthquake reports. By connecting RMT with a real-time online earthquake simulation (ROS) system, all the source parameters will be forwarded to the ROS to make the real-time earthquake simulation feasible. The RMT has operated offline (2010-2011) and online (since January 2012 to present) at the Institute of Earth Sciences (IES), Academia Sinica

  2. Shedding Light on the Etiology of Sports Injuries: A Look Behind the Scenes of Time-to-Event Analyses.

    PubMed

    Nielsen, Rasmus Østergaard; Malisoux, Laurent; Møller, Merete; Theisen, Daniel; Parner, Erik Thorlund

    2016-04-01

    The etiological mechanism underpinning any sports-related injury is complex and multifactorial. Frequently, athletes perceive "excessive training" as the principal factor in their injury, an observation that is biologically plausible yet somewhat ambiguous. If the applied training load is suddenly increased, this may increase the risk for sports injury development, irrespective of the absolute amount of training. Indeed, little to no rigorous scientific evidence exists to support the hypothesis that fluctuations in training load, compared to absolute training load, are more important in explaining sports injury development. One reason for this could be that prospective data from scientific studies should be analyzed in a different manner. Time-to-event analysis is a useful statistical tool in which to analyze the influence of changing exposures on injury risk. However, the potential of time-to-event analysis remains insufficiently exploited in sports injury research. Therefore, the purpose of the present article was to present and discuss measures of association used in time-to-event analyses and to present the advanced concept of time-varying exposures and outcomes. In the paper, different measures of association, such as cumulative relative risk, cumulative risk difference, and the classical hazard rate ratio, are presented in a nontechnical manner, and suggestions for interpretation of study results are provided. To summarize, time-to-event analysis complements the statistical arsenal of sports injury prevention researchers, because it enables them to analyze the complex and highly dynamic reality of injury etiology, injury recurrence, and time to recovery across a range of sporting contexts.

  3. An iterative technique to stabilize a linear time invariant multivariable system with output feedback

    NASA Technical Reports Server (NTRS)

    Sankaran, V.

    1974-01-01

    An iterative procedure for determining the constant gain matrix that will stabilize a linear constant multivariable system using output feedback is described. The use of this procedure avoids the transformation of variables which is required in other procedures. For the case in which the product of the output and input vector dimensions is greater than the number of states of the plant, a general solution is given. For the case in which the number of states exceeds the product of the input and output vector dimensions, a least-squares solution, which may not be stable in all cases, is presented. The results are illustrated with examples.

  4. Stochastic modeling of neurobiological time series: Power, coherence, Granger causality, and separation of evoked responses from ongoing activity

    NASA Astrophysics Data System (ADS)

    Chen, Yonghong; Bressler, Steven L.; Knuth, Kevin H.; Truccolo, Wilson A.; Ding, Mingzhou

    2006-06-01

    In this article we consider the stochastic modeling of neurobiological time series from cognitive experiments. Our starting point is the variable-signal-plus-ongoing-activity model. From this model a differentially variable component analysis strategy is developed from a Bayesian perspective to estimate event-related signals on a single trial basis. After subtracting out the event-related signal from recorded single trial time series, the residual ongoing activity is treated as a piecewise stationary stochastic process and analyzed by an adaptive multivariate autoregressive modeling strategy which yields power, coherence, and Granger causality spectra. Results from applying these methods to local field potential recordings from monkeys performing cognitive tasks are presented.

  5. Tethered to the EHR: Primary Care Physician Workload Assessment Using EHR Event Log Data and Time-Motion Observations

    PubMed Central

    Arndt, Brian G.; Beasley, John W.; Watkinson, Michelle D.; Temte, Jonathan L.; Tuan, Wen-Jan; Sinsky, Christine A.; Gilchrist, Valerie J.

    2017-01-01

    PURPOSE Primary care physicians spend nearly 2 hours on electronic health record (EHR) tasks per hour of direct patient care. Demand for non–face-to-face care, such as communication through a patient portal and administrative tasks, is increasing and contributing to burnout. The goal of this study was to assess time allocated by primary care physicians within the EHR as indicated by EHR user-event log data, both during clinic hours (defined as 8:00 am to 6:00 pm Monday through Friday) and outside clinic hours. METHODS We conducted a retrospective cohort study of 142 family medicine physicians in a single system in southern Wisconsin. All Epic (Epic Systems Corporation) EHR interactions were captured from “event logging” records over a 3-year period for both direct patient care and non–face-to-face activities, and were validated by direct observation. EHR events were assigned to 1 of 15 EHR task categories and allocated to either during or after clinic hours. RESULTS Clinicians spent 355 minutes (5.9 hours) of an 11.4-hour workday in the EHR per weekday per 1.0 clinical full-time equivalent: 269 minutes (4.5 hours) during clinic hours and 86 minutes (1.4 hours) after clinic hours. Clerical and administrative tasks including documentation, order entry, billing and coding, and system security accounted for nearly one-half of the total EHR time (157 minutes, 44.2%). Inbox management accounted for another 85 minutes (23.7%). CONCLUSIONS Primary care physicians spend more than one-half of their workday, nearly 6 hours, interacting with the EHR during and after clinic hours. EHR event logs can identify areas of EHR-related work that could be delegated, thus reducing workload, improving professional satisfaction, and decreasing burnout. Direct time-motion observations validated EHR-event log data as a reliable source of information regarding clinician time allocation. PMID:28893811

  6. Tethered to the EHR: Primary Care Physician Workload Assessment Using EHR Event Log Data and Time-Motion Observations.

    PubMed

    Arndt, Brian G; Beasley, John W; Watkinson, Michelle D; Temte, Jonathan L; Tuan, Wen-Jan; Sinsky, Christine A; Gilchrist, Valerie J

    2017-09-01

    Primary care physicians spend nearly 2 hours on electronic health record (EHR) tasks per hour of direct patient care. Demand for non-face-to-face care, such as communication through a patient portal and administrative tasks, is increasing and contributing to burnout. The goal of this study was to assess time allocated by primary care physicians within the EHR as indicated by EHR user-event log data, both during clinic hours (defined as 8:00 am to 6:00 pm Monday through Friday) and outside clinic hours. We conducted a retrospective cohort study of 142 family medicine physicians in a single system in southern Wisconsin. All Epic (Epic Systems Corporation) EHR interactions were captured from "event logging" records over a 3-year period for both direct patient care and non-face-to-face activities, and were validated by direct observation. EHR events were assigned to 1 of 15 EHR task categories and allocated to either during or after clinic hours. Clinicians spent 355 minutes (5.9 hours) of an 11.4-hour workday in the EHR per weekday per 1.0 clinical full-time equivalent: 269 minutes (4.5 hours) during clinic hours and 86 minutes (1.4 hours) after clinic hours. Clerical and administrative tasks including documentation, order entry, billing and coding, and system security accounted for nearly one-half of the total EHR time (157 minutes, 44.2%). Inbox management accounted for another 85 minutes (23.7%). Primary care physicians spend more than one-half of their workday, nearly 6 hours, interacting with the EHR during and after clinic hours. EHR event logs can identify areas of EHR-related work that could be delegated, thus reducing workload, improving professional satisfaction, and decreasing burnout. Direct time-motion observations validated EHR-event log data as a reliable source of information regarding clinician time allocation. © 2017 Annals of Family Medicine, Inc.
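
    As a rough illustration of how event-log records can be allocated to clinic versus after-hours time, the sketch below uses pandas on a tiny hypothetical extract; the column names (clinician_id, timestamp, task, minutes) are assumptions rather than the Epic event-log schema, and the 8:00 am to 6:00 pm weekday window follows the study's definition of clinic hours.

    ```python
    import pandas as pd

    # Hypothetical event-log extract: one row per EHR event with a timestamp,
    # a task category, and the time attributed to the event (minutes).
    log = pd.DataFrame({
        "clinician_id": [1, 1, 1, 2],
        "timestamp": pd.to_datetime([
            "2017-03-06 09:15", "2017-03-06 19:30",
            "2017-03-07 10:05", "2017-03-06 12:40",
        ]),
        "task": ["documentation", "inbox", "order entry", "billing"],
        "minutes": [12.0, 8.5, 6.0, 4.0],
    })

    # Clinic hours as defined in the study: 8:00 am - 6:00 pm, Monday through Friday.
    in_clinic = (
        log["timestamp"].dt.weekday.lt(5)
        & log["timestamp"].dt.hour.between(8, 17)
    )
    log["period"] = in_clinic.map({True: "clinic hours", False: "after hours"})

    # Minutes per task category, split into clinic hours vs. after hours.
    summary = log.groupby(["period", "task"])["minutes"].sum().unstack(fill_value=0)
    print(summary)
    ```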

  7. The joint associations of occupational, commuting, and leisure-time physical activity, and the Framingham risk score on the 10-year risk of coronary heart disease.

    PubMed

    Hu, Gang; Tuomilehto, Jaakko; Borodulin, Katja; Jousilahti, Pekka

    2007-02-01

    To determine joint associations of different kinds of physical activity and the Framingham risk score (FRS) with the 10-year risk of coronary heart disease (CHD) events. Study cohorts included 41 053 Finnish participants aged 25-64 years without history of CHD and stroke. The multivariable-adjusted 10-year hazard ratios (HRs) of coronary events associated with low, moderate, and high occupational physical activity were 1.00, 0.66, and 0.74 (Ptrend<0.001) for men, and 1.00, 0.53, and 0.58 (Ptrend<0.001) for women, respectively. The multivariable-adjusted 10-year HRs of coronary events associated with low, moderate, and high leisure-time physical activity were 1.00, 0.97, and 0.66 (Ptrend=0.002) for men, and 1.00, 0.74, and 0.54 (Ptrend=0.003) for women, respectively. Active commuting had a significant inverse association with 10-year risk of coronary events in women only. The FRS predicted 10-year risk of coronary events among both men and women. The protective effects of occupational, commuting, or leisure-time physical activity were consistent in subjects with a very low (<6%), low (6-9%), intermediate (10-19%), or high (>or=20%) risk of the FRS. Moderate or high levels of occupational or leisure-time physical activity among both men and women, and daily walking or cycling to and from work among women are associated with a reduced 10-year risk of CHD events. These favourable effects of physical activity on CHD risk are observed at all levels of CHD risk based on FRS assessment.

  8. Towards Understanding the Timing and Frequency of Rain-on-Snow (ROS) Events in Alaska

    NASA Astrophysics Data System (ADS)

    Pan, C.; Kirchner, P. B.; Kimball, J. S.; Kim, Y.; Kamp, U.

    2017-12-01

    Rain-on-snow (ROS) events affect ecosystem processes at multiple spatial and temporal scales including hydrology, carbon cycling, wildlife movement and human transportation and result in marked changes to snowpack processes including enhanced snow melt, surface albedo and energy balance. Changes in the surface structure of the snowpack are visible through optical remote sensing and changes in the relative content and distribution of water, air and ice in the snowpack are detectable using passive microwave remote sensing. This project aims to develop ROS products to elucidate changes in frequency and distribution in ROS events using satellite data products derived from both optical and passive microwave satellite records. To detect ROS events, we use downscaled brightness temperature measurements derived from vertical and horizontal polarizations at 19 and 37 GHz from the Advanced Microwave Scanning Radiometer (AMSR-E/2) passive microwave satellites. Preliminary results indicate an overall classification accuracy of 77.6% relative to in situ weather observations including surface air temperature, precipitation, and snow depth. ROS events are spatially distributed largely to elevations below 500 m and occur most frequently on northern to western aspects in addition to southeastern. Regional ROS hot spots occur in southwest Alaska characterized by warmer climates and transient snowcover. The seasonal timing of ROS events indicates increasing frequency during the fall and spring months.

  9. Accounting for individual differences and timing of events: estimating the effect of treatment on criminal convictions in heroin users

    PubMed Central

    2014-01-01

    Background The reduction of crime is an important outcome of opioid maintenance treatment (OMT). Criminal intensity and treatment regimes vary among OMT patients, but this is rarely adjusted for in statistical analyses, which tend to focus on cohort incidence rates and rate ratios. The purpose of this work was to estimate the relationship between treatment and criminal convictions among OMT patients, adjusting for individual covariate information and timing of events, fitting time-to-event regression models of increasing complexity. Methods National criminal records were cross linked with treatment data on 3221 patients starting OMT in Norway 1997–2003. In addition to calculating cohort incidence rates, criminal convictions were modelled as a recurrent-event dependent variable, and treatment as a time-dependent covariate, in Cox proportional hazards, Aalen’s additive hazards, and semi-parametric additive hazards regression models. Both fixed and dynamic covariates were included. Results During OMT, the number of days with criminal convictions for the cohort as a whole was 61% lower than when not in treatment. OMT was associated with a reduced number of days with criminal convictions in all time-to-event regression models, but the hazard ratio (95% CI) was strongly attenuated when adjusting for covariates; from 0.40 (0.35, 0.45) in a univariate model to 0.79 (0.72, 0.87) in a fully adjusted model. The hazard was lower for females and decreasing with older age, while increasing with high numbers of criminal convictions prior to application to OMT (all p < 0.001). The strongest predictors were level of criminal activity prior to entering into OMT, and having a recent criminal conviction (both p < 0.001). The effect of several predictors was significantly time-varying with their effects diminishing over time. Conclusions Analyzing complex observational data with regard to fixed factors only overlooks important temporal information, and naïve cohort level incidence

  10. Accounting for individual differences and timing of events: estimating the effect of treatment on criminal convictions in heroin users.

    PubMed

    Røislien, Jo; Clausen, Thomas; Gran, Jon Michael; Bukten, Anne

    2014-05-17

    The reduction of crime is an important outcome of opioid maintenance treatment (OMT). Criminal intensity and treatment regimes vary among OMT patients, but this is rarely adjusted for in statistical analyses, which tend to focus on cohort incidence rates and rate ratios. The purpose of this work was to estimate the relationship between treatment and criminal convictions among OMT patients, adjusting for individual covariate information and timing of events, fitting time-to-event regression models of increasing complexity. National criminal records were cross linked with treatment data on 3221 patients starting OMT in Norway 1997-2003. In addition to calculating cohort incidence rates, criminal convictions were modelled as a recurrent-event dependent variable, and treatment as a time-dependent covariate, in Cox proportional hazards, Aalen's additive hazards, and semi-parametric additive hazards regression models. Both fixed and dynamic covariates were included. During OMT, the number of days with criminal convictions for the cohort as a whole was 61% lower than when not in treatment. OMT was associated with a reduced number of days with criminal convictions in all time-to-event regression models, but the hazard ratio (95% CI) was strongly attenuated when adjusting for covariates; from 0.40 (0.35, 0.45) in a univariate model to 0.79 (0.72, 0.87) in a fully adjusted model. The hazard was lower for females and decreasing with older age, while increasing with high numbers of criminal convictions prior to application to OMT (all p < 0.001). The strongest predictors were level of criminal activity prior to entering into OMT, and having a recent criminal conviction (both p < 0.001). The effect of several predictors was significantly time-varying with their effects diminishing over time. Analyzing complex observational data with regard to fixed factors only overlooks important temporal information, and naïve cohort level incidence rates might result in biased estimates of the
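
    A minimal sketch of the kind of model described above, a Cox regression with treatment entered as a time-dependent covariate in counting-process (start, stop] form, is shown below using lifelines' CoxTimeVaryingFitter on synthetic data; the data layout, covariates, and effect sizes are assumptions, and the study additionally fitted Aalen and semi-parametric additive hazards models.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxTimeVaryingFitter

    rng = np.random.default_rng(1)
    rows = []
    for pid in range(200):
        age = rng.integers(20, 50)
        t_start_omt = rng.integers(30, 300)     # day on which treatment starts
        # interval before treatment: higher conviction probability
        rows.append(dict(id=pid, start=0, stop=t_start_omt, in_treatment=0,
                         age=age, event=int(rng.random() < 0.5)))
        # interval on treatment: lower conviction probability
        rows.append(dict(id=pid, start=t_start_omt, stop=365, in_treatment=1,
                         age=age, event=int(rng.random() < 0.2)))
    df = pd.DataFrame(rows)

    ctv = CoxTimeVaryingFitter()
    ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
    ctv.print_summary()   # hazard ratio < 1 for in_treatment indicates fewer convictions
    ```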

  11. Multivariate Density Estimation and Remote Sensing

    NASA Technical Reports Server (NTRS)

    Scott, D. W.

    1983-01-01

    Current efforts to develop methods and computer algorithms to effectively represent multivariate data commonly encountered in remote sensing applications are described. While this may involve scatter diagrams, multivariate representations of nonparametric probability density estimates are emphasized. The density function provides a useful graphical tool for looking at data and a useful theoretical tool for classification. The approach is illustrated with a thunderstorm data analysis.
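
    A minimal sketch of a nonparametric multivariate density estimate of the kind discussed above, assuming a toy two-band feature space and a Gaussian kernel (scipy's gaussian_kde) rather than the methods developed in the report:

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)
    # Toy stand-in for a two-band remote-sensing feature space: two clusters.
    band1 = np.concatenate([rng.normal(0.2, 0.05, 500), rng.normal(0.6, 0.08, 500)])
    band2 = np.concatenate([rng.normal(0.3, 0.05, 500), rng.normal(0.7, 0.08, 500)])
    data = np.vstack([band1, band2])          # gaussian_kde expects shape (d, n)

    kde = gaussian_kde(data)                  # nonparametric multivariate density
    grid = np.vstack([g.ravel() for g in np.mgrid[0:1:100j, 0:1:100j]])
    density = kde(grid).reshape(100, 100)     # evaluate on a 100x100 grid
    print("peak density value:", float(density.max()))
    ```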

  12. Convolutional neural networks applied to neutrino events in a liquid argon time projection chamber

    NASA Astrophysics Data System (ADS)

    Acciarri, R.; Adams, C.; An, R.; Asaadi, J.; Auger, M.; Bagby, L.; Baller, B.; Barr, G.; Bass, M.; Bay, F.; Bishai, M.; Blake, A.; Bolton, T.; Bugel, L.; Camilleri, L.; Caratelli, D.; Carls, B.; Castillo Fernandez, R.; Cavanna, F.; Chen, H.; Church, E.; Cianci, D.; Collin, G. H.; Conrad, J. M.; Convery, M.; Crespo-Anadón, J. I.; Del Tutto, M.; Devitt, D.; Dytman, S.; Eberly, B.; Ereditato, A.; Escudero Sanchez, L.; Esquivel, J.; Fleming, B. T.; Foreman, W.; Furmanski, A. P.; Garvey, G. T.; Genty, V.; Goeldi, D.; Gollapinni, S.; Graf, N.; Gramellini, E.; Greenlee, H.; Grosso, R.; Guenette, R.; Hackenburg, A.; Hamilton, P.; Hen, O.; Hewes, J.; Hill, C.; Ho, J.; Horton-Smith, G.; James, C.; de Vries, J. Jan; Jen, C.-M.; Jiang, L.; Johnson, R. A.; Jones, B. J. P.; Joshi, J.; Jostlein, H.; Kaleko, D.; Karagiorgi, G.; Ketchum, W.; Kirby, B.; Kirby, M.; Kobilarcik, T.; Kreslo, I.; Laube, A.; Li, Y.; Lister, A.; Littlejohn, B. R.; Lockwitz, S.; Lorca, D.; Louis, W. C.; Luethi, M.; Lundberg, B.; Luo, X.; Marchionni, A.; Mariani, C.; Marshall, J.; Martinez Caicedo, D. A.; Meddage, V.; Miceli, T.; Mills, G. B.; Moon, J.; Mooney, M.; Moore, C. D.; Mousseau, J.; Murrells, R.; Naples, D.; Nienaber, P.; Nowak, J.; Palamara, O.; Paolone, V.; Papavassiliou, V.; Pate, S. F.; Pavlovic, Z.; Porzio, D.; Pulliam, G.; Qian, X.; Raaf, J. L.; Rafique, A.; Rochester, L.; von Rohr, C. Rudolf; Russell, B.; Schmitz, D. W.; Schukraft, A.; Seligman, W.; Shaevitz, M. H.; Sinclair, J.; Snider, E. L.; Soderberg, M.; Söldner-Rembold, S.; Soleti, S. R.; Spentzouris, P.; Spitz, J.; St. John, J.; Strauss, T.; Szelc, A. M.; Tagg, N.; Terao, K.; Thomson, M.; Toups, M.; Tsai, Y.-T.; Tufanli, S.; Usher, T.; Van de Water, R. G.; Viren, B.; Weber, M.; Weston, J.; Wickremasinghe, D. A.; Wolbers, S.; Wongjirad, T.; Woodruff, K.; Yang, T.; Zeller, G. P.; Zennamo, J.; Zhang, C.

    2017-03-01

    We present several studies of convolutional neural networks applied to data coming from the MicroBooNE detector, a liquid argon time projection chamber (LArTPC). The algorithms studied include the classification of single particle images, the localization of single particle and neutrino interactions in an image, and the detection of a simulated neutrino event overlaid with cosmic ray backgrounds taken from real detector data. These studies demonstrate the potential of convolutional neural networks for particle identification or event detection on simulated neutrino interactions. We also address technical issues that arise when applying this technique to data from a large LArTPC at or near ground level.
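
    A minimal PyTorch sketch of a small convolutional classifier of the general kind described above (single-particle image classification); the architecture, image size, class count, and random stand-in data are assumptions and bear no relation to the MicroBooNE networks.

    ```python
    import torch
    import torch.nn as nn

    class SmallParticleCNN(nn.Module):
        """Toy CNN for single-particle image classification (assumed 5 classes)."""

        def __init__(self, n_classes: int = 5):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(64, n_classes)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.classifier(self.features(x).flatten(1))

    # One training step on random stand-in images (batch of 8, 1 channel, 128x128).
    model = SmallParticleCNN()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    images = torch.randn(8, 1, 128, 128)
    labels = torch.randint(0, 5, (8,))
    loss = loss_fn(model(images), labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"toy training loss: {loss.item():.3f}")
    ```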

  13. Classical least squares multivariate spectral analysis

    DOEpatents

    Haaland, David M.

    2002-01-01

    An improved classical least squares (CLS) multivariate spectral analysis method is described that adds, in the prediction phase, spectral shapes describing non-calibrated components and system effects (other than baseline corrections) present in the analyzed mixture. These improvements decrease or eliminate many of the restrictions of CLS-type methods and greatly extend their capabilities, accuracy, and precision. One new application of prediction-augmented CLS (PACLS) includes the ability to accurately predict unknown sample concentrations when new unmodeled spectral components are present in the unknown samples. Other applications of PACLS include the incorporation of spectrometer drift into the quantitative multivariate model and the maintenance of a calibration on a drifting spectrometer. Finally, the ability of PACLS to transfer a multivariate model between spectrometers is demonstrated.
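
    A small numpy illustration of the augmentation idea, adding an extra spectral shape (here, a drift/baseline term) to the prediction-phase least-squares fit, is sketched below; the synthetic spectra and shapes are assumptions and this is not the patented PACLS algorithm itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_channels = 200
    wavenumber = np.linspace(0, 1, n_channels)

    # Pure-component spectra (the calibrated CLS model) plus an unmodeled drift shape.
    pure = np.vstack([
        np.exp(-((wavenumber - 0.3) / 0.05) ** 2),   # component A
        np.exp(-((wavenumber - 0.7) / 0.05) ** 2),   # component B
    ])
    drift = wavenumber                                # e.g., a sloping baseline

    true_conc = np.array([0.8, 1.5])
    measured = true_conc @ pure + 0.4 * drift + 0.01 * rng.standard_normal(n_channels)

    # Classical least squares prediction: spectrum ~ concentrations @ pure spectra
    cls_fit, *_ = np.linalg.lstsq(pure.T, measured, rcond=None)

    # Augmented prediction: add the drift shape as an extra "component"
    augmented = np.vstack([pure, drift])
    pacls_fit, *_ = np.linalg.lstsq(augmented.T, measured, rcond=None)

    print("plain CLS estimate:", cls_fit)       # biased by the unmodeled drift
    print("augmented estimate:", pacls_fit[:2]) # closer to the true concentrations
    ```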

  14. Decoupling in linear time-varying multivariable systems

    NASA Technical Reports Server (NTRS)

    Sankaran, V.

    1973-01-01

    The necessary and sufficient conditions for the decoupling of an m-input, m-output, linear time varying dynamical system by state variable feedback is described. The class of feedback matrices which decouple the system are illustrated. Systems which do not satisfy these results are described and systems with disturbances are considered. Some examples are illustrated to clarify the results.

  15. Safety and effectiveness of olanzapine in monotherapy: a multivariate analysis of a naturalistic study.

    PubMed

    Ciudad, Antonio; Gutiérrez, Miguel; Cañas, Fernando; Gibert, Juan; Gascón, Josep; Carrasco, José-Luis; Bobes, Julio; Gómez, Juan-Carlos; Alvarez, Enrique

    2005-07-01

    This study investigated safety and effectiveness of olanzapine in monotherapy compared with conventional antipsychotics in treatment of acute inpatients with schizophrenia. This was a prospective, comparative, nonrandomized, open-label, multisite, observational study of Spanish inpatients with an acute episode of schizophrenia. Data included safety assessments with an extrapyramidal symptoms (EPS) questionnaire and the report of spontaneous adverse events, plus clinical assessments with the Brief Psychiatric Rating Scale (BPRS) and the Clinical Global Impressions-Severity of Illness (CGI-S). A multivariate methodology was used to more adequately determine which factors can influence safety and effectiveness of olanzapine in monotherapy. 339 patients treated with olanzapine in monotherapy (OGm) and 385 patients treated with conventional antipsychotics (CG) were included in the analysis. Treatment-emergent EPS were significantly higher in the CG (p<0.0001). Response rate was significantly higher in the OGm (p=0.005). Logistic regression analyses revealed that the only variable significantly correlated with treatment-emergent EPS and clinical response was treatment strategy, with patients in OGm having 1.5 times the probability of obtaining a clinical response and patients in CG having 5 times the risk of developing EPS. In this naturalistic study olanzapine in monotherapy was better-tolerated and at least as effective as conventional antipsychotics.

  16. Impact of real-time traffic characteristics on crash occurrence: Preliminary results of the case of rare events.

    PubMed

    Theofilatos, Athanasios; Yannis, George; Kopelias, Pantelis; Papadimitriou, Fanis

    2018-01-04

    Considerable efforts have been made by researchers and policy makers in order to explain road crash occurrence and improve the road safety performance of highways. However, there are cases when crashes are so few that they could be considered as rare events. In such cases, the binary dependent variable is characterized by dozens to thousands of times fewer events (crashes) than non-events (non-crashes). This paper attempts to add to the current knowledge by investigating crash likelihood by utilizing real-time traffic data and by proposing a framework driven by appropriate statistical models (Bias Correction and Firth method) in order to overcome the problems that arise when the number of crashes is very low. Under this approach, instead of using traditional logistic regression methods, crashes are considered as rare events. In order to demonstrate this approach, traffic data were collected from three random loop detectors in the Attica Tollway ("Attiki Odos") located in the Greater Athens Area in Greece for the 2008-2011 period. The traffic dataset consists of hourly aggregated traffic data such as flow, occupancy, mean time speed and percentage of trucks in traffic. This study demonstrates the application and findings of our approach and reveals a negative relationship between crash occurrence and speed in crash locations. The method and findings of the study attempt to provide insights on the mechanism of crash occurrence and also to overcome data considerations for the first time in safety evaluation of motorways. Copyright © 2017 Elsevier Ltd. All rights reserved.
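
    A minimal sketch of Firth-type (Jeffreys-prior) penalized logistic regression, one standard remedy for the rare-event bias discussed above, is given below as a self-contained Newton iteration; the toy traffic variables and effect sizes are assumptions, not the Attica Tollway data or the paper's exact specification.

    ```python
    import numpy as np

    def firth_logistic(X, y, n_iter=100, tol=1e-8):
        """Logistic regression with Firth's Jeffreys-prior penalty (bias reduction
        for rare events). X is an (n, p) design matrix with an intercept column,
        y an (n,) 0/1 outcome. Returns the penalized coefficient estimates."""
        beta = np.zeros(X.shape[1])
        for _ in range(n_iter):
            mu = 1.0 / (1.0 + np.exp(-(X @ beta)))        # fitted event probabilities
            W = mu * (1.0 - mu)                            # logistic weights
            info = X.T @ (X * W[:, None])                  # Fisher information X'WX
            info_inv = np.linalg.inv(info)
            Xw = X * np.sqrt(W)[:, None]
            h = np.einsum("ij,jk,ik->i", Xw, info_inv, Xw) # leverages (hat-matrix diagonal)
            score = X.T @ (y - mu + h * (0.5 - mu))        # Firth-modified score
            step = info_inv @ score
            beta = beta + step
            if np.max(np.abs(step)) < tol:
                break
        return beta

    # Toy rare-event data loosely mimicking hourly traffic records (~1% crash hours).
    rng = np.random.default_rng(0)
    n = 20000
    speed = rng.normal(90, 15, n)        # hypothetical mean time speed (km/h)
    occupancy = rng.normal(12, 4, n)     # hypothetical loop-detector occupancy (%)
    eta = -4.8 - 0.04 * (speed - 90) + 0.06 * (occupancy - 12)
    crash = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))
    X = np.column_stack([np.ones(n), speed - 90, occupancy - 12])
    print(firth_logistic(X, crash))      # intercept and slopes on centered predictors
    ```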

  17. A power analysis for multivariate tests of temporal trend in species composition.

    PubMed

    Irvine, Kathryn M; Dinger, Eric C; Sarr, Daniel

    2011-10-01

    Long-term monitoring programs emphasize power analysis as a tool to determine the sampling effort necessary to effectively document ecologically significant changes in ecosystems. Programs that monitor entire multispecies assemblages require a method for determining the power of multivariate statistical models to detect trend. We provide a method to simulate presence-absence species assemblage data that are consistent with increasing or decreasing directional change in species composition within multiple sites. This step is the foundation for using Monte Carlo methods to approximate the power of any multivariate method for detecting temporal trends. We focus on comparing the power of the Mantel test, permutational multivariate analysis of variance, and constrained analysis of principal coordinates. We find that the power of the various methods we investigate is sensitive to the number of species in the community, univariate species patterns, and the number of sites sampled over time. For increasing directional change scenarios, constrained analysis of principal coordinates was as or more powerful than permutational multivariate analysis of variance, while the Mantel test was the least powerful. However, in our investigation of decreasing directional change, the Mantel test was typically as or more powerful than the other models.

  18. A novel adaptive, real-time algorithm to detect gait events from wearable sensors.

    PubMed

    Chia Bejarano, Noelia; Ambrosini, Emilia; Pedrocchi, Alessandra; Ferrigno, Giancarlo; Monticone, Marco; Ferrante, Simona

    2015-05-01

    A real-time, adaptive algorithm based on two inertial and magnetic sensors placed on the shanks was developed for gait-event detection. For each leg, the algorithm detected the Initial Contact (IC), as the minimum of the flexion/extension angle, and the End Contact (EC) and the Mid-Swing (MS), as minimum and maximum of the angular velocity, respectively. The algorithm consisted of calibration, real-time detection, and step-by-step update. Data collected from 22 healthy subjects (21 to 85 years) walking at three self-selected speeds were used to validate the algorithm against the GaitRite system. Comparable levels of accuracy and significantly lower detection delays were achieved with respect to other published methods. The algorithm robustness was tested on ten healthy subjects performing sudden speed changes and on ten stroke subjects (43 to 89 years). For healthy subjects, F1-scores of 1 and mean detection delays lower than 14 ms were obtained. For stroke subjects, F1-scores of 0.998 and 0.944 were obtained for IC and EC, respectively, with mean detection delays always below 31 ms. The algorithm accurately detected gait events in real time from a heterogeneous dataset of gait patterns and paves the way for the design of closed-loop controllers for customized gait trainings and/or assistive devices.
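
    As a rough illustration of extremum-based gait-event picking (mid-swing at angular-velocity maxima and end contact at minima, per the description above), the sketch below runs scipy's find_peaks on a synthetic shank gyroscope signal; the signal, peak thresholds, and sample rate are assumptions, and the published algorithm is adaptive and runs in real time.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    fs = 100.0                                  # assumed sample rate (Hz)
    t = np.arange(0, 10, 1 / fs)
    # Toy shank angular velocity: one dominant cycle per stride (~1 Hz) plus noise.
    gyro = np.sin(2 * np.pi * 1.0 * t) + 0.1 * np.random.default_rng(0).standard_normal(t.size)

    # Mid-swing ~ maxima of angular velocity; end contact ~ minima (per the abstract).
    ms_idx, _ = find_peaks(gyro, height=0.5, distance=int(0.5 * fs))
    ec_idx, _ = find_peaks(-gyro, height=0.5, distance=int(0.5 * fs))

    print("mid-swing times (s):", np.round(t[ms_idx], 2))
    print("end-contact times (s):", np.round(t[ec_idx], 2))
    ```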

  19. Complex numbers in chemometrics: examples from multivariate impedance measurements on lipid monolayers.

    PubMed

    Geladi, Paul; Nelson, Andrew; Lindholm-Sethson, Britta

    2007-07-09

    Electrical impedance gives multivariate complex number data as results. Two examples of multivariate electrical impedance data measured on lipid monolayers in different solutions give rise to matrices (16x50 and 38x50) of complex numbers. Multivariate data analysis by principal component analysis (PCA) or singular value decomposition (SVD) can be used for complex data and the necessary equations are given. The scores and loadings obtained are vectors of complex numbers. It is shown that the complex number PCA and SVD are better at concentrating information in a few components than the naïve juxtaposition method and that Argand diagrams can replace score and loading plots. Different concentrations of Magainin and Gramicidin A give different responses and also the role of the electrolyte medium can be studied. An interaction of Gramicidin A in the solution with the monolayer over time can be observed.
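
    A minimal numpy sketch of principal component analysis of complex-valued data via the singular value decomposition, of the kind described above; the toy 16 x 50 impedance-like matrix is an assumption.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Toy impedance-style data: 16 samples x 50 frequencies, complex-valued.
    freq_profile = np.exp(1j * np.linspace(0, np.pi, 50))
    true_scores = rng.standard_normal(16) + 1j * rng.standard_normal(16)
    Z = np.outer(true_scores, freq_profile) + 0.05 * (
        rng.standard_normal((16, 50)) + 1j * rng.standard_normal((16, 50))
    )

    # The SVD works directly on complex matrices; no juxtaposition of real and
    # imaginary parts is needed.
    Zc = Z - Z.mean(axis=0)                    # column-center before "PCA"
    U, s, Vh = np.linalg.svd(Zc, full_matrices=False)
    scores = U * s                             # complex score vectors (Argand plot these)
    loadings = Vh                              # rows are complex loading vectors

    explained = s**2 / np.sum(s**2)
    print("variance captured by first component:", round(float(explained[0]), 3))
    ```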

  20. DETECT: a MATLAB toolbox for event detection and identification in time series, with applications to artifact detection in EEG signals.

    PubMed

    Lawhern, Vernon; Hairston, W David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.

  1. DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals

    PubMed Central

    Lawhern, Vernon; Hairston, W. David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration. PMID:23638169

  2. Time-to-Event Analysis of Individual Variables Associated with Nursing Students' Academic Failure: A Longitudinal Study

    ERIC Educational Resources Information Center

    Dante, Angelo; Fabris, Stefano; Palese, Alvisa

    2013-01-01

    Empirical studies and conceptual frameworks presented in the extant literature offer a static imagining of academic failure. Time-to-event analysis, which captures the dynamism of individual factors, as when they determine the failure to properly tailor timely strategies, impose longitudinal studies which are still lacking within the field. The…

  3. Placebo group improvement in trials of pharmacotherapies for alcohol use disorders: a multivariate meta-analysis examining change over time.

    PubMed

    Del Re, A C; Maisel, Natalya; Blodgett, Janet C; Wilbourne, Paula; Finney, John W

    2013-10-01

    Placebo group improvement in pharmacotherapy trials has been increasing over time across several pharmacological treatment areas. However, it is unknown to what degree increasing improvement has occurred in pharmacotherapy trials for alcohol use disorders or what factors may account for placebo group improvement. This meta-analysis of 47 alcohol pharmacotherapy trials evaluated (1) the magnitude of placebo group improvement, (2) the extent to which placebo group improvement has been increasing over time, and (3) several potential moderators that might account for variation in placebo group improvement. Random-effects univariate and multivariate analyses were conducted that examined the magnitude of placebo group improvement in the 47 studies and several potential moderators of improvement: (a) publication year, (b) country in which the study was conducted, (c) outcome data source/type, (d) number of placebo administrations, (e) overall severity of study participants, and (f) additional psychosocial treatment. Substantial placebo group improvement was found overall and improvement was larger in more recent studies. Greater improvement was found on moderately subjective outcomes, with more frequent administrations of the placebo, and in studies with greater participant severity of illness. However, even after controlling for these moderators, placebo group improvement remained significant, as did placebo group improvement over time. Similar to previous pharmacotherapy placebo research, substantial pretest to posttest placebo group improvement has occurred in alcohol pharmacotherapy trials, an effect that has been increasing over time. However, several plausible moderator variables were not able to explain why placebo group improvement has been increasing over time.

  4. Operationalizing Proneness to Externalizing Psychopathology as a Multivariate Psychophysiological Phenotype

    PubMed Central

    Nelson, Lindsay D.; Patrick, Christopher J.; Bernat, Edward M.

    2010-01-01

    The externalizing dimension is viewed as a broad dispositional factor underlying risk for numerous disinhibitory disorders. Prior work has documented deficits in event-related brain potential (ERP) responses in individuals prone to externalizing problems. Here, we constructed a direct physiological index of externalizing vulnerability from three ERP indicators and evaluated its validity in relation to criterion measures in two distinct domains: psychometric and physiological. The index was derived from three ERP measures that covaried in their relations with externalizing proneness: the error-related negativity and two variants of the P3. Scores on this ERP composite predicted psychometric criterion variables and accounted for externalizing-related variance in P3 response from a separate task. These findings illustrate how a diagnostic construct can be operationalized as a composite (multivariate) psychophysiological variable (phenotype). PMID:20573054

  5. Multivariate Analysis and Machine Learning in Cerebral Palsy Research.

    PubMed

    Zhang, Jing

    2017-01-01

    Cerebral palsy (CP), a common pediatric movement disorder, causes the most severe physical disability in children. Early diagnosis in high-risk infants is critical for early intervention and possible early recovery. In recent years, multivariate analytic and machine learning (ML) approaches have been increasingly used in CP research. This paper aims to identify such multivariate studies and provide an overview of this relatively young field. Studies reviewed in this paper have demonstrated that multivariate analytic methods are useful in identification of risk factors, detection of CP, movement assessment for CP prediction, and outcome assessment, and ML approaches have made it possible to automatically identify movement impairments in high-risk infants. In addition, outcome predictors for surgical treatments have been identified by multivariate outcome studies. To make the multivariate and ML approaches useful in clinical settings, further research with large samples is needed to verify and improve these multivariate methods in risk factor identification, CP detection, movement assessment, and outcome evaluation or prediction. As multivariate analysis, ML and data processing technologies advance in the era of Big Data of this century, it is expected that multivariate analysis and ML will play a bigger role in improving the diagnosis and treatment of CP to reduce mortality and morbidity rates, and enhance patient care for children with CP.

  6. Complex analyses on clinical information systems using restricted natural language querying to resolve time-event dependencies.

    PubMed

    Safari, Leila; Patrick, Jon D

    2018-06-01

    This paper reports on a generic framework to provide clinicians with the ability to conduct complex analyses on elaborate research topics using cascaded queries to resolve internal time-event dependencies in the research questions, as an extension to the proposed Clinical Data Analytics Language (CliniDAL). A cascaded query model is proposed to resolve internal time-event dependencies in the queries which can have up to five levels of criteria starting with a query to define subjects to be admitted into a study, followed by a query to define the time span of the experiment. Three more cascaded queries can be required to define control groups, control variables and output variables which all together simulate a real scientific experiment. According to the complexity of the research questions, the cascaded query model has the flexibility of merging some lower level queries for simple research questions or adding a nested query to each level to compose more complex queries. Three different scenarios (one of them contains two studies) are described and used for evaluation of the proposed solution. CliniDAL's complex analyses solution enables answering complex queries with time-event dependencies at most in a few hours which manually would take many days. An evaluation of results of the research studies based on the comparison between CliniDAL and SQL solutions reveals high usability and efficiency of CliniDAL's solution. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. A neuromorphic network for generic multivariate data classification

    PubMed Central

    Schmuker, Michael; Pfeil, Thomas; Nawrot, Martin Paul

    2014-01-01

    Computational neuroscience has uncovered a number of computational principles used by nervous systems. At the same time, neuromorphic hardware has matured to a state where fast silicon implementations of complex neural networks have become feasible. En route to future technical applications of neuromorphic computing the current challenge lies in the identification and implementation of functional brain algorithms. Taking inspiration from the olfactory system of insects, we constructed a spiking neural network for the classification of multivariate data, a common problem in signal and data analysis. In this model, real-valued multivariate data are converted into spike trains using “virtual receptors” (VRs). Their output is processed by lateral inhibition and drives a winner-take-all circuit that supports supervised learning. VRs are conveniently implemented in software, whereas the lateral inhibition and classification stages run on accelerated neuromorphic hardware. When trained and tested on real-world datasets, we find that the classification performance is on par with a naïve Bayes classifier. An analysis of the network dynamics shows that stable decisions in output neuron populations are reached within less than 100 ms of biological time, matching the time-to-decision reported for the insect nervous system. Through leveraging a population code, the network tolerates the variability of neuronal transfer functions and trial-to-trial variation that is inevitably present on the hardware system. Our work provides a proof of principle for the successful implementation of a functional spiking neural network on a configurable neuromorphic hardware system that can readily be applied to real-world computing problems. PMID:24469794
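
    A rate-based toy sketch of the processing chain described above (virtual-receptor encoding, lateral inhibition, winner-take-all readout with supervised updates) is given below in numpy; it does not model spiking dynamics or neuromorphic hardware, and all sizes, tuning widths, and learning constants are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def virtual_receptors(x, centers, width=0.15):
        """Encode a real-valued feature vector as Gaussian 'virtual receptor' rates."""
        return np.exp(-((x[None, :] - centers) ** 2).sum(axis=1) / (2 * width**2))

    def lateral_inhibition(rates, strength=0.8):
        """Subtractive lateral inhibition: each unit is suppressed by the mean rate."""
        return np.clip(rates - strength * rates.mean(), 0.0, None)

    # Toy two-class, two-feature dataset and a grid of receptor centers.
    X = np.vstack([rng.normal([0.3, 0.3], 0.05, (100, 2)),
                   rng.normal([0.7, 0.7], 0.05, (100, 2))])
    y = np.repeat([0, 1], 100)
    centers = np.array([[a, b] for a in np.linspace(0, 1, 6) for b in np.linspace(0, 1, 6)])

    # Winner-take-all readout trained with a simple supervised delta rule.
    W = np.zeros((2, len(centers)))
    for _ in range(20):                                  # training epochs
        for xi, yi in zip(X, y):
            r = lateral_inhibition(virtual_receptors(xi, centers))
            winner = np.argmax(W @ r)
            W[yi] += 0.01 * r                            # reinforce the correct class
            if winner != yi:
                W[winner] -= 0.01 * r                    # punish the wrong winner

    pred = np.array([np.argmax(W @ lateral_inhibition(virtual_receptors(xi, centers)))
                     for xi in X])
    print("training accuracy:", (pred == y).mean())
    ```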

  8. Effect of first myocardial ischemic event on renal function.

    PubMed

    Eijkelkamp, Wouter B A; de Graeff, Pieter A; van Veldhuisen, Dirk J; van Dokkum, Richard P E; Gansevoort, Ronald T; de Jong, Paul E; de Zeeuw, Dick; Hillege, Hans L

    2007-07-01

    Effects of cardiovascular dysfunction on renal function have been poorly characterized. Therefore, we investigated the relation between a first ischemic cardiac event and long-term renal function changes in the general population from the PREVEND study. We studied 6,360 subjects with a total follow-up duration of 27,017 subject-years. The estimated mean proportional increase in serum creatinine after a first ischemic cardiac event was 3.1% compared with 0.4% per year of follow-up in subjects without such an event (p = 0.005). This corresponded to a significantly larger decrease in estimated glomerular filtration rate in subjects with a first ischemic cardiac event than in subjects without one (2.2 vs 0.5 ml/min/1.73 m(2)/year of follow-up, p = 0.006). In multivariate analysis with adjustment for renal risk factors, this event showed an independent association with serum creatinine change. In conclusion, a first ischemic cardiac event appears to enhance the natural decrease in renal function. Because even mild renal dysfunction should be considered a major cardiovascular risk factor after myocardial infarction, increased renal function loss after an ischemic cardiac event could add to the risk for subsequent cardiovascular morbidity, thus closing a vicious circle.

  9. Event-triggered fault detection for a class of discrete-time linear systems using interval observers.

    PubMed

    Zhang, Zhi-Hui; Yang, Guang-Hong

    2017-05-01

    This paper provides a novel event-triggered fault detection (FD) scheme for discrete-time linear systems. First, an event-triggered interval observer is proposed to generate the upper and lower residuals by taking into account the influence of the disturbances and the event error. Second, the robustness of the residual interval against the disturbances and the fault sensitivity are improved by introducing l1 and H∞ performances. Third, dilated linear matrix inequalities are used to decouple the Lyapunov matrices from the system matrices. The nonnegative conditions for the estimation error variables are presented with the aid of the slack matrix variables. This technique allows considering a more general Lyapunov function. Furthermore, the FD decision scheme is proposed by monitoring whether the zero value belongs to the residual interval. It is shown that the information communication burden is reduced by designing the event-triggering mechanism, while the FD performance can still be guaranteed. Finally, simulation results demonstrate the effectiveness of the proposed method. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  10. [Effect of Chinese drugs for activating blood circulation and removing blood stasis on carotid atherosclerosis and ischemic cerebrovascular events].

    PubMed

    Lu, Yan; Li, Tao

    2014-03-01

    To explore the effect of Chinese drugs for activating blood circulation and removing blood stasis (CDABCRBS) on carotid atherosclerotic plaque and long-term ischemic cerebrovascular events. Using an open, controlled design, the effects of 4 regimens (platelet antagonists; platelet antagonists + CDABCRBS; platelet antagonists + atorvastatin; platelet antagonists + atorvastatin + CDABCRBS) on carotid atherosclerotic plaque and long-term ischemic cerebrovascular events were analyzed in 90 cerebral infarction patients. Through survival analysis, there was no statistical difference in the effect of the 4 interventions on the variation of carotid stenosis rates or ischemic cerebrovascular events (P > 0.05). The occurrence of ischemic cerebrovascular events could be postponed by about 4 months in those treated with platelet antagonists + CDABCRBS and platelet antagonists + atorvastatin + CDABCRBS. By multivariate logistic analysis, age, hypertension, and clopidogrel were associated with stenosis of extracranial carotid arteries (P < 0.05). Age, diabetes, aspirin, clopidogrel, and CDABCRBS were correlated with cerebrovascular accidents (P < 0.05). The presence of hypertension is an influential factor for carotid stenosis, but it does not affect the occurrence of ischemic cerebrovascular events. CDABCRBS could effectively prolong the time to occurrence of ischemic cerebrovascular events.

  11. Convolutional neural networks applied to neutrino events in a liquid argon time projection chamber

    DOE PAGES

    Acciarri, R.; Adams, C.; An, R.; ...

    2017-03-14

    Here, we present several studies of convolutional neural networks applied to data coming from the MicroBooNE detector, a liquid argon time projection chamber (LArTPC). The algorithms studied include the classification of single particle images, the localization of single particle and neutrino interactions in an image, and the detection of a simulated neutrino event overlaid with cosmic ray backgrounds taken from real detector data. These studies demonstrate the potential of convolutional neural networks for particle identification or event detection on simulated neutrino interactions. Lastly, we also address technical issues that arise when applying this technique to data from a large LArTPC at or near ground level.

  12. On set-valued functionals: Multivariate risk measures and Aumann integrals

    NASA Astrophysics Data System (ADS)

    Ararat, Cagin

    In this dissertation, multivariate risk measures for random vectors and Aumann integrals of set-valued functions are studied. Both are set-valued functionals with values in a complete lattice of subsets of Rm. Multivariate risk measures are considered in a general d-asset financial market with trading opportunities in discrete time. Specifically, the following features of the market are incorporated in the evaluation of multivariate risk: convex transaction costs modeled by solvency regions, intermediate trading constraints modeled by convex random sets, and the requirement of liquidation into the first m ≤ d of the assets. It is assumed that the investor has a "pure" multivariate risk measure R on the space of m-dimensional random vectors which represents her risk attitude towards the assets but does not take into account the frictions of the market. Then, the investor with a d-dimensional position minimizes the set-valued functional R over all m-dimensional positions that she can reach by trading in the market subject to the frictions described above. The resulting functional Rmar on the space of d-dimensional random vectors is another multivariate risk measure, called the market-extension of R. A dual representation for R mar that decomposes the effects of R and the frictions of the market is proved. Next, multivariate risk measures are studied in a utility-based framework. It is assumed that the investor has a complete risk preference towards each individual asset, which can be represented by a von Neumann-Morgenstern utility function. Then, an incomplete preference is considered for multivariate positions which is represented by the vector of the individual utility functions. Under this structure, multivariate shortfall and divergence risk measures are defined as the optimal values of set minimization problems. The dual relationship between the two classes of multivariate risk measures is constructed via a recent Lagrange duality for set optimization. In

  13. Multivariate reference technique for quantitative analysis of fiber-optic tissue Raman spectroscopy.

    PubMed

    Bergholt, Mads Sylvest; Duraipandian, Shiyamala; Zheng, Wei; Huang, Zhiwei

    2013-12-03

    We report a novel method making use of multivariate reference signals of fused silica and sapphire Raman signals generated from a ball-lens fiber-optic Raman probe for quantitative analysis of in vivo tissue Raman measurements in real time. Partial least-squares (PLS) regression modeling is applied to extract the characteristic internal reference Raman signals (e.g., shoulder of the prominent fused silica boson peak (~130 cm(-1)); distinct sapphire ball-lens peaks (380, 417, 646, and 751 cm(-1))) from the ball-lens fiber-optic Raman probe for quantitative analysis of fiber-optic Raman spectroscopy. To evaluate the analytical value of this novel multivariate reference technique, a rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe is used for in vivo oral tissue Raman measurements (n = 25 subjects) under 785 nm laser excitation powers ranging from 5 to 65 mW. An accurate linear relationship (R(2) = 0.981) with a root-mean-square error of cross validation (RMSECV) of 2.5 mW can be obtained for predicting the laser excitation power changes based on a leave-one-subject-out cross-validation, which is superior to the normal univariate reference method (RMSE = 6.2 mW). A root-mean-square error of prediction (RMSEP) of 2.4 mW (R(2) = 0.985) can also be achieved for laser power prediction in real time when we applied the multivariate method independently on the five new subjects (n = 166 spectra). We further apply the multivariate reference technique for quantitative analysis of gelatin tissue phantoms that gives rise to an RMSEP of ~2.0% (R(2) = 0.998) independent of laser excitation power variations. This work demonstrates that multivariate reference technique can be advantageously used to monitor and correct the variations of laser excitation power and fiber coupling efficiency in situ for standardizing the tissue Raman intensity to realize quantitative analysis of tissue Raman measurements in vivo, which is particularly appealing in
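
    A minimal sketch of PLS-based prediction of laser excitation power from reference-like spectral features, loosely mirroring the calibration/prediction split described above; the synthetic peaks, noise level, and component count are assumptions.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    n_spectra, n_channels = 120, 300
    laser_power = rng.uniform(5, 65, n_spectra)            # mW, the quantity to predict

    # Toy "reference" signal: two fixed peaks whose intensity scales with laser power.
    channels = np.arange(n_channels)
    peaks = np.exp(-((channels - 80) / 6) ** 2) + np.exp(-((channels - 220) / 6) ** 2)
    spectra = laser_power[:, None] * peaks + rng.normal(0, 0.5, (n_spectra, n_channels))

    pls = PLSRegression(n_components=3)
    pls.fit(spectra[:100], laser_power[:100])              # calibrate
    pred = pls.predict(spectra[100:]).ravel()              # predict held-out powers
    rmsep = np.sqrt(np.mean((pred - laser_power[100:]) ** 2))
    print(f"RMSEP: {rmsep:.2f} mW")
    ```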

  14. Multivariate Analysis and Machine Learning in Cerebral Palsy Research

    PubMed Central

    Zhang, Jing

    2017-01-01

    Cerebral palsy (CP), a common pediatric movement disorder, causes the most severe physical disability in children. Early diagnosis in high-risk infants is critical for early intervention and possible early recovery. In recent years, multivariate analytic and machine learning (ML) approaches have been increasingly used in CP research. This paper aims to identify such multivariate studies and provide an overview of this relatively young field. Studies reviewed in this paper have demonstrated that multivariate analytic methods are useful in identification of risk factors, detection of CP, movement assessment for CP prediction, and outcome assessment, and ML approaches have made it possible to automatically identify movement impairments in high-risk infants. In addition, outcome predictors for surgical treatments have been identified by multivariate outcome studies. To make the multivariate and ML approaches useful in clinical settings, further research with large samples is needed to verify and improve these multivariate methods in risk factor identification, CP detection, movement assessment, and outcome evaluation or prediction. As multivariate analysis, ML and data processing technologies advance in the era of Big Data of this century, it is expected that multivariate analysis and ML will play a bigger role in improving the diagnosis and treatment of CP to reduce mortality and morbidity rates, and enhance patient care for children with CP. PMID:29312134

  15. Outcome and risk factors assessment for adverse events in advanced esophageal cancer patients after self-expanding metal stents placement.

    PubMed

    Rodrigues-Pinto, E; Pereira, P; Coelho, R; Andrade, P; Ribeiro, A; Lopes, S; Moutinho-Ribeiro, P; Macedo, G

    2017-02-01

    Self-expanding metal stents (SEMS) are the treatment of choice for advanced esophageal cancers. Literature is scarce on risk factors predictors for adverse events after SEMS placement. Assess risk factors for adverse events after SEMS placement in advanced esophageal cancer and evaluate survival after SEMS placement. Cross-sectional study of patients with advanced esophageal cancer referred for SEMS placement, during a period of 3 years. Ninety-seven patients with advanced esophageal cancer placed SEMS. Adverse events were more common when tumors were located at the level of the distal esophagus/cardia (47% vs 23%, P = 0.011, OR 3.1), with statistical significance being kept in the multivariate analysis (OR 3.1, P = 0.018). Time until adverse events was lower in the tumors located at the level of the distal esophagus/cardia (P = 0.036). Survival was higher in patients who placed SEMS with curative intent (327 days [126-528] vs. 119 days [91-147], P = 0.002) and in patients submitted subsequently to surgery compared with those who did just chemo/radiotherapy or who did not do further treatment (563 days [378-748] vs. 154 days [133-175] vs. 46 days [20-72], P < 0.001). Subsequent treatment kept statistical significance in the multivariate analysis (HR 3.4, P < 0.001). SEMS allow palliation of dysphagia in advanced esophageal cancer and are associated with an increased out-of-hospital survival, as long as there are conditions for further treatments. Tumors located at the level of the distal esophagus/cardia are associated with a greater number of adverse events, which also occur earlier. © 2016 International Society for Diseases of the Esophagus.

  16. Serial Spike Time Correlations Affect Probability Distribution of Joint Spike Events.

    PubMed

    Shahi, Mina; van Vreeswijk, Carl; Pipa, Gordon

    2016-01-01

    Detecting the existence of temporally coordinated spiking activity, and its role in information processing in the cortex, has remained a major challenge for neuroscience research. Different methods and approaches have been suggested to test whether the observed synchronized events are significantly different from those expected by chance. To analyze the simultaneous spike trains for precise spike correlation, these methods typically model the spike trains as a Poisson process implying that the generation of each spike is independent of all the other spikes. However, studies have shown that neural spike trains exhibit dependence among spike sequences, such as the absolute and relative refractory periods which govern the spike probability of the oncoming action potential based on the time of the last spike, or the bursting behavior, which is characterized by short epochs of rapid action potentials, followed by longer episodes of silence. Here we investigate non-renewal processes with the inter-spike interval distribution model that incorporates spike-history dependence of individual neurons. For that, we use the Monte Carlo method to estimate the full shape of the coincidence count distribution and to generate false positives for coincidence detection. The results show that compared to the distributions based on homogeneous Poisson processes, and also non-Poisson processes, the width of the distribution of joint spike events changes. Non-renewal processes can lead to either heavy-tailed or narrow coincidence distributions. We conclude that small differences in the exact autostructure of the point process can cause large differences in the width of a coincidence distribution. Therefore, manipulations of the autostructure for the estimation of significance of joint spike events seem to be inadequate.

  17. Serial Spike Time Correlations Affect Probability Distribution of Joint Spike Events

    PubMed Central

    Shahi, Mina; van Vreeswijk, Carl; Pipa, Gordon

    2016-01-01

    Detecting the existence of temporally coordinated spiking activity, and its role in information processing in the cortex, has remained a major challenge for neuroscience research. Different methods and approaches have been suggested to test whether the observed synchronized events are significantly different from those expected by chance. To analyze the simultaneous spike trains for precise spike correlation, these methods typically model the spike trains as a Poisson process implying that the generation of each spike is independent of all the other spikes. However, studies have shown that neural spike trains exhibit dependence among spike sequences, such as the absolute and relative refractory periods which govern the spike probability of the oncoming action potential based on the time of the last spike, or the bursting behavior, which is characterized by short epochs of rapid action potentials, followed by longer episodes of silence. Here we investigate non-renewal processes with the inter-spike interval distribution model that incorporates spike-history dependence of individual neurons. For that, we use the Monte Carlo method to estimate the full shape of the coincidence count distribution and to generate false positives for coincidence detection. The results show that compared to the distributions based on homogeneous Poisson processes, and also non-Poisson processes, the width of the distribution of joint spike events changes. Non-renewal processes can lead to either heavy-tailed or narrow coincidence distributions. We conclude that small differences in the exact autostructure of the point process can cause large differences in the width of a coincidence distribution. Therefore, manipulations of the autostructure for the estimation of significance of joint spike events seem to be inadequate. PMID:28066225
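
    A minimal Monte Carlo sketch of how the width of the coincidence-count distribution depends on the renewal structure of the trains (gamma inter-spike intervals; shape 1 reproduces a Poisson process) is given below; the firing rate, coincidence window, trial count, and gamma shapes are assumptions, not the paper's spike-history model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def gamma_spike_train(rate_hz, duration_s, shape):
        """Renewal spike train with gamma inter-spike intervals (shape=1 -> Poisson)."""
        n_max = int(2 * rate_hz * duration_s + 50)
        isis = rng.gamma(shape, 1.0 / (shape * rate_hz), n_max)   # mean ISI = 1/rate
        times = np.cumsum(isis)
        return times[times < duration_s]

    def coincidence_count(train_a, train_b, window_s=0.005):
        """Spikes in train_a that have a spike in train_b within +/- window_s."""
        idx = np.searchsorted(train_b, train_a)
        left = np.abs(train_a - train_b[np.clip(idx - 1, 0, len(train_b) - 1)])
        right = np.abs(train_a - train_b[np.clip(idx, 0, len(train_b) - 1)])
        return int(np.sum(np.minimum(left, right) <= window_s))

    def coincidence_distribution(shape, n_trials=2000):
        return np.array([
            coincidence_count(gamma_spike_train(20, 1.0, shape),
                              gamma_spike_train(20, 1.0, shape))
            for _ in range(n_trials)
        ])

    poisson_like = coincidence_distribution(shape=1.0)   # independent Poisson trains
    regular = coincidence_distribution(shape=8.0)        # more regular (non-Poisson) trains
    print("width (std) Poisson:", poisson_like.std(), " regular:", regular.std())
    ```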

  18. The 2009-2010 Guerrero Slow Slip Event Monitored by InSAR, Using Time Series Approach

    NASA Astrophysics Data System (ADS)

    Bacques, G.; Pathier, E.; Lasserre, C.; Cotton, F.; Radiguet, M.; Cycle Sismique et Déformations Transitoires

    2011-12-01

    A time-series approach is useful for monitoring the evolution of ground deformation during slow slip events and makes mapping of the slip propagation on the subduction plane a promising goal. Here we present our first results concerning the 2009-2010 slow slip events, in particular the distribution of the cumulative surface displacement in LOS (satellite Line Of Sight), the associated slip distribution on the fault plane, and the evolution of the ground deformation. Finally, we open the discussion with a first comparison between the 2009-2010 and the 2006 events, which reveals some differences in the amplitude and distribution of the ground deformation.

  19. It Depends on When You Ask: Motives for Using Marijuana Assessed Before versus After a Marijuana Use Event

    PubMed Central

    Shrier, Lydia A.; Scherer, Emily Blood

    2014-01-01

    Marijuana use motives are typically evaluated retrospectively using measures that summarize or generalize across episodes of use, which may compromise validity. Using Ecological Momentary Assessment data, we examined the main reason for a specific marijuana use event measured both prospectively and retrospectively. We then determined reason types, event characteristics, and user characteristics that predicted change in reason. Thirty-six medical outpatients age 15 to 24 years who used marijuana two times a week or more used a handheld computer to select their main reason for use from the five categories of the Marijuana Motives Measure (Simons, Correia, & Carey, 1998) just before and after each time they used marijuana over two weeks (n = 263 events with before/after reason). Reasons were examined individually and according to dimensions identified in motivational models of substance use (positive/negative, internal/external). Reason assessed before use changed to a different reason after use for 20% of events: 10% of events for pleasure; 21%, to cope; 35%, to be more social; 55%, to expand my mind; and 100%, to conform. In the multivariable model, external and expansion reasons each predicted change in reason for use (p < 0.0001 and p = 0.001, respectively). Youth were also more likely to change their reason if older (p = 0.04), if male (p = 0.02), and with weekend use (p = 0.002). Retrospective assessments of event-specific motives for marijuana use may be unreliable and therefore invalid for a substantial minority of events, particularly if use is for external or expansion reasons. PMID:25123342

  20. Multivariate statistical analysis strategy for multiple misfire detection in internal combustion engines

    NASA Astrophysics Data System (ADS)

    Hu, Chongqing; Li, Aihua; Zhao, Xingyang

    2011-02-01

    This paper proposes a multivariate statistical analysis approach to processing the instantaneous engine speed signal for the purpose of locating multiple misfire events in internal combustion engines. The state of each cylinder is described with a characteristic vector extracted from the instantaneous engine speed signal following a three-step procedure. These characteristic vectors are considered as the values of various procedure parameters of an engine cycle. Therefore, determination of the occurrence of misfire events and identification of misfiring cylinders can be accomplished by a principal component analysis (PCA) based pattern recognition methodology. The proposed algorithm can be implemented easily in practice because the threshold can be defined adaptively without information about operating conditions. In addition, the effect of torsional vibration on the engine speed waveform is interpreted as the presence of a super powerful cylinder, which is also isolated by the algorithm. The misfiring cylinder and the super powerful cylinder are often adjacent in the firing sequence, so missed detections and false alarms can be avoided effectively by checking the relationship between the cylinders.
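
    A rough sketch of the residual-based idea, modelling "normal" cycles with principal components and flagging cycles whose reconstruction error exceeds an adaptively chosen threshold, is shown below; the per-cylinder feature vectors, threshold rule, and data are assumptions, not the paper's three-step procedure.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)

    # Hypothetical characteristic vectors: one 6-dimensional vector per cylinder
    # per cycle (e.g., features of the instantaneous-speed waveform segment).
    normal = rng.normal(0.0, 0.1, (300, 6))          # normal combustion cycles
    misfire = rng.normal(-1.0, 0.1, (5, 6))          # misfire -> speed-drop pattern
    features = np.vstack([normal, misfire])

    pca = PCA(n_components=2).fit(normal)            # model of "normal" behaviour
    recon = pca.inverse_transform(pca.transform(features))
    residual = np.linalg.norm(features - recon, axis=1)

    # Adaptive threshold from the residuals themselves (no operating-condition info).
    mad = np.median(np.abs(residual - np.median(residual)))
    threshold = np.median(residual) + 5 * mad
    print("flagged cylinder/cycle indices:", np.where(residual > threshold)[0])
    ```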

  1. Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time-to-Event Analysis.

    PubMed

    Gong, Xiajing; Hu, Meng; Zhao, Liang

    2018-05-01

    Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featured by a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high-dimensional data. The prediction performances of ML-based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. © 2018 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  2. Analyzing Multiple Outcomes in Clinical Research Using Multivariate Multilevel Models

    PubMed Central

    Baldwin, Scott A.; Imel, Zac E.; Braithwaite, Scott R.; Atkins, David C.

    2014-01-01

    Objective Multilevel models have become a standard data analysis approach in intervention research. Although the vast majority of intervention studies involve multiple outcome measures, few studies use multivariate analysis methods. The authors discuss multivariate extensions to the multilevel model that can be used by psychotherapy researchers. Method and Results Using simulated longitudinal treatment data, the authors show how multivariate models extend common univariate growth models and how the multivariate model can be used to examine multivariate hypotheses involving fixed effects (e.g., does the size of the treatment effect differ across outcomes?) and random effects (e.g., is change in one outcome related to change in the other?). An online supplemental appendix provides annotated computer code and simulated example data for implementing a multivariate model. Conclusions Multivariate multilevel models are flexible, powerful models that can enhance clinical research. PMID:24491071
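
    One common way to fit a multivariate growth model of the kind described here with standard mixed-model software is to stack the outcomes in long format and interact the fixed effects with an outcome indicator. The sketch below does this with statsmodels' MixedLM on simulated two-outcome longitudinal data; it is only a rough analogue of the article's models (a single shared random intercept is used for simplicity, and all names and values are invented).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
subjects, times = 60, 5
rows = []
for s in range(subjects):
    u = rng.normal(0, 0.5, size=2)            # subject-specific intercept shift per outcome
    for t in range(times):
        rows.append({"subject": s, "time": t, "outcome": "dep",
                     "y": 2.0 - 0.3 * t + u[0] + rng.normal(0, 0.4)})
        rows.append({"subject": s, "time": t, "outcome": "anx",
                     "y": 1.5 - 0.1 * t + u[1] + rng.normal(0, 0.4)})
df = pd.DataFrame(rows)

# Fixed effects differ by outcome via the interaction; one random intercept per subject.
model = smf.mixedlm("y ~ time * outcome", df, groups=df["subject"])
print(model.fit().summary())
```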

  3. Event centrality prospectively predicts PTSD symptoms.

    PubMed

    Boals, Adriel; Ruggero, Camilo

    2016-09-01

    Recent evidence suggests that event centrality has a prominent association with post-traumatic stress disorder (PTSD) symptoms. However, evidence for this notion thus far has been mostly correlational. We report two studies that prospectively examined the relationship between event centrality and PTSD symptoms. Study 1 METHODS: Participants (N = 1438) reported their most stressful event ("prior event"), along with event centrality, PTSD symptoms, and neuroticism. At Time 2 participants reported their most stressful event since Time 1 ("critical event"), along with measures of event centrality and PTSD symptoms. Study 1 RESULTS: Event centrality for the critical event predicted PTSD symptoms, after controlling for event centrality and PTSD symptoms of the prior event and neuroticism. Study 2 METHODS: In the second study (N = 161) we examined changes in event centrality and PTSD symptoms over a month. Study 2 RESULTS: Using a cross-lagged panel design, results revealed that event centrality at Time 1 significantly predicted PTSD symptoms at Time 2, but the reverse was not significant. In two studies, a prospective association between event centrality and PTSD symptoms, but not the reverse, emerged. This evidence implicates event centrality in the pathogenesis and/or maintenance of PTSD symptoms.

  4. Incidence of cardiovascular events and associated risk factors in kidney transplant patients: a competing risks survival analysis.

    PubMed

    Seoane-Pillado, María Teresa; Pita-Fernández, Salvador; Valdés-Cañedo, Francisco; Seijo-Bestilleiro, Rocio; Pértega-Díaz, Sonia; Fernández-Rivera, Constantino; Alonso-Hernández, Ángel; González-Martín, Cristina; Balboa-Barreiro, Vanesa

    2017-03-07

    The high prevalence of cardiovascular risk factors among the renal transplant population accounts for increased mortality. The aim of this study is to determine the incidence of cardiovascular events and factors associated with cardiovascular events in these patients. An observational ambispective follow-up study of renal transplant recipients (n = 2029) in the health district of A Coruña (Spain) during the period 1981-2011 was completed. Competing risk survival analysis methods were applied to estimate the cumulative incidence of developing cardiovascular events over time and to identify which characteristics were associated with the risk of these events. Post-transplant cardiovascular events are defined as the presence of myocardial infarction, invasive coronary artery therapy, cerebral vascular events, new-onset angina, congestive heart failure, rhythm disturbances, peripheral vascular disease and cardiovascular disease and death. The cause of death was identified through the medical history and death certificate using ICD9 (390-459, except: 427.5, 435, 446, 459.0). The mean age of patients at the time of transplantation was 47.0 ± 14.2 years; 62% were male. 16.5% had suffered some cardiovascular disease prior to transplantation and 9.7% had suffered a cardiovascular event. The mean follow-up period for the patients with cardiovascular event was 3.5 ± 4.3 years. Applying competing risk methodology, it was observed that the accumulated incidence of the event was 5.0% one year after transplantation, 8.1% after five years, and 11.9% after ten years. After applying multivariate models, the variables with an independent effect for predicting cardiovascular events are: male sex, age of recipient, previous cardiovascular disorders, pre-transplant smoking and post-transplant diabetes. This study makes it possible to determine in kidney transplant patients, taking into account competitive events, the incidence of post-transplant cardiovascular events and
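
    The competing risks methodology referenced here estimates cumulative incidence while treating deaths from other causes as competing events rather than censoring them. As a minimal illustration (not the authors' analysis), the sketch below computes the Aalen-Johansen cumulative incidence estimate for one cause with NumPy on simulated data; all rates and cause codes are hypothetical.

```python
import numpy as np

def cumulative_incidence(times, events, cause=1):
    """Aalen-Johansen estimate of the cumulative incidence of one cause.
    times: event or censoring times; events: 0 = censored, 1, 2, ... = competing causes."""
    times, events = np.asarray(times, float), np.asarray(events)
    order = np.argsort(times)
    times, events = times[order], events[order]
    at_risk, surv, cif = len(times), 1.0, 0.0
    grid, values = [0.0], [0.0]
    for t in np.unique(times):
        here = times == t
        d_any = np.sum(events[here] > 0)          # events of any cause at time t
        d_cause = np.sum(events[here] == cause)   # events of the cause of interest at time t
        cif += surv * d_cause / at_risk           # probability mass assigned to this cause at t
        surv *= 1.0 - d_any / at_risk             # all-cause survival just after t
        at_risk -= np.sum(here)                   # drop events and censorings at t
        grid.append(t)
        values.append(cif)
    return np.array(grid), np.array(values)

# Hypothetical data: cause 1 = cardiovascular event, cause 2 = death from other causes.
rng = np.random.default_rng(12)
t1, t2, c = rng.exponential(10, 300), rng.exponential(6, 300), rng.exponential(8, 300)
obs_time = np.minimum(np.minimum(t1, t2), c)
event = np.where(obs_time == c, 0, np.where(t1 <= t2, 1, 2))

grid, cif = cumulative_incidence(obs_time, event, cause=1)
for year in (1, 5, 10):
    print(f"cumulative incidence of cause 1 at t={year}: {cif[grid <= year][-1]:.3f}")
```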

  5. Multivariate Cryptography Based on Clipped Hopfield Neural Network.

    PubMed

    Wang, Jia; Cheng, Lee-Ming; Su, Tong

    2018-02-01

    Designing secure and efficient multivariate public key cryptosystems [multivariate cryptography (MVC)] to strengthen the security of RSA and ECC in conventional and quantum computational environments continues to be a challenging research topic. In this paper, we describe multivariate public key cryptosystems based on an extended Clipped Hopfield Neural Network (CHNN) and implement them using the MVC (CHNN-MVC) framework operated in space. The Diffie-Hellman key exchange algorithm is extended into the matrix field, which illustrates the feasibility of its new applications in both classic and postquantum cryptography. The efficiency and security of the proposed public key cryptosystem, CHNN-MVC, are evaluated by simulation, and the underlying problem is shown to be NP-hard. The proposed algorithm strengthens multivariate public key cryptosystems and makes hardware realization practical.

  6. Reporting of Cardiovascular Medical Device Adverse Events to Pharmaceuticals and Medical Devices Agency, Japan

    PubMed Central

    Handa, Nobuhiro; Ishii, Kensuke; Matsui, Yutaka; Ando, Yuki

    2015-01-01

    Background Marketing authorization holders (MAHs) are obligated to report adverse events (AEs) within 15 days (some cases 30 days) to the Pharmaceuticals and Medical Devices Agency (PMDA) of Japan. Methods To analyze the timeliness of AE reporting to the PMDA, 6610 reports for five categories of cardiovascular devices were retrieved. Two durations were calculated: (1) time from the date the AE occurred to that when the MAH captured it (DOC: days); and (2) time from the date of MAH capture to that of MAH report (DCR: days). Number of DOC > 15 days (DOC15) and delayed reports (DCR > 15 or 30 days) were also calculated. Results AEs included 9.2% deaths and 7.5% non-recoveries. DOC15 and delayed reports were 51.0% and 10.9%, respectively. By multivariate analysis, DOC15 was associated with foreign AE, device category, MAH, patient outcome, event category, and AE that had to be reported within 15 or 30 days (AE15/30). Delayed report was associated with device category, MAH, patient outcome, event category, and AE15/30. Comments Although Japanese MAHs complied with the obligation to report AEs, they often failed to share AEs with healthcare providers. Registry may be a potential solution, although the cooperation of healthcare providers to input data is essential. PMID:26501120

  7. ADESSA: A Real-Time Decision Support Service for Delivery of Semantically Coded Adverse Drug Event Data

    PubMed Central

    Duke, Jon D.; Friedlin, Jeff

    2010-01-01

    Evaluating medications for potential adverse events is a time-consuming process, typically involving manual lookup of information by physicians. This process can be expedited by CDS systems that support dynamic retrieval and filtering of adverse drug events (ADE’s), but such systems require a source of semantically-coded ADE data. We created a two-component system that addresses this need. First we created a natural language processing application which extracts adverse events from Structured Product Labels and generates a standardized ADE knowledge base. We then built a decision support service that consumes a Continuity of Care Document and returns a list of patient-specific ADE’s. Our database currently contains 534,125 ADE’s from 5602 product labels. An NLP evaluation of 9529 ADE’s showed recall of 93% and precision of 95%. On a trial set of 30 CCD’s, the system provided adverse event data for 88% of drugs and returned these results in an average of 620ms. PMID:21346964

  8. Development of a real time monitor and multivariate method for long term diagnostics of atmospheric pressure dielectric barrier discharges: application to He, He/N2, and He/O2 discharges.

    PubMed

    O'Connor, N; Milosavljević, V; Daniels, S

    2011-08-01

    In this paper we present the development and application of a real time atmospheric pressure discharge monitoring diagnostic. The software based diagnostic is designed to extract latent electrical and optical information associated with the operation of an atmospheric pressure dielectric barrier discharge (APDBD) over long time scales. Given that little is known about long term temporal effects in such discharges, the diagnostic methodology is applied to the monitoring of an APDBD in helium and helium with both 0.1% nitrogen and 0.1% oxygen gas admixtures over periods of tens of minutes. Given the large datasets associated with the experiments, it is shown that this process is much expedited through the novel application of multivariate correlations between the electrical and optical parameters of the corresponding chemistries which, in turn, facilitates comparisons between each individual chemistry also. The results of these studies show that the electrical and optical parameters of the discharge in helium and upon the addition of gas admixtures evolve over time scales far longer than the gas residence time and have been compared to current modelling works. It is envisaged that the diagnostic together with the application of multivariate correlations will be applied to rapid system identification and prototyping in both experimental and industrial APDBD systems in the future.

  9. Determination of the event collision time with the ALICE detector at the LHC

    NASA Astrophysics Data System (ADS)

    Adam, J.; Adamová, D.; Aggarwal, M. M.; Aglieri Rinella, G.; Agnello, M.; Agrawal, N.; Ahammed, Z.; Ahmad, S.; Ahn, S. U.; Aiola, S.; Akindinov, A.; Alam, S. N.; Albuquerque, D. S. D.; Aleksandrov, D.; Alessandro, B.; Alexandre, D.; Alfaro Molina, R.; Alici, A.; Alkin, A.; Alme, J.; Alt, T.; Altinpinar, S.; Altsybeev, I.; Alves Garcia Prado, C.; An, M.; Andrei, C.; Andrews, H. A.; Andronic, A.; Anguelov, V.; Anson, C.; Antičić, T.; Antinori, F.; Antonioli, P.; Anwar, R.; Aphecetche, L.; Appelshäuser, H.; Arcelli, S.; Arnaldi, R.; Arnold, O. W.; Arsene, I. C.; Arslandok, M.; Audurier, B.; Augustinus, A.; Averbeck, R.; Azmi, M. D.; Badalà, A.; Baek, Y. W.; Bagnasco, S.; Bailhache, R.; Bala, R.; Balasubramanian, S.; Baldisseri, A.; Baral, R. C.; Barbano, A. M.; Barbera, R.; Barile, F.; Barnaföldi, G. G.; Barnby, L. S.; Barret, V.; Bartalini, P.; Barth, K.; Bartke, J.; Bartsch, E.; Basile, M.; Bastid, N.; Basu, S.; Bathen, B.; Batigne, G.; Batista Camejo, A.; Batyunya, B.; Batzing, P. C.; Bearden, I. G.; Beck, H.; Bedda, C.; Behera, N. K.; Belikov, I.; Bellini, F.; Bello Martinez, H.; Bellwied, R.; Beltran, L. G. E.; Belyaev, V.; Bencedi, G.; Beole, S.; Bercuci, A.; Berdnikov, Y.; Berenyi, D.; Bertens, R. A.; Berzano, D.; Betev, L.; Bhasin, A.; Bhat, I. R.; Bhati, A. K.; Bhattacharjee, B.; Bhom, J.; Bianchi, L.; Bianchi, N.; Bianchin, C.; Bielčík, J.; Bielčíková, J.; Bilandzic, A.; Biro, G.; Biswas, R.; Biswas, S.; Bjelogrlic, S.; Blair, J. T.; Blau, D.; Blume, C.; Bock, F.; Bogdanov, A.; Boldizsár, L.; Bombara, M.; Bonora, M.; Book, J.; Borel, H.; Borissov, A.; Borri, M.; Botta, E.; Bourjau, C.; Braun-Munzinger, P.; Bregant, M.; Broker, T. A.; Browning, T. A.; Broz, M.; Brucken, E. J.; Bruna, E.; Bruno, G. E.; Budnikov, D.; Buesching, H.; Bufalino, S.; Buhler, P.; Buitron, S. A. I.; Buncic, P.; Busch, O.; Buthelezi, Z.; Butt, J. B.; Buxton, J. T.; Cabala, J.; Caffarri, D.; Caines, H.; Caliva, A.; Calvo Villar, E.; Camerini, P.; Carena, F.; Carena, W.; Carnesecchi, F.; Castillo Castellanos, J.; Castro, A. J.; Casula, E. A. R.; Ceballos Sanchez, C.; Cepila, J.; Cerello, P.; Cerkala, J.; Chang, B.; Chapeland, S.; Chartier, M.; Charvet, J. L.; Chattopadhyay, S.; Chattopadhyay, S.; Chauvin, A.; Chelnokov, V.; Cherney, M.; Cheshkov, C.; Cheynis, B.; Chibante Barroso, V.; Chinellato, D. D.; Cho, S.; Chochula, P.; Choi, K.; Chojnacki, M.; Choudhury, S.; Christakoglou, P.; Christensen, C. H.; Christiansen, P.; Chujo, T.; Chung, S. U.; Cicalo, C.; Cifarelli, L.; Cindolo, F.; Cleymans, J.; Colamaria, F.; Colella, D.; Collu, A.; Colocci, M.; Conesa Balbastre, G.; Conesa del Valle, Z.; Connors, M. E.; Contreras, J. G.; Cormier, T. M.; Corrales Morales, Y.; Cortés Maldonado, I.; Cortese, P.; Cosentino, M. R.; Costa, F.; Crkovská, J.; Crochet, P.; Cruz Albino, R.; Cuautle, E.; Cunqueiro, L.; Dahms, T.; Dainese, A.; Danisch, M. C.; Danu, A.; Das, D.; Das, I.; Das, S.; Dash, A.; Dash, S.; De, S.; De Caro, A.; de Cataldo, G.; de Conti, C.; de Cuveland, J.; De Falco, A.; De Gruttola, D.; De Marco, N.; De Pasquale, S.; De Souza, R. D.; Deisting, A.; Deloff, A.; Deplano, C.; Dhankher, P.; Di Bari, D.; Di Mauro, A.; Di Nezza, P.; Di Ruzza, B.; Diaz Corchero, M. A.; Dietel, T.; Dillenseger, P.; Divià, R.; Djuvsland, Ø.; Dobrin, A.; Domenicis Gimenez, D.; Dönigus, B.; Dordic, O.; Drozhzhova, T.; Dubey, A. K.; Dubla, A.; Ducroux, L.; Duggal, A. K.; Dupieux, P.; Ehlers, R. 
J.; Elia, D.; Endress, E.; Engel, H.; Epple, E.; Erazmus, B.; Erhardt, F.; Espagnon, B.; Esumi, S.; Eulisse, G.; Eum, J.; Evans, D.; Evdokimov, S.; Eyyubova, G.; Fabbietti, L.; Fabris, D.; Faivre, J.; Fantoni, A.; Fasel, M.; Feldkamp, L.; Feliciello, A.; Feofilov, G.; Ferencei, J.; Fernández Téllez, A.; Ferreiro, E. G.; Ferretti, A.; Festanti, A.; Feuillard, V. J. G.; Figiel, J.; Figueredo, M. A. S.; Filchagin, S.; Finogeev, D.; Fionda, F. M.; Fiore, E. M.; Floris, M.; Foertsch, S.; Foka, P.; Fokin, S.; Fragiacomo, E.; Francescon, A.; Francisco, A.; Frankenfeld, U.; Fronze, G. G.; Fuchs, U.; Furget, C.; Furs, A.; Fusco Girard, M.; Gaardhøje, J. J.; Gagliardi, M.; Gago, A. M.; Gajdosova, K.; Gallio, M.; Galvan, C. D.; Gangadharan, D. R.; Ganoti, P.; Gao, C.; Garabatos, C.; Garcia-Solis, E.; Garg, K.; Garg, P.; Gargiulo, C.; Gasik, P.; Gauger, E. F.; Gay Ducati, M. B.; Germain, M.; Ghosh, P.; Ghosh, S. K.; Gianotti, P.; Giubellino, P.; Giubilato, P.; Gladysz-Dziadus, E.; Glässel, P.; Goméz Coral, D. M.; Gomez Ramirez, A.; Gonzalez, A. S.; Gonzalez, V.; González-Zamora, P.; Gorbunov, S.; Görlich, L.; Gotovac, S.; Grabski, V.; Graczykowski, L. K.; Graham, K. L.; Greiner, L.; Grelli, A.; Grigoras, C.; Grigoriev, V.; Grigoryan, A.; Grigoryan, S.; Grion, N.; Gronefeld, J. M.; Grosse-Oetringhaus, J. F.; Grosso, R.; Gruber, L.; Guber, F.; Guernane, R.; Guerzoni, B.; Gulbrandsen, K.; Gunji, T.; Gupta, A.; Gupta, R.; Guzman, I. B.; Haake, R.; Hadjidakis, C.; Hamagaki, H.; Hamar, G.; Hamon, J. C.; Harris, J. W.; Harton, A.; Hatzifotiadou, D.; Hayashi, S.; Heckel, S. T.; Hellbär, E.; Helstrup, H.; Herghelegiu, A.; Herrera Corral, G.; Herrmann, F.; Hess, B. A.; Hetland, K. F.; Hillemanns, H.; Hippolyte, B.; Hladky, J.; Horak, D.; Hosokawa, R.; Hristov, P.; Hughes, C.; Humanic, T. J.; Hussain, N.; Hussain, T.; Hutter, D.; Hwang, D. S.; Ilkaev, R.; Inaba, M.; Ippolitov, M.; Irfan, M.; Isakov, V.; Islam, M. S.; Ivanov, M.; Ivanov, V.; Izucheev, V.; Jacak, B.; Jacazio, N.; Jacobs, P. M.; Jadhav, M. B.; Jadlovska, S.; Jadlovsky, J.; Jahnke, C.; Jakubowska, M. J.; Janik, M. A.; Jayarathna, P. H. S. Y.; Jena, C.; Jena, S.; Jimenez Bustamante, R. T.; Jones, P. G.; Jusko, A.; Kalinak, P.; Kalweit, A.; Kang, J. H.; Kaplin, V.; Kar, S.; Karasu Uysal, A.; Karavichev, O.; Karavicheva, T.; Karayan, L.; Karpechev, E.; Kebschull, U.; Keidel, R.; Keijdener, D. L. D.; Keil, M.; Mohisin Khan, M.; Khan, P.; Khan, S. A.; Khanzadeev, A.; Kharlov, Y.; Khatun, A.; Khuntia, A.; Kileng, B.; Kim, D. W.; Kim, D. J.; Kim, D.; Kim, H.; Kim, J. S.; Kim, J.; Kim, M.; Kim, M.; Kim, S.; Kim, T.; Kirsch, S.; Kisel, I.; Kiselev, S.; Kisiel, A.; Kiss, G.; Klay, J. L.; Klein, C.; Klein, J.; Klein-Bösing, C.; Klewin, S.; Kluge, A.; Knichel, M. L.; Knospe, A. G.; Kobdaj, C.; Kofarago, M.; Kollegger, T.; Kolojvari, A.; Kondratiev, V.; Kondratyeva, N.; Kondratyuk, E.; Konevskikh, A.; Kopcik, M.; Kour, M.; Kouzinopoulos, C.; Kovalenko, O.; Kovalenko, V.; Kowalski, M.; Koyithatta Meethaleveedu, G.; Králik, I.; Kravčáková, A.; Krivda, M.; Krizek, F.; Kryshen, E.; Krzewicki, M.; Kubera, A. M.; Kučera, V.; Kuhn, C.; Kuijer, P. G.; Kumar, A.; Kumar, J.; Kumar, L.; Kumar, S.; Kundu, S.; Kurashvili, P.; Kurepin, A.; Kurepin, A. B.; Kuryakin, A.; Kushpil, S.; Kweon, M. J.; Kwon, Y.; La Pointe, S. L.; La Rocca, P.; Lagana Fernandes, C.; Lakomov, I.; Langoy, R.; Lapidus, K.; Lara, C.; Lardeux, A.; Lattuca, A.; Laudi, E.; Lazaridis, L.; Lea, R.; Leardini, L.; Lee, S.; Lehas, F.; Lehner, S.; Lehrbach, J.; Lemmon, R. 
C.; Lenti, V.; Leogrande, E.; León Monzón, I.; Lévai, P.; Li, S.; Li, X.; Lien, J.; Lietava, R.; Lindal, S.; Lindenstruth, V.; Lippmann, C.; Lisa, M. A.; Ljunggren, H. M.; Llope, W.; Lodato, D. F.; Loenne, P. I.; Loginov, V.; Loizides, C.; Lopez, X.; López Torres, E.; Lowe, A.; Luettig, P.; Lunardon, M.; Luparello, G.; Lupi, M.; Lutz, T. H.; Maevskaya, A.; Mager, M.; Mahajan, S.; Mahmood, S. M.; Maire, A.; Majka, R. D.; Malaev, M.; Maldonado Cervantes, I.; Malinina, L.; Mal'Kevich, D.; Malzacher, P.; Mamonov, A.; Manko, V.; Manso, F.; Manzari, V.; Mao, Y.; Marchisone, M.; Mareš, J.; Margagliotti, G. V.; Margotti, A.; Margutti, J.; Marín, A.; Markert, C.; Marquard, M.; Martin, N. A.; Martinengo, P.; Martínez, M. I.; Martínez García, G.; Martinez Pedreira, M.; Mas, A.; Masciocchi, S.; Masera, M.; Masoni, A.; Mastroserio, A.; Matyja, A.; Mayer, C.; Mazer, J.; Mazzilli, M.; Mazzoni, M. A.; Meddi, F.; Melikyan, Y.; Menchaca-Rocha, A.; Meninno, E.; Mercado Pérez, J.; Meres, M.; Mhlanga, S.; Miake, Y.; Mieskolainen, M. M.; Mikhaylov, K.; Milano, L.; Milosevic, J.; Mischke, A.; Mishra, A. N.; Mishra, T.; Miśkowiec, D.; Mitra, J.; Mitu, C. M.; Mohammadi, N.; Mohanty, B.; Molnar, L.; Montes, E.; Moreira De Godoy, D. A.; Moreno, L. A. P.; Moretto, S.; Morreale, A.; Morsch, A.; Muccifora, V.; Mudnic, E.; Mühlheim, D.; Muhuri, S.; Mukherjee, M.; Mulligan, J. D.; Munhoz, M. G.; Münning, K.; Munzer, R. H.; Murakami, H.; Murray, S.; Musa, L.; Musinsky, J.; Myers, C. J.; Naik, B.; Nair, R.; Nandi, B. K.; Nania, R.; Nappi, E.; Naru, M. U.; Natal da Luz, H.; Nattrass, C.; Navarro, S. R.; Nayak, K.; Nayak, R.; Nayak, T. K.; Nazarenko, S.; Nedosekin, A.; Negrao De Oliveira, R. A.; Nellen, L.; Ng, F.; Nicassio, M.; Niculescu, M.; Niedziela, J.; Nielsen, B. S.; Nikolaev, S.; Nikulin, S.; Nikulin, V.; Noferini, F.; Nomokonov, P.; Nooren, G.; Noris, J. C. C.; Norman, J.; Nyanin, A.; Nystrand, J.; Oeschler, H.; Oh, S.; Ohlson, A.; Okubo, T.; Olah, L.; Oleniacz, J.; Oliveira Da Silva, A. C.; Oliver, M. H.; Onderwaater, J.; Oppedisano, C.; Orava, R.; Oravec, M.; Ortiz Velasquez, A.; Oskarsson, A.; Otwinowski, J.; Oyama, K.; Ozdemir, M.; Pachmayer, Y.; Pacik, V.; Pagano, D.; Pagano, P.; Paić, G.; Pal, S. K.; Palni, P.; Pan, J.; Pandey, A. K.; Papikyan, V.; Pappalardo, G. S.; Pareek, P.; Park, J.; Park, W. J.; Parmar, S.; Passfeld, A.; Paticchio, V.; Patra, R. N.; Paul, B.; Pei, H.; Peitzmann, T.; Peng, X.; Pereira Da Costa, H.; Peresunko, D.; Perez Lezama, E.; Peskov, V.; Pestov, Y.; Petráček, V.; Petrov, V.; Petrovici, M.; Petta, C.; Piano, S.; Pikna, M.; Pillot, P.; Pimentel, L. O. D. L.; Pinazza, O.; Pinsky, L.; Piyarathna, D. B.; Płoskoń, M.; Planinic, M.; Pluta, J.; Pochybova, S.; Podesta-Lerma, P. L. M.; Poghosyan, M. G.; Polichtchouk, B.; Poljak, N.; Poonsawat, W.; Pop, A.; Poppenborg, H.; Porteboeuf-Houssais, S.; Porter, J.; Pospisil, J.; Pozdniakov, V.; Prasad, S. K.; Preghenella, R.; Prino, F.; Pruneau, C. A.; Pshenichnov, I.; Puccio, M.; Puddu, G.; Pujahari, P.; Punin, V.; Putschke, J.; Qvigstad, H.; Rachevski, A.; Raha, S.; Rajput, S.; Rak, J.; Rakotozafindrabe, A.; Ramello, L.; Rami, F.; Rana, D. B.; Raniwala, R.; Raniwala, S.; Räsänen, S. S.; Rascanu, B. T.; Rathee, D.; Ratza, V.; Ravasenga, I.; Read, K. F.; Redlich, K.; Rehman, A.; Reichelt, P.; Reidt, F.; Ren, X.; Renfordt, R.; Reolon, A. R.; Reshetin, A.; Reygers, K.; Riabov, V.; Ricci, R. 
A.; Richert, T.; Richter, M.; Riedler, P.; Riegler, W.; Riggi, F.; Ristea, C.; Rodríguez Cahuantzi, M.; Røed, K.; Rogochaya, E.; Rohr, D.; Röhrich, D.; Ronchetti, F.; Ronflette, L.; Rosnet, P.; Rossi, A.; Roukoutakis, F.; Roy, A.; Roy, C.; Roy, P.; Rubio Montero, A. J.; Rui, R.; Russo, R.; Ryabinkin, E.; Ryabov, Y.; Rybicki, A.; Saarinen, S.; Sadhu, S.; Sadovsky, S.; Šafařík, K.; Sahlmuller, B.; Sahoo, B.; Sahoo, P.; Sahoo, R.; Sahoo, S.; Sahu, P. K.; Saini, J.; Sakai, S.; Saleh, M. A.; Salzwedel, J.; Sambyal, S.; Samsonov, V.; Sandoval, A.; Sano, M.; Sarkar, D.; Sarkar, N.; Sarma, P.; Sas, M. H. P.; Scapparone, E.; Scarlassara, F.; Scharenberg, R. P.; Schiaua, C.; Schicker, R.; Schmidt, C.; Schmidt, H. R.; Schmidt, M.; Schukraft, J.; Schutz, Y.; Schwarz, K.; Schweda, K.; Scioli, G.; Scomparin, E.; Scott, R.; Šefčík, M.; Seger, J. E.; Sekiguchi, Y.; Sekihata, D.; Selyuzhenkov, I.; Senosi, K.; Senyukov, S.; Serradilla, E.; Sett, P.; Sevcenco, A.; Shabanov, A.; Shabetai, A.; Shadura, O.; Shahoyan, R.; Shangaraev, A.; Sharma, A.; Sharma, A.; Sharma, M.; Sharma, M.; Sharma, N.; Sheikh, A. I.; Shigaki, K.; Shou, Q.; Shtejer, K.; Sibiriak, Y.; Siddhanta, S.; Sielewicz, K. M.; Siemiarczuk, T.; Silvermyr, D.; Silvestre, C.; Simatovic, G.; Simonetti, G.; Singaraju, R.; Singh, R.; Singhal, V.; Sinha, T.; Sitar, B.; Sitta, M.; Skaali, T. B.; Slupecki, M.; Smirnov, N.; Snellings, R. J. M.; Snellman, T. W.; Song, J.; Song, M.; Song, Z.; Soramel, F.; Sorensen, S.; Sozzi, F.; Spiriti, E.; Sputowska, I.; Srivastava, B. K.; Stachel, J.; Stan, I.; Stankus, P.; Stenlund, E.; Steyn, G.; Stiller, J. H.; Stocco, D.; Strmen, P.; Suaide, A. A. P.; Sugitate, T.; Suire, C.; Suleymanov, M.; Suljic, M.; Sultanov, R.; Šumbera, M.; Sumowidagdo, S.; Suzuki, K.; Swain, S.; Szabo, A.; Szarka, I.; Szczepankiewicz, A.; Szymanski, M.; Tabassam, U.; Takahashi, J.; Tambave, G. J.; Tanaka, N.; Tarhini, M.; Tariq, M.; Tarzila, M. G.; Tauro, A.; Tejeda Muñoz, G.; Telesca, A.; Terasaki, K.; Terrevoli, C.; Teyssier, B.; Thakur, D.; Thomas, D.; Tieulent, R.; Tikhonov, A.; Timmins, A. R.; Toia, A.; Tripathy, S.; Trogolo, S.; Trombetta, G.; Trubnikov, V.; Trzaska, W. H.; Tsuji, T.; Tumkin, A.; Turrisi, R.; Tveter, T. S.; Ullaland, K.; Umaka, E. N.; Uras, A.; Usai, G. L.; Utrobicic, A.; Vala, M.; Van Der Maarel, J.; Van Hoorne, J. W.; van Leeuwen, M.; Vanat, T.; Vande Vyvre, P.; Varga, D.; Vargas, A.; Vargyas, M.; Varma, R.; Vasileiou, M.; Vasiliev, A.; Vauthier, A.; Vázquez Doce, O.; Vechernin, V.; Veen, A. M.; Velure, A.; Vercellin, E.; Vergara Limón, S.; Vernet, R.; Vértesi, R.; Vickovic, L.; Vigolo, S.; Viinikainen, J.; Vilakazi, Z.; Villalobos Baillie, O.; Villatoro Tello, A.; Vinogradov, A.; Vinogradov, L.; Virgili, T.; Vislavicius, V.; Vodopyanov, A.; Völkl, M. A.; Voloshin, K.; Voloshin, S. A.; Volpe, G.; von Haller, B.; Vorobyev, I.; Voscek, D.; Vranic, D.; Vrláková, J.; Wagner, B.; Wagner, J.; Wang, H.; Wang, M.; Watanabe, D.; Watanabe, Y.; Weber, M.; Weber, S. G.; Weiser, D. F.; Wessels, J. P.; Westerhoff, U.; Whitehead, A. M.; Wiechula, J.; Wikne, J.; Wilk, G.; Wilkinson, J.; Willems, G. A.; Williams, M. C. S.; Windelband, B.; Winn, M.; Witt, W. E.; Yalcin, S.; Yang, P.; Yano, S.; Yin, Z.; Yokoyama, H.; Yoo, I.-K.; Yoon, J. H.; Yurchenko, V.; Zaccolo, V.; Zaman, A.; Zampolli, C.; Zanoli, H. J. 
C.; Zaporozhets, S.; Zardoshti, N.; Zarochentsev, A.; Závada, P.; Zaviyalov, N.; Zbroszczyk, H.; Zhalov, M.; Zhang, H.; Zhang, X.; Zhang, Y.; Zhang, C.; Zhang, Z.; Zhao, C.; Zhigareva, N.; Zhou, D.; Zhou, Y.; Zhou, Z.; Zhu, H.; Zhu, J.; Zichichi, A.; Zimmermann, A.; Zimmermann, M. B.; Zinovjev, G.; Zmeskal, J.

    2017-02-01

    Particle identification is an important feature of the ALICE detector at the LHC. In particular, for particle identification via the time-of-flight technique, the precise determination of the event collision time represents an important ingredient of the quality of the measurement. In this paper, the different methods used for such a measurement in ALICE by means of the T0 and the TOF detectors are reviewed. Efficiencies, resolution and the improvement of the particle identification separation power of the methods used are presented for the different LHC colliding systems (pp, p-Pb and Pb-Pb) during the first period of data taking of LHC (RUN 1).

  10. A Multivariate Investigation of Employee Absenteeism.

    DTIC Science & Technology

    1980-05-01

    A Multivariate Investigation of Employee Absenteeism. Terborg, J. R.; Davis, G. A.; Smith, F. J. Technical Report TR-80-5, Complex Organizations Program in Industrial/Organizational Psychology, Department of Psychology, University of Houston, Houston, Texas, May 1980 (Contract N00014-78-C-0756; Unclassified).

  11. Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.

    PubMed

    Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn

    2017-01-01

    The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial in which locally advanced breast cancer patients were randomised to either taxane or anthracycline based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. It shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker to evaluate the effect of taxane versus anthracycline based chemotherapies on progression free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply and outperforms existing strategies in terms of precision as well as robustness against model misspecification.

  12. Bayesian Phase II optimization for time-to-event data based on historical information.

    PubMed

    Bertsche, Anja; Fleischer, Frank; Beyersmann, Jan; Nehmiz, Gerhard

    2017-01-01

    After exploratory drug development, companies face the decision whether to initiate confirmatory trials based on limited efficacy information. This proof-of-concept decision is typically performed after a Phase II trial studying a novel treatment versus either placebo or an active comparator. The article aims to optimize the design of such a proof-of-concept trial with respect to decision making. We incorporate historical information and develop pre-specified decision criteria accounting for the uncertainty of the observed treatment effect. We optimize these criteria based on sensitivity and specificity, given the historical information. Specifically, time-to-event data are considered in a randomized 2-arm trial with additional prior information on the control treatment. The proof-of-concept criterion uses treatment effect size, rather than significance. Criteria are defined on the posterior distribution of the hazard ratio given the Phase II data and the historical control information. Event times are exponentially modeled within groups, allowing for group-specific conjugate prior-to-posterior calculation. While a non-informative prior is placed on the investigational treatment, the control prior is constructed via the meta-analytic-predictive approach. The design parameters including sample size and allocation ratio are then optimized, maximizing the probability of taking the right decision. The approach is illustrated with an example in lung cancer.
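
    Under the modeling assumptions stated in the abstract (exponential event times within groups and conjugate priors), each arm's hazard has a closed-form Gamma posterior, and decision criteria on the hazard ratio can be evaluated by Monte Carlo. The sketch below illustrates this with an informative Gamma prior standing in for the meta-analytic-predictive control prior; all prior parameters, event counts, and the decision threshold are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

def posterior_hazard(a, b, n_events, exposure, draws=20_000):
    """Gamma(a, b) prior + exponential likelihood -> Gamma(a + events, b + exposure) posterior."""
    return rng.gamma(a + n_events, 1.0 / (b + exposure), size=draws)

# Hypothetical Phase II results: (events, total follow-up time in patient-years).
ctrl = posterior_hazard(a=30.0, b=60.0, n_events=25, exposure=50.0)    # informative control prior
trt = posterior_hazard(a=0.001, b=0.001, n_events=15, exposure=55.0)   # near non-informative prior

hr = trt / ctrl                         # posterior draws of the hazard ratio
go = np.mean(hr < 0.8)                  # proof-of-concept criterion on the effect size
print(f"P(HR < 0.8 | data) = {go:.2f} -> {'go' if go > 0.9 else 'no-go'}")
```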

  13. Short Sleep Duration is an Independent Predictor of Cardiovascular Events in Japanese Hypertensive Patients

    PubMed Central

    Eguchi, Kazuo; Pickering, Thomas G.; Schwartz, Joseph E.; Hoshide, Satoshi; Ishikawa, Joji; Ishikawa, Shizukiyo; Shimada, Kazuyuki; Kario, Kazuomi

    2013-01-01

    Context It is not known whether short duration of sleep is a predictor of future cardiovascular events in hypertensive patients. Objective To test the hypothesis that short duration of sleep is independently associated with incident cardiovascular diseases (CVD). Design, Setting, and Participants We performed ambulatory BP monitoring (ABPM) in 1255 subjects with hypertension (mean age: 70.4±9.9 years), who were followed for an average of 50±23 months. Short sleep duration was defined as <7.5 hrs (20th percentile). Multivariable Cox hazard models predicting CVD events were used to estimate the adjusted hazard ratio (HR) and 95% CI for short sleep duration. A riser pattern was defined when average nighttime SBP exceeded daytime SBP. Main Outcome Measures The end point was cardiovascular events: stroke, fatal or non-fatal myocardial infarction (MI), and sudden cardiac death. Results In multivariable analyses, short duration of sleep (<7.5 hrs) was associated with incident CVD (HR=1.68; 95% CI 1.06–2.66, P=.03). A synergistic interaction was observed between short sleep duration and the riser pattern (P=.089). When subjects were categorized on the basis of their sleep time and riser/non-riser patterns, the shorter sleep+riser group had a substantially and significantly higher incidence of CVD than the predominant normal sleep+non-riser group (HR=4.43; 95% CI 2.09–9.39, P<0.001), independent of covariates. Conclusions Short duration of sleep is associated with incident CVD risk, and the combination of a riser pattern and short sleep duration is most strongly predictive of future CVD, independent of ambulatory BP levels. Physicians should inquire about sleep duration in the risk assessment of hypertensive patients. PMID:19001199

  14. Sampling effort affects multivariate comparisons of stream assemblages

    USGS Publications Warehouse

    Cao, Y.; Larsen, D.P.; Hughes, R.M.; Angermeier, P.L.; Patton, T.M.

    2002-01-01

    Multivariate analyses are used widely for determining patterns of assemblage structure, inferring species-environment relationships and assessing human impacts on ecosystems. The estimation of ecological patterns often depends on sampling effort, so the degree to which sampling effort affects the outcome of multivariate analyses is a concern. We examined the effect of sampling effort on site and group separation, which was measured using a mean similarity method. Two similarity measures, the Jaccard Coefficient and Bray-Curtis Index were investigated with 1 benthic macroinvertebrate and 2 fish data sets. Site separation was significantly improved with increased sampling effort because the similarity between replicate samples of a site increased more rapidly than between sites. Similarly, the faster increase in similarity between sites of the same group than between sites of different groups caused clearer separation between groups. The strength of site and group separation completely stabilized only when the mean similarity between replicates reached 1. These results are applicable to commonly used multivariate techniques such as cluster analysis and ordination because these multivariate techniques start with a similarity matrix. Completely stable outcomes of multivariate analyses are not feasible. Instead, we suggest 2 criteria for estimating the stability of multivariate analyses of assemblage data: 1) mean within-site similarity across all sites compared, indicating sample representativeness, and 2) the SD of within-site similarity across sites, measuring sample comparability.
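
    Because the analyses start from a similarity matrix, the within-site versus between-site similarities discussed here are straightforward to compute. The sketch below evaluates Bray-Curtis and Jaccard similarities (one minus the SciPy dissimilarities) for hypothetical taxa-count vectors; the data are invented and serve only to illustrate the two measures.

```python
import numpy as np
from scipy.spatial.distance import braycurtis, jaccard

# Two replicate samples from site A and one sample from site B (counts of 6 taxa).
site_a_rep1 = np.array([12, 0, 3, 8, 0, 1])
site_a_rep2 = np.array([10, 1, 4, 6, 0, 0])
site_b = np.array([0, 7, 0, 2, 9, 5])

bc_within = 1 - braycurtis(site_a_rep1, site_a_rep2)
bc_between = 1 - braycurtis(site_a_rep1, site_b)

# Jaccard works on presence/absence, so convert counts to booleans.
jac_within = 1 - jaccard(site_a_rep1 > 0, site_a_rep2 > 0)
jac_between = 1 - jaccard(site_a_rep1 > 0, site_b > 0)

print(f"Bray-Curtis similarity: within-site {bc_within:.2f}, between-site {bc_between:.2f}")
print(f"Jaccard similarity:     within-site {jac_within:.2f}, between-site {jac_between:.2f}")
```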

  15. Multivariate-$t$ nonlinear mixed models with application to censored multi-outcome AIDS studies.

    PubMed

    Lin, Tsung-I; Wang, Wan-Lun

    2017-10-01

    In multivariate longitudinal HIV/AIDS studies, multi-outcome repeated measures on each patient over time may contain outliers, and the viral loads are often subject to an upper or lower limit of detection depending on the quantification assays. In this article, we consider an extension of the multivariate nonlinear mixed-effects model by adopting a joint multivariate-$t$ distribution for random effects and within-subject errors and taking the censoring information of multiple responses into account. The proposed model is called the multivariate-$t$ nonlinear mixed-effects model with censored responses (MtNLMMC), allowing for analyzing multi-outcome longitudinal data exhibiting nonlinear growth patterns with censorship and fat-tailed behavior. Utilizing the Taylor-series linearization method, a pseudo-data version of the expectation conditional maximization either (ECME) algorithm is developed for iteratively carrying out maximum likelihood estimation. We illustrate our techniques with two data examples from HIV/AIDS studies. Experimental results signify that the MtNLMMC performs favorably compared to its Gaussian analogue and some existing approaches. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. Multivariate temporal pattern analysis applied to the study of rat behavior in the elevated plus maze: methodological and conceptual highlights.

    PubMed

    Casarrubea, M; Magnusson, M S; Roy, V; Arabo, A; Sorbera, F; Santangelo, A; Faulisi, F; Crescimanno, G

    2014-08-30

    The aim of this article is to illustrate the application of a multivariate approach known as t-pattern analysis to the study of rat behavior in the elevated plus maze. By means of this multivariate approach, significant relationships among behavioral events in the course of time can be described. Both quantitative and t-pattern analyses were utilized to analyze data obtained from fifteen male Wistar rats following a trial 1-trial 2 protocol. In trial 2, in comparison with the initial exposure, mean occurrences of behavioral elements performed in protected zones of the maze showed a significant increase counterbalanced by a significant decrease of mean occurrences of behavioral elements in unprotected zones. Multivariate t-pattern analysis, in trial 1, revealed the presence of 134 t-patterns of different composition. In trial 2, the temporal structure of behavior became simpler, with only 32 different t-patterns present. Behavioral strings and stripes (i.e., graphical representations of each t-pattern onset) of all t-patterns are presented for both trial 1 and trial 2. Finally, percent distributions in the three zones of the maze show a clear-cut increase of t-patterns in the closed arm and a significant reduction in the remaining zones. Results show that previous experience deeply modifies the temporal structure of rat behavior in the elevated plus maze. In addition, this article, by highlighting several conceptual, methodological and illustrative aspects of the utilization of t-pattern analysis, could represent a useful background for employing such a refined approach in the study of rat behavior in the elevated plus maze. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Long-Term Memory: A Natural Mechanism for the Clustering of Extreme Events and Anomalous Residual Times in Climate Records

    NASA Astrophysics Data System (ADS)

    Bunde, Armin; Eichner, Jan F.; Kantelhardt, Jan W.; Havlin, Shlomo

    2005-01-01

    We study the statistics of the return intervals between extreme events above a certain threshold in long-term persistent records. We find that the long-term memory leads (i)to a stretched exponential distribution of the return intervals, (ii)to a pronounced clustering of extreme events, and (iii)to an anomalous behavior of the mean residual time to the next event that depends on the history and increases with the elapsed time in a counterintuitive way. We present an analytical scaling approach and demonstrate that all these features can be seen in long climate records. The phenomena should also occur in heartbeat records, Internet traffic, and stock market volatility and have to be taken into account for an efficient risk evaluation.
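
    A quick way to see the clustering effect described here is to generate a long-term correlated series by Fourier filtering of white noise and compare its return intervals above a high threshold with those of an uncorrelated series. The sketch below does this with NumPy; the spectral exponent, threshold quantile, and series length are arbitrary choices, and the correlated case typically shows a wider spread of return intervals (i.e., clustering of extremes).

```python
import numpy as np

rng = np.random.default_rng(4)

def correlated_series(n, beta):
    """Series with power spectrum ~ f^(-beta); beta = 0 gives uncorrelated noise."""
    spectrum = np.fft.rfft(rng.normal(size=n))
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                      # avoid dividing by zero at the zero frequency
    spectrum *= freqs ** (-beta / 2.0)
    series = np.fft.irfft(spectrum, n)
    return (series - series.mean()) / series.std()

def return_intervals(x, quantile=0.95):
    threshold = np.quantile(x, quantile)
    return np.diff(np.flatnonzero(x > threshold))

for beta, label in [(0.0, "uncorrelated"), (0.8, "long-term correlated")]:
    r = return_intervals(correlated_series(2 ** 16, beta))
    print(f"{label:>22}: mean interval {r.mean():6.1f}, std/mean {r.std() / r.mean():.2f}")
```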

  18. Power analysis for multivariate and repeated measures designs: a flexible approach using the SPSS MANOVA procedure.

    PubMed

    D'Amico, E J; Neilands, T B; Zambarano, R

    2001-11-01

    Although power analysis is an important component in the planning and implementation of research designs, it is often ignored. Computer programs for performing power analysis are available, but most have limitations, particularly for complex multivariate designs. An SPSS procedure is presented that can be used for calculating power for univariate, multivariate, and repeated measures models with and without time-varying and time-constant covariates. Three examples provide a framework for calculating power via this method: an ANCOVA, a MANOVA, and a repeated measures ANOVA with two or more groups. The benefits and limitations of this procedure are discussed.

  19. Space-time variation of respiratory cancers in South Carolina: a flexible multivariate mixture modeling approach to risk estimation.

    PubMed

    Carroll, Rachel; Lawson, Andrew B; Kirby, Russell S; Faes, Christel; Aregay, Mehreteab; Watjou, Kevin

    2017-01-01

    Many types of cancer have an underlying spatiotemporal distribution. Spatiotemporal mixture modeling can offer a flexible approach to risk estimation via the inclusion of latent variables. In this article, we examine the application and benefits of using four different spatiotemporal mixture modeling methods in the modeling of cancer of the lung and bronchus as well as "other" respiratory cancer incidences in the state of South Carolina. Of the methods tested, no single method outperforms the other methods; which method is best depends on the cancer under consideration. The lung and bronchus cancer incidence outcome is best described by the univariate modeling formulation, whereas the "other" respiratory cancer incidence outcome is best described by the multivariate modeling formulation. Spatiotemporal multivariate mixture methods can aid in the modeling of cancers with small and sparse incidences when including information from a related, more common type of cancer. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. A Network-Based Algorithm for Clustering Multivariate Repeated Measures Data

    NASA Technical Reports Server (NTRS)

    Koslovsky, Matthew; Arellano, John; Schaefer, Caroline; Feiveson, Alan; Young, Millennia; Lee, Stuart

    2017-01-01

    The National Aeronautics and Space Administration (NASA) Astronaut Corps is a unique occupational cohort for which vast amounts of measures data have been collected repeatedly in research or operational studies pre-, in-, and post-flight, as well as during multiple clinical care visits. In exploratory analyses aimed at generating hypotheses regarding physiological changes associated with spaceflight exposure, such as impaired vision, it is of interest to identify anomalies and trends across these expansive datasets. Multivariate clustering algorithms for repeated measures data may help parse the data to identify homogeneous groups of astronauts that have higher risks for a particular physiological change. However, available clustering methods may not be able to accommodate the complex data structures found in NASA data, since the methods often rely on strict model assumptions, require equally-spaced and balanced assessment times, cannot accommodate missing data or differing time scales across variables, and cannot process continuous and discrete data simultaneously. To fill this gap, we propose a network-based, multivariate clustering algorithm for repeated measures data that can be tailored to fit various research settings. Using simulated data, we demonstrate how our method can be used to identify patterns in complex data structures found in practice.
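
    As a rough, generic illustration of network-based clustering of repeated measures (not NASA's algorithm), the sketch below summarizes each simulated trajectory, links subjects whose summaries are close, and reads communities of the resulting graph as clusters using networkx; the summaries, threshold, and group structure are all invented.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(5)
traj = np.vstack([rng.normal(0, 1, size=(10, 4)),       # group 1: 10 subjects, 4 visits each
                  rng.normal(3, 1, size=(10, 4))])      # group 2: shifted trajectories

# Summarize each trajectory by its level and its overall change, then compute distances.
summary = np.column_stack([traj.mean(axis=1), traj[:, -1] - traj[:, 0]])
dist = np.linalg.norm(summary[:, None, :] - summary[None, :, :], axis=-1)

# Connect pairs whose distance falls below a data-driven threshold.
G = nx.Graph()
G.add_nodes_from(range(len(traj)))
threshold = np.quantile(dist[np.triu_indices_from(dist, k=1)], 0.3)
for i in range(len(traj)):
    for j in range(i + 1, len(traj)):
        if dist[i, j] < threshold:
            G.add_edge(i, j)

clusters = greedy_modularity_communities(G)
print([sorted(c) for c in clusters])
```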

  1. Resilient filtering for time-varying stochastic coupling networks under the event-triggering scheduling

    NASA Astrophysics Data System (ADS)

    Wang, Fan; Liang, Jinling; Dobaie, Abdullah M.

    2018-07-01

    The resilient filtering problem is considered for a class of time-varying networks with stochastic coupling strengths. An event-triggered strategy is adopted to save network resources by scheduling the signal transmission from the sensors to the filters based on certain prescribed rules. Moreover, the filter parameters to be designed are subject to gain perturbations. The primary aim of the addressed problem is to determine a resilient filter that ensures an acceptable filtering performance for the considered network under event-triggering scheduling. To handle this issue, an upper bound on the estimation error variance is established for each node via stochastic analysis. Subsequently, the resilient filter is designed by locally minimizing the derived upper bound at each iteration. Furthermore, rigorous analysis shows the monotonicity of the minimal upper bound with respect to the triggering threshold. Finally, a simulation example is presented to show the effectiveness of the proposed filter scheme.
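
    A minimal sketch of the event-triggered scheduling idea only (a "send-on-delta" rule, not the paper's resilient variance-constrained filter design): the sensor transmits a measurement when it deviates from the last transmitted value by more than a threshold, and the remote estimator otherwise relies on its prediction. All model parameters and the fixed filter gain are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
steps, a, q, r, delta = 200, 0.95, 0.05, 0.1, 0.4

x, y_last_sent = 0.0, None
x_hat, transmissions = 0.0, 0
errors = []
for k in range(steps):
    x = a * x + rng.normal(0, np.sqrt(q))        # scalar state with process noise
    y = x + rng.normal(0, np.sqrt(r))            # noisy measurement taken at the sensor
    x_hat = a * x_hat                            # filter prediction at the remote node
    if y_last_sent is None or abs(y - y_last_sent) > delta:   # event-triggering rule
        y_last_sent = y
        transmissions += 1
        x_hat += 0.6 * (y - x_hat)               # fixed-gain correction on received data
    errors.append((x - x_hat) ** 2)

print(f"transmitted {transmissions}/{steps} samples, mean squared error {np.mean(errors):.3f}")
```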

  2. Multiple imputation for multivariate data with missing and below-threshold measurements: time-series concentrations of pollutants in the Arctic.

    PubMed

    Hopke, P K; Liu, C; Rubin, D B

    2001-03-01

    Many chemical and environmental data sets are complicated by the existence of fully missing values or censored values known to lie below detection thresholds. For example, week-long samples of airborne particulate matter were obtained at Alert, NWT, Canada, between 1980 and 1991, where some of the concentrations of 24 particulate constituents were coarsened in the sense of being either fully missing or below detection limits. To facilitate scientific analysis, it is appealing to create complete data by filling in missing values so that standard complete-data methods can be applied. We briefly review commonly used strategies for handling missing values and focus on the multiple-imputation approach, which generally leads to valid inferences when faced with missing data. Three statistical models are developed for multiply imputing the missing values of airborne particulate matter. We expect that these models are useful for creating multiple imputations in a variety of incomplete multivariate time series data sets.
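
    One practical way to create the completed data sets described here is chained-equations imputation run several times with different seeds. The sketch below uses scikit-learn's IterativeImputer on simulated concentrations in which below-detection-limit values have been set to missing; the detection limit, variable names, and the choice of imputer are assumptions, and a full multiple-imputation analysis would additionally combine estimates across the completed data sets.

```python
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401  (enables the estimator)
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(7)
true = rng.lognormal(mean=0.0, sigma=1.0, size=(200, 4))
obs = pd.DataFrame(true, columns=["SO4", "Pb", "Zn", "Br"])       # hypothetical constituents
obs[obs < 0.5] = np.nan                                           # below detection limit -> missing
obs.iloc[rng.integers(0, 200, 20), 1] = np.nan                    # plus some fully missing values

# Five completed data sets, each from a posterior-sampling chained-equations run.
imputations = [
    pd.DataFrame(IterativeImputer(random_state=seed, sample_posterior=True).fit_transform(obs),
                 columns=obs.columns)
    for seed in range(5)
]
print("mean imputed Pb across the 5 completed data sets:",
      np.round([imp["Pb"].mean() for imp in imputations], 3))
```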

  3. [Multivariate Adaptive Regression Splines (MARS), an alternative for the analysis of time series].

    PubMed

    Vanegas, Jairo; Vásquez, Fabián

    Multivariate Adaptive Regression Splines (MARS) is a non-parametric modelling method that extends the linear model, incorporating nonlinearities and interactions between variables. It is a flexible tool that automates the construction of predictive models: selecting relevant variables, transforming the predictor variables, processing missing values and preventing overfitting using a self-test. It is also able to predict, taking into account structural factors that might influence the outcome variable, thereby generating hypothetical models. The end result could identify relevant cut-off points in data series. It is rarely used in health, so it is proposed as a tool for the evaluation of relevant public health indicators. For demonstrative purposes, data series regarding the mortality of children under 5 years of age in Costa Rica were used, comprising the period 1978-2008. Copyright © 2016 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.
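
    MARS models of the kind described here can be fitted in Python with the third-party py-earth package (assumed to be installed; MARS is not part of scikit-learn). The sketch below fits a piecewise-linear Earth model to a hypothetical declining mortality series and prints the selected basis functions, whose knots play the role of the "cut-off points" mentioned in the abstract.

```python
import numpy as np
from pyearth import Earth   # assumes the py-earth package is available

rng = np.random.default_rng(8)
year = np.arange(1978, 2009, dtype=float).reshape(-1, 1)
# Hypothetical under-5 mortality-like series: roughly exponential decline plus noise.
rate = 70.0 * np.exp(-0.07 * (year.ravel() - 1978)) + rng.normal(0, 1.0, size=len(year))

model = Earth(max_degree=1)     # additive piecewise-linear basis with automatic knot selection
model.fit(year, rate)
print(model.summary())          # lists the selected hinge functions and their knots
```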

  4. Error Covariance Penalized Regression: A novel multivariate model combining penalized regression with multivariate error structure.

    PubMed

    Allegrini, Franco; Braga, Jez W B; Moreira, Alessandro C O; Olivieri, Alejandro C

    2018-06-29

    A new multivariate regression model, named Error Covariance Penalized Regression (ECPR) is presented. Following a penalized regression strategy, the proposed model incorporates information about the measurement error structure of the system, using the error covariance matrix (ECM) as a penalization term. Results are reported from both simulations and experimental data based on replicate mid and near infrared (MIR and NIR) spectral measurements. The results for ECPR are better under non-iid conditions when compared with traditional first-order multivariate methods such as ridge regression (RR), principal component regression (PCR) and partial least-squares regression (PLS). Copyright © 2018 Elsevier B.V. All rights reserved.
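
    The abstract describes a penalized regression in which the error covariance matrix (ECM) acts as the penalization term. The sketch below is a generic ridge-type analogue of that idea, not the published ECPR algorithm: a known error covariance enters the penalty and the result is compared with ordinary ridge regression on data with correlated measurement noise; all shapes, the penalty weight, and the simulation are assumptions.

```python
import numpy as np

rng = np.random.default_rng(9)
n, p = 40, 60
b_true = np.sin(np.linspace(0, 3 * np.pi, p))                     # hypothetical regression vector
X_clean = rng.normal(size=(n, p))
# Smoothly correlated channel noise, playing the role of the measurement-error covariance.
error_cov = 0.05 * np.exp(-np.abs(np.subtract.outer(np.arange(p), np.arange(p))) / 5.0)
X = X_clean + rng.multivariate_normal(np.zeros(p), error_cov, size=n)
y = X_clean @ b_true + rng.normal(0, 0.1, size=n)

lam = 1.0
b_cov = np.linalg.solve(X.T @ X + lam * n * error_cov, X.T @ y)   # covariance-structured penalty
b_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)     # ordinary ridge penalty
print("estimation error, covariance penalty:", round(np.linalg.norm(b_cov - b_true), 2))
print("estimation error, plain ridge:       ", round(np.linalg.norm(b_ridge - b_true), 2))
```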

  5. MARSIS Data Bad Time Stamp: Analysis and Solution of an Anomaly Event in a Space Mission

    NASA Astrophysics Data System (ADS)

    Giuppi, S.; Cartacci, M.; Cicchetti, A.; Frigeri, A.; Noschese, R.; Orosei, R.

    2012-04-01

    Mars Express is Europe's first spacecraft to the Red Planet. The spacecraft has been orbiting Mars since December 2003, carrying a suite of instruments that are investigating many scientific aspects of this planet in unprecedented detail. The observations are particularly focused on the martian atmosphere, surface and subsurface. The most innovative instrument on board Mars Express is MARSIS, a subsurface radar sounder with a 40-meter antenna. The main objective of MARSIS is to look for water from the martian surface down to about 5 kilometers below the surface. It provides the first opportunity to detect liquid water directly. It is also able to characterize the surface elevation, roughness, and radar reflectivity of the planet and to study the interaction of the atmosphere and solar wind in the red planet's ionosphere. MARSIS data are stored in the on-board memory and periodically sent to Earth ground stations. Spacecraft Event Time (SCET) is the time at which an event occurs in relation to a spacecraft, as measured by the spacecraft clock. Since it takes time for a radio transmission to reach the spacecraft from the Earth, the usual operation of a spacecraft is done via an uploaded commanding script containing SCET markers to ensure a certain timeline of events. Occasionally the generation time (SCET) of the MARSIS science packets recorded during an observation gets corrupted. This means that while some of the data have the correct SCET, some other data have an SCET that does not match the actual generation time. For this reason, only partial data can be retrieved with the standard procedure. In this paper we describe the cause of the anomaly and the procedures to be applied depending on the circumstances that arise. The application of these procedures has been successful and has made it possible to circumvent the problem.

  6. DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals

    DTIC Science & Technology

    2013-04-24

    Vernon… datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed… As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and

  7. Multivariate survivorship analysis using two cross-sectional samples.

    PubMed

    Hill, M E

    1999-11-01

    As an alternative to survival analysis with longitudinal data, I introduce a method that can be applied when one observes the same cohort in two cross-sectional samples collected at different points in time. The method allows for the estimation of log-probability survivorship models that estimate the influence of multiple time-invariant factors on survival over a time interval separating two samples. This approach can be used whenever the survival process can be adequately conceptualized as an irreversible single-decrement process (e.g., mortality, the transition to first marriage among a cohort of never-married individuals). Using data from the Integrated Public Use Microdata Series (Ruggles and Sobek 1997), I illustrate the multivariate method through an investigation of the effects of race, parity, and educational attainment on the survival of older women in the United States.

  8. Impact of Azithromycin on Pregnancy Prolongation in Women at Risk of Preterm Labor: A Time-to-Event Analysis.

    PubMed

    Goyer, Isabelle; Ferland, Gabrielle; Ruo, Ni; Morin, Caroline; Brochet, Marie-Sophie; Morin, Lucie; Ferreira, Ema

    2016-09-13

    Since 2006, the empiric use of azithromycin in women at risk of premature birth has become prevalent in our institution without any evidence of its efficacy. Although antibiotics can prolong pregnancy in preterm prolonged rupture of membranes, no published data are available for women with intact membranes. To describe the purpose of adding azithromycin to the usual treatments (cerclage, tocolysis, rest, etc.) to prolong pregnancy in women with intact membranes who are at risk of or already in preterm labour. A retrospective observational cohort study was done at a Mother-Child University Hospital Centre. Patients admitted to the obstetric ward who received azithromycin between January 1st, 2006 and August 1st, 2010 were included. A total of 127 exposed women were matched to 127 controls through medical records and pharmacy software. A time-to-event analysis was done to compare gestational age at the time of the recorded composite event (delivery, or rupture of membranes, or second intervention to prolong pregnancy). To compare proportions of the composite event at different time points, χ² tests were used. Patients who received azithromycin had a more severe condition at presentation. Once adjusted for confounding factors, prolongation of pregnancy (HR=1.049; 95% CI: 0.774-1.421 [p=0.758]) and gestational age at the event (HR=1.200; 95% CI: 0.894-1.609 [p=0.225]) did not differ between the groups. The proportions of women with an event ≥7 days post-diagnosis or ≥37 gestational weeks were similar. Azithromycin was added to medical therapy in a more at-risk population and no clear benefit was measured.

  9. Application of the new Cross Recurrence Plots to multivariate data

    NASA Astrophysics Data System (ADS)

    Thiel, M.; Romano, C.; Kurths, J.

    2003-04-01

    We extend and then apply the method of the new Cross Recurrence Plots (XRPs) to multivariate data. After introducing the new method we carry out an analysis of spatiotemporal ecological data. We compute not only the Rényi entropies and cross entropies via XRPs, which allow conclusions to be drawn about the coupling of the systems, but also find a prediction horizon for intermediate time scales.
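
    A cross recurrence plot can be computed directly from two (possibly multivariate) trajectories by thresholding their pairwise distances. The sketch below builds such a matrix for two hypothetical, slightly shifted oscillatory systems; the radius is chosen to fix the recurrence rate, and the diagonal-line structures of the matrix are what entropy and prediction-horizon estimates are derived from.

```python
import numpy as np

rng = np.random.default_rng(10)
t = np.linspace(0, 20 * np.pi, 800)
x = np.column_stack([np.sin(t), np.cos(t)])                        # reference system
y = np.column_stack([np.sin(t + 0.3), np.cos(t + 0.3)]) + rng.normal(0, 0.05, (800, 2))

dist = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1)      # all pairwise state distances
eps = np.quantile(dist, 0.1)                                       # radius fixing a 10% recurrence rate
cross_recurrence = (dist < eps).astype(int)

# Long diagonal lines in cross_recurrence mark stretches where the two systems evolve similarly.
print("recurrence rate:", cross_recurrence.mean())
```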

  10. Real-time detection and classification of anomalous events in streaming data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferragut, Erik M.; Goodall, John R.; Iannacone, Michael D.

    2016-04-19

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The events can be displayed to a user in user-defined groupings in an animated fashion. The system can include a plurality of anomaly detectors that together implement an algorithm to identify low probability events and detect atypical traffic patterns. The atypical traffic patterns can then be classified as being of interest or not. In one particular example, in a network environment, the classification can be whether the network traffic is malicious or not.

  11. Hospital staff should use more than one method to detect adverse events and potential adverse events: incident reporting, pharmacist surveillance and local real‐time record review may all have a place

    PubMed Central

    Olsen, Sisse; Neale, Graham; Schwab, Kat; Psaila, Beth; Patel, Tejal; Chapman, E Jane; Vincent, Charles

    2007-01-01

    Background Over the past five years, in most hospitals in England and Wales, incident reporting has become well established but it remains unclear how well reports match clinical adverse events. International epidemiological studies of adverse events are based on retrospective, multi‐hospital case record review. In this paper the authors describe the use of incident reporting, pharmacist surveillance and local real‐time record review for the recognition of clinical risks associated with hospital inpatient care. Methodology Data on adverse events were collected prospectively on 288 patients discharged from adult acute medical and surgical units in an NHS district general hospital using incident reports, active surveillance of prescription charts by pharmacists and record review at time of discharge. Results Record review detected 26 adverse events (AEs) and 40 potential adverse events (PAEs) occurring during the index admission. In contrast, in the same patient group, incident reporting detected 11 PAEs and no AEs. Pharmacy surveillance found 10 medication errors all of which were PAEs. There was little overlap in the nature of events detected by the three methods. Conclusion The findings suggest that incident reporting does not provide an adequate assessment of clinical adverse events and that this method needs to be supplemented with other more systematic forms of data collection. Structured record review, carried out by clinicians, provides an important component of an integrated approach to identifying risk in the context of developing a safety and quality improvement programme. PMID:17301203

  12. Rejection of Multivariate Outliers.

    DTIC Science & Technology

    1983-05-01

    available in Gnanadesikan (1977). The motivation for the present investigation lies in a recent paper of Schwager and Margolin (1982), who derive a… Gnanadesikan, R. (1977). Methods for Statistical Data Analysis of Multivariate Observations. Wiley, New York. Hawkins, D.M. (1980). Identification of

  13. The time course of symbolic number adaptation: oscillatory EEG activity and event-related potential analysis.

    PubMed

    Hsu, Yi-Fang; Szűcs, Dénes

    2012-02-15

    Several functional magnetic resonance imaging (fMRI) studies have used neural adaptation paradigms to detect anatomical locations of brain activity related to number processing. However, currently not much is known about the temporal structure of number adaptation. In the present study, we used electroencephalography (EEG) to elucidate the time course of neural events in symbolic number adaptation. The numerical distance of deviants relative to standards was manipulated. In order to avoid perceptual confounds, all levels of deviants consisted of perceptually identical stimuli. Multiple successive numerical distance effects were detected in event-related potentials (ERPs). Analysis of oscillatory activity further showed at least two distinct stages of neural processes involved in the automatic analysis of numerical magnitude, with the earlier effect emerging at around 200 ms and the later effect appearing at around 400 ms. The findings support the hypothesis that numerical magnitude processing involves a succession of cognitive events. Crown Copyright © 2011. Published by Elsevier Inc. All rights reserved.

  14. Compensator improvement for multivariable control systems

    NASA Technical Reports Server (NTRS)

    Mitchell, J. R.; Mcdaniel, W. L., Jr.; Gresham, L. L.

    1977-01-01

    A theory and the associated numerical technique are developed for an iterative design improvement of the compensation for linear, time-invariant control systems with multiple inputs and multiple outputs. A strict constraint algorithm is used in obtaining a solution of the specified constraints of the control design. The result of the research effort is the multiple input, multiple output Compensator Improvement Program (CIP). The objective of the Compensator Improvement Program is to modify in an iterative manner the free parameters of the dynamic compensation matrix so that the system satisfies frequency domain specifications. In this exposition, the underlying principles of the multivariable CIP algorithm are presented and the practical utility of the program is illustrated with space vehicle related examples.

  15. An error bound for a discrete reduced order model of a linear multivariable system

    NASA Technical Reports Server (NTRS)

    Al-Saggaf, Ubaid M.; Franklin, Gene F.

    1987-01-01

    The design of feasible controllers for high dimension multivariable systems can be greatly aided by a method of model reduction. In order for the design based on the order reduction to include a guarantee of stability, it is sufficient to have a bound on the model error. Previous work has provided such a bound for continuous-time systems for algorithms based on balancing. In this note an L-infinity bound is derived for model error for a method of order reduction of discrete linear multivariable systems based on balancing.

  16. On a Family of Multivariate Modified Humbert Polynomials

    PubMed Central

    Aktaş, Rabia; Erkuş-Duman, Esra

    2013-01-01

    This paper attempts to present a multivariable extension of generalized Humbert polynomials. The results obtained here include various families of multilinear and multilateral generating functions, miscellaneous properties, and also some special cases for these multivariable polynomials. PMID:23935411

  17. Hand-wrist and cervical vertebral maturation indicators: how can these events be used to time Class II treatments?

    PubMed

    Grave, Keith; Townsend, Grant

    2003-11-01

    Ossification events in the hand and wrist and in the cervical vertebrae have been shown to occur at specific times before, during and after the adolescent growth spurt, but there is still debate about the applicability of these findings to the clinical management of Class II cases. The aim of this study was to relate, on an individual basis, cervical vertebral maturation stages and hand-wrist ossification events to the timing of peak statural and mandibular growth in a group of indigenous Australians. Velocity curves for stature and mandibular growth were constructed for 47 boys and 27 girls, and maturation events were then plotted on the curves. For the majority of children, peak velocity in mandibular growth coincided with peak velocity in stature. Particular combinations of hand-wrist and cervical maturation events occurred consistently before, during or after the adolescent growth spurt. Our findings are consistent with those for North American children and we believe that assessment by orthodontists of a combination of hand-wrist and cervical vertebral maturation stages will enhance prediction of the adolescent growth spurt, thereby contributing to a positive, purposeful and more confident approach to the management of Class II cases.

  18. Young Children's Memory for the Times of Personal Past Events

    ERIC Educational Resources Information Center

    Pathman, Thanujeni; Larkina, Marina; Burch, Melissa M.; Bauer, Patricia J.

    2013-01-01

    Remembering the temporal information associated with personal past events is critical for autobiographical memory, yet we know relatively little about the development of this capacity. In the present research, we investigated temporal memory for naturally occurring personal events in 4-, 6-, and 8-year-old children. Parents recorded unique events…

  19. Storm Event Suspended Sediment-Discharge Hysteresis and Controls in Agricultural Watersheds: Implications for Watershed Scale Sediment Management.

    PubMed

    Sherriff, Sophie C; Rowan, John S; Fenton, Owen; Jordan, Philip; Melland, Alice R; Mellander, Per-Erik; hUallacháin, Daire Ó

    2016-02-16

    Within agricultural watersheds suspended sediment-discharge hysteresis during storm events is commonly used to indicate dominant sediment sources and pathways. However, the availability of high-resolution data, qualitative metrics, longevity of records, and simultaneous multiwatershed analyses have limited the efficacy of hysteresis as a sediment management tool. This two-year study utilizes a quantitative hysteresis index from high-resolution suspended sediment and discharge data to assess fluctuations in sediment source location, delivery mechanisms and export efficiency in three intensively farmed watersheds during events over time. Flow-weighted event sediment export was further considered using multivariate techniques to delineate rainfall, stream hydrology, and antecedent moisture controls on sediment origins. Watersheds with low permeability (moderately or poorly drained soils) and good surface hydrological connectivity therefore had contrasting hysteresis due to source location (hillslope versus channel bank). The well-drained watershed with reduced connectivity exported less sediment but, when watershed connectivity was established, the largest event sediment load of all watersheds occurred. Event sediment export was elevated in arable watersheds when low groundcover was coupled with high connectivity, whereas in the grassland watershed, export was attributed to wetter weather only. Hysteresis analysis successfully indicated contrasting seasonality, connectivity and source availability and is a useful tool to identify watershed-specific sediment management practices.

  20. Finding the multipath propagation of multivariable crude oil prices using a wavelet-based network approach

    NASA Astrophysics Data System (ADS)

    Jia, Xiaoliang; An, Haizhong; Sun, Xiaoqi; Huang, Xuan; Gao, Xiangyun

    2016-04-01

    The globalization and regionalization of crude oil trade inevitably give rise to the difference of crude oil prices. The understanding of the pattern of the crude oil prices' mutual propagation is essential for analyzing the development of global oil trade. Previous research has focused mainly on the fuzzy long- or short-term one-to-one propagation of bivariate oil prices, generally ignoring various patterns of periodical multivariate propagation. This study presents a wavelet-based network approach to help uncover the multipath propagation of multivariable crude oil prices in a joint time-frequency period. The weekly oil spot prices of the OPEC member states from June 1999 to March 2011 are adopted as the sample data. First, we used wavelet analysis to find different subseries based on an optimal decomposing scale to describe the periodical feature of the original oil price time series. Second, a complex network model was constructed based on an optimal threshold selection to describe the structural feature of multivariable oil prices. Third, Bayesian network analysis (BNA) was conducted to find the probability causal relationship based on periodical structural features to describe the various patterns of periodical multivariable propagation. Finally, the significance of the leading and intermediary oil prices is discussed. These findings are beneficial for the implementation of periodical target-oriented pricing policies and investment strategies.
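
    A minimal sketch of the decompose-then-network step described above, assuming weekly spot prices held as NumPy arrays. PyWavelets and networkx are assumed tooling (the record does not name the software used), and the 'db4' wavelet, decomposition level, and 0.6 correlation threshold are illustrative choices only, not the study's settings; the Bayesian network stage is omitted.

```python
# Sketch of the wavelet -> correlation-threshold network idea described above.
# Assumptions: weekly spot prices in a dict of NumPy arrays; PyWavelets and
# networkx as tooling; wavelet, level, and threshold are illustrative only.
import numpy as np
import pywt
import networkx as nx

def band_limited_series(prices, wavelet="db4", level=3):
    """Return the approximation subseries capturing the low-frequency trend."""
    coeffs = pywt.wavedec(prices, wavelet, level=level)
    # Zero out detail coefficients, keep the approximation, then reconstruct.
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(prices)]

def price_network(price_dict, threshold=0.6):
    """Link two markets when their subseries correlation exceeds the threshold."""
    names = list(price_dict)
    subseries = {n: band_limited_series(p) for n, p in price_dict.items()}
    g = nx.Graph()
    g.add_nodes_from(names)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            r = np.corrcoef(subseries[a], subseries[b])[0, 1]
            if abs(r) >= threshold:
                g.add_edge(a, b, weight=r)
    return g

# Toy usage with synthetic weekly prices for three hypothetical markets.
rng = np.random.default_rng(0)
base = np.cumsum(rng.normal(size=620)) + 60.0
prices = {"A": base + rng.normal(scale=2, size=620),
          "B": base + rng.normal(scale=2, size=620),
          "C": np.cumsum(rng.normal(size=620)) + 60.0}
print(price_network(prices).edges(data=True))
```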

  1. From sensation to perception: Using multivariate classification of visual illusions to identify neural correlates of conscious awareness in space and time.

    PubMed

    Hogendoorn, Hinze

    2015-01-01

    An important goal of cognitive neuroscience is understanding the neural underpinnings of conscious awareness. Although the low-level processing of sensory input is well understood in most modalities, it remains a challenge to understand how the brain translates such input into conscious awareness. Here, I argue that the application of multivariate pattern classification techniques to neuroimaging data acquired while observers experience perceptual illusions provides a unique way to dissociate sensory mechanisms from mechanisms underlying conscious awareness. Using this approach, it is possible to directly compare patterns of neural activity that correspond to the contents of awareness, independent from changes in sensory input, and to track these neural representations over time at high temporal resolution. I highlight five recent studies using this approach, and provide practical considerations and limitations for future implementations.
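
    A hedged sketch of time-resolved multivariate pattern classification in the spirit of the approach described above, using scikit-learn. The data are synthetic placeholders shaped (trials, sensors, time points), and the injected "awareness" signal is purely illustrative.

```python
# Time-resolved multivariate pattern classification on synthetic data:
# a separate linear classifier is trained at every time point, giving a
# decoding time course at the temporal resolution of the recording.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_sensors, n_times = 200, 32, 50
X = rng.normal(size=(n_trials, n_sensors, n_times))
y = rng.integers(0, 2, size=n_trials)          # two hypothetical percepts
X[y == 1, :5, 20:30] += 0.5                    # inject a weak, late signal

accuracy = np.array([
    cross_val_score(LogisticRegression(max_iter=1000), X[:, :, t], y, cv=5).mean()
    for t in range(n_times)
])
print("peak decoding accuracy %.2f at time index %d" % (accuracy.max(), accuracy.argmax()))
```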

  2. Multivariate analysis: A statistical approach for computations

    NASA Astrophysics Data System (ADS)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, and more recently in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion about factor analysis (FA) in image retrieval methods and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks, such as DDoS attacks and network scanning.

  3. Arterial stiffness and cardiovascular events: the Framingham Heart Study.

    PubMed

    Mitchell, Gary F; Hwang, Shih-Jen; Vasan, Ramachandran S; Larson, Martin G; Pencina, Michael J; Hamburg, Naomi M; Vita, Joseph A; Levy, Daniel; Benjamin, Emelia J

    2010-02-02

    Various measures of arterial stiffness and wave reflection have been proposed as cardiovascular risk markers. Prior studies have not assessed relations of a comprehensive panel of stiffness measures to prognosis in the community. We used proportional hazards models to analyze first-onset major cardiovascular disease events (myocardial infarction, unstable angina, heart failure, or stroke) in relation to arterial stiffness (pulse wave velocity [PWV]), wave reflection (augmentation index, carotid-brachial pressure amplification), and central pulse pressure in 2232 participants (mean age, 63 years; 58% women) in the Framingham Heart Study. During median follow-up of 7.8 (range, 0.2 to 8.9) years, 151 of 2232 participants (6.8%) experienced an event. In multivariable models adjusted for age, sex, systolic blood pressure, use of antihypertensive therapy, total and high-density lipoprotein cholesterol concentrations, smoking, and presence of diabetes mellitus, higher aortic PWV was associated with a 48% increase in cardiovascular disease risk (95% confidence interval, 1.16 to 1.91 per SD; P=0.002). After PWV was added to a standard risk factor model, integrated discrimination improvement was 0.7% (95% confidence interval, 0.05% to 1.3%; P<0.05). In contrast, augmentation index, central pulse pressure, and pulse pressure amplification were not related to cardiovascular disease outcomes in multivariable models. Higher aortic stiffness assessed by PWV is associated with increased risk for a first cardiovascular event. Aortic PWV improves risk prediction when added to standard risk factors and may represent a valuable biomarker of cardiovascular disease risk in the community.
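
    A minimal sketch of the kind of proportional hazards analysis summarized above, using the lifelines package on simulated data. The variable names, effect sizes, and 8-year follow-up window are assumptions for illustration, not the Framingham data or results.

```python
# Cox proportional hazards sketch: event times are simulated so that the
# hazard rises with age and with standardized pulse wave velocity (pwv_sd),
# then a Cox model recovers the adjusted hazard ratios.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "age": rng.normal(63, 10, n),
    "sbp": rng.normal(130, 15, n),
    "pwv_sd": rng.normal(0, 1, n),            # aortic PWV, standardized (illustrative)
})
hazard = np.exp(0.03 * (df.age - 63) + 0.4 * df.pwv_sd)
df["time"] = rng.exponential(8.0 / hazard)
df["event"] = (df.time < 8.0).astype(int)     # administrative censoring at 8 years
df["time"] = df.time.clip(upper=8.0)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()                           # hazard ratio per SD of PWV, adjusted for covariates
```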

  4. Multivariate Regression Analysis of Winter Ozone Events in the Uinta Basin of Eastern Utah, USA

    NASA Astrophysics Data System (ADS)

    Mansfield, M. L.

    2012-12-01

    I report on a regression analysis of a number of variables that are involved in the formation of winter ozone in the Uinta Basin of Eastern Utah. One goal of the analysis is to develop a mathematical model capable of predicting the daily maximum ozone concentration from values of a number of independent variables. The dependent variable is the daily maximum ozone concentration at a particular site in the basin. Independent variables are (1) daily lapse rate, (2) daily "basin temperature" (defined below), (3) snow cover, (4) midday solar zenith angle, (5) monthly oil production, (6) monthly gas production, and (7) the number of days since the beginning of a multi-day inversion event. Daily maximum temperature and daily snow cover data are available at ten or fifteen different sites throughout the basin. The daily lapse rate is defined operationally as the slope of the linear least-squares fit to the temperature-altitude plot, and the "basin temperature" is defined as the value assumed by the same least-squares line at an altitude of 1400 m. A multi-day inversion event is defined as a set of consecutive days for which the lapse rate remains positive. The standard deviation in the accuracy of the model is about 10 ppb. The model has been combined with historical climate and oil & gas production data to estimate historical ozone levels.
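
    A small sketch of the operational definitions given above, assuming daily station temperatures and altitudes are available as NumPy arrays: the lapse rate is the least-squares slope of temperature against altitude, and the "basin temperature" is that fitted line evaluated at 1400 m. The station values below are made-up placeholders.

```python
# Lapse rate and "basin temperature" from a linear least-squares fit of
# temperature against station altitude; a positive slope marks an inversion day.
import numpy as np

altitude_m = np.array([1340, 1420, 1500, 1580, 1650, 1720, 1800, 1900, 2000, 2100])
temp_c     = np.array([-8.0, -7.1, -6.0, -5.2, -4.0, -3.5, -2.1, -1.0,  0.2,  1.5])

slope, intercept = np.polyfit(altitude_m, temp_c, 1)   # degrees C per metre
lapse_rate = slope
basin_temperature = slope * 1400.0 + intercept          # fitted line at 1400 m

# Consecutive days with a positive lapse rate form a multi-day inversion event.
print(f"lapse rate = {lapse_rate * 1000:.2f} C/km, basin T = {basin_temperature:.1f} C")
print("inversion day" if lapse_rate > 0 else "no inversion")
```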

  5. A mixed-effects regression model for longitudinal multivariate ordinal data.

    PubMed

    Liu, Li C; Hedeker, Donald

    2006-03-01

    A mixed-effects item response theory model that allows for three-level multivariate ordinal outcomes and accommodates multiple random subject effects is proposed for analysis of multivariate ordinal outcomes in longitudinal studies. This model allows for the estimation of different item factor loadings (item discrimination parameters) for the multiple outcomes. The covariates in the model do not have to follow the proportional odds assumption and can be at any level. Assuming either a probit or logistic response function, maximum marginal likelihood estimation is proposed utilizing multidimensional Gauss-Hermite quadrature for integration of the random effects. An iterative Fisher scoring solution, which provides standard errors for all model parameters, is used. An analysis of a longitudinal substance use data set, where four items of substance use behavior (cigarette use, alcohol use, marijuana use, and getting drunk or high) are repeatedly measured over time, is used to illustrate application of the proposed model.

  6. A storm time, Pc 5 event observed in the outer magnetosphere by ISEE 1 and 2 - Wave properties

    NASA Technical Reports Server (NTRS)

    Greenstadt, E. W.; Scarf, F. L.; Mcpherron, R. L.; Anderson, R. R.

    1986-01-01

    The properties of the waves composing a classical storm time Pc 5 event, recorded by the satellite pair ISEE 1,2 during an inbound nearly equatorial pass in the dusk sector on August 21-22, 1978, are described. On the basis of these observations it is concluded that the events of the August 21-22 pass resulted from a combination of sources, namely, distant wideband excitation and ion drift instability, plus a coupling of wave modes. It is suggested that the observed phenomenon was a radial cross section of the type of event reported by Barfield et al. (1972).

  7. The UCSD Time-dependent Tomography and IPS use for Exploring Space Weather Events

    NASA Astrophysics Data System (ADS)

    Yu, H. S.; Jackson, B. V.; Buffington, A.; Hick, P. P.; Tokumaru, M.; Odstrcil, D.; Kim, J.; Yun, J.

    2016-12-01

    The University of California, San Diego (UCSD) time-dependent, iterative, kinematic reconstruction technique has been used and expanded upon for over two decades. It provides some of the most-accurate predictions and three-dimensional (3D) analyses of heliospheric solar-wind parameters now available using interplanetary scintillation (IPS) data. The parameters provided include reconstructions of velocity, density, and three-component magnetic fields. Precise time-dependent results are now obtained at any solar distance in the inner heliosphere using ISEE (formerly STELab), Japan, IPS data sets, and can be used to drive 3D-MHD models including ENLIL. Using IPS data, these reconstructions provide a real-time prediction of the global solar wind parameters across the whole heliosphere with a time cadence of about one day (see http://ips.ucsd.edu). Here we compare the results (such as density, velocity, and magnetic fields) from the IPS tomography with different in-situ measurements and discuss several specific space weather events that demonstrate the issues resulting from these analyses.

  8. A Multivariate Multilevel Approach to the Modeling of Accuracy and Speed of Test Takers

    ERIC Educational Resources Information Center

    Klein Entink, R. H.; Fox, J. P.; van der Linden, W. J.

    2009-01-01

    Response times on test items are easily collected in modern computerized testing. When collecting both (binary) responses and (continuous) response times on test items, it is possible to measure the accuracy and speed of test takers. To study the relationships between these two constructs, the model is extended with a multivariate multilevel…

  9. A Deep Learning Architecture for Temporal Sleep Stage Classification Using Multivariate and Multimodal Time Series.

    PubMed

    Chambon, Stanislas; Galtier, Mathieu N; Arnal, Pierrick J; Wainrib, Gilles; Gramfort, Alexandre

    2018-04-01

    Sleep stage classification constitutes an important preliminary exam in the diagnosis of sleep disorders. It is traditionally performed by a sleep expert who assigns a sleep stage to each 30 s of signal, based on the visual inspection of signals such as electroencephalograms (EEGs), electrooculograms (EOGs), electrocardiograms, and electromyograms (EMGs). We introduce here the first deep learning approach for sleep stage classification that learns end-to-end without computing spectrograms or extracting handcrafted features, that exploits all multivariate and multimodal polysomnography (PSG) signals (EEG, EMG, and EOG), and that can exploit the temporal context of each 30-s window of data. For each modality, the first layer learns linear spatial filters that exploit the array of sensors to increase the signal-to-noise ratio, and the last layer feeds the learnt representation to a softmax classifier. Our model is compared to alternative automatic approaches based on convolutional networks or decision trees. Results obtained on 61 publicly available PSG records with up to 20 EEG channels demonstrate that our network architecture yields state-of-the-art performance. Our study reveals a number of insights on the spatiotemporal distribution of the signal of interest: a good tradeoff for optimal classification performance measured with balanced accuracy is to use 6 EEG channels with 2 EOG (left and right) and 3 EMG chin channels. Also, exploiting 1 min of data before and after each data segment offers the strongest improvement when a limited number of channels are available. Like sleep experts, our system exploits the multivariate and multimodal nature of PSG signals in order to deliver state-of-the-art classification performance with a small computational cost.
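
    A minimal PyTorch sketch of the two architectural ideas named in the abstract: a first layer of learned linear spatial filters across the sensor array, and a final softmax classifier over sleep stages. Channel counts, filter sizes, and the temporal feature extractor are placeholders, not the published architecture.

```python
# Tiny sleep-stager sketch: linear spatial filtering (1x1 convolution across
# channels), a small temporal feature extractor, and a linear output layer whose
# logits feed a softmax (applied inside the usual cross-entropy loss).
import torch
import torch.nn as nn

class TinySleepStager(nn.Module):
    def __init__(self, n_channels=8, n_spatial=4, n_stages=5):
        super().__init__()
        # Learn linear spatial filters that mix raw channels into virtual channels.
        self.spatial = nn.Conv1d(n_channels, n_spatial, kernel_size=1, bias=False)
        self.temporal = nn.Sequential(
            nn.Conv1d(n_spatial, 16, kernel_size=64, stride=8),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(8),
        )
        self.classifier = nn.Linear(16 * 8, n_stages)

    def forward(self, x):                 # x: (batch, channels, samples in one 30-s epoch)
        z = self.temporal(self.spatial(x))
        return self.classifier(z.flatten(1))

model = TinySleepStager()
epoch_30s = torch.randn(2, 8, 3000)       # two synthetic 30-s PSG epochs
logits = model(epoch_30s)
print(logits.shape)                       # (2, 5) -> one score per sleep stage
```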

  10. Multivariate Phylogenetic Comparative Methods: Evaluations, Comparisons, and Recommendations.

    PubMed

    Adams, Dean C; Collyer, Michael L

    2018-01-01

    Recent years have seen increased interest in phylogenetic comparative analyses of multivariate data sets, but to date the varied proposed approaches have not been extensively examined. Here we review the mathematical properties required of any multivariate method, and specifically evaluate existing multivariate phylogenetic comparative methods in this context. Phylogenetic comparative methods based on the full multivariate likelihood are robust to levels of covariation among trait dimensions and are insensitive to the orientation of the data set, but display increasing model misspecification as the number of trait dimensions increases. This is because the expected evolutionary covariance matrix (V) used in the likelihood calculations becomes more ill-conditioned as trait dimensionality increases, and as evolutionary models become more complex. Thus, these approaches are only appropriate for data sets with few traits and many species. Methods that summarize patterns across trait dimensions treated separately (e.g., SURFACE) incorrectly assume independence among trait dimensions, resulting in nearly a 100% model misspecification rate. Methods using pairwise composite likelihood are highly sensitive to levels of trait covariation, the orientation of the data set, and the number of trait dimensions. The consequences of these debilitating deficiencies are that a user can arrive at differing statistical conclusions, and therefore biological inferences, simply from a dataspace rotation, like principal component analysis. By contrast, algebraic generalizations of the standard phylogenetic comparative toolkit that use the trace of covariance matrices are insensitive to levels of trait covariation, the number of trait dimensions, and the orientation of the data set. Further, when appropriate permutation tests are used, these approaches display acceptable Type I error and statistical power. We conclude that methods summarizing information across trait dimensions, as well as
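
    A small numerical illustration of the ill-conditioning argument above: as the number of trait dimensions approaches the number of species, the estimated trait covariance matrix becomes nearly singular. This uses synthetic data and is not a re-run of the paper's simulations.

```python
# Condition number of the sample trait covariance matrix as the number of
# trait dimensions p approaches the number of species.
import numpy as np

rng = np.random.default_rng(3)
n_species = 50
for p in (2, 10, 25, 45, 49):
    traits = rng.normal(size=(n_species, p))
    cov = np.cov(traits, rowvar=False)
    print(f"{p:2d} traits: condition number = {np.linalg.cond(cov):.1e}")
# The condition number grows by orders of magnitude as p nears n_species,
# which is what degrades full-likelihood methods in high trait dimensions.
```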

  11. Simulating Multivariate Nonnormal Data Using an Iterative Algorithm

    ERIC Educational Resources Information Center

    Ruscio, John; Kaczetow, Walter

    2008-01-01

    Simulating multivariate nonnormal data with specified correlation matrices is difficult. One especially popular method is Vale and Maurelli's (1983) extension of Fleishman's (1978) polynomial transformation technique to multivariate applications. This requires the specification of distributional moments and the calculation of an intermediate…
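
    A sketch of the general Vale and Maurelli (1983) idea referenced above: draw correlated standard normals, then push each variable through a Fleishman (1978) cubic polynomial to induce skew and kurtosis. The coefficients shown are illustrative, and the intermediate-correlation adjustment that the full method requires is omitted here.

```python
# Correlated normals via a Cholesky factor, then a Fleishman-style cubic
# transform y = a + b*z + c*z**2 + d*z**3 per variable to make it nonnormal.
import numpy as np

rng = np.random.default_rng(4)
target_corr = np.array([[1.0, 0.5],
                        [0.5, 1.0]])
z = rng.normal(size=(10_000, 2)) @ np.linalg.cholesky(target_corr).T

# One illustrative Fleishman coefficient set per variable: (a, b, c, d).
fleishman = [(-0.2, 0.9, 0.2, 0.03),   # skewed variable
             (0.0, 1.0, 0.0, 0.0)]     # left normal
x = np.column_stack([a + b * z[:, j] + c * z[:, j]**2 + d * z[:, j]**3
                     for j, (a, b, c, d) in enumerate(fleishman)])

# Without the intermediate-correlation correction, the result is close to,
# but not exactly, the target correlation matrix.
print(np.corrcoef(x, rowvar=False))
```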

  12. Short sleep duration as an independent predictor of cardiovascular events in Japanese patients with hypertension.

    PubMed

    Eguchi, Kazuo; Pickering, Thomas G; Schwartz, Joseph E; Hoshide, Satoshi; Ishikawa, Joji; Ishikawa, Shizukiyo; Shimada, Kazuyuki; Kario, Kazuomi

    2008-11-10

    It is not known whether short duration of sleep is a predictor of future cardiovascular events in patients with hypertension. To test the hypothesis that short duration of sleep is independently associated with incident cardiovascular diseases (CVD), we performed ambulatory blood pressure (BP) monitoring in 1255 subjects with hypertension (mean [SD] age, 70.4 [9.9] years) and followed them for a mean period of 50 (23) months. Short sleep duration was defined as less than 7.5 hours (20th percentile). Multivariable Cox hazard models predicting CVD events were used to estimate the adjusted hazard ratio and 95% confidence interval (CI) for short sleep duration. A riser pattern was defined when mean nighttime systolic BP exceeded daytime systolic BP. The end point was a cardiovascular event: stroke, fatal or nonfatal myocardial infarction (MI), and sudden cardiac death. In multivariable analyses, short duration of sleep (<7.5 hours) was associated with incident CVD (hazard ratio [HR], 1.68; 95% CI, 1.06-2.66; P = .03). A synergistic interaction was observed between short sleep duration and the riser pattern (P = .09). When subjects were classified according to their sleep time and a riser vs nonriser pattern, the group with shorter sleep duration plus the riser pattern had a substantially and significantly higher incidence of CVD than the group with predominant normal sleep duration plus the nonriser pattern (HR, 4.43; 95% CI, 2.09-9.39; P < .001), independent of covariates. Short duration of sleep is associated with incident CVD risk, and the combination of the riser pattern and short duration of sleep is most strongly predictive of future CVD, independent of ambulatory BP levels. Physicians should inquire about sleep duration in the risk assessment of patients with hypertension.

  13. The Sensitivity of Adverse Event Cost Estimates to Diagnostic Coding Error

    PubMed Central

    Wardle, Gavin; Wodchis, Walter P; Laporte, Audrey; Anderson, Geoffrey M; Baker, Ross G

    2012-01-01

    Objective To examine the impact of diagnostic coding error on estimates of hospital costs attributable to adverse events. Data Sources Original and reabstracted medical records of 9,670 complex medical and surgical admissions at 11 hospital corporations in Ontario from 2002 to 2004. Patient specific costs, not including physician payments, were retrieved from the Ontario Case Costing Initiative database. Study Design Adverse events were identified among the original and reabstracted records using ICD10-CA (Canadian adaptation of ICD10) codes flagged as postadmission complications. Propensity score matching and multivariate regression analysis were used to estimate the cost of the adverse events and to determine the sensitivity of cost estimates to diagnostic coding error. Principal Findings Estimates of the cost of the adverse events ranged from $16,008 (metabolic derangement) to $30,176 (upper gastrointestinal bleeding). Coding errors caused the total cost attributable to the adverse events to be underestimated by 16 percent. The impact of coding error on adverse event cost estimates was highly variable at the organizational level. Conclusions Estimates of adverse event costs are highly sensitive to coding error. Adverse event costs may be significantly underestimated if the likelihood of error is ignored. PMID:22091908
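
    A hedged sketch of the propensity-score matching step described above, on synthetic admissions: fit a logistic model for the probability of an adverse event given case-mix covariates, match each adverse-event admission to the closest non-event admission on the score, and compare costs. Variable names and cost figures are placeholders, not values from the study.

```python
# Propensity-score matching sketch: logistic propensity model, nearest-score
# matching, and a matched comparison of mean costs.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 5000
age = rng.normal(65, 12, n)
comorbidity = rng.poisson(2, n)
X = np.column_stack([age, comorbidity])
adverse = rng.binomial(1, 1 / (1 + np.exp(-(-6 + 0.05 * age + 0.3 * comorbidity))))
cost = 12_000 + 150 * age + 900 * comorbidity + 20_000 * adverse + rng.normal(0, 3000, n)

ps = LogisticRegression(max_iter=1000).fit(X, adverse).predict_proba(X)[:, 1]
treated = np.flatnonzero(adverse == 1)
control = np.flatnonzero(adverse == 0)
# For each adverse-event admission, take the control admission with the nearest score.
matches = control[np.abs(ps[control][None, :] - ps[treated][:, None]).argmin(axis=1)]
print("matched cost difference: $%.0f" % (cost[treated] - cost[matches]).mean())
```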

  14. Hour Glass Half-Full or Half-Empty? Future Time Perspective and Preoccupation with Negative Events Across the Life Span

    PubMed Central

    Strough, JoNell; de Bruin, Wändi Bruine; Parker, Andrew M.; Lemaster, Philip; Pichayayothin, Nipat; Delaney, Rebecca

    2016-01-01

    According to socioemotional selectivity theory, older adults' emotional well-being stems from having limited future time perspective that motivates them to maximize well-being in the “here and now.” Presumably, then, older adults' time horizons are associated with emotional competencies that boost positive affect and dampen negative affect, but little research has addressed this. Using a US national adult life-span sample (N= 3,933, 18-93 yrs), we found that a two-factor model of future time perspective (focus on future opportunities; focus on limited time) fit the data better than a one-factor model. Through middle age, people perceived the life-span hourglass as half full—they focused more on future opportunities than limited time. Around age 60, the balance changed to increasingly perceiving the life-span hourglass as half empty—they focused less on future opportunities and more on limited time. This pattern held even after accounting for perceived health, self-reported decision-making ability, and retirement status. At all ages, women's time horizons focused more on future opportunities compared to men's, and men's focused more on limited time. Focusing on future opportunities was associated with reporting less preoccupation with negative events, whereas focusing on limited time was associated with reporting more preoccupation. Older adults reported less preoccupation with negative events and this association was stronger after controlling for their perceptions of limited time and fewer future opportunities, suggesting that other pathways may explain older adults' reports of their ability to disengage from negative events. Insights gained and questions raised by measuring future time perspective as two dimensions are discussed. PMID:27267222

  15. Hour glass half full or half empty? Future time perspective and preoccupation with negative events across the life span.

    PubMed

    Strough, JoNell; Bruine de Bruin, Wändi; Parker, Andrew M; Lemaster, Philip; Pichayayothin, Nipat; Delaney, Rebecca

    2016-09-01

    According to socioemotional selectivity theory, older adults' emotional well-being stems from having a limited future time perspective that motivates them to maximize well-being in the "here and now." Presumably, then, older adults' time horizons are associated with emotional competencies that boost positive affect and dampen negative affect, but little research has addressed this. Using a U.S. adult life-span sample (N = 3,933; 18-93 years), we found that a 2-factor model of future time perspective (future opportunities; limited time) fit the data better than a 1-factor model. Through middle age, people perceived the life-span hourglass as half full-they focused more on future opportunities than limited time. Around Age 60, the balance changed to increasingly perceiving the life-span hourglass as half empty-they focused less on future opportunities and more on limited time, even after accounting for perceived health, self-reported decision-making ability, and retirement status. At all ages, women's time horizons focused more on future opportunities compared with men's, and men's focused more on limited time. Focusing on future opportunities was associated with reporting less preoccupation with negative events, whereas focusing on limited time was associated with reporting more preoccupation. Older adults reported less preoccupation with negative events, and this association was stronger after controlling for their perceptions of limited time and fewer future opportunities, suggesting that other pathways may explain older adults' reports of their ability to disengage from negative events. Insights gained and questions raised by measuring future time perspective as 2 dimensions are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  16. Individual Change and the Timing and Onset of Important Life Events: Methods, Models, and Assumptions

    ERIC Educational Resources Information Center

    Grimm, Kevin; Marcoulides, Katerina

    2016-01-01

    Researchers are often interested in studying how the timing of a specific event affects concurrent and future development. When faced with such research questions there are multiple statistical models to consider and those models are the focus of this paper as well as their theoretical underpinnings and assumptions regarding the nature of the…

  17. Investigating College and Graduate Students' Multivariable Reasoning in Computational Modeling

    ERIC Educational Resources Information Center

    Wu, Hsin-Kai; Wu, Pai-Hsing; Zhang, Wen-Xin; Hsu, Ying-Shao

    2013-01-01

    Drawing upon the literature in computational modeling, multivariable reasoning, and causal attribution, this study aims at characterizing multivariable reasoning practices in computational modeling and revealing the nature of understanding about multivariable causality. We recruited two freshmen, two sophomores, two juniors, two seniors, four…

  18. Large Time Projection Chambers for Rare Event Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heffner, M

    The Time Projection Chamber (TPC) concept [add ref to TPC section] has been applied to many projects outside of particle physics and the accelerator-based experiments where it was initially developed. TPCs in non-accelerator particle physics experiments are principally focused on rare event detection (e.g. neutrino and dark matter experiments), and the physics of these experiments can place dramatically different constraints on the TPC design (only extensions to the traditional TPCs are discussed here). The drift gas, or liquid, is usually the target or matter under observation, and due to very low signal rates a TPC with the largest active mass is desired. The large mass complicates particle tracking of short and sometimes very low energy particles. Other special design issues include efficient light collection, background rejection, internal triggering and optimal energy resolution. Backgrounds from gamma-rays and neutrons are significant design issues in the construction of these TPCs. They are generally placed deep underground to shield from cosmogenic particles and surrounded with shielding to reduce radiation from the local surroundings. The construction materials have to be carefully screened for radiopurity as they are in close contact with the active mass and can be a significant source of background events. The TPC excels in reducing this internal background because the mass inside the field cage forms one monolithic volume from which fiducial cuts can be made ex post facto to isolate quiet drift mass, and can be circulated and purified to a very high level. Self-shielding in these large mass systems can be significant and the effect improves with density. The liquid phase TPC can obtain a high density at low pressure, which results in very good self-shielding and compact installation with a lightweight containment. The downsides are the need for cryogenics, slower charge drift, tracks shorter than the typical electron diffusion, lower energy

  19. Visual search of cyclic spatio-temporal events

    NASA Astrophysics Data System (ADS)

    Gautier, Jacques; Davoine, Paule-Annick; Cunty, Claire

    2018-05-01

    The analysis of spatio-temporal events, and especially of relationships between their different dimensions (space, time, thematic attributes), can be done with geovisualization interfaces. But few geovisualization tools integrate the cyclic dimension of spatio-temporal event series (natural or social events). Time Coil and Time Wave diagrams represent both linear time and cyclic time. By introducing a cyclic temporal scale, these diagrams may highlight the cyclic characteristics of spatio-temporal events. However, the settable cyclic temporal scales are limited to usual durations such as days or months. Because of that, these diagrams cannot be used to visualize cyclic events that reappear with an unusual period, and they do not support a visual search for cyclic events. Nor do they make it possible to identify relationships between the cyclic behavior of events and their spatial features, and more especially to identify localised cyclic events. The lack of possibilities to represent cyclic time outside the temporal diagram of multi-view geovisualization interfaces limits the analysis of relationships between the cyclic reappearance of events and their other dimensions. In this paper, we propose a method and a geovisualization tool, based on an extension of Time Coil and Time Wave, to provide a visual search for cyclic events by allowing any possible duration to be set as the diagram's cyclic temporal scale. We also propose a symbology approach to push the representation of cyclic time into the map, in order to improve the analysis of relationships between space and the cyclic behavior of events.

  20. A new multivariate zero-adjusted Poisson model with applications to biomedicine.

    PubMed

    Liu, Yin; Tian, Guo-Liang; Tang, Man-Lai; Yuen, Kam Chuen

    2018-05-25

    Recently, although advances were made on modeling multivariate count data, existing models have several limitations: (i) the multivariate Poisson log-normal model (Aitchison and Ho, ) cannot be used to fit multivariate count data with excess zero-vectors; (ii) the multivariate zero-inflated Poisson (ZIP) distribution (Li et al., 1999) cannot be used to model zero-truncated/deflated count data and is difficult to apply to high-dimensional cases; (iii) the Type I multivariate zero-adjusted Poisson (ZAP) distribution (Tian et al., 2017) can only model multivariate count data with a special correlation structure for random components that are all positive or negative. In this paper, we first introduce a new multivariate ZAP distribution, based on a multivariate Poisson distribution, which allows the correlations between components to follow a more flexible dependency structure, that is, some of the correlation coefficients can be positive while others are negative. We then develop its important distributional properties, and provide efficient statistical inference methods for the multivariate ZAP model with or without covariates. Two real data examples in biomedicine are used to illustrate the proposed methods. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
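
    A univariate simplification of the zero-adjusted Poisson building block discussed above, assuming the usual hurdle-style formulation: a free parameter for the probability of a zero (allowing both zero inflation and zero deflation) combined with a zero-truncated Poisson for positive counts. The multivariate construction and covariate models in the paper are not reproduced here.

```python
# Univariate zero-adjusted (hurdle) Poisson pmf: P(Y=0) = phi, and positive
# counts follow a zero-truncated Poisson scaled by (1 - phi).
import numpy as np
from scipy.stats import poisson

def zap_pmf(y, phi, lam):
    """P(Y=y) under a zero-adjusted Poisson with zero-probability phi."""
    y = np.asarray(y)
    positive = (1 - phi) * poisson.pmf(y, lam) / (1 - np.exp(-lam))
    return np.where(y == 0, phi, positive)

ys = np.arange(0, 50)
for phi in (0.40, 0.05):             # inflated vs. deflated relative to Poisson(2)
    pmf = zap_pmf(ys, phi=phi, lam=2.0)
    print(f"phi={phi}: P(Y=0)={pmf[0]:.2f}, total mass={pmf.sum():.4f}")
```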