Science.gov

Sample records for multivariate event time

  1. Multivariate Time Series Similarity Searching

    PubMed Central

    Wang, Jimin; Zhu, Yuelong; Li, Shijin; Wan, Dingsheng; Zhang, Pengcheng

    2014-01-01

    Multivariate time series (MTS) datasets are very common in various financial, multimedia, and hydrological fields. In this paper, a dimension-combination method is proposed to search for similar sequences in MTS. Firstly, the similarity of each single-dimension series is calculated; then the overall similarity of the MTS is obtained by synthesizing the single-dimension similarities using a weighted BORDA voting method. The dimension-combination method can reuse existing similarity searching methods. Several experiments, which used classification accuracy as a measure, were performed on six datasets from the UCI KDD Archive to validate the method. The results show the advantage of the approach over traditional similarity measures, such as Euclidean distance (ED), dynamic time warping (DTW), point distribution (PD), PCA similarity factor (SPCA), and extended Frobenius norm (Eros), for MTS datasets in some respects. Our experiments also demonstrate that no single measure fits all datasets, and the proposed measure is a suitable choice for similarity searches. PMID:24895665
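
    A minimal sketch of the dimension-combination idea described above: rank candidate multivariate time series per dimension with an ordinary single-dimension distance (Euclidean here, as one of the measures the abstract allows), then fuse the per-dimension rankings with a weighted Borda count. The weights, data shapes and the distance choice are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def borda_similarity_search(query, candidates, weights=None):
    """query: (T, d) array; candidates: list of (T, d) arrays.
    Returns candidate indices sorted from most to least similar."""
    n, d = len(candidates), query.shape[1]
    if weights is None:
        weights = np.ones(d) / d          # equal dimension weights (assumption)
    scores = np.zeros(n)
    for j in range(d):
        # single-dimension distances for dimension j
        dists = np.array([np.linalg.norm(query[:, j] - c[:, j]) for c in candidates])
        order = np.argsort(dists)          # best (smallest distance) first
        # Borda points: best candidate gets n-1 points, worst gets 0
        points = np.empty(n)
        points[order] = np.arange(n - 1, -1, -1)
        scores += weights[j] * points
    return np.argsort(-scores)             # highest fused score first

rng = np.random.default_rng(0)
db = [rng.standard_normal((50, 3)) for _ in range(10)]
q = db[4] + 0.05 * rng.standard_normal((50, 3))   # noisy copy of candidate 4
print(borda_similarity_search(q, db)[:3])          # candidate 4 should rank first
```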

  2. Multivariate time series similarity searching.

    PubMed

    Wang, Jimin; Zhu, Yuelong; Li, Shijin; Wan, Dingsheng; Zhang, Pengcheng

    2014-01-01

    Multivariate time series (MTS) datasets are very common in various financial, multimedia, and hydrological fields. In this paper, a dimension-combination method is proposed to search for similar sequences in MTS. Firstly, the similarity of each single-dimension series is calculated; then the overall similarity of the MTS is obtained by synthesizing the single-dimension similarities using a weighted BORDA voting method. The dimension-combination method can reuse existing similarity searching methods. Several experiments, which used classification accuracy as a measure, were performed on six datasets from the UCI KDD Archive to validate the method. The results show the advantage of the approach over traditional similarity measures, such as Euclidean distance (ED), dynamic time warping (DTW), point distribution (PD), PCA similarity factor (SPCA), and extended Frobenius norm (Eros), for MTS datasets in some respects. Our experiments also demonstrate that no single measure fits all datasets, and the proposed measure is a suitable choice for similarity searches. PMID:24895665

  3. Multivariate cluster analysis of forest fire events in Portugal

    NASA Astrophysics Data System (ADS)

    Tonini, Marj; Pereira, Mario; Vega Orozco, Carmen; Parente, Joana

    2015-04-01

    Portugal is one of the major fire-prone European countries, mainly due to its favourable climatic, topographic and vegetation conditions. Compared to the other Mediterranean countries, the number of events registered here from 1980 up to the present is the highest; likewise, with respect to the burnt area, Portugal is the third most affected country. Portuguese mapped burnt areas are available from the website of the Institute for the Conservation of Nature and Forests (ICNF). This official geodatabase is the result of satellite measurements starting from the year 1990. The spatial information, delivered in shapefile format, provides a detailed description of the shape and the size of the area burnt by each fire, while the date/time information related to the fire ignition is restricted to the year of occurrence. In terms of statistical formalism, wildfires can be associated with a stochastic point process, where events are analysed as a set of geographical coordinates corresponding, for example, to the centroid of each burnt area. Analysing the spatio-temporal pattern of stochastic point processes, including cluster analysis, is a basic procedure to discover predisposing factors, as well as for prevention and forecasting purposes. These kinds of studies are primarily focused on investigating the spatial cluster behaviour of environmental data sequences and/or mapping their distribution at different times. To include both dimensions (space and time), a comprehensive spatio-temporal analysis is needed. In the present study the authors attempt to verify if, in the case of wildfires in Portugal, space and time act independently or if, conversely, neighbouring events are also closer in time. We present an application of the spatio-temporal K-function to a long dataset (1990-2012) of mapped burnt areas. Moreover, the multivariate K-function allowed checking for a possibly different distribution between small and large fires. The final objective is to elaborate a 3D

  4. Multivariate Statistical Modelling of Drought and Heat Wave Events

    NASA Astrophysics Data System (ADS)

    Manning, Colin; Widmann, Martin; Vrac, Mathieu; Maraun, Douglas; Bevaqua, Emanuele

    2016-04-01

    Multivariate Statistical Modelling of Drought and Heat Wave Events C. Manning1,2, M. Widmann1, M. Vrac2, D. Maraun3, E. Bevaqua2,3 1. School of Geography, Earth and Environmental Sciences, University of Birmingham, Edgbaston, Birmingham, UK 2. Laboratoire des Sciences du Climat et de l'Environnement, (LSCE-IPSL), Centre d'Etudes de Saclay, Gif-sur-Yvette, France 3. Wegener Center for Climate and Global Change, University of Graz, Brandhofgasse 5, 8010 Graz, Austria Compound extreme events are a combination of two or more contributing events which in themselves may not be extreme but through their joint occurrence produce an extreme impact. Compound events are noted in the latest IPCC report as an important type of extreme event that has been given little attention so far. As part of the CE:LLO project (Compound Events: muLtivariate statisticaL mOdelling) we are developing a multivariate statistical model to gain an understanding of the dependence structure of certain compound events. One focus of this project is on the interaction between drought and heat wave events. Soil moisture has both a local and non-local effect on the occurrence of heat waves, where it strongly controls the latent heat flux, affecting the transfer of sensible heat to the atmosphere. These processes can create a feedback whereby a heat wave may be amplified or suppressed by the soil moisture preconditioning, and vice versa, the heat wave may in turn have an effect on soil conditions. An aim of this project is to capture this dependence in order to correctly describe the joint probabilities of these conditions and the resulting probability of their compound impact. We will show an application of Pair Copula Constructions (PCCs) to study the aforementioned compound event. PCCs allow, in theory, for the formulation of multivariate dependence structures in any dimension, where the PCC is a decomposition of a multivariate distribution into a product of bivariate components modelled using copulas. A
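
    A minimal sketch of the pair-copula idea for two variables, where a PCC reduces to a single bivariate copula. A Gaussian copula fitted by Kendall's tau inversion is assumed purely for illustration; the synthetic "soil-moisture deficit" and "temperature anomaly" data, the marginals and the copula family are hypothetical, not the project's model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# synthetic dependent pair standing in for soil-moisture deficit and temperature
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=2000)
deficit, temp = stats.norm.cdf(z[:, 0]), stats.norm.cdf(z[:, 1])

# 1) empirical marginals -> pseudo-observations on [0, 1]
u = stats.rankdata(deficit) / (len(deficit) + 1)
v = stats.rankdata(temp) / (len(temp) + 1)

# 2) fit a Gaussian copula via Kendall's tau inversion
tau, _ = stats.kendalltau(u, v)
rho = np.sin(np.pi * tau / 2)
cop = stats.multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]])

def joint_exceedance(pu, pv):
    """P(U > pu, V > pv) under the fitted Gaussian copula."""
    c = cop.cdf([stats.norm.ppf(pu), stats.norm.ppf(pv)])
    return 1 - pu - pv + c

# compound probability of both variables exceeding their 90th percentiles
print(joint_exceedance(0.9, 0.9), "vs independence:", 0.1 * 0.1)
```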

  5. Network structure of multivariate time series.

    PubMed

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-10-21

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exist, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows one to extract information on a high dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow one to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
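
    A minimal sketch of mapping each component of a multivariate series to a horizontal visibility graph (HVG) layer and reading off simple layer statistics. It illustrates the multilayer construction only; the descriptors analysed in the paper (e.g. interlayer mutual information) are not reproduced, and the toy data are an assumption.

```python
import numpy as np

def hvg_adjacency(x):
    """Horizontal visibility graph of a scalar series x (O(n^2) reference code)."""
    n = len(x)
    a = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(i + 1, n):
            # i and j see each other if every intermediate value lies below both
            if np.all(x[i + 1:j] < min(x[i], x[j])):
                a[i, j] = a[j, i] = True
    return a

rng = np.random.default_rng(2)
mts = rng.standard_normal((300, 4))               # 4-channel toy series
layers = [hvg_adjacency(mts[:, m]) for m in range(mts.shape[1])]

mean_degree = [a.sum(axis=1).mean() for a in layers]
# average edge overlap between pairs of layers (a crude multiplex descriptor)
overlap = np.mean([(layers[i] & layers[j]).sum() / (layers[i] | layers[j]).sum()
                   for i in range(4) for j in range(i + 1, 4)])
print("mean degree per layer:", np.round(mean_degree, 2), "edge overlap:", round(overlap, 3))
```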

  6. Network structure of multivariate time series

    NASA Astrophysics Data System (ADS)

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-10-01

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exist, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows one to extract information on a high dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow one to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.

  7. Network structure of multivariate time series

    PubMed Central

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-01-01

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exist, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows one to extract information on a high dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow one to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail. PMID:26487040

  8. Time varying, multivariate volume data reduction

    SciTech Connect

    Ahrens, James P; Fout, Nathaniel; Ma, Kwan-Liu

    2010-01-01

    Large-scale supercomputing is revolutionizing the way science is conducted. A growing challenge, however, is understanding the massive quantities of data produced by large-scale simulations. The data, typically time-varying, multivariate, and volumetric, can occupy from hundreds of gigabytes to several terabytes of storage space. Transferring and processing volume data of such sizes is prohibitively expensive and resource intensive. Although it may not be possible to entirely alleviate these problems, data compression should be considered as part of a viable solution, especially when the primary means of data analysis is volume rendering. In this paper we present our study of multivariate compression, which exploits correlations among related variables, for volume rendering. Two configurations for multidimensional compression based on vector quantization are examined. We emphasize quality reconstruction and interactive rendering, which leads us to a solution using graphics hardware to perform on-the-fly decompression during rendering. In this paper we present a solution which addresses the need for data reduction in large supercomputing environments where data resulting from simulations occupies tremendous amounts of storage. Our solution employs a lossy encoding scheme to achieve data reduction with several options in terms of rate-distortion behavior. We focus on encoding of multiple variables together, with optional compression in space and time. The compressed volumes can be rendered directly with commodity graphics cards at interactive frame rates and rendering quality similar to that of static volume renderers. Compression results using a multivariate time-varying data set indicate that encoding multiple variables results in acceptable performance in the case of spatial and temporal encoding as compared to independent compression of variables. The relative performance of spatial vs. temporal compression is data dependent, although temporal compression has the
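
    A minimal sketch of lossy multivariate compression with vector quantization: voxels carrying several variables are clustered into a small codebook and stored as codebook indices. The data shapes, codebook size and use of SciPy's k-means are illustrative assumptions, not the paper's GPU decompression pipeline.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(3)
# toy time-varying multivariate volume: (time, x, y, z, variables)
vol = rng.standard_normal((4, 16, 16, 16, 3)).astype(np.float32)

vectors = vol.reshape(-1, vol.shape[-1])          # each voxel/time step -> 3-vector
codebook, labels = kmeans2(vectors, 64)           # 64-entry codebook (lossy)

# "decompress" by looking indices back up in the codebook
recon = codebook[labels].reshape(vol.shape)

orig_bytes = vectors.nbytes
comp_bytes = codebook.nbytes + labels.astype(np.uint8).nbytes
rmse = np.sqrt(np.mean((vol - recon) ** 2))
print(f"compression ratio ~{orig_bytes / comp_bytes:.1f}x, RMSE {rmse:.3f}")
```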

  9. Inferring phase equations from multivariate time series.

    PubMed

    Tokuda, Isao T; Jain, Swati; Kiss, István Z; Hudson, John L

    2007-08-10

    An approach is presented for extracting phase equations from multivariate time series data recorded from a network of weakly coupled limit cycle oscillators. Our aim is to estimate important properties of the phase equations including natural frequencies and interaction functions between the oscillators. Our approach requires the measurement of an experimental observable of the oscillators; in contrast with previous methods it does not require measurements in isolated single or two-oscillator setups. This noninvasive technique can be advantageous in biological systems, where extraction of few oscillators may be a difficult task. The method is most efficient when data are taken from the nonsynchronized regime. Applicability to experimental systems is demonstrated by using a network of electrochemical oscillators; the obtained phase model is utilized to predict the synchronization diagram of the system.
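
    A minimal sketch of fitting a phase model from multivariate oscillator observations: protophases are extracted with the Hilbert transform and a first-Fourier-mode interaction function is fitted by least squares, dphi_i/dt = omega_i + sum_j [a_ij sin(phi_j - phi_i) + b_ij cos(phi_j - phi_i)]. The synthetic Kuramoto-style data and the truncation to one Fourier mode are assumptions, not the authors' full procedure.

```python
import numpy as np
from scipy.signal import hilbert

dt, n, steps = 0.01, 3, 20000
rng = np.random.default_rng(4)
omega_true = np.array([1.0, 1.2, 0.9])
K = 0.1 * np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)

# simulate weakly coupled phase oscillators, observe x = cos(phi) + noise
phi = rng.uniform(0, 2 * np.pi, n)
obs = np.empty((steps, n))
for t in range(steps):
    obs[t] = np.cos(phi) + 0.01 * rng.standard_normal(n)
    phi = phi + dt * (omega_true + (K * np.sin(phi[None, :] - phi[:, None])).sum(1))

# recover protophases from the scalar observable
phase = np.unwrap(np.angle(hilbert(obs, axis=0)), axis=0)
dphase = np.gradient(phase, dt, axis=0)

for i in range(n):
    # regressors: constant plus sin/cos of pairwise phase differences
    cols = [np.ones(steps)]
    for j in range(n):
        if j != i:
            cols += [np.sin(phase[:, j] - phase[:, i]), np.cos(phase[:, j] - phase[:, i])]
    coef, *_ = np.linalg.lstsq(np.column_stack(cols), dphase[:, i], rcond=None)
    print(f"oscillator {i}: omega_hat={coef[0]:.3f} (true {omega_true[i]})")
```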

  10. Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models

    ERIC Educational Resources Information Center

    Price, Larry R.

    2012-01-01

    The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…

  11. Interpretable Early Classification of Multivariate Time Series

    ERIC Educational Resources Information Center

    Ghalwash, Mohamed F.

    2013-01-01

    Recent advances in technology have led to an explosion in data collection over time rather than in a single snapshot. For example, microarray technology allows us to measure gene expression levels in different conditions over time. Such temporal data grants the opportunity for data miners to develop algorithms to address domain-related problems,…

  12. Integrating unseen events over time.

    PubMed

    Reber, Thomas P; Henke, Katharina

    2012-06-01

    Events often share elements that guide us to integrate knowledge from these events. Integration allows us to make inferences that affect reactions to new events. Integrating events and making inferences are thought to depend on consciousness. We show that even unconsciously experienced events that share elements are integrated and influence reactions to new events. An unconscious event consisted of the subliminal presentation of two unrelated words. Half of the subliminal word pairs shared one word ('winter red', 'red computer'). Overlapping word pairs were presented between 6 s and 78 s apart. The test for integration required participants to judge the semantic distance between suprathreshold words ('winter computer'). Evidence of integration was provided by faster reactions to suprathreshold words that were indirectly related versus unrelated. This effect was independent of the time interval between overlapping word pairs. We conclude that consciousness is no requirement for the integration of discontiguous events.

  13. Regularly timed events amid chaos

    NASA Astrophysics Data System (ADS)

    Blakely, Jonathan N.; Cooper, Roy M.; Corron, Ned J.

    2015-11-01

    We show rigorously that the solutions of a class of chaotic oscillators are characterized by regularly timed events in which the derivative of the solution is instantaneously zero. The perfect regularity of these events is in stark contrast with the well-known unpredictability of chaos. We explore some consequences of these regularly timed events through experiments using chaotic electronic circuits. First, we show that a feedback loop can be implemented to phase lock the regularly timed events to a periodic external signal. In this arrangement the external signal regulates the timing of the chaotic signal but does not strictly lock its phase. That is, phase slips of the chaotic oscillation persist without disturbing timing of the regular events. Second, we couple the regularly timed events of one chaotic oscillator to those of another. A state of synchronization is observed where the oscillators exhibit synchronized regular events while their chaotic amplitudes and phases evolve independently. Finally, we add additional coupling to synchronize the amplitudes as well, though in the opposite direction, illustrating the independence of the amplitudes from the regularly timed events.

  14. Causality networks from multivariate time series and application to epilepsy.

    PubMed

    Siggiridou, Elsa; Koutlis, Christos; Tsimpiris, Alkiviadis; Kimiskidis, Vasilios K; Kugiumtzis, Dimitris

    2015-08-01

    Granger causality and variants of this concept allow the study of complex dynamical systems as networks constructed from multivariate time series. In this work, a large number of Granger causality measures used to form causality networks from multivariate time series are assessed. For this, realizations on high dimensional coupled dynamical systems are considered and the performance of the Granger causality measures is evaluated, seeking for the measures that form networks closest to the true network of the dynamical system. In particular, the comparison focuses on Granger causality measures that reduce the state space dimension when many variables are observed. Further, the linear and nonlinear Granger causality measures of dimension reduction are compared to a standard Granger causality measure on electroencephalographic (EEG) recordings containing episodes of epileptiform discharges.
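
    A minimal sketch of building a causality network from multivariate time series: fit a vector autoregression and keep a directed edge j -> i when the Granger-causality test is significant. The toy data, lag order and 0.05 threshold are illustrative assumptions; the dimension-reduced measures compared in the paper are not reproduced here.

```python
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(5)
T, k = 1000, 4
x = np.zeros((T, k))
for t in range(1, T):                      # ground truth: 0 -> 1 -> 2, 3 isolated
    x[t, 0] = 0.5 * x[t-1, 0] + rng.standard_normal()
    x[t, 1] = 0.5 * x[t-1, 1] + 0.4 * x[t-1, 0] + rng.standard_normal()
    x[t, 2] = 0.5 * x[t-1, 2] + 0.4 * x[t-1, 1] + rng.standard_normal()
    x[t, 3] = 0.5 * x[t-1, 3] + rng.standard_normal()

res = VAR(x).fit(maxlags=2)
adj = np.zeros((k, k), dtype=int)
for i in range(k):
    for j in range(k):
        if i != j:
            test = res.test_causality(caused=i, causing=j, kind='f')
            adj[j, i] = int(test.pvalue < 0.05)   # directed edge j -> i
print(adj)   # rows: source variable, columns: target variable
```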

  15. Multitask Gaussian processes for multivariate physiological time-series analysis.

    PubMed

    Dürichen, Robert; Pimentel, Marco A F; Clifton, Lei; Schweikard, Achim; Clifton, David A

    2015-01-01

    Gaussian process (GP) models are a flexible means of performing nonparametric Bayesian regression. However, GP models in healthcare are often only used to model a single univariate output time series, denoted as single-task GPs (STGP). Due to an increasing prevalence of sensors in healthcare settings, there is an urgent need for robust multivariate time-series tools. Here, we propose a method using multitask GPs (MTGPs) which can model multiple correlated multivariate physiological time series simultaneously. The flexible MTGP framework can learn the correlation between multiple signals even though they might be sampled at different frequencies and have training sets available for different intervals. Furthermore, prior knowledge of any relationship between the time series such as delays and temporal behavior can be easily integrated. A novel normalization is proposed to allow interpretation of the various hyperparameters used in the MTGP. We investigate MTGPs for physiological monitoring with synthetic data sets and two real-world problems from the field of patient monitoring and radiotherapy. The results are compared with standard Gaussian processes and other existing methods in the respective biomedical application areas. In both cases, we show that our framework learned the correlation between physiological time series efficiently, outperforming the existing state of the art.

  16. The LCLS Timing Event System

    SciTech Connect

    Dusatko, John; Allison, S.; Browne, M.; Krejcik, P.; /SLAC

    2012-07-23

    The Linac Coherent Light Source requires precision timing trigger signals for various accelerator diagnostics and controls at the SLAC National Accelerator Laboratory. A new timing system has been developed that meets these requirements. This system is based on COTS hardware with a mixture of custom-designed units. An added challenge has been the requirement that the LCLS Timing System must co-exist and 'know' about the existing SLC Timing System. This paper describes the architecture, construction and performance of the LCLS timing event system.

  17. Impact of dose intensity of ponatinib on selected adverse events: Multivariate analyses from a pooled population of clinical trial patients.

    PubMed

    Dorer, David J; Knickerbocker, Ronald K; Baccarani, Michele; Cortes, Jorge E; Hochhaus, Andreas; Talpaz, Moshe; Haluska, Frank G

    2016-09-01

    Ponatinib is approved for adults with refractory chronic myeloid leukemia or Philadelphia chromosome-positive acute lymphoblastic leukemia, including those with the T315I BCR-ABL1 mutation. We pooled data from 3 clinical trials (N=671) to determine the impact of ponatinib dose intensity on the following adverse events: arterial occlusive events (cardiovascular, cerebrovascular, and peripheral vascular events), venous thromboembolic events, cardiac failure, thrombocytopenia, neutropenia, hypertension, pancreatitis, increased lipase, increased alanine aminotransferase, increased aspartate aminotransferase, rash, arthralgia, and hypertriglyceridemia. Multivariate analyses allowed adjustment for covariates potentially related to changes in dosing or an event. Logistic regression analysis identified significant associations between dose intensity and most events after adjusting for covariates. Pancreatitis, rash, and cardiac failure had the strongest associations with dose intensity (odds ratios >2). Time-to-event analyses showed significant associations between dose intensity and risk of arterial occlusive events and each subcategory. Further, these analyses suggested that a lag exists between a change in dose and the resulting change in event risk. No significant association between dose intensity and risk of venous thromboembolic events was evident. Collectively, these findings suggest a potential causal relationship between ponatinib dose and certain adverse events and support prospective investigations of approaches to lower average ponatinib dose intensity. PMID:27505637
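
    A minimal, entirely synthetic sketch of the kind of covariate-adjusted logistic regression described above: the binary outcome is occurrence of an adverse event and the exposure of interest is average dose intensity, with age and baseline platelet count as stand-in covariates. All variable names, effect sizes and data are hypothetical, not trial data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 671
dose = rng.uniform(15, 45, n)                 # mg/day average dose intensity
age = rng.normal(55, 12, n)
platelets = rng.normal(250, 60, n)

# hypothetical data-generating model for the adverse event indicator
logit_p = -6 + 0.12 * dose + 0.02 * (age - 55) - 0.002 * (platelets - 250)
event = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([dose, age, platelets]))
fit = sm.Logit(event, X).fit(disp=False)
odds_ratio_per_15mg = np.exp(15 * fit.params[1])   # OR for a 15 mg/day increase
print(fit.summary(xname=["const", "dose", "age", "platelets"]))
print("OR per 15 mg/day:", round(odds_ratio_per_15mg, 2))
```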

  18. Impact of dose intensity of ponatinib on selected adverse events: Multivariate analyses from a pooled population of clinical trial patients.

    PubMed

    Dorer, David J; Knickerbocker, Ronald K; Baccarani, Michele; Cortes, Jorge E; Hochhaus, Andreas; Talpaz, Moshe; Haluska, Frank G

    2016-09-01

    Ponatinib is approved for adults with refractory chronic myeloid leukemia or Philadelphia chromosome-positive acute lymphoblastic leukemia, including those with the T315I BCR-ABL1 mutation. We pooled data from 3 clinical trials (N=671) to determine the impact of ponatinib dose intensity on the following adverse events: arterial occlusive events (cardiovascular, cerebrovascular, and peripheral vascular events), venous thromboembolic events, cardiac failure, thrombocytopenia, neutropenia, hypertension, pancreatitis, increased lipase, increased alanine aminotransferase, increased aspartate aminotransferase, rash, arthralgia, and hypertriglyceridemia. Multivariate analyses allowed adjustment for covariates potentially related to changes in dosing or an event. Logistic regression analysis identified significant associations between dose intensity and most events after adjusting for covariates. Pancreatitis, rash, and cardiac failure had the strongest associations with dose intensity (odds ratios >2). Time-to-event analyses showed significant associations between dose intensity and risk of arterial occlusive events and each subcategory. Further, these analyses suggested that a lag exists between a change in dose and the resulting change in event risk. No significant association between dose intensity and risk of venous thromboembolic events was evident. Collectively, these findings suggest a potential causal relationship between ponatinib dose and certain adverse events and support prospective investigations of approaches to lower average ponatinib dose intensity.

  19. Fast and Flexible Multivariate Time Series Subsequence Search

    NASA Technical Reports Server (NTRS)

    Bhaduri, Kanishka; Oza, Nikunj C.; Zhu, Qiang; Srivastava, Ashok N.

    2010-01-01

    Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical monitoring, and financial systems. Domain experts are often interested in searching for interesting multivariate patterns from these MTS databases which often contain several gigabytes of data. Surprisingly, research on MTS search is very limited. Most of the existing work only supports queries with the same length of data, or queries on a fixed set of variables. In this paper, we propose an efficient and flexible subsequence search framework for massive MTS databases that, for the first time, enables querying on any subset of variables with arbitrary time delays between them. We propose two algorithms to solve this problem (1) a List Based Search (LBS) algorithm which uses sorted lists for indexing, and (2) an R*-tree Based Search (RBS) which uses Minimum Bounding Rectangles (MBR) to organize the subsequences. Both algorithms guarantee that all matching patterns within the specified thresholds will be returned (no false dismissals). The very few false alarms can be removed by a post-processing step. Since our framework is also capable of Univariate Time-Series (UTS) subsequence search, we first demonstrate the efficiency of our algorithms on several UTS datasets previously used in the literature. We follow this up with experiments using two large MTS databases from the aviation domain, each containing several millions of observations. Both these tests show that our algorithms have very high prune rates (>99%) thus needing actual disk access for less than 1% of the observations. To the best of our knowledge, MTS subsequence search has never been attempted on datasets of the size we have used in this paper.
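
    A minimal brute-force reference for the query semantics described above: find subsequences of a long multivariate series that match a query defined on an arbitrary subset of variables, within a distance threshold. The indexing structures (LBS, RBS) that make this fast are not reproduced; the toy data and threshold are assumptions.

```python
import numpy as np

def subsequence_search(db, query, var_subset, threshold):
    """db: (N, D) series; query: (w, len(var_subset)); returns matching offsets."""
    w = query.shape[0]
    hits = []
    for start in range(db.shape[0] - w + 1):
        window = db[start:start + w, var_subset]
        if np.linalg.norm(window - query) <= threshold:
            hits.append(start)
    return hits

rng = np.random.default_rng(7)
db = rng.standard_normal((5000, 6))            # long 6-variable series
true_start, subset = 1234, [1, 4]              # query uses variables 1 and 4 only
query = db[true_start:true_start + 40, subset] + 0.01 * rng.standard_normal((40, 2))
print(subsequence_search(db, query, subset, threshold=0.5))   # should contain 1234
```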

  20. F100 multivariable control synthesis program: Evaluation of a multivariable control using a real-time engine simulation

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Soeder, J. F.; Seldner, K.; Cwynar, D. S.

    1977-01-01

    The design, evaluation, and testing of a practical, multivariable, linear quadratic regulator control for the F100 turbofan engine were accomplished. NASA evaluation of the multivariable control logic and implementation are covered. The evaluation utilized a real time, hybrid computer simulation of the engine. Results of the evaluation are presented, and recommendations concerning future engine testing of the control are made. Results indicated that the engine testing of the control should be conducted as planned.
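
    A minimal sketch of the linear quadratic regulator machinery behind such a multivariable control: solve the continuous-time algebraic Riccati equation for a small linearized plant and form the state-feedback gain K = R^-1 B^T P. The toy A, B, Q, R matrices are illustrative assumptions, not the F100 engine model.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[-1.0, 0.5],
              [0.2, -2.0]])       # hypothetical linearized plant dynamics
B = np.array([[1.0, 0.0],
              [0.0, 1.0]])        # two actuators (e.g. fuel flow, nozzle area)
Q = np.diag([10.0, 1.0])          # state weighting
R = np.diag([1.0, 1.0])           # control weighting

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)   # optimal state feedback u = -K x
print("LQR gain K:\n", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```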

  1. Optimizing functional network representation of multivariate time series.

    PubMed

    Zanin, Massimiliano; Sousa, Pedro; Papo, David; Bajo, Ricardo; García-Prieto, Juan; del Pozo, Francisco; Menasalvas, Ernestina; Boccaletti, Stefano

    2012-01-01

    By combining complex network theory and data mining techniques, we provide objective criteria for optimization of the functional network representation of generic multivariate time series. In particular, we propose a method for the principled selection of the threshold value for functional network reconstruction from raw data, and for proper identification of the network's indicators that unveil the most discriminative information on the system for classification purposes. We illustrate our method by analysing networks of functional brain activity of healthy subjects, and patients suffering from Mild Cognitive Impairment, an intermediate stage between the expected cognitive decline of normal aging and the more pronounced decline of dementia. We discuss extensions of the scope of the proposed methodology to network engineering purposes, and to other data mining tasks.
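
    A minimal sketch of the reconstruction step behind functional networks: compute pairwise correlations between channels, threshold them, and extract simple graph indicators. The principled, classification-driven threshold selection proposed in the paper is not reproduced; the fixed threshold sweep and the synthetic signals below are assumptions.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(8)
T, channels = 500, 12
common = rng.standard_normal((T, 1))
signals = 0.6 * common + rng.standard_normal((T, channels))  # weakly coupled channels

corr = np.abs(np.corrcoef(signals.T))
np.fill_diagonal(corr, 0.0)

for thr in (0.2, 0.4, 0.6):
    g = nx.from_numpy_array((corr >= thr).astype(int))   # binary functional network
    density = nx.density(g)
    clustering = nx.average_clustering(g)
    print(f"threshold {thr}: density {density:.2f}, avg clustering {clustering:.2f}")
```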

  2. Optimizing Functional Network Representation of Multivariate Time Series

    NASA Astrophysics Data System (ADS)

    Zanin, Massimiliano; Sousa, Pedro; Papo, David; Bajo, Ricardo; García-Prieto, Juan; Pozo, Francisco Del; Menasalvas, Ernestina; Boccaletti, Stefano

    2012-09-01

    By combining complex network theory and data mining techniques, we provide objective criteria for optimization of the functional network representation of generic multivariate time series. In particular, we propose a method for the principled selection of the threshold value for functional network reconstruction from raw data, and for proper identification of the network's indicators that unveil the most discriminative information on the system for classification purposes. We illustrate our method by analysing networks of functional brain activity of healthy subjects, and patients suffering from Mild Cognitive Impairment, an intermediate stage between the expected cognitive decline of normal aging and the more pronounced decline of dementia. We discuss extensions of the scope of the proposed methodology to network engineering purposes, and to other data mining tasks.

  3. Evaluating multivariate visualizations on time-varying data

    NASA Astrophysics Data System (ADS)

    Livingston, Mark A.; Decker, Jonathan W.; Ai, Zhuming

    2013-01-01

    Multivariate visualization techniques have been applied to a wide variety of visual analysis tasks and a broad range of data types and sources. Their utility has been evaluated in a modest range of simple analysis tasks. In this work, we extend our previous task to a case of time-varying data. We implemented five visualizations of our synthetic test data: three previously evaluated techniques (Data-driven Spots, Oriented Slivers, and Attribute Blocks), one hybrid of the first two that we call Oriented Data-driven Spots, and an implementation of Attribute Blocks that merges the temporal slices. We conducted a user study of these five techniques. Our previous finding (with static data) was that users performed best when the density of the target (as encoded in the visualization) was either highest or had the highest ratio to non-target features. The time-varying presentations gave us a wider range of density and density gains from which to draw conclusions; we now see evidence for the density gain as the perceptual measure, rather than the absolute density.

  4. Network inference with confidence from multivariate time series.

    PubMed

    Kramer, Mark A; Eden, Uri T; Cash, Sydney S; Kolaczyk, Eric D

    2009-06-01

    Networks--collections of interacting elements or nodes--abound in the natural and manmade worlds. For many networks, complex spatiotemporal dynamics stem from patterns of physical interactions unknown to us. To infer these interactions, it is common to include edges between those nodes whose time series exhibit sufficient functional connectivity, typically defined as a measure of coupling exceeding a predetermined threshold. However, when uncertainty exists in the original network measurements, uncertainty in the inferred network is likely, and hence a statistical propagation of error is needed. In this manuscript, we describe a principled and systematic procedure for the inference of functional connectivity networks from multivariate time series data. Our procedure yields as output both the inferred network and a quantification of uncertainty of the most fundamental interest: uncertainty in the number of edges. To illustrate this approach, we apply a measure of linear coupling to simulated data and electrocorticogram data recorded from a human subject during an epileptic seizure. We demonstrate that the procedure is accurate and robust in both the determination of edges and the reporting of uncertainty associated with that determination. PMID:19658533
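
    A minimal sketch of propagating measurement uncertainty into an inferred functional network: couple nodes whose correlation exceeds a threshold, and use a moving-block bootstrap over time to obtain a confidence interval for the number of edges, the quantity highlighted above. The coupling measure, threshold and block length are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(9)
T, n, thr, block = 600, 8, 0.3, 50
base = rng.standard_normal((T, 1))
data = 0.7 * base + rng.standard_normal((T, n))    # toy multivariate series

def edge_count(x):
    c = np.abs(np.corrcoef(x.T))
    iu = np.triu_indices(x.shape[1], k=1)
    return int(np.sum(c[iu] >= thr))

counts = []
for _ in range(500):                                # moving-block bootstrap
    starts = rng.integers(0, T - block, size=T // block)
    resampled = np.concatenate([data[s:s + block] for s in starts])
    counts.append(edge_count(resampled))

lo, hi = np.percentile(counts, [2.5, 97.5])
print(f"edges on original data: {edge_count(data)}, 95% interval: [{lo:.0f}, {hi:.0f}]")
```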

  5. Network inference with confidence from multivariate time series

    NASA Astrophysics Data System (ADS)

    Kramer, Mark A.; Eden, Uri T.; Cash, Sydney S.; Kolaczyk, Eric D.

    2009-06-01

    Networks—collections of interacting elements or nodes—abound in the natural and manmade worlds. For many networks, complex spatiotemporal dynamics stem from patterns of physical interactions unknown to us. To infer these interactions, it is common to include edges between those nodes whose time series exhibit sufficient functional connectivity, typically defined as a measure of coupling exceeding a predetermined threshold. However, when uncertainty exists in the original network measurements, uncertainty in the inferred network is likely, and hence a statistical propagation of error is needed. In this manuscript, we describe a principled and systematic procedure for the inference of functional connectivity networks from multivariate time series data. Our procedure yields as output both the inferred network and a quantification of uncertainty of the most fundamental interest: uncertainty in the number of edges. To illustrate this approach, we apply a measure of linear coupling to simulated data and electrocorticogram data recorded from a human subject during an epileptic seizure. We demonstrate that the procedure is accurate and robust in both the determination of edges and the reporting of uncertainty associated with that determination.

  6. A statistical approach for segregating cognitive task stages from multivariate fMRI BOLD time series.

    PubMed

    Demanuele, Charmaine; Bähner, Florian; Plichta, Michael M; Kirsch, Peter; Tost, Heike; Meyer-Lindenberg, Andreas; Durstewitz, Daniel

    2015-01-01

    Multivariate pattern analysis can reveal new information from neuroimaging data to illuminate human cognition and its disturbances. Here, we develop a methodological approach, based on multivariate statistical/machine learning and time series analysis, to discern cognitive processing stages from functional magnetic resonance imaging (fMRI) blood oxygenation level dependent (BOLD) time series. We apply this method to data recorded from a group of healthy adults whilst performing a virtual reality version of the delayed win-shift radial arm maze (RAM) task. This task has been frequently used to study working memory and decision making in rodents. Using linear classifiers and multivariate test statistics in conjunction with time series bootstraps, we show that different cognitive stages of the task, as defined by the experimenter, namely, the encoding/retrieval, choice, reward and delay stages, can be statistically discriminated from the BOLD time series in brain areas relevant for decision making and working memory. Discrimination of these task stages was significantly reduced during poor behavioral performance in dorsolateral prefrontal cortex (DLPFC), but not in the primary visual cortex (V1). Experimenter-defined dissection of time series into class labels based on task structure was confirmed by an unsupervised, bottom-up approach based on Hidden Markov Models. Furthermore, we show that different groupings of recorded time points into cognitive event classes can be used to test hypotheses about the specific cognitive role of a given brain region during task execution. We found that whilst the DLPFC strongly differentiated between task stages associated with different memory loads, but not between different visual-spatial aspects, the reverse was true for V1. Our methodology illustrates how different aspects of cognitive information processing during one and the same task can be separated and attributed to specific brain regions based on information contained in

  7. A statistical approach for segregating cognitive task stages from multivariate fMRI BOLD time series

    PubMed Central

    Demanuele, Charmaine; Bähner, Florian; Plichta, Michael M.; Kirsch, Peter; Tost, Heike; Meyer-Lindenberg, Andreas; Durstewitz, Daniel

    2015-01-01

    Multivariate pattern analysis can reveal new information from neuroimaging data to illuminate human cognition and its disturbances. Here, we develop a methodological approach, based on multivariate statistical/machine learning and time series analysis, to discern cognitive processing stages from functional magnetic resonance imaging (fMRI) blood oxygenation level dependent (BOLD) time series. We apply this method to data recorded from a group of healthy adults whilst performing a virtual reality version of the delayed win-shift radial arm maze (RAM) task. This task has been frequently used to study working memory and decision making in rodents. Using linear classifiers and multivariate test statistics in conjunction with time series bootstraps, we show that different cognitive stages of the task, as defined by the experimenter, namely, the encoding/retrieval, choice, reward and delay stages, can be statistically discriminated from the BOLD time series in brain areas relevant for decision making and working memory. Discrimination of these task stages was significantly reduced during poor behavioral performance in dorsolateral prefrontal cortex (DLPFC), but not in the primary visual cortex (V1). Experimenter-defined dissection of time series into class labels based on task structure was confirmed by an unsupervised, bottom-up approach based on Hidden Markov Models. Furthermore, we show that different groupings of recorded time points into cognitive event classes can be used to test hypotheses about the specific cognitive role of a given brain region during task execution. We found that whilst the DLPFC strongly differentiated between task stages associated with different memory loads, but not between different visual-spatial aspects, the reverse was true for V1. Our methodology illustrates how different aspects of cognitive information processing during one and the same task can be separated and attributed to specific brain regions based on information contained in

  8. Multiscale Stochastic Generator of Multivariate Met-Ocean Time Series

    NASA Astrophysics Data System (ADS)

    Guanche, Yanira; Mínguez, Roberto; Méndez, Fernando J.

    2013-04-01

    The design of maritime structures requires information on the sea state conditions that influence their behavior during their life cycle. In the last decades, there has been an increasing development of sea databases (buoys, reanalysis, satellite) that allow an accurate description of the marine climate and its interaction with a given structure in terms of functionality and stability. However, these databases have a limited time length, and their application entails an associated uncertainty. To avoid this limitation, engineers try to sample synthetically generated, statistically consistent time series, which allow the simulation of longer time periods. The present work proposes a hybrid methodology to deal with this issue. It is based on the combination of clustering algorithms (k-means) and an autoregressive logistic regression model (logit). Since the marine climate is directly related to the atmospheric conditions at a synoptic scale, the proposed methodology takes both systems into account, simultaneously generating circulation pattern (weather type) time series and the related sea state time series. The generation of these time series can be summarized in three steps: (1) by applying the k-means clustering technique, the atmospheric conditions are classified into a representative number of synoptic patterns; (2) taking into account the different covariates involved (such as seasonality, interannual variability, trends or an autoregressive term), the autoregressive logistic model is fitted; (3) once the model is able to simulate weather type time series, the last step is to generate multivariate hourly met-ocean parameters related to these weather types. This is done by an autoregressive model (ARMA) for each variable, including cross-correlation between them. To show the goodness of the proposed method the following data have been used: Sea Level Pressure (SLP) databases from NCEP-NCAR and the Global Ocean Wave (GOW) reanalysis from IH Cantabria. The synthetic met-ocean hourly
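
    A minimal sketch of the first step of such a hybrid generator: classify daily synoptic fields into weather types with k-means, then draw sea-state values from type-conditional distributions. The synthetic "SLP" fields, the number of types and the Gaussian per-type sampling are assumptions; the autoregressive logistic and ARMA components of the method are not shown.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(10)
days, grid = 2000, 30
slp = rng.standard_normal((days, grid))            # stand-in daily SLP fields
hs = 1.5 + 0.3 * slp[:, 0] + 0.1 * rng.standard_normal(days)   # linked wave height

k = 6                                               # number of weather types
centroids, wt = kmeans2(slp, k, minit='points')

# per-type conditional statistics of significant wave height
cond = {c: (hs[wt == c].mean(), hs[wt == c].std()) for c in np.unique(wt)}

# simulate a synthetic year: resample weather types, then draw Hs per type
sim_wt = rng.choice(wt, size=365)                   # naive resampling of types
sim_hs = np.array([rng.normal(*cond[c]) for c in sim_wt])
print("simulated Hs mean/std:", sim_hs.mean().round(2), sim_hs.std().round(2))
```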

  9. A Framework and Algorithms for Multivariate Time Series Analytics (MTSA): Learning, Monitoring, and Recommendation

    ERIC Educational Resources Information Center

    Ngan, Chun-Kit

    2013-01-01

    Making decisions over multivariate time series is an important topic which has gained significant interest in the past decade. A time series is a sequence of data points which are measured and ordered over uniform time intervals. A multivariate time series is a set of multiple, related time series in a particular domain in which domain experts…

  10. A Bayesian approach to joint analysis of multivariate longitudinal data and parametric accelerated failure time

    PubMed Central

    Luo, Sheng

    2013-01-01

    Impairment caused by Parkinson’s disease (PD) is multidimensional (e.g., sensoria, functions, and cognition) and progressive. Its multidimensional nature precludes a single outcome to measure disease progression. Clinical trials of PD use multiple categorical and continuous longitudinal outcomes to assess the treatment effects on overall improvement. A terminal event such as death or dropout can stop the follow-up process. Moreover, the time to the terminal event may be dependent on the multivariate longitudinal measurements. In this article, we consider a joint random-effects model for the correlated outcomes. A multilevel item response theory model is used for the multivariate longitudinal outcomes and a parametric accelerated failure time model is used for the failure time because of the violation of proportional hazard assumption. These two models are linked via random effects. The Bayesian inference via MCMC is implemented in the ‘BUGS’ language. Our proposed method is evaluated by a simulation study and is applied to the DATATOP study, a motivating clinical trial to determine if deprenyl slows the progression of PD. PMID:24009073

  11. A Bayesian approach to joint analysis of multivariate longitudinal data and parametric accelerated failure time.

    PubMed

    Luo, Sheng

    2014-02-20

    Impairment caused by Parkinson's disease (PD) is multidimensional (e.g., sensoria, functions, and cognition) and progressive. Its multidimensional nature precludes a single outcome to measure disease progression. Clinical trials of PD use multiple categorical and continuous longitudinal outcomes to assess the treatment effects on overall improvement. A terminal event such as death or dropout can stop the follow-up process. Moreover, the time to the terminal event may be dependent on the multivariate longitudinal measurements. In this article, we consider a joint random-effects model for the correlated outcomes. A multilevel item response theory model is used for the multivariate longitudinal outcomes and a parametric accelerated failure time model is used for the failure time because of the violation of proportional hazard assumption. These two models are linked via random effects. The Bayesian inference via MCMC is implemented in the 'BUGS' language. Our proposed method is evaluated by a simulation study and is applied to the DATATOP study, a motivating clinical trial to determine if deprenyl slows the progression of PD. PMID:24009073

  12. Nonparametric Bayesian Segmentation of a Multivariate Inhomogeneous Space-Time Poisson Process.

    PubMed

    Ding, Mingtao; He, Lihan; Dunson, David; Carin, Lawrence

    2012-12-01

    A nonparametric Bayesian model is proposed for segmenting time-evolving multivariate spatial point process data. An inhomogeneous Poisson process is assumed, with a logistic stick-breaking process (LSBP) used to encourage piecewise-constant spatial Poisson intensities. The LSBP explicitly favors spatially contiguous segments, and infers the number of segments based on the observed data. The temporal dynamics of the segmentation and of the Poisson intensities are modeled with exponential correlation in time, implemented in the form of a first-order autoregressive model for uniformly sampled discrete data, and via a Gaussian process with an exponential kernel for general temporal sampling. We consider and compare two different inference techniques: a Markov chain Monte Carlo sampler, which has relatively high computational complexity; and an approximate and efficient variational Bayesian analysis. The model is demonstrated with a simulated example and a real example of space-time crime events in Cincinnati, Ohio, USA. PMID:23741284

  13. A climate-based multivariate extreme emulator of met-ocean-hydrological events for coastal flooding

    NASA Astrophysics Data System (ADS)

    Camus, Paula; Rueda, Ana; Mendez, Fernando J.; Tomas, Antonio; Del Jesus, Manuel; Losada, Iñigo J.

    2015-04-01

    Atmosphere-ocean general circulation models (AOGCMs) are useful for analyzing large-scale climate variability (long-term historical periods, future climate projections). However, applications such as coastal flood modeling require climate information at a finer scale. Moreover, flooding events depend on multiple climate conditions: waves, surge levels from the open ocean and river discharge caused by precipitation. Therefore, a multivariate statistical downscaling approach is adopted to reproduce the relationships between the variables and because of its low computational cost. The proposed method can be considered as a hybrid approach which combines a probabilistic weather type downscaling model with a stochastic weather generator component. Predictand distributions are reproduced by modeling their relationship with AOGCM predictors, based on a physical division into weather types (Camus et al., 2012). The multivariate dependence structure of the predictand (extreme events) is introduced by linking the independent marginal distributions of the variables by a probabilistic copula regression (Ben Ayala et al., 2014). This hybrid approach is applied to the downscaling of AOGCM data to daily precipitation, maximum significant wave height and storm surge at different locations along the Spanish coast. Reanalysis data are used to assess the proposed method. A common predictor for the three variables involved is classified using a regression-guided clustering algorithm. The most appropriate statistical model (generalized extreme value distribution, Pareto distribution) for daily conditions is fitted. Stochastic simulation of the present climate is performed, obtaining the set of hydraulic boundary conditions needed for high resolution coastal flood modeling. References: Camus, P., Menéndez, M., Méndez, F.J., Izaguirre, C., Espejo, A., Cánovas, V., Pérez, J., Rueda, A., Losada, I.J., Medina, R. (2014b). A weather-type statistical downscaling framework for ocean wave climate. Journal of

  14. Bayesian inference on risk differences: an application to multivariate meta-analysis of adverse events in clinical trials

    PubMed Central

    Chen, Yong; Luo, Sheng; Chu, Haitao; Wei, Peng

    2013-01-01

    Multivariate meta-analysis is useful in combining evidence from independent studies which involve several comparisons among groups based on a single outcome. For binary outcomes, the commonly used statistical models for multivariate meta-analysis are multivariate generalized linear mixed effects models, which assume that the risks, after some transformation, follow a multivariate normal distribution with possible correlations. In this article, we consider an alternative model for multivariate meta-analysis where the risks are modeled by the multivariate beta distribution proposed by Sarmanov (1966). This model has several attractive features compared to the conventional multivariate generalized linear mixed effects models, including simplicity of the likelihood function, no need to specify a link function, and a closed-form expression of the distribution functions for study-specific risk differences. We investigate the finite sample performance of this model by simulation studies and illustrate its use with an application to multivariate meta-analysis of adverse events of tricyclic antidepressant treatment in clinical trials. PMID:23853700

  15. Multivariate space - time analysis of PRE-STORM precipitation

    NASA Technical Reports Server (NTRS)

    Polyak, Ilya; North, Gerald R.; Valdes, Juan B.

    1994-01-01

    This paper presents the methodologies and results of the multivariate modeling and two-dimensional spectral and correlation analysis of PRE-STORM rainfall gauge data. Estimated parameters of the models for the specific spatial averages clearly indicate the eastward and southeastward wave propagation of rainfall fluctuations. A relationship between the coefficients of the diffusion equation and the parameters of the stochastic model of rainfall fluctuations is derived that leads directly to the exclusive use of rainfall data to estimate advection speed (about 12 m/s) as well as other coefficients of the diffusion equation of the corresponding fields. The statistical methodology developed here can be used for confirmation of physical models by comparison of the corresponding second-moment statistics of the observed and simulated data, for generating multiple samples of any size, for solving the inverse problem of the hydrodynamic equations, and for application in some other areas of meteorological and climatological data analysis and modeling.

  16. Multivariate real-time assessment of droughts via copula-based multi-site Hazard Trajectories and Fans

    NASA Astrophysics Data System (ADS)

    Salvadori, G.; De Michele, C.

    2015-07-01

    Droughts, like floods, are among the most dangerous and costly expressions of the water cycle, with huge impacts on society and the built environment. Droughts are events occurring over a certain region, lasting several weeks or months, and involving multiple variables: thus, a multivariate, multi-site approach is most appropriate for their statistical characterization. In this methodological work, hydrological droughts are considered, and a multivariate approach is proposed, regarding the duration and the average intensity as the relevant variables. A multivariate, multi-site frequency analysis is presented, based on the Theory of Copulas and the joint Survival Kendall's Return Periods, by investigating the historical drought episodes that occurred at five main river sections of the Po river (Northern Italy), the most important Italian basin. The tool of Dynamic Return Period is used, and the new concepts of Hazard Trajectories and Fans are introduced, in order to provide useful indications for a valuable multi-site real-time assessment of droughts.

  17. Narrative event boundaries, reading times, and expectation.

    PubMed

    Pettijohn, Kyle A; Radvansky, Gabriel A

    2016-10-01

    During text comprehension, readers create mental representations of the described events, called situation models. When new information is encountered, these models must be updated or new ones created. Consistent with the event indexing model, previous studies have shown that when readers encounter an event shift, reading times often increase. However, such increases are not consistently observed. This paper addresses this inconsistency by examining the extent to which reading-time differences observed at event shifts reflect an unexpectedness in the narrative rather than processes involved in model updating. In two reassessments of prior work, event shifts known to increase reading time were rated as less expected, and expectedness ratings significantly predicted reading time. In three new experiments, participants read stories in which an event shift was or was not foreshadowed, thereby influencing expectedness of the shift. Experiment 1 revealed that readers do not expect event shifts, but foreshadowing eliminates this. Experiment 2 showed that foreshadowing does not affect identification of event shifts. Finally, Experiment 3 found that, although reading times increased when an event shift was not foreshadowed, they were not different from controls when it was. Moreover, responses to memory probes were slower following an event shift regardless of foreshadowing, suggesting that situation model updating had taken place. Overall, the results support the idea that previously observed reading time increases at event shifts reflect, at least in part, a reader's unexpected encounter with a shift rather than an increase in processing effort required to update a situation model. PMID:27170375

  18. Narrative event boundaries, reading times, and expectation.

    PubMed

    Pettijohn, Kyle A; Radvansky, Gabriel A

    2016-10-01

    During text comprehension, readers create mental representations of the described events, called situation models. When new information is encountered, these models must be updated or new ones created. Consistent with the event indexing model, previous studies have shown that when readers encounter an event shift, reading times often increase. However, such increases are not consistently observed. This paper addresses this inconsistency by examining the extent to which reading-time differences observed at event shifts reflect an unexpectedness in the narrative rather than processes involved in model updating. In two reassessments of prior work, event shifts known to increase reading time were rated as less expected, and expectedness ratings significantly predicted reading time. In three new experiments, participants read stories in which an event shift was or was not foreshadowed, thereby influencing expectedness of the shift. Experiment 1 revealed that readers do not expect event shifts, but foreshadowing eliminates this. Experiment 2 showed that foreshadowing does not affect identification of event shifts. Finally, Experiment 3 found that, although reading times increased when an event shift was not foreshadowed, they were not different from controls when it was. Moreover, responses to memory probes were slower following an event shift regardless of foreshadowing, suggesting that situation model updating had taken place. Overall, the results support the idea that previously observed reading time increases at event shifts reflect, at least in part, a reader's unexpected encounter with a shift rather than an increase in processing effort required to update a situation model.

  19. Evaluation of an F100 multivariable control using a real-time engine simulation

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Skira, C.; Soeder, J. F.

    1977-01-01

    A multivariable control design for the F100 turbofan engine was evaluated, as part of the F100 multivariable control synthesis (MVCS) program. The evaluation utilized a real-time, hybrid computer simulation of the engine and a digital computer implementation of the control. Significant results of the evaluation are presented and recommendations concerning future engine testing of the control are made.

  20. When univariate model-free time series prediction is better than multivariate

    NASA Astrophysics Data System (ADS)

    Chayama, Masayoshi; Hirata, Yoshito

    2016-07-01

    The delay coordinate method is known to be a practically useful technique for reconstructing the states of an observed system. While this method is theoretically supported by Takens' embedding theorem concerning observations of a scalar time series, we can extend the method to include a multivariate time series. It is often assumed that a better prediction can be obtained using a multivariate time series than by using a scalar time series. However, a multivariate time series contains various types of information, and it may be difficult to extract the information that is useful for predicting the states. Thus, univariate prediction may sometimes be superior to multivariate prediction. Here, we compare univariate model-free time series predictions with multivariate ones, and demonstrate that univariate model-free prediction is better than the multivariate one when the number of prediction steps is small, while multivariate prediction performs better when the number of prediction steps becomes larger. We show the validity of the former finding by using artificial datasets generated from the Lorenz 96 models and a real solar irradiance dataset. The results indicate that it is possible to determine which method is the best choice by considering how far into the future we want to predict.
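
    A minimal sketch of model-free (analogue, nearest-neighbour) prediction with delay coordinates, contrasting a univariate embedding of one observable with a multivariate embedding that uses all observables. The Lorenz-63 toy system, embedding parameters and single prediction horizon are assumptions made for illustration (the paper uses Lorenz-96 models and a solar irradiance dataset).

```python
import numpy as np

def lorenz63(steps, dt=0.01):
    """Crude Euler integration of the Lorenz-63 system (toy data only)."""
    xyz = np.array([1.0, 1.0, 1.0])
    out = np.empty((steps, 3))
    for t in range(steps):
        x, y, z = xyz
        xyz = xyz + dt * np.array([10 * (y - x), x * (28 - z) - y, x * y - 8 / 3 * z])
        out[t] = xyz
    return out

def knn_forecast(states, targets, t0, horizon, k=5):
    """Analogue forecast of targets[t0 + horizon] from the state at time t0,
    averaging the futures of the k nearest states strictly before t0."""
    d = np.linalg.norm(states[:t0 - horizon] - states[t0], axis=1)
    idx = np.argsort(d)[:k]
    return targets[idx + horizon].mean()

data = lorenz63(6000)[1000:]           # discard transient
x = data[:, 0]
h = 5                                   # prediction horizon (steps)

# univariate delay embedding of x with dimension 3 and delay 2
uni = np.column_stack([x[4:], x[2:-2], x[:-4]])
x_al = x[4:]                            # targets aligned with the embeddings
t0 = 4500                               # forecast origin (held-out point)
truth = x_al[t0 + h]

pred_uni = knn_forecast(uni, x_al, t0, h)         # univariate embedding
pred_multi = knn_forecast(data[4:], x_al, t0, h)  # full multivariate state
print(f"truth {truth:.3f}  univariate {pred_uni:.3f}  multivariate {pred_multi:.3f}")
```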

  1. Piecewise aggregate representations and lower-bound distance functions for multivariate time series

    NASA Astrophysics Data System (ADS)

    Li, Hailin

    2015-06-01

    Dimensionality reduction is one of the most important methods to improve the efficiency of the techniques that are applied to the field of multivariate time series data mining. Due to multivariate time series with the variable-based and time-based dimensions, the reduction techniques must take both of them into consideration. To achieve this goal, we use a center sequence to represent a multivariate time series so that the new sequence can be seen as a univariate time series. Thus two sophisticated piecewise aggregate representations, including piecewise aggregate approximation and symbolization applied to univariate time series, are used to further represent the extended sequence that is derived from the center one. Furthermore, some distance functions are designed to measure the similarity between two representations. Through being proven by some related mathematical analysis, the proposed functions are lower bound on Euclidean distance and dynamic time warping. In this way, false dismissals can be avoided when they are used to index the time series. In addition, multivariate time series with different lengths can be transformed into the extended sequences with equal length, and their corresponding distance functions can measure the similarity between two unequal-length multivariate time series. The experimental results demonstrate that the proposed methods can reduce the dimensionality, and their corresponding distance functions satisfy the lower-bound condition, which can speed up the calculation of similarity search and indexing in the multivariate time series datasets.
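
    The sketch below illustrates the general idea of a piecewise aggregate representation and a lower-bounding distance for a univariate "center" sequence: it uses the classic equal-frame PAA and the standard Keogh-style bound on Euclidean distance. It is only a simplified stand-in for the paper's own multivariate representations and bound proofs.

```python
# Minimal PAA representation and its classic Euclidean-distance lower bound (illustrative only).
import numpy as np

def paa(x, n_segments):
    """Mean of each of n_segments equal-length frames (length must divide evenly here)."""
    x = np.asarray(x, dtype=float)
    return x.reshape(n_segments, -1).mean(axis=1)

def dist_paa(xa, xb, n_original, n_segments):
    """Distance on PAA coefficients that lower-bounds the Euclidean distance of the originals."""
    seg_len = n_original / n_segments
    return np.sqrt(seg_len * np.sum((xa - xb) ** 2))

a = np.sin(np.linspace(0, 4 * np.pi, 128))
b = np.cos(np.linspace(0, 4 * np.pi, 128))
pa, pb = paa(a, 16), paa(b, 16)
print(dist_paa(pa, pb, 128, 16) <= np.linalg.norm(a - b))  # True: the lower bound holds
```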

  2. It's T time: A study on the return period of multivariate problems

    NASA Astrophysics Data System (ADS)

    Michailidi, Eleni Maria; Balistrocchi, Matteo; Bacchi, Baldassare

    2016-04-01

    variables: hydrograph's peak flow, volume and shape. Consequently, a multivariate framework is needed for a more realistic view of the matter at hand. In recent years, the application of copula functions has facilitated overcoming the inadequacies of multivariate distributions as the problem is handled from two non-interwinding aspects: the dependence structure of the pair of variables and the marginal distributions. The main objective of this study is to investigate whether it is possible to find, in a multivariate space, a region where all the multivariate events produce 'risk' lower or greater than a fixed mean inter-occurrence of failures of one time every T-years. Preliminary results seem to confirm that it is impossible to obtain uniqueness in the definition.

  3. A wireless time synchronized event control system

    NASA Astrophysics Data System (ADS)

    Klug, Robert; Williams, Jonathan; Scheffel, Peter

    2014-05-01

    McQ has developed a wireless, time-synchronized, event control system to control, monitor, and record events with precise timing over large test sites for applications such as high speed rocket sled payload testing. Events of interest may include firing rocket motors and launch sleds, initiating flares, ejecting bombs, ejecting seats, triggering high speed cameras, measuring sled velocity, and triggering events based on a velocity window or other criteria. The system consists of Event Controllers, a Launch Controller, and a wireless network. The Event Controllers can be easily deployed at areas of interest within the test site and maintain sub-microsecond timing accuracy for monitoring sensors, electronically triggering other equipment and events, and providing timing signals to other test equipment. Recorded data and status information is reported over the wireless network to a server and user interface. Over the wireless network, the user interface configures the system based on a user specified mission plan and provides real time command, control, and monitoring of the devices and data. An overview of the system, its features, performance, and potential uses is presented.

  4. Granger Causality in Multivariate Time Series Using a Time-Ordered Restricted Vector Autoregressive Model

    NASA Astrophysics Data System (ADS)

    Siggiridou, Elsa; Kugiumtzis, Dimitris

    2016-04-01

    Granger causality has been used for the investigation of the inter-dependence structure of the underlying systems of multi-variate time series. In particular, the direct causal effects are commonly estimated by the conditional Granger causality index (CGCI). In the presence of many observed variables and relatively short time series, CGCI may fail because it is based on vector autoregressive models (VAR) involving a large number of coefficients to be estimated. In this work, the VAR is restricted by a scheme that modifies the recently developed method of backward-in-time selection (BTS) of the lagged variables and the CGCI is combined with BTS. Further, the proposed approach is compared favorably to other restricted VAR representations, such as the top-down strategy, the bottom-up strategy, and the least absolute shrinkage and selection operator (LASSO), in terms of sensitivity and specificity of CGCI. This is shown by using simulations of linear and nonlinear, low and high-dimensional systems and different time series lengths. For nonlinear systems, CGCI from the restricted VAR representations are compared with analogous nonlinear causality indices. Further, CGCI in conjunction with BTS and other restricted VAR representations is applied to multi-channel scalp electroencephalogram (EEG) recordings of epileptic patients containing epileptiform discharges. CGCI on the restricted VAR, and BTS in particular, could track the changes in brain connectivity before, during and after epileptiform discharges, which was not possible using the full VAR representation.
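
    For readers unfamiliar with the index, the sketch below computes a plain conditional Granger causality index (CGCI) from ordinary least-squares fits of a full and a restricted VAR; it does not implement the backward-in-time selection (BTS) restriction scheme proposed in the paper, and the toy system and lag order are illustrative.

```python
# Plain CGCI from full vs. restricted VAR fits (no BTS restriction; toy data).
import numpy as np

def cgci(data, i, j, p=2):
    """CGCI_{j->i} = ln( residual variance without lags of j / residual variance of full VAR )."""
    T, K = data.shape
    y = data[p:, i]
    lag_cols, owners = [], []
    for lag in range(1, p + 1):
        for k in range(K):
            lag_cols.append(data[p - lag:T - lag, k])
            owners.append(k)
    X_full = np.column_stack(lag_cols)
    X_restr = X_full[:, [c for c, k in enumerate(owners) if k != j]]
    resid = lambda X: y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return np.log(resid(X_restr).var() / resid(X_full).var())

# toy system: variable 0 drives variable 1 at lag 1
rng = np.random.default_rng(1)
x = rng.standard_normal((500, 3))
for t in range(1, 500):
    x[t, 1] += 0.6 * x[t - 1, 0]
print(cgci(x, i=1, j=0), cgci(x, i=0, j=1))  # the first value should be clearly larger
```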

  5. Discrete Events as Units of Perceived Time

    ERIC Educational Resources Information Center

    Liverence, Brandon M.; Scholl, Brian J.

    2012-01-01

    In visual images, we perceive both space (as a continuous visual medium) and objects (that inhabit space). Similarly, in dynamic visual experience, we perceive both continuous time and discrete events. What is the relationship between these units of experience? The most intuitive answer may be similar to the spatial case: time is perceived as an…

  6. A Sandwich-Type Standard Error Estimator of SEM Models with Multivariate Time Series

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Chow, Sy-Miin; Ong, Anthony D.

    2011-01-01

    Structural equation models are increasingly used as a modeling tool for multivariate time series data in the social and behavioral sciences. Standard error estimators of SEM models, originally developed for independent data, require modifications to accommodate the fact that time series data are inherently dependent. In this article, we extend a…

  7. Learning a Mahalanobis Distance-Based Dynamic Time Warping Measure for Multivariate Time Series Classification.

    PubMed

    Mei, Jiangyuan; Liu, Meizhu; Wang, Yuan-Fang; Gao, Huijun

    2016-06-01

    Multivariate time series (MTS) datasets broadly exist in numerous fields, including health care, multimedia, finance, and biometrics. How to classify MTS accurately has become a hot research topic since it is an important element in many computer vision and pattern recognition applications. In this paper, we propose a Mahalanobis distance-based dynamic time warping (DTW) measure for MTS classification. The Mahalanobis distance builds an accurate relationship between each variable and its corresponding category. It is utilized to calculate the local distance between vectors in MTS. Then we use DTW to align those MTS which are out of synchronization or with different lengths. After that, how to learn an accurate Mahalanobis distance function becomes another key problem. This paper establishes a LogDet divergence-based metric learning with triplet constraint model which can learn Mahalanobis matrix with high precision and robustness. Furthermore, the proposed method is applied on nine MTS datasets selected from the University of California, Irvine machine learning repository and Robert T. Olszewski's homepage, and the results demonstrate the improved performance of the proposed approach.
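
    A minimal sketch of the distance being described: dynamic time warping whose local cost is a Mahalanobis distance between multivariate samples. Here the metric matrix is simply the inverse pooled covariance, whereas the paper learns it with LogDet-divergence metric learning under triplet constraints; the data are synthetic.

```python
# DTW with a Mahalanobis local distance; M is the inverse pooled covariance, not a learned metric.
import numpy as np

def mahalanobis_dtw(A, B, M):
    """A: (n, d), B: (m, d) multivariate series; M: (d, d) positive-definite metric matrix."""
    n, m = len(A), len(B)
    diff = A[:, None, :] - B[None, :, :]
    local = np.sqrt(np.einsum('ijd,de,ije->ij', diff, M, diff))  # pairwise Mahalanobis distances
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i, j] = local[i - 1, j - 1] + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

rng = np.random.default_rng(2)
A = rng.standard_normal((40, 3))
B = A[::2] + 0.1 * rng.standard_normal((20, 3))   # shorter, noisy version of A
M = np.linalg.inv(np.cov(np.vstack([A, B]).T))    # stand-in for a learned metric
print(mahalanobis_dtw(A, B, M))
```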

  8. Rotation in the Dynamic Factor Modeling of Multivariate Stationary Time Series.

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    2001-01-01

    Proposes a special rotation procedure for the exploratory dynamic factor model for stationary multivariate time series. The rotation procedure applies separately to each univariate component series of a q-variate latent factor series and transforms such a component, initially represented as white noise, into a univariate moving-average.…

  9. Arbitrary eigenvalue assignments for linear time-varying multivariable control systems

    NASA Technical Reports Server (NTRS)

    Nguyen, Charles C.

    1987-01-01

    The problem of eigenvalue assignments for a class of linear time-varying multivariable systems is considered. Using matrix operators and canonical transformations, it is shown that a time-varying system that is 'lexicography-fixedly controllable' can be made via state feedback to be equivalent to a time-invariant system whose eigenvalues are arbitrarily assignable. A simple algorithm for the design of the state feedback is provided.

  10. Design of reduced-order state estimators for linear time-varying multivariable systems

    NASA Technical Reports Server (NTRS)

    Nguyen, Charles C.

    1987-01-01

    The design of reduced-order state estimators for linear time-varying multivariable systems is considered. Employing the concepts of matrix operators and the method of canonical transformations, this paper shows that there exists a reduced-order state estimator for linear time-varying systems that are 'lexicography-fixedly observable'. In addition, the eigenvalues of the estimator can be arbitrarily assigned. A simple algorithm is proposed for the design of the state estimator.

  11. Multivariate spatial analysis of a heavy rain event in a densely populated delta city

    NASA Astrophysics Data System (ADS)

    Gaitan, Santiago; ten Veldhuis, Marie-claire; Bruni, Guenda; van de Giesen, Nick

    2014-05-01

    Delta cities account for half of the world's population and host key infrastructure and services for global economic growth. Due to the characteristic geography of delta areas, these cities face high vulnerability to extreme weather and pluvial flooding risks, which are expected to increase as climate change drives heavier rain events. Moreover, delta cities are subject to fast urban densification processes that progressively make them more vulnerable to pluvial flooding. Delta cities need to be adapted to better cope with this threat. The mechanism leading to damage after heavy rains is not completely understood. For instance, current research has shown that rain intensities and volumes can only partially explain the occurrence and localization of rain-related insurance claims (Spekkers et al., 2013). The goal of this paper is to provide further insights into spatial characteristics of the urban environment that can significantly be linked to pluvial-flooding impacts. To that end, a case study has been selected: from October 12 to 14, 2013, a heavy rain event triggered pluvial floods in Rotterdam, a densely populated city which is undergoing multiple climate adaptation efforts and is located in the Meuse river delta. While the average yearly precipitation in this city is around 800 mm, local rain gauge measurements ranged from approx. 60 to 130 mm during these three days alone. More than 600 telephone complaints from citizens reported rainfall-related impacts. The registry of those complaints, which comprises around 300 calls made to the municipality and another 300 to the fire brigade, was made available for research. Other accessible information about this city includes a series of rainfall measurements with up to 1 min time-step at 7 different locations around the city, ground-based radar rainfall data (1 km² spatial resolution and 5 min time-step), a digital elevation model (50 cm horizontal resolution), a model of overland-flow paths, cadastral

  12. Multivariate time series modeling of short-term system scale irrigation demand

    NASA Astrophysics Data System (ADS)

    Perera, Kushan C.; Western, Andrew W.; George, Biju; Nawarathna, Bandara

    2015-12-01

    Travel time limits the ability of irrigation system operators to react to short-term irrigation demand fluctuations that result from variations in weather, including very hot periods and rainfall events, as well as the various other pressures and opportunities that farmers face. Short-term, system-wide irrigation demand forecasts can assist in system operation. Here we developed a multivariate time series (ARMAX) model to forecast irrigation demands with respect to aggregated service point flows (IDCGi,ASP) and off-take regulator flows (IDCGi,OTR) across 5 command areas, which covered the areas served by four irrigation channels and the study area. These command-area-specific ARMAX models forecast daily IDCGi,ASP and IDCGi,OTR 1-5 days ahead using the real-time flow data recorded at the service points and the uppermost regulators and observed meteorological data collected from automatic weather stations. The model efficiency and the predictive performance were quantified using the root mean squared error (RMSE), Nash-Sutcliffe model efficiency coefficient (NSE), anomaly correlation coefficient (ACC) and mean square skill score (MSSS). During the evaluation period, NSE for IDCGi,ASP and IDCGi,OTR across the 5 command areas ranged from 0.78 to 0.98. These models were capable of generating skillful forecasts (MSSS ⩾ 0.5 and ACC ⩾ 0.6) of IDCGi,ASP and IDCGi,OTR for all 5 lead days, and these forecasts were better than using the long-term monthly mean irrigation demand. Overall, the predictive performance of these ARMAX time series models was higher than in almost all previous studies we are aware of. Further, the IDCGi,ASP and IDCGi,OTR forecasts have improved the operators' ability to react to near-future irrigation demand fluctuations, as the developed ARMAX time series models are self-adaptive and reflect short-term changes in irrigation demand with respect to the various pressures and opportunities that farmers face, such as
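
    The forecast skill scores quoted in this record (RMSE, NSE, ACC, MSSS) can be computed as in the sketch below for any forecast/observation pair; the climatological reference is taken to be the observed mean, which is one common convention and not necessarily the exact definition used in the study.

```python
# Generic forecast skill metrics (RMSE, NSE, ACC, MSSS); reference climatology = observed mean.
import numpy as np

def skill_metrics(obs, fcst, clim=None):
    obs, fcst = np.asarray(obs, float), np.asarray(fcst, float)
    clim = obs.mean() if clim is None else clim
    err = fcst - obs
    rmse = np.sqrt(np.mean(err ** 2))
    nse = 1.0 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)
    oa, fa = obs - clim, fcst - clim                     # anomalies
    acc = np.sum(oa * fa) / np.sqrt(np.sum(oa ** 2) * np.sum(fa ** 2))
    msss = 1.0 - np.mean(err ** 2) / np.mean((obs - clim) ** 2)
    return {'RMSE': rmse, 'NSE': nse, 'ACC': acc, 'MSSS': msss}

# synthetic demand series and a stand-in 1-day-ahead forecast
rng = np.random.default_rng(3)
demand = 50 + 10 * np.sin(np.arange(100) / 10) + rng.normal(0, 2, 100)
forecast = demand + rng.normal(0, 3, 100)
print(skill_metrics(demand, forecast))
```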

  13. Applying the multivariate time-rescaling theorem to neural population models

    PubMed Central

    Gerhard, Felipe; Haslinger, Robert; Pipa, Gordon

    2011-01-01

    Statistical models of neural activity are integral to modern neuroscience. Recently, interest has grown in modeling the spiking activity of populations of simultaneously recorded neurons to study the effects of correlations and functional connectivity on neural information processing. However any statistical model must be validated by an appropriate goodness-of-fit test. Kolmogorov-Smirnov tests based upon the time-rescaling theorem have proven to be useful for evaluating point-process-based statistical models of single-neuron spike trains. Here we discuss the extension of the time-rescaling theorem to the multivariate (neural population) case. We show that even in the presence of strong correlations between spike trains, models which neglect couplings between neurons can be erroneously passed by the univariate time-rescaling test. We present the multivariate version of the time-rescaling theorem, and provide a practical step-by-step procedure for applying it towards testing the sufficiency of neural population models. Using several simple analytically tractable models and also more complex simulated and real data sets, we demonstrate that important features of the population activity can only be detected using the multivariate extension of the test. PMID:21395436
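
    The univariate building block of the test is easy to reproduce: rescale the inter-spike intervals by the cumulative conditional intensity and check the transformed values against a uniform distribution with a Kolmogorov-Smirnov test. The sketch below does this for a toy inhomogeneous Poisson spike train; the intensity function and simulation settings are illustrative, and the multivariate (population) extension of the paper is not reproduced.

```python
# Univariate time-rescaling check on a toy inhomogeneous Poisson spike train.
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(4)

def simulate_inhom_poisson(lam, t_max, lam_max):
    """Thinning algorithm for an inhomogeneous Poisson process with rate lam(t) <= lam_max."""
    t, spikes = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > t_max:
            return np.array(spikes)
        if rng.uniform() < lam(t) / lam_max:
            spikes.append(t)

lam = lambda t: 5.0 + 4.0 * np.sin(2 * np.pi * t / 10.0)   # toy conditional intensity
spikes = simulate_inhom_poisson(lam, t_max=200.0, lam_max=9.0)

# Rescale: z_k = integral of lam between consecutive spikes (numerical quadrature),
# then u_k = 1 - exp(-z_k) should be Uniform(0, 1) if the intensity model is correct.
grid = np.linspace(0, 200, 200001)
cum = np.concatenate([[0.0], np.cumsum(lam(grid[:-1]) * np.diff(grid))])
Lambda = np.interp(spikes, grid, cum)        # cumulative intensity at spike times
z = np.diff(np.concatenate([[0.0], Lambda]))
u = 1.0 - np.exp(-z)
print(kstest(u, 'uniform'))                  # large p-value: model not rejected
```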

  14. Genetic basis of adult migration timing in anadromous steelhead discovered through multivariate association testing.

    PubMed

    Hess, Jon E; Zendt, Joseph S; Matala, Amanda R; Narum, Shawn R

    2016-05-11

    Migration traits are presumed to be complex and to involve interaction among multiple genes. We used both univariate analyses and a multivariate random forest (RF) machine learning algorithm to conduct association mapping of 15 239 single nucleotide polymorphisms (SNPs) for adult migration-timing phenotype in steelhead (Oncorhynchus mykiss). Our study focused on a model natural population of steelhead that exhibits two distinct migration-timing life histories with high levels of admixture in nature. Neutral divergence was limited between fish exhibiting summer- and winter-run migration owing to high levels of interbreeding, but a univariate mixed linear model found three SNPs from a major effect gene to be significantly associated with migration timing (p < 0.000005) that explained 46% of trait variation. Alignment to the annotated Salmo salar genome provided evidence that all three SNPs localize within a 46 kb region overlapping GREB1-like (an oestrogen target gene) on chromosome Ssa03. Additionally, multivariate analyses with RF identified that these three SNPs plus 15 additional SNPs explained up to 60% of trait variation. These candidate SNPs may provide the ability to predict adult migration timing of steelhead to facilitate conservation management of this species, and this study demonstrates the benefit of multivariate analyses for association studies. PMID:27170720

  15. Visual Inquiry Toolkit – An Integrated Approach for Exploring and Interpreting Space-Time, Multivariate Patterns

    PubMed Central

    Chen, Jin; MacEachren, Alan M.; Guo, Diansheng

    2011-01-01

    While many datasets carry geographic and temporal references, our ability to analyze these datasets lags behind our ability to collect them because of the challenges posed by both data complexity and scalability issues. This study develops a visual analytics approach that integrates human knowledge and judgments with visual, computational, and cartographic methods to support the application of visual analytics to relatively large spatio-temporal, multivariate datasets. Specifically, a variety of methods are employed for data clustering, pattern searching, information visualization and synthesis. By combining both human and machine strengths, this approach has a better chance to discover novel, relevant and potentially useful information that is difficult to detect by any method used in isolation. We demonstrate the effectiveness of the approach by applying the Visual Inquiry Toolkit we developed to analysis of a dataset containing geographically referenced, time-varying and multivariate data for U.S. technology industries. PMID:26566543

  16. Supporting the Process of Exploring and Interpreting Space–Time Multivariate Patterns: The Visual Inquiry Toolkit

    PubMed Central

    Chen, Jin; MacEachren, Alan M.; Guo, Diansheng

    2009-01-01

    While many data sets carry geographic and temporal references, our ability to analyze these datasets lags behind our ability to collect them because of the challenges posed by both data complexity and tool scalability issues. This study develops a visual analytics approach that leverages human expertise with visual, computational, and cartographic methods to support the application of visual analytics to relatively large spatio-temporal, multivariate data sets. We develop and apply a variety of methods for data clustering, pattern searching, information visualization, and synthesis. By combining both human and machine strengths, this approach has a better chance to discover novel, relevant, and potentially useful information that is difficult to detect by any of the methods used in isolation. We demonstrate the effectiveness of the approach by applying the Visual Inquiry Toolkit we developed to analyze a data set containing geographically referenced, time-varying and multivariate data for U.S. technology industries. PMID:19960096

  17. Supporting the Process of Exploring and Interpreting Space-Time Multivariate Patterns: The Visual Inquiry Toolkit.

    PubMed

    Chen, Jin; Maceachren, Alan M; Guo, Diansheng

    2008-01-01

    While many data sets carry geographic and temporal references, our ability to analyze these datasets lags behind our ability to collect them because of the challenges posed by both data complexity and tool scalability issues. This study develops a visual analytics approach that leverages human expertise with visual, computational, and cartographic methods to support the application of visual analytics to relatively large spatio-temporal, multivariate data sets. We develop and apply a variety of methods for data clustering, pattern searching, information visualization, and synthesis. By combining both human and machine strengths, this approach has a better chance to discover novel, relevant, and potentially useful information that is difficult to detect by any of the methods used in isolation. We demonstrate the effectiveness of the approach by applying the Visual Inquiry Toolkit we developed to analyze a data set containing geographically referenced, time-varying and multivariate data for U.S. technology industries.

  18. Constructing networks from a dynamical system perspective for multivariate nonlinear time series.

    PubMed

    Nakamura, Tomomichi; Tanizawa, Toshihiro; Small, Michael

    2016-03-01

    We describe a method for constructing networks for multivariate nonlinear time series. We approach the interaction between the various scalar time series from a deterministic dynamical system perspective and provide a generic and algorithmic test for whether the interaction between two measured time series is statistically significant. The method can be applied even when the data exhibit no obvious qualitative similarity: a situation in which the naive method utilizing the cross correlation function directly cannot correctly identify connectivity. To establish the connectivity between nodes we apply the previously proposed small-shuffle surrogate (SSS) method, which can investigate whether there are correlation structures in short-term variabilities (irregular fluctuations) between two data sets from the viewpoint of deterministic dynamical systems. The procedure to construct networks based on this idea is composed of three steps: (i) each time series is considered as a basic node of a network, (ii) the SSS method is applied to verify the connectivity between each pair of time series taken from the whole multivariate time series, and (iii) the pair of nodes is connected with an undirected edge when the null hypothesis cannot be rejected. The network constructed by the proposed method indicates the intrinsic (essential) connectivity of the elements included in the system or the underlying (assumed) system. The method is demonstrated for numerical data sets generated by known systems and applied to several experimental time series.
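
    Step (ii) relies on the small-shuffle surrogate. The sketch below generates SSS surrogates by perturbing the time indices with small Gaussian noise and reordering the data, then compares an observed cross-correlation statistic against the surrogate null; the perturbation amplitude, statistic and toy data are illustrative choices rather than the paper's exact settings.

```python
# Small-shuffle surrogates and a simple cross-correlation statistic (illustrative settings).
import numpy as np

def small_shuffle_surrogate(x, A=1.0, rng=None):
    """Perturb time indices by A * Gaussian noise and reorder the data accordingly."""
    rng = np.random.default_rng() if rng is None else rng
    idx = np.arange(len(x)) + A * rng.standard_normal(len(x))
    return np.asarray(x)[np.argsort(idx)]

def max_crosscorr(x, y, max_lag=5):
    x, y = (x - x.mean()) / x.std(), (y - y.mean()) / y.std()
    return max(abs(np.mean(x[l:] * y[:len(y) - l])) for l in range(max_lag + 1))

rng = np.random.default_rng(5)
x = rng.standard_normal(1000)
y = 0.7 * x + 0.3 * rng.standard_normal(1000)              # genuinely coupled pair

observed = max_crosscorr(x, y)
null = [max_crosscorr(small_shuffle_surrogate(x, rng=rng), y) for _ in range(200)]
print(observed, np.quantile(null, 0.99))                   # observed well above the surrogate null
```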

  19. Constructing networks from a dynamical system perspective for multivariate nonlinear time series.

    PubMed

    Nakamura, Tomomichi; Tanizawa, Toshihiro; Small, Michael

    2016-03-01

    We describe a method for constructing networks for multivariate nonlinear time series. We approach the interaction between the various scalar time series from a deterministic dynamical system perspective and provide a generic and algorithmic test for whether the interaction between two measured time series is statistically significant. The method can be applied even when the data exhibit no obvious qualitative similarity: a situation in which the naive method utilizing the cross correlation function directly cannot correctly identify connectivity. To establish the connectivity between nodes we apply the previously proposed small-shuffle surrogate (SSS) method, which can investigate whether there are correlation structures in short-term variabilities (irregular fluctuations) between two data sets from the viewpoint of deterministic dynamical systems. The procedure to construct networks based on this idea is composed of three steps: (i) each time series is considered as a basic node of a network, (ii) the SSS method is applied to verify the connectivity between each pair of time series taken from the whole multivariate time series, and (iii) the pair of nodes is connected with an undirected edge when the null hypothesis cannot be rejected. The network constructed by the proposed method indicates the intrinsic (essential) connectivity of the elements included in the system or the underlying (assumed) system. The method is demonstrated for numerical data sets generated by known systems and applied to several experimental time series. PMID:27078382

  20. A multivariate based event detection method and performance comparison with two baseline methods.

    PubMed

    Liu, Shuming; Smith, Kate; Che, Han

    2015-09-01

    Early warning systems have been widely deployed to protect water systems from accidental and intentional contamination events. Conventional detection algorithms are often criticized for having high false positive rates and low true positive rates. This mainly stems from the inability of these methods to determine whether variation in sensor measurements is caused by equipment noise or by the presence of contamination. This paper presents a new detection method that identifies the existence of contamination by comparing Euclidean distances of correlation indicators, which are derived from the correlation coefficients of multiple water quality sensors. The performance of the proposed method was evaluated using data from a contaminant injection experiment and compared with two baseline detection methods. The results show that the proposed method can differentiate between fluctuations caused by equipment noise and those due to the presence of contamination. It yielded a higher probability of detection and a lower false alarm rate than the two baseline methods. With optimized parameter values, the proposed method can correctly detect 95% of all contamination events with a 2% false alarm rate.
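
    A rough sketch of the detection principle follows: correlation coefficients among the sensors are computed in a sliding window, stacked into a correlation-indicator vector, and the Euclidean distance of that vector from a clean baseline is thresholded. The window length, baseline choice, threshold and synthetic data are placeholders, not the calibrated settings of the study.

```python
# Sliding-window correlation indicators and Euclidean-distance thresholding (placeholder settings).
import numpy as np

def correlation_indicator(window):
    """Upper-triangular correlation coefficients of a (window_len, n_sensors) block."""
    c = np.corrcoef(window.T)
    iu = np.triu_indices_from(c, k=1)
    return c[iu]

def detect_events(data, win=60, threshold=0.5):
    baseline = correlation_indicator(data[:win])            # assume the first window is clean
    alarms = []
    for t in range(win, len(data)):
        ind = correlation_indicator(data[t - win:t])
        if np.linalg.norm(ind - baseline) > threshold:
            alarms.append(t)
    return alarms

rng = np.random.default_rng(6)
base = rng.standard_normal((600, 1))
sensors = base + 0.2 * rng.standard_normal((600, 4))        # 4 correlated water-quality sensors
sensors[400:450, 2] += np.linspace(0, 8, 50)                # contamination-like drift on sensor 2
alarms = detect_events(sensors, win=60, threshold=0.5)
print(len(alarms), alarms[:3])
```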

  1. A Visualization System for Space-Time and Multivariate Patterns (VIS-STAMP)

    PubMed Central

    Guo, Diansheng; Chen, Jin; MacEachren, Alan M.; Liao, Ke

    2011-01-01

    The research reported here integrates computational, visual, and cartographic methods to develop a geovisual analytic approach for exploring and understanding spatio-temporal and multivariate patterns. The developed methodology and tools can help analysts investigate complex patterns across multivariate, spatial, and temporal dimensions via clustering, sorting, and visualization. Specifically, the approach involves a self-organizing map, a parallel coordinate plot, several forms of reorderable matrices (including several ordering methods), a geographic small multiple display, and a 2-dimensional cartographic color design method. The coupling among these methods leverages their independent strengths and facilitates a visual exploration of patterns that are difficult to discover otherwise. The visualization system we developed supports overview of complex patterns and, through a variety of interactions, enables users to focus on specific patterns and examine detailed views. We demonstrate the system with an application to the IEEE InfoVis 2005 Contest data set, which contains time-varying, geographically referenced, and multivariate data for technology companies in the US. PMID:17073369

  2. ANALYSIS OF MULTIVARIATE FAILURE TIME DATA USING MARGINAL PROPORTIONAL HAZARDS MODEL.

    PubMed

    Chen, Ying; Chen, Kani; Ying, Zhiliang

    2010-01-01

    The marginal proportional hazards model is an important tool in the analysis of multivariate failure time data in the presence of censoring. We propose a method of estimation via the linear combinations of martingale residuals. The estimation and inference procedures are easy to implement numerically. The estimation is generally more accurate than the existing pseudo-likelihood approach: the size of efficiency gain can be considerable in some cases, and the maximum relative efficiency in theory is infinite. Consistency and asymptotic normality are established. Empirical evidence in support of the theoretical claims is shown in simulation studies. PMID:24307815

  3. A multivariate model for the meta-analysis of study level survival data at multiple times.

    PubMed

    Jackson, Dan; Rollins, Katie; Coughlin, Patrick

    2014-09-01

    Motivated by our meta-analytic dataset involving survival rates after treatment for critical leg ischemia, we develop and apply a new multivariate model for the meta-analysis of study level survival data at multiple times. Our data set involves 50 studies that provide mortality rates at up to seven time points, which we model simultaneously, and we compare the results to those obtained from standard methodologies. Our method uses exact binomial within-study distributions and enforces the constraints that both the study specific and the overall mortality rates must not decrease over time. We directly model the probabilities of mortality at each time point, which are the quantities of primary clinical interest. We also present I² statistics that quantify the impact of the between-study heterogeneity, which is very considerable in our data set.

  4. Multivariable time series prediction for the icing process on overhead power transmission line.

    PubMed

    Li, Peng; Zhao, Na; Zhou, Donghua; Cao, Min; Li, Jingjie; Shi, Xinling

    2014-01-01

    The design of monitoring and predictive alarm systems is necessary for successfully managing overhead power transmission line icing. Given the characteristics of complexity, nonlinearity, and fitfulness in the line icing process, a model based on a multivariable time series is presented here to predict the icing load of a transmission line. In this model, the time effects of micrometeorology parameters on the icing process have been analyzed. Phase-space reconstruction theory and a machine learning method were then applied to establish the prediction model, which fully utilizes the history of multivariable time series data in local monitoring systems to represent the mapping relationship between icing load and micrometeorological factors. Relevant to the characteristic fitfulness of line icing, simulations were carried out within the same icing process and across different processes to test the model's prediction precision and robustness. According to the simulation results for the Tao-Luo-Xiong Transmission Line, this model demonstrates good prediction accuracy across different processes when the prediction length is less than two hours, and would be helpful for power grid departments when deciding whether to take action in advance to address potential icing disasters. PMID:25136653

  5. Multivariable Time Series Prediction for the Icing Process on Overhead Power Transmission Line

    PubMed Central

    Li, Peng; Zhao, Na; Zhou, Donghua; Cao, Min; Li, Jingjie; Shi, Xinling

    2014-01-01

    The design of monitoring and predictive alarm systems is necessary for successfully managing overhead power transmission line icing. Given the characteristics of complexity, nonlinearity, and fitfulness in the line icing process, a model based on a multivariable time series is presented here to predict the icing load of a transmission line. In this model, the time effects of micrometeorology parameters on the icing process have been analyzed. Phase-space reconstruction theory and a machine learning method were then applied to establish the prediction model, which fully utilizes the history of multivariable time series data in local monitoring systems to represent the mapping relationship between icing load and micrometeorological factors. Relevant to the characteristic fitfulness of line icing, simulations were carried out within the same icing process and across different processes to test the model's prediction precision and robustness. According to the simulation results for the Tao-Luo-Xiong Transmission Line, this model demonstrates good prediction accuracy across different processes when the prediction length is less than two hours, and would be helpful for power grid departments when deciding whether to take action in advance to address potential icing disasters. PMID:25136653

  6. A Multivariate Statistical Approach based on a Dynamic Moving Storms (DMS) Generator for Estimating the Frequency of Extreme Storm Events

    NASA Astrophysics Data System (ADS)

    Fang, N. Z.; Gao, S.

    2015-12-01

    Challenges of fully considering the complexity among spatially and temporally varied rainfall always exist in flood frequency analysis. Conventional approaches that simplify the complexity of spatiotemporal interactions generally undermine their impacts on flood risks. A previously developed stochastic storm generator called Dynamic Moving Storms (DMS) aims to address the highly dependent nature of the precipitation field: spatial variability, temporal variability, and movement of the storm. The authors utilize a multivariate statistical approach based on DMS to estimate the occurrence probability or frequency of extreme storm events. Fifteen years of radar rainfall data are used to generate a large number of synthetic storms as a basis for statistical assessment. Two parametric retrieval algorithms are developed to recognize rain cells and track storm motions, respectively. The resulting parameters are then used to establish probability density functions (PDFs), which are fitted to parametric distribution functions for further Monte Carlo simulations. Consequently, over 1,000,000 synthetic storms are generated based on twelve retrieved parameters for integrated risk assessment and ensemble forecasts. Furthermore, the PDFs of the parameters are used to calculate joint probabilities based on 2-dimensional Archimedean copula functions to determine the occurrence probabilities of extreme events. The approach is validated on the Upper Trinity River watershed and the generated results are compared with those from traditional rainfall frequency studies (i.e., Intensity-Duration-Frequency curves and Areal Reduction Factors).
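
    As a small worked example of the copula step, the snippet below evaluates a 2-dimensional Archimedean (Clayton) copula and the corresponding joint "AND" exceedance probability and return period; the copula family, its parameter and the marginal levels are placeholders rather than the fitted DMS quantities.

```python
# Joint exceedance probability and return period from a Clayton copula (placeholder values).
import numpy as np

def clayton_cdf(u, v, theta):
    """Clayton copula C(u, v) for theta > 0."""
    return (u ** (-theta) + v ** (-theta) - 1.0) ** (-1.0 / theta)

def joint_and_exceedance(u, v, theta):
    """P(U > u and V > v) = 1 - u - v + C(u, v)."""
    return 1.0 - u - v + clayton_cdf(u, v, theta)

theta = 2.0                       # dependence strength (placeholder)
u = v = 0.99                      # marginal non-exceedance levels of, e.g., intensity and duration
p_and = joint_and_exceedance(u, v, theta)
mu = 1.0                          # mean inter-arrival time of storm events in years (placeholder)
print(p_and, mu / p_and)          # joint exceedance probability and the corresponding return period
```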

  7. Uniform approach to linear and nonlinear interrelation patterns in multivariate time series.

    PubMed

    Rummel, Christian; Abela, Eugenio; Müller, Markus; Hauf, Martinus; Scheidegger, Olivier; Wiest, Roland; Schindler, Kaspar

    2011-06-01

    Currently, a variety of linear and nonlinear measures is in use to investigate spatiotemporal interrelation patterns of multivariate time series. Whereas the former are by definition insensitive to nonlinear effects, the latter detect both nonlinear and linear interrelation. In the present contribution we employ a uniform surrogate-based approach, which is capable of disentangling interrelations that significantly exceed random effects and interrelations that significantly exceed linear correlation. The bivariate version of the proposed framework is explored using a simple model allowing for separate tuning of coupling and nonlinearity of interrelation. To demonstrate applicability of the approach to multivariate real-world time series we investigate resting state functional magnetic resonance imaging (rsfMRI) data of two healthy subjects as well as intracranial electroencephalograms (iEEG) of two epilepsy patients with focal onset seizures. The main findings are that for our rsfMRI data interrelations can be described by linear cross-correlation. Rejection of the null hypothesis of linear iEEG interrelation occurs predominantly for epileptogenic tissue as well as during epileptic seizures.
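
    The two surrogate comparisons can be sketched as follows: shuffled surrogates test whether an interrelation exceeds random effects, and bivariate phase-randomized surrogates (the same random phases applied to both channels, so the linear cross-correlation structure is preserved) test whether it exceeds what linear correlation can explain. The interrelation measure used below (correlation of squared signals) is only a stand-in for the measures used in the paper.

```python
# Surrogate tests against (i) random effects and (ii) purely linear interrelation (toy measure).
import numpy as np

rng = np.random.default_rng(7)

def measure(x, y):
    return abs(np.corrcoef(x ** 2, y ** 2)[0, 1])   # stand-in nonlinear interrelation measure

def shuffle_surrogate(x):
    return rng.permutation(x)

def linear_surrogates(x, y):
    """Phase-randomize both channels with the same phases: keeps the linear cross-correlation."""
    n = len(x)
    Xf, Yf = np.fft.rfft(x), np.fft.rfft(y)
    ph = np.exp(1j * rng.uniform(0, 2 * np.pi, len(Xf)))
    ph[0] = 1.0
    if n % 2 == 0:
        ph[-1] = 1.0                                # Nyquist bin must stay real
    return np.fft.irfft(Xf * ph, n), np.fft.irfft(Yf * ph, n)

n = 2048
x = rng.standard_normal(n)
y = x ** 2 + 0.5 * rng.standard_normal(n)           # purely nonlinear coupling

obs = measure(x, y)
null_random = [measure(shuffle_surrogate(x), y) for _ in range(200)]
null_linear = [measure(*linear_surrogates(x, y)) for _ in range(200)]
print(obs, np.quantile(null_random, 0.95), np.quantile(null_linear, 0.95))
# obs exceeding both quantiles indicates interrelation beyond chance and beyond linear correlation
```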

  8. Uniform approach to linear and nonlinear interrelation patterns in multivariate time series

    NASA Astrophysics Data System (ADS)

    Rummel, Christian; Abela, Eugenio; Müller, Markus; Hauf, Martinus; Scheidegger, Olivier; Wiest, Roland; Schindler, Kaspar

    2011-06-01

    Currently, a variety of linear and nonlinear measures is in use to investigate spatiotemporal interrelation patterns of multivariate time series. Whereas the former are by definition insensitive to nonlinear effects, the latter detect both nonlinear and linear interrelation. In the present contribution we employ a uniform surrogate-based approach, which is capable of disentangling interrelations that significantly exceed random effects and interrelations that significantly exceed linear correlation. The bivariate version of the proposed framework is explored using a simple model allowing for separate tuning of coupling and nonlinearity of interrelation. To demonstrate applicability of the approach to multivariate real-world time series we investigate resting state functional magnetic resonance imaging (rsfMRI) data of two healthy subjects as well as intracranial electroencephalograms (iEEG) of two epilepsy patients with focal onset seizures. The main findings are that for our rsfMRI data interrelations can be described by linear cross-correlation. Rejection of the null hypothesis of linear iEEG interrelation occurs predominantly for epileptogenic tissue as well as during epileptic seizures.

  9. Investigation of time and weather effects on crash types using full Bayesian multivariate Poisson lognormal models.

    PubMed

    El-Basyouny, Karim; Barua, Sudip; Islam, Md Tazul

    2014-12-01

    Previous research shows that various weather elements have significant effects on crash occurrence and risk; however, little is known about how these elements affect different crash types. Consequently, this study investigates the impact of weather elements and sudden extreme snow or rain weather changes on crash type. Multivariate models were used for seven crash types using five years of daily weather and crash data collected for the entire City of Edmonton. In addition, the yearly trend and random variation of parameters across the years were analyzed by using four different modeling formulations. The proposed models were estimated in a full Bayesian context via Markov Chain Monte Carlo simulation. The multivariate Poisson lognormal model with yearly varying coefficients provided the best fit for the data according to Deviance Information Criteria. Overall, results showed that temperature and snowfall were statistically significant with intuitive signs (crashes decrease with increasing temperature; crashes increase as snowfall intensity increases) for all crash types, while rainfall was mostly insignificant. Previous snow showed mixed results, being statistically significant and positively related to certain crash types, while negatively related or insignificant in other cases. Maximum wind gust speed was found mostly insignificant with a few exceptions that were positively related to crash type. Major snow or rain events following a dry weather condition were highly significant and positively related to three crash types: Follow-Too-Close, Stop-Sign-Violation, and Ran-Off-Road crashes. The day-of-the-week dummy variables were statistically significant, indicating a possible weekly variation in exposure. Transportation authorities might use the above results to improve road safety by providing drivers with information regarding the risk of certain crash types for a particular weather condition.

  10. Smith predictor with inverted decoupling for square multivariable time delay systems

    NASA Astrophysics Data System (ADS)

    Garrido, Juan; Vázquez, Francisco; Morilla, Fernando; Normey-Rico, Julio E.

    2016-01-01

    This paper presents a new methodology to design multivariable Smith predictors for n×n processes with multiple time delays based on the centralised inverted decoupling structure. The controller elements are calculated in order to achieve good reference tracking and decoupling responses. Independent of the system size, very simple general expressions for the controller elements are obtained. The realisability conditions are provided, and the particular case of processes whose elements are all first-order-plus-time-delay systems is discussed in more detail. A diagonal filter is added to the proposed control structure in order to improve disturbance rejection without modifying the nominal set-point response and to obtain a stable output prediction for unstable plants. The effectiveness of the method is illustrated through different simulation examples in comparison with other works.

  11. Ecological prediction with nonlinear multivariate time-frequency functional data models

    USGS Publications Warehouse

    Yang, Wen-Hsi; Wikle, Christopher K.; Holan, Scott H.; Wildhaber, Mark L.

    2013-01-01

    Time-frequency analysis has become a fundamental component of many scientific inquiries. Due to improvements in technology, the amount of high-frequency signals that are collected for ecological and other scientific processes is increasing at a dramatic rate. In order to facilitate the use of these data in ecological prediction, we introduce a class of nonlinear multivariate time-frequency functional models that can identify important features of each signal as well as the interaction of signals corresponding to the response variable of interest. Our methodology is of independent interest and utilizes stochastic search variable selection to improve model selection and performs model averaging to enhance prediction. We illustrate the effectiveness of our approach through simulation and by application to predicting spawning success of shovelnose sturgeon in the Lower Missouri River.

  12. Multi-variate models are essential for understanding vertebrate diversification in deep time

    PubMed Central

    Benson, Roger B. J.; Mannion, Philip D.

    2012-01-01

    Statistical models are helping palaeontologists to elucidate the history of biodiversity. Sampling standardization has been extensively applied to remedy the effects of uneven sampling in large datasets of fossil invertebrates. However, many vertebrate datasets are smaller, and the issue of uneven sampling has commonly been ignored, or approached using pairwise comparisons with a numerical proxy for sampling effort. Although most authors find a strong correlation between palaeodiversity and sampling proxies, weak correlation is recorded in some datasets. This has led several authors to conclude that uneven sampling does not influence our view of vertebrate macroevolution. We demonstrate that multi-variate regression models incorporating a model of underlying biological diversification, as well as a sampling proxy, fit observed sauropodomorph dinosaur palaeodiversity best. This bivariate model is a better fit than separate univariate models, and illustrates that observed palaeodiversity is a composite pattern, representing a biological signal overprinted by variation in sampling effort. Multi-variate models and other approaches that consider sampling as an essential component of palaeodiversity are central to gaining a more complete understanding of deep time vertebrate diversification. PMID:21697163

  13. Reconstructing causal pathways and optimal prediction from multivariate time series using the Tigramite package

    NASA Astrophysics Data System (ADS)

    Runge, Jakob

    2016-04-01

    Causal reconstruction techniques from multivariate time series have become a popular approach to analyze interactions in complex systems such as the Earth. These approaches allow to exclude effects of common drivers and indirect influences. Practical applications are, however, especially challenging if nonlinear interactions are taken into account and for typically strongly autocorrelated climate time series. Here we discuss a new reconstruction approach with accompanying software package (Tigramite) and focus on two applications: (1) Information or perturbation transfer along causal pathways. This method allows to detect and quantify which intermediate nodes are important mediators of an interaction mechanism and is illustrated to disentangle pathways of atmospheric flow over Europe and for the ENSO - Indian Monsoon interaction mechanism. (2) A nonlinear model-free prediction technique that efficiently utilizes causal drivers and can be shown to yield information-theoretically optimal predictors avoiding over-fitting. The performance of this framework is illustrated on a climatological index of El Nino Southern Oscillation. References: Runge, J. (2015). Quantifying information transfer and mediation along causal pathways in complex systems. Phys. Rev. E, 92(6), 062829. doi:10.1103/PhysRevE.92.062829 Runge, J., Donner, R. V., & Kurths, J. (2015). Optimal model-free prediction from multivariate time series. Phys. Rev. E, 91(5), 052909. doi:10.1103/PhysRevE.91.052909 Runge, J., Petoukhov, V., Donges, J. F., Hlinka, J., Jajcay, N., Vejmelka, M., … Kurths, J. (2015). Identifying causal gateways and mediators in complex spatio-temporal systems. Nature Communications, 6, 8502. doi:10.1038/ncomms9502

  14. Time-varying nonstationary multivariate risk analysis using a dynamic Bayesian copula

    NASA Astrophysics Data System (ADS)

    Sarhadi, Ali; Burn, Donald H.; Concepción Ausín, María.; Wiper, Michael P.

    2016-03-01

    A time-varying risk analysis is proposed for an adaptive design framework in nonstationary conditions arising from climate change. A Bayesian, dynamic conditional copula is developed for modeling the time-varying dependence structure between mixed continuous and discrete multiattributes of multidimensional hydrometeorological phenomena. Joint Bayesian inference is carried out to fit the marginals and copula in an illustrative example using an adaptive, Gibbs Markov Chain Monte Carlo (MCMC) sampler. Posterior mean estimates and credible intervals are provided for the model parameters and the Deviance Information Criterion (DIC) is used to select the model that best captures different forms of nonstationarity over time. This study also introduces a fully Bayesian, time-varying joint return period for multivariate time-dependent risk analysis in nonstationary environments. The results demonstrate that the nature and the risk of extreme-climate multidimensional processes are changed over time under the impact of climate change, and accordingly the long-term decision making strategies should be updated based on the anomalies of the nonstationary environment.

  15. Multivariate Analyses of Small Theropod Dinosaur Teeth and Implications for Paleoecological Turnover through Time

    PubMed Central

    Larson, Derek W.; Currie, Philip J.

    2013-01-01

    Isolated small theropod teeth are abundant in vertebrate microfossil assemblages, and are frequently used in studies of species diversity in ancient ecosystems. However, determining the taxonomic affinities of these teeth is problematic due to an absence of associated diagnostic skeletal material. Species such as Dromaeosaurus albertensis, Richardoestesia gilmorei, and Saurornitholestes langstoni are known from skeletal remains that have been recovered exclusively from the Dinosaur Park Formation (Campanian). It is therefore likely that teeth from different formations widely disparate in age or geographic position are not referable to these species. Tooth taxa without any associated skeletal material, such as Paronychodon lacustris and Richardoestesia isosceles, have also been identified from multiple localities of disparate ages throughout the Late Cretaceous. To address this problem, a dataset of measurements of 1183 small theropod teeth (the most specimen-rich theropod tooth dataset ever constructed) from North America, ranging in age from Santonian through Maastrichtian, was analyzed using multivariate statistical methods: canonical variate analysis, pairwise discriminant function analysis, and multivariate analysis of variance. The results indicate that teeth referred to the same taxon from different formations are often quantitatively distinct. In contrast, isolated teeth found in time-equivalent formations are not quantitatively distinguishable from each other. These results support the hypothesis that small theropod taxa, like other dinosaurs in the Late Cretaceous, tend to be exclusive to discrete host formations. The methods outlined have great potential for future studies of isolated teeth worldwide, and may be the most useful non-destructive technique known for extracting the most data possible from isolated and fragmentary specimens. The ability to accurately assess species diversity and turnover through time based on isolated teeth will help illuminate

  16. A lengthy look at the daily grind: time series analysis of events, mood, stress, and satisfaction.

    PubMed

    Fuller, Julie A; Stanton, Jeffrey M; Fisher, Gwenith G; Spitzmuller, Christiane; Russell, Steven S; Smith, Patricia C

    2003-12-01

    The present study investigated processes by which job stress and satisfaction unfold over time by examining the relations between daily stressful events, mood, and these variables. Using a Web-based daily survey of stressor events, perceived strain, mood, and job satisfaction completed by 14 university workers, 1,060 occasions of data were collected. Transfer function analysis, a multivariate version of time series analysis, was used to examine the data for relationships among the measured variables after factoring out the contaminating influences of serial dependency. Results revealed a contrast effect in which a stressful event associated positively with higher strain on the same day and associated negatively with strain on the following day. Perceived strain increased over the course of a semester for a majority of participants, suggesting that effects of stress build over time. Finally, the data were consistent with the notion that job satisfaction is a distal outcome that is mediated by perceived strain. PMID:14640813

  17. Circuit for measuring time differences among events

    DOEpatents

    Romrell, Delwin M.

    1977-01-01

    An electronic circuit has a plurality of input terminals. Application of a first input signal to any one of the terminals initiates a timing sequence. Later inputs to the same terminal are ignored but a later input to any other terminal of the plurality generates a signal which can be used to measure the time difference between the later input and the first input signal. Also, such time differences may be measured between the first input signal and an input signal to any other terminal of the plurality or the circuit may be reset at any time by an external reset signal.

  18. Hemodynamic Monitoring Using Switching Autoregressive Dynamics of Multivariate Vital Sign Time Series

    PubMed Central

    Lehman, Li-wei H.; Nemati, Shamim; Mark, Roger G.

    2016-01-01

    In a critical care setting, shock and resuscitation endpoints are often defined based on arterial blood pressure values. Patient-specific fluctuations and interactions between heart rate (HR) and blood pressure (BP), however, may provide additional prognostic value to stratify individual patients’ risks for adverse outcomes at different blood pressure targets. In this work, we use the switching autoregressive (SVAR) dynamics inferred from the multivariate vital sign time series to stratify mortality risks of intensive care units (ICUs) patients receiving vasopressor treatment. We model vital sign observations as generated from latent states from an autoregressive Hidden Markov Model (AR-HMM) process, and use the proportion of time patients stayed in different latent states to predict outcome. We evaluate the performance of our approach using minute-by-minute HR and mean arterial BP (MAP) of an ICU patient cohort while on vasopressor treatment. Our results indicate that the bivariate HR/MAP dynamics (AUC 0.74 [0.64, 0.84]) contain additional prognostic information beyond the MAP values (AUC 0.53 [0.42, 0.63]) in mortality prediction. Further, HR/MAP dynamics achieved better performance among a subgroup of patients in a low MAP range (median MAP < 65 mmHg) while on pressors. A realtime implementation of our approach may provide clinicians a tool to quantify the effectiveness of interventions and to inform treatment decisions.

  19. A Regularized Linear Dynamical System Framework for Multivariate Time Series Analysis

    PubMed Central

    Liu, Zitao; Hauskrecht, Milos

    2015-01-01

    Linear Dynamical System (LDS) is an elegant mathematical framework for modeling and learning Multivariate Time Series (MTS). However, in general, it is difficult to set the dimension of an LDS’s hidden state space. A small number of hidden states may not be able to model the complexities of a MTS, while a large number of hidden states can lead to overfitting. In this paper, we study learning methods that impose various regularization penalties on the transition matrix of the LDS model and propose a regularized LDS learning framework (rLDS) which aims to (1) automatically shut down LDSs’ spurious and unnecessary dimensions, and consequently, address the problem of choosing the optimal number of hidden states; (2) prevent the overfitting problem given a small amount of MTS data; and (3) support accurate MTS forecasting. To learn the regularized LDS from data we incorporate a second order cone program and a generalized gradient descent method into the Maximum a Posteriori framework and use Expectation Maximization to obtain a low-rank transition matrix of the LDS model. We propose two priors for modeling the matrix which lead to two instances of our rLDS. We show that our rLDS is able to recover well the intrinsic dimensionality of the time series dynamics and it improves the predictive performance when compared to baselines on both synthetic and real-world MTS datasets. PMID:25905027

  20. Timing Processes Are Correlated when Tasks Share a Salient Event

    ERIC Educational Resources Information Center

    Zelaznik, Howard N.; Rosenbaum, David A.

    2010-01-01

    Event timing is manifested when participants make discrete movements such as repeatedly tapping a key. Emergent timing is manifested when participants make continuous movements such as repeatedly drawing a circle. Here we pursued the possibility that providing salient perceptual events to mark the completion of time intervals could allow circle…

  1. Analysis of heterogeneous dengue transmission in Guangdong in 2014 with multivariate time series model

    PubMed Central

    Cheng, Qing; Lu, Xin; Wu, Joseph T.; Liu, Zhong; Huang, Jincai

    2016-01-01

    Guangdong experienced the largest dengue epidemic in recent history. In 2014, the number of dengue cases was the highest in the previous 10 years and comprised more than 90% of all cases. In order to analyze the heterogeneous transmission of dengue, a multivariate time series model decomposing dengue risk additively into endemic, autoregressive and spatiotemporal components was used to model dengue transmission. Moreover, random effects were introduced into the model to deal with heterogeneous dengue transmission and incidence levels, and a power-law approach was embedded into the model to account for spatial interaction. There was little spatial variation in the autoregressive component. In contrast, for the endemic component, there was a pronounced heterogeneity between the Pearl River Delta area and the remaining districts. For the spatiotemporal component, there was considerable heterogeneity across districts, with the highest values in some western and eastern districts. Clustering analysis revealed the patterns driving dengue transmission: the endemic component appears to be important in the Pearl River Delta area, where the incidence is high (95 per 100,000), while areas with relatively low incidence (4 per 100,000) are highly dependent on spatiotemporal spread and local autoregression. PMID:27666657

  2. Bicomponent Trend Maps: A Multivariate Approach to Visualizing Geographic Time Series

    PubMed Central

    Schroeder, Jonathan P.

    2012-01-01

    The most straightforward approaches to temporal mapping cannot effectively illustrate all potentially significant aspects of spatio-temporal patterns across many regions and times. This paper introduces an alternative approach, bicomponent trend mapping, which employs a combination of principal component analysis and bivariate choropleth mapping to illustrate two distinct dimensions of long-term trend variations. The approach also employs a bicomponent trend matrix, a graphic that illustrates an array of typical trend types corresponding to different combinations of scores on two principal components. This matrix is useful not only as a legend for bicomponent trend maps but also as a general means of visualizing principal components. To demonstrate and assess the new approach, the paper focuses on the task of illustrating population trends from 1950 to 2000 in census tracts throughout major U.S. urban cores. In a single static display, bicomponent trend mapping is not able to depict as wide a variety of trend properties as some other multivariate mapping approaches, but it can make relationships among trend classes easier to interpret, and it offers some unique flexibility in classification that could be particularly useful in an interactive data exploration environment. PMID:23504193
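
    A minimal sketch of the bicomponent construction: each areal unit's population time series is reduced to scores on the first two principal components, and the two score axes are classified into terciles to give a 3 x 3 bicomponent legend. The census years, tract counts and class breaks below are synthetic placeholders.

```python
# PCA scores on synthetic tract-level trend series, classified into a 3 x 3 bicomponent legend.
import numpy as np

rng = np.random.default_rng(8)
years = np.arange(1950, 2001, 10)                       # 6 census time points (placeholder)
n_tracts = 500
growth = rng.normal(0.0, 1.0, n_tracts)
curvature = rng.normal(0.0, 1.0, n_tracts)
t = (years - years.mean()) / years.std()
series = 100 + growth[:, None] * t + curvature[:, None] * (t ** 2 - 1) \
         + rng.normal(0, 0.2, (n_tracts, len(t)))

X = series - series.mean(axis=0)                        # center each time point across tracts
U, s, Vt = np.linalg.svd(X, full_matrices=False)        # PCA via SVD
scores = U[:, :2] * s[:2]                               # tract scores on the first two components

def tercile_class(v):
    lo, hi = np.quantile(v, [1 / 3, 2 / 3])
    return np.digitize(v, [lo, hi])                     # classes 0, 1, 2

bicomponent = 3 * tercile_class(scores[:, 0]) + tercile_class(scores[:, 1])  # 9 legend classes
print(np.bincount(bicomponent, minlength=9))
```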

  3. Detecting synchronization clusters in multivariate time series via coarse-graining of Markov chains

    NASA Astrophysics Data System (ADS)

    Allefeld, Carsten; Bialonski, Stephan

    2007-12-01

    Synchronization cluster analysis is an approach to the detection of underlying structures in data sets of multivariate time series, starting from a matrix R of bivariate synchronization indices. A previous method utilized the eigenvectors of R for cluster identification, analogous to several recent attempts at group identification using eigenvectors of the correlation matrix. All of these approaches assumed a one-to-one correspondence of dominant eigenvectors and clusters, which has however been shown to be wrong in important cases. We clarify the usefulness of eigenvalue decomposition for synchronization cluster analysis by translating the problem into the language of stochastic processes, and derive an enhanced clustering method harnessing recent insights from the coarse-graining of finite-state Markov processes. We illustrate the operation of our method using a simulated system of coupled Lorenz oscillators, and we demonstrate its superior performance over the previous approach. Finally we investigate the question of robustness of the algorithm against small sample size, which is important with regard to field applications.
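
    The row-normalisation idea can be sketched compactly: treat the matrix R of bivariate synchronization indices as unnormalised transition rates, turn it into a stochastic matrix, and read the cluster structure off its dominant non-trivial eigenvectors. The Python toy below follows only that general recipe; the paper's actual coarse-graining of the Markov chain is more refined, and the matrix, group sizes and clustering step here are invented for illustration.

      import numpy as np
      from scipy.cluster.vq import kmeans2

      rng = np.random.default_rng(1)

      # Toy synchronization matrix R: two groups of channels with stronger
      # within-group bivariate indices than between-group indices.
      labels_true = np.repeat([0, 1], 6)
      R = 0.2 + 0.05 * rng.random((12, 12))
      R[np.ix_(labels_true == 0, labels_true == 0)] = 0.8
      R[np.ix_(labels_true == 1, labels_true == 1)] = 0.8
      R = (R + R.T) / 2
      np.fill_diagonal(R, 1.0)

      # Interpret R as transition rates of a finite-state Markov chain.
      P = R / R.sum(axis=1, keepdims=True)

      # Slow (large-eigenvalue) eigenvectors carry the cluster structure.
      vals, vecs = np.linalg.eig(P)
      idx = np.argsort(-vals.real)
      embedding = vecs[:, idx[1:3]].real          # skip the trivial eigenvector

      _, labels = kmeans2(embedding, 2, minit="++", seed=2)
      print(labels)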

  4. Environmental Events and the Timing of Death.

    ERIC Educational Resources Information Center

    Marriott, Cindy

    There is some evidence that the timing of death may not be random. Taking into consideration some of the variables which possibly affect death, this paper reviews intervention techniques with the possible goal of saving lives. Knowing that the elderly respond to the environment, society should accept as its responsibility the provision of support…

  5. ETARA - EVENT TIME AVAILABILITY, RELIABILITY ANALYSIS

    NASA Technical Reports Server (NTRS)

    Viterna, L. A.

    1994-01-01

    The ETARA system was written to evaluate the performance of the Space Station Freedom Electrical Power System, but the methodology and software can be modified to simulate any system that can be represented by a block diagram. ETARA is an interactive, menu-driven reliability, availability, and maintainability (RAM) simulation program. Given a Reliability Block Diagram representation of a system, the program simulates the behavior of the system over a specified period of time using Monte Carlo methods to generate block failure and repair times as a function of exponential and/or Weibull distributions. ETARA can calculate availability parameters such as equivalent availability, state availability (percentage of time at a particular output state capability), continuous state duration and number of state occurrences. The program can simulate initial spares allotment and spares replenishment for a resupply cycle. The number of block failures is tabulated both individually and by block type. ETARA also records total downtime, repair time, and time waiting for spares. Maintenance man-hours per year and system reliability, with or without repair, at or above a particular output capability can also be calculated. The key to using ETARA is the development of a reliability or availability block diagram. The block diagram is a logical graphical illustration depicting the block configuration necessary for a function to be successfully accomplished. Each block can represent a component, a subsystem, or a system. The function attributed to each block is considered for modeling purposes to be either available or unavailable; there are no degraded modes of block performance. A block does not have to represent physically connected hardware in the actual system to be connected in the block diagram. The block needs only to have a role in contributing to an available system function. ETARA can model the RAM characteristics of systems represented by multilayered, nesting block diagrams
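
    The core Monte Carlo mechanics (draw failure times from exponential or Weibull distributions, draw repair times, and accumulate downtime against the mission time) can be sketched in a few lines. The Python below is a generic illustration with invented parameters, here a two-block series system with exponential failures and Weibull repairs; it is not the ETARA code and it ignores spares, resupply cycles and degraded modes.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical two-block series system: the function is available only
      # while both blocks are up.  Times are in hours.
      blocks = [dict(mtbf=500.0, shape=1.5, scale=20.0),
                dict(mtbf=800.0, shape=1.2, scale=30.0)]
      mission, n_runs = 8760.0, 2000            # one year, Monte Carlo replications

      def downtime_one_run(b):
          """Total downtime of a single block over one mission."""
          t, down = 0.0, 0.0
          while True:
              t += rng.exponential(b["mtbf"])            # time to next failure
              if t >= mission:
                  return down
              repair = b["scale"] * rng.weibull(b["shape"])
              down += min(repair, mission - t)
              t += repair

      avail = np.empty(n_runs)
      for r in range(n_runs):
          # Simplification: block downtimes are summed independently, so any
          # overlapping outages are double-counted (slightly conservative).
          down = sum(downtime_one_run(b) for b in blocks)
          avail[r] = 1.0 - min(down, mission) / mission

      print(f"equivalent availability ~ {avail.mean():.4f}")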

  6. Detection of flood events in hydrological discharge time series

    NASA Astrophysics Data System (ADS)

    Seibert, S. P.; Ehret, U.

    2012-04-01

    The shortcomings of mean-squared-error (MSE) based distance metrics are well known (Beran 1999, Schaeffli & Gupta 2007) and the development of novel distance metrics (Pappenberger & Beven 2004, Ehret & Zehe 2011) and multi-criteria approaches enjoy increasing popularity (Reusser 2009, Gupta et al. 2009). Nevertheless, the hydrological community still lacks metrics which identify, and thus allow, signature-based evaluations of hydrological discharge time series. Signature-based information/evaluations are required wherever specific time series features, such as flood events, are of special concern. Calculation of event-based runoff coefficients or precise knowledge of flood event characteristics (like the onset or duration of the rising limb, or the volume of the falling limb, etc.) are possible applications. The same applies for flood forecasting/simulation models. Directly comparing simulated and observed flood event features may reveal thorough insights into model dynamics. Compared to continuous space-and-time-aggregated distance metrics, event-based evaluations may provide answers like the distributions of event characteristics or the percentage of the events which were actually reproduced by a hydrological model. It may also help to provide information on the simulation accuracy of small, medium and/or large events in terms of timing and magnitude. However, the number of approaches which expose time series features is small and their usage is limited to very specific questions (Merz & Blöschl 2009, Norbiato et al. 2009). We believe this is due to the following reasons: i) a generally accepted definition of the signature of interest is missing or difficult to obtain (in our case: what makes a flood event a flood event?) and/or ii) it is difficult to translate such a definition into an equation or (graphical) procedure which exposes the feature of interest in the discharge time series. We reviewed approaches which detect event starts and/or ends in hydrological discharge time

  7. Intraoperative imaging of cortical cerebral perfusion by time-resolved thermography and multivariate data analysis

    NASA Astrophysics Data System (ADS)

    Steiner, Gerald; Sobottka, Stephan B.; Koch, Edmund; Schackert, Gabriele; Kirsch, Matthias

    2011-01-01

    A new approach to cortical perfusion imaging is demonstrated using high-sensitivity thermography in conjunction with multivariate statistical data analysis. Local temperature changes caused by a cold bolus are imaged and transferred to a false color image. A cold bolus of 10 ml saline at ice temperature is injected systemically via a central venous access. During the injection, a sequence of 735 thermographic images is recorded within 2 min. The recorded data cube is subjected to a principal component analysis (PCA) to extract slight changes in cortical temperature caused by the cold bolus. PCA reveals that 11 s after injection the temperature of the blood vessels briefly decreases and then returns to its value before the cold bolus was injected. We demonstrate the potential of intraoperative thermography in combination with multivariate data analysis to image cortical cerebral perfusion without any markers. We provide the first in vivo application of multivariate thermographic imaging.
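
    The analysis step lends itself to a compact sketch: stack the recorded frames into a frames-by-pixels matrix, remove the temporal mean, and compute a principal component decomposition whose leading temporal score traces the cold-bolus transit and whose spatial loading provides the false-colour perfusion map. The Python below runs on synthetic data (apart from the frame count, every number and the simulated "vessel" region are invented) and uses plain SVD-based PCA; it is only a schematic of the published approach.

      import numpy as np

      rng = np.random.default_rng(3)

      # Hypothetical stack of thermographic frames: (n_frames, ny, nx) temperatures.
      n_frames, ny, nx = 735, 64, 64
      frames = 36.5 + 0.05 * rng.standard_normal((n_frames, ny, nx))

      # Synthetic "cold bolus": a transient temperature dip in a vessel-shaped region.
      vessel = np.zeros((ny, nx), dtype=bool)
      vessel[30:34, :] = True
      dip = -0.3 * np.exp(-0.5 * ((np.arange(n_frames) - 400) / 30.0) ** 2)
      frames[:, vessel] += dip[:, None]

      # PCA on the mean-removed data cube: rows = frames, columns = pixels.
      X = frames.reshape(n_frames, -1)
      X = X - X.mean(axis=0)
      U, S, Vt = np.linalg.svd(X, full_matrices=False)

      scores = U[:, 0] * S[0]                    # temporal course of the first component
      loading = Vt[0].reshape(ny, nx)            # spatial pattern (false-colour map)
      print("frame of largest excursion:", np.argmax(np.abs(scores)))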

  8. Balance characteristics of multivariate background error covariance for rainy and dry seasons and their impact on precipitation forecasts of two rainfall events

    NASA Astrophysics Data System (ADS)

    Chen, Yaodeng; Xia, Xue; Min, Jinzhong; Huang, Xiang-Yu; Rizvi, Syed R. H.

    2016-10-01

    Atmospheric moisture content or humidity is an important analysis variable of any meteorological data assimilation system. The humidity analysis can be univariate, using a humidity background (normally short-range numerical forecasts) and humidity observations. However, more and more data assimilation systems are multivariate, analyzing humidity together with wind, temperature and pressure. Background error covariances, with unbalanced velocity potential and humidity in the multivariate formulation, are generated from Weather Research and Forecasting model forecasts collected over a summer rainy season and a winter dry season. The correlations related to unbalanced velocity potential and humidity are shown to be significantly larger in the rainy season than in the dry season, indicating that unbalanced velocity potential and humidity play more important roles in the rainy season. Three cycling data assimilation experiments on two rainfall events in the middle and lower reaches of the Yangtze River are carried out. The experiments differ in the formulation of the background error covariances. Results indicate that including only unbalanced velocity potential in the multivariate background error covariance improves the wind analyses but has little impact on the temperature and humidity analyses. In contrast, further including humidity has a slight negative effect on the wind analyses and a neutral effect on the temperature analyses, but significantly improves the humidity analyses, leading to precipitation forecasts more consistent with the China Hourly Merged Precipitation Analysis.

  9. Nuclear event zero-time calculation and uncertainty evaluation.

    PubMed

    Pan, Pujing; Ungar, R Kurt

    2012-04-01

    It is important to know the initial time, or zero-time, of a nuclear event such as a nuclear weapon's test, a nuclear power plant accident or a nuclear terrorist attack (e.g. with an improvised nuclear device, IND). Together with relevant meteorological information, the calculated zero-time is used to help locate the origin of a nuclear event. The zero-time of a nuclear event can be derived from measured activity ratios of two nuclides. The calculated zero-time of a nuclear event would not be complete without an appropriately evaluated uncertainty term. In this paper, analytical equations for zero-time and the associated uncertainty calculations are derived using a measured activity ratio of two nuclides. Application of the derived equations is illustrated in a realistic example using data from the last Chinese thermonuclear test in 1980.
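
    The paper derives its own analytical equations; a common textbook form of the relation, assumed here purely for illustration, is that the two nuclides are created at the event with a known initial activity ratio R0 and decay independently, so the measured ratio obeys R(t) = R0 exp(-(lambda1 - lambda2) t), the elapsed time is ln(R0/R)/(lambda1 - lambda2), and the uncertainty follows from first-order error propagation. The Python sketch below implements that form with made-up numbers; it is not the authors' derivation.

      import numpy as np

      def zero_time(ratio, sigma_ratio, ratio0, sigma_ratio0, half_life_1, half_life_2):
          """Elapsed time since the event and its 1-sigma uncertainty.

          Assumes both nuclides are produced at the event with known initial
          activity ratio ratio0 and decay independently, so that
              R(t) = ratio0 * exp(-(lam1 - lam2) * t).
          """
          lam1 = np.log(2.0) / half_life_1
          lam2 = np.log(2.0) / half_life_2
          dt = np.log(ratio0 / ratio) / (lam1 - lam2)
          # First-order error propagation on the measured and assumed ratios.
          sigma_dt = np.hypot(sigma_ratio / ratio, sigma_ratio0 / ratio0) / abs(lam1 - lam2)
          return dt, sigma_dt

      # Illustrative, made-up numbers: half-lives in days, ratios dimensionless.
      print(zero_time(ratio=0.35, sigma_ratio=0.02, ratio0=1.2, sigma_ratio0=0.05,
                      half_life_1=8.02, half_life_2=30.1))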

  10. Family Events and the Timing of Intergenerational Transfers

    ERIC Educational Resources Information Center

    Leopold, Thomas; Schneider, Thorsten

    2011-01-01

    This research investigates how family events in adult children's lives influence the timing of their parents' financial transfers. We draw on retrospective data collected by the German Socio-Economic Panel Study and use event history models to study the effects of marriage, divorce and childbirth on the receipt of large gifts from parents. We find…

  11. Sensor-Generated Time Series Events: A Definition Language

    PubMed Central

    Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan

    2012-01-01

    There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device called a posturograph is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is, generally applicable and accurate, for identifying the events contained in the time series.

  12. Reciprocal Benefits of Mass-Univariate and Multivariate Modeling in Brain Mapping: Applications to Event-Related Functional MRI, H2 15O-, and FDG-PET

    PubMed Central

    Habeck, Christian G.

    2006-01-01

    In brain mapping studies of sensory, cognitive, and motor operations, specific waveforms of dynamic neural activity are predicted based on theoretical models of human information processing. For example, in event-related functional MRI (fMRI), the general linear model (GLM) is employed in mass-univariate analyses to identify the regions whose dynamic activity closely matches the expected waveforms. By comparison, multivariate analyses based on PCA or ICA provide greater flexibility in detecting spatiotemporal properties of experimental data that may strongly support alternative neuroscientific explanations. We investigated conjoint multivariate and mass-univariate analyses that combine the capabilities to (1) verify activation of neural machinery we already understand and (2) discover reliable signatures of new neural machinery. We examined combinations of GLM and PCA that recover latent neural signals (waveforms and footprints) with greater accuracy than either method alone. Comparative results are illustrated with analyses of real fMRI data, in addition to support from Monte Carlo simulations. PMID:23165047

  13. Bayesian joint modeling of longitudinal measurements and time-to-event data using robust distributions.

    PubMed

    Baghfalaki, T; Ganjali, M; Hashemi, R

    2014-01-01

    Distributional assumptions of most of the existing methods for joint modeling of longitudinal measurements and time-to-event data cannot allow incorporation of outlier robustness. In this article, we develop and implement a joint model of longitudinal and time-to-event data using a class of distributions well suited to robust analysis, known as normal/independent distributions. These distributions include univariate and multivariate versions of the Student's t, the slash, and the contaminated normal distributions. The proposed model implements a linear mixed effects model under a normal/independent distribution assumption for both random effects and residuals of the longitudinal process. For the time-to-event process, a parametric proportional hazards model with a Weibull baseline hazard is used. Also, a Bayesian approach using the Markov chain Monte Carlo method is adopted for parameter estimation. Some simulation studies are performed to investigate the performance of the proposed method in the presence and absence of outliers. The proposed methods are also applied to a real AIDS clinical trial, with the aim of comparing the efficiency and safety of two antiretroviral drugs, where CD4 count measurements are gathered as longitudinal outcomes. In these data, time to death or dropout is considered as the time-to-event outcome of interest. Different model structures are developed for analyzing these data sets, where model selection is performed by the deviance information criterion (DIC), expected Akaike information criterion (EAIC), and expected Bayesian information criterion (EBIC).

  14. Observer Agreement for Timed-Event Sequential Data: A Comparison of Time-Based and Event-Based Algorithms

    PubMed Central

    Bakeman, Roger; Quera, Vicenç; Gnisci, Augusto

    2009-01-01

    Observer agreement is often regarded as the sine qua non of observational research. Cohen’s kappa is a widely used index and is appropriate when discrete entities, such as a turn-of-talk or a demarcated time interval, are presented to pairs of observers to code. Kappa-like statistics and agreement matrices are also used for the timed-event sequential data produced when observers first segment and then code events detected in the stream of behavior, noting onset and offset times. Such kappas are of two kinds, time-based and event-based. Available for download is a computer program (OASTES, Observer Agreement for Simulated Timed Event Sequences) that simulates the coding of observers of a stated accuracy, and then computes agreement statistics for two time-based kappas (with and without tolerance) and three event-based kappas (one implemented in The Observer, one in INTERACT, and one in GSEQ). Based on the simulation results presented here, and due to the somewhat different information provided by each, reporting of both a time-based and an event-based kappa is recommended. PMID:19182133
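
    For the time-based variant without tolerance, the computation is simply Cohen's kappa over aligned time units. The Python sketch below uses simulated observers and covers only that plain time-based kappa; the tolerance and the event-based variants implemented in OASTES, The Observer, INTERACT and GSEQ require an event-alignment step that is not shown here.

      import numpy as np

      def time_based_kappa(codes_a, codes_b, n_codes):
          """Cohen's kappa over aligned time units (e.g. seconds) for two observers."""
          codes_a, codes_b = np.asarray(codes_a), np.asarray(codes_b)
          # Agreement matrix: rows = observer A's code, columns = observer B's code.
          agree = np.zeros((n_codes, n_codes))
          np.add.at(agree, (codes_a, codes_b), 1)
          p = agree / agree.sum()
          po = np.trace(p)                           # observed agreement
          pe = p.sum(axis=1) @ p.sum(axis=0)         # chance agreement
          return (po - pe) / (1.0 - pe)

      # Two observers coding the same 60-second stream with 3 behaviour codes;
      # observer B copies observer A faithfully about 80% of the time.
      rng = np.random.default_rng(4)
      a = rng.integers(0, 3, 60)
      b = np.where(rng.random(60) < 0.8, a, rng.integers(0, 3, 60))
      print(round(time_based_kappa(a, b, 3), 3))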

  15. Analysing adverse events by time-to-event models: the CLEOPATRA study.

    PubMed

    Proctor, Tanja; Schumacher, Martin

    2016-07-01

    When analysing primary and secondary endpoints in a clinical trial with patients suffering from a chronic disease, statistical models for time-to-event data are commonly used and accepted. This is in contrast to the analysis of data on adverse events where often only a table with observed frequencies and corresponding test statistics is reported. An example is the recently published CLEOPATRA study where a three-drug regimen is compared with a two-drug regimen in patients with HER2-positive first-line metastatic breast cancer. Here, as described earlier, primary and secondary endpoints (progression-free and overall survival) are analysed using time-to-event models, whereas adverse events are summarized in a simple frequency table, although the duration of study treatment differs substantially. In this paper, we demonstrate the application of time-to-event models to first serious adverse events using the data of the CLEOPATRA study. This will cover the broad range between a simple incidence rate approach over survival and competing risks models (with death as a competing event) to multi-state models. We illustrate all approaches by means of graphical displays highlighting the temporal dynamics and compare the obtained results. For the CLEOPATRA study, the resulting hazard ratios are all in the same order of magnitude. But the use of time-to-event models provides valuable and additional information that would potentially be overlooked by only presenting incidence proportions. These models adequately address the temporal dynamics of serious adverse events as well as death of patients. Copyright © 2016 John Wiley & Sons, Ltd.
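
    The range of approaches mentioned, from a simple incidence rate to competing-risks models with death as a competing event, can be illustrated schematically. The Python sketch below computes an incidence rate and an Aalen-Johansen-type cumulative incidence of a first serious adverse event with death as the competing event, using invented follow-up data; it is not the CLEOPATRA analysis, handles ties only approximately, and omits the multi-state extension.

      import numpy as np

      def cumulative_incidence(time, status, event_code=1):
          """Cumulative incidence for one event type with a competing event.

          status: 0 = censored, 1 = first serious adverse event,
                  2 = competing event (death before any serious adverse event).
          """
          time, status = np.asarray(time, float), np.asarray(status)
          order = np.argsort(time)
          time, status = time[order], status[order]
          n = len(time)
          at_risk = n - np.arange(n)                 # subjects still at risk
          surv, cif = 1.0, 0.0
          curve = []
          for t, s, r in zip(time, status, at_risk):
              if s == event_code:
                  cif += surv / r                    # S(t-) * d1 / r
              if s != 0:
                  surv *= 1.0 - 1.0 / r              # overall event-free survival
              curve.append(cif)
          return time, np.array(curve)

      # Illustrative data: follow-up time in days until first serious AE (1),
      # death without a prior serious AE (2), or censoring (0).
      t = [30, 45, 60, 90, 120, 150, 200, 210, 300, 365]
      s = [1,  0,  2,  1,  1,   0,   2,   1,   0,   0]
      print("incidence rate (per person-day):", sum(x == 1 for x in s) / sum(t))
      times, cif = cumulative_incidence(t, s)
      print("cumulative incidence at end of follow-up:", round(cif[-1], 3))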

  16. The Carrington Event: Possible Solar Proton Intensity-Time Profile

    NASA Astrophysics Data System (ADS)

    Smart, D. F.; Shea, M. A.; McCracken, K. G.

    2004-05-01

    We evaluate the >30 MeV proton fluence associated with the Carrington event as 1.9 x 10**10 protons per square centimeter, based on the analysis of solar-proton-generated NO(y) radicals that are deposited in polar ice. (See McCracken et al., JGR, 106, 21,585, 2001.) We construct a possible intensity-time profile of the solar particle flux for this event by assuming that it is part of the class of interplanetary-shock-dominated events in which the maximum particle flux is observed as the shock passes the Earth. We show that most of the very large solar proton fluence events (those with >30 MeV omnidirectional fluence exceeding 1 x 10**9 protons per square centimeter) observed at the Earth during the last 50 years belong to this class of event.

  17. An iterative technique to stabilize a linear time invariant multivariable system with output feedback

    NASA Technical Reports Server (NTRS)

    Sankaran, V.

    1974-01-01

    An iterative procedure for determining the constant gain matrix that will stabilize a linear constant multivariable system using output feedback is described. The use of this procedure avoids the transformation of variables which is required in other procedures. For the case in which the product of the output and input vector dimensions is greater than the number of states of the plant, a general solution is given. In the case in which the number of states exceeds the product of the input and output vector dimensions, a least-squares solution, which may not be stable in all cases, is presented. The results are illustrated with examples.

  18. Moving Events in Time: Time-Referent Hand-Arm Movements Influence Perceived Temporal Distance to Past Events

    ERIC Educational Resources Information Center

    Blom, Stephanie S. A. H.; Semin, Gun R.

    2013-01-01

    We examine and find support for the hypothesis that time-referent hand-arm movements influence temporal judgments. In line with the concept of "left is associated with earlier times, and right is associated with later times," we show that performing left (right) hand-arm movements while thinking about a past event increases (decreases) the…

  19. Effects of alcohol intake on time-based event expectations.

    PubMed

    Kunchulia, Marina; Thomaschke, Roland

    2016-04-01

    Previous evidence suggests that alcohol affects various forms of temporal cognition. However, there are presently no studies investigating whether and how alcohol affects time-based event expectations. Here, we investigated the effects of alcohol on time-based event expectations. Seventeen healthy volunteers, aged between 19 and 36 years, participated. We employed a variable foreperiod paradigm with temporally predictable events, mimicking a computer game. Error rate and reaction time were analyzed in placebo (0 g/kg), low dose (0.2 g/kg) and high dose (0.6 g/kg) conditions. We found that alcohol intake did not eliminate, but substantially reduced, the formation of time-based expectancy. This effect was stronger for high doses than for low doses of alcohol. Our results thus provide evidence that alcohol intake impairs time-based event expectations. The mechanism by which alcohol impairs time-based event expectations needs to be clarified by future research. PMID:26680768

  20. Asynchronous visual event-based time-to-contact.

    PubMed

    Clady, Xavier; Clercq, Charles; Ieng, Sio-Hoi; Houseini, Fouzhan; Randazzo, Marco; Natale, Lorenzo; Bartolozzi, Chiara; Benosman, Ryad

    2014-01-01

    Reliable and fast sensing of the environment is a fundamental requirement for autonomous mobile robotic platforms. Unfortunately, the frame-based acquisition paradigm at the basis of mainstream artificial perceptive systems is limited by low temporal dynamics and redundant data flow, leading to high computational costs. Hence, conventional sensing and the associated computation are clearly incompatible with the design of high-speed sensor-based reactive control for mobile applications, which pose strict limits on energy consumption and computational load. This paper introduces a fast obstacle avoidance method based on the output of an asynchronous event-based time encoded imaging sensor. The proposed method relies on an event-based Time To Contact (TTC) computation based on visual event-based motion flows. The approach is event-based in the sense that every incoming event adds to the computation process, thus allowing fast avoidance responses. The method is validated indoors on a mobile robot, comparing the event-based TTC with a laser range finder TTC, showing that event-based sensing offers new perspectives for mobile robotics sensing. PMID:24570652

  1. Asynchronous visual event-based time-to-contact.

    PubMed

    Clady, Xavier; Clercq, Charles; Ieng, Sio-Hoi; Houseini, Fouzhan; Randazzo, Marco; Natale, Lorenzo; Bartolozzi, Chiara; Benosman, Ryad

    2014-01-01

    Reliable and fast sensing of the environment is a fundamental requirement for autonomous mobile robotic platforms. Unfortunately, the frame-based acquisition paradigm at the basis of mainstream artificial perceptive systems is limited by low temporal dynamics and redundant data flow, leading to high computational costs. Hence, conventional sensing and the associated computation are clearly incompatible with the design of high-speed sensor-based reactive control for mobile applications, which pose strict limits on energy consumption and computational load. This paper introduces a fast obstacle avoidance method based on the output of an asynchronous event-based time encoded imaging sensor. The proposed method relies on an event-based Time To Contact (TTC) computation based on visual event-based motion flows. The approach is event-based in the sense that every incoming event adds to the computation process, thus allowing fast avoidance responses. The method is validated indoors on a mobile robot, comparing the event-based TTC with a laser range finder TTC, showing that event-based sensing offers new perspectives for mobile robotics sensing.

  2. Visualization-by-Sketching: An Artist's Interface for Creating Multivariate Time-Varying Data Visualizations.

    PubMed

    Schroeder, David; Keefe, Daniel F

    2016-01-01

    We present Visualization-by-Sketching, a direct-manipulation user interface for designing new data visualizations. The goals are twofold: First, make the process of creating real, animated, data-driven visualizations of complex information more accessible to artists, graphic designers, and other visual experts with traditional, non-technical training. Second, support and enhance the role of human creativity in visualization design, enabling visual experimentation and workflows similar to what is possible with traditional artistic media. The approach is to conceive of visualization design as a combination of processes that are already closely linked with visual creativity: sketching, digital painting, image editing, and reacting to exemplars. Rather than studying and tweaking low-level algorithms and their parameters, designers create new visualizations by painting directly on top of a digital data canvas, sketching data glyphs, and arranging and blending together multiple layers of animated 2D graphics. This requires new algorithms and techniques to interpret painterly user input relative to data "under" the canvas, balance artistic freedom with the need to produce accurate data visualizations, and interactively explore large (e.g., terabyte-sized) multivariate datasets. Results demonstrate a variety of multivariate data visualization techniques can be rapidly recreated using the interface. More importantly, results and feedback from artists support the potential for interfaces in this style to attract new, creative users to the challenging task of designing more effective data visualizations and to help these users stay "in the creative zone" as they work.

  3. Visualization-by-Sketching: An Artist's Interface for Creating Multivariate Time-Varying Data Visualizations.

    PubMed

    Schroeder, David; Keefe, Daniel F

    2016-01-01

    We present Visualization-by-Sketching, a direct-manipulation user interface for designing new data visualizations. The goals are twofold: First, make the process of creating real, animated, data-driven visualizations of complex information more accessible to artists, graphic designers, and other visual experts with traditional, non-technical training. Second, support and enhance the role of human creativity in visualization design, enabling visual experimentation and workflows similar to what is possible with traditional artistic media. The approach is to conceive of visualization design as a combination of processes that are already closely linked with visual creativity: sketching, digital painting, image editing, and reacting to exemplars. Rather than studying and tweaking low-level algorithms and their parameters, designers create new visualizations by painting directly on top of a digital data canvas, sketching data glyphs, and arranging and blending together multiple layers of animated 2D graphics. This requires new algorithms and techniques to interpret painterly user input relative to data "under" the canvas, balance artistic freedom with the need to produce accurate data visualizations, and interactively explore large (e.g., terabyte-sized) multivariate datasets. Results demonstrate a variety of multivariate data visualization techniques can be rapidly recreated using the interface. More importantly, results and feedback from artists support the potential for interfaces in this style to attract new, creative users to the challenging task of designing more effective data visualizations and to help these users stay "in the creative zone" as they work. PMID:26529734

  4. Time Separation Between Events in a Sequence: a Regional Property?

    NASA Astrophysics Data System (ADS)

    Muirwood, R.; Fitzenz, D. D.

    2013-12-01

    Earthquake sequences are loosely defined as events occurring too closely in time and space to appear unrelated. Depending on the declustering method, several, all, or no event(s) after the first large event might be recognized as independent mainshocks. It can therefore be argued that a probabilistic seismic hazard assessment (PSHA, traditionally dealing with mainshocks only) might already include the ground shaking effects of such sequences. Alternatively, all but the largest event could be classified as an 'aftershock' and removed from the earthquake catalog. While in PSHA the question is only whether to keep or remove the events from the catalog, for Risk Management purposes, the community response to the earthquakes, as well as insurance risk transfer mechanisms, can be profoundly affected by the actual timing of events in such a sequence. In particular the repetition of damaging earthquakes over a period of weeks to months can lead to businesses closing and families evacuating from the region (as happened in Christchurch, New Zealand in 2011). Buildings that are damaged in the first earthquake may go on to be damaged again, even while they are being repaired. Insurance also functions around a set of critical timeframes - including the definition of a single 'event loss' for reinsurance recoveries within the 192-hour 'hours clause', the 6-18 month pace at which insurance claims are settled, and the annual renewal of insurance and reinsurance contracts. We show how temporal aspects of earthquake sequences need to be taken into account within models for Risk Management, and which time separations between events are most sensitive, both in terms of the modeled disruptions to lifelines and business activity as well as in the losses to different parties (such as insureds, insurers and reinsurers). We also explore the time separation between all events and between loss-causing events for a collection of sequences from across the world and we point to the need to

  5. Identifying multiple periodicities in sparse photon event time series

    NASA Astrophysics Data System (ADS)

    Koen, Chris

    2016-07-01

    The data considered are event times (e.g. photon arrival times, or the occurrence of sharp pulses). The source is multiperiodic, or the data could be multiperiodic because several unresolved sources contribute to the time series. Most events may be unobserved, either because the source is intermittent, or because some events are below the detection limit. The data may also be contaminated by spurious pulses. The problem considered is the determination of the periods in the data. A two-step procedure is proposed: in the first, a likely period is identified; in the second, events associated with this periodicity are removed from the time series. The steps are repeated until the remaining events do not exhibit any periodicity. A number of period-finding methods from the literature are reviewed, and a new maximum likelihood statistic is also introduced. It is shown that the latter is competitive compared to other techniques. The proposed methodology is tested on simulated data. Observations of two rotating radio transients are discussed, but contrary to claims in the literature, no evidence for multiperiodicity could be found.
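
    A standard way to score candidate periods in a set of event times is to fold the times at each trial period and measure how concentrated the resulting phases are, for example with the Rayleigh Z statistic. The Python sketch below applies that generic statistic to simulated event times with missing events, timing jitter and spurious pulses; it is not the maximum likelihood statistic introduced in the paper, and the paper's two-step procedure would additionally remove the events attributed to the detected period and repeat the search.

      import numpy as np

      def rayleigh_power(event_times, trial_periods):
          """Rayleigh Z statistic of phase concentration for each trial period."""
          t = np.asarray(event_times, float)
          z = np.empty(len(trial_periods))
          for k, p in enumerate(trial_periods):
              phase = 2.0 * np.pi * (t % p) / p
              z[k] = (np.cos(phase).sum() ** 2 + np.sin(phase).sum() ** 2) / len(t)
          return z

      rng = np.random.default_rng(5)
      true_period = 1.7
      # Sparse periodic events: most cycles unobserved, small timing jitter,
      # plus a handful of spurious (non-periodic) pulses.
      events = np.sort(rng.choice(np.arange(2000) * true_period, 150, replace=False))
      events = events + 0.02 * rng.standard_normal(events.size)
      events = np.concatenate([events, rng.uniform(0.0, events.max(), 40)])

      periods = np.linspace(1.0, 3.0, 20000)
      z = rayleigh_power(events, periods)
      print("best trial period ~", round(periods[np.argmax(z)], 3))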

  6. Events in time: Basic analysis of Poisson data

    SciTech Connect

    Engelhardt, M.E.

    1994-09-01

    The report presents basic statistical methods for analyzing Poisson data, such as the number of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model for the case in which the rate of occurrence varies randomly. Examples and SAS programs are given.
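
    For a homogeneous Poisson process, the point estimate of the rate is simply the event count divided by the exposure time, and an exact two-sided confidence interval follows from chi-square quantiles. A minimal Python sketch of these textbook formulas (the report itself uses SAS programs):

      from scipy.stats import chi2

      def poisson_rate(n_events, exposure_time, conf=0.95):
          """Point estimate and exact confidence interval for the occurrence rate."""
          alpha = 1.0 - conf
          rate = n_events / exposure_time
          lower = (chi2.ppf(alpha / 2, 2 * n_events) / (2 * exposure_time)
                   if n_events > 0 else 0.0)
          upper = chi2.ppf(1 - alpha / 2, 2 * (n_events + 1)) / (2 * exposure_time)
          return rate, (lower, upper)

      # e.g. 7 events observed over 3.2 unit-years of exposure
      print(poisson_rate(7, 3.2))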

  7. Deconstructing events: The neural bases for space, time, and causality

    PubMed Central

    Kranjec, Alexander; Cardillo, Eileen R.; Lehet, Matthew; Chatterjee, Anjan

    2013-01-01

    Space, time, and causality provide a natural structure for organizing our experience. These abstract categories allow us to think relationally in the most basic sense; understanding simple events requires one to represent the spatial relations among objects, the relative durations of actions or movements, and links between causes and effects. The present fMRI study investigates the extent to which the brain distinguishes between these fundamental conceptual domains. Participants performed a one-back task with three conditions of interest (SPACE, TIME and CAUSALITY). Each condition required comparing relations between events in a simple verbal narrative. Depending on the condition, participants were instructed to either attend to the spatial, temporal, or causal characteristics of events, but between participants, each particular event relation appeared in all three conditions. Contrasts compared neural activity during each condition against the remaining two and revealed how thinking about events is deconstructed neurally. Space trials recruited neural areas traditionally associated with visuospatial processing, primarily bilateral frontal and occipitoparietal networks. Causality trials activated areas previously found to underlie causal thinking and thematic role assignment, such as left medial frontal and left middle temporal gyri, respectively. Causality trials also produced activations in SMA, caudate, and cerebellum; cortical and subcortical regions associated with the perception of time at different timescales. The TIME contrast, however, produced no significant effects. This pattern, indicating negative results for TIME trials, but positive effects for CAUSALITY trials in areas important for time perception, motivated additional overlap analyses to further probe relations between domains. The results of these analyses suggest a closer correspondence between time and causality than between time and space. PMID:21861674

  8. Deconstructing events: the neural bases for space, time, and causality.

    PubMed

    Kranjec, Alexander; Cardillo, Eileen R; Schmidt, Gwenda L; Lehet, Matthew; Chatterjee, Anjan

    2012-01-01

    Space, time, and causality provide a natural structure for organizing our experience. These abstract categories allow us to think relationally in the most basic sense; understanding simple events requires one to represent the spatial relations among objects, the relative durations of actions or movements, and the links between causes and effects. The present fMRI study investigates the extent to which the brain distinguishes between these fundamental conceptual domains. Participants performed a 1-back task with three conditions of interest (space, time, and causality). Each condition required comparing relations between events in a simple verbal narrative. Depending on the condition, participants were instructed to either attend to the spatial, temporal, or causal characteristics of events, but between participants each particular event relation appeared in all three conditions. Contrasts compared neural activity during each condition against the remaining two and revealed how thinking about events is deconstructed neurally. Space trials recruited neural areas traditionally associated with visuospatial processing, primarily bilateral frontal and occipitoparietal networks. Causality trials activated areas previously found to underlie causal thinking and thematic role assignment, such as left medial frontal and left middle temporal gyri, respectively. Causality trials also produced activations in SMA, caudate, and cerebellum; cortical and subcortical regions associated with the perception of time at different timescales. The time contrast, however, produced no significant effects. This pattern, indicating negative results for time trials but positive effects for causality trials in areas important for time perception, motivated additional overlap analyses to further probe relations between domains. The results of these analyses suggest a closer correspondence between time and causality than between time and space.

  9. Estimating differences and ratios in median times to event

    PubMed Central

    Rogawski, Elizabeth T.; Westreich, Daniel J.; Kang, Gagandeep; Ward, Honorine D.; Cole, Stephen R.

    2016-01-01

    Time differences and time ratios are often more interpretable estimates of effect than hazard ratios for time-to-event data, especially for common outcomes. We developed a SAS macro for estimating time differences and time ratios between baseline-fixed binary exposure groups based on inverse probability weighted Kaplan-Meier curves. The macro uses pooled logistic regression to calculate inverse probability of censoring and exposure weights, draws Kaplan-Meier curves based on the weighted data, and estimates the time difference and time ratio at a user-defined survival proportion. The macro also calculates the risk difference and risk ratio at a user-specified time. Confidence intervals are constructed by bootstrap. We provide an example assessing the effect of exclusive breastfeeding during diarrhea on the incidence of subsequent diarrhea in children followed from birth to 3 years in Vellore, India. The SAS macro provided here should facilitate the wider reporting of time differences and time ratios. PMID:27465526
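
    The central computation is easy to sketch: build a (possibly inverse-probability weighted) Kaplan-Meier curve for each exposure group, read off the time at which each curve reaches a chosen survival proportion, and take the difference and the ratio. The Python below is a bare-bones stand-in for the SAS macro, with invented data and unit weights; the pooled logistic weight models and the bootstrap confidence intervals are not reproduced.

      import numpy as np

      def km_quantile_time(time, event, weights=None, q=0.5):
          """Time at which a (weighted) Kaplan-Meier curve first drops to <= q."""
          time, event = np.asarray(time, float), np.asarray(event)
          w = np.ones_like(time) if weights is None else np.asarray(weights, float)
          order = np.argsort(time)
          time, event, w = time[order], event[order], w[order]
          at_risk = w[::-1].cumsum()[::-1]           # weighted number still at risk
          surv = 1.0
          for t, e, wi, r in zip(time, event, w, at_risk):
              if e:
                  surv *= 1.0 - wi / r
              if surv <= q:
                  return t
          return np.nan                              # quantile not reached

      # Hypothetical exposure groups (weights could be inverse-probability weights).
      t0, e0 = [2, 3, 5, 7, 8, 12, 15, 20], [1, 1, 0, 1, 1, 1, 0, 1]
      t1, e1 = [4, 6, 6, 9, 11, 14, 18, 25], [1, 0, 1, 1, 1, 0, 1, 1]
      m0, m1 = km_quantile_time(t0, e0), km_quantile_time(t1, e1)
      print("median time difference:", m1 - m0, " time ratio:", round(m1 / m0, 2))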

  10. Absolute GPS Time Event Generation and Capture for Remote Locations

    NASA Astrophysics Data System (ADS)

    HIRES Collaboration

    The HiRes experiment operates fixed location and portable lasers at remote desert locations to generate calibration events. One physics goal of HiRes is to search for unusual showers. These may appear similar to upward or horizontally pointing laser tracks used for atmospheric calibration. It is therefore necessary to remove all of these calibration events from the HiRes detector data stream in a physics-blind manner. A robust and convenient "tagging" method is to generate the calibration events at precisely known times. To facilitate this tagging method we have developed the GPSY (Global Positioning System YAG) module. It uses a GPS receiver, an embedded processor and additional timing logic to generate laser triggers at arbitrary programmed times and frequencies with better than 100 ns accuracy. The GPSY module has two trigger outputs (one microsecond resolution) to trigger the laser flash-lamp and Q-switch and one event capture input (25 ns resolution). The GPSY module can be programmed either by a front panel menu based interface or by a host computer via an RS232 serial interface. The latter also allows for computer logging of generated and captured event times. Details of the design and the implementation of these devices will be presented. Air showers represent a small fraction, much less than a percent, of the total High Resolution Fly's Eye data sample. The bulk of the sample is calibration data. Most of this calibration data is generated by two types of systems that use lasers. One type sends light directly to the detectors via optical fibers to monitor detector gains (Girard 2001). The other sends a beam of light into the sky and the scattered light that reaches the detectors is used to monitor atmospheric effects (Wiencke 1998). It is important that these calibration events be cleanly separated from the rest of the sample both to provide a complete set of monitoring information, and more

  11. Off-Time Events and Life Quality of Older Adults.

    ERIC Educational Resources Information Center

    Goodhart, Darlene; Zautra, Alex

    Many previous studies have found that daily life events influence community residents' perceived quality of life, which refers to the relative goodness of life as evaluated subjectively. A subsample population of 539 older residents, aged 55 and over, were interviewed in their homes. A 60-item scale was devised to measure the effects of "off-time"…

  12. Established time series measure occurrence and frequency of episodic events.

    NASA Astrophysics Data System (ADS)

    Pebody, Corinne; Lampitt, Richard

    2015-04-01

    Episodic flux events occur in the open ocean, and established time series are one of the few methods that can capture these events and compare their impact with 'normal' flux. Seemingly rare events may be significant on local scales, but without the ability to measure the extent of flux on spatial and temporal scales, and to combine this with the frequency of occurrence, it is difficult to constrain their impact. The Porcupine Abyssal Plain Sustained Observatory (PAP-SO) in the Northeast Atlantic (49 °N 16 °W, 5000 m water depth) has measured particle flux since 1989 and zooplankton swimmers since 2000. Sediment traps at 3000 m and 100 metres above bottom collect material year round, and we have identified close links between zooplankton and particle flux. Some of these larger animals, for example Diacria trispinosa, make a significant contribution to carbon flux through episodic flux events. D. trispinosa is a euthecosome mollusc which occurs in the Northeast Atlantic, though the PAP-SO is towards the northern limit of its distribution. Pteropods consist of an aragonite shell containing the soft body parts, except for the muscular foot, which extends beyond the mouth of the living animal. Pteropods, both animals that were alive on entry and empty shells, are found year round in the 3000 m trap. Generally the abundance varies with particle flux, but within that general pattern there are episodic events where significant numbers of these animals, containing both organic and inorganic carbon, are captured at depth and therefore could be defined as contributing to export flux. Whether the pulse of animals is a result of the life cycle of D. trispinosa or of the physics of the water column is unclear, but the complexity of the PAP-SO enables us not only to collect these animals but to examine them in parallel to the biogeochemical and physical elements measured by the

  13. Initial Time Dependence of Abundances in Solar Energetic Particle Events

    NASA Technical Reports Server (NTRS)

    Reames, Donald V.; Ng, C. K.; Tylka, A. J.

    1999-01-01

    We compare the initial behavior of Fe/O and He/H abundance ratios and their relationship to the evolution of the proton energy spectra in "small" and "large" gradual solar energetic particle (SEP) events. The results are qualitatively consistent with the behavior predicted by the theory of Ng et al. (1999a, b). He/H ratios that initially rise with time are a signature of scattering by non-Kolmogorov Alfven wave spectra generated by intense beams of shock-accelerated protons streaming outward in large gradual SEP events.

  14. The Time of Our Lives: Life Span Development of Timing and Event Tracking

    ERIC Educational Resources Information Center

    McAuley, J. Devin; Jones, Mari Riess; Holub, Shayla; Johnston, Heather M.; Miller, Nathaniel S.

    2006-01-01

    Life span developmental profiles were constructed for 305 participants (ages 4-95) for a battery of paced and unpaced perceptual-motor timing tasks that included synchronize-continue tapping at a wide range of target event rates. Two life span hypotheses, derived from an entrainment theory of timing and event tracking, were tested. A preferred…

  15. Encoding of event timing in the phase of neural oscillations.

    PubMed

    Kösem, Anne; Gramfort, Alexandre; van Wassenhove, Virginie

    2014-05-15

    Time perception is a critical component of conscious experience. To be in synchrony with the environment, the brain must deal not only with differences in the speed of light and sound but also with its computational and neural transmission delays. Here, we asked whether the brain could actively compensate for temporal delays by changing its processing time. Specifically, can changes in neural timing or in the phase of neural oscillation index perceived timing? For this, a lag-adaptation paradigm was used to manipulate participants' perceived audiovisual (AV) simultaneity of events while they were recorded with magnetoencephalography (MEG). Desynchronized AV stimuli were presented rhythmically to elicit a robust 1 Hz frequency-tagging of auditory and visual cortical responses. As participants' perception of AV simultaneity shifted, systematic changes in the phase of entrained neural oscillations were observed. This suggests that neural entrainment is not a passive response and that the entrained neural oscillation shifts in time. Crucially, our results indicate that shifts in neural timing in auditory cortices linearly map participants' perceived AV simultaneity. To our knowledge, these results provide the first mechanistic evidence for active neural compensation in the encoding of sensory event timing in support of the emergence of time awareness. PMID:24531044

  16. Recurrent-Neural-Network-Based Multivariable Adaptive Control for a Class of Nonlinear Dynamic Systems With Time-Varying Delay.

    PubMed

    Hwang, Chih-Lyang; Jan, Chau

    2016-02-01

    At the beginning, an approximate nonlinear autoregressive moving average (NARMA) model is employed to represent a class of multivariable nonlinear dynamic systems with time-varying delay. It is known that the disadvantages of robust control for the NARMA model are as follows: 1) suitable control parameters for larger time delays are more sensitive with respect to achieving desirable performance; 2) it only deals with bounded uncertainty; and 3) the nominal NARMA model must be learned in advance. Due to the dynamic feature of the NARMA model, a recurrent neural network (RNN) is applied online to learn it. However, the system performance deteriorates due to poor learning of larger variations of the system vector functions. In this situation, a simple network is employed to compensate for the upper bound of the residue caused by the linear parameterization of the approximation error of the RNN. An e-modification learning law with a projection for the weight matrix is applied to guarantee its boundedness without persistent excitation. Under suitable conditions, semiglobally ultimately bounded tracking with boundedness of the estimated weight matrix is obtained by the proposed RNN-based multivariable adaptive control. Finally, simulations are presented to verify the effectiveness and robustness of the proposed control.

  17. Prediction problem for target events based on the inter-event waiting time

    NASA Astrophysics Data System (ADS)

    Shapoval, A.

    2010-11-01

    In this paper we address the problem of forecasting the target events of a time series given the distribution ξ of time gaps between target events. Strong earthquakes and stock market crashes are the two types of such events that we are focusing on. In the series of earthquakes, as McCann et al. show [W.R. Mc Cann, S.P. Nishenko, L.R. Sykes, J. Krause, Seismic gaps and plate tectonics: seismic potential for major boundaries, Pure and Applied Geophysics 117 (1979) 1082-1147], there are well-defined gaps (called seismic gaps) between strong earthquakes. On the other hand, usually there are no regular gaps in the series of stock market crashes [M. Raberto, E. Scalas, F. Mainardi, Waiting-times and returns in high-frequency financial data: an empirical study, Physica A 314 (2002) 749-755]. For the case of seismic gaps, we analytically derive an upper bound of prediction efficiency given the coefficient of variation of the distribution ξ. For the case of stock market crashes, we develop an algorithm that predicts the next crash within a certain time interval after the previous one. We show that this algorithm outperforms random prediction. The efficiency of our algorithm sets up a lower bound of efficiency for effective prediction of stock market crashes.
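
    One simple alarm-based strategy in the spirit of the paper can be sketched directly: estimate the gap distribution from past inter-event times, open an alarm window spanning chosen quantiles of that distribution after each event, and compare the fraction of events caught with the fraction of time spent in alarm. The Python below uses a synthetic low-coefficient-of-variation catalogue and arbitrary quantiles; it illustrates the idea only and is neither the authors' algorithm nor their analytical efficiency bound.

      import numpy as np

      rng = np.random.default_rng(6)

      # Synthetic catalogue of target events: gamma-distributed waiting times with
      # a low coefficient of variation (CV = 1/4), loosely mimicking seismic gaps.
      gaps = rng.gamma(shape=16.0, scale=1.0, size=400)
      events = np.cumsum(gaps)

      train, test = events[:200], events[200:]
      train_gaps = np.diff(train)

      # Alarm window after each event: central 60% of the empirical gap distribution.
      lo, hi = np.quantile(train_gaps, [0.2, 0.8])

      hits, alarm_time, total_time = 0, 0.0, 0.0
      for prev, nxt in zip(test[:-1], test[1:]):
          gap = nxt - prev
          hits += lo <= gap <= hi
          alarm_time += max(0.0, min(gap, hi) - lo)   # alarm runs only until the next event
          total_time += gap

      print("hit rate:", round(hits / (len(test) - 1), 2),
            " fraction of time in alarm:", round(alarm_time / total_time, 2))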

  18. Predicting the timing of dynamic events through sound: Bouncing balls.

    PubMed

    Gygi, Brian; Giordano, Bruno L; Shafiro, Valeriy; Kharkhurin, Anatoliy; Zhang, Peter Xinya

    2015-07-01

    Dynamic information in acoustical signals produced by bouncing objects is often used by listeners to predict the objects' future behavior (e.g., hitting a ball). This study examined factors that affect the accuracy of motor responses to sounds of real-world dynamic events. In experiment 1, listeners heard 2-5 bounces from a tennis ball, ping-pong, basketball, or wiffle ball, and would tap to indicate the time of the next bounce in a series. Across ball types and number of bounces, listeners were extremely accurate in predicting the correct bounce time (CT) with a mean prediction error of only 2.58% of the CT. Prediction based on a physical model of bouncing events indicated that listeners relied primarily on temporal cues when estimating the timing of the next bounce, and to a lesser extent on the loudness and spectral cues. In experiment 2, the timing of each bounce pattern was altered to correspond to the bounce timing pattern of another ball, producing stimuli with contradictory acoustic cues. Nevertheless, listeners remained highly accurate in their estimates of bounce timing. This suggests that listeners can adopt their estimates of bouncing-object timing based on acoustic cues that provide most veridical information about dynamic aspects of object behavior.

  19. Predicting the timing of dynamic events through sound: Bouncing balls.

    PubMed

    Gygi, Brian; Giordano, Bruno L; Shafiro, Valeriy; Kharkhurin, Anatoliy; Zhang, Peter Xinya

    2015-07-01

    Dynamic information in acoustical signals produced by bouncing objects is often used by listeners to predict the objects' future behavior (e.g., hitting a ball). This study examined factors that affect the accuracy of motor responses to sounds of real-world dynamic events. In experiment 1, listeners heard 2-5 bounces from a tennis ball, ping-pong, basketball, or wiffle ball, and would tap to indicate the time of the next bounce in a series. Across ball types and number of bounces, listeners were extremely accurate in predicting the correct bounce time (CT) with a mean prediction error of only 2.58% of the CT. Prediction based on a physical model of bouncing events indicated that listeners relied primarily on temporal cues when estimating the timing of the next bounce, and to a lesser extent on the loudness and spectral cues. In experiment 2, the timing of each bounce pattern was altered to correspond to the bounce timing pattern of another ball, producing stimuli with contradictory acoustic cues. Nevertheless, listeners remained highly accurate in their estimates of bounce timing. This suggests that listeners can adopt their estimates of bouncing-object timing based on acoustic cues that provide most veridical information about dynamic aspects of object behavior. PMID:26233044

  20. Life Events and Depressive Symptoms in African American Adolescents: Do Ecological Domains and Timing of Life Events Matter?

    ERIC Educational Resources Information Center

    Sanchez, Yadira M.; Lambert, Sharon F.; Ialongo, Nicholas S.

    2012-01-01

    Considerable research has documented associations between adverse life events and internalizing symptoms in adolescents, but much of this research has focused on the number of events experienced, with less attention to the ecological context or timing of events. This study examined life events in three ecological domains relevant to adolescents…

  1. Space-Time Characteristic Functions in Multivariate Logic and Possible Interpretation of Entanglement

    NASA Astrophysics Data System (ADS)

    Gaudeau de Gerlicz, Claude; Sechpine, Pierre; Bobola, Philippe; Antoine, Mathias

    Against the background of hidden-variable theories in physics (the Bohr-Schrödinger debates) and their later developments, the boundaries between physical scales appear more and more fuzzy, and some newer theories attribute a comparable fuzziness to both time and space. The classical Copenhagen school, as well as Heisenberg and Louis de Broglie, gives us the idea of dual wave and particle aspects in what we observe. The Pondicherry interpretation recently developed by Cramer et al. extends this duality to the time part. According to Cramer, there could be a little more to this duality: retarded and advanced waves of time, which have been confirmed and admitted as possible solutions of Maxwell's equations. We develop here a possible pattern that would match, in sequence, space with both the retarded and the advanced time wave of the 'Cramer handshake', so that in the locality of the present, when the observation is made, everything becomes local.

  2. A diary after dinner: How the time of event recording influences later accessibility of diary events.

    PubMed

    Szőllősi, Ágnes; Keresztes, Attila; Conway, Martin A; Racsmány, Mihály

    2015-01-01

    Recording the events of a day in a diary may help improve their later accessibility. An interesting question is whether improvements in long-term accessibility will be greater if the diary is completed at the end of the day, or after a period of sleep, the following morning. We investigated this question using an internet-based diary method. On each of five days, participants (n = 109) recorded autobiographical memories for that day or for the previous day. Recording took place either in the morning or in the evening. Following a 30-day retention interval, the diary events were recalled in a free recall test. We found that participants who recorded their memories in the evening, before sleep, had the best memory performance. These results suggest that the time of reactivation and recording of recent autobiographical events has a significant effect on the later accessibility of those diary events. We discuss our results in the light of related findings that show a beneficial effect of reduced interference during sleep on memory consolidation and reconsolidation. PMID:26088958

  3. Real-Time Multimission Event Notification System for Mars Relay

    NASA Technical Reports Server (NTRS)

    Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Wang, Paul; Hy, Franklin H.

    2013-01-01

    As the Mars Relay Network is in constant flux (missions and teams going through their daily workflow), it is imperative that users are aware of such state changes. For example, a change by an orbiter team can affect operations on a lander team. This software provides an ambient view of the real-time status of the Mars network. The Mars Relay Operations Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay Network. As part of MaROS, a feature set was developed that operates on several levels of the software architecture. These levels include a Web-based user interface, a back-end "ReSTlet" built in Java, and databases that store the data as it is received from the network. The result is a real-time event notification and management system, so mission teams can track and act upon events on a moment-by-moment basis. This software retrieves events from MaROS and displays them to the end user. Updates happen in real time, i.e., messages are pushed to the user while logged into the system, and queued for later viewing when the user is not online. The software does not do away with email notifications, but augments them with in-line notifications. Further, this software expands the events that can generate a notification, and allows user-generated notifications. Existing software sends a smaller subset of mission-generated notifications via email. A common complaint of users was that the system-generated e-mails often "get lost" among other e-mail that comes in. This software allows an expanded set of notifications (including user-generated ones) to be displayed in-line in the program. By separating notifications in this way, it can improve a user's workflow.

  4. Efficiency and time-dependent cross correlations in multivariable Monte Carlo updating

    NASA Astrophysics Data System (ADS)

    Potter, Christopher C. J.; Swendsen, Robert H.

    2013-11-01

    We show that any Monte Carlo (MC) algorithm using joint updates of more than a single variable at each step produces time-shifted correlations between variables—even if the equilibrium probabilities of the variables are independent. These spurious time-shifted correlations will affect both the magnitudes of correlation times and the values of optimal acceptance ratios. In particular, correlation times computed with local variables will not generally give the same predictions of efficiency or optimal acceptance ratios as those computed with global variables. Gelman, Roberts, and Gilks [Bayesian Statistics, edited by J. M. Bernardo, J. O. Berger, A. P. Dawid, and A. R. M. Smith, Vol. 5 (Oxford University Press, 1996), pp. 599-607] have used a local measure of efficiency to prove a theorem for global MC updating that predicts an optimal acceptance ratio of 0.234 as the number of variables goes to infinity. We show that global measures of efficiency can produce different, and arguably more appropriate, optimal acceptance ratios. More importantly, global updating is inherently far less efficient than updating variables separately or in small groups. We suggest that previously determined optimal acceptance ratios and their implications for practical applications should be reconsidered in the context of these findings.
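
    The abstract's central claim can be illustrated numerically. The sketch below is not the authors' code; it is a minimal Metropolis sampler, under my own assumptions, for two independent standard normal variables, run once with a joint proposal for both variables and once with single-variable updates. The shared accept/reject decision of the joint update induces a small but systematically nonzero time-shifted correlation between the squared variables, even though their equilibrium distribution is exactly independent.

      import numpy as np

      rng = np.random.default_rng(0)

      def metropolis(n_steps, joint, step=1.5):
          # Target: two independent standard normal variables.
          x = np.zeros(2)
          chain = np.empty((n_steps, 2))
          for t in range(n_steps):
              if joint:  # propose a move of both variables at once
                  prop = x + step * rng.uniform(-1.0, 1.0, size=2)
                  if rng.random() < np.exp(0.5 * (x @ x - prop @ prop)):
                      x = prop
              else:      # update each variable separately
                  for i in range(2):
                      prop = x.copy()
                      prop[i] += step * rng.uniform(-1.0, 1.0)
                      if rng.random() < np.exp(0.5 * (x[i] ** 2 - prop[i] ** 2)):
                          x = prop
              chain[t] = x
          return chain

      def lagged_sq_cross_corr(chain, lag=1):
          # Correlation between x_t^2 and y_{t+lag}^2 along the chain.
          return np.corrcoef(chain[:-lag, 0] ** 2, chain[lag:, 1] ** 2)[0, 1]

      for joint in (True, False):
          c = metropolis(200_000, joint)
          print("joint" if joint else "single", "updates, lag-1 cross correlation:",
                round(lagged_sq_cross_corr(c), 3))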

  5. MULTIVARIATE STATISTICAL MODELS FOR EFFECTS OF PM AND COPOLLUTANTS IN A DAILY TIME SERIES EPIDEMIOLOGY STUDY

    EPA Science Inventory

    Most analyses of daily time series epidemiology data relate mortality or morbidity counts to PM and other air pollutants by means of single-outcome regression models using multiple predictors, without taking into account the complex statistical structure of the predictor variable...

  6. Higher Dimensional Clayton–Oakes Models for Multivariate Failure Time Data

    PubMed Central

    Prentice, R. L.

    2016-01-01

    Summary The Clayton–Oakes bivariate failure time model is extended to dimensions m > 2 in a manner that allows unspecified marginal survivor functions for all dimensions less than m. Special cases that allow unspecified marginal survivor functions of dimension q with q < m, while making some provisions for dependencies of dimension greater than q, are also described. PMID:27738350

  7. Time use choices and healthy body weight: A multivariate analysis of data from the American Time use Survey

    PubMed Central

    2011-01-01

    Background We examine the relationship between time use choices and healthy body weight as measured by survey respondents' body mass index (BMI). Using data from the 2006 and 2007 American Time Use Surveys, we expand upon earlier research by including more detailed measures of time spent eating as well as measures of physical activity time and sedentary time. We also estimate three alternative models that relate time use to BMI. Results Our results suggest that time use and BMI are simultaneously determined. The preferred empirical model reveals evidence of an inverse relationship between time spent eating and BMI for women and men. In contrast, time spent drinking beverages while simultaneously doing other things and time spent watching television/videos are positively linked to BMI. For women only, time spent in food preparation and clean-up is inversely related to BMI while for men only, time spent sleeping is inversely related to BMI. Models that include grocery prices, opportunity costs of time, and nonwage income reveal that as these economic variables increase, BMI declines. Conclusions In this large, nationally representative data set, our analyses that correct for time use endogeneity reveal that Americans' time use decisions have implications for their BMI. The analyses suggest that both eating time and context (i.e., while doing other tasks simultaneously) matter, as do time spent in food preparation and time spent in sedentary activities. Reduced form models suggest that shifts in grocery prices, opportunity costs of time, and nonwage income may be contributing to alterations in time use patterns and food choices that have implications for BMI. PMID:21810246

  8. Chemical fingerprinting of petroleum biomarkers in biota samples using retention-time locking chromatography and multivariate analysis.

    PubMed

    Bartolomé, Luis; Deusto, Miren; Etxebarria, Nestor; Navarro, Patricia; Usobiaga, Aresatz; Zuloaga, Olatz

    2007-07-20

    This work was conducted to study a new separation and evaluation approach for the chemical fingerprinting of petroleum biomarkers in biota samples. The final aim of this work was to study the correlation between the observed effects in the shore habitats (mussels and limpets) and one pollution source: the oil spill of the Prestige tanker. The method combined a clean-up step of the biota extracts (mussels and limpets), the retention-time locking of the gas chromatographic setup, and the multivariate data analysis of the chromatograms. For clean-up, solid-phase extraction and gel permeation chromatography were compared, and 5 g Florisil cartridges assured the lack of interfering compounds in the final extracts. In order to assure reproducible retention times and to avoid the realignment of the chromatograms, the retention-time locking feature of our gas chromatography-mass spectrometry (GC-MS) setup was used. Finally, in the case of multivariate analysis, the GC-MS chromatograms were treated, essentially by derivatization and by normalization, and all the chromatograms at m/z 191 (terpenes), m/z 217-218 (steranes and diasteranes) and m/z 231 (triaromatic steranes) were treated by means of principal component analysis. Furthermore, four slightly different oil samples from the Prestige oil spill were analyzed following the Nordtest method, and the GC-MS chromatograms were considered as the reference chemical fingerprints of the sources. In this sense, the correlation between the studied samples, including sediments and biota samples, and the source candidate was established by means of a supervised pattern recognition method. As a result, the method proposed in this work was useful to identify the Prestige oil spill as the source of many of the analyzed samples.
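
    As an illustration only (synthetic chromatograms, and a simple k-nearest-neighbour classifier standing in for whatever supervised pattern recognition method the authors actually used), the workflow of normalising ion chromatograms, projecting them with principal component analysis and matching unknowns to reference fingerprints can be sketched as follows.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neighbors import KNeighborsClassifier

      rng = np.random.default_rng(1)

      def trace(centre, n=500):
          # Gaussian peak standing in for a retention-time-locked ion chromatogram.
          t = np.arange(n)
          return np.exp(-0.5 * ((t - centre) / 8.0) ** 2)

      # Hypothetical reference fingerprints for two candidate sources.
      sources = {"prestige_oil": trace(200) + 0.6 * trace(320),
                 "other_oil": trace(180) + 0.6 * trace(350)}
      X_ref, y_ref = [], []
      for name, fingerprint in sources.items():
          for _ in range(10):  # replicate reference chromatograms with instrument noise
              X_ref.append(fingerprint + rng.normal(0, 0.02, fingerprint.size))
              y_ref.append(name)
      X_ref = np.vstack(X_ref)
      X_ref /= X_ref.sum(axis=1, keepdims=True)  # area normalisation

      pca = PCA(n_components=3).fit(X_ref)
      clf = KNeighborsClassifier(n_neighbors=3).fit(pca.transform(X_ref), y_ref)

      # Classify a "mussel extract" chromatogram against the reference fingerprints.
      mussel = sources["prestige_oil"] + rng.normal(0, 0.05, 500)
      mussel /= mussel.sum()
      print("predicted source:", clf.predict(pca.transform(mussel[None, :]))[0])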

  9. Putting Predictive Models to Use: Scoring of Unseen Streaming Data using a Multivariate Time Series Classification Tool

    NASA Astrophysics Data System (ADS)

    Sipes, T.; Karimabadi, H.; Imber, S. M.; Slavin, J. A.; Pothier, N. M.; Coeli, R.

    2013-12-01

    Advances in data collection and data storage technologies have made the assembly of multivariate time series data more common. Data analysis and extraction of knowledge from such massive and complex datasets encountered in space physics today present a major obstacle to fully utilizing our vast data repositories and to scientific progress. In previous years we introduced the time series classification tool MineTool-TS [Karimabadi et al., 2009] and its extension to simulation and streaming data [Sipes & Karimabadi, 2012, 2013]. In this work we demonstrate the applicability and real-world utility of the predictive models created using the tool to scoring and labeling of a large dataset of unseen, streaming data. Predictive models that are created are based on the assumption that the training data used to create them is a true representative of the population. Multivariate time series datasets are also characterized by large amounts of variability and potential background noise. Moreover, there are multiple issues being raised by the streaming nature of the data. In this work we illustrate how we dealt with these challenges and demonstrate the results in a study of flux ropes in the plasma sheet. We have used an iterative process of building a predictive model using the original labeled training set, tested it on a week's worth of streaming data, had the results checked by a scientific expert in the domain, and fed the results and the labels back into the training set, creating a large training set and using it to produce the final model. This final model was then put to use to predict a very large, unseen, six-month period of streaming data. In this work we present the results of our machine learning approach to automatically detect flux ropes in spacecraft data.

  10. Time course of salinity adaptation in a strongly euryhaline estuarine teleost, fundulus heteroclitus: A multivariable approach

    USGS Publications Warehouse

    Marshall, W.S.; Emberley, T.R.; Singer, T.D.; Bryson, S.E.; McCormick, S.D.

    1999-01-01

    Freshwater-adapted killifish (Fundulus heteroclitus) were transferred directly from soft fresh water to full-strength sea water for periods of 1h, 3h, 8h and 1, 2, 7, 14 and 30 days. Controls were transferred to fresh water for 24 h. Measured variables included: blood [Na+], osmolality, glucose and cortisol levels, basal and stimulated rates of ion transport and permeability of in vitro opercular epithelium, gill Na+/K+-ATPase and citrate synthase activity and chloride cell ultrastructure. These data were compared with previously published killifish cystic fibrosis transmembrane conductance regulator (kfCFTR) expression in the gills measured over a similar time course. Plasma cortisol levels peaked at 1 h, coincident with a rise in plasma [Na+]. At 8 h after transfer to sea water, a time at which previous work has shown kfCFTR expression to be elevated, blood osmolality and [Na+] were high, and cortisol levels and opercular membrane short-circuit current (I(SC); a measure of Cl- secretion rate) were low. The 24h group, which showed the highest level of kfCFTR expression, had the highest plasma [Na+] and osmolality, elevated plasma cortisol levels, significantly lower opercular membrane resistance, an increased opercular membrane ion secretion rate and collapsed tubule inclusions in mitochondria-rich cells, but no change in gill Na+/K+-ATPase and citrate synthase activity or plasma glucose levels. Apparently, killifish have a rapid (<1h) cortisol response to salinity coupled to subsequent (8-48 h) expression of kfCFTR anion channel proteins in existing mitochondria-rich cells that convert transport from ion uptake to ion secretion.

  11. Empirical reconstruction of storm-time steady magnetospheric convection events

    NASA Astrophysics Data System (ADS)

    Stephens, G. K.; Sitnov, M. I.; Kissinger, J.; Tsyganenko, N. A.; McPherron, R. L.; Korth, H.; Anderson, B. J.

    2013-12-01

    We investigate the storm-scale morphology of the magnetospheric magnetic field as well as underlying distributions of electric currents, equatorial plasma pressure and entropy for four Steady Magnetospheric Convection (SMC) events that occurred during the May 2000 and October 2011 magnetic storms. The analysis is made using the empirical geomagnetic field model TS07D, in which the structure of equatorial currents is not predefined but is instead dictated by the data. The model also combines the strengths of statistical and event-oriented approaches in mining data for the reconstruction of the magnetic field. The formation of a near-Earth minimum of the equatorial magnetic field in the midnight sector is inferred from data without ad hoc assumptions of a special current system postulated in earlier empirical reconstructions. In addition, a new SMC class is discovered where the minimum equatorial field is substantially larger and located closer to Earth. The magnetic field tailward of the minimum is also much larger, and the corresponding region of accumulated magnetic flux may occupy a very short tail region. The equatorial current and plasma pressure are found to be strongly enhanced far beyond geosynchronous orbit and in a broad local time interval covering the whole nightside region. This picture is consistent with independent recent statistical studies of the SMC pressure distributions, global MHD and kinetic RCM-E simulations. Distributions of the flux tube volume and entropy inferred from data reveal different mechanisms of the magnetotail convection crisis resolution for two classes of SMC events.

  12. What controls the local time extent of flux transfer events?

    NASA Astrophysics Data System (ADS)

    Milan, S. E.; Imber, S. M.; Carter, J. A.; Walach, M.-T.; Hubert, B.

    2016-02-01

    Flux transfer events (FTEs) are the manifestation of bursty and/or patchy magnetic reconnection at the magnetopause. We compare two sequences of the ionospheric signatures of flux transfer events observed in global auroral imagery and coherent ionospheric radar measurements. Both sequences were observed during very similar seasonal and interplanetary magnetic field (IMF) conditions, though with differing solar wind speed. A key observation is that the signatures differed considerably in their local time extent. The two periods are 26 August 1998, when the IMF had components BZ≈-10 nT and BY≈9 nT and the solar wind speed was VX≈650 km s-1, and 31 August 2005, IMF BZ≈-7 nT, BY≈17 nT, and VX≈380 km s-1. In the first case, the reconnection rate was estimated to be near 160 kV, and the FTE signatures extended across at least 7 h of magnetic local time (MLT) of the dayside polar cap boundary. In the second, a reconnection rate close to 80 kV was estimated, and the FTEs had a MLT extent of roughly 2 h. We discuss the ramifications of these differences for solar wind-magnetosphere coupling.

  13. Intelligent fuzzy controller for event-driven real time systems

    NASA Technical Reports Server (NTRS)

    Grantner, Janos; Patyra, Marek; Stachowicz, Marian S.

    1992-01-01

    Most of the known linguistic models are essentially static, that is, time is not a parameter in describing the behavior of the object's model. In this paper we show a model for synchronous finite state machines based on fuzzy logic. Such finite state machines can be used to build both event-driven, time-varying, rule-based systems and the control unit section of a fuzzy logic computer. The architecture of a pipelined intelligent fuzzy controller is presented, and the linguistic model is represented by an overall fuzzy relation stored in a single rule memory. A VLSI integrated circuit implementation of the fuzzy controller is suggested. At a clock rate of 30 MHz, the controller can perform 3 MFLIPS on multi-dimensional fuzzy data.

  14. Detection of intermittent events in atmospheric time series

    NASA Astrophysics Data System (ADS)

    Paradisi, P.; Cesari, R.; Palatella, L.; Contini, D.; Donateo, A.

    2009-04-01

    The modeling approach in atmospheric sciences is based on the assumption that local fluxes of mass, momentum, heat, etc... can be described as linear functions of the local gradient of some intensive property (concentration, flow strain, temperature,...). This is essentially associated with Gaussian statistics and short range (exponential) correlations. However, the atmosphere is a complex dynamical system displaying a wide range of spatial and temporal scales. A global description of the atmospheric dynamics should include a great number of degrees of freedom, strongly interacting on several temporal and spatial scales, thus generating long range (power-law) correlations and non-Gaussian distribution of fluctuations (Lévy flights, Lévy walks, Continuous Time Random Walks) [1]. This is typically associated with anomalous diffusion and scaling, non-trivial memory features and correlation decays and, especially, with the emergence of flux-gradient relationships that are non-linear and/or non-local in time and/or space. Actually, the local flux-gradient relationship is greatly preferred due to a more clear physical meaning, allowing to perform direct comparisons with experimental data, and, especially, to smaller computational costs in numerical models. In particular, the linearity of this relationship allows to define a transport coefficient (e.g., turbulent diffusivity) and the modeling effort is usually focused on this coefficient. However, the validity of the local (and linear) flux-gradient model is strongly dependent on the range of spatial and temporal scales represented by the model and, consequently, by the sub-grid processes included in the flux-gradient relationship. In this work, in order to check the validity of local and linear flux-gradient relationships, an approach based on the concept of renewal critical events [2] is introduced. In fact, in renewal theory [2], the dynamical origin of anomalous behaviour and non-local flux-gradient relation is

  15. Event coincidence analysis for quantifying statistical interrelationships between event time series. On the role of flood events as triggers of epidemic outbreaks

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.

    2016-05-01

    Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis to provide a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
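
    The precursor coincidence rate at the core of event coincidence analysis is straightforward to compute. The following sketch (my own illustration on synthetic flood and outbreak dates, not the authors' implementation) counts the fraction of outbreaks preceded by at least one flood within a window of length delta_t at lag tau, and compares it against surrogates with randomised flood timings.

      import numpy as np

      rng = np.random.default_rng(2)

      def precursor_coincidence_rate(t_a, t_b, delta_t, tau=0.0):
          # Fraction of events in series A preceded by at least one event of
          # series B within the window [t_a - tau - delta_t, t_a - tau].
          t_a, t_b = np.asarray(t_a, float), np.asarray(t_b, float)
          hits = sum(np.any((t_b >= t - tau - delta_t) & (t_b <= t - tau)) for t in t_a)
          return hits / len(t_a)

      # Synthetic example: some outbreaks follow floods within ~20 days, some do not.
      floods = np.sort(rng.uniform(0, 3650, 40))
      outbreaks = np.sort(np.concatenate([floods[:15] + rng.uniform(0, 20, 15),
                                          rng.uniform(0, 3650, 10)]))

      r_obs = precursor_coincidence_rate(outbreaks, floods, delta_t=30)

      # Null distribution: same number of floods, uniformly random timings.
      null = np.array([precursor_coincidence_rate(outbreaks,
                                                  np.sort(rng.uniform(0, 3650, floods.size)),
                                                  delta_t=30)
                       for _ in range(2000)])
      print(f"coincidence rate {r_obs:.2f}, surrogate p-value {np.mean(null >= r_obs):.3f}")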

  16. UNCERTAINTY IN PHASE ARRIVAL TIME PICKS FOR REGIONAL SEISMIC EVENTS: AN EXPERIMENTAL DESIGN

    SciTech Connect

    A. VELASCO; ET AL

    2001-02-01

    The detection and timing of seismic arrivals play a critical role in the ability to locate seismic events, especially at low magnitude. Errors can occur with the determination of the timing of the arrivals, whether these errors are made by automated processing or by an analyst. One of the major obstacles encountered in properly estimating travel-time picking error is the lack of a clear and comprehensive discussion of all of the factors that influence phase picks. This report discusses possible factors that need to be modeled to properly study phase arrival time picking errors. We have developed a multivariate statistical model, experimental design, and analysis strategy that can be used in this study. We have embedded a general form of the International Data Center (IDC)/U.S. National Data Center (USNDC) phase pick measurement error model into our statistical model. We can use this statistical model to optimally calibrate a picking error model to regional data. A follow-on report will present the results of this analysis plan applied to an implementation of an experiment/data-gathering task.

  17. An update on multivariate return periods in hydrology

    NASA Astrophysics Data System (ADS)

    Gräler, Benedikt; Petroselli, Andrea; Grimaldi, Salvatore; De Baets, Bernard; Verhoest, Niko

    2016-05-01

    Many hydrological studies are devoted to the identification of events that are expected to occur on average within a certain time span. While this topic is well established in the univariate case, recent advances focus on a multivariate characterization of events based on copulas. Following a previous study, we show how the definition of the survival Kendall return period fits into the set of multivariate return periods. Moreover, we preliminarily investigate the ability of the multivariate return period definitions to select maximal events from a time series. Starting from a rich simulated data set, we show how similar the resulting selections of events are across these definitions. It can be deduced from the study and theoretically underpinned that the strength of correlation in the sample influences the differences between the selections of maximal events.
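
    For readers unfamiliar with the Kendall-type definitions, a rough empirical sketch is given below (my own illustration on synthetic data; the survival Kendall return period discussed in the abstract replaces the copula with its survival counterpart). Each observation is assigned its empirical copula level, the Kendall distribution is estimated from those levels, and the return period follows as T = mu / (1 - K), with mu the mean inter-arrival time of events.

      import numpy as np

      def empirical_kendall_return_period(x, y, mu=1.0):
          # Empirical Kendall return periods for a bivariate sample (e.g. annual
          # maxima of flood peak and volume); mu is the mean inter-arrival time.
          x, y = np.asarray(x, float), np.asarray(y, float)
          n = len(x)
          # Empirical copula level of each point: fraction of points jointly below it.
          w = np.array([np.mean((x <= x[i]) & (y <= y[i])) for i in range(n)])
          # Empirical Kendall distribution K(w_i); Weibull plotting position avoids K = 1.
          k = np.array([np.sum(w <= w[i]) for i in range(n)]) / (n + 1.0)
          return mu / (1.0 - k)

      rng = np.random.default_rng(3)
      z = rng.multivariate_normal([0, 0], [[1.0, 0.7], [0.7, 1.0]], size=200)
      peak, volume = np.exp(z[:, 0]), np.exp(z[:, 1])  # synthetic, correlated flood variables
      T = empirical_kendall_return_period(peak, volume)
      print("largest empirical Kendall return period: %.0f years" % T.max())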

  18. Predictive modeling in Clostridium acetobutylicum fermentations employing Raman spectroscopy and multivariate data analysis for real-time culture monitoring

    NASA Astrophysics Data System (ADS)

    Zu, Theresah N. K.; Liu, Sanchao; Germane, Katherine L.; Servinsky, Matthew D.; Gerlach, Elliot S.; Mackie, David M.; Sund, Christian J.

    2016-05-01

    The coupling of optical fibers with Raman instrumentation has proven to be effective for real-time monitoring of chemical reactions and fermentations when combined with multivariate statistical data analysis. Raman spectroscopy is relatively fast, with little interference from the water peak present in fermentation media. Medical research has explored this technique for analysis of mammalian cultures for potential diagnosis of some cancers. Other organisms studied via this route include Escherichia coli, Saccharomyces cerevisiae, and some Bacillus sp., though very little work has been performed on Clostridium acetobutylicum cultures. C. acetobutylicum is a gram-positive anaerobic bacterium, which is highly sought after due to its ability to use a broad spectrum of substrates and produce useful byproducts through the well-known Acetone-Butanol-Ethanol (ABE) fermentation. In this work, real-time Raman data was acquired from C. acetobutylicum cultures grown on glucose. Samples were collected concurrently for comparative off-line product analysis. Partial-least squares (PLS) models were built both for agitated cultures and for static cultures from both datasets. Media components and metabolites monitored include glucose, butyric acid, acetic acid, and butanol. Models were cross-validated with independent datasets. Experiments with agitation were more favorable for modeling with goodness of fit (QY) values of 0.99 and goodness of prediction (Q2Y) values of 0.98. Static experiments did not model as well as agitated experiments. Raman results showed the static experiments were chaotic, especially during and shortly after manual sampling.
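
    A minimal sketch of the modelling step, assuming synthetic spectra rather than real Raman data and scikit-learn's PLS implementation in place of whatever software the authors used, would pair each spectrum with an off-line reference concentration and report calibration and cross-validated prediction statistics analogous to the goodness-of-fit and Q2Y values quoted above.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict
      from sklearn.metrics import r2_score

      rng = np.random.default_rng(4)
      n_samples, n_wavenumbers = 60, 800
      concentration = rng.uniform(0, 50, n_samples)                 # e.g. g/L of glucose or butanol
      band = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 400) / 15.0) ** 2)
      spectra = (concentration[:, None] * band                      # analyte band scales with concentration
                 + rng.normal(0, 0.5, (n_samples, n_wavenumbers)))  # broadband noise

      pls = PLSRegression(n_components=3).fit(spectra, concentration)
      fitted = pls.predict(spectra).ravel()
      cv_pred = cross_val_predict(pls, spectra, concentration, cv=10).ravel()

      print("goodness of fit (calibration R2): %.3f" % r2_score(concentration, fitted))
      print("goodness of prediction (Q2):      %.3f" % r2_score(concentration, cv_pred))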

  19. Placebo group improvement in trials of pharmacotherapies for alcohol use disorders: A multivariate meta-analysis examining change over time

    PubMed Central

    Del Re, AC; Maisel, Natalya; Blodgett, Janet; Wilbourne, Paula; Finney, John

    2014-01-01

    Objective Placebo group improvement in pharmacotherapy trials has been increasing over time across several pharmacological treatment areas. However, it is unknown to what degree increasing improvement has occurred in pharmacotherapy trials for alcohol use disorders or what factors may account for placebo group improvement. This meta-analysis of 47 alcohol pharmacotherapy trials evaluated (1) the magnitude of placebo group improvement, (2) the extent to which placebo group improvement has been increasing over time, and (3) several potential moderators that might account for variation in placebo group improvement. Method Random-effects univariate and multivariate analyses were conducted that examined the magnitude of placebo group improvement in the 47 studies and several potential moderators of improvement: (a) publication year, (b) country in which the study was conducted, (c) outcome data source/type, (d) number of placebo administrations, (e) overall severity of study participants, and (f) additional psychosocial treatment. Results Substantial placebo group improvement was found overall and improvement was larger in more recent studies. Greater improvement was found on moderately subjective outcomes, with more frequent administrations of the placebo, and in studies with greater participant severity of illness. However, even after controlling for these moderators, placebo group improvement remained significant, as did placebo group improvement over time. Conclusion Similar to previous pharmacotherapy placebo research, substantial pre- to post-test placebo group improvement has occurred in alcohol pharmacotherapy trials, an effect that has been increasing over time. However, several plausible moderator variables were not able to explain why placebo group improvement has been increasing over time. PMID:23857312

  20. Relative timing of deglacial climate events in Antarctica and Greenland.

    PubMed

    Morgan, Vin; Delmotte, Marc; van Ommen, Tas; Jouzel, Jean; Chappellaz, Jérôme; Woon, Suenor; Masson-Delmotte, Valérie; Raynaud, Dominique

    2002-09-13

    The last deglaciation was marked by large, hemispheric, millennial-scale climate variations: the Bølling-Allerød and Younger Dryas periods in the north, and the Antarctic Cold Reversal in the south. A chronology from the high-accumulation Law Dome East Antarctic ice core constrains the relative timing of these two events and provides strong evidence that the cooling at the start of the Antarctic Cold Reversal did not follow the abrupt warming during the northern Bølling transition around 14,500 years ago. This result suggests that southern changes are not a direct response to abrupt changes in North Atlantic thermohaline circulation, as is assumed in the conventional picture of a hemispheric temperature seesaw.

  1. Detecting Rare Events in the Time-Domain

    SciTech Connect

    Rest, A; Garg, A

    2008-10-31

    One of the biggest challenges in current and future time-domain surveys is to extract the objects of interest from the immense data stream. There are two aspects to achieving this goal: detecting variable sources and classifying them. Difference imaging provides an elegant technique for identifying new transients or changes in source brightness. Much progress has been made in recent years toward refining the process. We discuss a selection of pitfalls that can afflict an automated difference imaging pipeline and describe some solutions. After identifying true astrophysical variables, we are faced with the challenge of classifying them. For rare events, such as supernovae and microlensing, this challenge is magnified because we must balance selection criteria that capture the largest number of objects of interest against the resulting contamination rate. We discuss considerations and techniques for developing classification schemes.

  2. Relative timing of deglacial climate events in Antarctica and Greenland.

    PubMed

    Morgan, Vin; Delmotte, Marc; van Ommen, Tas; Jouzel, Jean; Chappellaz, Jérôme; Woon, Suenor; Masson-Delmotte, Valérie; Raynaud, Dominique

    2002-09-13

    The last deglaciation was marked by large, hemispheric, millennial-scale climate variations: the Bølling-Allerød and Younger Dryas periods in the north, and the Antarctic Cold Reversal in the south. A chronology from the high-accumulation Law Dome East Antarctic ice core constrains the relative timing of these two events and provides strong evidence that the cooling at the start of the Antarctic Cold Reversal did not follow the abrupt warming during the northern Bølling transition around 14,500 years ago. This result suggests that southern changes are not a direct response to abrupt changes in North Atlantic thermohaline circulation, as is assumed in the conventional picture of a hemispheric temperature seesaw. PMID:12228715

  3. Time to tenure in Spanish universities: an event history analysis.

    PubMed

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated with academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variation in the length of time to promotion across different scientific domains is confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those who combine social embeddedness and team integration with some good credentials regarding past and potential future performance, rather than high levels of mobility. PMID:24116199

  4. Time to Tenure in Spanish Universities: An Event History Analysis

    PubMed Central

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated with academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variation in the length of time to promotion across different scientific domains is confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those who combine social embeddedness and team integration with some good credentials regarding past and potential future performance, rather than high levels of mobility. PMID:24116199

  5. Real-time probing of radical events with sulfide molecules

    NASA Astrophysics Data System (ADS)

    Gauduel, Yann A.; Glinec, Yannick; Malka, Victor

    2007-02-01

    The physio-pathological roles of sulfide biomolecules in cellular environments involve redox processes and radical reactions that alter or protect the functional properties of enzymatic systems, proteins, and nucleic acid repair. We focus on micromolar monitoring of sulfur-centered radical anions produced by direct electron attachment, using sulfide molecules (a thioether and a disulfide biomolecule) and two complementary spectroscopic approaches: low energy radiation femtochemistry (1-8 eV) and high energy radiation femtochemistry (2.5-15 MeV). The early step of disulfide bond making (RS∴SR) from thiol molecules involves a very short-lived odd-electron bonded intermediate, in which an excess electron is transiently localized by a preexisting complex of two sulfide monomers. The reactive center of oxidized glutathione (cystamine), a major cytoplasmic disulfide biomolecule, is also used as a sensor for the real-time IR investigation of the effective reaction radius r_eff in homogenous aqueous environments and interfacial water of biomimetic systems. Femtosecond high-energy electron beams, typically in the 2.5-15 MeV range, open the way to the picosecond observation of primary radical events in nanometric radiation spurs. The real-time investigation of sulfide and disulfide molecules opens exciting opportunities for sensitisation of confined environments (aqueous groove of DNA, protein pockets, sub-cellular systems) to ionizing radiation. Low- and high-energy femtoradical probing foreshadows the development of new applications in radiobiology (low dose effect at the nanometric scale) and anticancer radiotherapy (prodrug activation).

  6. Real-time interpretation of novel events across childhood

    PubMed Central

    Borovsky, Arielle; Sweeney, Kim; Elman, Jeffrey L.; Fernald, Anne

    2014-01-01

    Despite extensive evidence that adults and children rapidly integrate world knowledge to generate expectancies for upcoming language, little work has explored how this knowledge is initially acquired and used. We explore this question in 3- to 10-year-old children and adults by measuring the degree to which sentences depicting recently learned connections between agents, actions and objects lead to anticipatory eye-movements to the objects. Combinatory information in sentences about agent and action elicited anticipatory eye-movements to the Target object in adults and older children. Our findings suggest that adults and school-aged children can quickly activate information about recently exposed novel event relationships in real-time language processing. However, there were important developmental differences in the use of this knowledge. Adults and school-aged children used the sentential agent and action to predict the sentence final theme, while preschool children’s fixations reflected a simple association to the currently spoken item. We consider several reasons for this developmental difference and possible extensions of this paradigm. PMID:24976677

  7. Young Children's Memory for the Times of Personal Past Events

    ERIC Educational Resources Information Center

    Pathman, Thanujeni; Larkina, Marina; Burch, Melissa M.; Bauer, Patricia J.

    2013-01-01

    Remembering the temporal information associated with personal past events is critical for autobiographical memory, yet we know relatively little about the development of this capacity. In the present research, we investigated temporal memory for naturally occurring personal events in 4-, 6-, and 8-year-old children. Parents recorded unique events…

  8. Validation of cross-sectional time series and multivariate adaptive regression splines models for the prediction of energy expenditure in children and adolescents using doubly labeled water

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Accurate, nonintrusive, and inexpensive techniques are needed to measure energy expenditure (EE) in free-living populations. Our primary aim in this study was to validate cross-sectional time series (CSTS) and multivariate adaptive regression splines (MARS) models based on observable participant cha...

  9. Discrete-Time Survival Factor Mixture Analysis for Low-Frequency Recurrent Event Histories

    PubMed Central

    Masyn, Katherine E.

    2013-01-01

    In this article, the latent class analysis framework for modeling single event discrete-time survival data is extended to low-frequency recurrent event histories. A partial gap time model, parameterized as a restricted factor mixture model, is presented and illustrated using juvenile offending data. This model accommodates event-specific baseline hazard probabilities and covariate effects; event recurrences within a single time period; and accounts for within- and between-subject correlations of event times. This approach expands the family of latent variable survival models in a way that allows researchers to explicitly address questions about unobserved heterogeneity in the timing of events across the lifespan. PMID:24489519
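
    The partial gap time factor mixture model itself is well beyond a short sketch, but the person-period expansion that underlies any discrete-time survival analysis, and on which such models build, can be illustrated with synthetic single-event data (my own toy example, fitted here with an ordinary logistic regression rather than a latent-class model).

      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(5)
      n = 300
      x = rng.binomial(1, 0.5, n)              # a single covariate, e.g. an intervention flag
      last_period = rng.integers(1, 9, n)      # period of first event, or of censoring
      observed = rng.random(n) < 0.8           # True = event observed, False = censored

      # Person-period expansion: one row per subject per period at risk.
      rows = []
      for i in range(n):
          for t in range(1, last_period[i] + 1):
              rows.append({"id": i, "period": t, "x": x[i],
                           "event": int(bool(observed[i]) and t == last_period[i])})
      pp = pd.DataFrame(rows)

      # Discrete-time hazard model: logit of the event indicator on period dummies plus covariates.
      X = pd.get_dummies(pp["period"], prefix="t").assign(x=pp["x"])
      model = LogisticRegression(max_iter=1000).fit(X, pp["event"])
      print("covariate effect on the logit hazard: %.3f" % model.coef_[0][-1])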

  10. Dead-time correction for time-of-flight secondary-ion mass spectral images: a critical issue in multivariate image analysis.

    PubMed

    Tyler, Bonnie J; Peterson, Richard E

    2013-01-01

    Dead-time effects result in a non-linear detector response in the common time-of-flight secondary-ion mass spectrometry instruments. This can result in image artifacts that can often be misinterpreted. Although the Poisson correction procedure has been shown to effectively eliminate this non-linearity in spectra, applying the correction to images presents difficulties because the low number of counts per pixel can create large statistical errors. The efficacy of three approaches to dead-time correction in images has been explored. These approaches include: pixel binning, image segmentation and a binomial statistical correction. When few pixels are fully saturated, all three approaches work satisfactorily. When a large number of pixels are fully saturated, the statistical approach fails to remove the dead-time artifacts revealed by multivariate analysis. Pixel binning is accurate at higher levels of saturation so long as the bin size is much smaller than the feature size. The segmentation approach works well independent of feature size or the number of fully saturated pixels but requires an accurate segmentation algorithm. It is recommended that images be collected under conditions that minimize the number of fully saturated pixels. When this is impractical and small features are present in the image, segmentation can provide an accurate way to correct for the detector saturation effect. PMID:24707067
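
    The Poisson correction referred to above is the usual n -> -N ln(1 - n/N) transformation, with N the number of primary-ion shots and n the recorded counts per pixel and channel. The pixel-binning variant can be sketched as follows (an illustration on synthetic counts, not the authors' code): counts and shots are summed over small blocks before correcting, which keeps the per-bin statistics high enough for the correction to be stable.

      import numpy as np

      def poisson_correct(counts, n_shots):
          # Standard Poisson dead-time correction: corrected = -N * ln(1 - n/N).
          frac = np.clip(counts / n_shots, 0.0, 1.0 - 1e-9)  # guard against full saturation
          return -n_shots * np.log1p(-frac)

      def binned_correction(count_image, n_shots, bin_size):
          # Sum counts over bin_size x bin_size blocks, then apply the correction
          # with the correspondingly larger number of shots per block.
          h, w = count_image.shape
          h2, w2 = h - h % bin_size, w - w % bin_size
          blocks = (count_image[:h2, :w2]
                    .reshape(h2 // bin_size, bin_size, w2 // bin_size, bin_size)
                    .sum(axis=(1, 3)))
          return poisson_correct(blocks, n_shots * bin_size ** 2)

      rng = np.random.default_rng(6)
      image = rng.poisson(5, (256, 256)).astype(float)
      image[100:140, 100:140] = 95.0                     # a nearly saturated feature (N = 100 shots)
      corrected = binned_correction(image, n_shots=100, bin_size=4)
      print("max corrected intensity per block:", round(float(corrected.max()), 1))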

  11. Modality transition-based network from multivariate time series for characterizing horizontal oil-water flow patterns

    NASA Astrophysics Data System (ADS)

    Ding, Mei-Shuang; Jin, Ning-De; Gao, Zhong-Ke

    2015-11-01

    The simultaneous flow of oil and water through a horizontal pipe is a common occurrence during petroleum industrial processes. Characterizing the flow behavior underlying horizontal oil-water flows is a challenging problem of significant importance. In order to solve this problem, we carry out experiment to measure multivariate signals from different flow patterns and then propose a novel modality transition-based network to analyze the multivariate signals. The results suggest that the local betweenness centrality and weighted shortest path of the constructed network can characterize the transitions of flow conditions and further allow quantitatively distinguishing and uncovering the dynamic flow behavior underlying different horizontal oil-water flow patterns.

  12. Large Time Projection Chambers for Rare Event Detection

    SciTech Connect

    Heffner, M

    2009-11-03

    The Time Projection Chamber (TPC) concept has been applied to many projects outside of particle physics and the accelerator-based experiments where it was initially developed. TPCs in non-accelerator particle physics experiments are principally focused on rare event detection (e.g. neutrino and dark matter experiments) and the physics of these experiments can place dramatically different constraints on the TPC design (only extensions to the traditional TPCs are discussed here). The drift gas, or liquid, is usually the target or matter under observation and, due to very low signal rates, a TPC with the largest active mass is desired. The large mass complicates particle tracking of short and sometimes very low energy particles. Other special design issues include efficient light collection, background rejection, internal triggering and optimal energy resolution. Backgrounds from gamma-rays and neutrons are significant design issues in the construction of these TPCs. They are generally placed deep underground to shield from cosmogenic particles and surrounded with shielding to reduce radiation from the local surroundings. The construction materials have to be carefully screened for radiopurity as they are in close contact with the active mass and can be a significant source of background events. The TPC excels in reducing this internal background because the mass inside the field cage forms one monolithic volume from which fiducial cuts can be made ex post facto to isolate quiet drift mass, and can be circulated and purified to a very high level. Self-shielding in these large mass systems can be significant and the effect improves with density. The liquid-phase TPC can obtain a high density at low pressure which results in very good self-shielding and compact installation with a lightweight containment. The downsides are the need for cryogenics, slower charge drift, tracks shorter than the typical electron diffusion, lower energy resolution (e

  13. The effect of intruded events on peak time: the role of reinforcement history during the intruded event.

    PubMed

    Aum, SangWeon; Brown, Bruce L; Hemmes, Nancy S

    2007-02-22

    Pigeons were studied in an extension of a study by Aum et al. [Aum, S., Brown, B.L., Hemmes, N.S. 2004. The effects of concurrent task and gap events on peak time in the peak procedure. Behav. Process. 65, 43-56] on timing behavior under a discrete-trial fixed-interval (FI) procedure during which 6-s intruded events were superimposed on peak-interval (PI) test trials. In Aum et al., one event consisted in termination of the timing cue (gap trial); the other was a stimulus in the presence of which subjects had been trained to respond under an independent random-interval (RI) schedule of reinforcement (concurrent task trial). Aum et al. found a disruption of timing on concurrent task trials that was greater than that on gap trials. The present study investigated history of reinforcement associated with intruded events as a possible explanation of this earlier finding. After training to peck a side key on a 30-s PI procedure, discrimination training was conducted on the center key in separate sessions; red or green 6-s stimuli were associated with RI 24s or EXT (extinction) schedules. During testing under the PI procedure, three types of intruded events were presented during probe trials--the stimulus associated with the RI (S+) or EXT (S-) schedule during discrimination training, or a gap (termination of the side-keylight). Intruded events occurred 3, 9, or 15s after PI trial onset. Effects of reinforcement history were revealed as substantial disruption of timing during the S+ event and relatively little disruption during the S- event. Intermediate effects were found for the gap event. Results indicate that postcue effects are at least partially responsible for the disruptive effects of the S+ event. PMID:17157998

  14. Events and children’s sense of time: a perspective on the origins of everyday time-keeping

    PubMed Central

    Forman, Helen

    2015-01-01

    In this article I discuss abstract or pure time versus the content of time (i.e., events, activities, and other goings-on). Or, more specifically, the utility of these two sorts of time in time-keeping or temporal organization. It is often assumed that abstract, uniform, and objective time is a universal physical entity out there, which humans may perceive. However, this sort of evenly flowing time was only recently introduced to the human community, together with the mechanical clock. Before the introduction of mechanical clock-time, there were only events available to denote the extent of time. Events defined time, unlike the way time may define events in our present-day culture. It is therefore conceivable that our primeval or natural mode of time-keeping involves the perception, estimation, and coordination of events. I find it likely that events continue to subserve our sense of time and time-keeping efforts, especially for children who have not yet mastered the use of clock-time. Instead of seeing events as a distraction from our perception of time, I suggest that our experience and understanding of time emerges from our perception of events. PMID:25814969

  15. Water quality change detection: multivariate algorithms

    NASA Astrophysics Data System (ADS)

    Klise, Katherine A.; McKenna, Sean A.

    2006-05-01

    In light of growing concern over the safety and security of our nation's drinking water, increased attention has been focused on advanced monitoring of water distribution systems. The key to these advanced monitoring systems lies in the combination of real time data and robust statistical analysis. Currently available data streams from sensors provide near real time information on water quality. Combining these data streams with change detection algorithms, this project aims to develop automated monitoring techniques that will classify real time data and denote anomalous water types. Here, water quality data in 1 hour increments over 3000 hours at 4 locations are used to test multivariate algorithms to detect anomalous water quality events. The algorithms use all available water quality sensors to measure deviation from expected water quality. Simulated anomalous water quality events are added to the measured data to test three approaches to measure this deviation. These approaches include multivariate distance measures to 1) the previous observation, 2) the closest observation in multivariate space, and 3) the closest cluster of previous water quality observations. Clusters are established using kmeans classification. Each approach uses a moving window of previous water quality measurements to classify the current measurement as normal or anomalous. Receiver Operating Characteristic (ROC) curves test the ability of each approach to discriminate between normal and anomalous water quality using a variety of thresholds and simulated anomalous events. These analyses result in a better understanding of the deviation from normal water quality that is necessary to sound an alarm.
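
    The third approach described above (distance to the closest cluster of recent observations) can be sketched in a few lines. This is an illustration on simulated sensor data with my own choices of window length, cluster count and alarm threshold, not the algorithm as implemented in the study.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      def detect_anomalies(data, window=72, n_clusters=4, threshold=3.0):
          # Flag time steps whose distance to the nearest cluster of the previous
          # `window` observations exceeds `threshold` (in standardised units).
          data = np.asarray(data, float)
          flags = np.zeros(len(data), dtype=bool)
          for t in range(window, len(data)):
              scaler = StandardScaler().fit(data[t - window:t])
              past = scaler.transform(data[t - window:t])
              centers = KMeans(n_clusters=n_clusters, n_init=5,
                               random_state=0).fit(past).cluster_centers_
              current = scaler.transform(data[t:t + 1])[0]
              flags[t] = np.min(np.linalg.norm(centers - current, axis=1)) > threshold
          return flags

      # Simulated hourly readings from 4 sensors, with an injected anomalous event.
      rng = np.random.default_rng(7)
      wq = rng.normal(size=(500, 4)).cumsum(axis=0) * 0.01 + rng.normal(size=(500, 4)) * 0.1
      wq[300:310] += 2.0
      print("alarmed hours:", np.where(detect_anomalies(wq))[0])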

  16. Time Scales of Solar Energetic Particle Events and Speeds of Source CMEs

    NASA Astrophysics Data System (ADS)

    Kahler, S.

    2004-05-01

    Solar Energetic Particle (SEP) events are characterized primarily by their peak intensities or fluences. Event temporal characteristics and their associations with solar phenomena are less frequently considered. We measure the times to SEP event onsets, rise times and event durations of E = 20 MeV solar proton events observed with the NASA/GSFC Epact instrument on the Wind spacecraft. The approximately 140 SEP events, observed from 1998 through 2002, were accompanied by associated coronal mass ejections (CMEs) observed with the Lasco coronagraph on the SOHO spacecraft. The timing characteristics of the SEP events are compared with the speeds and widths of the associated CMEs to determine whether any of the characteristics of the SEP intensity-time profiles can be related to CME properties. The longitude dependence of the temporal profiles is considered separately to determine the geometric extents of the shocks producing the SEP events at 1 AU.

  17. A Simple Computer Interface To Time Relatively Slow Physical Events.

    ERIC Educational Resources Information Center

    Ocaya, R. O.

    2000-01-01

    Describes a simple computer interface that can be used to make reliable time measurements, such as when timing the swings of a pendulum. Presents a sample experiment involving a form of pendulum known as the compound pendulum. (Author/YDS)

  18. Geological Time, Biological Events and the Learning Transfer Problem

    ERIC Educational Resources Information Center

    Johnson, Claudia C.; Middendorf, Joan; Rehrey, George; Dalkilic, Mehmet M.; Cassidy, Keely

    2014-01-01

    Comprehension of geologic time does not come easily, especially for students who are studying the earth sciences for the first time. This project investigated the potential success of two teaching interventions that were designed to help non-science majors enrolled in an introductory geology class gain a richer conceptual understanding of the…

  19. Reporting of Life Events Over Time: Methodological Issues in a Longitudinal Sample of Women

    ERIC Educational Resources Information Center

    Pachana, Nancy A.; Brilleman, Sam L.; Dobson, Annette J.

    2011-01-01

    The number of life events reported by study participants is sensitive to the method of data collection and time intervals under consideration. Individual characteristics also influence reporting; respondents with poor mental health report more life events. Much current research on life events is cross-sectional. Data from a longitudinal study of…

  20. Multivariate normality

    NASA Technical Reports Server (NTRS)

    Crutcher, H. L.; Falls, L. W.

    1976-01-01

    Sets of experimentally determined or routinely observed data provide information about the past, present and, hopefully, future sets of similarly produced data. An infinite set of statistical models exists which may be used to describe the data sets. The normal distribution is one model. If it serves at all, it serves well. If a data set, or a transformation of the set, representative of a larger population can be described by the normal distribution, then valid statistical inferences can be drawn. There are several tests which may be applied to a data set to determine whether the univariate normal model adequately describes the set. The chi-square test based on Pearson's work in the late nineteenth and early twentieth centuries is often used. Like all tests, it has some weaknesses which are discussed in elementary texts. Extension of the chi-square test to the multivariate normal model is provided. Tables and graphs permit easier application of the test in the higher dimensions. Several examples, using recorded data, illustrate the procedures. Tests of maximum absolute differences, mean sum of squares of residuals, runs and changes of sign are included in these tests. Dimensions one through five with selected sample sizes 11 to 101 are used to illustrate the statistical tests developed.
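
    One common way to carry the chi-square idea into higher dimensions, sketched here as an illustration rather than a reproduction of the report's tables, is to compare the squared Mahalanobis distances of the sample against the chi-square distribution with p degrees of freedom that they should follow under multivariate normality.

      import numpy as np
      from scipy import stats

      def mahalanobis_chi2_test(data, n_bins=10):
          # Chi-square goodness of fit of squared Mahalanobis distances against chi2(p).
          x = np.asarray(data, float)
          n, p = x.shape
          centered = x - x.mean(axis=0)
          inv_cov = np.linalg.inv(np.cov(x, rowvar=False))
          d2 = np.einsum("ij,jk,ik->i", centered, inv_cov, centered)
          # Equiprobable bins under the chi2(p) reference distribution.
          inner_edges = stats.chi2.ppf(np.linspace(0.0, 1.0, n_bins + 1)[1:-1], df=p)
          observed = np.bincount(np.searchsorted(inner_edges, d2, side="right"),
                                 minlength=n_bins)
          expected = np.full(n_bins, n / n_bins)
          chi2_stat = ((observed - expected) ** 2 / expected).sum()
          return chi2_stat, stats.chi2.sf(chi2_stat, df=n_bins - 1)

      rng = np.random.default_rng(8)
      normal_sample = rng.multivariate_normal(np.zeros(3), np.eye(3), size=101)
      print("normal sample:    chi2 = %.1f, p = %.2f" % mahalanobis_chi2_test(normal_sample))
      print("lognormal sample: chi2 = %.1f, p = %.3g" % mahalanobis_chi2_test(np.exp(normal_sample)))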

  1. Maximum likelihood estimation of time to first event in the presence of data gaps and multiple events.

    PubMed

    Green, Cynthia L; Brownie, Cavell; Boos, Dennis D; Lu, Jye-Chyi; Krucoff, Mitchell W

    2016-04-01

    We propose a novel likelihood method for analyzing time-to-event data when multiple events and multiple missing data intervals are possible prior to the first observed event for a given subject. This research is motivated by data obtained from a heart monitor used to track the recovery process of subjects experiencing an acute myocardial infarction. The time to first recovery, T1, is defined as the time when the ST-segment deviation first falls below 50% of the previous peak level. Estimation of T1 is complicated by data gaps during monitoring and the possibility that subjects can experience more than one recovery. If gaps occur prior to the first observed event, T, the first observed recovery may not be the subject's first recovery. We propose a parametric gap likelihood function conditional on the gap locations to estimate T1. Standard failure time methods that do not fully utilize the data are compared to the gap likelihood method by analyzing data from an actual study and by simulation. The proposed gap likelihood method is shown to be more efficient and less biased than interval censoring and more efficient than right censoring if data gaps occur early in the monitoring process or are short in duration.

  2. Parent–offspring similarity in the timing of developmental events: an origin of heterochrony?

    PubMed Central

    Tills, Oliver; Rundle, Simon D.; Spicer, John I.

    2013-01-01

    Understanding the link between ontogeny (development) and phylogeny (evolution) remains a key aim of biology. Heterochrony, the altered timing of developmental events between ancestors and descendants, could be such a link although the processes responsible for producing heterochrony, widely viewed as an interspecific phenomenon, are still unclear. However, intraspecific variation in developmental event timing, if heritable, could provide the raw material from which heterochronies originate. To date, however, heritable developmental event timing has not been demonstrated, although recent work did suggest a genetic basis for intraspecific differences in event timing in the embryonic development of the pond snail, Radix balthica. Consequently, here we used high-resolution (temporal and spatial) imaging of the entire embryonic development of R. balthica to perform a parent–offspring comparison of the timing of twelve, physiological and morphological developmental events. Between-parent differences in the timing of all events were good predictors of such timing differences between their offspring, and heritability was demonstrated for two of these events (foot attachment and crawling). Such heritable intraspecific variation in developmental event timing could be the raw material for speciation events, providing a fundamental link between ontogeny and phylogeny, via heterochrony. PMID:23966639

  3. Pipeline Implementation of Real Time Event Cross Correlation for Nuclear Treaty Monitoring

    NASA Astrophysics Data System (ADS)

    Junek, W. N.; Wehlen, J. A., III

    2014-12-01

    The United States National Data Center (US NDC) is responsible for monitoring international compliance to nuclear test ban treaties. This mission is performed through real time acquisition, processing, and evaluation of data acquired by a global network of seismic, hydroacoustic, and infrasonic sensors. Automatic and human reviewed event solutions are stored in a data warehouse which contains over 15 years of alphanumeric information and waveform data. A significant effort is underway to employ the data warehouse in real time processing to improve the quality of automatic event solutions, reduce analyst burden, and supply decision makers with information regarding relevant historic events. To this end, the US NDC processing pipeline has been modified to automatically recognize events built in the past. Event similarity information and the most relevant historic solution are passed to the human analyst to assist their evaluation of automatically formed events. This is achieved through real time cross correlation of selected seismograms from automatically formed events against those stored in the data warehouse. Historic events used in correlation analysis are selected based on a set of user defined parameters, which are tuned to maintain pipeline timeliness requirements. Software architecture and database infrastructure were modified using a multithreaded design for increased processing speed, database connection pools for parallel queries, and Oracle spatial indexing to enhance query efficiency. This functionality allows the human analyst to spend more time studying anomalous events and less time rebuilding routine events.
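
    The core of the correlation step is a normalised cross correlation of an incoming waveform against stored templates. The sketch below is an illustration on synthetic traces with hypothetical event identifiers, not the US NDC pipeline code; in the operational system the candidate templates would be pulled from the data warehouse based on the automatic solution's location, and the correlation threshold would be tuned per station.

      import numpy as np

      def sliding_norm_cc(trace, template):
          # Maximum normalised cross correlation of `template` slid across `trace`.
          m = len(template)
          tpl = template - template.mean()
          tpl_norm = np.linalg.norm(tpl)
          best = 0.0
          for lag in range(len(trace) - m + 1):
              win = trace[lag:lag + m] - trace[lag:lag + m].mean()
              denom = np.linalg.norm(win) * tpl_norm
              if denom > 0:
                  best = max(best, float(win @ tpl) / denom)
          return best

      rng = np.random.default_rng(9)
      # Hypothetical historic template waveforms keyed by event id.
      templates = {f"evt_{i}": rng.normal(size=400) for i in range(3)}
      # Incoming seismogram: noise with a noisy repeat of historic event evt_1 in the middle.
      incoming = np.concatenate([rng.normal(size=150), 2.0 * templates["evt_1"],
                                 rng.normal(size=150)]) + 0.3 * rng.normal(size=700)

      scores = {eid: sliding_norm_cc(incoming, tpl) for eid, tpl in templates.items()}
      best_id, best_cc = max(scores.items(), key=lambda kv: kv[1])
      if best_cc > 0.7:
          print(f"likely repeat of historic event {best_id} (cc = {best_cc:.2f})")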

  4. Real-Time GPS Network Monitors Bayou Corne Sinkhole Event

    NASA Astrophysics Data System (ADS)

    Kent, Joshua D.; Dunaway, Larry

    2013-10-01

    In August 2012 a sinkhole developed in the swampy marshland near the rural community of Bayou Corne in Assumption Parish (i.e., county), Louisiana. The area was evacuated, and some residents have still not been able to return. The sinkhole—which now measures about 450 meters wide and is continuing to grow—is being monitored by multiple systems, including four rapid-response GPS continuously operating reference stations (CORS) called CORS911. The real-time data provided by this system are used by scientists and decision makers to help ensure public safety.

  5. The time travelling self: comparing self and other in narratives of past and future events.

    PubMed

    Grysman, Azriel; Prabhakar, Janani; Anglin, Stephanie M; Hudson, Judith A

    2013-09-01

    Mental time travel research emphasizes the connection between past and future thinking, whereas autobiographical memory research emphasizes the interrelationship of self and memory. This study explored the relationship between self and memory when thinking about both past and future events. Participants reported events from the near and distant past and future, for themselves, a close friend, or an acquaintance. Past events were rated higher in phenomenological quality than future events, and near self events were rated higher in quality than those about friends. Although future events were more positive than past events, only valence ratings for self and close friend showed a linear increase in positivity from distant past to future. Content analysis showed that this increase in positivity could not be ascribed to choosing events from the cultural life script. These findings provide evidence for the role of personal goals in imagining the future.

  6. Developmental and Cognitive Perspectives on Humans' Sense of the Times of Past and Future Events

    ERIC Educational Resources Information Center

    Friedman, W.J.

    2005-01-01

    Mental time travel in human adults includes a sense of when past events occurred and future events are expected to occur. Studies with adults and children reveal that a number of distinct psychological processes contribute to a temporally differentiated sense of the past and future. Adults possess representations of multiple time patterns, and…

  7. The Roles of Prior Experience and the Timing of Misinformation Presentation on Young Children's Event Memories

    ERIC Educational Resources Information Center

    Roberts, Kim P.; Powell, Martine B.

    2007-01-01

    The current study addressed how the timing of interviews affected children's memories of unique and repeated events. Five- to six-year-olds (N = 125) participated in activities 1 or 4 times and were misinformed either 3 or 21 days after the only or last event. Although single-experience children were subsequently less accurate in the 21- versus…

  8. Qualitative and event-specific real-time PCR detection methods for Bt brinjal event EE-1.

    PubMed

    Randhawa, Gurinder Jit; Sharma, Ruchi; Singh, Monika

    2012-01-01

    Bt brinjal event EE-1 with cry1Ac gene, expressing insecticidal protein against fruit and shoot borer, is the first genetically modified food crop in the pipeline for commercialization in India. Qualitative polymerase chain reaction (PCR) along with event-specific conventional as well as real-time PCR methods to characterize the event EE-1 is reported. A multiplex (pentaplex) PCR system simultaneously amplifying cry1Ac transgene, Cauliflower Mosaic Virus (CaMV) 35S promoter, nopaline synthase (nos) terminator, aminoglycoside adenyltransferase (aadA) marker gene, and a taxon-specific beta-fructosidase gene in event EE-1 has been developed. Furthermore, construct-specific PCR, targeting the approximately 1.8 kb region of the inserted gene construct comprising the CaMV 35S promoter and the cry1Ac gene, has also been developed. The LOD of the developed EE-1-specific conventional PCR assay is 0.01%. The method performance of the reported real-time PCR assay was consistent with the acceptance criteria of Codex Alimentarius Commission ALINORM 10/33/23, with the LOD and LOQ values of 0.05%. The developed detection methods would not only facilitate effective regulatory compliance for identification of genetic traits, risk assessment, management, and postrelease monitoring, but also address consumer concerns and resolution of legal disputes. PMID:23451391

  9. Qualitative and event-specific real-time PCR detection methods for Bt brinjal event EE-1.

    PubMed

    Randhawa, Gurinder Jit; Sharma, Ruchi; Singh, Monika

    2012-01-01

    Bt brinjal event EE-1 with cry1Ac gene, expressing insecticidal protein against fruit and shoot borer, is the first genetically modified food crop in the pipeline for commercialization in India. Qualitative polymerase chain reaction (PCR) along with event-specific conventional as well as real-time PCR methods to characterize the event EE-1 is reported. A multiplex (pentaplex) PCR system simultaneously amplifying cry1Ac transgene, Cauliflower Mosaic Virus (CaMV) 35S promoter, nopaline synthase (nos) terminator, aminoglycoside adenyltransferase (aadA) marker gene, and a taxon-specific beta-fructosidase gene in event EE-1 has been developed. Furthermore, construct-specific PCR, targeting the approximately 1.8 kb region of the inserted gene construct comprising the CaMV 35S promoter and the cry1Ac gene, has also been developed. The LOD of the developed EE-1-specific conventional PCR assay is 0.01%. The method performance of the reported real-time PCR assay was consistent with the acceptance criteria of Codex Alimentarius Commission ALINORM 10/33/23, with the LOD and LOQ values of 0.05%. The developed detection methods would not only facilitate effective regulatory compliance for identification of genetic traits, risk assessment, management, and postrelease monitoring, but also address consumer concerns and resolution of legal disputes.

  10. Hippocampal “time cells” bridge the gap in memory for discontiguous events

    PubMed Central

    MacDonald, Christopher J.; Lepage, Kyle Q.; Eden, Uri T.; Eichenbaum, Howard

    2011-01-01

    Summary The hippocampus is critical to remembering the flow of events in distinct experiences and, in doing so, bridges temporal gaps between discontiguous events. Here we report a robust hippocampal representation of sequence memories, highlighted by “time cells” that encode successive moments during an empty temporal gap between the key events, while at the same time encoding location and ongoing behavior. Furthermore, just as most place cells “remap” when a salient spatial cue is altered, most time cells form qualitatively different representations (“re-time”) when the main temporal parameter is altered. Hippocampal neurons also differentially encode the key events and disambiguate different event sequences to compose unique, temporally organized representations of specific experiences. These findings suggest that hippocampal neural ensembles segment temporally organized memories much the same as they represent locations of important events in spatially defined environments. PMID:21867888

  11. High Mass Measurement Accuracy Determination for Proteomics using Multivariate Regression Fitting: Application to Electrospray Ionization Time-Of-Flight Mass Spectrometry

    SciTech Connect

    Strittmatter, Eric F.; Rodriguez, Nestor; Smith, Richard D.

    2003-02-01

    Important factors that limit the mass measurement accuracy from a mass spectrometer are related to (1) the type of mass analyzer used and (2) the data processing/calibration methods used to obtain mass values from the raw data. Here, two data processing methods are presented that correct for systematic deviations when measuring the mass of ions using a time-of-flight (TOF) mass spectrometer. The first fitting method is one where m/z values are obtained from fitting peak distributions using double Gaussian functions. A second calibration method takes into account the slight non-linear response of the time-of-flight analyzer in addition to the drift in the calibration over time. Using multivariate regression, both of these effects can be corrected for using a single calibration formula. Achievable performance was evaluated with a trypsin digestion of serum albumin and proteins from the organism D. radiodurans that were analyzed using gradient reverse-phase liquid chromatography combined with an electrospray ionization orthogonal TOF mass spectrometer. The root mean square deviation between the theoretical and experimental m/z for serum albumin was found to be 8 ppm using the double Gaussian-multivariate method compared to 29 ppm determined using linear calibration and normal peak centroiding. An advantage of the methods presented here is that no calibrant compounds need to be added to the mobile phase, thereby avoiding interference effects and signal suppression of analytes.
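
    A minimal sketch of the two ideas on synthetic data: a peak is modeled as a sum of two Gaussians to obtain its m/z centroid, and a multivariate linear regression corrects measured m/z using a quadratic term for analyzer non-linearity and an elution-time term for calibration drift. The design matrix, the peptide masses, and the function names are illustrative assumptions, not the published calibration formula.

        import numpy as np
        from scipy.optimize import curve_fit

        def double_gauss(x, a1, mu1, s1, a2, mu2, s2):
            """Sum of two Gaussians used to model an asymmetric TOF peak."""
            return (a1 * np.exp(-0.5 * ((x - mu1) / s1) ** 2)
                    + a2 * np.exp(-0.5 * ((x - mu2) / s2) ** 2))

        # fit a synthetic peak and take the dominant centroid as the m/z estimate
        x = np.linspace(999.0, 1001.0, 200)
        y = double_gauss(x, 1.0, 1000.00, 0.05, 0.3, 1000.08, 0.10)
        popt, _ = curve_fit(double_gauss, x, y, p0=[1, 1000, 0.05, 0.3, 1000.1, 0.1])
        mz_measured = popt[1]

        # multivariate calibration: correct measured m/z with a quadratic term
        # (analyzer non-linearity) and an elution-time term (calibration drift)
        mz_obs = np.array([500.31, 800.52, 1000.61, 1200.80])    # hypothetical observations
        t_elu = np.array([5.0, 15.0, 30.0, 45.0])                # minutes
        mz_true = np.array([500.30, 800.50, 1000.60, 1200.78])   # known reference masses
        design = np.column_stack([np.ones_like(mz_obs), mz_obs, mz_obs ** 2, t_elu])
        coef, *_ = np.linalg.lstsq(design, mz_true, rcond=None)

        def calibrate(mz, t_minutes):
            return float(coef @ np.array([1.0, mz, mz ** 2, t_minutes]))

        print(round(float(mz_measured), 3), round(calibrate(mz_measured, 20.0), 3))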

  12. Cognitive tasks in information analysis: Use of event dwell time to characterize component activities

    SciTech Connect

    Sanquist, Thomas F.; Greitzer, Frank L.; Slavich, Antoinette L.; Littlefield, Rik J.; Littlefield, Janis S.; Cowley, Paula J.

    2004-09-28

    Technology-based enhancement of information analysis requires a detailed understanding of the cognitive tasks involved in the process. The information search and report production tasks of the information analysis process were investigated through evaluation of time-stamped workstation data gathered with custom software. Model tasks simulated the search and production activities, and a sample of actual analyst data was also evaluated. Task event durations were calculated on the basis of millisecond-level time stamps, and distributions were plotted for analysis. The data indicate that task event time shows a cyclic pattern of variation, with shorter event durations (< 2 sec) reflecting information search and filtering, and longer event durations (> 10 sec) reflecting information evaluation. Application of cognitive principles to the interpretation of task event time data provides a basis for developing “cognitive signatures” of complex activities, and can facilitate the development of technology aids for information-intensive tasks.
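
    The dwell-time idea can be reproduced from any stream of time-stamped workstation events: dwell times are differences between consecutive time stamps, and the cut-offs reported above (below 2 s for search and filtering, above 10 s for evaluation) partition them into cognitive categories. The thresholds come from the abstract; the rest of the sketch is illustrative.

        import numpy as np

        def dwell_categories(timestamps_ms, short_ms=2000, long_ms=10000):
            """Split inter-event dwell times (ms) into search/filtering (< 2 s),
            intermediate, and evaluation (> 10 s) bins."""
            dwell = np.diff(np.sort(np.asarray(timestamps_ms, dtype=float)))
            return {
                "search_filter": dwell[dwell < short_ms],
                "intermediate": dwell[(dwell >= short_ms) & (dwell <= long_ms)],
                "evaluation": dwell[dwell > long_ms],
            }

        # toy usage with millisecond time stamps from a workstation log
        stamps = [0, 800, 1500, 2100, 14000, 14500, 30000]
        print({k: v.tolist() for k, v in dwell_categories(stamps).items()})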

  13. Time, space, and events in language and cognition: a comparative view.

    PubMed

    Sinha, Chris; Gärdenfors, Peter

    2014-10-01

    We propose an event-based account of the cognitive and linguistic representation of time and temporal relations. Human beings differ from nonhuman animals in entertaining and communicating elaborate detached (as opposed to cued) event representations and temporal relational schemas. We distinguish deictically based (D-time) from sequentially based (S-time) representations, identifying these with the philosophical categories of A-series and B-series time. On the basis of cross-linguistic data, we claim that all cultures employ both D-time and S-time representations. We outline a cognitive model of event structure, emphasizing that this does not entail an explicit, separate representation of a time dimension. We propose that the notion of an event-independent, metric "time as such" is not universal, but a cultural and historical construction based on cognitive technologies for measuring time intervals. We critically examine claims that time is universally conceptualized in terms of spatial metaphors, and hypothesize that systematic space-time metaphor is only found in languages and cultures that have constructed the notion of time as a separate dimension. We emphasize the importance of distinguishing what is universal from what is variable in cultural and linguistic representations of time, and speculate on the general implications of an event-based understanding of time. PMID:25098724

  14. Time, space, and events in language and cognition: a comparative view.

    PubMed

    Sinha, Chris; Gärdenfors, Peter

    2014-10-01

    We propose an event-based account of the cognitive and linguistic representation of time and temporal relations. Human beings differ from nonhuman animals in entertaining and communicating elaborate detached (as opposed to cued) event representations and temporal relational schemas. We distinguish deictically based (D-time) from sequentially based (S-time) representations, identifying these with the philosophical categories of A-series and B-series time. On the basis of cross-linguistic data, we claim that all cultures employ both D-time and S-time representations. We outline a cognitive model of event structure, emphasizing that this does not entail an explicit, separate representation of a time dimension. We propose that the notion of an event-independent, metric "time as such" is not universal, but a cultural and historical construction based on cognitive technologies for measuring time intervals. We critically examine claims that time is universally conceptualized in terms of spatial metaphors, and hypothesize that systematic space-time metaphor is only found in languages and cultures that have constructed the notion of time as a separate dimension. We emphasize the importance of distinguishing what is universal from what is variable in cultural and linguistic representations of time, and speculate on the general implications of an event-based understanding of time.

  15. Hippocampal "time cells" bridge the gap in memory for discontiguous events.

    PubMed

    MacDonald, Christopher J; Lepage, Kyle Q; Eden, Uri T; Eichenbaum, Howard

    2011-08-25

    The hippocampus is critical to remembering the flow of events in distinct experiences and, in doing so, bridges temporal gaps between discontiguous events. Here, we report a robust hippocampal representation of sequence memories, highlighted by "time cells" that encode successive moments during an empty temporal gap between the key events, while also encoding location and ongoing behavior. Furthermore, just as most place cells "remap" when a salient spatial cue is altered, most time cells form qualitatively different representations ("retime") when the main temporal parameter is altered. Hippocampal neurons also differentially encode the key events and disambiguate different event sequences to compose unique, temporally organized representations of specific experiences. These findings suggest that hippocampal neural ensembles segment temporally organized memories much the same as they represent locations of important events in spatially defined environments.

  16. Monitoring Natural Events Globally in Near Real-Time Using NASA's Open Web Services and Tools

    NASA Technical Reports Server (NTRS)

    Boller, Ryan A.; Ward, Kevin Alan; Murphy, Kevin J.

    2015-01-01

    Since 1960, NASA has been making global measurements of the Earth from a multitude of space-based missions, many of which can be useful for monitoring natural events. In recent years, these measurements have been made available in near real-time, making it possible to use them to also aid in managing the response to natural events. We present the challenges and ongoing solutions to using NASA satellite data for monitoring and managing these events.

  17. Multicomponent seismic noise attenuation with multivariate order statistic filters

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Wang, Yun; Wang, Xiaokai; Xun, Chao

    2016-10-01

    The vector relationship between multicomponent seismic data is highly important for multicomponent processing and interpretation, but this vector relationship could be damaged when each component is processed individually. To overcome the drawback of standard component-by-component filtering, multivariate order statistic filters are introduced and extended to attenuate the noise of multicomponent seismic data by treating such a dataset as a vector wavefield rather than a set of scalar fields. According to the characteristics of seismic signals, we implement this type of multivariate filtering along local events. First, the optimal local events are recognized according to the similarity between the vector signals, which are windowed from neighbouring seismic traces with a sliding time window along each trial trajectory. An efficient strategy is used to reduce the computational cost of similarity measurement for vector signals. Next, one vector sample from each of the neighbouring traces is extracted along the optimal local event as the input data for a multivariate filter. Different multivariate filters are optimal for different types of noise. The multichannel modified trimmed mean (MTM) filter, as one of the multivariate order statistic filters, is applied to synthetic and field multicomponent seismic data to test its performance for attenuating white Gaussian noise. The results indicate that the multichannel MTM filter can attenuate noise while preserving the relative amplitude information of multicomponent seismic data more effectively than a single-channel filter.
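
    One plausible reading of the multichannel modified trimmed mean (MTM) filter, sketched under the assumption that vector samples gathered along a local event are trimmed by their distance to the component-wise median before averaging. The trimming radius q and the window handling are illustrative choices, not necessarily the authors' exact implementation.

        import numpy as np

        def multichannel_mtm(window, q):
            """Modified trimmed mean of a set of vector samples.

            window : (n_traces, n_components) vector samples taken along one local event
            q      : trimming radius; samples farther than q (Euclidean distance) from
                     the component-wise median are excluded before averaging
            """
            window = np.asarray(window, dtype=float)
            med = np.median(window, axis=0)                  # reference vector
            dist = np.linalg.norm(window - med, axis=1)      # distance of each sample
            keep = window[dist <= q]
            return keep.mean(axis=0) if len(keep) else med   # fall back to the median

        # toy usage: five 3-component samples, one contaminated by a noise burst
        samples = np.array([[1.0, 0.5, -0.2],
                            [1.1, 0.4, -0.1],
                            [0.9, 0.6, -0.3],
                            [8.0, 7.0,  6.0],   # noise burst
                            [1.0, 0.5, -0.2]])
        print(multichannel_mtm(samples, q=1.0))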

  18. Orbital chronology for the Cenomanian-Turonian Oceanic Anoxic Event 2 and the timing of the "Plenus Cold Event"

    NASA Astrophysics Data System (ADS)

    Voigt, Silke; Erbacher, Jochen; Pälike, Heiko; Westerhold, Thomas

    2015-04-01

    The Cenomanian-Turonian OAE 2 is reflected by one of the most extreme carbon cycle perturbations in Earth's history, possibly triggered by massive volcanic CO2 degassing during the emplacement of large igneous provinces (LIPs). Severe climatic, oceanographic and biotic feedbacks are reported from different depositional settings. The nature of these changes as well as their spatial and temporal dimension is still not well understood to date. The main difficulty in integrating observations from different locations is the insufficient resolution of available timescales and stratigraphies. Although new radiometric ages exist for the stratotype section at Pueblo and regional orbital age models have been developed from shelf settings on both sides of the Atlantic Ocean, their correlation to the open ocean is not unequivocal. Here, we present a cyclostratigraphic correlation based on time series analyses of relative changes in XRF-element concentrations derived from two sites, the oceanic ODP-Site 1261 (Demerara Rise, tropical Western Atlantic) and a mid-latitude shelf-sea locality exposed in the Wunstorf Core (Germany). Both successions expose distinct sedimentary cycles as well as a brief period of intermittent surface-water cooling and bottom water oxygenation ("Plenus Cold Event" in western Europe) during the early OAE 2, which several authors consider a synchronous event. The estimated overall duration of OAE 2 is about 5 and 4.5 short eccentricity cycles at Site 1261 and Wunstorf, respectively. For correlation purposes, the independently derived floating orbital time scales of Site 1261 and Wunstorf are tied to each other using the first prominent increase of the δ13C anomaly, a characteristic feature of all OAE 2 successions. Sedimentary cycles, interpreted as short eccentricity cycles during OAE 2, are correlated between the two different depositional settings. Based on this correlation the cooling pulses recorded in the tropical Atlantic and the European mid

  19. Schizophrenia Spectrum Disorders Show Reduced Specificity and Less Positive Events in Mental Time Travel

    PubMed Central

    Chen, Xing-jie; Liu, Lu-lu; Cui, Ji-fang; Wang, Ya; Chen, An-tao; Li, Feng-hua; Wang, Wei-hong; Zheng, Han-feng; Gan, Ming-yuan; Li, Chun-qiu; Shum, David H. K.; Chan, Raymond C. K.

    2016-01-01

    Mental time travel refers to the ability to recall past events and to imagine possible future events. Schizophrenia (SCZ) patients have problems in remembering specific personal experiences in the past and imagining what will happen in the future. This study aimed to examine episodic past and future thinking in SCZ spectrum disorders, including SCZ patients and individuals with schizotypal personality disorder (SPD) proneness who are at risk for developing SCZ. Thirty-two SCZ patients, 30 SPD proneness individuals, and 33 healthy controls participated in the study. The Sentence Completion for Events from the Past Test (SCEPT) and the Sentence Completion for Events in the Future Test were used to measure past and future thinking abilities. Results showed that SCZ patients had significantly reduced specificity in recalling past and imagining future events; they generated a smaller proportion of specific and extended events than healthy controls. SPD proneness individuals only generated fewer extended events than healthy controls. The reduced specificity was mainly manifested in imagining future events. Both SCZ patients and SPD proneness individuals generated fewer positive events than controls. These results suggest mental time travel impairments in SCZ spectrum disorders and have implications for understanding their cognitive and emotional deficits. PMID:27507958

  20. Schizophrenia Spectrum Disorders Show Reduced Specificity and Less Positive Events in Mental Time Travel.

    PubMed

    Chen, Xing-Jie; Liu, Lu-Lu; Cui, Ji-Fang; Wang, Ya; Chen, An-Tao; Li, Feng-Hua; Wang, Wei-Hong; Zheng, Han-Feng; Gan, Ming-Yuan; Li, Chun-Qiu; Shum, David H K; Chan, Raymond C K

    2016-01-01

    Mental time travel refers to the ability to recall past events and to imagine possible future events. Schizophrenia (SCZ) patients have problems in remembering specific personal experiences in the past and imagining what will happen in the future. This study aimed to examine episodic past and future thinking in SCZ spectrum disorders, including SCZ patients and individuals with schizotypal personality disorder (SPD) proneness who are at risk for developing SCZ. Thirty-two SCZ patients, 30 SPD proneness individuals, and 33 healthy controls participated in the study. The Sentence Completion for Events from the Past Test (SCEPT) and the Sentence Completion for Events in the Future Test were used to measure past and future thinking abilities. Results showed that SCZ patients had significantly reduced specificity in recalling past and imagining future events; they generated a smaller proportion of specific and extended events than healthy controls. SPD proneness individuals only generated fewer extended events than healthy controls. The reduced specificity was mainly manifested in imagining future events. Both SCZ patients and SPD proneness individuals generated fewer positive events than controls. These results suggest mental time travel impairments in SCZ spectrum disorders and have implications for understanding their cognitive and emotional deficits. PMID:27507958

  1. A novel way to detect correlations on multi-time scales, with temporal evolution and for multi-variables

    PubMed Central

    Yuan, Naiming; Xoplaki, Elena; Zhu, Congwen; Luterbacher, Juerg

    2016-01-01

    In this paper, two new methods, Temporal evolution of Detrended Cross-Correlation Analysis (TDCCA) and Temporal evolution of Detrended Partial-Cross-Correlation Analysis (TDPCCA), are proposed by generalizing DCCA and DPCCA. Applying TDCCA/TDPCCA, it is possible to study correlations on multi-time scales and over different periods. To illustrate their properties, we used two climatological examples: i) Global Sea Level (GSL) versus North Atlantic Oscillation (NAO); and ii) Summer Rainfall over Yangtze River (SRYR) versus previous winter Pacific Decadal Oscillation (PDO). We find significant correlations between GSL and NAO on time scales of 60 to 140 years, but the correlations are non-significant between 1865 and 1875. As for SRYR and PDO, significant correlations are found on time scales of 30 to 35 years, but the correlations are more pronounced during the recent 30 years. By combining TDCCA/TDPCCA and DCCA/DPCCA, we proposed a new correlation-detection system which, compared to traditional methods, can objectively show how two time series are related (on which time scale, during which time period). These are important not only for the diagnosis of complex systems, but also for better design of prediction models. Therefore, the new methods offer new opportunities for applications in natural sciences, such as ecology, economics, sociology and other research fields. PMID:27293028
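
    For orientation, a compact version of plain DCCA, on which the temporal-evolution variants are built: both series are integrated, divided into boxes, locally detrended, and the detrended covariance is converted into the cross-correlation coefficient rho_DCCA(n). The sliding-window (TDCCA) and partial-correlation (TDPCCA) extensions are not reproduced in this sketch.

        import numpy as np

        def dcca_coefficient(x, y, n):
            """Detrended cross-correlation coefficient rho_DCCA at box size n."""
            x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
            X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())   # integrated profiles
            f2xy, f2xx, f2yy = [], [], []
            t = np.arange(n)
            for start in range(0, len(X) - n + 1, n):                 # non-overlapping boxes
                xs, ys = X[start:start + n], Y[start:start + n]
                xd = xs - np.polyval(np.polyfit(t, xs, 1), t)         # linear detrending
                yd = ys - np.polyval(np.polyfit(t, ys, 1), t)
                f2xy.append(np.mean(xd * yd))
                f2xx.append(np.mean(xd ** 2))
                f2yy.append(np.mean(yd ** 2))
            return np.mean(f2xy) / np.sqrt(np.mean(f2xx) * np.mean(f2yy))

        # toy usage: two noisy series sharing a slow common component
        rng = np.random.default_rng(1)
        common = np.cumsum(rng.standard_normal(4000))
        a = common + 5 * rng.standard_normal(4000)
        b = common + 5 * rng.standard_normal(4000)
        print(round(float(dcca_coefficient(a, b, n=200)), 2))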

  2. A novel way to detect correlations on multi-time scales, with temporal evolution and for multi-variables

    NASA Astrophysics Data System (ADS)

    Yuan, Naiming; Xoplaki, Elena; Zhu, Congwen; Luterbacher, Juerg

    2016-06-01

    In this paper, two new methods, Temporal evolution of Detrended Cross-Correlation Analysis (TDCCA) and Temporal evolution of Detrended Partial-Cross-Correlation Analysis (TDPCCA), are proposed by generalizing DCCA and DPCCA. Applying TDCCA/TDPCCA, it is possible to study correlations on multi-time scales and over different periods. To illustrate their properties, we used two climatological examples: i) Global Sea Level (GSL) versus North Atlantic Oscillation (NAO); and ii) Summer Rainfall over Yangtze River (SRYR) versus previous winter Pacific Decadal Oscillation (PDO). We find significant correlations between GSL and NAO on time scales of 60 to 140 years, but the correlations are non-significant between 1865 and 1875. As for SRYR and PDO, significant correlations are found on time scales of 30 to 35 years, but the correlations are more pronounced during the recent 30 years. By combining TDCCA/TDPCCA and DCCA/DPCCA, we proposed a new correlation-detection system which, compared to traditional methods, can objectively show how two time series are related (on which time scale, during which time period). These are important not only for the diagnosis of complex systems, but also for better design of prediction models. Therefore, the new methods offer new opportunities for applications in natural sciences, such as ecology, economics, sociology and other research fields.

  3. A discrete time event-history approach to informative drop-out in mixed latent Markov models with covariates.

    PubMed

    Bartolucci, Francesco; Farcomeni, Alessio

    2015-03-01

    Mixed latent Markov (MLM) models represent an important tool for the analysis of longitudinal data when response variables are affected by time-fixed and time-varying unobserved heterogeneity, where the latter is accounted for by a hidden Markov chain. In order to avoid bias when using a model of this type in the presence of informative drop-out, we propose an event-history (EH) extension of the latent Markov approach that may be used with multivariate longitudinal data, in which one or more outcomes of a different nature are observed at each time occasion. The EH component of the resulting model refers to the interval-censored drop-out, and bias in MLM modeling is avoided through correlated random effects, included in the different model components, which follow common latent distributions. In order to perform maximum likelihood estimation of the proposed model by the expectation-maximization algorithm, we extend the usual forward-backward recursions of Baum and Welch. The algorithm has the same complexity as the one adopted in cases of non-informative drop-out. We illustrate the proposed approach through simulations and an application based on data coming from a medical study about primary biliary cirrhosis in which there are two outcomes of interest, one continuous and the other binary. PMID:25227970

  4. Event-Based Tone Mapping for Asynchronous Time-Based Image Sensor.

    PubMed

    Simon Chane, Camille; Ieng, Sio-Hoi; Posch, Christoph; Benosman, Ryad B

    2016-01-01

    The asynchronous time-based neuromorphic image sensor ATIS is an array of autonomously operating pixels able to encode luminance information with an exceptionally high dynamic range (>143 dB). This paper introduces an event-based methodology to display data from this type of event-based imagers, taking into account the large dynamic range and high temporal accuracy that go beyond available mainstream display technologies. We introduce an event-based tone mapping methodology for asynchronously acquired time encoded gray-level data. A global and a local tone mapping operator are proposed. Both are designed to operate on a stream of incoming events rather than on time frame windows. Experimental results on real outdoor scenes are presented to evaluate the performance of the tone mapping operators in terms of quality, temporal stability, adaptation capability, and computational time. PMID:27642275

  5. Event-Based Tone Mapping for Asynchronous Time-Based Image Sensor

    PubMed Central

    Simon Chane, Camille; Ieng, Sio-Hoi; Posch, Christoph; Benosman, Ryad B.

    2016-01-01

    The asynchronous time-based neuromorphic image sensor ATIS is an array of autonomously operating pixels able to encode luminance information with an exceptionally high dynamic range (>143 dB). This paper introduces an event-based methodology to display data from this type of event-based imagers, taking into account the large dynamic range and high temporal accuracy that go beyond available mainstream display technologies. We introduce an event-based tone mapping methodology for asynchronously acquired time encoded gray-level data. A global and a local tone mapping operator are proposed. Both are designed to operate on a stream of incoming events rather than on time frame windows. Experimental results on real outdoor scenes are presented to evaluate the performance of the tone mapping operators in terms of quality, temporal stability, adaptation capability, and computational time.

  6. Event-Based Tone Mapping for Asynchronous Time-Based Image Sensor

    PubMed Central

    Simon Chane, Camille; Ieng, Sio-Hoi; Posch, Christoph; Benosman, Ryad B.

    2016-01-01

    The asynchronous time-based neuromorphic image sensor ATIS is an array of autonomously operating pixels able to encode luminance information with an exceptionally high dynamic range (>143 dB). This paper introduces an event-based methodology to display data from this type of event-based imagers, taking into account the large dynamic range and high temporal accuracy that go beyond available mainstream display technologies. We introduce an event-based tone mapping methodology for asynchronously acquired time encoded gray-level data. A global and a local tone mapping operator are proposed. Both are designed to operate on a stream of incoming events rather than on time frame windows. Experimental results on real outdoor scenes are presented to evaluate the performance of the tone mapping operators in terms of quality, temporal stability, adaptation capability, and computational time. PMID:27642275

  7. WAITING TIME DISTRIBUTION OF SOLAR ENERGETIC PARTICLE EVENTS MODELED WITH A NON-STATIONARY POISSON PROCESS

    SciTech Connect

    Li, C.; Su, W.; Fang, C.; Zhong, S. J.; Wang, L.

    2014-09-10

    We present a study of the waiting time distributions (WTDs) of solar energetic particle (SEP) events observed with the spacecraft WIND and GOES. The WTDs of both solar electron events (SEEs) and solar proton events (SPEs) display a power-law tail of ∼Δt^(−γ). The SEEs display a broken power-law WTD. The power-law index is γ1 = 0.99 for the short waiting times (<70 hr) and γ2 = 1.92 for large waiting times (>100 hr). The break of the WTD of SEEs is probably due to the modulation of the corotating interaction regions. The power-law index, γ ∼ 1.82, is derived for the WTD of the SPEs which is consistent with the WTD of type II radio bursts, indicating a close relationship between the shock wave and the production of energetic protons. The WTDs of SEP events can be modeled with a non-stationary Poisson process, which was proposed to understand the waiting time statistics of solar flares. We generalize the method and find that, if the SEP event rate λ = 1/Δt varies as the time distribution of event rate f(λ) = Aλ^(−α)exp(−βλ), the time-dependent Poisson distribution can produce a power-law tail WTD of ∼Δt^(α−3), where 0 ≤ α < 2.
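
    The non-stationary Poisson picture can be checked numerically: if event times are generated with a rate that is constant within blocks but redrawn from a broad distribution between blocks, the waiting-time histogram develops a much heavier tail than a constant-rate process with the same mean rate. The block length and the gamma rate distribution below are assumptions chosen for illustration, not the authors' fitted f(λ).

        import numpy as np

        def waiting_times(rates, block_hours=200.0, rng=None):
            """Simulate a piecewise-constant-rate Poisson process and return the
            waiting times (hours) between consecutive events."""
            rng = rng if rng is not None else np.random.default_rng(0)
            blocks, t0 = [], 0.0
            for lam in rates:
                n = rng.poisson(lam * block_hours)
                blocks.append(np.sort(rng.uniform(t0, t0 + block_hours, n)))
                t0 += block_hours
            return np.diff(np.concatenate(blocks))

        rng = np.random.default_rng(0)
        var_rates = rng.gamma(shape=0.5, scale=0.1, size=500)   # broad, block-to-block rate
        const_rates = np.full(500, var_rates.mean())            # stationary comparison

        for name, rates in [("non-stationary", var_rates), ("stationary", const_rates)]:
            wt = waiting_times(rates, rng=rng)
            print(name, "fraction of waiting times > 100 h:", round(float(np.mean(wt > 100)), 4))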

  8. A multivariate time-frequency method to characterize the influence of respiration over heart period and arterial pressure

    NASA Astrophysics Data System (ADS)

    Orini, Michele; Bailón, Raquel; Laguna, Pablo; Mainardi, Luca T.; Barbieri, Riccardo

    2012-12-01

    Respiratory activity introduces oscillations both in arterial pressure and heart period, through mechanical and autonomic mechanisms. Respiration, arterial pressure, and heart period are, generally, non-stationary processes and the interactions between them are dynamic. In this study we present a methodology to robustly estimate the time course of cross spectral indices to characterize dynamic interactions between respiratory oscillations of heart period and blood pressure, as well as their interactions with respiratory activity. Time-frequency distributions belonging to Cohen's class are used to estimate time-frequency (TF) representations of coherence, partial coherence and phase difference. The characterization is based on the estimation of the time course of cross spectral indices estimated in specific TF regions around the respiratory frequency. We used this methodology to describe the interactions between respiration, heart period variability (HPV) and systolic arterial pressure variability (SAPV) during tilt table test with both spontaneous and controlled respiratory patterns. The effect of selective autonomic blockade was also studied. Results suggest the presence of common underlying mechanisms of regulation between cardiovascular signals, whose interactions are time-varying. SAPV changes followed respiratory flow both in supine and standing positions and even after selective autonomic blockade. During head-up tilt, phase differences between respiration and SAPV increased. Phase differences between respiration and HPV were comparable to those between respiration and SAPV during supine position, and significantly increased during standing. As a result, respiratory oscillations in SAPV preceded respiratory oscillations in HPV during standing. Partial coherence was the most sensitive index to orthostatic stress. Phase difference estimates were consistent among spontaneous and controlled breathing patterns, whereas coherence was higher in spontaneous breathing
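
    As a rough stand-in for the Cohen's-class time-frequency estimators used in the paper, the sketch below computes ordinary Welch coherence and the cross-spectral phase in a band around an assumed respiratory frequency of 0.25 Hz on simulated signals; the time-varying and partial-coherence aspects are not reproduced.

        import numpy as np
        from scipy import signal

        fs = 4.0                                   # Hz, resampled cardiovascular series
        t = np.arange(0, 300, 1 / fs)
        rng = np.random.default_rng(2)
        resp = np.sin(2 * np.pi * 0.25 * t)                                  # respiration
        sapv = 0.8 * np.sin(2 * np.pi * 0.25 * t - 0.3) + 0.2 * rng.standard_normal(t.size)
        hpv = 0.6 * np.sin(2 * np.pi * 0.25 * t - 0.9) + 0.2 * rng.standard_normal(t.size)

        f, coh = signal.coherence(resp, hpv, fs=fs, nperseg=256)     # resp-HPV coherence
        _, pxy = signal.csd(resp, sapv, fs=fs, nperseg=256)          # resp-SAPV cross-spectrum
        band = (f > 0.2) & (f < 0.3)                                 # around 0.25 Hz
        print("coherence resp-HPV near 0.25 Hz:", round(float(coh[band].max()), 2))
        print("resp-SAPV phase difference (rad):", round(float(np.angle(pxy[band]).mean()), 2))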

  9. Spatial Cueing in Time-Space Synesthetes: An Event-Related Brain Potential Study

    ERIC Educational Resources Information Center

    Teuscher, Ursina; Brang, David; Ramachandran, Vilayanur S.; Coulson, Seana

    2010-01-01

    Some people report that they consistently and involuntarily associate time events, such as months of the year, with specific spatial locations; a condition referred to as time-space synesthesia. The present study investigated the manner in which such synesthetic time-space associations affect visuo-spatial attention via an endogenous cuing…

  10. Improving linear accelerator service response with a real- time electronic event reporting system.

    PubMed

    Hoisak, Jeremy D P; Pawlicki, Todd; Kim, Gwe-Ya; Fletcher, Richard; Moore, Kevin L

    2014-09-08

    To track linear accelerator performance issues, an online event recording system was developed in-house for use by therapists and physicists to log the details of technical problems arising on our institution's four linear accelerators. In use since October 2010, the system was designed so that all clinical physicists would receive email notification when an event was logged. Starting in October 2012, we initiated a pilot project in collaboration with our linear accelerator vendor to explore a new model of service and support, in which event notifications were also sent electronically directly to dedicated engineers at the vendor's technical help desk, who then initiated a response to technical issues. Previously, technical issues were reported by telephone to the vendor's call center, which then disseminated information and coordinated a response with the Technical Support help desk and local service engineers. The purpose of this work was to investigate the improvements to clinical operations resulting from this new service model. The new and old service models were quantitatively compared by reviewing event logs and the oncology information system database in the nine months prior to and after initiation of the project. Here, we focus on events that resulted in an inoperative linear accelerator ("down" machine). Machine downtime, vendor response time, treatment cancellations, and event resolution were evaluated and compared over two equivalent time periods. In 389 clinical days, there were 119 machine-down events: 59 events before and 60 after introduction of the new model. In the new model, median time to service response decreased from 45 to 8 min, service engineer dispatch time decreased 44%, downtime per event decreased from 45 to 20 min, and treatment cancellations decreased 68%. The decreased vendor response time and reduced number of on-site visits by a service engineer resulted in decreased downtime and decreased patient treatment cancellations.
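
    The before/after comparison reduces to a few aggregates over the machine-down log. The sketch below uses a hypothetical log with invented field names and values, purely to show the shape of the computation.

        from statistics import median

        # hypothetical machine-down log: one record per event (field names invented)
        events = [
            {"model": "old", "response_min": 50, "downtime_min": 45, "cancelled": 3},
            {"model": "old", "response_min": 41, "downtime_min": 55, "cancelled": 2},
            {"model": "new", "response_min": 7,  "downtime_min": 22, "cancelled": 1},
            {"model": "new", "response_min": 9,  "downtime_min": 18, "cancelled": 0},
        ]

        def summarize(model):
            rows = [e for e in events if e["model"] == model]
            return {
                "n_events": len(rows),
                "median_response_min": median(e["response_min"] for e in rows),
                "median_downtime_min": median(e["downtime_min"] for e in rows),
                "total_cancellations": sum(e["cancelled"] for e in rows),
            }

        for m in ("old", "new"):
            print(m, summarize(m))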

  11. Improving linear accelerator service response with a real- time electronic event reporting system.

    PubMed

    Hoisak, Jeremy D P; Pawlicki, Todd; Kim, Gwe-Ya; Fletcher, Richard; Moore, Kevin L

    2014-01-01

    To track linear accelerator performance issues, an online event recording system was developed in-house for use by therapists and physicists to log the details of technical problems arising on our institution's four linear accelerators. In use since October 2010, the system was designed so that all clinical physicists would receive email notification when an event was logged. Starting in October 2012, we initiated a pilot project in collaboration with our linear accelerator vendor to explore a new model of service and support, in which event notifications were also sent electronically directly to dedicated engineers at the vendor's technical help desk, who then initiated a response to technical issues. Previously, technical issues were reported by telephone to the vendor's call center, which then disseminated information and coordinated a response with the Technical Support help desk and local service engineers. The purpose of this work was to investigate the improvements to clinical operations resulting from this new service model. The new and old service models were quantitatively compared by reviewing event logs and the oncology information system database in the nine months prior to and after initiation of the project. Here, we focus on events that resulted in an inoperative linear accelerator ("down" machine). Machine downtime, vendor response time, treatment cancellations, and event resolution were evaluated and compared over two equivalent time periods. In 389 clinical days, there were 119 machine-down events: 59 events before and 60 after introduction of the new model. In the new model, median time to service response decreased from 45 to 8 min, service engineer dispatch time decreased 44%, downtime per event decreased from 45 to 20 min, and treatment cancellations decreased 68%. The decreased vendor response time and reduced number of on-site visits by a service engineer resulted in decreased downtime and decreased patient treatment cancellations. PMID

  12. Near Optimal Event-Triggered Control of Nonlinear Discrete-Time Systems Using Neurodynamic Programming.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-09-01

    This paper presents an event-triggered near optimal control of uncertain nonlinear discrete-time systems. Event-driven neurodynamic programming (NDP) is utilized to design the control policy. A neural network (NN)-based identifier, with event-based state and input vectors, is utilized to learn the system dynamics. An actor-critic framework is used to learn the cost function and the optimal control input. The NN weights of the identifier, the critic, and the actor NNs are tuned aperiodically once every triggered instant. An adaptive event-trigger condition to decide the trigger instants is derived. Thus, a suitable number of events are generated to ensure a desired accuracy of approximation. A near optimal performance is achieved without using value and/or policy iterations. A detailed analysis of nontrivial inter-event times with an explicit formula to show the reduction in computation is also derived. The Lyapunov technique is used in conjunction with the event-trigger condition to guarantee the ultimate boundedness of the closed-loop system. The simulation results are included to verify the performance of the controller. The net result is the development of event-driven NDP. PMID:26285220
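
    Stripped of the neural-network identifier and the actor-critic machinery, the core event-triggering idea is that the control input is only recomputed when the gap between the current state and the last transmitted state exceeds a state-dependent threshold. The linear toy plant, the gain, and the fixed sensitivity sigma below are arbitrary illustrations, not the paper's adaptive trigger condition.

        import numpy as np

        A = np.array([[1.0, 0.1], [0.0, 0.95]])   # toy discrete-time plant
        B = np.array([[0.0], [0.1]])
        K = np.array([[1.0, 2.0]])                # stabilizing feedback gain (illustrative)
        sigma = 0.2                               # trigger sensitivity

        x = np.array([[1.0], [0.0]])
        x_last = x.copy()                         # state held since the last trigger
        u = -K @ x_last
        updates = 0
        for k in range(100):
            # event-trigger condition: recompute the control only when the gap to
            # the last transmitted state outgrows a fraction of the current state
            if np.linalg.norm(x - x_last) > sigma * np.linalg.norm(x):
                x_last = x.copy()
                u = -K @ x_last
                updates += 1
            x = A @ x + B @ u                     # the plant evolves at every step
        print("control updates:", updates, "of 100 steps; final state norm:",
              round(float(np.linalg.norm(x)), 4))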

  13. Near Optimal Event-Triggered Control of Nonlinear Discrete-Time Systems Using Neurodynamic Programming.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-09-01

    This paper presents an event-triggered near optimal control of uncertain nonlinear discrete-time systems. Event-driven neurodynamic programming (NDP) is utilized to design the control policy. A neural network (NN)-based identifier, with event-based state and input vectors, is utilized to learn the system dynamics. An actor-critic framework is used to learn the cost function and the optimal control input. The NN weights of the identifier, the critic, and the actor NNs are tuned aperiodically once every triggered instant. An adaptive event-trigger condition to decide the trigger instants is derived. Thus, a suitable number of events are generated to ensure a desired accuracy of approximation. A near optimal performance is achieved without using value and/or policy iterations. A detailed analysis of nontrivial inter-event times with an explicit formula to show the reduction in computation is also derived. The Lyapunov technique is used in conjunction with the event-trigger condition to guarantee the ultimate boundedness of the closed-loop system. The simulation results are included to verify the performance of the controller. The net result is the development of event-driven NDP.

  14. From sensation to perception: Using multivariate classification of visual illusions to identify neural correlates of conscious awareness in space and time.

    PubMed

    Hogendoorn, Hinze

    2015-01-01

    An important goal of cognitive neuroscience is understanding the neural underpinnings of conscious awareness. Although the low-level processing of sensory input is well understood in most modalities, it remains a challenge to understand how the brain translates such input into conscious awareness. Here, I argue that the application of multivariate pattern classification techniques to neuroimaging data acquired while observers experience perceptual illusions provides a unique way to dissociate sensory mechanisms from mechanisms underlying conscious awareness. Using this approach, it is possible to directly compare patterns of neural activity that correspond to the contents of awareness, independent from changes in sensory input, and to track these neural representations over time at high temporal resolution. I highlight five recent studies using this approach, and provide practical considerations and limitations for future implementations.
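
    A generic time-resolved decoding sketch in the spirit of the approach: a cross-validated classifier is trained at every time point of simulated trials-by-channels-by-time data, so that above-chance accuracy marks when the two perceptual interpretations become decodable. The simulated data, classifier choice, and the injected effect at late time points are assumptions for illustration.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(4)
        n_trials, n_channels, n_times = 80, 32, 50
        X = rng.standard_normal((n_trials, n_channels, n_times))
        y = np.repeat([0, 1], n_trials // 2)       # two perceptual interpretations
        X[y == 1, :5, 30:] += 0.8                  # decodable difference only after t = 30

        # time-resolved decoding: one cross-validated classifier per time point
        accuracy = [
            cross_val_score(LogisticRegression(max_iter=1000), X[:, :, t], y, cv=5).mean()
            for t in range(n_times)
        ]
        print("mean accuracy before t=30:", round(float(np.mean(accuracy[:30])), 2))
        print("mean accuracy after t=30: ", round(float(np.mean(accuracy[30:])), 2))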

  15. A Study in Relating Time-Between-Events to Seismic Source Mechanisms in Hardrock Mining

    NASA Astrophysics Data System (ADS)

    Beneteau, Donna-Lynn Lorette

    This thesis presents a Time-Between-Events (TBE) methodology for enhancing the interpretation of source mechanisms causing populations of microseismic data. The study was done using several datasets prepared by operators of mine seismic systems. These datasets, varying in size from 38 to 16324 events, represent groups of events in close proximity to each other. They may have been identified based on their nearness to individual seismic sources (faults, dykes, etc.) or mining structures (stopes, abutments, pillars, orepasses, etc.). Plotting sets of data collected over periods of time using a frequency-magnitude distribution is common in both earthquake and mining seismology. This TBE technique simply makes use of the inter-event times and b-values (slopes of the best-fit lines on the frequency-magnitude charts) from the same set of data. Four distinct patterns in TBE versus event magnitude have been found, which indicate whether a single constant seismic source is causing a population of data or whether the smaller and larger events within the population represent varying seismic sources. Different rates of events, identified as TBE-rates in this study, may suggest whether or not the events are blast induced. Interpretations of TBE results are combined with other methods that have been proven successful for inferring seismic sources, including magnitude-time history analysis, frequency-magnitude charts, S-wave to P-wave energy ratios, diurnal and phasor charts. A "Seismic Mechanism Assessment Worksheet" brings all of the collected information together to assist in the interpretation. Every dataset in the study is characterized based on its composition of shear, fracture or indeterminate events. This was necessary to determine whether one dominant seismic source created a dataset or whether it reflects varying blends of these three sources. In differing sizes of datasets, examples are found to show that the b-values and TBE-rates will be the same only when these
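
    The two ingredients of the methodology can be computed directly from a catalog: the b-value from the frequency-magnitude distribution (here via Aki's maximum-likelihood estimator, a standard choice rather than necessarily the one used in the thesis) and the inter-event times. The mini-catalog below is hypothetical.

        import numpy as np

        def b_value(magnitudes, m_min):
            """Aki's maximum-likelihood b-value for events at or above m_min."""
            m = np.asarray(magnitudes, dtype=float)
            m = m[m >= m_min]
            return np.log10(np.e) / (m.mean() - m_min)

        def time_between_events(times_hours):
            """Inter-event times (TBE) from a possibly unsorted catalog."""
            return np.diff(np.sort(np.asarray(times_hours, dtype=float)))

        # hypothetical mini-catalog: (origin time in hours, magnitude)
        catalog = [(0.0, -1.2), (2.5, -0.8), (2.6, 0.4), (9.0, -1.0), (30.0, 1.1)]
        t, m = zip(*catalog)
        print("b-value:", round(b_value(m, m_min=-1.2), 2))
        print("TBE (h):", time_between_events(t).tolist())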

  16. Consensus analysis of networks with time-varying topology and event-triggered diffusions.

    PubMed

    Han, Yujuan; Lu, Wenlian; Chen, Tianping

    2015-11-01

    This paper studies the consensus problem of networks with time-varying topology. Event-triggered rules are employed in the diffusion coupling terms to reduce the updating load of the coupled system. Two strategies are considered: an event-triggered strategy, in which each node observes the state information instantaneously to determine the next triggering time, and a self-triggered strategy, in which each node only needs to observe the state information at the event time to predict the next triggering time. In each strategy, two kinds of algorithms are considered: the pull-based algorithm, in which the diffusion coupling term of every node is updated with the latest observations of its neighborhood at its own triggered times, and the push-based algorithm, in which the diffusion coupling term of every node uses the state information of its neighborhood at their latest triggered times. It is proved that if the coupling matrix over time intervals of length less than some given constant has spanning trees, then the proposed algorithms can realize consensus. Examples with numerical simulations are provided to show the effectiveness of the theoretical results.
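
    A stripped-down, push-based variant of the idea on a fixed ring graph (the paper treats time-varying topologies): each node rebroadcasts its state only when the local error exceeds a decaying threshold, and the diffusion update uses only the latest broadcast values. Graph, step size, and threshold schedule are illustrative assumptions.

        import numpy as np

        # undirected 4-node ring (adjacency matrix) and diffusion step size
        A = np.array([[0, 1, 0, 1],
                      [1, 0, 1, 0],
                      [0, 1, 0, 1],
                      [1, 0, 1, 0]], dtype=float)
        eps = 0.2

        rng = np.random.default_rng(5)
        x = rng.uniform(0, 10, 4)          # node states
        x_hat = x.copy()                   # last broadcast states
        broadcasts = 0
        for k in range(200):
            thr = 0.5 * 0.97 ** k          # decaying trigger threshold (illustrative)
            for i in range(4):             # push-based: broadcast when local error is large
                if abs(x[i] - x_hat[i]) > thr:
                    x_hat[i] = x[i]
                    broadcasts += 1
            # diffusion update uses only the latest broadcast values x_hat
            x = x + eps * (A @ x_hat - A.sum(axis=1) * x_hat)
        print("broadcasts:", broadcasts, "final spread:", round(float(x.max() - x.min()), 4))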

  17. Case-based damage assessment of storm events in near real-time

    NASA Astrophysics Data System (ADS)

    Möhrle, Stella; Mühr, Bernhard

    2015-04-01

    Damage assessment in times of crisis is complex due to a highly dynamic environment and uncertainty about the available information. In order to assess the extent of a disaster in near real-time, historic events and their consequences may facilitate first estimations. Past events that are in the same category as, or have frame conditions similar to, imminent or currently occurring storms might give preliminary information about possible damages. The challenge here is to identify useful historic events based on little information regarding the current event. This work investigates the potential of drawing conclusions about a current event based on similar historic disasters, illustrated here for storm events in Germany. Predicted wind speed and area affected can be used for roughly classifying a storm event. For this purpose, a grid of equidistant points can be used to split up the area of Germany. In combination with the predicted wind speed at these points and the predicted number of points affected, a storm can be categorized quickly. In contrast to using only data from the observation network, the grid approach is more objective, since stations are not equally distributed. Based on model data, the determined storm class provides one key factor for identifying similar historic events. Further aspects, such as region or specific event characteristics, complete the knowledge about the potential storm scale and result in a similarity function, which automatically identifies useful events from the past. This work presents a case-based approach to estimate damages from an extreme storm event in Germany. The focus is on the similarity function, which is based on model storm classes, particularly wind speed and area affected. In order to determine possible damages more precisely, event-specific characteristics and region will be included. In the frame of determining similar storm events, neighboring storm classes will be
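
    A possible shape for the similarity function, under the assumption that each case carries a wind-speed class, the fraction of grid points affected, and a region tag; the weights, case names, and damage figures are entirely made-up placeholders, not data from the study.

        # hypothetical case base: wind-speed class (1-4), fraction of grid points
        # affected, region tag, and recorded damage (all values are placeholders)
        cases = [
            {"name": "storm_A", "cls": 4, "area": 0.90, "region": "nationwide", "damage": 100},
            {"name": "storm_B", "cls": 3, "area": 0.55, "region": "south", "damage": 40},
            {"name": "storm_C", "cls": 2, "area": 0.20, "region": "west", "damage": 15},
        ]

        def similarity(query, case, w_cls=0.5, w_area=0.35, w_region=0.15):
            """Weighted similarity in [0, 1]; the weights are illustrative."""
            s_cls = 1 - abs(query["cls"] - case["cls"]) / 3
            s_area = 1 - abs(query["area"] - case["area"])
            s_region = 1.0 if query["region"] == case["region"] else 0.0
            return w_cls * s_cls + w_area * s_area + w_region * s_region

        # query built from forecast data (class and affected-area fraction)
        query = {"cls": 3, "area": 0.60, "region": "south"}
        ranked = sorted(cases, key=lambda c: similarity(query, c), reverse=True)
        for c in ranked[:2]:
            print(c["name"], round(similarity(query, c), 2), "damage units:", c["damage"])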

  18. Electrophysiological correlates of strategic monitoring in event-based and time-based prospective memory.

    PubMed

    Cona, Giorgia; Arcara, Giorgio; Tarantino, Vincenza; Bisiacchi, Patrizia Silvia

    2012-01-01

    Prospective memory (PM) is the ability to remember to accomplish an action when a particular event occurs (i.e., event-based PM), or at a specific time (i.e., time-based PM) while performing an ongoing activity. Strategic Monitoring is one of the basic cognitive functions supporting PM tasks, and involves two mechanisms: a retrieval mode, which consists of keeping the intention active in memory; and target checking, engaged to verify the presence of the PM cue in the environment. The present study is aimed at providing the first evidence of event-related potentials (ERPs) associated with time-based PM, and at examining differences and commonalities in the ERPs related to Strategic Monitoring mechanisms between event- and time-based PM tasks. The addition of an event-based or a time-based PM task to an ongoing activity led to a similar sustained positive modulation of the ERPs in the ongoing trials, mainly expressed over prefrontal and frontal regions. This modulation might index the retrieval mode mechanism, similarly engaged in the two PM tasks. On the other hand, two further ERP modulations were shown specifically in an event-based PM task. An increased positivity was shown at 400-600 ms post-stimulus over occipital and parietal regions, and might be related to target checking. Moreover, an early modulation at 130-180 ms post-stimulus seems to reflect the recruitment of attentional resources for being ready to respond to the event-based PM cue. This latter modulation suggests the existence of a third mechanism specific to event-based PM; that is, the "readiness mode".

  19. Multivariate Analyses and Classification of Inertial Sensor Data to Identify Aging Effects on the Timed-Up-and-Go Test.

    PubMed

    Vervoort, Danique; Vuillerme, Nicolas; Kosse, Nienke; Hortobágyi, Tibor; Lamoth, Claudine J C

    2016-01-01

    Many tests can crudely quantify age-related mobility decrease, but instrumented versions of mobility tests could increase their specificity and sensitivity. The Timed-up-and-Go (TUG) test includes several elements that people use in daily life. The test has different transition phases: rise from a chair, walk, 180° turn, walk back, turn, and sit-down on a chair. For this reason, the TUG is an often-used test to evaluate, in a standardized way, possible decline in balance and walking ability due to age and/or pathology. Using inertial sensors, qualitative information about the performance of the sub-phases can provide more specific information about a decline in balance and walking ability. The first aim of our study was to identify variables extracted from the instrumented timed-up-and-go (iTUG) that most effectively distinguished performance differences across age (age 18-75). Second, we determined the discriminative ability of those identified variables to classify a younger (age 18-45) and older age group (age 46-75). From healthy adults (n = 59), trunk accelerations and angular velocities were recorded during iTUG performance. iTUG phases were detected with wavelet-analysis. Using a Partial Least Square (PLS) model, from the 72-iTUG variables calculated across phases, those that explained most of the covariance between variables and age were extracted. Subsequently, a PLS-discriminant analysis (DA) assessed the classification power of the identified iTUG variables to discriminate the age groups. 27 variables, related to turning, walking and the stand-to-sit movement, explained 71% of the variation in age. The PLS-DA with these 27 variables showed a sensitivity and specificity of 90% and 85%. Based on this model, the iTUG can accurately distinguish young and older adults. Such data can serve as a reference for pathological aging with respect to a widely used mobility test. Mobility tests like the TUG supplemented with smart technology could be used in clinical practice
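
    The two modeling steps map onto standard tooling; the sketch below uses scikit-learn's PLSRegression for the covariance-with-age step and a simple PLS-DA obtained by regressing a 0/1 group label and thresholding the prediction at 0.5. The simulated feature matrix stands in for the 72 iTUG variables and is not the study's data.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(6)
        n, p = 59, 72                                      # subjects x iTUG-like features
        age = rng.uniform(18, 75, n)
        X = rng.standard_normal((n, p))
        X[:, :10] += 0.04 * (age[:, None] - age.mean())    # some features carry age information

        # PLS regression: which variables covary most with age
        pls = PLSRegression(n_components=3).fit(X, age)
        top = np.argsort(np.abs(pls.coef_.ravel()))[::-1][:5]
        print("most age-related feature indices:", top.tolist())

        # PLS-DA: regress a 0/1 group label (young vs. old) and threshold at 0.5
        y = (age > 45).astype(float)
        scores = cross_val_predict(PLSRegression(n_components=3), X, y, cv=5).ravel()
        pred = (scores > 0.5).astype(float)
        print("sensitivity:", round(float(np.mean(pred[y == 1] == 1)), 2),
              "specificity:", round(float(np.mean(pred[y == 0] == 0)), 2))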

  20. Multivariate Analyses and Classification of Inertial Sensor Data to Identify Aging Effects on the Timed-Up-and-Go Test

    PubMed Central

    Vervoort, Danique; Vuillerme, Nicolas; Kosse, Nienke; Hortobágyi, Tibor; Lamoth, Claudine J. C.

    2016-01-01

    Many tests can crudely quantify age-related mobility decrease, but instrumented versions of mobility tests could increase their specificity and sensitivity. The Timed-up-and-Go (TUG) test includes several elements that people use in daily life. The test has different transition phases: rise from a chair, walk, 180° turn, walk back, turn, and sit-down on a chair. For this reason, the TUG is an often-used test to evaluate, in a standardized way, possible decline in balance and walking ability due to age and/or pathology. Using inertial sensors, qualitative information about the performance of the sub-phases can provide more specific information about a decline in balance and walking ability. The first aim of our study was to identify variables extracted from the instrumented timed-up-and-go (iTUG) that most effectively distinguished performance differences across age (age 18–75). Second, we determined the discriminative ability of those identified variables to classify a younger (age 18–45) and older age group (age 46–75). From healthy adults (n = 59), trunk accelerations and angular velocities were recorded during iTUG performance. iTUG phases were detected with wavelet-analysis. Using a Partial Least Square (PLS) model, from the 72-iTUG variables calculated across phases, those that explained most of the covariance between variables and age were extracted. Subsequently, a PLS-discriminant analysis (DA) assessed the classification power of the identified iTUG variables to discriminate the age groups. 27 variables, related to turning, walking and the stand-to-sit movement, explained 71% of the variation in age. The PLS-DA with these 27 variables showed a sensitivity and specificity of 90% and 85%. Based on this model, the iTUG can accurately distinguish young and older adults. Such data can serve as a reference for pathological aging with respect to a widely used mobility test. Mobility tests like the TUG supplemented with smart technology could be used in clinical

  1. A LORETA study of mental time travel: similar and distinct electrophysiological correlates of re-experiencing past events and pre-experiencing future events.

    PubMed

    Lavallee, Christina F; Persinger, Michael A

    2010-12-01

    Previous studies exploring mental time travel paradigms with functional neuroimaging techniques have uncovered both common and distinct neural correlates of re-experiencing past events or pre-experiencing future events. A gap in the mental time travel literature exists, as paradigms have not explored the affective component of re-experiencing past episodic events; this study explored this sparsely researched area. The present study employed standardized low resolution electromagnetic tomography (sLORETA) to identify electrophysiological correlates of re-experiencing affect-laden and non-affective past events, as well as pre-experiencing a future anticipated event. Our results confirm previous research and are also novel in that we illustrate common and distinct electrophysiological correlates of re-experiencing affective episodic events. Furthermore, research from this experiment yields results showing that a pattern of activation in the frontal and temporal regions is correlated with the time frame of the past or future events subjects imagined. PMID:20598583

  2. Time distributions of solar energetic particle events: Are SEPEs really random?

    NASA Astrophysics Data System (ADS)

    Jiggens, P. T. A.; Gabriel, S. B.

    2009-10-01

    Solar energetic particle events (SEPEs) can exhibit flux increases of several orders of magnitude over background levels and have always been considered to be random in nature in statistical models with no dependence of any one event on the occurrence of previous events. We examine whether this assumption of randomness in time is correct. Engineering modeling of SEPEs is important to enable reliable and efficient design of both Earth-orbiting and interplanetary spacecraft and future manned missions to Mars and the Moon. All existing engineering models assume that the frequency of SEPEs follows a Poisson process. We present analysis of the event waiting times using alternative distributions described by Lévy and time-dependent Poisson processes and compared these with the usual Poisson distribution. The results show significant deviation from a Poisson process and indicate that the underlying physical processes might be more closely related to a Lévy-type process, suggesting that there is some inherent “memory” in the system. Inherent Poisson assumptions of stationarity and event independence are investigated, and it appears that they do not hold and can be dependent upon the event definition used. SEPEs appear to have some memory indicating that events are not completely random with activity levels varying even during solar active periods and are characterized by clusters of events. This could have significant ramifications for engineering models of the SEP environment, and it is recommended that current statistical engineering models of the SEP environment should be modified to incorporate long-term event dependency and short-term system memory.

  3. Classification and Space-Time Analysis of Precipitation Events in Manizales, Caldas, Colombia.

    NASA Astrophysics Data System (ADS)

    Suarez Hincapie, J. N.; Vélez, J.; Romo Melo, L.; Chang, P.

    2015-12-01

    Manizales is a mid-mountain Andean city located near the Nevado del Ruiz volcano in west-central Colombia; this location exposes it to earthquakes, floods, landslides and volcanic eruptions. It is located in the intertropical convergence zone (ITCZ) and presents a climate with a bimodal rainfall regime (Cortés, 2010). Its mean annual rainfall is 2000 mm, and precipitation may be observed on 70% of the days over a year. This rain, which favors the formation of large masses of clouds, together with macroclimatic phenomena such as the "El Niño Southern Oscillation", has historically caused great impacts in the region (Vélez et al, 2012). For example, the geographical location coupled with rain events results in a high risk of landslides in the city. Manizales has a hydrometeorological network of 40 stations that measure and transmit data of up to eight climate variables. Some of these stations keep 10 years of historical data. However, until now this information has not been used for space-time classification of precipitation events, nor have the meteorological variables that influence them been thoroughly researched. The purpose of this study was to classify historical events of rain in an urban area of Manizales and investigate patterns of atmospheric behavior that influence or trigger such events. Classification of events was performed by calculating the "n" index of heavy rainfall, which describes the behavior of precipitation as a function of time throughout the event (Monjo, 2009). The analysis of meteorological variables was performed using statistical quantification over variable time periods before each event. The proposed classification allowed for an analysis of the evolution of rainfall events. Specifically, it helped to identify the influence of different meteorological variables that trigger rainfall events in hazardous areas such as the city of Manizales.

  4. The role of musical training in emergent and event-based timing

    PubMed Central

    Baer, L. H.; Thibodeau, J. L. N.; Gralnick, T. M.; Li, K. Z. H.; Penhune, V. B.

    2013-01-01

    Introduction: Musical performance is thought to rely predominantly on event-based timing involving a clock-like neural process and an explicit internal representation of the time interval. Some aspects of musical performance may rely on emergent timing, which is established through the optimization of movement kinematics, and can be maintained without reference to any explicit representation of the time interval. We predicted that musical training would have its largest effect on event-based timing, supporting the dissociability of these timing processes and the dominance of event-based timing in musical performance. Materials and Methods: We compared 22 musicians and 17 non-musicians on the prototypical event-based timing task of finger tapping and on the typically emergently timed task of circle drawing. For each task, participants first responded in synchrony with a metronome (Paced) and then responded at the same rate without the metronome (Unpaced). Results: Analyses of the Unpaced phase revealed that non-musicians were more variable in their inter-response intervals for finger tapping compared to circle drawing. Musicians did not differ between the two tasks. Between groups, non-musicians were more variable than musicians for tapping but not for drawing. We were able to show that the differences were due to less timer variability in musicians on the tapping task. Correlational analyses of movement jerk and inter-response interval variability revealed a negative association for tapping and a positive association for drawing in non-musicians only. Discussion: These results suggest that musical training affects temporal variability in tapping but not drawing. Additionally, musicians and non-musicians may be employing different movement strategies to maintain accurate timing in the two tasks. These findings add to our understanding of how musical training affects timing and support the dissociability of event-based and emergent timing modes. PMID:23717275

  5. Gunbarrel mafic magmatic event: A key 780 Ma time marker for Rodinia plate reconstructions

    USGS Publications Warehouse

    Harlan, S.S.; Heaman, L.; LeCheminant, A.N.; Premo, W.R.

    2003-01-01

    Precise U-Pb baddeleyite dating of mafic igneous rocks provides evidence for a widespread and synchronous magmatic event that extended for >2400 km along the western margin of the Neoproterozoic Laurentian craton. U-Pb baddeleyite analyses for eight intrusions from seven localities ranging from the northern Canadian Shield to northwestern Wyoming-southwestern Montana are statistically indistinguishable and yield a composite U-Pb concordia age for this event of 780.3 ± 1.4 Ma (95% confidence level). This 780 Ma event is herein termed the Gunbarrel magmatic event. The mafic magmatism of the Gunbarrel event represents the largest mafic dike swarm yet identified along the Neoproterozoic margin of Laurentia. The origin of the mafic magmatism is not clear, but may be related to mantle-plume activity or upwelling asthenosphere leading to crustal extension accompanying initial breakup of the supercontinent Rodinia and development of the proto-Pacific Ocean. The mafic magmatism of the Gunbarrel magmatic event at 780 Ma predates the voluminous magmatism of the 723 Ma Franklin igneous event of the northwestern Canadian Shield by ~60 m.y. The precise dating of the extensive Neoproterozoic Gunbarrel and Franklin magmatic events provides unique time markers that can ultimately be used for robust testing of Neoproterozoic continental reconstructions.

  6. Timing of the most recent surface rupture event on the Ohariu Fault near Paraparaumu, New Zealand

    USGS Publications Warehouse

    Litchfield, N.; Van Dissen, R.; Langridge, Rob; Heron, D.; Prentice, C.

    2004-01-01

    Thirteen radiocarbon ages from three trenches across the Ohariu Fault tightly constrain the timing of the most recent surface rupture event at Muaupoko Stream valley, c. 2 km east of Paraparaumu, to between 930 and 1050 cal. yr BP. This age overlaps with previously published ages of the most recent event on the Ohariu Fault and together they further constrain the event to 1000-1050 cal. yr BP. Two trenches provide loose constraints on the maximum recurrence interval at 3-7000 yr. Tephra, most probably the Kawakawa Tephra, was found within alluvial fan deposits in two of the trenches. © The Royal Society of New Zealand 2004.

  7. Energy dependence of the characteristic decay time of proton fluxes in solar cosmic ray events

    NASA Astrophysics Data System (ADS)

    Daibog, E. I.; Logachev, Yu. I.; Kecskemety, K.

    2008-02-01

    Energetic solar proton events within the energy interval 1-48 MeV at the stage of their decay are considered over the period of 1974-2001. The dependence of the characteristic decay time on the proton energy in the assumed power-law representation τ(E) ∝ E^(-n) is analyzed for the events with an exponential decay form. The dependence of n on the heliolongitude of the flare (the particle source on the Sun) is studied.

  8. Temporal evolution of nightglow emission responses to SSW events observed by TIMED/SABER

    NASA Astrophysics Data System (ADS)

    Gao, Hong; Xu, Jiyao; Ward, William; Smith, Anne K.

    2011-10-01

    Using the SSW (Stratospheric Sudden Warming) event in 2009 as a representative case, the temporal evolution of the responses of OH and O2 infrared atmospheric (0-0) nightglow emissions to SSW events is analyzed using the TIMED (Thermosphere Ionosphere Mesosphere Energetics and Dynamics)/SABER (Sounding of the Atmosphere using Broadband Emission Radiometry) data. The results show that during the mesospheric cooling that occurs during the stratospheric warming stage of SSW events, the brightness of OH and O2 nightglow emissions and the thicknesses of OH and O2 emission layers decrease noticeably and the peak heights of the emissions ascend. During the recovery stage in the mesosphere, the brightness of both nightglow emissions and the thicknesses of the emission layers increase dramatically and the peak heights of the emissions descend. These emission variations are mainly caused by perturbations in temperature and the transport of O in the MLT (Mesosphere Lower Thermosphere) region. For the SSW event that started in January 2009, the onset times of the cooling stage and recovery stage in the mesosphere are ˜2 days ahead of the onset times of the warming stage and recovery stage of the SSW event, respectively. For this event, the influence of the SSW on the OH and O2 nightglow emissions increases with latitude between 50°N and 80°N.

  9. AXS and SOM: A new statistical approach for treating within-subject, time-varying, multivariate data collected using the AXS Test Battery

    NASA Astrophysics Data System (ADS)

    Lauter, Judith L.; Ninness, Chris

    2003-10-01

    The Auditory Cross-Section (AXS) Test Battery [J. L. Lauter, Behav. Res. Methods Instrum. Comput. 32, 180-190 (2000)], described in presentations to ASA in 2002 and 2003, is designed to document dynamic relations linking the cortex, brainstem, and body periphery (whether physics, physiology, or behavior) on an individually-specific basis. Data collections using the battery typically employ a within-subject, time-varying, multivariate design, yet conventional group statistics do not provide satisfactory means of treating such data. We have recently developed an approach based on Kohonen's (2001) Self-Organizing Maps (SOM) algorithm, which categorizes time-varying profiles across variables, either within- or between-subjects. The treatment entails three steps: (1) z-score transformation of all raw data; (2) employing the SOM to sort the time-varying profiles into groups; and (3) deriving an estimate of the bounds for the Bayes error rate. Our three-step procedure will be briefly described and illustrated with data from a recent study combining otoacoustic emissions, auditory brainstem responses, and cortical qEEG.
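
    Steps (1) and (2) of such a treatment can be sketched as follows, using the third-party MiniSom package as one possible SOM implementation; the input file, array layout, and 3x3 map size are placeholder assumptions, and the Bayes-error-bound step (3) is omitted.

        # Sketch: z-score the raw profiles, then sort them with a self-organizing map.
        import numpy as np
        from scipy.stats import zscore
        from minisom import MiniSom   # third-party package implementing Kohonen's SOM

        profiles = np.loadtxt("axs_profiles.csv", delimiter=",")  # hypothetical: rows = observations, cols = variables
        profiles_z = zscore(profiles, axis=0)                     # step 1: z-transform each variable

        som = MiniSom(3, 3, profiles_z.shape[1], sigma=1.0, learning_rate=0.5, random_seed=0)
        som.train_random(profiles_z, 1000)                        # step 2: organize profiles on a 3x3 map

        # Each profile is assigned to its best-matching map unit, which acts as a cluster label.
        labels = [som.winner(p) for p in profiles_z]
        print(labels[:10])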

  10. Multi-variable X-band radar observation and tracking of ash plume from Mt. Etna volcano on November 23, 2013 event

    NASA Astrophysics Data System (ADS)

    Montopoli, Mario; Vulpiani, Gianfranco; Ricci, Matteo; Corradini, Stefano; Merucci, Luca; Marzano, Frank S.

    2015-04-01

    Ground-based weather radar observations of volcanic ash clouds are gaining momentum after recent works which demonstrated their potential use either as a stand-alone tool or in combination with satellite retrievals. From an operational standpoint, radar data have been mainly exploited to derive the height of the ash plume and its temporal-spatial development, taking into account the radar limitation of detecting only coarse ash particles (from approximately 20 microns to 10 millimeters and above in terms of particle radius). More sophisticated radar retrievals can include airborne ash concentration, ash fall rate and out-flux rate. Marzano et al. developed several volcanic ash radar retrieval (VARR) schemes, even though their practical use is still subject to a robust validation activity. The latter is made particularly difficult by the lack of field campaigns with multiple observations and the scarce repetition of volcanic events. The radar variable most often used to infer the physical features of actual ash clouds is the radar reflectivity, ZHH. It is related to the ash particle size distribution and shows a convenient power-law relationship with ash concentration, which makes ZHH widely used in radar-volcanology studies. However, weather radars are often able to detect Doppler frequency shifts and, increasingly, they have a polarization-diversity capability. The former means that the wind speed spectrum of the ash cloud is potentially inferable, whereas the latter implies that variables other than ZHH are available. Theoretically, these additional radar variables are linked to the degree of eccentricity of ash particles, their orientation and density, as well as the presence of strong turbulence effects. Thus, the ash radar estimates developed so far can be refined through a thorough analysis of radar Doppler and polarization diversity. In this work we show a detailed analysis of Doppler shifts and polarization variables measured by the X band radar

  11. The timing of life-history events in a changing climate.

    PubMed Central

    Post, E; Forchhammer, M C; Stenseth, N C; Callaghan, T V

    2001-01-01

    Although empirical and theoretical studies suggest that climate influences the timing of life-history events in animals and plants, correlations between climate and the timing of events such as egg-laying, migration or flowering do not reveal the mechanisms by which natural selection operates on life-history events. We present a general autoregressive model of the timing of life-history events in relation to variation in global climate that, like autoregressive models of population dynamics, allows for a more mechanistic understanding of the roles of climate, resources and competition. We applied the model to data on 50 years of annual dates of first flowering by three species of plants in 26 populations covering 4 degrees of latitude in Norway. In agreement with earlier studies, plants in most populations and all three species bloomed earlier following warmer winters. Moreover, our model revealed that earlier blooming reflected increasing influences of resources and density-dependent population limitation under climatic warming. The insights available from the application of this model to phenological data in other taxa will contribute to our understanding of the roles of endogenous versus exogenous processes in the evolution of the timing of life-history events in a changing climate. PMID:12123293

  12. The timing of life-history events in a changing climate.

    PubMed

    Post, E; Forchhammer, M C; Stenseth, N C; Callaghan, T V

    2001-01-01

    Although empirical and theoretical studies suggest that climate influences the timing of life-history events in animals and plants, correlations between climate and the timing of events such as egg-laying, migration or flowering do not reveal the mechanisms by which natural selection operates on life-history events. We present a general autoregressive model of the timing of life-history events in relation to variation in global climate that, like autoregressive models of population dynamics, allows for a more mechanistic understanding of the roles of climate, resources and competition. We applied the model to data on 50 years of annual dates of first flowering by three species of plants in 26 populations covering 4 degrees of latitude in Norway. In agreement with earlier studies, plants in most populations and all three species bloomed earlier following warmer winters. Moreover, our model revealed that earlier blooming reflected increasing influences of resources and density-dependent population limitation under climatic warming. The insights available from the application of this model to phenological data in other taxa will contribute to our understanding of the roles of endogenous versus exogenous processes in the evolution of the timing of life-history events in a changing climate.

  13. Multivariate respiratory motion prediction

    NASA Astrophysics Data System (ADS)

    Dürichen, R.; Wissel, T.; Ernst, F.; Schlaefer, A.; Schweikard, A.

    2014-10-01

    In extracranial robotic radiotherapy, tumour motion is compensated by tracking external and internal surrogates. To compensate for system-specific time delays, time series prediction of the external optical surrogates is used. We investigate whether the prediction accuracy can be increased by expanding the current clinical setup by an accelerometer, a strain belt and a flow sensor. Four previously published prediction algorithms are adapted to multivariate inputs, namely normalized least mean squares (nLMS), wavelet-based least mean squares (wLMS), support vector regression (SVR) and relevance vector machines (RVM), and evaluated for three different prediction horizons. The measurement involves 18 subjects and consists of two phases, focusing on long-term trends (M1) and breathing artefacts (M2). To select the most relevant and least redundant sensors, a sequential forward selection (SFS) method is proposed. Using a multivariate setting, the results show that the clinically used nLMS algorithm is susceptible to large outliers. In the case of irregular breathing (M2), the mean root mean square error (RMSE) of a univariate nLMS algorithm is 0.66 mm and can be decreased to 0.46 mm by a multivariate RVM model (the best algorithm on average). To investigate the full potential of this approach, the optimal sensor combination was also estimated on the complete test set. The results indicate that a further decrease in RMSE is possible for RVM (to 0.42 mm). This motivates further research on sensor selection methods. Besides the optical surrogates, the sensors most frequently selected by the algorithms are the accelerometer and the strain belt. These sensors could be easily integrated into the current clinical setup and would allow a more precise motion compensation.
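
    For orientation, a bare-bones normalized LMS predictor of the kind used as the clinical baseline might look like the sketch below; it assumes a single univariate breathing trace, a fixed filter order and step size, and a delayed weight update, and is not the evaluated multivariate RVM model.

        # Minimal normalized LMS (nLMS) predictor for a 1-D breathing signal, horizon in samples.
        import numpy as np

        def nlms_predict(sig, horizon=5, order=10, mu=0.5, eps=1e-6):
            """Predict sig[n + horizon] from the last `order` samples available at time n."""
            w = np.zeros(order)
            preds = np.full(len(sig), np.nan)
            for n in range(order, len(sig) - horizon):
                x = sig[n - order:n][::-1]          # most recent samples first
                preds[n + horizon] = w @ x          # prediction for the delayed sample
                e = sig[n + horizon] - w @ x        # error (available only with delay in practice)
                w += mu * e * x / (eps + x @ x)     # normalized LMS weight update
            return preds

        t = np.arange(0, 60, 0.04)                  # hypothetical 25 Hz breathing trace
        breathing = np.sin(2 * np.pi * 0.25 * t) + 0.05 * np.random.default_rng(1).normal(size=t.size)
        pred = nlms_predict(breathing, horizon=5)
        valid = ~np.isnan(pred)
        print("RMSE:", np.sqrt(np.mean((pred[valid] - breathing[valid]) ** 2)))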

  14. Neural Correlates of the Time Marker for the Perception of Event Timing

    PubMed Central

    Qi, Liang; Terada, Yoshikazu; Nishida, Shin’ya

    2016-01-01

    While sensory processing latency, inferred from the manual reaction time (RT), is substantially affected by diverse stimulus parameters, subjective temporal judgments are relatively accurate. The neural mechanisms underlying this timing perception remain obscure. Here, we measured human neural activity by magnetoencephalography while participants performed a simultaneity judgment task between the onset of random-dot coherent motion and a beep. In a separate session, participants performed an RT task for the same stimuli. We analyzed the relationship between neural activity evoked by motion onset and point of subjective simultaneity (PSS) or RT. The effect of motion coherence was smaller for PSS than RT, but changes in RT and PSS could both be predicted by the time at which an integrated sensory response crossed a threshold. The task differences could be ascribed to the lower threshold for PSS than for RT. In agreement with the psychophysical threshold difference, the participants reported longer delays in their motor response from the subjective motion onset for weaker stimuli. However, they could not judge the timing of stimuli weaker than the detection threshold. A possible interpretation of the present findings is that the brain assigns the time marker for timing perception prior to stimulus detection, but the time marker is available only after stimulus detection. PMID:27679810

  15. Neural Correlates of the Time Marker for the Perception of Event Timing

    PubMed Central

    Qi, Liang; Terada, Yoshikazu; Nishida, Shin’ya

    2016-01-01

    While sensory processing latency, inferred from the manual reaction time (RT), is substantially affected by diverse stimulus parameters, subjective temporal judgments are relatively accurate. The neural mechanisms underlying this timing perception remain obscure. Here, we measured human neural activity by magnetoencephalography while participants performed a simultaneity judgment task between the onset of random-dot coherent motion and a beep. In a separate session, participants performed an RT task for the same stimuli. We analyzed the relationship between neural activity evoked by motion onset and point of subjective simultaneity (PSS) or RT. The effect of motion coherence was smaller for PSS than RT, but changes in RT and PSS could both be predicted by the time at which an integrated sensory response crossed a threshold. The task differences could be ascribed to the lower threshold for PSS than for RT. In agreement with the psychophysical threshold difference, the participants reported longer delays in their motor response from the subjective motion onset for weaker stimuli. However, they could not judge the timing of stimuli weaker than the detection threshold. A possible interpretation of the present findings is that the brain assigns the time marker for timing perception prior to stimulus detection, but the time marker is available only after stimulus detection.

  16. Non-linear time series analysis of precipitation events using regional climate networks for Germany

    NASA Astrophysics Data System (ADS)

    Rheinwalt, Aljoscha; Boers, Niklas; Marwan, Norbert; Kurths, Jürgen; Hoffmann, Peter; Gerstengarbe, Friedrich-Wilhelm; Werner, Peter

    2016-02-01

    Synchronous occurrences of heavy rainfall events and the study of their relation in time and space are of great socio-economic relevance, for instance for the agricultural and insurance sectors, but also for the general well-being of the population. In this study, the spatial synchronization structure is analyzed as a regional climate network constructed from precipitation event series. The similarity between event series is determined by the number of synchronous occurrences. We propose a novel standardization of this number that results in synchronization scores which are not biased by the number of events in the respective time series. Additionally, we introduce a new version of the network measure directionality that measures the spatial directionality of weighted links by also taking into account the effects of the spatial embedding of the network. This measure provides an estimate of heavy precipitation isochrones by pointing out directions along which rainfall events synchronize. We propose a climatological interpretation of this measure in terms of propagating fronts or event traces and confirm it for Germany by comparing our results to known atmospheric circulation patterns.
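
    The basic ingredient, counting near-synchronous event occurrences between two stations and standardizing the count, can be sketched as below; the tolerance, surrogate scheme, and example event days are placeholder choices, not the authors' exact standardization.

        # Sketch: count near-synchronous heavy-rain events between two stations and
        # standardize the count against randomly shuffled surrogate event series.
        import numpy as np

        def sync_count(days_a, days_b, tol=2):
            """Number of events in series A with at least one event in B within +/- tol days."""
            return sum(np.any(np.abs(days_b - d) <= tol) for d in days_a)

        def sync_score(days_a, days_b, n_days, tol=2, n_surrogates=200, seed=0):
            rng = np.random.default_rng(seed)
            observed = sync_count(days_a, days_b, tol)
            surrogate = [
                sync_count(rng.choice(n_days, size=len(days_a), replace=False),
                           rng.choice(n_days, size=len(days_b), replace=False), tol)
                for _ in range(n_surrogates)
            ]
            return (observed - np.mean(surrogate)) / (np.std(surrogate) + 1e-12)  # z-like score

        # Hypothetical event days (indices within a common 10-year observation period).
        a = np.array([10, 55, 200, 1500, 2300])
        b = np.array([12, 57, 900, 1502, 2290])
        print(sync_score(a, b, n_days=3650))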

  17. Model for the evolution of the time profile in optimistic parallel discrete event simulations

    NASA Astrophysics Data System (ADS)

    Ziganurova, L.; Novotny, M. A.; Shchur, L. N.

    2016-02-01

    We investigate synchronisation aspects of an optimistic algorithm for parallel discrete event simulations (PDES). We present a model for the time evolution in optimistic PDES. This model evaluates the local virtual time profile of the processing elements. We argue that the evolution of the time profile is reminiscent of the surface profile in the directed percolation problem and in unrestricted surface growth. We present results of the simulation of the model and emphasise predictive features of our approach.

  18. Effective target binarization method for linear timed address-event vision system

    NASA Astrophysics Data System (ADS)

    Xu, Jiangtao; Zou, Jiawei; Yan, Shi; Gao, Zhiyuan

    2016-06-01

    This paper presents an effective target binarization method for a linear timed address-event (TAE) vision system. In the preprocessing phase, TAE data are processed by denoising, thinning, and edge connection methods sequentially to obtain denoised and clear event contours. Then, the object region is confirmed by an event-pair matching method. Finally, the image open and close operations of morphology methods are introduced to remove the artifacts generated by event-pair mismatching. Several degraded images were processed by our method and some traditional binarization methods, and the experimental results are provided. Compared with other methods, the proposed method efficiently extracts the target region and achieves satisfactory binarization results from object images with low contrast and nonuniform illumination.

  19. Low time resolution analysis of polar ice cores cannot detect impulsive nitrate events

    NASA Astrophysics Data System (ADS)

    Smart, D. F.; Shea, M. A.; Melott, A. L.; Laird, C. M.

    2014-12-01

    Ice cores are archives of climate change and possibly large solar proton events (SPEs). Wolff et al. (2012) used a single event, a nitrate peak in the GISP2-H core, which McCracken et al. (2001a) associated in time with the poorly quantified 1859 Carrington event, to discredit SPE-produced, impulsive nitrate deposition in polar ice. This is not the ideal test case. We critique the Wolff et al. analysis and demonstrate that the data they used cannot detect impulsive nitrate events because of resolution limitations. We suggest reexamination of the top of the Greenland ice sheet at key intervals over the last two millennia with attention to fine resolution and replicate sampling of multiple species. This will allow further insight into polar depositional processes on a subseasonal scale, including atmospheric sources, transport mechanisms to the ice sheet, postdepositional interactions, and a potential SPE association.

  20. A novel multivariate approach using science-based calibration for direct coating thickness determination in real-time NIR process monitoring.

    PubMed

    Möltgen, C-V; Herdling, T; Reich, G

    2013-11-01

    This study demonstrates an approach, using science-based calibration (SBC), for direct coating thickness determination on heart-shaped tablets in real-time. Near-Infrared (NIR) spectra were collected during four full industrial pan coating operations. The tablets were coated with a thin hydroxypropyl methylcellulose (HPMC) film up to a film thickness of 28 μm. The application of SBC permits the calibration of the NIR spectral data without using costly determined reference values, because SBC combines classical methods to estimate the coating signal with statistical methods for the noise estimation. The approach enabled the use of NIR for the measurement of the film thickness increase from around 8 to 28 μm of four independent batches in real-time. The developed model provided a spectroscopic limit of detection for the coating thickness of 0.64 ± 0.03 μm root-mean square (RMS). Commonly used statistical calibration methods, such as Partial Least Squares (PLS), require sufficiently varying reference values. For thin non-functional coatings this is a challenge because the quality of the model depends on the accuracy of the selected calibration standards. The straightforward SBC approach eliminates many of the problems associated with the conventional statistical methods and offers an alternative for multivariate calibration.

  1. Disentangling the effect of event-based cues on children's time-based prospective memory performance.

    PubMed

    Redshaw, Jonathan; Henry, Julie D; Suddendorf, Thomas

    2016-10-01

    Previous time-based prospective memory research, both with children and with other groups, has measured the ability to perform an action with the arrival of a time-dependent yet still event-based cue (e.g., the occurrence of a specific clock pattern) while also engaged in an ongoing activity. Here we introduce a novel means of operationalizing time-based prospective memory and assess children's growing capacities when the availability of an event-based cue is varied. Preschoolers aged 3, 4, and 5 years (N=72) were required to ring a bell when a familiar 1-min sand timer had completed a cycle under four conditions. In a 2×2 within-participants design, the timer was either visible or hidden and was either presented in the context of a single task or embedded within a dual picture-naming task. Children were more likely to ring the bell before 2 min had elapsed in the visible-timer and single-task conditions, with performance improving with age across all conditions. These results suggest a divergence in the development of time-based prospective memory in the presence versus absence of event-based cues, and they also suggest that performance on typical time-based tasks may be partly driven by event-based prospective memory. PMID:27295204

  2. Event- and time-dependent decline of outcome information in the primate prefrontal cortex.

    PubMed

    Marcos, Encarni; Tsujimoto, Satoshi; Genovesio, Aldo

    2016-05-10

    The prefrontal cortex (PF) is involved in outcome-based flexible adaptation in a dynamically changing environment. The outcome signal dissipates gradually over time, but the temporal dynamics of this dissipation remains unknown. To examine this issue, we analyzed the outcome-related activity of PF neurons in 2 monkeys in a distance discrimination task. The initial prestimulus period of this task varied in duration, allowing us to dissociate the effects of time and event on the decline in previous outcome-related activity (previous correct versus previous error). We observed 2 types of decline in previous outcome representation: PF neurons that ceased to encode the previous outcome as time passed (time-dependent) and neurons that maintained their signal but it decreased rapidly after the occurrence of a new external event (event-dependent). Although the time-dependent dynamics explained the decline in a greater proportion of neurons, the event-dependent decline was also observed in a significant population of neurons.

  3. Disentangling the effect of event-based cues on children's time-based prospective memory performance.

    PubMed

    Redshaw, Jonathan; Henry, Julie D; Suddendorf, Thomas

    2016-10-01

    Previous time-based prospective memory research, both with children and with other groups, has measured the ability to perform an action with the arrival of a time-dependent yet still event-based cue (e.g., the occurrence of a specific clock pattern) while also engaged in an ongoing activity. Here we introduce a novel means of operationalizing time-based prospective memory and assess children's growing capacities when the availability of an event-based cue is varied. Preschoolers aged 3, 4, and 5 years (N=72) were required to ring a bell when a familiar 1-min sand timer had completed a cycle under four conditions. In a 2×2 within-participants design, the timer was either visible or hidden and was either presented in the context of a single task or embedded within a dual picture-naming task. Children were more likely to ring the bell before 2 min had elapsed in the visible-timer and single-task conditions, with performance improving with age across all conditions. These results suggest a divergence in the development of time-based prospective memory in the presence versus absence of event-based cues, and they also suggest that performance on typical time-based tasks may be partly driven by event-based prospective memory.

  4. Event- and time-dependent decline of outcome information in the primate prefrontal cortex

    PubMed Central

    Marcos, Encarni; Tsujimoto, Satoshi; Genovesio, Aldo

    2016-01-01

    The prefrontal cortex (PF) is involved in outcome-based flexible adaptation in a dynamically changing environment. The outcome signal dissipates gradually over time, but the temporal dynamics of this dissipation remains unknown. To examine this issue, we analyzed the outcome-related activity of PF neurons in 2 monkeys in a distance discrimination task. The initial prestimulus period of this task varied in duration, allowing us to dissociate the effects of time and event on the decline in previous outcome-related activity (previous correct versus previous error). We observed 2 types of decline in previous outcome representation: PF neurons that ceased to encode the previous outcome as time passed (time-dependent) and neurons that maintained their signal but it decreased rapidly after the occurrence of a new external event (event-dependent). Although the time-dependent dynamics explained the decline in a greater proportion of neurons, the event-dependent decline was also observed in a significant population of neurons. PMID:27162060

  5. Time-Critical Studies: Rapid response to Transient Dynamic Mid-Ocean Ridge Events

    NASA Astrophysics Data System (ADS)

    Cowen, J. P.; Baker, E. T.; Dziak, R. P.; Lilley, M. M.

    2003-12-01

    The Time-Critical Studies (TCS) Theme of Ridge 2000 focuses on observations of the immediate geochemical and geobiological consequences of magmatic and tectonic events along the global mid-ocean ridge system. To date, funding has centered on the Juan de Fuca and Gorda Ridges, which are within the range of the U.S. Navy's Northeast Pacific Sound Surveillance System (SOSUS). NOAA's T-Phase Monitoring Program has accessed SOSUS in real-time since 1993, providing the TCS community with detection of seismicity associated with eruptive or tectonic activity along these two ridges. This remote detection of earthquake swarms along the N.E. Pacific mid-ocean ridge coupled to NSF funding for pre-event staging equipment and supplies has allowed directed and increasingly well-organized field responses to the event site. Major rapid and follow-up response cruises have been successfully mounted to the 1993 CoAxial, the 1996 and 2001 Gorda Ridge, the 1998 Axial Volcano, and the 2001 Middle Valley magmatic episodes. The logistical approach required to study these events has been greatly facilitated by the RIDGE/Ridge 2000 programs and collaboration between university, NOAA and Canadian investigators. Not only have our studies of these events significantly impacted our ideas on the nature of crustal accretion, but they also have led to the discovery and preliminary documentation of a previously unrecognized biomass reservoir that lives below the seafloor and is swept out during these cataclysmic events, and to increased appreciation of the formation and thermal, chemical and biogeochemical implications of the 'Event Plumes' commonly associated with sea floor magmatic events. Rapid shore-to-event site response is an important aspect of TCS. Proposals to enhance the event detection and response effort are welcome at any Ridge 2000 target date. The Ridge 2000 program recognizes that even the most rapid ship response will miss the earliest subsurface and water column expressions of magmatic events

  6. Ants Can Expect the Time of an Event on Basis of Previous Experiences.

    PubMed

    Cammaerts, Marie-Claire; Cammaerts, Roger

    2016-01-01

    Working on three ant species of the genus Myrmica, M. ruginodis, M. rubra, and M. sabuleti, we showed that foragers can expect the subsequent time at which food will be available on the basis of the previous times at which food was present. The ants acquired this expectative ability right after having experienced two time shifts of food delivery. Moreover, the ants' learning score appeared to be a logarithmic function of time (i.e., of the number of training days). This ability to expect subsequent times at which an event will occur may be an advantageous ethological trait. PMID:27403457

  7. Ants Can Expect the Time of an Event on Basis of Previous Experiences

    PubMed Central

    Cammaerts, Roger

    2016-01-01

    Working on three ant species of the genus Myrmica, M. ruginodis, M. rubra, and M. sabuleti, we showed that foragers can expect the subsequent time at which food will be available on the basis of the previous times at which food was present. The ants acquired this expectative ability right after having experienced two time shifts of food delivery. Moreover, the ants' learning score appeared to be a logarithmic function of time (i.e., of the number of training days). This ability to expect subsequent times at which an event will occur may be an advantageous ethological trait. PMID:27403457

  8. Real-time extreme weather event attribution with forecast seasonal SSTs

    NASA Astrophysics Data System (ADS)

    Haustein, K.; Otto, F. E. L.; Uhe, P.; Schaller, N.; Allen, M. R.; Hermanson, L.; Christidis, N.; McLean, P.; Cullen, H.

    2016-06-01

    Within the last decade, extreme weather event attribution has emerged as a new field of science and garnered increasing attention from the wider scientific community and the public. Numerous methods have been put forward to determine the contribution of anthropogenic climate change to individual extreme weather events. So far, nearly all such analyses have been carried out months after an event happened. Here we present a new method which can assess the fraction of attributable risk of a severe weather event due to an external driver in real-time. The method builds on a large ensemble of atmosphere-only general circulation model simulations forced by seasonal forecast sea surface temperatures (SSTs). Taking the England 2013/14 winter floods as an example, we demonstrate that the change in risk for heavy rainfall during the England floods due to anthropogenic climate change is of similar magnitude using either observed or seasonal forecast SSTs. Testing the dynamic response of the model to the anomalous ocean state for January 2014, we find that observed SSTs are required to establish a discernible link between a particular SST pattern and an atmospheric response such as a shift in the jetstream in the model. For extreme events occurring under strongly anomalous SST patterns associated with known low-frequency climate modes, however, forecast SSTs can provide sufficient guidance to determine the dynamic contribution to the event.

  9. The effect of time constraints and running phases on combined event pistol shooting performance.

    PubMed

    Dadswell, Clare; Payton, Carl; Holmes, Paul; Burden, Adrian

    2016-01-01

    The combined event is a crucial aspect of the modern pentathlon competition, but little is known about how shooting performance changes through the event. This study aimed to identify (i) how performance-related variables changed within each shooting series and (ii) how performance-related variables changed between each shooting series. Seventeen modern pentathletes completed combined event trials. An optoelectronic shooting system recorded score and pistol movement, and force platforms recorded centre of pressure movement 1 s prior to every shot. Heart rate and blood lactate values were recorded throughout the event. Whilst heart rate and blood lactate significantly increased between series (P < 0.05), there were no accompanying changes in the time period that participants spent aiming at the target, shot score, pistol movement or centre of pressure movement (P > 0.05). Thus, combined event shooting performance following each running phase appears similar to shooting performance following only 20 m of running. This finding has potential implications for the way in which modern pentathletes train for combined event shooting, and highlights the need for modern pentathletes to establish new methods with which to enhance shooting accuracy.

  10. Time-Frequency Characteristics of Tsunami Magnetic Signals from Four Pacific Ocean Events

    NASA Astrophysics Data System (ADS)

    Schnepf, N. R.; Manoj, C.; An, C.; Sugioka, H.; Toh, H.

    2016-07-01

    The recent deployment of highly sensitive seafloor magnetometers coinciding with the deep solar minimum has provided excellent opportunities for observing tsunami electromagnetic signals. These fluctuating signals (periods ranging from 10 to 20 min) are generally found to be within approximately ±1 nT and coincide with the arrival of the tsunami waves. Previous studies focused on tsunami electromagnetic characteristics, as well as modeling the signal for individual events. This study instead aims to provide the time-frequency characteristics for a range of tsunami signals and a method to separate the signals from noise using additional data from a remote observatory. We focus on four Pacific Ocean events of varying tsunami signal amplitude: (1) the 2011 Tohoku, Japan event (M9.0), (2) the 2010 Chile event (M8.8), (3) the 2009 Samoa event (M8.0), and (4) the 2007 Kuril Islands event (M8.1). We find possible tsunami signals in high-pass filtered data and successfully isolate the signals from noise using a cross-wavelet analysis. The cross-wavelet analysis reveals that the longer period signals precede the stronger, shorter period signals. Our results are very encouraging for using tsunami magnetic signals in warning systems.
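
    The high-pass filtering step mentioned above could be implemented along the following lines; the sampling rate, cutoff period, and input file are placeholder assumptions, and the cross-wavelet step itself is not shown.

        # Sketch: zero-phase Butterworth high-pass filter to isolate tsunami-band fluctuations.
        import numpy as np
        from scipy import signal

        fs = 1.0 / 60.0                  # hypothetical 1-minute sampling rate (Hz)
        cutoff_period_s = 3 * 3600.0     # remove variations slower than ~3 hours
        b, a = signal.butter(4, (1.0 / cutoff_period_s) / (fs / 2.0), btype="highpass")

        bz = np.loadtxt("seafloor_bz_nT.txt")   # hypothetical vertical magnetic component (nT)
        bz_hp = signal.filtfilt(b, a, bz)       # zero-phase filtering avoids introducing time shifts
        print(bz_hp[:5])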

  11. Jointly modeling time-to-event and longitudinal data: A Bayesian approach.

    PubMed

    Huang, Yangxin; Hu, X Joan; Dagne, Getachew A

    2014-03-01

    This article explores Bayesian joint models of event times and longitudinal measures with an attempt to overcome departures from normality of the longitudinal response, measurement errors, and shortages of confidence in specifying a parametric time-to-event model. We allow the longitudinal response to have a skew distribution in the presence of measurement errors, and assume the time-to-event variable to have a nonparametric prior distribution. Posterior distributions of the parameters are attained simultaneously for inference based on a Bayesian approach. An example from a recent AIDS clinical trial illustrates the methodology by jointly modeling the viral dynamics and the time to decrease in the CD4/CD8 ratio in the presence of CD4 counts with measurement errors, and compares potential models with various scenarios and different distribution specifications. The analysis outcome indicates that the time-varying CD4 covariate is closely related to the first-phase viral decay rate, but the time to CD4/CD8 decrease is not highly associated with either the two viral decay rates or the CD4 changing rate over time. These findings may provide some quantitative guidance to better understand the relationship of the virological and immunological responses to antiretroviral treatments. PMID:24611039
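
    A heavily simplified sketch of the shared-random-effect idea behind such joint models, written with PyMC, is shown below; it ignores censoring, skewness, and measurement error (all of which the paper addresses), and all variable names, priors, and the simulated data are illustrative only.

        # Toy joint model: a longitudinal trend and a log event time share a subject-level effect.
        import numpy as np
        import pymc as pm

        rng = np.random.default_rng(0)
        n_subj, n_obs = 20, 5
        subj = np.repeat(np.arange(n_subj), n_obs)
        t = np.tile(np.linspace(0.0, 1.0, n_obs), n_subj)
        u_true = rng.normal(0.0, 0.5, n_subj)
        y = 2.0 - 1.0 * t + u_true[subj] + rng.normal(0.0, 0.2, subj.size)   # longitudinal response
        log_T = 1.0 + 0.8 * u_true + rng.normal(0.0, 0.1, n_subj)            # observed log event times

        with pm.Model() as joint:
            b0 = pm.Normal("b0", 0.0, 5.0)
            b1 = pm.Normal("b1", 0.0, 5.0)
            u = pm.Normal("u", 0.0, 1.0, shape=n_subj)        # shared subject-level random effect
            sigma_y = pm.HalfNormal("sigma_y", 1.0)
            pm.Normal("y_obs", mu=b0 + b1 * t + u[subj], sigma=sigma_y, observed=y)

            a0 = pm.Normal("a0", 0.0, 5.0)
            a1 = pm.Normal("a1", 0.0, 5.0)                    # links the random effect to event times
            sigma_t = pm.HalfNormal("sigma_t", 1.0)
            pm.Normal("logT_obs", mu=a0 + a1 * u, sigma=sigma_t, observed=log_T)

            idata = pm.sample(1000, tune=1000, chains=2, random_seed=0)

        print(idata.posterior["a1"].mean())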

  12. Intensity/time profiles of solar particle events at one astronomical unit

    NASA Technical Reports Server (NTRS)

    Shea, M. A.

    1988-01-01

    A description of the intensity-time profiles of solar proton events observed at the orbit of the earth is presented. The discussion, which includes descriptive figures, presents a general overview of the subject without the detailed mathematical description of the physical processes which usually accompany most reviews.

  13. "Anniversary Reaction": Important Events and Timing of Death in a Group of Roman Catholic Priests.

    ERIC Educational Resources Information Center

    Walker, Lee; Walker, Lawrence D.

    1990-01-01

    Compared death dates of 1,038 Roman Catholic priests with dates of Christmas, Easter, birthday, and day of ordination. Found no meaningful patterns of death around any anniversary, suggesting either no association between time of death and important anniversaries or that important events may be so extraordinary to each individual that it is not…

  14. The influence of pubertal timing and stressful life events on depression and delinquency among Chinese adolescents.

    PubMed

    Chen, Jie; Yu, Jing; Wu, Yun; Zhang, Jianxin

    2015-06-01

    This study aimed to investigate the influences of pubertal timing and stressful life events on Chinese adolescents' depression and delinquency. Sex differences in these influences were also examined. A large sample with 4,228 participants aged 12-15 years (53% girls) was recruited in Beijing, China. Participants' pubertal development, stressful life events, depressive symptoms, and delinquency were measured using self-reported questionnaires. Both early maturing girls and boys displayed more delinquency than their same-sex on-time and late maturing peers. Early maturing girls displayed more depressive symptoms than on-time and late maturing girls, but boys in the three maturation groups showed similar levels of depressive symptoms. The interactive effects between early pubertal timing and stressful life events were significant in predicting depression and delinquency, particularly for girls. Early pubertal maturation is an important risk factor for Chinese adolescents' depression and delinquency. Stressful life events intensified the detrimental effects of early pubertal maturation on adolescents' depression and delinquency, particularly for girls. PMID:26261908

  15. Modeling Repeatable Events Using Discrete-Time Data: Predicting Marital Dissolution

    ERIC Educational Resources Information Center

    Teachman, Jay

    2011-01-01

    I join two methodologies by illustrating the application of multilevel modeling principles to hazard-rate models with an emphasis on procedures for discrete-time data that contain repeatable events. I demonstrate this application using data taken from the 1995 National Survey of Family Growth (NSFG) to ascertain the relationship between multiple…
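
    For readers unfamiliar with the setup, a discrete-time hazard model for repeatable events reduces to a logistic regression on a person-period file; the minimal sketch below uses made-up variable names rather than the NSFG variables, and a prior-event count stands in for the multilevel structure discussed in the article.

        # Sketch: discrete-time hazard model as logistic regression on person-period data.
        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical person-period file: one row per person per year at risk,
        # 'event' = 1 in a year a dissolution occurs, 0 otherwise.
        pp = pd.DataFrame({
            "person":             [1, 1, 1, 2, 2, 3, 3, 3, 3],
            "year":               [1, 2, 3, 1, 2, 1, 2, 3, 4],
            "prior_dissolutions": [0, 0, 0, 0, 0, 1, 1, 1, 1],
            "event":              [0, 0, 1, 0, 1, 0, 0, 0, 1],
        })

        # Duration enters as a covariate; repeatable events are handled by keeping people
        # at risk after an event and tracking the number of prior events.
        fit = smf.logit("event ~ year + prior_dissolutions", data=pp).fit(disp=0)
        print(fit.params)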

  16. Individual Change and the Timing and Onset of Important Life Events: Methods, Models, and Assumptions

    ERIC Educational Resources Information Center

    Grimm, Kevin; Marcoulides, Katerina

    2016-01-01

    Researchers are often interested in studying how the timing of a specific event affects concurrent and future development. When faced with such research questions there are multiple statistical models to consider and those models are the focus of this paper as well as their theoretical underpinnings and assumptions regarding the nature of the…

  17. Combined Use of Absolute and Differential Seismic Arrival Time Data to Improve Absolute Event Location

    NASA Astrophysics Data System (ADS)

    Myers, S.; Johannesson, G.

    2012-12-01

    Arrival time measurements based on waveform cross correlation are becoming more common as advanced signal processing methods are applied to seismic data archives and real-time data streams. Waveform correlation can precisely measure the time difference between the arrival of two phases, and differential time data can be used to constrain the relative location of events. Absolute locations are needed for many applications, which generally requires the use of absolute time data. Current methods for measuring absolute time data are approximately two orders of magnitude less precise than differential time measurements. To exploit the strengths of both absolute and differential time data, we extend our multiple-event location method Bayesloc, which previously used absolute time data only, to include the use of differential time measurements that are based on waveform cross correlation. Fundamentally, Bayesloc is a formulation of the joint probability over all parameters comprising the multiple-event location system. The Markov-Chain Monte Carlo method is used to sample from the joint probability distribution given arrival data sets. The differential time component of Bayesloc includes scaling a stochastic estimate of differential time measurement precision based on the waveform correlation coefficient for each datum. For a regional-distance synthetic data set with absolute and differential time measurement errors of 0.25 seconds and 0.01 second, respectively, epicenter location accuracy is improved from an average of 1.05 km when solely absolute time data are used to 0.28 km when absolute and differential time data are used jointly (73% improvement). The improvement in absolute location accuracy is the result of conditionally limiting absolute location probability regions based on the precise relative position with respect to neighboring events. Bayesloc estimates of data precision are found to be accurate for the synthetic test, with absolute and differential time measurement
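
    The benefit of mixing the two data types can be illustrated with a toy weighted least-squares problem for two unknown origin times, combining one precise differential time (sigma = 0.01 s) with noisier absolute times (sigma = 0.25 s); this is only a conceptual sketch, not the Bayesloc MCMC formulation.

        # Toy illustration: precise differential times tighten the relative offset between
        # two events, while noisier absolute times anchor them in absolute time.
        import numpy as np

        rng = np.random.default_rng(2)
        t_true = np.array([10.00, 12.30])               # true origin times (s), travel times removed

        sig_abs, sig_dif = 0.25, 0.01
        abs_obs = t_true + rng.normal(0.0, sig_abs, 2)                 # one absolute pick per event
        dif_obs = (t_true[0] - t_true[1]) + rng.normal(0.0, sig_dif)   # cross-correlation differential time

        # Rows: two absolute equations (t1, t2) and one differential equation (t1 - t2),
        # each divided by its standard deviation (weighted least squares).
        A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, -1.0]]) / np.array([[sig_abs], [sig_abs], [sig_dif]])
        b = np.array([abs_obs[0], abs_obs[1], dif_obs]) / np.array([sig_abs, sig_abs, sig_dif])

        t_est, *_ = np.linalg.lstsq(A, b, rcond=None)
        print("estimated origin times:", t_est)
        print("relative offset error:", (t_est[0] - t_est[1]) - (t_true[0] - t_true[1]))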

  18. Metabolic profiling of Angelica acutiloba roots utilizing gas chromatography-time-of-flight-mass spectrometry for quality assessment based on cultivation area and cultivar via multivariate pattern recognition.

    PubMed

    Tianniam, Sukanda; Tarachiwin, Lucksanaporn; Bamba, Takeshi; Kobayashi, Akio; Fukusaki, Eiichiro

    2008-06-01

    Gas chromatography time-of-flight mass spectrometry was applied to elucidate the profiling of primary metabolites and to evaluate quality differences in Angelica acutiloba (or Yamato-toki) roots through the utilization of multivariate pattern recognition-principal component analysis (PCA). Twenty-two metabolites consisting of sugars, amino acids and organic acids were identified. PCA successfully discriminated the good, the moderate and the bad quality Yamato-toki roots in accordance with their cultivation areas. The results showed that two reducing sugars, fructose and glucose, were the most accumulated in the bad quality roots, whereas higher quantities of phosphoric acid, proline, malic acid and citric acid were found in the good and the moderate quality toki roots. PCA was also effective in discriminating samples derived from different cultivars. Yamato-toki roots with the moderate quality were compared by means of PCA, and the results illustrated good discrimination, which was influenced most by malic acid. Overall, this study demonstrated that the metabolomics technique is accurate and efficient in determining quality differences in Yamato-toki roots, and has the potential to be a superior and suitable method to assess the quality of this medicinal plant. PMID:18640606
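
    The PCA step used for quality discrimination can be sketched along these lines; the metabolite table, file name, scaling choice, and two-component model are hypothetical placeholders rather than the study's actual data treatment.

        # Sketch: PCA on an autoscaled metabolite table to explore quality groupings.
        import pandas as pd
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA

        table = pd.read_csv("toki_metabolites.csv", index_col=0)   # hypothetical: rows = samples, cols = 22 metabolites
        X = StandardScaler().fit_transform(table.values)           # autoscale each metabolite

        pca = PCA(n_components=2)
        scores = pca.fit_transform(X)                              # sample scores for plotting by group
        loadings = pd.DataFrame(pca.components_.T, index=table.columns, columns=["PC1", "PC2"])

        print("explained variance:", pca.explained_variance_ratio_)
        # Metabolites with the largest absolute PC1 loadings drive the group separation.
        print(loadings.reindex(loadings["PC1"].abs().sort_values(ascending=False).index).head())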

  19. Profiling and multivariate statistical analysis of Panax ginseng based on ultra-high-performance liquid chromatography coupled with quadrupole-time-of-flight mass spectrometry.

    PubMed

    Wu, Wei; Sun, Le; Zhang, Zhe; Guo, Yingying; Liu, Shuying

    2015-03-25

    An ultra-high-performance liquid chromatography coupled with quadrupole-time-of-flight mass spectrometry (UHPLC-Q-TOF-MS) method was developed for the detection and structural analysis of ginsenosides in white ginseng and related processed products (red ginseng). Original neutral, malonyl, and chemically transformed ginsenosides were identified in white and red ginseng samples. The aglycone types of ginsenosides were determined by MS/MS as PPD (m/z 459), PPT (m/z 475), C-24, -25 hydrated-PPD or PPT (m/z 477 or m/z 493), and Δ20(21)- or Δ20(22)-dehydrated-PPD or PPT (m/z 441 or m/z 457). Following the structural determination, the UHPLC-Q-TOF-MS-based chemical profiling coupled with multivariate statistical analysis method was applied for global analysis of white and processed ginseng samples. The chemical markers distinguishing the processed product red ginseng from white ginseng could be assigned. Process-mediated chemical changes were recognized as the hydrolysis of ginsenosides with large molecular weight, chemical transformations of ginsenosides, changes in malonyl-ginsenosides, and generation of 20-(R)-ginsenoside enantiomers. The relative contents of compounds classified as PPD, PPT, malonyl, and transformed ginsenosides were calculated based on peak areas in ginseng before and after processing. This study provides the possibility of monitoring multiple components for the quality control and global evaluation of ginseng products during processing.

  20. Profiling and classification of French propolis by combined multivariate data analysis of planar chromatograms and scanning direct analysis in real time mass spectra.

    PubMed

    Chasset, Thibaut; Häbe, Tim T; Ristivojevic, Petar; Morlock, Gertrud E

    2016-09-23

    Quality control of propolis is challenging, as it is a complex natural mixture of compounds and thus very difficult to analyze and standardize. Using the example of 30 French propolis samples, a strategy for improved quality control was demonstrated in which high-performance thin-layer chromatography (HPTLC) fingerprints were evaluated in combination with selected mass signals obtained by desorption-based scanning mass spectrometry (MS). The French propolis sample extracts were separated by a newly developed reversed phase (RP)-HPTLC method. The fingerprints obtained by two different detection modes, i.e. after (1) derivatization and fluorescence detection (FLD) at UV 366 nm and (2) scanning direct analysis in real time (DART)-MS, were analyzed by multivariate data analysis. Thus, RP-HPTLC-FLD and RP-HPTLC-DART-MS fingerprints were explored and the best classification was obtained using both methods in combination with pattern recognition techniques, such as principal component analysis. All investigated French propolis samples were divided into two types and characteristic patterns were observed. Phenolic compounds such as caffeic acid, p-coumaric acid, chrysin, pinobanksin, pinobanksin-3-acetate, galangin, kaempferol, tectochrysin and pinocembrin were identified as characteristic marker compounds of French propolis samples. This study expanded the research on the European poplar type of propolis and confirmed the presence of two botanically different types of propolis, known as the blue and orange types. PMID:27599799

  1. The DOE Model for Improving Seismic Event Locations Using Travel Time Corrections: Description and Demonstration

    SciTech Connect

    Hipp, J.R.; Moore, S.G.; Shepherd, E.; Young, C.J.

    1998-10-20

    The U.S. National Laboratories, under the auspices of the Department of Energy, have been tasked with improving the capability of the United States National Data Center (USNDC) to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the most important services which the USNDC must provide is to locate suspicious events, preferably as accurately as possible to help identify their origin and to insure the success of on-site inspections if they are deemed necessary. The seismic location algorithm used by the USNDC has the capability to generate accurate locations by applying geographically dependent travel time corrections, but to date, none of the means proposed for generating and representing these corrections has proven to be entirely satisfactory. In this presentation, we detail the complete DOE model for how regional calibration travel time information gathered by the National Labs will be used to improve event locations and provide more realistic location error estimates. We begin with residual data and error estimates from ground truth events. Our model consists of three parts: data processing, data storage, and data retrieval. The former two are effectively one-time processes, executed in advance before the system is made operational. The last step is required every time an accurate event location is needed. Data processing involves applying non-stationary Bayesian kriging to the residual data to densify them, and iterating to find the optimal tessellation representation for fast interpolation in the data retrieval task. Both the kriging and the iterative re-tessellation are slow, computationally expensive processes, but this is acceptable because they are performed off-line, before any events are to be located. In the data storage task, the densified data set is stored in a database and spatially indexed. Spatial indexing improves the access efficiency of the geographically-oriented data requests associated with event location

  2. gPhoton: Time-tagged GALEX photon events analysis tools

    NASA Astrophysics Data System (ADS)

    Million, Chase C.; Fleming, S. W.; Shiao, B.; Loyd, P.; Seibert, M.; Smith, M.

    2016-03-01

    Written in Python, gPhoton calibrates and sky-projects the ~1.1 trillion ultraviolet photon events detected by the microchannel plates on the Galaxy Evolution Explorer Spacecraft (GALEX), archives these events in a publicly accessible database at the Mikulski Archive for Space Telescopes (MAST), and provides tools for working with the database to extract scientific results, particularly over short time domains. The software includes a re-implementation of core functionality of the GALEX mission calibration pipeline to produce photon list files from raw spacecraft data as well as a suite of command line tools to generate calibrated light curves, images, and movies from the MAST database.

  3. Note: Gaussian mixture model for event recognition in optical time-domain reflectometry based sensing systems.

    PubMed

    Fedorov, A K; Anufriev, M N; Zhirnov, A A; Stepanov, K V; Nesterov, E T; Namiot, D E; Karasik, V E; Pnev, A B

    2016-03-01

    We propose a novel approach to the recognition of particular classes of non-conventional events in signals from phase-sensitive optical time-domain-reflectometry-based sensors. Our algorithmic solution has two main features: filtering aimed at the de-noising of signals and a Gaussian mixture model to cluster them. We test the proposed algorithm using experimentally measured signals. The results show that two classes of events can be distinguished with the best-case recognition probability close to 0.9 at sufficient numbers of training samples. PMID:27036840
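
    A condensed sketch of this two-stage idea (smoothing followed by Gaussian-mixture clustering of simple per-segment features) is given below; the input file, window length, and feature choice are placeholders rather than the authors' exact pipeline.

        # Sketch: de-noise a phase-OTDR trace, extract per-window features, cluster with a GMM.
        import numpy as np
        from scipy.ndimage import uniform_filter1d
        from sklearn.mixture import GaussianMixture

        trace = np.loadtxt("otdr_trace.txt")                 # hypothetical raw sensor signal
        smooth = uniform_filter1d(trace, size=25)            # simple moving-average de-noising

        win = 200
        n_win = len(smooth) // win
        segments = smooth[: n_win * win].reshape(n_win, win)
        features = np.column_stack([
            segments.std(axis=1),                            # energy-like feature per window
            np.abs(np.diff(segments, axis=1)).mean(axis=1),  # roughness feature per window
        ])

        gmm = GaussianMixture(n_components=2, random_state=0).fit(features)
        labels = gmm.predict(features)                       # 0/1: two candidate event classes
        print(np.bincount(labels))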

  4. The Dependence of Characteristic Times of Gradual SEP Events on Their Associated CME Properties

    NASA Astrophysics Data System (ADS)

    Pan, Z. H.; Wang, C. B.; Xue, X. H.; Wang, Y. M.

    It is generally believed that coronal mass ejections (CMEs) are the drivers of shocks that accelerate gradual solar energetic particles (SEPs). One might expect that the characteristics of the SEP intensity time profiles observed at 1 AU are determined by properties of the associated CMEs, such as the radial speed and the angular width. Recently, Kahler statistically investigated the characteristic times of gradual SEP events observed from 1998-2002 and their associated coronal mass ejection properties (Astrophys. J. 628, 1014-1022, 2005). Three characteristic times of gradual SEP events are determined as functions of solar source longitude: (1) T_0, the time from associated CME launch to SEP onset at 1 AU; (2) T_R, the rise time from SEP onset to the time when the SEP intensity is a factor of 2 below peak intensity; and (3) T_D, the duration over which the SEP intensity is within a factor of 2 of the peak intensity. However, in his study the CME speeds and angular widths are directly taken from the LASCO CME catalog. In this study, we analyze the radial speeds and the angular widths of CMEs with an ice-cream cone model and re-investigate their correlations with the characteristic times of the corresponding SEP events. We find that T_R and T_D are significantly correlated with radial speed for SEP events in the best-connected longitude range, and there is no correlation between T_0 and CME radial speed and angular width, which is consistent with Kahler's results. On the other hand, it is found that T_R and T_D also have

  5. Relative Time-scale for Channeling Events Within Chaotic Terrains, Margaritifer Sinus, Mars

    NASA Technical Reports Server (NTRS)

    Janke, D.

    1985-01-01

    A relative time scale for ordering channel and chaos forming events was constructed for areas within the Margaritifer Sinus region of Mars. Transection and superposition relationships of channels, chaotic terrain, and the surfaces surrounding them were used to create the relative time scale; crater density studies were not used. Channels and chaos in contact with one another were treated as systems. These systems were in turn treated both separately (in order to understand internal relationships) and as members of the suite of Martian erosional forms (in order to produce a combined, master time scale). Channeling events associated with chaotic terrain development occurred over an extended geomorphic period. The channels can be divided into three convenient groups: those that pre-date intercrater plains development; post-plains, pre-chasma systems; and those associated with the development of the Valles Marineris chasmata. No correlations with cyclic climatic changes, major geologic events in other regions on Mars, or triggering phenomena (for example, specific impact events) were found.

  6. Timing and climatic expression of Dansgaard-Oeschger events in stalagmites from Turkey

    NASA Astrophysics Data System (ADS)

    Fleitmann, D.; Badertscher, S.; Cheng, H.; Edwards, R.

    2011-12-01

    The timing of Dansgaard-Oeschger events (D-O events), also known as Greenland interstadials (GIS), and their climatic and environmental impact in the Eastern Mediterranean is not well constrained. A set of highly resolved and precisely dated stalagmite oxygen and carbon isotope records from Sofular Cave in northern Turkey covers the last 130 kyr before present almost continuously and shows D-O events 1-25 in great detail. Rapid climatic changes at the transition into D-O events are marked by a fast ecosystem response within a few decades. The timing of most D-O events in the Sofular time series is broadly consistent with the Hulu Cave (Wang et al., 2001) and other absolutely dated speleothem records, such as the Kleegruben Cave (Spotl et al., 2006) or Fort Stanton Cave (Asmerom et al., 2010) records. Importantly, the timing is also consistent within age uncertainties with the most recent NGRIP ice core chronology (GICC05; Svensson et al., 2006, 2008). References Asmerom, Y., V. J. Polyak, et al. (2010). "Variable winter moisture in the southwestern United States linked to rapid glacial climate shifts." Nature Geoscience 3(2): 114-117. Spotl, C., A. Mangini, et al. (2006). "Chronology and paleoenvironment of Marine Isotope Stage 3 from two high-elevation speleothems, Austrian Alps." Quaternary Science Reviews 25(9-10): 1127-1136. Svensson, A., K. K. Andersen, et al. (2006). "The Greenland Ice Core Chronology 2005, 15-42 ka. Part 2: comparison to other records." Quaternary Science Reviews 25(23-24): 3258-3267. Svensson, A., K. K. Andersen, et al. (2008). "A 60 000 year Greenland stratigraphic ice core chronology." Climate of the Past 4(1): 47-57.

  7. Lessons Learned from Real-Time, Event-Based Internet Science Communications

    NASA Technical Reports Server (NTRS)

    Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real-time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not. All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material exclusively. The worst event-based activity attracted more interest than the best written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and lessons learned will be discussed.

  8. Statistical Property and Model for the Inter-Event Time of Terrorism Attacks

    NASA Astrophysics Data System (ADS)

    Zhu, Jun-Fang; Han, Xiao-Pu; Wang, Bing-Hong

    2010-06-01

    The inter-event time of terrorism attack events is investigated by empirical data and model analysis. Empirical evidence shows that it follows a scale-free property. In order to understand the dynamic mechanism of such a statistical feature, an opinion dynamic model with a memory effect is proposed on a two-dimensional lattice network. The model mainly highlights the role of individual social conformity and self-affirmation psychology. An attack event occurs when the order parameter indicating the strength of public opposition opinion is smaller than a critical value. Ultimately, the model can reproduce the same statistical property as the empirical data and gives a good understanding for the possible dynamic mechanism of terrorism attacks.
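
    The "scale-free" claim about inter-event times can be illustrated with a simple power-law exponent estimate. The sketch below uses the standard Hill (maximum-likelihood) estimator on synthetic heavy-tailed intervals; it does not reproduce the paper's opinion-dynamics model, and the cutoff and sample size are illustrative assumptions.

```python
# Minimal sketch: MLE (Hill) estimate of the exponent of a power-law
# inter-event-time distribution P(t) ~ t^(-alpha) for t >= t_min.
import numpy as np

rng = np.random.default_rng(2)
alpha_true, t_min = 2.5, 1.0
# Synthetic Pareto-distributed inter-event times with pdf ~ t^(-alpha_true)
intervals = t_min * (1.0 - rng.uniform(size=5000)) ** (-1.0 / (alpha_true - 1.0))

tail = intervals[intervals >= t_min]
alpha_hat = 1.0 + tail.size / np.sum(np.log(tail / t_min))   # Hill estimator
print(f"estimated exponent: {alpha_hat:.2f} (true {alpha_true})")
```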

  9. Adults’ reports of their earliest memories: Consistency in events, ages, and narrative characteristics over time

    PubMed Central

    Bauer, Patricia J.; Tasdemir-Ozdes, Aylin; Larkina, Marina

    2014-01-01

    Earliest memories have been of interest since the late 1800s, when it was first noted that most adults do not have memories from the first years of life (so-called childhood amnesia). Several characteristics of adults’ earliest memories have been investigated, including emotional content, the perspective from which they are recalled, and vividness. The focus of the present research was a feature of early memories heretofore relatively neglected in the literature, namely, their consistency. Adults reported their earliest memories 2 to 4 times over a 4-year period. Reports of earliest memories were highly consistent in the events identified as the bases for earliest memories, the reported age at the time of the event, and in terms of qualities of the narrative descriptions. These findings imply stability in the boundary that marks the offset of childhood amnesia, as well as in the beginning of a continuous sense of self over time. PMID:24836979

  10. Time forecast of a break-off event from a hanging glacier

    NASA Astrophysics Data System (ADS)

    Faillettaz, Jérome; Funk, Martin; Vagliasindi, Marco

    2016-06-01

    A cold hanging glacier located on the south face of the Grandes Jorasses (Mont Blanc, Italy) broke off on the 23 and 29 September 2014 with a total estimated ice volume of 105 000 m3. Thanks to accurate surface displacement measurements taken up to the final break-off, this event was successfully predicted 10 days in advance, enabling local authorities to take the necessary safety measures. The break-off event also confirmed that surface displacements experienced a power law acceleration along with superimposed log-periodic oscillations prior to the final rupture. This paper describes the methods used to achieve a satisfactory time forecast in real time and demonstrates, using a retrospective analysis, their potential for the development of early-warning systems in real time.
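
    The forecasting idea rests on fitting a finite-time-singularity power law to the measured surface velocities and reading off the predicted failure time. The sketch below fits v(t) = a (t_c - t)^(-m) to synthetic data with scipy; it omits the superimposed log-periodic oscillations and the real-time updating described in the paper, and all parameter values are illustrative.

```python
# Minimal sketch: fit a power-law acceleration to surface velocities and
# extract the predicted break-off time t_c. Data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0.0, 9.5, 200)                       # days, hypothetical
tc_true, a_true, m_true = 10.0, 2.0, 0.8
v = a_true * (tc_true - t) ** (-m_true)
v += 0.05 * v * np.random.default_rng(3).normal(size=t.size)   # measurement noise

def accel(t, a, m, tc):
    """Finite-time-singularity power law v(t) = a * (tc - t)^(-m)."""
    return a * (tc - t) ** (-m)

popt, _ = curve_fit(accel, t, v, p0=[1.0, 1.0, t[-1] + 2.0],
                    bounds=([0.0, 0.1, t[-1] + 1e-3], [np.inf, 3.0, t[-1] + 30.0]))
print(f"predicted break-off time t_c = {popt[2]:.2f} days")
```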

  11. APNEA list mode data acquisition and real-time event processing

    SciTech Connect

    Hogle, R.A.; Miller, P.; Bramblett, R.L.

    1997-11-01

    The LMSC Active Passive Neutron Examinations and Assay (APNEA) Data Logger is a VME-based data acquisition system using commercial off-the-shelf hardware with application-specific software. It receives TTL inputs from eighty-eight ³He detector tubes and eight timing signals. Two data sets are generated concurrently for each acquisition session: (1) List Mode recording of all detector and timing signals, timestamped to 3 microsecond resolution; (2) Event Accumulations generated in real-time by counting events into short (tens of microseconds) and long (seconds) time bins following repetitive triggers. List Mode data sets can be post-processed to: (1) determine the optimum time bins for TRU assay of waste drums, (2) analyze a given data set in several ways to match different assay requirements and conditions, and (3) confirm assay results by examining details of the raw data. Data Logger events are processed and timestamped by an array of 15 TMS320C40 DSPs and delivered to an embedded controller (PowerPC604) for interim disk storage. Three acquisition modes, corresponding to different trigger sources, are provided. A standard network interface to a remote host system (Windows NT or SunOS) provides for system control, status, and transfer of previously acquired data.

  12. A New Characteristic Function for Fast Time-Reverse Seismic Event Location

    NASA Astrophysics Data System (ADS)

    Hendriyana, Andri; Bauer, Klaus; Weber, Michael; Jaya, Makky; Muksin, Muksin

    2015-04-01

    Microseismicity produced by natural activities is usually characterized by low signal-to-noise ratios and huge amounts of data, as recording is conducted over long periods of time. Locating microseismic events is preferably carried out using migration-based methods such as time-reverse modeling (TRM). The original TRM is based on backpropagating the wavefield from the receivers down to the source location. Alternatively, we use a characteristic function (CF) derived from the measured wavefield as input for the TRM. The motivation for such a strategy is to avoid undesired contributions from secondary arrivals, which may generate artifacts in the final images. In this presentation, we introduce a new CF as input for the TRM method. To obtain this CF, we initially apply kurtosis-based automatic onset detection and convolution with a given wavelet. The convolution with low-frequency wavelets allows us to conduct time-reverse modeling using coarser sampling, which reduces computing time. We apply the method to locate seismic events measured along an active part of the Sumatra Fault around the Tarutung pull-apart basin (North Sumatra, Indonesia). The results show that seismic events are well determined, since they are concentrated along the Sumatran fault. Internal details of the Tarutung basin structure could be derived. Our results are consistent with those obtained from inversion of manually picked travel time data.
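
    The construction of such a characteristic function can be sketched in two steps: a sliding-window kurtosis whose positive rate of change marks impulsive onsets, followed by convolution with a broad, low-frequency wavelet. Window length, wavelet shape, and the synthetic trace below are illustrative assumptions, not the authors' parameters.

```python
# Minimal sketch: kurtosis-based onset function convolved with a
# low-frequency wavelet to form a smooth characteristic function (CF).
import numpy as np
from scipy.stats import kurtosis
from scipy.signal import convolve

rng = np.random.default_rng(4)
trace = rng.normal(size=3000)
trace[1500:] += 5.0 * np.exp(-np.arange(1500) / 100.0) * rng.normal(size=1500)  # impulsive arrival

win = 100
kurt = np.array([kurtosis(trace[i - win:i]) for i in range(win, trace.size)])
onset_fn = np.maximum(np.diff(kurt, prepend=kurt[0]), 0.0)   # positive kurtosis rate only

# Broad (low-frequency) wavelet so the CF can be back-propagated on a coarser grid
width = 200
tt = np.arange(width) - width / 2
wavelet = np.exp(-(tt / 40.0) ** 2) * np.cos(2 * np.pi * tt / width)
cf = convolve(onset_fn, wavelet, mode="same")
print(int(np.argmax(cf)) + win)    # approximate onset sample (true onset at 1500)
```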

  13. Music, clicks, and their imaginations favor differently the event-based timing component for rhythmic movements.

    PubMed

    Bravi, Riccardo; Quarta, Eros; Del Tongo, Claudia; Carbonaro, Nicola; Tognetti, Alessandro; Minciacchi, Diego

    2015-06-01

    The involvement or noninvolvement of a clock-like neural process, an effector-independent representation of the time intervals to produce, is described as the essential difference between event-based and emergent timing. In a previous work (Bravi et al. in Exp Brain Res 232:1663-1675, 2014a. doi: 10.1007/s00221-014-3845-9), we studied repetitive isochronous wrist's flexion-extensions (IWFEs), performed while minimizing visual and tactile information, to clarify whether non-temporal and temporal characteristics of paced auditory stimuli affect the precision and accuracy of the rhythmic motor performance. Here, with the inclusion of new recordings, we expand the examination of the dataset described in our previous study to investigate whether simple and complex paced auditory stimuli (clicks and music) and their imaginations influence in a different way the timing mechanisms for repetitive IWFEs. Sets of IWFEs were analyzed by the windowed (lag one) autocorrelation, wγ(1), a statistical method recently introduced for the distinction between event-based and emergent timing. Our findings provide evidence that paced auditory information and its imagination favor the engagement of a clock-like neural process, and specifically that music, unlike clicks, lacks the power to elicit event-based timing, not counteracting the natural shift of wγ(1) toward positive values as the frequency of movements increases.
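
    The wγ(1) statistic is essentially a lag-one autocorrelation of the inter-response intervals computed in sliding windows and then averaged, with clearly negative values taken as the signature of event-based timing. The sketch below is a minimal version of that computation; the window length, step, and synthetic interval series are illustrative assumptions.

```python
# Minimal sketch: windowed lag-one autocorrelation of inter-response intervals.
import numpy as np

def windowed_lag1_autocorr(intervals, win=30, step=5):
    vals = []
    for start in range(0, len(intervals) - win + 1, step):
        w = np.asarray(intervals[start:start + win], dtype=float)
        w = w - w.mean()                      # remove the window mean (local detrending)
        denom = np.sum(w * w)
        if denom > 0:
            vals.append(np.sum(w[:-1] * w[1:]) / denom)
    return float(np.mean(vals))

rng = np.random.default_rng(5)
# Event-based-like series: alternating error correction yields negative lag-1 correlation
iti = 500 + np.convolve(rng.normal(0, 10, 300), [1, -0.5], mode="same")
print(windowed_lag1_autocorr(iti))            # clearly negative -> event-based timing
```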

  14. Foreshocks and aftershocks of Pisagua 2014 earthquake: time and space evolution of megathrust event.

    NASA Astrophysics Data System (ADS)

    Fuenzalida Velasco, Amaya; Rietbrock, Andreas; Wollam, Jack; Thomas, Reece; de Lima Neto, Oscar; Tavera, Hernando; Garth, Thomas; Ruiz, Sergio

    2016-04-01

    The 2014 Pisagua earthquake of magnitude 8.2 is the first case in Chile where a foreshock sequence was clearly recorded by a local network, as well as the complete sequence including the mainshock and its aftershocks. The seismicity of the last year before the mainshock includes numerous clusters close to the epicentral zone (Ruiz et al., 2014), but it was on 16 March that this activity became stronger, with the Mw 6.7 precursory event taking place off the coast of Iquique at 12 km depth. The Pisagua earthquake arrived on 1 April 2014, breaking almost 120 km N-S, and two days later a Mw 7.6 aftershock occurred south of the rupture, enlarging the zone affected by this sequence. In this work, we analyse the foreshock and aftershock sequences of the Pisagua earthquake, from the spatial and temporal evolution of a total of 15,764 events recorded from 1 March to 31 May 2014. This event catalogue was obtained from the automatic analysis of raw seismic data from more than 50 stations installed in the north of Chile and the south of Peru. We used the STA/LTA algorithm for the detection of P and S arrival times on the vertical components and then a method of back propagation in a 1D velocity model for event association and preliminary location of hypocenters, following the algorithm outlined by Rietbrock et al. (2012). These results were then improved by locating with the NonLinLoc software using a regional velocity model. We selected the larger events to analyse their moment tensor solutions by full waveform inversion using the ISOLA software. In order to understand the process of nucleation and propagation of the Pisagua earthquake, we also analysed the evolution in time of the seismicity over the three months of data. The zone where the precursory events took place was strongly activated two weeks before the mainshock and remained very active until the end of the analysed period, with an important quantity of the seismicity located in the upper plate and having

  15. Multivariate meta-analysis: potential and promise.

    PubMed

    Jackson, Dan; Riley, Richard; White, Ian R

    2011-09-10

    The multivariate random effects model is a generalization of the standard univariate model. Multivariate meta-analysis is becoming more commonly used and the techniques and related computer software, although continually under development, are now in place. In order to raise awareness of the multivariate methods, and discuss their advantages and disadvantages, we organized a one day 'Multivariate meta-analysis' event at the Royal Statistical Society. In addition to disseminating the most recent developments, we also received an abundance of comments, concerns, insights, critiques and encouragement. This article provides a balanced account of the day's discourse. By giving others the opportunity to respond to our assessment, we hope to ensure that the various view points and opinions are aired before multivariate meta-analysis simply becomes another widely used de facto method without any proper consideration of it by the medical statistics community. We describe the areas of application that multivariate meta-analysis has found, the methods available, the difficulties typically encountered and the arguments for and against the multivariate methods, using four representative but contrasting examples. We conclude that the multivariate methods can be useful, and in particular can provide estimates with better statistical properties, but also that these benefits come at the price of making more assumptions which do not result in better inference in every case. Although there is evidence that multivariate meta-analysis has considerable potential, it must be even more carefully applied than its univariate counterpart in practice.

  16. Stochastic univariate and multivariate time series analysis of PM2.5 and PM10 air pollution: A comparative case study for Plovdiv and Asenovgrad, Bulgaria

    NASA Astrophysics Data System (ADS)

    Gocheva-Ilieva, S.; Stoimenova, M.; Ivanov, A.; Voynikova, D.; Iliev, I.

    2016-10-01

    Fine particulate matter PM2.5 and PM10 air pollutants are a serious problem in many urban areas, affecting both the health of the population and the environment as a whole. The availability of large data arrays for the levels of these pollutants makes it possible to perform statistical analysis, to obtain relevant information, and to find patterns within the data. Research in this field is particularly topical for a number of cities in Bulgaria, a European country where in recent years regulatory air pollution health limits have constantly been exceeded. This paper examines average daily data for air pollution with PM2.5 and PM10, collected by 3 monitoring stations in the cities of Plovdiv and Asenovgrad between 2011 and 2016. The goal is to find and analyze actual relationships in data time series, to build adequate mathematical models, and to develop short-term forecasts. Modeling is carried out by stochastic univariate and multivariate time series analysis, based on the Box-Jenkins methodology. The best models are selected following initial transformation of the data and using a set of standard and robust statistical criteria. The Mathematica and SPSS software were used to perform calculations. This examination showed that measured concentrations of PM2.5 and PM10 in the region of Plovdiv and Asenovgrad regularly exceed permissible European and national health and safety thresholds. We obtained adequate stochastic models with a high statistical fit to the data and good-quality forecasting when compared against actual measurements. The mathematical approach applied provides an independent alternative to standard official monitoring and control means for air pollution in urban areas.
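
    The univariate Box-Jenkins step can be sketched as follows with statsmodels: transform the daily series, fit an ARIMA model, and produce a short-term forecast. The model order, the log transform, and the synthetic PM10 series are illustrative assumptions; the study also builds multivariate models and selects orders with standard and robust criteria.

```python
# Minimal sketch: fit an ARIMA model to a daily PM10 series and forecast 7 days ahead.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
dates = pd.date_range("2015-01-01", periods=365, freq="D")
pm10 = pd.Series(40 + 10 * np.sin(2 * np.pi * np.arange(365) / 365)
                 + rng.normal(0, 5, 365), index=dates)

y = np.log(pm10)                               # variance-stabilizing transform
model = ARIMA(y, order=(1, 0, 1)).fit()        # order would be chosen by AIC/BIC in practice
forecast = np.exp(model.forecast(steps=7))     # 7-day forecast, back-transformed to ug/m^3
print(forecast.round(1))
```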

  17. Real-time gesture interface based on event-driven processing from stereo silicon retinas.

    PubMed

    Lee, Jun Haeng; Delbruck, Tobi; Pfeiffer, Michael; Park, Paul K J; Shin, Chang-Woo; Ryu, Hyunsurk Eric; Kang, Byung Chang

    2014-12-01

    We propose a real-time hand gesture interface based on combining a stereo pair of biologically inspired event-based dynamic vision sensor (DVS) silicon retinas with neuromorphic event-driven postprocessing. Compared with conventional vision or 3-D sensors, the use of DVSs, which output asynchronous and sparse events in response to motion, eliminates the need to extract movements from sequences of video frames, and allows significantly faster and more energy-efficient processing. In addition, the rate of input events depends on the observed movements, and thus provides an additional cue for solving the gesture spotting problem, i.e., finding the onsets and offsets of gestures. We propose a postprocessing framework based on spiking neural networks that can process the events received from the DVSs in real time, and provides an architecture for future implementation in neuromorphic hardware devices. The motion trajectories of moving hands are detected by spatiotemporally correlating the stereoscopically verged asynchronous events from the DVSs by using leaky integrate-and-fire (LIF) neurons. Adaptive thresholds of the LIF neurons achieve the segmentation of trajectories, which are then translated into discrete and finite feature vectors. The feature vectors are classified with hidden Markov models, using a separate Gaussian mixture model for spotting irrelevant transition gestures. The disparity information from stereovision is used to adapt LIF neuron parameters to achieve recognition invariant of the distance of the user to the sensor, and also helps to filter out movements in the background of the user. Exploiting the high dynamic range of DVSs, furthermore, allows gesture recognition over a 60-dB range of scene illuminance. The system achieves recognition rates well over 90% under a variety of variable conditions with static and dynamic backgrounds with naïve users. PMID:25420246
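
    The core of the trajectory-detection stage is a leaky integrate-and-fire neuron driven by incoming DVS events. The sketch below shows a single event-driven LIF neuron: events are integrated with an exponential leak and a spike is emitted when the potential crosses a threshold. Parameters and the event stream are illustrative; stereo vergence, adaptive thresholds, and the HMM classifier are omitted.

```python
# Minimal sketch: event-driven leaky integrate-and-fire (LIF) neuron.
import numpy as np

def lif_spikes(event_times, tau=0.02, threshold=3.0, w=1.0):
    """Spike times of one LIF neuron driven by a sorted stream of event times (s)."""
    v, t_prev, spikes = 0.0, 0.0, []
    for t in sorted(event_times):
        v *= np.exp(-(t - t_prev) / tau)   # exponential leak since the previous event
        v += w                             # integrate the incoming event
        t_prev = t
        if v >= threshold:
            spikes.append(t)
            v = 0.0                        # reset after the spike
    return spikes

rng = np.random.default_rng(7)
burst = np.concatenate([rng.uniform(0.0, 1.0, 20),        # sparse background events
                        rng.uniform(1.00, 1.05, 40)])     # dense burst from a moving hand
print(lif_spikes(burst))                                  # spikes cluster inside the burst
```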

  18. Precipitation-snowmelt timing and snowmelt augmentation of large peak flow events, western Cascades, Oregon

    NASA Astrophysics Data System (ADS)

    Jennings, Keith; Jones, Julia A.

    2015-09-01

    This study tested multiple hydrologic mechanisms to explain snowpack dynamics in extreme rain-on-snow floods, which occur widely in the temperate and polar regions. We examined 26, 10 day large storm events over the period 1992-2012 in the H.J. Andrews Experimental Forest in western Oregon, using statistical analyses (regression, ANOVA, and wavelet coherence) of hourly snowmelt lysimeter, air and dewpoint temperature, wind speed, precipitation, and discharge data. All events involved snowpack outflow, but only seven events had continuous net snowpack outflow, including three of the five top-ranked peak discharge events. Peak discharge was not related to precipitation rate, but it was related to the 10 day sum of precipitation and net snowpack outflow, indicating an increased flood response to continuously melting snowpacks. The two largest peak discharge events in the study had significant wavelet coherence at multiple time scales over several days; a distribution of phase differences between precipitation and net snowpack outflow at the 12-32 h time scale with a sharp peak at π/2 radians; and strongly correlated snowpack outflow among lysimeters representing 42% of basin area. The recipe for an extreme rain-on-snow event includes persistent, slow melt within the snowpack, which appears to produce a near-saturated zone within the snowpack throughout the landscape, such that the snowpack may transmit pressure waves of precipitation directly to streams, and this process is synchronized across the landscape. Further work is needed to understand the internal dynamics of a melting snowpack throughout a snow-covered landscape and its contribution to extreme rain-on-snow floods.

  19. Network meta-analysis of (individual patient) time to event data alongside (aggregate) count data

    PubMed Central

    2014-01-01

    Background Network meta-analysis methods extend the standard pair-wise framework to allow simultaneous comparison of multiple interventions in a single statistical model. Despite published work on network meta-analysis mainly focussing on the synthesis of aggregate data, methods have been developed that allow the use of individual patient-level data specifically when outcomes are dichotomous or continuous. This paper focuses on the synthesis of individual patient-level and summary time to event data, motivated by a real data example looking at the effectiveness of high compression treatments on the healing of venous leg ulcers. Methods This paper introduces a novel network meta-analysis modelling approach that allows individual patient-level (time to event with censoring) and summary-level data (event count for a given follow-up time) to be synthesised jointly by assuming an underlying, common, distribution of time to healing. Alternative model assumptions were tested within the motivating example. Model fit and adequacy measures were used to compare and select models. Results Due to the availability of individual patient-level data in our example we were able to use a Weibull distribution to describe time to healing; otherwise, we would have been limited to specifying a uniparametric distribution. Absolute effectiveness estimates were more sensitive than relative effectiveness estimates to a range of alternative specifications for the model. Conclusions The synthesis of time to event data considering individual patient-level data provides modelling flexibility, and can be particularly important when absolute effectiveness estimates, and not just relative effect estimates, are of interest. PMID:25209121

  20. Estimation of the Unextendable Dead Time Period in a Flow of Physical Events by the Method of Maximum Likelihood

    NASA Astrophysics Data System (ADS)

    Nezhel'skaya, L. A.

    2016-09-01

    A flow of physical events (photons, electrons, and other elementary particles) is studied. One of the mathematical models of such flows is the modulated MAP flow of events operating under conditions of an unextendable dead time period. It is assumed that the dead time period is an unknown fixed value. The problem of estimating the dead time period from observations of the arrival times of events is solved by the method of maximum likelihood.
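
    In the simplest special case (a stationary Poisson input with a non-extendable dead time), the registered inter-event times are the dead time plus an exponential, and the maximum-likelihood estimate of the dead time is just the smallest observed interval. The sketch below illustrates that case only; the modulated MAP flow treated in the paper requires a considerably more elaborate likelihood. Rates and sample size are illustrative.

```python
# Minimal sketch: MLE of the dead time for Poisson events with a fixed,
# non-extendable dead time, where T = tau + Exp(lambda).
import numpy as np

rng = np.random.default_rng(8)
tau_true, lam = 0.005, 50.0                     # 5 ms dead time, 50 events/s (hypothetical)
intervals = tau_true + rng.exponential(1.0 / lam, size=10000)

tau_hat = intervals.min()                       # MLE of the dead time (likelihood increases in tau)
lam_hat = 1.0 / (intervals.mean() - tau_hat)    # MLE of the underlying event rate
print(f"tau_hat = {tau_hat * 1e3:.3f} ms, lambda_hat = {lam_hat:.1f} 1/s")
```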

  1. High Speed Multichannel Charge Sensitive Data Acquisition System with Self-Triggered Event Timing.

    PubMed

    Tremsin, Anton S; Siegmund, Oswald H W; Vallerga, John V; Raffanti, Rick; Weiss, Shimon; Michalet, Xavier

    2009-06-16

    A number of modern experiments require simultaneous measurement of charges on multiple channels at > MHz event rates with an accuracy of 100-1000 e⁻ rms. One widely used data processing scheme relies on application-specific integrated circuits enabling multichannel analog peak detection asserted by an external trigger, followed by a serial/sparsified readout. Although this configuration minimizes the back-end electronics, its counting rate capability is limited by the speed of the serial readout. Recent advances in analog-to-digital converters and FPGA devices enable fully parallel high-speed multichannel data processing with digital peak detection enhanced by finite impulse response filtering. Not only can accurate charge values be obtained at high event rates, but the timing of the event on each channel can also be determined with high accuracy. We present the concept and first experimental tests of fully parallel 128-channel charge-sensitive data processing electronics capable of measuring charges with an accuracy of ~1000 e⁻ rms. Our system does not require an external trigger and, in addition to charge values, it provides the event timing with an accuracy of ~1 ns FWHM. One of the possible applications of this system is high-resolution position-sensitive event counting detectors with microchannel plates combined with cross strip readout. Implementation of fast data acquisition electronics increases the counting rates of those detectors to the multi-MHz level, preserving their unique capability of virtually noiseless detection of both position (with accuracy of ~10 μm FWHM) and timing (~1 ns FWHM) of individual particles, including photons, electrons, ions, neutrals, and neutrons.

  2. Potential of turbidity monitoring for real time control of pollutant discharge in sewers during rainfall events.

    PubMed

    Lacour, C; Joannis, C; Gromaire, M-C; Chebbo, G

    2009-01-01

    Turbidity sensors can be used to continuously monitor the evolution of pollutant mass discharge. For two sites within the Paris combined sewer system, continuous turbidity, conductivity and flow data were recorded at one-minute time intervals over a one-year period. This paper is intended to highlight the variability in turbidity dynamics during wet weather. For each storm event, turbidity response aspects were analysed through different classifications. The correlation between classification and common parameters, such as the antecedent dry weather period, total event volume per impervious hectare and both the mean and maximum hydraulic flow for each event, was also studied. Moreover, the dynamics of flow and turbidity signals were compared at the event scale. No simple relation between turbidity responses, hydraulic flow dynamics and the chosen parameters was derived from this effort. Knowledge of turbidity dynamics could therefore potentially improve wet weather management, especially when using pollution-based real-time control (P-RTC) since turbidity contains information not included in hydraulic flow dynamics and not readily predictable from such dynamics.

  3. Automated detection of instantaneous gait events using time frequency analysis and manifold embedding.

    PubMed

    Aung, Min S H; Thies, Sibylle B; Kenney, Laurence P J; Howard, David; Selles, Ruud W; Findlow, Andrew H; Goulermas, John Y

    2013-11-01

    Accelerometry is a widely used sensing modality in human biomechanics due to its portability, non-invasiveness, and accuracy. However, difficulties lie in signal variability and interpretation in relation to biomechanical events. In walking, heel strike and toe off are primary gait events where robust and accurate detection is essential for gait-related applications. This paper describes a novel and generic event detection algorithm applicable to signals from tri-axial accelerometers placed on the foot, ankle, shank or waist. Data from healthy subjects undergoing multiple walking trials on flat and inclined, as well as smooth and tactile paving surfaces is acquired for experimentation. The benchmark timings at which heel strike and toe off occur, are determined using kinematic data recorded from a motion capture system. The algorithm extracts features from each of the acceleration signals using a continuous wavelet transform over a wide range of scales. A locality preserving embedding method is then applied to reduce the high dimensionality caused by the multiple scales while preserving salient features for classification. A simple Gaussian mixture model is then trained to classify each of the time samples into heel strike, toe off or no event categories. Results show good detection and temporal accuracies for different sensor locations and different walking terrains. PMID:23322764

  4. Rise Times of Solar Energetic Particle Events and Speeds of CMEs

    NASA Astrophysics Data System (ADS)

    Kahler, S.; Reames, D.

    2002-12-01

    Gradual solar energetic particle (SEP) events are assumed to be produced in coronal and interplanetary shocks driven by fast coronal mass ejections (CMEs). These fast CMEs are decelerated as they move through the slower ambient solar wind. However, the Alfven speed is decreasing with increasing distance. Faster CMEs may therefore continue to drive strong shocks for longer characteristic times than do the slower CMEs, such that shock production and injection of SEPs of a given energy will also continue longer with the faster CMEs. We test this proposition observationally by comparing the times to maxima of 20 MeV SEP events with the observed speeds of associated CMEs. The SEP/CME events are sorted by solar longitude to factor out the longitudinal dependence of the SEP rise times. A preliminary analysis comparing 20 MeV protons from the GSFC EPACT detector on the Wind satellite with CMEs observed by the LASCO coronagraph on the SOHO spacecraft showed a correlation between SEP rise times and CME speeds. We expand the database to include the 1996-2001 period for a more definitive test of the correlation. The implications of the results will be discussed.

  5. Bootstrap-based methods for estimating standard errors in Cox's regression analyses of clustered event times.

    PubMed

    Xiao, Yongling; Abrahamowicz, Michal

    2010-03-30

    We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data, with latent cluster-level random effects, which are ignored in the conventional Cox's model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs, and type I error rates, and acceptable coverage rates, regardless of the true random effects distribution, and avoid serious variance under-estimation by conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of cluster event times.
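
    The cluster-bootstrap idea can be sketched as: resample whole clusters with replacement, refit the Cox model on each resample, and use the standard deviation of the coefficient estimates as the corrected standard error. The sketch below uses the lifelines package as an assumed tool (the paper does not specify software), with a synthetic frailty-type dataset and hypothetical column names.

```python
# Minimal sketch: cluster bootstrap standard error for a Cox regression
# coefficient with clustered event times.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(9)
n_clusters, per_cluster = 50, 10
frailty = rng.normal(0, 0.5, n_clusters)                   # latent cluster effect, ignored by the model
rows = []
for c in range(n_clusters):
    x = np.full(per_cluster, rng.binomial(1, 0.5))         # cluster-level binary covariate
    t = rng.exponential(np.exp(-(0.5 * x + frailty[c])))   # event times, no censoring for simplicity
    rows.append(pd.DataFrame({"cluster": c, "x": x, "time": t, "event": 1}))
df = pd.concat(rows, ignore_index=True)

boot_coefs = []
clusters = df["cluster"].unique()
for _ in range(200):
    sampled = rng.choice(clusters, size=len(clusters), replace=True)   # resample whole clusters
    boot_df = pd.concat([df[df["cluster"] == c] for c in sampled], ignore_index=True)
    cph = CoxPHFitter().fit(boot_df[["x", "time", "event"]],
                            duration_col="time", event_col="event")
    boot_coefs.append(cph.params_["x"])
print(f"cluster-bootstrap SE for beta_x: {np.std(boot_coefs):.3f}")
```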

  6. State feedback control of real-time discrete event systems with infinite states

    NASA Astrophysics Data System (ADS)

    Park, Seong-Jin; Cho, Kwang-Hyun

    2015-05-01

    In this paper, we study a state feedback supervisory control of timed discrete event systems (TDESs) with infinite number of states modelled as timed automata. To this end, we represent a timed automaton with infinite number of untimed states (called locations) by a finite set of conditional assignment statements. Predicates and predicate transformers are employed to finitely represent the behaviour and specification of a TDES with infinite number of locations. In addition, the notion of clock regions in timed automata is used to identify the reachable states of a TDES with an infinite time space. For a real-time specification described as a predicate, we present the controllability condition for the existence of a state feedback supervisor that restricts the behaviour of the controlled TDES within the specification.

  7. Pulsed illumination, closed circuit television system for real-time viewing of unsteady (>1 μs) events.

    PubMed

    Marden, W W; Steinberger, R L; Bracco, F V

    1978-10-01

    A pulsed illumination closed circuit television system is described whereby fast (times <33 ms), unsteady events can be observed in real time. A low-power helium-neon laser beam is modulated to send a short duration light pulse through the unsteady test medium. The light is refracted according to the instantaneous optical properties of the medium. The refracted light travels to a solid state television camera, known as a charge injection device (CID), in which the sensor array is charged within microseconds. The scanning of the charged array then follows, requiring the standard 33 ms for information transfer to video tape and a TV monitor. The image is thus formed during the laser pulse duration (which presently is 10-100 μs, but shorter duration pulses are possible with more powerful lasers), but no more than one image every 33 ms can be observed and recorded. Thus this method is particularly suited for the investigation of high frequency periodic events in which one can observe both a single image, or an ensemble average of as many as 100 images, occurring at corresponding times in different cycles. The reported applications include the recording of steady and transient propane torch flames, of the transient fuel injection process in a motored internal combustion engine, and of the propagation of a flame under firing conditions in the engine. In the shadowgraph and Schlieren modes the method is particularly suited for application to periodic combustion events such as those occurring in internal combustion engines. The method then presents the following advantages over high-speed filming (>3000 pictures/s): real-time observation and recording of chamber events at any crankangle; real-time observation and recording of the effects of changes in the engine variables (speed, load, spark timing, injection pressure and duration, chamber swirl, etc.) on the combustion events; real-time observation and recording of ensemble averages and cycle-to-cycle variations

  8. A Bayesian Approach for Instrumental Variable Analysis with Censored Time-to-Event Outcome

    PubMed Central

    Li, Gang; Lu, Xuyang

    2014-01-01

    Instrumental variable (IV) analysis has been widely used in economics, epidemiology, and other fields to estimate the causal effects of covariates on outcomes, in the presence of unobserved confounders and/or measurement errors in covariates. However, IV methods for time-to-event outcome with censored data remain underdeveloped. This paper proposes a Bayesian approach for IV analysis with censored time-to-event outcome by using a two-stage linear model. A Markov Chain Monte Carlo sampling method is developed for parameter estimation for both normal and non-normal linear models with elliptically contoured error distributions. Performance of our method is examined by simulation studies. Our method largely reduces bias and greatly improves coverage probability of the estimated causal effect, compared to the method that ignores the unobserved confounders and measurement errors. We illustrate our method on the Women's Health Initiative Observational Study and the Atherosclerosis Risk in Communities Study. PMID:25393617

  9. Monotonic continuous-time random walks with drift and stochastic reset events

    NASA Astrophysics Data System (ADS)

    Montero, Miquel; Villarroel, Javier

    2013-01-01

    In this paper we consider a stochastic process that may experience random reset events which suddenly bring the system to the starting value and analyze the relevant statistical magnitudes. We focus our attention on monotonic continuous-time random walks with a constant drift: The process increases between the reset events, either by the effect of the random jumps, or by the action of the deterministic drift. As a result of all these combined factors interesting properties emerge, like the existence (for any drift strength) of a stationary transition probability density function, or the faculty of the model to reproduce power-law-like behavior. General formulas for two extreme statistics, the survival probability, and the mean exit time are also derived. To corroborate in an independent way the results of the paper, Monte Carlo methods were used. These numerical estimations are in full agreement with the analytical predictions.
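
    The existence of a stationary distribution under resets can be checked numerically with a short Monte Carlo simulation: between Poissonian resets the process grows by the drift plus exponentially distributed jumps, and only the time elapsed since the last reset matters for the observed position. The sketch below uses that backward-recurrence-time construction; all parameter values are illustrative.

```python
# Minimal sketch: Monte Carlo sampling of a monotonic CTRW with constant
# drift and Poissonian reset events, observed at a late time.
import numpy as np

rng = np.random.default_rng(10)

def sample_position(t_obs, drift=1.0, jump_rate=2.0, jump_mean=0.5, reset_rate=0.2):
    """Position at time t_obs of a drift + jump process reset to 0 at Poisson times."""
    # Backward recurrence time of the Poisson reset process: Exp(reset_rate), capped at t_obs
    elapsed = min(rng.exponential(1.0 / reset_rate), t_obs)
    n_jumps = rng.poisson(jump_rate * elapsed)                 # jumps since the last reset
    return drift * elapsed + rng.exponential(jump_mean, size=n_jumps).sum()

samples = np.array([sample_position(t_obs=50.0) for _ in range(20000)])
print(f"mean = {samples.mean():.2f}, 95th percentile = {np.percentile(samples, 95):.2f}")
```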

  10. Validation of Cross-Sectional Time Series and Multivariate Adaptive Regression Splines Models for the Prediction of Energy Expenditure in Children and Adolescents Using Doubly Labeled Water12

    PubMed Central

    Butte, Nancy F.; Wong, William W.; Adolph, Anne L.; Puyau, Maurice R.; Vohra, Firoz A.; Zakeri, Issa F.

    2010-01-01

    Accurate, nonintrusive, and inexpensive techniques are needed to measure energy expenditure (EE) in free-living populations. Our primary aim in this study was to validate cross-sectional time series (CSTS) and multivariate adaptive regression splines (MARS) models based on observable participant characteristics, heart rate (HR), and accelerometer counts (AC) for prediction of minute-by-minute EE, and hence 24-h total EE (TEE), against a 7-d doubly labeled water (DLW) method in children and adolescents. Our secondary aim was to demonstrate the utility of CSTS and MARS to predict awake EE, sleep EE, and activity EE (AEE) from 7-d HR and AC records, because these shorter periods are not verifiable by DLW, which provides an estimate of the individual's mean TEE over a 7-d interval. CSTS and MARS models were validated in 60 normal-weight and overweight participants (ages 5–18 y). The Actiheart monitor was used to simultaneously measure HR and AC. For prediction of TEE, mean absolute errors were 10.7 ± 307 kcal/d and 18.7 ± 252 kcal/d for CSTS and MARS models, respectively, relative to DLW. Corresponding root mean square error values were 305 and 251 kcal/d for CSTS and MARS models, respectively. Bland-Altman plots indicated that the predicted values were in good agreement with the DLW-derived TEE values. Validation of CSTS and MARS models based on participant characteristics, HR monitoring, and accelerometry for the prediction of minute-by-minute EE, and hence 24-h TEE, against the DLW method indicated no systematic bias and acceptable limits of agreement for pediatric groups and individuals under free-living conditions. PMID:20573939

  11. Racial Disparities for Age at Time of Cardiovascular Events and Cardiovascular Death in SLE Patients

    PubMed Central

    Scalzi, Lisabeth V.; Hollenbeak, Christopher S.; Wang, Li

    2010-01-01

    Objective The aim of this study was to determine if there are racial disparities in regard to the age at which SLE patients experience CVD and CVD-associated death. Methods Using the 2003–2006 National Inpatient Sample, we calculated the age difference between SLE patients and their race- and gender-matched controls at the time of hospitalization for a cardiovascular (CVD) event and for CVD-associated death. In addition, we also calculated the age difference for the same outcomes between White SLE patients and gender-matched controls for each minority group. Results The mean age difference at the time of a CVD event between women with and without SLE was 10.5 years. All age differences between women with SLE (n=3,625) and women without SLE admitted for CVD were significant (p<0.0001). Black women were the youngest female SLE racial group to be admitted with CVD (53.9 years) and have a CVD-associated in-hospital mortality (52.8 years; n=218). Black SLE women were 19.8 years younger than race- and gender-matched controls at the time of CVD-associated death. Admission trends for CVD were reversed for Black women such that the highest proportions of these patients were admitted before age 55 and then steadily decreased across age categories. There were 805 men with SLE admitted with a CVD event, with Black and Hispanic groups being the youngest. Conclusions There are significant racial disparities with regard to age at the time of hospital admission for CVD events and a CVD-related hospitalization resulting in death in patients with SLE. PMID:20506536

  12. Automated seismic event location by arrival time stacking: Applications to local and micro-seismicity

    NASA Astrophysics Data System (ADS)

    Grigoli, F.; Cesca, S.; Braun, T.; Philipp, J.; Dahm, T.

    2012-04-01

    Locating seismic events is one of the oldest problems in seismology. In microseismicity applications, when the number of events is very large, it is not possible to locate earthquakes manually, and automated location procedures must be established. Automated seismic event location at different scales is very important in different application areas, including mining monitoring, reservoir geophysics and early warning systems. Location is needed to start rescue operations rapidly. Locating and mapping microearthquakes or acoustic emission sources in mining environments is important for monitoring mine stability. Mapping fractures through the microseismicity distribution inside hydrocarbon reservoirs is needed to find areas with a higher permeability and enhance oil production. In the last 20 years a large number of picking algorithms were developed in order to locate seismic events automatically. While P onsets can now be accurately picked using automatic routines, the automatic picking of later seismic phases (including the S onset) is still problematic, thus limiting the location performance. In this work we present a picking-free location method based on the use of the Short-Term-Average/Long-Term-Average (STA/LTA) traces at different stations as observed data. For different locations and origin times, observed STA/LTA are stacked along the travel time surface corresponding to the selected hypocentre. Iterating this procedure on a three-dimensional grid, we retrieve a multidimensional matrix whose absolute maximum corresponds to the spatio-temporal coordinates of the seismic event. We tested our methodology on synthetic data, simulating different environments and network geometries. Finally, we apply our method to real datasets related to microseismic activity in mines and earthquake swarms in Italy. This work has been funded by the German BMBF "Geotechnologien" project MINE (BMBF03G0737A).
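
    The two ingredients of the method (an STA/LTA characteristic function per trace and stacking of those functions along predicted travel times over a grid of trial sources) can be sketched in one dimension. The station geometry, velocity, and synthetic data below are illustrative stand-ins, not the authors' configuration.

```python
# Minimal sketch: STA/LTA characteristic functions stacked along travel times
# over a 1-D grid of trial source positions and origin times.
import numpy as np

def sta_lta(x, n_sta=20, n_lta=200):
    """Classic ratio of short-term to long-term average of |x|."""
    e = np.abs(x)
    sta = np.convolve(e, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(e, np.ones(n_lta) / n_lta, mode="same")
    return sta / np.maximum(lta, 1e-12)

rng = np.random.default_rng(11)
fs, v = 100.0, 3.0                               # Hz, km/s (hypothetical)
stations = np.array([0.0, 10.0, 20.0, 30.0])     # km along a line
src_true, t0_true = 12.0, 5.0                    # km, s

traces = rng.normal(0, 1, (len(stations), 3000))
for i, s in enumerate(stations):                 # inject arrivals into the noise
    onset = int((t0_true + abs(s - src_true) / v) * fs)
    traces[i, onset:onset + 100] += 5.0 * rng.normal(size=100)

cf = np.array([sta_lta(tr) for tr in traces])

# Grid search over trial source position and origin time, stacking the STA/LTA values
best, best_val = None, -np.inf
for x_try in np.arange(0.0, 30.0, 0.5):
    tt = np.abs(stations - x_try) / v            # predicted travel times (s)
    for t0 in np.arange(0.0, 20.0, 0.1):
        idx = ((t0 + tt) * fs).astype(int)
        if idx.max() >= cf.shape[1]:
            continue
        val = cf[np.arange(len(stations)), idx].sum()
        if val > best_val:
            best_val, best = val, (x_try, t0)
print("located at x = %.1f km, t0 = %.1f s" % best)
```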

  13. Event-driven time-optimal control for a class of discontinuous bioreactors.

    PubMed

    Moreno, Jaime A; Betancur, Manuel J; Buitrón, Germán; Moreno-Andrade, Iván

    2006-07-01

    Discontinuous bioreactors may be further optimized for processing inhibitory substrates using a convenient fed-batch mode. To do so the filling rate must be controlled in such a way as to push the reaction rate to its maximum value, by increasing the substrate concentration just up to the point where inhibition begins. However, an exact optimal controller requires measuring several variables (e.g., substrate concentrations in the feed and in the tank) and also good model knowledge (e.g., yield and kinetic parameters), requirements rarely satisfied in real applications. An environmentally important case, that exemplifies all these handicaps, is toxicant wastewater treatment. There the lack of online practical pollutant sensors may allow unforeseen high shock loads to be fed to the bioreactor, causing biomass inhibition that slows down the treatment process and, in extreme cases, even renders the biological process useless. In this work an event-driven time-optimal control (ED-TOC) is proposed to circumvent these limitations. We show how to detect a "there is inhibition" event by using some computable function of the available measurements. This event drives the ED-TOC to stop the filling. Later, by detecting the symmetric event, "there is no inhibition," the ED-TOC may restart the filling. A fill-react cycling then maintains the process safely hovering near its maximum reaction rate, allowing a robust and practically time-optimal operation of the bioreactor. An experimental study case of a wastewater treatment process application is presented. There the dissolved oxygen concentration was used to detect the events needed to drive the controller. PMID:16523521

  14. The timing of the Black Sea flood event: Insights from modeling of glacial isostatic adjustment

    NASA Astrophysics Data System (ADS)

    Goldberg, Samuel L.; Lau, Harriet C. P.; Mitrovica, Jerry X.; Latychev, Konstantin

    2016-10-01

    We present a suite of gravitationally self-consistent predictions of sea-level change since Last Glacial Maximum (LGM) in the vicinity of the Bosphorus and Dardanelles straits that combine signals associated with glacial isostatic adjustment (GIA) and the flooding of the Black Sea. Our predictions are tuned to fit a relative sea level (RSL) record at the island of Samothrace in the north Aegean Sea and they include realistic 3-D variations in viscoelastic structure, including lateral variations in mantle viscosity and the elastic thickness of the lithosphere, as well as weak plate boundary zones. We demonstrate that 3-D Earth structure and the magnitude of the flood event (which depends on the pre-flood level of the lake) both have significant impact on the predicted RSL change at the location of the Bosphorus sill, and therefore on the inferred timing of the marine incursion. We summarize our results in a plot showing the predicted RSL change at the Bosphorus sill as a function of the timing of the flood event for different flood magnitudes up to 100 m. These results suggest, for example, that a flood event at 9 ka implies that the elevation of the sill was lowered through erosion by ∼14-21 m during, and after, the flood. In contrast, a flood event at 7 ka suggests erosion of ∼24-31 m at the sill since the flood. More generally, our results will be useful for future research aimed at constraining the details of this controversial, and widely debated geological event.

  15. Event-driven time-optimal control for a class of discontinuous bioreactors.

    PubMed

    Moreno, Jaime A; Betancur, Manuel J; Buitrón, Germán; Moreno-Andrade, Iván

    2006-07-01

    Discontinuous bioreactors may be further optimized for processing inhibitory substrates using a convenient fed-batch mode. To do so the filling rate must be controlled in such a way as to push the reaction rate to its maximum value, by increasing the substrate concentration just up to the point where inhibition begins. However, an exact optimal controller requires measuring several variables (e.g., substrate concentrations in the feed and in the tank) and also good model knowledge (e.g., yield and kinetic parameters), requirements rarely satisfied in real applications. An environmentally important case, that exemplifies all these handicaps, is toxicant wastewater treatment. There the lack of online practical pollutant sensors may allow unforeseen high shock loads to be fed to the bioreactor, causing biomass inhibition that slows down the treatment process and, in extreme cases, even renders the biological process useless. In this work an event-driven time-optimal control (ED-TOC) is proposed to circumvent these limitations. We show how to detect a "there is inhibition" event by using some computable function of the available measurements. This event drives the ED-TOC to stop the filling. Later, by detecting the symmetric event, "there is no inhibition," the ED-TOC may restart the filling. A fill-react cycling then maintains the process safely hovering near its maximum reaction rate, allowing a robust and practically time-optimal operation of the bioreactor. An experimental study case of a wastewater treatment process application is presented. There the dissolved oxygen concentration was used to detect the events needed to drive the controller.

  16. Time-Dependent characteristics of Slow Slip Events beneath the Nicoya Peninsula, Costa Rica

    NASA Astrophysics Data System (ADS)

    Voss, N. K.; Liu, Z.; Malservisi, R.; Dixon, T. H.; Jiang, Y.; Schwartz, S. Y.; Protti, M.

    2015-12-01

    Large geodetically resolved Slow Slip Events (SSE) beneath the Nicoya Peninsula in Costa Rica have been occurring every 21 months +/- 6 months since the installation of continuous GPS stations in 2003, with smaller shallow events occurring irregularly in between. This recurrence interval appears to continue after the 2012 Nicoya M 7.6 earthquake that ruptured a strongly locked patch [Protti et al., 2014, Xue et al., 2015] located between two recurring and well-imaged regions of shallow and deep slow slip. The most recent SSE began in February of 2014, ~17 months after the Nicoya earthquake and 21 months after the start of an earthquake preceding SSE. Despite the observed regularity in time, the characteristics of the individual SSEs vary greatly, with some having shallow or deep slip dominate and others having a more equal distribution of both shallow and deep slip. We use a modified version of the Extended Network Inversion Filter [e.g. McGuire and Segall, 2003] (ENIF) to identify time dependent characteristics of SSEs before and after the 2012 Nicoya earthquake. Slip in the distinct shallow and deep patches appears to be closely related although not coincident in time, deep slip is slightly delayed compared to the onset of shallow slip for events in 2007 and 2009. Distinct migration both along dip and strike are observed. Further, the ability of the filter to distinguish signal from noise, has allowed for resolution of subtle slip rate variations during the SSEs that correlate reasonably well with previously identified tremor rate [Walter et al., 2011]. Such correlation is less clear in space, possibly due to a combination of uncertainty in tremor locations and inversion regularization. Some coincident migration of slip and tremor did occur during the 2007 event. Investigation of time-dependent slip behaviors of other events is still in progress. A preliminary static inversion of the 2014 event indicates an absence of slip in the shallow slow slip patch that was

  17. Meteorological factors and timing of the initiating event of human parturition

    NASA Astrophysics Data System (ADS)

    Hirsch, Emmet; Lim, Courtney; Dobrez, Deborah; Adams, Marci G.; Noble, William

    2011-03-01

    The aim of this study was to determine whether meteorological factors are associated with the timing of either onset of labor with intact membranes or rupture of membranes prior to labor, together referred to as 'the initiating event' of parturition. All patients delivering at Evanston Hospital after spontaneous labor or rupture of membranes at ≥20 weeks of gestation over a 6-month period were studied. Logistic regression models of the initiating event of parturition using clinical variables (maternal age, gestational age, parity, multiple gestation and intrauterine infection) with and without the addition of meteorological variables (barometric pressure, temperature and humidity) were compared. A total of 1,088 patients met the inclusion criteria. Gestational age, multiple gestation and chorioamnionitis were associated with timing of initiation of parturition (P < 0.01). The addition of meteorological to clinical variables generated a statistically significant improvement in prediction of the initiating event; however, the magnitude of this improvement was small (less than 2% difference in receiver-operating characteristic score). These observations held regardless of parity, fetal number and gestational age. Meteorological factors are associated with the timing of parturition, but the magnitude of this association is small.
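
    The model comparison can be sketched as fitting a logistic regression on clinical covariates, then refitting with meteorological covariates added and comparing the receiver-operating characteristic scores. The sketch below uses scikit-learn and synthetic data with variable names taken from the abstract; the actual study design and coefficients are not reproduced.

```python
# Minimal sketch: compare logistic models with and without meteorological
# covariates using ROC AUC on a held-out split. Data are synthetic.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(12)
n = 1088
df = pd.DataFrame({
    "gest_age": rng.normal(38, 2, n),            # weeks
    "multiple": rng.binomial(1, 0.03, n),
    "chorio": rng.binomial(1, 0.05, n),
    "pressure": rng.normal(1013, 8, n),          # hPa
    "temperature": rng.normal(15, 8, n),         # deg C
    "humidity": rng.uniform(30, 90, n),          # %
})
logit = -8 + 0.2 * df["gest_age"] + 1.0 * df["chorio"] + 0.01 * (df["pressure"] - 1013)
df["event"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

clinical = ["gest_age", "multiple", "chorio"]
with_meteo = clinical + ["pressure", "temperature", "humidity"]
train, test = train_test_split(df, test_size=0.3, random_state=0)

for cols, label in [(clinical, "clinical only"), (with_meteo, "clinical + meteorological")]:
    model = LogisticRegression(max_iter=1000).fit(train[cols], train["event"])
    auc = roc_auc_score(test["event"], model.predict_proba(test[cols])[:, 1])
    print(f"{label}: AUC = {auc:.3f}")
```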

  18. Time-frequency analysis of event-related potentials: a brief tutorial.

    PubMed

    Herrmann, Christoph S; Rach, Stefan; Vosskuhl, Johannes; Strüber, Daniel

    2014-07-01

    Event-related potentials (ERPs) reflect cognitive processes and are usually analyzed in the so-called time domain. Additional information on cognitive functions can be assessed when analyzing ERPs in the frequency domain and treating them as event-related oscillations (EROs). This procedure results in frequency spectra but lacks information about the temporal dynamics of EROs. Here, we describe a method, called time-frequency analysis, that allows analyzing both the frequency of an ERO and its evolution over time. In a brief tutorial, the reader will learn how to use wavelet analysis in order to compute time-frequency transforms of ERP data. Basic steps as well as potential artifacts are described. Rather than in terms of formulas, descriptions are in textual form (written text) with numerous figures illustrating the topics. Recommendations on how to present frequency and time-frequency data in journal articles are provided. Finally, we briefly review studies that have applied time-frequency analysis to mismatch negativity paradigms. The deviant stimulus of such a paradigm evokes an ERO in the theta frequency band that is stronger than for the standard stimulus. Conversely, the standard stimulus evokes a stronger gamma-band response than does the deviant. This is interpreted in the context of the so-called match-and-utilization model. PMID:24194116
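
    A standard way to compute such a time-frequency transform is to convolve the signal with complex Morlet wavelets and take the squared magnitude as power. The sketch below applies that to a synthetic "ERP" containing a theta burst; the wavelet width, frequency range, and signal are illustrative choices, not the tutorial's exact settings.

```python
# Minimal sketch: complex Morlet wavelet time-frequency power of a synthetic ERP.
import numpy as np

fs = 500.0                                        # Hz
t = np.arange(0, 2.0, 1 / fs)                     # 2 s epoch
erp = np.exp(-((t - 0.5) / 0.05) ** 2) * np.sin(2 * np.pi * 6 * t)   # theta burst near 500 ms
erp += 0.1 * np.random.default_rng(13).normal(size=t.size)

freqs = np.arange(4, 40, 1.0)
n_cycles = 5                                      # wavelet width in cycles
power = np.empty((freqs.size, t.size))
for i, f in enumerate(freqs):
    sigma_t = n_cycles / (2 * np.pi * f)
    wt = np.arange(-3 * sigma_t, 3 * sigma_t, 1 / fs)
    wavelet = np.exp(2j * np.pi * f * wt) * np.exp(-(wt ** 2) / (2 * sigma_t ** 2))
    wavelet /= np.sum(np.abs(wavelet))            # simple amplitude normalization
    power[i] = np.abs(np.convolve(erp, wavelet, mode="same")) ** 2

peak_f, peak_t = np.unravel_index(power.argmax(), power.shape)
print(f"peak power at {freqs[peak_f]:.0f} Hz, {t[peak_t] * 1000:.0f} ms")
```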

  19. Comprehensive temporal information detection from clinical text: medical events, time, and TLINK identification

    PubMed Central

    Sohn, Sunghwan; Wagholikar, Kavishwar B; Li, Dingcheng; Jonnalagadda, Siddhartha R; Tao, Cui; Komandur Elayavilli, Ravikumar; Liu, Hongfang

    2013-01-01

    Background Temporal information detection systems have been developed by the Mayo Clinic for the 2012 i2b2 Natural Language Processing Challenge. Objective To construct automated systems for EVENT/TIMEX3 extraction and temporal link (TLINK) identification from clinical text. Materials and methods The i2b2 organizers provided 190 annotated discharge summaries as the training set and 120 discharge summaries as the test set. Our Event system used a conditional random field classifier with a variety of features including lexical information, natural language elements, and medical ontology. The TIMEX3 system employed a rule-based method using regular expression pattern match and systematic reasoning to determine normalized values. The TLINK system employed both rule-based reasoning and machine learning. All three systems were built in an Apache Unstructured Information Management Architecture framework. Results Our TIMEX3 system performed the best (F-measure of 0.900, value accuracy 0.731) among the challenge teams. The Event system produced an F-measure of 0.870, and the TLINK system an F-measure of 0.537. Conclusions Our TIMEX3 system demonstrated good capability of regular expression rules to extract and normalize time information. Event and TLINK machine learning systems required well-defined feature sets to perform well. We could also leverage expert knowledge as part of the machine learning features to further improve TLINK identification performance. PMID:23558168

  20. Imaging transient slip events in southwest Japan using reanalyzed Japanese GEONET GPS time series

    NASA Astrophysics Data System (ADS)

    Liu, Z.; Moore, A. W.; Owen, S. E.

    2012-12-01

    The Japanese continuous GPS network (GEONET), with ~1450 stations, provides a unique opportunity to study ongoing subduction zone dynamics and crustal deformation at various spatiotemporal scales. Recently we completed a reanalysis of GPS position time series for the entire GEONET from 1996 to 2012 using the JPL GIPSY/OASIS-II based GPS Network Processor [Owen et al., 2006] and raw data provided by the Geospatial Information Authority of Japan (GSI) and Caltech. We use the JPL precise GPS orbits reestimated from the present through 1996 [Desai et al., 2011], the troposphere global mapping function, and a single-receiver phase ambiguity resolution strategy [Bertiger et al., 2010] in the analysis. The resulting GPS time series solution shows improved repeatability and consistency over the ~16-year span, in comparison with the 1996-2006 GPS position estimates used in our previous analysis [Liu et al., 2010a,b]. We apply a time-series analysis framework to estimate biases, offsets caused by instrument changes, earthquakes and other unknown sources, linear trends, seasonal variations, post-seismic deformation and other transient signals. The principal component analysis method is used to estimate the common mode error across the network [Dong et al., 2006]. We construct an interplate fault geometry from a composite plate boundary model [Wang et al., 2004] and apply a Kalman-filter-based network inversion method to image the spatiotemporal variation of slip transient events on the plate interface. The highly precise GPS time series enables the detection of much smaller transient signals and starts to reveal previously unobserved features of slow slip events. For example, application to the 2009-2011 Bungo Channel slow slip event shows that it has a complex slip history, with the major event initiating in late 2009 beneath the northeast corner of the region and migrating southwestward and updip. At ~2010.75 there is activation of a smaller slip subevent to the east of the main slip region

  1. Sensitivity to censored-at-random assumption in the analysis of time-to-event endpoints.

    PubMed

    Lipkovich, Ilya; Ratitch, Bohdana; O'Kelly, Michael

    2016-05-01

    Over the past years, significant progress has been made in developing statistically rigorous methods to implement clinically interpretable sensitivity analyses for assumptions about the missingness mechanism in clinical trials for continuous and (to a lesser extent) for binary or categorical endpoints. Studies with time-to-event outcomes have received much less attention. However, such studies can be similarly challenged with respect to the robustness and integrity of primary analysis conclusions when a substantial number of subjects withdraw from treatment prematurely prior to experiencing an event of interest. We discuss how the methods that are widely used for primary analyses of time-to-event outcomes could be extended in a clinically meaningful and interpretable way to stress-test the assumption of ignorable censoring. We focus on a 'tipping point' approach, the objective of which is to postulate sensitivity parameters with a clear clinical interpretation and to identify a setting of these parameters unfavorable enough towards the experimental treatment to nullify a conclusion that was favorable to that treatment. Robustness of primary analysis results can then be assessed based on clinical plausibility of the scenario represented by the tipping point. We study several approaches for conducting such analyses based on multiple imputation using parametric, semi-parametric, and non-parametric imputation models and evaluate their operating characteristics via simulation. We argue that these methods are valuable tools for sensitivity analyses of time-to-event data and conclude that the method based on piecewise exponential imputation model of survival has some advantages over other methods studied here. Copyright © 2016 John Wiley & Sons, Ltd.
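
    A minimal sketch of one ingredient of the tipping-point approach, assuming a simple exponential imputation model: censored subjects in the experimental arm have their residual time to event imputed under a hazard inflated by a sensitivity parameter delta, and the primary analysis would then be repeated over a grid of delta values. Data, hazard and the delta grid are illustrative; the paper also studies piecewise exponential, semi-parametric and non-parametric imputation models.

```python
import numpy as np

rng = np.random.default_rng(1)

def impute_censored(times, events, arm, base_hazard, delta):
    """Replace censored times in the experimental arm (arm == 1) with draws from an
    exponential residual-life model whose hazard is inflated by `delta`."""
    imputed_t = times.copy()
    imputed_e = events.copy()
    for i in np.where(events == 0)[0]:
        hazard = base_hazard * (delta if arm[i] == 1 else 1.0)
        imputed_t[i] = times[i] + rng.exponential(1.0 / hazard)   # memoryless residual life
        imputed_e[i] = 1
    return imputed_t, imputed_e

# Illustrative data: 0 = censored, 1 = event.
times  = np.array([2.0, 5.0, 3.5, 7.0, 1.2, 6.3])
events = np.array([1, 0, 1, 0, 1, 0])
arm    = np.array([0, 0, 0, 1, 1, 1])

for delta in (1.0, 1.5, 2.0, 3.0):      # delta = 1 corresponds to censoring at random
    t_imp, e_imp = impute_censored(times, events, arm, base_hazard=0.2, delta=delta)
    # Placeholder summary; a full analysis would repeat the imputation M times, refit
    # the primary model (e.g. log-rank or Cox) on each completed data set, pool with
    # Rubin's rules, and report the smallest delta that overturns the conclusion.
    print(delta, t_imp[arm == 1].mean() - t_imp[arm == 0].mean())
```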

  3. Event processing time prediction at the CMS experiment of the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Cury, Samir; Gutsche, Oliver; Kcira, Dorian

    2014-06-01

    The physics event reconstruction is one of the biggest challenges for the computing of the LHC experiments. Among the different tasks that the computing systems of the CMS experiment perform, reconstruction takes most of the available CPU resources. The reconstruction time of single collisions varies according to event complexity. Measurements were made in order to quantify this correlation, creating the means to predict it from the data-taking conditions of the input samples. Currently the data processing system splits tasks into groups with the same number of collisions and does not account for variations in processing time. These variations can be large and can lead to a considerable increase in the time it takes for CMS workflows to finish. The goal of this study was to use estimates of processing time to split the workflow into jobs more efficiently. By considering the CPU time needed for each job, the spread of the job-length distribution in a workflow is reduced.
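
    The idea of splitting a workflow by predicted CPU time rather than by event count can be sketched with a simple greedy packer; the per-event predictions and the target job length below are placeholders, not CMS numbers.

```python
def split_by_predicted_time(event_times, target_seconds):
    """Greedily pack per-event CPU-time predictions into jobs of roughly target_seconds."""
    jobs, current, total = [], [], 0.0
    for i, t in enumerate(event_times):
        if current and total + t > target_seconds:
            jobs.append(current)            # close the current job
            current, total = [], 0.0
        current.append(i)                   # event index assigned to this job
        total += t
    if current:
        jobs.append(current)
    return jobs

# Predicted seconds per event, e.g. derived from the pile-up conditions of the input sample.
predicted = [12.0, 30.5, 8.2, 45.0, 22.1, 9.9, 31.7, 14.3]
print(split_by_predicted_time(predicted, target_seconds=60.0))
```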

  4. Model-based estimation of measures of association for time-to-event outcomes

    PubMed Central

    2014-01-01

    Background: Hazard ratios are ubiquitously used in time-to-event applications to quantify adjusted covariate effects. Although hazard ratios are invaluable for hypothesis testing, other adjusted measures of association, both relative and absolute, should be provided to fully appreciate study results. The corrected group prognosis method is generally used to estimate the absolute risk reduction and the number needed to treat for categorical covariates. Methods: The goal of this paper is to present transformation models for time-to-event outcomes to obtain, directly from estimated coefficients, the measures of association widely used in biostatistics together with their confidence intervals. Pseudo-values are used for a practical estimation of transformation models. Results: Using the regression model estimated through pseudo-values with suitable link functions, relative risks, risk differences and the number needed to treat are obtained together with their confidence intervals. One example based on literature data and one original application to the study of prognostic factors in primary retroperitoneal soft tissue sarcomas are presented. A simulation study is used to show some properties of the different estimation methods. Conclusions: Clinically useful measures of treatment or exposure effect are widely available in epidemiology. When time-to-event outcomes are present, the analysis is generally performed by resorting to predicted values from a Cox regression model. It is now possible to resort to more general regression models, adopting suitable link functions and pseudo-values for estimation, to obtain alternative measures of effect directly from regression coefficients together with their confidence intervals. This may be especially useful when, in the presence of time-dependent covariate effects, it is not straightforward to specify the correct, if any, time-dependent functional form. The method can easily be implemented with standard software. PMID:25106903
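
    A minimal sketch of the pseudo-value idea: jackknife pseudo-observations of the failure probability at a fixed time point are computed from a Kaplan-Meier estimate and then regressed on covariates, with an identity link giving a risk difference (and hence a number needed to treat) and a log link giving a relative risk. The simulated data, time point and the plain least-squares fit (standing in for a GEE with independence working correlation) are illustrative only.

```python
import numpy as np

def km_survival(times, events, t0):
    """Kaplan-Meier estimate of S(t0)."""
    order = np.argsort(times)
    t, d = times[order], events[order]
    s, at_risk = 1.0, len(t)
    for i in range(len(t)):
        if t[i] > t0:
            break
        if d[i] == 1:
            s *= 1.0 - 1.0 / at_risk
        at_risk -= 1
    return s

def pseudo_failure_probs(times, events, t0):
    """Jackknife pseudo-observations of F(t0) = 1 - S(t0) for each subject."""
    n = len(times)
    full = 1.0 - km_survival(times, events, t0)
    pseudo = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        loo = 1.0 - km_survival(times[mask], events[mask], t0)
        pseudo[i] = n * full - (n - 1) * loo
    return pseudo

rng = np.random.default_rng(0)
group = rng.integers(0, 2, 200)                        # 0 = control, 1 = treated
event_t = rng.exponential(np.where(group == 1, 12.0, 8.0))
cens = rng.exponential(15.0, 200)
obs, ev = np.minimum(event_t, cens), (event_t <= cens).astype(int)

pseudo = pseudo_failure_probs(obs, ev, t0=10.0)
X = np.column_stack([np.ones(200), group])
beta = np.linalg.lstsq(X, pseudo, rcond=None)[0]       # identity link: risk difference
print("risk difference at t0:", beta[1], "NNT:", 1.0 / abs(beta[1]))
```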

  5. Biomarker driven population enrichment for adaptive oncology trials with time to event endpoints.

    PubMed

    Mehta, Cyrus; Schäfer, Helmut; Daniel, Hanna; Irle, Sebastian

    2014-11-20

    The development of molecularly targeted therapies for certain types of cancers has led to the consideration of population enrichment designs that explicitly factor in the possibility that the experimental compound might differentially benefit different biomarker subgroups. In such designs, enrollment would initially be open to a broad patient population with the option to restrict future enrollment, following an interim analysis, to only those biomarker subgroups that appeared to be benefiting from the experimental therapy. While this strategy could greatly improve the chances of success for the trial, it poses several statistical and logistical design challenges. Because late-stage oncology trials are typically event driven, one faces a complex trade-off between power, sample size, number of events, and study duration. This trade-off is further compounded by the importance of maintaining statistical independence of the data before and after the interim analysis and of optimizing the timing of the interim analysis. This paper presents statistical methodology that ensures strong control of type 1 error for such population enrichment designs, based on generalizations of the conditional error rate approach. The special difficulties encountered with time-to-event endpoints are addressed by our methods. The crucial role of simulation for guiding the choice of design parameters is emphasized. Although motivated by oncology, the methods are applicable as well to population enrichment designs in other therapeutic areas.

  6. Real-Time Classification of Bladder Events for Effective Diagnosis and Treatment of Urinary Incontinence.

    PubMed

    Karam, Robert; Bourbeau, Dennis; Majerus, Steve; Makovey, Iryna; Goldman, Howard B; Damaser, Margot S; Bhunia, Swarup

    2016-04-01

    Diagnosis of lower urinary tract dysfunction with urodynamics has historically relied on data acquired from multiple sensors using nonphysiologically fast cystometric filling. In addition, state-of-the-art neuromodulation approaches to restore bladder function could benefit from a bladder sensor for closed-loop control, but a practical sensor and automated data analysis are not available. We have developed an algorithm for real-time bladder event detection based on a single in situ sensor, making it attractive for both extended ambulatory bladder monitoring and closed-loop control of stimulation systems for diagnosis and treatment of bladder overactivity. Using bladder pressure data acquired from 14 human subjects with neurogenic bladder, we developed context-aware thresholding, a novel, parameterized, user-tunable algorithmic framework capable of real-time classification of bladder events, such as detrusor contractions, from single-sensor bladder pressure data. We compare six event detection algorithms with both single-sensor and two-sensor systems using a metric termed Conditional Stimulation Score, which ranks algorithms based on projected stimulation efficacy and efficiency. We demonstrate that adaptive methods are more robust against day-to-day variations than static thresholding, improving sensitivity and specificity without parameter modifications. Relative to other methods, context-aware thresholding is fast, robust, highly accurate, noise-tolerant, and amenable to energy-efficient hardware implementation, which is important for mapping to an implant device. PMID:26292331
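
    A rough, hypothetical sketch of single-sensor adaptive thresholding in the spirit described above: the detector tracks a slowly moving baseline and recent variability, and flags a bladder event when pressure exceeds the baseline by a variability-scaled margin for a minimum duration. Window lengths, the multiplier and the synthetic trace are invented; the paper's context-aware thresholding is parameterised differently.

```python
import numpy as np

def detect_events(pressure, fs, baseline_win=30.0, k=4.0, min_dur=2.0):
    """Return start indices of sustained excursions above an adaptive threshold."""
    n_base = int(baseline_win * fs)
    n_min = int(min_dur * fs)
    flags = np.zeros(len(pressure), dtype=bool)
    for i in range(n_base, len(pressure)):
        window = pressure[i - n_base:i]
        baseline = np.median(window)
        spread = np.median(np.abs(window - baseline)) + 1e-6   # MAD, guards against zero
        flags[i] = pressure[i] > baseline + k * spread
    events, run = [], 0
    for i, f in enumerate(flags):       # keep only excursions lasting at least min_dur
        run = run + 1 if f else 0
        if run == n_min:
            events.append(i - n_min + 1)
    return events

# Synthetic trace: slow drift plus one contraction-like bump at ~150 s.
fs = 10.0
t = np.arange(0, 300, 1 / fs)
trace = 5 + 0.01 * t + 0.3 * np.random.randn(t.size)
trace[1500:1700] += 15.0
print(detect_events(trace, fs))
```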

  7. Negativity bias of the self across time: an event-related potentials study.

    PubMed

    Luo, Yangmei; Huang, Xiting; Chen, Youguo; Jackson, Todd; Wei, Dongtao

    2010-05-14

    To investigate the neural basis of self-evaluation across time as a function of emotional valence, event-related potentials were recorded among participants instructed to make self-reference judgments when evaluating their past, present and future selves. Results showed that, when evaluating present and past selves, negative words elicited a more positive ERP deflection in the time window between 650 ms and 800 ms (LPC) relative to positive words. However, when evaluating the future selves, there was no significant difference in the amplitude of the LPC evoked by negative versus positive words. Findings provided evidence for the effect of emotional valence on the self across time at a neurophysiological level and identified the time course of negative bias in the temporal self. More specifically, people were inclined to be relatively less negative and more optimistic about their future selves but had mixed emotions about their past and present selves.

  8. What can neuromorphic event-driven precise timing add to spike-based pattern recognition?

    PubMed

    Akolkar, Himanshu; Meyer, Cedric; Clady, Zavier; Marre, Olivier; Bartolozzi, Chiara; Panzeri, Stefano; Benosman, Ryad

    2015-03-01

    This letter introduces a study to precisely measure what an increase in spike timing precision can add to spike-driven pattern recognition algorithms. The concept of generating spikes from images by converting gray levels into spike timings is currently at the basis of almost every spike-based modeling of biological visual systems. The use of images naturally leads to generating incorrect, artificial and redundant spike timings and, more important, also contradicts biological findings indicating that visual processing is massively parallel, asynchronous and of high temporal resolution. A new concept for acquiring visual information through pixel-individual asynchronous level-crossing sampling has been proposed in a recent generation of asynchronous neuromorphic visual sensors. Unlike conventional cameras, these sensors acquire data not at fixed points in time for the entire array but at fixed amplitude changes of their input, resulting in data that are optimally sparse in space and time, pixel-individually and precisely timed, and produced only if new (previously unknown) information is available (event based). This letter uses the high temporal resolution spiking output of neuromorphic event-based visual sensors to show that lowering time precision degrades performance on several recognition tasks, specifically when reaching the conventional range of machine vision acquisition frequencies (30-60 Hz). The use of information theory to characterize separability between classes for each temporal resolution shows that high temporal acquisition provides up to 70% more information than conventional spikes generated from frame-based acquisition as used in standard artificial vision, thus drastically increasing the separability between classes of objects. Experiments on real data show that the amount of information loss is correlated with temporal precision. Our information-theoretic study highlights the potentials of neuromorphic asynchronous visual sensors for both practical applications and theoretical

  10. Nonparametric estimation of time-to-event distribution based on recall data in observational studies.

    PubMed

    Mirzaei Salehabadi, Sedigheh; Sengupta, Debasis

    2016-10-01

    In a cross-sectional observational study, time-to-event distribution can be estimated from data on current status or from recalled data on the time of occurrence. In either case, one can treat the data as having been interval censored, and use the nonparametric maximum likelihood estimator proposed by Turnbull (J R Stat Soc Ser B 38:290-295, 1976). However, the chance of recall may depend on the time span between the occurrence of the event and the time of interview. In such a case, the underlying censoring would be informative, rendering the Turnbull estimator inappropriate. In this article, we provide a nonparametric maximum likelihood estimator of the distribution of interest, by using a model adapted to the special nature of the data at hand. We also provide a computationally simple approximation of this estimator, and establish the consistency of both the original and the approximate versions, under mild conditions. Monte Carlo simulations indicate that the proposed estimators have smaller bias than the Turnbull estimator based on incomplete recall data, smaller variance than the Turnbull estimator based on current status data, and smaller mean squared error than both of them. The method is applied to menarcheal data from a recent Anthropometric study of adolescent and young adult females in Kolkata, India.

  12. Towards real-time regional earthquake simulation I: real-time moment tensor monitoring (RMT) for regional events in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong; Liang, Wen-Tzong; Cheng, Hui-Wen; Tu, Feng-Shan; Ma, Kuo-Fong; Tsuruoka, Hiroshi; Kawakatsu, Hitoshi; Huang, Bor-Shouh; Liu, Chun-Chi

    2014-01-01

    We have developed a real-time moment tensor monitoring system (RMT) which takes advantage of a grid-based moment tensor inversion technique and real-time broad-band seismic recordings to automatically monitor earthquake activity in the vicinity of Taiwan. The centroid moment tensor (CMT) inversion technique and a grid search scheme are applied to obtain the earthquake source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism. All of these source parameters can be determined simultaneously within 117 s after the occurrence of an earthquake. The monitoring area covers the entire island of Taiwan and the offshore region, spanning 119.3°E to 123.0°E and 21.0°N to 26.0°N, with depths from 6 to 136 km. A 3-D grid system is implemented in the monitoring area with a uniform horizontal interval of 0.1° and a vertical interval of 10 km. The inversion procedure is based on a 1-D Green's function database calculated by the frequency-wavenumber (fk) method. We compare our results with the Central Weather Bureau (CWB) catalogue data for earthquakes that occurred between 2010 and 2012. The average differences in event origin time and hypocentral location are less than 2 s and 10 km, respectively. The focal mechanisms determined by RMT are also comparable with the Broadband Array in Taiwan for Seismology (BATS) CMT solutions. These results indicate that the RMT system is realizable and efficient for monitoring local seismic activity. In addition, the time needed to obtain all the point-source parameters is reduced substantially compared to routine earthquake reports. By connecting RMT with a real-time online earthquake simulation (ROS) system, all the source parameters will be forwarded to the ROS to make real-time earthquake simulation feasible. The RMT has operated offline (2010-2011) and online (from January 2012 to the present) at the Institute of Earth Sciences (IES), Academia Sinica

  13. Time dependent inversion of the August 2010 Northern Cascadia slow slip event

    NASA Astrophysics Data System (ADS)

    Mitchell, C. A.; Bartlow, N. M.; Segall, P.

    2011-12-01

    Although slip and tremor occur together, their precise spatial and temporal relationship is not well determined. We analyze data for the period of August to September 2010 from the northern Cascadia subduction zone, where the Juan de Fuca plate obliquely subducts below North America. Using geodetic (GPS and tilt) data from Pacific Northwest Geodetic Array (PANGA) we reconstruct a record of slip and slip-rate during the ETS event. To do this we first remove seasonal signals and long-term secular velocities from the data, and then use a time-dependent slip inversion method (constrained Network Inversion Filter [Segall and Matthews, J. Geophys. Res., 1997; Simon and Simon, IEE Proc.-Control Theory Appl., 2006]) to solve for slip and slip-rate. We also utilize tremor locations from University of Washington Pacific Northwest Seismic Network (PNSN), to reconstruct the migration of the tectonic tremor. Utilizing the UW tremor locations, we superimpose the tremor data onto the geodetic inversion results to correlate the tremor migration with the slow slip event. The tremor belt spans from Vancouver, BC to a little north of Portland, Oregon. Fairly consistent with model predictions, the geodetic data displays noticeable displacement of GPS stations located in and around the Olympic peninsula region. Although derived from different data types, slip and tremor occur generally in the same location and during the same time period. A similar synchronized movement of the tremor and slip-rate locations was found by Bartlow et al for the 2009 ETS event. Our results indicate that this synchronized migration is likely true of all ETS in Cascadia, as it has now been shown for both this ETS event and the August 2009 central Cascadia ETS (Bartlow et al, 2011). Analysis of the geodetic data and tremor epicenters indicate that the ETS event migrated bimodally and appears to have originated near the Puget sound. GPS stations located in this general area had greater measured displacement

  14. The Temporally Integrated Monitoring of Ecosystems (TIME) Project Design: 1. Classification of Northeast Lakes Using a Combination of Geographic, Hydrogeochemical, and Multivariate Techniques

    NASA Astrophysics Data System (ADS)

    Young, Thomas C.; Stoddard, John L.

    1996-08-01

    This investigation is part of the Temporally Integrated Monitoring of Ecosystems (TIME) project, an effort to meet the difficult challenge of monitoring surface water quality in the northeastern United States for signs of change in response to the Clean Air Act Amendments of 1990. The overall objective of the study was to develop a unified scheme for classifying lakes in the northeast into relatively homogeneous groups and improve the likelihood of detecting water quality trends in the region. The study approach involved combining the best elements of several procedures recently used for defining regional subpopulations of lakes; these were termed the hydrogeochemical model (HM), geographical model (GM), and multivariate statistical model (MSM). Lake and watershed data from the U.S. Environmental Protection Agency Eastern Lake Survey (ELS) were used to evaluate the classification methods and their modifications. After preliminary comparisons of the three classification schemes, it was concluded that the resulting subpopulations showed meaningful similarity among methods but that the significant dissimilarity reflected distinctive attributes of each classification method. These differences were deemed important; accordingly, integration of the methods entailed efforts to preserve parts of each. This was accomplished by assigning each lake of the ELS data set into a lake cluster that had been defined by jointly applying the HM and GM methods. Subsequently, the jointly classified clusters were aggregated by coupling an application of the MSM (cluster analysis) with subjective judgment regarding termination of the process of cluster formation. This integration of procedures gave rise to nine subpopulations that separated mainly on the basis of hydrogeochemical factors, though geographic influences also were evident in the results. The integrated classification procedure provided an explicit method involving the combination of several kinds
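
    A minimal sketch of the multivariate statistical step (cluster analysis) on standardised lake variables, cutting a Ward hierarchy into nine groups as in the study; the variables and values below are invented placeholders rather than ELS data, and the joint HM/GM pre-classification is not reproduced.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(42)
# Toy matrix of lakes x hydrogeochemical variables (e.g. ANC, sulfate, DOC, calcium).
lakes = rng.normal(size=(120, 4))

standardised = (lakes - lakes.mean(axis=0)) / lakes.std(axis=0)
tree = linkage(standardised, method="ward")
subpopulation = fcluster(tree, t=9, criterion="maxclust")   # nine groups, as in the study
print(np.bincount(subpopulation)[1:])                       # lakes per subpopulation
```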

  15. Time-Frequency Data Reduction for Event Related Potentials: Combining Principal Component Analysis and Matching Pursuit

    PubMed Central

    Aviyente, Selin; Bernat, Edward M.; Malone, Stephen M.; Iacono, William G.

    2010-01-01

    Joint time-frequency representations offer a rich representation of event related potentials (ERPs) that cannot be obtained through individual time or frequency domain analysis. This representation, however, comes at the expense of increased data volume and the difficulty of interpreting the resulting representations. Therefore, methods that can reduce the large amount of time-frequency data to experimentally relevant components are essential. In this paper, we present a method that reduces the large volume of ERP time-frequency data into a few significant time-frequency parameters. The proposed method is based on applying the widely-used matching pursuit (MP) approach, with a Gabor dictionary, to principal components extracted from the time-frequency domain. The proposed PCA-Gabor decomposition is compared with other time-frequency data reduction methods such as the time-frequency PCA approach alone and standard matching pursuit methods using a Gabor dictionary for both simulated and biological data. The results show that the proposed PCA-Gabor approach performs better than either the PCA alone or the standard MP data reduction methods, by using the smallest amount of ERP data variance to produce the strongest statistical separation between experimental conditions. PMID:20730031

  16. Problems with Multivariate Normality: Can the Multivariate Bootstrap Help?

    ERIC Educational Resources Information Center

    Thompson, Bruce

    Multivariate normality is required for some statistical tests. This paper explores the implications of violating the assumption of multivariate normality and illustrates a graphical procedure for evaluating multivariate normality. The logic for using the multivariate bootstrap is presented. The multivariate bootstrap can be used when distribution…

  17. Auroral observations in the Antarctic at the time of the Tunguska event, 1908 June 30.

    NASA Astrophysics Data System (ADS)

    Steel, D.; Ferguson, R.

    1993-03-01

    The original notebooks of Sir Douglas Mawson containing observations of the aurora australis by members of the British Antarctic Expedition at the time of the Tunguska explosion over Siberia on 1908 June 30 have been inspected, and it is found that, contrary to some suggestions which note that geomagnetic transients were witnessed elsewhere, and that the BAE was in winter quarters close to the south magnetic pole at the time, no exceptional auroral activity was seen which might have provided useful information on a planet-wide disturbance at the time of the event. However, an exceptional aurora was seen about seven hours prior to the explosion, and it is suggested that this may have been due to an anti-solar comet-like ion tail producing that auroral display whilst the impactor was still far from Earth.

  18. A mathematical approach for evaluating Markov models in continuous time without discrete-event simulation.

    PubMed

    van Rosmalen, Joost; Toy, Mehlika; O'Mahony, James F

    2013-08-01

    Markov models are a simple and powerful tool for analyzing the health and economic effects of health care interventions. These models are usually evaluated in discrete time using cohort analysis. The use of discrete time assumes that changes in health states occur only at the end of a cycle period. Discrete-time Markov models only approximate the process of disease progression, as clinical events typically occur in continuous time. The approximation can yield biased cost-effectiveness estimates for Markov models with long cycle periods and if no half-cycle correction is made. The purpose of this article is to present an overview of methods for evaluating Markov models in continuous time. These methods use mathematical results from stochastic process theory and control theory. The methods are illustrated using an applied example on the cost-effectiveness of antiviral therapy for chronic hepatitis B. The main result is a mathematical solution for the expected time spent in each state in a continuous-time Markov model. It is shown how this solution can account for age-dependent transition rates and discounting of costs and health effects, and how the concept of tunnel states can be used to account for transition rates that depend on the time spent in a state. The applied example shows that the continuous-time model yields more accurate results than the discrete-time model but does not require much computation time and is easily implemented. In conclusion, continuous-time Markov models are a feasible alternative to cohort analysis and can offer several theoretical and practical advantages. PMID:23715464
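
    A minimal sketch of the central quantity discussed above: the expected (discounted) time spent in each state of a continuous-time Markov model over a horizon T, obtained in closed form from the generator matrix rather than by cycling a discrete-time cohort. The three-state generator and discount rate are illustrative; age-dependent rates and tunnel states are not included.

```python
import numpy as np
from scipy.linalg import expm, solve

def expected_discounted_time(Q, p0, T, rate=0.03):
    """E[time in each state over [0, T]], continuously discounted at `rate`.

    Uses the identity  integral_0^T exp(A t) dt = A^{-1} (exp(A T) - I)  with
    A = Q - rate*I, which is invertible for rate > 0 because the eigenvalues
    of a generator Q have non-positive real parts.
    """
    n = Q.shape[0]
    A = Q - rate * np.eye(n)
    integral = solve(A, expm(A * T) - np.eye(n))
    return p0 @ integral

Q = np.array([[-0.20,  0.15, 0.05],    # transition intensities per year (rows sum to 0)
              [ 0.00, -0.30, 0.30],
              [ 0.00,  0.00, 0.00]])   # absorbing "dead" state
p0 = np.array([1.0, 0.0, 0.0])         # everyone starts in the first state
print(expected_discounted_time(Q, p0, T=20.0))   # discounted years spent in each state
```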

  19. Real-time gait event detection for transfemoral amputees during ramp ascending and descending.

    PubMed

    Maqbool, H F; Husman, M A B; Awad, M I; Abouhossein, A; Dehghani-Sanij, A A

    2015-01-01

    Detection of the events and phases of human gait is vital for controlling prostheses, orthoses and functional electrical stimulation (FES) systems. Wearable sensors are inexpensive, portable and have fast processing capability. They are frequently used to assess spatio-temporal, kinematic and kinetic parameters of human gait, which in turn provide more detail about voluntary control and amputee-prosthesis interaction. This paper presents a reliable real-time gait event detection algorithm based on a simple heuristic approach, applicable to signals from a tri-axial gyroscope, for lower limb amputees during ramp ascending and descending. Experimental validation is done by comparing the results of the gyroscope signal with footswitches. For healthy subjects, the mean difference between events detected by the gyroscope and footswitches is 14 ms and 10.5 ms for initial contact (IC), whereas for toe off (TO) it is -5 ms and -25 ms for ramp up and down, respectively. For the transfemoral amputee, the error is slightly higher, either due to the placement of footswitches underneath the foot or the lack of proper knee flexion and ankle plantarflexion/dorsiflexion during ramp up and down. Finally, repeatability tests showed promising results.
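
    A rough, hypothetical sketch of a heuristic gyroscope-based detector: the sagittal angular velocity of the shank shows a large mid-swing peak, and the zero crossings before and after that peak are taken as toe off and initial contact. The threshold and synthetic signal are invented, and the rule set is a generic one rather than necessarily the algorithm used in the paper.

```python
import numpy as np

def detect_gait_events(gyro_z, swing_thresh=2.0):
    """Return sample indices of initial contact (IC) and toe off (TO)."""
    ic, to = [], []
    peaks = [i for i in range(1, len(gyro_z) - 1)
             if gyro_z[i] > swing_thresh
             and gyro_z[i] >= gyro_z[i - 1] and gyro_z[i] > gyro_z[i + 1]]
    for p in peaks:
        j = p
        while j > 0 and gyro_z[j] > 0:                 # walk back to the zero crossing: TO
            j -= 1
        to.append(j)
        k = p
        while k < len(gyro_z) - 1 and gyro_z[k] > 0:   # walk forward to the next crossing: IC
            k += 1
        ic.append(k)
    return ic, to

# Synthetic sagittal angular velocity (rad/s) with two swing-phase peaks.
fs = 100.0
t = np.arange(0, 2.4, 1 / fs)
gyro = 3.0 * np.sin(2 * np.pi * 0.83 * t) - 0.5
ic, to = detect_gait_events(gyro)
print("IC times (s):", [i / fs for i in ic], "TO times (s):", [i / fs for i in to])
```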

  1. Recurrent event data analysis with intermittently observed time-varying covariates.

    PubMed

    Li, Shanshan; Sun, Yifei; Huang, Chiung-Yu; Follmann, Dean A; Krause, Richard

    2016-08-15

    Although recurrent event data analysis is a rapidly evolving area of research, rigorous studies on estimation of the effects of intermittently observed time-varying covariates on the risk of recurrent events have been lacking. Existing methods for analyzing recurrent event data usually require that the covariate processes be observed throughout the entire follow-up period. However, covariates are often observed periodically rather than continuously. We propose a novel semiparametric estimator for the regression parameters in the popular proportional rate model. The proposed estimator is based on an estimated score function in which we kernel smooth the mean covariate process. We show that the proposed semiparametric estimator is asymptotically unbiased and normally distributed, and we derive its asymptotic variance. Simulation studies are conducted to compare the performance of the proposed estimator with that of simple methods that carry forward the last observed covariates. The different methods are applied to an observational study designed to assess the effect of group A streptococcus on pharyngitis among school children in India. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26887664
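
    A minimal sketch of the kernel-smoothing idea, contrasted with carrying the last observation forward: the covariate value at an arbitrary event time is estimated by a Nadaraya-Watson smoother over the intermittent measurements. Bandwidth and data are illustrative, and the full estimating-equation machinery of the proposed estimator is not shown.

```python
import numpy as np

def kernel_smooth(obs_times, obs_values, t, bandwidth=1.0):
    """Gaussian-kernel (Nadaraya-Watson) estimate of the covariate at time t."""
    w = np.exp(-0.5 * ((obs_times - t) / bandwidth) ** 2)
    return np.sum(w * obs_values) / np.sum(w)

def locf(obs_times, obs_values, t):
    """Last observation carried forward, for comparison."""
    past = obs_times <= t
    return obs_values[past][np.argmax(obs_times[past])] if past.any() else np.nan

# One subject's covariate measured at clinic visits only.
visit_times  = np.array([0.0, 2.0, 5.0, 9.0])
visit_values = np.array([1.2, 1.8, 3.1, 2.6])
for t in (3.0, 6.5, 8.0):   # e.g. times at which recurrent events occurred
    print(t, kernel_smooth(visit_times, visit_values, t), locf(visit_times, visit_values, t))
```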

  2. Integrated survival analysis using an event-time approach in a Bayesian framework

    USGS Publications Warehouse

    Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly, within the ecological field where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known and times of which are interval-censored with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the

  3. Integrated survival analysis using an event-time approach in a Bayesian framework

    PubMed Central

    Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly, within the ecological field where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known and times of which are interval-censored with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the

  4. Integrated survival analysis using an event-time approach in a Bayesian framework.

    PubMed

    Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M

    2015-02-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly, within the ecological field where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known and times of which are interval-censored with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the
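
    A minimal sketch of the piecewise-constant hazard building block used in this event-time approach: the cumulative hazard, the survival function, and the likelihood contributions of an exactly observed failure and of an interval-censored one. Cut points and hazard values are illustrative; the detection model for unknown-fate individuals and the Bayesian estimation layer are not shown.

```python
import numpy as np

cuts    = np.array([0.0, 5.0, 10.0, np.inf])   # age intervals (days)
hazards = np.array([0.10, 0.04, 0.02])         # constant hazard within each interval

def cumulative_hazard(t):
    H = 0.0
    for k in range(len(hazards)):
        lo, hi = cuts[k], cuts[k + 1]
        H += hazards[k] * max(0.0, min(t, hi) - lo)
    return H

def survival(t):
    return np.exp(-cumulative_hazard(t))

def loglik_exact(t):
    """Failure observed exactly at t: log f(t) = log h(t) + log S(t)."""
    k = np.searchsorted(cuts, t, side="right") - 1
    return np.log(hazards[k]) - cumulative_hazard(t)

def loglik_interval(t_lo, t_hi):
    """Failure known only to lie in (t_lo, t_hi]."""
    return np.log(survival(t_lo) - survival(t_hi))

print(survival(7.0), loglik_exact(7.0), loglik_interval(5.0, 10.0))
```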

  5. SOLAR ENERGETIC-PARTICLE RELEASE TIMES IN HISTORIC GROUND-LEVEL EVENTS

    SciTech Connect

    Reames, Donald V.

    2009-11-20

    Ground-level events (GLEs) are large solar energetic-particle events with sufficiently hard spectra for GeV protons to be detected by neutron monitors at ground level. For each of 30 well-observed historic GLEs from four solar cycles, extending back to 1973, I have plotted onset times versus velocity⁻¹ for particles observed on the IMP-7 and 8, ISEE-3, Wind, and GOES spacecraft and by neutron monitors. A linear fit on such a plot for each GLE determines the initial solar particle release (SPR) time, as the intercept, and the magnetic path length traversed, as the slope, of the fitted line. Magnetic path lengths and SPR times are well determined by the fits and cannot be used as adjustable parameters to make particle and photon emission times coincide. SPR times follow the onsets of shock-induced type II radio bursts, and the coronal height of the coronal mass ejection (CME)-driven shock at the SPR time can be determined for GLEs spanning an interval of solar longitude of ~140°. For a given GLE, all particle species and energies diverge from a single SPR point at a given coronal height and footpoint longitude of the field line to the Earth. These heights tend to increase with longitudinal distance away from the source, a pattern expected for shock acceleration. Acceleration for magnetically well-connected large GLEs begins at ~2 solar radii, in contrast to non-GLEs that have been found to be strongly associated with shocks above ~3 solar radii. The higher densities and magnetic field strengths at lower altitudes may be responsible for the acceleration of higher-energy particles in GLEs, while those GLEs that begin above 3 solar radii may compensate by having higher shock speeds. These results support the joint dependence of maximum particle energy on magnetic field strength, injected particle density, and shock speed, all predicted theoretically.
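
    The velocity-dispersion analysis described above amounts to a straight-line fit of onset time against inverse velocity, with the intercept giving the solar particle release time and the slope the magnetic path length. The sketch below fits synthetic onsets generated from an assumed SPR time and a 1.3 AU path; no real GLE data are used.

```python
import numpy as np

AU_KM = 1.495978707e8
true_spr_minutes = 0.0      # SPR time, minutes after an arbitrary reference
true_path_au = 1.3          # assumed magnetic path length

# Particle speeds as a fraction of c, from GeV protons down to slower species.
beta = np.array([0.90, 0.75, 0.60, 0.45, 0.30, 0.20, 0.15])
inv_v = 1.0 / (beta * 299792.458)                                   # seconds per km
onset_minutes = true_spr_minutes + true_path_au * AU_KM * inv_v / 60.0
onset_minutes += np.random.default_rng(3).normal(0, 1.0, beta.size)  # ~1 min timing noise

slope, intercept = np.polyfit(inv_v, onset_minutes, 1)
print("SPR time (min):", intercept, " path length (AU):", slope * 60.0 / AU_KM)
```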

  6. A Word Extraction Method from Newspaper Articles Based on Time Information for Event Sequence Mining

    NASA Astrophysics Data System (ADS)

    Tada, Tomomichi; Iwanuma, Koji; Nabeshima, Hidetomo

    This paper presents a new method for extracting important words from newspaper articles based on time-sequence information. This word extraction method plays an important role in event sequence mining. TF-IDF is a well-known method for ranking a word's importance in a document. However, the TF-IDF method does not consider the time information embedded in sequential textual data, which is peculiar to newspapers. In this research, we propose a new word-extraction method, called the TF-IDayF method, which considers time-sequence information and can extract important/characteristic words expressing sequential events. The TF-IDayF method does not use the so-called burst phenomenon of topic word occurrences, which has been studied by many researchers. The TF-IDayF method is quite simple, but effective and easy to compute in sequential textual mining. We evaluate the proposed method from three points of view, i.e., a semantic viewpoint, a statistical viewpoint and a data mining viewpoint, through several experiments.
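
    The abstract does not give the TF-IDayF formula, so the sketch below is only a hypothetical reading of the name: the document frequency of TF-IDF is replaced by a day frequency, the number of distinct publication days on which a word occurs. Treat the scoring rule and the toy corpus as assumptions, not the authors' definition.

```python
import math
from collections import Counter, defaultdict

articles = [  # (publication day, text) -- toy corpus
    ("2024-01-01", "earthquake strikes coastal city"),
    ("2024-01-01", "city begins earthquake recovery"),
    ("2024-01-02", "election campaign begins in city"),
    ("2024-01-03", "earthquake aftershocks continue"),
]

tf = Counter()                   # term frequency over the whole stream
day_sets = defaultdict(set)      # distinct days on which each word appears
for day, text in articles:
    for word in text.split():
        tf[word] += 1
        day_sets[word].add(day)

n_days = len({day for day, _ in articles})
scores = {w: tf[w] * math.log(n_days / len(day_sets[w])) for w in tf}
for w, s in sorted(scores.items(), key=lambda kv: -kv[1])[:5]:
    print(w, round(s, 3))        # words concentrated on few days score highest
```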

  7. Time evolution of atmospheric particle number concentration during high-intensity pyrotechnic events

    NASA Astrophysics Data System (ADS)

    Crespo, Javier; Yubero, Eduardo; Nicolás, Jose F.; Caballero, Sandra; Galindo, Nuria

    2014-10-01

    The Mascletàs are high-intensity pyrotechnic events, typical of eastern Spanish festivals, in which thousands of firecrackers are burnt at ground level in an intense, short-time (<8 min) deafening spectacle that generates short-lived, thick aerosol clouds. In this study, the impact of such events on air quality has been evaluated by means of particle number concentration measurements performed close to the venue during the June festival in Alicante (southeastern Spain). Peak concentrations and dilution times observed throughout the Mascletàs have been compared to those measured when conventional aerial fireworks were launched 2 km away from the monitoring site. The impact of the Mascletàs on the total number concentration of particles larger than 0.3 μm was higher (maximum ~2·10⁴ cm⁻³) than that of fireworks (maximum ~2·10³ cm⁻³). The effect of fireworks depended on whether the dominant meteorological conditions favoured the transport of the plume to the measurement location. However, the time required for particle concentrations to return to background levels is longer and more variable for firework displays (minutes to hours) than for the Mascletàs (<25 min).

  8. Timing of follicular phase events and the postovulatory progesterone rise following synchronisation of oestrus in cows.

    PubMed

    Starbuck, G R; Gutierrez, C G; Peters, A R; Mann, G E

    2006-07-01

    In cows the timing of both ovulation and the subsequent postovulatory progesterone rise is critical to successful fertilisation and early embryo development. The aim of this study was to determine the degree of variability in the timing of ovulation relative to other follicular phase events and to determine how variations in the timing of follicular phase events contribute to the timing of the postovulatory progesterone rise. Plasma concentrations of progesterone, oestradiol and luteinising hormone (LH) and the timing of oestrus and ovulation were determined following induction of luteolysis in 18 mature, non-lactating Holstein-Friesian cows. Four cows were excluded on the basis of abnormal reproductive function. In the remaining 14 cows oestrus occurred at 57.4 ± 4.3 h and the LH surge at 54.6 ± 4.0 h following luteolysis (progesterone <1 ng mL⁻¹), followed by a fall in circulating oestradiol concentration at 64.6 ± 4.4 h. Cows ovulated at 88.0 ± 4.7 h, with the postovulatory progesterone rise (to >1 ng mL⁻¹) occurring 159 ± 7.2 h after luteolysis. There was considerable variation in the timing of ovulation following luteolysis (range 64-136 h), onset of oestrus (range 24-40 h) and onset of the LH surge (range 24-44 h). Cows were then split on the basis of the interval from progesterone fall to progesterone rise, giving groups (n = 7 per group) with intervals of 180.6 ± 6.7 and 138.3 ± 5.7 h (P < 0.001). Between groups, both the interval from luteolysis to ovulation (98.3 ± 6.9 vs 77.7 ± 3.4 h; P < 0.05) and the interval from ovulation to progesterone rise (82.3 ± 4.2 vs 60.6 ± 5.5 h; P < 0.01) were longer in late-rise cows. There was no difference between groups in the interval from oestrus or LH surge to ovulation. In conclusion, the results of this study further highlight the high variability that exists in the timing and interrelationships of follicular phase events in the modern dairy cow, re-emphasising the challenges that exist in optimising mating strategies. However, the data do

  9. Time-Based and Event-Based Prospective Memory in Autism Spectrum Disorder: The Roles of Executive Function and Theory of Mind, and Time-Estimation

    ERIC Educational Resources Information Center

    Williams, David; Boucher, Jill; Lind, Sophie; Jarrold, Christopher

    2013-01-01

    Prospective memory (remembering to carry out an action in the future) has been studied relatively little in ASD. We explored time-based (carry out an action at a pre-specified time) and event-based (carry out an action upon the occurrence of a pre-specified event) prospective memory, as well as possible cognitive correlates, among 21…

  10. The timing of life history events in the presence of soft disturbances.

    PubMed

    Bertacchi, Daniela; Zucca, Fabio; Ambrosini, Roberto

    2016-01-21

    We study a model for the evolutionarily stable strategy (ESS) used by biological populations for choosing the time of life-history events, such as arrival from migration and breeding. In our model we account for both intra-species competition (early individuals have a competitive advantage) and a disturbance which strikes at a random time, killing a fraction 1-p of the population. Disturbances include spells of bad weather, such as freezing or heavily raining days. It has been shown by Iwasa and Levin (1995) that when the disturbance is so strong that it kills any individual present when it strikes (hard disturbance, p=0), then the ESS is a mixed strategy (individuals choose their arrival date in an interval of possible dates, according to a certain probability distribution). In this case, individuals wait for a certain time and afterwards start arriving (or breeding) every day. In this paper we explore a biologically more realistic situation whereby the disturbance kills only a fraction of the individuals (soft disturbance, p>0). We also remove some technical assumptions which Iwasa and Levin made on the distribution of the disturbance. We prove that the ESS is still a mixed choice of times, however with respect to the case of hard disturbance, a new phenomenon arises: whenever the disturbance is soft, if the competition is sufficiently strong, the waiting time disappears and a fraction of the population arrives at the earliest day possible, while the rest will arrive throughout the whole period during which the disturbance may occur. This means that under strong competition, the payoff of early arrival balances the increased risk of being killed by the disturbance. We study the behaviour of the ESS and of the average fitness of the population, depending on the parameters involved. We also investigate how the population may be affected by climate change: namely the occurrence of more extreme weather events, which may kill a larger fraction of the population, and

  11. Trends in the Timing and Volume of Peak Flow Events in the Missouri River Basin

    NASA Astrophysics Data System (ADS)

    Stamm, J. F.; Anderson, M. T.; Norton, P. A.

    2008-12-01

    Long-term changes in streamflow may respond to climate change which will impact management practices for water resources and wetland ecosystems. Previous studies of mean annual streamflow in the Missouri River Basin indicate statistically significant decreases in the western part of the basin and increases in the eastern part over the last 50 years. Trends were further explored from 1950-2007 at three streams in the basin that are part of the USGS Hydro-Climatic Data Network (HCDN): Cheyenne River at Edgemont SD, Yellowstone River at Billings MT, and James River at Scotland SD. Patterns were examined for annual peak flow and flows exceeding the 2-year recurrence interval (RI). Changes in the nature and timing of peak flows are of particular geomorphic importance in that they sculpt and control channel form. The Cheyenne River exhibited a marked decrease in the volume of water conveyed by floods since 1980. From 1950 to 1980, 20 peak flow events exceeded the 2-year RI and conveyed a total of 763,000 acre-feet (AF). From 1980 to 2007, there were only 9 such events and they conveyed 134,000 AF. From 1950-70, the annual peak discharge occurred between May and August. Since 1970, timing of annual peak discharge is more variable, occurring from March to October. The Yellowstone River had 18 peak flows exceeding the 2-year RI prior to 1980 and conveyed a total of 16,300,000 AF, and 11 such events since 1980 conveyed 9,160,000 AF. By contrast, the James River showed increasing volumes conveyed by floods since 1980. From 1950 to 1980, the 11 peak flows exceeding the 2-year RI conveyed a total of 4,210,000 AF, and 15 events after 1980 conveyed 12,200,000 AF. Peak flow exceeded the 2-year RI in 8 out of 9 years from 1993 to 2001. The change in frequency of channel shaping flows and volume of runoff will over time change the geomorphic nature of these streams and rivers.

  12. A Catalog of Transit Timing Posterior Distributions for all Kepler Planet Candidate Transit Events

    NASA Astrophysics Data System (ADS)

    Montet, Benjamin Tyler; Becker, Juliette C.; Johnson, John Asher

    2015-12-01

    Kepler has ushered in a new era of planetary dynamics, enabling the detection of interactions between multiple planets in transiting systems for hundreds of systems. These interactions, observed as transit timing variations (TTVs), have been used to find non-transiting companions to transiting systems and to measure masses, eccentricities, and inclinations of transiting planets. Often, physical parameters are inferred by comparing the observed light curve to the result of a photodynamical model, a time-intensive process that often ignores the effects of correlated noise in the light curve. Catalogs of transit timing observations have previously neglected non-Gaussian uncertainties in the times of transit, uncertainties in the transit shape, and short cadence data. Here, I present a catalog of not only times of transit centers, but also posterior distributions on the time of transit for every planet candidate transit event in the Kepler data, developed through importance sampling of each transit. This catalog allows one to marginalize over uncertainties in the transit shape and incorporate short cadence data, the effects of correlated noise, and non-Gaussian posteriors. Our catalog will enable dynamical studies that reflect accurately the precision of Kepler and its limitations without requiring the computational power to model the light curve completely with every integration. I will also present our open-source N-body photodynamical modeling code, which integrates planetary and stellar orbits accounting for the effects of GR, tidal effects, and Doppler beaming.

  13. NASA Climate Days: Promoting Climate Literacy One Ambassador and One Event at a Time

    NASA Astrophysics Data System (ADS)

    Weir, H. M.; Lewis, P. M.; Chambers, L. H.; Millham, R. A.; Richardson, A.

    2012-12-01

    presentations from the training, along with downloadable Climate Day Kit materials. Utilizing informal educators from museums, aquariums, libraries and other similar venues allows the hard-to-understand, sometimes-controversial, topic of climate change to be presented to the public in tailored events that suit an individual community's needs. As part of scheduling and executing these climate events, the Ambassadors participate in virtual conferences to discuss progress, to ensure proper evaluation and to allow ample time for questions from the trainers and scientists. This ensures an accurate stream of information from the scientist to the public in a fashion that can be understood and digested by the layperson, helping them to make better-informed decisions about societal issues related to global climate change. Through a series of local Climate Day events, it is hoped that the public will have the opportunity to have first-hand experience with the topic of climate change, leaving with a better understanding of its scientific basis. Outcome: This paper will summarize the various methods and strategies used in the Climate Day training events. A discussion of methods that work and those that do not for informal education will help provide a better understanding of the challenges faced in educating the public on such a controversial and hard-to-understand topic.

  14. Timing of Mississippi Valley-type mineralization: Relation to Appalachian orogenic events

    SciTech Connect

    Kesler, S.E.; van der Pluijm, B.A. )

    1990-11-01

    Although Mississippi Valley-type deposits in Lower Ordovician carbonate rocks of the Appalachian orogen are commonly interpreted to have been precipitated by basinal brines, the timing of brine migration remains poorly known. Late Paleozoic K-Ar isotopic ages on authigenic K-feldspar, which is widespread in Appalachian carbonate rocks, as well as evidence of paleomagnetic overprints of similar age, have focused attention on the possibility that these Mississippi Valley-type deposits formed as a result of late Paleozoic deformation. Geologic and geochemical similarities among most of these deposits, from Georgia to Newfoundland, including unusually high sphalerite/galena ratios, isotopically heavy sulfur, and relatively nonradiogenic lead, suggest that they are coeval. Sphalerite sand that parallels host-rock layering in many of the deposits indicates that mineralization occurred before regional deformation. Although the late Paleozoic age of deformation in the southern Appalachians provides little constraint on the age of Mississippi Valley-type mineralization, deformation of these deposits in the Newfoundland Appalachians is early to middle Paleozoic in age. Thus, if Ordovician-hosted, Appalachian Mississippi Valley-type deposits are coeval, they must have formed by middle Paleozoic time and cannot be the product of a late Paleozoic fluid-expulsion event. This hypothesis has important implications for basin evolution, fluid events, and remagnetization in the Appalachians.

  15. Two-group time-to-event continual reassessment method using likelihood estimation.

    PubMed

    Salter, Amber; O'Quigley, John; Cutter, Gary R; Aban, Inmaculada B

    2015-11-01

    Patient heterogeneity is inherent in dose-finding studies (i.e. groups with different maximum tolerated doses). When this type of heterogeneity is not accounted for in the trial design, subjects may be exposed to toxic or suboptimal doses. Options to handle patient heterogeneity include conducting separate trials or splitting the trial into arms. However, cost and/or lack of resources may limit the feasibility of these options, and neither option can take advantage of information shared between the groups. Extending current dose-finding designs to handle patient heterogeneity maximizes the utility of existing methods within a single trial. We propose a modification to the time-to-event continual reassessment method to accommodate two groups using a two-parameter model and maximum likelihood estimation. The operating characteristics of the design are investigated through simulations under different scenarios including the scenario where one conducts two separate trials, one for each group, using the one-sample time-to-event continual reassessment method.
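
    A schematic of a two-group, time-to-event weighted likelihood with maximum likelihood estimation is sketched below; the skeleton values, the particular two-parameter working model p = skeleton[d]^exp(a + b*group), and the interim data are illustrative assumptions, not the authors' exact specification.

      import numpy as np
      from scipy.optimize import minimize

      skeleton = np.array([0.05, 0.10, 0.20, 0.35, 0.50])   # hypothetical prior toxicity guesses

      def neg_loglik(params, dose_idx, group, weight, dlt):
          """Two-parameter working model: p(tox) = skeleton[d] ** exp(a + b * group)."""
          a, b = params
          p = np.clip(skeleton[dose_idx] ** np.exp(a + b * group), 1e-10, 1 - 1e-10)
          # TITE weighting: observed DLTs contribute p; pending patients contribute 1 - w * p,
          # where w = follow-up time / full observation window.
          return -np.sum(dlt * np.log(p) + (1 - dlt) * np.log(1 - weight * p))

      def recommend(dose_idx, group, weight, dlt, target=0.25):
          fit = minimize(neg_loglik, x0=[0.0, 0.0],
                         args=(dose_idx, group, weight, dlt), method="Nelder-Mead")
          a, b = fit.x
          return {g: int(np.argmin(np.abs(skeleton ** np.exp(a + b * g) - target)))
                  for g in (0, 1)}                          # dose closest to the target DLT rate

      # Hypothetical interim data: dose index, group (0/1), follow-up weight, DLT indicator.
      dose_idx = np.array([0, 0, 1, 1, 1, 2, 2, 0, 1, 2])
      group    = np.array([0, 0, 0, 0, 1, 1, 1, 1, 0, 0])
      weight   = np.array([1.0, 1.0, 1.0, 0.8, 0.6, 1.0, 1.0, 1.0, 0.9, 0.3])
      dlt      = np.array([0, 0, 0, 0, 0, 1, 0, 0, 0, 0])
      print(recommend(dose_idx, group, weight, dlt))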

  16. Post-event human decision errors: operator action tree/time reliability correlation

    SciTech Connect

    Hall, R E; Fragola, J; Wreathall, J

    1982-11-01

    This report documents an interim framework for the quantification of the probability of errors of decision on the part of nuclear power plant operators after the initiation of an accident. The framework can easily be incorporated into an event tree/fault tree analysis. The method presented consists of a structure called the operator action tree and a time reliability correlation which assumes the time available for making a decision to be the dominating factor in situations requiring cognitive human response. This limited approach decreases the magnitude and complexity of the decision modeling task. Specifically, in the past, some human performance models have attempted prediction by trying to emulate sequences of human actions, or by identifying and modeling the information processing approach applicable to the task. The model developed here is directed at describing the statistical performance of a representative group of hypothetical individuals responding to generalized situations.
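
    Time-reliability correlations of this kind are commonly represented as a lognormal distribution of crew response time; the sketch below illustrates that generic form with hypothetical median and error-factor values, and is not the curve fitted in the report.

      from math import log
      from scipy.stats import norm

      def p_nonresponse(t_available_min, median_min=5.0, error_factor=3.0):
          """Probability that no correct action has been taken after t_available minutes,
          assuming a lognormal response time with the given median and error factor
          (EF = 95th percentile / median, so sigma = ln(EF) / 1.645). Values are illustrative."""
          sigma = log(error_factor) / 1.645
          z = (log(t_available_min) - log(median_min)) / sigma
          return 1.0 - norm.cdf(z)

      for t in (1, 5, 10, 30, 60):
          print(f"{t:>3} min available -> P(no correct action yet) = {p_nonresponse(t):.3f}")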

  17. Real-time prediction of clinical trial enrollment and event counts: A review.

    PubMed

    Heitjan, Daniel F; Ge, Zhiyun; Ying, Gui-Shuang

    2015-11-01

    Clinical trial planning involves the specification of a projected duration of enrollment and follow-up needed to achieve the targeted study power. If pre-trial estimates of enrollment and event rates are inaccurate, projections can be faulty, leading potentially to inadequate power or other mis-allocation of resources. Recent years have witnessed the development of methods that use the accumulating data from the trial itself to create improved predictions in real time. We review these methods, taking as a case study REMATCH, a trial that compared a left-ventricular assist device to optimal medical management in the treatment of end-stage heart failure. REMATCH provided the motivation and test bed for the first real-time clinical trial prediction model. Our review summarizes developments to date and points to unresolved issues and open research opportunities. PMID:26188165
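
    One simple model in the spirit of the methods reviewed here is a Bayesian Poisson-gamma model for the accrual rate, which can be simulated to predict the additional time needed to reach an enrollment target; the prior parameters and interim counts below are hypothetical.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical trial state: n_obs patients enrolled over t_obs months; enrollment target N.
      n_obs, t_obs, target = 120, 10.0, 300
      a0, b0 = 2.0, 0.1                 # Gamma(a0, b0) prior on the accrual rate (patients/month)

      # Posterior on the Poisson accrual rate is Gamma(a0 + n_obs, b0 + t_obs).
      lam = rng.gamma(a0 + n_obs, 1.0 / (b0 + t_obs), size=20000)

      # Time to enroll the remaining (target - n_obs) patients is Gamma(target - n_obs, 1 / lam).
      t_remaining = rng.gamma(target - n_obs, 1.0 / lam)

      print("median additional months:", round(np.median(t_remaining), 1))
      print("95% prediction interval:", np.percentile(t_remaining, [2.5, 97.5]).round(1))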

  18. Adaptation-Induced Compression of Event Time Occurs Only for Translational Motion

    PubMed Central

    Fornaciai, Michele; Arrighi, Roberto; Burr, David C.

    2016-01-01

    Adaptation to fast motion reduces the perceived duration of stimuli displayed at the same location as the adapting stimuli. Here we show that the adaptation-induced compression of time is specific for translational motion. Adaptation to complex motion, either circular or radial, did not affect perceived duration of subsequently viewed stimuli. Adaptation with multiple patches of translating motion caused compression of duration only when the motion of all patches was in the same direction. These results show that adaptation-induced compression of event-time occurs only for uni-directional translational motion, ruling out the possibility that the neural mechanisms of the adaptation occur at early levels of visual processing. PMID:27003445

  19. Tracking Visual Events in Time in the Absence of Time Perception: Implicit Processing at the ms Level.

    PubMed

    Poncelet, Patrick Eric; Giersch, Anne

    2015-01-01

    Previous studies have suggested that even if subjects deem two visual stimuli less than 20 ms apart to be simultaneous, implicitly they are nonetheless distinguished in time. It is unclear, however, how information is encoded within this short timescale. We used a priming paradigm to demonstrate how successive visual stimuli are processed over time intervals of less than 20 ms. The primers were two empty square frames displayed either simultaneously or with a 17 ms asynchrony. The primers were followed by the target information after a delay of 25 ms to 100 ms. The two square frames were filled in one after another with a delay of 100 ms between them, and subjects had to decide on the location of the first of the frames to be filled in. In a second version of the paradigm, only one square frame was filled in, and subjects had to decide where it was positioned. The influence of the primers is revealed through faster response times depending on the location of the first and second primers. Experiment 1 replicates earlier results, with a bias towards the side of the second primer, but only when there is a delay of 75 to 100 ms between primers and targets. The following experiments suggest this effect to be relatively independent of the task context, except for a slight effect on the time course of the biases. For the temporal order judgment task, identical results were observed when subjects have to answer to the side of the second rather than the first target, showing the effect to be independent of the hand response, and suggesting it might be related to a displacement of attention. All in all the results suggest the flow of events is followed more efficiently than suggested by explicit asynchrony judgment studies. We discuss the possible impact of these results on our understanding of the sense of time continuity.

  20. Tracking Visual Events in Time in the Absence of Time Perception: Implicit Processing at the ms Level

    PubMed Central

    Poncelet, Patrick Eric; Giersch, Anne

    2015-01-01

    Previous studies have suggested that even if subjects deem two visual stimuli less than 20 ms apart to be simultaneous, implicitly they are nonetheless distinguished in time. It is unclear, however, how information is encoded within this short timescale. We used a priming paradigm to demonstrate how successive visual stimuli are processed over time intervals of less than 20 ms. The primers were two empty square frames displayed either simultaneously or with a 17ms asynchrony. The primers were followed by the target information after a delay of 25 ms to 100 ms. The two square frames were filled in one after another with a delay of 100 ms between them, and subjects had to decide on the location of the first of the frames to be filled in. In a second version of the paradigm, only one square frame was filled in, and subjects had to decide where it was positioned. The influence of the primers is revealed through faster response times depending on the location of the first and second primers. Experiment 1 replicates earlier results, with a bias towards the side of the second primer, but only when there is a delay of 75 to 100 ms between primers and targets. The following experiments suggest this effect to be relatively independent of the task context, except for a slight effect on the time course of the biases. For the temporal order judgment task, identical results were observed when subjects have to answer to the side of the second rather than the first target, showing the effect to be independent of the hand response, and suggesting it might be related to a displacement of attention. All in all the results suggest the flow of events is followed more efficiently than suggested by explicit asynchrony judgment studies. We discuss the possible impact of these results on our understanding of the sense of time continuity. PMID:26030155

  1. Event-related potentials with the Stroop colour-word task: timing of semantic conflict.

    PubMed

    Zurrón, Montserrat; Pouso, María; Lindín, Mónica; Galdo, Santiago; Díaz, Fernando

    2009-06-01

    Event-Related Potentials (ERPs) elicited by congruent and incongruent colour-word stimuli of a Stroop paradigm, in a task in which participants were required to judge the congruence/incongruence of the two dimensions of the stimuli, were recorded in order to study the timing of the semantic conflict. The reaction time to colour-word incongruent stimuli was significantly longer than the reaction time to congruent stimuli (the Stroop effect). A temporal Principal Components Analysis was applied to the data to identify the ERP components. Three positive components were identified in the 300-600 ms interval in response to the congruent and incongruent stimuli: First P3, P3b and PSW. The factor scores corresponding to the First P3 and P3b components were significantly smaller for the incongruent stimuli than for the congruent stimuli. No differences between stimuli were observed in the factor scores corresponding to the PSW or in the ERP latencies. We conclude that the temporal locus of the semantic conflict, which intervenes in generating the Stroop effect, may occur within the time interval in which the First P3 and P3b components are identified, i.e. at approximately 300-450 ms post-stimulus. We suggest that the semantic conflict delays the start of the response selection process, which explains the longer reaction time to incongruent stimuli.

  2. An analysis of P times reported in the Reviewed Event Bulletin for Chinese underground explosions

    NASA Astrophysics Data System (ADS)

    Douglas, A.; O'Mongain, A. M.; Porter, D.; Young, J. B.

    2005-11-01

    Analysis of variance is used to estimate the measurement error and path effects in the P times reported in the Reviewed Event Bulletins (REBs, produced by the provisional International Data Center, Arlington, USA) and in times we have read, for explosions at the Chinese Test Site. Path effects are those differences between traveltimes calculated from tables and the true times that result in epicentre error. The main conclusions of the study are: (1) the estimated variance of the measurement error for P times reported in the REB at large signal-to-noise ratio (SNR) is 0.04 s², the bulk of the readings being analyst-adjusted automatic-detections, whereas for our times the variance is 0.01 s² and (2) the standard deviation of the path effects for both sets of observations is about 0.6 s. The study shows that measurement error is about twice (~0.2 s rather than ~0.1 s) and path effects about half the values assumed for the REB times. However, uncertainties in the estimated epicentres are poorly described by treating path effects as a random variable with a normal distribution. Only by estimating path effects and using these to correct onset times can reliable estimates of epicentre uncertainty be obtained. There is currently an international programme to do just this. The results imply that with P times from explosions at three or four stations with good SNR (so that the measurement error is around 0.1 s) and well distributed in azimuth, then with correction for path effects the area of the 90 per cent coverage ellipse should be much less than 1000 km² (the area allowed for an on-site inspection under the Comprehensive Test Ban Treaty) and should cover the true epicentre with the given probability.
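
    The variance decomposition described above can be illustrated schematically with a two-way analysis of variance of travel-time residuals into event and station (path) terms; the sketch below uses synthetic numbers and statsmodels, and is not the authors' exact procedure.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf
      from statsmodels.stats.anova import anova_lm

      # Synthetic long-format residuals (observed minus tabulated P times), one row per
      # event-station pair, built from event terms, station path terms, and measurement error.
      rng = np.random.default_rng(0)
      n_ev, n_st = 25, 15
      event_term = rng.normal(0, 0.3, n_ev)     # origin-time/depth errors common to an event
      path_term = rng.normal(0, 0.6, n_st)      # station path effects
      records = [{"event": f"E{i}", "station": f"S{j}",
                  "residual_s": event_term[i] + path_term[j] + rng.normal(0, 0.2)}
                 for i in range(n_ev) for j in range(n_st)]
      df = pd.DataFrame(records)

      model = smf.ols("residual_s ~ C(event) + C(station)", data=df).fit()
      print(anova_lm(model, typ=2))             # the station term carries the path effects
      print("estimated measurement-error s.d. (s):", round(np.sqrt(model.mse_resid), 3))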

  3. Characteristic Times of Gradual Solar Energetic Particle Events and Their Dependence on Associated Coronal Mass Ejection Properties

    NASA Astrophysics Data System (ADS)

    Kahler, S. W.

    2005-08-01

    We use 20 MeV proton intensities from the EPACT instrument on Wind and coronal mass ejections (CMEs) from the LASCO coronagraph on SOHO observed during 1998-2002 to statistically determine three characteristic times of gradual solar energetic particle (SEP) events as functions of solar source longitude: (1) TO, the time from associated CME launch to SEP onset at 1 AU, (2) TR, the rise time from SEP onset to the time when the SEP intensity is a factor of 2 below peak intensity, and (3) TD, the duration over which the SEP intensity is within a factor of 2 of the peak intensity. Those SEP event times are compared with associated CME speeds, accelerations, and widths to determine whether and how the SEP event times may depend on the formation and dynamics of coronal/interplanetary shocks driven by the CMEs. Solar source longitudinal variations are clearly present in the SEP times, but TR and TD are significantly correlated with CME speeds only for SEP events in the best-connected longitude range. No significant correlations between the SEP times and CME accelerations are found except for TD in one longitude range, but there is a weak correlation of TR and TD with CME widths. We also find no correlation of any SEP times with the solar wind O+7/O+6 values, suggesting no dependence on solar wind stream type. The SEP times of the small subset of events occurring in interplanetary CMEs may be slightly shorter than those of all events.

  4. Timing and return period of major palaeoseismic events in the Shillong Plateau, India

    NASA Astrophysics Data System (ADS)

    Sukhija, B. S.; Rao, M. N.; Reddy, D. V.; Nagabhushanam, P.; Hussain, Syed; Chadha, R. K.; Gupta, H. K.

    1999-07-01

    The close temporal occurrence of four great earthquakes in the past century, including the great Assam earthquake of 1897 in the Shillong Plateau, necessitated examination of the palaeoseismicity of the region. The results of such an investigation aid in evaluating the earthquake hazard of the region more realistically. Our recent palaeoseismological study in the Shillong Plateau has led us to identify and provide geological evidence for large/major earthquakes and estimate the probable recurrence period of such violent earthquakes in parts of the Shillong Plateau and the adjoining Brahmaputra valley. Trenching along the Krishnai River, a tributary of the River Brahmaputra, has unravelled very conspicuous and significant earthquake-induced signatures in the alluvial deposits of the valley. The geological evidence includes: (1) palaeoliquefaction features, like sand dykes and sand blows; (2) deformational features, like tilted beds; (3) fractures and syndepositional deformational features, like flame structures caused by coeval seismic events. Chronological constraints of the past large/major earthquakes are provided from upper and lower radiocarbon age bounds in the case of the palaeoliquefaction features, and the coeval timing of the palaeoseismic events is obtained from the radiocarbon dating of the organic material associated with the deformed horizon as well as buried tree trunks observed at widely separated locations. Our palaeoseismic measurements, which are the first from the area, indicate that the Shillong Plateau has been struck by large/major earthquakes around 500±150, 1100±150 and >1500±150 yr BP, in addition to the well-known great seismic event of 1897; the 14C dates thus indicate a recurrence period of the order of 500 yr for large earthquakes in the Shillong Plateau.

  5. Multivariate postprocessing techniques for probabilistic hydrological forecasting

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian

    2016-04-01

    Hydrologic ensemble forecasts driven by atmospheric ensemble prediction systems need statistical postprocessing in order to account for systematic errors in terms of both mean and spread. Runoff is an inherently multivariate process with typical events lasting from hours in case of floods to weeks or even months in case of droughts. This calls for multivariate postprocessing techniques that yield well calibrated forecasts in univariate terms and ensure a realistic temporal dependence structure at the same time. To this end, the univariate ensemble model output statistics (EMOS; Gneiting et al., 2005) postprocessing method is combined with two different copula approaches that ensure multivariate calibration throughout the entire forecast horizon. These approaches comprise ensemble copula coupling (ECC; Schefzik et al., 2013), which preserves the dependence structure of the raw ensemble, and a Gaussian copula approach (GCA; Pinson and Girard, 2012), which estimates the temporal correlations from training observations. Both methods are tested in a case study covering three subcatchments of the river Rhine that represent different sizes and hydrological regimes: the Upper Rhine up to the gauge Maxau, the river Moselle up to the gauge Trier, and the river Lahn up to the gauge Kalkofen. The results indicate that both ECC and GCA are suitable for modelling the temporal dependences of probabilistic hydrologic forecasts (Hemri et al., 2015). References Gneiting, T., A. E. Raftery, A. H. Westveld, and T. Goldman (2005), Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation, Monthly Weather Review, 133(5), 1098-1118, DOI: 10.1175/MWR2904.1. Hemri, S., D. Lisniak, and B. Klein, Multivariate postprocessing techniques for probabilistic hydrological forecasting, Water Resources Research, 51(9), 7436-7451, DOI: 10.1002/2014WR016473. Pinson, P., and R. Girard (2012), Evaluating the quality of scenarios of short-term wind power
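
    A compact sketch of the univariate-EMOS-plus-ECC idea is given below: a Gaussian EMOS model fitted by maximum likelihood, with the calibrated marginal quantiles reordered at each lead time according to the ranks of the raw ensemble (ECC-Q). The array layouts are assumptions, and the operational implementation in the cited study (CRPS minimization, runoff-specific details) is more elaborate.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm, rankdata

      def fit_emos(ens_train, obs_train):
          """Gaussian EMOS, N(a + b*ensmean, c + d*ensvar), fitted by maximum likelihood.
          ens_train: (n_cases, n_members), obs_train: (n_cases,)."""
          m, v = ens_train.mean(axis=1), ens_train.var(axis=1)

          def nll(p):
              a, b, c, d = p
              var = np.maximum(c + d * v, 1e-6)
              return -np.sum(norm.logpdf(obs_train, loc=a + b * m, scale=np.sqrt(var)))

          return minimize(nll, x0=[0.0, 1.0, 1.0, 0.1], method="Nelder-Mead").x

      def emos_ecc(ens_fc, params):
          """Postprocess each lead time, then reorder the calibrated quantiles with the
          raw-ensemble ranks (ECC-Q), preserving the raw temporal dependence structure.
          ens_fc: (n_lead_times, n_members) forecast for one initialization."""
          a, b, c, d = params
          n_lead, n_mem = ens_fc.shape
          levels = (np.arange(1, n_mem + 1) - 0.5) / n_mem
          out = np.empty_like(ens_fc, dtype=float)
          for t in range(n_lead):
              mu = a + b * ens_fc[t].mean()
              sd = np.sqrt(max(c + d * ens_fc[t].var(), 1e-6))
              q = norm.ppf(levels, loc=mu, scale=sd)                # calibrated marginal quantiles
              ranks = rankdata(ens_fc[t], method="ordinal").astype(int) - 1
              out[t] = q[ranks]                                     # member i keeps its raw rank
          return out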

  6. New isotopic ages and the timing of orogenic events in the Cordillera Darwin, southernmost Chilean Andes

    NASA Astrophysics Data System (ADS)

    Hervé, F.; Nelson, E.; Kawashita, K.; Suárez, M.

    1981-10-01

    The Cordillera Darwin, a structural culmination in the Andes of Tierra del Fuego, exposes an orogenic core zone that has undergone polyphase deformation and metamorphism. Some of the classic problems of orogenic zones have remained unanswered in the Cordillera Darwin: the age of deformed plutonic rocks, the distinction of structurally reactivated basement and metamorphosed cover rocks, and the timing of orogenic events. This study addresses and partially answers these questions. A well-constrained Rb-Sr isochron age of 157±8 m.y. and an initial 87Sr/86Sr ratio of 0.7087 obtained from a pre-tectonic granitic suite suggest a genetic relation between this suite and Upper Jurassic silicic volcanic rocks in the cover sequence (Tobifera Formation), and also suggest involvement of continental crust in formation of these magmas. A poorly constrained Rb-Sr isochron age of 240±40 m.y. obtained from supposed basement schists is consistent with field relations in the area which suggest a late Paleozoic/early Mesozoic metamorphism for these pre-Late Jurassic rocks. However, because of scatter in the data and the uncertainties involved in dating metasedimentary rocks, the significance of the isotopic age is dubious. Compilation of previously published ages in the area [9] with new mineral ages reported here indicate that "early Andean" orogenic events occurred between 100 and 84 m.y. ago, and that subduction-related magmatism has contributed, probably discontinuously, to the crustal evolution of the region throughout the Mesozoic.

  7. Predicting time to prostate cancer recurrence based on joint models for non-linear longitudinal biomarkers and event time outcomes.

    PubMed

    Pauler, Donna K; Finkelstein, Dianne M

    2002-12-30

    Biological markers that are both sensitive and specific for tumour regrowth or metastasis are increasingly becoming available and routinely monitored during the regular follow-up of patients treated for cancer. Obtained by a simple blood test, these markers provide an inexpensive non-invasive means for the early detection of recurrence (or progression). Currently, the longitudinal behaviour of the marker is viewed as an indicator of early disease progression, and is applied by a physician in making clinical decisions. One marker that has been studied for use in both population screening for early disease and for detection of recurrence in prostate cancer patients is PSA. The elevation of PSA levels is known to precede clinically detectable recurrence by 2 to 5 years, and current clinical practice often relies partially on multiple recent rises in PSA to trigger a change in treatment. However, the longitudinal trajectory for individual markers is often non-linear; in many cases there is a decline immediately following radiation therapy or surgery, a plateau during remission, followed by an exponential rise following the recurrence of the cancer. The aim of this article is to determine the multiple aspects of the longitudinal PSA biomarker trajectory that can be most sensitive for predicting time to clinical recurrence. Joint Bayesian models for the longitudinal measures and event times are utilized based on non-linear hierarchical models, implied by unknown change-points, for the longitudinal trajectories, and a Cox proportional hazard model for progression times, with functionals of the longitudinal parameters as covariates in the Cox model. Using Markov chain Monte Carlo sampling schemes, the joint model is fit to longitudinal PSA measures from 676 patients treated at Massachusetts General Hospital between the years 1988 and 1995 with follow-up to 1999. Based on these data, predictive schemes for detecting cancer recurrence in new patients based on their
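
    A deliberately simplified, single-patient version of the change-point trajectory idea is sketched below: log-PSA declining after treatment and then rising linearly (i.e. PSA rising exponentially) after an unknown change-point, fitted by non-linear least squares. The joint Bayesian hierarchical model of the article, which links these trajectory parameters to a Cox model for recurrence, is not reproduced here, and the illustrative data are hypothetical.

      import numpy as np
      from scipy.optimize import curve_fit

      def log_psa(t, b0, b1, b2, tau):
          """Piecewise-linear log-PSA: decline at slope -b1, rise at slope (b2 - b1) after tau."""
          return b0 - b1 * t + b2 * np.maximum(t - tau, 0.0)

      # Hypothetical follow-up times (years) and PSA values (ng/mL) for a single patient.
      t = np.array([0.2, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
      psa = np.array([4.0, 2.2, 1.1, 0.9, 0.8, 1.0, 1.6, 2.7, 4.5])

      params, _ = curve_fit(log_psa, t, np.log(psa), p0=[1.0, 1.0, 2.0, 2.0],
                            bounds=([-5, 0, 0, 0.1], [5, 10, 10, t.max()]))
      b0, b1, b2, tau = params
      print(f"estimated change-point: {tau:.2f} yr; post-change log-PSA slope: {b2 - b1:.2f}/yr")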

  8. Mixed Effects Models for Recurrent Events Data with Partially Observed Time-Varying Covariates: Ecological Momentary Assessment of Smoking

    PubMed Central

    Rathbun, Stephen L.; Shiffman, Saul

    2015-01-01

    Cigarette smoking is a prototypical example of a recurrent event. The pattern of recurrent smoking events may depend on time-varying covariates including mood and environmental variables. Fixed effects and frailty models for recurrent events data assume that smokers have a common association with time-varying covariates. We develop a mixed effects version of a recurrent events model that may be used to describe variation among smokers in how they respond to those covariates, potentially leading to the development of individual-based smoking cessation therapies. Our method extends the modified EM algorithm of Steele (1996) for generalized mixed models to recurrent events data with partially observed time-varying covariates. It is offered as an alternative to the method of Rizopoulos, Verbeke and Lesaffre (2009) who extended Steele’s (1996) algorithm to a joint-model for the recurrent events data and time-varying covariates. Our approach does not require a model for the time-varying covariates, but instead assumes that the time-varying covariates are sampled according to a Poisson point process with known intensity. Our methods are well suited to data collected using Ecological Momentary Assessment (EMA), a method of data collection widely used in the behavioral sciences to collect data on emotional state and recurrent events in the every-day environments of study subjects using electronic devices such as Personal Digital Assistants (PDA) or smart phones. PMID:26410189

  9. The time course of implicit processing of erotic pictures: an event-related potential study.

    PubMed

    Feng, Chunliang; Wang, Lili; Wang, Naiyi; Gu, Ruolei; Luo, Yue-Jia

    2012-12-13

    The current study investigated the time course of the implicit processing of erotic stimuli using event-related potentials (ERPs). ERPs elicited by erotic pictures were compared with those elicited by three other types of pictures: non-erotic positive, negative, and neutral pictures. We observed that erotic pictures evoked enhanced neural responses compared with other pictures at both early (P2/N2) and late (P3/positive slow wave) temporal stages. These results suggested that erotic pictures selectively captured individuals' attention at early stages and evoked deeper processing at late stages. More importantly, the amplitudes of P2, N2, and P3 only discriminated between erotic and non-erotic (i.e., positive, neutral, and negative) pictures. That is, no difference was revealed among non-erotic pictures, although these pictures differed in both valence and arousal. Thus, our results suggest that the processing of erotic pictures goes beyond the effects of valence and arousal.

  10. Aesthetic appreciation: event-related field and time-frequency analyses

    PubMed Central

    Munar, Enric; Nadal, Marcos; Castellanos, Nazareth P.; Flexas, Albert; Maestú, Fernando; Mirasso, Claudio; Cela-Conde, Camilo J.

    2012-01-01

    Improvements in neuroimaging methods have afforded significant advances in our knowledge of the cognitive and neural foundations of aesthetic appreciation. We used magnetoencephalography (MEG) to register brain activity while participants decided about the beauty of visual stimuli. The data were analyzed with event-related field (ERF) and Time-Frequency (TF) procedures. ERFs revealed no significant differences between brain activity related with stimuli rated as “beautiful” and “not beautiful.” TF analysis showed clear differences between both conditions 400 ms after stimulus onset. Oscillatory power was greater for stimuli rated as “beautiful” than those regarded as “not beautiful” in the four frequency bands (theta, alpha, beta, and gamma). These results are interpreted in the frame of synchronization studies. PMID:22287948

  11. Real-time detection of an extreme scattering event: Constraints on Galactic plasma lenses

    NASA Astrophysics Data System (ADS)

    Bannister, Keith W.; Stevens, Jamie; Tuntsov, Artem V.; Walker, Mark A.; Johnston, Simon; Reynolds, Cormac; Bignall, Hayley

    2016-01-01

    Extreme scattering events (ESEs) are distinctive fluctuations in the brightness of astronomical radio sources caused by occulting plasma lenses in the interstellar medium. The inferred plasma pressures of the lenses are ~10³ times the ambient pressure, challenging our understanding of gas conditions in the Milky Way. Using a new survey technique, we discovered an ESE while it was in progress. Here we report radio and optical follow-up observations. Modeling of the radio data demonstrates that the lensing structure is a density enhancement and the lens is diverging, ruling out one of two competing physical models. Our technique will uncover many more ESEs, addressing a long-standing mystery of the small-scale gas structure of our Galaxy.

  12. Aesthetic appreciation: event-related field and time-frequency analyses.

    PubMed

    Munar, Enric; Nadal, Marcos; Castellanos, Nazareth P; Flexas, Albert; Maestú, Fernando; Mirasso, Claudio; Cela-Conde, Camilo J

    2011-01-01

    Improvements in neuroimaging methods have afforded significant advances in our knowledge of the cognitive and neural foundations of aesthetic appreciation. We used magnetoencephalography (MEG) to register brain activity while participants decided about the beauty of visual stimuli. The data were analyzed with event-related field (ERF) and Time-Frequency (TF) procedures. ERFs revealed no significant differences between brain activity related with stimuli rated as "beautiful" and "not beautiful." TF analysis showed clear differences between both conditions 400 ms after stimulus onset. Oscillatory power was greater for stimuli rated as "beautiful" than those regarded as "not beautiful" in the four frequency bands (theta, alpha, beta, and gamma). These results are interpreted in the frame of synchronization studies. PMID:22287948

  13. Real-time detection of an extreme scattering event: Constraints on Galactic plasma lenses.

    PubMed

    Bannister, Keith W; Stevens, Jamie; Tuntsov, Artem V; Walker, Mark A; Johnston, Simon; Reynolds, Cormac; Bignall, Hayley

    2016-01-22

    Extreme scattering events (ESEs) are distinctive fluctuations in the brightness of astronomical radio sources caused by occulting plasma lenses in the interstellar medium. The inferred plasma pressures of the lenses are ~10³ times the ambient pressure, challenging our understanding of gas conditions in the Milky Way. Using a new survey technique, we discovered an ESE while it was in progress. Here we report radio and optical follow-up observations. Modeling of the radio data demonstrates that the lensing structure is a density enhancement and the lens is diverging, ruling out one of two competing physical models. Our technique will uncover many more ESEs, addressing a long-standing mystery of the small-scale gas structure of our Galaxy. PMID:26798008

  15. Unattended monitoring system at a static storage area with real-time event notification.

    SciTech Connect

    West, J. D.; Betts, S. E.; Michel, K. D.; Schanfein, M. J.; Ricketts, T. E.

    2005-01-01

    Domestic Safeguards at Los Alamos National Laboratory (LANL) and throughout the Department of Energy (DOE)/National Nuclear Security Administration (NNSA) complex has historically relied on administrative and non-integrated approaches to implement nuclear safeguards at its facilities. Besides the heavy cost borne by the facility and the compliance oversight organization, the safeguards assurance is only periodic, potentially allowing an adversary a longer time before detection. Even after detection, the lack of situational awareness makes it difficult to assess events. By leveraging unattended monitoring systems (UMS) used by the International Atomic Energy Agency (IAEA), we have designed a baseline system that has high reliability through fault-tolerant designs for both hardware and software. Applying IAEA design goals to assure no loss of data and using a dual containment strategy, this system is a first step in implementing modern safeguards monitoring systems at LANL and, hopefully, applications at other DOE/NNSA sites. This paper will review the design requirements and how they will be met, to provide a real-time event notification for a static storage location. The notification system triggers communications to pagers and email addresses for a fast response by facility personnel to the violation of a defined safeguards exclusion zone. Since the system has to be installed in an existing facility, the challenges to the designers will be presented. Aside from the initial baseline system that relies on surveillance cameras and seals, other optional upgrades will be detailed, showing both the power and the promise of unattended systems for domestic safeguards. We will also include a short discussion of the business obstacles to modernizing safeguards and how a UMS system may be applied to dynamic activities at a nuclear facility. Ultimately, the current lack of such modern monitoring systems reflects the many business obstacles internal to DOE/NNSA to the use of

  16. Disambiguating past events: Accurate source memory for time and context depends on different retrieval processes.

    PubMed

    Persson, Bjorn M; Ainge, James A; O'Connor, Akira R

    2016-07-01

    Current animal models of episodic memory are usually based on demonstrating integrated memory for what happened, where it happened, and when an event took place. These models aim to capture the testable features of the definition of human episodic memory which stresses the temporal component of the memory as a unique piece of source information that allows us to disambiguate one memory from another. Recently though, it has been suggested that a more accurate model of human episodic memory would include contextual rather than temporal source information, as humans' memory for time is relatively poor. Here, two experiments were carried out investigating human memory for temporal and contextual source information, along with the underlying dual process retrieval processes, using an immersive virtual environment paired with a 'Remember-Know' memory task. Experiment 1 (n=28) showed that contextual information could only be retrieved accurately using recollection, while temporal information could be retrieved using either recollection or familiarity. Experiment 2 (n=24), which used a more difficult task, resulting in reduced item recognition rates and therefore less potential for contamination by ceiling effects, replicated the pattern of results from Experiment 1. Dual process theory predicts that it should only be possible to retrieve source context from an event using recollection, and our results are consistent with this prediction. That temporal information can be retrieved using familiarity alone suggests that it may be incorrect to view temporal context as analogous to other typically used source contexts. This latter finding supports the alternative proposal that time since presentation may simply be reflected in the strength of memory trace at retrieval - a measure ideally suited to trace strength interrogation using familiarity, as is typically conceptualised within the dual process framework. PMID:27174312

  17. Extreme weather event in spring 2013 delayed breeding time of Great Tit and Blue Tit.

    PubMed

    Glądalski, Michał; Bańbura, Mirosława; Kaliński, Adam; Markowski, Marcin; Skwarska, Joanna; Wawrzyniak, Jarosław; Zieliński, Piotr; Bańbura, Jerzy

    2014-12-01

    The impact of climatic change on life cycles through the re-scheduling of the timing of reproduction is an important topic in studies of biodiversity. Global warming causes, and will probably continue to cause, not only rising temperatures but also an increasing frequency of extreme weather events. In 2013, the winter in central and northern Europe ended late, with low temperatures and long-retained snow cover--this extreme weather phenomenon acted in opposition to the increasing temperature trend. In 2013, thermal conditions measured by the warmth sum in the period 15 March–15 April, a critical time for early breeding passerines, went far beyond the range of the warmth sums for at least 40 preceding years. Regardless of the cause of the extreme conditions in early spring 2013, and assuming that there is a potential for more atypical years because of climate change, we should look closely at every extreme phenomenon and its consequences for the phenology of organisms. In this paper, we report that the prolonged occurrence of winter conditions during the time that is crucial for Blue Tit (Cyanistes caeruleus) and Great Tit (Parus major) reproduction caused a substantial delay in the onset of egg laying in comparison with typical springs.

  18. A new method for centralised and modular supervisory control of real-time discrete event systems

    NASA Astrophysics Data System (ADS)

    Ouédraogo, Lucien; Khoumsi, Ahmed; Nourelfath, Mustapha

    2010-01-01

    This article deals with the problem of controlling a plant described as a real-time discrete event system (RTDES). In particular, automata-based supervisory control of RTDES is addressed. The aim of supervisory control is to restrict the behaviour (using a supervisor) of an uncontrolled plant in order to conform to a given specification. First, we propose a centralised method for the synthesis of a supervisor that forces a given plant to conform to a given specification. Then, we extend this centralised method to the modular case, that is, for the synthesis of n supervisors that force the plant to conform to n given specifications, respectively. Timed automata (TA) with invariants are used to model the plant and the specification(s). The synthesis approach is based on the transformation of the control problem into a non-real-time form, using a transformation of TA into equivalent particular finite state automata called Set-Exp-Automata. This transformation allows the theory of Ramadge and Wonham to be adapted, and is justified by the fact that it reduces the state space explosion problem compared to other transformation methods such as the transformation of TA into region automata. Moreover, the Set-Exp-Automata model provides a suitable control architecture for implementation. The proposed approach yields the solution to both the centralised and modular supervisory control problems, by identifying the solvability conditions and giving a step-by-step computation procedure of the solution.

  19. Framework for modeling urban restoration resilience time in the aftermath of an extreme event

    USGS Publications Warehouse

    Ramachandran, Varun; Long, Suzanna K.; Shoberg, Thomas G.; Corns, Steven; Carlo, Héctor

    2015-01-01

    The impacts of extreme events continue long after the emergency response has terminated. Effective reconstruction of supply-chain strategic infrastructure (SCSI) elements is essential for postevent recovery and the reconnectivity of a region with the outside. This study uses an interdisciplinary approach to develop a comprehensive framework to model resilience time. The framework is tested by comparing resilience time results for a simulated EF-5 tornado with ground truth data from the tornado that devastated Joplin, Missouri, on May 22, 2011. Data for the simulated tornado were derived for Overland Park, Johnson County, Kansas, in the greater Kansas City, Missouri, area. Given the simulated tornado, a combinatorial graph considering the damages in terms of interconnectivity between different SCSI elements is derived. Reconstruction in the aftermath of the simulated tornado is optimized using the proposed framework to promote a rapid recovery of the SCSI. This research shows promising results when compared with the independent quantifiable data obtained from Joplin, Missouri, returning a resilience time of 22 days compared with 25 days reported by city and state officials.

  20. A Real-Time Web Services Hub to Improve Situation Awareness during Flash Flood Events

    NASA Astrophysics Data System (ADS)

    Salas, F. R.; Liu, F.; Maidment, D. R.; Hodges, B. R.

    2011-12-01

    The central Texas corridor is one of the most flash flood-prone regions in the United States. Over the years, flash floods have resulted in hundreds of flood fatalities and billions of dollars in property damage. In order to mitigate risk to residents and infrastructure during flood events, both citizens and emergency responders need to act proactively rather than reactively. Real-time and forecast flood information is limited and hard to obtain at the spatial scales of interest. The University of Texas at Austin has collaborated with IBM Research-Austin and ESRI to build a distributed real-time flood information system through a framework that leverages large scale data management and distribution, Open Geospatial Consortium standardized web services, and smart map applications. Within this paradigm, observed precipitation data encoded in WaterML is ingested into HEC-HMS and then delivered to a high performance hydraulic routing software package developed by IBM that utilizes the latest advancements in VLSI design, numerical linear algebra and numerical integration techniques on contemporary multicore architecture to solve the fully dynamic Saint-Venant equations at both small and large scales. In this paper we present a real-time flood inundation map application that, in conjunction with a web services Hub, seamlessly integrates hydrologic information available through both public and private data services, model services and mapping services. As a case study for this project, we demonstrate how this system has been implemented in the City of Austin, Texas.

  1. The time course of word retrieval revealed by event-related brain potentials during overt speech

    PubMed Central

    Costa, Albert; Strijkers, Kristof; Martin, Clara; Thierry, Guillaume

    2009-01-01

    Speech production is one of the most fundamental activities of humans. A core cognitive operation involved in this skill is the retrieval of words from long-term memory, that is, from the mental lexicon. In this article, we establish the time course of lexical access by recording the brain electrical activity of participants while they named pictures aloud. By manipulating the ordinal position of pictures belonging to the same semantic categories, the cumulative semantic interference effect, we were able to measure the exact time at which lexical access takes place. We found significant correlations between naming latencies, ordinal position of pictures, and event-related potential mean amplitudes starting 200 ms after picture presentation and lasting for 180 ms. The study reveals that the brain engages extremely fast in the retrieval of words one wishes to utter and offers a clear time frame of how long it takes for the competitive process of activating and selecting words in the course of speech to be resolved. PMID:19934043

  2. Multivariate Regression with Calibration*

    PubMed Central

    Liu, Han; Wang, Lie; Zhao, Tuo

    2014-01-01

    We propose a new method named calibrated multivariate regression (CMR) for fitting high dimensional multivariate regression models. Compared to existing methods, CMR calibrates the regularization for each regression task with respect to its noise level so that it is simultaneously tuning insensitive and achieves an improved finite-sample performance. Computationally, we develop an efficient smoothed proximal gradient algorithm which has a worst-case iteration complexity O(1/ε), where ε is a pre-specified numerical accuracy. Theoretically, we prove that CMR achieves the optimal rate of convergence in parameter estimation. We illustrate the usefulness of CMR by thorough numerical simulations and show that CMR consistently outperforms other high dimensional multivariate regression methods. We also apply CMR on a brain activity prediction problem and find that CMR is as competitive as the handcrafted model created by human experts. PMID:25620861
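
    The calibrated objective can be illustrated on synthetic data with a generic convex solver: an un-squared L2 residual per regression task (so each task is implicitly weighted by its own noise level) plus a row-wise group penalty. The sketch below uses cvxpy rather than the paper's smoothed proximal gradient algorithm, and the problem sizes are arbitrary.

      import numpy as np
      import cvxpy as cp

      rng = np.random.default_rng(1)
      n, p, K = 100, 20, 5
      X = rng.standard_normal((n, p))
      B_true = np.zeros((p, K))
      B_true[:3] = rng.standard_normal((3, K))
      noise_scale = np.linspace(0.2, 2.0, K)          # tasks with very different noise levels
      Y = X @ B_true + rng.standard_normal((n, K)) * noise_scale

      B = cp.Variable((p, K))
      lam = 2.0
      # Calibrated loss: un-squared L2 residual per task, which implicitly reweights each task
      # by its own noise level; the penalty is the row-wise L1/L2 (group) norm.
      loss = sum(cp.norm(Y[:, k] - X @ B[:, k], 2) for k in range(K))
      penalty = lam * cp.sum(cp.norm(B, 2, axis=1))
      cp.Problem(cp.Minimize(loss + penalty)).solve()

      print("estimation error:", np.linalg.norm(B.value - B_true))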

  3. Real-time Monitoring Network to Characterize Anthropogenic and Natural Events Affecting the Hudson River, NY

    NASA Astrophysics Data System (ADS)

    Islam, M. S.; Bonner, J. S.; Fuller, C.; Kirkey, W.; Ojo, T.

    2011-12-01

    The Hudson River watershed spans 34,700 km2 predominantly in New York State, including agricultural, wilderness, and urban areas. The Hudson River supports many activities including shipping, supplies water for municipal, commercial, and agricultural uses, and is an important recreational resource. As the population increases within this watershed, so does the anthropogenic impact on this natural system. To address the impacts of anthropogenic and natural activities on this ecosystem, the River and Estuary Observatory Network (REON) is being developed through a joint venture between the Beacon Institute, Clarkson University, General Electric Inc. and IBM Inc. to monitor New York's Hudson and Mohawk Rivers in real-time. REON uses four sensor platform types with multiple nodes within the network to capture environmentally relevant episodic events. Sensor platform types include: 1) fixed robotic vertical profiler (FRVP); 2) mobile robotic undulating platform (MRUP); 3) fixed acoustic Doppler current profiler (FADCP) and 4) Autonomous Underwater Vehicle (AUV). The FRVP periodically generates a vertical profile with respect to water temperature, salinity, dissolved oxygen, particle concentration and size distribution, and fluorescence. The MRUP utilizes an undulating tow-body tethered behind a research vessel to measure the same set of water parameters as the FRVP, but does so 'synchronically' over a highly-resolved spatial regime. The fixed ADCP provides continuous water current profiles. The AUV maps four-dimensional (time, latitude, longitude, depth) variation of water quality, water currents and bathymetry along a pre-determined transect route. REON data can be used to identify episodic events, both anthropogenic and natural, that impact the Hudson River. For example, a strong heat signature associated with cooling water discharge from the Indian Point nuclear power plant was detected with the MRUP. The FRVP monitoring platform at Beacon, NY, located in the

  4. Recurrence time statistics of landslide events simulated by a cellular automaton model

    NASA Astrophysics Data System (ADS)

    Piegari, Ester; Di Maio, Rosa; Avella, Adolfo

    2014-05-01

    The recurrence time statistics of a cellular automaton modelling landslide events are analyzed by performing a numerical analysis in the parameter space and estimating Fano factor behaviors. The model is an extended version of the OFC model, which is a paradigm for SOC in non-conserved systems, but it works differently from the original OFC model as a finite value of the driving rate is applied. By driving the system to instability with different rates, the model exhibits a smooth transition from a correlated to an uncorrelated regime as a consequence of a change in the predominant mechanism by which instability propagates. If the rate at which instability is approached is small, chain processes dominate the landslide dynamics, and power laws govern probability distributions. However, the power-law regime typical of SOC-like systems is found in a range of return intervals that becomes shorter and shorter as the driving rate increases. Indeed, if the rates at which instability is approached are large, domino processes are no longer active in propagating instability, and large events simply occur because a large number of cells simultaneously reach instability. Such a gradual loss of the effectiveness of the chain propagation mechanism causes the system to enter gradually an uncorrelated regime where recurrence time distributions are characterized by Weibull behaviors. Simulation results are qualitatively compared with those from a recent analysis performed by Witt et al. (Earth Surf. Process. Landforms, 35, 1138, 2010) for the first complete databases of landslide occurrences over a period as large as fifty years. From the comparison with the extensive landslide data set, the numerical analysis suggests that statistics of such landslide data seem to be described by a crossover region between a correlated regime and an uncorrelated regime, where recurrence time distributions are characterized by power-law and Weibull behaviors for short and long return times
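
    A stripped-down, non-conservative OFC-type automaton with a finite driving rate is sketched below, recording event sizes and the recurrence times of large events; the lattice size, dissipation, driving rate, and size threshold are illustrative choices (with periodic boundaries for brevity), not the parameters of the cited study.

      import numpy as np

      def run_ofc(L=64, alpha=0.2, drive=1e-3, z_th=1.0, steps=50_000, big=50, seed=0):
          """Non-conservative OFC-type automaton driven at a finite rate `drive` per step
          (periodic boundaries for brevity). Returns all event sizes and the recurrence
          times of events with size >= `big`."""
          rng = np.random.default_rng(seed)
          z = rng.uniform(0, z_th, (L, L))
          sizes, big_times = [], []
          for t in range(steps):
              z += drive                                   # uniform finite-rate loading
              unstable = z >= z_th
              size = 0
              while unstable.any():                        # relaxation avalanche (chain propagation)
                  size += int(unstable.sum())
                  excess = np.where(unstable, z, 0.0)
                  z[unstable] = 0.0
                  for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                      z += alpha * np.roll(excess, shift, axis=(0, 1))
                  unstable = z >= z_th
              if size:
                  sizes.append(size)
                  if size >= big:
                      big_times.append(t)
          return np.array(sizes), np.diff(big_times)       # recurrence times of large events

      sizes, rec = run_ofc()
      print("events:", sizes.size, "| mean recurrence of large events:",
            rec.mean() if rec.size else "n/a")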

  5. Multivariate bubbles and antibubbles

    NASA Astrophysics Data System (ADS)

    Fry, John

    2014-08-01

    In this paper we develop models for multivariate financial bubbles and antibubbles based on statistical physics. In particular, we extend a rich set of univariate models to higher dimensions. Changes in market regime can be explicitly shown to represent a phase transition from random to deterministic behaviour in prices. Moreover, our multivariate models are able to capture some of the contagious effects that occur during such episodes. We are able to show that declining lending quality helped fuel a bubble in the US stock market prior to 2008. Further, our approach offers interesting insights into the spatial development of UK house prices.

  6. Multivariate Data EXplorer (MDX)

    SciTech Connect

    Steed, Chad Allen

    2012-08-01

    The MDX toolkit facilitates exploratory data analysis and visualization of multivariate datasets. MDX provides an interactive graphical user interface to load, explore, and modify multivariate datasets stored in tabular form. MDX uses an extended version of the parallel coordinates plot and scatterplots to represent the data. The user can perform rapid visual queries using mouse gestures in the visualization panels to select rows or columns of interest. The visualization panel provides coordinated multiple views whereby selections made in one plot are propagated to the other plots. Users can also export selected data or reconfigure the visualization panel to explore relationships between columns and rows in the data.
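
    The kind of exploration MDX supports can be approximated in a few lines with pandas and matplotlib: a parallel-coordinates view plus a 'visual query' expressed as a row selection; the column names and synthetic table below are placeholders, and this is not MDX's own code.

      import numpy as np
      import pandas as pd
      import matplotlib.pyplot as plt
      from pandas.plotting import parallel_coordinates

      # Synthetic stand-in for a multivariate table with a categorical label column.
      rng = np.random.default_rng(0)
      df = pd.DataFrame(rng.normal(size=(200, 4)),
                        columns=["temp", "salinity", "oxygen", "turbidity"])
      df["cluster"] = np.where(df["temp"] + df["oxygen"] > 0, "A", "B")

      fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(12, 4))
      parallel_coordinates(df, class_column="cluster", ax=ax1, alpha=0.3)
      ax1.set_title("all rows")

      # Emulate an MDX-style visual query: select rows by a range on one axis and replot them.
      selected = df[df["temp"].between(0.0, 1.5)]
      parallel_coordinates(selected, class_column="cluster", ax=ax2, alpha=0.6)
      ax2.set_title("selected rows")
      plt.show()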

  7. Digitized pressure-time records, selected nuclear events. Technical report, 1 September 1982-1 April 1986

    SciTech Connect

    McMullan, F.W.; Bryant, E.J.

    1986-04-30

    Pressure-time records are presented for selected atmospheric nuclear events. The records were extracted from published test reports, digitized, and given uniform pressure-time scales for a given event and a given range to permit easier comparison. Data include p-t, q-t, p(tot)-t, Mach No-t, and Impulse-t as appropriate. Selected data were scaled to 1 kT.

  8. Event- and time-triggered remembering: the impact of attention deficit hyperactivity disorder on prospective memory performance in children.

    PubMed

    Talbot, Karley-Dale S; Kerns, Kimberly A

    2014-11-01

    The current study examined prospective memory (PM, both time-based and event-based) and time estimation (TR, a time reproduction task) in children with and without attention deficit hyperactivity disorder (ADHD). This study also investigated the influence of task performance and TR on time-based PM in children with ADHD relative to controls. A sample of 69 children, aged 8 to 13 years, completed the CyberCruiser-II time-based PM task, a TR task, and the Super Little Fisherman event-based PM task. PM performance was compared with children's TR abilities, parental reports of daily prospective memory disturbances (Prospective and Retrospective Memory Questionnaire for Children, PRMQC), and ADHD symptomatology (Conner's rating scales). Children with ADHD scored more poorly on event-based PM, time-based PM, and TR; interestingly, TR did not appear related to performance on time-based PM. In addition, it was found that PRMQC scores and ADHD symptom severity were related to performance on the time-based PM task but not to performance on the event-based PM task. These results provide some limited support for theories that propose a distinction between event-based PM and time-based PM.

  9. Modeling a Typical Winter-time Dust Event over the Arabian Peninsula and the Red Sea

    SciTech Connect

    Kalenderski, S.; Stenchikov, G.; Zhao, Chun

    2013-02-20

    We used WRF-Chem, a regional meteorological model coupled with an aerosol-chemistry component, to simulate various aspects of the dust phenomena over the Arabian Peninsula and Red Sea during a typical winter-time dust event that occurred in January 2009. The model predicted that the total amount of emitted dust was 18.3 Tg for the entire dust outburst period and that the two maximum daily rates were ~2.4 Tg/day and ~1.5 Tg/day, corresponding to two periods with the highest aerosol optical depth that were well captured by ground- and satellite-based observations. The model predicted that the dust plume was thick, extensive, and mixed in a deep boundary layer at an altitude of 3-4 km. Its spatial distribution was modeled to be consistent with typical spatial patterns of dust emissions. We utilized MODIS-Aqua and Solar Village AERONET measurements of the aerosol optical depth (AOD) to evaluate the radiative impact of aerosols. Our results clearly indicated that the presence of dust particles in the atmosphere caused a significant reduction in the amount of solar radiation reaching the surface during the dust event. We also found that dust aerosols have a significant impact on the energy and nutrient balances of the Red Sea. Our results showed that the simulated cooling under the dust plume reached 100 W/m², which could have profound effects on both the sea surface temperature and circulation. Further analysis of dust generation and its spatial and temporal variability is extremely important for future projections and for better understanding of the climate and ecological history of the Red Sea.

  10. Assessment of realistic nowcasting lead-times based on predictability analysis of Mediterranean Heavy Precipitation Events

    NASA Astrophysics Data System (ADS)

    Bech, Joan; Berenguer, Marc

    2014-05-01

    Operational quantitative precipitation forecasts (QPF) are provided routinely by weather services or hydrological authorities, particularly those responsible for densely populated regions of small catchments, such as those typically found in Mediterranean areas prone to flash-floods. Specific rainfall values are used as thresholds for issuing warning levels considering different time frameworks (mid-range, short-range, 24h, 1h, etc.), for example 100 mm in 24h or 60 mm in 1h. There is a clear need to determine how feasible a specific rainfall value is for a given lead-time, in particular for very short range forecasts or nowcasts typically obtained from weather radar observations (Pierce et al 2012). In this study we assess which specific nowcast lead-times can be provided for a number of heavy precipitation events (HPE) that affected Catalonia (NE Spain). The nowcasting system we employed generates QPFs through the extrapolation of rainfall fields observed with weather radar following a Lagrangian approach developed and tested successfully in previous studies (Berenguer et al. 2005, 2011). QPFs up to 3 h are then compared with two quality controlled observational data sets: weather radar quantitative precipitation estimates (QPE) and raingauge data. Several high-impact weather HPE were selected, including the 7 September 2005 Llobregat Delta river tornado outbreak (Bech et al. 2007) and the 2 November 2008 supercell tornadic thunderstorms (Bech et al. 2011), both producing, among other effects, local flash floods. In these two events there were torrential rainfall rates (30' amounts exceeding 38.2 and 12.3 mm respectively) and 24h accumulation values above 100 mm. A number of verification scores are used to characterize the evolution of precipitation forecast quality with time, which typically shows a decreasing trend but with a strong dependence on the selected rainfall threshold and integration period. For example considering correlation factors, 30
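
    A deliberately simplified Lagrangian nowcast is sketched below: a single global motion vector estimated by cross-correlating the two most recent radar fields, used to advect the latest field to the desired lead times. The scheme of Berenguer et al. uses spatially variable motion and further refinements, so this is only a schematic of the extrapolation step.

      import numpy as np
      from scipy.signal import fftconvolve
      from scipy.ndimage import shift as nd_shift

      def global_motion(field_prev, field_now):
          """Displacement (rows, cols) per time step from the peak of the cross-correlation."""
          a = field_now - field_now.mean()
          b = field_prev - field_prev.mean()
          cc = fftconvolve(a, b[::-1, ::-1], mode="same")
          peak = np.unravel_index(np.argmax(cc), cc.shape)
          return np.array(peak) - np.array(cc.shape) // 2

      def nowcast(field_prev, field_now, n_leads=6):
          """Advect the latest radar field along the estimated motion for n_leads time steps."""
          d = global_motion(field_prev, field_now)
          return [nd_shift(field_now, k * d, order=1, mode="constant", cval=0.0)
                  for k in range(1, n_leads + 1)]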

  11. Time-based and event-based prospective memory in autism spectrum disorder: the roles of executive function and theory of mind, and time-estimation.

    PubMed

    Williams, David; Boucher, Jill; Lind, Sophie; Jarrold, Christopher

    2013-07-01

    Prospective memory (remembering to carry out an action in the future) has been studied relatively little in ASD. We explored time-based (carry out an action at a pre-specified time) and event-based (carry out an action upon the occurrence of a pre-specified event) prospective memory, as well as possible cognitive correlates, among 21 intellectually high-functioning children with ASD, and 21 age- and IQ-matched neurotypical comparison children. We found impaired time-based, but undiminished event-based, prospective memory among children with ASD. In the ASD group, time-based prospective memory performance was associated significantly with diminished theory of mind, but not with diminished cognitive flexibility. There was no evidence that time-estimation ability contributed to time-based prospective memory impairment in ASD.

  12. Event-specific detection of stacked genetically modified maize Bt11 x GA21 by UP-M-PCR and real-time PCR.

    PubMed

    Xu, Wentao; Yuan, Yanfang; Luo, Yunbo; Bai, Weibin; Zhang, Chunjiao; Huang, Kunlun

    2009-01-28

    More and more stacked GMOs have been developed for improved functional properties and/or stronger intended characteristics, such as pest resistance and improved product efficiency. Bt11 x GA21 is a new kind of stacked GM maize developed by Monsanto Company. Since there are no unique flanking sequences in stacked GMOs, up to now, no appropriate method has been reported to accurately detect them. In this paper, a novel universal primer multiplex PCR (UP-M-PCR) was developed and applied as a rapid screening method for the simultaneous detection of five target sequences (NOS, 35S, Bt11 event, GA21 event, and IVR) in maize Bt11 x GA21. This method overcame the disadvantages rooted deeply in conventional multiplex PCR, such as complex manipulation, lower sensitivity, self-inhibition and amplification disparity resulting from different primers. Moreover, it achieved high specificity and a detection limit of 0.1% (approximately 38 haploid genome copies). Furthermore, real-time PCR combined with multivariate statistical analysis was used for accurate quantification of stacked GM maize Bt11 x GA21 in a 100% GM maize mixture (Bt11 x GA21, Bt11 and GA21). Detection results showed that this method could accurately validate the content of Bt11, GA21 and Bt11 x GA21 in the 100% GM mixture with a detection limit of 0.5% (approximately 200 haploid genome copies) and a low relative standard deviation (<5%). All the data proved that this method may be widely applied in event-specific detection of other stacked GMOs in GM mixtures.
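
    The abstract does not give the quantification formula; the sketch below shows a generic standard-curve calculation of GM content (event copies relative to an endogenous reference gene) of the kind used in event-specific real-time PCR assays, not the authors' exact calibration. The Ct values and curve parameters are hypothetical.

    ```python
    import numpy as np

    def copies_from_ct(ct, slope, intercept):
        """Convert a Ct value to a copy number via a standard curve of the form
        Ct = slope * log10(copies) + intercept, fitted from serial dilutions."""
        return 10 ** ((ct - intercept) / slope)

    def gm_content_percent(ct_event, ct_reference, curve_event, curve_reference):
        """GM content as event copies per endogenous reference-gene copies (x100)."""
        event_copies = copies_from_ct(ct_event, *curve_event)
        reference_copies = copies_from_ct(ct_reference, *curve_reference)
        return 100.0 * event_copies / reference_copies

    # hypothetical Ct values and (slope, intercept) standard-curve parameters
    print(gm_content_percent(29.1, 24.3, (-3.32, 38.0), (-3.34, 37.5)))
    ```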

  13. Exploiting semantics for scheduling real-time data collection from sensors to maximize event detection

    NASA Astrophysics Data System (ADS)

    Vaisenberg, Ronen; Mehrotra, Sharad; Ramanan, Deva

    2009-01-01

    A distributed camera network allows for many compelling applications such as large-scale tracking or event detection. In most practical systems, resources are constrained. Although one would like to probe every camera at every time instant and store every frame, this is simply not feasible. Constraints arise from network bandwidth restrictions, I/O and disk usage from writing images, and CPU usage needed to extract features from the images. Assume that, due to resource constraints, only a subset of sensors can be probed at any given time unit. This paper examines the problem of selecting the "best" subset of sensors to probe under some user-specified objective - e.g., detecting as much motion as possible. With this objective, we would like to probe a camera when we expect motion, but would not like to waste resources on a non-active camera. The main idea behind our approach is the use of sensor semantics to guide the scheduling of resources. We learn a dynamic probabilistic model of motion correlations between cameras, and use the model to guide resource allocation for our sensor network. Although previous work has leveraged probabilistic models for sensor-scheduling, our work is distinct in its focus on real-time building-monitoring using a camera network. We validate our approach on a sensor network of a dozen cameras spread throughout a university building, recording measurements of unscripted human activity over a two-week period. We automatically learned a semantic model of typical behaviors, and show that one can significantly improve efficiency of resource allocation by exploiting this model.
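
    A minimal sketch of the scheduling idea, assuming a simple first-order model in which transition[i, j] is the learned probability that motion at camera i is followed by motion at camera j; the paper's actual probabilistic model and objective are richer, so this is illustrative only.

    ```python
    import numpy as np

    def schedule_probes(transition, belief, k):
        """Pick the k cameras most likely to show motion at the next time step.
        transition[i, j] = learned P(motion at j next | motion at i now);
        belief[i] = current probability that camera i sees motion."""
        predicted = transition.T @ belief
        return np.argsort(predicted)[::-1][:k], predicted

    def update_belief(predicted, probed, observed_motion):
        """Keep predictions for unprobed cameras; overwrite probed cameras with
        their actual (0/1) observations."""
        belief = predicted.copy()
        belief[probed] = observed_motion.astype(float)
        return belief

    # hypothetical 3-camera network, probing 1 camera per time step
    transition = np.array([[0.6, 0.3, 0.1], [0.2, 0.5, 0.3], [0.1, 0.2, 0.7]])
    belief = np.array([1.0, 0.0, 0.0])
    probed, predicted = schedule_probes(transition, belief, k=1)
    belief = update_belief(predicted, probed, np.array([1]))
    ```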

  14. The variations of long time period slow slip events along the Ryukyu subduction zone

    NASA Astrophysics Data System (ADS)

    Tu, Y. T.; Heki, K.

    2014-12-01

    Slow slip events (SSEs) are a type of slow earthquake that can be observed with Global Positioning System (GPS) networks around the world. Such events are detected on intensely coupled plate boundaries such as the Cascadia subduction zone in western North America (Dragert et al., 2001), Mexico (Kostoglodov et al., 2003), Alaska (Ohta et al., 2007), and the Tokai and Boso areas of central Japan (Ozawa et al., 2002, 2003), and are considered to be related to large subduction thrust earthquakes. However, in the southwestern Ryukyu Trench, which most researchers believe to be a decoupled plate boundary, SSEs recur regularly and are located on a patch that is as deep as 20 to 40 km (Heki and Kataoka, 2008). To characterize the SSEs in this area and their variations with time, 16 years of GEONET GPS data are used in this study. From 1997 to 2014, more than thirty SSEs are identified near Hateruma Island, Ryukyu. The average recurrence interval is 6.3 months and the released seismic moment is Mw 6.6 on average. However, the recurrence interval is not constant. From 1997 to 2002, the interval is 7.5 months, but during 2002 to 2008 it decreases suddenly to 5.5 months. After 2008, the value returns to 7.2 months. Furthermore, the slip amount of the SSEs in this area varies with time. From 1997 to 2002, the slip is 9.5 cm/year; during 2002 to 2008, the value slightly increases to 10.5 cm/year. However, from 2008 to 2013, the slip drops to 6.6 cm/year, although, according to the trend of cumulative slip, the slip value would increase in 2014. Considering these data, we find that the slip values increase conspicuously in 2002 and 2013. Coincidentally, an Mw 7.1 thrust earthquake occurred in 2002 and earthquake swarm activity started in the Okinawa Trough approximately 50 km north of the SSE patch. In 2013, another earthquake swarm occurred in nearly the same area as the 2002 activity. This suggests that the

  15. How the timing of weather events influences early development in a large mammal.

    PubMed

    Hendrichsen, D K; Tyler, N J C

    2014-07-01

    Capturing components of the weather that drive environment-animal interactions is a perennial problem in ecology. Identifying biologically significant elements of weather conditions in sensible statistics suitable for analysis of life history variation and population dynamics is central. Meteorological variables such as temperature, precipitation, and wind modulate rates of heat loss in animals, but analysis of their effects on endothermic species is complicated by the fact that their influence on energy balance is not invariably linear, even across the thermoneutral range. Rather, the thermal load imposed by a given set of weather conditions is a function of organisms' metabolic requirement, which, crucially, may vary spontaneously both seasonally and across different life phases. We propose that the endogenous component of variation in metabolic demand introduces a temporal dimension and that, as a consequence, the specific effect of meteorological variables on energy balance and attendant life history parameters is a function of the timing of weather events with respect to the organism's metabolic rhythm(s). To test this, we examined how a spontaneous increase in metabolic demand influenced the effect of weather on early development in a large mammal. Specifically, we examined interaction between the exponential rise in the energy requirements of pregnancy and depth of snow, which restricts dams' access to forage, on the body mass of reindeer calves (Rangifer tarandus) at weaning. As expected, we detected a significant temporal component: the specific negative effect of snow on weaning mass was not constant, but increased across pregnancy. The life history response was therefore better predicted by interaction between the magnitude and the timing of weather events than by their magnitude alone. To our knowledge, this is the first demonstration of the influence of an endogenous metabolic dynamic on the impact of weather on a life history trait in a free

  17. Correlation Analyses Between the Characteristic Times of Gradual Solar Energetic Particle Events and the Properties of Associated Coronal Mass Ejections

    NASA Astrophysics Data System (ADS)

    Pan, Z. H.; Wang, C. B.; Wang, Yuming; Xue, X. H.

    2011-06-01

    It is generally believed that gradual solar energetic particles (SEPs) are accelerated by shocks associated with coronal mass ejections (CMEs). Using an ice-cream cone model, the radial speed and angular width of 95 CMEs associated with SEP events during 1998 - 2002 are calculated from SOHO/LASCO observations. Then, we investigate the relationships between the kinematic properties of these CMEs and the characteristic times of the intensity-time profile of the accompanying SEP events observed at 1 AU. These characteristic times of the SEPs are i) the onset time from the accompanying CME eruption at the Sun to the SEP arrival at 1 AU, ii) the rise time from the SEP onset to the time when the SEP intensity is one-half of the peak intensity, and iii) the duration over which the SEP intensity is within a factor of two of the peak intensity. It is found that the onset time has no significant correlation with either the radial speed or the angular width of the accompanying CME. For events that are poorly connected to the Earth, the SEP rise time and duration have no significant correlation with the radial speed and angular width of the associated CMEs. However, for events that are magnetically well connected to the Earth, the SEP rise time and duration have significantly positive correlations with the radial speed and angular width of the associated CMEs. This indicates that a CME event with wider angular width and higher speed may more easily drive a strong and wide shock near the Earth-connected interplanetary magnetic field lines, may trap and accelerate particles for a longer time, and may lead to a longer rise time and duration of the ensuing SEP event.
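
    As a toy illustration of the kind of correlation analysis described (not the authors' data), the snippet below computes a Pearson correlation between CME radial speed and SEP rise time for a handful of hypothetical well-connected events; a rank correlation would be a natural alternative.

    ```python
    import numpy as np
    from scipy.stats import pearsonr

    # hypothetical well-connected events: CME radial speed (km/s) from the cone
    # model and SEP rise time (hours) at 1 AU
    speed = np.array([650.0, 900.0, 1200.0, 1500.0, 1800.0, 2100.0])
    rise_time = np.array([4.0, 6.5, 9.0, 11.0, 14.5, 16.0])

    r, p_value = pearsonr(speed, rise_time)
    print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
    ```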

  18. Real-time detection and classification of anomalous events in streaming data

    DOEpatents

    Ferragut, Erik M.; Goodall, John R.; Iannacone, Michael D.; Laska, Jason A.; Harrison, Lane T.

    2016-04-19

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The events can be displayed to a user in user-defined groupings in an animated fashion. The system can include a plurality of anomaly detectors that together implement an algorithm to identify low probability events and detect atypical traffic patterns. The atypical traffic patterns can then be classified as being of interest or not. In one particular example, in a network environment, the classification can be whether the network traffic is malicious or not.
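
    The patent text above is high level; as a purely illustrative sketch (not the patented algorithm), the class below scores each streaming event by the negative log-probability of its categorical feature values under the frequencies seen so far, so rare combinations receive high anomalousness scores.

    ```python
    import math
    from collections import defaultdict

    class StreamingAnomalyScorer:
        """Illustrative scorer (not the patented system): each event is a dict of
        categorical features; its anomalousness is the negative log of the product
        of smoothed per-feature relative frequencies seen so far."""

        def __init__(self, smoothing=1.0):
            self.counts = defaultdict(lambda: defaultdict(float))
            self.totals = defaultdict(float)
            self.smoothing = smoothing

        def score(self, event):
            s = 0.0
            for feature, value in event.items():
                total = self.totals[feature] + self.smoothing
                count = self.counts[feature][value] + self.smoothing
                s += -math.log(count / total)   # rare values -> higher score
            return s

        def update(self, event):
            for feature, value in event.items():
                self.counts[feature][value] += 1.0
                self.totals[feature] += 1.0

    # hypothetical network events: score first, then learn from the event
    scorer = StreamingAnomalyScorer()
    for ev in [{"src": "10.0.0.1", "dst_port": "80"},
               {"src": "10.0.0.1", "dst_port": "80"},
               {"src": "203.0.113.7", "dst_port": "6667"}]:
        print(round(scorer.score(ev), 2))
        scorer.update(ev)
    ```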

  19. Hierarchy of temporal responses of multivariate self-excited epidemic processes

    NASA Astrophysics Data System (ADS)

    Saichev, Alexander; Maillart, Thomas; Sornette, Didier

    2013-04-01

    Many natural and social systems are characterized by bursty dynamics, for which past events trigger future activity. These systems can be modelled by so-called self-excited Hawkes conditional Poisson processes. It is generally assumed that all events have similar triggering abilities. However, some systems exhibit heterogeneity and clusters with possibly different intra- and inter-triggering, which can be accounted for by generalization into the "multivariate" self-excited Hawkes conditional Poisson processes. We develop the general formalism of the multivariate moment generating function for the cumulative number of first-generation and of all-generation events triggered by a given mother event (the "shock") as a function of the current time t. This corresponds to studying the response function of the process. A variety of different systems have been analyzed. In particular, for systems in which triggering between events of different types proceeds through a one-dimensional directed or symmetric chain of influence in type space, we report a novel hierarchy of intermediate asymptotic power-law decays ~1/t^{1-(m+1)θ} of the rate of triggered events as a function of the distance m of the events to the initial shock in the type space, where 0 < θ < 1 for the relevant long-memory processes characterizing many natural and social systems. The richness of the generated time dynamics comes from the cascades of intermediate events of possibly different kinds, unfolding via random changes of types genealogy.
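
    For readers who want to experiment with such processes, the sketch below simulates a multivariate self-excited Hawkes process by Ogata's thinning method. It uses exponential kernels for simplicity, whereas the asymptotics quoted above concern long-memory (power-law) kernels; mu, alpha and beta are hypothetical parameters.

    ```python
    import numpy as np

    def simulate_multivariate_hawkes(mu, alpha, beta, t_max, rng=None):
        """Simulate a multivariate Hawkes process with exponential kernels,
        lambda_i(t) = mu[i] + sum over past events (s, k) of alpha[i, k] * exp(-beta * (t - s)),
        via Ogata's thinning algorithm. Returns a list of (time, type) pairs."""
        rng = np.random.default_rng(rng)
        d = len(mu)
        events = []
        t = 0.0
        while True:
            # intensities just after time t; they only decay until the next event,
            # so their sum is a valid upper bound for thinning
            lam = np.array([mu[i] + sum(alpha[i, k] * np.exp(-beta * (t - s))
                                        for s, k in events) for i in range(d)])
            lam_bar = lam.sum()
            t += rng.exponential(1.0 / lam_bar)
            if t >= t_max:
                return events
            lam_t = np.array([mu[i] + sum(alpha[i, k] * np.exp(-beta * (t - s))
                                          for s, k in events) for i in range(d)])
            if rng.uniform() * lam_bar <= lam_t.sum():
                # accept and assign the event type proportionally to its intensity
                events.append((t, rng.choice(d, p=lam_t / lam_t.sum())))

    # hypothetical two-type example; alpha[i, j] / beta is the mean number of
    # type-i events directly triggered by one type-j event (keep spectral radius < 1)
    mu = np.array([0.1, 0.1])
    alpha = np.array([[0.3, 0.1], [0.1, 0.3]])
    events = simulate_multivariate_hawkes(mu, alpha, beta=1.0, t_max=500.0, rng=42)
    ```

    For the process to remain stationary, the matrix of mean offspring counts alpha/beta should have spectral radius below one.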

  20. Development of a real time monitor and multivariate method for long term diagnostics of atmospheric pressure dielectric barrier discharges: Application to He, He/N2, and He/O2 discharges

    NASA Astrophysics Data System (ADS)

    O'Connor, N.; Milosavljević, V.; Daniels, S.

    2011-08-01

    In this paper we present the development and application of a real-time atmospheric pressure discharge monitoring diagnostic. The software-based diagnostic is designed to extract latent electrical and optical information associated with the operation of an atmospheric pressure dielectric barrier discharge (APDBD) over long time scales. Given that little is known about long term temporal effects in such discharges, the diagnostic methodology is applied to the monitoring of an APDBD in helium and helium with both 0.1% nitrogen and 0.1% oxygen gas admixtures over periods of tens of minutes. Given the large datasets associated with the experiments, it is shown that this process is greatly expedited through the novel application of multivariate correlations between the electrical and optical parameters of the corresponding chemistries, which, in turn, also facilitates comparisons between the individual chemistries. The results of these studies show that the electrical and optical parameters of the discharge in helium and upon the addition of gas admixtures evolve over time scales far longer than the gas residence time and have been compared with current modelling work. It is envisaged that the diagnostic together with the application of multivariate correlations will be applied to rapid system identification and prototyping in both experimental and industrial APDBD systems in the future.

  1. Illustration of compositional variations over time of Chinese porcelain glazes combining micro-X-ray Fluorescence spectrometry, multivariate data analysis and Seger formulas

    NASA Astrophysics Data System (ADS)

    Van Pevenage, J.; Verhaeven, E.; Vekemans, B.; Lauwers, D.; Herremans, D.; De Clercq, W.; Vincze, L.; Moens, L.; Vandenabeele, P.

    2015-01-01

    In this research, the transparent glaze layers of Chinese porcelain samples were investigated. Depending on the production period, these samples can be divided into two groups: the samples of group A dating from the Kangxi period (1661-1722), and the samples of group B produced under emperor Qianlong (1735-1795). The specific sample preparation method and the small spot size of the X-ray beam enable investigation of the transparent glaze layers. Despite the many existing research papers about glaze investigations of ceramics and/or porcelain ware, this research reveals new insights into the glaze composition and structure of Chinese porcelain samples. In this paper it is demonstrated, using micro-X-ray Fluorescence (μ-XRF) spectrometry, multivariate data analysis and statistical analysis (Hotelling's T-Square test), that the transparent glaze layers of the samples of groups A and B are significantly different (95% confidence level). Calculation of the Seger formulas enabled classification of the glazes. Combining all the information, the difference in composition of the Chinese porcelain glazes of the Kangxi period and the Qianlong period can be demonstrated.

  2. A Naive Bayes machine learning approach to risk prediction using censored, time-to-event data.

    PubMed

    Wolfson, Julian; Bandyopadhyay, Sunayan; Elidrisi, Mohamed; Vazquez-Benitez, Gabriela; Vock, David M; Musgrove, Donald; Adomavicius, Gediminas; Johnson, Paul E; O'Connor, Patrick J

    2015-09-20

    Predicting an individual's risk of experiencing a future clinical outcome is a statistical task with important consequences for both practicing clinicians and public health experts. Modern observational databases such as electronic health records provide an alternative to the longitudinal cohort studies traditionally used to construct risk models, bringing with them both opportunities and challenges. Large sample sizes and detailed covariate histories enable the use of sophisticated machine learning techniques to uncover complex associations and interactions, but observational databases are often 'messy', with high levels of missing data and incomplete patient follow-up. In this paper, we propose an adaptation of the well-known Naive Bayes machine learning approach to time-to-event outcomes subject to censoring. We compare the predictive performance of our method with the Cox proportional hazards model which is commonly used for risk prediction in healthcare populations, and illustrate its application to prediction of cardiovascular risk using an electronic health record dataset from a large Midwest integrated healthcare system.
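
    The authors' estimator is not reproduced here; as one generic way to combine a Naive Bayes classifier with censored follow-up, the sketch below uses inverse-probability-of-censoring weighting (IPCW) to turn the data into a weighted binary problem ("event within a fixed horizon"), with the lifelines and scikit-learn packages assumed to be available.

    ```python
    import numpy as np
    from lifelines import KaplanMeierFitter
    from sklearn.naive_bayes import GaussianNB

    def fit_horizon_classifier(X, time, event, horizon):
        """Hypothetical IPCW + Naive Bayes sketch (not the paper's method).
        X: (n, p) feature array; time, event: 1-D arrays with event = 1 if the
        outcome was observed. Targets the binary question: event before `horizon`?"""
        # Kaplan-Meier estimate of the censoring distribution G(t) = P(C > t)
        kmf = KaplanMeierFitter()
        kmf.fit(time, event_observed=1 - event)
        # subjects whose status at `horizon` is known: events before the horizon,
        # or anyone still under observation past the horizon
        known = (event == 1) & (time <= horizon) | (time > horizon)
        y = ((event == 1) & (time <= horizon)).astype(int)
        # inverse-probability-of-censoring weights
        t_eval = np.minimum(time, horizon)
        G = np.clip(kmf.survival_function_at_times(t_eval).values, 1e-6, None)
        w = known / G
        clf = GaussianNB()
        clf.fit(X[known], y[known], sample_weight=w[known])
        return clf
    ```

    Subjects censored before the horizon have unknown status and receive zero weight; the remaining subjects are up-weighted by the inverse of the Kaplan-Meier censoring survival probability.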

  3. Nuclear event time histories and computed site transfer functions for locations in the Los Angeles region

    USGS Publications Warehouse

    Rogers, A.M.; Covington, P.A.; Park, R.B.; Borcherdt, R.D.; Perkins, D.M.

    1980-01-01

    This report presents a collection of Nevada Test Site (NTS) nuclear explosion recordings obtained at sites in the greater Los Angeles, Calif., region. The report includes ground velocity time histories as well as derived site transfer functions. These data have been collected as part of a study to evaluate the validity of using low-level ground motions to predict the frequency-dependent response of a site during an earthquake. For this study 19 nuclear events were recorded at 98 separate locations. Some of these sites have recorded more than one of the nuclear explosions, and, consequently, there are a total of 159 three-component station records. The locations of all the recording sites are shown in figures 1–5; the station coordinates and abbreviations are given in table 1. The station addresses are listed in table 2, and the nuclear explosions that were recorded are listed in table 3. The recording sites were chosen on the basis of three criteria: (1) that the underlying geological conditions were representative of conditions over significant areas of the region, (2) that the site was the location of a strong-motion recording of the 1971 San Fernando earthquake, or (3) that more complete geographical coverage was required in that location.

  4. Early Events in Insulin Fibrillization Studied by Time-Lapse Atomic Force Microscopy

    PubMed Central

    Podestà, Alessandro; Tiana, Guido; Milani, Paolo; Manno, Mauro

    2006-01-01

    The importance of understanding the mechanism of protein aggregation into insoluble amyloid fibrils lies not only in its medical consequences, but also in its more basic properties of self-organization. The discovery that a large number of uncorrelated proteins can form, under proper conditions, structurally similar fibrils has suggested that the underlying mechanism is a general feature of polypeptide chains. In this work, we address the early events preceding amyloid fibril formation in solutions of zinc-free human insulin incubated at low pH and high temperature. Here, we show by time-lapse atomic force microscopy that a steady-state distribution of protein oligomers with a quasiexponential tail is reached within a few minutes after heating. This metastable phase lasts for a few hours, until fibrillar aggregates are observable. Although for such complex systems different aggregation mechanisms can occur simultaneously, our results indicate that the prefibrillar phase is mainly controlled by a simple coagulation-evaporation kinetic mechanism, in which concentration acts as a critical parameter. These experimental facts, along with the kinetic model used, suggest a critical role for thermal concentration fluctuations in the process of fibril nucleation. PMID:16239333

  5. Early events in insulin fibrillization studied by time-lapse atomic force microscopy.

    PubMed

    Podestà, Alessandro; Tiana, Guido; Milani, Paolo; Manno, Mauro

    2006-01-15

    The importance of understanding the mechanism of protein aggregation into insoluble amyloid fibrils lies not only in its medical consequences, but also in its more basic properties of self-organization. The discovery that a large number of uncorrelated proteins can form, under proper conditions, structurally similar fibrils has suggested that the underlying mechanism is a general feature of polypeptide chains. In this work, we address the early events preceding amyloid fibril formation in solutions of zinc-free human insulin incubated at low pH and high temperature. Here, we show by time-lapse atomic force microscopy that a steady-state distribution of protein oligomers with a quasiexponential tail is reached within a few minutes after heating. This metastable phase lasts for a few hours, until fibrillar aggregates are observable. Although for such complex systems different aggregation mechanisms can occur simultaneously, our results indicate that the prefibrillar phase is mainly controlled by a simple coagulation-evaporation kinetic mechanism, in which concentration acts as a critical parameter. These experimental facts, along with the kinetic model used, suggest a critical role for thermal concentration fluctuations in the process of fibril nucleation.

  6. Real-Time Microbiology Laboratory Surveillance System to Detect Abnormal Events and Emerging Infections, Marseille, France.

    PubMed

    Abat, Cédric; Chaudet, Hervé; Colson, Philippe; Rolain, Jean-Marc; Raoult, Didier

    2015-08-01

    Infectious diseases are a major threat to humanity, and accurate surveillance is essential. We describe how to implement a laboratory data-based surveillance system in a clinical microbiology laboratory. Two historical Microsoft Excel databases were implemented. The data were then sorted and used to execute the following 2 surveillance systems in Excel: the Bacterial real-time Laboratory-based Surveillance System (BALYSES) for monitoring the number of patients infected with bacterial species isolated at least once in our laboratory during the study period, and the Marseille Antibiotic Resistance Surveillance System (MARSS), which surveys the primary β-lactam resistance phenotypes for 15 selected bacterial species. The first historical database contained 174,853 identifications of bacteria, and the second contained 12,062 results of antibiotic susceptibility testing. From May 21, 2013, through June 4, 2014, BALYSES and MARSS enabled the detection of 52 abnormal events for 24 bacterial species, leading to 19 official reports. This system is currently being refined and improved.
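
    The exact BALYSES/MARSS alert criteria are not given in the abstract; the sketch below shows one simple threshold rule of the kind used in laboratory surveillance, flagging a week whose isolate count for a species exceeds the historical mean by more than a chosen number of standard deviations. All numbers are hypothetical.

    ```python
    import numpy as np

    def weekly_alarm(history, current, z=2.0, min_events=5):
        """Flag an abnormal event when the current weekly count for a species
        exceeds the historical weekly mean by more than z standard deviations
        (a simple threshold rule; the BALYSES/MARSS criteria are not reproduced here)."""
        history = np.asarray(history, dtype=float)
        mean, sd = history.mean(), history.std(ddof=1)
        return current >= max(min_events, mean + z * sd)

    # hypothetical weekly isolate counts for one species
    print(weekly_alarm(history=[3, 5, 4, 2, 6, 4, 3, 5], current=12))
    ```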

  7. Orbit Determination and Navigation of the Time History of Events and Macroscale Interactions during Substorms (THEMIS)

    NASA Technical Reports Server (NTRS)

    Morinelli, Patrick; Cosgrove, jennifer; Blizzard, Mike; Nicholson, Ann; Robertson, Mika

    2007-01-01

    This paper provides an overview of the launch and early orbit activities performed by the NASA Goddard Space Flight Center's (GSFC) Flight Dynamics Facility (FDF) in support of five probes comprising the Time History of Events and Macroscale Interactions during Substorms (THEMIS) spacecraft. The FDF was tasked to support THEMIS in a limited capacity providing backup orbit determination support for validation purposes for all five THEMIS probes during launch plus 30 days in coordination with University of California Berkeley Flight Dynamics Center (UCB/FDC). The FDF's orbit determination responsibilities were originally planned to be as a backup to the UCB/FDC for validation purposes only. However, various challenges early on in the mission and a Spacecraft Emergency declared thirty hours after launch placed the FDF team in the role of providing the orbit solutions that enabled contact with each of the probes and the eventual termination of the Spacecraft Emergency. This paper details the challenges and various techniques used by the GSFC FDF team to successfully perform orbit determination for all five THEMIS probes during the early mission. In addition, actual THEMIS orbit determination results are presented spanning the launch and early orbit mission phase. Lastly, this paper enumerates lessons learned from the THEMIS mission, as well as demonstrates the broad range of resources and capabilities within the FDF for supporting critical launch and early orbit navigation activities, especially challenging for constellation missions.

  8. Orbit Determination and Navigation of the Time History of Events and Macroscale Interactions during Substorms (THEMIS)

    NASA Technical Reports Server (NTRS)

    Morinelli, Patrick; Cosgrove, Jennifer; Blizzard, Mike; Robertson, Mike

    2007-01-01

    This paper provides an overview of the launch and early orbit activities performed by the NASA Goddard Space Flight Center's (GSFC) Flight Dynamics Facility (FDF) in support of five probes comprising the Time History of Events and Macroscale Interactions during Substorms (THEMIS) spacecraft. The FDF was tasked to support THEMIS in a limited capacity providing backup orbit determination support for validation purposes for all five THEMIS probes during launch plus 30 days in coordination with University of California Berkeley Flight Dynamics Center (UCB/FDC). The FDF's orbit determination responsibilities were originally planned to be as a backup to the UCB/FDC for validation purposes only. However, various challenges early on in the mission and a Spacecraft Emergency declared thirty hours after launch placed the FDF team in the role of providing the orbit solutions that enabled contact with each of the probes and the eventual termination of the Spacecraft Emergency. This paper details the challenges and various techniques used by the GSFC FDF team to successfully perform orbit determination for all five THEMIS probes during the early mission. In addition, actual THEMIS orbit determination results are presented spanning the launch and early orbit mission phase. Lastly, this paper enumerates lessons learned from the THEMIS mission, as well as demonstrates the broad range of resources and capabilities within the FDF for supporting critical launch and early orbit navigation activities, especially challenging for constellation missions.

  9. Evolution of Storm-time Subauroral Electric Fields: RCM Event Simulations

    NASA Astrophysics Data System (ADS)

    Sazykin, S.; Spiro, R. W.; Wolf, R. A.; Toffoletto, F.; Baker, J.; Ruohoniemi, J. M.

    2012-12-01

    Subauroral polarization streams (SAPS) are regions of strongly-enhanced westward ExB plasma drift (poleward-directed electric fields) located just equatorward of the evening auroral oval. Several recently installed HF (coherent scatter) radars in the SuperDARN chain at mid-latitudes present a novel opportunity for obtaining two-dimensional maps of ionospheric ExB flows at F-region altitudes that span several hours of the evening and nighttime subauroral ionosphere. These new and exciting observations of SAPS provide an opportunity and a challenge to coupled magnetosphere-ionosphere models. In this paper, we use the Rice Convection Model (RCM) to simulate several events where SAPS were observed by the mid-latitude SuperDARN chain. RCM frequently predicts the occurrence of SAPS in the subauroral evening MLT sector; the mechanism is essentially current closure on the dusk side where downward Birkeland currents (associated with the ion plasma sheet inner edge) map to a region of reduced ionospheric conductance just equatorward of the diffuse auroral precipitation (associated with the electron plasma sheet inner edge). We present detailed comparisons of model-computed ionospheric convection patterns with observations, with two goals in mind: (1) to analyze to what extent the observed appearance and time evolution of SAPS structures are driven by time variations of the cross polar cap potential drop (or, equivalently, the z-component of the interplanetary magnetic field), and (2) to evaluate the ability of the model to reproduce the spatial extent and magnitude of SAPS structures.

  10. Observations of Time Variable Magnitude Events of Phoebe, Ariel, and Titania

    NASA Astrophysics Data System (ADS)

    Miller, Charles; Chanover, N. J.; Holtzman, J. A.; Verbiscer, A. J.

    2007-10-01

    Visual observations of Saturn's moon Phoebe and Uranus' moons Ariel and Titania were made from the Apache Point Observatory (APO). Phoebe was observed with the APO 1 meter telescope over a two-month period from 06 January to 04 March 2005, bracketing the zero-phase opposition on 13 January 2005. Phoebe was observed at Sun-Phoebe-Earth phase angles as low as 0.05 degrees on consecutive nights immediately before and after opposition in V, B, R, and I filters. Light curves of the opposition surge, the brightness increase that occurs as the phase angle drops below 0.10 degrees, are presented from these data. The data were processed using standard IRAF aperture photometry image processing techniques. The magnitude and duration of the opposition surge provide clues about the grain size of surface particles on Phoebe. Observations were also made of Uranian moons during mutual occultations in August 2007. Mutual satellite occultations are taking place throughout 2007 as Uranus passes through its equinox, which occurs once every 42 years. The timing and flux variation of satellite occultations provide a check on the accuracy of satellite orbital models. Light curves for Ariel and Titania in R and I filters as they are occulted by Umbriel are presented from data acquired with the APO 1 meter and 3.5 meter telescopes. Comparison is made to the predicted total flux reduction and event timing for each occultation as calculated by the Institut de Mecanique Celeste et de Calcul des Ephemerides (IMCCE), and implications of the results for the determination of the relative orbital inclinations of Umbriel, Ariel, and Titania are discussed. This work was supported by an NMSU Space and Aerospace Research Cluster Graduate Fellowship.

  11. U-Scores for Multivariate Data in Sports.

    PubMed

    Wittkowski, Knut M; Song, Tingting; Anderson, Kent; Daniels, John E

    2008-07-18

    In many sport competitions athletes, teams, or countries are evaluated based on several variables. The strong assumptions underlying traditional 'linear weight' scoring systems (that the relative importance, interactions and linearizing transformations of the variables are known) can often not be justified on theoretical grounds, and empirical 'validation' of weights, interactions and transformations, is problematic when a 'gold standard' is lacking. With μ-scores (u-scores for multivariate data) one can integrate information even if the variables have different scales and unknown interactions or if the events counted are not directly comparable, as long as the variables have an 'orientation'. Using baseball as an example, we discuss how measures based on μ-scores can complement the existing measures for 'performance' (which may depend on the situation) by providing the first multivariate measures for 'ability' (which should be independent of the situation). Recently, μ-scores have been extended to situations where count variables are graded by importance or relevance, such as medals in the Olympics (Wittkowski 2003) or Tour-de-France jerseys (Cherchye and Vermeulen 2006, 2007). Here, we present extensions to 'censored' variables (life-time achievements of active athletes), penalties (counting a win more than two ties) and hierarchically structured variables (Nordic, alpine, outdoor, and indoor Olympic events). The methods presented are not restricted to sports. Other applications of the method include medicine (adverse events), finance (risk analysis), social choice theory (voting), and economy (long-term profit).
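
    A minimal sketch of the basic idea behind u-scores for multivariate data: subjects are compared pairwise under the partial ordering induced by the oriented variables, and each subject's score is the number of subjects it dominates minus the number it is dominated by. The extensions to censored, graded, and hierarchically structured variables described above are not implemented here.

    ```python
    import numpy as np

    def mu_scores(X):
        """Basic u-scores for multivariate data: X is an (n_subjects, n_variables)
        array with every variable oriented so that larger is better. Subject i
        dominates j if it is >= on all variables and > on at least one."""
        n = X.shape[0]
        scores = np.zeros(n)
        for i in range(n):
            dominates = np.all(X[i] >= X, axis=1) & np.any(X[i] > X, axis=1)
            dominated = np.all(X[i] <= X, axis=1) & np.any(X[i] < X, axis=1)
            scores[i] = dominates.sum() - dominated.sum()
        return scores

    # hypothetical athletes scored on three oriented performance variables
    print(mu_scores(np.array([[3, 2, 5], [1, 1, 4], [3, 2, 6], [2, 3, 1]])))
    ```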

  12. Limits of declustering methods for disentangling exogenous from endogenous events in time series with foreshocks, main shocks, and aftershocks.

    PubMed

    Sornette, D; Utkin, S

    2009-06-01

    Many time series in natural and social sciences can be seen as resulting from an interplay between exogenous influences and an endogenous organization. We use a simple epidemic-type aftershock model of events occurring sequentially, in which future events are influenced (partially triggered) by past events, to ask how well one can disentangle the exogenous events from the endogenous ones. We apply both model-dependent and model-independent stochastic declustering methods to reconstruct the tree of ancestry and estimate key parameters. In contrast with previously reported positive results, we have to conclude that declustered catalogs are rather unreliable for the synthetic catalogs that we have investigated, which contain on the order of thousands of events, typical of realistic applications. The estimated rates of exogenous events suffer from large errors. The branching ratio n, quantifying the fraction of events that have been triggered by previous events, is also badly estimated in general from declustered catalogs. We find, however, that the errors tend to be smaller and perhaps acceptable in some cases for small triggering efficiency and branching ratios. The high level of randomness together with the long memory makes the stochastic reconstruction of trees of ancestry and the estimation of the key parameters perhaps intrinsically unreliable for long-memory processes. For shorter memories (larger "bare" Omori exponent), the results improve significantly.
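
    For concreteness, the sketch below computes the probabilistic (stochastic) declustering weights for a purely temporal epidemic-type aftershock (ETAS) model: for each event, the probability that it is a background (exogenous) event, and the probabilities that it was triggered by each earlier event. The kernel form and parameter values are illustrative assumptions.

    ```python
    import numpy as np

    def triggering_probabilities(t, m, mu, K, alpha, c, p, m0):
        """Stochastic declustering weights for a temporal ETAS model with intensity
        lambda(t) = mu + sum_{t_i < t} K * exp(alpha * (m_i - m0)) * (t - t_i + c)**(-p).
        t: event times in ascending order; m: magnitudes. Returns
        (p_background, p_trigger) with p_trigger[i, j] = P(event j triggered by event i)."""
        n = len(t)
        g = np.zeros((n, n))
        for j in range(n):
            dt = t[j] - t[:j]
            g[:j, j] = K * np.exp(alpha * (m[:j] - m0)) * (dt + c) ** (-p)
        lam = mu + g.sum(axis=0)          # total intensity at each event time
        return mu / lam, g / lam

    # hypothetical small catalog with assumed ETAS parameters
    t = np.array([0.0, 0.5, 0.7, 5.0, 5.1])
    m = np.array([4.5, 3.2, 3.0, 4.0, 3.1])
    p_bg, p_trig = triggering_probabilities(t, m, mu=0.1, K=0.02, alpha=1.5,
                                            c=0.01, p=1.1, m0=3.0)
    ```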

  13. Multivariate Analysis in Metabolomics

    PubMed Central

    Worley, Bradley; Powers, Robert

    2015-01-01

    Metabolomics aims to provide a global snapshot of all small-molecule metabolites in cells and biological fluids, free of observational biases inherent to more focused studies of metabolism. However, the staggeringly high information content of such global analyses introduces a challenge of its own; efficiently forming biologically relevant conclusions from any given metabolomics dataset indeed requires specialized forms of data analysis. One approach to finding meaning in metabolomics datasets involves multivariate analysis (MVA) methods such as principal component analysis (PCA) and partial least squares projection to latent structures (PLS), where spectral features contributing most to variation or separation are identified for further analysis. However, as with any mathematical treatment, these methods are not a panacea; this review discusses the use of multivariate analysis for metabolomics, as well as common pitfalls and misconceptions. PMID:26078916
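
    As a minimal example of the PCA step described above (on toy data, not a real metabolomics set), the snippet below autoscales a feature matrix, projects the samples onto the first two principal components, and ranks the spectral features by the magnitude of their loadings.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # hypothetical data matrix: one row per sample, one column per binned
    # spectral feature (e.g. NMR bucket or LC-MS peak intensity)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 200))
    X[:20, 5] += 3.0                          # feature 5 separates two sample groups

    Xs = StandardScaler().fit_transform(X)    # unit-variance scaling before PCA
    pca = PCA(n_components=2).fit(Xs)
    scores = pca.transform(Xs)                # sample scores for the scores plot
    loadings = pca.components_                # large |loading| = influential feature
    print(np.argsort(np.abs(loadings[0]))[::-1][:5])   # top contributing features
    ```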

  14. Multivariate Data EXplorer (MDX)

    2012-08-01

    The MDX toolkit facilitates exploratory data analysis and visualization of multivariate datasets. MDX provides an interactive graphical user interface to load, explore, and modify multivariate datasets stored in tabular forms. MDX uses an extended version of the parallel coordinates plot and scatterplots to represent the data. The user can perform rapid visual queries using mouse gestures in the visualization panels to select rows or columns of interest. The visualization panel provides coordinated multiple views whereby selections made in one plot are propagated to the other plots. Users can also export selected data or reconfigure the visualization panel to explore relationships between columns and rows in the data.

  15. How does leaving home affect marital timing? An event-history analysis of migration and marriage in Nang Rong, Thailand.

    PubMed

    Jampaklay, Aree

    2006-11-01

    This study examines the effects of migration on marital timing in Thailand between 1984 and 2000 using prospective and retrospective survey data from Nang Rong. In contrast to previous results in the literature, event-history analysis of the longitudinal data reveals a positive, not a negative, effect of lagged migration experience on the likelihood of marriage. The findings also indicate gender differences. Migration's positive impact is independent of other life events for women but is completely "explained" by employment for men.

  16. Time-to-Event Analysis of Individual Variables Associated with Nursing Students' Academic Failure: A Longitudinal Study

    ERIC Educational Resources Information Center

    Dante, Angelo; Fabris, Stefano; Palese, Alvisa

    2013-01-01

    Empirical studies and conceptual frameworks presented in the extant literature offer a static imagining of academic failure. Time-to-event analysis, which captures the dynamism of individual factors, such as when they come to determine failure, and thus allows timely strategies to be properly tailored, requires longitudinal studies, which are still lacking within the field. The…

  17. Monitoring the data quality of the real-time event reconstruction in the ALICE High Level Trigger

    NASA Astrophysics Data System (ADS)

    Austrheim Erdal, Hege; Richther, Matthias; Szostak, Artur; Toia, Alberica

    2012-12-01

    ALICE is a dedicated heavy ion experiment at the CERN LHC. The ALICE High Level Trigger was designed to select events with desirable physics properties. Data from several of the major subdetectors in ALICE are processed by the HLT for real-time event reconstruction, for instance the Inner Tracking System, the Time Projection Chamber, the electromagnetic calorimeters, the Transition Radiation Detector and the muon spectrometer. The HLT reconstructs events in real-time and thus provides input for trigger algorithms. It is necessary to monitor the quality of the reconstruction, with a focus on track and event properties. Also, the HLT implemented data compression for the TPC during the heavy ion data taking in 2011 to reduce the data rate from the ALICE detector. The key for the data compression is to store clusters (spacepoints) calculated by the HLT rather than storing raw data. It is thus very important to monitor the cluster finder performance as a way to monitor the data compression. The data monitoring is divided into two stages. The first stage is performed during data taking. A part of the HLT production chain is dedicated to performing online monitoring, and facilities are available in the HLT production cluster to have real-time access to the reconstructed events in the ALICE control room. This includes track and event properties, and in addition, this facility gives a way to display a small fraction of the reconstructed events in an online display. The second part of the monitoring is performed after the data have been transferred to permanent storage. After post-processing of the real-time reconstructed data, one can look in more detail at the cluster finder performance, the quality of the reconstruction of tracks, vertices and vertex position. The monitoring solution is presented in detail, with special attention to the heavy ion data taking of 2011.

  18. Effects of Stimulant Medication, Incentives, and Event Rate on Reaction Time Variability in Children With ADHD

    PubMed Central

    Epstein, Jeffery N; Brinkman, William B; Froehlich, Tanya; Langberg, Joshua M; Narad, Megan E; Antonini, Tanya N; Shiels, Keri; Simon, John O; Altaye, Mekibib

    2011-01-01

    This study examined the effects of methylphenidate (MPH) on reaction time (RT) variability in children with attention deficit hyperactivity disorder (ADHD). Using a broad battery of computerized tasks, and both conventional and ex-Gaussian indicators of RT variability, in addition to within-task manipulations of incentive and event rate (ER), this study comprehensively examined the breadth, specificity, and possible moderators of effects of MPH on RT variability. A total of 93 children with ADHD completed a 4-week within-subject, randomized, double-blind, placebo-controlled crossover trial of MPH to identify an optimal dosage. Children were then randomly assigned to receive either their optimal MPH dose or placebo after which they completed five neuropsychological tasks, each allowing trial-by-trial assessment of RTs. Stimulant effects on RT variability were observed on both measures of the total RT distribution (ie, coefficient of variation) as well as on an ex-Gaussian measure examining the exponential portion of the RT distribution (ie, τ). There was minimal, if any, effect of MPH on performance accuracy or RT speed. Within-task incentive and ER manipulations did not appreciably affect stimulant effects across the tasks. The pattern of significant and pervasive effects of MPH on RT variability, and few effects of MPH on accuracy and RT speed suggest that MPH primarily affects RT variability. Given the magnitude and breadth of effects of MPH on RT variability as well as the apparent specificity of these effects of MPH on RT variability indicators, future research should focus on neurophysiological correlates of effects of MPH on RT variability in an effort to better define MPH pharmacodynamics. PMID:21248722

  19. LATE-TIME RADIO EMISSION FROM X-RAY-SELECTED TIDAL DISRUPTION EVENTS

    SciTech Connect

    Bower, Geoffrey C.; Cenko, S. Bradley; Silverman, Jeffrey M.; Bloom, Joshua S.; Metzger, Brian D.

    2013-02-15

    We present new observations with the Karl G. Jansky Very Large Array of seven X-ray-selected tidal disruption events (TDEs). The radio observations were carried out between 9 and 22 years after the initial X-ray discovery, and thus probe the late-time formation of relativistic jets and jet interactions with the interstellar medium in these systems. We detect a compact radio source in the nucleus of the galaxy IC 3599 and a compact radio source that is a possible counterpart to RX J1420.4+5334. We find no radio counterparts for five other sources with flux density upper limits between 51 and 200 µJy (3σ). If the detections truly represent late radio emission associated with a TDE, then our results suggest that a fraction, ≳10%, of X-ray-detected TDEs are accompanied by relativistic jets. We explore several models for producing late radio emission, including interaction of the jet with gas in the circumnuclear environment (blast wave model), and emission from the core of the jet itself. Upper limits on the radio flux density from archival observations suggest that the jet formation may have been delayed for years after the TDE, possibly triggered by the accretion rate dropping below a critical threshold of ≈10^-2-10^-3 Ṁ_Edd. The non-detections are also consistent with this scenario; deeper radio observations can determine whether relativistic jets are present in these systems. The emission from RX J1420.4+5334 is also consistent with the predictions of the blast wave model; however, the radio emission from IC 3599 is substantially underluminous, and its spectral slope is too flat, relative to the blast wave model expectations. Future radio monitoring of IC 3599 and RX J1420.4+5334 will help to better constrain the nature of the jets in these systems.

  20. Demanding response time requirements on coherent receivers due to fast polarization rotations caused by lightning events.

    PubMed

    Krummrich, Peter M; Ronnenberg, David; Schairer, Wolfgang; Wienold, Daniel; Jenau, Frank; Herrmann, Maximilian

    2016-05-30

    Lightning events can cause fast polarization rotations and phase changes in optical transmission fibers due to strong electrical currents and magnetic fields. Whereas these are unlikely to affect legacy transmission systems with direct detection, different mechanisms have to be considered in systems with local oscillator based coherent receivers and digital signal processing. A theoretical analysis reveals that lightning events can result in polarization rotations with speeds as fast as a few hundred kRad/s. We discuss possible mechanisms how such lightning events can affect coherent receivers with digital signal processing. In experimental investigations with a high current pulse generator and transponder prototypes, we observed post FEC errors after polarization rotation events which can be expected from lightning strikes. PMID:27410158

  1. Demanding response time requirements on coherent receivers due to fast polarization rotations caused by lightning events.

    PubMed

    Krummrich, Peter M; Ronnenberg, David; Schairer, Wolfgang; Wienold, Daniel; Jenau, Frank; Herrmann, Maximilian

    2016-05-30

    Lightning events can cause fast polarization rotations and phase changes in optical transmission fibers due to strong electrical currents and magnetic fields. Whereas these are unlikely to affect legacy transmission systems with direct detection, different mechanisms have to be considered in systems with local oscillator based coherent receivers and digital signal processing. A theoretical analysis reveals that lightning events can result in polarization rotations with speeds as fast as a few hundred kRad/s. We discuss possible mechanisms how such lightning events can affect coherent receivers with digital signal processing. In experimental investigations with a high current pulse generator and transponder prototypes, we observed post FEC errors after polarization rotation events which can be expected from lightning strikes.

  2. Real-Time Imaging of Discrete Exocytic Events Mediating Surface Delivery of AMPA Receptors

    PubMed Central

    Yudowski, Guillermo A.; Puthenveedu, Manojkumar A.; Leonoudakis, Dmitri; Panicker, Sandip; Thorn, Kurt S.; Beattie, Eric C.; von Zastrow, Mark

    2011-01-01

    We directly resolved discrete exocytic fusion events mediating insertion of AMPA-type glutamate receptors (AMPARs) to the somatodendritic surface of rat hippocampal pyramidal neurons, in slice and dissociated cultures, using protein tagging with a pH-sensitive GFP (green fluorescent protein) variant and rapid (10 frames/s) fluorescence microscopy. AMPAR-containing exocytic events occurred under basal culture conditions in both the cell body and dendrites; potentiating chemical stimuli produced an NMDA receptor-dependent increase in the frequency of individual exocytic events. The number of AMPARs inserted per exocytic event, estimated using single-molecule analysis, was quite uniform but individual events differed significantly in kinetic properties affecting the subsequent surface distribution of receptors. “Transient” events, from which AMPARs dispersed laterally immediately after surface insertion, generated a pronounced but short-lived (dissipating within ~1 s) increase in surface AMPAR fluorescence extending locally (2–5µm) from the site of exocytosis. “Persistent” events, from which inserted AMPARs dispersed slowly (typically over 5–10 s), affected local surface receptor concentration to a much smaller degree. Both modes of exocytic insertion occurred throughout the dendritic shaft, but remarkably, neither mode of insertion was observed directly into synaptic spines. AMPARs entered spines preferentially from transient events occurring in the adjoining dendritic shaft, driven apparently by mass action and short-range lateral diffusion, and locally delivered AMPARs remained mostly in the mobile fraction. These results suggest a highly dynamic mechanism for both constitutive and activity-dependent surface delivery of AMPARs, mediated by kinetically distinct exocytic modes that differ in propensity to drive lateral entry of receptors to nearby synapses. PMID:17928453

  3. Here and now: how time segments may become events in the hippocampus.

    PubMed

    Lorincz, András; Szirtes, Gábor

    2009-01-01

    The hippocampal formation is believed to play a central role in memory functions related to the representation of events. Events are usually considered as temporally bounded processes, in contrast to the continuous nature of sensory signal flow they originate from. Events are then organized and stored according to behavioral relevance and are used to facilitate prediction of similar events. In this paper we are interested in the kind of representation of sensory signals that allows for detecting and/or predicting events. Based on new results on the identification problem of linear hidden processes, we propose a connectionist network with biologically sound parameter tuning that can represent causal relationships and define events. Interestingly, the wiring diagram of our architecture not only resembles the gross anatomy of the hippocampal formation (including the entorhinal cortex), but it also features similar spatial distribution functions of activity (localized and periodic, 'grid-like' patterns) as found in the different parts of the hippocampal formation. We shortly discuss how our model corresponds to different theories on the role of the hippocampal formation in forming episodic memories or supporting spatial navigation. We speculate that our approach may constitute a step toward a unified theory about the functional role of the hippocampus and the structure of memory representations. PMID:19616410

  4. Time distribution of heavy rainfall events in south west of Iran

    NASA Astrophysics Data System (ADS)

    Ghassabi, Zahra; kamali, G. Ali; Meshkatee, Amir-Hussain; Hajam, Sohrab; Javaheri, Nasrolah

    2016-07-01

    Accurate knowledge of the rainfall time distribution is a fundamental issue in many meteorological-hydrological studies, such as using surface runoff information in the design of hydraulic structures, flood control and risk management, and river engineering studies. Since the largest dams of Iran are in the south-west of the country (i.e. the South Zagros), this research investigates the temporal rainfall distribution based on an analytical-numerical method to increase the accuracy of hydrological studies in Iran. The United States Soil Conservation Service (SCS) estimated the temporal rainfall distribution in various forms, and hydrological studies usually apply the same distribution functions in other areas of the world, including Iran, due to the lack of sufficient observational data. In this research, however, we first used the Weather Research and Forecasting (WRF) model to obtain simulated rainfall for the selected storms over south-west Iran. Then, a three-parameter logistic function was fitted to the rainfall data in order to compute the temporal rainfall distribution. The domain of the WRF model is 30.5N-34N and 47.5E-52.5E with a resolution of 0.08 degree in latitude and longitude. We selected 35 heavy storms based on the observed rainfall data set to simulate with the WRF model. Storm events were scrutinized independently of each other, and the best analytical three-parameter logistic function was fitted at each grid point. The results show that the value of the coefficient a of the logistic function, which indicates rainfall intensity, varies from a minimum of 0.14 to a maximum of 0.7. Furthermore, the values of the coefficient b of the logistic function, which indicates the rain delay of a grid point from the start time of rainfall, vary from 1.6 in the south-west and east to more than 8 in the north and central parts of the studied area. In addition, rainfall intensities are lower in the south-west of Iran than those observed or proposed by the
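
    The exact parameterization used by the authors is not given in the abstract; the sketch below fits one common three-parameter logistic form to a hypothetical cumulative rainfall curve for a single grid point, with a controlling the steepness (intensity), b the delay of the main burst, and c the asymptotic total.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic3(t, a, b, c):
        """Assumed three-parameter logistic mass curve: cumulative rainfall
        fraction at time t (hours from storm start)."""
        return c / (1.0 + np.exp(-a * (t - b)))

    # hypothetical cumulative rainfall fractions for one WRF grid point
    t_obs = np.arange(0.0, 24.0, 1.0)
    r_obs = logistic3(t_obs, 0.4, 7.0, 1.0) + np.random.default_rng(1).normal(0, 0.02, t_obs.size)

    (a, b, c), _ = curve_fit(logistic3, t_obs, r_obs, p0=[0.3, 6.0, 1.0])
    print(f"a (intensity) = {a:.2f}, b (delay) = {b:.1f} h")
    ```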

  5. Scaling Time Warp-based Discrete Event Execution to 10^4 Processors on Blue Gene Supercomputer

    SciTech Connect

    Perumalla, Kalyan S

    2007-01-01

    Lately, important large-scale simulation applications, such as emergency/event planning and response, are emerging that are based on discrete event models. The applications are characterized by their scale (several millions of simulated entities), their fine-grained nature of computation (microseconds per event), and their highly dynamic inter-entity event interactions. The desired scale and speed together call for highly scalable parallel discrete event simulation (PDES) engines. However, few such parallel engines have been designed or tested on platforms with thousands of processors. Here an overview is given of a unique PDES engine that has been designed to support Time Warp-style optimistic parallel execution as well as a more generalized mixed, optimistic-conservative synchronization. The engine is designed to run on massively parallel architectures with minimal overheads. A performance study of the engine is presented, including the first results to date of PDES benchmarks demonstrating scalability to as many as 16,384 processors, on an IBM Blue Gene supercomputer. The results show, for the first time, the promise of effectively sustaining very large scale discrete event execution on up to 10^4 processors.

  6. Real time imaging of live cell ATP leaking or release events by chemiluminescence microscopy

    SciTech Connect

    Zhang, Yun

    2008-12-18

    The purpose of this research was to expand the applications of chemiluminescence microscopy in live bacterial/mammalian cell imaging and to improve the detection sensitivity for ATP leaking or release events. We first demonstrated that chemiluminescence (CL) imaging can be used to interrogate single bacterial cells. While a luminometer allows ATP to be detected in cell lysate extracted from at least 10 bacterial cells, previous cell CL detection had never reached this sensitivity of the single-bacterium level. We approached this goal with a different strategy from before: instead of breaking the bacterial cell membrane and trying to capture the transiently diluted ATP with the firefly luciferase CL assay, we introduced the firefly luciferase enzyme into bacteria using modern genetic techniques and placed the CL reaction substrate D-luciferin outside the cells. By damaging the cell membrane with various antibacterial drugs, including antibiotics such as penicillins, and bacteriophages, the D-luciferin molecules diffused inside the cell and initiated the reaction that produces CL light. As firefly luciferases are large protein molecules that are retained within the cells before total rupture, and the intracellular ATP concentration is high, at the millimolar level, the CL reaction of firefly luciferase, ATP and D-luciferin can be sustained for a relatively long time within the cells, which act as a reaction container, generating enough photons for detection by the extremely sensitive intensified charge-coupled device (ICCD) camera. The results were encouraging, as various single-bacterium lysis and leakage events were monitored in movies with 10-s temporal resolution. We also found a new way of enhancing the diffusion of D-luciferin into cells by dehydrating the bacteria. We then applied this novel single-bacterium CL imaging technique to quantify gene expression levels in individual bacterial cells. Previously published results in single-cell gene expression quantification

  7. ADESSA: A Real-Time Decision Support Service for Delivery of Semantically Coded Adverse Drug Event Data.

    PubMed

    Duke, Jon D; Friedlin, Jeff

    2010-11-13

    Evaluating medications for potential adverse events is a time-consuming process, typically involving manual lookup of information by physicians. This process can be expedited by CDS systems that support dynamic retrieval and filtering of adverse drug events (ADEs), but such systems require a source of semantically coded ADE data. We created a two-component system that addresses this need. First we created a natural language processing application which extracts adverse events from Structured Product Labels and generates a standardized ADE knowledge base. We then built a decision support service that consumes a Continuity of Care Document and returns a list of patient-specific ADEs. Our database currently contains 534,125 ADEs from 5602 product labels. An NLP evaluation of 9529 ADEs showed recall of 93% and precision of 95%. On a trial set of 30 CCDs, the system provided adverse event data for 88% of drugs and returned these results in an average of 620 ms.
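
    A toy sketch of the retrieval idea only, not the ADESSA service or its API: match a patient's medication list, such as might be extracted from a Continuity of Care Document, against a pre-built ADE knowledge base keyed by drug name; all drug names and ADE terms below are hypothetical.

```python
# Sketch: patient-specific ADE lookup from a small, hypothetical knowledge base.
ade_knowledge_base = {
    "lisinopril": ["cough", "angioedema", "hyperkalemia"],
    "metformin": ["lactic acidosis", "diarrhea"],
    "warfarin": ["hemorrhage", "skin necrosis"],
}

def lookup_ades(medications):
    """Return ADE lists for each recognized drug in the patient's med list."""
    results = {}
    for drug in medications:
        key = drug.strip().lower()
        if key in ade_knowledge_base:
            results[key] = ade_knowledge_base[key]
    return results

patient_meds = ["Lisinopril", "Warfarin", "unknown-drug"]
print(lookup_ades(patient_meds))   # two of the three drugs are matched
```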

  8. A time-varying subjective quality model for mobile streaming videos with stalling events

    NASA Astrophysics Data System (ADS)

    Ghadiyaram, Deepti; Pan, Janice; Bovik, Alan C.

    2015-09-01

    Over-the-top mobile video streaming is invariably influenced by volatile network conditions which cause playback interruptions (stalling events), thereby impairing users' quality of experience (QoE). Developing models that can accurately predict users' QoE could enable the more efficient design of quality-control protocols for video streaming networks that reduce network operational costs while still delivering high-quality video content to the customers. Existing objective models that predict QoE are based on global video features, such as the number of stall events and their lengths, and are trained and validated on a small pool of ad hoc video datasets, most of which are not publicly available. The model we propose in this work goes beyond previous models as it also accounts for the fundamental effect that a viewer's recent level of satisfaction or dissatisfaction has on their overall viewing experience. In other words, the proposed model accounts for and adapts to the recency, or hysteresis effect caused by a stall event in addition to accounting for the lengths, frequency of occurrence, and the positions of stall events - factors that interact in a complex way to affect a user's QoE. On the recently introduced LIVE-Avvasi Mobile Video Database, which consists of 180 distorted videos of varied content that are afflicted solely with over 25 unique realistic stalling events, we trained and validated our model to accurately predict the QoE, attaining standout QoE prediction performance.
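
    An illustrative sketch, not the authors' model: a time-varying QoE trace in which each stall both lowers instantaneous quality and leaves a slowly decaying hysteresis penalty reflecting the viewer's recent dissatisfaction; the drop size and decay constant are made-up parameters.

```python
# Sketch: a toy QoE trace with a decaying memory of recent stalling events.
import numpy as np

def qoe_with_hysteresis(stall_flags, base_qoe=100.0, drop=30.0, decay=0.95):
    """stall_flags: 1 if a stall occurs in that second, else 0."""
    memory = 0.0          # decaying memory of recent stalls (hysteresis)
    scores = []
    for stalled in stall_flags:
        memory = decay * memory + drop * stalled
        scores.append(max(0.0, base_qoe - memory))
    return np.array(scores)

stalls = np.zeros(60, dtype=int)
stalls[[10, 11, 12, 40]] = 1          # a long stall early, a short one later
print(qoe_with_hysteresis(stalls)[:15].round(1))
```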

  9. Time-dependent three-dimensional (latitude, longitude, altitude) response of the ionosphere to the 2009 SSW event

    NASA Astrophysics Data System (ADS)

    Azeem, S. I.; Crowley, G.; Reynolds, A.

    2013-12-01

    Recent studies have shown variations in the low and mid latitude ionosphere that are linked to Sudden Stratospheric Warming events. These studies suggest that during SSW events the equatorial electric fields vary in a quasi-deterministic way, producing vertical plasma drifts that deviate from climatological values more than expected. Although previous studies have provided important information on the ionospheric response to SSW events, they have been fairly localized. Therefore, broader observational capabilities and data are required that can unambiguously reveal the instantaneous global response of the ionosphere to SSW events. In this paper, we present four-dimensional (latitude, longitude, height and time) results of the Ionospheric Data Assimilation Four-Dimensional (IDA4D) algorithm to describe a global view of the ionospheric response to the 2009 SSW event. We use the IDA4D to assimilate ionosondes, ground-based GPS TEC, DORIS, CHAMP and GRACE occultation measurements for several days in January 2009 during the SSW event. IDA4D results show that at the peak of the 2009 SSW event, TEC values in the low latitudes were elevated in the morning hours while they were suppressed in the evening sector. The effects of enhanced dynamo forcing during the January 2009 SSW were also captured by the IDA4D showing an increased separation of the Appleton Anomaly peaks. The IDA4D results will be discussed in the context of horizontal, vertical and temporal evolution of ionospheric disturbances associated with the 2009 SSW event. The evolution of longitudinal, local time, and height (where applicable) variations of various plasma parameters (such as Ne, TEC, NmF2, hmF2, foF2) through the full 2009 SSW cycle (including genesis, onset, and recovery) will be presented.

  10. Introduction to multivariate discrimination

    NASA Astrophysics Data System (ADS)

    Kégl, Balázs

    2013-07-01

    Multivariate discrimination or classification is one of the best-studied problems in machine learning, with a plethora of well-tested and well-performing algorithms. There are also several good general textbooks [1-9] on the subject written for an average engineering, computer science, or statistics graduate student; most of them are also accessible to an average physics student with some background in computer science and statistics. Hence, instead of writing a generic introduction, we concentrate here on relating the subject to the practising experimental physicist. After a short introduction on the basic setup (Section 1) we delve into the practical issues of complexity regularization, model selection, and hyperparameter optimization (Section 2), since it is this step that makes high-complexity non-parametric fitting so different from low-dimensional parametric fitting. To emphasize that this issue is not restricted to classification, we illustrate the concept on a low-dimensional but non-parametric regression example (Section 2.1). Section 3 describes the common algorithmic-statistical formal framework that unifies the main families of multivariate classification algorithms. We explain here the large-margin principle that partly explains why these algorithms work. Section 4 is devoted to the description of the three main (families of) classification algorithms: neural networks, the support vector machine, and AdaBoost. We do not go into the algorithmic details; the goal is to give an overview of the form of the functions these methods learn and of the objective functions they optimize. Besides their technical description, we also make an attempt to put these algorithms into a socio-historical context. We then briefly describe some rather heterogeneous applications to illustrate the pattern recognition pipeline and to show how widespread the use of these methods is (Section 5). We conclude the chapter with three essentially open research problems that are either
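
    A compact illustration of the complexity-regularization step discussed in Section 2, cross-validated hyperparameter selection for a margin-based classifier; the scikit-learn SVM, grid values and synthetic data are illustrative choices, not prescribed by the chapter.

```python
# Sketch: choose SVM hyperparameters by cross-validation, not training error.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# C controls margin softness, gamma the RBF kernel width; both set model
# complexity and are selected on held-out folds.
grid = GridSearchCV(SVC(kernel="rbf"),
                    {"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]},
                    cv=5)
grid.fit(X_tr, y_tr)
print("selected:", grid.best_params_, "test accuracy:", grid.score(X_te, y_te))
```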

  11. MOBBED: a computational data infrastructure for handling large collections of event-rich time series datasets in MATLAB.

    PubMed

    Cockfield, Jeremy; Su, Kyungmin; Robbins, Kay A

    2013-01-01

    Experiments to monitor human brain activity during active behavior record a variety of modalities (e.g., EEG, eye tracking, motion capture, respiration monitoring) and capture a complex environmental context leading to large, event-rich time series datasets. The considerable variability of responses within and among subjects in more realistic behavioral scenarios requires experiments to assess many more subjects over longer periods of time. This explosion of data requires better computational infrastructure to more systematically explore and process these collections. MOBBED is a lightweight, easy-to-use, extensible toolkit that allows users to incorporate a computational database into their normal MATLAB workflow. Although capable of storing quite general types of annotated data, MOBBED is particularly oriented to multichannel time series such as EEG that have event streams overlaid with sensor data. MOBBED directly supports access to individual events, data frames, and time-stamped feature vectors, allowing users to ask questions such as what types of events or features co-occur under various experimental conditions. A database provides several advantages not available to users who process one dataset at a time from the local file system. In addition to archiving primary data in a central place to save space and avoid inconsistencies, such a database allows users to manage, search, and retrieve events across multiple datasets without reading the entire dataset. The database also provides infrastructure for handling more complex event patterns that include environmental and contextual conditions. The database can also be used as a cache for expensive intermediate results that are reused in such activities as cross-validation of machine learning algorithms. MOBBED is implemented over PostgreSQL, a widely used open source database, and is freely available under the GNU general public license at http://visual.cs.utsa.edu/mobbed. Source and issue reports for MOBBED

  12. MOBBED: a computational data infrastructure for handling large collections of event-rich time series datasets in MATLAB

    PubMed Central

    Cockfield, Jeremy; Su, Kyungmin; Robbins, Kay A.

    2013-01-01

    Experiments to monitor human brain activity during active behavior record a variety of modalities (e.g., EEG, eye tracking, motion capture, respiration monitoring) and capture a complex environmental context leading to large, event-rich time series datasets. The considerable variability of responses within and among subjects in more realistic behavioral scenarios requires experiments to assess many more subjects over longer periods of time. This explosion of data requires better computational infrastructure to more systematically explore and process these collections. MOBBED is a lightweight, easy-to-use, extensible toolkit that allows users to incorporate a computational database into their normal MATLAB workflow. Although capable of storing quite general types of annotated data, MOBBED is particularly oriented to multichannel time series such as EEG that have event streams overlaid with sensor data. MOBBED directly supports access to individual events, data frames, and time-stamped feature vectors, allowing users to ask questions such as what types of events or features co-occur under various experimental conditions. A database provides several advantages not available to users who process one dataset at a time from the local file system. In addition to archiving primary data in a central place to save space and avoid inconsistencies, such a database allows users to manage, search, and retrieve events across multiple datasets without reading the entire dataset. The database also provides infrastructure for handling more complex event patterns that include environmental and contextual conditions. The database can also be used as a cache for expensive intermediate results that are reused in such activities as cross-validation of machine learning algorithms. MOBBED is implemented over PostgreSQL, a widely used open source database, and is freely available under the GNU general public license at http://visual.cs.utsa.edu/mobbed. Source and issue reports for MOBBED

  13. An inverse relation between event-related and time-frequency violation responses in sentence processing.

    PubMed

    Davidson, D J; Indefrey, P

    2007-07-16

    The relationship between semantic and grammatical processing in sentence comprehension was investigated by examining event-related potential (ERP) and event-related power changes in response to semantic and grammatical violations. Sentences with semantic, phrase structure, or number violations and matched controls were presented serially (1.25 words/s) to 20 participants while EEG was recorded. Semantic violations were associated with an N400 effect and a theta band increase in power, while grammatical violations were associated with a P600 effect and an alpha/beta band decrease in power. A quartile analysis showed that for both types of violations, larger average violation effects were associated with lower relative amplitudes of oscillatory activity, implying an inverse relation between ERP amplitude and event-related power magnitude change in sentence processing.

  14. Nutrient losses from manure and fertilizer applications as impacted by time to first runoff event.

    PubMed

    Smith, D R; Owens, P R; Leytem, A B; Warnemuende, E A

    2007-05-01

    Nutrient losses to surface waters following fertilization contribute to eutrophication. This study was conducted to compare the impacts of fertilization with inorganic fertilizer, swine (Sus scrofa domesticus) manure or poultry (Gallus domesticus) litter on runoff water quality, and how the duration between application and the first runoff event affects resulting water quality. Fertilizers were applied at 35 kg P ha⁻¹, and the duration between application and the first runoff event varied between 1 and 29 days. Swine manure was the greatest risk to water quality 1 day after fertilization due to elevated phosphorus (8.4 mg P L⁻¹) and ammonium (10.3 mg NH4-N L⁻¹) concentrations; however, this risk decreased rapidly. Phosphorus concentrations were 2.6 mg L⁻¹ 29 days after fertilization with inorganic fertilizer. This research demonstrates that manures might be more environmentally sustainable than inorganic fertilizers, provided runoff events do not occur soon after application.

  15. Nutrient losses from manure and fertilizer applications as impacted by time to first runoff event.

    PubMed

    Smith, D R; Owens, P R; Leytem, A B; Warnemuende, E A

    2007-05-01

    Nutrient losses to surface waters following fertilization contribute to eutrophication. This study was conducted to compare the impacts of fertilization with inorganic fertilizer, swine (Sus scrofa domesticus) manure or poultry (Gallus domesticus) litter on runoff water quality, and how the duration between application and the first runoff event affects resulting water quality. Fertilizers were applied at 35 kg P ha⁻¹, and the duration between application and the first runoff event varied between 1 and 29 days. Swine manure was the greatest risk to water quality 1 day after fertilization due to elevated phosphorus (8.4 mg P L⁻¹) and ammonium (10.3 mg NH4-N L⁻¹) concentrations; however, this risk decreased rapidly. Phosphorus concentrations were 2.6 mg L⁻¹ 29 days after fertilization with inorganic fertilizer. This research demonstrates that manures might be more environmentally sustainable than inorganic fertilizers, provided runoff events do not occur soon after application. PMID:17029684

  16. Climate Central World Weather Attribution (WWA) project: Real-time extreme weather event attribution analysis

    NASA Astrophysics Data System (ADS)

    Haustein, Karsten; Otto, Friederike; Uhe, Peter; Allen, Myles; Cullen, Heidi

    2015-04-01

    Extreme weather detection and attribution analysis has emerged as a core theme in climate science over the last decade or so. By using a combination of observational data and climate models it is possible to identify the role of climate change in certain types of extreme weather events, such as sea level rise and its contribution to storm surges, extreme heat events and droughts, or heavy rainfall and flood events. These analyses are usually carried out after an extreme event has occurred, when reanalysis and observational data become available. The Climate Central WWA project will exploit the increasing forecast skill of seasonal prediction systems such as the UK Met Office GloSea5 (Global seasonal forecasting system) ensemble forecasting method. This way, the current weather can be fed into climate models to simulate large ensembles of possible weather scenarios before an event has fully emerged. This effort runs along parallel and intersecting tracks of science and communications that involve research, message development and testing, staged socialization of attribution science with key audiences, and dissemination. The method we employ uses a very large ensemble of simulations of regional climate models to run two different analyses: one to represent the current climate as it was observed, and one to represent the same events in the world that might have been without human-induced climate change. For the weather "as observed" experiment, the atmospheric model uses observed sea surface temperature (SST) data from GloSea5 (currently) and present-day atmospheric gas concentrations to simulate weather events that are possible given the observed climate conditions. The weather in the "world that might have been" experiments is obtained by removing the anthropogenic forcing from the observed SSTs, thereby simulating a counterfactual world without human activity. The anthropogenic forcing is obtained by comparing the CMIP5 historical and natural simulations
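
    A toy illustration of the attribution comparison, assuming the analysis reduces to counting threshold exceedances in a "factual" and a "counterfactual" ensemble; the Gumbel-distributed synthetic values and the threshold stand in for model output and the observed event.

```python
# Sketch: probability (risk) ratio between factual and counterfactual ensembles.
import numpy as np

rng = np.random.default_rng(0)
factual = rng.gumbel(loc=32.0, scale=2.0, size=10_000)         # e.g. seasonal-max temperature
counterfactual = rng.gumbel(loc=31.0, scale=2.0, size=10_000)  # anthropogenic forcing removed

threshold = 38.0                     # stand-in for the observed extreme event
p1 = np.mean(factual > threshold)
p0 = np.mean(counterfactual > threshold)
print(f"P(event | actual) = {p1:.4f}, P(event | natural) = {p0:.4f}, "
      f"risk ratio = {p1 / max(p0, 1e-12):.2f}")
```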

  17. Joint modelling of repeated measurements and time-to-event outcomes: flexible model specification and exact likelihood inference

    PubMed Central

    Barrett, Jessica; Diggle, Peter; Henderson, Robin; Taylor-Robinson, David

    2015-01-01

    Random effects or shared parameter models are commonly advocated for the analysis of combined repeated measurement and event history data, including dropout from longitudinal trials. Their use in practical applications has generally been limited by computational cost and complexity, meaning that only simple special cases can be fitted by using readily available software. We propose a new approach that exploits recent distributional results for the extended skew normal family to allow exact likelihood inference for a flexible class of random-effects models. The method uses a discretization of the timescale for the time-to-event outcome, which is often unavoidable in any case when events correspond to dropout. We place no restriction on the times at which repeated measurements are made. An analysis of repeated lung function measurements in a cystic fibrosis cohort is used to illustrate the method. PMID:25866468
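
    The timescale discretization mentioned above can be illustrated with a standard person-period expansion fitted by logistic regression; this is only a generic discrete-time hazard sketch with made-up data, not the authors' exact skew-normal joint model.

```python
# Sketch: discretize the time-to-event outcome into intervals (person-period
# data) and fit a discrete-time hazard by logistic regression.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical subjects: number of intervals survived, event flag, covariate.
subjects = pd.DataFrame({"id": [1, 2, 3],
                         "intervals": [3, 5, 2],
                         "event": [1, 0, 1],
                         "x": [0.2, -1.0, 0.7]})

rows = []
for _, s in subjects.iterrows():
    for k in range(1, int(s.intervals) + 1):
        # y = 1 only in the interval where the event occurs
        rows.append({"id": s.id, "interval": k, "x": s.x,
                     "y": int(s.event == 1 and k == s.intervals)})
pp = pd.DataFrame(rows)                      # person-period data set

model = LogisticRegression().fit(pp[["interval", "x"]], pp["y"])
print(model.coef_, model.intercept_)
```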

  18. Genomic Variation by Whole-Genome SNP Mapping Arrays Predicts Time-to-Event Outcome in Patients with Chronic Lymphocytic Leukemia

    PubMed Central

    Schweighofer, Carmen D.; Coombes, Kevin R.; Majewski, Tadeusz; Barron, Lynn L.; Lerner, Susan; Sargent, Rachel L.; O'Brien, Susan; Ferrajoli, Alessandra; Wierda, William G.; Czerniak, Bogdan A.; Medeiros, L. Jeffrey; Keating, Michael J.; Abruzzo, Lynne V.

    2013-01-01

    Genomic abnormalities, such as deletions in 11q22 or 17p13, are associated with poorer prognosis in patients with chronic lymphocytic leukemia (CLL). We hypothesized that unknown regions of copy number variation (CNV) affect clinical outcome and can be detected by array-based single-nucleotide polymorphism (SNP) genotyping. We compared SNP genotypes from 168 untreated patients with CLL with genotypes from 73 white HapMap controls. We identified 322 regions of recurrent CNV, 82 of which occurred significantly more often in CLL than in HapMap (CLL-specific CNV), including regions typically aberrant in CLL: deletions in 6q21, 11q22, 13q14, and 17p13 and trisomy 12. In univariate analyses, 35 of total and 11 of CLL-specific CNVs were associated with unfavorable time-to-event outcomes, including gains or losses in chromosomes 2p, 4p, 4q, 6p, 6q, 7q, 11p, 11q, and 17p. In multivariate analyses, six CNVs (ie, CLL-specific variations in 11p15.1-15.4 or 6q27) predicted time-to-treatment or overall survival independently of established markers of prognosis. Moreover, genotypic complexity (ie, the number of independent CNVs per patient) significantly predicted prognosis, with a median time-to-treatment of 64 months versus 23 months in patients with zero to one versus two or more CNVs, respectively (P = 3.3 × 10−8). In summary, a comparison of SNP genotypes from patients with CLL with HapMap controls allowed us to identify known and unknown recurrent CNVs and to determine regions and rates of CNV that predict poorer prognosis in patients with CLL. PMID:23273604
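
    A small sketch of the kind of time-to-event comparison reported above (zero to one versus two or more CNVs), using the lifelines package on synthetic data; the group sizes, censoring rate and scale parameters are invented for illustration and do not reproduce the study's results.

```python
# Sketch: Kaplan-Meier curves and a log-rank test for two CNV-complexity groups.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
n = 100
high_complexity = rng.integers(0, 2, n)                            # 1 if >= 2 CNVs
time = rng.exponential(scale=np.where(high_complexity, 23, 64))    # months (toy scales)
observed = rng.random(n) < 0.8                                     # some censoring

df = pd.DataFrame({"time": time, "event": observed, "grp": high_complexity})
for g, label in [(0, "0-1 CNVs"), (1, ">=2 CNVs")]:
    km = KaplanMeierFitter().fit(df.time[df.grp == g], df.event[df.grp == g],
                                 label=label)
    print(label, "median time-to-treatment:", km.median_survival_time_)

res = logrank_test(df.time[df.grp == 0], df.time[df.grp == 1],
                   df.event[df.grp == 0], df.event[df.grp == 1])
print("log-rank p-value:", res.p_value)
```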

  19. Performance Characteristics of a Methodology to Quantify Adverse Events over Time in Hospitalized Patients

    PubMed Central

    Sharek, Paul J; Parry, Gareth; Goldmann, Donald; Bones, Kate; Hackbarth, Andrew; Resar, Roger; Griffin, Frances A; Rhoda, Dale; Murphy, Cathy; Landrigan, Christopher P

    2011-01-01

    Objective To assess the performance characteristics of the Institute for Healthcare Improvement Global Trigger Tool (GTT) to determine its reliability for tracking local and national adverse event rates. Data Sources Primary data from 2008 chart reviews. Study Design A retrospective study in a stratified random sample of 10 North Carolina hospitals. Hospital-based (internal) and contract research organization–hired (external) reviewers used the GTT to identify adverse events in the same 10 randomly selected medical records per hospital in each quarter from January 2002 through December 2007. Data Collection/Extraction Interrater and intrarater reliability was assessed using κ statistics on 10 percent and 5 percent, respectively, of selected medical records. Additionally, experienced GTT users reviewed 10 percent of records to calculate internal and external teams' sensitivity and specificity. Principal Findings Eighty-eight to 98 percent of the targeted 2,400 medical records were reviewed. The reliability of the GTT to detect the presence, number, and severity of adverse events varied from κ=0.40 to 0.60. When compared with a team of experienced reviewers, the internal teams' sensitivity (49 percent) and specificity (94 percent) exceeded the external teams' (34 and 93 percent), as did their performance on all other metrics. Conclusions The high specificity, moderate sensitivity, and favorable interrater and intrarater reliability of the GTT make it appropriate for tracking local and national adverse event rates. The strong performance of hospital-based reviewers supports their use in future studies. PMID:20722749
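
    A minimal sketch of the two reliability metrics used in the study, Cohen's kappa between reviewer teams and sensitivity/specificity against a reference standard; the 0/1 chart-review calls below are made up (1 = adverse event present).

```python
# Sketch: inter-rater kappa and sensitivity/specificity against a reference.
from sklearn.metrics import cohen_kappa_score, confusion_matrix

internal = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]   # hospital-based reviewers
external = [1, 0, 0, 1, 0, 0, 1, 1, 1, 0]   # contracted reviewers
reference = [1, 0, 1, 1, 0, 0, 1, 0, 0, 0]  # experienced GTT users

print("kappa(internal, external):", cohen_kappa_score(internal, external))

tn, fp, fn, tp = confusion_matrix(reference, internal).ravel()
print("internal sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```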

  20. Time scales of biogeochemical and organismal responses to individual precipitation events

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In temperate grasslands, spatial and intra-annual variability in the activity of plants and microbes are structured by patterns in the precipitation regime. While the effects of total annual precipitation have been well-explored, the ecological dynamics associated with individual precipitation event...

  1. Children, Literacy and Mass Trauma Teaching in Times of Catastrophic Events and on Going Emergency Situations

    ERIC Educational Resources Information Center

    Taylor, Denny

    2006-01-01

    This article focuses on children living in areas of armed conflict, catastrophic events and on-going emergencies. Drawing on her ethnographic research, the author seeks to share insights into the complexities of the world in which children sometimes have to fight to live. Her intent is to create a space in which a discussion can take…

  2. The Dynamics of Attending: How People Track Time-Varying Events.

    ERIC Educational Resources Information Center

    Large, Edward W.; Jones, Mari Riess

    1999-01-01

    Proposes a theory of attentional dynamics and aims at explaining how listeners respond to systematic change in everyday events while retaining a general sense of their rhythmic structure. A mathematical formulation of the theory describes internal oscillations, called attending rhythms, that focus on pulses of attending energy and interact in…

  3. Constraining timing and origin of extreme wave events, Shirazuka Lowlands, Japan

    NASA Astrophysics Data System (ADS)

    Riedesel, Svenja; Brill, Dominik; Brückner, Helmut; De Batist, Marc; Fujiwara, Osamu; Garrett, Ed; Heyvaert, Vanessa M. A.; Miyairi, Yosuke; Opitz, Stephan; Seeliger, Martin; Shishikura, Masanubu; Yokoyama, Yusuke; Zander, Anja

    2016-04-01

    Tsunamis and storm surges are major threats to coastal settlements. The Pacific coast of southwest Japan is impacted by typhoons and by tsunamis caused by earthquakes along the Nankai trough. This part of the Philippine Sea-Eurasia plate subduction zone is expected to cause another earthquake and tsunami in the near future. To improve the predictability of potential events, it is important to establish chronologies of former tsunamis as a basis for long-term recurrence intervals. Characterization of potential event deposits following a multi-proxy approach provides information about sediment source, transport dynamics and depositional processes. Sandwiched between a mid-Pleistocene terrace and a beach ridge, the coastal lowlands at Shirasuka are ideally situated to record evidence of typhoons and tsunamis. Sediment cores from the lowlands include seven potential extreme wave event deposits. Their age, roughly constrained from a radiocarbon chronology, is historical. However, since the radiocarbon plateau degrades the precision of radiocarbon dating, optically stimulated luminescence (OSL) dating was tested at this site. Quartz, the favoured mineral for dating young and potentially poorly bleached sediments, failed due to low signal intensity, absence of a fast component, and sensitivity to IR stimulation. Instead, feldspar dating is applied, using a standard IR50 and the post-IR-IR130 protocol to account for both signal stability (anomalous fading) and bleachability, given the relatively young age of the sediments (<1000 years). The promising feldspar luminescence properties revealed by both protocols may offer the potential to establish robust OSL ages for all seven recorded event deposits, which may ultimately help to refine the existing radiocarbon chronology. Besides the establishment of a high-resolution OSL chronology, sedimentological, geochemical and microfaunal analyses allow a more detailed characterization of the event deposits. By applying the end

  4. A robust real-time gait event detection using wireless gyroscope and its application on normal and altered gaits.

    PubMed

    Gouwanda, Darwin; Gopalai, Alpha Agape

    2015-02-01

    Gait event detection allows clinicians and biomechanics researchers to determine the timing of gait events, to estimate the duration of the stance and swing phases and to segment gait data. It also aids biomedical engineers in improving the design of orthoses and FES (functional electrical stimulation) systems. In recent years, researchers have resorted to using gyroscopes to determine heel-strike (HS) and toe-off (TO) events in gait cycles. However, these methods are subject to significant delays when implemented in real-time gait monitoring devices, orthoses, and FES systems. Therefore, the work presented in this paper proposes a method that addresses these delays, to ensure real-time gait event detection. The proposed algorithm combines heuristics and a zero-crossing method to identify HS and TO. Experiments involving (1) normal walking, (2) walking with a knee brace, and (3) walking with an ankle brace, for both overground and treadmill walking, were designed to verify and validate the identified HS and TO. The performance of the proposed method was compared against established gait detection algorithms. It was observed that the proposed method produced a detection rate comparable to earlier reported methods and recorded reduced time delays, averaging 100 ms.
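
    A simplified sketch of the core zero-crossing idea (not the authors' full heuristic set): flag negative-going zero crossings of the shank angular velocity that follow a sufficiently large swing peak as candidate heel-strike events; the signal, threshold and window length are illustrative.

```python
# Sketch: candidate heel-strike detection from gyroscope angular velocity.
import numpy as np

def detect_gait_events(gyro_z, fs):
    """gyro_z: sagittal-plane angular velocity (rad/s); fs: sample rate (Hz)."""
    events = []
    for i in range(1, len(gyro_z)):
        # negative-going zero crossing after a sufficiently large swing peak
        if gyro_z[i - 1] > 0 >= gyro_z[i]:
            window = gyro_z[max(0, i - int(0.4 * fs)):i]
            if window.size and window.max() > 1.0:     # heuristic peak threshold
                events.append(i)                       # candidate HS sample index
    return np.array(events)

fs = 100
t = np.arange(0, 5, 1 / fs)
signal = 2.0 * np.sin(2 * np.pi * 1.0 * t)             # toy 1 Hz "gait" signal
print("candidate HS samples:", detect_gait_events(signal, fs))
```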

  5. Tracking the Time Course of Word-Frequency Effects in Auditory Word Recognition with Event-Related Potentials

    ERIC Educational Resources Information Center

    Dufour, Sophie; Brunelliere, Angele; Frauenfelder, Ulrich H.

    2013-01-01

    Although the word-frequency effect is one of the most established findings in spoken-word recognition, the precise processing locus of this effect is still a topic of debate. In this study, we used event-related potentials (ERPs) to track the time course of the word-frequency effect. In addition, the neighborhood density effect, which is known to…

  6. Language Context Effects on Interlingual Homograph Recognition: Evidence from Event-Related Potentials and Response Times in Semantic Priming.

    ERIC Educational Resources Information Center

    de Bruijn, Ellen R. A.; Dijkstra, Ton; Chwilla, Dorothee J.; Schriefers, Herbert J.

    2001-01-01

    Dutch-English bilinguals performed a generalized lexical decision task on triplets of items, responding with "yes" if all items were correct Dutch and/or English words, and with "no" if one or more of the items was not a word in either language. Semantic priming effects were found in on-line response times. Event-related potentials that were…

  7. A robust real-time gait event detection using wireless gyroscope and its application on normal and altered gaits.

    PubMed

    Gouwanda, Darwin; Gopalai, Alpha Agape

    2015-02-01

    Gait event detection allows clinicians and biomechanics researchers to determine the timing of gait events, to estimate the duration of the stance and swing phases and to segment gait data. It also aids biomedical engineers in improving the design of orthoses and FES (functional electrical stimulation) systems. In recent years, researchers have resorted to using gyroscopes to determine heel-strike (HS) and toe-off (TO) events in gait cycles. However, these methods are subject to significant delays when implemented in real-time gait monitoring devices, orthoses, and FES systems. Therefore, the work presented in this paper proposes a method that addresses these delays, to ensure real-time gait event detection. The proposed algorithm combines heuristics and a zero-crossing method to identify HS and TO. Experiments involving (1) normal walking, (2) walking with a knee brace, and (3) walking with an ankle brace, for both overground and treadmill walking, were designed to verify and validate the identified HS and TO. The performance of the proposed method was compared against established gait detection algorithms. It was observed that the proposed method produced a detection rate comparable to earlier reported methods and recorded reduced time delays, averaging 100 ms. PMID:25619613

  8. Hemispheric Differences in the Time-Course of Semantic Priming Processes: Evidence from Event-Related Potentials (ERPs)

    ERIC Educational Resources Information Center

    Bouaffre, Sarah; Faita-Ainseba, Frederique

    2007-01-01

    To investigate hemispheric differences in the timing of word priming, the modulation of event-related potentials by semantic word relationships was examined in each cerebral hemisphere. Primes and targets, either categorically (silk-wool) or associatively (needle-sewing) related, were presented to the left or right visual field in a go/no-go…

  9. Timing and duration of climate variability during the 8.2 ka event reconstructed from four speleothems from Germany

    NASA Astrophysics Data System (ADS)

    Wenz, Sarah; Scholz, Denis; Spötl, Christoph; Plessen, Birgit; Mischel, Simon; Breitenbach, Sebastian F. M.; Jochum, Klaus Peter; Fohlmeister, Jens

    2016-04-01

    The most prominent climate anomaly of the Holocene is the 8.2 ka event, which reflects the impact of a dramatic freshwater influx into the North Atlantic during an interglacial climate state. Thus, it can be considered a possible analogue for future climate change. Due to the short-lived nature of the event (160.5 ± 5.5 years; Thomas et al., 2007), a detailed investigation requires archives of both high temporal resolution and accurate chronology. We present high-resolution stable oxygen and carbon isotope records (ca. 3-4 year resolution) as well as sub-annually resolved trace element records of the 8.2 ka event from stalagmites (BB-3, Bu4, HLK2 and TV1) from three cave systems in Germany (Blessberg Cave, Bunker Cave and Herbstlabyrinth). The location of these caves in central Europe is well suited to detecting changes in temperature and precipitation in relation to changes in the North Atlantic region (Fohlmeister et al., 2012). The 8.2 ka event is clearly recorded as a pronounced negative excursion in the δ18O values of all four speleothems. While stalagmites BB-3 from Blessberg Cave and Bu4 from Bunker Cave also show a negative excursion in the δ13C values during the event, the two speleothems from the Herbstlabyrinth show no distinctive features in their δ13C values. The timing, duration and structure of the event differ between the individual records. In BB-3, the event occurs earlier (ca. 8.4 ka) and has a relatively short duration of ca. 90 years. In Bu4, the event occurs later (ca. 8.1 ka) and shows a relatively long duration of more than 200 years. In the two speleothems from the Herbstlabyrinth, the event is replicated and has a timing between 8.3 and 8.1 ka and a duration of ca. 150 years. These differences may at least in part be related to the dating uncertainties of 100-200 years (95% confidence limits). References: Fohlmeister, J., Schroder-Ritzrau, A., Scholz, D., Spötl, C., Riechelmann, D.F.C., Mudelsee, M., Wackerbarth, A., Gerdes, A., Riechelmann, S

  10. Are rare, long waiting times between rearrangement events responsible for the slowdown of the dynamics at the glass transition?

    NASA Astrophysics Data System (ADS)

    Ahn, Ji Won; Falahee, Bryn; Del Piccolo, Chiara; Vogel, Michael; Bingemann, Dieter

    2013-03-01

    The dramatic slowdown of the structural relaxation at the glass transition is one of the most puzzling features of glass dynamics. Single molecule orientational correlation times show this strong Vogel-Fulcher-Tammann temperature dependence typical for glasses. Through statistical analysis of single molecule trajectories, we can identify individual glass rearrangement events in the vicinity of a probe molecule in the glass former poly(vinyl acetate) from 8 K below to 6 K above the glass transition temperature. We find that changes in the distribution of waiting times between individual glass rearrangement events are much less dramatic with temperature, the main difference being a small, but decisive number of increasingly long waiting times at lower temperatures. We notice similar individual, local relaxation events in molecular dynamics trajectories for a variety of glassy systems further from the glass transition, leading to waiting time distributions with similar features as those observed in the single molecule experiments. We show that these rare long waiting times are responsible for the dramatic increase in correlation time upon cooling.

  11. Beyond Visibility: the "Crucifixion Eclipse" in the Context of Some Other Astronomical Events of the Times

    NASA Astrophysics Data System (ADS)

    Gaskell, C. M.

    1993-12-01

    A variety of astronomical, biblical and other historical evidence favors Friday April 3, AD 33 as the date of the crucifixion of Jesus Christ (see Hoehner 1977, "Chronological Aspects of the Life of Christ"). There was also a partial lunar eclipse on that day. Schaefer (1990, QJRAS, 31, 53) has shown convincingly that, while technically the eclipse did occur while the moon was above the horizon in Jerusalem, this eclipse could not have been seen from Jerusalem. However there is good evidence that predictable celestial events were regarded as significant even if they were not visible because of daylight or clouds. Some specific examples will be given of celestial events which would not have been visible from the region, but which were none the less regarded as highly significant during this period. It will be argued that the significance of the lunar eclipse on the day of the crucifixion would be independent of its visibility.

  12. Hour glass half full or half empty? Future time perspective and preoccupation with negative events across the life span.

    PubMed

    Strough, JoNell; Bruine de Bruin, Wändi; Parker, Andrew M; Lemaster, Philip; Pichayayothin, Nipat; Delaney, Rebecca

    2016-09-01

    According to socioemotional selectivity theory, older adults' emotional well-being stems from having a limited future time perspective that motivates them to maximize well-being in the "here and now." Presumably, then, older adults' time horizons are associated with emotional competencies that boost positive affect and dampen negative affect, but little research has addressed this. Using a U.S. adult life-span sample (N = 3,933; 18-93 years), we found that a 2-factor model of future time perspective (future opportunities; limited time) fit the data better than a 1-factor model. Through middle age, people perceived the life-span hourglass as half full-they focused more on future opportunities than limited time. Around Age 60, the balance changed to increasingly perceiving the life-span hourglass as half empty-they focused less on future opportunities and more on limited time, even after accounting for perceived health, self-reported decision-making ability, and retirement status. At all ages, women's time horizons focused more on future opportunities compared with men's, and men's focused more on limited time. Focusing on future opportunities was associated with reporting less preoccupation with negative events, whereas focusing on limited time was associated with reporting more preoccupation. Older adults reported less preoccupation with negative events, and this association was stronger after controlling for their perceptions of limited time and fewer future opportunities, suggesting that other pathways may explain older adults' reports of their ability to disengage from negative events. Insights gained and questions raised by measuring future time perspective as 2 dimensions are discussed. (PsycINFO Database Record

  13. Hour glass half full or half empty? Future time perspective and preoccupation with negative events across the life span.

    PubMed

    Strough, JoNell; Bruine de Bruin, Wändi; Parker, Andrew M; Lemaster, Philip; Pichayayothin, Nipat; Delaney, Rebecca

    2016-09-01

    According to socioemotional selectivity theory, older adults' emotional well-being stems from having a limited future time perspective that motivates them to maximize well-being in the "here and now." Presumably, then, older adults' time horizons are associated with emotional competencies that boost positive affect and dampen negative affect, but little research has addressed this. Using a U.S. adult life-span sample (N = 3,933; 18-93 years), we found that a 2-factor model of future time perspective (future opportunities; limited time) fit the data better than a 1-factor model. Through middle age, people perceived the life-span hourglass as half full-they focused more on future opportunities than limited time. Around Age 60, the balance changed to increasingly perceiving the life-span hourglass as half empty-they focused less on future opportunities and more on limited time, even after accounting for perceived health, self-reported decision-making ability, and retirement status. At all ages, women's time horizons focused more on future opportunities compared with men's, and men's focused more on limited time. Focusing on future opportunities was associated with reporting less preoccupation with negative events, whereas focusing on limited time was associated with reporting more preoccupation. Older adults reported less preoccupation with negative events, and this association was stronger after controlling for their perceptions of limited time and fewer future opportunities, suggesting that other pathways may explain older adults' reports of their ability to disengage from negative events. Insights gained and questions raised by measuring future time perspective as 2 dimensions are discussed. (PsycINFO Database Record PMID:27267222

  14. It’s always snack time: An investigation of event scripts in young children

    PubMed Central

    Musher-Eizenman, Dara R.; Marx, Jenna M.; Taylor, Maija B.

    2015-01-01

    This study examined whether young children include eating in their cognitive scripts for various events, and whether food-related scripts are associated with body mass index (BMI) percentile. Data were collected in a structured interview format. Participants, recruited from area preschools and day cares, provided a four-activity sequence for each of three events, and responses were recorded verbatim. Forty-four children (45% female) participated, with an average BMI percentile of 73.3% (SD = 25.9). Data were binarily coded to indicate whether each response was food-related. Frequencies were obtained, and responses were correlated with BMI percentile. Over 22% of the activities in the children’s scripts involved food. The number of food-related activities reported was positively correlated with children’s BMI percentile (r = 0.53, p = 0.03). Results provide preliminary evidence that food features prominently in young children’s event scripts and that children with higher BMI percentiles may possess scripts that feature more food-related themes. Future researchers should investigate the causal nature of this relationship. PMID:25447019

  15. It's always snack time: an investigation of event scripts in young children.

    PubMed

    Musher-Eizenman, Dara R; Marx, Jenna M; Taylor, Maija B

    2015-02-01

    This study examined whether young children include eating in their cognitive scripts for various events, and whether food-related scripts are associated with body mass index (BMI) percentile. Data were collected in a structured interview format. Participants, recruited from area preschools and day cares, provided a four-activity sequence for each of three events, and responses were recorded verbatim. Forty-four children (45% female) participated, with an average BMI percentile of 73.3% (SD = 25.9). Data were binarily coded to indicate whether each response was food-related. Frequencies were obtained, and responses were correlated with BMI percentile. Over 22% of the activities in the children's scripts involved food. The number of food-related activities reported was positively correlated with children's BMI percentile (r = 0.53, p = 0.03). Results provide preliminary evidence that food features prominently in young children's event scripts and that children with higher BMI percentiles may possess scripts that feature more food-related themes. Future researchers should investigate the causal nature of this relationship.

  16. [Clinical evaluation of ECG at the onset subjective symptoms using real-time analysis electrocardiograph (event recorder)].

    PubMed

    Takizawa, Yoshinori; Shimetani, Naoto; Uchiyama, Kenji; Takayanagi, Kan; Mori, Mikio

    2005-05-01

    Examination of patients complaining of palpitations, chest pain and chest discomfort is usually performed with a 12-lead electrocardiograph. However, the recording time is short and there are few opportunities to capture an ECG during subjective symptoms. To investigate the cause, we need to obtain an ECG during subjective symptoms. Thus, we frequently use a Holter ECG, which can record for 24 hours. However, some patients have a low frequency of subjective symptoms, which may not appear during a 24-hour examination. We used a real-time analysis electrocardiograph (Event Recorder CG-6106, made by Card Guard Scientific Survival Limited) as a monitor during subjective symptoms. ECG findings at the onset of subjective symptoms could then be analyzed in 30 patients who did not have a clear cardiac disease. In this examination, arrhythmia was recorded in 25 of 30 cases. In these cases, an ECG during subjective symptoms could not be captured even when Holter examination was performed several times, but it was captured using the Event Recorder. This method using an Event Recorder is simple and convenient, and is considered very useful for the investigation of subjective symptoms. In the future, the use of an Event Recorder for heart health care in the daily life of healthy people and/or cardiac disease patients is highly anticipated.

  17. Multivariate Hypergeometric Similarity Measure

    PubMed Central

    Kaddi, Chanchala D.; Parry, R. Mitchell; Wang, May D.

    2016-01-01

    We propose a similarity measure based on the multivariate hypergeometric distribution for the pairwise comparison of images and data vectors. The formulation and performance of the proposed measure are compared with other similarity measures using synthetic data. A method of piecewise approximation is also implemented to facilitate application of the proposed measure to large samples. Example applications of the proposed similarity measure are presented using mass spectrometry imaging data and gene expression microarray data. Results from synthetic and biological data indicate that the proposed measure is capable of providing meaningful discrimination between samples, and that it can be a useful tool for identifying potentially related samples in large-scale biological data sets. PMID:24407308
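
    One way to make the idea concrete, not necessarily the authors' exact formulation: pool two count vectors into a single "urn" and score how probable the observed split is under the multivariate hypergeometric distribution, so that well-matched proportions receive higher scores.

```python
# Sketch: a multivariate-hypergeometric-based similarity for count vectors.
import numpy as np
from scipy.stats import multivariate_hypergeom

def mvh_similarity(x, y):
    """x, y: non-negative integer count vectors of equal length."""
    x, y = np.asarray(x), np.asarray(y)
    pooled = x + y                            # per-category totals of the pooled urn
    # probability of drawing exactly x when sum(x) items are drawn from the pool
    return multivariate_hypergeom(m=pooled, n=int(x.sum())).pmf(x)

a = np.array([10, 5, 1])
b = np.array([9, 6, 1])     # similar proportions -> higher score
c = np.array([1, 5, 10])    # dissimilar proportions -> lower score
print(mvh_similarity(a, b), mvh_similarity(a, c))
```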

  18. Survival Outcomes and Effect of Early vs. Deferred cART Among HIV-Infected Patients Diagnosed at the Time of an AIDS-Defining Event: A Cohort Analysis

    PubMed Central

    Mussini, Cristina; Johnson, Margaret; d'Arminio Monforte, Antonella; Antinori, Andrea; Gill, M. John; Sighinolfi, Laura; Uberti-Foppa, Caterina; Borghi, Vanni; Sabin, Caroline

    2011-01-01

    Objectives We analyzed clinical progression among persons diagnosed with HIV at the time of an AIDS-defining event, and assessed the impact on outcome of timing of combined antiretroviral treatment (cART). Methods Retrospective, European and Canadian multicohort study. Patients were diagnosed with HIV from 1997–2004 and had clinical AIDS from 30 days before to 14 days after diagnosis. Clinical progression (new AIDS event, death) was described using Kaplan-Meier analysis stratifying by type of AIDS event. Factors associated with progression were identified with multivariable Cox regression. Progression rates were compared between those starting early (<30 days after AIDS event) or deferred (30–270 days after AIDS event) cART. Results The median (interquartile range) CD4 count and viral load (VL) at diagnosis of the 584 patients were 42 (16, 119) cells/µL and 5.2 (4.5, 5.7) log10 copies/mL. Clinical progression was observed in 165 (28.3%) patients. Older age, a higher VL at diagnosis, and a diagnosis of non-Hodgkin lymphoma (NHL) (vs. other AIDS events) were independently associated with disease progression. Of 366 patients with an opportunistic infection, 178 (48.6%) received early cART. There was no significant difference in clinical progression between those initiating cART early and those deferring treatment (adjusted hazard ratio 1.32 [95% confidence interval 0.87, 2.00], p = 0.20). Conclusions Older patients and patients with high VL or NHL at diagnosis had a worse outcome. Our data suggest that earlier initiation of cART may be beneficial among HIV-infected patients diagnosed with clinical AIDS in our setting. PMID:22043301

  19. Implementation of a multivariate regional index-flood model

    NASA Astrophysics Data System (ADS)

    Requena, Ana Isabel; Chebana, Fateh; Mediero, Luis; Garrote, Luis

    2014-05-01

    A multivariate flood frequency approach is required to obtain appropriate estimates of the design flood associated with a given return period, as the nature of floods is multivariate. A regional frequency analysis is usually conducted to procure estimates, or to reduce the corresponding uncertainty, when no information is available at ungauged sites or only a short record is observed at gauged sites. In the present study a multivariate regional methodology based on the index-flood model is presented, seeking to enrich and complete existing methods by i) considering more general two-parameter copulas for simulating synthetic homogeneous regions to test homogeneity; ii) using the latest definitions of bivariate return periods for quantile estimation; and iii) applying recent procedures for the selection of a subset of bivariate design events from the wider quantile curves. A complete description of the selection processes for both the marginal distributions and the copula is also included. The proposed methodology provides an entire procedure focused on practical application. It was applied to a case study located in the Ebro basin in the north of Spain. Series of annual maximum flow peaks (Q) and their associated hydrograph volumes (V) were selected as flood variables. The initial region was divided into two homogeneous sub-regions by a cluster analysis and a multivariate homogeneity test. The Gumbel and Generalised Extreme Value distributions were selected as marginal distributions to fit the two flood variables. The BB1 copula was found to be the best regional copula for characterising the dependence relation between the variables. The OR bivariate joint return period, related to the (non-exceedance) probability of the event {Q ≤ q ∩ V ≤ v}, was considered for quantile estimation. The index flood was based on the mean of the flood variables. Multiple linear regressions were used to estimate the index flood at ungauged sites. Basin concentration time
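
    A sketch of how the OR joint return period follows from the fitted margins and copula, T = 1 / [1 - C(F_Q(q), F_V(v))] for annual maxima; the BB1 copula formula is standard, but the marginal and copula parameters below are placeholders, not the study's fitted values.

```python
# Sketch: OR joint return period of a candidate (peak, volume) design event.
from scipy.stats import gumbel_r, genextreme

def bb1_copula(u, v, theta=0.5, delta=1.5):
    """BB1 (Clayton-Gumbel) copula, valid for theta > 0 and delta >= 1."""
    s = (u**-theta - 1.0)**delta + (v**-theta - 1.0)**delta
    return (1.0 + s**(1.0 / delta))**(-1.0 / theta)

# Hypothetical marginal fits: Gumbel for peak flow Q, GEV for volume V.
FQ = gumbel_r(loc=300.0, scale=120.0)          # m3/s
FV = genextreme(c=-0.1, loc=20.0, scale=8.0)   # hm3

q, v = 800.0, 60.0                              # a candidate design event
joint_nonexceed = bb1_copula(FQ.cdf(q), FV.cdf(v))
T_or = 1.0 / (1.0 - joint_nonexceed)            # years, for annual maxima
print(f"OR joint return period of (q={q}, v={v}): {T_or:.1f} years")
```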

  20. Jointly Modeling Event Time and Skewed-Longitudinal Data with Missing Response and Mismeasured Covariate for AIDS Studies.

    PubMed

    Huang, Yangxin; Yan, Chunning; Xing, Dongyuan; Zhang, Nanhua; Chen, Henian

    2015-01-01

    In longitudinal studies it is often of interest to investigate how a repeatedly measured marker in time is associated with a time to an event of interest. This type of research question has given rise to a rapidly developing field of biostatistics research that deals with the joint modeling of longitudinal and time-to-event data. Normality of model errors in longitudinal models is a routine assumption, but it may unrealistically obscure important features of subject variation. Covariates are usually introduced in the models to partially explain between- and within-subject variation, but some covariates, such as CD4 cell count, may often be measured with substantial error. Moreover, the responses may be subject to nonignorable missingness. Statistical analysis may become dramatically complicated in longitudinal-survival joint models when the longitudinal data exhibit skewness, missing values, and measurement errors. In this article, we relax the distributional assumptions for the longitudinal models using a skewed (parametric) distribution and an unspecified (nonparametric) distribution modeled by a Dirichlet process prior, and address the simultaneous influence of skewness, missingness, covariate measurement error, and the time-to-event process by jointly modeling three components (the response process with missing values, the covariate process with measurement errors, and the time-to-event process) linked through the random effects that characterize the underlying individual-specific longitudinal processes in a Bayesian analysis. The method is illustrated with an AIDS study by jointly modeling HIV/CD4 dynamics and time to viral rebound, in comparison with potential models under various scenarios and different distributional specifications. PMID:24905593

  1. Expressive timing facilitates the neural processing of phrase boundaries in music: evidence from event-related potentials.

    PubMed

    Istók, Eva; Friberg, Anders; Huotilainen, Minna; Tervaniemi, Mari

    2013-01-01

    The organization of sound into meaningful units is fundamental to the processing of auditory information such as speech and music. In expressive music performance, structural units or phrases may become particularly distinguishable through subtle timing variations highlighting musical phrase boundaries. As such, expressive timing may support the successful parsing of otherwise continuous musical material. By means of the event-related potential technique (ERP), we investigated whether expressive timing modulates the neural processing of musical phrases. Musicians and laymen listened to short atonal scale-like melodies that were presented either isochronously (deadpan) or with expressive timing cues emphasizing the melodies' two-phrase structure. Melodies were presented in an active and a passive condition. Expressive timing facilitated the processing of phrase boundaries as indicated by decreased N2b amplitude and enhanced P3a amplitude for target phrase boundaries and larger P2 amplitude for non-target boundaries. When timing cues were lacking, task demands increased especially for laymen as reflected by reduced P3a amplitude. In line, the N2b occurred earlier for musicians in both conditions indicating general faster target detection compared to laymen. Importantly, the elicitation of a P3a-like response to phrase boundaries marked by a pitch leap during passive exposure suggests that expressive timing information is automatically encoded and may lead to an involuntary allocation of attention towards significant events within a melody. We conclude that subtle timing variations in music performance prepare the listener for musical key events by directing and guiding attention towards their occurrences. That is, expressive timing facilitates the structuring and parsing of continuous musical material even when the auditory input is unattended. PMID:23383088

  2. Expressive timing facilitates the neural processing of phrase boundaries in music: evidence from event-related potentials.

    PubMed

    Istók, Eva; Friberg, Anders; Huotilainen, Minna; Tervaniemi, Mari

    2013-01-01

    The organization of sound into meaningful units is fundamental to the processing of auditory information such as speech and music. In expressive music performance, structural units or phrases may become particularly distinguishable through subtle timing variations highlighting musical phrase boundaries. As such, expressive timing may support the successful parsing of otherwise continuous musical material. By means of the event-related potential technique (ERP), we investigated whether expressive timing modulates the neural processing of musical phrases. Musicians and laymen listened to short atonal scale-like melodies that were presented either isochronously (deadpan) or with expressive timing cues emphasizing the melodies' two-phrase structure. Melodies were presented in an active and a passive condition. Expressive timing facilitated the processing of phrase boundaries as indicated by decreased N2b amplitude and enhanced P3a amplitude for target phrase boundaries and larger P2 amplitude for non-target boundaries. When timing cues were lacking, task demands increased especially for laymen as reflected by reduced P3a amplitude. In line, the N2b occurred earlier for musicians in both conditions indicating general faster target detection compared to laymen. Importantly, the elicitation of a P3a-like response to phrase boundaries marked by a pitch leap during passive exposure suggests that expressive timing information is automatically encoded and may lead to an involuntary allocation of attention towards significant events within a melody. We conclude that subtle timing variations in music performance prepare the listener for musical key events by directing and guiding attention towards their occurrences. That is, expressive timing facilitates the structuring and parsing of continuous musical material even when the auditory input is unattended.

  3. Are Jewish deathdates affected by the timing of important religious events?

    PubMed

    Lee, P; Smith, G

    2000-01-01

    Earlier studies reported a decline in September mortality in New York City and Budapest during years when Yom Kippur was in the interval September 28 through October 3, and fewer deaths among Californians with Jewish surnames during the week preceding Passover than during the week after Passover. These studies suggest that some Jews are able to postpone their deaths until after the celebration of an important religious event. We reexamine these findings using new data and find no statistically persuasive evidence that Jewish deaths decline before religious holidays. We do find an increase in deaths in the weeks shortly before and after birthdays.

  4. Audiovisual interactions in music reading. A reaction times and event-related potentials study.

    PubMed

    Schön, Daniele; Besson, Mireille

    2003-11-01

    The general aim of this experiment was to investigate the processes involved in reading musical notation and to study the relationship between written music and its auditory representation. Our main interest was to determine if musicians can develop expectancies for plausible or implausible auditory events on the sole basis of the visual score. Results showed that musicians can clearly expect auditory endings on the basis of visual information. These findings enliven the discussion on the question of whether music reading is actually music perception.

  5. Quantifying Immune Response to Influenza Virus Infection via Multivariate Nonlinear ODE Models with Partially Observed State Variables and Time-Varying Parameters

    PubMed Central

    Wu, Hulin; Miao, Hongyu; Xue, Hongqi; Topham, David J.; Zand, Martin

    2014-01-01

    Summary Influenza A virus (IAV) infection continues to be a global health threat, as evidenced by the outbreak of the novel A/California/7/2009 IAV strain. Previous flu vaccines have proven less effective than hoped for emerging IAV strains, indicating that a more thorough understanding of immune responses to primary infection is needed. One issue is the difficulty in directly measuring many key parameters and variables of the immune response. To address these issues, we considered a comprehensive workflow for statistical inference for ordinary differential equation (ODE) models with partially observed variables and time-varying parameters, including identifiability analysis, two-stage and NLS estimation, and model selection. In particular, we proposed a novel one-step method to verify parameter identifiability and formulate estimating equations simultaneously. Thus, the pseudo-LS method can now deal with general ODE models with partially observed state variables for the first time. Using this workflow, we verified the relative significance of various immune factors to virus control, including target epithelial cells, cytotoxic T-lymphocyte (CD8+) cells and IAV-specific antibodies (IgG and IgM). Factors other than cytotoxic T-lymphocyte (CTL) killing contributed the most to the loss of infected epithelial cells, though the effects of CTL are still significant. IgM antibody was found to be the major contributor to neutralization of free infectious viral particles. Also, the maximum viral load, which correlates well with mortality, was found to depend more on viral replication rates than infectivity. In contrast to current hypotheses, the results obtained via our methods suggest that IgM antibody and viral replication rates may be worthy of further exploration in vaccine development. PMID:26085850
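
    A minimal sketch of the general idea, not the authors' pseudo-LS workflow: a target-cell-limited influenza ODE is fitted by nonlinear least squares to log10 viral-load observations when only V(t) is observed. All parameter names, values, and the synthetic data below are illustrative assumptions.

```python
# Hedged sketch: fit a basic target-cell-limited influenza ODE when only the
# viral load V(t) is observed, via nonlinear least squares on log10 titres.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def viral_model(t, y, beta, delta, p, c):
    T, I, V = y                        # target cells, infected cells, free virus
    return [-beta * T * V,
            beta * T * V - delta * I,
            p * I - c * V]

t_obs = np.linspace(1, 7, 7)           # days post infection (assumed design)
y0 = [4e8, 0.0, 0.1]                   # assumed initial conditions

def solve_logV(theta):
    sol = solve_ivp(viral_model, (0, t_obs[-1]), y0, args=tuple(theta),
                    t_eval=t_obs, rtol=1e-6)
    return np.log10(np.clip(sol.y[2], 1e-12, None))

true_theta = np.array([2.0e-6, 2.0, 2.0e-2, 3.0])     # beta, delta, p, c (made up)
rng = np.random.default_rng(0)
logV_obs = solve_logV(true_theta) + rng.normal(0, 0.2, t_obs.size)

# estimate the log-parameters starting from a perturbed guess
fit = least_squares(lambda th: solve_logV(np.exp(th)) - logV_obs,
                    x0=np.log(true_theta * [0.5, 2.0, 0.5, 2.0]))
print(dict(zip(["beta", "delta", "p", "c"], np.round(np.exp(fit.x), 4))))
```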

  6. Near Real-Time Optimal Prediction of Adverse Events in Aviation Data

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander; Das, Santanu

    2010-01-01

    The prediction of anomalies or adverse events is a challenging task, and there are a variety of methods which can be used to address the problem. In this paper, we demonstrate how to recast the anomaly prediction problem into a form whose solution is accessible as a level-crossing prediction problem. The level-crossing prediction problem has an elegant, optimal, yet untested solution under certain technical constraints, and only when the appropriate modeling assumptions are made. As such, we will thoroughly investigate the resilience of these modeling assumptions, and show how they affect final performance. Finally, the predictive capability of this method will be assessed by quantitative means, using both validation and test data containing anomalies or adverse events from real aviation data sets that have previously been identified as operationally significant by domain experts. It will be shown that the formulation proposed yields a lower false alarm rate on average than competing methods based on similarly advanced concepts, and a higher correct detection rate than a standard method based upon exceedances that is commonly used for prediction.
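
    As a hedged illustration of recasting event prediction as level-crossing prediction (not the optimal predictor studied in the record), the sketch below raises an alarm whenever an AR(1) forecast of a synthetic health score is predicted to approach a threshold within a horizon, then scores correct detections and false alarms against the realized crossings.

```python
# Illustration only: level-crossing framing of event prediction with an AR(1)
# point forecast; alarm and event definitions are assumptions for this sketch.
import numpy as np

rng = np.random.default_rng(0)
n, phi, thresh, horizon = 2000, 0.95, 3.0, 5
x = np.zeros(n)
for t in range(1, n):                        # synthetic AR(1) "health score"
    x[t] = phi * x[t - 1] + rng.normal()

pred = phi ** horizon * x[:-horizon]         # h-step-ahead AR(1) forecast
alarm = pred > thresh * 0.8                  # alarm slightly below the level
event = np.array([x[t + 1:t + 1 + horizon].max() > thresh
                  for t in range(n - horizon)])

tp = np.sum(alarm & event); fp = np.sum(alarm & ~event); fn = np.sum(~alarm & event)
print(f"correct detection rate {tp / (tp + fn):.2f}, "
      f"false alarm rate {fp / max(1, tp + fp):.2f}")
```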

  7. Cosmic ray measurements on board Helios 1 from December 1974 to September 1975 Quiet time spectra, radial gradients, and solar events

    NASA Technical Reports Server (NTRS)

    Kunow, H.; Witte, M.; Wibberenz, G.; Hempe, H.; Mueller-Mellin, R.; Green, G.; Iwers, B.; Fuckner, J.

    1977-01-01

    The considered time period is characterized by a general decrease in solar activity towards a minimum which occurred in July 1976. The relatively quiet solar conditions facilitate the separation of the gradually varying galactic cosmic radiation from superimposed events of different characteristics. The inspection of neutron monitor data shows that the period is characterized by a slow increase of the high energy galactic cosmic radiation at a relatively constant rate. Attention is given to the instrumentation employed, intensity time profiles and preliminary radial gradients, and quiet time energy spectra. The solar particle events discussed include the January 5, 1976 event, the March 3, 1975 event, and the March 19/20, 1975 event.

  8. The 2009-2010 Guerrero Slow Slip Event Monitored by InSAR, Using Time Series Approach

    NASA Astrophysics Data System (ADS)

    Bacques, G.; Pathier, E.; Lasserre, C.; Cotton, F.; Radiguet, M.; Cycle Sismique et Déformations Transitoires

    2011-12-01

    The time series approach is useful for monitoring the evolution of ground deformation during slow slip events and makes mapping of slip propagation on the subduction plane a promising goal. Here we present our first results concerning the 2009-2010 slow slip events, particularly the distribution of the cumulative surface displacement in LOS (satellite Line Of Sight), the associated slip distribution on the fault plane and the ground deformation evolution obtained. Finally, we open the discussion with a first comparison between the 2009-2010 and the 2006 events, which reveals some differences concerning the amplitude and the distribution of the ground deformation.

  9. Performance of joint modelling of time-to-event data with time-dependent predictors: an assessment based on transition to psychosis data

    PubMed Central

    2016-01-01

    Joint modelling has emerged as a potential tool to analyse data with a time-to-event outcome and longitudinal measurements collected over a series of time points. Joint modelling involves the simultaneous modelling of the two components, namely the time-to-event component and the longitudinal component. The main challenges of joint modelling are the mathematical and computational complexity. Recent advances in joint modelling have seen the emergence of several software packages which have implemented some of the computational requirements to run joint models. These packages have opened the door for more routine use of joint modelling. Through simulations and real data based on transition to psychosis research, we compared joint model analysis of time-to-event outcome with the conventional Cox regression analysis. We also compared a number of packages for fitting joint models. Our results suggest that joint modelling does have advantages over conventional analysis despite its potential complexity. Our results also suggest that the results of analyses may depend on how the methodology is implemented. PMID:27781169
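
    For orientation only, the sketch below shows the naive two-stage alternative that joint models are designed to improve on: each subject's longitudinal marker is reduced to an OLS slope, which is then entered as a covariate in an ordinary Cox regression (here via the lifelines package). The simulated data, effect sizes and column names are made up; a true joint model would estimate both components simultaneously.

```python
# Two-stage (non-joint) analysis sketch: per-subject OLS summary, then Cox.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
rows = []
for i in range(300):
    slope = rng.normal(0.5, 0.3)                     # subject-specific trend
    t_visits = np.arange(0, 5)
    marker = 1.0 + slope * t_visits + rng.normal(0, 0.2, t_visits.size)
    est_slope = np.polyfit(t_visits, marker, 1)[0]   # stage 1: per-subject OLS
    hazard = 0.05 * np.exp(1.2 * slope)              # true hazard uses true slope
    time_event = rng.exponential(1 / hazard)
    time_cens = rng.uniform(2, 10)
    rows.append({"slope_hat": est_slope,
                 "T": min(time_event, time_cens),
                 "E": int(time_event <= time_cens)})

df = pd.DataFrame(rows)
cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")   # stage 2: Cox
cph.print_summary()
```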

  10. Time-Related Determinants of Marital Dissolution.

    ERIC Educational Resources Information Center

    Heaton, Tim B.

    1991-01-01

    Examined temporal dimensions (timing of prior events, historical time, duration dependence, selectivity) and their impact on marital dissolution in multivariate continuous time model using data from June 1985 Current Population Survey. Results indicated that marital stability decreased over time, increased over marital duration, increased with age…

  11. Multivariate statistic and time series analyses of grain-size data in quaternary sediments of Lake El'gygytgyn, NE Russia

    NASA Astrophysics Data System (ADS)

    Francke, A.; Wennrich, V.; Sauerbrey, M.; Juschus, O.; Melles, M.; Brigham-Grette, J.

    2013-11-01

    Lake El'gygytgyn, located in the Far East Russian Arctic, was formed by a meteorite impact about 3.58 Ma ago. In 2009, the International Continental Scientific Drilling Program (ICDP) at Lake El'gygytgyn obtained a continuous sediment sequence of the lacustrine deposits and the upper part of the impact breccia. Here, we present grain-size data of the past 2.6 Ma. General downcore grain-size variations yield coarser sediments during warm periods and finer ones during cold periods. According to principal component analysis (PCA), the climate-dependent variations in grain-size distributions mainly occur in the coarse silt and very fine silt fraction. During interglacial periods, accumulation of coarser material in the lake center is caused by redistribution of clastic material by a wind-induced current pattern during the ice-free period. Sediment supply to the lake is triggered by the thickness of the active layer in the catchment and the availability of water as a transport medium. During glacial periods, sedimentation at Lake El'gygytgyn is hampered by the occurrence of a perennial ice cover, with sedimentation being restricted to seasonal moats and vertical conduits through the ice. Thus, the summer temperature predominantly triggers transport of coarse material into the lake center. Time series analysis, carried out to gain insight into the frequency content of the grain-size data, showed variations predominantly at 98.5, 40.6, and 22.9 kyr oscillations, which correspond to Milankovitch's eccentricity, obliquity and precession bands. Variations in the relative power of these three oscillation bands during the Quaternary suggest that sedimentation processes at Lake El'gygytgyn are dominated by environmental variations caused by global glacial-interglacial variations (eccentricity, obliquity), and local insolation forcing and/or latitudinal teleconnections (precession), respectively.
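
    A rough sketch assuming a generic workflow rather than the authors' exact pipeline: PCA of standardized grain-size fractions followed by a Lomb-Scargle periodogram of the leading component on an uneven age scale, checking for power near the ~100, 41 and 23 kyr orbital bands. The input data here are synthetic.

```python
# Generic PCA + Lomb-Scargle sketch for unevenly sampled downcore data.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
age = np.sort(rng.uniform(0, 2600, 800))              # kyr, unevenly sampled
signal = (np.sin(2 * np.pi * age / 100) + 0.6 * np.sin(2 * np.pi * age / 41)
          + 0.4 * np.sin(2 * np.pi * age / 23))
X = np.column_stack([signal + rng.normal(0, 0.5, age.size) for _ in range(6)])

Xc = (X - X.mean(0)) / X.std(0)                       # standardize fractions
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)     # PCA via SVD
pc1 = Xc @ Vt[0]

periods = np.linspace(15, 150, 500)                   # kyr
power = lombscargle(age, pc1 - pc1.mean(), 2 * np.pi / periods)
for p in (100, 41, 23):
    print(f"power near {p} kyr:", power[np.argmin(abs(periods - p))].round(3))
```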

  12. Estimating incremental cost-effectiveness ratios and their confidence intervals with different terminating events for survival time and costs.

    PubMed

    Chen, Shuai; Zhao, Hongwei

    2013-07-01

    Cost-effectiveness analysis (CEA) is an important component of the economic evaluation of new treatment options. In many clinical and observational studies of costs, censored data pose challenges to the CEA. We consider a special situation where the terminating events for the survival time and costs are different. Traditional methods for statistical inference offer no means for dealing with censored data in these circumstances. To address this gap, we propose a new method for deriving the confidence interval for the incremental cost-effectiveness ratio. The simulation studies and real data example show that our method performs very well for some practical settings, revealing a great potential for application to actual settings in which terminating events for the survival time and costs differ.
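
    To make the target quantity concrete, here is a percentile-bootstrap confidence interval for the ICER with complete, uncensored data; the record's contribution, handling censoring when the terminating events for costs and survival differ, is not reproduced. All numbers are simulated.

```python
# Percentile-bootstrap CI for the incremental cost-effectiveness ratio (ICER),
# uncensored case only, with made-up cost and survival distributions.
import numpy as np

rng = np.random.default_rng(0)
n = 200
cost = {"new": rng.gamma(4, 2500, n), "std": rng.gamma(4, 2000, n)}      # dollars
surv = {"new": rng.exponential(4.0, n), "std": rng.exponential(2.8, n)}  # years

def icer(idx_new, idx_std):
    d_cost = cost["new"][idx_new].mean() - cost["std"][idx_std].mean()
    d_eff = surv["new"][idx_new].mean() - surv["std"][idx_std].mean()
    return d_cost / d_eff

boot = [icer(rng.integers(0, n, n), rng.integers(0, n, n)) for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"ICER point estimate {icer(np.arange(n), np.arange(n)):.0f} $/year,"
      f" 95% CI ({lo:.0f}, {hi:.0f})")
```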

  13. Real-time Monitoring of Discrete Synaptic Release Events and Excitatory Potentials within Self-reconstructed Neuromuscular Junctions.

    PubMed

    Li, Yu-Tao; Zhang, Shu-Hui; Wang, Xue-Ying; Zhang, Xin-Wei; Oleinick, Alexander I; Svir, Irina; Amatore, Christian; Huang, Wei-Hua

    2015-08-01

    Chemical synaptic transmission is central to brain function. In this regard, real-time monitoring of chemical synaptic transmission during neuronal communication remains a great challenge. In this work, in vivo-like oriented neural networks between superior cervical ganglion (SCG) neurons and their effector smooth muscle cells (SMC) were assembled in a microfluidic device. This allowed amperometric detection of individual neurotransmitter release events inside functional SCG-SMC synapses with carbon fiber nanoelectrodes as well as recording of postsynaptic potential using glass nanopipette electrodes. The high vesicular release activities essentially involved complex events arising from flickering fusion pores, as quantitatively established by simulations. This work allowed, for the first time, in situ monitoring of chemical synaptic transmission under conditions close to those found in vivo, which may yield important new insights into the nature of neuronal communication. PMID:26079517

  14. Adverse events following immunization: is this time for the use of WHO causality assessment?

    PubMed

    Tafuri, Silvio; Gallone, Maria Serena; Calabrese, Giulia; Germinario, Cinzia

    2015-05-01

    In recent years, public health authorities in industrialized countries have noted an increase in the numbers of parents choosing not to have their children vaccinated and in the activities of 'antivaccination' movements. Doubts about vaccine safety and lack of surveillance of adverse events following immunization (AEFI) are the most frequent themes proposed by antivaccination movements. This editorial aims to critically analyze the use of AEFI assessment procedures among national health authorities and public health researchers. In fact, the WHO recommended and published a systematic and standardized causality assessment process for serious AEFI, providing a method for individual causality assessment to be used by staff of national immunization programs, regulatory authorities and pharmacovigilance or surveillance departments. The last update was published in March 2013 but to date, an Internet search reveals no information or reports on AEFI surveillance that uses the WHO AEFI causality assessment.

  15. Cause of death--so-called designed event acclimaxing timed happenings.

    PubMed

    Kothari, M L; Mehta, L A; Kothari, V M

    2000-01-01

    Cause-of-death as an established global medical institution faces its greatest challenge in the commonplace observation that the healthy do not necessarily survive and the diseased do not necessarily die. A logical analysis of the assumed relationships between disease and death provides some insights that allow questioning the taken-for-granted relationship between defined disease/s and the final common parameter of death. Causalism as a paradigm has taken leave of all advanced sciences. In medicine, it is lingering on for anthropocentric reasons. Natural death does not come to pass because of some (replaceable) missing element, but because the evolution of the individual from womb to tomb has arrived at its final destination. To accept death as a physiologic event is to advance thanatology and to disburden medical colleges and hospitals of a lot of avoidable thinking and doing.

  16. Low precipitation events in the European Greater Alpine Region and their space-time patterns in the past 210 years.

    NASA Astrophysics Data System (ADS)

    Haslinger, Klaus; Holawe, Franz; Blöschl, Günter

    2016-04-01

    In this study space-time patterns of low precipitation events in the Greater Alpine Region (GAR) of Europe are investigated. A long-term gridded dataset of monthly precipitation sums spanning the last 210 years is used to assess abnormally dry states by applying a monthly percentile deceedence threshold. Furthermore, these anomalies are calculated for 1-, 3-, 6- and 12-month moving averages. Contiguous areas of grid points below the threshold are recorded in a lookup table in order to assess the dry anomalies with an event-based approach. The overall event severity is determined by the mean deviation from the threshold level and the area affected. With this approach we are able to show that the most severe dry anomalies took place in the 1860s, 1850s and 1940s, although there are some differences in the occurrence over time in summer and winter. Winter dry anomalies are more frequent in the 19th century, whereas in summer no clear patterns are perceptible. A spatial clustering analysis of the anomaly fields also reveals distinct patterns in space, clearly indicating the Main Alpine Crest as a major divide of dry anomalies from North to South. A joint consideration of detected dry anomaly events and their associated temperature anomalies shows that in winters of the late 19th and early 20th century dry conditions were more often accompanied by cold temperatures, in contrast to the last 50 years, where dry anomalies are associated with above-average winter temperatures. In general, dry summers are more likely to be warmer than the long-term mean, but there is also a considerable number of dry events with negative temperature anomalies, particularly in the late 19th and early 20th century.
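
    A one-dimensional, hedged sketch of the thresholding step only: months whose k-month moving-average precipitation falls below a calendar-month percentile are flagged, and consecutive flagged months are grouped into dry events. The record's analysis is gridded and spatio-temporal, which this sketch does not attempt; the synthetic series and parameter choices are assumptions.

```python
# 1-D dry-event detection sketch: moving average, calendar-month percentile
# threshold, and grouping of consecutive below-threshold months into events.
import numpy as np

rng = np.random.default_rng(0)
months = 210 * 12                                   # ~210 years of monthly sums
precip = rng.gamma(2.0, 40.0, months)               # synthetic monthly precipitation
k, pct = 3, 20                                      # moving window, percentile

ma = np.convolve(precip, np.ones(k) / k, mode="valid")        # k-month mean
cal = (np.arange(ma.size) + k - 1) % 12                       # ending calendar month
thresh = np.array([np.percentile(ma[cal == m], pct) for m in range(12)])
dry = ma < thresh[cal]

events, start = [], None
for i, flag in enumerate(dry):                      # group consecutive dry months
    if flag and start is None:
        start = i
    elif not flag and start is not None:
        events.append((start, i - 1)); start = None
if start is not None:
    events.append((start, dry.size - 1))
print(f"{len(events)} dry events; longest lasts "
      f"{max(e - s + 1 for s, e in events)} months")
```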

  17. Random-Effects Meta-Analysis of Time-to-Event Data Using the Expectation-Maximisation Algorithm and Shrinkage Estimators

    ERIC Educational Resources Information Center

    Simmonds, Mark C.; Higgins, Julian P. T.; Stewart, Lesley A.

    2013-01-01

    Meta-analysis of time-to-event data has proved difficult in the past because consistent summary statistics often cannot be extracted from published results. The use of individual patient data allows for the re-analysis of each study in a consistent fashion and thus makes meta-analysis of time-to-event data feasible. Time-to-event data can be…

  18. ADESSA: A Real-Time Decision Support Service for Delivery of Semantically Coded Adverse Drug Event Data

    PubMed Central

    Duke, Jon D.; Friedlin, Jeff

    2010-01-01

    Evaluating medications for potential adverse events is a time-consuming process, typically involving manual lookup of information by physicians. This process can be expedited by CDS systems that support dynamic retrieval and filtering of adverse drug events (ADE’s), but such systems require a source of semantically-coded ADE data. We created a two-component system that addresses this need. First we created a natural language processing application which extracts adverse events from Structured Product Labels and generates a standardized ADE knowledge base. We then built a decision support service that consumes a Continuity of Care Document and returns a list of patient-specific ADE’s. Our database currently contains 534,125 ADE’s from 5602 product labels. An NLP evaluation of 9529 ADE’s showed recall of 93% and precision of 95%. On a trial set of 30 CCD’s, the system provided adverse event data for 88% of drugs and returned these results in an average of 620ms. PMID:21346964

  19. An efficient approach to identify different chemical markers between fibrous root and rhizome of Anemarrhena asphodeloides by ultra high-performance liquid chromatography quadrupole time-of-flight tandem mass spectrometry with multivariate statistical analysis.

    PubMed

    Wang, Fang-Xu; Yuan, Jian-Chao; Kang, Li-Ping; Pang, Xu; Yan, Ren-Yi; Zhao, Yang; Zhang, Jie; Sun, Xin-Guang; Ma, Bai-Ping

    2016-09-10

    An ultra high-performance liquid chromatography quadrupole time-of-flight tandem mass spectrometry approach coupled with multivariate statistical analysis was established and applied to rapidly distinguish the chemical differences between the fibrous root and rhizome of Anemarrhena asphodeloides. The datasets of tR-m/z pairs, ion intensity and sample code were processed by principal component analysis and orthogonal partial least squares discriminant analysis. Chemical markers could be identified based on their exact mass data, fragmentation characteristics, and retention times. New compounds among the chemical markers could then be rapidly isolated, guided by the ultra high-performance liquid chromatography quadrupole time-of-flight tandem mass spectrometry, and their definitive structures further elucidated by NMR spectra. Using this approach, twenty-four markers were identified online, including nine new saponins, five of these new steroidal saponins being obtained in pure form. The study validated the proposed approach as a suitable method for identifying the chemical differences between various medicinal parts, in order to expand the usable medicinal parts and increase the utilization rate of resources. PMID:27416524

  20. Time-stratigraphic reconstruction and integration of paleopedologic, sedimentologic, and biotic events (Willwood Formation, Lower Eocene, northwest Wyoming, USA)

    USGS Publications Warehouse

    Bown, T.M.; Kraus, M.J.

    1993-01-01

    An empirically-based model is advanced using paleosol maturities to estimate the relative geologic time separating any stratigraphic levels within the lower Eocene Willwood Formation. The reviewed Willwood time stratigraphy from this analysis helps evaluate the nature, tempo, and possible causes of three major episodes of mammalian appearance and disappearance. These faunal events are directly correlated with certain aspects of paleosol evolution in the Willwood Formation. That evolution is tied directly to climatic changes and to varying sediment accumulation rates in response to tectonism. -from Authors

  1. Pyroclastic density current hazard maps at Campi Flegrei caldera (Italy): the effects of event scale, vent location and time forecasts.

    NASA Astrophysics Data System (ADS)

    Bevilacqua, Andrea; Neri, Augusto; Esposti Ongaro, Tomaso; Isaia, Roberto; Flandoli, Franco; Bisson, Marina

    2016-04-01

    Today hundreds of thousands of people live inside the Campi Flegrei caldera (Italy) and in the adjacent part of the city of Naples, making a future eruption of this volcano an event with huge consequences. Very high risks are associated with the occurrence of pyroclastic density currents (PDCs). Mapping of background or long-term PDC hazard in the area is a great challenge due to the unknown eruption time, scale and vent location of the next event as well as the complex dynamics of the flow over the caldera topography. This is additionally complicated by the remarkable epistemic uncertainty in the eruptive record, affecting the timing of past events, the location of vents, and the estimates of PDC areal extent. First probability maps of PDC invasion were produced combining a vent-opening probability map, statistical estimates concerning the eruptive scales and a Cox-type temporal model including self-excitement effects, based on the eruptive record of the last 15 kyr. Maps were produced by using a Monte Carlo approach and adopting a simplified inundation model based on the "box model" integral approximation tested with 2D transient numerical simulations of flow dynamics. In this presentation we illustrate the independent effects of eruption scale, vent location and time of forecast of the next event. Specific focus was given to the remarkable differences between the eastern and western sectors of the caldera and their effects on the hazard maps. The analysis allowed us to identify areas with elevated probabilities of flow invasion as a function of the different assumptions made. With the quantification of some sources of uncertainty in relation to the system, we were also able to provide mean and percentile maps of PDC hazard levels.

  2. Structural Equation Modeling of Multivariate Time Series

    ERIC Educational Resources Information Center

    du Toit, Stephen H. C.; Browne, Michael W.

    2007-01-01

    The covariance structure of a vector autoregressive process with moving average residuals (VARMA) is derived. It differs from other available expressions for the covariance function of a stationary VARMA process and is compatible with current structural equation methodology. Structural equation modeling programs, such as LISREL, may therefore be…

  3. Event timing and shape analysis of vibration bursts from power circuit breakers

    SciTech Connect

    Polycarpou, A.A.; Soom, A.; Swarnakar, V.; Valtin, R.A.; Acharya, R.S.; Demjanenko, V.; Soumekh, M.; Benenson, D.M.; Porter, J.W.

    1996-04-01

    Noninvasive vibration diagnostic techniques are implemented to assess the mechanical condition of power circuit breakers. A diagnostic system, the Prototype Commercial Portable Diagnostic System (PCPDS), has been developed. The PCPDS hardware includes a portable computer, a data acquisition unit, and computer communication cards. Signal processing techniques include the discrete energy statistics envelope, short-time power spectrum, timing extraction algorithm and chi-square based shape test. Decision-making is carried out via a voter program, to which individual results from the timing and shape analysis programs are passed. Statistical and empirical thresholds have been established that classify the circuit breaker as being in Normal-Transitional-Abnormal (Green-Yellow-Red) condition.

  4. Time-to-event analysis of predictors for recovery from Salmonella Dublin infection in Danish dairy herds between 2002 and 2012.

    PubMed

    Nielsen, Liza Rosenbaum; Dohoo, Ian

    2013-07-01

    Salmonella Dublin infections reduce gross margins and compromise animal health and welfare in dairy cattle herds. Despite on-going control efforts in several countries the duration and risk factors of a persistent infection have been difficult to study due to a lack of suitable data. This study utilised the unique opportunity to extract systematically collected repeated bulk-tank milk antibody measurements from all the Danish dairy herds during a 10-year period to perform a time-to-event analysis of the factors that affect the duration of test-positivity and the hazards of recovery from S. Dublin at herd level. Recovery was defined as a shift from test-positive to test-negative between two year-quarters followed by at least three more test-negative year-quarters. The average duration of infection was approximately 2 years. Predictors of recovery were tested in a multivariable Cox proportional hazard model allowing herds to recover from infection multiple times over the 10-year surveillance period. The model results were based on 36,429 observations with data on all the predictors, representing 3563 herds with a total of 3246 recoveries. Sixty-seven herds (2.4%) remained test-positive throughout the study period. The rest of the 317 herds that did not have any recoveries were censored, mainly due to a cessation of milk production. Prior recovery from test-positivity turned out not to be a significant predictor of recovery in the model. The effect of the duration of infection on the conditional probability of recovery (i.e. the hazard) was time-dependent: early in the study period, long durations of infection were predictive of a low hazard of recovery. Later in the control programme the effect of duration of infection was reduced indicating a desired effect of an intensified control programme. There was an increasing tendency towards longer durations and lower hazard of recovery with: (i) increasing herd sizes, (ii) increasing bulk-tank milk somatic cell counts
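
    A small sketch encoding the recovery definition used in the record (with an assumed 0/1 quarterly coding): a herd recovers when a test-positive quarter is followed by a test-negative quarter and at least three further negative quarters. The Cox modelling of predictors is not reproduced here.

```python
# Detect "recovery" quarters in a herd's quarterly test-status series.
def find_recoveries(status):
    """status: list of 0/1 per year-quarter (1 = test-positive)."""
    recoveries = []
    for t in range(1, len(status) - 3):
        # a positive quarter followed by four consecutive negative quarters
        if status[t - 1] == 1 and all(s == 0 for s in status[t:t + 4]):
            recoveries.append(t)          # index of the first negative quarter
    return recoveries

herd = [0, 1, 1, 1, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0]
print("recovery at quarter index:", find_recoveries(herd))   # -> [4, 10]
```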

  5. The influence of minimum time between rain events (MTE) on the daily rainfall and EI30 erosivity index relation.

    NASA Astrophysics Data System (ADS)

    Ayuso-Ruiz, P.; Ayuso-Muñoz, J. L.; Taguas, E. V.; García-Marín, A.

    2010-05-01

    The amount of rain registered between two consecutive dry time intervals can be defined as a downpour or rain event. The length of these dry periods is known as the minimum time between events (MTE). This work analyses the influence of the MTE value on the relationship between daily rainfall and the EI30 erosivity index. Using a power-law (potential) equation, the relation between the daily EI30 index and daily precipitation, P, was obtained for Malaga. Hourly rainfall data from 1981 to 2007 were used. Rain events of at least 10 mm were identified for each rainy day and several MTE values were used (1, 2, 3, 4, 5 and 6 hours). Due to the hourly resolution of the data, the EI60 index was then obtained by multiplying the kinetic energy and the maximum hourly rainfall. Ten-minute resolution data were also available in Malaga from 1999 to 2002. Using these records, the linear correlation between the EI30 and EI60 indexes was obtained, allowing the conversion of the EI60 indexes previously obtained. The results showed that no significant differences appear when varying the MTE value. The R2 coefficient had values of 0.7192 when working with a 2 hour MTE and 0.7503 for a 6 hour MTE. Thus, it can be concluded that the best relation was obtained for the last MTE, though only a slight dependency between daily rainfall and the EI30 index was found.
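
    A hedged sketch of the regression step, assuming the usual power-law form EI30 = a * P^b: the relation is fitted in log-log space and R2 is compared for event sets built with different MTE values (2 and 6 hours, two of the values considered in the record). The rainfall and erosivity values are synthetic.

```python
# Fit EI30 = a * P**b by linear regression in log-log space and report R^2.
import numpy as np

rng = np.random.default_rng(0)

def fit_power_law(P, EI30):
    b, log_a = np.polyfit(np.log(P), np.log(EI30), 1)
    pred = log_a + b * np.log(P)
    ss_res = np.sum((np.log(EI30) - pred) ** 2)
    ss_tot = np.sum((np.log(EI30) - np.log(EI30).mean()) ** 2)
    return np.exp(log_a), b, 1 - ss_res / ss_tot

for mte in (2, 6):                                    # hours
    P = rng.gamma(2.0, 8.0, 500) + 10                 # daily rainfall >= 10 mm
    EI30 = 0.3 * P ** 1.5 * rng.lognormal(0, 0.4, P.size)
    a, b, r2 = fit_power_law(P, EI30)
    print(f"MTE={mte} h: EI30 = {a:.2f} * P^{b:.2f}, R2 = {r2:.3f}")
```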

  6. Seismicity along the Main Marmara Fault, Turkey: from space-time distribution to repeating events

    NASA Astrophysics Data System (ADS)

    Schmittbuhl, Jean; Karabulut, Hayrullah; Lengliné, Olivier; Bouchon, Michel

    2016-04-01

    The North Anatolian Fault (NAF) poses a significant hazard for the large cities surrounding the Marmara Sea region, particularly the megalopolis of Istanbul. Indeed, the NAF is presently hosting a long unruptured segment below the Sea of Marmara. This seismic gap is approximately 150 km long and corresponds to the Main Marmara Fault (MMF). The seismicity along the MMF is analyzed here for the 2007-2012 period to provide insights into the recent evolution of this important regional seismic gap. High-precision locations show that seismicity varies strongly along strike and depth, providing fine details of the fault behavior that are inaccessible from geodetic inversions. The activity strongly clusters at the regions of transition between basins. The Central basin shows significant seismicity located below the shallow locking depth inferred from GPS measurements. Its b-value is low and the average seismic slip is high. Interestingly, we also found several long-term repeating earthquakes in this domain. Using a template matching technique, we identified two new families of repeaters: a first family that typically belongs to aftershock sequences and a second family of long-lasting repeaters with a multi-month recurrence period. All observations are consistent with a deep creep of this segment. On the contrary, the Kumburgaz basin at the center of the fault shows sparse seismicity with the hallmarks of a locked segment. In the eastern Marmara Sea, the seismicity distribution along the Princes Island segment in the Cinarcik basin is consistent with the geodetic locking depth of 10 km and a low contribution to the regional seismic energy release. The assessment of the locked segment areas provides an estimate of about 7.3 for the magnitude of the main forthcoming event, assuming that the rupture does not extend significantly into the creeping domains.
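
    As an illustration of template matching for repeating events (assumed sampling rate, template and threshold, not the authors' pipeline): a template waveform is slid over a continuous trace and windows whose normalized cross-correlation exceeds 0.9 are flagged as detections.

```python
# Normalized cross-correlation template matching on a synthetic trace.
import numpy as np

rng = np.random.default_rng(0)
fs = 100                                             # Hz (assumed)
template = np.sin(2 * np.pi * 5 * np.arange(0, 1, 1 / fs)) * np.hanning(fs)

trace = rng.normal(0, 0.1, 60 * fs)                  # one minute of noise
for onset in (500, 2300, 4700):                      # bury three repeats
    trace[onset:onset + fs] += template

def normalized_cc(trace, template):
    wins = np.lib.stride_tricks.sliding_window_view(trace, template.size)
    wins = wins - wins.mean(axis=1, keepdims=True)
    tmp = template - template.mean()
    num = wins @ tmp
    den = np.linalg.norm(wins, axis=1) * np.linalg.norm(tmp)
    return num / den

cc = normalized_cc(trace, template)
detections = np.flatnonzero(cc > 0.9)
# keep one detection per cluster of adjacent high-correlation samples
print("detections near samples:", detections[np.r_[True, np.diff(detections) > fs]])
```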

  7. Generalized Robertson-Walker Space-Time Admitting Evolving Null Horizons Related to a Black Hole Event Horizon

    PubMed Central

    2016-01-01

    A new technique is used to study a family of time-dependent null horizons, called “Evolving Null Horizons” (ENHs), of generalized Robertson-Walker (GRW) space-time (M¯,g¯) such that the metric g¯ satisfies a kinematic condition. This work differs from our earlier papers on the same issue, where we used (1 + n)-splitting space-time, since only some special subcases of GRW space-time admit this formalism. Also, in contrast to previous work, we have proved that each member of ENHs is totally umbilical in (M¯,g¯). Finally, we show that there exists an ENH which is always a null horizon evolving into a black hole event horizon and suggest some open problems. PMID:27722202
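
    For orientation, the standard warped-product form of a generalized Robertson-Walker space-time is sketched below; these are the usual conventions and may differ in detail from the notation of the record.

```latex
% Standard GRW warped-product form (assumed conventions, for orientation only):
% \bar{M} = I \times_f M, with line element
\[
  \bar{g} \;=\; -\,dt^{2} \;+\; f^{2}(t)\, g, \qquad t \in I \subseteq \mathbb{R},
\]
% where (M, g) is an n-dimensional Riemannian manifold and f > 0 is the warping
% (scale) function; FLRW metrics are the special case of constant-curvature (M, g).
```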

  8. Relating Derived Relations as a Model of Analogical Reasoning: Reaction Times and Event-Related Potentials

    ERIC Educational Resources Information Center

    Barnes-Holmes, Dermot; Regan, Donal; Barnes-Holmes, Yvonne; Commins, Sean; Walsh, Derek; Stewart, Ian; Smeets, Paul M.; Whelan, Robert; Dymond, Simon

    2005-01-01

    The current study aimed to test a Relational Frame Theory (RFT) model of analogical reasoning based on the relating of derived same and derived difference relations. Experiment 1 recorded reaction time measures of similar-similar (e.g., "apple is to orange as dog is to cat") versus different-different (e.g., "he is to his brother as chalk is to…

  9. Cure models for the analysis of time-to-event data in cancer studies.

    PubMed

    Jia, Xiaoyu; Sima, Camelia S; Brennan, Murray F; Panageas, Katherine S

    2013-11-01

    In settings where it is biologically plausible that some patients are cured after definitive treatment, cure models present an alternative to conventional survival analysis. Cure models can inform on the group of patients cured by estimating the probability of cure and identifying the factors that influence it, while simultaneously focusing on time to recurrence and the associated factors for the remaining patients.
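
    A hedged sketch of a simple parametric mixture cure model (exponential latency, maximum likelihood via scipy), illustrating the idea of a cured fraction plus a time-to-recurrence component; it is not the methodology of the record, and all data are simulated.

```python
# Mixture cure model: a fraction pi is cured, the rest follow an exponential
# time to recurrence; parameters estimated by maximum likelihood.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, true_pi, true_lam = 500, 0.4, 0.3
cured = rng.random(n) < true_pi
t_event = np.where(cured, np.inf, rng.exponential(1 / true_lam, n))
t_cens = rng.uniform(0, 15, n)
time = np.minimum(t_event, t_cens)
delta = (t_event <= t_cens).astype(float)            # 1 = recurrence observed

def neg_loglik(params):
    pi = 1 / (1 + np.exp(-params[0]))                # logit scale for cure fraction
    lam = np.exp(params[1])                          # log scale for rate
    ll_event = np.log(1 - pi) + np.log(lam) - lam * time
    ll_cens = np.log(pi + (1 - pi) * np.exp(-lam * time))
    return -(delta * ll_event + (1 - delta) * ll_cens).sum()

fit = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
pi_hat = 1 / (1 + np.exp(-fit.x[0])); lam_hat = np.exp(fit.x[1])
print(f"estimated cure fraction {pi_hat:.2f}, recurrence rate {lam_hat:.2f}")
```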

  10. Event-driven Monte Carlo: Exact dynamics at all time scales for discrete-variable models

    NASA Astrophysics Data System (ADS)

    Mendoza-Coto, Alejandro; Díaz-Méndez, Rogelio; Pupillo, Guido

    2016-06-01

    We present an algorithm for the simulation of the exact real-time dynamics of classical many-body systems with discrete energy levels. In the same spirit of kinetic Monte Carlo methods, a stochastic solution of the master equation is found, with no need to define any other phase-space construction. However, unlike existing methods, the present algorithm does not assume any particular statistical distribution to perform moves or to advance the time, and thus is a unique tool for the numerical exploration of fast and ultra-fast dynamical regimes. By decomposing the problem in a set of two-level subsystems, we find a natural variable step size, that is well defined from the normalization condition of the transition probabilities between the levels. We successfully test the algorithm with known exact solutions for non-equilibrium dynamics and equilibrium thermodynamical properties of Ising-spin models in one and two dimensions, and compare to standard implementations of kinetic Monte Carlo methods. The present algorithm is directly applicable to the study of the real-time dynamics of a large class of classical Markovian chains, and particularly to short-time situations where the exact evolution is relevant.
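
    For contrast with the exact event-driven scheme described above, the sketch below is the conventional baseline it generalizes: a rejection-free kinetic Monte Carlo update (Gillespie-style, exponential waiting times) for a small 1-D Ising chain with Metropolis rates. This is not the proposed algorithm, only the standard method it is compared against.

```python
# Rejection-free kinetic Monte Carlo for a 1-D Ising chain (baseline method).
import numpy as np

rng = np.random.default_rng(0)
L, J, T, steps = 32, 1.0, 2.0, 5000
spins = rng.choice([-1, 1], L)
t = 0.0

def flip_rate(i):
    dE = 2 * J * spins[i] * (spins[(i - 1) % L] + spins[(i + 1) % L])
    return min(1.0, np.exp(-dE / T))                 # Metropolis rate

for _ in range(steps):
    rates = np.array([flip_rate(i) for i in range(L)])
    R = rates.sum()
    t += rng.exponential(1.0 / R)                    # exponential waiting time
    i = rng.choice(L, p=rates / R)                   # pick an event with prob ∝ rate
    spins[i] *= -1

print(f"time reached {t:.1f}, magnetization {spins.mean():+.3f}")
```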

  11. Ultra-high throughput real-time instruments for capturing fast signals and rare events

    NASA Astrophysics Data System (ADS)

    Buckley, Brandon Walter

    Wide-band signals play important roles in the most exciting areas of science, engineering, and medicine. To keep up with the demands of exploding internet traffic, modern data centers and communication networks are employing increasingly faster data rates. Wide-band techniques such as pulsed radar jamming and spread spectrum frequency hopping are used on the battlefield to wrestle control of the electromagnetic spectrum. Neurons communicate with each other using transient action potentials that last for only milliseconds at a time. And in the search for rare cells, biologists flow large populations of cells single file down microfluidic channels, interrogating them one-by-one, tens of thousands of times per second. Studying and enabling such high-speed phenomena pose enormous technical challenges. For one, parasitic capacitance inherent in analog electrical components limits their response time. Additionally, converting these fast analog signals to the digital domain requires enormous sampling speeds, which can lead to significant jitter and distortion. State-of-the-art imaging technologies, essential for studying biological dynamics and cells in flow, are limited in speed and sensitivity by finite charge transfer and read rates, and by the small numbers of photo-electrons accumulated in short integration times. And finally, ultra-high throughput real-time digital processing is required at the backend to analyze the streaming data. In this thesis, I discuss my work in developing real-time instruments, employing ultrafast optical techniques, which overcome some of these obstacles. In particular, I use broadband dispersive optics to slow down fast signals to speeds accessible to high-bit depth digitizers and signal processors. I also apply telecommunication multiplexing techniques to boost the speeds of confocal fluorescence microscopy. The photonic time stretcher (TiSER) uses dispersive Fourier transformation to slow down analog signals before digitization and

  12. A combined Event-Driven/Time-Driven molecular dynamics algorithm for the simulation of shock waves in rarefied gases

    NASA Astrophysics Data System (ADS)

    Valentini, Paolo; Schwartzentruber, Thomas E.

    2009-12-01

    A novel combined Event-Driven/Time-Driven (ED/TD) algorithm to speed-up the Molecular Dynamics simulation of rarefied gases using realistic spherically symmetric soft potentials is presented. Due to the low density regime, the proposed method correctly identifies the time that must elapse before the next interaction occurs, similarly to Event-Driven Molecular Dynamics. However, each interaction is treated using Time-Driven Molecular Dynamics, thereby integrating Newton's Second Law using the sufficiently small time step needed to correctly resolve the atomic motion. Although infrequent, many-body interactions are also accounted for with a small approximation. The combined ED/TD method is shown to correctly reproduce translational relaxation in argon, described using the Lennard-Jones potential. For densities between ρ = 10⁻⁴ kg/m³ and ρ = 10⁻¹ kg/m³, comparisons with kinetic theory, Direct Simulation Monte Carlo, and pure Time-Driven Molecular Dynamics demonstrate that the ED/TD algorithm correctly reproduces the proper collision rates and the evolution toward thermal equilibrium. Finally, the combined ED/TD algorithm is applied to the simulation of a Mach 9 shock wave in rarefied argon. Density and temperature profiles as well as molecular velocity distributions accurately match DSMC results, and the shock thickness is within the experimental uncertainty. For the problems considered, the ED/TD algorithm ranged from several hundred to several thousand times faster than conventional Time-Driven MD. Moreover, the force calculation to integrate the molecular trajectories is found to contribute a negligible amount to the overall ED/TD simulation time. Therefore, this method could pave the way for the application of much more refined and expensive interatomic potentials, either classical or first-principles, to Molecular Dynamics simulations of shock waves in rarefied gases, involving vibrational nonequilibrium and chemical reactivity.

  13. A combined Event-Driven/Time-Driven molecular dynamics algorithm for the simulation of shock waves in rarefied gases

    SciTech Connect

    Valentini, Paolo Schwartzentruber, Thomas E.

    2009-12-10

    A novel combined Event-Driven/Time-Driven (ED/TD) algorithm to speed-up the Molecular Dynamics simulation of rarefied gases using realistic spherically symmetric soft potentials is presented. Due to the low density regime, the proposed method correctly identifies the time that must elapse before the next interaction occurs, similarly to Event-Driven Molecular Dynamics. However, each interaction is treated using Time-Driven Molecular Dynamics, thereby integrating Newton's Second Law using the sufficiently small time step needed to correctly resolve the atomic motion. Although infrequent, many-body interactions are also accounted for with a small approximation. The combined ED/TD method is shown to correctly reproduce translational relaxation in argon, described using the Lennard-Jones potential. For densities between ρ = 10⁻⁴ kg/m³ and ρ = 10⁻¹ kg/m³, comparisons with kinetic theory, Direct Simulation Monte Carlo, and pure Time-Driven Molecular Dynamics demonstrate that the ED/TD algorithm correctly reproduces the proper collision rates and the evolution toward thermal equilibrium. Finally, the combined ED/TD algorithm is applied to the simulation of a Mach 9 shock wave in rarefied argon. Density and temperature profiles as well as molecular velocity distributions accurately match DSMC results, and the shock thickness is within the experimental uncertainty. For the problems considered, the ED/TD algorithm ranged from several hundred to several thousand times faster than conventional Time-Driven MD. Moreover, the force calculation to integrate the molecular trajectories is found to contribute a negligible amount to the overall ED/TD simulation time. Therefore, this method could pave the way for the application of much more refined and expensive interatomic potentials, either classical or first-principles, to Molecular Dynamics simulations of shock waves in rarefied gases, involving vibrational nonequilibrium and chemical reactivity.

  14. Quantifying the Rate of Surface Soil Drying Following Precipitation Events Using PBO H2O Soil Moisture Time Series

    NASA Astrophysics Data System (ADS)

    Prue, A. M.

    2014-12-01

    Surface soil moisture affects latent and sensible heat fluxes, as well as setting the top boundary condition for water redistribution within the soil column. The fluctuations in surface soil moisture have been described in numerous modeling studies, but characterization based on measurements is lacking. We use a new soil moisture dataset based on reflected GPS signals to provide some constraints on rates of surface soil drying after a rain event. The soil moisture time series used in this study are derived from GPS data collected at NSF's EarthScope Plate Boundary Observatory (PBO) sites. The University of Colorado Boulder's PBO H2O project estimates daily near-surface soil moisture (approximately 0-5 cm) from the interference pattern between the direct and ground-reflected GPS signals. The sensing footprint is ~1000 m2, and thus intermediate in scale between in situ and remotely sensed observations. Twelve sites from this network of more than 100 were used in this study. To characterize the rate of soil drying, we fit exponential curves to daily soil moisture observations following ten isolated rainfall events at each site. Event sizes varied from 5 to 40 mm and were followed by 17 days without rain. The decay model fits the data quite well, with r2 values exceeding 0.85 in nearly all cases. For 95% of the events studied, the exponential decay constant (e-folding time) fell between 2 and 6 days. Precipitation amount is not correlated with drydown rates. Instead, the rate of soil drying is well-correlated with air temperature: the exponential constant decreases by 0.1 days per degree Celsius. We are currently investigating how other factors, such as soil type and vegetation, influence soil drying. This study highlights the utility of the PBO H2O soil moisture product. Surface soil moisture changes rapidly, and thus the dynamics of surface soil moisture cannot be accurately characterized using datasets based on less than daily measurements.
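
    A minimal sketch of the drydown fit under the assumed functional form theta(t) = theta_f + A * exp(-t / tau): scipy's curve_fit recovers the e-folding time tau from daily soil-moisture values following a rain event. The values below are synthetic, not PBO H2O data.

```python
# Exponential drydown fit to daily post-event soil moisture (synthetic data).
import numpy as np
from scipy.optimize import curve_fit

days = np.arange(17)                                  # 17 rain-free days
true_tau = 3.5
theta = (0.12 + 0.18 * np.exp(-days / true_tau)
         + np.random.default_rng(0).normal(0, 0.005, days.size))

def drydown(t, theta_f, A, tau):
    return theta_f + A * np.exp(-t / tau)

popt, _ = curve_fit(drydown, days, theta, p0=(0.1, 0.2, 3.0))
resid = theta - drydown(days, *popt)
r2 = 1 - resid.var() / theta.var()
print(f"e-folding time = {popt[2]:.1f} days, r^2 = {r2:.2f}")
```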

  15. Unexpected spatial intensity distributions and onset timing of solar electron events observed by closely spaced STEREO spacecraft

    NASA Astrophysics Data System (ADS)

    Klassen, A.; Dresing, N.; Gómez-Herrero, R.; Heber, B.; Müller-Mellin, R.

    2016-09-01

    We present multi-spacecraft observations of four solar electron events using measurements from the Solar Electron Proton Telescope (SEPT) and the Electron Proton Helium INstrument (EPHIN) on board the STEREO and SOHO spacecraft, respectively, occurring between 11 October 2013 and 1 August 2014, during the approaching superior conjunction period of the two STEREO spacecraft. At this time the longitudinal separation angle between STEREO-A (STA) and STEREO-B (STB) was less than 72°. The parent particle sources (flares) of the four investigated events were situated close to, in between, or to the west of the STEREO's magnetic footpoints. The STEREO measurements revealed a strong difference in electron peak intensities (factor ≤12) showing unexpected intensity distributions at 1 AU, although the two spacecraft had nominally nearly the same angular magnetic footpoint separation from the flaring active region (AR) or their magnetic footpoints were both situated eastwards from the parent particle source. Furthermore, the events detected by the two STEREO spacecraft show highly unexpected onset timing with respect to each other: the spacecraft magnetically best connected to the flare detected a later arrival of electrons than the other one. This leads us to suggest the concept of a rippled peak intensity distribution at 1 AU formed by narrow peaks (fingers) superposed on a quasi-uniform Gaussian distribution. Additionally, two of the four investigated solar energetic particle (SEP) events show a so-called circumsolar distribution and their characteristics make it plausible to suggest a two-component particle injection scenario forming an unusual, non-uniform intensity distribution at 1 AU.

  16. Do pollution time-series studies contain uncontrolled or residual confounding by risk factors for acute health events?

    PubMed

    Bukowski, John

    2008-07-01

    Acute health effects from air pollution are based largely on weak associations identified in time-series studies comparing daily air pollution levels to daily mortality. Much of this mortality is due to cardiovascular disease. Time-series studies have many potential limitations, but are not thought to be confounded by traditional cardiovascular risk factors (e.g., smoking status or hypertension) because these chronic risk factors are not obviously associated with daily pollution levels. However, acute psychobehavioral variants of these risk factors (e.g., smoking patterns and episodes of stress on any given day) are plausible confounders for the associations observed in time-series studies, given that time-series studies attempt to predict acute rather than chronic health outcomes. There is a fairly compelling literature on the strong link between cardiovascular events and daily "triggers" such as stress. Stress-related triggers are plausibly associated with daily pollution levels through surrogate stressors such as ambient temperature, daily workload, local traffic congestion, or other correlates of air pollution. For example, variables such as traffic congestion and industrial activity increase both stress-related health events and air pollution, suggesting the potential for classical confounding. Support for this argument is illustrated through examples of the well-demonstrated relationship between emotional stress and heart attack/stroke.

  17. A Bayesian mixture of semiparametric mixed-effects joint models for skewed-longitudinal and time-to-event data.

    PubMed

    Chen, Jiaqing; Huang, Yangxin

    2015-09-10

    In longitudinal studies, it is of interest to investigate how repeatedly measured markers in time are associated with the time to an event of interest; at the same time, the repeated measurements are often observed with the features of a heterogeneous population, non-normality, and covariates measured with error because of the longitudinal nature of the data. Statistical analysis may become dramatically more complicated when one analyzes longitudinal-survival data with these features together. Recently, a mixture of skewed distributions has received increasing attention in the treatment of heterogeneous data involving asymmetric behaviors across subclasses, but there are relatively few studies that simultaneously accommodate the heterogeneity, non-normality, and covariate measurement error arising in the longitudinal-survival data setting. Under the umbrella of Bayesian inference, this article explores a finite mixture of semiparametric mixed-effects joint models with skewed distributions for longitudinal measures with an attempt to mediate homogeneous characteristics, adjust departures from normality, and tailor accuracy from measurement error in covariates as well as overcome shortages of confidence in specifying a time-to-event model. The Bayesian mixture of joint modeling offers an appropriate avenue to estimate not only all parameters of mixture joint models but also probabilities of class membership. Simulation studies are conducted to assess the performance of the proposed method, and a real example is analyzed to demonstrate the methodology. The results are reported by comparing potential models with various scenarios.

  18. Event-sequence time series analysis in ground-based gamma-ray astronomy

    SciTech Connect

    Barres de Almeida, U.; Chadwick, P.; Daniel, M.; Nolan, S.; McComb, L.

    2008-12-24

    The recent, extreme episodes of variability detected from Blazars by the leading atmospheric Cerenkov experiments motivate the development and application of specialized statistical techniques that enable the study of this rich data set to its furthest extent. The identification of the shortest variability timescales supported by the data and the actual variability structure observed in the light curves of these sources are some of the fundamental aspects being studied; answers to these questions can bring new developments in the understanding of the physics of these objects and of the mechanisms of production of VHE gamma-rays in the Universe. Some of our efforts in studying the time variability of VHE sources involve the application of dynamic programming algorithms to the problem of detecting change-points in a Poisson sequence. In this particular paper we concentrate on the more fundamental issue of the applicability of counting statistics to the analysis of time series in VHE gamma-ray astronomy.
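
    A hedged sketch of the single-change-point building block only: the two-segment Poisson log-likelihood is maximized over all split points of a binned counting sequence. The dynamic-programming machinery for multiple change-points mentioned in the record is not reproduced, and the counts are simulated.

```python
# Single Poisson change-point by maximizing the two-segment log-likelihood.
import numpy as np

rng = np.random.default_rng(0)
counts = np.r_[rng.poisson(2.0, 120), rng.poisson(6.0, 80)]   # rate jump at bin 120

def seg_loglik(c):
    lam = c.mean()
    # Poisson log-likelihood up to the constant sum(log c!)
    return (c * np.log(lam) - lam).sum() if lam > 0 else 0.0

n = counts.size
ll = [seg_loglik(counts[:k]) + seg_loglik(counts[k:]) for k in range(1, n)]
k_hat = int(np.argmax(ll)) + 1
print("estimated change-point bin:", k_hat)
```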

  19. Network Events on Multiple Space and Time Scales in Cultured Neural Networks and in a Stochastic Rate Model

    PubMed Central

    Gigante, Guido; Deco, Gustavo; Marom, Shimon; Del Giudice, Paolo

    2015-01-01

    Cortical networks, in-vitro as well as in-vivo, can spontaneously generate a variety of collective dynamical events such as network spikes, UP and DOWN states, global oscillations, and avalanches. Though each of them has been variously recognized in previous works as expression of the excitability of the cortical tissue and the associated nonlinear dynamics, a unified picture of the determinant factors (dynamical and architectural) is desirable and not yet available. Progress has also been partially hindered by the use of a variety of statistical measures to define the network events of interest. We propose here a common probabilistic definition of network events that, applied to the firing activity of cultured neural networks, highlights the co-occurrence of network spikes, power-law distributed avalanches, and exponentially distributed ‘quasi-orbits’, which offer a third type of collective behavior. A rate model, including synaptic excitation and inhibition with no imposed topology, synaptic short-term depression, and finite-size noise, accounts for all these different, coexisting phenomena. We find that their emergence is largely regulated by the proximity to an oscillatory instability of the dynamics, where the non-linear excitable behavior leads to a self-amplification of activity fluctuations over a wide range of scales in space and time. In this sense, the cultured network dynamics is compatible with an excitation-inhibition balance corresponding to a slightly sub-critical regime. Finally, we propose and test a method to infer the characteristic time of the fatigue process, from the observed time course of the network’s firing rate. Unlike the model, possessing a single fatigue mechanism, the cultured network appears to show multiple time scales, signalling the possible coexistence of different fatigue mechanisms. PMID:26558616

  20. Network Events on Multiple Space and Time Scales in Cultured Neural Networks and in a Stochastic Rate Model.

    PubMed

    Gigante, Guido; Deco, Gustavo; Marom, Shimon; Del Giudice, Paolo

    2015-11-01

    Cortical networks, in-vitro as well as in-vivo, can spontaneously generate a variety of collective dynamical events such as network spikes, UP and DOWN states, global oscillations, and avalanches. Though each of them has been variously recognized in previous works as expression of the excitability of the cortical tissue and the associated nonlinear dynamics, a unified picture of the determinant factors (dynamical and architectural) is desirable and not yet available. Progress has also been partially hindered by the use of a variety of statistical measures to define the network events of interest. We propose here a common probabilistic definition of network events that, applied to the firing activity of cultured neural networks, highlights the co-occurrence of network spikes, power-law distributed avalanches, and exponentially distributed 'quasi-orbits', which offer a third type of collective behavior. A rate model, including synaptic excitation and inhibition with no imposed topology, synaptic short-term depression, and finite-size noise, accounts for all these different, coexisting phenomena. We find that their emergence is largely regulated by the proximity to an oscillatory instability of the dynamics, where the non-linear excitable behavior leads to a self-amplification of activity fluctuations over a wide range of scales in space and time. In this sense, the cultured network dynamics is compatible with an excitation-inhibition balance corresponding to a slightly sub-critical regime. Finally, we propose and test a method to infer the characteristic time of the fatigue process, from the observed time course of the network's firing rate. Unlike the model, possessing a single fatigue mechanism, the cultured network appears to show multiple time scales, signalling the possible coexistence of different fatigue mechanisms.

  2. Analysis of Time to Event Outcomes in Randomized Controlled Trials by Generalized Additive Models

    PubMed Central

    Argyropoulos, Christos; Unruh, Mark L.

    2015-01-01

    Background Randomized Controlled Trials almost invariably utilize the hazard ratio (HR) calculated with a Cox proportional hazards model as a treatment efficacy measure. Despite the widespread adoption of HRs, these provide a limited understanding of the treatment effect and may even provide a biased estimate when the assumption of proportional hazards in the Cox model is not verified by the trial data. Additional treatment effect measures on the survival probability or the time scale may be used to supplement HRs but a framework for the simultaneous generation of these measures is lacking. Methods By splitting follow-up time at the nodes of a Gauss-Lobatto numerical quadrature rule, techniques for Poisson Generalized Additive Models (PGAM) can be adopted for flexible hazard modeling. Straightforward simulation post-estimation transforms PGAM estimates for the log hazard into estimates of the survival function. These in turn were used to calculate relative and absolute risks or even differences in restricted mean survival time between treatment arms. We illustrate our approach with extensive simulations and in two trials: IPASS (in which the proportionality of hazards was violated) and HEMO, a long-duration study conducted under evolving standards of care in a heterogeneous patient population. Findings PGAM can generate estimates of the survival function and the hazard ratio that are essentially identical to those obtained by Kaplan-Meier curve analysis and the Cox model. PGAMs can simultaneously provide multiple measures of treatment efficacy after a single data pass. Furthermore, PGAMs supported not only unadjusted (overall treatment effect) but also subgroup and adjusted analyses, while incorporating multiple time scales and accounting for non-proportional hazards in survival data. Conclusions By augmenting the HR conventionally reported, PGAMs have the potential to support the inferential goals of multiple stakeholders involved in the evaluation and appraisal of clinical trial
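
    To make the underlying "Poisson trick" concrete (without the Gauss-Lobatto splitting or the smooth terms of the actual PGAM), the sketch below expands follow-up into intervals and fits a Poisson GLM with a log person-time offset, which reproduces a piecewise-exponential hazard model; the simulated log hazard ratio should be recovered approximately. Interval cut points, effect sizes and column names are assumptions.

```python
# Piecewise-exponential survival model via a Poisson GLM with an offset.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400
trt = rng.integers(0, 2, n)
time = rng.exponential(1 / (0.2 * np.exp(-0.5 * trt)))    # true log-HR = -0.5
event = (time < 5).astype(int)
time = np.minimum(time, 5.0)                               # administrative censoring

cuts = np.array([0, 1, 2, 3, 4, 5.0])
rows = []
for i in range(n):
    for a, b in zip(cuts[:-1], cuts[1:]):                  # expand follow-up
        if time[i] <= a:
            break
        rows.append({"interval": f"{a:.0f}-{b:.0f}", "trt": trt[i],
                     "exposure": min(time[i], b) - a,
                     "d": int(event[i] and time[i] <= b)})
long = pd.DataFrame(rows)

X = pd.get_dummies(long[["interval"]], drop_first=True).assign(trt=long.trt)
X = sm.add_constant(X.astype(float))
fit = sm.GLM(long.d, X, family=sm.families.Poisson(),
             offset=np.log(long.exposure)).fit()
print("estimated log hazard ratio for trt:", round(fit.params["trt"], 3))
```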

  3. Time-Shift Correlation Algorithm for P300 Event Related Potential Brain-Computer Interface Implementation

    PubMed Central

    Liu, Ju-Chi; Chou, Hung-Chyun; Chen, Chien-Hsiu; Lin, Yi-Tseng

    2016-01-01

    A highly efficient time-shift correlation algorithm was proposed to deal with the peak-time uncertainty of the P300 evoked potential for a P300-based brain-computer interface (BCI). The time-shift correlation series data were collected as the input nodes of an artificial neural network (ANN), and the classification of four LED visual stimuli was selected as the output node. Two operating modes, a fast-recognition mode (FM) and an accuracy-recognition mode (AM), were realized. The proposed BCI system was implemented on an embedded system for commanding an adult-size humanoid robot, and its performance was evaluated by examining the robot's ground-truth trajectories. When the humanoid robot walked in a spacious area, the FM was used to control the robot with a higher information transfer rate (ITR). When the robot walked in a crowded area, the AM was used for high recognition accuracy to reduce the risk of collision. The experimental results showed that, in 100 trials, the accuracy rate of the FM was 87.8% and the average ITR was 52.73 bits/min. In addition, the accuracy rate improved to 92% for the AM, while the average ITR decreased to 31.27 bits/min due to the strict recognition constraints. PMID:27579033
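
    A minimal sketch of a time-shift correlation feature extractor of the sort described (the circular shift, shift range, and idealized template are simplifying assumptions); the resulting series could feed a small neural network classifier:

        import numpy as np

        def time_shift_correlation(epoch, template, max_shift=25):
            """Correlate an EEG epoch with a P300 template over a range of
            time shifts (in samples); the series of correlations is used as
            classifier input. A circular shift is used for simplicity."""
            feats = []
            for s in range(-max_shift, max_shift + 1):
                shifted = np.roll(template, s)
                feats.append(np.corrcoef(epoch, shifted)[0, 1])
            return np.array(feats)

        rng = np.random.default_rng(0)
        template = np.exp(-0.5 * ((np.arange(200) - 120) / 15.0) ** 2)  # idealized P300 bump
        epoch = np.roll(template, 7) + 0.3 * rng.standard_normal(200)
        features = time_shift_correlation(epoch, template)
        # features could feed a small MLP (e.g., sklearn MLPClassifier) over the four stimuli
        print(features.argmax() - 25)   # recovered shift of the P300 peak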

  4. Time-Shift Correlation Algorithm for P300 Event Related Potential Brain-Computer Interface Implementation.

    PubMed

    Liu, Ju-Chi; Chou, Hung-Chyun; Chen, Chien-Hsiu; Lin, Yi-Tseng; Kuo, Chung-Hsien

    2016-01-01

    A highly efficient time-shift correlation algorithm was proposed to deal with the peak-time uncertainty of the P300 evoked potential for a P300-based brain-computer interface (BCI). The time-shift correlation series data were collected as the input nodes of an artificial neural network (ANN), and the classification of four LED visual stimuli was selected as the output node. Two operating modes, a fast-recognition mode (FM) and an accuracy-recognition mode (AM), were realized. The proposed BCI system was implemented on an embedded system for commanding an adult-size humanoid robot, and its performance was evaluated by examining the robot's ground-truth trajectories. When the humanoid robot walked in a spacious area, the FM was used to control the robot with a higher information transfer rate (ITR). When the robot walked in a crowded area, the AM was used for high recognition accuracy to reduce the risk of collision. The experimental results showed that, in 100 trials, the accuracy rate of the FM was 87.8% and the average ITR was 52.73 bits/min. In addition, the accuracy rate improved to 92% for the AM, while the average ITR decreased to 31.27 bits/min due to the strict recognition constraints.

  5. Time-Shift Correlation Algorithm for P300 Event Related Potential Brain-Computer Interface Implementation.

    PubMed

    Liu, Ju-Chi; Chou, Hung-Chyun; Chen, Chien-Hsiu; Lin, Yi-Tseng; Kuo, Chung-Hsien

    2016-01-01

    A highly efficient time-shift correlation algorithm was proposed to deal with the peak-time uncertainty of the P300 evoked potential for a P300-based brain-computer interface (BCI). The time-shift correlation series data were collected as the input nodes of an artificial neural network (ANN), and the classification of four LED visual stimuli was selected as the output node. Two operating modes, a fast-recognition mode (FM) and an accuracy-recognition mode (AM), were realized. The proposed BCI system was implemented on an embedded system for commanding an adult-size humanoid robot, and its performance was evaluated by examining the robot's ground-truth trajectories. When the humanoid robot walked in a spacious area, the FM was used to control the robot with a higher information transfer rate (ITR). When the robot walked in a crowded area, the AM was used for high recognition accuracy to reduce the risk of collision. The experimental results showed that, in 100 trials, the accuracy rate of the FM was 87.8% and the average ITR was 52.73 bits/min. In addition, the accuracy rate improved to 92% for the AM, while the average ITR decreased to 31.27 bits/min due to the strict recognition constraints. PMID:27579033

  6. Multivariate skew-t approach to the design of accumulation risk scenarios for the flooding hazard

    NASA Astrophysics Data System (ADS)

    Ghizzoni, Tatiana; Roth, Giorgio; Rudari, Roberto

    2010-10-01

    The multivariate version of the skew-t distribution provides a powerful analytical description of the joint behavior of multivariate processes. It enjoys valuable properties, from the ability to model skewed as well as leptokurtic datasets to the availability of analytical expressions for moments and likelihood. Moreover, it offers a wide range of extremal dependence strength, allowing for upper and lower tail dependence. The idea underlying this work is to employ the multivariate skew-t distribution to provide an estimation of the joint probability of flood events in a multi-site, multi-basin approach. This constitutes the basis for the design and evaluation of flood hazard scenarios for large areas in terms of their intensity, extension and frequency, i.e., the information required by civil protection agencies to put mitigation strategies into action and by insurance companies to price flooding risk and evaluate portfolios. The performance of the skew-t distribution and of the corresponding t copula function, introduced to represent the state of the art for multivariate simulations, is discussed with reference to the Tanaro Basin, northwestern Italy. To enhance the characteristics of the correlation structure, three nested and non-nested gauging stations are selected, with contributing areas from 1500 to 8000 km². A dataset of 76 trivariate flood events is extracted from a database of mean daily discharges available for the period from January 1995 to December 2003. Applications include the generation of multivariate skew-t and t copula samples and model comparison through the principle of minimum cross-entropy, here revised for application to multivariate samples. Copula- and skew-t-based scenario return period estimates are provided for the November 1994 flood event, i.e., the worst on record in the 1801-2001 period. Results are encouraging: the skew-t distribution seems able to describe the joint behavior, being close to the observations. Marginal
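
    A sketch of how a multivariate skew-t sample can be generated via the standard selection representation (a skew-normal draw divided by the square root of a chi-square mixing variable); parameters are illustrative and not those calibrated for the Tanaro Basin:

        import numpy as np

        def rmst(n, xi, Omega, alpha, nu, seed=0):
            """Draw n samples from a multivariate skew-t distribution
            (Azzalini-type selection representation). xi: location,
            Omega: scale matrix, alpha: skewness vector, nu: degrees of freedom."""
            rng = np.random.default_rng(seed)
            xi, alpha, Omega = (np.asarray(a, float) for a in (xi, alpha, Omega))
            d = len(xi)
            omega = np.sqrt(np.diag(Omega))
            Obar = Omega / np.outer(omega, omega)               # correlation matrix
            delta = Obar @ alpha / np.sqrt(1.0 + alpha @ Obar @ alpha)
            # Augmented (d+1)-dimensional normal for the selection mechanism
            Ostar = np.block([[np.ones((1, 1)), delta[None, :]],
                              [delta[:, None], Obar]])
            z = rng.multivariate_normal(np.zeros(d + 1), Ostar, size=n)
            x0, x = z[:, 0], z[:, 1:]
            x[x0 < 0] *= -1.0                                   # skew-normal via selection
            w = rng.chisquare(nu, size=n) / nu                  # heavy-tail mixing variable
            return xi + omega * x / np.sqrt(w)[:, None]

        sample = rmst(1000, xi=np.zeros(3), Omega=np.eye(3),
                      alpha=np.array([4.0, 2.0, 1.0]), nu=5)
        print(sample.mean(axis=0))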

  7. Relating interesting quantitative time series patterns with text events and text features

    NASA Astrophysics Data System (ADS)

    Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.

    2013-12-01

    In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent, high-frequency quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allow a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps for frequent quantitative and text-oriented data using an existing a-priori method. First, based on heuristics, we extract interesting intervals and patterns from large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns then serve as input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a-priori method supports the discovery of such sequential temporal patterns. Then, various text features such as the degree of sentence nesting, noun phrase complexity, and vocabulary richness are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differs from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster-, and sequence-visualization and analysis functionality. We provide two case studies showing the effectiveness of our combined quantitative and textual analysis workflow. The workflow can also be generalized to other
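
    A minimal sketch of the first analysis step, heuristic extraction of interesting intervals from a price series (window length and threshold are assumptions, not the paper's parameters); the intervals would then be matched against news-derived text features in the second step:

        import numpy as np

        def interesting_intervals(prices, window=5, threshold=0.03):
            """Flag windows whose relative price change exceeds a threshold.
            Returns (start index, end index, relative change) tuples."""
            prices = np.asarray(prices, dtype=float)
            out = []
            for start in range(len(prices) - window):
                change = (prices[start + window] - prices[start]) / prices[start]
                if abs(change) >= threshold:
                    out.append((start, start + window, change))
            return out

        rng = np.random.default_rng(5)
        prices = 100.0 * np.cumprod(1.0 + 0.01 * rng.standard_normal(250))
        print(len(interesting_intervals(prices)))   # number of candidate intervals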

  8. A survival tree method for the analysis of discrete event times in clinical and epidemiological studies.

    PubMed

    Schmid, Matthias; Küchenhoff, Helmut; Hoerauf, Achim; Tutz, Gerhard

    2016-02-28

    Survival trees are a popular alternative to parametric survival modeling when there are interactions between the predictor variables or when the aim is to stratify patients into prognostic subgroups. A limitation of classical survival tree methodology is that most algorithms for tree construction are designed for continuous outcome variables. Hence, classical methods might not be appropriate if failure time data are measured on a discrete time scale (as is often the case in longitudinal studies where data are collected, e.g., quarterly or yearly). To address this issue, we develop a method for discrete survival tree construction. The proposed technique is based on the result that the likelihood of a discrete survival model is equivalent to the likelihood of a regression model for binary outcome data. Hence, we modify tree construction methods for binary outcomes such that they result in optimized partitions for the estimation of discrete hazard functions. By applying the proposed method to data from a randomized trial in patients with filarial lymphedema, we demonstrate how discrete survival trees can be used to identify clinically relevant patient groups with similar survival behavior.
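
    A sketch of the equivalence the method builds on: discrete survival data are expanded into binary person-period records, to which an off-the-shelf classification tree is fitted as a stand-in for the authors' tailored splitting rule. The data and column names (time, event, x) are synthetic:

        import numpy as np
        import pandas as pd
        from sklearn.tree import DecisionTreeClassifier

        def to_person_period(df):
            """Expand discrete survival data (time = 1,2,...; event = 0/1;
            plus covariates) into one binary row per subject and period, so
            that a binary classifier estimates the discrete hazard."""
            rows = []
            covars = [c for c in df.columns if c not in ("time", "event")]
            for _, r in df.iterrows():
                for t in range(1, int(r["time"]) + 1):
                    y = int(t == r["time"] and r["event"] == 1)
                    rows.append({**{c: r[c] for c in covars}, "period": t, "y": y})
            return pd.DataFrame(rows)

        rng = np.random.default_rng(0)
        df = pd.DataFrame({"x": rng.normal(size=300)})
        df["time"] = rng.integers(1, 9, size=300)      # discrete periods 1..8
        df["event"] = rng.integers(0, 2, size=300)     # 1 = event, 0 = censored

        long = to_person_period(df)
        tree = DecisionTreeClassifier(min_samples_leaf=50, max_depth=3)
        tree.fit(long[["x", "period"]], long["y"])
        print(tree.predict_proba(long[["x", "period"]])[:5, 1])   # estimated discrete hazards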

  9. A survival tree method for the analysis of discrete event times in clinical and epidemiological studies.

    PubMed

    Schmid, Matthias; Küchenhoff, Helmut; Hoerauf, Achim; Tutz, Gerhard

    2016-02-28

    Survival trees are a popular alternative to parametric survival modeling when there are interactions between the predictor variables or when the aim is to stratify patients into prognostic subgroups. A limitation of classical survival tree methodology is that most algorithms for tree construction are designed for continuous outcome variables. Hence, classical methods might not be appropriate if failure time data are measured on a discrete time scale (as is often the case in longitudinal studies where data are collected, e.g., quarterly or yearly). To address this issue, we develop a method for discrete survival tree construction. The proposed technique is based on the result that the likelihood of a discrete survival model is equivalent to the likelihood of a regression model for binary outcome data. Hence, we modify tree construction methods for binary outcomes such that they result in optimized partitions for the estimation of discrete hazard functions. By applying the proposed method to data from a randomized trial in patients with filarial lymphedema, we demonstrate how discrete survival trees can be used to identify clinically relevant patient groups with similar survival behavior. PMID:26358826

  10. NasoNet, modeling the spread of nasopharyngeal cancer with networks of probabilistic events in discrete time.

    PubMed

    Galán, S F; Aguado, F; Díez, F J; Mira, J

    2002-07-01

    The spread of cancer is a non-deterministic dynamic process. As a consequence, the design of an assistant system for the diagnosis and prognosis of the extent of a cancer should be based on a representation method that deals with both uncertainty and time. The ultimate goal is to know the stage of development of a cancer in a patient before selecting the appropriate treatment. A network of probabilistic events in discrete time (NPEDT) is a type of Bayesian network for temporal reasoning that models the causal mechanisms associated with the time evolution of a process. This paper describes NasoNet, a system that applies NPEDTs to the diagnosis and prognosis of nasopharyngeal cancer. We have made use of temporal noisy gates to model the dynamic causal interactions that take place in the domain. The methodology we describe is general enough to be applied to any other type of cancer.
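
    A toy numerical illustration of a network of probabilistic events in discrete time: each variable is the time slot at which an event occurs, a child's distribution is conditioned on its parent's time, and a prognosis-style query marginalizes over the parent. All probabilities are invented for illustration and are not NasoNet's:

        import numpy as np

        slots = 5                              # 4 time slots + "does not occur"
        p_primary = np.array([0.4, 0.3, 0.2, 0.05, 0.05])   # P(primary event at slot t)

        # P(regional spread at t | primary at s): spread can only follow the
        # primary event, with a fixed per-slot propagation probability.
        prop = 0.5
        cpt = np.zeros((slots, slots))
        for s in range(slots - 1):             # last slot means "never occurs"
            remaining = 1.0
            for t in range(s + 1, slots - 1):
                cpt[s, t] = remaining * prop
                remaining *= (1.0 - prop)
            cpt[s, slots - 1] = remaining
        cpt[slots - 1, slots - 1] = 1.0        # no primary event, no spread

        # Marginal distribution of the spread time (a prognosis-style query).
        p_spread = p_primary @ cpt
        print(p_spread, p_spread.sum())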

  11. Adaptive Neural Network-Based Event-Triggered Control of Single-Input Single-Output Nonlinear Discrete-Time Systems.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-01-01

    This paper presents a novel adaptive neural network (NN) control scheme for single-input, single-output uncertain nonlinear discrete-time systems with event-sampled NN inputs. In this control scheme, the feedback signals are transmitted, and the NN weights are tuned, in an aperiodic manner at the event-sampled instants. After reviewing the NN approximation property with event-sampled inputs, an adaptive state estimator (SE), consisting of linearly parameterized NNs, is utilized to approximate the unknown system dynamics in an event-sampled context. The SE is viewed as a model, and its approximated dynamics and the state vector between any two events are utilized for the event-triggered controller design. An adaptive event-trigger condition is derived by using both the estimated NN weights and a dead-zone operator to determine the event sampling instants. This condition both facilitates the NN approximation and reduces the transmission of feedback signals. The ultimate boundedness of both the NN weight estimation error and the system state vector is demonstrated through the Lyapunov approach. As expected, during an initial online learning phase, events are observed more frequently. Over time, with the convergence of the NN weights, the inter-event times increase, thereby lowering the number of triggered events. These claims are illustrated through the simulation results.
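
    A minimal sketch of an event-triggered discrete-time control loop in which the input is recomputed only when a trigger condition fires; a fixed relative threshold with a small dead-zone constant stands in for the paper's adaptive NN-based condition, and all numbers are illustrative:

        import numpy as np

        def run_event_triggered(A=0.9, B=0.1, K=-4.0, sigma=0.3, steps=50, x0=1.0):
            """Scalar linear plant x[k+1] = A*x[k] + B*u[k]; the control input is
            updated only when the state deviates from its last transmitted value
            by more than sigma*|x| plus a dead-zone constant."""
            x, x_sent = x0, x0
            u = K * x_sent
            events = 0
            for _ in range(steps):
                if abs(x - x_sent) > sigma * abs(x) + 1e-3:   # trigger condition
                    x_sent = x
                    u = K * x_sent                            # recompute control
                    events += 1
                x = A * x + B * u                             # plant update
            return events

        print(run_event_triggered())   # number of transmissions out of 50 steps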

  12. Characterization and event specific-detection by quantitative real-time PCR of T25 maize insert.

    PubMed

    Collonnier, Cécile; Schattner, Alexandra; Berthier, Georges; Boyer, Francine; Coué-Philippe, Géraldine; Diolez, Annick; Duplan, Marie-Noëlle; Fernandez, Sophie; Kebdani, Naïma; Kobilinsky, André; Romaniuk, Marcel; de Beuckeleer, Marc; de Loose, Marc; Windels, Pieter; Bertheau, Yves

    2005-01-01

    T25 is one of the four maize transformation events from which commercial lines have so far been authorized in Europe. It was created by polyethylene glycol-mediated transformation using a construct bearing one copy of the synthetic pat gene associated with both the promoter and terminator of the 35S ribosomal gene from cauliflower mosaic virus. In this article, we report the sequencing of the whole T25 insert and the characterization of its integration site using a genome walking strategy. Our results confirmed that one intact copy of the initial construct had been integrated into the plant genome. They also revealed, at the 5' junction of the insert, the presence of a second truncated 35S promoter, probably resulting from rearrangements that may have occurred before or during integration of the plasmid DNA. The analysis of the junction fragments showed that the integration site of the insert presented high homology with the Huck retrotransposon family. By using one primer annealing in the maize genome and the other in the 5' end of the integrated DNA, we developed a reliable event-specific detection system for T25 maize. To provide a means of complying with the European regulation, a real-time PCR test was designed for specific quantitation of the T25 event using TaqMan chemistry.
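
    A small sketch of the standard-curve arithmetic typically used for event-specific quantitation by real-time PCR; the slope and intercept values below are assumptions, not those of the validated T25 assay:

        import numpy as np

        def copies_from_ct(ct, slope, intercept):
            """Convert a Ct value to a copy number via a standard curve of the
            form Ct = slope * log10(copies) + intercept."""
            return 10 ** ((ct - intercept) / slope)

        # Quantitation is usually reported as the ratio of transgene copies to
        # copies of an endogenous maize reference gene (values are illustrative).
        t25 = copies_from_ct(ct=27.1, slope=-3.32, intercept=38.0)
        ref = copies_from_ct(ct=24.3, slope=-3.35, intercept=37.5)
        print(f"T25 content ~ {100 * t25 / ref:.1f} % (haploid-genome basis)")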

  13. Characterization and event specific-detection by quantitative real-time PCR of T25 maize insert.

    PubMed

    Collonnier, Cécile; Schattner, Alexandra; Berthier, Georges; Boyer, Francine; Coué-Philippe, Géraldine; Diolez, Annick; Duplan, Marie-Noëlle; Fernandez, Sophie; Kebdani, Naïma; Kobilinsky, André; Romaniuk, Marcel; de Beuckeleer, Marc; de Loose, Marc; Windels, Pieter; Bertheau, Yves

    2005-01-01

    T25 is one of the four maize transformation events from which commercial lines have so far been authorized in Europe. It was created by polyethylene glycol-mediated transformation using a construct bearing one copy of the synthetic pat gene associated with both the promoter and terminator of the 35S ribosomal gene from cauliflower mosaic virus. In this article, we report the sequencing of the whole T25 insert and the characterization of its integration site using a genome walking strategy. Our results confirmed that one intact copy of the initial construct had been integrated into the plant genome. They also revealed, at the 5' junction of the insert, the presence of a second truncated 35S promoter, probably resulting from rearrangements that may have occurred before or during integration of the plasmid DNA. The analysis of the junction fragments showed that the integration site of the insert presented high homology with the Huck retrotransposon family. By using one primer annealing in the maize genome and the other in the 5' end of the integrated DNA, we developed a reliable event-specific detection system for T25 maize. To provide a means of complying with the European regulation, a real-time PCR test was designed for specific quantitation of the T25 event using TaqMan chemistry. PMID:15859082

  14. Direct real-time detection of the structural and biochemical events in the myosin power stroke

    PubMed Central

    Muretta, Joseph M.; Rohde, John A.; Johnsrud, Daniel O.; Cornea, Sinziana; Thomas, David D.

    2015-01-01

    A principal goal of molecular biophysics is to show how protein structural transitions explain physiology. We have developed a strategic tool, transient time-resolved FRET [(TR)2FRET], for this purpose and use it here to measure directly, with millisecond resolution, the structural and biochemical kinetics of muscle myosin and to determine directly how myosin’s power stroke is coupled to the thermodynamic drive for force generation, actin-activated phosphate release, and the weak-to-strong actin-binding transition. We find that actin initiates the power stroke before phosphate dissociation and not after, as many models propose. This result supports a model for muscle contraction in which power output and efficiency are tuned by the distribution of myosin structural states. This technology should have wide application to other systems in which questions about the temporal coupling of allosteric structural and biochemical transitions remain unanswered. PMID:26578772

  15. Time scales of critical events around the Cretaceous-Paleogene boundary.

    PubMed

    Renne, Paul R; Deino, Alan L; Hilgen, Frederik J; Kuiper, Klaudia F; Mark, Darren F; Mitchell, William S; Morgan, Leah E; Mundil, Roland; Smit, Jan

    2013-02-01

    Mass extinctions manifest in Earth's geologic record were turning points in biotic evolution. We present (40)Ar/(39)Ar data that establish the synchrony of the Cretaceous-Paleogene boundary and associated mass extinctions with the Chicxulub bolide impact to within 32,000 years. Perturbation of the atmospheric carbon cycle at the boundary likely lasted less than 5000 years, exhibiting a recovery time scale two to three orders of magnitude shorter than that of the major ocean basins. Low-diversity mammalian fauna in the western Williston Basin persisted for as little as 20,000 years after the impact. The Chicxulub impact likely triggered a state shift of ecosystems already under near-critical stress.

  16. Direct real-time detection of the structural and biochemical events in the myosin power stroke.

    PubMed

    Muretta, Joseph M; Rohde, John A; Johnsrud, Daniel O; Cornea, Sinziana; Thomas, David D

    2015-11-17

    A principal goal of molecular biophysics is to show how protein structural transitions explain physiology. We have developed a strategic tool, transient time-resolved FRET [(TR)(2)FRET], for this purpose and use it here to measure directly, with millisecond resolution, the structural and biochemical kinetics of muscle myosin and to determine directly how myosin's power stroke is coupled to the thermodynamic drive for force generation, actin-activated phosphate release, and the weak-to-strong actin-binding transition. We find that actin initiates the power stroke before phosphate dissociation and not after, as many models propose. This result supports a model for muscle contraction in which power output and efficiency are tuned by the distribution of myosin structural states. This technology should have wide application to other systems in which questions about the temporal coupling of allosteric structural and biochemical transitions remain unanswered. PMID:26578772

  17. Statistics in review. Part 2: generalised linear models, time-to-event and time-series analysis, evidence synthesis and clinical trials.

    PubMed

    Moran, John L; Solomon, Patricia J

    2007-06-01

    In Part I, we reviewed graphical display and data summary, followed by a consideration of linear regression models. Generalised linear models, structured in terms of an exponential response distribution and link function, are now introduced, subsuming logistic and Poisson regression. Time-to-event ("survival") analysis is developed from basic principles of hazard rate, and survival, cumulative distribution and density functions. Semi-parametric (Cox) and parametric (accelerated failure time) regression models are contrasted. Time-series analysis is explicated in terms of trend, seasonal, and other cyclical and irregular components, and further illustrated by development of a classical Box-Jenkins ARMA (autoregressive moving average) model for monthly ICU-patient hospital mortality rates recorded over 11 years. Multilevel (random-effects) models and principles of meta-analysis are outlined, and the review concludes with a brief consideration of important statistical aspects of clinical trials: sample size determination, interim analysis and "early stopping".
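
    A brief illustration of the Box-Jenkins step mentioned above: fitting an ARMA(1,1) model to a simulated monthly rate series with statsmodels (the data are synthetic, not the ICU mortality series from the review):

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        # Simulate 11 years of monthly rates as an AR(1) process with drift.
        rng = np.random.default_rng(42)
        n = 132
        y = np.empty(n)
        y[0] = 0.10
        for t in range(1, n):
            y[t] = 0.02 + 0.8 * y[t - 1] + 0.01 * rng.standard_normal()

        model = ARIMA(y, order=(1, 0, 1))   # ARMA(1,1): no differencing (d = 0)
        fit = model.fit()
        print(fit.summary())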

  18. Storm-time Large-Scale Birkeland Currents: Salient Dynamics in Grand Challenge Events

    NASA Astrophysics Data System (ADS)

    Korth, H.; Anderson, B. J.; Waters, C. L.; Barnes, R. J.

    2015-12-01

    The Active Magnetosphere and Planetary Electrodynamics Response Experiment (AMPERE) provides continuous global observations of Birkeland currents on a 10-minute cadence. During geomagnetic storms, currents intensify to over 15 MA, are dynamic both in intensity and distribution, and exhibit features not discernible in statistical analyses. For all of the subject grand challenge storms, AMPERE data reveal a number of novel phenomena illustrating the profound dynamics of the storm-time system. Storm-time onsets associated with shock arrivals are often very prompt and lead to dramatic surges in total current from 1 MA to over 5 MA in less than 20 minutes. The current surges occur predominantly on the dayside at high latitudes prior to any ring current or auroral expansions, indicating that neutral density upwelling is often driven independently of ring current or auroral zone intensifications. Rapid reconfigurations of the currents with IMF BY reversals within the sheath structures of coronal mass ejections (CMEs) are also common. This implies that convection of ionospheric density patches over the polar cap may be quite complex, particularly during the early phase of geomagnetic storms related to the CME sheath passage. The 3 September 2012 storm exhibited intense driving with classic quasi-stable Region 1 and 2 currents spanning 55 to 70 degrees magnetic latitude for over 10 hours at the beginning of the day, corresponding to stable southward IMF prior to shock arrival at noon on that day. The shock arrival and IMF southward intensification led to further expansion of the currents below 50 degrees magnetic latitude and to episodic surges in currents on the nightside, which is unique to storms. The resulting current structure showed multiple large-scale alternations in current direction (downward-upward-downward-upward), a pattern that often occurs during intense, sustained driving in strong storms.

  19. Comparison of Patients with Parkinson's Disease or Cerebellar Lesions in the Production of Periodic Movements Involving Event-Based or Emergent Timing

    ERIC Educational Resources Information Center

    Spencer, R.M.C.; Ivry, R.B.

    2005-01-01

    We have hypothesized a distinction between the processes required to control the timing of different classes of periodic movements. In one class, salient events mark successive cycles. For these movements, we hypothesize that the temporal goal is a requisite component of the task representation, what we refer to as event-based timing. In the other…

  20. Single event time series analysis in a binary karst catchment evaluated using a groundwater model (Lurbach system, Austria).

    PubMed

    Mayaud, C; Wagner, T; Benischke, R; Birk, S

    2014-04-16

    The Lurbach karst system (Styria, Austria) is drained by two major springs and replenished by both autogenic recharge from the karst massif itself and a sinking stream that originates in low permeable schists (allogenic recharge). Detailed data from two events recorded during a tracer experiment in 2008 demonstrate that an overflow from one of the sub-catchments to the other is activated if the discharge of the main spring exceeds a certain threshold. Time series analysis (autocorrelation and cross-correlation) was applied to examine to what extent the various available methods support the identification of the transient inter-catchment flow observed in this binary karst system. As inter-catchment flow is found to be intermittent, the evaluation was focused on single events. In order to support the interpretation of the results from the time series analysis a simplified groundwater flow model was built using MODFLOW. The groundwater model is based on the current conceptual understanding of the karst system and represents a synthetic karst aquifer for which the same methods were applied. Using the wetting capability package of MODFLOW, the model simulated an overflow similar to what has been observed during the tracer experiment. Various intensities of allogenic recharge were employed to generate synthetic discharge data for the time series analysis. In addition, geometric and hydraulic properties of the karst system were varied in several model scenarios. This approach helps to identify effects of allogenic recharge and aquifer properties in the results from the time series analysis. Comparing the results from the time series analysis of the observed data with those of the synthetic data a good agreement was found. For instance, the cross-correlograms show similar patterns with respect to time lags and maximum cross-correlation coefficients if appropriate hydraulic parameters are assigned to the groundwater model. The comparable behaviors of the real and the
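
    A minimal sketch of the cross-correlation analysis applied to a single event: correlating allogenic recharge with spring discharge over positive lags, where the lag of the maximum indicates the system's response time. The lag range and the synthetic signals are assumptions:

        import numpy as np

        def cross_correlogram(recharge, discharge, max_lag=72):
            """Normalized cross-correlation for lags 0..max_lag (discharge
            lagging recharge), as used in single-event time series analysis."""
            r = np.asarray(recharge, float)
            q = np.asarray(discharge, float)
            r, q = r - r.mean(), q - q.mean()
            n = len(r)
            denom = r.std() * q.std() * n
            lags = np.arange(max_lag + 1)
            cc = np.array([np.sum(r[: n - k] * q[k:]) / denom for k in lags])
            return lags, cc

        # Synthetic example: discharge is a smoothed, noisy copy of recharge.
        rng = np.random.default_rng(3)
        recharge = rng.gamma(2.0, 1.0, 500)
        discharge = np.convolve(recharge, np.ones(12) / 12, mode="same") \
                    + 0.1 * rng.standard_normal(500)
        lags, cc = cross_correlogram(recharge, discharge)
        print(lags[np.argmax(cc)])   # estimated lag of maximum cross-correlation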

  1. Single event time series analysis in a binary karst catchment evaluated using a groundwater model (Lurbach system, Austria)

    PubMed Central

    Mayaud, C.; Wagner, T.; Benischke, R.; Birk, S.

    2014-01-01

    Summary The Lurbach karst system (Styria, Austria) is drained by two major springs and replenished by both autogenic recharge from the karst massif itself and a sinking stream that originates in low permeable schists (allogenic recharge). Detailed data from two events recorded during a tracer experiment in 2008 demonstrate that an overflow from one of the sub-catchments to the other is activated if the discharge of the main spring exceeds a certain threshold. Time series analysis (autocorrelation and cross-correlation) was applied to examine to what extent the various available methods support the identification of the transient inter-catchment flow observed in this binary karst system. As inter-catchment flow is found to be intermittent, the evaluation was focused on single events. In order to support the interpretation of the results from the time series analysis a simplified groundwater flow model was built using MODFLOW. The groundwater model is based on the current conceptual understanding of the karst system and represents a synthetic karst aquifer for which the same methods were applied. Using the wetting capability package of MODFLOW, the model simulated an overflow similar to what has been observed during the tracer experiment. Various intensities of allogenic recharge were employed to generate synthetic discharge data for the time series analysis. In addition, geometric and hydraulic properties of the karst system were varied in several model scenarios. This approach helps to identify effects of allogenic recharge and aquifer properties in the results from the time series analysis. Comparing the results from the time series analysis of the observed data with those of the synthetic data a good agreement was found. For instance, the cross-correlograms show similar patterns with respect to time lags and maximum cross-correlation coefficients if appropriate hydraulic parameters are assigned to the groundwater model. The comparable behaviors of the real and

  2. RARtool: A MATLAB Software Package for Designing Response-Adaptive Randomized Clinical Trials with Time-to-Event Outcomes

    PubMed Central

    Ryeznik, Yevgen; Sverdlov, Oleksandr; Wong, Weng Kee

    2016-01-01

    Response-adaptive randomization designs are becoming increasingly popular in clinical trial practice. In this paper, we present RARtool, a user interface software developed in MATLAB for designing response-adaptive randomized comparative clinical trials with censored time-to-event outcomes. The RARtool software can compute different types of optimal treatment allocation designs, and it can simulate response-adaptive randomization procedures targeting selected optimal allocations. Through simulations, an investigator can assess design characteristics under a variety of experimental scenarios and select the best procedure for practical implementation. We illustrate the utility of our RARtool software by redesigning a survival trial from the literature. PMID:26997924
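
    A hedged sketch of a generic response-adaptive allocation step for two arms with exponential time-to-event outcomes, allocating in proportion to the square root of the estimated mean survival; this is an illustrative rule, not one of RARtool's optimal designs:

        import numpy as np

        def next_allocation_prob(times_a, events_a, times_b, events_b):
            """Probability of assigning the next patient to arm A, based on the
            exponential MLE of mean survival (total exposure / number of events)
            in each arm; a simple illustrative response-adaptive rule."""
            mean_a = np.sum(times_a) / max(np.sum(events_a), 1)
            mean_b = np.sum(times_b) / max(np.sum(events_b), 1)
            return np.sqrt(mean_a) / (np.sqrt(mean_a) + np.sqrt(mean_b))

        rng = np.random.default_rng(7)
        t_a, e_a = rng.exponential(10.0, 30), np.ones(30)
        t_b, e_b = rng.exponential(6.0, 30), np.ones(30)
        print(next_allocation_prob(t_a, e_a, t_b, e_b))   # > 0.5 favours the better arm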

  3. Time and space correlation between sprites and lightning flash events for a storm case during HyMeX campaign

    NASA Astrophysics Data System (ADS)

    Soula, S.; Defer, E.; Fullekrug, M.; van der Velde, O.; Coquillat, S.; Pinty, J.; Rison, W.; Krehbiel, P. R.; Thomas, R. J.; Bousquet, O.; Pedeboy, S.

    2013-12-01

    During the SOP1 (Special Observation Period) of the HyMeX (Hydrology cycle in the Mediterranean eXperiment) campaign (September-November 2012), optical observations of sprite events were performed using low-light video cameras located in southern France. On the night of October 22nd-23rd, a storm developed along the coastline in southeastern France, mo