Sample records for "event timing based"

  1. Cognitive load and task condition in event- and time-based prospective memory: an experimental investigation.

    PubMed

    Khan, Azizuddin; Sharma, Narendra K; Dixit, Shikha

    2008-09-01

    Prospective memory is memory for the realization of delayed intentions. Researchers distinguish 2 kinds of prospective memory: event- and time-based (G. O. Einstein & M. A. McDaniel, 1990). Taking that distinction into account, the present authors explored participants' comparative performance under event- and time-based tasks. In an experimental study of 80 participants, the authors investigated the roles of cognitive load and task condition in prospective memory. Cognitive load (low vs. high) and task condition (event- vs. time-based task) were the independent variables. Accuracy in prospective memory was the dependent variable. Results showed significant differential effects under event- and time-based tasks. However, the effect of cognitive load was more detrimental in time-based prospective memory. Results also revealed that time monitoring is critical to successful time estimation and thus to time-based prospective memory. Similarly, participants' better performance on the event-based prospective memory task showed that they acted on the basis of environmental cues. Event-based prospective memory was environmentally cued; time-based prospective memory required self-initiation.

  2. Event- and time-triggered remembering: the impact of attention deficit hyperactivity disorder on prospective memory performance in children.

    PubMed

    Talbot, Karley-Dale S; Kerns, Kimberly A

    2014-11-01

    The current study examined prospective memory (PM, both time-based and event-based) and time estimation (TR, a time reproduction task) in children with and without attention deficit hyperactivity disorder (ADHD). This study also investigated the influence of task performance and TR on time-based PM in children with ADHD relative to controls. A sample of 69 children, aged 8 to 13 years, completed the CyberCruiser-II time-based PM task, a TR task, and the Super Little Fisherman event-based PM task. PM performance was compared with children's TR abilities, parental reports of daily prospective memory disturbances (Prospective and Retrospective Memory Questionnaire for Children, PRMQC), and ADHD symptomatology (Conners' rating scales). Children with ADHD scored more poorly on event-based PM, time-based PM, and TR; interestingly, TR did not appear related to performance on time-based PM. In addition, it was found that PRMQC scores and ADHD symptom severity were related to performance on the time-based PM task but not to performance on the event-based PM task. These results provide some limited support for theories that propose a distinction between event-based PM and time-based PM. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Time-based and event-based prospective memory in autism spectrum disorder: the roles of executive function and theory of mind, and time-estimation.

    PubMed

    Williams, David; Boucher, Jill; Lind, Sophie; Jarrold, Christopher

    2013-07-01

    Prospective memory (remembering to carry out an action in the future) has been studied relatively little in ASD. We explored time-based (carry out an action at a pre-specified time) and event-based (carry out an action upon the occurrence of a pre-specified event) prospective memory, as well as possible cognitive correlates, among 21 intellectually high-functioning children with ASD, and 21 age- and IQ-matched neurotypical comparison children. We found impaired time-based, but undiminished event-based, prospective memory among children with ASD. In the ASD group, time-based prospective memory performance was associated significantly with diminished theory of mind, but not with diminished cognitive flexibility. There was no evidence that time-estimation ability contributed to time-based prospective memory impairment in ASD.

  4. Event-Based Tone Mapping for Asynchronous Time-Based Image Sensor

    PubMed Central

    Simon Chane, Camille; Ieng, Sio-Hoi; Posch, Christoph; Benosman, Ryad B.

    2016-01-01

    The asynchronous time-based neuromorphic image sensor ATIS is an array of autonomously operating pixels able to encode luminance information with an exceptionally high dynamic range (>143 dB). This paper introduces an event-based methodology to display data from this type of event-based imager, taking into account the large dynamic range and high temporal accuracy that go beyond available mainstream display technologies. We introduce an event-based tone mapping methodology for asynchronously acquired time encoded gray-level data. A global and a local tone mapping operator are proposed. Both are designed to operate on a stream of incoming events rather than on time frame windows. Experimental results on real outdoor scenes are presented to evaluate the performance of the tone mapping operators in terms of quality, temporal stability, adaptation capability, and computational time. PMID:27642275
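
    A minimal sketch of the per-event global tone mapping idea follows: each incoming log-intensity event updates slowly adapting low/high bounds and is mapped to an 8-bit display value, with no frame window ever accumulated. The adaptation rule, rate, and class interface are illustrative assumptions, not the operators proposed in the paper.

    ```python
    # Hedged sketch: a global, frame-free tone mapper that adapts its dynamic
    # range event by event. The alpha rate and min/max forgetting rule are
    # illustrative assumptions, not the paper's actual operators.
    import numpy as np

    class EventToneMapper:
        def __init__(self, alpha=1e-3):
            self.alpha = alpha              # per-event adaptation rate (assumed)
            self.lo = self.hi = None

        def map_event(self, log_intensity):
            if self.lo is None:
                self.lo = self.hi = float(log_intensity)
            # Expand bounds immediately; forget old extremes slowly.
            if log_intensity < self.lo:
                self.lo = float(log_intensity)
            else:
                self.lo += self.alpha * (log_intensity - self.lo)
            if log_intensity > self.hi:
                self.hi = float(log_intensity)
            else:
                self.hi += self.alpha * (log_intensity - self.hi)
            span = max(self.hi - self.lo, 1e-9)
            return int(np.clip(255.0 * (log_intensity - self.lo) / span, 0, 255))
    ```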

  5. Time-Based and Event-Based Prospective Memory in Autism Spectrum Disorder: The Roles of Executive Function and Theory of Mind, and Time-Estimation

    ERIC Educational Resources Information Center

    Williams, David; Boucher, Jill; Lind, Sophie; Jarrold, Christopher

    2013-01-01

    Prospective memory (remembering to carry out an action in the future) has been studied relatively little in ASD. We explored time-based (carry out an action at a pre-specified time) and event-based (carry out an action upon the occurrence of a pre-specified event) prospective memory, as well as possible cognitive correlates, among 21…

  6. On the Application of Different Event-Based Sampling Strategies to the Control of a Simple Industrial Process

    PubMed Central

    Sánchez, José; Guarnes, Miguel Ángel; Dormido, Sebastián

    2009-01-01

    This paper is an experimental study of the utilization of different event-based strategies for the automatic control of a simple but very representative industrial process: the level control of a tank. In an event-based control approach it is the triggering of a specific event, and not the passage of time, that instructs the sensor to send the current state of the process to the controller, and the controller to compute a new control action and send it to the actuator. In the document, five control strategies based on different event-based sampling techniques are described, compared, and contrasted with a classical time-based control approach and a hybrid one. The common denominator in the time, the hybrid, and the event-based control approaches is the controller: a proportional-integral algorithm with adaptations depending on the selected control approach. To compare and contrast each one of the hybrid and the pure event-based control algorithms with the time-based counterpart, the two tasks that a control strategy must achieve (set-point following and disturbance rejection) are independently analyzed. The experimental study provides new proof concerning the ability of event-based control strategies to minimize the data exchange among the control agents (sensors, controllers, actuators) when error-free control of the process is not a hard requirement. PMID:22399975
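
    One member of this family of techniques, send-on-delta sampling, is easy to sketch around a PI loop: the sensor transmits, and the controller recomputes, only when the level has drifted more than a threshold since the last transmitted sample. The tank model, gains, and threshold below are illustrative assumptions, not the paper's experimental rig.

    ```python
    # Hedged sketch of send-on-delta (event-based) sampling around a PI level
    # controller; the first-order tank model and all parameters are illustrative.
    def simulate(delta=0.05, dt=0.01, t_end=50.0, setpoint=1.0, kp=2.0, ki=0.5):
        level, integ, u = 0.0, 0.0, 0.0
        last_sent, transmissions = None, 0
        for _ in range(int(t_end / dt)):
            # Event condition: level deviates from the last transmitted value.
            if last_sent is None or abs(level - last_sent) > delta:
                last_sent = level
                transmissions += 1
                err = setpoint - level
                integ += err * dt
                u = kp * err + ki * integ       # PI action, updated at events only
            # Simple tank dynamics: inflow u, outflow proportional to level.
            level += (u - 0.5 * level) * dt
        return level, transmissions

    level, n_tx = simulate()
    print(f"final level {level:.3f} after {n_tx} transmissions")
    ```

    Shrinking delta recovers near-periodic sampling; growing it trades control accuracy for fewer sensor-controller-actuator exchanges, which is the trade-off the study quantifies.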

  7. Event Classification and Identification Based on the Characteristic Ellipsoid of Phasor Measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Jian; Diao, Ruisheng; Makarov, Yuri V.

    2011-09-23

    In this paper, a method to classify and identify power system events based on the characteristic ellipsoid of phasor measurement is presented. The decision tree technique is used to perform the event classification and identification. Event types, event locations and clearance times are identified by decision trees based on the indices of the characteristic ellipsoid. A sufficiently large number of transient events were simulated on the New England 10-machine 39-bus system based on different system configurations. Transient simulations taking into account different event types, clearance times and various locations are conducted to simulate phasor measurement. Bus voltage magnitudes and recorded reactive and active power flows are used to build the characteristic ellipsoid. The volume, eccentricity, center and projection of the longest axis in the parameter space coordinates of the characteristic ellipsoids are used to classify and identify events. Results demonstrate that the characteristic ellipsoid and the decision tree are capable of detecting the event type, location, and clearance time with very high accuracy.

  8. A SAS-based solution to evaluate study design efficiency of phase I pediatric oncology trials via discrete event simulation.

    PubMed

    Barrett, Jeffrey S; Jayaraman, Bhuvana; Patel, Dimple; Skolnik, Jeffrey M

    2008-06-01

    Previous exploration of oncology study design efficiency has focused on Markov processes alone (probability-based events) without consideration for time dependencies. Barriers to study completion include time delays associated with patient accrual, inevaluability (IE), time to dose limiting toxicities (DLT) and administrative and review time. Discrete event simulation (DES) can incorporate probability-based assignment of DLT and IE frequency, correlated with cohort in the case of DLT, with time-based events defined by stochastic relationships. A SAS-based solution to examine study efficiency metrics and evaluate design modifications that would improve study efficiency is presented. Virtual patients are simulated with attributes defined from prior distributions of relevant patient characteristics. Study population datasets are read into SAS macros which select patients and enroll them into a study based on the specific design criteria if the study is open to enrollment. Waiting times, arrival times and time to study events are also sampled from prior distributions; post-processing of study simulations is provided within the decision macros and compared across designs in a separate post-processing algorithm. This solution is examined via comparison of the standard 3+3 decision rule relative to the "rolling 6" design, a newly proposed enrollment strategy for the phase I pediatric oncology setting.
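
    The discrete event simulation idea can be sketched in miniature (in Python rather than SAS): a priority queue of time-stamped arrival and evaluation events, drawn from prior distributions, drives enrollment until the cohort is complete. All distributions and parameters below are illustrative assumptions, not the macros described in the abstract.

    ```python
    # Hedged sketch of discrete event simulation for phase I accrual, assuming
    # exponential inter-arrival and evaluation times; numbers are illustrative.
    import heapq, random

    def simulate_accrual(n_required=6, mean_arrival=14.0, mean_eval=28.0,
                         p_inevaluable=0.1, seed=1):
        random.seed(seed)
        t, enrolled, evaluable = 0.0, 0, 0
        events = [(random.expovariate(1 / mean_arrival), "arrival")]  # (time, kind)
        while evaluable < n_required:
            t, kind = heapq.heappop(events)
            if kind == "arrival":
                enrolled += 1
                # Evaluation completes after a stochastic delay; next patient queued.
                heapq.heappush(events, (t + random.expovariate(1 / mean_eval), "eval"))
                heapq.heappush(events, (t + random.expovariate(1 / mean_arrival), "arrival"))
            elif random.random() > p_inevaluable:   # evaluation outcome
                evaluable += 1
        return t, enrolled

    days, n = simulate_accrual()
    print(f"cohort complete at day {days:.0f} with {n} patients enrolled")
    ```

    Comparing designs such as 3+3 versus rolling 6 then amounts to changing the rule that gates enrollment while earlier patients are still under evaluation.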

  9. The role of musical training in emergent and event-based timing.

    PubMed

    Baer, L H; Thibodeau, J L N; Gralnick, T M; Li, K Z H; Penhune, V B

    2013-01-01

    Musical performance is thought to rely predominantly on event-based timing involving a clock-like neural process and an explicit internal representation of the time interval. Some aspects of musical performance may rely on emergent timing, which is established through the optimization of movement kinematics, and can be maintained without reference to any explicit representation of the time interval. We predicted that musical training would have its largest effect on event-based timing, supporting the dissociability of these timing processes and the dominance of event-based timing in musical performance. We compared 22 musicians and 17 non-musicians on the prototypical event-based timing task of finger tapping and on the typically emergently timed task of circle drawing. For each task, participants first responded in synchrony with a metronome (Paced) and then responded at the same rate without the metronome (Unpaced). Analyses of the Unpaced phase revealed that non-musicians were more variable in their inter-response intervals for finger tapping compared to circle drawing. Musicians did not differ between the two tasks. Between groups, non-musicians were more variable than musicians for tapping but not for drawing. We were able to show that the differences were due to less timer variability in musicians on the tapping task. Correlational analyses of movement jerk and inter-response interval variability revealed a negative association for tapping and a positive association for drawing in non-musicians only. These results suggest that musical training affects temporal variability in tapping but not drawing. Additionally, musicians and non-musicians may be employing different movement strategies to maintain accurate timing in the two tasks. These findings add to our understanding of how musical training affects timing and support the dissociability of event-based and emergent timing modes.

  10. Distinct and shared cognitive functions mediate event- and time-based prospective memory impairment in normal ageing

    PubMed Central

    Gonneaud, Julie; Kalpouzos, Grégoria; Bon, Laetitia; Viader, Fausto; Eustache, Francis; Desgranges, Béatrice

    2011-01-01

    Prospective memory (PM) is the ability to remember to perform an action at a specific point in the future. Regarded as multidimensional, PM involves several cognitive functions that are known to be impaired in normal aging. In the present study, we set out to investigate the cognitive correlates of PM impairment in normal aging. Manipulating cognitive load, we assessed event- and time-based PM, as well as several cognitive functions, including executive functions, working memory and retrospective episodic memory, in healthy subjects spanning the entire adult lifespan. We found that normal aging was characterized by PM decline in all conditions and that event-based PM was more sensitive to the effects of aging than time-based PM. Across conditions, PM was linked to inhibition and processing speed. However, while event-based PM was mainly mediated by binding and retrospective memory processes, time-based PM was mainly related to inhibition. The only distinction between high- and low-load PM cognitive correlates lay in an additional, but marginal, correlation between updating and the high-load PM condition. The association of distinct cognitive functions, as well as shared mechanisms, with event- and time-based PM confirms that each type of PM relies on a different set of processes. PMID:21678154

  11. Time Here, Time There, Time Everywhere: Teaching Young Children Time through Daily Routine

    ERIC Educational Resources Information Center

    Lee, Joohi; Lee, Joo Ok; Fox, Jill

    2009-01-01

    According to Piaget, 5- or 6-year-old children gradually acquire the concept of time based on events (Piaget, 1969). In his experiment of investigating children's time concepts, Piaget found that children of these ages were able to place pictures based on sequential events with some errors; the younger children made more errors. The National…

  12. Element analysis: a wavelet-based method for analysing time-localized events in noisy time series.

    PubMed

    Lilly, Jonathan M

    2017-04-01

    A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized 'events'. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event's 'region of influence' within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry.
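
    The core detection step can be caricatured as follows: convolve the series with analytic wavelets at several scales and keep local maxima of the transform modulus that clear a significance threshold. A Morlet-like wavelet stands in here for the generalized Morse family, and the fixed threshold is an assumed placeholder for the noise-based false-detection criterion.

    ```python
    # Hedged sketch of transform-maxima event detection; a Morlet-like analytic
    # wavelet stands in for the generalized Morse wavelets used in the paper.
    import numpy as np

    def analytic_wavelet(scale, n=512, w0=6.0):
        t = (np.arange(n) - n / 2) / scale
        return np.exp(1j * w0 * t - 0.5 * t**2) / np.sqrt(scale)

    def event_maxima(x, scales, threshold):
        """Local maxima of |Wx| above threshold (x assumed longer than n)."""
        hits = []
        for s in scales:
            m = np.abs(np.convolve(x, analytic_wavelet(s), mode="same"))
            peaks = np.where((m[1:-1] > m[:-2]) & (m[1:-1] > m[2:]) &
                             (m[1:-1] > threshold))[0] + 1
            hits += [(int(i), s, float(m[i])) for i in peaks]
        return hits
    ```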

  13. GPS-based PWV for precipitation forecasting and its application to a typhoon event

    NASA Astrophysics Data System (ADS)

    Zhao, Qingzhi; Yao, Yibin; Yao, Wanqiang

    2018-01-01

    The temporal variability of precipitable water vapour (PWV) derived from Global Navigation Satellite System (GNSS) observations can be used to forecast precipitation events. A number of case studies of precipitation events have been analysed in Zhejiang Province, and a forecasting method for precipitation events was proposed. The PWV time series retrieved from the Global Positioning System (GPS) observations was processed by using a least-squares fitting method, so as to obtain the trend of ascents and descents in PWV. The increment of PWV for a short time (two to six hours) and PWV slope for a longer time (a few hours to more than ten hours) during the PWV ascending period are considered as predictive factors with which to forecast the precipitation event. The numerical results show that about 80%-90% of precipitation events and more than 90% of heavy rain events can be forecasted two to six hours in advance using the proposed method. 5-minute PWV data derived from GPS observations based on real-time precise point positioning (RT-PPP) were used for the typhoon event that passed over Zhejiang Province between 10 and 12 July, 2015. A good result was obtained using the proposed method: about 74% of precipitation events were predicted some ten to thirty minutes before their onset, with a false alarm rate of 18%. This study shows that GPS-based PWV is promising for short-term precipitation forecasting and nowcasting.
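
    A minimal sketch of the two predictive factors (short-time PWV increment, longer-time least-squares slope) follows; the window lengths and thresholds are illustrative assumptions, not the values calibrated on the Zhejiang data.

    ```python
    # Hedged sketch of the slope-plus-increment predictor on a 5-minute PWV
    # series; all windows and thresholds are illustrative assumptions.
    import numpy as np

    def forecast_flags(pwv, dt_min=5.0, slope_win=36, inc_win=24,
                       slope_thr=0.05, inc_thr=5.0):
        """Flag epochs where the fitted PWV slope (mm/min over ~3 h) and the
        short-term increment (mm over ~2 h) both exceed thresholds."""
        flags = np.zeros(len(pwv), dtype=bool)
        t = np.arange(slope_win) * dt_min
        for i in range(slope_win, len(pwv)):
            slope = np.polyfit(t, pwv[i - slope_win:i], 1)[0]  # LS trend
            increment = pwv[i] - pwv[max(0, i - inc_win)]
            flags[i] = slope > slope_thr and increment > inc_thr
        return flags
    ```

    An alert raised at the first flagged epoch gives the hours-ahead lead time described above.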

  14. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data.

    PubMed

    Nasejje, Justine B; Mwambi, Henry; Dheda, Keertan; Lesosky, Maia

    2017-07-28

    Random survival forest (RSF) models have been identified as alternative methods to the Cox proportional hazards model in analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points and hence conditional inference forests for time-to-event data have been suggested. Conditional inference forests (CIF) are known to correct the bias in RSF models by separating the procedure for the best covariate to split on from that of the best split point search for the selected covariate. In this study, we compare the random survival forest model to the conditional inference forest model (CIF) using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset is based on the survival of children under five years of age in Uganda and it consists of categorical covariates with most of them having more than two levels (many split-points). The second dataset is based on the survival of patients with extremely drug resistant tuberculosis (XDR TB) which consists of mainly categorical covariates with two levels (few split-points). The study findings indicate that the conditional inference forest model is superior to random survival forest models in analysing time-to-event data that consists of covariates with many split-points based on the values of the bootstrap cross-validated estimates for integrated Brier scores. However, conditional inference forests perform comparably to random survival forest models in analysing time-to-event data consisting of covariates with fewer split-points. Although survival forests are promising methods in analysing time-to-event data, it is important to identify the best forest model for analysis based on the nature of covariates of the dataset in question.

  15. Visual tracking using neuromorphic asynchronous event-based cameras.

    PubMed

    Ni, Zhenjiang; Ieng, Sio-Hoi; Posch, Christoph; Régnier, Stéphane; Benosman, Ryad

    2015-04-01

    This letter presents a novel computationally efficient and robust pattern tracking method based on time-encoded, frame-free visual data. Recent interdisciplinary developments, combining inputs from engineering and biology, have yielded a novel type of camera that encodes visual information into a continuous stream of asynchronous, temporal events. These events encode temporal contrast and intensity locally in space and time. We show that the sparse yet accurately timed information is well suited as a computational input for object tracking. In this letter, visual data processing is performed for each incoming event at the time it arrives. The method provides a continuous and iterative estimation of the geometric transformation between the model and the events representing the tracked object. It can handle isometry, similarities, and affine distortions and allows for unprecedented real-time performance at equivalent frame rates in the kilohertz range on a standard PC. Furthermore, by using the dimension of time that is currently underexploited by most artificial vision systems, the method we present is able to solve ambiguous cases of object occlusions that classical frame-based techniques handle poorly.

  16. HOTS: A Hierarchy of Event-Based Time-Surfaces for Pattern Recognition.

    PubMed

    Lagorce, Xavier; Orchard, Garrick; Galluppi, Francesco; Shi, Bertram E; Benosman, Ryad B

    2017-07-01

    This paper describes novel event-based spatio-temporal features called time-surfaces and how they can be used to create a hierarchical event-based pattern recognition architecture. Unlike existing hierarchical architectures for pattern recognition, the presented model relies on a time oriented approach to extract spatio-temporal features from the asynchronously acquired dynamics of a visual scene. These dynamics are acquired using biologically inspired frameless asynchronous event-driven vision sensors. Similarly to cortical structures, subsequent layers in our hierarchy extract increasingly abstract features using increasingly large spatio-temporal windows. The central concept is to use the rich temporal information provided by events to create contexts in the form of time-surfaces which represent the recent temporal activity within a local spatial neighborhood. We demonstrate that this concept can robustly be used at all stages of an event-based hierarchical model. First layer feature units operate on groups of pixels, while subsequent layer feature units operate on the output of lower level feature units. We report results on a previously published 36 class character recognition task and a four class canonical dynamic card pip task, achieving near 100 percent accuracy on each. We introduce a new seven class moving face recognition task, achieving 79 percent accuracy.
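
    The central data structure is compact enough to sketch: a map of per-pixel last-event timestamps, exponentially decayed relative to each incoming event. The sensor size, decay constant, and neighborhood radius below are illustrative assumptions.

    ```python
    # Hedged sketch of a single time-surface: per-pixel timestamps of the most
    # recent events, exponentially decayed around an incoming event.
    # H, W, TAU, and R are illustrative assumptions.
    import numpy as np

    H, W, TAU, R = 128, 128, 50e-3, 4
    last_ts = np.full((H, W), -np.inf)       # last event time per pixel

    def time_surface(t, x, y):
        """Time-surface patch around the event at (t, x, y), clipped at borders."""
        last_ts[y, x] = t
        patch = last_ts[max(0, y - R):y + R + 1, max(0, x - R):x + R + 1]
        return np.exp((patch - t) / TAU)     # 1 at the event, ~0 for stale pixels

    surface = time_surface(0.010, 64, 64)
    ```

    In the hierarchy described above, such patches are what first-layer feature units cluster; higher layers repeat the construction on feature activations with larger radii and time constants.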

  17. Serial recall and presentation schedule: a micro-analysis of local distinctiveness.

    PubMed

    Lewandowsky, Stephan; Brown, Gordon D A

    2005-01-01

    According to temporal distinctiveness theories, items that are temporally isolated from their neighbours during presentation are more distinct and thus are recalled better. Event-based theories, which deny that elapsed time plays a role at encoding, explain isolation effects by assuming that temporal isolation provides extra time for rehearsal or consolidation of encoding. The two classes of theories can be differentiated by examining the symmetry of isolation effects: Event-based accounts predict that performance should be affected only by pauses following item presentation (because they allow time for rehearsal or consolidation), whereas distinctiveness predicts that items should also benefit from preceding pauses. The first experiment manipulated inter-item intervals and showed an effect of intervals following but not preceding presentation, in line with event-based accounts. The second experiment showed that the effect of following interval was abolished by articulatory suppression. The data are consistent with event-based theories but can be handled by time-based distinctiveness models if they allow for additional encoding during inter-item pauses.

  18. Application of process monitoring to anomaly detection in nuclear material processing systems via system-centric event interpretation of data from multiple sensors of varying reliability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao

    In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a systemcentric level formulated in a hybrid framework. It utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit from not only known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing and results are discussed.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason L. Wright; Milos Manic

    Time synchronization and event time correlation are important in wireless sensor networks. In particular, time is used to create a sequence of events or a timeline to answer questions of cause and effect. Time is also used as a basis for determining the freshness of received packets and the validity of cryptographic certificates. This paper presents a secure method of time synchronization and event time correlation for TESLA-based hierarchical wireless sensor networks. The method demonstrates that events in a TESLA network can be accurately timestamped by adding only a few pieces of data to the existing protocol.

  20. A Long, Long Time Ago: Student Perceptions of Geologic Time Using a 45.6-foot-long Timeline

    NASA Astrophysics Data System (ADS)

    Gehman, J. R.; Johnson, E. A.

    2008-12-01

    In this study we investigated preconceptions of geologic time held by students in five large (50-115 students each) sections of introductory geology and Earth science courses. Students were randomly divided into groups of eleven individuals, and each group was assigned a separate timeline made from a roll of adding machine paper. Students were encouraged to work as a group to place the eleven geological or biological events where they thought they should belong on their timeline based only on their previous knowledge of geologic time. Geologic events included "Oldest Known Earth Rock" and "The Colorado River Begins to Form the Grand Canyon" while biological events included such milestones as "First Fish," "Dinosaurs go Extinct," and "First Modern Humans." Students were asked in an anonymous survey how they decided to place the events on the timeline in this initial exercise. After the eleven event cards were clipped to the timeline and marks were made to record the initial location of each event, students returned to the classroom and were provided with a scale and the correct dates for the events. Each paper timeline was 45.6 ft. long to represent the 4.56 billion years of Earth history (each one-foot-wide floor tile in the hallways outside the classroom equals 100 million years). Students then returned to their timelines and moved the event cards to the correct locations. At the end of the exercise, survey questions and the paper timelines with the markings of the original position of geologic events were collected and compiled. Analysis of the timeline data based on previous knowledge revealed that no group of students arranged all of the events in the proper sequence, although several groups misplaced only two events in relative order. Students consistently placed events further back in time than their correct locations based on absolute age dates. The survey revealed that several student groups used one "old" event such as the "First Dinosaurs Appear" or "Oldest Known Earth Rock" as a marker from which they based relative placement of other events on the timeline. The most recent events including "First Modern Humans" showed the greatest percentage error of placement.
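
    The scale arithmetic behind the exercise is simple: 45.6 ft stands for 4.56 billion years, so one foot (one floor tile) equals 100 million years. A worked conversion, with event ages taken as round figures for illustration:

    ```python
    # Worked example of the timeline scale: 45.6 ft represents 4560 Myr,
    # i.e. 0.01 ft per Myr. Ages here are illustrative round figures.
    SCALE_FT_PER_MYR = 45.6 / 4560.0

    def position_ft(age_myr):
        """Distance from the 'present' end of the timeline, in feet."""
        return age_myr * SCALE_FT_PER_MYR

    print(position_ft(4560))   # formation of Earth: 45.6 ft
    print(position_ft(66))     # dinosaurs go extinct: ~0.66 ft, under one tile
    ```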

  1. A Web Portal-Based Time-Aware KML Animation Tool for Exploring Spatiotemporal Dynamics of Hydrological Events

    NASA Astrophysics Data System (ADS)

    Bao, X.; Cai, X.; Liu, Y.

    2009-12-01

    Understanding spatiotemporal dynamics of hydrological events such as storms and droughts is highly valuable for decision making on disaster mitigation and recovery. Virtual Globe-based technologies such as Google Earth and Open Geospatial Consortium KML standards show great promise for collaborative exploration of such events using visual analytical approaches. However, there are currently two barriers to wider usage of such approaches. First, there is no easy way to use open source tools to convert legacy or existing data formats such as shapefiles, GeoTIFF, or web services-based data sources to KML and to produce time-aware KML files. Second, an integrated web portal-based time-aware animation tool is currently not available. Thus users usually share their files in the portal but have no means to visually explore them without leaving the portal environment with which they are familiar. We develop a web portal-based time-aware KML animation tool for viewing extreme hydrologic events. The tool is based on Google Earth JavaScript API and Java Portlet standard 2.0 JSR-286, and it is currently deployable in one of the most popular open source portal frameworks, namely Liferay. We have also developed an open source toolkit kml-soc-ncsa (http://code.google.com/p/kml-soc-ncsa/) to facilitate the conversion of multiple formats into KML and the creation of time-aware KML files. We illustrate our tool using example cases in which drought and storm events can be explored in both time and space within the web-based KML animation portlet. The tool provides an easy-to-use web browser-based portal environment for multiple users to collaboratively share and explore their time-aware KML files as well as to improve understanding of the spatiotemporal dynamics of hydrological events.

  2. Adaptive Event-Triggered Control Based on Heuristic Dynamic Programming for Nonlinear Discrete-Time Systems.

    PubMed

    Dong, Lu; Zhong, Xiangnan; Sun, Changyin; He, Haibo

    2017-07-01

    This paper presents the design of a novel adaptive event-triggered control method based on the heuristic dynamic programming (HDP) technique for nonlinear discrete-time systems with unknown system dynamics. In the proposed method, the control law is only updated when the event-triggered condition is violated. Compared with the periodic updates in the traditional adaptive dynamic programming (ADP) control, the proposed method can reduce the computation and transmission cost. An actor-critic framework is used to learn the optimal event-triggered control law and the value function. Furthermore, a model network is designed to estimate the system state vector. The main contribution of this paper is to design a new trigger threshold for discrete-time systems. A detailed Lyapunov stability analysis shows that our proposed event-triggered controller can asymptotically stabilize the discrete-time systems. Finally, we test our method on two different discrete-time systems, and the simulation results are included.
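
    The triggering mechanism itself can be sketched independently of the learning: recompute the control only when the gap between the current state and the last sampled state crosses a threshold. The linear plant, fixed gain, and constant threshold below are illustrative stand-ins for the learned HDP controller and its designed trigger threshold.

    ```python
    # Hedged sketch of an event-triggered update rule on a toy linear
    # discrete-time system; A, B, K, and the threshold are illustrative.
    import numpy as np

    A = np.array([[1.0, 0.1], [0.0, 0.98]])
    B = np.array([[0.0], [0.1]])
    K = np.array([[0.8, 1.2]])                   # placeholder feedback gain

    x = np.array([[1.0], [0.0]])
    x_sampled = x.copy()
    u = -K @ x_sampled
    updates = 0
    for _ in range(200):
        if np.linalg.norm(x - x_sampled) > 0.05:  # event-triggered condition
            x_sampled = x.copy()
            u = -K @ x_sampled                    # recompute at events only
            updates += 1
        x = A @ x + B @ u                         # hold u between events
    print(f"{updates} control updates over 200 steps")
    ```

    The point mirrored from the abstract is that far fewer than 200 updates occur, cutting computation and transmission relative to a periodic ADP update.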

  3. Prospective Memory Impairments in Alzheimer's Disease and Behavioral Variant Frontotemporal Dementia: Clinical and Neural Correlates.

    PubMed

    Dermody, Nadene; Hornberger, Michael; Piguet, Olivier; Hodges, John R; Irish, Muireann

    2016-01-01

    Prospective memory (PM) refers to a future-oriented form of memory in which the individual must remember to execute an intended action either at a future point in time (Time-based) or in response to a specific event (Event-based). Lapses in PM are commonly exhibited in neurodegenerative disorders including Alzheimer's disease (AD) and frontotemporal dementia (FTD); however, the neurocognitive mechanisms driving these deficits remain unknown. We investigated the clinical and neural correlates of Time- and Event-based PM disruption in AD and behavioral-variant FTD (bvFTD). Twelve AD, 12 bvFTD, and 12 healthy older Control participants completed a modified version of the Cambridge Prospective Memory test, which examines Time- and Event-based aspects of PM. All participants completed a standard neuropsychological assessment and underwent whole-brain structural MRI. AD and bvFTD patients displayed striking impairments across Time- and Event-based PM relative to Controls; however, Time-based PM was disproportionately affected in the AD group. Episodic memory dysfunction and hippocampal atrophy were found to correlate strongly with PM integrity in both patient groups; however, dissociable neural substrates were also evident for PM performance across dementia syndromes. Our study reveals the multifaceted nature of PM dysfunction in neurodegenerative disorders, and suggests common and dissociable neurocognitive mechanisms that subtend these deficits in each patient group. Future studies of PM disturbance in dementia syndromes will be crucial for the development of successful interventions to improve functional independence in the patient's daily life.

  4. Event-based Sensing for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Cohen, G.; Afshar, S.; van Schaik, A.; Wabnitz, A.; Bessell, T.; Rutten, M.; Morreale, B.

    A revolutionary type of imaging device, known as a silicon retina or event-based sensor, has recently been developed and is gaining in popularity in the field of artificial vision systems. These devices are inspired by a biological retina and operate in a significantly different way to traditional CCD-based imaging sensors. While a CCD produces frames of pixel intensities, an event-based sensor produces a continuous stream of events, each of which is generated when a pixel detects a change in log light intensity. These pixels operate asynchronously and independently, producing an event-based output with high temporal resolution. There are also no fixed exposure times, allowing these devices to offer a very high dynamic range independently for each pixel. Additionally, these devices offer high-speed, low power operation and a sparse spatiotemporal output. As a consequence, the data from these sensors must be interpreted in a significantly different way to traditional imaging sensors and this paper explores the advantages this technology provides for space imaging. The applicability and capabilities of event-based sensors for SSA applications are demonstrated through telescope field trials. Trial results have confirmed that the devices are capable of observing resident space objects from LEO through to GEO orbital regimes. Significantly, observations of RSOs were made during both day-time and nighttime (terminator) conditions without modification to the camera or optics. The event-based sensor's ability to image stars and satellites during day-time hours offers a dramatic capability increase for terrestrial optical sensors. This paper shows the field testing and validation of two different architectures of event-based imaging sensors. An event-based sensor's asynchronous output has an intrinsically low data rate. In addition to low-bandwidth communications requirements, their low weight, low power, and high speed make them ideally suited to the demanding challenges of space-based SSA systems. Results from these experiments and the systems developed highlight the applicability of event-based sensors to ground and space-based SSA tasks.

  5. DEVS representation of dynamical systems - Event-based intelligent control. [Discrete Event System Specification

    NASA Technical Reports Server (NTRS)

    Zeigler, Bernard P.

    1989-01-01

    It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied in advanced robotic and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.

  6. Prospective memory in first-degree relatives of patients with schizophrenia.

    PubMed

    Saleem, Saima; Kumar, Devvarta; Venkatasubramanian, Ganesan

    2017-12-07

    Among various cognitive impairments in schizophrenia, prospective memory (ProM) deficit is unequivocally established. However, there is a paucity of research examining whether ProM impairment can be considered a cognitive endophenotypic marker in schizophrenia. An important step toward this is to assess the status of ProM in first-degree relatives (FDRs) of patients with schizophrenia. Keeping this in view, the present study was conducted to assess event- and time-based ProM in FDRs of patients with schizophrenia. Twenty patients with schizophrenia, 20 FDRs of these patients, and 20 nonpsychiatric (healthy) controls were administered event- and time-based ProM tasks. Findings show that the FDRs had poorer performance on the event-based ProM task in comparison to healthy controls. On the time-based task, though the FDRs performed poorly in comparison to healthy controls, the difference was statistically non-significant. The patient group performed more poorly than healthy controls on both event- and time-based tasks. Findings of the present study indicate that the FDRs of patients with schizophrenia exhibit ProM impairment, though to a lesser degree than the patients with schizophrenia.

  7. Scalable and responsive event processing in the cloud

    PubMed Central

    Suresh, Visalakshmi; Ezhilchelvan, Paul; Watson, Paul

    2013-01-01

    Event processing involves continuous evaluation of queries over streams of events. Response-time optimization is traditionally done over a fixed set of nodes and/or by using metrics measured at query-operator levels. Cloud computing makes it easy to acquire and release computing nodes as required. Leveraging this flexibility, we propose a novel, queueing-theory-based approach for meeting specified response-time targets against fluctuating event arrival rates by drawing only the necessary amount of computing resources from a cloud platform. In the proposed approach, the entire processing engine of a distinct query is modelled as an atomic unit for predicting response times. Several such units hosted on a single node are modelled as a multiple class M/G/1 system. These aspects eliminate intrusive, low-level performance measurements at run-time, and also offer portability and scalability. Using model-based predictions, cloud resources are efficiently used to meet response-time targets. The efficacy of the approach is demonstrated through cloud-based experiments. PMID:23230164
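
    The flavour of the queueing-theoretic prediction can be sketched with the classical Pollaczek-Khinchine formula for a single M/G/1 class; the even load split across nodes and the provisioning loop below are illustrative assumptions, not the paper's multi-class policy.

    ```python
    # Hedged sketch: predict M/G/1 mean response time and pick the smallest
    # node count meeting a target. All rates and moments are illustrative.
    def mg1_response_time(lam, mean_s, second_moment_s):
        """Mean response time of an M/G/1 queue (Pollaczek-Khinchine)."""
        rho = lam * mean_s
        if rho >= 1.0:
            return float("inf")               # unstable: need more capacity
        wait = lam * second_moment_s / (2.0 * (1.0 - rho))
        return wait + mean_s

    def nodes_needed(total_rate, mean_s, second_moment_s, target):
        """Smallest node count whose per-node predicted response time meets
        the target, assuming arrivals split evenly across nodes."""
        n = 1
        while mg1_response_time(total_rate / n, mean_s, second_moment_s) > target:
            n += 1
        return n

    print(nodes_needed(total_rate=900.0, mean_s=0.002,
                       second_moment_s=1.2e-5, target=0.01))
    ```

    Because the prediction uses only per-query arrival rates and service-time moments, no intrusive low-level measurements are needed at run time, which is the portability argument made above.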

  8. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romano, Paul K.; Siegel, Andrew R.

    The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC was then used in conjunction with the models to calculate the speedup due to vectorization as a function of the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than vector size to achieve vector efficiency greater than 90%. Lastly, when the execution times for events are allowed to vary, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration.

  9. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    DOE PAGES

    Romano, Paul K.; Siegel, Andrew R.

    2017-07-01

    The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC was then used in conjunction with the models to calculate the speedup due to vectorization as a function of the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than vector size to achieve vector efficiency greater than 90%. Lastly, when the execution times for events are allowed to vary, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration.

  10. Increasing the Operational Value of Event Messages

    NASA Technical Reports Server (NTRS)

    Li, Zhenping; Savkli, Cetin; Smith, Dan

    2003-01-01

    Assessing the health of a space mission has traditionally been performed using telemetry analysis tools. Parameter values are compared to known operational limits and are plotted over various time periods. This presentation begins with the notion that there is an incredible amount of untapped information contained within the mission's event message logs. Through creative advancements in message handling tools, the event message logs can be used to better assess spacecraft and ground system status and to highlight and report on conditions not readily apparent when messages are evaluated one-at-a-time during a real-time pass. Work in this area is being funded as part of a larger NASA effort at the Goddard Space Flight Center to create a component-based, middleware-based, standards-based, general-purpose ground system architecture referred to as GMSEC - the GSFC Mission Services Evolution Center. The new capabilities and operational concepts for event display, event data analyses and data mining are being developed by Lockheed Martin and the new subsystem has been named GREAT - the GMSEC Reusable Event Analysis Toolkit. Planned for use on existing and future missions, GREAT has the potential to increase operational efficiency in areas of problem detection and analysis, general status reporting, and real-time situational awareness.

  11. Event-Based Processing of Neutron Scattering Data

    DOE PAGES

    Peterson, Peter F.; Campbell, Stuart I.; Reuter, Michael A.; ...

    2015-09-16

    Many of the world's time-of-flight spallation neutron sources are migrating to the recording of individual neutron events. This provides new opportunities in data processing, not least the ability to filter events by correlating them with logs of the sample environment and other ancillary equipment. This paper will describe techniques for processing neutron scattering data acquired in event mode that preserve event information all the way to a final spectrum, including any necessary corrections or normalizations. This results in smaller final errors, while significantly reducing processing time and memory requirements in typical experiments. Results with traditional histogramming techniques will be shown for comparison.
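
    The filtering mentioned above reduces to interval selection on timestamps: keep only the events recorded while a sample-environment log was in range. The array layout below is an assumption for illustration.

    ```python
    # Hedged sketch of event filtering against a sample-environment log
    # (e.g. temperature); the data layout is an illustrative assumption.
    import numpy as np

    def filter_events(event_times, log_times, log_values, lo, hi):
        """Select events recorded while the logged value was within [lo, hi]."""
        # Value of the log at each event time (step-wise sample-and-hold);
        # events before the first log entry are clipped to it as an approximation.
        idx = np.searchsorted(log_times, event_times, side="right") - 1
        idx = np.clip(idx, 0, len(log_values) - 1)
        ok = (log_values[idx] >= lo) & (log_values[idx] <= hi)
        return event_times[ok]
    ```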

  12. Time-based Reconstruction of Free-streaming Data in CBM

    NASA Astrophysics Data System (ADS)

    Akishina, Valentina; Kisel, Ivan; Vassiliev, Iouri; Zyzak, Maksym

    2018-02-01

    Traditional latency-limited trigger architectures typical for conventional experiments are inapplicable for the CBM experiment. Instead, CBM will ship and collect time-stamped data into a readout buffer in the form of time-slices of a certain length and deliver it to a large computer farm, where online event reconstruction and selection will be performed. Grouping measurements into physical collisions must be performed in software and requires reconstruction not only in space, but also in time, the so-called 4-dimensional track reconstruction and event building. The tracks, reconstructed with the 4D Cellular Automaton track finder, are combined into event-corresponding clusters according to the estimated time in the target position and the errors, obtained with the Kalman Filter method. The reconstructed events are given as inputs to the KF Particle Finder package for short-lived particle reconstruction. The results of time-based reconstruction of simulated collisions in CBM are presented and discussed in detail.

  13. Lessons Learned from Real-Time, Event-Based Internet Science Communications

    NASA Technical Reports Server (NTRS)

    Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real-time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not. All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material exclusively. The worst event-based activity attracted more interest than the best written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and lessons learned will be discussed.

  14. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romano, Paul K.; Siegel, Andrew R.

    The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC was then used in conjunction with the models to calculate the speedup due to vectorization as a function of two parameters: the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than vector size in order to achieve vector efficiency greater than 90%. When the execution times for events are allowed to vary, however, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration. For some problems, this implies that vector efficiencies over 50% may not be attainable. While there are many factors impacting performance of an event-based algorithm that are not captured by our model, it nevertheless provides insights into factors that may be limiting in a real implementation.
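
    A toy version of such a model is sketched below: particles are absorbed with a fixed probability per event, and vector lanes left empty once the bank drains below the vector width count as wasted work. It captures only the bank-drain effect, with an assumed absorption probability, and is not fitted to the OpenMC data.

    ```python
    # Hedged sketch of a constant-event-time efficiency model; p_absorb and
    # the uniform-event assumption are illustrative, not from the paper.
    import random

    def vector_efficiency(bank_size, vector_width, p_absorb=0.1, trials=50):
        useful = issued = 0
        for _ in range(trials):
            alive = bank_size
            while alive > 0:
                batch = min(alive, vector_width)
                useful += batch                 # lanes doing real work
                issued += vector_width          # lanes issued this iteration
                # Each particle processed this iteration may be absorbed.
                alive -= sum(random.random() < p_absorb for _ in range(batch))
        return useful / issued

    for ratio in (1, 5, 20, 50):
        print(ratio, round(vector_efficiency(ratio * 16, 16), 3))
    ```

    The estimated utilization climbs steeply with the bank-to-vector-width ratio, echoing the trend reported above.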

  15. SU43. The Effect of Implementation Intention on Different Types of Prospective Memory Performance in Patients With Schizophrenia

    PubMed Central

    Wang, Ya; Liu, Lu-lu; Gan, Ming-yuan; Tan, Shu-ping; Shum, David; Chan, Raymond

    2017-01-01

    Abstract Background: Prospective memory (PM) refers to remembering to execute a planned intention in the future, and it can be divided into event-based PM (focal, nonfocal) and time-based PM according to the nature of the cue. Focal event-based PM, where the ongoing task requires processing of the characteristics of PM cues, has been found to benefit from implementation intention (II, ie, an encoding strategy in the format of "if I see X, then I will do Y"). However, to date, it is unclear whether implementation intention can produce a positive effect on nonfocal event-based PM (where the ongoing task is irrelevant to the PM cues) and time-based PM. Moreover, patients with schizophrenia (SCZ) have been found to have impairments in these types of PM, and few studies have examined the effect of II on them. This study investigated whether (and how) implementation intention can improve nonfocal event-based PM and time-based PM performance in patients with SCZ. Methods: Forty-two patients with SCZ and 42 healthy control participants were administered both a computerized nonfocal event-based PM task and a time-based PM task. Patients and healthy controls were further randomly allocated to an implementation intention condition (N = 21) or a typical instruction condition (N = 21). Results: Patients with SCZ in the implementation intention group showed higher PM accuracy than the typical instruction group in both the nonfocal event-based PM task (0.51 ± 0.32 vs 0.19 ± 0.29, t(40) = 3.39, P = .002) and the time-based PM task (0.72 ± 0.31 vs 0.39 ± 0.40, t(40) = 2.98, P = .005). Similarly, healthy controls in the II group also showed better PM performance than the typical instruction group in both tasks (all P's < 0.05). Time-check frequency on the time-based PM task was significantly higher in the II group than in the typical instruction group across all participants. Conclusion: Implementation intention is an effective strategy for improving different types of PM performance in patients with schizophrenia and can be applied in clinical settings.

  16. CIFAR10-DVS: An Event-Stream Dataset for Object Classification

    PubMed Central

    Li, Hongmin; Liu, Hanchao; Ji, Xiangyang; Li, Guoqi; Shi, Luping

    2017-01-01

    Neuromorphic vision research requires high-quality and appropriately challenging event-stream datasets to support continuous improvement of algorithms and methods. However, creating event-stream datasets is a time-consuming task, since they need to be recorded using neuromorphic cameras. Currently, there are limited event-stream datasets available. In this work, by utilizing the popular computer vision dataset CIFAR-10, we converted 10,000 frame-based images into 10,000 event streams using a dynamic vision sensor (DVS), providing an event-stream dataset of intermediate difficulty in 10 different classes, named as “CIFAR10-DVS.” The conversion to an event-stream dataset was implemented by a repeated closed-loop smooth (RCLS) movement of frame-based images. Unlike conversions of frame-based images achieved by moving the camera, the image movement is more realistic with respect to practical applications. The repeated closed-loop image movement generates rich local intensity changes in continuous time which are quantized by each pixel of the DVS camera to generate events. Furthermore, a performance benchmark in event-driven object classification is provided based on state-of-the-art classification algorithms. This work provides a large event-stream dataset and an initial benchmark for comparison, which may boost algorithm developments in event-driven pattern recognition and object classification. PMID:28611582

  17. CIFAR10-DVS: An Event-Stream Dataset for Object Classification.

    PubMed

    Li, Hongmin; Liu, Hanchao; Ji, Xiangyang; Li, Guoqi; Shi, Luping

    2017-01-01

    Neuromorphic vision research requires high-quality and appropriately challenging event-stream datasets to support continuous improvement of algorithms and methods. However, creating event-stream datasets is a time-consuming task, since they need to be recorded using neuromorphic cameras. Currently, there are limited event-stream datasets available. In this work, by utilizing the popular computer vision dataset CIFAR-10, we converted 10,000 frame-based images into 10,000 event streams using a dynamic vision sensor (DVS), providing an event-stream dataset of intermediate difficulty in 10 different classes, named as "CIFAR10-DVS." The conversion to an event-stream dataset was implemented by a repeated closed-loop smooth (RCLS) movement of frame-based images. Unlike conversions of frame-based images achieved by moving the camera, the image movement is more realistic with respect to practical applications. The repeated closed-loop image movement generates rich local intensity changes in continuous time which are quantized by each pixel of the DVS camera to generate events. Furthermore, a performance benchmark in event-driven object classification is provided based on state-of-the-art classification algorithms. This work provides a large event-stream dataset and an initial benchmark for comparison, which may boost algorithm developments in event-driven pattern recognition and object classification.

  18. SQL Triggers Reacting on Time Events: An Extension Proposal

    NASA Astrophysics Data System (ADS)

    Behrend, Andreas; Dorau, Christian; Manthey, Rainer

    The ability to activate triggers when timepoints are reached or when time intervals have elapsed has been acknowledged by many authors as a valuable functionality of a DBMS. Recently, interest in time-based triggers has been renewed in the context of data stream monitoring. However, until now SQL triggers react to data changes only, even though research proposals and prototypes have long supported several other event types, in particular time-based ones. We therefore propose a seamless extension of the SQL trigger concept by time-based triggers, focussing on semantic issues arising from such an extension.
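
    SQL syntax for such triggers is the paper's subject, but the intended semantics can be sketched in a few lines of Python: one trigger that fires once at an absolute timepoint, and one that re-fires each time a fixed interval elapses. The trigger names and actions below are made up for illustration:

    ```python
    import sched, time

    s = sched.scheduler(time.time, time.sleep)

    def fire(name, action, interval=None):
        print(f"trigger {name}: executing {action}")
        if interval is not None:          # interval trigger re-arms itself (runs until interrupted)
            s.enter(interval, 1, fire, (name, action, interval))

    # semantics of: ON TIMEPOINT <t> DO purge_stale_rows
    s.enterabs(time.time() + 5, 1, fire, ("purge_at_t", "purge_stale_rows"))
    # semantics of: AFTER INTERVAL '10 seconds' DO refresh_snapshot
    s.enter(10, 1, fire, ("refresh", "refresh_snapshot", 10))
    s.run()
    ```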

  19. Improving the Critic Learning for Event-Based Nonlinear $H_{\\infty }$ Control Design.

    PubMed

    Wang, Ding; He, Haibo; Liu, Derong

    2017-10-01

    In this paper, we aim at improving the critic learning criterion to cope with event-based nonlinear H ∞ state feedback control design. First, the H ∞ control problem is regarded as a two-player zero-sum game, and the adaptive critic mechanism is used to achieve minimax optimization in an event-based environment. Then, based on an improved updating rule, the event-based optimal control law and the time-based worst-case disturbance law are obtained approximately by training a single critic neural network. An initial stabilizing control is no longer required during the implementation of the new algorithm. Next, the closed-loop system is formulated as an impulsive model, and its stability is handled by incorporating the improved learning criterion. The infamous Zeno behavior of the present event-based design is also avoided through theoretical analysis of the lower bound on the minimal intersample time. Finally, applications to aircraft dynamics and a robot arm plant are carried out to verify the efficient performance of the proposed design method.

  20. Detecting a Non-Gaussian Stochastic Background of Gravitational Radiation

    NASA Astrophysics Data System (ADS)

    Drasco, Steve; Flanagan, Éanna É.

    2002-12-01

    We derive a detection method for a stochastic background of gravitational waves produced by events where the ratio of the average time between events to the average duration of an event is large. Such a signal would sound something like popcorn popping. Our derivation is based on the somewhat unrealistic assumption that the duration of an event is smaller than the detector time resolution.

  1. Prospective memory impairments in heavy social drinkers are partially overcome by future event simulation.

    PubMed

    Platt, Bradley; Kamboj, Sunjeev K; Italiano, Tommaso; Rendell, Peter G; Curran, H Valerie

    2016-02-01

    Recent research suggests that alcohol acutely impairs prospective memory (PM), and this impairment can be overcome using a strategy called 'future event simulation' (FES). Impairment in event-based PM found in detoxifying alcohol-dependent participants is reversed through FES. However, the impact of the most common problematic drinking patterns that do not involve alcohol dependence on PM remains unclear. Here, we examine the impact of frequent heavy drinking on PM and the degree to which any impairments can be reversed through FES. PM was assessed in 19 heavy drinkers (AUDIT scores ≥ 15) and 18 matched control participants (AUDIT scores ≤ 7) using the 'Virtual Week' task both at baseline and again following FES. Heavy drinkers performed significantly worse than controls on regular and irregular time-based PM tasks. FES improved the performance of controls but not of heavy drinkers on time-based tasks. In contrast, FES improved heavy drinkers' performance on event-based PM tasks. These findings suggest that heavy drinkers experience deficits in strategic monitoring processing associated with time-based PM tasks which do not abate after FES. That the same strategy improves their event-based PM suggests that FES may be helpful for individuals with problematic drinking patterns in improving their prospective memory.

  2. Comparison of Ionospheric and Thermospheric Effects During Two High Speed Stream Events

    NASA Astrophysics Data System (ADS)

    Verkhoglyadova, O. P.; Tsurutani, B.; Mannucci, A. J.; Paxton, L.; Mlynczak, M. G.; Hunt, L. A.; Echer, E.

    2013-12-01

    We analyze two CIR-HSS events during the ascending phase of the current solar cycle. The first event occurred on 8-12 May 2012 and was characterized by a large CIR and intense High-Intensity Long-Duration Continuous Auroral Activity (HILDCAA). A long-duration moderate geomagnetic storm (Dst ~ -50 nT) occurred during this event. The second event, on 29 April - 4 May 2011, had a large CIR and an extended HSS, but weaker geomagnetic activity. We focus on understanding the differences and similarities of the magnetosphere-ionosphere-thermosphere coupling during these two events. We use a suite of ground-based and satellite measurements to create a comprehensive picture of the events. The evolution of the polar cap convection pattern is analyzed based on SuperDARN data. DMSP/SSUSI far-ultraviolet measurements provide information on airglow intensity and characteristics of the F-region of the dusktime ionosphere. The GPS total electron content (TEC) database and JPL's Global Ionospheric Maps (GIM) are used to study vertical TEC (VTEC) for different local times and latitude ranges. We discuss the dynamics of VTEC above individual ground GPS sites with respect to local time and latitude. We analyze the TIMED/SABER zonal flux of nitric oxide (NO) infrared cooling radiation and auroral heating throughout the events. The global dynamics of the column density ratio ΣO/N2 is studied based on TIMED/GUVI measurements. Our results will advance understanding of the ionosphere-thermosphere response to external forcing and help future forecasting efforts.

  3. Signature-based search for delayed photons in exclusive photon plus missing transverse energy events from pp̄ collisions at √s = 1.96 TeV

    NASA Astrophysics Data System (ADS)

    Aaltonen, T.; Amerio, S.; Amidei, D.; Anastassov, A.; Annovi, A.; Antos, J.; Apollinari, G.; Appel, J. A.; Arisawa, T.; Artikov, A.; Asaadi, J.; Ashmanskas, W.; Auerbach, B.; Aurisano, A.; Azfar, F.; Badgett, W.; Bae, T.; Barbaro-Galtieri, A.; Barnes, V. E.; Barnett, B. A.; Barria, P.; Bartos, P.; Bauce, M.; Bedeschi, F.; Behari, S.; Bellettini, G.; Bellinger, J.; Benjamin, D.; Beretvas, A.; Bhatti, A.; Bland, K. R.; Blumenfeld, B.; Bocci, A.; Bodek, A.; Bortoletto, D.; Boudreau, J.; Boveia, A.; Brigliadori, L.; Bromberg, C.; Brucken, E.; Budagov, J.; Budd, H. S.; Burkett, K.; Busetto, G.; Bussey, P.; Butti, P.; Buzatu, A.; Calamba, A.; Camarda, S.; Campanelli, M.; Canelli, F.; Carls, B.; Carlsmith, D.; Carosi, R.; Carrillo, S.; Casal, B.; Casarsa, M.; Castro, A.; Catastini, P.; Cauz, D.; Cavaliere, V.; Cavalli-Sforza, M.; Cerri, A.; Cerrito, L.; Chen, Y. C.; Chertok, M.; Chiarelli, G.; Chlachidze, G.; Cho, K.; Chokheli, D.; Ciocci, M. A.; Clark, A.; Clarke, C.; Convery, M. E.; Conway, J.; Corbo, M.; Cordelli, M.; Cox, C. A.; Cox, D. J.; Cremonesi, M.; Cruz, D.; Cuevas, J.; Culbertson, R.; d'Ascenzo, N.; Datta, M.; De Barbaro, P.; Demortier, L.; Deninno, M.; d'Errico, M.; Devoto, F.; Di Canto, A.; Di Ruzza, B.; Dittmann, J. R.; D'Onofrio, M.; Donati, S.; Dorigo, M.; Driutti, A.; Ebina, K.; Edgar, R.; Elagin, A.; Erbacher, R.; Errede, S.; Esham, B.; Eusebi, R.; Farrington, S.; Fernández Ramos, J. P.; Field, R.; Flanagan, G.; Forrest, R.; Franklin, M.; Freeman, J. C.; Frisch, H.; Funakoshi, Y.; Garfinkel, A. F.; Garosi, P.; Gerberich, H.; Gerchtein, E.; Giagu, S.; Giakoumopoulou, V.; Gibson, K.; Ginsburg, C. M.; Giokaris, N.; Giromini, P.; Giurgiu, G.; Glagolev, V.; Glenzinski, D.; Gold, M.; Goldin, D.; Golossanov, A.; Gomez, G.; Gomez-Ceballos, G.; Goncharov, M.; González López, O.; Gorelov, I.; Goshaw, A. T.; Goulianos, K.; Gramellini, E.; Grinstein, S.; Grosso-Pilcher, C.; Group, R. C.; Guimaraes da Costa, J.; Hahn, S. R.; Han, J. Y.; Happacher, F.; Hara, K.; Hare, M.; Harr, R. F.; Harrington-Taber, T.; Hatakeyama, K.; Hays, C.; Heinrich, J.; Herndon, M.; Hocker, A.; Hong, Z.; Hopkins, W.; Hou, S.; Hughes, R. E.; Husemann, U.; Hussein, M.; Huston, J.; Introzzi, G.; Iori, M.; Ivanov, A.; James, E.; Jang, D.; Jayatilaka, B.; Jeon, E. J.; Jindariani, S.; Jones, M.; Joo, K. K.; Jun, S. Y.; Junk, T. R.; Kambeitz, M.; Kamon, T.; Karchin, P. E.; Kasmi, A.; Kato, Y.; Ketchum, W.; Keung, J.; Kilminster, B.; Kim, D. H.; Kim, H. S.; Kim, J. E.; Kim, M. J.; Kim, S. B.; Kim, S. H.; Kim, Y. J.; Kim, Y. K.; Kimura, N.; Kirby, M.; Knoepfel, K.; Kondo, K.; Kong, D. J.; Konigsberg, J.; Kotwal, A. V.; Kreps, M.; Kroll, J.; Kruse, M.; Kuhr, T.; Kurata, M.; Laasanen, A. T.; Lammel, S.; Lancaster, M.; Lannon, K.; Latino, G.; Lee, H. S.; Lee, J. S.; Leo, S.; Leone, S.; Lewis, J. D.; Limosani, A.; Lipeles, E.; Lister, A.; Liu, H.; Liu, Q.; Liu, T.; Lockwitz, S.; Loginov, A.; Lucà, A.; Lucchesi, D.; Lueck, J.; Lujan, P.; Lukens, P.; Lungu, G.; Lys, J.; Lysak, R.; Madrak, R.; Maestro, P.; Malik, S.; Manca, G.; Manousakis-Katsikakis, A.; Margaroli, F.; Marino, P.; Martínez, M.; Matera, K.; Mattson, M. E.; Mazzacane, A.; Mazzanti, P.; McNulty, R.; Mehta, A.; Mehtala, P.; Mesropian, C.; Miao, T.; Mietlicki, D.; Mitra, A.; Miyake, H.; Moed, S.; Moggi, N.; Moon, C. S.; Moore, R.; Morello, M. J.; Mukherjee, A.; Muller, Th.; Murat, P.; Mussini, M.; Nachtman, J.; Nagai, Y.; Naganoma, J.; Nakano, I.; Napier, A.; Nett, J.; Neu, C.; Nigmanov, T.; Nodulman, L.; Noh, S. Y.; Norniella, O.; Oakes, L.; Oh, S. H.; Oh, Y. 
D.; Oksuzian, I.; Okusawa, T.; Orava, R.; Ortolan, L.; Pagliarone, C.; Palencia, E.; Palni, P.; Papadimitriou, V.; Parker, W.; Pauletta, G.; Paulini, M.; Paus, C.; Phillips, T. J.; Piacentino, G.; Pianori, E.; Pilot, J.; Pitts, K.; Plager, C.; Pondrom, L.; Poprocki, S.; Potamianos, K.; Pranko, A.; Prokoshin, F.; Ptohos, F.; Punzi, G.; Ranjan, N.; Redondo Fernández, I.; Renton, P.; Rescigno, M.; Rimondi, F.; Ristori, L.; Robson, A.; Rodriguez, T.; Rolli, S.; Ronzani, M.; Roser, R.; Rosner, J. L.; Ruffini, F.; Ruiz, A.; Russ, J.; Rusu, V.; Sakumoto, W. K.; Sakurai, Y.; Santi, L.; Sato, K.; Saveliev, V.; Savoy-Navarro, A.; Schlabach, P.; Schmidt, E. E.; Schwarz, T.; Scodellaro, L.; Scuri, F.; Seidel, S.; Seiya, Y.; Semenov, A.; Sforza, F.; Shalhout, S. Z.; Shears, T.; Shepard, P. F.; Shimojima, M.; Shochet, M.; Shreyber-Tecker, I.; Simonenko, A.; Sinervo, P.; Sliwa, K.; Smith, J. R.; Snider, F. D.; Song, H.; Sorin, V.; Stancari, M.; Denis, R. St.; Stelzer, B.; Stelzer-Chilton, O.; Stentz, D.; Strologas, J.; Sudo, Y.; Sukhanov, A.; Suslov, I.; Takemasa, K.; Takeuchi, Y.; Tang, J.; Tecchio, M.; Teng, P. K.; Thom, J.; Thomson, E.; Thukral, V.; Toback, D.; Tokar, S.; Tollefson, K.; Tomura, T.; Tonelli, D.; Torre, S.; Torretta, D.; Totaro, P.; Trovato, M.; Ukegawa, F.; Uozumi, S.; Vázquez, F.; Velev, G.; Vellidis, C.; Vernieri, C.; Vidal, M.; Vilar, R.; Vizán, J.; Vogel, M.; Volpi, G.; Wagner, P.; Wallny, R.; Wang, S. M.; Warburton, A.; Waters, D.; Wester, W. C., III; Whiteson, D.; Wicklund, A. B.; Wilbur, S.; Williams, H. H.; Wilson, J. S.; Wilson, P.; Winer, B. L.; Wittich, P.; Wolbers, S.; Wolfe, H.; Wright, T.; Wu, X.; Wu, Z.; Yamamoto, K.; Yamato, D.; Yang, T.; Yang, U. K.; Yang, Y. C.; Yao, W.-M.; Yeh, G. P.; Yi, K.; Yoh, J.; Yorita, K.; Yoshida, T.; Yu, G. B.; Yu, I.; Zanetti, A. M.; Zeng, Y.; Zhou, C.; Zucchelli, S.

    2013-08-01

    We present the first signature-based search for delayed photons using an exclusive photon plus missing transverse energy final state. Events are reconstructed in a data sample from the CDF II detector corresponding to 6.3 fb⁻¹ of integrated luminosity from √s = 1.96 TeV proton-antiproton collisions. Candidate events are selected if they contain a photon with an arrival time in the detector later than expected for a promptly produced photon. The mean number of events from standard model sources predicted by the data-driven background model, based on the photon timing distribution, is 286 ± 24. A total of 322 events are observed. A p value of 12% is obtained, showing consistency of the data with standard model predictions.
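
    The quoted consistency can be checked roughly with a toy calculation, assuming a Gaussian uncertainty on the background estimate and Poisson-distributed counts (the paper's actual background model is data-driven and more detailed):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    b = rng.normal(286.0, 24.0, size=1_000_000)  # smear the background estimate
    b = np.clip(b, 0.0, None)                    # background cannot be negative
    n = rng.poisson(b)                           # pseudo-experiment counts
    p_value = np.mean(n >= 322)                  # fraction at least as high as observed
    print(f"p-value ~ {p_value:.2f}")            # on the order of 10%
    ```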

  4. Differentiating location- and distance-based processes in memory for time: an ERP study.

    PubMed

    Curran, Tim; Friedman, William J

    2003-09-01

    Memory for the time of events may benefit from reconstructive, location-based, and distance-based processes, but these processes are difficult to dissociate with behavioral methods. Neuropsychological research has emphasized the contribution of prefrontal brain mechanisms to memory for time but has not clearly differentiated location- from distance-based processing. The present experiment recorded event-related brain potentials (ERPs) while subjects completed two different temporal memory tests, designed to emphasize either location- or distance-based processing. The subjects' reports of location-based versus distance-based strategies and the reaction time pattern validated our experimental manipulation. Late (800-1,800 msec) frontal ERP effects were related to location-based processing. The results provide support for a two-process theory of memory for time and suggest that frontal memory mechanisms are specifically related to reconstructive, location-based processing.

  5. Event-Based Robust Control for Uncertain Nonlinear Systems Using Adaptive Dynamic Programming.

    PubMed

    Zhang, Qichao; Zhao, Dongbin; Wang, Ding

    2018-01-01

    In this paper, the robust control problem for a class of continuous-time nonlinear systems with unmatched uncertainties is investigated using an event-based control method. First, the robust control problem is transformed into a corresponding optimal control problem with an augmented control and an appropriate cost function. Under the event-based mechanism, we prove that the solution of the optimal control problem can asymptotically stabilize the uncertain system with an adaptive triggering condition. That is, the designed event-based controller is robust to the original uncertain system. Note that the event-based controller is updated only when the triggering condition is satisfied, which can save communication resources between the plant and the controller. Then, a single-network adaptive dynamic programming structure with an experience replay technique is constructed to approximate the optimal control policies. The stability of the closed-loop system with the event-based control policy and the augmented control policy is analyzed using the Lyapunov approach. Furthermore, we prove that the minimal intersample time is bounded by a nonzero positive constant, which excludes Zeno behavior during the learning process. Finally, two simulation examples are provided to demonstrate the effectiveness of the proposed control scheme.
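
    The core event-based mechanism can be illustrated independently of the ADP machinery: hold the control input between events, and recompute it only when the gap between the current state and the last sampled state violates a triggering condition. The linear dynamics, gain, and threshold below are illustrative stand-ins, not the paper's nonlinear design:

    ```python
    import numpy as np

    A = np.array([[0.0, 1.0], [-1.0, -0.5]])
    B = np.array([0.0, 1.0])
    K = np.array([1.0, 1.0])

    def simulate(x0, dt=0.01, T=5.0, c=0.1):
        x = np.array(x0, float)
        x_event = x.copy()                 # state at the last triggering instant
        u, updates = -K @ x_event, 0
        for _ in range(int(T / dt)):
            # trigger: sampling error exceeds a state-dependent threshold
            if np.linalg.norm(x - x_event) > c * np.linalg.norm(x):
                x_event, u = x.copy(), -K @ x
                updates += 1
            x = x + dt * (A @ x + B * u)   # Euler step of dx/dt = Ax + Bu
        return x, updates

    x_final, n_updates = simulate([1.0, 0.0])
    print(x_final, n_updates)  # typically far fewer updates than the 500 steps
    ```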

  6. Event coincidence analysis for quantifying statistical interrelationships between event time series. On the role of flood events as triggers of epidemic outbreaks

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.

    2016-05-01

    Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis, which provides a framework for quantifying the strength, directionality, and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
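
    A bare-bones version of the method is easy to sketch: count how often events in one series are followed within a tolerance window by events in the other, and compare against surrogate series drawn from a uniform (Poisson) null. The event times below are invented for illustration:

    ```python
    import numpy as np

    def coincidence_rate(a_times, b_times, dT):
        """Fraction of events in A followed within dT by at least one event in B."""
        a, b = np.asarray(a_times), np.asarray(b_times)
        return np.mean([np.any((b >= t) & (b <= t + dT)) for t in a])

    def poisson_pvalue(a_times, b_times, dT, t_max, n_surr=2000, seed=0):
        rng = np.random.default_rng(seed)
        observed = coincidence_rate(a_times, b_times, dT)
        null = [coincidence_rate(a_times,
                                 np.sort(rng.uniform(0, t_max, len(b_times))), dT)
                for _ in range(n_surr)]
        return observed, np.mean(np.asarray(null) >= observed)

    floods    = [3.0, 11.5, 20.2, 34.8, 41.0]   # illustrative event times (years)
    outbreaks = [3.4, 12.1, 25.0, 35.1, 41.6]
    rate, p = poisson_pvalue(floods, outbreaks, dT=1.0, t_max=50.0)
    print(rate, p)
    ```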

  7. Real-Time Gait Event Detection Based on Kinematic Data Coupled to a Biomechanical Model.

    PubMed

    Lambrecht, Stefan; Harutyunyan, Anna; Tanghe, Kevin; Afschrift, Maarten; De Schutter, Joris; Jonkers, Ilse

    2017-03-24

    Real-time detection of multiple stance events, more specifically initial contact (IC), foot flat (FF), heel off (HO), and toe off (TO), could greatly benefit neurorobotic (NR) and neuroprosthetic (NP) control. Three real-time threshold-based algorithms have been developed, detecting the aforementioned events based on kinematic data in combination with a biomechanical model. Data from seven subjects walking at three speeds on an instrumented treadmill were used to validate the presented algorithms, accumulating to a total of 558 steps. The reference for the gait events was obtained using marker and force plate data. All algorithms had excellent precision and no false positives were observed. Timing delays of the presented algorithms were similar to current state-of-the-art algorithms for the detection of IC and TO, whereas smaller delays were achieved for the detection of FF. Our results indicate that, based on their high precision and low delays, these algorithms can be used for the control of an NR/NP, with the exception of the HO event. Kinematic data is used in most NR/NP control schemes and is thus available at no additional cost, resulting in a minimal computational burden. The presented methods can also be applied for screening pathological gait or gait analysis in general in/outside of the laboratory.

  8. Bootstrap-based methods for estimating standard errors in Cox's regression analyses of clustered event times.

    PubMed

    Xiao, Yongling; Abrahamowicz, Michal

    2010-03-30

    We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data with latent cluster-level random effects, which are ignored in the conventional Cox model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs, correct type I error rates, and acceptable coverage rates, regardless of the true random-effects distribution, and avoid the serious variance under-estimation of conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of clustered event times.
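
    A sketch of the cluster-bootstrap idea, assuming the lifelines package and made-up column names ("time", "event", covariate "x"): resample whole clusters with replacement, refit Cox's model on each resample, and take the SD of the coefficient across resamples:

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    def cluster_bootstrap_se(df, cluster_col, n_boot=500, seed=0):
        """SE of the Cox coefficient for covariate 'x' via cluster resampling."""
        rng = np.random.default_rng(seed)
        ids = df[cluster_col].unique()
        coefs = []
        for _ in range(n_boot):
            chosen = rng.choice(ids, size=len(ids), replace=True)
            boot = pd.concat([df[df[cluster_col] == c] for c in chosen],
                             ignore_index=True)
            cph = CoxPHFitter().fit(boot.drop(columns=cluster_col),
                                    duration_col="time", event_col="event")
            coefs.append(cph.params_["x"])
        return np.std(coefs, ddof=1)
    ```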

  9. An operational integrated short-term warning solution for solar radiation storms: introducing the Forecasting Solar Particle Events and Flares (FORSPEF) system

    NASA Astrophysics Data System (ADS)

    Anastasiadis, Anastasios; Sandberg, Ingmar; Papaioannou, Athanasios; Georgoulis, Manolis; Tziotziou, Kostas; Jiggens, Piers; Hilgers, Alain

    2015-04-01

    We present a novel integrated prediction system for both solar flares and solar energetic particle (SEP) events, which is in place to provide short-term warnings for hazardous solar radiation storms. The FORSPEF system provides forecasting of solar eruptive events, such as solar flares with a projection to coronal mass ejections (CMEs) (occurrence and velocity), and the likelihood of occurrence of an SEP event. It also provides nowcasting of SEP events based on actual solar flare and CME near real-time alerts, as well as SEP characteristics (peak flux, fluence, rise time, duration) per parent solar event. The prediction of solar flares relies on a morphological method based on the derivation of the effective connected magnetic field strength (Beff) of potentially flaring active-region (AR) magnetic configurations, utilizing the analysis of a large number of AR magnetograms. For the prediction of SEP events, a new reductive statistical method has been implemented, based on a newly constructed database of solar flares, CMEs, and SEP events covering the period 1984-2013. The method is based on flare location (longitude), flare size (maximum soft X-ray intensity), and the occurrence (or not) of a CME. Warnings are issued for all > C1.0 soft X-ray flares. The warning time in the forecasting scheme extends to 24 hours with a refresh rate of 3 hours, while the respective warning time for the nowcasting scheme depends on the availability of the near real-time data and falls between 15 and 20 minutes. We discuss the modules of the FORSPEF system, their interconnection, and the operational setup. The dual approach in the development of FORSPEF (i.e., forecasting and nowcasting schemes) permits the refinement of predictions upon the availability of new data that characterize changes on the Sun and in interplanetary space, while the combined usage of solar flare and SEP forecasting methods makes FORSPEF an integrated forecasting solution. This work has been funded through "FORSPEF: FORecasting Solar Particle Events and Flares", ESA Contract No. 4000109641/13/NL/AK.

  10. Time Variations in Forecasts and Occurrences of Large Solar Energetic Particle Events

    NASA Astrophysics Data System (ADS)

    Kahler, S. W.

    2015-12-01

    The onsets and development of large solar energetic (E > 10 MeV) particle (SEP) events have been characterized in many studies. The statistics of SEP event onset delay times from associated solar flares and coronal mass ejections (CMEs), which depend on solar source longitudes, can be used to provide better predictions of whether a SEP event will occur following a large flare or fast CME. In addition, size distributions of peak SEP event intensities provide a means for a probabilistic forecast of peak intensities attained in observed SEP increases. SEP event peak intensities have been compared with their rise and decay times for insight into the acceleration and transport processes. These two time scales are generally treated as independent parameters describing the development of a SEP event, but we can invoke an alternative two-parameter description based on the assumption that decay times exceed rise times for all events. These two parameters, from the well known Weibull distribution, provide an event description in terms of its basic shape and duration. We apply this distribution to several large SEP events and ask what the characteristic parameters and their dependence on source longitudes can tell us about the origins of these important events.

  11. Distributed Continuous Event-Based Data Acquisition Using the IEEE 1588 Synchronization and FlexRIO FPGA

    NASA Astrophysics Data System (ADS)

    Taliercio, C.; Luchetta, A.; Manduchi, G.; Rigoni, A.

    2017-07-01

    High-speed event-driven acquisition is normally performed by analog-to-digital converter (ADC) boards that record a given number of pretrigger and posttrigger samples upon the occurrence of a hardware trigger. A direct physical connection is therefore required between the source of the event (the trigger) and the ADC, because any software-based communication method would introduce a triggering delay that would be unacceptable in many cases. This paper proposes a solution that relaxes the event communication time so that it can be carried out by software messaging (e.g., via a LAN), provided that the system components are synchronized in time using the IEEE 1588 synchronization mechanism. The information about the exact event occurrence time is contained in the software packet that communicates the event and is used by the ADC FPGA to identify the exact sample in the ADC sample queue. The length of the ADC sample queue depends on the maximum delay in software event message communication. A prototype implementation using a National Instruments FlexRIO FPGA board connected to an ADC device is presented as a proof of concept.
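
    The key arithmetic is the mapping from a synchronized event timestamp back to a position in the ADC's circular sample queue; the sample rate, queue length, and names below are illustrative:

    ```python
    SAMPLE_RATE_HZ = 1_000_000   # e.g. a 1 MS/s ADC
    QUEUE_LEN = 200_000          # 200 ms of history; must exceed the
                                 # worst-case LAN message delay

    def sample_index(event_time_s, newest_sample_time_s, newest_index):
        """Index into the circular queue of the sample acquired at event_time_s.

        Both timestamps come from the same IEEE 1588-disciplined clock, so the
        event can be located exactly even if its message arrives late."""
        lag = newest_sample_time_s - event_time_s     # how far back the event lies
        offset = round(lag * SAMPLE_RATE_HZ)
        assert 0 <= offset < QUEUE_LEN, "event older than the queue can cover"
        return (newest_index - offset) % QUEUE_LEN
    ```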

  12. wayGoo recommender system: personalized recommendations for events scheduling, based on static and real-time information

    NASA Astrophysics Data System (ADS)

    Thanos, Konstantinos-Georgios; Thomopoulos, Stelios C. A.

    2016-05-01

    wayGoo is a fully functional application whose main functionalities include content geolocation, event scheduling, and indoor navigation. However, significant information about events does not reach users' attention, either because of the volume of this information or because some of it comes from real-time data sources. The purpose of this work is to facilitate event management by prioritizing the presented events based on users' interests, using both static and real-time data. Through the wayGoo interface, users select conceptual topics that interest them. These topics constitute a browsing-behavior vector which is used for learning users' interests implicitly, without being intrusive. The system then estimates user preferences and returns an event list sorted from the most preferred to the least. User preferences are modeled via a Naïve Bayesian Network which consists of: a) the 'decision' random variable, corresponding to users' decision on attending an event; b) the 'distance' random variable, modeled by a linear regression that estimates the probability that the distance between a user and each event destination is not discouraging; c) the 'seat availability' random variable, modeled by a linear regression that estimates the probability that the seat availability is encouraging; and d) the 'relevance' random variable, modeled by clustering-based collaborative filtering, which determines the relevance of each event to users' interests. Finally, experimental results show that the proposed system contributes substantially to assisting users in browsing and selecting events to attend.
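
    A toy version of the Bayesian combination may help: the posterior that a user attends an event is assembled naïvely from the per-feature probabilities, which the paper obtains from linear regressions and collaborative filtering and which are simply stubbed with numbers here:

    ```python
    # Toy Naive Bayes scoring: treat each probability as evidence for "attend"
    # and its complement as evidence for "not attend" (a deliberate
    # simplification of the paper's network).
    def attend_score(p_prior, p_distance_ok, p_seats_ok, p_relevant):
        p_yes = p_prior * p_distance_ok * p_seats_ok * p_relevant
        p_no = (1 - p_prior) * (1 - p_distance_ok) * (1 - p_seats_ok) * (1 - p_relevant)
        return p_yes / (p_yes + p_no)   # normalized posterior used for ranking

    events = {"concert": (0.5, 0.9, 0.7, 0.8),   # invented feature probabilities
              "lecture": (0.5, 0.4, 0.9, 0.3)}
    ranked = sorted(events, key=lambda e: attend_score(*events[e]), reverse=True)
    print(ranked)   # events sorted from most to least preferred
    ```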

  13. Near Optimal Event-Triggered Control of Nonlinear Discrete-Time Systems Using Neurodynamic Programming.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-09-01

    This paper presents an event-triggered near optimal control of uncertain nonlinear discrete-time systems. Event-driven neurodynamic programming (NDP) is utilized to design the control policy. A neural network (NN)-based identifier, with event-based state and input vectors, is utilized to learn the system dynamics. An actor-critic framework is used to learn the cost function and the optimal control input. The NN weights of the identifier, the critic, and the actor NNs are tuned aperiodically once every triggered instant. An adaptive event-trigger condition to decide the trigger instants is derived. Thus, a suitable number of events are generated to ensure a desired accuracy of approximation. A near optimal performance is achieved without using value and/or policy iterations. A detailed analysis of nontrivial inter-event times with an explicit formula to show the reduction in computation is also derived. The Lyapunov technique is used in conjunction with the event-trigger condition to guarantee the ultimate boundedness of the closed-loop system. The simulation results are included to verify the performance of the controller. The net result is the development of event-driven NDP.

  14. Rule-Based Event Processing and Reaction Rules

    NASA Astrophysics Data System (ADS)

    Paschke, Adrian; Kozlenkov, Alexander

    Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.

  15. Element analysis: a wavelet-based method for analysing time-localized events in noisy time series

    PubMed Central

    2017-01-01

    A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized ‘events’. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event’s ‘region of influence’ within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry. PMID:28484325

  16. Event Recognition Based on Deep Learning in Chinese Texts

    PubMed Central

    Zhang, Yajun; Liu, Zongtian; Zhou, Wen

    2016-01-01

    Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%. PMID:27501231

  17. Event Recognition Based on Deep Learning in Chinese Texts.

    PubMed

    Zhang, Yajun; Liu, Zongtian; Zhou, Wen

    2016-01-01

    Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%.

  18. Noise suppression in surface microseismic data

    USGS Publications Warehouse

    Forghani-Arani, Farnoush; Batzle, Mike; Behura, Jyoti; Willis, Mark; Haines, Seth S.; Davidson, Michael

    2012-01-01

    We introduce a passive noise suppression technique, based on the τ − p transform. In the τ − p domain, one can separate microseismic events from surface noise based on distinct characteristics that are not visible in the time-offset domain. By applying the inverse τ − p transform to the separated microseismic event, we suppress the surface noise in the data. Our technique significantly improves the signal-to-noise ratios of the microseismic events and is superior to existing techniques for passive noise suppression in the sense that it preserves the waveform.
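
    For readers unfamiliar with the transform, a minimal slant-stack (τ − p) implementation looks like this; the sign convention and sampling choices here are arbitrary, and the muting in the τ − p domain plus the inverse transform are omitted:

    ```python
    import numpy as np

    def tau_p(data, dt, offsets, slownesses):
        """Forward slant stack: for each slowness p, sum the record along
        t = tau + p * x. data: (n_traces, n_samples); returns (n_p, n_samples)."""
        n_traces, n_samples = data.shape
        out = np.zeros((len(slownesses), n_samples))
        t = np.arange(n_samples) * dt
        for ip, p in enumerate(slownesses):
            for ix, x in enumerate(offsets):
                # shift trace ix by p * x and stack; zero-pad outside the record
                out[ip] += np.interp(t + p * x, t, data[ix], left=0.0, right=0.0)
        return out
    ```

    Linear surface-wave noise maps to compact regions in τ − p, where it can be muted before applying the inverse transform to recover the denoised record.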

  19. An extension of the OpenModelica compiler for using Modelica models in a discrete event simulation

    DOE PAGES

    Nutaro, James

    2014-11-03

    In this article, a new back-end and run-time system is described for the OpenModelica compiler. This new back-end transforms a Modelica model into a module for the adevs discrete event simulation package, thereby extending adevs to encompass complex, hybrid dynamical systems. The new run-time system built within the adevs simulation package supports models with state-events and time-events that comprise differential-algebraic systems with high index. Finally, although the procedure for effecting this transformation is based on adevs and the Discrete Event System Specification, it can be adapted to any discrete event simulation package.

  20. A Differential Deficit in Time- versus Event-based Prospective Memory in Parkinson's Disease

    PubMed Central

    Raskin, Sarah A.; Woods, Steven Paul; Poquette, Amelia J.; McTaggart, April B.; Sethna, Jim; Williams, Rebecca C.; Tröster, Alexander I.

    2010-01-01

    Objective The aim of the current study was to clarify the nature and extent of impairment in time- versus event-based prospective memory in Parkinson's disease (PD). Prospective memory is thought to involve cognitive processes that are mediated by prefrontal systems and are executive in nature. Given that individuals with PD frequently show executive dysfunction, it is important to determine whether these individuals may have deficits in prospective memory that could impact daily functions, such as taking medications. Although it has been reported that individuals with PD evidence impairment in prospective memory, it is still unclear whether they show a greater deficit for time- versus event-based cues. Method Fifty-four individuals with PD and 34 demographically similar healthy adults were administered a standardized measure of prospective memory that allows for a direct comparison of time-based and event-based cues. In addition, participants were administered a series of standardized measures of retrospective memory and executive functions. Results Individuals with PD demonstrated impaired prospective memory performance compared to the healthy adults, with a greater impairment demonstrated for the time-based tasks. Time-based prospective memory performance was moderately correlated with measures of executive functioning, but only the Stroop Neuropsychological Screening Test emerged as a unique predictor in a linear regression. Conclusions Findings are interpreted within the context of McDaniel and Einstein's (2000) multi-process theory to suggest that individuals with PD experience particular difficulty executing a future intention when the cue to execute the prescribed intention requires higher levels of executive control. PMID:21090895

  1. A Framework of Simple Event Detection in Surveillance Video

    NASA Astrophysics Data System (ADS)

    Xu, Weiguang; Zhang, Yafei; Lu, Jianjiang; Tian, Yulong; Wang, Jiabao

    Video surveillance is playing an increasingly important role in people's social life. Real-time alerting of threatening events and searching for interesting content in large volumes of stored video footage require a human operator to pay full attention to the monitor for long periods. This labor-intensive mode has limited the effectiveness and efficiency of such systems. A framework for simple event detection is presented to advance the automation of video surveillance. An improved inner key-point matching approach is used to compensate for background motion in real time; frame differencing is used to detect the foreground; HOG-based classifiers are used to classify foreground objects into people and cars; and mean-shift is used to track the recognized objects. Events are detected based on predefined rules. The maturity of the algorithms guarantees the robustness of the framework, and the improved approach and the easily checked rules enable the framework to work in real time. Future work is also discussed.
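
    The frame-differencing step, the simplest piece of the pipeline, can be sketched as follows (background-motion compensation, the HOG classifiers, and the mean-shift tracker are omitted; the threshold is a placeholder):

    ```python
    import numpy as np

    def foreground_mask(prev_frame, frame, thresh=25):
        """Mark pixels whose intensity changed by more than `thresh` between
        consecutive frames. Both frames are uint8 grayscale arrays of equal
        shape, assumed already motion-compensated."""
        diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
        return diff > thresh   # boolean foreground mask

    # The mask would then feed the person/car classifiers and the tracker;
    # rule checks (e.g. "object enters zone") run on the resulting tracks.
    ```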

  2. Finding the signal in the noise: Could social media be utilized for early hospital notification of multiple casualty events?

    PubMed Central

    Moore, Sara; Wakam, Glenn; Hubbard, Alan E.; Cohen, Mitchell J.

    2017-01-01

    Introduction Delayed notification and lack of early information hinder timely hospital-based activations in large-scale multiple casualty events. We hypothesized that Twitter real-time data would produce a unique and reproducible signal within minutes of multiple casualty events, and we investigated the timing of the signal compared with other hospital disaster notification mechanisms. Methods Using disaster-specific search terms, all relevant tweets from the event to 7 days post-event were analyzed for 5 recent US-based multiple casualty events (Boston Bombing [BB], SF Plane Crash [SF], Napa Earthquake [NE], Sandy Hook [SH], and Marysville Shooting [MV]). Quantitative and qualitative analyses of tweet utilization were compared across events. Results Over 3.8 million tweets were analyzed (SH 1.8m, BB 1.1m, SF 430k, MV 250k, NE 205k). Peak tweets per minute ranged from 209 to 3326. The mean number of followers per tweeter ranged from 3382 to 9992 across events. Retweets were tweeted a mean of 82-564 times per event. Tweets appeared very rapidly for all events (<2 min) and reached 1% of the total event-specific tweets within a median of 13 minutes of the first 911 calls. A 200 tweets/min threshold was reached fastest with NE (2 min), BB (7 min), and SF (18 min). If this threshold were utilized as a signaling mechanism to place local hospitals on standby for possible large-scale events, in all case studies this signal would have preceded patient arrival. Importantly, this threshold signal would also have preceded traditional disaster notification mechanisms in SF and NE, and would have been simultaneous with them in BB and MV. Conclusions Social media data have demonstrated that this mechanism is a powerful, predictable, and potentially important resource for optimizing disaster response. Further investigation is warranted to assess the utility of prospective signaling thresholds for hospital-based activation. PMID:28982201
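
    The proposed 200 tweets/min standby signal amounts to a sliding-window rate detector, which can be sketched in a few lines (window length and threshold as reported; everything else illustrative):

    ```python
    from collections import deque

    class RateAlarm:
        """Fire when a 60-second sliding window holds `threshold` matching tweets."""
        def __init__(self, threshold=200, window_s=60.0):
            self.threshold, self.window_s = threshold, window_s
            self.times = deque()

        def add(self, t):
            """t: tweet timestamp in seconds; returns True when the alarm fires."""
            self.times.append(t)
            while self.times and self.times[0] < t - self.window_s:
                self.times.popleft()   # drop tweets that left the window
            return len(self.times) >= self.threshold
    ```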

  3. Semiparametric modeling and estimation of the terminal behavior of recurrent marker processes before failure events.

    PubMed

    Chan, Kwun Chuen Gary; Wang, Mei-Cheng

    2017-01-01

    Recurrent event processes with marker measurements are mostly studied with forward-time models starting from an initial event. Interestingly, such processes can exhibit important terminal behavior during the time period before the occurrence of the failure event. A natural and direct way to study recurrent events prior to a failure event is to align the processes using the failure event as the time origin and to examine the terminal behavior with a backward-time model. This paper studies regression models for backward recurrent marker processes by counting time backward from the failure event. A three-level semiparametric regression model is proposed for jointly modeling the time to a failure event, the backward recurrent event process, and the marker observed at the time of each backward recurrent event. The first level is a proportional hazards model for the failure time, the second level is a proportional rate model for the recurrent events occurring before the failure event, and the third level is a proportional mean model for the marker given the occurrence of a recurrent event backward in time. By jointly modeling the three components, estimating equations can be constructed for marked counting processes to estimate the target parameters in the three-level regression models. Large-sample properties of the proposed estimators are studied and established. The proposed models and methods are illustrated by a community-based AIDS clinical trial examining the terminal behavior of frequencies and severities of opportunistic infections among HIV-infected individuals in the last six months of life.

  4. Event-Based Stereo Depth Estimation Using Belief Propagation.

    PubMed

    Xie, Zhen; Chen, Shengyong; Orchard, Garrick

    2017-01-01

    Compared to standard frame-based cameras, biologically inspired event-based sensors capture visual information with low latency and minimal redundancy. These event-based sensors are also far less prone to motion blur than traditional cameras, and still operate effectively in high dynamic range scenes. However, classical frame-based algorithms are not typically suitable for these event-based data, and new processing algorithms are required. This paper focuses on the problem of depth estimation from a stereo pair of event-based sensors. A fully event-based stereo depth estimation algorithm which relies on message passing is proposed. The algorithm not only considers the properties of a single event but also uses a Markov Random Field (MRF) to consider the constraints between nearby events, such as disparity uniqueness and depth continuity. The method is tested on five different scenes and compared to other state-of-the-art event-based stereo matching methods. The results show that the method detects more stereo matches than other methods, with each match having a higher accuracy. The method can operate in an event-driven manner where depths are reported for individual events as they are received, or the network can be queried at any time to generate a sparse depth frame which represents the current state of the network.

  5. Leveraging Event Reporting Through Knowledge Support: A Knowledge-Based Approach to Promoting Patient Fall Prevention.

    PubMed

    Yao, Bin; Kang, Hong; Miao, Qi; Zhou, Sicheng; Liang, Chen; Gong, Yang

    2017-01-01

    Patient falls are a common safety event type that impairs healthcare quality. Strategies including solution tools and reporting systems for preventing patient falls have been developed and implemented in the U.S. However, the current strategies do not include timely knowledge support, which is greatly needed to bridge the gap between reporting and learning. In this study, we constructed a knowledge base of fall events by combining expert-reviewed fall prevention solutions and integrating them into a reporting system. The knowledge base enables timely and tailored knowledge support and thus will serve as a prevailing fall prevention tool. This effort holds promise in making knowledge acquisition and management a routine process for enhancing the reporting and understanding of patient safety events.

  6. A community-based event delivery protocol in publish/subscribe systems for delay tolerant sensor networks.

    PubMed

    Liu, Nianbo; Liu, Ming; Zhu, Jinqi; Gong, Haigang

    2009-01-01

    The basic operation of a Delay Tolerant Sensor Network (DTSN) is to accomplish pervasive data gathering in networks with intermittent connectivity, while the publish/subscribe (Pub/Sub for short) paradigm is used to deliver events from a source to interested clients in an asynchronous way. Recently, the extension of Pub/Sub systems to DTSNs has become a promising research topic. However, due to the unique frequent-partitioning characteristic of DTSNs, extending a Pub/Sub system to a DTSN is a considerably difficult and challenging problem, and there are no good solutions to this problem in published works. To adapt Pub/Sub systems to DTSNs, we propose CED, a community-based event delivery protocol. In our design, event delivery is based on several unchanged communities, which are formed by sensor nodes in the network according to their connectivity. CED consists of two components: event delivery and queue management. In event delivery, events in a community are delivered to mobile subscribers once a subscriber comes into the community, to improve the data delivery ratio. The queue management employs both the event successful delivery time and the event survival time to decide whether an event should be delivered or dropped, to minimize the transmission overhead. The effectiveness of CED is demonstrated through comprehensive simulation studies.

  7. A general framework for time series data mining based on event analysis: application to the medical domains of electroencephalography and stabilometry.

    PubMed

    Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P

    2014-10-01

    There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events, and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed scheme has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Visual search of cyclic spatio-temporal events

    NASA Astrophysics Data System (ADS)

    Gautier, Jacques; Davoine, Paule-Annick; Cunty, Claire

    2018-05-01

    The analysis of spatio-temporal events, and especially of relationships between their different dimensions (space, time, thematic attributes), can be done with geovisualization interfaces. But few geovisualization tools integrate the cyclic dimension of spatio-temporal event series (natural or social events). Time Coil and Time Wave diagrams represent both linear time and cyclic time. By introducing a cyclic temporal scale, these diagrams can highlight the cyclic characteristics of spatio-temporal events. However, the settable cyclic temporal scales are limited to usual durations like days or months. Because of that, these diagrams cannot be used to visualize cyclic events that reappear with an unusual period, and they do not allow a visual search for cyclic events. Nor do they make it possible to identify relationships between the cyclic behavior of events and their spatial features, and in particular to identify localized cyclic events. The lack of means to represent cyclic time outside the temporal diagram of multi-view geovisualization interfaces limits the analysis of relationships between the cyclic reappearance of events and their other dimensions. In this paper, we propose a method and a geovisualization tool, based on an extension of Time Coil and Time Wave, to provide a visual search for cyclic events by allowing any possible duration to be set as the diagram's cyclic temporal scale. We also propose a symbology approach to push the representation of cyclic time into the map, in order to improve the analysis of relationships between space and the cyclic behavior of events.

  9. Music, clicks, and their imaginations favor differently the event-based timing component for rhythmic movements.

    PubMed

    Bravi, Riccardo; Quarta, Eros; Del Tongo, Claudia; Carbonaro, Nicola; Tognetti, Alessandro; Minciacchi, Diego

    2015-06-01

    The involvement or noninvolvement of a clock-like neural process, i.e., an effector-independent representation of the time intervals to produce, is described as the essential difference between event-based and emergent timing. In a previous work (Bravi et al. in Exp Brain Res 232:1663-1675, 2014a. doi: 10.1007/s00221-014-3845-9), we studied repetitive isochronous wrist flexion-extensions (IWFEs), performed while minimizing visual and tactile information, to clarify whether non-temporal and temporal characteristics of paced auditory stimuli affect the precision and accuracy of rhythmic motor performance. Here, with the inclusion of new recordings, we expand the examination of the dataset described in our previous study to investigate whether simple and complex paced auditory stimuli (clicks and music) and their imaginations influence the timing mechanisms for repetitive IWFEs in different ways. Sets of IWFEs were analyzed using the windowed lag-one autocorrelation, wγ(1), a statistical method recently introduced for distinguishing between event-based and emergent timing. Our findings provide evidence that paced auditory information and its imagination favor the engagement of a clock-like neural process, and specifically that music, unlike clicks, lacks the power to elicit event-based timing, failing to counteract the natural shift of wγ(1) toward positive values as the frequency of movements increases.
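
    As a rough illustration of the statistic, a simplified windowed lag-one autocorrelation can be computed by averaging the lag-one autocorrelation over short sliding windows of the inter-movement-interval series; the exact windowing of wγ(1) follows the method cited in the abstract, and this is only a sketch of the idea. Negative values indicate event-based (clock-like) timing, positive values emergent timing:

    ```python
    import numpy as np

    def windowed_lag1(intervals, win=15):
        """Average lag-one autocorrelation over sliding windows of the
        inter-movement intervals (a simplified stand-in for wgamma(1))."""
        intervals = np.asarray(intervals, float)
        acs = []
        for i in range(len(intervals) - win + 1):
            w = intervals[i:i + win] - intervals[i:i + win].mean()
            acs.append(np.sum(w[:-1] * w[1:]) / np.sum(w * w))
        return float(np.mean(acs))
    ```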

  10. A stochastic event-based continuous time step rainfall generator based on Poisson rectangular pulse and microcanonical random cascade models

    NASA Astrophysics Data System (ADS)

    Pohle, Ina; Niebisch, Michael; Zha, Tingting; Schümberg, Sabine; Müller, Hannes; Maurer, Thomas; Hinz, Christoph

    2017-04-01

    Rainfall variability within a storm is of major importance for fast hydrological processes, e.g. surface runoff, erosion, and solute dissipation from surface soils. To investigate and simulate the impacts of within-storm variability on these processes, long time series of rainfall with high resolution are required. Yet observed precipitation records of hourly or higher resolution are in most cases available only for a small number of stations and only for a few years. To obtain long time series of alternating rainfall events and interstorm periods while conserving the statistics of observed rainfall events, the Poisson model can be used. Multiplicative microcanonical random cascades have been widely applied to disaggregate rainfall time series from coarse to fine temporal resolution. We present a new coupling approach of the Poisson rectangular pulse model and the multiplicative microcanonical random cascade model that preserves the characteristics of rainfall events as well as inter-storm periods. In the first step, a Poisson rectangular pulse model is applied to generate discrete rainfall events (duration and mean intensity) and inter-storm periods (duration). The rainfall events are subsequently disaggregated to high-resolution time series (user-specified, e.g. 10 min resolution) by a multiplicative microcanonical random cascade model. One of the challenges of coupling these models is parameterizing the cascade model for the event durations generated by the Poisson model. In fact, the cascade model is best suited to downscaling rainfall data with a constant time step, such as daily precipitation data. Without starting from a fixed time step duration (e.g. daily), the disaggregation of events requires some modifications of the multiplicative microcanonical random cascade model proposed by Olsson (1998): firstly, parameterizing the cascade model for events of different durations requires continuous functions for the probabilities of the multiplicative weights, which we implemented through sigmoid functions; secondly, the branching of the first and last box is constrained to preserve the rainfall event durations generated by the Poisson rectangular pulse model. The event-based continuous time step rainfall generator has been developed and tested using 10 min and hourly rainfall data from four stations in North-Eastern Germany. The model performs well in comparison to observed rainfall in terms of event durations and mean event intensities as well as wet-spell and dry-spell durations. It is currently being tested using data from other stations across Germany and in different climate zones. Furthermore, the rainfall event generator is being applied in modelling approaches aimed at understanding the impact of rainfall variability on hydrological processes. Reference: Olsson, J.: Evaluation of a scaling cascade model for temporal rainfall disaggregation, Hydrology and Earth System Sciences, 2, 19-30, 1998.
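
    The microcanonical cascade step can be sketched compactly: rainfall mass is split over successively halved intervals, conserving mass exactly at each branching. The fixed splitting probability and uniform weights below are placeholders for the scale-dependent, sigmoid-parameterized probabilities described above:

    ```python
    import numpy as np

    def cascade(amount, levels, p_split=0.7, rng=None):
        """Disaggregate `amount` over 2**levels sub-steps with a minimal
        microcanonical random cascade; total mass is conserved exactly."""
        rng = rng or np.random.default_rng()
        series = np.array([amount])
        for _ in range(levels):
            out = np.empty(2 * len(series))
            for i, a in enumerate(series):
                if a > 0 and rng.random() < p_split:
                    w = rng.uniform(0.1, 0.9)      # fraction going to the left half
                else:
                    w = rng.choice([0.0, 1.0])     # all mass to one side
                out[2 * i], out[2 * i + 1] = w * a, (1 - w) * a
            series = out
        return series

    print(cascade(12.0, levels=5).round(2))   # 12 mm event -> 32 sub-steps
    ```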

  11. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm.

    PubMed

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-10-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms achieve relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and reduced reliability when walking up and down stairs. In this study, a new algorithm is proposed to detect gait events on three walking terrains in real time, based on time-frequency analysis of acceleration jerk signals to obtain gait parameters, followed by determination of the peaks of the jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm is robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in applications such as drop foot correction devices and leg prostheses.
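
    The peak-picking stage can be approximated with standard tools, e.g. scipy.signal.find_peaks on the jerk signal; the spacing and prominence heuristics below are placeholders, not the paper's tuned values:

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def detect_gait_peaks(accel, fs, min_cycle_s=0.8):
        """Differentiate acceleration to jerk, then pick prominent peaks
        subject to a minimum spacing derived from the gait cycle time."""
        jerk = np.gradient(accel) * fs               # d(accel)/dt, samples at fs Hz
        peaks, props = find_peaks(np.abs(jerk),
                                  distance=int(min_cycle_s * fs / 2),
                                  prominence=np.std(jerk))
        return peaks / fs, props                     # candidate event times (s)
    ```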

  13. Controlling cavitation-based image contrast in focused ultrasound histotripsy surgery.

    PubMed

    Allen, Steven P; Hall, Timothy L; Cain, Charles A; Hernandez-Garcia, Luis

    2015-01-01

    To develop MRI feedback for cavitation-based focused ultrasound tissue erosion surgery (histotripsy), we investigate image contrast generated by transient cavitation events. Changes in GRE image intensity are observed while balanced pairs of field gradients are varied in the presence of an acoustically driven cavitation event. The amplitude of the acoustic pulse and the timing between a cavitation event and the start of these gradient waveforms are also varied. The magnitudes and phases of the cavitation site are compared with those of control images. An echo-planar sequence is used to evaluate histotripsy lesions in ex vivo tissue. Cavitation events in water cause localized attenuation when acoustic pulses exceed a pressure threshold. Attenuation increases with increasing gradient amplitude and gradient lobe separation times and is isotropic with respect to gradient direction. This attenuation also depends upon the relative timing between the cavitation event and the start of the balanced gradients. These factors can be used to control the appearance of attenuation while imaging ex vivo tissue. By controlling the timing between cavitation events and the imaging gradients, MR images can be made alternately sensitive or insensitive to cavitation. During therapy, these images can be used to isolate contrast generated by cavitation. © 2014 Wiley Periodicals, Inc.

  14. Event-driven simulation in SELMON: An overview of EDSE

    NASA Technical Reports Server (NTRS)

    Rouquette, Nicolas F.; Chien, Steve A.; Charest, Leonard, Jr.

    1992-01-01

    This paper describes EDSE (event-driven simulation engine), a model-based event-driven simulator implemented for SELMON, a tool for sensor selection and anomaly detection in real-time monitoring. The simulator is used in conjunction with a causal model to predict future behavior of the model from observed data. The behavior of the causal model is interpreted as equivalent to the behavior of the physical system being modeled. An overview is provided of the functionality of the simulator and the model-based event-driven simulation paradigm on which it is based, including high-level descriptions of the following key properties: event consumption and event creation, iterative simulation, and synchronization and filtering of monitoring data from the physical system. Finally, how EDSE stands with respect to the relevant open issues of discrete-event and model-based simulation is discussed.
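
    The core mechanics of such an engine (consuming an event may create future events, iterated over simulated time) can be sketched with a priority queue; the class and method names below are illustrative and are not EDSE's actual interface.

      import heapq

      class EventDrivenSimulator:
          """Minimal event-driven engine: events are (time, action) pairs in
          a priority queue; consuming an event may schedule future events."""
          def __init__(self):
              self.queue = []
              self.now = 0.0
              self._seq = 0       # tie-breaker keeps equal-time events ordered

          def schedule(self, delay, action):
              heapq.heappush(self.queue, (self.now + delay, self._seq, action))
              self._seq += 1

          def run(self, until):
              while self.queue and self.queue[0][0] <= until:
                  self.now, _, action = heapq.heappop(self.queue)
                  action(self)    # the action may schedule new events

      # Example: a sensor model that re-samples itself every 2 time units
      def sample(sim):
          print(f"t={sim.now:.1f}: sensor sampled")
          sim.schedule(2.0, sample)

      sim = EventDrivenSimulator()
      sim.schedule(0.0, sample)
      sim.run(until=6.0)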

  15. Retention of autobiographical memories: an Internet-based diary study.

    PubMed

    Kristo, Gert; Janssen, Steve M J; Murre, Jaap M J

    2009-11-01

    In this online study we examined the retention of recent personal events using an Internet-based diary technique. Each participant (N = 878) recorded one recent personal event on a website and was contacted after a retention interval that ranged between 2 and 46 days. We investigated how well the participants could recall the content, time, and details of their recorded event. We found a classic retention function. Details of the events were forgotten more rapidly than the content and the time of the events. There were no differences between the forgetting rates of the "who", "what" and "where" elements of the content component. Reminiscing, social sharing, pleasantness, and frequency of occurrence aided recall but, surprisingly, importance and emotionality did not. They were, however, strongly associated with reminiscing and social sharing.

  16. EEG-Annotate: Automated identification and labeling of events in continuous signals with applications to EEG.

    PubMed

    Su, Kyung-Min; Hairston, W David; Robbins, Kay

    2018-01-01

    In controlled laboratory EEG experiments, researchers carefully mark events and analyze subject responses time-locked to these events. Unfortunately, such markers may not be available, or may come with poor timing resolution, for experiments conducted in less-controlled naturalistic environments. We present an integrated event-identification method for identifying particular responses that occur in unlabeled, continuously recorded EEG signals, based on information from recordings of other subjects potentially performing related tasks. We introduce the idea of timing slack and timing-tolerant performance measures to deal with the jitter inherent in such non-time-locked systems. We have developed an implementation available as an open-source MATLAB toolbox (http://github.com/VisLab/EEG-Annotate) and have made test data available in a separate data note. We applied the method to identify visual presentation events (both target and non-target) in data from an unlabeled subject using labeled data from other subjects, with good sensitivity and specificity. The method also identified actual visual presentation events in the data that were not previously marked in the experiment. Although the method uses traditional classifiers for its initial stages, the problem of identifying events based on the presence of stereotypical EEG responses is the converse of the traditional stimulus-response paradigm and has not previously been addressed in this form. In addition to identifying potential events in unlabeled or incompletely labeled EEG, these methods also allow researchers to investigate whether particular stereotypical neural responses are present in other circumstances. Timing tolerance has the added benefit of accommodating inter- and intra-subject timing variations. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
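
    A minimal sketch of a timing-tolerant performance measure (assumed logic, not the toolbox's actual scoring code): predicted event times are matched greedily, one-to-one, to true event times within a slack window, and hit/miss/false-alarm counts follow.

      def timing_tolerant_score(true_times, predicted_times, slack):
          """Count a prediction as a hit if it falls within +/- slack seconds
          of an unmatched true event; returns (hits, misses, false_alarms)."""
          true_times = sorted(true_times)
          used = [False] * len(true_times)
          hits = 0
          for p in sorted(predicted_times):
              for i, s in enumerate(true_times):
                  if not used[i] and abs(p - s) <= slack:
                      used[i] = True
                      hits += 1
                      break
          return hits, len(true_times) - hits, len(predicted_times) - hits

      print(timing_tolerant_score([1.0, 5.0, 9.0], [1.1, 4.8, 7.0], slack=0.3))
      # -> (2, 1, 1): two hits within the slack window, one miss, one false alarm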

  17. Optimizing Tsunami Forecast Model Accuracy

    NASA Astrophysics Data System (ADS)

    Whitmore, P.; Nyland, D. L.; Huang, P. Y.

    2015-12-01

    Recent tsunamis provide a means to determine the accuracy that can be expected of real-time tsunami forecast models. Forecast accuracy using two different tsunami forecast models is compared for seven events since 2006, based on both real-time application and optimized, after-the-fact "forecasts". Lessons learned by comparing the forecast accuracy determined during an event to modified applications of the models after the fact provide improved methods for real-time forecasting of future events. Variables such as source definition, data assimilation, and model scaling factors are examined to optimize forecast accuracy. Forecast accuracy is also compared for direct forward modeling based on earthquake source parameters versus accuracy obtained by assimilating sea level data into the forecast model. Results show that assimilating sea level data into the models increases accuracy by approximately 15% for the events examined.

  18. Address-event-based platform for bioinspired spiking systems

    NASA Astrophysics Data System (ADS)

    Jiménez-Fernández, A.; Luján, C. D.; Linares-Barranco, A.; Gómez-Rodríguez, F.; Rivas, M.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between huge numbers of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Neurons generate "events" according to their activity levels: more active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip, multi-layered AER systems, it is absolutely necessary to have a computer interface that allows (a) reading AER interchip traffic into the computer and visualizing it on the screen, and (b) converting a conventional frame-based video stream in the computer into AER and injecting it at some point of the AER structure. This is necessary for testing and debugging of complex AER systems. On the other hand, the use of a commercial personal computer implies depending on software tools and operating systems that can make the system slower and less robust. This paper addresses the problem of communicating several AER-based chips to compose a powerful processing system. The problem was discussed in the Neuromorphic Engineering Workshop of 2006. The platform is based on an embedded computer, a powerful FPGA and serial links, to make the system faster and stand-alone (independent of a PC). A new platform is presented that allows connecting up to eight AER-based chips to a Spartan 3 4000 FPGA. The FPGA is responsible for the network communication based on address events and, at the same time, for mapping and transforming the address space of the traffic to implement pre-processing. An MMU microprocessor (Intel XScale 400 MHz Gumstix Connex computer) is also connected to the FPGA to allow the platform to implement event-based algorithms that interact with the AER system, such as control algorithms, network connectivity, USB support, etc. The LVDS transceiver allows a bandwidth of up to 1.32 Gbps, around 66 mega-events per second (Mevps).
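
    To illustrate the rate-coding principle behind AER traffic (a toy sketch under assumptions, not the platform's firmware): each event carries only a pixel/neuron address, so reconstructing a frame from an event stream amounts to histogramming addresses over a time window, since the per-pixel event rate encodes intensity.

      import numpy as np

      rng = np.random.default_rng(1)
      W = H = 8
      image = rng.random((H, W))                  # hidden "intensity" map
      n_events = 5000

      # emit address events with probability proportional to intensity
      flat_p = image.ravel() / image.sum()
      addresses = rng.choice(W * H, size=n_events, p=flat_p)

      # frame reconstruction = histogram of event addresses
      frame = np.bincount(addresses, minlength=W * H).reshape(H, W)
      print(np.corrcoef(frame.ravel(), image.ravel())[0, 1])  # close to 1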

  19. Extracting rate changes in transcriptional regulation from MEDLINE abstracts.

    PubMed

    Liu, Wenting; Miao, Kui; Li, Guangxia; Chang, Kuiyu; Zheng, Jie; Rajapakse, Jagath C

    2014-01-01

    Time delays are important factors that are often neglected in gene regulatory network (GRN) inference models. Validating time delays from knowledge bases is a challenge since the vast majority of biological databases do not record temporal information of gene regulations. Biological knowledge and facts on gene regulations are typically extracted from the bio-literature with specialized methods that depend on the regulation task. In this paper, we mine evidence for time delays related to the transcriptional regulation of yeast from PubMed abstracts. Since the vast majority of abstracts lack quantitative time information, we can only collect qualitative evidence of time delays. Specifically, a speed-up or delay in transcriptional regulation rate can provide evidence for (shorter or longer) time delays in a GRN. Thus, we focus on deriving events related to rate changes in transcriptional regulation. A corpus of yeast-regulation-related abstracts was manually labeled with such events. In order to capture these events automatically, we create an ontology of sub-processes that are likely to result in transcription rate changes by combining textual patterns and biological knowledge. We also propose effective feature extraction methods based on the created ontology to identify direct evidence with specific details of these events. Our ontologies outperform existing state-of-the-art gene regulation ontologies in the automatic rule learning method applied to our corpus. The proposed deterministic ontology rule-based method achieves performance comparable to the automatic rule learning method based on decision trees, which demonstrates the effectiveness of our ontology in identifying rate-changing events. We also tested the effectiveness of the proposed feature mining methods on detecting direct evidence of events. Experimental results show that the machine learning method on these features achieves an F1-score of 71.43%. The manually labeled corpus of events relating to rate changes in transcriptional regulation for yeast is available at https://sites.google.com/site/wentingntu/data. The created ontologies summarize both biological causes of rate changes in transcriptional regulation and corresponding positive and negative textual patterns from the corpus. They are demonstrated to be effective in identifying rate-changing events, which shows the benefit of combining textual patterns and biological knowledge when extracting complex biological events.

  20. Event generators for address event representation transmitters

    NASA Astrophysics Data System (ADS)

    Serrano-Gotarredona, Rafael; Serrano-Gotarredona, Teresa; Linares Barranco, Bernabe

    2005-06-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows for real-time virtual massive connectivity between huge numbers of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Neurons generate 'events' according to their activity levels: more active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. In a typical AER transmitter chip, there is an array of neurons that generate events. They send events to a peripheral circuit (call it the "AER Generator") that transforms those events into neuron coordinates (addresses), which are put sequentially on an interchip high-speed digital bus. This bus includes a parallel multi-bit address word plus Rqst (request) and Ack (acknowledge) handshaking signals for asynchronous data exchange. Two main approaches have been published in the literature for implementing such "AER Generator" circuits; they differ in the way they handle event collisions coming from the array of neurons. One approach is based on detecting and discarding collisions, while the other incorporates arbitration for sequencing colliding events. The first approach is simpler and faster, while the second is able to handle much higher event traffic. In this article we concentrate on the second, arbiter-based approach. Boahen has published several techniques for implementing and improving the arbiter-based approach. Originally, he proposed an arbitration scheme by rows, followed by a column arbitration. In this scheme, while one neuron was selected by the arbiters to transmit its event off the chip, the rest of the neurons in the array were frozen and could not transmit further events during this time window, which limited the maximum transmission speed. In order to improve this speed, Boahen proposed an improved 'burst mode' scheme, in which, after the row arbitration, a complete row of events is pipelined out of the array and arbitrated off the chip at higher speed. During this single-row event arbitration, the array is free to generate new events and communicate with the row arbiter in a pipelined mode. This scheme significantly improves the maximum event transmission speed, especially in high-traffic situations where speed is most critical. We have analyzed and studied this approach and have detected some shortcomings in the circuits reported by Boahen, which may lead to erroneous behavior under some statistical conditions. The present paper proposes some improvements to overcome such situations. The improved "AER Generator" has been implemented in an AER transmitter system

  1. Temporal and Resource Reasoning for Planning, Scheduling and Execution in Autonomous Agents

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Hunsberger, Luke; Tsamardinos, Ioannis

    2005-01-01

    This viewgraph tutorial reviews methods for planning and scheduling events, illustrated with several examples of scheduling events for the successful and timely completion of an overall plan. Using constraint-based models, the presentation covers planning with time, time representations in problem solving, and resource reasoning.

  2. g-PRIME: A Free, Windows Based Data Acquisition and Event Analysis Software Package for Physiology in Classrooms and Research Labs.

    PubMed

    Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R

    2009-01-01

    We present g-PRIME, a software-based tool for physiology data acquisition, analysis, and stimulus generation in education and research. The software was developed in an undergraduate neurophysiology course and strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone Windows application coded and "compiled" in Matlab (it does not require a Matlab license). g-PRIME supports many data acquisition interfaces, from the PC sound card to expensive high-throughput calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real-time and offline filtering of signals, multi-parameter threshold-and-window-based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rate and intervals. The software also correlates detected events with another simultaneously acquired source (event-triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication-quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.

  3. Screen-based entertainment time, all-cause mortality, and cardiovascular events: population-based study with ongoing mortality and hospital events follow-up.

    PubMed

    Stamatakis, Emmanuel; Hamer, Mark; Dunstan, David W

    2011-01-18

    The aim of this study was to examine the independent relationships of television viewing or other screen-based entertainment ("screen time") with all-cause mortality and clinically confirmed cardiovascular disease (CVD) events. A secondary objective was to examine the extent to which metabolic (body mass index, high-density lipoprotein and total cholesterol) and inflammatory (C-reactive protein) markers mediate the relationship between screen time and CVD events. Although some evidence suggests that prolonged sitting is linked to CVD risk factor development regardless of physical activity participation, studies with hard outcomes are scarce. A population sample of 4,512 (1,945 men) Scottish Health Survey 2003 respondents (≥35 years) was followed up to 2007 for all-cause mortality and CVD events (fatal and nonfatal combined). Main exposures were interviewer-assessed screen time (<2 h/day; 2 to <4 h/day; and ≥4 h/day) and moderate to vigorous intensity physical activity. Two hundred fifteen CVD events and 325 any-cause deaths occurred during 19,364 follow-up person-years. The covariable (age, sex, ethnicity, obesity, smoking, social class, long-standing illness, marital status, diabetes, hypertension)-adjusted hazard ratio (HR) for all-cause mortality was 1.52 (95% confidence interval [CI]: 1.06 to 2.16) and for CVD events was 2.30 (95% CI: 1.33 to 3.96) for participants engaging in ≥4 h/day of screen time relative to <2 h/day. Adjusting for physical activity attenuated these associations only slightly (all-cause mortality: HR: 1.48, 95% CI: 1.04 to 2.13; CVD events: HR: 2.25, 95% CI: 1.30 to 3.89). Exclusion of participants with CVD events in the first 2 years of follow-up and previous cancer registrations did not change these results appreciably. Approximately 25% of the association between screen time and CVD events was explained collectively by C-reactive protein, body mass index, and high-density lipoprotein cholesterol. Recreational sitting, as reflected by television/screen viewing time, is related to raised mortality and CVD risk regardless of physical activity participation. Inflammatory and metabolic risk factors partly explain this relationship. Copyright © 2011 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  4. Real-time hypoglycemia detection from continuous glucose monitoring data of subjects with type 1 diabetes.

    PubMed

    Jensen, Morten Hasselstrøm; Christensen, Toke Folke; Tarnow, Lise; Seto, Edmund; Dencker Johansen, Mette; Hejlesen, Ole Kristian

    2013-07-01

    Hypoglycemia is a potentially fatal condition. Continuous glucose monitoring (CGM) has the potential to detect hypoglycemia in real time, thereby reducing time in hypoglycemia and avoiding any further decline in blood glucose level. However, CGM is inaccurate and shows a substantial number of cases in which a hypoglycemic event is not detected. The aim of this study was to develop a pattern classification model to optimize real-time hypoglycemia detection. Features such as time since last insulin injection and linear regression, kurtosis, and skewness of the CGM signal in different time intervals were extracted from data from 10 male subjects experiencing 17 insulin-induced hypoglycemic events in an experimental setting. Nondiscriminative features were eliminated with SEPCOR and forward selection. The feature combinations were used in a Support Vector Machine model, and the performance was assessed by sample-based sensitivity and specificity and by event-based sensitivity and number of false positives. The best model used seven features and was able to detect 17 of 17 hypoglycemic events with one false positive, compared with 12 of 17 hypoglycemic events with zero false positives for the CGM alone. Lead time was 14 min and 0 min for the model and the CGM alone, respectively. This optimized real-time hypoglycemia detection provides a unique approach for the diabetes patient to reduce time in hypoglycemia and learn about patterns in glucose excursions. Although these results are promising, the model needs to be validated on CGM data from patients with spontaneous hypoglycemic events.
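
    A toy sketch of the feature-plus-classifier pipeline (assumed feature definitions and window values; SEPCOR feature elimination and the event-based evaluation are omitted): window statistics of the CGM trace and the time since the last insulin injection feed a support vector machine. The regression slope below is an assumed stand-in for the paper's linear-regression feature.

      import numpy as np
      from scipy.stats import kurtosis, skew
      from sklearn.svm import SVC

      def window_features(cgm, minutes_since_insulin):
          """Features of the kind described above, computed on one CGM window."""
          x = np.asarray(cgm, dtype=float)
          slope = np.polyfit(np.arange(x.size), x, 1)[0]
          return [slope, kurtosis(x), skew(x), minutes_since_insulin]

      # toy training set: two hypoglycemia-labelled windows, two normal ones
      X = [window_features([6.0, 5.4, 4.9, 4.3], 40),
           window_features([5.8, 5.2, 4.6, 4.0], 55),
           window_features([7.0, 7.1, 6.9, 7.2], 200),
           window_features([8.0, 7.8, 8.1, 7.9], 300)]
      y = [1, 1, 0, 0]

      clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
      print(clf.predict([window_features([6.2, 5.5, 4.8, 4.2], 45)]))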

  5. Using Real-time Event Tracking Sensitivity Analysis to Overcome Sensor Measurement Uncertainties of Geo-Information Management in Drilling Disasters

    NASA Astrophysics Data System (ADS)

    Tavakoli, S.; Poslad, S.; Fruhwirth, R.; Winter, M.

    2012-04-01

    This paper introduces an application of a novel EventTracker platform for instantaneous sensitivity analysis (SA) of large-scale real-time geo-information. Earth disaster management systems demand high-quality information to aid a quick and timely response to their evolving environments. The idea behind the proposed EventTracker platform is the assumption that modern information management systems are able to capture data in real time and have the technological flexibility to adjust their services to work with specific sources of data/information. However, to assure this adaptation in real time, the online data should be collected, interpreted, and translated into corrective actions in a concise and timely manner. This can hardly be handled by existing sensitivity analysis methods, because they rely on historical data and lazy processing algorithms. In event-driven systems, the effect of system inputs on its state is of value, as events could cause this state to change. This 'event triggering' situation underpins the logic of the proposed approach. The event tracking sensitivity analysis method describes the system variables and states as a collection of events. The higher the occurrence of an input variable during the triggering of an event, the greater its potential impact on the final analysis of the system state. Experiments were designed to compare the proposed event tracking sensitivity analysis with existing entropy-based sensitivity analysis methods. The results show a 10% improvement in computational efficiency with no compromise in accuracy, and that the computational time to perform the sensitivity analysis is 0.5% of that required by the entropy-based method. The proposed method has been applied to real-world data in the context of preventing emerging crises at drilling rigs. One of the major purposes of such rigs is to drill boreholes to explore oil or gas reservoirs, with the final aim of recovering the content of such reservoirs, both onshore and offshore. Drilling a well is always guided by technical, economic and security constraints to protect crew, equipment and the environment from injury, damage and pollution. Although risk assessment and local practice provide a high degree of security, uncertainty arises from the behaviour of the formation, which may cause crucial situations at the rig. Real-time sensor measurements form a basis for predicting, and thus preventing, such crises; the proposed method supports the identification of the data necessary for that purpose.

  6. Unsupervised Spatio-Temporal Data Mining Framework for Burned Area Mapping

    NASA Technical Reports Server (NTRS)

    Kumar, Vipin (Inventor); Boriah, Shyam (Inventor); Mithal, Varun (Inventor); Khandelwal, Ankush (Inventor)

    2016-01-01

    A method reduces processing time required to identify locations burned by fire by receiving a feature value for each pixel in an image, each pixel representing a sub-area of a location. Pixels are then grouped based on similarities of the feature values to form candidate burn events. For each candidate burn event, a probability that the candidate burn event is a true burn event is determined based on at least one further feature value for each pixel in the candidate burn event. Candidate burn events that have a probability below a threshold are removed from further consideration as burn events to produce a set of remaining candidate burn events.
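
    The two-stage screening can be illustrated with a toy sketch (the feature, probability model, and thresholds below are assumed placeholders, not the patented method's actual definitions): contiguous pixels whose feature values pass a similarity test are grouped into candidate burn events, each candidate is then scored, and weak candidates are removed.

      import numpy as np
      from scipy import ndimage

      # hypothetical per-pixel feature, e.g. a drop in a vegetation index
      signal = np.array([[0.9, 0.8, 0.0, 0.0],
                         [0.7, 0.9, 0.0, 0.3],
                         [0.0, 0.0, 0.0, 0.4],
                         [0.0, 0.0, 0.0, 0.5]])

      # step 1: group similar contiguous pixels into candidate burn events
      labels, n_candidates = ndimage.label(signal > 0.25)

      # step 2: score each candidate with a further feature (here the mean
      # signal stands in for a learned probability) and drop weak candidates
      threshold = 0.5
      for event_id in range(1, n_candidates + 1):
          prob = signal[labels == event_id].mean()   # placeholder probability
          if prob < threshold:
              labels[labels == event_id] = 0
      print(labels)   # only the strong candidate survives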

  7. Monitoring Natural Events Globally in Near Real-Time Using NASA's Open Web Services and Tools

    NASA Technical Reports Server (NTRS)

    Boller, Ryan A.; Ward, Kevin Alan; Murphy, Kevin J.

    2015-01-01

    Since 1960, NASA has been making global measurements of the Earth from a multitude of space-based missions, many of which can be useful for monitoring natural events. In recent years, these measurements have been made available in near real-time, making it possible to use them to also aid in managing the response to natural events. We present the challenges and ongoing solutions to using NASA satellite data for monitoring and managing these events.

  8. Initial Evaluation of Signal-Based Bayesian Monitoring

    NASA Astrophysics Data System (ADS)

    Moore, D.; Russell, S.

    2016-12-01

    We present SIGVISA (Signal-based Vertically Integrated Seismic Analysis), a next-generation system for global seismic monitoring through Bayesian inference on seismic signals. Traditional seismic monitoring systems rely on discrete detections produced by station processing software, discarding significant information present in the original recorded signal. By modeling signals directly, our forward model is able to incorporate a rich representation of the physics underlying the signal generation process, including source mechanisms, wave propagation, and station response. This allows inference in the model to recover the qualitative behavior of geophysical methods including waveform matching and double-differencing, all as part of a unified Bayesian monitoring system that simultaneously detects and locates events from a network of stations. We report results from an evaluation of SIGVISA monitoring the western United States for a two-week period following the magnitude 6.0 event in Wells, NV in February 2008. During this period, SIGVISA detects more than twice as many events as NETVISA, and three times as many as SEL3, while operating at the same precision; at lower precisions it detects up to five times as many events as SEL3. At the same time, signal-based monitoring reduces mean location errors by a factor of four relative to detection-based systems. We provide evidence that, given only IMS data, SIGVISA detects events that are missed by regional monitoring networks, indicating that our evaluations may even underestimate its performance. Finally, SIGVISA matches or exceeds the detection rates of existing systems for de novo events - events with no nearby historical seismicity - and detects through automated processing a number of such events missed even by the human analysts generating the LEB.

  9. A Hot-Deck Multiple Imputation Procedure for Gaps in Longitudinal Recurrent Event Histories

    PubMed Central

    Wang, Chia-Ning; Little, Roderick; Nan, Bin; Harlow, Siobán D.

    2012-01-01

    We propose a regression-based hot-deck multiple imputation method for gaps of missing data in longitudinal studies, where subjects experience a recurrent event process and a terminal event. Examples are repeated asthma episodes and death, or menstrual periods and the menopause, as in our motivating application. Research interest concerns the onset time of a marker event, defined by the recurrent-event process, or the duration from this marker event to the final event. Gaps in the recorded event history make it difficult to determine the onset time of the marker event and, hence, the duration from onset to the final event. Simple approaches, such as jumping over gap times or dropping cases with gaps, have obvious limitations. We propose a procedure for imputing information in the gaps by substituting information from a matched individual with a completely recorded history in the corresponding interval. Predictive Mean Matching is used to incorporate information on longitudinal characteristics of the repeated process and the final event time. Multiple imputation is used to propagate imputation uncertainty. The procedure is applied to an important data set for assessing the timing and duration of the menopausal transition. The performance of the proposed method is assessed by a simulation study. PMID:21361886

  10. Development of a GCR Event-based Risk Model

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee

    2009-01-01

    A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress, or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well-known energy loss processes to develop a stochastic Monte Carlo based model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparing it to physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue, including correlated secondary ions often of high multiplicity. Conventional space radiation risk assessment employs average quantities and assumes linearity and additivity of responses over the complete range of GCR charge and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times, including the ATM, TGF-Smad, and WNT signaling pathways. We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes, to estimate how GCR event rates map to biological signaling induction and relaxation times. We considered several hypotheses related to signaling and cancer risk, and then performed simulations for conditions where aberrant or adaptive signaling would occur on long-duration space missions. Our results do not support the conventional assumptions of dose, linearity and additivity. A discussion is given of how event-based systems biology models, which focus on biological signaling as the mechanism to propagate damage or adaptation, can be further developed for cancer and CNS space radiation risk projections.

  11. Automatic event detection in low SNR microseismic signals based on multi-scale permutation entropy and a support vector machine

    NASA Astrophysics Data System (ADS)

    Jia, Rui-Sheng; Sun, Hong-Mei; Peng, Yan-Jun; Liang, Yong-Quan; Lu, Xin-Ming

    2017-07-01

    Microseismic monitoring is an effective means of providing early warning of rock or coal dynamical disasters, and its first step is microseismic event detection, although low SNR microseismic signals often cannot be detected effectively by routine methods. To solve this problem, this paper presents a method combining multi-scale permutation entropy and a support vector machine to detect low SNR microseismic events. First, an extraction method for signal features based on multi-scale permutation entropy is proposed by studying the influence of the scale factor on the signal permutation entropy. Second, a detection model for low SNR microseismic events based on the least squares support vector machine is built by performing a multi-scale permutation entropy calculation for the collected vibration signals and constructing a feature vector set of the signals. Finally, a comparative analysis of the microseismic events and noise signals in the experiment proves that the differing characteristics of the two can be fully expressed using multi-scale permutation entropy. The detection model of microseismic events combined with the support vector machine, which has the features of high classification accuracy and a fast real-time algorithm, can meet the requirements of online, real-time extraction of microseismic events.
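
    For concreteness, a minimal Python sketch of the multi-scale permutation entropy feature is given below (a generic implementation under assumed parameter values, not the authors' code): the signal is coarse-grained at each scale factor, embedded into ordinal patterns, and the normalised Shannon entropy of the pattern distribution is the feature; noise-like signals score near 1, regular signals much lower.

      import numpy as np
      from math import factorial

      def permutation_entropy(x, order=3, delay=1):
          """Normalised permutation entropy of a 1-D signal."""
          x = np.asarray(x)
          n = x.size - (order - 1) * delay
          counts = {}
          for i in range(n):
              motif = tuple(np.argsort(x[i:i + order * delay:delay]))
              counts[motif] = counts.get(motif, 0) + 1
          p = np.array(list(counts.values()), dtype=float) / n
          return -np.sum(p * np.log(p)) / np.log(factorial(order))

      def coarse_grain(x, scale):
          """Multi-scale step: average non-overlapping windows of length scale."""
          n = x.size // scale
          return x[:n * scale].reshape(n, scale).mean(axis=1)

      rng = np.random.default_rng(0)
      noise = rng.standard_normal(2048)        # noise-like: entropy near 1
      tone = np.sin(0.2 * np.arange(2048))     # regular: much lower entropy
      for sig in (noise, tone):
          print([round(permutation_entropy(coarse_grain(sig, s)), 3)
                 for s in (1, 2, 4, 8)])       # feature vector over scales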

  12. AER synthetic generation in hardware for bio-inspired spiking systems

    NASA Astrophysics Data System (ADS)

    Linares-Barranco, Alejandro; Linares-Barranco, Bernabe; Jimenez-Moreno, Gabriel; Civit-Balcells, Anton

    2005-06-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows for real-time virtual massive connectivity between huge numbers of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Neurons generate 'events' according to their activity levels: more active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip, multi-layered AER systems it is absolutely necessary to have a computer interface that allows (a) reading AER interchip traffic into the computer and visualizing it on screen, and (b) converting a conventional frame-based video stream in the computer into AER and injecting it at some point of the AER structure. This is necessary for testing and debugging of complex AER systems. This paper addresses the problem of converting, in a computer, a conventional frame-based video stream into the spike-event-based representation AER. Several software methods for the synthetic generation of AER for bio-inspired systems have been proposed. This paper presents a hardware implementation of one such method, based on Linear-Feedback-Shift-Register (LFSR) pseudo-random number generation. The sequence of events generated by this hardware, which follows a Poisson distribution like a biological neuron, has been reconstructed using two AER integrator cells. The reconstruction error for a set of images that produce different traffic loads of events on the AER bus is used as the evaluation criterion. A VHDL description of the method, which includes the Xilinx PCI Core, has been implemented and tested using a general-purpose PCI-AER board. This PCI-AER board has been developed by the authors and uses a Spartan II 200 FPGA. This system for AER synthetic generation is capable of transforming frames of 64x64 pixels, received through a standard computer PCI bus, at a frame rate of 25 frames per second, producing spike events at a peak rate of 10^7 events per second.
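
    The LFSR idea can be sketched in a few lines of Python (the classic 16-bit Fibonacci configuration with taps 16, 14, 13 and 11; the rate-coding comparison is an illustrative stand-in for the hardware's event generation logic): brighter pixels win the comparison against the pseudo-random stream more often and hence emit more address events.

      def lfsr16(seed=0xACE1):
          """16-bit Fibonacci LFSR, taps 16/14/13/11 (maximal length):
          yields one pseudo-random bit per step, period 2**16 - 1."""
          state = seed
          while True:
              bit = (state ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
              state = (state >> 1) | (bit << 15)
              yield bit

      def random_byte(bits):
          return sum(next(bits) << i for i in range(8))

      # Poisson-like spike generation: a pixel emits an event whenever its
      # intensity exceeds the current pseudo-random byte
      bits = lfsr16()
      intensity = 180                       # pixel value in [0, 255]
      events = sum(intensity > random_byte(bits) for _ in range(10000))
      print(events / 10000)                 # event rate tracks intensity / 256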

  13. Validity of Administrative Data in Identifying Cancer-related Events in Adolescents and Young Adults: A Population-based Study Using the IMPACT Cohort.

    PubMed

    Gupta, Sumit; Nathan, Paul C; Baxter, Nancy N; Lau, Cindy; Daly, Corinne; Pole, Jason D

    2018-06-01

    Despite the importance of estimating population-level cancer outcomes, most registries do not collect critical events such as relapse. Attempts to use health administrative data to identify these events have focused on older adults and have been mostly unsuccessful. We developed and tested administrative data-based algorithms in a population-based cohort of adolescents and young adults with cancer. We identified all Ontario adolescents and young adults 15-21 years old diagnosed with leukemia, lymphoma, sarcoma, or testicular cancer between 1992 and 2012. Chart abstraction determined the end of initial treatment (EOIT) date and subsequent cancer-related events (progression, relapse, second cancer). Linkage to population-based administrative databases identified fee and procedure codes indicating cancer treatment or palliative care. Algorithms determining EOIT based on a time interval free of treatment-associated codes, and new cancer-related events based on billing codes, were compared with chart-abstracted data. The cohort comprised 1404 patients. Time periods free of treatment-associated codes did not validly identify EOIT dates; using subsequent codes to identify new cancer events was thus associated with low sensitivity (56.2%). However, using administrative data codes that occurred after the chart-abstracted EOIT date, the first cancer-related event was identified with excellent validity (sensitivity, 87.0%; specificity, 93.3%; positive predictive value, 81.5%; negative predictive value, 95.5%). Although administrative data alone did not validly identify cancer-related events, administrative data in combination with chart-collected EOIT dates were associated with excellent validity. The collection of EOIT dates by cancer registries would significantly expand the potential of administrative data linkage to assess cancer outcomes.

  14. Rapid classification of hippocampal replay content for real-time applications

    PubMed Central

    Liu, Daniel F.; Karlsson, Mattias P.; Frank, Loren M.; Eden, Uri T.

    2016-01-01

    Sharp-wave ripple (SWR) events in the hippocampus replay millisecond-timescale patterns of place cell activity related to the past experience of an animal. Interrupting SWR events leads to learning and memory impairments, but how the specific patterns of place cell spiking seen during SWRs contribute to learning and memory remains unclear. A deeper understanding of this issue will require the ability to manipulate SWR events based on their content. Accurate real-time decoding of SWR replay events requires new algorithms that are able to estimate replay content and the associated uncertainty, along with software and hardware that can execute these algorithms for biological interventions on a millisecond timescale. Here we develop an efficient estimation algorithm to categorize the content of replay from multiunit spiking activity. Specifically, we apply real-time decoding methods to each SWR event and then compute the posterior probability of the replay feature. We illustrate this approach by classifying SWR events from data recorded in the hippocampus of a rat performing a spatial memory task into four categories: whether they represent outbound or inbound trajectories and whether the activity is replayed forward or backward in time. We show that our algorithm can classify the majority of SWR events in a recording epoch within 20 ms of the replay onset with high certainty, which makes the algorithm suitable for a real-time implementation with short latencies to incorporate into content-based feedback experiments. PMID:27535369

  15. Brown Dwarf Microlensing Diagram

    NASA Image and Video Library

    2016-11-10

    For the first time, two space-based telescopes have teamed up with ground-based observatories to observe a microlensing event, a magnification of the light of a distant star due to the gravitational effects of an unseen object in the foreground. In this case, the cause of the microlensing event was a brown dwarf, dubbed OGLE-2015-BLG-1319, orbiting a star. In terms of mass, brown dwarfs fall somewhere between the largest planets and the smallest stars. Curiously, scientists have found that, for stars roughly the mass of our sun, less than 1 percent have a brown dwarf orbiting within 3 AU (1 AU is the distance between Earth and the sun). This newly discovered brown dwarf may fall in that distance range. This microlensing event was observed by ground-based telescopes looking for these uncommon events, and subsequently seen by NASA's Spitzer and Swift space telescopes. As the diagram shows, Spitzer and Swift offer additional vantage points for viewing this chance alignment. While Swift orbits close to Earth and saw (blue diamonds) essentially the same change in light that the ground-based telescopes measured (grey markers), Spitzer's location much farther from Earth gave it a very different perspective on the event (red circles). In particular, Spitzer's vantage point resulted in a time lag in the microlensing event it observed, compared to what was seen by Swift and the ground-based telescopes. This offset allowed astronomers to determine the distance to OGLE-2015-BLG-1319 as well as its mass: around 30-65 times that of Jupiter. http://photojournal.jpl.nasa.gov/catalog/PIA21077

  16. Event-based surveillance in north-western Ethiopia: experience and lessons learnt in the field

    PubMed Central

    Ota, Masaki; Beyene, Belay Bezabih

    2015-01-01

    This study piloted an event-based surveillance system at the health centre (HC) level in Ethiopia. The system collects rumours in the community and registers them in rumour logbooks to record events of disease outbreaks and public health emergencies. Descriptive analysis was conducted on the events captured at the 59 study HCs in the Amhara Region in north-western Ethiopia between October 2013 and November 2014. A total of 126 rumours were registered at two-thirds of the HCs during the study period. The average event reporting time was 3.8 days; the response time of the HCs was 0.6 days, resulting in a total response time of 4.4 days. The most commonly reported rumours were measles-related (n = 90, 71%). These rumours followed a similar pattern to the measles cases reported in the routine surveillance system. The largest proportion of rumours was reported by community members (n = 38, 36%), followed by health post workers (n = 36, 29%), who were normally informed about the rumours by the community members. This surveillance system was established alongside an existing indicator-based surveillance system and was simple to implement. The implementation cost was minimal, requiring only the printing and distribution of rumour logbooks to the HCs and brief orientations for focal persons. In countries where routine surveillance is still weak, an event-based surveillance system similar to this should be considered as a supplementary tool for disease monitoring. PMID:26668763

  17. Data driven analysis of rain events: feature extraction, clustering, microphysical /macro physical relationship

    NASA Astrophysics Data System (ADS)

    Djallel Dilmi, Mohamed; Mallet, Cécile; Barthes, Laurent; Chazottes, Aymeric

    2017-04-01

    The study of rain time series records is mainly carried out using rainfall rate or rain accumulation parameters estimated over a fixed integration time (typically 1 min, 1 hour or 1 day). In this study we use the concept of the rain event. In fact, the discrete and intermittent nature of rain processes makes some features inadequate when defined over a fixed duration: long integration times (hour, day) mix rainy and clear-air periods in the same sample, while short integration times (seconds, minutes) lead to noisy data with a great sensitivity to detector characteristics. Analysis of the whole rain event, instead of individual short samples of fixed duration, clarifies relationships between features, in particular between macrophysical and microphysical ones. This approach suppresses part of the intra-event variability due to measurement uncertainties and allows focusing on physical processes. An algorithm based on a Genetic Algorithm (GA) and Self-Organising Maps (SOM) is developed to obtain a parsimonious characterisation of rain events using a minimal set of variables. The use of a self-organizing map is justified by the fact that it maps a high-dimensional data space onto a two-dimensional space while preserving the initial space topology as much as possible, in an unsupervised way. The obtained SOM provides the dependencies between variables and consequently allows removing redundant variables, leading to a minimal subset of only five features (the event duration, the rain rate peak, the rain event depth, the event rain rate standard deviation and the absolute rain rate variation of order 0.5). To confirm the relevance of the five selected features, the corresponding SOM is analyzed. This analysis clearly shows the existence of relationships between features. It also shows the independence of the inter-event time (IETp) feature and the weak dependence of the dry percentage in event (Dd%e) feature, confirming that a rain time series can be considered as an alternation of independent rain events and no-rain periods. The five selected features are used to perform a hierarchical clustering of the events. The well-known division between stratiform and convective events appears clearly. This classification into two classes is then refined into 5 fairly homogeneous subclasses. The data-driven analysis performed on whole rain events instead of fixed-length samples identifies strong relationships between macrophysical (rain rate based) and microphysical (raindrop based) features. We show that some of the 5 identified subclasses have specific microphysical characteristics. Obtaining information on the microphysical characteristics of rainfall events from rain gauge measurements has many implications for the development of quantitative precipitation estimation (QPE) and for the improvement of rain rate retrieval algorithms in a remote sensing context.

  18. Solar Energetic Particle Warnings from a Coronagraph

    NASA Technical Reports Server (NTRS)

    St Cyr, O. C.; Posner, A.; Burkepile, J. T.

    2017-01-01

    We report here the concept of using near-real-time observations from a coronagraph to provide early warning of a fast coronal mass ejection (CME) and the possible onset of a solar energetic particle (SEP) event. The fast CME of 1 January 2016 and its associated SEP event are cited as an example. The CME was detected by the ground-based K-Cor coronagraph at Mauna Loa Solar Observatory and by the SOHO Large Angle and Spectrometric Coronagraph. The near-real-time availability of the high-cadence K-Cor observations in the low corona leads to an obvious question: why has no one attempted to use a coronagraph as an early warning device for SEP events? The answer is that the low image cadence and long latency of existing spaceborne coronagraphs make them valid for archival studies but typically unsuitable for near-real-time forecasting. The January 2016 event provided favorable CME viewing geometry and demonstrated that the primary component of a prototype ground-based system for SEP warnings is available several hours on most days. We discuss how a conceptual CME-based warning system relates to other techniques, including an estimate of the relative SEP warning times, and how such a system might be realized.

  19. Confidence intervals for the first crossing point of two hazard functions.

    PubMed

    Cheng, Ming-Yen; Qiu, Peihua; Tan, Xianming; Tu, Dongsheng

    2009-12-01

    The phenomenon of crossing hazard rates is common in clinical trials with time-to-event endpoints. Many methods have been proposed for testing equality of hazard functions against a crossing hazards alternative. However, relatively few approaches are available in the literature for point or interval estimation of the crossing time point. The problem of constructing confidence intervals for the first crossing time point of two hazard functions is considered in this paper. After reviewing a recent procedure based on Cox proportional hazard modeling with Box-Cox transformation of the time to event, a nonparametric procedure using the kernel smoothing estimate of the hazard ratio is proposed. The proposed procedure and the one based on Cox proportional hazard modeling with Box-Cox transformation of the time to event are both evaluated by Monte Carlo simulations and applied to two clinical trial datasets.
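
    A point estimate of the first crossing can be sketched as follows (a generic numerical illustration on smoothed hazard estimates; the interval construction evaluated in the paper is omitted): locate the first sign change of the difference of the two hazard curves on a grid and interpolate linearly.

      import numpy as np

      def first_crossing(t_grid, hazard1, hazard2):
          """First time at which two estimated hazard curves cross."""
          d = np.asarray(hazard1) - np.asarray(hazard2)
          idx = np.where(np.sign(d[:-1]) * np.sign(d[1:]) < 0)[0]
          if idx.size == 0:
              return None                   # no crossing on the grid
          i = idx[0]
          frac = d[i] / (d[i] - d[i + 1])   # linear interpolation in the cell
          return t_grid[i] + frac * (t_grid[i + 1] - t_grid[i])

      t = np.linspace(0.0, 5.0, 101)
      h1 = 0.5 + 0.10 * t                   # increasing hazard
      h2 = 1.0 - 0.12 * t                   # decreasing hazard
      print(first_crossing(t, h1, h2))      # crosses near t = 2.27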

  20. Seismic Signatures of Brine Release at Blood Falls, Taylor Glacier, Antarctica

    NASA Astrophysics Data System (ADS)

    Carr, C. G.; Pettit, E. C.; Carmichael, J.

    2017-12-01

    Blood Falls is created by the release of subglacially sourced, iron-rich brine at the surface of Taylor Glacier, McMurdo Dry Valleys, Antarctica. The supraglacial portion of this hydrological feature is episodically active. Englacial liquid brine flow occurs despite ice temperatures of -17°C, and we document supraglacial liquid brine release despite ambient air temperatures averaging -20°C. In this study, we use data from a seismic network and time-lapse cameras, together with publicly available weather station data, to address the questions: what are the characteristics of seismic events that occur during Blood Falls brine release, and how do these compare with seismic events that occur during times of Blood Falls quiescence? How are different processes observable in the time-lapse imagery represented in the seismic record? Time-lapse photography constrains the timing of brine release events during the austral winter of 2014. We use a noise-adaptive digital power detector to identify seismic events and cluster analysis to identify repeating events based on waveform similarity across the network. During the 2014 wintertime brine release, high-energy repeated seismic events occurred proximal to Blood Falls. We investigate the ground motions associated with these clustered events, as well as their spatial distribution. We see evidence of possible tremor during the brine release periods, an indicator of fluid movement. If distinctive seismic signatures are associated with Blood Falls brine release, they could be identified based solely on seismic data, without any aid from time-lapse cameras. Passive seismological monitoring has the benefit of continuity during the polar night and other poor-visibility conditions that make time-lapse imagery unusable.

  1. VLF Observation of Long Ionospheric Recovery Events

    NASA Astrophysics Data System (ADS)

    Cotts, B. R.; Inan, U. S.

    2006-12-01

    On the evening of 20 November 1992, three early/fast events were observed on the great circle path (GCP) from the NAU transmitter in Puerto Rico to Gander (GA), Newfoundland. These events were found to have significantly longer recovery times (up to 20 minutes) than any previously documented events. Typical early/fast events and Lightning-induced Electron Precipitation (LEP) events affect the D-region ionosphere near the night-time VLF reflection height of ~85 km and exhibit recovery to pre-event levels in < 180 seconds [e.g., Sampath et al., 2000]. These lightning-associated long-recovery VLF events resemble the observed long ionospheric recovery of the VLF signature of the 27 December 2004 galactic gamma-ray flare event [Inan et al., 2006], which was interpreted to be due to unusually high electron detachment rates at low (below 40 km) altitudes. The region of the ionosphere affected in these long-recovery VLF events may thus also include the altitude range below 40 km, and may possibly be related to gigantic jets. In this context, preliminary results indicate that lightning-associated VLF long-recovery events appear to be more common in oceanic thunderstorms. In this paper, we present occurrence statistics and other measured properties of VLF long-recovery events observed on sea-based and land-based VLF great circle paths.

  2. Towards a microchannel-based X-ray detector with two-dimensional spatial and time resolution and high dynamic range

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Bernhard W.; Mane, Anil U.; Elam, Jeffrey W.

    X-ray detectors that combine two-dimensional spatial resolution with a high time resolution are needed in numerous applications of synchrotron radiation. Most detectors with this combination of capabilities are based on semiconductor technology and are therefore limited in size. Furthermore, the time resolution is often realised through rapid time-gating of the acquisition, followed by a slower readout. Here, a detector technology is realised based on relatively inexpensive microchannel plates that uses GHz waveform sampling for a millimeter-scale spatial resolution and better than 100 ps time resolution. The technology is capable of continuous streaming of time- and location-tagged events at rates greater than 10^7 events per cm^2. Time-gating can be used for improved dynamic range.

  3. Adaptive Neural Network-Based Event-Triggered Control of Single-Input Single-Output Nonlinear Discrete-Time Systems.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-01-01

    This paper presents a novel adaptive neural network (NN) control of single-input and single-output uncertain nonlinear discrete-time systems under event sampled NN inputs. In this control scheme, the feedback signals are transmitted, and the NN weights are tuned in an aperiodic manner at the event sampled instants. After reviewing the NN approximation property with event sampled inputs, an adaptive state estimator (SE), consisting of linearly parameterized NNs, is utilized to approximate the unknown system dynamics in an event sampled context. The SE is viewed as a model and its approximated dynamics and the state vector, during any two events, are utilized for the event-triggered controller design. An adaptive event-trigger condition is derived by using both the estimated NN weights and a dead-zone operator to determine the event sampling instants. This condition both facilitates the NN approximation and reduces the transmission of feedback signals. The ultimate boundedness of both the NN weight estimation error and the system state vector is demonstrated through the Lyapunov approach. As expected, during an initial online learning phase, events are observed more frequently. Over time with the convergence of the NN weights, the inter-event times increase, thereby lowering the number of triggered events. These claims are illustrated through the simulation results.
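
    The flavour of an event-triggered loop can be conveyed with a toy simulation (a static trigger threshold on a scalar plant; the paper's adaptive condition additionally involves the NN weight estimates and a dead-zone operator): the state is transmitted only when it has drifted sufficiently from the last transmitted value, so transmissions become sparser as the state settles.

      import numpy as np

      def simulate(trigger_threshold=0.05, steps=60, seed=0):
          rng = np.random.default_rng(seed)
          x = 1.0                  # scalar plant state
          x_held = x               # value held by the controller between events
          n_events = 0
          for _ in range(steps):
              if abs(x - x_held) > trigger_threshold:   # event-trigger condition
                  x_held = x                            # transmit and update
                  n_events += 1
              u = -0.5 * x_held                         # control from held state
              x = 0.9 * x + u + 0.01 * rng.standard_normal()
          return n_events

      print(simulate(), "transmissions in 60 steps")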

  4. EventThread: Visual Summarization and Stage Analysis of Event Sequence Data.

    PubMed

    Guo, Shunan; Xu, Ke; Zhao, Rongwen; Gotz, David; Zha, Hongyuan; Cao, Nan

    2018-01-01

    Event sequence data such as electronic health records, a person's academic records, or car service records, are ordered series of events which have occurred over a period of time. Analyzing collections of event sequences can reveal common or semantically important sequential patterns. For example, event sequence analysis might reveal frequently used care plans for treating a disease, typical publishing patterns of professors, and the patterns of service that result in a well-maintained car. It is challenging, however, to visually explore large numbers of event sequences, or sequences with large numbers of event types. Existing methods focus on extracting explicitly matching patterns of events using statistical analysis to create stages of event progression over time. However, these methods fail to capture latent clusters of similar but not identical evolutions of event sequences. In this paper, we introduce a novel visualization system named EventThread which clusters event sequences into threads based on tensor analysis and visualizes the latent stage categories and evolution patterns by interactively grouping the threads by similarity into time-specific clusters. We demonstrate the effectiveness of EventThread through usage scenarios in three different application domains and via interviews with an expert user.

  5. [Columbia Sensor Diagrams]. Revised

    NASA Technical Reports Server (NTRS)

    2003-01-01

    A two dimensional graphical event sequence of the time history of relevant sensor information located in the left wing and wheel well areas of the Space Shuttle Columbia Orbiter is presented. Information contained in this graphical event sequence includes: 1) Sensor location on the orbiter and its associated wire bundle in the X-Y plane; 2) Wire bundle routing; 3) Description of each anomalous sensor event; 4) Time annotation by (a) GMT, (b) time relative to LOS, (c) time history bar, and (d) ground track; and 5) Graphical display of temperature rise (based on delta temperature from the point it is determined to be anomalous).

  6. Microseismic Velocity Imaging of the Fracturing Zone

    NASA Astrophysics Data System (ADS)

    Zhang, H.; Chen, Y.

    2015-12-01

    Hydraulic fracturing of low permeability reservoirs can induce microseismic events during fracture development. For this reason, microseismic monitoring using sensors on the surface or in boreholes has been widely used to delineate fracture spatial distribution and to understand fracturing mechanisms. The stimulated reservoir volume (SRV) is often determined solely from microseismic locations. However, some stages of fracture development are associated with long period long duration events rather than microseismic events. In addition, because microseismic events are inherently weak and different sources of noise exist during monitoring, some microseismic events cannot be detected and thus located. The estimation of the SRV is therefore biased if it is determined by microseismic locations alone. With the existence of fluids and fractures, the seismic velocity of reservoir layers decreases. Based on this fact, we have developed a near real time seismic velocity tomography method to characterize velocity changes associated with the fracturing process. The method is based on a double-difference seismic tomography algorithm that images the fracturing zone where microseismic events occur by using differential arrival times from microseismic event pairs. To take into account the varying data distribution across fracking stages, the method solves the velocity model in the wavelet domain so that different scales of model features can be obtained according to the data distribution. We have applied this real time tomography method to both acoustic emission data from a lab experiment and microseismic data from a downhole microseismic monitoring project for a shale gas hydraulic fracturing treatment. The tomography results from the lab data clearly show the velocity changes associated with different rock fracturing stages. The field data application shows that microseismic events are located in low velocity anomalies. By combining low velocity anomalies with microseismic events, we can better estimate the SRV.
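
    The core quantity in a double-difference scheme is compact enough to state directly. A hedged illustration in Python (variable names hypothetical): the residual for an event pair at a common station is the observed differential arrival time minus the differential time predicted by the current velocity model, and the tomography perturbs the model to drive these residuals toward zero.

        def dd_residual(t_obs_i, t_obs_j, t_calc_i, t_calc_j):
            # Double-difference residual for an event pair at one station:
            # observed differential arrival time minus the model-predicted one.
            return (t_obs_i - t_obs_j) - (t_calc_i - t_calc_j)

        # Two nearby microseismic events recorded at the same geophone (seconds):
        print(dd_residual(2.415, 2.430, 2.410, 2.436))   # +0.011 s misfit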

  7. Managing wildfire events: risk-based decision making among a group of federal fire managers

    Treesearch

    Robyn S. Wilson; Patricia L. Winter; Lynn A. Maguire; Timothy Ascher

    2011-01-01

    Managing wildfire events to achieve multiple management objectives involves a high degree of decision complexity and uncertainty, increasing the likelihood that decisions will be informed by experience-based heuristics triggered by available cues at the time of the decision. The research reported here tests the prevalence of three risk-based biases among 206...

  8. Nonparametric methods for analyzing recurrent gap time data with application to infections after hematopoietic cell transplant.

    PubMed

    Lee, Chi Hyun; Luo, Xianghua; Huang, Chiung-Yu; DeFor, Todd E; Brunstein, Claudio G; Weisdorf, Daniel J

    2016-06-01

    Infection is one of the most common complications after hematopoietic cell transplantation. Many patients experience infectious complications repeatedly after transplant. Existing statistical methods for recurrent gap time data typically assume that patients are enrolled due to the occurrence of an event of interest, and subsequently experience recurrent events of the same type; moreover, for one-sample estimation, the gap times between consecutive events are usually assumed to be identically distributed. Applying these methods to analyze the post-transplant infection data will inevitably lead to incorrect inferential results because the time from transplant to the first infection has a different biological meaning than the gap times between consecutive recurrent infections. Some unbiased yet inefficient methods include univariate survival analysis methods based on data from the first infection or bivariate serial event data methods based on the first and second infections. In this article, we propose a nonparametric estimator of the joint distribution of time from transplant to the first infection and the gap times between consecutive infections. The proposed estimator takes into account the potentially different distributions of the two types of gap times and better uses the recurrent infection data. Asymptotic properties of the proposed estimators are established. © 2015, The International Biometric Society.

  9. Nonparametric methods for analyzing recurrent gap time data with application to infections after hematopoietic cell transplant

    PubMed Central

    Lee, Chi Hyun; Huang, Chiung-Yu; DeFor, Todd E.; Brunstein, Claudio G.; Weisdorf, Daniel J.

    2015-01-01

    Summary Infection is one of the most common complications after hematopoietic cell transplantation. Many patients experience infectious complications repeatedly after transplant. Existing statistical methods for recurrent gap time data typically assume that patients are enrolled due to the occurrence of an event of interest, and subsequently experience recurrent events of the same type; moreover, for one-sample estimation, the gap times between consecutive events are usually assumed to be identically distributed. Applying these methods to analyze the post-transplant infection data will inevitably lead to incorrect inferential results because the time from transplant to the first infection has a different biological meaning than the gap times between consecutive recurrent infections. Some unbiased yet inefficient methods include univariate survival analysis methods based on data from the first infection or bivariate serial event data methods based on the first and second infections. In this paper, we propose a nonparametric estimator of the joint distribution of time from transplant to the first infection and the gap times between consecutive infections. The proposed estimator takes into account the potentially different distributions of the two types of gap times and better uses the recurrent infection data. Asymptotic properties of the proposed estimators are established. PMID:26575402

  10. Central FPGA-based destination and load control in the LHCb MHz event readout

    NASA Astrophysics Data System (ADS)

    Jacobsson, R.

    2012-10-01

    The readout strategy of the LHCb experiment is based on complete event readout at 1 MHz. A set of 320 sub-detector readout boards transmit event fragments at a total rate of 24.6 MHz at a bandwidth usage of up to 70 GB/s over a commercial switching network based on Gigabit Ethernet to a distributed event building and high-level trigger processing farm with 1470 individual multi-core computer nodes. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. An FPGA-based central master module, partly operating at the LHC bunch clock frequency of 40.08 MHz and partly at double that clock speed, is in charge of the entire trigger and readout control from the front-end electronics up to the high-level trigger farm. One FPGA is dedicated to controlling the event fragment packing in the readout boards, assigning the farm node destination for each event, and regulating the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit, allowing load control and trigger rate regulation as a function of the global farm load. It also allows the vital task of fast central monitoring and automatic in-flight recovery of failing nodes while keeping dead-time and event loss at a minimum. This paper demonstrates the strength and suitability of implementing this real-time task for a very large distributed system in an FPGA, where no random delays are introduced and where extreme reliability and accurate event accounting are fundamental requirements. The scheme was in use during the entire commissioning phase of LHCb and has been in faultless operation during the first two years of physics luminosity data taking.

  11. Time does not cause forgetting in short-term serial recall.

    PubMed

    Lewandowsky, Stephan; Duncan, Matthew; Brown, Gordon D A

    2004-10-01

    Time-based theories expect memory performance to decline as the delay between study and recall of an item increases. The assumption of time-based forgetting, central to many models of serial recall, underpins their key behaviors. Here we compare the predictions of time-based and event-based models by simulation and test them in two experiments using a novel manipulation of the delay between study and retrieval. Participants were trained, via corrective feedback, to recall at different speeds, thus varying total recall time from 6 to 10 sec. In the first experiment, participants used the keyboard to enter their responses but had to repeat a word (called the suppressor) aloud during recall to prevent rehearsal. In the second experiment, articulation was again required, but recall was verbal and was paced by the number of repetitions of the suppressor in between retrieval of items. In both experiments, serial position curves for all retrieval speeds overlapped, and output time had little or no effect. Comparative evaluation of a time-based and an event-based model confirmed that these results present a particular challenge to time-based approaches. We conclude that output interference, rather than output time, is critical in serial recall.

  12. Data Sets and Data Services at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2014-12-01

    The Northern California Earthquake Data Center (NCEDC) houses a unique and comprehensive data archive and provides real-time services for a variety of seismological and geophysical data sets that encompass northern and central California. We have over 80 terabytes of continuous and event-based time series data from broadband, short-period, strong motion, and strain sensors as well as continuous and campaign GPS data at both standard and high sample rates in both raw and RINEX format. The Northern California Seismic System (NCSS), operated by UC Berkeley and USGS Menlo Park, has recorded over 890,000 events from 1984 to the present, and the NCEDC provides catalog, parametric information, moment tensors and first motion mechanisms, and time series data for these events. We also host and provide event catalogs, parametric information, and event waveforms for DOE enhanced geothermal system monitoring in northern California and Nevada. The NCEDC provides a variety of ways for users to access these data. The most recent development is web services, which provide interactive, command-line, or program-based workflow access to data. Web services use well-established server and client protocols and RESTful software architecture that allow users to easily submit queries and receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC supports all FDSN-defined web services as well as a number of IRIS-defined and NCEDC-defined services. We also continue to support older email-based and browser-based access to data. NCEDC data and web services can be found at http://www.ncedc.org and http://service.ncedc.org.
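
    As a usage illustration (assuming ObsPy is installed and that its FDSN client includes the NCEDC shortcut; the station and time window below are examples and are not guaranteed to return data), a programmatic waveform query against the NCEDC web services might look like:

        from obspy import UTCDateTime
        from obspy.clients.fdsn import Client

        # NCEDC is one of the FDSN endpoints ObsPy knows by name.
        client = Client("NCEDC")
        t0 = UTCDateTime("2014-08-24T10:20:44")  # 2014 South Napa earthquake
        # Five minutes of vertical broadband data from Berkeley station BKS.
        st = client.get_waveforms(network="BK", station="BKS", location="*",
                                  channel="BHZ", starttime=t0, endtime=t0 + 300)
        print(st)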

  13. Northern California Earthquake Data Center: Data Sets and Data Services

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Allen, R. M.; Zuzlewski, S.

    2015-12-01

    The Northern California Earthquake Data Center (NCEDC) provides a permanent archive and real-time data distribution services for a unique and comprehensive collection of seismological and geophysical data sets encompassing northern and central California. We provide access to over 85 terabytes of continuous and event-based time series data from broadband, short-period, strong motion, and strain sensors as well as continuous and campaign GPS data at both standard and high sample rates. The Northern California Seismic System (NCSS), operated by UC Berkeley and USGS Menlo Park, has recorded over 900,000 events from 1984 to the present, and the NCEDC serves catalog, parametric information, moment tensors and first motion mechanisms, and time series data for these events. We also serve event catalogs, parametric information, and event waveforms for DOE enhanced geothermal system monitoring in northern California and Nevada. The NCEDC provides several ways for users to access these data. The most recent development is web services, which provide interactive, command-line, or program-based workflow access to data. Web services use well-established server and client protocols and RESTful software architecture that allow users to easily submit queries and receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC supports all FDSN-defined web services as well as a number of IRIS-defined and NCEDC-defined services. We also continue to support older email-based and browser-based access to data. NCEDC data and web services can be found at http://www.ncedc.org and http://service.ncedc.org.

  14. Effects of event knowledge in processing verbal arguments

    PubMed Central

    Bicknell, Klinton; Elman, Jeffrey L.; Hare, Mary; McRae, Ken; Kutas, Marta

    2010-01-01

    This research tests whether comprehenders use their knowledge of typical events in real time to process verbal arguments. In self-paced reading and event-related brain potential (ERP) experiments, we used materials in which the likelihood of a specific patient noun (brakes or spelling) depended on the combination of an agent and verb (mechanic checked vs. journalist checked). Reading times were shorter at the word directly following the patient for the congruent than the incongruent items. Differential N400s were found earlier, immediately at the patient. Norming studies ruled out any account of these results based on direct relations between the agent and patient. Thus, comprehenders dynamically combine information about real-world events based on intrasentential agents and verbs, and this combination then rapidly influences online sentence interpretation. PMID:21076629

  15. Issues in Informal Education: Event-Based Science Communication Involving Planetaria and the Internet

    NASA Technical Reports Server (NTRS)

    Adams, Mitzi L.; Gallagher, D. L.; Whitt, A.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing real-time science related events has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases broadcasts accommodate active feedback and questions from Internet participants. Panel participation will be used to communicate the problems and lessons learned from these activities over the last three years.

  16. Projecting Event-Based Analysis Dates in Clinical Trials: An Illustration Based on the International Duration Evaluation of Adjuvant Chemotherapy (IDEA) Collaboration. Projecting analysis dates for the IDEA collaboration.

    PubMed

    Renfro, Lindsay A; Grothey, Axel M; Paul, James; Floriani, Irene; Bonnetain, Franck; Niedzwiecki, Donna; Yamanaka, Takeharu; Souglakos, Ioannis; Yothers, Greg; Sargent, Daniel J

    2014-12-01

    Clinical trials are expensive and lengthy, and the success of a given trial depends on observing a prospectively defined number of patient events required to answer the clinical question. The point at which this analysis time occurs depends on both patient accrual and primary event rates, which typically vary throughout the trial's duration. We demonstrate real-time analysis date projections using data from a collection of six clinical trials that are part of the IDEA collaboration, an international preplanned pooling of data from six trials testing the duration of adjuvant chemotherapy in stage III colon cancer, and we additionally consider the hypothetical impact of one trial's early termination of follow-up. In the absence of outcome data from IDEA, monthly accrual rates for each of the six IDEA trials were used to project subsequent trial-specific accrual, while historical data from similar Adjuvant Colon Cancer Endpoints (ACCENT) Group trials were used to construct a parametric model for IDEA's primary endpoint, disease-free survival, under the same treatment regimen. With this information and using the planned total accrual from each IDEA trial protocol, individual patient accrual and event dates were simulated and the overall IDEA interim and final analysis times projected. Projections were then compared with actual (previously undisclosed) trial-specific event totals at a recent census time for validation. The change in projected final analysis date assuming early termination of follow-up for one IDEA trial was also calculated. Trial-specific predicted event totals were close to the actual number of events per trial for the recent census date at which the number of events per trial was known, with the overall IDEA projected number of events off by only eight patients. Potential early termination of follow-up by one IDEA trial was estimated to postpone the overall IDEA final analysis date by 9 months. Real-time projection of the final analysis time during a trial, or of the overall analysis time during a trial collaborative such as IDEA, has practical implications for trial feasibility when these projections are translated into additional time and resources required.
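
    A stripped-down sketch of the general simulation approach (Poisson accrual plus exponential event times; the rates and event target below are arbitrary placeholders, not IDEA's models, which were built from ACCENT data):

        import numpy as np

        rng = np.random.default_rng(0)

        def project_analysis_time(n_patients, accrual_rate, hazard,
                                  events_needed, n_sims=2000):
            # Simulate patient entry (Poisson accrual, i.e. exponential gaps)
            # and exponential time-to-event; return the median calendar time
            # (months) at which the target event count is reached.
            analysis_times = []
            for _ in range(n_sims):
                entry = np.cumsum(rng.exponential(1.0 / accrual_rate, n_patients))
                event = entry + rng.exponential(1.0 / hazard, n_patients)
                analysis_times.append(np.sort(event)[events_needed - 1])
            return float(np.median(analysis_times))

        # Placeholder rates: 40 patients/month accrual, 1% events per patient-month.
        print(project_analysis_time(n_patients=1000, accrual_rate=40.0,
                                    hazard=0.01, events_needed=300))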

  17. Prescribed journeys through life: Cultural differences in mental time travel between Middle Easterners and Scandinavians.

    PubMed

    Ottsen, Christina Lundsgaard; Berntsen, Dorthe

    2015-12-01

    Mental time travel is the ability to remember past events and imagine future events. Here, 124 Middle Easterners and 128 Scandinavians generated important past and future events. These different societies present a unique opportunity to examine effects of culture. Findings indicate stronger influence of normative schemas and greater use of mental time travel to teach, inform and direct behaviour in the Middle East compared with Scandinavia. The Middle Easterners generated more events that corresponded to their cultural life script and that contained religious words, whereas the Scandinavians reported events with a more positive mood impact. Effects of gender were mainly found in the Middle East. Main effects of time orientation largely replicated recent findings showing that simulation of future and past events are not necessarily parallel processes. In accordance with the notion that future simulations rely on schema-based construction, important future events showed a higher overlap with life script events than past events in both cultures. In general, cross-cultural discrepancies were larger in future compared with past events. Notably, the high focus in the Middle East on sharing future events to give cultural guidance is consistent with the increased adherence to normative scripts found in this culture. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.

  19. Enhanced secondary analysis of survival data: reconstructing the data from published Kaplan-Meier survival curves.

    PubMed

    Guyot, Patricia; Ades, A E; Ouwens, Mario J N M; Welton, Nicky J

    2012-02-01

    The results of randomized controlled trials (RCTs) on time-to-event outcomes are usually reported as the median time to event and the Cox hazard ratio. These do not constitute the sufficient statistics required for meta-analysis or cost-effectiveness analysis, and their use in secondary analyses requires strong assumptions that may not have been adequately tested. In order to enhance the quality of secondary data analyses, we propose a method which derives from the published Kaplan-Meier survival curves a close approximation to the original individual patient time-to-event data from which they were generated. We develop an algorithm that maps from digitised curves back to KM data by finding numerical solutions to the inverted KM equations, using, where available, information on the number of events and numbers at risk. The reproducibility and accuracy of survival probabilities, median survival times and hazard ratios based on reconstructed KM data were assessed by comparing published statistics (survival probabilities, medians and hazard ratios) with statistics based on repeated reconstructions by multiple observers. The validation exercise established that there was no material systematic error and that there was a high degree of reproducibility for all statistics. Accuracy was excellent for survival probabilities and medians; for hazard ratios, reasonable accuracy can only be obtained if at least the numbers at risk or the total number of events are reported. The algorithm is a reliable tool for meta-analysis and cost-effectiveness analyses of RCTs reporting time-to-event data. It is recommended that all RCTs report information on numbers at risk and total number of events alongside KM curves.
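
    One ingredient of such a reconstruction can be shown compactly. Assuming no censoring within the interval (the published method distributes censoring iteratively, which this sketch omits), the event count between two digitised points of the curve follows from inverting the KM step S_end = S_start * (1 - d / n):

        def events_in_interval(n_at_risk, s_start, s_end):
            # Invert one Kaplan-Meier step S_end = S_start * (1 - d / n) to
            # recover the event count d in an interval with no censoring.
            return round(n_at_risk * (1.0 - s_end / s_start))

        # 120 at risk; digitised survival falls from 0.85 to 0.78.
        print(events_in_interval(120, 0.85, 0.78))   # ~10 events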

  20. Adaptively Adjusted Event-Triggering Mechanism on Fault Detection for Networked Control Systems.

    PubMed

    Wang, Yu-Long; Lim, Cheng-Chew; Shi, Peng

    2016-12-08

    This paper studies the problem of adaptively adjusted event-triggering mechanism-based fault detection for a class of discrete-time networked control systems (NCSs) with applications to aircraft dynamics. By taking into account the fault occurrence detection progress and the fault occurrence probability, and introducing an adaptively adjusted event-triggering parameter, a novel event-triggering mechanism is proposed to achieve efficient utilization of the communication network bandwidth. Both the sensor-to-control station and the control station-to-actuator network-induced delays are taken into account. The event-triggered sensor and the event-triggered control station are utilized simultaneously to establish new network-based closed-loop models for the NCS subject to faults. Based on the established models, the event-triggered simultaneous design of the fault detection filter (FDF) and controller is presented. A new algorithm for handling the adaptively adjusted event-triggering parameter is proposed. Performance analysis verifies the effectiveness of the adaptively adjusted event-triggering mechanism and of the simultaneous design of the FDF and controller.

  1. A Markovian event-based framework for stochastic spiking neural networks.

    PubMed

    Touboul, Jonathan D; Faugeras, Olivier D

    2011-11-01

    In spiking neural networks, the information is conveyed by the spike times, which depend on the intrinsic dynamics of each neuron, the input it receives and the connections between neurons. In this article we study the Markovian nature of the sequence of spike times in stochastic neural networks, and in particular the ability to deduce from a spike train the next spike time, and therefore to produce a description of the network activity based only on the spike times, regardless of the membrane potential process. To study this question in a rigorous manner, we introduce and study an event-based description of networks of noisy integrate-and-fire neurons, i.e. one based on the computation of the spike times. We show that the firing times of the neurons in the networks constitute a Markov chain, whose transition probability is related to the probability distribution of the interspike interval of the neurons in the network. Where the Markovian model can be developed, the transition probability is explicitly derived for such classical neural network models as linear integrate-and-fire neurons with excitatory and inhibitory interactions, for different types of synapses, possibly featuring noisy synaptic integration, transmission delays and absolute and relative refractory periods. This covers most of the cases that have been investigated in the event-based description of spiking deterministic neural networks.
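
    The event-based description hinges on computing the next spike time directly rather than integrating the membrane potential. For the noiseless special case of a leaky integrate-and-fire neuron with constant input, this time is closed-form, which the sketch below illustrates (the stochastic, interacting setting of the paper replaces this deterministic time with the interspike-interval distribution):

        import math

        def next_spike_time(v0, i_ext, theta=1.0, tau=20.0):
            # Deterministic LIF with constant input: tau dV/dt = -V + I gives
            # V(t) = I + (v0 - I) exp(-t / tau), so the threshold crossing is
            #   t* = tau * ln((I - v0) / (I - theta)),
            # and no spike ever occurs if I never reaches the threshold theta.
            if i_ext <= theta:
                return math.inf
            return tau * math.log((i_ext - v0) / (i_ext - theta))

        print(next_spike_time(v0=0.0, i_ext=1.5))   # ~21.97 (ms, for tau = 20 ms)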

  2. An Improved Forwarding of Diverse Events with Mobile Sinks in Underwater Wireless Sensor Networks.

    PubMed

    Raza, Waseem; Arshad, Farzana; Ahmed, Imran; Abdul, Wadood; Ghouzali, Sanaa; Niaz, Iftikhar Azim; Javaid, Nadeem

    2016-11-04

    In this paper, a novel routing strategy to address the energy consumption and delay sensitivity issues in deep underwater wireless sensor networks is proposed, named ESDR: Event Segregation based Delay sensitive Routing. In this strategy, sensed events are segregated on the basis of their criticality and are forwarded to their respective destinations based on forwarding functions. These functions depend on different routing metrics: Signal Quality Index, Localization free Signal to Noise Ratio, Energy Cost Function and Depth Dependent Function. Incomparable values of the previously defined forwarding functions cause uneven delays in the forwarding process; the forwarding functions are therefore redefined to ensure comparable values in different depth regions. The packet forwarding strategy is based on the event segregation approach, which forwards one third of the generated events (delay sensitive) to surface sinks, while the remaining two thirds (normal events) are forwarded to mobile sinks. The motion of mobile sinks is influenced by the relative distribution of normal nodes. We have also incorporated two different mobility patterns for mobile sinks, adaptive mobility and uniform mobility; the latter is implemented for collecting the packets generated by the normal nodes. These improvements ensure optimum holding time, uniform delay and in-time reporting of delay sensitive events. The scheme is compared with existing ones and outperforms them in terms of network lifetime, delay and throughput.

  3. Networked event-triggered control: an introduction and research trends

    NASA Astrophysics Data System (ADS)

    Mahmoud, Magdi S.; Sabih, Muhammad

    2014-11-01

    A physical system can be studied as either a continuous-time or a discrete-time system depending upon the control objectives. Discrete-time control systems can be further classified into two categories based on the sampling: (1) time-triggered control systems and (2) event-triggered control systems. Time-triggered systems sample states and calculate controls at every sampling instant in a periodic fashion, even when the states and the calculated control change little. A time-triggered system therefore performs unnecessary data transmission and computation, and is thus inefficient; for networked systems, the periodic transmission of measurement and control signals causes unnecessary network traffic. Event-triggered systems, on the other hand, have the potential to reduce the communication burden in addition to reducing the computation of control signals. This paper provides an up-to-date survey of event-triggered methods for control systems and highlights potential research directions.
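
    The difference is easy to demonstrate with a toy scalar plant (the threshold and gains below are arbitrary): a time-triggered loop transmits at every step, while an event-triggered loop transmits only when the gap between the current and the last transmitted state exceeds a threshold.

        def simulate(event_triggered, delta=0.05, steps=200):
            # Transmission count for the toy loop x(k+1) = 0.95 x(k) + u(k),
            # u = -0.5 * (last transmitted state). Time-triggered sends every
            # step; event-triggered sends only when |x - x_sent| > delta.
            x, x_sent, sent = 1.0, 1.0, 0
            for _ in range(steps):
                if not event_triggered or abs(x - x_sent) > delta:
                    x_sent, sent = x, sent + 1
                x = 0.95 * x - 0.5 * x_sent
            return sent

        print("time-triggered transmissions :", simulate(False))  # 200
        print("event-triggered transmissions:", simulate(True))   # far fewer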

  4. Event-Based Control Strategy for Mobile Robots in Wireless Environments.

    PubMed

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-12-02

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to interchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a simple way. The solution has been checked with classical navigation algorithms, like wall following and obstacle avoidance, using scenarios with a single robot or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution has a higher efficiency in communication resource usage than the classical discrete-time strategy with the same accuracy.

  5. Event-Based Control Strategy for Mobile Robots in Wireless Environments

    PubMed Central

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-01-01

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to interchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a simple way. The solution has been checked with classical navigation algorithms, like wall following and obstacle avoidance, using scenarios with a single robot or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution has a higher efficiency in communication resource usage than the classical discrete-time strategy with the same accuracy. PMID:26633412

  6. Is Time-Based Prospective Remembering Mediated by Self-Initiated Rehearsals? Role of Incidental Cues, Ongoing Activity, Age, and Motivation

    ERIC Educational Resources Information Center

    Kvavilashvili, Lia; Fisher, Laura

    2007-01-01

    The present research examined self-reported rehearsal processes in naturalistic time-based prospective memory tasks (Study 1 and 2) and compared them with the processes in event-based tasks (Study 3). Participants had to remember to phone the experimenter either at a prearranged time (a time-based task) or after receiving a certain text message…

  7. An analysis of the 2016 Hitomi breakup event

    NASA Astrophysics Data System (ADS)

    Flegel, Sven; Bennett, James; Lachut, Michael; Möckel, Marek; Smith, Craig

    2017-04-01

    The breakup of Hitomi (ASTRO-H) on 26 March 2016 is analysed. Debris from the fragmentation is used to estimate the time of the event by propagating backwards and estimating the close approach with the parent object. Based on this method, the breakup event is predicted to have occurred at approximately 01:42 UTC on 26 March 2016. The Gaussian variation of parameters equations based on the instantaneous orbits at the predicted time of the event are solved to gain additional insight into the on-orbit position of Hitomi at the time of the event and to test an alternate approach of determining the event epoch and location. A conjunction analysis is carried out between Hitomi and all catalogued objects which were in orbit around the estimated time of the anomaly. Several debris objects have close approaches with Hitomi; however, there is no evidence to support the breakup being caused by a catalogued object. Debris from both of the largest fragmentation events, the Iridium 33-Cosmos 2251 conjunction in 2009 and the intentional destruction of Fengyun 1C in 2007, is involved in close approaches with Hitomi, indicating the persistent threat these events pose to subsequent space missions. To quantify the magnitude of a potential conjunction, the fragmentation resulting from a collision with the debris is modelled using the EVOLVE-4 breakup model. The debris characteristics are estimated from two-line element data. This analysis is indicative of the threat to space assets that mission planners face due to the growing debris population. The impact of the actual event on the environment is investigated based on the debris associated with Hitomi which is currently contained in the United States Strategic Command's catalogue. A look at the active missions in the orbital vicinity of Hitomi reveals that the Hubble Space Telescope is among the spacecraft which may be immediately affected by the new debris.

  8. Are triggering rates of labquakes universal? Inferring triggering rates from incomplete information

    NASA Astrophysics Data System (ADS)

    Baró, Jordi; Davidsen, Jörn

    2017-12-01

    The acoustic emission activity associated with recent rock fracture experiments under different conditions has indicated that some features of event-event triggering are independent of the details of the experiment and the materials used, and are often even indistinguishable from tectonic earthquakes. While the event-event triggering rates or aftershock rates behave almost identically for all rock fracture experiments at short times, this is not the case at later times. Here, we discuss how these differences can be a consequence of the aftershock identification method used, and show that the true aftershock rates might have two distinct regimes. Specifically, tests on a modified Epidemic-Type Aftershock Sequence model show that the model rates cannot be correctly inferred at late times from temporal information alone if the activity rates or the branching ratio are high. We also discuss the effect of the two distinct regimes in the aftershock rates, as well as the effect of the background rate, on the inter-event time distribution. Our findings should be applicable for inferring event-event triggering rates for many other types of triggering and branching processes as well.
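
    For reference, the conditional intensity of the standard (temporal) ETAS model that such analyses build on combines a constant background rate with a modified Omori kernel for each past event; a direct transcription in Python (parameter values illustrative only):

        import numpy as np

        def etas_rate(t, times, mags, mu=0.1, K=0.05, alpha=1.0,
                      c=0.01, p=1.1, m0=2.0):
            # Conditional intensity of the temporal ETAS model:
            #   lambda(t) = mu + sum_i K * exp(alpha*(m_i - m0)) * (t - t_i + c)^(-p)
            # summed over events with t_i < t (modified Omori kernel).
            past = times < t
            dt = t - times[past]
            return mu + np.sum(K * np.exp(alpha * (mags[past] - m0)) * (dt + c) ** (-p))

        times = np.array([0.0, 0.5, 2.0])   # event times (days), illustrative
        mags = np.array([4.0, 3.0, 3.5])    # magnitudes
        print(etas_rate(2.1, times, mags))  # rate just after the third event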

  9. Full-waveform detection of non-impulsive seismic events based on time-reversal methods

    NASA Astrophysics Data System (ADS)

    Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya

    2017-12-01

    We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar valued function that depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time-windows with different filters. It is expected to have an advantage over traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events and transform-fault earthquakes, in cases where the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template matching methods, as it does not depend on records from previously detected events, and is therefore not limited to events occurring in similar regions and with similar focal mechanisms as those events. The method is not specific to any particular way of calculating the synthetic seismograms, and therefore complicated structural models can be used. This is particularly beneficial for intermediate size events that are registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported. The first study area is between the Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms, occurring in March 2012 and May 2016. The second area of interest is the Gulf of California, where two swarms took place during July and September of 2015. We show that we are able to detect previously unreported non-impulsive events, and recommend that this method be used together with more traditional template matching methods to maximize the number of detected events.
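
    A much-simplified sketch of this style of detection pipeline (plain waveform templates standing in for strain Green's tensors, and an assumed MAD-based noise scaling; all names are hypothetical):

        import numpy as np

        def detection_function(traces, templates):
            # Stacked, normalized cross-correlation across stations: a scalar
            # detection function in the spirit of the time-reversal approach.
            stack = None
            for data, tmpl in zip(traces, templates):
                num = np.correlate(data, tmpl, mode="valid")
                energy = np.convolve(data ** 2, np.ones(len(tmpl)), mode="valid")
                cc = num / (np.linalg.norm(tmpl) * np.sqrt(energy) + 1e-12)
                stack = cc if stack is None else stack + cc
            return stack / len(traces)

        def detect(stack, n_sigma=8.0):
            # Flag samples where the detection function exceeds n_sigma times a
            # MAD-based noise level ("a given value relative to the noise level").
            noise = 1.4826 * np.median(np.abs(stack - np.median(stack)))
            return np.flatnonzero(stack > n_sigma * noise)

        # Synthetic check: bury one template occurrence in noise at sample 3000.
        rng = np.random.default_rng(1)
        tmpl = np.sin(np.linspace(0, 6 * np.pi, 200)) * np.hanning(200)
        data = rng.normal(0.0, 0.5, 5000)
        data[3000:3200] += tmpl
        print(detect(detection_function([data], [tmpl])))   # indices near 3000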

  10. Signal detection on spontaneous reports of adverse events following immunisation: a comparison of the performance of a disproportionality-based algorithm and a time-to-onset-based algorithm

    PubMed Central

    van Holle, Lionel; Bauchau, Vincent

    2014-01-01

    Purpose: Disproportionality methods measure how unexpected the observed number of adverse events is. Time-to-onset (TTO) methods measure how unexpected the TTO distribution of a vaccine-event pair is compared with what is expected from other vaccines and events. Our purpose is to compare the performance associated with each method. Methods: For the disproportionality algorithms, we defined 336 combinations of stratification factors (sex, age, region and year) and threshold values of the multi-item gamma Poisson shrinker (MGPS). For the TTO algorithms, we defined 18 combinations of significance level and time windows. We used spontaneous reports of adverse events recorded for eight vaccines. The vaccine product labels were used as proxies for true safety signals. Algorithms were ranked according to their positive predictive value (PPV) for each vaccine separately; a median rank was attributed to each algorithm across vaccines. Results: The algorithm with the highest median rank was based on TTO with a significance level of 0.01 and a time window of 60 days after immunisation. It had an overall PPV 2.5 times higher than that of the highest-ranked MGPS algorithm (16th rank overall), which was fully stratified and had a threshold value of 0.8. A TTO algorithm with roughly the same sensitivity as the highest-ranked MGPS had better specificity but a longer time-to-detection. Conclusions: Within the scope of this study, the majority of the TTO algorithms presented a higher PPV than any MGPS algorithm. Considering the complementarity of TTO and disproportionality methods, a signal detection strategy combining them merits further investigation. PMID:24038719

  11. Real-Time Global Flood Estimation Using Satellite-Based Precipitation and a Coupled Land Surface and Routing Model

    NASA Technical Reports Server (NTRS)

    Wu, Huan; Adler, Robert F.; Tian, Yudong; Huffman, George J.; Li, Hongyi; Wang, JianJian

    2014-01-01

    A widely used land surface model, the Variable Infiltration Capacity (VIC) model, is coupled with a newly developed hierarchical dominant river tracing-based runoff-routing model to form the Dominant river tracing-Routing Integrated with VIC Environment (DRIVE) model, which serves as the new core of the real-time Global Flood Monitoring System (GFMS). The GFMS uses real-time satellite-based precipitation to derive flood monitoring parameters for the latitude band 50 deg. N - 50 deg. S at relatively high spatial (approximately 12 km) and temporal (3 hourly) resolution. Examples of model results for recent flood events are computed using the real-time GFMS (http://flood.umd.edu). To evaluate the accuracy of the new GFMS, the DRIVE model is run retrospectively for 15 years using both research-quality and real-time satellite precipitation products. Evaluation results are slightly better for the research-quality input and significantly better for longer duration events (3 day events versus 1 day events). Basins with fewer dams tend to provide lower false alarm ratios. For events longer than three days in areas with few dams, the probability of detection is approximately 0.9 and the false alarm ratio is approximately 0.6. In general, these statistical results are better than those of the previous system. Streamflow was evaluated at 1121 river gauges across the quasi-global domain. Validation using real-time precipitation across the tropics (30 deg. S - 30 deg. N) gives positive daily Nash-Sutcliffe Coefficients for 107 out of 375 (28%) stations with a mean of 0.19 and 51% of the same gauges at monthly scale with a mean of 0.33. There were poorer results in higher latitudes, probably due to larger errors in the satellite precipitation input.
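
    The Nash-Sutcliffe coefficient used in the streamflow validation is a standard skill score; for reference:

        import numpy as np

        def nash_sutcliffe(sim, obs):
            # Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
            # 1 is a perfect fit; 0 means no better than the observed mean;
            # negative means worse than the mean.
            sim, obs = np.asarray(sim, float), np.asarray(obs, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        print(nash_sutcliffe(sim=[10.2, 13.1, 9.7], obs=[10.0, 14.0, 9.0]))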

  12. Real-time global flood estimation using satellite-based precipitation and a coupled land surface and routing model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Huan; Adler, Robert F.; Tian, Yudong

    2014-03-01

    A widely used land surface model, the Variable Infiltration Capacity (VIC) model, is coupled with a newly developed hierarchical dominant river tracing-based runoff-routing model to form the Dominant river tracing-Routing Integrated with VIC Environment (DRIVE) model, which serves as the new core of the real-time Global Flood Monitoring System (GFMS). The GFMS uses real-time satellite-based precipitation to derive flood monitoring parameters for the latitude band 50°N–50°S at relatively high spatial (~12 km) and temporal (3 hourly) resolution. Examples of model results for recent flood events are computed using the real-time GFMS (http://flood.umd.edu). To evaluate the accuracy of the new GFMS, the DRIVE model is run retrospectively for 15 years using both research-quality and real-time satellite precipitation products. Evaluation results are slightly better for the research-quality input and significantly better for longer duration events (3 day events versus 1 day events). Basins with fewer dams tend to provide lower false alarm ratios. For events longer than three days in areas with few dams, the probability of detection is ~0.9 and the false alarm ratio is ~0.6. In general, these statistical results are better than those of the previous system. Streamflow was evaluated at 1121 river gauges across the quasi-global domain. Validation using real-time precipitation across the tropics (30°S–30°N) gives positive daily Nash-Sutcliffe Coefficients for 107 out of 375 (28%) stations with a mean of 0.19 and 51% of the same gauges at monthly scale with a mean of 0.33. Finally, there were poorer results in higher latitudes, probably due to larger errors in the satellite precipitation input.

  13. Stress reaction process-based hierarchical recognition algorithm for continuous intrusion events in optical fiber prewarning system

    NASA Astrophysics Data System (ADS)

    Qu, Hongquan; Yuan, Shijiao; Wang, Yanping; Yang, Dan

    2018-04-01

    To improve the recognition performance of an optical fiber prewarning system (OFPS), this study proposed a hierarchical recognition algorithm (HRA). Traditional methods employ a single complex algorithm, with multiple extracted features and complex classifiers, to increase the recognition rate at a considerable cost in recognition speed. In contrast, HRA takes advantage of the continuity of intrusion events, creating a staged recognition flow inspired by stress reaction, and is expected to achieve high recognition accuracy with less time consumption. First, this work analyzed the continuity of intrusion events and then presented the algorithm based on the mechanism of stress reaction. Finally, it verified the time consumption through theoretical analysis and experiments, and the recognition accuracy was obtained through experiments. Experimental results show that the processing speed of HRA is 3.3 times faster than that of a traditional complicated algorithm, with a similar recognition rate of 98%. The study is of great significance for fast intrusion event recognition in OFPS.

  14. Acute stress affects prospective memory functions via associative memory processes.

    PubMed

    Szőllősi, Ágnes; Pajkossy, Péter; Demeter, Gyula; Kéri, Szabolcs; Racsmány, Mihály

    2018-01-01

    Recent findings suggest that acute stress can improve the execution of delayed intentions (prospective memory, PM). However, it is unclear whether this improvement can be explained by altered executive control processes or by altered associative memory functioning. To investigate this issue, we used physical-psychosocial stressors to induce acute stress in laboratory settings. Then participants completed event- and time-based PM tasks requiring the different contribution of control processes and a control task (letter fluency) frequently used to measure executive functions. According to our results, acute stress had no impact on ongoing task performance, time-based PM, and verbal fluency, whereas it enhanced event-based PM as measured by response speed for the prospective cues. Our findings indicate that, here, acute stress did not affect executive control processes. We suggest that stress affected event-based PM via associative memory processes. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Piecing together the puzzle: Improving event content coverage for real-time sub-event detection using adaptive microblog crawling

    PubMed Central

    Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan

    2017-01-01

    In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user generated content from social media. Compared with the traditional news media, social media services, such as Twitter, can provide more complete and timely information about real-world events. However, events are often like a puzzle, and in order to solve the puzzle/understand the event, we must identify all the sub-events or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization typically analyse events based on partial data, as conventional data collection methodologies are unable to collect comprehensive event data. This results in existing systems often being unable to report sub-events in real-time, and often completely missing sub-events or pieces in the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal feature of an expanded set of news-worthy event content. In order to more comprehensively and accurately identify sub-events, this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time, so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, the content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contributes to better event detection by identifying additional valid sub-events. The novel combination of our proposed adaptive crawler and our stream division/recombination technique provides significant gains in event recall (44.44%) and event precision (9.57%). The addition of these sub-events or pieces allows us to get closer to solving the event puzzle. PMID:29107976

  16. Piecing together the puzzle: Improving event content coverage for real-time sub-event detection using adaptive microblog crawling.

    PubMed

    Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan

    2017-01-01

    In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user generated content from social media. Compared with the traditional news media, social media services, such as Twitter, can provide more complete and timely information about real-world events. However, events are often like a puzzle, and in order to solve the puzzle/understand the event, we must identify all the sub-events or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization typically analyse events based on partial data, as conventional data collection methodologies are unable to collect comprehensive event data. This results in existing systems often being unable to report sub-events in real-time, and often completely missing sub-events or pieces in the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal feature of an expanded set of news-worthy event content. In order to more comprehensively and accurately identify sub-events, this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time, so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, the content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contributes to better event detection by identifying additional valid sub-events. The novel combination of our proposed adaptive crawler and our stream division/recombination technique provides significant gains in event recall (44.44%) and event precision (9.57%). The addition of these sub-events or pieces allows us to get closer to solving the event puzzle.

  17. Optimization of high count rate event counting detector with Microchannel Plates and quad Timepix readout

    NASA Astrophysics Data System (ADS)

    Tremsin, A. S.; Vallerga, J. V.; McPhate, J. B.; Siegmund, O. H. W.

    2015-07-01

    Many high resolution event counting devices process one event at a time and cannot register simultaneous events. In this article a frame-based readout event counting detector consisting of a pair of Microchannel Plates and a quad Timepix readout is described. More than 10^4 simultaneous events can be detected with a spatial resolution of 55 μm, while >10^3 simultaneous events can be detected with <10 μm spatial resolution when event centroiding is implemented. The fast readout electronics is capable of processing >1200 frames/sec, while the global count rate of the detector can exceed 5×10^8 particles/s when no timing information on every particle is required. For the first generation Timepix readout, the timing resolution is limited by the Timepix clock to 10-20 ns. Optimization of the MCP gain, rear field voltage and Timepix threshold levels is crucial for the device performance, and that is the main subject of this article. These devices can be very attractive for applications where photon/electron/ion/neutron counting with high spatial and temporal resolution is required, such as energy resolved neutron imaging, Time of Flight experiments in lidar applications, experiments on photoelectron spectroscopy and many others.
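
    Event centroiding is what pushes the resolution below the 55 μm pixel pitch: the sub-pixel event position is the intensity-weighted centroid of the charge-cloud footprint. A minimal sketch (synthetic cluster, not detector data):

        import numpy as np

        def centroid(cluster):
            # Intensity-weighted centroid of a pixel cluster (2-D array of
            # per-pixel charge / time-over-threshold values); sub-pixel
            # precision is what improves on the raw 55 um pixel pitch.
            total = cluster.sum()
            ys, xs = np.indices(cluster.shape)
            return (xs * cluster).sum() / total, (ys * cluster).sum() / total

        event = np.array([[0.0, 3.0, 1.0],
                          [2.0, 40.0, 9.0],
                          [1.0, 6.0, 2.0]])   # synthetic 3x3 charge cloud
        print(centroid(event))                 # (x, y) in pixel units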

  18. Proactive Control Processes in Event-Based Prospective Memory: Evidence from Intraindividual Variability and Ex-Gaussian Analyses

    ERIC Educational Resources Information Center

    Ball, B. Hunter; Brewer, Gene A.

    2018-01-01

    The present study implemented an individual differences approach in conjunction with response time (RT) variability and distribution modeling techniques to better characterize the cognitive control dynamics underlying ongoing task cost (i.e., slowing) and cue detection in event-based prospective memory (PM). Three experiments assessed the relation…

  19. Time Warp Operating System (TWOS)

    NASA Technical Reports Server (NTRS)

    Bellenot, Steven F.

    1993-01-01

    Designed to support parallel discrete-event simulation, TWOS is a complete implementation of the Time Warp mechanism, a distributed protocol for virtual time synchronization based on process rollback and message annihilation.

  20. USGS Tweet Earthquake Dispatch (@USGSted): Using Twitter for Earthquake Detection and Characterization

    NASA Astrophysics Data System (ADS)

    Liu, S. B.; Bouchard, B.; Bowden, D. C.; Guy, M.; Earle, P.

    2012-12-01

    The U.S. Geological Survey (USGS) is investigating how online social networking services like Twitter—a microblogging service for sending and reading public text-based messages of up to 140 characters—can augment USGS earthquake response products and the delivery of hazard information. The USGS Tweet Earthquake Dispatch (TED) system is using Twitter not only to broadcast seismically-verified earthquake alerts via the @USGSted and @USGSbigquakes Twitter accounts, but also to rapidly detect widely felt seismic events through a real-time detection system. The detector algorithm scans for significant increases in tweets containing the word "earthquake" or its equivalent in other languages and sends internal alerts with the detection time, tweet text, and the location of the city where most of the tweets originated. It has been running in real-time for 7 months and finds, on average, two or three felt events per day with a false detection rate of less than 10%. The detections have reasonable coverage of populated areas globally. The number of detections is small compared to the number of earthquakes detected seismically, and only a rough location and qualitative assessment of shaking can be determined based on Tweet data alone. However, the Twitter detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The main benefit of the tweet-based detections is speed, with most detections occurring between 19 seconds and 2 minutes from the origin time. This is considerably faster than seismic detections in poorly instrumented regions of the world. Going beyond the initial detection, the USGS is developing data mining techniques to continuously archive and analyze relevant tweets for additional details about the detected events. The information generated about an event is displayed on a web-based map designed using HTML5 for the mobile environment, which can be valuable when the user is not able to access a desktop computer at the time of the detections. The continuously updating map displays geolocated tweets arriving after the detection and plots epicenters of recent earthquakes. When available, seismograms from nearby stations are displayed as an additional form of verification. A time series of tweets-per-minute is also shown to illustrate the volume of tweets being generated for the detected event. Future additions are being investigated to provide a more in-depth characterization of the seismic events based on an analysis of tweet text and content from other social media sources.
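
    A toy version of such a rate-spike detector (window lengths and thresholds below are illustrative placeholders, not the USGS system's actual parameters) compares a short-term average of keyword tweet counts against the long-term background:

        from collections import deque

        def make_detector(short_win=60, long_win=3600, ratio=8.0, min_rate=10.0):
            # Streaming detector over per-second counts of "earthquake" tweets:
            # flag when the short-term average rate exceeds `ratio` times the
            # long-term background and a minimum absolute rate.
            short_q, long_q = deque(maxlen=short_win), deque(maxlen=long_win)
            def step(count):
                short_q.append(count)
                long_q.append(count)
                short_avg = sum(short_q) / len(short_q)
                long_avg = sum(long_q) / len(long_q)
                return short_avg > min_rate and short_avg > ratio * max(long_avg, 1e-9)
            return step

        detector = make_detector()
        stream = [1] * 4000 + [60] * 90       # quiet background, then a felt event
        alerts = [i for i, c in enumerate(stream) if detector(c)]
        print("first alert at second:", alerts[0] if alerts else None)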

  1. Long-term stormwater quantity and quality analysis using continuous measurements in a French urban catchment.

    PubMed

    Sun, Siao; Barraud, Sylvie; Castebrunet, Hélène; Aubin, Jean-Baptiste; Marmonier, Pierre

    2015-11-15

    The assessment of urban stormwater quantity and quality is important for evaluating and controlling the impact of the stormwater to natural water and environment. This study mainly addresses long-term evolution of stormwater quantity and quality in a French urban catchment using continuous measured data from 2004 to 2011. Storm event-based data series are obtained (716 rainfall events and 521 runoff events are available) from measured continuous time series. The Mann-Kendall test is applied to these event-based data series for trend detection. A lack of trend is found in rainfall and an increasing trend in runoff is detected. As a result, an increasing trend is present in the runoff coefficient, likely due to growing imperviousness of the catchment caused by urbanization. The event mean concentration of the total suspended solid (TSS) in stormwater does not present a trend, whereas the event load of TSS has an increasing tendency, which is attributed to the increasing event runoff volume. Uncertainty analysis suggests that the major uncertainty in trend detection results lies in uncertainty due to available data. A lack of events due to missing data leads to dramatically increased uncertainty in trend detection results. In contrast, measurement uncertainty in time series data plays a trivial role. The intra-event distribution of TSS is studied based on both M(V) curves and pollutant concentrations of absolute runoff volumes. The trend detection test reveals no significant change in intra-event distributions of TSS in the studied catchment. Copyright © 2015 Elsevier Ltd. All rights reserved.
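
    The Mann-Kendall test applied to these event-based series is straightforward to reproduce. A minimal sketch (ignoring the tie correction and serial correlation, which a real analysis of event series would need to handle; the runoff values are invented):

        # Minimal Mann-Kendall trend test for an event-based series
        # (no tie correction; two-sided p-value from the normal approximation).
        import math

        def mann_kendall(x):
            n = len(x)
            # S statistic: sum of signs over all ordered pairs
            s = sum((xj > xi) - (xj < xi) for i, xi in enumerate(x) for xj in x[i + 1:])
            var_s = n * (n - 1) * (2 * n + 5) / 18.0  # variance of S without ties
            if s > 0:
                z = (s - 1) / math.sqrt(var_s)
            elif s < 0:
                z = (s + 1) / math.sqrt(var_s)
            else:
                z = 0.0
            # two-sided p-value, Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
            p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
            return s, z, p

        # Example: event runoff volumes with a mild upward tendency.
        volumes = [1.0, 1.3, 0.9, 1.6, 1.8, 1.5, 2.1, 2.4, 2.2, 2.9]
        print(mann_kendall(volumes))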

  2. Context based configuration management system

    NASA Technical Reports Server (NTRS)

    Gurram, Mohana M. (Inventor); Maluf, David A. (Inventor); Mederos, Luis A. (Inventor); Gawdiak, Yuri O. (Inventor)

    2010-01-01

    A computer-based system for configuring and displaying information on changes in, and present status of, a collection of events associated with a project. Classes of icons for decision events, configurations and feedback mechanisms, and time lines (sequential and/or simultaneous) for related events are displayed. Metadata for each icon in each class is displayed by choosing and activating the corresponding icon. Access control (viewing, reading, writing, editing, deleting, etc.) is optionally imposed for metadata and other displayed information.

  3. Influence of time and length size feature selections for human activity sequences recognition.

    PubMed

    Fang, Hongqing; Chen, Long; Srinivasan, Raghavendiran

    2014-01-01

    In this paper, the Viterbi algorithm based on a hidden Markov model is applied to recognize activity sequences from observed sensor events. Alternative feature selections of time feature values of sensor events and of activity length size feature values are tested, respectively, and the resulting activity sequence recognition performances of the Viterbi algorithm are evaluated. The results show that selecting larger time feature values of sensor events and/or smaller activity length size feature values generates relatively better activity sequence recognition performance. © 2013 ISA. Published by ISA. All rights reserved.
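
    For readers unfamiliar with the decoding step, a compact Viterbi implementation over a toy HMM; the states, observations and probabilities below are invented for illustration, not taken from the paper's smart-home data.

        # Toy Viterbi decoder: most likely activity sequence given sensor events.
        # States and probabilities are illustrative, not the paper's trained model.
        def viterbi(obs, states, start_p, trans_p, emit_p):
            # V[t][s] = probability of the best path ending in state s at time t
            V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
            back = [{}]
            for t in range(1, len(obs)):
                V.append({})
                back.append({})
                for s in states:
                    prev, prob = max(
                        ((r, V[t - 1][r] * trans_p[r][s] * emit_p[s][obs[t]])
                         for r in states), key=lambda kv: kv[1])
                    V[t][s], back[t][s] = prob, prev
            last = max(V[-1], key=V[-1].get)
            path = [last]
            for t in range(len(obs) - 1, 0, -1):   # backtrack
                path.append(back[t][path[-1]])
            return list(reversed(path))

        states = ("Sleep", "Cook")
        obs = ("bed_motion", "kitchen_door", "stove_on")
        start = {"Sleep": 0.7, "Cook": 0.3}
        trans = {"Sleep": {"Sleep": 0.8, "Cook": 0.2},
                 "Cook": {"Sleep": 0.3, "Cook": 0.7}}
        emit = {"Sleep": {"bed_motion": 0.8, "kitchen_door": 0.15, "stove_on": 0.05},
                "Cook": {"bed_motion": 0.1, "kitchen_door": 0.4, "stove_on": 0.5}}
        print(viterbi(obs, states, start, trans, emit))  # ['Sleep', 'Cook', 'Cook']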

  4. Dynamically adaptive data-driven simulation of extreme hydrological flows

    NASA Astrophysics Data System (ADS)

    Kumar Jain, Pushkar; Mandli, Kyle; Hoteit, Ibrahim; Knio, Omar; Dawson, Clint

    2018-02-01

    Hydrological hazards such as storm surges, tsunamis, and rainfall-induced flooding are physically complex events that are costly in loss of human life and economic productivity. Many such disasters could be mitigated through improved emergency evacuation in real-time and through the development of resilient infrastructure based on knowledge of how systems respond to extreme events. Data-driven computational modeling is a critical technology underpinning these efforts. This investigation focuses on the novel combination of methodologies in forward simulation and data assimilation. The forward geophysical model utilizes adaptive mesh refinement (AMR), a process by which a computational mesh can adapt in time and space based on the current state of a simulation. The forward solution is combined with ensemble based data assimilation methods, whereby observations from an event are assimilated into the forward simulation to improve the veracity of the solution, or used to invert for uncertain physical parameters. The novelty in our approach is the tight two-way coupling of AMR and ensemble filtering techniques. The technology is tested using actual data from the Chile tsunami event of February 27, 2010. These advances offer the promise of significantly transforming data-driven, real-time modeling of hydrological hazards, with potentially broader applications in other science domains.

  5. Demonstrating the Value of Near Real-time Satellite-based Earth Observations in a Research and Education Framework

    NASA Astrophysics Data System (ADS)

    Chiu, L.; Hao, X.; Kinter, J. L.; Stearn, G.; Aliani, M.

    2017-12-01

    The launch of the GOES-16 series provides an opportunity to advance near real-time applications in natural hazard detection, monitoring and warning. This study demonstrates the capability and value of receiving real-time satellite-based Earth observations over fast terrestrial networks and processing high-resolution remote sensing data in a university environment. The demonstration system includes four components: 1) near real-time data receiving and processing; 2) data analysis and visualization; 3) event detection and monitoring; and 4) information dissemination. Various tools are developed and integrated to receive and process GRB data in near real-time, produce images and value-added data products, and detect and monitor extreme weather events such as hurricanes, fires, flooding, fog and lightning. A web-based application system is developed to disseminate near real-time satellite images and data products. The images are generated in a GIS-compatible format (GeoTIFF) to enable convenient use and integration in various GIS platforms. This study enhances the capacity for undergraduate and graduate education in Earth system and climate sciences and related applications, helping students understand the basic principles and technology of real-time applications with remote sensing measurements. It also provides an integrated platform for near real-time monitoring of extreme weather events, which is helpful for various user communities.

  6. Comparison of design strategies for a three-arm clinical trial with time-to-event endpoint: Power, time-to-analysis, and operational aspects.

    PubMed

    Asikanius, Elina; Rufibach, Kaspar; Bahlo, Jasmin; Bieska, Gabriele; Burger, Hans Ulrich

    2016-11-01

    To optimize resources, randomized clinical trials with multiple arms can be an attractive option to simultaneously test various treatment regimens in pharmaceutical drug development. The motivation for this work was the successful conduct and positive final outcome of a three-arm randomized clinical trial primarily assessing whether obinutuzumab plus chlorambucil in patients with chronic lymphocytic leukemia and coexisting conditions is superior to chlorambucil alone based on a time-to-event endpoint. The inference strategy of this trial was based on a closed testing procedure. We compare this strategy to three potential alternatives to run a three-arm clinical trial with a time-to-event endpoint. The primary goal is to quantify the differences between these strategies in terms of the time it takes until the first analysis and thus potential approval of a new drug, number of required events, and power. Operational aspects of implementing the various strategies are discussed. In conclusion, using a closed testing procedure results in the shortest time to the first analysis with a minimal loss in power. Therefore, closed testing procedures should be part of the statistician's standard clinical trials toolbox when planning multiarm clinical trials. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Fluctuations in Wikipedia access-rate and edit-event data

    NASA Astrophysics Data System (ADS)

    Kämpf, Mirko; Tismer, Sebastian; Kantelhardt, Jan W.; Muchnik, Lev

    2012-12-01

    Internet-based social networks often reflect extreme events in nature and society by drastic increases in user activity. We study and compare the dynamics of the two major complex processes necessary for information spread via the online encyclopedia ‘Wikipedia’, i.e., article editing (information upload) and article access (information viewing) based on article edit-event time series and (hourly) user access-rate time series for all articles. Daily and weekly activity patterns occur in addition to fluctuations and bursting activity. The bursts (i.e., significant increases in activity for an extended period of time) are characterized by a power-law distribution of durations of increases and decreases. For describing the recurrence and clustering of bursts we investigate the statistics of the return intervals between them. We find stretched exponential distributions of return intervals in access-rate time series, while edit-event time series yield simple exponential distributions. To characterize the fluctuation behavior we apply detrended fluctuation analysis (DFA), finding that most article access-rate time series are characterized by strong long-term correlations with fluctuation exponents α≈0.9. The results indicate significant differences in the dynamics of information upload and access and help in understanding the complex process of collecting, processing, validating, and distributing information in self-organized social networks.
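
    The detrended fluctuation analysis used here reduces to a few numpy operations. A minimal first-order DFA sketch (the box sizes and synthetic input are illustrative; for uncorrelated noise the fluctuation exponent comes out near 0.5, versus the α≈0.9 reported above for access-rate series):

        # Minimal first-order DFA: slope alpha of log F(s) versus log s.
        import numpy as np

        def dfa(x, scales):
            y = np.cumsum(x - np.mean(x))      # integrated, mean-removed profile
            F = []
            for s in scales:
                n_seg = len(y) // s
                rms = []
                for k in range(n_seg):
                    seg = y[k * s:(k + 1) * s]
                    t = np.arange(s)
                    coef = np.polyfit(t, seg, 1)   # local linear trend
                    rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
                F.append(np.mean(rms))
            alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
            return alpha

        rng = np.random.default_rng(0)
        white = rng.standard_normal(10000)
        print(dfa(white, [16, 32, 64, 128, 256]))  # ~0.5 for uncorrelated noise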

  8. Attrition in NRG Oncology's Radiation-Based Clinical Trials.

    PubMed

    Ulrich, Connie M; Deshmukh, Snehal; Pugh, Stephanie L; Hanlon, Alexandra; Grady, Christine; Watkins Bruner, Deborah; Curran, Walter

    2018-05-10

    To determine individual, organizational, and protocol-specific factors associated with attrition in NRG Oncology's radiation-based clinical trials. This retrospective analysis included 27,443 patients representing 134 NRG Oncology radiation-based clinical trials with primary efficacy results published from 1985 to 2011. Trials were separated on the basis of the primary endpoint (fixed-time vs event-driven). The cumulative incidence approach was used to estimate time to attrition, and cause-specific Cox proportional hazards models were used to assess factors associated with attrition. Most patients (69%) were enrolled in an event-driven trial (n = 18,809), while 31% were enrolled in a fixed-time trial (n = 8634). Median follow-up time was 4.1 months for patients enrolled in fixed-time trials and 37.2 months for patients enrolled in event-driven trials. Fixed-time trials with a duration < 6 months had a 5-month attrition rate of 4.3% (95% confidence interval [CI]: 3.4%, 5.5%) and those with a duration ≥ 6 months had a 1-year attrition rate of 1.6% (95% CI: 1.2%, 2.1%). Event-driven trials had 1- and 5-year attrition rates of 0.5% (95% CI: 0.4%, 0.6%) and 13.6% (95% CI: 13.1%, 14.1%), respectively. Younger age, female gender, and Zubrod performance status >0 were associated with greater attrition, as were enrollment by institutions in the West and South regions and participation in fixed-time trials. Attrition in clinical trials can have a negative effect on trial outcomes. Data on factors associated with attrition can help guide the development of strategies to enhance retention. These strategies should focus on patient characteristics associated with attrition in both fixed-time and event-driven trials as well as in differing geographic regions of the country. Copyright © 2018. Published by Elsevier Inc.

  9. Swarming Reconnaissance Using Unmanned Aerial Vehicles in a Parallel Discrete Event Simulation

    DTIC Science & Technology

    2004-03-01

    The indexed excerpt consists of table-of-contents and acronym-list fragments covering Data Distribution Management (DDM) and the Breathing Time Warp (BTW) algorithm with rollback, plus a partial passage noting that data proxies/distribution management is the vital portion of the SPEEDES implementation that allows objects …

  10. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete event networks for robotic systems. Following the hierarchical approach, the Petri net is analysed as a network spanning the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for the modelling and control of complex robotic systems. Such a system is structured, controlled and analysed here using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementation of the hierarchical discrete event system model as a real-time operating system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models are simple enough to run on general-purpose computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets; discrete event systems are a pragmatic tool for modelling industrial systems. To highlight auxiliary times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. The proposed simulation of the robotic system using timed Petri nets offers the opportunity to view the timing of robotic operations; from the transport and transmission times obtained by spot measurement, graphics are produced showing the average time for the transport activity for each set of finished products.

  11. A networks-based discrete dynamic systems approach to volcanic seismicity

    NASA Astrophysics Data System (ADS)

    Suteanu, Mirela

    2013-04-01

    The detection and relevant description of pattern change concerning earthquake events is an important, but challenging task. In this paper, earthquake events related to volcanic activity are considered manifestations of a dynamic system evolving over time. The system dynamics is seen as a succession of events with point-like appearance both in time and in space. Each event is characterized by a position in three-dimensional space, a moment of occurrence, and an event size (magnitude). A weighted directed network is constructed to capture the effects of earthquakes on subsequent events. Each seismic event represents a node. Relations among events represent edges. Edge directions are given by the temporal succession of the events. Edges are also characterized by weights reflecting the strengths of the relation between the nodes. Weights are calculated as a function of (i) the time interval separating the two events, (ii) the spatial distance between the events, (iii) the magnitude of the earliest event among the two. Different ways of addressing weight components are explored, and their implications for the properties of the produced networks are analyzed. The resulting networks are then characterized in terms of degree- and weight distributions. Subsequently, the distribution of system transitions is determined for all the edges connecting related events in the network. Two- and three-dimensional diagrams are constructed to reflect transition distributions for each set of events. Networks are thus generated for successive temporal windows of different size, and the evolution of (a) network properties and (b) system transition distributions are followed over time and compared to the timeline of documented geologic processes. Applications concerning volcanic seismicity on the Big Island of Hawaii show that this approach is capable of revealing novel aspects of change occurring in the volcanic system on different scales in time and in space.

  12. Event- and interval-based measurement of stuttering: a review.

    PubMed

    Valente, Ana Rita S; Jesus, Luis M T; Hall, Andreia; Leahy, Margaret

    2015-01-01

    Event- and interval-based measurements are two different ways of computing the frequency of stuttering. Interval-based methodology emerged as an alternative measure to overcome problems associated with reproducibility in the event-based methodology. No review has been made to study the effect of methodological factors on interval-based absolute reliability data or to compute the agreement between the two methodologies in terms of inter-judge and intra-judge reliability and accuracy (i.e., correspondence between raters' scores and an established criterion). The aims were to review the reproducibility of event-based and time-interval measurement; to verify the effect of methodological factors (training, experience, interval duration, sample presentation order and judgment conditions) on agreement of time-interval measurement; and to determine whether it is possible to quantify the agreement between the two methodologies. The first two authors searched for articles on ERIC, MEDLINE, PubMed, B-on, CENTRAL and Dissertation Abstracts during January-February 2013 and retrieved 495 articles, of which 48 were selected for review. Content tables were constructed with the main findings. Articles related to event-based measurements revealed inter- and intra-judge agreement values greater than 0.70 and agreement percentages beyond 80%. Articles related to time-interval measures revealed that, in general, judges with more experience with stuttering presented significantly higher levels of intra- and inter-judge agreement. Inter- and intra-judge values were beyond the references for high reproducibility for both methodologies. Accuracy (the closeness of raters' judgements to an established criterion) and intra- and inter-judge agreement were higher for trained groups than for non-trained groups. Sample presentation order and audio/video conditions did not result in differences in inter- or intra-judge results. An interval duration of 5 s appears to yield acceptable agreement. Explanations for the high reproducibility values, as well as the parameter choices for reporting those data, are discussed. Both interval- and event-based methodologies used trained or experienced judges for inter- and intra-judge determination, and data were beyond the references for good reproducibility. However, inter- and intra-judge values were reported on different metric scales across event- and interval-based studies, making it unfeasible to quantify the agreement between the two methods. © 2014 Royal College of Speech and Language Therapists.

  13. Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time-to-Event Analysis.

    PubMed

    Gong, Xiajing; Hu, Meng; Zhao, Liang

    2018-05-01

    Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featured by a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high-dimensional data. The prediction performances of ML-based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. © 2018 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
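
    The concordance index used to compare the methods can be computed directly from (time, event) pairs and risk scores. A minimal sketch of Harrell's C (toy data invented; tied event times are simply excluded here):

        # Minimal Harrell's concordance index for right-censored data.
        # A pair (i, j) is comparable if the earlier time is an observed event;
        # it is concordant if the higher-risk subject fails first; risk ties count 0.5.
        def concordance_index(times, events, risks):
            num = den = 0.0
            n = len(times)
            for i in range(n):
                for j in range(n):
                    if times[i] < times[j] and events[i] == 1:  # i fails first
                        den += 1
                        if risks[i] > risks[j]:
                            num += 1
                        elif risks[i] == risks[j]:
                            num += 0.5
            return num / den

        times  = [5, 8, 12, 3, 9]
        events = [1, 0, 1, 1, 1]          # 1 = event observed, 0 = censored
        risks  = [0.9, 0.2, 0.4, 1.1, 0.3]
        print(concordance_index(times, events, risks))  # 0.875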

  14. Complex effusive events at Kilauea as documented by the GOES satellite and remote video cameras

    USGS Publications Warehouse

    Harris, A.J.L.; Thornber, C.R.

    1999-01-01

    GOES provides thermal data for all of the Hawaiian volcanoes once every 15 min. We show how volcanic radiance time series produced from this data stream can be used as a simple measure of effusive activity. Two types of radiance trends in these time series can be used to monitor effusive activity: (a) Gradual variations in radiance reveal steady flow-field extension and tube development. (b) Discrete spikes correlate with short bursts of activity, such as lava fountaining or lava-lake overflows. We are confident that any effusive event covering more than 10,000 m2 of ground in less than 60 min will be unambiguously detectable using this approach. We demonstrate this capability using GOES, video camera and ground-based observational data for the current eruption of Kilauea volcano (Hawai'i). A GOES radiance time series was constructed from 3987 images between 19 June and 12 August 1997. This time series displayed 24 radiance spikes elevated more than two standard deviations above the mean; 19 of these are correlated with video-recorded short-burst effusive events. Less ambiguous events are interpreted, assessed and related to specific volcanic events by simultaneous use of permanently recording video camera data and ground-observer reports. The GOES radiance time series are automatically processed on data reception and made available in near-real-time, so such time series can contribute to three main monitoring functions: (a) automatically alerting major effusive events; (b) event confirmation and assessment; and (c) establishing effusive event chronology.
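
    The spike criterion in this record (radiances more than two standard deviations above the series mean) is simple to reproduce; a sketch with synthetic radiances standing in for the GOES time series:

        # Flag radiance-time-series spikes more than 2 standard deviations
        # above the series mean (the criterion described in the record).
        import numpy as np

        def radiance_spikes(radiance, n_sigma=2.0):
            r = np.asarray(radiance, dtype=float)
            return np.flatnonzero(r > r.mean() + n_sigma * r.std())

        rng = np.random.default_rng(1)
        series = rng.normal(100.0, 5.0, 500)
        series[[120, 340]] += 60.0       # two synthetic effusive bursts
        print(radiance_spikes(series))   # -> indices 120 and 340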

  15. Projections of hepatitis A virus infection associated with flood events by 2020 and 2030 in Anhui Province, China

    NASA Astrophysics Data System (ADS)

    Gao, Lu; Zhang, Ying; Ding, Guoyong; Liu, Qiyong; Wang, Changke; Jiang, Baofa

    2016-12-01

    Assessing and responding to the health risks of climate change is important because of its impact on natural and societal ecosystems. More frequent and severe flood events will occur in China due to climate change. Given that the population is projected to increase, more people will be vulnerable to flood events, which may lead to an increased incidence of HAV infection in the future. This population-based study projects the future health burden of HAV infection associated with flood events in the Huai River Basin of China. The study area covered four cities of Anhui province in China, where flood events are frequent. A time-series-adjusted Poisson regression model was developed to quantify the risks of flood events for HAV infection based on the number of daily cases during summer seasons from 2005 to 2010, controlling for other meteorological variables. Projections of HAV infection in 2020 and 2030 were estimated based on scenarios of flood events and demographic data. The Poisson regression model suggested that, compared with periods without flood events, the risk of severe flood events for HAV infection was significant (OR = 1.28, 95 % CI 1.05-1.55), while risks were not significant for moderate flood events (OR = 1.16, 95 % CI 0.72-1.87) or mild flood events (OR = 1.14, 95 % CI 0.87-1.48). Using the 2010 baseline data and the flood event scenarios (one severe flood event), the increased incidence of HAV infection was estimated to be between 0.126/10⁵ and 0.127/10⁵ for 2020. Similarly, the increased HAV infection incidence for 2030 was projected to be between 0.382/10⁵ and 0.399/10⁵. Our study has, for the first time, quantified the increased incidence of HAV infection that will result from flood events in Anhui, China, in 2020 and 2030. The results have implications for public health preparation in developing responses to reduce HAV infection during future flood events.
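
    A time-series-adjusted Poisson regression of this kind can be sketched with statsmodels; the flood indicator and temperature covariate below are invented stand-ins for the study's variables, and the reported risk estimate corresponds to the exponentiated flood coefficient:

        # Sketch of a Poisson regression of daily HAV case counts on a flood
        # indicator, adjusted for a meteorological covariate. Data are simulated
        # stand-ins for the study's variables; exp(coef) gives the risk ratio.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 365
        flood = (rng.random(n) < 0.05).astype(float)   # rare flood days
        temperature = rng.normal(25.0, 4.0, n)
        lam = np.exp(0.1 + 0.25 * flood + 0.01 * temperature)
        cases = rng.poisson(lam)

        X = sm.add_constant(np.column_stack([flood, temperature]))
        fit = sm.GLM(cases, X, family=sm.families.Poisson()).fit()
        print(np.exp(fit.params[1]))      # flood risk ratio; true value exp(0.25) ~ 1.28
        print(np.exp(fit.conf_int()[1]))  # its 95% confidence interval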

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Jiang, Huaiguang; Tan, Jin

    This paper proposes an event-driven approach for reconfiguring distribution systems automatically. Specifically, an optimal synchrophasor sensor placement (OSSP) is used to reduce the number of synchrophasor sensors while keeping the whole system observable. Then, a wavelet-based event detection and location approach is used to detect and locate the event, which serves as a trigger for network reconfiguration. With the detected information, the system is then reconfigured using the hierarchical decentralized approach to seek the new optimal topology. In this manner, whenever an event happens, the distribution network can be reconfigured automatically based on the real-time information that is observable and detectable.

  17. Low-Latency Line Tracking Using Event-Based Dynamic Vision Sensors

    PubMed Central

    Everding, Lukas; Conradt, Jörg

    2018-01-01

    In order to safely navigate and orient in their local surroundings, autonomous systems need to rapidly extract and persistently track visual features from the environment. While there are many algorithms tackling those tasks for traditional frame-based cameras, these have to deal with the fact that conventional cameras sample their environment with a fixed frequency: most prominently, the same features have to be found in consecutive frames, and corresponding features then need to be matched using elaborate techniques, as any information between the two frames is lost. We introduce a novel method to detect and track line structures in data streams of event-based silicon retinae [also known as dynamic vision sensors (DVS)]. In contrast to conventional cameras, these biologically inspired sensors generate a quasicontinuous stream of vision information analogous to the information stream created by the ganglion cells in mammalian retinae. All pixels of a DVS operate asynchronously without a periodic sampling rate and emit a so-called DVS address event as soon as they perceive a luminance change exceeding an adjustable threshold. We use the high temporal resolution achieved by the DVS to track features continuously through time instead of only at fixed points in time. The focus of this work lies on tracking lines in a mostly static environment observed by a moving camera, a typical setting in mobile robotics. Since DVS events are mostly generated at object boundaries and edges, which in man-made environments often form lines, lines were chosen as the feature to track. Our method is based on detecting planes of DVS address events in x-y-t-space and tracing these planes through time. It is robust against noise and runs in real time on a standard computer, hence it is suitable for low-latency robotics. The efficacy and performance are evaluated on real-world data sets showing artificial structures in an office building, using event data for tracking and frame data for ground-truth estimation from a DAVIS240C sensor. PMID:29515386

  18. An event-triggered control approach for the leader-tracking problem with heterogeneous agents

    NASA Astrophysics Data System (ADS)

    Garcia, Eloy; Cao, Yongcan; Casbeer, David W.

    2018-05-01

    This paper presents an event-triggered control and communication framework for the cooperative leader-tracking problem with communication constraints. Continuous communication among agents is not assumed in this work and decentralised event-based strategies are proposed for agents with heterogeneous linear dynamics. Also, the leader dynamics are unknown and only intermittent measurements of its states are obtained by a subset of the followers. The event-based method not only represents a way to restrict communication among agents, but it also provides a decentralised scheme for scheduling information broadcasts. Notably, each agent is able to determine its own broadcasting instants independently of any other agent in the network. In an extension, the case where transmission of information is affected by time-varying communication delays is addressed. Finally, positive lower-bounds on the inter-event time intervals are obtained in order to show that Zeno behaviour does not exist and, therefore, continuous exchange of information is never needed in this framework.
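
    A common way to realize the kind of decentralised broadcast scheduling described here is a state-error trigger: an agent rebroadcasts only when the deviation between its true state and the model its neighbours propagate exceeds a threshold. The sketch below is a generic single-agent illustration of that mechanism (scalar dynamics, constant threshold and disturbance all invented), not the paper's specific trigger:

        # Generic event-triggered broadcast: transmit the state only when the
        # error between the true state and the last-broadcast model exceeds a
        # threshold. Dynamics, disturbance and threshold are illustrative.
        import math

        def simulate(a=-0.5, dt=0.05, steps=200, threshold=0.05, x0=1.0):
            x = x_hat = x0                  # true state and broadcast model
            broadcasts = []
            for k in range(steps):
                w = 0.2 * math.sin(0.3 * k)      # disturbance on the true plant
                x += dt * (a * x + w)
                x_hat += dt * a * x_hat          # neighbours' disturbance-free model
                if abs(x - x_hat) > threshold:   # trigger condition
                    x_hat = x                    # broadcast resets the model error
                    broadcasts.append(k)
            return broadcasts

        events = simulate()
        print(len(events), "broadcasts out of 200 steps; inter-event gaps:",
              [j - i for i, j in zip(events, events[1:])])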

  19. Semi-supervised tracking of extreme weather events in global spatio-temporal climate datasets

    NASA Astrophysics Data System (ADS)

    Kim, S. K.; Prabhat, M.; Williams, D. N.

    2017-12-01

    Deep neural networks have been successfully applied to the problem of detecting extreme weather events in large-scale climate datasets, attaining performance that overshadows all previous hand-crafted methods. Recent work has shown that a multichannel spatiotemporal encoder-decoder CNN architecture is able to localize events with semi-supervised bounding boxes. Motivated by this work, we propose a new learning method based on Variational Auto-Encoders (VAE) and Long Short-Term Memory (LSTM) to track extreme weather events in spatio-temporal datasets. We treat spatio-temporal object tracking as learning the probabilistic distribution of the continuous latent features of an auto-encoder using stochastic variational inference. For this, we assume that our data are i.i.d. and that the latent features can be modeled by a Gaussian distribution. In the proposed method, we first train the VAE to generate an approximate posterior given multichannel climate input containing an extreme climate event at a fixed time. Then, we predict the bounding box, location and class of extreme climate events from an input concatenating three features: the embedding, the sampled mean and the standard deviation. Lastly, we train the LSTM on the concatenated input to learn the temporal behavior of the dataset by recurrently feeding the output back into the next time-step's VAE input. Our contribution is two-fold. First, we show the first semi-supervised end-to-end architecture based on a VAE to track extreme weather events, applicable to massive unlabeled climate datasets. Second, the temporal movement of events is taken into account for bounding box prediction using the LSTM, which can improve localization accuracy. To our knowledge, this technique has not been explored in either the climate community or the machine learning community.

  20. SQERTSS: Dynamic rank based throttling of transition probabilities in kinetic Monte Carlo simulations

    DOE PAGES

    Danielson, Thomas; Sutton, Jonathan E.; Hin, Céline; ...

    2017-06-09

    Lattice based Kinetic Monte Carlo (KMC) simulations offer a powerful simulation technique for investigating large reaction networks while retaining spatial configuration information, unlike ordinary differential equations. However, large chemical reaction networks can contain reaction processes with rates spanning multiple orders of magnitude. This can lead to the problem of "KMC stiffness" (similar to stiffness in differential equations), where the computational expense has the potential to be overwhelmed by very short time-steps during KMC simulations, with the simulation spending an inordinate amount of KMC steps/CPU time simulating fast frivolous processes (FFPs) without progressing the system (reaction network). In order to achieve simulation times that are experimentally relevant or desired for predictions, a dynamic throttling algorithm involving separation of the processes into speed-ranks based on event frequencies has been designed and implemented with the intent of decreasing the probability of FFP events, and increasing the probability of slow process events -- allowing rate limiting events to become more likely to be observed in KMC simulations. This Staggered Quasi-Equilibrium Rank-based Throttling for Steady-state (SQERTSS) algorithm is designed for use in achieving and simulating steady-state conditions in KMC simulations. Lastly, as shown in this work, the SQERTSS algorithm also works for transient conditions: the correct configuration space and final state will still be achieved if the required assumptions are not violated, with the caveat that the sizes of the time-steps may be distorted during the transient period.

  1. SQERTSS: Dynamic rank based throttling of transition probabilities in kinetic Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Danielson, Thomas; Sutton, Jonathan E.; Hin, Céline; Savara, Aditya

    2017-10-01

    Lattice based Kinetic Monte Carlo (KMC) simulations offer a powerful simulation technique for investigating large reaction networks while retaining spatial configuration information, unlike ordinary differential equations. However, large chemical reaction networks can contain reaction processes with rates spanning multiple orders of magnitude. This can lead to the problem of "KMC stiffness" (similar to stiffness in differential equations), where the computational expense has the potential to be overwhelmed by very short time-steps during KMC simulations, with the simulation spending an inordinate amount of KMC steps/CPU time simulating fast frivolous processes (FFPs) without progressing the system (reaction network). In order to achieve simulation times that are experimentally relevant or desired for predictions, a dynamic throttling algorithm involving separation of the processes into speed-ranks based on event frequencies has been designed and implemented with the intent of decreasing the probability of FFP events, and increasing the probability of slow process events-allowing rate limiting events to become more likely to be observed in KMC simulations. This Staggered Quasi-Equilibrium Rank-based Throttling for Steady-state (SQERTSS) algorithm is designed for use in achieving and simulating steady-state conditions in KMC simulations. As shown in this work, the SQERTSS algorithm also works for transient conditions: the correct configuration space and final state will still be achieved if the required assumptions are not violated, with the caveat that the sizes of the time-steps may be distorted during the transient period.
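
    The core throttling idea (scale down the rates of fast speed-ranks so that slow, rate-limiting events get sampled) can be shown in a toy Gillespie-style KMC step. The rank structure and scaling factor below are schematic illustrations only, not the SQERTSS bookkeeping that corrects the distortion of simulated time:

        # Toy KMC step with rank-based throttling: processes are grouped into
        # speed ranks by rate and faster ranks are scaled down so that slow,
        # rate-limiting events are sampled more often. Rates are invented.
        import math
        import random

        rates = {"fast_hop": 1e6, "desorption": 1e2, "reaction": 1.0}

        def throttle(rates, base=100.0):
            # Rank processes by rate (slowest first) and divide by base**rank.
            ranked = sorted(rates, key=rates.get)
            return {p: rates[p] / base ** i for i, p in enumerate(ranked)}

        def kmc_step(rates, rng=random.random):
            total = sum(rates.values())
            r, acc = rng() * total, 0.0
            for process, rate in rates.items():
                acc += rate
                if r <= acc:
                    dt = -math.log(rng()) / total   # exponential waiting time
                    return process, dt

        print(kmc_step(throttle(rates)))  # slow events now have non-negligible odds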

  2. New Frontiers in Characterization of Sub-Catalog Microseismicity: Utilizing Inter-Event Waveform Cross Correlation for Estimating Precise Locations, Magnitudes, and Focal Mechanisms of Tiny Earthquakes

    NASA Astrophysics Data System (ADS)

    Ellsworth, W. L.; Shelly, D. R.; Hardebeck, J.; Hill, D. P.

    2017-12-01

    Microseismicity often conveys the most direct information about active processes in the earth's subsurface. However, routine network processing typically leaves most earthquakes uncharacterized. These "sub-catalog" events can provide critical clues to ongoing processes in the source region. To address this issue, we have developed waveform-based processing that leverages the existing routine catalog of earthquakes to detect and characterize "sub-catalog" events (those absent in routine catalogs). By correlating waveforms of cataloged events with the continuous data stream, we 1) identify events with similar waveform signatures in the continuous data across multiple stations, 2) precisely measure relative time lags across these stations for both P- and S-wave time windows, and 3) estimate the relative polarity between events by the sign of the peak absolute value correlations and its height above the secondary peak. When combined, these inter-event comparisons yield robust measurements, which enable sensitive event detection, relative relocation, and relative magnitude estimation. The most recent addition, focal mechanisms derived from correlation-based relative polarities, addresses a significant shortcoming in microseismicity analyses (see Shelly et al., JGR, 2016). Depending on the application, we can characterize 2-10 times as many events as included in the initial catalog. This technique is particularly well suited for compact zones of active seismicity such as seismic swarms. Application to a 2014 swarm in Long Valley Caldera, California, illuminates complex patterns of faulting that would have otherwise remained obscured. The prevalence of such features in other environments remains an important, as yet unresolved, question.
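
    The relative time-lag measurement at the heart of this processing is plain cross-correlation. A minimal numpy sketch that recovers a known shift between two noisy event waveforms (synthetic data, not the authors' pipeline):

        # Measure the relative lag between two similar event waveforms from the
        # peak of their cross-correlation (synthetic example, 100 Hz sampling).
        import numpy as np

        def relative_lag(a, b, dt):
            """Lag (s) by which b is delayed relative to a, plus the peak
            normalized cross-correlation coefficient."""
            a0, b0 = a - a.mean(), b - b.mean()
            cc = np.correlate(b0, a0, mode="full")
            shift = np.argmax(cc) - (len(a) - 1)
            peak = cc.max() / (np.linalg.norm(a0) * np.linalg.norm(b0))
            return shift * dt, peak

        dt = 0.01
        t = np.arange(0, 4, dt)
        wavelet = np.exp(-((t - 1.0) / 0.1) ** 2) * np.sin(2 * np.pi * 5 * t)
        rng = np.random.default_rng(3)
        a = wavelet + 0.05 * rng.standard_normal(t.size)
        b = np.roll(wavelet, 23) + 0.05 * rng.standard_normal(t.size)  # 0.23 s delay
        lag, cc_max = relative_lag(a, b, dt)
        print(f"lag = {lag:.2f} s, peak normalized CC = {cc_max:.2f}")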

  3. Misspecification of Cox regression models with composite endpoints

    PubMed Central

    Wu, Longyang; Cook, Richard J

    2012-01-01

    Researchers routinely adopt composite endpoints in multicenter randomized trials designed to evaluate the effect of experimental interventions in cardiovascular disease, diabetes, and cancer. Despite their widespread use, relatively little attention has been paid to the statistical properties of estimators of treatment effect based on composite endpoints. We consider this here in the context of multivariate models for time to event data in which copula functions link marginal distributions with a proportional hazards structure. We then examine the asymptotic and empirical properties of the estimator of treatment effect arising from a Cox regression model for the time to the first event. We point out that even when the treatment effect is the same for the component events, the limiting value of the estimator based on the composite endpoint is usually inconsistent for this common value. We find that in this context the limiting value is determined by the degree of association between the events, the stochastic ordering of events, and the censoring distribution. Within the framework adopted, marginal methods for the analysis of multivariate failure time data yield consistent estimators of treatment effect and are therefore preferred. We illustrate the methods by application to a recent asthma study. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22736519

  4. Hazard ratio estimation and inference in clinical trials with many tied event times.

    PubMed

    Mehrotra, Devan V; Zhang, Yiwei

    2018-06-13

    The medical literature contains numerous examples of randomized clinical trials with time-to-event endpoints in which large numbers of events accrued over relatively short follow-up periods, resulting in many tied event times. A generally common feature across such examples was that the logrank test was used for hypothesis testing and the Cox proportional hazards model was used for hazard ratio estimation. We caution that this common practice is particularly risky in the setting of many tied event times for two reasons. First, the estimator of the hazard ratio can be severely biased if the Breslow tie-handling approximation for the Cox model (the default in SAS and Stata software) is used. Second, the 95% confidence interval for the hazard ratio can include one even when the corresponding logrank test p-value is less than 0.05. To help establish a better practice, with applicability for both superiority and noninferiority trials, we use theory and simulations to contrast Wald and score tests based on well-known tie-handling approximations for the Cox model. Our recommendation is to report the Wald test p-value and corresponding confidence interval based on the Efron approximation. The recommended test is essentially as powerful as the logrank test, the accompanying point and interval estimates of the hazard ratio have excellent statistical properties even in settings with many tied event times, inferential alignment between the p-value and confidence interval is guaranteed, and implementation is straightforward using commonly used software. Copyright © 2018 John Wiley & Sons, Ltd.
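
    In Python, the lifelines package documents Efron's method as its tie-handling approximation for the Cox model and reports Wald-based intervals, which lines up with the recommendation here. A minimal sketch on toy data with heavy ties (column names and values invented; the summary columns shown exist in recent lifelines versions):

        # Cox model with Efron tie handling via lifelines (the library's
        # documented default), reporting the hazard ratio, its Wald-based
        # 95% CI and p-value. Toy data with many tied event times.
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.DataFrame({
            "time":  [1, 1, 1, 2, 2, 2, 2, 3, 3, 4, 4, 4],  # heavily tied
            "event": [1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 0, 1],
            "arm":   [0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1],  # 1 = experimental
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="time", event_col="event")
        print(cph.summary[["exp(coef)", "exp(coef) lower 95%",
                           "exp(coef) upper 95%", "p"]])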

  5. Real-Time Data Processing Systems and Products at the Alaska Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Ruppert, N. A.; Hansen, R. A.

    2007-05-01

    The Alaska Earthquake Information Center (AEIC) receives data from over 400 seismic sites located within the state boundaries and the surrounding regions and serves as a regional data center. In 2007, the AEIC reported ~20,000 seismic events, the largest being an M6.6 event in the Andreanof Islands. The real-time earthquake detection and data processing systems at AEIC are based on the Antelope system from BRTT, Inc. This modular and extensible processing platform allows an integrated system complete from data acquisition to catalog production. Multiple additional modules constructed with the Antelope toolbox have been developed to fit particular needs of the AEIC. The real-time earthquake locations and magnitudes are determined within 2-5 minutes of the event occurrence. AEIC maintains a 24/7 seismologist-on-duty schedule. Earthquake alarms are based on the real-time earthquake detections. Significant events are reviewed by the seismologist on duty within 30 minutes of the occurrence, with information releases issued for significant events. This information is disseminated immediately via the AEIC website, the ANSS website via QDDS submissions, and through e-mail, cell phone and pager notifications, fax broadcasts and recorded voice-mail messages. In addition, automatic regional moment tensors are determined for events with M>=4.0 and posted on the public website. ShakeMaps are calculated in real time, with the information currently accessible via a password-protected website. AEIC is designing an alarm system targeted at critical lifeline operations in Alaska. AEIC maintains an extensive computer network to provide adequate support for data processing and archival. For real-time processing, AEIC operates two identical, interoperable computer systems in parallel.

  6. WILBER and PyWEED: Event-based Seismic Data Request Tools

    NASA Astrophysics Data System (ADS)

    Falco, N.; Clark, A.; Trabant, C. M.

    2017-12-01

    WILBER and PyWEED are two user-friendly tools for requesting event-oriented seismic data. Both tools provide interactive maps and other controls for browsing and filtering event and station catalogs, and downloading data for selected event/station combinations, where the data window for each event/station pair may be defined relative to the arrival time of seismic waves from the event to that particular station. Both tools allow data to be previewed visually, and can download data in standard miniSEED, SAC, and other formats, complete with relevant metadata for performing instrument correction. WILBER is a web application requiring only a modern web browser. Once the user has selected an event, WILBER identifies all data available for that time period, and allows the user to select stations based on criteria such as the station's distance and orientation relative to the event. When the user has finalized their request, the data is collected and packaged on the IRIS server, and when it is ready the user is sent a link to download. PyWEED is a downloadable, cross-platform (Macintosh / Windows / Linux) application written in Python. PyWEED allows a user to select multiple events and stations, and will download data for each event/station combination selected. PyWEED is built around the ObsPy seismic toolkit, and allows direct interaction and control of the application through a Python interactive console.
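
    PyWEED's building blocks are the standard ObsPy FDSN client calls. A minimal sketch of an event-based request in that style (the network/station/channel codes, date range and fixed 10-minute window are illustrative choices, not PyWEED defaults):

        # Event-oriented waveform request with ObsPy's FDSN client, in the
        # style of the workflow described above. Codes/windows are illustrative.
        from obspy import UTCDateTime
        from obspy.clients.fdsn import Client

        client = Client("IRIS")
        events = client.get_events(starttime=UTCDateTime("2017-09-01"),
                                   endtime=UTCDateTime("2017-09-30"),
                                   minmagnitude=7.5)
        origin = events[0].preferred_origin() or events[0].origins[0]

        # Window the trace relative to the event origin time.
        st = client.get_waveforms(network="IU", station="ANMO", location="00",
                                  channel="BHZ", starttime=origin.time,
                                  endtime=origin.time + 600,
                                  attach_response=True)
        st.remove_response(output="VEL")   # instrument correction from metadata
        print(st)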

  7. Projecting Event-Based Analysis Dates in Clinical Trials: An Illustration Based on the International Duration Evaluation of Adjuvant Chemotherapy (IDEA) Collaboration

    PubMed Central

    Renfro, Lindsay A.; Grothey, Axel M.; Paul, James; Floriani, Irene; Bonnetain, Franck; Niedzwiecki, Donna; Yamanaka, Takeharu; Souglakos, Ioannis; Yothers, Greg; Sargent, Daniel J.

    2015-01-01

    Purpose Clinical trials are expensive and lengthy, where success of a given trial depends on observing a prospectively defined number of patient events required to answer the clinical question. The point at which this analysis time occurs depends on both patient accrual and primary event rates, which typically vary throughout the trial's duration. We demonstrate real-time analysis date projections using data from a collection of six clinical trials that are part of the IDEA collaboration, an international preplanned pooling of data from six trials testing the duration of adjuvant chemotherapy in stage III colon cancer, and we additionally consider the hypothetical impact of one trial's early termination of follow-up. Patients and Methods In the absence of outcome data from IDEA, monthly accrual rates for each of the six IDEA trials were used to project subsequent trial-specific accrual, while historical data from similar Adjuvant Colon Cancer Endpoints (ACCENT) Group trials were used to construct a parametric model for IDEA's primary endpoint, disease-free survival, under the same treatment regimen. With this information and using the planned total accrual from each IDEA trial protocol, individual patient accrual and event dates were simulated and the overall IDEA interim and final analysis times projected. Projections were then compared with actual (previously undisclosed) trial-specific event totals at a recent census time for validation. The change in projected final analysis date assuming early termination of follow-up for one IDEA trial was also calculated. Results Trial-specific predicted event totals were close to the actual number of events per trial for the recent census date at which the number of events per trial was known, with the overall IDEA projected number of events only off by eight patients. Potential early termination of follow-up by one IDEA trial was estimated to postpone the overall IDEA final analysis date by 9 months. Conclusions Real-time projection of the final analysis time during a trial, or the overall analysis time during a trial collaborative such as IDEA, has practical implications for trial feasibility when these projections are translated into additional time and resources required. PMID:26989447
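
    The projection logic (simulate accrual, simulate event times, find the calendar time at which the target event count is reached) can be sketched in a few lines. The accrual rate, exponential hazard and event target below are invented, not IDEA values, and dropout is ignored:

        # Monte Carlo projection of an event-driven analysis date: simulate
        # patient accrual and event times, then find when the target event
        # count is reached. Rates and targets are invented, not IDEA values.
        import numpy as np

        def project_analysis_time(accrual_per_month, accrual_months,
                                  median_dfs_months, target_events,
                                  n_sims=2000, seed=0):
            rng = np.random.default_rng(seed)
            hazard = np.log(2) / median_dfs_months   # exponential DFS model
            times = []
            for _ in range(n_sims):
                n = rng.poisson(accrual_per_month * accrual_months)
                entry = rng.uniform(0, accrual_months, n)    # calendar entry
                event = entry + rng.exponential(1 / hazard, n)
                times.append(np.sort(event)[target_events - 1])
            return np.percentile(times, [10, 50, 90])   # calendar months

        print(project_analysis_time(accrual_per_month=40, accrual_months=36,
                                    median_dfs_months=60, target_events=800))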

  8. Impact of temporal resolution of inputs on hydrological model performance: An analysis based on 2400 flood events

    NASA Astrophysics Data System (ADS)

    Ficchì, Andrea; Perrin, Charles; Andréassian, Vazken

    2016-07-01

    Hydro-climatic data at short time steps are considered essential to model the rainfall-runoff relationship, especially for short-duration hydrological events, typically flash floods. Also, using fine time step information may be beneficial when using or analysing model outputs at larger aggregated time scales. However, the actual gain in prediction efficiency using short time-step data is not well understood or quantified. In this paper, we investigate the extent to which the performance of hydrological modelling is improved by short time-step data, using a large set of 240 French catchments, for which 2400 flood events were selected. Six-minute rain gauge data were available and the GR4 rainfall-runoff model was run with precipitation inputs at eight different time steps ranging from 6 min to 1 day. Then model outputs were aggregated at seven different reference time scales ranging from sub-hourly to daily for a comparative evaluation of simulations at different target time steps. Three classes of model performance behaviour were found for the 240 test catchments: (i) significant improvement of performance with shorter time steps; (ii) performance insensitivity to the modelling time step; (iii) performance degradation as the time step becomes shorter. The differences between these groups were analysed based on a number of catchment and event characteristics. A statistical test highlighted the most influential explanatory variables for model performance evolution at different time steps, including flow auto-correlation, flood and storm duration, flood hydrograph peakedness, rainfall-runoff lag time and precipitation temporal variability.

  9. Probing SEP Acceleration Processes With Near-relativistic Electrons

    NASA Astrophysics Data System (ADS)

    Haggerty, Dennis K.; Roelof, Edmond C.

    2009-11-01

    Processes in the solar corona are prodigious accelerators of near-relativistic electrons. Only a small fraction of these electrons escape the low corona, yet they are by far the most abundant species observed in Solar Energetic Particle events. These beam-like energetic electron events are sometimes time-associated with coronal mass ejections from the western solar hemisphere; however, a significant number of events are observed without any apparent association with a transient event. The relationships between solar energetic particle events, coronal mass ejections, and near-relativistic electron events are better ordered when we classify the intensity-time profiles during the beam-like anisotropies into three broad categories: 1) Spikes (rapid and equal rise and decay); 2) Pulses (rapid rise, slower decay); and 3) Ramps (rapid rise followed by a plateau). We report the results of a study based on our catalog (covering nearly the complete Solar Cycle 23) of 216 near-relativistic electron events and their association with solar electromagnetic emissions, shocks driven by coronal mass ejections, models of the coronal magnetic fields, and energetic protons. We conclude that electron events with Spike and Pulse time-intensity profiles are associated with explosive events in the low corona, while events with Ramp profiles are associated with the injection/acceleration process of the CME-driven shock.

  10. The evaluation of a web-based incident reporting system.

    PubMed

    Kuo, Ya-Hui; Lee, Ting-Ting; Mills, Mary Etta; Lin, Kuan-Chia

    2012-07-01

    A Web-based reporting system is essential to report incident events anonymously and confidentially. The purpose of this study was to evaluate a Web-based reporting system in Taiwan. User satisfaction and impact of system use were evaluated through a survey answered by 249 nurses. Incident events reported in paper and electronic systems were collected for comparison purposes. Study variables included system user satisfaction, willingness to report, number of reports, severity of the events, and efficiency of the reporting process. Results revealed that senior nurses were less willing to report events, nurses on internal medicine units had higher satisfaction than others, and lowest satisfaction was related to the time it took to file a report. In addition, the Web-based reporting system was used more often than the paper system. The percentages of events reported were significantly higher in the Web-based system in laboratory, environment/device, and incidents occurring in other units, whereas the proportions of reports involving bedsores and dislocation of endotracheal tubes were decreased. Finally, moderate injury event reporting decreased, whereas minor or minimal injury event reporting increased. The study recommends that the data entry process be simplified and the network system be improved to increase user satisfaction and reporting rates.

  11. Using Agent-Based Modeling to Enhance System-Level Real-time Control of Urban Stormwater Systems

    NASA Astrophysics Data System (ADS)

    Rimer, S.; Mullapudi, A. M.; Kerkez, B.

    2017-12-01

    The ability to reduce combined-sewer overflow (CSO) events is an issue that challenges over 800 U.S. municipalities. When the capacity of a combined sewer system or wastewater treatment plant is exceeded, untreated wastewater overflows (a CSO event) into nearby streams, rivers, or other water bodies, causing localized urban flooding and pollution. The likelihood and impact of CSO events have only been exacerbated by urbanization, population growth, climate change, aging infrastructure, and system complexity. Thus, there is an urgent need for urban areas to manage CSO events. Traditionally, mitigating CSO events has been carried out via time-intensive and expensive structural interventions such as retention basins or sewer separation, which are able to reduce CSO events but are costly, arduous, and only provide a fixed solution to a dynamic problem. Real-time control (RTC) of urban drainage systems using sensor and actuator networks has served as an inexpensive and versatile alternative to traditional CSO intervention. In particular, retrofitting individual stormwater elements for sensing and automated active distributed control has been shown to significantly reduce the volume of discharge during CSO events, with some RTC models demonstrating a reduction upwards of 90% when compared to traditional passive systems. As more stormwater elements become retrofitted for RTC, system-level RTC across complete watersheds is an attainable possibility. However, when considering the diverse set of control needs of each of these individual stormwater elements, such system-level RTC becomes a far more complex problem. To address these diverse control needs, agent-based modeling is employed such that each individual stormwater element is treated as an autonomous agent with diverse decision-making capabilities. We present preliminary results and limitations of utilizing the agent-based modeling computational framework for the system-level control of diverse, interacting stormwater elements.

  12. Temporal and spatial heterogeneity of rupture process application in shakemaps of Yushu Ms7.1 earthquake, China

    NASA Astrophysics Data System (ADS)

    Kun, C.

    2015-12-01

    Studies have shown that ground motion parameter estimates from ground motion attenuation relationships are often greater than observed values, mainly because the multiple ruptures of a large earthquake reduce the pulse height of the source time function. In the absence of real-time station data after an earthquake, this paper attempts to impose constraints from the source to improve the accuracy of ShakeMaps. The causative fault of the Yushu Ms 7.1 earthquake is nearly vertical (dip 83°), and its source process was distinctly dispersive in time and space. The mainshock can be divided into several sub-events based on the source process; the magnitude of each sub-event is derived from the area under its pulse in the source time function, and its location from the source process. We use the ShakeMap method, accounting for site effects, to generate a ShakeMap for each sub-event, and the mainshock ShakeMap is then obtained by spatially superposing the sub-event ShakeMaps. For comparison, ShakeMaps treating the mainshock as a single event are also derived, based on the surface rupture of the causative fault from field surveys. We compare the ShakeMaps of both methods with the investigated intensity. The comparisons show that the mainshock decomposition method more accurately reflects the near-field shaking; in the far field, however, where the shaking is controlled by the weakening influence of the source, the estimated intensity VI area is smaller than that of the actual intensity investigation. Far-field seismic intensity may be related to the increased shaking duration produced by the two events. In general, the mainshock decomposition method based on the source process, which considers the ShakeMap of each sub-event, is feasible for disaster emergency response, decision making and rapid disaster assessment after an earthquake.

  13. Pick- and waveform-based techniques for real-time detection of induced seismicity

    NASA Astrophysics Data System (ADS)

    Grigoli, Francesco; Scarabello, Luca; Böse, Maren; Weber, Bernd; Wiemer, Stefan; Clinton, John F.

    2018-05-01

    The monitoring of induced seismicity is a common operation in many industrial activities, such as conventional and non-conventional hydrocarbon production or mining and geothermal energy exploitation, to cite a few. During such operations, we generally collect very large and strongly noise-contaminated data sets that require robust and automated analysis procedures. Induced seismicity data sets are often characterized by sequences of multiple events with short interevent times or overlapping events; in these cases, pick-based location methods may struggle to correctly assign picks to phases and events, and errors can lead to missed detections and/or reduced location resolution and incorrect magnitudes, which can have significant consequences if real-time seismicity information is used in risk assessment frameworks. To overcome these issues, different waveform-based methods for the detection and location of microseismicity have been proposed. The main advantage of waveform-based methods is that they appear to perform better and can simultaneously detect and locate seismic events, providing high-quality locations in a single step; the main disadvantage is that they are computationally expensive. Although these methods have been applied to different induced seismicity data sets, an extensive comparison with sophisticated pick-based detection methods is still missing. In this work, we introduce our improved waveform-based detector and compare its performance with that of two pick-based detectors implemented within the SeisComP3 software suite. We test the performance of these three approaches on both synthetic and real data sets related to the induced seismicity sequence at the deep geothermal project near the city of St. Gallen, Switzerland.
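
    As a concrete example of the pick-based side of this comparison, a bare-bones STA/LTA trigger, the classic detector behind many pick-based pipelines (window lengths, threshold and the synthetic trace are illustrative, not the paper's configuration):

        # Bare-bones STA/LTA detector: compares a short-term average of signal
        # energy with a long-term average and fires where the ratio exceeds a
        # threshold. Windows and threshold are illustrative.
        import numpy as np

        def sta_lta(trace, nsta, nlta):
            e = trace.astype(float) ** 2
            csum = np.concatenate(([0.0], np.cumsum(e)))
            sta = (csum[nsta:] - csum[:-nsta]) / nsta   # short-term averages
            lta = (csum[nlta:] - csum[:-nlta]) / nlta   # long-term averages
            # Align the two so each ratio uses windows ending at the same sample.
            m = min(len(sta), len(lta))
            return sta[-m:] / np.maximum(lta[-m:], 1e-12)

        rng = np.random.default_rng(4)
        trace = rng.standard_normal(2000)
        trace[1200:1300] += 8.0 * rng.standard_normal(100)  # synthetic arrival
        ratio = sta_lta(trace, nsta=20, nlta=400)
        offset = len(trace) - len(ratio)
        print("trigger at sample ~", np.argmax(ratio > 4.0) + offset)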

  14. Activity Recognition on Streaming Sensor Data.

    PubMed

    Krishnan, Narayanan C; Cook, Diane J

    2014-02-01

    Many real-world applications that focus on addressing the needs of a human require information about the activities being performed by that human in real time. While advances in pervasive computing have led to the development of wireless and non-intrusive sensors that can capture the necessary activity information, current activity recognition approaches have so far experimented only on scripted or pre-segmented sequences of sensor events related to activities. In this paper we propose and evaluate a sliding-window-based approach to perform activity recognition in an online or streaming fashion, recognizing activities as and when new sensor events are recorded. To account for the fact that different activities can be best characterized by different window lengths of sensor events, we incorporate time-decay- and mutual-information-based weighting of sensor events within a window. Additional contextual information, in the form of the previous activity and the activity of the previous window, is also appended to the feature describing a sensor window. The experiments conducted to evaluate these techniques on real-world smart home datasets suggest that combining mutual-information-based weighting of sensor events with past contextual information in the feature leads to the best performance for streaming activity recognition.
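
    A minimal sketch of the time-decay weighting idea, with assumed details: each sensor event in the window contributes exp(-decay * age) to the count for its sensor, so recent events dominate the feature vector; mutual-information weights would multiply in analogously.

        import numpy as np

        def window_features(event_times, sensor_ids, n_sensors, decay=0.1):
            # Each event contributes exp(-decay * age) to its sensor's count,
            # where age is measured from the last event in the window.
            feats = np.zeros(n_sensors)
            t_last = event_times[-1]
            for t, s in zip(event_times, sensor_ids):
                feats[s] += np.exp(-decay * (t_last - t))
            return feats

        # One window of five events over sensors 0..3 (times in seconds).
        f = window_features(np.array([0.0, 2.0, 7.0, 9.0, 10.0]),
                            np.array([1, 1, 0, 3, 2]), n_sensors=4)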

  15. Non-fragile ℓ2-ℓ∞ control for discrete-time stochastic nonlinear systems under event-triggered protocols

    NASA Astrophysics Data System (ADS)

    Sun, Ying; Ding, Derui; Zhang, Sunjie; Wei, Guoliang; Liu, Hongjian

    2018-07-01

    In this paper, the non-fragile ℓ2-ℓ∞ control problem is investigated for a class of discrete-time stochastic nonlinear systems under event-triggered communication protocols, which determine whether the measurement output should be transmitted to the controller or not. The main purpose of the addressed problem is to design an event-based output feedback controller subject to gain variations guaranteeing the prescribed disturbance attenuation level described by the ℓ2-ℓ∞ performance index. By utilizing the Lyapunov stability theory combined with the S-procedure, a sufficient condition is established to guarantee both the exponential mean-square stability and the ℓ2-ℓ∞ performance for the closed-loop system. In addition, with the help of the orthogonal decomposition, the desired controller parameter is obtained in terms of the solution to certain linear matrix inequalities. Finally, a simulation example is exploited to demonstrate the effectiveness of the proposed event-based controller design scheme.

  16. The `TTIME' Package: Performance Evaluation in a Cluster Computing Environment

    NASA Astrophysics Data System (ADS)

    Howe, Marico; Berleant, Daniel; Everett, Albert

    2011-06-01

    The objective of translating developmental event timings across mammalian species is to gain an understanding of the timing of human developmental events based on the known timing of those events in animals. The potential benefits include improvements to diagnostic and intervention capabilities. The CRAN `ttime' package provides the functionality to infer unknown event timings and to investigate phylogenetic proximity utilizing hierarchical clustering of both known and predicted event timings. The original generic mammalian model included nine eutherian mammals: Felis domestica (cat), Mustela putorius furo (ferret), Mesocricetus auratus (hamster), Macaca mulatta (monkey), Homo sapiens (humans), Mus musculus (mouse), Oryctolagus cuniculus (rabbit), Rattus norvegicus (rat), and Acomys cahirinus (spiny mouse). However, the data for this model are expected to grow as more data about developmental events are identified and incorporated into the analysis. Performance evaluation of the `ttime' package in a cluster computing environment versus a comparative analysis in a serial computing environment provides an important computational performance assessment. A theoretical analysis is the first stage of a process whose second stage, if justified by the theoretical analysis, is to investigate an actual implementation of the `ttime' package in a cluster computing environment and to understand the parallelization process that underlies the implementation.

  17. Symbolic discrete event system specification

    NASA Technical Reports Server (NTRS)

    Zeigler, Bernard P.; Chi, Sungdo

    1992-01-01

    Extending discrete event modeling formalisms to facilitate greater symbol manipulation capabilities is important to further their use in intelligent control and design of high autonomy systems. An extension to the DEVS formalism that facilitates symbolic expression of event times by extending the time base from the real numbers to the field of linear polynomials over the reals is defined. A simulation algorithm is developed to generate the branching trajectories resulting from the underlying nondeterminism. To efficiently manage symbolic constraints, a consistency checking algorithm for linear polynomial constraints based on feasibility checking algorithms borrowed from linear programming has been developed. The extended formalism offers a convenient means to conduct multiple, simultaneous explorations of model behaviors. Examples of application are given with concentration on fault model analysis.
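
    The consistency check can be pictured as a linear-programming feasibility test, as the abstract indicates. In the sketch below (the constraint encoding is an assumption for illustration), a branch of the symbolic simulation is kept only if its accumulated linear constraints A p <= b on the time parameters admit a solution.

        import numpy as np
        from scipy.optimize import linprog

        def constraints_feasible(A, b):
            # Feasibility test: minimize the zero objective subject to A p <= b;
            # status 0 means an optimal (here: any feasible) point was found.
            res = linprog(c=np.zeros(A.shape[1]), A_ub=A, b_ub=b,
                          bounds=[(None, None)] * A.shape[1], method="highs")
            return res.status == 0

        # e.g. p1 - p2 <= 3 together with p2 - p1 <= -5 is contradictory:
        A = np.array([[1.0, -1.0], [-1.0, 1.0]])
        print(constraints_feasible(A, np.array([3.0, -5.0])))  # False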

  18. Supervised Time Series Event Detector for Building Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-04-13

    A machine learning based approach is developed to detect events that have rarely been seen in the historical data. The data can include building energy consumption, sensor data, environmental data, and any other data that may affect the building's energy consumption. The algorithm is a modified nonlinear Bayesian support vector machine, which examines daily energy consumption profiles, detects days with abnormal events, and diagnoses the cause of the events.
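
    The report describes a modified nonlinear Bayesian support vector machine; as a hedged stand-in, the sketch below uses scikit-learn's OneClassSVM with an RBF kernel on daily 24-hour consumption profiles to flag abnormal days. All data here are synthetic.

        import numpy as np
        from sklearn.svm import OneClassSVM

        rng = np.random.default_rng(0)
        normal_days = rng.normal(10, 1, size=(300, 24))     # daily 24-h profiles
        odd_day = rng.normal(10, 1, size=(1, 24))
        odd_day[0, 18:22] += 8                              # evening consumption spike

        detector = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale")
        detector.fit(normal_days)
        print(detector.predict(odd_day))                    # -1 flags an abnormal day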

  19. Lightning electric field measurements which correlate with strikes to the NASA F-106B aircraft, 22 July 1980

    NASA Technical Reports Server (NTRS)

    Levine, D. M.

    1981-01-01

    Ground-based data collected by lightning monitoring equipment operated by Goddard Space Flight Center at Wallops Island, Virginia, during a storm being monitored by NASA's F-106B are presented. The slow electric field change data and RF radiation data were collected at the times the lightning monitoring equipment on the aircraft was triggered. The timing of the ground-based events correlates well with the events recorded on the aircraft and provides an indication of the type of flash with which the aircraft was involved.

  20. A simplified real time method to forecast semi-enclosed basins storm surge

    NASA Astrophysics Data System (ADS)

    Pasquali, D.; Di Risio, M.; De Girolamo, P.

    2015-11-01

    Semi-enclosed basins are often prone to storm surge events. Indeed, their meteorological exposure, the presence of a large continental shelf and their shape can lead to strong sea level set-up. A real-time system aimed at forecasting storm surge may be of great help in protecting human activities (i.e., forecasting flooding due to storm surge events), managing ports and safeguarding coastal safety. This paper illustrates a simple method able to forecast storm surge events in semi-enclosed basins in real time. The method is based on a mixed approach in which the results obtained by means of a simplified physics-based model with low computational costs are corrected by means of statistical techniques. The proposed method is applied to a point of interest located in the northern part of the Adriatic Sea. The comparison of forecasted levels against observed values shows the satisfactory reliability of the forecasts.
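
    A minimal sketch of the mixed approach, with details assumed: run the cheap physics-based model, then learn a statistical correction from past (modelled, observed) pairs and apply it to new forecasts. Here the correction is a plain linear regression and the numbers are invented.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Past forecasts of the simplified physics-based model vs. observations (m).
        model_surge = np.array([0.31, 0.55, 0.42, 0.90, 0.12]).reshape(-1, 1)
        observed = np.array([0.40, 0.66, 0.52, 1.05, 0.20])

        corrector = LinearRegression().fit(model_surge, observed)
        corrected_forecast = corrector.predict(np.array([[0.75]]))  # new model output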

  1. The design and implementation of EPL: An event pattern language for active databases

    NASA Technical Reports Server (NTRS)

    Giuffrida, G.; Zaniolo, C.

    1994-01-01

    The growing demand for intelligent information systems requires closer coupling of rule-based reasoning engines, such as CLIPS, with advanced data base management systems (DBMS). For instance, several commercial DBMS now support the notion of triggers that monitor events and transactions occurring in the database and fire induced actions, which perform a variety of critical functions, including safeguarding the integrity of data, monitoring access, and recording volatile information needed by administrators, analysts, and expert systems to perform assorted tasks; examples of these tasks include security enforcement, market studies, knowledge discovery, and link analysis. At UCLA, we designed and implemented the event pattern language (EPL) which is capable of detecting and acting upon complex patterns of events which are temporally related to each other. For instance, a plant manager should be notified when a certain pattern of overheating repeats itself over time in a chemical process; likewise, proper notification is required when a suspicious sequence of bank transactions is executed within a certain time limit. The EPL prototype is built in CLIPS to operate on top of Sybase, a commercial relational DBMS, where actions can be triggered by events such as simple database updates, insertions, and deletions. The rule-based syntax of EPL allows the sequences of goals in rules to be interpreted as sequences of temporal events; each goal can correspond to either (1) a simple event, or (2) a (possibly negated) event/condition predicate, or (3) a complex event defined as the disjunction and repetition of other events. Various extensions have been added to CLIPS in order to tailor the interface with Sybase and its open client/server architecture.
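
    The bank-transaction example gives the flavor of the patterns EPL targets. The sketch below is illustrative only and is not EPL syntax: it reports whether a sequence of event types completes within a time limit, using a greedy first match.

        from dataclasses import dataclass

        @dataclass
        class Event:
            kind: str
            time: float  # seconds

        def sequence_within(events, pattern, limit):
            # Greedy first match: True if `pattern` occurs as a subsequence
            # whose total time span is at most `limit`.
            start, idx = None, 0
            for ev in sorted(events, key=lambda e: e.time):
                if ev.kind == pattern[idx]:
                    if idx == 0:
                        start = ev.time
                    idx += 1
                    if idx == len(pattern):
                        return ev.time - start <= limit
            return False

        log = [Event("withdraw", 0.0), Event("transfer", 30.0), Event("withdraw", 50.0)]
        print(sequence_within(log, ["withdraw", "transfer", "withdraw"], limit=60.0))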

  2. Reciprocal influences between negative life events and callous-unemotional traits.

    PubMed

    Kimonis, Eva R; Centifanti, Luna C M; Allen, Jennifer L; Frick, Paul J

    2014-11-01

    Children with conduct problems and co-occurring callous-unemotional (CU) traits show more severe, stable, and aggressive antisocial behaviors than those without CU traits. Exposure to negative life events has been identified as an important contributing factor to the expression of CU traits across time, although the directionality of this effect has remained unknown due to a lack of longitudinal study. The present longitudinal study examined potential bidirectional effects of CU traits leading to experiencing more negative life events and negative life events leading to increases in CU traits across 3 years among a sample of community-based school-aged (M = 10.9, SD = 1.71 years) boys and girls (N = 98). Repeated rating measures of CU traits, negative life events and conduct problems completed by children and parents during annual assessments were moderately to highly stable across time. Cross-lagged models supported a reciprocal relationship of moderate magnitude between child-reported CU traits and "controllable" negative life events. Parent-reported CU traits predicted "uncontrollable" life events at the earlier time point and controllable life events at the later time point, but no reciprocal effect was evident. These findings have important implications for understanding developmental processes that contribute to the stability of CU traits in youth.

  3. Laboratory-based versus non-laboratory-based method for assessment of cardiovascular disease risk: the NHANES I Follow-up Study cohort

    PubMed Central

    Gaziano, Thomas A; Young, Cynthia R; Fitzmaurice, Garrett; Atwood, Sidney; Gaziano, J Michael

    2008-01-01

    Summary Background Around 80% of all cardiovascular deaths occur in developing countries. Assessment of those patients at high risk is an important strategy for prevention. Since developing countries have limited resources for prevention strategies that require laboratory testing, we assessed whether a risk prediction method that did not require any laboratory tests could be as accurate as one requiring laboratory information. Methods The National Health and Nutrition Examination Survey (NHANES) was a prospective cohort study of 14 407 US participants aged 25-74 years at the time they were first examined (between 1971 and 1975). Our follow-up study population included participants with complete information on these surveys who did not report a history of cardiovascular disease (myocardial infarction, heart failure, stroke, angina) or cancer, yielding an analysis dataset of N=6186. We compared how well either method could predict first-time fatal and non-fatal cardiovascular disease events in this cohort. For the laboratory-based model, which required blood testing, we used standard risk factors to assess the risk of cardiovascular disease: age, systolic blood pressure, smoking status, total cholesterol, reported diabetes status, and current treatment for hypertension. For the non-laboratory-based model, we substituted body-mass index for cholesterol. Findings In the cohort of 6186, there were 1529 first-time cardiovascular events and 578 (38%) deaths due to cardiovascular disease over 21 years. In women, the laboratory-based model was useful for predicting events, with a c statistic of 0·829. The c statistic of the non-laboratory-based model was 0·831. In men, the results were similar (0·784 for the laboratory-based model and 0·783 for the non-laboratory-based model). Results were similar between the laboratory-based and non-laboratory-based models in both men and women when restricted to fatal events only. Interpretation A method that uses non-laboratory-based risk factors predicted cardiovascular events as accurately as one that relied on laboratory-based values. This approach could simplify risk assessment in situations where laboratory testing is inconvenient or unavailable. PMID:18342687

  4. Recollection-Dependent Memory for Event Duration in Large-Scale Spatial Navigation

    ERIC Educational Resources Information Center

    Brunec, Iva K.; Ozubko, Jason D.; Barense, Morgan D.; Moscovitch, Morris

    2017-01-01

    Time and space represent two key aspects of episodic memories, forming the spatiotemporal context of events in a sequence. Little is known, however, about how temporal information, such as the duration and the order of particular events, are encoded into memory, and if it matters whether the memory representation is based on recollection or…

  5. The upcoming mutual event season for the Patroclus-Menoetius Trojan binary

    NASA Astrophysics Data System (ADS)

    Grundy, W. M.; Noll, K. S.; Buie, M. W.; Levison, H. F.

    2018-05-01

    We present new Hubble Space Telescope and ground-based Keck observations and new Keplerian orbit solutions for the mutual orbit of binary Jupiter Trojan asteroid (617) Patroclus and Menoetius, targets of NASA's Lucy mission. We predict event times for the upcoming mutual event season, which is anticipated to run from late 2017 through mid 2019.

  6. Track-based event recognition in a realistic crowded environment

    NASA Astrophysics Data System (ADS)

    van Huis, Jasper R.; Bouma, Henri; Baan, Jan; Burghouts, Gertjan J.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Dijk, Judith; van Rest, Jeroen H.

    2014-10-01

    Automatic detection of abnormal behavior in CCTV cameras is important to improve the security in crowded environments, such as shopping malls, airports and railway stations. This behavior can be characterized at different time scales, e.g., by small-scale subtle and obvious actions or by large-scale walking patterns and interactions between people. For example, pickpocketing can be recognized by the actual snatch (small scale), by the perpetrator following the victim, or by interactions with an accomplice before and after the incident (longer time scale). This paper focuses on event recognition by detecting large-scale track-based patterns. Our event recognition method consists of several steps: pedestrian detection, object tracking, track-based feature computation and rule-based event classification. In the experiment, we focused on single-track actions (walk, run, loiter, stop, turn) and track interactions (pass, meet, merge, split). The experiment includes a controlled setup, where 10 actors perform these actions. The method is also applied to all tracks that are generated in a crowded shopping mall in a selected time frame. The results show that most of the actions can be detected reliably (on average 90%) at a low false-positive rate (1.1%), and that the interactions obtain lower detection rates (70% at 0.3% FP). This method may become one of the components that assist operators in finding threatening behavior and enriching the selection of videos that are to be observed.
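
    A toy version of the rule-based classification step for single-track actions (the speed thresholds and the loiter/stop spread test are assumptions, not the paper's values):

        import numpy as np

        def classify_track(xy, fs=10.0, run_speed=2.5, walk_speed=0.5):
            # xy: positions in meters, one row per frame sampled at fs Hz.
            v = np.linalg.norm(np.diff(xy, axis=0), axis=1) * fs  # speed in m/s
            if v.mean() > run_speed:
                return "run"
            if v.mean() > walk_speed:
                return "walk"
            # Slow but ranging over some area suggests loitering; otherwise a stop.
            return "loiter" if np.ptp(xy, axis=0).max() > 1.0 else "stop"

        track = np.cumsum(np.random.randn(100, 2) * 0.05, axis=0)  # toy trajectory
        print(classify_track(track))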

  7. The effect of bioturbation in pelagic sediments: Lessons from radioactive tracers and planktonic foraminifera in the Gulf of Aqaba, Red Sea

    NASA Astrophysics Data System (ADS)

    Steiner, Zvi; Lazar, Boaz; Levi, Shani; Tsroya, Shimon; Pelled, Omer; Bookman, Revital; Erez, Jonathan

    2016-12-01

    Studies of recent environmental perturbations often rely on data derived from marine sedimentary records. These records are known to imperfectly inscribe the true sequence of events, yet there is large uncertainty regarding the corrections that should be employed to accurately describe the sedimentary history. Here we show, in recent records from the Gulf of Aqaba, Red Sea, how events such as the abrupt disappearance of the planktonic foraminifer Globigerinoides sacculifer and the episodic deposition of the artificial radionuclide 137Cs are significantly altered in the sedimentary record compared to their known past timing. Instead of the abrupt disappearance of the foraminifera, we observe a prolonged decline beginning at a core depth equivalent to ∼30 y prior to its actual disappearance and continuing for decades past the event. We further observe asymmetric smoothing of the radionuclide peak. Utilization of advection-diffusion-reaction models to reconstruct the original fluxes based on the known absolute timing of the events reveals that it is imperative to use a continuous function to describe bioturbation. Discretization of bioturbation into mixed and unmixed layers significantly shifts the location of the modeled event. When bioturbation is described as a continuously decreasing function of depth, the peak of a very short term event smears asymmetrically but remains at the right depth. When sudden events repeat while the first spike is still mixed with the upper sediment layer, bioturbation unifies adjacent peaks; the united peak appears at an intermediate depth that does not necessarily correlate with the timing of the individual events. In a third case, a long-lasting sedimentary event affected by bioturbation, the resulting peak is rather weak compared to the actual event and appears deeper in the sediment column than expected based on the termination of the event. The model clearly shows that abrupt changes can only endure in the record if a thick sediment layer settled on the sediment-water interface at once or if bioturbation rates decreased to very low values for a prolonged period of time. In any other case, smearing by bioturbation makes an abrupt event appear to have started shortly before its real timing and to end long after its true termination.

  8. Ensemble reconstruction of severe low flow events in France since 1871

    NASA Astrophysics Data System (ADS)

    Caillouet, Laurie; Vidal, Jean-Philippe; Sauquet, Eric; Devers, Alexandre; Graff, Benjamin

    2016-04-01

    This work presents a study of severe low flow events that have occurred from 1871 onwards for a large number of near-natural catchments in France. It aims at assessing and comparing their characteristics to improve our knowledge of historical events and to provide a selection of benchmark events for climate change adaptation purposes. The historical depth of streamflow observations is generally limited to the last 50 years and therefore offers too small a sample of severe low flow events to properly explore the long-term evolution of their characteristics and associated impacts. In order to overcome this limit, this work takes advantage of a 140-year ensemble hydrometeorological dataset over France based on: (1) a probabilistic precipitation and temperature downscaling of the Twentieth Century Reanalysis over France (Caillouet et al., 2015), and (2) continuous hydrological modelling that uses the high-resolution meteorological reconstructions as forcings over the whole period. This dataset provides an ensemble of 25 equally plausible daily streamflow time series for a reference network of stations in France over the whole 1871-2012 period. Severe low flow events are identified based on a combination of a fixed threshold and a daily variable threshold. Each event is characterized by its deficit, duration and timing by applying the Sequent Peak Algorithm. The procedure is applied to the 25 simulated time series as well as to the observed time series in order to compare observed and simulated events over the recent period, and to characterize unrecorded historical events in a probabilistic way. The ensemble aspect of the reconstruction raises specific issues regarding how to properly define events across ensemble simulations and how to adequately compare the simulated characteristics to the observed ones. This study brings forward the outstanding 1921 and 1940s events, but also older and less known ones that occurred during the last decade of the 19th century. For the first time, severe low flow events are qualified in a homogeneous way over 140 years on a large set of near-natural French catchments, allowing for detailed analyses of the effect of climate variability and anthropogenic climate change on low flow hydrology. Caillouet, L., Vidal, J.-P., Sauquet, E., and Graff, B. (2015) Probabilistic precipitation and temperature downscaling of the Twentieth Century Reanalysis over France, Clim. Past Discuss., 11, 4425-4482, doi:10.5194/cpd-11-4425-2015
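
    A simplified sketch of the event identification step, assuming a fixed threshold only (the study combines it with a daily variable threshold and characterizes events with the Sequent Peak Algorithm): runs of flow below the threshold become events with a start, an end and a deficit volume.

        import numpy as np

        def low_flow_events(q, threshold):
            # Return (start, end, deficit) for each run where q < threshold.
            below = q < threshold
            events, start = [], None
            for i, b in enumerate(below):
                if b and start is None:
                    start = i
                if not b and start is not None:
                    events.append((start, i - 1, float(np.sum(threshold - q[start:i]))))
                    start = None
            if start is not None:
                events.append((start, len(q) - 1, float(np.sum(threshold - q[start:]))))
            return events

        q = np.array([5.0, 4.2, 1.8, 1.2, 1.5, 3.9, 6.0, 2.0, 1.0, 0.8])  # toy flows
        print(low_flow_events(q, threshold=2.0))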

  9. Use of a handheld computer application for voluntary medication event reporting by inpatient nurses and physicians.

    PubMed

    Dollarhide, Adrian W; Rutledge, Thomas; Weinger, Matthew B; Dresselhaus, Timothy R

    2008-04-01

    To determine the feasibility of capturing self-reported medication events using a handheld computer-based Medication Event Reporting Tool (MERT). Handheld computers operating the MERT software application were deployed among volunteer physician (n = 185) and nurse (n = 119) participants on the medical wards of four university-affiliated teaching hospitals. Participants were encouraged to complete confidential reports on the handheld computers for medication events observed during the study period. Demographic variables including age, gender, education level, and clinical experience were recorded for all participants. Each MERT report included details on the provider, location, timing and type of medication event recorded. Over the course of 2,311 days of clinician participation, 76 events were reported; the median time for report completion was 231 seconds. The average event reporting rate for all participants was 0.033 reports per clinician shift. Nurses had a significantly higher reporting rate compared to physicians (0.045 vs 0.026 reports/shift, p = .02). Subgroup analysis revealed that attending physicians reported events more frequently than resident physicians (0.042 vs 0.021 reports/shift, p = .03), and at a rate similar to that of nurses (p = .80). Only 5% of MERT medication events were reported to require increased monitoring or treatment. A handheld-based event reporting tool is a feasible method to record medication events in inpatient hospital care units. Handheld reporting tools may hold promise to augment existing hospital reporting systems.

  10. A comparison of estimators from self-controlled case series, case-crossover design, and sequence symmetry analysis for pharmacoepidemiological studies.

    PubMed

    Takeuchi, Yoshinori; Shinozaki, Tomohiro; Matsuyama, Yutaka

    2018-01-08

    Despite the frequent use of self-controlled methods in pharmacoepidemiological studies, the factors that may bias the estimates from these methods have not been adequately compared in real-world settings. Here, we comparatively examined the impact of a time-varying confounder and its interactions with time-invariant confounders, time trends in exposures and events, restrictions, and misspecification of risk period durations on the estimators from three self-controlled methods. This study analyzed self-controlled case series (SCCS), case-crossover (CCO) design, and sequence symmetry analysis (SSA) using simulated and actual electronic medical records datasets. We evaluated the performance of the three self-controlled methods in simulated cohorts for the following scenarios: 1) time-invariant confounding with interactions between the confounders, 2) time-invariant and time-varying confounding without interactions, 3) time-invariant and time-varying confounding with interactions among the confounders, 4) time trends in exposures and events, 5) restricted follow-up time based on event occurrence, and 6) patient restriction based on event history. The sensitivity of the estimators to misspecified risk period durations was also evaluated. As a case study, we applied these methods to evaluate the risk of macrolides on liver injury using electronic medical records. In the simulation analysis, time-varying confounding produced bias in the SCCS and CCO design estimates, which aggravated in the presence of interactions between the time-invariant and time-varying confounders. The SCCS estimates were biased by time trends in both exposures and events. Erroneously short risk periods introduced bias to the CCO design estimate, whereas erroneously long risk periods introduced bias to the estimates of all three methods. Restricting the follow-up time led to severe bias in the SSA estimates. The SCCS estimates were sensitive to patient restriction. The case study showed that although macrolide use was significantly associated with increased liver injury occurrence in all methods, the value of the estimates varied. The estimations of the three self-controlled methods depended on various underlying assumptions, and the violation of these assumptions may cause non-negligible bias in the resulting estimates. Pharmacoepidemiologists should select the appropriate self-controlled method based on how well the relevant key assumptions are satisfied with respect to the available data.

  11. Hierarchical structure for audio-video based semantic classification of sports video sequences

    NASA Astrophysics Data System (ADS)

    Kolekar, M. H.; Sengupta, S.

    2005-07-01

    A hierarchical structure for sports event classification based on audio and video content analysis is proposed in this paper. Compared to event classification in other games, that of cricket is very challenging and as yet unexplored. We have successfully solved the cricket video classification problem using a six-level hierarchical structure. The first level performs event detection based on the audio energy and zero crossing rate (ZCR) of the short-time audio signal. In the subsequent levels, we classify the events based on video features using a Hidden Markov Model implemented through Dynamic Programming (HMM-DP), with color or motion as the likelihood function. For some of the game-specific decisions, a rule-based classification is also performed. Our proposed hierarchical structure can easily be applied to other sports. Our results are very promising, and we have moved a step forward towards addressing semantic classification problems in general.
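
    The HMM-DP step is essentially Viterbi decoding: dynamic programming over hidden event states given per-frame log-likelihoods. A minimal sketch (the toy numbers stand in for color- or motion-based likelihoods):

        import numpy as np

        def viterbi(log_lik, log_A, log_pi):
            # log_lik: (T, S) per-frame state log-likelihoods; log_A: (S, S)
            # transition log-probabilities; log_pi: (S,) initial log-probabilities.
            T, S = log_lik.shape
            dp = np.full((T, S), -np.inf)
            back = np.zeros((T, S), dtype=int)
            dp[0] = log_pi + log_lik[0]
            for t in range(1, T):
                scores = dp[t - 1][:, None] + log_A
                back[t] = scores.argmax(axis=0)
                dp[t] = scores.max(axis=0) + log_lik[t]
            path = [int(dp[-1].argmax())]
            for t in range(T - 1, 0, -1):
                path.append(int(back[t, path[-1]]))
            return path[::-1]

        # Toy usage: two hidden states, three frames.
        log_pi = np.log([0.5, 0.5])
        log_A = np.log([[0.9, 0.1], [0.1, 0.9]])
        log_lik = np.log([[0.8, 0.2], [0.6, 0.4], [0.1, 0.9]])
        print(viterbi(log_lik, log_A, log_pi))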

  12. The European 2015 drought from a hydrological perspective

    NASA Astrophysics Data System (ADS)

    Laaha, Gregor; Gauster, Tobias; Tallaksen, Lena M.; Vidal, Jean-Philippe; Stahl, Kerstin; Prudhomme, Christel; Heudorfer, Benedikt; Vlnas, Radek; Ionita, Monica; Van Lanen, Henny A. J.; Adler, Mary-Jeanne; Caillouet, Laurie; Delus, Claire; Fendekova, Miriam; Gailliez, Sebastien; Hannaford, Jamie; Kingston, Daniel; Van Loon, Anne F.; Mediero, Luis; Osuch, Marzena; Romanowicz, Renata; Sauquet, Eric; Stagge, James H.; Wong, Wai K.

    2017-06-01

    In 2015 large parts of Europe were affected by drought. In this paper, we analyze the hydrological footprint (dynamic development over space and time) of the drought of 2015 in terms of both severity (magnitude) and spatial extent and compare it to the extreme drought of 2003. Analyses are based on a range of low flow and hydrological drought indices derived for about 800 streamflow records across Europe, collected in a community effort based on a common protocol. We compare the hydrological footprints of both events with the meteorological footprints, in order to learn from similarities and differences of both perspectives and to draw conclusions for drought management. The region affected by hydrological drought in 2015 differed somewhat from the drought of 2003, with its center located more towards eastern Europe. In terms of low flow magnitude, a region surrounding the Czech Republic was the most affected, with summer low flows that exhibited return intervals of 100 years and more. In terms of deficit volumes, the geographical center of the event was in southern Germany, where the drought lasted a particularly long time. A detailed spatial and temporal assessment of the 2015 event showed that the particular behavior in these regions was partly a result of diverging wetness preconditions in the studied catchments. Extreme droughts emerged where preconditions were particularly dry. In regions with wet preconditions, low flow events developed later and tended to be less severe. For both the 2003 and 2015 events, the onset of the hydrological drought was well correlated with the lowest flow recorded during the event (low flow magnitude), pointing towards a potential for early warning of the severity of streamflow drought. Time series of monthly drought indices (both streamflow- and climate-based indices) showed that meteorological and hydrological events developed differently in space and time, both in terms of extent and severity (magnitude). These results emphasize that drought is a hazard which leaves different footprints on the various components of the water cycle at different spatial and temporal scales. The difference in the dynamic development of meteorological and hydrological drought also implies that impacts on various water-use sectors and river ecology cannot be informed by climate indices alone. Thus, an assessment of drought impacts on water resources requires hydrological data in addition to drought indices based solely on climate data. The transboundary scale of the event also suggests that additional efforts need to be undertaken to make timely pan-European hydrological assessments more operational in the future.

  13. The Roles of Flares and Shocks in determining SEP Abundances

    NASA Technical Reports Server (NTRS)

    Cane, H. V.; Mewaldt, R. A.; Cohen, C. M. S.; vonRosenvinge, T. T.

    2007-01-01

    We examine solar energetic particle (SEP) event-averaged abundances of Fe relative to O and intensity versus time profiles at energies above 25 MeV/nucleon using the SIS instrument on ACE. These data are compared with solar wind conditions during each event and with estimates of the strength of the associated shock based on average travel times to 1 AU. We find that the majority of events with an Fe to O abundance ratio greater than two times the average 5-12 MeV/nuc value for large SEP events (0.134) occur in the western hemisphere. Furthermore, in most of these Fe-rich events the profiles peak within 12 hours of the associated flare, suggesting that some of the observed interplanetary particles are accelerated in these flares. The vast majority of events with Fe/O below 0.134 are influenced by interplanetary shock acceleration. We suggest that variations in elemental composition in SEP events mainly arise from the combination of flare particles and shock acceleration of these particles and/or the ambient medium.

  14. Research of the Effect Caused by Terrestrial Power Sources on the Near-Earth Space above China based on DEMETER Satellite Data

    NASA Astrophysics Data System (ADS)

    Zhang, C.; Wu, J.; Ma, Q.

    2017-12-01

    The environmental effect on the ionosphere caused by man-made power line emission (PLE) and power line harmonic radiation (PLHR) has become an increasing concern. Based on the observed data of 6.5 operating years of the DEMETER satellite, by scanning the electric field power density time-frequency spectrograms, 133 PLHR events with central frequencies from 500 Hz to 4.5 kHz were detected in the near-Earth space above China. Among the 133 events, 129 have accompanying PLE events at the base power system frequency (50 Hz in China), and the duration of each PLE event fully covers that of the corresponding PLHR event. Like PLHR, PLE propagates in the whistler mode in the ionosphere. In two events detected in the conjugate region of the Australian NWC VLF transmitter, radiations with a line structure in the vicinity of 19.8 kHz were detected. There are 5 lines distributed from about 19.7 kHz to 19.9 kHz, in accordance with the frequency range of the NWC transmitted signals. The frequency spacing of the 5 lines is exactly 50 Hz and the bandwidth of each line is about 10 Hz. The electric field power density of the line-structure radiation is at the same level as that of the corresponding PLE, much higher than that of PLHR. The line-structure radiations suggest possible modulation of VLF signals by PLE. Finally, the variation of ionospheric parameters measured by DEMETER in relation to PLHR is analyzed statistically. As the revisiting orbits of DEMETER pass over the same area with nearly no deviation and at the same time of day, for each PLHR event we check and average the parameters of 3 revisiting orbits before and after the event, respectively; combined with the event orbit, the variations of these parameters can be obtained. There are in total 5 tendencies: no variation, ascending, descending, crest and trough. Only a few events show no variation. Though there are differences among the other 4 tendencies, none of the parameters shows a marked preference for any one of them. Crest and trough events are generally more numerous than ascending and descending events, especially for ion density and O+ ion percentage. The variations of the parameters show no preference with respect to latitude or day of year.

  15. A Short-term ESPERTA-based Forecast Tool for Moderate-to-extreme Solar Proton Events

    NASA Astrophysics Data System (ADS)

    Laurenza, M.; Alberti, T.; Cliver, E. W.

    2018-04-01

    The ESPERTA (Empirical model for Solar Proton Event Real Time Alert) forecast tool has a Probability of Detection (POD) of 63% for all >10 MeV events with proton peak intensity ≥10 pfu (i.e., ≥S1 events, S1 referring to minor storms on the NOAA Solar Radiation Storms scale), from 1995 to 2014 with a false alarm rate (FAR) of 38% and a median (minimum) warning time (WT) of ∼4.8 (0.4) hr. The NOAA space weather scale includes four additional categories: moderate (S2), strong (S3), severe (S4), and extreme (S5). As S1 events have only minor impacts on HF radio propagation in the polar regions, the effective threshold for significant space radiation effects appears to be the S2 level (100 pfu), above which both biological and space operation impacts are observed along with increased effects on HF propagation in the polar regions. We modified the ESPERTA model to predict ≥S2 events and obtained a POD of 75% (41/55) and an FAR of 24% (13/54) for the 1995–2014 interval with a median (minimum) WT of ∼1.7 (0.2) hr based on predictions made at the time of the S1 threshold crossing. The improved performance of ESPERTA for ≥S2 events is a reflection of the big flare syndrome, which postulates that the measures of the various manifestations of eruptive solar flares increase as one considers increasingly larger events.

  16. Extreme-volatility dynamics in crude oil markets

    NASA Astrophysics Data System (ADS)

    Jiang, Xiong-Fei; Zheng, Bo; Qiu, Tian; Ren, Fei

    2017-02-01

    Based on concepts and methods from statistical physics, we investigate extreme-volatility dynamics in the crude oil markets, using the high-frequency data from 2006 to 2010 and the daily data from 1986 to 2016. The dynamic relaxation of extreme volatilities is described by a power law, whose exponents usually depend on the magnitude of extreme volatilities. In particular, the relaxation before and after extreme volatilities is time-reversal symmetric at the high-frequency time scale, but time-reversal asymmetric at the daily time scale. This time-reversal asymmetry is mainly induced by exogenous events. However, the dynamic relaxation after exogenous events exhibits the same characteristics as that after endogenous events. An interacting herding model both with and without exogenous driving forces could qualitatively describe the extreme-volatility dynamics.

  17. Evaluation of the Health Protection Event-Based Surveillance for the London 2012 Olympic and Paralympic Games.

    PubMed

    Severi, E; Kitching, A; Crook, P

    2014-06-19

    The Health Protection Agency (HPA) (currently Public Health England) implemented the Health Protection Event-Based Surveillance (EBS) to provide additional national epidemic intelligence for the 2012 London Olympic and Paralympic Games (the Games). We describe EBS and evaluate the system attributes. EBS aimed at identifying, assessing and reporting to the HPA Olympic Coordination Centre (OCC) possible national infectious disease threats that may significantly impact the Games. EBS reported events in England from 2 July to 12 September 2012. EBS sourced events from reports from local health protection units and from screening an electronic application 'HPZone Dashboard' (DB). During this period, 147 new events were reported to EBS, mostly food-borne and vaccine-preventable diseases: 79 from regional units, 144 from DB (76 from both). EBS reported 61 events to the OCC: 21 of these were reported onwards. EBS sensitivity was 95.2%; positive predictive value was 32.8%; reports were timely (median one day; 10th percentile: 0 days - same day; 90th percentile: 3.6 days); completeness was 99.7%; stability was 100%; EBS simplicity was assessed as good; the daily time per regional or national unit dedicated to EBS was approximately 4 hours (weekdays) and 3 hours (weekends). OCC directors judged EBS as efficient, fast and responsive. EBS provided reliable, reassuring, timely, simple and stable national epidemic intelligence for the Games.

  18. Diagnosis of delay-deadline failures in real time discrete event models.

    PubMed

    Biswas, Santosh; Sarkar, Dipankar; Bhowal, Prodip; Mukhopadhyay, Siddhartha

    2007-10-01

    In this paper a method for fault detection and diagnosis (FDD) of real time systems has been developed. A modeling framework termed as real time discrete event system (RTDES) model is presented and a mechanism for FDD of the same has been developed. The use of RTDES framework for FDD is an extension of the works reported in the discrete event system (DES) literature, which are based on finite state machines (FSM). FDD of RTDES models are suited for real time systems because of their capability of representing timing faults leading to failures in terms of erroneous delays and deadlines, which FSM-based ones cannot address. The concept of measurement restriction of variables is introduced for RTDES and the consequent equivalence of states and indistinguishability of transitions have been characterized. Faults are modeled in terms of an unmeasurable condition variable in the state map. Diagnosability is defined and the procedure of constructing a diagnoser is provided. A checkable property of the diagnoser is shown to be a necessary and sufficient condition for diagnosability. The methodology is illustrated with an example of a hydraulic cylinder.

  19. A regressive storm model for extreme space weather

    NASA Astrophysics Data System (ADS)

    Terkildsen, Michael; Steward, Graham; Neudegg, Dave; Marshall, Richard

    2012-07-01

    Extreme space weather events, while rare, pose significant risk to society in the form of impacts on critical infrastructure such as power grids, and the disruption of high end technological systems such as satellites and precision navigation and timing systems. There has been an increased focus on modelling the effects of extreme space weather, as well as improving the ability of space weather forecast centres to identify, with sufficient lead time, solar activity with the potential to produce extreme events. This paper describes the development of a data-based model for predicting the occurrence of extreme space weather events from solar observation. The motivation for this work was to develop a tool to assist space weather forecasters in early identification of solar activity conditions with the potential to produce extreme space weather, and with sufficient lead time to notify relevant customer groups. Data-based modelling techniques were used to construct the model, and an extensive archive of solar observation data used to train, optimise and test the model. The optimisation of the base model aimed to eliminate false negatives (missed events) at the expense of a tolerable increase in false positives, under the assumption of an iterative improvement in forecast accuracy during progression of the solar disturbance, as subsequent data becomes available.

  20. An Event-Based Verification Scheme for the Real-Time Flare Detection System at Kanzelhöhe Observatory

    NASA Astrophysics Data System (ADS)

    Pötzi, W.; Veronig, A. M.; Temmer, M.

    2018-06-01

    In the framework of the Space Situational Awareness program of the European Space Agency (ESA/SSA), an automatic flare detection system was developed at Kanzelhöhe Observatory (KSO). The system has been in operation since mid-2013. The event detection algorithm was upgraded in September 2017, and all data back to 2014 were reprocessed using the new algorithm. In order to evaluate both algorithms, we apply verification measures that are commonly used for forecast validation. In order to overcome the problem of rare events, which biases the verification measures, we introduce a new event-based method: we divide the timeline of the Hα observations into positive events (flaring periods) and negative events (quiet periods), independent of the length of each event. In total, 329 positive and negative events were detected between 2014 and 2016. The hit rate for the new algorithm reached 96% (just five events were missed) with a false-alarm ratio of 17%. This is a significant improvement of the algorithm, as the original system had a hit rate of 85% and a false-alarm ratio of 33%. The true skill score and the Heidke skill score both reach values of 0.8 for the new algorithm; originally, they were at 0.5. The mean flare positions are accurate to within ±1 heliographic degree for both algorithms, and the peak times improve from a mean difference of 1.7 ± 2.9 minutes to 1.3 ± 2.3 minutes. The flare start times, which had been systematically late by about 3 minutes as determined by the original algorithm, now match the visual inspection within −0.47 ± 4.10 minutes.
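
    The measures quoted above come from a standard 2x2 contingency table. A small sketch, with a = hits, b = false alarms, c = misses and d = correct negatives (the demo counts are illustrative, not the paper's table):

        def verification_scores(a, b, c, d):
            # a = hits, b = false alarms, c = misses, d = correct negatives.
            n = a + b + c + d
            pod = a / (a + c)                    # probability of detection (hit rate)
            far = b / (a + b)                    # false-alarm ratio
            tss = pod - b / (b + d)              # true skill score
            e = ((a + b) * (a + c) + (c + d) * (b + d)) / n  # chance agreement
            hss = (a + d - e) / (n - e)          # Heidke skill score
            return pod, far, tss, hss

        print(verification_scores(a=96, b=20, c=4, d=209))  # illustrative counts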

  1. A universal approach to determine footfall timings from kinematics of a single foot marker in hoofed animals

    PubMed Central

    Clayton, Hilary M.

    2015-01-01

    The study of animal movement commonly requires the segmentation of continuous data streams into individual strides. The use of forceplates and foot-mounted accelerometers readily allows the detection of the foot-on and foot-off events that define a stride. However, when relying on optical methods such as motion capture, there is a lack of validated, robust, universally applicable stride event detection methods. To date, no method has been validated for movement on a circle, while algorithms are commonly specific to front/hind limbs or gait. In this study, we aimed to develop and validate kinematic stride segmentation methods applicable to movement on the straight line and circle at walk and trot, which rely exclusively on a single, dorsal hoof marker. The advantage of such marker placement is its robustness to marker loss and occlusion. Eight horses walked and trotted on a straight line and in a circle over an array of multiple forceplates. Kinetic events were detected based on the vertical force profile and used as the reference values. Kinematic events were detected based on the displacement, velocity or acceleration signals of the dorsal hoof marker, depending on the algorithm, using (i) defined thresholds associated with the derived movement signals and (ii) specific events in the derived movement signals. Method comparison was performed by calculating limits of agreement, accuracy, between-horse precision and within-horse precision based on the differences between kinetic and kinematic events. In addition, we examined the effect of force thresholds ranging from 50 to 150 N on the timings of the kinetic events. The two approaches resulted in very good and comparable performance: of the 3,074 processed footfall events, 95% of individual foot-on and foot-off events differed by no more than 26 ms from the kinetic event, with average accuracy between −11 and 10 ms and average within- and between-horse precision ≤8 ms. While the event-based method may be less likely to suffer from scaling effects, on soft ground the threshold-based method may prove more valuable. While we found that the use of velocity thresholds for foot-on detection results in biased event estimates for the foot on the inside of the circle at trot, adjusting the thresholds for this condition negated the effect. For the final four algorithms, we found no noteworthy bias between conditions or between front- and hind-foot timings. Different force thresholds in the range of 50 to 150 N had the greatest systematic effect on foot-off estimates in the hind limbs (up to 16 ms on average per condition), greater than the effect on foot-on estimates or on foot-off estimates in the forelimbs (up to ±7 ms on average per condition). PMID:26157641
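
    A sketch of the threshold-based variant on a velocity signal, with assumed threshold values: foot-on is declared when the hoof-marker speed falls below one threshold, foot-off when it rises above another.

        import numpy as np

        def footfall_events(speed, fs, v_on=0.05, v_off=0.10):
            # speed: hoof-marker speed in m/s sampled at fs Hz; thresholds assumed.
            foot_on, foot_off, stance = [], [], False
            for i in range(1, speed.size):
                if not stance and speed[i - 1] >= v_on > speed[i]:
                    foot_on.append(i / fs)   # marker slows below v_on: foot-on
                    stance = True
                elif stance and speed[i - 1] <= v_off < speed[i]:
                    foot_off.append(i / fs)  # marker speeds up past v_off: foot-off
                    stance = False
            return foot_on, foot_off

        speed = np.abs(np.sin(np.linspace(0, 4 * np.pi, 400)))  # toy speed trace
        on_times, off_times = footfall_events(speed, fs=100.0)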

  2. The relative importance of real-time in-cab and external feedback in managing fatigue in real-world commercial transport operations.

    PubMed

    Fitzharris, Michael; Liu, Sara; Stephens, Amanda N; Lenné, Michael G

    2017-05-29

    Real-time driver monitoring systems represent a solution to address key behavioral risks as they occur, particularly distraction and fatigue. The efficacy of these systems in real-world settings is largely unknown. This article has three objectives: (1) to document the incidence and duration of fatigue in real-world commercial truck-driving operations, (2) to determine the reduction, if any, in the incidence of fatigue episodes associated with providing feedback, and (3) to tease apart the relative contribution of in-cab warnings from 24/7 monitoring and feedback to employers. Data collected from a commercially available in-vehicle camera-based driver monitoring system installed in a commercial truck fleet operating in Australia were analyzed. The real-time driver monitoring system makes continuous assessments of driver drowsiness based on eyelid position and other factors. Data were collected in a baseline period where no feedback was provided to drivers. Real-time feedback to drivers then occurred via in-cab auditory and haptic warnings, which were further enhanced by direct feedback from company management when fatigue events were detected by external 24/7 monitors. Fatigue incidence rates and their timing of occurrence across the three time periods were compared. Relative to no feedback being provided to drivers when fatigue events were detected, in-cab warnings resulted in a 66% reduction in fatigue events, with a 95% reduction achieved by the real-time provision of direct feedback in addition to in-cab warnings (p < 0.01). With feedback, fatigue events were shorter in duration and occurred later in the trip, and fewer drivers had more than one verified fatigue event per trip. That the provision of feedback to the company on driver fatigue events in real time provides greater benefit than feedback to the driver alone has implications for companies seeking to mitigate risks associated with fatigue. Having fewer fatigue events is likely a reflection of the device itself and the accompanying safety culture of the company in terms of how the information is used. Data were analysed on a per-truck-trip basis, and the findings are indicative of fatigue events in a large-scale commercial transport fleet. Future research ought to account for individual driver performance, which was not possible with the available data in this retrospective analysis. Evidence that real-time driver monitoring feedback is effective in reducing fatigue events is invaluable in the development of fleet safety policies, and of future national policy and vehicle safety regulations. Implications for automotive driver monitoring are discussed.

  3. A General and Efficient Method for Incorporating Precise Spike Times in Globally Time-Driven Simulations

    PubMed Central

    Hanuschkin, Alexander; Kunkel, Susanne; Helias, Moritz; Morrison, Abigail; Diesmann, Markus

    2010-01-01

    Traditionally, event-driven simulations have been limited to the very restricted class of neuronal models for which the timing of future spikes can be expressed in closed form. Recently, the class of models that is amenable to event-driven simulation has been extended by the development of techniques to accurately calculate firing times for some integrate-and-fire neuron models that do not enable the prediction of future spikes in closed form. The motivation of this development is the general perception that time-driven simulations are imprecise. Here, we demonstrate that a globally time-driven scheme can calculate firing times that cannot be discriminated from those calculated by an event-driven implementation of the same model; moreover, the time-driven scheme incurs lower computational costs. The key insight is that time-driven methods are based on identifying a threshold crossing in the recent past, which can be implemented by a much simpler algorithm than the techniques for predicting future threshold crossings that are necessary for event-driven approaches. As run time is dominated by the cost of the operations performed at each incoming spike, which includes spike prediction in the case of event-driven simulation and retrospective detection in the case of time-driven simulation, the simple time-driven algorithm outperforms the event-driven approaches. Additionally, our method is generally applicable to all commonly used integrate-and-fire neuronal models; we show that a non-linear model employing a standard adaptive solver can reproduce a reference spike train with a high degree of precision. PMID:21031031
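
    The key insight, detecting a threshold crossing in the recent past and then locating it retrospectively, fits in a few lines. A sketch for a leaky integrate-and-fire neuron with linear interpolation of the crossing time (parameters are illustrative):

        import numpy as np

        def lif_spikes(I, dt=0.1, tau=10.0, v_th=1.0):
            # Fixed-step integration; when a threshold crossing is found between
            # grid points, the spike time is located retrospectively by linear
            # interpolation, as in the time-driven scheme described above.
            v, spikes = 0.0, []
            for k, i_k in enumerate(I):
                v_prev = v
                v += dt * (-v / tau + i_k)
                if v_prev < v_th <= v:
                    frac = (v_th - v_prev) / (v - v_prev)
                    spikes.append((k + frac) * dt)  # interpolated spike time
                    v = 0.0                         # reset after the spike
            return spikes

        spike_times = lif_spikes(np.full(1000, 0.15))  # constant input current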

  4. Acoustic emission analysis of tooth-composite interfacial debonding.

    PubMed

    Cho, N Y; Ferracane, J L; Lee, I B

    2013-01-01

    This study detected tooth-composite interfacial debonding during composite restoration by means of acoustic emission (AE) analysis and investigated the effects of composite properties and adhesives on AE characteristics. The polymerization shrinkage, peak shrinkage rate, flexural modulus, and shrinkage stress of a methacrylate-based universal hybrid, a flowable, and a silorane-based composite were measured. Class I cavities on 49 extracted premolars were restored with 1 of the 3 composites and 1 of the following adhesives: 2 etch-and-rinse adhesives, 2 self-etch adhesives, and an adhesive for the silorane-based composite. AE analysis was done for 2,000 sec during light-curing. The silorane-based composite exhibited the lowest shrinkage (rate), the longest time to peak shrinkage rate, the lowest shrinkage stress, and the fewest AE events. AE events were detected immediately after the beginning of light-curing in most composite-adhesive combinations, but not until 40 sec after light-curing began for the silorane-based composite. AE events were concentrated at the initial stage of curing in self-etch adhesives compared with etch-and-rinse adhesives. Reducing the shrinkage (rate) of composites resulted in reduced shrinkage stress and less debonding, as evidenced by fewer AE events. AE is an effective technique for monitoring, in real time, the debonding kinetics at the tooth-composite interface.

  5. Microseismic Events Detection on Xishancun Landslide, Sichuan Province, China

    NASA Astrophysics Data System (ADS)

    Sheng, M.; Chu, R.; Wei, Z.

    2016-12-01

    On a landslide, slope movement and fracturing of the rock mass often generate microearthquakes, which are recorded as weak signals on seismographs. The temporal and spatial distribution of unstable regions, as well as the impact of external factors on them, can be understood and analyzed by monitoring those microseismic events. The microseismic method can provide information from inside the landslide, supplementing geodetic methods that monitor the movement of the landslide surface. Compared to drilling on a landslide, the microseismic method is more economical and safe. The Xishancun landslide is located about 60 km northwest of the Wenchuan earthquake centroid; it has kept deforming since the earthquake, which greatly increases the probability of disasters. In the autumn of 2015, 30 seismometers were deployed on the landslide for 3 months at intervals of 200-500 meters. First, we used regional earthquakes for the time correction of the seismometers, to eliminate the influence of inaccurate GPS clocks and of the subsurface structure beneath the stations. Due to the low velocity of the loose medium, the travel-time differences of microseismic events across the landslide reach up to 5 s. Based on travel times and waveform characteristics, we found many microseismic events and converted them into envelopes as templates; we then used a sliding-window cross-correlation technique based on the waveform envelopes to detect further microseismic events. Consequently, 100 microseismic events were detected with waveforms recorded on all seismometers. Based on their locations, most of them lie at the front of the landslide while the others lie at the back end. The bottom and top of the landslide accumulated considerable energy and deformed strongly, and the radiated waves could be recorded by all stations; the bottom, with more events, appears especially active. In addition, many smaller events occurred in the middle part of the landslide; these released less energy, and the generated signals could be recorded by only a few stations. Based on the distribution of the microseismic events, we identified four unstable regions, which agree well with the deformed areas monitored by geodetic methods. The distribution of the microseismic events should be related to the internal structure and movement of the landslide.
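
    A minimal sketch of sliding-window cross-correlation on waveform envelopes (the normalization and threshold are assumptions, not the study's processing): the template envelope is slid along the continuous envelope and a detection is declared where the normalized correlation exceeds a threshold.

        import numpy as np

        def detect(env, template, thresh=0.8):
            # Normalized cross-correlation of the template envelope against
            # every window of the continuous envelope; indices above `thresh`
            # are detections.
            n = template.size
            t0 = (template - template.mean()) / (template.std() + 1e-12)
            hits = []
            for i in range(env.size - n):
                w = env[i:i + n]
                c = np.dot(t0, (w - w.mean()) / (w.std() + 1e-12)) / n
                if c > thresh:
                    hits.append(i)
            return hits

        env = np.abs(np.random.randn(5000))   # toy continuous envelope
        template = env[1000:1200].copy()      # a known event used as template
        print(detect(env, template)[:3])      # should include index 1000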

  6. The TRMM Multi-satellite Precipitation Analysis (TMPA): Quasi-Global Precipitation Estimates at Fine Scales

    NASA Technical Reports Server (NTRS)

    Huffman, George J.; Adler, Robert F.; Bolvin, David T.; Gu, Guojun; Nelkin, Eric J.; Bowman, Kenneth P.; Stocker, Erich; Wolff, David B.

    2006-01-01

    The TRMM Multi-satellite Precipitation Analysis (TMPA) provides a calibration-based sequential scheme for combining multiple precipitation estimates from satellites, as well as gauge analyses where feasible, at fine scales (0.25 degrees x 0.25 degrees and 3-hourly). It is available both after and in real time, based on calibration by the TRMM Combined Instrument and TRMM Microwave Imager precipitation products, respectively. Only the after-real-time product incorporates gauge data at the present. The data set covers the latitude band 50 degrees N-S for the period 1998 to the delayed present. Early validation results are as follows: The TMPA provides reasonable performance at monthly scales, although it is shown to have precipitation rate dependent low bias due to lack of sensitivity to low precipitation rates in one of the input products (based on AMSU-B). At finer scales the TMPA is successful at approximately reproducing the surface-observation-based histogram of precipitation, as well as reasonably detecting large daily events. The TMPA, however, has lower skill in correctly specifying moderate and light event amounts on short time intervals, in common with other fine-scale estimators. Examples are provided of a flood event and diurnal cycle determination.

  7. On Mixed Data and Event Driven Design for Adaptive-Critic-Based Nonlinear $H_{\infty}$ Control.

    PubMed

    Wang, Ding; Mu, Chaoxu; Liu, Derong; Ma, Hongwen

    2018-04-01

    In this paper, based on the adaptive critic learning technique, the control for a class of unknown nonlinear dynamic systems is investigated by adopting a mixed data- and event-driven design approach. The nonlinear control problem is formulated as a two-player zero-sum differential game and the adaptive critic method is employed to cope with the data-based optimization. The novelty lies in that the data-driven learning identifier is combined with the event-driven design formulation, in order to develop the adaptive critic controller, thereby accomplishing the nonlinear control. The event-driven optimal control law and the time-driven worst-case disturbance law are approximated by constructing and tuning a critic neural network. Applying the event-driven feedback control, the closed-loop system is built with stability analysis. Simulation studies are conducted to verify the theoretical results and illustrate the control performance. It is significant to observe that the present research provides a new avenue for integrating data-based control and an event-triggering mechanism into establishing advanced adaptive critic systems.
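
    The event-driven feedback idea in this abstract — recompute the control only when the state has drifted sufficiently from its last sampled value — can be illustrated with a minimal Python simulation on a toy linear system. The system matrices, gain, and triggering threshold below are assumptions for illustration; the paper itself learns the control with a critic neural network, which is not reproduced here.

        import numpy as np

        # Toy linear system x' = Ax + Bu with event-triggered state
        # feedback: the control is recomputed only when the gap between
        # the current state and the last sampled state exceeds a
        # threshold. Gains and threshold are assumed, not the paper's
        # learned adaptive-critic design.
        A = np.array([[0.0, 1.0], [-1.0, -0.5]])
        B = np.array([[0.0], [1.0]])
        K = np.array([[1.0, 1.5]])   # assumed stabilizing feedback gain
        dt, steps, thresh = 0.01, 2000, 0.05

        x = np.array([1.0, 0.0])
        x_k = x.copy()               # state at the last triggering event
        u = -(K @ x_k).item()
        events = 0
        for _ in range(steps):
            if np.linalg.norm(x - x_k) > thresh:   # triggering condition
                x_k = x.copy()
                u = -(K @ x_k).item()
                events += 1
            x = x + dt * (A @ x + B.flatten() * u)
        print(f"{events} control updates in {steps} steps, "
              f"final |x| = {np.linalg.norm(x):.3f}")

    The point of the triggering condition is visible in the counts: the state converges with far fewer than the 2000 control updates a time-driven scheme would use.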

  8. Design a Learning-Oriented Fall Event Reporting System Based on Kirkpatrick Model.

    PubMed

    Zhou, Sicheng; Kang, Hong; Gong, Yang

    2017-01-01

    Patient fall has been a severe problem in healthcare facilities around the world due to its prevalence and cost. Routine fall prevention training programs are not as effective as expected. Using event reporting systems is the trend for reducing patient safety events such as falls, although the systems still have limitations at the current stage. We summarized these limitations through a literature review and developed an improved web-based fall event reporting system. The Kirkpatrick model, widely used in the business field for training program evaluation, was integrated into the design of our system. Unlike traditional event reporting systems that only collect and store reports, our system automatically annotates and analyzes the reported events and provides users with timely knowledge support specific to the reported event. This paper illustrates the design of our system and how its features are intended to reduce patient falls by learning from previous errors.

  9. Stochastic generation of hourly rainstorm events in Johor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nojumuddin, Nur Syereena; Yusof, Fadhilah; Yusop, Zulkifli

    2015-02-03

    Engineers and researchers in water-related studies are often faced with the problem of insufficiently long rainfall records. Practical and effective methods must be developed to generate unavailable data from the limited available data. This paper therefore presents a Monte Carlo-based stochastic hourly rainfall generation model to complement the unavailable data. The Monte Carlo simulation used in this study is based on the best fit of storm characteristics. Using Maximum Likelihood Estimation (MLE) and the Anderson-Darling goodness-of-fit test, the lognormal distribution appeared to fit the rainfall best, so the Monte Carlo simulation was based on the lognormal distribution. The proposed model was verified by comparing the statistical moments of rainstorm characteristics from the combination of observed rainstorm events from 10 years and simulated rainstorm events for 30 years of rainfall records with those from the entire 40 years of observed rainfall data, based on the hourly rainfall data at station J1 in Johor over the period 1972-2011. The absolute percentage errors of the duration-depth, duration-inter-event time, and depth-inter-event time relationships were used as the accuracy test. The results showed that the first four product-moments of the observed rainstorm characteristics were close to those of the simulated rainstorm characteristics. The proposed model can be used as a basis to derive rainfall intensity-duration-frequency relationships in Johor.
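
    As a rough sketch of the fitting-and-simulation pipeline described here (MLE fit, Anderson-Darling check, Monte Carlo generation), the following Python fragment uses synthetic depths in place of the station J1 record; all numbers are illustrative. Note that scipy's Anderson-Darling test has no lognormal option, so the log-transformed data are tested against a normal distribution.

        import numpy as np
        from scipy import stats

        # Hypothetical observed rainstorm depths (mm); real data would
        # come from the hourly record at station J1.
        rng = np.random.default_rng(1)
        observed_depths = rng.lognormal(mean=2.0, sigma=0.8, size=120)

        # MLE fit of a lognormal distribution (location fixed at 0).
        shape, loc, scale = stats.lognorm.fit(observed_depths, floc=0)

        # Anderson-Darling check on the log-transformed depths.
        ad = stats.anderson(np.log(observed_depths), dist='norm')
        print("A-D statistic:", ad.statistic)

        # Monte Carlo generation of synthetic rainstorm depths.
        simulated = stats.lognorm.rvs(shape, loc=loc, scale=scale,
                                      size=1000, random_state=rng)

        # Compare the first moments of observed and simulated events.
        print("observed mean/std:",
              observed_depths.mean(), observed_depths.std())
        print("simulated mean/std:", simulated.mean(), simulated.std())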

  10. An event-version-based spatio-temporal modeling approach and its application in the cadastral management

    NASA Astrophysics Data System (ADS)

    Li, Yangdong; Han, Zhen; Liao, Zhongping

    2009-10-01

    Spatiality, temporality, legality, accuracy, and continuality are characteristic of cadastral information, and cadastral management demands that cadastral data be accurate, integrated, and updated in a timely manner. An effective GIS management system is a natural way to manage cadastral data, which are characterized by spatiality and temporality. However, because no sound spatio-temporal data model has been adopted, the spatio-temporal characteristics of cadastral data are not well expressed in existing cadastral management systems. We first propose an event-version-based spatio-temporal modeling approach from the perspective of events and versions. With its help, an event-version-based spatio-temporal cadastral data model is built to represent spatio-temporal cadastral data. Finally, the model is used in the design and implementation of a spatio-temporal cadastral management system. The application of the system shows that the event-version-based spatio-temporal data model is well suited to the representation and organization of cadastral data.

  11. Tempo Rubato: Animacy Speeds Up Time in the Brain

    PubMed Central

    Carrozzo, Mauro; Moscatelli, Alessandro; Lacquaniti, Francesco

    2010-01-01

    Background How do we estimate time when watching an action? The idea that events are timed by a centralized clock has recently been called into question in favour of distributed, specialized mechanisms. Here we provide evidence for a critical specialization: animate and inanimate events are separately timed by humans. Methodology/Principal Findings In different experiments, observers were asked to intercept a moving target or to discriminate the duration of a stationary flash while viewing different scenes. Time estimates were systematically shorter in the sessions involving human characters moving in the scene than in those involving inanimate moving characters. Remarkably, the animate/inanimate context also affected randomly intermingled trials which always depicted the same still character. Conclusions/Significance The existence of distinct time bases for animate and inanimate events might be related to the partial segregation of the neural networks processing these two categories of objects, and could enhance our ability to predict critically timed actions. PMID:21206749

  12. Supporting Beacon and Event-Driven Messages in Vehicular Platoons through Token-Based Strategies

    PubMed Central

    Uhlemann, Elisabeth

    2018-01-01

    Timely and reliable inter-vehicle communications is a critical requirement to support traffic safety applications, such as vehicle platooning. Furthermore, low-delay communications allow the platoon to react quickly to unexpected events. In this scope, having a predictable and highly effective medium access control (MAC) method is of utmost importance. However, the currently available IEEE 802.11p technology is unable to adequately address these challenges. In this paper, we propose a MAC method especially adapted to platoons, able to transmit beacons within the required time constraints, but with a higher reliability level than IEEE 802.11p, while concurrently enabling efficient dissemination of event-driven messages. The protocol circulates the token within the platoon not in a round-robin fashion, but based on beacon data age, i.e., the time that has passed since the previous collection of status information, thereby automatically offering repeated beacon transmission opportunities for increased reliability. In addition, we propose three different methods for supporting event-driven messages co-existing with beacons. Analysis and simulation results in single and multi-hop scenarios showed that, by providing non-competitive channel access and frequent retransmission opportunities, our protocol can offer beacon delivery within one beacon generation interval while fulfilling the requirements on low-delay dissemination of event-driven messages for traffic safety applications. PMID:29570676

  13. Supporting Beacon and Event-Driven Messages in Vehicular Platoons through Token-Based Strategies.

    PubMed

    Balador, Ali; Uhlemann, Elisabeth; Calafate, Carlos T; Cano, Juan-Carlos

    2018-03-23

    Timely and reliable inter-vehicle communications is a critical requirement to support traffic safety applications, such as vehicle platooning. Furthermore, low-delay communications allow the platoon to react quickly to unexpected events. In this scope, having a predictable and highly effective medium access control (MAC) method is of utmost importance. However, the currently available IEEE 802.11p technology is unable to adequately address these challenges. In this paper, we propose a MAC method especially adapted to platoons, able to transmit beacons within the required time constraints, but with a higher reliability level than IEEE 802.11p, while concurrently enabling efficient dissemination of event-driven messages. The protocol circulates the token within the platoon not in a round-robin fashion, but based on beacon data age, i.e., the time that has passed since the previous collection of status information, thereby automatically offering repeated beacon transmission opportunities for increased reliability. In addition, we propose three different methods for supporting event-driven messages co-existing with beacons. Analysis and simulation results in single and multi-hop scenarios showed that, by providing non-competitive channel access and frequent retransmission opportunities, our protocol can offer beacon delivery within one beacon generation interval while fulfilling the requirements on low-delay dissemination of event-driven messages for traffic safety applications.
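
    The core scheduling rule described in both records — pass the token to the platoon member whose beacon data is oldest — is simple to state in code. The sketch below is an illustrative simplification with hypothetical timing constants, not the authors' protocol implementation.

        from dataclasses import dataclass

        @dataclass
        class Vehicle:
            vid: int
            last_beacon_time: float = 0.0  # when its status was last broadcast

        def next_token_holder(platoon, now):
            # Data-age rule: hand the token to the vehicle whose beacon
            # data is oldest, i.e. whose status is most out of date.
            return max(platoon, key=lambda v: now - v.last_beacon_time)

        platoon = [Vehicle(i, last_beacon_time=0.1 * i) for i in range(5)]
        now = 1.0
        for _ in range(6):
            holder = next_token_holder(platoon, now)
            print(f"t={now:.2f}s token -> vehicle {holder.vid}")
            holder.last_beacon_time = now  # transmitting refreshes its age
            now += 0.02                    # assumed per-slot channel time

    Because transmitting resets a vehicle's data age, the rule automatically cycles through the platoon while granting extra transmission opportunities to any member whose beacon failed to get through.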

  14. Sleep Patterns Are Associated with Common Illness in Adolescents

    PubMed Central

    Orzech, Kathryn M.; Acebo, Christine; Seifer, Ronald; Barker, David; Carskadon, Mary A.

    2014-01-01

    Summary This prospective, field-based study examined the association between actigraphically-measured total sleep time and incident illness including cold, flu, gastroenteritis, and other common infectious diseases (e.g., strep throat) in adolescents over the course of a school semester. Participants were 56 adolescents ages 14–19 years (mean = 16.6 (standard deviation = 1.2), 39% male) from 5 high schools in Rhode Island. Beginning in late January, adolescents wore actigraphs (mean 91 (19) days, range 16 – 112 days) and were assigned post-hoc to Longer or Shorter sleep groups based on median splits. Adolescents were interviewed weekly across as many as 16 weeks (modal number of interviews = 13) using a structured protocol that included 14 health event questions. Illness events and illness-related school absences were coded for 710 completed interviews, with 681 illness events and 90 school absences reported. Outcomes (illness bouts, illness duration, and absences) were compared among sex, sleep, and academic year groups using non-parametric regression. In a subset of 18 subjects, mean actigraphically estimated total sleep time 6 nights before matched illness/wellness events was compared using MANOVA. Longer sleepers and males reported fewer illness bouts; total sleep time effects were more apparent in males than females. A trend was found for shorter total sleep time before ill events. The present findings in this small naturalistic sample indicate that acute illnesses were more frequent in otherwise healthy adolescents with shorter sleep, and illness events were associated with less sleep during the prior week than comparable matched periods without illness. PMID:24134661

  15. Noether's Theorem and its Inverse of Birkhoffian System in Event Space Based on Herglotz Variational Problem

    NASA Astrophysics Data System (ADS)

    Tian, X.; Zhang, Y.

    2018-03-01

    Herglotz variational principle, in which the functional is defined by a differential equation, generalizes the classical one, which defines the functional by an integral. The principle gives a variational description of nonconservative systems even when the Lagrangian is independent of time. This paper focuses on studying Noether's theorem and its inverse for a Birkhoffian system in event space based on the Herglotz variational problem. Firstly, according to the Herglotz variational principle of a Birkhoffian system, the principle for a Birkhoffian system in event space is established. Secondly, its parametric equations and two basic formulae for the variation of the Pfaff-Herglotz action of a Birkhoffian system in event space are obtained. Furthermore, the definition and criteria of Noether symmetry of the Birkhoffian system in event space based on the Herglotz variational problem are given. Then, according to the relationship between Noether symmetry and conserved quantity, Noether's theorem is derived. Under classical conditions, Noether's theorem of a Birkhoffian system in event space based on the Herglotz variational problem reduces to the classical one. In addition, Noether's inverse theorem of the Birkhoffian system in event space based on the Herglotz variational problem is also obtained. At the end of the paper, an example is given to illustrate the application of the results.
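
    For orientation, the Herglotz construction the abstract builds on defines the action functional through a differential equation rather than an integral; in its classical (non-Birkhoffian) form it can be written in LaTeX as:

        \dot{z}(t) = L\bigl(t,\, q(t),\, \dot{q}(t),\, z(t)\bigr), \qquad z(a) = z_0

    The extremals are the curves q(t), with fixed endpoints, that extremize the terminal value z(b). When L does not depend on z, integrating gives z(b) = z_0 + \int_a^b L\, dt, which recovers the classical action functional.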

  16. A Prototype External Event Broker for LSST

    NASA Astrophysics Data System (ADS)

    Elan Alvarez, Gabriella; Stassun, Keivan; Burger, Dan; Siverd, Robert; Cox, Donald

    2015-01-01

    LSST plans to have an alerts system that will automatically identify various types of "events" appearing in the LSST data stream. These events will include supernovae, moving objects, and many other types, and it is expected that there may be millions to tens of millions of events each night. To help the LSST community parse and take full advantage of the LSST alerts stream, we are working to design an external "events alert broker" that will generate real-time notification of LSST events to users and/or robotic telescope facilities based on user-specified criteria. For example, users will be able to specify that they wish to be notified immediately via text message of urgent events, such as GRB counterparts, or notified only occasionally in digest form of less time-sensitive events, such as eclipsing binaries. This poster will summarize results from a survey of scientists on the most important features that such an alerts notification service needs to provide, and will present a preliminary design for our external event broker.
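
    A broker of this kind reduces, at its core, to matching an event stream against stored subscriptions. The Python sketch below illustrates that matching step only; the event fields, criteria, and delivery modes are hypothetical and not the LSST alert schema.

        # Illustrative subscription matching for an alerts broker; all
        # fields and delivery modes are hypothetical.
        events = [
            {"type": "GRB_counterpart", "mag": 18.2, "urgent": True},
            {"type": "eclipsing_binary", "mag": 14.0, "urgent": False},
        ]

        subscriptions = [
            {"user": "alice", "types": {"GRB_counterpart"}, "delivery": "sms"},
            {"user": "bob", "types": {"eclipsing_binary"}, "delivery": "digest"},
        ]

        digest = {}
        for ev in events:
            for sub in subscriptions:
                if ev["type"] not in sub["types"]:
                    continue
                if sub["delivery"] == "sms" and ev["urgent"]:
                    print(f"SMS to {sub['user']}: {ev['type']} mag={ev['mag']}")
                else:
                    digest.setdefault(sub["user"], []).append(ev["type"])

        for user, items in digest.items():
            print(f"digest for {user}: {items}")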

  17. Influence of Convective Effect of Solar Winds on the CME Transit Time

    NASA Astrophysics Data System (ADS)

    Sun, Lu-yuan

    2017-10-01

    Based on an empirical model for predicting the transit time of coronal mass ejections (CMEs) proposed by Gopalswamy, 52 CME events related to geomagnetic storms of Dst < -50 nT and 10 CME events that caused extremely strong geomagnetic storms (Dst < -200 nT) in 1996-2007 are selected and combined with the observational data of the interplanetary solar wind collected by the ACE satellite at 1 AU, to analyze the influence of the convective effect of the ambient solar wind on the prediction of the CME transit time to 1 AU. After taking the convective effect of the ambient solar wind into account, the standard deviation of the predictions is reduced from 16.5 to 11.4 hours for the 52 CME events, and the prediction error is less than 15 hours for 68% of these events; for the 10 CME events that caused extremely strong geomagnetic storms, the standard deviation is reduced from 10.6 to 6.5 hours, and the prediction error is less than 5 hours for 6 of the 10 events. These results show that taking the convective effect of the ambient solar wind into account reduces the standard deviation of the predicted CME transit time, and hence that the convective effect of the solar wind plays an important role in predicting the transit times of CME events.

  18. Teaching dental students about patient communication following an adverse event: a pilot educational module.

    PubMed

    Raja, Sheela; Rajagopalan, Chelsea F; Patel, Janki; Van Kanegan, Kevin

    2014-05-01

    Adverse events are an important but understudied area in dentistry. Most dentists will face the issue of an adverse event several times in their clinical careers. The authors implemented a six-hour pilot educational module at one dental school to improve fourth-year dental students' knowledge and confidence in communicating with patients about adverse events. Based on results from the twenty-nine students who completed both the pre- and posttests, the module significantly increased the students' knowledge of the key concepts involved in adverse events. However, the module did not improve the students' confidence that they would be able to implement these communication skills in clinical situations. Based on these results, this article discusses how future educational efforts can be modified to better prepare students for the communication challenges associated with adverse events.

  19. Quantitative Adverse Outcome Pathways and Their Application to Predictive Toxicology

    EPA Science Inventory

    A quantitative adverse outcome pathway (qAOP) consists of one or more biologically based, computational models describing key event relationships linking a molecular initiating event (MIE) to an adverse outcome. A qAOP provides quantitative, dose–response, and time-course p...

  20. Influence evaluation of loading conditions during pressurized thermal shock transients based on thermal-hydraulics and structural analyses

    NASA Astrophysics Data System (ADS)

    Katsuyama, Jinya; Uno, Shumpei; Watanabe, Tadashi; Li, Yinsheng

    2018-03-01

    The thermal-hydraulic (TH) behavior of coolant water is a key factor in structural integrity assessments of reactor pressure vessels (RPVs) of pressurized water reactors (PWRs) under pressurized thermal shock (PTS) events, because the TH behavior may affect the loading conditions in the assessment. From the viewpoint of TH behavior, the configuration of plant equipment and its dimensions, as well as the operator action time, considerably influence parameters such as the temperature and flow rate of the coolant water and the inner pressure. In this study, to investigate the influence of the operator action time on TH behavior during a PTS event, we developed an analysis model for a typical Japanese PWR plant, including the RPV and the main components of both the primary and secondary systems, and performed TH analyses using the system analysis code RELAP5. We applied two different operator action times based on the Japanese and United States (US) rules: operators may act 10 min (Japanese rules) or 30 min (US rules) after the occurrence of a PTS event. Based on the results of the TH analyses with the different operator action times, we also performed structural analyses to evaluate thermal-stress distributions in the RPV during PTS events as loading conditions in the structural integrity assessment. The analysis results clarified that differences in operator action times significantly affect TH behavior and loading conditions: the Japanese rule may lead to lower stresses than the US rule because the earlier operator action causes lower pressure in the RPV.

  1. Precision Seismic Monitoring of Volcanic Eruptions at Axial Seamount

    NASA Astrophysics Data System (ADS)

    Waldhauser, F.; Wilcock, W. S. D.; Tolstoy, M.; Baillard, C.; Tan, Y. J.; Schaff, D. P.

    2017-12-01

    Seven permanent ocean bottom seismometers of the Ocean Observatories Initiative's real-time cabled observatory at Axial Seamount, off the coast of the western United States, have recorded seismic activity since 2014. The array captured the April 2015 eruption, shedding light on the detailed structure and dynamics of the volcano and the Juan de Fuca mid-ocean ridge system (Wilcock et al., 2016). After a period of continuously increasing seismic activity, primarily associated with the reactivation of caldera ring faults, and the subsequent seismic crisis of April 24, 2015, with 7000 recorded events that day, seismicity rates steadily declined, and the array currently records an average of 5 events per day. Here we present results from ongoing efforts to automatically detect and precisely locate seismic events at Axial in real time, providing the computational framework and fundamental data that will allow rapid characterization and analysis of spatio-temporal changes in seismogenic properties. We combine a kurtosis-based P- and S-phase onset picker and time-domain cross-correlation detection and phase delay timing algorithms with single-event and double-difference location methods to rapidly and precisely (to tens of meters) compute the locations and magnitudes of new events with respect to a 2-year long, high-resolution background catalog that includes nearly 100,000 events within a 5×5 km region. We extend the real-time double-difference location software DD-RT to efficiently handle the anticipated high-rate and high-density earthquake activity during future eruptions. The modular monitoring framework will allow real-time tracking of other seismic events, such as tremors and sea-floor lava explosions, enabling the timing and location of lava flows to be determined and thus guiding response research cruises to the most interesting sites. Finally, rapid detection of eruption precursors and initiation will allow adaptive sampling by the OOI instruments for optimal recording of future eruptions. With a higher eruption recurrence rate than land-based volcanoes, the Axial OOI observatory offers the opportunity to monitor and study volcanic eruptions throughout multiple cycles.
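
    A kurtosis-based onset picker of the kind mentioned here exploits the fact that the kurtosis of a sliding window jumps when an impulsive arrival enters the window. A minimal Python sketch on synthetic data follows; the window length and the pick rule (steepest rise of the characteristic function) are illustrative simplifications of production pickers.

        import numpy as np
        from scipy.stats import kurtosis

        def kurtosis_onset(trace, win=100):
            # Characteristic function: kurtosis in a sliding window; the
            # pick is placed at the steepest rise. Window length and pick
            # rule are illustrative; real pickers add band filtering and
            # declustering.
            k = np.array([kurtosis(trace[i:i + win])
                          for i in range(len(trace) - win)])
            return int(np.argmax(np.diff(k))) + win

        rng = np.random.default_rng(2)
        trace = rng.normal(0, 1.0, 2000)
        onset = 1200  # synthetic impulsive arrival with exponential decay
        trace[onset:onset + 200] += (rng.normal(0, 6.0, 200)
                                     * np.exp(-np.linspace(0, 4, 200)))
        print("picked onset near sample:", kurtosis_onset(trace))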

  2. Sex differences in daily life stress and craving in opioid-dependent patients.

    PubMed

    Moran, Landhing M; Kowalczyk, William J; Phillips, Karran A; Vahabzadeh, Massoud; Lin, Jia-Ling; Mezghanni, Mustapha; Epstein, David H; Preston, Kenzie L

    2018-04-11

    Responses to stress and drug craving differ between men and women. Differences in the momentary experience of stress in relation to craving are less well-understood. Using ecological momentary assessment (EMA), we examined sex differences in real-time in two areas: (1) causes and contexts associated with stress, and (2) the extent to which stress and drug cues are associated with craving. Outpatients on opioid-agonist treatment (135 males, 47 females) reported stress, craving, and behavior on smartphones for 16 weeks. They initiated an entry each time they felt more stressed than usual (stress event) and made randomly prompted entries 3 times/day. In stress-event entries, they identified the causes and context (location, activity, companions), and rated stress and craving severity. The causes reported for stress events did not differ significantly by sex. Women reported arguing and being in a store more often during stress events, and men reported working more often during stress events, compared to base rates (assessed via random prompts). Women showed a greater increase in opioid craving as a function of stress (p < 0.0001) and had higher stress ratings in the presence of both stress and drug cues relative to men (p < 0.01). Similar effects were found for cocaine craving in men (p < 0.0001). EMA methods provide evidence based on real-time activities and moods that opioid-dependent men and women experience similar contexts and causes for stress but differ in stress- and cue-induced craving. These findings support sex-based tailoring of treatment, but because not all participants conformed to the overall pattern of sex differences, any such tailoring should also consider person-level differences.

  3. A systematic comparison of recurrent event models for application to composite endpoints.

    PubMed

    Ozga, Ann-Kathrin; Kieser, Meinhard; Rauch, Geraldine

    2018-01-04

    Many clinical trials focus on the comparison of the treatment effect between two or more groups concerning a rarely occurring event. In this situation, showing a relevant effect with an acceptable power requires the observation of a large number of patients over a long period of time. For feasibility reasons, it is therefore often considered to include several event types of interest, non-fatal or fatal, and to combine them within a composite endpoint. Commonly, a composite endpoint is analyzed with standard survival analysis techniques by assessing the time to the first occurring event. This approach neglects that an individual may experience more than one event, which leads to a loss of information. As an alternative, composite endpoints can be analyzed with models for recurrent events. A number of such models exist, e.g. regression models based on count data and Cox-based models such as the approaches of Andersen and Gill; Prentice, Williams and Peterson; or Wei, Lin and Weissfeld. Although some of the methods have already been compared in the literature, there is no systematic investigation of the special requirements regarding composite endpoints. In this work, a simulation-based comparison of recurrent event models applied to composite endpoints is provided for different realistic clinical trial scenarios. We demonstrate that the Andersen-Gill model and the Prentice-Williams-Peterson models show similar results under various data scenarios, whereas the Wei-Lin-Weissfeld model delivers effect estimators which can deviate considerably under commonly met data scenarios. Based on the conducted simulation study, this paper helps to understand the pros and cons of the investigated methods in the context of composite endpoints and therefore provides recommendations for an adequate statistical analysis strategy and a meaningful interpretation of results.
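
    By way of illustration, the Andersen-Gill model treats recurrent events as a counting process over (start, stop] at-risk intervals, which can be fit in Python with the lifelines package. The toy data below are hypothetical; a small ridge penalty is added only to stabilize the fit on so few rows.

        import pandas as pd
        from lifelines import CoxTimeVaryingFitter

        # Counting-process (start, stop] data: one row per at-risk
        # interval per subject, event=1 if an event ended the interval.
        df = pd.DataFrame({
            "id":    [1, 1, 1, 2, 2, 3],
            "start": [0, 5, 9, 0, 7, 0],
            "stop":  [5, 9, 14, 7, 12, 10],
            "event": [1, 1, 0, 1, 0, 0],
            "treat": [1, 1, 1, 0, 0, 1],
        })

        ctv = CoxTimeVaryingFitter(penalizer=0.1)
        ctv.fit(df, id_col="id", event_col="event",
                start_col="start", stop_col="stop")
        ctv.print_summary()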

  4. Projections of hepatitis A virus infection associated with flood events by 2020 and 2030 in Anhui Province, China.

    PubMed

    Gao, Lu; Zhang, Ying; Ding, Guoyong; Liu, Qiyong; Wang, Changke; Jiang, Baofa

    2016-12-01

    Assessing and responding to the health risks of climate change is important because of its impact on natural and societal ecosystems. More frequent and severe flood events will occur in China due to climate change. Given that the population is projected to increase, more people will be vulnerable to flood events, which may lead to an increased incidence of HAV infection in the future. This population-based study projects the future health burden of HAV infection associated with flood events in the Huai River Basin of China. The study area covered four cities of Anhui province in China, where flood events were frequent. A time-series adjusted Poisson regression model was developed to quantify the risks of flood events on HAV infection based on the number of daily cases during the summer seasons from 2005 to 2010, controlling for other meteorological variables. Projections of HAV infection in 2020 and 2030 were estimated based on scenarios of flood events and demographic data. The Poisson regression model suggested that, compared with periods without flood events, the risk of severe flood events for HAV infection was significant (OR = 1.28, 95% CI 1.05-1.55), while the risks were not significant for moderate flood events (OR = 1.16, 95% CI 0.72-1.87) or mild flood events (OR = 1.14, 95% CI 0.87-1.48). Using the 2010 baseline data and the flood event scenarios (one severe flood event), the increased incidence of HAV infection was estimated to be between 0.126/10^5 and 0.127/10^5 for 2020. Similarly, the increased HAV infection incidence for 2030 was projected to be between 0.382/10^5 and 0.399/10^5. Our study has, for the first time, quantified the increased incidence of HAV infection that will result from flood events in Anhui, China, in 2020 and 2030. The results have implications for public health preparation, for developing public health responses to reduce HAV infection during future flood events.
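
    The modeling step described here — a Poisson regression of daily case counts on flood severity indicators with meteorological controls — can be sketched with statsmodels as follows. The synthetic data, covariates, and coefficients are hypothetical stand-ins for the Anhui surveillance record.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 200  # hypothetical daily case counts over summer seasons
        sev = rng.choice([0, 1, 2], size=n, p=[0.7, 0.2, 0.1])
        df = pd.DataFrame({
            "moderate_flood": (sev == 1).astype(int),
            "severe_flood":   (sev == 2).astype(int),
            "temperature":    rng.normal(27, 3, n),  # meteorological control
        })
        lam = np.exp(0.2 + 0.25 * df["severe_flood"]
                     + 0.02 * (df["temperature"] - 27))
        df["cases"] = rng.poisson(lam)

        X = sm.add_constant(df[["moderate_flood", "severe_flood",
                                "temperature"]])
        fit = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()
        print(np.exp(fit.params))  # rate ratios, analogous to the reported ORs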

  5. Bounded influence function based inference in joint modelling of ordinal partial linear model and accelerated failure time model.

    PubMed

    Chakraborty, Arindom

    2016-12-01

    A common objective in longitudinal studies is to characterize the relationship between a longitudinal response process and time-to-event data. The ordinal nature of the response and possibly missing information on covariates add complications to the joint model. In such circumstances, influential observations often present in the data may upset the analysis. In this paper, a joint model based on an ordinal partial mixed model and an accelerated failure time model is used to account for the repeated ordered response and the time-to-event data, respectively. We propose an influence function-based robust estimation method. A Monte Carlo expectation-maximization algorithm is used for parameter estimation. A detailed simulation study has been done to evaluate the performance of the proposed method. As an application, data on muscular dystrophy among children are used. The robust estimates are then compared with classical maximum likelihood estimates. © The Author(s) 2014.

  6. Learning rational temporal eye movement strategies.

    PubMed

    Hoppe, David; Rothkopf, Constantin A

    2016-07-19

    During active behavior humans redirect their gaze several times every second within the visual environment. Where we look within static images is highly efficient, as quantified by computational models of human gaze shifts in visual search and face recognition tasks. However, when we shift gaze is mostly unknown despite its fundamental importance for survival in a dynamic world. It has been suggested that during naturalistic visuomotor behavior gaze deployment is coordinated with task-relevant events, often predictive of future events, and studies in sportsmen suggest that timing of eye movements is learned. Here we establish that humans efficiently learn to adjust the timing of eye movements in response to environmental regularities when monitoring locations in the visual scene to detect probabilistically occurring events. To detect the events humans adopt strategies that can be understood through a computational model that includes perceptual and acting uncertainties, a minimal processing time, and, crucially, the intrinsic costs of gaze behavior. Thus, subjects traded off event detection rate with behavioral costs of carrying out eye movements. Remarkably, based on this rational bounded actor model the time course of learning the gaze strategies is fully explained by an optimal Bayesian learner with humans' characteristic uncertainty in time estimation, the well-known scalar law of biological timing. Taken together, these findings establish that the human visual system is highly efficient in learning temporal regularities in the environment and that it can use these regularities to control the timing of eye movements to detect behaviorally relevant events.

  7. Application of satellite-based rainfall and medium range meteorological forecast in real-time flood forecasting in the Mahanadi River basin

    NASA Astrophysics Data System (ADS)

    Nanda, Trushnamayee; Beria, Harsh; Sahoo, Bhabagrahi; Chatterjee, Chandranath

    2016-04-01

    The increasing frequency of hydrologic extremes in a warming climate calls for the development of reliable flood forecasting systems. The unavailability of meteorological parameters in real time, especially in the developing parts of the world, makes it a challenging task to accurately predict floods, even at short lead times. The satellite-based Tropical Rainfall Measuring Mission (TRMM) provides an alternative to the real-time precipitation data scarcity. Moreover, rainfall forecasts by numerical weather prediction models, such as the medium-range forecasts issued by the European Centre for Medium-Range Weather Forecasts (ECMWF), are promising for multistep-ahead flow forecasts. We systematically evaluate these rainfall products over a large catchment in Eastern India (the Mahanadi River basin). We found spatially coherent trends, with both the real-time TRMM rainfall and the ECMWF rainfall forecast products overestimating low rainfall events and underestimating high rainfall events; no significant bias was found for medium rainfall events. Another key finding was that these rainfall products captured the phase of the storms quite well but suffered from consistent under-prediction. The utility of the real-time TRMM and ECMWF forecast products is evaluated by rainfall-runoff modeling using different artificial neural network (ANN)-based models up to 3 days ahead. Keywords: TRMM; ECMWF; forecast; ANN; rainfall-runoff modeling

  8. LLNL Location and Detection Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, S C; Harris, D B; Anderson, M L

    2003-07-16

    We present two LLNL research projects in the topical areas of location and detection. The first project assesses epicenter accuracy using a multiple-event location algorithm, and the second project employs waveform subspace correlation to detect and identify events at Fennoscandian mines. Accurately located seismic events are the basis of location calibration. A well-characterized set of calibration events enables new Earth model development, empirical calibration, and validation of models. In a recent study, Bondar et al. (2003) developed network coverage criteria for assessing the accuracy of event locations that are determined using single-event, linearized inversion methods. These criteria are conservative and are meant for application to large bulletins where the emphasis is on catalog completeness and any given event location may be improved through detailed analysis or application of advanced algorithms. Relative event location techniques are touted as advancements that may improve absolute location accuracy by (1) ensuring an internally consistent dataset, (2) constraining a subset of events to known locations, and (3) taking advantage of station and event correlation structure. Here we present the preliminary phase of this work, in which we use Nevada Test Site (NTS) nuclear explosions, with known locations, to test the effect of travel-time model accuracy on relative location accuracy. Like previous studies, we find that reference velocity-model accuracy and relative-location accuracy are highly correlated. We also find that metrics based on the travel-time residuals of relocated events are not reliable for assessing either velocity-model or relative-location accuracy. In the topical area of detection, we develop specialized correlation (subspace) detectors for the principal mines surrounding the ARCES station located in the European Arctic. Our objective is to provide efficient screens for explosions occurring in the mines of the Kola Peninsula (Kovdor, Zapolyarny, Olenogorsk, Khibiny) and the major iron mines of northern Sweden (Malmberget, Kiruna). In excess of 90% of the events detected by the ARCES station are mining explosions, and a significant fraction are from these northern mining groups. The primary challenge in developing waveform correlation detectors is the degree of variation in the source time histories of the shots, which can result in poor correlation among events even in close proximity. Our approach to solving this problem is to use lagged subspace correlation detectors, which offer some prospect of compensating for variation and uncertainty in source time functions.

  9. Poisson-event-based analysis of cell proliferation.

    PubMed

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. It is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single-cell resolution within a time-series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division, rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed which maintains single-cell resolution and reports the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines over a period of up to 48 h. Automated image processing of the bright-field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified temporal and spatial positions of the mitotic event series. Analysis of the statistics of the inter-event times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
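
    A nonhomogeneous Poisson process with an exponentially increasing rate, the model the mitotic event series conformed to, can be simulated by Lewis-Shedler thinning. The parameters in the Python sketch below are illustrative, not the fitted values from the paper.

        import numpy as np

        def simulate_nhpp(lam0, growth, t_max, rng):
            # Lewis-Shedler thinning for a nonhomogeneous Poisson process
            # with rate lambda(t) = lam0 * exp(growth * t), the form the
            # abstract reports for mitotic events.
            lam_max = lam0 * np.exp(growth * t_max)  # bound on the rate
            t, events = 0.0, []
            while True:
                t += rng.exponential(1.0 / lam_max)
                if t > t_max:
                    break
                if rng.random() < lam0 * np.exp(growth * t) / lam_max:
                    events.append(t)
            return np.array(events)

        rng = np.random.default_rng(4)
        mitoses = simulate_nhpp(lam0=0.5, growth=0.05, t_max=48.0, rng=rng)
        gaps = np.diff(mitoses)
        print(f"{len(mitoses)} events, "
              f"mean inter-event time {gaps.mean():.2f} h")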

  10. FPGA-Based X-Ray Detection and Measurement for an X-Ray Polarimeter

    NASA Technical Reports Server (NTRS)

    Gregory, Kyle; Hill, Joanne; Black, Kevin; Baumgartner, Wayne

    2013-01-01

    This technology enables detection and measurement of x-rays in an x-ray polarimeter using a field-programmable gate array (FPGA). The technology was developed for the Gravitational and Extreme Magnetism Small Explorer (GEMS) mission. It performs precision energy and timing measurements, as well as rejection of non-x-ray events. It enables the GEMS polarimeter to detect precisely when an event has taken place so that additional measurements can be made. The technology also enables this function to be performed in an FPGA using limited resources so that mass and power can be minimized while reliability for a space application is maximized and precise real-time operation is achieved. This design requires a low-noise, charge-sensitive preamplifier; a high-speed analog-to-digital converter (ADC); and an x-ray detector with a cathode terminal. It functions by computing a sum of differences for time samples whose difference exceeds a programmable threshold. A state machine advances through states as a programmable number of consecutive samples exceeds or fails to exceed this threshold. The pulse height is recorded as the accumulated sum. The track length is also measured based on the time from the start to the end of accumulation. For track lengths longer than a certain length, the algorithm estimates the barycenter of the charge deposit by comparing the accumulator value at the midpoint to the final accumulator value. The design also employs a number of techniques for rejecting background events. This innovation enables the function to be performed in space, where it can operate autonomously with a rapid response time. This implementation combines the advantages of computing-system-based approaches with those of pure analog approaches. The result is an implementation that is highly reliable, performs in real time, rejects background events, and consumes minimal power.

  11. Methods for automatic trigger threshold adjustment

    DOEpatents

    Welch, Benjamin J; Partridge, Michael E

    2014-03-18

    Methods are presented for adjusting trigger threshold values to compensate for drift in the quiescent level of a signal monitored for initiating a data recording event, thereby avoiding false triggering conditions. Initial threshold values are periodically adjusted by re-measuring the quiescent signal level, and adjusting the threshold values by an offset computation based upon the measured quiescent signal level drift. Re-computation of the trigger threshold values can be implemented on time-based or counter-based criteria. Additionally, a qualification width counter can be utilized to implement a requirement that a trigger threshold criterion be met a given number of times prior to initiating a data recording event, further reducing the possibility of a false triggering situation.
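
    A software analogue of the patented scheme (periodic re-measurement of the quiescent level, offset thresholds, and a qualification width counter) might look like the following Python sketch; all constants are illustrative. With periodic re-adjustment the drifting baseline never crosses the stale thresholds, so no false trigger fires.

        import numpy as np

        def adjust_thresholds(samples, offset):
            # Re-measure the quiescent level and set thresholds a fixed
            # offset either side of it, compensating for baseline drift.
            quiescent = np.median(samples)   # robust baseline estimate
            return quiescent - offset, quiescent + offset

        def triggered(sample, lo, hi, count, qual_width=3):
            # Qualification width counter: the criterion must be met on
            # qual_width consecutive samples before a trigger is declared.
            count = count + 1 if (sample < lo or sample > hi) else 0
            return count >= qual_width, count

        rng = np.random.default_rng(5)
        baseline, fired_at, count = 0.5, None, 0
        lo, hi = adjust_thresholds(rng.normal(baseline, 0.01, 100), offset=0.1)
        for i in range(500):
            baseline += 0.0005               # slow quiescent-level drift
            if i % 100 == 0:                 # periodic re-adjustment
                lo, hi = adjust_thresholds(rng.normal(baseline, 0.01, 100), 0.1)
            fire, count = triggered(rng.normal(baseline, 0.01), lo, hi, count)
            if fire:
                fired_at = i
                break
        print("false trigger at sample:", fired_at)  # None: drift compensated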

  12. Assessing the performance of the generalized propensity score for estimating the effect of quantitative or continuous exposures on survival or time-to-event outcomes.

    PubMed

    Austin, Peter C

    2018-01-01

    Propensity score methods are frequently used to estimate the effects of interventions using observational data. The propensity score was originally developed for use with binary exposures. The generalized propensity score (GPS) is an extension of the propensity score for use with quantitative or continuous exposures (e.g. pack-years of cigarettes smoked, dose of medication, or years of education). We describe how the GPS can be used to estimate the effect of continuous exposures on survival or time-to-event outcomes. To do so we modified the concept of the dose-response function for use with time-to-event outcomes. We used Monte Carlo simulations to examine the performance of different methods of using the GPS to estimate the effect of quantitative exposures on survival or time-to-event outcomes. We examined covariate adjustment using the GPS and weighting using weights based on the inverse of the GPS. The use of methods based on the GPS was compared with the use of conventional G-computation and weighted G-computation. Conventional G-computation resulted in estimates of the dose-response function that displayed the lowest bias and the lowest variability. Amongst the two GPS-based methods, covariate adjustment using the GPS tended to have the better performance. We illustrate the application of these methods by estimating the effect of average neighbourhood income on the probability of survival following hospitalization for an acute myocardial infarction.
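
    For a continuous exposure, the GPS is the conditional density of the observed exposure given covariates. The Python sketch below estimates it from a normal linear model and then uses it for covariate adjustment in a Cox model via lifelines; the data-generating process and the normal-linear GPS model are assumptions of this illustration, not the paper's simulation design.

        import numpy as np
        import pandas as pd
        from scipy.stats import norm
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(8)
        n = 500
        x = rng.normal(0, 1, n)                    # confounder
        dose = 2.0 + 1.5 * x + rng.normal(0, 1, n) # continuous exposure
        t = rng.exponential(1.0 / np.exp(0.3 * dose + 0.5 * x))
        df = pd.DataFrame({"dose": dose, "time": t, "event": 1})

        # GPS: density of the observed dose given covariates, from a
        # normal linear model dose ~ x (a modeling assumption here).
        beta = np.polyfit(x, dose, 1)
        resid = dose - np.polyval(beta, x)
        df["gps"] = norm.pdf(dose, loc=np.polyval(beta, x),
                             scale=resid.std())

        # Covariate adjustment using the GPS in a Cox model.
        cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
        print(cph.params_)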

  13. Real-time notification and improved situational awareness in fire emergencies using geospatial-based publish/subscribe

    NASA Astrophysics Data System (ADS)

    Kassab, Ala'; Liang, Steve; Gao, Yang

    2010-12-01

    Emergency agencies seek to maintain situational awareness and effective decision making through continuous monitoring of, and real-time alerting about, sources of information regarding current incidents and developing fire hazards. The nature of this goal requires integrating different, potentially numerous, sources of dynamic geospatial information on the one side, and a large number of clients having heterogeneous and specific interests in data on the other side. In such scenarios, the traditional request/reply communication style may function inefficiently, as it is based on point-to-point, synchronous, and pulling mode interaction between consumer clients and information providers/services. In this work, we propose Geospatial-based Publish/Subscribe, an interaction framework that serves as a middleware for real-time transacting of spatially related information of interest, termed geospatial events, in distributed systems. Expressive data models, including geospatial event and geospatial subscription, as well as an efficient matching approach for fast dissemination of geospatial events to interested clients, are introduced. The proposed interaction framework is realized through the development of a Real-Time Fire Emergency Response System (RFERS) prototype. The prototype is designed for transacting several topics of geospatial events that are crucial within the context of fire emergencies, including GPS locations of emergency assets, meteorological observations of wireless sensors, fire incident reports, and temporal sequences of remote sensing images of active wildfires. The performance of the system prototype has been evaluated in order to demonstrate its efficiency.

  14. A new precipitation-based method of baseflow separation and event identification for small watersheds (<50 km2)

    NASA Astrophysics Data System (ADS)

    Koskelo, Antti I.; Fisher, Thomas R.; Utz, Ryan M.; Jordan, Thomas E.

    2012-07-01

    Baseflow separation methods are often impractical, require expensive materials and time-consuming methods, and/or are not designed for individual events in small watersheds. To provide a simple baseflow separation method for small watersheds, we describe a new precipitation-based technique known as the Sliding Average with Rain Record (SARR). The SARR uses rainfall data to justify each separation of the hydrograph. SARR has several advantages: it shows better consistency with the precipitation and discharge records, it is easier and more practical to implement, and it includes a method of event identification based on precipitation and quickflow response. SARR was derived from the United Kingdom Institute of Hydrology (UKIH) method with several key modifications to adapt it for small watersheds (<50 km²). We tested SARR on watersheds in the Choptank Basin on the Delmarva Peninsula (US Mid-Atlantic region) and compared the results with the UKIH method at the annual scale and the hydrochemical method at the individual event scale. Annually, SARR calculated a baseflow index that was ~10% higher than the UKIH method due to the finer time step of SARR (1 d) compared to UKIH (5 d). At the watershed scale, hydric soils were an important driver of the annual baseflow index, likely due to increased groundwater retention in hydric areas. At the event scale, SARR calculated less baseflow than the hydrochemical method, again because of the differences in time step (hourly for hydrochemical) and different definitions of baseflow. Both SARR and hydrochemical baseflow increased with event size, suggesting that baseflow contributions are more important during larger storms. To make SARR easy to implement, we have written a MATLAB program to automate the calculations which requires only daily rainfall and daily flow data as inputs.
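
    SARR's parent method, the UKIH smoothed-minima technique, conveys the flavor of this family of separations: take block minima, keep those that qualify as turning points, and interpolate between them. The Python sketch below implements that UKIH-style baseline with illustrative parameters; SARR's 1-day step and rain-record checks are not reproduced.

        import numpy as np

        def ukih_baseflow(q, block=5, k=0.9):
            # UKIH-style smoothed minima: take the minimum of each block,
            # keep minima that qualify as turning points (k * central
            # minimum not exceeding both neighbors), then interpolate
            # linearly between turning points.
            n = len(q)
            mins = [(i * block + int(np.argmin(q[i * block:(i + 1) * block])),
                     float(np.min(q[i * block:(i + 1) * block])))
                    for i in range(n // block)]
            turns = [mins[i] for i in range(1, len(mins) - 1)
                     if k * mins[i][1] <= min(mins[i - 1][1], mins[i + 1][1])]
            if len(turns) < 2:
                return np.full(n, min(v for _, v in mins))
            idx, val = zip(*turns)
            bf = np.interp(np.arange(n), idx, val)
            return np.minimum(bf, q)   # baseflow cannot exceed total flow

        rng = np.random.default_rng(9)
        days = np.arange(120)
        q = 1.0 + 0.2 * np.sin(days / 20) + rng.gamma(0.3, 2.0, 120)
        bf = ukih_baseflow(q)
        print("baseflow index:", round(bf.sum() / q.sum(), 3))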

  15. Ensemble survival tree models to reveal pairwise interactions of variables with time-to-events outcomes in low-dimensional setting

    PubMed Central

    Dazard, Jean-Eudes; Ishwaran, Hemant; Mehlotra, Rajeev; Weinberg, Aaron; Zimmerman, Peter

    2018-01-01

    Unraveling interactions among variables such as genetic, clinical, demographic and environmental factors is essential to understand the development of common and complex diseases. To increase the power to detect such variable interactions associated with clinical time-to-event outcomes, we borrowed established concepts from random survival forest (RSF) models. We introduce a novel RSF-based pairwise interaction estimator and derive a randomization method with bootstrap confidence intervals for inferring interaction significance. Using various linear and nonlinear time-to-event survival models in simulation studies, we first show the efficiency of our approach: true pairwise interaction-effects between variables are uncovered, while they may not be accompanied by their corresponding main-effects and may not be detected by the standard semi-parametric regression modeling and test statistics used in survival analysis. Moreover, using an RSF-based cross-validation scheme for generating prediction estimators, we show that informative predictors may be inferred. We applied our approach to an HIV cohort study recording key host gene polymorphisms and their association with HIV change of tropism or AIDS progression. Altogether, this shows how linear or nonlinear pairwise statistical interactions of variables may be efficiently detected, with predictive value, in observational studies with time-to-event outcomes. PMID:29453930

  16. Ensemble survival tree models to reveal pairwise interactions of variables with time-to-events outcomes in low-dimensional setting.

    PubMed

    Dazard, Jean-Eudes; Ishwaran, Hemant; Mehlotra, Rajeev; Weinberg, Aaron; Zimmerman, Peter

    2018-02-17

    Unraveling interactions among variables such as genetic, clinical, demographic and environmental factors is essential to understand the development of common and complex diseases. To increase the power to detect such variable interactions associated with clinical time-to-event outcomes, we borrowed established concepts from random survival forest (RSF) models. We introduce a novel RSF-based pairwise interaction estimator and derive a randomization method with bootstrap confidence intervals for inferring interaction significance. Using various linear and nonlinear time-to-event survival models in simulation studies, we first show the efficiency of our approach: true pairwise interaction-effects between variables are uncovered, while they may not be accompanied by their corresponding main-effects and may not be detected by the standard semi-parametric regression modeling and test statistics used in survival analysis. Moreover, using an RSF-based cross-validation scheme for generating prediction estimators, we show that informative predictors may be inferred. We applied our approach to an HIV cohort study recording key host gene polymorphisms and their association with HIV change of tropism or AIDS progression. Altogether, this shows how linear or nonlinear pairwise statistical interactions of variables may be efficiently detected, with predictive value, in observational studies with time-to-event outcomes.

  17. Data-driven event-by-event respiratory motion correction using TOF PET list-mode centroid of distribution

    NASA Astrophysics Data System (ADS)

    Ren, Silin; Jin, Xiao; Chan, Chung; Jian, Yiqiang; Mulnix, Tim; Liu, Chi; Carson, Richard E

    2017-06-01

    Data-driven respiratory gating techniques were developed to correct for respiratory motion in PET studies, without the help of external motion tracking systems. Due to the greatly increased image noise in gated reconstructions, it is desirable to develop a data-driven event-by-event respiratory motion correction method. In this study, using the Centroid-of-distribution (COD) algorithm, we established a data-driven event-by-event respiratory motion correction technique using TOF PET list-mode data, and investigated its performance by comparing with an external system-based correction method. Ten human scans with the pancreatic β-cell tracer 18F-FP-(+)-DTBZ were employed. Data-driven respiratory motions in superior-inferior (SI) and anterior-posterior (AP) directions were first determined by computing the centroid of all radioactive events during each short time frame with further processing. The Anzai belt system was employed to record respiratory motion in all studies. COD traces in both SI and AP directions were first compared with Anzai traces by computing the Pearson correlation coefficients. Then, respiratory gated reconstructions based on either COD or Anzai traces were performed to evaluate their relative performance in capturing respiratory motion. Finally, based on correlations of displacements of organ locations in all directions and COD information, continuous 3D internal organ motion in SI and AP directions was calculated based on COD traces to guide event-by-event respiratory motion correction in the MOLAR reconstruction framework. Continuous respiratory correction results based on COD were compared with that based on Anzai, and without motion correction. Data-driven COD traces showed a good correlation with Anzai in both SI and AP directions for the majority of studies, with correlation coefficients ranging from 63% to 89%. Based on the determined respiratory displacements of pancreas between end-expiration and end-inspiration from gated reconstructions, there was no significant difference between COD-based and Anzai-based methods. Finally, data-driven COD-based event-by-event respiratory motion correction yielded comparable results to that based on Anzai respiratory traces, in terms of contrast recovery and reduced motion-induced blur. Data-driven event-by-event respiratory motion correction using COD showed significant image quality improvement compared with reconstructions with no motion correction, and gave comparable results to the Anzai-based method.

  18. Data-driven event-by-event respiratory motion correction using TOF PET list-mode centroid of distribution.

    PubMed

    Ren, Silin; Jin, Xiao; Chan, Chung; Jian, Yiqiang; Mulnix, Tim; Liu, Chi; Carson, Richard E

    2017-06-21

    Data-driven respiratory gating techniques were developed to correct for respiratory motion in PET studies, without the help of external motion tracking systems. Due to the greatly increased image noise in gated reconstructions, it is desirable to develop a data-driven event-by-event respiratory motion correction method. In this study, using the Centroid-of-distribution (COD) algorithm, we established a data-driven event-by-event respiratory motion correction technique using TOF PET list-mode data, and investigated its performance by comparing with an external system-based correction method. Ten human scans with the pancreatic β-cell tracer 18F-FP-(+)-DTBZ were employed. Data-driven respiratory motions in superior-inferior (SI) and anterior-posterior (AP) directions were first determined by computing the centroid of all radioactive events during each short time frame with further processing. The Anzai belt system was employed to record respiratory motion in all studies. COD traces in both SI and AP directions were first compared with Anzai traces by computing the Pearson correlation coefficients. Then, respiratory gated reconstructions based on either COD or Anzai traces were performed to evaluate their relative performance in capturing respiratory motion. Finally, based on correlations of displacements of organ locations in all directions and COD information, continuous 3D internal organ motion in SI and AP directions was calculated based on COD traces to guide event-by-event respiratory motion correction in the MOLAR reconstruction framework. Continuous respiratory correction results based on COD were compared with that based on Anzai, and without motion correction. Data-driven COD traces showed a good correlation with Anzai in both SI and AP directions for the majority of studies, with correlation coefficients ranging from 63% to 89%. Based on the determined respiratory displacements of pancreas between end-expiration and end-inspiration from gated reconstructions, there was no significant difference between COD-based and Anzai-based methods. Finally, data-driven COD-based event-by-event respiratory motion correction yielded comparable results to that based on Anzai respiratory traces, in terms of contrast recovery and reduced motion-induced blur. Data-driven event-by-event respiratory motion correction using COD showed significant image quality improvement compared with reconstructions with no motion correction, and gave comparable results to the Anzai-based method.
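
    The centroid-of-distribution trace itself is straightforward to compute from list-mode events: average the event coordinate along one axis within each short time frame, then compare against the belt trace. The Python sketch below does this on synthetic events; the frame length, noise levels, and the 4 s breathing cycle are illustrative, and the paper's further processing of the COD trace is omitted.

        import numpy as np
        from scipy.stats import pearsonr

        def cod_trace(event_times, event_z, frame_len):
            # Centroid of distribution along one axis (e.g. SI): the mean
            # event coordinate in each short time frame. Frame length and
            # any further smoothing are illustrative choices.
            edges = np.arange(0.0, event_times.max() + frame_len, frame_len)
            idx = np.digitize(event_times, edges) - 1
            return np.array([event_z[idx == k].mean()
                             for k in range(len(edges) - 1)])

        # Synthetic list-mode events whose axial coordinate oscillates
        # with an assumed 4 s respiratory cycle, plus a surrogate belt.
        rng = np.random.default_rng(6)
        t = np.sort(rng.uniform(0, 60, 200_000))      # event times (s)
        z = 5.0 * np.sin(2 * np.pi * t / 4.0) + rng.normal(0, 20.0, t.size)
        cod = cod_trace(t, z, frame_len=0.5)
        centers = np.arange(len(cod)) * 0.5 + 0.25    # frame midpoints (s)
        belt = 5.0 * np.sin(2 * np.pi * centers / 4.0)
        print(f"COD vs. belt Pearson r = {pearsonr(cod, belt)[0]:.2f}")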

  19. Analysis of the geophysical data using a posteriori algorithms

    NASA Astrophysics Data System (ADS)

    Voskoboynikova, Gyulnara; Khairetdinov, Marat

    2016-04-01

    Monitoring, prediction, and prevention of extraordinary natural and technogenic events are among the priority problems of our time. Such events include earthquakes, volcanic eruptions, lunar-solar tides, landslides, falling celestial bodies, explosions of ammunition stockpiles being disposed of, and the numerous quarry blasts in open coal mines that provoke technogenic earthquakes. Monitoring proceeds through a number of successive stages, which include remote registration of event responses and measurement of the main parameters, such as the arrival times of seismic waves or the original waveforms. At the final stage, the inverse problems of determining the geographic location and origin time of the registered event are solved. Improving the accuracy of parameter estimation from the original records under high noise is therefore an important problem. The main measurement errors arise from the influence of external noise, differences between the real and model structures of the medium, imprecision in defining the origin time at the event epicenter, and instrumental errors. We therefore propose and investigate a posteriori algorithms that are more accurate than the known ones. They are based on a combination of a discrete optimization method and a fractal approach for joint detection and estimation of arrival times in quasi-periodic waveform sequences in geophysical monitoring problems, with improved accuracy. Existing alternative approaches to these problems do not provide the required accuracy. The proposed algorithms are applied to the tasks of vibration sounding of the Earth during lunar and solar tides and to monitoring the location of a borehole seismic source in production drilling.

  20. A rank test for bivariate time-to-event outcomes when one event is a surrogate

    PubMed Central

    Shaw, Pamela A.; Fay, Michael P.

    2016-01-01

    In many clinical settings, improving patient survival is of interest but a practical surrogate, such as time to disease progression, is instead used as a clinical trial’s primary endpoint. A time-to-first endpoint (e.g. death or disease progression) is commonly analyzed but may not be adequate to summarize patient outcomes if a subsequent event contains important additional information. We consider a surrogate outcome very generally, as one correlated with the true endpoint of interest. Settings of interest include those where the surrogate indicates a beneficial outcome so that the usual time-to-first endpoint of death or surrogate event is nonsensical. We present a new two-sample test for bivariate, interval-censored time-to-event data, where one endpoint is a surrogate for the second, less frequently observed endpoint of true interest. This test examines whether patient groups have equal clinical severity. If the true endpoint rarely occurs, the proposed test acts like a weighted logrank test on the surrogate; if it occurs for most individuals, then our test acts like a weighted logrank test on the true endpoint. If the surrogate is a useful statistical surrogate, our test can have better power than tests based on the surrogate that naively handle the true endpoint. In settings where the surrogate is not valid (treatment affects the surrogate but not the true endpoint), our test incorporates the information regarding the lack of treatment effect from the observed true endpoints and hence is expected to have a dampened treatment effect compared to tests based on the surrogate alone. PMID:27059817
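
    In its limiting cases the proposed test behaves like a weighted logrank test on the surrogate or on the true endpoint, so a plain two-sample weighted logrank statistic is the natural reference point. The sketch below implements only that standard right-censored building block; it is not the authors' interval-censored bivariate test, and the weights are left generic.

      # Standard two-sample weighted logrank statistic for right-censored
      # data (the limiting-case building block, not the paper's test).
      import numpy as np

      def weighted_logrank(time, event, group, weights=None):
          """time: event/censoring times; event: 1 if event observed;
          group: 0/1 labels; weights: one weight per distinct event time."""
          time, event, group = map(np.asarray, (time, event, group))
          t_uniq = np.unique(time[event == 1])
          w = np.ones(len(t_uniq)) if weights is None else np.asarray(weights)
          U, V = 0.0, 0.0
          for w_j, t in zip(w, t_uniq):
              at_risk = time >= t
              n = at_risk.sum()
              n1 = (at_risk & (group == 1)).sum()
              d = ((time == t) & (event == 1)).sum()
              d1 = ((time == t) & (event == 1) & (group == 1)).sum()
              U += w_j * (d1 - d * n1 / n)            # observed - expected
              if n > 1:                               # hypergeometric variance
                  V += w_j**2 * d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
          return U / np.sqrt(V)   # approximately N(0,1) under the null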

  1. A new Bayesian Inference-based Phase Associator for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Meier, Men-Andrin; Heaton, Thomas; Clinton, John; Wiemer, Stefan

    2013-04-01

    State-of-the-art network-based Earthquake Early Warning (EEW) systems can provide warnings for large magnitude 7+ earthquakes. Although regions in the direct vicinity of the epicenter will not receive warnings prior to damaging shaking, real-time event characterization is available before the destructive S-wave arrival across much of the strongly affected region. In contrast, for the more frequent medium-size events, such as the devastating 1994 Mw6.7 Northridge, California, earthquake, providing timely warning to the smaller damage zone is more difficult. For such events the "blind zone" of current systems (e.g., the CISN ShakeAlert system in California) is similar in size to the area over which severe damage occurs. We propose a faster and more robust Bayesian inference-based event associator that, in contrast to current standard associators (e.g., Earthworm Binder), is tailored to EEW and exploits information other than phase arrival times alone. In particular, the associator potentially allows reliable automated event association with as little as two observations, which, compared to the ShakeAlert system, would speed up the real-time characterizations by about ten seconds and thus reduce the blind-zone area by up to 80%. We compile an extensive data set of regional and teleseismic earthquake and noise waveforms spanning a wide range of earthquake magnitudes and tectonic regimes. We pass these waveforms through a causal real-time filterbank with passband filters between 0.1 and 50 Hz and, updating every second from the event detection, extract the maximum amplitudes in each frequency band. Using this dataset, we define distributions of amplitude maxima in each passband as a function of epicentral distance and magnitude. For the real-time data, we pass incoming broadband and strong-motion waveforms through the same filterbank and extract an evolving set of maximum amplitudes in each passband. We use the maximum-amplitude distributions to check whether the incoming waveforms are consistent with amplitude and frequency patterns of local earthquakes by means of a maximum-likelihood approach. If such a single-station event likelihood is larger than a predefined threshold value, we check whether there are neighboring stations that also have single-station event likelihoods above the threshold. If this is the case for at least one other station, we evaluate whether the respective relative arrival times are in agreement with a common earthquake origin (assuming a simple velocity model and using an Equal Differential Time location scheme). Additionally, we check whether there are stations where, given the preliminary location, observations would be expected but were not reported ("not-yet-arrived data"). Together, the single-station event likelihood functions and the location likelihood function constitute the multi-station event likelihood function. This function can then be combined with various types of prior information (such as station noise levels, preceding seismicity, fault proximity, etc.) to obtain a Bayesian posterior distribution representing the degree of belief that the ensemble of the current real-time observations corresponds to a local earthquake rather than to some other signal source irrelevant for EEW. In addition to reducing the blind-zone size, this approach facilitates the eventual development of an end-to-end probabilistic framework for an EEW system that provides systematic real-time assessment of the risk of false alerts, which enables end users of EEW to implement damage-mitigation strategies only above a specified certainty level.
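
    The feature-extraction step lends itself to a compact illustration: filter the incoming waveform through a bank of causal passband filters and keep a running maximum per band, updated every second. The following sketch assumes arbitrary log-spaced band edges and filter order; it reproduces the mechanism, not the authors' exact filterbank.

      # Causal filterbank with per-band running amplitude maxima,
      # updated once per second. Band edges and order are illustrative.
      import numpy as np
      from scipy.signal import butter, lfilter

      def filterbank_maxima(x, fs, n_bands=9, f_lo=0.1, f_hi=50.0):
          edges = np.geomspace(f_lo, f_hi, n_bands + 1)  # log-spaced edges
          step = int(fs)                                  # one update per second
          n_steps = len(x) // step
          maxima = np.zeros((n_bands, n_steps))
          for b in range(n_bands):
              lo, hi = edges[b], edges[b + 1]
              bb, aa = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
              y = np.abs(lfilter(bb, aa, x))              # causal filtering
              for k in range(n_steps):                    # running maximum so far
                  maxima[b, k] = y[: (k + 1) * step].max()
          return maxima

      # Each column of `maxima` would then be scored against the empirical
      # magnitude- and distance-dependent amplitude distributions.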

  2. Bio-inspired UAV routing, source localization, and acoustic signature classification for persistent surveillance

    NASA Astrophysics Data System (ADS)

    Burman, Jerry; Hespanha, Joao; Madhow, Upamanyu; Pham, Tien

    2011-06-01

    A team consisting of Teledyne Scientific Company, the University of California at Santa Barbara, and the Army Research Laboratory is developing technologies in support of automated data exfiltration from heterogeneous battlefield sensor networks to enhance situational awareness for dismounts and command echelons. Unmanned aerial vehicles (UAVs) provide an effective means to autonomously collect data from a sparse network of unattended ground sensors (UGSs) that cannot communicate with each other. UAVs are used to reduce the system reaction time by generating autonomous collection routes that are data-driven. Bio-inspired search techniques provide a novel strategy to detect, capture, and fuse data. A fast and accurate method has been developed to localize an event by fusing data from a sparse number of UGSs. This technique uses a bio-inspired algorithm based on chemotaxis, the motion of bacteria seeking nutrients in their environment. A unique acoustic event classification algorithm was also developed based on swarm optimization. Additional studies addressed the problems of routing multiple UAVs, optimally placing sensors in the field, and locating the source of gunfire at helicopters. A field test was conducted in November 2009 at Camp Roberts, CA. The field-test results showed that a system controlled by bio-inspired software algorithms can autonomously detect and locate the source of an acoustic event with very high accuracy and visually verify the event. In nine independent test runs of a UAV, the system autonomously located the position of an explosion nine times with an average accuracy of 3 meters. The time required to perform source localization using the UAV was on the order of a few minutes, based on UAV flight times. In June 2011, additional field tests of the system will be performed and will include multiple acoustic events, optimal sensor placement based on acoustic phenomenology, and the use of the International Technology Alliance (ITA) Sensor Network Fabric (IBM).
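
    The chemotaxis idea can be conveyed by a toy run-and-tumble search: keep the current heading while the sensed intensity increases, and tumble to a random heading when it decreases. In the sketch below a Gaussian intensity field stands in for the fused UGS acoustic energy, and every parameter is illustrative rather than taken from the paper.

      # Toy run-and-tumble ("chemotaxis") search toward an intensity peak.
      import numpy as np

      rng = np.random.default_rng(0)
      source = np.array([60.0, -40.0])
      intensity = lambda p: np.exp(-np.sum((p - source) ** 2) / 2e3)

      pos = np.zeros(2)
      heading = rng.uniform(0, 2 * np.pi)
      last = intensity(pos)
      for _ in range(600):
          pos += 2.0 * np.array([np.cos(heading), np.sin(heading)])  # run
          now = intensity(pos)
          if now < last:                     # signal got weaker: tumble
              heading = rng.uniform(0, 2 * np.pi)
          last = now
      print("final position:", pos.round(1), "true source:", source)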

  3. Biases in the subjective timing of perceptual events: Libet et al. (1983) revisited.

    PubMed

    Danquah, Adam N; Farrell, Martin J; O'Boyle, Donald J

    2008-09-01

    We report two experiments in which participants had to judge the time of occurrence of a stimulus relative to a clock. The experiments were based on the control condition used by Libet, Gleason, Wright, and Pearl [Libet, B., Gleason, C. A., Wright, E. W., & Pearl, D. K. (1983). Time of conscious intention to act in relation to onset of cerebral activities (readiness-potential): The unconscious initiation of a freely voluntary act. Brain 106, 623-642] to correct for any bias in the estimation of the time at which an endogenous event, the conscious intention to perform a movement, occurred. Participants' responses were affected systematically by the sensory modality of the stimulus and by the speed of the clock. Such findings demonstrate the variability in judging the time at which an exogenous event occurs and, by extension, suggest that such variability may also apply to judging the time of occurrence of endogenous events. The reliability of participants' estimations of when they formed the conscious intention to perform a movement in Libet et al.'s (1983) study is therefore questionable.

  4. A novel high performance ESD power clamp circuit with a small area

    NASA Astrophysics Data System (ADS)

    Zhaonian, Yang; Hongxia, Liu; Li, Li; Qingqing, Zhuo

    2012-09-01

    A MOSFET-based electrostatic discharge (ESD) power clamp circuit with only a 10 ns RC time constant, for a 0.18-μm process, is proposed. A diode-connected NMOSFET is used to maintain a long delay time and save area, and this special structure overcomes other shortcomings of conventional clamp circuits. Under fast power-up events, the gate voltage of the clamp MOSFET does not rise as quickly as under ESD events, so the special structure can keep the clamp MOSFET fully off. Under a falsely triggered event, the special structure can turn off the clamp MOSFET in a short time. The clamp circuit can also reject power-supply noise effectively. Simulation results show that the clamp circuit avoids fast false-triggering events such as a 30 ns/1.8 V power-up, maintains a 1.2 μs delay time and a 2.14 μs turn-off time, and requires an RC time constant reduced to about 70% of the conventional value. It is believed that the proposed clamp circuit can be widely used in high-speed integrated circuits.

  5. A log-Weibull spatial scan statistic for time to event data.

    PubMed

    Usman, Iram; Rosychuk, Rhonda J

    2018-06-13

    Spatial scan statistics have been used to identify geographic clusters of elevated numbers of cases of a condition, such as disease outbreaks. Accompanied by an appropriate distribution, these statistics can also identify geographic areas with either longer or shorter times to events. Other authors have proposed spatial scan statistics based on the exponential and Weibull distributions. We propose the log-Weibull as an alternative distribution for the spatial scan statistic for time-to-event data and compare and contrast the log-Weibull and Weibull distributions through simulation studies. The effects of type I differential censoring and power were investigated through simulated data. The methods are also illustrated on time-to-specialist-visit data for discharged patients presenting to emergency departments for atrial fibrillation and flutter in Alberta during 2010-2011. We found that northern regions of Alberta had longer times to specialist visit than other areas. We propose the spatial scan statistic for the log-Weibull distribution as a new approach for detecting spatial clusters in time-to-event data. The simulation studies suggest that the test performs well for log-Weibull data.
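
    A scan statistic of this kind reduces to a search over candidate zones, scoring each by a log-likelihood ratio for the chosen distribution inside versus outside the zone. The schematic below fits a Gumbel (log-Weibull) density to the observed times and ignores censoring for brevity; significance would be assessed by Monte Carlo permutation, which is omitted. All settings are illustrative, not the authors'.

      # Schematic spatial scan with a Gumbel (log-Weibull) likelihood ratio.
      import numpy as np
      from scipy.stats import gumbel_r

      def llr(t_in, t_out):
          def ll(t):
              loc, scale = gumbel_r.fit(t)
              return gumbel_r.logpdf(t, loc, scale).sum()
          return ll(t_in) + ll(t_out) - ll(np.concatenate([t_in, t_out]))

      def scan(coords, times, radii):
          """coords: (n, 2) array of locations; times: length-n array."""
          best = (None, -np.inf)
          for i, c in enumerate(coords):     # each point: candidate center
              d = np.linalg.norm(coords - c, axis=1)
              for r in radii:
                  inside = d <= r
                  if 1 < inside.sum() < len(coords) - 1:
                      stat = llr(times[inside], times[~inside])
                      if stat > best[1]:
                          best = ((i, r), stat)
          return best   # most likely cluster and its scan statistic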

  6. Comparative study of predicted and experimentally detected interplanetary shocks

    NASA Astrophysics Data System (ADS)

    Kartalev, M. D.; Grigorov, K. G.; Smith, Z.; Dryer, M.; Fry, C. D.; Sun, Wei; Deehr, C. S.

    2002-03-01

    We compare the real-time space weather predictions of shock arrival times at 1 AU made by the USAF/NOAA Shock Time of Arrival (STOA) and Interplanetary Shock Propagation Model (ISPM) models, and by the Exploration Physics International/University of Alaska Hakamada-Akasofu-Fry Solar Wind Model (HAF-v2), with a real-time analysis of ACE plasma and field data. The comparison is made using an algorithm developed on the basis of wavelet data analysis and an MHD identification procedure. The shock parameters are estimated for selected "candidate events". An automated Web-based interface periodically ingests solar wind observations made by ACE at L1. Near-real-time results, as well as an archive of the registered events of interest, are available on a specially developed web site. A number of events are considered. These studies are essential for the validation of real-time space weather forecasts made from solar data.

  7. Disease progress and response to treatment as predictors of survival, disability, cognitive impairment and depression in Parkinson's disease

    PubMed Central

    Vu, Thuy C.; Nutt, John G.; Holford, Nicholas H. G.

    2012-01-01

    AIM To describe the time to clinical events (death, disability, cognitive impairment and depression) in Parkinson's disease using the time course of disease status and treatment as explanatory variables. METHODS Disease status based on the Unified Parkinson's Disease Rating Scale (UPDRS) and the time to clinical outcome events were obtained from 800 patients who initially had early Parkinson's disease. Parametric hazard models were used to describe the time to the events of interest. RESULTS The time course of disease status (severity) was an important predictor of clinical outcome events. There was an increased hazard ratio for death of 1.4 (95% CI 1.31, 1.49), disability 2.75 (95% CI 2.30, 3.28), cognitive impairment 4.35 (95% CI 1.94, 9.74), and depressive state 1.43 (95% CI 1.26, 1.63) with each 10-unit increase of UPDRS. Age at study entry increased the hazard, with hazard ratios of 49.1 (95% CI 8.7, 278) for death, 4.76 (95% CI 1.10, 20.6) for disability, and 90.0 (95% CI 63.3, 128) for cognitive impairment at age 60 years. Selegiline treatment had independent effects as a predictor of death at 8-year follow-up with a hazard ratio of 2.54 (95% CI 1.51, 4.25) but had beneficial effects on disability with a hazard ratio of 0.363 (95% CI 0.132, 0.533) and depression with a hazard ratio of 0.372 (95% CI 0.12, 0.552). CONCLUSIONS Our findings show that the time course of disease status based on UPDRS is a much better predictor of future clinical events than any baseline disease characteristic. Continued selegiline treatment appears to increase the hazard of death. PMID:22300470

  8. Support Vector Hazards Machine: A Counting Process Framework for Learning Risk Scores for Censored Outcomes.

    PubMed

    Wang, Yuanjia; Chen, Tianle; Zeng, Donglin

    2016-01-01

    Learning risk scores to predict dichotomous or continuous outcomes using machine learning approaches has been studied extensively. However, how to learn risk scores for time-to-event outcomes subject to right censoring has received little attention until recently. Existing approaches rely on inverse probability weighting or rank-based regression, which may be inefficient. In this paper, we develop a new support vector hazards machine (SVHM) approach to predict censored outcomes. Our method is based on predicting the counting process associated with the time-to-event outcomes among subjects at risk via a series of support vector machines. Introducing counting processes to represent time-to-event data leads to a connection between support vector machines in supervised learning and hazards regression in standard survival analysis. To account for different at-risk populations at observed event times, a time-varying offset is used in estimating risk scores. The resulting optimization is a convex quadratic programming problem that can easily incorporate non-linearity using the kernel trick. We demonstrate an interesting link from the profiled empirical risk function of SVHM to the Cox partial likelihood. We then formally show that SVHM is optimal in discriminating the covariate-specific hazard function from the population average hazard function, and establish the consistency and learning rate of the predicted risk using the estimated risk scores. Simulation studies show improved prediction accuracy of the event times using SVHM compared to existing machine learning methods and conventional approaches. Finally, we analyze data from two real-world biomedical studies in which clinical markers and neuroimaging biomarkers are used to predict age at onset of a disease, and demonstrate the superiority of SVHM in distinguishing high-risk from low-risk subjects.

  9. Developing assessment system for wireless capsule endoscopy videos based on event detection

    NASA Astrophysics Data System (ADS)

    Chen, Ying-ju; Yasen, Wisam; Lee, Jeongkyu; Lee, Dongha; Kim, Yongho

    2009-02-01

    Advances in wireless technology and miniature cameras have enabled Wireless Capsule Endoscopy (WCE), which allows a physician to examine a patient's digestive system without performing a surgical procedure. Although WCE is a technical breakthrough that allows physicians to visualize the entire small bowel noninvasively, viewing the video takes 1-2 hours. This is very time-consuming for the gastroenterologist; it not only limits the wide application of this technology but also incurs considerable cost. It is therefore important to automate the process so that medical clinicians can focus only on events of interest. As an extension of our previous work characterizing the motility of the digestive tract in WCE videos, we propose a new assessment system for energy-based event detection (EG-EBD) to classify the events in WCE videos. The system first extracts general features of a WCE video that characterize the intestinal contractions in the digestive organs. Then, event boundaries are identified using a High Frequency Content (HFC) function. The segments are classified into WCE events using specialized features. In this system, we focus on entering the duodenum, entering the cecum, and active bleeding. The assessment system can easily be extended to discover more WCE events, such as detailed organ segmentation and more diseases, by introducing new specialized features. In addition, the system provides a score for every WCE image for each event. Using the event scores, the system helps a specialist speed up the diagnosis process.
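
    The HFC function used for boundary detection has a standard compact form: each short-time spectral frame is scored by its energy weighted by frequency index, so frames rich in high frequencies score high, and boundaries are taken at prominent jumps in that score. A minimal sketch with illustrative window settings (not the authors' values):

      # High Frequency Content (HFC) boundary detector on a 1-D signal.
      import numpy as np
      from scipy.signal import stft, find_peaks

      def hfc_boundaries(x, fs, nperseg=1024):
          f, t, Z = stft(x, fs=fs, nperseg=nperseg)
          # HFC: per-frame energy weighted by frequency-bin index
          hfc = np.sum(np.arange(Z.shape[0])[:, None] * np.abs(Z) ** 2, axis=0)
          d = np.maximum(np.diff(hfc), 0.0)   # positive inter-frame increase
          peaks, _ = find_peaks(d, height=d.mean() + 2 * d.std())
          return t[peaks + 1]                 # boundary times in seconds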

  10. Finite-Horizon $H_\infty$ Consensus for Multiagent Systems With Redundant Channels via an Observer-Type Event-Triggered Scheme.

    PubMed

    Xu, Wenying; Wang, Zidong; Ho, Daniel W C

    2018-05-01

    This paper is concerned with the finite-horizon consensus problem for a class of discrete time-varying multiagent systems with external disturbances and missing measurements. To improve the communication reliability, redundant channels are introduced and the corresponding protocol is constructed for the information transmission over redundant channels. An event-triggered scheme is adopted to determine whether the information of agents should be transmitted to their neighbors. Subsequently, an observer-type event-triggered control protocol is proposed based on the latest received neighbors' information. The purpose of the addressed problem is to design a time-varying controller based on the observed information to achieve the consensus performance in a finite horizon. By utilizing a constrained recursive Riccati difference equation approach, some sufficient conditions are obtained to guarantee the consensus performance, and the controller parameters are also designed. Finally, a numerical example is provided to demonstrate the desired reliability of redundant channels and the effectiveness of the event-triggered control protocol.
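
    The event-triggered mechanism at the heart of such schemes is easiest to see in a scalar toy problem: each agent transmits its state to neighbors only when it has drifted sufficiently far from its last broadcast value. The loop below is a deliberately simplified static average-consensus version of that idea, not the paper's observer-based finite-horizon design; the threshold and gain are arbitrary.

      # Toy event-triggered average consensus on an undirected ring.
      import numpy as np

      A = np.array([[0, 1, 0, 1],       # adjacency matrix, 4 agents
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [1, 0, 1, 0]], float)
      deg = A.sum(1)
      x = np.array([4.0, -2.0, 1.0, 7.0])   # true states
      xb = x.copy()                         # last broadcast values
      eps, gain, n_tx = 0.1, 0.2, 0
      for _ in range(200):
          trig = np.abs(x - xb) > eps       # event condition: drifted too far
          xb[trig] = x[trig]                # triggered agents broadcast
          n_tx += int(trig.sum())
          x = x + gain * (A @ xb - deg * xb)  # update from broadcast states
      print("states:", x.round(2), "transmissions:", n_tx)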

  11. Non-stationary least-squares complex decomposition for microseismic noise attenuation

    NASA Astrophysics Data System (ADS)

    Chen, Yangkang

    2018-06-01

    Microseismic data processing and imaging are crucial for subsurface real-time monitoring during the hydraulic fracturing process. Unlike active-source seismic events or large-scale earthquake events, a microseismic event is usually of very small magnitude, which makes its detection challenging. The biggest challenge with microseismic data is the low signal-to-noise ratio. Because of the small energy difference between effective microseismic signals and ambient noise, the effective signals are usually buried in strong random noise. I propose a useful microseismic denoising algorithm that decomposes a microseismic trace into an ensemble of components using least-squares inversion. Based on the predictability of the useful microseismic event along the time direction, the random noise can be filtered out via least-squares fitting of multiple damped exponential components. The method is flexible and almost automated, since the only parameter that needs to be defined is the number of components. I use synthetic and real data examples to demonstrate the potential of the algorithm in processing complicated microseismic data sets.
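
    One way to picture the decomposition is as a linear least-squares projection of the trace onto a dictionary of damped sinusoids, keeping the fit as the predictable signal and discarding the residual as noise. The sketch below does exactly that; the frequency and damping grids are arbitrary choices for illustration, not the algorithm's actual parameterization.

      # Denoising by least-squares fit of damped exponential components.
      import numpy as np

      def damped_exp_denoise(x, fs, freqs, dampings):
          t = np.arange(len(x)) / fs
          atoms = []
          for f in freqs:
              for a in dampings:
                  env = np.exp(-a * t)          # damping envelope
                  atoms += [env * np.cos(2 * np.pi * f * t),
                            env * np.sin(2 * np.pi * f * t)]
          D = np.stack(atoms, axis=1)                   # dictionary (N, K)
          coef, *_ = np.linalg.lstsq(D, x, rcond=None)  # least-squares inversion
          return D @ coef                               # denoised trace

      # e.g. xhat = damped_exp_denoise(trace, fs=500,
      #                                freqs=np.arange(10, 120, 10),
      #                                dampings=[1.0, 5.0, 20.0])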

  12. Effects of intermediate-scale wind disturbance on composition, structure, and succession in Quercus stands: Implications for natural disturbance-based silviculture

    Treesearch

    M.M. Cowden; J.L. Hart; C.J. Schweitzer; D.C. Dey

    2014-01-01

    Forest disturbances are discrete events in space and time that disrupt the biophysical environment and impart lasting legacies on forest composition and structure. Disturbances are often classified along a gradient of spatial extent and magnitude that ranges from catastrophic events where most of the overstory is removed to gap-scale events that modify local...

  13. The Influence of Age at Single-Event Multilevel Surgery on Outcome in Children with Cerebral Palsy Who Walk with Flexed Knee Gait

    ERIC Educational Resources Information Center

    Svehlik, Martin; Steinwender, Gerhard; Kraus, Tanja; Saraph, Vinay; Lehmann, Thomas; Linhart, Wolfgang E.; Zwick, Ernst B.

    2011-01-01

    Aim: Information on the timing and long-term outcome of single-event multilevel surgery in children with bilateral spastic cerebral palsy (CP) walking with flexed knee gait is limited. Based on our clinical experience, we hypothesized that older children with bilateral spastic CP would benefit more from single-event multilevel surgery than younger…

  14. Joint time-frequency analysis of EEG signals based on a phase-space interpretation of the recording process

    NASA Astrophysics Data System (ADS)

    Testorf, M. E.; Jobst, B. C.; Kleen, J. K.; Titiz, A.; Guillory, S.; Scott, R.; Bujarski, K. A.; Roberts, D. W.; Holmes, G. L.; Lenck-Santini, P.-P.

    2012-10-01

    Time-frequency transforms are used to identify events in clinical EEG data. Data are recorded as part of a study for correlating the performance of human subjects during a memory task with pathological events in the EEG, called spikes. The spectrogram and the scalogram are reviewed as tools for evaluating spike activity. A statistical evaluation of the continuous wavelet transform across trials is used to quantify phase-locking events. For simultaneously improving the time and frequency resolution, and for representing the EEG of several channels or trials in a single time-frequency plane, a multichannel matching pursuit algorithm is used. Fundamental properties of the algorithm are discussed as well as preliminary results, which were obtained with clinical EEG data.
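
    Of the tools listed, the spectrogram is the simplest to demonstrate. The toy example below applies it to a synthetic one-channel trace with a brief transient standing in for a spike; the sampling rate, window length, and overlap are arbitrary choices, not those of the clinical study.

      # Spectrogram of a synthetic "EEG" trace with a spike-like transient.
      import numpy as np
      from scipy.signal import spectrogram

      fs = 256
      t = np.arange(0, 10, 1 / fs)
      eeg = np.sin(2 * np.pi * 10 * t)                # 10 Hz background rhythm
      eeg[5 * fs:5 * fs + 26] += 3 * np.hanning(26)   # ~100 ms transient at t=5 s
      f, tt, Sxx = spectrogram(eeg, fs=fs, nperseg=128, noverlap=96)
      peak_t = tt[Sxx.sum(axis=0).argmax()]           # frame with most energy
      print(f"strongest broadband activity near t = {peak_t:.2f} s")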

  15. First demonstration of an emulsion multi-stage shifter for accelerator neutrino experiments in J-PARC T60

    NASA Astrophysics Data System (ADS)

    Yamada, K.; Aoki, S.; Cao, S.; Chikuma, N.; Fukuda, T.; Fukuzawa, Y.; Gonin, M.; Hayashino, T.; Hayato, Y.; Hiramoto, A.; Hosomi, F.; Inoh, T.; Iori, S.; Ishiguro, K.; Kawahara, H.; Kim, H.; Kitagawa, N.; Koga, T.; Komatani, R.; Komatsu, M.; Matsushita, A.; Mikado, S.; Minamino, A.; Mizusawa, H.; Matsumoto, T.; Matsuo, T.; Morimoto, Y.; Morishima, K.; Morishita, M.; Naganawa, N.; Nakamura, K.; Nakamura, M.; Nakamura, Y.; Nakano, T.; Nakatsuka, Y.; Nakaya, T.; Nishio, A.; Ogawa, S.; Oshima, H.; Quilain, B.; Rokujo, H.; Sato, O.; Seiya, Y.; Shibuya, H.; Shiraishi, T.; Suzuki, Y.; Tada, S.; Takahashi, S.; Yokoyama, M.; Yoshimoto, M.

    2017-06-01

    We describe the first ever implementation of a clock-based, multi-stage emulsion shifter in an accelerator neutrino experiment. The system was installed in the neutrino monitoring building at the Japan Proton Accelerator Research Complex as part of a test experiment, T60, and stable operation was maintained for a total of 126.6 days. By applying time information to emulsion films, various results were obtained. Time resolutions of 5.3-14.7 s were evaluated in an operation spanning 46.9 days (yielding division numbers of 1.4-3.8×10⁵). By using timing and spatial information, reconstruction of coincident events consisting of high-multiplicity and vertex-contained events, including neutrino events, was performed. Emulsion events were matched to events observed by INGRID, one of the on-axis near detectors of the T2K experiment, with high reliability (98.5%), and hybrid analysis of the emulsion and INGRID events was established by means of the multi-stage shifter. The results demonstrate that the multi-stage shifter can feasibly be used in neutrino experiments.

  16. Sport events and climate for visitors—the case of FIFA World Cup in Qatar 2022

    NASA Astrophysics Data System (ADS)

    Matzarakis, Andreas; Fröhlich, Dominik

    2015-04-01

    The effect of weather on sport events is not well studied. It requires special attention if the event takes place at a time and location with extreme weather conditions. For the world soccer championship in Qatar (Doha 2022), a human-biometeorological analysis has been performed in order to identify the time of the year that is most suitable in terms of thermal comfort for visitors attending the event. The analysis is based on thermal indices such as the Physiologically Equivalent Temperature (PET). The results show that this kind of event may not be appropriate for visitors if it is scheduled during months with extreme conditions. For Doha, this is the period from May to September, when conditions during a large majority of the hours of the day cause strong heat stress for visitors. A more appropriate time would be the months of November to February, when thermally comfortable conditions are much more frequent. The methods applied here can quantify the thermal conditions and show limitations and possibilities for specific events and locations.

  17. Optimal filter parameters for low SNR seismograms as a function of station and event location

    NASA Astrophysics Data System (ADS)

    Leach, Richard R.; Dowla, Farid U.; Schultz, Craig A.

    1999-06-01

    Global seismic monitoring requires deployment of seismic sensors worldwide, in many areas that have not been studied or have few usable recordings. Using events with lower signal-to-noise ratios (SNR) would increase the amount of data from these regions. Lower-SNR events can add significant numbers to data sets, but recordings of these events must be carefully filtered. For a given region, conventional methods of filter selection can be quite subjective and may require intensive analysis of many events. To reduce this laborious process, we have developed an automated method that provides optimal filters for low-SNR regional or teleseismic events. As seismic signals are often localized in frequency and time with distinct time-frequency characteristics, our method is based on the decomposition of a time series into a set of subsignals, each representing a band with f/Δf constant (constant Q). The SNR is calculated from the pre-event noise window and the signal window. The band-pass signals with high SNR are used to indicate the cutoff limits for the optimized filter. Results indicate a significant improvement in SNR, particularly for low-SNR events. The method provides an optimum filter that can be immediately applied to unknown regions. The filtered signals are used to map the seismic frequency response of a region and may provide improvements in travel-time picking, azimuth estimation, regional characterization, and event detection. For example, when an event is detected and a preliminary location is determined, the computer could automatically select optimal filter bands for data from non-reporting stations. Results are shown for a set of low-SNR events as well as 379 regional and teleseismic events recorded at stations ABKT, KIV, and ANTO in the Middle East.
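
    The selection rule is straightforward to prototype: split the record into constant-Q bands, compute each band's SNR as signal-window power over pre-event-noise power, and take the lowest and highest retained band edges as the cutoff limits of the optimized filter. A sketch under assumed windows, band count, and threshold (none taken from the paper):

      # Band-wise SNR screening to pick optimized filter cutoffs.
      import numpy as np
      from scipy.signal import butter, filtfilt

      def optimal_band(x, fs, noise_sl, signal_sl, n_bands=12,
                       f_lo=0.5, f_hi=20.0, snr_min=2.0):
          edges = np.geomspace(f_lo, f_hi, n_bands + 1)  # constant-Q spacing
          keep = []
          for lo, hi in zip(edges[:-1], edges[1:]):
              b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
              y = filtfilt(b, a, x)                      # offline, zero-phase
              snr = np.mean(y[signal_sl] ** 2) / np.mean(y[noise_sl] ** 2)
              if snr >= snr_min:
                  keep.append((lo, hi))
          if not keep:
              return None
          return keep[0][0], keep[-1][1]  # cutoff limits of optimized filter

      # usage: band = optimal_band(trace, fs=40.0,
      #                            noise_sl=slice(0, 2000),
      #                            signal_sl=slice(2400, 4400))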

  18. Limitations imposed on fire PRA methods as the result of incomplete and uncertain fire event data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nowlen, Steven Patrick; Hyslop, J. S.

    2010-04-01

    Fire probabilistic risk assessment (PRA) methods utilize data and insights gained from actual fire events in a variety of ways. For example, fire occurrence frequencies, manual fire fighting effectiveness and timing, and the distribution of fire events by fire source and plant location are all based directly on the historical experience base. Other factors are either derived indirectly or supported qualitatively based on insights from the event data. These factors include the general nature and intensity of plant fires, insights into operator performance, and insights into fire growth and damage behaviors. This paper will discuss the potential methodology improvements that could be realized if more complete fire event reporting information were available. Areas that could benefit from more complete event reporting that will be discussed in the paper include fire event frequency analysis, analysis of fire detection and suppression system performance including incipient detection systems, analysis of manual fire fighting performance, treatment of fire growth from incipient stages to fully-involved fires, operator response to fire events, the impact of smoke on plant operations and equipment, and the impact of fire-induced cable failures on plant electrical circuits.

  19. Event-Triggered Distributed Average Consensus Over Directed Digital Networks With Limited Communication Bandwidth.

    PubMed

    Li, Huaqing; Chen, Guo; Huang, Tingwen; Dong, Zhaoyang; Zhu, Wei; Gao, Lan

    2016-12-01

    In this paper, we consider the event-triggered distributed average consensus of discrete-time first-order multiagent systems with limited communication data rate and general directed network topology. In the framework of a digital communication network, each agent has a real-valued state but can only exchange a finite-bit binary symbolic data sequence with its neighboring agents at each time step, due to digital communication channels with energy constraints. Novel event-triggered dynamic encoders and decoders for each agent are designed, based on which a distributed control algorithm is proposed. A scheme that selects the number of channel quantization levels (number of bits) at each time step is developed, under which none of the quantizers in the network is ever saturated. The convergence rate of consensus is explicitly characterized and is related to the scale of the network, the maximum degree of nodes, the network structure, the scaling function, the quantization interval, the initial states of agents, the control gain, and the event gain. It is also found that under the designed event-triggered protocol, by selecting suitable parameters, distributed average consensus can always be achieved with an exponential convergence rate for any directed digital network containing a spanning tree, based on merely one-bit information exchange between each pair of adjacent agents at each time step. Two simulation examples are provided to illustrate the feasibility of the presented protocol and the correctness of the theoretical results.

  20. From mess to mass: a methodology for calculating storm event pollutant loads with their uncertainties, from continuous raw data time series.

    PubMed

    Métadier, M; Bertrand-Krajewski, J-L

    2011-01-01

    With the increasing implementation of continuous monitoring of both discharge and water quality in sewer systems, large databases are now available. In order to manage large amounts of data and calculate various variables and indicators of interest, it is necessary to apply automated methods for data processing. This paper deals with the processing of short-time-step turbidity time series to estimate TSS (Total Suspended Solids) and COD (Chemical Oxygen Demand) event loads in sewer systems during storm events, along with their associated uncertainties. The following steps are described: (i) sensor calibration, (ii) estimation of data uncertainties, (iii) correction of raw data, (iv) data pre-validation tests, (v) final validation, and (vi) calculation of TSS and COD event loads and estimation of their uncertainties. These steps have been implemented in an integrated software tool. Examples of results are given for a set of 33 storm events monitored in a stormwater separate sewer system.
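
    Step (vi) amounts to integrating the product of discharge and turbidity-derived concentration over the event and propagating the measurement uncertainties; a Monte Carlo version of that propagation is compact to write down. In the sketch below, the relative uncertainties on Q and C (5% and 15%) are placeholders, not the paper's calibrated values.

      # Storm-event pollutant load with Monte Carlo uncertainty propagation.
      import numpy as np

      def event_load(q, c, dt, u_q=0.05, u_c=0.15, n_mc=5000, seed=1):
          """q: discharge (m3/s); c: concentration (mg/l ~ g/m3); dt: step (s)."""
          rng = np.random.default_rng(seed)
          load = np.sum(q * c * dt) / 1000.0      # best-estimate load, kg
          qs = q * (1 + u_q * rng.standard_normal((n_mc, q.size)))
          cs = c * (1 + u_c * rng.standard_normal((n_mc, c.size)))
          mc = np.sum(qs * cs * dt, axis=1) / 1000.0
          return load, mc.std()                   # kg, standard uncertainty

      q = np.full(360, 0.8)            # flat 1-hour hydrograph at 10 s steps
      c = np.linspace(50, 250, 360)    # rising TSS concentration, mg/l
      print(event_load(q, c, dt=10.0))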

  1. Forensic Disaster Analysis in Near-real Time

    NASA Astrophysics Data System (ADS)

    Kunz, Michael; Zschau, Jochen; Wenzel, Friedemann; Khazai, Bijan; Kunz-Plapp, Tina; Trieselmann, Werner

    2014-05-01

    The impacts of extreme hydro-meteorological and geophysical events are controlled by various factors, including the severity of the event (intensity, duration, spatial extent), amplification by other phenomena (multi-hazard or cascading effects), interdependencies of technical systems and infrastructure, and the preparedness and resilience of society. The Center for Disaster Management and Risk Reduction Technology (CEDIM) has adopted this comprehensive understanding of disasters and develops methodologies for near-real-time forensic disaster analysis (FDA) as a complementary component of the FORIN program of IRDR. The new research strategy 'Near Real-Time Forensic Disaster Analysis (FDA)' aims at scrutinizing disasters closely with a multi-disciplinary approach in order to assess the various aspects of disasters and to identify the mechanisms most relevant for an extreme event to become a disaster (e.g., causal loss analysis). Recent technology developments, which have opened unprecedented opportunities for real-time hazard, vulnerability and loss assessment, are used for analyzing disasters and their impacts in combination with databases of historical events. These cover modern empirical and analytical methods available in engineering and remote sensing for rapid impact assessments, rapid information extraction from crowd sourcing, and rapid assessments of socio-economic impacts and economic losses. The event-driven, science-based assessments of CEDIM are compiled on the basis of interdisciplinary expertise and include the critical evaluation, assessment, validation, and quantification of an event. An important component of CEDIM's FDA is the near-real-time approach, which is expected to significantly speed up our understanding of natural disasters and to provide timely, relevant and valuable information to various user groups within their respective contexts. Currently, CEDIM has developed models and methodologies to assess different types of hazard. These approaches were applied to several disasters including, for example, Super Typhoon Haiyan/Yolanda (Nov. 2013), the Central European Floods (June 2013), Hurricane Sandy (Oct. 2012), the US Droughts (Summer 2012), and Typhoon Saola in Taiwan and the Philippines (July 2012).

  2. Relational event models for longitudinal network data with an application to interhospital patient transfers.

    PubMed

    Vu, Duy; Lomi, Alessandro; Mascia, Daniele; Pallotti, Francesca

    2017-06-30

    The main objective of this paper is to introduce and illustrate relational event models, a new class of statistical models for the analysis of time-stamped data with complex temporal and relational dependencies. We outline the main differences between recently proposed relational event models and more conventional network models based on the graph-theoretic formalism typically adopted in empirical studies of social networks. Our main contribution involves the definition and implementation of a marked point process extension of currently available models. According to this approach, the sequence of events of interest is decomposed into two components: (a) event time and (b) event destination. This decomposition transforms the problem of selecting the event destination in relational event models into a conditional multinomial logistic regression problem. The main advantages of this formulation are the possibility of controlling for the effect of event-specific data and a significant reduction in the estimation time of currently available relational event models. We demonstrate the empirical value of the model in an analysis of interhospital patient transfers within a regional community of health care organizations. We conclude with a discussion of how the models we presented help to overcome some of the limitations of statistical models for networks that are currently available. Copyright © 2017 John Wiley & Sons, Ltd.

  3. Deconstructing events: The neural bases for space, time, and causality

    PubMed Central

    Kranjec, Alexander; Cardillo, Eileen R.; Lehet, Matthew; Chatterjee, Anjan

    2013-01-01

    Space, time, and causality provide a natural structure for organizing our experience. These abstract categories allow us to think relationally in the most basic sense; understanding simple events requires one to represent the spatial relations among objects, the relative durations of actions or movements, and links between causes and effects. The present fMRI study investigates the extent to which the brain distinguishes between these fundamental conceptual domains. Participants performed a one-back task with three conditions of interest (SPACE, TIME and CAUSALITY). Each condition required comparing relations between events in a simple verbal narrative. Depending on the condition, participants were instructed to attend to either the spatial, temporal, or causal characteristics of events, but between participants, each particular event relation appeared in all three conditions. Contrasts compared neural activity during each condition against the remaining two and revealed how thinking about events is deconstructed neurally. Space trials recruited neural areas traditionally associated with visuospatial processing, primarily bilateral frontal and occipitoparietal networks. Causality trials activated areas previously found to underlie causal thinking and thematic role assignment, such as the left medial frontal and left middle temporal gyri, respectively. Causality trials also produced activations in SMA, caudate, and cerebellum, cortical and subcortical regions associated with the perception of time at different timescales. The TIME contrast, however, produced no significant effects. This pattern, indicating negative results for TIME trials but positive effects for CAUSALITY trials in areas important for time perception, motivated additional overlap analyses to further probe relations between domains. The results of these analyses suggest a closer correspondence between time and causality than between time and space. PMID:21861674

  4. 30 WS North Base Wind Study

    NASA Technical Reports Server (NTRS)

    Wheeler, Mark

    2011-01-01

    The 30 Weather Squadron (30 WS) is concerned about strong winds observed at their northern towers without advance warning. They state that terrain influences along the extreme northern fringes of Vandenberg Air Force Base (VAFB) make it difficult for forecasters to issue timely and accurate high-wind warnings for northeasterly wind events. These events tend to occur during the winter or early spring when they are under the influence of the Great Basin high-pressure weather regime. The Launch Weather Officers (LWOs) have seen these rapid wind increases at the current northern Towers 60, 70 and 71 in excess of their 35 kt operational warning threshold. For this task, the 30 WS requested that the Applied Meteorology Unit (AMU) analyze data from days when these towers reported winds in excess of 35 kt and determine whether there were any precursors in the observations that would allow the LWOs to better forecast and warn their operational customers of these wind events. The 30 WS provided wind tower data for the cool season (October - March) from the period January 2004-March 2010. The AMU decoded and evaluated the wind tower data for 66 days identified by the 30 WS as having high-wind events. Out of the 66 event days, only 30 had wind speed observations of ≥35 kt from at least one of the three northern towers. The AMU analyzed surface and upper air charts to determine the synoptic conditions for each event day, along with tower peak wind speed and direction time series and wind rose charts for all 30 event days. The analysis revealed a trend on all event days in which the tower winds shifted to the northeast for a period of time before the first recorded ≥35 kt wind speed. The time periods for the 30 event days ranged from 20 minutes to several hours, with a median value of 110 minutes. This trend, if monitored, could give the 30 WS forecasters a precursor to assist in issuing an operational warning before a high-wind event occurs. The AMU recommends developing a high-wind alert capability for VAFB using a local mesoscale model to forecast these wind events. The model should incorporate all of the VAFB local data sets and have a forecast capability of between 2 and 24 hours. Such a model would allow the meteorologists at VAFB to alert the operational customers of high-wind events in a timely manner so protective action could be taken.

  5. An event-related visual occlusion method for examining anticipatory skill in natural interceptive tasks.

    PubMed

    Mann, David L; Abernethy, Bruce; Farrow, Damian; Davis, Mark; Spratford, Wayne

    2010-05-01

    This article describes a new automated method for the controlled occlusion of vision during natural tasks. The method permits the time course of the presence or absence of visual information to be linked to identifiable events within the task of interest. An example application is presented in which the method is used to examine the ability of cricket batsmen to pick up useful information from the prerelease movement patterns of the opposing bowler. Two key events, separated by a consistent within-action time lag, were identified in the cricket bowling action sequence: the penultimate foot strike prior to ball release (Event 1) and the subsequent moment of ball release (Event 2). Force-plate registration of Event 1 was then used as a trigger to facilitate automated occlusion of vision, using liquid crystal occlusion goggles, at time points relative to Event 2. Validation demonstrated that, compared with existing approaches based on manual triggering, this method of occlusion permitted considerable gains in temporal precision and a reduction in the number of unusable trials. A more efficient and accurate protocol to examine anticipation is produced, while preserving the important natural coupling between perception and action.

  6. [Infiltration characteristics of soil water on loess slope land under intermittent and repetitive rainfall conditions].

    PubMed

    Li, Yi; Shao, Ming-An

    2008-07-01

    Based on experiments with controlled intermittent and repetitive rainfall on slope land, the infiltration and distribution characteristics of soil water on loess slope land were studied. The results showed that under intermittent rainfall, the cumulative runoff during the two rainfall events increased linearly with time, and the wetting front also advanced with time. In the interval between the two rainfall events, the wetting front advanced slowly, and the infiltration rate was smaller on the steeper slope than on the flatter surface. During the second rainfall event, there was an obvious decreasing trend of infiltration rate with time. The cumulative infiltration on the 15 degrees slope land was larger than that on the 25 degrees slope land, being 178 mm and 88 mm, respectively. Under repetitive rainfall, the initial infiltration rate during each rainfall event was relatively large, and during the first rainfall both the infiltration rate and the cumulative infiltration at the various stages were larger than during the other three rainfall events. However, after the first rainfall, there were no obvious differences in infiltration rate among the next three rainfall events. With each successive rainfall event, the wetting front advanced deeper.

  7. Pesticide leaching via subsurface drains in different hydrologic situations

    NASA Astrophysics Data System (ADS)

    Zajíček, Antonín; Fučík, Petr; Liška, Marek; Dobiáš, Jakub

    2017-04-01

    Pesticides and their degradates in tile drainage waters were studied in two small, predominantly agricultural, tile-drained subcatchments in the Bohemian-Moravian Highlands, Czech Republic. The goal was to evaluate their occurrence and the dynamics of their concentrations in drainage waters in different hydrologic situations, using discharge and concentration monitoring together with 18O and 2H isotope analysis for Mean Residence Time (MRT) estimation and hydrograph separations during rainfall-runoff (R-R) events. The drainage and stream discharges were measured continuously at the closing outlets of three drainage groups and one small stream. During periods of prevailing base flow and interflow, samples were collected manually at two-week intervals for isotope analysis and, during the spraying period (March to October), also for pesticide analysis. During R-R events, samples were taken by automatic samplers at intervals varying from 20 min (summer) to 1 hour (winter). To enable isotopic analysis, precipitation was sampled both manually at two-week intervals and with an automatic rainfall sampler that collected precipitation during R-R events at 20-min intervals. The isotopic analysis showed that the MRT of drainage base flow and interflow varies from 2.2 to 3.3 years, while the MRT of base flow and interflow in the surface stream is several months. During R-R events, the proportion of event water varied from 0 to 60% in both drainage and surface runoff. The occurrence of pesticides and their degradates in drainage waters is strongly dependent on the hydrologic situation. While degradates were permanently present in drainage waters in high but varying concentrations according to the instantaneous runoff composition, parent compounds were detected almost exclusively during R-R events. In periods with prevailing base flow and interflow (grab samples), ESA forms of chloroacetanilide degradates in particular occurred in high concentrations in all samples; the average sum of degradates varied between 1,730 and 5,760 ng/l. During R-R events, pesticide concentrations varied according to the runoff composition and the time between spraying and the event. Events with no proportion of event water in the drainage runoff were typified by an increase in degradate concentrations (up to 20,000 ng/l) and no or low occurrence of parent compounds. Events with a significant event-water proportion in the drainage runoff were characterized by a decrease in degradate concentrations and, when the event happened soon after spraying, by the presence of parent pesticides in the drainage runoff. Instantaneous concentrations of parent compounds can be extremely high in such cases, up to 23,000 ng/l in drainage waters and up to 40,000 ng/l in the small stream. These results suggest that drainage systems can act as a significant pathway of pesticide leaching. For parent compounds to leach via tile drainage systems, several conditions must coincide, such as the occurrence of an R-R event soon after pesticide application and the presence of event water (or water with a short residence time in the catchment) in the drainage runoff.

  8. Multilevel joint competing risk models

    NASA Astrophysics Data System (ADS)

    Karunarathna, G. H. S.; Sooriyarachchi, M. R.

    2017-09-01

    Joint modeling approaches are often required for mixed outcomes of competing-risk time-to-event and count data in many biomedical and epidemiological studies in the presence of cluster effects. Hospital length of stay (LOS) is a widely used outcome measure of hospital utilization, and it admits multiple terminations such as discharge, transfer, death, and patients who have not completed the event of interest by the end of follow-up (censored) during hospitalization. Competing-risk models provide a method of addressing such multiple destinations, since classical time-to-event models yield biased results when there are multiple possible events. In this study, the concept of joint modeling has been applied to dengue epidemiology in Sri Lanka (2006-2008) to assess the relationship between the different outcomes of LOS and the platelet count of dengue patients, with a district-level cluster effect. Two key approaches have been applied to build the joint scenario. In the first approach, each competing risk is modeled separately using a binary logistic model, treating all other events as censored under a multilevel discrete time-to-event model, while the platelet counts are assumed to follow a lognormal regression model. The second approach is based on the endogeneity effect in the multilevel competing-risks and count model. Model parameters were estimated using maximum likelihood based on the Laplace approximation. Moreover, the study reveals that the joint modeling approach yields more precise results than fitting two separate univariate models, in terms of AIC (Akaike Information Criterion).

  9. The Earth Observatory Natural Event Tracker (EONET): An API for Matching Natural Events to GIBS Imagery

    NASA Astrophysics Data System (ADS)

    Ward, K.

    2015-12-01

    Hidden within the terabytes of imagery in NASA's Global Imagery Browse Services (GIBS) collection are hundreds of daily natural events. Some events are newsworthy, devastating, and visibly obvious at a global scale; others are merely regional curiosities. Regardless of the scope and significance of any one event, it is likely that multiple GIBS layers can be viewed to provide a multispectral, dataset-based view of the event. To facilitate linking between a discrete event and the representative dataset imagery, NASA's Earth Observatory Group has developed a prototype application programming interface (API): the Earth Observatory Natural Event Tracker (EONET). EONET supports an API model that allows users to retrieve event-specific metadata (date/time, location, and type: wildfire, storm, etc.) and web-service layer-specific metadata that can be used to link to event-relevant dataset imagery in GIBS. GIBS' ability to ingest many near-real-time datasets, combined with its growing archive of past imagery, means that API users will be able to develop client applications that not only show ongoing events but can also look at imagery from before and after. In our poster, we will present the API and show examples of its use.
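
    As a concrete illustration, a client can query the events endpoint and read each event's type and geometry before requesting matching GIBS imagery. The sketch below targets the publicly documented v3 events route with its "status" and "limit" parameters; these specifics postdate the 2015 prototype described above, so treat the URL and response fields as assumptions to verify against the current EONET documentation.

      # Minimal EONET client: list a few currently open natural events.
      import requests

      resp = requests.get(
          "https://eonet.gsfc.nasa.gov/api/v3/events",  # assumed v3 route
          params={"status": "open", "limit": 5},
          timeout=30,
      )
      resp.raise_for_status()
      for ev in resp.json()["events"]:
          cats = ", ".join(c["title"] for c in ev["categories"])
          when = ev["geometry"][0]["date"]   # first geometry = earliest fix
          print(f'{when}  [{cats}]  {ev["title"]}')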

  10. Semiparametric Time-to-Event Modeling in the Presence of a Latent Progression Event

    PubMed Central

    Rice, John D.; Tsodikov, Alex

    2017-01-01

    In cancer research, interest frequently centers on factors influencing a latent event that must precede a terminal event. In practice it is often impossible to observe the latent event precisely, making inference about this process difficult. To address this problem, we propose a joint model for the unobserved time to the latent and terminal events, with the two events linked by the baseline hazard. Covariates enter the model parametrically as linear combinations that multiply, respectively, the hazard for the latent event and the hazard for the terminal event conditional on the latent one. We derive the partial likelihood estimators for this problem assuming the latent event is observed, and propose a profile likelihood-based method for estimation when the latent event is unobserved. The baseline hazard in this case is estimated nonparametrically using the EM algorithm, which allows for closed-form Breslow-type estimators at each iteration, bringing improved computational efficiency and stability compared with maximizing the marginal likelihood directly. We present simulation studies to illustrate the finite-sample properties of the method; its use in practice is demonstrated in the analysis of a prostate cancer data set. PMID:27556886

  11. Long-term prospective memory impairment following mild traumatic brain injury with loss of consciousness: findings from the Canadian Longitudinal Study on Aging.

    PubMed

    Bedard, Marc; Taler, Vanessa; Steffener, Jason

    2017-12-18

    We aimed to examine the extent to which loss of consciousness (LOC) following mild traumatic brain injury (mTBI) may be associated with impairments in time- and event-based prospective memory (PM). PM is thought to involve executive processes and to be subserved by prefrontal regions. Neuroimaging research suggests alterations to these areas of the brain several years after mTBI, particularly if LOC was experienced. However, it remains unclear whether impairments in time- or event-based functioning persist more than a year after mTBI, and what the link with duration of LOC may be. Analyses were run on data from the Canadian Longitudinal Study on Aging, a nationwide study on health and aging involving individuals between the ages of 45 and 85. The present study included 1937 participants who had experienced mTBI more than 12 months earlier (1146 of whom reported spending less than 1 min unconscious and 791 of whom had LOC between 1 and 20 min) and 13,525 cognitively healthy adults. Participants were administered the Miami Prospective Memory Test and tests of retrospective memory and executive functioning. Both mTBI groups were impaired in time-based PM relative to people with no history of TBI. Time- and event-based impairments were predicted by older age, and by executive dysfunction among those who spent more time unconscious. Those with mTBI with LOC may experience impairments in PM, particularly under conditions of high demand on executive processes (time-based PM). Implications for interventions aimed at ameliorating PM among those who have experienced mTBI are discussed.

  12. Courses of action for effects based operations using evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Haider, Sajjad; Levis, Alexander H.

    2006-05-01

    This paper presents an Evolutionary Algorithm (EA)-based approach to identifying effective courses of action (COAs) in Effects Based Operations. The approach uses Timed Influence Nets (TINs) as the underlying mathematical model to capture a dynamic, uncertain situation. TINs provide a concise graph-theoretic probabilistic approach to specifying the cause-and-effect relationships that exist among the variables of interest (actions, desired effects, and other uncertain events) in a problem domain. The purpose of building these TIN models is to identify and analyze several alternative courses of action. The current practice is to use trial-and-error techniques, which are not only labor intensive but also produce sub-optimal results and cannot model constraints among actionable events. The EA-based approach presented in this paper is aimed at overcoming these limitations. The approach generates multiple COAs that are close to one another in terms of achieving the desired effect, the purpose being to give several alternatives to a decision maker. Moreover, the alternative COAs can be generalized based on the relationships that exist among the actions and their execution timings. The approach also allows a system analyst to capture certain types of constraints among actionable events.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henderson, C. B.; Gould, A.; Gaudi, B. S.

    The mass of the lenses giving rise to Galactic microlensing events can be constrained by measuring the relative lens-source proper motion and lens flux. The flux of the lens can be separated from that of the source, companions to the source, and unrelated nearby stars with high-resolution images taken when the lens and source are spatially resolved. For typical ground-based adaptive optics (AO) or space-based observations, this requires either inordinately long time baselines or high relative proper motions. We provide a list of microlensing events toward the Galactic bulge with high relative lens-source proper motion that are therefore good candidates for constraining the lens mass with future high-resolution imaging. We investigate all events from 2004 to 2013 that display detectable finite-source effects, a feature that allows us to measure the proper motion. In total, we present 20 events with μ ≳ 8 mas yr⁻¹. Of these, 14 were culled from previous analyses while 6 are new, including OGLE-2004-BLG-368, MOA-2005-BLG-36, OGLE-2012-BLG-0211, OGLE-2012-BLG-0456, MOA-2012-BLG-532, and MOA-2013-BLG-029. In ≲12 yr from the time of each event the lens and source of each event will be sufficiently separated for ground-based telescopes with AO systems or space telescopes to resolve each component and further characterize the lens system. Furthermore, for the most recent events, comparison of the lens flux estimates from images taken immediately to those estimated from images taken when the lens and source are resolved can be used to empirically check the robustness of the single-epoch method currently being used to estimate lens masses for many events.

  15. Model-Based Adaptive Event-Triggered Control of Strict-Feedback Nonlinear Systems.

    PubMed

    Li, Yuan-Xin; Yang, Guang-Hong

    2018-04-01

    This paper is concerned with the adaptive event-triggered control problem of nonlinear continuous-time systems in strict-feedback form. By using an event-sampled neural network (NN) to approximate the unknown nonlinear function, an adaptive model and an associated event-triggered controller are designed by exploiting the backstepping method. In the proposed method, the feedback signals and the NN weights are aperiodically updated only when the event-triggered condition is violated. A positive lower bound on the minimum intersample time is guaranteed to avoid accumulation points. The closed-loop stability of the resulting nonlinear impulsive dynamical system is rigorously proved via Lyapunov analysis under an adaptive event sampling condition. Compared with a traditional adaptive backstepping design with a fixed sample period, the event-triggered method samples the state and updates the NN weights only when necessary, so the number of transmissions can be significantly reduced. Finally, two simulation examples are presented to show the effectiveness of the proposed control method.
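
    The following toy Python sketch illustrates only the generic event-triggered idea described above: the control signal is recomputed from the last-sampled state, and a new sample is transmitted only when a relative-threshold trigger condition is violated. The scalar plant, gains, and threshold are invented; the paper's neural network approximation and backstepping design are not reproduced.

        # Event-triggered sampling sketch: transmit the state only when the
        # gap between the current and last-sampled state violates the trigger.
        def simulate(steps=2000, dt=0.005, sigma=0.05):
            x, x_hat, updates = 1.0, 1.0, 0
            for _ in range(steps):
                u = -2.0 * x_hat                  # feedback from last sample
                x += dt * (x + u)                 # open-loop-unstable plant
                if abs(x - x_hat) > sigma * abs(x) + 1e-3:
                    x_hat, updates = x, updates + 1   # event: new sample sent
            return x, updates

        x_final, n_updates = simulate()
        print(f"final state {x_final:.4f} using {n_updates} transmissions")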

  16. Event-Based $H_\infty$ State Estimation for Time-Varying Stochastic Dynamical Networks With State- and Disturbance-Dependent Noises.

    PubMed

    Sheng, Li; Wang, Zidong; Zou, Lei; Alsaadi, Fuad E

    2017-10-01

    In this paper, the event-based finite-horizon H∞ state estimation problem is investigated for a class of discrete time-varying stochastic dynamical networks with state- and disturbance-dependent noises [also called (x,v)-dependent noises]. An event-triggered scheme is proposed to decrease the frequency of data transmission between the sensors and the estimator, where the signal is transmitted only when certain conditions are satisfied. The aim is to design a time-varying state estimator that estimates the network states from the available output measurements. By employing the completing-the-square technique and a stochastic analysis approach, sufficient conditions are established to ensure that the error dynamics of the state estimation satisfies a prescribed H∞ performance constraint over a finite horizon. The desired estimator parameters can be designed by solving coupled backward recursive Riccati difference equations. Finally, a numerical example is exploited to demonstrate the effectiveness of the developed state estimation scheme.

  17. The Earthquake Early Warning System In Southern Italy: Performance Tests And Next Developments

    NASA Astrophysics Data System (ADS)

    Zollo, A.; Elia, L.; Martino, C.; Colombelli, S.; Emolo, A.; Festa, G.; Iannaccone, G.

    2011-12-01

    PRESTo (PRobabilistic and Evolutionary early warning SysTem) is the software platform for Earthquake Early Warning (EEW) in Southern Italy. It integrates recent algorithms for real-time earthquake location, magnitude estimation, and damage assessment into a highly configurable and easily portable package, and is under active experimentation based on the Irpinia Seismic Network (ISNet). PRESTo processes live streams of 3C acceleration data for P-wave arrival detection and, while an event is occurring, promptly performs event detection and provides location and magnitude estimates and peak ground shaking predictions at target sites. The earthquake location is obtained by an evolutionary, real-time probabilistic approach based on an equal differential time formulation. At each time step, it uses information from both triggered and not-yet-triggered stations. Magnitude estimation exploits an empirical relationship that correlates magnitude with the filtered peak displacement (Pd) measured over the first 2-4 s of the P-signal. Peak ground-motion parameters at any distance can finally be estimated by ground motion prediction equations. Alarm messages containing the updated estimates of these parameters can thus reach target sites before the destructive waves, enabling automatic safety procedures. Using the real-time data streaming from the ISNet network, PRESTo has produced a bulletin for about a hundred low-magnitude events that occurred during the last two years. Meanwhile, the performance of the EEW system was assessed off-line by playing back records of moderate and large events from Italy, Spain, and Japan, and synthetic waveforms for large historical events in Italy. These tests have shown that, when a dense seismic network is deployed in the fault area, PRESTo produces reliable estimates of earthquake location and size within 5-6 s of the event origin time (To). Estimates are provided as probability density functions whose uncertainty typically decreases with time, reaching a stable solution within 10 s of To. The regional approach was recently integrated with a threshold-based early warning method for the definition of alert levels and the estimation of the Potential Damaged Zone (PDZ), in which the highest intensity levels are expected. The dominant period tau_c and the peak displacement (Pd) are simultaneously measured in a 3-s window after the first P-arrival. Pd and tau_c are then compared with threshold values, previously established through an empirical regression analysis, that define a decisional table with four alert levels. According to the real-time measured values of Pd and tau_c, each station provides a local alert level that can be used to warn distant sites and to define the extent of the PDZ. Because only low-magnitude events are currently occurring in Irpinia, the integrated system was validated off-line for the M6.3, 2009 Central Italy earthquake and ten large Japanese events. The results confirmed the feasibility and robustness of the approach, providing reliable predictions of the earthquake damaging effects, which is relevant information for the efficient planning of rescue operations in the immediate post-event emergency phase.
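
    A hedged Python sketch of the kind of Pd/tau_c decisional table described above is given below. The threshold values are placeholders, not the calibrated regression values used by PRESTo.

        # Four alert levels from the first seconds of P-wave data: high Pd
        # suggests strong shaking near the station; high tau_c suggests a
        # large event whose damage zone may extend to distant sites.
        PD_THR_CM, TAUC_THR_S = 0.2, 1.0   # placeholder thresholds

        def alert_level(pd_cm, tau_c_s):
            near = pd_cm >= PD_THR_CM      # damaging shaking expected nearby
            big = tau_c_s >= TAUC_THR_S    # large magnitude expected
            if near and big:
                return 3                   # strong shaking near and far
            if near:
                return 2                   # strong shaking near the station
            if big:
                return 1                   # distant sites may be at risk
            return 0                       # no damaging shaking expected

        print(alert_level(0.35, 1.4))      # -> 3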

  18. Effects of spatial attention on mental time travel in patients with neglect.

    PubMed

    Anelli, Filomena; Avanzi, Stefano; Arzy, Shahar; Mancuso, Mauro; Frassinetti, Francesca

    2018-04-01

    Numerous studies agree that time is represented in spatial terms in the brain. Here we investigate how a deficit in orienting attention in space influences the ability to mentally travel in time, that is, to recall the past and anticipate the future. Right brain-damaged patients with (RBD-N+) and without neglect (RBD-N-), and healthy controls (HC), performed a Mental Time Travel (MTT) task. Participants were asked to project themselves in time to the past, present, or future (i.e., self-projection) and, for each self-projection, to judge whether events were located relatively in the past or the future (i.e., self-reference). The MTT task was performed before and after a manipulation, through prismatic adaptation (PA), inducing a leftward shift of spatial attention. Before PA, RBD-N+ were slower for future than for past events, whereas RBD-N- and HC responded similarly to past and future events. A leftward shift of spatial attention by PA reduced the difference in past/future processing in RBD-N+ and speeded RBD-N- and HC responses to past events. Assuming that time concepts, such as past/future, are coded with a left-to-right order on a mental time line (MTL), a recursive search for future events can explain the neglect patients' performance. Improvement of the spatial deficit following PA reduces the recursive search for future events on the rightmost part of the MTL, facilitating exploration of past events on the leftmost part of the MTL and finally favoring the correct location of past and future events. In addition, a study of the anatomical correlates of the temporal deficit in mental time travel through voxel-based lesion-symptom mapping showed a correlation with lesions located in the insula and the thalamus. These findings provide new insights about the interrelations of space and time, and can pave the way to a procedure to rehabilitate deficits in these cognitive domains. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Development of a process-oriented vulnerability concept for water travel time in karst aquifers-case study of Tanour and Rasoun springs catchment area.

    NASA Astrophysics Data System (ADS)

    Hamdan, Ibraheem; Sauter, Martin; Ptak, Thomas; Wiegand, Bettina; Margane, Armin; Toll, Mathias

    2017-04-01

    Key words: karst aquifer, water travel time, vulnerability assessment, Jordan. Understanding groundwater pathways and movement through karst aquifers, and the karst aquifer response to precipitation events, especially in arid to semi-arid areas, is fundamental to evaluating pollution risks from point and non-point sources. In spite of the great importance of karst aquifers for drinking-water supply, they are highly sensitive to contamination events because of the fast connections between the land surface and the groundwater (through karst features), which makes groundwater quality issues within karst systems very complicated. Within this study, different methods and approaches were developed and applied to characterise the karst aquifer system of the Tanour and Rasoun springs (NW Jordan) and the flow dynamics within the aquifer, and to develop a process-oriented method for vulnerability assessment based on the monitoring of multiple spatially variable parameters of water travel time in the karst aquifer. In general, this study aims to achieve two main objectives: 1. Characterization of the karst aquifer system and flow dynamics. 2. Development of a process-oriented method for vulnerability assessment based on spatially variable parameters of travel time. To achieve these aims, different approaches and methods were applied, starting from an understanding of the geological and hydrogeological characteristics of the karst aquifer and its vulnerability to pollutants, to the use of different methods, procedures, and monitored parameters to determine the water travel time within the aquifer and investigate its response to precipitation events, and, finally, the study of the aquifer response to pollution events. The integrated breakthrough signal obtained from the applied methods and procedures, including the use of stable isotopes of oxygen and hydrogen, the monitoring of multiple qualitative and quantitative parameters using automated probes and data loggers, and the development of a travel-time, physics-based vulnerability assessment method, shows that these are applicable methods for determining water travel time in karst aquifers and for investigating aquifer response to precipitation and pollution events.

  20. Ontology-based prediction of surgical events in laparoscopic surgery

    NASA Astrophysics Data System (ADS)

    Katić, Darko; Wekerle, Anna-Laura; Gärtner, Fabian; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2013-03-01

    Context-aware technologies have great potential to help surgeons during laparoscopic interventions. Their underlying idea is to create systems which can adapt their assistance functions automatically to the situation in the OR, thus relieving surgeons from the burden of managing computer-assisted surgery devices manually. To this end, a certain kind of understanding of the current situation in the OR is essential. Beyond that, anticipatory knowledge of incoming events is beneficial, e.g. for early warnings of imminent risk situations. To achieve the goal of predicting surgical events based on previously observed ones, we developed a language to describe surgeries and surgical events using Description Logics and integrated it with methods from computational linguistics. Using n-grams to compute the probabilities of follow-up events, we are able to make sensible predictions of upcoming events in real time. The system was evaluated on professionally recorded and labeled surgeries and showed an average prediction rate of 80%.
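
    The n-gram step is simple enough to sketch. The bigram example below, in Python, estimates the probability of the next surgical event from the one just observed; the event labels and training sequences are invented for illustration, and the paper's Description Logics layer is not reproduced.

        from collections import Counter, defaultdict

        def train_bigrams(sequences):
            """Count event-to-event transitions across training surgeries."""
            counts = defaultdict(Counter)
            for seq in sequences:
                for prev, nxt in zip(seq, seq[1:]):
                    counts[prev][nxt] += 1
            return counts

        def predict_next(counts, prev):
            """Relative transition frequencies serve as bigram probabilities."""
            total = sum(counts[prev].values())
            return {evt: n / total for evt, n in counts[prev].items()}

        surgeries = [
            ["incision", "dissection", "clipping", "cutting", "suturing"],
            ["incision", "dissection", "coagulation", "cutting", "suturing"],
        ]
        model = train_bigrams(surgeries)
        print(predict_next(model, "dissection"))
        # -> {'clipping': 0.5, 'coagulation': 0.5}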

  1. Neuromorphic audio-visual sensor fusion on a sound-localizing robot.

    PubMed

    Chan, Vincent Yue-Sek; Jin, Craig T; van Schaik, André

    2012-01-01

    This paper presents the first robotic system featuring audio-visual (AV) sensor fusion with neuromorphic sensors. We combine a pair of silicon cochleae and a silicon retina on a robotic platform to allow the robot to learn sound localization through self-motion and visual feedback, using an adaptive ITD-based sound localization algorithm. After training, the robot can localize sound sources (white or pink noise) in a reverberant environment with an RMS error of 4-5° in azimuth. We also investigate the AV source binding problem: an experiment is conducted to test the effectiveness of matching an audio event with a corresponding visual event based on their onset times. Despite the simplicity of this method and a large number of false visual events in the background, a correct match is made 75% of the time during the experiment.
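
    A minimal Python sketch of onset-time binding of the kind tested above: each audio event is matched to the visual event whose onset is closest within a tolerance window. The timestamps and tolerance are invented; the neuromorphic pipeline itself is not reproduced.

        def match_onsets(audio_onsets, visual_onsets, tol=0.100):
            """Pair each audio onset (s) with the nearest visual onset
            within +/- tol seconds, if any exists."""
            matches = []
            for a in audio_onsets:
                near = [v for v in visual_onsets if abs(v - a) <= tol]
                if near:
                    matches.append((a, min(near, key=lambda v: abs(v - a))))
            return matches

        print(match_onsets([1.02, 2.50], [0.98, 1.40, 2.47]))
        # -> [(1.02, 0.98), (2.5, 2.47)]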

  2. Pharmacogenetics-based warfarin dosing algorithm decreases time to stable anticoagulation and the risk of major hemorrhage: an updated meta-analysis of randomized controlled trials.

    PubMed

    Wang, Zhi-Quan; Zhang, Rui; Zhang, Peng-Pai; Liu, Xiao-Hong; Sun, Jian; Wang, Jun; Feng, Xiang-Fei; Lu, Qiu-Fen; Li, Yi-Gang

    2015-04-01

    Warfarin is still the most widely used oral anticoagulant for thromboembolic diseases, despite the recent emergence of novel anticoagulants. However, difficulty in maintaining a stable dose within the therapeutic range and subsequent serious adverse effects have markedly limited its use in clinical practice. A pharmacogenetics-based warfarin dosing algorithm is a recently emerged strategy to predict the initial and maintenance doses of warfarin. However, whether this algorithm is superior to the conventional clinically guided dosing algorithm remains controversial. We compared pharmacogenetics-based and clinically guided dosing algorithms in an updated meta-analysis. We searched OVID MEDLINE, EMBASE, and the Cochrane Library for relevant citations. The primary outcome was the percentage of time in the therapeutic range. The secondary outcomes were time to stable therapeutic dose and the risks of adverse events including all-cause mortality, thromboembolic events, total bleedings, and major bleedings. Eleven randomized controlled trials with 2639 participants were included. Our pooled estimates indicated that the pharmacogenetics-based dosing algorithm did not improve the percentage of time in the therapeutic range [weighted mean difference, 4.26; 95% confidence interval (CI), -0.50 to 9.01; P = 0.08], but it significantly shortened the time to stable therapeutic dose (weighted mean difference, -8.67; 95% CI, -11.86 to -5.49; P < 0.00001). Additionally, the pharmacogenetics-based algorithm significantly reduced the risk of major bleedings (odds ratio, 0.48; 95% CI, 0.23 to 0.98; P = 0.04), but it did not reduce the risks of all-cause mortality, total bleedings, or thromboembolic events. Our results suggest that a pharmacogenetics-based warfarin dosing algorithm significantly improves the efficiency of International Normalized Ratio correction and reduces the risk of major hemorrhage.
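
    The basic pooling operation behind estimates like the weighted mean differences above can be sketched in a few lines of Python. This is generic fixed-effect inverse-variance pooling, not the authors' analysis; the study-level numbers are fabricated.

        import numpy as np

        def pool_fixed_effect(estimates, std_errors):
            """Inverse-variance weighted mean difference and 95% CI."""
            w = 1.0 / np.asarray(std_errors) ** 2
            pooled = np.sum(w * np.asarray(estimates)) / np.sum(w)
            se = np.sqrt(1.0 / np.sum(w))
            return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

        est, ci = pool_fixed_effect([-10.2, -7.5, -8.9], [2.1, 1.8, 2.6])
        print(f"pooled WMD {est:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")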

  3. Middle school students' understanding of time: Implications for the National Science Education Standards

    NASA Astrophysics Data System (ADS)

    Reinemann, Deborah Jean

    2000-10-01

    Measures of time are essential to human life, especially in the Western world. Human understanding of time develops from the preschool stages of using "before" and "after" to an adult understanding and appreciation of time. Previous researchers (for example, Piaget, Friedman) have investigated and described stages of time development. Time, as investigated here, can be classified as conventional, logical, or experiential. Conventional time is the ordered representation of time: the days of the week, the months of the year, or clock time in seconds and hours. Logical time is the deduction of duration based on regular events, for example, calculating the passage of time based on two separate events. Experiential time involves the duration of events and estimating intervals. With the recent production of the National Science Education Standards (NSES), many schools are aligning their science curriculum with the NSES. Time appears both implicitly and explicitly in the NSES. Do middle school students possess the understanding of time necessary to meet the recommendations of the NSES? An interview protocol of four sessions was developed to investigate middle school students' understanding of time. The four sessions included building and testing water clocks; an interview about water clocks and time intervals; a laserdisc presentation about relative time spans; and a mind-mapping session. Students were also given the GALT test of logical thinking. The study subjects, eleven eighth-grade and thirteen sixth-grade students, were interviewed. The data were transcribed and coded, and a rubric was developed to evaluate students based on their responses to the four sessions. The Time Analysis Rubric is a grid of the types of time (conventional, logical, and experiential) versus the degree of understanding of time. Student results were assigned to levels of understanding based on the Time Analysis Rubric. There was a relationship (although not significant) between the students' GALT scores and the Time Analysis Rubric results. There was no difference in Time Analysis levels between sixth- and eighth-grade students. On the basis of this study, middle school students' level of understanding of time appears to be sufficient to master the requirements of the NSES.

  4. Effectiveness of pharmacovigilance training of general practitioners: a retrospective cohort study in the Netherlands comparing two methods.

    PubMed

    Gerritsen, Roald; Faddegon, Hans; Dijkers, Fred; van Grootheest, Kees; van Puijenbroek, Eugène

    2011-09-01

    Spontaneous reporting is a cornerstone of pharmacovigilance. Unfamiliarity with the reporting of suspected adverse drug reactions (ADRs) is a major factor leading to under-reporting of these events. Medical education may promote more effective reporting. Numerous changes have been implemented in medical education over the last decade, with a shift in training methods from those aimed predominantly at the transfer of knowledge towards those that are more practice based and skill oriented. It is conceivable that these changes have an impact on pharmacovigilance training in vocational training programmes. Therefore, this study compares the effectiveness of a skill-oriented, practice-based pharmacovigilance training method with a traditional, lecture-based pharmacovigilance training method in the vocational training of general practitioners (GPs). The traditional, lecture-based method is common practice in the Netherlands. The purpose of this study was to establish whether the use of a practice-based, skill-oriented method in pharmacovigilance training during GP traineeship leads to an increase in reported ADRs after completion of this traineeship, compared with a lecture-based method. We also investigated whether the applied training method has an impact on the documentation level of the reports and on the number of unlabelled events reported. A retrospective cohort study. The number of ADR reports submitted to the Netherlands Pharmacovigilance Centre Lareb (between January 2006 and October 2010) after completion of GP vocational training was compared between the two groups. The documentation level of the reports and the number of labelled/unlabelled events reported were also compared. The practice-based cohort reported 32 times after completion of training (124 subjects; 6.8 reports per 1000 months of follow-up; total follow-up of 4704 months). The lecture-based cohort reported 12 times after training (135 subjects; 2.1 reports per 1000 months of follow-up; total follow-up of 5824 months) [odds ratio 2.9; 95% CI 1.4, 6.1]. Reports from GPs with practice-based training had a better documentation grade than those from GPs with lecture-based training, and more often concerned unlabelled events. The practice-based method resulted in significantly more and better-documented reports, which more often concerned unlabelled events, than the lecture-based method. This effect persisted and did not appear to diminish over time.

  5. What can neuromorphic event-driven precise timing add to spike-based pattern recognition?

    PubMed

    Akolkar, Himanshu; Meyer, Cedric; Clady, Xavier; Marre, Olivier; Bartolozzi, Chiara; Panzeri, Stefano; Benosman, Ryad

    2015-03-01

    This letter introduces a study to precisely measure what an increase in spike timing precision can add to spike-driven pattern recognition algorithms. The concept of generating spikes from images by converting gray levels into spike timings is currently at the basis of almost every spike-based model of biological visual systems. The use of images naturally leads to generating incorrect, artificial, and redundant spike timings and, more importantly, contradicts biological findings indicating that visual processing is massively parallel and asynchronous with high temporal resolution. A new concept for acquiring visual information through pixel-individual asynchronous level-crossing sampling has been proposed in a recent generation of asynchronous neuromorphic visual sensors. Unlike conventional cameras, these sensors acquire data not at fixed points in time for the entire array but at fixed amplitude changes of their input, resulting in data that are optimally sparse in space and time, pixel-individually and precisely timed only when new (previously unknown) information is available (event based). This letter uses the high temporal resolution spiking output of neuromorphic event-based visual sensors to show that lowering time precision degrades performance on several recognition tasks, specifically when reaching the conventional range of machine vision acquisition frequencies (30-60 Hz). The use of information theory to characterize separability between classes at each temporal resolution shows that high temporal acquisition provides up to 70% more information than conventional spikes generated from frame-based acquisition as used in standard artificial vision, thus drastically increasing the separability between classes of objects. Experiments on real data show that the amount of information loss is correlated with temporal precision. Our information-theoretic study highlights the potential of neuromorphic asynchronous visual sensors for both practical applications and theoretical investigations. Moreover, it suggests that representing visual information as a precise sequence of spike times, as reported in the retina, offers considerable advantages for neuro-inspired visual computations.

  6. Event (error and near-miss) reporting and learning system for process improvement in radiation oncology.

    PubMed

    Mutic, Sasa; Brame, R Scott; Oddiraju, Swetha; Parikh, Parag; Westfall, Melisa A; Hopkins, Merilee L; Medina, Angel D; Danieley, Jonathan C; Michalski, Jeff M; El Naqa, Issam M; Low, Daniel A; Wu, Bin

    2010-09-01

    The value of near-miss and error reporting processes in many industries is well appreciated and can typically be supported with data collected over time. While it is generally accepted that such processes are important in the radiation therapy (RT) setting, studies analyzing the effects of organized reporting and process improvement systems on operation and patient safety in individual clinics remain scarce. The purpose of this work is to report on the design and long-term use of an electronic reporting system in an RT department and to compare it to the paper-based reporting system it replaced. A specifically designed web-based system for reporting individual events in RT was clinically implemented in 2007. An event was defined as any occurrence that could have resulted, or had resulted, in a deviation in the delivery of patient care. The aim of the system was to support process improvement in patient care and safety. The reporting tool was designed so individual events could be quickly and easily reported without disrupting clinical work; this was very important because use of the system was voluntary. The spectrum of reported deviations extended from minor workflow issues (e.g., scheduling) to errors in treatment delivery. Reports were categorized based on the functional area, type, and severity of an event. The events were processed and analyzed by a formal process improvement group that used the data and statistics collected through the web-based tool for guidance in reengineering clinical processes. The reporting trends for the first 24 months with the electronic system were compared to the events reported in the same clinic with a paper-based system over a seven-year period. The reporting system and the process improvement structure resulted in increased event reporting, improved event communication, and improved identification of clinical areas that needed process and safety improvements. The reported data were also useful for the evaluation of corrective measures and the recognition of ineffective measures and efforts. The electronic system was relatively well accepted by personnel and resulted in minimal disruption of clinical work. Even in the quarters with the fewest reports, voluntary event reporting with the electronic system was almost four times greater than the highest quarterly total reported with the paper-based system, and reporting remained consistent from the inception of the process through the date of this report. However, acceptance was not universal, validating the need for improved education regarding reporting processes and for systematic approaches to developing a reporting culture. Specially designed electronic event reporting systems in a radiotherapy setting can provide valuable data for process and patient safety improvement and are more effective reporting mechanisms than paper-based systems. Additional work is needed to develop methods that can more effectively utilize reported data for process improvement, including the development of a standardized event taxonomy and a classification system for RT.

  7. Modeling Temporal Processes in Early Spacecraft Design: Application of Discrete-Event Simulations for Darpa's F6 Program

    NASA Technical Reports Server (NTRS)

    Dubos, Gregory F.; Cornford, Steven

    2012-01-01

    While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can, however, become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedules as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".

  8. A lengthy look at the daily grind: time series analysis of events, mood, stress, and satisfaction.

    PubMed

    Fuller, Julie A; Stanton, Jeffrey M; Fisher, Gwenith G; Spitzmuller, Christiane; Russell, Steven S; Smith, Patricia C

    2003-12-01

    The present study investigated processes by which job stress and satisfaction unfold over time by examining the relations between daily stressful events, mood, and these variables. Using a Web-based daily survey of stressor events, perceived strain, mood, and job satisfaction completed by 14 university workers, 1,060 occasions of data were collected. Transfer function analysis, a multivariate version of time series analysis, was used to examine the data for relationships among the measured variables after factoring out the contaminating influences of serial dependency. Results revealed a contrast effect in which a stressful event associated positively with higher strain on the same day and associated negatively with strain on the following day. Perceived strain increased over the course of a semester for a majority of participants, suggesting that effects of stress build over time. Finally, the data were consistent with the notion that job satisfaction is a distal outcome that is mediated by perceived strain. ((c) 2003 APA, all rights reserved)

  9. Joint model-based clustering of nonlinear longitudinal trajectories and associated time-to-event data analysis, linked by latent class membership: with application to AIDS clinical studies.

    PubMed

    Huang, Yangxin; Lu, Xiaosun; Chen, Jiaqing; Liang, Juan; Zangmeister, Miriam

    2017-10-27

    Longitudinal and time-to-event data are often observed together. Finite mixture models are currently used to analyze nonlinear heterogeneous longitudinal data; by relaxing the homogeneity restriction of nonlinear mixed-effects (NLME) models, they can cluster individuals into one of the pre-specified classes with class membership probabilities. This clustering may have clinical significance and be associated with clinically important time-to-event data. This article develops a joint modeling approach linking a finite mixture of NLME models for longitudinal data and a proportional hazards Cox model for time-to-event data through individual latent class indicators, under a Bayesian framework. The proposed joint models and method are applied to a real AIDS clinical trial data set, followed by simulation studies to assess the performance of the proposed joint model and a naive two-step model in which the finite mixture model and the Cox model are fitted separately.

  10. Infrasound Predictions Using the Weather Research and Forecasting Model: Atmospheric Green's Functions for the Source Physics Experiments 1-6.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poppeliers, Christian; Aur, Katherine Anderson; Preston, Leiph

    This report shows the results of constructing predictive atmospheric models for the Source Physics Experiments 1-6. Historic atmospheric data are combined with topography to construct an atmospheric model that corresponds to the predicted (or actual) time of a given SPE event. The models are ultimately used to construct atmospheric Green's functions for subsequent analysis. We present three atmospheric models for each SPE event: an average model based on ten one-hour snapshots of the atmosphere, and two extrema models corresponding to the warmest, coolest, windiest, etc., atmospheric snapshots. The atmospheric snapshots consist of wind, temperature, and pressure profiles of the atmosphere for a one-hour time window centered at the time of the predicted SPE event, as well as nine additional snapshots for each of the nine preceding years, centered at the time and day of the SPE event.

  11. Operational, Real-Time, Sun-to-Earth Interplanetary Shock Predictions During Solar Cycle 23

    NASA Astrophysics Data System (ADS)

    Fry, C. D.; Dryer, M.; Sun, W.; Deehr, C. S.; Smith, Z.; Akasofu, S.

    2002-05-01

    We report on our progress in predicting interplanetary shock arrival time (SAT) in real time, using three forecast models: the Hakamada-Akasofu-Fry (HAF) modified kinematic model, the Interplanetary Shock Propagation Model (ISPM), and the Shock Time of Arrival (STOA) model. These models are run concurrently to provide real-time predictions of the arrival time at Earth of interplanetary shocks caused by solar events. These "fearless forecasts" are the first, and presently only, publicly distributed predictions of SAT and are undergoing quantitative evaluation for operational utility and scientific benchmarking. All three models predict SAT, but the HAF model also provides a global view of the propagation of interplanetary shocks through the pre-existing, non-uniform heliospheric structure. This allows the forecaster to track the propagation of a shock and to differentiate between shocks caused by solar events and those associated with co-rotating interaction regions (CIRs). This study includes 173 events during the period February 1997 to October 2000. Shock predictions were compared with spacecraft observations at the L1 location to determine how well the models perform. Sixty-eight shocks were observed at L1 within 120 hours of an event. We concluded that 6 of these observed shocks were caused by CIRs, and the remainder were caused by solar events. The forecast skill of the models is presented in terms of RMS errors, contingency tables, and skill scores commonly used by the weather forecasting community. The false alarm rate for HAF was higher than for ISPM or STOA but much lower than for predictions based upon empirical studies or climatology. Of the parameters used to characterize a shock source at the Sun, the initial speed of the coronal shock, as represented by the observed metric type II speed, has the largest influence on the predicted SAT. We also found that HAF model predictions based upon type II speed are generally better for shocks originating from sites near central meridian, and worse for limb events. This tendency suggests that the observed type II speed is more representative of the interplanetary shock speed for events occurring near central meridian. In particular, the type II speed appears to underestimate the actual Earth-directed IP shock speed when the source of the event is near the limb. Several of the most interesting events (the Bastille Day epoch (2000) and the April Fools Day epoch (2001)) will be discussed in more detail with the use of real-time animations.
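
    Verification scores of the kind cited above follow directly from a 2x2 contingency table. The Python sketch below computes the probability of detection, false alarm ratio, and Heidke skill score; the counts are invented, and the authors' exact score definitions may differ.

        def verification_scores(hits, misses, false_alarms, correct_nulls):
            pod = hits / (hits + misses)                # probability of detection
            far = false_alarms / (hits + false_alarms)  # false alarm ratio
            n = hits + misses + false_alarms + correct_nulls
            # Heidke skill score: accuracy relative to random chance
            expected = ((hits + misses) * (hits + false_alarms)
                        + (correct_nulls + misses)
                        * (correct_nulls + false_alarms)) / n
            hss = (hits + correct_nulls - expected) / (n - expected)
            return pod, far, hss

        print(verification_scores(50, 18, 25, 80))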

  12. Phylogenetic framework for coevolutionary studies: a compass for exploring jungles of tangled trees.

    PubMed

    Martínez-Aquino, Andrés

    2016-08-01

    Phylogenetics is used to detect past evolutionary events, from how species originated to how their ecological interactions with other species arose, which can mirror cophylogenetic patterns. Cophylogenetic reconstructions uncover past ecological relationships between taxa through inferred coevolutionary events on trees, for example, codivergence, duplication, host-switching, and loss. These events can be detected by cophylogenetic analyses based on nodes and the length and branching pattern of the phylogenetic trees of symbiotic associations, for example, host-parasite. In the past 2 decades, algorithms have been developed for cophylogenetic analyses and implemented in different software, for example, statistical congruence indices and event-based methods. Based on the combination of these approaches, it is possible to integrate temporal information into cophylogenetic inference, such as estimates of lineage divergence times between 2 taxa, for example, hosts and parasites. Additionally, advances in phylogenetic biogeography applying methods based on parametric process models and combined Bayesian approaches can be useful for interpreting coevolutionary histories in a scenario of biogeographical area connectivity through time. This article briefly reviews the basics of parasitology and provides an overview of software packages implementing cophylogenetic methods. Thus, the objective here is to present a phylogenetic framework for coevolutionary studies, with special emphasis on groups of parasitic organisms. Researchers wishing to undertake phylogeny-based coevolutionary studies can use this review as a "compass" when "walking" through jungles of tangled phylogenetic trees.

  13. Phylogenetic framework for coevolutionary studies: a compass for exploring jungles of tangled trees

    PubMed Central

    2016-01-01

    Phylogenetics is used to detect past evolutionary events, from how species originated to how their ecological interactions with other species arose, which can mirror cophylogenetic patterns. Cophylogenetic reconstructions uncover past ecological relationships between taxa through inferred coevolutionary events on trees, for example, codivergence, duplication, host-switching, and loss. These events can be detected by cophylogenetic analyses based on nodes and the length and branching pattern of the phylogenetic trees of symbiotic associations, for example, host–parasite. In the past 2 decades, algorithms have been developed for cophylogenetic analyses and implemented in different software, for example, statistical congruence indices and event-based methods. Based on the combination of these approaches, it is possible to integrate temporal information into cophylogenetic inference, such as estimates of lineage divergence times between 2 taxa, for example, hosts and parasites. Additionally, advances in phylogenetic biogeography applying methods based on parametric process models and combined Bayesian approaches can be useful for interpreting coevolutionary histories in a scenario of biogeographical area connectivity through time. This article briefly reviews the basics of parasitology and provides an overview of software packages implementing cophylogenetic methods. Thus, the objective here is to present a phylogenetic framework for coevolutionary studies, with special emphasis on groups of parasitic organisms. Researchers wishing to undertake phylogeny-based coevolutionary studies can use this review as a “compass” when “walking” through jungles of tangled phylogenetic trees. PMID:29491928

  14. Sojourning with the Homogeneous Poisson Process.

    PubMed

    Liu, Piaomu; Peña, Edsel A

    2016-01-01

    In this pedagogical article, distributional properties, some of them surprising, of the homogeneous Poisson process (HPP) observed over a possibly random window are presented. Properties of the gap-time that covers the termination time, and the correlations among gap-times of the observed events, are obtained. Inference procedures, such as estimation and model validation, based on event occurrence data over the observation window are also presented. We envision that through the results in this paper a better appreciation of the subtleties involved in the modeling and analysis of recurrent event data will ensue, since the HPP is arguably one of the simplest among recurrent event models. In addition, the use of the theorem of total probability, Bayes theorem, the iterated rules of expectation, variance, and covariance, and the renewal equation could be illustrative when teaching distribution theory, mathematical statistics, and stochastic processes at both the undergraduate and graduate levels. This article is targeted towards both instructors and students.
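
    One of the "surprising" properties alluded to above, that the gap-time covering a fixed termination time is on average longer than a typical gap (the inspection paradox), is easy to check by simulation. The rate, window, and sample sizes in this Python sketch are arbitrary.

        import numpy as np

        rng = np.random.default_rng(1)
        lam, tau, covering = 2.0, 10.0, []
        for _ in range(20000):
            t = np.cumsum(rng.exponential(1 / lam, 60))  # HPP event times
            i = np.searchsorted(t, tau)                  # first event after tau
            left = t[i - 1] if i > 0 else 0.0
            covering.append(t[i] - left)                 # gap straddling tau
        print(np.mean(covering))  # near 2/lam = 1.0, not the mean gap 1/lam = 0.5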

  15. Observations on Rupture Behaviour of Fluid Induced Events at the Basel EGS Based on Empirical Green's Function Analysis

    NASA Astrophysics Data System (ADS)

    Folesky, J.; Kummerow, J.; Shapiro, S. A.; Asanuma, H.; Häring, M. O.

    2015-12-01

    The Empirical Green's Function (EGF) method uses pairs of events with highly similar waveforms and adjacent hypocenters to decompose the influences of the source time function, ray path, instrument site, and instrument response. The seismogram of the smaller event is considered the Green's function, which can then be deconvolved from the other seismogram. The result provides a reconstructed relative source time function (RSTF) of the larger event of that event pair. Comparison of the RSTFs at different stations of the observation system yields information on the rupture process of the larger event, based on the observation of the directivity effect and on changes in RSTF complexity. The Basel EGS dataset of 2006-2007 consists of about 2800 localized events of magnitudes between 0.0
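
    A common way to extract an RSTF from an event pair is water-level spectral deconvolution; the generic Python sketch below recovers a boxcar source function from synthetic traces. The water level and toy data are invented, and this is not the authors' processing code.

        import numpy as np

        def rstf_waterlevel(big, small, wl=0.01):
            """Deconvolve the smaller event (the EGF) from the larger one,
            stabilizing the spectral division with a water level."""
            n = len(big)
            B, S = np.fft.rfft(big, n), np.fft.rfft(small, n)
            denom = np.maximum(np.abs(S) ** 2, wl * np.max(np.abs(S) ** 2))
            return np.fft.irfft(B * np.conj(S) / denom, n)

        # Toy test: the "large event" is the "small event" convolved
        # with a boxcar relative source time function.
        rng = np.random.default_rng(0)
        small = rng.normal(size=256)
        rstf_true = np.zeros(256)
        rstf_true[:8] = 1.0
        big = np.fft.irfft(np.fft.rfft(small) * np.fft.rfft(rstf_true), 256)
        recovered = rstf_waterlevel(big, small)
        print(recovered[:8].round(2))  # energy concentrated near the true pulse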

  16. Intelligent fuzzy controller for event-driven real time systems

    NASA Technical Reports Server (NTRS)

    Grantner, Janos; Patyra, Marek; Stachowicz, Marian S.

    1992-01-01

    Most of the known linguistic models are essentially static, that is, time is not a parameter in describing the behavior of the object's model. In this paper we show a model for synchronous finite state machines based on fuzzy logic. Such finite state machines can be used to build both event-driven, time-varying, rule-based systems and the control unit section of a fuzzy logic computer. The architecture of a pipelined intelligent fuzzy controller is presented, and the linguistic model is represented by an overall fuzzy relation stored in a single rule memory. A VLSI integrated circuit implementation of the fuzzy controller is suggested. At a clock rate of 30 MHz, the controller can perform 3 MFLIPS on multi-dimensional fuzzy data.

  17. SYNAISTHISI: an IoT-powered smart visitor management and cognitive recommendations system

    NASA Astrophysics Data System (ADS)

    Thanos, Giorgos Konstandinos; Karafylli, Christina; Karafylli, Maria; Zacharakis, Dimitris; Papadimitriou, Apostolis; Dimitros, Kostantinos; Kanellopoulou, Konstantina; Kyriazanos, Dimitris M.; Thomopoulos, Stelios C. A.

    2016-05-01

    Location-based and navigation services are needed to help visitors and audiences at big events, complex buildings, shopping malls, airports, and large companies. However, the lack of GPS and proper mapping indoors usually renders location-based applications and services useless or simply not applicable in such environments. SYNAISTHISI introduces a mobile application for smartphones which offers navigation capabilities outside and inside buildings and across multiple floor levels. The application comes with a suite of helpful services, including personalized recommendations, visit/event management, and search functionality for navigating to a specific location, event, or person. As users find their way towards their destination, NFC-enabled checkpoints and Bluetooth beacons assist them, while offering re-routing, check-in/out capabilities, and useful information about ongoing meetings and nearby events. The application is supported by a back-end GIS system which can provide a broad and clear view to event organizers, campus managers, and field personnel for purposes of event logistics, safety, and security. The SYNAISTHISI system offers several competitive advantages, including (a) seamless navigation as users move between outdoor and indoor areas and different floor levels, using innovative routing algorithms; (b) connection to, and operation powered by, an IoT platform for localization and real-time information feedback; (c) dynamic personalized recommendations based on user profile, location, and real-time information provided by the IoT platform; and (d) indoor localization without the need for expensive infrastructure and installations.

  18. Multidetector system for nanosecond tagged neutron technology based on hardware selection of events

    NASA Astrophysics Data System (ADS)

    Karetnikov, M. D.; Korotkov, S. A.; Khasaev, T. O.

    2016-09-01

    In the T(d,n)4He reaction, a neutron is accompanied by an associated alpha particle emitted in the opposite direction. The time and direction of the neutron's escape can be determined by measuring the time and coordinates of the alpha particle at a position-sensitive alpha detector. The nanosecond tagged neutron technology (NTNT) based on this principle has great potential for various applications, e.g., the remote detection of explosives. The spectrum of gamma rays emitted by the interaction of tagged neutrons with the nuclei of chemical elements allows identification of the chemical composition of an irradiated object. For the practical realization of NTNT, the time resolution for recording the alpha-gamma coincidences should be close to 1 ns. The total intensity of signals can exceed 1 × 10⁶ s⁻¹ from all gamma detectors and 7 × 10⁶ s⁻¹ from the alpha detector. Processing such a stream of data without losses or distortion of information is one of the challenging problems of NTNT. Several models of an analog DAQ system based on hardware selection of events were devised, and their characteristics are examined. Comparison with digital DAQ systems demonstrated that the analog DAQ provides better timing parameters, lower power consumption, and a higher maximum rate of useful events.
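
    The core selection step, accepting only gamma events inside a nanosecond coincidence window after an alpha trigger, can be sketched in software even though NTNT implements it in hardware. The timestamps and window below are synthetic.

        def tagged_coincidences(alpha_ts, gamma_ts, window=(2.0, 12.0)):
            """Return (alpha, gamma) pairs whose delay lies in the window
            (ns), i.e. consistent with a tagged-neutron time of flight.
            Both timestamp lists must be sorted in ascending order."""
            pairs, j = [], 0
            for a in alpha_ts:
                while j < len(gamma_ts) and gamma_ts[j] < a + window[0]:
                    j += 1                       # skip gammas before the window
                k = j
                while k < len(gamma_ts) and gamma_ts[k] <= a + window[1]:
                    pairs.append((a, gamma_ts[k]))
                    k += 1
            return pairs

        print(tagged_coincidences([100.0, 500.0], [104.5, 180.0, 508.0, 560.0]))
        # -> [(100.0, 104.5), (500.0, 508.0)]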

  19. CHELSI: a portable neutron spectrometer for the 20-800 MeV region.

    PubMed

    McLean, T D; Olsher, R H; Romero, L L; Miles, L H; Devine, R T; Fallu-Labruyere, A; Grudberg, P

    2007-01-01

    CHELSI is a CsI-based portable spectrometer being developed at Los Alamos National Laboratory for use in high-energy neutron fields. Exploiting the inherent pulse shape discrimination properties of CsI(Tl), the instrument flags charged-particle events produced via neutron-induced spallation. Scintillation events are processed in real time using digital signal processing, and a conservative estimate of the neutron dose rate is made based on the charged-particle energy distribution. A more accurate dose estimate can be made by unfolding the 2D charged-particle versus pulse-height distribution to reveal the incident neutron spectrum, from which dose is readily obtained. A prototype probe has been assembled and data collected in quasi-monoenergetic fields at The Svedberg Laboratory (TSL) in Uppsala as well as at the Los Alamos Neutron Science Center (LANSCE). Preliminary efforts at deconvoluting the shape/energy data using empirical response functions derived from time-of-flight measurements are described.

  20. Incorporating Real-time Earthquake Information into Large Enrollment Natural Disaster Course Learning

    NASA Astrophysics Data System (ADS)

    Furlong, K. P.; Benz, H.; Hayes, G. P.; Villasenor, A.

    2010-12-01

    Although most would agree that the occurrence of natural disaster events such as earthquakes, volcanic eruptions, and floods can provide effective learning opportunities for natural hazards-based courses, implementing compelling materials into the large-enrollment classroom environment can be difficult. These natural hazard events derive much of their learning potential from their real-time nature, and in the modern 24/7 news-cycle where all but the most devastating events are quickly out of the public eye, the shelf life for an event is quite limited. To maximize the learning potential of these events requires that both authoritative information be available and course materials be generated as the event unfolds. Although many events such as hurricanes, flooding, and volcanic eruptions provide some precursory warnings, and thus one can prepare background materials to place the main event into context, earthquakes present a particularly confounding situation of providing no warning, but where context is critical to student learning. Attempting to implement real-time materials into large enrollment classes faces the additional hindrance of limited internet access (for students) in most lecture classrooms. In Earth 101 Natural Disasters: Hollywood vs Reality, taught as a large enrollment (150+ students) general education course at Penn State, we are collaborating with the USGS’s National Earthquake Information Center (NEIC) to develop efficient means to incorporate their real-time products into learning activities in the lecture hall environment. Over time (and numerous events) we have developed a template for presenting USGS-produced real-time information in lecture mode. The event-specific materials can be quickly incorporated and updated, along with key contextual materials, to provide students with up-to-the-minute current information. In addition, we have also developed in-class activities, such as student determination of population exposure to severe ground shaking (i.e. simulating the USGS PAGER product), tsunami warning calculations, and building damage analyses that allow the students to participate in realistic hazard analyses as the event unfolds. Examples of these templates and activities will be presented. Key to the successful implementation of real-time materials is sufficient flexibility and adaptability in the course syllabus.

  1. Real-time Upstream Monitoring System: Using ACE Data to Predict the Arrival of Interplanetary Shocks

    NASA Astrophysics Data System (ADS)

    Donegan, M. M.; Wagstaff, K. L.; Ho, G. C.; Vandegriff, J.

    2003-12-01

    We have developed an algorithm to predict Earth arrival times for interplanetary (IP) shock events originating at the Sun. Our predictions are generated from real-time data collected by the Electron, Proton, and Alpha Monitor (EPAM) instrument on NASA's Advanced Composition Explorer (ACE) spacecraft. The high intensities of energetic ions that occur prior to and during an IP shock pose a radiation hazard to astronauts as well as to electronics in Earth orbit. The potential to predict such events is based on characteristic signatures in the Energetic Storm Particle (ESP) event ion intensities, which are often associated with IP shocks. We have previously reported on the development and implementation of an algorithm to forecast the arrival of ESP events. Historical ion data from ACE/EPAM were used to train an artificial neural network which uses the signature of an approaching event to predict the time remaining until the shock arrives. Tests on the trained network have been encouraging, with an average error of 9.4 hours for predictions made 24 hours in advance, and a reduced average error of 4.9 hours when the shock is 12 hours away. The prediction engine has been integrated into a web-based system that uses real-time ACE/EPAM data provided by the NOAA Space Environment Center (http://sd-www.jhuapl.edu/UPOS/RISP/index.html). This system continually processes the latest ACE data, reports whether or not there is an impending shock, and predicts the time remaining until shock arrival. Our predictions are updated every five minutes and provide significant lead time, thereby supplying critical information that can be used by mission planners, satellite operations controllers, and scientists. We have continued to refine the prediction capabilities of this system; in addition to forecasting arrival times for shocks, we now provide confidence estimates for those predictions.

  2. Real-time determination of the efficacy of residual disinfection to limit wastewater contamination in a water distribution system using filtration-based luminescence.

    PubMed

    Lee, Jiyoung; Deininger, Rolf A

    2010-05-01

    Water distribution systems can be vulnerable to microbial contamination through cross-connections, wastewater backflow, the intrusion of soiled water after a loss of pressure resulting from an electricity blackout or natural disaster, or intentional contamination of the system in a bioterrorism event. The most urgent matter a water treatment utility would face in this situation is detecting the presence and extent of a contamination event in real time, so that immediate action can be taken to mitigate the problem. The currently approved microbiological detection methods are culture-based plate count methods, which require incubation time (1 to 7 days); such a long period is not useful for the protection of public health. This study was designed to simulate wastewater intrusion in a water distribution system. The objectives were twofold: (1) real-time detection of water contamination, and (2) investigation of the ability of drinking water systems to suppress the contamination with secondary disinfectant residuals (chlorine and chloramine). Drinking water contamination events resulting from a wastewater addition were detected by a filtration-based luminescence assay. The water contamination was detected by the luminescence method within 5 minutes. The signal amplification attributable to wastewater contamination was clear: a 10²-fold signal increase. After 1 hour, chlorinated water could inactivate 98.8% of the bacterial contaminant, while chloraminated water reduced it by 77.2%.

  3. Updated Intensity - Duration - Frequency Curves Under Different Future Climate Scenarios

    NASA Astrophysics Data System (ADS)

    Ragno, E.; AghaKouchak, A.

    2016-12-01

    Current infrastructure design procedures rely on the use of Intensity-Duration-Frequency (IDF) curves derived under the assumption of temporal stationarity, meaning that the occurrences of extreme events are expected to be time invariant. However, numerous studies have observed more severe extreme events over time; hence, the stationarity assumption for extreme value analysis may not be appropriate in a warming climate. This issue raises concerns regarding the safety and resilience of existing and future infrastructure. Here we employ historical and projected (RCP 8.5) CMIP5 runs to investigate IDF curves for 14 urban areas across the United States. We first statistically assess changes in precipitation extremes using an energy-based test for equal distributions. Then, through a Bayesian inference approach for stationary and non-stationary extreme value analysis, we provide updated IDF curves based on climate model projections. This presentation summarizes the projected changes in the statistics of extremes. We show that, based on CMIP5 simulations, extreme precipitation events in some urban areas can be 20% more severe in the future, even when projected annual mean precipitation is expected to remain similar to the ground-based climatology.
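
    The stationary building block of an IDF update, fitting an extreme value distribution to annual-maximum intensities and reading off a return level, can be sketched with SciPy. The data below are synthetic, and the study's non-stationary Bayesian analysis (time-varying parameters) is not reproduced.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        # Synthetic annual-maximum rainfall intensities (mm/h)
        annual_max = stats.genextreme.rvs(c=-0.1, loc=30, scale=8,
                                          size=60, random_state=rng)
        shape, loc, scale = stats.genextreme.fit(annual_max)
        # 100-year return level: the 1 - 1/100 quantile of the fitted GEV
        rl_100 = stats.genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)
        print(f"100-year intensity: {rl_100:.1f} mm/h")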

  4. Effects of Learned Episodic Event Structure on Prospective Duration Judgments

    ERIC Educational Resources Information Center

    Faber, Myrthe; Gennari, Silvia P.

    2017-01-01

    The field of psychology of time has typically distinguished between prospective timing and retrospective duration estimation: in prospective timing, participants attend to and encode time, whereas in retrospective estimation, estimates are based on the memory of what happened. Prior research on prospective timing has primarily focused on…

  5. The fluid events model: Predicting continuous task action change.

    PubMed

    Radvansky, Gabriel A; D'Mello, Sidney; Abbott, Robert G; Morgan, Brent; Fike, Karl; Tamplin, Andrea K

    2015-01-01

    The fluid events model is a behavioural model aimed at predicting the likelihood that people will change their actions in ongoing, interactive events. From this view, not only are people responding to aspects of the environment, but they are also basing responses on prior experiences. The fluid events model is an attempt to predict the likelihood that people will shift the type of actions taken within an event on a trial-by-trial basis, taking into account both event structure and experience-based factors. The event-structure factors are: (a) changes in event structure, (b) suitability of the current action to the event, and (c) time on task. The experience-based factors are: (a) whether a person has recently shifted actions, (b) how often a person has shifted actions, (c) whether there has been a dip in performance, and (d) a person's propensity to switch actions within the current task. The model was assessed using data from a series of tasks in which a person was producing responses to events. These were two stimulus-driven figure-drawing studies, a conceptually driven decision-making study, and a probability matching study using a standard laboratory task. This analysis predicted trial-by-trial action switching in a person-independent manner with an average accuracy of 70%, which reflects a 34% improvement above chance. In addition, correlations between overall switch rates and actual switch rates were remarkably high (mean r = .98). The experience-based factors played a more major role than the event-structure factors, but this might be attributable to the nature of the tasks.

  6. A temporal discriminability account of children's eyewitness suggestibility.

    PubMed

    Bright-Paul, Alexandra; Jarrold, Christopher

    2009-07-01

    Children's suggestibility is typically measured using a three-stage 'event-misinformation-test' procedure. We examined whether suggestibility is influenced by the time delays imposed between these stages, and in particular whether the temporal discriminability of sources (event and misinformation) predicts performance. In a novel approach, the degree of source discriminability was calculated as the relative magnitude of two intervals (the ratio of event-misinformation and misinformation-test intervals), based on an adaptation of existing 'ratio-rule' accounts of memory. Five-year-olds (n = 150) watched an event and were exposed to misinformation before memory for source was tested. The absolute event-test delay (12 versus 24 days) and the 'ratio' of event-misinformation/misinformation-test intervals (11:1, 3:1, 1:1, 1:3 and 1:11) were manipulated across participants. The temporal discriminability of sources, measured by the ratio, was indeed a strong predictor of suggestibility. Most importantly, if the ratio was constant (e.g. 18/6 versus 9/3 days), performance was remarkably similar despite variations in absolute delay (e.g. 24 versus 12 days). This intriguing finding not only extends the ratio-rule of distinctiveness to misinformation paradigms, but also illustrates a new empirical means of differentiating between explanations of suggestibility based on interference between sources and on disintegration of source information over time.

  7. Temporal models for the episodic volcanism of Campi Flegrei caldera (Italy) with uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Bevilacqua, Andrea; Flandoli, Franco; Neri, Augusto; Isaia, Roberto; Vitale, Stefano

    2016-11-01

    After the large-scale event of the Neapolitan Yellow Tuff (~15 ka B.P.), intense and mostly explosive volcanism has occurred within and along the boundaries of the Campi Flegrei caldera (Italy). Eruptions occurred closely spaced in time, over periods from a few centuries to a few millennia, alternating with periods of quiescence lasting up to several millennia. Events often also occurred closely in space, generating clusters of events. This study had two main objectives: (1) to describe the uncertainty in the geologic record using a quantitative model and (2) to develop, based on that uncertainty assessment, a long-term subdomain-specific temporal probability model that describes the temporal and spatial eruptive behavior of the caldera. In particular, the study adopts a space-time doubly stochastic nonhomogeneous Poisson-type model with a local self-excitation feature able to generate clusters of events consistent with the reconstructed record of Campi Flegrei. Results allow the evaluation of similarities and differences between the three epochs of activity, as well as derivation of the eruptive base rate of the caldera and its capacity to generate clusters of events. The temporal probability model is also used to investigate the effect of the most recent eruption, Monte Nuovo (A.D. 1538), on a possible reactivation of the caldera and to estimate the time to the next eruption under different volcanological and modeling assumptions.
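
    A minimal sketch of the model class named here, assuming a Hawkes-type self-exciting Poisson process simulated with Ogata's thinning algorithm; the base rate and excitation parameters are invented, and the published model's doubly stochastic and spatial components are omitted.

```python
# Illustrative self-exciting point process: each event raises the intensity,
# producing the clustering-in-time behaviour described in the abstract.
import numpy as np

def simulate_hawkes(mu, alpha, beta, t_max, rng):
    """Events with intensity lambda(t) = mu + sum_i alpha*exp(-beta*(t-t_i))."""
    events, t = [], 0.0
    while t < t_max:
        # The current intensity bounds future intensity until the next event,
        # because the exponential kernel only decays between events.
        lam_bar = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        t += rng.exponential(1.0 / lam_bar)
        lam_t = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        if rng.random() < lam_t / lam_bar and t < t_max:
            events.append(t)  # accepted event; it excites future intensity
    return np.array(events)

rng = np.random.default_rng(42)
# Base rate of 0.5 events per kyr over a 15 kyr record, with clustering.
times = simulate_hawkes(mu=0.5, alpha=1.2, beta=2.0, t_max=15.0, rng=rng)
print(f"{times.size} events; inter-event times cluster in bursts")
```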

  8. Nearly suppressed photoluminescence blinking of small-sized, blue-green-orange-red emitting single CdSe-based core/gradient alloy shell/shell quantum dots: correlation between truncation time and photoluminescence quantum yield.

    PubMed

    Roy, Debjit; Mandal, Saptarshi; De, Chayan K; Kumar, Kaushalendra; Mandal, Prasun K

    2018-04-18

    CdSe-based core/gradient alloy shell/shell semiconductor quantum dots (CGASS QDs) have been shown to be optically superior to core-shell QDs. However, very little is known about CGASS QDs at the single-particle level. Photoluminescence blinking dynamics of four differently emitting (blue (λem = 510 nm), green (λem = 532 nm), orange (λem = 591 nm), and red (λem = 619 nm)) single CGASS QDs with average sizes <~7 nm have been probed with our home-built total internal reflection fluorescence (TIRF) microscope. All four samples possess an average ON-fraction of 0.70-0.85, which hints at nearly suppressed PL blinking in these gradiently alloyed systems. Suppression of blinking has so far been achieved with QDs larger than 10 nm and mostly emitting in the red region (λem > 600 nm). In this manuscript, we report nearly suppressed PL blinking of CGASS QDs with average sizes <~7 nm emitting across the entire visible spectrum, i.e. from blue to green to orange to red. The probability density distributions of both ON- and OFF-event durations for all of these CGASS QDs could be fitted well with a truncated power law modified with an additional exponential term. It has been found that, unlike most literature reports, the power law exponent for OFF-event durations is greater than that for ON-event durations for all four samples. This suggests that relatively long ON-event durations are interrupted by comparatively short OFF-event durations, which in turn is indicative of a suppressed non-radiative Auger recombination process in these CGASS systems. However, across the four samples the ON-event truncation time varies inversely with the OFF-event truncation time, hinting that both truncation processes are dictated by some common factor. We have employed 2D joint probability distribution analysis to probe the correlation between event durations and found that residual memory exists in both the ON- and OFF-event durations. Positively correlated successive ON-ON and OFF-OFF event durations and anti-correlated ON-OFF event durations perhaps suggest the involvement of more than one type of trapping process within the blinking framework. The timescale corresponding to the additional exponential term has been assigned to hole trapping for the ON-event duration statistics; for the OFF-event duration statistics, this component suggests hole detrapping. We found that the average duration of the exponential process for ON-event durations is an order of magnitude higher than that for OFF-event durations, indicating that holes are trapped for a significantly long time. When electron trapping is followed by such hole trapping, long ON-event durations result; we have observed ON-event durations as long as 50 s. The competing charge tunnelling model has been used to account for the observed blinking behaviour in these CGASS QDs. Quite interestingly, the PLQY of all of these differently emitting QDs (an ensemble-level property) could be correlated with the truncation time (a single-particle-level property). A concomitant increase/decrease of ON/OFF event truncation times with increasing PLQY is also indicative of a varying degree of suppression of the Auger recombination processes in these four CGASS QDs.
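
    As a hedged illustration of the fitting step described, the sketch below fits a truncated power law P(t) ∝ t^(-α)·exp(-t/τ) to a synthetic duration sample on a log-binned histogram; the data generation and initial guesses are assumptions, not the authors' procedure.

```python
# Sketch on synthetic data: extracting a power-law exponent and truncation
# time from ON/OFF duration statistics.
import numpy as np
from scipy.optimize import curve_fit

def truncated_power_law(t, A, alpha, tau):
    return A * t**(-alpha) * np.exp(-t / tau)

rng = np.random.default_rng(7)
# Synthetic durations: a power-law body with an exponential soft cutoff
# (keeping each sample with probability exp(-t/5)).
t = rng.pareto(0.7, 20000) + 0.01
t = t[t < rng.exponential(5.0, t.size)]

bins = np.logspace(np.log10(t.min()), np.log10(t.max()), 30)
hist, edges = np.histogram(t, bins=bins, density=True)
centers = np.sqrt(edges[:-1] * edges[1:])  # geometric bin centers
mask = hist > 0

popt, _ = curve_fit(truncated_power_law, centers[mask], hist[mask],
                    p0=[1.0, 1.5, 5.0], maxfev=10000)
A, alpha, tau = popt
print(f"exponent alpha = {alpha:.2f}, truncation time tau = {tau:.2f} s")
```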

  9. Optimal Futility Interim Design: A Predictive Probability of Success Approach with Time-to-Event Endpoint.

    PubMed

    Tang, Zhongwen

    2015-01-01

    An analytical method to compute the predictive probability of success (PPOS), together with its credible interval, at interim analysis (IA) is developed for large clinical trials with time-to-event endpoints. The method takes into account the observed data up to the IA, the amount of uncertainty in future data, and uncertainty about parameters. Predictive power is a special type of PPOS. The result is confirmed by simulation. An optimal design is proposed by finding the optimal combination of analysis time and futility cutoff based on PPOS criteria.
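
    The paper derives the PPOS analytically; the following Monte Carlo sketch only illustrates the concept for a time-to-event endpoint, using a normal approximation to the log hazard ratio and invented interim numbers.

```python
# Conceptual PPOS simulation: draw the "truth" from the interim posterior,
# simulate the remaining data, and check whether the final test succeeds.
import numpy as np

rng = np.random.default_rng(3)
n_sims = 5000
d_interim, d_final = 150, 300        # events observed at IA / planned total
loghr_hat = -0.25                    # interim estimate of the log hazard ratio
se_interim = np.sqrt(4 / d_interim)  # SE of log HR with 1:1 allocation
se_final = np.sqrt(4 / d_final)
z_crit = 1.96

success = 0
for _ in range(n_sims):
    # Posterior draw of the true log HR (flat prior, normal approximation).
    theta = rng.normal(loghr_hat, se_interim)
    # Predict the estimate from the not-yet-observed events.
    se_future = np.sqrt(4 / (d_final - d_interim))
    loghr_future = rng.normal(theta, se_future)
    # Pool interim and future estimates by event-count weighting.
    w1, w2 = d_interim, d_final - d_interim
    loghr_final = (w1 * loghr_hat + w2 * loghr_future) / (w1 + w2)
    success += (abs(loghr_final) / se_final > z_crit) and (loghr_final < 0)

print(f"PPOS ~= {success / n_sims:.2f}")
```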

  10. Regression Analysis of Mixed Recurrent-Event and Panel-Count Data with Additive Rate Models

    PubMed Central

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L.

    2015-01-01

    Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007; Zhao et al., 2011). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013). In this paper, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. PMID:25345405

  11. Hybrid Markov-mass action law model for cell activation by rare binding events: Application to calcium induced vesicular release at neuronal synapses.

    PubMed

    Guerrier, Claire; Holcman, David

    2016-10-18

    Binding of molecules, ions or proteins to small target sites is a generic step of cell activation. This process relies on rare stochastic events in which a particle located in a large bulk has to find small and often hidden targets. We present here a hybrid discrete-continuum model that takes into account a stochastic regime governed by rare events and a continuous regime in the bulk. The rare discrete binding events are modeled by a Markov chain for the encounter of small targets by few Brownian particles, for which the arrival time is Poissonian; the large ensemble of particles is described by mass action laws. We use this novel model to predict the time distribution of vesicular release at neuronal synapses. Vesicular release is triggered by the binding of a few calcium ions that can originate either from the synaptic bulk or from entry through calcium channels. We report here that the distribution of release times is bimodal even though release is triggered by a single fast action potential. While the first peak follows the stimulation, the second corresponds to the random arrival, over a much longer time, of ions located in the synaptic terminal at the small vesicular binding targets. To conclude, the present multiscale stochastic modeling approach allows the study of cellular events by integrating discrete molecular events over several time scales.

  12. Estimating the effect of a rare time-dependent treatment on the recurrent event rate.

    PubMed

    Smith, Abigail R; Zhu, Danting; Goodrich, Nathan P; Merion, Robert M; Schaubel, Douglas E

    2018-05-30

    In many observational studies, the objective is to estimate the effect of treatment or state-change on the recurrent event rate. If treatment is assigned after the start of follow-up, traditional methods (e.g., adjustment for baseline-only covariates or fully conditional adjustment for time-dependent covariates) may give biased results. We propose a two-stage modeling approach using the method of sequential stratification to accurately estimate the effect of a time-dependent treatment on the recurrent event rate. At the first stage, we estimate the pretreatment recurrent event trajectory using a proportional rates model censored at the time of treatment. Prognostic scores are estimated from the linear predictor of this model and used to match treated patients to as-yet-untreated controls based on the prognostic score at the time of treatment for the index patient. The final model is stratified on matched sets and compares the posttreatment recurrent event rate to the recurrent event rate of the matched controls. We demonstrate through simulation that bias due to dependent censoring is negligible provided the treatment frequency is low, and we investigate the threshold at which correction for dependent censoring is needed. The method is applied to liver transplant (LT), where we estimate the effect of the development of post-LT end-stage renal disease (ESRD) on the rate of days hospitalized. Copyright © 2018 John Wiley & Sons, Ltd.

  13. Event-Based Plausibility Immediately Influences On-Line Language Comprehension

    ERIC Educational Resources Information Center

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L.; Scheepers, Christoph; McRae, Ken

    2011-01-01

    In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional…

  14. 14 CFR § 1274.204 - Costs and payments.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... with NASA (e.g., Department of Defense or Federal Aviation Administration), the resources contributed... accomplishment by the recipient of predetermined tangible milestones. Any arrangement where payments are made on... accomplishment of verifiable, significant event(s) and may not be based upon the mere passage of time or the...

  15. Spatial variability of excess mortality during prolonged dust events in a high-density city: a time-stratified spatial regression approach.

    PubMed

    Wong, Man Sing; Ho, Hung Chak; Yang, Lin; Shi, Wenzhong; Yang, Jinxin; Chan, Ta-Chien

    2017-07-24

    Dust events have long been recognized to be associated with a higher mortality risk. However, no study has investigated how prolonged dust events affect the spatial variability of mortality across districts in a downwind city. In this study, we applied a spatial regression approach to estimate district-level mortality during two extreme dust events in Hong Kong. We compared spatial and non-spatial models to evaluate the ability of each regression to estimate mortality. We also compared prolonged dust events with non-dust events to determine the influences of community factors on mortality across the city. The density of the built environment (estimated by the sky view factor) had a positive association with excess mortality in each district, while socioeconomic deprivation, driven by lower income and lower education, induced a higher mortality impact in each territory planning unit during a prolonged dust event. Based on the model comparison, spatial error modelling with first-order queen contiguity consistently outperformed the other models. The high-risk areas with the greatest increases in mortality were located in urban high-density environments with higher socioeconomic deprivation. Our model design shows the ability to predict spatial variability of mortality risk during an extreme weather event that cannot be estimated with traditional time-series analysis or ecological studies. Our spatial protocol can be used for public health surveillance, sustainable planning and disaster preparation when relevant data are available.
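
    A hedged sketch of the modelling step, assuming PySAL's libpysal/spreg API and a hypothetical GeoDataFrame with invented file and column names; it shows the generic spatial-error recipe with first-order queen contiguity, not the authors' code.

```python
# Spatial error model: y = X*beta + u, with u = lambda*W*u + e.
import geopandas as gpd
from libpysal.weights import Queen
from spreg import ML_Error

districts = gpd.read_file("districts.shp")   # hypothetical input file
w = Queen.from_dataframe(districts)          # 1st-order queen contiguity
w.transform = "r"                            # row-standardize the weights

y = districts[["excess_mortality"]].values   # hypothetical column names
X = districts[["sky_view_factor", "low_income", "low_education"]].values

model = ML_Error(y, X, w=w, name_y="excess_mortality",
                 name_x=["sky_view_factor", "low_income", "low_education"])
print(model.summary)
```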

  16. Adaptive Information Dissemination Control to Provide Diffdelay for the Internet of Things.

    PubMed

    Liu, Xiao; Liu, Anfeng; Huang, Changqin

    2017-01-12

    Applications running on the Internet of Things, such as the Wireless Sensor and Actuator Networks (WSANs) platform, generally have different quality of service (QoS) requirements. For urgent events, it is crucial that information be reported to the actuator quickly, with communication cost a secondary factor; for merely interesting events, communication cost, network lifetime and time all become important factors. In most situations, these different requirements cannot be satisfied simultaneously. In this paper, an adaptive communication control based on differentiated delay (ACCDS) scheme is proposed to resolve this conflict. In ACCDS, the source nodes of events adaptively send varying numbers of searching-actuator routings (SARs) based on the event's sensitivity to delay, while maintaining network lifetime. For a delay-sensitive event, the source node sends a large number of SARs to identify and inform the actuators in an extremely short time; action can thus be taken quickly, but at a higher communication cost. For delay-insensitive events, the source node sends fewer SARs to reduce communication costs and improve network lifetime. ACCDS can therefore meet the QoS requirements of different events within a differentiated-delay framework. Theoretical analysis and simulation results indicate that ACCDS provides differentiated delay and communication-cost services; the scheme can reduce network delay by 11.111%-53.684% for delay-sensitive events, reduce communication costs by 5%-22.308% for interesting events, and reduce the network lifetime by about 28.713%.
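
    The toy function below illustrates the core ACCDS trade-off as described: more SARs for delay-sensitive events, fewer for delay-insensitive ones. All thresholds and counts are invented for illustration.

```python
# Illustrative sketch of delay-differentiated SAR dissemination.
def sars_to_send(delay_sensitivity: float, base_sars: int = 2,
                 max_sars: int = 20) -> int:
    """delay_sensitivity in [0, 1]: 0 = interesting event, 1 = urgent event."""
    if not 0.0 <= delay_sensitivity <= 1.0:
        raise ValueError("delay_sensitivity must be in [0, 1]")
    # Urgent events flood many SARs to find an actuator quickly; relaxed
    # events send few SARs to save energy and extend network lifetime.
    return base_sars + round(delay_sensitivity * (max_sars - base_sars))

print(sars_to_send(1.0))   # urgent: 20 SARs, minimal delay, high cost
print(sars_to_send(0.1))   # interesting: 4 SARs, low cost, longer delay
```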

  17. Adaptive Information Dissemination Control to Provide Diffdelay for the Internet of Things

    PubMed Central

    Liu, Xiao; Liu, Anfeng; Huang, Changqin

    2017-01-01

    Applications running on the Internet of Things, such as the Wireless Sensor and Actuator Networks (WSANs) platform, generally have different quality of service (QoS) requirements. For urgent events, it is crucial that information be reported to the actuator quickly, with communication cost a secondary factor; for merely interesting events, communication cost, network lifetime and time all become important factors. In most situations, these different requirements cannot be satisfied simultaneously. In this paper, an adaptive communication control based on differentiated delay (ACCDS) scheme is proposed to resolve this conflict. In ACCDS, the source nodes of events adaptively send varying numbers of searching-actuator routings (SARs) based on the event's sensitivity to delay, while maintaining network lifetime. For a delay-sensitive event, the source node sends a large number of SARs to identify and inform the actuators in an extremely short time; action can thus be taken quickly, but at a higher communication cost. For delay-insensitive events, the source node sends fewer SARs to reduce communication costs and improve network lifetime. ACCDS can therefore meet the QoS requirements of different events within a differentiated-delay framework. Theoretical analysis and simulation results indicate that ACCDS provides differentiated delay and communication-cost services; the scheme can reduce network delay by 11.111%–53.684% for delay-sensitive events, reduce communication costs by 5%–22.308% for interesting events, and reduce the network lifetime by about 28.713%. PMID:28085097

  18. Regression analysis of mixed recurrent-event and panel-count data with additive rate models.

    PubMed

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L

    2015-03-01

    Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007, The Statistical Analysis of Recurrent Events. New York: Springer-Verlag; Zhao et al., 2011, Test 20, 1-42). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013, Statistics in Medicine 32, 1954-1963). In this article, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. © 2014, The International Biometric Society.

  19. Real-time detection and classification of anomalous events in streaming data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferragut, Erik M.; Goodall, John R.; Iannacone, Michael D.

    2016-04-19

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The events can be displayed to a user in user-defined groupings in an animated fashion. The system can include a plurality of anomaly detectors that together implement an algorithm to identify low probability events and detect atypical traffic patterns. The atypical traffic patterns can then be classified as being of interest or not. In one particular example, in a network environment, the classification can be whether the network traffic is malicious or not.
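
    A conceptual sketch of the scoring-then-classifying pipeline described, with a simple Gaussian baseline standing in for the plurality of anomaly detectors; the model, features, and threshold are illustrative assumptions, not the patented system.

```python
# Score streaming events by anomalousness (low probability under a baseline
# model), then pass anomalous ones to a second-stage classifier.
import numpy as np

rng = np.random.default_rng(5)
baseline = rng.normal(100.0, 10.0, 10000)   # historical feature values
mu, sigma = baseline.mean(), baseline.std()

def anomaly_score(x: float) -> float:
    """Negative log-density under the baseline model: higher = rarer."""
    return 0.5 * ((x - mu) / sigma) ** 2 + np.log(sigma * np.sqrt(2 * np.pi))

def classify(x: float, threshold: float = 8.0) -> str:
    if anomaly_score(x) < threshold:
        return "typical"
    # A second-stage classifier would decide malicious vs. benign here;
    # a fixed rule stands in for it in this sketch.
    return "anomalous: flag for maliciousness classification"

for event in [102.0, 95.0, 160.0]:
    print(event, "->", classify(event))
```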

  20. Engine control system having fuel-based timing

    DOEpatents

    Willi, Martin L [Dunlap, IL; Fiveland, Scott B [Metamora, IL; Montgomery, David T [Edelstein, IL; Gong, Weidong [Dunlap, IL

    2012-04-03

    A control system is disclosed for an engine having a cylinder. The system has an engine valve movable to regulate a fluid flow of the cylinder, an actuator associated with the engine valve, a sensor configured to generate a signal indicative of the amount of air/fuel mixture remaining within the cylinder after completion of a first combustion event, and a controller in communication with the actuator and the sensor. The controller may be configured to compare the amount with a desired amount, and to selectively regulate the actuator to adjust the timing of the engine valve associated with a subsequent combustion event based on the comparison.

  1. Performance of Earthquake Early Warning Systems during the Major Events of the 2016-2017 Central Italy Seismic Sequence.

    NASA Astrophysics Data System (ADS)

    Festa, G.; Picozzi, M.; Alessandro, C.; Colombelli, S.; Cattaneo, M.; Chiaraluce, L.; Elia, L.; Martino, C.; Marzorati, S.; Supino, M.; Zollo, A.

    2017-12-01

    Earthquake early warning systems (EEWS) now contribute to seismic risk mitigation, both in terms of losses and societal resilience, by issuing an alert promptly after the earthquake origin and before the ground shaking impacts the targets to be protected. EEWS can be grouped into two main classes: network-based and stand-alone systems. Network-based EEWS make use of dense seismic networks (e.g., a Near Fault Observatory, NFO) surrounding the fault generating the event. Rapid processing of the early portion of the P-wave allows for location and magnitude estimation of the event, then used to predict the shaking through ground motion prediction equations. Stand-alone systems instead analyze the early P-wave signal to predict, at the recording site itself, the ground shaking carried by the late S or surface waves through empirically calibrated scaling relationships. We compared the network-based (PRESTo, PRobabilistic and Evolutionary early warning SysTem, www.prestoews.org, Satriano et al., 2011) and stand-alone (SAVE, on-Site-Alert-leVEl, Caruso et al., 2017) systems by analyzing their performance during the 2016-2017 Central Italy sequence. We analyzed 9 earthquakes of magnitude 5.0 < M < 6.5 at about 200 stations located within 200 km of the epicentral area, including stations of the Alto Tiberina NFO (TABOO). Performance is evaluated in terms of the success rate of ground shaking intensity prediction and the available lead-time, i.e., the time available for security actions. For PRESTo we also evaluated the accuracy of location and magnitude. Both systems predict the ground shaking near the event source well, with a success rate around 90% within the potential damage zone. The lead-time is significantly larger for the network-based system, increasing to more than 10 s at 40 km from the event epicentre. The stand-alone system performs better in the near-source region, showing a positive albeit small lead-time (<3 s). Far from the source, performance degrades slightly, mostly owing to uncertain calibration of attenuation relationships. This study opens the possibility of making EEWS operational in Italy, based on the available acceleration networks, provided the delays related to data telemetry can be reduced.

  2. PTSD onset and course following the World Trade Center disaster: findings and implications for future research.

    PubMed

    Boscarino, Joseph A; Adams, Richard E

    2009-10-01

    We sought to identify common risk factors associated with posttraumatic stress disorder (PTSD) onset and course, including delayed, persistent, and remitted PTSD, following a major traumatic exposure. Based on a prospective study of New York City adults following the World Trade Center disaster (WTCD), we conducted baseline interviews with 2,368 persons one year after the event and follow-up interviews 1 year later to evaluate changes in current PTSD status based on DSM-IV criteria. Baseline analysis suggested that current PTSD, defined as present if it occurred in the past 12 months, was associated with being female, younger age, lower self-esteem, lower social support, higher WTCD exposure, more lifetime traumatic events, and a history of pre-WTCD depression. At follow-up, current PTSD was associated with Latinos, non-native-born persons, lower self-esteem, more negative life events, more lifetime traumatic events, and mixed handedness. Classifying respondents at follow-up into resilient (no PTSD at time 1 or 2), remitted (PTSD at time 1, not 2), delayed (no PTSD at time 1, but PTSD at time 2), and persistent (PTSD at both times 1 and 2) cases revealed the following: compared with resilient cases, remitted ones were more likely to be female and to have more negative life events, greater lifetime traumatic events, and pre-WTCD depression. Delayed cases were more likely to be Latino and non-native born, with lower self-esteem, more negative life events, greater lifetime traumas, and mixed handedness. Persistent cases had a similar profile to delayed ones but were the only cases associated with greater WTCD exposure; they were also likely to have had a pre-WTCD depression diagnosis. Examination of WTCD-related PTSD at follow-up, more specifically, revealed a similar risk profile, except that handedness was no longer significant and WTCD exposure was now significant for both remitted and persistent cases. PTSD onset and course are complex and appear to be related to trauma exposure, individual predispositions, and external factors not directly related to the original traumatic event. This diagnostic classification may benefit from additional conceptualization and research as it relates to changes in PTSD status over time.

  3. Design of an FPGA-Based Algorithm for Real-Time Solutions of Statistics-Based Positioning

    PubMed Central

    DeWitt, Don; Johnson-Williams, Nathan G.; Miyaoka, Robert S.; Li, Xiaoli; Lockhart, Cate; Lewellen, Tom K.; Hauck, Scott

    2010-01-01

    We report on the implementation of an algorithm and hardware platform to allow real-time processing of the statistics-based positioning (SBP) method for continuous miniature crystal element (cMiCE) detectors. The SBP method allows an intrinsic spatial resolution of ~1.6 mm FWHM to be achieved with our cMiCE design. Previous SBP solutions have required a post-processing procedure owing to the computation- and memory-intensive nature of SBP. This new implementation takes advantage of a combination of algebraic simplifications, conversion to fixed-point math, and a hierarchical search technique to greatly accelerate the algorithm. For the presented seven-stage, 127 × 127 bin LUT implementation, these algorithm improvements reduce the cost from >7 × 10^6 floating-point operations per event for an exhaustive search to <5 × 10^3 integer operations per event. Simulations show nearly identical FWHM positioning resolution for this accelerated SBP solution, with positioning differences of <0.1 mm from the exhaustive-search solution. A pipelined field programmable gate array (FPGA) implementation of this optimized algorithm is able to process in excess of 250 K events per second, greater than the maximum expected coincidence rate for an individual detector. In contrast with all detectors being processed at a centralized host, as in the current system, a separate FPGA is available at each detector, dividing the computational load. These methods allow SBP results to be calculated in real time and presented to the image-generation components in real time. A hardware implementation has been developed using a commercially available prototype board. PMID:21197135
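
    The sketch below illustrates the coarse-to-fine idea behind such a hierarchical search: rather than exhaustively scoring all 127 × 127 bins, each stage scores a 3 × 3 neighbourhood and halves the step. The likelihood surface and stage count are placeholders, not the published SBP implementation.

```python
# Coarse-to-fine argmax over a LUT grid: ~9 evaluations per stage instead of
# an exhaustive scan of every bin.
import numpy as np

N = 127
truth = (40, 90)                       # placeholder "true" position bin
yy, xx = np.mgrid[0:N, 0:N]
# Placeholder per-event likelihood surface, peaked at `truth`.
surface = -((yy - truth[0]) ** 2 + (xx - truth[1]) ** 2) / 200.0

def hierarchical_argmax(surface, levels=7):
    """Start at the grid centre; halve the search step at each stage."""
    n = surface.shape[0]
    best = (n // 2, n // 2)
    step = n // 2
    for _ in range(levels):
        candidates = [(best[0] + dy * step, best[1] + dx * step)
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
        candidates = [(y, x) for y, x in candidates
                      if 0 <= y < n and 0 <= x < n]
        best = max(candidates, key=lambda p: surface[p])
        step = max(step // 2, 1)
    return best

print("exhaustive:  ", np.unravel_index(surface.argmax(), surface.shape))
print("hierarchical:", hierarchical_argmax(surface))
```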

  4. Two Distinct Types of CME-flare Relationships Based on SOHO and STEREO Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jang, Soojeong; Moon, Yong-Jae; Kim, Rok-Soon

    In this paper, we present two distinct types of coronal mass ejection (CME)-flare relationships according to their observing time differences, using 107 events from 2010 to 2013. The observing time difference, ΔT, is defined as the flare peak time minus the CME first appearance time in the Solar Terrestrial Relations Observatory (STEREO) COR1 field of view. There are 41 events in group A (ΔT < 0) and 66 events in group B (ΔT ≥ 0). We compare CME 3D parameters (speed and kinetic energy) based on multi-spacecraft data (SOlar and Heliospheric Observatory (SOHO) and STEREO A and B) and their associated flare properties (peak flux, fluence, and duration). Our main results are as follows. First, the relationships between CME and flare parameters are better for group B than for group A. In particular, CME 3D kinetic energy for group B is well correlated with flare fluence, with a correlation coefficient of 0.67, much stronger than that of group A (cc = 0.31). Second, the events belonging to group A have short flare durations of less than 1 hr (mean = 21 minutes), while the events in group B have longer durations, up to 4 hr (mean = 81 minutes). Third, the mean height at peak speed for group B is 4.05 Rs, noticeably higher than that of group A (1.89 Rs). This is well correlated with the CME acceleration duration (cc = 0.75). The higher height at peak speed and longer acceleration duration of CMEs in group B could be explained by magnetic reconnections for group B continuously occurring for a longer time than those for group A.

  5. Spatiotemporal Responses of Groundwater Flow and Aquifer-River Exchanges to Flood Events

    NASA Astrophysics Data System (ADS)

    Liang, Xiuyu; Zhan, Hongbin; Schilling, Keith

    2018-03-01

    Rapidly rising river stages induced by flood events lead to considerable river water infiltration into aquifers and carry surface-borne solutes into hyporheic zones, which are widely recognized as an important locus of biogeochemical activity. Existing studies of surface-groundwater exchanges induced by flood events are usually limited to a river-aquifer cross section perpendicular to the river channel and neglect groundwater flow parallel to the channel. In this study, surface-groundwater exchanges during a flood event are investigated with specific consideration of unconfined flow in the direction parallel to the river channel. The groundwater flow is described by a two-dimensional Boussinesq equation, and the flood event is described by a diffusive-type flood wave. Analytical solutions are derived and tested against a numerical solution. The results indicate that river water infiltrates into aquifers quickly during flood events and mostly returns to the river within a short period after the event. The remaining river water, however, stays in the aquifer for a long period, not only flowing back to the river but also flowing to downstream parts of the aquifer. A one-dimensional model neglecting flow in the direction parallel to the channel will overestimate heads and discharge in upstream aquifers. The return flow induced by the flood event decays as a power law in time and has a significant impact on base flow recession at early times. The solution can match the observed hydraulic heads in riparian-zone wells in Iowa during flood events.

  6. Climate network analysis of regional precipitation extremes: The true story told by event synchronization

    NASA Astrophysics Data System (ADS)

    Odenweller, Adrian; Donner, Reik V.

    2017-04-01

    Over the last decade, complex network methods have frequently been used to characterize spatio-temporal patterns of climate variability from a complex systems perspective, yielding new insights into time-dependent teleconnectivity patterns and couplings between different components of the Earth's climate. Among the foremost results reported, network analyses of the synchronicity of extreme events, as captured by so-called event synchronization, have been proposed as powerful tools for disentangling the spatio-temporal organization of particularly extreme rainfall events and anticipating the timing of monsoon onsets or extreme floods. Rooted in spike train synchrony analysis in the neurosciences, event synchronization has the great advantage of automatically classifying pairs of events arising at two distinct spatial locations as temporally close (and thus possibly statistically, or even dynamically, interrelated) or not, without the need to select an additional parameter in the form of a maximally tolerable delay between these events. This is conceptually justified for the original application to spike trains in electroencephalogram (EEG) recordings, where the inter-spike intervals show relatively narrow distributions at high temporal sampling rates. In climate studies, however, precipitation extremes defined by daily precipitation sums exceeding a certain empirical percentile of their local distribution exhibit a distinctly different distribution of waiting times between subsequent events. This raises conceptual concerns about whether event synchronization is still appropriate for detecting interlinkages between spatially distributed precipitation extremes. To study this problem in more detail, we employ event synchronization together with an alternative similarity measure for event sequences, event coincidence rates, which requires a manual setting of the tolerable maximum delay between two events for them to be considered potentially related. Both measures are then used to generate climate networks from parts of the satellite-based TRMM precipitation data set at daily resolution covering the Indian and East Asian monsoon domains, respectively, thereby reanalysing previously published results. The obtained spatial patterns of degree densities and local clustering coefficients exhibit marked differences between the two similarity measures. Specifically, we demonstrate a strong relationship between the fraction of extremes occurring on subsequent days and the degree density in the event synchronization based networks, suggesting that the spatial patterns obtained with this approach are strongly affected by serial dependencies between events. Given that a manual selection of the maximally tolerable delay between two events can be guided by a priori climatological knowledge, and can even be used for systematic testing of different hypotheses on the climatic processes underlying the emergence of spatio-temporal patterns of extreme precipitation, our results provide evidence that event coincidence rates are a more appropriate statistical characteristic for similarity assessment and network construction for climate extremes, while results based on event synchronization need to be interpreted with great caution.
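
    To make the contrast concrete, here is a minimal sketch of an event coincidence rate with an explicitly chosen maximum delay; the definition is simplified (precursor/trigger variants and significance testing are omitted) and the event series are invented.

```python
# Fraction of events at site 1 followed by an event at site 2 within a
# user-chosen window (tau, tau + dt_max] -- the parameter event
# synchronization avoids but event coincidence rates make explicit.
import numpy as np

def event_coincidence_rate(a, b, dt_max=2, tau=0):
    a, b = np.asarray(a), np.asarray(b)
    hits = sum(np.any((b > ai + tau) & (b <= ai + tau + dt_max)) for ai in a)
    return hits / a.size

# Days on which daily rainfall exceeded its local 95th percentile.
site1 = np.array([3, 17, 18, 45, 90, 91])
site2 = np.array([4, 19, 60, 92])

print(event_coincidence_rate(site1, site2, dt_max=2))  # 5/6 ~ 0.83
```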

  7. Optimizing the real-time ground level enhancement alert system based on neutron monitor measurements: Introducing GLE Alert Plus

    NASA Astrophysics Data System (ADS)

    Souvatzoglou, G.; Papaioannou, A.; Mavromichalaki, H.; Dimitroulakos, J.; Sarlanis, C.

    2014-11-01

    Whenever a significant intensity increase is recorded by at least three neutron monitor stations in real-time mode, a ground level enhancement (GLE) event is marked and an automated alert is issued. Although the physical concept of the algorithm is solid and has worked efficiently in a number of cases, the availability of real-time data is still an open issue and makes timely GLE alerts quite challenging. In this work we present the optimization of the GLE alert that has been in operation since 2006 at the Athens Neutron Monitor Station. This upgrade has led to GLE Alert Plus, which is currently based upon the Neutron Monitor Database (NMDB). We have determined critical values per station that allow us to issue reliable GLE alerts close to the initiation of the event while keeping the false alert rate low. Furthermore, we have addressed the problem of data availability by introducing the Go-Back-N algorithm. A total of 13 GLE events were marked from January 2000 to December 2012, and GLE Alert Plus issued an alert for 12 of them. These alert times are compared to the alert times of the GOES Space Weather Prediction Center and the Solar Energetic Particle forecaster of the University of Málaga (UMASEP). In all cases GLE Alert Plus precedes the GOES alert by ≈8-52 min. The comparison with UMASEP demonstrated remarkably good agreement. Real-time GLE alerts from GLE Alert Plus may be retrieved at http://cosray.phys.uoa.gr/gle_alert_plus.html, http://www.nmdb.eu, and http://swe.ssa.esa.int/web/guest/space-radiation. An automated GLE alert email notification system is also available to interested users.
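
    A minimal sketch of the alert logic as summarized above: a GLE alert fires when at least three stations exceed their station-specific critical increases. The station codes and thresholds below are illustrative, not the values derived in the paper.

```python
# Multi-station coincidence rule for issuing a GLE alert.
def gle_alert(increases_pct: dict, critical_pct: dict,
              min_stations: int = 3) -> bool:
    """increases_pct: station -> percent increase over the running baseline."""
    exceeding = [s for s, inc in increases_pct.items()
                 if inc >= critical_pct.get(s, float("inf"))]
    return len(exceeding) >= min_stations

critical = {"ATHN": 4.0, "OULU": 3.0, "SOPO": 5.0, "FSMT": 4.5}
snapshot = {"ATHN": 5.1, "OULU": 6.7, "SOPO": 7.9, "FSMT": 2.0}
print(gle_alert(snapshot, critical))  # True: three stations exceed thresholds
```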

  8. Life-threatening false alarm rejection in ICU: using the rule-based and multi-channel information fusion method.

    PubMed

    Liu, Chengyu; Zhao, Lina; Tang, Hong; Li, Qiao; Wei, Shoushui; Li, Jianqing

    2016-08-01

    False alarm (FA) rates as high as 86% have been reported in intensive care unit monitors. High FA rates decrease the quality of care by slowing staff response times while increasing patient burden and stress. In this study, we propose a rule-based, multi-channel information fusion method for accurately classifying true and false alarms for five life-threatening arrhythmias: asystole (ASY), extreme bradycardia (EBR), extreme tachycardia (ETC), ventricular tachycardia (VTA) and ventricular flutter/fibrillation (VFB). The proposed method consists of five steps: (1) signal pre-processing, (2) feature detection and validation, (3) true/false alarm determination for each channel, (4) 'real-time' true/false alarm determination and (5) 'retrospective' true/false alarm determination (if needed). Up to four signal channels, that is, two electrocardiogram signals, one arterial blood pressure and/or one photoplethysmogram signal, were included in the analysis. Two events were set for method validation: event 1 for 'real-time' and event 2 for 'retrospective' alarm classification. On the training set, a 100% true positive ratio (i.e. sensitivity) was obtained for the ASY, EBR, ETC and VFB types and 94% for the VTA type, with corresponding true negative ratios (i.e. specificity) of 93%, 81%, 78%, 85% and 50%, respectively, yielding score values of 96.50, 90.70, 88.89, 92.31 and 64.90, and final scores of 80.57 for event 1 and 79.12 for event 2. For the test set, the proposed method obtained scores of 88.73 for ASY, 77.78 for EBR, 89.92 for ETC, 67.74 for VFB and 61.04 for VTA, with final scores of 71.68 for event 1 and 75.91 for event 2.
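
    Steps (3) and (4) might look schematically like the following quality-weighted vote across channels; the weights, thresholds, and fail-safe rule are invented stand-ins for the paper's rule base.

```python
# Per-channel true/false alarm votes fused by signal-quality weighting.
def fuse_alarm(votes: dict, quality: dict, threshold: float = 0.5) -> bool:
    """votes: channel -> True if the channel supports a real alarm."""
    usable = {ch: v for ch, v in votes.items() if quality.get(ch, 0) > 0.2}
    if not usable:
        return True  # fail safe: never suppress an alarm without evidence
    weight = sum(quality[ch] for ch in usable)
    support = sum(quality[ch] for ch, v in usable.items() if v)
    return support / weight >= threshold

votes = {"ECG1": False, "ECG2": False, "ABP": True, "PPG": False}
quality = {"ECG1": 0.9, "ECG2": 0.8, "ABP": 0.3, "PPG": 0.7}
print(fuse_alarm(votes, quality))  # False: weighted support too low
```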

  9. Representative Atmospheric Plume Development for Elevated Releases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eslinger, Paul W.; Lowrey, Justin D.; McIntyre, Justin I.

    2014-02-01

    An atmospheric explosion of a low-yield nuclear device will produce a large number of radioactive isotopes, some of which can be measured with airborne detection systems. However, properly equipped aircraft may not arrive in the region where an explosion occurred for a number of hours after the event, and atmospheric conditions will have caused the radioactive plume to move and diffuse before the aircraft arrives. The science behind predicting atmospheric plume movement has advanced enough that the location of the maximum concentrations in the plume can be determined reasonably accurately in real time, or near real time. Given the assumption that an aircraft can follow a plume, this study addresses the amount of atmospheric dilution expected to occur in a representative plume as a function of time past the release event. The approach models atmospheric transport of hypothetical releases from a single location for every day in a year using the publicly available HYSPLIT code. The effective dilution factors for the point of maximum concentration in an elevated plume, based on a release of a non-decaying, non-depositing tracer, can vary by orders of magnitude depending on the day of the release, even for the same number of hours after the release event. However, the median of the dilution factors based on releases for 365 consecutive days at one site follows a power law in time, as shown in Figure S-1. The relationship is good enough to provide a general rule of thumb for estimating typical future dilution factors in a plume starting at the same point, although the coefficients of the power law may vary between release locations. Radioactive decay causes the effective dilution factors to decrease more quickly with time past the release event than the dilution factors based on a non-decaying tracer. An analytical expression for the dilution factors of isotopes with different half-lives can be derived from the power law for the non-decaying tracer: if the median dilution factor for a non-decaying tracer has the general form Df = a·t^(-b) for time t after the release event, then for a radioactive isotope it has the form Df = e^(-λt)·a·t^(-b), where λ is the decay constant of the isotope.
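
    The two expressions above can be wrapped in a small utility. The coefficients a and b below are placeholders for values fitted to the 365 HYSPLIT runs, and Xe-133m is used only as an example isotope.

```python
# Median dilution factor for a stable tracer and a decaying isotope.
import numpy as np

def dilution_factor(t_hours, a, b, lam=0.0):
    """Df(t) = exp(-lam*t) * a * t**(-b); lam = 0 for a stable tracer."""
    t = np.asarray(t_hours, dtype=float)
    return np.exp(-lam * t) * a * t**(-b)

# Xe-133m half-life ~2.198 days, converted to a per-hour decay constant.
lam_xe133m = np.log(2) / (2.198 * 24)
for t in (6, 12, 24, 48):
    stable = dilution_factor(t, a=1e-6, b=1.5)
    decayed = dilution_factor(t, a=1e-6, b=1.5, lam=lam_xe133m)
    print(f"t={t:3d} h  stable={stable:.2e}  Xe-133m={decayed:.2e}")
```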

  10. Multi-state model for studying an intermediate event using time-dependent covariates: application to breast cancer.

    PubMed

    Meier-Hirmer, Carolina; Schumacher, Martin

    2013-06-20

    The aim of this article is to propose several methods for investigating how and whether the shape of the hazard ratio after an intermediate event depends on the waiting time to occurrence of this event and/or the sojourn time in this state. A simple multi-state model, the illness-death model, is used as a framework to investigate the occurrence of this intermediate event. Several approaches are shown and their advantages and disadvantages are discussed. All of these approaches are based on Cox regression. As different time scales are used, these models go beyond Markov models. Different estimation methods for the transition hazards are presented. Additionally, time-varying covariates are included in the model using an approach based on fractional polynomials. The methods are then applied to a dataset consisting of four studies conducted by the German Breast Cancer Study Group (GBSG), studying the occurrence of the first isolated locoregional recurrence (ILRR). The results contribute to the debate on the role of the ILRR in the course of breast cancer and the resulting prognosis. We investigated different modelling strategies for the transition hazard after ILRR or, in general, after an intermediate event. Including time-dependent structures altered the resulting hazard functions considerably, and it was shown that this time-dependent structure has to be taken into account in the case of our breast cancer dataset. The results indicate that an early recurrence increases the risk of death. A late ILRR increases the hazard function much less, and after successful removal of the second tumour the risk of death is almost the same as before the recurrence. With respect to distant disease, the appearance of the ILRR only slightly increases the risk of death if the recurrence was treated successfully. It is important to realize that there are several modelling strategies for the intermediate event and that each of these strategies has restrictions and may lead to different results. Especially in the medical literature on breast cancer development, time-dependency is often neglected in statistical analyses. We show that the time-varying variables cannot be neglected in the case of ILRR and that fractional polynomials are a useful tool for finding the functional form of these time-varying variables.
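
    A hedged sketch of the basic ingredient (a covariate that switches on at the intermediate event) in lifelines' counting-process format, on simulated data; the paper's multiple time scales and fractional-polynomial terms would enter as additional transformed columns.

```python
# Time-dependent ILRR indicator in a Cox model, fitted on simulated data
# where the death hazard doubles after the intermediate event.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(11)
rows = []
for i in range(300):
    t_ilrr = rng.exponential(60.0)            # waiting time to the ILRR
    t_death = rng.exponential(100.0)          # pre-ILRR hazard 1/100
    if t_death > t_ilrr:                      # survived to the ILRR:
        t_death = t_ilrr + rng.exponential(50.0)  # hazard doubles afterwards
    end, died = min(t_death, 120.0), t_death <= 120.0
    if t_ilrr < end:
        rows.append((i, 0.0, t_ilrr, 0, 0))   # interval before the ILRR
        rows.append((i, t_ilrr, end, 1, int(died)))
    else:
        rows.append((i, 0.0, end, 0, int(died)))
df = pd.DataFrame(rows, columns=["id", "start", "stop", "ilrr", "event"])

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # coefficient for `ilrr` estimates the post-ILRR log-HR
```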

  11. The development of an incident event reporting system for nursing students.

    PubMed

    Chiou, Shwu-Fen; Huang, Ean-Wen; Chuang, Jen-Hsiang

    2009-01-01

    Incident events may occur when nursing students are present in the clinical setting. Their inexperience and unfamiliarity with clinical practice put them at risk of making mistakes that could potentially harm patients and themselves. However, existing incident event reporting systems suffer from deficiencies, including incomplete data and delayed reports. The purpose of this study was to develop an incident event reporting system for nursing students in clinical settings and evaluate its effectiveness. The study was undertaken in three phases. In the first phase, a literature review and focus groups were used to develop the architecture of the reporting system. In the second phase, the reporting system was implemented, and data on incident events involving nursing students were collected for a 12-month period. In the third phase, a pre-post trial was undertaken to evaluate the performance of the reporting system. ASP.NET and Microsoft Access 2003 were used to create an interactive web-based interface and the underlying database. Email notifications alerted the nursing student's teacher when an incident event was reported. One year after installing the reporting system, the number of reported incident events had increased tenfold, while the time to report an incident event and the time required to complete the reporting procedures were shorter than before implementation. The incident event reporting system thus appeared to be effective in reporting incident events more comprehensively and in shortening the time required to report them compared with traditional written reports.

  12. Dietary patterns associated with overweight and obesity among Brazilian schoolchildren: an approach based on the time-of-day of eating events.

    PubMed

    Kupek, Emil; Lobo, Adriana S; Leal, Danielle B; Bellisle, France; de Assis, Maria Alice A

    2016-12-01

    Several studies reported that the timing of eating events has critical implications in the prevention of obesity, but dietary patterns regarding the time-of-day have not been explored in children. The aim of this study was to derive latent food patterns of daily eating events and to examine their associations with overweight/obesity among schoolchildren. A population-based cross-sectional study was conducted with 7-10-year-old Brazilian schoolchildren (n 1232) who completed the Previous Day Food Questionnaire, illustrated with twenty-one foods/beverages in six daily eating events. Latent class analysis was used to derive dietary patterns whose association with child weight status was evaluated by multivariate multinomial regression. Four mutually exclusive latent classes of dietary patterns were identified and labelled according to the time-of-day of eating events and food intake probability (FIP): (A) higher FIP only at lunch; (B) lower FIP at all eating events; (C) higher FIP at lunch, afternoon and evening snacks; (D) lower FIP at breakfast and at evening snack, higher FIP at other meals/snacks. The percentages of children within these classes were 32·3, 48·6, 15·1 and 4·0 %, respectively. After controlling for potential confounders, the mean probabilities of obesity for these classes were 6 % (95 % CI 3·0, 9·0), 13 % (95 % CI 9·0, 17·0), 12 % (95 % CI 6·0, 19) and 11 % (95 % CI 5·0, 17·0), in the same order. In conclusion, the children eating traditional lunch with rice and beans as the main meal of the day (class A) had the lowest obesity risk, thus reinforcing the importance of both the food type and the time-of-day of its intake for weight status.

  13. Laboratory-Based Prospective Surveillance for Community Outbreaks of Shigella spp. in Argentina

    PubMed Central

    Viñas, María R.; Tuduri, Ezequiel; Galar, Alicia; Yih, Katherine; Pichel, Mariana; Stelling, John; Brengi, Silvina P.; Della Gaspera, Anabella; van der Ploeg, Claudia; Bruno, Susana; Rogé, Ariel; Caffer, María I.; Kulldorff, Martin; Galas, Marcelo

    2013-01-01

    Background To implement effective control measures, timely outbreak detection is essential. Shigella is the most common cause of bacterial diarrhea in Argentina. Highly resistant clones of Shigella have emerged, and outbreaks have been recognized in closed settings and in whole communities. We hereby report our experience with an evolving, integrated, laboratory-based, near real-time surveillance system operating in six contiguous provinces of Argentina during April 2009 to March 2012. Methodology To detect localized shigellosis outbreaks in a timely manner, we used the prospective space-time permutation scan statistic algorithm of SaTScan, embedded in WHONET software. Twenty-three laboratories sent updated Shigella data on a weekly basis to the National Reference Laboratory. Cluster detection analysis was performed at several taxonomic levels: for all Shigella spp., for serotypes within species and for antimicrobial resistance phenotypes within species. Shigella isolates associated with statistically significant signals (clusters in time/space with recurrence interval ≥365 days) were subtyped by pulsed field gel electrophoresis (PFGE) using PulseNet protocols. Principal Findings In three years of active surveillance, our system detected 32 statistically significant events, 26 of them identified before hospital staff was aware of any unexpected increase in the number of Shigella isolates. Twenty-six signals were investigated by PFGE, which confirmed a close relationship among the isolates for 22 events (84.6%). Seven events were investigated epidemiologically, which revealed links among the patients. Seventeen events were found at the resistance-profile level. The system detected events of public health importance: infrequent resistance profiles, long-lasting and/or re-emergent clusters, and events important for their duration or size, which were reported to local public health authorities. Conclusions/Significance The WHONET-SaTScan system may serve as a model for surveillance and can be applied to other pathogens, implemented by other networks, and scaled up to national and international levels for early detection and control of outbreaks. PMID:24349586

  14. Laboratory-based prospective surveillance for community outbreaks of Shigella spp. in Argentina.

    PubMed

    Viñas, María R; Tuduri, Ezequiel; Galar, Alicia; Yih, Katherine; Pichel, Mariana; Stelling, John; Brengi, Silvina P; Della Gaspera, Anabella; van der Ploeg, Claudia; Bruno, Susana; Rogé, Ariel; Caffer, María I; Kulldorff, Martin; Galas, Marcelo

    2013-01-01

    To implement effective control measures, timely outbreak detection is essential. Shigella is the most common cause of bacterial diarrhea in Argentina. Highly resistant clones of Shigella have emerged, and outbreaks have been recognized in closed settings and in whole communities. We hereby report our experience with an evolving, integrated, laboratory-based, near real-time surveillance system operating in six contiguous provinces of Argentina during April 2009 to March 2012. To detect localized shigellosis outbreaks in a timely manner, we used the prospective space-time permutation scan statistic algorithm of SaTScan, embedded in WHONET software. Twenty-three laboratories sent updated Shigella data on a weekly basis to the National Reference Laboratory. Cluster detection analysis was performed at several taxonomic levels: for all Shigella spp., for serotypes within species and for antimicrobial resistance phenotypes within species. Shigella isolates associated with statistically significant signals (clusters in time/space with recurrence interval ≥365 days) were subtyped by pulsed field gel electrophoresis (PFGE) using PulseNet protocols. In three years of active surveillance, our system detected 32 statistically significant events, 26 of them identified before hospital staff was aware of any unexpected increase in the number of Shigella isolates. Twenty-six signals were investigated by PFGE, which confirmed a close relationship among the isolates for 22 events (84.6%). Seven events were investigated epidemiologically, which revealed links among the patients. Seventeen events were found at the resistance-profile level. The system detected events of public health importance: infrequent resistance profiles, long-lasting and/or re-emergent clusters, and events important for their duration or size, which were reported to local public health authorities. The WHONET-SaTScan system may serve as a model for surveillance and can be applied to other pathogens, implemented by other networks, and scaled up to national and international levels for early detection and control of outbreaks.

  15. Data Prediction for Public Events in Professional Domains Based on Improved RNN-LSTM

    NASA Astrophysics Data System (ADS)

    Song, Bonan; Fan, Chunxiao; Wu, Yuexin; Sun, Juanjuan

    2018-02-01

    Traditional data services that predict emergency or non-periodic events usually cannot generate satisfying results or fulfill the intended prediction purpose. However, such events are influenced by external causes, which means that certain a priori information about them can generally be collected through the Internet. This paper studied the above problems and proposed an improved model: an LSTM (Long Short-Term Memory) dynamic prediction and a priori information sequence generation model combining RNN-LSTM with a priori information about public events. In prediction tasks, the model is capable of determining trends, and its accuracy is validated; it yields better performance and prediction results than its predecessor. Using a priori information increases prediction accuracy; LSTM adapts better to changes in the time sequence; and LSTM can be widely applied to the same type of prediction task as well as other prediction tasks related to time sequences.
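
    A generic sketch of the architecture class described, assuming PyTorch: an LSTM whose inputs concatenate the target series with a priori information features; the dimensions, features, and single training step are illustrative.

```python
# LSTM over a series plus exogenous a priori features, predicting one step.
import torch
import torch.nn as nn

class EventLSTM(nn.Module):
    def __init__(self, n_series=1, n_prior=4, hidden=32):
        super().__init__()
        # Each time step concatenates the series value with a priori features.
        self.lstm = nn.LSTM(n_series + n_prior, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)          # x: (batch, time, n_series + n_prior)
        return self.head(out[:, -1])   # predict the next value of the series

model = EventLSTM()
x = torch.randn(8, 30, 5)              # a batch of 30-step windows
y = torch.randn(8, 1)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = nn.functional.mse_loss(model(x), y)
loss.backward()
opt.step()
print("one training step done, loss =", float(loss))
```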

  16. What is the cause of confidence inflation in the Life Events Inventory (LEI) paradigm?

    PubMed

    Von Glahn, Nicholas R; Otani, Hajime; Migita, Mai; Langford, Sara J; Hillard, Erin E

    2012-01-01

    Briefly imagining, paraphrasing, or explaining an event causes people to increase their confidence that the event occurred during their childhood: the imagination inflation effect. The mechanisms responsible for the effect were investigated with a new paradigm. In Experiment 1, event familiarity (defined as processing fluency) was varied by asking participants to rate each event once, three times, or five times. No inflation was found, indicating that familiarity does not account for the effect. In Experiment 2, the richness of memory representation was manipulated by asking participants to generate zero, three, or six details. Confidence increased from the initial to the final rating in the three- and six-detail conditions, indicating that the effect is based on reality-monitoring errors. However, greater inflation in the three-detail condition than in the six-detail condition indicates a boundary condition. These results were also consistent with an alternative hypothesis, the mental workload hypothesis.

  17. Gait event detection using linear accelerometers or angular velocity transducers in able-bodied and spinal-cord injured individuals.

    PubMed

    Jasiewicz, Jan M; Allum, John H J; Middleton, James W; Barriskill, Andrew; Condie, Peter; Purcell, Brendan; Li, Raymond Che Tin

    2006-12-01

    We report on three different methods of gait event detection (toe-off and heel strike) using miniature linear accelerometers and angular velocity transducers, in comparison to standard pressure-sensitive foot switches. Detection was performed with normal and spinal-cord injured subjects. The detection of end contact (EC), normally toe-off, and initial contact (IC), normally heel strike, was based on either foot linear accelerations, foot sagittal angular velocity, or shank sagittal angular velocity. The results showed that all three methods were as accurate as foot switches in estimating times of IC and EC for normal gait patterns. In spinal-cord injured subjects, shank angular velocity was significantly less accurate (p<0.02). We conclude that detection based on foot linear accelerations or foot angular velocity can correctly identify the timing of IC and EC events in both normal and spinal-cord injured subjects.
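    The abstract does not give the exact detection rules, but a common way to recover IC and EC from sagittal angular velocity is peak-picking around the mid-swing peak. A heuristic sketch, with illustrative thresholds that would need per-dataset tuning:

```python
import numpy as np
from scipy.signal import find_peaks

def detect_gait_events(gyro_sagittal, fs):
    """Heuristic gait-event detection from shank sagittal angular
    velocity (rad/s). Mid-swing shows large positive peaks; initial
    contact (IC) and end contact (EC) appear as negative dips just
    after and just before each mid-swing peak, respectively.
    Thresholds are illustrative only."""
    swing_peaks, _ = find_peaks(gyro_sagittal, height=1.0,
                                distance=int(0.5 * fs))
    dips, _ = find_peaks(-gyro_sagittal, height=0.2)
    ic_times, ec_times = [], []
    for pk in swing_peaks:
        after = dips[dips > pk]
        before = dips[dips < pk]
        if after.size:
            ic_times.append(after[0] / fs)    # first dip after mid-swing
        if before.size:
            ec_times.append(before[-1] / fs)  # last dip before mid-swing
    return np.array(ic_times), np.array(ec_times)
```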

  18. A geo-computational algorithm for exploring the structure of diffusion progression in time and space.

    PubMed

    Chin, Wei-Chien-Benny; Wen, Tzai-Hung; Sabel, Clive E; Wang, I-Hsiang

    2017-10-03

    A diffusion process can be considered as the movement of linked events through space and time. Therefore, the space-time locations of events are key to identifying any diffusion process. However, previous clustering analysis methods have focused only on space-time proximity characteristics, neglecting the temporal lag of the movement of events. We argue that the temporal lag between events is key to understanding the process of diffusion movement. Using the temporal lag could help to clarify the types of close relationships. This study aims to develop a data exploration algorithm, the TrAcking Progression In Time And Space (TaPiTaS) algorithm, for understanding diffusion processes. Based on the spatial distance and temporal interval between cases, TaPiTaS detects sub-clusters (groups of events that have a high probability of having common sources), identifies progression links (the relationships between sub-clusters), and tracks progression chains (the connected components of sub-clusters). Dengue fever case data were used as an illustrative case study. The location and temporal range of sub-clusters are presented, along with the progression links. The TaPiTaS algorithm contributes a more detailed and in-depth understanding of the development of progression chains, namely the geographic diffusion process.
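    The linking logic described above can be illustrated with a small sketch: events are linked when they fall within a spatial distance and a temporal-lag window, and chains are the connected components of those links. Parameter names and thresholds here are illustrative stand-ins, not TaPiTaS's exact values.

```python
import numpy as np

def progression_chains(xy, t, d_max, t_min, t_max):
    """Link events when they are within d_max in space and their
    temporal lag lies in [t_min, t_max]; chains are the connected
    components of the links (found with a simple union-find)."""
    n = len(t)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    links = []
    for i in range(n):
        for j in range(i + 1, n):
            lag = abs(t[j] - t[i])
            dist = np.hypot(*(xy[i] - xy[j]))
            if dist <= d_max and t_min <= lag <= t_max:
                links.append((i, j))
                parent[find(i)] = find(j)
    chains = {}
    for i in range(n):
        chains.setdefault(find(i), []).append(i)
    return links, [c for c in chains.values() if len(c) > 1]
```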

  19. Mixed-Poisson Point Process with Partially-Observed Covariates: Ecological Momentary Assessment of Smoking.

    PubMed

    Neustifter, Benjamin; Rathbun, Stephen L; Shiffman, Saul

    2012-01-01

    Ecological Momentary Assessment is an emerging method of data collection in behavioral research that may be used to capture the times of repeated behavioral events on electronic devices, and information on subjects' psychological states through the electronic administration of questionnaires at times selected from a probability-based design as well as the event times. A method for fitting a mixed Poisson point process model is proposed for the impact of partially-observed, time-varying covariates on the timing of repeated behavioral events. A random frailty is included in the point-process intensity to describe variation among subjects in baseline rates of event occurrence. Covariate coefficients are estimated using estimating equations constructed by replacing the integrated intensity in the Poisson score equations with a design-unbiased estimator. An estimator is also proposed for the variance of the random frailties. Our estimators are robust in the sense that no model assumptions are made regarding the distribution of the time-varying covariates or the distribution of the random effects. However, subject effects are estimated under gamma frailties using an approximate hierarchical likelihood. The proposed approach is illustrated using smoking data.
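    In notation of my own choosing (not taken verbatim from the paper), the kind of frailty-modulated intensity being fitted can be written as:

```latex
% Subject i's event intensity with a multiplicative random frailty Z_i
% and time-varying covariates X_i(t) (illustrative notation):
\lambda_i(t) = Z_i \, \lambda_0 \exp\{\beta^\top X_i(t)\},
\qquad Z_i \sim \mathrm{Gamma}(\sigma^{-2}, \sigma^{-2}),
% so that E[Z_i] = 1 and Var(Z_i) = \sigma^2 captures between-subject
% heterogeneity in baseline event rates.
```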

  20. Search for Correlated Fluctuations in the Beta+ Decay of Na-22

    NASA Astrophysics Data System (ADS)

    Silverman, M. P.; Strange, W.

    2008-10-01

    Claims for a "cosmogenic" force that correlates otherwise independent stochastic events have been made for at least 10 years, based largely on visual inspection of time series of histograms whose shapes were interpreted as suggestive of recurrent patterns with semi-diurnal, diurnal, and monthly periods. Building on our earlier work to test the randomness of different nuclear decay processes, we have searched for correlations in the time series of coincident positron-electron annihilations deriving from beta+ decay of Na-22. Disintegrations were counted within a narrow time window over a period of 7 days, leading to a time series of more than 1 million events. Statistical tests were performed on the raw time series, its correlation function, and its Fourier transform to search for cyclic correlations indicative of deviations from Poisson statistics that would violate quantum mechanics. The time series was then partitioned into a sequence of 167 "bags" of 8192 events each. A histogram was made of the events of each bag, where contiguous frequency classes differed by a single count. The chronological sequence of histograms was then tested for correlations within classes. In all cases the results of the tests were in accord with statistical control, giving no evidence of correlated fluctuations.
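    One simple member of the battery of randomness tests described is an index-of-dispersion test on binned counts; a sketch, with the bin width left as an analysis choice:

```python
import numpy as np
from scipy import stats

def dispersion_test(event_times, bin_width):
    """Index-of-dispersion test for Poisson behaviour of a decay-event
    time series: bin the event times and compare the variance of the
    bin counts to their mean. For a Poisson process, the statistic
    (n - 1) * var / mean is approximately chi-square with n - 1
    degrees of freedom."""
    edges = np.arange(event_times.min(), event_times.max(), bin_width)
    counts, _ = np.histogram(event_times, bins=edges)
    n = counts.size
    d = (n - 1) * counts.var(ddof=1) / counts.mean()
    # two-sided p-value: both over- and under-dispersion are deviations
    p = 2 * min(stats.chi2.cdf(d, n - 1), stats.chi2.sf(d, n - 1))
    return d, p
```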

  1. Using Pattern Recognition and Discriminance Analysis to Predict Critical Events in Large Signal Databases

    NASA Astrophysics Data System (ADS)

    Feller, Jens; Feller, Sebastian; Mauersberg, Bernhard; Mergenthaler, Wolfgang

    2009-09-01

    Many applications in plant management require close monitoring of equipment performance, in particular with the objective of preventing certain critical events. At each point in time, the information available to classify the criticality of the process is represented by the historic signal database as well as the actual measurement. This paper presents an approach to detecting and predicting critical events based on pattern recognition and discriminance analysis.

  2. Observation of Long Ionospheric Recoveries from Lightning-induced Electron Precipitation Events

    NASA Astrophysics Data System (ADS)

    Mohammadpour Salut, M.; Cohen, M.

    2015-12-01

    Lightning strokes induce nighttime disturbances in the lower ionosphere which can be detected through Very Low Frequency (VLF) remote sensing via at least two means: (1) direct heating and ionization, known as an Early event, and (2) triggered precipitation of energetic electrons from the radiation belts, known as Lightning-induced Electron Precipitation (LEP). For each, the ionospheric recovery time is typically a few minutes or less. A small class of Early events has been identified as having unusually long ionospheric recoveries (tens of minutes), with the underlying mechanism still in question. Our study shows for the first time that some LEP events also demonstrate unusually long recovery. The VLF events were detected by visual inspection of the recorded data in both the North-South and East-West magnetic fields. Data from the National Lightning Detection Network (NLDN) are used to determine the location and peak current of the lightning responsible for each lightning-associated VLF perturbation. LEP and Early VLF events are distinguished by measuring the time delay between the causative lightning discharges and the onset of all lightning-associated perturbations. LEP events typically possess an onset delay greater than ~200 msec following the causative lightning discharge, while the onset of Early VLF events is time-aligned (<20 msec) with the lightning return stroke. Nonducted LEP events are distinguished from ducted events based on the location of the causative lightning relative to the precipitation region. From 15 March to 20 April and 15 October to 15 November 2011, a total of 385 LEP events were observed at Indiana, Montana, Colorado and Oklahoma VLF sites on the NAA, NLK and NML transmitter signals. Forty-six of these events exhibited a long recovery. The occurrence rate of ducted long-recovery LEP events was found to be higher than that of nonducted events: of the 46 long-recovery LEP events, 33 were induced by ducted whistlers, and 13 were associated with nonducted, obliquely propagating whistler waves. The occurrence of high peak current lightning strokes is a prerequisite for long-recovery LEP events.

  3. Robust Covariate-Adjusted Log-Rank Statistics and Corresponding Sample Size Formula for Recurrent Events Data

    PubMed Central

    Song, Rui; Kosorok, Michael R.; Cai, Jianwen

    2009-01-01

    Recurrent events data are frequently encountered in clinical trials. This article develops robust covariate-adjusted log-rank statistics applied to recurrent events data with arbitrary numbers of events under independent censoring, and the corresponding sample size formula. The proposed log-rank tests are robust with respect to different data-generating processes and are adjusted for predictive covariates. They reduce to the Kong and Slud (1997, Biometrika 84, 847–862) setting in the case of a single event. The sample size formula is derived based on the asymptotic normality of the covariate-adjusted log-rank statistics under certain local alternatives and a working model for baseline covariates in the recurrent event data context. When the effect size is small and the baseline covariates do not contain significant information about event times, the formula reduces to the same form as that of Schoenfeld (1983, Biometrics 39, 499–503) for cases of a single event or independent event times within a subject. We carry out simulations to study the control of type I error and the comparison of powers between several methods in finite samples. The proposed sample size formula is illustrated using data from an rhDNase study. PMID:18162107

  4. Some implications of an event-based definition of exposure to the risk of road accident.

    PubMed

    Elvik, Rune

    2015-03-01

    This paper proposes a new definition of exposure to the risk of road accident as any event, limited in space and time, representing a potential for an accident to occur by bringing road users close to each other in time or space, or by requiring a road user to take action to avoid leaving the roadway. A typology of events representing a potential for an accident is proposed. Each event can be interpreted as a trial as defined in probability theory. Risk is the proportion of events that result in an accident. Defining exposure as events demanding the attention of road users implies that road users will learn from repeated exposure to these events, which in turn implies that there will normally be a negative relationship between exposure and risk. Four hypotheses regarding the relationship between exposure and risk are proposed. Preliminary tests support these hypotheses. Advantages and disadvantages of defining exposure as specific events are discussed. It is argued that developments in vehicle technology are likely to make events both observable and countable, thus ensuring that exposure is an operational concept. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Automatic Detection and Classification of Audio Events for Road Surveillance Applications.

    PubMed

    Almaadeed, Noor; Asim, Muhammad; Al-Maadeed, Somaya; Bouridane, Ahmed; Beghdadi, Azeddine

    2018-06-06

    This work investigates the problem of detecting hazardous events on roads by designing an audio surveillance system that automatically detects perilous situations such as car crashes and tire skidding. In recent years, several visual surveillance systems have been proposed for road monitoring to detect accidents, with the aim of improving safety procedures in emergency cases. However, visual information alone cannot detect certain events such as car crashes and tire skidding, especially under adverse and visually cluttered weather conditions such as snowfall, rain, and fog. Consequently, the incorporation of microphones and audio event detectors based on audio processing can significantly enhance the detection accuracy of such surveillance systems. This paper proposes to combine time-domain, frequency-domain, and joint time-frequency features extracted from a class of quadratic time-frequency distributions (QTFDs) to detect events on roads through audio analysis and processing. Experiments were carried out using a publicly available dataset. The experimental results confirm the effectiveness of the proposed approach for detecting hazardous events on roads, as demonstrated by a 7% improvement in accuracy rate when compared against methods that use individual temporal and spectral features.
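    As a rough illustration of the per-frame time- and frequency-domain features that such systems combine with QTFD-derived features (the exact feature set here is illustrative, not the paper's):

```python
import numpy as np

def basic_audio_features(frame, fs):
    """Per-frame time-domain (energy, zero-crossing rate) and
    frequency-domain (spectral centroid) features; an illustrative
    subset of a road-audio feature vector."""
    energy = np.mean(frame ** 2)                        # time domain
    zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2  # crossings/sample
    spec = np.abs(np.fft.rfft(frame)) ** 2
    freqs = np.fft.rfftfreq(frame.size, 1.0 / fs)
    centroid = np.sum(freqs * spec) / np.sum(spec)      # spectral centroid
    return np.array([energy, zcr, centroid])
```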

  6. Effects of daily kangaroo care on cardiorespiratory parameters in preterm infants.

    PubMed

    Mitchell, A J; Yates, C; Williams, K; Hall, R W

    2013-01-01

    Kangaroo care (KC) has possible benefits for promoting physiological stability and positive developmental outcomes in preterm infants. The purpose of this study was to compare bradycardia and oxygen desaturation events in preterm infants in standard incubator care versus KC. Thirty-eight infants 27 to 30 weeks gestational age were randomly assigned to 2 hours of KC daily between days of life 5 to 10 or to standard incubator care. Infants were monitored for bradycardia (heart rate <80) or oxygen desaturation (<80%). Analysis of hourly events was based on three sets of data: standard care group 24 hours daily, KC group during incubator time 22 hours daily, and KC group during holding time 2 hours daily. The KC group had fewer bradycardia events per hour while being held compared to time spent in an incubator (p = 0.048). The KC group also had significantly fewer oxygen desaturation events while being held than while in the incubator (p = 0.017) and significantly fewer desaturation events than infants in standard care (p = 0.02). KC reduces bradycardia and oxygen desaturation events in preterm infants, providing physiological stability and possible benefits for neurodevelopmental outcomes.

  7. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
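    The eigen-analysis step can be sketched as follows, assuming a pairwise distance matrix between models fitted on subsequences is already available; this captures the general idea that background subsequences dominate the principal eigenvector while outliers do not, though the paper's exact formulation may differ:

```python
import numpy as np

def outlier_scores(model_dist):
    """Given pairwise distances between statistical models estimated
    on subsequences, build a Gaussian affinity matrix and use its
    dominant eigenvector: background subsequences receive large
    weights, sparse outliers small ones. Returns a score in [0, 1],
    high meaning likely outlier."""
    sigma = np.median(model_dist)
    affinity = np.exp(-model_dist ** 2 / (2 * sigma ** 2))
    vals, vecs = np.linalg.eigh(affinity)   # ascending eigenvalues
    v = np.abs(vecs[:, -1])                 # dominant eigenvector
    return 1.0 - v / v.max()
```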

  8. Simulation of the Tsunami Resulting from the M 9.2 2004 Sumatra-Andaman Earthquake - Dynamic Rupture vs. Seismic Inversion Source Model

    NASA Astrophysics Data System (ADS)

    Vater, Stefan; Behrens, Jörn

    2017-04-01

    Simulations of historic tsunami events such as the 2004 Sumatra or the 2011 Tohoku event are usually initialized using earthquake sources resulting from inversion of seismic data. Other data, from ocean buoys etc., are also sometimes included in the derivation of the source model. The associated tsunami event can often be well simulated in this way, and the results show high correlation with measured data. However, it is unclear how the derived source model compares to the particular earthquake event. In this study we use the results from dynamic rupture simulations obtained with SeisSol, a software package based on an ADER-DG discretization solving the spontaneous dynamic earthquake rupture problem with high-order accuracy in space and time. The tsunami model is based on a second-order Runge-Kutta discontinuous Galerkin (RKDG) scheme on triangular grids and features a robust wetting and drying scheme for the simulation of inundation events at the coast. Adaptive mesh refinement enables the efficient computation of large domains, while at the same time allowing for high local resolution and geometric accuracy. The results are compared to measured data and to results obtained using earthquake sources based on inversion. By using the output of actual dynamic rupture simulations, we can estimate the influence of different earthquake parameters. Furthermore, the comparison to other source models enables a thorough comparison and validation of important tsunami parameters, such as the runup at the coast. This work is part of the ASCETE (Advanced Simulation of Coupled Earthquake and Tsunami Events) project, which aims at an improved understanding of the coupling between the earthquake and the generated tsunami event.

  9. Improving short-term forecasting during ramp events by means of Regime-Switching Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Gallego, C.; Costa, A.; Cuerva, A.

    2010-09-01

    Since wind energy currently can be neither scheduled nor stored at large scale, wind power forecasting has been useful to minimize the impact of wind fluctuations. In particular, short-term forecasting (characterised by prediction horizons from minutes to a few days) is currently required by energy producers (in a daily electricity market context) and by TSOs (in order to keep the stability/balance of an electrical system). Within the short-term context, time-series-based models (i.e., statistical models) have shown better performance than NWP models for horizons up to a few hours. These models try to learn and replicate the dynamics shown by the time series of a certain variable. When considering the power output of wind farms, ramp events are usually observed, characterized by a large positive gradient in the time series (ramp-up) or a large negative one (ramp-down) during relatively short time periods (a few hours). Ramp events may have many different causes, generally involving several spatial scales, from the large scale (fronts, low-pressure systems) down to the local scale (wind turbine shut-down due to high wind speed, yaw misalignment due to fast changes of wind direction). Hence, the output power may show unexpected dynamics during ramp events depending on the underlying processes; consequently, traditional statistical models that consider only one dynamic for the whole power time series may be inappropriate. This work proposes a Regime-Switching (RS) model based on Artificial Neural Networks (ANNs). The RS-ANN model gathers as many ANNs as regimes considered; a certain ANN is selected to predict the output power depending on the current regime. The current regime is updated on-line based on a gradient criterion applied to the past two values of the output power. Three regimes are established concerning ramp events: ramp-up, ramp-down and no-ramp. In order to assess the skill of the proposed RS-ANN model, a single-ANN model (without regime classification) is adopted as a reference. Both models are evaluated in terms of Improvement over Persistence on a Mean Square Error basis (IoP%) when predicting horizons from 1 to 5 time steps. The case of a wind farm located in the complex terrain of Alaiz (northern Spain) has been considered. Three years of power output data with hourly resolution were employed: two years for training and validation of the model and the last year for assessing accuracy. Results showed that the RS-ANN outperformed the single-ANN model for one-step-ahead forecasts: the overall IoP% was up to 8.66% for the RS-ANN model (depending on the gradient criterion selected to trigger the ramp regime) and 6.16% for the single-ANN. However, both models showed similar accuracy for longer horizons. A locally weighted evaluation during ramp events for one step ahead was also performed. The IoP% during ramp-up events increased from 17.60% (single-ANN) to 22.25% (RS-ANN), while during ramp-down events the improvement increased from 18.55% to 19.55%. Three main conclusions are derived from this case study. First, it highlights the importance of statistical models capable of differentiating the several regimes shown by the output power time series in order to improve forecasting during extreme events like ramps. Second, on-line regime classification based on available power output data did not seem to improve forecasts for horizons beyond one step ahead; taking into account other explanatory variables (local wind measurements, NWP outputs) could lead to a better understanding of ramp events, improving the regime assessment for further horizons as well. Third, the RS-ANN model slightly outperformed the single-ANN during ramp-down events; if further research reinforces this effect, special attention should be devoted to understanding the underlying processes during ramp-down events.
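    The on-line regime gate described above (a gradient criterion on the past two power values, routing the forecast to a per-regime ANN) can be sketched as follows; the threshold and the `models` mapping are illustrative stand-ins for the trained per-regime networks:

```python
import numpy as np

def select_regime(p_prev2, p_prev1, ramp_threshold):
    """Gradient-based regime assignment from the past two power values,
    mirroring the paper's on-line criterion (threshold is a tuning
    choice)."""
    gradient = p_prev1 - p_prev2
    if gradient > ramp_threshold:
        return "ramp-up"
    if gradient < -ramp_threshold:
        return "ramp-down"
    return "no-ramp"

def rs_forecast(history, models, ramp_threshold):
    """Route the forecast to the ANN trained for the current regime.
    `models` is a hypothetical mapping regime name -> fitted predictor
    exposing a .predict() method."""
    regime = select_regime(history[-2], history[-1], ramp_threshold)
    return models[regime].predict(np.asarray(history)[None, :])
```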

  10. Timing of late Holocene surface rupture of the Wairau Fault, Marlborough, New Zealand

    USGS Publications Warehouse

    Zachariasen, J.; Berryman, K.; Langridge, Rob; Prentice, C.; Rymer, M.; Stirling, M.; Villamor, P.

    2006-01-01

    Three trenches excavated across the central portion of the right-lateral strike-slip Wairau Fault in South Island, New Zealand, exposed a complex set of fault strands that have displaced a sequence of late Holocene alluvial and colluvial deposits. Abundant charcoal fragments provide age control for various stratigraphic horizons dating back to c. 5610 yr ago. Faulting relations from the Wadsworth trench show that the most recent surface rupture event occurred at least 1290 yr and at most 2740 yr ago. Drowned trees in landslide-dammed Lake Chalice, in combination with charcoal from the base of an unfaulted colluvial wedge at Wadsworth trench, suggest a narrower time bracket for this event of 1811-2301 cal. yr BP. The penultimate faulting event occurred between c. 2370 and 3380 yr, and possibly near 2680 ± 60 cal. yr BP, when data from both the Wadsworth and Dillon trenches are combined. Two older events have been recognised from Dillon trench but remain poorly dated. A probable elapsed time of at least 1811 yr since the last surface rupture, and an average slip rate estimate for the Wairau Fault of 3-5 mm/yr, suggest that at least 5.4 m and up to 11.5 m of elastic shear strain has accumulated since the last rupture. This is near to or greater than the single-event displacement estimates of 5-7 m. The average recurrence interval for surface rupture of the fault determined from the trench data is 1150-1400 yr. Although the uncertainties in the timing of faulting events and variability in inter-event times remain high, the time elapsed since the last event is in the order of 1-2 times the average recurrence interval, implying that the Wairau Fault is near the end of its interseismic period. © The Royal Society of New Zealand 2006.

  11. Regional Evaluation of the Severity-Based Stroke Triage Algorithm for Emergency Medical Services Using Discrete Event Simulation.

    PubMed

    Bogle, Brittany M; Asimos, Andrew W; Rosamond, Wayne D

    2017-10-01

    The Severity-Based Stroke Triage Algorithm for Emergency Medical Services endorses routing patients with suspected large vessel occlusion acute ischemic strokes directly to endovascular stroke centers (ESCs). We sought to evaluate different specifications of this algorithm within a region. We developed a discrete event simulation environment to model patients with suspected stroke transported according to algorithm specifications, which varied by stroke severity screen and permissible additional transport time for routing patients to ESCs. We simulated King County, Washington, and Mecklenburg County, North Carolina, distributing patients geographically into census tracts. Transport time to the nearest hospital and ESC was estimated using traffic-based travel times. We assessed undertriage, overtriage, transport time, and the number-needed-to-route, defined as the number of patients enduring additional transport to route one large vessel occlusion patient to an ESC. Undertriage was higher and overtriage was lower in King County compared with Mecklenburg County for each specification. Overtriage variation was primarily driven by screen (eg, 13%-55% in Mecklenburg County and 10%-40% in King County). Transportation time specifications beyond 20 minutes increased overtriage and decreased undertriage in King County but not Mecklenburg County. A low- versus high-specificity screen routed 3.7× more patients to ESCs. Emergency medical services spent nearly twice the time routing patients to ESCs in King County compared with Mecklenburg County. Our results demonstrate how discrete event simulation can facilitate informed decision making to optimize emergency medical services stroke severity-based triage algorithms. This is the first step toward developing a mature simulation to predict patient outcomes. © 2017 American Heart Association, Inc.
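    The routing rule that the simulation varies can be sketched directly: under assumed transport times and a screen result, a patient goes to the ESC only if the extra travel stays within the permissible limit. All names are illustrative.

```python
def route_destination(t_nearest, t_esc, screen_positive, max_extra_minutes):
    """Severity-based routing rule of the kind varied in the simulation:
    screen-positive patients go to the endovascular stroke center (ESC)
    only if the additional transport time stays within the permissible
    limit; otherwise to the nearest hospital."""
    if screen_positive and (t_esc - t_nearest) <= max_extra_minutes:
        return "ESC"
    return "nearest"

def triage_rates(patients, max_extra_minutes):
    """patients: iterable of (t_nearest, t_esc, screen_positive, has_lvo).
    Undertriage = LVO patients sent to the nearest hospital;
    overtriage = non-LVO patients routed to an ESC."""
    under = over = lvo = non_lvo = 0
    for t_n, t_e, positive, lvo_case in patients:
        dest = route_destination(t_n, t_e, positive, max_extra_minutes)
        if lvo_case:
            lvo += 1
            under += dest == "nearest"
        else:
            non_lvo += 1
            over += dest == "ESC"
    return under / max(lvo, 1), over / max(non_lvo, 1)
```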

  12. Temporal Clustering of Regional-Scale Extreme Precipitation Events in Southern Switzerland

    NASA Astrophysics Data System (ADS)

    Barton, Yannick; Giannakaki, Paraskevi; Von Waldow, Harald; Chevalier, Clément; Pfhal, Stephan; Martius, Olivia

    2017-04-01

    Temporal clustering of extreme precipitation events on subseasonal time scales is a form of compound extremes and is of crucial importance for the formation of large-scale flood events. Here, the temporal clustering of regional-scale extreme precipitation events in southern Switzerland is studied. These precipitation events are relevant for the flooding of lakes in southern Switzerland and northern Italy. This research determines whether temporal clustering is present and then identifies the dynamics that are responsible for the clustering. An observation-based gridded precipitation dataset of Swiss daily rainfall sums and ECMWF reanalysis datasets are used. To analyze the clustering in the precipitation time series, a modified version of Ripley's K function is used. It determines the average number of extreme events in a time period, characterizing temporal clustering on subseasonal time scales, and establishes the statistical significance of the clustering. Significant clustering of regional-scale precipitation extremes is found on subseasonal time scales during the fall season. Four high-impact clustering episodes are then selected and the dynamics responsible for the clustering are examined. During the four clustering episodes, all heavy precipitation events were associated with an upper-level breaking Rossby wave over western Europe, and in most cases strong diabatic processes upstream over the Atlantic played a role in the amplification of these breaking waves. Atmospheric blocking downstream over eastern Europe supported this wave breaking during two of the clustering episodes. During one of the clustering periods, several extratropical transitions of tropical cyclones in the Atlantic contributed to the formation of high-amplitude ridges over the Atlantic basin and downstream wave breaking. During another event, blocking over Alaska assisted the phase locking of the Rossby waves downstream over the Atlantic.
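    A plain (unmodified) one-dimensional Ripley's K for event times can be sketched as follows; under a homogeneous Poisson null the expectation is simply 2h for window h, and in practice significance is judged against simulated Poisson series:

```python
import numpy as np

def ripley_k_1d(event_times, window, t_total):
    """Temporal Ripley's K: the average number of further extreme
    events within +/- `window` of a typical event, normalised by the
    event rate. Values above 2 * window indicate temporal clustering.
    (A plain K function; the paper uses a modified version.)"""
    t = np.sort(np.asarray(event_times, dtype=float))
    n = t.size
    rate = n / t_total
    pairs = 0
    for i in range(n):
        pairs += np.sum(np.abs(t - t[i]) <= window) - 1  # exclude self
    return pairs / (n * rate)
```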

  13. Sensitivity analysis and calibration of a dynamic physically based slope stability model

    NASA Astrophysics Data System (ADS)

    Zieher, Thomas; Rutzinger, Martin; Schneider-Muntau, Barbara; Perzl, Frank; Leidinger, David; Formayer, Herbert; Geitner, Clemens

    2017-06-01

    Physically based modelling of slope stability on a catchment scale is still a challenging task. When applying a physically based model on such a scale (1 : 10 000 to 1 : 50 000), parameters with a high impact on the model result should be calibrated to account for (i) the spatial variability of parameter values, (ii) shortcomings of the selected model, (iii) uncertainties of laboratory tests and field measurements or (iv) parameters that cannot be derived experimentally or measured in the field (e.g. calibration constants). While systematic parameter calibration is a common task in hydrological modelling, this is rarely done using physically based slope stability models. In the present study a dynamic, physically based, coupled hydrological-geomechanical slope stability model is calibrated based on a limited number of laboratory tests and a detailed multitemporal shallow landslide inventory covering two landslide-triggering rainfall events in the Laternser valley, Vorarlberg (Austria). Sensitive parameters are identified based on a local one-at-a-time sensitivity analysis. These parameters (hydraulic conductivity, specific storage, angle of internal friction for effective stress, cohesion for effective stress) are systematically sampled and calibrated for a landslide-triggering rainfall event in August 2005. The identified model ensemble, including 25 behavioural model runs with the highest portion of correctly predicted landslides and non-landslides, is then validated with another landslide-triggering rainfall event in May 1999. The identified model ensemble correctly predicts the location and the supposed triggering timing of 73.0 % of the observed landslides triggered in August 2005 and 91.5 % of the observed landslides triggered in May 1999. Results of the model ensemble driven with raised precipitation input reveal a slight increase in areas potentially affected by slope failure. At the same time, the peak run-off increases more markedly, suggesting that precipitation intensities during the investigated landslide-triggering rainfall events were already close to or above the soil's infiltration capacity.
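    A generic local one-at-a-time loop of the kind used to identify sensitive parameters might look like this (the model interface and names are illustrative):

```python
def one_at_a_time(model, baseline, spans, metric):
    """Local one-at-a-time sensitivity analysis: perturb each parameter
    in turn over its span while all others stay at baseline, and record
    the resulting spread in the model's evaluation metric. `model` maps
    a parameter dict to a result object; `metric` scores that result."""
    reference = metric(model(baseline))
    effects = {}
    for name, values in spans.items():
        scores = []
        for v in values:
            params = dict(baseline, **{name: v})  # vary one parameter
            scores.append(metric(model(params)))
        effects[name] = max(scores) - min(scores)  # crude sensitivity span
    return reference, effects
```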

  14. A mobile robots experimental environment with event-based wireless communication.

    PubMed

    Guinaldo, María; Fábregas, Ernesto; Farias, Gonzalo; Dormido-Canto, Sebastián; Chaos, Dictino; Sánchez, José; Dormido, Sebastián

    2013-07-22

    An experimental platform for communication among a set of mobile robots through a wireless network has been developed. The mobile robots obtain their positions through a camera which acts as a sensor. The video images are processed in a PC, and a Waspmote card sends the corresponding position to each robot using the ZigBee standard. A distributed control algorithm based on event-triggered communications has been designed and implemented to bring the robots into the desired formation. Each robot communicates with its neighbors only at event times. Furthermore, a simulation tool has been developed to design and perform experiments with the system. An example of usage is presented.
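    A generic trigger of the kind used in such platforms: each robot re-broadcasts its state only when the deviation from the last transmitted value exceeds a threshold, saving bandwidth between event times. The paper's exact triggering condition may differ.

```python
import numpy as np

def maybe_broadcast(x_current, x_last_sent, threshold):
    """Event-triggered communication rule: return True when the state
    has drifted far enough from the last transmitted value that a new
    broadcast to the neighbours is warranted."""
    error = np.linalg.norm(np.asarray(x_current) - np.asarray(x_last_sent))
    return error > threshold

# usage inside each robot's control loop (radio.send is hypothetical):
#   if maybe_broadcast(x, x_sent, threshold=0.05):
#       radio.send(x)
#       x_sent = x
```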

  15. Potential of turbidity monitoring for real time control of pollutant discharge in sewers during rainfall events.

    PubMed

    Lacour, C; Joannis, C; Gromaire, M-C; Chebbo, G

    2009-01-01

    Turbidity sensors can be used to continuously monitor the evolution of pollutant mass discharge. For two sites within the Paris combined sewer system, continuous turbidity, conductivity and flow data were recorded at one-minute time intervals over a one-year period. This paper is intended to highlight the variability in turbidity dynamics during wet weather. For each storm event, turbidity response aspects were analysed through different classifications. The correlation between classification and common parameters, such as the antecedent dry weather period, total event volume per impervious hectare and both the mean and maximum hydraulic flow for each event, was also studied. Moreover, the dynamics of flow and turbidity signals were compared at the event scale. No simple relation between turbidity responses, hydraulic flow dynamics and the chosen parameters was derived from this effort. Knowledge of turbidity dynamics could therefore potentially improve wet weather management, especially when using pollution-based real-time control (P-RTC) since turbidity contains information not included in hydraulic flow dynamics and not readily predictable from such dynamics.

  16. Unreported seismic events found far off-shore Mexico using full-waveform, cross-correlation detection method.

    NASA Astrophysics Data System (ADS)

    Solano, Ericka Alinne; Hjorleifsdottir, Vala; Perez-Campos, Xyoli

    2015-04-01

    A large subset of seismic events does not have impulsive arrivals, such as low-frequency events in volcanoes, earthquakes in the shallow part of the subduction interface and further down-dip from the traditional seismogenic zone, glacial events, volcanic and non-volcanic tremor, and landslides. A suite of methods can be used to detect these non-impulsive events. One of these methods is full-waveform detection based on time-reversal methods (Solano et al., submitted to GJI). The method uses continuous observed seismograms, together with Green's functions and moment tensor responses calculated for an arbitrary 3D structure. This method was applied to the 2012 Ometepec-Pinotepa Nacional earthquake sequence in Guerrero, Mexico. During the time span of the study, we encountered three previously unknown events. One of these events was an impulsive earthquake in the Ometepec area that had clear arrivals at only three stations and was therefore not located and reported by the SSN. The other two events are previously undetected events, very depleted in high frequencies, that occurred far outside the search area. A rough estimate places these two events in the portion of the East Pacific Rise around 9° N. These two events were detected despite their distance from the search area, thanks to favorable move-out on the array of the Mexican National Seismological Service (SSN) network. We are expanding the study area to the EPR and to a longer period of time, with the objective of finding more events in that region. We will present an analysis of the newly detected events, as well as any further findings, at the meeting.
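    The generic matched-filter idea behind full-waveform detection can be sketched with a sliding normalised cross-correlation; note the study correlates synthetics built from Green's functions and moment-tensor responses rather than a recorded template as used here:

```python
import numpy as np

def correlation_detector(trace, template, threshold):
    """Sliding normalised cross-correlation (Pearson r in [-1, 1]) of a
    continuous trace against a waveform template; samples where the
    coefficient exceeds the threshold are declared detections."""
    m = template.size
    t = (template - template.mean()) / template.std()
    cc = np.zeros(trace.size - m + 1)
    for k in range(cc.size):
        w = trace[k:k + m]
        s = w.std()
        if s > 0:
            cc[k] = ((w - w.mean()) * t).sum() / (m * s)
    return np.flatnonzero(cc > threshold), cc
```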

  17. Using the Integration of Discrete Event and Agent-Based Simulation to Enhance Outpatient Service Quality in an Orthopedic Department.

    PubMed

    Kittipittayakorn, Cholada; Ying, Kuo-Ching

    2016-01-01

    Many hospitals are currently paying more attention to patient satisfaction since it is an important service quality index. Many Asian countries' healthcare systems have a mixed-type registration, accepting both walk-in patients and scheduled patients. This complex registration system causes a long patient waiting time in outpatient clinics. Different approaches have been proposed to reduce the waiting time. This study uses the integration of discrete event simulation (DES) and agent-based simulation (ABS) to improve patient waiting time and is the first attempt to apply this approach to solve this key problem faced by orthopedic departments. From the data collected, patient behaviors are modeled and incorporated into a massive agent-based simulation. The proposed approach is an aid for analyzing and modifying orthopedic department processes, allows us to consider far more details, and provides more reliable results. After applying the proposed approach, the total waiting time of the orthopedic department fell from 1246.39 minutes to 847.21 minutes. Thus, using the correct simulation model significantly reduces patient waiting time in an orthopedic department.

  19. Temporal Expectation and Information Processing: A Model-Based Analysis

    ERIC Educational Resources Information Center

    Jepma, Marieke; Wagenmakers, Eric-Jan; Nieuwenhuis, Sander

    2012-01-01

    People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information processing are speeded when subjects use such…

  20. Direct AFM observation of an opening event of a DNA cuboid constructed via a prism structure.

    PubMed

    Endo, Masayuki; Hidaka, Kumi; Sugiyama, Hiroshi

    2011-04-07

    A cuboid structure was constructed using a DNA origami design based on a square prism structure. The structure was characterized by atomic force microscopy (AFM) and dynamic light scattering. The real-time opening event of the cuboid was directly observed by high-speed AFM.

  1. Learning the Cardiac Cycle: Simultaneous Observations of Electrical and Mechanical Events.

    ERIC Educational Resources Information Center

    Kenney, Richard Alec; Frey, Mary Anne Bassett

    1980-01-01

    Described is a method for integrating electrical and mechanical events of the cardiac cycle by measuring systolic time intervals, which involves simultaneous recording of the ECG, a phonocardiogram, and the contour of the carotid pulse. Both resting and stress change data are provided as bases for class discussion. (CS)

  2. Semicompeting risks in aging research: methods, issues and needs

    PubMed Central

    Varadhan, Ravi; Xue, Qian-Li; Bandeen-Roche, Karen

    2015-01-01

    A semicompeting risks problem involves two types of events: a nonterminal and a terminal event (death). Typically, the nonterminal event is the focus of the study, but the terminal event can preclude the occurrence of the nonterminal event. Semicompeting risks are ubiquitous in studies of aging. Examples of semicompeting risk dyads include: dementia and death, frailty syndrome and death, disability and death, and nursing home placement and death. Semicompeting risk models can be divided into two broad classes: models based only on observable quantities (class O) and those based on potential (latent) failure times (class L). The classical illness-death model belongs to class O. This model is a special case of the multistate models, which have been an active area of methodology development. During the past decade and a half, there has also been a flurry of methodological activity on semicompeting risks based on latent failure times (class L models). These advances notwithstanding, semicompeting risks methodology has not penetrated biomedical research in general, or gerontological research in particular. Some possible reasons for this lack of uptake are: the methods are relatively new and sophisticated, the conceptual problems associated with potential failure time models are difficult to overcome, expository articles aimed at educating practitioners are scarce, and readily usable software is not available. The main goals of this review article are: (i) to describe the major types of semicompeting risks problems arising in aging research, (ii) to provide a brief survey of semicompeting risks methods, (iii) to suggest appropriate methods for addressing problems in aging research, (iv) to highlight areas where more work is needed, and (v) to suggest ways to facilitate the uptake of semicompeting risks methodology by the broader biomedical research community. PMID:24729136

  3. Capturing rogue waves by multi-point statistics

    NASA Astrophysics Data System (ADS)

    Hadjihosseini, A.; Wächter, Matthias; Hoffmann, N. P.; Peinke, J.

    2016-01-01

    As an example of a complex system with extreme events, we investigate ocean wave states exhibiting rogue waves. We present a statistical method of data analysis based on multi-point statistics which, for the first time, allows extreme rogue wave events to be captured in a statistically satisfactory manner. The key to the success of the approach is mapping the complexity of multi-point data onto the statistics of hierarchically ordered height increments for different time scales, for which we can show that a stochastic cascade process with Markov properties is governed by a Fokker-Planck equation. Conditional probabilities, as well as the Fokker-Planck equation itself, can be estimated directly from the available observational data. With this stochastic description, surrogate data sets can in turn be generated, which makes it possible to work out arbitrary statistical features of the complex sea state in general, and of extreme rogue wave events in particular. The results also open up new perspectives for forecasting the occurrence probability of extreme rogue wave events, and even for forecasting the occurrence of individual rogue waves based on precursory dynamics.

  4. Accident diagnosis system based on real-time decision tree expert system

    NASA Astrophysics Data System (ADS)

    Nicolau, Andressa dos S.; Augusto, João P. da S. C.; Schirru, Roberto

    2017-06-01

    Safety is one of the most studied topics in connection with power stations. For that reason, sensors and alarms play an important role in environmental and human protection. When an abnormal event happens, it triggers a chain of alarms that must be checked by the control-room operators. In this case, a diagnosis support system can help operators accurately identify the possible root cause of the problem in a short time. In this article, we present a computational model of a generic diagnosis support system based on artificial intelligence, applied to datasets from two real power stations: the Angra 1 Nuclear Power Plant and the Santo Antônio Hydroelectric Plant. The proposed system processes all the information logged in the sequence of events before a shutdown signal, using expert knowledge encoded in an expert system to indicate the chain of events from the shutdown signal back to its root cause. The results of both applications showed that the support system is a potential tool to help control-room operators identify abnormal events such as accidents and, consequently, increase safety.

  5. Practice-based learning and improvement: a two-year experience with the reporting of morbidity and mortality cases by general surgery residents.

    PubMed

    Falcone, John L; Lee, Kenneth K W; Billiar, Timothy R; Hamad, Giselle G

    2012-01-01

    The Accreditation Council for Graduate Medical Education (ACGME) core competency of practice-based learning and improvement can be assessed with the surgical Morbidity and Mortality Conference (MMC). We aim to describe the MMC reporting patterns of general surgery residents, describe the adverse event rate for patients and compare it with published rates, and describe the nature of our institutional adverse events. We hypothesize that reporting patterns and incidence rates will remain constant over time. In this retrospective cohort study, archived MMC case lists were evaluated from January 1, 2009 to December 31, 2010. The reporting patterns of the residents, the adverse event ratios, and the specific categories of adverse events were described across the academic years. χ² and Fisher's exact tests were used to compare across academic years, using α = 0.05. There were 85 surgical MMC case lists evaluated. Services achieved a reporting rate above 80% (p < 0.001). The most consistent reporting was done by postgraduate year (PGY) 5 chief residents for all services (p > 0.05). Of 11,368 patients evaluated from complete MMC submissions, 289 patients had an adverse event reported (2.5%). This was lower than published patient adverse event rates (p < 0.001). Adverse event rates were consistent for residents at the PGY 2, 4, and 5 levels for all services (p > 0.05). Over 2 years, 522 adverse events were reported for 461 patients. A majority of adverse events were deaths (24.1%), hematologic and/or vascular events (16.7%), and gastrointestinal system events (16.1%). Surgery resident MMC reporting patterns and adverse event rates are generally stable over time. This study shows which adverse event cases are important for chief residents to report. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  6. A Novel Approach to Estimating Nitrous Oxide Emissions during Wetting Events from Single-Timepoint Flux Measurements.

    PubMed

    Davis, Brian W; Needelman, Brian A; Cavigelli, Michel A; Yarwood, Stephanie A; Maul, Jude E; Bagley, Gwendolyn A; Mirsky, Steven B

    2017-03-01

    Precipitation and irrigation induce pulses of N2O emissions in agricultural soils, but the magnitude, duration, and timing of these pulses remain uncertain. This uncertainty makes it difficult to accurately extrapolate emissions from unmeasured time periods between chamber sampling events. Therefore, we developed a modeling protocol to predict N2O emissions from data collected daily for 7 d after wetting events. Within a cover crop-based corn (Zea mays L.) production system in Beltsville, MD, we conducted the 7-d time series during four time periods representing a range of corn growth stages in 2013 and 2014. Treatments included mixtures and monocultures of grass and legume cover crops that were fertilized with pelletized poultry litter or urea-ammonium nitrate solution (9-276 kg N ha-1). Most fluxes did not exhibit the expected exponential decay over time (82%); therefore, cumulative emissions were calculated using trapezoidal integration over the 7 d after the wetting event. Cumulative 7-d emissions were well correlated with single-point gas fluxes on the second day after a wetting event using a generalized linear mixed model (ln[emissions] = 0.809·ln[flux] + 2.47). Soil chemical covariates before or after a wetting event were only weakly associated with cumulative emissions. The ratio of dissolved organic C to total inorganic N was negatively correlated with cumulative emissions (R2 = 0.23-0.29), whereas nitrate was positively correlated with cumulative emissions (R2 = 0.23-0.33). Our model is an innovative approach that is calibrated using site-specific time series data, which may then be used to estimate short-term N2O emissions after wetting events using only a single flux measurement. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
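    Since the fitted relation is given in the abstract, the single-point estimator it implies is a one-liner; note that no bias correction for the log back-transform is applied here, and units follow the study's:

```python
import numpy as np

def cumulative_emissions(day2_flux):
    """Back-transform of the fitted relation
    ln(emissions) = 0.809 * ln(flux) + 2.47, giving estimated
    cumulative 7-day N2O emissions from a single flux measured on the
    second day after a wetting event."""
    return np.exp(0.809 * np.log(day2_flux) + 2.47)
```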

  7. Influential factors of red-light running at signalized intersection and prediction using a rare events logistic regression model.

    PubMed

    Ren, Yilong; Wang, Yunpeng; Wu, Xinkai; Yu, Guizhen; Ding, Chuan

    2016-10-01

    Red-light running (RLR) has become a major safety concern at signalized intersections. To prevent RLR-related crashes, it is critical to identify the factors that significantly influence drivers' RLR behavior, and to predict potential RLR in real time. In this research, nine months of RLR events, extracted from high-resolution traffic data collected by loop detectors at three signalized intersections, were used to identify the factors that significantly affect RLR behavior. The data analysis indicated that occupancy time, time gap, used yellow time, time left to yellow start, whether the preceding vehicle runs through the intersection during yellow, and whether there is a vehicle passing through the intersection on the adjacent lane were significant factors for RLR behavior. Furthermore, due to the rare-events nature of RLR, a modified rare events logistic regression model was developed for RLR prediction. The rare events logistic regression method has been applied in many fields and shows impressive performance, but no previous research has applied this method to study RLR. The results showed that the rare events logistic regression model performed significantly better than the standard logistic regression model. More importantly, the proposed RLR prediction method is based purely on loop detector data collected from a single advance loop detector located 400 feet upstream of the stop bar. This brings great potential for future field applications, since loops have been widely implemented at many intersections and can collect data in real time. This research is expected to contribute significantly to the improvement of intersection safety. Copyright © 2016 Elsevier Ltd. All rights reserved.
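    The paper's specific modification is not given in the abstract; a common generic rare-events correction (a King-Zeng style prior correction of the intercept) is sketched below as a stand-in:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def rare_event_logit(X, y, tau):
    """Logistic regression with a King-Zeng style prior correction:
    fit an ordinary logit, then shift the intercept so predicted
    probabilities match the true event fraction tau in the population.
    A generic rare-events correction, not necessarily the paper's
    exact modification."""
    model = LogisticRegression(max_iter=1000).fit(X, y)
    y_bar = y.mean()  # event fraction in the (possibly enriched) sample
    model.intercept_ -= np.log(((1 - tau) / tau) * (y_bar / (1 - y_bar)))
    return model
```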

  8. Regularity of a renewal process estimated from binary data.

    PubMed

    Rice, John D; Strawderman, Robert L; Johnson, Brent A

    2017-10-09

    Assessment of the regularity of a sequence of events over time is important for clinical decision-making as well as informing public health policy. Our motivating example involves determining the effect of an intervention on the regularity of HIV self-testing behavior among high-risk individuals when exact self-testing times are not recorded. Assuming that these unobserved testing times follow a renewal process, the goals of this work are to develop suitable methods for estimating its distributional parameters when only the presence or absence of at least one event per subject in each of several observation windows is recorded. We propose two approaches to estimation and inference: a likelihood-based discrete survival model using only time to first event; and a potentially more efficient quasi-likelihood approach based on the forward recurrence time distribution using all available data. Regularity is quantified and estimated by the coefficient of variation (CV) of the interevent time distribution. Focusing on the gamma renewal process, where the shape parameter of the corresponding interevent time distribution has a monotone relationship with its CV, we conduct simulation studies to evaluate the performance of the proposed methods. We then apply them to our motivating example, concluding that the use of text message reminders significantly improves the regularity of self-testing, but not its frequency. A discussion on interesting directions for further research is provided. © 2017, The International Biometric Society.
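    Two standard facts anchor the approach for the gamma renewal process (notation mine):

```latex
% For gamma-distributed interevent times with shape k and scale \theta,
% the coefficient of variation depends on the shape alone:
\mathrm{CV} = \frac{\sqrt{k\,\theta^{2}}}{k\,\theta} = \frac{1}{\sqrt{k}},
% so k > 1 (CV < 1) indicates more-regular-than-Poisson testing and
% k < 1 the opposite. The forward recurrence time used by the
% quasi-likelihood approach has density
f_F(t) = \frac{S(t)}{\mu},
% where S is the interevent survival function and \mu = k\theta the
% mean interevent time.
```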

  9. Radar rainfall estimation in the context of post-event analysis of flash-flood events

    NASA Astrophysics Data System (ADS)

    Delrieu, G.; Bouilloud, L.; Boudevillain, B.; Kirstetter, P.-E.; Borga, M.

    2009-09-01

    This communication presents a methodology for radar rainfall estimation in the context of post-event analysis of flash-flood events, developed within the HYDRATE project. For such extreme events, some raingauge observations (operational, amateur) are available at the event time scale, while few raingauge time series are generally available at hydrologic time steps. Radar data are therefore the only way to access the space-time organization of rainfall, but their quality may be highly variable as a function of (1) the relative locations of the event and the radar(s) and (2) the radar operating protocol(s) and maintenance. A positive point: heavy rainfall is associated with convection, implying better visibility and less bright-band contamination compared with more common situations. In parallel with the development of a regionalized and adaptive radar data processing system (TRADHy; Delrieu et al. 2009), a pragmatic approach is proposed here to make the best use of the available radar and raingauge data for a given flash-flood event by: (1) identifying and removing residual ground clutter; (2) applying the "hydrologic visibility" concept (Pellarin et al. 2002) to correct for range-dependent errors (screening and VPR effects for non-attenuating wavelengths); and (3) estimating an effective Z-R relationship through a radar-raingauge optimization approach to remove the mean field bias (Dinku et al. 2002). A sensitivity study, based on the high-quality volume radar datasets collected during two intense rainfall events of the Bollène 2002 experiment (Delrieu et al. 2009), is first proposed. Then the method is implemented for two other historical events that occurred in France (Avène 1997 and Aude 1999), with datasets of lesser quality. References: Delrieu, G., B. Boudevillain, J. Nicol, B. Chapon, P.-E. Kirstetter, H. Andrieu, and D. Faure, 2009: Bollène 2002 experiment: radar rainfall estimation in the Cévennes-Vivarais region, France. Journal of Applied Meteorology and Climatology, in press. Dinku, T., E.N. Anagnostou, and M. Borga, 2002: Improving Radar-Based Estimation of Rainfall over Complex Terrain. J. Appl. Meteor., 41, 1163-1178. Pellarin, T., G. Delrieu, G. M. Saulnier, H. Andrieu, B. Vignal, and J. D. Creutin, 2002: Hydrologic visibility of weather radar systems operating in mountainous regions: Case study for the Ardèche Catchment (France). Journal of Hydrometeorology, 3, 539-555.

  10. Evaluation of W Phase CMT Based PTWC Real-Time Tsunami Forecast Model Using DART Observations: Events of the Last Decade

    NASA Astrophysics Data System (ADS)

    Wang, D.; Becker, N. C.; Weinstein, S.; Duputel, Z.; Rivera, L. A.; Hayes, G. P.; Hirshorn, B. F.; Bouchard, R. H.; Mungov, G.

    2017-12-01

    The Pacific Tsunami Warning Center (PTWC) began forecasting tsunamis in real time using source parameters derived from real-time Centroid Moment Tensor (CMT) solutions in 2009. Both the USGS and PTWC typically obtain W-phase CMT solutions for large earthquakes less than 30 minutes after earthquake origin time. Within seconds, and often before waves reach the nearest deep-ocean bottom pressure sensors (DARTs), PTWC then generates a regional tsunami propagation forecast using its linear shallow water model. The model is initialized by a sea surface deformation that mimics the seafloor deformation based on Okada's (1985) dislocation model of a rectangular fault with uniform slip. The fault length and width are empirical functions of the seismic moment. How well did this simple model perform? The DART records provide a very valuable dataset for model validation. We examine tsunami events of the last decade with earthquake magnitudes ranging from 6.5 to 9.0, including some deep events for which tsunamis were not expected. Most of the forecast results were obtained during the events. We also include events from before the implementation of the WCMT method at USGS and PTWC (2006-2009); for these events, WCMTs were computed retrospectively (Duputel et al. 2012). We also re-ran the model with a larger domain for some events, using source parameters identical to those used during the events, to include far-field DARTs that recorded a tsunami. We conclude that our model results, in terms of maximum wave amplitude, are mostly within a factor of two of those observed at DART stations, with an average error of less than 40% for most events, including the 2010 Maule and the 2011 Tohoku tsunamis. However, the simple fault model with uniform slip is too simplistic for the Tohoku tsunami. We note that model results are sensitive to centroid location and depth, especially if the earthquake is close to land or inland. For the 2016 M7.8 New Zealand earthquake, the initial forecast underestimated the tsunami because the initial WCMT centroid was on land (the epicenter was inland, but most of the slip occurred offshore); later WCMTs did provide a better forecast. The model also failed to reproduce the observed tsunamis from earthquake-generated landslides. Sea level observations during the events are crucial in determining whether or not a forecast needs to be adjusted.

  11. Spatio-Temporal Story Mapping Animation Based On Structured Causal Relationships Of Historical Events

    NASA Astrophysics Data System (ADS)

    Inoue, Y.; Tsuruoka, K.; Arikawa, M.

    2014-04-01

    In this paper, we propose a user interface that displays visual animations on geographic maps and timelines to depict historical stories by representing causal relationships among events over time. We have been developing an experimental software system for the spatio-temporal visualization of historical stories on tablet computers. Our proposed system helps people learn historical stories effectively through visual animations based on hierarchical structures of timelines and maps at different scales.

  12. NEVER forget: negative emotional valence enhances recapitulation.

    PubMed

    Bowen, Holly J; Kark, Sarah M; Kensinger, Elizabeth A

    2018-06-01

    A hallmark feature of episodic memory is that of "mental time travel," whereby an individual feels they have returned to a prior moment in time. Cognitive and behavioral neuroscience methods have revealed a neurobiological counterpart: Successful retrieval often is associated with reactivation of a prior brain state. We review the emerging literature on memory reactivation and recapitulation, and we describe evidence for the effects of emotion on these processes. Based on this review, we propose a new model: Negative Emotional Valence Enhances Recapitulation (NEVER). This model diverges from existing models of emotional memory in three key ways. First, it underscores the effects of emotion during retrieval. Second, it stresses the importance of sensory processing to emotional memory. Third, it emphasizes how emotional valence - whether an event is negative or positive - affects the way that information is remembered. The model specifically proposes that, as compared to positive events, negative events both trigger increased encoding of sensory detail and elicit a closer resemblance between the sensory encoding signature and the sensory retrieval signature. The model also proposes that negative valence enhances the reactivation and storage of sensory details over offline periods, leading to a greater divergence between the sensory recapitulation of negative and positive memories over time. Importantly, the model proposes that these valence-based differences occur even when events are equated for arousal, thus rendering an exclusively arousal-based theory of emotional memory insufficient. We conclude by discussing implications of the model and suggesting directions for future research to test the tenets of the model.

  13. Foretelling Flares and Solar Energetic Particle Events: the FORSPEF tool

    NASA Astrophysics Data System (ADS)

    Anastasiadis, Anastasios; Papaioannou, Athanasios; Sandberg, Ingmar; Georgoulis, Manolis K.; Tziotziou, Kostas; Jiggens, Piers

    2017-04-01

    A novel integrated prediction system for both solar flares (SFs) and solar energetic particle (SEP) events is presented. The Forecasting Solar Particle Events and Flares (FORSPEF) system provides forecasting of solar eruptive events, such as SFs, with a projection to coronal mass ejections (CMEs) (occurrence and velocity) and the likelihood of occurrence of an SEP event. In addition, FORSPEF also provides nowcasting of SEP events based on actual SF and CME near-real-time data, as well as the complete SEP profile (peak flux, fluence, rise time, duration) per parent solar event. The prediction of SFs relies on a morphological method, the effective connected magnetic field strength (Beff); it is based on an assessment of potentially flaring active-region (AR) magnetic configurations and utilizes a sophisticated analysis of a large number of AR magnetograms. For the prediction of SEP events, new methods have been developed for both the likelihood of SEP occurrence and the expected SEP characteristics. In particular, using the location of the flare (longitude) and the flare size (maximum soft X-ray intensity), a reductive statistical method has been implemented. Moreover, employing CME parameters (velocity and width), proper functions per width (i.e., halo, partial halo, non-halo) and integral energy (E>30, 60, 100 MeV) have been identified. In our technique, warnings are issued for all >C1.0 soft X-ray flares. The prediction time in the forecasting scheme extends to 24 hours with a refresh rate of 3 hours, while the respective prediction time for the nowcasting scheme depends on the availability of the near-real-time data and is 15-20 minutes for solar flares and 6 hours for CMEs. We present the modules of the FORSPEF system, their interconnection, and the operational setup. The dual approach in the development of FORSPEF (i.e., forecasting and nowcasting schemes) permits the refinement of predictions upon the availability of new data that characterize changes on the Sun and in interplanetary space, while the combined usage of SF and SEP forecasting methods makes FORSPEF an integrated forecasting solution. Finally, we demonstrate the validation of the modules of the FORSPEF tool using categorical scores constructed on archived data, and we further discuss independent case studies. This work has been funded through the "FORSPEF: FORecasting Solar Particle Events and Flares" ESA Contract No. 4000109641/13/NL/AK and the "SPECS: Solar Particle Events foreCasting Studies" project of the National Observatory of Athens.

  14. A quantile-based Time at Risk: A new approach for assessing risk in financial markets

    NASA Astrophysics Data System (ADS)

    Bolgorian, Meysam; Raei, Reza

    2013-11-01

    In this paper, we provide a new measure for the evaluation of risk in financial markets. This measure is based on the return interval of critical events in financial markets or other investment situations. Our main goal was to devise a model analogous to Value at Risk (VaR). Just as VaR, for a given financial asset, probability level, and time horizon, gives a critical value such that the probability that the loss on the asset over the time horizon exceeds this value equals the given probability level, our concept of Time at Risk (TaR), using the probability distribution function of return intervals, provides a critical time such that the probability that the return interval of a critical event exceeds this time equals the given probability level. As an empirical application, we applied our model to data from the Tehran Stock Exchange Price Index (TEPIX) as a financial asset (market portfolio) and report the results.
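
    A minimal sketch of the TaR idea using an empirical interval distribution: the critical time is the (1 - p) quantile of the observed return intervals of critical events. The return series and loss threshold below are synthetic stand-ins.

```python
import numpy as np

def time_at_risk(return_intervals, p=0.05):
    """Critical time T such that P(return interval > T) = p,
    i.e. the (1 - p) empirical quantile of the observed intervals."""
    return np.quantile(np.asarray(return_intervals), 1.0 - p)

# Synthetic daily returns; a "critical event" is a loss worse than -2%.
rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=5000) * 0.01
event_days = np.flatnonzero(returns < -0.02)
intervals = np.diff(event_days)
print(f"TaR at p = 0.05: {time_at_risk(intervals):.0f} days")
```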

  15. Transient modeling in simulation of hospital operations for emergency response.

    PubMed

    Paul, Jomon Aliyas; George, Santhosh K; Yi, Pengfei; Lin, Li

    2006-01-01

    Rapid estimates of hospital capacity after an event that may cause a disaster can assist disaster-relief efforts. Because of the dynamics of hospitals following such an event, it is necessary to model the behavior of the system accurately. A transient modeling approach using simulation and exponential functions is presented, along with its application to an earthquake scenario. The parameters of the exponential model are regressed from the outputs of designed simulation experiments. The developed model is capable of representing transient patient waiting times during a disaster. Most importantly, the modeling approach allows real-time capacity estimation for hospitals of various sizes and capabilities. Further, this research analyzes the effects of priority-based routing of patients within the hospital on patient waiting times under various patient mixes. The model routes patients based on the severity of their injuries and queues patients requiring critical care according to their remaining survivability time. The model also accounts for the impact of prehospital transport time on patient waiting time.

  16. Developing a Signature Based Safeguards Approach for the Electrorefiner and Salt Cleanup Unit Operations in Pyroprocessing Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, Chantell Lynne-Marie

    Traditional nuclear materials accounting does not work well for safeguards when applied to pyroprocessing. Alternate methods such as Signature Based Safeguards (SBS) are being investigated. The goal of SBS is real-time/near-real-time detection of anomalous events in the pyroprocessing facility, as they could indicate loss of special nuclear material. In high-throughput reprocessing facilities, metric tons of separated material are processed that must be accounted for. Even with very low uncertainties of accountancy measurements (<0.1%), the uncertainty of the material balances is still greater than the desired level. Novel contributions of this work are as follows: (1) significant enhancement of SBS development for the salt cleanup process by creating a new gas sparging process model, selecting sensors to monitor normal operation, identifying safeguards-significant off-normal scenarios, and simulating those off-normal events and generating sensor output; (2) further enhancement of SBS development for the electrorefiner by simulating off-normal events caused by changes in salt concentration and identifying which conditions lead to Pu and Cm not tracking throughout the rest of the system; and (3) a new contribution in applying statistical techniques to analyze the signatures gained from these two models to help draw real-time conclusions on anomalous events.

  17. Neuromorphic Event-Based 3D Pose Estimation

    PubMed Central

    Reverter Valeiras, David; Orchard, Garrick; Ieng, Sio-Hoi; Benosman, Ryad B.

    2016-01-01

    Pose estimation is a fundamental step in many artificial vision tasks. It consists of estimating the 3D pose of an object with respect to a camera from the object's 2D projection. Current state-of-the-art implementations operate on images. These implementations are computationally expensive, especially for real-time applications. Scenes with fast dynamics exceeding 30–60 Hz can rarely be processed in real time using conventional hardware. This paper presents a new method for event-based 3D object pose estimation, making full use of the high temporal resolution (1 μs) of asynchronous visual events output from a single neuromorphic camera. Given an initial estimate of the pose, each incoming event is used to update the pose by combining both 3D and 2D criteria. We show that the asynchronous high temporal resolution of the neuromorphic camera allows us to solve the problem in an incremental manner, achieving real-time performance at an update rate of several hundred kHz on a conventional laptop. We show that the high temporal resolution of neuromorphic cameras is a key feature for performing accurate pose estimation. Experiments are provided showing the performance of the algorithm on real data, including fast moving objects, occlusions, and cases where the neuromorphic camera and the object are both in motion. PMID:26834547
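
    The paper's full 3D update rule is not given in the abstract; the toy sketch below conveys the event-driven principle in a deliberately simplified 2D setting, where each incoming event individually nudges a rigid transform toward agreement with the object outline. The model points, step size, and the 2D reduction are all illustrative assumptions.

```python
import numpy as np

model = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)  # toy object outline

def update_pose(theta, t, event_xy, lr=0.05):
    """Incrementally update a 2D rigid transform from a single event."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    proj = model @ R.T + t                            # current model projection
    j = np.argmin(np.sum((proj - event_xy) ** 2, axis=1))
    residual = event_xy - proj[j]                     # pull toward the event
    t = t + lr * residual                             # translation step
    arm = proj[j] - proj.mean(axis=0)                 # lever arm about centroid
    theta += lr * (arm[0] * residual[1] - arm[1] * residual[0])  # rotation step
    return theta, t

theta, t = 0.0, np.array([0.0, 0.0])
for ev in np.array([[0.1, 0.1], [1.2, 0.05], [1.1, 1.1]]):   # toy event stream
    theta, t = update_pose(theta, t, ev)
print(theta, t)
```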

  18. Predicting performance and safety based on driver fatigue.

    PubMed

    Mollicone, Daniel; Kan, Kevin; Mott, Chris; Bartels, Rachel; Bruneau, Steve; van Wollen, Matthew; Sparrow, Amy R; Van Dongen, Hans P A

    2018-04-02

    Fatigue causes decrements in vigilant attention and reaction time and is a major safety hazard in the trucking industry. There is a need to quantify the relationship between driver fatigue and safety in terms of operationally relevant measures. Hard-braking events are a suitable measure for this purpose as they are relatively easily observed and are correlated with collisions and near-crashes. We developed an analytic approach that predicts driver fatigue based on a biomathematical model and then estimates hard-braking events as a function of predicted fatigue, controlling for time of day to account for systematic variations in exposure (traffic density). The analysis used de-identified data from a previously published, naturalistic field study of 106 U.S. commercial motor vehicle (CMV) drivers. Data analyzed included drivers' official duty logs, sleep patterns measured around the clock using wrist actigraphy, and continuous recording of vehicle data to capture hard-braking events. The curve relating predicted fatigue to hard-braking events showed that the frequency of hard-braking events increased as predicted fatigue levels worsened. For each increment on the fatigue scale, the frequency of hard-braking events increased by 7.8%. The results provide proof of concept for a novel approach that predicts fatigue based on drivers' sleep patterns and estimates driving performance in terms of an operational metric related to safety. The approach can be translated to practice by CMV operators to achieve a fatigue risk profile specific to their own settings, in order to support data-driven decisions about fatigue countermeasures that cost-effectively deliver quantifiable operational benefits. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Time-to-event methodology improved statistical evaluation in register-based health services research.

    PubMed

    Bluhmki, Tobias; Bramlage, Peter; Volk, Michael; Kaltheuner, Matthias; Danne, Thomas; Rathmann, Wolfgang; Beyersmann, Jan

    2017-02-01

    Complex longitudinal sampling and the observational structure of patient registers in health services research are associated with methodological challenges regarding data management and statistical evaluation. We exemplify common pitfalls and want to stimulate discussion on the design, development, and deployment of future longitudinal patient registers and register-based studies. For illustrative purposes, we use data from the prospective, observational, German DIabetes Versorgungs-Evaluation register. One aim was to explore predictors of the initiation of basal insulin supported therapy in patients with type 2 diabetes initially prescribed glucose-lowering drugs alone. Major challenges are missing mortality information, time-dependent outcomes, delayed study entries, different follow-up times, and competing events. We show that time-to-event methodology is a valuable tool for improved statistical evaluation of register data and should be preferred to simple case-control approaches. Patient registers provide rich data sources for health services research. Analyses are accompanied by a trade-off between data availability, clinical plausibility, and statistical feasibility. The Cox proportional hazards model allows for the evaluation of outcome-specific hazards, but prediction of outcome probabilities is compromised by missing mortality information. Copyright © 2016 Elsevier Inc. All rights reserved.
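
    As a minimal illustration of the advocated time-to-event approach, a Cox model with delayed study entry (left truncation) can be fitted with the lifelines package; the register columns below are hypothetical stand-ins, and competing events would additionally call for cause-specific hazard models.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical register extract: follow-up from first prescription (years),
# indicator for basal-insulin initiation, and two baseline covariates.
df = pd.DataFrame({
    "entry":   [0.0, 0.3, 0.0, 1.1, 0.5, 0.0],   # delayed study entries
    "time":    [2.5, 1.9, 4.0, 3.2, 2.8, 5.1],   # end of follow-up
    "insulin": [1, 0, 1, 0, 1, 0],               # 1 = basal insulin initiated
    "age":     [61, 55, 70, 48, 66, 59],
    "hba1c":   [8.1, 7.2, 9.0, 6.8, 8.5, 7.0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="insulin", entry_col="entry")
cph.print_summary()   # outcome-specific hazard ratios for age and HbA1c
```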

  20. Analysis and verification of a prediction model of solar energetic proton events

    NASA Astrophysics Data System (ADS)

    Wang, J.; Zhong, Q.

    2017-12-01

    Solar energetic particle events can cause severe radiation damage near Earth. The alerts and summary products for solar energetic proton events are provided by the Space Environment Prediction Center (SEPC) according to the flux of greater than 10 MeV protons measured by the GOES satellites in geosynchronous orbit. The start of a solar energetic proton event is defined as the time when the flux of greater than 10 MeV protons equals or exceeds 10 proton flux units (pfu). In this study, a model was developed to predict solar energetic proton events and provide warnings minutes in advance, based on both the soft X-ray flux and the integral proton flux taken by GOES. The quality of the forecast model was measured through verification of accuracy, reliability, discrimination capability, and forecast skill. The peak flux and rise time of the solar energetic proton events in six channels (>1 MeV, >5 MeV, >10 MeV, >30 MeV, >50 MeV, >100 MeV) were also simulated and analyzed.
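
    The operational event definition reduces to a threshold crossing; a minimal sketch, assuming synthetic 5-minute samples of the >10 MeV integral flux:

```python
import numpy as np

PFU_THRESHOLD = 10.0   # event start: >10 MeV proton flux reaches 10 pfu

def sep_event_start(times, flux_gt10mev):
    """Return the first time at which the >10 MeV proton flux >= 10 pfu."""
    idx = np.flatnonzero(np.asarray(flux_gt10mev) >= PFU_THRESHOLD)
    return times[idx[0]] if idx.size else None

times = np.arange(0, 60, 5)   # minutes (synthetic cadence)
flux = np.array([0.3, 0.4, 0.6, 1.2, 3.5, 8.9, 12.4, 30.1, 55.0, 48.2, 40.0, 33.3])
print(sep_event_start(times, flux))   # -> 30
```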

  1. Designing an 'expert knowledge' based approach for the quantification of historical floods - the case study of the Kinzig catchment in Southwest Germany

    NASA Astrophysics Data System (ADS)

    Bösmeier, Annette; Glaser, Rüdiger; Stahl, Kerstin; Himmelsbach, Iso; Schönbein, Johannes

    2017-04-01

    Future estimates of flood hazard and risk for developing optimal coping and adaptation strategies inevitably include considerations of the frequency and magnitude of past events. Methods of historical climatology represent one way of assessing flood occurrences beyond the period of instrumental measurements and can thereby substantially help to extend the view into the past and to improve modern risk analysis. Such historical information can be of additional value and has been used in statistical approaches such as Bayesian flood frequency analysis in recent years. However, the derivation of quantitative values from the vague descriptive information of historical sources remains a crucial challenge. We explored possibilities for the parametrization of descriptive flood-related data, specifically for the assessment of historical floods, in a framework that combines a hermeneutical approach with mathematical and statistical methods. This study forms part of the transnational Franco-German research project TRANSRISK2 (2014-2017), funded by ANR and DFG, which explores the flood history of the last 300 years in the Upper and Middle Rhine regions. A broad database of flood events dating back to AD 1500 has been compiled. The events were classified using hermeneutical methods according to intensity, spatial dimension, temporal structure, damages, and mitigation measures associated with the specific events. This indexed database allowed the exploration of a link between descriptive data and quantitative information for the period in which classified floods and instrumental measurements overlap, since the end of the 19th century. Flood peak discharges, as a quantitative measure of the severity of a flood, were used to assess the discharge intervals (upper and lower thresholds) for flood classes within different time intervals, validating the flood classification and examining the trend in the perception threshold over time. Furthermore, within a suitable time period, flood classes and other quantifiable indicators of flood intensity (number of damaged locations mentioned in historical sources, general availability of reports associated with a specific event) were combined with available peak discharge measurements. We argue that this information can be considered 'expert knowledge' and used it to develop a fuzzy rule based model for deriving peak discharge estimates of pre-instrumental events that can finally be introduced into a flood frequency analysis.

  2. Physics-based, Bayesian sequential detection method and system for radioactive contraband

    DOEpatents

    Candy, James V; Axelrod, Michael C; Breitfeller, Eric F; Chambers, David H; Guidry, Brian L; Manatt, Douglas R; Meyer, Alan W; Sale, Kenneth E

    2014-03-18

    A distributed sequential method and system for detecting and identifying radioactive contraband from highly uncertain (noisy), low-count radionuclide measurements, i.e. an event mode sequence (EMS), using a statistical approach based on Bayesian inference and physics-model-based signal processing that represents a radionuclide as a decomposition into monoenergetic sources. For a given photon event of the EMS, the appropriate monoenergy processing channel is determined using a confidence-interval discriminator on the energy amplitude and interarrival time; parameter estimates are then used to update a measured probability density function estimate for a target radionuclide. A sequential likelihood ratio test is then used to determine one of two threshold conditions signifying that the EMS is either identified as the target radionuclide or not; if not, the process is repeated for the next sequential photon event of the EMS until one of the two threshold conditions is satisfied.
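
    The decision stage is a classical Wald sequential probability ratio test; here is a generic sketch, assuming the per-photon log-likelihood-ratio increments have already been computed from the energy and interarrival-time densities (the error rates and increment stream are illustrative):

```python
import numpy as np

def sprt(log_lr_increments, alpha=0.01, beta=0.01):
    """Wald sequential test over per-photon log-likelihood-ratio increments.

    Accumulates evidence event by event and stops at Wald's thresholds:
    upper -> target radionuclide identified, lower -> rejected.
    """
    upper = np.log((1 - beta) / alpha)
    lower = np.log(beta / (1 - alpha))
    llr = 0.0
    for n, inc in enumerate(log_lr_increments, start=1):
        llr += inc
        if llr >= upper:
            return "target", n
        if llr <= lower:
            return "non-target", n
    return "undecided", len(log_lr_increments)

rng = np.random.default_rng(3)
increments = rng.normal(0.1, 1.0, size=1000)   # hypothetical evidence stream
print(sprt(increments))
```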

  3. Prospective memory deficits in illicit polydrug users are associated with the average long-term typical dose of ecstasy typically consumed in a single session.

    PubMed

    Gallagher, Denis T; Hadjiefthyvoulou, Florentia; Fisk, John E; Montgomery, Catharine; Robinson, Sarita J; Judge, Jeannie

    2014-01-01

    Neuroimaging evidence suggests that ecstasy-related reductions in SERT densities relate more closely to the number of tablets typically consumed per session than to estimated total lifetime use. To better understand the basis of drug-related deficits in prospective memory (PM), we explored the association between PM and the average long-term typical dose and long-term frequency of use. Study 1: Sixty-five ecstasy/polydrug users and 85 non-ecstasy users completed an event-based, a short-term time-based, and a long-term time-based PM task. Study 2: Study 1 data were merged with outcomes on the same PM measures from a previous study, creating a combined sample of 103 ecstasy/polydrug users, 38 cannabis-only users, and 65 nonusers of illicit drugs. Study 1: Ecstasy/polydrug users had significant impairments on all PM outcomes compared with non-ecstasy users. Study 2: Ecstasy/polydrug users were impaired in event-based PM compared with both other groups and in long-term time-based PM compared with nonusers of illicit drugs. Both drug-using groups did worse on the short-term time-based PM task than nonusers. A higher long-term average typical dose of ecstasy was associated with poorer performance on the event-based and short-term time-based PM tasks and accounted for unique variance in the two PM measures over and above the variance associated with cannabis and cocaine use. The typical ecstasy dose consumed in a single session is an important predictor of PM impairments, with higher doses, reflecting increasing tolerance, giving rise to greater PM impairment.

  4. Three-dimensional, position-sensitive radiation detection

    DOEpatents

    He, Zhong; Zhang, Feng

    2010-04-06

    Disclosed herein is a method of determining a characteristic of radiation detected by a radiation detector via a multiple-pixel event having a plurality of radiation interactions. The method includes determining a cathode-to-anode signal ratio for a selected interaction of the plurality of radiation interactions based on electron drift time data for the selected interaction, and determining the radiation characteristic for the multiple-pixel event based on both the cathode-to-anode signal ratio and the electron drift time data. In some embodiments, the method further includes determining a correction factor for the radiation characteristic based on an interaction depth of the plurality of radiation interactions, a lateral distance between the selected interaction and a further interaction of the plurality of radiation interactions, and the lateral positioning of the plurality of radiation interactions.

  5. Digital disease detection: A systematic review of event-based internet biosurveillance systems.

    PubMed

    O'Shea, Jesse

    2017-05-01

    Internet access and usage have changed how people seek and report health information. Meanwhile, infectious diseases continue to threaten humanity. The analysis of Big Data, or vast digital data, presents an opportunity to improve disease surveillance and epidemic intelligence. Epidemic intelligence contains two components: indicator-based and event-based. A relatively new surveillance type has emerged, called event-based Internet biosurveillance systems. These systems use information on events impacting health from Internet sources, such as social media or news aggregates. They circumvent the limitations of traditional reporting systems by being inexpensive, transparent, and flexible, yet their innovations and functionality can change rapidly. This review aims to update the current state of knowledge on event-based Internet biosurveillance systems by identifying all such systems and their current functionality, in order to aid decision makers in deciding whether to incorporate new methods into comprehensive programmes of surveillance. A systematic review was performed through the PubMed, Scopus, and Google Scholar databases, also including grey literature and other publication types. Fifty event-based Internet systems, described in 99 articles, were identified, and 15 attributes were extracted for each system. Each system uses different innovative technology and data sources to gather, process, and disseminate data to detect infectious disease outbreaks. The review emphasises the importance of using both formal and informal sources for timely and accurate infectious disease outbreak surveillance, and catalogues all event-based Internet biosurveillance systems. Future researchers will thus be able to use this review as a library for referencing systems, with a view to learning from, building, and expanding Internet-based surveillance systems. Event-based Internet biosurveillance should act as an extension of traditional systems, utilised as an additional, supplemental data source to provide a more comprehensive estimate of disease burden. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Event-based knowledge elicitation of operating room management decision-making using scenarios adapted from information systems data

    PubMed Central

    2011-01-01

    Background No systematic process has previously been described for a needs assessment that identifies the operating room (OR) management decisions made by the anesthesiologists and nurse managers at a facility that do not maximize the efficiency of use of OR time. We evaluated whether event-based knowledge elicitation can be used practically for rapid assessment of OR management decision-making at facilities, whether scenarios can be adapted automatically from information systems data, and the usefulness of the approach. Methods A process of event-based knowledge elicitation was developed to assess OR management decision-making that may reduce the efficiency of use of OR time. Hypothetical scenarios addressing every OR management decision influencing OR efficiency were created from published examples. Scenarios are adapted, so that cues about conditions are accurate and appropriate for each facility (e.g., if OR 1 is used as an example in a scenario, the listed procedure is a type of procedure performed at the facility in OR 1). Adaptation is performed automatically using the facility's OR information system or anesthesia information management system (AIMS) data for most scenarios (43 of 45). Performing the needs assessment takes approximately 1 hour of local managers' time while they decide if their decisions are consistent with the described scenarios. A table of contents of the indexed scenarios is created automatically, providing a simple version of problem solving using case-based reasoning. For example, a new OR manager wanting to know the best way to decide whether to move a case can look in the chapter on "Moving Cases on the Day of Surgery" to find a scenario that describes the situation being encountered. Results Scenarios have been adapted and used at 22 hospitals. Few changes in decisions were needed to increase the efficiency of use of OR time. The few changes were heterogeneous among hospitals, showing the usefulness of individualized assessments. Conclusions Our technical advance is the development and use of automated event-based knowledge elicitation to identify suboptimal OR management decisions that decrease the efficiency of use of OR time. The adapted scenarios can be used in future decision-making. PMID:21214905

  7. Event-based knowledge elicitation of operating room management decision-making using scenarios adapted from information systems data.

    PubMed

    Dexter, Franklin; Wachtel, Ruth E; Epstein, Richard H

    2011-01-07

    No systematic process has previously been described for a needs assessment that identifies the operating room (OR) management decisions made by the anesthesiologists and nurse managers at a facility that do not maximize the efficiency of use of OR time. We evaluated whether event-based knowledge elicitation can be used practically for rapid assessment of OR management decision-making at facilities, whether scenarios can be adapted automatically from information systems data, and the usefulness of the approach. A process of event-based knowledge elicitation was developed to assess OR management decision-making that may reduce the efficiency of use of OR time. Hypothetical scenarios addressing every OR management decision influencing OR efficiency were created from published examples. Scenarios are adapted, so that cues about conditions are accurate and appropriate for each facility (e.g., if OR 1 is used as an example in a scenario, the listed procedure is a type of procedure performed at the facility in OR 1). Adaptation is performed automatically using the facility's OR information system or anesthesia information management system (AIMS) data for most scenarios (43 of 45). Performing the needs assessment takes approximately 1 hour of local managers' time while they decide if their decisions are consistent with the described scenarios. A table of contents of the indexed scenarios is created automatically, providing a simple version of problem solving using case-based reasoning. For example, a new OR manager wanting to know the best way to decide whether to move a case can look in the chapter on "Moving Cases on the Day of Surgery" to find a scenario that describes the situation being encountered. Scenarios have been adapted and used at 22 hospitals. Few changes in decisions were needed to increase the efficiency of use of OR time. The few changes were heterogeneous among hospitals, showing the usefulness of individualized assessments. Our technical advance is the development and use of automated event-based knowledge elicitation to identify suboptimal OR management decisions that decrease the efficiency of use of OR time. The adapted scenarios can be used in future decision-making.

  8. Identification and characterization of earthquake clusters: a comparative analysis for selected sequences in Italy

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Gentili, Stefania

    2017-04-01

    Identification and statistical characterization of seismic clusters may provide useful insights into the features of seismic energy release and their relation to the physical properties of the crust within a given region. Moreover, a number of studies based on the spatio-temporal analysis of main-shock occurrence require preliminary declustering of the earthquake catalogs. Since various methods, relying on different physical/statistical assumptions, may lead to diverse classifications of earthquakes into main events and related events, we investigate the classification differences among declustering techniques. Accordingly, a formal selection and comparative analysis of earthquake clusters is carried out for the most relevant earthquakes in North-Eastern Italy, as reported in the local OGS-CRS bulletins, compiled at the National Institute of Oceanography and Experimental Geophysics since 1977. The comparison is then extended to selected earthquake sequences associated with a different seismotectonic setting, namely events that occurred in the region struck by the recent destructive Central Italy earthquakes, making use of INGV data. Various techniques, ranging from classical space-time window methods to ad hoc manual identification of aftershocks, are applied for the detection of earthquake clusters. In particular, a statistical method based on the nearest-neighbor distances of events in the space-time-energy domain is considered. Results from cluster identification by the nearest-neighbor method turn out to be quite robust with respect to the time span of the input catalogue, as well as to the minimum magnitude cutoff. The identified clusters for the largest events reported in North-Eastern Italy since 1977 are well consistent with those reported in earlier studies aimed at detailed manual aftershock identification. The study shows that the data-driven approach, based on nearest-neighbor distances, can be satisfactorily applied to decompose the seismic catalog into background seismicity and individual sequences of earthquake clusters, also in areas characterized by moderate seismic activity, where standard declustering techniques may turn out to be rather gross approximations. With these results in hand, the main statistical features of seismic clusters are explored, including the complex interdependence of related events, with the aim of characterizing the space-time patterns of earthquake occurrence in North-Eastern Italy and capturing their basic differences from the Central Italy sequences.
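
    A minimal sketch of a nearest-neighbor distance in the space-time-energy domain (in the spirit of Zaliapin-type cluster analyses): each event is linked to the preceding event minimizing eta = dt * r^df * 10^(-b*m). The b-value and fractal dimension below are illustrative defaults, not values from this study.

```python
import numpy as np

def nearest_neighbor_links(t, x, y, m, b=1.0, df=1.6):
    """Parent links and nearest-neighbor distances for a time-sorted catalog."""
    n = len(t)
    parent = np.full(n, -1)
    eta = np.full(n, np.inf)
    for j in range(1, n):
        dt = t[j] - t[:j]                                 # inter-event times
        r = np.hypot(x[j] - x[:j], y[j] - y[:j]) + 1e-9   # epicentral distances
        e = dt * r**df * 10.0 ** (-b * m[:j])
        parent[j] = int(np.argmin(e))
        eta[j] = e[parent[j]]
    return parent, eta   # small eta marks clustered (aftershock-like) pairs
```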

  9. Analysis of Cumulonimbus (Cb), Thunderstorm and Fog for Izmir Adnan Menderes Airport

    NASA Astrophysics Data System (ADS)

    Avsar, Ercument

    2016-07-01

    Demand for air transport has been increasing day by day with the development of the aviation industry in Turkey. Meteorological conditions are among the most important factors that influence aviation operations. Meteorological events cause delays and cancellations of flights, which create economic and time losses, and they can even lead to accidents. The most important meteorological events that affect the takeoff and landing of airplanes are wind, runway visual range, cloud, rain, icing, turbulence, and low-level windshear. The meteorological events that most often affect operations at Adnan Menderes Airport (LTBJ), the fourth largest airport in Turkey in terms of air traffic, are fog, Cumulonimbus (Cb) clouds, and thunderstorms (TS). It is therefore important to identify the occurrence times of these events based on the analysis of data over many years, and to base flight planning on this meteorological information, in order to make aviation operations safer and reduce delays. In this study, a statistical analysis of the formation of Cb clouds, thunderstorms, and foggy days is conducted using routine aviation weather reports (METAR) and special reports (SPECI). According to the results of the statistical analysis of data from 2004 to 2014, two types of fog are observed most often at LTBJ, namely radiation and advection fog. Fog events are found to occur most often in December and January, during the 04:00-07:00 UTC interval, at pressures of 1015-1020 hPa, with a 130-190 degree light breeze (1-5 kt), and at temperatures between 5°C and 8°C. Thunderstorm events recorded at LTBJ between 2004 and 2014 are most often observed in January and February, with 120-210 degree gentle breeze winds (6-10 kt), and at temperatures between 8°C and 18°C. Key Words: Adnan Menderes International Airport, LTBJ, Fog, Thunderstorm (TS), Cb Clouds

  10. VizieR Online Data Catalog: Spitzer IRAC events observed in crowded fields (Calchi+, 2015)

    NASA Astrophysics Data System (ADS)

    Calchi Novati, S.; Gould, A.; Yee, J. C.; Beichman, C.; Bryden, G.; Carey, S.; Fausnaugh, M.; Gaudi, B. S.; Henderson, C. B.; Pogge, R. W.; Shvartzvald, Y.; Wibking, B.; Zhu, W.; Spitzer Team; Udalski, A.; Poleski, R.; Pawlak, M.; Szymanski, M. K.; Skowron, J.; Mroz, P.; Kozlowski, S.; Wyrzykowski, L.; Pietrukowicz, P.; Pietrzynski, G.; Soszynski, I.; Ulaczyk, K.; OGLE Group

    2017-10-01

    In Table 1 we list the 170 events monitored in 2015. For each, we report the event name, the coordinates, the first and last day of observation, and the number of observed epochs. The events were chosen based on the microlensing alerts provided by the OGLE (Udalski et al. 2015AcA....65....1U) and MOA (Bond et al. 2004ApJ...606L.155B) collaborations. The current analysis is based on the preliminary reduced data made available by the Spitzer Science Center almost in real time (on average, 2-3 days after the observations). The final reduction of the data is now publicly available at the NASA/IPAC Infrared Science Database (IRSA, http://irsa.ipac.caltech.edu/frontpage/). (1 data file).

  11. Reduced-Order Modeling and Wavelet Analysis of Turbofan Engine Structural Response Due to Foreign Object Damage (FOD) Events

    NASA Technical Reports Server (NTRS)

    Turso, James; Lawrence, Charles; Litt, Jonathan

    2004-01-01

    The development of a wavelet-based feature extraction technique specifically targeting FOD-event induced vibration signal changes in gas turbine engines is described. The technique performs wavelet analysis of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/health parameter estimation for FOD-event detection via information fusion from these (and perhaps other) sources. Due to the lack of high-frequency FOD-event test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite element model modal analysis to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-event detection. In the presence of significant noise, precise location of the FOD event in time was obtained using the developed wavelet-based feature.
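
    The abstract does not specify the wavelet or the feature; a plausible sketch, assuming a Morlet continuous wavelet transform (via PyWavelets) and a cross-scale energy trace whose peak localizes a simulated FOD-like transient in time:

```python
import numpy as np
import pywt

def fod_feature(accel, fs, scales=np.arange(1, 33)):
    """Cross-scale CWT energy versus time; a broadband transient spikes it."""
    coeffs, _ = pywt.cwt(accel, scales, "morl", sampling_period=1.0 / fs)
    return np.sum(np.abs(coeffs) ** 2, axis=0)

fs = 10_000.0
t = np.arange(0, 0.1, 1 / fs)
signal = 0.1 * np.random.randn(t.size)            # process/sensor noise
signal[500:520] += np.hanning(20) * 3.0           # simulated FOD transient
print("event sample ~", int(np.argmax(fod_feature(signal, fs))))   # near 500
```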

  12. Reduced-Order Modeling and Wavelet Analysis of Turbofan Engine Structural Response Due to Foreign Object Damage "FOD" Events

    NASA Technical Reports Server (NTRS)

    Turso, James A.; Lawrence, Charles; Litt, Jonathan S.

    2007-01-01

    The development of a wavelet-based feature extraction technique specifically targeting FOD-event induced vibration signal changes in gas turbine engines is described. The technique performs wavelet analysis of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/ health parameter estimation for FOD-event detection via information fusion from these (and perhaps other) sources. Due to the lack of high-frequency FOD-event test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite-element model modal analysis to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-event detection. In the presence of significant noise, precise location of the FOD event in time was obtained using the developed wavelet-based feature.

  13. Event-triggered decentralized adaptive fault-tolerant control of uncertain interconnected nonlinear systems with actuator failures.

    PubMed

    Choi, Yun Ho; Yoo, Sung Jin

    2018-06-01

    This paper investigates the event-triggered decentralized adaptive tracking problem for a class of uncertain interconnected nonlinear systems with unexpected actuator failures. It is assumed that local control signals are transmitted to local actuators with time-varying faults whenever predefined conditions for triggering events are satisfied. In contrast to the existing control-input-based event-triggering strategy for adaptive control of uncertain nonlinear systems, this paper proposes a tracking-error-based event-triggering strategy within a decentralized adaptive fault-tolerant tracking framework. The proposed approach mitigates the drastic changes in control inputs caused by actuator faults under the existing triggering strategy. The stability of the proposed event-triggered control system is analyzed in the Lyapunov sense. Finally, simulation comparisons of the proposed and existing approaches are provided to show the effectiveness of the proposed theoretical result in the presence of actuator faults. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
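
    As a toy illustration of tracking-error-based triggering (a scalar, single-loop stand-in, not the paper's interconnected fault-tolerant design): the actuator command is refreshed only when the tracking error has drifted by a fixed threshold since the last triggering instant.

```python
import numpy as np

def simulate(T=10.0, dt=1e-3, delta=0.05):
    """Scalar plant x' = u + fault, tracking a sinusoidal reference."""
    x, u_applied, last_e, events = 0.0, 0.0, np.inf, 0
    for k in range(int(T / dt)):
        ref = np.sin(0.5 * k * dt)              # reference trajectory
        e = ref - x                             # tracking error
        if abs(e - last_e) >= delta:            # tracking-error-based trigger
            u_applied = 4.0 * e                 # refresh actuator command
            last_e, events = e, events + 1
        fault = 0.3 if k * dt > 5.0 else 0.0    # additive actuator fault
        x += dt * (u_applied + fault)
    return events

print("transmissions:", simulate())             # far fewer than 10000 steps
```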

  14. Age Differences in the Experience of Daily Life Events: A Study Based on the Social Goals Perspective

    PubMed Central

    Ji, Lingling; Peng, Huamao; Xue, Xiaotong

    2017-01-01

    This study examined age differences in daily life events related to different types of social goals based on the socioemotional selectivity theory (SST), and determined whether the positivity effect existed in the context of social goals in older adults’ daily lives. Over a course of 14 days, 49 older adults and 36 younger adults wrote about up to three life events daily and rated the valence of each event. The findings indicated that (1) although both older and younger adults recorded events related to both emotional and knowledge-acquisition goals, the odds ratio for reporting a higher number of events related to emotional goals compared to the number of events related to knowledge-acquisition goals was 2.12 times higher in older adults than that observed in younger adults. (2) Considering the number of events, there was an age-related positivity effect only for knowledge-related goals, and (3) older adults’ ratings for events related to emotional and knowledge-acquisition goals were significantly more positive compared to those observed in younger adults. These findings supported the SST, and to some extent, the positivity effect was demonstrated in the context of social goals. PMID:28979227

  15. Age Differences in the Experience of Daily Life Events: A Study Based on the Social Goals Perspective.

    PubMed

    Ji, Lingling; Peng, Huamao; Xue, Xiaotong

    2017-01-01

    This study examined age differences in daily life events related to different types of social goals based on the socioemotional selectivity theory (SST), and determined whether the positivity effect existed in the context of social goals in older adults' daily lives. Over a course of 14 days, 49 older adults and 36 younger adults wrote about up to three life events daily and rated the valence of each event. The findings indicated that (1) although both older and younger adults recorded events related to both emotional and knowledge-acquisition goals, the odds ratio for reporting a higher number of events related to emotional goals compared to the number of events related to knowledge-acquisition goals was 2.12 times higher in older adults than that observed in younger adults. (2) Considering the number of events, there was an age-related positivity effect only for knowledge-related goals, and (3) older adults' ratings for events related to emotional and knowledge-acquisition goals were significantly more positive compared to those observed in younger adults. These findings supported the SST, and to some extent, the positivity effect was demonstrated in the context of social goals.

  16. Very low frequency radio events with a reduced intensity observed by the low-altitude DEMETER spacecraft

    NASA Astrophysics Data System (ADS)

    Záhlava, J.; Němec, F.; Santolík, O.; Kolmašová, I.; Parrot, M.; Rodger, C. J.

    2015-11-01

    We present results of a systematic study of unusual very low frequency (VLF) radio events with a reduced intensity observed in the frequency-time spectrograms measured by the low-orbiting Detection of Electro-Magnetic Emissions Transmitted from Earthquake Regions (DEMETER) spacecraft. They occur exclusively on the nightside. During these events, the intensity of fractional hop whistlers at specific frequencies is significantly reduced. These frequencies are usually above about 3.4 kHz (second Earth-ionosphere waveguide cutoff frequency), but about 20% of events extend down to about 1.7 kHz (first Earth-ionosphere waveguide cutoff frequency). The frequencies of a reduced intensity vary smoothly with time. We have inspected 6.5 years of DEMETER data, and we identified in total 1601 such events. We present a simple model of the event formation based on the wave propagation in the Earth-ionosphere waveguide. We apply the model to two selected events, and we demonstrate that the model is able to reproduce both the minimum frequencies of the events and their approximate frequency-time shapes. The overall geographic distribution of the events is shifted by about 3000 km westward and slightly southward with respect to the areas with high long-term average lightning activity. We demonstrate that this shift is related to the specific DEMETER orbit, and we suggest its qualitative explanation by the east-west asymmetry of the wave propagation in the Earth-ionosphere waveguide.
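
    The two cutoff frequencies quoted above follow from the ideal parallel-plate waveguide formula f_n = nc/(2h); with an assumed effective nighttime reflection height of about 88 km, this reproduces the approximately 1.7 and 3.4 kHz values.

```python
C = 299_792_458.0   # speed of light (m/s)

def cutoff_hz(n, h_km=88.0):
    """n-th cutoff of an ideal Earth-ionosphere waveguide of height h."""
    return n * C / (2.0 * h_km * 1e3)

for n in (1, 2):
    print(f"f_{n} = {cutoff_hz(n) / 1e3:.2f} kHz")   # ~1.70 and ~3.41 kHz
```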

  17. Visuospatial asymmetries and emotional valence influence mental time travel.

    PubMed

    Thomas, Nicole A; Takarangi, Melanie K T

    2018-06-01

    Spatial information is tightly intertwined with temporal and valence-based information. Namely, "past" is represented on the left, and "future" on the right, along a horizontal mental timeline. Similarly, right is associated with positive, whereas left is negative. We developed a novel task to examine the effects of emotional valence and temporal distance on mental representations of time. We compared positivity biases, where positive events are positioned closer to now, and right hemisphere emotion biases, where negative events are positioned to the left. When the entire life span was used, a positivity bias emerged; positive events were closer to now. When timeline length was reduced, positivity and right hemisphere emotion biases were consistent for past events. In contrast, positive and negative events were equidistant from now in the future condition, suggesting positivity and right hemisphere emotion biases opposed one another, leading events to be positioned at a similar distance. We then reversed the timeline by moving past to the right and future to the left. Positivity biases in the past condition were eliminated, and negative events were placed slightly closer to now in the future condition. We conclude that an underlying left-to-right mental representation of time is necessary for positivity biases to emerge for past events; however, our mental representations of future events are inconsistent with positivity biases. These findings point to an important difference in the way in which we represent the past and the future on our mental timeline. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  18. Detection and characterization of lightning-based sources using continuous wavelet transform: application to audio-magnetotellurics

    NASA Astrophysics Data System (ADS)

    Larnier, H.; Sailhac, P.; Chambodut, A.

    2018-01-01

    Atmospheric electromagnetic waves created by global lightning activity contain information about electrical processes of the inner and outer Earth. Events with a large signal-to-noise ratio are particularly interesting because they convey information about electromagnetic properties along their path. We introduce a new methodology to automatically detect and characterize lightning-based waves using a time-frequency decomposition obtained through the continuous wavelet transform. We focus specifically on three types of sources, namely atmospherics, slow tails, and whistlers, which cover the frequency range 10 Hz to 10 kHz. Each wave type has distinguishable characteristics in the time-frequency domain due to source shape and dispersion processes. Our methodology allows automatic detection of each type of event in the time-frequency decomposition thanks to its specific signature. Horizontal polarization attributes are also recovered in the time-frequency domain. This procedure is first applied to synthetic extremely low frequency time series with different signal-to-noise ratios to test for robustness. We then apply it to real data: audio-magnetotelluric data from three stations in Guadeloupe, an overseas French territory. Most analysed atmospherics and slow tails display linear polarization, whereas analysed whistlers are elliptically polarized. The diversity of lightning activity is finally analysed in an audio-magnetotelluric data processing framework, as used in subsurface prospecting, through estimation of the impedance response functions. We show that audio-magnetotelluric processing results depend mainly on the frequency content of the electromagnetic waves observed in the processed time series, with an emphasis on the difference between morning and afternoon acquisition. Our new methodology, based on the time-frequency signature of lightning-induced electromagnetic waves, allows automatic detection and characterization of events in audio-magnetotelluric time series, providing the means to assess the quality of response functions obtained through processing.

  19. Understanding human dynamics in microblog posting activities

    NASA Astrophysics Data System (ADS)

    Jiang, Zhihong; Zhang, Yubao; Wang, Hui; Li, Pei

    2013-02-01

    Human activity patterns are an important issue in behavior dynamics research. Empirical evidence indicates that human activity patterns can be characterized by a heavy-tailed inter-event time distribution. However, most studies model only the power-law feature of the inter-event time distribution, and the overlooked non-power-law features are likely to be nontrivial. In this work, we propose a behavior dynamics model, called the finite memory model, in which humans adaptively change their activity rates based on a finite memory of recent activities, driven by inherent individual interest. Theoretical analysis shows that a finite memory model can properly explain various heavy-tailed inter-event time distributions, including a regular power law and some non-power-law deviations. To validate the model, we carry out an empirical study based on microblogging activity from thousands of microbloggers in the Celebrity Hall of the Sina microblog. The results further show that the model is reasonably effective. We conclude that finite memory is an effective dynamics element for describing the heavy-tailed human activity pattern.
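
    A toy rendition of the finite-memory idea (not the paper's exact formulation): the next inter-event time is drawn with a rate set by the mean of the last few intervals, so bursts beget bursts and silences stretch, broadening the interval distribution well beyond an exponential.

```python
import numpy as np

def finite_memory_intervals(n=50_000, memory=10, interest=1.0, seed=1):
    """Generate inter-event times with a rate adapted to recent activity."""
    rng = np.random.default_rng(seed)
    taus = [rng.exponential(1.0)]
    for _ in range(n - 1):
        recent = np.mean(taus[-memory:])          # finite memory of activity
        rate = interest / recent                  # recently busy -> stay busy
        taus.append(rng.exponential(1.0 / rate))
    return np.array(taus)

taus = finite_memory_intervals()
print("mean:", taus.mean(), "99.9th percentile:", np.quantile(taus, 0.999))
```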

  20. An Optimal Image-Based Method for Identification of Acoustic Emission (AE) Sources in Plate-Like Structures Using a Lead Zirconium Titanate (PZT) Sensor Array.

    PubMed

    Yan, Gang; Zhou, Li

    2018-02-21

    This paper proposes an innovative method for identifying the locations of multiple simultaneous acoustic emission (AE) events in plate-like structures from the perspective of image processing. By using a linear lead zirconium titanate (PZT) sensor array to record the AE wave signals, a reverse-time frequency-wavenumber (f-k) migration is employed to produce images displaying the locations of AE sources by back-propagating the AE waves. Lamb wave theory is included in the f-k migration to account for the dispersive property of the AE waves. Since the exact occurrence time of the AE events is usually unknown when recording the AE wave signals, a heuristic artificial bee colony (ABC) algorithm combined with an optimality criterion using minimum Shannon entropy is used to find the image with the identified AE source locations and occurrence time that most closely approximate the actual ones. Experimental studies on an aluminum plate with AE events simulated by PZT actuators are performed to validate the applicability and effectiveness of the proposed optimal image-based AE source identification method.

  1. An Optimal Image-Based Method for Identification of Acoustic Emission (AE) Sources in Plate-Like Structures Using a Lead Zirconium Titanate (PZT) Sensor Array

    PubMed Central

    Zhou, Li

    2018-01-01

    This paper proposes an innovative method for identifying the locations of multiple simultaneous acoustic emission (AE) events in plate-like structures from the perspective of image processing. By using a linear lead zirconium titanate (PZT) sensor array to record the AE wave signals, a reverse-time frequency-wavenumber (f-k) migration is employed to produce images displaying the locations of AE sources by back-propagating the AE waves. Lamb wave theory is included in the f-k migration to account for the dispersive property of the AE waves. Since the exact occurrence time of the AE events is usually unknown when recording the AE wave signals, a heuristic artificial bee colony (ABC) algorithm combined with an optimality criterion using minimum Shannon entropy is used to find the image with the identified AE source locations and occurrence time that most closely approximate the actual ones. Experimental studies on an aluminum plate with AE events simulated by PZT actuators are performed to validate the applicability and effectiveness of the proposed optimal image-based AE source identification method. PMID:29466310

  2. Simulating recurrent event data with hazard functions defined on a total time scale.

    PubMed

    Jahn-Eimermacher, Antje; Ingel, Katharina; Ozga, Ann-Kathrin; Preussler, Stella; Binder, Harald

    2015-03-08

    In medical studies with recurrent event data, a total time scale perspective is often needed to adequately reflect disease mechanisms. This means that the hazard process is defined on the time since some starting point, e.g. the beginning of some disease, in contrast to a gap time scale, where the hazard process restarts after each event. While techniques such as the Andersen-Gill model have been developed for analyzing data from a total time perspective, techniques for the simulation of such data, e.g. for sample size planning, have not been investigated so far. We have derived a simulation algorithm covering the Andersen-Gill model that can be used for sample size planning in clinical trials as well as for the investigation of modeling techniques. Specifically, we allow for fixed and/or random covariates and an arbitrary hazard function defined on a total time scale. Furthermore, we take into account that individuals may be temporarily insusceptible to a recurrent incidence of the event. The methods are based on conditional distributions of the inter-event times, conditional on the total time of the preceding event or study start. Closed-form solutions are provided for common distributions. The derived methods have been implemented in a readily accessible R script. The proposed techniques are illustrated by planning the sample size for a clinical trial with complex recurrent event data. The required sample size is shown to be affected not only by censoring and intra-patient correlation, but also by the presence of risk-free intervals. This demonstrates the need for a simulation algorithm that particularly allows for complex study designs where no analytical sample size formulas might exist. The derived simulation algorithm is seen to be useful for the simulation of recurrent event data that follow an Andersen-Gill model. Next to the use of a total time scale, it allows for intra-patient correlation and risk-free intervals, as are often observed in clinical trial data. Its application therefore allows the simulation of data that closely resemble real settings and thus can improve the use of simulation studies for designing and analysing studies.
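
    The paper supplies an R script; for illustration, here is the same inversion idea in Python, assuming a Weibull cumulative hazard H(t) = (t/b)^a on the total time scale, for which the conditional sampling step t = b((s/b)^a + E)^(1/a), with E ~ Exp(1), has a closed form.

```python
import numpy as np

def simulate_total_time(a=1.5, b=2.0, t_max=10.0, seed=2):
    """One subject's recurrent events; hazard defined on total time."""
    rng = np.random.default_rng(seed)
    events, s = [], 0.0
    while True:
        E = rng.exponential(1.0)
        t = b * ((s / b) ** a + E) ** (1.0 / a)   # solves H(t) - H(s) = E
        if t > t_max:                             # administrative censoring
            return events
        events.append(round(t, 3))
        s = t

print(simulate_total_time())
```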

  3. A Program Structure for Event-Based Speech Synthesis by Rules within a Flexible Segmental Framework.

    ERIC Educational Resources Information Center

    Hill, David R.

    1978-01-01

    A program structure based on recently developed techniques for operating system simulation has the required flexibility for use as a speech synthesis algorithm research framework. This program makes synthesis possible with less rigid time and frequency-component structure than simpler schemes. It also meets real-time operation and memory-size…

  4. Effect of confounding variables on hemodynamic response function estimation using averaging and deconvolution analysis: An event-related NIRS study.

    PubMed

    Aarabi, Ardalan; Osharina, Victoria; Wallois, Fabrice

    2017-07-15

    Slow and rapid event-related designs are used in fMRI and functional near-infrared spectroscopy (fNIRS) experiments to temporally characterize the brain hemodynamic response to discrete events. Conventional averaging (CA) and the deconvolution method (DM) are the two techniques commonly used to estimate the Hemodynamic Response Function (HRF) profile in event-related designs. In this study, we conducted a series of simulations using synthetic and real NIRS data to examine the effects of the main confounding factors, including event sequence timing parameters, different types of noise, signal-to-noise ratio (SNR), temporal autocorrelation, and temporal filtering, on the performance of these techniques in slow and rapid event-related designs. We also compared systematic errors in the estimates of the fitted HRF amplitude, latency, and duration for both techniques. We further compared the performance of deconvolution methods based on Finite Impulse Response (FIR) basis functions and gamma basis sets. Our results demonstrate that DM was much less sensitive to confounding factors than CA. Event timing was the main parameter largely affecting the accuracy of CA. In slow event-related designs, deconvolution methods provided results similar to those obtained by CA. In rapid event-related designs, our results showed that DM outperformed CA for all SNRs, especially above -5 dB, regardless of the event sequence timing and the dynamics of background NIRS activity. Our results also show that periodic low-frequency systemic hemodynamic fluctuations as well as phase-locked noise can markedly obscure hemodynamic evoked responses. Temporal autocorrelation also affected the performance of both techniques by inducing distortions in the time profile of the estimated hemodynamic response with inflated t-statistics, especially at low SNRs. We also found that high-pass temporal filtering could substantially affect the performance of both techniques by removing the low-frequency components of HRF profiles. Our results emphasize the importance of characterizing event timing, background noise, and SNR when estimating HRF profiles using CA and DM in event-related designs. Copyright © 2017 Elsevier Inc. All rights reserved.
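
    A minimal sketch of the deconvolution method with an FIR basis: stacking lagged event indicators into a design matrix and solving least squares unmixes overlapping responses that simple averaging would blur. The toy response and onsets below are synthetic.

```python
import numpy as np

def fir_deconvolve(y, onsets, n_lags):
    """Least-squares HRF estimate: one design-matrix column per lag."""
    X = np.zeros((len(y), n_lags))
    for t0 in onsets:
        for lag in range(n_lags):
            if t0 + lag < len(y):
                X[t0 + lag, lag] = 1.0
    hrf, *_ = np.linalg.lstsq(X, y, rcond=None)
    return hrf

true = np.array([0.0, 1.0, 2.0, 1.0, 0.3])        # synthetic response shape
onsets = [5, 9, 12, 30, 33, 50]                   # rapid, overlapping design
y = np.zeros(80)
for t0 in onsets:
    y[t0:t0 + 5] += true
print(np.round(fir_deconvolve(y, onsets, 5), 2))  # recovers ~true exactly
```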

  5. Epicenter Location of Regional Seismic Events Using Love Wave and Rayleigh Wave Ambient Seismic Noise Green's Functions

    NASA Astrophysics Data System (ADS)

    Levshin, A. L.; Barmin, M. P.; Moschetti, M. P.; Mendoza, C.; Ritzwoller, M. H.

    2011-12-01

    We describe a novel method to locate regional seismic events based on exploiting Empirical Green's Functions (EGF) that are produced from ambient seismic noise. Elastic EGFs between pairs of seismic stations are determined by cross-correlating long time-series of ambient noise recorded at the two stations. The EGFs principally contain Rayleigh waves on the vertical-vertical cross-correlations and Love waves on the transverse-transverse cross-correlations. Earlier work (Barmin et al., "Epicentral location based on Rayleigh wave empirical Green's functions from ambient seismic noise", Geophys. J. Int., 2011) showed that group time delays observed on Rayleigh wave EGFs can be exploited to locate moderate-sized earthquakes to within about 1 km using USArray Transportable Array (TA) stations. The principal advantage of the method is that the ambient noise EGFs are affected by lateral variations in structure similarly to the earthquake signals, so the location is largely unbiased by 3-D structure. However, locations based on Rayleigh waves alone may be biased by more than 1 km if the earthquake depth is unknown but lies between 2 km and 7 km. This presentation is motivated by the fact that group time delays for Love waves are much less affected by earthquake depth than those for Rayleigh waves; thus exploitation of Love wave EGFs may reduce location bias caused by uncertainty in event depth. The advantage of Love waves for locating seismic events, however, is tempered by the fact that Love wave EGFs have a lower SNR than Rayleigh wave EGFs. Here, we test the use of Love and Rayleigh wave EGFs between 5- and 15-sec period to locate seismic events based on the USArray TA in the western US. We focus on locating aftershocks of the 2008 M 6.0 Wells earthquake, mining blasts in Wyoming and Montana, and small earthquakes near Norman, OK and Dallas, TX, some of which may be triggered by hydrofracking or injection wells.
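
    The location step itself can be pictured as a grid search over trial epicenters. The sketch below uses hypothetical station coordinates and a single assumed group velocity (the actual method calibrates predicted delays against the noise-derived EGFs, which removes the 3-D structural bias) and demeans the residuals to absorb the unknown origin time.

```python
import numpy as np

def locate_epicenter(stations, obs_times, group_vel, grid_x, grid_y):
    """Grid-search epicenter from surface-wave group arrival times.

    For each trial epicenter, predicted times are epicentral distance
    divided by an assumed group velocity; the unknown origin time is
    removed by demeaning, so only relative delays constrain the search.
    """
    best, best_misfit = None, np.inf
    for x in grid_x:
        for y in grid_y:
            dist = np.hypot(stations[:, 0] - x, stations[:, 1] - y)
            r = obs_times - dist / group_vel
            r -= r.mean()                 # absorb unknown origin time
            misfit = np.sum(r ** 2)
            if misfit < best_misfit:
                best, best_misfit = (x, y), misfit
    return best, best_misfit

# toy example: true source at (30, 40) km, 3 km/s Rayleigh group velocity
rng = np.random.default_rng(1)
stations = rng.uniform(0, 100, size=(12, 2))
true_t = np.hypot(stations[:, 0] - 30, stations[:, 1] - 40) / 3.0
obs = true_t + 0.1 * rng.standard_normal(12)
grid = np.arange(0, 100, 1.0)
print(locate_epicenter(stations, obs, 3.0, grid, grid))
```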

  6. Re-scheduling as a tool for the power management on board a spacecraft

    NASA Technical Reports Server (NTRS)

    Albasheer, Omar; Momoh, James A.

    1995-01-01

    The scheduling of events on board a spacecraft is based on forecast energy levels. The real-time values of energy may not coincide with the forecast values; consequently, a dynamic revision of the allocation of power is needed. Re-scheduling is also needed for other reasons on board a spacecraft, such as the addition of a new event that must be scheduled, or the failure of an event due to any of many different contingencies. This need for re-scheduling is very important to the survivability of the spacecraft. In this presentation, a re-scheduling tool will be presented as part of an overall scheme for power management on board a spacecraft from the energy allocation point of view. The overall scheme is based on the optimal use of the energy available on board a spacecraft, using expert systems combined with linear optimization techniques. The system will be able to schedule the maximum number of events utilizing the most energy available. The outcome is more events scheduled to share the operating cost of the spacecraft. The system will also be able to re-schedule in case of a contingency with minimal time and minimal disturbance of the original schedule. The end product is a fully integrated planning system capable of producing the right decisions in a short time with less human error. The overall system will be presented with the re-scheduling algorithm discussed in detail; then the tests and results will be presented for validation.

  7. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: the probability and the corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, a Markov chain is combined with the time-dependent event tree for stochastic analysis of the occurrence probability of fire scenarios. To obtain the consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires is designed based on different fire growth rates, after which the uncertainty of the onset time to untenable conditions can be characterized by a probability distribution. When calculating occupant evacuation time, occupant pre-movement time is modeled as a probability distribution. The consequences of a fire scenario can then be evaluated according to the probability distributions of evacuation time and onset time of untenable conditions. Fire risk to life safety can thus be evaluated based on the occurrence probability and consequences of every fire scenario. To illustrate the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
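
    The stochastic part of the framework can be illustrated with a small Markov chain whose states track a fire scenario over time; the states and transition probabilities below are invented for illustration, not taken from the article.

```python
import numpy as np

# States of a single scenario branch at each minute:
# 0 = fire controlled, 1 = fire growing, 2 = untenable conditions reached
P = np.array([[1.00, 0.00, 0.00],     # controlled is absorbing
              [0.06, 0.90, 0.04],     # growing: suppressed / grows / untenable
              [0.00, 0.00, 1.00]])    # untenable is absorbing

state = np.array([0.0, 1.0, 0.0])     # fire has just started growing
for minute in range(1, 31):
    state = state @ P                 # time-dependent occurrence probability
    if minute % 10 == 0:
        print(f"t={minute:2d} min  P(controlled)={state[0]:.3f}  "
              f"P(untenable)={state[2]:.3f}")
```

    Comparing the time-dependent probability of reaching untenable conditions with the distribution of evacuation time is what yields the scenario's consequence in the framework.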

  8. The use of propagation path corrections to improve regional seismic event location in western China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steck, L.K.; Cogbill, A.H.; Velasco, A.A.

    1999-03-01

    In an effort to improve the ability to locate seismic events in western China using only regional data, the authors have developed empirical propagation path corrections (PPCs) and applied such corrections using both traditional location routines and a nonlinear grid search method. Thus far, the authors have concentrated on corrections to observed P arrival times for shallow events, using travel-time observations available from the USGS EDRs, the ISC catalogs, their own travel-time picks from regional data, and data from other catalogs. They relocate events with the algorithm of Bratt and Bache (1988) from a region encompassing China. For individual stations having sufficient data, they produce a map of the regional travel-time residuals from all well-located teleseismic events. From these maps, interpolated PPC surfaces have been constructed using both surface fitting under tension and modified Bayesian kriging. The latter method offers the advantage of providing well-behaved interpolants, but requires that the authors have adequate error estimates associated with the travel-time residuals. To improve error estimates for kriging and event location, they separate measurement error from modeling error. The modeling error is defined as the travel-time variance of a particular model as a function of distance, while the measurement error is defined as the picking error associated with each phase. They estimate measurement errors for arrivals from the EDRs based on roundoff or truncation, and use the signal-to-noise ratio for the travel-time picks from the waveform data set.

  9. Operational warning of interplanetary shock arrivals using energetic particle data from ACE: Real-time Upstream Monitoring System

    NASA Astrophysics Data System (ADS)

    Donegan, M.; Vandegriff, J.; Ho, G. C.; Julia, S. J.

    2004-12-01

    We report on an operational system that provides advance warning and predictions of arrival times at Earth of interplanetary (IP) shocks that originate at the Sun. The data stream used in our prediction algorithm is real-time and comes from the Electron, Proton, and Alpha Monitor (EPAM) instrument on NASA's Advanced Composition Explorer (ACE) spacecraft. Since locally accelerated energetic storm particle (ESP) events accompany most IP shocks, their arrival can be predicted using ESP event signatures. We have previously reported on the development and implementation of an algorithm that recognizes the upstream particle signature of approaching IP shocks and provides estimated countdown predictions. A web-based system (see http://sd-www.jhuapl.edu/UPOS/RISP/index.html) combines this prediction capability with real-time ACE/EPAM data provided by the NOAA Space Environment Center. The most recent ACE data are continually processed, and predictions of shock arrival time are updated every five minutes when an event is impending. An operational display is provided to indicate advisories and countdowns for the event. Running the algorithm on a test set of historical events, we obtain a median error of about 10 hours for predictions made 24-36 hours before actual shock arrival and about 6 hours when the shock is 6-12 hours away. This system can provide critical information to mission planners, satellite operations controllers, and scientists by providing significant lead time for approaching events. Recently, we have made improvements to the triggering mechanism and re-trained the neural network, and here we report prediction results from the latest system.

  10. Continuous event monitoring via a Bayesian predictive approach.

    PubMed

    Di, Jianing; Wang, Daniel; Brashear, H Robert; Dragalin, Vladimir; Krams, Michael

    2016-01-01

    In clinical trials, continuous monitoring of event incidence rate plays a critical role in making timely decisions affecting trial outcome. For example, continuous monitoring of adverse events protects the safety of trial participants, while continuous monitoring of efficacy events helps identify early signals of efficacy or futility. Because the endpoint of interest is often the event incidence associated with a given length of treatment duration (e.g., incidence proportion of an adverse event with 2 years of dosing), assessing the event proportion before reaching the intended treatment duration becomes challenging, especially when the event onset profile evolves over time with accumulated exposure. In particular, in the earlier part of the study, ignoring censored subjects may result in significant bias in estimating the cumulative event incidence rate. Such a problem is addressed using a predictive approach in the Bayesian framework. In the proposed approach, experts' prior knowledge about both the frequency and timing of the event occurrence is combined with observed data. More specifically, during any interim look, each event-free subject will be counted with a probability that is derived using prior knowledge. The proposed approach is particularly useful in early stage studies for signal detection based on limited information. But it can also be used as a tool for safety monitoring (e.g., data monitoring committee) during later stage trials. Application of the approach is illustrated using a case study where the incidence rate of an adverse event is continuously monitored during an Alzheimer's disease clinical trial. The performance of the proposed approach is also assessed and compared with other Bayesian and frequentist methods via simulation. Copyright © 2015 John Wiley & Sons, Ltd.
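
    A stripped-down sketch of the counting idea is given below: each event-free subject contributes the conditional probability of an event by the full treatment duration given survival to their current exposure time, here computed under a Weibull onset curve standing in for the experts' timing knowledge. The full method places a proper Bayesian prior on both frequency and timing; everything here is an illustrative simplification.

```python
import numpy as np

def predicted_incidence(event_count, exposure_times, full_duration,
                        prior_shape, prior_scale, n_total):
    """Interim estimate of the cumulative incidence proportion.

    Each event-free subject observed up to time t contributes the
    conditional probability of an event by `full_duration` given
    survival to t, computed under a Weibull event-onset curve
    (prior_shape/prior_scale encode prior timing knowledge).
    """
    H = lambda t: (np.asarray(t) / prior_scale) ** prior_shape
    p_future = 1.0 - np.exp(-(H(full_duration) - H(exposure_times)))
    expected_events = event_count + p_future.sum()
    return expected_events / n_total

# interim look: 4 events so far; 46 subjects still event-free
rng = np.random.default_rng(7)
exposure = rng.uniform(0.1, 1.2, size=46)       # years of follow-up so far
print(predicted_incidence(4, exposure, full_duration=2.0,
                          prior_shape=1.5, prior_scale=4.0, n_total=50))
```

    Simply dividing 4 events by 50 subjects would understate the two-year incidence; weighting the censored subjects by their remaining risk is what removes that bias at interim looks.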

  11. Recognising safety critical events: can automatic video processing improve naturalistic data analyses?

    PubMed

    Dozza, Marco; González, Nieves Pañeda

    2013-11-01

    New trends in research on traffic accidents include Naturalistic Driving Studies (NDS). NDS are based on large-scale data collection of driver, vehicle, and environment information in the real world. NDS data sets have proven to be extremely valuable for the analysis of safety critical events such as crashes and near crashes. However, finding safety critical events in NDS data is often difficult and time consuming. Safety critical events are currently identified using kinematic triggers, for instance searching for decelerations below a certain threshold signifying harsh braking. Due to the low sensitivity and specificity of this filtering procedure, manual review of video data is currently necessary to decide whether the events identified by the triggers are actually safety critical. Such a reviewing procedure is based on subjective decisions, is expensive and time consuming, and is often tedious for the analysts. Furthermore, since NDS data are growing exponentially over time, this reviewing procedure may not be viable for much longer. This study tested the hypothesis that automatic processing of driver video information could increase the correct classification of safety critical events from kinematic triggers in naturalistic driving data. Review of about 400 video sequences recorded from the events, collected by 100 Volvo cars in the euroFOT project, suggested that the driver's individual reaction may be the key to recognizing safety critical events. In fact, whether an event is safety critical or not often depends on the individual driver. A few algorithms, able to automatically classify driver reaction from video data, were compared. The results presented in this paper show that the state-of-the-art subjective review procedures used to identify safety critical events from NDS can benefit from automated objective video processing. In addition, this paper discusses the major challenges in making such video analysis viable for future NDS and new potential applications for NDS video processing. As new NDS such as SHRP2 are now providing the equivalent of five years of one-vehicle data each day, the development of new methods, such as the one proposed in this paper, seems necessary to guarantee that these data can actually be analysed. Copyright © 2013 Elsevier Ltd. All rights reserved.
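
    A kinematic trigger of the kind discussed above can be as simple as a threshold crossing on longitudinal acceleration; the sketch below (threshold and debounce values are illustrative) also shows why its hits still require review: it flags every strong deceleration, safety critical or not.

```python
import numpy as np

def harsh_braking_triggers(accel_long, fs, threshold=-4.0, min_gap_s=5.0):
    """Flag candidate safety-critical events from longitudinal acceleration.

    Returns sample indices where deceleration first crosses `threshold`
    (m/s^2), merging crossings closer than `min_gap_s` into one event;
    this is the kind of kinematic trigger whose hits still need review.
    """
    below = accel_long < threshold
    onsets = np.flatnonzero(below[1:] & ~below[:-1]) + 1
    events, last = [], -np.inf
    for i in onsets:
        if (i - last) / fs >= min_gap_s:
            events.append(i)
            last = i
    return events

# toy signal: 60 s at 10 Hz with two braking episodes
fs = 10
a = np.zeros(600)
a[120:135] = -5.5   # harsh braking at t = 12 s
a[400:410] = -4.5   # another at t = 40 s
print([i / fs for i in harsh_braking_triggers(a, fs)])  # -> [12.0, 40.0]
```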

  12. The excitation and characteristic frequency of the long-period volcanic event: An approach based on an inhomogeneous autoregressive model of a linear dynamic system

    USGS Publications Warehouse

    Nakano, M.; Kumagai, H.; Kumazawa, M.; Yamaoka, K.; Chouet, B.A.

    1998-01-01

    We present a method to quantify the source excitation function and characteristic frequencies of long-period volcanic events. The method is based on an inhomogeneous autoregressive (AR) model of a linear dynamic system, in which the excitation is assumed to be a time-localized function applied at the beginning of the event. The tail of an exponentially decaying harmonic waveform is used to determine the characteristic complex frequencies of the event by the Sompi method. The excitation function is then derived by operating an AR filter constructed from the characteristic frequencies to the entire seismogram of the event, including the inhomogeneous part of the signal. We apply this method to three long-period events at Kusatsu-Shirane Volcano, central Japan, whose waveforms display simple decaying monochromatic oscillations except for the beginning of the events. We recover time-localized excitation functions lasting roughly 1 s at the start of each event and find that the estimated functions are very similar to each other at all the stations of the seismic network for each event. The phases of the characteristic oscillations referred to the estimated excitation function fall within a narrow range for almost all the stations. These results strongly suggest that the excitation and mode of oscillation are both dominated by volumetric change components. Each excitation function starts with a pronounced dilatation consistent with a sudden deflation of the volumetric source which may be interpreted in terms of a choked-flow transport mechanism. The frequency and Q of the characteristic oscillation both display a temporal evolution from event to event. Assuming a crack filled with bubbly water as seismic source for these events, we apply the Van Wijngaarden-Papanicolaou model to estimate the acoustic properties of the bubbly liquid and find that the observed changes in the frequencies and Q are consistently explained by a temporal change in the radii of the bubbles characterizing the bubbly water in the crack.
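
    The core of the approach, fitting an AR model to the decaying coda and reading complex frequencies off the roots of its characteristic polynomial, can be sketched in a few lines of Python. This uses a plain least-squares AR fit as a stand-in for the Sompi method, applied to an invented monochromatic coda.

```python
import numpy as np

def ar_complex_frequencies(x, order, dt):
    """Estimate complex frequencies of a decaying coda with an AR model.

    Least-squares AR fit (a simple stand-in for the Sompi method); each
    characteristic-polynomial root z gives frequency f = arg(z)/(2*pi*dt)
    and growth rate g = ln|z|/dt, i.e. Q = -pi*f/g for a decaying mode.
    """
    A = np.column_stack([x[order - k - 1:len(x) - k - 1]
                         for k in range(order)])
    a, *_ = np.linalg.lstsq(A, x[order:], rcond=None)
    roots = np.roots(np.r_[1.0, -a])
    f = np.angle(roots) / (2 * np.pi * dt)
    g = np.log(np.abs(roots)) / dt
    return f, g

# toy monochromatic coda: 1.2 Hz, Q = 60
dt, fq, Q = 0.01, 1.2, 60.0
t = np.arange(0, 20, dt)
x = np.exp(-np.pi * fq * t / Q) * np.cos(2 * np.pi * fq * t)
f, g = ar_complex_frequencies(x, order=2, dt=dt)
m = np.argmax(f)                      # pick the positive-frequency mode
print(f"f = {f[m]:.3f} Hz, Q = {-np.pi * f[m] / g[m]:.1f}")
```

    Once the characteristic frequencies are fixed, running the corresponding AR filter over the whole seismogram, including the inhomogeneous onset, is what isolates the time-localized excitation function described in the abstract.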

  13. Event specific qualitative and quantitative polymerase chain reaction detection of genetically modified MON863 maize based on the 5'-transgene integration sequence.

    PubMed

    Yang, Litao; Xu, Songci; Pan, Aihu; Yin, Changsong; Zhang, Kewei; Wang, Zhenying; Zhou, Zhigang; Zhang, Dabing

    2005-11-30

    Because of the genetically modified organisms (GMOs) labeling policies issued in many countries and areas, polymerase chain reaction (PCR) methods were developed for the execution of GMO labeling policies, such as screening, gene specific, construct specific, and event specific PCR detection methods, which have become a mainstay of GMOs detection. The event specific PCR detection method is the primary trend in GMOs detection because of its high specificity based on the flanking sequence of the exogenous integrant. This genetically modified maize, MON863, contains a Cry3Bb1 coding sequence that produces a protein with enhanced insecticidal activity against the coleopteran pest, corn rootworm. In this study, the 5'-integration junction sequence between the host plant DNA and the integrated gene construct of the genetically modified maize MON863 was revealed by means of thermal asymmetric interlaced-PCR, and the specific PCR primers and TaqMan probe were designed based upon the revealed 5'-integration junction sequence; the conventional qualitative PCR and quantitative TaqMan real-time PCR detection methods employing these primers and probes were successfully developed. In conventional qualitative PCR assay, the limit of detection (LOD) was 0.1% for MON863 in 100 ng of maize genomic DNA for one reaction. In the quantitative TaqMan real-time PCR assay, the LOD and the limit of quantification were eight and 80 haploid genome copies, respectively. In addition, three mixed maize samples with known MON863 contents were detected using the established real-time PCR systems, and the ideal results indicated that the established event specific real-time PCR detection systems were reliable, sensitive, and accurate.

  14. Real time monitoring of induced seismicity in the Insheim and Landau deep geothermal reservoirs, Upper Rhine Graben, using the new SeisComP3 cross-correlation detector

    NASA Astrophysics Data System (ADS)

    Vasterling, Margarete; Wegler, Ulrich; Bruestle, Andrea; Becker, Jan

    2016-04-01

    Real-time information on the locations and magnitudes of induced earthquakes is essential for response plans based on the magnitude-frequency distribution. We developed and tested a real-time cross-correlation detector focusing on induced microseismicity in deep geothermal reservoirs. The incoming seismological data are cross-correlated in real time with a set of known master events. We use the envelopes of the seismograms rather than the seismograms themselves to account for small changes in the source locations or in the focal mechanisms. Two different detection conditions are implemented: after first passing a single-trace correlation condition, a network correlation is then calculated taking the amplitude information of the seismic network into account. The magnitude is estimated using the ratio of the maximum amplitudes of the master event and the detected event. The detector is implemented as a real-time tool and put into practice as a SeisComP3 module, an established open-source software package for seismological real-time data handling and analysis. We validated the reliability and robustness of the detector by an offline playback test using four months of data from monitoring the power plant in Insheim (Upper Rhine Graben, SW Germany). Subsequently, in October 2013 the detector was installed as a real-time monitoring system within the project "MAGS2 - Microseismic Activity of Geothermal Systems". Master events from the two neighboring geothermal power plants in Insheim and Landau and two nearby quarries are defined. After detection, manual phase determination and event location are performed at the local seismological survey of the Geological Survey and Mining Authority of Rhineland-Palatinate. Until November 2015 the detector identified 454 events, of which 95% were assigned correctly to the respective source. 5% were misdetections caused by local tectonic events. To evaluate the completeness of the automatically obtained catalogue, it is compared to the event catalogue of the Seismological Service of Southwestern Germany and to the events reported by the company tasked with seismic monitoring of the Insheim power plant. Events missed by the cross-correlation detector are generally very small. They are registered at too few stations to meet the detection criteria. Most of these small events were not locatable. The automatic catalogue has a magnitude of completeness around 0.0 and is significantly more detailed than the catalogue from standard processing of the Seismological Service of Southwestern Germany for this region. For events in the magnitude range of the master event, the magnitude estimated from the amplitude ratio reproduces the local magnitude well. For weaker events there tends to be a small offset. Altogether, the developed real-time cross-correlation detector provides robust detections with reliable association of the events to the respective sources and valid magnitude estimates. Thus, it provides input parameters for the mitigation of seismic hazard by using response plans in real time.
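
    The detection and magnitude logic can be sketched as follows: slide the envelope of a master event along the continuous stream, declare a detection where the normalized correlation exceeds a threshold, and estimate magnitude from the amplitude ratio. All waveforms and thresholds below are synthetic stand-ins, and this single-trace test omits the network correlation stage.

```python
import numpy as np
from scipy.signal import hilbert

def envelope(x):
    """Amplitude envelope via the analytic signal."""
    return np.abs(hilbert(x))

def detect(master, stream, cc_threshold=0.7):
    """Slide a master-event envelope along a continuous stream.

    Returns (best offset, correlation, amplitude ratio); the detected
    event's magnitude follows as M_master + log10(amplitude ratio).
    """
    m = envelope(master)
    s = envelope(stream)
    m = (m - m.mean()) / (m.std() * len(m))   # pre-normalize template
    best_cc, best_i = -1.0, 0
    for i in range(len(s) - len(m)):
        w = s[i:i + len(m)]
        cc = np.sum(m * (w - w.mean()) / w.std())  # normalized correlation
        if cc > best_cc:
            best_cc, best_i = cc, i
    if best_cc < cc_threshold:
        return None
    ratio = (np.abs(stream[best_i:best_i + len(master)]).max()
             / np.abs(master).max())
    return best_i, best_cc, ratio

# toy data: the master event re-appears half as large, 3 s into the stream
rng = np.random.default_rng(3)
fs = 100
t = np.arange(0, 1, 1 / fs)
master = np.sin(2 * np.pi * 8 * t) * np.exp(-3 * t)
stream = 0.02 * rng.standard_normal(10 * fs)
stream[3 * fs:4 * fs] += 0.5 * master
hit = detect(master, stream)           # detection succeeds in this toy case
print(hit, "magnitude offset:", np.log10(hit[2]))   # ~ -0.3 units smaller
```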

  15. Stressful Life Events and Depressive Symptomatology Among Basque Adolescents: The Mediating Role of Attachment Representations.

    PubMed

    Aliri, Jone; Muela, Alexander; Gorostiaga, Arantxa; Balluerka, Nekane; Aritzeta, Aitor; Soroa, Goretti

    2018-01-01

    The occurrence of stressful life events is a risk factor for psychopathology in adolescence. Depression is a problem of notable clinical importance that has a negative psychosocial impact on adolescents and which has considerable social, educational, and economic costs. The aim of this study was to examine the relationship between stressful life events and depressive symptomatology in adolescence, taking into account the effect that attachment representations may have on this relation. Participants were 1653 adolescents (951 girls) aged between 13 and 18 years. The sample was selected by means of a random sampling procedure based on the availability of schools to participate. Data were collected at two time points: attachment and stressful life events were assessed first, and symptoms of depression were evaluated eight to nine months later. Two time points were used in order to better analyze the mediating role of attachment security. Stressful life events were recorded using the Inventory of Stressful Life Events, attachment was evaluated by the Inventory of Parent and Peer Attachment (mother, father, and peer versions), and depressive symptomatology was assessed through the Children's Depression Scale. In all cases, the Basque version of these scales was used. The results indicated that attachment to parents was a mediating variable in the relationship between stressful life events and depressive symptomatology. Contrary to what we expected, the results indicate that stressful life events did not have a negative effect on peer attachment, and neither did the latter variable act as a mediator of the relationship between stressful life events and depressive symptoms. It can be concluded that attachment-based interventions may be especially useful for reducing depression symptoms among adolescents. The findings also suggest a role for interventions that target parent-child attachment relationships.

  16. A Saccade Based Framework for Real-Time Motion Segmentation Using Event Based Vision Sensors

    PubMed Central

    Mishra, Abhishek; Ghosh, Rohan; Principe, Jose C.; Thakor, Nitish V.; Kukreja, Sunil L.

    2017-01-01

    Motion segmentation is a critical pre-processing step for autonomous robotic systems to facilitate tracking of moving objects in cluttered environments. Event-based sensors are low-power analog devices that represent a scene by means of asynchronous information updates of only the dynamic details at high temporal resolution and, hence, require significantly fewer calculations. However, motion segmentation using spatiotemporal data is a challenging task due to data asynchrony. Prior approaches for object tracking using neuromorphic sensors perform well while the sensor is static or when a known model of the object to be followed is available. To address these limitations, in this paper we develop a technique for generalized motion segmentation based on spatial statistics across time frames. First, we create micromotion on the platform to facilitate the separation of static and dynamic elements of a scene, inspired by human saccadic eye movements. Second, we introduce the concept of spike-groups as a methodology to partition spatio-temporal event groups, which facilitates the computation of scene statistics and the characterization of the objects in it. Experimental results show that our algorithm is able to classify dynamic objects with a moving camera with a maximum accuracy of 92%. PMID:28316563

  17. Visual form predictions facilitate auditory processing at the N1.

    PubMed

    Paris, Tim; Kim, Jeesun; Davis, Chris

    2017-02-20

    Auditory-visual (AV) events often involve a leading visual cue (e.g. auditory-visual speech) that allows the perceiver to generate predictions about the upcoming auditory event. Electrophysiological evidence suggests that when an auditory event is predicted, processing is sped up, i.e., the N1 component of the ERP occurs earlier (N1 facilitation). However, it is not clear (1) whether N1 facilitation is based specifically on prediction rather than on multisensory integration and (2) which particular properties of the visual cue it is based on. The current experiment used artificial AV stimuli in which visual cues predicted but did not co-occur with auditory cues. Visual form cues (high and low salience) and the auditory-visual pairing were manipulated so that auditory predictions could be based on form and timing or on timing only. The results showed that N1 facilitation occurred only for combined form and temporal predictions. These results suggest that faster auditory processing (as indicated by N1 facilitation) is based on predictive processing generated by a visual cue that clearly predicts both what and when the auditory stimulus will occur. Copyright © 2016. Published by Elsevier Ltd.

  18. Multi Sensor Fusion Framework for Indoor-Outdoor Localization of Limited Resource Mobile Robots

    PubMed Central

    Marín, Leonardo; Vallés, Marina; Soriano, Ángel; Valera, Ángel; Albertos, Pedro

    2013-01-01

    This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event based Kalman Filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event based schedule, using fewer resources (execution time and bandwidth) but with similar performance when compared to the traditional methods. The event is defined to reflect the necessity of the global information, when the estimation error covariance exceeds a predefined limit. The proposed experimental platforms are based on the LEGO Mindstorm NXT, and consist of a differential wheel mobile robot navigating indoors with a zenithal camera as global sensor, and an Ackermann steering mobile robot navigating outdoors with a SBG Systems GPS accessed through an IGEP board that also serves as datalogger. The IMU in both robots is built using the NXT motor encoders along with one gyroscope, one compass and two accelerometers from Hitecnic, placed according to a particle based dynamic model of the robots. The tests performed reflect the correct performance and low execution time of the proposed framework. The robustness and stability are observed during a long walk test in both indoor and outdoor environments. PMID:24152933

  19. Multi sensor fusion framework for indoor-outdoor localization of limited resource mobile robots.

    PubMed

    Marín, Leonardo; Vallés, Marina; Soriano, Ángel; Valera, Ángel; Albertos, Pedro

    2013-10-21

    This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event based Kalman Filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event based schedule, using fewer resources (execution time and bandwidth) but with similar performance when compared to the traditional methods. The event is defined to reflect the necessity of the global information, when the estimation error covariance exceeds a predefined limit. The proposed experimental platforms are based on the LEGO Mindstorm NXT, and consist of a differential wheel mobile robot navigating indoors with a zenithal camera as global sensor, and an Ackermann steering mobile robot navigating outdoors with a SBG Systems GPS accessed through an IGEP board that also serves as datalogger. The IMU in both robots is built using the NXT motor encoders along with one gyroscope, one compass and two accelerometers from Hitecnic, placed according to a particle based dynamic model of the robots. The tests performed reflect the correct performance and low execution time of the proposed framework. The robustness and stability are observed during a long walk test in both indoor and outdoor environments.
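
    The event-based fusion idea in the two records above can be sketched with a one-dimensional Kalman filter that always predicts with the local (IMU-like) model but queries the global sensor only when the predicted position variance exceeds a limit; the dynamics and noise values below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# 1-D constant-velocity robot: state [position, velocity]
dt = 0.1
F = np.array([[1, dt], [0, 1]])
Q = np.diag([1e-4, 1e-3])            # IMU/odometry process noise
H = np.array([[1.0, 0.0]])           # global sensor measures position
R = np.array([[0.01]])
COV_LIMIT = 0.05                     # event threshold on position variance

x = np.array([0.0, 1.0])
P = np.eye(2) * 0.1
true = np.array([0.0, 1.0])
updates = 0
for k in range(200):
    true = F @ true + rng.multivariate_normal([0, 0], Q)
    # always predict with the local (IMU) model
    x = F @ x
    P = F @ P @ F.T + Q
    # event-based correction: query the global sensor only when the
    # predicted position variance exceeds the limit
    if P[0, 0] > COV_LIMIT:
        z = H @ true + rng.multivariate_normal([0], R)
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(2) - K @ H) @ P
        updates += 1

print(f"global-sensor updates used: {updates}/200")
print(f"final position error: {abs(x[0] - true[0]):.3f}")
```

    The saving in bandwidth and execution time comes directly from the gap between the number of updates actually triggered and the 200 a conventional filter would perform.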

  20. Towards marine seismological Network: real time small aperture seismic array

    NASA Astrophysics Data System (ADS)

    Ilinskiy, Dmitry

    2017-04-01

    The most powerful and dangerous seismic events are generated in underwater subduction zones, yet existing seismological networks are based on land stations. Increasing demands for accuracy in the location, magnitude, and rupture process of earthquakes, together with the need to reduce data processing time, require information from seabed seismic stations located near the earthquake generation area. Marine stations provide an important contribution to clarifying the tectonic settings in the most active subduction zones of the world. An early warning system for a subduction zone is based on a marine seabed array located near the most hazardous seismic zone in the region. Fast-track processing to locate the earthquake hypocenter and estimate its energy takes place in the buoy surface unit. Information about a detected and located earthquake reaches the onshore seismological center earlier than the first-break waves from the same earthquake reach the nearest onshore seismological station. Implementation of the small aperture array builds on existing, well-proven, and cost-effective solutions such as moored weather buoys and autonomous self-pop-up seabed seismic nodes. A permanent seabed system for real-time operation has to be installed in deep waters far from the coast. The seabed array consists of several self-pop-up seismological stations that continuously acquire data, detect events of a certain energy class, and send the detected event parameters to the surface buoy via an acoustic link. The surface buoy unit determines the earthquake location from the event parameters received from the seabed units and sends this information in semi-real time to the onshore seismological center via a narrow-band satellite link. Upon request from the coast, the system can send waveforms of events of a certain energy class, the battery status of the bottom seismic stations, and other environmental parameters. When the battery of a particular seabed unit is close to empty, the unit switches into sleep mode and sends that information to the surface buoy and further to the onshore data center. The seabed unit can then wait for a vessel of opportunity to recover it to the sea surface and replace it with another station with fresh batteries. All permanent seismic data collected by the seabed unit can then be downloaded for further processing and analysis. In our presentation we will demonstrate several working prototypes of the proposed system, such as a real-time cabled broadband seismological station and a real-time buoy seabed seismological station.

  1. Multi-event study of high-latitude thermospheric wind variations at substorm onset with a Fabry-Perot interferometer at Tromsoe, Norway

    NASA Astrophysics Data System (ADS)

    Xu, H.; Shiokawa, K.; Oyama, S. I.; Otsuka, Y.

    2017-12-01

    We studied the high-latitude thermospheric wind variations near the onset time of isolated substorms. Substorm-related energy input from the magnetosphere to the polar ionosphere modifies the high-latitude ionosphere and thermosphere. For the first time, this study showed the characteristics of high-latitude thermospheric wind variations at substorm onset. We also investigated the possibility that these wind variations act as a potential trigger of substorm onset by modifying the ionospheric current system (Kan, 1993). A Fabry-Perot interferometer (FPI) at Tromsoe, Norway provided wind measurements estimated from the Doppler shift of both red-line (630.0 nm, for the F region) and green-line (557.7 nm, for the E region) emissions of aurora and airglow. We used seven years of data obtained from 2009 to 2015 with a time resolution of 13 min. We first identified the onset times of local isolated substorms using ground-based magnetometer data obtained at the Tromsoe and Bear Island stations, which belong to the IMAGE magnetometer chain. We obtained 4 red-line events and 5 green-line events that took place at different local times. For all these events, the peak locations of the westward ionospheric currents identified by the ground-based magnetometer chain were located poleward of Tromsoe. We then calculated two weighted averages of wind velocities, one for the 30 min around the onset time and one for the 30 min after the onset time of the substorms. We evaluated the differences between these two weighted averages to estimate the strength of the wind changes. The observed wind changes at these substorm onsets were less than 49 m/s (26 m/s) for red-line (green-line) events, which is much smaller than the typical plasma convection speed. This indicates that the plasma motion caused by substorm-induced thermospheric winds through ion-neutral collisions is a minor effect as a driver of high-latitude plasma convection, as well as in the triggering of substorm onset. We discuss possible causes of the observed wind changes at substorm onset based on the mechanisms of thermospheric diurnal tides, the arc-induced electric field, and Joule heating caused by the auroral activities identified in the cross sections of all-sky images, as well as the IMF-associated plasma convection model.

  2. Intensity - Duration - Frequency Curves for U.S. Cities in a Warming Climate

    NASA Astrophysics Data System (ADS)

    Ragno, Elisa; AghaKouchak, Amir; Love, Charlotte; Vahedifard, Farshid; Cheng, Linyin; Lima, Carlos

    2017-04-01

    Current infrastructure design procedures rely on the use of Intensity - Duration - Frequency (IDF) curves retrieved under the assumption of temporal stationarity, meaning that occurrences of extreme events are expected to be time invariant. However, numerous studies have observed more severe extreme events over time. Hence, the stationarity assumption for extreme value analysis may not be appropriate in a warming climate. This issue raises concerns regarding the safety and resilience of infrastructure and natural slopes. Here we employ daily precipitation data from historical and projected (RCP 8.5) CMIP5 runs to investigate IDF curves of 14 urban areas across the United States. We first statistically assess changes in precipitation extremes using an energy-based test for equal distributions. Then, through a Bayesian inference approach for stationary and non-stationary extreme value analysis, we provide updated IDF curves based on future climate model projections. We show that, based on CMIP5 simulations, U.S. cities may experience extreme precipitation events up to 20% more intense and twice as frequent, relative to historical records, despite the expectation of unchanged annual mean precipitation.
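
    The translation from fitted extremes to design values can be illustrated with a stationary GEV fit. The sketch below uses synthetic annual maxima in place of the CMIP5 series and plain maximum likelihood rather than the study's Bayesian non-stationary analysis; all parameter values are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# synthetic annual maximum daily precipitation (mm), standing in for the
# historical and RCP 8.5 series drawn from CMIP5 runs
hist = stats.genextreme.rvs(c=-0.1, loc=60, scale=15, size=50,
                            random_state=rng)
future = stats.genextreme.rvs(c=-0.1, loc=72, scale=18, size=50,
                              random_state=rng)

def return_level(sample, T):
    """T-year return level from a stationary GEV fit to annual maxima."""
    c, loc, scale = stats.genextreme.fit(sample)
    return stats.genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)

for T in (10, 50, 100):
    rl_h, rl_f = return_level(hist, T), return_level(future, T)
    print(f"{T:3d}-yr event: {rl_h:6.1f} mm -> {rl_f:6.1f} mm "
          f"({100 * (rl_f / rl_h - 1):+.0f}%)")
```

    A shift in the fitted location and scale propagates into markedly larger rare-event quantiles, which is how modest changes in the distribution translate into the intensification the study reports.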

  3. Forecast Based Financing for Managing Weather and Climate Risks to Reduce Potential Disaster Impacts

    NASA Astrophysics Data System (ADS)

    Arrighi, J.

    2017-12-01

    There is a critical window of time to reduce potential impacts of a disaster after a forecast for heightened risk is issued and before an extreme event occurs. The concept of Forecast-based Financing focuses on this window of opportunity. Through advanced preparation during system set-up, tailored methodologies are used to 1) analyze a range of potential extreme event forecasts, 2) identify emergency preparedness measures that can be taken when factoring in forecast lead time and inherent uncertainty and 3) develop standard operating procedures that are agreed on and tied to guaranteed funding sources to facilitate emergency measures led by the Red Cross or government actors when preparedness measures are triggered. This presentation will focus on a broad overview of the current state of theory and approaches used in developing a forecast-based financing systems - with a specific focus on hydrologic events, case studies of success and challenges in various contexts where this approach is being piloted, as well as what is on the horizon to be further explored and developed from a research perspective as the application of this approach continues to expand.

  4. Real-time detection of organic contamination events in water distribution systems by principal components analysis of ultraviolet spectral data.

    PubMed

    Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo

    2017-05-01

    The detection of organic contaminants in water distribution systems is essential to protect public health from potentially harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on the discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. First, the spectrum of each observation is transformed using a discrete wavelet transform with a coiflet mother wavelet to capture abrupt changes along the wavelength axis. Principal component analysis is then employed to approximate the spectra based on the captured and fused features. Hotelling's T² statistic is calculated, and significant values are used to detect outliers. An alarm of a contamination event is triggered by sequential Bayesian analysis when outliers appear continuously in several observations. The effectiveness of the proposed procedure is tested on-line using a pilot-scale setup and experimental data.
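
    A compact sketch of this pipeline, wavelet details as features, PCA fitted on clean baseline spectra, and a Hotelling T² score per new spectrum, is given below using the PyWavelets package; the spectra, wavelet choice, and component count are illustrative rather than the paper's calibrated settings.

```python
import numpy as np
import pywt

def spectrum_features(spectrum, wavelet="coif3", level=3):
    """Detail coefficients of a discrete wavelet transform, capturing
    abrupt absorbance changes along the wavelength axis."""
    coeffs = pywt.wavedec(spectrum, wavelet, level=level)
    return np.concatenate(coeffs[1:])          # drop the approximation

def fit_t2_monitor(baseline, n_pc=3):
    """Fit PCA on clean baseline spectra; return a Hotelling T^2 scorer."""
    X = np.array([spectrum_features(s) for s in baseline])
    mu = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mu, full_matrices=False)
    comps, var = Vt[:n_pc], (S[:n_pc] ** 2) / (len(X) - 1)

    def t2(spectrum):
        score = comps @ (spectrum_features(spectrum) - mu)
        return float(np.sum(score ** 2 / var))
    return t2

# toy UV spectra: 256 wavelengths; contamination adds a local absorbance bump
rng = np.random.default_rng(2)
wl = np.linspace(0, 1, 256)
clean = lambda: np.exp(-((wl - 0.3) / 0.1) ** 2) + 0.01 * rng.standard_normal(256)
monitor = fit_t2_monitor([clean() for _ in range(200)])

contaminated = clean() + 0.15 * np.exp(-((wl - 0.6) / 0.02) ** 2)
print("T2 clean:", monitor(clean()), " T2 event:", monitor(contaminated))
# an alarm would be raised only after several consecutive exceedances,
# echoing the sequential Bayesian confirmation step in the paper
```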

  5. An operational procedure for rapid flood risk assessment in Europe

    NASA Astrophysics Data System (ADS)

    Dottori, Francesco; Kalas, Milan; Salamon, Peter; Bianchi, Alessandra; Alfieri, Lorenzo; Feyen, Luc

    2017-07-01

    The development of methods for rapid flood mapping and risk assessment is a key step to increase the usefulness of flood early warning systems and is crucial for effective emergency response and flood impact mitigation. Currently, flood early warning systems rarely include real-time components to assess potential impacts generated by forecasted flood events. To overcome this limitation, this study describes the benchmarking of an operational procedure for rapid flood risk assessment based on predictions issued by the European Flood Awareness System (EFAS). Daily streamflow forecasts produced for major European river networks are translated into event-based flood hazard maps using a large map catalogue derived from high-resolution hydrodynamic simulations. Flood hazard maps are then combined with exposure and vulnerability information, and the impacts of the forecasted flood events are evaluated in terms of flood-prone areas, economic damage and affected population, infrastructures and cities. An extensive testing of the operational procedure has been carried out by analysing the catastrophic floods of May 2014 in Bosnia-Herzegovina, Croatia and Serbia. The reliability of the flood mapping methodology is tested against satellite-based and report-based flood extent data, while modelled estimates of economic damage and affected population are compared against ground-based estimations. Finally, we evaluate the skill of risk estimates derived from EFAS flood forecasts with different lead times and combinations of probabilistic forecasts. Results highlight the potential of the real-time operational procedure in helping emergency response and management.

  6. Event-triggered resilient filtering with stochastic uncertainties and successive packet dropouts via variance-constrained approach

    NASA Astrophysics Data System (ADS)

    Jia, Chaoqing; Hu, Jun; Chen, Dongyan; Liu, Yurong; Alsaadi, Fuad E.

    2018-07-01

    In this paper, we discuss the event-triggered resilient filtering problem for a class of time-varying systems subject to stochastic uncertainties and successive packet dropouts. The event-triggered mechanism is employed in the hope of reducing the communication burden and saving network resources. The stochastic uncertainties are considered to describe the modelling errors, and the phenomenon of successive packet dropouts is characterized by a random variable obeying the Bernoulli distribution. The aim of the paper is to provide a resilient event-based filtering approach for the addressed time-varying systems such that, for all stochastic uncertainties, successive packet dropouts and filter gain perturbations, an optimized upper bound of the filtering error covariance is obtained by designing the filter gain. Finally, simulations are provided to demonstrate the effectiveness of the proposed robust optimal filtering strategy.

  7. Testing the seismology-based landquake monitoring system

    NASA Astrophysics Data System (ADS)

    Chao, Wei-An

    2016-04-01

    I have developed a real-time landquake monitoring (RLM) system, which monitors large-scale landquake activity in Taiwan using the real-time seismic network of the Broadband Array in Taiwan for Seismology (BATS). The RLM system applies a grid-based general source inversion (GSI) technique to obtain a preliminary source location and force mechanism. A 2-D virtual source grid on the Taiwan Island is created with an interval of 0.2° in both latitude and longitude. The depth of each grid point is fixed on the free-surface topography. A database of synthetics is stored on disk; these are obtained using Green's functions computed by the propagator matrix approach for a 1-D average velocity model, at all stations from each virtual source-grid point, for nine elementary source components: six elementary moment tensors and three orthogonal (north, east and vertical) single forces. The RLM system was first run offline for events detected in previous studies. An important aspect of the RLM system is the implementation of the GSI approach for different source types (e.g., full moment tensor, double-couple faulting, and explosion source) by a grid search through the 2-D virtual source grid to automatically identify landquake events based on the improvement in waveform fitness and evaluate the best-fit solution in the monitoring area. With this approach, not only the force mechanisms but also the event occurrence time and location can be obtained simultaneously, about 6-8 min after the occurrence of an event. To improve the accuracy of the GSI-determined location, I further apply a landquake epicenter determination (LED) method that maximizes the coherency of the high-frequency (1-3 Hz) horizontal envelope functions to determine the final source location. With good knowledge of the source location, I perform landquake force history (LFH) inversion to investigate the source dynamics (e.g., trajectory) of relatively large landquake events. By providing the aforementioned source information in real time, the government and emergency response agencies have sufficient reaction time for rapid assessment of and response to landquake hazards. Since 2016, the RLM system has operated online.

  8. Continuous robust sound event classification using time-frequency features and deep learning

    PubMed Central

    Song, Yan; Xiao, Wei; Phan, Huy

    2017-01-01

    The automatic detection and recognition of sound events by computers is a requirement for a number of emerging sensing and human computer interaction technologies. Recent advances in this field have been achieved by machine learning classifiers working in conjunction with time-frequency feature representations. This combination has achieved excellent accuracy for classification of discrete sounds. The ability to recognise sounds under real-world noisy conditions, called robust sound event classification, is an especially challenging task that has attracted recent research attention. Another aspect of real-world conditions is the classification of continuous, occluded or overlapping sounds, rather than classification of short isolated sound recordings. This paper addresses the classification of noise-corrupted, occluded, overlapped, continuous sound recordings. It first proposes a standard evaluation task for such sounds based upon a common existing method for evaluating isolated sound classification. It then benchmarks several high-performing isolated sound classifiers to operate with continuous sound data by incorporating an energy-based event detection front end. Results are reported for each tested system using the new task, to provide the first analysis of their performance for continuous sound event detection. In addition it proposes and evaluates a novel Bayesian-inspired front end for the segmentation and detection of continuous sound recordings prior to classification. PMID:28892478

  9. Continuous robust sound event classification using time-frequency features and deep learning.

    PubMed

    McLoughlin, Ian; Zhang, Haomin; Xie, Zhipeng; Song, Yan; Xiao, Wei; Phan, Huy

    2017-01-01

    The automatic detection and recognition of sound events by computers is a requirement for a number of emerging sensing and human computer interaction technologies. Recent advances in this field have been achieved by machine learning classifiers working in conjunction with time-frequency feature representations. This combination has achieved excellent accuracy for classification of discrete sounds. The ability to recognise sounds under real-world noisy conditions, called robust sound event classification, is an especially challenging task that has attracted recent research attention. Another aspect of real-world conditions is the classification of continuous, occluded or overlapping sounds, rather than classification of short isolated sound recordings. This paper addresses the classification of noise-corrupted, occluded, overlapped, continuous sound recordings. It first proposes a standard evaluation task for such sounds based upon a common existing method for evaluating isolated sound classification. It then benchmarks several high-performing isolated sound classifiers to operate with continuous sound data by incorporating an energy-based event detection front end. Results are reported for each tested system using the new task, to provide the first analysis of their performance for continuous sound event detection. In addition it proposes and evaluates a novel Bayesian-inspired front end for the segmentation and detection of continuous sound recordings prior to classification.
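
    The energy-based event detection front end mentioned in both records can be approximated by framing the signal, thresholding log energy against an adaptive floor, and grouping active frames into candidate events, as in the sketch below (the frame sizes and the 12 dB margin are illustrative choices, not the paper's settings).

```python
import numpy as np

def energy_segments(x, fs, frame_s=0.032, hop_s=0.016, db_above_floor=12.0):
    """Energy-based event detection front end for continuous audio.

    Frames the signal, computes log energy, and marks frames exceeding
    an adaptive noise floor; contiguous active frames become candidate
    sound events to pass to an isolated-sound classifier.
    """
    frame, hop = int(frame_s * fs), int(hop_s * fs)
    n_frames = 1 + (len(x) - frame) // hop
    e = np.array([np.sum(x[i * hop:i * hop + frame] ** 2)
                  for i in range(n_frames)])
    log_e = 10 * np.log10(e + 1e-12)
    active = log_e > np.percentile(log_e, 20) + db_above_floor
    # group contiguous active frames into (start_s, end_s) segments
    segs, start = [], None
    for i, a in enumerate(active):
        if a and start is None:
            start = i
        elif not a and start is not None:
            segs.append((start * hop / fs, (i * hop + frame) / fs))
            start = None
    if start is not None:
        segs.append((start * hop / fs, len(x) / fs))
    return segs

# toy stream: noise floor with two tone bursts
fs = 16000
t = np.arange(0, 4, 1 / fs)
x = 0.01 * np.random.default_rng(4).standard_normal(len(t))
x[int(0.5 * fs):int(1.0 * fs)] += 0.3 * np.sin(2 * np.pi * 440 * t[:int(0.5 * fs)])
x[int(2.5 * fs):int(2.8 * fs)] += 0.3 * np.sin(2 * np.pi * 880 * t[:int(0.3 * fs)])
print(energy_segments(x, fs))  # approximately [(0.5, 1.0), (2.5, 2.8)]
```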

  10. Bayesian Phase II optimization for time-to-event data based on historical information.

    PubMed

    Bertsche, Anja; Fleischer, Frank; Beyersmann, Jan; Nehmiz, Gerhard

    2017-01-01

    After exploratory drug development, companies face the decision whether to initiate confirmatory trials based on limited efficacy information. This proof-of-concept decision is typically performed after a Phase II trial studying a novel treatment versus either placebo or an active comparator. The article aims to optimize the design of such a proof-of-concept trial with respect to decision making. We incorporate historical information and develop pre-specified decision criteria accounting for the uncertainty of the observed treatment effect. We optimize these criteria based on sensitivity and specificity, given the historical information. Specifically, time-to-event data are considered in a randomized 2-arm trial with additional prior information on the control treatment. The proof-of-concept criterion uses treatment effect size, rather than significance. Criteria are defined on the posterior distribution of the hazard ratio given the Phase II data and the historical control information. Event times are exponentially modeled within groups, allowing for group-specific conjugate prior-to-posterior calculation. While a non-informative prior is placed on the investigational treatment, the control prior is constructed via the meta-analytic-predictive approach. The design parameters including sample size and allocation ratio are then optimized, maximizing the probability of taking the right decision. The approach is illustrated with an example in lung cancer.
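
    Under the exponential event-time model, each arm has a conjugate gamma posterior for its hazard, and the decision criterion can be evaluated by Monte Carlo on the hazard ratio. The sketch below uses hypothetical event counts and exposure times, and approximates the meta-analytic-predictive control prior by a single informative gamma.

```python
import numpy as np

rng = np.random.default_rng(8)

def posterior_hazard(events, total_time, a0, b0, size=100_000):
    """Gamma posterior draws for an exponential hazard: prior Gamma(a0, b0)
    updated with (number of events, total exposure time) from a trial arm."""
    return rng.gamma(a0 + events, 1.0 / (b0 + total_time), size)

# control arm: informative prior standing in for the meta-analytic-predictive
# prior (hypothetical values); treatment arm: vague prior
lam_ctrl = posterior_hazard(events=30, total_time=310.0, a0=25.0, b0=250.0)
lam_trt = posterior_hazard(events=18, total_time=350.0, a0=0.1, b0=0.1)

hr = lam_trt / lam_ctrl
go = np.mean(hr < 0.8)                 # targeted effect size on the HR scale
print(f"P(HR < 0.8 | data) = {go:.2f} -> {'GO' if go >= 0.7 else 'NO-GO'}")
```

    Thresholding the posterior probability of a clinically relevant hazard ratio, rather than a p-value, is what lets sensitivity and specificity of the go/no-go decision be tuned at the design stage.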

  11. Event-Based Surveillance During EXPO Milan 2015: Rationale, Tools, Procedures, and Initial Results

    PubMed Central

    Manso, Martina Del; Caporali, Maria Grazia; Napoli, Christian; Linge, Jens P.; Mantica, Eleonora; Verile, Marco; Piatti, Alessandra; Pompa, Maria Grazia; Vellucci, Loredana; Costanzo, Virgilio; Bastiampillai, Anan Judina; Gabrielli, Eugenia; Gramegna, Maria; Declich, Silvia

    2016-01-01

    More than 21 million participants attended EXPO Milan from May to October 2015, making it one of the largest protracted mass gathering events in Europe. Given the expected national and international population movement and health security issues associated with this event, Italy fully implemented, for the first time, an event-based surveillance (EBS) system focusing on naturally occurring infectious diseases and the monitoring of biological agents with potential for intentional release. The system started its pilot phase in March 2015 and was fully operational between April and November 2015. In order to set the specific objectives of the EBS system, and its complementary role to indicator-based surveillance, we defined a list of priority diseases and conditions. This list was designed on the basis of the probability and possible public health impact of infectious disease transmission, existing statutory surveillance systems in place, and any surveillance enhancements during the mass gathering event. This article reports the methodology used to design the EBS system for EXPO Milan and the results of 8 months of surveillance. PMID:27314656

  12. Experiencing Past and Future Personal Events: Functional Neuroimaging Evidence on the Neural Bases of Mental Time Travel

    ERIC Educational Resources Information Center

    Botzung, Anne; Denkova, Ekaterina; Manning, Lilianne

    2008-01-01

    Functional MRI was used in healthy subjects to investigate the existence of common neural structures supporting re-experiencing the past and pre-experiencing the future. Past and future events evocation appears to involve highly similar patterns of brain activation including, in particular, the medial prefrontal cortex, posterior regions and the…

  13. Toward a Healthy Community (Organizing Events for Community Health Promotion).

    ERIC Educational Resources Information Center

    Public Health Service (DHHS), Rockville, MD. Office of Disease Prevention and Health Promotion.

    This booklet suggests the first steps communities can take in assessing their needs and resources and mobilizing public interest and support for health promotion. It is based on an approach to health education and community organization that recognizes the value of a highly visible, time-limited event, such as a health fair, a marathon, or an…

  14. Pyroclastic density current hazard maps at Campi Flegrei caldera (Italy): the effects of event scale, vent location and time forecasts.

    NASA Astrophysics Data System (ADS)

    Bevilacqua, Andrea; Neri, Augusto; Esposti Ongaro, Tomaso; Isaia, Roberto; Flandoli, Franco; Bisson, Marina

    2016-04-01

    Today hundreds of thousands of people live inside the Campi Flegrei caldera (Italy) and in the adjacent part of the city of Naples, making a future eruption of this volcano an event with potentially huge consequences. Very high risks are associated with the occurrence of pyroclastic density currents (PDCs). Mapping the background, or long-term, PDC hazard in the area is a great challenge due to the unknown eruption time, scale and vent location of the next event, as well as the complex dynamics of the flow over the caldera topography. This is additionally complicated by the remarkable epistemic uncertainty in the eruptive record, affecting the timing of past events and the location of vents as well as the estimates of PDC areal extent. First probability maps of PDC invasion were produced by combining a vent-opening probability map, statistical estimates of the eruptive scales, and a Cox-type temporal model including self-excitement effects, based on the eruptive record of the last 15 kyr. Maps were produced using a Monte Carlo approach and adopting a simplified inundation model based on the "box model" integral approximation, tested against 2D transient numerical simulations of flow dynamics. In this presentation we illustrate the independent effects of the eruption scale, vent location and time of forecast of the next event. Specific focus was given to the remarkable differences between the eastern and western sectors of the caldera and their effects on the hazard maps. The analysis allowed us to identify areas with elevated probabilities of flow invasion as a function of the diverse assumptions made. By quantifying some sources of uncertainty in the system, we were also able to provide mean and percentile maps of PDC hazard levels.
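
    The Monte Carlo construction of such maps can be sketched as: sample a vent location and an eruption scale, convert the scale to a runout distance with a box-model-style surrogate, and accumulate invasion counts per grid cell. The distributions and constants below are invented, and the axisymmetric runout ignores the topographic control that the real maps include.

```python
import numpy as np

rng = np.random.default_rng(9)

# 1 km grid over a 30 x 30 km area around the caldera (schematic)
xs, ys = np.meshgrid(np.arange(30.0), np.arange(30.0))
hits = np.zeros_like(xs)

N = 20_000
for _ in range(N):
    # sample a vent from a (here uniform) vent-opening distribution and an
    # eruption scale; a real map would use the paper's statistical fits
    vx, vy = rng.uniform(10, 20, size=2)
    volume = 10 ** rng.uniform(-2, 0)          # km^3, log-uniform scale
    # box-model-style surrogate: runout grows as a power of the volume
    runout = 8.0 * volume ** 0.4               # km, illustrative constants
    hits += np.hypot(xs - vx, ys - vy) <= runout

p_invasion = hits / N
print("max invasion probability:", p_invasion.max())
print("area with P > 0.1:", int((p_invasion > 0.1).sum()), "km^2")
```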

  15. Genetic Stratigraphy of Key Demographic Events in Arabia

    PubMed Central

    Fernandes, Verónica; Triska, Petr; Pereira, Joana B.; Alshamali, Farida; Rito, Teresa; Machado, Alison; Fajkošová, Zuzana; Cavadas, Bruno; Černý, Viktor; Soares, Pedro

    2015-01-01

    At the crossroads between Africa and Eurasia, Arabia is necessarily a melting pot, its peoples enriched by successive gene flow over the generations. Estimating the timing and impact of these multiple migrations is an important step in reconstructing the key demographic events in human history. However, current methods based on genome-wide information identify admixture events inefficiently, tending to estimate only the more recent ages, as here in the case of admixture events across the Red Sea (∼8–37 generations for African input into Arabia, and 30–90 generations for “back-to-Africa” migrations). An mtDNA-based founder analysis, corroborated by detailed analysis of the whole-mtDNA genome, affords an alternative means by which to identify, date and quantify multiple migration events at greater time depths, across the full range of modern human history, albeit for the maternal line of descent only. In Arabia, this approach enables us to infer several major pulses of dispersal between the Near East and Arabia, most likely via the Gulf corridor. Although some relict lineages survive in Arabia from the time of the out-of-Africa dispersal, 60 ka, the major episodes in the peopling of the Peninsula took place from north to south in the Late Glacial and, to a lesser extent, the immediate post-glacial/Neolithic. Exchanges across the Red Sea were mainly due to the Arab slave trade and maritime dominance (from ∼2.5 ka to very recent times), but had already begun by the early Holocene, fuelled by the establishment of maritime networks since ∼8 ka. The main “back-to-Africa” migrations, again undetected by genome-wide dating analyses, occurred in the Late Glacial period for introductions into eastern Africa, whilst the Neolithic was more significant for migrations towards North Africa. PMID:25738654

  16. Effects of Marijuana on Ictal and Interictal EEG Activities in Idiopathic Generalized Epilepsy.

    PubMed

    Sivakumar, Sanjeev; Zutshi, Deepti; Seraji-Bozorgzad, Navid; Shah, Aashit K

    2017-01-01

    Marijuana-based treatment for refractory epilepsy shows promise in surveys, case series, and clinical trials. However, literature on its EEG effects is sparse. Our objective is to analyze the effect of marijuana on the EEG of a 24-year-old patient with idiopathic generalized epilepsy treated with cannabis. We blindly reviewed 3 long-term EEGs: a 24-hour study while only on antiepileptic drugs, a 72-hour EEG with Cannabis indica smoked on days 1 and 3 in addition to antiepileptic drugs, and a 48-hour EEG with a combination C indica/sativa smoked on day 1 plus antiepileptic drugs. Generalized spike-wave discharges and diffuse paroxysmal fast activity were categorized as interictal or ictal based on duration (less than vs. greater than 10 seconds, respectively). Data from the three studies were concatenated into a contiguous time series, with usage of marijuana modeled as a time-dependent discrete variable while interictal and ictal events constituted the dependent variables. Analysis of variance was performed as an initial test for significance, followed by time-series analysis using a generalized autoregressive conditional heteroscedasticity (GARCH) model. Statistical significance for lower interictal events (analysis of variance P = 0.001) was seen during C indica use, but not for the C indica/sativa mixture (P = 0.629) or for ictal events (P = 0.087). However, time-series analysis revealed a significant inverse correlation between marijuana use and both interictal (P < 0.0004) and ictal (P = 0.002) event rates. Using a novel approach to EEG data, we demonstrate a decrease in interictal and ictal electrographic events during marijuana use. Larger samples of patients and EEG data, with standardized cannabinoid formulation and dosing, are needed to validate our findings.
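
    The core statistical step, a regression of electrographic event rates on a time-dependent marijuana-use indicator with GARCH(1,1) errors, can be sketched with the Python arch package. All data below are synthetic, and the epoch length and effect size are invented; only the model form follows the description above.

        # Illustrative sketch only: synthetic hourly event rates with a use indicator,
        # fit with a least-squares mean model and GARCH(1,1) errors via 'arch'.
        import numpy as np
        from arch import arch_model

        rng = np.random.default_rng(0)
        n = 144                                              # e.g. 144 hourly epochs
        use = (rng.random(n) < 0.3).astype(float)            # marijuana-use indicator
        events = 5.0 - 2.0 * use + rng.normal(0, 1.5, n)     # synthetic event rates

        model = arch_model(events, x=use[:, None], mean="LS", vol="GARCH", p=1, q=1)
        res = model.fit(disp="off")
        print(res.params)   # a negative coefficient on the use regressor mirrors
                            # the inverse correlation reported above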

  17. Genetic stratigraphy of key demographic events in Arabia.

    PubMed

    Fernandes, Verónica; Triska, Petr; Pereira, Joana B; Alshamali, Farida; Rito, Teresa; Machado, Alison; Fajkošová, Zuzana; Cavadas, Bruno; Černý, Viktor; Soares, Pedro; Richards, Martin B; Pereira, Luísa

    2015-01-01

    At the crossroads between Africa and Eurasia, Arabia is necessarily a melting pot, its peoples enriched by successive gene flow over the generations. Estimating the timing and impact of these multiple migrations is an important step in reconstructing the key demographic events in human history. However, current methods based on genome-wide information identify admixture events inefficiently, tending to estimate only the more recent ages, as here in the case of admixture events across the Red Sea (~8-37 generations for African input into Arabia, and 30-90 generations for "back-to-Africa" migrations). An mtDNA-based founder analysis, corroborated by detailed analysis of the whole-mtDNA genome, affords an alternative means by which to identify, date and quantify multiple migration events at greater time depths, across the full range of modern human history, albeit for the maternal line of descent only. In Arabia, this approach enables us to infer several major pulses of dispersal between the Near East and Arabia, most likely via the Gulf corridor. Although some relict lineages survive in Arabia from the time of the out-of-Africa dispersal, 60 ka, the major episodes in the peopling of the Peninsula took place from north to south in the Late Glacial and, to a lesser extent, the immediate post-glacial/Neolithic. Exchanges across the Red Sea were mainly due to the Arab slave trade and maritime dominance (from ~2.5 ka to very recent times), but had already begun by the early Holocene, fuelled by the establishment of maritime networks since ~8 ka. The main "back-to-Africa" migrations, again undetected by genome-wide dating analyses, occurred in the Late Glacial period for introductions into eastern Africa, whilst the Neolithic was more significant for migrations towards North Africa.

  18. Comparison of algorithms to generate event times conditional on time-dependent covariates.

    PubMed

    Sylvestre, Marie-Pierre; Abrahamowicz, Michal

    2008-06-30

    The Cox proportional hazards model with time-dependent covariates (TDC) is now a part of the standard statistical analysis toolbox in medical research. As new methods involving more complex modeling of time-dependent variables are developed, simulations could often be used to systematically assess the performance of these models. Yet, generating event times conditional on TDC requires well-designed and efficient algorithms. We compare two classes of such algorithms: permutational algorithms (PAs) and algorithms based on a binomial model. We also propose a modification of the PA to incorporate a rejection sampler. We performed a simulation study to assess the accuracy, stability, and speed of these algorithms in several scenarios. Both classes of algorithms generated data sets that, once analyzed, provided virtually unbiased estimates with comparable variances. In terms of computational efficiency, the PA with the rejection sampler reduced the time necessary to generate data by more than 50 per cent relative to alternative methods. The PAs also allowed more flexibility in the specification of the marginal distributions of event times and required less calibration.
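
    The essence of a permutational algorithm is to generate event times from a marginal distribution and then match them to subjects one at a time, with selection probabilities driven by the assumed hazard given the current covariate values. A minimal Python sketch under an assumed model h(t|x) = h0(t)exp(beta*x(t)) with a binary time-dependent covariate:

        # Minimal sketch of the permutational matching idea; not the published code.
        import numpy as np

        rng = np.random.default_rng(1)
        n, beta = 200, 0.7
        start = rng.uniform(0, 5, n)      # time at which each subject's TDC switches on

        event_times = np.sort(rng.exponential(3.0, n))   # marginal event times
        at_risk = list(range(n))
        assigned = {}
        for t in event_times:
            x = np.array([1.0 if start[i] <= t else 0.0 for i in at_risk])
            w = np.exp(beta * x)                          # partial-likelihood weights
            i = rng.choice(len(at_risk), p=w / w.sum())   # match a subject to this time
            assigned[at_risk.pop(i)] = t                  # subject leaves the risk set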

  19. Immortal time bias in pharmaco-epidemiology.

    PubMed

    Suissa, Samy

    2008-02-15

    Immortal time is a span of cohort follow-up during which, because of exposure definition, the outcome under study could not occur. Bias from immortal time was first identified in the 1970s in epidemiology in the context of cohort studies of the survival benefit of heart transplantation. It recently resurfaced in pharmaco-epidemiology, with several observational studies reporting that various medications can be extremely effective at reducing morbidity and mortality. These studies, while using different cohort designs, all involved some form of immortal time and the corresponding bias. In this paper, the author describes various cohort study designs leading to this bias, quantifies its magnitude under different survival distributions, and illustrates it by using data from a cohort of lung cancer patients. The author shows that for time-based, event-based, and exposure-based cohort definitions, the bias in the rate ratio resulting from misclassified or excluded immortal time increases proportionately to the duration of immortal time. The bias is more pronounced with a decreasing hazard function for the outcome event, as illustrated with the Weibull distribution compared with a constant hazard from the exponential distribution. In conclusion, observational studies of drug benefit in which computerized databases are used must be designed and analyzed properly to avoid immortal time bias.
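
    The mechanics of the bias are easy to reproduce in a toy simulation: give the drug no effect, define exposure by an eventual prescription, and count the pre-prescription (immortal) person-time as exposed. A hedged Python sketch with invented parameters:

        # Toy simulation of immortal time bias: the drug has no effect, yet counting
        # pre-prescription person-time as 'exposed' yields a rate ratio below 1.
        import numpy as np

        rng = np.random.default_rng(2)
        n = 100_000
        death = rng.exponential(10.0, n)     # constant hazard, identical in both groups
        rx_time = rng.uniform(0, 5, n)
        exposed = rx_time < death            # only those surviving long enough get the drug

        # Misclassified analysis: exposed follow-up counted from cohort entry (t = 0).
        py_exp, d_exp = death[exposed].sum(), exposed.sum()
        py_unexp, d_unexp = death[~exposed].sum(), (~exposed).sum()
        rr_biased = (d_exp / py_exp) / (d_unexp / py_unexp)
        print(f"biased rate ratio: {rr_biased:.2f}")   # well below 1 despite no effect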

  20. AMS-02 as a Space Weather Observatory

    NASA Astrophysics Data System (ADS)

    Whitman, K.; Bindi, V.; Chati, M.; Consolandi, C.; Corti, C.

    2013-12-01

    The Alpha Magnetic Spectrometer (AMS-02) is a state-of-the-art space detector that measures particles in the energy range of hundreds of MeV to a few TeV. AMS-02 has been installed on board the International Space Station (ISS) since May 2011, where it will operate for the lifetime of the station. To date, there is an abundance of space-based solar data collected in the low-energy regimes, whereas there are very few direct measurements of higher-energy particles available. AMS-02 is capable of measuring the arrival time and composition of the highest-energy solar energetic particles (SEPs) in space. It is crucial to build a better knowledge base regarding the most energetic and potentially harmful events. We are currently developing a program to employ AMS-02 as a real-time space weather observatory. SEPs with higher energies are usually accelerated during a short period of time and are the first particles to reach the Earth. By measuring these highest-energy SEPs, AMS-02 can signal the onset of an SEP event. During the past two years of operation, we have identified two main quantities in AMS-02 that are particularly sensitive to the arrival of SEPs: the detector livetime and the transition radiation detector (TRD) event size. By monitoring the detector livetime and the TRD event size, AMS-02 can pinpoint in real time the arrival of SEPs inside the Earth's magnetosphere, operating as a space weather detector.

  1. Versatile single-chip event sequencer for atomic physics experiments

    NASA Astrophysics Data System (ADS)

    Eyler, Edward

    2010-03-01

    A very inexpensive dsPIC microcontroller with internal 32-bit counters is used to produce a flexible timing signal generator with up to 16 TTL-compatible digital outputs, with a time resolution and accuracy of 50 ns. This time resolution is easily sufficient for event sequencing in typical experiments involving cold atoms or laser spectroscopy. This single-chip device is capable of triggered operation and can also function as a sweeping delay generator. With one additional chip it can also concurrently produce accurately timed analog ramps, and another one-chip addition allows real-time control from an external computer. Compared to an FPGA-based digital pattern generator, this design is slower but simpler and more flexible, and it can be reprogrammed using ordinary `C' code without special knowledge. I will also describe the use of the same microcontroller with additional hardware to implement a digital lock-in amplifier and PID controller for laser locking, including a simple graphics-based control unit. This work is supported in part by the NSF.

  2. Eventogram: A Visual Representation of Main Events in Biomedical Signals.

    PubMed

    Elgendi, Mohamed

    2016-09-22

    Biomedical signals carry valuable physiological information and many researchers have difficulty interpreting and analyzing long-term, one-dimensional, quasi-periodic biomedical signals. Traditionally, biomedical signals are analyzed and visualized using periodogram, spectrogram, and wavelet methods. However, these methods do not offer an informative visualization of the main events within the processed signal. This paper attempts to provide an event-related framework to overcome the drawbacks of the traditional visualization methods and describe the main events within the biomedical signal in terms of duration and morphology. Electrocardiogram and photoplethysmogram signals are used in the analysis to demonstrate the differences between the traditional visualization methods, and their performance is compared against the proposed method, referred to as the "eventogram" in this paper. The proposed method is based on two event-related moving averages that visualize the main time-domain events in the processed biomedical signals. The traditional visualization methods were unable to find dominant events in the processed signals, while the eventogram was able to visualize dominant events in terms of duration and morphology. Moreover, eventogram-based detection algorithms succeeded in detecting main events in different biomedical signals with a sensitivity and positive predictivity >95%. The output of the eventogram captured unique patterns and signatures of physiological events, which could be used to visualize and identify abnormal waveforms in any quasi-periodic signal.
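
    The two-moving-average idea can be sketched in a few lines: a short (event-scale) moving average is compared against a long (cycle-scale) one, and contiguous samples where the former dominates form blocks of interest. The window lengths and offset below are illustrative placeholders, not the published values.

        # Sketch of the two event-related moving averages behind the eventogram idea.
        import numpy as np

        def moving_average(x, w):
            return np.convolve(x, np.ones(w) / w, mode="same")

        def detect_blocks(signal, w_event=11, w_cycle=61, offset=0.0):
            y = np.abs(signal)                      # simple envelope of the signal
            ma_event = moving_average(y, w_event)   # short, event-scale average
            ma_cycle = moving_average(y, w_cycle)   # long, cycle-scale average
            mask = ma_event > ma_cycle + offset     # candidate blocks of interest
            d = np.diff(np.concatenate(([0], mask.astype(int), [0])))
            starts, ends = np.flatnonzero(d == 1), np.flatnonzero(d == -1)
            return list(zip(starts, ends))          # (start, end) sample indices

        sig = np.sin(np.linspace(0, 20 * np.pi, 2000)) ** 4  # quasi-periodic toy signal
        print(detect_blocks(sig)[:3])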

  3. Self-Exciting Point Process Modeling of Conversation Event Sequences

    NASA Astrophysics Data System (ADS)

    Masuda, Naoki; Takaguchi, Taro; Sato, Nobuo; Yano, Kazuo

    Self-exciting processes of Hawkes type have been used to model various phenomena including earthquakes, neural activities, and views of online videos. Studies of temporal networks have revealed that sequences of social interevent times for individuals are highly bursty. We examine some basic properties of event sequences generated by the Hawkes self-exciting process to show that it generates bursty interevent times for a wide parameter range. Then, we fit the model to data of conversation sequences recorded in company offices in Japan. In this way, we can estimate the relative magnitudes of the self-excitement, its temporal decay, and the base event rate independent of the self-excitation. These variables depend strongly on individuals. We also point out an important limitation of the Hawkes model: the correlation in the interevent times and the burstiness cannot be independently modulated.
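
    For readers unfamiliar with the model, the conditional intensity is lambda(t) = mu + sum over past events t_i of alpha*beta*exp(-beta(t - t_i)), with base rate mu, branching ratio alpha and decay beta. The Python sketch below simulates such a sequence by Ogata's thinning algorithm and computes the burstiness of the interevent times; the parameter values are illustrative.

        # Simulating a Hawkes process by Ogata's thinning; parameters are illustrative.
        import numpy as np

        def simulate_hawkes(mu=0.2, alpha=0.5, beta=1.0, t_end=1000.0, seed=3):
            rng = np.random.default_rng(seed)
            t, events = 0.0, []
            while t < t_end:
                # upper bound: the intensity just after the current time
                lam_bar = mu + sum(alpha * beta * np.exp(-beta * (t - s)) for s in events)
                t += rng.exponential(1.0 / lam_bar)      # candidate next event
                lam = mu + sum(alpha * beta * np.exp(-beta * (t - s)) for s in events)
                if rng.random() < lam / lam_bar and t < t_end:
                    events.append(t)                     # accept: self-excitation kicks in
            return np.array(events)

        iet = np.diff(simulate_hawkes())
        print("burstiness:", (iet.std() - iet.mean()) / (iet.std() + iet.mean()))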

  4. A generalized framework for nucleosynthesis calculations

    NASA Astrophysics Data System (ADS)

    Sprouse, Trevor; Mumpower, Matthew; Aprahamian, Ani

    2014-09-01

    Simulating astrophysical events is a difficult process, requiring a detailed pairing of knowledge from both astrophysics and nuclear physics. Astrophysics guides the thermodynamic evolution of an astrophysical event. We present a nucleosynthesis framework written in Fortran that takes as inputs a thermodynamic evolution and nuclear data and evolves the abundances of nuclear species in time. Through our coding practices, we have emphasized the applicability of our framework to any astrophysical event, including those involving nuclear fission. Because these calculations are often very complicated, our framework dynamically optimizes itself based on the conditions at each time step in order to greatly reduce total computation time. To highlight the power of this new approach, we demonstrate the use of our framework to simulate both Big Bang nucleosynthesis and r-process nucleosynthesis with speeds competitive with current solutions dedicated to either process alone.
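
    The framework itself is written in Fortran; the following Python/SciPy sketch only illustrates the kind of problem being solved, a stiff system of abundance equations dY/dt integrated along a thermodynamic trajectory. The three-species decay chain and its rates are invented for illustration.

        # Toy abundance network: a stiff ODE system integrated with an implicit solver.
        import numpy as np
        from scipy.integrate import solve_ivp

        lam = np.array([1.0e2, 1.0])        # decay rates: species A -> B -> C (s^-1)

        def rhs(t, Y):
            dA = -lam[0] * Y[0]             # destruction of A
            dB = lam[0] * Y[0] - lam[1] * Y[1]
            dC = lam[1] * Y[1]              # creation of C
            return [dA, dB, dC]

        sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0, 0.0], method="BDF", rtol=1e-8)
        print(sol.y[:, -1])                 # abundances after 10 s; mass is conserved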

  5. Regional Seismic Travel-Time Prediction, Uncertainty, and Location Improvement in Western Eurasia

    NASA Astrophysics Data System (ADS)

    Flanagan, M. P.; Myers, S. C.

    2004-12-01

    We investigate our ability to improve regional travel-time prediction and seismic event location using an a priori, three-dimensional velocity model of Western Eurasia and North Africa: WENA1.0 [Pasyanos et al., 2004]. Our objective is to improve the accuracy of seismic location estimates and calculate representative location uncertainty estimates. As we focus on the geographic region of Western Eurasia, the Middle East, and North Africa, we develop, test, and validate 3D model-based travel-time prediction models for 30 stations in the study region. Three principal results are presented. First, the 3D WENA1.0 velocity model improves travel-time prediction over the iasp91 model, as measured by variance reduction, for regional Pg, Pn, and P phases recorded at the 30 stations. Second, a distance-dependent uncertainty model is developed and tested for the WENA1.0 model. Third, an end-to-end validation test based on 500 event relocations demonstrates improved location performance over the one-dimensional iasp91 model. Validation of the 3D model is based on a comparison of approximately 11,000 Pg, Pn, and P travel-time predictions and empirical observations from ground truth (GT) events. Ray coverage for the validation dataset is chosen to provide representative, regional-distance sampling across Eurasia and North Africa. The WENA1.0 model markedly improves travel-time predictions for most stations with an average variance reduction of 25% for all ray paths. We find that improvement is station dependent, with some stations benefiting greatly from WENA1.0 predictions (52% at APA, 33% at BKR, and 32% at NIL), some stations showing moderate improvement (12% at KEV, 14% at BOM, and 12% at TAM), some benefiting only slightly (6% at MOX, and 4% at SVE), and some degraded (-6% at MLR and -18% at QUE). We further test WENA1.0 by comparing location accuracy with results obtained using the iasp91 model. Again, relocation of these events is dependent on ray paths that evenly sample WENA1.0 and therefore provide an unbiased assessment of location performance. A statistically significant sample is achieved by generating 500 location realizations based on 5 events with location accuracy between 1 km and 5 km. Each realization is a randomly selected event with location determined by randomly selecting 5 stations from the available network. In 340 cases (68% of the instances), locations are improved, and average mislocation is reduced from 31 km to 26 km. Preliminary tests of uncertainty estimates suggest that our uncertainty model produces location uncertainty ellipses that are representative of location accuracy. These results highlight the importance of accurate GT datasets in assessing regional travel-time models and demonstrate that an a priori 3D model can markedly improve our ability to locate small-magnitude events in a regional monitoring context. This work was performed under the auspices of the U.S. Department of Energy by the University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48, Contribution UCRL-CONF-206386.

  6. Seismology-based early identification of dam-formation landquake events.

    PubMed

    Chao, Wei-An; Zhao, Li; Chen, Su-Chin; Wu, Yih-Min; Chen, Chi-Hsuan; Huang, Hsin-Hua

    2016-01-12

    Flooding resulting from the bursting of dams formed by landquake events such as rock avalanches, landslides and debris flows can lead to serious bank erosion and inundation of populated areas near rivers. Seismic waves can be generated by landquake events, which can be described as time-dependent forces (unloading/reloading cycles) acting on the Earth. In this study, we conduct inversions of long-period (LP, period ≥20 s) waveforms for the landquake force histories (LFHs) of ten events, which provide quantitative characterization of the initiation, propagation and termination stages of the slope failures. When the results obtained from LP waveforms are analyzed together with high-frequency (HF, 1-3 Hz) seismic signals, we find a relatively strong late-arriving seismic phase (dubbed the Dam-forming phase, or D-phase) recorded clearly in the HF waveforms at the closest stations, which potentially marks the time when the collapsed masses slide into the river and perhaps even impact the topographic barrier on the opposite bank. Consequently, our approach to analyzing the LP and HF waveforms developed in this study has a high potential for identifying the five dam-forming landquake events (DFLEs) in near real-time using broadband seismic records, which can provide timely warnings of impending floods to downstream residents.

  7. Machine Learning-based Transient Brokers for Real-time Classification of the LSST Alert Stream

    NASA Astrophysics Data System (ADS)

    Narayan, Gautham; Zaidi, Tayeb; Soraisam, Monika; ANTARES Collaboration

    2018-01-01

    The number of transient events discovered by wide-field time-domain surveys already far outstrips the combined followup resources of the astronomical community. This number will only increase as we progress towards the commissioning of the Large Synoptic Survey Telescope (LSST), breaking the community's current followup paradigm. Transient brokers - software to sift through, characterize, annotate and prioritize events for followup - will be a critical tool for managing alert streams in the LSST era. Developing the algorithms that underlie the brokers, and obtaining simulated LSST-like datasets prior to LSST commissioning to train and test these algorithms, are formidable, though not insurmountable, challenges. The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is a joint project of the National Optical Astronomy Observatory and the Department of Computer Science at the University of Arizona. We have been developing completely automated methods to characterize and classify variable and transient events from their multiband optical photometry. We describe the hierarchical ensemble machine learning algorithm we are developing, and test its performance on sparse, unevenly sampled, heteroskedastic data from various existing observational campaigns, as well as our progress towards incorporating these into a real-time event broker working on live alert streams from time-domain surveys.

  8. Terminal weather information management

    NASA Technical Reports Server (NTRS)

    Lee, Alfred T.

    1990-01-01

    Since the mid-1960s, microburst/windshear events have caused at least 30 aircraft accidents and incidents and have killed more than 600 people in the United States alone. This study evaluated alternative means of alerting an airline crew to the presence of microburst/windshear events in the terminal area. Of particular interest was the relative effectiveness of conventional and data-link ground-to-air transmissions of ground-based radar and low-level windshear sensing information on microburst/windshear avoidance. The Advanced Concepts Flight Simulator located at Ames Research Center was employed in a line-oriented simulation of a scheduled round-trip airline flight from Salt Lake City to Denver Stapleton Airport. Actual weather en route and in the terminal area was simulated using recorded data. The microburst/windshear incident of July 11, 1988 was re-created for the Denver area operations. Six experienced airline crews currently flying scheduled routes were employed as test subjects for each of three groups: (1) a baseline group which received alerts via conventional air traffic control (ATC) tower transmissions; (2) an experimental group which received alerts/events displayed visually and aurally in the cockpit six miles (approx. 2 min) from the microburst event; and (3) an additional experimental group which received displayed alerts/events 23 linear miles (approx. 7 min) from the microburst event. Analyses of crew communications and decision times showed a marked improvement in both situation awareness and decision-making with visually displayed ground-based radar information. Substantial reductions in the variability of decision times among crews in the visual display groups were also found. These findings suggest that crew performance will be enhanced and individual differences among crews due to differences in training and prior experience are significantly reduced by providing real-time, graphic display of terminal weather hazards.

  9. Flood and Landslide Applications of High Time Resolution Satellite Rain Products

    NASA Technical Reports Server (NTRS)

    Adler, Robert F.; Hong, Yang; Huffman, George J.

    2006-01-01

    Experimental, potentially real-time systems to detect floods and landslides related to heavy rain events are described. A key basis for these applications is high-time-resolution satellite rainfall analyses. Rainfall is the primary cause of devastating floods across the world. However, in many countries, satellite-based precipitation estimation may be the best source of rainfall data due to insufficient ground networks and the absence of data sharing along many trans-boundary river basins. Remotely sensed precipitation from NASA's TRMM Multi-satellite Precipitation Analysis (TMPA) operational system (near real-time precipitation at a spatial-temporal resolution of 3 hours and 0.25° x 0.25°) is used to monitor extreme precipitation events. These data are then ingested into a macro-scale hydrological model which is parameterized using spatially distributed elevation, soil and land cover datasets available globally from satellite remote sensing. Preliminary flood results appear reasonable in terms of location and frequency of events, with implementation on a quasi-global basis underway. With the availability of satellite rainfall analyses at fine time resolution, it has also become possible to assess landslide risk on a near-global basis. Early results show that landslide occurrence is closely associated with the spatial patterns and temporal distribution of TRMM rainfall characteristics. In particular, the number of landslides triggered by rainfall is related to rainfall climatology, antecedent rainfall accumulation, and the intensity-duration of rainstorms. For the purpose of prediction, an empirical TMPA-based rainfall intensity-duration threshold is developed and shown to have skill in determining potential areas of landslides. These experimental findings, in combination with landslide surface susceptibility information based on satellite-based land surface information, form a starting point towards a potential operational landslide monitoring/warning system around the globe.
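
    An intensity-duration threshold of this kind is a single power law. The sketch below uses the often-cited global coefficients of Caine (1980), I = 14.82*D^(-0.39) with I in mm/h and D in hours, purely as a placeholder; the TMPA-based threshold in the study would have its own fitted coefficients.

        # Sketch of an empirical rainfall intensity-duration threshold check.
        def exceeds_threshold(intensity_mm_h: float, duration_h: float,
                              a: float = 14.82, b: float = 0.39) -> bool:
            """True if a storm of this mean intensity and duration may trigger slides."""
            return intensity_mm_h > a * duration_h ** (-b)

        print(exceeds_threshold(10.0, 6.0))   # 10 mm/h over 6 h -> above the global curve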

  10. Sex differences in event-related risk for major depression.

    PubMed

    Maciejewski, P K; Prigerson, H G; Mazure, C M

    2001-05-01

    This study sought to determine if women are more likely than men to experience an episode of major depression in response to stressful life events. Sex differences in event-related risk for depression were examined by means of secondary analyses employing data from the Americans' Changing Lives study. The occurrence and time of occurrence of depression onset and instances of stressful life events within a 12-month period preceding a structured interview were documented in a community-based sample of 1024 men and 1800 women. Survival analytical techniques were used to examine sex differences in risk for depression associated with generic and specific stressful life events. Women were approximately three times more likely than men to experience major depression in response to any stressful life event. Women and men did not differ in risk for depression associated with the death of a spouse or child, events affecting their relationship to a spouse/partner (divorce and marital/love problems) or events corresponding to acute financial or legal difficulties. Women were at elevated risk for depression associated with more distant interpersonal losses (death of a close friend or relative) and other types of events (change of residence, physical attack, or life-threatening illness/injury). Stressful life events overall, with some exceptions among specific event types, pose a greater risk for depression among women compared to men.

  11. Catalogue of > 55 MeV Wide-longitude Solar Proton Events Observed by SOHO, ACE, and the STEREOs at ≈ 1 AU During 2009 - 2016

    NASA Astrophysics Data System (ADS)

    Paassilta, Miikka; Papaioannou, Athanasios; Dresing, Nina; Vainio, Rami; Valtonen, Eino; Heber, Bernd

    2018-04-01

    Based on energetic particle observations made at ≈ 1 AU, we present a catalogue of 46 wide-longitude (> 45°) solar energetic particle (SEP) events detected at multiple locations during 2009 - 2016. The particle kinetic energies of interest were chosen as > 55 MeV for protons and 0.18 - 0.31 MeV for electrons. We make use of proton data from the Solar and Heliospheric Observatory/Energetic and Relativistic Nuclei and Electron Experiment (SOHO/ERNE) and the Solar Terrestrial Relations Observatory/High Energy Telescopes (STEREO/HET), together with electron data from the Advanced Composition Explorer/Electron, Proton, and Alpha Monitor (ACE/EPAM) and the STEREO/Solar Electron and Proton Telescopes (SEPT). We consider soft X-ray data from the Geostationary Operational Environmental Satellites (GOES) and coronal mass ejection (CME) observations made with the SOHO/Large Angle and Spectrometric Coronagraph (LASCO) and STEREO/Coronagraphs 1 and 2 (COR1, COR2) to establish the probable associations between SEP events and the related solar phenomena. Event onset times and peak intensities are determined; velocity dispersion analysis (VDA) and time-shifting analysis (TSA) are performed for protons; TSA is performed for electrons. In our event sample, there is a tendency for the highest peak intensities to occur when the observer is magnetically connected to solar regions west of the flare. Our estimates for the mean event width, derived as the standard deviation of a Gaussian curve modelling the SEP intensities (protons ≈ 44°, electrons ≈ 50°), largely agree with previous results for lower-energy SEPs. SEP release times with respect to event flares, as well as the event rise times, show no simple dependence on the observer's connection angle, suggesting that the source region extent and dominant particle acceleration and transport mechanisms are important in defining these characteristics of an event. There is no marked difference between the speed distributions of the CMEs related to wide events and the CMEs related to all near-Earth SEP events of similar energy range from the same time period.
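
    Velocity dispersion analysis itself is a one-line regression: the onset time in each energy channel is modeled as t_onset = t_release + L/v, so a linear fit of onset time against inverse speed yields the path length L (slope) and the release time (intercept). A Python sketch with fabricated onsets:

        # Sketch of velocity dispersion analysis (VDA); energies/onsets are fabricated.
        import numpy as np

        M_P = 938.272    # proton rest energy, MeV
        C_AU_MIN = 0.12  # speed of light in AU per minute (approx.)

        def beta(kinetic_mev):
            gamma = 1.0 + kinetic_mev / M_P
            return np.sqrt(1.0 - 1.0 / gamma**2)

        E = np.array([60.0, 80.0, 100.0, 150.0])      # channel energies (MeV)
        t_onset = np.array([40.0, 36.0, 33.5, 30.0])  # onset times (min after some t0)

        inv_v = 1.0 / (beta(E) * C_AU_MIN)            # minutes per AU for each channel
        L, t_release = np.polyfit(inv_v, t_onset, 1)  # t_onset = t_release + L * (1/v)
        print(f"path length ~ {L:.2f} AU, release ~ {t_release:.1f} min")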

  12. Mining the key predictors for event outbreaks in social networks

    NASA Astrophysics Data System (ADS)

    Yi, Chengqi; Bao, Yuanyuan; Xue, Yibo

    2016-04-01

    It would be beneficial to devise a method to predict so-called event outbreaks. Existing works mainly focus on exploring effective methods for improving the accuracy of predictions, while ignoring the underlying causes: What makes an event go viral? What factors significantly influence the prediction of an event outbreak in social networks? In this paper, we proposed a novel definition for an event outbreak, taking into account the structural changes to a network during the propagation of content. In addition, we investigated features that were sensitive to predicting an event outbreak. In order to investigate the universality of these features at different stages of an event, we split the entire lifecycle of an event into 20 equal segments according to the proportion of the propagation time. We extracted 44 features, including features related to content, users, structure, and time, from each segment of the event. Based on these features, we proposed a prediction method using supervised classification algorithms to predict event outbreaks. Experimental results indicate that, as time goes by, our method is highly accurate, with a precision rate ranging from 79% to 97% and a recall rate ranging from 74% to 97%. In addition, after applying a feature-selection algorithm, the top five selected features can considerably improve the accuracy of the prediction. Data-driven experimental results show that the entropy of the eigenvector centrality, the entropy of the PageRank, the standard deviation of the betweenness centrality, the proportion of re-shares without content, and the average path length are the key predictors for an event outbreak. Our findings are especially useful for further exploring the intrinsic characteristics of outbreak prediction.
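
    Two of the named predictors are simple to compute on a propagation graph. The sketch below builds a toy network (a stand-in for a real re-share network) and evaluates the Shannon entropy of the eigenvector centrality and of the PageRank with networkx; low entropy indicates centrality concentrated on a few spreaders.

        # Entropy of eigenvector centrality and PageRank on a toy graph.
        import networkx as nx
        import numpy as np

        def entropy(values):
            p = np.array(list(values), dtype=float)
            p = p / p.sum()                          # normalize scores to a distribution
            p = p[p > 0]
            return float(-(p * np.log2(p)).sum())

        G = nx.barabasi_albert_graph(200, 2, seed=4)   # stand-in for a re-share network
        H_eig = entropy(nx.eigenvector_centrality_numpy(G).values())
        H_pr = entropy(nx.pagerank(G).values())
        print(H_eig, H_pr)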

  13. The near real time Forensic Disaster Analysis of the central European flood in June 2013 - A graphical representation of the main results

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Elmer, Florian; Trieselmann, Werner; Kreibich, Heidi; Kunz, Michael; Khazai, Bijan; Dransch, Doris; Wenzel, Friedemann; Zschau, Jochen; Merz, Bruno; Mühr, Bernhard; Kunz-Plapp, Tina; Möhrle, Stella; Bessel, Tina; Fohringer, Joachim

    2014-05-01

    The Central European flood of June 2013 is one of the most severe flood events that have occurred in Central Europe in the past decades. All major German river basins were affected (Rhine, Danube, and Elbe as well as the smaller Weser catchment). In terms of spatial extent and event magnitude, it was the most severe event at least since 1950. Within the current research focus on near real time forensic disaster analysis, the Center for Disaster Management and Risk Reduction Technology (CEDIM) assessed and analysed the multiple facets of the flood event from the beginning. The aim is to describe the on-going event, analyse the event sources, link the physical characteristics to the impact and consequences of the event and to understand the root causes that turn the physical event into a disaster (or prevent it from becoming disastrous). For the near real time component of this research, tools for rapid assessment and concise presentation of analysis results are essential. This contribution provides a graphical summary of the results of the CEDIM-FDA analyses on the June 2013 flood. It demonstrates the potential of visual representations for improving the communication and hence usability of findings in a rapid, intelligible and expressive way as a valuable supplement to usual event reporting. It is based on analyses of the hydrometeorological sources, the flood pathways (from satellite imagery, data extraction from social media), the resilience of the affected regions, and causal loss analysis. The prototypical representation of the FDA-results for the June 2013 flood provides an important step in the development of graphical event templates for the visualisation of forensic disaster analyses. These are intended to become a standard component of future CEDIM-FDA event activities.

  14. Modeling hard clinical end-point data in economic analyses.

    PubMed

    Kansal, Anuraag R; Zheng, Ying; Palencia, Roberto; Ruffolo, Antonio; Hass, Bastian; Sorensen, Sonja V

    2013-11-01

    The availability of hard clinical end-point data, such as that on cardiovascular (CV) events among patients with type 2 diabetes mellitus, is increasing, and as a result there is growing interest in using hard end-point data of this type in economic analyses. This study investigated published approaches for modeling hard end-points from clinical trials and evaluated their applicability in health economic models with different disease features. A review of cost-effectiveness models of interventions in clinically significant therapeutic areas (CV diseases, cancer, and chronic lower respiratory diseases) was conducted in PubMed and Embase using a defined search strategy. Only studies integrating hard end-point data from randomized clinical trials were considered. For each study included, clinical input characteristics and modeling approach were summarized and evaluated. A total of 33 articles (23 CV, eight cancer, two respiratory) were accepted for detailed analysis. Decision trees, Markov models, discrete event simulations, and hybrids were used. Event rates were incorporated either as constant rates, time-dependent risks, or risk equations based on patient characteristics. Risks dependent on time and/or patient characteristics were used where major event rates were >1%/year in models with fewer health states (<7). Models of infrequent events or with numerous health states generally preferred constant event rates. The detailed modeling information and terminology varied, sometimes requiring interpretation. Key considerations for cost-effectiveness models incorporating hard end-point data include the frequency and characteristics of the relevant clinical events and how the trial data is reported. When event risk is low, simplification of both the model structure and event rate modeling is recommended. When event risk is common, such as in high risk populations, more detailed modeling approaches, including individual simulations or explicitly time-dependent event rates, are more appropriate to accurately reflect the trial data.

  15. Ptaquiloside from bracken in stream water at base flow and during storm events.

    PubMed

    Clauson-Kaas, Frederik; Ramwell, Carmel; Hansen, Hans Chr B; Strobel, Bjarne W

    2016-12-01

    The bracken fern (Pteridium spp.) densely populates both open and woodland vegetation types around the globe. Bracken is toxic to livestock when consumed, and a group of potent illudane-type carcinogens have been identified, of which the compound ptaquiloside (PTA) is the most abundant. The highly water-soluble PTA has been shown to be leachable from bracken fronds, and present in the soil and water below bracken stands. This has raised concerns over whether the compound might pose a risk to drinking water sources. We investigated PTA concentrations in a small stream draining a bracken-infested catchment at base flow and in response to storm events during a growth season, and included sampling of the bracken canopy throughfall. Streams in other bracken-dominated areas were also sampled at base flow for comparison, and a controlled pulse experiment was conducted in the field to study the in-stream dynamics of PTA. Ptaquiloside concentrations in the stream never exceeded 61 ng L⁻¹ in the base flow samples, but peaked at 2.2 μg L⁻¹ during the studied storm events. The mass of PTA in the stream, per storm event, was 7.5-93 mg from this catchment. A clear temporal connection was observed between rainfall and PTA concentration in the stream, with a reproducible time lag of approx. 1 h from onset of rain to elevated concentrations, and returning rather quickly (about 2 h) to base flow concentration levels. The concentration of PTA behaved similarly to an inert tracer (Cl⁻) in the pulse experiment over a relatively short time scale (minutes to hours), reflecting no PTA sorption, and dispersion and dilution considerably lowered the observed PTA concentrations downstream. Bracken throughfall revealed a potent and lasting source of PTA during rainfall, with concentrations up to 169 μg L⁻¹, which did not decrease over the course of the event. In the stream, the throughfall contribution to PTA cannot be separated from a possible below-ground input from litter, rhizomes and soil. Catchment-specific factors such as the soil pH, topography, hydrology, and bracken coverage will evidently affect the level of PTA observed in the receiving stream, as well as the distance from bracken, but time since precipitation seems most important. When studying PTA loads and transport in surface streams fed by bracken-infested catchments, simply taking occasional grab samples will not capture the precipitation-linked pulses. The place and time of sampling govern the findings, and including event-based sampling is essential to provide a more complete picture of PTA loads to surface water. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. A Mobile Robots Experimental Environment with Event-Based Wireless Communication

    PubMed Central

    Guinaldo, María; Fábregas, Ernesto; Farias, Gonzalo; Dormido-Canto, Sebastián; Chaos, Dictino; Sánchez, José; Dormido, Sebastián

    2013-01-01

    An experimental platform to communicate between a set of mobile robots through a wireless network has been developed. The mobile robots obtain their positions through a camera, which acts as the sensor. The video images are processed in a PC, and a Waspmote card sends the corresponding position to each robot using the ZigBee standard. A distributed control algorithm based on event-triggered communications has been designed and implemented to bring the robots into the desired formation. Each robot communicates with its neighbors only at event times. Furthermore, a simulation tool has been developed to design and perform experiments with the system. An example of usage is presented. PMID:23881139
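
    The event-triggered rule can be illustrated with a toy one-dimensional formation: each robot re-broadcasts its state only when it has drifted more than a threshold delta from the last value it sent, and all robots steer using the broadcast states. The consensus-style update, gains and thresholds below are illustrative, not the platform's actual controller.

        # Toy event-triggered formation control: broadcast only on events.
        import numpy as np

        rng = np.random.default_rng(5)
        n, delta, gain, steps = 4, 0.05, 0.2, 200
        x = rng.uniform(0, 10, n)                 # robot positions (1-D for brevity)
        x_sent = x.copy()                         # last value each robot broadcast
        offsets = np.array([0.0, 1.0, 2.0, 3.0])  # desired formation spacing
        messages = 0

        for _ in range(steps):
            trigger = np.abs(x - x_sent) > delta  # event condition per robot
            x_sent[trigger] = x[trigger]          # broadcast only on events
            messages += int(trigger.sum())
            # each robot steers using the *broadcast* states of the others
            target = (x_sent - offsets).mean() + offsets
            x += gain * (target - x)
        print(messages, "messages instead of", steps * n)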

  17. Refining locations of the 2005 Mukacheve, West Ukraine, earthquakes based on similarity of their waveforms

    NASA Astrophysics Data System (ADS)

    Gnyp, Andriy

    2009-06-01

    Based on the results of applying correlation analysis to records of the 2005 Mukacheve group of recurrent events and their subsequent relocation relative to the reference event of 7 July 2005, a conclusion has been drawn that all the events most likely occurred on the same rupture plane. Station terms have been estimated for seismic stations of the Transcarpathians, accounting for the variation of seismic velocities beneath their locations as compared to the travel-time tables used in the study. From a methodological standpoint, the potential and usefulness of correlation analysis of seismic records for a more detailed study of seismic processes, tectonics and geodynamics of the Carpathian region have been demonstrated.

  18. Cross-modal decoupling in temporal attention.

    PubMed

    Mühlberg, Stefanie; Oriolo, Giovanni; Soto-Faraco, Salvador

    2014-06-01

    Prior studies have repeatedly reported behavioural benefits to events occurring at attended, compared to unattended, points in time. It has been suggested that, as for spatial orienting, temporal orienting of attention spreads across sensory modalities in a synergistic fashion. However, the consequences of cross-modal temporal orienting of attention remain poorly understood. One challenge is that the passage of time leads to an increase in event predictability throughout a trial, thus making it difficult to interpret possible effects (or lack thereof). Here we used a design that avoids complete temporal predictability to investigate whether attending to a sensory modality (vision or touch) at a point in time confers beneficial access to events in the other, non-attended, sensory modality (touch or vision, respectively). In contrast to previous studies and to what happens with spatial attention, we found that events in one (unattended) modality do not automatically benefit from happening at the time point when another modality is expected. Instead, it seems that attention can be deployed in time with relative independence for different sensory modalities. Based on these findings, we argue that temporal orienting of attention can be cross-modally decoupled in order to flexibly react according to the environmental demands, and that the efficiency of this selective decoupling unfolds in time. © 2014 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  19. APNEA list mode data acquisition and real-time event processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogle, R.A.; Miller, P.; Bramblett, R.L.

    1997-11-01

    The LMSC Active Passive Neutron Examinations and Assay (APNEA) Data Logger is a VME-based data acquisition system using commercial-off-the-shelf hardware with application-specific software. It receives TTL inputs from eighty-eight ³He detector tubes and eight timing signals. Two data sets are generated concurrently for each acquisition session: (1) List Mode recording of all detector and timing signals, timestamped to 3 microsecond resolution; (2) Event Accumulations generated in real-time by counting events into short (tens of microseconds) and long (seconds) time bins following repetitive triggers. List Mode data sets can be post-processed to: (1) determine the optimum time bins for TRU assay of waste drums, (2) analyze a given data set in several ways to match different assay requirements and conditions, and (3) confirm assay results by examining details of the raw data. Data Logger events are processed and timestamped by an array of 15 TMS320C40 DSPs and delivered to an embedded controller (PowerPC604) for interim disk storage. Three acquisition modes, corresponding to different trigger sources, are provided. A standard network interface to a remote host system (Windows NT or SunOS) provides for system control, status, and transfer of previously acquired data. 6 figs.
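
    Post-processing list-mode data into trigger-relative time bins is a simple histogramming exercise. The Python sketch below uses synthetic timestamps in microseconds; the trigger spacing, window length and bin width are invented placeholders.

        # Sketch: count list-mode events into short time bins after each trigger.
        import numpy as np

        rng = np.random.default_rng(6)
        triggers = np.arange(0, 1_000_000, 10_000)            # trigger every 10 ms
        events = np.sort(rng.uniform(0, 1_000_000, 50_000))   # event timestamps (us)

        bin_edges = np.arange(0, 201, 20)   # ten 20-us "short" bins after each trigger
        hist = np.zeros(len(bin_edges) - 1)
        for t0 in triggers:
            dt = events[(events >= t0) & (events < t0 + 200)] - t0
            hist += np.histogram(dt, bins=bin_edges)[0]
        print(hist)   # accumulated counts; optimum bins can then be chosen offline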

  20. Automatic Detection and Classification of Unsafe Events During Power Wheelchair Use.

    PubMed

    Pineau, Joelle; Moghaddam, Athena K; Yuen, Hiu Kim; Archambault, Philippe S; Routhier, François; Michaud, François; Boissy, Patrick

    2014-01-01

    Using a powered wheelchair (PW) is a complex task requiring advanced perceptual and motor control skills. Unfortunately, PW incidents and accidents are not uncommon and their consequences can be serious. The objective of this paper is to develop technological tools that can be used to characterize a wheelchair user's driving behavior under various settings. In the experiments conducted, PWs are outfitted with a datalogging platform that records, in real time, the 3-D acceleration of the PW. Data collection was conducted over 35 different activities, designed to capture a spectrum of PW driving events performed at different speeds (collisions with fixed or moving objects, rolling on an inclined plane, and rolling across multiple types of obstacles). The data were processed using time-series analysis and data mining techniques to automatically detect and identify the different events. We compared the classification accuracy using four different types of time-series features: 1) time-delay embeddings; 2) time-domain characterization; 3) frequency-domain features; and 4) wavelet transforms. In the analysis, we compared the classification accuracy obtained when distinguishing between safe and unsafe events during each of the 35 different activities. For the purposes of this study, unsafe events were defined as activities containing collisions against objects at different speeds, and the remainder were defined as safe events. We were able to accurately detect 98% of unsafe events, with a low (12%) false positive rate, using only five examples of each activity. This proof-of-concept study shows that the proposed approach has the potential of capturing, based on limited input from embedded sensors, contextual information on PW use, and of automatically characterizing a user's PW driving behavior.

  1. Monte Carlo event generators in atomic collisions: A new tool to tackle the few-body dynamics

    NASA Astrophysics Data System (ADS)

    Ciappina, M. F.; Kirchner, T.; Schulz, M.

    2010-04-01

    We present a set of routines to produce theoretical event files, for both single and double ionization of atoms by ion impact, based on a Monte Carlo event generator (MCEG) scheme. Such event files are the theoretical counterpart of the data obtained from a kinematically complete experiment; i.e. they contain the momentum components of all collision fragments for a large number of ionization events. Among the advantages of working with theoretical event files is the possibility to incorporate the conditions present in a real experiment, such as the uncertainties in the measured quantities. Additionally, by manipulating them it is possible to generate any type of cross sections, especially those that are usually too complicated to compute with conventional methods due to a lack of symmetry. Consequently, the numerical effort of such calculations is dramatically reduced. We show examples for both single and double ionization, with special emphasis on a new data analysis tool, called four-body Dalitz plots, developed very recently. Program summary: Program title: MCEG Catalogue identifier: AEFV_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFV_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 2695 No. of bytes in distributed program, including test data, etc.: 18 501 Distribution format: tar.gz Programming language: FORTRAN 77 with parallelization directives using scripting Computer: Single machines using Linux and Linux servers/clusters (with cores with any clock speed, cache memory and bits in a word) Operating system: Linux (any version and flavor) and FORTRAN 77 compilers Has the code been vectorised or parallelized?: Yes RAM: 64-128 kBytes (the codes are very CPU intensive) Classification: 2.6 Nature of problem: The code deals with single and double ionization of atoms by ion impact. Conventional theoretical approaches aim at a direct calculation of the corresponding cross sections. This has the important shortcoming that it is difficult to account for the experimental conditions when comparing results to measured data. In contrast, the present code generates theoretical event files of the same type as are obtained in a real experiment. From these event files any type of cross sections can be easily extracted. The theoretical schemes are based on distorted wave formalisms for both processes of interest. Solution method: The codes employ a Monte Carlo event generator based on theoretical formalisms to generate event files for both single and double ionization. One of the main advantages of having access to theoretical event files is the possibility of adding the conditions present in real experiments (parameter uncertainties, environmental conditions, etc.) and of incorporating additional physics in the resulting event files (e.g. elastic scattering or other interactions absent in the underlying calculations). Additional comments: The computational time can be dramatically reduced if a large number of processors is used. Since the codes have no communication between processes, it is possible to achieve an efficiency of 100% (though this will be penalized by the queuing waiting time). Running time: Times vary according to the process to be simulated (single or double ionization), the number of processors, and the type of theoretical model. The typical running time ranges from several hours to a few weeks.

  2. A coupled weather generator - rainfall-runoff approach on hourly time steps for flood risk analysis

    NASA Astrophysics Data System (ADS)

    Winter, Benjamin; Schneeberger, Klaus; Dung Nguyen, Viet; Vorogushyn, Sergiy; Huttenlau, Matthias; Merz, Bruno; Stötter, Johann

    2017-04-01

    The evaluation of potential monetary damage of flooding is an essential part of flood risk management. One possibility to estimate the monetary risk is to analyze long time series of observed flood events and their corresponding damages. In reality, however, only a few flood events are documented. This limitation can be overcome by the generation of a set of synthetic, physically and spatially plausible flood events and subsequently the estimation of the resulting monetary damages. In the present work, a set of synthetic flood events is generated by a continuous rainfall-runoff simulation in combination with a coupled weather generator and temporal disaggregation procedure for the study area of Vorarlberg (Austria). Most flood risk studies focus on daily time steps; however, the mesoscale alpine study area is characterized by short concentration times, leading to large differences between daily mean and daily maximum discharge. Accordingly, an hourly time step is needed for the simulations. The hourly meteorological input for the rainfall-runoff model is generated in a two-step approach. A synthetic daily dataset is generated by a multivariate and multisite weather generator and subsequently disaggregated to hourly time steps with a k-Nearest-Neighbor model. Following the event generation procedure, the negative consequences of flooding are analyzed. The corresponding flood damage for each synthetic event is estimated by combining the synthetic discharge at representative points of the river network with a loss probability relation for each community in the study area. The loss probability relation is based on exposure and susceptibility analyses on a single-object basis (residential buildings) for certain return periods. For these impact analyses, official inundation maps of the study area are used. Finally, by analyzing the total event time series of damages, the expected annual damage or losses associated with a certain probability of occurrence can be estimated for the entire study area.
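
    The disaggregation step can be sketched as follows (with k = 1 for brevity, whereas the study uses a k-Nearest-Neighbor model): find the observed day whose daily total best matches the synthetic day, then rescale its hourly fractions. All data here are synthetic placeholders.

        # Sketch of nearest-neighbour daily-to-hourly rainfall disaggregation.
        import numpy as np

        rng = np.random.default_rng(7)
        obs_hourly = rng.gamma(0.3, 2.0, size=(365, 24))   # observed hourly record
        obs_daily = obs_hourly.sum(axis=1)

        def disaggregate(daily_value):
            nearest = np.argmin(np.abs(obs_daily - daily_value))   # most similar day
            pattern = obs_hourly[nearest] / obs_daily[nearest]     # its hourly fractions
            return daily_value * pattern                           # rescale to the target

        synthetic_daily = 12.3      # one day from the weather generator
        print(disaggregate(synthetic_daily).round(2))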

  3. 77 FR 20615 - DAU Industry Day: “Affordability, Efficiency, and the Industrial Base”

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-05

    ... the Industrial Base'' AGENCY: Defense Acquisition University (DAU), DoD. ACTION: Event notice. SUMMARY... to discuss affordability, efficiency, and the industrial base. After a variety of presenters, the... this time of fiscal austerity, while maintaining a healthy industrial base. Following the plenary...

  4. Monitoring glacier surface seismicity in time and space using Rayleigh waves

    USGS Publications Warehouse

    Mikesell, T. D.; Van Wijk, K.; Haney, Matthew M.; Bradford, J.H.; Marshall, Hans P.; Harper, J. T.

    2012-01-01

    Sliding glaciers and brittle ice failure generate seismic body and surface wave energy characteristic of the source mechanism. Here we analyze continuous seismic recordings from an array of nine short-period passive seismometers located on Bench Glacier, Alaska (USA) (61.033°N, 145.687°W). We focus on the arrival-time and amplitude information of the dominant Rayleigh wave phase. Over a 46-hour period we detect thousands of events using a cross-correlation-based event identification method. Travel-time inversion of a subset of events (7% of the total) defines an active crevasse, propagating more than 200 meters in three hours. From the Rayleigh wave amplitudes, we estimate the amount of volumetric opening along the crevasse as well as an average bulk attenuation (Q = 42) for the ice in this part of the glacier. With the remaining icequake signals we establish a diurnal periodicity in seismicity, indicating that surface run-off and subglacial water pressure changes likely control the triggering of these surface events. Furthermore, we find that these events are too weak (i.e., too noisy) to locate individually. However, stacking individual events increases the signal-to-noise ratio of the waveforms, implying that these periodic sources are effectively stationary during the recording period.
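
    Cross-correlation event identification amounts to template matching: slide a known icequake waveform along the continuous record and flag windows whose normalized correlation exceeds a threshold. The Python sketch below buries a synthetic template in noise and recovers it; the sampling, amplitudes and threshold are illustrative.

        # Sketch of cross-correlation ("template matching") event detection.
        import numpy as np

        rng = np.random.default_rng(8)
        template = np.sin(np.linspace(0, 6 * np.pi, 100)) * np.hanning(100)
        record = rng.normal(0, 0.3, 20_000)
        for t0 in (3_000, 9_500, 15_200):             # bury three events in the noise
            record[t0:t0 + 100] += template

        m = len(template)
        tn = (template - template.mean()) / template.std()
        cc = np.empty(len(record) - m)
        for i in range(len(cc)):                      # normalized cross-correlation
            w = record[i:i + m]
            cc[i] = (tn * (w - w.mean())).sum() / (m * w.std())
        detections = np.flatnonzero(cc > 0.7)
        print(detections[:5], "...")                  # clusters near 3000, 9500, 15200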

  5. Time-lapse seismic tomography using the data of microseismic monitoring network and analysis of mine-induced events, seismic tomography results and technological data in Pyhäsalmi mine, Finland

    NASA Astrophysics Data System (ADS)

    Nevalainen, Jouni; Kozlovskaya, Elena

    2016-04-01

    We present results of seismic travel-time tomography applied to microseismic data from the Pyhäsalmi mine, Finland. Data on microseismic events in the mine have been recorded since 2002, when the passive microseismic monitoring network was installed in the mine. Since then, over 130,000 microseismic events have been observed. The first target of our study was to test whether passive microseismic monitoring data can be used for travel-time tomography. In this data set the source-receiver geometry is based on the uneven distribution of natural and mine-induced events inside and in the vicinity of the mine and is hence a non-ideal one for travel-time tomography. The tomographic inversion procedure was tested with synthetic data and the real source-receiver geometry from the Pyhäsalmi mine, and with the real travel-time data of the first arrivals of P-waves from the microseismic events. The results showed that seismic tomography is capable of revealing differences in seismic velocities in the mine area corresponding to different rock types. For example, the velocity contrast between the ore body and the surrounding rock is detectable. The velocity model recovered agrees well with the known geological structures in the mine area. The second target of the study was to apply travel-time tomography to microseismic monitoring data recorded during different time periods in order to track temporal changes in seismic velocities within the mining area as the excavation proceeds. The results show that such time-lapse travel-time tomography can recover these changes. In order to obtain good ray coverage and good resolution, the time interval for a single tomography round needs to be selected taking into account the number of events and their spatial distribution. The third target was to compare and analyze mine-induced event locations, seismic tomography results and mining technological data (for example, mine excavation plans) in order to understand the influence of mining technology on mining-induced seismicity. Acknowledgements: This study has been supported by the ERDF SEISLAB project and Pyhäsalmi Mine Ltd.

  6. A Step towards a Sharable Community Knowledge Base for WRF Settings - Developing a WRF Setting Methodology Based on a Case Study of a Torrential Rainfall Event

    NASA Astrophysics Data System (ADS)

    CHU, Q.; Xu, Z.; Zhuo, L.; Han, D.

    2016-12-01

    Increasing requirements for interaction between different disciplines, together with ready access to a numerical weather forecasting system featuring portability and extensibility, have contributed to the growth of downstream WRF model users over recent years. For these users, a knowledge base classified by representative events would be very helpful, because the determination of model settings is one of the most important steps in WRF. However, such a process is generally time-consuming, even on a high-performance computational platform. As such, we propose a sharable lookup table of WRF domain settings and corresponding procedures based on a representative torrential rainfall event in Beijing, China. It has been found that the drift of WRF simulations away from the input lateral boundary conditions can be significantly reduced by adjusting the domain settings. Among all the impact factors, the placement of the nested domain affects not only the moving speed and angle of the storm center, but also the location and amount of the heavy-rain belt, which can only be detected with adjusted spatial resolutions. Spin-up time is also considered in the model settings and is demonstrated to have the most obvious influence on the accuracy of the simulations; this conclusion is based on the large diversity of spatial distributions of precipitation, with the amount of heavy rain varying from -30% to 58% among the experiments. After following all the procedures, the variations of domain settings have minimal effect on the modeling and show the best correlation (larger than 0.65) with fusion observations. The model settings, including a domain size covering the greater Beijing area, a 1:5:5 downscaling ratio, 57 vertical levels with a model top of 50 hPa, and a 60-h spin-up time, are therefore found suitable for predicting similar convective torrential rainfall events in the Beijing area. We hope that the procedure for building the community WRF knowledge base in this paper will be helpful to peer researchers and operational communities by saving them from repeating each other's work. More importantly, results from studying different events and locations could enrich this community knowledge base to benefit WRF users around the world in the future.

  7. Safety at The William Quarrier Scottish Epilepsy Centre.

    PubMed

    Anderson, James; Grant, Victoria; Elgammal, Mariam; Campbell, Alison; Hampshire, Julia; Hansen, Stig; Russell, Aline J C

    2017-12-01

    We examined the yield from EMFIT bed alarms and staff response time to generalised seizure in a medium-term residential assessment unit for epilepsy. The Scottish Epilepsy Centre (SEC) has a Video Observation System (VOS) that provides continuous recording of all patient spaces (external and internal) and allows retention of clinically relevant events. A retrospective audit of daily EMFIT test records, nursing seizure record sheets (seizure type and EMFIT alert status), clinical incident reporting systems and the VOS database of retained clinical events was conducted for a 9-month period from April 1st 2016 till December 31st 2016. All generalised tonic-clonic seizures (GTCS) were noted by patient, time and location, and staff response time to GTCS was calculated. There were 85 people admitted during the audit period, who had 61 GTCS. 50 events occurred in bed with EMFIT alert status recorded. On 8 occasions the EMFIT did not alert: 5 events were not of sufficient duration or frequency, in 2 the patient fell from the bed early, and in 1 event the alarm did not trigger. The average response time to GTCS was 23 s. The longest response time was 69 s (range 0-69 s, SD 15.76). The EMFIT bed alarm appears to be a valuable adjunct to safety systems. Within the novel environment of the SEC it is possible to maintain a response time to GTCS that is comparable to hospital-based UK video telemetry units. Copyright © 2017 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  8. Use of Unstructured Event-Based Reports for Global Infectious Disease Surveillance

    PubMed Central

    Blench, Michael; Tolentino, Herman; Freifeld, Clark C.; Mandl, Kenneth D.; Mawudeku, Abla; Eysenbach, Gunther; Brownstein, John S.

    2009-01-01

    Free or low-cost sources of unstructured information, such as Internet news and online discussion sites, provide detailed local and near real-time data on disease outbreaks, even in countries that lack traditional public health surveillance. To improve public health surveillance and, ultimately, interventions, we examined 3 primary systems that process event-based outbreak information: Global Public Health Intelligence Network, HealthMap, and EpiSPIDER. Despite similarities among them, these systems are highly complementary because they monitor different data types, rely on varying levels of automation and human analysis, and distribute distinct information. Future development should focus on linking these systems more closely to public health practitioners in the field and establishing collaborative networks for alert verification and dissemination. Such development would further establish event-based monitoring as an invaluable public health resource that provides critical context and an alternative to traditional indicator-based outbreak reporting. PMID:19402953

  9. Prior task experience and comparable stimulus exposure nullify focal and nonfocal prospective memory retrieval differences.

    PubMed

    Hicks, Jason L; Franks, Bryan A; Spitler, Samantha N

    2017-10-01

    We explored the nature of focal versus nonfocal event-based prospective memory retrieval. In the context of a lexical decision task, people received an intention to respond to a single word (focal) in one condition and to a category label (nonfocal) for the other condition. Participants experienced both conditions, and their order was manipulated. The focal instruction condition was a single word presented multiple times. In Experiment 1, the stimuli in the nonfocal condition were different exemplars from a category, each presented once. In the nonfocal condition retrieval was poorer and reaction times were slower during the ongoing task as compared to the focal condition, replicating prior findings. In Experiment 2, the stimulus in the nonfocal condition was a single category exemplar repeated multiple times. When this single-exemplar nonfocal condition followed in time the single-item focal condition, focal versus nonfocal performance was virtually indistinguishable. These results demonstrate that people can modify their stimulus processing and expectations in event-based prospective memory tasks based on experience with the nature of prospective cues and with the ongoing task.

  10. Effect of typhoon on atmospheric aerosol particle pollutants accumulation over Xiamen, China.

    PubMed

    Yan, Jinpei; Chen, Liqi; Lin, Qi; Zhao, Shuhui; Zhang, Miming

    2016-09-01

    The strong influence of typhoons on air quality has been confirmed; however, few data, especially highly time-resolved aerosol particle data, have been available to establish the effect of typhoons on air pollution. A single particle aerosol spectrometer (SPAMS) was employed to characterize particles, with particle number counts at high time resolution, during two typhoons with similar tracks, Soulik (2013) and Soudelor (2015). Three periods comprising five events were classified over the whole observation time: pre-typhoon (events 1 and 2), typhoon (events 3 and 4) and post-typhoon (event 5), based on the meteorological parameters and particle pollutant properties. The first pollutant group appeared during the pre-typhoon period (event 2), with high relative contributions of V-Ni-rich particles; pollution from ship emissions, accumulated by local processes under a stagnant meteorological atmosphere, dominated the formation of this pollutant group before the typhoon. The second pollutant group was present during the typhoon (event 3), when the typhoon began to change the local wind direction and increase wind speed, and the particle number count reached its maximum value. High relative contributions of V-Ni-rich and dust particles with a low NO3(-)/SO4(2-) ratio were observed during this period, indicating that this pollutant group was governed by the combined effect of local pollutant emissions and long-range transport. This study provides deeper insight into the relationship between air pollution and typhoons. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Simulations and Characteristics of Large Solar Events Propagating Throughout the Heliosphere and Beyond (Invited)

    NASA Astrophysics Data System (ADS)

    Intriligator, D. S.; Sun, W.; Detman, T. R.; Dryer, M.; Intriligator, J.; Deehr, C. S.; Webber, W. R.; Gloeckler, G.; Miller, W. D.

    2015-12-01

    Large solar events can have severe adverse global impacts at Earth. These solar events can also propagate throughout the heliosphere and into the interstellar medium. We focus on the July 2012 and Halloween 2003 solar events. We simulate these events starting from the vicinity of the Sun at 2.5 Rs. We compare our three-dimensional (3D) time-dependent simulations to available spacecraft (s/c) observations at 1 AU and beyond. Based on the comparisons of the predictions from our simulations with in-situ measurements, we find that the effects of these large solar events can be observed in the outer heliosphere, the heliosheath, and even into the interstellar medium. We use two simulation models. The HAFSS (HAF Source Surface) model is a kinematic model. HHMS-PI (Hybrid Heliospheric Modeling System with Pickup protons) is a numerical magnetohydrodynamic solar wind (SW) simulation model. Both HHMS-PI and HAFSS are ideally suited for these analyses since, starting at 2.5 Rs from the Sun, they model the slowly evolving background SW and the impulsive, time-dependent events associated with solar activity. Our models naturally reproduce dynamic 3D spatially asymmetric effects observed throughout the heliosphere. Pre-existing SW background conditions have a strong influence on the propagation of shock waves from solar events. Time-dependence is a crucial aspect of interpreting s/c data. We show comparisons of our simulation results with STEREO A, ACE, Ulysses, and Voyager s/c observations.

  12. ADESSA: A Real-Time Decision Support Service for Delivery of Semantically Coded Adverse Drug Event Data

    PubMed Central

    Duke, Jon D.; Friedlin, Jeff

    2010-01-01

    Evaluating medications for potential adverse events is a time-consuming process, typically involving manual lookup of information by physicians. This process can be expedited by CDS systems that support dynamic retrieval and filtering of adverse drug events (ADEs), but such systems require a source of semantically coded ADE data. We created a two-component system that addresses this need. First we created a natural language processing application which extracts adverse events from Structured Product Labels and generates a standardized ADE knowledge base. We then built a decision support service that consumes a Continuity of Care Document and returns a list of patient-specific ADEs. Our database currently contains 534,125 ADEs from 5602 product labels. An NLP evaluation of 9529 ADEs showed recall of 93% and precision of 95%. On a trial set of 30 CCDs, the system provided adverse event data for 88% of drugs and returned these results in an average of 620 ms. PMID:21346964

  13. A diary after dinner: How the time of event recording influences later accessibility of diary events.

    PubMed

    Szőllősi, Ágnes; Keresztes, Attila; Conway, Martin A; Racsmány, Mihály

    2015-01-01

    Recording the events of a day in a diary may help improve their later accessibility. An interesting question is whether improvements in long-term accessibility will be greater if the diary is completed at the end of the day or, after a period of sleep, the following morning. We investigated this question using an internet-based diary method. On each of five days, participants (n = 109) recorded autobiographical memories for that day or for the previous day. Recording took place either in the morning or in the evening. Following a 30-day retention interval, the diary events were freely recalled. We found that participants who recorded their memories in the evening, before sleep, had the best memory performance. These results suggest that the time of reactivation and recording of recent autobiographical events has a significant effect on the later accessibility of those diary events. We discuss our results in the light of related findings that show a beneficial effect of reduced interference during sleep on memory consolidation and reconsolidation.

  14. Perpetrator, worker and workplace characteristics associated with patient and visitor perpetrated violence (Type II) on hospital workers: a review of the literature and existing occupational injury data.

    PubMed

    Pompeii, Lisa; Dement, John; Schoenfisch, Ashley; Lavery, Amy; Souder, Megan; Smith, Claudia; Lipscomb, Hester

    2013-02-01

    Non-fatal type II violence experienced by hospital workers (patient/visitor-on-worker violence) is not well described. Hospital administration data (2004-2009) were examined to calculate rates of type II violent events experienced by workers. We also conducted a review of the hospital-based literature (2000-2010) and summarized findings associated with type II violence. 484 physical assaults were identified in the data, a rate of 1.75 events/100 full-time equivalents. Only a few details about events were captured, and non-physical events were not captured at all. The literature yielded 17 studies, with reported proportions of verbal abuse (22%-90%), physical threats (12%-64%) and assaults (2%-32%). The literature lacked rigorous methods for examining the incidence of and circumstances surrounding events, or rates of events over time. For purposes of examining the impact of type II violence on worker safety, satisfaction and retention, rigorous surveillance efforts by hospital employers and researchers are warranted. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.

  15. Method of controlling cyclic variation in engine combustion

    DOEpatents

    Davis, L.I. Jr.; Daw, C.S.; Feldkamp, L.A.; Hoard, J.W.; Yuan, F.; Connolly, F.T.

    1999-07-13

    Cyclic variation in combustion of a lean burning engine is reduced by detecting an engine combustion event output such as torsional acceleration in a cylinder (i) at a combustion event (k), using the detected acceleration to predict a target acceleration for the cylinder at the next combustion event (k+1), modifying the target output by a correction term that is inversely proportional to the average phase of the combustion event output of cylinder (i) and calculating a control output such as fuel pulse width or spark timing necessary to achieve the target acceleration for cylinder (i) at combustion event (k+1) based on anti-correlation with the detected acceleration and spill-over effects from fueling. 27 figs.

  16. Method of controlling cyclic variation in engine combustion

    DOEpatents

    Davis, Jr., Leighton Ira; Daw, Charles Stuart; Feldkamp, Lee Albert; Hoard, John William; Yuan, Fumin; Connolly, Francis Thomas

    1999-01-01

    Cyclic variation in combustion of a lean burning engine is reduced by detecting an engine combustion event output such as torsional acceleration in a cylinder (i) at a combustion event (k), using the detected acceleration to predict a target acceleration for the cylinder at the next combustion event (k+1), modifying the target output by a correction term that is inversely proportional to the average phase of the combustion event output of cylinder (i) and calculating a control output such as fuel pulse width or spark timing necessary to achieve the target acceleration for cylinder (i) at combustion event (k+1) based on anti-correlation with the detected acceleration and spill-over effects from fueling.
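
    The control idea in both patent records can be caricatured in a few lines: because cycle-to-cycle combustion variation in lean engines tends to be anti-correlated, the acceleration at event (k) can be used to predict event (k+1), and a fueling correction can push that prediction toward a target. The sketch below is only that caricature; the gains, the first-order model, and the noise level are invented and bear no relation to the patented implementation.

      # Illustrative sketch of the anti-correlation feedback idea (not the
      # patented algorithm; all parameters are made up).
      import numpy as np

      rng = np.random.default_rng(1)
      mean_accel, rho = 100.0, -0.6   # mean torsional accel.; cycle-to-cycle anti-correlation
      gain = 0.5                      # hypothetical fuel-correction gain
      a = mean_accel
      for k in range(10):
          # Predict the next-event acceleration from the anti-correlation with event k.
          a_pred = mean_accel + rho * (a - mean_accel)
          target = mean_accel         # drive the cylinder toward the mean
          du = gain * (target - a_pred)   # fuel pulse-width correction (arbitrary units)
          # "Engine": anti-correlated dynamics plus fueling effect and noise.
          a = mean_accel + rho * (a - mean_accel) + du + rng.normal(0, 2.0)
          print(f"event {k}: accel={a:7.2f}, correction={du:+.2f}")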

  17. A large refined catalog of earthquake relocations and focal mechanisms for the Island of Hawai'i and its seismotectonic implications

    USGS Publications Warehouse

    Lin, Guoqing; Okubo, Paul G.

    2016-01-01

    We present high-quality focal mechanisms based on a refined earthquake location catalog for the Island of Hawai'i, focusing on Mauna Loa and Kīlauea volcanoes. The relocation catalog is based on first-arrival times and waveform data of both compressional and shear waves for about 180,000 events on and near the Island of Hawai'i between 1986 and 2009 recorded by the seismic stations at the Hawaiian Volcano Observatory. We relocate all the earthquakes by applying ray tracing through an existing three-dimensional velocity model, similar event cluster analysis, and a differential-time relocation method. The resulting location catalog represents an expansion of previous relocation studies, covering a longer time period and consisting of more events with well-constrained absolute locations. The focal mechanisms are obtained based on the compressional-wave first-motion polarities and compressional-to-shear wave amplitude ratios by applying the HASH program to the waveform cross correlation relocated earthquakes. Overall, the good-quality (defined by the HASH parameters) focal solutions are dominated by normal faulting in our study area, especially in the active Ka'ōiki and Hīlea seismic zones. Kīlauea caldera is characterized by a mixture of approximately equal numbers of normal, strike-slip, and reverse faults, whereas its south flank has slightly fewer strike-slip events. Our relocation and focal mechanism results will be useful for mapping the seismic stress and strain fields and for understanding the seismic-volcanic-tectonic relationships within the magmatic systems.

  18. A large refined catalog of earthquake relocations and focal mechanisms for the Island of Hawai'i and its seismotectonic implications

    NASA Astrophysics Data System (ADS)

    Lin, Guoqing; Okubo, Paul G.

    2016-07-01

    We present high-quality focal mechanisms based on a refined earthquake location catalog for the Island of Hawai'i, focusing on Mauna Loa and Kīlauea volcanoes. The relocation catalog is based on first-arrival times and waveform data of both compressional and shear waves for about 180,000 events on and near the Island of Hawai'i between 1986 and 2009 recorded by the seismic stations at the Hawaiian Volcano Observatory. We relocate all the earthquakes by applying ray tracing through an existing three-dimensional velocity model, similar event cluster analysis, and a differential-time relocation method. The resulting location catalog represents an expansion of previous relocation studies, covering a longer time period and consisting of more events with well-constrained absolute locations. The focal mechanisms are obtained based on the compressional-wave first-motion polarities and compressional-to-shear wave amplitude ratios by applying the HASH program to the waveform cross correlation relocated earthquakes. Overall, the good-quality (defined by the HASH parameters) focal solutions are dominated by normal faulting in our study area, especially in the active Ka'ōiki and Hīlea seismic zones. Kīlauea caldera is characterized by a mixture of approximately equal numbers of normal, strike-slip, and reverse faults, whereas its south flank has slightly fewer strike-slip events. Our relocation and focal mechanism results will be useful for mapping the seismic stress and strain fields and for understanding the seismic-volcanic-tectonic relationships within the magmatic systems.

  19. Population-based absolute risk estimation with survey data

    PubMed Central

    Kovalchik, Stephanie A.; Pfeiffer, Ruth M.

    2013-01-01

    Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
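
    In generic competing-risks notation (not necessarily the authors' own), the model described above amounts to the standard cause-specific absolute risk formula, with the hazard for event type k factored into an individualized relative risk and a baseline hazard:

      \[
        h_k(t \mid \mathbf{x}) = \mathrm{rr}_k(\mathbf{x})\, h_{0k}(t),
        \qquad
        \mathrm{AR}_k(\tau \mid \mathbf{x}) =
          \int_0^{\tau} h_k(t \mid \mathbf{x})\,
          \exp\!\Big(-\sum_{j} \int_0^{t} h_j(u \mid \mathbf{x})\, du\Big)\, dt,
      \]

    where h_{0k} is the baseline hazard for event type k (modeled nonparametrically or as piecewise exponential) and the sum in the exponent runs over all competing event types, so that the integrand is the probability of surviving every event up to time t and then experiencing event k.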

  20. Daily GRACE gravity field solutions track major flood events in the Ganges-Brahmaputra Delta

    NASA Astrophysics Data System (ADS)

    Gouweleeuw, Ben T.; Kvas, Andreas; Gruber, Christian; Gain, Animesh K.; Mayer-Gürr, Thorsten; Flechtner, Frank; Güntner, Andreas

    2018-05-01

    Two daily gravity field solutions based on observations from the Gravity Recovery and Climate Experiment (GRACE) satellite mission are evaluated against daily river runoff data for major flood events in the Ganges-Brahmaputra Delta (GBD) in 2004 and 2007. The trends over periods of a few days of the daily GRACE data reflect temporal variations in daily river runoff during major flood events. This is especially true for the larger flood in 2007, which featured two distinct periods of critical flood level exceedance in the Brahmaputra River. This first hydrological evaluation of daily GRACE gravity field solutions based on a Kalman filter approach confirms their potential for gravity-based large-scale flood monitoring. This particularly applies to short-lived, high-volume floods, as they occur in the GBD with a 4-5-year return period. The release of daily GRACE gravity field solutions in near-real time may enable flood monitoring for large events.

  1. RoboTAP: Target priorities for robotic microlensing observations

    NASA Astrophysics Data System (ADS)

    Hundertmark, M.; Street, R. A.; Tsapras, Y.; Bachelet, E.; Dominik, M.; Horne, K.; Bozza, V.; Bramich, D. M.; Cassan, A.; D'Ago, G.; Figuera Jaimes, R.; Kains, N.; Ranc, C.; Schmidt, R. W.; Snodgrass, C.; Wambsganss, J.; Steele, I. A.; Mao, S.; Ment, K.; Menzies, J.; Li, Z.; Cross, S.; Maoz, D.; Shvartzvald, Y.

    2018-01-01

    Context. The ability to automatically select scientifically-important transient events from an alert stream of many such events, and to conduct follow-up observations in response, will become increasingly important in astronomy. With wide-angle time domain surveys pushing to fainter limiting magnitudes, the capability to follow-up on transient alerts far exceeds our follow-up telescope resources, and effective target prioritization becomes essential. The RoboNet-II microlensing program is a pathfinder project, which has developed an automated target selection process (RoboTAP) for gravitational microlensing events, which are observed in real time using the Las Cumbres Observatory telescope network. Aims: Follow-up telescopes typically have a much smaller field of view compared to surveys, therefore the most promising microlensing events must be automatically selected at any given time from an annual sample exceeding 2000 events. The main challenge is to select between events with a high planet detection sensitivity, with the aim of detecting many planets and characterizing planetary anomalies. Methods: Our target selection algorithm is a hybrid system based on estimates of the planet detection zones around a microlens. It follows automatic anomaly alerts and respects the expected survey coverage of specific events. Results: We introduce the RoboTAP algorithm, whose purpose is to select and prioritize microlensing events with high sensitivity to planetary companions. In this work, we determine the planet sensitivity of the RoboNet follow-up program and provide a working example of how a broker can be designed for a real-life transient science program conducting follow-up observations in response to alerts; we explore the issues that will confront similar programs being developed for the Large Synoptic Survey Telescope (LSST) and other time domain surveys.

  2. Event-Based Sensing and Control for Remote Robot Guidance: An Experimental Case

    PubMed Central

    Santos, Carlos; Martínez-Rey, Miguel; Santiso, Enrique

    2017-01-01

    This paper describes the theoretical and practical foundations for remote control of a mobile robot for nonlinear trajectory tracking using an external localisation sensor. It constitutes a classical networked control system, whereby event-based techniques for both control and state estimation contribute to efficient use of communications and reduce sensor activity. Measurement requests are dictated by an event-based state estimator by setting an upper bound to the estimation error covariance matrix. The rest of the time, state prediction is carried out with the Unscented transformation. This prediction method makes it possible to select the appropriate instants at which to perform actuations on the robot so that guidance performance does not degrade below a certain threshold. Ultimately, we obtained a combined event-based control and estimation solution that drastically reduces communication accesses. The magnitude of this reduction is set according to the tracking error margin of a P3-DX robot following a nonlinear trajectory, remotely controlled with a mini PC and whose pose is detected by a camera sensor. PMID:28878144
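
    The event-triggering rule described here, requesting a measurement only when the predicted estimation error covariance exceeds a bound, can be sketched compactly. The toy below substitutes a linear Kalman filter for the paper's Unscented prediction, and every matrix and threshold is invented; it only illustrates how the covariance bound throttles sensor requests.

      # Event-based measurement requests driven by a covariance bound
      # (illustrative linear stand-in for the paper's Unscented prediction).
      import numpy as np

      F = np.array([[1.0, 0.1], [0.0, 1.0]])   # simple constant-velocity model
      Q = 0.01 * np.eye(2)                      # process noise
      H = np.array([[1.0, 0.0]])                # camera measures position only
      R = np.array([[0.05]])                    # measurement noise
      P_bound = 0.5                             # trigger when trace(P) exceeds this

      rng = np.random.default_rng(0)
      x, P = np.zeros(2), np.eye(2)
      requests = 0
      for k in range(100):
          x, P = F @ x, F @ P @ F.T + Q         # predict between measurements
          if np.trace(P) > P_bound:             # event condition on covariance
              z = H @ x + np.sqrt(R[0, 0]) * rng.standard_normal(1)  # stand-in sensor
              S = H @ P @ H.T + R
              K = P @ H.T @ np.linalg.inv(S)
              x = x + K @ (z - H @ x)
              P = (np.eye(2) - K @ H) @ P
              requests += 1
      print(f"measurements requested: {requests} of 100 steps")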

  3. Intelligent detection and identification in fiber-optical perimeter intrusion monitoring system based on the FBG sensor network

    NASA Astrophysics Data System (ADS)

    Wu, Huijuan; Qian, Ya; Zhang, Wei; Li, Hanyu; Xie, Xin

    2015-12-01

    A real-time intelligent fiber-optic perimeter intrusion detection system (PIDS) based on a fiber Bragg grating (FBG) sensor network is presented in this paper. To distinguish the effects of different intrusion events, a novel real-time behavior impact classification method is proposed based on the essential statistical characteristics of the signal's profile in the time domain. The features are extracted by principal component analysis (PCA) and are then used to identify the event with a K-nearest neighbor classifier. Simulation and field tests are both carried out to validate its effectiveness. The average identification rate (IR) for five sample signals in the simulation test is as high as 96.67%, and the recognition rate for eight typical signals in the field test reaches 96.52%, covering both fence-mounted and ground-buried sensing signals. In addition, a critically high detection rate (DR) and a low false alarm rate (FAR) can be obtained simultaneously, based on autocorrelation characteristics analysis and a hierarchical detection and identification flow.
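
    The classification stage described above (PCA features followed by a K-nearest-neighbour classifier) maps directly onto standard library calls. The sketch below runs the same pipeline on synthetic stand-in signals; the five mock event classes and all parameters are invented, not the paper's FBG data.

      # PCA + KNN classification sketch on synthetic stand-in signals.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.pipeline import make_pipeline
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      n_per_class, n_samples = 40, 256
      X, y = [], []
      for label, freq in enumerate([3, 7, 15, 30, 60]):   # five mock event types
          t = np.linspace(0, 1, n_samples)
          for _ in range(n_per_class):
              X.append(np.sin(2 * np.pi * freq * t)
                       + 0.3 * rng.standard_normal(n_samples))
              y.append(label)
      X, y = np.array(X), np.array(y)

      clf = make_pipeline(PCA(n_components=10), KNeighborsClassifier(n_neighbors=5))
      print("cross-validated identification rate:",
            cross_val_score(clf, X, y, cv=5).mean())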

  4. Prediction of Intensity Change Subsequent to Concentric Eyewall Events

    NASA Astrophysics Data System (ADS)

    Mauk, Rachel Grant

    Concentric eyewall events have been documented numerous times in intense tropical cyclones over the last two decades. During a concentric eyewall event, an outer (secondary) eyewall forms around the inner (primary) eyewall. Improved instrumentation on aircraft and satellites greatly increases the likelihood of detecting an event. Despite the increased ability to detect such events, forecasts of intensity changes during and after these events remain poor. When concentric eyewall events occur near land, accurate intensity change predictions are especially critical to ensure proper emergency preparations and staging of recovery assets. A nineteen-year (1997-2015) database of concentric eyewall events is developed by analyzing microwave satellite imagery, aircraft- and land-based radar, and other published documents. Events are identified in both the North Atlantic and eastern North Pacific basins. TCs are categorized as single (1 event), serial (≥ 2 events) and super-serial (≥ 3 events). Key findings here include distinct spatial patterns for single and serial Atlantic TCs, a broad seasonal distribution for eastern North Pacific TCs, and apparent ENSO-related variability in both basins. The intensity change subsequent to the concentric eyewall event is calculated from the HURDAT2 database at time points relative to the start and to the end of the event. Intensity change is then categorized as Weaken (≤ -10 kt), Maintain (± 5 kt), and Strengthen (≥ 10 kt). Environmental conditions in which each event occurred are analyzed based on the SHIPS diagnostic files. Oceanic, dynamic, thermodynamic, and TC status predictors are selected for testing in a multiple discriminant analysis procedure to determine which variables successfully discriminate the intensity change category and the occurrence of additional concentric eyewall events. Intensity models are created for 12 h, 24 h, 36 h, and 48 h after the concentric eyewall events end. Leave-one-out cross validation is performed on each set of discriminators to generate classifications, which are then compared to observations. For each model, the top combinations achieve 80-95% overall accuracy in classifying TCs based on the environmental characteristics, although Maintain systems are frequently misclassified. The third part of this dissertation employs the Weather Research and Forecasting model to further investigate concentric eyewall events. Two serial Atlantic concentric eyewall cases (Katrina 2005 and Wilma 2005) are selected from the original study set, and WRF simulations performed using several model designs. Despite strong evidence from multiple sources that serial concentric eyewalls formed in both hurricanes, the WRF simulations did not produce identifiable concentric eyewall structures for Katrina, and only transient structures for Wilma. Possible reasons for the lack of concentric eyewall formation are discussed, including model resolution, microphysics, and data sources.
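
    The discriminant-analysis-with-leave-one-out-cross-validation step generalizes beyond this dissertation and is easy to sketch. Below, linear discriminant analysis stands in for the multiple discriminant analysis procedure, and the predictors are random placeholders for the SHIPS-derived environmental variables.

      # Discriminant analysis with leave-one-out cross-validation (stand-in data).
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import LeaveOneOut, cross_val_score

      rng = np.random.default_rng(0)
      X = rng.standard_normal((60, 6))   # 60 events, 6 candidate predictors
      y = rng.integers(0, 3, 60)         # Weaken / Maintain / Strengthen
      X[y == 0] -= 1.0                   # inject some class separation
      X[y == 2] += 1.0

      lda = LinearDiscriminantAnalysis()
      acc = cross_val_score(lda, X, y, cv=LeaveOneOut()).mean()
      print(f"leave-one-out accuracy: {acc:.2f}")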

  5. Hospital staff should use more than one method to detect adverse events and potential adverse events: incident reporting, pharmacist surveillance and local real‐time record review may all have a place

    PubMed Central

    Olsen, Sisse; Neale, Graham; Schwab, Kat; Psaila, Beth; Patel, Tejal; Chapman, E Jane; Vincent, Charles

    2007-01-01

    Background Over the past five years, in most hospitals in England and Wales, incident reporting has become well established but it remains unclear how well reports match clinical adverse events. International epidemiological studies of adverse events are based on retrospective, multi‐hospital case record review. In this paper the authors describe the use of incident reporting, pharmacist surveillance and local real‐time record review for the recognition of clinical risks associated with hospital inpatient care. Methodology Data on adverse events were collected prospectively on 288 patients discharged from adult acute medical and surgical units in an NHS district general hospital using incident reports, active surveillance of prescription charts by pharmacists and record review at time of discharge. Results Record review detected 26 adverse events (AEs) and 40 potential adverse events (PAEs) occurring during the index admission. In contrast, in the same patient group, incident reporting detected 11 PAEs and no AEs. Pharmacy surveillance found 10 medication errors all of which were PAEs. There was little overlap in the nature of events detected by the three methods. Conclusion The findings suggest that incident reporting does not provide an adequate assessment of clinical adverse events and that this method needs to be supplemented with other more systematic forms of data collection. Structured record review, carried out by clinicians, provides an important component of an integrated approach to identifying risk in the context of developing a safety and quality improvement programme. PMID:17301203

  6. Assessing dry weather flow contribution in TSS and COD storm events loads in combined sewer systems.

    PubMed

    Métadier, M; Bertrand-Krajewski, J L

    2011-01-01

    Continuous high-resolution long-term turbidity measurements, along with continuous discharge measurements, are now recognised as an appropriate technique for the estimation of in-sewer total suspended solids (TSS) and chemical oxygen demand (COD) loads during storm events. In the combined system of the Ecully urban catchment (Lyon, France), this technique has been implemented since 2003, with more than 200 storm events monitored. This paper presents a method for the estimation of the dry weather (DW) contribution to measured total TSS and COD event loads, with special attention devoted to uncertainty assessment. The method accounts for the dynamics of both discharge and turbidity time series at a two-minute time step. The study is based on 180 DW days monitored in 2007-2008. Three distinct classes of DW days were evidenced. Variability analysis and quantification showed that no seasonal effect and no trend over the year were detectable. The law of propagation of uncertainties is applicable for uncertainty estimation. The method has then been applied to all measured storm events. This study confirms the interest of long-term continuous discharge and turbidity time series in sewer systems, especially in the perspective of wet weather quality modelling.
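
    The underlying bookkeeping, integrating concentration times discharge over the event at a two-minute step and subtracting a dry-weather baseline, can be sketched as follows. All numbers are invented; the real method also propagates measurement uncertainties, which this toy omits.

      # Event-load sketch: integrate TSS x discharge at a 2-minute step and
      # subtract an invented dry-weather (DW) contribution.
      import numpy as np

      dt = 120.0                                   # 2-minute step, seconds
      t = np.arange(0, 6 * 3600, dt)               # a 6-hour storm event
      Q = 0.2 + 0.8 * np.exp(-((t - 7200) / 3000) ** 2)    # discharge, m3/s
      TSS = 150 + 350 * np.exp(-((t - 6600) / 2400) ** 2)  # TSS (from turbidity), mg/L

      total_load = np.sum(TSS * 1e-3 * Q * dt)     # mg/L * m3/s = g/s -> kg
      Q_dw, TSS_dw = 0.2, 150.0                    # typical DW values
      dw_load = TSS_dw * 1e-3 * Q_dw * dt * len(t) # DW contribution, kg
      print(f"event load {total_load:.0f} kg, of which DW ~ {dw_load:.0f} kg "
            f"({100 * dw_load / total_load:.0f}%)")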

  7. Exploiting semantics for sensor re-calibration in event detection systems

    NASA Astrophysics Data System (ADS)

    Vaisenberg, Ronen; Ji, Shengyue; Hore, Bijit; Mehrotra, Sharad; Venkatasubramanian, Nalini

    2008-01-01

    Event detection from a video stream is becoming an important and challenging task in surveillance and sentient systems. While computer vision has been extensively studied to solve different kinds of detection problems over time, it is still a hard problem, and even in a controlled environment only simple events can be detected with a high degree of accuracy. Instead of struggling to improve event detection using image processing only, we bring in semantics to direct traditional image processing. Semantics are the underlying facts that hide beneath video frames, which cannot be "seen" directly by image processing. In this work we demonstrate that time sequence semantics can be exploited to guide unsupervised re-calibration of the event detection system. We present an instantiation of our ideas by using an appliance as an example--coffee pot level detection based on video data--to show that semantics can guide the re-calibration of the detection model. This work exploits time sequence semantics to detect when re-calibration is required, to automatically relearn a new detection model for the newly evolved system state, and to resume monitoring with a higher rate of accuracy.

  8. GAC: Gene Associations with Clinical, a web based application.

    PubMed

    Zhang, Xinyan; Rupji, Manali; Kowalski, Jeanne

    2017-01-01

    We present GAC, a Shiny R-based tool for interactive visualization of clinical associations based on high-dimensional data. The tool provides a web-based suite to perform supervised principal component analysis (SuperPC), an approach that uses high-dimensional data, such as gene expression, combined with clinical data to infer clinical associations. We extended the approach to address binary outcomes, in addition to continuous and time-to-event data, in our package, thereby increasing the use and flexibility of SuperPC. Additionally, the tool provides an interactive visualization for summarizing results based on a forest plot for both binary and time-to-event data. In summary, the GAC suite of tools provides a one-stop shop for conducting statistical analyses to identify and visualize the association between a clinical outcome of interest and high-dimensional data types, such as genomic data. Our GAC package has been implemented in R and is available via http://shinygispa.winship.emory.edu/GAC/. The developmental repository is available at https://github.com/manalirupji/GAC.

  9. Correlating Transcription Initiation and Conformational Changes by a Single-Subunit RNA Polymerase with Near Base-Pair Resolution.

    PubMed

    Koh, Hye Ran; Roy, Rahul; Sorokina, Maria; Tang, Guo-Qing; Nandakumar, Divya; Patel, Smita S; Ha, Taekjip

    2018-05-17

    We provide a comprehensive analysis of transcription in real time by T7 RNA Polymerase (RNAP) using single-molecule fluorescence resonance energy transfer by monitoring the entire life history of transcription initiation, including stepwise RNA synthesis with near base-pair resolution, abortive cycling, and transition into elongation. Kinetically branching pathways were observed for abortive initiation with an RNAP either recycling on the same promoter or exchanging with another RNAP from solution. We detected fast and slow populations of RNAP in their transition into elongation, consistent with the efficient and delayed promoter release, respectively, observed in ensemble studies. Real-time monitoring of abortive cycling using three-probe analysis showed that the initiation events are stochastically branched into productive and failed transcription. The abortive products are generated primarily from initiation events that fail to progress to elongation, and a majority of the productive events transit to elongation without making abortive products. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. Event-Triggered Distributed Approximate Optimal State and Output Control of Affine Nonlinear Interconnected Systems.

    PubMed

    Narayanan, Vignesh; Jagannathan, Sarangapani

    2017-06-08

    This paper presents an approximate optimal distributed control scheme for a known interconnected system composed of input-affine nonlinear subsystems using event-triggered state and output feedback via a novel hybrid learning scheme. First, the cost function for the overall system is redefined as the sum of the cost functions of the individual subsystems. A distributed optimal control policy for the interconnected system is developed using the optimal value function of each subsystem. To generate the optimal control policy, forward-in-time neural networks are employed to reconstruct the unknown optimal value function at each subsystem online. In order to retain the advantages of event-triggered feedback for an adaptive optimal controller, a novel hybrid learning scheme is proposed to reduce the convergence time of the learning algorithm. The development is based on the observation that, in event-triggered feedback, the sampling instants are dynamic and result in variable interevent times. To relax the requirement of entire state measurements, an extended nonlinear observer is designed at each subsystem to recover the system internal states from the measurable feedback. Using a Lyapunov-based analysis, it is demonstrated that the system states and the observer errors remain locally uniformly ultimately bounded and that the control policy converges to a neighborhood of the optimal policy. Simulation results are presented to demonstrate the performance of the developed controller.

  11. Hydrologic ensembles based on convection-permitting precipitation nowcasts for flash flood warnings

    NASA Astrophysics Data System (ADS)

    Demargne, Julie; Javelle, Pierre; Organde, Didier; de Saint Aubin, Céline; Ramos, Maria-Helena

    2017-04-01

    In order to better anticipate flash flood events and provide timely warnings to communities at risk, the French national service in charge of flood forecasting (SCHAPI) is implementing a national flash flood warning system for small-to-medium ungauged basins. Based on a discharge-threshold flood warning method called AIGA (Javelle et al. 2014), the current version of the system runs a simplified hourly distributed hydrologic model with operational radar-gauge QPE grids from Météo-France at a 1-km2 resolution every 15 minutes. This produces real-time peak discharge estimates along the river network, which are subsequently compared to regionalized flood frequency estimates to provide warnings according to the AIGA-estimated return period of the ongoing event. To further extend the effective warning lead time while accounting for hydrometeorological uncertainties, the flash flood warning system is being enhanced to include Météo-France's AROME-NWC high-resolution precipitation nowcasts as time-lagged ensembles and multiple sets of hydrological regionalized parameters. The operational deterministic precipitation forecasts, from the nowcasting version of the AROME convection-permitting model (Auger et al. 2015), were provided at a 2.5-km resolution for a 6-hr forecast horizon for 9 significant rain events from September 2014 to June 2016. The time-lagged approach is a practical choice of accounting for the atmospheric forecast uncertainty when no extensive forecast archive is available for statistical modelling. The evaluation on 781 French basins showed significant improvements in terms of flash flood event detection and effective warning lead-time, compared to warnings from the current AIGA setup (without any future precipitation). We also discuss how to effectively communicate verification information to help determine decision-relevant warning thresholds for flood magnitude and probability. Javelle, P., Demargne, J., Defrance, D., Arnaud, P., 2014. Evaluating flash flood warnings at ungauged locations using post-event surveys: a case study with the AIGA warning system. Hydrological Sciences Journal, doi: 10.1080/02626667.2014.923970 Auger, L., Dupont, O., Hagelin, S., Brousseau, P., Brovelli, P., 2015. AROME-NWC: a new nowcasting tool based on an operational mesoscale forecasting system. Quarterly Journal of the Royal Meteorological Society, 141: 1603-1611, doi:10.1002/qj.2463
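
    The AIGA-style warning rule sketched in this abstract, comparing simulated peak discharges against regionalized flood-frequency thresholds, reduces to a few lines once the ensemble peaks are in hand. The thresholds and peaks below are invented placeholders.

      # Discharge-threshold warning sketch in the spirit of AIGA (invented values).
      import numpy as np

      # Regionalized peak-discharge quantiles for one ungauged reach (m3/s).
      thresholds = {"2-year": 12.0, "10-year": 25.0, "50-year": 40.0}
      peaks = np.array([9.0, 14.0, 22.0, 28.0, 31.0])  # time-lagged ensemble peaks

      for level, q in thresholds.items():
          prob = np.mean(peaks > q)                    # exceedance probability
          print(f"P(peak > {level} flood) = {prob:.0%}")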

  12. Assessing the performance of regional landslide early warning models: the EDuMaP method

    NASA Astrophysics Data System (ADS)

    Calvello, M.; Piciullo, L.

    2016-01-01

    A schematic of the components of regional early warning systems for rainfall-induced landslides is herein proposed, based on a clear distinction between warning models and warning systems. According to this framework an early warning system comprises a warning model as well as a monitoring and warning strategy, a communication strategy and an emergency plan. The paper proposes the evaluation of regional landslide warning models by means of an original approach, called the "event, duration matrix, performance" (EDuMaP) method, comprising three successive steps: identification and analysis of the events, i.e., landslide events and warning events derived from available landslides and warnings databases; definition and computation of a duration matrix, whose elements report the time associated with the occurrence of landslide events in relation to the occurrence of warning events, in their respective classes; evaluation of the early warning model performance by means of performance criteria and indicators applied to the duration matrix. During the first step the analyst identifies and classifies the landslide and warning events, according to their spatial and temporal characteristics, by means of a number of model parameters. In the second step, the analyst computes a time-based duration matrix with a number of rows and columns equal to the number of classes defined for the warning and landslide events, respectively. In the third step, the analyst computes a series of model performance indicators derived from a set of performance criteria, which need to be defined by considering, once again, the features of the warning model. The applicability, potentialities and limitations of the EDuMaP method are tested and discussed using real landslides and warning data from the municipal early warning system operating in Rio de Janeiro (Brazil).
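
    A minimal sketch of the duration-matrix step may help: classify each time interval by its warning class and its landslide class, accumulate the time spent in each (warning, landslide) cell, then derive indicators from the matrix. The class definitions, the hourly records and the single indicator below are invented for illustration; the EDuMaP method defines these elements much more carefully.

      # Duration-matrix sketch (invented classes and records).
      import numpy as np

      W_CLASSES = ["no warning", "moderate", "high"]   # warning levels
      L_CLASSES = ["no landslide", "minor", "major"]   # landslide event classes

      # Hourly records over ten days: issued warning and observed landslide class.
      rng = np.random.default_rng(0)
      warn = rng.integers(0, 3, 240)
      slide = rng.integers(0, 3, 240)

      D = np.zeros((3, 3))                             # the duration matrix, hours
      for w, s in zip(warn, slide):
          D[w, s] += 1.0

      # One possible indicator: fraction of landslide time covered by a warning.
      hit_time = D[1:, 1:].sum() / max(D[:, 1:].sum(), 1)
      print(D)
      print(f"fraction of landslide hours covered by a warning: {hit_time:.2f}")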

  13. Investigating the Origins of Two Extreme Solar Particle Events: Proton Source Profile and Associated Electromagnetic Emissions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kocharov, Leon; Usoskin, Ilya; Pohjolainen, Silja

    We analyze the high-energy particle emission from the Sun in two extreme solar particle events in which protons are accelerated to relativistic energies and can cause a significant signal even in the ground-based particle detectors. Analysis of a relativistic proton event is based on modeling of the particle transport and interaction, from a near-Sun source through the solar wind and the Earth’s magnetosphere and atmosphere to a detector on the ground. This allows us to deduce the time profile of the proton source at the Sun and compare it with observed electromagnetic emissions. The 1998 May 2 event is associated with a flare and a coronal mass ejection (CME), which were well observed by the Nançay Radioheliograph, thus the images of the radio sources are available. For the 2003 November 2 event, the low corona images of the CME liftoff obtained at the Mauna Loa Solar Observatory are available. Those complementary data sets are analyzed jointly with the broadband dynamic radio spectra, EUV images, and other data available for both events. We find a common scenario for both eruptions, including the flare’s dual impulsive phase, the CME-launch-associated decimetric-continuum burst, and the late, low-frequency type III radio bursts at the time of the relativistic proton injection into the interplanetary medium. The analysis supports the idea that the two considered events start with emission of relativistic protons previously accelerated during the flare and CME launch, then trapped in large-scale magnetic loops and later released by the expanding CME.

  14. Recollection-dependent memory for event duration in large-scale spatial navigation

    PubMed Central

    Barense, Morgan D.

    2017-01-01

    Time and space represent two key aspects of episodic memories, forming the spatiotemporal context of events in a sequence. Little is known, however, about how temporal information, such as the duration and the order of particular events, is encoded into memory, and whether it matters if the memory representation is based on recollection or familiarity. To investigate this issue, we used a real-world virtual reality navigation paradigm in which periods of navigation were interspersed with pauses of different durations. Crucially, participants were able to reliably distinguish the durations of events that were subjectively “reexperienced” (i.e., recollected), but not of those that were merely familiar. This effect was not found in temporal order (ordinal) judgments. We also show that the active experience of the passage of time (holding down a key while waiting) moderately enhanced duration memory accuracy. Memory for event duration, therefore, appears to rely on the hippocampally supported ability to recollect or reexperience an event, enabling the reinstatement of both its duration and its spatial context, to distinguish it from other events in a sequence. In contrast, ordinal memory appears to rely on familiarity and recollection to a similar extent. PMID:28202714

  15. Ensemble Modeling of the July 23, 2012 CME Event

    NASA Astrophysics Data System (ADS)

    Cash, M. D.; Biesecker, D. A.; Millward, G.; Arge, C. N.; Henney, C. J.

    2013-12-01

    On July 23, 2012 a large and very fast coronal mass ejection (CME) was observed by STEREO A. This CME was unusual in that the estimates of the speed of the CME ranged from 2125 km/s to 2780 km/s based on dividing the distance of STEREO A from the Sun by the transit time of the CME. Modeling of this CME event with the WSA-Enlil model has also suggested that a very fast speed is required in order to obtain the correct arrival time at 1 AU. We present a systematic study of parameter space for the July 23, 2012 CME event through an ensemble study using the WSA-Enlil model to predict the arrival time of the CME at STEREO A. We investigate how variations in the initial speed, angular width, and direction affect the predicted arrival time. We also explore how variations in the background solar wind influence CME arrival time by using varying ADAPT maps within our ensemble study. Factors involved in the fast transit time of this large CME are discussed and the optimal CME parameters are presented.

  16. ABCD3-I score and the risk of early or 3-month stroke recurrence in tissue- and time-based definitions of TIA and minor stroke.

    PubMed

    Mayer, Lukas; Ferrari, Julia; Krebs, Stefan; Boehme, Christian; Toell, Thomas; Matosevic, Benjamin; Tinchon, Alexander; Brainin, Michael; Gattringer, Thomas; Sommer, Peter; Thun, Peter; Willeit, Johann; Lang, Wilfried; Kiechl, Stefan; Knoflach, Michael

    2018-03-01

    The change in the definition of TIA from a time to a tissue basis raises questions about the validity of the well-established ABCD3-I risk score for recurrent ischemic cerebrovascular events. We analyzed patients with ischemic stroke with mild neurological symptoms arriving < 24 h after symptom onset, in a phase when it is unclear whether the event will turn out to be a TIA or minor stroke, in the prospective multi-center Austrian Stroke Unit Registry. Patients were retrospectively categorized according to a time-based (symptom duration below/above 24 h) and a tissue-based (without/with corresponding brain lesion on CT or MRI) definition of TIA or minor stroke. Outcome parameters were early stroke during the stroke unit stay and 3-month ischemic stroke. Of the 5237 TIA and minor stroke patients with a prospectively documented ABCD3-I score, 2755 (52.6%) had a TIA by the time-based and 2183 (41.7%) by the tissue-based definition. Of the 2457 (46.9%) patients with complete 3-month follow-up, the corresponding numbers were 1195 (48.3%) for the time-based and 971 (39.5%) for the tissue-based definition of TIA. Early and 3-month ischemic stroke occurred in 1.1 and 2.5% of time-based TIA, 3.8 and 5.9% of time-based minor stroke, 1.2 and 2.3% of tissue-based TIA, and 3.1 and 5.5% of tissue-based minor stroke patients. Irrespective of the definition of TIA and minor stroke, the risk of early and 3-month ischemic stroke steadily increased with increasing ABCD3-I score points. The ABCD3-I score performs equally well in TIA patients under the tissue-based and the time-based definition, and the same is true for minor stroke patients.

  17. QRS detection based ECG quality assessment.

    PubMed

    Hayn, Dieter; Jammerbund, Bernhard; Schreier, Günter

    2012-09-01

    Although immediate feedback concerning ECG signal quality during recording is useful, up to now little literature describing quality measures has been available. We have implemented and evaluated four ECG quality measures. The empty lead criterion (A), spike detection criterion (B) and lead crossing point criterion (C) were calculated from basic signal properties. Measure D quantified the robustness of QRS detection when applied to the signal. An advanced Matlab-based algorithm combining all four measures and a simplified algorithm for Android platforms, excluding measure D, were developed. Both algorithms were evaluated by taking part in the Computing in Cardiology Challenge 2011. Each measure's accuracy and computing time was evaluated separately. During the challenge, the advanced algorithm correctly classified 93.3% of the ECGs in the training set and 91.6% in the test set. Scores for the simplified algorithm were 0.834 in event 2 and 0.873 in event 3. Computing time for measure D was almost five times higher than for the other measures. Required accuracy levels depend on the application and are related to computing time. While our simplified algorithm may be accurate enough for real-time feedback during ECG self-recordings, QRS detection based measures can further increase performance if sufficient computing power is available.
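
    Two of the basic signal-property criteria and the QRS-robustness idea are simple enough to sketch. The thresholds below, and the use of a crude peak count as a stand-in for measure D, are our own illustrations, not the published algorithm.

      # Illustrative ECG quality criteria (thresholds invented).
      import numpy as np

      def empty_lead(sig, eps=1e-3):
          """Criterion A stand-in: flag leads whose amplitude range is ~zero."""
          return np.ptp(sig) < eps

      def has_spikes(sig, k=10.0):
          """Criterion B stand-in: flag jumps far above the typical step size."""
          d = np.abs(np.diff(sig))
          return d.max() > k * (np.median(d) + 1e-12)

      def qrs_count_plausible(sig, fs=500.0):
          """Measure D stand-in: a crude peak count should imply a plausible rate."""
          x = sig - sig.mean()
          thr = 0.6 * np.max(np.abs(x))
          peaks = np.sum((x[1:-1] > thr) & (x[1:-1] >= x[:-2]) & (x[1:-1] > x[2:]))
          bpm = 60.0 * peaks / (len(sig) / fs)
          return 30.0 < bpm < 200.0

      t = np.arange(0, 10, 1 / 500.0)
      ecg = np.sin(2 * np.pi * 1.2 * t) ** 31   # spiky 72-bpm stand-in "ECG"
      print(empty_lead(ecg), has_spikes(ecg), qrs_count_plausible(ecg))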

  18. A Model of Rapid Radicalization Behavior Using Agent-Based Modeling and Quorum Sensing

    NASA Technical Reports Server (NTRS)

    Schwartz, Noah; Drucker, Nick; Campbell, Kenyth

    2012-01-01

    Understanding the dynamics of radicalization, especially rapid radicalization, has become increasingly important to US policy in the past several years. Traditionally, radicalization is considered a slow process, but recent social and political events demonstrate that the process can occur quickly. Examining this rapid process in real time is impossible. However, recreating an event using modeling and simulation (M&S) allows researchers to study some of the complex dynamics associated with rapid radicalization. We propose to adapt the biological mechanism of quorum sensing as a tool to explore, or possibly explain, rapid radicalization. Due to the complex nature of quorum sensing, M&S allows us to examine events that we could not otherwise examine in real time. For this study, we employ agent-based modeling (ABM), an M&S paradigm suited to modeling group behavior. The result of this study was the successful recreation of rapid radicalization using quorum sensing. The Battle of Mogadishu was the inspiration for this model and provided the testing conditions used to explore quorum sensing and the ideas behind rapid radicalization. The final product has wider applicability, however, with quorum sensing serving as a possible tool for examining other catalytic rapid radicalization events.
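
    A minimal agent-based rendering of the quorum-sensing trigger might look as follows: each agent observes the fraction of already-radicalized agents within a sensing radius and switches state once that fraction crosses a quorum threshold. All parameters are invented; the study's actual model is far richer.

      # Minimal quorum-sensing ABM sketch (parameters invented).
      import numpy as np

      rng = np.random.default_rng(0)
      n, steps, radius, quorum = 200, 50, 0.1, 0.35
      pos = rng.uniform(0, 1, (n, 2))          # agents on a unit square
      state = rng.uniform(0, 1, n) < 0.05      # 5% initially radicalized

      for _ in range(steps):
          d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
          neighbours = (d < radius) & ~np.eye(n, dtype=bool)
          local = (neighbours & state[None, :]).sum(1) / np.maximum(neighbours.sum(1), 1)
          state = state | (local > quorum)     # quorum reached -> switch state
      print(f"radicalized fraction after {steps} steps: {state.mean():.2f}")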

  19. Time-varying causal network of the Korean financial system based on firm-specific risk premiums

    NASA Astrophysics Data System (ADS)

    Song, Jae Wook; Ko, Bonggyun; Cho, Poongjin; Chang, Woojin

    2016-09-01

    The aim of this paper is to investigate the Korean financial system based on a time-varying causal network. We discover many stylized facts by utilizing firm-specific risk premiums to measure the direction of causality from firm to firm. First, we discover that the interconnectedness of the causal network is affected by the outbreak of financial events; the co-movement of firm-specific risk premiums is strengthened after each positive event, and vice versa. Second, we find that the major sector of the Korean financial system is the Depositories, and that the financial reform in June 2011 achieved its purpose by weakening the power of risk-spillovers of Broker-Dealers. Third, we identify that the causal network is a small-world network with scale-free topology, where the power-law exponents of out-degree and negative events are more significant than those of in-degree and positive events. Lastly, we discuss how the current aspects of the causal network relate to the long-term future scenario of the KOSPI Composite index, whose direction and stability are significantly affected by the power of risk-spillovers and the power-law exponents of the degree distributions, respectively.
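
    One common way to build such a directed causal network, shown here only as a plausible sketch since the paper's exact causality measure may differ, is to run pairwise Granger causality tests on the firm-level series and keep the significant directed edges.

      # Pairwise Granger-causality network sketch (synthetic stand-in series).
      import numpy as np
      from statsmodels.tsa.stattools import grangercausalitytests

      rng = np.random.default_rng(0)
      T, firms = 300, 4
      X = rng.standard_normal((T, firms))      # stand-in risk premium series
      X[1:, 1] += 0.5 * X[:-1, 0]              # firm 0 drives firm 1

      edges = []
      for i in range(firms):
          for j in range(firms):
              if i == j:
                  continue
              # Tests whether the second column Granger-causes the first.
              res = grangercausalitytests(X[:, [j, i]], maxlag=2, verbose=False)
              p = res[2][0]["ssr_ftest"][1]    # p-value at lag 2
              if p < 0.01:
                  edges.append((i, j))         # i Granger-causes j
      print("directed edges:", edges)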

  20. An event database for rotational seismology

    NASA Astrophysics Data System (ADS)

    Salvermoser, Johannes; Hadziioannou, Celine; Hable, Sarah; Chow, Bryant; Krischer, Lion; Wassermann, Joachim; Igel, Heiner

    2016-04-01

    The ring laser sensor (G-ring) located at Wettzell, Germany, has routinely observed earthquake-induced rotational ground motions around a vertical axis since its installation in 2003. Here we present results from a recently installed event database, which is the first to provide ring laser event data in an open-access format. Based on the GCMT event catalogue and some search criteria, seismograms from the ring laser and the collocated broadband seismometer are extracted and processed. The ObsPy-based processing scheme generates plots showing waveform fits between rotation rate and transverse acceleration and extracts characteristic wavefield parameters such as peak ground motions, noise levels, Love wave phase velocities and waveform coherence. For each event, these parameters are stored in a text file (a JSON dictionary) which is easily readable and accessible on the website. The database contains >10,000 events starting in 2007 (Mw>4.5). It is updated daily and therefore provides recent events with a maximum time lag of 24 hours. The user interface allows events to be filtered by epoch, magnitude, and source area, whereupon the events are displayed on a zoomable world map. We investigate how well the rotational motions are compatible with the expectations from the surface wave magnitude scale. In addition, the website offers some Python source code examples for downloading and processing the openly accessible waveforms.
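
    The waveform fit between rotation rate and transverse acceleration rests on a classical plane-wave relation: for a transversely polarized plane wave, the two traces are proportional (up to sign convention) and the amplitude ratio is twice the horizontal phase velocity. The synthetic sketch below illustrates the estimate; the sampling rate, noise level and assumed velocity are invented.

      # Phase-velocity estimate from rotation rate vs. transverse acceleration
      # (synthetic plane-wave stand-in; no ring laser data required).
      import numpy as np

      fs, c = 20.0, 4000.0                       # Hz; assumed Love phase velocity, m/s
      t = np.arange(0, 120, 1 / fs)
      rot = 1e-8 * np.sin(2 * np.pi * 0.05 * t)  # vertical rotation rate, rad/s
      acc = 2 * c * rot + 1e-7 * np.random.default_rng(0).standard_normal(len(t))

      corr = np.corrcoef(rot, acc)[0, 1]         # zero-lag coherence proxy
      c_est = np.dot(acc, rot) / (2 * np.dot(rot, rot))  # LS amplitude ratio / 2
      print(f"correlation {corr:.3f}, estimated phase velocity {c_est:.0f} m/s")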
