Real-time monitoring of clinical processes using complex event processing and transition systems.
Meinecke, Sebastian
2014-01-01
Dependencies between tasks in clinical processes are often complex and error-prone. Our aim is to describe a new approach for the automatic derivation of clinical events identified via the behaviour of IT systems using Complex Event Processing. Furthermore we map these events on transition systems to monitor crucial clinical processes in real-time for preventing and detecting erroneous situations.
Real-Time Data Processing Systems and Products at the Alaska Earthquake Information Center
NASA Astrophysics Data System (ADS)
Ruppert, N. A.; Hansen, R. A.
2007-05-01
The Alaska Earthquake Information Center (AEIC) receives data from over 400 seismic sites located within the state boundaries and the surrounding regions and serves as a regional data center. In 2007, the AEIC reported ~20,000 seismic events, with the largest event of M6.6 in Andreanof Islands. The real-time earthquake detection and data processing systems at AEIC are based on the Antelope system from BRTT, Inc. This modular and extensible processing platform allows an integrated system complete from data acquisition to catalog production. Multiple additional modules constructed with the Antelope toolbox have been developed to fit particular needs of the AEIC. The real-time earthquake locations and magnitudes are determined within 2-5 minutes of the event occurrence. AEIC maintains a 24/7 seismologist-on-duty schedule. Earthquake alarms are based on the real- time earthquake detections. Significant events are reviewed by the seismologist on duty within 30 minutes of the occurrence with information releases issued for significant events. This information is disseminated immediately via the AEIC website, ANSS website via QDDS submissions, through e-mail, cell phone and pager notifications, via fax broadcasts and recorded voice-mail messages. In addition, automatic regional moment tensors are determined for events with M>=4.0. This information is posted on the public website. ShakeMaps are being calculated in real-time with the information currently accessible via a password-protected website. AEIC is designing an alarm system targeted for the critical lifeline operations in Alaska. AEIC maintains an extensive computer network to provide adequate support for data processing and archival. For real-time processing, AEIC operates two identical, interoperable computer systems in parallel.
NASA Astrophysics Data System (ADS)
Chiu, L.; Hao, X.; Kinter, J. L.; Stearn, G.; Aliani, M.
2017-12-01
The launch of GOES-16 series provides an opportunity to advance near real-time applications in natural hazard detection, monitoring and warning. This study demonstrates the capability and values of receiving real-time satellite-based Earth observations over a fast terrestrial networks and processing high-resolution remote sensing data in a university environment. The demonstration system includes 4 components: 1) Near real-time data receiving and processing; 2) data analysis and visualization; 3) event detection and monitoring; and 4) information dissemination. Various tools are developed and integrated to receive and process GRB data in near real-time, produce images and value-added data products, and detect and monitor extreme weather events such as hurricane, fire, flooding, fog, lightning, etc. A web-based application system is developed to disseminate near-real satellite images and data products. The images are generated with GIS-compatible format (GeoTIFF) to enable convenient use and integration in various GIS platforms. This study enhances the capacities for undergraduate and graduate education in Earth system and climate sciences, and related applications to understand the basic principles and technology in real-time applications with remote sensing measurements. It also provides an integrated platform for near real-time monitoring of extreme weather events, which are helpful for various user communities.
Using Antelope and Seiscomp in the framework of the Romanian Seismic Network
NASA Astrophysics Data System (ADS)
Marius Craiu, George; Craiu, Andreea; Marmureanu, Alexandru; Neagoe, Cristian
2014-05-01
The National Institute for Earth Physics (NIEP) operates a real-time seismic network designed to monitor the seismic activity on the Romania territory, dominated by the Vrancea intermediate-depth (60-200 km) earthquakes. The NIEP real-time network currently consists of 102 stations and two seismic arrays equipped with different high quality digitizers (Kinemetrics K2, Quanterra Q330, Quanterra Q330HR, PS6-26, Basalt), broadband and short period seismometers (CMG3ESP, CMG40T, KS2000, KS54000, KS2000, CMG3T, STS2, SH-1, S13, Mark l4c, Ranger, Gs21, Mark 22) and acceleration sensors (Episensor Kinemetrics). The primary goal of the real-time seismic network is to provide earthquake parameters from more broad-band stations with a high dynamic range, for more rapid and accurate computation of the locations and magnitudes of earthquakes. The Seedlink and AntelopeTM program packages are completely automated Antelope seismological system is run at the Data Center in Măgurele. The Antelope data acquisition and processing software is running for real-time processing and post processing. The Antelope real-time system provides automatic event detection, arrival picking, event location, and magnitude calculation. It also provides graphical displays and automatic location within near real time after a local, regional or teleseismic event has occurred SeisComP 3 is another automated system that is run at the NIEP and which provides the following features: data acquisition, data quality control, real-time data exchange and processing, network status monitoring, issuing event alerts, waveform archiving and data distribution, automatic event detection and location, easy access to relevant information about stations, waveforms, and recent earthquakes. The main goal of this paper is to compare both of these data acquisitions systems in order to improve their detection capabilities, location accuracy, magnitude and depth determination and reduce the RMS and other location errors.
Testing the causality of Hawkes processes with time reversal
NASA Astrophysics Data System (ADS)
Cordi, Marcus; Challet, Damien; Muni Toke, Ioane
2018-03-01
We show that univariate and symmetric multivariate Hawkes processes are only weakly causal: the true log-likelihoods of real and reversed event time vectors are almost equal, thus parameter estimation via maximum likelihood only weakly depends on the direction of the arrow of time. In ideal (synthetic) conditions, tests of goodness of parametric fit unambiguously reject backward event times, which implies that inferring kernels from time-symmetric quantities, such as the autocovariance of the event rate, only rarely produce statistically significant fits. Finally, we find that fitting financial data with many-parameter kernels may yield significant fits for both arrows of time for the same event time vector, sometimes favouring the backward time direction. This goes to show that a significant fit of Hawkes processes to real data with flexible kernels does not imply a definite arrow of time unless one tests it.
The improved broadband Real-Time Seismic Network in Romania
NASA Astrophysics Data System (ADS)
Neagoe, C.; Ionescu, C.
2009-04-01
Starting with 2002 the National Institute for Earth Physics (NIEP) has developed its real-time digital seismic network. This network consists of 96 seismic stations of which 48 broad band and short period stations and two seismic arrays are transmitted in real-time. The real time seismic stations are equipped with Quanterra Q330 and K2 digitizers, broadband seismometers (STS2, CMG40T, CMG 3ESP, CMG3T) and strong motions sensors Kinemetrics episensors (+/- 2g). SeedLink and AntelopeTM (installed on MARMOT) program packages are used for real-time (RT) data acquisition and exchange. The communication from digital seismic stations to the National Data Center in Bucharest is assured by 5 providers (GPRS, VPN, satellite communication, radio lease line and internet), which will assure the back-up communications lines. The processing centre runs BRTT's AntelopeTM 4.10 data acquisition and processing software on 2 workstations for real-time processing and post processing. The Antelope Real-Time System is also providing automatic event detection, arrival picking, event location and magnitude calculation. It provides graphical display and reporting within near-real-time after a local or regional event occurred. Also at the data center was implemented a system to collect macroseismic information using the internet on which macro seismic intensity maps are generated. In the near future at the data center will be install Seiscomp 3 data acquisition processing software on a workstation. The software will run in parallel with Antelope software as a back-up. The present network will be expanded in the near future. In the first half of 2009 NIEP will install 8 additional broad band stations in Romanian territory, which also will be transmitted to the data center in real time. The Romanian Seismic Network is permanently exchanging real -time waveform data with IRIS, ORFEUS and different European countries through internet. In Romania, magnitude and location of an earthquake are now available within a few minutes after the earthquake occurred. One of the greatest challenges in the near future is to provide shaking intensity maps and other ground motion parameters, within 5 minutes post-event, on the Internet and GIS-based format in order to improve emergency response, public information, preparedness and hazard mitigation
NASA Astrophysics Data System (ADS)
Vasterling, Margarete; Wegler, Ulrich; Becker, Jan; Brüstle, Andrea; Bischoff, Monika
2017-01-01
We develop and test a real-time envelope cross-correlation detector for use in seismic response plans to mitigate hazard of induced seismicity. The incoming seismological data are cross-correlated in real-time with a set of previously recorded master events. For robustness against small changes in the earthquake source locations or in the focal mechanisms we cross-correlate the envelopes of the seismograms rather than the seismograms themselves. Two sequenced detection conditions are implemented: After passing a single trace cross-correlation condition, a network cross-correlation is calculated taking amplitude ratios between stations into account. Besides detecting the earthquake and assigning it to the respective reservoir, real-time magnitudes are important for seismic response plans. We estimate the magnitudes of induced microseismicity using the relative amplitudes between master event and detected event. The real-time detector is implemented as a SeisComP3 module. We carry out offline and online performance tests using seismic monitoring data of the Insheim and Landau geothermal power plants (Upper Rhine Graben, Germany), also including blasts from a nearby quarry. The comparison of the automatic real-time catalogue with a manually processed catalogue shows, that with the implemented parameters events are always correctly assigned to the respective reservoir (4 km distance between reservoirs) or the quarry (8 km and 10 km distance, respectively, from the reservoirs). The real-time catalogue achieves a magnitude of completeness around 0.0. Four per cent of the events assigned to the Insheim reservoir and zero per cent of the Landau events are misdetections. All wrong detections are local tectonic events, whereas none are caused by seismic noise.
A Web service-based architecture for real-time hydrologic sensor networks
NASA Astrophysics Data System (ADS)
Wong, B. P.; Zhao, Y.; Kerkez, B.
2014-12-01
Recent advances in web services and cloud computing provide new means by which to process and respond to real-time data. This is particularly true of platforms built for the Internet of Things (IoT). These enterprise-scale platforms have been designed to exploit the IP-connectivity of sensors and actuators, providing a robust means by which to route real-time data feeds and respond to events of interest. While powerful and scalable, these platforms have yet to be adopted by the hydrologic community, where the value of real-time data impacts both scientists and decision makers. We discuss the use of one such IoT platform for the purpose of large-scale hydrologic measurements, showing how rapid deployment and ease-of-use allows scientists to focus on their experiment rather than software development. The platform is hardware agnostic, requiring only IP-connectivity of field devices to capture, store, process, and visualize data in real-time. We demonstrate the benefits of real-time data through a real-world use case by showing how our architecture enables the remote control of sensor nodes, thereby permitting the nodes to adaptively change sampling strategies to capture major hydrologic events of interest.
Rule-Based Event Processing and Reaction Rules
NASA Astrophysics Data System (ADS)
Paschke, Adrian; Kozlenkov, Alexander
Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
Effects of Event Knowledge in Processing Verbal Arguments
ERIC Educational Resources Information Center
Bicknell, Klinton; Elman, Jeffrey L.; Hare, Mary; McRae, Ken; Kutas, Marta
2010-01-01
This research tests whether comprehenders use their knowledge of typical events in real time to process verbal arguments. In self-paced reading and event-related brain potential (ERP) experiments, we used materials in which the likelihood of a specific patient noun ("brakes" or "spelling") depended on the combination of an agent and verb…
Effects of event knowledge in processing verbal arguments
Bicknell, Klinton; Elman, Jeffrey L.; Hare, Mary; McRae, Ken; Kutas, Marta
2010-01-01
This research tests whether comprehenders use their knowledge of typical events in real time to process verbal arguments. In self-paced reading and event-related brain potential (ERP) experiments, we used materials in which the likelihood of a specific patient noun (brakes or spelling) depended on the combination of an agent and verb (mechanic checked vs. journalist checked). Reading times were shorter at the word directly following the patient for the congruent than the incongruent items. Differential N400s were found earlier, immediately at the patient. Norming studies ruled out any account of these results based on direct relations between the agent and patient. Thus, comprehenders dynamically combine information about real-world events based on intrasentential agents and verbs, and this combination then rapidly influences online sentence interpretation. PMID:21076629
NASA Astrophysics Data System (ADS)
Roessler, D.; Weber, B.; Ellguth, E.; Spazier, J.
2017-12-01
The geometry of seismic monitoring networks, site conditions and data availability as well as monitoring targets and strategies typically impose trade-offs between data quality, earthquake detection sensitivity, false detections and alert times. Network detection capabilities typically change with alteration of the seismic noise level by human activity or by varying weather and sea conditions. To give helpful information to operators and maintenance coordinators, gempa developed a range of tools to evaluate earthquake detection and network performance including qceval, npeval and sceval. qceval is a module which analyzes waveform quality parameters in real-time and deactivates and reactivates data streams based on waveform quality thresholds for automatic processing. For example, thresholds can be defined for latency, delay, timing quality, spikes and gaps count and rms. As changes in the automatic processing have a direct influence on detection quality and speed, another tool called "npeval" was designed to calculate in real-time the expected time needed to detect and locate earthquakes by evaluating the effective network geometry. The effective network geometry is derived from the configuration of stations participating in the detection. The detection times are shown as an additional layer on the map and updated in real-time as soon as the effective network geometry changes. Yet another new tool, "sceval", is an automatic module which classifies located seismic events (Origins) in real-time. sceval evaluates the spatial distribution of the stations contributing to an Origin. It confirms or rejects the status of Origins, adds comments or leaves the Origin unclassified. The comments are passed to an additional sceval plug-in where the end user can customize event types. This unique identification of real and fake events in earthquake catalogues allows to lower network detection thresholds. In real-time monitoring situations operators can limit the processing to events with unclassified Origins, reducing their workload. Classified Origins can be treated specifically by other procedures. These modules have been calibrated and fully tested by several complex seismic monitoring networks in the region of Indonesia and Northern Chile.
NASA Astrophysics Data System (ADS)
Baziw, Erick; Verbeek, Gerald
2012-12-01
Among engineers there is considerable interest in the real-time identification of "events" within time series data with a low signal to noise ratio. This is especially true for acoustic emission analysis, which is utilized to assess the integrity and safety of many structures and is also applied in the field of passive seismic monitoring (PSM). Here an array of seismic receivers are used to acquire acoustic signals to monitor locations where seismic activity is expected: underground excavations, deep open pits and quarries, reservoirs into which fluids are injected or from which fluids are produced, permeable subsurface formations, or sites of large underground explosions. The most important element of PSM is event detection: the monitoring of seismic acoustic emissions is a continuous, real-time process which typically runs 24 h a day, 7 days a week, and therefore a PSM system with poor event detection can easily acquire terabytes of useless data as it does not identify crucial acoustic events. This paper outlines a new algorithm developed for this application, the so-called SEED™ (Signal Enhancement and Event Detection) algorithm. The SEED™ algorithm uses real-time Bayesian recursive estimation digital filtering techniques for PSM signal enhancement and event detection.
Arranging computer architectures to create higher-performance controllers
NASA Technical Reports Server (NTRS)
Jacklin, Stephen A.
1988-01-01
Techniques for integrating microprocessors, array processors, and other intelligent devices in control systems are reviewed, with an emphasis on the (re)arrangement of components to form distributed or parallel processing systems. Consideration is given to the selection of the host microprocessor, increasing the power and/or memory capacity of the host, multitasking software for the host, array processors to reduce computation time, the allocation of real-time and non-real-time events to different computer subsystems, intelligent devices to share the computational burden for real-time events, and intelligent interfaces to increase communication speeds. The case of a helicopter vibration-suppression and stabilization controller is analyzed as an example, and significant improvements in computation and throughput rates are demonstrated.
Simplifying operations with an uplink/downlink integration toolkit
NASA Technical Reports Server (NTRS)
Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine
1994-01-01
The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes, (often called closing the loop), in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. It automatically interfaces with existing real-time or non real-time sources of information, to display actual values from the telemetry data stream. This toolkit was designed to greatly simplify the user's ability to access and view telemetry data, and also provide a means to view this data in the context of the commands and ground events that are used to interpret it. A closed-loop system can prove especially useful in small missions with limited resources requiring automated monitoring tools. This paper will discuss the toolkit implementation, including design trade-offs and future plans for enhancing the automated capabilities.
Simplifying operations with an uplink/downlink integration toolkit
NASA Astrophysics Data System (ADS)
Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine
1994-11-01
The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes, (often called closing the loop), in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. It automatically interfaces with existing real-time or non real-time sources of information, to display actual values from the telemetry data stream. This toolkit was designed to greatly simplify the user's ability to access and view telemetry data, and also provide a means to view this data in the context of the commands and ground events that are used to interpret it. A closed-loop system can prove especially useful in small missions with limited resources requiring automated monitoring tools. This paper will discuss the toolkit implementation, including design trade-offs and future plans for enhancing the automated capabilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, Chantell Lynne-Marie
Traditional nuclear materials accounting does not work well for safeguards when applied to pyroprocessing. Alternate methods such as Signature Based Safeguards (SBS) are being investigated. The goal of SBS is real-time/near-real-time detection of anomalous events in the pyroprocessing facility as they could indicate loss of special nuclear material. In high-throughput reprocessing facilities, metric tons of separated material are processed that must be accounted for. Even with very low uncertainties of accountancy measurements (<0.1%) the uncertainty of the material balances is still greater than the desired level. Novel contributions of this work are as follows: (1) significant enhancement of SBS developmentmore » for the salt cleanup process by creating a new gas sparging process model, selecting sensors to monitor normal operation, identifying safeguards-significant off-normal scenarios, and simulating those off-normal events and generating sensor output; (2) further enhancement of SBS development for the electrorefiner by simulating off-normal events caused by changes in salt concentration and identifying which conditions lead to Pu and Cm not tracking throughout the rest of the system; and (3) new contribution in applying statistical techniques to analyze the signatures gained from these two models to help draw real-time conclusions on anomalous events.« less
Controlling Real-Time Processes On The Space Station With Expert Systems
NASA Astrophysics Data System (ADS)
Leinweber, David; Perry, John
1987-02-01
Many aspects of space station operations involve continuous control of real-time processes. These processes include electrical power system monitoring, propulsion system health and maintenance, environmental and life support systems, space suit checkout, on-board manufacturing, and servicing of attached vehicles such as satellites, shuttles, orbital maneuvering vehicles, orbital transfer vehicles and remote teleoperators. Traditionally, monitoring of these critical real-time processes has been done by trained human experts monitoring telemetry data. However, the long duration of space station missions and the high cost of crew time in space creates a powerful economic incentive for the development of highly autonomous knowledge-based expert control procedures for these space stations. In addition to controlling the normal operations of these processes, the expert systems must also be able to quickly respond to anomalous events, determine their cause and initiate corrective actions in a safe and timely manner. This must be accomplished without excessive diversion of system resources from ongoing control activities and any events beyond the scope of the expert control and diagnosis functions must be recognized and brought to the attention of human operators. Real-time sensor based expert systems (as opposed to off-line, consulting or planning systems receiving data via the keyboard) pose particular problems associated with sensor failures, sensor degradation and data consistency, which must be explicitly handled in an efficient manner. A set of these systems must also be able to work together in a cooperative manner. This paper describes the requirements for real-time expert systems in space station control, and presents prototype implementations of space station expert control procedures in PICON (process intelligent control). PICON is a real-time expert system shell which operates in parallel with distributed data acquisition systems. It incorporates a specialized inference engine with a specialized scheduling portion specifically designed to match the allocation of system resources with the operational requirements of real-time control systems. Innovative knowledge engineering techniques used in PICON to facilitate the development of real-time sensor-based expert systems which use the special features of the inference engine are illustrated in the prototype examples.
NASA Astrophysics Data System (ADS)
Hagerty, M. T.; Lomax, A.; Hellman, S. B.; Whitmore, P.; Weinstein, S.; Hirshorn, B. F.; Knight, W. R.
2015-12-01
Tsunami warning centers must rapidly decide whether an earthquake is likely to generate a destructive tsunami in order to issue a tsunami warning quickly after a large event. For very large events (Mw > 8 or so), magnitude and location alone are sufficient to warrant an alert. However, for events of smaller magnitude (e.g., Mw ~ 7.5), particularly for so-called "tsunami earthquakes", magnitude alone is insufficient to issue an alert and other measurements must be rapidly made and used to assess tsunamigenic potential. The Tsunami Information technology Modernization (TIM) is a National Oceanic and Atmospheric Administration (NOAA) project to update and standardize the earthquake and tsunami monitoring systems currently employed at the U.S. Tsunami Warning Centers in Ewa Beach, Hawaii (PTWC) and Palmer, Alaska (NTWC). We (ISTI) are responsible for implementing the seismic monitoring components in this new system, including real-time seismic data collection and seismic processing. The seismic data processor includes a variety of methods aimed at real-time discrimination of tsunamigenic events, including: Mwp, Me, slowness (Theta), W-phase, mantle magnitude (Mm), array processing and finite-fault inversion. In addition, it contains the ability to designate earthquake scenarios and play the resulting synthetic seismograms through the processing system. Thus, it is also a convenient tool that integrates research and monitoring and may be used to calibrate and tune the real-time monitoring system. Here we show results of the automated processing system for a large dataset of subduction zone earthquakes containing recent tsunami earthquakes and we examine the accuracy of the various discrimation methods and discuss issues related to their successful real-time application.
NASA Astrophysics Data System (ADS)
Vasterling, Margarete; Wegler, Ulrich; Bruestle, Andrea; Becker, Jan
2016-04-01
Real time information on the locations and magnitudes of induced earthquakes is essential for response plans based on the magnitude frequency distribution. We developed and tested a real time cross-correlation detector focusing on induced microseismicity in deep geothermal reservoirs. The incoming seismological data are cross-correlated in real time with a set of known master events. We use the envelopes of the seismograms rather than the seismograms themselves to account for small changes in the source locations or in the focal mechanisms. Two different detection conditions are implemented: After first passing a single trace correlation condition, secondly a network correlation is calculated taking the amplitude information of the seismic network into account. The magnitude is estimated by using the respective ratio of the maximum amplitudes of the master event and the detected event. The detector is implemented as a real time tool and put into practice as a SeisComp3 module, an established open source software for seismological real time data handling and analysis. We validated the reliability and robustness of the detector by an offline playback test using four month of data from monitoring the power plant in Insheim (Upper Rhine Graben, SW Germany). Subsequently, in October 2013 the detector was installed as real time monitoring system within the project "MAGS2 - Microseismic Activity of Geothermal Systems". Master events from the two neighboring geothermal power plants in Insheim and Landau and two nearby quarries are defined. After detection, manual phase determination and event location are performed at the local seismological survey of the Geological Survey and Mining Authority of Rhineland-Palatinate. Until November 2015 the detector identified 454 events out of which 95% were assigned correctly to the respective source. 5% were misdetections caused by local tectonic events. To evaluate the completeness of the automatically obtained catalogue, it is compared to the event catalogue of the Seismological Service of Southwestern Germany and to the events reported by the company tasked with seismic monitoring of the Insheim power plant. Events missed by the cross-correlation detector are generally very small. They are registered at too few stations to meet the detection criteria. Most of these small events were not locatable. The automatic catalogue has a magnitude of completeness around 0.0 and is significantly more detailed than the catalogue from standard processing of the Seismological Service of Southwestern Germany for this region. For events in the magnitude range of the master event the magnitude estimated from the amplitude ratio reproduces the local magnitude well. For weaker events there tends to be a small offset. Altogether, the developed real time cross correlation detector provides robust detections with reliable association of the events to the respective sources and valid magnitude estimates. Thus, it provides input parameters for the mitigation of seismic hazard by using response plans in real time.
Design of an FPGA-Based Algorithm for Real-Time Solutions of Statistics-Based Positioning
DeWitt, Don; Johnson-Williams, Nathan G.; Miyaoka, Robert S.; Li, Xiaoli; Lockhart, Cate; Lewellen, Tom K.; Hauck, Scott
2010-01-01
We report on the implementation of an algorithm and hardware platform to allow real-time processing of the statistics-based positioning (SBP) method for continuous miniature crystal element (cMiCE) detectors. The SBP method allows an intrinsic spatial resolution of ~1.6 mm FWHM to be achieved using our cMiCE design. Previous SBP solutions have required a postprocessing procedure due to the computation and memory intensive nature of SBP. This new implementation takes advantage of a combination of algebraic simplifications, conversion to fixed-point math, and a hierarchal search technique to greatly accelerate the algorithm. For the presented seven stage, 127 × 127 bin LUT implementation, these algorithm improvements result in a reduction from >7 × 106 floating-point operations per event for an exhaustive search to < 5 × 103 integer operations per event. Simulations show nearly identical FWHM positioning resolution for this accelerated SBP solution, and positioning differences of <0.1 mm from the exhaustive search solution. A pipelined field programmable gate array (FPGA) implementation of this optimized algorithm is able to process events in excess of 250 K events per second, which is greater than the maximum expected coincidence rate for an individual detector. In contrast with all detectors being processed at a centralized host, as in the current system, a separate FPGA is available at each detector, thus dividing the computational load. These methods allow SBP results to be calculated in real-time and to be presented to the image generation components in real-time. A hardware implementation has been developed using a commercially available prototype board. PMID:21197135
NASA Technical Reports Server (NTRS)
Pordes, Ruth (Editor)
1989-01-01
Papers on real-time computer applications in nuclear, particle, and plasma physics are presented, covering topics such as expert systems tactics in testing FASTBUS segment interconnect modules, trigger control in a high energy physcis experiment, the FASTBUS read-out system for the Aleph time projection chamber, a multiprocessor data acquisition systems, DAQ software architecture for Aleph, a VME multiprocessor system for plasma control at the JT-60 upgrade, and a multiasking, multisinked, multiprocessor data acquisition front end. Other topics include real-time data reduction using a microVAX processor, a transputer based coprocessor for VEDAS, simulation of a macropipelined multi-CPU event processor for use in FASTBUS, a distributed VME control system for the LISA superconducting Linac, a distributed system for laboratory process automation, and a distributed system for laboratory process automation. Additional topics include a structure macro assembler for the event handler, a data acquisition and control system for Thomson scattering on ATF, remote procedure execution software for distributed systems, and a PC-based graphic display real-time particle beam uniformity.
Towards marine seismological Network: real time small aperture seismic array
NASA Astrophysics Data System (ADS)
Ilinskiy, Dmitry
2017-04-01
Most powerful and dangerous seismic events are generated in underwater subduction zones. Existing seismological networks are based on land seismological stations. Increased demands for accuracy of location, magnitude, rupture process of coming earthquakes and at the same time reduction of data processing time require information from seabed seismic stations located near the earthquake generation area. Marine stations provide important contribution for clarification of the tectonic settings in most active subduction zones of the world. Early warning system for subduction zone area is based on marine seabed array which located near the area of most hazardous seismic zone in the region. Fast track processing for location of the earthquake hypocenter and energy takes place in buoy surface unit. Information about detected and located earthquake reaches the onshore seismological center earlier than the first break waves from the same earthquake will reach the nearest onshore seismological station. Implementation of small aperture array is based on existed and shown a good proven performance and costs effective solutions such as weather moored buoy and self-pop up autonomous seabed seismic nodes. Permanent seabed system for real-time operation has to be installed in deep sea waters far from the coast. Seabed array consists of several self-popup seismological stations which continuously acquire the data, detect the events of certain energy class and send detected event parameters to the surface buoy via acoustic link. Surface buoy unit determine the earthquake location by receiving the event parameters from seabed units and send such information in semi-real time to the onshore seismological center via narrow band satellite link. Upon the request from the cost the system could send wave form of events of certain energy class, bottom seismic station battery status and other environmental parameters. When the battery life of particular seabed unit is close to became empty, the seabed unit is switching into sleep mode and send that information to surface buoy and father to the onshore data center. Then seabed unit can wait for the vessel of opportunity for recovery of seabed unit to sea surface and replacing seabed station to another one with fresh batteries. All collected permanent seismic data by seabed unit could than downloaded for father processing and analysis. In our presentation we will demonstrate the several working prototypes of proposed system such as real time cable broad band seismological station and real time buoy seabed seismological station.
Non-stationary least-squares complex decomposition for microseismic noise attenuation
NASA Astrophysics Data System (ADS)
Chen, Yangkang
2018-06-01
Microseismic data processing and imaging are crucial for subsurface real-time monitoring during hydraulic fracturing process. Unlike the active-source seismic events or large-scale earthquake events, the microseismic event is usually of very small magnitude, which makes its detection challenging. The biggest trouble of microseismic data is the low signal-to-noise ratio issue. Because of the small energy difference between effective microseismic signal and ambient noise, the effective signals are usually buried in strong random noise. I propose a useful microseismic denoising algorithm that is based on decomposing a microseismic trace into an ensemble of components using least-squares inversion. Based on the predictive property of useful microseismic event along the time direction, the random noise can be filtered out via least-squares fitting of multiple damping exponential components. The method is flexible and almost automated since the only parameter needed to be defined is a decomposition number. I use some synthetic and real data examples to demonstrate the potential of the algorithm in processing complicated microseismic data sets.
The psychophysiology of real-time financial risk processing.
Lo, Andrew W; Repin, Dmitry V
2002-04-01
A longstanding controversy in economics and finance is whether financial markets are governed by rational forces or by emotional responses. We study the importance of emotion in the decision-making process of professional securities traders by measuring their physiological characteristics (e.g., skin conductance, blood volume pulse, etc.) during live trading sessions while simultaneously capturing real-time prices from which market events can be detected. In a sample of 10 traders, we find statistically significant differences in mean electrodermal responses during transient market events relative to no-event control periods, and statistically significant mean changes in cardiovascular variables during periods of heightened market volatility relative to normal-volatility control periods. We also observe significant differences in these physiological responses across the 10 traders that may be systematically related to the traders' levels of experience.
The UNAVCO Real-time GPS Data Processing System and Community Reference Data Sets
NASA Astrophysics Data System (ADS)
Sievers, C.; Mencin, D.; Berglund, H. T.; Blume, F.; Meertens, C. M.; Mattioli, G. S.
2013-12-01
UNAVCO has constructed a real-time GPS (RT-GPS) network of 420 GPS stations. The majority of the streaming stations come from the EarthScope Plate Boundary Observatory (PBO) through an NSF-ARRA funded Cascadia Upgrade Initiative that upgraded 100 backbone stations throughout the PBO footprint and 282 stations focused in the Pacific Northwest. Additional contributions from NOAA (~30 stations in Southern California) and the USGS (8 stations at Yellowstone) account for the other real-time stations. Based on community based outcomes of a workshop focused on real-time GPS position data products and formats hosted by UNAVCO in Spring of 2011, UNAVCO now provides real-time PPP positions for all 420 stations using Trimble's PIVOT software and for 50 stations using TrackRT at the volcanic centers located at Yellowstone (Figure 1 shows an example ensemble of TrackRT networks used in processing the Yellowstone data), Mt St Helens, and Montserrat. The UNAVCO real-time system has the potential to enhance our understanding of earthquakes, seismic wave propagation, volcanic eruptions, magmatic intrusions, movement of ice, landslides, and the dynamics of the atmosphere. Beyond its increasing uses for science and engineering, RT-GPS has the potential to provide early warning of hazards to emergency managers, utilities, other infrastructure managers, first responders and others. With the goal of characterizing stability and improving software and higher level products based on real-time GPS time series, UNAVCO is developing an open community standard data set where data processors can provide solutions based on common sets of RT-GPS data which simulate real world scenarios and events. UNAVCO is generating standard data sets for playback that include not only real and synthetic events but also background noise, antenna movement (e.g., steps, linear trends, sine waves, and realistic earthquake-like motions), receiver drop out and online return, interruption of communications (such as, bulk regional failures due to specific carriers during an actual event), satellites rising and setting, various constellation outages and differences in performance between real-time and simulated (retroactive) real-time. We present an overview of the UNAVCO RT-GPS system, a comparison of the UNAVCO generated real-time data products, and an overview of available common data sets.
APNEA list mode data acquisition and real-time event processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hogle, R.A.; Miller, P.; Bramblett, R.L.
1997-11-01
The LMSC Active Passive Neutron Examinations and Assay (APNEA) Data Logger is a VME-based data acquisition system using commercial-off-the-shelf hardware with the application-specific software. It receives TTL inputs from eighty-eight {sup 3}He detector tubes and eight timing signals. Two data sets are generated concurrently for each acquisition session: (1) List Mode recording of all detector and timing signals, timestamped to 3 microsecond resolution; (2) Event Accumulations generated in real-time by counting events into short (tens of microseconds) and long (seconds) time bins following repetitive triggers. List Mode data sets can be post-processed to: (1) determine the optimum time bins formore » TRU assay of waste drums, (2) analyze a given data set in several ways to match different assay requirements and conditions and (3) confirm assay results by examining details of the raw data. Data Logger events are processed and timestamped by an array of 15 TMS320C40 DSPs and delivered to an embedded controller (PowerPC604) for interim disk storage. Three acquisition modes, corresponding to different trigger sources are provided. A standard network interface to a remote host system (Windows NT or SunOS) provides for system control, status, and transfer of previously acquired data. 6 figs.« less
[Development of medical supplies management system].
Zhong, Jianping; Shen, Beijun; Zhu, Huili
2012-11-01
This paper adopts advanced information technology to manage medical supplies, in order to improve the medical supplies management level and reduce material cost. It develops a Medical Supplies Management System with B/S and C/S mixed structure, optimizing material management process, building large equipment performance evaluation model, providing interface solution with HIS, and realizing real-time information briefing of high value material's consumption. The medical materials are managed during its full life-cycle. The material consumption of the clinical departments is monitored real-timely. Through the closed-loop management with pre-event budget, mid-event control and after-event analysis, it realizes the final purpose of management yielding benefit.
Pappachan, Bobby K; Caesarendra, Wahyu; Tjahjowidodo, Tegoeh; Wijaya, Tomi
2017-01-01
Process monitoring using indirect methods relies on the usage of sensors. Using sensors to acquire vital process related information also presents itself with the problem of big data management and analysis. Due to uncertainty in the frequency of events occurring, a higher sampling rate is often used in real-time monitoring applications to increase the chances of capturing and understanding all possible events related to the process. Advanced signal processing methods are used to further decipher meaningful information from the acquired data. In this research work, power spectrum density (PSD) of sensor data acquired at sampling rates between 40–51.2 kHz was calculated and the corelation between PSD and completed number of cycles/passes is presented. Here, the progress in number of cycles/passes is the event this research work intends to classify and the algorithm used to compute PSD is Welch’s estimate method. A comparison between Welch’s estimate method and statistical methods is also discussed. A clear co-relation was observed using Welch’s estimate to classify the number of cycles/passes. The paper also succeeds in classifying vibration signal generated by the spindle from the vibration signal acquired during finishing process. PMID:28556809
NASA Astrophysics Data System (ADS)
Jackson, Michael; Zimakov, Leonid; Moessmer, Matthias
2015-04-01
Scientific GNSS networks are moving towards a model of real-time data acquisition, epoch-by-epoch storage integrity, and on-board real-time position and displacement calculations. This new paradigm allows the integration of real-time, high-rate GNSS displacement information with acceleration and velocity data to create very high-rate displacement records. The mating of these two instruments allows the creation of a new, very high-rate (200 Hz) displacement observable that has the full-scale displacement characteristics of GNSS and high-precision dynamic motions of seismic technologies. It is envisioned that these new observables can be used for earthquake early warning studies, volcano monitoring, and critical infrastructure monitoring applications. Our presentation will focus on the characteristics of GNSS, seismic, and strong motion sensors in high dynamic environments, including historic earthquakes replicated on a shake table over a range of displacements and frequencies. We will explore the optimum integration of these sensors from a filtering perspective including simple harmonic impulses over varying frequencies and amplitudes and under the dynamic conditions of various earthquake scenarios. We will also explore the tradeoffs between various GNSS processing schemes including real-time precise point positioning (PPP) and real-time kinematic (RTK) as applied to seismogeodesy. In addition we will discuss implementation of a Rapid Seismic Event Notification System that provides quick delivery of digital data from seismic stations to the acquisition and processing center and a full data integrity model for real-time earthquake notification that provides warning prior to significant ground shaking.
Analysis of real-time vibration data
Safak, E.
2005-01-01
In recent years, a few structures have been instrumented to provide continuous vibration data in real time, recording not only large-amplitude motions generated by extreme loads, but also small-amplitude motions generated by ambient loads. The main objective in continuous recording is to track any changes in structural characteristics, and to detect damage after an extreme event, such as an earthquake or explosion. The Fourier-based spectral analysis methods have been the primary tool to analyze vibration data from structures. In general, such methods do not work well for real-time data, because real-time data are mainly composed of ambient vibrations with very low amplitudes and signal-to-noise ratios. The long duration, linearity, and the stationarity of ambient data, however, allow us to utilize statistical signal processing tools, which can compensate for the adverse effects of low amplitudes and high noise. The analysis of real-time data requires tools and techniques that can be applied in real-time; i.e., data are processed and analyzed while being acquired. This paper presents some of the basic tools and techniques for processing and analyzing real-time vibration data. The topics discussed include utilization of running time windows, tracking mean and mean-square values, filtering, system identification, and damage detection.
Synthetic Foveal Imaging Technology
NASA Technical Reports Server (NTRS)
Hoenk, Michael; Monacos, Steve; Nikzad, Shouleh
2009-01-01
Synthetic Foveal imaging Technology (SyFT) is an emerging discipline of image capture and image-data processing that offers the prospect of greatly increased capabilities for real-time processing of large, high-resolution images (including mosaic images) for such purposes as automated recognition and tracking of moving objects of interest. SyFT offers a solution to the image-data processing problem arising from the proposed development of gigapixel mosaic focal-plane image-detector assemblies for very wide field-of-view imaging with high resolution for detecting and tracking sparse objects or events within narrow subfields of view. In order to identify and track the objects or events without the means of dynamic adaptation to be afforded by SyFT, it would be necessary to post-process data from an image-data space consisting of terabytes of data. Such post-processing would be time-consuming and, as a consequence, could result in missing significant events that could not be observed at all due to the time evolution of such events or could not be observed at required levels of fidelity without such real-time adaptations as adjusting focal-plane operating conditions or aiming of the focal plane in different directions to track such events. The basic concept of foveal imaging is straightforward: In imitation of a natural eye, a foveal-vision image sensor is designed to offer higher resolution in a small region of interest (ROI) within its field of view. Foveal vision reduces the amount of unwanted information that must be transferred from the image sensor to external image-data-processing circuitry. The aforementioned basic concept is not new in itself: indeed, image sensors based on these concepts have been described in several previous NASA Tech Briefs articles. Active-pixel integrated-circuit image sensors that can be programmed in real time to effect foveal artificial vision on demand are one such example. What is new in SyFT is a synergistic combination of recent advances in foveal imaging, computing, and related fields, along with a generalization of the basic foveal-vision concept to admit a synthetic fovea that is not restricted to one contiguous region of an image.
Real-time optimizations for integrated smart network camera
NASA Astrophysics Data System (ADS)
Desurmont, Xavier; Lienard, Bruno; Meessen, Jerome; Delaigle, Jean-Francois
2005-02-01
We present an integrated real-time smart network camera. This system is composed of an image sensor, an embedded PC based electronic card for image processing and some network capabilities. The application detects events of interest in visual scenes, highlights alarms and computes statistics. The system also produces meta-data information that could be shared between other cameras in a network. We describe the requirements of such a system and then show how the design of the system is optimized to process and compress video in real-time. Indeed, typical video-surveillance algorithms as background differencing, tracking and event detection should be highly optimized and simplified to be used in this hardware. To have a good adequation between hardware and software in this light embedded system, the software management is written on top of the java based middle-ware specification established by the OSGi alliance. We can integrate easily software and hardware in complex environments thanks to the Java Real-Time specification for the virtual machine and some network and service oriented java specifications (like RMI and Jini). Finally, we will report some outcomes and typical case studies of such a camera like counter-flow detection.
Temporal compression in episodic memory for real-life events.
Jeunehomme, Olivier; Folville, Adrien; Stawarczyk, David; Van der Linden, Martial; D'Argembeau, Arnaud
2018-07-01
Remembering an event typically takes less time than experiencing it, suggesting that episodic memory represents past experience in a temporally compressed way. Little is known, however, about how the continuous flow of real-life events is summarised in memory. Here we investigated the nature and determinants of temporal compression by directly comparing memory contents with the objective timing of events as measured by a wearable camera. We found that episodic memories consist of a succession of moments of prior experience that represent events with varying compression rates, such that the density of retrieved information is modulated by goal processing and perceptual changes. Furthermore, the results showed that temporal compression rates remain relatively stable over one week and increase after a one-month delay, particularly for goal-related events. These data shed new light on temporal compression in episodic memory and suggest that compression rates are adaptively modulated to maintain current goal-relevant information.
NASA Astrophysics Data System (ADS)
Tavakoli, S.; Poslad, S.; Fruhwirth, R.; Winter, M.
2012-04-01
This paper introduces an application of a novel EventTracker platform for instantaneous Sensitivity Analysis (SA) of large scale real-time geo-information. Earth disaster management systems demand high quality information to aid a quick and timely response to their evolving environments. The idea behind the proposed EventTracker platform is the assumption that modern information management systems are able to capture data in real-time and have the technological flexibility to adjust their services to work with specific sources of data/information. However, to assure this adaptation in real time, the online data should be collected, interpreted, and translated into corrective actions in a concise and timely manner. This can hardly be handled by existing sensitivity analysis methods because they rely on historical data and lazy processing algorithms. In event-driven systems, the effect of system inputs on its state is of value, as events could cause this state to change. This 'event triggering' situation underpins the logic of the proposed approach. Event tracking sensitivity analysis method describes the system variables and states as a collection of events. The higher the occurrence of an input variable during the trigger of event, the greater its potential impact will be on the final analysis of the system state. Experiments were designed to compare the proposed event tracking sensitivity analysis with existing Entropy-based sensitivity analysis methods. The results have shown a 10% improvement in a computational efficiency with no compromise for accuracy. It has also shown that the computational time to perform the sensitivity analysis is 0.5% of the time required compared to using the Entropy-based method. The proposed method has been applied to real world data in the context of preventing emerging crises at drilling rigs. One of the major purposes of such rigs is to drill boreholes to explore oil or gas reservoirs with the final scope of recovering the content of such reservoirs; both in onshore regions as well as in offshore regions. Drilling a well is always guided by technical, economic and security constraints to prevent crew, equipment and environment from injury, damage and pollution. Although risk assessment and local practice provides a high degree of security, uncertainty is given by the behaviour of the formation which may cause crucial situations at the rig. To overcome such uncertainties real-time sensor measurements form a base to predict and thus prevent such crises, the proposed method supports the identification of the data necessary for that.
Feasibility of Using Alternate Fuels in the U.S. Antarctic Program: Initial Assessment
2017-09-01
Figures 1 Platts’ Jet A fuel prices per gallons from 1990 to 2013. Platts’ pricing is a real time market process for determining the cost of fossil ... fossil fuels. This process takes into account supply, demand, and current events. Since 1909, Platts has been reporting these real time prices and...refinery to upload NSF’s fuel to the day it arrives at a destination where it will per- form work for a different customer). Over the past decade, day
Real time analysis with the upgraded LHCb trigger in Run III
NASA Astrophysics Data System (ADS)
Szumlak, Tomasz
2017-10-01
The current LHCb trigger system consists of a hardware level, which reduces the LHC bunch-crossing rate of 40 MHz to 1.1 MHz, a rate at which the entire detector is read out. A second level, implemented in a farm of around 20k parallel processing CPUs, the event rate is reduced to around 12.5 kHz. The LHCb experiment plans a major upgrade of the detector and DAQ system in the LHC long shutdown II (2018-2019). In this upgrade, a purely software based trigger system is being developed and it will have to process the full 30 MHz of bunch crossings with inelastic collisions. LHCb will also receive a factor of 5 increase in the instantaneous luminosity, which further contributes to the challenge of reconstructing and selecting events in real time with the CPU farm. We discuss the plans and progress towards achieving efficient reconstruction and selection with a 30 MHz throughput. Another challenge is to exploit the increased signal rate that results from removing the 1.1 MHz readout bottleneck, combined with the higher instantaneous luminosity. Many charm hadron signals can be recorded at up to 50 times higher rate. LHCb is implementing a new paradigm in the form of real time data analysis, in which abundant signals are recorded in a reduced event format that can be fed directly to the physics analyses. These data do not need any further offline event reconstruction, which allows a larger fraction of the grid computing resources to be devoted to Monte Carlo productions. We discuss how this real-time analysis model is absolutely critical to the LHCb upgrade, and how it will evolve during Run-II.
ERIC Educational Resources Information Center
Dallas, Andrea; DeDe, Gayle; Nicol, Janet
2013-01-01
The current study employed a neuro-imaging technique, Event-Related Potentials (ERP), to investigate real-time processing of sentences containing filler-gap dependencies by late-learning speakers of English as a second language (L2) with a Chinese native language background. An individual differences approach was also taken to examine the role of…
Selective attention in multi-chip address-event systems.
Bartolozzi, Chiara; Indiveri, Giacomo
2009-01-01
Selective attention is the strategy used by biological systems to cope with the inherent limits in their available computational resources, in order to efficiently process sensory information. The same strategy can be used in artificial systems that have to process vast amounts of sensory data with limited resources. In this paper we present a neuromorphic VLSI device, the "Selective Attention Chip" (SAC), which can be used to implement these models in multi-chip address-event systems. We also describe a real-time sensory-motor system, which integrates the SAC with a dynamic vision sensor and a robotic actuator. We present experimental results from each component in the system, and demonstrate how the complete system implements a real-time stimulus-driven selective attention model.
Monitoring activities of satellite data processing services in real-time with SDDS Live Monitor
NASA Astrophysics Data System (ADS)
Duc Nguyen, Minh
2017-10-01
This work describes Live Monitor, the monitoring subsystem of SDDS - an automated system for space experiment data processing, storage, and distribution created at SINP MSU. Live Monitor allows operators and developers of satellite data centers to identify errors occurred in data processing quickly and to prevent further consequences caused by the errors. All activities of the whole data processing cycle are illustrated via a web interface in real-time. Notification messages are delivered to responsible people via emails and Telegram messenger service. The flexible monitoring mechanism implemented in Live Monitor allows us to dynamically change and control events being shown on the web interface on our demands. Physicists, whose space weather analysis models are functioning upon satellite data provided by SDDS, can use the developed RESTful API to monitor their own events and deliver customized notification messages by their needs.
Abnormal Condition Monitoring of Workpieces Based on RFID for Wisdom Manufacturing Workshops.
Zhang, Cunji; Yao, Xifan; Zhang, Jianming
2015-12-03
Radio Frequency Identification (RFID) technology has been widely used in many fields. However, previous studies have mainly focused on product life cycle tracking, and there are few studies on real-time status monitoring of workpieces in manufacturing workshops. In this paper, a wisdom manufacturing model is introduced, a sensing-aware environment for a wisdom manufacturing workshop is constructed, and RFID event models are defined. A synthetic data cleaning method is applied to clean the raw RFID data. The Complex Event Processing (CEP) technology is adopted to monitor abnormal conditions of workpieces in real time. The RFID data cleaning method and data mining technology are examined by simulation and physical experiments. The results show that the synthetic data cleaning method preprocesses data well. The CEP based on the Rifidi(®) Edge Server technology completed abnormal condition monitoring of workpieces in real time. This paper reveals the importance of RFID spatial and temporal data analysis in real-time status monitoring of workpieces in wisdom manufacturing workshops.
Abnormal Condition Monitoring of Workpieces Based on RFID for Wisdom Manufacturing Workshops
Zhang, Cunji; Yao, Xifan; Zhang, Jianming
2015-01-01
Radio Frequency Identification (RFID) technology has been widely used in many fields. However, previous studies have mainly focused on product life cycle tracking, and there are few studies on real-time status monitoring of workpieces in manufacturing workshops. In this paper, a wisdom manufacturing model is introduced, a sensing-aware environment for a wisdom manufacturing workshop is constructed, and RFID event models are defined. A synthetic data cleaning method is applied to clean the raw RFID data. The Complex Event Processing (CEP) technology is adopted to monitor abnormal conditions of workpieces in real time. The RFID data cleaning method and data mining technology are examined by simulation and physical experiments. The results show that the synthetic data cleaning method preprocesses data well. The CEP based on the Rifidi® Edge Server technology completed abnormal condition monitoring of workpieces in real time. This paper reveals the importance of RFID spatial and temporal data analysis in real-time status monitoring of workpieces in wisdom manufacturing workshops. PMID:26633418
NASA Astrophysics Data System (ADS)
Gajda, Janusz; Wyłomańska, Agnieszka; Zimroz, Radosław
2016-12-01
Many real data exhibit behavior adequate to subdiffusion processes. Very often it is manifested by so-called ;trapping events;. The visible evidence of subdiffusion we observe not only in financial time series but also in technical data. In this paper we propose a model which can be used for description of such kind of data. The model is based on the continuous time autoregressive time series with stable noise delayed by the infinitely divisible inverse subordinator. The proposed system can be applied to real datasets with short-time dependence, visible jumps and mentioned periods of stagnation. In this paper we extend the theoretical considerations in analysis of subordinated processes and propose a new model that exhibits mentioned properties. We concentrate on the main characteristics of the examined subordinated process expressed mainly in the language of the measures of dependence which are main tools used in statistical investigation of real data. We present also the simulation procedure of the considered system and indicate how to estimate its parameters. The theoretical results we illustrate by the analysis of real technical data.
Real-Time Field Data Acquisition and Remote Sensor Reconfiguration Using Scientific Workflows
NASA Astrophysics Data System (ADS)
Silva, F.; Mehta, G.; Vahi, K.; Deelman, E.
2010-12-01
Despite many technological advances, field data acquisition still consists of several manual and laborious steps. Once sensors and data loggers are deployed in the field, scientists often have to periodically return to their study sites in order to collect their data. Even when field deployments have a way to communicate and transmit data back to the laboratory (e.g. by using a satellite or a cellular modem), data analysis still requires several repetitive steps. Because data often needs to be processed and inspected manually, there is usually a significant time delay between data collection and analysis. As a result, sensor failures that could be detected almost in real-time are not noted for weeks or months. Finally, sensor reconfiguration as a result of interesting events in the field is still done manually, making rapid response nearly impossible and causing important data to be missed. By working closely with scientists from different application domains, we identified several tasks that, if automated, could greatly improve the way field data is collected, processed, and distributed. Our goals are to enable real-time data collection and validation, automate sensor reconfiguration in response to interest events in the field, and allow scientists to easily automate their data processing. We began our design by employing the Sensor Processing and Acquisition Network (SPAN) architecture. SPAN uses an embedded processor in the field to coordinate sensor data acquisition from analog and digital sensors by interfacing with different types of devices and data loggers. SPAN is also able to interact with various types of communication devices in order to provide real-time communication to and from field sites. We use the Pegasus Workflow Management System (Pegasus WMS) to coordinate data collection and control sensors and deployments in the field. Because scientific workflows can be used to automate multi-step, repetitive tasks, scientists can create simple workflows to download sensor data, perform basic QA/QC, and identify events of interest as well as sensor and data logger failures almost in real-time. As a result of this automation, scientists can quickly be notified (e.g. via e-mail or SMS) so that important events are not missed. In addition, Pegasus WMS has the ability to abstract the execution environment of where programs run. By placing a Pegasus WMS agent inside an embedded processor in the field, we allow scientists to ship simple computational models to the field, enabling remote data processing at the field site. As an example, scientists can send an image processing algorithm to the field so that the embedded processor can analyze images, thus reducing the bandwidth necessary for communication. In addition, when real-time communication to the laboratory is not possible, scientists can create simple computational models that can be run on sensor nodes autonomously, monitoring sensor data and making adjustments without any human intervention. We believe our system lowers the bar for the adoption of reconfigurable sensor networks by field scientists. In this poster, we will show how this technology can be used to provide not only data acquisition, but also real-time data validation and sensor reconfiguration.
Neuromorphic Event-Based 3D Pose Estimation
Reverter Valeiras, David; Orchard, Garrick; Ieng, Sio-Hoi; Benosman, Ryad B.
2016-01-01
Pose estimation is a fundamental step in many artificial vision tasks. It consists of estimating the 3D pose of an object with respect to a camera from the object's 2D projection. Current state of the art implementations operate on images. These implementations are computationally expensive, especially for real-time applications. Scenes with fast dynamics exceeding 30–60 Hz can rarely be processed in real-time using conventional hardware. This paper presents a new method for event-based 3D object pose estimation, making full use of the high temporal resolution (1 μs) of asynchronous visual events output from a single neuromorphic camera. Given an initial estimate of the pose, each incoming event is used to update the pose by combining both 3D and 2D criteria. We show that the asynchronous high temporal resolution of the neuromorphic camera allows us to solve the problem in an incremental manner, achieving real-time performance at an update rate of several hundreds kHz on a conventional laptop. We show that the high temporal resolution of neuromorphic cameras is a key feature for performing accurate pose estimation. Experiments are provided showing the performance of the algorithm on real data, including fast moving objects, occlusions, and cases where the neuromorphic camera and the object are both in motion. PMID:26834547
Process mining techniques: an application to time management
NASA Astrophysics Data System (ADS)
Khowaja, Ali Raza
2018-04-01
In an environment people have to make sure that all of their work are completed within a given time in accordance with its quality. In order to achieve the real phenomenon of process mining one needs to understand all of these processes in a detailed manner. Personal Information and communication has always been a highlighting issue on internet but for now information and communication tools within factual life refers to their daily schedule, location analysis, environmental analysis and, more generally, social media applications support these systems which makes data available for data analysis generated through event logs, but also for process analysis which combines environmental and location analysis. Process mining can be used to exploit all these real live processes with the help of the event logs which are already available in those datasets through user censored data or may be user labeled data. These processes could be used to redesign a user's flow and understand all these processes in a bit more detailed manner. In order to increase the quality of each of the processes that we go through our daily lives is to give a closer look to each of the processes and after analyzing them, one should make changes to get better results. On the contrarily, we applied process mining techniques on seven different subjects combined in a single dataset collected from Korea. Above all, the following paper comments on the efficiency of processes in the event logs referring to time management's sphere of influence.
Liu, Baolin; Wu, Guangning; Wang, Zhongning; Ji, Xiang
2011-07-01
In the real world, some of the auditory and visual information received by the human brain are temporally asynchronous. How is such information integrated in cognitive processing in the brain? In this paper, we aimed to study the semantic integration of differently asynchronous audio-visual information in cognitive processing using ERP (event-related potential) method. Subjects were presented with videos of real world events, in which the auditory and visual information are temporally asynchronous. When the critical action was prior to the sound, sounds incongruous with the preceding critical actions elicited a N400 effect when compared to congruous condition. This result demonstrates that semantic contextual integration indexed by N400 also applies to cognitive processing of multisensory information. In addition, the N400 effect is early in latency when contrasted with other visually induced N400 studies. It is shown that cross modal information is facilitated in time when contrasted with visual information in isolation. When the sound was prior to the critical action, a larger late positive wave was observed under the incongruous condition compared to congruous condition. P600 might represent a reanalysis process, in which the mismatch between the critical action and the preceding sound was evaluated. It is shown that environmental sound may affect the cognitive processing of a visual event. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
"Fast" Is Not "Real-Time": Designing Effective Real-Time AI Systems
NASA Astrophysics Data System (ADS)
O'Reilly, Cindy A.; Cromarty, Andrew S.
1985-04-01
Realistic practical problem domains (such as robotics, process control, and certain kinds of signal processing) stand to benefit greatly from the application of artificial intelligence techniques. These problem domains are of special interest because they are typified by complex dynamic environments in which the ability to select and initiate a proper response to environmental events in real time is a strict prerequisite to effective environmental interaction. Artificial intelligence systems developed to date have been sheltered from this real-time requirement, however, largely by virtue of their use of simplified problem domains or problem representations. The plethora of colloquial and (in general) mutually inconsistent interpretations of the term "real-time" employed by workers in each of these domains further exacerbates the difficul-ties in effectively applying state-of-the-art problem solving tech-niques to time-critical problems. Indeed, the intellectual waters are by now sufficiently muddied that the pursuit of a rigorous treatment of intelligent real-time performance mandates the redevelopment of proper problem perspective on what "real-time" means, starting from first principles. We present a simple but nonetheless formal definition of real-time performance. We then undertake an analysis of both conventional techniques and AI technology with respect to their ability to meet substantive real-time performance criteria. This analysis provides a basis for specification of problem-independent design requirements for systems that would claim real-time performance. Finally, we discuss the application of these design principles to a pragmatic problem in real-time signal understanding.
How to handle 6GBytes a night and not get swamped
NASA Technical Reports Server (NTRS)
Allsman, R.; Alcock, C.; Axelrod, T.; Bennett, D.; Cook, K.; Park, H.-S.; Griest, K.; Marshall, S.; Perlmutter, S.; Stubbs, C.
1992-01-01
The Macho Project has undertaken a 5 year effort to search for dark matter in the halo of the Galaxy by scanning the Magellanic Clouds for micro-lensing events. Each evening's raw image data will be reduced in real-time into the observed stars' photometric measurements. The actual search for micro-lensing events will be a post-processing operation. The theoretical prediction of the rate of such events necessitates the collection of a large number of repeated exposures. The project designed camera subsystem delivers 64 Mbytes per exposure with exposures typically occurring every 500 seconds. An ideal evening's observing will provide 6 Gbytes of raw image data and 40 Mbytes of reduced photometric measurements. Recognizing the difficulty of digging out from a snowballing cascade of raw data, the project requires the real-time reduction of each evening's data. The software team's implementation strategy centered on this non-negotiable mandate. Accepting the reality that 2 full time people needed to implement the core real-time control and data management system within 6 months, off-the-shelf vendor components were explored to provide quick solutions to the classic needs for file management, data management, and process control. Where vendor solutions were lacking, state-of-the-art models were used for hand tailored subsystems. In particular, petri nets manage process control, memory mapped bulletin boards provide interprocess communication between the multi-tasked processes, and C++ class libraries provide memory mapped, disk resident databases. The differences between the implementation strategy and the final implementation reality are presented. The necessity of validating vendor product claims are explored. Both the successful and hindsight decisions enabling the collection and processing of the nightly data barrage are reviewed.
Real-Time Mapping alert system; characteristics and capabilities
Torres, L.A.; Lambert, S.C.; Liebermann, T.D.
1995-01-01
The U.S. Geological Survey has an extensive hydrologic network that records and transmits precipitation, stage, discharge, and other water-related data on a real-time basis to an automated data processing system. Data values are recorded on electronic data collection platforms at field sampling sites. These values are transmitted by means of orbiting satellites to receiving ground stations, and by way of telecommunication lines to a U.S. Geological Survey office where they are processed on a computer system. Data that exceed predefined thresholds are identified as alert values. The current alert status at monitoring sites within a state or region is of critical importance during floods, hurricanes, and other extreme hydrologic events. This report describes the characteristics and capabilities of a series of computer programs for real-time mapping of hydrologic data. The software provides interactive graphics display and query of hydrologic information from the network in a real-time, map-based, menu-driven environment.
NASA Astrophysics Data System (ADS)
Zimakov, L. G.; Raczka, J.; Barrientos, S. E.
2016-12-01
We will discuss and show the results obtained from an integrated SeismoGeodetic System, model SG160-09, installed in the Chile (Chilean National Network), Italy (University of Naples Network), and California. The SG160-09 provides the user high rate GNSS and accelerometer data, full epoch-by-epoch measurement integrity and the ability to create combined GNSS and accelerometer high-rate (200Hz) displacement time series in real-time. The SG160-09 combines seismic recording with GNSS geodetic measurement in a single compact, ruggedized case. The system includes a low-power, 220-channel GNSS receiver powered by the latest Trimble-precise Maxwell™6 technology and supports tracking GPS, GLONASS and Galileo signals. The receiver incorporates on-board GNSS point positioning using Real-Time Precise Point Positioning (PPP) technology with satellite clock and orbit corrections delivered over IP networks. The seismic recording includes an ANSS Class A, force balance accelerometer with the latest, low power, 24-bit A/D converter, producing high-resolution seismic data. The SG160-09 processor acquires and packetizes both seismic and geodetic data and transmits it to the central station using an advanced, error-correction protocol providing data integrity between the field and the processing center. The SG160-09 has been installed in three seismic stations in different geographic locations with different Trimble global reference stations coverage The hardware includes the SG160-09 system, external Zephyr Geodetic-2 GNSS antenna, both radio and high-speed Internet communication media. Both acceleration and displacement data was transmitted in real-time to the centralized Data Acquisition Centers for real-time data processing. Command/Control of the field station and real-time GNSS position correction are provided via the Pivot platform. Data from the SG160-09 system was used for seismic event characterization along with data from traditional seismic and geodetic stations installed in the network. Our presentation will focus on the key improvements of the network installation with the SG160-09 system, RTX correction accuracy obtained from Trimble Global RTX tracking network, rapid data transmission, and real-time data processing for strong seismic events and aftershock characterization.
A Model of Rapid Radicalization Behavior Using Agent-Based Modeling and Quorum Sensing
NASA Technical Reports Server (NTRS)
Schwartz, Noah; Drucker, Nick; Campbell, Kenyth
2012-01-01
Understanding the dynamics of radicalization, especially rapid radicalization, has become increasingly important to US policy in the past several years. Traditionally, radicalization is considered a slow process, but recent social and political events demonstrate that the process can occur quickly. Examining this rapid process, in real time, is impossible. However, recreating an event using modeling and simulation (M&S) allows researchers to study some of the complex dynamics associated with rapid radicalization. We propose to adapt the biological mechanism of quorum sensing as a tool to explore, or possibly explain, rapid radicalization. Due to the complex nature of quorum sensing, M&S allows us to examine events that we could not otherwise examine in real time. For this study, we employ Agent Based Modeling (ABM), an M&S paradigm suited to modeling group behavior. The result of this study was the successful creation of rapid radicalization using quorum sensing. The Battle of Mogadishu was the inspiration for this model and provided the testing conditions used to explore quorum sensing and the ideas behind rapid radicalization. The final product has wider applicability however, using quorum sensing as a possible tool for examining other catalytic rapid radicalization events.
NASA Astrophysics Data System (ADS)
Neagoe, Cristian; Grecu, Bogdan; Manea, Liviu
2016-04-01
National Institute for Earth Physics (NIEP) operates a real time seismic network which is designed to monitor the seismic activity on the Romanian territory, which is dominated by the intermediate earthquakes (60-200 km) from Vrancea area. The ability to reduce the impact of earthquakes on society depends on the existence of a large number of high-quality observational data. The development of the network in recent years and an advanced seismic acquisition are crucial to achieving this objective. The software package used to perform the automatic real-time locations is Seiscomp3. An accurate choice of the Seiscomp3 setting parameters is necessary to ensure the best performance of the real-time system i.e., the most accurate location for the earthquakes and avoiding any false events. The aim of this study is to optimize the algorithms of the real-time system that detect and locate the earthquakes in the monitored area. This goal is pursued by testing different parameters (e.g., STA/LTA, filters applied to the waveforms) on a data set of representative earthquakes of the local seismicity. The results are compared with the locations from the Romanian Catalogue ROMPLUS.
NASA Astrophysics Data System (ADS)
Choudhury, Diptyajit; Angeloski, Aleksandar; Ziah, Haseeb; Buchholz, Hilmar; Landsman, Andre; Gupta, Amitava; Mitra, Tiyasa
Lunar explorations often involve use of a lunar lander , a rover [1],[2] and an orbiter which rotates around the moon with a fixed radius. The orbiters are usually lunar satellites orbiting along a polar orbit to ensure visibility with respect to the rover and the Earth Station although with varying latency. Communication in such deep space missions is usually done using a specialized protocol like Proximity-1[3]. MATLAB simulation of Proximity-1 have been attempted by some contemporary researchers[4] to simulate all features like transmission control, delay etc. In this paper it is attempted to simulate, in real time, the communication between a tracking station on earth (earth station), a lunar orbiter and a lunar rover using concepts of Distributed Real-time Simulation(DRTS).The objective of the simulation is to simulate, in real-time, the time varying communication delays associated with the communicating elements with a facility to integrate specific simulation modules to study different aspects e.g. response due to a specific control command from the earth station to be executed by the rover. The hardware platform comprises four single board computers operating as stand-alone real time systems (developed by MATLAB xPC target and inter-networked using UDP-IP protocol). A time triggered DRTS approach is adopted. The earth station, the orbiter and the rover are programmed as three standalone real-time processes representing the communicating elements in the system. Communication from one communicating element to another constitutes an event which passes a state message from one element to another, augmenting the state of the latter. These events are handled by an event scheduler which is the fourth real-time process. The event scheduler simulates the delay in space communication taking into consideration the distance between the communicating elements. A unique time synchronization algorithm is developed which takes into account the large latencies in space communication. The DRTS setup thus developed serves as an important and inexpensive test bench for trying out remote controlled applications on the rover, for example, from an earth station. The simulation is modular and the system is composable. Each of the processes can be aug-mented with relevant simulation modules that handle the events to simulate specific function-alities. With stringent energy saving requirements on most rovers, such a simulation set up, for example, can be used to design optimal rover movement control strategies from the orbiter in conjunction with autonomous systems on the rover itself. References 1. Lunar and Planetary Department, Moscow University, Lunokhod 1, "http://selena.sai.msu.ru/Home/Spa 2. NASA History Office, Guidelines for Advanced Manned Space Vehicle Program, "http://history.nasa.gov 35ann/AMSVPguidelines/top.htm" 3. Consultative Committee For Space Data Systems, "Proximity-1 Space Link Protocol" CCSDS 211.0-B-1 Blue Book. October 2002. 4. Segui, J. and Jennings, E., "Delay Tolerant Networking-Bundle Protocol Simulation", in Proceedings of the 2nd IEEE International Conference on Space Mission Challenges for Infor-mation Technology, 2006.
Event-driven processing for hardware-efficient neural spike sorting
NASA Astrophysics Data System (ADS)
Liu, Yan; Pereira, João L.; Constandinou, Timothy G.
2018-02-01
Objective. The prospect of real-time and on-node spike sorting provides a genuine opportunity to push the envelope of large-scale integrated neural recording systems. In such systems the hardware resources, power requirements and data bandwidth increase linearly with channel count. Event-based (or data-driven) processing can provide here a new efficient means for hardware implementation that is completely activity dependant. In this work, we investigate using continuous-time level-crossing sampling for efficient data representation and subsequent spike processing. Approach. (1) We first compare signals (synthetic neural datasets) encoded with this technique against conventional sampling. (2) We then show how such a representation can be directly exploited by extracting simple time domain features from the bitstream to perform neural spike sorting. (3) The proposed method is implemented in a low power FPGA platform to demonstrate its hardware viability. Main results. It is observed that considerably lower data rates are achievable when using 7 bits or less to represent the signals, whilst maintaining the signal fidelity. Results obtained using both MATLAB and reconfigurable logic hardware (FPGA) indicate that feature extraction and spike sorting accuracies can be achieved with comparable or better accuracy than reference methods whilst also requiring relatively low hardware resources. Significance. By effectively exploiting continuous-time data representation, neural signal processing can be achieved in a completely event-driven manner, reducing both the required resources (memory, complexity) and computations (operations). This will see future large-scale neural systems integrating on-node processing in real-time hardware.
NASA Astrophysics Data System (ADS)
Passmore, P. R.; Jackson, M.; Zimakov, L. G.; Raczka, J.; Davidson, P.
2014-12-01
The key requirements for Earthquake Early Warning and other Rapid Event Notification Systems are: Quick delivery of digital data from a field station to the acquisition and processing center; Data integrity for real-time earthquake notification in order to provide warning prior to significant ground shaking in the given target area. These two requirements are met in the recently developed Trimble SG160-09 SeismoGeodetic System, which integrates both GNSS and acceleration measurements using the Kalman filter algorithm to create a new high-rate (200 sps), real-time displacement with sufficient accuracy and very low latency for rapid delivery of the acquired data to a processing center. The data acquisition algorithm in the SG160-09 System provides output of both acceleration and displacement digital data with 0.2 sec delay. This is a significant reduction in the time interval required for real-time transmission compared to data delivery algorithms available in digitizers currently used in other Earthquake Early Warning networks. Both acceleration and displacement data are recorded and transmitted to the processing site in a specially developed Multiplexed Recording Format (MRF) that minimizes the bandwidth required for real-time data transmission. In addition, a built in algorithm calculates the τc and Pd once the event is declared. The SG160-09 System keeps track of what data has not been acknowledged and re-transmits the data giving priority to current data. Modified REF TEK Protocol Daemon (RTPD) receives the digital data and acknowledges data received without error. It forwards this "good" data to processing clients of various real-time data processing software including Earthworm and SeisComP3. The processing clients cache packets when a data gap occurs due to a dropped packet or network outage. The cache packet time is settable, but should not exceed 0.5 sec in the Earthquake Early Warning network configuration. The rapid data transmission algorithm was tested with different communication media, including Internet, DSL, Wi-Fi, GPRS, etc. The test results show that the data latency via most communication media do not exceed 0.5 sec nominal from a first sample in the data packet. Detailed acquisition algorithm and results of data transmission via different communication media are presented.
Bahk, Chi Y; Cumming, Melissa; Paushter, Louisa; Madoff, Lawrence C; Thomson, Angus; Brownstein, John S
2016-02-01
Real-time monitoring of mainstream and social media can inform public health practitioners and policy makers about vaccine sentiment and hesitancy. We describe a publicly available platform for monitoring vaccination-related content, called the Vaccine Sentimeter. With automated data collection from 100,000 mainstream media sources and Twitter, natural-language processing for automated filtering, and manual curation to ensure accuracy, the Vaccine Sentimeter offers a global real-time view of vaccination conversations online. To assess the system's utility, we followed two events: polio vaccination in Pakistan after a news story about a Central Intelligence Agency vaccination ruse and subsequent attacks on health care workers, and a controversial episode in a television program about adverse events following human papillomavirus vaccination. For both events, increased online activity was detected and characterized. For the first event, Twitter response to the attacks on health care workers decreased drastically after the first attack, in contrast to mainstream media coverage. For the second event, the mainstream and social media response was largely positive about the HPV vaccine, but antivaccine conversations persisted longer than the provaccine reaction. Using the Vaccine Sentimeter could enable public health professionals to detect increased online activity or sudden shifts in sentiment that could affect vaccination uptake. Project HOPE—The People-to-People Health Foundation, Inc.
Real-time feedback control of twin-screw wet granulation based on image analysis.
Madarász, Lajos; Nagy, Zsombor Kristóf; Hoffer, István; Szabó, Barnabás; Csontos, István; Pataki, Hajnalka; Démuth, Balázs; Szabó, Bence; Csorba, Kristóf; Marosi, György
2018-06-04
The present paper reports the first dynamic image analysis-based feedback control of continuous twin-screw wet granulation process. Granulation of the blend of lactose and starch was selected as a model process. The size and size distribution of the obtained particles were successfully monitored by a process camera coupled with an image analysis software developed by the authors. The validation of the developed system showed that the particle size analysis tool can determine the size of the granules with an error of less than 5 µm. The next step was to implement real-time feedback control of the process by controlling the liquid feeding rate of the pump through a PC, based on the real-time determined particle size results. After the establishment of the feedback control, the system could correct different real-life disturbances, creating a Process Analytically Controlled Technology (PACT), which guarantees the real-time monitoring and controlling of the quality of the granules. In the event of changes or bad tendencies in the particle size, the system can automatically compensate the effect of disturbances, ensuring proper product quality. This kind of quality assurance approach is especially important in the case of continuous pharmaceutical technologies. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Donegan, M.; Vandegriff, J.; Ho, G. C.; Julia, S. J.
2004-12-01
We report on an operational system which provides advance warning and predictions of arrival times at Earth of interplanetary (IP) shocks that originate at the Sun. The data stream used in our prediction algorithm is real-time and comes from the Electron, Proton, and Alpha Monitor (EPAM) instrument on NASA's Advanced Composition Explorer (ACE) spacecraft. Since locally accelerated energetic storm particle (ESP) events accompany most IP shocks, their arrival can be predicted using ESP event signatures. We have previously reported on the development and implementation of an algorithm which recognizes the upstream particle signature of approaching IP shocks and provides estimated countdown predictions. A web-based system (see (http://sd-www.jhuapl.edu/UPOS/RISP/index.html) combines this prediction capability with real-time ACE/EPAM data provided by the NOAA Space Environment Center. The most recent ACE data is continually processed and predictions of shock arrival time are updated every five minutes when an event is impending. An operational display is provided to indicate advisories and countdowns for the event. Running the algorithm on a test set of historical events, we obtain a median error of about 10 hours for predictions made 24-36 hours before actual shock arrival and about 6 hours when the shock is 6-12 hours away. This system can provide critical information to mission planners, satellite operations controllers, and scientists by providing significant lead-time for approaching events. Recently, we have made improvements to the triggering mechanism as well as re-training the neural network, and here we report prediction results from the latest system.
Leveraging Long-term Seismic Catalogs for Automated Real-time Event Classification
NASA Astrophysics Data System (ADS)
Linville, L.; Draelos, T.; Pankow, K. L.; Young, C. J.; Alvarez, S.
2017-12-01
We investigate the use of labeled event types available through reviewed seismic catalogs to produce automated event labels on new incoming data from the crustal region spanned by the cataloged events. Using events cataloged by the University of Utah Seismograph Stations between October, 2012 and June, 2017, we calculate the spectrogram for a time window that spans the duration of each event as seen on individual stations, resulting in 110k event spectrograms (50% local earthquakes examples, 50% quarry blasts examples). Using 80% of the randomized example events ( 90k), a classifier is trained to distinguish between local earthquakes and quarry blasts. We explore variations of deep learning classifiers, incorporating elements of convolutional and recurrent neural networks. Using a single-layer Long Short Term Memory recurrent neural network, we achieve 92% accuracy on the classification task on the remaining 20K test examples. Leveraging the decisions from a group of stations that detected the same event by using the median of all classifications in the group increases the model accuracy to 96%. Additional data with equivalent processing from 500 more recently cataloged events (July, 2017), achieves the same accuracy as our test data on both single-station examples and multi-station medians, suggesting that the model can maintain accurate and stable classification rates on real-time automated events local to the University of Utah Seismograph Stations, with potentially minimal levels of re-training through time.
Using SDO Data in the Classroom to Do Real Science -- A Community College Laboratory Investigation
NASA Astrophysics Data System (ADS)
Dave, T. A.; Hildreth, S.; Lee, S.; Scherrer, D. K.
2013-12-01
The incredible accessibility of extremely high spatial and temporal resolution data from the Solar Dynamics Observatory creates an opportunity for students to do almost real-time investigation in an Astronomy Lab. We are developing a short series of laboratory exercises using SDO data, targeted for Community College students in an introductory lab class, extendable to high school and university students. The labs initially lead students to explore what SDO can do, online, through existing SDO video clips taken on specific dates. Students then investigate solar events using the Heliophysics Events Knowledgebase (HEK), and make their own online movies of events, to discuss and share with classmates. Finally, students can investigate specific events and areas, selecting specific dates, locations, wavelength regions, and time cadences to create and gather their own SDO datasets for more detailed investigation. In exploring the Sun using actual data, students actually do real science. We are in the process of beta testing the sequence of labs, and are seeking interested community college, university, and high school astronomy lab teachers who might consider trying the labs themselves.
Event Management of RFID Data Streams: Fast Moving Consumer Goods Supply Chains
NASA Astrophysics Data System (ADS)
Mo, John P. T.; Li, Xue
Radio Frequency Identification (RFID) is a wireless communication technology that uses radio-frequency waves to transfer information between tagged objects and readers without line of sight. This creates tremendous opportunities for linking real world objects into a world of "Internet of things". Application of RFID to Fast Moving Consumer Goods sector will introduce billions of RFID tags in the world. Almost everything is tagged for tracking and identification purposes. This phenomenon will impose a new challenge not only to the network capacity but also to the scalability of processing of RFID events and data. This chapter uses two national demonstrator projects in Australia as case studies to introduce an event managementframework to process high volume RFID data streams in real time and automatically transform physical RFID observations into business-level events. The model handles various temporal event patterns, both simple and complex, with temporal constraints. The model can be implemented in a data management architecture that allows global RFID item tracking and enables fast, large-scale RFID deployment.
Listening to sound patterns as a dynamic activity
NASA Astrophysics Data System (ADS)
Jones, Mari Riess
2003-04-01
The act of listening to a series of sounds created by some natural event is described as involving an entrainmentlike process that transpires in real time. Some aspects of this dynamic process are suggested. In particular, real-time attending is described in terms of an adaptive synchronization activity that permits a listener to target attending energy to forthcoming elements within an acoustical pattern (e.g., music, speech, etc.). Also described are several experiments that illustrate features of this approach as it applies to attending to musiclike patterns. These involve listeners' responses to changes in either the timing or the pitch structure (or both) of various acoustical sequences.
A generic multi-hazard and multi-risk framework and its application illustrated in a virtual city
NASA Astrophysics Data System (ADS)
Mignan, Arnaud; Euchner, Fabian; Wiemer, Stefan
2013-04-01
We present a generic framework to implement hazard correlations in multi-risk assessment strategies. We consider hazard interactions (process I), time-dependent vulnerability (process II) and time-dependent exposure (process III). Our approach is based on the Monte Carlo method to simulate a complex system, which is defined from assets exposed to a hazardous region. We generate 1-year time series, sampling from a stochastic set of events. Each time series corresponds to one risk scenario and the analysis of multiple time series allows for the probabilistic assessment of losses and for the recognition of more or less probable risk paths. Each sampled event is associated to a time of occurrence, a damage footprint and a loss footprint. The occurrence of an event depends on its rate, which is conditional on the occurrence of past events (process I, concept of correlation matrix). Damage depends on the hazard intensity and on the vulnerability of the asset, which is conditional on previous damage on that asset (process II). Losses are the product of damage and exposure value, this value being the original exposure minus previous losses (process III, no reconstruction considered). The Monte Carlo method allows for a straightforward implementation of uncertainties and for implementation of numerous interactions, which is otherwise challenging in an analytical multi-risk approach. We apply our framework to a synthetic data set, defined by a virtual city within a virtual region. This approach gives the opportunity to perform multi-risk analyses in a controlled environment while not requiring real data, which may be difficultly accessible or simply unavailable to the public. Based on the heuristic approach, we define a 100 by 100 km region where earthquakes, volcanic eruptions, fluvial floods, hurricanes and coastal floods can occur. All hazards are harmonized to a common format. We define a 20 by 20 km city, composed of 50,000 identical buildings with a fixed economic value. Vulnerability curves are defined in terms of mean damage ratio as a function of hazard intensity. All data are based on simple equations found in the literature and on other simplifications. We show the impact of earthquake-earthquake interaction and hurricane-storm surge coupling, as well as of time-dependent vulnerability and exposure, on aggregated loss curves. One main result is the emergence of low probability-high consequences (extreme) events when correlations are implemented. While the concept of virtual city can suggest the theoretical benefits of multi-risk assessment for decision support, identifying their real-world practicality will require the study of real test sites.
Real-time detection of optical transients with RAPTOR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borozdin, K. N.; Brumby, Steven P.; Galassi, M. C.
2002-01-01
Fast variability of optical objects is an interesting though poorly explored subject in modern astronomy. Real-time data processing and identification of transient, celestial events in the images is very important, for such study as it allows rapid follow-up with more sensitive instruments, We discuss an approach which we have chosen for the RAPTOR project which is a pioneering close-loop system combining real-time transient detection with rapid follow-up. Our data processing pipeline is able to identify and localize an optical transient within seconds after the observation. We describe the challenges we met, solutions we found and some results obtained in ourmore » search for fast optical transients. The software pipeline we have developed for RAPTOR can easily be applied to the data from other experiments.« less
NASA Astrophysics Data System (ADS)
Edwards, John L.; Beekman, Randy M.; Buchanan, David B.; Farner, Scott; Gershzohn, Gary R.; Khuzadi, Mbuyi; Mikula, D. F.; Nissen, Gerry; Peck, James; Taylor, Shaun
2007-04-01
Human space travel is inherently dangerous. Hazardous conditions will exist. Real time health monitoring of critical subsystems is essential for providing a safe abort timeline in the event of a catastrophic subsystem failure. In this paper, we discuss a practical and cost effective process for developing critical subsystem failure detection, diagnosis and response (FDDR). We also present the results of a real time health monitoring simulation of a propellant ullage pressurization subsystem failure. The health monitoring development process identifies hazards, isolates hazard causes, defines software partitioning requirements and quantifies software algorithm development. The process provides a means to establish the number and placement of sensors necessary to provide real time health monitoring. We discuss how health monitoring software tracks subsystem control commands, interprets off-nominal operational sensor data, predicts failure propagation timelines, corroborate failures predictions and formats failure protocol.
Real-Time Fault Classification for Plasma Processes
Yang, Ryan; Chen, Rongshun
2011-01-01
Plasma process tools, which usually cost several millions of US dollars, are often used in the semiconductor fabrication etching process. If the plasma process is halted due to some process fault, the productivity will be reduced and the cost will increase. In order to maximize the product/wafer yield and tool productivity, a timely and effective fault process detection is required in a plasma reactor. The classification of fault events can help the users to quickly identify fault processes, and thus can save downtime of the plasma tool. In this work, optical emission spectroscopy (OES) is employed as the metrology sensor for in-situ process monitoring. Splitting into twelve different match rates by spectrum bands, the matching rate indicator in our previous work (Yang, R.; Chen, R.S. Sensors 2010, 10, 5703–5723) is used to detect the fault process. Based on the match data, a real-time classification of plasma faults is achieved by a novel method, developed in this study. Experiments were conducted to validate the novel fault classification. From the experimental results, we may conclude that the proposed method is feasible inasmuch that the overall accuracy rate of the classification for fault event shifts is 27 out of 28 or about 96.4% in success. PMID:22164001
NASA Astrophysics Data System (ADS)
Zimakov, Leonid; Jackson, Michael; Passmore, Paul; Raczka, Jared; Alvarez, Marcos; Barrientos, Sergio
2015-04-01
We will discuss and show the results obtained from an integrated SeismoGeodetic System, model SG160-09, installed in the Chilean National Network. The SG160-09 provides the user high rate GNSS and accelerometer data, full epoch-by-epoch measurement integrity and, using the Trimble Pivot™ SeismoGeodetic App, the ability to create combined GNSS and accelerometer high-rate (200Hz) displacement time series in real-time. The SG160-09 combines seismic recording with GNSS geodetic measurement in a single compact, ruggedized package. The system includes a low-power, 220-channel GNSS receiver powered by the latest Trimble-precise Maxwell™6 technology and supports tracking GPS, GLONASS and Galileo signals. The receiver incorporates on-board GNSS point positioning using Real-Time Precise Point Positioning (PPP) technology with satellite clock and orbit corrections delivered over IP networks. The seismic recording element includes an ANSS Class A, force balance triaxial accelerometer with the latest, low power, 24-bit A/D converter, which produces high-resolution seismic data. The SG160-09 processor acquires and packetizes both seismic and geodetic data and transmits it to the central station using an advanced, error-correction protocol with back fill capability providing data integrity between the field and the processing center. The SG160-09 has been installed in the seismic station close to the area of the Iquique earthquake of April 1, 2014, in northern Chile, a seismically prone area at the current time. The hardware includes the SG160-09 system, external Zephyr Geodetic-2 GNSS antenna, and high-speed Internet communication media. Both acceleration and displacement data was transmitted in real-time to the National Seismological Center in Santiago for real-time data processing using Earthworm / Early Bird software. Command/Control of the field station and real-time GNSS position correction are provided via the Pivot software suite. Data from the SG160-09 system was used for seismic event characterization along with data from traditional stand-alone broadband seismic and geodetic stations installed in the network. Our presentation will focus on the key improvements of the network installation with the SG160-09 system, rapid data transmission, and real-time data processing for strong seismic events and aftershock characterization as well as advanced features of the SG160-09 for Earthquake and Tsunami Early Warning system.
A Volcano Exploration Project Pu`u `O`o (VEPP) Exercise: Is Kilauea in Volcanic Unrest? (Invited)
NASA Astrophysics Data System (ADS)
Schwartz, S. Y.
2010-12-01
Volcanic activity captures the interest and imagination of students at all stages in their education. Analysis of real data collected on active volcanoes can further serve to engage students in higher-level inquiry into the complicated physical processes associated with volcanic eruptions. This exercise takes advantage of both student fascination with volcanoes and the recognized benefits of incorporating real, internet-accessible data to achieve its goals of enabling students to: 1) navigate a scientific website; 2) describe the physical events that produce volcano monitoring data; 3) identify patterns in geophysical time-series and distinguish anomalies preceding and synchronous with eruptive events; 4) compare and contrast geophysical time series and 5) integrate diverse data sets to assess the eruptive state of Kilauea volcano. All data come from the VEPP website (vepp.wr.usgs.gov) which provides background information on the historic activity and volcano monitoring methods as well as near-real time volcano monitoring data from the Pu`u `O`o eruptive vent on Kilauea Volcano. This exercise, designed for geology majors, has students initially work individually to acquire basic skills with volcano monitoring data interpretation and then together in a jigsaw activity to unravel the events leading up to and culminating in the July 2007 volcanic episode. Based on patterns established prior to the July 2007 event, students examine real-time volcano monitoring data to evaluate the present activity level of Kilauea volcano. This exercise will be used for the first time in an upper division Geologic Hazards class in fall 2010 and lessons learned including an exercise assessment will be presented.
Exploring Listeners' Real-Time Reactions to Regional Accents
ERIC Educational Resources Information Center
Watson, Kevin; Clark, Lynn
2015-01-01
Evaluative reactions to language stimuli are presumably dynamic events, constantly changing through time as the signal unfolds, yet the tools we usually use to capture these reactions provide us with only a snapshot of this process by recording reactions at a single point in time. This paper outlines and evaluates a new methodology which employs…
Software-Based Real-Time Acquisition and Processing of PET Detector Raw Data.
Goldschmidt, Benjamin; Schug, David; Lerche, Christoph W; Salomon, André; Gebhardt, Pierre; Weissler, Bjoern; Wehner, Jakob; Dueppenbecker, Peter M; Kiessling, Fabian; Schulz, Volkmar
2016-02-01
In modern positron emission tomography (PET) readout architectures, the position and energy estimation of scintillation events (singles) and the detection of coincident events (coincidences) are typically carried out on highly integrated, programmable printed circuit boards. The implementation of advanced singles and coincidence processing (SCP) algorithms for these architectures is often limited by the strict constraints of hardware-based data processing. In this paper, we present a software-based data acquisition and processing architecture (DAPA) that offers a high degree of flexibility for advanced SCP algorithms through relaxed real-time constraints and an easily extendible data processing framework. The DAPA is designed to acquire detector raw data from independent (but synchronized) detector modules and process the data for singles and coincidences in real-time using a center-of-gravity (COG)-based, a least-squares (LS)-based, or a maximum-likelihood (ML)-based crystal position and energy estimation approach (CPEEA). To test the DAPA, we adapted it to a preclinical PET detector that outputs detector raw data from 60 independent digital silicon photomultiplier (dSiPM)-based detector stacks and evaluated it with a [(18)F]-fluorodeoxyglucose-filled hot-rod phantom. The DAPA is highly reliable with less than 0.1% of all detector raw data lost or corrupted. For high validation thresholds (37.1 ± 12.8 photons per pixel) of the dSiPM detector tiles, the DAPA is real time capable up to 55 MBq for the COG-based CPEEA, up to 31 MBq for the LS-based CPEEA, and up to 28 MBq for the ML-based CPEEA. Compared to the COG-based CPEEA, the rods in the image reconstruction of the hot-rod phantom are only slightly better separable and less blurred for the LS- and ML-based CPEEA. While the coincidence time resolution (∼ 500 ps) and energy resolution (∼12.3%) are comparable for all three CPEEA, the system sensitivity is up to 2.5 × higher for the LS- and ML-based CPEEA.
Julian, Timothy R; Bustos, Carla; Kwong, Laura H; Badilla, Alejandro D; Lee, Julia; Bischel, Heather N; Canales, Robert A
2018-05-08
Quantitative data on human-environment interactions are needed to fully understand infectious disease transmission processes and conduct accurate risk assessments. Interaction events occur during an individual's movement through, and contact with, the environment, and can be quantified using diverse methodologies. Methods that utilize videography, coupled with specialized software, can provide a permanent record of events, collect detailed interactions in high resolution, be reviewed for accuracy, capture events difficult to observe in real-time, and gather multiple concurrent phenomena. In the accompanying video, the use of specialized software to capture humanenvironment interactions for human exposure and disease transmission is highlighted. Use of videography, combined with specialized software, allows for the collection of accurate quantitative representations of human-environment interactions in high resolution. Two specialized programs include the Virtual Timing Device for the Personal Computer, which collects sequential microlevel activity time series of contact events and interactions, and LiveTrak, which is optimized to facilitate annotation of events in real-time. Opportunities to annotate behaviors at high resolution using these tools are promising, permitting detailed records that can be summarized to gain information on infectious disease transmission and incorporated into more complex models of human exposure and risk.
Handheld 2-channel impedimetric cell counting system with embedded real-time processing
NASA Astrophysics Data System (ADS)
Rottigni, A.; Carminati, M.; Ferrari, G.; Vahey, M. D.; Voldman, J.; Sampietro, M.
2011-05-01
Lab-on-a-chip systems have been attracting a growing attention for the perspective of miniaturization and portability of bio-chemical assays. Here we present a the design and characterization of a miniaturized, USB-powered, self-contained, 2-channel instrument for impedance sensing, suitable for label-free tracking and real-time detection of cells flowing in microfluidic channels. This original circuit features a signal generator based on a direct digital synthesizer, a transimpedance amplifier, an integrated square-wave lock-in coupled to a Σ▵ ADC converter, and a digital processing platform. Real-time automatic peak detection on two channels is implemented in a FPGA. System functionality has been tested with an electronic resistance modulator to simulate 1% impedance variation produced by cells, reaching a time resolution of 50μs (enabling a count rate of 2000 events/s) with an applied voltage as low as 200mV. Biological experiments have been carried out counting yeast cells. Statistical analysis of events is in agreement with the expected amplitude and time distributions. 2-channel yeast counting has been performed with concomitant dielectrophoretic cell separation, showing that this novel and ultra compact sensing system, thanks to the selectivity of the lock-in detector, is compatible with other AC electrical fields applied to the device.
A Verification Method of Inter-Task Cooperation in Embedded Real-time Systems and its Evaluation
NASA Astrophysics Data System (ADS)
Yoshida, Toshio
In software development process of embedded real-time systems, the design of the task cooperation process is very important. The cooperating process of such tasks is specified by task cooperation patterns. Adoption of unsuitable task cooperation patterns has fatal influence on system performance, quality, and extendibility. In order to prevent repetitive work caused by the shortage of task cooperation performance, it is necessary to verify task cooperation patterns in an early software development stage. However, it is very difficult to verify task cooperation patterns in an early software developing stage where task program codes are not completed yet. Therefore, we propose a verification method using task skeleton program codes and a real-time kernel that has a function of recording all events during software execution such as system calls issued by task program codes, external interrupts, and timer interrupt. In order to evaluate the proposed verification method, we applied it to the software development process of a mechatronics control system.
Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan
2017-01-01
In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user generated content from social media. Compared with the traditional news media, social media services, such as Twitter, can provide more complete and timely information about the real-world events. However events are often like a puzzle and in order to solve the puzzle/understand the event, we must identify all the sub-events or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization currently typically analyse events based on partial data as conventional data collection methodologies are unable to collect comprehensive event data. This results in existing systems often being unable to report sub-events in real-time and often in completely missing sub-events or pieces in the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal feature of an expanded set of news-worthy event content. In order to more comprehensively and accurately identify sub-events this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, the content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contribute to better event detection by identifying additional valid sub-events. The novel combination of our proposed adaptive crawler and our stream division/recombination technique provides significant gains in event recall (44.44%) and event precision (9.57%). The addition of these sub-events or pieces, allows us to get closer to solving the event puzzle. PMID:29107976
Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan
2017-01-01
In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user generated content from social media. Compared with the traditional news media, social media services, such as Twitter, can provide more complete and timely information about the real-world events. However events are often like a puzzle and in order to solve the puzzle/understand the event, we must identify all the sub-events or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization currently typically analyse events based on partial data as conventional data collection methodologies are unable to collect comprehensive event data. This results in existing systems often being unable to report sub-events in real-time and often in completely missing sub-events or pieces in the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal feature of an expanded set of news-worthy event content. In order to more comprehensively and accurately identify sub-events this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, the content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contribute to better event detection by identifying additional valid sub-events. The novel combination of our proposed adaptive crawler and our stream division/recombination technique provides significant gains in event recall (44.44%) and event precision (9.57%). The addition of these sub-events or pieces, allows us to get closer to solving the event puzzle.
Real-time measurements, rare events and photon economics
NASA Astrophysics Data System (ADS)
Jalali, B.; Solli, D. R.; Goda, K.; Tsia, K.; Ropers, C.
2010-07-01
Rogue events otherwise known as outliers and black swans are singular, rare, events that carry dramatic impact. They appear in seemingly unconnected systems in the form of oceanic rogue waves, stock market crashes, evolution, and communication systems. Attempts to understand the underlying dynamics of such complex systems that lead to spectacular and often cataclysmic outcomes have been frustrated by the scarcity of events, resulting in insufficient statistical data, and by the inability to perform experiments under controlled conditions. Extreme rare events also occur in ultrafast physical sciences where it is possible to collect large data sets, even for rare events, in a short time period. The knowledge gained from observing rare events in ultrafast systems may provide valuable insight into extreme value phenomena that occur over a much slower timescale and that have a closer connection with human experience. One solution is a real-time ultrafast instrument that is capable of capturing singular and randomly occurring non-repetitive events. The time stretch technology developed during the past 13 years is providing a powerful tool box for reaching this goal. This paper reviews this technology and discusses its use in capturing rogue events in electronic signals, spectroscopy, and imaging. We show an example in nonlinear optics where it was possible to capture rare and random solitons whose unusual statistical distribution resemble those observed in financial markets. The ability to observe the true spectrum of each event in real time has led to important insight in understanding the underlying process, which in turn has made it possible to control soliton generation leading to improvement in the coherence of supercontinuum light. We also show a new class of fast imagers which are being considered for early detection of cancer because of their potential ability to detect rare diseased cells (so called rogue cells) in a large population of healthy cells.
ROADNET: A Real-time Data Aware System for Earth, Oceanographic, and Environmental Applications
NASA Astrophysics Data System (ADS)
Vernon, F.; Hansen, T.; Lindquist, K.; Ludascher, B.; Orcutt, J.; Rajasekar, A.
2003-12-01
The Real-time Observatories, Application, and Data management Network (ROADNet) Program aims to develop an integrated, seamless, and transparent environmental information network that will deliver geophysical, oceanographic, hydrological, ecological, and physical data to a variety of users in real-time. ROADNet is a multidisciplinary, multinational partnership of researchers, policymakers, natural resource managers, educators, and students who aim to use the data to advance our understanding and management of coastal, ocean, riparian, and terrestrial Earth systems in Southern California, Mexico, and well off shore. To date, project activity and funding have focused on the design and deployment of network linkages and on the exploratory development of the real-time data management system. We are currently adapting powerful "Data Grid" technologies to the unique challenges associated with the management and manipulation of real-time data. Current "Grid" projects deal with static data files, and significant technical innovation is required to address fundamental problems of real-time data processing, integration, and distribution. The technologies developed through this research will create a system that dynamically adapt downstream processing, cataloging, and data access interfaces when sensors are added or removed from the system; provide for real-time processing and monitoring of data streams--detecting events, and triggering computations, sensor and logger modifications, and other actions; integrate heterogeneous data from multiple (signal) domains; and provide for large-scale archival and querying of "consolidated" data. The software tools which must be developed do not exist, although limited prototype systems are available. This research has implications for the success of large-scale NSF initiatives in the Earth sciences (EarthScope), ocean sciences (OOI- Ocean Observatories Initiative), biological sciences (NEON - National Ecological Observatory Network) and civil engineering (NEES - Network for Earthquake Engineering Simulation). Each of these large scale initiatives aims to collect real-time data from thousands of sensors, and each will require new technologies to process, manage, and communicate real-time multidisciplinary environmental data on regional, national, and global scales.
Context-aware event detection smartphone application for first responders
NASA Astrophysics Data System (ADS)
Boddhu, Sanjay K.; Dave, Rakesh P.; McCartney, Matt; West, James A.; Williams, Robert L.
2013-05-01
The rise of social networking platforms like Twitter, Facebook, etc…, have provided seamless sharing of information (as chat, video and other media) among its user community on a global scale. Further, the proliferation of the smartphones and their connectivity networks has powered the ordinary individuals to share and acquire information regarding the events happening in his/her immediate vicinity in a real-time fashion. This human-centric sensed data being generated in "human-as-sensor" approach is tremendously valuable as it delivered mostly with apt annotations and ground truth that would be missing in traditional machine-centric sensors, besides high redundancy factor (same data thru multiple users). Further, when appropriately employed this real-time data can support in detecting localized events like fire, accidents, shooting, etc…, as they unfold and pin-point individuals being affected by those events. This spatiotemporal information, when made available for first responders in the event vicinity (or approaching it) can greatly assist them to make effective decisions to protect property and life in a timely fashion. In this vein, under SATE and YATE programs, the research team at AFRL Tec^Edge Discovery labs had demonstrated the feasibility of developing Smartphone applications, that can provide a augmented reality view of the appropriate detected events in a given geographical location (localized) and also provide an event search capability over a large geographic extent. In its current state, the application thru its backend connectivity utilizes a data (Text & Image) processing framework, which deals with data challenges like; identifying and aggregating important events, analyzing and correlating the events temporally and spatially and building a search enabled event database. Further, the smartphone application with its backend data processing workflow has been successfully field tested with live user generated feeds.
Analytical Models of Cross-Layer Protocol Optimization in Real-Time Wireless Sensor Ad Hoc Networks
NASA Astrophysics Data System (ADS)
Hortos, William S.
The real-time interactions among the nodes of a wireless sensor network (WSN) to cooperatively process data from multiple sensors are modeled. Quality-of-service (QoS) metrics are associated with the quality of fused information: throughput, delay, packet error rate, etc. Multivariate point process (MVPP) models of discrete random events in WSNs establish stochastic characteristics of optimal cross-layer protocols. Discrete-event, cross-layer interactions in mobile ad hoc network (MANET) protocols have been modeled using a set of concatenated design parameters and associated resource levels by the MVPPs. Characterization of the "best" cross-layer designs for a MANET is formulated by applying the general theory of martingale representations to controlled MVPPs. Performance is described in terms of concatenated protocol parameters and controlled through conditional rates of the MVPPs. Modeling limitations to determination of closed-form solutions versus explicit iterative solutions for ad hoc WSN controls are examined.
Volcanic Ash and SO2 Monitoring Using Suomi NPP Direct Broadcast OMPS Data
NASA Astrophysics Data System (ADS)
Seftor, C. J.; Krotkov, N. A.; McPeters, R. D.; Li, J. Y.; Brentzel, K. W.; Habib, S.; Hassinen, S.; Heinrichs, T. A.; Schneider, D. J.
2014-12-01
NASA's Suomi NPP Ozone Science Team, in conjunction with Goddard Space Flight Center's (GSFC's) Direct Readout Laboratory, developed the capability of processing, in real-time, direct readout (DR) data from the Ozone Mapping and Profiler Suite (OMPS) to perform SO2 and Aerosol Index (AI) retrievals. The ability to retrieve this information from real-time processing of DR data was originally developed for the Ozone Monitoring Instrument (OMI) onboard the Aura spacecraft and is used by Volcano Observatories and Volcanic Ash Advisory Centers (VAACs) charged with mapping ash clouds from volcanic eruptions and providing predictions/forecasts about where the ash will go. The resulting real-time SO2 and AI products help to mitigate the effects of eruptions such as the ones from Eyjafjallajokull in Iceland and Puyehue-Cordón Caulle in Chile, which cause massive disruptions to airline flight routes for weeks as airlines struggle to avoid ash clouds that could cause engine failure, deeply pitted windshields impossible to see through, and other catastrophic events. We will discuss the implementation of real-time processing of OMPS DR data by both the Geographic Information Network of Alaska (GINA) and the Finnish Meteorological Institute (FMI), which provide real-time coverage over some of the most congested airspace and over many of the most active volcanoes in the world, and show examples of OMPS DR processing results from recent volcanic eruptions.
Defining Tsunami Magnitude as Measure of Potential Impact
NASA Astrophysics Data System (ADS)
Titov, V. V.; Tang, L.
2016-12-01
The goal of tsunami forecast, as a system for predicting potential impact of a tsunami at coastlines, requires quick estimate of a tsunami magnitude. This goal has been recognized since the beginning of tsunami research. The work of Kajiura, Soloviev, Abe, Murty, and many others discussed several scales for tsunami magnitude based on estimates of tsunami energy. However, difficulties of estimating tsunami energy based on available tsunami measurements at coastal sea-level stations has carried significant uncertainties and has been virtually impossible in real time, before tsunami impacts coastlines. The slow process of tsunami magnitude estimates, including collection of vast amount of available coastal sea-level data from affected coastlines, made it impractical to use any tsunami magnitude scales in tsunami warning operations. Uncertainties of estimates made tsunami magnitudes difficult to use as universal scale for tsunami analysis. Historically, the earthquake magnitude has been used as a proxy of tsunami impact estimates, since real-time seismic data is available of real-time processing and ample amount of seismic data is available for an elaborate post event analysis. This measure of tsunami impact carries significant uncertainties in quantitative tsunami impact estimates, since the relation between the earthquake and generated tsunami energy varies from case to case. In this work, we argue that current tsunami measurement capabilities and real-time modeling tools allow for establishing robust tsunami magnitude that will be useful for tsunami warning as a quick estimate for tsunami impact and for post-event analysis as a universal scale for tsunamis inter-comparison. We present a method for estimating the tsunami magnitude based on tsunami energy and present application of the magnitude analysis for several historical events for inter-comparison with existing methods.
Real-Time Data Processing in the muon system of the D0 detector.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neeti Parashar et al.
2001-07-03
This paper presents a real-time application of the 16-bit fixed point Digital Signal Processors (DSPs), in the Muon System of the D0 detector located at the Fermilab Tevatron, presently the world's highest-energy hadron collider. As part of the Upgrade for a run beginning in the year 2000, the system is required to process data at an input event rate of 10 KHz without incurring significant deadtime in readout. The ADSP21csp01 processor has high I/O bandwidth, single cycle instruction execution and fast task switching support to provide efficient multisignal processing. The processor's internal memory consists of 4K words of Program Memorymore » and 4K words of Data Memory. In addition there is an external memory of 32K words for general event buffering and 16K words of Dual port Memory for input data queuing. This DSP fulfills the requirement of the Muon subdetector systems for data readout. All error handling, buffering, formatting and transferring of the data to the various trigger levels of the data acquisition system is done in software. The algorithms developed for the system complete these tasks in about 20 {micro}s per event.« less
Duke, Jon D.; Friedlin, Jeff
2010-01-01
Evaluating medications for potential adverse events is a time-consuming process, typically involving manual lookup of information by physicians. This process can be expedited by CDS systems that support dynamic retrieval and filtering of adverse drug events (ADE’s), but such systems require a source of semantically-coded ADE data. We created a two-component system that addresses this need. First we created a natural language processing application which extracts adverse events from Structured Product Labels and generates a standardized ADE knowledge base. We then built a decision support service that consumes a Continuity of Care Document and returns a list of patient-specific ADE’s. Our database currently contains 534,125 ADE’s from 5602 product labels. An NLP evaluation of 9529 ADE’s showed recall of 93% and precision of 95%. On a trial set of 30 CCD’s, the system provided adverse event data for 88% of drugs and returned these results in an average of 620ms. PMID:21346964
NASA Astrophysics Data System (ADS)
Ding, R.; He, T.
2017-12-01
With the increased popularity in mobile applications and services, there has been a growing demand for more advanced mobile technologies that utilize real-time Location Based Services (LBS) data to support natural hazard response efforts. Compared to traditional sources like the census bureau that often can only provide historical and static data, an LBS service can provide more current data to drive a real-time natural hazard response system to more accurately process and assess issues such as population density in areas impacted by a hazard. However, manually preparing or preprocessing the data to suit the needs of the particular application would be time-consuming. This research aims to implement a population heatmap visual analytics system based on real-time data for natural disaster emergency management. System comprised of a three-layered architecture, including data collection, data processing, and visual analysis layers. Real-time, location-based data meeting certain polymerization conditions are collected from multiple sources across the Internet, then processed and stored in a cloud-based data store. Parallel computing is utilized to provide fast and accurate access to the pre-processed population data based on criteria such as the disaster event and to generate a location-based population heatmap as well as other types of visual digital outputs using auxiliary analysis tools. At present, a prototype system, which geographically covers the entire region of China and combines population heat map based on data from the Earthquake Catalogs database has been developed. It Preliminary results indicate that the generation of dynamic population density heatmaps based on the prototype system has effectively supported rapid earthquake emergency rescue and evacuation efforts as well as helping responders and decision makers to evaluate and assess earthquake damage. Correlation analyses that were conducted revealed that the aggregation and movement of people depended on various factors, including earthquake occurrence time and location of epicenter. This research hopes to continue to build upon the success of the prototype system in order to improve and extend the system to support the analysis of earthquakes and other types of natural hazard events.
A first near real-time seismology-based landquake monitoring system.
Chao, Wei-An; Wu, Yih-Min; Zhao, Li; Chen, Hongey; Chen, Yue-Gau; Chang, Jui-Ming; Lin, Che-Min
2017-03-02
Hazards from gravity-driven instabilities on hillslope (termed 'landquake' in this study) are an important problem facing us today. Rapid detection of landquake events is crucial for hazard mitigation and emergency response. Based on the real-time broadband data in Taiwan, we have developed a near real-time landquake monitoring system, which is a fully automatic process based on waveform inversion that yields source information (e.g., location and mechanism) and identifies the landquake source by examining waveform fitness for different types of source mechanisms. This system has been successfully tested offline using seismic records during the passage of the 2009 Typhoon Morakot in Taiwan and has been in online operation during the typhoon season in 2015. In practice, certain levels of station coverage (station gap < 180°), signal-to-noise ratio (SNR ≥ 5.0), and a threshold of event size (volume >10 6 m 3 and area > 0.20 km 2 ) are required to ensure good performance (fitness > 0.6 for successful source identification) of the system, which can be readily implemented in other places in the world with real-time seismic networks and high landquake activities.
A first near real-time seismology-based landquake monitoring system
Chao, Wei-An; Wu, Yih-Min; Zhao, Li; Chen, Hongey; Chen, Yue-Gau; Chang, Jui-Ming; Lin, Che-Min
2017-01-01
Hazards from gravity-driven instabilities on hillslope (termed ‘landquake’ in this study) are an important problem facing us today. Rapid detection of landquake events is crucial for hazard mitigation and emergency response. Based on the real-time broadband data in Taiwan, we have developed a near real-time landquake monitoring system, which is a fully automatic process based on waveform inversion that yields source information (e.g., location and mechanism) and identifies the landquake source by examining waveform fitness for different types of source mechanisms. This system has been successfully tested offline using seismic records during the passage of the 2009 Typhoon Morakot in Taiwan and has been in online operation during the typhoon season in 2015. In practice, certain levels of station coverage (station gap < 180°), signal-to-noise ratio (SNR ≥ 5.0), and a threshold of event size (volume >106 m3 and area > 0.20 km2) are required to ensure good performance (fitness > 0.6 for successful source identification) of the system, which can be readily implemented in other places in the world with real-time seismic networks and high landquake activities. PMID:28252039
Signal processing methodologies for an acoustic fetal heart rate monitor
NASA Technical Reports Server (NTRS)
Pretlow, Robert A., III; Stoughton, John W.
1992-01-01
Research and development is presented of real time signal processing methodologies for the detection of fetal heart tones within a noise-contaminated signal from a passive acoustic sensor. A linear predictor algorithm is utilized for detection of the heart tone event and additional processing derives heart rate. The linear predictor is adaptively 'trained' in a least mean square error sense on generic fetal heart tones recorded from patients. A real time monitor system is described which outputs to a strip chart recorder for plotting the time history of the fetal heart rate. The system is validated in the context of the fetal nonstress test. Comparisons are made with ultrasonic nonstress tests on a series of patients. Comparative data provides favorable indications of the feasibility of the acoustic monitor for clinical use.
NASA Astrophysics Data System (ADS)
Gómez-Rodríguez, F.; Linares-Barranco, A.; Paz, R.; Miró-Amarante, L.; Jiménez, G.; Civit, A.
2007-05-01
Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity among huge number of neurons located on different chips.[1] By exploiting high speed digital communication circuits (with nano-seconds timing), synaptic neural connections can be time multiplexed, while neural activity signals (with mili-seconds timings) are sampled at low frequencies. Neurons generate "events" according to their activity levels. That is, more active neurons generate more events per unit time and access the interchip communication channel more frequently than neurons with low activity. In Neuromorphic system development, AER brings some advantages to develop real-time image processing system: (1) AER represents the information like time continuous stream not like a frame; (2) AER sends the most important information first (although this depends on the sender); (3) AER allows to process information as soon as it is received. When AER is used in artificial vision field, each pixel is considered like a neuron, so pixel's intensity is represented like a sequence of events; modifying the number and the frequency of these events, it is possible to make some image filtering. In this paper we present four image filters using AER: (a) Noise addition and suppression, (b) brightness modification, (c) single moving object tracking and (d) geometrical transformations (rotation, translation, reduction and magnification). For testing and debugging, we use USB-AER board developed by Robotic and Technology of Computers Applied to Rehabilitation (RTCAR) research group. This board is based on an FPGA, devoted to manage the AER functionality. This board also includes a micro-controlled for USB communication, 2 Mbytes RAM and 2 AER ports (one for input and one for output).
Hu, Guohang; Zhao, Yuanan; Liu, Xiaofeng; Li, Dawei; Xiao, Qiling; Yi, Kui; Shao, Jianda
2013-08-01
A reliable method, combining a wet etch process and real-time damage event imaging during a raster scan laser damage test, has been developed to directly determine the most dangerous precursor inducing low-density laser damage at 355 nm in fused silica. It is revealed that ~16% of laser damage sites were initiated at the place of the scratches, ~49% initiated at the digs, and ~35% initiated at invisible defects. The morphologies of dangerous scratches and digs were compared with those of moderate ones. It is found that local sharp variation at the edge, twist, or inside of a subsurface defect is the most dangerous laser damage precursor.
NASA Astrophysics Data System (ADS)
Mencin, D.; Hodgkinson, K. M.; Mattioli, G. S.
2017-12-01
In support of hazard research and Earthquake Early Warning (EEW) Systems UNAVCO operates approximately 800 RT-GNSS stations throughout western North America and Alaska (EarthScope Plate Boundary Observatory), Mexico (TLALOCNet), and the pan-Caribbean region (COCONet). Our system produces and distributes raw data (BINEX and RTCM3) and real-time Precise Point Positions via the Trimble PIVOT Platform (RTX). The 2017-09-08 earthquake M8.2 located 98 km SSW of Tres Picos, Mexico is the first great earthquake to occur within the UNAVCO RT-GNSS footprint, which allows for a rigorous analysis of our dynamic and static processing methods. The need for rapid geodetic solutions ranges from seconds (EEW systems) to several minutes (Tsunami Warning and NEIC moment tensor and finite fault models). Here, we compare and quantify the relative processing strategies for producing static offsets, moment tensors and geodetically determined finite fault models using data recorded during this event. We also compare the geodetic solutions with the USGS NEIC seismically derived moment tensors and finite fault models, including displacement waveforms generated from these models. We define kinematic post-processed solutions from GIPSY-OASISII (v6.4) with final orbits and clocks as a "best" case reference to evaluate the performance of our different processing strategies. We find that static displacements of a few centimeters or less are difficult to resolve in the real-time GNSS position estimates. The standard daily 24-hour solutions provide the highest-quality data-set to determine coseismic offsets, but these solutions are delayed by at least 48 hours after the event. Dynamic displacements, estimated in real-time, however, show reasonable agreement with final, post-processed position estimates, and while individual position estimates have large errors, the real-time solutions offer an excellent operational option for EEW systems, including the use of estimated peak-ground displacements or directly inverting for finite-fault solutions. In the near-field, we find that the geodetically-derived moment tensors and finite fault models differ significantly with seismically-derived models, highlighting the utility of using geodetic data in hazard applications.
NASA Astrophysics Data System (ADS)
Cahill, Paul; Michalis, Panagiotis; Solman, Hrvoje; Kerin, Igor; Bekic, Damir; Pakrashi, Vikram; McKeogh, Eamon
2017-04-01
With the effects of climate change becoming more apparent, extreme weather events are now occurring with greater frequency throughout the world. Such extreme events have resulted in increased high intensity flood events which are having devastating consequences on hydro-structures, especially on bridge infrastructure. The remote and often inaccessible nature of such bridges makes inspections problematic, a major concern if safety assessments are required during and after extreme flood events. A solution to this is the introduction of smart, low cost sensing solutions at locations susceptible to hydro-hazards. Such solutions can provide real-time information on the health of the bridge and its environments, with such information aiding in the mitigation of the risks associated with extreme weather events. This study presents the development of an intelligent system for remote, real-time monitoring of hydro-hazards to bridge infrastructure. The solution consists of two types of remote monitoring stations which have the capacity to monitor environmental conditions and provide real-time information to a centralized, big data database solution, from which an intelligent decision support system will accommodate the results to control and manage bridge, river and catchment assets. The first device developed as part of the system is the Weather Information Logging Device (WILD), which monitors rainfall, temperature and air and soil moisture content. The ability of the WILD to monitor rainfall in real time enables flood early warning alerts and predictive river flow conditions, thereby enabling decision makers the ability to make timely and effective decisions about critical infrastructures in advance of extreme flood events. The WILD is complemented by a second monitoring device, the Bridge Information Recording Device (BIRD), which monitors water levels at a given location in real-time. The monitoring of water levels of a river allows for, among other applications, hydraulic modelling to assess the likely impact that severe flood events will have on a bridges foundation, particularly due to scour. The process of reading and validating data from the WILD and BIRD buffer servers is outlined, as is the transmission protocol used for the sending of recorded data to a centralized repository for further use and analysis. Finally, the development of a centralized repository for the collection of data from the WILD and BIRD devices is presented. Eventually the big data solution would be used to receive, store and send the monitored data to the hydrological models, whether existing or developed, and the results would be transmitted to the intelligent decision support system based on a web-based platform, for managing, planning and executing data, processes and procedures for bridge assets. The development of intelligent hydroinformatic system is an important tool for the protection of key infrastructure assets from the increasingly common effects of climate change. Acknowledgement The authors wish to acknowledge the financial support of the European Commission, through the Marie Curie Industry-Academia Partnership and Pathways Network BRIDGE SMS (Intelligent Bridge Assessment Maintenance and Management System) - FP7-People-2013-IAPP- 612517.
Hippocampal Processing of Ambiguity Enhances Fear Memory
Amadi, Ugwechi; Lim, Seh Hong; Liu, Elizabeth; Baratta, Michael V.; Goosens, Ki Ann
2016-01-01
Despite the ubiquitous use of Pavlovian fear conditioning as a model for fear learning, the highly predictable conditions used in the laboratory do not resemble real-world conditions, where dangerous situations can lead to unpleasant outcomes in unpredictable ways. Here we varied the timing of aversive events following predictive cues in rodents and discovered that temporal ambiguity of aversive events greatly enhances fear. During fear conditioning with unpredictably timed aversive events, pharmacological inactivation of the dorsal hippocampus or optogenetic silencing of CA1 cells during aversive negative prediction errors prevented this enhancement of fear without impacting fear learning for predictable events. Dorsal hippocampal inactivation also prevented ambiguity-related enhancement of fear during auditory fear conditioning under a partial reinforcement schedule. These results reveal that information about the timing and occurrence of aversive events is rapidly acquired and that unexpectedly timed or omitted aversive events generate hippocampal signals to enhance fear learning. PMID:28182526
Hippocampal Processing of Ambiguity Enhances Fear Memory.
Amadi, Ugwechi; Lim, Seh Hong; Liu, Elizabeth; Baratta, Michael V; Goosens, Ki A
2017-02-01
Despite the ubiquitous use of Pavlovian fear conditioning as a model for fear learning, the highly predictable conditions used in the laboratory do not resemble real-world conditions, in which dangerous situations can lead to unpleasant outcomes in unpredictable ways. In the current experiments, we varied the timing of aversive events after predictive cues in rodents and discovered that temporal ambiguity of aversive events greatly enhances fear. During fear conditioning with unpredictably timed aversive events, pharmacological inactivation of the dorsal hippocampus or optogenetic silencing of cornu ammonis 1 cells during aversive negative prediction errors prevented this enhancement of fear without affecting fear learning for predictable events. Dorsal hippocampal inactivation also prevented ambiguity-related enhancement of fear during auditory fear conditioning under a partial-reinforcement schedule. These results reveal that information about the timing and occurrence of aversive events is rapidly acquired and that unexpectedly timed or omitted aversive events generate hippocampal signals to enhance fear learning.
Fitzharris, Michael; Liu, Sara; Stephens, Amanda N; Lenné, Michael G
2017-05-29
Real-time driver monitoring systems represent a solution to address key behavioral risks as they occur, particularly distraction and fatigue. The efficacy of these systems in real-world settings is largely unknown. This article has three objectives: (1) to document the incidence and duration of fatigue in real-world commercial truck-driving operations, (2) to determine the reduction, if any, in the incidence of fatigue episodes associated with providing feedback, and (3) to tease apart the relative contribution of in-cab warnings from 24/7 monitoring and feedback to employers. Data collected from a commercially available in-vehicle camera-based driver monitoring system installed in a commercial truck fleet operating in Australia were analyzed. The real-time driver monitoring system makes continuous assessments of driver drowsiness based on eyelid position and other factors. Data were collected in a baseline period where no feedback was provided to drivers. Real-time feedback to drivers then occurred via in-cab auditory and haptic warnings, which were further enhanced by direct feedback by company management when fatigue events were detected by external 24/7 monitors. Fatigue incidence rates and their timing of occurrence across the three time periods were compared. Relative to no feedback being provided to drivers when fatigue events were detected, in-cab warnings resulted in a 66% reduction in fatigue events, with a 95% reduction achieved by the real-time provision of direct feedback in addition to in-cab warnings (p < 0.01). With feedback, fatigue events were shorter in duration a d occurred later in the trip, and fewer drivers had more than one verified fatigue event per trip. That the provision of feedback to the company on driver fatigue events in real time provides greater benefit than feedback to the driver alone has implications for companies seeking to mitigate risks associated with fatigue. Having fewer fatigue events is likely a reflection of the device itself and the accompanying safety culture of the company in terms of how the information is used. Data were analysed on a per-truck trip basis, and the findings are indicative of fatigue events in a large-scale commercial transport fleet. Future research ought to account for individual driver performance, which was not possible with the available data in this retrospective analysis. Evidence that real-time driver monitoring feedback is effective in reducing fatigue events is invaluable in the development of fleet safety policies, and of future national policy and vehicle safety regulations. Implications for automotive driver monitoring are discussed.
Assimilation of Real-Time Satellite And Human Sensor Networks for Modeling Natural Disasters
NASA Astrophysics Data System (ADS)
Aulov, O.; Halem, M.; Lary, D. J.
2011-12-01
We describe the development of underlying technologies needed to address the merging of a web of real time satellite sensor Web (SSW) and Human Sensor Web (HSW) needed to augment the US response to extreme events. As an initial prototyping step and use case scenario, we consider the development of two major system tools that can be transitioned from research to the responding operational agency for mitigating coastal oil spills. These tools consist of the capture of Situation Aware (SA) Social Media (SM) Data, and assimilation of the processed information into forecasting models to provide incident decision managers with interactive virtual spatial temporal animations superimposed with probabilistic data estimates. The system methodologies are equally applicable to the wider class of extreme events such as plume dispersions from volcanoes or massive fires, major floods, hurricane impacts, radioactive isotope dispersions from nuclear accidents, etc. A successful feasibility demonstration of this technology has been shown in the case of the Deepwater Horizon Oil Spill where Human Sensor Networks have been combined with a geophysical model to perform parameter assessments. Flickr images of beached oil were mined from the spill area, geolocated and timestamped and converted into geophysical data. This data was incorporated into General NOAA Operational Modeling Environment (GNOME), a Lagrangian forecast model that uses near real-time surface winds, ocean currents, and satellite shape profiles of oil to generate a forecast of plume movement. As a result, improved estimates of diffusive coefficients and rates of oil spill were determined. Current approaches for providing satellite derived oil distributions are collected from a satellite sensor web of operational and research sensors from many countries, and a manual analysis is performed by NESDIS. A real time SA HSW processing system based on geolocated SM data from sources such as Twitter, Flickr, YouTube etc., greatly supplements the current operational practice of sending out teams of humans to gather samples of tarballs reaching coastal locations. We show that ensemble Kalman filter assimilation of the combination of SM data with model forecast background data fields can minimize the false positive cases of satellite observations alone. Our future framework consists of two parts, a real time SA HSW processing system and an on-demand SSW processing system. HSW processing system uses a geolocated SM data to provide observations of coastal oil contact. SSW system is composed of selected instruments from NASA EOS, NPP and available Decadal Survey mission satellites along with other in situ data to form a real time regional oil spill observing system. We will automate the NESDIS manual process of providing oil spill maps by using Self Organizing Feature Map (SOFM) algorithm. We use the LETKF scheme for assimilating the satellite sensor web and HSW observations into the GNOME model to reduce the uncertainty of the observations. We intend to infuse these developments in an SOA implementation for execution of event driven model forecast assimilation cycles in a dedicated HPC cloud.
Xu, Wenzhao; Collingsworth, Paris D.; Bailey, Barbara; Carlson Mazur, Martha L.; Schaeffer, Jeff; Minsker, Barbara
2017-01-01
This paper proposes a geospatial analysis framework and software to interpret water-quality sampling data from towed undulating vehicles in near-real time. The framework includes data quality assurance and quality control processes, automated kriging interpolation along undulating paths, and local hotspot and cluster analyses. These methods are implemented in an interactive Web application developed using the Shiny package in the R programming environment to support near-real time analysis along with 2- and 3-D visualizations. The approach is demonstrated using historical sampling data from an undulating vehicle deployed at three rivermouth sites in Lake Michigan during 2011. The normalized root-mean-square error (NRMSE) of the interpolation averages approximately 10% in 3-fold cross validation. The results show that the framework can be used to track river plume dynamics and provide insights on mixing, which could be related to wind and seiche events.
Development of a Real-Time Pulse Processing Algorithm for TES-Based X-Ray Microcalorimeters
NASA Technical Reports Server (NTRS)
Tan, Hui; Hennig, Wolfgang; Warburton, William K.; Doriese, W. Bertrand; Kilbourne, Caroline A.
2011-01-01
We report here a real-time pulse processing algorithm for superconducting transition-edge sensor (TES) based x-ray microcalorimeters. TES-based. microca1orimeters offer ultra-high energy resolutions, but the small volume of each pixel requires that large arrays of identical microcalorimeter pixe1s be built to achieve sufficient detection efficiency. That in turn requires as much pulse processing as possible must be performed at the front end of readout electronics to avoid transferring large amounts of data to a host computer for post-processing. Therefore, a real-time pulse processing algorithm that not only can be implemented in the readout electronics but also achieve satisfactory energy resolutions is desired. We have developed an algorithm that can be easily implemented. in hardware. We then tested the algorithm offline using several data sets acquired with an 8 x 8 Goddard TES x-ray calorimeter array and 2x16 NIST time-division SQUID multiplexer. We obtained an average energy resolution of close to 3.0 eV at 6 keV for the multiplexed pixels while preserving over 99% of the events in the data sets.
Whisper: Tracing the Spatiotemporal Process of Information Diffusion in Real Time.
Cao, Nan; Lin, Yu-Ru; Sun, Xiaohua; Lazer, D; Liu, Shixia; Qu, Huamin
2012-12-01
When and where is an idea dispersed? Social media, like Twitter, has been increasingly used for exchanging information, opinions and emotions about events that are happening across the world. Here we propose a novel visualization design, "Whisper", for tracing the process of information diffusion in social media in real time. Our design highlights three major characteristics of diffusion processes in social media: the temporal trend, social-spatial extent, and community response of a topic of interest. Such social, spatiotemporal processes are conveyed based on a sunflower metaphor whose seeds are often dispersed far away. In Whisper, we summarize the collective responses of communities on a given topic based on how tweets were retweeted by groups of users, through representing the sentiments extracted from the tweets, and tracing the pathways of retweets on a spatial hierarchical layout. We use an efficient flux line-drawing algorithm to trace multiple pathways so the temporal and spatial patterns can be identified even for a bursty event. A focused diffusion series highlights key roles such as opinion leaders in the diffusion process. We demonstrate how our design facilitates the understanding of when and where a piece of information is dispersed and what are the social responses of the crowd, for large-scale events including political campaigns and natural disasters. Initial feedback from domain experts suggests promising use for today's information consumption and dispersion in the wild.
Nakamura, Kosuke; Akiyama, Hiroshi; Kawano, Noriaki; Kobayashi, Tomoko; Yoshimatsu, Kayo; Mano, Junichi; Kitta, Kazumi; Ohmori, Kiyomi; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko
2013-12-01
Genetically modified (GM) rice (Oryza sativa) lines, such as insecticidal Kefeng and Kemingdao, have been developed and found unauthorised in processed rice products in many countries. Therefore, qualitative detection methods for the GM rice are required for the GM food regulation. A transgenic construct for expressing cowpea (Vigna unguiculata) trypsin inhibitor (CpTI) was detected in some imported processed rice products contaminated with Kemingdao. The 3' terminal sequence of the identified transgenic construct for expression of CpTI included an endoplasmic reticulum retention signal coding sequence (KDEL) and nopaline synthase terminator (T-nos). The sequence was identical to that in a report on Kefeng. A novel construct-specific real-time polymerase chain reaction (PCR) detection method for detecting the junction region sequence between the CpTI-KDEL and T-nos was developed. The imported processed rice products were evaluated for the contamination of the GM rice using the developed construct-specific real-time PCR methods, and detection frequency was compared with five event-specific detection methods. The construct-specific detection methods detected the GM rice at higher frequency than the event-specific detection methods. Therefore, we propose that the construct-specific detection method is a beneficial tool for screening the contamination of GM rice lines, such as Kefeng, in processed rice products for the GM food regulation. Copyright © 2013 Elsevier Ltd. All rights reserved.
Gauss-Seidel Iterative Method as a Real-Time Pile-Up Solver of Scintillation Pulses
NASA Astrophysics Data System (ADS)
Novak, Roman; Vencelj, Matja¿
2009-12-01
The pile-up rejection in nuclear spectroscopy has been confronted recently by several pile-up correction schemes that compensate for distortions of the signal and subsequent energy spectra artifacts as the counting rate increases. We study here a real-time capability of the event-by-event correction method, which at the core translates to solving many sets of linear equations. Tight time limits and constrained front-end electronics resources make well-known direct solvers inappropriate. We propose a novel approach based on the Gauss-Seidel iterative method, which turns out to be a stable and cost-efficient solution to improve spectroscopic resolution in the front-end electronics. We show the method convergence properties for a class of matrices that emerge in calorimetric processing of scintillation detector signals and demonstrate the ability of the method to support the relevant resolutions. The sole iteration-based error component can be brought below the sliding window induced errors in a reasonable number of iteration steps, thus allowing real-time operation. An area-efficient hardware implementation is proposed that fully utilizes the method's inherent parallelism.
Failure Forecasting in Triaxially Stressed Sandstones
NASA Astrophysics Data System (ADS)
Crippen, A.; Bell, A. F.; Curtis, A.; Main, I. G.
2017-12-01
Precursory signals to fracturing events have been observed to follow power-law accelerations in spatial, temporal, and size distributions leading up to catastrophic failure. In previous studies this behavior was modeled using Voight's relation of a geophysical precursor in order to perform `hindcasts' by solving for failure onset time. However, performing this analysis in retrospect creates a bias, as we know an event happened, when it happened, and we can search data for precursors accordingly. We aim to remove this retrospective bias, thereby allowing us to make failure forecasts in real-time in a rock deformation laboratory. We triaxially compressed water-saturated 100 mm sandstone cores (Pc= 25MPa, Pp = 5MPa, σ = 1.0E-5 s-1) to the point of failure while monitoring strain rate, differential stress, AEs, and continuous waveform data. Here we compare the current `hindcast` methods on synthetic and our real laboratory data. We then apply these techniques to increasing fractions of the data sets to observe the evolution of the failure forecast time with precursory data. We discuss these results as well as our plan to mitigate false positives and minimize errors for real-time application. Real-time failure forecasting could revolutionize the field of hazard mitigation of brittle failure processes by allowing non-invasive monitoring of civil structures, volcanoes, and possibly fault zones.
Novel Real-time Alignment and Calibration of the LHCb detector in Run2
NASA Astrophysics Data System (ADS)
Martinelli, Maurizio;
2017-10-01
LHCb has introduced a novel real-time detector alignment and calibration strategy for LHC Run2. Data collected at the start of the fill are processed in a few minutes and used to update the alignment parameters, while the calibration constants are evaluated for each run. This procedure improves the quality of the online reconstruction. For example, the vertex locator is retracted and reinserted for stable beam conditions in each fill to be centred on the primary vertex position in the transverse plane. Consequently its position changes on a fill-by-fill basis. Critically, this new real-time alignment and calibration procedure allows identical constants to be used in the online and offline reconstruction, thus improving the correlation between triggered and offline-selected events. This offers the opportunity to optimise the event selection in the trigger by applying stronger constraints. The required computing time constraints are met thanks to a new dedicated framework using the multi-core farm infrastructure for the trigger. The motivation for a real-time alignment and calibration of the LHCb detector is discussed from both the operational and physics performance points of view. Specific challenges of this novel configuration are discussed, as well as the working procedures of the framework and its performance.
ERIC Educational Resources Information Center
Banaschewski, Tobias; Brandeis, Daniel
2007-01-01
Background: Monitoring brain processes in real time requires genuine subsecond resolution to follow the typical timing and frequency of neural events. Non-invasive recordings of electric (EEG/ERP) and magnetic (MEG) fields provide this time resolution. They directly measure neural activations associated with a wide variety of brain states and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirkham, R.; Siddons, D.; Dunn, P.A.
2010-06-23
The Maia detector system is engineered for energy dispersive x-ray fluorescence spectroscopy and elemental imaging at photon rates exceeding 10{sup 7}/s, integrated scanning of samples for pixel transit times as small as 50 {micro}s and high definition images of 10{sup 8} pixels and real-time processing of detected events for spectral deconvolution and online display of pure elemental images. The system developed by CSIRO and BNL combines a planar silicon 384 detector array, application-specific integrated circuits for pulse shaping and peak detection and sampling and optical data transmission to an FPGA-based pipelined, parallel processor. This paper describes the system and themore » underpinning engineering solutions.« less
NASA Astrophysics Data System (ADS)
Edwards, A. W.; Blackler, K.; Gill, R. D.; van der Goot, E.; Holm, J.
1990-10-01
Based upon the experience gained with the present soft x-ray data acquisition system, new techniques are being developed which make extensive use of digital signal processors (DSPs). Digital filters make 13 further frequencies available in real time from the input sampling frequency of 200 kHz. In parallel, various algorithms running on further DSPs generate triggers in response to a range of events in the plasma. The sawtooth crash can be detected, for example, with a delay of only 50 μs from the onset of the collapse. The trigger processor interacts with the digital filter boards to ensure data of the appropriate frequency is recorded throughout a plasma discharge. An independent link is used to pass 780 and 24 Hz filtered data to a network of transputers. A full tomographic inversion and display of the 24 Hz data is carried out in real time using this 15 transputer array. The 780 Hz data are stored for immediate detailed playback following the pulse. Such a system could considerably improve the quality of present plasma diagnostic data which is, in general, sampled at one fixed frequency throughout a discharge. Further, it should provide valuable information towards designing diagnostic data acquisition systems for future long pulse operation machines when a high degree of real-time processing will be required, while retaining the ability to detect, record, and analyze events of interest within such long plasma discharges.
Real-Time Mapping alert system; user's manual
Torres, L.A.
1996-01-01
The U.S. Geological Survey has an extensive hydrologic network that records and transmits precipitation, stage, discharge, and other water- related data on a real-time basis to an automated data processing system. Data values are recorded on electronic data collection platforms at field monitoring sites. These values are transmitted by means of orbiting satellites to receiving ground stations, and by way of telecommunication lines to a U.S. Geological Survey office where they are processed on a computer system. Data that exceed predefined thresholds are identified as alert values. These alert values can help keep water- resource specialists informed of current hydrologic conditions. The current alert status at monitoring sites is of critical importance during floods, hurricanes, and other extreme hydrologic events where quick analysis of the situation is needed. This manual provides instructions for using the Real-Time Mapping software, a series of computer programs developed by the U.S. Geological Survey for quick analysis of hydrologic conditions, and guides users through a basic interactive session. The software provides interactive graphics display and query of real-time information in a map-based, menu-driven environment.
Daamen, Ruby C.; Edwin A. Roehl, Jr.; Conrads, Paul
2010-01-01
A technology often used for industrial applications is “inferential sensor.” Rather than installing a redundant sensor to measure a process, such as an additional waterlevel gage, an inferential sensor, or virtual sensor, is developed that estimates the processes measured by the physical sensor. The advantage of an inferential sensor is that it provides a redundant signal to the sensor in the field but without exposure to environmental threats. In the event that a gage does malfunction, the inferential sensor provides an estimate for the period of missing data. The inferential sensor also can be used in the quality assurance and quality control of the data. Inferential sensors for gages in the EDEN network are currently (2010) under development. The inferential sensors will be automated so that the real-time EDEN data will continuously be compared to the inferential sensor signal and digital reports of the status of the real-time data will be sent periodically to the appropriate support personnel. The development and application of inferential sensors is easily transferable to other real-time hydrologic monitoring networks.
Real-time Retrieving Atmospheric Parameters from Multi-GNSS Constellations
NASA Astrophysics Data System (ADS)
Li, X.; Zus, F.; Lu, C.; Dick, G.; Ge, M.; Wickert, J.; Schuh, H.
2016-12-01
The multi-constellation GNSS (e.g. GPS, GLONASS, Galileo, and BeiDou) bring great opportunities and challenges for real-time retrieval of atmospheric parameters for supporting numerical weather prediction (NWP) nowcasting or severe weather event monitoring. In this study, the observations from different GNSS are combined together for atmospheric parameter retrieving based on the real-time precise point positioning technique. The atmospheric parameters retrieved from multi-GNSS observations, including zenith total delay (ZTD), integrated water vapor (IWV), horizontal gradient (especially high-resolution gradient estimates) and slant total delay (STD), are carefully analyzed and evaluated by using the VLBI, radiosonde, water vapor radiometer and numerical weather model to independently validate the performance of individual GNSS and also demonstrate the benefits of multi-constellation GNSS for real-time atmospheric monitoring. Numerous results show that the multi-GNSS processing can provide real-time atmospheric products with higher accuracy, stronger reliability and better distribution, which would be beneficial for atmospheric sounding systems, especially for nowcasting of extreme weather.
EARLINET: potential operationality of a research network
NASA Astrophysics Data System (ADS)
Sicard, M.; D'Amico, G.; Comerón, A.; Mona, L.; Alados-Arboledas, L.; Amodeo, A.; Baars, H.; Baldasano, J. M.; Belegante, L.; Binietoglou, I.; Bravo-Aranda, J. A.; Fernández, A. J.; Fréville, P.; García-Vizcaíno, D.; Giunta, A.; Granados-Muñoz, M. J.; Guerrero-Rascado, J. L.; Hadjimitsis, D.; Haefele, A.; Hervo, M.; Iarlori, M.; Kokkalis, P.; Lange, D.; Mamouri, R. E.; Mattis, I.; Molero, F.; Montoux, N.; Muñoz, A.; Muñoz Porcar, C.; Navas-Guzmán, F.; Nicolae, D.; Nisantzi, A.; Papagiannopoulos, N.; Papayannis, A.; Pereira, S.; Preißler, J.; Pujadas, M.; Rizi, V.; Rocadenbosch, F.; Sellegri, K.; Simeonov, V.; Tsaknakis, G.; Wagner, F.; Pappalardo, G.
2015-11-01
In the framework of ACTRIS (Aerosols, Clouds, and Trace Gases Research Infrastructure Network) summer 2012 measurement campaign (8 June-17 July 2012), EARLINET organized and performed a controlled exercise of feasibility to demonstrate its potential to perform operational, coordinated measurements and deliver products in near-real time. Eleven lidar stations participated in the exercise which started on 9 July 2012 at 06:00 UT and ended 72 h later on 12 July at 06:00 UT. For the first time, the single calculus chain (SCC) - the common calculus chain developed within EARLINET for the automatic evaluation of lidar data from raw signals up to the final products - was used. All stations sent in real-time measurements of a 1 h duration to the SCC server in a predefined netcdf file format. The pre-processing of the data was performed in real time by the SCC, while the optical processing was performed in near-real time after the exercise ended. 98 and 79 % of the files sent to SCC were successfully pre-processed and processed, respectively. Those percentages are quite large taking into account that no cloud screening was performed on the lidar data. The paper draws present and future SCC users' attention to the most critical parameters of the SCC product configuration and their possible optimal value but also to the limitations inherent to the raw data. The continuous use of SCC direct and derived products in heterogeneous conditions is used to demonstrate two potential applications of EARLINET infrastructure: the monitoring of a Saharan dust intrusion event and the evaluation of two dust transport models. The efforts made to define the measurements protocol and to configure properly the SCC pave the way for applying this protocol for specific applications such as the monitoring of special events, atmospheric modeling, climate research and calibration/validation activities of spaceborne observations.
Alsina-Pagès, Rosa Ma; Navarro, Joan; Alías, Francesc; Hervás, Marcos
2017-04-13
The consistent growth in human life expectancy during the recent years has driven governments and private organizations to increase the efforts in caring for the eldest segment of the population. These institutions have built hospitals and retirement homes that have been rapidly overfilled, making their associated maintenance and operating costs prohibitive. The latest advances in technology and communications envisage new ways to monitor those people with special needs at their own home, increasing their quality of life in a cost-affordable way. The purpose of this paper is to present an Ambient Assisted Living (AAL) platform able to analyze, identify, and detect specific acoustic events happening in daily life environments, which enables the medic staff to remotely track the status of every patient in real-time. Additionally, this tele-care proposal is validated through a proof-of-concept experiment that takes benefit of the capabilities of the NVIDIA Graphical Processing Unit running on a Jetson TK1 board to locally detect acoustic events. Conducted experiments demonstrate the feasibility of this approach by reaching an overall accuracy of 82% when identifying a set of 14 indoor environment events related to the domestic surveillance and patients' behaviour monitoring field. Obtained results encourage practitioners to keep working in this direction, and enable health care providers to remotely track the status of their patients in real-time with non-invasive methods.
Alsina-Pagès, Rosa Ma; Navarro, Joan; Alías, Francesc; Hervás, Marcos
2017-01-01
The consistent growth in human life expectancy during the recent years has driven governments and private organizations to increase the efforts in caring for the eldest segment of the population. These institutions have built hospitals and retirement homes that have been rapidly overfilled, making their associated maintenance and operating costs prohibitive. The latest advances in technology and communications envisage new ways to monitor those people with special needs at their own home, increasing their quality of life in a cost-affordable way. The purpose of this paper is to present an Ambient Assisted Living (AAL) platform able to analyze, identify, and detect specific acoustic events happening in daily life environments, which enables the medic staff to remotely track the status of every patient in real-time. Additionally, this tele-care proposal is validated through a proof-of-concept experiment that takes benefit of the capabilities of the NVIDIA Graphical Processing Unit running on a Jetson TK1 board to locally detect acoustic events. Conducted experiments demonstrate the feasibility of this approach by reaching an overall accuracy of 82% when identifying a set of 14 indoor environment events related to the domestic surveillance and patients’ behaviour monitoring field. Obtained results encourage practitioners to keep working in this direction, and enable health care providers to remotely track the status of their patients in real-time with non-invasive methods. PMID:28406459
Real-time earthquake source imaging: An offline test for the 2011 Tohoku earthquake
NASA Astrophysics Data System (ADS)
Zhang, Yong; Wang, Rongjiang; Zschau, Jochen; Parolai, Stefano; Dahm, Torsten
2014-05-01
In recent decades, great efforts have been expended in real-time seismology aiming at earthquake and tsunami early warning. One of the most important issues is the real-time assessment of earthquake rupture processes using near-field seismogeodetic networks. Currently, earthquake early warning systems are mostly based on the rapid estimate of P-wave magnitude, which contains generally large uncertainties and the known saturation problem. In the case of the 2011 Mw9.0 Tohoku earthquake, JMA (Japan Meteorological Agency) released the first warning of the event with M7.2 after 25 s. The following updates of the magnitude even decreased to M6.3-6.6. Finally, the magnitude estimate stabilized at M8.1 after about two minutes. This led consequently to the underestimated tsunami heights. By using the newly developed Iterative Deconvolution and Stacking (IDS) method for automatic source imaging, we demonstrate an offline test for the real-time analysis of the strong-motion and GPS seismograms of the 2011 Tohoku earthquake. The results show that we had been theoretically able to image the complex rupture process of the 2011 Tohoku earthquake automatically soon after or even during the rupture process. In general, what had happened on the fault could be robustly imaged with a time delay of about 30 s by using either the strong-motion (KiK-net) or the GPS (GEONET) real-time data. This implies that the new real-time source imaging technique is helpful to reduce false and missing warnings, and therefore should play an important role in future tsunami early warning and earthquake rapid response systems.
Fine grained event processing on HPCs with the ATLAS Yoda system
NASA Astrophysics Data System (ADS)
Calafiura, Paolo; De, Kaushik; Guan, Wen; Maeno, Tadashi; Nilsson, Paul; Oleynik, Danila; Panitkin, Sergey; Tsulaia, Vakhtang; Van Gemmeren, Peter; Wenaus, Torre
2015-12-01
High performance computing facilities present unique challenges and opportunities for HEP event processing. The massive scale of many HPC systems means that fractionally small utilization can yield large returns in processing throughput. Parallel applications which can dynamically and efficiently fill any scheduling opportunities the resource presents benefit both the facility (maximal utilization) and the (compute-limited) science. The ATLAS Yoda system provides this capability to HEP-like event processing applications by implementing event-level processing in an MPI-based master-client model that integrates seamlessly with the more broadly scoped ATLAS Event Service. Fine grained, event level work assignments are intelligently dispatched to parallel workers to sustain full utilization on all cores, with outputs streamed off to destination object stores in near real time with similarly fine granularity, such that processing can proceed until termination with full utilization. The system offers the efficiency and scheduling flexibility of preemption without requiring the application actually support or employ check-pointing. We will present the new Yoda system, its motivations, architecture, implementation, and applications in ATLAS data processing at several US HPC centers.
Real-Time and Near Real-Time Data for Space Weather Applications and Services
NASA Astrophysics Data System (ADS)
Singer, H. J.; Balch, C. C.; Biesecker, D. A.; Matsuo, T.; Onsager, T. G.
2015-12-01
Space weather can be defined as conditions in the vicinity of Earth and in the interplanetary environment that are caused primarily by solar processes and influenced by conditions on Earth and its atmosphere. Examples of space weather are the conditions that result from geomagnetic storms, solar particle events, and bursts of intense solar flare radiation. These conditions can have impacts on modern-day technologies such as GPS or electric power grids and on human activities such as astronauts living on the International Space Station or explorers traveling to the moon or Mars. While the ultimate space weather goal is accurate prediction of future space weather conditions, for many applications and services, we rely on real-time and near-real time observations and model results for the specification of current conditions. In this presentation, we will describe the space weather system and the need for real-time and near-real time data that drive the system, characterize conditions in the space environment, and are used by models for assimilation and validation. Currently available data will be assessed and a vision for future needs will be given. The challenges for establishing real-time data requirements, as well as acquiring, processing, and disseminating the data will be described, including national and international collaborations. In addition to describing how the data are used for official government products, we will also give examples of how these data are used by both the public and private sector for new applications that serve the public.
NASA Astrophysics Data System (ADS)
Berni, Nicola; Brocca, Luca; Barbetta, Silvia; Pandolfo, Claudia; Stelluti, Marco; Moramarco, Tommaso
2014-05-01
The Italian national hydro-meteorological early warning system is composed by 21 regional offices (Functional Centres, CF). Umbria Region (central Italy) CF provides early warning for floods and landslides, real-time monitoring and decision support systems (DSS) for the Civil Defence Authorities when significant events occur. The alert system is based on hydrometric and rainfall thresholds with detailed procedures for the management of critical events in which different roles of authorities and institutions involved are defined. The real-time flood forecasting system is based also on different hydrological and hydraulic forecasting models. Among these, the MISDc rainfall-runoff model ("Modello Idrologico SemiDistribuito in continuo"; Brocca et al., 2011) and the flood routing model named STAFOM-RCM (STAge Forecasting Model-Rating Curve Model; Barbetta et al., 2014) are continuously operative in real-time providing discharge and stage forecasts, respectively, with lead-times up to 24 hours (when quantitative precipitation forecasts are used) in several gauged river sections in the Upper-Middle Tiber River basin. Models results are published in real-time in the open source CF web platform: www.cfumbria.it. MISDc provides discharge and soil moisture forecasts for different sub-basins while STAFOM-RCM provides stage forecasts at hydrometric sections. Moreover, through STAFOM-RCM the uncertainty of the forecast stage hydrograph is provided in terms of 95% Confidence Interval (CI) assessed by analyzing the statistical properties of model output in terms of lateral. In the period 10th-12th November 2013, a severe flood event occurred in Umbria mainly affecting the north-eastern area and causing significant economic damages, but fortunately no casualties. The territory was interested by intense and persistent rainfall; the hydro-meteorological monitoring network recorded locally rainfall depth over 400 mm in 72 hours. In the most affected area, the recorded rainfall depths correspond approximately to a return period of 200 years. Most rivers in Umbria have been involved, exceeding hydrometric thresholds and causing flooding (e.g. Chiascio river). The flood event was continuously monitored at the Umbria Region CF and the possible evolution predicted and assessed on the basis of the model forecasts. The predictions provided by MISDc and STAFOM-RCM were found useful to support real-time decision-making addressed to flood risk management. Moreover, the quantification of the uncertainty affecting the deterministic forecast stages was found consistent with the level of confidence selected and had practical utility corroborating the need of coupling deterministic forecast and 'uncertainty' when the model output is used to support decisions about flood management. REFERENCES Barbetta, S., Moramarco, T., Brocca, L., Franchini, M., Melone, F. (2014). Confidence interval of real-time forecast stages provided by the STAFOM-RCM model: the case study of the Tiber River (Italy). Hydrological Processes, 28(3), 729-743. Brocca, L., Melone, F., Moramarco, T. (2011). Distributed rainfall-runoff modelling for flood frequency estimation and flood forecasting. Hydrological Processes, 25 (18), 2801-2813
Monitoring biodiesel reactions of soybean oil and sunflower oil using ultrasonic parameters
NASA Astrophysics Data System (ADS)
Figueiredo, M. K. K.; Silva, C. E. R.; Alvarenga, A. V.; Costa-Félix, R. P. B.
2015-01-01
Biodiesel is an innovation that attempts to substitute diesel oil with biomass. The aim of this paper is to show the development of a real-time method to monitor transesterification reactions by using low-power ultrasound and pulse/echo techniques. The results showed that it is possible to identify different events during the transesterification process by using the proposed parameters, showing that the proposed method is a feasible way to monitor the reactions of biodiesel during its fabrication, in real time, and with relatively low- cost equipment.
NASA Astrophysics Data System (ADS)
Furlong, K. P.; Whitlock, J. S.; Benz, H. M.
2002-12-01
Earthquakes occur globally, on a regular but (as yet) non-predictable basis, and their effects are both dramatic and often devastating. Additionally they serve as a primary tool to image the earth and define the active processes that drive tectonics. As a result, earthquakes can be an extremely effective tool for helping students to learn about active earth processes, natural hazards, and the myriad of issues that arise with non-predictable but potentially devastating natural events. We have developed and implemented a real-time earthquake alert system (EAS) built on the USGS Earthworm system to bring earthquakes into the classroom. Through our EAS, students in our General Education class on Natural Hazards (Earth101 - Natural Disasters: Hollywood vs. Reality) participate in earthquake response activities in ways similar to earthquake hazard professionals - they become part of the response to the event. Our implementation of the Earthworm system allows our students to be paged via cell-phone text messaging (Yes, we provide cell phones to the 'duty seismologists'), and they respond to those pages as appropriate for their role. A parallel web server is maintained that provides the earthquake details (location maps, waveforms etc.) and students produce time-critical output such as news releases, analyses of earthquake trends in the region, and reports detailing implications of the events. Since this is a course targeted at non-science majors, we encourage that they bring their own expertise into the analyses. For example, business of economic majors may investigate the economic impacts of an earthquake, secondary education majors may work on teaching modules based on the information they gather etc. Since the students know that they are responding to real events they develop ownership of the information they gather and they recognize the value of real-time response. Our educational goals in developing this system include: (1) helping students develop a sense of the global distribution and impact of natural hazards, and the implications of non-predictable events; (2) encouraging students to think about how understanding science related events can be crucially important in analyzing societal issues; and (3) developing an approach to understanding important earth science topics in a way in which students 'own' their data, and are entrained into thinking about linkages between science and society. Finally, systems such as our real-time earthquake alert system take science out of the classroom and into the students lives. What better way to broaden the discussion of science, and bring earth science issues to center stage then to have a student receive an earthquake alert when she is socializing on a Friday evening at a campus hangout!
Video enhancement workbench: an operational real-time video image processing system
NASA Astrophysics Data System (ADS)
Yool, Stephen R.; Van Vactor, David L.; Smedley, Kirk G.
1993-01-01
Video image sequences can be exploited in real-time, giving analysts rapid access to information for military or criminal investigations. Video-rate dynamic range adjustment subdues fluctuations in image intensity, thereby assisting discrimination of small or low- contrast objects. Contrast-regulated unsharp masking enhances differentially shadowed or otherwise low-contrast image regions. Real-time removal of localized hotspots, when combined with automatic histogram equalization, may enhance resolution of objects directly adjacent. In video imagery corrupted by zero-mean noise, real-time frame averaging can assist resolution and location of small or low-contrast objects. To maximize analyst efficiency, lengthy video sequences can be screened automatically for low-frequency, high-magnitude events. Combined zoom, roam, and automatic dynamic range adjustment permit rapid analysis of facial features captured by video cameras recording crimes in progress. When trying to resolve small objects in murky seawater, stereo video places the moving imagery in an optimal setting for human interpretation.
NASA Astrophysics Data System (ADS)
Furlong, K. P.; Benz, H.; Hayes, G. P.; Villasenor, A.
2010-12-01
Although most would agree that the occurrence of natural disaster events such as earthquakes, volcanic eruptions, and floods can provide effective learning opportunities for natural hazards-based courses, implementing compelling materials into the large-enrollment classroom environment can be difficult. These natural hazard events derive much of their learning potential from their real-time nature, and in the modern 24/7 news-cycle where all but the most devastating events are quickly out of the public eye, the shelf life for an event is quite limited. To maximize the learning potential of these events requires that both authoritative information be available and course materials be generated as the event unfolds. Although many events such as hurricanes, flooding, and volcanic eruptions provide some precursory warnings, and thus one can prepare background materials to place the main event into context, earthquakes present a particularly confounding situation of providing no warning, but where context is critical to student learning. Attempting to implement real-time materials into large enrollment classes faces the additional hindrance of limited internet access (for students) in most lecture classrooms. In Earth 101 Natural Disasters: Hollywood vs Reality, taught as a large enrollment (150+ students) general education course at Penn State, we are collaborating with the USGS’s National Earthquake Information Center (NEIC) to develop efficient means to incorporate their real-time products into learning activities in the lecture hall environment. Over time (and numerous events) we have developed a template for presenting USGS-produced real-time information in lecture mode. The event-specific materials can be quickly incorporated and updated, along with key contextual materials, to provide students with up-to-the-minute current information. In addition, we have also developed in-class activities, such as student determination of population exposure to severe ground shaking (i.e. simulating the USGS PAGER product), tsunami warning calculations, and building damage analyses that allow the students to participate in realistic hazard analyses as the event unfolds. Examples of these templates and activities will be presented. Key to the successful implementation of real-time materials is sufficient flexibility and adaptability in the course syllabus.
The ATLAS Level-1 Topological Trigger performance in Run 2
NASA Astrophysics Data System (ADS)
Riu, Imma; ATLAS Collaboration
2017-10-01
The Level-1 trigger is the first event rate reducing step in the ATLAS detector trigger system, with an output rate of up to 100 kHz and decision latency smaller than 2.5 μs. During the LHC shutdown after Run 1, the Level-1 trigger system was upgraded at hardware, firmware and software levels. In particular, a new electronics sub-system was introduced in the real-time data processing path: the Level-1 Topological trigger system. It consists of a single electronics shelf equipped with two Level-1 Topological processor blades. They receive real-time information from the Level-1 calorimeter and muon triggers, which is processed to measure angles between trigger objects, invariant masses or other kinematic variables. Complementary to other requirements, these measurements are taken into account in the final Level-1 trigger decision. The system was installed and commissioning started in 2015 and continued during 2016. As part of the commissioning, the decisions from individual algorithms were simulated and compared with the hardware response. An overview of the Level-1 Topological trigger system design, commissioning process and impact on several event selections are illustrated.
Perkins, Gavin D; Davies, Robin P; Quinton, Sarah; Woolley, Sarah; Gao, Fang; Abella, Ben; Stallard, Nigel; Cooke, Matthew W
2011-10-18
Cardiac arrest affects 30-35, 000 hospitalised patients in the UK every year. For these patients to be given the best chance of survival, high quality cardiopulmonary resuscitation (CPR) must be delivered, however the quality of CPR in real-life is often suboptimal. CPR feedback devices have been shown to improve CPR quality in the pre-hospital setting and post-event debriefing can improve adherence to guidelines and CPR quality. However, the evidence for use of these improvement methods in hospital remains unclear. The CPR quality improvement initiative is a prospective cohort study of the Q-CPR real-time feedback device combined with post-event debriefing in hospitalised adult patients who sustain a cardiac arrest. The primary objective of this trial is to assess whether a CPR quality improvement initiative will improve rate of return of sustained spontaneous circulation in in-hospital-cardiac-arrest patients. The study is set in one NHS trust operating three hospital sites. Secondary objectives will evaluate: any return of spontaneous circulation; survival to hospital discharge and patient cerebral performance category at discharge; quality of CPR variables and cardiac arrest team factors. All three sites will have an initial control phase before any improvements are implemented; site 1 will implement audiovisual feedback combined with post event debriefing, site 2 will implement audiovisual feedback only and site 3 will remain as a control site to measure any changes in outcome due to any other trust-wide changes in resuscitation practice. All adult patients sustaining a cardiac arrest and receiving resuscitation from the hospital cardiac arrest team will be included. Patients will be excluded if; they have a Do-not-attempt resuscitation order written and documented in their medical records, the cardiac arrest is not attended by a resuscitation team, the arrest occurs out-of-hospital or the patient has previously participated in this study. The trial will recruit a total of 912 patients from the three hospital sites. This trial will evaluate patient and process focussed outcomes following the implementation of a CPR quality improvement initiative using real-time audiovisual feedback and post event debriefing. ISRCTN56583860.
Scale-invariant structure of energy fluctuations in real earthquakes
NASA Astrophysics Data System (ADS)
Wang, Ping; Chang, Zhe; Wang, Huanyu; Lu, Hong
2017-11-01
Earthquakes are obviously complex phenomena associated with complicated spatiotemporal correlations, and they are generally characterized by two power laws: the Gutenberg-Richter (GR) and the Omori-Utsu laws. However, an important challenge has been to explain two apparently contrasting features: the GR and Omori-Utsu laws are scale-invariant and unaffected by energy or time scales, whereas earthquakes occasionally exhibit a characteristic energy or time scale, such as with asperity events. In this paper, three high-quality datasets on earthquakes were used to calculate the earthquake energy fluctuations at various spatiotemporal scales, and the results reveal the correlations between seismic events regardless of their critical or characteristic features. The probability density functions (PDFs) of the fluctuations exhibit evidence of another scaling that behaves as a q-Gaussian rather than random process. The scaling behaviors are observed for scales spanning three orders of magnitude. Considering the spatial heterogeneities in a real earthquake fault, we propose an inhomogeneous Olami-Feder-Christensen (OFC) model to describe the statistical properties of real earthquakes. The numerical simulations show that the inhomogeneous OFC model shares the same statistical properties with real earthquakes.
Migration velocity analysis using residual diffraction moveout: a real-data example
NASA Astrophysics Data System (ADS)
Gonzalez, Jaime A. C.; de Figueiredo, José J. S.; Coimbra, Tiago A.; Schleicher, Jörg; Novais, Amélia
2016-08-01
Unfocused seismic diffraction events carry direct information about errors in the migration-velocity model. The residual-diffraction-moveout (RDM) migration-velocity-analysis (MVA) method is a recent technique that extracts this information by means of adjusting ellipses or hyperbolas to uncollapsed migrated diffractions. In this paper, we apply this method, which has been tested so far only on synthetic data, to a real data set from the Viking Graben. After application of a plane-wave-destruction (PWD) filter to attenuate the reflected energy, the diffractions in the real data become interpretable and can be used for the RDM method. Our analysis demonstrates that the reflections need not be completely removed for this purpose. Beyond the need to identify and select diffraction events in post-stack migrated sections in the depth domain, the method has a very low computational cost and processing time. To reach an acceptable velocity model of comparable quality as one obtained with common-midpoint (CMP) processing, only two iterations were necessary.
NASA Astrophysics Data System (ADS)
Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.
2017-12-01
In recent years, more and more Carbon Capture and Storage (CCS) studies focus on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combined different seismic messages (magnitudes, phases, distributions et al.) is proposed for injection controlling. The primary task for ATLS is the seismic events detection in a long-term sustained time series record. Considering the time-varying characteristics of Signal to Noise Ratio (SNR) of a long-term record and the uneven energy distributions of seismic event waveforms will increase the difficulty in automatic seismic detecting, in this work, an improved probability autoregressive (AR) method for automatic seismic event detecting is applied. This algorithm, called sequentially discounting AR learning (SDAR), can identify the effective seismic event in the time series through the Change Point detection (CPD) of the seismic record. In this method, an anomaly signal (seismic event) can be designed as a change point on the time series (seismic record). The statistical model of the signal in the neighborhood of event point will change, because of the seismic event occurrence. This means the SDAR aims to find the statistical irregularities of the record thought CPD. There are 3 advantages of SDAR. 1. Anti-noise ability. The SDAR does not use waveform messages (such as amplitude, energy, polarization) for signal detecting. Therefore, it is an appropriate technique for low SNR data. 2. Real-time estimation. When new data appears in the record, the probability distribution models can be automatic updated by SDAR for on-line processing. 3. Discounting property. the SDAR introduces a discounting parameter to decrease the influence of present statistic value on future data. It makes SDAR as a robust algorithm for non-stationary signal processing. Within these 3 advantages, the SDAR method can handle the non-stationary time-varying long-term series and achieve real-time monitoring. Finally, we employ the SDAR on a synthetic model and Tomakomai Ocean Bottom Cable (OBC) baseline data to prove the feasibility and advantage of our method.
Optimizing SIEM Throughput on the Cloud Using Parallelization.
Alam, Masoom; Ihsan, Asif; Khan, Muazzam A; Javaid, Qaisar; Khan, Abid; Manzoor, Jawad; Akhundzada, Adnan; Khan, Muhammad Khurram; Farooq, Sajid
2016-01-01
Processing large amounts of data in real time for identifying security issues pose several performance challenges, especially when hardware infrastructure is limited. Managed Security Service Providers (MSSP), mostly hosting their applications on the Cloud, receive events at a very high rate that varies from a few hundred to a couple of thousand events per second (EPS). It is critical to process this data efficiently, so that attacks could be identified quickly and necessary response could be initiated. This paper evaluates the performance of a security framework OSTROM built on the Esper complex event processing (CEP) engine under a parallel and non-parallel computational framework. We explain three architectures under which Esper can be used to process events. We investigated the effect on throughput, memory and CPU usage in each configuration setting. The results indicate that the performance of the engine is limited by the number of events coming in rather than the queries being processed. The architecture where 1/4th of the total events are submitted to each instance and all the queries are processed by all the units shows best results in terms of throughput, memory and CPU usage.
Real-Time Analysis of Magnetic Hyperthermia Experiments on Living Cells under a Confocal Microscope.
Connord, Vincent; Clerc, Pascal; Hallali, Nicolas; El Hajj Diab, Darine; Fourmy, Daniel; Gigoux, Véronique; Carrey, Julian
2015-05-01
Combining high-frequency alternating magnetic fields (AMF) and magnetic nanoparticles (MNPs) is an efficient way to induce biological responses through several approaches: magnetic hyperthermia, drug release, controls of gene expression and neurons, or activation of chemical reactions. So far, these experiments cannot be analyzed in real-time during the AMF application. A miniaturized electromagnet fitting under a confocal microscope is built, which produces an AMF of frequency and amplitude similar to the ones used in magnetic hyperthermia. AMF application induces massive damages to tumoral cells having incorporated nanoparticles into their lysosomes without affecting the others. Using this setup, real-time analyses of molecular events occurring during AMF application are performed. Lysosome membrane permeabilization and reactive oxygen species production are detected after only 30 min of AMF application, demonstrating they occur at an early stage in the cascade of events leading eventually to cell death. Additionally, lysosomes self-assembling into needle-shaped organization under the influence of AMF is observed in real-time. This experimental approach will permit to get a deeper insight into the physical, molecular, and biological process occurring in several innovative techniques used in nanomedecine based on the combined use of MNPs and high-frequency magnetic fields. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Climate Signals: An On-Line Digital Platform for Mapping Climate Change Impacts in Real Time
NASA Astrophysics Data System (ADS)
Cutting, H.
2016-12-01
Climate Signals is an on-line digital platform for cataloging and mapping the impacts of climate change. The CS platform specifies and details the chains of connections between greenhouse gas emissions and individual climate events. Currently in open-beta release, the platform is designed to to engage and serve the general public, news media, and policy-makers, particularly in real-time during extreme climate events. Climate Signals consists of a curated relational database of events and their links to climate change, a mapping engine, and a gallery of climate change monitors offering real-time data. For each event in the database, an infographic engine provides a custom attribution "tree" that illustrates the connections to climate change. In addition, links to key contextual resources are aggregated and curated for each event. All event records are fully annotated with detailed source citations and corresponding hyper links. The system of attribution used to link events to climate change in real-time is detailed here. This open-beta release is offered for public user testing and engagement. Launched in May 2016, the operation of this platform offers lessons for public engagement in climate change impacts.
Multiple disturbances classifier for electric signals using adaptive structuring neural networks
NASA Astrophysics Data System (ADS)
Lu, Yen-Ling; Chuang, Cheng-Long; Fahn, Chin-Shyurng; Jiang, Joe-Air
2008-07-01
This work proposes a novel classifier to recognize multiple disturbances for electric signals of power systems. The proposed classifier consists of a series of pipeline-based processing components, including amplitude estimator, transient disturbance detector, transient impulsive detector, wavelet transform and a brand-new neural network for recognizing multiple disturbances in a power quality (PQ) event. Most of the previously proposed methods usually treated a PQ event as a single disturbance at a time. In practice, however, a PQ event often consists of various types of disturbances at the same time. Therefore, the performances of those methods might be limited in real power systems. This work considers the PQ event as a combination of several disturbances, including steady-state and transient disturbances, which is more analogous to the real status of a power system. Six types of commonly encountered power quality disturbances are considered for training and testing the proposed classifier. The proposed classifier has been tested on electric signals that contain single disturbance or several disturbances at a time. Experimental results indicate that the proposed PQ disturbance classification algorithm can achieve a high accuracy of more than 97% in various complex testing cases.
Dynamically adaptive data-driven simulation of extreme hydrological flows
NASA Astrophysics Data System (ADS)
Kumar Jain, Pushkar; Mandli, Kyle; Hoteit, Ibrahim; Knio, Omar; Dawson, Clint
2018-02-01
Hydrological hazards such as storm surges, tsunamis, and rainfall-induced flooding are physically complex events that are costly in loss of human life and economic productivity. Many such disasters could be mitigated through improved emergency evacuation in real-time and through the development of resilient infrastructure based on knowledge of how systems respond to extreme events. Data-driven computational modeling is a critical technology underpinning these efforts. This investigation focuses on the novel combination of methodologies in forward simulation and data assimilation. The forward geophysical model utilizes adaptive mesh refinement (AMR), a process by which a computational mesh can adapt in time and space based on the current state of a simulation. The forward solution is combined with ensemble based data assimilation methods, whereby observations from an event are assimilated into the forward simulation to improve the veracity of the solution, or used to invert for uncertain physical parameters. The novelty in our approach is the tight two-way coupling of AMR and ensemble filtering techniques. The technology is tested using actual data from the Chile tsunami event of February 27, 2010. These advances offer the promise of significantly transforming data-driven, real-time modeling of hydrological hazards, with potentially broader applications in other science domains.
NSTX-U Advances in Real-Time C++11 on Linux
NASA Astrophysics Data System (ADS)
Erickson, Keith G.
2015-08-01
Programming languages like C and Ada combined with proprietary embedded operating systems have dominated the real-time application space for decades. The new C++11 standard includes native, language-level support for concurrency, a required feature for any nontrivial event-oriented real-time software. Threads, Locks, and Atomics now exist to provide the necessary tools to build the structures that make up the foundation of a complex real-time system. The National Spherical Torus Experiment Upgrade (NSTX-U) at the Princeton Plasma Physics Laboratory (PPPL) is breaking new ground with the language as applied to the needs of fusion devices. A new Digital Coil Protection System (DCPS) will serve as the main protection mechanism for the magnetic coils, and it is written entirely in C++11 running on Concurrent Computer Corporation's real-time operating system, RedHawk Linux. It runs over 600 algorithms in a 5 kHz control loop that determine whether or not to shut down operations before physical damage occurs. To accomplish this, NSTX-U engineers developed software tools that do not currently exist elsewhere, including real-time atomic synchronization, real-time containers, and a real-time logging framework. Together with a recent (and carefully configured) version of the GCC compiler, these tools enable data acquisition, processing, and output using a conventional operating system to meet a hard real-time deadline (that is, missing one periodic is a failure) of 200 microseconds.
Proposal and Evaluation of BLE Discovery Process Based on New Features of Bluetooth 5.0.
Hernández-Solana, Ángela; Perez-Diaz-de-Cerio, David; Valdovinos, Antonio; Valenzuela, Jose Luis
2017-08-30
The device discovery process is one of the most crucial aspects in real deployments of sensor networks. Recently, several works have analyzed the topic of Bluetooth Low Energy (BLE) device discovery through analytical or simulation models limited to version 4.x. Non-connectable and non-scannable undirected advertising has been shown to be a reliable alternative for discovering a high number of devices in a relatively short time period. However, new features of Bluetooth 5.0 allow us to define a variant on the device discovery process, based on BLE scannable undirected advertising events, which results in higher discovering capacities and also lower power consumption. In order to characterize this new device discovery process, we experimentally model the real device behavior of BLE scannable undirected advertising events. Non-detection packet probability, discovery probability, and discovery latency for a varying number of devices and parameters are compared by simulations and experimental measurements. We demonstrate that our proposal outperforms previous works, diminishing the discovery time and increasing the potential user device density. A mathematical model is also developed in order to easily obtain a measure of the potential capacity in high density scenarios.
Proposal and Evaluation of BLE Discovery Process Based on New Features of Bluetooth 5.0
2017-01-01
The device discovery process is one of the most crucial aspects in real deployments of sensor networks. Recently, several works have analyzed the topic of Bluetooth Low Energy (BLE) device discovery through analytical or simulation models limited to version 4.x. Non-connectable and non-scannable undirected advertising has been shown to be a reliable alternative for discovering a high number of devices in a relatively short time period. However, new features of Bluetooth 5.0 allow us to define a variant on the device discovery process, based on BLE scannable undirected advertising events, which results in higher discovering capacities and also lower power consumption. In order to characterize this new device discovery process, we experimentally model the real device behavior of BLE scannable undirected advertising events. Non-detection packet probability, discovery probability, and discovery latency for a varying number of devices and parameters are compared by simulations and experimental measurements. We demonstrate that our proposal outperforms previous works, diminishing the discovery time and increasing the potential user device density. A mathematical model is also developed in order to easily obtain a measure of the potential capacity in high density scenarios. PMID:28867786
Real-time Upstream Monitoring System: Using ACE Data to Predict the Arrival of Interplanetary Shocks
NASA Astrophysics Data System (ADS)
Donegan, M. M.; Wagstaff, K. L.; Ho, G. C.; Vandegriff, J.
2003-12-01
We have developed an algorithm to predict Earth arrival times for interplanetary (IP) shock events originating at the Sun. Our predictions are generated from real-time data collected by the Electron, Proton, and Alpha Monitor (EPAM) instrument on NASA's Advanced Composition Explorer (ACE) spacecraft. The high intensities of energetic ions that occur prior to and during an IP shock pose a radiation hazard to astronauts as well as to electronics in Earth orbit. The potential to predict such events is based on characteristic signatures in the Energetic Storm Particle (ESP) event ion intensities which are often associated with IP shocks. We have previously reported on the development and implementation of an algorithm to forecast the arrival of ESP events. Historical ion data from ACE/EPAM was used to train an artificial neural network which uses the signature of an approaching event to predict the time remaining until the shock arrives. Tests on the trained network have been encouraging, with an average error of 9.4 hours for predictions made 24 hours in advance, and an reduced average error of 4.9 hours when the shock is 12 hours away. The prediction engine has been integrated into a web-based system that uses real-time ACE/EPAM data provided by the NOAA Space Environment Center (http://sd-www.jhuapl.edu/UPOS/RISP/ index.html.) This system continually processes the latest ACE data, reports whether or not there is an impending shock, and predicts the time remaining until the shock arrival. Our predictions are updated every five minutes and provide significant lead-time, thereby supplying critical information that can be used by mission planners, satellite operations controllers, and scientists. We have continued to refine the prediction capabilities of this system; in addition to forecasting arrival times for shocks, we now provide confidence estimates for those predictions.
Real-time Social Internet Data to Guide Forecasting Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Del Valle, Sara Y.
Our goal is to improve decision support by monitoring and forecasting events using social media, mathematical models, and quantifying model uncertainty. Our approach is real-time, data-driven forecasts with quantified uncertainty: Not just for weather anymore. Information flow from human observations of events through an Internet system and classification algorithms is used to produce quantitatively uncertain forecast. In summary, we want to develop new tools to extract useful information from Internet data streams, develop new approaches to assimilate real-time information into predictive models, validate approaches by forecasting events, and our ultimate goal is to develop an event forecasting system using mathematicalmore » approaches and heterogeneous data streams.« less
NASA Astrophysics Data System (ADS)
Pedrozo-Acuña, A.; Magos-Hernández, J. A.; Sánchez-Peralta, J. A.; Blanco-Figueroa, J.; Breña-Naranjo, J. A.
2017-12-01
This contribution presents a real-time system for issuing warnings of intense precipitation events during major storms, developed for Mexico City, Mexico. The system is based on high-temporal resolution (Dt=1min) measurements of precipitation in 10 different points within the city, which report variables such as intensity, number of raindrops, raindrop size, kinetic energy, fall velocity, etc. Each one of these stations, is comprised of an optical disdrometer to measure size and fall velocity of hydrometeors, a solar panel to guarantee an uninterrupted power supply, a wireless broadband access to internet, and a resource constrained device known as Raspberry Pi3 for the processing, storage and sharing of the sensor data over the world wide web. The self-made developed platform follows a component-based system paradigm allowing users to implement custom algorithms and models depending on application requirements. The system is in place since July 2016, and continuous measurements of rainfall in real-time are published over the internet through the webpage www.oh-iiunam.mx. Additionally, the developed platform for the data collection and management interacts with the social network known as Twitter to enable real-time warnings of precipitation events. Key contribution of this development is the design and implementation of a scalable, easy to use, interoperable platform that facilitates the development of real-time precipitation sensor networks and warnings. The system is easy to implement and could be used as a prototype for systems in other regions of the world.
NASA Astrophysics Data System (ADS)
Zimakov, L. G.; Passmore, P.; Raczka, J.; Alvarez, M.; Jackson, M.
2014-12-01
Scientific GNSS networks are moving towards a model of real-time data acquisition, epoch-by-epoch storage integrity, and on-board real-time position and displacement calculations. This new paradigm allows the integration of real-time, high-rate GNSS displacement information with acceleration and velocity data to create very high-rate displacement records. The mating of these two instruments allows the creation of a new, very high-rate (200 sps) displacement observable that has the full-scale displacement characteristics of GNSS and high-precision dynamic motions of seismic technologies. It is envisioned that these new observables can be used for earthquake early warning studies, volcano monitoring, and critical infrastructure monitoring applications. Our presentation will focus on the characteristics of GNSS, seismic, and strong motion sensors in high dynamic environments, including historic earthquakes in Southern California and the Pacific Rim, replicated on a shake table, over a range of displacements and frequencies. We will explore the optimum integration of these sensors from a filtering perspective including simple harmonic impulses over varying frequencies and amplitudes and under the dynamic conditions of various earthquake scenarios. In addition we will discuss implementation of a Rapid Seismic Event Notification System that provides quick delivery of digital data from seismic stations to the acquisition and processing center and a full data integrity model for real-time earthquake notification that provides warning prior to significant ground shaking.
Current Saturation Avoidance with Real-Time Control using DPCS
NASA Astrophysics Data System (ADS)
Ferrara, M.; Hutchinson, I.; Wolfe, S.; Stillerman, J.; Fredian, T.
2008-11-01
Tokamak ohmic-transformer and equilibrium-field coils need to be able to operate near their maximum current capabilities. However if they reach their upper limit during high-performance discharges or in the presence of a strong off-normal event, shape control is compromised, and instability, even plasma disruptions can result. On Alcator C-Mod we designed and tested an anti-saturation routine which detects the impending saturation of OH and EF currents and interpolates to a neighboring safe equilibrium in real-time. The routine was implemented with a multi-processor, multi-time-scale control scheme, which is based on a master process and multiple asynchronous slave processes. The scheme is general and can be used for any computationally-intensive algorithm. USDoE award DE- FC02-99ER545512.
Accident diagnosis system based on real-time decision tree expert system
NASA Astrophysics Data System (ADS)
Nicolau, Andressa dos S.; Augusto, João P. da S. C.; Schirru, Roberto
2017-06-01
Safety is one of the most studied topics when referring to power stations. For that reason, sensors and alarms develop an important role in environmental and human protection. When abnormal event happens, it triggers a chain of alarms that must be, somehow, checked by the control room operators. In this case, diagnosis support system can help operators to accurately identify the possible root-cause of the problem in short time. In this article, we present a computational model of a generic diagnose support system based on artificial intelligence, that was applied on the dataset of two real power stations: Angra1 Nuclear Power Plant and Santo Antônio Hydroelectric Plant. The proposed system processes all the information logged in the sequence of events before a shutdown signal using the expert's knowledge inputted into an expert system indicating the chain of events, from the shutdown signal to its root-cause. The results of both applications showed that the support system is a potential tool to help the control room operators identify abnormal events, as accidents and consequently increase the safety.
Event and Pulse Node Hardware Design for Nuclear Fusion Experiments
NASA Astrophysics Data System (ADS)
Fortunato, J. C.; Batista, A.; Sousa, J.; Fernandes, H.; Varandas, C. A. F.
2008-04-01
This article presents an event and pulse node hardware module (EPN) developed for use in control and data acquisition (CODAC) in current and upcoming long discharges nuclear fusion experiments. Its purpose is to allow real time event management and trigger distribution. The use of a mixture of digital signal processing and field programmable gate arrays, with fiber optic channels for event broadcast between CODAC nodes, and short length paths between the EPN and CODAC hardware, allows an effective and low latency communication path. This hardware will be integrated in the ISTTOK CODAC to allow long AC plasma discharges.
NASA Astrophysics Data System (ADS)
Li, Xingxing
2014-05-01
Earthquake monitoring and early warning system for hazard assessment and mitigation has traditional been based on seismic instruments. However, for large seismic events, it is difficult for traditional seismic instruments to produce accurate and reliable displacements because of the saturation of broadband seismometers and problematic integration of strong-motion data. Compared with the traditional seismic instruments, GPS can measure arbitrarily large dynamic displacements without saturation, making them particularly valuable in case of large earthquakes and tsunamis. GPS relative positioning approach is usually adopted to estimate seismic displacements since centimeter-level accuracy can be achieved in real-time by processing double-differenced carrier-phase observables. However, relative positioning method requires a local reference station, which might itself be displaced during a large seismic event, resulting in misleading GPS analysis results. Meanwhile, the relative/network approach is time-consuming, particularly difficult for the simultaneous and real-time analysis of GPS data from hundreds or thousands of ground stations. In recent years, several single-receiver approaches for real-time GPS seismology, which can overcome the reference station problem of the relative positioning approach, have been successfully developed and applied to GPS seismology. One available method is real-time precise point positioning (PPP) relied on precise satellite orbit and clock products. However, real-time PPP needs a long (re)convergence period, of about thirty minutes, to resolve integer phase ambiguities and achieve centimeter-level accuracy. In comparison with PPP, Colosimo et al. (2011) proposed a variometric approach to determine the change of position between two adjacent epochs, and then displacements are obtained by a single integration of the delta positions. This approach does not suffer from convergence process, but the single integration from delta positions to displacements is accompanied by a drift due to the potential uncompensated errors. Li et al. (2013) presented a temporal point positioning (TPP) method to quickly capture coseismic displacements with a single GPS receiver in real-time. The TPP approach can overcome the convergence problem of precise point positioning (PPP), and also avoids the integration and de-trending process of the variometric approach. The performance of TPP is demonstrated to be at few centimeters level of displacement accuracy for even twenty minutes interval with real-time precise orbit and clock products. In this study, we firstly present and compare the observation models and processing strategies of the current existing single-receiver methods for real-time GPS seismology. Furthermore, we propose several refinements to the variometric approach in order to eliminate the drift trend in the integrated coseismic displacements. The mathematical relationship between these methods is discussed in detail and their equivalence is also proved. The impact of error components such as satellite ephemeris, ionospheric delay, tropospheric delay, and geometry change on the retrieved displacements are carefully analyzed and investigated. Finally, the performance of these single-receiver approaches for real-time GPS seismology is validated using 1 Hz GPS data collected during the Tohoku-Oki earthquake (Mw 9.0, March 11, 2011) in Japan. It is shown that few centimeters accuracy of coseismic displacements is achievable. Keywords: High-rate GPS; real-time GPS seismology; a single receiver; PPP; variometric approach; temporal point positioning; error analysis; coseismic displacement; fault slip inversion;
NASA Astrophysics Data System (ADS)
Brandon, R.; Page, S.; Varndell, J.
2012-06-01
This paper presents a novel application of Evidential Reasoning to Threat Assessment for critical infrastructure protection. A fusion algorithm based on the PCR5 Dezert-Smarandache fusion rule is proposed which fuses alerts generated by a vision-based behaviour analysis algorithm and a-priori watch-list intelligence data. The fusion algorithm produces a prioritised event list according to a user-defined set of event-type severity or priority weightings. Results generated from application of the algorithm to real data and Behaviour Analysis alerts captured at London's Heathrow Airport under the EU FP7 SAMURAI programme are presented. A web-based demonstrator system is also described which implements the fusion process in real-time. It is shown that this system significantly reduces the data deluge problem, and directs the user's attention to the most pertinent alerts, enhancing their Situational Awareness (SA). The end-user is also able to alter the perceived importance of different event types in real-time, allowing the system to adapt rapidly to changes in priorities as the situation evolves. One of the key challenges associated with fusing information deriving from intelligence data is the issue of Data Incest. Techniques for handling Data Incest within Evidential Reasoning frameworks are proposed, and comparisons are drawn with respect to Data Incest management techniques that are commonly employed within Bayesian fusion frameworks (e.g. Covariance Intersection). The challenges associated with simultaneously dealing with conflicting information and Data Incest in Evidential Reasoning frameworks are also discussed.
Comparative study of predicted and experimentally detected interplanetary shocks
NASA Astrophysics Data System (ADS)
Kartalev, M. D.; Grigorov, K. G.; Smith, Z.; Dryer, M.; Fry, C. D.; Sun, Wei; Deehr, C. S.
2002-03-01
We compare the real time space weather prediction shock arrival times at 1 AU made by the USAF/NOAA Shock Time of Arrival (STOA) and Interplanetary Shock Propagation Model (ISPM) models, and the Exploration Physics International/University of Alaska Hakamada-Akasofu-Fry Solar Wind Model (HAF-v2) to a real time analysis analysis of plasma and field ACE data. The comparison is made using an algorithm that was developed on the basis of wavelet data analysis and MHD identification procedure. The shock parameters are estimated for selected "candidate events". An appropriate automatically performing Web-based interface periodically utilizes solar wind observations made by the ACE at L1. Near real time results as well an archive of the registered interesting events are available on a specially developed web site. A number of events are considered. These studies are essential for the validation of real time space weather forecasts made from solar data.
NASA Astrophysics Data System (ADS)
Ip, F.; Dohm, J. M.; Baker, V. R.; Castano, R.; Cichy, B.; Chien, S.; Davies, A.; Doggett, T.; Greeley, R.
2004-12-01
For the first time, a spacecraft has the ability to autonomously detect and react to flood events. Flood detection and the investigation of flooding dynamics in real time from space have never been done before at least not until now. Part of the challenge for the hydrological community has been the difficulty of obtaining cloud-free scenes from orbit at sufficient temporal and spatial resolutions to accurately assess flooding. In addition, the large spatial extent of drainage networks coupled with the size of the data sets necessary to be downlinked from satellites add to the difficulty of monitoring flooding from space. Technology developed as part of the Autonomous Sciencecraft Experiment (ASE) creates the new capability to autonomously detect, assess, and react to dynamic events, thereby enabling the monitoring of transient processes such as flooding in real time. In addition to being able to autonomously process the imaged data onboard the spacecraft for the first time and search the data for specific spectral features, the ASE Science Team has developed and tested change detection algorithms for the Hyperion spectrometer on EO-1. For flood events, if a change is detected in the onboard processed image (i.e. an increase in the number of ¡wet¡" pixels relative to a baseline image where the system is in normal flow condition or relatively dry), the spacecraft is autonomously retasked to obtain additional scenes. For instance, in February 2004 a rare flooding of the Australian Diamantina River was captured by EO-1. In addition, in August during ASE onboard testing a Zambezi River scene in Central Africa was successfully triggered by the classifier to autonomously take another observation. Yet another successful trigger-response flooding test scenario of the Yellow River in China was captured by ASE on 8/18/04. These exciting results pave the way for future smart reconnaissance missions of transient processes on Earth and beyond. Acknowledgments: We are grateful to the City of Tucson and Tucson Water for their support and cooperation.
High-speed event detector for embedded nanopore bio-systems.
Huang, Yiyun; Magierowski, Sebastian; Ghafar-Zadeh, Ebrahim; Wang, Chengjie
2015-08-01
Biological measurements of microscopic phenomena often deal with discrete-event signals. The ability to automatically carry out such measurements at high-speed in a miniature embedded system is desirable but compromised by high-frequency noise along with practical constraints on filter quality and sampler resolution. This paper presents a real-time event-detection method in the context of nanopore sensing that helps to mitigate these drawbacks and allows accurate signal processing in an embedded system. Simulations show at least a 10× improvement over existing on-line detection methods.
Flood and Weather Monitoring Using Real-time Twitter Data Streams
NASA Astrophysics Data System (ADS)
Demir, I.; Sit, M. A.; Sermet, M. Y.
2016-12-01
Social media data is a widely used source to making inference within public crisis periods and events in disaster times. Specifically, since Twitter provides large-scale data publicly in real-time, it is one of the most extensive resources with location information. This abstract provides an overview of a real-time Twitter analysis system to support flood preparedness and response using a comprehensive information-centric flood ontology and natural language processing. Within the scope of this project, we deal with acquisition and processing of real-time Twitter data streams. System fetches the tweets with specified keywords and classifies them as related to flooding or heavy weather conditions. The system uses machine learning algorithms to discover patterns using the correlation between tweets and Iowa Flood Information System's (IFIS) extensive resources. The system uses these patterns to forecast the formation and progress of a potential future flood event. While fetching tweets, predefined hashtags are used for filtering and enhancing the relevancy for selected tweets. With this project, tweets can also be used as an alternative data source where other data sources are not sufficient for specific tasks. During the disasters, the photos that people upload alongside their tweets can be collected and placed to appropriate locations on a mapping system. This allows decision making authorities and communities to see the most recent outlook of the disaster interactively. In case of an emergency, concentration of tweets can help the authorities to determine a strategy on how to reach people most efficiently while providing them the supplies they need. Thanks to the extendable nature of the flood ontology and framework, results from this project will be a guide for other natural disasters, and will be shared with the community.
An arc control and protection system for the JET lower hybrid antenna based on an imaging system.
Figueiredo, J; Mailloux, J; Kirov, K; Kinna, D; Stamp, M; Devaux, S; Arnoux, G; Edwards, J S; Stephen, A V; McCullen, P; Hogben, C
2014-11-01
Arcs are the potentially most dangerous events related to Lower Hybrid (LH) antenna operation. If left uncontrolled they can produce damage and cause plasma disruption by impurity influx. To address this issue an arc real time control and protection imaging system for the Joint European Torus (JET) LH antenna has been implemented. The LH system is one of the additional heating systems at JET. It comprises 24 microwave generators (klystrons, operating at 3.7 GHz) providing up to 5 MW of heating and current drive to the JET plasma. This is done through an antenna composed of an array of waveguides facing the plasma. The protection system presented here is based primarily on an imaging arc detection and real time control system. It has adapted the ITER like wall hotspot protection system using an identical CCD camera and real time image processing unit. A filter has been installed to avoid saturation and spurious system triggers caused by ionization light. The antenna is divided in 24 Regions Of Interest (ROIs) each one corresponding to one klystron. If an arc precursor is detected in a ROI, power is reduced locally with subsequent potential damage and plasma disruption avoided. The power is subsequently reinstated if, during a defined interval of time, arcing is confirmed not to be present by image analysis. This system was successfully commissioned during the restart phase and beginning of the 2013 scientific campaign. Since its installation and commissioning, arcs and related phenomena have been prevented. In this contribution we briefly describe the camera, image processing, and real time control systems. Most importantly, we demonstrate that an LH antenna arc protection system based on CCD camera imaging systems works. Examples of both controlled and uncontrolled LH arc events and their consequences are shown.
NASA Astrophysics Data System (ADS)
Joyce, Malcolm J.; Aspinall, Michael D.; Cave, Francis D.; Lavietes, Anthony D.
2012-08-01
Pulse-shape discrimination (PSD) in fast, organic scintillation detectors is a long-established technique used to separate neutrons and γ rays in mixed radiation fields. In the analogue domain the method can achieve separation in real time, but all knowledge of the pulses themselves is lost thereby preventing the possibility of any post- or repeated analysis. Also, it is typically reliant on electronic systems that are largely obsolete and which require significant experience to set up. In the digital domain, PSD is often more flexible but significant post-processing has usually been necessary to obtain neutron/γ-ray separation. Moreover, the scintillation media on which the technique relies usually have a low flashpoint and are thus deemed hazardous. This complicates the ease with which they are used in industrial applications. In this paper, results obtained with a new portable digital pulse-shape discrimination instrument are described. This instrument provides real-time, digital neutron/γ-ray separation whilst preserving the synchronization with the time-of-arrival for each event, and realizing throughputs of 3 × 106 events per second. Furthermore, this system has been tested with a scintillation medium that is non-flammable and not hazardous.
The quest for wisdom: lessons from 17 tsunamis, 2004-2014.
Okal, Emile A
2015-10-28
Since the catastrophic Sumatra-Andaman tsunami took place in 2004, 16 other tsunamis have resulted in significant damage and 14 in casualties. We review the fundamental changes that have affected our command of tsunami issues as scientists, engineers and decision-makers, in the quest for improved wisdom in this respect. While several scientific paradigms have had to be altered or abandoned, new algorithms, e.g. the W seismic phase and real-time processing of fast-arriving seismic P waves, give us more powerful tools to estimate in real time the tsunamigenic character of an earthquake. We assign to each event a 'wisdom index' based on the warning issued (or not) during the event, and on the response of the population. While this approach is admittedly subjective, it clearly shows several robust trends: (i) we have made significant progress in our command of far-field warning, with only three casualties in the past 10 years; (ii) self-evacuation by educated populations in the near field is a key element of successful tsunami mitigation; (iii) there remains a significant cacophony between the scientific community and decision-makers in industry and government as documented during the 2010 Maule and 2011 Tohoku events; and (iv) the so-called 'tsunami earthquakes' generating larger tsunamis than expected from the size of their seismic source persist as a fundamental challenge, despite scientific progress towards characterizing these events in real time. © 2015 The Author(s).
NASA Astrophysics Data System (ADS)
Knox, H. A.; Draelos, T.; Young, C. J.; Lawry, B.; Chael, E. P.; Faust, A.; Peterson, M. G.
2015-12-01
The quality of automatic detections from seismic sensor networks depends on a large number of data processing parameters that interact in complex ways. The largely manual process of identifying effective parameters is painstaking and does not guarantee that the resulting controls are the optimal configuration settings. Yet, achieving superior automatic detection of seismic events is closely related to these parameters. We present an automated sensor tuning (AST) system that learns near-optimal parameter settings for each event type using neuro-dynamic programming (reinforcement learning) trained with historic data. AST learns to test the raw signal against all event-settings and automatically self-tunes to an emerging event in real-time. The overall goal is to reduce the number of missed legitimate event detections and the number of false event detections. Reducing false alarms early in the seismic pipeline processing will have a significant impact on this goal. Applicable both for existing sensor performance boosting and new sensor deployment, this system provides an important new method to automatically tune complex remote sensing systems. Systems tuned in this way will achieve better performance than is currently possible by manual tuning, and with much less time and effort devoted to the tuning process. With ground truth on detections in seismic waveforms from a network of stations, we show that AST increases the probability of detection while decreasing false alarms.
Using timed event sequential data in nursing research.
Pecanac, Kristen E; Doherty-King, Barbara; Yoon, Ju Young; Brown, Roger; Schiefelbein, Tony
2015-01-01
Measuring behavior is important in nursing research, and innovative technologies are needed to capture the "real-life" complexity of behaviors and events. The purpose of this article is to describe the use of timed event sequential data in nursing research and to demonstrate the use of this data in a research study. Timed event sequencing allows the researcher to capture the frequency, duration, and sequence of behaviors as they occur in an observation period and to link the behaviors to contextual details. Timed event sequential data can easily be collected with handheld computers, loaded with a software program designed for capturing observations in real time. Timed event sequential data add considerable strength to analysis of any nursing behavior of interest, which can enhance understanding and lead to improvement in nursing practice.
Real-time encoding and compression of neuronal spikes by metal-oxide memristors
NASA Astrophysics Data System (ADS)
Gupta, Isha; Serb, Alexantrou; Khiat, Ali; Zeitler, Ralf; Vassanelli, Stefano; Prodromakis, Themistoklis
2016-09-01
Advanced brain-chip interfaces with numerous recording sites bear great potential for investigation of neuroprosthetic applications. The bottleneck towards achieving an efficient bio-electronic link is the real-time processing of neuronal signals, which imposes excessive requirements on bandwidth, energy and computation capacity. Here we present a unique concept where the intrinsic properties of memristive devices are exploited to compress information on neural spikes in real-time. We demonstrate that the inherent voltage thresholds of metal-oxide memristors can be used for discriminating recorded spiking events from background activity and without resorting to computationally heavy off-line processing. We prove that information on spike amplitude and frequency can be transduced and stored in single devices as non-volatile resistive state transitions. Finally, we show that a memristive device array allows for efficient data compression of signals recorded by a multi-electrode array, demonstrating the technology's potential for building scalable, yet energy-efficient on-node processors for brain-chip interfaces.
Real-time encoding and compression of neuronal spikes by metal-oxide memristors
Gupta, Isha; Serb, Alexantrou; Khiat, Ali; Zeitler, Ralf; Vassanelli, Stefano; Prodromakis, Themistoklis
2016-01-01
Advanced brain-chip interfaces with numerous recording sites bear great potential for investigation of neuroprosthetic applications. The bottleneck towards achieving an efficient bio-electronic link is the real-time processing of neuronal signals, which imposes excessive requirements on bandwidth, energy and computation capacity. Here we present a unique concept where the intrinsic properties of memristive devices are exploited to compress information on neural spikes in real-time. We demonstrate that the inherent voltage thresholds of metal-oxide memristors can be used for discriminating recorded spiking events from background activity and without resorting to computationally heavy off-line processing. We prove that information on spike amplitude and frequency can be transduced and stored in single devices as non-volatile resistive state transitions. Finally, we show that a memristive device array allows for efficient data compression of signals recorded by a multi-electrode array, demonstrating the technology's potential for building scalable, yet energy-efficient on-node processors for brain-chip interfaces. PMID:27666698
Liu, Baolin; Wang, Zhongning; Jin, Zhixing
2009-09-11
In real life, the human brain usually receives information through visual and auditory channels and processes the multisensory information, but studies on the integration processing of the dynamic visual and auditory information are relatively few. In this paper, we have designed an experiment, where through the presentation of common scenario, real-world videos, with matched and mismatched actions (images) and sounds as stimuli, we aimed to study the integration processing of synchronized visual and auditory information in videos of real-world events in the human brain, through the use event-related potentials (ERPs) methods. Experimental results showed that videos of mismatched actions (images) and sounds would elicit a larger P400 as compared to videos of matched actions (images) and sounds. We believe that the P400 waveform might be related to the cognitive integration processing of mismatched multisensory information in the human brain. The results also indicated that synchronized multisensory information would interfere with each other, which would influence the results of the cognitive integration processing.
Characteristics of memories for near-death experiences.
Moore, Lauren E; Greyson, Bruce
2017-05-01
Near-death experiences are vivid, life-changing experiences occurring to people who come close to death. Because some of their features, such as enhanced cognition despite compromised brain function, challenge our understanding of the mind-brain relationship, the question arises whether near-death experiences are imagined rather than real events. We administered the Memory Characteristics Questionnaire to 122 survivors of a close brush with death who reported near-death experiences. Participants completed Memory Characteristics Questionnaires for three different memories: that of their near-death experience, that of a real event around the same time, and that of an event they had imagined around the same time. The Memory Characteristics Questionnaire score was higher for the memory of the near-death experience than for that of the real event, which in turn was higher than that of the imagined event. These data suggest that memories of near-death experiences are recalled as "realer" than real events or imagined events. Copyright © 2017 Elsevier Inc. All rights reserved.
Tsui, Fu-Chiang; Espino, Jeremy U; Weng, Yan; Choudary, Arvinder; Su, Hoah-Der; Wagner, Michael M
2005-01-01
The National Retail Data Monitor (NRDM) has monitored over-the-counter (OTC) medication sales in the United States since December 2002. The NRDM collects data from over 18,600 retail stores and processes over 0.6 million sales records per day. This paper describes key architectural features that we have found necessary for a data utility component in a national biosurveillance system. These elements include event-driven architecture to provide analyses of data in near real time, multiple levels of caching to improve query response time, high availability through the use of clustered servers, scalable data storage through the use of storage area networks and a web-service function for interoperation with affiliated systems. The methods and architectural principles are relevant to the design of any production data utility for public health surveillance-systems that collect data from multiple sources in near real time for use by analytic programs and user interfaces that have substantial requirements for time-series data aggregated in multiple dimensions.
Airborne Camera System for Real-Time Applications - Support of a National Civil Protection Exercise
NASA Astrophysics Data System (ADS)
Gstaiger, V.; Romer, H.; Rosenbaum, D.; Henkel, F.
2015-04-01
In the VABENE++ project of the German Aerospace Center (DLR), powerful tools are being developed to aid public authorities and organizations with security responsibilities as well as traffic authorities when dealing with disasters and large public events. One focus lies on the acquisition of high resolution aerial imagery, its fully automatic processing, analysis and near real-time provision to decision makers in emergency situations. For this purpose a camera system was developed to be operated from a helicopter with light-weight processing units and microwave link for fast data transfer. In order to meet end-users' requirements DLR works close together with the German Federal Office of Civil Protection and Disaster Assistance (BBK) within this project. One task of BBK is to establish, maintain and train the German Medical Task Force (MTF), which gets deployed nationwide in case of large-scale disasters. In October 2014, several units of the MTF were deployed for the first time in the framework of a national civil protection exercise in Brandenburg. The VABENE++ team joined the exercise and provided near real-time aerial imagery, videos and derived traffic information to support the direction of the MTF and to identify needs for further improvements and developments. In this contribution the authors introduce the new airborne camera system together with its near real-time processing components and share experiences gained during the national civil protection exercise.
EARLINET: potential operationality of a research network
NASA Astrophysics Data System (ADS)
Sicard, M.; D'Amico, G.; Comerón, A.; Mona, L.; Alados-Arboledas, L.; Amodeo, A.; Baars, H.; Belegante, L.; Binietoglou, I.; Bravo-Aranda, J. A.; Fernández, A. J.; Fréville, P.; García-Vizcaíno, D.; Giunta, A.; Granados-Muñoz, M. J.; Guerrero-Rascado, J. L.; Hadjimitsis, D.; Haefele, A.; Hervo, M.; Iarlori, M.; Kokkalis, P.; Lange, D.; Mamouri, R. E.; Mattis, I.; Molero, F.; Montoux, N.; Muñoz, A.; Muñoz Porcar, C.; Navas-Guzmán, F.; Nicolae, D.; Nisantzi, A.; Papagiannopoulos, N.; Papayannis, A.; Pereira, S.; Preißler, J.; Pujadas, M.; Rizi, V.; Rocadenbosch, F.; Sellegri, K.; Simeonov, V.; Tsaknakis, G.; Wagner, F.; Pappalardo, G.
2015-07-01
In the framework of ACTRIS summer 2012 measurement campaign (8 June-17 July 2012), EARLINET organized and performed a controlled exercise of feasibility to demonstrate its potential to perform operational, coordinated measurements and deliver products in near-real time. Eleven lidar stations participated to the exercise which started on 9 July 2012 at 06:00 UT and ended 72 h later on 12 July at 06:00 UT. For the first time the Single-Calculus Chain (SCC), the common calculus chain developed within EARLINET for the automatic evaluation of lidar data from raw signals up to the final products, was used. All stations sent in real time measurements of 1 h of duration to the SCC server in a predefined netcdf file format. The pre-processing of the data was performed in real time by the SCC while the optical processing was performed in near-real time after the exercise ended. 98 and 84 % of the files sent to SCC were successfully pre-processed and processed, respectively. Those percentages are quite large taking into account that no cloud screening was performed on lidar data. The paper shows time series of continuous and homogeneously obtained products retrieved at different levels of the SCC: range-square corrected signals (pre-processing) and daytime backscatter and nighttime extinction coefficient profiles (optical processing), as well as combined plots of all direct and derived optical products. The derived products include backscatter- and extinction-related Ångström exponents, lidar ratios and color ratios. The combined plots reveal extremely valuable for aerosol classification. The efforts made to define the measurements protocol and to configure properly the SCC pave the way for applying this protocol for specific applications such as the monitoring of special events, atmospheric modelling, climate research and calibration/validation activities of spaceborne observations.
Near-Real-Time Detection and Monitoring of Dust Events by Satellite (SeaWIFS, MODIS, and TOMS)
NASA Technical Reports Server (NTRS)
Hsu, N. Christina; Tsay, Si-Chee; Herman, Jay R.; Kaufman, Yoram
2002-01-01
Over the last few years satellites have given us increasingly detailed information on the size, location, and duration of dust events around the world. These data not only provide valuable feedback to the modelling community as to the fidelity of their aerosol models but are also finding increasing use in near real-time applications. In particular, the ability to locate and track the development of aerosol dust clouds on a near real-time basis is being used by scientists and government to provide warning of air pollution episodes over major urban area. This ability has also become a crucial component of recent coordinated campaigns to study the characteristics of tropospheric aerosols such as dust and their effect on climate. One such recent campaign was ACE-Asia, which was designed to obtain the comprehensive set of ground, aircraft, and satellite data necessary to provide a detailed understanding of atmospheric aerosol particles over the Asian-Pacific region. As part of ACE-Asia, we developed a near real-time data processing and access system to provide satellite data from the polar-orbiting instruments Earth Probe TOMS (in the form of absorbing aerosol index) and SeaWiFS (in the form of aerosol optical thickness, AOT, and Angstrom exponent). The results were available via web access. The location and movement information provided by these data were used both in support of the day-to-day flight planning of ACE-Asia and as input into aerosol transport models. While near real-time SeaWiFS data processing can be performed using either the normal global data product or data obtained via direct broadcast to receiving stations close to the area of interest, near real-time MODIS processing of data to provide aerosol retrievals is currently only available using its direct broadcast capability. In this paper, we will briefly discuss the algorithms used to generate these data. The retrieved aerosol optical thickness and Angstrom exponent from SeaWiFS will be compared with those obtained from various AERONET sites over the Asian-Pacific region. The TOMS aerosol index will also be compared with AERONET aerosol optical thickness over different aerosol conditions, and comparisons between the MODIS and SeaWiFS data will also be presented. Finally, we will discuss the climate implication of our studies using the combined satellite and AERONET observations.
Temporal coding in a silicon network of integrate-and-fire neurons.
Liu, Shih-Chii; Douglas, Rodney
2004-09-01
Spatio-temporal processing of spike trains by neuronal networks depends on a variety of mechanisms distributed across synapses, dendrites, and somata. In natural systems, the spike trains and the processing mechanisms cohere though their common physical instantiation. This coherence is lost when the natural system is encoded for simulation on a general purpose computer. By contrast, analog VLSI circuits are, like neurons, inherently related by their real-time physics, and so, could provide a useful substrate for exploring neuronlike event-based processing. Here, we describe a hybrid analog-digital VLSI chip comprising a set of integrate-and-fire neurons and short-term dynamical synapses that can be configured into simple network architectures with some properties of neocortical neuronal circuits. We show that, despite considerable fabrication variance in the properties of individual neurons, the chip offers a viable substrate for exploring real-time spike-based processing in networks of neurons.
Monitoring microearthquakes with the San Andreas fault observatory at depth
Oye, V.; Ellsworth, W.L.
2007-01-01
In 2005, the San Andreas Fault Observatory at Depth (SAFOD) was drilled through the San Andreas Fault zone at a depth of about 3.1 km. The borehole has subsequently been instrumented with high-frequency geophones in order to better constrain locations and source processes of nearby microearthquakes that will be targeted in the upcoming phase of SAFOD. The microseismic monitoring software MIMO, developed by NORSAR, has been installed at SAFOD to provide near-real time locations and magnitude estimates using the high sampling rate (4000 Hz) waveform data. To improve the detection and location accuracy, we incorporate data from the nearby, shallow borehole (???250 m) seismometers of the High Resolution Seismic Network (HRSN). The event association algorithm of the MIMO software incorporates HRSN detections provided by the USGS real time earthworm software. The concept of the new event association is based on the generalized beam forming, primarily used in array seismology. The method requires the pre-computation of theoretical travel times in a 3D grid of potential microearthquake locations to the seismometers of the current station network. By minimizing the differences between theoretical and observed detection times an event is associated and the location accuracy is significantly improved.
Resolving Single Molecule Lysozyme Dynamics with a Carbon Nanotube Electronic Circuit
NASA Astrophysics Data System (ADS)
Choi, Yongki; Moody, Issa S.; Perez, Israel; Sheps, Tatyana; Weiss, Gregory A.; Collins, Philip G.
2011-03-01
High resolution, real-time monitoring of a single lysozyme molecule is demonstrated by fabricating nanoscale electronic devices based on single-walled carbon nanotubes (SWCNT). In this sensor platform, a biomolecule of interest is attached to a single SWCNT device. The electrical conductance transduces chemical events with single molecule sensitivity and 10 microsecond resolution. In this work, enzymatic turnover by lysozyme is investigated, because the mechanistic details for its processivity and dynamics remain incompletely understood. Stochastically distributed binding events between a lysozyme and its binding substrate, peptidoglycan, are monitored via the sensor conductance. Furthermore, the magnitude and repetition rate of these events varies with pH and the presence of inhibitors or denaturation agents. Changes in the conductance signal are analyzed in terms of lysozyme's internal hinge motion, binding events, and enzymatic processing.
Graphical user interface for image acquisition and processing
Goldberg, Kenneth A.
2002-01-01
An event-driven GUI-based image acquisition interface for the IDL programming environment designed for CCD camera control and image acquisition directly into the IDL environment where image manipulation and data analysis can be performed, and a toolbox of real-time analysis applications. Running the image acquisition hardware directly from IDL removes the necessity of first saving images in one program and then importing the data into IDL for analysis in a second step. Bringing the data directly into IDL creates an opportunity for the implementation of IDL image processing and display functions in real-time. program allows control over the available charge coupled device (CCD) detector parameters, data acquisition, file saving and loading, and image manipulation and processing, all from within IDL. The program is built using IDL's widget libraries to control the on-screen display and user interface.
EEGLAB, SIFT, NFT, BCILAB, and ERICA: new tools for advanced EEG processing.
Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott
2011-01-01
We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments.
ATLAS Simulation using Real Data: Embedding and Overlay
NASA Astrophysics Data System (ADS)
Haas, Andrew; ATLAS Collaboration
2017-10-01
For some physics processes studied with the ATLAS detector, a more accurate simulation in some respects can be achieved by including real data into simulated events, with substantial potential improvements in the CPU, disk space, and memory usage of the standard simulation configuration, at the cost of significant database and networking challenges. Real proton-proton background events can be overlaid (at the detector digitization output stage) on a simulated hard-scatter process, to account for pileup background (from nearby bunch crossings), cavern background, and detector noise. A similar method is used to account for the large underlying event from heavy ion collisions, rather than directly simulating the full collision. Embedding replaces the muons found in Z→μμ decays in data with simulated taus at the same 4-momenta, thus preserving the underlying event and pileup from the original data event. In all these cases, care must be taken to exactly match detector conditions (beamspot, magnetic fields, alignments, dead sensors, etc.) between the real data event and the simulation. We will discuss the status of these overlay and embedding techniques within ATLAS software and computing.
NASA Astrophysics Data System (ADS)
Schindewolf, Marcus; Kaiser, Andreas; Buchholtz, Arno; Schmidt, Jürgen
2017-04-01
Extreme rainfall events and resulting flash floods led to massive devastations in Germany during spring 2016. The study presented aims on the development of a early warning system, which allows the simulation and assessment of negative effects on infrastructure by radar-based heavy rainfall predictions, serving as input data for the process-based soil loss and deposition model EROSION 3D. Our approach enables a detailed identification of runoff and sediment fluxes in agricultural used landscapes. In a first step, documented historical events were analyzed concerning the accordance of measured radar rainfall and large scale erosion risk maps. A second step focused on a small scale erosion monitoring via UAV of source areas of heavy flooding events and a model reconstruction of the processes involved. In all examples damages were caused to local infrastructure. Both analyses are promising in order to detect runoff and sediment delivering areas even in a high temporal and spatial resolution. Results prove the important role of late-covering crops such as maize, sugar beet or potatoes in runoff generation. While e.g. winter wheat positively affects extensive runoff generation on undulating landscapes, massive soil loss and thus muddy flows are observed and depicted in model results. Future research aims on large scale model parameterization and application in real time, uncertainty estimation of precipitation forecast and interface developments.
NSTX-U Advances in Real-Time C++11 on Linux
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson, Keith G.
Programming languages like C and Ada combined with proprietary embedded operating systems have dominated the real-time application space for decades. The new C++11standard includes native, language-level support for concurrency, a required feature for any nontrivial event-oriented real-time software. Threads, Locks, and Atomics now exist to provide the necessary tools to build the structures that make up the foundation of a complex real-time system. The National Spherical Torus Experiment Upgrade (NSTX-U) at the Princeton Plasma Physics Laboratory (PPPL) is breaking new ground with the language as applied to the needs of fusion devices. A new Digital Coil Protection System (DCPS) willmore » serve as the main protection mechanism for the magnetic coils, and it is written entirely in C++11 running on Concurrent Computer Corporation's real-time operating system, RedHawk Linux. It runs over 600 algorithms in a 5 kHz control loop that determine whether or not to shut down operations before physical damage occurs. To accomplish this, NSTX-U engineers developed software tools that do not currently exist elsewhere, including real-time atomic synchronization, real-time containers, and a real-time logging framework. Together with a recent (and carefully configured) version of the GCC compiler, these tools enable data acquisition, processing, and output using a conventional operating system to meet a hard real-time deadline (that is, missing one periodic is a failure) of 200 microseconds.« less
NSTX-U Advances in Real-Time C++11 on Linux
Erickson, Keith G.
2015-08-14
Programming languages like C and Ada combined with proprietary embedded operating systems have dominated the real-time application space for decades. The new C++11standard includes native, language-level support for concurrency, a required feature for any nontrivial event-oriented real-time software. Threads, Locks, and Atomics now exist to provide the necessary tools to build the structures that make up the foundation of a complex real-time system. The National Spherical Torus Experiment Upgrade (NSTX-U) at the Princeton Plasma Physics Laboratory (PPPL) is breaking new ground with the language as applied to the needs of fusion devices. A new Digital Coil Protection System (DCPS) willmore » serve as the main protection mechanism for the magnetic coils, and it is written entirely in C++11 running on Concurrent Computer Corporation's real-time operating system, RedHawk Linux. It runs over 600 algorithms in a 5 kHz control loop that determine whether or not to shut down operations before physical damage occurs. To accomplish this, NSTX-U engineers developed software tools that do not currently exist elsewhere, including real-time atomic synchronization, real-time containers, and a real-time logging framework. Together with a recent (and carefully configured) version of the GCC compiler, these tools enable data acquisition, processing, and output using a conventional operating system to meet a hard real-time deadline (that is, missing one periodic is a failure) of 200 microseconds.« less
CISN ShakeAlert Earthquake Early Warning System Monitoring Tools
NASA Astrophysics Data System (ADS)
Henson, I. H.; Allen, R. M.; Neuhauser, D. S.
2015-12-01
CISN ShakeAlert is a prototype earthquake early warning system being developed and tested by the California Integrated Seismic Network. The system has recently been expanded to support redundant data processing and communications. It now runs on six machines at three locations with ten Apache ActiveMQ message brokers linking together 18 waveform processors, 12 event association processes and 4 Decision Module alert processes. The system ingests waveform data from about 500 stations and generates many thousands of triggers per day, from which a small portion produce earthquake alerts. We have developed interactive web browser system-monitoring tools that display near real time state-of-health and performance information. This includes station availability, trigger statistics, communication and alert latencies. Connections to regional earthquake catalogs provide a rapid assessment of the Decision Module hypocenter accuracy. Historical performance can be evaluated, including statistics for hypocenter and origin time accuracy and alert time latencies for different time periods, magnitude ranges and geographic regions. For the ElarmS event associator, individual earthquake processing histories can be examined, including details of the transmission and processing latencies associated with individual P-wave triggers. Individual station trigger and latency statistics are available. Detailed information about the ElarmS trigger association process for both alerted events and rejected events is also available. The Google Web Toolkit and Map API have been used to develop interactive web pages that link tabular and geographic information. Statistical analysis is provided by the R-Statistics System linked to a PostgreSQL database.
The time to remember: Temporal compression and duration judgements in memory for real-life events.
Jeunehomme, Olivier; D'Argembeau, Arnaud
2018-05-01
Recent studies suggest that the continuous flow of information that constitutes daily life events is temporally compressed in episodic memory, yet the characteristics and determinants of this compression mechanism remain unclear. This study examined this question using an experimental paradigm incorporating wearable camera technology. Participants experienced a series of real-life events and were later asked to mentally replay various event sequences that were cued by pictures taken during the original events. Estimates of temporal compression (the ratio of the time needed to mentally re-experience an event to the actual event duration) showed that events were replayed, on average, about eight times faster than the original experiences. This compression mechanism seemed to operate by representing events as a succession of moments or slices of prior experience separated by temporal discontinuities. Importantly, however, rates of temporal compression were not constant and were lower for events involving goal-directed actions. The results also showed that the perceived duration of events increased with the density of recalled moments of prior experience. Taken together, these data extend our understanding of the mechanisms underlying the temporal compression and perceived duration of real-life events in episodic memory.
Real-time improvement of continuous glucose monitoring accuracy: the smart sensor concept.
Facchinetti, Andrea; Sparacino, Giovanni; Guerra, Stefania; Luijf, Yoeri M; DeVries, J Hans; Mader, Julia K; Ellmerer, Martin; Benesch, Carsten; Heinemann, Lutz; Bruttomesso, Daniela; Avogaro, Angelo; Cobelli, Claudio
2013-04-01
Reliability of continuous glucose monitoring (CGM) sensors is key in several applications. In this work we demonstrate that real-time algorithms can render CGM sensors smarter by reducing their uncertainty and inaccuracy and improving their ability to alert for hypo- and hyperglycemic events. The smart CGM (sCGM) sensor concept consists of a commercial CGM sensor whose output enters three software modules, able to work in real time, for denoising, enhancement, and prediction. These three software modules were recently presented in the CGM literature, and here we apply them to the Dexcom SEVEN Plus continuous glucose monitor. We assessed the performance of the sCGM on data collected in two trials, each containing 12 patients with type 1 diabetes. The denoising module improves the smoothness of the CGM time series by an average of ∼57%, the enhancement module reduces the mean absolute relative difference from 15.1 to 10.3%, increases by 12.6% the pairs of values falling in the A-zone of the Clarke error grid, and finally, the prediction module forecasts hypo- and hyperglycemic events an average of 14 min ahead of time. We have introduced and implemented the sCGM sensor concept. Analysis of data from 24 patients demonstrates that incorporation of suitable real-time signal processing algorithms for denoising, enhancement, and prediction can significantly improve the performance of CGM applications. This can be of great clinical impact for hypo- and hyperglycemic alert generation as well in artificial pancreas devices.
Optimizing SIEM Throughput on the Cloud Using Parallelization
Alam, Masoom; Ihsan, Asif; Javaid, Qaisar; Khan, Abid; Manzoor, Jawad; Akhundzada, Adnan; Khan, M Khurram; Farooq, Sajid
2016-01-01
Processing large amounts of data in real time for identifying security issues pose several performance challenges, especially when hardware infrastructure is limited. Managed Security Service Providers (MSSP), mostly hosting their applications on the Cloud, receive events at a very high rate that varies from a few hundred to a couple of thousand events per second (EPS). It is critical to process this data efficiently, so that attacks could be identified quickly and necessary response could be initiated. This paper evaluates the performance of a security framework OSTROM built on the Esper complex event processing (CEP) engine under a parallel and non-parallel computational framework. We explain three architectures under which Esper can be used to process events. We investigated the effect on throughput, memory and CPU usage in each configuration setting. The results indicate that the performance of the engine is limited by the number of events coming in rather than the queries being processed. The architecture where 1/4th of the total events are submitted to each instance and all the queries are processed by all the units shows best results in terms of throughput, memory and CPU usage. PMID:27851762
Point process modeling and estimation: Advances in the analysis of dynamic neural spiking data
NASA Astrophysics Data System (ADS)
Deng, Xinyi
2016-08-01
A common interest of scientists in many fields is to understand the relationship between the dynamics of a physical system and the occurrences of discrete events within such physical system. Seismologists study the connection between mechanical vibrations of the Earth and the occurrences of earthquakes so that future earthquakes can be better predicted. Astrophysicists study the association between the oscillating energy of celestial regions and the emission of photons to learn the Universe's various objects and their interactions. Neuroscientists study the link between behavior and the millisecond-timescale spike patterns of neurons to understand higher brain functions. Such relationships can often be formulated within the framework of state-space models with point process observations. The basic idea is that the dynamics of the physical systems are driven by the dynamics of some stochastic state variables and the discrete events we observe in an interval are noisy observations with distributions determined by the state variables. This thesis proposes several new methodological developments that advance the framework of state-space models with point process observations at the intersection of statistics and neuroscience. In particular, we develop new methods 1) to characterize the rhythmic spiking activity using history-dependent structure, 2) to model population spike activity using marked point process models, 3) to allow for real-time decision making, and 4) to take into account the need for dimensionality reduction for high-dimensional state and observation processes. We applied these methods to a novel problem of tracking rhythmic dynamics in the spiking of neurons in the subthalamic nucleus of Parkinson's patients with the goal of optimizing placement of deep brain stimulation electrodes. We developed a decoding algorithm that can make decision in real-time (for example, to stimulate the neurons or not) based on various sources of information present in population spiking data. Lastly, we proposed a general three-step paradigm that allows us to relate behavioral outcomes of various tasks to simultaneously recorded neural activity across multiple brain areas, which is a step towards closed-loop therapies for psychological diseases using real-time neural stimulation. These methods are suitable for real-time implementation for content-based feedback experiments.
A digital pixel cell for address event representation image convolution processing
NASA Astrophysics Data System (ADS)
Camunas-Mesa, Luis; Acosta-Jimenez, Antonio; Serrano-Gotarredona, Teresa; Linares-Barranco, Bernabe
2005-06-01
Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows for real-time virtual massive connectivity between huge number of neurons located on different chips. By exploiting high speed digital communication circuits (with nano-seconds timings), synaptic neural connections can be time multiplexed, while neural activity signals (with mili-seconds timings) are sampled at low frequencies. Also, neurons generate events according to their information levels. Neurons with more information (activity, derivative of activities, contrast, motion, edges,...) generate more events per unit time, and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. AER technology has been used and reported for the implementation of various type of image sensors or retinae: luminance with local agc, contrast retinae, motion retinae,... Also, there has been a proposal for realizing programmable kernel image convolution chips. Such convolution chips would contain an array of pixels that perform weighted addition of events. Once a pixel has added sufficient event contributions to reach a fixed threshold, the pixel fires an event, which is then routed out of the chip for further processing. Such convolution chips have been proposed to be implemented using pulsed current mode mixed analog and digital circuit techniques. In this paper we present a fully digital pixel implementation to perform the weighted additions and fire the events. This way, for a given technology, there is a fully digital implementation reference against which compare the mixed signal implementations. We have designed, implemented and tested a fully digital AER convolution pixel. This pixel will be used to implement a full AER convolution chip for programmable kernel image convolution processing.
RoboTAP: Target priorities for robotic microlensing observations
NASA Astrophysics Data System (ADS)
Hundertmark, M.; Street, R. A.; Tsapras, Y.; Bachelet, E.; Dominik, M.; Horne, K.; Bozza, V.; Bramich, D. M.; Cassan, A.; D'Ago, G.; Figuera Jaimes, R.; Kains, N.; Ranc, C.; Schmidt, R. W.; Snodgrass, C.; Wambsganss, J.; Steele, I. A.; Mao, S.; Ment, K.; Menzies, J.; Li, Z.; Cross, S.; Maoz, D.; Shvartzvald, Y.
2018-01-01
Context. The ability to automatically select scientifically-important transient events from an alert stream of many such events, and to conduct follow-up observations in response, will become increasingly important in astronomy. With wide-angle time domain surveys pushing to fainter limiting magnitudes, the capability to follow-up on transient alerts far exceeds our follow-up telescope resources, and effective target prioritization becomes essential. The RoboNet-II microlensing program is a pathfinder project, which has developed an automated target selection process (RoboTAP) for gravitational microlensing events, which are observed in real time using the Las Cumbres Observatory telescope network. Aims: Follow-up telescopes typically have a much smaller field of view compared to surveys, therefore the most promising microlensing events must be automatically selected at any given time from an annual sample exceeding 2000 events. The main challenge is to select between events with a high planet detection sensitivity, with the aim of detecting many planets and characterizing planetary anomalies. Methods: Our target selection algorithm is a hybrid system based on estimates of the planet detection zones around a microlens. It follows automatic anomaly alerts and respects the expected survey coverage of specific events. Results: We introduce the RoboTAP algorithm, whose purpose is to select and prioritize microlensing events with high sensitivity to planetary companions. In this work, we determine the planet sensitivity of the RoboNet follow-up program and provide a working example of how a broker can be designed for a real-life transient science program conducting follow-up observations in response to alerts; we explore the issues that will confront similar programs being developed for the Large Synoptic Survey Telescope (LSST) and other time domain surveys.
NASA Technical Reports Server (NTRS)
Myers, E.; Coppin, P.; Wagner, M.; Fischer, K.; Lu, L.; McCloskey, R.; Seneker, D.; Cabrol, N.; Wettergreen, D.; Waggoner, A.
2005-01-01
The EventScope educational telepresence project has been involved with education and public outreach for a number of NASA-sponsored missions including the Mars Exploration Rovers, the Odyssey Mission, and the Life in the Atacama project. However, during the second year of operations in the Atacama, a modified version of the EventScope public interface was used as the remote science operations interface. In addition, the EventScope lab hosted remote science operations. This intimate connection with the mission operations allowed the EventScope team to bring the experience of the mission to the public in near real-time. Playing to this strength, the lab developed strategies for releasing E/PO content as close to real-time as possible.
Real-Time Tropospheric Delay Estimation using IGS Products
NASA Astrophysics Data System (ADS)
Stürze, Andrea; Liu, Sha; Söhne, Wolfgang
2014-05-01
The Federal Agency for Cartography and Geodesy (BKG) routinely provides zenith tropospheric delay (ZTD) parameter for the assimilation in numerical weather models since more than 10 years. Up to now the results flowing into the EUREF Permanent Network (EPN) or E-GVAP (EUMETNET EIG GNSS water vapour programme) analysis are based on batch processing of GPS+GLONASS observations in differential network mode. For the recently started COST Action ES1206 about "Advanced Global Navigation Satellite Systems tropospheric products for monitoring severe weather events and climate" (GNSS4SWEC), however, rapid updates in the analysis of the atmospheric state for nowcasting applications require changing the processing strategy towards real-time. In the RTCM SC104 (Radio Technical Commission for Maritime Services, Special Committee 104) a format combining the advantages of Precise Point Positioning (PPP) and Real-Time Kinematic (RTK) is under development. The so-called State Space Representation approach is defining corrections, which will be transferred in real-time to the user e.g. via NTRIP (Network Transport of RTCM via Internet Protocol). Meanwhile messages for precise orbits, satellite clocks and code biases compatible to the basic PPP mode using IGS products are defined. Consequently, the IGS Real-Time Service (RTS) was launched in 2013 in order to extend the well-known precise orbit and clock products by a real-time component. Further messages e.g. with respect to ionosphere or phase biases are foreseen. Depending on the level of refinement, so different accuracies up to the RTK level shall be reachable. In co-operation of BKG and the Technical University of Darmstadt the real-time software GEMon (GREF EUREF Monitoring) is under development. GEMon is able to process GPS and GLONASS observation and RTS product data streams in PPP mode. Furthermore, several state-of-the-art troposphere models, for example based on numerical weather prediction data, are implemented. Hence, it opens the possibility to evaluate the potential of troposphere parameter determination in real-time and its effect to Precise Point Positioning. Starting with an offline investigation of the influence of different RTS products and a priori troposphere models the configuration delivering the best results is used for a real-time processing of the GREF (German Geodetic Reference) network over a suitable period of time. The evaluation of the derived ZTD parameters and station heights is done with respect to well proven GREF, EUREF, IGS, and E-GVAP analysis results. Keywords: GNSS, Zenith Tropospheric Delay, Real-time Precise Point Positioning
Monitoring Natural Events Globally in Near Real-Time Using NASA's Open Web Services and Tools
NASA Technical Reports Server (NTRS)
Boller, Ryan A.; Ward, Kevin Alan; Murphy, Kevin J.
2015-01-01
Since 1960, NASA has been making global measurements of the Earth from a multitude of space-based missions, many of which can be useful for monitoring natural events. In recent years, these measurements have been made available in near real-time, making it possible to use them to also aid in managing the response to natural events. We present the challenges and ongoing solutions to using NASA satellite data for monitoring and managing these events.
Romanian Data Center: A modern way for seismic monitoring
NASA Astrophysics Data System (ADS)
Neagoe, Cristian; Marius Manea, Liviu; Ionescu, Constantin
2014-05-01
The main seismic survey of Romania is performed by the National Institute for Earth Physics (NIEP) which operates a real-time digital seismic network. The NIEP real-time network currently consists of 102 stations and two seismic arrays equipped with different high quality digitizers (Kinemetrics K2, Quanterra Q330, Quanterra Q330HR, PS6-26, Basalt), broadband and short period seismometers (CMG3ESP, CMG40T, KS2000, KS54000, KS2000, CMG3T,STS2, SH-1, S13, Mark l4c, Ranger, gs21, Mark l22) and acceleration sensors (Episensor Kinemetrics). The data are transmitted at the National Data Center (NDC) and Eforie Nord (EFOR) Seismic Observatory. EFOR is the back-up for the NDC and also a monitoring center for the Black Sea tsunami events. NIEP is a data acquisition node for the seismic network of Moldova (FDSN code MD) composed of five seismic stations. NIEP has installed in the northern part of Bulgaria eight seismic stations equipped with broadband sensors and Episensors and nine accelerometers (Episensors) installed in nine districts along the Danube River. All the data are acquired at NIEP for Early Warning System and for primary estimation of the earthquake parameters. The real-time acquisition (RT) and data exchange is done by Antelope software and Seedlink (from Seiscomp3). The real-time data communication is ensured by different types of transmission: GPRS, satellite, radio, Internet and a dedicated line provided by a governmental network. For data processing and analysis at the two data centers Antelope 5.2 TM is being used running on 3 workstations: one from a CentOS platform and two on MacOS. Also a Seiscomp3 server stands as back-up for Antelope 5.2 Both acquisition and analysis of seismic data systems produce information about local and global parameters of earthquakes. In addition, Antelope is used for manual processing (event association, calculation of magnitude, creating a database, sending seismic bulletins, calculation of PGA and PGV, etc.), generating ShakeMap products and interaction with global data centers. National Data Center developed tools to enable centralizing of data from software like Antelope and Seiscomp3. These tools allow rapid distribution of information about damages observed after an earthquake to the public. Another feature of the developed application is the alerting of designated persons, via email and SMS, based on the earthquake parameters. In parallel, Seiscomp3 sends automatic notifications (emails) with the earthquake parameters. The real-time seismic network and software acquisition and data processing used in the National Data Center development have increased the number of events detected locally and globally, the increase of the quality parameters obtained by data processing and potentially increasing visibility on the national and internationally.
Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin
2016-10-01
Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses.
Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin
2016-01-01
Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses. PMID:27706086
NASA Astrophysics Data System (ADS)
Kar, B.; Robinson, C.; Koch, D. B.; Omitaomu, O.
2017-12-01
The Sendai Framework for Disaster Risk Reduction 2015-2030 identified the following four priorities to prevent and reduce disaster risks: i) understanding disaster risk; ii) strengthening governance to manage disaster risk; iii) investing in disaster risk reduction for resilience and; iv) enhancing disaster preparedness for effective response, and to "Build Back Better" in recovery, rehabilitation and reconstruction. While forecasting and decision making tools are in place to predict and understand future impacts of natural hazards, the knowledge to action approach that currently exists fails to provide updated information needed by decision makers to undertake response and recovery efforts following a hazard event. For instance, during a tropical storm event advisories are released every two to three hours, but manual analysis of geospatial data to determine potential impacts of the event tends to be time-consuming and a post-event process. Researchers at Oak Ridge National Laboratory have developed a Spatial Decision Support System that enables real-time analysis of storm impact based on updated advisory. A prototype of the tool that focuses on determining projected power outage areas and projected duration of outages demonstrates the feasibility of integrating science with decision making for emergency management personnel to act in real time to protect communities and reduce risk.
Visual tracking using neuromorphic asynchronous event-based cameras.
Ni, Zhenjiang; Ieng, Sio-Hoi; Posch, Christoph; Régnier, Stéphane; Benosman, Ryad
2015-04-01
This letter presents a novel computationally efficient and robust pattern tracking method based on a time-encoded, frame-free visual data. Recent interdisciplinary developments, combining inputs from engineering and biology, have yielded a novel type of camera that encodes visual information into a continuous stream of asynchronous, temporal events. These events encode temporal contrast and intensity locally in space and time. We show that the sparse yet accurately timed information is well suited as a computational input for object tracking. In this letter, visual data processing is performed for each incoming event at the time it arrives. The method provides a continuous and iterative estimation of the geometric transformation between the model and the events representing the tracked object. It can handle isometry, similarities, and affine distortions and allows for unprecedented real-time performance at equivalent frame rates in the kilohertz range on a standard PC. Furthermore, by using the dimension of time that is currently underexploited by most artificial vision systems, the method we present is able to solve ambiguous cases of object occlusions that classical frame-based techniques handle poorly.
Neural Bases of Sequence Processing in Action and Language
ERIC Educational Resources Information Center
Carota, Francesca; Sirigu, Angela
2008-01-01
Real-time estimation of what we will do next is a crucial prerequisite of purposive behavior. During the planning of goal-oriented actions, for instance, the temporal and causal organization of upcoming subsequent moves needs to be predicted based on our knowledge of events. A forward computation of sequential structure is also essential for…
NASA Technical Reports Server (NTRS)
Zalameda, Joseph N.; Horne, Michael R.; Madaras, Eric I.; Burke, Eric R.
2016-01-01
Passive thermography and acoustic emission data were obtained for improved real time damage detection during fatigue loading. A strong positive correlation was demonstrated between acoustic energy event location and thermal heating, especially if the structure under load was nearing ultimate failure. An image processing routine was developed to map the acoustic emission data onto the thermal imagery. This required removing optical barrel distortion and angular rotation from the thermal data. The acoustic emission data were then mapped onto thermal data, revealing the cluster of acoustic emission event locations around the thermal signatures of interest. By combining both techniques, progression of damage growth is confirmed and areas of failure are identified. This technology provides improved real time inspections of advanced composite structures during fatigue testing.Keywords: Thermal nondestructive evaluation, fatigue damage detection, aerospace composite inspection, acoustic emission, passive thermography
Närhi, Mikko; Wetzel, Benjamin; Billet, Cyril; Toenger, Shanti; Sylvestre, Thibaut; Merolla, Jean-Marc; Morandotti, Roberto; Dias, Frederic; Genty, Goëry; Dudley, John M.
2016-01-01
Modulation instability is a fundamental process of nonlinear science, leading to the unstable breakup of a constant amplitude solution of a physical system. There has been particular interest in studying modulation instability in the cubic nonlinear Schrödinger equation, a generic model for a host of nonlinear systems including superfluids, fibre optics, plasmas and Bose–Einstein condensates. Modulation instability is also a significant area of study in the context of understanding the emergence of high amplitude events that satisfy rogue wave statistical criteria. Here, exploiting advances in ultrafast optical metrology, we perform real-time measurements in an optical fibre system of the unstable breakup of a continuous wave field, simultaneously characterizing emergent modulation instability breather pulses and their associated statistics. Our results allow quantitative comparison between experiment, modelling and theory, and are expected to open new perspectives on studies of instability dynamics in physics. PMID:27991513
NASA Pioneer: Venus reverse playback telemetry program TR 78-2
NASA Technical Reports Server (NTRS)
Modestino, J. W.; Daut, D. G.; Vickers, A. L.; Matis, K. R.
1978-01-01
During the entry of the Pioneer Venus Atmospheric Probes into the Venus atmosphere, there were several events (RF blackout and data rate changes) which caused the ground receiving equipment to lose lock on the signal. This caused periods of data loss immediately following each one of these disturbing events which lasted until all the ground receiving units (receiver, subcarrier demodulator, symbol synchronizer, and sequential decoder) acquired lock once more. A scheme to recover these data by off-line data processing was implemented. This scheme consisted of receiving the S band signals from the probes with an open loop reciever (requiring no lock up on the signal) in parallel with the closed loop receivers of the real time receiving equipment, down converting the signals to baseband, and recording them on an analog recorder. The off-line processing consisted of playing the analog recording in the reverse direction (starting with the end of the tape) up, converting the signal to S-band, feeding the signal into the "real time" receiving system and recording on digital tape, the soft decisions from the symbol synchronizer.
The California Integrated Seismic Network
NASA Astrophysics Data System (ADS)
Hellweg, M.; Given, D.; Hauksson, E.; Neuhauser, D.; Oppenheimer, D.; Shakal, A.
2007-05-01
The mission of the California Integrated Seismic Network (CISN) is to operate a reliable, modern system to monitor earthquakes throughout the state; to generate and distribute information in real-time for emergency response, for the benefit of public safety, and for loss mitigation; and to collect and archive data for seismological and earthquake engineering research. To meet these needs, the CISN operates data processing and archiving centers, as well as more than 3000 seismic stations. Furthermore, the CISN is actively developing and enhancing its infrastructure, including its automated processing and archival systems. The CISN integrates seismic and strong motion networks operated by the University of California Berkeley (UCB), the California Institute of Technology (Caltech), and the United States Geological Survey (USGS) offices in Menlo Park and Pasadena, as well as the USGS National Strong Motion Program (NSMP), and the California Geological Survey (CGS). The CISN operates two earthquake management centers (the NCEMC and SCEMC) where statewide, real-time earthquake monitoring takes place, and an engineering data center (EDC) for processing strong motion data and making it available in near real-time to the engineering community. These centers employ redundant hardware to minimize disruptions to the earthquake detection and processing systems. At the same time, dual feeds of data from a subset of broadband and strong motion stations are telemetered in real- time directly to both the NCEMC and the SCEMC to ensure the availability of statewide data in the event of a catastrophic failure at one of these two centers. The CISN uses a backbone T1 ring (with automatic backup over the internet) to interconnect the centers and the California Office of Emergency Services. The T1 ring enables real-time exchange of selected waveforms, derived ground motion data, phase arrivals, earthquake parameters, and ShakeMaps. With the goal of operating similar and redundant statewide earthquake processing systems at both real-time EMCs, the CISN is currently adopting and enhancing the database-centric, earthquake processing and analysis software originally developed for the Caltech/USGS Pasadena TriNet project. Earthquake data and waveforms are made available to researchers and to the public in near real-time through the CISN's Northern and Southern California Eathquake Data Centers (NCEDC and SCEDC) and through the USGS Earthquake Notification System (ENS). The CISN partners have developed procedures to automatically exchange strong motion data, both waveforms and peak parameters, for use in ShakeMap and in the rapid engineering reports which are available near real-time through the strong motion EDC.
CLARA: CLAS12 Reconstruction and Analysis Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gyurjyan, Vardan; Matta, Sebastian Mancilla; Oyarzun, Ricardo
2016-11-01
In this paper we present SOA based CLAS12 event Reconstruction and Analyses (CLARA) framework. CLARA design focus is on two main traits: real-time data stream processing, and service-oriented architecture (SOA) in a flow based programming (FBP) paradigm. Data driven and data centric architecture of CLARA presents an environment for developing agile, elastic, multilingual data processing applications. The CLARA framework presents solutions capable of processing large volumes of data interactively and substantially faster than batch systems.
Event-driven time-optimal control for a class of discontinuous bioreactors.
Moreno, Jaime A; Betancur, Manuel J; Buitrón, Germán; Moreno-Andrade, Iván
2006-07-05
Discontinuous bioreactors may be further optimized for processing inhibitory substrates using a convenient fed-batch mode. To do so the filling rate must be controlled in such a way as to push the reaction rate to its maximum value, by increasing the substrate concentration just up to the point where inhibition begins. However, an exact optimal controller requires measuring several variables (e.g., substrate concentrations in the feed and in the tank) and also good model knowledge (e.g., yield and kinetic parameters), requirements rarely satisfied in real applications. An environmentally important case, that exemplifies all these handicaps, is toxicant wastewater treatment. There the lack of online practical pollutant sensors may allow unforeseen high shock loads to be fed to the bioreactor, causing biomass inhibition that slows down the treatment process and, in extreme cases, even renders the biological process useless. In this work an event-driven time-optimal control (ED-TOC) is proposed to circumvent these limitations. We show how to detect a "there is inhibition" event by using some computable function of the available measurements. This event drives the ED-TOC to stop the filling. Later, by detecting the symmetric event, "there is no inhibition," the ED-TOC may restart the filling. A fill-react cycling then maintains the process safely hovering near its maximum reaction rate, allowing a robust and practically time-optimal operation of the bioreactor. An experimental study case of a wastewater treatment process application is presented. There the dissolved oxygen concentration was used to detect the events needed to drive the controller. (c) 2006 Wiley Periodicals, Inc.
2011-01-01
Background Cardiac arrest affects 30-35, 000 hospitalised patients in the UK every year. For these patients to be given the best chance of survival, high quality cardiopulmonary resuscitation (CPR) must be delivered, however the quality of CPR in real-life is often suboptimal. CPR feedback devices have been shown to improve CPR quality in the pre-hospital setting and post-event debriefing can improve adherence to guidelines and CPR quality. However, the evidence for use of these improvement methods in hospital remains unclear. The CPR quality improvement initiative is a prospective cohort study of the Q-CPR real-time feedback device combined with post-event debriefing in hospitalised adult patients who sustain a cardiac arrest. Methods/design The primary objective of this trial is to assess whether a CPR quality improvement initiative will improve rate of return of sustained spontaneous circulation in in-hospital-cardiac-arrest patients. The study is set in one NHS trust operating three hospital sites. Secondary objectives will evaluate: any return of spontaneous circulation; survival to hospital discharge and patient cerebral performance category at discharge; quality of CPR variables and cardiac arrest team factors. Methods: All three sites will have an initial control phase before any improvements are implemented; site 1 will implement audiovisual feedback combined with post event debriefing, site 2 will implement audiovisual feedback only and site 3 will remain as a control site to measure any changes in outcome due to any other trust-wide changes in resuscitation practice. All adult patients sustaining a cardiac arrest and receiving resuscitation from the hospital cardiac arrest team will be included. Patients will be excluded if; they have a Do-not-attempt resuscitation order written and documented in their medical records, the cardiac arrest is not attended by a resuscitation team, the arrest occurs out-of-hospital or the patient has previously participated in this study. The trial will recruit a total of 912 patients from the three hospital sites. Conclusion This trial will evaluate patient and process focussed outcomes following the implementation of a CPR quality improvement initiative using real-time audiovisual feedback and post event debriefing. Trial registration ISRCTN56583860 PMID:22008636
Technical improvements for the dynamic measurement of general scour and landslides
NASA Astrophysics Data System (ADS)
Chung Yang, Han; Su, Chih Chiang
2017-04-01
Disasters occurring near riverbeds, such as landslides, earth slides, debris flow, and general scour, are easily caused by flooding from typhoons. The occurrence of each type of disaster involves a process, so if a disaster event can be monitored in real time, hazards can be predicted, thereby enabling early warnings that could reduce the degree of loss engendered by the disaster. The study of technical improvements for the dynamic measurement of general scour and landslides could help to release these early warnings. In this study, improved wireless tracers were set up on site to ensure the feasibility of the improved measurement technology. A wireless tracer signal transmission system was simultaneously set up to avoid danger to surveyors caused by them having to be on site to take measurements. In order to understand the real-time dynamic riverbed scouring situation, after the flow path of the river was confirmed, the sites for riverbed scouring observation were established at the P30 pier of the Dajia River Bridge of National Highway No. 3, and approximately 100 m both upstream and downstream (for a total of three sites). A rainy event that caused riverbed erosion occurred in May 2015, and subsequently, Typhoons Soudelor, Goni, and Dujuan caused further erosion in the observed area. The results of the observations of several flood events revealed that wireless tracers can reflect the change in riverbed scour depth caused by typhoons and flooding in real time. The wireless tracer technique can be applied to real-time dynamic scouring observation of rivers, and these improvements in measurement technology could be helpful in preventing landslides in the future.
Jensen, Morten Hasselstrøm; Christensen, Toke Folke; Tarnow, Lise; Seto, Edmund; Dencker Johansen, Mette; Hejlesen, Ole Kristian
2013-07-01
Hypoglycemia is a potentially fatal condition. Continuous glucose monitoring (CGM) has the potential to detect hypoglycemia in real time and thereby reduce time in hypoglycemia and avoid any further decline in blood glucose level. However, CGM is inaccurate and shows a substantial number of cases in which the hypoglycemic event is not detected by the CGM. The aim of this study was to develop a pattern classification model to optimize real-time hypoglycemia detection. Features such as time since last insulin injection and linear regression, kurtosis, and skewness of the CGM signal in different time intervals were extracted from data of 10 male subjects experiencing 17 insulin-induced hypoglycemic events in an experimental setting. Nondiscriminative features were eliminated with SEPCOR and forward selection. The feature combinations were used in a Support Vector Machine model and the performance assessed by sample-based sensitivity and specificity and event-based sensitivity and number of false-positives. The best model was composed by using seven features and was able to detect 17 of 17 hypoglycemic events with one false-positive compared with 12 of 17 hypoglycemic events with zero false-positives for the CGM alone. Lead-time was 14 min and 0 min for the model and the CGM alone, respectively. This optimized real-time hypoglycemia detection provides a unique approach for the diabetes patient to reduce time in hypoglycemia and learn about patterns in glucose excursions. Although these results are promising, the model needs to be validated on CGM data from patients with spontaneous hypoglycemic events.
Using multiplets to track volcanic processes at Kilauea Volcano, Hawaii
NASA Astrophysics Data System (ADS)
Thelen, W. A.
2011-12-01
Multiplets, or repeating earthquakes, are commonly observed at volcanoes, particularly those exhibiting unrest. At Kilauea, multiplets have been observed as part of long period (LP) earthquake swarms [Battaglia et al., 2003] and as volcano-tectonic (VT) earthquakes associated with dike intrusion [Rubin et al., 1998]. The focus of most previous studies has been on the precise location of the multiplets based on reviewed absolute locations, a process that can require extensive human intervention and post-processing. Conversely, the detection of multiplets and measurement of multiplet parameters can be done in real-time without human interaction with locations approximated by the stations that best record the multiplet. The Hawaiian Volcano Observatory (HVO) is in the process of implementing and testing an algorithm to detect multiplets in near-real time and to analyze certain metrics to provide enhanced interpretive insights into ongoing volcanic processes. Metrics such as multiplet percent of total seismicity, multiplet event recurrence interval, multiplet lifespan, average event amplitude, and multiplet event amplitude variability have been shown to be valuable in understanding volcanic processes at Bezymianny Volcano, Russia and Mount St. Helens, Washington and thus are tracked as part of the algorithm. The near real-time implementation of the algorithm can be triggered from an earthworm subnet trigger or other triggering algorithm and employs a MySQL database to store results, similar to an algorithm implemented by Got et al. [2002]. Initial results using this algorithm to analyze VT earthquakes along Kilauea's Upper East Rift Zone between September 2010 and August 2011 show that periods of summit pressurization coincide with ample multiplet development. Summit pressurization is loosely defined by high rates of seismicity within the summit and Upper East Rift areas, coincident with lava high stands in the Halema`uma`u lava lake. High percentages, up to 100%, of earthquakes occurring during summit pressurization were part of a multiplet. Percentages were particularly high immediately prior to the March 5 Kamoamoa eruption. Interestingly, many multiplets that were present prior to the Kamoamoa eruption were reactivated during summit pressurization occurring in late July 2011. At a correlation coefficient of 0.7, 90% of the multiplets during the study period had populations of 10 or fewer earthquakes. Between periods of summit pressurization, earthquakes that belong to multiplets rarely occur, even though magma is flowing through the Upper East Rift Zone. Battaglia, J., Got, J. L. and Okubo, P., 2003. Location of long-period events below Kilauea Volcano using seismic amplitudes and accurate relative relocation. Journal of Geophysical Research-Solid Earth, v.108 (B12) 2553. Got, J. L., P. Okubo, R. Machenbaum, and W. Tanigawa (2002), A real-time procedure for progressive multiplet relative relocation at the Hawaiian Volcano Observatory, Bulletin of the Seismological Society of America, 92(5), 2019. Rubin, A. M., D. Gillard, and J. L. Got (1998), A reinterpretation of seismicity associated with the January 1983 dike intrusion at Kilauea Volcano, Hawaii, Journal of Geophysical Research-Solid Earth, 103(B5), 10003.
A Framework of Simple Event Detection in Surveillance Video
NASA Astrophysics Data System (ADS)
Xu, Weiguang; Zhang, Yafei; Lu, Jianjiang; Tian, Yulong; Wang, Jiabao
Video surveillance is playing more and more important role in people's social life. Real-time alerting of threaten events and searching interesting content in stored large scale video footage needs human operator to pay full attention on monitor for long time. The labor intensive mode has limit the effectiveness and efficiency of the system. A framework of simple event detection is presented advance the automation of video surveillance. An improved inner key point matching approach is used to compensate motion of background in real-time; frame difference are used to detect foreground; HOG based classifiers are used to classify foreground object into people and car; mean-shift is used to tracking the recognized objects. Events are detected based on predefined rules. The maturity of the algorithms guarantee the robustness of the framework, and the improved approach and the easily checked rules enable the framework to work in real-time. Future works to be done are also discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosso, A.
Since the large North Eastern power system blackout on August 14, 2003, U.S. electric utilities have spent lot of effort on preventing power system cascading outages. Two of the main causes of the August 14, 2003 blackout were inadequate situational awareness and inadequate operator training In addition to the enhancements of the infrastructure of the interconnected power systems, more research and development of advanced power system applications are required for improving the wide-area security monitoring, operation and planning in order to prevent large- scale cascading outages of interconnected power systems. It is critically important for improving the wide-area situation awarenessmore » of the operators or operational engineers and regional reliability coordinators of large interconnected systems. With the installation of large number of phasor measurement units (PMU) and the related communication infrastructure, it will be possible to improve the operators’ situation awareness and to quickly identify the sequence of events during a large system disturbance for the post-event analysis using the real-time or historical synchrophasor data. The purpose of this project was to develop and demonstrate a novel synchrophasor-based comprehensive situational awareness system for control centers of power transmission systems. The developed system named WASA (Wide Area Situation Awareness) is intended to improve situational awareness at control centers of the power system operators and regional reliability coordinators. It consists of following main software modules: • Wide-area visualizations of real-time frequency, voltage, and phase angle measurements and their contour displays for security monitoring. • Online detection and location of a major event (location, time, size, and type, such as generator or line outage). • Near-real-time event replay (in seconds) after a major event occurs. • Early warning of potential wide-area stability problems. The system has been deployed and demonstrated at the Tennessee Valley Authority (TVA) and ISO New England system using real-time synchrophasor data from openPDC. Apart from the software product, the outcome of this project consists of a set of technical reports and papers describing the mathematical foundations and computational approaches of different tools and modules, implementation issues and considerations, lessons learned, and the results of lidation processes.« less
Rapid classification of hippocampal replay content for real-time applications
Liu, Daniel F.; Karlsson, Mattias P.; Frank, Loren M.; Eden, Uri T.
2016-01-01
Sharp-wave ripple (SWR) events in the hippocampus replay millisecond-timescale patterns of place cell activity related to the past experience of an animal. Interrupting SWR events leads to learning and memory impairments, but how the specific patterns of place cell spiking seen during SWRs contribute to learning and memory remains unclear. A deeper understanding of this issue will require the ability to manipulate SWR events based on their content. Accurate real-time decoding of SWR replay events requires new algorithms that are able to estimate replay content and the associated uncertainty, along with software and hardware that can execute these algorithms for biological interventions on a millisecond timescale. Here we develop an efficient estimation algorithm to categorize the content of replay from multiunit spiking activity. Specifically, we apply real-time decoding methods to each SWR event and then compute the posterior probability of the replay feature. We illustrate this approach by classifying SWR events from data recorded in the hippocampus of a rat performing a spatial memory task into four categories: whether they represent outbound or inbound trajectories and whether the activity is replayed forward or backward in time. We show that our algorithm can classify the majority of SWR events in a recording epoch within 20 ms of the replay onset with high certainty, which makes the algorithm suitable for a real-time implementation with short latencies to incorporate into content-based feedback experiments. PMID:27535369
Lessons Learned from Real-Time, Event-Based Internet Science Communications
NASA Technical Reports Server (NTRS)
Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)
2001-01-01
For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real-time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not, All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material exclusively. The worst event-based activity attracted more interest than the best written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and lessons learned will be discussed.
Bulgarian National Digital Seismological Network
NASA Astrophysics Data System (ADS)
Dimitrova, L.; Solakov, D.; Nikolova, S.; Stoyanov, S.; Simeonova, S.; Zimakov, L. G.; Khaikin, L.
2011-12-01
The Bulgarian National Digital Seismological Network (BNDSN) consists of a National Data Center (NDC), 13 stations equipped with RefTek High Resolution Broadband Seismic Recorders - model DAS 130-01/3, 1 station equipped with Quanterra 680 and broadband sensors and accelerometers. Real-time data transfer from seismic stations to NDC is realized via Virtual Private Network of the Bulgarian Telecommunication Company. The communication interruptions don't cause any data loss at the NDC. The data are backed up in the field station recorder's 4Mb RAM memory and are retransmitted to the NDC immediately after the communication link is re-established. The recorders are equipped with 2 compact flash disks able to save more than 1 month long data. The data from the flash disks can be downloaded remotely using FTP. The data acquisition and processing hardware redundancy at the NDC is achieved by two clustered SUN servers and two Blade Workstations. To secure the acquisition, processing and data storage processes a three layer local network is designed at the NDC. Real-time data acquisition is performed using REFTEK's full duplex error-correction protocol RTPD. Data from the Quanterra recorder and foreign stations are fed into RTPD in real-time via SeisComP/SeedLink protocol. Using SeisComP/SeedLink software the NDC transfers real-time data to INGV-Roma, NEIC-USA, ORFEUS Data Center. Regional real-time data exchange with Romania, Macedonia, Serbia and Greece is established at the NDC also. Data processing is performed by the Seismic Network Data Processor (SNDP) software package running on the both Servers. SNDP includes subsystems: Real-time subsystem (RTS_SNDP) - for signal detection; evaluation of the signal parameters; phase identification and association; source estimation; Seismic analysis subsystem (SAS_SNDP) - for interactive data processing; Early warning subsystem (EWS_SNDP) - based on the first arrived P-phases. The signal detection process is performed by traditional STA/LTA detection algorithm. The filter parameters of the detectors are defined on the base of previously evaluated ambient noise at the seismic stations. Some extra modules for network command/control, state-of-health network monitoring and data archiving are running as well in the National Data Center. Three types of archives are produced in the NDC - two continuous - miniSEED format and RefTek PASSCAL format; and one event oriented in CSS3.0 scheme format. Modern digital equipment and broad-band seismometers installed at Bulgarian seismic stations, careful selection of the software packages for automatic and interactive data processing in the data center proved to be suitable choice for the purposes of BNDSN and NDC: ? to ensure reliable automatic localization of the seismic events and rapid notification of the governmental authorities in case of felt earthquakes on the territory of Bulgaria; ? to provide a modern basis for seismological studies in Bulgaria.
Real-time computing platform for spiking neurons (RT-spike).
Ros, Eduardo; Ortigosa, Eva M; Agís, Rodrigo; Carrillo, Richard; Arnold, Michael
2006-07-01
A computing platform is described for simulating arbitrary networks of spiking neurons in real time. A hybrid computing scheme is adopted that uses both software and hardware components to manage the tradeoff between flexibility and computational power; the neuron model is implemented in hardware and the network model and the learning are implemented in software. The incremental transition of the software components into hardware is supported. We focus on a spike response model (SRM) for a neuron where the synapses are modeled as input-driven conductances. The temporal dynamics of the synaptic integration process are modeled with a synaptic time constant that results in a gradual injection of charge. This type of model is computationally expensive and is not easily amenable to existing software-based event-driven approaches. As an alternative we have designed an efficient time-based computing architecture in hardware, where the different stages of the neuron model are processed in parallel. Further improvements occur by computing multiple neurons in parallel using multiple processing units. This design is tested using reconfigurable hardware and its scalability and performance evaluated. Our overall goal is to investigate biologically realistic models for the real-time control of robots operating within closed action-perception loops, and so we evaluate the performance of the system on simulating a model of the cerebellum where the emulation of the temporal dynamics of the synaptic integration process is important.
NASA Technical Reports Server (NTRS)
Wu, Huan; Adler, Robert F.; Tian, Yudong; Huffman, George J.; Li, Hongyi; Wang, JianJian
2014-01-01
A widely used land surface model, the Variable Infiltration Capacity (VIC) model, is coupled with a newly developed hierarchical dominant river tracing-based runoff-routing model to form the Dominant river tracing-Routing Integrated with VIC Environment (DRIVE) model, which serves as the new core of the real-time Global Flood Monitoring System (GFMS). The GFMS uses real-time satellite-based precipitation to derive flood monitoring parameters for the latitude band 50 deg. N - 50 deg. S at relatively high spatial (approximately 12 km) and temporal (3 hourly) resolution. Examples of model results for recent flood events are computed using the real-time GFMS (http://flood.umd.edu). To evaluate the accuracy of the new GFMS, the DRIVE model is run retrospectively for 15 years using both research-quality and real-time satellite precipitation products. Evaluation results are slightly better for the research-quality input and significantly better for longer duration events (3 day events versus 1 day events). Basins with fewer dams tend to provide lower false alarm ratios. For events longer than three days in areas with few dams, the probability of detection is approximately 0.9 and the false alarm ratio is approximately 0.6. In general, these statistical results are better than those of the previous system. Streamflow was evaluated at 1121 river gauges across the quasi-global domain. Validation using real-time precipitation across the tropics (30 deg. S - 30 deg. N) gives positive daily Nash-Sutcliffe Coefficients for 107 out of 375 (28%) stations with a mean of 0.19 and 51% of the same gauges at monthly scale with a mean of 0.33. There were poorer results in higher latitudes, probably due to larger errors in the satellite precipitation input.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Huan; Adler, Robert F.; Tian, Yudong
2014-03-01
A widely used land surface model, the Variable Infiltration Capacity (VIC) model, is coupled with a newly developed hierarchical dominant river tracing-based runoff-routing model to form the Dominant river tracing-Routing Integrated with VIC Environment (DRIVE) model, which serves as the new core of the real-time Global Flood Monitoring System (GFMS). The GFMS uses real-time satellite-based precipitation to derive flood monitoring parameters for the latitude band 50°N–50°S at relatively high spatial (~12 km) and temporal (3 hourly) resolution. Examples of model results for recent flood events are computed using the real-time GFMS (http://flood.umd.edu). To evaluate the accuracy of the new GFMS,more » the DRIVE model is run retrospectively for 15 years using both research-quality and real-time satellite precipitation products. Evaluation results are slightly better for the research-quality input and significantly better for longer duration events (3 day events versus 1 day events). Basins with fewer dams tend to provide lower false alarm ratios. For events longer than three days in areas with few dams, the probability of detection is ~0.9 and the false alarm ratio is ~0.6. In general, these statistical results are better than those of the previous system. Streamflow was evaluated at 1121 river gauges across the quasi-global domain. Validation using real-time precipitation across the tropics (30°S–30°N) gives positive daily Nash-Sutcliffe Coefficients for 107 out of 375 (28%) stations with a mean of 0.19 and 51% of the same gauges at monthly scale with a mean of 0.33. Finally, there were poorer results in higher latitudes, probably due to larger errors in the satellite precipitation input.« less
Mobile technologies for disease surveillance in humans and animals.
Mwabukusi, Mpoki; Karimuribo, Esron D; Rweyemamu, Mark M; Beda, Eric
2014-04-23
A paper-based disease reporting system has been associated with a number of challenges. These include difficulties to submit hard copies of the disease surveillance forms because of poor road infrastructure, weather conditions or challenging terrain, particularly in the developing countries. The system demands re-entry of the data at data processing and analysis points, thus making it prone to introduction of errors during this process. All these challenges contribute to delayed acquisition, processing and response to disease events occurring in remote hard to reach areas. Our study piloted the use of mobile phones in order to transmit near to real-time data from remote districts in Tanzania (Ngorongoro and Ngara), Burundi (Muyinga) and Zambia (Kazungula and Sesheke). Two technologies namely, digital and short messaging services were used to capture and transmit disease event data in the animal and human health sectors in the study areas based on a server-client model. Smart phones running the Android operating system (minimum required version: Android 1.6), and which supported open source application, Epicollect, as well as the Open Data Kit application, were used in the study. These phones allowed collection of geo-tagged data, with the opportunity of including static and moving images related to disease events. The project supported routine disease surveillance systems in the ministries responsible for animal and human health in Burundi, Tanzania and Zambia, as well as data collection for researchers at the Sokoine University of Agriculture, Tanzania. During the project implementation period between 2011 and 2013, a total number of 1651 diseases event-related forms were submitted, which allowed reporters to include GPS coordinates and photographs related to the events captured. It was concluded that the new technology-based surveillance system is useful in providing near to real-time data, with potential for enhancing timely response in rural remote areas of Africa. We recommended adoption of the proven technologies to improve disease surveillance, particularly in the developing countries.
Signal quality and Bayesian signal processing in neurofeedback based on real-time fMRI.
Koush, Yury; Zvyagintsev, Mikhail; Dyck, Miriam; Mathiak, Krystyna A; Mathiak, Klaus
2012-01-02
Real-time fMRI allows analysis and visualization of the brain activity online, i.e. within one repetition time. It can be used in neurofeedback applications where subjects attempt to control an activation level in a specified region of interest (ROI) of their brain. The signal derived from the ROI is contaminated with noise and artifacts, namely with physiological noise from breathing and heart beat, scanner drift, motion-related artifacts and measurement noise. We developed a Bayesian approach to reduce noise and to remove artifacts in real-time using a modified Kalman filter. The system performs several signal processing operations: subtraction of constant and low-frequency signal components, spike removal and signal smoothing. Quantitative feedback signal quality analysis was used to estimate the quality of the neurofeedback time series and performance of the applied signal processing on different ROIs. The signal-to-noise ratio (SNR) across the entire time series and the group event-related SNR (eSNR) were significantly higher for the processed time series in comparison to the raw data. Applied signal processing improved the t-statistic increasing the significance of blood oxygen level-dependent (BOLD) signal changes. Accordingly, the contrast-to-noise ratio (CNR) of the feedback time series was improved as well. In addition, the data revealed increase of localized self-control across feedback sessions. The new signal processing approach provided reliable neurofeedback, performed precise artifacts removal, reduced noise, and required minimal manual adjustments of parameters. Advanced and fast online signal processing algorithms considerably increased the quality as well as the information content of the control signal which in turn resulted in higher contingency in the neurofeedback loop. Copyright © 2011 Elsevier Inc. All rights reserved.
Augmented reality enabling intelligence exploitation at the edge
NASA Astrophysics Data System (ADS)
Kase, Sue E.; Roy, Heather; Bowman, Elizabeth K.; Patton, Debra
2015-05-01
Today's Warfighters need to make quick decisions while interacting in densely populated environments comprised of friendly, hostile, and neutral host nation locals. However, there is a gap in the real-time processing of big data streams for edge intelligence. We introduce a big data processing pipeline called ARTEA that ingests, monitors, and performs a variety of analytics including noise reduction, pattern identification, and trend and event detection in the context of an area of operations (AOR). Results of the analytics are presented to the Soldier via an augmented reality (AR) device Google Glass (Glass). Non-intrusive AR devices such as Glass can visually communicate contextually relevant alerts to the Soldier based on the current mission objectives, time, location, and observed or sensed activities. This real-time processing and AR presentation approach to knowledge discovery flattens the intelligence hierarchy enabling the edge Soldier to act as a vital and active participant in the analysis process. We report preliminary observations testing ARTEA and Glass in a document exploitation and person of interest scenario simulating edge Soldier participation in the intelligence process in disconnected deployment conditions.
NASA Technical Reports Server (NTRS)
Park, Han G. (Inventor); Zak, Michail (Inventor); James, Mark L. (Inventor); Mackey, Ryan M. E. (Inventor)
2003-01-01
A general method of anomaly detection from time-correlated sensor data is disclosed. Multiple time-correlated signals are received. Their cross-signal behavior is compared against a fixed library of invariants. The library is constructed during a training process, which is itself data-driven using the same time-correlated signals. The method is applicable to a broad class of problems and is designed to respond to any departure from normal operation, including faults or events that lie outside the training envelope.
Real time software tools and methodologies
NASA Technical Reports Server (NTRS)
Christofferson, M. J.
1981-01-01
Real time systems are characterized by high speed processing and throughput as well as asynchronous event processing requirements. These requirements give rise to particular implementations of parallel or pipeline multitasking structures, of intertask or interprocess communications mechanisms, and finally of message (buffer) routing or switching mechanisms. These mechanisms or structures, along with the data structue, describe the essential character of the system. These common structural elements and mechanisms are identified, their implementation in the form of routines, tasks or macros - in other words, tools are formalized. The tools developed support or make available the following: reentrant task creation, generalized message routing techniques, generalized task structures/task families, standardized intertask communications mechanisms, and pipeline and parallel processing architectures in a multitasking environment. Tools development raise some interesting prospects in the areas of software instrumentation and software portability. These issues are discussed following the description of the tools themselves.
EEGLAB, SIFT, NFT, BCILAB, and ERICA: New Tools for Advanced EEG Processing
Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott
2011-01-01
We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments. PMID:21687590
The LUX experiment - trigger and data acquisition systems
NASA Astrophysics Data System (ADS)
Druszkiewicz, Eryk
2013-04-01
The Large Underground Xenon (LUX) detector is a two-phase xenon time projection chamber designed to detect interactions of dark matter particles with the xenon nuclei. Signals from the detector PMTs are processed by custom-built analog electronics which provide properly shaped signals for the trigger and data acquisition (DAQ) systems. During calibrations, both systems must be able to handle high rates and have large dynamic ranges; during dark matter searches, maximum sensitivity requires low thresholds. The trigger system uses eight-channel 64-MHz digitizers (DDC-8) connected to a Trigger Builder (TB). The FPGA cores on the digitizers perform real-time pulse identification (discriminating between S1 and S2-like signals) and event localization. The TB uses hit patterns, hit maps, and maximum response detection to make trigger decisions, which are reached within few microseconds after the occurrence of an event of interest. The DAQ system is comprised of commercial digitizers with customized firmware. Its real-time baseline suppression allows for a maximum event acquisition rate in excess of 1.5 kHz, which results in virtually no deadtime. The performance of the trigger and DAQ systems during the commissioning runs of LUX will be discussed.
Real-time Interplanetary Shock Prediciton System
NASA Astrophysics Data System (ADS)
Vandegriff, J.; Ho, G.; Plauger, J.
A system is being developed to predict the arrival times and maximum intensities of energetic storm particle (ESP) events at the earth. Measurements of particle flux values at L1 being made by the Electron, Proton, and Alpha Monitor (EPAM) instrument aboard NASA's ACE spacecraft are made available in real-time by the NOAA Space Environment Center as 5 minute averages of several proton and electron energy channels. Past EPAM flux measurements can be used to train forecasting algorithms which then run on the real-time data. Up to 3 days before the arrival of the interplanetary shock associated with an ESP event, characteristic changes in the particle intensities (such as decreased spectral slope and increased overall flux level) are easily discernable. Once the onset of an event is detected, a neural net is used to forecast the arrival time and flux level for the event. We present results obtained with this technique for forecasting the largest of the ESP events detected by EPAM. Forecasting information will be made publicly available through http://sd-www.jhuapl.edu/ACE/EPAM/, the Johns Hopkins University Applied Physics Lab web site for the ACE/EPAM instrument.
A software framework for real-time multi-modal detection of microsleeps.
Knopp, Simon J; Bones, Philip J; Weddell, Stephen J; Jones, Richard D
2017-09-01
A software framework is described which was designed to process EEG, video of one eye, and head movement in real time, towards achieving early detection of microsleeps for prevention of fatal accidents, particularly in transport sectors. The framework is based around a pipeline structure with user-replaceable signal processing modules. This structure can encapsulate a wide variety of feature extraction and classification techniques and can be applied to detecting a variety of aspects of cognitive state. Users of the framework can implement signal processing plugins in C++ or Python. The framework also provides a graphical user interface and the ability to save and load data to and from arbitrary file formats. Two small studies are reported which demonstrate the capabilities of the framework in typical applications: monitoring eye closure and detecting simulated microsleeps. While specifically designed for microsleep detection/prediction, the software framework can be just as appropriately applied to (i) other measures of cognitive state and (ii) development of biomedical instruments for multi-modal real-time physiological monitoring and event detection in intensive care, anaesthesiology, cardiology, neurosurgery, etc. The software framework has been made freely available for researchers to use and modify under an open source licence.
Optimizing Tsunami Forecast Model Accuracy
NASA Astrophysics Data System (ADS)
Whitmore, P.; Nyland, D. L.; Huang, P. Y.
2015-12-01
Recent tsunamis provide a means to determine the accuracy that can be expected of real-time tsunami forecast models. Forecast accuracy using two different tsunami forecast models are compared for seven events since 2006 based on both real-time application and optimized, after-the-fact "forecasts". Lessons learned by comparing the forecast accuracy determined during an event to modified applications of the models after-the-fact provide improved methods for real-time forecasting for future events. Variables such as source definition, data assimilation, and model scaling factors are examined to optimize forecast accuracy. Forecast accuracy is also compared for direct forward modeling based on earthquake source parameters versus accuracy obtained by assimilating sea level data into the forecast model. Results show that including assimilated sea level data into the models increases accuracy by approximately 15% for the events examined.
Microseismic Velocity Imaging of the Fracturing Zone
NASA Astrophysics Data System (ADS)
Zhang, H.; Chen, Y.
2015-12-01
Hydraulic fracturing of low permeability reservoirs can induce microseismic events during fracture development. For this reason, microseismic monitoring using sensors on surface or in borehole have been widely used to delineate fracture spatial distribution and to understand fracturing mechanisms. It is often the case that the stimulated reservoir volume (SRV) is determined solely based on microseismic locations. However, it is known that for some fracture development stage, long period long duration events, instead of microseismic events may be associated. In addition, because microseismic events are essentially weak and there exist different sources of noise during monitoring, some microseismic events could not be detected and thus located. Therefore the estimation of the SRV is biased if it is solely determined by microseismic locations. With the existence of fluids and fractures, the seismic velocity of reservoir layers will be decreased. Based on this fact, we have developed a near real time seismic velocity tomography method to characterize velocity changes associated with fracturing process. The method is based on double-difference seismic tomography algorithm to image the fracturing zone where microseismic events occur by using differential arrival times from microseismic event pairs. To take into account varying data distribution for different fracking stages, the method solves the velocity model in the wavelet domain so that different scales of model features can be obtained according to different data distribution. We have applied this real time tomography method to both acoustic emission data from lab experiment and microseismic data from a downhole microseismic monitoring project for shale gas hydraulic fracturing treatment. The tomography results from lab data clearly show the velocity changes associated with different rock fracturing stages. For the field data application, it shows that microseismic events are located in low velocity anomalies. By combining low velocity anomalies with microseismic events, we should better estimate the SRV.
Evolution of damage during deformation in porous granular materials (Louis Néel Medal Lecture)
NASA Astrophysics Data System (ADS)
Main, Ian
2014-05-01
'Crackling noise' occurs in a wide variety of systems that respond to external forcing in an intermittent way, leading to sudden bursts of energy release similar to those heard when crunching up a piece of paper or listening to a fire. In mineral magnetism ('Barkhausen') crackling noise occurs due to sudden changes in the size and orientation of microscopic ferromagnetic domains when the external magnetic field is changed. In rock physics sudden changes in internal stress associated with microscopically brittle failure events lead to acoustic emissions that can be recorded on the sample boundary, and used to infer the state of internal damage. Crackling noise is inherently stochastic, but the population of events often exhibits remarkably robust scaling properties, in terms of the source area, duration, energy, and in the waiting time between events. Here I describe how these scaling properties emerge and evolve spontaneously in a fully-dynamic discrete element model of sedimentary rocks subject to uniaxial compression at a constant strain rate. The discrete elements have structural disorder similar to that of a real rock, and this is the only source of heterogeneity. Despite the stationary loading and the lack of any time-dependent weakening processes, the results are all characterized by emergent power law distributions over a broad range of scales, in agreement with experimental observation. As deformation evolves, the scaling exponents change systematically in a way that is similar to the evolution of damage in experiments on real sedimentary rocks. The potential for real-time failure forecasting is examined by using synthetic and real data from laboratory tests and prior to volcanic eruptions. The combination of non-linearity and an irreducible stochastic component leads to significant variations in the precision and accuracy of the forecast failure time, leading to a significant proportion of 'false alarms' (forecast too early) and 'missed events' (forecast too late), as well as an over-optimistic assessments of forecasting power and quality when the failure time is known (the 'benefit of hindsight'). The evolution becomes progressively more complex, and the forecasting power diminishes, in going from ideal synthetics to controlled laboratory tests to open natural systems at larger scales in space and time.
Enabling Near Real-Time Remote Search for Fast Transient Events with Lossy Data Compression
NASA Astrophysics Data System (ADS)
Vohl, Dany; Pritchard, Tyler; Andreoni, Igor; Cooke, Jeffrey; Meade, Bernard
2017-09-01
We present a systematic evaluation of JPEG2000 (ISO/IEC 15444) as a transport data format to enable rapid remote searches for fast transient events as part of the Deeper Wider Faster programme. Deeper Wider Faster programme uses 20 telescopes from radio to gamma rays to perform simultaneous and rapid-response follow-up searches for fast transient events on millisecond-to-hours timescales. Deeper Wider Faster programme search demands have a set of constraints that is becoming common amongst large collaborations. Here, we focus on the rapid optical data component of Deeper Wider Faster programme led by the Dark Energy Camera at Cerro Tololo Inter-American Observatory. Each Dark Energy Camera image has 70 total coupled-charged devices saved as a 1.2 gigabyte FITS file. Near real-time data processing and fast transient candidate identifications-in minutes for rapid follow-up triggers on other telescopes-requires computational power exceeding what is currently available on-site at Cerro Tololo Inter-American Observatory. In this context, data files need to be transmitted rapidly to a foreign location for supercomputing post-processing, source finding, visualisation and analysis. This step in the search process poses a major bottleneck, and reducing the data size helps accommodate faster data transmission. To maximise our gain in transfer time and still achieve our science goals, we opt for lossy data compression-keeping in mind that raw data is archived and can be evaluated at a later time. We evaluate how lossy JPEG2000 compression affects the process of finding transients, and find only a negligible effect for compression ratios up to 25:1. We also find a linear relation between compression ratio and the mean estimated data transmission speed-up factor. Adding highly customised compression and decompression steps to the science pipeline considerably reduces the transmission time-validating its introduction to the Deeper Wider Faster programme science pipeline and enabling science that was otherwise too difficult with current technology.
NASA Astrophysics Data System (ADS)
Tang, Xiaojing
Fast and accurate monitoring of tropical forest disturbance is essential for understanding current patterns of deforestation as well as helping eliminate illegal logging. This dissertation explores the use of data from different satellites for near real-time monitoring of forest disturbance in tropical forests, including: development of new monitoring methods; development of new assessment methods; and assessment of the performance and operational readiness of existing methods. Current methods for accuracy assessment of remote sensing products do not address the priority of near real-time monitoring of detecting disturbance events as early as possible. I introduce a new assessment framework for near real-time products that focuses on the timing and the minimum detectable size of disturbance events. The new framework reveals the relationship between change detection accuracy and the time needed to identify events. In regions that are frequently cloudy, near real-time monitoring using data from a single sensor is difficult. This study extends the work by Xin et al. (2013) and develops a new time series method (Fusion2) based on fusion of Landsat and MODIS (Moderate Resolution Imaging Spectroradiometer) data. Results of three test sites in the Amazon Basin show that Fusion2 can detect 44.4% of the forest disturbance within 13 clear observations (82 days) after the initial disturbance. The smallest event detected by Fusion2 is 6.5 ha. Also, Fusion2 detects disturbance faster and has less commission error than more conventional methods. In a comparison of coarse resolution sensors, MODIS Terra and Aqua combined provides faster and more accurate detection of disturbance events than VIIRS (Visible Infrared Imaging Radiometer Suite) and MODIS single sensor data. The performance of near real-time monitoring using VIIRS is slightly worse than MODIS Terra but significantly better than MODIS Aqua. New monitoring methods developed in this dissertation provide forest protection organizations the capacity to monitor illegal logging events promptly. In the future, combining two Landsat and two Sentinel-2 satellites will provide global coverage at 30 m resolution every 4 days, and routine monitoring may be possible at high resolution. The methods and assessment framework developed in this dissertation are adaptable to newly available datasets.
Monte Carlo event generators in atomic collisions: A new tool to tackle the few-body dynamics
NASA Astrophysics Data System (ADS)
Ciappina, M. F.; Kirchner, T.; Schulz, M.
2010-04-01
We present a set of routines to produce theoretical event files, for both single and double ionization of atoms by ion impact, based on a Monte Carlo event generator (MCEG) scheme. Such event files are the theoretical counterpart of the data obtained from a kinematically complete experiment; i.e. they contain the momentum components of all collision fragments for a large number of ionization events. Among the advantages of working with theoretical event files is the possibility to incorporate the conditions present in a real experiment, such as the uncertainties in the measured quantities. Additionally, by manipulating them it is possible to generate any type of cross sections, specially those that are usually too complicated to compute with conventional methods due to a lack of symmetry. Consequently, the numerical effort of such calculations is dramatically reduced. We show examples for both single and double ionization, with special emphasis on a new data analysis tool, called four-body Dalitz plots, developed very recently. Program summaryProgram title: MCEG Catalogue identifier: AEFV_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEFV_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 2695 No. of bytes in distributed program, including test data, etc.: 18 501 Distribution format: tar.gz Programming language: FORTRAN 77 with parallelization directives using scripting Computer: Single machines using Linux and Linux servers/clusters (with cores with any clock speed, cache memory and bits in a word) Operating system: Linux (any version and flavor) and FORTRAN 77 compilers Has the code been vectorised or parallelized?: Yes RAM: 64-128 kBytes (the codes are very cpu intensive) Classification: 2.6 Nature of problem: The code deals with single and double ionization of atoms by ion impact. Conventional theoretical approaches aim at a direct calculation of the corresponding cross sections. This has the important shortcoming that it is difficult to account for the experimental conditions when comparing results to measured data. In contrast, the present code generates theoretical event files of the same type as are obtained in a real experiment. From these event files any type of cross sections can be easily extracted. The theoretical schemes are based on distorted wave formalisms for both processes of interest. Solution method: The codes employ a Monte Carlo Event Generator based on theoretical formalisms to generate event files for both single and double ionization. One of the main advantages of having access to theoretical event files is the possibility of adding the conditions present in real experiments (parameter uncertainties, environmental conditions, etc.) and to incorporate additional physics in the resulting event files (e.g. elastic scattering or other interactions absent in the underlying calculations). Additional comments: The computational time can be dramatically reduced if a large number of processors is used. Since the codes has no communication between processes it is possible to achieve an efficiency of a 100% (this number certainly will be penalized by the queuing waiting time). Running time: Times vary according to the process, single or double ionization, to be simulated, the number of processors and the type of theoretical model. The typical running time is between several hours and up to a few weeks.
NASA Astrophysics Data System (ADS)
Greuter, U.; Buehler, C.; Rasmussen, P.; Emmenegger, M.; Maden, D.; Koennecke, M.; Schlumpf, N.
We present the basic concept and the realization of our fully configurable data-acquisition hardware for the neutron scattering instruments at SINQ. This system allows collection of the different data entities and event-related signals generated by the various detector units. It offers a variety of synchronization options, including a time-measuring mode for time-of-flight determinations. Based on configurable logic (FPGA, CPLD), event rates up to the MHz range can be processed and transmitted to a programmable online data-reduction system (Histogram Memory). It is implemented on a commercially available VME Power PC module running a real-time operating system (VxWorks).
The ability to forecast local and regional air pollution events is challenging since the processes governing the production and sustenance of atmospheric pollutants are complex and often non-linear. Comprehensive atmospheric models, by representing in as much detail as possible t...
V-FASTR: THE VLBA FAST RADIO TRANSIENTS EXPERIMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wayth, Randall B.; Tingay, Steven J.; Brisken, Walter F.
2011-07-10
Recent discoveries of dispersed, non-periodic impulsive radio signals with single-dish radio telescopes have sparked significant interest in exploring the relatively uncharted space of fast transient radio signals. Here we describe V-FASTR, an experiment to perform a blind search for fast transient radio signals using the Very Long Baseline Array (VLBA). The experiment runs entirely in a commensal mode, alongside normal VLBA observations and operations. It is made possible by the features and flexibility of the DiFX software correlator that is used to process VLBA data. Using the VLBA for this type of experiment offers significant advantages over single-dish experiments, includingmore » a larger field of view, the ability to easily distinguish local radio-frequency interference from real signals, and the possibility to localize detected events on the sky to milliarcsecond accuracy. We describe our software pipeline, which accepts short integration ({approx} ms) spectrometer data from each antenna in real time during correlation and performs an incoherent dedispersion separately for each antenna, over a range of trial dispersion measures. The dedispersed data are processed by a sophisticated detector and candidate events are recorded. At the end of the correlation, small snippets of the raw data at the time of the events are stored for further analysis. We present the results of our event detection pipeline from some test observations of the pulsars B0329+54 and B0531+21 (the Crab pulsar).« less
Tsui, Fu-Chiang; Espino, Jeremy U.; Weng, Yan; Choudary, Arvinder; Su, Hoah-Der; Wagner, Michael M.
2005-01-01
The National Retail Data Monitor (NRDM) has monitored over-the-counter (OTC) medication sales in the United States since December 2002. The NRDM collects data from over 18,600 retail stores and processes over 0.6 million sales records per day. This paper describes key architectural features that we have found necessary for a data utility component in a national biosurveillance system. These elements include event-driven architecture to provide analyses of data in near real time, multiple levels of caching to improve query response time, high availability through the use of clustered servers, scalable data storage through the use of storage area networks and a web-service function for interoperation with affiliated systems. The methods and architectural principles are relevant to the design of any production data utility for public health surveillance—systems that collect data from multiple sources in near real time for use by analytic programs and user interfaces that have substantial requirements for time-series data aggregated in multiple dimensions. PMID:16779138
The operational platform XTREM for rainfall measurement and monitoring
NASA Astrophysics Data System (ADS)
Mioche, G.; Van Baëlen, J.; Buisson, E.
2012-04-01
Nowadays in the risk management field, new tools to anticipate extremes meteorological events are in development. Over the last 20 years, the occurrence of such types of events has increased and today they represent a serious threat for human activities and health. In particular, local and intense precipitation events cause significant damages on private and public materials and properties and even loss of lives, especially in vulnerable areas such as urban or mountain environments. The XTREM platform (X-band radar and operational plaTform for high REsolution precipitation Monitoring and forecasting) is an operating system designed to monitor, quantify and even forecast rain events with high time and space resolutions. This is also a useful tool for decision support in the environmental risk management domain. The main instrument of XTREM is an X band radar which is able to measure precipitations with high spatial and temporal resolutions (100 m, 1 minute) on local areas, in real time and continuously, in addition to the existing meteorological radars network. This radar is particularly well adapted in urban areas or in complex orography regions (such as mountains). In this communication, the data processing of X band radar data will be first described, then the XTREM platform products will be presented. Concerning the data processing, the first step is to estimate the attenuation due to the hydrometeors. Then the conversion of reflectivity in rain rate R is made with specific Z-R relationships to provide accurate estimates. Thanks to a system of alerts with customizable thresholds, the real time mode will generates useful information to users to anticipate risks linked to strong rainfall, such as an estimation of the rain height and cumulative rain on defined areas. XTREM is also able to integrate a rain gauge network. The user gets the opportunity to compare in real time radar retrievals with rain gauge data, which allows assessing radar retrievals accuracy. XTREM includes also nowcasting/forecasting products, derived from various methods (extrapolation technique, blending with numerical modelling). Furthermore, an analysis mode is available to study in details a specific event. In this mode, more scientific tools are available (various attenuation calculation methods or various Z-R relationships) in order to carry detailed investigation on particular events observed. Finally, the case study of a local and strong precipitation event which took place in Clermont-Ferrand will be presented, showing the products and impact provided by XTREM.
Real-Time Joint Streaming Data Processing from Social and Physical Sensors
NASA Astrophysics Data System (ADS)
Kropivnitskaya, Y. Y.; Qin, J.; Tiampo, K. F.; Bauer, M.
2014-12-01
The results of the technological breakthroughs in computing that have taken place over the last few decades makes it possible to achieve emergency management objectives that focus on saving human lives and decreasing economic effects. In particular, the integration of a wide variety of information sources, including observations from spatially-referenced physical sensors and new social media sources, enables better real-time seismic hazard analysis through distributed computing networks. The main goal of this work is to utilize innovative computational algorithms for better real-time seismic risk analysis by integrating different data sources and processing tools into streaming and cloud computing applications. The Geological Survey of Canada operates the Canadian National Seismograph Network (CNSN) with over 100 high-gain instruments and 60 low-gain or strong motion seismographs. The processing of the continuous data streams from each station of the CNSN provides the opportunity to detect possible earthquakes in near real-time. The information from physical sources is combined to calculate a location and magnitude for an earthquake. The automatically calculated results are not always sufficiently precise and prompt that can significantly reduce the response time to a felt or damaging earthquake. Social sensors, here represented as Twitter users, can provide information earlier to the general public and more rapidly to the emergency planning and disaster relief agencies. We introduce joint streaming data processing from social and physical sensors in real-time based on the idea that social media observations serve as proxies for physical sensors. By using the streams of data in the form of Twitter messages, each of which has an associated time and location, we can extract information related to a target event and perform enhanced analysis by combining it with physical sensor data. Results of this work suggest that the use of data from social media, in conjunction with the development of innovative computing algorithms, when combined with sensor data can provide a new paradigm for real-time earthquake detection in order to facilitate rapid and inexpensive natural risk reduction.
Resolving Peak Ground Displacements in Real-Time GNSS PPP Solutions
NASA Astrophysics Data System (ADS)
Hodgkinson, K. M.; Mencin, D.; Mattioli, G. S.; Sievers, C.; Fox, O.
2017-12-01
The goal of early earthquake warning (EEW) systems is to provide warning of impending ground shaking to the public, infrastructure managers, and emergency responders. Shaking intensity can be estimated using Ground Motion Prediction Equations (GMPEs), but only if site characteristics, hypocentral distance and event magnitude are known. In recent years work has been done analyzing the first few seconds of the seismic P wave to derive event location and magnitude. While initial rupture locations seem to be sufficiently constrained, it has been shown that P-wave magnitude estimates tend to saturate at M>7. Regions where major and great earthquakes occur may therefore be vulnerable to an underestimation of shaking intensity if only P waves magnitudes are used. Crowell et al., (2013) first demonstrated that Peak Ground Displacement (PGD) from long-period surface waves recorded by GNSS receivers could provide a source-scaling relation that does not saturate with event magnitude. GNSS PGD derived magnitudes could improve the accuracy of EEW GMPE calculations. If such a source-scaling method were to be implemented in EEW algorithms it is critical that the noise levels in real-time GNSS processed time-series are low enough to resolve long-period surface waves. UNAVCO currently operates 770 real-time GNSS sites, most of which are located along the North American-Pacific Plate Boundary. In this study, we present an analysis of noise levels observed in the GNSS Precise Point Positioning (PPP) solutions generated and distributed in real-time by UNAVCO for periods from seconds to hours. The analysis is performed using the 770 sites in the real-time network and data collected through July 2017. We compare noise levels determined from various monument types and receiver-antenna configurations. This analysis gives a robust estimation of noise levels in PPP solutions because the solutions analyzed are those that were generated in real-time and thus contain all the problems observed in routine network operations e.g., data outages, high latencies and data from research-quality to less ideal monumentation. Using these noise estimates we can identify which sites are best able to resolve the PGDs for earthquakes over a range of focal distances and those that may not using their current configurations.
Can Real-Time Data Also Be Climate Quality?
NASA Astrophysics Data System (ADS)
Brewer, M.; Wentz, F. J.
2015-12-01
GMI, AMSR-2 and WindSat herald a new era of highly accurate and timely microwave data products. Traditionally, there has been a large divide between real-time and re-analysis data products. What if these completely separate processing systems could be merged? Through advanced modeling and physically based algorithms, Remote Sensing Systems (RSS) has narrowed the gap between real-time and research-quality. Satellite microwave ocean products have proven useful for a wide array of timely Earth science applications. Through cloud SST capabilities have enormously benefited tropical cyclone forecasting and day to day fisheries management, to name a few. Oceanic wind vectors enhance operational safety of shipping and recreational boating. Atmospheric rivers are of import to many human endeavors, as are cloud cover and knowledge of precipitation events. Some activities benefit from both climate and real-time operational data used in conjunction. RSS has been consistently improving microwave Earth Science Data Records (ESDRs) for several decades, while making near real-time data publicly available for semi-operational use. These data streams have often been produced in 2 stages: near real-time, followed by research quality final files. Over the years, we have seen this time delay shrink from months or weeks to mere hours. As well, we have seen the quality of near real-time data improve to the point where the distinction starts to blur. We continue to work towards better and faster RFI filtering, adaptive algorithms and improved real-time validation statistics for earlier detection of problems. Can it be possible to produce climate quality data in real-time, and what would the advantages be? We will try to answer these questions…
Liu, Feiyan; Wang, Zhen; Wang, Wenli; Luo, Jian-Guang; Kong, Lingyi
2018-06-19
γ-Glutamyltranspeptidase (GGT) plays critical roles in regulating various physiological/pathophysiological processes including the intracellular redox homeostasis. However, an effective fluorescent probe for dissecting the relationships between GGT and oxidative stress in vivo remains largely unexplored. Herein, we present a light-up fluorescent probe (DCDHF-Glu) with long wavelength emission (613 nm) for the highly sensitive and selective detection of GGT using dicyanomethylenedihydrofuran derivative as the fluorescent reporter and γ-glutamyl group as the enzyme-active trigger. DCDHF-Glu is competent to real-time image endogenous GGT in live cells and mice. In particular, DCDHF-Glu enables the direct real-time visualization of the upregulation of GGT under drug-induced oxidative stress in the HepG2 cells and the LO2 cells, as well as in vivo, vividly implying its excellent capacity in elucidation of GGT function in GGT-related biological events.
Modeling and Simulation of Metallurgical Process Based on Hybrid Petri Net
NASA Astrophysics Data System (ADS)
Ren, Yujuan; Bao, Hong
2016-11-01
In order to achieve the goals of energy saving and emission reduction of iron and steel enterprises, an increasing number of modeling and simulation technologies are used to research and analyse metallurgical production process. In this paper, the basic principle of Hybrid Petri net is used to model and analyse the Metallurgical Process. Firstly, the definition of Hybrid Petri Net System of Metallurgical Process (MPHPNS) and its modeling theory are proposed. Secondly, the model of MPHPNS based on material flow is constructed. The dynamic flow of materials and the real-time change of each technological state in metallurgical process are simulated vividly by using this model. The simulation process can implement interaction between the continuous event dynamic system and the discrete event dynamic system at the same level, and play a positive role in the production decision.
NASA Astrophysics Data System (ADS)
Michnovicz, Michael R.
1997-06-01
A real-time executive has been implemented to control a high altitude pointing and tracking experiment. The track and mode controller (TMC) implements a table driven design, in which the track mode logic for a tracking mission is defined within a state transition diagram (STD). THe STD is implemented as a state transition table in the TMC software. Status Events trigger the state transitions in the STD. Each state, as it is entered, causes a number of processes to be activated within the system. As these processes propagate through the system, the status of key processes are monitored by the TMC, allowing further transitions within the STD. This architecture is implemented in real-time, using the vxWorks operating system. VxWorks message queues allow communication of status events from the Event Monitor task to the STD task. Process commands are propagated to the rest of the system processors by means of the SCRAMNet shared memory network. The system mode logic contained in the STD will autonomously sequence in acquisition, tracking and pointing system through an entire engagement sequence, starting with target detection and ending with aimpoint maintenance. Simulation results and lab test results will be presented to verify the mode controller. In addition to implementing the system mode logic with the STD, the TMC can process prerecorded time sequences of commands required during startup operations. It can also process single commands from the system operator. In this paper, the author presents (1) an overview, in which he describes the TMC architecture, the relationship of an end-to-end simulation to the flight software and the laboratory testing environment, (2) implementation details, including information on the vxWorks message queues and the SCRAMNet shared memory network, (3) simulation results and lab test results which verify the mode controller, and (4) plans for the future, specifically as to how this executive will expedite transition to a fully functional system.
NASA Astrophysics Data System (ADS)
Kun, C.
2015-12-01
Studies have shown that estimates of ground motion parameter from ground motion attenuation relationship often greater than the observed value, mainly because multiple ruptures of the big earthquake reduce the source pulse height of source time function. In the absence of real-time data of the station after the earthquake, this paper attempts to make some constraints from the source, to improve the accuracy of shakemaps. Causative fault of Yushu Ms 7.1 earthquake is vertical approximately (dip 83 °), and source process in time and space was dispersive distinctly. Main shock of Yushu Ms7.1 earthquake can be divided into several sub-events based on source process of this earthquake. Magnitude of each sub-events depended on each area under the curve of source pulse of source time function, and location derived from source process of each sub-event. We use ShakeMap method with considering the site effect to generate shakeMap for each sub-event, respectively. Finally, ShakeMaps of mainshock can be aquired from superposition of shakemaps for all the sub-events in space. Shakemaps based on surface rupture of causative Fault from field survey can also be derived for mainshock with only one magnitude. We compare ShakeMaps of both the above methods with Intensity of investigation. Comparisons show that decomposition method of main shock more accurately reflect the shake of earthquake in near-field, but for far field the shake is controlled by the weakening influence of the source, the estimated Ⅵ area was smaller than the intensity of the actual investigation. Perhaps seismic intensity in far-field may be related to the increasing seismic duration for the two events. In general, decomposition method of main shock based on source process, considering shakemap of each sub-event, is feasible for disaster emergency response, decision-making and rapid Disaster Assessment after the earthquake.
NASA Astrophysics Data System (ADS)
Kromer, Ryan A.; Abellán, Antonio; Hutchinson, D. Jean; Lato, Matt; Chanut, Marie-Aurelie; Dubois, Laurent; Jaboyedoff, Michel
2017-05-01
We present an automated terrestrial laser scanning (ATLS) system with automatic near-real-time change detection processing. The ATLS system was tested on the Séchilienne landslide in France for a 6-week period with data collected at 30 min intervals. The purpose of developing the system was to fill the gap of high-temporal-resolution TLS monitoring studies of earth surface processes and to offer a cost-effective, light, portable alternative to ground-based interferometric synthetic aperture radar (GB-InSAR) deformation monitoring. During the study, we detected the flux of talus, displacement of the landslide and pre-failure deformation of discrete rockfall events. Additionally, we found the ATLS system to be an effective tool in monitoring landslide and rockfall processes despite missing points due to poor atmospheric conditions or rainfall. Furthermore, such a system has the potential to help us better understand a wide variety of slope processes at high levels of temporal detail.
Low Power Shoe Integrated Intelligent Wireless Gait Measurement System
NASA Astrophysics Data System (ADS)
Wahab, Y.; Mazalan, M.; Bakar, N. A.; Anuar, A. F.; Zainol, M. Z.; Hamzah, F.
2014-04-01
Gait analysis measurement is a method to assess and identify gait events and the measurements of dynamic, motion and pressure parameters involving the lowest part of the body. This significant analysis is widely used in sports, rehabilitation as well as other health diagnostic towards improving the quality of life. This paper presents a new system empowered by Inertia Measurement Unit (IMU), ultrasonic sensors, piezoceramic sensors array, XBee wireless modules and Arduino processing unit. This research focuses on the design and development of a low power ultra-portable shoe integrated wireless intelligent gait measurement using MEMS and recent microelectronic devices for foot clearance, orientation, error correction, gait events and pressure measurement system. It is developed to be cheap, low power, wireless, real time and suitable for real life in-door and out-door environment.
Coarse Resolution SAR Imagery to Support Flood Inundation Models in Near Real Time
NASA Astrophysics Data System (ADS)
Di Baldassarre, Giuliano; Schumann, Guy; Brandimarte, Luigia; Bates, Paul
2009-11-01
In recent years, the availability of new emerging data (e.g. remote sensing, intelligent wireless sensors, etc) has led to a sudden shift from a data-sparse to a data-rich environment for hydrological and hydraulic modelling. Furthermore, the increased socioeconomic relevance of river flood studies has motivated the development of complex methodologies for the simulation of the hydraulic behaviour of river systems. In this context, this study aims at assessing the capability of coarse resolution SAR (Synthetic Aperture Radar) imagery to support and quickly validate flood inundation models in near real time. A hydraulic model of a 98km reach of the River Po (Italy), previously calibrated on a high-magnitude flood event with extensive and high quality field data, is tested using a SAR flood image, acquired and processed in near real time, during the June 2008 low-magnitude event. Specifically, the image is an acquisition by the ENVISAT-ASAR sensor in wide swath mode and has been provided through ESA (European Space Agency) Fast Registration system at no cost 24 hours after the acquisition. The study shows that the SAR image enables validation and improvement of the model in a time shorter than the flood travel time. This increases the reliability of model predictions (e.g. water elevation and inundation width along the river reach) and, consequently, assists flood management authorities in undertaking the necessary prevention activities.
Monitoring of waste disposal in deep geological formations
NASA Astrophysics Data System (ADS)
German, V.; Mansurov, V.
2003-04-01
In the paper application of kinetic approach for description of rock failure process and waste disposal microseismic monitoring is advanced. On base of two-stage model of failure process the capability of rock fracture is proved. The requests to monitoring system such as real time mode of data registration and processing and its precision range are formulated. The method of failure nuclei delineation in a rock masses is presented. This method is implemented in a software program for strong seismic events forecasting. It is based on direct use of the fracture concentration criterion. The method is applied to the database of microseismic events of the North Ural Bauxite Mine. The results of this application, such as: efficiency, stability, possibility of forecasting rockburst are discussed.
Applications For Real Time NOMADS At NCEP To Disseminate NOAA's Operational Model Data Base
NASA Astrophysics Data System (ADS)
Alpert, J. C.; Wang, J.; Rutledge, G.
2007-05-01
A wide range of environmental information, in digital form, with metadata descriptions and supporting infrastructure is contained in the NOAA Operational Modeling Archive Distribution System (NOMADS) and its Real Time (RT) project prototype at the National Centers for Environmental Prediction (NCEP). NOMADS is now delivering on its goal of a seamless framework, from archival to real time data dissemination for NOAA's operational model data holdings. A process is under way to make NOMADS part of NCEP's operational production of products. A goal is to foster collaborations among the research and education communities, value added retailers, and public access for science and development. In the National Research Council's "Completing the Forecast", Recommendation 3.4 states: "NOMADS should be maintained and extended to include (a) long-term archives of the global and regional ensemble forecasting systems at their native resolution, and (b) re-forecast datasets to facilitate post-processing." As one of many participants of NOMADS, NCEP serves the operational model data base using data application protocol (Open-DAP) and other services for participants to serve their data sets and users to obtain them. Using the NCEP global ensemble data as an example, we show an Open-DAP (also known as DODS) client application that provides a request-and-fulfill mechanism for access to the complex ensemble matrix of holdings. As an example of the DAP service, we show a client application which accesses the Global or Regional Ensemble data set to produce user selected weather element event probabilities. The event probabilities are easily extended over model forecast time to show probability histograms defining the future trend of user selected events. This approach insures an efficient use of computer resources because users transmit only the data necessary for their tasks. Data sets are served by OPeN-DAP allowing commercial clients such as MATLAB or IDL as well as freeware clients such as GrADS to access the NCEP real time database. We will demonstrate how users can use NOMADS services to repackage area subsets and select levels and variables that are sent to a users selected ftp site. NOMADS can also display plots on demand for area subsets, selected levels, time series and selected variables.
Real-Time Imaging System for the OpenPET
NASA Astrophysics Data System (ADS)
Tashima, Hideaki; Yoshida, Eiji; Kinouchi, Shoko; Nishikido, Fumihiko; Inadama, Naoko; Murayama, Hideo; Suga, Mikio; Haneishi, Hideaki; Yamaya, Taiga
2012-02-01
The OpenPET and its real-time imaging capability have great potential for real-time tumor tracking in medical procedures such as biopsy and radiation therapy. For the real-time imaging system, we intend to use the one-pass list-mode dynamic row-action maximum likelihood algorithm (DRAMA) and implement it using general-purpose computing on graphics processing units (GPGPU) techniques. However, it is difficult to make consistent reconstructions in real-time because the amount of list-mode data acquired in PET scans may be large depending on the level of radioactivity, and the reconstruction speed depends on the amount of the list-mode data. In this study, we developed a system to control the data used in the reconstruction step while retaining quantitative performance. In the proposed system, the data transfer control system limits the event counts to be used in the reconstruction step according to the reconstruction speed, and the reconstructed images are properly intensified by using the ratio of the used counts to the total counts. We implemented the system on a small OpenPET prototype system and evaluated the performance in terms of the real-time tracking ability by displaying reconstructed images in which the intensity was compensated. The intensity of the displayed images correlated properly with the original count rate and a frame rate of 2 frames per second was achieved with average delay time of 2.1 s.
Beyond Event Segmentation: Spatial- and Social-Cognitive Processes in Verb-to-Action Mapping
ERIC Educational Resources Information Center
Friend, Margaret; Pace, Amy
2011-01-01
The present article investigates spatial- and social-cognitive processes in toddlers' mapping of concepts to real-world events. In 2 studies we explore how event segmentation might lay the groundwork for extracting actions from the event stream and conceptually mapping novel verbs to these actions. In Study 1, toddlers demonstrated the ability to…
Real-Time Payload Control and Monitoring on the World Wide Web
NASA Technical Reports Server (NTRS)
Sun, Charles; Windrem, May; Givens, John J. (Technical Monitor)
1998-01-01
World Wide Web (W3) technologies such as the Hypertext Transfer Protocol (HTTP) and the Java object-oriented programming environment offer a powerful, yet relatively inexpensive, framework for distributed application software development. This paper describes the design of a real-time payload control and monitoring system that was developed with W3 technologies at NASA Ames Research Center. Based on Java Development Toolkit (JDK) 1.1, the system uses an event-driven "publish and subscribe" approach to inter-process communication and graphical user-interface construction. A C Language Integrated Production System (CLIPS) compatible inference engine provides the back-end intelligent data processing capability, while Oracle Relational Database Management System (RDBMS) provides the data management function. Preliminary evaluation shows acceptable performance for some classes of payloads, with Java's portability and multimedia support identified as the most significant benefit.
An arc control and protection system for the JET lower hybrid antenna based on an imaging system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Figueiredo, J., E-mail: joao.figueiredo@jet.efda.org; Mailloux, J.; Kirov, K.
Arcs are the potentially most dangerous events related to Lower Hybrid (LH) antenna operation. If left uncontrolled they can produce damage and cause plasma disruption by impurity influx. To address this issue an arc real time control and protection imaging system for the Joint European Torus (JET) LH antenna has been implemented. The LH system is one of the additional heating systems at JET. It comprises 24 microwave generators (klystrons, operating at 3.7 GHz) providing up to 5 MW of heating and current drive to the JET plasma. This is done through an antenna composed of an array of waveguidesmore » facing the plasma. The protection system presented here is based primarily on an imaging arc detection and real time control system. It has adapted the ITER like wall hotspot protection system using an identical CCD camera and real time image processing unit. A filter has been installed to avoid saturation and spurious system triggers caused by ionization light. The antenna is divided in 24 Regions Of Interest (ROIs) each one corresponding to one klystron. If an arc precursor is detected in a ROI, power is reduced locally with subsequent potential damage and plasma disruption avoided. The power is subsequently reinstated if, during a defined interval of time, arcing is confirmed not to be present by image analysis. This system was successfully commissioned during the restart phase and beginning of the 2013 scientific campaign. Since its installation and commissioning, arcs and related phenomena have been prevented. In this contribution we briefly describe the camera, image processing, and real time control systems. Most importantly, we demonstrate that an LH antenna arc protection system based on CCD camera imaging systems works. Examples of both controlled and uncontrolled LH arc events and their consequences are shown.« less
Facilitating preemptive hardware system design using partial reconfiguration techniques.
Dondo Gazzano, Julio; Rincon, Fernando; Vaderrama, Carlos; Villanueva, Felix; Caba, Julian; Lopez, Juan Carlos
2014-01-01
In FPGA-based control system design, partial reconfiguration is especially well suited to implement preemptive systems. In real-time systems, the deadline for critical task can compel the preemption of noncritical one. Besides, an asynchronous event can demand immediate attention and, then, force launching a reconfiguration process for high-priority task implementation. If the asynchronous event is previously scheduled, an explicit activation of the reconfiguration process is performed. If the event cannot be previously programmed, such as in dynamically scheduled systems, an implicit activation to the reconfiguration process is demanded. This paper provides a hardware-based approach to explicit and implicit activation of the partial reconfiguration process in dynamically reconfigurable SoCs and includes all the necessary tasks to cope with this issue. Furthermore, the reconfiguration service introduced in this work allows remote invocation of the reconfiguration process and then the remote integration of off-chip components. A model that offers component location transparency is also presented to enhance and facilitate system integration.
Facilitating Preemptive Hardware System Design Using Partial Reconfiguration Techniques
Rincon, Fernando; Vaderrama, Carlos; Villanueva, Felix; Caba, Julian; Lopez, Juan Carlos
2014-01-01
In FPGA-based control system design, partial reconfiguration is especially well suited to implement preemptive systems. In real-time systems, the deadline for critical task can compel the preemption of noncritical one. Besides, an asynchronous event can demand immediate attention and, then, force launching a reconfiguration process for high-priority task implementation. If the asynchronous event is previously scheduled, an explicit activation of the reconfiguration process is performed. If the event cannot be previously programmed, such as in dynamically scheduled systems, an implicit activation to the reconfiguration process is demanded. This paper provides a hardware-based approach to explicit and implicit activation of the partial reconfiguration process in dynamically reconfigurable SoCs and includes all the necessary tasks to cope with this issue. Furthermore, the reconfiguration service introduced in this work allows remote invocation of the reconfiguration process and then the remote integration of off-chip components. A model that offers component location transparency is also presented to enhance and facilitate system integration. PMID:24672292
Fact vs fiction--how paratextual information shapes our reading processes.
Altmann, Ulrike; Bohrn, Isabel C; Lubrich, Oliver; Menninghaus, Winfried; Jacobs, Arthur M
2014-01-01
Our life is full of stories: some of them depict real-life events and were reported, e.g. in the daily news or in autobiographies, whereas other stories, as often presented to us in movies and novels, are fictional. However, we have only little insights in the neurocognitive processes underlying the reading of factual as compared to fictional contents. We investigated the neurocognitive effects of reading short narratives, labeled to be either factual or fictional. Reading in a factual mode engaged an activation pattern suggesting an action-based reconstruction of the events depicted in a story. This process seems to be past-oriented and leads to shorter reaction times at the behavioral level. In contrast, the brain activation patterns corresponding to reading fiction seem to reflect a constructive simulation of what might have happened. This is in line with studies on imagination of possible past or future events.
Fact vs fiction—how paratextual information shapes our reading processes
Altmann, Ulrike; Bohrn, Isabel C.; Lubrich, Oliver; Menninghaus, Winfried; Jacobs, Arthur M.
2014-01-01
Our life is full of stories: some of them depict real-life events and were reported, e.g. in the daily news or in autobiographies, whereas other stories, as often presented to us in movies and novels, are fictional. However, we have only little insights in the neurocognitive processes underlying the reading of factual as compared to fictional contents. We investigated the neurocognitive effects of reading short narratives, labeled to be either factual or fictional. Reading in a factual mode engaged an activation pattern suggesting an action-based reconstruction of the events depicted in a story. This process seems to be past-oriented and leads to shorter reaction times at the behavioral level. In contrast, the brain activation patterns corresponding to reading fiction seem to reflect a constructive simulation of what might have happened. This is in line with studies on imagination of possible past or future events. PMID:22956671
NASA Astrophysics Data System (ADS)
Joyce, Malcolm J.; Gamage, Kelum A. A.; Aspinall, M. D.; Cave, F. D.; Lavietes, A.
2014-06-01
The design, principle of operation and the results of measurements made with a four-channel organic scintillator system are described. The system comprises four detectors and a multiplexed analyzer for the real-time parallel processing of fast neutron events. The function of the real-time, digital multiple-channel pulse-shape discrimination analyzer is described together with the results of laboratory-based measurements with 252Cf, 241Am-Li and plutonium. The analyzer is based on a single-board solution with integrated high-voltage supplies and graphical user interface. It has been developed to meet the requirements of nuclear materials assay of relevance to safeguards and security. Data are presented for the real-time coincidence assay of plutonium in terms of doubles count rate versus mass. This includes an assessment of the limiting mass uncertainty for coincidence assay based on a 100 s measurement period and samples in the range 0-50 g. Measurements of count rate versus order of multiplicity for 252Cf and 241Am-Li and combinations of both are also presented.
NASA Astrophysics Data System (ADS)
Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.
2012-06-01
The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for collecting and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of application running concurrently. In such a complex environment, information analysis is fundamental for controlling applications behavior, error reporting and operational monitoring. During data taking runs, streams of messages sent by applications via the message reporting system together with data published from applications via information services are the main sources of knowledge about correctness of running operations. The flow of data produced (with an average rate of O(1-10KHz)) is constantly monitored by experts to detect problem or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful information is not in the single message or update, but in the aggregated behavior in a certain time-line. The AAL project is meant at reducing the man power needs and at assuring a constant high quality of problem detection by automating most of the monitoring tasks and providing real-time correlation of data-taking and system metrics. This project combines technologies coming from different disciplines, in particular it leverages on an Event Driven Architecture to unify the flow of data from the ATLAS infrastructure, on a Complex Event Processing (CEP) engine for correlation of events and on a message oriented architecture for components integration. The project is composed of 2 main components: a core processing engine, responsible for correlation of events through expert-defined queries and a web based front-end to present real-time information and interact with the system. All components works in a loose-coupled event based architecture, with a message broker to centralize all communication between modules. The result is an intelligent system able to extract and compute relevant information from the flow of operational data to provide real-time feedback to human experts who can promptly react when needed. The paper presents the design and implementation of the AAL project, together with the results of its usage as automated monitoring assistant for the ATLAS data taking infrastructure.
Children's Use of Morphological Cues in Real-Time Event Representation
ERIC Educational Resources Information Center
Zhou, Peng; Ma, Weiyi
2018-01-01
The present study investigated whether and how fast young children can use information encoded in morphological markers during real-time event representation. Using the visual world paradigm, we tested 35 adults, 34 5-year-olds and 33 3-year-olds. The results showed that the adults, the 5-year-olds and the 3-year-olds all exhibited eye gaze…
The Earthquake Early Warning System In Southern Italy: Performance Tests And Next Developments
NASA Astrophysics Data System (ADS)
Zollo, A.; Elia, L.; Martino, C.; Colombelli, S.; Emolo, A.; Festa, G.; Iannaccone, G.
2011-12-01
PRESTo (PRobabilistic and Evolutionary early warning SysTem) is the software platform for Earthquake Early Warning (EEW) in Southern Italy, that integrates recent algorithms for real-time earthquake location, magnitude estimation and damage assessment, into a highly configurable and easily portable package. The system is under active experimentation based on the Irpinia Seismic Network (ISNet). PRESTo processes the live streams of 3C acceleration data for P-wave arrival detection and, while an event is occurring, promptly performs event detection and provides location, magnitude estimations and peak ground shaking predictions at target sites. The earthquake location is obtained by an evolutionary, real-time probabilistic approach based on an equal differential time formulation. At each time step, it uses information from both triggered and not-yet-triggered stations. Magnitude estimation exploits an empirical relationship that correlates it to the filtered Peak Displacement (Pd), measured over the first 2-4 s of P-signal. Peak ground-motion parameters at any distance can be finally estimated by ground motion prediction equations. Alarm messages containing the updated estimates of these parameters can thus reach target sites before the destructive waves, enabling automatic safety procedures. Using the real-time data streaming from the ISNet network, PRESTo has produced a bulletin for about a hundred low-magnitude events occurred during last two years. Meanwhile, the performances of the EEW system were assessed off-line playing-back the records for moderate and large events from Italy, Spain and Japan and synthetic waveforms for large historical events in Italy. These tests have shown that, when a dense seismic network is deployed in the fault area, PRESTo produces reliable estimates of earthquake location and size within 5-6 s from the event origin time (To). Estimates are provided as probability density functions whose uncertainty typically decreases with time, obtaining a stable solution within 10 s from To. The regional approach was recently integrated with a threshold-based early warning method for the definition of alert levels and the estimation of the Potential Damaged Zone (PDZ) in which the highest intensity levels are expected. The dominant period Tau_c and the peak displacement (Pd) are simultaneously measured in a 3s window after the first P-arrival time. Pd and Tau_c are then compared with threshold values, previously established through an empirical regression analysis, that define a decisional table with four alert levels. According to the real-time measured values of Pd and tau_c, each station provides a local alert level that can be used to warn distant sites and to define the extent of the PDZ. The integrated system was validated off-line for the M6.3, 2009 Central Italy earthquake and ten large Japanese events, due to the low-magnitude events currently occurring in Irpinia. The results confirmed the feasibility and the robustness of such an approach, providing reliable predictions of the earthquake damaging effects, that is a relevant information for the efficient planning of the rescue operations in the immediate post-event emergency phase.
Global Near Real-Time MODIS and Landsat Flood Mapping and Product Delivery
NASA Astrophysics Data System (ADS)
Policelli, F. S.; Slayback, D. A.; Tokay, M. M.; Brakenridge, G. R.
2014-12-01
Flooding is the most destructive, frequent, and costly natural disaster faced by modern society, and is increasing in frequency and damage (deaths, displacements, and financial costs) as populations increase and climate change generates more extreme weather events. When major flooding events occur, the disaster management community needs frequently updated and easily accessible information to better understand the extent of flooding and coordinate response efforts. With funding from NASA's Applied Sciences program, we developed and are now operating a near real-time global flood mapping system to help provide flood extent information within 24-48 hours of events. The principal element of the system applies a water detection algorithm to MODIS imagery, which is processed by the LANCE (Land Atmosphere Near real-time Capability for EOS) system at NASA Goddard within a few hours of satellite overpass. Using imagery from both the Terra (10:30 AM local time overpass) and Aqua (1:30 PM) platforms allows the system to deliver an initial daily assessment of flood extent by late afternoon, and more robust assessments after accumulating cloud-free imagery over several days. Cloud cover is the primary limitation in detecting surface water from MODIS imagery. Other issues include the relatively coarse scale of the MODIS imagery (250 meters) for some events, the difficulty of detecting flood waters in areas with continuous canopy cover, confusion of shadow (cloud or terrain) with water, and accurately identifying detected water as flood as opposed to normal water extent. We are working on improvements to address these limitations. We have also begun delivery of near real time water maps at 30 m resolution from Landsat imagery. Although Landsat is not available daily globally, but only every 8 days if imagery from both operating platforms (Landsat 7 and 8) is accessed, it can provide useful higher resolution data on water extent when a clear acquisition coincides with an active flood event. These data products are provided in various formats on our website, and also via live OGC (Open Geospatial Consortium) services, and ArcGIS Online accessible web maps, allowing easy access from a variety of platforms, from desktop GIS software to web browsers on mobile phones. https://oas.gsfc.nasa.gov/floodmap
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Hunter, Stanley D.; Hanu, Andrei R.; Sheets, Teresa B.
2016-01-01
Richard O. Duda and Peter E. Hart of Stanford Research Institute in [1] described the recurring problem in computer image processing as the detection of straight lines in digitized images. The problem is to detect the presence of groups of collinear or almost collinear figure points. It is clear that the problem can be solved to any desired degree of accuracy by testing the lines formed by all pairs of points. However, the computation required for n=NxM points image is approximately proportional to n2 or O(n2), becoming prohibitive for large images or when data processing cadence time is in milliseconds. Rosenfeld in [2] described an ingenious method due to Hough [3] for replacing the original problem of finding collinear points by a mathematically equivalent problem of finding concurrent lines. This method involves transforming each of the figure points into a straight line in a parameter space. Hough chose to use the familiar slope-intercept parameters, and thus his parameter space was the two-dimensional slope-intercept plane. A parallel Hough transform running on multi-core processors was elaborated in [4]. There are many other proposed methods of solving a similar problem, such as sampling-up-the-ramp algorithm (SUTR) [5] and algorithms involving artificial swarm intelligence techniques [6]. However, all state-of-the-art algorithms lack in real time performance. Namely, they are slow for large images that require performance cadence of a few dozens of milliseconds (50ms). This problem arises in spaceflight applications such as near real-time analysis of gamma ray measurements contaminated by overwhelming amount of traces of cosmic rays (CR). Future spaceflight instruments such as the Advanced Energetic Pair Telescope instrument (AdEPT) [7-9] for cosmos gamma ray survey employ large detector readout planes registering multitudes of cosmic ray interference events and sparse science gamma ray event traces' projections. The AdEPT science of interest is in the gamma ray events and the problem is to detect and reject the much more voluminous cosmic ray projections, so that the remaining science data can be telemetered to the ground over the constrained communication link. The state-of-the-art in cosmic rays detection and rejection does not provide an adequate computational solution. This paper presents a novel approach to the AdEPT on-board data processing burdened with the CR detection top pole bottleneck problem. This paper is introducing the data processing object, demonstrates object segmentation and distribution for processing among many processing elements (PEs) and presents solution algorithm for the processing bottleneck - the CR-Algorithm. The algorithm is based on the a priori knowledge that a CR pierces the entire instrument pressure vessel. This phenomenon is also the basis for a straightforward CR simulator, allowing the CR-Algorithm performance testing. Parallel processing of the readout image's (2(N+M) - 4) peripheral voxels is detecting all CRs, resulting in O(n) computational complexity. This algorithm near real-time performance is making AdEPT class spaceflight instruments feasible.
An Efficient Randomized Algorithm for Real-Time Process Scheduling in PicOS Operating System
NASA Astrophysics Data System (ADS)
Helmy*, Tarek; Fatai, Anifowose; Sallam, El-Sayed
PicOS is an event-driven operating environment designed for use with embedded networked sensors. More specifically, it is designed to support the concurrency in intensive operations required by networked sensors with minimal hardware requirements. Existing process scheduling algorithms of PicOS; a commercial tiny, low-footprint, real-time operating system; have their associated drawbacks. An efficient, alternative algorithm, based on a randomized selection policy, has been proposed, demonstrated, confirmed for efficiency and fairness, on the average, and has been recommended for implementation in PicOS. Simulations were carried out and performance measures such as Average Waiting Time (AWT) and Average Turn-around Time (ATT) were used to assess the efficiency of the proposed randomized version over the existing ones. The results prove that Randomized algorithm is the best and most attractive for implementation in PicOS, since it is most fair and has the least AWT and ATT on average over the other non-preemptive scheduling algorithms implemented in this paper.
NASA Astrophysics Data System (ADS)
Jackson, Michael; Passmore, Paul; Zimakov, Leonid; Raczka, Jared
2014-05-01
One of the fundamental requirements of an Earthquake Early Warning (EEW) system (and other mission critical applications) is to quickly detect and process the information from the strong motion event, i.e. event detection and location, magnitude estimation, and the peak ground motion estimation at the defined targeted site, thus allowing the civil protection authorities to provide pre-programmed emergency response actions: Slow down or stop rapid transit trains and high-speed trains; shutoff of gas pipelines and chemical facilities; stop elevators at the nearest floor; send alarms to hospitals, schools and other civil institutions. An important question associated with the EEW system is: can we measure displacements in real time with sufficient accuracy? Scientific GNSS networks are moving towards a model of real-time data acquisition, storage integrity, and real-time position and displacement calculations. This new paradigm allows the integration of real-time, high-rate GNSS displacement information with acceleration and velocity data to create very high-rate displacement records. The mating of these two instruments allows the creation of a new, very high-rate (200 Hz) displacement observable that has the full-scale displacement characteristics of GNSS and high-precision dynamic motions of seismic technologies. It is envisioned that these new observables can be used for earthquake early warning studies and other mission critical applications, such as volcano monitoring, building, bridge and dam monitoring systems. REF TEK a Division of Trimble has developed the integrated GNSS/Accelerograph system, model 160-09SG, which consists of REF TEK's fourth generation electronics, a 147-01 high-resolution ANSS Class A accelerometer, and Trimble GNSS receiver and antenna capable of real time, on board Precise Point Positioning (PPP) techniques with satellite clock and orbit corrections delivered to the receiver directly via L-band satellite communications. The test we conducted with the 160-09SG Recorder is focused on the characteristics of GNSS and seismic sensors in high dynamic environments, including historic earthquakes replicated on a shake table, over a range of displacements and frequencies. The main goals of the field tests are to explore the optimum integration of these sensors from a filtering perspective including simple harmonic impulses over varying frequencies and amplitudes and under the dynamic conditions of various earthquake scenarios.
Stavrakakis, P; Agapiou, A; Mikedi, K; Karma, S; Statheropoulos, M; Pallis, G C; Pappa, A
2014-01-01
Fires are becoming more violent and frequent resulting in major economic losses and long-lasting effects on communities and ecosystems; thus, efficient fire monitoring is becoming a necessity. A novel triple multi-sensor approach was developed for monitoring and studying the burning of dry forest fuel in an open field scheduled experiment; chemical, optical, and acoustical sensors were combined to record the fire spread. The results of this integrated field campaign for real-time monitoring of the fire event are presented and discussed. Chemical analysis, despite its limitations, corresponded to the burning process with a minor time delay. Nevertheless, the evolution profile of CO2, CO, NO, and O2 were detected and monitored. The chemical monitoring of smoke components enabled the observing of the different fire phases (flaming, smoldering) based on the emissions identified in each phase. The analysis of fire acoustical signals presented accurate and timely response to the fire event. In the same content, the use of a thermographic camera, for monitoring the biomass burning, was also considerable (both profiles of the intensities of average gray and red component greater than 230) and presented similar promising potentials to audio results. Further work is needed towards integrating sensors signals for automation purposes leading to potential applications in real situations.
2000-06-01
real - time operating system and design of a human-computer interface (HCI) for a triple modular redundant (TMR) fault-tolerant microprocessor for use in space-based applications. Once disadvantage of using COTS hardware components is their susceptibility to the radiation effects present in the space environment. and specifically, radiation-induced single-event upsets (SEUs). In the event of an SEU, a fault-tolerant system can mitigate the effects of the upset and continue to process from the last known correct system state. The TMR basic hardware
Event-based image recognition applied in tennis training assistance
NASA Astrophysics Data System (ADS)
Wawrzyniak, Zbigniew M.; Kowalski, Adam
2016-09-01
This paper presents a concept of a real-time system for individual tennis training assistance. The system is supposed to provide user (player) with information on his strokes accuracy as well as other training quality parameters such as velocity and rotation of the ball during its flight. The method is based on image processing methods equipped with developed explorative analysis of the events and their description by parameters of the movement. There has been presented the concept for further deployment to create a complete system that could assist tennis player during individual training.
Tesla: An application for real-time data analysis in High Energy Physics
NASA Astrophysics Data System (ADS)
Aaij, R.; Amato, S.; Anderlini, L.; Benson, S.; Cattaneo, M.; Clemencic, M.; Couturier, B.; Frank, M.; Gligorov, V. V.; Head, T.; Jones, C.; Komarov, I.; Lupton, O.; Matev, R.; Raven, G.; Sciascia, B.; Skwarnicki, T.; Spradlin, P.; Stahl, S.; Storaci, B.; Vesterinen, M.
2016-11-01
Upgrades to the LHCb computing infrastructure in the first long shutdown of the LHC have allowed for high quality decay information to be calculated by the software trigger making a separate offline event reconstruction unnecessary. Furthermore, the storage space of the triggered candidate is an order of magnitude smaller than the entire raw event that would otherwise need to be persisted. Tesla is an application designed to process the information calculated by the trigger, with the resulting output used to directly perform physics measurements.
Real Time Fire Reconnaissance Satellite Monitoring System Failure Model
NASA Astrophysics Data System (ADS)
Nino Prieto, Omar Ariosto; Colmenares Guillen, Luis Enrique
2013-09-01
In this paper the Real Time Fire Reconnaissance Satellite Monitoring System is presented. This architecture is a legacy of the Detection System for Real-Time Physical Variables which is undergoing a patent process in Mexico. The methodologies for this design are the Structured Analysis for Real Time (SA- RT) [8], and the software is carried out by LACATRE (Langage d'aide à la Conception d'Application multitâche Temps Réel) [9,10] Real Time formal language. The system failures model is analyzed and the proposal is based on the formal language for the design of critical systems and Risk Assessment; AltaRica. This formal architecture uses satellites as input sensors and it was adapted from the original model which is a design pattern for physical variation detection in Real Time. The original design, whose task is to monitor events such as natural disasters and health related applications, or actual sickness monitoring and prevention, as the Real Time Diabetes Monitoring System, among others. Some related work has been presented on the Mexican Space Agency (AEM) Creation and Consultation Forums (2010-2011), and throughout the International Mexican Aerospace Science and Technology Society (SOMECYTA) international congress held in San Luis Potosí, México (2012). This Architecture will allow a Real Time Fire Satellite Monitoring, which will reduce the damage and danger caused by fires which consumes the forests and tropical forests of Mexico. This new proposal, permits having a new system that impacts on disaster prevention, by combining national and international technologies and cooperation for the benefit of humankind.
Feature Acquisition with Imbalanced Training Data
NASA Technical Reports Server (NTRS)
Thompson, David R.; Wagstaff, Kiri L.; Majid, Walid A.; Jones, Dayton L.
2011-01-01
This work considers cost-sensitive feature acquisition that attempts to classify a candidate datapoint from incomplete information. In this task, an agent acquires features of the datapoint using one or more costly diagnostic tests, and eventually ascribes a classification label. A cost function describes both the penalties for feature acquisition, as well as misclassification errors. A common solution is a Cost Sensitive Decision Tree (CSDT), a branching sequence of tests with features acquired at interior decision points and class assignment at the leaves. CSDT's can incorporate a wide range of diagnostic tests and can reflect arbitrary cost structures. They are particularly useful for online applications due to their low computational overhead. In this innovation, CSDT's are applied to cost-sensitive feature acquisition where the goal is to recognize very rare or unique phenomena in real time. Example applications from this domain include four areas. In stream processing, one seeks unique events in a real time data stream that is too large to store. In fault protection, a system must adapt quickly to react to anticipated errors by triggering repair activities or follow- up diagnostics. With real-time sensor networks, one seeks to classify unique, new events as they occur. With observational sciences, a new generation of instrumentation seeks unique events through online analysis of large observational datasets. This work presents a solution based on transfer learning principles that permits principled CSDT learning while exploiting any prior knowledge of the designer to correct both between-class and withinclass imbalance. Training examples are adaptively reweighted based on a decomposition of the data attributes. The result is a new, nonparametric representation that matches the anticipated attribute distribution for the target events.
NASA Astrophysics Data System (ADS)
Patton, J.; Yeck, W.; Benz, H.
2017-12-01
The U.S. Geological Survey National Earthquake Information Center (USGS NEIC) is implementing and integrating new signal detection methods such as subspace correlation, continuous beamforming, multi-band picking and automatic phase identification into near-real-time monitoring operations. Leveraging the additional information from these techniques help the NEIC utilize a large and varied network on local to global scales. The NEIC is developing an ordered, rapid, robust, and decentralized framework for distributing seismic detection data as well as a set of formalized formatting standards. These frameworks and standards enable the NEIC to implement a seismic event detection framework that supports basic tasks, including automatic arrival time picking, social media based event detections, and automatic association of different seismic detection data into seismic earthquake events. In addition, this framework enables retrospective detection processing such as automated S-wave arrival time picking given a detected event, discrimination and classification of detected events by type, back-azimuth and slowness calculations, and ensuring aftershock and induced sequence detection completeness. These processes and infrastructure improve the NEIC's capabilities, accuracy, and speed of response. In addition, this same infrastructure provides an improved and convenient structure to support access to automatic detection data for both research and algorithmic development.
Diagnosis of delay-deadline failures in real time discrete event models.
Biswas, Santosh; Sarkar, Dipankar; Bhowal, Prodip; Mukhopadhyay, Siddhartha
2007-10-01
In this paper a method for fault detection and diagnosis (FDD) of real time systems has been developed. A modeling framework termed as real time discrete event system (RTDES) model is presented and a mechanism for FDD of the same has been developed. The use of RTDES framework for FDD is an extension of the works reported in the discrete event system (DES) literature, which are based on finite state machines (FSM). FDD of RTDES models are suited for real time systems because of their capability of representing timing faults leading to failures in terms of erroneous delays and deadlines, which FSM-based ones cannot address. The concept of measurement restriction of variables is introduced for RTDES and the consequent equivalence of states and indistinguishability of transitions have been characterized. Faults are modeled in terms of an unmeasurable condition variable in the state map. Diagnosability is defined and the procedure of constructing a diagnoser is provided. A checkable property of the diagnoser is shown to be a necessary and sufficient condition for diagnosability. The methodology is illustrated with an example of a hydraulic cylinder.
Event-specific real-time detection and quantification of genetically modified Roundup Ready soybean.
Huang, Chia-Chia; Pan, Tzu-Ming
2005-05-18
The event-specific real-time detection and quantification of Roundup Ready soybean (RRS) using an ABI PRISM 7700 sequence detection system with light upon extension (LUX) primer was developed in this study. The event-specific primers were designed, targeting the junction of the RRS 5' integration site and the endogenous gene lectin1. Then, a standard reference plasmid was constructed that carried both of the targeted sequences for quantitative analysis. The detection limit of the LUX real-time PCR system was 0.05 ng of 100% RRS genomic DNA, which was equal to 20.5 copies. The range of quantification was from 0.1 to 100%. The sensitivity and range of quantification successfully met the requirement of the labeling rules in the European Union and Taiwan.
Real-time Automatic Detectors of P and S Waves Using Singular Values Decomposition
NASA Astrophysics Data System (ADS)
Kurzon, I.; Vernon, F.; Rosenberger, A.; Ben-Zion, Y.
2013-12-01
We implement a new method for the automatic detection of the primary P and S phases using Singular Value Decomposition (SVD) analysis. The method is based on a real-time iteration algorithm of Rosenberger (2010) for the SVD of three component seismograms. Rosenberger's algorithm identifies the incidence angle by applying SVD and separates the waveforms into their P and S components. We have been using the same algorithm with the modification that we filter the waveforms prior to the SVD, and then apply SNR (Signal-to-Noise Ratio) detectors for picking the P and S arrivals, on the new filtered+SVD-separated channels. A recent deployment in San Jacinto Fault Zone area provides a very dense seismic network that allows us to test the detection algorithm in diverse setting, such as: events with different source mechanisms, stations with different site characteristics, and ray paths that diverge from the SVD approximation used in the algorithm, (e.g., rays propagating within the fault and recorded on linear arrays, crossing the fault). We have found that a Butterworth band-pass filter of 2-30Hz, with four poles at each of the corner frequencies, shows the best performance in a large variety of events and stations within the SJFZ. Using the SVD detectors we obtain a similar number of P and S picks, which is a rare thing to see in ordinary SNR detectors. Also for the actual real-time operation of the ANZA and SJFZ real-time seismic networks, the above filter (2-30Hz) shows a very impressive performance, tested on many events and several aftershock sequences in the region from the MW 5.2 of June 2005, through the MW 5.4 of July 2010, to MW 4.7 of March 2013. Here we show the results of testing the detectors on the most complex and intense aftershock sequence, the MW 5.2 of June 2005, in which in the very first hour there were ~4 events a minute. This aftershock sequence was thoroughly reviewed by several analysts, identifying 294 events in the first hour, located in a condensed cluster around the main shock. We used this hour of events to fine-tune the automatic SVD detection, association and location of the real-time system, reaching a 37% automatic identification and location of events, with a minimum of 10 stations per event, all events fall within the same condensed cluster and there are no false events or large offsets of their locations. An ordinary SNR detector did not exceed the 11% success with a minimum of 8 stations per event, 2 false events and a wider spread of events (not within the reviewed cluster). One of the main advantages of the SVD detectors for real-time operations is the actual separation between the P and S components, by that significantly reducing the noise of picks detected by ordinary SNR detectors. The new method has been applied for a significant amount of events within the SJFZ in the past 8 years, and is now in the final stage of real-time implementation in UCSD for the ANZA and SJFZ networks, tuned for automatic detection and location of local events.
ERIC Educational Resources Information Center
Paczynski, Martin; Kuperberg, Gina R.
2012-01-01
We aimed to determine whether semantic relatedness between an incoming word and its preceding context can override expectations based on two types of stored knowledge: real-world knowledge about the specific events and states conveyed by a verb, and the verb's broader selection restrictions on the animacy of its argument. We recorded event-related…
Soliton formation from a noise-like pulse during extreme events in a fibre ring laser
NASA Astrophysics Data System (ADS)
Pottiez, O.; Ibarra-Villalon, H. E.; Bracamontes-Rodriguez, Y.; Minguela-Gallardo, J. A.; Garcia-Sanchez, E.; Lauterio-Cruz, J. P.; Hernandez-Garcia, J. C.; Bello-Jimenez, M.; Kuzin, E. A.
2017-10-01
We study experimentally the interactions between soliton and noise-like pulse (NLP) components in a mode-locked fibre ring laser operating in a hybrid soliton-NLP regime. For proper polarization adjustments, one NLP and multiple packets of solitons coexist in the cavity, at 1530 nm and 1558 nm, respectively. By examining time-domain sequences measured using a 16 GHz real-time oscilloscope, we unveil the process of soliton genesis: they are produced during extreme-intensity episodes affecting the NLP. These extreme events can emerge sporadically, appear in small groups or even form quasi-periodic sequences. Once formed, the wavelength-shifted soliton packet drifts away from the NLP in the dispersive cavity, and eventually vanishes after a variable lifetime. Evidence of the inverse process, through which NLP formation is occasionally seeded by an extreme-intensity event affecting a bunch of solitons, is also provided. The quasi-stationary dynamics described here constitutes an impressive illustration of the connections and interactions between NLPs, extreme events and solitons in passively mode-locked fibre lasers.
NASA Astrophysics Data System (ADS)
Waldhauser, F.; Schaff, D. P.
2012-12-01
Archives of digital seismic data recorded by seismometer networks around the world have grown tremendously over the last several decades helped by the deployment of seismic stations and their continued operation within the framework of monitoring earthquake activity and verification of the Nuclear Test-Ban Treaty. We show results from our continuing effort in developing efficient waveform cross-correlation and double-difference analysis methods for the large-scale processing of regional and global seismic archives to improve existing earthquake parameter estimates, detect seismic events with magnitudes below current detection thresholds, and improve real-time monitoring procedures. We demonstrate the performance of these algorithms as applied to the 28-year long seismic archive of the Northern California Seismic Network. The tools enable the computation of periodic updates of a high-resolution earthquake catalog of currently over 500,000 earthquakes using simultaneous double-difference inversions, achieving up to three orders of magnitude resolution improvement over existing hypocenter locations. This catalog, together with associated metadata, form the underlying relational database for a real-time double-difference scheme, DDRT, which rapidly computes high-precision correlation times and hypocenter locations of new events with respect to the background archive (http://ddrt.ldeo.columbia.edu). The DDRT system facilitates near-real-time seismicity analysis, including the ability to search at an unprecedented resolution for spatio-temporal changes in seismogenic properties. In areas with continuously recording stations, we show that a detector built around a scaled cross-correlation function can lower the detection threshold by one magnitude unit compared to the STA/LTA based detector employed at the network. This leads to increased event density, which in turn pushes the resolution capability of our location algorithms. On a global scale, we are currently building the computational framework for double-difference processing the combined parametric and waveform archives of the ISC, NEIC, and IRIS with over three million recorded earthquakes worldwide. Since our methods are scalable and run on inexpensive Beowulf clusters, periodic re-analysis of such archives may thus become a routine procedure to continuously improve resolution in existing global earthquake catalogs. Results from subduction zones and aftershock sequences of recent great earthquakes demonstrate the considerable social and economic impact that high-resolution images of active faults, when available in real-time, will have in the prompt evaluation and mitigation of seismic hazards. These results also highlight the need for consistent long-term seismic monitoring and archiving of records.
Near-Real-Time Earth Observation Data Supporting Wildfire Management
NASA Astrophysics Data System (ADS)
Ambrosia, V. G.; Zajkowski, T.; Quayle, B.
2013-12-01
During disaster events, the most critical element needed by responding personnel and management teams is situational intelligence / awareness. During rapidly-evolving events such as wildfires, the need for timely information is critical to save lives, property and resources. The wildfire management agencies in the US rely heavily on remote sensing information both from airborne platforms as well as from orbital assets. The ability to readily have information from those systems, not just data, is critical to effective control and damage mitigation. NASA has been collaborating with the USFS to mature and operationalize various asset-information capabilities to effect improved knowledge of fire-prone areas, monitor wildfire events in real-time, assess effectiveness of fire management strategies, and provide rapid, post-fire assessment for recovery operations. Specific examples of near-real-time remote sensing asset utility include daily MODIS data employed to assess fire potential / wildfire hazard areas, and national-scale hot-spot detection, airborne thermal sensor collected during wildfire events to effect management strategies, EO-1 ALI 'pointable' satellite sensor data to assess fire-retardant application effectiveness, and Landsat 8 and other sensor data to derive burn severity indices for post-fire remediation work. These cases of where near-real-time data is used operationally during the previous few fire seasons will be presented.
First LHCb measurement with data from the LHC Run 2
NASA Astrophysics Data System (ADS)
Anderlini, L.; Amerio, S.
2017-01-01
LHCb has recently introduced a novel real-time detector alignment and calibration strategy for the Run 2. Data collected at the start of each LHC fill are processed in few minutes and used to update the alignment. On the other hand, the calibration constants will be evaluated for each run of data taking. An increase in the CPU and disk capacity of the event filter farm, combined with improvements to the reconstruction software, allow for efficient, exclusive selections already in the first stage of the High Level Trigger (HLT1), while the second stage, HLT2, performs complete, offline-quality, event reconstruction. In Run 2, LHCb will collect the largest data sample of charm mesons ever recorded. Novel data processing and analysis techniques are required to maximise the physics potential of this data sample with the available computing resources, taking into account data preservation constraints. In this write-up, we describe the full analysis chain used to obtain important results analysing the data collected in proton-proton collisions in 2015, such as the J/ψ and open charm production cross-sections, and consider the further steps required to obtain real-time results after the LHCb upgrade.
NASA Astrophysics Data System (ADS)
Stagnaro, Mattia; Colli, Matteo; Lanza, Luca Giovanni; Chan, Pak Wai
2016-11-01
Eight rainfall events recorded from May to September 2013 at Hong Kong International Airport (HKIA) have been selected to investigate the performance of post-processing algorithms used to calculate the rainfall intensity (RI) from tipping-bucket rain gauges (TBRGs). We assumed a drop-counter catching-type gauge as a working reference and compared rainfall intensity measurements with two calibrated TBRGs operated at a time resolution of 1 min. The two TBRGs differ in their internal mechanics, one being a traditional single-layer dual-bucket assembly, while the other has two layers of buckets. The drop-counter gauge operates at a time resolution of 10 s, while the time of tipping is recorded for the two TBRGs. The post-processing algorithms employed for the two TBRGs are based on the assumption that the tip volume is uniformly distributed over the inter-tip period. A series of data of an ideal TBRG is reconstructed using the virtual time of tipping derived from the drop-counter data. From the comparison between the ideal gauge and the measurements from the two real TBRGs, the performances of different post-processing and correction algorithms are statistically evaluated over the set of recorded rain events. The improvement obtained by adopting the inter-tip time algorithm in the calculation of the RI is confirmed. However, by comparing the performance of the real and ideal TBRGs, the beneficial effect of the inter-tip algorithm is shown to be relevant for the mid-low range (6-50 mm
NASA Astrophysics Data System (ADS)
Hibert, Clement; Stumpf, André; Provost, Floriane; Malet, Jean-Philippe
2017-04-01
In the past decades, the increasing quality of seismic sensors and capability to transfer remotely large quantity of data led to a fast densification of local, regional and global seismic networks for near real-time monitoring of crustal and surface processes. This technological advance permits the use of seismology to document geological and natural/anthropogenic processes (volcanoes, ice-calving, landslides, snow and rock avalanches, geothermal fields), but also led to an ever-growing quantity of seismic data. This wealth of seismic data makes the construction of complete seismicity catalogs, which include earthquakes but also other sources of seismic waves, more challenging and very time-consuming as this critical pre-processing stage is classically done by human operators and because hundreds of thousands of seismic signals have to be processed. To overcome this issue, the development of automatic methods for the processing of continuous seismic data appears to be a necessity. The classification algorithm should satisfy the need of a method that is robust, precise and versatile enough to be deployed to monitor the seismicity in very different contexts. In this study, we evaluate the ability of machine learning algorithms for the analysis of seismic sources at the Piton de la Fournaise volcano being Random Forest and Deep Neural Network classifiers. We gather a catalog of more than 20,000 events, belonging to 8 classes of seismic sources. We define 60 attributes, based on the waveform, the frequency content and the polarization of the seismic waves, to parameterize the seismic signals recorded. We show that both algorithms provide similar positive classification rates, with values exceeding 90% of the events. When trained with a sufficient number of events, the rate of positive identification can reach 99%. These very high rates of positive identification open the perspective of an operational implementation of these algorithms for near-real time monitoring of mass movements and other environmental sources at the local, regional and even global scale.
CHELSI: a portable neutron spectrometer for the 20-800 MeV region.
McLean, T D; Olsher, R H; Romero, L L; Miles, L H; Devine, R T; Fallu-Labruyere, A; Grudberg, P
2007-01-01
CHELSI is a CsI-based portable spectrometer being developed at Los Alamos National Laboratory for use in high-energy neutron fields. Based on the inherent pulse shape discrimination properties of CsI(Tl), the instrument flags charged particle events produced via neutron-induced spallation events. Scintillation events are processed in real time using digital signal processing and a conservative estimate of neutron dose rate is made based on the charged particle energy distribution. A more accurate dose estimate can be made by unfolding the 2D charged particle versus pulse height distribution to reveal the incident neutron spectrum from which dose is readily obtained. A prototype probe has been assembled and data collected in quasi-monoenergetic fields at The Svedberg Laboratory (TSL) in Uppsala as well as at the Los Alamos Neutron Science Center (LANSCE). Preliminary efforts at deconvoluting the shape/energy data using empirical response functions derived from time-of-flight measurements are described.
Liu, Chengyu; Zhao, Lina; Tang, Hong; Li, Qiao; Wei, Shoushui; Li, Jianqing
2016-08-01
False alarm (FA) rates as high as 86% have been reported in intensive care unit monitors. High FA rates decrease quality of care by slowing staff response times while increasing patient burdens and stresses. In this study, we proposed a rule-based and multi-channel information fusion method for accurately classifying the true or false alarms for five life-threatening arrhythmias: asystole (ASY), extreme bradycardia (EBR), extreme tachycardia (ETC), ventricular tachycardia (VTA) and ventricular flutter/fibrillation (VFB). The proposed method consisted of five steps: (1) signal pre-processing, (2) feature detection and validation, (3) true/false alarm determination for each channel, (4) 'real-time' true/false alarm determination and (5) 'retrospective' true/false alarm determination (if needed). Up to four signal channels, that is, two electrocardiogram signals, one arterial blood pressure and/or one photoplethysmogram signal were included in the analysis. Two events were set for the method validation: event 1 for 'real-time' and event 2 for 'retrospective' alarm classification. The results showed that 100% true positive ratio (i.e. sensitivity) on the training set were obtained for ASY, EBR, ETC and VFB types, and 94% for VTA type, accompanied by the corresponding true negative ratio (i.e. specificity) results of 93%, 81%, 78%, 85% and 50% respectively, resulting in the score values of 96.50, 90.70, 88.89, 92.31 and 64.90, as well as with a final score of 80.57 for event 1 and 79.12 for event 2. For the test set, the proposed method obtained the score of 88.73 for ASY, 77.78 for EBR, 89.92 for ETC, 67.74 for VFB and 61.04 for VTA types, with the final score of 71.68 for event 1 and 75.91 for event 2.
NASA Astrophysics Data System (ADS)
Douša, Jan; Dick, Galina; Kačmařík, Michal; Václavovic, Pavel; Pottiaux, Eric; Zus, Florian; Brenot, Hugues; Moeller, Gregor; Hinterberger, Fabian; Pacione, Rosa; Stuerze, Andrea; Eben, Kryštof; Teferle, Norman; Ding, Wenwu; Morel, Laurent; Kaplon, Jan; Hordyniec, Pavel; Rohm, Witold
2017-04-01
The COST Action ES1206 GNSS4SWEC addresses new exploitations of the synergy between developments in GNSS and meteorological communities. The Working Group 1 (Advanced GNSS processing techniques) deals with implementing and assessing new methods for GNSS tropospheric monitoring and precise positioning exploiting all modern GNSS constellations, signals, products etc. Besides other goals, WG1 coordinates development of advanced tropospheric products in support of weather numerical and non-numerical nowcasting. These are ultra-fast and high-resolution tropospheric products available in real time or in a sub-hourly fashion and parameters in support of monitoring an anisotropy of the troposphere, e.g. horizontal gradients and tropospheric slant path delays. This talk gives an overview of WG1 activities and, particularly, achievements in two activities, Benchmark and Real-time demonstration campaigns. For the Benchmark campaign a complex data set of GNSS observations and various meteorological data were collected for a two-month period in 2013 (May-June) which included severe weather events in central Europe. An initial processing of data sets from GNSS and numerical weather models (NWM) provided independently estimated reference parameters - ZTDs and tropospheric horizontal gradients. The comparison of horizontal tropospheric gradients from GNSS and NWM data demonstrated a very good agreement among independent solutions with negligible biases and an accuracy of about 0.5 mm. Visual comparisons of maps of zenith wet delays and tropospheric horizontal gradients showed very promising results for future exploitations of advanced GNSS tropospheric products in meteorological applications such as severe weather event monitoring and weather nowcasting. The Benchmark data set is also used for an extensive validation of line-of-sight tropospheric Slant Total Delays (STD) from GNSS, NWM-raytracing and Water Vapour Radiometer (WVR) solutions. Seven institutions delivered their STDs estimated based on GNSS observations processed using different software and strategies. STDs from NWM ray-tracing came from three institutions using four different NWM models. Results show generally a very good mutual agreement among all solutions from all techniques. The influence of adding not cleaned GNSS post-fit residuals, i.e. residuals that still contains non-tropospheric systematic effects such as multipath, to estimated STDs will be presented. The Real-time demonstration campaign aims at enhancing and assessing ultra-fast GNSS tropospheric products for severe weather and NWM nowcasting. Results are showed from real-time demonstrations as well as offline production simulating real-time using Benchmark campaign.
NASA Astrophysics Data System (ADS)
Larnier, H.; Sailhac, P.; Chambodut, A.
2018-01-01
Atmospheric electromagnetic waves created by global lightning activity contain information about electrical processes of the inner and the outer Earth. Large signal-to-noise ratio events are particularly interesting because they convey information about electromagnetic properties along their path. We introduce a new methodology to automatically detect and characterize lightning-based waves using a time-frequency decomposition obtained through the application of continuous wavelet transform. We focus specifically on three types of sources, namely, atmospherics, slow tails and whistlers, that cover the frequency range 10 Hz to 10 kHz. Each wave has distinguishable characteristics in the time-frequency domain due to source shape and dispersion processes. Our methodology allows automatic detection of each type of event in the time-frequency decomposition thanks to their specific signature. Horizontal polarization attributes are also recovered in the time-frequency domain. This procedure is first applied to synthetic extremely low frequency time-series with different signal-to-noise ratios to test for robustness. We then apply it on real data: three stations of audio-magnetotelluric data acquired in Guadeloupe, oversea French territories. Most of analysed atmospherics and slow tails display linear polarization, whereas analysed whistlers are elliptically polarized. The diversity of lightning activity is finally analysed in an audio-magnetotelluric data processing framework, as used in subsurface prospecting, through estimation of the impedance response functions. We show that audio-magnetotelluric processing results depend mainly on the frequency content of electromagnetic waves observed in processed time-series, with an emphasis on the difference between morning and afternoon acquisition. Our new methodology based on the time-frequency signature of lightning-induced electromagnetic waves allows automatic detection and characterization of events in audio-magnetotelluric time-series, providing the means to assess quality of response functions obtained through processing.
Solar Demon: near real-time solar eruptive event detection on SDO/AIA images
NASA Astrophysics Data System (ADS)
Kraaikamp, Emil; Verbeeck, Cis
Solar flares, dimmings and EUV waves have been observed routinely in extreme ultra-violet (EUV) images of the Sun since 1996. These events are closely associated with coronal mass ejections (CMEs), and therefore provide useful information for early space weather alerts. The Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA) generates such a massive dataset that it becomes impossible to find most of these eruptive events manually. Solar Demon is a set of automatic detection algorithms that attempts to solve this problem by providing both near real-time warnings of eruptive events and a catalog of characterized events. Solar Demon has been designed to detect and characterize dimmings, EUV waves, as well as solar flares in near real-time on SDO/AIA data. The detection modules are running continuously at the Royal Observatory of Belgium on both quick-look data and synoptic science data. The output of Solar Demon can be accessed in near real-time on the Solar Demon website, and includes images, movies, light curves, and the numerical evolution of several parameters. Solar Demon is the result of collaboration between the FP7 projects AFFECTS and COMESEP. Flare detections of Solar Demon are integrated into the COMESEP alert system. Here we present the Solar Demon detection algorithms and their output. We will focus on the algorithm and its operational implementation. Examples of interesting flare, dimming and EUV wave events, and general statistics of the detections made so far during solar cycle 24 will be presented as well.
Forensic Disaster Analysis in Near-real Time
NASA Astrophysics Data System (ADS)
Kunz, Michael; Zschau, Jochen; Wenzel, Friedemann; Khazai, Bijan; Kunz-Plapp, Tina; Trieselmann, Werner
2014-05-01
The impacts of extreme hydro-meteorological and geophysical events are controlled by various factors including severity of the event (intensity, duration, spatial extent), amplification with other phenomena (multihazard or cascading effects), interdependencies of technical systems and infrastructure, preparedness and resilience of the society. The Center for Disaster Management and Risk Reduction Technology (CEDIM) has adopted the comprehensive understanding of disasters and develops methodologies of near real-time FDA as a complementing component of the FORIN program of IRDR. The new research strategy 'Near Real-Time Forensic Disaster Analysis (FDA)' aims at scrutinizing disasters closely with a multi-disciplinary approach in order to assess the various aspects of disasters and to identify mechanisms most relevant for an extreme event to become a disaster (e.g., causal loss analysis). Recent technology developments - which have opened unprecedented opportunities for real-time hazard, vulnerability and loss assessment - are used for analyzing disasters and their impacts in combination with databases of historical events. The former covers modern empirical and analytical methods available in engineering and remote sensing for rapid impact assessments, rapid information extraction from crowd sourcing as well as rapid assessments of socio-economic impacts and economic losses. The event-driven science-based assessments of CEDIM are compiled based on interdisciplinary expertise and include the critical evaluation, assessment, validation, and quantification of an event. An important component of CEDIM's FDA is the near real-time approach which is expected to significantly speed up our understanding of natural disasters and be used to provide timely, relevant and valuable information to various user groups within their respective contexts. Currently, CEDIM has developed models and methodologies to assess different types of hazard. These approaches were applied to several disasters including, for example, Super Typhoon Haiyan/Yolanda (Nov. 2013), Central European Floods (June 2013), Hurricane Sandy (Oct. 2012), US Droughts (Summer 2012), or Typhoon Saola in Taiwan and Philippines (July 2012).
Graumann, Johannes; Scheltema, Richard A; Zhang, Yong; Cox, Jürgen; Mann, Matthias
2012-03-01
In the analysis of complex peptide mixtures by MS-based proteomics, many more peptides elute at any given time than can be identified and quantified by the mass spectrometer. This makes it desirable to optimally allocate peptide sequencing and narrow mass range quantification events. In computer science, intelligent agents are frequently used to make autonomous decisions in complex environments. Here we develop and describe a framework for intelligent data acquisition and real-time database searching and showcase selected examples. The intelligent agent is implemented in the MaxQuant computational proteomics environment, termed MaxQuant Real-Time. It analyzes data as it is acquired on the mass spectrometer, constructs isotope patterns and SILAC pair information as well as controls MS and tandem MS events based on real-time and prior MS data or external knowledge. Re-implementing a top10 method in the intelligent agent yields similar performance to the data dependent methods running on the mass spectrometer itself. We demonstrate the capabilities of MaxQuant Real-Time by creating a real-time search engine capable of identifying peptides "on-the-fly" within 30 ms, well within the time constraints of a shotgun fragmentation "topN" method. The agent can focus sequencing events onto peptides of specific interest, such as those originating from a specific gene ontology (GO) term, or peptides that are likely modified versions of already identified peptides. Finally, we demonstrate enhanced quantification of SILAC pairs whose ratios were poorly defined in survey spectra. MaxQuant Real-Time is flexible and can be applied to a large number of scenarios that would benefit from intelligent, directed data acquisition. Our framework should be especially useful for new instrument types, such as the quadrupole-Orbitrap, that are currently becoming available.
Graumann, Johannes; Scheltema, Richard A.; Zhang, Yong; Cox, Jürgen; Mann, Matthias
2012-01-01
In the analysis of complex peptide mixtures by MS-based proteomics, many more peptides elute at any given time than can be identified and quantified by the mass spectrometer. This makes it desirable to optimally allocate peptide sequencing and narrow mass range quantification events. In computer science, intelligent agents are frequently used to make autonomous decisions in complex environments. Here we develop and describe a framework for intelligent data acquisition and real-time database searching and showcase selected examples. The intelligent agent is implemented in the MaxQuant computational proteomics environment, termed MaxQuant Real-Time. It analyzes data as it is acquired on the mass spectrometer, constructs isotope patterns and SILAC pair information as well as controls MS and tandem MS events based on real-time and prior MS data or external knowledge. Re-implementing a top10 method in the intelligent agent yields similar performance to the data dependent methods running on the mass spectrometer itself. We demonstrate the capabilities of MaxQuant Real-Time by creating a real-time search engine capable of identifying peptides “on-the-fly” within 30 ms, well within the time constraints of a shotgun fragmentation “topN” method. The agent can focus sequencing events onto peptides of specific interest, such as those originating from a specific gene ontology (GO) term, or peptides that are likely modified versions of already identified peptides. Finally, we demonstrate enhanced quantification of SILAC pairs whose ratios were poorly defined in survey spectra. MaxQuant Real-Time is flexible and can be applied to a large number of scenarios that would benefit from intelligent, directed data acquisition. Our framework should be especially useful for new instrument types, such as the quadrupole-Orbitrap, that are currently becoming available. PMID:22171319
Sub-millisecond closed-loop feedback stimulation between arbitrary sets of individual neurons
Müller, Jan; Bakkum, Douglas J.; Hierlemann, Andreas
2012-01-01
We present a system to artificially correlate the spike timing between sets of arbitrary neurons that were interfaced to a complementary metal–oxide–semiconductor (CMOS) high-density microelectrode array (MEA). The system features a novel reprogrammable and flexible event engine unit to detect arbitrary spatio-temporal patterns of recorded action potentials and is capable of delivering sub-millisecond closed-loop feedback of electrical stimulation upon trigger events in real-time. The relative timing between action potentials of individual neurons as well as the temporal pattern among multiple neurons, or neuronal assemblies, is considered an important factor governing memory and learning in the brain. Artificially changing timings between arbitrary sets of spiking neurons with our system could provide a “knob” to tune information processing in the network. PMID:23335887
Mental time travel and the shaping of language.
Corballis, Michael C
2009-01-01
Episodic memory can be regarded as part of a more general system, unique to humans, for mental time travel, and the construction of future episodes. This allows more detailed planning than is afforded by the more general mechanisms of instinct, learning, and semantic memory. To be useful, episodic memory need not provide a complete or even a faithful record of past events, and may even be part of a process whereby we construct fictional accounts. The properties of language are aptly designed for the communication and sharing of episodes, and for the telling of stories; these properties include symbolic representation of the elements of real-world events, time markers, and combinatorial rules. Language and mental time travel probably co-evolved during the Pleistocene, when brain size increased dramatically.
LHCb detector and trigger performance in Run II
NASA Astrophysics Data System (ADS)
Francesca, Dordei
2017-12-01
The LHCb detector is a forward spectrometer at the LHC, designed to perform high precision studies of b- and c- hadrons. In Run II of the LHC, a new scheme for the software trigger at LHCb allows splitting the triggering of events into two stages, giving room to perform the alignment and calibration in real time. In the novel detector alignment and calibration strategy for Run II, data collected at the start of the fill are processed in a few minutes and used to update the alignment, while the calibration constants are evaluated for each run. This allows identical constants to be used in the online and offline reconstruction, thus improving the correlation between triggered and offline selected events. The required computing time constraints are met thanks to a new dedicated framework using the multi-core farm infrastructure for the trigger. The larger timing budget, available in the trigger, allows to perform the same track reconstruction online and offline. This enables LHCb to achieve the best reconstruction performance already in the trigger, and allows physics analyses to be performed directly on the data produced by the trigger reconstruction. The novel real-time processing strategy at LHCb is discussed from both the technical and operational point of view. The overall performance of the LHCb detector on the data of Run II is presented as well.
Wang, Yuanjia; Chen, Tianle; Zeng, Donglin
2016-01-01
Learning risk scores to predict dichotomous or continuous outcomes using machine learning approaches has been studied extensively. However, how to learn risk scores for time-to-event outcomes subject to right censoring has received little attention until recently. Existing approaches rely on inverse probability weighting or rank-based regression, which may be inefficient. In this paper, we develop a new support vector hazards machine (SVHM) approach to predict censored outcomes. Our method is based on predicting the counting process associated with the time-to-event outcomes among subjects at risk via a series of support vector machines. Introducing counting processes to represent time-to-event data leads to a connection between support vector machines in supervised learning and hazards regression in standard survival analysis. To account for different at risk populations at observed event times, a time-varying offset is used in estimating risk scores. The resulting optimization is a convex quadratic programming problem that can easily incorporate non-linearity using kernel trick. We demonstrate an interesting link from the profiled empirical risk function of SVHM to the Cox partial likelihood. We then formally show that SVHM is optimal in discriminating covariate-specific hazard function from population average hazard function, and establish the consistency and learning rate of the predicted risk using the estimated risk scores. Simulation studies show improved prediction accuracy of the event times using SVHM compared to existing machine learning methods and standard conventional approaches. Finally, we analyze two real world biomedical study data where we use clinical markers and neuroimaging biomarkers to predict age-at-onset of a disease, and demonstrate superiority of SVHM in distinguishing high risk versus low risk subjects.
Location of Microearthquakes in Various Noisy Environments Using Envelope Stacking
NASA Astrophysics Data System (ADS)
Oye, V.; Gharti, H.
2009-12-01
Monitoring of microearthquakes is routinely conducted in various environments such as hydrocarbon and geothermal reservoirs, mines, dams, seismically active faults, volcanoes, nuclear power plants and CO2 storages. In many of these cases the handled data is sensitive and the interpretation of the data may be vital. In some cases, such as during mining or hydraulic fracturing activities, the number of microearthquakes is very large with tens to thousands of events per hour. In others, almost no events occur during a week and furthermore, it might not be anticipated that many events occur at all. However, the general setup of seismic networks, including surface and downhole stations, is usually optimized to record as many microearthquakes as possible, thereby trying to lower the detection threshold of the network. This process is obviously limited to some extent. Most microearthquake location techniques take advantage of a combination of P- and S-wave onset times that often can be picked reliably in an automatic mode. Moreover, when using seismic wave onset times, sometimes in combination with seismic wave polarization, these methods are more accurate compared to migration-based location routines. However, many events cannot be located because their magnitude is too small, i.e. the P- and/or S-wave onset times cannot be picked accurately on a sufficient number of receivers. Nevertheless, these small events are important for the interpretation of the processes that are monitored and even an inferior estimate of event locations and strengths is valuable information. Moreover, the smaller the event the more often such events statistically occur and the more important such additional information becomes. In this study we try to enhance the performance of any microseismic network, providing additional estimates of event locations below the actual detection threshold. We present a migration-based event location method, where we project the recorded seismograms onto the ray coordinate system, which corresponds to a configuration of trial sources and the real receiver network. A time window of predefined length is centered on the arrival time of the related phase that is calculated for the same grid of trial locations. The area spanned by the time window below the computed envelope is stacked for each component (L, T, Q) individually. Subsequently, the objective function is formulated as the squared sum of the stacked values. To obtain the final location, we apply a robust global optimization routine called differential evolution, which provides the maximum value of the objective function. This method provides a complete algorithm with a minimum of control parameters making it suitable for automated processing. The method can be applied to both single and multi-component data, and either P or S or both phases can be used. As a result, this method allows for a flexible application to a wide range of data. Synthetic data were computed for a complex and heterogeneous model of an ore mine and we applied this method to real, observed microearthquake data.
Simplex and duplex event-specific analytical methods for functional biotech maize.
Lee, Seong-Hun; Kim, Su-Jeong; Yi, Bu-Young
2009-08-26
Analytical methods are very important in the control of genetically modified organism (GMO) labeling systems or living modified organism (LMO) management for biotech crops. Event-specific primers and probes were developed for qualitative and quantitative analysis for biotech maize event 3272 and LY 038 on the basis of the 3' flanking regions, respectively. The qualitative primers confirmed the specificity by a single PCR product and sensitivity to 0.05% as a limit of detection (LOD). Simplex and duplex quantitative methods were also developed using TaqMan real-time PCR. One synthetic plasmid was constructed from two taxon-specific DNA sequences of maize and two event-specific 3' flanking DNA sequences of event 3272 and LY 038 as reference molecules. In-house validation of the quantitative methods was performed using six levels of mixing samples, from 0.1 to 10.0%. As a result, the biases from the true value and the relative deviations were all within the range of +/-30%. Limits of quantitation (LOQs) of the quantitative methods were all 0.1% for simplex real-time PCRs of event 3272 and LY 038 and 0.5% for duplex real-time PCR of LY 038. This study reports that event-specific analytical methods were applicable for qualitative and quantitative analysis for biotech maize event 3272 and LY 038.
Event-by-event PET image reconstruction using list-mode origin ensembles algorithm
NASA Astrophysics Data System (ADS)
Andreyev, Andriy
2016-03-01
There is a great demand for real time or event-by-event (EBE) image reconstruction in emission tomography. Ideally, as soon as event has been detected by the acquisition electronics, it needs to be used in the image reconstruction software. This would greatly speed up the image reconstruction since most of the data will be processed and reconstructed while the patient is still undergoing the scan. Unfortunately, the current industry standard is that the reconstruction of the image would not start until all the data for the current image frame would be acquired. Implementing an EBE reconstruction for MLEM family of algorithms is possible, but not straightforward as multiple (computationally expensive) updates to the image estimate are required. In this work an alternative Origin Ensembles (OE) image reconstruction algorithm for PET imaging is converted to EBE mode and is investigated whether it is viable alternative for real-time image reconstruction. In OE algorithm all acquired events are seen as points that are located somewhere along the corresponding line-of-responses (LORs), together forming a point cloud. Iteratively, with a multitude of quasi-random shifts following the likelihood function the point cloud converges to a reflection of an actual radiotracer distribution with the degree of accuracy that is similar to MLEM. New data can be naturally added into the point cloud. Preliminary results with simulated data show little difference between regular reconstruction and EBE mode, proving the feasibility of the proposed approach.
NASA Astrophysics Data System (ADS)
Solanki, K.; Hauksson, E.; Kanamori, H.; Wu, Y.; Heaton, T.; Boese, M.
2007-12-01
We have implemented an on-site early warning algorithm using the infrastructure of the Caltech/USGS Southern California Seismic Network (SCSN). We are evaluating the real-time performance of the software system and the algorithm for rapid assessment of earthquakes. In addition, we are interested in understanding what parts of the SCSN need to be improved to make early warning practical. Our EEW processing system is composed of many independent programs that process waveforms in real-time. The codes were generated by using a software framework. The Pd (maximum displacement amplitude of P wave during the first 3sec) and Tau-c (a period parameter during the first 3 sec) values determined during the EEW processing are being forwarded to the California Integrated Seismic Network (CISN) web page for independent evaluation of the results. The on-site algorithm measures the amplitude of the P-wave (Pd) and the frequency content of the P-wave during the first three seconds (Tau-c). The Pd and the Tau-c values make it possible to discriminate between a variety of events such as large distant events, nearby small events, and potentially damaging nearby events. The Pd can be used to infer the expected maximum ground shaking. The method relies on data from a single station although it will become more reliable if readings from several stations are associated. To eliminate false triggers from stations with high background noise level, we have created per station Pd threshold configuration for the Pd/Tau-c algorithm. To determine appropriate values for the Pd threshold we calculate Pd thresholds for stations based on the information from the EEW logs. We have operated our EEW test system for about a year and recorded numerous earthquakes in the magnitude range from M3 to M5. Two recent examples are a M4.5 earthquake near Chatsworth and a M4.7 earthquake near Elsinore. In both cases, the Pd and Tau-c parameters were determined successfully within 10 to 20 sec of the arrival of the P-wave at the station. The Tau-c values predicted the magnitude within 0.1 and the predicted average peak-ground-motion was 0.7 cm/s and 0.6 cm/s. The delays in the system are caused mostly by the packetizing delay because our software system is based on processing miniseed packets. Most recently we have begun reducing the data latency using new qmaserv2 software for the Q330 Quanterra datalogger. We implemented qmaserv2 based multicast receiver software to receive the native 1 sec packets from the dataloggers. The receiver reads multicast packets from the network and writes them into shared memory area. This new software will fully take advantage of the capabilities of the Q330 datalogger and significantly reduce data latency for EEW system. We have also implemented a new EEW sub-system that compliments the currently running EEW system by associating Pd and Tau-c values from multiple stations. So far, we have implemented a new trigger generation algorithm for real-time processing for the sub-system, and are able to routinely locate events and determine magnitudes using the Pd and Tau-c values.
NASA Astrophysics Data System (ADS)
Brachet, N.; Mialle, P.; Brown, D.; Coyne, J.; Drob, D.; Virieux, J.; Garcés, M.
2009-04-01
The International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty (CTBTO) Preparatory Commission in Vienna is pursuing its automatic processing effort for the return of infrasound data processing into operations in 2009. Concurrently, work is also underway to further improve this process by enhancing the modeling of the infrasound propagation in the atmosphere and then by labeling the phases in order to improve the event categorization and location. In 2008, the IDC acquired WASP-3D Sph (Windy Atmospheric Sonic Propagation) (Virieux et al., 2004) a 3-D ray-tracing based long range propagation software that accounts for the heterogeneity of the atmosphere. Once adapted to the IDC environment, WASP-3 Sph has been used to improve the understanding of infrasound wave propagation and has been compared with the 1-D ray tracing Taupc software (Garcés and Drob, 2007) at the IDC. In addition to performing the infrasound propagation simulation, different atmospheric models are available at the IDC, either real-time: ECMWF (European Centre for Middle-range Weather Forecast), or empiric: HWM93 (Horizontal Wind Model) and HWM07 (Drob, 2008), used in their initial format or interpolated into G2S (Ground to Space) model. The IDC infrasound reference database is used for testing, comparing and validating the various propagation software and atmospheric specifications. Moreover all the performed simulations are giving feedback on the quality of the infrasound reference events and provide useful information to improve their location by refining infrasonic wave propagation characteristics. The results of this study are presented for a selection of reference events and they will help the IDC designing and defining short and mid-term enhancements of the infrasound automatic and interactive processing to take into account the spatial and temporal heterogeneities of the atmosphere.
Real-Time Research: An Experiment in the Design of Scholarship
ERIC Educational Resources Information Center
Zimmerman, Eric; Squire, Kurt; Steinkuehler, Constance; Dikkers, Seann
2009-01-01
This article reports on an unconventional collaborative event called Real-Time Research, a project that brought 25 participants together from radically divergent fields for a playful and somewhat improvisational investigation of what it means to do games and learning research. Real-Time Research took the form of a two-part workshop session at the…
Chang, Li-Chiu; Chen, Pin-An; Chang, Fi-John
2012-08-01
A reliable forecast of future events possesses great value. The main purpose of this paper is to propose an innovative learning technique for reinforcing the accuracy of two-step-ahead (2SA) forecasts. The real-time recurrent learning (RTRL) algorithm for recurrent neural networks (RNNs) can effectively model the dynamics of complex processes and has been used successfully in one-step-ahead forecasts for various time series. A reinforced RTRL algorithm for 2SA forecasts using RNNs is proposed in this paper, and its performance is investigated by two famous benchmark time series and a streamflow during flood events in Taiwan. Results demonstrate that the proposed reinforced 2SA RTRL algorithm for RNNs can adequately forecast the benchmark (theoretical) time series, significantly improve the accuracy of flood forecasts, and effectively reduce time-lag effects.
Fed-batch control based upon the measurement of intracellular NADH
NASA Technical Reports Server (NTRS)
Armiger, W. B.; Lee, J. F.; Montalvo, L. M.; Forro, J. R.
1987-01-01
A series of experiments demonstrating that on-line measurements of intracellular NADH by culture fluorescence can be used to monitor and control the fermentation process are described. A distinct advantage of intercellular NADH measurements over other monitoring techniques such as pH and dissolved oxygen is that it directly measures real time events occurring within the cell rather than changes in the environment. When coupled with other measurement parameters, it can provide a finer degree of sophistication in process control.
Expert systems for real-time monitoring and fault diagnosis
NASA Technical Reports Server (NTRS)
Edwards, S. J.; Caglayan, A. K.
1989-01-01
Methods for building real-time onboard expert systems were investigated, and the use of expert systems technology was demonstrated in improving the performance of current real-time onboard monitoring and fault diagnosis applications. The potential applications of the proposed research include an expert system environment allowing the integration of expert systems into conventional time-critical application solutions, a grammar for describing the discrete event behavior of monitoring and fault diagnosis systems, and their applications to new real-time hardware fault diagnosis and monitoring systems for aircraft.
Safety management for polluted confined space with IT system: a running case.
Hwang, Jing-Jang; Wu, Chien-Hsing; Zhuang, Zheng-Yun; Hsu, Yi-Chang
2015-01-01
This study traced a deployed real IT system to enhance occupational safety for a polluted confined space. By incorporating wireless technology, it automatically monitors the status of workers on the site and upon detected anomalous events, managers are notified effectively. The system, with a redefined standard operations process, is running well at one of Formosa Petrochemical Corporation's refineries. Evidence shows that after deployment, the system does enhance the safety level by real-time monitoring the workers and by managing well and controlling the anomalies. Therefore, such technical architecture can be applied to similar scenarios for safety enhancement purposes.
Headlines: Planet Earth: Improving Climate Literacy with Short Format News Videos
NASA Astrophysics Data System (ADS)
Tenenbaum, L. F.; Kulikov, A.; Jackson, R.
2012-12-01
One of the challenges of communicating climate science is the sense that climate change is remote and unconnected to daily life--something that's happening to someone else or in the future. To help face this challenge, NASA's Global Climate Change website http://climate.nasa.gov has launched a new video series, "Headlines: Planet Earth," which focuses on current climate news events. This rapid-response video series uses 3D video visualization technology combined with real-time satellite data and images, to throw a spotlight on real-world events.. The "Headlines: Planet Earth" news video products will be deployed frequently, ensuring timeliness. NASA's Global Climate Change Website makes extensive use of interactive media, immersive visualizations, ground-based and remote images, narrated and time-lapse videos, time-series animations, and real-time scientific data, plus maps and user-friendly graphics that make the scientific content both accessible and engaging to the public. The site has also won two consecutive Webby Awards for Best Science Website. Connecting climate science to current real-world events will contribute to improving climate literacy by making climate science relevant to everyday life.
Forecasting volcanic unrest using seismicity: The good, the bad and the time consuming
NASA Astrophysics Data System (ADS)
Salvage, Rebecca; Neuberg, Jurgen W.
2013-04-01
Volcanic eruptions are inherently unpredictable in nature, with scientists struggling to forecast the type and timing of events, in particular in real time scenarios. Current understanding suggests that the use of statistical patterns within precursory datasets of seismicity prior to eruptive events could hold the potential to be used as real time forecasting tools. They allow us to determine times of clear deviation in data, which might be indicative of volcanic unrest. The identification of low frequency seismic swarms and the acceleration of this seismicity prior to observed volcanic unrest may be key in developing forecasting tools. The development of these real time forecasting models which can be implemented at volcano observatories is of particular importance since the identification of early warning signals allows danger to the proximal population to be minimized. We concentrate on understanding the significance and development of these seismic swarms as unrest develops at the volcano. In particular, analysis of accelerations in event rate, amplitude and energy rates released by seismicity prior to eruption suggests that these are important indicators of developing unrest. Real time analysis of these parameters simultaneously allows possible improvements to forecasting models. Although more time and computationally intense, cross correlation techniques applied to continuous seismicity prior to volcanic unrest scenarios allows all significant seismic events to be analysed, rather than only those which can be detected by an automated identification system. This may allow a more accurate forecast since all precursory seismicity can be taken into account. In addition, the classification of seismic events based on spectral characteristics may allow us to isolate individual types of signals which are responsible for certain types of unrest. In this way, we may be able to better forecast the type of eruption that may ensue, or at least some of its prevailing characteristics.
Particle monitoring and control in vacuum processing equipment
NASA Astrophysics Data System (ADS)
Borden, Peter G., Dr.; Gregg, John
1989-10-01
Particle contamination during vacuum processes has emerged as the largest single source of yield loss in VLSI manufacturing. While a number of tools have been available to help understand the sources and nature of this contamination, only recently has it been possible to monitor free particle levels within vacuum equipment in real-time. As a result, a better picture is available of how particle contamination can affect a variety of processes. This paper reviews some of the work that has been done to monitor particles in vacuum loadlocks and in processes such as etching, sputtering and ion implantation. The aim has been to make free particles in vacuum equipment a measurable process parameter. Achieving this allows particles to be controlled using statistical process control. It will be shown that free particle levels in load locks correlate to wafer surface counts, device yield and process conditions, but that these levels are considerable higher during production than when dummy wafers are run to qualify a system. It will also be shown how real-time free particle monitoring can be used to monitor and control cleaning cycles, how major episodic events can be detected, and how data can be gathered in a format suitable for statistical process control.
Probabilistic and Evolutionary Early Warning System: concepts, performances, and case-studies
NASA Astrophysics Data System (ADS)
Zollo, A.; Emolo, A.; Colombelli, S.; Elia, L.; Festa, G.; Martino, C.; Picozzi, M.
2013-12-01
PRESTo (PRobabilistic and Evolutionary early warning SysTem) is a software platform for Earthquake Early Warning that integrates algorithms for real-time earthquake location, magnitude estimation and damage assessment into a highly configurable and easily portable package. In its regional configuration, the software processes, in real-time, the 3-component acceleration data streams coming from seismic stations, for P-waves arrival detection and, in the case a quite large event is occurring, can promptly performs event detection and location, magnitude estimation and peak ground-motion prediction at target sites. The regional approach has been integrated with a threshold-based early warning method that allows, in the very first seconds after a moderate-to-large earthquake, to identify the most Probable Damaged Zone starting from the real-time measurement at near-source stations located at increasing distances from the earthquake epicenter, of the peak displacement (Pd) and predominant period of P-waves (τc), over a few-second long window after the P-wave arrival. Thus, each recording site independently provides an evolutionary alert level, according to the Pd and τc it measured, through a decisional table. Since 2009, PRESTo has been under continuous real-time testing using data streaming from the Iripinia Seismic Network (Southern Italy) and has produced a bulletin of some hundreds low magnitude events, including all the M≥2.5 earthquakes occurred in that period in Irpinia. Recently, PRESTo has been also implemented at the accelerometric network and broad-band networks in South Korea and in Romania, and off-line tested in Iberian Peninsula, in Turkey, in Israel, and in Japan. The feasibility of an Early Warning System at national scale, is currently under testing by studying the performances of the PRESTo platform for the Italian Accelerometric Network. Moreover, PRESTo is under experimentation in order to provide alert in a high-school located in the neighborhood of Naples at about 100 km from the Irpinia region.
NASA Astrophysics Data System (ADS)
Keck, N. N.; Macduff, M.; Martin, T.
2017-12-01
The Atmospheric Radiation Measurement's (ARM) Data Management Facility (DMF) plays a critical support role in processing and curating data generated by the Department of Energy's ARM Program. Data are collected near real time from hundreds of observational instruments spread out all over the globe. Data are then ingested hourly to provide time series data in NetCDF (network Common Data Format) and includes standardized metadata. Based on automated processes and a variety of user reviews the data may need to be reprocessed. Final data sets are then stored and accessed by users through the ARM Archive. Over the course of 20 years, a suite of data visualization tools have been developed to facilitate the operational processes to manage and maintain the more than 18,000 real time events, that move 1.3 TB of data each day through the various stages of the DMF's data system. This poster will present the resources and methodology used to capture metadata and the tools that assist in routine data management and discoverability.
Complex effusive events at Kilauea as documented by the GOES satellite and remote video cameras
Harris, A.J.L.; Thornber, C.R.
1999-01-01
GOES provides thermal data for all of the Hawaiian volcanoes once every 15 min. We show how volcanic radiance time series produced from this data stream can be used as a simple measure of effusive activity. Two types of radiance trends in these time series can be used to monitor effusive activity: (a) Gradual variations in radiance reveal steady flow-field extension and tube development. (b) Discrete spikes correlate with short bursts of activity, such as lava fountaining or lava-lake overflows. We are confident that any effusive event covering more than 10,000 m2 of ground in less than 60 min will be unambiguously detectable using this approach. We demonstrate this capability using GOES, video camera and ground-based observational data for the current eruption of Kilauea volcano (Hawai'i). A GOES radiance time series was constructed from 3987 images between 19 June and 12 August 1997. This time series displayed 24 radiance spikes elevated more than two standard deviations above the mean; 19 of these are correlated with video-recorded short-burst effusive events. Less ambiguous events are interpreted, assessed and related to specific volcanic events by simultaneous use of permanently recording video camera data and ground-observer reports. The GOES radiance time series are automatically processed on data reception and made available in near-real-time, so such time series can contribute to three main monitoring functions: (a) automatically alerting major effusive events; (b) event confirmation and assessment; and (c) establishing effusive event chronology.
Negative Emotional Events that People Ruminate about Feel Closer in Time
Siedlecka, Ewa; Capper, Miriam M.; Denson, Thomas F.
2015-01-01
Rumination is intrusive, perseverative cognition. We suggest that one psychological consequence of ruminating about negative emotional events is that the events feel as though they happened metaphorically “just yesterday”. Results from three studies showed that ruminating about real world anger provocations, guilt-inducing events, and sad times in the last year made these past events feel as though they happened more recently. The relationship between rumination and reduced temporal psychological distance persisted even when controlling for when the event occurred and the emotional intensity of the event. Moreover, angry rumination was correlated with enhanced approach motivation, which mediated the rumination-distance relationship. The relationship between guilty rumination and distance was mediated by enhanced vividness. Construal level and taking a 3rd person perspective contributed to the sense of distance when participants were prompted to think about less emotionally charged situations. A meta-analysis of the data showed that the relationship between rumination and reduced distance was significant and twice as large as the same relationship for neutral events. These findings have implications for understanding the role of emotional rumination on memory processes in clinical populations and people prone to rumination. This research suggests that rumination may be a critical mechanism that keeps negative events close in the heart, mind, and time. PMID:25714395
An Integrated Monitoring System of Pre-earthquake Processes in Peloponnese, Greece
NASA Astrophysics Data System (ADS)
Karastathis, V. K.; Tsinganos, K.; Kafatos, M.; Eleftheriou, G.; Ouzounov, D.; Mouzakiotis, E.; Papadopoulos, G. A.; Voulgaris, N.; Bocchini, G. M.; Liakopoulos, S.; Aspiotis, T.; Gika, F.; Tselentis, A.; Moshou, A.; Psiloglou, B.
2017-12-01
One of the controversial issues in the contemporary seismology is the ability of radon accumulation monitoring to provide reliable earthquake forecasting. Although there are many examples in the literature showing radon increase before earthquakes, skepticism arises from instability of the measurements, false alarms, difficulties in interpretation caused by the weather influence (eg. rainfall) and difficulties on the consideration an irrefutable theoretical background of the phenomenon.We have developed and extensively tested a multi parameter network aimed for studying of the pre-earthquake processes and operating as a part of integrated monitoring system in the high seismicity area of the Western Hellenic Arc (SW Peloponnese, Greece). The prototype consists of four components: A real-time monitoring system of Radon accumulation. It consists of three gamma radiation detectors [NaI(Tl) scintillators] A nine-station seismic array to monitor the microseismicity in the offshore area of the Hellenic arc. The processing of the data is based on F-K and beam-forming techniques. Real-time weather monitoring systems for air temperature, relative humidity, precipitation and pressure. Thermal radiation emission from AVHRR/NOAA-18 polar orbit satellite observation. The project revolved around the idea of jointly studying the emission of Radon that has been proven in many cases as a reliable indicator of the possible time of an event, with the accurate location of the foreshock activity detected by the seismic array that can be a more reliable indicator of the possible position of an event. In parallel a satellite thermal anomaly detection technique has been used for monitoring of larger magnitude events (possible indicator for strong events M ≥5.0.). The first year of operations revealed a number of pre-seismic radon variation anomalies before several local earthquakes (M>3.6). The Radon increases systematically before the larger events.Details about the overall performance in registration of pre-seismic signals in Peloponnese region, along with two distant but very strong earthquakes in Jun 12, 2017 M6.3 and Jul 20, 2017 M6.6 in Greece will be discussed.
Rapid Characterization of Large Earthquakes in Chile
NASA Astrophysics Data System (ADS)
Barrientos, S. E.; Team, C.
2015-12-01
Chile, along 3000 km of it 4200 km long coast, is regularly affected by very large earthquakes (up to magnitude 9.5) resulting from the convergence and subduction of the Nazca plate beneath the South American plate. These megathrust earthquakes exhibit long rupture regions reaching several hundreds of km with fault displacements of several tens of meters. Minimum delay characterization of these giant events to establish their rupture extent and slip distribution is of the utmost importance for rapid estimations of the shaking area and their corresponding tsunami-genic potential evaluation, particularly when there are only few minutes to warn the coastal population for immediate actions. The task of a rapid evaluation of large earthquakes is accomplished in Chile through a network of sensors being implemented by the National Seismological Center of the University of Chile. The network is mainly composed approximately by one hundred broad-band and strong motion instruments and 130 GNSS devices; all will be connected in real time. Forty units present an optional RTX capability, where satellite orbits and clock corrections are sent to the field device producing a 1-Hz stream at 4-cm level. Tests are being conducted to stream the real-time raw data to be later processed at the central facility. Hypocentral locations and magnitudes are estimated after few minutes by automatic processing software based on wave arrival; for magnitudes less than 7.0 the rapid estimation works within acceptable bounds. For larger events, we are currently developing automatic detectors and amplitude estimators of displacement coming out from the real time GNSS streams. This software has been tested for several cases showing that, for plate interface events, the minimum magnitude threshold detectability reaches values within 6.2 and 6.5 (1-2 cm coastal displacement), providing an excellent tool for earthquake early characterization from a tsunamigenic perspective.
NASA Astrophysics Data System (ADS)
Rothacher, Markus
2017-04-01
Mankind is constantly threatened by a variety of natural disasters and global change phenomena. In order to be able to better predict and assess these catastrophic and disastrous events a continuous observation and monitoring of the causative Earth processes is a necessity. These processes may happen in time scales from extremely short (earthquakes, volcano eruptions, land slides, ...) to very long (melting of ice sheets, sea level change, plate tectonics, ...). Appropriate monitoring and early warning systems must allow, therefore, the detection and quantification of catastrophic events in (near) real-time on the one hand and the reliable identification of barely noticeable, but crucial long-term trends (e.g., sea level rise) on the other hand. The Global Geodetic Observing System (GGOS), established by the International Association of Geodesy (IAG) in 2003, already now contributes in a multitude of ways to meet this challenge, e.g., by providing a highly accurate and stable global reference frame, without which the measurement of a sea level rise of 2-3 mm/y would not be possible; by measuring displacements in near real-time and deformations over decades that offer valuable clues to plate tectonics, earthquake processes, tsunamis, volcanos, land slides, and glaciers dynamics; by observing the mass loss of ice sheets with gravity satellite missions; and by estimating essential variables such as the amount of water vapor in the troposphere relevant for weather predictions and climate and the content of free electrons in the ionosphere crucial for space weather.
A Control Chart Approach for Representing and Mining Data Streams with Shape Based Similarity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Omitaomu, Olufemi A
The mining of data streams for online condition monitoring is a challenging task in several domains including (electric) power grid system, intelligent manufacturing, and consumer science. Considering a power grid application in which thousands of sensors, called the phasor measurement units, are deployed on the power grid network to continuously collect streams of digital data for real-time situational awareness and system management. Depending on design, each sensor could stream between ten and sixty data samples per second. The myriad of sensory data captured could convey deeper insights about sequence of events in real-time and before major damages are done. However,more » the timely processing and analysis of these high-velocity and high-volume data streams is a challenge. Hence, a new data processing and transformation approach, based on the concept of control charts, for representing sequence of data streams from sensors is proposed. In addition, an application of the proposed approach for enhancing data mining tasks such as clustering using real-world power grid data streams is presented. The results indicate that the proposed approach is very efficient for data streams storage and manipulation.« less
Real time video analysis to monitor neonatal medical condition
NASA Astrophysics Data System (ADS)
Shirvaikar, Mukul; Paydarfar, David; Indic, Premananda
2017-05-01
One in eight live births in the United States is premature and these infants have complications leading to life threatening events such as apnea (pauses in breathing), bradycardia (slowness of heart) and hypoxia (oxygen desaturation). Infant movement pattern has been hypothesized as an important predictive marker for these life threatening events. Thus estimation of movement along with behavioral states, as a precursor of life threatening events, can be useful for risk stratification of infants as well as for effective management of disease state. However, more important and challenging is the determination of the behavioral state of the infant. This information includes important cues such as sleep position and the status of the eyes, which are important markers for neonatal neurodevelopment state. This paper explores the feasibility of using real time video analysis to monitor the condition of premature infants. The image of the infant can be segmented into regions to localize and focus on specific areas of interest. Analysis of the segmented regions can be performed to identify different parts of the body including the face, arms, legs and torso. This is necessary due to real-time processing speed considerations. Such a monitoring system would be of great benefit as an aide to medical staff in neonatal hospital settings requiring constant surveillance. Any such system would have to satisfy extremely stringent reliability and accuracy requirements, before it can be deployed in a hospital care unit, due to obvious reasons. The effect of lighting conditions and interference will have to be mitigated to achieve such performance.
Nonparametric Bayesian Segmentation of a Multivariate Inhomogeneous Space-Time Poisson Process.
Ding, Mingtao; He, Lihan; Dunson, David; Carin, Lawrence
2012-12-01
A nonparametric Bayesian model is proposed for segmenting time-evolving multivariate spatial point process data. An inhomogeneous Poisson process is assumed, with a logistic stick-breaking process (LSBP) used to encourage piecewise-constant spatial Poisson intensities. The LSBP explicitly favors spatially contiguous segments, and infers the number of segments based on the observed data. The temporal dynamics of the segmentation and of the Poisson intensities are modeled with exponential correlation in time, implemented in the form of a first-order autoregressive model for uniformly sampled discrete data, and via a Gaussian process with an exponential kernel for general temporal sampling. We consider and compare two different inference techniques: a Markov chain Monte Carlo sampler, which has relatively high computational complexity; and an approximate and efficient variational Bayesian analysis. The model is demonstrated with a simulated example and a real example of space-time crime events in Cincinnati, Ohio, USA.
Current Development at the Southern California Earthquake Data Center (SCEDC)
NASA Astrophysics Data System (ADS)
Appel, V. L.; Clayton, R. W.
2005-12-01
Over the past year, the SCEDC completed or is near completion of three featured projects: Station Information System (SIS) Development: The SIS will provide users with an interface into complete and accurate station metadata for all current and historic data at the SCEDC. The goal of this project is to develop a system that can interact with a single database source to enter, update and retrieve station metadata easily and efficiently. The system will provide accurate station/channel information for active stations to the SCSN real-time processing system, as will as station/channel information for stations that have parametric data at the SCEDC i.e., for users retrieving data via STP. Additionally, the SIS will supply information required to generate dataless SEED and COSMOS V0 volumes and allow stations to be added to the system with a minimum, but incomplete set of information using predefined defaults that can be easily updated as more information becomes available. Finally, the system will facilitate statewide metadata exchange for both real-time processing and provide a common approach to CISN historic station metadata. Moment Tensor Solutions: The SCEDC is currently archiving and delivering Moment Magnitudes and Moment Tensor Solutions (MTS) produced by the SCSN in real-time and post-processing solutions for events spanning back to 1999. The automatic MTS runs on all local events with magnitudes > 3.0, and all regional events > 3.5. The distributed solution automatically creates links from all USGS Simpson Maps to a text e-mail summary solution, creates a .gif image of the solution, and updates the moment tensor database tables at the SCEDC. Searchable Scanned Waveforms Site: The Caltech Seismological Lab has made available 12,223 scanned images of pre-digital analog recordings of major earthquakes recorded in Southern California between 1962 and 1992 at http://www.data.scec.org/research/scans/. The SCEDC has developed a searchable web interface that allows users to search the available files, select multiple files for download and then retrieve a zipped file containing the results. Scanned images of paper records for M>3.5 southern California earthquakes and several significant teleseisms are available for download via the SCEDC through this search tool.
NASA Astrophysics Data System (ADS)
Nanda, Trushnamayee; Beria, Harsh; Sahoo, Bhabagrahi; Chatterjee, Chandranath
2016-04-01
Increasing frequency of hydrologic extremes in a warming climate call for the development of reliable flood forecasting systems. The unavailability of meteorological parameters in real-time, especially in the developing parts of the world, makes it a challenging task to accurately predict flood, even at short lead times. The satellite-based Tropical Rainfall Measuring Mission (TRMM) provides an alternative to the real-time precipitation data scarcity. Moreover, rainfall forecasts by the numerical weather prediction models such as the medium term forecasts issued by the European Center for Medium range Weather Forecasts (ECMWF) are promising for multistep-ahead flow forecasts. We systematically evaluate these rainfall products over a large catchment in Eastern India (Mahanadi River basin). We found spatially coherent trends, with both the real-time TRMM rainfall and ECMWF rainfall forecast products overestimating low rainfall events and underestimating high rainfall events. However, no significant bias was found for the medium rainfall events. Another key finding was that these rainfall products captured the phase of the storms pretty well, but suffered from consistent under-prediction. The utility of the real-time TRMM and ECMWF forecast products are evaluated by rainfall-runoff modeling using different artificial neural network (ANN)-based models up to 3-days ahead. Keywords: TRMM; ECMWF; forecast; ANN; rainfall-runoff modeling
ACE: AMY CDC (central drift chamber) fast track finder
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mori, T.
1988-01-01
The central drift chamber (CDC) of the AMY detector at the TRISTAN e/sup /+//e/sup /-// collider features its fine granularity and multi-band structure. The tracking software named ACE which makes the most of these features shows an excellent performance for reconstruction of high multiplicity events with highly collimated jets. The obtained reconstruction efficiency is 97% for the particles coming from within 5 cm of the primary vertex with p/sub t/ /approx gt/ 500 MeVc in the simulated hadronic events. The processing time is on average less than 300 ms per hadronic event (simulated or real) on a FACOM M-382 computer.more » 3 refs., 5 figs.« less
Real-time Bayesian anomaly detection in streaming environmental data
NASA Astrophysics Data System (ADS)
Hill, David J.; Minsker, Barbara S.; Amir, Eyal
2009-04-01
With large volumes of data arriving in near real time from environmental sensors, there is a need for automated detection of anomalous data caused by sensor or transmission errors or by infrequent system behaviors. This study develops and evaluates three automated anomaly detection methods using dynamic Bayesian networks (DBNs), which perform fast, incremental evaluation of data as they become available, scale to large quantities of data, and require no a priori information regarding process variables or types of anomalies that may be encountered. This study investigates these methods' abilities to identify anomalies in eight meteorological data streams from Corpus Christi, Texas. The results indicate that DBN-based detectors, using either robust Kalman filtering or Rao-Blackwellized particle filtering, outperform a DBN-based detector using Kalman filtering, with the former having false positive/negative rates of less than 2%. These methods were successful at identifying data anomalies caused by two real events: a sensor failure and a large storm.
Continuous monitoring of water flow and solute transport using vadose zone monitoring technology
NASA Astrophysics Data System (ADS)
Dahan, O.
2009-04-01
Groundwater contamination is usually attributed to pollution events that initiate on land surface. These may be related to various sources such as industrial, urban or agricultural, and may appear as point or non point sources, through a single accidental event or a continuous pollution process. In all cases, groundwater pollution is a consequence of pollutant transport processes that take place in the vadose zone above the water table. Attempts to control pollution events and prevent groundwater contamination usually involve groundwater monitoring programs. This, however, can not provide any protection against contamination since pollution identification in groundwater is clear evidence that the groundwater is already polluted and contaminants have already traversed the entire vadose zone. Accordingly, an efficient monitoring program that aims at providing information that may prevent groundwater pollution has to include vadose-zone monitoring systems. Such system should provide real-time information on the hydrological and chemical properties of the percolating water and serve as an early warning system capable of detecting pollution events in their early stages before arrival of contaminants to groundwater. Recently, a vadose-zone monitoring system (VMS) was developed to allow continuous monitoring of the hydrological and chemical properties of percolating water in the deep vadose zone. The VMS includes flexible time-domain reflectometry (FTDR) probes for continuous tracking of water content profiles, and vadose-zone sampling ports (VSPs) for frequent sampling of the deep vadose pore water at multiple depths. The monitoring probes and sampling ports are installed through uncased slanted boreholes using a flexible sleeve that allows attachment of the monitoring devices to the borehole walls while achieving good contact between the sensors and the undisturbed sediment column. The system has been successfully implemented in several studies on water flow and contaminant transport in various hydrological and geological setups. These include floodwater infiltration in arid environments, land use impact on groundwater quality, and control of remediation process in a contaminated vadose zone. The data which is collected by the VMS allows direct measurements of flow velocities and fluxes in the vadose zone while continuously monitoring the chemical evolution of the percolating water. While real time information on the hydrological and chemical properties of the percolating water in the vadose is essential to prevent groundwater contamination it is also vital for any remediation actions. Remediation of polluted soils and aquifers essentially involves manipulation of surface and subsurface hydrological, physical and biochemical conditions to improve pollutant attenuation. Controlling the biochemical conditions to enhance biodegradation often includes introducing degrading microorganisms, applying electron donors or acceptors, or adding nutrients that can promote growth of the desired degrading organisms. Accordingly real time data on the hydrological and chemical properties of the vadose zone may be used to select remediation strategies and determine its efficiency on the basis of real time information.
A model of seismic coda arrivals to suppress spurious events.
NASA Astrophysics Data System (ADS)
Arora, N.; Russell, S.
2012-04-01
We describe a model of coda arrivals which has been added to NET-VISA (Network processing Vertically Integrated Seismic Analysis) our probabilistic generative model of seismic events, their transmission, and detection on a global seismic network. The scattered energy that follows a seismic phase arrival tends to deceive typical STA/LTA based arrival picking software into believing that a real seismic phase has been detected. These coda arrivals which tend to follow all seismic phases cause most network processing software including NET-VISA to believe that multiple events have taken place. It is not a simple matter of ignoring closely spaced arrivals since arrivals from multiple events can indeed overlap. The current practice in NET-VISA of pruning events within a small space-time neighborhood of a larger event works reasonably well, but it may mask real events produced in an after-shock sequence. Our new model allows any seismic arrival, even coda arrivals, to trigger a subsequent coda arrival. The probability of such a triggered arrival depends on the amplitude of the triggering arrival. Although real seismic phases are more likely to generate such coda arrivals. Real seismic phases also tend to generate coda arrivals with more strongly correlated parameters, for example azimuth and slowness. However, the SNR (Signal to Noise Ratio) of a coda arrival immediately following a phase arrival tends to be lower because of the nature of the SNR calculation. We have calibrated our model on historical statistics of such triggered arrivals and our inference accounts for them while searching for the best explanation of seismic events their association to the arrivals and the coda arrivals. We have tested our new model on one week of global seismic data spanning March 22, 2009 to March 29, 2009. Our model was trained on two and half months of data from April 5, 2009 to June 20, 2009. We use the LEB bulletin produced by the IDC (International Data Center) as the ground truth and computed the precision (percentage of reported events which are true) and recall (percentage of true events which are reported). The existing model has a precision of 32.2 and recall of 88.6 which changes to a precision of 50.7 and recall of 88.5 after pruning. The new model has a precision of 56.8 and recall of 86.9 without any pruning and the corresponding precision recall curve is dramatically improved. In contrast, the performance of the current automated bulletin at the IDC, SEL3, has a precision of 46.2 and recall of 69.7.
Stadler, H; Klock, E; Skritek, P; Mach, R L; Zerobin, W; Farnleitner, A H
2010-01-01
Because spring water quality from alpine karst aquifers can change very rapidly during event situations, water abstraction management has to be performed in near real-time. Four summer events (2005-2008) at alpine karst springs were investigated in detail in order to evaluate the spectral absorption coefficient at 254 nm (SAC254) as a real-time early warning proxy for faecal pollution. For the investigation Low-Earth-Orbit (LEO) Satellite-based data communication between portable hydrometeorological measuring stations and an automated microbiological sampling device was used. The method for event triggered microbial sampling and analyzing was already established and described in a previous paper. Data analysis including on-line event characterisation (i.e. precipitation, discharge, turbidity, SAC254) and comprehensive E. coli determination (n>800) indicated that SAC254 is a useful early warning proxy. Irrespective of the studied event situations SAC254 always increased 3 to 6 hours earlier than the onset of faecal pollution, featuring different correlation phases. Furthermore, it seems also possible to use SAC254 as a real-time proxy parameter for estimating the extent of faecal pollution after establishing specific spring and event-type calibrations that take into consideration the variability of the occurrence and the transferability of faecal material It should be highlighted that diffuse faecal pollution from wildlife and live stock sources was responsible for spring water contamination at the investigated catchments. In this respect, the SAC254 can also provide useful information to support microbial source tracking efforts where different situations of infiltration have to be investigated.
Ni, Yizhao; Lingren, Todd; Hall, Eric S; Leonard, Matthew; Melton, Kristin; Kirkendall, Eric S
2018-05-01
Timely identification of medication administration errors (MAEs) promises great benefits for mitigating medication errors and associated harm. Despite previous efforts utilizing computerized methods to monitor medication errors, sustaining effective and accurate detection of MAEs remains challenging. In this study, we developed a real-time MAE detection system and evaluated its performance prior to system integration into institutional workflows. Our prospective observational study included automated MAE detection of 10 high-risk medications and fluids for patients admitted to the neonatal intensive care unit at Cincinnati Children's Hospital Medical Center during a 4-month period. The automated system extracted real-time medication use information from the institutional electronic health records and identified MAEs using logic-based rules and natural language processing techniques. The MAE summary was delivered via a real-time messaging platform to promote reduction of patient exposure to potential harm. System performance was validated using a physician-generated gold standard of MAE events, and results were compared with those of current practice (incident reporting and trigger tools). Physicians identified 116 MAEs from 10 104 medication administrations during the study period. Compared to current practice, the sensitivity with automated MAE detection was improved significantly from 4.3% to 85.3% (P = .009), with a positive predictive value of 78.0%. Furthermore, the system showed potential to reduce patient exposure to harm, from 256 min to 35 min (P < .001). The automated system demonstrated improved capacity for identifying MAEs while guarding against alert fatigue. It also showed promise for reducing patient exposure to potential harm following MAE events.
Deep neural networks to enable real-time multimessenger astrophysics
NASA Astrophysics Data System (ADS)
George, Daniel; Huerta, E. A.
2018-02-01
Gravitational wave astronomy has set in motion a scientific revolution. To further enhance the science reach of this emergent field of research, there is a pressing need to increase the depth and speed of the algorithms used to enable these ground-breaking discoveries. We introduce Deep Filtering—a new scalable machine learning method for end-to-end time-series signal processing. Deep Filtering is based on deep learning with two deep convolutional neural networks, which are designed for classification and regression, to detect gravitational wave signals in highly noisy time-series data streams and also estimate the parameters of their sources in real time. Acknowledging that some of the most sensitive algorithms for the detection of gravitational waves are based on implementations of matched filtering, and that a matched filter is the optimal linear filter in Gaussian noise, the application of Deep Filtering using whitened signals in Gaussian noise is investigated in this foundational article. The results indicate that Deep Filtering outperforms conventional machine learning techniques, achieves similar performance compared to matched filtering, while being several orders of magnitude faster, allowing real-time signal processing with minimal resources. Furthermore, we demonstrate that Deep Filtering can detect and characterize waveform signals emitted from new classes of eccentric or spin-precessing binary black holes, even when trained with data sets of only quasicircular binary black hole waveforms. The results presented in this article, and the recent use of deep neural networks for the identification of optical transients in telescope data, suggests that deep learning can facilitate real-time searches of gravitational wave sources and their electromagnetic and astroparticle counterparts. In the subsequent article, the framework introduced herein is directly applied to identify and characterize gravitational wave events in real LIGO data.
An Experimental Seismic Data and Parameter Exchange System for Tsunami Warning Systems
NASA Astrophysics Data System (ADS)
Hoffmann, T. L.; Hanka, W.; Saul, J.; Weber, B.; Becker, J.; Heinloo, A.; Hoffmann, M.
2009-12-01
For several years GFZ Potsdam is operating a global earthquake monitoring system. Since the beginning of 2008, this system is also used as an experimental seismic background data center for two different regional Tsunami Warning Systems (TWS), the IOTWS (Indian Ocean) and the interim NEAMTWS (NE Atlantic and Mediterranean). The SeisComP3 (SC3) software, developed within the GITEWS (German Indian Ocean Tsunami Early Warning System) project, capable to acquire, archive and process real-time data feeds, was extended for export and import of individual processing results within the two clusters of connected SC3 systems. Therefore not only real-time waveform data are routed to the attached warning centers through GFZ but also processing results. While the current experimental NEAMTWS cluster consists of SC3 systems in six designated national warning centers in Europe, the IOTWS cluster presently includes seven centers, with another three likely to join in 2009/10. For NEAMTWS purposes, the GFZ virtual real-time seismic network (GEOFON Extended Virtual Network -GEVN) in Europe was substantially extended by adding many stations from Western European countries optimizing the station distribution. In parallel to the data collection over the Internet, a GFZ VSAT hub for secured data collection of the EuroMED GEOFON and NEAMTWS backbone network stations became operational and first data links were established through this backbone. For the Southeast Asia region, a VSAT hub has been established in Jakarta already in 2006, with some other partner networks connecting to this backbone via the Internet. Since its establishment, the experimental system has had the opportunity to prove its performance in a number of relevant earthquakes. Reliable solutions derived from a minimum of 25 stations were very promising in terms of speed. For important events, automatic alerts were released and disseminated by emails and SMS. Manually verified solutions are added as soon as they become available. The results are also promising in terms of accuracy since epicenter coordinates, depth and magnitude estimates were sufficiently accurate from the very beginning, and usually do not differ substantially from the final solutions. In summary, automatic seismic event processing has shown to work well as a first step for starting a Tsunami Warning process. However, for the secured assessment of the tsunami potential of a given event, 24/7-manned regional TWCs are mandatory for reliable manual verification of the automatic seismic results. At this time, GFZ itself provides manual verification only when staff is available, not on a 24/7 basis, while the actual national tsunami warning centers have all a reliable 24/7 service.
Post-processing of seismic parameter data based on valid seismic event determination
McEvilly, Thomas V.
1985-01-01
An automated seismic processing system and method are disclosed, including an array of CMOS microprocessors for unattended battery-powered processing of a multi-station network. According to a characterizing feature of the invention, each channel of the network is independently operable to automatically detect, measure times and amplitudes, and compute and fit Fast Fourier transforms (FFT's) for both P- and S- waves on analog seismic data after it has been sampled at a given rate. The measured parameter data from each channel are then reviewed for event validity by a central controlling microprocessor and if determined by preset criteria to constitute a valid event, the parameter data are passed to an analysis computer for calculation of hypocenter location, running b-values, source parameters, event count, P- wave polarities, moment-tensor inversion, and Vp/Vs ratios. The in-field real-time analysis of data maximizes the efficiency of microearthquake surveys allowing flexibility in experimental procedures, with a minimum of traditional labor-intensive postprocessing. A unique consequence of the system is that none of the original data (i.e., the sensor analog output signals) are necessarily saved after computation, but rather, the numerical parameters generated by the automatic analysis are the sole output of the automated seismic processor.
Assessing the Applicability of Earthquake Early Warning in Nicaragua.
NASA Astrophysics Data System (ADS)
Massin, F.; Clinton, J. F.; Behr, Y.; Strauch, W.; Cauzzi, C.; Boese, M.; Talavera, E.; Tenorio, V.; Ramirez, J.
2016-12-01
Nicaragua, like much of Central America, suffers from frequent damaging earthquakes (6 M7+ earthquakes occurred in the last 100 years). Thrust events occur at the Middle America Trench where the Cocos plate subducts by 72-81 mm/yr eastward beneath the Caribbean plate. Shallow crustal events occur on-shore, with potential extensive damage as demonstrated in 1972 by a M6.2 earthquake, 5 km beneath Managua. This seismotectonic setting is challenging for Earthquake Early Warning (EEW) because the target events derive from both the offshore seismicity, with potentially large lead times but uncertain locations, and shallow seismicity in close proximity to densely urbanized areas, where an early warning would be short if available at all. Nevertheless, EEW could reduce Nicaragua's earthquake exposure. The Swiss Development and Cooperation Fund and the Nicaraguan Government have funded a collaboration between the Swiss Seismological Service (SED) at ETH Zurich and the Nicaraguan Geosciences Institute (INETER) in Managua to investigate and build a prototype EEW system for Nicaragua and the wider region. In this contribution, we present the potential of EEW to effectively alert Nicaragua and the neighbouring regions. We model alert time delays using all available seismic stations (existing and planned) in the region, as well as communication and processing delays (observed and optimal) to estimate current and potential performances of EEW alerts. Theoretical results are verified with the output from the Virtual Seismologist in SeisComP3 (VS(SC3)). VS(SC3) is implemented in the INETER SeisComP3 system for real-time operation and as an offline instance, that simulates real-time operation, to record processing delays of playback events. We compare our results with similar studies for Europe, California and New Zealand. We further highlight current capabilities and challenges for providing EEW alerts in Nicaragua. We also discuss how combining different algorithms, like e.g. VS and FinDer, can lead to a robust approach to EEW.
How do infants and adults process communicative events in real time?
Yamashiro, Amy; Vouloumanos, Athena
2018-09-01
Speech allows humans to communicate and to navigate the social world. By 12 months, infants recognize that speech elicits appropriate responses from others. However, it is unclear how infants process dynamic communicative scenes and how their processing abilities compare with those of adults. Do infants, like adults, process communicative events while the event is occurring or only after being presented with the outcome? We examined 12-month-olds' and adults' eye movements as they watched a Communicator grasp one (target) of two objects. During the test event, the Communicator could no longer reach the objects, so she spoke or coughed to a Listener, who selected either object. Infants' and adults' patterns of looking to the actors and objects revealed that both groups immediately evaluated the Communicator's speech, but not her cough, as communicative and recognized that the Listener should select the target object only when the Communicator spoke. Furthermore, infants and adults shifted their attention between the actors and the objects in very similar ways. This suggests that 12-month-olds can quickly process communicative events as they occur with adult-like accuracy. However, differences in looking reveal that 12-month-olds process slower than adults. This early developing processing ability may allow infants to learn language and acquire knowledge from communicative interactions. Copyright © 2018 Elsevier Inc. All rights reserved.
Real-Time Earthquake Monitoring with Spatio-Temporal Fields
NASA Astrophysics Data System (ADS)
Whittier, J. C.; Nittel, S.; Subasinghe, I.
2017-10-01
With live streaming sensors and sensor networks, increasingly large numbers of individual sensors are deployed in physical space. Sensor data streams are a fundamentally novel mechanism to deliver observations to information systems. They enable us to represent spatio-temporal continuous phenomena such as radiation accidents, toxic plumes, or earthquakes almost as instantaneously as they happen in the real world. Sensor data streams discretely sample an earthquake, while the earthquake is continuous over space and time. Programmers attempting to integrate many streams to analyze earthquake activity and scope need to write code to integrate potentially very large sets of asynchronously sampled, concurrent streams in tedious application code. In previous work, we proposed the field stream data model (Liang et al., 2016) for data stream engines. Abstracting the stream of an individual sensor as a temporal field, the field represents the Earth's movement at the sensor position as continuous. This simplifies analysis across many sensors significantly. In this paper, we undertake a feasibility study of using the field stream model and the open source Data Stream Engine (DSE) Apache Spark(Apache Spark, 2017) to implement a real-time earthquake event detection with a subset of the 250 GPS sensor data streams of the Southern California Integrated GPS Network (SCIGN). The field-based real-time stream queries compute maximum displacement values over the latest query window of each stream, and related spatially neighboring streams to identify earthquake events and their extent. Further, we correlated the detected events with an USGS earthquake event feed. The query results are visualized in real-time.
Real-Time Visualization of Network Behaviors for Situational Awareness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.
Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts takemore » proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.« less
Bigdely-Shamlo, Nima; Cockfield, Jeremy; Makeig, Scott; Rognon, Thomas; La Valle, Chris; Miyakoshi, Makoto; Robbins, Kay A.
2016-01-01
Real-world brain imaging by EEG requires accurate annotation of complex subject-environment interactions in event-rich tasks and paradigms. This paper describes the evolution of the Hierarchical Event Descriptor (HED) system for systematically describing both laboratory and real-world events. HED version 2, first described here, provides the semantic capability of describing a variety of subject and environmental states. HED descriptions can include stimulus presentation events on screen or in virtual worlds, experimental or spontaneous events occurring in the real world environment, and events experienced via one or multiple sensory modalities. Furthermore, HED 2 can distinguish between the mere presence of an object and its actual (or putative) perception by a subject. Although the HED framework has implicit ontological and linked data representations, the user-interface for HED annotation is more intuitive than traditional ontological annotation. We believe that hiding the formal representations allows for a more user-friendly interface, making consistent, detailed tagging of experimental, and real-world events possible for research users. HED is extensible while retaining the advantages of having an enforced common core vocabulary. We have developed a collection of tools to support HED tag assignment and validation; these are available at hedtags.org. A plug-in for EEGLAB (sccn.ucsd.edu/eeglab), CTAGGER, is also available to speed the process of tagging existing studies. PMID:27799907
NASA Astrophysics Data System (ADS)
George, Daniel; Huerta, E. A.
2018-03-01
The recent Nobel-prize-winning detections of gravitational waves from merging black holes and the subsequent detection of the collision of two neutron stars in coincidence with electromagnetic observations have inaugurated a new era of multimessenger astrophysics. To enhance the scope of this emergent field of science, we pioneered the use of deep learning with convolutional neural networks, that take time-series inputs, for rapid detection and characterization of gravitational wave signals. This approach, Deep Filtering, was initially demonstrated using simulated LIGO noise. In this article, we present the extension of Deep Filtering using real data from LIGO, for both detection and parameter estimation of gravitational waves from binary black hole mergers using continuous data streams from multiple LIGO detectors. We demonstrate for the first time that machine learning can detect and estimate the true parameters of real events observed by LIGO. Our results show that Deep Filtering achieves similar sensitivities and lower errors compared to matched-filtering while being far more computationally efficient and more resilient to glitches, allowing real-time processing of weak time-series signals in non-stationary non-Gaussian noise with minimal resources, and also enables the detection of new classes of gravitational wave sources that may go unnoticed with existing detection algorithms. This unified framework for data analysis is ideally suited to enable coincident detection campaigns of gravitational waves and their multimessenger counterparts in real-time.
Liu, Gangjun; Zhang, Jun; Yu, Lingfeng; Xie, Tuqiang; Chen, Zhongping
2010-01-01
With the increase of the A-line speed of optical coherence tomography (OCT) systems, real-time processing of acquired data has become a bottleneck. The shared-memory parallel computing technique is used to process OCT data in real time. The real-time processing power of a quad-core personal computer (PC) is analyzed. It is shown that the quad-core PC could provide real-time OCT data processing ability of more than 80K A-lines per second. A real-time, fiber-based, swept source polarization-sensitive OCT system with 20K A-line speed is demonstrated with this technique. The real-time 2D and 3D polarization-sensitive imaging of chicken muscle and pig tendon is also demonstrated. PMID:19904337
A fast one-chip event-preprocessor and sequencer for the Simbol-X Low Energy Detector
NASA Astrophysics Data System (ADS)
Schanz, T.; Tenzer, C.; Maier, D.; Kendziorra, E.; Santangelo, A.
2010-12-01
We present an FPGA-based digital camera electronics consisting of an Event-Preprocessor (EPP) for on-board data preprocessing and a related Sequencer (SEQ) to generate the necessary signals to control the readout of the detector. The device has been originally designed for the Simbol-X low energy detector (LED). The EPP operates on 64×64 pixel images and has a real-time processing capability of more than 8000 frames per second. The already working releases of the EPP and the SEQ are now combined into one Digital-Camera-Controller-Chip (D3C).
Schendan, Haline E.; Ganis, Giorgio
2015-01-01
People categorize objects more slowly when visual input is highly impoverished instead of optimal. While bottom-up models may explain a decision with optimal input, perceptual hypothesis testing (PHT) theories implicate top-down processes with impoverished input. Brain mechanisms and the time course of PHT are largely unknown. This event-related potential study used a neuroimaging paradigm that implicated prefrontal cortex in top-down modulation of occipitotemporal cortex. Subjects categorized more impoverished and less impoverished real and pseudo objects. PHT theories predict larger impoverishment effects for real than pseudo objects because top-down processes modulate knowledge only for real objects, but different PHT variants predict different timing. Consistent with parietal-prefrontal PHT variants, around 250 ms, the earliest impoverished real object interaction started on an N3 complex, which reflects interactive cortical activity for object cognition. N3 impoverishment effects localized to both prefrontal and occipitotemporal cortex for real objects only. The N3 also showed knowledge effects by 230 ms that localized to occipitotemporal cortex. Later effects reflected (a) word meaning in temporal cortex during the N400, (b) internal evaluation of prior decision and memory processes and secondary higher-order memory involving anterotemporal parts of a default mode network during posterior positivity (P600), and (c) response related activity in posterior cingulate during an anterior slow wave (SW) after 700 ms. Finally, response activity in supplementary motor area during a posterior SW after 900 ms showed impoverishment effects that correlated with RTs. Convergent evidence from studies of vision, memory, and mental imagery which reflects purely top-down inputs, indicates that the N3 reflects the critical top-down processes of PHT. A hybrid multiple-state interactive, PHT and decision theory best explains the visual constancy of object cognition. PMID:26441701
Yuan, Jie; Xu, Guan; Yu, Yao; Zhou, Yu; Carson, Paul L; Wang, Xueding; Liu, Xiaojun
2013-08-01
Photoacoustic tomography (PAT) offers structural and functional imaging of living biological tissue with highly sensitive optical absorption contrast and excellent spatial resolution comparable to medical ultrasound (US) imaging. We report the development of a fully integrated PAT and US dual-modality imaging system, which performs signal scanning, image reconstruction, and display for both photoacoustic (PA) and US imaging all in a truly real-time manner. The back-projection (BP) algorithm for PA image reconstruction is optimized to reduce the computational cost and facilitate parallel computation on a state of the art graphics processing unit (GPU) card. For the first time, PAT and US imaging of the same object can be conducted simultaneously and continuously, at a real-time frame rate, presently limited by the laser repetition rate of 10 Hz. Noninvasive PAT and US imaging of human peripheral joints in vivo were achieved, demonstrating the satisfactory image quality realized with this system. Another experiment, simultaneous PAT and US imaging of contrast agent flowing through an artificial vessel, was conducted to verify the performance of this system for imaging fast biological events. The GPU-based image reconstruction software code for this dual-modality system is open source and available for download from http://sourceforge.net/projects/patrealtime.
NASA Astrophysics Data System (ADS)
Chen, Junhua
2013-03-01
To cope with a large amount of data in current sensed environments, decision aid tools should provide their understanding of situations in a time-efficient manner, so there is an increasing need for real-time network security situation awareness and threat assessment. In this study, the state transition model of vulnerability in the network based on semi-Markov process is proposed at first. Once events are triggered by an attacker's action or system response, the current states of the vulnerabilities are known. Then we calculate the transition probabilities of the vulnerability from the current state to security failure state. Furthermore in order to improve accuracy of our algorithms, we adjust the probabilities that they exploit the vulnerability according to the attacker's skill level. In the light of the preconditions and post-conditions of vulnerabilities in the network, attack graph is built to visualize security situation in real time. Subsequently, we predict attack path, recognize attack intention and estimate the impact through analysis of attack graph. These help administrators to insight into intrusion steps, determine security state and assess threat. Finally testing in a network shows that this method is reasonable and feasible, and can undertake tremendous analysis task to facilitate administrators' work.
NASA Astrophysics Data System (ADS)
Thanos, Konstantinos-Georgios; Thomopoulos, Stelios C. A.
2016-05-01
wayGoo is a fully functional application whose main functionalities include content geolocation, event scheduling, and indoor navigation. However, significant information about events do not reach users' attention, either because of the size of this information or because some information comes from real - time data sources. The purpose of this work is to facilitate event management operations by prioritizing the presented events, based on users' interests using both, static and real - time data. Through the wayGoo interface, users select conceptual topics that are interesting for them. These topics constitute a browsing behavior vector which is used for learning users' interests implicitly, without being intrusive. Then, the system estimates user preferences and return an events list sorted from the most preferred one to the least. User preferences are modeled via a Naïve Bayesian Network which consists of: a) the `decision' random variable corresponding to users' decision on attending an event, b) the `distance' random variable, modeled by a linear regression that estimates the probability that the distance between a user and each event destination is not discouraging, ` the seat availability' random variable, modeled by a linear regression, which estimates the probability that the seat availability is encouraging d) and the `relevance' random variable, modeled by a clustering - based collaborative filtering, which determines the relevance of each event users' interests. Finally, experimental results show that the proposed system contribute essentially to assisting users in browsing and selecting events to attend.
Tsukahara, Keita; Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Nishimaki-Mogami, Tomoko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi
2016-01-01
A real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, MON87701. First, a standard plasmid for MON87701 quantification was constructed. The conversion factor (C f ) required to calculate the amount of genetically modified organism (GMO) was experimentally determined for a real-time PCR instrument. The determined C f for the real-time PCR instrument was 1.24. For the evaluation of the developed method, a blind test was carried out in an inter-laboratory trial. The trueness and precision were evaluated as the bias and reproducibility of relative standard deviation (RSDr), respectively. The determined biases and the RSDr values were less than 30 and 13%, respectively, at all evaluated concentrations. The limit of quantitation of the method was 0.5%, and the developed method would thus be applicable for practical analyses for the detection and quantification of MON87701.
A simplified real time method to forecast semi-enclosed basins storm surge
NASA Astrophysics Data System (ADS)
Pasquali, D.; Di Risio, M.; De Girolamo, P.
2015-11-01
Semi-enclosed basins are often prone to storm surge events. Indeed, their meteorological exposition, the presence of large continental shelf and their shape can lead to strong sea level set-up. A real time system aimed at forecasting storm surge may be of great help to protect human activities (i.e. to forecast flooding due to storm surge events), to manage ports and to safeguard coasts safety. This paper aims at illustrating a simple method able to forecast storm surge events in semi-enclosed basins in real time. The method is based on a mixed approach in which the results obtained by means of a simplified physics based model with low computational costs are corrected by means of statistical techniques. The proposed method is applied to a point of interest located in the Northern part of the Adriatic Sea. The comparison of forecasted levels against observed values shows the satisfactory reliability of the forecasts.
Search strategy using LHC pileup interactions as a zero bias sample
NASA Astrophysics Data System (ADS)
Nachman, Benjamin; Rubbo, Francesco
2018-05-01
Due to a limited bandwidth and a large proton-proton interaction cross section relative to the rate of interesting physics processes, most events produced at the Large Hadron Collider (LHC) are discarded in real time. A sophisticated trigger system must quickly decide which events should be kept and is very efficient for a broad range of processes. However, there are many processes that cannot be accommodated by this trigger system. Furthermore, there may be models of physics beyond the standard model (BSM) constructed after data taking that could have been triggered, but no trigger was implemented at run time. Both of these cases can be covered by exploiting pileup interactions as an effective zero bias sample. At the end of high-luminosity LHC operations, this zero bias dataset will have accumulated about 1 fb-1 of data from which a bottom line cross section limit of O (1 ) fb can be set for BSM models already in the literature and those yet to come.
NASA Technical Reports Server (NTRS)
Wenzel, Elizabeth M.; Fisher, Scott S.; Stone, Philip K.; Foster, Scott H.
1991-01-01
The real time acoustic display capabilities are described which were developed for the Virtual Environment Workstation (VIEW) Project at NASA-Ames. The acoustic display is capable of generating localized acoustic cues in real time over headphones. An auditory symbology, a related collection of representational auditory 'objects' or 'icons', can be designed using ACE (Auditory Cue Editor), which links both discrete and continuously varying acoustic parameters with information or events in the display. During a given display scenario, the symbology can be dynamically coordinated in real time with 3-D visual objects, speech, and gestural displays. The types of displays feasible with the system range from simple warnings and alarms to the acoustic representation of multidimensional data or events.
Monitoring and Identifying in Real time Critical Patients Events.
Chavez Mora, Emma
2014-01-01
Nowadays pervasive health care monitoring environments, as well as business activity monitoring environments, gather information from a variety of data sources. However it includes new challenges because of the use of body and wireless sensors, nontraditional operational and transactional sources. This makes the health data more difficult to monitor. Decision making in this environment is typically complex and unstructured as clinical work is essentially interpretative, multitasking, collaborative, distributed and reactive. Thus, the health care arena requires real time data management in areas such as patient monitoring, detection of adverse events and adaptive responses to operational failures. This research presents a new architecture that enables real time patient data management through the use of intelligent data sources.
NASA Astrophysics Data System (ADS)
Wenzel, Elizabeth M.; Fisher, Scott S.; Stone, Philip K.; Foster, Scott H.
1991-03-01
The real time acoustic display capabilities are described which were developed for the Virtual Environment Workstation (VIEW) Project at NASA-Ames. The acoustic display is capable of generating localized acoustic cues in real time over headphones. An auditory symbology, a related collection of representational auditory 'objects' or 'icons', can be designed using ACE (Auditory Cue Editor), which links both discrete and continuously varying acoustic parameters with information or events in the display. During a given display scenario, the symbology can be dynamically coordinated in real time with 3-D visual objects, speech, and gestural displays. The types of displays feasible with the system range from simple warnings and alarms to the acoustic representation of multidimensional data or events.
From IHE Audit Trails to XES Event Logs Facilitating Process Mining.
Paster, Ferdinand; Helm, Emmanuel
2015-01-01
Recently Business Intelligence approaches like process mining are applied to the healthcare domain. The goal of process mining is to gain process knowledge, compliance and room for improvement by investigating recorded event data. Previous approaches focused on process discovery by event data from various specific systems. IHE, as a globally recognized basis for healthcare information systems, defines in its ATNA profile how real-world events must be recorded in centralized event logs. The following approach presents how audit trails collected by the means of ATNA can be transformed to enable process mining. Using the standardized audit trails provides the ability to apply these methods to all IHE based information systems.
Evaluation of the real-time earthquake information system in Japan
NASA Astrophysics Data System (ADS)
Nakamura, Hiromitsu; Horiuchi, Shigeki; Wu, Changjiang; Yamamoto, Shunroku; Rydelek, Paul A.
2009-01-01
The real-time earthquake information system (REIS) of the Japanese seismic network is developed for automatically determining earthquake parameters within a few seconds after the P-waves arrive at the closest stations using both the P-wave arrival times and the timing data that P-waves have not yet arrived at other stations. REIS results play a fundamental role in the real-time information for earthquake early warning in Japan. We show the rapidity and accuracy of REIS from the analysis of 4,050 earthquakes in three years since 2005; 44 percent of the first reports are issued within 5 seconds after the first P-wave arrival and 80 percent of the events have a difference in epicenter distance less than 20 km relative to manually determined locations. We compared the formal catalog to the estimated magnitude from the real-time analysis and found that 94 percent of the events had a magnitude difference of +/-1.0 unit.
Real Time Coincidence Detection Engine for High Count Rate Timestamp Based PET
NASA Astrophysics Data System (ADS)
Tetrault, M.-A.; Oliver, J. F.; Bergeron, M.; Lecomte, R.; Fontaine, R.
2010-02-01
Coincidence engines follow two main implementation flows: timestamp based systems and AND-gate based systems. The latter have been more widespread in recent years because of its lower cost and high efficiency. However, they are highly dependent on the selected electronic components, they have limited flexibility once assembled and they are customized to fit a specific scanner's geometry. Timestamp based systems are gathering more attention lately, especially with high channel count fully digital systems. These new systems must however cope with important singles count rates. One option is to record every detected event and postpone coincidence detection offline. For daily use systems, a real time engine is preferable because it dramatically reduces data volume and hence image preprocessing time and raw data management. This paper presents the timestamp based coincidence engine for the LabPET¿, a small animal PET scanner with up to 4608 individual readout avalanche photodiode channels. The engine can handle up to 100 million single events per second and has extensive flexibility because it resides in programmable logic devices. It can be adapted for any detector geometry or channel count, can be ported to newer, faster programmable devices and can have extra modules added to take advantage of scanner-specific features. Finally, the user can select between full processing mode for imaging protocols and minimum processing mode to study different approaches for coincidence detection with offline software.
ERIC Educational Resources Information Center
Rauchberger, Nirit; Kaniel, Shlomo; Gross, Zehavit
2017-01-01
This study examines the process of judging complex real-life events in Israel: the disengagement from Gush Katif, Rabin's assassination and the Second Lebanon War. The process of judging is based on Weiner's attribution model, (Weiner, 2000, 2006); however, due to the complexity of the events studied, variables were added to characterize the…
Characteristics of Near-Death Experiences Memories as Compared to Real and Imagined Events Memories
Brédart, Serge; Dehon, Hedwige; Ledoux, Didier; Laureys, Steven; Vanhaudenhuyse, Audrey
2013-01-01
Since the dawn of time, Near-Death Experiences (NDEs) have intrigued and, nowadays, are still not fully explained. Since reports of NDEs are proposed to be imagined events, and since memories of imagined events have, on average, fewer phenomenological characteristics than real events memories, we here compared phenomenological characteristics of NDEs reports with memories of imagined and real events. We included three groups of coma survivors (8 patients with NDE as defined by the Greyson NDE scale, 6 patients without NDE but with memories of their coma, 7 patients without memories of their coma) and a group of 18 age-matched healthy volunteers. Five types of memories were assessed using Memory Characteristics Questionnaire (MCQ – Johnson et al., 1988): target memories (NDE for NDE memory group, coma memory for coma memory group, and first childhood memory for no memory and control groups), old and recent real event memories and old and recent imagined event memories. Since NDEs are known to have high emotional content, participants were requested to choose the most emotionally salient memories for both real and imagined recent and old event memories. Results showed that, in NDE memories group, NDE memories have more characteristics than memories of imagined and real events (p<0.02). NDE memories contain more self-referential and emotional information and have better clarity than memories of coma (all ps<0.02). The present study showed that NDE memories contained more characteristics than real event memories and coma memories. Thus, this suggests that they cannot be considered as imagined event memories. On the contrary, their physiological origins could lead them to be really perceived although not lived in the reality. Further work is needed to better understand this phenomenon. PMID:23544039
NASA Astrophysics Data System (ADS)
Onken, Jeffrey
This dissertation introduces a multidisciplinary framework for the enabling of future research and analysis of alternatives for control centers for real-time operations of safety-critical systems. The multidisciplinary framework integrates functional and computational models that describe the dynamics in fundamental concepts of previously disparate engineering and psychology research disciplines, such as group performance and processes, supervisory control, situation awareness, events and delays, and expertise. The application in this dissertation is the real-time operations within the NASA Mission Control Center in Houston, TX. This dissertation operationalizes the framework into a model and simulation, which simulates the functional and computational models in the framework according to user-configured scenarios for a NASA human-spaceflight mission. The model and simulation generates data according to the effectiveness of the mission-control team in supporting the completion of mission objectives and detecting, isolating, and recovering from anomalies. Accompanying the multidisciplinary framework is a proof of concept, which demonstrates the feasibility of such a framework. The proof of concept demonstrates that variability occurs where expected based on the models. The proof of concept also demonstrates that the data generated from the model and simulation is useful for analyzing and comparing MCC configuration alternatives because an investigator can give a diverse set of scenarios to the simulation and the output compared in detail to inform decisions about the effect of MCC configurations on mission operations performance.
NASA Astrophysics Data System (ADS)
Savastano, Giorgio; Komjathy, Attila; Verkhoglyadova, Olga; Wei, Yong; Mazzoni, Augusto; Crespi, Mattia
2017-04-01
Tsunamis can produce gravity waves that propagate up to the ionosphere generating disturbed electron densities in the E and F regions. These ionospheric disturbances are studied in detail using ionospheric total electron content (TEC) measurements collected by continuously operating ground-based receivers from the Global Navigation Satellite Systems (GNSS). Here, we present results using a new approach, named VARION (Variometric Approach for Real-Time Ionosphere Observation), and for the first time, we estimate slant TEC (sTEC) variations in a real-time scenario from GPS and Galileo constellations. Specifically, we study the 2016 New Zealand tsunami event using GNSS receivers with multi-constellation tracking capabilities located in the Pacific region. We compare sTEC estimates obtained using GPS and Galileo constellations. The efficiency of the real-time sTEC estimation using the VARION algorithm has been demonstrated for the 2012 Haida Gwaii tsunami event. TEC variations induced by the tsunami event are computed using 56 GPS receivers in Hawai'i. We observe TEC perturbations with amplitudes up to 0.25 TEC units and traveling ionospheric disturbances moving away from the epicenter at a speed of about 316 m/s. We present comparisons with the real-time tsunami model MOST (Method of Splitting Tsunami) provided by the NOAA Center for Tsunami Research. We observe variations in TEC that correlate well in time and space with the propagating tsunami waves. We conclude that the integration of different satellite constellations is a crucial step forward to increasing the reliability of real-time tsunami detection systems using ground-based GNSS receivers as an augmentation to existing tsunami early warning systems.
Optimisation of multiplet identifier processing on a PLAYSTATION® 3
NASA Astrophysics Data System (ADS)
Hattori, Masami; Mizuno, Takashi
2010-02-01
To enable high-performance computing (HPC) for applications with large datasets using a Sony® PLAYSTATION® 3 (PS3™) video game console, we configured a hybrid system consisting of a Windows® PC and a PS3™. To validate this system, we implemented the real-time multiplet identifier (RTMI) application, which identifies multiplets of microearthquakes in terms of the similarity of their waveforms. The cross-correlation computation, which is a core algorithm of the RTMI application, was optimised for the PS3™ platform, while the rest of the computation, including data input and output remained on the PC. With this configuration, the core part of the algorithm ran 69 times faster than the original program, accelerating total computation speed more than five times. As a result, the system processed up to 2100 total microseismic events, whereas the original implementation had a limit of 400 events. These results indicate that this system enables high-performance computing for large datasets using the PS3™, as long as data transfer time is negligible compared with computation time.
Rapid Modeling of and Response to Large Earthquakes Using Real-Time GPS Networks (Invited)
NASA Astrophysics Data System (ADS)
Crowell, B. W.; Bock, Y.; Squibb, M. B.
2010-12-01
Real-time GPS networks have the advantage of capturing motions throughout the entire earthquake cycle (interseismic, seismic, coseismic, postseismic), and because of this, are ideal for real-time monitoring of fault slip in the region. Real-time GPS networks provide the perfect supplement to seismic networks, which operate with lower noise and higher sampling rates than GPS networks, but only measure accelerations or velocities, putting them at a supreme disadvantage for ascertaining the full extent of slip during a large earthquake in real-time. Here we report on two examples of rapid modeling of recent large earthquakes near large regional real-time GPS networks. The first utilizes Japan’s GEONET consisting of about 1200 stations during the 2003 Mw 8.3 Tokachi-Oki earthquake about 100 km offshore Hokkaido Island and the second investigates the 2010 Mw 7.2 El Mayor-Cucapah earthquake recorded by more than 100 stations in the California Real Time Network. The principal components of strain were computed throughout the networks and utilized as a trigger to initiate earthquake modeling. Total displacement waveforms were then computed in a simulated real-time fashion using a real-time network adjustment algorithm that fixes a station far away from the rupture to obtain a stable reference frame. Initial peak ground displacement measurements can then be used to obtain an initial size through scaling relationships. Finally, a full coseismic model of the event can be run minutes after the event, given predefined fault geometries, allowing emergency first responders and researchers to pinpoint the regions of highest damage. Furthermore, we are also investigating using total displacement waveforms for real-time moment tensor inversions to look at spatiotemporal variations in slip.
Toward an optimisation technique for dynamically monitored environment
NASA Astrophysics Data System (ADS)
Shurrab, Orabi M.
2016-10-01
The data fusion community has introduced multiple procedures of situational assessments; this is to facilitate timely responses to emerging situations. More directly, the process refinement of the Joint Directors of Laboratories (JDL) is a meta-process to assess and improve the data fusion task during real-time operation. In other wording, it is an optimisation technique to verify the overall data fusion performance, and enhance it toward the top goals of the decision-making resources. This paper discusses the theoretical concept of prioritisation. Where the analysts team is required to keep an up to date with the dynamically changing environment, concerning different domains such as air, sea, land, space and cyberspace. Furthermore, it demonstrates an illustration example of how various tracking activities are ranked, simultaneously into a predetermined order. Specifically, it presents a modelling scheme for a case study based scenario, where the real-time system is reporting different classes of prioritised events. Followed by a performance metrics for evaluating the prioritisation process of situational awareness (SWA) domain. The proposed performance metrics has been designed and evaluated using an analytical approach. The modelling scheme represents the situational awareness system outputs mathematically, in the form of a list of activities. Such methods allowed the evaluation process to conduct a rigorous analysis of the prioritisation process, despite any constrained related to a domain-specific configuration. After conducted three levels of assessments over three separates scenario, The Prioritisation Capability Score (PCS) has provided an appropriate scoring scheme for different ranking instances, Indeed, from the data fusion perspectives, the proposed metric has assessed real-time system performance adequately, and it is capable of conducting a verification process, to direct the operator's attention to any issue, concerning the prioritisation capability of situational awareness domain.
Continuous catchment-scale monitoring of geomorphic processes with a 2-D seismological array
NASA Astrophysics Data System (ADS)
Burtin, A.; Hovius, N.; Milodowski, D.; Chen, Y.-G.; Wu, Y.-M.; Lin, C.-W.; Chen, H.
2012-04-01
The monitoring of geomorphic processes during extreme climatic events is of a primary interest to estimate their impact on the landscape dynamics. However, available techniques to survey the surface activity do not provide a relevant time and/or space resolution. Furthermore, these methods hardly investigate the dynamics of the events since their detection are made a posteriori. To increase our knowledge of the landscape evolution and the influence of extreme climatic events on a catchment dynamics, we need to develop new tools and procedures. In many past works, it has been shown that seismic signals are relevant to detect and locate surface processes (landslides, debris flows). During the 2010 typhoon season, we deployed a network of 12 seismometers dedicated to monitor the surface processes of the Chenyoulan catchment in Taiwan. We test the ability of a two dimensional array and small inter-stations distances (~ 11 km) to map in continuous and at a catchment-scale the geomorphic activity. The spectral analysis of continuous records shows a high-frequency (> 1 Hz) seismic energy that is coherent with the occurrence of hillslope and river processes. Using a basic detection algorithm and a location approach running on the analysis of seismic amplitudes, we manage to locate the catchment activity. We mainly observe short-time events (> 300 occurrences) associated with debris falls and bank collapses during daily convective storms, where 69% of occurrences are coherent with the time distribution of precipitations. We also identify a couple of debris flows during a large tropical storm. In contrast, the FORMOSAT imagery does not detect any activity, which somehow reflects the lack of extreme climatic conditions during the experiment. However, high resolution pictures confirm the existence of links between most of geomorphic events and existing structures (landslide scars, gullies...). We thus conclude to an activity that is dominated by reactivation processes. It highlights the major interest of a seismic monitoring since it allows a detailed spatial and temporal survey of events that classic approaches are not able to observe. In the future, dense two dimensional seismological arrays will assess in real-time the landscape dynamics of an entire catchment, tracking sediments from slopes to rivers.
Fandom Biases Retrospective Judgments Not Perception.
Huff, Markus; Papenmeier, Frank; Maurer, Annika E; Meitz, Tino G K; Garsoffky, Bärbel; Schwan, Stephan
2017-02-24
Attitudes and motivations have been shown to affect the processing of visual input, indicating that observers may see a given situation each literally in a different way. Yet, in real-life, processing information in an unbiased manner is considered to be of high adaptive value. Attitudinal and motivational effects were found for attention, characterization, categorization, and memory. On the other hand, for dynamic real-life events, visual processing has been found to be highly synchronous among viewers. Thus, while in a seminal study fandom as a particularly strong case of attitudes did bias judgments of a sports event, it left the question open whether attitudes do bias prior processing stages. Here, we investigated influences of fandom during the live TV broadcasting of the 2013 UEFA-Champions-League Final regarding attention, event segmentation, immediate and delayed cued recall, as well as affect, memory confidence, and retrospective judgments. Even though we replicated biased retrospective judgments, we found that eye-movements, event segmentation, and cued recall were largely similar across both groups of fans. Our findings demonstrate that, while highly involving sports events are interpreted in a fan dependent way, at initial stages they are processed in an unbiased manner.
Fandom Biases Retrospective Judgments Not Perception
Huff, Markus; Papenmeier, Frank; Maurer, Annika E.; Meitz, Tino G. K.; Garsoffky, Bärbel; Schwan, Stephan
2017-01-01
Attitudes and motivations have been shown to affect the processing of visual input, indicating that observers may see a given situation each literally in a different way. Yet, in real-life, processing information in an unbiased manner is considered to be of high adaptive value. Attitudinal and motivational effects were found for attention, characterization, categorization, and memory. On the other hand, for dynamic real-life events, visual processing has been found to be highly synchronous among viewers. Thus, while in a seminal study fandom as a particularly strong case of attitudes did bias judgments of a sports event, it left the question open whether attitudes do bias prior processing stages. Here, we investigated influences of fandom during the live TV broadcasting of the 2013 UEFA-Champions-League Final regarding attention, event segmentation, immediate and delayed cued recall, as well as affect, memory confidence, and retrospective judgments. Even though we replicated biased retrospective judgments, we found that eye-movements, event segmentation, and cued recall were largely similar across both groups of fans. Our findings demonstrate that, while highly involving sports events are interpreted in a fan dependent way, at initial stages they are processed in an unbiased manner. PMID:28233877
Real time freeway incident detection.
DOT National Transportation Integrated Search
2014-04-01
The US Department of Transportation (US-DOT) estimates that over half of all congestion : events are caused by highway incidents rather than by rush-hour traffic in big cities. Real-time : incident detection on freeways is an important part of any mo...
Secret Forwarding of Events over Distributed Publish/Subscribe Overlay Network.
Yoon, Young; Kim, Beom Heyn
2016-01-01
Publish/subscribe is a communication paradigm where loosely-coupled clients communicate in an asynchronous fashion. Publish/subscribe supports the flexible development of large-scale, event-driven and ubiquitous systems. Publish/subscribe is prevalent in a number of application domains such as social networking, distributed business processes and real-time mission-critical systems. Many publish/subscribe applications are sensitive to message loss and violation of privacy. To overcome such issues, we propose a novel method of using secret sharing and replication techniques. This is to reliably and confidentially deliver decryption keys along with encrypted publications even under the presence of several Byzantine brokers across publish/subscribe overlay networks. We also propose a framework for dynamically and strategically allocating broker replicas based on flexibly definable criteria for reliability and performance. Moreover, a thorough evaluation is done through a case study on social networks using the real trace of interactions among Facebook users.
Secret Forwarding of Events over Distributed Publish/Subscribe Overlay Network
Kim, Beom Heyn
2016-01-01
Publish/subscribe is a communication paradigm where loosely-coupled clients communicate in an asynchronous fashion. Publish/subscribe supports the flexible development of large-scale, event-driven and ubiquitous systems. Publish/subscribe is prevalent in a number of application domains such as social networking, distributed business processes and real-time mission-critical systems. Many publish/subscribe applications are sensitive to message loss and violation of privacy. To overcome such issues, we propose a novel method of using secret sharing and replication techniques. This is to reliably and confidentially deliver decryption keys along with encrypted publications even under the presence of several Byzantine brokers across publish/subscribe overlay networks. We also propose a framework for dynamically and strategically allocating broker replicas based on flexibly definable criteria for reliability and performance. Moreover, a thorough evaluation is done through a case study on social networks using the real trace of interactions among Facebook users. PMID:27367610
The Texas Thermal Interface: A real-time computer interface for an Inframetrics infrared camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Storek, D.J.; Gentle, K.W.
1996-03-01
The Texas Thermal Interface (TTI) offers an advantageous alternative to the conventional video path for computer analysis of infrared images from Inframetrics cameras. The TTI provides real-time computer data acquisition of 48 consecutive fields (version described here) with 8-bit pixels. The alternative requires time-consuming individual frame grabs from video tape with frequent loss of resolution in the D/A/D conversion. Within seconds after the event, the TTI temperature files may be viewed and processed to infer heat fluxes or other quantities as needed. The system cost is far less than commercial units which offer less capability. The system was developed formore » and is being used to measure heat fluxes to the plasma-facing components in a tokamak. {copyright} {ital 1996 American Institute of Physics.}« less
NASA Astrophysics Data System (ADS)
Ghosh, G. K.; Sivakumar, C.
2018-03-01
Longwall mining technique has been widely used around the globe due to its safe mining process. However, mining operations are suspended when various problems arise like collapse of roof falls, cracks and fractures propagation in the roof and complexity in roof strata behaviors. To overcome these colossal problems, an underground real time microseismic monitoring technique has been implemented in the working panel-P2 in the Rajendra longwall underground coal mine at South Eastern Coalfields Limited (SECL), India. The target coal seams appears at the panel P-2 within a depth of 70 m to 76 m. In this process, 10 to 15 uniaxial geophones were placed inside a borehole at depth range of 40 m to 60 m located over the working panel-P2 with high rock quality designation value for better seismic signal. Various microseismic events were recorded with magnitude ranging from -5 to 2 in the Richter scale. The time-series processing was carried out to get various seismic parameters like activity rate, potential energy, viscosity rate, seismic moment, energy index, apparent volume and potential energy with respect to time. The used of these parameters helped tracing the events, understanding crack and fractures propagation and locating both high and low stress distribution zones prior to roof fall occurrence. In most of the cases, the events were divided into three stage processes: initial or preliminary, middle or building, and final or falling. The results of this study reveal that underground microseismic monitoring provides sufficient prior information of underground weighting events. The information gathered during the study was conveyed to the mining personnel in advance prior to roof fall event. This permits to take appropriate action for safer mining operations and risk reduction during longwall operation.
Implementing real-time GNSS monitoring to investigate continental rift initiation processes
NASA Astrophysics Data System (ADS)
Jones, J. R.; Stamps, D. S.; Wauthier, C.; Daniels, M. D.; Saria, E.; Ji, K. H.; Mencin, D.; Ntambila, D.
2017-12-01
Continental rift initiation remains an elusive, yet fundamental, process in the context of plate tectonic theory. Our early work in the Natron Rift, Tanzania, the Earth's archetype continental rift initiation setting, indicates feedback between volcanic deformation and fault slip play a key role in the rift initiation process. We found evidence that fault slip on the Natron border fault during active volcanism at Ol Doniyo Lengai in 2008 required only 0.01 MPa of Coulomb stress change. This previous study was limited by GPS constraints 18 km from the volcano, rather than immediately adjacent on the rift shoulder. We hypothesize that fault slip adjacent to the volcano creeps, and without the need for active eruption. We also hypothesize silent slip events may occur over time-scales less than 1 day. To test our hypotheses we designed a GNSS network with 4 sites on the flanks of Ol Doinyo Lengai and 1 site on the adjacent Natron border fault with the capability to calculate 1 second, 3-5 cm precision positions. Data is transmitted to UNAVCO in real-time with remote satellite internet, which we automatically import to the EarthCube building block CHORDS (Cloud Hosted Real-time Data Services for the Geosciences) using our newly developed method. We use CHORDS to monitor and evaluate the health of our network while visualizing the GNSS data in real-time. In addition to our import method we have also developed user-friendly capabilities to export GNSS positions (longitude, latitude, height) with CHORDS assuming the data are available at UNAVCO in NMEA standardized format through the Networked Transport of RTCM via Internet Protocol (NTRIP). The ability to access the GNSS data that continuously monitors volcanic deformation, tectonics, and their interactions on and around Ol Doinyo Lengai is a crucial component in our investigation of continental rift initiation in the Natron Rift, Tanzania. Our new user-friendly methods developed to access and post-process real-time GNSS positioning data can also be used by others in the geodesy community that need 3-5 cm precision positions (longitude, latitude, height).
A geodetic matched-filter search for slow slip with application to the Mexico subduction zone
NASA Astrophysics Data System (ADS)
Rousset, B.; Campillo, M.; Lasserre, C.; Frank, W.; Cotte, N.; Walpersdorf, A.; Socquet, A.; Kostoglodov, V.
2017-12-01
Since the discovery of slow slip events, many methods have been successfully applied to model obvious transient events in geodetic time series, such as the widely used network strain filter. Independent seismological observations of tremors or low frequency earthquakes and repeating earthquakes provide evidence of low amplitude slow deformation but do not always coincide with clear occurrences of transient signals in geodetic time series. Here, we aim to extract the signal corresponding to slow slips hidden in the noise of GPS time series, without using information from independent datasets. We first build a library of synthetic slow slip event templates by assembling a source function with Green's functions for a discretized fault. We then correlate the templates with post-processed GPS time series. Once the events have been detected in time, we estimate their duration T and magnitude Mw by modelling a weighted stack of GPS time series. An analysis of synthetic time series shows that this method is able to resolve the correct timing, location, T and Mw of events larger than Mw 6.0 in the context of the Mexico subduction zone. Applied on a real data set of 29 GPS time series in the Guerrero area from 2005 to 2014, this technique allows us to detect 28 transient events from Mw 6.3 to 7.2 with durations that range from 3 to 39 days. These events have a dominant recurrence time of 40 days and are mainly located at the down dip edges of the Mw > 7.5 SSEs.
Real-Time Gait Event Detection Based on Kinematic Data Coupled to a Biomechanical Model.
Lambrecht, Stefan; Harutyunyan, Anna; Tanghe, Kevin; Afschrift, Maarten; De Schutter, Joris; Jonkers, Ilse
2017-03-24
Real-time detection of multiple stance events, more specifically initial contact (IC), foot flat (FF), heel off (HO), and toe off (TO), could greatly benefit neurorobotic (NR) and neuroprosthetic (NP) control. Three real-time threshold-based algorithms have been developed, detecting the aforementioned events based on kinematic data in combination with a biomechanical model. Data from seven subjects walking at three speeds on an instrumented treadmill were used to validate the presented algorithms, accumulating to a total of 558 steps. The reference for the gait events was obtained using marker and force plate data. All algorithms had excellent precision and no false positives were observed. Timing delays of the presented algorithms were similar to current state-of-the-art algorithms for the detection of IC and TO, whereas smaller delays were achieved for the detection of FF. Our results indicate that, based on their high precision and low delays, these algorithms can be used for the control of an NR/NP, with the exception of the HO event. Kinematic data is used in most NR/NP control schemes and is thus available at no additional cost, resulting in a minimal computational burden. The presented methods can also be applied for screening pathological gait or gait analysis in general in/outside of the laboratory.
GPS-based PWV for precipitation forecasting and its application to a typhoon event
NASA Astrophysics Data System (ADS)
Zhao, Qingzhi; Yao, Yibin; Yao, Wanqiang
2018-01-01
The temporal variability of precipitable water vapour (PWV) derived from Global Navigation Satellite System (GNSS) observations can be used to forecast precipitation events. A number of case studies of precipitation events have been analysed in Zhejiang Province, and a forecasting method for precipitation events was proposed. The PWV time series retrieved from the Global Positioning System (GPS) observations was processed by using a least-squares fitting method, so as to obtain the line tendency of ascents and descents over PWV. The increment of PWV for a short time (two to six hours) and PWV slope for a longer time (a few hours to more than ten hours) during the PWV ascending period are considered as predictive factors with which to forecast the precipitation event. The numerical results show that about 80%-90% of precipitation events and more than 90% of heavy rain events can be forecasted two to six hours in advance of the precipitation event based on the proposed method. 5-minute PWV data derived from GPS observations based on real-time precise point positioning (RT-PPP) were used for the typhoon event that passed over Zhejiang Province between 10 and 12 July, 2015. A good result was acquired using the proposed method and about 74% of precipitation events were predicted at some ten to thirty minutes earlier than their onset with a false alarm rate of 18%. This study shows that the GPS-based PWV was promising for short-term and now-casting precipitation forecasting.
NASA Astrophysics Data System (ADS)
Meier, V. L.; Scuderi, L.; Fischer, T.; Realmuto, V.; Hilton, D.
2006-12-01
Measurements of volcanic SO2 emissions provide insight into the processes working below a volcano, which can presage volcanic events. Being able to measure SO2 in near real-time is invaluable for the planning and response of hazard mitigation teams. Currently, there are several methods used to quantify the SO2 output of degassing volcanoes. Ground and aerial-based measurements using the differential optical absorption spectrometer (mini-DOAS) provide real-time estimates of SO2 output. Satellite-based measurements, which can provide similar estimates in near real-time, have increasingly been used as a tool for volcanic monitoring. Direct Broadcast (DB) real-time processing of remotely sensed data from NASA's Earth Observing System (EOS) satellites (MODIS Terra and Aqua) presents volcanologists with a range of spectral bands and processing options for the study of volcanic emissions. While the spatial resolution of MODIS is 1 km in the Very Near Infrared (VNIR) and Thermal Infrared (TIR), a high temporal resolution and a wide range of radiance measurements in 32 channels between VNIR and TIR combine to provide a versatile space borne platform to monitor SO2 emissions from volcanoes. An important question remaining to be answered is how well do MODIS SO2 estimates compare with DOAS estimates? In 2004 ground-based plume measurements were collected on April 24th and 25th at Anatahan volcano in the Mariana Islands using a mini-DOAS (Fischer and Hilton). SO2 measurements for these same dates have also been calculated using MODIS images and SO2 mapping software (Realmuto). A comparison of these different approaches to the measurement of SO2 for the same plume is presented. Differences in these observations are used to better quantify SO2 emissions, to assess the current mismatch between ground based and remotely sensed retrievals, and to develop an approach to continuously and accurately monitor volcanic activity from space in near real-time.
Integrating Real-time Earthquakes into Natural Hazard Courses
NASA Astrophysics Data System (ADS)
Furlong, K. P.; Benz, H. M.; Whitlock, J. S.; Bittenbinder, A. N.; Bogaert, B. B.
2001-12-01
Natural hazard courses are playing an increasingly important role in college and university earth science curricula. Students' intrinsic curiosity about the subject and the potential to make the course relevant to the interests of both science and non-science students make natural hazards courses popular additions to a department's offerings. However, one vital aspect of "real-life" natural hazard management that has not translated well into the classroom is the real-time nature of both events and response. The lack of a way to entrain students into the event/response mode has made implementing such real-time activities into classroom activities problematic. Although a variety of web sites provide near real-time postings of natural hazards, students essentially learn of the event after the fact. This is particularly true for earthquakes and other events with few precursors. As a result, the "time factor" and personal responsibility associated with natural hazard response is lost to the students. We have integrated the real-time aspects of earthquake response into two natural hazard courses at Penn State (a 'general education' course for non-science majors, and an upper-level course for science majors) by implementing a modification of the USGS Earthworm system. The Earthworm Database Management System (E-DBMS) catalogs current global seismic activity. It provides earthquake professionals with real-time email/cell phone alerts of global seismic activity and access to the data for review/revision purposes. We have modified this system so that real-time response can be used to address specific scientific, policy, and social questions in our classes. As a prototype of using the E-DBMS in courses, we have established an Earthworm server at Penn State. This server receives national and global seismic network data and, in turn, transmits the tailored alerts to "on-duty" students (e-mail, pager/cell phone notification). These students are responsible to react to the alarm real-time, consulting other members of their class and accessing the E-DBMS server and other links to glean information that they will then use to make decisions. Students wrestle with the complications in interpreting natural hazard data, evaluating whether a response is needed, and problems such as those associated with communication between media and the public through these focused exercises. Although earthquakes are targeted at present, similar DBMS systems are envisioned for other natural hazards like flooding, volcanoes, and severe weather. We are testing this system as a prototype intended to be expanded to provide web-based access to classes at both the middle/high school and college/university levels.
Information security of Smart Factories
NASA Astrophysics Data System (ADS)
Iureva, R. A.; Andreev, Y. S.; Iuvshin, A. M.; Timko, A. S.
2018-05-01
In several years, technologies and systems based on the Internet of things (IoT) will be widely used in all smart factories. When processing a huge array of unstructured data, their filtration and adequate interpretation are a priority for enterprises. In this context, the correct representation of information in a user-friendly form acquires special importance, for which the market today presents advanced analytical platforms designed to collect, store and analyze data on technological processes and events in real time. The main idea of the paper is the statement of the information security problem in IoT and integrity of processed information.
Computational Electrocardiography: Revisiting Holter ECG Monitoring.
Deserno, Thomas M; Marx, Nikolaus
2016-08-05
Since 1942, when Goldberger introduced the 12-lead electrocardiography (ECG), this diagnostic method has not been changed. After 70 years of technologic developments, we revisit Holter ECG from recording to understanding. A fundamental change is fore-seen towards "computational ECG" (CECG), where continuous monitoring is producing big data volumes that are impossible to be inspected conventionally but require efficient computational methods. We draw parallels between CECG and computational biology, in particular with respect to computed tomography, computed radiology, and computed photography. From that, we identify technology and methodology needed for CECG. Real-time transfer of raw data into meaningful parameters that are tracked over time will allow prediction of serious events, such as sudden cardiac death. Evolved from Holter's technology, portable smartphones with Bluetooth-connected textile-embedded sensors will capture noisy raw data (recording), process meaningful parameters over time (analysis), and transfer them to cloud services for sharing (handling), predicting serious events, and alarming (understanding). To make this happen, the following fields need more research: i) signal processing, ii) cycle decomposition; iii) cycle normalization, iv) cycle modeling, v) clinical parameter computation, vi) physiological modeling, and vii) event prediction. We shall start immediately developing methodology for CECG analysis and understanding.
Dozza, Marco; González, Nieves Pañeda
2013-11-01
New trends in research on traffic accidents include Naturalistic Driving Studies (NDS). NDS are based on large scale data collection of driver, vehicle, and environment information in real world. NDS data sets have proven to be extremely valuable for the analysis of safety critical events such as crashes and near crashes. However, finding safety critical events in NDS data is often difficult and time consuming. Safety critical events are currently identified using kinematic triggers, for instance searching for deceleration below a certain threshold signifying harsh braking. Due to the low sensitivity and specificity of this filtering procedure, manual review of video data is currently necessary to decide whether the events identified by the triggers are actually safety critical. Such reviewing procedure is based on subjective decisions, is expensive and time consuming, and often tedious for the analysts. Furthermore, since NDS data is exponentially growing over time, this reviewing procedure may not be viable anymore in the very near future. This study tested the hypothesis that automatic processing of driver video information could increase the correct classification of safety critical events from kinematic triggers in naturalistic driving data. Review of about 400 video sequences recorded from the events, collected by 100 Volvo cars in the euroFOT project, suggested that drivers' individual reaction may be the key to recognize safety critical events. In fact, whether an event is safety critical or not often depends on the individual driver. A few algorithms, able to automatically classify driver reaction from video data, have been compared. The results presented in this paper show that the state of the art subjective review procedures to identify safety critical events from NDS can benefit from automated objective video processing. In addition, this paper discusses the major challenges in making such video analysis viable for future NDS and new potential applications for NDS video processing. As new NDS such as SHRP2 are now providing the equivalent of five years of one vehicle data each day, the development of new methods, such as the one proposed in this paper, seems necessary to guarantee that these data can actually be analysed. Copyright © 2013 Elsevier Ltd. All rights reserved.
Not so secret agents: Event-related potentials to semantic roles in visual event comprehension.
Cohn, Neil; Paczynski, Martin; Kutas, Marta
2017-12-01
Research across domains has suggested that agents, the doers of actions, have a processing advantage over patients, the receivers of actions. We hypothesized that agents as "event builders" for discrete actions (e.g., throwing a ball, punching) build on cues embedded in their preparatory postures (e.g., reaching back an arm to throw or punch) that lead to (predictable) culminating actions, and that these cues afford frontloading of event structure processing. To test this hypothesis, we compared event-related brain potentials (ERPs) to averbal comic panels depicting preparatory agents (ex. reaching back an arm to punch) that cued specific actions with those to non-preparatory agents (ex. arm to the side) and patients that did not cue any specific actions. We also compared subsequent completed action panels (ex. agent punching patient) across conditions, where we expected an inverse pattern of ERPs indexing the differential costs of processing completed actions asa function of preparatory cues. Preparatory agents evoked a greater frontal positivity (600-900ms) relative to non-preparatory agents and patients, while subsequent completed actions panels following non-preparatory agents elicited a smaller frontal positivity (600-900ms). These results suggest that preparatory (vs. non-) postures may differentially impact the processing of agents and subsequent actions in real time. Copyright © 2017 Elsevier Inc. All rights reserved.
Event-Driven Technology to Generate Relevant Collections of Near-Realtime Data
NASA Astrophysics Data System (ADS)
Graves, S. J.; Keiser, K.; Nair, U. S.; Beck, J. M.; Ebersole, S.
2017-12-01
Getting the right data when it is needed continues to be a challenge for researchers and decision makers. Event-Driven Data Delivery (ED3), funded by the NASA Applied Science program, is a technology that allows researchers and decision makers to pre-plan what data, information and processes they need to have collected or executed in response to future events. The Information Technology and Systems Center at the University of Alabama in Huntsville (UAH) has developed the ED3 framework in collaboration with atmospheric scientists at UAH, scientists at the Geological Survey of Alabama, and other federal, state and local stakeholders to meet the data preparedness needs for research, decisions and situational awareness. The ED3 framework supports an API that supports the addition of loosely-coupled, distributed event handlers and data processes. This approach allows the easy addition of new events and data processes so the system can scale to support virtually any type of event or data process. Using ED3's underlying services, applications have been developed that monitor for alerts of registered event types and automatically triggers subscriptions that match new events, providing users with a living "album" of results that can continued to be curated as more information for an event becomes available. This capability can allow users to improve capacity for the collection, creation and use of data and real-time processes (data access, model execution, product generation, sensor tasking, social media filtering, etc), in response to disaster (and other) events by preparing in advance for data and information needs for future events. This presentation will provide an update on the ED3 developments and deployments, and further explain the applicability for utilizing near-realtime data in hazards research, response and situational awareness.
Infrasonic monitoring of snow avalanches in the Alps
NASA Astrophysics Data System (ADS)
Marchetti, E.; Ulivieri, G.; Ripepe, M.; Chiambretti, I.; Segor, V.
2012-04-01
Risk assessment of snow avalanches is mostly related to weather conditions and snow cover. However a robust risk validation requires to identify all avalanches occurring, in order to compare predictions to real effects. For this purpose on December 2010 we installed a permanent 4-element, small aperture (100 m), infrasound array in the Alps, after a pilot experiment carried out in Gressonay during the 2009-2010 winter season. The array has been deployed in the Ayas Valley, at an elevation of 2000 m a.s.l., where natural avalanches are expected and controlled events are regularly performed. The array consists into 4 Optimic 2180 infrasonic microphones, with a sensitivity of 10-3 Pa in the 0.5-50 Hz frequency band and a 4 channel Guralp CMG-DM24 A/D converter, sampling at 100 Hz. Timing is achieved with a GPS receiver. Data are transmitted to the Department of Earth Sciences of the University of Firenze, where data is recorded and processed in real-time. A multi-channel semblance is carried out on the continuous data set as a function of slowness, back-azimuth and frequency of recorded infrasound in order to detect all avalanches occurring from the back-ground signal, strongly affected by microbarom and mountain induced gravity waves. This permanent installation in Italy will allow to verify the efficiency of the system in short-to-medium range (2-8 km) avalanche detection, and might represent an important validation to model avalanches activity during this winter season. Moreover, the real-time processing of infrasonic array data, might strongly contribute to avalanche risk assessments providing an up-to-description of ongoing events.
Project Management in Real Time: A Service-Learning Project
ERIC Educational Resources Information Center
Larson, Erik; Drexler, John A., Jr.
2010-01-01
This article describes a service-learning assignment for a project management course. It is designed to facilitate hands-on student learning of both the technical and the interpersonal aspects of project management, and it involves student engagement with real customers and real stakeholders in the creation of real events with real outcomes. As…
Properties of induced seismicity at the geothermal reservoir Insheim, Germany
NASA Astrophysics Data System (ADS)
Olbert, Kai; Küperkoch, Ludger; Thomas, Meier
2017-04-01
Within the framework of the German MAGS2 Project the processing of induced events at the geothermal power plant Insheim, Germany, has been reassessed and evaluated. The power plant is located close to the western rim of the Upper Rhine Graben in a region with a strongly heterogeneous subsurface. Therefore, the location of seismic events particularly the depth estimation is challenging. The seismic network consisting of up to 50 stations has an aperture of approximately 15 km around the power plant. Consequently, the manual processing is time consuming. Using a waveform similarity detection algorithm, the existing dataset from 2012 to 2016 has been reprocessed to complete the catalog of induced seismic events. Based on the waveform similarity clusters of similar events have been detected. Automated P- and S-arrival time determination using an improved multi-component autoregressive prediction algorithm yields approximately 14.000 P- and S-arrivals for 758 events. Applying a dataset of manual picks as reference the automated picking algorithm has been optimized resulting in a standard deviation of the residuals between automated and manual picks of about 0.02s. The automated locations show uncertainties comparable to locations of the manual reference dataset. 90 % of the automated relocations fall within the error ellipsoid of the manual locations. The remaining locations are either badly resolved due to low numbers of picks or so well resolved that the automatic location is outside the error ellipsoid although located close to the manual location. The developed automated processing scheme proved to be a useful tool to supplement real-time monitoring. The event clusters are located at small patches of faults known from reflection seismic studies. The clusters are observed close to both the injection as well as the production wells.
NASA Astrophysics Data System (ADS)
Jones, K. R.; Arrowsmith, S.
2013-12-01
The Southwest U.S. Seismo-Acoustic Network (SUSSAN) is a collaborative project designed to produce infrasound event detection bulletins for the infrasound community for research purposes. We are aggregating a large, unique, near real-time data set with available ground truth information from seismo-acoustic arrays across New Mexico, Utah, Nevada, California, Texas and Hawaii. The data are processed in near real-time (~ every 20 minutes) with detections being made on individual arrays and locations determined for networks of arrays. The detection and location data are then combined with any available ground truth information and compiled into a bulletin that will be released to the general public directly and eventually through the IRIS infrasound event bulletin. We use the open source Earthworm seismic data aggregation software to acquire waveform data either directly from the station operator or via the Incorporated Research Institutions for Seismology Data Management Center (IRIS DMC), if available. The data are processed using InfraMonitor, a powerful infrasound event detection and localization software program developed by Stephen Arrowsmith at Los Alamos National Laboratory (LANL). Our goal with this program is to provide the infrasound community with an event database that can be used collaboratively to study various natural and man-made sources. We encourage participation in this program directly or by making infrasound array data available through the IRIS DMC or other means. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. R&A 5317326
Petkewich, Matthew D.; Daamen, Ruby C.; Roehl, Edwin A.; Conrads, Paul
2016-09-29
The Everglades Depth Estimation Network (EDEN), with over 240 real-time gaging stations, provides hydrologic data for freshwater and tidal areas of the Everglades. These data are used to generate daily water-level and water-depth maps of the Everglades that are used to assess biotic responses to hydrologic change resulting from the U.S. Army Corps of Engineers Comprehensive Everglades Restoration Plan. The generation of EDEN daily water-level and water-depth maps is dependent on high quality real-time data from water-level stations. Real-time data are automatically checked for outliers by assigning minimum and maximum thresholds for each station. Small errors in the real-time data, such as gradual drift of malfunctioning pressure transducers, are more difficult to immediately identify with visual inspection of time-series plots and may only be identified during on-site inspections of the stations. Correcting these small errors in the data often is time consuming and water-level data may not be finalized for several months. To provide daily water-level and water-depth maps on a near real-time basis, EDEN needed an automated process to identify errors in water-level data and to provide estimates for missing or erroneous water-level data.The Automated Data Assurance and Management (ADAM) software uses inferential sensor technology often used in industrial applications. Rather than installing a redundant sensor to measure a process, such as an additional water-level station, inferential sensors, or virtual sensors, were developed for each station that make accurate estimates of the process measured by the hard sensor (water-level gaging station). The inferential sensors in the ADAM software are empirical models that use inputs from one or more proximal stations. The advantage of ADAM is that it provides a redundant signal to the sensor in the field without the environmental threats associated with field conditions at stations (flood or hurricane, for example). In the event that a station does malfunction, ADAM provides an accurate estimate for the period of missing data. The ADAM software also is used in the quality assurance and quality control of the data. The virtual signals are compared to the real-time data, and if the difference between the two signals exceeds a certain tolerance, corrective action to the data and (or) the gaging station can be taken. The ADAM software is automated so that, each morning, the real-time EDEN data are compared to the inferential sensor signals and digital reports highlighting potential erroneous real-time data are generated for appropriate support personnel. The development and application of inferential sensors is easily transferable to other real-time hydrologic monitoring networks.
Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.
2016-08-18
This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC’s worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output.The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC’s 24/7 analyst staff with a suite of seismic analysis graphical user interfaces.In addition to the NEIC’s monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC’s quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.
Real-time fMRI processing with physiological noise correction - Comparison with off-line analysis.
Misaki, Masaya; Barzigar, Nafise; Zotev, Vadim; Phillips, Raquel; Cheng, Samuel; Bodurka, Jerzy
2015-12-30
While applications of real-time functional magnetic resonance imaging (rtfMRI) are growing rapidly, there are still limitations in real-time data processing compared to off-line analysis. We developed a proof-of-concept real-time fMRI processing (rtfMRIp) system utilizing a personal computer (PC) with a dedicated graphic processing unit (GPU) to demonstrate that it is now possible to perform intensive whole-brain fMRI data processing in real-time. The rtfMRIp performs slice-timing correction, motion correction, spatial smoothing, signal scaling, and general linear model (GLM) analysis with multiple noise regressors including physiological noise modeled with cardiac (RETROICOR) and respiration volume per time (RVT). The whole-brain data analysis with more than 100,000voxels and more than 250volumes is completed in less than 300ms, much faster than the time required to acquire the fMRI volume. Real-time processing implementation cannot be identical to off-line analysis when time-course information is used, such as in slice-timing correction, signal scaling, and GLM. We verified that reduced slice-timing correction for real-time analysis had comparable output with off-line analysis. The real-time GLM analysis, however, showed over-fitting when the number of sampled volumes was small. Our system implemented real-time RETROICOR and RVT physiological noise corrections for the first time and it is capable of processing these steps on all available data at a given time, without need for recursive algorithms. Comprehensive data processing in rtfMRI is possible with a PC, while the number of samples should be considered in real-time GLM. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Mencin, David; Meertens, Charles; Mattioli, Glen; Feaux, Karl; Looney, Sara; Sievers, Charles; Austin, Ken
2013-04-01
Recent advances in GPS technology and data processing are providing position estimates with centimeter-level precision at high-rate (1-5 Hz) and low latency (<1 s). Broad community interest in these data is growing rapidly because these data will have the potential to improve our understanding in diverse areas of geophysics including properties of seismic, volcanic, magmatic and tsunami deformation sources, and moreover profoundly transforming rapid event characterization, early warning, as well as hazard mitigation and response. Other scientific and operational applications for high-rate GPS also include glacier and ice sheet motions, tropospheric modeling, and better constraints on the dynamics of space weather. UNAVCO, through community input and the recent Plate Boundary Observatory (PBO) NSF-ARRA Cascadia initiative, has nearly completed the process of upgrading a total of 373 PBO GPS sites to real-time high-rate capability and these streams are now being archived in the UNAVCO data center. Further, through the UNAVCO core proposal (GAGE), currently under review at NSF, UNAVCO has proposed upgrading a significant portion of the ~1100 GPS stations that PBO currently operates to real-time high-rate capability to address community science and operational needs. In addition, in collaboration with NOAA, 74 of these stations will provide meteorological data in real-time, primarily to support watershed and flood analyses for regional early-warning systems related to NOAA's work with California Department of Water Resources. In preparation for this increased emphasis on high-rate GPS data, UNAVCO hosted an NSF funded workshop in Boulder, CO on March 26-28, 2012, which brought together 70 participants representing a spectrum of research fields with a goal to develop a community plan for the use of real-time GPS data products within the UNAVCO and EarthScope communities. These data products are expected to improve and expand the use of real-time, high-rate GPS data over the next decade.
UNAVCO Geodetic HIgh-Rate and Real-Time Products and Services: A next generation geodetic network
NASA Astrophysics Data System (ADS)
Mattioli, G. S.; Mencin, D.; Meertens, C. M.; Feaux, K.; Looney, S.
2012-12-01
Recent advances in GPS technology and data processing are providing position estimates with centimeter-level precision at high-rate (1 Hz) and low latency (<1 s). These data will have the potential to improve our understanding in diverse areas of geophysics including properties of seismic, volcanic, magmatic and tsunami deformation sources, and moreover profoundly transforming rapid event characterization, early warning, as well as hazard mitigation and response. Other scientific and operational applications for high-rate GPS also include glacier and ice sheet motions, tropospheric modeling, and better constraints on the dynamics of space weather. UNAVCO, through community input and the recent Plate Boundary Observatory (PBO) NSF-ARRA Cascadia initiative, has nearly completed the process of upgrading a total of 373 PBO GPS sites to real-time high-rate capability and these streams are now being archived in our data center. In addition, UNAVCO hosted an NSF funded workshop in Boulder, CO on March 26-28, which brought together 70 participants representing a spectrum of research fields with a goal to develop a community plan for the use of real-time GPS data products within the UNAVCO and EarthScope communities. These data products are expected to improve and expand the use of real-time GPS data over the next decade. Additionally, in collaboration with NOAA, 74 of these stations will provide meteorological data in real-time, primarily to support watershed and flood analyses for regional early-warning systems related to NOAA's work with California Department of Water Resources. As part of this upgrade UNAVCO is also exploring making the 75 PBO borehole strainmeter sites, whose data are now collected with a latency of 24 hours, available in SEED format in real-time in the near future, providing an opportunity to combine high-rate surface positioning and strain data together.
3-Dimensional Root Cause Diagnosis via Co-analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Ziming; Lan, Zhiling; Yu, Li
2012-01-01
With the growth of system size and complexity, reliability has become a major concern for large-scale systems. Upon the occurrence of failure, system administrators typically trace the events in Reliability, Availability, and Serviceability (RAS) logs for root cause diagnosis. However, RAS log only contains limited diagnosis information. Moreover, the manual processing is time-consuming, error-prone, and not scalable. To address the problem, in this paper we present an automated root cause diagnosis mechanism for large-scale HPC systems. Our mechanism examines multiple logs to provide a 3-D fine-grained root cause analysis. Here, 3-D means that our analysis will pinpoint the failure layer,more » the time, and the location of the event that causes the problem. We evaluate our mechanism by means of real logs collected from a production IBM Blue Gene/P system at Oak Ridge National Laboratory. It successfully identifies failure layer information for 219 failures during 23-month period. Furthermore, it effectively identifies the triggering events with time and location information, even when the triggering events occur hundreds of hours before the resulting failures.« less
Solar Energetic Particle Warnings from a Coronagraph
NASA Technical Reports Server (NTRS)
St Cyr, O. C.; Posner, A.; Burkepile, J. T.
2017-01-01
We report here the concept of using near-real time observations from a coronagraph to provide early warning of a fast coronal mass ejection (CME) and the possible onset of a solar energetic particle (SEP) event. The 1 January 2016, fast CME, and its associated SEP event are cited as an example. The CME was detected by the ground-based K-Cor coronagraph at Mauna Loa Solar Observatory and by the SOHO Large Angle and Spectrometric Coronagraph. The near-real-time availability of the high-cadence K-Cor observations in the low corona leads to an obvious question: Why has no one attempted to use a coronagraph as an early warning device for SEP events? The answer is that the low image cadence and the long latency of existing spaceborne coronagraphs make them valid for archival studies but typically unsuitable for near-real-time forecasting. The January 2016 event provided favorable CME viewing geometry and demonstrated that the primary component of a prototype ground-based system for SEP warnings is available several hours on most days. We discuss how a conceptual CME-based warning system relates to other techniques, including an estimate of the relative SEP warning times, and how such a system might be realized.
Precision Seismic Monitoring of Volcanic Eruptions at Axial Seamount
NASA Astrophysics Data System (ADS)
Waldhauser, F.; Wilcock, W. S. D.; Tolstoy, M.; Baillard, C.; Tan, Y. J.; Schaff, D. P.
2017-12-01
Seven permanent ocean bottom seismometers of the Ocean Observatories Initiative's real time cabled observatory at Axial Seamount off the coast of the western United States record seismic activity since 2014. The array captured the April 2015 eruption, shedding light on the detailed structure and dynamics of the volcano and the Juan de Fuca midocean ridge system (Wilcock et al., 2016). After a period of continuously increasing seismic activity primarily associated with the reactivation of caldera ring faults, and the subsequent seismic crisis on April 24, 2015 with 7000 recorded events that day, seismicity rates steadily declined and the array currently records an average of 5 events per day. Here we present results from ongoing efforts to automatically detect and precisely locate seismic events at Axial in real-time, providing the computational framework and fundamental data that will allow rapid characterization and analysis of spatio-temporal changes in seismogenic properties. We combine a kurtosis-based P- and S-phase onset picker and time domain cross-correlation detection and phase delay timing algorithms together with single-event and double-difference location methods to rapidly and precisely (tens of meters) compute the location and magnitudes of new events with respect to a 2-year long, high-resolution background catalog that includes nearly 100,000 events within a 5×5 km region. We extend the real-time double-difference location software DD-RT to efficiently handle the anticipated high-rate and high-density earthquake activity during future eruptions. The modular monitoring framework will allow real-time tracking of other seismic events such as tremors and sea-floor lava explosions that enable the timing and location of lava flows and thus guide response research cruises to the most interesting sites. Finally, rapid detection of eruption precursors and initiation will allow for adaptive sampling by the OOI instruments for optimal recording of future eruptions. With a higher eruption recurrence rate than land-based volcanoes the Axial OOI observatory offers the opportunity to monitor and study volcanic eruptions throughout multiple cycles.
Timing and documentation of key events in neonatal resuscitation.
Heathcote, Adam Charles; Jones, Jacqueline; Clarke, Paul
2018-04-30
Only a minority of babies require extended resuscitation at birth. Resuscitations concerning babies who die or who survive with adverse outcomes are increasingly subject to medicolegal scrutiny. Our aim was to describe real-life timings of key resuscitation events observed in a historical series of newborns who required full resuscitation at birth. Twenty-seven babies born in our centre over a 10-year period had an Apgar score of 0 at 1 min and required full resuscitation. The median (95% confidence interval) postnatal age at achieving key events were commencing cardiac compressions, 2.0 (1.5-4.0) min; endotracheal intubation, 3.8 (2.0-6.0) min; umbilical venous catheterisation 9.0 (7.5-12.0) min; and administration of first adrenaline dose 10.0 (8.0-14.0) min. The wide range of timings presented from real-life cases may prove useful to clinicians involved in medical negligence claims and provide a baseline for quality improvements in resuscitation training. What is Known: • Only a minority of babies require extended resuscitation at birth; these cases are often subject to medicolegal interrogation • Timings of key resuscitation events are poorly described and documentation of resuscitation events is often lacking yet is open to medicolegal scrutiny What is New: • We present a wide range of real-life timings of key resuscitation events during the era of routine newborn life support training • These timings may prove useful to clinicians involved in medical negligence claims and provide a baseline for quality improvements in resuscitation training.
Tracking in Real-Time Pyroclastic Flows at Soufriere Hills Volcano, Montserrat, by infrasonic array.
NASA Astrophysics Data System (ADS)
Ripepe, M.; de Angelis, S.; Lacanna, G.; Poggi, P.; Williams, C.
2008-12-01
Active volcanoes produce infrasonic airwaves, which provide valuable insight into the eruption dynamics and the level of volcanic activity. On open conduit volcanoes, infrasound can be used to monitor the gas overpressure in the magma and the degassing rate of active volcanic vents. On volcanoes characterized by dome growth, infrasound can also be generated by non-explosive sources related to dome collapses and pyroclastic flows. In March 2008, the Department of Earth Science (DST) of Firenze (Italy) in cooperation with Montserrat Volcano Observatory (MVO) has installed a small-aperture infrasonic array at a distance of ~3000 m from the dome of the Soufriere Hill Volcano (SHV). The array has an aperture of 200 m and a "star" geometry, with 3 satellite stations at 100 m distance from the receiving central station. Each element of the array is linked to the receiver station by fiber optics cable, and the signal is acquired with a resolution of 16 bits at a rate of 50 samples/sec. The data collected by the array are sent via a radio modem link to the MVO offices, on Montserrat, where they are archived and processed in real-time. Real-time location of infrasonic events are obtained and displayed on computer monitors for use in monitoring of volcanic activity. After a period of very low levels of activity, starting from the end of May 2008, SHV has produced several small explosions without any short-term precursory sign. Some of these events have generated ash plumes reaching up to a few thousands of meters above the sea level, and were accompanied by moderate-to-large size pyroclastic flows that descended the western flanks of the volcanic edifice. The array was able to detect and locate in real-time the clear infrasound associated both with the explosions and the pyroclastic flows. In the latter case, the array estimated the speed and the direction of the flux revealing the presence of several pulses within the same flow. The variable azimuth of the signal during the flow indicated a mean speed of 160-175 km/h. The ability to detect and track such events in a real-time fashion has a strong impact on understanding the dynamics of pyroclastic flow propagation as well as on monitoring operations and risk management in Montserrat.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joyce, M. J.; Aspinall, M. D.; Cave, F. D.
Pulse-shape discrimination (PSD) in fast, organic scintillation detectors is a long-established technique used to separate neutrons and {gamma} rays in mixed radiation fields. In the analogue domain the method can achieve separation in real time, but all knowledge of the pulses themselves is lost thereby preventing the possibility of any post- or repeated analysis. Also, it is typically reliant on electronic systems that are largely obsolete and which require significant experience to set up. In the digital domain, PSD is often more flexible but significant post-processing has usually been necessary to obtain neutron/{gamma}-ray separation. Moreover, the scintillation media on whichmore » the technique relies usually have a low flash point and are thus deemed hazardous. This complicates the ease with which they are used in industrial applications. In this paper, results obtained with a new portable digital pulse-shape discrimination instrument are described. This instrument provides real-time, digital neutron/{gamma} separation whilst preserving the synchronization with the time-of-arrival for each event, and realizing throughputs of 3 x 10{sup 6} events per second. Furthermore, this system has been tested with a scintillation medium that is non-flammable and not hazardous. (authors)« less
Automating the Processing of Earth Observation Data
NASA Technical Reports Server (NTRS)
Golden, Keith; Pang, Wan-Lin; Nemani, Ramakrishna; Votava, Petr
2003-01-01
NASA s vision for Earth science is to build a "sensor web": an adaptive array of heterogeneous satellites and other sensors that will track important events, such as storms, and provide real-time information about the state of the Earth to a wide variety of customers. Achieving this vision will require automation not only in the scheduling of the observations but also in the processing of the resulting data. To address this need, we are developing a planner-based agent to automatically generate and execute data-flow programs to produce the requested data products.
Real-Time Embedded High Performance Computing: Communications Scheduling.
1995-06-01
real - time operating system must explicitly limit the degradation of the timing performance of all processes as the number of processes...adequately supported by a real - time operating system , could compound the development problems encountered in the past. Many experts feel that the... real - time operating system support for an MPP, although they all provide some support for distributed real-time applications. A distributed real
Effects of global and local contexts on chord processing: An ERP study.
Zhang, Jingjing; Zhou, Xuefeng; Chang, Ruohan; Yang, Yufang
2018-01-31
In real life, the processing of an incoming event is continuously influenced by prior information at multiple timescales. The present study investigated how harmonic contexts at both local and global levels influence the processing of an incoming chord in an event-related potentials experiment. Chord sequences containing two phrases were presented to musically trained listeners, with the last critical chord either harmonically related or less related to its preceding context at local and/or global levels. ERPs data showed an ERAN-like effect for local context in early time window and a N5-like component for later interaction between the local context and global context. These results suggest that both the local and global contexts influence the processing of an incoming music event, and the local effect happens earlier than the global. Moreover, the interaction between the local context and global context in N5 may suggest that music syntactic integration at local level takes place prior to the integration at global level. Copyright © 2017 Elsevier Ltd. All rights reserved.
A new Bayesian Inference-based Phase Associator for Earthquake Early Warning
NASA Astrophysics Data System (ADS)
Meier, Men-Andrin; Heaton, Thomas; Clinton, John; Wiemer, Stefan
2013-04-01
State of the art network-based Earthquake Early Warning (EEW) systems can provide warnings for large magnitude 7+ earthquakes. Although regions in the direct vicinity of the epicenter will not receive warnings prior to damaging shaking, real-time event characterization is available before the destructive S-wave arrival across much of the strongly affected region. In contrast, in the case of the more frequent medium size events, such as the devastating 1994 Mw6.7 Northridge, California, earthquake, providing timely warning to the smaller damage zone is more difficult. For such events the "blind zone" of current systems (e.g. the CISN ShakeAlert system in California) is similar in size to the area over which severe damage occurs. We propose a faster and more robust Bayesian inference-based event associator, that in contrast to the current standard associators (e.g. Earthworm Binder), is tailored to EEW and exploits information other than only phase arrival times. In particular, the associator potentially allows for reliable automated event association with as little as two observations, which, compared to the ShakeAlert system, would speed up the real-time characterizations by about ten seconds and thus reduce the blind zone area by up to 80%. We compile an extensive data set of regional and teleseismic earthquake and noise waveforms spanning a wide range of earthquake magnitudes and tectonic regimes. We pass these waveforms through a causal real-time filterbank with passband filters between 0.1 and 50Hz, and, updating every second from the event detection, extract the maximum amplitudes in each frequency band. Using this dataset, we define distributions of amplitude maxima in each passband as a function of epicentral distance and magnitude. For the real-time data, we pass incoming broadband and strong motion waveforms through the same filterbank and extract an evolving set of maximum amplitudes in each passband. We use the maximum amplitude distributions to check whether the incoming waveforms are consistent with amplitude and frequency patterns of local earthquakes by means of a maximum likelihood approach. If such a single-station event likelihood is larger than a predefined threshold value we check whether there are neighboring stations that also have single-station event likelihoods above the threshold. If this is the case for at least one other station, we evaluate whether the respective relative arrival times are in agreement with a common earthquake origin (assuming a simple velocity model and using an Equal Differential Time location scheme). Additionally we check if there are stations where, given the preliminary location, observations would be expected but were not reported ("not-yet-arrived data"). Together, the single-station event likelihood functions and the location likelihood function constitute the multi-station event likelihood function. This function can then be combined with various types of prior information (such as station noise levels, preceding seismicity, fault proximity, etc.) to obtain a Bayesian posterior distribution, representing the degree of belief that the ensemble of the current real-time observations correspond to a local earthquake, rather than to some other signal source irrelevant for EEW. Additional to the reduction of the blind zone size, this approach facilitates the eventual development of an end-to-end probabilistic framework for an EEW system that provides systematic real-time assessment of the risk of false alerts, which enables end users of EEW to implement damage mitigation strategies only above a specified certainty level.
USGS Provision of Near Real Time Remotely Sensed Imagery for Emergency Response
NASA Astrophysics Data System (ADS)
Jones, B. K.
2014-12-01
The use of remotely sensed imagery in the aftermath of a disaster can have an important impact on the effectiveness of the response for many types of disasters such as floods, earthquakes, volcanic eruptions, landslides, and other natural or human-induced disasters. Ideally, responders in areas that are commonly affected by disasters would have access to archived remote sensing imagery plus the ability to easily obtain the new post event data products. The cost of obtaining and storing the data and the lack of trained professionals who can process the data into a mapping product oftentimes prevent this from happening. USGS Emergency Operations provides remote sensing and geospatial support to emergency managers by providing access to satellite images from numerous domestic and international space agencies including those affiliated with the International Charter Space and Major Disasters and their space-based assets and by hosting and distributing thousands of near real time event related images and map products through the Hazards Data Distribution System (HDDS). These data may include digital elevation models, hydrographic models, base satellite images, vector data layers such as roads, aerial photographs, and other pre and post disaster data. These layers are incorporated into a Web-based browser and data delivery service, the Hazards Data Distribution System (HDDS). The HDDS can be made accessible either to the general public or to specific response agencies. The HDDS concept anticipates customer requirements and provides rapid delivery of data and services. This presentation will provide an overview of remotely sensed imagery that is currently available to support emergency response operations and examples of products that have been created for past events that have provided near real time situational awareness for responding agencies.
Developing an EEG-based on-line closed-loop lapse detection and mitigation system
Wang, Yu-Te; Huang, Kuan-Chih; Wei, Chun-Shu; Huang, Teng-Yi; Ko, Li-Wei; Lin, Chin-Teng; Cheng, Chung-Kuan; Jung, Tzyy-Ping
2014-01-01
In America, 60% of adults reported that they have driven a motor vehicle while feeling drowsy, and at least 15–20% of fatal car accidents are fatigue-related. This study translates previous laboratory-oriented neurophysiological research to design, develop, and test an On-line Closed-loop Lapse Detection and Mitigation (OCLDM) System featuring a mobile wireless dry-sensor EEG headgear and a cell-phone based real-time EEG processing platform. Eleven subjects participated in an event-related lane-keeping task, in which they were instructed to manipulate a randomly deviated, fixed-speed cruising car on a 4-lane highway. This was simulated in a 1st person view with an 8-screen and 8-projector immersive virtual-reality environment. When the subjects experienced lapses or failed to respond to events during the experiment, auditory warning was delivered to rectify the performance decrements. However, the arousing auditory signals were not always effective. The EEG spectra exhibited statistically significant differences between effective and ineffective arousing signals, suggesting that EEG spectra could be used as a countermeasure of the efficacy of arousing signals. In this on-line pilot study, the proposed OCLDM System was able to continuously detect EEG signatures of fatigue, deliver arousing warning to subjects suffering momentary cognitive lapses, and assess the efficacy of the warning in near real-time to rectify cognitive lapses. The on-line testing results of the OCLDM System validated the efficacy of the arousing signals in improving subjects' response times to the subsequent lane-departure events. This study may lead to a practical on-line lapse detection and mitigation system in real-world environments. PMID:25352773
Real-Time Application of Multi-Satellite Precipitation Analysis for Floods and Landslides
NASA Technical Reports Server (NTRS)
Adler, Robert; Hong, Yang; Huffman, George
2007-01-01
Satellite data acquired and processed in real time now have the potential to provide the spacetime information on rainfall needed to monitor flood and landslide events around the world. This can be achieved by integrating the satellite-derived forcing data with hydrological models and landslide algorithms. Progress in using the TRMM Multi-satellite Precipitation Analysis (TMPA) as input to flood and landslide forecasts is outlined, with a focus on understanding limitations of the rainfall data and impacts of those limitations on flood/landslide analyses. Case studies of both successes and failures will be shown, as well as comparison with ground comparison data sets-- both in terms of rainfall and in terms of flood/landslide events. In addition to potential uses in real-time, the nearly ten years of TMPA data allow retrospective running of the models to examine variations in extreme events. The flood determination algorithm consists of four major components: 1) multi-satellite precipitation estimation; 2) characterization of land surface including digital elevation from NASA SRTM (Shuttle Radar Terrain Mission), topography-derived hydrologic parameters such as flow direction, flow accumulation, basin, and river network etc.; 3) a hydrological model to infiltrate rainfall and route overland runoff; and 4) an implementation interface to relay the input data to the models and display the flood inundation results to potential users and decision-makers, In terms of landslides, the satellite rainfall information is combined with a global landslide susceptibility map, derived from a combination of global surface characteristics (digital elevation topography, slope, soil types, soil texture, and land cover classification etc.) using a weighted linear combination approach. In those areas identified as "susceptible" (based on the surface characteristics), landslides are forecast where and when a rainfall intensity/duration threshold is exceeded. Results are described indicating general agreement with landslide occurrences.
NASA Astrophysics Data System (ADS)
Tsinganos, Kanaris; Karastathis, Vassilios K.; Kafatos, Menas; Ouzounov, Dimitar; Tselentis, Gerassimos; Papadopoulos, Gerassimos A.; Voulgaris, Nikolaos; Eleftheriou, Georgios; Mouzakiotis, Evangellos; Liakopoulos, Spyridon; Aspiotis, Theodoros; Gika, Fevronia; E Psiloglou, Basil
2017-04-01
We are presenting the first results of developing a new integrated observational site in Greece to study pre-earthquake processes in Peloponnese, lead by the National Observatory of Athens. We have developed a prototype of multiparameter network approach using an integrated system aimed at monitoring and thorough studies of pre-earthquake processes at the high seismicity area of the Western Hellenic Arc (SW Peloponnese, Greece). The initial prototype of the new observational systems consists of: (1) continuous real-time monitoring of Radon accumulation in the ground through a network of radon sensors, consisting of three gamma radiation detectors [NaI(Tl) scintillators], (2) nine-station seismic array installed to detect and locate events of low magnitude (less than 1.0 R) in the offshore area of the Hellenic arc, (3) real-time weather monitoring systems (air temperature, relative humidity, precipitation, pressure) and (4) satellite thermal radiation from AVHRR/NOAA-18 polar orbit sensing. The first few moths of operations revealed a number of pre-seismic radon variation anomalies before several earthquakes (M>3.6). The radon increases systematically before the larger events. For example a radon anomaly was predominant before the event of Sep 28, M 5.0 (36.73°N, 21.87°E), 18 km ESE of Methoni. The seismic array assists in the evaluation of current seismicity and may yield identification of foreshock activity. Thermal anomalies in satellite images are also examined as an additional tool for evaluation and verification of the Radon increase. According to the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) concept, atmospheric thermal anomalies observed before large seismic events are associated with the increase of Radon concentration on the ground. Details about the integrating ground and space observations, overall performance of the observational sites, future plans in advancing the cooperation in observations will be discussed.
NASA Astrophysics Data System (ADS)
Macpherson, K. A.
2017-12-01
The National Oceanographic and Atmospheric Administration's National and Pacific Tsunami Warning Centers currently rely on traditional seismic data in order to detect and evaluate potential tsunamigenic earthquakes anywhere on the globe. The first information products disseminated by the centers following a significant seismic event are based solely on seismically-derived earthquake locations and magnitudes, and are issued within minutes of the earthquake origin time. Thus, the rapid and reliable determination of the earthquake magnitude is a critical piece of information needed by the centers to generate the appropriate alert levels. However, seismically-derived magnitudes of large events are plagued by well-known problems, particularly during the first few minutes following the origin time; near-source broad-band instruments may go off scale, and magnitudes tend to saturate until sufficient teleseismic data arrive to represent the long-period signal that characterizes large events. However, geodetic data such as high-rate Global Positioning System (hGPS) displacements and seismogeodetic data that is a combination of collocated hGPS and accelerometer data do not suffer from these limitations. These sensors stay on scale, even for large events, and they record both dynamic and static displacements that may be used to estimate magnitude without saturation. Therefore, there is an ongoing effort to incorporate these data streams into the operations of the tsunami warning centers to enhance current magnitude determination capabilities, and eventually, to invert the geodetic displacements for mechanism and finite-fault information. These later quantities will be useful for tsunami modeling and forecasting. The tsunami warning centers rely on the Earthworm system for real-time data acquisition, so we have developed Earthworm modules for the Magnitude from Peak Ground Displacement (MPGD) algorithm, developed at the University of Washington and the University of California, Berkeley, and a module for a Static Offset Estimator algorithm that was developed by the NASA Jet Propulsion Laboratory. In this presentation we will discuss module architecture and show output computed by replaying both synthetic and historical scenarios in a simulated real-time Earthworm environment.
TiD-Introducing and Benchmarking an Event-Delivery System for Brain-Computer Interfaces.
Breitwieser, Christian; Tavella, Michele; Schreuder, Martijn; Cincotti, Febo; Leeb, Robert; Muller-Putz, Gernot R
2017-12-01
In this paper, we present and analyze an event distribution system for brain-computer interfaces. Events are commonly used to mark and describe incidents during an experiment and are therefore critical for later data analysis or immediate real-time processing. The presented approach, called Tools for brain-computer interaction interface D (TiD), delivers messages in XML format via a buslike system using transmission control protocol connections or shared memory. A dedicated server dispatches TiD messages to distributed or local clients. The TiD message is designed to be flexible and contains time stamps for event synchronization, whereas events describe incidents, which occur during an experiment. TiD was tested extensively toward stability and latency. The effect of an occurring event jitter was analyzed and benchmarked on a reference implementation under different conditions as gigabit and 100-Mb Ethernet or Wi-Fi with a different number of event receivers. A 3-dB signal attenuation, which occurs when averaging jitter influenced trials aligned by events, is starting to become visible at around 1-2 kHz in the case of a gigabit connection. Mean event distribution times across operating systems are ranging from 0.3 to 0.5ms for a gigabit network connection for 10 6 events. Results for other environmental conditions are available in this paper. References already using TiD for event distribution are provided showing the applicability of TiD for event delivery with distributed or local clients.
Adaptive lesion formation using dual mode ultrasound array system
NASA Astrophysics Data System (ADS)
Liu, Dalong; Casper, Andrew; Haritonova, Alyona; Ebbini, Emad S.
2017-03-01
We present the results from an ultrasound-guided focused ultrasound platform designed to perform real-time monitoring and control of lesion formation. Real-time signal processing of echogenicity changes during lesion formation allows for identification of signature events indicative of tissue damage. The detection of these events triggers the cessation or the reduction of the exposure (intensity and/or time) to prevent overexposure. A dual mode ultrasound array (DMUA) is used for forming single- and multiple-focus patterns in a variety of tissues. The DMUA approach allows for inherent registration between the therapeutic and imaging coordinate systems providing instantaneous, spatially-accurate feedback on lesion formation dynamics. The beamformed RF data has been shown to have high sensitivity and specificity to tissue changes during lesion formation, including in vivo. In particular, the beamformed echo data from the DMUA is very sensitive to cavitation activity in response to HIFU in a variety of modes, e.g. boiling cavitation. This form of feedback is characterized by sudden increase in echogenicity that could occur within milliseconds of the application of HIFU (see http://youtu.be/No2wh-ceTLs for an example). The real-time beamforming and signal processing allowing the adaptive control of lesion formation is enabled by a high performance GPU platform (response time within 10 msec). We present results from a series of experiments in bovine cardiac tissue demonstrating the robustness and increased speed of volumetric lesion formation for a range of clinically-relevant exposures. Gross histology demonstrate clearly that adaptive lesion formation results in tissue damage consistent with the size of the focal spot and the raster scan in 3 dimensions. In contrast, uncontrolled volumetric lesions exhibit significant pre-focal buildup due to excessive exposure from multiple full-exposure HIFU shots. Stopping or reducing the HIFU exposure upon the detection of such an events has been shown to produce precisely controlled lesions with no evidence of overexposure even when fast raster scan of volumetric HIFU lesion is attempted. We also show that the DMUA beamformed echo data is capable of detecting underexposure condition at the target location, e.g. due to the obstruction of the HIFU beam resulting from cavitation activity in the path of the beam. The results clearly demonstrate the advantage of adaptive lesion formation in reducing the treatment time while confining the tissue damage to the target volume.
Camuñas-Mesa, Luis A; Domínguez-Cordero, Yaisel L; Linares-Barranco, Alejandro; Serrano-Gotarredona, Teresa; Linares-Barranco, Bernabé
2018-01-01
Convolutional Neural Networks (ConvNets) are a particular type of neural network often used for many applications like image recognition, video analysis or natural language processing. They are inspired by the human brain, following a specific organization of the connectivity pattern between layers of neurons known as receptive field. These networks have been traditionally implemented in software, but they are becoming more computationally expensive as they scale up, having limitations for real-time processing of high-speed stimuli. On the other hand, hardware implementations show difficulties to be used for different applications, due to their reduced flexibility. In this paper, we propose a fully configurable event-driven convolutional node with rate saturation mechanism that can be used to implement arbitrary ConvNets on FPGAs. This node includes a convolutional processing unit and a routing element which allows to build large 2D arrays where any multilayer structure can be implemented. The rate saturation mechanism emulates the refractory behavior in biological neurons, guaranteeing a minimum separation in time between consecutive events. A 4-layer ConvNet with 22 convolutional nodes trained for poker card symbol recognition has been implemented in a Spartan6 FPGA. This network has been tested with a stimulus where 40 poker cards were observed by a Dynamic Vision Sensor (DVS) in 1 s time. Different slow-down factors were applied to characterize the behavior of the system for high speed processing. For slow stimulus play-back, a 96% recognition rate is obtained with a power consumption of 0.85 mW. At maximum play-back speed, a traffic control mechanism downsamples the input stimulus, obtaining a recognition rate above 63% when less than 20% of the input events are processed, demonstrating the robustness of the network.
Camuñas-Mesa, Luis A.; Domínguez-Cordero, Yaisel L.; Linares-Barranco, Alejandro; Serrano-Gotarredona, Teresa; Linares-Barranco, Bernabé
2018-01-01
Convolutional Neural Networks (ConvNets) are a particular type of neural network often used for many applications like image recognition, video analysis or natural language processing. They are inspired by the human brain, following a specific organization of the connectivity pattern between layers of neurons known as receptive field. These networks have been traditionally implemented in software, but they are becoming more computationally expensive as they scale up, having limitations for real-time processing of high-speed stimuli. On the other hand, hardware implementations show difficulties to be used for different applications, due to their reduced flexibility. In this paper, we propose a fully configurable event-driven convolutional node with rate saturation mechanism that can be used to implement arbitrary ConvNets on FPGAs. This node includes a convolutional processing unit and a routing element which allows to build large 2D arrays where any multilayer structure can be implemented. The rate saturation mechanism emulates the refractory behavior in biological neurons, guaranteeing a minimum separation in time between consecutive events. A 4-layer ConvNet with 22 convolutional nodes trained for poker card symbol recognition has been implemented in a Spartan6 FPGA. This network has been tested with a stimulus where 40 poker cards were observed by a Dynamic Vision Sensor (DVS) in 1 s time. Different slow-down factors were applied to characterize the behavior of the system for high speed processing. For slow stimulus play-back, a 96% recognition rate is obtained with a power consumption of 0.85 mW. At maximum play-back speed, a traffic control mechanism downsamples the input stimulus, obtaining a recognition rate above 63% when less than 20% of the input events are processed, demonstrating the robustness of the network. PMID:29515349
NASA Astrophysics Data System (ADS)
Souvatzoglou, G.; Papaioannou, A.; Mavromichalaki, H.; Dimitroulakos, J.; Sarlanis, C.
2014-11-01
Whenever a significant intensity increase is being recorded by at least three neutron monitor stations in real-time mode, a ground level enhancement (GLE) event is marked and an automated alert is issued. Although, the physical concept of the algorithm is solid and has efficiently worked in a number of cases, the availability of real-time data is still an open issue and makes timely GLE alerts quite challenging. In this work we present the optimization of the GLE alert that has been set into operation since 2006 at the Athens Neutron Monitor Station. This upgrade has led to GLE Alert Plus, which is currently based upon the Neutron Monitor Database (NMDB). We have determined the critical values per station allowing us to issue reliable GLE alerts close to the initiation of the event while at the same time we keep the false alert rate at low levels. Furthermore, we have managed to treat the problem of data availability, introducing the Go-Back-N algorithm. A total of 13 GLE events have been marked from January 2000 to December 2012. GLE Alert Plus issued an alert for 12 events. These alert times are compared to the alert times of GOES Space Weather Prediction Center and Solar Energetic Particle forecaster of the University of Málaga (UMASEP). In all cases GLE Alert Plus precedes the GOES alert by ≈8-52 min. The comparison with UMASEP demonstrated a remarkably good agreement. Real-time GLE alerts by GLE Alert Plus may be retrieved by http://cosray.phys.uoa.gr/gle_alert_plus.html, http://www.nmdb.eu, and http://swe.ssa.esa.int/web/guest/space-radiation. An automated GLE alert email notification system is also available to interested users.
NASA Astrophysics Data System (ADS)
Bouchard, R.; Locke, L.; Hansen, W.; Collins, S.; McArthur, S.
2007-12-01
DART systems are a critical component of the tsunami warning system as they provide the only real-time, in situ, tsunami detection before landfall. DART systems consist of a surface buoy that serves as a position locater and communications transceiver and a Bottom Pressure Recorder (BPR) on the seafloor. The BPR records temperature and pressure at 15-second intervals to a memory card for later retrieval for analysis and use by tsunami researchers, but the BPRs are normally recovered only once every two years. The DART systems also transmit subsets of the data, converted to an estimation of the sea surface height, in near real-time for use by the tsunami warning community. These data are available on NDBC's webpages, http://www.ndbc.noaa.gov/dart.shtml. Although not of the resolution of the data recorded to the BPR memory card, the near real-time data have proven to be of value in research applications [1]. Of particular interest are the DART data associated with geophysical events. The DART BPR continuously compares the measured sea height with a predicted sea-height and when the difference exceeds a threshold value, the BPR goes into Event Mode. Event Mode provides an extended, more frequent near real-time reporting of the sea surface heights for tsunami detection. The BPR can go into Event Mode because of geophysical triggers, such as tsunamis or seismic activity, which may or may not be tsunamigenic. The BPR can also go into Event Mode during recovery of the BPR as it leaves the seafloor, or when manually triggered by the Tsunami Warning Centers in advance of an expected tsunami. On occasion, the BPR will go into Event Mode without any associated tsunami or seismic activity or human intervention and these are considered "False'' Events. Approximately one- third of all Events can be classified as "False". NDBC is responsible for the operations, maintenance, and data management of the DART stations. Each DART station has a webpage with a drop-down list of all Events. NDBC maintains the non-geophysical Events in order to maintain the continuity of the time series records. In 2007, NDBC compiled all DART Events that occurred while under NDBC's operational control and made an assessment on their validity. The NDBC analysts performed the assessment using the characteristics of the data time series, triggering criteria, and associated seismic events. The compilation and assessments are catalogued in a NDBC technical document. The Catalog also includes a listing of the one-hour, high-resolution data, retrieved remotely from the BPRs that are not available on the web pages. The Events are classified by their triggering mechanism and listed by station location and, for those Events associated with geophysical triggers, they are listed by their associated seismic events. The Catalog provides researchers with a valuable tool in locating, assessing, and applying near real-time DART data to tsunami research and will be updated following DART Events. A link to the published Catalog can be found on the NDBC DART website, http://www.ndbc.noaa.gov/dart.shtml. Reference: [1] Gower, J. and F. González (2006), U.S. Warning System Detected the Sumatra Tsunami, Eos Trans. AGU, 87(10), 105-112.
Next generation PET data acquisition architectures
NASA Astrophysics Data System (ADS)
Jones, W. F.; Reed, J. H.; Everman, J. L.; Young, J. W.; Seese, R. D.
1997-06-01
New architectures for higher performance data acquisition in PET are proposed. Improvements are demanded primarily by three areas of advancing PET state of the art. First, larger detector arrays such as the Hammersmith ECAT/sup (R/) EXACT HR/sup ++/ exceed the addressing capacity of 32 bit coincidence event words. Second, better scintillators (LSO) make depth-of interaction (DOI) and time-of-flight (TOF) operation more practical. Third, fully optimized single photon attenuation correction requires higher rates of data collection. New technologies which enable the proposed third generation Real Time Sorter (RTS III) include: (1) 80 Mbyte/sec Fibre Channel RAID disk systems, (2) PowerPC on both VMEbus and PCI Local bus, and (3) quadruple interleaved DRAM controller designs. Data acquisition flexibility is enhanced through a wider 64 bit coincidence event word. PET methodology support includes DOI (6 bits), TOF (6 bits), multiple energy windows (6 bits), 512/spl times/512 sinogram indexes (18 bits), and 256 crystal rings (16 bits). Throughput of 10 M events/sec is expected for list-mode data collection as well as both on-line and replay histogramming. Fully efficient list-mode storage for each PET application is provided by real-time bit packing of only the active event word bits. Real-time circuits provide DOI rebinning.
Some comments on Dr Iglesias's paper, 'In vitro fertilisation: the major issues'.
Mill, J M
1986-01-01
In an article in an earlier edition of the Journal of Medical Ethics (1) Dr Iglesias bases her analysis upon the mediaeval interpretation of Platonic metaphysics and Aristotelian logic as given by Aquinas. Propositional forms are applied to the analysis of experience. This results in a very abstract analysis. The essential connection of events and their changing temporal relationships are ignored. The dichotomy between body and soul is a central concept. The unchanging elements in experience are assumed to be more real than the actual world of experienced process. Such a view makes the analysis of the temporal factors in experience impossible. Its abstractness is quite unsuitable for the analysis of the ontological structure and development of the neonate from fertilisation to birth. A N Whitehead made the notion of organism central to his philosophy. He refused to place human experience outside nature, or admit dualism. His philosophy of organism is an attempt to uncover the essential elements connecting human experience with the physical and biological sciences. Time, change and process are, in his view, more real than the static abstractions obtainable by the use of the fallacy of misplaced concreteness. Use of the latter negates the essential connectedness of events and the importance of temporarily and change (2). In this paper I argue that the embryo, being an organism, is not analysable in terms of thinghood. It is a process. To apply Aristotelian logical concepts to it is to distort the real nature of the datum. PMID:3959039
Simulating recurrent event data with hazard functions defined on a total time scale.
Jahn-Eimermacher, Antje; Ingel, Katharina; Ozga, Ann-Kathrin; Preussler, Stella; Binder, Harald
2015-03-08
In medical studies with recurrent event data a total time scale perspective is often needed to adequately reflect disease mechanisms. This means that the hazard process is defined on the time since some starting point, e.g. the beginning of some disease, in contrast to a gap time scale where the hazard process restarts after each event. While techniques such as the Andersen-Gill model have been developed for analyzing data from a total time perspective, techniques for the simulation of such data, e.g. for sample size planning, have not been investigated so far. We have derived a simulation algorithm covering the Andersen-Gill model that can be used for sample size planning in clinical trials as well as the investigation of modeling techniques. Specifically, we allow for fixed and/or random covariates and an arbitrary hazard function defined on a total time scale. Furthermore we take into account that individuals may be temporarily insusceptible to a recurrent incidence of the event. The methods are based on conditional distributions of the inter-event times conditional on the total time of the preceeding event or study start. Closed form solutions are provided for common distributions. The derived methods have been implemented in a readily accessible R script. The proposed techniques are illustrated by planning the sample size for a clinical trial with complex recurrent event data. The required sample size is shown to be affected not only by censoring and intra-patient correlation, but also by the presence of risk-free intervals. This demonstrates the need for a simulation algorithm that particularly allows for complex study designs where no analytical sample size formulas might exist. The derived simulation algorithm is seen to be useful for the simulation of recurrent event data that follow an Andersen-Gill model. Next to the use of a total time scale, it allows for intra-patient correlation and risk-free intervals as are often observed in clinical trial data. Its application therefore allows the simulation of data that closely resemble real settings and thus can improve the use of simulation studies for designing and analysing studies.
Maturation of Structural Health Management Systems for Solid Rocket Motors
NASA Technical Reports Server (NTRS)
Quing, Xinlin; Beard, Shawn; Zhang, Chang
2011-01-01
Concepts of an autonomous and automated space-compliant diagnostic system were developed for conditioned-based maintenance (CBM) of rocket motors for space exploration vehicles. The diagnostic system will provide real-time information on the integrity of critical structures on launch vehicles, improve their performance, and greatly increase crew safety while decreasing inspection costs. Using the SMART Layer technology as a basis, detailed procedures and calibration techniques for implementation of the diagnostic system were developed. The diagnostic system is a distributed system, which consists of a sensor network, local data loggers, and a host central processor. The system detects external impact to the structure. The major functions of the system include an estimate of impact location, estimate of impact force at impacted location, and estimate of the structure damage at impacted location. This system consists of a large-area sensor network, dedicated multiple local data loggers with signal processing and data analysis software to allow for real-time, in situ monitoring, and longterm tracking of structural integrity of solid rocket motors. Specifically, the system could provide easy installation of large sensor networks, onboard operation under harsh environments and loading, inspection of inaccessible areas without disassembly, detection of impact events and impact damage in real-time, and monitoring of a large area with local data processing to reduce wiring.
Real Time Global Tests of the ALICE High Level Trigger Data Transport Framework
NASA Astrophysics Data System (ADS)
Becker, B.; Chattopadhyay, S.; Cicalo, C.; Cleymans, J.; de Vaux, G.; Fearick, R. W.; Lindenstruth, V.; Richter, M.; Rohrich, D.; Staley, F.; Steinbeck, T. M.; Szostak, A.; Tilsner, H.; Weis, R.; Vilakazi, Z. Z.
2008-04-01
The High Level Trigger (HLT) system of the ALICE experiment is an online event filter and trigger system designed for input bandwidths of up to 25 GB/s at event rates of up to 1 kHz. The system is designed as a scalable PC cluster, implementing several hundred nodes. The transport of data in the system is handled by an object-oriented data flow framework operating on the basis of the publisher-subscriber principle, being designed fully pipelined with lowest processing overhead and communication latency in the cluster. In this paper, we report the latest measurements where this framework has been operated on five different sites over a global north-south link extending more than 10,000 km, processing a ldquoreal-timerdquo data flow.
A novel real-time health monitoring system for unmanned vehicles
NASA Astrophysics Data System (ADS)
Zhang, David C.; Ouyang, Lien; Qing, Peter; Li, Irene
2008-04-01
Real-time monitoring the status of in-service structures such as unmanned vehicles can provide invaluable information to detect the damages to the structures on time. The unmanned vehicles can be maintained and repaired in time if such damages are found. One typical cause of damages of unmanned vehicles is from impacts caused by bumping into some obstacles or being hit by some objects such as hostile fire. This paper introduces a novel impact event sensing system that can detect the location of the impact events and the force-time history of the impact events. The system consists of the Piezo-electric sensor network, the hardware platform and the analysis software. The new customized battery-powered impact event sensing system supports up to 64-channel parallel data acquisition. It features an innovative low-power hardware trigger circuit that monitors 64 channels simultaneously. The system is in the sleep mode most of the time. When an impact event happens, the system will wake up in micro-seconds and detect the impact location and corresponding force-time history. The system can be combined with the SMART sensing system to further evaluate the impact damage severity.
The First Ground-Level Enhancement of Solar Cycle 24 on 17 May 2012 and Its Real-Time Detection
NASA Astrophysics Data System (ADS)
Papaioannou, A.; Souvatzoglou, G.; Paschalis, P.; Gerontidou, M.; Mavromichalaki, H.
2014-01-01
Ground-level enhancements (GLEs) are defined as sudden increases in the recorded intensity of cosmic-ray particles, usually by neutron monitors (NMs). In this work we present a time-shifting analysis (TSA) for the first arriving particles that were detected at Earth by NMs. We also present an automated real-time GLE alert that has been developed and is operating via the Neutron Monitor Database (NMDB), which successfully identified the 17 May 2012 event, designated as GLE71. We discuss the time evolution of the real-time GLE alert that was issued for GLE71 and present the event onset-time for NMs that contributed to this GLE alert based on their archived data. A comparison with their real-time time-stamp was made to illustrate the necessity for high-resolution data ( e.g. 1-min time resolution) made available at every minute. The first results on the propagation of relativistic protons that have been recorded by NMs, as inferred by the TSA, imply that they are most probably accelerated by the coronal-mass-ejection-driven shock. Furthermore, the successful usage of NM data and the corresponding achievement of issuing a timely GLE alert are discussed.
ERIC Educational Resources Information Center
Hare, Mary; Jones, Michael; Thomson, Caroline; Kelly, Sarah; McRae, Ken
2009-01-01
An increasing number of results in sentence and discourse processing demonstrate that comprehension relies on rich pragmatic knowledge about real-world events, and that incoming words incrementally activate such knowledge. If so, then even outside of any larger context, nouns should activate knowledge of the generalized events that they denote or…
Concurrent systems and time synchronization
NASA Astrophysics Data System (ADS)
Burgin, Mark; Grathoff, Annette
2018-05-01
In the majority of scientific fields, system dynamics is described assuming existence of unique time for the whole system. However, it is established theoretically, for example, in relativity theory or in the system theory of time, and validated experimentally that there are different times and time scales in a variety of real systems - physical, chemical, biological, social, etc. In spite of this, there are no wide-ranging scientific approaches to exploration of such systems. Therefore, the goal of this paper is to study systems with this property. We call them concurrent systems because processes in them can go, events can happen and actions can be performed in different time scales. The problem of time synchronization is specifically explored.
BENEFITS OF SEWERAGE SYSTEM REAL-TIME CONTROL
Real-time control (RTC) is a custom-designed computer-assisted management system for a specific urban sewerage network that is activated during a wet-weather flow event. Though uses of RTC systems had started in the mid 60s, recent developments in computers, telecommunication, in...
REAL-TIME REMOTE MONITORING OF DRINKING WATER QUALITY
Over the past eight years, the U.S. Environmental Protection Agency's (EPA) Office of Research and Development (ORD) has funded the testing and evaluation of various online "real-time" technologies for monitoring drinking water quality. The events of 9/11 and subsequent threats t...
REAL-TIME CONTROL OF COMBINED SEWER NETWORKS
Real-time control (RTC) is a custom-designed management program for a specific urban sewerage system during a wet-weather event. The function of RTC is to assure efficient operation of the sewerage system and maximum utilization of existing storage capacity, either to fully conta...
NASA Astrophysics Data System (ADS)
Schröter, Kai; Elmer, Florian; Trieselmann, Werner; Kreibich, Heidi; Kunz, Michael; Khazai, Bijan; Dransch, Doris; Wenzel, Friedemann; Zschau, Jochen; Merz, Bruno; Mühr, Bernhard; Kunz-Plapp, Tina; Möhrle, Stella; Bessel, Tina; Fohringer, Joachim
2014-05-01
The Central European flood of June 2013 is one of the most severe flood events that have occurred in Central Europe in the past decades. All major German river basins were affected (Rhine, Danube, and Elbe as well as the smaller Weser catchment).In terms of spatial extent and event magnitude, it was the most severe event at least since 1950. Within the current research focus on near real time forensic disaster analysis, the Center for Disaster Management and Risk Reduction Technology (CEDIM) assessed and analysed the multiple facets of the flood event from the beginning. The aim is to describe the on-going event, analyse the event sources, link the physical characteristics to the impact and consequences of the event and to understand the root causes that turn the physical event into a disaster (or prevent it from becoming disastrous). For the near real time component of this research, tools for rapid assessment and concise presentation of analysis results are essential. This contribution provides a graphical summary of the results of the CEDIM-FDA analyses on the June 2013 flood. It demonstrates the potential of visual representations for improving the communication and hence usability of findings in a rapid, intelligible and expressive way as a valuable supplement to usual event reporting. It is based on analyses of the hydrometeorological sources, the flood pathways (from satellite imagery, data extraction from social media), the resilience of the affected regions, and causal loss analysis. The prototypical representation of the FDA-results for the June 2013 flood provides an important step in the development of graphical event templates for the visualisation of forensic disaster analyses. These are intended to become a standard component of future CEDIM-FDA event activities.
Jacchia, Sara; Nardini, Elena; Savini, Christian; Petrillo, Mauro; Angers-Loustau, Alexandre; Shim, Jung-Hyun; Trijatmiko, Kurniawan; Kreysa, Joachim; Mazzara, Marco
2015-02-18
In this study, we developed, optimized, and in-house validated a real-time PCR method for the event-specific detection and quantification of Golden Rice 2, a genetically modified rice with provitamin A in the grain. We optimized and evaluated the performance of the taxon (targeting rice Phospholipase D α2 gene)- and event (targeting the 3' insert-to-plant DNA junction)-specific assays that compose the method as independent modules, using haploid genome equivalents as unit of measurement. We verified the specificity of the two real-time PCR assays and determined their dynamic range, limit of quantification, limit of detection, and robustness. We also confirmed that the taxon-specific DNA sequence is present in single copy in the rice genome and verified its stability of amplification across 132 rice varieties. A relative quantification experiment evidenced the correct performance of the two assays when used in combination.
A new prototype system for earthquake early warning in Taiwan
NASA Astrophysics Data System (ADS)
Hsiao, N.; Wu, Y.; Chen, D.; Kuo, K.; Shin, T.
2009-12-01
Earthquake early warning (EEW) system has already been developed and tested in Taiwan for more than ten years. With the implementation of a real-time strong-motion network by the Central Weather Bureau (CWB), a virtual sub-network (VSN) system based on regional early warning approach was utilized at the first attempt. In order to shorten the processing time, seismic waveforms in a 10-sec time window starting from the first P-wave arrival time at the nearest station are used to determine the hypocenter and earthquake magnitude which is dubbed ML10. Since 2001, this EEW system has responded to a total of 255 events with magnitude greater than 4.5 occurred inland or off the coast of Taiwan. The system is capable of issuing an earthquake report within 20 sec of its occurrence with good magnitude estimations for events up to magnitude 6.5. This will provide early warning for metropolitan areas located 70 km away from the epicentre. In the latest development, a new prototype EEW system based on P-wave method was developed. Instead of ML10, we adopt the “Pd magnitude”, MPd, as our magnitude indicator in the new system. Pd is defined as the peak amplitude of the initial P-wave displacement. In the previous studies, by analyzing the Pd attenuation relationship with earthquake magnitudes, Pd was proved to be a good magnitude estimator for EEW purpose. Therefore, we adopt the Pd magnitude in developing our next generation EEW system. The new system is designed and constructed based on the Central Weather Bureau Seismographic Network (CWBSN). The CWBSN is a real-time seismographic network with more than one hundred digital telemetered seismic stations distributed over the entire Taiwan. Currently, there are three types of seismic instruments installed at the stations, either co-site or separately installed, including short-period seismographs, accelerometers, and broadband instruments. For the need of integral data processing, we use the Earthworm system as a common platform to integrate all real-time signals. In the process, strong-motion and broadband signals are used for automatic P-wave arrival time and Pd determination. However, short-period signals are only used for P-wave arrival time picking. This new system is still under development and being improved, with the hope of replacing the current operational EEW system in the future.
Seismic Linear Noise Attenuation with Use of Radial Transform
NASA Astrophysics Data System (ADS)
Szymańska-Małysa, Żaneta
2018-03-01
One of the goals of seismic data processing is to attenuate the recorded noise in order to enable correct interpretation of the image. Radial transform has been used as a very effective tool in the attenuation of various types of linear noise, both numerical and real (such as ground roll, direct waves, head waves, guided waves etc). The result of transformation from offset - time (X - T) domain into apparent velocity - time (R - T) domain is frequency separation between reflections and linear events. In this article synthetic and real seismic shot gathers were examined. One example was targeted at far offset area of dataset where reflections and noise had similar apparent velocities and frequency bands. Another example was a result of elastic modelling where linear artefacts were produced. Bandpass filtering and scaling operation executed in radial domain attenuated all discussed types of linear noise very effectively. After noise reduction all further processing steps reveal better results, especially velocity analysis, migration and stacking. In all presented cases signal-to-noise ratio was significantly increased and reflections covered previously by noise were revealed. Power spectra of filtered seismic records preserved real dynamics of reflections.
Akiyama, Hiroshi; Sakata, Kozue; Makiyma, Daiki; Nakamura, Kosuke; Teshima, Reiko; Nakashima, Akie; Ogawa, Asako; Yamagishi, Toru; Futo, Satoshi; Oguchi, Taichi; Mano, Junichi; Kitta, Kazumi
2011-01-01
In many countries, the labeling of grains, feed, and foodstuff is mandatory if the genetically modified (GM) organism content exceeds a certain level of approved GM varieties. We previously developed an individual kernel detection system consisting of grinding individual kernels, DNA extraction from the individually ground kernels, GM detection using multiplex real-time PCR, and GM event detection using multiplex qualitative PCR to analyze the precise commingling level and varieties of GM maize in real sample grains. We performed the interlaboratory study of the DNA extraction with multiple ground samples, multiplex real-time PCR detection, and multiplex qualitative PCR detection to evaluate its applicability, practicality, and ruggedness for the individual kernel detection system of GM maize. DNA extraction with multiple ground samples, multiplex real-time PCR, and multiplex qualitative PCR were evaluated by five laboratories in Japan, and all results from these laboratories were consistent with the expected results in terms of the commingling level and event analysis. Thus, the DNA extraction with multiple ground samples, multiplex real-time PCR, and multiplex qualitative PCR for the individual kernel detection system is applicable and practicable in a laboratory to regulate the commingling level of GM maize grain for GM samples, including stacked GM maize.
Sudell, Maria; Tudur Smith, Catrin; Gueyffier, François; Kolamunnage-Dona, Ruwanthi
2018-04-15
Joint modelling of longitudinal and time-to-event data is often preferred over separate longitudinal or time-to-event analyses as it can account for study dropout, error in longitudinally measured covariates, and correlation between longitudinal and time-to-event outcomes. The joint modelling literature focuses mainly on the analysis of single studies with no methods currently available for the meta-analysis of joint model estimates from multiple studies. We propose a 2-stage method for meta-analysis of joint model estimates. These methods are applied to the INDANA dataset to combine joint model estimates of systolic blood pressure with time to death, time to myocardial infarction, and time to stroke. Results are compared to meta-analyses of separate longitudinal or time-to-event models. A simulation study is conducted to contrast separate versus joint analyses over a range of scenarios. Using the real dataset, similar results were obtained by using the separate and joint analyses. However, the simulation study indicated a benefit of use of joint rather than separate methods in a meta-analytic setting where association exists between the longitudinal and time-to-event outcomes. Where evidence of association between longitudinal and time-to-event outcomes exists, results from joint models over standalone analyses should be pooled in 2-stage meta-analyses. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
West-Coast Wide Expansion and Testing of the Geodetic Alarm System (G-larmS)
NASA Astrophysics Data System (ADS)
Ruhl, C. J.; Grapenthin, R.; Melgar, D.; Aranha, M. A.; Allen, R. M.
2016-12-01
The Geodetic Alarm System (G-larmS) was developed in collaboration between the Berkeley Seismological Laboratory (BSL) and New Mexico Tech for real-time Earthquake Early Warning (EEW). G-larmS has been in continuous operation at the BSL since 2014 using event triggers from the ShakeAlert EEW system and real-time position time series from a fully triangulated network consisting of BARD, PBO and USGS stations across northern California (CA). G-larmS has been extended to include southern CA and Cascadia, providing continuous west-coast wide coverage. G-larmS currently uses high rate (1 Hz), low latency (< 5 s), accurate positioning (cm level) time series data from a regional GPS network and P-wave event triggers from the ShakeAlert EEW system. It extracts static offsets from real-time GPS time series upon S-wave arrival and performs a least squares inversion on these offsets to determine slip on a finite fault. A key issue with geodetic EEW approaches is that unlike seismology-based algorithms that are routinely tested using frequent small-magnitude events, geodetic systems are not regularly exercised. Scenario ruptures are therefore important for testing the performance of G-larmS. We discuss results from scenario events on several large faults (capable of M>6.5) in CA and Cascadia built from realistic 3D geometries. Synthetic long-period 1Hz displacement waveforms were obtained from a new stochastic kinematic slip distribution generation method. Waveforms are validated by direct comparison to peak P-wave displacement scaling laws and to PGD GMPEs obtained from high-rate GPS observations of large events worldwide. We run the scenarios on real-time streams to systematically test the recovery of slip and magnitude by G-larmS. In addition to presenting these results, we will discuss new capabilities, such as implementing 2D geometry and the applicability of these results to GPS enhanced tsunami warning systems.
NASA Astrophysics Data System (ADS)
Tong, Qiujie; Wang, Qianqian; Li, Xiaoyang; Shan, Bin; Cui, Xuntai; Li, Chenyu; Peng, Zhong
2016-11-01
In order to satisfy the requirements of the real-time and generality, a laser target simulator in semi-physical simulation system based on RTX+LabWindows/CVI platform is proposed in this paper. Compared with the upper-lower computers simulation platform architecture used in the most of the real-time system now, this system has better maintainability and portability. This system runs on the Windows platform, using Windows RTX real-time extension subsystem to ensure the real-time performance of the system combining with the reflective memory network to complete some real-time tasks such as calculating the simulation model, transmitting the simulation data, and keeping real-time communication. The real-time tasks of simulation system run under the RTSS process. At the same time, we use the LabWindows/CVI to compile a graphical interface, and complete some non-real-time tasks in the process of simulation such as man-machine interaction, display and storage of the simulation data, which run under the Win32 process. Through the design of RTX shared memory and task scheduling algorithm, the data interaction between the real-time tasks process of RTSS and non-real-time tasks process of Win32 is completed. The experimental results show that this system has the strongly real-time performance, highly stability, and highly simulation accuracy. At the same time, it also has the good performance of human-computer interaction.
Real-Time Nonlinear Optical Information Processing.
1979-06-01
operations aree presented. One approach realizes the halftone method of nonlinear optical processing in real time by replacing the conventional...photographic recording medium with a real-time image transducer. In the second approach halftoning is eliminated and the real-time device is used directly
Multibeam Gpu Transient Pipeline for the Medicina BEST-2 Array
NASA Astrophysics Data System (ADS)
Magro, A.; Hickish, J.; Adami, K. Z.
2013-09-01
Radio transient discovery using next generation radio telescopes will pose several digital signal processing and data transfer challenges, requiring specialized high-performance backends. Several accelerator technologies are being considered as prototyping platforms, including Graphics Processing Units (GPUs). In this paper we present a real-time pipeline prototype capable of processing multiple beams concurrently, performing Radio Frequency Interference (RFI) rejection through thresholding, correcting for the delay in signal arrival times across the frequency band using brute-force dedispersion, event detection and clustering, and finally candidate filtering, with the capability of persisting data buffers containing interesting signals to disk. This setup was deployed at the BEST-2 SKA pathfinder in Medicina, Italy, where several benchmarks and test observations of astrophysical transients were conducted. These tests show that on the deployed hardware eight 20 MHz beams can be processed simultaneously for 640 Dispersion Measure (DM) values. Furthermore, the clustering and candidate filtering algorithms employed prove to be good candidates for online event detection techniques. The number of beams which can be processed increases proportionally to the number of servers deployed and number of GPUs, making it a viable architecture for current and future radio telescopes.
NASA Astrophysics Data System (ADS)
Kassab, Ala'; Liang, Steve; Gao, Yang
2010-12-01
Emergency agencies seek to maintain situational awareness and effective decision making through continuous monitoring of, and real-time alerting about, sources of information regarding current incidents and developing fire hazards. The nature of this goal requires integrating different, potentially numerous, sources of dynamic geospatial information on the one side, and a large number of clients having heterogeneous and specific interests in data on the other side. In such scenarios, the traditional request/reply communication style may function inefficiently, as it is based on point-to-point, synchronous, and pulling mode interaction between consumer clients and information providers/services. In this work, we propose Geospatial-based Publish/ Subscribe, an interaction framework that serves as a middleware for real-time transacting of spatially related information of interest, termed geospatial events, in distributed systems. Expressive data models, including geospatial event and geospatial subscription, as well as an efficient matching approach for fast dissemination of geospatial events to interested clients, are introduced. The proposed interaction framework is realized through the development of a Real-Time Fire Emergency Response System (RFERS) prototype. The prototype is designed for transacting several topics of geospatial events that are crucial within the context of fire emergencies, including GPS locations of emergency assets, meteorological observations of wireless sensors, fire incidents reports, and temporal sequences of remote sensing images of active wildfires. The performance of the system prototype has been evaluated in order to demonstrate its efficiency.
SOCIB Glider toolbox: from sensor to data repository
NASA Astrophysics Data System (ADS)
Pau Beltran, Joan; Heslop, Emma; Ruiz, Simón; Troupin, Charles; Tintoré, Joaquín
2015-04-01
Nowadays in oceanography, gliders constitutes a mature, cost-effective technology for the acquisition of measurements independently of the sea state (unlike ships), providing subsurface data during sustained periods, including extreme weather events. The SOCIB glider toolbox is a set of MATLAB/Octave scripts and functions developed in order to manage the data collected by a glider fleet. They cover the main stages of the data management process, both in real-time and delayed-time modes: metadata aggregation, downloading, processing, and automatic generation of data products and figures. The toolbox is distributed under the GNU licence (http://www.gnu.org/copyleft/gpl.html) and is available at http://www.socib.es/users/glider/glider_toolbox.
U.S. Geological Survey (USGS) Earthquake Web Applications
NASA Astrophysics Data System (ADS)
Fee, J.; Martinez, E.
2015-12-01
USGS Earthquake web applications provide access to earthquake information from USGS and other Advanced National Seismic System (ANSS) contributors. One of the primary goals of these applications is to provide a consistent experience for accessing both near-real time information as soon as it is available and historic information after it is thoroughly reviewed. Millions of people use these applications every month including people who feel an earthquake, emergency responders looking for the latest information about a recent event, and scientists researching historic earthquakes and their effects. Information from multiple catalogs and contributors is combined by the ANSS Comprehensive Catalog into one composite catalog, identifying the most preferred information from any source for each event. A web service and near-real time feeds provide access to all contributed data, and are used by a number of users and software packages. The Latest Earthquakes application displays summaries of many events, either near-real time feeds or custom searches, and the Event Page application shows detailed information for each event. Because all data is accessed through the web service, it can also be downloaded by users. The applications are maintained as open source projects on github, and use mobile-first and responsive-web-design approaches to work well on both mobile devices and desktop computers. http://earthquake.usgs.gov/earthquakes/map/
Towards an intelligent hospital environment: OR of the future.
Sutherland, Jeffrey V; van den Heuvel, Willem-Jan; Ganous, Tim; Burton, Matthew M; Kumar, Animesh
2005-01-01
Patients, providers, payers, and government demand more effective and efficient healthcare services, and the healthcare industry needs innovative ways to re-invent core processes. Business process reengineering (BPR) showed adopting new hospital information systems can leverage this transformation and workflow management technologies can automate process management. Our research indicates workflow technologies in healthcare require real time patient monitoring, detection of adverse events, and adaptive responses to breakdown in normal processes. Adaptive workflow systems are rarely implemented making current workflow implementations inappropriate for healthcare. The advent of evidence based medicine, guideline based practice, and better understanding of cognitive workflow combined with novel technologies including Radio Frequency Identification (RFID), mobile/wireless technologies, internet workflow, intelligent agents, and Service Oriented Architectures (SOA) opens up new and exciting ways of automating business processes. Total situational awareness of events, timing, and location of healthcare activities can generate self-organizing change in behaviors of humans and machines. A test bed of a novel approach towards continuous process management was designed for the new Weinburg Surgery Building at the University of Maryland Medical. Early results based on clinical process mapping and analysis of patient flow bottlenecks demonstrated 100% improvement in delivery of supplies and instruments at surgery start time. This work has been directly applied to the design of the DARPA Trauma Pod research program where robotic surgery will be performed on wounded soldiers on the battlefield.
A novel adaptive, real-time algorithm to detect gait events from wearable sensors.
Chia Bejarano, Noelia; Ambrosini, Emilia; Pedrocchi, Alessandra; Ferrigno, Giancarlo; Monticone, Marco; Ferrante, Simona
2015-05-01
A real-time, adaptive algorithm based on two inertial and magnetic sensors placed on the shanks was developed for gait-event detection. For each leg, the algorithm detected the Initial Contact (IC), as the minimum of the flexion/extension angle, and the End Contact (EC) and the Mid-Swing (MS), as minimum and maximum of the angular velocity, respectively. The algorithm consisted of calibration, real-time detection, and step-by-step update. Data collected from 22 healthy subjects (21 to 85 years) walking at three self-selected speeds were used to validate the algorithm against the GaitRite system. Comparable levels of accuracy and significantly lower detection delays were achieved with respect to other published methods. The algorithm robustness was tested on ten healthy subjects performing sudden speed changes and on ten stroke subjects (43 to 89 years). For healthy subjects, F1-scores of 1 and mean detection delays lower than 14 ms were obtained. For stroke subjects, F1-scores of 0.998 and 0.944 were obtained for IC and EC, respectively, with mean detection delays always below 31 ms. The algorithm accurately detected gait events in real time from a heterogeneous dataset of gait patterns and paves the way for the design of closed-loop controllers for customized gait trainings and/or assistive devices.
Iterative Strategies for Aftershock Classification in Automatic Seismic Processing Pipelines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gibbons, Steven J.; Kvaerna, Tormod; Harris, David B.
We report aftershock sequences following very large earthquakes present enormous challenges to near-real-time generation of seismic bulletins. The increase in analyst resources needed to relocate an inflated number of events is compounded by failures of phase-association algorithms and a significant deterioration in the quality of underlying, fully automatic event bulletins. Current processing pipelines were designed a generation ago, and, due to computational limitations of the time, are usually limited to single passes over the raw data. With current processing capability, multiple passes over the data are feasible. Processing the raw data at each station currently generates parametric data streams thatmore » are then scanned by a phase-association algorithm to form event hypotheses. We consider the scenario in which a large earthquake has occurred and propose to define a region of likely aftershock activity in which events are detected and accurately located, using a separate specially targeted semiautomatic process. This effort may focus on so-called pattern detectors, but here we demonstrate a more general grid-search algorithm that may cover wider source regions without requiring waveform similarity. Given many well-located aftershocks within our source region, we may remove all associated phases from the original detection lists prior to a new iteration of the phase-association algorithm. We provide a proof-of-concept example for the 2015 Gorkha sequence, Nepal, recorded on seismic arrays of the International Monitoring System. Even with very conservative conditions for defining event hypotheses within the aftershock source region, we can automatically remove about half of the original detections that could have been generated by Nepal earthquakes and reduce the likelihood of false associations and spurious event hypotheses. Lastly, further reductions in the number of detections in the parametric data streams are likely, using correlation and subspace detectors and/or empirical matched field processing.« less
Iterative Strategies for Aftershock Classification in Automatic Seismic Processing Pipelines
Gibbons, Steven J.; Kvaerna, Tormod; Harris, David B.; ...
2016-06-08
We report aftershock sequences following very large earthquakes present enormous challenges to near-real-time generation of seismic bulletins. The increase in analyst resources needed to relocate an inflated number of events is compounded by failures of phase-association algorithms and a significant deterioration in the quality of underlying, fully automatic event bulletins. Current processing pipelines were designed a generation ago, and, due to computational limitations of the time, are usually limited to single passes over the raw data. With current processing capability, multiple passes over the data are feasible. Processing the raw data at each station currently generates parametric data streams thatmore » are then scanned by a phase-association algorithm to form event hypotheses. We consider the scenario in which a large earthquake has occurred and propose to define a region of likely aftershock activity in which events are detected and accurately located, using a separate specially targeted semiautomatic process. This effort may focus on so-called pattern detectors, but here we demonstrate a more general grid-search algorithm that may cover wider source regions without requiring waveform similarity. Given many well-located aftershocks within our source region, we may remove all associated phases from the original detection lists prior to a new iteration of the phase-association algorithm. We provide a proof-of-concept example for the 2015 Gorkha sequence, Nepal, recorded on seismic arrays of the International Monitoring System. Even with very conservative conditions for defining event hypotheses within the aftershock source region, we can automatically remove about half of the original detections that could have been generated by Nepal earthquakes and reduce the likelihood of false associations and spurious event hypotheses. Lastly, further reductions in the number of detections in the parametric data streams are likely, using correlation and subspace detectors and/or empirical matched field processing.« less
NASA Astrophysics Data System (ADS)
Lee, Shiann-Jong; Liang, Wen-Tzong; Cheng, Hui-Wen; Tu, Feng-Shan; Ma, Kuo-Fong; Tsuruoka, Hiroshi; Kawakatsu, Hitoshi; Huang, Bor-Shouh; Liu, Chun-Chi
2014-01-01
We have developed a real-time moment tensor monitoring system (RMT) which takes advantage of a grid-based moment tensor inversion technique and real-time broad-band seismic recordings to automatically monitor earthquake activities in the vicinity of Taiwan. The centroid moment tensor (CMT) inversion technique and a grid search scheme are applied to obtain the information of earthquake source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism. All of these source parameters can be determined simultaneously within 117 s after the occurrence of an earthquake. The monitoring area involves the entire Taiwan Island and the offshore region, which covers the area of 119.3°E to 123.0°E and 21.0°N to 26.0°N, with a depth from 6 to 136 km. A 3-D grid system is implemented in the monitoring area with a uniform horizontal interval of 0.1° and a vertical interval of 10 km. The inversion procedure is based on a 1-D Green's function database calculated by the frequency-wavenumber (fk) method. We compare our results with the Central Weather Bureau (CWB) catalogue data for earthquakes occurred between 2010 and 2012. The average differences between event origin time and hypocentral location are less than 2 s and 10 km, respectively. The focal mechanisms determined by RMT are also comparable with the Broadband Array in Taiwan for Seismology (BATS) CMT solutions. These results indicate that the RMT system is realizable and efficient to monitor local seismic activities. In addition, the time needed to obtain all the point source parameters is reduced substantially compared to routine earthquake reports. By connecting RMT with a real-time online earthquake simulation (ROS) system, all the source parameters will be forwarded to the ROS to make the real-time earthquake simulation feasible. The RMT has operated offline (2010-2011) and online (since January 2012 to present) at the Institute of Earth Sciences (IES), Academia Sinica (http://rmt.earth.sinica.edu.tw). The long-term goal of this system is to provide real-time source information for rapid seismic hazard assessment during large earthquakes.
Sandberg, Warren S; Häkkinen, Matti; Egan, Marie; Curran, Paige K; Fairbrother, Pamela; Choquette, Ken; Daily, Bethany; Sarkka, Jukka-Pekka; Rattner, David
2005-09-01
When procedures and processes to assure patient location based on human performance do not work as expected, patients are brought incrementally closer to a possible "wrong patient-wrong procedure'' error. We developed a system for automated patient location monitoring and management. Real-time data from an active infrared/radio frequency identification tracking system provides patient location data that are robust and can be compared with an "expected process'' model to automatically flag wrong-location events as soon as they occur. The system also generates messages that are automatically sent to process managers via the hospital paging system, thus creating an active alerting function to annunciate errors. We deployed the system to detect and annunciate "patient-in-wrong-OR'' events. The system detected all "wrong-operating room (OR)'' events, and all "wrong-OR'' locations were correctly assigned within 0.50+/-0.28 minutes (mean+/-SD). This corresponded to the measured latency of the tracking system. All wrong-OR events were correctly annunciated via the paging function. This experiment demonstrates that current technology can automatically collect sufficient data to remotely monitor patient flow through a hospital, provide decision support based on predefined rules, and automatically notify stakeholders of errors.
NASA Technical Reports Server (NTRS)
Adams, Mitzi L.; Gallagher, D. L.; Whitt, A.; Whitaker, Ann F. (Technical Monitor)
2001-01-01
For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing real-time science related events has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases broadcasts accommodate active feedback and questions from Internet participants. Panel participation will be used to communicate the problems and lessons learned from these activities over the last three years.
Technical Manual for the Geospatial Stream Flow Model (GeoSFM)
Asante, Kwabena O.; Artan, Guleid A.; Pervez, Md Shahriar; Bandaragoda, Christina; Verdin, James P.
2008-01-01
The monitoring of wide-area hydrologic events requires the use of geospatial and time series data available in near-real time. These data sets must be manipulated into information products that speak to the location and magnitude of the event. Scientists at the U.S. Geological Survey Earth Resources Observation and Science (USGS EROS) Center have implemented a hydrologic modeling system which consists of an operational data processing system and the Geospatial Stream Flow Model (GeoSFM). The data processing system generates daily forcing evapotranspiration and precipitation data from various remotely sensed and ground-based data sources. To allow for rapid implementation in data scarce environments, widely available terrain, soil, and land cover data sets are used for model setup and initial parameter estimation. GeoSFM performs geospatial preprocessing and postprocessing tasks as well as hydrologic modeling tasks within an ArcView GIS environment. The integration of GIS routines and time series processing routines is achieved seamlessly through the use of dynamically linked libraries (DLLs) embedded within Avenue scripts. GeoSFM is run operationally to identify and map wide-area streamflow anomalies. Daily model results including daily streamflow and soil water maps are disseminated through Internet map servers, flood hazard bulletins and other media.
FPGA-Based Front-End Electronics for Positron Emission Tomography
Haselman, Michael; DeWitt, Don; McDougald, Wendy; Lewellen, Thomas K.; Miyaoka, Robert; Hauck, Scott
2010-01-01
Modern Field Programmable Gate Arrays (FPGAs) are capable of performing complex discrete signal processing algorithms with clock rates above 100MHz. This combined with FPGA’s low expense, ease of use, and selected dedicated hardware make them an ideal technology for a data acquisition system for positron emission tomography (PET) scanners. Our laboratory is producing a high-resolution, small-animal PET scanner that utilizes FPGAs as the core of the front-end electronics. For this next generation scanner, functions that are typically performed in dedicated circuits, or offline, are being migrated to the FPGA. This will not only simplify the electronics, but the features of modern FPGAs can be utilizes to add significant signal processing power to produce higher resolution images. In this paper two such processes, sub-clock rate pulse timing and event localization, will be discussed in detail. We show that timing performed in the FPGA can achieve a resolution that is suitable for small-animal scanners, and will outperform the analog version given a low enough sampling period for the ADC. We will also show that the position of events in the scanner can be determined in real time using a statistical positioning based algorithm. PMID:21961085
Real-Time Multimission Event Notification System for Mars Relay
NASA Technical Reports Server (NTRS)
Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Wang, Paul; Hy, Franklin H.
2013-01-01
As the Mars Relay Network is in constant flux (missions and teams going through their daily workflow), it is imperative that users are aware of such state changes. For example, a change by an orbiter team can affect operations on a lander team. This software provides an ambient view of the real-time status of the Mars network. The Mars Relay Operations Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay Network. As part of MaROS, a feature set was developed that operates on several levels of the software architecture. These levels include a Web-based user interface, a back-end "ReSTlet" built in Java, and databases that store the data as it is received from the network. The result is a real-time event notification and management system, so mission teams can track and act upon events on a moment-by-moment basis. This software retrieves events from MaROS and displays them to the end user. Updates happen in real time, i.e., messages are pushed to the user while logged into the system, and queued when the user is not online for later viewing. The software does not do away with the email notifications, but augments them with in-line notifications. Further, this software expands the events that can generate a notification, and allows user-generated notifications. Existing software sends a smaller subset of mission-generated notifications via email. A common complaint of users was that the system-generated e-mails often "get lost" with other e-mail that comes in. This software allows for an expanded set (including user-generated) of notifications displayed in-line of the program. By separating notifications, this can improve a user's workflow.
Singh, Monika; Bhoge, Rajesh K; Randhawa, Gurinderjit
2018-04-20
Background : Confirming the integrity of seed samples in powdered form is important priorto conducting a genetically modified organism (GMO) test. Rapid onsite methods may provide a technological solution to check for genetically modified (GM) events at ports of entry. In India, Bt cotton is the commercialized GM crop with four approved GM events; however, 59 GM events have been approved globally. GMO screening is required to test for authorized GM events. The identity and amplifiability of test samples could be ensured first by employing endogenous genes as an internal control. Objective : A rapid onsite detection method was developed for an endogenous reference gene, stearoyl acyl carrier protein desaturase ( Sad1 ) of cotton, employing visual and real-time loop-mediated isothermal amplification (LAMP). Methods : The assays were performed at a constant temperature of 63°C for 30 min for visual LAMP and 62ºC for 40 min for real-time LAMP. Positive amplification was visualized as a change in color from orange to green on addition of SYBR ® Green or detected as real-time amplification curves. Results : Specificity of LAMP assays was confirmed using a set of 10 samples. LOD for visual LAMP was up to 0.1%, detecting 40 target copies, and for real-time LAMP up to 0.05%, detecting 20 target copies. Conclusions : The developed methods could be utilized to confirm the integrity of seed powder prior to conducting a GMO test for specific GM events of cotton. Highlights : LAMP assays for the endogenous Sad1 gene of cotton have been developed to be used as an internal control for onsite GMO testing in cotton.
Venkatesan, Sudhir; Myles, Puja R; McCann, Gerard; Kousoulis, Antonis A; Hashmi, Maimoona; Belatri, Rabah; Boyle, Emma; Barcroft, Alan; van Staa, Tjeerd Pieter; Kirkham, Jamie J; Nguyen Van Tam, Jonathan S; Williams, Timothy J; Semple, Malcolm G
2015-10-01
During pandemics of novel influenza and outbreaks of emerging infections, surge in health-care demand can exceed capacity to provide normal standards of care. In such exceptional circumstances, triage tools may aid decisions in identifying people who are most likely to benefit from higher levels of care. Rapid research during the early phase of an outbreak should allow refinement and validation of triage tools so that in the event of surge a valid tool is available. The overarching study aim is to conduct a prospective near real-time analysis of structured clinical assessments of influenza-like illness (ILI) using primary care electronic health records (EHRs) during a pandemic. This abstract summarises the preparatory work, infrastructure development, user testing and proof-of-concept study. (1) In preparation for conducting rapid research in the early phase of a future outbreak, to develop processes that allow near real-time analysis of general practitioner (GP) assessments of people presenting with ILI, management decisions and patient outcomes. (2) As proof of concept: conduct a pilot study evaluating the performance of the triage tools 'Community Assessment Tools' and 'Pandemic Medical Early Warning Score' to predict hospital admission and death in patients presenting with ILI to GPs during inter-pandemic winter seasons. Prospective near real-time analysis of structured clinical assessments and anonymised linkage to data from EHRs. User experience was evaluated by semistructured interviews with participating GPs. Thirty GPs in England, Wales and Scotland, participating in the Clinical Practice Research Datalink. All people presenting with ILI. None. Study outcome is proof of concept through demonstration of data capture and near real-time analysis. Primary patient outcomes were hospital admission within 24 hours and death (all causes) within 30 days of GP assessment. Secondary patient outcomes included GP decision to prescribe antibiotics and/or influenza-specific antiviral drugs and/or refer to hospital - if admitted, the need for higher levels of care and length of hospital stay. Linked anonymised data from a web-based structured clinical assessment and primary care EHRs. In the 24 months to April 2015, data from 704 adult and 159 child consultations by 30 GPs were captured. GPs referred 11 (1.6%) adults and six (3.8%) children to hospital. There were 13 (1.8%) deaths of adults and two (1.3%) of children. There were too few outcome events to draw any conclusions regarding the performance of the triage tools. GP interviews showed that although there were some difficulties with installation, the web-based data collection tool was quick and easy to use. Some GPs felt that a minimal monetary incentive would promote participation. We have developed processes that allow capture and near real-time automated analysis of GP's clinical assessments and management decisions of people presenting with ILI. We will develop processes to include other EHR systems, attempt linkage to data on influenza surveillance and maintain processes in readiness for a future outbreak. This study is registered as ISRCTN87130712 and UK Clinical Research Network 12827. The National Institute for Health Research Health Technology Assessment programme. MGS is supported by the UK NIHR Health Protection Research Unit in Emerging and Zoonotic Infections.
Social Media as Seismic Networks for the Earthquake Damage Assessment
NASA Astrophysics Data System (ADS)
Meletti, C.; Cresci, S.; La Polla, M. N.; Marchetti, A.; Tesconi, M.
2014-12-01
The growing popularity of online platforms, based on user-generated content, is gradually creating a digital world that mirrors the physical world. In the paradigm of crowdsensing, the crowd becomes a distributed network of sensors that allows us to understand real life events at a quasi-real-time rate. The SoS-Social Sensing project [http://socialsensing.it/] exploits the opportunistic crowdsensing, involving users in the sensing process in a minimal way, for social media emergency management purposes in order to obtain a very fast, but still reliable, detection of emergency dimension to face. First of all we designed and implemented a decision support system for the detection and the damage assessment of earthquakes. Our system exploits the messages shared in real-time on Twitter. In the detection phase, data mining and natural language processing techniques are firstly adopted to select meaningful and comprehensive sets of tweets. Then we applied a burst detection algorithm in order to promptly identify outbreaking seismic events. Using georeferenced tweets and reported locality names, a rough epicentral determination is also possible. The results, compared to Italian INGV official reports, show that the system is able to detect, within seconds, events of a magnitude in the region of 3.5 with a precision of 75% and a recall of 81,82%. We then focused our attention on damage assessment phase. We investigated the possibility to exploit social media data to estimate earthquake intensity. We designed a set of predictive linear models and evaluated their ability to map the intensity of worldwide earthquakes. The models build on a dataset of almost 5 million tweets exploited to compute our earthquake features, and more than 7,000 globally distributed earthquakes data, acquired in a semi-automatic way from USGS, serving as ground truth. We extracted 45 distinct features falling into four categories: profile, tweet, time and linguistic. We run diagnostic tests and simulations on generated models to assess their significance and avoid overfitting. Overall results show a correlation between the messages shared in social media and intensity estimations based on online survey data (CDI).
NASA Astrophysics Data System (ADS)
Chen, Y. W.; Chang, L. C.
2012-04-01
Typhoons which normally bring a great amount of precipitation are the primary natural hazard in Taiwan during flooding season. Because the plentiful rainfall quantities brought by typhoons are normally stored for the usage of the next draught period, the determination of release strategies for flood operation of reservoirs which is required to simultaneously consider not only the impact of reservoir safety and the flooding damage in plain area but also for the water resource stored in the reservoir after typhoon becomes important. This study proposes a two-steps study process. First, this study develop an optimal flood operation model (OFOM) for the planning of flood control and also applies the OFOM on Tseng-wun reservoir and the downstream plain related to the reservoir. Second, integrating a typhoon event database with the OFOM mentioned above makes the proposed planning model have ability to deal with a real-time flood control problem and names as real-time flood operation model (RTFOM). Three conditions are considered in the proposed models, OFOM and RTFOM, include the safety of the reservoir itself, the reservoir storage after typhoons and the impact of flooding in the plain area. Besides, the flood operation guideline announced by government is also considered in the proposed models. The these conditions and the guideline can be formed as an optimization problem which is solved by the genetic algorithm (GA) in this study. Furthermore, a distributed runoff model, kinematic-wave geomorphic instantaneous unit hydrograph (KW-GIUH), and a river flow simulation model, HEC-RAS, are used to simulate the river water level of Tseng-wun basin in the plain area and the simulated level is shown as an index of the impact of flooding. Because the simulated levels are required to re-calculate iteratively in the optimization model, applying a recursive artificial neural network (recursive ANN) instead of the HEC-RAS model can significantly reduce the computational burden of the entire optimization problem. This study applies the developed methodology to Tseng-wun Reservoir. Forty typhoon events are collected as the historical database and six typhoon events are used to verify the proposed model. These typhoons include Typhoon Sepat and Typhoon Korsa in 2007 and Typhoon Kalmaegi, Typhoon Fung-Wong, Typhoon Sinlaku and Typhoon Jangmi in 2008. The results show that the proposed model can reduce the flood duration at the downstream area. For example, the real-time flood control model can reduce the flood duration by four and three hours for Typhoon Korsa and Typhoon Sinlaku respectively. This results indicate that the developed model can be a very useful tool for real-time flood control operation of reservoirs.
Real Time Coincidence Processing Algorithm for Geiger Mode LADAR using FPGAs
2017-01-09
Defense for Research and Engineering. Real Time Coincidence Processing Algorithm for Geiger-Mode Ladar using FPGAs Rufo A. Antonio1, Alexandru N...the first ever Geiger-mode ladar processing al- gorithm that is suitable for implementation on an FPGA enabling real time pro- cessing and data...developed embedded FPGA real time processing algorithms that take noisy raw data, streaming at upwards of 1GB/sec, and filters the data to obtain a near- ly
NASA Astrophysics Data System (ADS)
Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh
2014-06-01
We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 min after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). A new island-wide, high resolution SEM mesh model is developed for the whole Taiwan in this study. We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.
Efficient Prediction of Low-Visibility Events at Airports Using Machine-Learning Regression
NASA Astrophysics Data System (ADS)
Cornejo-Bueno, L.; Casanova-Mateo, C.; Sanz-Justo, J.; Cerro-Prada, E.; Salcedo-Sanz, S.
2017-11-01
We address the prediction of low-visibility events at airports using machine-learning regression. The proposed model successfully forecasts low-visibility events in terms of the runway visual range at the airport, with the use of support-vector regression, neural networks (multi-layer perceptrons and extreme-learning machines) and Gaussian-process algorithms. We assess the performance of these algorithms based on real data collected at the Valladolid airport, Spain. We also propose a study of the atmospheric variables measured at a nearby tower related to low-visibility atmospheric conditions, since they are considered as the inputs of the different regressors. A pre-processing procedure of these input variables with wavelet transforms is also described. The results show that the proposed machine-learning algorithms are able to predict low-visibility events well. The Gaussian process is the best algorithm among those analyzed, obtaining over 98% of the correct classification rate in low-visibility events when the runway visual range is {>}1000 m, and about 80% under this threshold. The performance of all the machine-learning algorithms tested is clearly affected in extreme low-visibility conditions ({<}500 m). However, we show improved results of all the methods when data from a neighbouring meteorological tower are included, and also with a pre-processing scheme using a wavelet transform. Also presented are results of the algorithm performance in daytime and nighttime conditions, and for different prediction time horizons.
Operational, Real-Time, Sun-to-Earth Interplanetary Shock Predictions During Solar Cycle 23
NASA Astrophysics Data System (ADS)
Fry, C. D.; Dryer, M.; Sun, W.; Deehr, C. S.; Smith, Z.; Akasofu, S.
2002-05-01
We report on our progress in predicting interplanetary shock arrival time (SAT) in real-time, using three forecast models: the Hakamada-Akasofu-Fry (HAF) modified kinematic model, the Interplanetary Shock Propagation Model (ISPM) and the Shock Time of Arrival (STOA) model. These models are run concurrently to provide real-time predictions of the arrival time at Earth of interplanetary shocks caused by solar events. These "fearless forecasts" are the first, and presently only, publicly distributed predictions of SAT and are undergoing quantitative evaluation for operational utility and scientific benchmarking. All three models predict SAT, but the HAF model also provides a global view of the propagation of interplanetary shocks through the pre-existing, non-uniform heliospheric structure. This allows the forecaster to track the propagation of the shock and to differentiate between shocks caused by solar events and those associated with co-rotating interaction regions (CIRs). This study includes 173 events during the period February, 1997 to October, 2000. Shock predictions were compared with spacecraft observations at the L1 location to determine how well the models perform. Sixty-eight shocks were observed at L1 within 120 hours of an event. We concluded that 6 of these observed shocks were caused by CIRs, and the remainder were caused by solar events. The forecast skill of the models are presented in terms of RMS errors, contingency tables and skill scores commonly used by the weather forecasting community. The false alarm rate for HAF was higher than for ISPM or STOA but much lower than for predictions based upon empirical studies or climatology. Of the parameters used to characterize a shock source at the Sun, the initial speed of the coronal shock, as represented by the observed metric type II speed, has the largest influence on the predicted SAT. We also found that HAF model predictions based upon type II speed are generally better for shocks originating from sites near central meridian, and worse for limb events. This tendency suggests that the observed type II speed is more representative of the interplanetary shock speed for events occurring near central meridian. In particular, the type II speed appears to underestimate the actual Earth-directed IP shock speed when the source of the event is near the limb. Several of the most interesting events (Bastille Day epoch (2000), April Fools Day epoch (2001))will be discussed in more detail with the use of real-time animations.
Increasing the Operational Value of Event Messages
NASA Technical Reports Server (NTRS)
Li, Zhenping; Savkli, Cetin; Smith, Dan
2003-01-01
Assessing the health of a space mission has traditionally been performed using telemetry analysis tools. Parameter values are compared to known operational limits and are plotted over various time periods. This presentation begins with the notion that there is an incredible amount of untapped information contained within the mission s event message logs. Through creative advancements in message handling tools, the event message logs can be used to better assess spacecraft and ground system status and to highlight and report on conditions not readily apparent when messages are evaluated one-at-a-time during a real-time pass. Work in this area is being funded as part of a larger NASA effort at the Goddard Space Flight Center to create component-based, middleware-based, standards-based general purpose ground system architecture referred to as GMSEC - the GSFC Mission Services Evolution Center. The new capabilities and operational concepts for event display, event data analyses and data mining are being developed by Lockheed Martin and the new subsystem has been named GREAT - the GMSEC Reusable Event Analysis Toolkit. Planned for use on existing and future missions, GREAT has the potential to increase operational efficiency in areas of problem detection and analysis, general status reporting, and real-time situational awareness.
Hardware design and implementation of fast DOA estimation method based on multicore DSP
NASA Astrophysics Data System (ADS)
Guo, Rui; Zhao, Yingxiao; Zhang, Yue; Lin, Qianqiang; Chen, Zengping
2016-10-01
In this paper, we present a high-speed real-time signal processing hardware platform based on multicore digital signal processor (DSP). The real-time signal processing platform shows several excellent characteristics including high performance computing, low power consumption, large-capacity data storage and high speed data transmission, which make it able to meet the constraint of real-time direction of arrival (DOA) estimation. To reduce the high computational complexity of DOA estimation algorithm, a novel real-valued MUSIC estimator is used. The algorithm is decomposed into several independent steps and the time consumption of each step is counted. Based on the statistics of the time consumption, we present a new parallel processing strategy to distribute the task of DOA estimation to different cores of the real-time signal processing hardware platform. Experimental results demonstrate that the high processing capability of the signal processing platform meets the constraint of real-time direction of arrival (DOA) estimation.
Real-Time and Post-Processed Georeferencing for Hyperpspectral Drone Remote Sensing
NASA Astrophysics Data System (ADS)
Oliveira, R. A.; Khoramshahi, E.; Suomalainen, J.; Hakala, T.; Viljanen, N.; Honkavaara, E.
2018-05-01
The use of drones and photogrammetric technologies are increasing rapidly in different applications. Currently, drone processing workflow is in most cases based on sequential image acquisition and post-processing, but there are great interests towards real-time solutions. Fast and reliable real-time drone data processing can benefit, for instance, environmental monitoring tasks in precision agriculture and in forest. Recent developments in miniaturized and low-cost inertial measurement systems and GNSS sensors, and Real-time kinematic (RTK) position data are offering new perspectives for the comprehensive remote sensing applications. The combination of these sensors and light-weight and low-cost multi- or hyperspectral frame sensors in drones provides the opportunity of creating near real-time or real-time remote sensing data of target object. We have developed a system with direct georeferencing onboard drone to be used combined with hyperspectral frame cameras in real-time remote sensing applications. The objective of this study is to evaluate the real-time georeferencing comparing with post-processing solutions. Experimental data sets were captured in agricultural and forested test sites using the system. The accuracy of onboard georeferencing data were better than 0.5 m. The results showed that the real-time remote sensing is promising and feasible in both test sites.
Event schemas in autism spectrum disorders: the role of theory of mind and weak central coherence.
Loth, Eva; Gómez, Juan Carlos; Happé, Francesca
2008-03-01
Event schemas (generalized knowledge of what happens at common real-life events, e.g., a birthday party) are an important cognitive tool for social understanding: They provide structure for social experiences while accounting for many variable aspects. Using an event narratives task, this study tested the hypotheses that theory of mind (ToM) deficits and weak central coherence (WCC, a local processing bias) undermine different aspects of event knowledge in people with autism spectrum disorder (ASD). Event narratives of ASD ToM-failers were overall significantly impaired. ASD ToM-passers showed more specific abnormalities relating to variable activities, and some of these were significantly associated to WCC. Abnormalities in event knowledge might help linking ASD-typical social deficits in real-life situations and the adherence to inflexible routines.
Control and data acquisition upgrades for NSTX-U
Davis, W. M.; Tchilinguirian, G. J.; Carroll, T.; ...
2016-06-06
The extensive NSTX Upgrade (NSTX-U) Project includes major components which allow a doubling of the toroidal field strength to 1 T, of the Neutral Beam heating power to 12 MW, and the plasma current to 2 MA, and substantial structural enhancements to withstand the increased electromagnetic loads. The maximum pulse length will go from 1.5 to 5 s. The larger and more complex forces on the coils will be protected by a Digital Coil Protection System, which requires demanding real-time data input rates, calculations and responses. The amount of conventional digitized data for a given pulse is expected to increasemore » from 2.5 to 5 GB per second of pulse. 2-D Fast Camera data is expected to go from 2.5 GB/pulse to 10, and another 2 GB/pulse is expected from new IR cameras. Our network capacity will be increased by a factor of 10, with 10 Gb/s fibers used for the major trunks. 32-core Linux systems will be used for several functions, including between-shot data processing, MDSplus data serving, between-shot EFIT analysis, real-time processing, and for a new capability, between-shot TRANSP. As a result, improvements to the MDSplus events subsystem will be made through the use of both UDP and TCP/IP based methods and the addition of a dedicated “event server”.« less
SeisComP 3 - Where are we now?
NASA Astrophysics Data System (ADS)
Saul, Joachim; Becker, Jan; Hanka, Winfried; Heinloo, Andres; Weber, Bernd
2010-05-01
The seismological software SeisComP has evolved within the last approximately 10 years from a pure acquisition modules to a fully featured real-time earthquake monitoring software. The now very popular SeedLink protocol for seismic data transmission has been the core of SeisComP from the very beginning. Later additions included simple, purely automatic event detection, location and magnitude determination capabilities. Especially within the development of the 3rd-generation SeisComP, also known as "SeisComP 3", automatic processing capabilities have been augmented by graphical user interfaces for vizualization, rapid event review and quality control. Communication between the modules is achieved using a a TCP/IP infrastructure that allows distributed computing and remote review. For seismological metadata exchange export/import to/from QuakeML is avalable, which also provides a convenient interface with 3rd-party software. SeisComP is the primary seismological processing software at the GFZ Potsdam. It has also been in use for years in numerous seismic networks in Europe and, more recently, has been adopted as primary monitoring software by several tsunami warning centers around the Indian Ocean. In our presentation we describe the current status of development as well as future plans. We illustrate its possibilities by discussing different use cases for global and regional real-time earthquake monitoring and tsunami warning.
Using waveform cross correlation for automatic recovery of aftershock sequences
NASA Astrophysics Data System (ADS)
Bobrov, Dmitry; Kitov, Ivan; Rozhkov, Mikhail
2017-04-01
Aftershock sequences of the largest earthquakes are difficult to recover. There can be several hundred mid-sized aftershocks per hour within a few hundred km from each other recorded by the same stations. Moreover, these events generate thousands of reflected/refracted phases having azimuth and slowness close to those from the P-waves. Therefore, aftershock sequences with thousands of events represent a major challenge for automatic and interactive processing at the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Organization (CTBTO). Standard methods of detection and phase association do not use all information contained in signals. As a result, wrong association of the first and later phases, both regular and site specific, produces enormous number of wrong event hypotheses and destroys valid event hypotheses in automatic IDC processing. In turn, the IDC analysts have to reject false and recreate valid hypotheses wasting precious human resources. At the current level of the IDC catalogue completeness, the method of waveform cross correlation (WCC) can resolve most of detection and association problems fully utilizing the similarity of waveforms generated by aftershocks. Array seismic stations of the International monitoring system (IMS) can enhance the performance of the WCC method: reduce station-specific detection thresholds, allow accurate estimate of signal attributes, including relative magnitude, and effectively suppress irrelevant arrivals. We have developed and tested a prototype of an aftershock tool matching all IDC processing requirements and merged it with the current IDC pipeline. This tool includes creation of master events consisting of real or synthetic waveform templates at ten and more IMS stations; cross correlation (CC) of real-time waveforms with these templates, association of arrivals detected at CC-traces in event hypotheses; building events matching the IDC quality criteria; and resolution of conflicts between events hypotheses created by neighboring master-events. The final cross correlation standard event lists (XSEL) is a start point for interactive analysis with standard tools. We present select results for the biggest earthquakes, like Sumatra 2004 and Tohoku 2011, as well as for several smaller events with hundreds of aftershocks. The sensitivity and resolution of the aftershock tool is demonstrated on the example of mb=2.2 aftershock found after the September 9, 2016 DPRK test.
NASA Astrophysics Data System (ADS)
Sadeghi, Saman; MacKay, William A.; van Dam, R. Michael; Thompson, Michael
2011-02-01
Real-time analysis of multi-channel spatio-temporal sensor data presents a considerable technical challenge for a number of applications. For example, in brain-computer interfaces, signal patterns originating on a time-dependent basis from an array of electrodes on the scalp (i.e. electroencephalography) must be analyzed in real time to recognize mental states and translate these to commands which control operations in a machine. In this paper we describe a new technique for recognition of spatio-temporal patterns based on performing online discrimination of time-resolved events through the use of correlation of phase dynamics between various channels in a multi-channel system. The algorithm extracts unique sensor signature patterns associated with each event during a training period and ranks importance of sensor pairs in order to distinguish between time-resolved stimuli to which the system may be exposed during real-time operation. We apply the algorithm to electroencephalographic signals obtained from subjects tested in the neurophysiology laboratories at the University of Toronto. The extension of this algorithm for rapid detection of patterns in other sensing applications, including chemical identification via chemical or bio-chemical sensor arrays, is also discussed.
Sudden Event Recognition: A Survey
Suriani, Nor Surayahani; Hussain, Aini; Zulkifley, Mohd Asyraf
2013-01-01
Event recognition is one of the most active research areas in video surveillance fields. Advancement in event recognition systems mainly aims to provide convenience, safety and an efficient lifestyle for humanity. A precise, accurate and robust approach is necessary to enable event recognition systems to respond to sudden changes in various uncontrolled environments, such as the case of an emergency, physical threat and a fire or bomb alert. The performance of sudden event recognition systems depends heavily on the accuracy of low level processing, like detection, recognition, tracking and machine learning algorithms. This survey aims to detect and characterize a sudden event, which is a subset of an abnormal event in several video surveillance applications. This paper discusses the following in detail: (1) the importance of a sudden event over a general anomalous event; (2) frameworks used in sudden event recognition; (3) the requirements and comparative studies of a sudden event recognition system and (4) various decision-making approaches for sudden event recognition. The advantages and drawbacks of using 3D images from multiple cameras for real-time application are also discussed. The paper concludes with suggestions for future research directions in sudden event recognition. PMID:23921828
Performances and recent evolutions of EMSC Real Time Information services
NASA Astrophysics Data System (ADS)
Mazet-Roux, G.; Godey, S.; Bossu, R.
2009-04-01
The EMSC (http://www.emsc-csem.org) operates Real Time Earthquake Information services for the public and the scientific community which aim at providing rapid and reliable information on the seismic-ity of the Euro-Mediterranean region and on significant earthquakes worldwide. These services are based on parametric data rapidly provided by 66 seismological networks which are automatically merged and processed at EMSC. A web page which is updated every minute displays a list and a map of the latest earthquakes as well as additional information like location maps, moment tensors solutions or past regional seismicity. Since 2004, the performances and the popularity of these services have dramatically increased. The number of messages received from the contributors and the number of published events have been multiplied by 2 since 2004 and by 1.6 since 2005 respectively. The web traffic and the numbers of users of the Earthquake Notification Service (ENS) have been multiplied by 15 and 7 respectively. In terms of performances of the ENS, the median dissemination time for Euro-Med events is minutes in 2008. In order to further improve its performances and especially the speed and robustness of the reception of real time data, EMSC has recently implemented a software named QWIDS (Quake Watch Information Distribution System) which provides a quick and robust data exchange system through permanent TCP connections. At the difference with emails that can sometimes be delayed or lost, QWIDS is an actual real time communication system that ensures the data delivery. In terms of hardware, EMSC imple-mented a high availability, dynamic load balancing, redundant and scalable web servers infrastructure, composed of two SUN T2000 and one F5 BIG-IP switch. This will allow coping with constantly increas-ing web traffic and the occurrence of huge peaks of traffic after widely felt earthquakes.
A multilayer network dataset of interaction and influence spreading in a virtual world
NASA Astrophysics Data System (ADS)
Jankowski, Jarosław; Michalski, Radosław; Bródka, Piotr
2017-10-01
Presented data contains the record of five spreading campaigns that occurred in a virtual world platform. Users distributed avatars between each other during the campaigns. The processes varied in time and range and were either incentivized or not incentivized. Campaign data is accompanied by events. The data can be used to build a multilayer network to place the campaigns in a wider context. To the best of the authors' knowledge, the study is the first publicly available dataset containing a complete real multilayer social network together, along with five complete spreading processes in it.
Gao, Panke; Jin, Zhen; Cheng, Yingying; Cao, Xiangshan
2014-10-01
Aberrant splicing events play important roles in the pathogenesis of acute myeloid leukemia (AML). To investigate the aberrant splicing events in AML during treatment, we carried out RNA sequencing in peripheral mononuclear cell samples from a patient with complete remission. In addition to the sequencing samples, selected splicing events were confirmed and validated with real-time quantitative RT-PCR in another seven pairs of samples. A total of 4.05 and 3.39 GB clean data of the AML and remission sample were generated, respectively, and 2,223 differentially expressed genes (DEGs) were identified. Integrated with gene expression profiling on T cells from AML patients compared with healthy donors, 82 DEGs were also differentially expressed in AML CD4 T cells and CD8 T cells. Twenty-three alternative splicing events were considered to be confidential, and they were involved in many biological processes, such as RNA processing, cellular macromolecule catabolic process, and DNA binding process. An exon3-skipping event in TRIP12 was detected in patients at remission and further validated in another three independent samples. TRIP12 is an ubiquitin ligase of ARF, which suppresses aberrant cell growth by activating p53 responses. The exon3-skipping isoform of TRIP12 increased significantly after treatment. Our results may provide new understanding of AML, and the confirmed alternative splicing event of TRIP12 may be used as potential target for future investigations.
NASA Astrophysics Data System (ADS)
Lyon, A. L.; Kowalkowski, J. B.; Jones, C. D.
2017-10-01
ParaView is a high performance visualization application not widely used in High Energy Physics (HEP). It is a long standing open source project led by Kitware and involves several Department of Energy (DOE) and Department of Defense (DOD) laboratories. Futhermore, it has been adopted by many DOE supercomputing centers and other sites. ParaView is unique in speed and efficiency by using state-of-the-art techniques developed by the academic visualization community that are often not found in applications written by the HEP community. In-situ visualization of events, where event details are visualized during processing/analysis, is a common task for experiment software frameworks. Kitware supplies Catalyst, a library that enables scientific software to serve visualization objects to client ParaView viewers yielding a real-time event display. Connecting ParaView to the Fermilab art framework will be described and the capabilities it brings discussed.
Applied Use of Safety Event Occurrence Control Charts of Harm and Non-Harm Events: A Case Study.
Robinson, Susan N; Neyens, David M; Diller, Thomas
Most hospitals use occurrence reporting systems that facilitate identifying serious events that lead to root cause investigations. Thus, the events catalyze improvement efforts to mitigate patient harm. A serious limitation is that only a few of the occurrences are investigated. A challenge is leveraging the data to generate knowledge. The goal is to present a methodology to supplement these incident assessment efforts. The framework affords an enhanced understanding of patient safety through the use of control charts to monitor non-harm and harm incidents simultaneously. This approach can identify harm and non-harm reporting rates and also can facilitate monitoring occurrence trends. This method also can expedite identifying changes in workflow, processes, or safety culture. Although unable to identify root causes, this approach can identify changes in near real time. This approach also supports evaluating safety or policy interventions that may not be observable in annual safety climate surveys.
Intelligent Software Agents: Sensor Integration and Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulesz, James J; Lee, Ronald W
2013-01-01
Abstract In a post Macondo world the buzzwords are Integrity Management and Incident Response Management. The twin processes are not new but the opportunity to link the two is novel. Intelligent software agents can be used with sensor networks in distributed and centralized computing systems to enhance real-time monitoring of system integrity as well as manage the follow-on incident response to changing, and potentially hazardous, environmental conditions. The software components are embedded at the sensor network nodes in surveillance systems used for monitoring unusual events. When an event occurs, the software agents establish a new concept of operation at themore » sensing node, post the event status to a blackboard for software agents at other nodes to see , and then react quickly and efficiently to monitor the scale of the event. The technology addresses a current challenge in sensor networks that prevents a rapid and efficient response when a sensor measurement indicates that an event has occurred. By using intelligent software agents - which can be stationary or mobile, interact socially, and adapt to changing situations - the technology offers features that are particularly important when systems need to adapt to active circumstances. For example, when a release is detected, the local software agent collaborates with other agents at the node to exercise the appropriate operation, such as: targeted detection, increased detection frequency, decreased detection frequency for other non-alarming sensors, and determination of environmental conditions so that adjacent nodes can be informed that an event is occurring and when it will arrive. The software agents at the nodes can also post the data in a targeted manner, so that agents at other nodes and the command center can exercise appropriate operations to recalibrate the overall sensor network and associated intelligence systems. The paper describes the concepts and provides examples of real-world implementations including the Threat Detection and Analysis System (TDAS) at the International Port of Memphis and the Biological Warning and Incident Characterization System (BWIC) Environmental Monitoring (EM) Component. Technologies developed for these 24/7 operational systems have applications for improved real-time system integrity awareness as well as provide incident response (as needed) for production and field applications.« less
Maximum-Likelihood Estimation With a Contracting-Grid Search Algorithm
Hesterman, Jacob Y.; Caucci, Luca; Kupinski, Matthew A.; Barrett, Harrison H.; Furenlid, Lars R.
2010-01-01
A fast search algorithm capable of operating in multi-dimensional spaces is introduced. As a sample application, we demonstrate its utility in the 2D and 3D maximum-likelihood position-estimation problem that arises in the processing of PMT signals to derive interaction locations in compact gamma cameras. We demonstrate that the algorithm can be parallelized in pipelines, and thereby efficiently implemented in specialized hardware, such as field-programmable gate arrays (FPGAs). A 2D implementation of the algorithm is achieved in Cell/BE processors, resulting in processing speeds above one million events per second, which is a 20× increase in speed over a conventional desktop machine. Graphics processing units (GPUs) are used for a 3D application of the algorithm, resulting in processing speeds of nearly 250,000 events per second which is a 250× increase in speed over a conventional desktop machine. These implementations indicate the viability of the algorithm for use in real-time imaging applications. PMID:20824155
Khandelwal, Siddhartha; Wickstrom, Nicholas
2016-12-01
Detecting gait events is the key to many gait analysis applications that would benefit from continuous monitoring or long-term analysis. Most gait event detection algorithms using wearable sensors that offer a potential for use in daily living have been developed from data collected in controlled indoor experiments. However, for real-word applications, it is essential that the analysis is carried out in humans' natural environment; that involves different gait speeds, changing walking terrains, varying surface inclinations and regular turns among other factors. Existing domain knowledge in the form of principles or underlying fundamental gait relationships can be utilized to drive and support the data analysis in order to develop robust algorithms that can tackle real-world challenges in gait analysis. This paper presents a novel approach that exhibits how domain knowledge about human gait can be incorporated into time-frequency analysis to detect gait events from long-term accelerometer signals. The accuracy and robustness of the proposed algorithm are validated by experiments done in indoor and outdoor environments with approximately 93 600 gait events in total. The proposed algorithm exhibits consistently high performance scores across all datasets in both, indoor and outdoor environments.
Activity Recognition on Streaming Sensor Data.
Krishnan, Narayanan C; Cook, Diane J
2014-02-01
Many real-world applications that focus on addressing needs of a human, require information about the activities being performed by the human in real-time. While advances in pervasive computing have lead to the development of wireless and non-intrusive sensors that can capture the necessary activity information, current activity recognition approaches have so far experimented on either a scripted or pre-segmented sequence of sensor events related to activities. In this paper we propose and evaluate a sliding window based approach to perform activity recognition in an on line or streaming fashion; recognizing activities as and when new sensor events are recorded. To account for the fact that different activities can be best characterized by different window lengths of sensor events, we incorporate the time decay and mutual information based weighting of sensor events within a window. Additional contextual information in the form of the previous activity and the activity of the previous window is also appended to the feature describing a sensor window. The experiments conducted to evaluate these techniques on real-world smart home datasets suggests that combining mutual information based weighting of sensor events and adding past contextual information into the feature leads to best performance for streaming activity recognition.
Bhattacharya, Sudipta
2018-06-01
Recurrent adverse events, once occur often continue for some duration of time in clinical trials; and the number of events along with their durations is clinically considered as a measure of severity of a disease under study. While there are methods available for analyzing recurrent events or durations or for analyzing both side by side, no effort has been made so far to combine them and present as a single measure. However, this single-valued combined measure may help clinicians assess the wholesome effect of recurrence of incident comprising events and durations. Non-parametric approach is adapted here to develop an estimator for estimating the combined rate of both, the recurrence of events as well as the event-continuation, that is the duration per event. The proposed estimator produces a single numerical value, the interpretation and meaningfulness of which are discussed through the analysis of a real-life clinical dataset. The algebraic expression of variance is derived, asymptotic normality of the estimator is noted, and demonstration is provided on how the estimator can be used in the setup of testing of statistical hypothesis. Further possible development of the estimator is also noted, to adjust for the dependence of event occurrences on the history of the process generating recurrent events through covariates and for the case of dependent censoring.
Ji, Jin; Yang, Jiun-Chan; Larson, Dale N.
2009-01-01
We demonstrate using nanohole arrays of mixed designs and a microwriting process based on dip-pen nanolithography to monitor multiple, different protein binding events simultaneously in real time based on the intensity of Extraordinary Optical Transmission of nanohole arrays. The microwriting process and small footprint of the individual nanohole arrays enabled us to observe different binding events located only 16μm apart, achieving high spatial resolution. We also present a novel concept that incorporates nanohole arrays of different designs to improve confidence and accuracy of binding studies. For proof of concept, two types of nanohole arrays, designed to exhibit opposite responses to protein bindings, were fabricated on one transducer. Initial studies indicate that the mixed designs could help to screen out artifacts such as protein intrinsic signals, providing improved accuracy of binding interpretation. PMID:19297143
NASA Astrophysics Data System (ADS)
Lecompte, M. A.; Heaps, J. F.; Williams, F. H.
Imaging the earth from Geostationary Earth Orbit (GEO) allows frequent updates of environmental conditions within an observable hemisphere at time and spatial scales appropriate to the most transient observable terrestrial phenomena. Coverage provided by current GEO Meteorological Satellites (METSATS) fails to fully exploit this advantage due primarily to obsolescent technology and also institutional inertia. With the full benefit of GEO based imaging unrealized, rapidly evolving phenomena, occurring at the smallest spatial and temporal scales that frequently have significant environmental impact remain unobserved. These phenomena may be precursors for the most destructive natural processes that adversely effect society. Timely distribution of information derived from "real-time" observations thus may provide opportunities to mitigate much of the damage to life and property that would otherwise occur. AstroVision International's AVStar Earth monitoring system is designed to overcome the current limitations if GEO Earth coverage and to provide real time monitoring of changes to the Earth's complete atmospheric, land and marine surface environments including fires, volcanic events, lightning and meteoritic events on a "live," true color, and multispectral basis. The understanding of severe storm dynamics and its coupling to the earth's electro-sphere will be greatly enhanced by observations at unprecedented sampling frequencies and spatial resolution. Better understanding of these natural phenomena and AVStar operational real-time coverage may also benefit society through improvements in severe weather prediction and warning. AstroVision's AVStar system, designed to provide this capability with the first of a constellation of GEO- based commercial environmental monitoring satellites to be launched in late 2003 will be discussed, including spatial and temporal resolution, spectral coverage with applications and an inventory of the potential benefits to society, science, commerce and education.
Real-Time Event Detection for Monitoring Natural and Source ...
The use of event detection systems in finished drinking water systems is increasing in order to monitor water quality in both operational and security contexts. Recent incidents involving harmful algal blooms and chemical spills into watersheds have increased interest in monitoring source water quality prior to treatment. This work highlights the use of the CANARY event detection software in detecting suspected illicit events in an actively monitored watershed in South Carolina. CANARY is an open source event detection software that was developed by USEPA and Sandia National Laboratories. The software works with any type of sensor, utilizes multiple detection algorithms and approaches, and can incorporate operational information as needed. Monitoring has been underway for several years to detect events related to intentional or unintentional dumping of materials into the monitored watershed. This work evaluates the feasibility of using CANARY to enhance the detection of events in this watershed. This presentation will describe the real-time monitoring approach used in this watershed, the selection of CANARY configuration parameters that optimize detection for this watershed and monitoring application, and the performance of CANARY during the time frame analyzed. Further, this work will highlight how rainfall events impacted analysis, and the innovative application of CANARY taken in order to effectively detect the suspected illicit events. This presentation d
Statistical Methods in Ai: Rare Event Learning Using Associative Rules and Higher-Order Statistics
NASA Astrophysics Data System (ADS)
Iyer, V.; Shetty, S.; Iyengar, S. S.
2015-07-01
Rare event learning has not been actively researched since lately due to the unavailability of algorithms which deal with big samples. The research addresses spatio-temporal streams from multi-resolution sensors to find actionable items from a perspective of real-time algorithms. This computing framework is independent of the number of input samples, application domain, labelled or label-less streams. A sampling overlap algorithm such as Brooks-Iyengar is used for dealing with noisy sensor streams. We extend the existing noise pre-processing algorithms using Data-Cleaning trees. Pre-processing using ensemble of trees using bagging and multi-target regression showed robustness to random noise and missing data. As spatio-temporal streams are highly statistically correlated, we prove that a temporal window based sampling from sensor data streams converges after n samples using Hoeffding bounds. Which can be used for fast prediction of new samples in real-time. The Data-cleaning tree model uses a nonparametric node splitting technique, which can be learned in an iterative way which scales linearly in memory consumption for any size input stream. The improved task based ensemble extraction is compared with non-linear computation models using various SVM kernels for speed and accuracy. We show using empirical datasets the explicit rule learning computation is linear in time and is only dependent on the number of leafs present in the tree ensemble. The use of unpruned trees (t) in our proposed ensemble always yields minimum number (m) of leafs keeping pre-processing computation to n × t log m compared to N2 for Gram Matrix. We also show that the task based feature induction yields higher Qualify of Data (QoD) in the feature space compared to kernel methods using Gram Matrix.
VERSE - Virtual Equivalent Real-time Simulation
NASA Technical Reports Server (NTRS)
Zheng, Yang; Martin, Bryan J.; Villaume, Nathaniel
2005-01-01
Distributed real-time simulations provide important timing validation and hardware in the- loop results for the spacecraft flight software development cycle. Occasionally, the need for higher fidelity modeling and more comprehensive debugging capabilities - combined with a limited amount of computational resources - calls for a non real-time simulation environment that mimics the real-time environment. By creating a non real-time environment that accommodates simulations and flight software designed for a multi-CPU real-time system, we can save development time, cut mission costs, and reduce the likelihood of errors. This paper presents such a solution: Virtual Equivalent Real-time Simulation Environment (VERSE). VERSE turns the real-time operating system RTAI (Real-time Application Interface) into an event driven simulator that runs in virtual real time. Designed to keep the original RTAI architecture as intact as possible, and therefore inheriting RTAI's many capabilities, VERSE was implemented with remarkably little change to the RTAI source code. This small footprint together with use of the same API allows users to easily run the same application in both real-time and virtual time environments. VERSE has been used to build a workstation testbed for NASA's Space Interferometry Mission (SIM PlanetQuest) instrument flight software. With its flexible simulation controls and inexpensive setup and replication costs, VERSE will become an invaluable tool in future mission development.
Neural network pattern recognition of lingual-palatal pressure for automated detection of swallow.
Hadley, Aaron J; Krival, Kate R; Ridgel, Angela L; Hahn, Elizabeth C; Tyler, Dustin J
2015-04-01
We describe a novel device and method for real-time measurement of lingual-palatal pressure and automatic identification of the oral transfer phase of deglutition. Clinical measurement of the oral transport phase of swallowing is a complicated process requiring either placement of obstructive sensors or sitting within a fluoroscope or articulograph for recording. Existing detection algorithms distinguish oral events with EMG, sound, and pressure signals from the head and neck, but are imprecise and frequently result in false detection. We placed seven pressure sensors on a molded mouthpiece fitting over the upper teeth and hard palate and recorded pressure during a variety of swallow and non-swallow activities. Pressure measures and swallow times from 12 healthy and 7 Parkinson's subjects provided training data for a time-delay artificial neural network to categorize the recordings as swallow or non-swallow events. User-specific neural networks properly categorized 96 % of swallow and non-swallow events, while a generalized population-trained network was able to properly categorize 93 % of swallow and non-swallow events across all recordings. Lingual-palatal pressure signals are sufficient to selectively and specifically recognize the initiation of swallowing in healthy and dysphagic patients.
Event Oriented Design and Adaptive Multiprocessing
1991-08-31
System 5 2.3 The Classification 5 2.4 Real-Time Systems 7 2.5 Non Real-Time Systems 10 2.6 Common Characterizations of all Software Systems 10 2.7... Non -Optimal Guarantee Test Theorem 37 6.3.2 Chetto’s Optimal Guarantee Test Theorem 37 6.3.3 Multistate Case: An Extended Guarantee 39 Test Theorem...which subdivides all software systems according to the way in which they operate, such as interactive, non interactive, real-time, etc. Having defined
Notification of real-time clinical alerts generated by pharmacy expert systems.
Miller, J. E.; Reichley, R. M.; McNamee, L. A.; Steib, S. A.; Bailey, T. C.
1999-01-01
We developed and implemented a strategy for notifying clinical pharmacists of alerts generated in real-time by two pharmacy expert systems: one for drug dosing and the other for adverse drug event prevention. Display pagers were selected as the preferred notification method and a concise, yet readable, format for displaying alert data was developed. This combination of real-time alert generation and notification via display pagers was shown to be efficient and effective in a 30-day trial. PMID:10566374
Real-time monitoring of Lévy flights in a single quantum system
NASA Astrophysics Data System (ADS)
Issler, M.; Höller, J.; Imamoǧlu, A.
2016-02-01
Lévy flights are random walks where the dynamics is dominated by rare events. Even though they have been studied in vastly different physical systems, their observation in a single quantum system has remained elusive. Here we analyze a periodically driven open central spin system and demonstrate theoretically that the dynamics of the spin environment exhibits Lévy flights. For the particular realization in a single-electron charged quantum dot driven by periodic resonant laser pulses, we use Monte Carlo simulations to confirm that the long waiting times between successive nuclear spin-flip events are governed by a power-law distribution; the corresponding exponent η =-3 /2 can be directly measured in real time by observing the waiting time distribution of successive photon emission events. Remarkably, the dominant intrinsic limitation of the scheme arising from nuclear quadrupole coupling can be minimized by adjusting the magnetic field or by implementing spin echo.
They saw a movie: long-term memory for an extended audiovisual narrative.
Furman, Orit; Dorfman, Nimrod; Hasson, Uri; Davachi, Lila; Dudai, Yadin
2007-06-01
We measured long-term memory for a narrative film. During the study session, participants watched a 27-min movie episode, without instructions to remember it. During the test session, administered at a delay ranging from 3 h to 9 mo after the study session, long-term memory for the movie was probed using a computerized questionnaire that assessed cued recall, recognition, and metamemory of movie events sampled approximately 20 sec apart. The performance of each group of participants was measured at a single time point only. The participants remembered many events in the movie even months after watching it. Analysis of performance, using multiple measures, indicates differences between recent (weeks) and remote (months) memory. While high-confidence recognition performance was a reliable index of memory throughout the measured time span, cued recall accuracy was higher for relatively recent information. Analysis of different content elements in the movie revealed differential memory performance profiles according to time since encoding. We also used the data to propose lower limits on the capacity of long-term memory. This experimental paradigm is useful not only for the analysis of behavioral performance that results from encoding episodes in a continuous real-life-like situation, but is also suitable for studying brain substrates and processes of real-life memory using functional brain imaging.
Amsel, Ben D
2011-04-01
Empirically derived semantic feature norms categorized into different types of knowledge (e.g., visual, functional, auditory) can be summed to create number-of-feature counts per knowledge type. Initial evidence suggests several such knowledge types may be recruited during language comprehension. The present study provides a more detailed understanding of the timecourse and intensity of influence of several such knowledge types on real-time neural activity. A linear mixed-effects model was applied to single trial event-related potentials for 207 visually presented concrete words measured on total number of features (semantic richness), imageability, and number of visual motion, color, visual form, smell, taste, sound, and function features. Significant influences of multiple feature types occurred before 200ms, suggesting parallel neural computation of word form and conceptual knowledge during language comprehension. Function and visual motion features most prominently influenced neural activity, underscoring the importance of action-related knowledge in computing word meaning. The dynamic time courses and topographies of these effects are most consistent with a flexible conceptual system wherein temporally dynamic recruitment of representations in modal and supramodal cortex are a crucial element of the constellation of processes constituting word meaning computation in the brain. Copyright © 2011 Elsevier Ltd. All rights reserved.
They saw a movie: Long-term memory for an extended audiovisual narrative
Furman, Orit; Dorfman, Nimrod; Hasson, Uri; Davachi, Lila; Dudai, Yadin
2007-01-01
We measured long-term memory for a narrative film. During the study session, participants watched a 27-min movie episode, without instructions to remember it. During the test session, administered at a delay ranging from 3 h to 9 mo after the study session, long-term memory for the movie was probed using a computerized questionnaire that assessed cued recall, recognition, and metamemory of movie events sampled ∼20 sec apart. The performance of each group of participants was measured at a single time point only. The participants remembered many events in the movie even months after watching it. Analysis of performance, using multiple measures, indicates differences between recent (weeks) and remote (months) memory. While high-confidence recognition performance was a reliable index of memory throughout the measured time span, cued recall accuracy was higher for relatively recent information. Analysis of different content elements in the movie revealed differential memory performance profiles according to time since encoding. We also used the data to propose lower limits on the capacity of long-term memory. This experimental paradigm is useful not only for the analysis of behavioral performance that results from encoding episodes in a continuous real-life-like situation, but is also suitable for studying brain substrates and processes of real-life memory using functional brain imaging. PMID:17562897
Real-time high-level video understanding using data warehouse
NASA Astrophysics Data System (ADS)
Lienard, Bruno; Desurmont, Xavier; Barrie, Bertrand; Delaigle, Jean-Francois
2006-02-01
High-level Video content analysis such as video-surveillance is often limited by computational aspects of automatic image understanding, i.e. it requires huge computing resources for reasoning processes like categorization and huge amount of data to represent knowledge of objects, scenarios and other models. This article explains how to design and develop a "near real-time adaptive image datamart", used, as a decisional support system for vision algorithms, and then as a mass storage system. Using RDF specification as storing format of vision algorithms meta-data, we can optimise the data warehouse concepts for video analysis, add some processes able to adapt the current model and pre-process data to speed-up queries. In this way, when new data is sent from a sensor to the data warehouse for long term storage, using remote procedure call embedded in object-oriented interfaces to simplified queries, they are processed and in memory data-model is updated. After some processing, possible interpretations of this data can be returned back to the sensor. To demonstrate this new approach, we will present typical scenarios applied to this architecture such as people tracking and events detection in a multi-camera network. Finally we will show how this system becomes a high-semantic data container for external data-mining.
Building a new space weather facility at the National Observatory of Athens
NASA Astrophysics Data System (ADS)
Kontogiannis, Ioannis; Belehaki, Anna; Tsiropoula, Georgia; Tsagouri, Ioanna; Anastasiadis, Anastasios; Papaioannou, Athanasios
2016-01-01
The PROTEAS project has been initiated at the Institute of Astronomy, Astrophysics, Space Applications and Remote Sensing (IAASARS) of the National Observatory of Athens (NOA). One of its main objectives is to provide observations, processed data and space weather nowcasting and forecasting products, designed to support the space weather research community and operators of commercial and industrial systems. The space weather products to be released by this facility, will be the result of the exploitation of ground-based, as well as space-borne observations and of model results and tools already available or under development by IAASARS researchers. The objective will be achieved through: (a) the operation of a small full-disk solar telescope to conduct regular observations of the Sun in the H-alpha line; (b) the construction of a database with near real-time solar observations which will be available to the community through a web-based facility (HELIOSERVER); (c) the development of a tool for forecasting Solar Energetic Particle (SEP) events in relation to observed solar eruptive events; (d) the upgrade of the Athens Digisonde with digital transceivers and the capability of operating in bi-static link mode and (e) the sustainable operation of the European Digital Upper Atmosphere Server (DIAS) upgraded with additional data sets integrated in an interface with the HELIOSERVER and with improved models for the real-time quantification of the effects of solar eruptive events in the ionosphere.
NASA Astrophysics Data System (ADS)
Portugués Mollá, Iván; Felici, Xavier Bonache i.; Mateu Bellés, Joan F.; Segura, Juan B. Marco
2015-04-01
Flash-floods are recurrent events in the Mediterranean arch, mostly derived from cold air pool phenomena triggering hydro-geomorphic high-intensity processes, combining high discharge and low frequency. In urban environments the complexity of the processes become higher due to the existence of very fast-response basins and quick-response runoff. However, immediate activities of cleaning up and restoration delete the urban marks. After a short time both significance and dimension of the hydro-geomorphic event become completely unrecognizable. Nevertheless, these episodes generate extensive administrative documentation which is testimony of the processes in almost real time. Exploiting this source typology in order to reconstruct events far in time within urban areas, which may lack database sufficiently rich, is necessary to understand the hydrological and hydraulic derived processes. This is particularly the case of the Valencia flash-flood (1957), located in the lower Turia River basin (6.400 km2). Within a short interval (15 hours) there were registered two flood peaks (estimated at that time at 2.500 and 3.700 m3/s). The double overflowing inundated a large proportion of the urban area. The flash-flood activated fast processes with high energy that left numerous hydro-geomorphic marks. Although those tracks were deleted in a short while after the flood, it remains a legacy that had not yet been exploited, consisting of immediate aerial and oblique high resolution photography, pictures at street level, water level record and administrative records, such as claim files for compensation. Paradoxically, despite the event is considered as a milestone on metropolitan territorial planning and it was decided to divert the river Turia definitely through a major project (12 km of channeling, known as South Solution), being the scenario notably altered, the analysis of the hydrological and hydraulic process has never been reviewed. Undoubtedly, a modern study would ensure a more effective and accurate risk management within the Valencian metropolitan area. The development of a GIS-based model enables the utilization of these materials, most of them unpublished. This non-systematic information can be treated jointly from a new perspective. In short, this model facilitates the provision of a database through a vast amount of organized, structured and georeferenced information about the event. In a second stage, it makes possible to interpret the hydro-geomorphic processes from the 1957 event (trenches along barrier beaches, erosion, deposition processes…) and hydraulic processes (main flow encroachment versus quasi-hydrostatic-flood, or 1D versus 2D flood behavior), which can be identified in order to obtain georeferenced information about spatial variability, directional information of flows and point distribution of water levels and flooded points. It is also necessary to carry out photo-interpretation works to clarify some unresolved issues with the objective of establishing the real order of magnitude of the flash-flood concerning the discharge rank. In the same way, some other elements can be identified such as urban streams along the streets, levees overtopping and breaks, flooded area, etc. Lastly, in the future the GIS database will enable to obtain a more accurate both hydraulic mathematical modelling and calibration/validation.
RTOS kernel in portable electrocardiograph
NASA Astrophysics Data System (ADS)
Centeno, C. A.; Voos, J. A.; Riva, G. G.; Zerbini, C.; Gonzalez, E. A.
2011-12-01
This paper presents the use of a Real Time Operating System (RTOS) on a portable electrocardiograph based on a microcontroller platform. All medical device digital functions are performed by the microcontroller. The electrocardiograph CPU is based on the 18F4550 microcontroller, in which an uCOS-II RTOS can be embedded. The decision associated with the kernel use is based on its benefits, the license for educational use and its intrinsic time control and peripherals management. The feasibility of its use on the electrocardiograph is evaluated based on the minimum memory requirements due to the kernel structure. The kernel's own tools were used for time estimation and evaluation of resources used by each process. After this feasibility analysis, the migration from cyclic code to a structure based on separate processes or tasks able to synchronize events is used; resulting in an electrocardiograph running on one Central Processing Unit (CPU) based on RTOS.
NASA Astrophysics Data System (ADS)
de Thomaz, A. A.; Faustino, W. M.; Fontes, A.; Fernandes, H. P.; Barjas-Castro, M. d. L.; Metze, K.; Giorgio, S.; Barbosa, L. C.; Cesar, C. L.
2007-09-01
The research in biomedical photonics is clearly evolving in the direction of the understanding of biological processes at the cell level. The spatial resolution to accomplish this task practically requires photonics tools. However, an integration of different photonic tools and a multimodal and functional approach will be necessary to access the mechanical and biochemical cell processes. This way we can observe mechanicaly triggered biochemical events or biochemicaly triggered mechanical events, or even observe simultaneously mechanical and biochemical events triggered by other means, e.g. electricaly. One great advantage of the photonic tools is its easiness for integration. Therefore, we developed such integrated tool by incorporating single and double Optical Tweezers with Confocal Single and Multiphoton Microscopies. This system can perform 2-photon excited fluorescence and Second Harmonic Generation microscopies together with optical manipulations. It also can acquire Fluorescence and SHG spectra of specific spots. Force, elasticity and viscosity measurements of stretched membranes can be followed by real time confocal microscopies. Also opticaly trapped living protozoas, such as leishmania amazonensis. Integration with CARS microscopy is under way. We will show several examples of the use of such integrated instrument and its potential to observe mechanical and biochemical processes at cell level.
Fond, Guillaume; Gaman, Alexandru; Brunel, Lore; Haffen, Emmanuel; Llorca, Pierre-Michel
2015-08-30
Two studies have shown that increasing the consultation of the word "suicide" in the Google search engine was associated with a subsequent increase in the prevalence of suicide attempts. The main goal of this article was to explore the trends generated by a key-word search associated with suicide, depression and bipolarity in an attempt to identify general trends (disorders epidemics in the population/"real events" vs newsworthy advertisement/"media event"). Based on previous studies, the frequency of the search words "how to suicide" and "commit suicide" were analyzed for suicide, as well as "depression" (for depressive disorders) and "bipolar disorder". Together, these analyses suggest that the search for the words "how to suicide" or "commit suicide" on the Google search engine may be a good indicator for suicide prevention policies. However, the tool is not developed enough to date to be used as a real time dynamic indicator of suicide epidemics. The frequency of the search for the word "suicide" was associated with those for "depression" but not for "bipolar disorder", but searches for psychiatric conditions seem to be influenced by media events more than by real events in the general population. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Characterization of real-time computers
NASA Technical Reports Server (NTRS)
Shin, K. G.; Krishna, C. M.
1984-01-01
A real-time system consists of a computer controller and controlled processes. Despite the synergistic relationship between these two components, they have been traditionally designed and analyzed independently of and separately from each other; namely, computer controllers by computer scientists/engineers and controlled processes by control scientists. As a remedy for this problem, in this report real-time computers are characterized by performance measures based on computer controller response time that are: (1) congruent to the real-time applications, (2) able to offer an objective comparison of rival computer systems, and (3) experimentally measurable/determinable. These measures, unlike others, provide the real-time computer controller with a natural link to controlled processes. In order to demonstrate their utility and power, these measures are first determined for example controlled processes on the basis of control performance functionals. They are then used for two important real-time multiprocessor design applications - the number-power tradeoff and fault-masking and synchronization.
High-Speed Observer: Automated Streak Detection in SSME Plumes
NASA Technical Reports Server (NTRS)
Rieckoff, T. J.; Covan, M.; OFarrell, J. M.
2001-01-01
A high frame rate digital video camera installed on test stands at Stennis Space Center has been used to capture images of Space Shuttle main engine plumes during test. These plume images are processed in real time to detect and differentiate anomalous plume events occurring during a time interval on the order of 5 msec. Such speed yields near instantaneous availability of information concerning the state of the hardware. This information can be monitored by the test conductor or by other computer systems, such as the integrated health monitoring system processors, for possible test shutdown before occurrence of a catastrophic engine failure.
Helbig-Lang, Sylvia; von Auer, Maxie; Neubauer, Karolin; Murray, Eileen; Gerlach, Alexander L
2016-09-01
Excessive post-mortem processing after social situations, a core symptom of social anxiety disorder (SAD), is thought to contribute to the perpetuation of social anxiety by consolidating negative self-schemata. Empirical findings on actual mechanisms underlying this so-called Post-Event Processing (PEP) are still scarce. The present study sought to identify variables associated with the experience of PEP after real-life social situations in a sample of 49 individuals diagnosed with SAD. Using an ambulatory assessment approach, individuals were asked to report on each distressing social event experienced during one week. A total of 192 events were captured. Hierarchical linear modeling indicated that next to trait social anxiety, the type of social situation (performance vs. interaction situations), self-focused attention, safety behavior use, and negative affect predicted levels of PEP after social situations. These findings add to the growing literature that emphasizes the importance of situational factors for the experience of PEP, and highlight potential venues to prevent it. Copyright © 2016 Elsevier Ltd. All rights reserved.
Real-time X-ray Diffraction: Applications to Materials Characterization
NASA Technical Reports Server (NTRS)
Rosemeier, R. G.
1984-01-01
With the high speed growth of materials it becomes necessary to develop measuring systems which also have the capabilities of characterizing these materials at high speeds. One of the conventional techniques of characterizing materials was X-ray diffraction. Film, which is the oldest method of recording the X-ray diffraction phenomenon, is not quite adequate in most circumstances to record fast changing events. Even though conventional proportional counters and scintillation counters can provide the speed necessary to record these changing events, they lack the ability to provide image information which may be important in some types of experiment or production arrangements. A selected number of novel applications of using X-ray diffraction to characterize materials in real-time are discussed. Also, device characteristics of some X-ray intensifiers useful in instantaneous X-ray diffraction applications briefly presented. Real-time X-ray diffraction experiments with the incorporation of image X-ray intensification add a new dimension in the characterization of materials. The uses of real-time image intensification in laboratory and production arrangements are quite unlimited and their application depends more upon the ingenuity of the scientist or engineer.
NASA Astrophysics Data System (ADS)
Beranzoli, Laura; Best, Mairi; Chierici, Francesco; Embriaco, Davide; Galbraith, Nan; Heeseman, Martin; Kelley, Deborah; Pirenne, Benoit; Scofield, Oscar; Weller, Robert
2015-04-01
There is a need for tsunami modeling and early warning systems for near-source areas. For example this is a common public safety threat in the Mediterranean and Juan de Fuca/NE Pacific Coast of N.A.; Regions covered by the EMSO, OOI, and ONC ocean observatories. Through the CoopEUS international cooperation project, a number of environmental research infrastructures have come together to coordinate efforts on environmental challenges; this tsunami case study tackles one such challenge. There is a mutual need of tsunami event field data and modeling to deepen our experience in testing methodology and developing real-time data processing. Tsunami field data are already available for past events, part of this use case compares these for compatibility, gap analysis, and model groundtruthing. It also reviews sensors needed and harmonizes instrument settings. Sensor metadata and registries are compared, harmonized, and aligned. Data policies and access are also compared and assessed for gap analysis. Modelling algorithms are compared and tested against archived and real-time data. This case study will then be extended to other related tsunami data and model sources globally with similar geographic and seismic scenarios.
Taverniers, Isabel; Windels, Pieter; Vaïtilingom, Marc; Milcamps, Anne; Van Bockstaele, Erik; Van den Eede, Guy; De Loose, Marc
2005-04-20
Since the 18th of April 2004, two new regulations, EC/1829/2003 on genetically modified food and feed products and EC/1830/2003 on traceability and labeling of GMOs, are in force in the EU. This new, comprehensive regulatory framework emphasizes the need of an adequate tracing system. Unique identifiers, such as the transgene genome junction region or a specific rearrangement within the transgene DNA, should form the basis of such a tracing system. In this study, we describe the development of event-specific tracing systems for transgenic maize lines Bt11, Bt176, and GA21 and for canola event GT73. Molecular characterization of the transgene loci enabled us to clone an event-specific sequence into a plasmid vector, to be used as a marker, and to develop line-specific primers. Primer specificity was tested through qualitative PCRs and dissociation curve analysis in SYBR Green I real-time PCRs. The primers were then combined with event-specific TaqMan probes in quantitative real-time PCRs. Calibration curves were set up both with genomic DNA samples and the newly synthesized plasmid DNA markers. It is shown that cloned plasmid GMO target sequences are perfectly suitable as unique identifiers and quantitative calibrators. Together with an event-specific primer pair and a highly specific TaqMan probe, the plasmid markers form crucial components of a unique and straighforward tracing system for Bt11, Bt176, and GA21 maize and GT73 canola events.
Real Time Data for Seismology at the IRIS Data Management Center, AN Nsf-Sponsored Facility
NASA Astrophysics Data System (ADS)
Benson, R. B.; Ahern, T. K.; Trabant, C.; Weertman, B. R.; Casey, R.; Stromme, S.; Karstens, R.
2012-12-01
When IRIS was incorporated in 1984, it committed to provide long-term support for the science of seismology. It first upgraded analog networks by installing observatory grade digital seismic recording equipment (by constructing the Global Seismic Network to upgrade the World Wide Standardized Seismographic Network) that became the backbone of the International Federation of Digital Seismic Networks (FDSN), and in 1990 constructed a state-of-the-art data center that would allow free and open access to data to everyone. For the first decade, IRIS leveraged a complicated system of telemetry which laid the foundation for delivering (relatively) high rate and continuous seismic time series data to the IRIS Data Management Center, which was designed to accept data that arrived with highly variable latencies and on many media formats. This meant that science had to often wait until data became complete, which at the time was primarily related to studying earthquakes or similar events. During the 1990's, numerous incremental but small improvements were made to get data into the hands of users with less latency, leveraging dialup, satellite telemetry, and a variety of Internet protocols. But beginning in 2000, the IRIS Data Management Center began the process of accumulating data comprehensively in real time. It was first justified because it eliminated the time-consuming transcription and manual data handling on various media formats, like magnetic tapes, CD's and DVD's. However, the switch to real-time telemetry proved to be a major improvement technologically because it not only simplified data transfer, it opened access to a large volume of previously inaccessible data (local resource limitations), and many networks began willingly providing their geophysical data to the broad research community. It also enabled researchers the ability to process data in different and streamlined ways, by incorporating data directly into workflows and processing packages. Any network on the Internet, small or large, can now share data, and today, the IRIS DMC receives nearly all of its seismic data from regional and international networks in real time. We will show that this evolution to managing real time data has provided the framework for accomplishing many important benefits that illustrate that open, real time data should be the goal of every observatory operation and can provide: - Faster (therefore cost and data saving) quality control, - Data products that highlight source properties and provide teachable moments - Data delivery to regional or national networks around the globe for immediate access for monitoring. -Use in teaching the public, providing streaming data to museums, schools, etc.
Real-time Environmental Monitoring from a Wind Farm Platform in the Texas Hypoxic Zone
NASA Astrophysics Data System (ADS)
Mullins, R. L.; Dimarco, S. F.; Walpert, J. N.; Guinasso, N. L.; Howard, M. K.
2009-12-01
Ocean observing systems (OOS) provide coastal managers with data for informed decision-making. OOS are designed to monitor oceanographic and atmospheric conditions from a variety of offshore platforms. In the summer of 2009, a multi-disciplinary system, the Galveston Instrument Garden for Environmental Monitoring (GIGEM), was deployed off the coast of Galveston, Texas (Location: 29o 08’ 29.654’’N, 94o 44’ 51.339’’W) to monitor coastal waters and provide real-time observations for investigating processes responsible for coastal Texas hypoxia. Hypoxia occurs in the Gulf of Mexico over the continental shelf and refers to low dissolved oxygen concentrations in the bottom waters caused by a combination of environmental and physical parameters. Events form rapidly, last for a few days to weeks, and commonly occur along the Louisiana and Texas coasts; however, little research has been conducted to investigate the processes responsible for Texas hypoxia formation. GIGEM was designed to study this problem by contributing real-time measurements to compare with historical coastal data series. Unlike most coastal OOS, GIGEM is installed on an experimental wind farm platform operated by Wind Energy System Technologies Inc. This platform is the first executed offshore wind energy lease in the United States. GIGEM is comprised of two components, the subsurface mooring and a nearby bottom package. The data telemetry system includes a unique design of underwater and surface inductive modems. GIGEM is the only coastal OOS currently collecting real-time environmental water quality measurements on the Texas shelf. The work presented describes: the obstacles and challenges associated with deploying GIGEM, the flow of information from the water column to the user, and how this type of OOS fulfills the societal goals for protecting coastal ecosystems and improving coastal weather and ocean predictions envisioned by the Integrated Ocean Observing System (IOOS). Data and analysis results include profiles of vertical water column, examining the role of stratification in the formation of coastal hypoxia, and the influence of storm events on water column stability recorded from GIGEM. The comparison of real-time data from GIGEM with historical data will be presented in a unique 4D visualization tool (Eonfusion, Myriax Pty. Ltd.) as a useful method for investigating coastal hypoxia. The GIGEM data sets will be fused with model and remotely sensed data from the Gulf of Mexico Coastal Observing System (GCOOS) data portal to show the data in broader context for use in decision support tools.
NASA Astrophysics Data System (ADS)
Chen, Sheng; Hu, Junjun; Zhang, Asi; Min, Chao; Huang, Chaoying; Liang, Zhenqing
2018-02-01
This study assesses the performance of near real-time Global Satellite Mapping of Precipitation (GSMaP_NRT) estimates over northern China, including Beijing and its adjacent regions, during three heavy precipitation events from 21 July 2012 to 2 August 2012. Two additional near real-time satellite-based products, the Climate Prediction Center morphing method (CMORPH) and Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks-Cloud Classification System (PERSIANN-CCS), were used for parallel comparison with GSMaP_NRT. Gridded gauge observations were used as reference for a performance evaluation with respect to spatiotemporal variability, probability distribution of precipitation rate and volume, and contingency scores. Overall, GSMaP_NRT generally captures the spatiotemporal variability of precipitation and shows promising potential in near real-time mapping applications. GSMaP_NRT misplaced storm centers in all three storms. GSMaP_NRT demonstrated higher skill scores in the first high-impact storm event on 21 July 2015. GSMaP_NRT passive microwave only precipitation can generally capture the pattern of heavy precipitation distributions over flat areas but failed to capture the intensive rain belt over complicated mountainous terrain. The results of this study can be useful to both algorithm developers and the scientific end users, providing a better understanding of strengths and weaknesses to hydrologists using satellite precipitation products.
Full-Field Spectroscopy at Megahertz-frame-rates: Application of Coherent Time-Stretch Transform
NASA Astrophysics Data System (ADS)
DeVore, Peter Thomas Setsuda
Outliers or rogue events are found extensively in our world and have incredible effects. Also called rare events, they arise in the distribution of wealth (e.g., Pareto index), finance, network traffic, ocean waves, and e-commerce (selling less of more). Interest in rare optical events exploded after the sighting of optical rogue waves in laboratory experiments at UCLA. Detecting such tail events in fast streams of information necessitates real-time measurements. The Coherent Time-Stretch Transform chirps a pulsed source of radiation so that its temporal envelope matches its spectral profile (analogous to the far field regime of spatial diffraction), and the mapped spectral electric field is slow enough to be captured by a real-time digitizer. Combining this technique with spectral encoding, the time stretch technique has enabled a new class of ultra-high performance spectrometers and cameras (30+ MHz), and analog-to-digital converters that have led to the discovery of optical rogue waves and detection of cancer cells in blood with one in a million sensitivity. Conventionally, the Coherent Time-Stretch Transform maps the spectrum into the temporal electric field, but the time-dilation process along with inherent fiber losses results in reduction of peak power and loss of sensitivity, a problem exacerbated by extremely narrow molecular linewidths. The loss issue notwithstanding, in many cases the requisite dispersive optical device is not available. By extending the Coherent Time-Stretch Transform to the temporal near field, I have demonstrated, for the first time, phase-sensitive absorption spectroscopy of a gaseous sample at millions of frames per second. As the Coherent Time-Stretch Transform may capture both near and far field optical waves, it is a complete spectro-temporal optical characterization tool. This is manifested as an amplitude-dependent chirp, which implies the ability to measure the complex refractive index dispersion at megahertz frame rates. This technique is not only four orders of magnitude faster than even the fastest (kHz) spectrometers, but will also enable capture of real-time complex dielectric function dynamics of plasmas and chemical reactions (e.g. combustion). It also has applications in high-energy physics, biology, and monitoring fast high-throughput industrial processes. Adding an electro-optic modulator to the Time-Stretch Transform yields time-to-time mapping of electrical waveforms. Known as TiSER, it is an analog slow-motion processor that uses light to reduce the bandwidth of broadband RF signals for capture by high-sensitivity analog-to-digital converters (ADC). However, the electro-optic modulator limits the electrical bandwidth of TiSER. To solve this, I introduced Optical Sideband-only Amplification, wherein electro-optically generated modulation (containing the RF information) is amplified at the expense of the carrier, addressing the two most important problems plaguing electro-optic modulators: (1) low RF bandwidth and (2) high required RF drive voltages. I demonstrated drive voltage reductions of 5x at 10 GHz and 10x at 50 GHz, while simultaneously increasing the RF bandwidth.
Time-recovering PCI-AER interface for bio-inspired spiking systems
NASA Astrophysics Data System (ADS)
Paz-Vicente, R.; Linares-Barranco, A.; Cascado, D.; Vicente, S.; Jimenez, G.; Civit, A.
2005-06-01
Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows for real-time virtual massive connectivity between huge number neurons located on different chips. By exploiting high speed digital communication circuits (with nano-seconds timings), synaptic neural connections can be time multiplexed, while neural activity signals (with mili-seconds timings) are sampled at low frequencies. Also, neurons generate 'events' according to their activity levels. More active neurons generate more events per unit time, and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip muti-layered AER systems it is absolutely necessary to have a computer interface that allows (a) to read AER interchip traffic into the computer and visualize it on screen, and (b) inject a sequence of events at some point of the AER structure. This is necessary for testing and debugging complex AER systems. This paper presents a PCI to AER interface, that dispatches a sequence of events received from the PCI bus with embedded timing information to establish when each event will be delivered. A set of specialized states machines has been introduced to recovery the possible time delays introduced by the asynchronous AER bus. On the input channel, the interface capture events assigning a timestamp and delivers them through the PCI bus to MATLAB applications. It has been implemented in real time hardware using VHDL and it has been tested in a PCI-AER board, developed by authors, that includes a Spartan II 200 FPGA. The demonstration hardware is currently capable to send and receive events at a peak rate of 8,3 Mev/sec, and a typical rate of 1 Mev/sec.
A Low-Cost Tracking System for Running Race Applications Based on Bluetooth Low Energy Technology.
Perez-Diaz-de-Cerio, David; Hernández-Solana, Ángela; Valdovinos, Antonio; Valenzuela, Jose Luis
2018-03-20
Timing points used in running races and other competition events are generally based on radio-frequency identification (RFID) technology. Athletes' times are calculated via passive RFID tags and reader kits. Specifically, the reader infrastructure needed is complex and requires the deployment of a mat or ramps which hide the receiver antennae under them. Moreover, with the employed tags, it is not possible to transmit additional and dynamic information such as pulse or oximetry monitoring, alarms, etc. In this paper we present a system based on two low complex schemes allowed in Bluetooth Low Energy (BLE): the non-connectable undirected advertisement process and a modified version of scannable undirected advertisement process using the new capabilities present in Bluetooth 5. After fully describing the system architecture, which allows full real-time position monitoring of the runners using mobile phones on the organizer side and BLE sensors on the participants' side, we derive the mobility patterns of runners and capacity requirements, which are determinant for evaluating the performance of the proposed system. They have been obtained from the analysis of the real data measured in the last Barcelona Marathon. By means of simulations, we demonstrate that, even under disadvantageous conditions (50% error ratio), both schemes perform reliably and are able to detect the 100% of the participants in all the cases. The cell coverage of the system needs to be adjusted when non-connectable process is considered. Nevertheless, through simulation and experimental, we show that the proposed scheme based on the new events available in Bluetooth 5 is clearly the best implementation alternative for all the cases, no matter the coverage area and the runner speed. The proposal widely exceeds the detection requirements of the real scenario, surpassing the measured peaks of 20 sensors per second incoming in the coverage area, moving at speeds that range from 1.5 m/s to 6.25 m/s. The designed real test-bed shows that the scheme is able to detect 72 sensors below 600 ms, fulfilling comfortably the requirements determined for the intended application. The main disadvantage of this system would be that the sensors are active, but we have proved that its consumption can be so low (9.5 µA) that, with a typical button cell, the sensor battery life would be over 10,000 h of use.
A Low-Cost Tracking System for Running Race Applications Based on Bluetooth Low Energy Technology
2018-01-01
Timing points used in running races and other competition events are generally based on radio-frequency identification (RFID) technology. Athletes’ times are calculated via passive RFID tags and reader kits. Specifically, the reader infrastructure needed is complex and requires the deployment of a mat or ramps which hide the receiver antennae under them. Moreover, with the employed tags, it is not possible to transmit additional and dynamic information such as pulse or oximetry monitoring, alarms, etc. In this paper we present a system based on two low complex schemes allowed in Bluetooth Low Energy (BLE): the non-connectable undirected advertisement process and a modified version of scannable undirected advertisement process using the new capabilities present in Bluetooth 5. After fully describing the system architecture, which allows full real-time position monitoring of the runners using mobile phones on the organizer side and BLE sensors on the participants’ side, we derive the mobility patterns of runners and capacity requirements, which are determinant for evaluating the performance of the proposed system. They have been obtained from the analysis of the real data measured in the last Barcelona Marathon. By means of simulations, we demonstrate that, even under disadvantageous conditions (50% error ratio), both schemes perform reliably and are able to detect the 100% of the participants in all the cases. The cell coverage of the system needs to be adjusted when non-connectable process is considered. Nevertheless, through simulation and experimental, we show that the proposed scheme based on the new events available in Bluetooth 5 is clearly the best implementation alternative for all the cases, no matter the coverage area and the runner speed. The proposal widely exceeds the detection requirements of the real scenario, surpassing the measured peaks of 20 sensors per second incoming in the coverage area, moving at speeds that range from 1.5 m/s to 6.25 m/s. The designed real test-bed shows that the scheme is able to detect 72 sensors below 600 ms, fulfilling comfortably the requirements determined for the intended application. The main disadvantage of this system would be that the sensors are active, but we have proved that its consumption can be so low (9.5 µA) that, with a typical button cell, the sensor battery life would be over 10,000 h of use. PMID:29558432
Symbolic discrete event system specification
NASA Technical Reports Server (NTRS)
Zeigler, Bernard P.; Chi, Sungdo
1992-01-01
Extending discrete event modeling formalisms to facilitate greater symbol manipulation capabilities is important to further their use in intelligent control and design of high autonomy systems. An extension to the DEVS formalism that facilitates symbolic expression of event times by extending the time base from the real numbers to the field of linear polynomials over the reals is defined. A simulation algorithm is developed to generate the branching trajectories resulting from the underlying nondeterminism. To efficiently manage symbolic constraints, a consistency checking algorithm for linear polynomial constraints based on feasibility checking algorithms borrowed from linear programming has been developed. The extended formalism offers a convenient means to conduct multiple, simultaneous explorations of model behaviors. Examples of application are given with concentration on fault model analysis.
1989-09-16
SWOTHR was conceived to be an organic asset capable of providing early detection and tracking of fast , surface-skimming threats, such as cruise missiles...distributed real-time processing and threat tracking system. Spe- cific project goals were to verify detection performance pree ctions for small, fast targets...means that enlarging the ground plane would have been a fruitless excercise in any event. B-6 5 i I U Table B-1 summarizes the calculated parameters of
On line instrument systems for monitoring steam turbogenerators
NASA Astrophysics Data System (ADS)
Clapis, A.; Giorgetti, G.; Lapini, G. L.; Benanti, A.; Frigeri, C.; Gadda, E.; Mantino, E.
A computerized real time data acquisition and data processing for the diagnosis of malfunctioning of steam turbogenerator systems is described. Pressure, vibration and temperature measurements are continuously collected from standard or special sensors including startup or stop events. The architecture of the monitoring system is detailed. Examples of the graphics output are presented. It is shown that such a system allows accurate diagnosis and the possibility of creating a data bank to describe the dynamic characteristics of the machine park.
Ayres-de-Campos, Diogo; Rei, Mariana; Nunes, Inês; Sousa, Paulo; Bernardes, João
2017-01-01
SisPorto 4.0 is the most recent version of a program for the computer analysis of cardiotocographic (CTG) signals and ST events, which has been adapted to the 2015 International Federation of Gynaecology and Obstetrics (FIGO) guidelines for intrapartum foetal monitoring. This paper provides a detailed description of the analysis performed by the system, including the signal-processing algorithms involved in identification of basic CTG features and the resulting real-time alerts.
A novel seizure detection algorithm informed by hidden Markov model event states
NASA Astrophysics Data System (ADS)
Baldassano, Steven; Wulsin, Drausin; Ung, Hoameng; Blevins, Tyler; Brown, Mesha-Gay; Fox, Emily; Litt, Brian
2016-06-01
Objective. Recently the FDA approved the first responsive, closed-loop intracranial device to treat epilepsy. Because these devices must respond within seconds of seizure onset and not miss events, they are tuned to have high sensitivity, leading to frequent false positive stimulations and decreased battery life. In this work, we propose a more robust seizure detection model. Approach. We use a Bayesian nonparametric Markov switching process to parse intracranial EEG (iEEG) data into distinct dynamic event states. Each event state is then modeled as a multidimensional Gaussian distribution to allow for predictive state assignment. By detecting event states highly specific for seizure onset zones, the method can identify precise regions of iEEG data associated with the transition to seizure activity, reducing false positive detections associated with interictal bursts. The seizure detection algorithm was translated to a real-time application and validated in a small pilot study using 391 days of continuous iEEG data from two dogs with naturally occurring, multifocal epilepsy. A feature-based seizure detector modeled after the NeuroPace RNS System was developed as a control. Main results. Our novel seizure detection method demonstrated an improvement in false negative rate (0/55 seizures missed versus 2/55 seizures missed) as well as a significantly reduced false positive rate (0.0012 h versus 0.058 h-1). All seizures were detected an average of 12.1 ± 6.9 s before the onset of unequivocal epileptic activity (unequivocal epileptic onset (UEO)). Significance. This algorithm represents a computationally inexpensive, individualized, real-time detection method suitable for implantable antiepileptic devices that may considerably reduce false positive rate relative to current industry standards.
Lacour, C; Joannis, C; Gromaire, M-C; Chebbo, G
2009-01-01
Turbidity sensors can be used to continuously monitor the evolution of pollutant mass discharge. For two sites within the Paris combined sewer system, continuous turbidity, conductivity and flow data were recorded at one-minute time intervals over a one-year period. This paper is intended to highlight the variability in turbidity dynamics during wet weather. For each storm event, turbidity response aspects were analysed through different classifications. The correlation between classification and common parameters, such as the antecedent dry weather period, total event volume per impervious hectare and both the mean and maximum hydraulic flow for each event, was also studied. Moreover, the dynamics of flow and turbidity signals were compared at the event scale. No simple relation between turbidity responses, hydraulic flow dynamics and the chosen parameters was derived from this effort. Knowledge of turbidity dynamics could therefore potentially improve wet weather management, especially when using pollution-based real-time control (P-RTC) since turbidity contains information not included in hydraulic flow dynamics and not readily predictable from such dynamics.
Real-Time Radiographic In-Situ Characterization Of Ply Lift In Composite Aerospace Materials
NASA Technical Reports Server (NTRS)
Beshears, Ronald D.; Doering, Edward R.
2006-01-01
The problem of ply lifting in composite materials is a significant issue for various aerospace and military applications. A fundamental element in the prevention or mitigation of ply lift is determination of the timing of the ply lifting event during exposure of the composite material to flight conditions. The Marshall Space Flight Center s Nondestructive Evaluation Team developed a real-time radiographic technique for the detection of ply lift in carbon phenolic ablative materials in situ during live firings of subscale test motors in support of NASA s Reusable Solid Rocket Motor program, using amorphous silicon detector panels. The radiographic method has successfully detected ply lifting in seven consecutive carbon phenolic converging cones attached to solid fuel torches, providing the time of ply lift initiation in each test. Post-processing of the radiographic images improved the accuracy of timing measurements and allowed measurement of the ply lifting height as a function of time. Radiographic data correlated well with independent pressure and temperature measurements that indicate the onset of ply lift in the nozzle material.
Avatars, Virtual Reality Technology, and the U.S. Military: Emerging Policy Issues
2008-04-09
called “ Sentient Worldwide Simulation,” which will “mirror” real life and automatically follow real-world events in real time. Some virtual world...cities, with the final goal of creating a fully functioning virtual model of the entire world, which will be known as the Sentient Worldwide Simulation
Alternating event processes during lifetimes: population dynamics and statistical inference.
Shinohara, Russell T; Sun, Yifei; Wang, Mei-Cheng
2018-01-01
In the literature studying recurrent event data, a large amount of work has been focused on univariate recurrent event processes where the occurrence of each event is treated as a single point in time. There are many applications, however, in which univariate recurrent events are insufficient to characterize the feature of the process because patients experience nontrivial durations associated with each event. This results in an alternating event process where the disease status of a patient alternates between exacerbations and remissions. In this paper, we consider the dynamics of a chronic disease and its associated exacerbation-remission process over two time scales: calendar time and time-since-onset. In particular, over calendar time, we explore population dynamics and the relationship between incidence, prevalence and duration for such alternating event processes. We provide nonparametric estimation techniques for characteristic quantities of the process. In some settings, exacerbation processes are observed from an onset time until death; to account for the relationship between the survival and alternating event processes, nonparametric approaches are developed for estimating exacerbation process over lifetime. By understanding the population dynamics and within-process structure, the paper provide a new and general way to study alternating event processes.
Aspects regarding at 13C isotope separation column control using Petri nets system
NASA Astrophysics Data System (ADS)
Boca, M. L.; Ciortea, M. E.
2015-11-01
This paper is intended to show that Petri nets can be also applicable in the chemical industry. It used linear programming, modeling underlying Petri nets, especially discrete event systems for isotopic separation, the purpose of considering and control events in real-time through graphical representations. In this paper it is simulate the control of 13C Isotope Separation column using Petri nets. The major problem with 13C comes from the difficulty of obtaining it and raising its natural fraction. Carbon isotopes can be obtained using many methods, one of them being the cryogenic distillation of carbon monoxide. Some few aspects regarding operating conditions and the construction of such cryogenic plants are known today, and even less information are available as far as the separation process modeling and control are concerned. In fact, the efficient control of the carbon monoxide distillation process represents a necessity for large-scale 13C production. Referring to a classic distillation process, some models for carbon isotope separation have been proposed, some based on mass, component and energy balance equations, some on the nonlinear wave theory or the Cohen equations. For modeling the system it was used Petri nets because in this case it is deal with discrete event systems. In use of the non-timed and with auxiliary times Petri model, the transport stream was divided into sections and these sections will be analyzed successively. Because of the complexity of the system and the large amount of calculations required it was not possible to analyze the system as a unitary whole. A first attempt to model the system as a unitary whole led to the blocking of the model during simulation, because of the large processing times.
The AMchip04 and the processing unit prototype for the FastTracker
NASA Astrophysics Data System (ADS)
Andreani, A.; Annovi, A.; Beretta, M.; Bogdan, M.; Citterio, M.; Alberti, F.; Giannetti, P.; Lanza, A.; Magalotti, D.; Piendibene, M.; Shochet, M.; Stabile, A.; Tang, J.; Tompkins, L.; Volpi, G.
2012-08-01
Modern experiments search for extremely rare processes hidden in much larger background levels. As the experiment`s complexity, the accelerator backgrounds and luminosity increase we need increasingly complex and exclusive event selection. We present the first prototype of a new Processing Unit (PU), the core of the FastTracker processor (FTK). FTK is a real time tracking device for the ATLAS experiment`s trigger upgrade. The computing power of the PU is such that a few hundred of them will be able to reconstruct all the tracks with transverse momentum above 1 GeV/c in ATLAS events up to Phase II instantaneous luminosities (3 × 1034 cm-2 s-1) with an event input rate of 100 kHz and a latency below a hundred microseconds. The PU provides massive computing power to minimize the online execution time of complex tracking algorithms. The time consuming pattern recognition problem, generally referred to as the ``combinatorial challenge'', is solved by the Associative Memory (AM) technology exploiting parallelism to the maximum extent; it compares the event to all pre-calculated ``expectations'' or ``patterns'' (pattern matching) simultaneously, looking for candidate tracks called ``roads''. This approach reduces to a linear behavior the typical exponential complexity of the CPU based algorithms. Pattern recognition is completed by the time data are loaded into the AM devices. We report on the design of the first Processing Unit prototypes. The design had to address the most challenging aspects of this technology: a huge number of detector clusters (``hits'') must be distributed at high rate with very large fan-out to all patterns (10 Million patterns will be located on 128 chips placed on a single board) and a huge number of roads must be collected and sent back to the FTK post-pattern-recognition functions. A network of high speed serial links is used to solve the data distribution problem.
Real-Time Processing System for the JET Hard X-Ray and Gamma-Ray Profile Monitor Enhancement
NASA Astrophysics Data System (ADS)
Fernandes, Ana M.; Pereira, Rita C.; Neto, André; Valcárcel, Daniel F.; Alves, Diogo; Sousa, Jorge; Carvalho, Bernardo B.; Kiptily, Vasily; Syme, Brian; Blanchard, Patrick; Murari, Andrea; Correia, Carlos M. B. A.; Varandas, Carlos A. F.; Gonçalves, Bruno
2014-06-01
The Joint European Torus (JET) is currently undertaking an enhancement program which includes tests of relevant diagnostics with real-time processing capabilities for the International Thermonuclear Experimental Reactor (ITER). Accordingly, a new real-time processing system was developed and installed at JET for the gamma-ray and hard X-ray profile monitor diagnostic. The new system is connected to 19 CsI(Tl) photodiodes in order to obtain the line-integrated profiles of the gamma-ray and hard X-ray emissions. Moreover, it was designed to overcome the former data acquisition (DAQ) limitations while exploiting the required real-time features. The new DAQ hardware, based on the Advanced Telecommunication Computer Architecture (ATCA) standard, includes reconfigurable digitizer modules with embedded field-programmable gate array (FPGA) devices capable of acquiring and simultaneously processing data in real-time from the 19 detectors. A suitable algorithm was developed and implemented in the FPGAs, which are able to deliver the corresponding energy of the acquired pulses. The processed data is sent periodically, during the discharge, through the JET real-time network and stored in the JET scientific databases at the end of the pulse. The interface between the ATCA digitizers, the JET control and data acquisition system (CODAS), and the JET real-time network is provided by the Multithreaded Application Real-Time executor (MARTe). The work developed allowed attaining two of the major milestones required by next fusion devices: the ability to process and simultaneously supply high volume data rates in real-time.
Regional early flood warning system: design and implementation
NASA Astrophysics Data System (ADS)
Chang, L. C.; Yang, S. N.; Kuo, C. L.; Wang, Y. F.
2017-12-01
This study proposes a prototype of the regional early flood inundation warning system in Tainan City, Taiwan. The AI technology is used to forecast multi-step-ahead regional flood inundation maps during storm events. The computing time is only few seconds that leads to real-time regional flood inundation forecasting. A database is built to organize data and information for building real-time forecasting models, maintaining the relations of forecasted points, and displaying forecasted results, while real-time data acquisition is another key task where the model requires immediately accessing rain gauge information to provide forecast services. All programs related database are constructed in Microsoft SQL Server by using Visual C# to extracting real-time hydrological data, managing data, storing the forecasted data and providing the information to the visual map-based display. The regional early flood inundation warning system use the up-to-date Web technologies driven by the database and real-time data acquisition to display the on-line forecasting flood inundation depths in the study area. The friendly interface includes on-line sequentially showing inundation area by Google Map, maximum inundation depth and its location, and providing KMZ file download of the results which can be watched on Google Earth. The developed system can provide all the relevant information and on-line forecast results that helps city authorities to make decisions during typhoon events and make actions to mitigate the losses.
The geology and Mesozoic collisional history of the Cordillera Real, Ecuador
NASA Astrophysics Data System (ADS)
Aspden, John A.; Litherland, Martin
1992-04-01
The geology of the metamorphic rocks of the Cordillera Real of Ecuador is described in terms of five informal lithotectonic divisions. We deduce that during the Mesozoic repeated accretionary events occurred and that dextral transpression has been of fundamental importance in determining the tectonic evolution of this part of the Northern Andes. The oldest event recognised, of probable Late Triassic age, may be related to the break-up of western Gondwana and generated a regional belt of 'S-type' plutons. During the Jurassic, major calc-alkaline batholiths were intruded. Following this, in latest Jurassic to Early Cretaceous time, a volcano-sedimentary terrane, of possible oceanic or marginal basin origin (the Alao division), and the most westerly, gneissic Chaucha-Arenillas terrane, were accreted to continental South America. The accretion of the oceanic Western Cordillera took place in latest Cretaceous to earliest Tertiary time. This latter event coincided with widespread thermal disturbance, as evidenced by the large number of young K-Ar mineral ages recorded from the Cordillera Real.
NASA Technical Reports Server (NTRS)
Albus, James S.
1996-01-01
The Real-time Control System (RCS) developed at NIST and elsewhere over the past two decades defines a reference model architecture for design and analysis of complex intelligent control systems. The RCS architecture consists of a hierarchically layered set of functional processing modules connected by a network of communication pathways. The primary distinguishing feature of the layers is the bandwidth of the control loops. The characteristic bandwidth of each level is determined by the spatial and temporal integration window of filters, the temporal frequency of signals and events, the spatial frequency of patterns, and the planning horizon and granularity of the planners that operate at each level. At each level, tasks are decomposed into sequential subtasks, to be performed by cooperating sets of subordinate agents. At each level, signals from sensors are filtered and correlated with spatial and temporal features that are relevant to the control function being implemented at that level.
A real-time early warning system for pathogens in water
NASA Astrophysics Data System (ADS)
Adams, John A.; McCarty, David; Crousore, Kristina
2006-05-01
The events of September 11, 2001 represented an escalation in the means and effects of terrorist attacks and raised awareness of the vulnerability of major infrastructures such as transportation, finance, power and energy, communications, food, and water. A re-examination of the security of critical assets was initiated. Actions were taken in the United States to protect our drinking water. Anti-terrorism monitoring systems that allow us to take action before contaminated water can reach the consumer have been under development since then. This presentation will discuss the current performance of a laser-based, multi-angle light scattering (MALS) technology for continuous, real-time detection and classification of microorganisms for security applications in all drinking and process water applications inclusive of protection of major assets, potable and distributed water. Field test data for a number of waterborne pathogens will also be presented.
NASA Astrophysics Data System (ADS)
Sulaiman, M.; El-Shafie, A.; Karim, O.; Basri, H.
2011-10-01
Flood forecasting models are a necessity, as they help in planning for flood events, and thus help prevent loss of lives and minimize damage. At present, artificial neural networks (ANN) have been successfully applied in river flow and water level forecasting studies. ANN requires historical data to develop a forecasting model. However, long-term historical water level data, such as hourly data, poses two crucial problems in data training. First is that the high volume of data slows the computation process. Second is that data training reaches its optimal performance within a few cycles of data training, due to there being a high volume of normal water level data in the data training, while the forecasting performance for high water level events is still poor. In this study, the zoning matching approach (ZMA) is used in ANN to accurately monitor flood events in real time by focusing the development of the forecasting model on high water level zones. ZMA is a trial and error approach, where several training datasets using high water level data are tested to find the best training dataset for forecasting high water level events. The advantage of ZMA is that relevant knowledge of water level patterns in historical records is used. Importantly, the forecasting model developed based on ZMA successfully achieves high accuracy forecasting results at 1 to 3 h ahead and satisfactory performance results at 6 h. Seven performance measures are adopted in this study to describe the accuracy and reliability of the forecasting model developed.