NASA Technical Reports Server (NTRS)
Zeigler, Bernard P.
1989-01-01
It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied to advanced robotics and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.
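The control scheme in the abstract above can be illustrated with a tiny DEVS-style atomic model. This is a hedged sketch, not the paper's formalism: the class name, phase labels, and window values are invented for the example; only the idea of a confirming sensor response expected within a definite time window comes from the abstract.

```python
class EventBasedController:
    """Issues a command, then expects a confirming sensor response
    within a definite time window [t_min, t_max] (DEVS-style atomic model)."""

    def __init__(self, t_min, t_max):
        self.t_min, self.t_max = t_min, t_max
        self.phase = "idle"
        self.elapsed = 0.0

    def issue_command(self):
        self.phase = "awaiting_confirmation"
        self.elapsed = 0.0

    def external_event(self, t, event):
        """External transition: a sensor response arrives at elapsed time t."""
        self.elapsed = t
        if self.phase == "awaiting_confirmation" and event == "sensor_ok":
            if self.t_min <= t <= self.t_max:
                self.phase = "confirmed"   # response arrived inside the window
            else:
                self.phase = "fault"       # too early or too late: raise a fault
        return self.phase

    def time_advance(self):
        """Time-advance function: remaining time before declaring a timeout."""
        if self.phase == "awaiting_confirmation":
            return self.t_max - self.elapsed
        return float("inf")

ctrl = EventBasedController(t_min=1.0, t_max=5.0)
ctrl.issue_command()
print(ctrl.external_event(3.0, "sensor_ok"))  # arrives inside the window
```

A response outside the window (say at t = 0.5) would drive the model into the "fault" phase instead, which is how event-based control detects a misbehaving plant.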
Adaptable, high recall, event extraction system with minimal configuration.
Miwa, Makoto; Ananiadou, Sophia
2015-01-01
Biomedical event extraction has been a major focus of biomedical natural language processing (BioNLP) research since the first BioNLP shared task was held in 2009. Accordingly, a large number of event extraction systems have been developed. Most such systems, however, have been developed for specific tasks and/or incorporated task specific settings, making their application to new corpora and tasks problematic without modification of the systems themselves. There is thus a need for event extraction systems that can achieve high levels of accuracy when applied to corpora in new domains, without the need for exhaustive tuning or modification, whilst retaining competitive levels of performance. We have enhanced our state-of-the-art event extraction system, EventMine, to alleviate the need for task-specific tuning. Task-specific details are specified in a configuration file, while extensive task-specific parameter tuning is avoided through the integration of a weighting method, a covariate shift method, and their combination. The task-specific configuration and weighting method have been employed within the context of two different sub-tasks of BioNLP shared task 2013, i.e. Cancer Genetics (CG) and Pathway Curation (PC), removing the need to modify the system specifically for each task. With minimal task specific configuration and tuning, EventMine achieved the 1st place in the PC task, and 2nd in the CG, achieving the highest recall for both tasks. The system has been further enhanced following the shared task by incorporating the covariate shift method and entity generalisations based on the task definitions, leading to further performance improvements. We have shown that it is possible to apply a state-of-the-art event extraction system to new tasks with high levels of performance, without having to modify the system internally. Both covariate shift and weighting methods are useful in facilitating the production of high recall systems. 
These methods and their combination can adapt a model to the target data with no deep tuning and little manual configuration. PMID:26201408
2014-09-18
...and full-scale experimental verifications towards ground-satellite quantum key distribution [garbled journal citation]. Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification. DISSERTATION, Jeffrey D. Morris. Presented to the Faculty, Department of Systems...
Taverniers, Isabel; Windels, Pieter; Vaïtilingom, Marc; Milcamps, Anne; Van Bockstaele, Erik; Van den Eede, Guy; De Loose, Marc
2005-04-20
Since 18 April 2004, two new regulations, EC/1829/2003 on genetically modified food and feed products and EC/1830/2003 on traceability and labeling of GMOs, have been in force in the EU. This new, comprehensive regulatory framework emphasizes the need for an adequate tracing system. Unique identifiers, such as the transgene genome junction region or a specific rearrangement within the transgene DNA, should form the basis of such a tracing system. In this study, we describe the development of event-specific tracing systems for transgenic maize lines Bt11, Bt176, and GA21 and for canola event GT73. Molecular characterization of the transgene loci enabled us to clone an event-specific sequence into a plasmid vector, to be used as a marker, and to develop line-specific primers. Primer specificity was tested through qualitative PCRs and dissociation curve analysis in SYBR Green I real-time PCRs. The primers were then combined with event-specific TaqMan probes in quantitative real-time PCRs. Calibration curves were set up both with genomic DNA samples and the newly synthesized plasmid DNA markers. It is shown that cloned plasmid GMO target sequences are perfectly suitable as unique identifiers and quantitative calibrators. Together with an event-specific primer pair and a highly specific TaqMan probe, the plasmid markers form crucial components of a unique and straightforward tracing system for Bt11, Bt176, and GA21 maize and GT73 canola events.
Narration and Vividness as Measures of Event-Specificity in Autobiographical Memory
ERIC Educational Resources Information Center
Nelson, Kristin L.; Moskovitz, Damian J.; Steiner, Hans
2008-01-01
The event specificity of autobiographical memories refers to the degree to which retold memories include specific details about a unique personal experience from a variety of representational systems supported by different brain areas. This article proposes 2 text measures as indicators of event specificity: (a) a measure of temporal sequence in…
48 CFR 2110.7003 - Significant events.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Significant events. 2110.7003 Section 2110.7003 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT, FEDERAL..., AND OTHER PURCHASE DESCRIPTIONS Contract Specifications 2110.7003 Significant events. The contractor...
The digital trigger system for the RED-100 detector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naumov, P. P., E-mail: ddr727@yandex.ru; Akimov, D. Yu.; Belov, V. A.
The system for forming a trigger for the liquid xenon detector RED-100 is developed. The trigger can be generated for all types of events that the detector needs for calibration and data acquisition, including events with a single electron of ionization. In the system, a mechanism of event detection is implemented according to which a timestamp and event type are assigned to each event. In systems searching for rare events, the trigger system is required to select and keep only the necessary information from the ADC array. The specifications and implementation of the trigger unit, which provides a high efficiency of response even to low-energy events, are considered.
From Goal-Oriented Requirements to Event-B Specifications
NASA Technical Reports Server (NTRS)
Aziz, Benjamin; Arenas, Alvaro E.; Bicarregui, Juan; Ponsard, Christophe; Massonet, Philippe
2009-01-01
In goal-oriented requirements engineering methodologies, goals are structured into refinement trees from high-level system-wide goals down to fine-grained requirements assigned to specific software/hardware/human agents that can realise them. Functional goals assigned to software agents need to be operationalised into specifications of the services that the agent should provide to realise those requirements. In this paper, we propose an approach for operationalising requirements into specifications expressed in the Event-B formalism. Our approach has the benefit of aiding software designers by bridging the gap between declarative requirements and operational system specifications in a rigorous manner, enabling powerful correctness proofs and allowing further refinements down to the implementation level. Our solution is based on verifying that a consistent Event-B machine exhibits properties corresponding to requirements.
48 CFR 2110.7003 - Significant events.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Significant events. 2110..., AND OTHER PURCHASE DESCRIPTIONS Contract Specifications 2110.7003 Significant events. The contractor is required to inform the contracting officer of all significant events. ...
48 CFR 2110.7003 - Significant events.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 6 2013-10-01 2013-10-01 false Significant events. 2110..., AND OTHER PURCHASE DESCRIPTIONS Contract Specifications 2110.7003 Significant events. The contractor is required to inform the contracting officer of all significant events. ...
48 CFR 2110.7003 - Significant events.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 6 2012-10-01 2012-10-01 false Significant events. 2110..., AND OTHER PURCHASE DESCRIPTIONS Contract Specifications 2110.7003 Significant events. The contractor is required to inform the contracting officer of all significant events. ...
48 CFR 2110.7003 - Significant events.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 6 2014-10-01 2014-10-01 false Significant events. 2110..., AND OTHER PURCHASE DESCRIPTIONS Contract Specifications 2110.7003 Significant events. The contractor is required to inform the contracting officer of all significant events. ...
An extension of the OpenModelica compiler for using Modelica models in a discrete event simulation
Nutaro, James
2014-11-03
In this article, a new back-end and run-time system is described for the OpenModelica compiler. This new back-end transforms a Modelica model into a module for the adevs discrete event simulation package, thereby extending adevs to encompass complex, hybrid dynamical systems. The new run-time system built within the adevs simulation package supports models with state-events and time-events, including differential-algebraic systems with high index. Finally, although the procedure for effecting this transformation is based on adevs and the Discrete Event System Specification, it can be adapted to any discrete event simulation package.
IDC Re-Engineering Phase 2 System Specification Document Version 1.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Satpathi, Meara Allena; Burns, John F.; Harris, James M.
This document contains the system specifications derived to satisfy the system requirements found in the IDC System Requirements Document for the IDC Re-Engineering Phase 2 project. This System Specification Document (SSD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data but does include requirements for the dissemination of radionuclide data and products.
Design a Learning-Oriented Fall Event Reporting System Based on Kirkpatrick Model.
Zhou, Sicheng; Kang, Hong; Gong, Yang
2017-01-01
Patient falls have been a severe problem in healthcare facilities around the world due to their prevalence and cost. Routine fall prevention training programs are not as effective as expected. Using event reporting systems is the trend for reducing patient safety events such as falls, although the systems have some limitations at the current stage. We summarized these limitations through a literature review and developed an improved web-based fall event reporting system. The Kirkpatrick model, widely used in the business area for training program evaluation, has been integrated into the design of our system. Unlike traditional event reporting systems that only collect and store reports, our system automatically annotates and analyzes the reported events and provides users with timely knowledge support specific to the reported event. The paper illustrates the design of our system and how its features are intended to reduce patient falls by learning from previous errors.
Beam Energy Scan of Specific Heat Through Temperature Fluctuations in Heavy Ion Collisions
NASA Astrophysics Data System (ADS)
Basu, Sumit; Nandi, Basanta K.; Chatterjee, Sandeep; Chatterjee, Rupa; Nayak, Tapan
2016-01-01
Temperature fluctuations may have two distinct origins: first, quantum fluctuations, which are initial-state fluctuations, and second, thermodynamic fluctuations. We discuss a method of extracting the thermodynamic temperature from the mean transverse momentum of pions, using controllable parameters such as the centrality of the system and the range of transverse momenta. Event-by-event fluctuations in global temperature over a large phase space provide the specific heat of the system. We present a beam energy scan of the specific heat from data and from AMPT and HRG model predictions. Experimental results from NA49, STAR, PHENIX, PHOBOS and ALICE are combined to obtain the specific heat as a function of beam energy. These results are compared to calculations from the AMPT event generator, the HRG model and lattice calculations.
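The extraction step in the abstract above can be sketched numerically. This assumes the standard fluctuation relation 1/C = Var(T)/⟨T⟩² for the event-by-event temperature; the per-event temperatures below are synthetic and illustrative, not data from any experiment cited here.

```python
# Hedged sketch: effective specific heat from event-by-event temperature
# fluctuations via 1/C = Var(T)/<T>^2. Synthetic event sample, not real data.
import random
import statistics

random.seed(0)
# Per-event "global temperature" in GeV: illustrative mean and spread.
event_temperatures = [random.gauss(0.160, 0.004) for _ in range(10000)]

mean_T = statistics.fmean(event_temperatures)
var_T = statistics.pvariance(event_temperatures)
specific_heat = mean_T ** 2 / var_T   # C = <T>^2 / Var(T)
print(f"<T> = {mean_T:.4f} GeV, C ~ {specific_heat:.0f}")
```

Narrower event-by-event temperature distributions (smaller Var(T)) yield a larger specific heat, which is the qualitative behavior the beam-energy scan probes.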
Whole-Body MR Imaging Including Angiography: Predicting Recurrent Events in Diabetics.
Bertheau, Robert C; Bamberg, Fabian; Lochner, Elena; Findeisen, Hannes M; Parhofer, Klaus G; Kauczor, Hans-Ulrich; Schoenberg, Stefan O; Weckbach, Sabine; Schlett, Christopher L
2016-05-01
To determine whether whole-body MRI can predict the occurrence of recurrent events in patients with diabetes mellitus. Whole-body MRI was prospectively applied to 61 diabetics and assessed for arteriosclerosis and ischemic cerebral/myocardial changes. Occurrence of cardiocerebral events and diabetic comorbidities was determined. Patients were stratified according to whether no, a single, or recurrent events arose. As a secondary endpoint, events were stratified into organ system-specific groups. During a median follow-up of 70 months, 26 diabetics developed a total of 39 events; 18 (30%) developed one, 8 (13%) recurrent events. Between diabetics with no, a single and recurrent events, a stepwise higher burden was observed for presence of left ventricular (LV) hypo-/akinesia (3/28/75%, p < 0.0001), myocardial delayed-contrast-enhancement (17/33/63%, p = 0.001), carotid artery stenosis (11/17/63%, p = 0.005), peripheral artery stenosis (26/56/88%, p = 0.0006) and vessel score (1.00/1.30/1.76, p < 0.0001). After adjusting for clinical characteristics, LV hypo-/akinesia (hazard rate ratio = 6.57, p < 0.0001) and vessel score (hazard rate ratio = 12.29, p < 0.0001) remained independently associated. Assessing organ system risk, cardiac and cerebral MR findings predicted events more strongly in their respective organ systems. Vessel score predicted both cardiac and cerebral, but not non-cardiocerebral, events. Whole-body MR findings predict occurrence of recurrent events in diabetics independent of clinical characteristics, and may concurrently provide organ system-specific risk. • Patients with long-standing diabetes mellitus are at high risk for recurrent events. • Whole-body MRI predicts occurrence of recurrent events independently of clinical characteristics. • The vessel score derived from whole-body angiography is a good general risk-marker. • Whole-body MRI may also provide organ-specific risk assessment. • Current findings may indicate benefits of whole-body MRI for risk stratification.
Single Event Testing on Complex Devices: Test Like You Fly versus Test-Specific Design Structures
NASA Technical Reports Server (NTRS)
Berg, Melanie; LaBel, Kenneth A.
2014-01-01
We present a framework for evaluating complex digital systems targeted for harsh radiation environments such as space. Focus is limited to analyzing the single event upset (SEU) susceptibility of designs implemented inside Field Programmable Gate Array (FPGA) devices. Tradeoffs are provided between application-specific versus test-specific test structures.
Screening DNA chip and event-specific multiplex PCR detection methods for biotech crops.
Lee, Seong-Hun
2014-11-01
There are about 80 biotech crop events that have been approved by safety assessment in Korea. They have been controlled by genetically modified organism (GMO) and living modified organism (LMO) labeling systems. The DNA-based detection method has been used as an efficient scientific management tool. Recently, multiplex polymerase chain reaction (PCR) and DNA chip methods have been developed for the simultaneous detection of several biotech crop events. The event-specific multiplex PCR method was developed to detect five biotech maize events: MIR604, Event 3272, LY 038, MON 88017 and DAS-59122-7. The specificity was confirmed and the sensitivity was 0.5%. The screening DNA chip was developed from four endogenous genes of soybean, maize, cotton and canola, respectively, along with two regulatory elements and seven genes: P35S, tNOS, pat, bar, epsps1, epsps2, pmi, cry1Ac and cry3B. The specificity was confirmed and the sensitivity was 0.5% for 12 events across the four crops: one soybean, six maize, three cotton and two canola events. The multiplex PCR and DNA chip can be used for screening, gene-specific and event-specific analysis of biotech crops as efficient detection methods, saving workload and time. © 2014 Society of Chemical Industry.
Xu, Xiaodan; Li, Yingcong; Zhao, Heng; Wen, Si-yuan; Wang, Sheng-qi; Huang, Jian; Huang, Kun-lun; Luo, Yun-bo
2005-05-18
To devise a rapid and reliable method for the detection and identification of genetically modified (GM) events, we developed a multiplex polymerase chain reaction (PCR) coupled with a DNA microarray system simultaneously aiming at many targets in a single reaction. The system included probes for screening gene, species reference gene, specific gene, construct-specific gene, event-specific gene, and internal and negative control genes. 18S rRNA was combined with species reference genes as internal controls to assess the efficiency of all reactions and to eliminate false negatives. Two sets of the multiplex PCR system were used to amplify four and five targets, respectively. Eight different structure genes could be detected and identified simultaneously for Roundup Ready soybean in a single microarray. The microarray specificity was validated by its ability to discriminate two GM maizes Bt176 and Bt11. The advantages of this method are its high specificity and greatly reduced false-positives and -negatives. The multiplex PCR coupled with microarray technology presented here is a rapid and reliable tool for the simultaneous detection of GM organism ingredients.
Simplex and duplex event-specific analytical methods for functional biotech maize.
Lee, Seong-Hun; Kim, Su-Jeong; Yi, Bu-Young
2009-08-26
Analytical methods are very important in the control of genetically modified organism (GMO) labeling systems or living modified organism (LMO) management for biotech crops. Event-specific primers and probes were developed for qualitative and quantitative analysis for biotech maize event 3272 and LY 038 on the basis of the 3' flanking regions, respectively. The qualitative primers confirmed the specificity by a single PCR product and sensitivity to 0.05% as a limit of detection (LOD). Simplex and duplex quantitative methods were also developed using TaqMan real-time PCR. One synthetic plasmid was constructed from two taxon-specific DNA sequences of maize and two event-specific 3' flanking DNA sequences of event 3272 and LY 038 as reference molecules. In-house validation of the quantitative methods was performed using six levels of mixing samples, from 0.1 to 10.0%. As a result, the biases from the true value and the relative deviations were all within the range of +/-30%. Limits of quantitation (LOQs) of the quantitative methods were all 0.1% for simplex real-time PCRs of event 3272 and LY 038 and 0.5% for duplex real-time PCR of LY 038. This study reports that event-specific analytical methods were applicable for qualitative and quantitative analysis for biotech maize event 3272 and LY 038.
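The quantification step described in the abstract above can be sketched as a standard-curve calculation. This is a hedged illustration only: the slope and intercept below are hypothetical round values for a near-100%-efficiency TaqMan assay, not parameters from the paper, and the ratio-of-copies definition of GM content is the generic one used in event-specific qPCR.

```python
# Hedged sketch of event-specific qPCR quantification against a plasmid
# calibrator: GM % = event-specific copies / taxon-specific copies * 100,
# with copies read off a log-linear standard curve. Curve parameters are
# hypothetical, not taken from the paper.
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Invert the standard curve Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

def gm_percent(ct_event, ct_taxon):
    """GM content as the ratio of event-specific to taxon-specific copies."""
    return 100.0 * copies_from_ct(ct_event) / copies_from_ct(ct_taxon)

# An event-specific target crossing threshold ~10 cycles after the
# taxon-specific target corresponds to roughly 0.1% GM content.
print(f"{gm_percent(ct_event=34.0, ct_taxon=24.0):.2f}%")
```

Running both targets against the same plasmid calibrator is what lets a single reference molecule serve the in-house validation range (0.1–10.0%) reported in the abstract.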
Local instability driving extreme events in a pair of coupled chaotic electronic circuits
NASA Astrophysics Data System (ADS)
de Oliveira, Gilson F.; Di Lorenzo, Orlando; de Silans, Thierry Passerat; Chevrollier, Martine; Oriá, Marcos; Cavalcante, Hugo L. D. de Souza
2016-06-01
For a long time, extreme events happening in complex systems, such as financial markets, earthquakes, and neurological networks, were thought to follow power-law size distributions. More recently, evidence suggests that in many systems the largest and rarest events differ from the other ones. They are dragon kings, outliers that make the distribution deviate from a power law in the tail. Understanding how extreme events form, and what circumstances lead to dragon kings or to a power-law distribution, is an open question, and an important one for assessing whether extreme events will occur too often in a specific system. In the particular system studied in this paper, we show that the rate of occurrence of dragon kings is controlled by the value of a parameter. The system under study here is composed of two nearly identical chaotic oscillators which fail to remain in a permanently synchronized state when coupled. We analyze the statistics of the desynchronization events in this specific example of two coupled chaotic electronic circuits and find that modifying a parameter associated with the local instability responsible for the loss of synchronization reduces the occurrence of dragon kings, while preserving the power-law distribution of small- to intermediate-size events with the same scaling exponent. Our results support the hypothesis that dragon kings are caused by local instabilities in the phase space.
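The dragon-king notion above can be made concrete with a small statistical sketch: fit a power-law tail exponent to event sizes and flag sizes far above what a pure power law of that exponent would predict. The data are synthetic and the 10x cutoff is purely illustrative, not a method from the paper.

```python
# Hedged sketch: Hill/MLE tail-exponent fit plus a crude dragon-king flag.
# Synthetic Pareto bulk with two artificial outliers; thresholds illustrative.
import math
import random

random.seed(1)
x_min = 1.0
# Pareto bulk (survival exponent 2.5) plus two artificial "dragon kings".
sizes = [x_min * random.paretovariate(2.5) for _ in range(5000)] + [400.0, 650.0]

n = len(sizes)
alpha_hat = n / sum(math.log(s / x_min) for s in sizes)  # MLE tail exponent

# Under a pure power law the typical maximum scales like x_min * n^(1/alpha);
# the 10x margin is a crude, purely illustrative dragon-king cutoff.
expected_max = x_min * n ** (1.0 / alpha_hat)
dragon_kings = [s for s in sizes if s > 10.0 * expected_max]
print(f"tail exponent ~ {alpha_hat:.2f}, {len(dragon_kings)} dragon-king candidate(s)")
```

The point of the sketch matches the abstract's framing: the bulk keeps its scaling exponent while the outliers sit so far beyond the power-law expectation that they form a statistically distinct population.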
Dai, Li; Huang, Ying; Wang, Ying; Han, Huan-Li; Li, Qu-Bei; Jiang, Yong-Hui
2014-01-01
To retrospectively assess serious systemic adverse effects of a standardized dust-mite vaccine in children with asthma. Medical records of 704 children (5-17 years of age) with asthma between January 2005 and December 2011 were reviewed. Serious systemic adverse events following treatment with a standardized dust-mite vaccine in these children were analyzed. A total of 336 systemic adverse reactions were observed in 17.0% (120/704) of the patients analyzed. Of these adverse reactions, 18 (5.4%) were serious (level 3), 318 (94.6%) were not serious (below level 3), and no single case of anaphylactic shock (level 4) was recorded. Systemic adverse events occurred most frequently in the 5 to 11-year age group and in the summer season (from June to August). In the 18 severe cases, the peak expiratory flow (PEF) dropped by 20% immediately after the vaccine injection, and other major clinical symptoms included cough, wheezing and urticaria. All children with serious systemic adverse effects were given inhaled corticosteroids and nebulized short-acting beta agonists, oral antihistamines, intravenous dexamethasone and/or intramuscular adrenaline. After these treatments, the clinical symptoms were significantly relieved. The rate of serious systemic adverse events following allergen-specific immunotherapy is relatively low in children with allergic asthma. Conventional medications are effective in managing these immunotherapy-associated adverse events.
A Risk Assessment System with Automatic Extraction of Event Types
NASA Astrophysics Data System (ADS)
Capet, Philippe; Delavallade, Thomas; Nakamura, Takuya; Sandor, Agnes; Tarsitano, Cedric; Voyatzi, Stavroula
In this article we describe the joint effort of experts in linguistics, information extraction and risk assessment to integrate EventSpotter, an automatic event extraction engine, into ADAC, an automated early warning system. By detecting as early as possible weak signals of emerging risks ADAC provides a dynamic synthetic picture of situations involving risk. The ADAC system calculates risk on the basis of fuzzy logic rules operated on a template graph whose leaves are event types. EventSpotter is based on a general purpose natural language dependency parser, XIP, enhanced with domain-specific lexical resources (Lexicon-Grammar). Its role is to automatically feed the leaves with input data.
Event-specific real-time detection and quantification of genetically modified Roundup Ready soybean.
Huang, Chia-Chia; Pan, Tzu-Ming
2005-05-18
The event-specific real-time detection and quantification of Roundup Ready soybean (RRS) using an ABI PRISM 7700 sequence detection system with light upon extension (LUX) primer was developed in this study. The event-specific primers were designed, targeting the junction of the RRS 5' integration site and the endogenous gene lectin1. Then, a standard reference plasmid was constructed that carried both of the targeted sequences for quantitative analysis. The detection limit of the LUX real-time PCR system was 0.05 ng of 100% RRS genomic DNA, which was equal to 20.5 copies. The range of quantification was from 0.1 to 100%. The sensitivity and range of quantification successfully met the requirement of the labeling rules in the European Union and Taiwan.
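The "copies" figure in the abstract above comes from converting a DNA mass to genome equivalents. The back-of-the-envelope sketch below assumes a round soybean haploid genome size of 1.1 Gbp and the usual 650 g/mol average base-pair mass; neither number is taken from the paper. The result (~42 haploid copies in 0.05 ng) is about twice the paper's 20.5 copies, which would be consistent with counting diploid (2C) genome equivalents; the comparison mainly shows how strongly the assumed genome size drives the conversion.

```python
# Hedged sketch: template mass -> genome copy number, with assumed constants.
AVOGADRO = 6.022e23
BP_MASS_G_PER_MOL = 650.0   # average mass of one base pair, g/mol (assumed)
GENOME_BP = 1.1e9           # assumed soybean haploid (1C) genome, base pairs

genome_mass_g = GENOME_BP * BP_MASS_G_PER_MOL / AVOGADRO
copies = 0.05e-9 / genome_mass_g   # copies in 0.05 ng of genomic DNA
print(f"~{copies:.0f} haploid genome copies in 0.05 ng")
```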
Data Albums: An Event Driven Search, Aggregation and Curation Tool for Earth Science
NASA Technical Reports Server (NTRS)
Ramachandran, Rahul; Kulkarni, Ajinkya; Maskey, Manil; Bakare, Rohan; Basyal, Sabin; Li, Xiang; Flynn, Shannon
2014-01-01
One of the largest continuing challenges in any Earth science investigation is the discovery and access of useful science content from the increasingly large volumes of Earth science data and related information available. Approaches used in Earth science research, such as case study analysis and climatology studies, involve discovering and gathering diverse data sets and information to support the research goals. Research based on case studies involves a detailed description of specific weather events using data from different sources, to characterize physical processes in play for a specific event. Climatology-based research tends to focus on the representativeness of a given event, by studying the characteristics and distribution of a large number of events. This allows researchers to generalize characteristics such as spatio-temporal distribution, intensity, annual cycle, duration, etc. Gathering relevant data and information for case studies and climatology analysis is both tedious and time consuming. Current Earth science data systems are designed with the assumption that researchers access data primarily by instrument or geophysical parameter. Those who know exactly the datasets of interest can obtain the specific files they need using these systems. However, in cases where researchers are interested in studying a significant event, they have to manually assemble a variety of datasets relevant to it by searching the different distributed data systems. In these cases, a search process needs to be organized around the event rather than observing instruments. In addition, the existing data systems assume users have sufficient knowledge regarding the domain vocabulary to be able to effectively utilize their catalogs. These systems do not support new or interdisciplinary researchers who may be unfamiliar with the domain terminology.
This paper presents a specialized search, aggregation and curation tool for Earth science to address these existing challenges. The search tool automatically creates curated "Data Albums", aggregated collections of information related to a specific science topic or event, containing links to relevant data files (granules) from different instruments; tools and services for visualization and analysis; and information about the event contained in news reports, images or videos to supplement research analysis. Curation in the tool is driven via an ontology based relevancy ranking algorithm to filter out non-relevant information and data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, Fei; Jiang, Huaiguang; Tan, Jin
This paper proposes an event-driven approach for reconfiguring distribution systems automatically. Specifically, optimal synchrophasor sensor placement (OSSP) is used to reduce the number of synchrophasor sensors while keeping the whole system observable. Then, a wavelet-based event detection and location approach is used to detect and locate the event, which serves as a trigger for network reconfiguration. With the detected information, the system is then reconfigured using the hierarchical decentralized approach to seek the new optimal topology. In this manner, whenever an event happens, the distribution network can be reconfigured automatically based on real-time information that is observable and detectable.
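The wavelet-based detection idea above can be illustrated with the simplest possible transform: a one-level Haar decomposition, in which a large detail coefficient marks an abrupt change and its position. The signal, the step location, and the use of plain Haar details are all illustrative assumptions, not the paper's algorithm.

```python
# Hedged sketch of wavelet-style event detection: one-level Haar detail
# coefficients of a measurement stream; the largest detail marks the event.
def haar_details(signal):
    """One-level Haar detail coefficients of an even-length sequence."""
    return [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]

# Flat feeder measurement with a step change (e.g. a switching event) at sample 11.
signal = [1.0] * 11 + [0.6] * 9
details = haar_details(signal)
event_idx = max(range(len(details)), key=lambda i: abs(details[i]))
print(f"event near sample {2 * event_idx}")  # → event near sample 10
```

In the paper's setting the detected event time and location would then trigger the hierarchical decentralized reconfiguration step rather than a print statement.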
Data Discovery and Access via the Heliophysics Events Knowledgebase (HEK)
NASA Astrophysics Data System (ADS)
Somani, A.; Hurlburt, N. E.; Schrijver, C. J.; Cheung, M.; Freeland, S.; Slater, G. L.; Seguin, R.; Timmons, R.; Green, S.; Chang, L.; Kobashi, A.; Jaffey, A.
2011-12-01
The HEK is an integrated system which helps direct scientists to solar events and data from a variety of providers. The system is fully operational, and adoption of HEK has been growing since the launch of NASA's SDO mission. In this presentation we describe the different components that comprise HEK. The Heliophysics Events Registry (HER) and Heliophysics Coverage Registry (HCR) form the two major databases behind the system. The HCR allows the user to search on coverage event metadata for a variety of instruments. The HER allows the user to search on annotated event metadata for a variety of instruments. Both the HCR and HER are accessible via a web API which can return search results in machine-readable formats (e.g. XML and JSON). A variety of SolarSoft services are also provided to allow users to search the HEK as well as obtain and manipulate data. Other components include: the Event Detection System (EDS), which continually runs feature-finding algorithms on SDO data to populate the HER with relevant events; a web form for users to request SDO data cutouts for multiple AIA channels as well as HMI line-of-sight magnetograms; iSolSearch, which allows a user to browse events in the HER and search for specific events over a specific time interval, all within a graphical web page; Panorama, the software tool used for rapid visualization of large volumes of solar image data in multiple channels/wavelengths, from which the user can also easily create WYSIWYG movies and launch the Annotator tool to describe events and features; and EVACS, which provides a JOGL-powered client for the HER and HCR, displaying the searched-for events on a full-disk magnetogram of the sun while showing more detailed information for each event.
From IHE Audit Trails to XES Event Logs Facilitating Process Mining.
Paster, Ferdinand; Helm, Emmanuel
2015-01-01
Recently, Business Intelligence approaches such as process mining have been applied to the healthcare domain. The goal of process mining is to gain knowledge about processes, their compliance, and room for improvement by investigating recorded event data. Previous approaches focused on process discovery using event data from various specific systems. IHE, a globally recognized basis for healthcare information systems, defines in its ATNA profile how real-world events must be recorded in centralized event logs. The following approach presents how audit trails collected by means of ATNA can be transformed to enable process mining. Using the standardized audit trails makes these methods applicable to all IHE-based information systems.
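The transformation described above, from ATNA-style audit records to an XES event log, can be sketched roughly as follows. The record fields, the choice of patient ID as the trace identifier, and the minimal XES attributes are illustrative assumptions; real ATNA messages are DICOM/RFC-3881 XML, and full XES carries more metadata.

```python
import xml.etree.ElementTree as ET

# Minimal sketch: group audit records (plain dicts here) into an XES log,
# one trace per case, with standard concept:name / time:timestamp keys.
def audit_to_xes(records):
    log = ET.Element("log", {"xes.version": "1.0"})
    traces = {}
    for rec in sorted(records, key=lambda r: r["timestamp"]):
        case = rec["patient_id"]          # trace identifier (assumed field)
        if case not in traces:
            traces[case] = ET.SubElement(log, "trace")
            ET.SubElement(traces[case], "string",
                          {"key": "concept:name", "value": case})
        ev = ET.SubElement(traces[case], "event")
        ET.SubElement(ev, "string", {"key": "concept:name",
                                     "value": rec["event_type"]})
        ET.SubElement(ev, "date", {"key": "time:timestamp",
                                   "value": rec["timestamp"]})
    return ET.tostring(log, encoding="unicode")

records = [
    {"patient_id": "P1", "event_type": "Query", "timestamp": "2015-01-01T10:00:00"},
    {"patient_id": "P1", "event_type": "Retrieve", "timestamp": "2015-01-01T10:05:00"},
]
xes = audit_to_xes(records)
print("Retrieve" in xes)  # True
```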
Time-varying causal network of the Korean financial system based on firm-specific risk premiums
NASA Astrophysics Data System (ADS)
Song, Jae Wook; Ko, Bonggyun; Cho, Poongjin; Chang, Woojin
2016-09-01
The aim of this paper is to investigate the Korean financial system based on a time-varying causal network. We discover many stylized facts by utilizing firm-specific risk premiums for measuring the direction of causality from firm to firm. First, we discover that the interconnectedness of the causal network is affected by the outbreak of financial events; the co-movement of firm-specific risk premiums is strengthened after each positive event, and vice versa. Second, we find that the major sector of the Korean financial system is the Depositories, and that the financial reform of June 2011 achieved its purpose by weakening the power of risk-spillovers of Broker-Dealers. Third, we identify the causal network as a small-world network with scale-free topology, where the power-law exponents of out-degree and negative events are more significant than those of in-degree and positive events. Lastly, we discuss how the current aspects of the causal network relate to the long-term future scenario of the KOSPI Composite index, whose direction and stability are significantly affected by the power of risk-spillovers and the power-law exponents of the degree distributions, respectively.
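The basic pipeline described above, inferring a directed network from pairwise lead-lag relations and then examining its degree structure, can be sketched with toy data. The paper's causality measure is built on firm-specific risk premiums; the lag-1 cross-correlation threshold used here is only a stand-in for illustration.

```python
import random, math

# Infer a directed "causal" edge i -> j when the lag-1 cross-correlation
# of series i with series j is strong. Threshold and lag are illustrative.
def lag1_corr(x, y):
    """Correlation of x[t] with y[t+1]."""
    xs, ys = x[:-1], y[1:]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = math.sqrt(sum((a - mx) ** 2 for a in xs))
    vy = math.sqrt(sum((b - my) ** 2 for b in ys))
    return cov / (vx * vy)

def causal_network(series, thresh=0.5):
    n = len(series)
    edges = {(i, j) for i in range(n) for j in range(n)
             if i != j and lag1_corr(series[i], series[j]) > thresh}
    out_deg = [sum(1 for (a, _) in edges if a == i) for i in range(n)]
    density = len(edges) / (n * (n - 1))      # interconnectedness measure
    return edges, out_deg, density

random.seed(0)
leader = [random.gauss(0, 1) for _ in range(200)]
follower = [0.0] + leader[:-1]                # copies leader with lag 1
noise = [random.gauss(0, 1) for _ in range(200)]
edges, out_deg, density = causal_network([leader, follower, noise])
print((0, 1) in edges)  # True: the leader "causes" the follower
```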
Challenges and Demands on Automated Software Revision
NASA Technical Reports Server (NTRS)
Bonakdarpour, Borzoo; Kulkarni, Sandeep S.
2008-01-01
In the past three decades, automated program verification has undoubtedly been one of the most successful contributions of formal methods to software development. However, when verification of a program against a logical specification discovers bugs, manual manipulation of the program is needed to repair it. Thus, given the existence of numerous unverified and uncertified legacy programs in virtually any organization, tools that enable engineers to automatically verify and subsequently fix existing programs are highly desirable. In addition, since the requirements of software systems often evolve during the software life cycle, incomplete specification has become a customary fact in many design and development teams. Thus, automated techniques that revise existing programs according to new specifications are of great assistance to designers, developers, and maintenance engineers. As a result, incorporating program synthesis techniques, where an algorithm generates a program that is correct-by-construction, seems to be a necessity. The notion of manual program repair described above becomes even more complex when programs are integrated with large collections of sensors and actuators in hostile physical environments, in so-called cyber-physical systems. When such systems are safety- or mission-critical (e.g., in avionics), it is essential that the system react to physical events such as faults, delays, signals, and attacks so that the system specification is not violated. In fact, since it is impossible to anticipate all such physical events at design time, it is highly desirable to have automated techniques that revise programs with respect to newly identified physical events according to the system specification.
Schiller, Q.; Tu, W.; Ali, A. F.; ...
2017-03-11
The most significant unknown regarding relativistic electrons in Earth's outer Van Allen radiation belt is the relative contribution of loss, transport, and acceleration processes within the inner magnetosphere. Disentangling each individual process is critical to improving the understanding of radiation belt dynamics, but determining a single component is challenging due to sparse measurements in diverse spatial and temporal regimes. However, there are currently an unprecedented number of spacecraft taking measurements that sample different regions of the inner magnetosphere. With the increasing number of varied observational platforms, system dynamics can begin to be unraveled. In this work, we employ in-situ measurements during the 13-14 January 2013 enhancement event to isolate transport, loss, and source dynamics in a one-dimensional radial diffusion model. We then validate the results by comparing them to Van Allen Probes and THEMIS observations, indicating that the three terms have been accurately and individually quantified for the event. Finally, a direct comparison is performed between the model containing event-specific terms and various models containing terms parameterized by geomagnetic index. Models using a simple 3/Kp loss timescale deviate from the event-specific model by nearly two orders of magnitude within 72 hours of the enhancement event. However, models using alternative loss timescales closely resemble the event-specific model.
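A one-dimensional radial diffusion model with a 3/Kp loss timescale, as referenced above, can be sketched with an explicit finite-difference scheme. The Kp-parameterized diffusion coefficient below follows a commonly used Brautigam-Albert-style form and is illustrative only; the paper's event-specific transport, loss, and source terms are derived from data.

```python
# df/dt = L^2 d/dL( D_LL / L^2 * df/dL ) - f/tau, with tau = 3/Kp days.
def step(f, L, dt, kp):
    """One explicit Euler step; dt in days, Dirichlet boundaries held fixed."""
    tau = 3.0 / kp                                    # loss timescale, days
    dL = L[1] - L[0]
    # Illustrative Kp-driven magnetic radial diffusion coefficient (1/day)
    d_ll = [10 ** (0.506 * kp - 9.325) * l ** 10 for l in L]
    new = f[:]
    for i in range(1, len(f) - 1):
        dp = 0.5 * (d_ll[i] / L[i] ** 2 + d_ll[i + 1] / L[i + 1] ** 2)
        dm = 0.5 * (d_ll[i] / L[i] ** 2 + d_ll[i - 1] / L[i - 1] ** 2)
        diff = L[i] ** 2 * (dp * (f[i + 1] - f[i]) - dm * (f[i] - f[i - 1])) / dL ** 2
        new[i] = f[i] + dt * (diff - f[i] / tau)
    return new

L_grid = [3.0 + 0.1 * i for i in range(31)]           # L-shells 3..6
f = [1.0] * len(L_grid)
for _ in range(100):                                   # 0.1 days of evolution
    f = step(f, L_grid, dt=0.001, kp=3.0)
print(f[15] < 1.0)  # True: the loss term decays the interior phase-space density
```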
NASA Technical Reports Server (NTRS)
Berg, Melanie D.; LaBel, Kenneth A.
2018-01-01
The following are updated or new subjects added to the FPGA SEE Test Guidelines manual: academic versus mission-specific device evaluation, single-event latch-up (SEL) test and analysis, SEE response visibility enhancement during radiation testing, mitigation evaluation (embedded and user-implemented), unreliable design and its effects on SEE data, testing flushable versus non-flushable architectures, intellectual property core (IP core) test and evaluation (addressing embedded and user-inserted cores), heavy-ion energy and linear energy transfer (LET) selection, proton versus heavy-ion testing, fault injection, mean fluence to failure analysis, and mission-specific system-level single-event upset (SEU) response prediction. Most sections within the guidelines manual provide information regarding best practices for test structure and test system development. The scope of this manual addresses academic versus mission-specific device evaluation and visibility enhancement in IP core testing.
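Two of the quantities named above, the SEU cross-section and the mean fluence to failure, are simple ratios of test observables. The sketch below computes both; the numbers are made up for illustration and are not from the guidelines manual.

```python
def seu_cross_section(n_events, fluence_cm2):
    """Per-device cross-section sigma = N / Phi, in cm^2."""
    return n_events / fluence_cm2

def mean_fluence_to_failure(failure_fluences):
    """Average fluence accumulated before each observed failure."""
    return sum(failure_fluences) / len(failure_fluences)

# Illustrative beam-test numbers: 250 upsets in 1e7 ions/cm^2, and three
# functional failures at the listed accumulated fluences.
sigma = seu_cross_section(n_events=250, fluence_cm2=1e7)
mftf = mean_fluence_to_failure([2.1e6, 3.4e6, 2.8e6])
print(sigma)  # 2.5e-05
```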
NASA Astrophysics Data System (ADS)
Podolska, Katerina
2017-04-01
The paper contains a statistical analysis of exceptional solar events and daily numbers of deaths in the Czech Republic from ICD-10 group VI (diseases of the nervous system) and group IX (diseases of the circulatory system), as well as overall daily numbers of deaths. It is demonstrated that neurological diseases exhibit greater instability during periods of rising and falling solar activity. Specifically, we study the daily number of deaths separately for both sexes in the age groups under 39 and 40+ during Solar Cycles No. 23 and No. 24. We focus mainly on exceptional solar events such as the "Bastille Day event" on July 14, 2000 (class X5), the "Halloween solar storm" on October 28, 2003 (class X17), and the events on January 7, 1997, April 2, 2000 (class X20), and September 7, 2005 (class X15). Special attention is given to the "St. Patrick's Day storm" on March 17, 2015, the strongest geomagnetic storm of Solar Cycle No. 24, which occurred following a coronal mass ejection (CME). We investigate changes in daily numbers of deaths during one month before and one month after these exceptional solar events. We take the specific storm dynamics of geophysical parameters into consideration, and we also apply results on the risk characteristics of exposure to ionospheric and geomagnetic parameters. It is verified that, for diseases of the nervous system, women are generally more sensitive than men. In contrast, these differences between men and women are not found for diseases of the circulatory system. Our findings suggest that the impact of hazardous space weather conditions on human health depends on the specific course and strength of each individual solar storm.
NASA Technical Reports Server (NTRS)
Shvartzvald, Y.; Li, Z.; Udalski, A.; Gould, A.; Sumi, T.; Street, R. A.; Calchi Novati, S.; Hundertmark, M.; Bozza, V.; Beichman, C.;
2016-01-01
Simultaneous observations of microlensing events from multiple locations allow the breaking of degeneracies between the physical properties of the lensing system, specifically by exploring different regions of the lens plane and by directly measuring the "microlens parallax". We report the discovery of a 30-65 M_J brown dwarf orbiting a K dwarf in the microlensing event OGLE-2015-BLG-1319. The system is located at a distance of approximately 5 kpc toward the Galactic Bulge. The event was observed by several ground-based groups as well as by Spitzer and Swift, allowing a measurement of the physical properties. However, the event is still subject to an eight-fold degeneracy, in particular the well-known close-wide degeneracy, and thus the projected separation between the two lens components is either approximately 0.25 au or approximately 45 au. This is the first microlensing event observed by Swift, with the UVOT camera. We study the region of microlensing parameter space to which Swift is sensitive, finding that although Swift could not measure the microlens parallax with respect to ground-based observations for this event, it can be important for other events. Specifically, it is important for detecting nearby brown dwarfs and free-floating planets in high-magnification events.
NASA Astrophysics Data System (ADS)
Balbus, J. M.; Kirsch, T.; Mitrani-Reiser, J.
2017-12-01
Over recent decades, natural disasters and mass-casualty events in the United States have repeatedly revealed the serious consequences of health care facility vulnerability for the subsequent ability to deliver care to affected people. Advances in predictive modeling and vulnerability assessment for health care facility failure, integrated infrastructure, and extreme weather events have now enabled a more rigorous scientific approach to evaluating health care system vulnerability and assessing the impacts of natural and human disasters, as well as the value of specific interventions. Concurrent advances in computing capacity also allow, for the first time, full integration of these multiple individual models, along with the modeling of population behaviors and mass-casualty responses during a disaster. A team of federal and academic investigators led by the National Center for Disaster Medicine and Public Health (NCDMPH) is developing a platform for integrating extreme event forecasts, health risk/impact assessment and population simulations, critical infrastructure (electrical, water, transportation, communication) impact and response models, health care facility-specific vulnerability and failure assessments, and health system/patient flow responses. The integration of these models is intended to develop a much greater understanding of critical tipping points in the vulnerability of health systems during natural and human disasters and to build an evidence base for specific interventions. Development of such a modeling platform will greatly facilitate the assessment of potential concurrent or sequential catastrophic events, such as a terrorist act following a severe heat wave or hurricane. This presentation will highlight the development of this modeling platform as well as applications not just for the US health system, but also for international science-based disaster risk reduction efforts, such as the Sendai Framework and the WHO SMART hospital project.
BCIs in the Laboratory and at Home: The Wadsworth Research Program
NASA Astrophysics Data System (ADS)
Sellers, Eric W.; McFarland, Dennis J.; Vaughan, Theresa M.; Wolpaw, Jonathan R.
Many people with severe motor disabilities lack the muscle control that would allow them to rely on conventional methods of augmentative communication and control. Numerous studies over the past two decades have indicated that scalp-recorded electroencephalographic (EEG) activity can be the basis for non-muscular communication and control systems, commonly called brain-computer interfaces (BCIs) [55]. EEG-based BCI systems measure specific features of EEG activity and translate these features into device commands. The most commonly used features are rhythms produced by the sensorimotor cortex [38, 55, 56, 59], slow cortical potentials [4, 5, 23], and the P300 event-related potential [12, 17, 46]. Systems based on sensorimotor rhythms or slow cortical potentials use oscillations or transient signals that are spontaneous in the sense that they are not dependent on specific sensory events. Systems based on the P300 response use transient signals in the EEG that are elicited by specific stimuli.
Battles, J B; Kaplan, H S; Van der Schaaf, T W; Shea, C E
1998-03-01
To design, develop, and implement a prototype medical event-reporting system for use in transfusion medicine, with the aim of improving transfusion safety by studying incidents and errors. The IDEALS concept of design was used to identify specifications for the event-reporting system, and a Delphi process with subsequent nominal group technique meetings was used to reach consensus on the development of the system. An interdisciplinary panel of experts from aviation safety, nuclear power, cognitive psychology, artificial intelligence, and education, along with representatives of major transfusion medicine organizations, participated in the development process. Setting: three blood centers and three hospital transfusion services implemented the reporting system. A working prototype event-reporting system was recommended and implemented. The system has seven components: detection, selection, description, classification, computation, interpretation, and local evaluation. Its unique features include no-fault reporting initiated by the individual discovering the event, who submits a report that is investigated by local quality assurance personnel and forwarded to a nonregulatory central system for computation and interpretation. An event-reporting system incorporated into present quality assurance and risk management efforts can help organizations address systemic structural and procedural weaknesses where the potential for errors can adversely affect health care outcomes. Input from the end users of the system as well as from external experts should enable this reporting system to serve as a useful model for others who may develop event-reporting systems in other medical domains.
Non-Lipschitz Dynamics Approach to Discrete Event Systems
NASA Technical Reports Server (NTRS)
Zak, M.; Meyers, R.
1995-01-01
This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED) - a special type of 'man-made' system designed to aid specific areas of information processing. A main objective is to demonstrate that the mathematical formalism for DED can be based upon the terminal model of Newtonian dynamics, which allows one to relax Lipschitz conditions at some discrete points.
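The terminal-dynamics idea can be illustrated with the classic example dx/dt = -x^(1/3), which violates the Lipschitz condition at x = 0 and therefore reaches the equilibrium in finite time t* = (3/2) x0^(2/3), unlike dx/dt = -x, which only decays asymptotically. This specific example is a standard textbook illustration, not necessarily the one used in the paper. A simple Euler integration exhibits the finite settling time:

```python
def settle_time(x0, dt=1e-4, tol=1e-6):
    """Integrate dx/dt = -x**(1/3) until x falls below tol; return elapsed time."""
    x, t = x0, 0.0
    while x > tol:
        x += dt * (-(x ** (1.0 / 3.0)))
        t += dt
    return t

t_numeric = settle_time(1.0)
t_exact = 1.5 * 1.0 ** (2.0 / 3.0)   # analytic finite settling time = 1.5
print(t_numeric, t_exact)            # both close to 1.5
```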
Interactive experimenters' planning procedures and mission control
NASA Technical Reports Server (NTRS)
Desjardins, R. L.
1973-01-01
The computerized mission control and planning system routinely generates a 24-hour schedule in one hour of operator time by incorporating the time dimension into experiment planning procedures. Planning is validated interactively as it is generated, segment by segment, in the frame of specific event times. The planner simply points a light pen at the time mark of interest on the time line to enter specific event times into the schedule.
Cheng, Nan; Shang, Ying; Xu, Yuancong; Zhang, Li; Luo, Yunbo; Huang, Kunlun; Xu, Wentao
2017-05-15
Stacked genetically modified organisms (GMO) are becoming popular for their enhanced production efficiency and improved functional properties, and on-site detection of stacked GMO is an urgent challenge. In this study, we developed a cascade system combining event-specific tag-labeled multiplex LAMP with a DNAzyme-lateral flow biosensor for reliable detection of stacked events (DP305423 × GTS 40-3-2). Three primer sets, both event-specific and soybean species-specific, were newly designed for the tag-labeled multiplex LAMP system. A trident-like lateral flow biosensor displayed amplified products simultaneously without cross-contamination, and DNAzyme enhancement effectively improved the sensitivity. After optimization, the limit of detection was approximately 0.1% (w/w) for stacked GM soybean, which is sensitive enough to detect genetically modified content at the threshold values established by several countries for regulatory compliance. The entire detection process could be shortened to 120 min without any large-scale instrumentation. This method may be useful for in-field detection of DP305423 × GTS 40-3-2 soybean on a single-kernel basis and for on-site screening of stacked GM soybean lines and individual parent GM soybean lines in highly processed foods.
Embedded security system for multi-modal surveillance in a railway carriage
NASA Astrophysics Data System (ADS)
Zouaoui, Rhalem; Audigier, Romaric; Ambellouis, Sébastien; Capman, François; Benhadda, Hamid; Joudrier, Stéphanie; Sodoyer, David; Lamarque, Thierry
2015-10-01
Public transport security is one of the main priorities of the public authorities when fighting against crime and terrorism. In this context, there is a great demand for autonomous systems able to detect abnormal events such as violent acts aboard passenger cars and intrusions when the train is parked at the depot. To this end, we present an innovative approach which aims at providing efficient automatic event detection by fusing video and audio analytics and reducing the false alarm rate compared to classical stand-alone video detection. The multi-modal system is composed of two microphones and one camera and integrates onboard video and audio analytics and fusion capabilities. On the one hand, for detecting intrusion, the system relies on the fusion of "unusual" audio event detection with intrusion detections from video processing. The audio analysis consists in modeling the normal ambience and detecting deviations from the trained models during testing. This unsupervised approach is based on clustering of automatically extracted segments of acoustic features and statistical Gaussian Mixture Model (GMM) modeling of each cluster. The intrusion detection is based on the three-dimensional (3D) detection and tracking of individuals in the videos. On the other hand, for violent event detection, the system fuses unsupervised and supervised audio algorithms with video event detection. The supervised audio technique detects specific events such as shouts; a GMM is used to capture the formant structure of a shout signal. Video analytics use an original approach for detecting aggressive motion by focusing on erratic motion patterns specific to violent events. As data with violent events are not easily available, a normality model with structured motions from non-violent videos is learned for one-class classification. A fusion algorithm based on Dempster-Shafer theory analyzes the asynchronous detection outputs and computes the degree of belief of each probable event.
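The "model the normal ambience, flag deviations" idea above can be sketched in miniature: fit a Gaussian to normal-ambience feature frames, then mark test frames whose log-likelihood falls below a threshold calibrated on the training data. The real system clusters segments and fits a GMM per cluster; this single diagonal Gaussian with made-up two-dimensional features is only a sketch.

```python
import math

def fit_gaussian(frames):
    """Diagonal Gaussian: per-dimension mean and (floored) variance."""
    n, d = len(frames), len(frames[0])
    mean = [sum(f[j] for f in frames) / n for j in range(d)]
    var = [max(sum((f[j] - mean[j]) ** 2 for f in frames) / n, 1e-6)
           for j in range(d)]
    return mean, var

def loglik(frame, mean, var):
    return sum(-0.5 * (math.log(2 * math.pi * v) + (x - m) ** 2 / v)
               for x, m, v in zip(frame, mean, var))

normal = [[0.1 * i % 1.0, 1.0] for i in range(50)]      # training ambience
mean, var = fit_gaussian(normal)
threshold = min(loglik(f, mean, var) for f in normal)    # calibrated on train
print(loglik([0.5, 1.0], mean, var) >= threshold)   # typical frame: True
print(loglik([5.0, -4.0], mean, var) >= threshold)  # shout-like outlier: False
```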
Dual-stage periodic event-triggered output-feedback control for linear systems.
Ruan, Zhen; Chen, Wu-Hua; Lu, Xiaomei
2018-05-01
This paper proposes an event-triggered control framework, called dual-stage periodic event-triggered control (DSPETC), which unifies periodic event-triggered control (PETC) and switching event-triggered control (SETC). Specifically, two period parameters h1 and h2 are introduced to characterize the new event-triggering rule, where h1 denotes the sampling period and h2 denotes the monitoring period. By choosing particular values of h2, the proposed control scheme reduces to the PETC or SETC scheme. In the DSPETC framework, the controlled system is represented as a switched system model and its stability is analyzed via a switching-time-dependent Lyapunov functional. Both the cases with and without network-induced delays are investigated. Simulation and experimental results show that the DSPETC scheme is superior to both the PETC and SETC schemes.
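A minimal discrete-time sketch of the dual-stage idea: the plant state is sampled every h1 seconds, but the event-triggering rule is only checked every h2 seconds, and the control input is refreshed only when the deviation from the last transmitted state exceeds a relative threshold. The scalar plant, gains, and threshold below are illustrative choices, not the paper's design.

```python
def simulate(h1=0.01, h2=0.05, sigma=0.1, t_end=5.0):
    a, b, k = 1.0, 1.0, -2.0          # unstable plant, stabilizing gain
    x, x_held, updates = 1.0, 1.0, 0  # state, last transmitted state
    steps_per_check = round(h2 / h1)  # sampling instants per monitoring instant
    for i in range(round(t_end / h1)):
        # event check only at monitoring instants (every h2 seconds)
        if i % steps_per_check == 0 and abs(x - x_held) > sigma * abs(x):
            x_held, updates = x, updates + 1   # transmit: refresh controller
        u = k * x_held                          # zero-order-hold control
        x += h1 * (a * x + b * u)               # Euler step of dx = ax + bu
    return x, updates

x_final, updates = simulate()
print(x_final, updates)  # state near zero, far fewer updates than 500 samples
```

The point of the two periods is visible in the counters: the plant is sampled 500 times, but the controller is refreshed only at the few monitoring instants where the triggering rule fires.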
Gupta, Rahul; Audhkhasi, Kartik; Lee, Sungbok; Narayanan, Shrikanth
2017-01-01
Non-verbal communication involves the encoding, transmission and decoding of non-lexical cues and is realized using vocal (e.g. prosody) or visual (e.g. gaze, body language) channels during conversation. These cues perform the function of maintaining conversational flow, expressing emotions, and marking personality and interpersonal attitude. In particular, non-verbal cues in speech such as paralanguage and non-verbal vocal events (e.g. laughter, sighs, cries) are used to nuance meaning and convey emotions, mood and attitude. For instance, laughter is associated with affective expressions, while fillers (e.g. um, ah) are used to hold the floor during a conversation. In this paper we present an automatic non-verbal vocal event detection system focusing on the detection of laughter and fillers. We extend our system presented at the Interspeech 2013 Social Signals Sub-challenge (the winning entry in the challenge) for frame-wise event detection and test several schemes for incorporating local context during detection. Specifically, we incorporate context at two separate levels in our system: (i) the raw frame-wise features and (ii) the output decisions. Furthermore, our system processes the output probabilities based on a few heuristic rules in order to reduce erroneous frame-based predictions. Our overall system achieves an Area Under the Receiver Operating Characteristic curve of 95.3% for detecting laughter and 90.4% for fillers on the test set drawn from the data specifications of the Interspeech 2013 Social Signals Sub-challenge. We perform further analysis to understand the interrelation between the features and the obtained results. Specifically, we conduct a feature sensitivity analysis and correlate it with each feature's stand-alone performance. The observations suggest that the trained system is more sensitive to features carrying higher discriminability, with implications for better system design. PMID:28713197
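The two context mechanisms described above can be sketched generically: (i) stacking each frame's features with its neighbors before classification, and (ii) smoothing the frame-wise output decisions, here by a majority vote over a window plus a heuristic that drops detected segments shorter than a minimum duration. The window sizes and the specific heuristics are illustrative, not the paper's.

```python
def stack_context(features, width=2):
    """Concatenate each frame's features with its +-width neighbors (edges padded)."""
    n = len(features)
    out = []
    for i in range(n):
        ctx = []
        for j in range(i - width, i + width + 1):
            ctx.extend(features[min(max(j, 0), n - 1)])
        out.append(ctx)
    return out

def smooth_decisions(labels, window=5, min_len=3):
    """Majority-vote smoothing, then drop positive runs shorter than min_len."""
    half, n = window // 2, len(labels)
    voted = [int(sum(labels[max(i - half, 0):i + half + 1]) * 2
                 > len(labels[max(i - half, 0):i + half + 1]))
             for i in range(n)]
    out, i = voted[:], 0
    while i < n:
        if out[i] == 1:
            j = i
            while j < n and out[j] == 1:
                j += 1
            if j - i < min_len:          # too-short event: discard
                out[i:j] = [0] * (j - i)
            i = j
        else:
            i += 1
    return out

noisy = [0, 1, 0, 1, 1, 1, 1, 0, 1, 0, 0, 0, 1, 0]
print(smooth_decisions(noisy))  # [0, 0, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
```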
NASA Technical Reports Server (NTRS)
Dehghani, Navid; Tankenson, Michael
2006-01-01
This paper details an architectural description of the Mission Data Processing and Control System (MPCS), an event-driven, multi-mission ground data processing system providing uplink, downlink, and data management capabilities, which will support the Mars Science Laboratory (MSL) project as its first target mission. MPCS is developed from a set of small reusable components, implemented in Java, each designed with a specific function and well-defined interfaces. An industry-standard messaging bus is used to transfer information among system components. Components generate standard messages, which are used to capture system information as well as triggers to support the event-driven architecture of the system. Event-driven systems are highly desirable for processing high-rate telemetry (science and engineering) data and for supporting automation of many mission operations processes.
Johanson, Bradley E.; Fox, Armando; Winograd, Terry A.; Hanrahan, Patrick M.
2010-04-20
An efficient and adaptive middleware infrastructure called the Event Heap system dynamically coordinates application interactions and communications in a ubiquitous computing environment, e.g., an interactive workspace, having heterogeneous software applications running on various machines and devices across different platforms. Applications exchange events via the Event Heap. Each event is characterized by a set of unordered, named fields. Events are routed by matching certain attributes in the fields. The source and target versions of each field are automatically set when an event is posted or used as a template. The Event Heap system implements a unique combination of features, both intrinsic to tuplespaces and specific to the Event Heap, including content based addressing, support for routing patterns, standard routing fields, limited data persistence, query persistence/registration, transparent communication, self-description, flexible typing, logical/physical centralization, portable client API, at most once per source first-in-first-out ordering, and modular restartability.
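The tuplespace-style matching the abstract describes, events as sets of named fields retrieved by template, can be sketched as follows. This is a deliberately tiny model: a `None` field in the template acts as a wildcard, retrieval is a FIFO scan, and the Event Heap's other features (persistence, routing fields, registration, and so on) are omitted. The field names are illustrative.

```python
class EventHeap:
    """Toy tuplespace: post events as named fields, take the first match."""

    def __init__(self):
        self.events = []

    def post(self, **fields):
        self.events.append(fields)

    def matches(self, event, template):
        # A template matches when every non-None field equals the event's value.
        return all(v is None or event.get(k) == v
                   for k, v in template.items())

    def take(self, **template):
        """Remove and return the first matching event (FIFO order), or None."""
        for i, ev in enumerate(self.events):
            if self.matches(ev, template):
                return self.events.pop(i)
        return None

heap = EventHeap()
heap.post(type="ButtonPress", source="tablet-1", target="projector")
heap.post(type="SlideChange", source="laptop-2", target=None)
ev = heap.take(type="SlideChange", source=None)
print(ev["source"])  # laptop-2
```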
The effectiveness of pretreatment physics plan review for detecting errors in radiation therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gopan, Olga; Zeng, Jing; Novak, Avrey
Purpose: The pretreatment physics plan review is a standard tool for ensuring treatment quality. Studies have shown that the majority of errors in radiation oncology originate in treatment planning, which underscores the importance of the pretreatment physics plan review. This quality assurance measure is fundamentally important and central to the safety of patients and the quality of care that they receive. However, little is known about its effectiveness. The purpose of this study was to analyze reported incidents to quantify the effectiveness of the pretreatment physics plan review with the goal of improving it. Methods: This study analyzed 522 potentially severe or critical near-miss events within an institutional incident learning system collected over a three-year period. Of these 522 events, 356 originated at a workflow point that was prior to the pretreatment physics plan review. The remaining 166 events originated after the pretreatment physics plan review and were not considered in the study. The applicable 356 events were classified into one of three categories: (1) events detected by the pretreatment physics plan review, (2) events not detected but "potentially detectable" by the physics review, and (3) events "not detectable" by the physics review. Potentially detectable events were further classified by which specific checks performed during the pretreatment physics plan review detected or could have detected the event. For these events, the associated specific check was also evaluated as to the possibility of automating that check given current data structures. For comparison, a similar analysis was carried out on 81 events from the international SAFRON radiation oncology incident learning system. Results: Of the 356 applicable events from the institutional database, 180/356 (51%) were detected or could have been detected by the pretreatment physics plan review.
Of these events, 125 actually passed through the physics review; however, only 38% (47/125) were actually detected at the review. Of the 81 events from the SAFRON database, 66/81 (81%) were potentially detectable by the pretreatment physics plan review. From the institutional database, three specific physics checks were particularly effective at detecting events (combined effectiveness of 38%): verifying the isocenter (39/180), verifying DRRs (17/180), and verifying that the plan matched the prescription (12/180). The most effective checks from the SAFRON database were verifying that the plan matched the prescription (13/66) and verifying the field parameters in the record and verify system against those in the plan (23/66). Software-based plan checking systems, if available, would have potential effectiveness of 29% and 64% at detecting events from the institutional and SAFRON databases, respectively. Conclusions: Pretreatment physics plan review is a key safety measure and can detect a high percentage of errors. However, the majority of errors that potentially could have been detected were not detected in this study, indicating the need to improve the pretreatment physics plan review performance. Suggestions for improvement include the automation of specific physics checks performed during the pretreatment physics plan review and the standardization of the review process.
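One of the checks the study recommends automating, verifying that the plan matches the prescription, reduces to simple field comparisons once both documents are structured. The sketch below is a hypothetical illustration: the field names, tolerance, and record shapes are invented, not taken from any clinical system.

```python
def check_plan_vs_prescription(plan, rx, tol_gy=0.01):
    """Return a list of findings; an empty list means the plan matches."""
    findings = []
    if plan["n_fractions"] != rx["n_fractions"]:
        findings.append("fraction count mismatch")
    if abs(plan["dose_per_fraction_gy"] - rx["dose_per_fraction_gy"]) > tol_gy:
        findings.append("dose per fraction mismatch")
    total = plan["n_fractions"] * plan["dose_per_fraction_gy"]
    if abs(total - rx["total_dose_gy"]) > tol_gy:
        findings.append("total dose mismatch")
    return findings

rx = {"n_fractions": 30, "dose_per_fraction_gy": 2.0, "total_dose_gy": 60.0}
good = {"n_fractions": 30, "dose_per_fraction_gy": 2.0}
bad = {"n_fractions": 30, "dose_per_fraction_gy": 1.8}
print(check_plan_vs_prescription(good, rx))  # []
print(check_plan_vs_prescription(bad, rx))   # two mismatch findings
```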
Ho, Hung Chak; Wong, Man Sing; Yang, Lin; Shi, Wenzhong; Yang, Jinxin; Bilal, Muhammad; Chan, Ta-Chien
2018-03-01
Haze is an extreme weather event that can severely increase air pollution exposure, resulting in higher burdens on human health. Few studies have explored the health effects of haze, and none have investigated the spatiotemporal interaction between temperature, air quality and urban environment that may exacerbate the adverse health effects of haze. We investigated the spatiotemporal pattern of haze effects and explored the additional effects of temperature, air pollution and urban environment on the short-term mortality risk during hazy days. We applied a Poisson regression model to daily mortality data from 2007 through 2014, to analyze the short-term mortality risk during haze events in Hong Kong. We evaluated the adverse effect on five types of cause-specific mortality after four types of haze event. We also analyzed the additional effect contributed by the spatial variability of urban environment on each type of cause-specific mortality during a specific haze event. A regular hazy day (lag 0) has higher all-cause mortality risk than a day without haze (odds ratio: 1.029 [1.009, 1.049]). We have also observed high mortality risks associated with mental disorders and diseases of the nervous system during hazy days. In addition, extreme weather and air quality contributed to haze-related mortality, while cold weather and higher ground-level ozone had stronger influences on mortality risk. Areas with a high-density environment, lower vegetation, higher anthropogenic heat, and higher PM2.5 featured stronger effects of haze on mortality than the others. A combined influence of haze, extreme weather/air quality, and urban environment can result in extremely high mortality due to mental/behavioral disorders or diseases of the nervous system. In conclusion, we developed a data-driven technique to analyze the effects of haze on mortality. 
Our results target the specific dates and areas with higher mortality during haze events, which can be used for development of health warning protocols/systems. Copyright © 2017 Elsevier Ltd. All rights reserved.
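A minimal sketch of the kind of Poisson regression such daily-mortality studies apply, using synthetic data and a hypothetical hazy-day indicator (this is not the authors' model or data; all constants are invented for illustration):

```python
import numpy as np

def poisson_irls(X, y, n_iter=25):
    """Fit a Poisson regression (log link) by iteratively
    reweighted least squares; returns the coefficient vector."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean())              # start near the baseline rate
    for _ in range(n_iter):
        mu = np.exp(X @ beta)               # expected daily counts
        z = X @ beta + (y - mu) / mu        # working response
        beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))
    return beta

rng = np.random.default_rng(0)
n_days = 2000
haze = rng.integers(0, 2, n_days)           # hypothetical hazy-day indicator
lam = 30.0 * 1.03 ** haze                   # simulated rate ratio 1.03, ~30 deaths/day
deaths = rng.poisson(lam)

X = np.column_stack([np.ones(n_days), haze.astype(float)])
beta = poisson_irls(X, deaths.astype(float))
rate_ratio = float(np.exp(beta[1]))
print(round(rate_ratio, 3))                 # should be near the simulated 1.03
```

With roughly 60,000 simulated deaths, the fitted rate ratio recovers the simulated 1.03 to within sampling error; the published ratio of 1.029 comes from the authors' far richer model with lags and covariates.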
Yang, Litao; Xu, Songci; Pan, Aihu; Yin, Changsong; Zhang, Kewei; Wang, Zhenying; Zhou, Zhigang; Zhang, Dabing
2005-11-30
To enforce the genetically modified organism (GMO) labeling policies issued in many countries and regions, polymerase chain reaction (PCR) methods (screening, gene-specific, construct-specific, and event-specific) have been developed and have become a mainstay of GMO detection. The event-specific PCR detection method is the primary trend in GMO detection because of its high specificity, which is based on the flanking sequence of the exogenous integrant. The genetically modified maize MON863 contains a Cry3Bb1 coding sequence that produces a protein with enhanced insecticidal activity against a coleopteran pest, the corn rootworm. In this study, the 5'-integration junction sequence between the host plant DNA and the integrated gene construct of the genetically modified maize MON863 was revealed by means of thermal asymmetric interlaced PCR, and specific PCR primers and a TaqMan probe were designed based upon the revealed 5'-integration junction sequence; conventional qualitative PCR and quantitative TaqMan real-time PCR detection methods employing these primers and probes were successfully developed. In the conventional qualitative PCR assay, the limit of detection (LOD) was 0.1% for MON863 in 100 ng of maize genomic DNA per reaction. In the quantitative TaqMan real-time PCR assay, the LOD and the limit of quantification were eight and 80 haploid genome copies, respectively. In addition, three mixed maize samples with known MON863 contents were detected using the established real-time PCR systems, and the results indicated that the established event-specific real-time PCR detection systems are reliable, sensitive, and accurate.
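The quantitative step in TaqMan real-time PCR assays of this kind rests on a standard curve relating Ct values to starting copy number. A hedged sketch with illustrative constants (the slope and intercept below are assumptions, not values from the paper; a slope of -3.32 corresponds to 100% amplification efficiency):

```python
def copies_from_ct(ct, slope=-3.32, intercept=40.0):
    """Estimate starting template copies from a Ct value using a
    hypothetical standard curve Ct = slope*log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

def gm_content(ct_event, ct_reference):
    """Relative GM content as the copy ratio of the event-specific
    target to an endogenous reference gene (one shared standard
    curve assumed for both, purely for illustration)."""
    return copies_from_ct(ct_event) / copies_from_ct(ct_reference)

# Crossing the threshold one cycle earlier implies roughly twofold
# more starting template at 100% efficiency:
print(round(copies_from_ct(30.0) / copies_from_ct(31.0), 2))  # 2.0
```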
Terminal Dynamics Approach to Discrete Event Systems
NASA Technical Reports Server (NTRS)
Zak, Michail; Meyers, Ronald
1995-01-01
This paper presents and discusses a mathematical formalism for the simulation of discrete-event dynamic (DED) systems, a special type of 'man-made' system that serves specific purposes of information processing. The main objective of this work is to demonstrate that the mathematical formalism for DED systems can be based upon a terminal model of Newtonian dynamics, which allows one to relax the Lipschitz conditions at some discrete points.
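The terminal relaxation of the Lipschitz condition can be illustrated numerically: for dx/dt = -k*x^(1/3), the right-hand side is non-Lipschitz at x = 0, so the state reaches equilibrium in finite time t* = 3*x0^(2/3)/(2k) rather than only asymptotically. A minimal sketch, not code from the paper, with arbitrary constants:

```python
# dx/dt = -k * x**(1/3) is non-Lipschitz at x = 0, so trajectories
# reach the equilibrium in finite time t* = 3 * x0**(2/3) / (2*k),
# here 1.5, instead of only approaching it asymptotically.
k, x0, dt = 1.0, 1.0, 1e-4
x, t = x0, 0.0
while x > 0.0:
    x = max(x - dt * k * x ** (1.0 / 3.0), 0.0)  # explicit Euler step
    t += dt
print(round(t, 2))   # close to the analytic settling time 1.5
```

A Lipschitz-continuous right-hand side (e.g. dx/dt = -k*x) could never terminate this loop exactly; the finite settling time is the "terminal" property.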
NASA Astrophysics Data System (ADS)
Funk, Daniel
2016-04-01
The successful provision of seasonal-to-decadal (S2D) climate service products to sector-specific users is dependent on specific problem characteristics and individual user needs and decision-making processes. Climate information requires an impact on decision making to have any value (Rodwell and Doblas-Reyes, 2006). For that reason, knowledge of sector-specific vulnerabilities to S2D climate variability is very valuable information for both climate service producers and users. In this context a concept for a vulnerability assessment framework was developed to (i) identify climate events (and especially their temporal scales) critical for sector-specific problems to assess the basic requirements for an appropriate climate-service product development; and to (ii) assess the potential impact or value of related climate information for decision-makers. The concept was developed within the EUPORIAS project (European Provision of Regional Impacts Assessments on Seasonal and Decadal Timescales) based on ten project-related case-studies from different sectors all over Europe. At its present stage, the framework may be useful as a preliminary assessment or 'quick-scan' of the vulnerability of specific systems to climate variability in the context of S2D climate service provision. The assessment strategy of the framework is user-focused, using predominantly a bottom-up approach (vulnerability as state) but also a top-down approach (vulnerability as outcome), generally based on qualitative data (surveys, interviews, etc.) and literature research for system understanding. The starting point of analysis is a climate-sensitive 'critical situation' of the considered system which requires a decision and is defined by the user. From this basis the related 'critical climate conditions' are assessed and 'climate information needs' are derived. This mainly refers to the critical period of time of the climate event or sequence of events. 
The relevant period of time of problem-specific critical climate conditions may be assessed by the resilience of the system of concern, the response time of an interconnected system (i.e. a top-down approach using a bottom-up methodology) or, alternatively, by the critical time-frame of decision-making processes (bottom-up approach). This approach counters the challenges for a vulnerability assessment of economic sectors to S2D climate events which originate from the inherent role of climate for economic sectors: climate may affect economic sectors as a hazard, a resource, or a production or regulation factor. This implies that climate dependencies are often indirect and nonlinear. Consequently, climate events which are critical for affected systems do not necessarily correlate with common climatological extremes. One important output of the framework is a classification system of 'climate-impact types' which classifies sector-specific problems in a systemic way. This system proves to be promising because (i) it reflects and thus differentiates the cause for the climate relevance of a specific problem (compositions of buffer factors); (ii) it integrates decision-making processes, which proved to be a significant factor; (iii) it indicates a potential usability of S2D climate service products and thus integrates coping options; and (iv) it is a systemic approach which goes beyond the established 'snap-shot' of vulnerability assessments.
Time to foster a rational approach to preventing cardiovascular morbid events.
Cohn, Jay N; Duprez, Daniel A
2008-07-29
Efforts to prevent atherosclerotic morbid events have focused primarily on risk factor prevention and intervention. These approaches, based on the statistical association of risk factors with events, have dominated clinical practice in the last generation. Because the cardiovascular abnormalities eventuating in morbid events are detectable in the arteries and heart before the development of symptomatic disease, recent efforts have focused on identifying the presence of these abnormalities as a more sensitive and specific guide to the need for therapy. Advances in noninvasive techniques for studying the vasculature and the left ventricle now provide the opportunity to use early disease rather than risk factors as the tool for clinical decision making. A disease scoring system has been developed using 10 tests of vascular and cardiac function and structure. More extensive data to confirm the sensitivity and specificity of this scoring system and to demonstrate its utility in tracking the response to therapy are needed to justify widespread application in clinical practice.
Forecasting Significant Societal Events Using The Embers Streaming Predictive Analytics System
Katz, Graham; Summers, Kristen; Ackermann, Chris; Zavorin, Ilya; Lim, Zunsik; Muthiah, Sathappan; Butler, Patrick; Self, Nathan; Zhao, Liang; Lu, Chang-Tien; Khandpur, Rupinder Paul; Fayed, Youssef; Ramakrishnan, Naren
2014-01-01
Developed under the Intelligence Advanced Research Project Activity Open Source Indicators program, Early Model Based Event Recognition using Surrogates (EMBERS) is a large-scale big data analytics system for forecasting significant societal events, such as civil unrest events, on the basis of continuous, automated analysis of large volumes of publicly available data. It has been operational since November 2012 and delivers approximately 50 predictions each day for countries of Latin America. EMBERS is built on a streaming, scalable, loosely coupled, shared-nothing architecture using ZeroMQ as its messaging backbone and JSON as its wire data format. It is deployed on Amazon Web Services using an entirely automated deployment process. We describe the architecture of the system, some of the design tradeoffs encountered during development, and specifics of the machine learning models underlying EMBERS. We also present a detailed prospective evaluation of EMBERS in forecasting significant societal events in the past 2 years. PMID:25553271
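The loosely coupled, JSON-on-the-wire design described can be sketched in miniature. Here a plain in-process queue stands in for the ZeroMQ messaging backbone, and all message fields, stage names, and the keyword "model" are invented for illustration:

```python
import json
from queue import Queue

# Toy "shared-nothing" pipeline: stages exchange only JSON strings
# over a message channel (ZeroMQ in the real system; a Queue here).
channel = Queue()

def ingest(raw_posts):
    """Enrichment stage: wrap raw posts as JSON wire messages."""
    for post in raw_posts:
        msg = {"text": post, "lang": "es"}
        channel.put(json.dumps(msg))          # JSON on the wire

def predict():
    """Model stage: consume messages, emit alerts."""
    alerts = []
    while not channel.empty():
        msg = json.loads(channel.get())
        if "protesta" in msg["text"]:         # stand-in for a real model
            alerts.append({"event": "civil_unrest", "evidence": msg["text"]})
    return alerts

ingest(["gran protesta manana", "partido de futbol hoy"])
alerts = predict()
print(len(alerts))   # 1
```

Because the stages share no state and communicate only via serialized messages, each can be scaled or replaced independently, which is the design property the abstract emphasizes.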
Drosophila melanogaster as a model system for assessing development under conditions of microgravity
NASA Technical Reports Server (NTRS)
Abbott, M. K.; Hilgenfeld, R. B.; Denell, R. E.; Spooner, B. S. (Principal Investigator)
1992-01-01
More is known about the regulation of early developmental events in Drosophila than in any other animal. In addition, its size and short life cycle make it a facile experimental system. Since developmental perturbations have been demonstrated when both oogenesis and embryogenesis occur in the space environment, there is a strong rationale for using this organism for the elucidation of specific gravity-sensitive developmental events.
Gupta, Priyanka; Schomburg, John; Krishna, Suprita; Adejoro, Oluwakayode; Wang, Qi; Marsh, Benjamin; Nguyen, Andrew; Genere, Juan Reyes; Self, Patrick; Lund, Erik; Konety, Badrinath R
2017-01-01
To examine the Manufacturer and User Facility Device Experience (MAUDE) database to capture adverse events experienced with the Da Vinci Surgical System. In addition, to design a standardized classification system to categorize the complications and machine failures associated with the device. Overall, 1,057,000 Da Vinci procedures were performed in the United States between 2009 and 2012. Currently, no system exists for classifying and comparing device-related errors and complications with which to evaluate adverse events associated with the Da Vinci Surgical System. The MAUDE database was queried for event reports related to the Da Vinci Surgical System between the years 2009 and 2012. A classification system was developed and tested among 14 robotic surgeons to associate a level of severity with each event and its relationship to the Da Vinci Surgical System. Events were then classified according to this system and examined by using chi-square analysis. Two thousand eight hundred thirty-seven events were identified, of which 34% were obstetrics and gynecology (Ob/Gyn); 19%, urology; 11%, other; and 36%, not specified. Our classification system had moderate agreement with a Kappa score of 0.52. Using our classification system, we identified 75% of the events as mild, 18% as moderate, 4% as severe, and 3% as life threatening or resulting in death. Seventy-seven percent were classified as definitely related to the device, 15% as possibly related, and 8% as not related. Urology procedures compared with Ob/Gyn were associated with more severe events (38% vs 26%, p < 0.0001). Energy instruments were associated with less severe events compared with the surgical system (8% vs 87%, p < 0.0001). Events that were definitely associated with the device tended to be less severe (81% vs 19%, p < 0.0001). Our classification system is a valid tool with moderate inter-rater agreement that can be used to better understand device-related adverse events. 
The majority of robot-related events were mild but were associated with the device.
Duke, Jon D.; Friedlin, Jeff
2010-01-01
Evaluating medications for potential adverse events is a time-consuming process, typically involving manual lookup of information by physicians. This process can be expedited by CDS systems that support dynamic retrieval and filtering of adverse drug events (ADEs), but such systems require a source of semantically coded ADE data. We created a two-component system that addresses this need. First, we created a natural language processing application which extracts adverse events from Structured Product Labels and generates a standardized ADE knowledge base. We then built a decision support service that consumes a Continuity of Care Document and returns a list of patient-specific ADEs. Our database currently contains 534,125 ADEs from 5602 product labels. An NLP evaluation of 9529 ADEs showed recall of 93% and precision of 95%. On a trial set of 30 CCDs, the system provided adverse event data for 88% of drugs and returned these results in an average of 620 ms. PMID:21346964
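The retrieval step such a decision support service performs can be sketched as a simple knowledge-base lookup keyed by drug name. The drug names and reactions below are illustrative placeholders, not entries from the actual knowledge base:

```python
# Hypothetical ADE knowledge base, as the NLP component might emit it
# after processing Structured Product Labels (entries are invented).
ade_kb = {
    "warfarin": ["hemorrhage", "skin necrosis"],
    "lisinopril": ["cough", "angioedema"],
}

def patient_ades(medications):
    """Match a patient's medication list (as it might come from a CCD)
    against the KB; drugs without a product-label entry are reported
    separately, mirroring the 88% coverage figure in the abstract."""
    found, missing = {}, []
    for drug in medications:
        key = drug.lower()
        if key in ade_kb:
            found[key] = ade_kb[key]
        else:
            missing.append(drug)
    return found, missing

found, missing = patient_ades(["Warfarin", "Metformin"])
print(sorted(found), missing)   # ['warfarin'] ['Metformin']
```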
Savage, W.U.; Nishenko, S.P.; Honegger, D.G.; Kempner, L.
2006-01-01
Electric power utilities are familiar with and skilled in preparing for and responding to almost-routine natural hazard events such as strong wind and ice storms and seasonal floods, as well as intentional human acts such as vandalism. Recent extreme weather (hurricanes Katrina and Rita), extremely destructive international earthquakes (in Sumatra and Pakistan), and nation-wide concerns regarding future terrorist attacks have increased the pressure on utilities to take appropriate steps to avoid being overwhelmed by such infrequent and exceedingly severe events. Determining what constitutes the appropriate steps to take requires various levels of understanding of the specific hazards and the risks faced by the utility. The American Lifelines Alliance (www.americanlifelinesalliance.org) has prepared a Guideline that provides clear, concise, and nationally-applicable guidance on determining the scope and level of effort necessary to assess power system performance in the wide range of natural hazard or human threat events. Included in this Guideline are specific procedures to follow and information to consider in performing standardized assessments. With the results of such assessments, utility owners can effectively establish and carry out risk management programs that will lead to achieving appropriate levels of performance in future events. The Guideline incorporates an inquiry-driven process with a two-phase performance assessment that can be applied to power systems of any size. The screening phase enables systems or components that are clearly not at risk to be screened out early. The subsequent analysis phase uses results from the screening phase to prioritize and allocate resources for more detailed assessments of hazard, vulnerability, and system performance. This process helps assure that the scope of the assessment meets the specific performance objectives of the inquiry. 
A case history is presented to illustrate the type of experience with an inquiry-driven process that was considered in developing the Guideline to meet the diverse needs of utility personnel in engineering, operations, and management. Copyright ASCE 2007.
Teaching Cultural History from Primary Events
ERIC Educational Resources Information Center
Carson, Robert N.
2004-01-01
This article explores the relationship between specific cultural events such as Galileo's work with the pendulum and a curriculum design that seeks to establish in skeletal form a comprehensive epic narrative about the co-evolution of cultural systems and human consciousness. The article explores some of the challenges and some of the strategies…
Event-Based Surveillance During EXPO Milan 2015: Rationale, Tools, Procedures, and Initial Results
Manso, Martina Del; Caporali, Maria Grazia; Napoli, Christian; Linge, Jens P.; Mantica, Eleonora; Verile, Marco; Piatti, Alessandra; Pompa, Maria Grazia; Vellucci, Loredana; Costanzo, Virgilio; Bastiampillai, Anan Judina; Gabrielli, Eugenia; Gramegna, Maria; Declich, Silvia
2016-01-01
More than 21 million participants attended EXPO Milan from May to October 2015, making it one of the largest protracted mass gathering events in Europe. Given the expected national and international population movement and health security issues associated with this event, Italy fully implemented, for the first time, an event-based surveillance (EBS) system focusing on naturally occurring infectious diseases and the monitoring of biological agents with potential for intentional release. The system started its pilot phase in March 2015 and was fully operational between April and November 2015. In order to set the specific objectives of the EBS system, and its complementary role to indicator-based surveillance, we defined a list of priority diseases and conditions. This list was designed on the basis of the probability and possible public health impact of infectious disease transmission, existing statutory surveillance systems in place, and any surveillance enhancements during the mass gathering event. This article reports the methodology used to design the EBS system for EXPO Milan and the results of 8 months of surveillance. PMID:27314656
The MK VI - A second generation attitude control system
NASA Astrophysics Data System (ADS)
Meredith, P. J.
1986-10-01
The MK VI, a new multipurpose attitude control system for the exoatmospheric attitude control of sounding rocket payloads, is described. The system employs reprogrammable microcomputer memory for storage of basic control logic and for specific mission event control data. The paper includes descriptions of MK VI specifications and configuration; sensor characteristics; the electronic, analog, and digital sections; the pneumatic system; ground equipment; the system operation; and software. A review of the MK VI performance for the Comet Halley flight is presented. Block diagrams are included.
Tool for Human-Systems Integration Assessment: HSI Scorecard
NASA Technical Reports Server (NTRS)
Whitmore, Nihriban; Sandor, Aniko; McGuire, Kerry M.; Berdich, Debbie
2009-01-01
This paper describes the development and rationale for a human-systems integration (HSI) scorecard that can be used in reviews of vehicle specification and design. This tool can be used to assess whether specific HSI-related criteria have been met as part of a project milestone or critical event, such as technical reviews, crew station reviews, mockup evaluations, or even reviews of major plans or processes. Examples of HSI-related criteria include Human Performance Capabilities, Health Management, Human System Interfaces, Anthropometry and Biomechanics, and Natural and Induced Environments. The tool is not intended to evaluate requirements compliance and verification, but to review how well the human-related systems have been considered for the specific event and to identify gaps and vulnerabilities from an HSI perspective. The scorecard offers a common basis and criteria for discussions among system managers, evaluators, and design engineers. Furthermore, the scorecard items highlight the main areas of system development that need to be followed during the system lifecycle. The ratings provide a repeatable, quantitative measure of what has often been seen as only subjective commentary. Thus, the scorecard is anticipated to be a useful HSI tool for communicating review results to institutional and project office management.
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Barringer, Howard
2012-01-01
TraceContract is an API (Application Programming Interface) for trace analysis. A trace is a sequence of events, and can, for example, be generated by a running program, instrumented appropriately to generate events. An event can be any data object. An example of a trace is a log file containing events that a programmer has found important to record during a program execution. TraceContract takes as input such a trace together with a specification formulated using the API and reports on any violations of the specification, potentially calling code (reactions) to be executed when violations are detected. The software is developed as an internal DSL (Domain Specific Language) in the Scala programming language. Scala is a relatively new programming language that is specifically convenient for defining such internal DSLs due to a number of language characteristics. This includes Scala's elegant combination of object-oriented and functional programming, a succinct notation, and an advanced type system. The DSL offers a combination of data-parameterized state machines and temporal logic, which is novel. As an extension of Scala, it is a very expressive and convenient log file analysis framework.
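TraceContract itself is a Scala DSL; as a language-neutral sketch of the data-parameterized monitoring it describes, here is a Python monitor for the illustrative property "every open(f) is eventually followed by close(f)", with a reaction callback invoked on violations (the property and event names are invented, not from the tool's documentation):

```python
def monitor(trace, on_violation):
    """Check 'every open(f) is eventually followed by close(f)' over a
    trace of (kind, filename) events; effectively one small state
    machine per file name, parameterized by the event data."""
    open_files = set()
    for kind, f in trace:
        if kind == "open":
            open_files.add(f)
        elif kind == "close":
            if f not in open_files:
                on_violation(f"close without open: {f}")
            open_files.discard(f)
    for f in open_files:               # end of trace: pending opens fail
        on_violation(f"never closed: {f}")

violations = []
trace = [("open", "a.txt"), ("open", "b.txt"), ("close", "a.txt")]
monitor(trace, violations.append)
print(violations)   # ['never closed: b.txt']
```

The reaction callback mirrors the abstract's point that code can be executed when violations are detected, rather than merely reported at the end.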
An Overview of the Runtime Verification Tool Java PathExplorer
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)
2002-01-01
We present an overview of the Java PathExplorer runtime verification tool, in short referred to as JPAX. JPAX can monitor the execution of a Java program and check that it conforms with a set of user provided properties formulated in temporal logic. JPAX can in addition analyze the program for concurrency errors such as deadlocks and data races. The concurrency analysis requires no user provided specification. The tool facilitates automated instrumentation of a program's bytecode, which when executed will emit an event stream, the execution trace, to an observer. The observer dispatches the incoming event stream to a set of observer processes, each performing a specialized analysis, such as the temporal logic verification, the deadlock analysis and the data race analysis. Temporal logic specifications can be formulated by the user in the Maude rewriting logic, where Maude is a high-speed rewriting system for equational logic, but here extended with executable temporal logic. The Maude rewriting engine is then activated as an event driven monitoring process. Alternatively, temporal specifications can be translated into efficient automata, which check the event stream. JPAX can be used during program testing to gain increased information about program executions, and can potentially furthermore be applied during operation to survey safety critical systems.
Application of a temporal reasoning framework tool in analysis of medical device adverse events.
Clark, Kimberly K; Sharma, Deepak K; Chute, Christopher G; Tao, Cui
2011-01-01
The Clinical Narrative Temporal Relation Ontology (CNTRO) project offers a semantic-web based reasoning framework, which represents temporal events and relationships within clinical narrative texts, and infers new knowledge over them. In this paper, the CNTRO reasoning framework is applied to temporal analysis of medical device adverse event files. One specific adverse event was used as a test case: late stent thrombosis. Adverse event narratives were obtained from the Food and Drug Administration's (FDA) Manufacturer and User Facility Device Experience (MAUDE) database. 15 adverse event files in which late stent thrombosis was confirmed were randomly selected across multiple drug eluting stent devices. From these files, 81 events and 72 temporal relations were annotated. 73 temporal questions were generated, of which 65 were correctly answered by the CNTRO system. This results in an overall accuracy of 89%. This system should be pursued further to continue assessing its potential benefits in temporal analysis of medical device adverse events.
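One kind of inference such a temporal reasoning framework performs is transitive closure over "before" relations, so that orderings not stated explicitly in the narrative can still be queried. A small sketch with invented event names (this is not CNTRO's actual reasoner, which handles many more relation types):

```python
def before_closure(pairs):
    """Transitive closure of a set of (earlier, later) relations:
    if a<b and b<c are annotated, infer a<c."""
    closure = set(pairs)
    changed = True
    while changed:
        changed = False
        for a, b in list(closure):
            for c, d in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

# Hypothetical annotations from one adverse-event narrative:
annotated = {("stent placed", "chest pain"),
             ("chest pain", "thrombosis confirmed")}
inferred = before_closure(annotated)
print(("stent placed", "thrombosis confirmed") in inferred)   # True
```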
Stability of a giant connected component in a complex network
NASA Astrophysics Data System (ADS)
Kitsak, Maksim; Ganin, Alexander A.; Eisenberg, Daniel A.; Krapivsky, Pavel L.; Krioukov, Dmitri; Alderson, David L.; Linkov, Igor
2018-01-01
We analyze the stability of a network's giant connected component under the impact of adverse events, which we model through link percolation. Specifically, we quantify the extent to which the largest connected component of a network consists of the same nodes, regardless of the specific set of deactivated links. Our results are intuitive in the case of single-layered systems: the presence of large degree nodes in a single-layered network ensures both its robustness and stability. In contrast, we find that interdependent networks that are robust to adverse events have unstable connected components. Our results bring novel insights to the design of resilient network topologies and the reinforcement of existing networked systems.
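The stability measure described can be sketched directly: deactivate links at random, extract the largest connected component (LCC), and compare the LCC node sets across trials, here with a Jaccard overlap. The random graph, retention probability, and overlap score are arbitrary choices for illustration, not the paper's exact protocol:

```python
import random

def lcc(nodes, edges):
    """Largest connected component (as a node set) via DFS."""
    adj = {v: set() for v in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    best, seen = set(), set()
    for s in nodes:
        if s in seen:
            continue
        comp, stack = set(), [s]
        while stack:
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(adj[v] - comp)
        seen |= comp
        if len(comp) > len(best):
            best = comp
    return best

random.seed(1)
nodes = list(range(200))
edges = [(a, b) for a in nodes for b in nodes if a < b and random.random() < 0.03]

samples = []
for _ in range(2):
    kept = [e for e in edges if random.random() < 0.5]   # random link deactivation
    samples.append(lcc(nodes, kept))
overlap = len(samples[0] & samples[1]) / len(samples[0] | samples[1])
print(0.0 <= overlap <= 1.0)   # Jaccard stability of the LCC node set
```

An overlap near 1 across many trials indicates a stable giant component (same nodes survive regardless of which links fail); a large LCC with low overlap is the robust-but-unstable regime the abstract attributes to interdependent networks.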
Identity method for particle number fluctuations and correlations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorenstein, M. I.
An incomplete particle identification distorts the observed event-by-event fluctuations of the hadron chemical composition in nucleus-nucleus collisions. A new experimental technique called the identity method was recently proposed. It eliminated the misidentification problem for one specific combination of the second moments in a system of two hadron species. In the present paper, this method is extended to calculate all the second moments in a system with an arbitrary number of hadron species. Special linear combinations of the second moments are introduced. These combinations are presented in terms of single-particle variables and can be found experimentally from the event-by-event averaging. The mathematical problem is then reduced to solving a system of linear equations. The effect of incomplete particle identification is fully eliminated from the final results.
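The final step described, recovering the unknown second moments from measurable linear combinations, amounts to a single linear solve. A generic numerical illustration (the matrix and "true" moments below are arbitrary stand-ins, not the coefficients derived in the paper):

```python
import numpy as np

# The measurable combinations b are linear in the unknown second
# moments x of the species multiplicities, b = A @ x, so the moments
# follow from one linear solve once A is known.
A = np.array([[1.0, 0.5, 0.2],
              [0.3, 1.0, 0.4],
              [0.1, 0.2, 1.0]])
x_true = np.array([2.0, 3.0, 1.5])     # pretend second moments (3 species)
b = A @ x_true                          # "measured" combinations
x = np.linalg.solve(A, b)
print(bool(np.allclose(x, x_true)))     # True
```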
Formal analysis of imprecise system requirements with Event-B.
Le, Hong Anh; Nakajima, Shin; Truong, Ninh Thuan
2016-01-01
Formal analysis of functional properties of system requirements needs precise descriptions. However, stakeholders sometimes describe the system with ambiguous, vague or fuzzy terms, hence formal frameworks for modeling and verifying such requirements are desirable. Fuzzy If-Then rules have been used for imprecise requirements representation, but verifying their functional properties still needs new methods. In this paper, we propose a refinement-based modeling approach for the specification and verification of such requirements. First, we introduce a representation of imprecise requirements in set theory. Then we make use of Event-B refinement, providing a set of translation rules from Fuzzy If-Then rules to Event-B notations. After that, we show how to verify both safety and eventuality properties with RODIN/Event-B. Finally, we illustrate the proposed method on the example of a crane controller.
Sols, Ignasi; DuBrow, Sarah; Davachi, Lila; Fuentemilla, Lluís
2017-11-20
Although everyday experiences unfold continuously over time, shifts in context, or event boundaries, can influence how those events come to be represented in memory [1-4]. Specifically, mnemonic binding across sequential representations is more challenging at context shifts, such that successful temporal associations are more likely to be formed within than across contexts [1, 2, 5-9]. However, in order to preserve a subjective sense of continuity, it is important that the memory system bridge temporally adjacent events, even if they occur in seemingly distinct contexts. Here, we applied pattern similarity analysis to scalp electroencephalographic (EEG) recordings during a sequential learning task [2, 3] in humans and showed that the detection of event boundaries triggered a rapid memory reinstatement of the just-encoded sequence episode. Memory reactivation was detected rapidly (∼200-800 ms from the onset of the event boundary) and was specific to context shifts that were preceded by an event sequence with episodic content. Memory reinstatement was not observed during the sequential encoding of events within an episode, indicating that memory reactivation was induced specifically upon context shifts. Finally, the degree of neural similarity between neural responses elicited during sequence encoding and at event boundaries correlated positively with participants' ability to later link across sequences of events, suggesting a critical role in binding temporally adjacent events in long-term memory. Current results shed light onto the neural mechanisms that promote episodic encoding not only for information within the event, but also, importantly, in the ability to link across events to create a memory representation of continuous experience. Copyright © 2017 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Ambrose, Maureen; Hess, Ronald L.; Ganesan, Shankar
2007-01-01
Research in organizational justice has always been interested in the relationship between justice and attitudes. This research often examines how different types of justice affect different attitudes, with distributive justice predicted to affect attitudes about specific events (e.g., performance evaluation) and procedural justice predicted to…
Homeland Security Advisory System: An Assessment of Its Ability to Formulate a Risk Message
2010-06-01
The Homeland Security Advisory System defines five color-coded threat conditions: Low = Green, Guarded = Blue, Elevated = Yellow, High = Orange, and Severe = Red. Each threat condition is associated with specific protective measures. The hazards considered include hazardous material spills, nuclear materials releases, geological events (e.g., earthquakes, volcanoes), and climatological events (e.g., tornados).
2000-06-01
real-time operating system and design of a human-computer interface (HCI) for a triple modular redundant (TMR) fault-tolerant microprocessor for use in space-based applications. One disadvantage of using COTS hardware components is their susceptibility to the radiation effects present in the space environment, specifically radiation-induced single-event upsets (SEUs). In the event of an SEU, a fault-tolerant system can mitigate the effects of the upset and continue to process from the last known correct system state. The TMR basic hardware
Systems analysis of arrestin pathway functions.
Maudsley, Stuart; Siddiqui, Sana; Martin, Bronwen
2013-01-01
To fully appreciate the diversity and specificity of complex cellular signaling events, such as arrestin-mediated signaling from G protein-coupled receptor activation, a complex systems-level investigation currently appears to be the best option. A rational combination of transcriptomics, proteomics, and interactomics, all coherently integrated with applied next-generation bioinformatics, is vital for the future understanding of the development, translation, and expression of GPCR-mediated arrestin signaling events in physiological contexts. Through a more nuanced, systems-level appreciation of arrestin-mediated signaling, the creation of arrestin-specific molecular response "signatures" should become straightforward and ultimately amenable to drug discovery processes. Arrestin-based signaling paradigms possess important properties, such as their specific temporal kinetics and ability to strongly affect transcriptional activity, that make them an ideal test bed for next-generation drug discovery bioinformatics approaches such as multi-parallel dose-response analysis, data texturization, and latent semantic indexing-based natural language processing and feature extraction. Copyright © 2013 Elsevier Inc. All rights reserved.
Benefits of Sewerage System Real-Time Control
Real-time control (RTC) is a custom-designed, computer-assisted management system for a specific urban sewerage network that is activated during a wet-weather flow event. Though the use of RTC systems began in the mid-1960s, recent developments in computers, telecommunication, in...
Lewis, Geraint; Kirkham, Heather; Duncan, Ian; Vaithianathan, Rhema
2013-04-01
Health care systems in many countries are using the "Triple Aim"--to improve patients' experience of care, to advance population health, and to lower per capita costs--as a focus for improving quality. Population strategies for addressing the Triple Aim are becoming increasingly prevalent in developed countries, but ultimately success will also require targeting specific subgroups and individuals. Certain events, which we call "Triple Fail" events, constitute a simultaneous failure to meet all three Triple Aim goals. The risk of experiencing different Triple Fail events varies widely across people. We argue that by stratifying populations according to each person's risk and anticipated response to an intervention, health systems could more effectively target different preventive interventions at particular risk strata. In this article we describe how such an approach could be planned and operationalized. Policy makers should consider using this stratified approach to reduce the incidence of Triple Fail events, thereby improving outcomes, enhancing patient experience, and lowering costs.
Detection of rain events in radiological early warning networks with spectro-dosimetric systems
NASA Astrophysics Data System (ADS)
Dąbrowski, R.; Dombrowski, H.; Kessler, P.; Röttger, A.; Neumaier, S.
2017-10-01
Short-term pronounced increases of the ambient dose equivalent rate, due to rainfall are a well-known phenomenon. Increases in the same order of magnitude or even below may also be caused by a nuclear or radiological event, i.e. by artificial radiation. Hence, it is important to be able to identify natural rain events in dosimetric early warning networks and to distinguish them from radiological events. Novel spectrometric systems based on scintillators may be used to differentiate between the two scenarios, because the measured gamma spectra provide significant nuclide-specific information. This paper describes three simple, automatic methods to check whether an dot H*(10) increase is caused by a rain event or by artificial radiation. These methods were applied to measurements of three spectrometric systems based on CeBr3, LaBr3 and SrI2 scintillation crystals, investigated and tested for their practicability at a free-field reference site of PTB.
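The discrimination idea can be sketched with a toy classifier; the peak windows and threshold below are assumptions for illustration, not the PTB methods. Rain-induced increases are dominated by washed-out radon progeny (e.g. 214Pb near 352 keV, 214Bi near 609 keV), so a dose-rate increase whose excess counts fall mostly in those regions of interest can be flagged as natural:

```python
# Illustrative sketch, not the actual PTB algorithm. Spectra are dicts
# mapping energy bin (keV) -> counts; ROIs are assumed radon-progeny windows.
PROGENY_WINDOWS = [(340, 365), (595, 625)]  # keV, illustrative ROIs

def is_rain_event(background, spectrum, threshold=0.5):
    """True if at least `threshold` of the excess counts over background
    fall inside the radon-progeny regions of interest."""
    excess = {e: max(spectrum.get(e, 0) - background.get(e, 0), 0)
              for e in spectrum}
    total = sum(excess.values())
    if total == 0:
        return False                      # no increase to classify
    in_roi = sum(c for e, c in excess.items()
                 if any(lo <= e <= hi for lo, hi in PROGENY_WINDOWS))
    return in_roi / total >= threshold

bg = {350: 10, 609: 8, 662: 5}
rain = {350: 40, 609: 30, 662: 6}         # excess mostly in progeny lines
cs137 = {350: 12, 609: 9, 662: 60}        # excess at 662 keV (137Cs line)
assert is_rain_event(bg, rain) is True
assert is_rain_event(bg, cs137) is False  # artificial-radiation signature
```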
Relative Time-scale for Channeling Events Within Chaotic Terrains, Margaritifer Sinus, Mars
NASA Technical Reports Server (NTRS)
Janke, D.
1985-01-01
A relative time scale for ordering channel- and chaos-forming events was constructed for areas within the Margaritifer Sinus region of Mars. Transection and superposition relationships of channels, chaotic terrain, and the surfaces surrounding them were used to create the relative time scale; crater density studies were not used. Channels and chaos in contact with one another were treated as systems. These systems were in turn treated both separately (in order to understand internal relationships) and as members of the suite of Martian erosional forms (in order to produce a combined, master time scale). Channeling events associated with chaotic terrain development occurred over an extended geomorphic period. The channels can be divided into three convenient groups: those that pre-date intercrater plains development; post-plains, pre-chasma systems; and those associated with the development of the Valles Marineris chasmata. No correlations with cyclic climatic changes, major geologic events in other regions of Mars, or triggering phenomena (for example, specific impact events) were found.
NASA Technical Reports Server (NTRS)
Havelund, Klaus
2014-01-01
The field of runtime verification has during the last decade seen a multitude of systems for monitoring event sequences (traces) emitted by a running system. The objective is to ensure correctness of a system by checking its execution traces against formal specifications representing requirements. A special challenge is data-parameterized events, where monitors have to keep track of the combination of control states as well as data constraints, relating events and the data they carry across time points. This poses a challenge with respect to the efficiency of monitors, as well as the expressiveness of logics. Data automata are a form of automaton in which states are parameterized with data, supporting the monitoring of data-parameterized events. We describe the full details of a very simple API in the Scala programming language, an internal DSL (Domain-Specific Language), implementing data automata. The small implementation suggests a design pattern. Data automata allow transition conditions to refer to states other than the source state, and allow target states of transitions to be inlined, offering a temporal-logic-flavored notation. Embedding a logic in a high-level language like Scala additionally allows monitors to be programmed using all of Scala's language constructs, offering the full flexibility of a programming language. The framework is demonstrated on an XML processing scenario previously addressed in related work.
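The flavor of a data-parameterized monitor can be conveyed with a minimal sketch; the property and class below are invented for illustration and written in Python, whereas the paper's DSL is an internal Scala API. The monitored property is "every acquire(lock) is followed by a release(lock) of the same lock before end()", so the monitor's state is parameterized by the lock identifiers the events carry:

```python
# Hypothetical example of a data-parameterized runtime monitor, not the
# paper's Scala data-automata API. State is parameterized by lock ids.
class LockMonitor:
    def __init__(self):
        self.held = set()   # data-parameterized state: currently held locks
        self.ok = True      # verdict so far

    def event(self, name, data=None):
        if name == "acquire":
            self.held.add(data)
        elif name == "release":
            if data not in self.held:
                self.ok = False          # release without matching acquire
            self.held.discard(data)
        elif name == "end":
            if self.held:
                self.ok = False          # some lock was never released

m = LockMonitor()
for e in [("acquire", 1), ("acquire", 2), ("release", 1), ("end", None)]:
    m.event(*e)
assert m.ok is False    # lock 2 was never released
```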
Intelligent rover decision-making in response to exogenous events
NASA Technical Reports Server (NTRS)
Chouinard, C.; Estlin, T.; Gaines, D.; Fisher, F.
2005-01-01
This paper presents an introduction to the CLEAR system, which performs rover command generation and re-planning; the challenges faced in maintaining domain-specific information in an uncertain environment; and the successes demonstrated with several methods of system testing.
Spraker, Matthew B; Fain, Robert; Gopan, Olga; Zeng, Jing; Nyflot, Matthew; Jordan, Loucille; Kane, Gabrielle; Ford, Eric
Incident learning systems (ILSs) are a popular strategy for improving safety in radiation oncology (RO) clinics, but few reports focus on the causes of errors in RO. The goal of this study was to test a causal factor taxonomy developed in 2012 by the American Association of Physicists in Medicine and adopted for use in the Radiation Oncology Incident Learning System (RO-ILS). Three hundred event reports were randomly selected from an institutional ILS database and Safety in Radiation Oncology (SAFRON), an international ILS. The reports were split into 3 groups of 100 events each: low-risk institutional, high-risk institutional, and SAFRON. Three raters retrospectively analyzed each event for contributing factors using the American Association of Physicists in Medicine taxonomy. No event was described by a single causal factor (median, 7). The causal factor taxonomy was found to be applicable to all events, but 4 causal factors were not described in the taxonomy: linear accelerator failure (n = 3), hardware/equipment failure (n = 2), failure to follow through with a quality improvement intervention (n = 1), and misleading workflow documentation (n = 1). The most common causal factor categories contributing to events were similar across all event types. The most common specific causal factor was a "slip causing physical error." Poor human factors engineering was the only causal factor found to contribute more frequently to high-risk than to low-risk institutional events. The taxonomy was found to be applicable to all events and may be useful in root cause analyses and future studies. Communication and human behaviors were the most common error types across all events. Poor human factors engineering specifically contributed to high-risk more than low-risk institutional events, and addressing it may represent a strategy for reducing errors of all types. Copyright © 2017 American Society for Radiation Oncology.
Published by Elsevier Inc. All rights reserved.
2016-06-23
The Food and Drug Administration (FDA) is announcing the availability of its FDA Adverse Event Reporting System (FAERS) Regional Implementation Specifications for the International Conference on Harmonisation (ICH) E2B(R3) Specification. FDA is making this technical specifications document available to assist interested parties in electronically submitting individual case safety reports (ICSRs) (and ICSR attachments) to the Center for Drug Evaluation and Research (CDER) and the Center for Biologics Evaluation and Research (CBER). This document, entitled "FDA Regional Implementation Specifications for ICH E2B(R3) Implementation: Postmarket Submission of Individual Case Safety Reports (ICSRs) for Drugs and Biologics, Excluding Vaccines" supplements the "E2B(R3) Electronic Transmission of Individual Case Safety Reports (ICSRs) Implementation Guide--Data Elements and Message Specification" final guidance for industry and describes FDA's technical approach for receiving ICSRs, for incorporating regionally controlled terminology, and for adding region-specific data elements when reporting to FAERS.
Emergence of Coding and its Specificity as a Physico-Informatic Problem
NASA Astrophysics Data System (ADS)
Wills, Peter R.; Nieselt, Kay; McCaskill, John S.
2015-06-01
We explore the origin-of-life consequences of the view that biological systems are demarcated from inanimate matter by their possession of referential information, which is processed computationally to control choices of specific physico-chemical events. Cells are cybernetic: they use genetic information in processes of communication and control, subjecting physical events to a system of integrated governance. The genetic code is the most obvious example of how cells use information computationally, but the historical origin of the usefulness of molecular information is not well understood. Genetic coding made information useful because it imposed a modular metric on the evolutionary search and thereby offered a general solution to the problem of finding catalysts of any specificity. We use the term "quasispecies symmetry breaking" to describe the iterated process of self-organisation whereby the alphabets of distinguishable codons and amino acids increased, step by step.
High-Precision Biological Event Extraction: Effects of System and of Data
Cohen, K. Bretonnel; Verspoor, Karin; Johnson, Helen L.; Roeder, Chris; Ogren, Philip V.; Baumgartner, William A.; White, Elizabeth; Tipney, Hannah; Hunter, Lawrence
2013-01-01
We approached the problems of event detection, argument identification, and negation and speculation detection in the BioNLP’09 information extraction challenge through concept recognition and analysis. Our methodology involved using the OpenDMAP semantic parser with manually written rules. The original OpenDMAP system was updated for this challenge with a broad ontology defined for the events of interest, new linguistic patterns for those events, and specialized coordination handling. We achieved state-of-the-art precision for two of the three tasks, scoring the highest of 24 teams at precision of 71.81 on Task 1 and the highest of 6 teams at precision of 70.97 on Task 2. We provide a detailed analysis of the training data and show that a number of trigger words were ambiguous as to event type, even when their arguments are constrained by semantic class. The data is also shown to have a number of missing annotations. Analysis of a sampling of the comparatively small number of false positives returned by our system shows that major causes of this type of error were failing to recognize second themes in two-theme events, failing to recognize events when they were the arguments to other events, failure to recognize nontheme arguments, and sentence segmentation errors. We show that specifically handling coordination had a small but important impact on the overall performance of the system. The OpenDMAP system and the rule set are available at http://bionlp.sourceforge.net. PMID:25937701
2012-01-01
Background In recent years, biological event extraction has emerged as a key natural language processing task, aiming to address the information overload problem in accessing the molecular biology literature. The BioNLP shared task competitions have contributed to this recent interest considerably. The first competition (BioNLP'09) focused on extracting biological events from Medline abstracts from a narrow domain, while the theme of the latest competition (BioNLP-ST'11) was generalization, and a wider range of text types, event types, and subject domains were considered. We view event extraction as a building block in larger discourse interpretation and propose a two-phase, linguistically-grounded, rule-based methodology. In the first phase, a general, underspecified semantic interpretation is composed from syntactic dependency relations in a bottom-up manner. The notion of embedding underpins this phase and it is informed by a trigger dictionary and argument identification rules. Coreference resolution is also performed at this step, allowing extraction of inter-sentential relations. The second phase is concerned with constraining the resulting semantic interpretation by shared task specifications. We evaluated our general methodology on core biological event extraction and speculation/negation tasks in three main tracks of BioNLP-ST'11 (GENIA, EPI, and ID). Results We achieved competitive results in the GENIA and ID tracks, while our results in the EPI track leave room for improvement. One notable feature of our system is that its performance across abstracts and article bodies is stable. Coreference resolution results in a minor improvement in system performance. Due to our interest in discourse-level elements, such as speculation/negation and coreference, we provide a more detailed analysis of our system performance in these subtasks.
Conclusions The results demonstrate the viability of a robust, linguistically-oriented methodology, which clearly distinguishes general semantic interpretation from shared task specific aspects, for biological event extraction. Our error analysis pinpoints some shortcomings, which we plan to address in future work within our incremental system development methodology. PMID:22759461
Time Study of Harvesting Equipment Using GPS-Derived Positional Data
Tim McDonald
1999-01-01
The objectives of this study were to develop and test a data analysis system for calculating machine productivity from GPS-derived positional information alone. A technique was used in which positions were 'filtered' initially to locate specific events that were independent of what actually traveled the path; these events were then combined using user-specified rules...
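The two-step approach described above, filtering raw fixes into primitive events and then combining them with rules, can be sketched minimally; the function name, thresholds, and data shape below are assumptions for illustration, not taken from the study:

```python
# Illustrative first step: turn raw GPS fixes into primitive "stop" events
# (a machine stationary for at least min_duration seconds). A second step
# would combine such events via user-specified rules (e.g. "stop near a
# landing" => unload event).
def find_stops(fixes, max_speed=0.5, min_duration=30):
    """fixes: list of (t_seconds, speed_m_s) tuples, time-ordered.
    Returns a list of (start, end) stop intervals."""
    stops, start = [], None
    for t, v in fixes:
        if v <= max_speed:
            start = t if start is None else start
        else:
            if start is not None and t - start >= min_duration:
                stops.append((start, t))
            start = None
    if start is not None and fixes[-1][0] - start >= min_duration:
        stops.append((start, fixes[-1][0]))
    return stops

fixes = [(0, 3.0), (10, 0.2), (20, 0.1), (60, 0.3), (70, 4.0), (80, 3.5)]
assert find_stops(fixes) == [(10, 70)]   # one stop, 60 s long
```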
Real-time optimizations for integrated smart network camera
NASA Astrophysics Data System (ADS)
Desurmont, Xavier; Lienard, Bruno; Meessen, Jerome; Delaigle, Jean-Francois
2005-02-01
We present an integrated real-time smart network camera. This system is composed of an image sensor, an embedded PC-based electronic card for image processing, and network capabilities. The application detects events of interest in visual scenes, highlights alarms and computes statistics. The system also produces meta-data information that can be shared among other cameras in a network. We describe the requirements of such a system and then show how its design is optimized to process and compress video in real time. Indeed, typical video-surveillance algorithms such as background differencing, tracking and event detection must be highly optimized and simplified to run on this hardware. To achieve a good match between hardware and software in this lightweight embedded system, the software management is written on top of the Java-based middleware specification established by the OSGi alliance. We can easily integrate software and hardware in complex environments thanks to the Real-Time Specification for Java for the virtual machine and several network- and service-oriented Java specifications (such as RMI and Jini). Finally, we report some outcomes and typical case studies for such a camera, such as counter-flow detection.
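Background differencing, the first algorithm named above, can be illustrated with a minimal running-average sketch; the pixel values, threshold, and update rate below are invented for illustration, and this is not the camera's actual pipeline (which would operate on full frames in optimized native code):

```python
# Illustrative background-differencing sketch on a row of grayscale pixels:
# a pixel is foreground if it deviates from a slowly updated background
# model by more than a threshold.
def update(background, frame, alpha=0.05):
    """Exponential running average: background adapts slowly to the scene."""
    return [b * (1 - alpha) + f * alpha for b, f in zip(background, frame)]

def foreground_mask(background, frame, threshold=20):
    """True where the current frame differs sharply from the background."""
    return [abs(f - b) > threshold for b, f in zip(background, frame)]

bg = [100.0, 100.0, 100.0]
frame = [102.0, 180.0, 99.0]               # middle pixel changed sharply
assert foreground_mask(bg, frame) == [False, True, False]
bg = update(bg, frame)                     # background drifts toward frame
```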
Scholze, Stefan; Schiefer, Stefan; Partzsch, Johannes; Hartmann, Stephan; Mayr, Christian Georg; Höppner, Sebastian; Eisenreich, Holger; Henker, Stephan; Vogginger, Bernhard; Schüffny, Rene
2011-01-01
State-of-the-art large-scale neuromorphic systems require sophisticated spike event communication between units of the neural network. We present a high-speed communication infrastructure for a wafer-scale neuromorphic system, based on application-specific neuromorphic communication ICs in a field-programmable gate array (FPGA)-maintained environment. The ICs implement configurable axonal delays, as required for certain types of dynamic processing or for emulating spike-based learning among distant cortical areas. Measurements are presented which show the efficacy of these delays in influencing the behavior of neuromorphic benchmarks. The specialized, dedicated address-event-representation communication in most current systems requires separate, low-bandwidth configuration channels. In contrast, the configuration of the wafer-scale neuromorphic system is also handled by the digital packet-based pulse channel, which transmits configuration data at the full bandwidth otherwise used for pulse transmission. The overall so-called pulse communication subgroup (ICs and FPGA) delivers a 25-50 times higher event transmission rate than other current neuromorphic communication infrastructures. PMID:22016720
Takla, Anja; Velasco, Edward; Benzler, Justus
2012-07-31
Mass gatherings require a decision from public health authorities on how to monitor infectious diseases during the event. The appropriate level of enhanced surveillance depends on parameters like the scale of the event (duration, spatial distribution, season), participants' origin, amount of public attention, and baseline disease activity in the host country. For the FIFA Men's World Cup 2006, Germany implemented enhanced surveillance. As the scale of the FIFA Women's World Cup (June 26 - July 17, 2011) was estimated to be substantially smaller in size, visitors and duration, it was not feasible to simply adopt the previously implemented measures. Our aim was therefore to develop a strategy to tailor event-specific enhanced surveillance for this smaller-scale mass gathering. Based on the enhanced surveillance measures during the Men's Cup, we conducted a needs assessment with the district health authorities in the 9 host cities in March 2011. Specific measures with a majority consent were implemented. After the event, we surveyed the 9 district and their corresponding 7 state health authorities to evaluate the implemented measures. All 9 district health authorities participated in the pre-event needs assessment. The majority of sites consented to moving from weekly to daily (Monday-Friday) notification reporting of routine infectious diseases, receiving regular feedback on those notification reports, and receiving summaries of national/international World Cup-relevant epidemiological incidents, e.g. outbreaks in countries of participating teams. In addition, we decided to implement twice-weekly reports of "unusual events" at district and state level. This enhanced system commenced on the first day of the tournament and continued until one day after it ended. No World Cup-related infectious disease outbreaks were reported during this time period. Eight of 9 district and 6 of 8 state health authorities participated in the final evaluation.
The majority perceived the implemented measures as adequate. Our approach to tailoring event-specific enhanced surveillance worked well. Involving the participating stakeholders early in the planning phase secured ownership of, and guaranteed support for, the chosen strategy. The enhanced surveillance for this event amounted to low-level surveillance, but we included mechanisms for rapid upscaling had the situation required it.
DOT National Transportation Integrated Search
1982-07-01
In order to examine specific automated guideway transit (AGT) developments and concepts, UMTA undertook a program of studies and technology investigations called Automated Guideway Transit Technology (AGTT) Program. The objectives of one segment of t...
DOT National Transportation Integrated Search
1981-07-01
The Detailed Station Model (DSM) is a discrete event model representing the interrelated queueing processes associated with vehicle and passenger activities in an AGT station. The DSM will provide operational and performance measures of alternative s...
Comprehensive, Multi-Source Cyber-Security Events Data Set
Kent, Alexander D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-05-21
This data set represents 58 consecutive days of de-identified event data collected from five sources within Los Alamos National Laboratory's corporate, internal computer network. The data sources include Windows-based authentication events from both individual computers and centralized Active Directory domain controller servers; process start and stop events from individual Windows computers; Domain Name Service (DNS) lookups as collected on internal DNS servers; network flow data as collected at several key router locations; and a set of well-defined red-teaming events that represent bad behavior within the 58 days. In total, the data set is approximately 12 gigabytes compressed across the five data elements and comprises 1,648,275,307 events for 12,425 users, 17,684 computers, and 62,974 processes. Well-known system-related users (SYSTEM, Local Service) were not de-identified, though well-known administrator accounts were. In the network flow data, well-known ports (e.g., 80, 443) were not de-identified. All other users, computers, processes, ports, times, and other details were de-identified as a unified set across all the data elements (e.g., U1 is the same U1 in all of the data). The specific timeframe used is not disclosed for security purposes. In addition, no data that allows association outside of LANL's network is included. All data starts at a time epoch of 1, with a time resolution of 1 second. In the authentication data, failed authentication events are included only for users that had a successful authentication event somewhere within the data set.
Deep brain optical measurements of cell type-specific neural activity in behaving mice.
Cui, Guohong; Jun, Sang Beom; Jin, Xin; Luo, Guoxiang; Pham, Michael D; Lovinger, David M; Vogel, Steven S; Costa, Rui M
2014-01-01
Recent advances in genetically encoded fluorescent sensors enable the monitoring of cellular events from genetically defined groups of neurons in vivo. In this protocol, we describe how to use a time-correlated single-photon counting (TCSPC)-based fiber optics system to measure the intensity, emission spectra and lifetime of fluorescent biosensors expressed in deep brain structures in freely moving mice. When combined with Cre-dependent selective expression of genetically encoded Ca(2+) indicators (GECIs), this system can be used to measure the average neural activity from a specific population of cells in mice performing complex behavioral tasks. As an example, we used viral expression of GCaMPs in striatal projection neurons (SPNs) and recorded the fluorescence changes associated with calcium spikes from mice performing a lever-pressing operant task. The whole procedure, consisting of virus injection, behavior training and optical recording, takes 3-4 weeks to complete. With minor adaptations, this protocol can also be applied to recording cellular events from other cell types in deep brain regions, such as dopaminergic neurons in the ventral tegmental area. The simultaneously recorded fluorescence signals and behavior events can be used to explore the relationship between the neural activity of specific brain circuits and behavior.
Active and passive surveillance of enoxaparin generics: a case study relevant to biosimilars.
Grampp, Gustavo; Bonafede, Machaon; Felix, Thomas; Li, Edward; Malecki, Michael; Sprafka, J Michael
2015-03-01
This retrospective analysis assessed the capability of active and passive safety surveillance systems to track product-specific safety events in the USA for branded and generic enoxaparin, a complex injectable subject to immune-related and other adverse events (AEs). Analysis of heparin-induced thrombocytopenia (HIT) incidence was performed on benefit claims for commercial and Medicare supplemental-insured individuals newly treated with enoxaparin under pharmacy benefit (1 January 2009 - 30 June 2012). Additionally, spontaneous reports from the FDA AE Reporting System were reviewed to identify the incidence and attribution of enoxaparin-related reports to specific manufacturers. Specific, dispensed products were identifiable from National Drug Codes only in pharmacy-benefit databases, permitting sensitive comparison of HIT incidence in nearly a third of patients treated with brand or generic enoxaparin. After the originator medicine's loss of exclusivity, only 5% of spontaneous reports were processed by generic manufacturers; reports attributable to specific generics were approximately ninefold lower than expected based on market share. Claims data were useful for active surveillance of enoxaparin generics dispensed under pharmacy benefits but not for products administered under medical benefits. These findings suggest that the current spontaneous reporting system will not distinguish product-specific safety signals for products distributed by multiple manufacturers, including biosimilars.
Botsis, T; Woo, E J; Ball, R
2013-01-01
We previously demonstrated that a general-purpose text mining system, the Vaccine adverse event Text Mining (VaeTM) system, could be used to automatically classify reports of anaphylaxis for post-marketing safety surveillance of vaccines. Our objective here was to evaluate the ability of VaeTM to classify reports to the Vaccine Adverse Event Reporting System (VAERS) of possible Guillain-Barré Syndrome (GBS). We used VaeTM to extract the key diagnostic features from the text of reports in VAERS. Then, we applied the Brighton Collaboration (BC) case definition for GBS, and an information retrieval strategy (i.e. the vector space model), to quantify the specific information included in the key features extracted by VaeTM and compared it with the encoded information already stored in VAERS as Medical Dictionary for Regulatory Activities (MedDRA) Preferred Terms (PTs). We also evaluated the contribution of the primary (diagnosis and cause of death) and secondary (second-level diagnosis and symptoms) diagnostic VaeTM-based features to the total VaeTM-based information. MedDRA captured more information and better supported the classification of reports for GBS than VaeTM (AUC: 0.904 vs. 0.777); the lower performance of VaeTM is likely due to its failure to extract specific laboratory results that are included in the BC criteria for GBS. On the other hand, the VaeTM-based classification exhibited greater specificity than the MedDRA-based approach (94.96% vs. 87.65%). Most of the VaeTM-based information was contained in the secondary diagnostic features. For GBS, clinical signs and symptoms alone are not sufficient to match MedDRA coding for purposes of case classification, but they are preferred if specificity is the priority.
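The vector space model step mentioned above can be illustrated with a small sketch; the terms and reports below are invented for illustration, not the actual BC/GBS features. A report and a case definition are represented as term-count vectors and scored by cosine similarity:

```python
# Illustrative vector-space-model sketch: represent texts as term-count
# vectors and compare them by cosine similarity. Terms are hypothetical.
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two term-count vectors (Counters)."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

definition = Counter("weakness areflexia ascending paralysis".split())
report_a = Counter("progressive ascending weakness with areflexia".split())
report_b = Counter("fever and rash after vaccination".split())

# The report sharing terms with the case definition scores higher.
assert cosine(definition, report_a) > cosine(definition, report_b)
```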
Processor design optimization methodology for synthetic vision systems
NASA Astrophysics Data System (ADS)
Wren, Bill; Tarleton, Norman G.; Symosek, Peter F.
1997-06-01
Architecture optimization requires numerous inputs, from hardware to software specifications. The task of varying these input parameters to obtain a system architecture that is optimal with regard to cost, specified performance, and method of upgrade considerably increases development cost, owing to the infinitude of events, most of which cannot even be defined by any simple enumeration or set of inequalities. We address the use of a PC-based tool employing genetic algorithms to optimize the architecture for an avionics synthetic vision system, specifically a passive millimeter-wave system implementation.
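The genetic-algorithm search can be sketched in miniature; the genome encoding, cost/performance model, and parameters below are invented for illustration and bear no relation to the avionics tool's actual models:

```python
# Toy genetic-algorithm sketch of an architecture search. Each genome
# encodes two hypothetical architecture choices; fitness trades a made-up
# performance score against a made-up cost.
import random

def fitness(genome):
    n_proc, cache_i = genome            # (processor count, cache-size index)
    perf = n_proc * 10 + cache_i * 3    # invented performance model
    cost = n_proc * 7 + cache_i * 2     # invented cost model
    return perf - cost

def evolve(pop_size=20, generations=50, seed=0):
    rng = random.Random(seed)
    pop = [(rng.randint(1, 8), rng.randint(0, 4)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # selection: keep top half
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            child = (a[0], b[1])               # one-point crossover
            if rng.random() < 0.2:             # mutation on first gene
                child = (rng.randint(1, 8), child[1])
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
assert 1 <= best[0] <= 8 and 0 <= best[1] <= 4
assert fitness(best) >= 3   # any valid genome scores at least 3 here
```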
Isoform Specificity of Protein Kinase Cs in Synaptic Plasticity
ERIC Educational Resources Information Center
Sossin, Wayne S.
2007-01-01
Protein kinase Cs (PKCs) are implicated in many forms of synaptic plasticity. However, the specific isoform(s) of PKC that underlie(s) these events are often not known. We have used "Aplysia" as a model system in order to investigate the isoform specificity of PKC actions due to the presence of fewer isoforms and a large number of documented…
DOT National Transportation Integrated Search
1982-06-01
In order to examine specific Automated Guideway Transit (AGT) developments and concepts, and to build a better knowledge base for future decision-making, the Urban Mass Transportation Administration (UMTA) undertook a new program of studies and techn...
IGES, a key interface specification for CAD/CAM systems integration
NASA Technical Reports Server (NTRS)
Smith, B. M.; Wellington, J.
1984-01-01
The Initial Graphics Exchange Specification (IGES) program has focused the efforts of 52 companies on the development and documentation of a means of graphics data base exchange among present day CAD/CAM systems. The project's brief history has seen the evolution of the Specification into preliminary industrial usage marked by public demonstrations of vendor capability, mandatory requests in procurement actions, and a formalization into an American National Standard in September 1981. Recent events have demonstrated intersystem data exchange among seven vendor systems with a total of 30 vendors committing to offer IGES capability. A full range of documentation supports the IGES project and the recently approved IGES Version 2.0 of the Specification.
Pedologic and geomorphic impacts of a tornado blowdown event in a mixed pine-hardwood forest
Jonathan D. Phillips; Daniel A. Marion; Alice V. Turkington
2008-01-01
Biomechanical effects of trees on soils and surface processes may be extensive in forest environments. Two blowdown sites caused by a November 2005 tornado in the Ouachita National Forest, Arkansas allowed a case study examination of bioturbation associated with a specific forest blowdown event, as well as detailed examination of relationships between tree root systems...
Dokas, Ioannis M; Panagiotakopoulos, Demetrios C
2006-08-01
The available expertise on managing and operating solid waste management (SWM) facilities varies among countries and among types of facilities. Few experts are willing to record their experience, while few researchers systematically investigate the chains of events that could trigger operational failures in a facility; expertise acquisition and dissemination in SWM is neither popular nor easy, despite the great need for it. This paper presents a knowledge acquisition process aimed at capturing, codifying and expanding reliable expertise and propagating it to non-experts. The knowledge engineer (KE), the person performing the acquisition, must identify the events (or causes) that could trigger a failure, determine whether a specific event could trigger more than one failure, and establish how various events are related among themselves and how they are linked to specific operational problems. The proposed process, which utilizes logic diagrams (fault trees) widely used in system safety and reliability analyses, was used for the analysis of 24 common landfill operational problems. The acquired knowledge led to the development of a web-based expert system (Landfill Operation Management Advisor, http://loma.civil.duth.gr), which estimates the likelihood of operational problems, provides advice and suggests solutions.
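The fault-tree logic underlying such an expert system can be sketched as follows. The event names and probabilities are hypothetical, invented only to show how AND/OR gates combine independent basic events into a top-event estimate.

```python
from math import prod

def gate_or(probs):
    """P(at least one of several independent basic events occurs)."""
    return 1 - prod(1 - p for p in probs)

def gate_and(probs):
    """P(all independent basic events occur)."""
    return prod(probs)

# Hypothetical landfill fault tree: leachate overflow occurs if the pump
# fails AND (heavy rain OR drainage blockage) occurs.
p_pump_fail, p_rain, p_blockage = 0.05, 0.30, 0.10
p_trigger = gate_or([p_rain, p_blockage])   # 1 - 0.70 * 0.90 = 0.37
p_top = gate_and([p_pump_fail, p_trigger])  # 0.05 * 0.37 = 0.0185
```

A full tree would nest such gates to the depth needed to reach the 24 documented operational problems, with probabilities elicited from the experts.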
Assessing the Value of Information for Identifying Optimal Floodplain Management Portfolios
NASA Astrophysics Data System (ADS)
Read, L.; Bates, M.; Hui, R.; Lund, J. R.
2014-12-01
Floodplain management is a complex portfolio problem that can be analyzed from an integrated perspective incorporating traditional structural and nonstructural options. One method to identify effective strategies for preparing, responding to, and recovering from floods is to optimize for a portfolio of temporary (emergency) and permanent floodplain management options. A risk-based optimization approach to this problem assigns probabilities to specific flood events and calculates the associated expected damages. This approach is currently limited by: (1) the assumption of perfect flood forecast information, i.e. the temporary management activities implemented according to a forecast may differ from those the actual flood event warrants, and (2) the inability to assess system resilience across a range of possible future events (risk-centric approach). Resilience is defined here as the ability of a system to absorb and recover from a severe disturbance or extreme event. In our analysis, resilience is a system property that requires integration of physical, social, and information domains. This work employs a 3-stage linear program to identify the optimal mix of floodplain management options using conditional probabilities to represent perfect and imperfect flood stages (forecast vs. actual events). We assess the value of information in terms of minimizing damage costs for two theoretical cases: urban and rural systems. We use portfolio analysis to explore how the set of optimal management options differs depending on whether the goal is for the system to be risk-averse to a specified event or resilient over a range of events.
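The value-of-information calculation can be illustrated with a deliberately tiny example: one temporary option and two flood events, rather than the paper's full 3-stage linear program. All numbers are hypothetical; the point is the structure, comparing expected cost with and without forecast skill.

```python
flood_prob = {"minor": 0.8, "major": 0.2}   # probabilities of actual events
damage = {"minor": 5.0, "major": 100.0}     # damages with no temporary action
deploy_cost, deploy_benefit = 6.0, 8.0      # cost / damage reduction of sandbagging

def expected_cost(rule):
    """Expected damage plus deployment cost under a deploy rule.
    The rule maps a forecast to a decision; with a perfect forecast the
    forecast equals the actual event."""
    total = 0.0
    for event, p in flood_prob.items():
        if rule(event):
            total += p * (max(damage[event] - deploy_benefit, 0.0) + deploy_cost)
        else:
            total += p * damage[event]
    return total

with_forecast = expected_cost(lambda forecast: forecast == "major")
# without a forecast, the best fixed policy is always-deploy or never-deploy
no_forecast = min(expected_cost(lambda _: True), expected_cost(lambda _: False))
value_of_information = no_forecast - with_forecast
```

With imperfect forecasts, the deploy rule would instead be evaluated against conditional probabilities P(actual event | forecast), which is the role of the conditional probabilities in the 3-stage formulation.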
Multilingual event extraction for epidemic detection.
Lejeune, Gaël; Brixtel, Romain; Doucet, Antoine; Lucas, Nadine
2015-10-01
This paper presents a multilingual news surveillance system applied to tele-epidemiology. It has been shown that multilingual approaches improve timeliness in detection of epidemic events across the globe, eliminating the wait for local news to be translated into major languages. We present here a system to extract epidemic events in potentially any language, provided a Wikipedia seed for common disease names exists. The Daniel system presented herein relies on properties that are common to news writing (the journalistic genre), the most useful being repetition and saliency. Wikipedia is used to screen common disease names to be matched with repeated character strings. Language variations, such as declensions, are handled by processing text at the character level, rather than at the word level. This additionally makes it possible to handle various writing systems in a similar fashion. As no multilingual ground truth existed to evaluate the Daniel system, we built a multilingual corpus from the Web, and collected annotations from native speakers of Chinese, English, Greek, Polish and Russian, with no connection or interest in the Daniel system. This data set is available online freely, and can be used for the evaluation of other event extraction systems. Experiments for 5 languages out of 17 tested are detailed in this paper: Chinese, English, Greek, Polish and Russian. The Daniel system achieves an average F-measure of 82% in these 5 languages. It reaches 87% on BEcorpus, the state-of-the-art corpus in English, slightly below top-performing systems, which are tailored with numerous language-specific resources. The consistent performance of Daniel on multiple languages is an important contribution to the reactivity and the coverage of epidemiological event detection systems. Most event extraction systems rely on extensive resources that are language-specific. 
While their sophistication yields excellent results (over 90% precision and recall), it restricts their coverage in terms of languages and geographic areas. In contrast, in order to detect epidemic events in any language, the Daniel system only requires a list of a few hundred disease names and locations, which can actually be acquired automatically. The system can perform consistently well on any language, with precision and recall around 82% on average, according to this paper's evaluation. Daniel's character-based approach is especially interesting for morphologically rich and low-resourced languages. Because it requires few resources and relies on state-of-the-art string-matching algorithms, Daniel can process thousands of documents per minute on a simple laptop. In the context of epidemic surveillance, reactivity and geographic coverage are of primary importance, since no one knows where the next event will strike, and therefore in what vernacular language it will first be reported. By being able to process any language, the Daniel system offers unique coverage for poorly endowed languages, and can complement state-of-the-art techniques for major languages. Copyright © 2015 Elsevier B.V. All rights reserved.
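The character-level repetition idea can be sketched as follows. This is a drastic simplification of Daniel's approach: here a disease is flagged merely when one of its character n-grams recurs in the text, without the saliency-zone analysis the system actually performs, and the disease list is a hypothetical Wikipedia-seeded one.

```python
def char_ngrams(text, n):
    """All character substrings of length n."""
    return {text[i:i + n] for i in range(len(text) - n + 1)}

def detect(text, diseases, n=4):
    """Flag diseases whose character n-grams are repeated in the text.
    Matching at the character level copes with declensions and avoids
    language-specific tokenization."""
    hits = []
    low = text.lower()
    for name in diseases:
        grams = char_ngrams(name.lower(), n)
        if any(low.count(g) >= 2 for g in grams):
            hits.append(name)
    return hits

diseases = ["cholera", "influenza"]   # hypothetical Wikipedia-seeded list
text = ("Cholera outbreak declared in the region; cholera cases "
        "are rising according to health officials.")
reported = detect(text, diseases)
```

Because only substring counting is involved, the same code runs unchanged on any writing system, which is the property the paper exploits.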
2012-01-01
Background Mass gatherings require a decision from public health authorities on how to monitor infectious diseases during the event. The appropriate level of enhanced surveillance depends on parameters like the scale of the event (duration, spatial distribution, season), participants’ origin, amount of public attention, and baseline disease activity in the host country. For the FIFA Men’s World Cup 2006, Germany implemented enhanced surveillance. As the scale of the FIFA Women’s World Cup (June 26 – July 17, 2011) was estimated to be substantially smaller in size, visitors and duration, it was not feasible to simply adopt the previously implemented measures. Our aim was therefore to develop a strategy to tailor an event-specific enhanced surveillance for this smaller-scale mass gathering. Methods Based on the enhanced surveillance measures during the Men’s Cup, we conducted a needs assessment with the district health authorities in the 9 host cities in March 2011. Specific measures with a majority consent were implemented. After the event, we surveyed the 9 district and their corresponding 7 state health authorities to evaluate the implemented measures. Results All 9 district health authorities participated in the pre-event needs assessment. The majority of sites consented to moving from weekly to daily (Monday-Friday) notification reporting of routine infectious diseases, receiving regular feedback on those notification reports and summaries of national/international World Cup-relevant epidemiological incidents, e.g. outbreaks in countries of participating teams. In addition, we decided to implement twice-weekly reports of “unusual events” at district and state level. This enhanced system would commence on the first day and continue until one day after the tournament. No World Cup-related infectious disease outbreaks were reported during this time period. Eight of 9 district and 6 of 8 state health authorities participated in the final evaluation. 
The majority perceived the implemented measures as adequate. Conclusions Our approach to tailor an event-specific enhanced surveillance concept worked well. Involvement of the participating stakeholders early on in the planning phase secured ownership of and guaranteed support for the chosen strategy. The enhanced surveillance for this event amounted to low-level surveillance. However, we included mechanisms for rapid upscaling had the situation required it. PMID:22849632
Shao, Ning; Jiang, Shi-Meng; Zhang, Miao; Wang, Jing; Guo, Shu-Juan; Li, Yang; Jiang, He-Wei; Liu, Cheng-Xi; Zhang, Da-Bing; Yang, Li-Tao; Tao, Sheng-Ce
2014-01-21
The monitoring of genetically modified organisms (GMOs) is a primary step of GMO regulation. However, there is presently a lack of effective and high-throughput methodologies for specifically and sensitively monitoring most of the commercialized GMOs. Herein, we developed a multiplex amplification on a chip with readout on an oligo microarray (MACRO) system specifically for convenient GMO monitoring. This system is composed of a microchip for multiplex amplification and an oligo microarray for the readout of multiple amplicons, containing a total of 91 targets (18 universal elements, 20 exogenous genes, 45 events, and 8 endogenous reference genes) that covers 97.1% of all GM events that have been commercialized up to 2012. We demonstrate that the specificity of MACRO is ~100%, with a limit of detection (LOD) that is suitable for real-world applications. Moreover, the results obtained with MACRO for simulated complex samples and blind samples were 100% consistent with expectations and with the results of independently performed real-time PCRs, respectively. Thus, we believe MACRO is the first system that can be applied for effectively monitoring the majority of the commercialized GMOs in a single test.
Grammatical Aspect and Mental Simulation
ERIC Educational Resources Information Center
Bergen, Benjamin; Wheeler, Kathryn
2010-01-01
When processing sentences about perceptible scenes and performable actions, language understanders activate perceptual and motor systems to perform mental simulations of those events. But little is known about exactly what linguistic elements activate modality-specific systems during language processing. While it is known that content words, like…
REAL-TIME CONTROL OF COMBINED SEWER NETWORKS
Real-time control (RTC) is a custom-designed management program for a specific urban sewerage system during a wet-weather event. The function of RTC is to assure efficient operation of the sewerage system and maximum utilization of existing storage capacity, either to fully conta...
Using the Statecharts paradigm for simulation of patient flow in surgical care.
Sobolev, Boris; Harel, David; Vasilakis, Christos; Levy, Adrian
2008-03-01
Computer simulation of patient flow has been used extensively to assess the impacts of changes in the management of surgical care. However, little research is available on the utility of existing modeling techniques. The purpose of this paper is to examine the capacity of Statecharts, a system of graphical specification, for constructing a discrete-event simulation model of the perioperative process. The Statecharts specification paradigm was originally developed for representing reactive systems by extending the formalism of finite-state machines through notions of hierarchy, parallelism, and event broadcasting. Hierarchy permits subordination between states so that one state may contain other states. Parallelism permits more than one state to be active at any given time. Broadcasting of events allows one state to detect changes in another state. In the context of the perioperative process, hierarchy provides the means to describe steps within activities and to cluster related activities, parallelism provides the means to specify concurrent activities, and event broadcasting provides the means to trigger a series of actions in one activity according to transitions that occur in another activity. Combined with hierarchy and parallelism, event broadcasting offers a convenient way to describe the interaction of concurrent activities. We applied the Statecharts formalism to describe the progress of individual patients through surgical care as a series of asynchronous updates in patient records generated in reaction to events produced by parallel finite-state machines representing concurrent clinical and managerial activities. We conclude that Statecharts successfully capture the behavioral aspects of surgical care delivery by specifying permissible chronology of events, conditions, and actions.
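The interplay of parallelism and event broadcasting described above can be sketched with two minimal finite-state machines. This is not the Statecharts formalism itself (hierarchy is omitted), and the perioperative states and events are hypothetical, but it shows how a transition in one concurrent machine triggers a transition in another via a broadcast event.

```python
class Machine:
    """A minimal finite-state machine whose transitions may broadcast events."""
    def __init__(self, name, state, transitions):
        self.name, self.state = name, state
        # transitions: (state, event) -> (next_state, event_to_broadcast or None)
        self.transitions = transitions

    def handle(self, event, bus):
        key = (self.state, event)
        if key in self.transitions:
            self.state, broadcast = self.transitions[key]
            if broadcast:
                bus.append(broadcast)

def run(machines, events):
    """Run parallel machines; broadcast events are seen by all machines."""
    queue = list(events)
    while queue:
        event = queue.pop(0)
        bus = []
        for m in machines:      # parallelism: every machine sees each event
            m.handle(event, bus)
        queue.extend(bus)       # broadcasting: emitted events re-enter the queue

# Hypothetical perioperative fragment: completing surgery (clinical activity)
# broadcasts an event that triggers an update of the patient record
# (managerial activity) in the parallel machine.
clinical = Machine("clinical", "in_surgery",
                   {("in_surgery", "surgery_done"): ("recovery", "update_record")})
records = Machine("records", "open",
                  {("open", "update_record"): ("updated", None)})
run([clinical, records], ["surgery_done"])
```

The patient record here is updated asynchronously, in reaction to an event produced by the clinical machine, mirroring the paper's modeling approach.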
Three decades of disasters: a review of disaster-specific literature from 1977-2009.
Smith, Erin; Wasiak, Jason; Sen, Ayan; Archer, Frank; Burkle, Frederick M
2009-01-01
The potential for disasters exists in all communities. To mitigate the potential catastrophes that confront humanity in the new millennium, an evidence-based approach to disaster management is required urgently. This study moves toward such an evidence-based approach by identifying peer-reviewed publications following a range of disasters and events over the past three decades. Peer-reviewed, event-specific literature was identified using a comprehensive search of the electronically indexed database, MEDLINE (1956-January 2009). An extended comprehensive search was conducted for one event to compare the event-specific literature indexed in MEDLINE to other electronic databases (EMBASE, CINAHL, AMED, CENTRAL, Psych Info, Maternity and Infant Care, EBM Reviews). Following 25 individual disasters or overwhelming crises, a total of 2,098 peer-reviewed, event-specific publications were published in 789 journals (652 publications following disasters/events caused by natural hazards, 966 following human-made/technological disasters/events, and 480 following conflict/complex humanitarian events). The event with the greatest number of peer-reviewed, event-specific publications was the 11 September 2001 terrorist attacks (686 publications). Prehospital and Disaster Medicine published the greatest number of peer-reviewed, event-specific publications (54), followed by Journal of Traumatic Stress (42), Military Medicine (40), and Psychiatric Services (40). The primary topics of event-specific publications were mental health, medical health, and response. When an extended, comprehensive search was conducted for one event, 75% of all peer-reviewed, event-specific publications were indexed in MEDLINE. A broad range of multi-disciplinary journals publish peer-reviewed, event-specific publications. While the majority of peer-reviewed, event-specific literature is indexed in MEDLINE, comprehensive search strategies should include EMBASE to increase yield.
Code of Federal Regulations, 2011 CFR
2011-01-01
... vitro measure of the beryllium antigen-specific, cell-mediated immune response. Beryllium worker means a... particles. Immune response refers to the series of cellular events by which the immune system reacts to...
Implementing a Rule-Based Contract Compliance Checker
NASA Astrophysics Data System (ADS)
Strano, Massimo; Molina-Jimenez, Carlos; Shrivastava, Santosh
The paper describes the design and implementation of an independent, third party contract monitoring service called Contract Compliance Checker (CCC). The CCC is provided with the specification of the contract in force, and is capable of observing and logging the relevant business-to-business (B2B) interaction events, in order to determine whether the actions of the business partners are consistent with the contract. A contract specification language called EROP (for Events, Rights, Obligations and Prohibitions), based on business rules, has been developed for the CCC; it provides constructs to specify which rights, obligations and prohibitions become active and inactive after the occurrence of events related to the execution of business operations. The system has been designed to work with B2B industry standards such as ebXML and RosettaNet.
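The rule-driven activation and deactivation of rights, obligations and prohibitions can be sketched as follows. This is not the EROP syntax itself; the event names and rule effects are hypothetical, invented only to show the pattern of a monitor updating normative state in response to observed B2B events.

```python
# Normative state of one business partner, updated as events are observed.
state = {"rights": set(), "obligations": set(), "prohibitions": set()}

# Hypothetical rules: each observed event adds or removes items from the
# rights / obligations / prohibitions sets.
rules = {
    "purchase_order_received": [("obligations", "add", "confirm_order")],
    "order_confirmed": [("obligations", "remove", "confirm_order"),
                        ("rights", "add", "invoice_buyer")],
}

def observe(event, log):
    """Log the B2B interaction event and apply the matching rule effects."""
    log.append(event)
    for bucket, action, item in rules.get(event, []):
        (state[bucket].add if action == "add" else state[bucket].discard)(item)

log = []
observe("purchase_order_received", log)
observe("order_confirmed", log)
# compliance check: the confirm-order obligation has been discharged
compliant = "confirm_order" not in state["obligations"]
```

A real CCC would additionally attach deadlines to obligations and flag a violation when an obligation remains active past its deadline.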
Quantified Event Automata: Towards Expressive and Efficient Runtime Monitors
NASA Technical Reports Server (NTRS)
Barringer, Howard; Falcone, Ylies; Havelund, Klaus; Reger, Giles; Rydeheard, David
2012-01-01
Runtime verification is the process of checking a property on a trace of events produced by the execution of a computational system. Runtime verification techniques have recently focused on parametric specifications where events take data values as parameters. These techniques exist on a spectrum inhabited by both efficient and expressive techniques. These characteristics are usually shown to be conflicting: in state-of-the-art solutions, efficiency is obtained at the cost of loss of expressiveness and vice versa. To seek a solution to this conflict, we explore a new point on the spectrum by defining an alternative runtime verification approach. We introduce a new formalism for concisely capturing expressive specifications with parameters. Our technique is more expressive than the currently most efficient techniques while at the same time allowing for optimizations.
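A parametric property of the kind these techniques monitor can be sketched in a few lines. This is a plain hand-written monitor, not the quantified event automata formalism of the paper; the property ("every resource that is opened is eventually closed") and the event names are illustrative only.

```python
def monitor(trace):
    """Check a parametric property over a trace: for every parameter value f,
    an ("open", f) event must eventually be followed by ("close", f).
    Conceptually the property is instantiated once per parameter binding.
    Returns the set of bindings that violate the property."""
    open_resources = set()
    for name, arg in trace:          # events carry data values as parameters
        if name == "open":
            open_resources.add(arg)
        elif name == "close":
            open_resources.discard(arg)
    return open_resources            # non-empty means some binding violated

trace = [("open", "a.txt"), ("open", "b.txt"), ("close", "a.txt")]
violations = monitor(trace)
```

The efficiency/expressiveness tension arises because a general formalism must track one monitor state per parameter binding; indexing those bindings cheaply is what the optimized techniques compete on.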
Sengupta, Partha Pratim; Gloria, Jared N; Amato, Dahlia N; Amato, Douglas V; Patton, Derek L; Murali, Beddhu; Flynt, Alex S
2015-10-12
Detection of specific RNA or DNA molecules by hybridization to "probe" nucleic acids via complementary base-pairing is a powerful method for analysis of biological systems. Here we describe a strategy for transducing hybridization events through modulating intrinsic properties of the electroconductive polymer polyaniline (PANI). When DNA-based probes electrostatically interact with PANI, its fluorescence properties are increased, a phenomenon that can be enhanced by UV irradiation. Hybridization of target nucleic acids results in dissociation of probes causing PANI fluorescence to return to basal levels. By monitoring restoration of base PANI fluorescence as little as 10⁻¹¹ M (10 pM) of target oligonucleotides could be detected within 15 min of hybridization. Detection of complementary oligos was specific, with introduction of a single mismatch failing to form a target-probe duplex that would dissociate from PANI. Furthermore, this approach is robust and is capable of detecting specific RNAs in extracts from animals. This sensor system improves on previously reported strategies by transducing highly specific probe dissociation events through intrinsic properties of a conducting polymer without the need for additional labels.
Cre recombinase-mediated site-specific recombination between plant chromosomes.
Qin, M; Bayley, C; Stockton, T; Ow, D W
1994-01-01
We report the use of the bacteriophage P1 Cre-lox system for generating conservative site-specific recombination between tobacco chromosomes. Two constructs, one containing a promoterless hygromycin-resistance gene preceded by a lox site (lox-hpt) and the other containing a cauliflower mosaic virus 35S promoter linked to a lox sequence and the cre coding region (35S-lox-cre), were introduced separately into tobacco plants. Crosses between plants harboring either construct produced plants with the two constructs situated on different chromosomes. Plants with recombination events were identified by selecting for hygromycin resistance, a phenotype expressed upon recombination. Molecular analysis showed that these recombination events occurred specifically at the lox sites and resulted in the reciprocal exchange of flanking host DNA. Progenies of these plants showed 67-100% cotransmission of the new transgenes, 35S-lox-hpt and lox-cre, consistent with the preferential cosegregation of translocated chromosomes. These results illustrate that site-specific recombination systems can be useful tools for the large-scale manipulation of eukaryotic chromosomes in vivo. PMID:8127869
NASA Astrophysics Data System (ADS)
Tao, J.; Barros, A. P.
2013-07-01
Debris flows associated with rainstorms are a frequent and devastating hazard in the Southern Appalachians in the United States. Whereas warm season events are clearly associated with heavy rainfall intensity, the same cannot be said for the cold season events. Instead, debris flows are related to large (cumulative) rainfall events independent of season, and thus of hydrometeorological regime. This suggests that the dynamics of subsurface hydrologic processes play an important role as a trigger mechanism, specifically through soil moisture redistribution by interflow. The first objective of this study is to investigate this hypothesis. The second objective is to assess the physical basis for a regional coupled flood prediction and debris flow warning system. For this purpose, uncalibrated model simulations of well-documented debris flows in headwater catchments of the Southern Appalachians using a 3-D surface-groundwater hydrologic model coupled with slope stability models are examined in detail. Specifically, we focus on two vulnerable headwater catchments that experience frequent debris flows, the Big Creek and the Jonathan Creek in the Upper Pigeon River Basin, North Carolina, and three distinct weather systems: an extremely heavy summertime convective storm in 2011; a persistent winter storm lasting several days; and a severe winter storm in 2009. These events were selected due to the optimal availability of rainfall observations, availability of detailed field surveys of the landslides shortly after they occurred, which can be used to evaluate model predictions, and because they are representative of events that cause major economic losses in the region. The model results substantiate that interflow is a useful prognostic of conditions necessary for the initiation of slope instability, and should therefore be considered explicitly in landslide hazard assessments. 
Moreover, the relationships between slope stability and interflow are strongly modulated by the topography and catchment specific geomorphologic features that determine subsurface flow convergence zones. The three case-studies demonstrate the value of coupled prediction of flood response and debris flow initiation potential in the context of developing a regional hazard warning system.
Collapse of Experimental Colloidal Aging using Record Dynamics
NASA Astrophysics Data System (ADS)
Robe, Dominic; Boettcher, Stefan; Sibani, Paolo; Yunker, Peter
The theoretical framework of record dynamics (RD) posits that aging behavior in jammed systems is controlled by short, rare events involving activation of only a few degrees of freedom. RD predicts dynamics in an aging system to progress with the logarithm of t/t_w. This prediction has been verified through new analysis of experimental data on an aging 2D colloidal system. MSD and persistence curves spanning three orders of magnitude in waiting time are collapsed. These predictions have also been found consistent with a number of experiments and simulations, but verification of the specific assumptions that RD makes about the underlying statistics of these rare events has been elusive. Here the observation of individual particles allows for the first time the direct verification of the assumptions about event rates and sizes. This work is supported by NSF Grant DMR-1207431.
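The statistical assumption behind RD can be illustrated with the classic record-counting argument: in a sequence of independent random draws, the expected number of new records up to time t grows like ln(t), so the number of record events between a waiting time t_w and a later time t scales with ln(t/t_w). The sketch below simulates this; it is a toy model of the statistics, not of the colloidal dynamics themselves.

```python
import random

def record_times(n, seed=1):
    """Times at which a new record high occurs in an i.i.d. random sequence.
    The i-th draw is a record with probability 1/i, so the expected number
    of records up to time t grows like ln(t)."""
    rng = random.Random(seed)
    best, times = float("-inf"), []
    for t in range(1, n + 1):
        x = rng.random()
        if x > best:
            best = x
            times.append(t)
    return times

events = record_times(100_000)   # expected count near ln(1e5) + gamma ~ 12
```

Record events therefore become exponentially rarer in absolute time, which is why observables collapse when plotted against log(t/t_w).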
Applying AI tools to operational space environmental analysis
NASA Technical Reports Server (NTRS)
Krajnak, Mike; Jesse, Lisa; Mucks, John
1995-01-01
The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTMs) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTMs using a graphical specification language. The ability to manipulate TTMs in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. 
The prototype system defines events covering reports of natural phenomena such as solar flares, bursts, geomagnetic storms, and five others pertinent to space environmental analysis. With our preliminary event definitions we experimented with TAS's support for temporal pattern analysis using X-ray flare and geomagnetic storm forecasts as case studies. We are currently working on a framework for integrating advanced graphics and space environmental models into this analytical environment.
Akagi, Takashi; Henry, Isabelle M; Morimoto, Takuya; Tao, Ryutaro
2016-06-01
Self-incompatibility (SI) is an important plant reproduction mechanism that facilitates the maintenance of genetic diversity within species. Three plant families, the Solanaceae, Rosaceae and Plantaginaceae, share an S-RNase-based gametophytic SI (GSI) system that involves a single S-RNase as the pistil S determinant and several F-box genes as pollen S determinants that act via non-self-recognition. Previous evidence has suggested a specific self-recognition mechanism in Prunus (Rosaceae), raising questions about the generality of the S-RNase-based GSI system. We investigated the evolution of the pollen S determinant by comparing the sequences of the Prunus S haplotype-specific F-box gene (SFB) with those of its orthologs in other angiosperm genomes. Our results indicate that the Prunus SFB does not cluster with the pollen S of other plants and diverged early after the establishment of the Eudicots. Our results further indicate multiple F-box gene duplication events, specifically in the Rosaceae family, and suggest that the Prunus SFB gene originated in a recent Prunus-specific gene duplication event. Transcriptomic and evolutionary analyses of the Prunus S paralogs are consistent with the establishment of a Prunus-specific SI system, and the possibility of subfunctionalization differentiating the newly generated SFB from the original pollen S determinant. © The Author 2016. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Hydraulics of epiphreatic flow of a karst aquifer
NASA Astrophysics Data System (ADS)
Gabrovšek, Franci; Peric, Borut; Kaufmann, Georg
2018-05-01
The nature of epiphreatic flow remains an important research challenge in karst hydrology. This study focuses on the flood propagation along the epiphreatic system of Reka-Timavo system (Kras/Carso Plateau, Slovenia/Italy). It is based on long-term monitoring of basic physical parameters (pressure/level, temperature, specific electric conductivity) of ground water in six active caves belonging to the flow system. The system vigorously responds to flood events, with stage rising >100 m in some of the caves. Besides presenting the response of the system to flood events of different scales, the work focuses on the interpretation of recorded hydrographs in view of the known distribution and size of conduits and basic hydraulic relations. Furthermore, the hydrographs were used to infer the unknown geometry between the observation points. This way, the main flow restrictors, overflow passages and large epiphreatic storages were identified. The assumptions were tested with a hydraulic model, where the inversion procedure was used for an additional parameter optimisation. Time series of temperature and specific electric conductivity were used to assess the apparent velocities of flow between consecutive points.
ERIC Educational Resources Information Center
Hazy, James K.; Silberstang, Joyce
2009-01-01
One tradition within the complexity paradigm considers organisations as complex adaptive systems in which autonomous individuals interact, often in complex ways with difficult to predict, non-linear outcomes. Building upon this tradition, and more specifically following the complex systems leadership theory approach, we describe the ways in which…
Albattat, Ali; Gruenwald, Benjamin C.; Yucelen, Tansel
2016-01-01
The last decade has witnessed an increased interest in physical systems controlled over wireless networks (networked control systems). These systems allow the computation of control signals via processors that are not attached to the physical systems, and the feedback loops are closed over wireless networks. The contribution of this paper is to design and analyze event-triggered decentralized and distributed adaptive control architectures for uncertain networked large-scale modular systems; that is, systems consisting of physically-interconnected modules controlled over wireless networks. Specifically, the proposed adaptive architectures guarantee overall system stability while reducing wireless network utilization and achieving a given system performance in the presence of system uncertainties that can result from modeling and degraded modes of operation of the modules and their interconnections between each other. In addition to the theoretical findings including rigorous system stability and the boundedness analysis of the closed-loop dynamical system, as well as the characterization of the effect of user-defined event-triggering thresholds and the design parameters of the proposed adaptive architectures on the overall system performance, an illustrative numerical example is further provided to demonstrate the efficacy of the proposed decentralized and distributed control approaches. PMID:27537894
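The event-triggering idea, transmitting over the network only when the state has drifted beyond a user-defined threshold, can be sketched for a scalar plant. This is a minimal illustration, not the paper's adaptive architecture: the plant, gains, and threshold below are all hypothetical.

```python
def simulate(a=-0.5, dt=0.01, steps=2000, threshold=0.05):
    """Scalar plant x' = a*x + u controlled over a network. The sensor
    transmits the state only when it deviates from the last-sent value by
    more than the event-triggering threshold; the controller holds the
    last received value between transmissions."""
    x, x_sent, transmissions = 1.0, 1.0, 0
    for _ in range(steps):
        if abs(x - x_sent) > threshold:   # event-triggering condition
            x_sent = x
            transmissions += 1
        u = -2.0 * x_sent                 # controller uses last received state
        x += dt * (a * x + u)             # forward-Euler plant update
    return x, transmissions

x_final, n_tx = simulate()
```

The trade-off characterized in the paper appears directly here: a larger threshold means fewer transmissions (lower network utilization) but a larger ultimate bound on the state.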
What is pain? A history. The Prothero Lecture.
Bourke, Joanna
2013-12-01
What is pain? This article argues that it is useful to think of pain as a 'kind of event' or a way of being-in-the-world. Pain-events are unstable; they are historically constituted and reconstituted in relation to language systems, social and environmental interactions and bodily comportment. The historical question becomes: how has pain been done and what ideological work do acts of being-in-pain seek to achieve? By what mechanisms do these types of events change? Who decides the content of any particular, historically specific and geographically situated ontology?
Smith, Shannon M; Jones, Judith K; Katz, Nathaniel P; Roland, Carl L; Setnik, Beatrice; Trudeau, Jeremiah J; Wright, Stephen; Burke, Laurie B; Comer, Sandra D; Dart, Richard C; Dionne, Raymond; Haddox, J David; Jaffe, Jerome H; Kopecky, Ernest A; Martell, Bridget A; Montoya, Ivan D; Stanton, Marsha; Wasan, Ajay D; Turk, Dennis C; Dworkin, Robert H
2017-11-01
Accurate assessment of inappropriate medication use events (i.e., misuse, abuse, and related events) occurring in clinical trials is an important component in evaluating a medication's abuse potential. A meeting was convened to review all instruments measuring such events in clinical trials according to previously published standardized terminology and definitions. Only 2 approaches have been reported that are specifically designed to identify and classify misuse, abuse, and related events occurring in clinical trials, rather than to measure an individual's risk of using a medication inappropriately: the Self-Reported Misuse, Abuse, and Diversion (SR-MAD) instrument and the Misuse, Abuse, and Diversion Drug Event Reporting System (MADDERS). The conceptual basis, strengths, and limitations of these methods are discussed. To our knowledge, MADDERS is the only system available to comprehensively evaluate inappropriate medication use events prospectively to determine the underlying intent. MADDERS can also be applied retrospectively to completed trial data. SR-MAD can be used prospectively; additional development may be required to standardize its implementation and fully appraise the intent of inappropriate use events. Additional research is needed to further demonstrate the validity and utility of MADDERS as well as SR-MAD. Identifying a medication's abuse potential requires assessing inappropriate medication use events in clinical trials on the basis of a standardized event classification system. The strengths and limitations of the 2 published methods designed to evaluate inappropriate medication use events are reviewed, with recommended considerations for further development and current implementation.
Symbolic discrete event system specification
NASA Technical Reports Server (NTRS)
Zeigler, Bernard P.; Chi, Sungdo
1992-01-01
Extending discrete event modeling formalisms to facilitate greater symbol manipulation capabilities is important to further their use in intelligent control and design of high autonomy systems. An extension to the DEVS formalism that facilitates symbolic expression of event times by extending the time base from the real numbers to the field of linear polynomials over the reals is defined. A simulation algorithm is developed to generate the branching trajectories resulting from the underlying nondeterminism. To efficiently manage symbolic constraints, a consistency checking algorithm for linear polynomial constraints based on feasibility checking algorithms borrowed from linear programming has been developed. The extended formalism offers a convenient means to conduct multiple, simultaneous explorations of model behaviors. Examples of application are given with concentration on fault model analysis.
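The consistency check described above, testing whether a set of linear polynomial constraints on symbolic event times remains satisfiable, can be sketched as a phase-one-style feasibility linear program with a zero objective. The constraint sets below are invented examples, and `scipy.optimize.linprog` is used here as a stand-in solver rather than the algorithm developed in the paper.

```python
import numpy as np
from scipy.optimize import linprog

def consistent(A_ub, b_ub):
    """Check whether the constraint set A_ub @ t <= b_ub, t >= 0 is feasible
    by solving a zero-objective linear program (phase-one style check)."""
    n = np.asarray(A_ub).shape[1]
    res = linprog(c=np.zeros(n), A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * n, method="highs")
    return res.status == 0  # 0 = optimum found => constraints are consistent

# Two symbolic event-time variables t1, t2, written as constraint rows:
# t1 - t2 <= -1 (event 1 at least 1 unit before event 2), t1 <= 5, t2 <= 3.
feasible = consistent([[1, -1], [1, 0], [0, 1]], [-1, 5, 3])
# Contradictory pair: t1 - t2 <= -1 together with t2 - t1 <= -1.
infeasible = consistent([[1, -1], [-1, 1]], [-1, -1])
```

Pruning branches with inconsistent constraint sets is what keeps the branching simulation of nondeterministic event times tractable.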
Alternative Splice in Alternative Lice
Tovar-Corona, Jaime M.; Castillo-Morales, Atahualpa; Chen, Lu; Olds, Brett P.; Clark, John M.; Reynolds, Stuart E.; Pittendrigh, Barry R.; Feil, Edward J.; Urrutia, Araxi O.
2015-01-01
Genomic and transcriptomic analyses have revealed human head and body lice to be almost genetically identical; although conspecific, they nevertheless occupy distinct ecological niches and have differing feeding patterns. Most importantly, while head lice are not known to be vector competent, body lice can transmit three serious bacterial diseases: epidemic typhus, trench fever, and relapsing fever. In order to gain insights into the molecular bases for these differences, we analyzed alternative splicing (AS) using next-generation sequencing data for one strain of head lice and one strain of body lice. We identified a total of 3,598 AS events which were head or body lice specific. Exon skipping AS events were overrepresented among both head and body lice, whereas intron retention events were underrepresented in both. However, both the enrichment of exon skipping and the underrepresentation of intron retention are significantly stronger in body lice compared with head lice. Genes containing body louse-specific AS events were found to be significantly enriched for functions associated with development of the nervous system, salivary gland, trachea, and ovarian follicle cells, as well as regulation of transcription. In contrast, no functional categories were overrepresented among genes with head louse-specific AS events. Together, our results constitute the first evidence for transcript pool differences in head and body lice, providing insights into molecular adaptations that enabled human lice to adapt to clothing, and representing a powerful illustration of the pivotal role AS can play in functional adaptation. PMID:26169943
Evaluating a Control System Architecture Based on a Formally Derived AOCS Model
NASA Astrophysics Data System (ADS)
Ilic, Dubravka; Latvala, Timo; Varpaaniemi, Kimmo; Vaisanen, Pauli; Troubitsyna, Elena; Laibinis, Linas
2010-08-01
Attitude and Orbit Control System (AOCS) refers to a broad class of control systems used to determine and control the attitude of a spacecraft while in orbit, based on information obtained from various sensors. In this paper, we propose an approach to evaluate a typical (yet somewhat simplified) AOCS architecture using formal development based on the Event-B method. As a starting point, an Ada specification of the AOCS is translated into a formal specification and further refined to incorporate all the details of the original source code specification. In this way we are able not only to evaluate the Ada specification by expressing and verifying specific system properties in our formal models, but also to determine how well the chosen modelling framework copes with the level of detail required for an actual implementation and code generation from the derived models.
Mutic, Sasa; Brame, R Scott; Oddiraju, Swetha; Parikh, Parag; Westfall, Melisa A; Hopkins, Merilee L; Medina, Angel D; Danieley, Jonathan C; Michalski, Jeff M; El Naqa, Issam M; Low, Daniel A; Wu, Bin
2010-09-01
The value of near-miss and error reporting processes in many industries is well appreciated and typically can be supported with data that have been collected over time. While it is generally accepted that such processes are important in the radiation therapy (RT) setting, studies analyzing the effects of organized reporting and process improvement systems on operation and patient safety in individual clinics remain scarce. The purpose of this work is to report on the design and long-term use of an electronic reporting system in an RT department and to compare it to the paper-based reporting system it replaced. A web-based system was specifically designed for reporting individual events in RT and was clinically implemented in 2007. An event was defined as any occurrence that could have resulted, or had resulted, in a deviation in the delivery of patient care. The aim of the system was to support process improvement in patient care and safety. The reporting tool was designed so that individual events could be quickly and easily reported without disrupting clinical work; this was very important because use of the system was voluntary. The spectrum of reported deviations extended from minor workflow issues (e.g., scheduling) to errors in treatment delivery. Reports were categorized based on the functional area, type, and severity of an event. The events were processed and analyzed by a formal process improvement group that used the data and statistics collected through the web-based tool for guidance in reengineering clinical processes. The reporting trends for the first 24 months with the electronic system were compared to the events reported in the same clinic with a paper-based system over a seven-year period. The reporting system and the process improvement structure resulted in increased event reporting, improved event communication, and improved identification of clinical areas which needed process and safety improvements.
The reported data were also useful for evaluating corrective measures and recognizing ineffective measures and efforts. The electronic system was relatively well accepted by personnel and resulted in minimal disruption of clinical work. Although reporting was voluntary, even the quarter with the fewest reports under the electronic system had almost four times as many reported events as the busiest quarter under the paper-based system, and reporting remained consistent from the inception of the process through the date of this report. However, acceptance was not universal, validating the need for improved education regarding reporting processes and systematic approaches to developing a reporting culture. Specially designed electronic event reporting systems in a radiotherapy setting can provide valuable data for process and patient safety improvement and are more effective reporting mechanisms than paper-based systems. Additional work is needed to develop methods that can more effectively utilize reported data for process improvement, including the development of a standardized event taxonomy and a classification system for RT.
An att site-based recombination reporter system for genome engineering and synthetic DNA assembly.
Bland, Michael J; Ducos-Galand, Magaly; Val, Marie-Eve; Mazel, Didier
2017-07-14
Direct manipulation of the genome is a widespread technique for genetic studies and synthetic biology applications. The tyrosine and serine site-specific recombination systems of bacteriophages HK022 and ΦC31 are widely used for stable directional exchange and relocation of DNA sequences, making them valuable tools in these contexts. We have developed site-specific recombination tools that allow the direct selection of recombination events by embedding the attB site from each system within the β-lactamase resistance coding sequence (bla). The HK and ΦC31 tools were developed by placing the attB sites from each system into the signal peptide cleavage site coding sequence of bla. All possible open reading frames (ORFs) were inserted and tested for recombination efficiency and bla activity. Efficient recombination was observed for all tested ORFs (3 for HK, 6 for ΦC31) as shown through a cointegrate formation assay. The bla gene with the embedded attB site was functional for eight of the nine constructs tested. The HK/ΦC31 att-bla system offers a simple way to directly select recombination events, thus enhancing the use of site-specific recombination systems for carrying out precise, large-scale DNA manipulation, and adding useful tools to the genetics toolbox. We further demonstrate the power and flexibility of bla as a reporter for recombination.
Extreme Precipitation, Public Health Emergencies, and Safe Drinking Water in the USA.
Exum, Natalie G; Betanzo, Elin; Schwab, Kellogg J; Chen, Thomas Y J; Guikema, Seth; Harvey, David E
2018-06-01
This review examines the effectiveness of drinking water regulations to inform public health during extreme precipitation events. This paper estimates the vulnerability of specific populations to flooding in their public water system, reviews the literature linking precipitation to waterborne outbreaks, examines the roles that the Safe Drinking Water Act and the Public Notification (PN) Rule play in public health emergencies, and reviews the effectiveness of the PN Rule during Hurricane Maria in Puerto Rico in 2017. Public water systems in large metropolitan areas have substantial portions of their customer base at risk for a waterborne outbreak during a flooding event. The PN Rule is ambiguous about who is responsible for declaring a "waterborne emergency" following a natural disaster like Hurricane Maria. Revisions to the current PN Rule that mandate public notification and water quality sampling during extreme precipitation events are necessary to ensure the public is aware of their drinking water quality following these events.
Making safety an integral part of 5S in healthcare.
Ikuma, Laura H; Nahmens, Isabelina
2014-01-01
Healthcare faces major challenges with provider safety and rising costs, and many organizations are using Lean to instigate change. One Lean tool, 5S, is becoming popular for improving efficiency of physical work environments, and it can also improve safety. This paper demonstrates that safety is an integral part of 5S by examining five specific 5S events in acute care facilities. We provide two arguments for how safety is linked to 5S: (1) safety is affected by 5S events, regardless of whether safety is a specific goal, and (2) safety can and should permeate all five S's as part of a comprehensive plan for system improvement. Reports of 5S events from five departments in one health system were used to evaluate how changes made at each step of the 5S impacted safety. Safety was affected positively in each step of the 5S through initial safety goals and side effects of other changes. The case studies show that 5S can be a mechanism for improving safety. Practitioners may reap additional safety benefits by incorporating safety into 5S events through a safety analysis before the 5S, safety goals and considerations during the 5S, and follow-up safety analysis.
Yokotsuka, M; Aoyama, M; Kubota, K
2000-07-01
The Medical Dictionary for Regulatory Activities Terminology (MedDRA) version 2.1 (V2.1) was released in March 1999 accompanied by the MedDRA/J V2.1J specifically for Japanese users. In prescription-event monitoring in Japan (J-PEM), we have employed the MedDRA/J for data entry, signal generation and event listing. In J-PEM, the lowest level terms (LLTs) in the MedDRA/J are used in data entry because the richness of LLTs is judged to be advantageous. A signal is generated normally at the preferred term (PT) level, but it has been found that various reporters describe the same event using descriptions that are potentially encoded by LLTs under different PTs. In addition, some PTs are considered too specific to generate the proper signal. In the system used in J-PEM, when an LLT is selected as a candidate to encode an event, another LLT under a different PT, if any, is displayed on the computer screen so that it may be coded instead of, or in addition to, the candidate LLT. The five-level structure of the MedDRA is used when listing events but some modification is required to generate a functional event list.
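The coding aid described above, which offers an alternative LLT under a different PT when a candidate LLT is selected, can be sketched as a simple cross-linked lookup. The terms and links below are invented illustrations, not actual MedDRA content or the J-PEM implementation.

```python
# Hypothetical mini-thesaurus: LLT -> PT mapping (illustrative terms only).
llt_to_pt = {
    "heart attack": "Myocardial infarction",
    "myocardial infarct": "Myocardial infarction",
    "MI": "Myocardial infarction",
    "cardiac arrest": "Cardiac arrest",
}
# LLTs describing potentially overlapping events that sit under different PTs.
cross_links = {"heart attack": ["cardiac arrest"]}

def alternates(candidate_llt):
    """Given a candidate LLT, list linked LLTs falling under a different PT,
    mimicking the J-PEM screen that offers alternative codings."""
    pt = llt_to_pt[candidate_llt]
    return [llt for llt in cross_links.get(candidate_llt, [])
            if llt_to_pt[llt] != pt]

alts = alternates("heart attack")
```

The point of such a display is that signal generation at the PT level is not silently fragmented when reporters describe the same event with LLTs under different PTs.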
NASA Technical Reports Server (NTRS)
Golden, D. P., Jr.; Wolthuis, R. A.; Hoffler, G. W.; Gowen, R. J.
1974-01-01
Frequency bands that best discriminate the Korotkov sounds at systole and at diastole from the sounds immediately preceding these events are defined. Korotkov sound data were recorded from five normotensive subjects during orthostatic stress (lower body negative pressure) and bicycle ergometry. A spectral analysis of the seven Korotkov sounds centered about the systolic and diastolic auscultatory events revealed that a maximum increase in amplitude at the systolic transition occurred in the 18-26-Hz band, while a maximum decrease in amplitude at the diastolic transition occurred in the 40-60-Hz band. These findings were remarkably consistent across subjects and test conditions. These passbands are included in the design specifications for an automatic blood pressure measuring system used in conjunction with medical experiments during NASA's Skylab program.
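A sketch of how the two reported passbands (18-26 Hz for the systolic transition, 40-60 Hz for the diastolic one) might be applied as band-pass detectors. Only the band edges come from the abstract; the sampling rate and the synthetic signal below are assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 1000  # Hz, assumed sampling rate for a synthetic Korotkov recording

def band_energy(x, lo, hi):
    """Energy of x in the [lo, hi] Hz band via a 4th-order Butterworth filter."""
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    y = sosfiltfilt(sos, x)
    return float(np.sum(y ** 2))

t = np.arange(0, 1.0, 1 / fs)
# Synthetic sound dominated by a 22 Hz component (inside the 18-26 Hz
# systolic band) plus a weaker 50 Hz component (inside the diastolic band).
x = np.sin(2 * np.pi * 22 * t) + 0.3 * np.sin(2 * np.pi * 50 * t)

systolic_energy = band_energy(x, 18, 26)   # captures the 22 Hz component
diastolic_energy = band_energy(x, 40, 60)  # captures the 50 Hz component
```

An automatic measurement system would track how these band energies jump or drop across consecutive Korotkov sounds to mark the systolic and diastolic transitions.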
Molecular brake pad hypothesis: pulling off the brakes for emotional memory
Vogel-Ciernia, Annie
2015-01-01
Under basal conditions histone deacetylases (HDACs) and their associated co-repressor complexes serve as molecular ‘brake pads’ to prevent the gene expression required for long-term memory formation. Following a learning event, HDACs and their co-repressor complexes are removed from a subset of specific gene promoters, allowing the histone acetylation and active gene expression required for long-term memory formation. Inhibition of HDACs increases histone acetylation, extends gene expression profiles, and allows for the formation of persistent long-term memories for training events that are otherwise forgotten. We propose that emotionally salient experiences have utilized this system to form strong and persistent memories for behaviorally significant events. Consequently, the presence or absence of HDACs at a selection of specific gene promoters could serve as a critical barrier for permitting the formation of long-term memories. PMID:23096102
Design and Optimization of a Dual-HPGe Gamma Spectrometer and Its Cosmic Veto System
NASA Astrophysics Data System (ADS)
Zhang, Weihua; Ro, Hyunje; Liu, Chuanlei; Hoffman, Ian; Ungar, Kurt
2017-03-01
In this paper, a dual high purity germanium (HPGe) gamma spectrometer detection system with an increased solid angle was developed. The detection system consists of a pair of Broad Energy Germanium (BE-5030p) detectors and an XIA LLC digital gamma finder/Pixie-4 data-acquisition system. A data file processor containing five modules was developed; it parses Pixie-4 list-mode data output files and classifies detections into anticoincident/coincident events and their specific coincidence types (double/triple/quadruple) for further analysis. A novel cosmic veto system was installed in the detection system. It was designed to be easy to install around an existing system while still providing cosmic veto shielding comparable to other designs. This paper describes the coverage and efficiency of this cosmic veto and the data processing system. It has been demonstrated that the cosmic veto system can provide a mean background reduction of 66.1%, which results in a mean MDA improvement of 58.3%. The counting time needed to meet the required MDA for a specific radionuclide can be reduced by a factor of 2-3 compared to a conventional HPGe system. This paper also provides an initial overview of coincidence timing distributions between an incoming event from a cosmic veto plate and the HPGe detector.
A variational approach to probing extreme events in turbulent dynamical systems
Farazmand, Mohammad; Sapsis, Themistoklis P.
2017-01-01
Extreme events are ubiquitous in a wide range of dynamical systems, including turbulent fluid flows, nonlinear waves, large-scale networks, and biological systems. We propose a variational framework for probing conditions that trigger intermittent extreme events in high-dimensional nonlinear dynamical systems. We seek the triggers as the probabilistically feasible solutions of an appropriately constrained optimization problem, where the function to be maximized is a system observable exhibiting intermittent extreme bursts. The constraints are imposed to ensure the physical admissibility of the optimal solutions, that is, significant probability for their occurrence under the natural flow of the dynamical system. We apply the method to a body-forced incompressible Navier-Stokes equation, known as the Kolmogorov flow. We find that the intermittent bursts of the energy dissipation are independent of the external forcing and are instead caused by the spontaneous transfer of energy from large scales to the mean flow via nonlinear triad interactions. The global maximizer of the corresponding variational problem identifies the responsible triad, hence providing a precursor for the occurrence of extreme dissipation events. Specifically, monitoring the energy transfers within this triad allows us to develop a data-driven short-term predictor for the intermittent bursts of energy dissipation. We assess the performance of this predictor through direct numerical simulations. PMID:28948226
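In spirit, the variational framework above maximizes an intermittently bursting observable subject to an admissibility constraint. The toy problem below, an energy-transfer-like triple product maximized over a unit-"energy" sphere, is an invented stand-in for illustration, not the constrained Navier-Stokes computation in the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Toy observable standing in for an intermittently bursting quantity:
# an energy-transfer-like triple product over a 3-mode state.
def observable(x):
    return x[0] * x[1] * x[2]

# Admissibility proxy: restrict to states of unit "energy" ||x||^2 = 1,
# a crude stand-in for the probabilistic feasibility constraint.
cons = {"type": "eq", "fun": lambda x: np.dot(x, x) - 1.0}

res = minimize(lambda x: -observable(x), x0=np.array([0.6, 0.5, 0.6]),
               constraints=[cons], method="SLSQP")
trigger = res.x                  # candidate triggering state
max_val = observable(trigger)    # maximized observable on the sphere
```

The analytical maximizer puts equal weight on all three modes, giving a maximum of 1/(3*sqrt(3)); monitoring the system's projection onto such a maximizer is what turns it into a burst precursor.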
Developing points-based risk-scoring systems in the presence of competing risks.
Austin, Peter C; Lee, Douglas S; D'Agostino, Ralph B; Fine, Jason P
2016-09-30
Predicting the occurrence of an adverse event over time is an important issue in clinical medicine. Clinical prediction models and associated points-based risk-scoring systems are popular statistical methods for summarizing the relationship between a multivariable set of patient risk factors and the risk of the occurrence of an adverse event. Points-based risk-scoring systems are popular amongst physicians as they permit a rapid assessment of patient risk without the use of computers or other electronic devices. The use of such points-based risk-scoring systems facilitates evidence-based clinical decision making. There is a growing interest in cause-specific mortality and in non-fatal outcomes. However, when considering these types of outcomes, one must account for competing risks whose occurrence precludes the occurrence of the event of interest. We describe how points-based risk-scoring systems can be developed in the presence of competing events. We illustrate the application of these methods by developing risk-scoring systems for predicting cardiovascular mortality in patients hospitalized with acute myocardial infarction. Code in the R statistical programming language is provided for the implementation of the described methods. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
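The authors provide R code for their methods; as a language-agnostic illustration, the Sullivan-style step that turns regression coefficients into integer points can be sketched as follows. The coefficients and the per-point increment below are invented, not taken from the paper's myocardial infarction model, and the competing-risks fitting step that produces such coefficients is omitted.

```python
# Sullivan-style points assignment: convert regression coefficients from a
# (cause-specific or subdistribution hazard) model into integer points.
def assign_points(betas, base_increment):
    """One point = the risk associated with `base_increment` units of
    log-hazard ratio (e.g., the coefficient for 5 years of age)."""
    return {name: round(beta / base_increment) for name, beta in betas.items()}

# Illustrative log-hazard-ratio differences from each factor's reference level.
betas = {
    "age_70_79": 0.60,
    "age_80_plus": 0.95,
    "diabetes": 0.32,
    "prior_mi": 0.28,
}
B = 0.30  # log-hazard ratio assigned the value of one point (assumed)
points = assign_points(betas, B)
total = points["age_80_plus"] + points["diabetes"]  # example patient score
```

In the competing-risks setting, the final step maps a total score to a predicted cumulative incidence rather than to a Kaplan-Meier-style survival probability, since the latter overstates risk when competing events are present.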
Wang, Soon Joo; Choi, Jin Tae; Arnold, Jeffrey
2003-01-01
South Korea has experienced > 30 suspected terrorism-related events since 1958, including attacks against South Korean citizens in foreign countries. The most common types of terrorism used have included bombings, shootings, hijackings, and kidnappings. Prior to 1990, North Korea was responsible for almost all terrorism-related events inside of South Korea, including multiple assassination attempts on its presidents, regular kidnappings of South Korean fishermen, and several high-profile bombings. Since 1990, most of the terrorist attacks against South Korean citizens have occurred abroad and have been related to the emerging worldwide pattern of terrorism by international terrorist organizations or deranged individuals. The 1988 Seoul Olympic Games provided a major stimulus for South Korea to develop a national emergency response system for terrorism-related events based on the participation of multiple ministries. The 11 September 2001 World Trade Center and Pentagon attacks and the 2001 United States of America (US) anthrax letter attacks prompted South Korea to organize a new national system of emergency response for terrorism-related events. The system is based on five divisions for the response to specific types of terrorist events, involving conventional terrorism, bioterrorism, chemical terrorism, radiological terrorism, and cyber-terrorism. No terrorism-related events occurred during the 2002 World Cup and Asian Games held in South Korea. The emergency management of terrorism-related events in South Korea is adapting to the changing risk of terrorism in the new century.
Earthquake information products and tools from the Advanced National Seismic System (ANSS)
Wald, Lisa
2006-01-01
This Fact Sheet provides a brief description of postearthquake tools and products provided by the Advanced National Seismic System (ANSS) through the U.S. Geological Survey Earthquake Hazards Program. The focus is on products specifically aimed at providing situational awareness in the period immediately following significant earthquake events.
Sources of Invalidity When Comparing Classroom Behaviors Across Cultures and Nations.
ERIC Educational Resources Information Center
Pfau, Richard H.
Focusing on the use of category systems in classroom observation, this report summarizes factors that may significantly affect the validity of cross-national and cross-cultural comparisons of classroom behaviors. Category systems measure well-defined behaviors by recording events observed at specific intervals or as they begin and end. Areas of…
Porter, Eileen J
2008-01-01
There is little research guiding interventions to help old homebound women prepare to manage an intrusion event. During a phenomenological study of the experience of reaching help quickly, I compared intentions during a possible intrusion event for 9 women subscribing to a personal emergency response system and 5 nonsubscribers. The phenomenon of contemplating what I would do if an intruder got in my home had 4 components. Only 2 personal emergency response system subscribers voiced the definitive intention to use the personal emergency response system. Findings underpin a new empirical perspective of competence grounded in situations relevant to living alone at home rather than specific tasks of daily living.
Health and rescue services management system during a crisis event
Nicolaidou, Iolie; Hadjichristofi, George; Kyprianou, Stelios; Christou, Synesios; Constantinou, Riana
2016-01-01
The performance of rescuers and personnel handling major emergencies or crisis events can be significantly improved through continuous training and through technology support. This paper discusses the work done to create a system that can support both response resources and victims during a crisis or major emergency event. More specifically, the system supports real-time management of firefighter teams, rescue teams, health services, and victims during a major disaster. It can be deployed in an ad hoc manner in the disaster area, as a stand-alone infrastructure (using its own telecommunications and power). It mainly consists of a control station, which is installed in the area command centre; the firefighter units; the rescuer units; the ambulance vehicle units; and the telemedicine units that can be used to support victim handling at the casualties clearing station. The system has been tested and improved through continuous communication with experts and through professional exercises; the results and conclusions are presented. PMID:27733928
Botsis, T.; Woo, E. J.; Ball, R.
2013-01-01
Background: We previously demonstrated that a general purpose text mining system, the Vaccine adverse event Text Mining (VaeTM) system, could be used to automatically classify reports of anaphylaxis for post-marketing safety surveillance of vaccines. Objective: To evaluate the ability of VaeTM to classify reports to the Vaccine Adverse Event Reporting System (VAERS) of possible Guillain-Barré Syndrome (GBS). Methods: We used VaeTM to extract the key diagnostic features from the text of reports in VAERS. Then, we applied the Brighton Collaboration (BC) case definition for GBS, and an information retrieval strategy (i.e. the vector space model), to quantify the specific information that is included in the key features extracted by VaeTM and compared it with the encoded information that is already stored in VAERS as Medical Dictionary for Regulatory Activities (MedDRA) Preferred Terms (PTs). We also evaluated the contribution of the primary (diagnosis and cause of death) and secondary (second-level diagnosis and symptoms) diagnostic VaeTM-based features to the total VaeTM-based information. Results: MedDRA captured more information and better supported the classification of reports for GBS than VaeTM (AUC: 0.904 vs. 0.777); the lower performance of VaeTM is likely due to its failure to extract specific laboratory results that are included in the BC criteria for GBS. On the other hand, the VaeTM-based classification exhibited greater specificity than the MedDRA-based approach (94.96% vs. 87.65%). Most of the VaeTM-based information was contained in the secondary diagnostic features. Conclusion: For GBS, clinical signs and symptoms alone are not sufficient to match MedDRA coding for purposes of case classification, but are preferred if specificity is the priority. PMID:23650490
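The vector space model used above to quantify information overlap reduces, in its simplest form, to cosine similarity between term vectors. The terms below are invented placeholders, not the actual Brighton Collaboration criteria or VaeTM features, and real systems would add tf-idf weighting.

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bags of words (vector space model)."""
    va, vb = Counter(a), Counter(b)
    dot = sum(va[t] * vb[t] for t in va)
    na = math.sqrt(sum(v * v for v in va.values()))
    nb = math.sqrt(sum(v * v for v in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical case-definition terms vs. features extracted from one report.
criteria = ["weakness", "bilateral", "flaccid", "areflexia", "csf"]
report = ["weakness", "bilateral", "tingling", "weakness"]
score = cosine(criteria, report)
```

Ranking or thresholding such scores across reports is one simple way to turn extracted features into a case classification.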
NASA Technical Reports Server (NTRS)
Finger, Herbert; Weeks, Bill
1985-01-01
This presentation discusses instrumentation that will be used for a specific event, which we hope will carry on to future events within the Space Shuttle program. The experiment is the Autogenic Feedback Training Experiment (AFTE) scheduled for Spacelab 3, currently scheduled to be launched in November 1984. The objectives of the AFTE are to determine the effectiveness of autogenic feedback in preventing or reducing space adaptation syndrome (SAS), to monitor and record in-flight data from the crew, to determine if prediction criteria for SAS can be established, and, finally, to develop an ambulatory instrument package to be mounted on the crew throughout the mission. The purpose of the Ambulatory Feedback System (AFS) is to record the responses of the subject during a provocative event in space and provide a real-time feedback display to reinforce the training.
Investigating cardiorespiratory interaction by cross-spectral analysis of event series
NASA Astrophysics Data System (ADS)
Schäfer, Carsten; Rosenblum, Michael G.; Pikovsky, Arkady S.; Kurths, Jürgen
2000-02-01
The human cardiovascular and respiratory systems interact with each other and show effects of modulation and synchronization. Here we present a cross-spectral technique that specifically considers the event-like character of the heartbeat and avoids typical restrictions of other spectral methods. Using models as well as experimental data, we demonstrate how modulation and synchronization can be distinguished. Finally, we compare the method to traditional techniques and to the analysis of instantaneous phases.
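The event-like character of the heartbeat can be accommodated by binning event times into a counting series before forming a cross-spectrum. A minimal illustration of that idea, using a naive O(N²) DFT (function names and the toy event times are assumptions, not the authors' implementation):

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform (O(N^2), adequate for a sketch)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def cross_spectrum(events_a, events_b, n_bins, duration):
    """Bin two point-process (event) series, then form S_ab(k) = A(k) * conj(B(k))."""
    def binarize(events):
        series = [0.0] * n_bins
        for t in events:
            series[min(int(t / duration * n_bins), n_bins - 1)] += 1.0
        return series
    fa = dft(binarize(events_a))
    fb = dft(binarize(events_b))
    return [a * b.conjugate() for a, b in zip(fa, fb)]

# Toy data: heartbeat-like events at ~1 Hz, breath-like events at ~0.25 Hz
heartbeats = [1.0 * i for i in range(10)]
breaths = [4.0 * i for i in range(3)]
spectrum = cross_spectrum(heartbeats, breaths, n_bins=64, duration=10.0)
```

Peaks in |S_ab| at shared frequencies then indicate modulation or synchronization between the two event series.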
Moore, Kieran M; Edge, Graham; Kurc, Andrew R
2008-11-14
Timeliness is a critical asset to the detection of public health threats when using syndromic surveillance systems. In order for epidemiologists to effectively distinguish which events are indicative of a true outbreak, the ability to utilize specific data streams from generalized data summaries is necessary. Taking advantage of graphical user interfaces and visualization capacities of current surveillance systems makes it easier for users to investigate detected anomalies by generating custom graphs, maps, plots, and temporal-spatial analysis of specific syndromes or data sources.
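Anomaly detection on a specific syndromic data stream is commonly implemented with a control-chart statistic such as a one-sided CUSUM. A minimal sketch, not tied to any particular surveillance system (the thresholds and toy daily counts are illustrative, and a real system would estimate the standard deviation from historical baseline data rather than the series itself):

```python
import statistics

def cusum_flags(counts, baseline, k=0.5, h=4.0):
    """One-sided CUSUM: accumulate the standardized excess over the baseline
    (minus a slack of k standard units) and flag days where the running sum
    crosses the decision threshold h."""
    sd = statistics.pstdev(counts) or 1.0  # simplification; see note above
    s, flags = 0.0, []
    for c in counts:
        s = max(0.0, s + (c - baseline - k * sd) / sd)
        flags.append(s > h)
    return flags

# Daily counts for one syndrome; a simulated outbreak starts on day 6
daily_counts = [12, 11, 13, 12, 14, 30, 33, 35, 31, 12]
flags = cusum_flags(daily_counts, baseline=12)
```

The accumulation means a sustained moderate excess is flagged even when no single day is extreme, which is what gives CUSUM its timeliness advantage.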
Fall Detection Using Smartphone Audio Features.
Cheffena, Michael
2016-07-01
An automated fall detection system based on smartphone audio features is developed. The spectrogram, mel frequency cepstral coefficients (MFCCs), linear predictive coding (LPC), and matching pursuit (MP) features of different fall and no-fall sound events are extracted from experimental data. Based on the extracted audio features, four different machine learning classifiers: k-nearest neighbor classifier (k-NN), support vector machine (SVM), least squares method (LSM), and artificial neural network (ANN) are investigated for distinguishing between fall and no-fall events. For each audio feature, the performance of each classifier in terms of sensitivity, specificity, accuracy, and computational complexity is evaluated. The best performance is achieved using spectrogram features with the ANN classifier, with sensitivity, specificity, and accuracy all above 98%. The classifier also has acceptable computational requirements for training and testing. The system is applicable in home environments where the phone is placed in the vicinity of the user.
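The k-NN classifier evaluated in the study can be sketched in a few lines; the 2-D feature vectors below are toy stand-ins for the extracted audio features, not the paper's actual data:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify a feature vector by majority vote among its k nearest
    training examples (Euclidean distance)."""
    nearest = sorted(train, key=lambda xy: math.dist(xy[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Toy 2-D audio features (e.g. energy, spectral centroid); values illustrative
train = [((0.90, 0.20), "fall"), ((0.80, 0.30), "fall"), ((0.85, 0.25), "fall"),
         ((0.10, 0.70), "no-fall"), ((0.20, 0.80), "no-fall"), ((0.15, 0.75), "no-fall")]
label = knn_predict(train, (0.88, 0.22))  # "fall"
```

k-NN needs no training phase, which is why its computational cost shifts entirely to classification time, one of the trade-offs the paper's complexity comparison addresses.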
Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.
Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A
2016-04-01
The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remains underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices. Copyright © 2016 American College of Radiology. All rights reserved.
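The core of a DES model is an event list processed in time order. A minimal sketch of a single-radiologist reading room, assuming FIFO service (the parameters are illustrative; dedicated DES packages add resources, routing, and statistics collection):

```python
import heapq

def simulate_reading_room(arrivals, read_time):
    """Minimal discrete-event simulation: one radiologist, FIFO worklist.
    Arrival events are popped in time order; each study starts when both
    it and the radiologist are available. Returns completion times."""
    events = [(t, "arrival") for t in arrivals]
    heapq.heapify(events)  # the DES event list, ordered by event time
    busy_until, completions = 0, []
    while events:
        clock, _ = heapq.heappop(events)
        start = max(clock, busy_until)      # wait if the radiologist is busy
        busy_until = start + read_time
        completions.append(busy_until)
    return completions

# Studies arrive at t = 0, 2, 3 min; each read takes 5 min
completion_times = simulate_reading_room([0, 2, 3], read_time=5)  # [5, 10, 15]
```

A queue builds because arrivals outpace service; swapping in stochastic arrival and read times and multiple readers turns this skeleton into the kind of workflow model the article describes.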
ERIC Educational Resources Information Center
Maguire, Mandy J.; Hirsh-Pasek, Kathy; Golinkoff, Roberta Michnick; Imai, Mutsumi; Haryu, Etsuko; Vanegas, Sandra; Okada, Hiroyuki; Pulverman, Rachel; Sanchez-Davis, Brenda
2010-01-01
The world's languages draw on a common set of event components for their verb systems. Yet, these components are differentially distributed across languages. At what age do children begin to use language-specific patterns to narrow possible verb meanings? English-, Japanese-, and Spanish-speaking adults, toddlers, and preschoolers were shown…
NASA Technical Reports Server (NTRS)
Jules, Kenol; Lin, Paul P.
2001-01-01
This paper presents an artificial intelligence monitoring system developed by the NASA Glenn Principal Investigator Microgravity Services project to help the principal investigator teams identify the primary vibratory disturbance sources that are active, at any moment in time, on-board the International Space Station, which might impact the microgravity environment their experiments are exposed to. From the Principal Investigator Microgravity Services' web site, the principal investigator teams can monitor via a graphical display, in near real time, which event(s) is/are on, such as crew activities, pumps, fans, centrifuges, compressor, crew exercise, platform structural modes, etc., and decide whether or not to run their experiments based on the acceleration environment associated with a specific event. This monitoring system is focused primarily on detecting the vibratory disturbance sources, but could be used as well to detect some of the transient disturbance sources, depending on the events duration. The system has built-in capability to detect both known and unknown vibratory disturbance sources. Several soft computing techniques such as Kohonen's Self-Organizing Feature Map, Learning Vector Quantization, Back-Propagation Neural Networks, and Fuzzy Logic were used to design the system.
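Learning Vector Quantization, one of the techniques mentioned, adjusts labeled prototype vectors toward same-class training samples and away from different-class ones. A minimal LVQ1 sketch with toy 1-D features standing in for vibration signatures (labels and values are illustrative, not ISS data):

```python
import math

def lvq_train(prototypes, samples, lr=0.1, epochs=20):
    """LVQ1: for each sample, move the nearest prototype toward it if the
    labels match, away from it otherwise."""
    protos = [(list(p), label) for p, label in prototypes]
    for _ in range(epochs):
        for x, label in samples:
            p, plabel = min(protos, key=lambda pl: math.dist(pl[0], x))
            sign = 1.0 if plabel == label else -1.0
            for i, xi in enumerate(x):
                p[i] += sign * lr * (xi - p[i])
    return protos

def lvq_classify(protos, x):
    """Label of the nearest prototype."""
    return min(protos, key=lambda pl: math.dist(pl[0], x))[1]

# Toy 1-D vibration features for two disturbance sources (values illustrative)
samples = [([0.0], "pump"), ([0.2], "pump"), ([1.0], "fan"), ([1.2], "fan")]
protos = lvq_train([([0.4], "pump"), ([0.8], "fan")], samples)
```

After training, the prototypes sit near the class centers, so new feature vectors can be labeled by nearest-prototype lookup, which is how known disturbance sources would be recognized.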
Alternative Splice in Alternative Lice.
Tovar-Corona, Jaime M; Castillo-Morales, Atahualpa; Chen, Lu; Olds, Brett P; Clark, John M; Reynolds, Stuart E; Pittendrigh, Barry R; Feil, Edward J; Urrutia, Araxi O
2015-10-01
Genomic and transcriptomic analyses have revealed human head and body lice to be almost genetically identical; although conspecific, they nevertheless occupy distinct ecological niches and have differing feeding patterns. Most importantly, while head lice are not known to be vector competent, body lice can transmit three serious bacterial diseases: epidemic typhus, trench fever, and relapsing fever. In order to gain insights into the molecular bases for these differences, we analyzed alternative splicing (AS) using next-generation sequencing data for one strain of head lice and one strain of body lice. We identified a total of 3,598 AS events which were head or body lice specific. Exon skipping AS events were overrepresented among both head and body lice, whereas intron retention events were underrepresented in both. However, both the enrichment of exon skipping and the underrepresentation of intron retention are significantly stronger in body lice compared with head lice. Genes containing body louse-specific AS events were found to be significantly enriched for functions associated with development of the nervous system, salivary gland, trachea, and ovarian follicle cells, as well as regulation of transcription. In contrast, no functional categories were overrepresented among genes with head louse-specific AS events. Together, our results constitute the first evidence for transcript pool differences in head and body lice, providing insights into molecular adaptations that enabled human lice to adapt to clothing, and representing a powerful illustration of the pivotal role AS can play in functional adaptation. © The Author 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Pretagostini, R; Gabbrielli, F; Fiaschetti, P; Oliveti, A; Cenci, S; Peritore, D; Stabile, D
2010-05-01
Starting from the report on medical errors published in 1999 by the US Institute of Medicine, a number of different approaches to risk management have been developed for maximum risk reduction in health care activities. The health care authorities in many countries have focused attention on patient safety, employing action research programs that are based on quite different principles. We performed a systematic Medline search of the literature since 1999. The following key words were used, also combining boolean operators and medical subheading terms: "adverse event," "risk management," "error," and "governance." Studies published in the last 5 years were classified into various groups: risk management in health care systems; safety in specific hospital activities; and health care institutions' official documents. The methods of these action research programs were analysed and their characteristics compared. Their suitability for safety development in donation, retrieval, and transplantation processes was discussed in the context of the Italian transplant network. Some action research programs and studies were dedicated to entire national health care systems, whereas others focused on specific risks. Many research programs have undergone critical review in the literature. Retrospective analysis has centered on so-called sentinel events, which capture only a minor portion of the organizational phenomena that can be the origin of an adverse event, an incident, or an error. Sentinel events give useful information if they are studied in highly engineered and standardized organizations like laboratories or tissue establishments, but they show several limits in the analysis of organ donation, retrieval, and transplantation processes, which are characterized by prevailing human factors, with high intrinsic risk and variability.
Thus, they are of limited value as a basis for safety management improvement programs, especially in highly complex multidisciplinary systems. In organ transplantation, safety seems more likely to be increased through proactive research, centered mainly on organizational processes, combined with retrospective analyses that are not limited to sentinel event reports. Copyright (c) 2010. Published by Elsevier Inc.
Choosing MUSE: Validation of a Low-Cost, Portable EEG System for ERP Research.
Krigolson, Olave E; Williams, Chad C; Norton, Angela; Hassall, Cameron D; Colino, Francisco L
2017-01-01
In recent years there has been an increase in the number of portable low-cost electroencephalographic (EEG) systems available to researchers. However, to date the validation of the use of low-cost EEG systems has focused on continuous recording of EEG data and/or the replication of large system EEG setups reliant on event-markers to afford examination of event-related brain potentials (ERP). Here, we demonstrate that it is possible to conduct ERP research without being reliant on event markers using a portable MUSE EEG system and a single computer. Specifically, we report the results of two experiments using data collected with the MUSE EEG system: one using the well-known visual oddball paradigm and the other using a standard reward-learning task. Our results demonstrate that we could observe and quantify the N200 and P300 ERP components in the visual oddball task and the reward positivity (the mirror opposite component to the feedback-related negativity) in the reward-learning task. Specifically, single-sample t-tests of component existence (all p's < 0.05), computation of Bayesian credible intervals, and 95% confidence intervals all statistically verified the existence of the N200, P300, and reward positivity in all analyses. We provide with this research paper an open source website with all the instructions, methods, and software to replicate our findings and to provide researchers with an easy way to use the MUSE EEG system for ERP research. Importantly, our work highlights that with a single computer and a portable EEG system such as the MUSE, one can conduct ERP research with ease, thus greatly extending the possible use of the ERP methodology to a variety of novel contexts.
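ERP components such as the N200 and P300 are recovered by averaging EEG epochs time-locked to events, so activity not synchronized with the events cancels out. A minimal sketch of that averaging step (the signal values and window lengths are illustrative):

```python
def erp_average(eeg, event_indices, pre=2, post=4):
    """Average fixed-length epochs around event onsets: non-time-locked
    activity averages toward zero, leaving the event-related potential."""
    epochs = [eeg[i - pre:i + post] for i in event_indices
              if i - pre >= 0 and i + post <= len(eeg)]
    return [sum(col) / len(epochs) for col in zip(*epochs)]

# Two identical deflections embedded at known onsets (samples illustrative)
signal = [0, 0, 0, 5, 8, 3, 0, 0, 0, 0, 5, 8, 3, 0]
erp = erp_average(signal, event_indices=[3, 10])
```

The cited work's contribution is obtaining the event indices without a hardware event-marker channel; once epochs are aligned, the averaging itself is as simple as shown.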
Reactive system verification case study: Fault-tolerant transputer communication
NASA Technical Reports Server (NTRS)
Crane, D. Francis; Hamory, Philip J.
1993-01-01
A reactive program is one which engages in an ongoing interaction with its environment. A system which is controlled by an embedded reactive program is called a reactive system. Examples of reactive systems are aircraft flight management systems, bank automatic teller machine (ATM) networks, airline reservation systems, and computer operating systems. Reactive systems are often naturally modeled (for logical design purposes) as a composition of autonomous processes which progress concurrently and which communicate to share information and/or to coordinate activities. Formal (i.e., mathematical) frameworks for system verification are tools used to increase the users' confidence that a system design satisfies its specification. A framework for reactive system verification includes formal languages for system modeling and for behavior specification and decision procedures and/or proof-systems for verifying that the system model satisfies the system specifications. Using the Ostroff framework for reactive system verification, an approach to achieving fault-tolerant communication between transputers was shown to be effective. The key components of the design, the decoupler processes, may be viewed as discrete-event-controllers introduced to constrain system behavior such that system specifications are satisfied. The Ostroff framework was also effective. The expressiveness of the modeling language permitted construction of a faithful model of the transputer network. The relevant specifications were readily expressed in the specification language. The set of decision procedures provided was adequate to verify the specifications of interest. The need for improved support for system behavior visualization is emphasized.
Brauchli Pernus, Yolanda; Nan, Cassandra; Verstraeten, Thomas; Pedenko, Mariia; Osokogu, Osemeke U; Weibel, Daniel; Sturkenboom, Miriam; Bonhoeffer, Jan
2016-12-12
Safety signal detection in spontaneous reporting system databases and electronic healthcare records is key to the detection of previously unknown adverse events following immunization. Various statistical methods for signal detection in these different data sources have been developed; however, none are geared to the pediatric population, and none specifically to vaccines. A reference set comprising pediatric vaccine-adverse event pairs is required for reliable performance testing of statistical methods within and across data sources. The study was conducted within the context of the Global Research in Paediatrics (GRiP) project, as part of the seventh framework programme (FP7) of the European Commission. Criteria for the selection of vaccines considered in the reference set were routine and global use in the pediatric population. Adverse events were primarily selected based on importance. Outcome-based systematic literature searches were performed for all identified vaccine-adverse event pairs and complemented by expert committee reports, evidence-based decision support systems (e.g. Micromedex), and summaries of product characteristics. Classification into positive (PC) and negative control (NC) pairs was performed by two independent reviewers according to a pre-defined algorithm and discussed for consensus in case of disagreement. We selected 13 vaccines and 14 adverse events to be included in the reference set. From a total of 182 vaccine-adverse event pairs, we classified 18 as PC, 113 as NC and 51 as unclassifiable. Most classifications (91) were based on literature review, 45 were based on expert committee reports, and for 46 vaccine-adverse event pairs, an underlying pathomechanism was not plausible, classifying the association as NC. A reference set of vaccine-adverse event pairs was developed. We propose its use for comparing signal detection methods and systems in the pediatric population. Published by Elsevier Ltd.
Létourneau, Daniel; Wang, An; Amin, Md Nurul; Pearce, Jim; McNiven, Andrea; Keller, Harald; Norrlinger, Bernhard; Jaffray, David A
2014-12-01
High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3-4 times/week over a period of 10-11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ± 0.5 and ± 1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. 
The precision of the MLC performance monitoring QC test and the MLC itself was within ± 0.22 mm for most MLC leaves and the majority of the apparent leaf motion was attributed to beam spot displacements between irradiations. The MLC QC test was performed 193 and 162 times over the monitoring period for the studied units and recalibration had to be repeated up to three times on one of these units. For both units, rate of MLC interlocks was moderately associated with MLC servicing events. The strongest association with the MLC performance was observed between the MLC servicing events and the total number of out-of-control leaves. The average elapsed time for which the number of out-of-specification or out-of-control leaves was within a given performance threshold was computed and used to assess adequacy of MLC test frequency. A MLC performance monitoring system has been developed and implemented to acquire high-quality QC data at high frequency. This is enabled by the relatively short acquisition time for the images and automatic image analysis. The monitoring system was also used to record and track the rate of MLC-related interlocks and servicing events. MLC performances for two commercially available MLC models have been assessed and the results support monthly test frequency for widely accepted ± 1 mm specifications. Higher QC test frequency is however required to maintain tighter specification and in-control behavior.
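The individuals control charts used for per-leaf monitoring compute limits from the mean and the average moving range. A minimal sketch, assuming the standard X-mR constant 2.66 (the leaf-error values are illustrative):

```python
def individuals_control_limits(values):
    """Individuals (X-mR) chart: control limits are the mean plus or minus
    2.66 times the average moving range between successive measurements."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

# Daily position error (mm) measured for one MLC leaf; values illustrative
leaf_errors = [0.10, 0.12, 0.08, 0.11, 0.09, 0.13, 0.10]
lcl, ucl = individuals_control_limits(leaf_errors)
out_of_control = [e for e in leaf_errors if not lcl <= e <= ucl]
```

Because the limits derive from the leaf's own short-term variation, they flag out-of-control behavior that may still sit inside a fixed specification window such as ±1 mm, which is exactly the distinction the monitoring system exploits.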
Emergence of self and other in perception and action: an event-control approach.
Jordan, J Scott
2003-12-01
The present paper analyzes the regularities referred to via the concept 'self.' This is important, for cognitive science traditionally models the self as a cognitive mediator between perceptual inputs and behavioral outputs. This leads to the assertion that the self causes action. Recent findings in social psychology indicate this is not the case and, as a consequence, certain cognitive scientists model the self as being epiphenomenal. In contrast, the present paper proposes an alternative approach (i.e., the event-control approach) that is based on recently discovered regularities between perception and action. Specifically, these regularities indicate that perception and action planning utilize common neural resources. This leads to a coupling of perception, planning, and action in which the first two constitute aspects of a single system (i.e., the distal-event system) that is able to pre-specify and detect distal events. This distal-event system is then coupled with action (i.e., effector-control systems) in a constraining, as opposed to 'causal' manner. This model has implications for how we conceptualize the manner in which one infers the intentions of another, anticipates the intentions of another, and possibly even experiences another. In conclusion, it is argued that it may be possible to map the concept 'self' onto the regularities referred to in the event-control model, not in order to reify 'the self' as a causal mechanism, but to demonstrate its status as a useful concept that refers to regularities that are part of the natural order.
Methodology for Software Reliability Prediction. Volume 1.
1987-11-01
Application categories include manned and unmanned spacecraft, airborne avionics, batch systems, event control, and real-time closed-loop operations. ...software reliability. A Software Reliability Measurement Framework was established which spans the life cycle of a software system and includes the specification, prediction, estimation, and assessment of software reliability. Data from 59 systems, representing over 5 million lines of code, were
An Overview of the NASA Aviation Safety Program Propulsion Health Monitoring Element
NASA Technical Reports Server (NTRS)
Simon, Donald L.
2000-01-01
The NASA Aviation Safety Program (AvSP) has been initiated with aggressive goals to reduce the civil aviation accident rate. To meet these goals, several technology investment areas have been identified, including a sub-element in propulsion health monitoring (PHM). Specific AvSP PHM objectives are to develop and validate propulsion system health monitoring technologies designed to prevent engine malfunctions from occurring in flight, and to mitigate detrimental effects in the event an in-flight malfunction does occur. A review of available propulsion system safety information was conducted to help prioritize PHM areas to focus on under the AvSP. It is noted that when a propulsion malfunction is involved in an aviation accident or incident, it is often a contributing factor rather than the sole cause for the event. Challenging aspects of the development and implementation of PHM technology such as cost, weight, robustness, and reliability are discussed. Specific technology plans are overviewed, including vibration diagnostics, model-based controls and diagnostics, advanced instrumentation, and general aviation propulsion system health monitoring technology. Propulsion system health monitoring, in addition to engine design, inspection, maintenance, and pilot training and awareness, is intrinsic to enhancing aviation propulsion system safety.
Rosén, Karl G; Norén, Håkan; Carlsson, Ann
2018-04-18
Recent developments have produced new CTG classification systems, and the question is to what extent these may affect the model of FHR + ST interpretation. The two new systems (FIGO2015 and SSOG2017) classify FHR + ST events differently from the current CTG classification system used in the STAN interpretation algorithm (STAN2007). The aims were to identify the predominant FHR patterns in connection with ST events in cases of cord artery metabolic acidosis missed by the different CTG classification systems, to indicate to what extent STAN clinical guidelines could be modified to enhance sensitivity, and to provide a pathophysiological rationale. Forty-four cases with umbilical cord artery metabolic acidosis were retrieved from a European multicenter database. Significant FHR + ST events were evaluated post hoc in consensus by an expert panel. Eighteen cases were not identified as in need of intervention and were regarded as negative in the sensitivity analysis. In 12 cases, ST changes occurred but the CTG was regarded as reassuring. Visual analysis of the FHR + ST tracings revealed specific FHR patterns. Conclusion: These findings indicate that FHR + ST analysis may be undertaken regardless of CTG classification system, provided there is a more physiologically oriented approach to FHR assessment in connection with an ST event.
Data-assisted reduced-order modeling of extreme events in complex dynamical systems
Wan, Zhong Yi; Vlachas, Pantelis; Koumoutsakos, Petros; Sapsis, Themistoklis
2018-01-01
The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data-streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data-streams that are integrated through a recurrent neural network (RNN) architecture. The reduced-order model has the form of projected equations into a low-dimensional subspace that still contains important dynamical information about the system and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data-streams, projected to the reduced-order space. The data-driven model assists the imperfect model in regions where data is available, while for locations where data is sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone.
Notably the improvement is more significant in regions associated with extreme events, where data is sparse. PMID:29795631
Analysis of Alerting System Failures in Commercial Aviation Accidents
NASA Technical Reports Server (NTRS)
Mumaw, Randall J.
2017-01-01
The role of an alerting system is to make the system operator (e.g., pilot) aware of an impending hazard or unsafe state so the hazard can be avoided or managed successfully. A review of 46 commercial aviation accidents (between 1998 and 2014) revealed that, in the vast majority of events, either the hazard was not alerted or relevant hazard alerting occurred but failed to aid the flight crew sufficiently. For this set of events, alerting system failures were placed in one of five phases: Detection, Understanding, Action Selection, Prioritization, and Execution. This study also reviewed the evolution of alerting system schemes in commercial aviation, which revealed naive assumptions about pilot reliability in monitoring flight path parameters; specifically, pilot monitoring was assumed to be more effective than it actually is. Examples are provided of the types of alerting system failures that have occurred, and recommendations are provided for alerting system improvements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borges, Raymond Charles; Beaver, Justin M; Buckner, Mark A
Power system disturbances are inherently complex and can be attributed to a wide range of sources, including both natural and man-made events. Currently, the power system operators are heavily relied on to make decisions regarding the causes of experienced disturbances and the appropriate course of action as a response. In the case of cyber-attacks against a power system, human judgment is less certain since there is an overt attempt to disguise the attack and deceive the operators as to the true state of the system. To enable the human decision maker, we explore the viability of machine learning as a means for discriminating types of power system disturbances, and focus specifically on detecting cyber-attacks where deception is a core tenet of the event. We evaluate various machine learning methods as disturbance discriminators and discuss the practical implications for deploying machine learning systems as an enhancement to existing power system architectures.
Flood damage: a model for consistent, complete and multipurpose scenarios
NASA Astrophysics Data System (ADS)
Menoni, Scira; Molinari, Daniela; Ballio, Francesco; Minucci, Guido; Mejri, Ouejdane; Atun, Funda; Berni, Nicola; Pandolfo, Claudia
2016-12-01
Effective flood risk mitigation requires the impacts of flood events to be much better and more reliably known than is currently the case. Available post-flood damage assessments usually supply only a partial vision of the consequences of floods, as they typically respond to the specific needs of a particular stakeholder. Consequently, they generally focus (i) on particular items at risk, (ii) on a certain time window after the occurrence of the flood, (iii) on a specific scale of analysis or (iv) on the analysis of damage only, without an investigation of damage mechanisms and root causes. This paper responds to the necessity of a more integrated interpretation of flood events as the basis for addressing the variety of needs arising after a disaster. In particular, a model is supplied to develop multipurpose, complete event scenarios. The model organizes available information after the event according to five logical axes. In this way, post-flood damage assessments can be developed that (i) are multisectoral, (ii) consider physical as well as functional and systemic damage, (iii) address the spatial scales that are relevant for the event at stake depending on the type of damage to be analyzed, i.e., direct, functional and systemic, (iv) consider the temporal evolution of damage and finally (v) allow damage mechanisms and root causes to be understood. All the above features are key for the multi-usability of the resulting flood scenarios. The model allows, on the one hand, the rationalization of efforts currently implemented in ex post damage assessments, also with the objective of better programming the financial resources that will be needed for these types of events in the future. On the other hand, integrated interpretations of flood events are fundamental to adapting and optimizing flood mitigation strategies on the basis of thorough forensic investigation of each event, as corroborated by the implementation of the model in a case study.
NASA Astrophysics Data System (ADS)
Tao, J.; Barros, A. P.
2014-01-01
Debris flows associated with rainstorms are a frequent and devastating hazard in the Southern Appalachians in the United States. Whereas warm-season events are clearly associated with heavy rainfall intensity, the same cannot be said for cold-season events. Instead, debris flows are related to large (cumulative) rainfall events independent of season, and thus of hydrometeorological regime. This suggests that the dynamics of subsurface hydrologic processes play an important role as a trigger mechanism, specifically through soil moisture redistribution by interflow. We further hypothesize that the transient mass fluxes associated with the temporal-spatial dynamics of interflow govern the timing of shallow landslide initiation and subsequent debris flow mobilization. The first objective of this study is to investigate this relationship. The second objective is to assess the physical basis for a regional coupled flood prediction and debris flow warning system. For this purpose, uncalibrated model simulations of well-documented debris flows in headwater catchments of the Southern Appalachians, using a 3-D surface-groundwater hydrologic model coupled with slope stability models, are examined in detail. Specifically, we focus on two vulnerable headwater catchments that experience frequent debris flows, Big Creek and Jonathan Creek in the Upper Pigeon River Basin, North Carolina, and three distinct weather systems: an extremely heavy summertime convective storm in 2011, a persistent winter storm lasting several days, and a severe winter storm in 2009. These events were selected due to the optimal availability of rainfall observations; the availability of detailed field surveys of the landslides shortly after they occurred, which can be used to evaluate model predictions; and because they are representative of events that cause major economic losses in the region.
The model results substantiate that interflow is a useful prognostic of conditions necessary for the initiation of slope instability, and should therefore be considered explicitly in landslide hazard assessments. Moreover, the relationships between slope stability and interflow are strongly modulated by the topography and catchment-specific geomorphologic features that determine subsurface flow convergence zones. The three case studies demonstrate the value of coupled prediction of flood response and debris flow initiation potential in the context of developing a regional hazard warning system.
Health Management Applications for International Space Station
NASA Technical Reports Server (NTRS)
Alena, Richard; Duncavage, Dan
2005-01-01
Traditional mission and vehicle management involves teams of highly trained specialists monitoring vehicle status and crew activities, responding rapidly to any anomalies encountered during operations. These teams work from the Mission Control Center and have access to engineering support teams with specialized expertise in International Space Station (ISS) subsystems. Integrated System Health Management (ISHM) applications can significantly augment these capabilities by providing enhanced monitoring, prognostic and diagnostic tools for critical decision support and mission management. The Intelligent Systems Division of NASA Ames Research Center is developing many prototype applications using model-based reasoning, data mining and simulation, working with Mission Control through the ISHM Testbed and Prototypes Project. This paper will briefly describe information technology that supports current mission management practice, and will extend this to a vision for future mission control workflow incorporating new ISHM applications. It will describe ISHM applications currently under development at NASA and will define technical approaches for implementing our vision of future human exploration mission management incorporating artificial intelligence and distributed web service architectures using specific examples. Several prototypes are under development, each highlighting a different computational approach. The ISStrider application allows in-depth analysis of Caution and Warning (C&W) events by correlating real-time telemetry with the logical fault trees used to define off-nominal events. The application uses live telemetry data and the Livingstone diagnostic inference engine to display the specific parameters and fault trees that generated the C&W event, allowing a flight controller to identify the root cause of the event from thousands of possibilities by simply navigating animated fault tree models on their workstation. 
SimStation models the functional power flow for the ISS Electrical Power System and can predict power balance for nominal and off-nominal conditions. SimStation uses real-time telemetry data to keep detailed computational physics models synchronized with the actual ISS power system state. In the event of a failure, the application can then rapidly diagnose the root cause, predict future resource levels and even correlate technical documents relevant to the specific failure. These advanced computational models will allow better insight into and more precise control of ISS subsystems, increasing safety margins by speeding up anomaly resolution and reducing engineering team effort and cost. This technology will make operating ISS more efficient and is directly applicable to next-generation exploration missions and Crew Exploration Vehicles.
Injury surveillance in multi-sport events: the International Olympic Committee approach.
Junge, A; Engebretsen, L; Alonso, J M; Renström, P; Mountjoy, M; Aubry, M; Dvorak, J
2008-06-01
The protection of athletes' health by preventing injuries is an important task for international sports federations. Standardised injury surveillance provides not only important epidemiological information, but also directions for injury prevention and the opportunity to monitor long-term changes in the frequency and circumstances of injury. Numerous studies have evaluated sports injuries during the season, but few have focused on injuries during major sport events such as World Championships, World Cups or the Olympic Games. To provide an injury surveillance system for multi-sport tournaments, using the 2008 Olympic Games in Beijing as an example. A group of experienced researchers reviewed existing injury report systems and developed a scientifically sound and concise injury surveillance system for large multi-sport events. The injury report system for multi-sport events is based on an established system for team sports tournaments and has proved feasible for individual sports during the International Association of Athletics Federations World Championships in Athletics 2007. The most important principles and advantages of the system are a comprehensive definition of injury, injury reporting by the physician responsible for the athlete, a single-page report of all injuries, and daily reporting irrespective of whether or not an injury occurred. The implementation of the injury surveillance system, all definitions, the report form, and the analysis of data are described in detail to enable other researchers to implement the injury surveillance system in any sports tournament. The injury surveillance system has been accepted by experienced team physicians and shown to be feasible for single-sport and multi-sport events. It can be modified depending on the specific objectives of a certain sport or research question; however, standardised use of injury definitions, report forms and methodology will ensure the comparability of results.
HSP70 induction during baculovirus infection
USDA-ARS?s Scientific Manuscript database
Baculoviruses are arthropod-specific double-stranded DNA viruses that have been employed as bio-insecticides against crop pests and to produce heterologous proteins in baculovirus expression systems. Although a consensus has emerged on the dominant molecular events driving baculovirus replication i...
NASA Astrophysics Data System (ADS)
Hui, Min; Cui, Zhaoxia; Liu, Yuan; Song, Chengwen
2017-07-01
In crabs, embryogenesis is a complicated developmental program marked by a series of critical events. RNA-Sequencing technology offers developmental biologists a way to identify many more developmental genes than ever before. Here, we present a comprehensive analysis of the transcriptomes of Eriocheir sinensis oosperms (Os) and embryos at the 2-4 cell stage (Cs), which are separated by a cleavage event. A total of 18,923 unigenes were identified, and 403 genes matched gene ontology (GO) terms related to developmental processes. In total, 432 differentially expressed genes (DEGs) were detected between the two stages. Nine DEGs were specifically expressed at only one stage. These DEGs may be relevant to stage-specific molecular events during development. A number of DEGs related to the 'hedgehog signaling pathway', 'Wnt signaling pathway', 'germplasm', 'nervous system', 'sensory perception' and 'segment polarity' were identified as being up-regulated at the Cs stage. The results suggest that these embryonic developmental events begin before the early cleavage event in crabs, and that many of the genes expressed in the two transcriptomes might be maternal genes. Our study provides ample information for further research on the molecular mechanisms underlying crab development.
ERIC Educational Resources Information Center
Zerr, Ryan
2007-01-01
An online homework system created for use by beginning calculus students is described. This system was designed with the specific goal of supporting student engagement outside of class by replicating the attempt-feedback-reattempt sequence of events which often occurs in a teacher's presence. Evidence is presented which indicates that this goal…
ERIC Educational Resources Information Center
Lichtman, Allan J.
2012-01-01
The Keys to the White House is a historically-based system for predicting the result of the popular vote in American presidential elections. The Keys system tracks the big picture of how well the party holding the White House has governed and does not shift with events of the campaign. This model gives specificity to the idea that it is…
Agile and Adaptive IT Ecosystem, Results, Outlook, and Recommendations
2014-06-01
http://netbeans.dzone.com/news/war-fighter-netbeans-platform http://www.afei.org/events/4A07/Documents/1-DI2E%20Brochure%20ISA_10APR13.pdf ...Client (NetBeans), and OSGi bundles (Karaf). DoD systems also use both the Android and iOS mobile operating systems. Each technology has a specific
Journot, Valérie; Tabuteau, Sophie; Collin, Fidéline; Molina, Jean-Michel; Chene, Geneviève; Rancinan, Corinne
2008-03-01
Since 2003, the Medical Dictionary for Regulatory Activities (MedDRA) has been the regulatory standard for safety reporting in clinical trials in the European Community. Yet, we found no published example of a practical experience of a scientifically oriented statistical analysis of events coded with MedDRA. We took advantage of a randomized trial in HIV-infected patients with MedDRA-coded events to explain the difficulties encountered during the analysis of events and the strategy developed to report events consistently with trial-specific objectives. MedDRA has a rich hierarchical structure, which allows the grouping of coded terms into 5 levels, the highest being "System Organ Class" (SOC). Each coded term may be related to several SOCs, among which one primary SOC is defined. We developed a new general 5-step strategy to select a SOC as the trial primary SOC, consistently with the trial-specific objectives for this analysis. We applied it to the ANRS 099 ALIZE trial, where all events were coded with MedDRA version 3.0. We compared the MedDRA and the ALIZE primary SOCs. In the ANRS 099 ALIZE trial, 355 patients were recruited, and 3,722 events were reported and documented, among which 35% had multiple SOCs (2 to 4). We applied the proposed 5-step strategy. Altogether, 23% of MedDRA primary SOCs were modified, mainly from the MedDRA primary SOCs "Investigations" (69%) and "Ear and labyrinth disorders" (6%), to the ALIZE primary SOCs "Hepatobiliary disorders" (35%), "Musculoskeletal and connective tissue disorders" (21%), and "Gastrointestinal disorders" (15%). MedDRA has grown considerably in size and complexity through versioning and the development of Standardized MedDRA Queries. Yet, statisticians should not systematically rely on the primary SOCs proposed by MedDRA to report events. A simple general 5-step strategy to re-classify events consistently with trial-specific objectives might be useful in HIV trials as well as in other fields.
EventThread: Visual Summarization and Stage Analysis of Event Sequence Data.
Guo, Shunan; Xu, Ke; Zhao, Rongwen; Gotz, David; Zha, Hongyuan; Cao, Nan
2018-01-01
Event sequence data such as electronic health records, a person's academic records, or car service records, are ordered series of events which have occurred over a period of time. Analyzing collections of event sequences can reveal common or semantically important sequential patterns. For example, event sequence analysis might reveal frequently used care plans for treating a disease, typical publishing patterns of professors, and the patterns of service that result in a well-maintained car. It is challenging, however, to visually explore large numbers of event sequences, or sequences with large numbers of event types. Existing methods focus on extracting explicitly matching patterns of events using statistical analysis to create stages of event progression over time. However, these methods fail to capture latent clusters of similar but not identical evolutions of event sequences. In this paper, we introduce a novel visualization system named EventThread which clusters event sequences into threads based on tensor analysis and visualizes the latent stage categories and evolution patterns by interactively grouping the threads by similarity into time-specific clusters. We demonstrate the effectiveness of EventThread through usage scenarios in three different application domains and via interviews with an expert user.
An overview of the heterogeneous telescope network system: Concept, scalability and operation
NASA Astrophysics Data System (ADS)
White, R. R.; Allan, A.
2008-03-01
In the coming decade there will be an avalanche of data streams devoted to astronomical exploration, opening new windows of scientific discovery. The sheer volume of data and the diversity of event types (Kantor 2006; Kaiser 2004; Vestrand & Theiler & Wozniak 2004) will necessitate both a move to a common language for the communication of event data and the enabling of telescope systems with the ability not simply to respond, but to act independently in order to take full advantage of available resources in a timely manner. Developed over the past three years, the Virtual Observatory Event (VOEvent) provides the best format for carrying these diverse event messages (White et al. 2006a; Seaman & Warner 2006). However, in order for telescopes to be able to act independently, a system of interoperable network nodes must be in place that will allow astronomical assets not only to issue event notifications, but also to coordinate and request specific observations. The Heterogeneous Telescope Network (HTN) is a network architecture that can achieve the goals set forth and provide a scalable design to match both fully autonomous and manual telescope system needs (Allan et al. 2006a; White et al. 2006b; Hessman 2006b). In this paper we show the design concept of this meta-network and its nodes, their scalable architecture and complexity, and how this concept can meet the needs of institutions in the near future.
Choosing MUSE: Validation of a Low-Cost, Portable EEG System for ERP Research
Krigolson, Olave E.; Williams, Chad C.; Norton, Angela; Hassall, Cameron D.; Colino, Francisco L.
2017-01-01
In recent years there has been an increase in the number of portable low-cost electroencephalographic (EEG) systems available to researchers. However, to date the validation of the use of low-cost EEG systems has focused on continuous recording of EEG data and/or the replication of large system EEG setups reliant on event-markers to afford examination of event-related brain potentials (ERP). Here, we demonstrate that it is possible to conduct ERP research without being reliant on event markers using a portable MUSE EEG system and a single computer. Specifically, we report the results of two experiments using data collected with the MUSE EEG system—one using the well-known visual oddball paradigm and the other using a standard reward-learning task. Our results demonstrate that we could observe and quantify the N200 and P300 ERP components in the visual oddball task and the reward positivity (the mirror opposite component to the feedback-related negativity) in the reward-learning task. Specifically, single sample t-tests of component existence (all p's < 0.05), computation of Bayesian credible intervals, and 95% confidence intervals all statistically verified the existence of the N200, P300, and reward positivity in all analyses. We provide with this research paper an open source website with all the instructions, methods, and software to replicate our findings and to provide researchers with an easy way to use the MUSE EEG system for ERP research. Importantly, our work highlights that with a single computer and a portable EEG system such as the MUSE one can conduct ERP research with ease thus greatly extending the possible use of the ERP methodology to a variety of novel contexts. PMID:28344546
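The "single sample t-tests of component existence" mentioned above can be sketched as follows. The amplitude values are hypothetical (the real study used MUSE recordings from the two reported tasks); the test simply asks whether the mean component amplitude across participants differs from zero.

```python
# One-sample t-test of ERP component existence: is the mean amplitude
# in a component's time window reliably different from zero?
# The amplitudes below are made-up demonstration values.
import math

def one_sample_t(xs, mu0=0.0):
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                            # standard error of the mean
    return (mean - mu0) / se                           # t statistic, df = n - 1

# Hypothetical P300 amplitudes (microvolts) for 10 participants
p300 = [4.1, 3.2, 5.0, 2.8, 4.4, 3.9, 5.2, 3.1, 4.7, 3.6]
t = one_sample_t(p300)
print(round(t, 2))
# With df = 9, |t| > 2.262 rejects the null of zero mean at alpha = .05
```

A component is declared "present" when this test rejects zero; the paper additionally reports Bayesian credible intervals and 95% confidence intervals for the same purpose.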
Cas9-Guide RNA Directed Genome Editing in Soybean
Li, Zhongsen; Liu, Zhan-Bin; Xing, Aiqiu; Moon, Bryan P.; Koellhoffer, Jessica P.; Huang, Lingxia; Ward, R. Timothy; Clifton, Elizabeth; Falco, S. Carl; Cigan, A. Mark
2015-01-01
The recently discovered bacterial and archaeal adaptive immune system, consisting of clustered regularly interspaced short palindromic repeats (CRISPR) and CRISPR-associated (Cas) endonucleases, has been explored for targeted genome editing in different species. Streptococcus pyogenes Cas9-guide RNA (gRNA) was successfully applied to generate targeted mutagenesis, gene integration, and gene editing in soybean (Glycine max). Two genomic sites, DD20 and DD43 on chromosome 4, were mutagenized with frequencies of 59% and 76%, respectively. Sequencing of randomly selected transgenic events confirmed that the genome modifications were specific to the Cas9-gRNA cleavage sites and consisted of small deletions or insertions. Targeted gene integrations through homology-directed recombination were detected by border-specific polymerase chain reaction analysis for both sites at the callus stage, and one DD43 homology-directed recombination event was transmitted to the T1 generation. T1 progenies of the integration event segregated according to Mendelian laws, and clean homozygous T1 plants with the donor gene precisely inserted at the DD43 target site were obtained. The Cas9-gRNA system was also successfully applied to make a directed P178S mutation of the acetolactate synthase1 gene through in planta gene editing. PMID:26294043
Rapid wave and storm surge warning system for tropical cyclones in Mexico
NASA Astrophysics Data System (ADS)
Appendini, C. M.; Rosengaus, M.; Meza, R.; Camacho, V.
2015-12-01
The National Hurricane Center (NHC) in Miami is responsible for the forecasting of tropical cyclones in the North Atlantic and Eastern North Pacific basins. As such, Mexico, Central America and Caribbean countries depend on the information issued by the NHC regarding the characteristics of a particular tropical cyclone and the associated watch and warning areas. Although waves and storm surge are important hazards for marine operations and coastal dwellings, their forecast is not part of the NHC's responsibilities. This work presents a rapid wave and storm surge warning system based on 3100 synthetic tropical cyclones making landfall in Mexico. Hydrodynamic and wave models were driven by the synthetic events to create a robust database composed of maximum envelopes of wind speed, significant wave height and storm surge for each event. The results were incorporated into a forecast system that uses the NHC advisory to locate the synthetic events passing within specified radii of the present and forecast positions of the real event. Using limited computer resources, the system displays the information meeting the search criteria, and the forecaster can select specific events to generate the desired hazard map (i.e., wind, waves, or storm surge) based on the maximum envelope maps. This system was developed in a limited time frame to be operational in 2015 by the National Hurricane and Severe Storms Unit of the Mexican National Weather Service, and represents a pilot project for other countries in the region not covered by detailed storm surge and wave forecasts.
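The event-lookup step, selecting synthetic cyclones whose tracks pass within a specified radius of the real storm's position, might look roughly like the sketch below. The synthetic track database, coordinates, and search radius are all invented for illustration; the operational system works against the 3100-event database and NHC advisory positions.

```python
# Sketch of radius-based retrieval of synthetic cyclone events around
# a real storm's current or forecast position. Data are hypothetical.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def events_within(db, lat, lon, radius_km):
    """IDs of synthetic events with any track point inside the radius."""
    hits = []
    for event_id, track in db.items():
        if any(haversine_km(lat, lon, tlat, tlon) <= radius_km
               for tlat, tlon in track):
            hits.append(event_id)
    return hits

# Hypothetical synthetic tracks: event id -> list of (lat, lon) points
db = {
    "syn_0001": [(18.0, -94.5), (19.2, -95.8)],
    "syn_0002": [(21.5, -86.9), (22.3, -88.0)],
}
print(events_within(db, 19.0, -95.5, 150.0))  # only syn_0001 passes nearby
```

The forecaster would then pull the precomputed maximum-envelope maps for the matching events rather than running the hydrodynamic models in real time.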
A Bayesian additive model for understanding public transport usage in special events.
Rodrigues, Filipe; Borysov, Stanislav; Ribeiro, Bernardete; Pereira, Francisco
2016-12-02
Public special events, like sports games, concerts and festivals, are well known to create disruptions in transportation systems, often catching the operators by surprise. Although these events are usually planned well in advance, their impact is difficult to predict, even when organisers and transportation operators coordinate. The problem becomes considerably harder when several events happen concurrently. To solve these problems, costly processes, heavily reliant on manual search and personal experience, are the usual practice in large cities like Singapore, London or Tokyo. This paper presents a Bayesian additive model with Gaussian process components that combines smart card records from public transport with context information about events that is continuously mined from the Web. We develop an efficient approximate inference algorithm using expectation propagation, which allows us to predict the total number of public transportation trips to the special event areas, thereby contributing to a more adaptive transportation system. Furthermore, for multiple concurrent event scenarios, the proposed algorithm is able to disaggregate gross trip counts into their most likely components related to specific events and routine behavior. Using real data from Singapore, we show that the presented model outperforms the best baseline model by up to 26% in R2 and also has explanatory power for its individual components.
IR Variability of Eta Carinae: The 2009 Event
NASA Astrophysics Data System (ADS)
Smith, Nathan
2008-08-01
Every 5.5 years, η Carinae experiences a dramatic "spectroscopic event" when high-excitation lines in its UV, optical, and IR spectrum disappear, and its hard X-ray and radio continuum flux crash. This periodicity has been attributed to an eccentric binary system with a shell ejection occurring at periastron, and the next periastron event will occur in January 2009. The last event, in June/July 2003, was poorly observed because the star was very low in the sky, but this next event is perfectly suited for an intense ground-based monitoring campaign. Mid-IR images and spectra with T-ReCS provide a direct measure of changes in the current bolometric luminosity and a direct measure of the mass in dust formation episodes that may occur at periastron in the colliding wind shock. Near-IR emission lines trace related changes in the post-event wind and ionization changes in the circumstellar environment needed to test specific models for the cause of η Car's variability as it recovers from its recent "event". Because the nebular geometry is known very well from previous observations in this program, monitoring the changes in nebular ionization will yield a 3-D map of the changing asymmetric UV radiation field geometry in the binary system, and the first estimate of the orientation of its orbit.
Event-Specific Cannabis Use and Use-Related Impairment: The Relationship to Campus Traditions
Buckner, Julia D.; Henslee, Amber M.; Jeffries, Emily R.
2015-01-01
Objective: Despite high rates of college cannabis use, little work has identified high-risk cannabis use events. For instance, Mardi Gras (MG) and St. Patrick’s Day (SPD) are characterized by more college drinking, yet it is unknown whether they are also related to greater cannabis use. Further, some campuses may have traditions that emphasize substance use during these events, whereas other campuses may not. Such campus differences may affect whether students use cannabis during specific events. The present study tested whether MG and SPD were related to more cannabis use at two campuses with different traditions regarding MG and SPD. Further, given that Campus A has specific traditions regarding MG whereas Campus B has specific traditions regarding SPD, cross-campus differences in event-specific use were examined. Method: Current cannabis-using undergraduates (N = 154) at two campuses completed an online survey of event-specific cannabis use and event-specific cannabis-related problems. Results: Participants used more cannabis during MG and SPD than during a typical weekday, typical day on which the holiday fell, and a holiday unrelated to cannabis use (Presidents’ Day). Among those who engaged in event-specific use, MG and SPD cannabis use was greater than typical weekend use. Campus differences were observed. For example, Campus A reported more cannabis-related problems during MG than SPD, whereas Campus B reported more problems during SPD than MG. Conclusions: Specific holidays were associated with more cannabis use and use-related problems. Observed between-campus differences indicate that campus traditions may affect event-specific cannabis use and use-related problems. PMID:25785793
Statistical evaluation of forecasts
NASA Astrophysics Data System (ADS)
Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J.; Timmer, Jens; Schelter, Björn
2014-08-01
Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed which intend to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independent of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating independently the sensitivity and specificity of a forecasting method. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.
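The sensitivity/specificity bookkeeping described above can be sketched with toy data: discretize time into bins, mark alarms and events, and count hits, misses, false alarms and correct rejections. The alarm and event series below are invented for illustration, not from the paper.

```python
# Confusion-matrix counts for an alarm-based forecaster evaluated on
# binned time series. alarms/events are equal-length 0/1 lists.

def confusion(alarms, events):
    tp = sum(a and e for a, e in zip(alarms, events))        # hits
    fn = sum((not a) and e for a, e in zip(alarms, events))  # missed events
    fp = sum(a and (not e) for a, e in zip(alarms, events))  # false alarms
    tn = sum((not a) and (not e) for a, e in zip(alarms, events))
    return tp, fn, fp, tn

def sensitivity(tp, fn):
    """Fraction of event bins that were alarmed."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of event-free bins that were correctly alarm-free."""
    return tn / (tn + fp)

alarms = [1, 0, 1, 1, 0, 0, 1, 0]
events = [1, 0, 0, 1, 0, 0, 1, 1]
tp, fn, fp, tn = confusion(alarms, events)
print(sensitivity(tp, fn), specificity(tn, fp))  # 0.75 0.75
```

A random predictor that alarms in each bin with probability p, independently of the events, has expected sensitivity p and expected specificity 1 - p; the paper's analytic framework formalizes this comparison so sensitivity and specificity can be validated separately.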
NASA Astrophysics Data System (ADS)
Cecchini, Micael A.; Machado, Luiz A. T.; Artaxo, Paulo
2014-06-01
This work aims to study typical Droplet Size Distributions (DSDs) for different types of precipitation systems and Cloud Condensation Nuclei concentrations over the Vale do Paraíba region in southeastern Brazil. Numerous instruments were deployed during the CHUVA (Cloud processes of tHe main precipitation systems in Brazil: a contribUtion to cloud resolVing modeling and to the GPM) Project in Vale do Paraíba campaign, from November 22, 2011 through January 10, 2012. Measurements of CCN (Cloud Condensation Nuclei) and total particle concentrations, along with measurements of rain DSDs and standard atmospheric properties, including temperature, pressure and wind intensity and direction, were specifically made in this study. The measured DSDs were parameterized with a gamma function using the moment method. The three gamma parameters were disposed in a 3-dimensional space, and subclasses were classified using cluster analysis. Seven DSD categories were chosen to represent the different types of DSDs. The DSD classes were useful in characterizing precipitation events both individually and as a group of systems with similar properties. The rainfall regime classification system was employed to categorize rainy events as local convective rainfall, organized convection rainfall and stratiform rainfall. Furthermore, the frequencies of the seven DSD classes were associated to each type of rainy event. The rainfall categories were also employed to evaluate the impact of the CCN concentration on the DSDs. In the stratiform rain events, the polluted cases had a statistically significant increase in the total rain droplet concentrations (TDCs) compared to cleaner events. An average concentration increase from 668 cm⁻³ to 2012 cm⁻³ for CCN at 1% supersaturation was found to be associated with an increase of approximately 87 m⁻³ in TDC for those events. For the local convection cases, polluted events presented a 10% higher mass weighted mean diameter (Dm) on average.
For the organized convection events, no significant results were found.
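The moment-method gamma fit mentioned above can be illustrated in simplified form. The study fits a three-parameter gamma function to measured DSDs; the two-parameter shape/scale version below, applied to synthetic "drop diameters", is a stand-in to show the idea of recovering gamma parameters from sample moments.

```python
# Simplified method-of-moments fit of a gamma distribution: recover
# shape and scale from the sample mean and variance. The "diameters"
# are synthetic draws from a known gamma, not real DSD data.
import random

def gamma_moment_fit(xs):
    """Method-of-moments shape (alpha) and scale (theta) estimates."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    shape = mean ** 2 / var  # alpha = mean^2 / variance
    scale = var / mean       # theta = variance / mean
    return shape, scale

random.seed(7)
# Synthetic sample from a known gamma(shape=3.0, scale=0.5)
sample = [random.gammavariate(3.0, 0.5) for _ in range(20000)]
shape, scale = gamma_moment_fit(sample)
print(round(shape, 2), round(scale, 2))  # estimates near 3 and 0.5
```

In the study, the fitted parameter triplets for all measured DSDs were then grouped by cluster analysis in the 3-dimensional parameter space to yield the seven DSD categories.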
Ben-Yoav, Hadar; Dykstra, Peter H; Bentley, William E; Ghodssi, Reza
2017-01-01
A microfluidic electrochemical lab-on-a-chip (LOC) device for DNA hybridization detection has been developed. The device comprises a 3 × 3 array of microelectrodes integrated with a dual layer microfluidic valved manipulation system that provides controlled and automated capabilities for high throughput analysis of microliter volume samples. The surface of the microelectrodes is functionalized with single-stranded DNA (ssDNA) probes which enable specific detection of complementary ssDNA targets. These targets are detected by a capacitive technique which measures dielectric variation at the microelectrode-electrolyte interface due to DNA hybridization events. A quantitative analysis of the hybridization events is carried out based on a sensing modeling that includes detailed analysis of energy storage and dissipation components. By calculating these components during hybridization events the device is able to demonstrate specific and dose response sensing characteristics. The developed microfluidic LOC for DNA hybridization detection offers a technology for real-time and label-free assessment of genetic markers outside of laboratory settings, such as at the point-of-care or in-field environmental monitoring.
Stressful life events and the risk of initial central nervous system demyelination.
Saul, Alice; Ponsonby, Anne-Louise; Lucas, Robyn M; Taylor, Bruce V; Simpson, Steve; Valery, Patricia; Dwyer, Terence; Kilpatrick, Trevor J; Pender, Michael P; van der Mei, Ingrid Af
2017-06-01
There is substantial evidence that stress increases multiple sclerosis disease activity, but limited evidence on its association with the onset of multiple sclerosis. To examine the association between stressful life events and risk of first demyelinating event (FDE). This was a multicentre incident case-control study. Cases (n = 282 with first diagnosis of central nervous system (CNS) demyelination, including n = 216 with 'classic FDE') were aged 18-59 years. Controls without CNS demyelination (n = 558) were matched to cases on age, sex and study region. Stressful life events were assessed using a questionnaire based on the Social Readjustment Rating Scale. Those who suffered from a serious illness in the previous 12 months were more likely to have an FDE (odds ratio (OR) = 2.35 (1.36, 4.06), p = 0.002), and when we limited our reference group to those who had no stressful life events, the magnitude of effect became stronger (OR = 5.41 (1.80, 16.28)). The total stress number and stress load were not convincingly associated with the risk of an FDE. Cases were more likely to report a serious illness in the previous 12 months, which could suggest that a non-specific illness provides an additional strain to an already predisposed immune system.
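The odds ratios with confidence intervals reported above follow standard case-control arithmetic; a minimal sketch (the 2x2 counts here are illustrative, not the study's raw data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only
or_, lo, hi = odds_ratio_ci(20, 10, 10, 20)
```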
Ford, Jaclyn Hennessey; Addis, Donna Rose; Giovanello, Kelly S.
2011-01-01
Previous neuroimaging studies that have examined autobiographical memory specificity have utilized retrieval cues associated with prior searches of the event, potentially changing the retrieval processes being investigated. In the current study, musical cues were used to naturally elicit memories from multiple levels of specificity (i.e., lifetime period, general event, and event-specific). Sixteen young adults participated in a neuroimaging study in which they retrieved autobiographical memories associated with musical cues. These musical cues led to the retrieval of highly emotional memories that had low levels of prior retrieval. Retrieval of all autobiographical memory levels was associated with activity in regions in the autobiographical memory network, specifically the ventromedial prefrontal cortex, posterior cingulate, and right medial temporal lobe. Owing to the use of music, memories from varying levels of specificity were retrieved, allowing for comparison of event memory and abstract personal knowledge, as well as comparison of specific and general event memory. Dorsolateral and dorsomedial prefrontal regions were engaged during event retrieval relative to personal knowledge retrieval, and retrieval of specific event memories was associated with increased activity in the bilateral medial temporal lobe and dorsomedial prefrontal cortex relative to retrieval of general event memories. These results suggest that the initial search processes for memories of different specificity levels preferentially engage different components of the autobiographical memory network. The potential underlying causes of these neural differences are discussed. PMID:21600227
Results of the first continuous meteor head echo survey at polar latitudes
NASA Astrophysics Data System (ADS)
Schult, Carsten; Stober, Gunter; Janches, Diego; Chau, Jorge L.
2017-11-01
We present the first quasi-continuous meteor head echo measurements obtained during a period of over two years using the Middle Atmosphere ALOMAR Radar System (MAARSY). The measurements yield information on the altitude, trajectory, vector velocity, radar cross section, deceleration and dynamical mass of every single event. The large statistical sample of nearly one million meteor head detections provides an excellent overview of the elevation, altitude, velocity and daily count rate distributions during different times of the year at polar latitudes. Only 40% of the meteors were detected within the full width at half maximum of the specific sporadic meteor sources. Our observations of the sporadic meteors are compared to observations with other radar systems and a meteor input function (MIF). The best way to compare different radar systems is by comparing the radar cross section (RCS), which is the main detection criterion for each system. In this study we aim to compare our observations with a MIF, which provides information only about the meteoroid mass. Thus, we use a statistical approach for the elevation- and velocity-dependent visibility and a specific mass selection. The predicted absolute count rates from the MIF are in good agreement with the observations when it is assumed that the radar system is only sensitive to meteoroids with masses higher than one microgram. The analysis of the dynamical masses is consistent with this assumption, since the count rate of events with smaller masses is low and decreases even further when only events with relatively small errors are used.
Flexible data-management system
NASA Technical Reports Server (NTRS)
Pelouch, J. J., Jr.
1977-01-01
Combined ASRDI Data-Management and Analysis Technique (CADMAT) is a system of computer programs and procedures that can be used to conduct data-management tasks. The system was developed specifically for use by scientists and engineers who are confronted with the management and analysis of large quantities of data organized into records of events and parametric fields. CADMAT is particularly useful when data are continually accumulated and the need for retrieval and analysis is ongoing.
NASA Astrophysics Data System (ADS)
Skrzypek, Josef; Mesrobian, Edmond; Gungner, David J.
1989-03-01
The development of autonomous land vehicles (ALV) capable of operating in an unconstrained environment has proven to be a formidable research effort. The unpredictability of events in such an environment calls for the design of a robust perceptual system, an impossible task requiring the programming of a system based on the expectation of future, unconstrained events. Hence, the need for a "general purpose" machine vision system that is capable of perceiving and understanding images in an unconstrained environment in real-time. The research undertaken at the UCLA Machine Perception Laboratory addresses this need by focusing on two specific issues: 1) the long term goals for machine vision research as a joint effort between the neurosciences and computer science; and 2) a framework for evaluating progress in machine vision. In the past, vision research has been carried out independently within different fields including neurosciences, psychology, computer science, and electrical engineering. Our interdisciplinary approach to vision research is based on the rigorous combination of computational neuroscience, as derived from neurophysiology and neuropsychology, with computer science and electrical engineering. The primary motivation behind our approach is that the human visual system is the only existing example of a "general purpose" vision system and, using a neurally based computing substrate, it can complete all necessary visual tasks in real-time.
[Consensus conference on providing information of adverse events to patients and relatives].
Martín-Delgado, M C; Fernández-Maillo, M; Bañeres-Amella, J; Campillo-Artero, C; Cabré-Pericas, L; Anglés-Coll, R; Gutiérrez-Fernández, R; Aranaz-Andrés, J M; Pardo-Hernández, A; Wu, A
2013-01-01
To develop recommendations regarding «Information about adverse events to patients and their families», through the implementation of a consensus conference. A literature review was conducted to identify all relevant articles, the major policies and international guidelines, and the specific legislation developed in some countries on this process. The literature review was the basis for responding to a series of questions posed in a public session. A group of experts presented the best available evidence, interacting with stakeholders. At the end of the session, an interdisciplinary and multi-professional jury established the final recommendations of the consensus conference. The main recommendations advocate the need to develop policies and institutional guidelines in our field, favouring the patient adverse events disclosure process. The recommendations emphasize the need for the training of professionals in communication skills and patient safety, as well as the development of strategies for supporting professionals who are involved in an adverse event. The assessment of the interest and impact of specific legislation that would help the implementation of these policies was also considered. A cultural change is needed at all levels, nuanced and adapted to the specific social and cultural aspects of our social and health spheres, and involves all stakeholders in the system to create a framework of trust and credibility in which the processing of information about adverse events may become effective.
Surveillance of adverse effects following vaccination and safety of immunization programs.
Waldman, Eliseu Alves; Luhm, Karin Regina; Monteiro, Sandra Aparecida Moreira Gomes; Freitas, Fabiana Ramos Martin de
2011-02-01
The aim of the review was to analyze conceptual and operational aspects of systems for surveillance of adverse events following immunization. Articles available in electronic format were included, published between 1985 and 2009, selected from the PubMed/Medline databases using the key words "adverse events following vaccine surveillance", "post-marketing surveillance", "safety vaccine" and "Phase IV clinical trials". Articles focusing on specific adverse events were excluded. The major aspects underlying the Public Health importance of adverse events following vaccination, the instruments aimed at ensuring vaccine safety, and the purpose, attributes, types, data interpretation issues, limitations, and further challenges of surveillance of adverse events following immunization were described, as well as strategies to improve sensitivity. The review concluded by discussing the challenges to be faced in coming years with respect to ensuring the safety and reliability of vaccination programs.
Action-based flood forecasting for triggering humanitarian action
NASA Astrophysics Data System (ADS)
Coughlan de Perez, Erin; van den Hurk, Bart; van Aalst, Maarten K.; Amuron, Irene; Bamanya, Deus; Hauser, Tristan; Jongma, Brenden; Lopez, Ana; Mason, Simon; Mendler de Suarez, Janot; Pappenberger, Florian; Rueth, Alexandra; Stephens, Elisabeth; Suarez, Pablo; Wagemaker, Jurjen; Zsoter, Ervin
2016-09-01
Too often, credible scientific early warning information of increased disaster risk does not result in humanitarian action. With financial resources tilted heavily towards response after a disaster, disaster managers have limited incentive and ability to process complex scientific data, including uncertainties. These incentives are beginning to change, with the advent of several new forecast-based financing systems that provide funding based on a forecast of an extreme event. Given the changing landscape, here we demonstrate a method to select and use appropriate forecasts for specific humanitarian disaster prevention actions, even in a data-scarce location. This action-based forecasting methodology takes into account the parameters of each action, such as action lifetime, when verifying a forecast. Forecasts are linked with action based on an understanding of (1) the magnitude of previous flooding events and (2) the willingness to act "in vain" for specific actions. This is applied in the context of the Uganda Red Cross Society forecast-based financing pilot project, with forecasts from the Global Flood Awareness System (GloFAS). Using this method, we define the "danger level" of flooding, and we select the probabilistic forecast triggers that are appropriate for specific actions. Results from this methodology can be applied globally across hazards and fed into a financing system that ensures that automatic, pre-funded early action will be triggered by forecasts.
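The trigger-selection idea above — pick the lowest forecast probability threshold whose historical false-alarm behavior stays within the stated willingness to act "in vain" — can be sketched as follows. This is an assumed simplification of the paper's methodology, using synthetic forecast-observation pairs:

```python
def select_trigger(history, far_tolerance):
    """Pick the lowest probability threshold whose false alarm ratio (FAR)
    over historical (forecast_prob, flood_observed) pairs is <= tolerance.

    FAR = false alarms / total triggers. Lower thresholds trigger earlier
    and more often, at the cost of more actions taken "in vain".
    """
    for thr in sorted({p for p, _ in history}):
        triggers = [(p, obs) for p, obs in history if p >= thr]
        if not triggers:
            continue
        false_alarms = sum(1 for _, obs in triggers if not obs)
        if false_alarms / len(triggers) <= far_tolerance:
            return thr
    return None  # no threshold meets the stated tolerance

# Synthetic hindcast record: (forecast probability, flood observed?)
history = [(0.2, False), (0.4, False), (0.5, True),
           (0.6, True), (0.7, False), (0.9, True)]
thr = select_trigger(history, far_tolerance=1 / 3)
```

With a higher tolerance for acting in vain, the selected threshold drops and action is triggered earlier, which is the trade-off each humanitarian action's parameters (such as its lifetime) must resolve.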
Monitoring risk: post marketing surveillance and signal detection.
Dart, Richard C
2009-12-01
The primary goal of postmarketing surveillance is to provide information for risk assessment of a drug. Drugs affecting the central nervous system form a unique group of products for surveillance because they are often misused, abused, and diverted. These medications include opioid analgesics, stimulants, sedative-hypnotics, muscle relaxants, anticonvulsants and other drug classes. Their adverse events are difficult to monitor because the perpetrator often attempts to conceal the misuse, abuse and diversion of the product. A postmarketing surveillance system for prescription drugs of abuse in the U.S. should include product specific information that is accurate, immediately available, geographically specific and includes all areas of the country. Most producers of branded opioid analgesic products have created systems that measure abuse from multiple vantage points: criminal justice, treatment professionals, susceptible patient populations and acute health events. In the past, the U.S. government has not established similar requirements for the same products produced by generic manufacturers. However, the Food and Drug Administration Amendments Act of 2007 includes generic opioid analgesic products by requiring that all products containing potent opioid drugs perform rigorous surveillance and risk management. While general risk management guidance has been developed by FDA, more specific analyses and guidance are needed to improve surveillance methodology for drugs which are misused, abused, diverted.
Neural bases of event knowledge and syntax integration in comprehension of complex sentences.
Malaia, Evie; Newman, Sharlene
2015-01-01
Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate cortical bases of the semantic effect on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by means of consolidating the event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.
NASA Astrophysics Data System (ADS)
Do, T. D.; Pifer, A.; Chowdhury, Z.; Wahman, D.; Zhang, W.; Fairey, J.
2017-12-01
Detection of nitrification events in chloraminated drinking water distribution systems remains an ongoing challenge for many drinking water utilities, including Dallas Water Utilities (DWU) and the City of Houston (CoH). Each year, these utilities experience nitrification events that necessitate extensive flushing, resulting in the loss of billions of gallons of finished water. Biological techniques used to quantify the activity of nitrifying bacteria are impractical for real-time monitoring because they require significant laboratory efforts and/or lengthy incubation times. At present, DWU and CoH regularly rely on physicochemical parameters including total chlorine and monochloramine residual, and free ammonia, nitrite, and nitrate as indicators of nitrification, but these metrics lack specificity to nitrifying bacteria. To improve detection of nitrification in chloraminated drinking water distribution systems, we seek to develop a real-time fluorescence-based sensor system to detect the early onset of nitrification events by measuring the fluorescence of soluble microbial products (SMPs) specific to nitrifying bacteria. Preliminary data indicates that fluorescence-based metrics have the sensitivity to detect these SMPs in the early stages of nitrification, but several remaining challenges will be explored in this presentation. We will focus on benchtop and sensor results from ongoing batch and annular reactor experiments designed to (1) identify fluorescence wavelength pairs and data processing techniques suitable for measurement of SMPs from nitrification and (2) assess and correct potential interferences, such as those from monochloramine, pH, iron, nitrite, nitrate and humic substances. This work will serve as the basis for developing fluorescence sensor packages for full-scale testing and validation in the DWU and CoH systems. 
Findings from this research could be leveraged to identify nitrification events in their early stages, facilitating proactive interventions and decreasing the severity and frequency of nitrification episodes and water loss due to flushing.
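A simple way to flag the "early onset" the abstract targets is a baseline-plus-k-sigma rule on a fluorescence time series; this detection rule is an assumption for illustration, not the authors' sensor algorithm:

```python
import statistics

def onset_index(series, baseline_n=10, k=3.0):
    """Return the index of the first sample exceeding baseline mean + k*stdev.

    `series` is a fluorescence intensity time series; the first `baseline_n`
    samples are assumed nitrification-free and define the baseline.
    """
    base = series[:baseline_n]
    limit = statistics.mean(base) + k * statistics.stdev(base)
    for i in range(baseline_n, len(series)):
        if series[i] > limit:
            return i
    return None  # no exceedance detected

# Synthetic series: flat baseline, then rising SMP fluorescence
series = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1, 0.9, 1.0,
          1.0, 1.2, 2.5, 3.0]
idx = onset_index(series)
```

In practice the interferences listed above (monochloramine, pH, iron, nitrite, nitrate, humic substances) would have to be corrected for before any such thresholding.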
Westerhoff, P.; Anning, D.
2000-01-01
The influence of urbanization, which is becoming increasingly common in arid regions, on dissolved organic carbon (DOC) concentrations in surface water resources was studied. Dissolved (DOC) and total (TOC) organic carbon concentrations and compositions were studied for several river systems in Arizona, USA. DOC composition was characterized by ultraviolet and visible absorption and fluorescence emission (excitation wavelength of 370 nm) spectral characteristics. Ephemeral sites had the highest DOC concentrations, and unregulated perennial sites had lower concentrations than unregulated intermittent sites, regulated sites, and sites downstream from wastewater-treatment plants (p < 0.05). Reservoir outflows and wastewater-treatment plant effluent were higher in DOC concentration (p < 0.05) and exhibited less variability in concentration than inflows to the reservoirs. Specific ultraviolet absorbance values at 254 nm were typically less than 2 m-1 (milligram DOC per liter)-1, lower than values found in most temperate-region rivers, but they increased during runoff events. Fluorescence measurements indicated that DOC in desert streams typically exhibits characteristics of autochthonous sources; however, DOC in unregulated upland rivers and desert streams experienced sudden shifts from autochthonous to allochthonous sources during runoff events. The urban water system (reservoir systems and wastewater-treatment plants) was found to affect temporal variability in DOC concentration and composition.
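Specific UV absorbance at 254 nm (SUVA254), the metric reported above in m-1 (mg DOC per liter)-1, is conventionally the UV absorbance (converted from per-cm to per-m) normalized by DOC concentration:

```python
def suva254(abs254_per_cm, doc_mg_per_l):
    """Specific UV absorbance at 254 nm, in m^-1 (mg DOC per liter)^-1.

    Multiplies the cm^-1 absorbance by 100 to convert the path length
    to m^-1, then divides by the DOC concentration.
    """
    return abs254_per_cm * 100.0 / doc_mg_per_l

# A low-SUVA sample consistent with the < 2 values reported for these rivers
value = suva254(0.045, 3.0)  # 0.045 cm^-1 absorbance, 3.0 mg/L DOC
```

Low SUVA values indicate less aromatic, more autochthonous organic matter, which is why the runoff-driven increases above signal a shift toward allochthonous sources.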
Expert systems and simulation models; Proceedings of the Seminar, Tucson, AZ, November 18, 19, 1985
NASA Technical Reports Server (NTRS)
1986-01-01
The seminar presents papers on modeling and simulation methodology, artificial intelligence and expert systems, environments for simulation/expert system development, and methodology for simulation/expert system development. Particular attention is given to simulation modeling concepts and their representation, modular hierarchical model specification, knowledge representation, and rule-based diagnostic expert system development. Other topics include the combination of symbolic and discrete event simulation, real time inferencing, and the management of large knowledge-based simulation projects.
1989-12-29
1.1.2. Special Performance Criteria for Gamma Ray Spectrometers
1.1.3. Special Criteria for Space-Based Spectrometer Systems
1.1.4. Prior Approaches...
...calculations were performed for selected incident gamma ray energies and were used to generate tabular and graphical listings of gamma scattering results. The... generated. These output presentations were studied to identify behavior patterns of "good" and "bad" event sequences. For the specific gamma energy
An Introduction to the NCHEMS Costing and Data Management System. Technical Report No. 55.
ERIC Educational Resources Information Center
Haight, Mike; Martin, Ron
The NCHEMS Costing and Data Management System is designed to assist institutions in the implementation of cost studies. There are at least two kinds of cost studies: historical cost studies which display cost-related data that reflect actual events over a specific prior time period, and predictive cost studies which forecast costs that will be…
The knowledge-based framework for a nuclear power plant operator advisor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, D.W.; Hajek, B.K.
1989-01-01
An important facet in the design, development, and evaluation of aids for complex systems is the identification of the tasks performed by the operator. Operator aids utilizing artificial intelligence, or more specifically knowledge-based systems, require identification of these tasks in the context of a knowledge-based framework. In this context, the operator responses to the plant behavior are to monitor and comprehend the state of the plant, identify normal and abnormal plant conditions, diagnose abnormal plant conditions, predict plant response to specific control actions, select the best available control action, implement a feasible control action, monitor system response to the control action, and correct for any inappropriate responses. These tasks have been identified to formulate a knowledge-based framework for an operator advisor under development at Ohio State University that utilizes the generic task methodology proposed by Chandrasekaran. The paper lays the foundation to identify the responses as a knowledge-based set of tasks in accordance with the expected human operator responses during an event. Initial evaluation of the expert system indicates the potential for an operator aid that will improve the operator's ability to respond to both anticipated and unanticipated events.
Addressing the Need for Independence in the CSE Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abercrombie, Robert K; Ferragut, Erik M; Sheldon, Frederick T
2011-01-01
Information system security risk, defined as the product of the monetary losses associated with security incidents and the probability that they occur, is a suitable decision criterion when considering different information system architectures. Risk assessment is the widely accepted process used to understand, quantify, and document the effects of undesirable events on organizational objectives so that risk management, continuity of operations planning, and contingency planning can be performed. One technique, the Cyberspace Security Econometrics System (CSES), is a methodology for estimating security costs to stakeholders as a function of possible risk postures. In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain as a result of security breakdowns. Additional work has applied CSES to specific business cases. The current state-of-the-art of CSES addresses independent events. In typical usage, analysts create matrices that capture their expert opinion, and then use those matrices to quantify costs to stakeholders. This expansion generalizes CSES to the common real-world case where events may be dependent.
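For the independent-events case described above, the loss-times-probability risk definition reduces to a matrix-vector product over stakeholders and events; a minimal sketch (the matrices and numbers are illustrative, not CSES's actual business-case data):

```python
def expected_losses(loss_matrix, event_probs):
    """Expected monetary loss per stakeholder, assuming independent events.

    loss_matrix[s][e] is stakeholder s's loss if security event e occurs;
    event_probs[e] is the probability that event e occurs.
    """
    return [sum(loss * p for loss, p in zip(row, event_probs))
            for row in loss_matrix]

# Two stakeholders, three independent security events (illustrative)
losses = [[100_000.0, 20_000.0, 0.0],
          [5_000.0, 80_000.0, 10_000.0]]
probs = [0.01, 0.05, 0.10]
risk = expected_losses(losses, probs)
```

The dependent-events generalization the abstract announces would replace the per-event probabilities with a joint distribution, so the simple sum above no longer applies.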
The heparanome--the enigma of encoding and decoding heparan sulfate sulfation.
Lamanna, William C; Kalus, Ina; Padva, Michael; Baldwin, Rebecca J; Merry, Catherine L R; Dierks, Thomas
2007-04-30
Heparan sulfate (HS) is a cell surface carbohydrate polymer modified with sulfate moieties whose highly ordered composition is central to directing specific cell signaling events. The ability of the cell to generate these information rich glycans with such specificity has opened up a new field of "heparanomics" which seeks to understand the systems involved in generating these cell type and developmental stage specific HS sulfation patterns. Unlike other instances where biological information is encrypted as linear sequences in molecules such as DNA, HS sulfation patterns are generated through a non-template driven process. Thus, deciphering the sulfation code and the dynamic nature of its generation has posed a new challenge to systems biologists. The recent discovery of two sulfatases, Sulf1 and Sulf2, with the unique ability to edit sulfation patterns at the cell surface, has opened up a new dimension as to how we understand the regulation of HS sulfation patterning and pattern-dependent cell signaling events. This review will focus on the functional relationship between HS sulfation patterning and biological processes. Special attention will be given to Sulf1 and Sulf2 and how these key editing enzymes might act in concert with the HS biosynthetic enzymes to generate and regulate specific HS sulfation patterns in vivo. We will further explore the use of knock out mice as biological models for understanding the dynamic systems involved in generating HS sulfation patterns and their biological relevance. A brief overview of new technologies and innovations summarizes advances in the systems biology field for understanding non-template molecular networks and their influence on the "heparanome".
Advocates and critics for tactical behaviors in UGV navigation
NASA Astrophysics Data System (ADS)
Hussain, Talib S.; Vidaver, Gordon; Berliner, Jeffrey
2005-05-01
Critical to the development of unmanned ground vehicle platforms is the incorporation of adaptive tactical behaviors for the planning of high-level navigation and tactical actions. BBN Technologies recently completed a simulation-based project for the Army Research Lab (ARL) in which we applied an evolutionary computation approach to navigating through a terrain to capture flag objectives while faced with one or more mobile enemies. Our Advocates and Critics for Tactical Behaviors (ACTB) system evolves plans for the vehicle that control its movement goals (in the form of waypoints), and its future actions (e.g., pointing cameras). We apply domain-specific, state-dependent genetic operators called advocates that promote specific tactical behaviors (e.g., adapt a plan to stay closer to walls). We define the fitness function as a weighted sum of a number of independent, domain-specific, state-dependent evaluation components called critics. Critics reward plans based upon specific tactical criteria, such as minimizing risk of exposure or time to the flags. Additionally, the ACTB system provides the capability for a human commander to specify the "rules of engagement" under which the vehicle will operate. The rules of engagement determine the planning emphasis required under different tactical situations (e.g., discovery of an enemy), and provide a mechanism for automatically adapting the relative selection probabilities of the advocates, the weights of the critics, and the depth of planning in response to tactical events. The ACTB system demonstrated highly effective performance in a head-to-head testing event, held by ARL, against two competing tactical behavior systems.
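The critics' weighted-sum fitness, with weights shifted by the rules of engagement, can be sketched as follows; the critic functions and field names are stand-ins for illustration, not BBN's implementation:

```python
def plan_fitness(plan, critics, weights):
    """Weighted sum of independent critic scores for a candidate plan.

    Each critic maps a plan to a score in [0, 1]; the rules of engagement
    set the weights, shifting planning emphasis between tactical criteria.
    """
    return sum(w * critic(plan) for critic, w in zip(critics, weights))

# Stand-in critics: reward low exposure risk and short time-to-flag
exposure_critic = lambda plan: 1.0 - plan["exposure_risk"]
speed_critic = lambda plan: 1.0 - plan["time_to_flag"]

plan = {"exposure_risk": 0.2, "time_to_flag": 0.5}
critics = [exposure_critic, speed_critic]
stealthy = plan_fitness(plan, critics, [0.8, 0.2])    # emphasis on cover
aggressive = plan_fitness(plan, critics, [0.2, 0.8])  # emphasis on speed
```

A tactical event such as enemy discovery would, in this scheme, simply swap in a new weight vector (and advocate selection probabilities) rather than a new fitness function.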
Pinpointing Predation Events: A different molecular approach.
USDA-ARS?s Scientific Manuscript database
A glassy-winged sharpshooter (GWSS), Homalodisca vitripennis, protein marking system has been developed as a diagnostic tool for quantifying predation rates via gut content analysis. A field study was conducted to quantify predation rates on each of the GWSS life stages. Specifically, two GWSS nymp...
NASA Astrophysics Data System (ADS)
Denny, Ellen; Miller-Rushing, Abraham; Haggerty, Brian; Wilson, Bruce; Weltzin, Jake
2010-05-01
The USA National Phenology Network (www.usanpn.org) has recently initiated a national effort to encourage people at different levels of expertise—from backyard naturalists to professional scientists—to observe phenological events and contribute to a national database that will be used to greatly improve our understanding of spatio-temporal variation in phenology and associated phenological responses to climate change. Traditional phenological observation protocols identify specific single dates at which individual phenological events are observed, but the scientific usefulness of long-term phenological observations can be improved with a more carefully structured protocol. At the USA-NPN we have developed a new approach that directs observers to record each day that they observe an individual plant, and to assess and report the state of specific life stages (or phenophases) as occurring or not occurring on that plant for each observation date. Evaluation is phrased in terms of simple, easy-to-understand questions (e.g. "Do you see open flowers?"), which makes it very appropriate for a broad audience. From this method, a rich dataset of phenological metrics can be extracted, including the duration of a phenophase (e.g. open flowers), the beginning and end points of a phenophase (e.g. traditional phenological events such as first flower and last flower), multiple distinct occurrences of phenophases within a single growing season (e.g. multiple flowering events, common in drought-prone regions), as well as quantification of sampling frequency and observational uncertainties. The system also includes a mechanism for translation of phenophase start and end points into standard traditional phenological events to facilitate comparison of contemporary data collected with this new "phenophase status" monitoring approach to historical datasets collected with the "phenological event" monitoring approach.
These features greatly enhance the utility of the resulting data for statistical analyses addressing questions such as how phenological events vary in time and space, and in response to global change.
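Deriving traditional event dates from phenophase-status records reduces to scanning the daily yes/no observations; a minimal sketch (the record layout is assumed for illustration):

```python
from datetime import date

def phenophase_metrics(observations):
    """Onset, end, and duration (days) of a phenophase from status records.

    `observations` is a list of (date, status) pairs, where status is True
    if the phenophase (e.g. "open flowers") was occurring on that visit.
    """
    yes_dates = sorted(d for d, status in observations if status)
    if not yes_dates:
        return None  # phenophase never observed
    onset, end = yes_dates[0], yes_dates[-1]
    return onset, end, (end - onset).days + 1

# Weekly visits to one plant; flowers seen on the middle two visits
obs = [(date(2010, 4, 1), False), (date(2010, 4, 8), True),
       (date(2010, 4, 15), True), (date(2010, 4, 22), False)]
metrics = phenophase_metrics(obs)
```

The surrounding "no" observations are what bound the uncertainty on the onset and end dates, which is the advantage of status monitoring over single-date event reports.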
VME rollback hardware for time warp multiprocessor systems
NASA Technical Reports Server (NTRS)
Robb, Michael J.; Buzzell, Calvin A.
1992-01-01
The purpose of the research effort is to develop and demonstrate innovative hardware to implement specific rollback and timing functions required for efficient queue management and precision timekeeping in multiprocessor discrete event simulations. The previously completed phase 1 effort demonstrated the technical feasibility of building hardware modules which eliminate the state saving overhead of the Time Warp paradigm used in distributed simulations on multiprocessor systems. The current phase 2 effort will build multiple pre-production rollback hardware modules integrated with a network of Sun workstations, and the integrated system will be tested by executing a Time Warp simulation. The rollback hardware will be designed to interface with the greatest number of multiprocessor systems possible. The authors believe that the rollback hardware will provide for significant speedup of large scale discrete event simulation problems and allow multiprocessors using Time Warp to dramatically increase performance.
An Ultralow-Power Sleep Spindle Detection System on Chip.
Iranmanesh, Saam; Rodriguez-Villegas, Esther
2017-08-01
This paper describes a full system-on-chip to automatically detect sleep spindle events from scalp EEG signals. These events, which are known to play an important role in memory consolidation during sleep, are also characteristic of a number of neurological diseases. The operation of the system is based on a previously reported algorithm, which used the Teager energy operator together with the spectral edge frequency (SEF50) to achieve more than 70% sensitivity and 98% specificity. The algorithm is now converted into a customized analog hardware implementation in order to achieve extremely low levels of power. Experimental results show that the system, which is fabricated in a 0.18 μm CMOS technology, is able to operate from a 1.25 V power supply consuming only 515 nW, with an accuracy that is comparable to its software counterpart.
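The discrete Teager energy operator used by the algorithm has a simple closed form, psi[n] = x[n]² − x[n−1]·x[n+1], which tracks the instantaneous amplitude-frequency energy of a signal. A minimal software sketch follows (the chip implements an analog version; the sampling rate and test tone here are illustrative assumptions, not the paper's parameters). For a pure sinusoid A·sin(ωn), the operator yields the constant A²·sin²(ω), which the example confirms.

```python
import math

def teager_energy(x):
    """Discrete Teager energy operator: psi[n] = x[n]^2 - x[n-1]*x[n+1].
    A high, sustained output marks bursts of oscillatory activity such as
    sleep spindles in the EEG."""
    return [x[n] ** 2 - x[n - 1] * x[n + 1] for n in range(1, len(x) - 1)]

fs = 100.0  # assumed sampling rate in Hz (illustrative only)
# 13 Hz tone, i.e. inside the classical spindle band (~11-16 Hz)
sig = [math.sin(2 * math.pi * 13 * n / fs) for n in range(200)]
psi = teager_energy(sig)
# For a unit sinusoid the operator output is the constant sin(omega)^2
```

A real detector would smooth psi and compare it against a threshold, and (as in the reported algorithm) combine it with a spectral feature such as SEF50 to reject non-spindle artifacts.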
ERIC Educational Resources Information Center
Streibel, Michael J.
This paper discusses the implications of Lucy Suchman's conclusion that a theory of situated action--i.e., the actual sense that specific users make out of specific Xeroxing events--is truer to the lived experience of Xerox users than a cognitive account of the user's plans--e.g., the hierarchy of subprocedures for how Xerox machines should be…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sattison, M.B.; Schroeder, J.A.; Russell, K.D.
The Idaho National Engineering Laboratory (INEL) over the past year has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both NRR and AEOD. This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) user interface for streamlined evaluation of ASP events.
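The cutset generation mentioned in characteristic (3) can be illustrated with a top-down (MOCUS-style) expansion over a coherent fault tree. This is a generic textbook sketch, not SAPHIRE's algorithm; the two-train cooling example and all event names are hypothetical.

```python
from itertools import product

def cutsets(gate, tree):
    """Top-down minimal cutset generation for a coherent fault tree given
    as {gate: (op, [children])}; names absent from the dict are basic events.
    OR gates union the children's cutset lists; AND gates take the
    cross-product and merge each combination into one cutset."""
    if gate not in tree:                       # basic event (leaf)
        return [frozenset([gate])]
    op, children = tree[gate]
    child_sets = [cutsets(c, tree) for c in children]
    if op == "OR":
        result = [cs for sets in child_sets for cs in sets]
    else:                                      # AND
        result = [frozenset().union(*combo) for combo in product(*child_sets)]
    # keep only minimal cutsets (drop strict supersets)
    return [cs for cs in result if not any(other < cs for other in result)]

# Hypothetical top event: core damage if both pumps fail, or if offsite
# power is lost and the diesel generator also fails.
tree = {
    "TOP": ("OR", ["PUMPS", "POWER"]),
    "PUMPS": ("AND", ["PUMP_A", "PUMP_B"]),
    "POWER": ("AND", ["LOSP", "DG_FAILS"]),
}
mcs = cutsets("TOP", tree)
```

Production PRA codes add quantification, truncation by probability, and supercomponent handling on top of this basic expansion.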
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sattison, M.B.; Schroeder, J.A.; Russell, K.D.
The Idaho National Engineering Laboratory (INEL) over the past year has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of conditional core damage probability (CCDP) evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both NRR and AEOD. This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) user interface for streamlined evaluation of ASP events.
Analysis of the 2015-16 El Niño Event Using NASA's GEOS Data Assimilation System
NASA Astrophysics Data System (ADS)
Pawson, S.; Lim, Y. K.; Kovach, R. M.; Vernieres, G.
2016-12-01
The strong El Niño event that occurred in 2015/2016 is analyzed using atmospheric and oceanic analyses produced with the Goddard Earth Observing System (GEOS) systems. A theme of the work is to compare and contrast this event with two other strong El Niños, in 1982/1983 and 1997/1998, that are included in the satellite-data era of the MERRA and MERRA-2 reanalyses produced using the GEOS system. The distributions of the maximum anomalies of tropical sea-surface temperature (SST), precipitation, the Walker circulation, and cloud fraction indicate that 2015/2016 was a Central Pacific (CP) El Niño. The event had an early onset compared to the 1997/1998 El Niño, with extremely strong warming and precipitation over the central Pacific, and was the strongest in terms of central Pacific SST anomalies. The large region of warm temperature anomalies over most of the Pacific and Indian Oceans in the 2015/2016 event was due to the cumulative impacts of the El Niño event along with a positive phase of the Pacific Decadal Oscillation and a decadal warming trend over the western Pacific, Maritime Continent, and Indian Ocean. The relatively weak development of the 2015/2016 El Niño event over the eastern Pacific was likely due to weaker westerly wind bursts and a weaker Madden-Julian Oscillation during spring, which in 1997/1998 served to drive the warm anomalies further east towards South America, making that event the strongest eastern Pacific El Niño in the recent data record. This is reflected in the 2015/2016 event having a shallower thermocline over the eastern Pacific, with a weaker zonal gradient of sub-surface water temperatures along the equatorial Pacific. The major extra-tropical teleconnections associated with the El Niño in 2015/2016 are at least comparable to those in the 1982/1983 and 1997/1998 El Niño events. 
Specifically, the Pacific North American (PNA) teleconnection in 2015/2016 is the strongest of these three El Niño events, leading to larger extra-tropical anomalies of geopotential height, temperature, and precipitation over North America.
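The SST anomalies compared across the three events are, at their core, departures from a monthly climatology. A minimal sketch of that computation follows; the toy data, baseline choice, and function name are illustrative assumptions (operational indices such as the ONI additionally apply a 3-month running mean over a fixed Niño 3.4 box).

```python
def monthly_anomalies(series, clim_years):
    """series: {(year, month): mean SST}. The anomaly for a given month is
    its value minus the climatological mean of that calendar month computed
    over the baseline years."""
    clim = {}
    for m in range(1, 13):
        vals = [series[(y, m)] for y in clim_years if (y, m) in series]
        clim[m] = sum(vals) / len(vals)
    return {ym: series[ym] - clim[ym[1]] for ym in series}

# toy Nino-3.4-style record: flat 26.5 C baseline, warm event late in year 3
data = {(y, m): 26.5 for y in (1, 2, 3) for m in range(1, 13)}
for m in (10, 11, 12):
    data[(3, m)] = 29.0   # +2.5 C anomaly, "strong El Nino" territory
anom = monthly_anomalies(data, clim_years=(1, 2))
```

Computing anomalies against a month-by-month climatology removes the seasonal cycle so that events peaking in different seasons can be compared directly.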
Integrated software system for improving medical equipment management.
Bliznakov, Z; Pappous, G; Bliznakova, K; Pallikarakis, N
2003-01-01
The evolution of biomedical technology has led to an extraordinary use of medical devices in health care delivery. During the last decade, clinical engineering departments (CEDs) turned toward computerization and application of specific software systems for medical equipment management in order to improve their services and monitor outcomes. Recently, much emphasis has been given to patient safety. Through its Medical Device Directives, the European Union has required all member nations to use a vigilance system to prevent the recurrence of adverse events that could lead to injuries or death of patients or personnel as a result of equipment malfunction or improper use. The World Health Organization also has made this issue a high priority and has prepared a number of actions and recommendations. In the present work, a new integrated, Windows-oriented system is proposed, addressing all tasks of CEDs but also offering a global approach to their management needs, including vigilance. The system architecture is based on a star model, consisting of a central core module and peripheral units. Its development has been based on the integration of 3 software modules, each one addressing specific predefined tasks. The main features of this system include equipment acquisition and replacement management, inventory archiving and monitoring, follow up on scheduled maintenance, corrective maintenance, user training, data analysis, and reports. It also incorporates vigilance monitoring and information exchange for adverse events, together with a specific application for quality-control procedures. The system offers clinical engineers the ability to monitor and evaluate the quality and cost-effectiveness of the service provided by means of quality and cost indicators. Particular emphasis has been placed on the use of harmonized standards with regard to medical device nomenclature and classification. 
The system's practical applications have been demonstrated through a pilot evaluation trial.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Létourneau, Daniel, E-mail: daniel.letourneau@rmp.uh.on.ca; McNiven, Andrea; Keller, Harald
2014-12-15
Purpose: High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. Methods: The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3–4 times/week over a period of 10–11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ±0.5 and ±1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. 
Results: The precision of the MLC performance monitoring QC test and the MLC itself was within ±0.22 mm for most MLC leaves and the majority of the apparent leaf motion was attributed to beam spot displacements between irradiations. The MLC QC test was performed 193 and 162 times over the monitoring period for the studied units and recalibration had to be repeated up to three times on one of these units. For both units, the rate of MLC interlocks was moderately associated with MLC servicing events. The strongest association with MLC performance was observed between the MLC servicing events and the total number of out-of-control leaves. The average elapsed time for which the number of out-of-specification or out-of-control leaves was within a given performance threshold was computed and used to assess adequacy of MLC test frequency. Conclusions: A MLC performance monitoring system has been developed and implemented to acquire high-quality QC data at high frequency. This is enabled by the relatively short acquisition time for the images and automatic image analysis. The monitoring system was also used to record and track the rate of MLC-related interlocks and servicing events. MLC performance for two commercially available MLC models has been assessed and the results support monthly test frequency for widely accepted ±1 mm specifications. A higher QC test frequency is, however, required to maintain tighter specifications and in-control behavior.
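The per-leaf individuals control charts used here follow the standard I-chart recipe: the center line is the mean, and the control limits sit 2.66 average moving ranges on either side of it. A minimal sketch (the leaf-offset data below are hypothetical, not the paper's measurements):

```python
def individuals_chart_limits(x):
    """Individuals (I-chart) control limits from the average moving range:
    CL = mean(x); UCL/LCL = CL +/- 2.66 * mean(|x[i] - x[i-1]|).
    The factor 2.66 = 3/d2 with d2 = 1.128 for moving ranges of size 2."""
    center = sum(x) / len(x)
    moving_ranges = [abs(b - a) for a, b in zip(x, x[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# hypothetical daily position offsets (mm) for one MLC leaf
offsets = [0.05, -0.02, 0.01, 0.03, -0.04, 0.00, 0.02, -0.01]
lcl, cl, ucl = individuals_chart_limits(offsets)
# out-of-control: outside the data-driven limits (special-cause variation)
out_of_control = [v for v in offsets if not lcl <= v <= ucl]
# out-of-specification: outside a fixed clinical tolerance, e.g. +/-0.5 mm
out_of_spec = [v for v in offsets if abs(v) > 0.5]
```

This illustrates the paper's distinction between out-of-control leaves (statistically unusual relative to the leaf's own history) and out-of-specification leaves (outside a fixed clinical tolerance): a leaf can be flagged by one criterion and not the other.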
Stepwise construction of a metabolic network in Event-B: The heat shock response.
Sanwal, Usman; Petre, Luigia; Petre, Ion
2017-12-01
There is high interest in constructing large, detailed computational models for biological processes. This is often done by putting together existing submodels and adding extra details/knowledge to them. The result of such approaches is usually a model that can only answer questions on a very specific level of detail, and thus, ultimately, is of limited use. We focus instead on an approach to systematically add details to a model, with formal verification of its consistency at each step. In this way, one obtains a set of reusable models, at different levels of abstraction, to be used for different purposes depending on the question to address. We demonstrate this approach using Event-B, a computational framework introduced to develop formal specifications of distributed software systems. We first describe how to model generic metabolic networks in Event-B. Then, we apply this method to modeling the biological heat shock response in eukaryotic cells, using Event-B refinement techniques. The advantage of using Event-B is that refinement is an intrinsic feature; the final result is not only a correct model, but a chain of models automatically linked by refinement, each of which is provably correct and reusable. This is a proof-of-concept that refinement in Event-B is suitable for biomodeling, helping to master biological complexity. Copyright © 2017 Elsevier Ltd. All rights reserved.
ERP effects and perceived exclusion in the Cyberball paradigm: Correlates of expectancy violation?
Weschke, Sarah; Niedeggen, Michael
2015-10-22
A virtual ball-tossing game called Cyberball has allowed the identification of neural structures involved in the processing of social exclusion by using neurocognitive methods. However, there is still an ongoing debate about whether the structures involved are pain- or exclusion-specific or part of a broader network. In electrophysiological Cyberball studies we have shown that the P3b component is sensitive to exclusion manipulations, possibly modulated by the probability of ball possession by the participant (event "self") or the presumed co-players (event "other"). Since it is known from oddball studies that the P3b is not only modulated by the objective probability of an event, but also by subjective expectancy, we independently manipulated the probability of the events "self" and "other" and the expectancy for these events. Questionnaire data indicate that social need threat is only induced when the expectancy for involvement in the ball-tossing game is violated. Similarly, the P3b amplitude of both "self" and "other" events was a correlate of expectancy violation. We conclude that both the subjective report of exclusion and the P3b effect induced in the Cyberball paradigm are primarily based on a cognitive process sensitive to expectancy violations, and that the P3b is not related to the activation of an exclusion-specific neural alarm system. Copyright © 2015 Elsevier B.V. All rights reserved.
Early warning, warning or alarm systems for natural hazards? A generic classification.
NASA Astrophysics Data System (ADS)
Sättele, Martina; Bründl, Michael; Straub, Daniel
2013-04-01
Early warning, warning and alarm systems have gained popularity in recent years as cost-efficient measures for dangerous natural hazard processes such as floods, storms, rock and snow avalanches, debris flows, rock and ice falls, landslides, flash floods, glacier lake outburst floods, forest fires and even earthquakes. These systems can generate information before an event causes loss of property and life. In this way, they mainly mitigate the overall risk by reducing the presence probability of endangered objects. These systems are typically prototypes tailored to specific project needs. Despite their importance, there is no recognised system classification. This contribution classifies warning and alarm systems into three classes: i) threshold systems, ii) expert systems and iii) model-based expert systems. The result is a generic classification, which takes the characteristics of the natural hazard process itself and the related monitoring possibilities into account. The choice of the monitoring parameters directly determines the system's lead time. The classification of 52 active systems moreover revealed typical system characteristics for each system class. i) Threshold systems monitor dynamic process parameters of ongoing events (e.g. water level of a debris flow) and incorporate minor lead times. They have a local geographical coverage, and a predefined threshold determines whether an alarm is automatically activated to warn endangered objects, authorities and system operators. ii) Expert systems monitor direct changes in the variable disposition (e.g. crack opening before a rock avalanche) or trigger events (e.g. heavy rain) at a local scale before the main event starts and thus offer extended lead times. The final alarm decision incorporates human, model and organisational factors. iii) Model-based expert systems monitor indirect changes in the variable disposition (e.g. 
snow temperature, height or solar radiation that influence the occurrence probability of snow avalanches) or trigger events (e.g. heavy snow fall) to predict spontaneous hazard events in advance. They encompass regional or national measuring networks and satisfy additional demands such as the standardisation of the measuring stations. The developed classification, and the characteristics revealed for each class, provide valuable input for quantifying the reliability of warning and alarm systems. Importantly, this will make it easier to compare them with well-established standard mitigation measures such as dams, nets and galleries within an integrated risk management approach.
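The class-(i) threshold systems described above reduce to a simple rule: fire an alarm automatically once a monitored process parameter exceeds a predefined threshold. The sketch below adds a consecutive-sample requirement to debounce sensor noise; the function, the debouncing choice, and the gauge readings are illustrative assumptions, not taken from any of the 52 surveyed systems.

```python
def threshold_alarm(readings, threshold, consecutive=3):
    """Return the index at which an automatic alarm fires: the first sample
    completing a run of `consecutive` readings above `threshold`.
    Returns None if the threshold criterion is never met."""
    run = 0
    for i, value in enumerate(readings):
        run = run + 1 if value > threshold else 0
        if run >= consecutive:
            return i
    return None

# hypothetical water-level gauge (cm) in a debris-flow channel; a single
# spike at index 3-4 is rejected, the sustained rise from index 6 fires
levels = [12, 14, 13, 41, 45, 15, 52, 55, 58, 60]
alarm_at = threshold_alarm(levels, threshold=40)
```

Because the monitored parameter belongs to an event already under way, the lead time is short, which is exactly the trade-off the classification attributes to this class.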
Buechel, Eva C; Zhang, Jiao; Morewedge, Carey K
2017-05-01
Affective forecasts are used to anticipate the hedonic impact of future events and decide which events to pursue or avoid. We propose that because affective forecasters are more sensitive to outcome specifications of events than experiencers, the outcome specification values of an event, such as its duration, magnitude, probability, and psychological distance, can be used to predict the direction of affective forecasting errors: whether affective forecasters will overestimate or underestimate its hedonic impact. When specifications are positively correlated with the hedonic impact of an event, forecasters will overestimate the extent to which high specification values will intensify and low specification values will discount its impact. When outcome specifications are negatively correlated with its hedonic impact, forecasters will overestimate the extent to which low specification values will intensify and high specification values will discount its impact. These affective forecasting errors compound additively when multiple specifications are aligned in their impact: In Experiment 1, affective forecasters underestimated the hedonic impact of winning a smaller prize that they expected to win, and they overestimated the hedonic impact of winning a larger prize that they did not expect to win. In Experiment 2, affective forecasters underestimated the hedonic impact of a short unpleasant video about a temporally distant event, and they overestimated the hedonic impact of a long unpleasant video about a temporally near event. Experiments 3A and 3B showed that differences in the affect-richness of forecasted and experienced events underlie these differences in sensitivity to outcome specifications, therefore accounting for both the impact bias and its reversal. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
A dynamical systems approach to studying midlatitude weather extremes
NASA Astrophysics Data System (ADS)
Messori, Gabriele; Caballero, Rodrigo; Faranda, Davide
2017-04-01
Extreme weather occurrences carry enormous social and economic costs and routinely garner widespread scientific and media coverage. The ability to predict these events is therefore a topic of crucial importance. Here we propose a novel predictability pathway for extreme events, by building upon recent advances in dynamical systems theory. We show that simple dynamical systems metrics can be used to identify sets of large-scale atmospheric flow patterns with similar spatial structure and temporal evolution on time scales of several days to a week. In regions where these patterns favor extreme weather, they afford a particularly good predictability of the extremes. We specifically test this technique on the atmospheric circulation in the North Atlantic region, where it provides predictability of large-scale wintertime surface temperature extremes in Europe up to 1 week in advance.
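The dynamical systems metrics above rest on finding recurrences of the current large-scale flow pattern in an archive of past states. A minimal sketch of that analogue search follows; the distance metric, the 2-gridpoint toy "patterns", and the function name are illustrative assumptions, not the authors' method.

```python
def closest_analogues(history, state, k=3):
    """Rank archived flow patterns (flattened anomaly fields) by Euclidean
    distance to the current state and return the indices of the k nearest.
    A state with many close recurrences sits in a well-sampled region of the
    system's attractor, and is therefore more predictable."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    ranked = sorted(range(len(history)), key=lambda i: dist(history[i], state))
    return ranked[:k]

# toy archive of 2-gridpoint "circulation patterns"
archive = [(0.0, 0.0), (1.0, 1.0), (5.0, 5.0), (1.1, 0.9), (4.0, 4.5)]
today = (1.0, 0.95)
best = closest_analogues(archive, today, k=2)
```

In the dynamical systems framework, statistics over such recurrences (e.g. how fast the number of neighbours grows with the distance threshold) yield the local dimension and persistence metrics used to flag flow patterns that favour extremes.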
Code of Federal Regulations, 2010 CFR
2010-01-01
.... Beryllium article means a manufactured item that is formed to a specific shape or design during manufacture... particles. Immune response refers to the series of cellular events by which the immune system reacts to... medical removal from beryllium areas following a recommendation by the Site Occupational Medicine Director...
Characterizing interactions in online social networks during exceptional events
NASA Astrophysics Data System (ADS)
Omodei, Elisa; De Domenico, Manlio; Arenas, Alex
2015-08-01
Nowadays, millions of people interact on a daily basis on online social media like Facebook and Twitter, where they share and discuss information about a wide variety of topics. In this paper, we focus on a specific online social network, Twitter, and we analyze multiple datasets, each one consisting of individuals' online activity before, during and after an exceptional event in terms of the volume of communications registered. We consider important events that occurred in different arenas that range from policy to culture or science. For each dataset, the users' online activities are modeled by a multilayer network in which each layer conveys a different kind of interaction, specifically: retweeting, mentioning and replying. This representation allows us to unveil that these distinct types of interaction produce networks with different statistical properties, in particular concerning the degree distribution and the clustering structure. These results suggest that models of online activity cannot discard the information carried by this multilayer representation of the system, and should account for the different processes generated by the different kinds of interactions. Secondly, our analysis unveils the presence of statistical regularities among the different events, suggesting that the non-trivial topological patterns that we observe may represent universal features of the social dynamics on online social networks during exceptional events.
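The multilayer construction described above can be sketched directly: one undirected network per interaction type, from which per-layer degree distributions are then compared. This is an illustrative sketch with a hypothetical activity log, not the authors' pipeline.

```python
from collections import defaultdict

def layer_degrees(interactions):
    """Build one undirected network per interaction type (layer) and return
    each node's degree in each layer of the multilayer representation."""
    layers = defaultdict(lambda: defaultdict(set))
    for user_a, user_b, kind in interactions:  # kind: retweet/mention/reply
        layers[kind][user_a].add(user_b)
        layers[kind][user_b].add(user_a)
    return {kind: {u: len(nbrs) for u, nbrs in adj.items()}
            for kind, adj in layers.items()}

# hypothetical activity log during an exceptional event
log = [
    ("ana", "bob", "retweet"),
    ("cem", "bob", "retweet"),
    ("ana", "bob", "reply"),
    ("ana", "cem", "mention"),
]
deg = layer_degrees(log)
```

Keeping the layers separate is the point: collapsing them into a single aggregated network would hide the differing degree distributions and clustering the paper reports for retweets, mentions and replies.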
SYNAISTHISI: an IoT-powered smart visitor management and cognitive recommendations system
NASA Astrophysics Data System (ADS)
Thanos, Giorgos Konstandinos; Karafylli, Christina; Karafylli, Maria; Zacharakis, Dimitris; Papadimitriou, Apostolis; Dimitros, Kostantinos; Kanellopoulou, Konstantina; Kyriazanos, Dimitris M.; Thomopoulos, Stelios C. A.
2016-05-01
Location-based and navigation services are much needed to help visitors and audiences of big events, complex buildings, shopping malls, airports and large companies. However, the lack of GPS and proper mapping indoors usually renders location-based applications and services useless or simply not applicable in such environments. SYNAISTHISI introduces a mobile application for smartphones which offers navigation capabilities outside and inside buildings and across multiple floor levels. The application comes together with a suite of helpful services, including personalized recommendations, visit/event management and a helpful search functionality in order to navigate to a specific location, event or person. As users find their way towards their destination, NFC-enabled checkpoints and Bluetooth beacons assist them, offering re-routing, check-in/out capabilities and useful information about ongoing meetings and nearby events. The application is supported by a back-end GIS system which can provide a broad and clear view to event organizers, campus managers and field personnel for purposes of event logistics, safety and security. The SYNAISTHISI system comes with several competitive advantages, including (a) seamless navigation as users move between outdoor and indoor areas and different floor levels, using innovative routing algorithms, (b) connection to, and support from, an IoT platform for localization and real-time information feedback, (c) dynamic personalized recommendations based on user profile, location and real-time information provided by the IoT platform and (d) indoor localization without the need for expensive infrastructure and installations.
Evaluation of a low-cost 3D sound system for immersive virtual reality training systems.
Doerr, Kai-Uwe; Rademacher, Holger; Huesgen, Silke; Kubbat, Wolfgang
2007-01-01
Since head-mounted displays (HMDs), datagloves, tracking systems, and powerful computer graphics resources are nowadays in an affordable price range, the use of PC-based "virtual training systems" becomes very attractive. However, due to the limited field of view of HMD devices, additional modalities have to be provided to benefit from 3D environments. A 3D sound simulation can improve the capabilities of VR systems dramatically. Unfortunately, realistic 3D sound simulations are expensive and demand a tremendous amount of computational power to calculate reverberation, occlusion, and obstruction effects. To use 3D sound in a PC-based training system as a way to direct and guide trainees to observe specific events in 3D space, a cheaper alternative has to be provided, so that a broader range of applications can take advantage of this modality. To address this issue, we focus in this paper on the evaluation of a low-cost 3D sound simulation that is capable of providing traceable 3D sound events. We describe our experimental system setup using conventional stereo headsets in combination with a tracked HMD device and present our results with regard to precision, speed, and signal types used for localizing simulated sound events in a virtual training environment.
Factors Affecting Two Types of Memory Specificity: Particularization of Episodes and Details.
Willén, Rebecca M; Granhag, Pär Anders; Strömwall, Leif A
2016-01-01
Memory for repeated events is relevant to legal investigations about repeated occurrences. We investigated how two measures of specificity (number of events referred to and amount of detail reported about the events) were influenced by interviewees' age, number of experienced events, interviewer, perceived unpleasantness, and memory rehearsal. Transcribed narratives consisting of over 40,000 utterances from 95 dental patients, and the corresponding dental records, were studied. Amount of detail was measured by categorizing the utterances as generic, specific, or specific-extended. We found that the two measures were affected differently by all five factors. For instance, number of experienced events positively influenced number of referred events but had no effect on amount of detail provided about the events. We make suggestions for future research and encourage reanalysis of the present data set and reuse of the material.
Unsupervised Spatial Event Detection in Targeted Domains with Applications to Civil Unrest Modeling
Zhao, Liang; Chen, Feng; Dai, Jing; Hua, Ting; Lu, Chang-Tien; Ramakrishnan, Naren
2014-01-01
Twitter has become a popular data source as a surrogate for monitoring and detecting events. Targeted domains such as crime, election, and social unrest require the creation of algorithms capable of detecting events pertinent to these domains. Due to the unstructured language, short-length messages, dynamics, and heterogeneity typical of Twitter data streams, it is technically difficult and labor-intensive to develop and maintain supervised learning systems. We present a novel unsupervised approach for detecting spatial events in targeted domains and illustrate this approach using one specific domain, viz. civil unrest modeling. Given a targeted domain, we propose a dynamic query expansion algorithm to iteratively expand domain-related terms, and generate a tweet homogeneous graph. An anomaly identification method is utilized to detect spatial events over this graph by jointly maximizing local modularity and spatial scan statistics. Extensive experiments conducted in 10 Latin American countries demonstrate the effectiveness of the proposed approach. PMID:25350136
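The dynamic query expansion step can be sketched as a simple iterative loop: retrieve tweets matching the current term set, promote the most frequent co-occurring terms into the query, and repeat. This is a deliberately simplified illustration, not the paper's algorithm (which additionally builds a tweet graph and applies spatial scan statistics); the seed term and toy tweets are hypothetical.

```python
def expand_query(seeds, tweets, rounds=2, top_k=2):
    """Iterative query expansion: in each round, find tweets matching the
    current term set, count words co-occurring with it, and add the top_k
    most frequent new terms to the query."""
    terms = set(seeds)
    for _ in range(rounds):
        counts = {}
        for text in tweets:
            words = set(text.lower().split())
            if words & terms:                  # tweet matches current query
                for w in words - terms:
                    counts[w] = counts.get(w, 0) + 1
        terms |= {w for w, _ in sorted(counts.items(),
                                       key=lambda kv: -kv[1])[:top_k]}
    return terms

# hypothetical tweet stream around a civil unrest event
tweets = [
    "protest march downtown",
    "march blocked by police",
    "police deploy teargas downtown",
    "new phone released today",
]
terms = expand_query({"protest"}, tweets)
```

Starting from the single seed "protest", the loop pulls in event-related vocabulary ("march", "downtown", "police") while leaving the unrelated tweet untouched, which is the behavior that lets an unsupervised system track domain language without labeled training data.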
Khan, Stuart J; Deere, Daniel; Leusch, Frederic D L; Humpage, Andrew; Jenkins, Madeleine; Cunliffe, David
2015-11-15
Among the most widely predicted and accepted consequences of global climate change are increases in both the frequency and severity of a variety of extreme weather events. Such weather events include heavy rainfall and floods, cyclones, droughts, heatwaves, extreme cold, and wildfires, each of which can potentially impact drinking water quality by affecting water catchments, storage reservoirs, the performance of water treatment processes or the integrity of distribution systems. Drinking water guidelines, such as the Australian Drinking Water Guidelines and the World Health Organization Guidelines for Drinking-water Quality, provide guidance for the safe management of drinking water. These documents present principles and strategies for managing risks that may be posed to drinking water quality. While these principles and strategies are applicable to all types of water quality risks, very little specific attention has been paid to the management of extreme weather events. We present a review of recent literature on water quality impacts of extreme weather events and consider practical opportunities for improved guidance for water managers. We conclude that there is a case for an enhanced focus on the management of water quality impacts from extreme weather events in future revisions of water quality guidance documents. Copyright © 2015 Elsevier Ltd. All rights reserved.
Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems
NASA Technical Reports Server (NTRS)
Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris
2010-01-01
Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.
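The view of a KBS as processes enabled by rule firings can be illustrated with a naive forward-chaining loop: each rule whose premises hold fires, adds its conclusion, and may thereby enable further rules, until a fixed point is reached. This is a generic textbook sketch, not the tool described above; the fault-handling rules are hypothetical.

```python
def forward_chain(facts, rules):
    """Naive forward-chaining inference: fire every rule whose premises are
    all present, add its conclusion, and repeat until no rule adds anything.
    Each firing corresponds to one process being enabled by its premises."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# hypothetical rule base for a fault-handling KBS
rules = [
    (("sensor_timeout",), "fault_suspected"),
    (("fault_suspected", "redundant_channel_ok"), "switch_channel"),
]
state = forward_chain({"sensor_timeout", "redundant_channel_ok"}, rules)
```

Analysis tools of the kind described above would inspect such a rule base for underspecification (situations no rule covers) and overspecification (rules that can never fire or that conflict), independently of the inference engine's firing order.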
Multilingual Analysis of Twitter News in Support of Mass Emergency Events
NASA Astrophysics Data System (ADS)
Zielinski, A.; Bügel, U.; Middleton, L.; Middleton, S. E.; Tokarchuk, L.; Watson, K.; Chaves, F.
2012-04-01
Social media are increasingly becoming an additional source of information for event-based early warning systems, in the sense that they can help to detect natural crises and support crisis management during or after disasters. Within the European FP7 TRIDEC project we study the problem of analyzing multilingual Twitter feeds for emergency events. Specifically, we consider tsunamis and earthquakes, the latter as one possible originating cause of tsunamis, and propose to analyze Twitter messages to capture testified information at affected points of interest in order to obtain a better picture of the actual situation. For tsunamis, these could be the so-called Forecast Points, i.e. agreed-upon points chosen by the Regional Tsunami Warning Centers (RTWC) and the potentially affected countries, which must be considered when calculating expected tsunami arrival times. Generally, local civil protection authorities and the population are likely to respond in their native languages. Therefore, the present work focuses on English as "lingua franca" and on under-resourced Mediterranean languages in endangered zones, particularly in Turkey, Greece, and Romania. We investigated ten earthquake events and defined four language-specific classifiers that can be used to detect natural crisis events by filtering out irrelevant messages that do not relate to the event. Preliminary results indicate that such a filter has the potential to support earthquake detection and could be integrated into seismographic sensor networks. One hindrance in our study is the lack of geo-located data for asserting the geographical origin of the tweets, and thus for observing correlations of events across languages. One way to overcome this deficit consists in identifying geographic names contained in tweets that correspond to, or are located in the vicinity of, specific points of interest such as the Forecast Points of the tsunami scenario.
We also intend to use Twitter analysis for situation picture assessment, e.g. for planning relief actions. At present, a multilingual corpus of Twitter messages related to crises is being assembled, and domain-specific language resources such as multilingual terminology lists and language-specific Natural Language Processing (NLP) tools are being built up to help cross the language barrier. The final goal is to extend this work to the main languages spoken around the Mediterranean and to classify and extract relevant information from tweets, translating the main keywords into English.
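As a toy illustration of the kind of language-specific relevance filtering described above, the sketch below matches tweet tokens against a per-language crisis vocabulary. The term lists and the `is_relevant` helper are invented for illustration; the project's actual classifiers are trained, language-specific models, not keyword lookups.

```python
# Invented, illustrative term lists -- not the project's real resources.
CRISIS_TERMS = {
    "en": {"earthquake", "tsunami", "aftershock", "tremor"},
    "tr": {"deprem", "tsunami"},
}

def is_relevant(tweet, lang):
    """Crude relevance check: does the tweet share a token with the
    crisis vocabulary for its language?"""
    tokens = {w.strip(".,!?#:;").lower() for w in tweet.split()}
    return bool(tokens & CRISIS_TERMS.get(lang, set()))
```

A trained classifier would replace the set intersection with a model score, but the filtering role in the pipeline is the same.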
Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang
2015-01-05
The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection.
Semiautomated TaqMan PCR screening of GMO labelled samples for (unauthorised) GMOs.
Scholtens, Ingrid M J; Molenaar, Bonnie; van Hoof, Richard A; Zaaijer, Stephanie; Prins, Theo W; Kok, Esther J
2017-06-01
In most countries, systems are in place to analyse food products for the potential presence of genetically modified organisms (GMOs), to enforce labelling requirements and to screen for the potential presence of unauthorised GMOs. With the growing number of GMOs on the world market, a larger diversity of methods is required for informative analyses. In this paper, the specificity of an extended screening set consisting of 32 screening methods to identify different crop species (endogenous genes) and GMO elements was verified against 59 different GMO reference materials. In addition, a cost- and time-efficient strategy for DNA isolation, screening and identification is presented. A module for semiautomated analysis of the screening results and planning of subsequent event-specific tests for identification has been developed. The Excel-based module contains information on the experimentally verified specificity of the element methods and of the EU authorisation status of the GMO events. If a detected GMO element cannot be explained by any of the events as identified in the same sample, this may indicate the presence of an unknown unauthorised GMO that may not yet have been assessed for its safety for humans, animals or the environment.
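The decision logic of the semiautomated module described above (checking whether every detected screening element is explained by some identified event) can be sketched in a few lines. The event names and element profiles below are hypothetical placeholders, not the experimentally verified specificity data of the Excel module.

```python
# Hypothetical event -> screening-element profiles, for illustration only.
EVENT_ELEMENTS = {
    "MON810": {"p35S", "hsp70-intron", "cry1Ab"},
    "GTS40-3-2": {"p35S", "tNOS", "cp4-epsps"},
}

def unexplained_elements(detected, identified_events, profiles=EVENT_ELEMENTS):
    """Return detected screening elements that no identified event explains.
    A non-empty result may indicate an unknown or unauthorised GMO."""
    explained = set()
    for event in identified_events:
        explained |= profiles[event]
    return set(detected) - explained
```

For example, detecting a `bar` element in a sample where only GTS40-3-2 was identified leaves `bar` unexplained and flags the sample for follow-up.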
Chen, Xing-jie; Liu, Lu-lu; Cui, Ji-fang; Wang, Ya; Chen, An-tao; Li, Feng-hua; Wang, Wei-hong; Zheng, Han-feng; Gan, Ming-yuan; Li, Chun-qiu; Shum, David H. K.; Chan, Raymond C. K.
2016-01-01
Mental time travel refers to the ability to recall past events and to imagine possible future events. Schizophrenia (SCZ) patients have problems in remembering specific personal experiences from the past and imagining what will happen in the future. This study aimed to examine episodic past and future thinking in SCZ spectrum disorders, including SCZ patients and individuals with schizotypal personality disorder (SPD) proneness who are at risk of developing SCZ. Thirty-two SCZ patients, 30 SPD proneness individuals, and 33 healthy controls participated in the study. The Sentence Completion for Events from the Past Test (SCEPT) and the Sentence Completion for Events in the Future Test were used to measure past and future thinking abilities. Results showed that SCZ patients had significantly reduced specificity in recalling past and imagining future events, generating a lower proportion of specific and extended events than healthy controls. SPD proneness individuals only generated fewer extended events than healthy controls. The reduced specificity was mainly manifested in imagining future events. Both SCZ patients and SPD proneness individuals generated fewer positive events than controls. These results suggest mental time travel impairments in SCZ spectrum disorders and have implications for understanding the associated cognitive and emotional deficits. PMID:27507958
NASA Astrophysics Data System (ADS)
Zakeri, Zeinab; Azadi, Majid; Ghader, Sarmad
2018-01-01
Satellite radiances and in-situ observations are assimilated through the Weather Research and Forecasting Data Assimilation (WRFDA) system into the Advanced Research WRF (ARW) model over Iran and its neighboring area. A domain-specific background error based on the x and y components of wind speed (UV) as control variables is calculated for the WRFDA system, and sensitivity experiments are carried out to compare the impact of the global background error and the domain-specific background error on both precipitation and 2-m temperature forecasts over Iran. Three precipitation events that occurred over the country during January, September and October 2014 are simulated in three different experiments, and the results for precipitation and 2-m temperature are verified against surface observations. Results show that using the domain-specific background error consistently improves 2-m temperature and 24-h accumulated precipitation forecasts, while the global background error may even degrade the forecasts compared to experiments without data assimilation. The improvement in 2-m temperature is most evident during the first forecast hours and decreases significantly as the forecast length increases.
Contribution of past and future self-defining event networks to personal identity.
Demblon, Julie; D'Argembeau, Arnaud
2017-05-01
Personal identity is nourished by memories of significant past experiences and by the imagination of meaningful events that one anticipates happening in the future. The organisation of such self-defining memories and prospective thoughts in the cognitive system has received little empirical attention, however. In the present study, our aims were to investigate to what extent self-defining memories and future projections are organised in networks of related events, and to determine the nature of the connections linking these events. Our results reveal the existence of self-defining event networks, composed of both memories and future events of similar centrality for identity and characterised by similar identity motives. These self-defining networks showed strong internal coherence and frequently organised events into meaningful themes and sequences (i.e., event clusters). Finally, we found that the satisfaction of identity motives in represented events and the presence of clustering across events both contributed to increasing the perceived centrality of events for the sense of identity. Overall, these findings suggest that personal identity is not only nourished by representations of significant past and future events, but also depends on the formation of coherent networks of related events that provide an overarching meaning to specific life experiences.
Computing return times or return periods with rare event algorithms
NASA Astrophysics Data System (ADS)
Lestang, Thibault; Ragone, Francesco; Bréhier, Charles-Edouard; Herbert, Corentin; Bouchet, Freddy
2018-04-01
The average time between two occurrences of the same event, referred to as its return time (or return period), is a useful statistical concept for practical applications. For instance, insurance companies or public agencies may be interested in the return time of a 10 m flood of the Seine river in Paris. However, due to their scarcity, reliably estimating return times for rare events is very difficult using either observational data or direct numerical simulations. For rare events, an estimator for return times can be built from the extrema of the observable on trajectory blocks. Here, we show that this estimator can be improved to remain accurate for return times of the order of the block size. More importantly, we show that this approach can be generalised to estimate return times from numerical algorithms specifically designed to sample rare events, which so far have typically computed probabilities rather than return times. The approach we propose provides a computationally efficient way to estimate numerically the return times of rare events for a dynamical system, reducing computational costs by several orders of magnitude. We illustrate the method on two kinds of observables, instantaneous and time-averaged, using two different rare event algorithms, for a simple stochastic process, the Ornstein–Uhlenbeck process. As an example of realistic applications to complex systems, we finally discuss extreme values of the drag on an object in a turbulent flow.
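A minimal sketch of a block-maximum return-time estimator of the kind mentioned above, assuming evenly sampled data: with p the fraction of blocks whose maximum exceeds the threshold, the return time is estimated as r = -T_block / ln(1 - p). The demo uses white noise rather than an Ornstein–Uhlenbeck process, so the estimate can be sanity-checked against the known per-step exceedance probability (here roughly 1/0.0228 ≈ 44 steps for a level of 2 standard deviations).

```python
import numpy as np

def return_time_block_maxima(x, dt, block_size, threshold):
    """Estimate the return time of x exceeding `threshold` from block maxima:
    r = -T_block / ln(1 - p), with p the fraction of blocks whose maximum
    exceeds the threshold and T_block = block_size * dt."""
    n_blocks = len(x) // block_size
    maxima = x[:n_blocks * block_size].reshape(n_blocks, block_size).max(axis=1)
    p = np.mean(maxima > threshold)
    if p == 0:
        return float("inf")
    return -block_size * dt / np.log1p(-p)  # log1p(-p) == log(1 - p)

# Demo: for i.i.d. noise the return time of level a is ~ 1 / P(X > a).
rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)
r = return_time_block_maxima(x, dt=1.0, block_size=100, threshold=2.0)
```

The paper's contribution is combining this kind of estimator with rare event sampling algorithms; the sketch only shows the plain block-maximum step on directly simulated data.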
Do strategic processes contribute to the specificity of future simulation in depression?
Addis, Donna Rose; Hach, Sylvia; Tippett, Lynette J
2016-06-01
The tendency to generate overgeneral past or future events is characteristic of individuals with a history of depression. Although much research has investigated the contribution of rumination and avoidance to the reduced specificity of past events, comparatively little research has examined (1) whether the specificity of future events is differentially reduced in depression and (2) the role of executive functions in this phenomenon. Our study aimed to redress this imbalance. Participants with either current or past experience of depressive symptoms ('depressive group'; N = 24) and matched controls ('control group'; N = 24) completed tests of avoidance, rumination, and executive functions. A modified Autobiographical Memory Test was administered to assess the specificity of past and future events. The depressive group were more ruminative and avoidant than controls, but did not exhibit deficits in executive function. Although overall the depressive group generated significantly fewer specific events than controls, this reduction was driven by a significant group difference in future event specificity. Strategic retrieval processes were correlated with both past and future specificity, and predictive of the future specificity, whereas avoidance and rumination were not. Our findings demonstrate that future simulation appears to be particularly vulnerable to disruption in individuals with current or past experience of depressive symptoms, consistent with the notion that future simulation is more cognitively demanding than autobiographical memory retrieval. Moreover, our findings suggest that even subtle changes in executive functions such as strategic processes may impact the ability to imagine specific future events. Future simulation may be particularly vulnerable to executive dysfunction in individuals with current/previous depressive symptoms, with evidence of a differential reduction in the specificity of future events. 
Strategic retrieval abilities were associated with the degree of future event specificity whereas levels of rumination and avoidance were not. Given that the ability to generate specific simulations of the future is associated with enhanced psychological wellbeing, problem solving and coping behaviours, understanding how to increase the specificity of future simulations in depression is an important direction for future research and clinical practice. Interventions focusing on improving the ability to engage strategic processes may be a fruitful avenue for increasing the ability to imagine specific future events in depression. The autobiographical event tasks have somewhat limited ecological validity as they do not account for the many social and environmental cues present in everyday life; the development of more clinically-relevant tasks may be of benefit to this area of study. © 2016 The British Psychological Society.
Federating Cyber and Physical Models for Event-Driven Situational Awareness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stephan, Eric G.; Pawlowski, Ronald A.; Sridhar, Siddharth
The purpose of this paper is to describe a novel method to improve the interoperability of electric power system monitoring and control software applications. This method employs the concept of federation, defined as the use of existing models that represent aspects of a system in specific domains (such as the physical and cyber security domains) and the building of interfaces to link the domain models.
Enabling Service Discovery in a Federation of Systems: WS-Discovery Case Study
2014-06-01
found that Pastry [3] coupled with SCRIBE [4] provides everything we require from the overlay network: Pastry nodes form a decentralized, self...application-independent manner. Furthermore, Pastry provides mechanisms that support and facilitate application-specific object replication, caching, and fault...recovery. Add SCRIBE to Pastry, and you get a generic, scalable and efficient group communication and event notification system providing
Species-Specific Exon Loss in Human Transcriptomes
Wang, Jinkai; Lu, Zhi-xiang; Tokheim, Collin J.; Miller, Sara E.; Xing, Yi
2015-01-01
Changes in exon–intron structures and splicing patterns represent an important mechanism for the evolution of gene functions and species-specific regulatory networks. Although exon creation is widespread during primate and human evolution and has been studied extensively, much less is known about the scope and potential impact of human-specific exon loss events. Historically, transcriptome data and exon annotations are significantly biased toward humans over nonhuman primates. This ascertainment bias makes it challenging to discover human-specific exon loss events. We carried out a transcriptome-wide search of human-specific exon loss events, by taking advantage of RNA sequencing (RNA-seq) as a powerful and unbiased tool for exon discovery and annotation. Using RNA-seq data of humans, chimpanzees, and other primates, we reconstructed and compared transcript structures across the primate phylogeny. We discovered 33 candidate human-specific exon loss events, among which six exons passed stringent experimental filters for the complete loss of splicing activities in diverse human tissues. These events may result from human-specific deletion of genomic DNA, or small-scale sequence changes that inactivated splicing signals. The impact of human-specific exon loss events is predominantly regulatory. Three of the six events occurred in the 5′ untranslated region (5′-UTR) and affected cis-regulatory elements of mRNA translation. In SLC7A6, a gene encoding an amino acid transporter, luciferase reporter assays suggested that both a human-specific exon loss event and an independent human-specific single nucleotide substitution in the 5′-UTR increased mRNA translational efficiency. Our study provides novel insights into the molecular mechanisms and evolutionary consequences of exon loss during human evolution. PMID:25398629
A hierarchical approach to defining marine heatwaves
NASA Astrophysics Data System (ADS)
Hobday, Alistair J.; Alexander, Lisa V.; Perkins, Sarah E.; Smale, Dan A.; Straub, Sandra C.; Oliver, Eric C. J.; Benthuysen, Jessica A.; Burrows, Michael T.; Donat, Markus G.; Feng, Ming; Holbrook, Neil J.; Moore, Pippa J.; Scannell, Hillary A.; Sen Gupta, Alex; Wernberg, Thomas
2016-02-01
Marine heatwaves (MHWs) have been observed around the world and are expected to increase in intensity and frequency under anthropogenic climate change. A variety of impacts have been associated with these anomalous events, including shifts in species ranges, local extinctions and economic impacts on seafood industries through declines in important fishery species and impacts on aquaculture. Extreme temperatures are increasingly seen as important influences on biological systems, yet a consistent definition of MHWs does not exist. A clear definition will facilitate retrospective comparisons between MHWs, enabling the synthesis and a mechanistic understanding of the role of MHWs in marine ecosystems. Building on research into atmospheric heatwaves, we propose both a general and specific definition for MHWs, based on a hierarchy of metrics that allow for different data sets to be used in identifying MHWs. We generally define a MHW as a prolonged discrete anomalously warm water event that can be described by its duration, intensity, rate of evolution, and spatial extent. Specifically, we consider an anomalously warm event to be a MHW if it lasts for five or more days, with temperatures warmer than the 90th percentile based on a 30-year historical baseline period. This structure provides flexibility with regard to the description of MHWs and transparency in communicating MHWs to a general audience. The use of these metrics is illustrated for three 21st century MHWs; the northern Mediterranean event in 2003, the Western Australia 'Ningaloo Niño' in 2011, and the northwest Atlantic event in 2012. We recommend a specific quantitative definition for MHWs to facilitate global comparisons and to advance our understanding of these phenomena.
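The specific definition above (an event lasting five or more days with temperatures above the 90th percentile of a 30-year baseline) can be sketched directly. This simplified version uses a single percentile threshold over the whole baseline rather than the day-of-year climatology a full implementation would use:

```python
import numpy as np

def detect_mhws(sst, baseline, min_duration=5, pct=90):
    """Flag marine heatwaves: runs of >= min_duration consecutive days with
    SST above the pct-th percentile of a historical baseline (simplified:
    one global percentile instead of a seasonally varying climatology)."""
    threshold = np.percentile(baseline, pct)
    above = sst > threshold
    events, start = [], None
    for i, flag in enumerate(np.append(above, False)):  # sentinel closes a final run
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_duration:
                events.append({"start": start, "duration": i - start,
                               "max_intensity": float(np.max(sst[start:i] - threshold))})
            start = None
    return events

# Demo: 30 years of synthetic baseline and a 60-day series with one warm spell.
rng = np.random.default_rng(1)
baseline = rng.normal(15.0, 1.0, size=30 * 365)
sst = np.full(60, 15.0)
sst[20:27] = 18.0  # 7 anomalously warm days
events = detect_mhws(sst, baseline)
```

Each detected event carries the duration and maximum intensity metrics from the hierarchy proposed in the abstract; rate of evolution and spatial extent would require gridded data.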
Using natural archives to detect climate and environmental tipping points in the Earth System
NASA Astrophysics Data System (ADS)
Thomas, Zoë A.
2016-11-01
'Tipping points' in the Earth system are characterised by a nonlinear response to gradual forcing, and may have severe and wide-ranging impacts. Many abrupt events result from simple underlying system dynamics termed 'critical transitions' or 'bifurcations'. One of the best ways to identify and potentially predict threshold behaviour in the climate system is through analysis of natural ('palaeo') archives. Specifically, on the approach to a tipping point, early warning signals can be detected as characteristic fluctuations in a time series as a system loses stability. Testing whether these early warning signals can be detected in highly complex real systems is a key challenge, since much work is either theoretical or only tested with simple models. This is particularly problematic in palaeoclimate and palaeoenvironmental records with low resolution, non-equidistant data, which can limit accurate analysis. Here, a range of different datasets are examined to explore generic rules that can be used to detect such dramatic events. A number of key criteria are identified to be necessary for the reliable identification of early warning signals in natural archives, most crucially, the need for a low-noise record of sufficient data length, resolution and accuracy. A deeper understanding of the underlying system dynamics is required to inform the development of more robust system-specific indicators, or to indicate the temporal resolution required, given a known forcing. This review demonstrates that time series precursors from natural archives provide a powerful means of forewarning tipping points within the Earth System.
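The early warning signals mentioned above are commonly computed as rising lag-1 autocorrelation and variance in a rolling window ('critical slowing down' indicators). A minimal sketch, assuming an evenly sampled, detrended series, which the review notes is often not the case for palaeo records:

```python
import numpy as np

def early_warning_indicators(series, window):
    """Rolling lag-1 autocorrelation and variance: both are expected to
    rise as a system approaches a bifurcation-type tipping point."""
    ac1, var = [], []
    for i in range(window, len(series) + 1):
        w = series[i - window:i]
        var.append(np.var(w))
        ac1.append(np.corrcoef(w[:-1], w[1:])[0, 1])
    return np.array(ac1), np.array(var)

# Demo: AR(1) noise whose coefficient ramps towards 1 (loss of stability).
rng = np.random.default_rng(2)
n = 4000
phi = np.linspace(0.2, 0.95, n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t] * x[t - 1] + rng.standard_normal()
ac1, var = early_warning_indicators(x, window=500)
```

In the demo both indicators increase towards the end of the series, as theory predicts; on real archives the review's caveats about noise, resolution and non-equidistant sampling apply before any such trend can be trusted.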
Increasing Student/Corporate Engagement
ERIC Educational Resources Information Center
Janicki, Thomas N.; Cummings, Jeffrey W.
2017-01-01
Increasing dialog and interaction between the corporate community and students is a key strategic goal of many universities. This paper describes an event that has been specifically designed to increase student and corporate engagement. It describes the process of planning and executing a targeted career day for information systems and information…
ERIC Educational Resources Information Center
Ziff, Stephen J.
1973-01-01
Describes the increased need for emergency lighting equipment for late evening events, adult education evening classes, and for the increasing use of the interior classroom. Explains the difference between central and unit lighting systems; clarifies the specifications in the Occupational Safety and Health Act (OSHA) as they apply to school…
Probabilistic forecasting of extreme weather events based on extreme value theory
NASA Astrophysics Data System (ADS)
Van De Vyver, Hans; Van Schaeybroeck, Bert
2016-04-01
Extreme events in weather and climate, such as high wind gusts, heavy precipitation or extreme temperatures, are commonly associated with high impacts on both environment and society. Forecasting extreme weather events is difficult, and very high-resolution models are needed to describe extreme weather phenomena explicitly. A prediction system for such events should therefore preferably be probabilistic in nature. Probabilistic forecasts and state estimations are nowadays common in the numerical weather prediction community. In this work, we develop a new probabilistic framework based on extreme value theory that aims to provide early warnings up to several days in advance. We consider pairs (X,Y) of extreme events, where X represents a deterministic forecast and Y the observation variable (for instance wind speed), and study the combined events in which Y exceeds a high threshold y while the corresponding forecast X also exceeds a high forecast threshold. More specifically, two problems are addressed: (1) Given a high forecast X = x_0, what is the probability that Y > y? In other words, provide inference on the conditional probability Pr{Y > y | X = x_0}. (2) Given a probabilistic model for Problem 1, what is the impact on the verification analysis of extreme events? These problems can be solved with bivariate extremes (Coles, 2001) and the verification analysis of Ferro (2007). We apply the Ramos and Ledford (2009) parametric model for bivariate tail estimation of the pair (X,Y). The model accommodates different types of extremal dependence and asymmetry within a parsimonious representation. Results are presented using the ensemble reforecast system of the European Centre for Medium-Range Weather Forecasts (Hagedorn, 2008).
References: Coles, S. (2001) An Introduction to Statistical Modelling of Extreme Values. Springer-Verlag. Ferro, C.A.T. (2007) A probability model for verifying deterministic forecasts of extreme events. Wea. Forecasting, 22, 1089-1100. Hagedorn, R. (2008) Using the ECMWF reforecast dataset to calibrate EPS forecasts. ECMWF Newsletter, 117, 8-13. Ramos, A., Ledford, A. (2009) A new class of models for bivariate joint tails. J. R. Statist. Soc. B, 71, 219-241.
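Before fitting a parametric bivariate tail model, the conditional probability Pr{Y > y | X = x_0} from Problem 1 can be estimated empirically as a baseline. A sketch on synthetic forecast/observation pairs; the windowing approach and all parameter values are illustrative, not the Ramos-Ledford model:

```python
import numpy as np

def conditional_exceedance(forecasts, obs, x0, y, halfwidth=0.2):
    """Empirical Pr(Y > y | X near x0): the fraction of observations
    exceeding y among cases whose forecast lies within `halfwidth` of x0."""
    near = np.abs(forecasts - x0) <= halfwidth
    return float(np.mean(obs[near] > y)) if near.any() else float("nan")

# Demo: synthetic forecast/observation pairs with realistic correlation.
rng = np.random.default_rng(3)
x = rng.standard_normal(100_000)
y_obs = 0.8 * x + 0.6 * rng.standard_normal(100_000)
p = conditional_exceedance(x, y_obs, x0=2.0, y=1.5)
```

Such empirical estimates degrade exactly where extreme value theory is needed, in the far tail where few forecast cases fall near x_0, which motivates the parametric bivariate tail model used in the abstract.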
NASA Astrophysics Data System (ADS)
Alfieri, Silvia Maria; De Lorenzi, Francesca; Basile, Angelo; Bonfante, Antonello; Missere, Daniele; Menenti, Massimo
2014-05-01
Climate change in the Mediterranean area is likely to reduce precipitation amounts and to increase temperature, thus affecting the timing of development stages and the productivity of crops. Further, extreme weather events are expected to increase in the future, leading to a significant increase in agricultural risk. Some strategies for effectively managing risks and adapting to climate change involve adjustments to irrigation management and the use of different varieties. We quantified the risk to peach production in an irrigated area of the Emilia-Romagna region (Italy), taking into account the impact on crop yield of climate change and variability and of extreme weather events, as well as the ability of the agricultural system to modulate this impact (adaptive capacity) through changes in water and crop management. We focused on climatic events causing insufficient water supply to crops, while taking into account the effect of climate on the duration and timing of phenological stages. Further, extreme maximum and minimum temperature events causing significant reduction of crop yield were considered using phase-specific critical temperatures. In our study, risk was assessed as the product of the probability of a damaging event (hazard), such as drought or extreme temperatures, and the estimated impact of such an event (vulnerability). To estimate vulnerability we took into account the possible options to reduce risk, by combining estimates of the sensitivity of the system (negative impact on crop yield) and its adaptive capacity. The latter was evaluated as the relative improvement due to alternate management options: the use of alternate varieties or changes in irrigation management. Vulnerability was quantified using cultivar-specific thermal and hydrologic requirements of a set of cultivars, determined from experimental data and the scientific literature.
Critical temperatures determining a given reduction of crop yield were estimated and used to assess thermal hazard and vulnerability in sensitive phenological stages. Cultivar-specific yield response functions to water availability were used to assess the reduction of yield for a given management option. Downscaled climate scenarios were used to calculate indicators of soil water availability and thermal times, and to evaluate the variability of crop phenology in combination with critical temperatures. Two climate scenarios were considered: reference (1961-90) and future (2021-2050) climate, the former from climatic statistics on observed variables, and the latter from statistical downscaling of general circulation models (AOGCM). Management options were defined by combinations of irrigation strategies (optimal, rainfed and deficit) with the use of alternate varieties. As regards hydrologic conditions, risk assessment was carried out at landscape scale in all soil units within each study area. The mechanistic model SWAP (Soil-Water-Atmosphere-Plant) of water flow in the soil-plant-atmosphere system was used to describe the hydrological conditions in response to climate and irrigation. Different farm management options were evaluated. In a moderate water shortage scenario, deficit irrigation was an effective strategy to cope with climate change risks. In a severe water shortage scenario, the study showed the potential of intra-specific biodiversity to reduce the risk of yield losses, although costs should be evaluated against the benefits of each specific management option. The work was carried out within the Italian national project AGROSCENARI funded by the Ministry for Agricultural, Food and Forest Policies (MIPAAF, D.M. 8608/7303/2008)
Performance Analysis of a Citywide Real-time Landslide Early Warning System in Korea
NASA Astrophysics Data System (ADS)
Park, Joon-Young; Lee, Seung-Rae; Kang, Sinhang; Lee, Deuk-hwan; Nedumpallile Vasu, Nikhil
2017-04-01
Rainfall-induced landslides have been among the major disasters in Korea since the beginning of the 21st century, as global climate change has increased the magnitude and frequency of extreme precipitation events. In order to mitigate the increasing damage to property and loss of life, and to provide an effective tool for public officials to manage landslide disasters, a real-time landslide early warning system with an advanced concept has been developed, using Busan, the second-largest metropolitan city in Korea, as an operational test-bed. The system provides warning information based on a five-level alert scheme (Normal, Attention, Watch, Alert, and Emergency) using forecasted/observed rainfall data or data obtained from ground monitoring (volumetric water content and matric suction). The alert levels are determined by applying seven different thresholds in a step-wise manner following a decision tree. In pursuit of improved reliability of the early warning level assigned to a specific area, the system makes assessments repetitively using thresholds of different theoretical backgrounds, including statistical (empirical), physically-based, and mathematical analyses as well as direct measurement-based approaches. By mapping the distribution of the five early warning levels, determined independently for each of the tens of millions of grid cells covering the entire mountainous area of Busan, the regional-scale system can also provide early warning information for a specific local area. The fact that the highest warning level is determined using a concept of numerically-modelled potential debris-flow risk is another distinctive feature of the system. This study tested the system performance by applying it to four previous rainy seasons in order to validate its operational applicability.
During the rainy seasons of 2009, 2011, and 2014, the number of landslides recorded throughout Busan's territory reached 156, 64, and 37, respectively. In 2016, only three landslides were recorded even though the city experienced a couple of heavy rainfall events during the rainy season. The system performance test results show good agreement with the observations for these past rainfall events, suggesting that the system can also provide reliable warning information for future rainfall events.
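A step-wise, decision-tree alert assignment of the kind described can be sketched as follows. The input variables and threshold values are hypothetical placeholders, not the seven operational thresholds of the Busan system:

```python
def alert_level(rain_mm_h, soil_vwc, suction_kpa):
    """Five-level alert scheme (Normal, Attention, Watch, Alert, Emergency)
    evaluated as a decision tree. Thresholds are invented for illustration."""
    # Ground monitoring dominates: saturated soil or near-zero suction.
    if soil_vwc > 0.45 or suction_kpa < 2.0:
        return "Emergency"
    # Combined rainfall and soil-moisture condition.
    if rain_mm_h > 30 and soil_vwc > 0.40:
        return "Alert"
    # Rainfall-only thresholds, applied step-wise.
    if rain_mm_h > 20:
        return "Watch"
    if rain_mm_h > 10:
        return "Attention"
    return "Normal"
```

The operational system applies several independent thresholds (statistical, physically-based, measurement-based) and keeps the most severe outcome; the sketch collapses that into a single pass for clarity.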
Why System Safety Professionals Should Read Accident Reports
NASA Technical Reports Server (NTRS)
Holloway, C. M.; Johnson, C. W.
2006-01-01
System safety professionals, both researchers and practitioners, who regularly read accident reports reap important benefits. These benefits include an improved ability to separate myths from reality, including both myths about specific accidents and ones concerning accidents in general; an increased understanding of the consequences of unlikely events, which can help inform future designs; a greater recognition of the limits of mathematical models; and guidance on potentially relevant research directions that may contribute to safety improvements in future systems.
Real Time Coincidence Detection Engine for High Count Rate Timestamp Based PET
NASA Astrophysics Data System (ADS)
Tetrault, M.-A.; Oliver, J. F.; Bergeron, M.; Lecomte, R.; Fontaine, R.
2010-02-01
Coincidence engines follow two main implementation approaches: timestamp-based systems and AND-gate-based systems. The latter have been more widespread in recent years because of their lower cost and high efficiency. However, they are highly dependent on the selected electronic components, have limited flexibility once assembled, and are customized to fit a specific scanner's geometry. Timestamp-based systems have been gathering more attention lately, especially with high-channel-count, fully digital systems. These new systems must, however, cope with substantial singles count rates. One option is to record every detected event and postpone coincidence detection offline. For daily-use systems, a real-time engine is preferable because it dramatically reduces data volume and hence image preprocessing time and raw data management. This paper presents the timestamp-based coincidence engine for the LabPET™, a small-animal PET scanner with up to 4608 individual-readout avalanche photodiode channels. The engine can handle up to 100 million single events per second and has extensive flexibility because it resides in programmable logic devices. It can be adapted to any detector geometry or channel count, can be ported to newer, faster programmable devices, and can have extra modules added to take advantage of scanner-specific features. Finally, the user can select between a full processing mode for imaging protocols and a minimum processing mode to study different approaches to coincidence detection with offline software.
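The timestamp-window principle behind such an engine can be sketched in software (the real engine runs in programmable logic at far higher rates). This is a minimal illustration assuming a sorted singles stream and a simple nearest-neighbour pairing policy; the window value and the rule that paired events must come from different channels are illustrative assumptions, not the LabPET engine's actual policy.

```python
# Toy timestamp-based coincidence sorter: pair adjacent singles whose
# timestamps fall within the coincidence window and that hit different channels.
def find_coincidences(singles, window):
    """singles: iterable of (timestamp, channel); window: coincidence window, same units."""
    singles = sorted(singles)          # the hardware engine keeps the stream time-ordered
    pairs = []
    i = 0
    while i < len(singles) - 1:
        t0, ch0 = singles[i]
        t1, ch1 = singles[i + 1]
        if t1 - t0 <= window and ch0 != ch1:
            pairs.append(((t0, ch0), (t1, ch1)))
            i += 2                     # consume both singles of the coincidence
        else:
            i += 1                     # discard the earlier single, keep scanning
    return pairs
```

A real engine must additionally handle multiple-hit ambiguities and geometry cuts, but the core operation is exactly this windowed comparison of ordered timestamps.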
Zator, Krysten; Katz, Albert N
2017-07-01
Here, we examined linguistic differences in reports of memories produced by three cueing methods. Two groups of young adults were cued visually, either by words representing events or popular cultural phenomena that took place when they were 5, 10, or 16 years of age, or by a general lifetime-period word cue directing them to that period of their life. A third group heard 30-second musical clips of songs popular during the same three time periods. In each condition, participants typed a specific event memory evoked by the cue, and these typed memories were analysed with the Linguistic Inquiry and Word Count (LIWC) program. Differences in the reports indicated that listening to music evoked memories embodied in motor-perceptual systems more than memories evoked by the word-cueing conditions. Additionally, relative to music cues, lifetime-period word cues produced memories with reliably more uses of personal pronouns, past-tense terms, and negative emotions. The findings provide evidence for the embodiment of autobiographical memories and show how memories differ when cues emphasise different aspects of the encoded events.
Disclosure of adverse events and errors in surgical care: challenges and strategies for improvement.
Lipira, Lauren E; Gallagher, Thomas H
2014-07-01
The disclosure of adverse events to patients, including those caused by medical errors, is a critical part of patient-centered healthcare and a fundamental component of patient safety and quality improvement. Disclosure benefits patients, providers, and healthcare institutions. However, the act of disclosure can be difficult for physicians. Surgeons struggle with disclosure in unique ways compared with other specialties, and disclosure in the surgical setting has specific challenges. The frequency of surgical adverse events along with a dysfunctional tort system, the team structure of surgical staff, and obstacles created inadvertently by existing surgical patient safety initiatives may contribute to an environment not conducive to disclosure. Fortunately, there are multiple strategies to address these barriers. Participation in communication and resolution programs, integration of Just Culture principles, surgical team disclosure planning, refinement of informed consent and morbidity and mortality processes, surgery-specific professional standards, and understanding the complexities of disclosing other clinicians' errors all have the potential to help surgeons provide patients with complete, satisfactory disclosures. Improvement in the regularity and quality of disclosures after surgical adverse events and errors will be key as the field of patient safety continues to advance.
STS-61 mission director's post-mission report
NASA Technical Reports Server (NTRS)
Newman, Ronald L.
1995-01-01
To ensure the success of the complex Hubble Space Telescope servicing mission, STS-61, NASA established a number of independent review groups to assess management, design, planning, and preparation for the mission. One of the resulting recommendations for mission success was that an overall Mission Director be appointed to coordinate management activities of the Space Shuttle and Hubble programs and to consolidate results of the team reviews and expedite responses to recommendations. This report presents pre-mission events important to the experience base of mission management, with related Mission Director's recommendations following the event(s) to which they apply. All Mission Director's recommendations are presented collectively in an appendix. Other appendixes contain recommendations from the various review groups, including Payload Officers, the JSC Extravehicular Activity (EVA) Section, JSC EVA Management Office, JSC Crew and Thermal Systems Division, and the STS-61 crew itself. This report also lists mission events in chronological order and includes as an appendix a post-mission summary by the lead Payload Deployment and Retrieval System Officer. Recommendations range from those pertaining to specific component use or operating techniques to those for improved management, review, planning, and safety procedures.
Data Albums: An Event Driven Search, Aggregation and Curation Tool for Earth Science
NASA Technical Reports Server (NTRS)
Ramachandran, Rahul; Kulkarni, Ajinkya; Maskey, Manil; Bakare, Rohan; Basyal, Sabin; Li, Xiang; Flynn, Shannon
2014-01-01
Approaches used in Earth science research, such as case study analysis and climatology studies, involve discovering and gathering diverse data sets and information to support the research goals. Gathering relevant data and information for case studies and climatology analysis is both tedious and time consuming. Current Earth science data systems are designed with the assumption that researchers access data primarily by instrument or geophysical parameter. When researchers are interested in studying a significant event, they have to manually assemble a variety of relevant datasets by searching the different distributed data systems. This paper presents a specialized search, aggregation, and curation tool for Earth science that addresses these challenges. The search tool automatically creates curated 'Data Albums': aggregated collections of information related to a specific event, containing links to relevant data files (granules) from different instruments, tools and services for visualization and analysis, and information about the event from news reports, images, or videos to supplement research analysis. Curation in the tool is driven by an ontology-based relevancy ranking algorithm that filters out non-relevant information and data.
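An ontology-driven relevancy ranking of this kind can be illustrated with a toy scorer. The term weights, threshold, and scoring rule below are entirely hypothetical; the abstract does not specify the actual ranking algorithm, only that ontology terms drive relevance filtering.

```python
# Hypothetical ontology-term weights: positive terms indicate event relevance,
# negative terms indicate likely off-topic content.
ONTOLOGY_WEIGHTS = {
    "hurricane": 3.0,
    "precipitation": 2.0,
    "wind shear": 2.5,
    "celebrity": -5.0,
}

def relevancy(text):
    """Sum the weights of every ontology term that appears in the text."""
    text = text.lower()
    return sum(w for term, w in ONTOLOGY_WEIGHTS.items() if term in text)

def curate(documents, threshold=2.0):
    """Rank documents by relevancy and keep only those above the threshold."""
    ranked = sorted(documents, key=relevancy, reverse=True)
    return [d for d in ranked if relevancy(d) >= threshold]
```

For example, a news item mentioning both "hurricane" and "precipitation" scores 5.0 and is kept, while one mentioning "celebrity" alongside "hurricane" scores negative and is filtered out, mimicking the tool's removal of non-relevant material.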
NASA Astrophysics Data System (ADS)
Funk, Daniel
2015-04-01
Climate variability poses major challenges for decision-makers in climate-sensitive sectors. Seasonal-to-decadal (S2D) forecasts provide potential value for management decisions, especially in the context of climate change, where information from present or past climatology loses significance. However, usable and decision-relevant tailored climate forecasts are still sparse for Europe, and successful examples of application require elaborate, individual producer-user interaction. Assessing sector-specific vulnerabilities to critical climate conditions at specific temporal scales would be a great step forward in increasing the usability and efficiency of climate forecasts. A concept for a sector-specific vulnerability assessment (VA) of climate variability is presented. The focus of this VA is on the provision of usable vulnerability information that can be incorporated directly into decision-making processes. This is done by developing sector-specific climate-impact-decision pathways and identifying their specific time frames using data from both bottom-up and top-down approaches. The structure of common VAs for climate change related issues is adopted, which envisages the determination of exposure, sensitivity, and coping capacity. However, applying the common vulnerability components in the context of climate service application raises some fundamental considerations. Exposure: the effect of climate events on the system of concern may be modified and delayed by interconnected systems (e.g., a catchment). The critical time frame of a climate event or event sequence depends on system-internal thresholds and initial conditions, but also on decision-making processes, which require specific lead times of climate information to initiate the respective coping measures. Sensitivity: in organizational systems, climate may be only one of many factors relevant to decision making.
The scope of "sensitivity" in this concept comprises both the potential physical response of the system of concern and the criticality of climate-related decision-making processes. Coping capacity: in an operational context, coping capacity can reduce vulnerability only if it can be applied purposefully. With respect to climate vulnerabilities, this refers to the availability of suitable, usable, and skillful climate information. The focus of this concept is on existing S2D climate service products and their match with user needs. The outputs of the VA are climate-impact-decision pathways that characterize critical climate conditions, estimate the role of climate in decision-making processes, and evaluate the availability and potential usability of S2D climate forecast products. A classification scheme is developed for each component of the impact pathway to assess its specific significance. The systemic character of these schemes enables broad application of this VA across sectors where quantitative data are limited. The concept was developed, and will be tested, within the EU FP7 project "European Provision Of Regional Impacts Assessments on Seasonal and Decadal Timescales" (EUPORIAS).
Preliminary study of visual perspective in mental time travel in schizophrenia.
Wang, Ya; Wang, Yi; Zhao, Qing; Cui, Ji-Fang; Hong, Xiao-Hong; Chan, Raymond Ck
2017-10-01
This study explored the specificity and visual perspective of mental time travel in schizophrenia. Fifteen patients with schizophrenia and 18 controls were recruited. Participants were asked to recall or imagine specific events in response to cue words. Results showed that patients with schizophrenia generated fewer specific events than controls, and that recalled events were more specific than imagined events. Patients with schizophrenia adopted the field perspective less, and the observer perspective more, than controls. These results suggest that patients with schizophrenia are impaired in mental time travel in both specificity and visual perspective. Further studies are needed to identify the underlying mechanisms. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
da Silva, Fabricio Polifke; Justi da Silva, Maria Gertrudes Alvarez; Rotunno Filho, Otto Corrêa; Pires, Gisele Dornelles; Sampaio, Rafael João; de Araújo, Afonso Augusto Magalhães
2018-05-01
Natural disasters are the result of extreme or intense natural phenomena that cause severe impacts on society. These impacts can be mitigated through preventive measures, which can be aided by better knowledge of extreme phenomena and by monitoring of forecasting and alert systems. The city of Petropolis (in a mountainous region of the state of Rio de Janeiro, Brazil) is prone to heavy rain events, often leading to river overflows, landslides, and loss of life. In that context, this work characterizes the thermodynamic and dynamic synoptic patterns that trigger heavy rainfall episodes and the corresponding flooding of the Quitandinha River. More specifically, we reviewed events from January 2013 to December 2014 using reanalysis data. We expect the resulting description of synoptic patterns to provide adequate qualitative support for the decision-making processes involved in operational forecasting procedures. We found that flooding events were related to the presence of the South Atlantic Convergence Zone (SACZ), frontal systems (FS), and convective storms (CS). These systems showed similar behavior in the high-frequency wind components, notably northwest winds before precipitation and a strong southwest wind component during rainfall events. Clustering analyses indicated that the main component of precipitation formation for CS systems is daytime heating, with the dynamic component being more efficient for FS configurations. SACZ events were influenced by moisture availability along the vertical column of the atmosphere as well as by the dynamic component of precipitation efficiency and daytime heating, the latter related to the continuous transport of moisture from the Amazon region and the South Atlantic Ocean towards Rio de Janeiro state.
Moro, Pedro L; Museru, Oidda I; Niu, Manette; Lewis, Paige; Broder, Karen
2014-06-01
To characterize adverse events (AEs) after hepatitis A vaccines (Hep A) and hepatitis A and hepatitis B combination vaccine (Hep AB) in pregnant women, as reported to the Vaccine Adverse Event Reporting System (VAERS), a spontaneous reporting surveillance system. We searched VAERS for AE reports in pregnant women who received Hep A or Hep AB from January 1, 1996, to April 5, 2013. Clinicians reviewed all reports and available medical records. VAERS received 139 reports of AEs in pregnant women; 7 (5.0%) were serious; no maternal or infant deaths were identified. Sixty-five (46.8%) did not describe any AEs. For women whose gestational age was available, most were vaccinated during the first trimester: 50/60 (83.3%) for Hep A and 18/21 (85.7%) for Hep AB. The most common pregnancy-specific outcomes following Hep A or Hep AB vaccination were spontaneous abortion in 15 (10.8%) reports, elective termination in 10 (7.2%), and preterm delivery in 7 (5.0%). The most common non-pregnancy-specific outcomes were urinary tract infection and nausea/vomiting, with 3 (2.2%) reports each. One case of amelia of the lower extremities was reported in an infant following maternal Hep A immunization. This review of VAERS reports did not identify any concerning pattern of AEs in pregnant women or their infants following maternal Hep A or Hep AB immunization during pregnancy. Published by Mosby, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rechard, Robert P.
This report presents a concise history in tabular form of events leading up to site identification in 1978, site selection in 1987, subsequent characterization, and ongoing analysis through 2008 of the performance of a repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain in southern Nevada. The tabulated events generally occurred in five periods: (1) commitment to mined geologic disposal and identification of sites; (2) site selection and analysis, based on regional geologic characterization through literature and analogous data; (3) feasibility analysis demonstrating calculation procedures and importance of system components, based on rough measures of performance using surface exploration, waste process knowledge, and general laboratory experiments; (4) suitability analysis demonstrating viability of disposal system, based on environment-specific laboratory experiments, in-situ experiments, and underground disposal system characterization; and (5) compliance analysis, based on completed site-specific characterization. Because the relationship is important to understanding the evolution of the Yucca Mountain Project, the tabulation also shows the interaction between four broad categories of political bodies and government agencies/institutions: (a) technical milestones of the implementing institutions, (b) development of the regulatory requirements and related federal policy in laws and court decisions, (c) Presidential and agency directives and decisions, and (d) critiques of the Yucca Mountain Project and pertinent national and world events related to nuclear energy and radioactive waste.
A modular telerobotic task execution system
NASA Technical Reports Server (NTRS)
Backes, Paul G.; Tso, Kam S.; Hayati, Samad; Lee, Thomas S.
1990-01-01
A telerobot task execution system is proposed to provide a general parametrizable task execution capability. The system includes communication with the calling system, e.g., a task planning system, and single- and dual-arm sensor-based task execution with monitoring and reflexing. A specific task is described by specifying the parameters to various available task execution modules including trajectory generation, compliance control, teleoperation, monitoring, and sensor fusion. Reflex action is achieved by finding the corresponding reflex action in a reflex table when an execution event has been detected with a monitor.
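The reflex-table mechanism, looking up a reflex action when a monitor detects an execution event, can be sketched as follows. The task names, event names, and actions are hypothetical placeholders for illustration, not those of the actual telerobot system.

```python
# Hypothetical reflex table: (task, monitored event) -> reflex action.
REFLEX_TABLE = {
    ("move_arm", "excess_force"):  "halt_and_retract",
    ("move_arm", "joint_limit"):   "halt",
    ("grasp",    "slip_detected"): "regrasp",
}

def on_event(task, event):
    """Look up the reflex action for an event detected during a task.

    Unknown (task, event) pairs fall through to a conservative safe stop,
    an illustrative default rather than the system's documented behavior.
    """
    return REFLEX_TABLE.get((task, event), "safe_stop")
```

The appeal of the table-driven design is that reflex behavior becomes a parameter of the task description: new sensor events or recovery actions are added as table entries, without changing the execution modules themselves.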
Pohlman, Katherine A; Carroll, Linda; Tsuyuki, Ross T; Hartling, Lisa; Vohra, Sunita
2017-12-01
Patient safety performance can be assessed with several systems, including passive and active surveillance. Passive surveillance systems provide an opportunity for health care personnel to confidentially and voluntarily report incidents, including adverse events, occurring in their work environment. Active surveillance systems systematically monitor patient encounters to seek detailed information about adverse events that occur in work environments; unlike passive surveillance, active surveillance allows collection of both numerator (number of adverse events) and denominator (number of patients seen) data. Chiropractic manual therapy is commonly used in both adults and children, yet few studies have evaluated the safety of chiropractic manual therapy for children. To address this, this study will compare adverse event reporting in passive versus active surveillance systems after chiropractic manual therapy in the pediatric population. This cluster randomized controlled trial aims to enroll 70 doctors of chiropractic (unit of randomization), assigned to either a passive or an active surveillance system, to report adverse events that occur after treatment for 60 consecutive pediatric (13 years of age and younger) patient visits (unit of analysis). A modified enrollment process with a two-phase consent procedure will be implemented to maintain provider blinding and minimize dropouts. The first phase of consent confirms the provider's interest in a trial investigating the safety of chiropractic manual therapy. The second phase ensures that providers understand the specific requirements of the group to which they were randomized. Percentages, incidence estimates, and 95% confidence intervals will be used to describe the count of reported adverse events in each group. The primary outcome will be the number and quality of the adverse event reports in the active versus the passive surveillance group.
With 80% power and a 5% one-sided significance level, the sample size was calculated to be 35 providers per group, allowing for 11% loss to follow-up of chiropractors and 20% of patient visits. This study will be the first direct comparison of adverse event reporting using passive versus active surveillance. It is also the largest prospective evaluation of adverse events reported after chiropractic manual therapy in children, identified as a major gap in the academic literature. ClinicalTrials.gov, ID: NCT02268331. Registered on 10 October 2014.
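The planned descriptive statistics, an incidence estimate with a 95% confidence interval from event and visit counts, can be illustrated with a short sketch. The protocol does not state which interval method will be used; the Wilson score interval shown here is one common, reasonable choice for proportions near zero (as adverse-event rates typically are).

```python
import math

def wilson_ci(events, visits, z=1.96):
    """Wilson score 95% confidence interval for an adverse-event proportion.

    events: number of reported adverse events (numerator)
    visits: number of patient visits monitored (denominator)
    """
    if visits == 0:
        return (0.0, 0.0)
    p = events / visits
    denom = 1 + z**2 / visits
    centre = (p + z**2 / (2 * visits)) / denom
    half = z * math.sqrt(p * (1 - p) / visits + z**2 / (4 * visits**2)) / denom
    return (centre - half, centre + half)
```

Unlike passive surveillance, active surveillance supplies the denominator, which is exactly what makes an interval like this computable at all.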
Contributory factors in surgical incidents as delineated by a confidential reporting system.
Mushtaq, F; O'Driscoll, C; Smith, Fct; Wilkins, D; Kapur, N; Lawton, R
2018-05-01
Background: Confidential reporting systems play a key role in capturing information about adverse surgical events. However, the value of these systems is limited if the reports that are generated are not subjected to systematic analysis. The aim of this study was to provide the first systematic analysis of data from a novel surgical confidential reporting system, to delineate contributory factors in surgical incidents, and to document lessons that can be learned. Methods: One hundred and forty-five patient safety incidents submitted to the UK Confidential Reporting System for Surgery over a 10-year period were analysed using an adapted version of the empirically grounded Yorkshire Contributory Factors Framework. Results: The factors most commonly identified as contributing to reported surgical incidents were cognitive limitations (30.09%), communication failures (16.11%), and a lack of adherence to established policies and procedures (8.81%). The analysis also revealed that adverse events were only rarely related to an isolated, single factor (20.71%); the majority of cases involved multiple contributory factors (79.29% of all cases had more than one contributory factor). Examination of active failures, those closest in time and space to the adverse event, pointed to frequent coupling with latent, systems-related contributory factors. Conclusions: Specific patterns of errors often underlie surgical adverse events and may therefore be amenable to targeted intervention, including particular forms of training. The findings in this paper confirm the view that surgical errors tend to be multi-factorial in nature, which also necessitates a multi-disciplinary and system-wide approach to bringing about improvements.
Automatic programming of simulation models
NASA Technical Reports Server (NTRS)
Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.
1990-01-01
The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on applying an element of rapid prototyping, or automatic programming, to assist the modeler in defining the problem specification. Once the problem specification has been defined, an automatic code generator is used to write the simulation code. Two domains were selected for evaluating the concepts of software engineering for discrete event simulation: a manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS); (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.
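The automatic-programming idea, turning a declarative problem specification into executable simulation code, can be sketched in miniature. The spec fields and the generated model below are hypothetical toys; AMPS itself targets much richer manufacturing models.

```python
# Hypothetical problem specification: two serial stations with exponential
# interarrival and service times (all field names and values are illustrative).
SPEC = {
    "stations": ["mill", "drill"],
    "interarrival": 5.0,
    "service": {"mill": 3.0, "drill": 4.0},
}

def generate_sim(spec):
    """Emit Python source for a trivial discrete-event simulation from the spec."""
    src = [
        "import random",
        "",
        "def simulate(n_jobs, seed=0):",
        "    rng = random.Random(seed)",
        "    t = 0.0",
        "    for _ in range(n_jobs):",
        f"        t += rng.expovariate(1/{spec['interarrival']})  # job arrival",
    ]
    for s in spec["stations"]:
        src.append(f"        t += rng.expovariate(1/{spec['service'][s]})  # service at {s}")
    src.append("    return t")
    return "\n".join(src)
```

The generated source can then be executed directly, which is the essence of the "define the specification, let the generator write the code" workflow the abstract describes:

```python
ns = {}
exec(generate_sim(SPEC), ns)
total_time = ns["simulate"](100)  # completion time of 100 jobs through both stations
```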
[Systemic safety following intravitreal injections of anti-VEGF].
Baillif, S; Levy, B; Girmens, J-F; Dumas, S; Tadayoni, R
2018-03-01
The goal of this article is to assess data suggesting that intravitreal injection of anti-vascular endothelial growth factors (anti-VEGFs) could result in systemic adverse events (AEs). The class-specific systemic AEs should be similar to those encountered in cancer trials. The AEs most frequently observed in oncology, hypertension and proteinuria, should thus be the most commonly expected in ophthalmology, but their severity should be lower because of the much lower doses of anti-VEGFs administered intravitreally. Such AEs have not been frequently reported in ophthalmology trials. In addition, pharmacokinetic and pharmacodynamic data describing the systemic diffusion of anti-VEGFs should be interpreted with caution because of the significant inconsistencies reported. Thus, the safety data reported in ophthalmology trials, together with the pharmacokinetic/pharmacodynamic data, provide robust evidence that systemic events after intravitreal injection are very unlikely. Additional studies are needed to explore this issue further, as much remains to be understood about the local and systemic side effects of anti-VEGFs. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
Guffey, Patrick; Szolnoki, Judit; Caldwell, James; Polaner, David
2011-07-01
Current incident reporting systems encourage retrospective reporting of morbidity and mortality and have low participation rates. A near miss is an event that did not cause patient harm but had the potential to. By tracking and analyzing near misses, systems improvements can be targeted appropriately and future errors may be prevented. An electronic, web-based, secure, anonymous reporting system for anesthesiologists was designed and instituted at The Children's Hospital, Denver. This portal was compared to an existing hospital incident reporting system. A total of 150 incidents were reported in the first 3 months of operation, compared with four entered over the same time period one year earlier. An anesthesia-specific anonymous near-miss reporting system, which eases and facilitates data entry and can prospectively identify processes and practices that place patients at risk, was implemented at a large, academic, freestanding children's hospital. This resulted in a dramatic increase in reported events and provided data to target and drive quality and process improvement. © 2011 Blackwell Publishing Ltd.
Flex Robotic System in transoral robotic surgery: The first 40 patients.
Mattheis, Stefan; Hasskamp, Pia; Holtmann, Laura; Schäfer, Christina; Geisthoff, Urban; Dominas, Nina; Lang, Stephan
2017-03-01
The Flex Robotic System is a new robotic device developed specifically for transoral robotic surgery (TORS). We performed a prospective clinical study assessing the safety and efficacy of the Medrobotics Flex Robotic System. A total of 40 patients required a surgical procedure for benign lesions (n = 30) or T1 and T2 carcinomas (n = 10). Access to, and visualization of, different anatomic subsites were individually graded by the surgeon. Setup times, access and visualization times, surgical results, and adverse events were documented intraoperatively. The lesions could be properly exposed and visualized in 38 patients (95%), who went on to have a surgical procedure performed with the Flex Robotic System; these procedures were intraoperatively evaluated as successful. No serious adverse events occurred. Lesions in the oropharynx, hypopharynx, or supraglottic larynx could be successfully resected using the Flex Robotic System, making the system a safe and effective tool in transoral robotic surgery. © 2016 Wiley Periodicals, Inc. Head Neck 39: 471-475, 2017.
NASA Astrophysics Data System (ADS)
Obriejetan, Michael; Rauch, Hans Peter; Florineth, Florin
2013-04-01
Erosion control systems consisting of technical and biological components are widely accepted and proven to work well if installed properly with regard to site-specific parameters. A wide range of implementation measures exists for this specific protection purpose, and new, in particular technical, solutions are constantly introduced into the market. Nevertheless, the vegetation aspects of erosion control measures are frequently disregarded and deserve closer consideration against the backdrop of developing and implementing adaptation strategies for an environment altered by the effects of climate change. Technical auxiliaries such as the geotextiles typically used for slope protection (nettings, blankets, turf reinforcement mats, etc.) address specific features, and given their structural and material diversity, differing effects on sediment yield, surface runoff, and vegetation development seem evident. Nevertheless, there is a knowledge gap concerning the mutual interaction between technical and biological components, and specific comparable data on the erosion-reducing effects of technical-biological erosion protection systems are insufficient. In this context, an experimental arrangement was set up to study the correlated influences of geotextiles and vegetation and to determine their (combined) effects on surface runoff and soil loss during simulated heavy rainfall events. Sowing vessels filled with topsoil serve as testing facilities, with various organic and synthetic geotextiles applied and a reliable, drought-resistant seed mixture used. Regular vegetation monitoring as well as two rainfall simulation runs with four repetitions of each variant were conducted. A portable rainfall simulator with a standardized rainfall intensity of 240 mm h-1 and a three-minute rainfall duration was used to stress these systems at different stages of plant development on a 30-degree slope.
First results show significant differences between the systems with respect to sediment yield, runoff amount, and vegetation development.
Holland, Alisha C.; Addis, Donna Rose; Kensinger, Elizabeth A.
2011-01-01
We examined the neural correlates of specific (i.e., unique to time and place) and general (i.e., extended in or repeated over time) autobiographical memories (AMs) during their initial construction and later elaboration phases. The construction and elaboration of specific and general events engaged a widely distributed set of regions previously associated with AM recall. Specific (vs. general) event construction preferentially engaged prefrontal and medial temporal lobe regions known to be critical for memory search and retrieval processes. General event elaboration was differentiated from specific event elaboration by extensive right-lateralized prefrontal cortex (PFC) activity. Interaction analyses confirmed that PFC activity was disproportionately engaged by specific AMs during construction, and general AMs during elaboration; a similar pattern was evident in regions of the left lateral temporal lobe. These neural differences between specific and general AM construction and elaboration were largely unrelated to reported differences in the level of detail recalled about each type of event. PMID:21803063
Jing, Helen G; Madore, Kevin P; Schacter, Daniel L
2017-12-01
A critical adaptive feature of future thinking involves the ability to generate alternative versions of possible future events. However, little is known about the nature of the processes that support this ability. Here we examined whether an episodic specificity induction - brief training in recollecting details of a recent experience that selectively impacts tasks that draw on episodic retrieval - (1) boosts alternative event generation and (2) changes one's initial perceptions of negative future events. In Experiment 1, an episodic specificity induction significantly increased the number of alternative positive outcomes that participants generated to a series of standardized negative events, compared with a control induction not focused on episodic specificity. We also observed larger decreases in the perceived plausibility and negativity of the original events in the specificity condition, where participants generated more alternative outcomes, relative to the control condition. In Experiment 2, we replicated and extended these findings using a series of personalized negative events. Our findings support the idea that episodic memory processes are involved in generating alternative outcomes to anticipated future events, and that boosting the number of alternative outcomes is related to subsequent changes in the perceived plausibility and valence of the original events, which may have implications for psychological well-being. Published by Elsevier B.V.
Towards a global flood detection system using social media
NASA Astrophysics Data System (ADS)
de Bruijn, Jens; de Moel, Hans; Jongman, Brenden; Aerts, Jeroen
2017-04-01
It is widely recognized that an early warning is critical in improving international disaster response. Analysis of social media in real time can provide valuable information about an event or help to detect unexpected events. For successful and reliable detection systems that work globally, it is important that sufficient data are available and that the algorithm works both in data-rich and data-poor environments. In this study, both a new geotagging system and a multi-level event detection system for flood hazards were developed using Twitter data. Geotagging algorithms that regard one tweet as a single document are well studied. However, no algorithms exist that combine several sequential tweets mentioning keywords regarding a specific event type. Within the time frame of an event, multiple users use event-related keywords that refer to the same place name. This notion allows us to treat several sequential tweets posted in the last 24 hours as one document. For all these tweets, we collect a series of spatial indicators given in the tweet metadata and extract additional topological indicators from the text. Using these indicators, we can reduce ambiguity and thus better estimate which locations are tweeted about. Using these localized tweets, Bayesian change-point analysis is used to find significant increases in tweets mentioning countries, provinces or towns. In data-poor environments, detection of events at the country level is possible, while in data-rich environments detection at the city level is achieved. Additionally, at the city level we analyse the spatial dependence of mentioned places: if multiple places within a limited spatial extent are mentioned, detection confidence increases. We run the algorithm on 2 years of Twitter data with flood-related keywords in 13 major languages and validate against a flood event database.
We find that the geotagging algorithm yields significantly more data than previously developed algorithms and successfully deals with ambiguous place names. In addition, we show that our detection system can both quickly and reliably detect floods, even in countries where data is scarce, while achieving high detail in countries where more data is available.
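The detection step described above can be illustrated with a much simpler sketch. The paper's actual method is Bayesian change-point analysis; the Poisson null model, the function names, and the counts below are illustrative assumptions only, meant to show how a spike in localized flood-keyword tweets could be flagged against a historical baseline:

```python
import math

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam), by summing the lower tail."""
    cdf, term = 0.0, math.exp(-lam)
    for i in range(k):
        cdf += term
        term *= lam / (i + 1)
    return max(0.0, 1.0 - cdf)

def detect_spike(daily_counts, alpha=0.001):
    """Flag a location if the latest day's tweet count is improbably high
    relative to its historical daily mean (simple Poisson null model)."""
    history, latest = daily_counts[:-1], daily_counts[-1]
    baseline = max(sum(history) / len(history), 1e-9)
    return poisson_sf(latest, baseline) < alpha

# A data-poor country: low baseline keyword chatter, then a burst.
quiet = [2, 1, 3, 2, 2, 1, 2, 3]
burst = [2, 1, 3, 2, 2, 1, 2, 40]
```

A per-country or per-city threshold like this naturally adapts to data-poor versus data-rich environments, since the baseline is learned from each location's own history.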
Surgical Specimen Management: A Descriptive Study of 648 Adverse Events and Near Misses.
Steelman, Victoria M; Williams, Tamara L; Szekendi, Marilyn K; Halverson, Amy L; Dintzis, Suzanne M; Pavkovic, Stephen
2016-12-01
Surgical specimen adverse events can lead to delays in treatment or diagnosis, misdiagnosis, reoperation, inappropriate treatment, and anxiety or serious patient harm. The objective of this study was to describe the types and frequency of event reports associated with the management of surgical specimens, the contributing factors, and the level of harm associated with these events. A retrospective review was undertaken of surgical specimen adverse events and near misses voluntarily reported in the University HealthSystem Consortium Safety Intelligence Patient Safety Organization database by more than 50 health care facilities during a 3-year period (2011-2013). Event reports that involved surgical specimen management were reviewed for patients undergoing surgery during which tissue or fluid was sent to the pathology department. Six hundred forty-eight surgical specimen events were reported across all stages of the specimen management process, with the most common events reported during the prelaboratory phase and, specifically, with specimen labeling, collection/preservation, and transport. The most common contributing factors were failures in handoff communication, staff inattention, knowledge deficit, and environmental issues. Eight percent of the events (52 of 648) resulted in either the need for additional treatment or temporary or permanent harm to the patient. All phases of specimen handling and processing are vulnerable to errors. These results provide a starting point for health care organizations to conduct proactive risk analyses of specimen handling procedures and to design safer processes. Particular attention should be paid to effective communication and handoffs, consistent processes across care areas, and staff training. In addition, organizations should consider the use of technology-based identification and tracking systems.
Increasing the Automation and Autonomy for Spacecraft Operations with Criteria Action Table
NASA Technical Reports Server (NTRS)
Li, Zhen-Ping; Savki, Cetin
2005-01-01
The Criteria Action Table (CAT) is an automation tool developed for monitoring real-time system messages for specific events and processes in order to take user-defined actions based on a set of user-defined rules. CAT was developed by Lockheed Martin Space Operations as part of a larger NASA effort at the Goddard Space Flight Center (GSFC) to create a component-based, middleware-based, and standards-based general-purpose ground system architecture referred to as GMSEC - the GSFC Mission Services Evolution Center. CAT has been integrated into the upgraded ground systems for the Tropical Rainfall Measuring Mission (TRMM) and Small Explorer (SMEX) satellites, and it plays a central role in their automation effort to reduce the cost and increase the reliability of spacecraft operations. The GMSEC architecture provides a standard communication interface and protocol for components to publish/subscribe messages to an information bus. It also provides a standard message definition so components can send and receive messages via the bus interface rather than to each other, thus reducing component-to-component coupling, interfaces, protocols, and link (socket) management. With the GMSEC architecture, components can publish standard event messages to the bus for all nominal, significant, and surprising events in regard to satellite, celestial, ground system, or any other activity. In addition to sending standard event messages, each GMSEC-compliant component is required to accept and process GMSEC directive request messages.
Bicentennial Source Book, Level I, K-2.
ERIC Educational Resources Information Center
Herb, Sharon; And Others
This student activities source book is one of a series of four developed by the Carroll County Public School System, Maryland, for celebration of the Bicentennial. It is specifically designed to generate ideas integrating the Bicentennial celebration into various disciplines, classroom activities, and school-wide events at the kindergarten through…
Bicentennial Source Book, Level II, Grades 3-5.
ERIC Educational Resources Information Center
Orth, Nancy; And Others
This student activities source book is one of a series of four developed by the Carroll County Public School System, Maryland, for celebration of the Bicentennial. It is specifically designed to generate ideas integrating the Bicentennial celebration into various disciplines, classroom activities, and school-wide events at the third grade through…
Automatic Association of News Items.
ERIC Educational Resources Information Center
Carrick, Christina; Watters, Carolyn
1997-01-01
Discussion of electronic news delivery systems and the automatic generation of electronic editions focuses on the association of related items of different media type, specifically photos and stories. The goal is to be able to determine to what degree any two news items refer to the same news event. (Author/LRW)
High-throughput screening (HTS) for potential thyroid–disrupting chemicals requires a system of assays to capture multiple molecular-initiating events (MIEs) that converge on perturbed thyroid hormone (TH) homeostasis. Screening for MIEs specific to TH-disrupting pathways is limi...
Conversion of Questionnaire Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, Danny H; Elwood Jr, Robert H
During the survey, respondents are asked to provide qualitative answers (well, adequate, needs improvement) on how well material control and accountability (MC&A) functions are being performed. These responses can be used to develop failure probabilities for basic events performed during routine operation of the MC&A systems. The failure frequencies for individual events may be used to estimate total system effectiveness using a fault tree in a probabilistic risk analysis (PRA). Numeric risk values are required for the PRA fault tree calculations that are performed to evaluate system effectiveness. So, the performance ratings in the questionnaire must be converted to relative risk values for all of the basic MC&A tasks performed in the facility. If a specific material protection, control, and accountability (MPC&A) task is being performed at the 'perfect' level, the task is considered to have a near zero risk of failure. If the task is performed at a less than perfect level, the deficiency in performance represents some risk of failure for the event. As the degree of deficiency in performance increases, the risk of failure increases. If a task that should be performed is not being performed, that task is in a state of failure. The failure probabilities of all basic events contribute to the total system risk. Conversion of questionnaire MPC&A system performance data to numeric values is a separate function from the process of completing the questionnaire. When specific questions in the questionnaire are answered, the focus is on correctly assessing and reporting, in an adjectival manner, the actual performance of the related MC&A function. Prior to conversion, consideration should not be given to the numeric value that will be assigned during the conversion process. In the conversion process, adjectival responses to questions on system performance are quantified based on a log normal scale typically used in human error analysis (see A.D. Swain and H.E. Guttmann, 'Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications,' NUREG/CR-1278). This conversion produces the basic event risk-of-failure values required for the fault tree calculations. The fault tree is a deductive logic structure that corresponds to the operational nuclear MC&A system at a nuclear facility. The conventional Delphi process is a time-honored approach commonly used in the risk assessment field to extract numerical values for the failure rates of actions or activities when statistically significant data are absent.
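The conversion-plus-fault-tree idea can be sketched in a few lines. The anchor probabilities below are illustrative only (the actual NUREG/CR-1278-style log-normal scale is not reproduced in the abstract), and the OR gate assumes independent basic events:

```python
# Hypothetical log-scale mapping from adjectival ratings to basic-event
# failure probabilities; these anchor values are illustrative, not NUREG's.
ADJECTIVAL_SCALE = {
    "perfect": 1e-5,
    "well": 1e-4,
    "adequate": 1e-3,
    "needs improvement": 1e-2,
    "not performed": 1.0,   # task in a state of failure
}

def basic_event_probability(rating):
    return ADJECTIVAL_SCALE[rating]

def or_gate(probabilities):
    """Fault-tree OR gate: the branch fails if any basic event fails,
    assuming independence of the basic events."""
    p_ok = 1.0
    for p in probabilities:
        p_ok *= (1.0 - p)
    return 1.0 - p_ok

ratings = ["well", "adequate", "needs improvement"]
p_branch = or_gate(basic_event_probability(r) for r in ratings)
```

Real fault trees mix AND and OR gates and non-independent events; this sketch only shows how the adjectival-to-numeric conversion feeds the calculation.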
NASA Astrophysics Data System (ADS)
Denny, E. G.; Miller-Rushing, A. J.; Haggerty, B. P.; Wilson, B. E.
2009-12-01
The USA National Phenology Network has recently initiated a national effort to encourage people at different levels of expertise—from backyard naturalists to professional scientists—to observe phenological events and contribute to a national database that will be used to greatly improve our understanding of spatio-temporal variation in phenology and associated phenological responses to climate change. Traditional phenological observation protocols identify specific single dates at which individual phenological events are observed, but the scientific usefulness of long-term phenological observations can be improved with a more carefully structured protocol. At the USA-NPN we have developed a new approach that directs observers to record each day that they observe an individual plant, and to assess and report the state of specific life stages (or phenophases) as occurring or not occurring on that plant for each observation date. Evaluation is phrased in terms of simple, easy-to-understand questions (e.g. “Do you see open flowers?”), which makes it very appropriate for a broad audience. From this method, a rich dataset of phenological metrics can be extracted, including the duration of a phenophase (e.g. open flowers), the beginning and end points of a phenophase (e.g. traditional phenological events such as first flower and last flower), multiple distinct occurrences of phenophases within a single growing season (e.g. multiple flowering events, common in drought-prone regions), as well as quantification of sampling frequency and observational uncertainties. The system also includes a mechanism for translation of phenophase start and end points into standard traditional phenological events to facilitate comparison of contemporary data collected with this new “phenophase status” monitoring approach to historical datasets collected with the “phenological event” monitoring approach.
These features greatly enhance the utility of the resulting data for statistical analyses addressing questions such as how phenological events vary in time and space, and in response to global change.
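The derivation of event-style metrics from status observations can be sketched directly. The function and field names below are illustrative, not USA-NPN's actual schema; the logic follows the abstract: onset and end are the first and last "yes" dates, and each run of consecutive "yes" reports counts as a distinct occurrence:

```python
from datetime import date

def phenophase_metrics(observations):
    """Derive event-style metrics from phenophase-status observations.
    observations: chronologically ordered (date, bool) pairs,
    True = phenophase reported as occurring on that date."""
    yes_dates = [d for d, status in observations if status]
    runs, prev = 0, False
    for _, status in observations:
        if status and not prev:   # a new run of "occurring" starts
            runs += 1
        prev = status
    if not yes_dates:
        return {"onset": None, "end": None, "duration_days": 0, "occurrences": 0}
    return {
        "onset": min(yes_dates),                                  # e.g. first flower
        "end": max(yes_dates),                                    # e.g. last flower
        "duration_days": (max(yes_dates) - min(yes_dates)).days + 1,
        "occurrences": runs,                                      # distinct flowering events
    }

obs = [
    (date(2009, 3, 1), False),
    (date(2009, 3, 8), True),    # first "open flowers" observed
    (date(2009, 3, 15), True),
    (date(2009, 3, 22), False),
    (date(2009, 4, 5), True),    # a second flowering event
]
m = phenophase_metrics(obs)
```

Sampling frequency (the gaps between observation dates) bounds the uncertainty of the onset and end estimates, which is exactly the extra information the status-based protocol preserves over single-date event reporting.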
NASA Astrophysics Data System (ADS)
Tian, F.; Sivapalan, M.; Li, H.; Hu, H.
2007-12-01
The importance of diagnostic analysis of hydrological models is increasingly recognized by the scientific community (M. Sivapalan et al., 2003; H. V. Gupta et al., 2007). Model diagnosis refers to model structures and parameters being identified not only by statistical comparison of system state variables and outputs but also by process understanding in a specific watershed. Process understanding can be gained by the analysis of observational data and model results at the specific watershed as well as through regionalization. Although remote sensing technology can provide valuable data about the inputs, state variables, and outputs of the hydrological system, observational rainfall-runoff data still constitute the most accurate, reliable, and direct component of hydrology-related databases. One critical question in model diagnostic analysis is, therefore, what signature characteristics we can extract from rainfall and runoff data. To date, only a few studies have focused on this question (e.g., Merz et al., 2006; Lana-Renault et al., 2007), and none has related event analysis to model diagnosis in an explicit, rigorous, and systematic manner. Our work focuses on the identification of the dominant runoff generation mechanisms from event analysis of rainfall-runoff data, including correlation analysis and analysis of timing patterns. The correlation analysis involves the identification of the complex relationship among rainfall depth, intensity, runoff coefficient, and antecedent conditions, and the timing pattern analysis aims to identify the clustering pattern of runoff events in relation to the patterns of rainfall events. Our diagnostic analysis illustrates the changing pattern of runoff generation mechanisms in the DMIP2 test watersheds located in the Oklahoma region, a pattern that is also reproduced by numerical simulations based on the TsingHua Representative Elementary Watershed (THREW) model.
The result suggests the usefulness of rainfall-runoff event analysis for model development as well as model diagnostics.
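The core of the correlation analysis can be sketched as follows. The event values and helper names are hypothetical; the sketch just shows the two quantities the abstract relates: per-event runoff coefficients and their correlation with rainfall depth:

```python
def runoff_coefficients(events):
    """events: list of (rainfall_mm, runoff_mm) pairs, one per storm event.
    The runoff coefficient is the fraction of rainfall that becomes runoff."""
    return [runoff / rainfall for rainfall, runoff in events]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical events: runoff coefficient rising with rainfall depth, a
# signature often read as saturation-excess runoff generation.
events = [(10.0, 1.0), (25.0, 5.0), (40.0, 14.0), (60.0, 30.0)]
rc = runoff_coefficients(events)
r = pearson([p for p, _ in events], rc)
```

A strong depth-dependence of the runoff coefficient (versus, say, an intensity-dependence) is the kind of signature that discriminates between runoff generation mechanisms in such diagnostic analyses.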
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gopan, O; Novak, A; Zeng, J
Purpose: Physics pre-treatment plan review is crucial to safe radiation oncology treatments. Studies show that most errors originate in treatment planning, which underscores the importance of physics plan review. As a QA measure the physics review is of fundamental importance and is central to the profession of medical physics. However, little is known about its effectiveness, and more hard data are needed. The purpose of this study was to quantify the effectiveness of physics review with the goal of improving it. Methods: This study analyzed 315 “potentially serious” near-miss incidents within an institutional incident learning system collected over a two-year period. 139 of these originated prior to physics review and were found at the review or after. Incidents were classified as events that: 1) were detected by physics review, 2) could have been detected (but were not), and 3) could not have been detected. Category 1 and 2 events were classified by which specific check (within physics review) detected or could have detected the event. Results: Of the 139 analyzed events, 73/139 (53%) were detected or could have been detected by the physics review, although 42/73 (58%) were not actually detected. 45/73 (62%) of the errors originated in treatment planning, making physics review the first step in the workflow that could detect the error. Two specific physics checks were particularly effective (combined effectiveness of >20%): verifying DRRs (8/73) and verifying isocenter (7/73). Software-based plan checking systems were evaluated and found to have a potential effectiveness of 40%. Given current data structures, software implementations of some tests, such as the isocenter verification check, would be challenging. Conclusion: Physics plan review is a key safety measure and can detect the majority of reported events. However, a majority of events that potentially could have been detected were NOT detected in this study, indicating the need to improve the performance of physics review.
Exploring the effect of alcohol on post-event processing specific to a social event.
Battista, Susan R; Kocovski, Nancy L
2010-01-01
Inconsistent findings regarding the relationship between social anxiety and alcohol use suggest that further research is needed to explore how alcohol affects various components of social anxiety. Post-event processing, or rumination after social events, is an element of cognitive models of social anxiety that is related to increased levels of social anxiety. The goal of the current study was to explore the interrelationships among social anxiety, post-event processing, and alcohol use. A sample of 208 university students completed online questionnaires to assess their levels of trait social anxiety and trait depression as well as their alcohol consumption at a specific social event. Participants then completed questionnaires to assess levels of post-event processing specific to the social event they attended. Results revealed that the amount of alcohol individuals consumed at the event predicted increased levels of post-event processing above and beyond levels of trait social anxiety and depression. As such, drinking may lead to increased post-event processing in student samples.
Factors Associated with Complications in Older Adults with Isolated Blunt Chest Trauma
Lotfipour, Shahram; Kaku, Shawn K.; Vaca, Federico E.; Patel, Chirag; Anderson, Craig L.; Ahmed, Suleman S.; Menchine, Michael D.
2009-01-01
Objective: To determine the prevalence of adverse events in elderly trauma patients with isolated blunt thoracic trauma, and to identify variables associated with these adverse events. Methods: We performed a chart review of 160 trauma patients age 65 and older with significant blunt thoracic trauma, drawn from an American College of Surgeons Level I Trauma Center registry. Patients with serious injury to other body areas were excluded to prevent confounding the cause of adverse events. Adverse events were defined as acute respiratory distress syndrome or pneumonia, unanticipated intubation, transfer to the intensive care unit for hypoxemia, or death. Data collected included history, physical examination, radiographic findings, length of hospital stay, and clinical outcomes. Results: Ninety-nine patients had isolated chest injury, while 61 others had other organ systems injured and were excluded. Sixteen patients developed adverse events [16.2%, 95% confidence interval (CI) 9.5–24.9%], including two deaths. Adverse events were experienced by 19.2%, 6.1%, and 28.6% of patients 65–74, 75–84, and ≥85 years old, respectively. The mean length of stay was 14.6 days in patients with an adverse event and 5.8 days in patients without. Post hoc analysis revealed that all 16 patients with an adverse event had one or more of the following: age ≥85, initial systolic blood pressure <90 mmHg, hemothorax, pneumothorax, three or more unilateral rib fractures, or pulmonary contusion (sensitivity 100%, CI 79.4–100%; specificity 38.6%, CI 28.1–49.9%). Conclusion: Adverse events from isolated thoracic trauma complicated 16% of cases in our elderly sample. These criteria were 100% sensitive and 38.6% specific for these adverse events. This study is a first step toward identifying variables that might aid in identifying patients at high risk for serious adverse events. PMID:19561823
Mathematical approach to nonlocal interactions using a reaction-diffusion system.
Tanaka, Yoshitaro; Yamamoto, Hiroko; Ninomiya, Hirokazu
2017-06-01
In recent years, spatial long-range interactions during developmental processes have been introduced as a result of the integration of microscopic information, such as molecular events and signaling networks. They are often called nonlocal interactions. If the profile of a nonlocal interaction is determined by experiments, we can easily investigate how patterns are generated through numerical simulations, without modeling the detailed microscopic events. Thus, nonlocal interactions are useful tools for understanding complex biosystems. However, nonlocal interactions are often inconvenient for observing specific mechanisms because of the integration of information. Accordingly, we propose a new method that converts nonlocal interactions into a reaction-diffusion system with auxiliary unknown variables. In this review, by introducing biological and mathematical studies related to nonlocal interactions, we present a heuristic understanding of nonlocal interactions via a reaction-diffusion system. © 2017 Japanese Society of Developmental Biologists.
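The kernel-to-system conversion the abstract alludes to can be sketched schematically; the expansion of the kernel into Green's functions and the small parameter ε below are an illustrative assumption, not necessarily the authors' exact formulation:

```latex
% Nonlocal model: \partial_t u = d\,\Delta u + f(u) + (K * u), where
% (K * u)(x,t) = \int_\Omega K(x-y)\,u(y,t)\,dy.
% Assume the kernel is (approximately) a linear sum of Green's functions
% k_j of linear elliptic operators, (\alpha_j - d_j \Delta)\,k_j = \delta,
% so that K(x) \approx \sum_{j=1}^{N} a_j\,k_j(x).
% Introducing auxiliary variables v_j \approx k_j * u yields a purely
% local reaction-diffusion system:
\begin{align}
  \partial_t u &= d\,\Delta u + f(u) + \sum_{j=1}^{N} a_j v_j, \\
  \varepsilon\,\partial_t v_j &= d_j\,\Delta v_j + u - \alpha_j v_j,
  \qquad j = 1,\dots,N,
\end{align}
% since, as \varepsilon \to 0, v_j \to (\alpha_j - d_j\Delta)^{-1} u = k_j * u,
% recovering the nonlocal convolution term.
```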
FEATURES, EVENTS, AND PROCESSES: SYSTEM-LEVEL AND CRITICALITY
DOE Office of Scientific and Technical Information (OSTI.GOV)
D.L. McGregor
The primary purpose of this Analysis/Model Report (AMR) is to identify and document the screening analyses for the features, events, and processes (FEPs) that do not easily fit into the existing Process Model Report (PMR) structure. These FEPs include the 31 FEPs designated as System-Level Primary FEPs and the 22 FEPs designated as Criticality Primary FEPs. A list of these FEPs is provided in Section 1.1. This AMR (AN-WIS-MD-000019) documents the Screening Decision and Regulatory Basis, Screening Argument, and Total System Performance Assessment (TSPA) Disposition for each of the subject Primary FEPs. This AMR provides screening information and decisions for the TSPA-SR report and provides the same information for incorporation into a project-specific FEPs database. This AMR may also assist reviewers during the licensing-review process.
A Functional High-Throughput Assay of Myelination in Vitro
2014-07-01
iPS cells derived from human astrocytes. These cell lines will serve as an excellent source of human cells from which our model systems may be...image the 3D rat dorsal root ganglion (DRG) cultures with sufficiently low background as to detect electrically-evoked depolarization events, as...stimulation and recording system specifically for this purpose. Further, we found that the limitations inherent in optimizing speed and FOV may
ERIC Educational Resources Information Center
Park, Insu
2010-01-01
The purpose of this study is to explore systems users' behavior on IS under the various circumstances (e.g., email usage and malware threats, online communication at the individual level, and IS usage in organizations). Specifically, the first essay develops a method for analyzing and predicting the impact category of malicious code, particularly…
OpenROCS: a software tool to control robotic observatories
NASA Astrophysics Data System (ADS)
Colomé, Josep; Sanz, Josep; Vilardell, Francesc; Ribas, Ignasi; Gil, Pere
2012-09-01
We present the Open Robotic Observatory Control System (OpenROCS), an open source software platform developed for the robotic control of telescopes. It acts as a software infrastructure that executes all the necessary processes to implement responses to the system events that appear in the routine and non-routine operations associated with data-flow and housekeeping control. The OpenROCS software design and implementation provide the flexibility to adapt to different observatory configurations and event-action specifications. It is based on an abstract model that is independent of the specific hardware or software and is highly configurable. Interfaces to the system components are defined in a simple manner to achieve this goal. We give a detailed description of version 2.0 of this software, based on a modular architecture developed in PHP and XML configuration files, and using standard communication protocols to interface with applications for hardware monitoring and control, environment monitoring, scheduling of tasks, image processing and data quality control. We provide two examples of how it is used as the core element of the control system in two robotic observatories: the Joan Oró Telescope at the Montsec Astronomical Observatory (Catalonia, Spain) and the SuperWASP Qatar Telescope at the Roque de los Muchachos Observatory (Canary Islands, Spain).
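The event-action pattern behind such a control system can be sketched with a minimal XML-driven dispatcher. The element names and event/action identifiers below are invented for illustration and are not OpenROCS's actual configuration schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical event-action configuration in the spirit of the XML files
# the platform uses; all names here are illustrative.
CONFIG = """
<eventActions>
  <event name="RAIN_DETECTED"><action>CLOSE_DOME</action></event>
  <event name="NEW_IMAGE"><action>RUN_PIPELINE</action><action>CHECK_QUALITY</action></event>
</eventActions>
"""

def load_event_actions(xml_text):
    """Parse the config into a lookup table: event name -> ordered actions."""
    root = ET.fromstring(xml_text)
    return {e.get("name"): [a.text for a in e.findall("action")]
            for e in root.findall("event")}

def handle(event_name, table):
    """Return the ordered list of actions to execute for a system event."""
    return table.get(event_name, [])

table = load_event_actions(CONFIG)
```

Keeping the event-action mapping in configuration rather than code is what makes such a platform adaptable to different observatory setups without modifying the software itself.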
NASA Astrophysics Data System (ADS)
Fiori, E.; Comellas, A.; Molini, L.; Rebora, N.; Siccardi, F.; Gochis, D. J.; Tanelli, S.; Parodi, A.
2014-03-01
The city of Genoa, which lies between the Ligurian Sea and the Apennine mountains (Liguria, Italy), was rocked by severe flash floods on 4 November 2011. Nearly 500 mm of rain, a third of the average annual rainfall, fell in six hours. Six people perished and millions of euros in damages occurred. The synoptic-scale meteorological system moved across the Atlantic Ocean and into the Mediterranean, generating floods that killed five people in southern France before moving over the Ligurian Sea and Genoa, where it produced the extreme event studied here. Cloud-permitting simulations (1 km) of the finger-like convective system responsible for the torrential event over Genoa have been performed using the Advanced Research Weather and Forecasting Model (ARW-WRF, version 3.3). Two different microphysics schemes (WSM6 and Thompson) as well as three different convection closures (explicit, Kain-Fritsch, and Betts-Miller-Janjic) were evaluated to gain a deeper understanding of the physical processes underlying the observed heavy rain event and the model's capability to predict, in hindcast mode, its structure and evolution. The impact of forecast initialization and of model vertical discretization on hindcast results is also examined. Comparison between model hindcasts and observed fields provided by raingauge, satellite, and radar data shows that this particular event is strongly sensitive to the details of the mesoscale initialization, despite having evolved from a relatively large-scale weather system. Only the meso-γ-scale details of the event were not well captured by the best-performing configuration of the ARW-WRF model, and consequently peak hourly rainfall was not especially well reproduced. The results also show that specifying microphysical parameters suited to such events has a positive impact on the prediction of heavy precipitation intensity.
NASA Astrophysics Data System (ADS)
Tonini, Roberto; Sandri, Laura; Rouwet, Dmitri; Caudron, Corentin; Marzocchi, Warner; Suparjan
2016-07-01
Although most volcanic hazard studies focus on magmatic eruptions, hazardous volcanic events can also occur when no migration of magma can be recognized. Examples are tectonic and hydrothermal unrest that may lead to phreatic eruptions. Recent events (e.g., the Ontake eruption in September 2014) have demonstrated that phreatic eruptions are still hard to forecast, despite being potentially very hazardous. For these reasons, it is of paramount importance to identify indicators that define the condition of nonmagmatic unrest, in particular for hydrothermal systems. Often, this type of unrest is driven by the movement of fluids, requiring alternative monitoring setups beyond the classical seismic-geodetic-geochemical architectures. Here we present a new version of the probabilistic BET (Bayesian Event Tree) model, specifically developed to include the forecasting of nonmagmatic unrest and related hazards. The structure of the new event tree differs from previous schemes by adding a specific branch to detail nonmagmatic unrest outcomes. A further goal of this work is to provide a user-friendly, open-access, and straightforward tool to handle the probabilistic forecast and visualize the results as possible support during a volcanic crisis. The new event tree and tool are applied here to Kawah Ijen stratovolcano, Indonesia, as an illustrative application. In particular, the tool is set up on the basis of monitoring data for the learning period 2000-2010 and is then blindly applied to the test period 2010-2012, during which significant unrest phases occurred.
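The arithmetic of an event tree is simply the product of conditional probabilities along each branch. The node names and all numbers below are hypothetical, chosen only to illustrate a tree with a nonmagmatic (hydrothermal/tectonic) branch alongside the magmatic one:

```python
# Hypothetical minimal event tree: unrest -> origin of unrest -> eruption.
# All probabilities are illustrative, not BET's calibrated values.
tree = {
    "unrest": 0.30,  # P(unrest in the forecast window)
    "origin": {"magmatic": 0.40, "hydrothermal": 0.45, "tectonic": 0.15},
    "eruption_given_origin": {"magmatic": 0.25, "hydrothermal": 0.10, "tectonic": 0.02},
}

def absolute_probabilities(tree):
    """Multiply conditional probabilities along each branch of the tree to
    get the absolute probability of an eruption of each origin."""
    out = {}
    for origin, p_origin in tree["origin"].items():
        out[origin] = tree["unrest"] * p_origin * tree["eruption_given_origin"][origin]
    return out

probs = absolute_probabilities(tree)
```

In a monitoring-driven tool, the conditional probabilities at each node would be updated from observations as the crisis evolves, which is what distinguishes a Bayesian event tree from a static logic tree.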
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guzowski, R.V.; Newman, G.
1993-12-01
The Greater Confinement Disposal location is being evaluated to determine whether defense-generated transuranic waste buried at this location complies with the Containment Requirements established by the US Environmental Protection Agency. One step in determining compliance is to identify those combinations of events and processes (scenarios) that define possible future states of the disposal system for which performance assessments must be performed. An established scenario-development procedure was used to identify a comprehensive set of mutually exclusive scenarios. To assure completeness, 761 features, events, processes, and other listings (FEPs) were compiled from 11 references. This number was reduced to 205, primarily through the elimination of duplications. The 205 FEPs were screened based on site-specific, goal-specific, and regulatory criteria. Four events survived screening and were used in preliminary scenario development: (1) exploratory drilling penetrates a GCD borehole, (2) drilling of a withdrawal/injection well penetrates a GCD borehole, (3) subsidence occurs at the RWMS, and (4) irrigation occurs at the RWMS. A logic diagram was used to develop 16 scenarios from the four events. No screening of these scenarios was attempted at this time. Additional screening of the currently retained events and processes will be based on additional data and information from site-characterization activities. When screening of the events and processes is completed, a final set of scenarios will be developed and screened based on consequence and probability of occurrence.
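The step from four retained events to 16 scenarios is just the enumeration of every occur/not-occur combination, which a logic diagram makes explicit. A minimal sketch (the function name is ours; the four event descriptions are taken from the abstract):

```python
from itertools import product

events = [
    "exploratory drilling penetrates a GCD borehole",
    "withdrawal/injection well penetrates a GCD borehole",
    "subsidence occurs at the RWMS",
    "irrigation occurs at the RWMS",
]

def enumerate_scenarios(events):
    """Every on/off combination of the retained events, as in a logic
    diagram: 2**n mutually exclusive scenarios for n events."""
    return [tuple(e for e, occurs in zip(events, combo) if occurs)
            for combo in product([False, True], repeat=len(events))]

scenarios = enumerate_scenarios(events)
```

The empty combination is the base-case scenario in which none of the disruptive events occurs; mutual exclusivity follows because each scenario fixes the occurrence status of every event.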
1985-04-01
and equipment whose operation can be verified with a visual or aural check. The sequence of outputs shall be cyclic, with provisions to stop the...private memory. The decision to provide spare, expansion capability, or a combination of both shall be based on life cycle cost (to the best extent...Computational System should be determined in conjunction with a computer expert (if possible). In any event, it is best to postpone completing - this
APNEA list mode data acquisition and real-time event processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hogle, R.A.; Miller, P.; Bramblett, R.L.
1997-11-01
The LMSC Active Passive Neutron Examinations and Assay (APNEA) Data Logger is a VME-based data acquisition system using commercial off-the-shelf hardware with application-specific software. It receives TTL inputs from eighty-eight ³He detector tubes and eight timing signals. Two data sets are generated concurrently for each acquisition session: (1) List Mode recording of all detector and timing signals, timestamped to 3-microsecond resolution; (2) Event Accumulations generated in real time by counting events into short (tens of microseconds) and long (seconds) time bins following repetitive triggers. List Mode data sets can be post-processed to: (1) determine the optimum time bins for TRU assay of waste drums, (2) analyze a given data set in several ways to match different assay requirements and conditions, and (3) confirm assay results by examining details of the raw data. Data Logger events are processed and timestamped by an array of 15 TMS320C40 DSPs and delivered to an embedded controller (PowerPC 604) for interim disk storage. Three acquisition modes, corresponding to different trigger sources, are provided. A standard network interface to a remote host system (Windows NT or SunOS) provides for system control, status, and transfer of previously acquired data. 6 figs.
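The post-processing of list-mode data into trigger-relative time bins can be sketched simply. The bin widths and function name below are illustrative assumptions, not APNEA's actual parameters; the structure mirrors the short-bins-then-long-bin accumulation the record describes:

```python
def bin_events(timestamps_us, trigger_us, short_bin_us=20, n_short=10,
               long_window_us=1_000_000):
    """Accumulate list-mode event timestamps (microseconds) relative to a
    trigger: a train of short bins first, then one long catch-all bin."""
    short = [0] * n_short
    long_count = 0
    for t in timestamps_us:
        dt = t - trigger_us
        if dt < 0:
            continue                      # event precedes this trigger
        idx = dt // short_bin_us
        if idx < n_short:
            short[idx] += 1               # falls in a short bin
        elif dt < long_window_us:
            long_count += 1               # falls in the long bin
    return short, long_count

pulses = [3, 9, 25, 47, 150, 199, 5000]   # hypothetical detector hits (us)
short, long_count = bin_events(pulses, trigger_us=0)
```

Because the raw list-mode stream is kept, the same data can be re-binned with different parameters offline, which is exactly what allows optimum time bins to be determined after the fact.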
Marsiglio, Mary C.; Chronister, Krista M.; Gibson, Brandon; Leve, Leslie D.
2014-12-01
Researchers have postulated associations between childhood trauma and delinquency, but few have examined the direction of these relationships prospectively and, specifically, with samples of delinquent girls. The purpose of this study was to examine the relationship between traumatic events and delinquency for girls in the juvenile justice system using a cross-lagged model. Developmental differences in associations as a function of high school entry status were also examined. The sample included 166 girls in the juvenile justice system who were mandated to community-based out-of-home care due to chronic delinquency. Overall, study results provide evidence that trauma and delinquency risk pathways vary according to high school entry status. Implications for future research and practice are discussed. PMID:25580179
Thackway, Sarah; Churches, Timothy; Fizzell, Jan; Muscatello, David; Armstrong, Paul
2009-09-08
Background: Mass gatherings have been defined by the World Health Organisation as "events attended by a sufficient number of people to strain the planning and response resources of a community, state or nation". This paper explores the public health response to mass gatherings in Sydney, the factors that influenced the extent of deployment of resources, and the utility of planning for mass gatherings as a preparedness exercise for other health emergencies. Discussion: Not all mass gatherings of people require enhanced surveillance and additional response. The main drivers of extensive public health planning for mass gatherings reflect geographical spread, number of international visitors, event duration, and political and religious considerations. In these instances, the implementation of a formal risk assessment prior to the event, with ongoing daily review, is important in identifying public health hazards. Developing and utilising event-specific surveillance to provide early-warning systems that address the specific risks identified through the risk assessment process is essential. The extent to which additional resources are required will vary and depends on the current level of surveillance infrastructure. Planning the public health response is the third step in preparing for mass gatherings. If the existing public health workforce has been regularly trained in emergency response procedures, then far less effort and fewer resources will be needed to prepare for each mass gathering event. The use of formal emergency management structures and co-location of surveillance and planning operational teams during events facilitates timely communication and action. Summary: One-off mass gathering events can provide a catalyst for innovation and engagement, and result in opportunities for ongoing public health planning, training and surveillance enhancements that outlast each event. PMID:19735577
NASA Astrophysics Data System (ADS)
Tavakoli, S.; Poslad, S.; Fruhwirth, R.; Winter, M.
2012-04-01
This paper introduces an application of the novel EventTracker platform for instantaneous Sensitivity Analysis (SA) of large-scale real-time geo-information. Earth disaster management systems demand high-quality information to aid a quick and timely response to their evolving environments. The idea behind the proposed EventTracker platform is the assumption that modern information management systems are able to capture data in real time and have the technological flexibility to adjust their services to work with specific sources of data/information. However, to assure this adaptation in real time, the online data should be collected, interpreted, and translated into corrective actions in a concise and timely manner. This can hardly be handled by existing sensitivity analysis methods because they rely on historical data and lazy processing algorithms. In event-driven systems, the effect of system inputs on system state is of value, as events can cause this state to change. This 'event triggering' situation underpins the logic of the proposed approach. The event-tracking sensitivity analysis method describes the system variables and states as a collection of events. The more often an input variable occurs during the triggering of an event, the greater its potential impact on the final analysis of the system state. Experiments were designed to compare the proposed event-tracking sensitivity analysis with an existing entropy-based sensitivity analysis method. The results showed a 10% improvement in computational efficiency with no compromise in accuracy, and the time required to perform the sensitivity analysis was only 0.5% of that required by the entropy-based method. The proposed method has been applied to real-world data in the context of preventing emerging crises at drilling rigs.
One of the major purposes of such rigs is to drill boreholes to explore oil or gas reservoirs, with the ultimate aim of recovering the contents of those reservoirs, both onshore and offshore. Drilling a well is always guided by technical, economic and safety constraints to protect the crew, equipment and environment from injury, damage and pollution. Although risk assessment and local practice provide a high degree of safety, uncertainty arises from the behaviour of the formation, which may cause critical situations at the rig. To overcome such uncertainties, real-time sensor measurements form a basis for predicting, and thus preventing, such crises; the proposed method supports the identification of the data necessary for this.
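The core of the event-tracking idea, scoring an input variable by how often its events coincide with changes of system state, can be sketched as simple coincidence counting within a time window. The variable names, the window, and the normalisation below are illustrative assumptions, not the EventTracker implementation:

```python
from collections import Counter

def event_tracking_sensitivity(input_events, state_changes, window):
    """Score each input variable by how often its events coincide
    (within +/- window time units) with a change of system state,
    then normalise the scores to fractions of all coincidences."""
    scores = Counter()
    for var, t in input_events:
        if any(abs(t - ts) <= window for ts in state_changes):
            scores[var] += 1
    total = sum(scores.values()) or 1
    return {var: n / total for var, n in scores.most_common()}

# (variable, timestamp) pairs, and the times the system state changed
inputs = [("flow_rate", 1.8), ("pressure", 2.1),
          ("flow_rate", 5.0), ("torque", 5.2)]
state_changes = [2.0, 5.1]
ranking = event_tracking_sensitivity(inputs, state_changes, window=0.5)
print(ranking)   # flow_rate scores highest (0.5 vs 0.25 each)
```

Unlike entropy-based SA, nothing here requires a historical distribution: the scores can be updated incrementally as events stream in, which is what makes the approach cheap enough for real-time use.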
Detecting NEO Impacts using the International Monitoring System
NASA Astrophysics Data System (ADS)
Brown, Peter G.; Dube, Kimberlee; Silber, Elizabeth
2014-11-01
As part of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty, an International Monitoring System (IMS) consisting of seismic, hydroacoustic, infrasound and radionuclide technologies has been deployed globally beginning in the late 1990s. The infrasound sub-network of the IMS consists of 47 active stations as of mid-2014. These microbarograph arrays detect coherent infrasonic signals from a range of sources including volcanoes, man-made explosions and bolides. Bolide detections from IMS stations have been reported since ~2000, but with the maturation of the network over the last several years the rate of detections has increased substantially. Presently the IMS performs semi-automated near-real-time global event identification on timescales of 6-12 hours, as well as analyst-verified event identification with time lags of several weeks. Here we report on infrasound events identified by the IMS between 2010 and 2014 which are likely bolide impacts. Identification in this context refers to an event being included in one of the event bulletins issued by the IMS. In this untargeted study we find that the IMS globally identifies approximately 16 events per year which are likely bolide impacts. Using US Government sensor detections of fireballs released since the beginning of 2014 (as given at http://neo.jpl.nasa.gov/fireballs/ ), we find in a complementary targeted survey that the current IMS is able to identify ~25% of fireballs with E > 0.1 kT energy. Using all 16 US Government sensor fireballs listed as of July 31, 2014, we are able to detect infrasound from 75% of these events on at least one IMS station. The high ratio of detections to identifications is a product of the stricter criteria adopted by the IMS for inclusion in an event bulletin as compared to simple station detection. We discuss comparisons between infrasound-estimated energies, based on signal amplitudes and periods, and the energy estimates provided by US Government sensors.
Specific impact events of interest will be discussed as well as the utility of the global IMS infrasound system for location and timing of future NEAs detected prior to impact.
Transient Volcano Deformation Event Detection over Variable Spatial Scales in Alaska
NASA Astrophysics Data System (ADS)
Li, J. D.; Rude, C. M.; Gowanlock, M.; Herring, T.; Pankratius, V.
2016-12-01
Transient deformation events driven by volcanic activity can be monitored using increasingly dense networks of continuous Global Positioning System (GPS) ground stations. The wide spatial extent of GPS networks, the large number of GPS stations, and the spatially and temporally varying scale of deformation events result in the mixing of signals from multiple sources. Typical analysis then necessitates manual identification of times and regions of volcanic activity for further study, and the careful tuning of algorithmic parameters to extract possible transient events. Here we present a computer-aided discovery system that facilitates the identification of potential transient deformation events at volcanoes by providing a framework for selecting varying spatial regions of interest and for tuning the analysis parameters. This site specification step in the framework reduces the spatial mixing of signals from different volcanic sources before applying filters to remove interfering signals originating from other geophysical processes. We analyze GPS data recorded by the Plate Boundary Observatory network and volcanic activity logs from the Alaska Volcano Observatory to search for and characterize transient inflation events in Alaska. We find 3 transient inflation events between 2008 and 2015 at the Akutan, Westdahl, and Shishaldin volcanoes in the Aleutian Islands. The inflation event detected in the first half of 2008 at Akutan is validated by other studies, while the inflation events observed in early 2011 at Westdahl and in early 2013 at Shishaldin are previously unreported. Our analysis framework also incorporates modelling of the transient inflation events and enables a comparison of different magma chamber inversion models. Here, we also estimate the magma sources that best describe the deformation observed by the GPS stations at Akutan, Westdahl, and Shishaldin. We acknowledge support from NASA AIST-NNX15AG84G (PI: V. Pankratius).
Solar Energetic Particles Events and Human Exploration: Measurements in a Space Habitat
NASA Astrophysics Data System (ADS)
Narici, L.; Berrilli, F.; Casolino, M.; Del Moro, D.; Forte, R.; Giovannelli, L.; Martucci, M.; Mergè, M.; Picozza, P.; Rizzo, A.; Scardigli, S.; Sparvoli, R.; Zeitlin, C.
2016-12-01
Solar activity is the source of Space Weather disturbances. Flares, CMEs and coronal holes modulate the physical conditions of circumterrestrial and interplanetary space and ultimately the fluxes of high-energy ionized particles, i.e., solar energetic particles (SEP) and the galactic cosmic ray (GCR) background. This ionizing radiation affects spacecraft and biological systems, and is therefore an important issue for human exploration of space. During a deep space voyage (for example, a trip to Mars) radiation risk thresholds may well be exceeded by the crew, so mitigation countermeasures must be employed. Solar particle events (SPE) pose high risks due to their impulsive high dose rates. Forecasting of SPEs is therefore needed, specifically tailored to the requirements of human exploration. Understanding the parameters of SPEs that produce events leading to higher health risks for astronauts in deep space is therefore a first-priority issue. Measurements of SPE effects with active devices in LEO inside the ISS can produce important information for the specific SEP measured, relative to the specific detector location in the ISS (a human habitat with shielding typical of crewed spacecraft). Active detectors can select data from specific geomagnetic regions along the orbits, allowing geomagnetic selections that best mimic deep space radiation. We present results from data acquired in 2010 - 2012 by the detector system ALTEA inside the ISS (18 SPEs detected). We compare these data with data from the PAMELA detector on a LEO satellite, with the RAD data from Curiosity's journey to Mars, with GOES data, and with several solar physical parameters. Several features of the radiation modulation are easily understood as effects of the geomagnetic field (for example, the flux in the ISS is proportional to the energetic proton flux measured by GOES), while other features are more difficult to interpret.
The final goal of this work is to find the characteristics of solar events leading to highest radiation risks in a human habitat during deep space exploration to best focus the needed forecasting.
Surveillance of adverse effects during a vaccination campaign against meningitis C.
Laribière, Anne; Miremont-Salamé, Ghada; Reyre, Hadrien; Abouelfath, Abdelilah; Liège, Ludovic; Moore, Nicholas; Haramburu, Françoise
2005-12-01
To describe adverse events occurring after mass vaccination with conjugate and nonconjugate vaccines and to assess the incidence of serious adverse effects. A mass immunisation campaign against meningococcal C disease was conducted in two French administrative areas, Landes and Pyrénées atlantiques, for 2 months (from October to December 2002). Adverse events were reported by families and physicians by means of a specific reporting form returned to the pharmacovigilance centre 15 days after vaccination. The target population was 260,630 individuals aged between 2 months and 24 years. About 179,000 children and young adults were vaccinated. A total of 92,711 report forms were received by the pharmacovigilance centre, and 12,695 subjects presented at least one adverse event. The most frequently involved systems/disorders were application site disorders (48.4%), whole-body general disorders (21.8%), central and peripheral nervous system disorders (14.6%), and gastrointestinal system disorders (4.7%). Most of these adverse events were transient and not serious. There were 13 serious adverse events: one each of syncope, fever, headache with fever, neuralgia, serum sickness, arthritis, purpura, facial paralysis, multiple sclerosis, lipoma, and meningism, and two cases of bronchospasm. No significant difference was found in rates of adverse event reports between the two vaccines. The estimated incidence of serious adverse effect reports was 7 per 100,000 (13 serious events among approximately 179,000 vaccinees). This campaign was the second immunisation campaign undertaken in France involving both physicians and families as reporters. Although unlabeled adverse effects were identified during this campaign, they were mostly nonserious and have been known to occur with other vaccines.
Petersen Multipliers for Several SEU (Single Event Upset) Environment Models.
1986-09-30
The Aerospace Corporation, El Segundo, CA 90245, 30 September 1986. Prepared for SPACE DIVISION, AIR FORCE SYSTEMS COMMAND, Los Angeles Air Force Station...X (pC/om)2 /Um " (for S in Pm2 and for in picocoulombs per um). (In another system of units, the constant in the equation for F, above, takes the...distributions that may be expected under a wide variety of conditions. These models have since become standards for use in the specification of system
Polarization of microglia and its role in bacterial sepsis.
Michels, Monique; Sonai, Beatriz; Dal-Pizzol, Felipe
2017-02-15
Microglial polarization in response to brain inflammatory conditions is a growing field in neuroscience. However, the effect of systemic inflammation, and specifically sepsis, is a relatively unexplored area of great interest and relevance. Sepsis has been associated with both early and late harmful events in the central nervous system, suggesting that there is a close link between sepsis and neuroinflammation. During the evolution of sepsis, microglia are thought to exert both neurotoxic and repairing effects, depending on the specific microglial phenotype assumed. In this context, we review the role of microglial polarization in sepsis-associated brain dysfunction. Copyright © 2017 Elsevier B.V. All rights reserved.
Security and Privacy Qualities of Medical Devices: An Analysis of FDA Postmarket Surveillance
Kramer, Daniel B.; Baker, Matthew; Ransford, Benjamin; Molina-Markham, Andres; Stewart, Quinn; Fu, Kevin; Reynolds, Matthew R.
2012-01-01
Background: Medical devices increasingly depend on computing functions such as wireless communication and Internet connectivity for software-based control of therapies and network-based transmission of patients' stored medical information. These computing capabilities introduce security and privacy risks, yet little is known about the prevalence of such risks within the clinical setting. Methods: We used three comprehensive, publicly available databases maintained by the Food and Drug Administration (FDA) to evaluate recalls and adverse events related to security and privacy risks of medical devices. Results: Review of weekly enforcement reports identified 1,845 recalls; 605 (32.8%) of these included computers, 35 (1.9%) stored patient data, and 31 (1.7%) were capable of wireless communication. Searches of databases specific to recalls and adverse events identified only one event with a specific connection to security or privacy. Software-related recalls were relatively common, and most (81.8%) mentioned the possibility of upgrades, though only half of these provided specific instructions for the update mechanism. Conclusions: Our review of recalls and adverse events from federal government databases reveals sharp inconsistencies with databases at individual providers with respect to security and privacy risks. Recalls related to software may increase security risks because of unprotected update and correction mechanisms. To detect signals of security and privacy problems that adversely affect public health, federal postmarket surveillance strategies should rethink how to effectively and efficiently collect data on security and privacy problems in devices that increasingly depend on computing systems susceptible to malware. PMID:22829874
NASA Technical Reports Server (NTRS)
Maul, William A.; Meyer, Claudia M.
1991-01-01
A rocket engine safety system was designed to initiate control procedures that minimize damage to the engine, vehicle, or test stand in the event of an engine failure. The features and the implementation issues associated with rocket engine safety systems are discussed, as well as the specific concerns of safety systems applied to a space-based engine and long-duration space missions. Examples of safety system features and architectures are given, based on recent safety monitoring investigations conducted for the Space Shuttle Main Engine and for future liquid rocket engines. The general design and implementation process for rocket engine safety systems is also presented.
Vouk, Katja; Benter, Ursula; Amonkar, Mayur M; Marocco, Alessia; Stapelkamp, Ceilidh; Pfersch, Sylvie; Benjamin, Laure
2016-09-01
To estimate the per-event cost and economic burden associated with managing the most common and/or severe metastatic melanoma (MM) treatment-related adverse events (AEs) in Australia, France, Germany, Italy, and the UK. AEs associated with chemotherapy (dacarbazine, paclitaxel, fotemustine), immunotherapy (ipilimumab), and targeted therapy (vemurafenib) were identified by literature review. Medical resource use data associated with managing AEs were collected through two blinded Delphi panel cycles in each of the five countries. Published costs were used to estimate per-event costs and combined with AE incidence, treatment usage, and MM prevalence to estimate the economic burden for each country. The costliest AEs were grade 3/4 events due to immunotherapy (Australia/France: colitis; UK: diarrhea) and chemotherapy (Germany/Italy: neutropenia/leukopenia). Treatment of AEs specific to chemotherapy (Australia/Germany/Italy/France: neutropenia/leukopenia) and targeted therapy (UK: squamous cell carcinoma) contributed heavily to country-specific economic burden. Economic burden was estimated assuming that each patient experienced an AE only once. In addition, treatment settings were heterogeneous and the number of Delphi panel experts was limited. Management costs for MM treatment-associated AEs can be substantial. Results could be incorporated in economic models that support reimbursement dossiers. With the availability of newer treatments, establishment of a baseline measure of the economic burden of AEs will be crucial for assessing their impact on patients and regional healthcare systems.
Worker training for new threats: a proposed framework.
Mitchell, Clifford S; Doyle, Mary L; Moran, John B; Lippy, Bruce; Hughes, Joseph T; Lum, Max; Agnew, Jacqueline
2004-11-01
In an effort to identify health and safety training needs for various groups of workers related to weapons of mass destruction, including chemical, biological, radiological, and nuclear weapons and high yield explosives (CBRNE), a conference, "Worker Training in a New Era: Responding to New Threats," was held at the Johns Hopkins Bloomberg School of Public Health in October 2002. Two questions were addressed: Which general skills and knowledge are common to all workers who might be exposed to terrorist threats from CBRNE weapons? What are the particular skills and knowledge relevant to these threats that are specific to workers in different sectors? Thirteen core components for pre- and post-event training were identified. Pre-event training applies to all workers. Post-event training applies to selected personnel including first responders, skilled support personnel, and other workers involved in these operations. Recommendations to improve worker safety training related to preparedness include: identify specific competencies for worker pre- and post-event training; coordinate Federal policy on worker training for CBRNE hazards; adopt federal guidelines or standards on worker training for new CBRNE threats, based on the competencies and coordinated Federal policy; conduct an inventory of training programs and other resources that could be used or adapted for use for new threats; and develop new training content and methods for pre- and post-event training to address specific competencies. Given the possibility for the introduction of CBRNE threats into the workplace, all workers need some training in the potential hazards involved: the individual worker's specific role in an emergency; incident command; activation of the emergency notification system; use of personal protective equipment (PPE); and safe evacuation of the workplace. 
While some occupational sectors have developed effective training related to these new threats, there is a need to develop, implement, and evaluate training programs across many different sectors of the workforce. Copyright 2004 Wiley-Liss, Inc.
An Accident Precursor Analysis Process Tailored for NASA Space Systems
NASA Technical Reports Server (NTRS)
Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare
2010-01-01
Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.
Spatio-temporal assessment of food safety risks in Canadian food distribution systems using GIS.
Hashemi Beni, Leila; Villeneuve, Sébastien; LeBlanc, Denyse I; Côté, Kevin; Fazil, Aamir; Otten, Ainsley; McKellar, Robin; Delaquis, Pascal
2012-09-01
While geographic information systems (GIS) are widely applied in public health, there have been comparatively few examples of applications that extend to the assessment of risks in food distribution systems. GIS can provide decision makers with strong computing platforms for spatial data management, integration, analysis, querying and visualization. The present report describes spatial analyses of a complex food distribution system and defines influence areas as travel-time zones generated through road network analysis on a national scale rather than on a community scale. In addition, a dynamic risk index is defined to translate a contamination event into a public health risk as time progresses. More specifically, in this research GIS is used to map the Canadian produce distribution system, analyze consumer accessibility to contaminated product, and estimate the level of risk associated with a contamination event over time, as illustrated in a scenario. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.
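One way to picture the dynamic risk index described above: as time elapses after a contamination event, product plausibly reaches consumers in progressively farther travel-time zones, so the exposed share of the population grows. A toy sketch under those assumptions (the zone boundaries and populations are invented for illustration and are not the paper's model):

```python
def dynamic_risk_index(zones, hours_elapsed):
    """Toy risk index: the fraction of the served population that a
    contaminated product could plausibly have reached, given the time
    elapsed since the contamination event.

    zones -- list of (travel_time_hours, population_served) pairs,
             one per travel-time zone around the distribution point.
    """
    exposed = sum(pop for reach, pop in zones if hours_elapsed >= reach)
    total = sum(pop for _, pop in zones)
    return exposed / total if total else 0.0

# Three travel-time zones with invented populations
zones = [(2, 50_000), (6, 120_000), (24, 300_000)]
print(dynamic_risk_index(zones, 8))   # 2 h and 6 h zones reached so far
```

A GIS contributes the part this sketch takes as given: deriving the travel-time zones and their populations from road network analysis rather than assuming them.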
Tool for Constructing Data Albums for Significant Weather Events
NASA Astrophysics Data System (ADS)
Kulkarni, A.; Ramachandran, R.; Conover, H.; McEniry, M.; Goodman, H.; Zavodsky, B. T.; Braun, S. A.; Wilson, B. D.
2012-12-01
Case study analysis and climatology studies are common approaches used in Atmospheric Science research. Research based on case studies involves a detailed description of specific weather events, using data from different sources to characterize the physical processes in play for a given event. Climatology-based research tends to focus on the representativeness of a given event, by studying the characteristics and distribution of a large number of events. Gathering relevant data and information for case studies and climatology analysis is tedious and time consuming; current Earth Science data systems are not suited to assembling multi-instrument, multi-mission datasets around specific events. For example, in hurricane science, finding airborne or satellite data relevant to a given storm requires searching through web pages and data archives. Background information related to damages, deaths, and injuries requires extensive online searches for news reports and official storm summaries. We will present a knowledge synthesis engine to create curated "Data Albums" to support case study analysis and climatology studies. The technological challenges in building such a reusable and scalable knowledge synthesis engine are several. First, how to encode domain knowledge in a machine-usable form? This knowledge must capture which information and data resources are relevant and the semantic relationships between the various fragments of information and data. Second, how to extract semantic information from various heterogeneous sources, including unstructured texts, using the encoded knowledge? Finally, how to design a structured database from the encoded knowledge to store all information and to support querying? The structured database must allow both knowledge overviews of an event as well as the drill-down capability needed for detailed analysis. An application ontology driven framework is being used to design the knowledge synthesis engine.
The knowledge synthesis engine is being applied to build a portal for hurricane case studies at the Global Hydrology and Resource Center (GHRC), a NASA Data Center. This portal will auto-generate Data Albums for specific hurricane events, compiling information from distributed resources such as NASA field campaign collections, relevant data sets, storm reports, pictures, videos and other useful sources.
NASA Astrophysics Data System (ADS)
Kravtsova, M. V.; Sdobnov, V. E.
2015-09-01
Using data from a worldwide network of neutron monitors, we have investigated the cosmic-ray (CR) energy spectra and anisotropy during the CR increases attributable to the solar events of June 11 and 15, 1991, by the spectrographic global survey method. By jointly analyzing ground-based and satellite measurements, we have determined the parameters of the CR rigidity spectrum reflecting the electromagnetic characteristics of the heliospheric fields in each hour of observations, within the framework of the model of CR modulation by regular heliospheric electromagnetic fields. The CR spectra and relative CR intensity variations in the solar-ecliptic geocentric coordinate system are presented at specific times of these events.
Chen, Xing-Jie; Liu, Lu-Lu; Cui, Ji-Fang; Wang, Ya; Shum, David H. K.; Chan, Raymond C. K.
2015-01-01
Mental time travel refers to the ability to recall episodic past events and imagine future events. The present study aimed to investigate cultural differences in mental time travel between Chinese and Australian university students. A total of 231 students (108 Chinese and 123 Australian) participated in the study. Their mental time travel abilities were measured by the Sentence Completion for Events from the Past Test (SCEPT) and the Sentence Completion for Events in the Future Test (SCEFT). Results showed that there were no cultural differences in the number of specific events generated for the past or future. Significant differences between the Chinese and Australian participants were found mainly in the emotional valence and content of the events generated. Both Chinese and Australian participants generated more specific positive events than negative events when thinking about the future, and Chinese participants were more positive about their past than Australian participants when recalling specific events. Regarding content, Chinese participants recalled more events about their interpersonal relationships, while Australian participants imagined more about personal future achievements. These findings shed some light on cultural differences in episodic past and future thinking. PMID:26167154
NASA Astrophysics Data System (ADS)
Diaz, Julia M.; Hansel, Colleen M.; Apprill, Amy; Brighi, Caterina; Zhang, Tong; Weber, Laura; McNally, Sean; Xun, Liping
2016-12-01
The reactive oxygen species superoxide (O2.-) is both beneficial and detrimental to life. Within corals, superoxide may contribute to pathogen resistance but also bleaching, the loss of essential algal symbionts. Yet, the role of superoxide in coral health and physiology is not completely understood owing to a lack of direct in situ observations. By conducting field measurements of superoxide produced by corals during a bleaching event, we show substantial species-specific variation in external superoxide levels, which reflect the balance of production and degradation processes. Extracellular superoxide concentrations are independent of light, algal symbiont abundance and bleaching status, but depend on coral species and bacterial community composition. Furthermore, coral-derived superoxide concentrations ranged from levels below bulk seawater up to ~120 nM, some of the highest superoxide concentrations observed in marine systems. Overall, these results unveil the ability of corals and/or their microbiomes to regulate superoxide in their immediate surroundings, which suggests species-specific roles of superoxide in coral health and physiology.
Tay, Sen Hee; Mak, Anselm
2017-04-01
Neurological and psychiatric syndromes, collectively referred to as NPSLE, occur frequently in SLE. The frequency of NPSLE varies from 21 to 95%; however, only 13-38% of neuropsychiatric (NP) events could be attributable to SLE in the NPSLE SLICC inception cohort. This variability in the frequency of NPSLE is attributable to the low specificity of the ACR case definitions for SLE-attributed NP syndromes, inclusion of minor NP events in the ACR nomenclature, difficulty in ascertainment of NP events and diverse experience of rheumatologists in the clinical assessment of NP events. Making the correct and early attribution of NP events to SLE is important to institute appropriate immunosuppressive treatment for favourable outcomes. Various attribution models using composite decision rules have been developed and used to ascribe NP events to SLE. This review will focus on the various clinical presentations, diagnostic work-up and attributions of the common NPSLE syndromes, including other NP events not included in the ACR nomenclature but which have come to attention in recent years. © The Author 2016. Published by Oxford University Press on behalf of the British Society for Rheumatology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Scott, J; Botsis, T; Ball, R
2014-01-01
Spontaneous Reporting Systems [SRS] are critical tools in the post-licensure evaluation of medical product safety. Regulatory authorities use a variety of data mining techniques to detect potential safety signals in SRS databases. Assessing the performance of such signal detection procedures requires simulated SRS databases, but simulation strategies proposed to date each have limitations. We sought to develop a novel SRS simulation strategy based on plausible mechanisms for the growth of databases over time. We developed a simulation strategy based on the network principle of preferential attachment. We demonstrated how this strategy can be used to create simulations based on specific databases of interest, and provided an example of using such simulations to compare signal detection thresholds for a popular data mining algorithm. The preferential attachment simulations were generally structurally similar to our targeted SRS database, although they had fewer nodes of very high degree. The approach was able to generate signal-free SRS simulations, as well as mimicking specific known true signals. Explorations of different reporting thresholds for the FDA Vaccine Adverse Event Reporting System suggested that using proportional reporting ratio [PRR] > 3.0 may yield better signal detection operating characteristics than the more commonly used PRR > 2.0 threshold. The network analytic approach to SRS simulation based on the principle of preferential attachment provides an attractive framework for exploring the performance of safety signal detection algorithms. This approach is potentially more principled and versatile than existing simulation approaches. The utility of network-based SRS simulations needs to be further explored by evaluating other types of simulated signals with a broader range of data mining approaches, and comparing network-based simulations with other simulation strategies where applicable.
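The proportional reporting ratio compared above has a standard closed form over the 2x2 contingency table of report counts. The sketch below, with illustrative counts rather than real VAERS data, shows the computation behind the PRR > 2.0 and PRR > 3.0 thresholds discussed in the abstract.

```python
# Proportional reporting ratio (PRR) for a product/event pair in a
# spontaneous reporting system, computed from the 2x2 table of report
# counts. Illustrative sketch only; the counts below are made up.

def prr(a, b, c, d):
    """a: reports with product and event, b: product without event,
    c: other products with event, d: other products without event."""
    rate_product = a / (a + b)   # event rate among reports for the product
    rate_other = c / (c + d)     # event rate among all other reports
    return rate_product / rate_other

# Toy counts: 30 of 100 reports for the product mention the event,
# versus 50 of 1000 reports for all other products.
score = prr(30, 70, 50, 950)
print(round(score, 2))  # 6.0 -- exceeds both the 2.0 and 3.0 thresholds
```

A detection rule then flags the pair when the score exceeds the chosen threshold; the abstract's finding is that the stricter PRR > 3.0 cut may trade a little sensitivity for better overall operating characteristics.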
Bridging the semantic gap in sports
NASA Astrophysics Data System (ADS)
Li, Baoxin; Errico, James; Pan, Hao; Sezan, M. Ibrahim
2003-01-01
One of the major challenges facing current media management systems and the related applications is the so-called "semantic gap" between the rich meaning that a user desires and the shallowness of the content descriptions that are automatically extracted from the media. In this paper, we address the problem of bridging this gap in the sports domain. We propose a general framework for indexing and summarizing sports broadcast programs. The framework is based on a high-level model of sports broadcast video using the concept of an event, defined according to domain-specific knowledge for different types of sports. Within this general framework, we develop automatic event detection algorithms that are based on automatic analysis of the visual and aural signals in the media. We have successfully applied the event detection algorithms to different types of sports including American football, baseball, Japanese sumo wrestling, and soccer. Event modeling and detection contribute to the reduction of the semantic gap by providing rudimentary semantic information obtained through media analysis. We further propose a novel approach, which makes use of independently generated rich textual metadata, to fill the gap completely through synchronization of the information-laden textual data with the basic event segments. An MPEG-7 compliant prototype browsing system has been implemented to demonstrate semantic retrieval and summarization of sports video.
Zhang, Le-Tao; Li, Zhan-Bin; Wang, Shan-Shan
2016-12-01
Scale issues, which have been extensively studied in the domain of soil erosion, are considerably significant in geomorphologic processes and hydrologic modelling. However, relatively few efforts have been made to quantify the spatial scale effect on event-based sediment dynamics in basin-wide floods. To address this issue, sediment-runoff yield data for 44 basin-wide flood events were collected from gauging stations at the Chabagou river basin, a typical unmanaged agro-basin in the hilly loess region of the Chinese Loess Plateau. The spatial scale effect on event-based sediment dynamics was thus investigated in the basin system across three different spatial scales, from sublateral to basin outlet. Results showed that the event-based suspended sediment concentration, as well as the intra- and inter-scale flow-sediment relationships, remained spatially constant. Hence, almost all sediment-laden flows can reach the detachment-limited maximum concentration across scales, especially hyperconcentrated flows. Consequently, upstream sediment-laden flow exerted limited influence on downstream sediment output, particularly for major sediment-producing events. However, flood peak discharge, rather than total flood runoff amount, better explains the dynamics of sediment yield across scales. As a composite parameter, the proposed stream energy factor combines flood runoff depth and flood peak discharge, and is therefore better suited than other flow-related variables to describing the event-based inter-scale flow-sediment relationship. Overall, this study demonstrates the process-specific characteristics of soil erosion by water flows in the basin system. Therefore, event-based sediment control should be process-oriented, cutting off the connectivity of hyperconcentrated flows and redistributing the erosive energy of flowing water in both time and space.
Furthermore, evaluation of soil conservation benefits should be based on the process of runoff regulation to comprehensively assess the efficiency of anti-erosion strategies in sediment control at the basin scale. Copyright © 2016. Published by Elsevier B.V.
Thousands of exon skipping events differentiate among splicing patterns in sixteen human tissues.
Florea, Liliana; Song, Li; Salzberg, Steven L
2013-01-01
Alternative splicing is widely recognized for its roles in regulating genes and creating gene diversity. However, despite many efforts, the repertoire of gene splicing variation is still incompletely characterized, even in humans. Here we describe a new computational system, ASprofile, and its application to RNA-seq data from Illumina's Human Body Map project (>2.5 billion reads). Using the system, we identified putative alternative splicing events in 16 different human tissues, which provide a dynamic picture of splicing variation across the tissues. We detected 26,989 potential exon skipping events representing differences in splicing patterns among the tissues. A large proportion of the events (>60%) were novel, involving new exons (~3000), new introns (~16000), or both. When tracing these events across the sixteen tissues, only a small number (4-7%) appeared to be differentially expressed ('switched') between two tissues, while 30-45% showed little variation, and the remaining 50-65% were not present in one or both tissues compared. Novel exon skipping events appeared to be slightly less variable than known events, but were more tissue-specific. Our study represents the first effort to build a comprehensive catalog of alternative splicing in normal human tissues from RNA-seq data, while providing insights into the role of alternative splicing in shaping tissue transcriptome differences. The catalog of events and the ASprofile software are freely available from the Zenodo repository ( http://zenodo.org/record/7068; doi: 10.5281/zenodo.7068) and from our web site http://ccb.jhu.edu/software/ASprofile.
Patient Safety Leadership WalkRounds.
Frankel, Allan; Graydon-Baker, Erin; Neppl, Camilla; Simmonds, Terri; Gustafson, Michael; Gandhi, Tejal K
2003-01-01
In the WalkRounds concept, a core group, which includes the senior executives and/or vice presidents, conducts weekly visits to different areas of the hospital. The group, joined by one or two nurses in the area and other available staff, asks specific questions about adverse events or near misses and about the factors or systems issues that led to these events. ANALYSIS OF EVENTS: Events raised in the WalkRounds are entered into a database and classified according to their contributing factors. The data are aggregated by contributing factors and priority scores to highlight the root issues. The priority scores are used to determine QI pilots and make best use of limited resources. Executives are surveyed quarterly about actions they have taken as a direct result of WalkRounds and are asked what they have learned from the rounds. As of September 2002, 47 Patient Safety Leadership WalkRounds had visited a total of 48 different areas of the hospital, generating 432 individual comments. The WalkRounds require not only knowledgeable and invested senior leadership but also a well-organized support structure. Quality and safety personnel are needed to collect data and maintain a database of confidential information, evaluate the data from a systems approach, and delineate systems-based actions to improve care delivery. Comments from frontline clinicians and executives suggested that WalkRounds help educate leadership and frontline staff in patient safety concepts and will lead to cultural changes, as manifested in more open discussion of adverse events and an improved rate of safety-based changes.
Are habitual overgeneral recollection and prospection maladaptive?
Robinaugh, Donald J; Lubin, Rebecca E; Babic, Luka; McNally, Richard J
2013-06-01
Individuals with depression exhibit difficulty retrieving specific memories and imagining specific future events when instructed to do so relative to non-clinical comparison groups. Instead of specific events, depressed individuals frequently retrieve or imagine "overgeneral" memories that span a long period of time or that denote a category of similar events. Recently, Raes, Hermans, Williams, and Eelen (2007) developed a sentence completion procedure (SCEPT) to assess the tendency to recall overgeneral autobiographical memories. They found that specificity on this measure was associated with depression and rumination. We aimed to replicate these findings and to examine the tendency to imagine overgeneral future events. We had 170 subjects complete past (SCEPT) and future-oriented (SCEFT) sentence completion tasks and measures of depression severity, PTSD severity, hopelessness, and repetitive negative thought. Although specificities of past and future events were correlated, neither SCEPT nor SCEFT specificity was negatively associated with depression severity, posttraumatic stress symptoms, repetitive negative thought (RNT), or hopelessness. Our data are cross-sectional, preventing any determination of causality and limiting our assessment of whether specificity is associated with psychological distress following a stressful life event. In addition, we observed poor internal consistency for both the SCEPT and SCEFT. These findings fail to support the hypothesis that overgeneral memory and prospection on these tasks are associated with psychological distress. Copyright © 2012. Published by Elsevier Ltd.
Meisner, Joshua K.; Price, Richard J.
2010-01-01
Arterial occlusive disease (AOD) is the leading cause of morbidity and mortality throughout the developed world, creating a significant need for effective therapies to halt disease progression. Despite the success of animal and small-scale human therapeutic arteriogenesis studies, this promising concept for treating AOD has yielded largely disappointing results in large-scale clinical trials. One reason for this lack of successful translation is that endogenous arteriogenesis is highly dependent on a poorly understood sequence of events and interactions between bone marrow-derived cells (BMCs) and vascular cells, which makes designing effective therapies difficult. We contend that the process follows a complex, ordered sequence of events, with multiple specific BMC populations recruited at specific times and locations. Here we present the evidence suggesting roles for multiple BMC populations, from neutrophils and mast cells to progenitor cells, and propose how and where these cell populations fit within the sequence of events during arteriogenesis. Disruptions in these various BMC populations can impair the arteriogenesis process in patterns that characterize specific patient populations. We propose that an improved understanding of how arteriogenesis functions as a system can reveal individual BMC populations and functions that can be targeted for overcoming particular impairments in collateral vessel development. PMID:21044213
Fusing Symbolic and Numerical Diagnostic Computations
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method and the other a symbolic analysis method, into a unified event-based decision-analysis software system for real-time detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAM), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference-engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.
Monitoring Java Programs with Java PathExplorer
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)
2001-01-01
We present recent work on the development of Java PathExplorer (JPAX), a tool for monitoring the execution of Java programs. JPAX can be used during program testing to gain increased information about program executions, and can potentially also be applied in operation to survey safety-critical systems. The tool facilitates automated instrumentation of a program's byte code, which will then emit events to an observer during execution. The observer checks the events against user-provided high-level requirement specifications, for example temporal logic formulae, and against lower-level error detection procedures, for example concurrency-related algorithms such as deadlock and data race detection. High-level requirement specifications, together with their underlying logics, are defined in the Maude rewriting logic, and can then either be checked directly using the Maude rewriting engine, or first be translated to efficient data structures and then checked in Java.
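The observer architecture this abstract describes, an instrumented program emitting events that are checked against a requirement specification, can be sketched in a few lines. The function names, the dictionary-based requirement format, and the lock-discipline property below are illustrative inventions for this sketch, not JPAX's actual API.

```python
# Minimal runtime monitor in the spirit of JPAX: an instrumented program
# emits a trace of events to an observer, which checks them against a
# stateful requirement. All names here are hypothetical.

def monitor(events, requirement):
    """Feed an event trace to a requirement checker; stop at a violation."""
    state = requirement["init"]
    for ev in events:
        state, ok = requirement["step"](state, ev)
        if not ok:
            return False, ev   # report the violating event
    return True, None

# Example property: a thread never releases a lock it does not hold.
# State is the set of currently held locks.
lock_discipline = {
    "init": set(),
    "step": lambda held, ev: (
        (held | {ev[1]}, True) if ev[0] == "acquire"
        else ((held - {ev[1]}, ev[1] in held) if ev[0] == "release"
              else (held, True))
    ),
}

trace = [("acquire", "L1"), ("release", "L1"), ("release", "L1")]
ok, bad = monitor(trace, lock_discipline)
print(ok, bad)  # False ('release', 'L1') -- the second release violates
```

The same shape accommodates temporal-logic properties once they are compiled to such step functions, which is roughly the "translate to efficient data structures, then check in Java" path the abstract mentions.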
1976-11-01
system. b. Read different program configurations to reconfigure the software during flight. c. Write Digital Integrated Test System (DITS) results...associated with a Minor Cycle Event must be Unlatched. The sole difference between a Latched and an Unlatched Condition is that upon the Scheduling...Table. Furthermore, the block of pointers for one Minor Cycle may be wholly contained within the block of pointers for a different Minor Cycle. For
CHSIR Anthropometric Database, CHSIR Truncated Anthropometric Database, and Boundary Manikins
NASA Technical Reports Server (NTRS)
Rajulu, Sudhakar
2011-01-01
The NASA crew anthropometric dimensions that the Commercial Transportation System (CTS) must accommodate are listed in CCT-REQ-1130 Draft 3.0, along with the specific critical anthropometric dimensions for use in vehicle design (and suit design, in the event that a pressure suit is part of the commercial partner's design solution).
Response-Specific Effects of Pain Observation on Motor Behavior
ERIC Educational Resources Information Center
Morrison, India; Poliakoff, Ellen; Gordon, Lucy; Downing, Paul
2007-01-01
How does seeing a painful event happening to someone else influence the observer's own motor system? To address this question, we measured simple reaction times following videos showing noxious or innocuous implements contacting corporeal or noncorporeal objects. Key releases in a go/nogo task were speeded, and key presses slowed, after subjects…
Cooperative Driver Education and Safety Training. Instructor's Guide.
ERIC Educational Resources Information Center
Seyfarth, John T.; And Others
The program, designed to give the driver-training pupil a semester of 50 hours of instruction, involves four instructional phases, one of them optional to give flexibility to fit the varying needs of different school systems: Phase 1--the classroom phase, with 30 instructional hours devoted to 30 specific events, staggered at each school…
Observations of ionospheric electric fields above atmospheric weather systems
NASA Technical Reports Server (NTRS)
Farrell, W. M.; Aggson, T. L.; Rodgers, E. B.; Hanson, W. B.
1994-01-01
We report on the observations of a number of quasi-dc electric field events associated with large-scale atmospheric weather formations. The observations were made by the electric field experiment onboard the San Marco D satellite, operational in an equatorial orbit from May to December 1988. Several theoretical studies suggest that electric fields generated by thunderstorms are present at high altitudes in the ionosphere. In spite of such favorable predictions, weather-related events are not often observed, since they are relatively weak. We report here on a set of likely E field candidates for atmospheric-ionospheric causality, observed over the Indonesian Basin, northern South America, and the west coast of Africa, all known sites of atmospheric activity. As we shall demonstrate, individual events can often be traced to specific active weather features. For example, a number of events were associated with spacecraft passages near Hurricane Joan in mid-October 1988. As a statistical set, the events appear to coincide with the most active regions of atmospheric weather.
What can we learn from resource pulses?
Yang, Louie H; Bastow, Justin L; Spence, Kenneth O; Wright, Amber N
2008-03-01
An increasing number of studies in a wide range of natural systems have investigated how pulses of resource availability influence ecological processes at individual, population, and community levels. Taken together, these studies suggest that some common processes may underlie pulsed resource dynamics in a wide diversity of systems. Developing a common framework of terms and concepts for the study of resource pulses may facilitate greater synthesis among these apparently disparate systems. Here, we propose a general definition of the resource pulse concept, outline some common patterns in the causes and consequences of resource pulses, and suggest a few key questions for future investigations. We define resource pulses as episodes of increased resource availability in space and time that combine low frequency (rarity), large magnitude (intensity), and short duration (brevity), and emphasize the importance of considering resource pulses at spatial and temporal scales relevant to specific resource-consumer interactions. Although resource pulses are uncommon events for consumers in specific systems, our review of the existing literature suggests that pulsed resource dynamics are actually widespread phenomena in nature. Resource pulses often result from climatic and environmental factors, processes of spatiotemporal accumulation and release, outbreak population dynamics, or a combination of these factors. These events can affect life history traits and behavior at the level of individual consumers, numerical responses at the population level, and indirect effects at the community level. Consumers show strategies for utilizing ephemeral resources opportunistically, reducing resource variability by averaging over larger spatial scales, and tolerating extended interpulse periods of reduced resource availability. Resource pulses can also create persistent effects in communities through several mechanisms.
We suggest that the study of resource pulses provides opportunities to understand the dynamics of many specific systems, and may also contribute to broader ecological questions at individual, population, and community levels.
NASA Astrophysics Data System (ADS)
Hunka, Frantisek; Matula, Jiri
2017-07-01
A transaction-based approach is utilized in some business process modeling methodologies. Essential parts of these transactions are human beings, usually captured through the notion of an agent or actor role. Using a particular example, the paper describes the possibilities of the Design Engineering Methodology for Organizations (DEMO) and the Resource-Event-Agent (REA) methodology. Whereas the DEMO methodology can be regarded as a generic methodology with its foundation in the theory of Enterprise Ontology, the REA methodology is regarded as a domain-specific methodology with its origin in accountancy systems. The result of these approaches is that the DEMO methodology captures everything that happens in reality with good empirical evidence, whereas the REA methodology captures only changes connected with economic events. Economic events represent either a change of property rights to an economic resource, or the consumption or production of economic resources. This follows from the essence of economic events and their connection to economic resources.
Market-based control mechanisms for patient safety
Coiera, E; Braithwaite, J
2009-01-01
A new model is proposed for enhancing patient safety using market-based control (MBC), inspired by successful approaches to environmental governance. Emissions trading, enshrined in the Kyoto protocol, set a carbon price and created a carbon market; is it possible to set a patient safety price and let the marketplace find ways of reducing clinically adverse events? To “cap and trade,” a regulator would need to establish system-wide and organisation-specific targets based on the cost of adverse events, create a safety market for trading safety credits, and then police the market. Organisations are given a clear policy signal to reduce adverse event rates, are told by how much, but are free to find the mechanisms best suited to their local needs. The market would inevitably generate novel ways of creating safety credits, and accountability becomes hard to evade when adverse events are explicitly measured and accounted for in an organisation’s bottom line. PMID:19342522
A Small Acoustic Goniometer for General Purpose Research
Pook, Michael L.; Loo, Sin Ming
2016-01-01
Understanding acoustic events and monitoring their occurrence is a useful aspect of many research projects. In particular, acoustic goniometry allows researchers to determine the source of an event based solely on the sound it produces. The vast majority of acoustic goniometry research projects used custom hardware targeted to the specific application under test. Unfortunately, due to the wide range of sensing applications, a flexible general purpose hardware/firmware system does not exist for this purpose. This article focuses on the development of such a system which encourages the continued exploration of general purpose hardware/firmware and lowers barriers to research in projects requiring the use of acoustic goniometry. Simulations have been employed to verify system feasibility, and a complete hardware implementation of the acoustic goniometer has been designed and field tested. The results are reported, and suggested areas for improvement and further exploration are discussed. PMID:27136563
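The core computation in two-microphone acoustic goniometry is recovering a bearing from the time difference of arrival (TDOA) of a sound at the sensors. Below is a minimal far-field sketch assuming a plane wave and a nominal speed of sound; it illustrates the general technique and makes no claim about the article's own hardware or firmware.

```python
# Bearing estimate from the time-difference-of-arrival (TDOA) between
# two microphones: the standard far-field relation sin(theta) = c*tau/d.
# Illustrative sketch; the baseline and delay values are made up.

import math

def bearing_deg(tdoa_s, mic_spacing_m, speed_of_sound=343.0):
    """Source angle from broadside, given the inter-mic delay in seconds."""
    x = speed_of_sound * tdoa_s / mic_spacing_m
    x = max(-1.0, min(1.0, x))      # clamp against measurement noise
    return math.degrees(math.asin(x))

# A 0.5 m baseline with a measured 0.7 ms delay:
print(round(bearing_deg(0.0007, 0.5), 1))  # about 28.7 degrees
```

In practice the delay itself is usually estimated by cross-correlating the two microphone signals, and three or more sensors resolve the left-right ambiguity of a single pair.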
NASA Astrophysics Data System (ADS)
Kelley, Troy D.; McGhee, S.
2013-05-01
This paper describes the ongoing development of a robotic control architecture inspired by computational cognitive architectures from the discipline of cognitive psychology. The Symbolic and Sub-Symbolic Robotics Intelligence Control System (SS-RICS) combines symbolic and sub-symbolic representations of knowledge into a unified control architecture. The new architecture leverages previous work in cognitive architectures, specifically the development of the Adaptive Character of Thought-Rational (ACT-R) and Soar. This paper details current work on learning from episodes or events. The use of episodic memory as a learning mechanism has, until recently, been largely ignored by computational cognitive architectures. This paper details work on metric-level episodic memory streams and methods for translating episodes into abstract schemas. The presentation will include research on learning through novelty and self-generated feedback mechanisms for autonomous systems.
Data analysis of the COMPTEL instrument on the NASA gamma ray observatory
NASA Technical Reports Server (NTRS)
Diehl, R.; Bennett, K.; Collmar, W.; Connors, A.; Denherder, J. W.; Hermsen, W.; Lichti, G. G.; Lockwood, J. A.; Macri, J.; Mcconnell, M.
1992-01-01
The Compton imaging telescope (COMPTEL) on the Gamma Ray Observatory (GRO) is a wide-field-of-view instrument. The coincidence measurement technique in two scintillation detector layers requires specific analysis methods: straightforward event projection into the sky is impossible. Therefore, detector events are analyzed in a multi-dimensional dataspace, using a gamma ray sky hypothesis convolved with the point spread function of the instrument in this dataspace. Background suppression and analysis techniques have important implications for the gamma ray source results of this background-limited telescope. The COMPTEL collaboration applies a software system of analysis utilities organized around a database management system. The use of this system to assist guest investigators at the various collaboration sites and external sites is foreseen, and allows different levels of cooperation with the COMPTEL institutes, depending on the type of data to be studied.
Zane, Richard D; Prestipino, Ann L
2004-01-01
Hospital disaster manuals and response plans often lack formal command structure; instead, they rely on the presence of key individuals who are familiar with hospital operations, or who are in leadership positions during routine, day-to-day operations. Although this structure occasionally may prove to be successful, it is unreliable, as this leadership may be unavailable at the time of the crisis, and may not be sustainable during a prolonged event. The Hospital Emergency Incident Command System (HEICS) provides a command structure that does not rely on specific individuals, is flexible and expandable, and is ubiquitous in the fire service, emergency medical services, military, and police agencies, thus allowing for ease of communication during event management. A descriptive report of the implementation of the HEICS throughout a large healthcare network is reviewed. Implementation of the HEICS provides a consistent command structure for hospitals that enables consistency and commonality with other hospitals and disaster response entities.
Pezzetta, Rachele; Nicolardi, Valentina; Tidoni, Emmanuele; Aglioti, Salvatore Maria
2018-06-06
Detecting errors in one's own actions, and in the actions of others, is a crucial ability for adaptable and flexible behavior. Studies show that specific EEG signatures underpin the monitoring of observed erroneous actions (error-related negativity, error positivity, mid-frontal theta oscillations). However, the majority of action-observation studies used sequences of trials in which erroneous actions were less frequent than correct actions. It was therefore not possible to disentangle whether the activation of the performance monitoring system was due to an error - a violation of the intended goal - or to a surprise/novelty effect associated with a rare and unexpected event. Combining EEG and immersive virtual reality (IVR-CAVE system), we recorded the neural signal of 25 young adults who observed, in first-person perspective, simple reach-to-grasp actions performed by an avatar aiming for a glass. Importantly, the proportion of erroneous actions was higher than that of correct actions. Results showed that the observation of erroneous actions elicits the typical electro-cortical signatures of error monitoring, and therefore the violation of the action goal is still perceived as a salient event. The observation of correct actions elicited stronger alpha suppression, confirming the role of the alpha frequency band in the general orienting response to novel and infrequent stimuli. Our data provide novel evidence that an observed goal error (an action slip) triggers the activity of the performance monitoring system even when erroneous actions occur more often than correct actions and thus cannot owe their salience to rarity.
NASA Technical Reports Server (NTRS)
Dias, W. C.
1994-01-01
RISK D/C is a prototype program for modeling program risk in the Space Exploration Initiative (SEI) architectures proposed in the Synthesis Group Report. Risk assessment is made with respect to risk events, their probabilities, and the severities of potential results. The program allows risk mitigation strategies to be proposed for an exploration program architecture and to be ranked with respect to their effectiveness. RISK D/C allows for the fact that risk assessment in early planning phases is subjective. Although specific to the SEI in its present form, RISK D/C can be used as a framework for developing a risk assessment program for other specific uses. RISK D/C is organized into files, or stacks, of information, including the architecture, the hazard, and the risk event stacks. Although predefined, all stacks can be updated by a user. The architecture stack contains information concerning the general program alternatives, which are subsequently broken down into waypoints, missions, and mission phases. The hazard stack includes any background condition which could result in a risk event. A risk event is anything unfavorable that could happen at a specific point within an architecture, and the risk event stack provides the probabilities, consequences, severities, and any mitigation strategies that could be used to reduce the risk of the event, along with how much the risk is reduced. RISK D/C was developed for Macintosh series computers. It requires HyperCard 2.0 or later, as well as 2 MB of RAM and System 6.0.8 or later. A Macintosh II series computer is recommended due to speed concerns. The standard distribution medium for this package is one 3.5 inch 800K Macintosh format diskette. RISK D/C was developed in 1991 and is a copyrighted work with all copyright vested in NASA. Macintosh and HyperCard are trademarks of Apple Computer, Inc.
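To make the risk-event bookkeeping concrete, here is a minimal sketch of how probabilities, severities, and mitigation rankings could be combined. The field names, the probability-times-severity scoring rule, and all example values are illustrative assumptions, not the actual RISK D/C stack layout.

```python
# Hypothetical sketch of RISK D/C-style risk scoring and mitigation ranking.

def expected_severity(probability, severity):
    """Score a risk event as probability-weighted severity."""
    return probability * severity

def rank_mitigations(risk_event, mitigations):
    """Order mitigation strategies by how much they reduce the risk score."""
    base = expected_severity(risk_event["probability"], risk_event["severity"])
    scored = []
    for m in mitigations:
        residual = expected_severity(
            risk_event["probability"] * (1 - m["prob_reduction"]),
            risk_event["severity"] * (1 - m["sev_reduction"]),
        )
        scored.append((m["name"], base - residual))  # risk reduction achieved
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Illustrative event and mitigations (not from the Synthesis Group Report).
event = {"name": "engine failure during ascent", "probability": 0.02, "severity": 9.0}
mitigations = [
    {"name": "engine-out abort mode", "prob_reduction": 0.0, "sev_reduction": 0.6},
    {"name": "added engine redundancy", "prob_reduction": 0.5, "sev_reduction": 0.0},
]
ranking = rank_mitigations(event, mitigations)
```

The ranking step corresponds to RISK D/C's ordering of mitigation strategies by effectiveness.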
NASA Technical Reports Server (NTRS)
Ferrara, Jeffrey; Calk, William; Atwell, William; Tsui, Tina
2013-01-01
MPISS is an automatic file transfer system that implements a combination of standard and mission-unique transfer protocols required by the Global Precipitation Measurement Mission (GPM) Precipitation Processing System (PPS) to control the flow of data between the MOC and the PPS. The primary features of MPISS are file transfers (both with and without PPS specific protocols), logging of file transfer and system events to local files and a standard messaging bus, short term storage of data files to facilitate retransmissions, and generation of file transfer accounting reports. The system includes a graphical user interface (GUI) to control the system, allow manual operations, and to display events in real time. The PPS specific protocols are an enhanced version of those that were developed for the Tropical Rainfall Measuring Mission (TRMM). All file transfers between the MOC and the PPS use the SSH File Transfer Protocol (SFTP). For reports and data files generated within the MOC, no additional protocols are used when transferring files to the PPS. For observatory data files, an additional handshaking protocol of data notices and data receipts is used. MPISS generates and sends to the PPS data notices containing data start and stop times along with a checksum for the file for each observatory data file transmitted. MPISS retrieves the PPS generated data receipts that indicate the success or failure of the PPS to ingest the data file and/or notice. MPISS retransmits the appropriate files as indicated in the receipt when required. MPISS also automatically retrieves files from the PPS. The unique feature of this software is the use of both standard and PPS specific protocols in parallel. The advantage of this capability is that it supports users that require the PPS protocol as well as those that do not require it. The system is highly configurable to accommodate the needs of future users.
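The handshaking protocol described above can be sketched in a few lines. The notice fields, the MD5 choice, and the example timestamps are assumptions for illustration; the abstract states only that a notice carries data start and stop times plus a checksum, and that a receipt reports ingest success or failure.

```python
# Illustrative sketch of an MPISS-style data notice and receipt check.
import hashlib

def make_data_notice(payload: bytes, start: str, stop: str) -> dict:
    """Build a notice with data start/stop times and a file checksum."""
    return {
        "data_start": start,
        "data_stop": stop,
        "checksum": hashlib.md5(payload).hexdigest(),
    }

def ingest_succeeds(notice: dict, payload: bytes) -> bool:
    """A PPS-style ingest check: recompute the checksum and compare."""
    return hashlib.md5(payload).hexdigest() == notice["checksum"]

data = b"example observatory packet"
notice = make_data_notice(data, "2013-001T00:00:00Z", "2013-001T01:00:00Z")
```

A receipt reporting failure would then trigger the retransmission path described in the abstract.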
Severe Weather in a Changing Climate: Getting to Adaptation
NASA Astrophysics Data System (ADS)
Wuebbles, D. J.; Janssen, E.; Kunkel, K.
2011-12-01
Analyses of observation records from U.S. weather stations indicate there is an increasing trend over recent decades in certain types of severe weather, especially large precipitation events. Widespread changes in temperature extremes have been observed over the last 50 years. In particular, the number of heat waves globally (and in some parts of the U.S.) has increased, and there have been widespread increases in the numbers of warm nights. Also, analyses show that we are now breaking twice as many heat records as cold records in the U.S. Since 1957, there has been an increase in the number of precipitation events across the U.S. that rank in the heaviest 1% of the historical record. Our new analyses of the repeat or reoccurrence frequencies of large precipitation storms show that such events are occurring more often than in the past. The pattern of precipitation change is one of increases generally at higher northern latitudes and drying in the tropics and subtropics over land. It needs to be recognized that every weather event that happens nowadays takes place in the context of the changes in the background climate system. So nothing is entirely "natural" anymore. It is a fallacy to think that individual events are caused entirely by any one thing, either natural variation or human-induced climate change. Every event is influenced by many factors. Human-induced climate change is now a factor in weather events. The changes occurring in precipitation are consistent with the analyses of our changing climate. For extreme precipitation, we know that more precipitation is falling in very heavy events. And we know key reasons why: warmer air holds more water vapor, so when any given weather system moves through, the extra water vapor can lead to a heavier downpour. As the climate system continues to warm, models of the Earth's climate system indicate severe precipitation events will likely become more commonplace.
Water vapor will continue to increase in the atmosphere along with the warming, and large precipitation events will likely increase in intensity and frequency. In the presentation, we will discuss not only the recent trends in severe weather and the projections of the impacts of climate change on severe weather in the future, but also specific examples of how this information is being used in developing and applying adaptation policies.
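The statement that warmer air holds more water vapor can be made quantitative with the Clausius-Clapeyron relation, under which saturation vapor pressure rises by roughly 6-7% per kelvin near typical surface temperatures. A minimal sketch using the standard Magnus approximation (the constants are textbook values, not from this work):

```python
import math

def saturation_vapor_pressure_hpa(temp_c):
    """Magnus approximation for saturation vapor pressure over water, in hPa."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

# Fractional increase in water-holding capacity for 1 K of warming near 20 C
increase = saturation_vapor_pressure_hpa(21.0) / saturation_vapor_pressure_hpa(20.0) - 1.0
```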
NASA Astrophysics Data System (ADS)
Banerjee, Bibaswan
In power-electronics-based microgrids, the computational requirements needed to implement an optimized online control strategy can be prohibitive. The work presented in this dissertation proposes a generalized method of derivation of geometric manifolds in a dc microgrid that is based on the a priori computation of the optimal reactions and trajectories for classes of events in a dc microgrid. The proposed states are the stored energies in all the energy storage elements of the dc microgrid and the power flowing into them. It is anticipated that calculating a large enough set of dissimilar transient scenarios will produce surfaces that also span many scenarios not specifically used to develop them. These geometric manifolds will then be used as reference surfaces in any type of controller, such as a sliding mode hysteretic controller. The presence of switched power converters in microgrids involves different control actions for different system events. The control of the switch states of the converters is essential for steady state and transient operations. A digital memory look-up based controller that uses a hysteretic sliding mode control strategy is an effective technique to generate the proper switch states for the converters. An example dc microgrid with three dc-dc boost converters and resistive loads is considered for this work. The geometric manifolds are successfully generated for transient events, such as step changes in the loads and the sources. The surfaces corresponding to a specific case of step change in the loads are then used as reference surfaces in an EEPROM for experimentally validating the control strategy. The required switch states corresponding to this specific transient scenario are programmed in the EEPROM as a memory table. This controls the switching of the dc-dc boost converters and drives the system states to the reference manifold.
In this work, it is shown that this strategy effectively controls the system for a transient condition such as step changes in the loads for the example case.
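The hysteretic sliding-mode idea can be sketched in a few lines: the controller changes a converter switch state only when the measured state leaves a band around the reference surface, which is the behavior the EEPROM lookup table encodes for each quantized state. The band width and sign convention below are illustrative assumptions.

```python
class HystereticController:
    """Toggle a boost-converter switch when the measured state crosses the
    reference manifold by more than the hysteresis band (sketch only)."""

    def __init__(self, band=0.05):
        self.band = band
        self.switch_on = False

    def update(self, surface_error):
        # surface_error = measured state minus the reference-manifold value
        if surface_error > self.band:
            self.switch_on = False   # state above the surface: stop boosting
        elif surface_error < -self.band:
            self.switch_on = True    # state below the surface: boost
        return self.switch_on        # inside the band: hold the last state

ctrl = HystereticController(band=0.05)
trajectory = [ctrl.update(e) for e in (-0.2, -0.01, 0.03, 0.2)]
```

In a memory-table implementation, the `update` computation would be replaced by an EEPROM lookup keyed on quantized stored-energy and power values.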
Hierarchical event selection for video storyboards with a case study on snooker video visualization.
Parry, Matthew L; Legg, Philip A; Chung, David H S; Griffiths, Iwan W; Chen, Min
2011-12-01
A video storyboard, a form of video visualization, summarizes the major events in a video using illustrative visualization. There are three main technical challenges in creating a video storyboard: (a) event classification, (b) event selection, and (c) event illustration. Among these challenges, (a) is highly application-dependent and requires a significant amount of application-specific semantics to be encoded in a system or manually specified by users. This paper focuses on challenges (b) and (c). In particular, we present a framework for hierarchical event representation, and an importance-based selection algorithm for supporting the creation of a video storyboard from a video. We consider the storyboard to be an event summarization for the whole video, whilst each individual illustration on the board is also an event summarization, but for a smaller time window. We utilized a 3D visualization template for depicting and annotating events in illustrations. To demonstrate the concepts and algorithms developed, we use snooker video visualization as a case study, because it has a concrete and agreeable set of semantic definitions for events and can make use of existing techniques of event detection and 3D reconstruction in a reliable manner. Nevertheless, most of our concepts and algorithms developed for challenges (b) and (c) can be applied to other application areas. © 2010 IEEE
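As a toy illustration of importance-based selection (challenge (b)), the following greedy sketch picks the highest-importance events that fit an illustration budget. The snooker event names, scores, costs, and the greedy rule are assumptions, far simpler than the paper's hierarchical algorithm.

```python
def select_events(events, budget):
    """events: (name, importance, cost) triples; greedily fill the storyboard."""
    chosen, used = [], 0
    for name, importance, cost in sorted(events, key=lambda e: e[1], reverse=True):
        if used + cost <= budget:
            chosen.append(name)
            used += cost
    return chosen

# Hypothetical snooker events scored by importance, with illustration costs.
events = [("break-off", 0.9, 1), ("safety shot", 0.3, 1),
          ("century break", 1.0, 2), ("foul", 0.5, 1)]
picked = select_events(events, budget=3)
```

A hierarchical version would apply the same idea recursively, summarizing each selected event's time window with its own sub-selection.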
Remembering the past and planning for the future in rats
Crystal, Jonathon D.
2012-01-01
A growing body of research suggests that rats represent and remember specific earlier events from the past. An important criterion for validating a rodent model of episodic memory is to establish that the content of the representation is about a specific event in the past rather than vague information about remoteness. Recent evidence suggests that rats may also represent events that are anticipated to occur in the future. An important capacity afforded by a representation of the future is the ability to plan for the occurrence of a future event. However, relatively little is known about the content of represented future events and the cognitive mechanisms that may support planning. This article reviews evidence that rats remember specific earlier events from the past and represent events that are anticipated to occur in the future, and it develops criteria for validating a rodent model of future planning. These criteria include representing a specific time in the future, the ability to temporarily disengage from a plan and reactivate the plan at an appropriate time in the future, and flexibility to deploy a plan in novel conditions. PMID:23219951
Atilola, Olayinka; Omigbodun, Olayinka; Bella-Awusah, Tolulope
2014-05-01
There are some knowledge gaps in what is known about pre-contact exposure to traumatic events among adolescents within the juvenile justice system. Data often focus on psychological sequelae without describing the traumatic events. In addition, there are few data from sub-Saharan Africa, where juvenile justice inmates are often minor offenders and may themselves have been victims of abuse and neglect. To present detailed data on the lifetime prevalence rate and pattern of traumatic events among a cohort of adolescents in juvenile justice custody in Nigeria and to compare inmates who are 'offenders' with those who are 'victims'. Inmates of a borstal and a remand home comprised the study group, and age- and gender-matched adolescents from two government schools were the secondary comparison group. The trauma checklist of the Current and Lifetime Version of the Kiddie Schedule for Affective Disorders and Schizophrenia was used as a guide in assessing traumatic events. Of a total of 408 adolescents, 204 were recruited from the two juvenile justice institutions and 204 from secondary schools. Ninety per cent of participants were male and the mean (SD) age was 15.9 (2.8) years. The prevalence rate of lifetime exposure to traumatic events among the juvenile justice offenders was 88.7% compared with 48.5% of the comparison group (P = 0.001). The most commonly reported specific lifetime traumatic event was physical abuse (52.8%). The institutionalised adolescents were significantly more likely to report lifetime exposure to almost all the traumatic events assessed. Apart from the perpetrators of violent crime, there was no statistically significant difference in the prevalence and pattern of lifetime exposure to traumatic events between the offenders and the victims. This study provides further evidence that exposure to traumatic events is a fact of life for inmates of juvenile institutions, irrespective of whether they are offenders or victims.
The implications for reform of the Nigerian juvenile justice system are discussed.
NASA Astrophysics Data System (ADS)
Barillere, R.; Cabel, H.; Chan, B.; Goulas, I.; Le Goff, J. M.; Vinot, L.; Willmott, C.; Milcent, H.; Huuskonen, P.
1994-12-01
The Cortex control information system framework is being developed at CERN. It offers basic functions to allow the sharing of information, control, and analysis functions; it presents a uniform human interface for such information and functions; it permits upgrades and additions without code modification; and it is sufficiently generic to allow its use by most of the existing or future control systems at CERN. Services will include standard interfaces to user-supplied functions, analysis, archive and event management. Cortex does not attempt to carry out the direct data acquisition or control of the devices; these are activities which are highly specific to the application and are best done by commercial systems or user-written programs. Instead, Cortex integrates these application-specific pieces and supports them by supplying other commonly needed facilities such as collaboration, analysis, diagnosis and user assistance.
Velasco, Edward; Agheneza, Tumacha; Denecke, Kerstin; Kirchner, Göran; Eckmanns, Tim
2014-03-01
The exchange of health information on the Internet has been heralded as an opportunity to improve public health surveillance. In a field that has traditionally relied on an established system of mandatory and voluntary reporting of known infectious diseases by doctors and laboratories to governmental agencies, innovations in social media and so-called user-generated information could lead to faster recognition of cases of infectious disease. More direct access to such data could enable surveillance epidemiologists to detect potential public health threats such as rare, new diseases or early-level warnings for epidemics. But how useful are data from social media and the Internet, and what is the potential to enhance surveillance? The challenges of using these emerging surveillance systems for infectious disease epidemiology, including the specific resources needed, technical requirements, and acceptability to public health practitioners and policymakers, have wide-reaching implications for public health surveillance in the 21st century. This article divides public health surveillance into indicator-based surveillance and event-based surveillance and provides an overview of each. We did an exhaustive review of published articles indexed in the databases PubMed, Scopus, and Scirus between 1990 and 2011 covering contemporary event-based systems for infectious disease surveillance. Our literature review uncovered no event-based surveillance systems currently used in national surveillance programs. While much has been done to develop event-based surveillance, the existing systems have limitations. Accordingly, there is a need for further development of automated technologies that monitor health-related information on the Internet, especially to handle large amounts of data and to prevent information overload. The dissemination to health authorities of new information about health events is not always efficient and could be improved. 
No comprehensive evaluations show whether event-based surveillance systems have been integrated into actual epidemiological work during real-time health events. The acceptability of data from the Internet and social media as a regular part of public health surveillance programs varies and is related to a circular challenge: the willingness to integrate is rooted in a lack of effectiveness studies, yet such effectiveness can be proved only through a structured evaluation of integrated systems. Issues related to changing technical and social paradigms in both individual perceptions of and interactions with personal health data, as well as social media and other data from the Internet, must be further addressed before such information can be integrated into official surveillance systems. © 2014 Milbank Memorial Fund.
A dimensionless approach for the runoff peak assessment: effects of the rainfall event structure
NASA Astrophysics Data System (ADS)
Gnecco, Ilaria; Palla, Anna; La Barbera, Paolo
2018-02-01
The present paper proposes a dimensionless analytical framework to investigate the impact of the rainfall event structure on the hydrograph peak. To this end, a methodology to describe the rainfall event structure is proposed based on the similarity with the depth-duration-frequency (DDF) curves. The rainfall input consists of a constant hyetograph in which all the possible outcomes in the sample space of the rainfall structures can be condensed. Soil abstractions are modelled using the Soil Conservation Service method, and instantaneous unit hydrograph theory is adopted to determine the dimensionless form of the hydrograph; the two-parameter gamma distribution is selected to test the proposed methodology. The dimensionless approach is introduced in order to apply the analytical framework to any study case (i.e. a natural catchment) for which the model assumptions are valid (i.e. a linear, causative, time-invariant system). A set of analytical expressions is derived in the case of a constant-intensity hyetograph to assess the maximum runoff peak with respect to a given rainfall event structure, irrespective of the specific catchment (such as the return period associated with the reference rainfall event). Looking at the results, the curve of the maximum values of the runoff peak reveals a local minimum point corresponding to the design hyetograph derived according to the statistical DDF curve. A specific catchment application is discussed in order to point out the implications of the dimensionless procedure and to provide some numerical examples of the rainfall structures with respect to observed rainfall events; finally, their effects on the hydrograph peak are examined.
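A numerical sketch of the modelling chain described above: SCS Curve Number rainfall excess from a constant-intensity hyetograph is convolved with a two-parameter gamma instantaneous unit hydrograph, and the hydrograph peak is read off. All parameter values are illustrative, not the paper's.

```python
import math

def gamma_iuh(t, n, k):
    """Two-parameter gamma IUH: h(t) = t**(n-1) * exp(-t/k) / (k**n * Gamma(n))."""
    return t ** (n - 1) * math.exp(-t / k) / (k ** n * math.gamma(n))

def scs_runoff(P, CN):
    """SCS Curve Number rainfall excess (depths in mm)."""
    S = 25400.0 / CN - 254.0
    Ia = 0.2 * S                     # initial abstraction
    return (P - Ia) ** 2 / (P - Ia + S) if P > Ia else 0.0

def hydrograph_peak(P, duration, CN, n, k, dt=0.05):
    """Convolve constant-intensity excess with the IUH and return the peak."""
    excess_rate = scs_runoff(P, CN) / duration      # mm per hour
    steps = int(duration / dt)
    t, peak = 0.0, 0.0
    while t < 5 * duration + 10 * n * k:            # run long enough for the tail
        q = sum(excess_rate * gamma_iuh(t - i * dt, n, k) * dt
                for i in range(steps) if t - i * dt > 0)
        peak = max(peak, q)
        t += dt
    return peak

peak = hydrograph_peak(P=50.0, duration=2.0, CN=80.0, n=3.0, k=0.5)
```

Because the IUH integrates to one, the peak can never exceed the constant excess rate, which is the kind of bound the paper's dimensionless expressions make analytical.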
NASA Astrophysics Data System (ADS)
Flach, Milan; Mahecha, Miguel; Gans, Fabian; Rodner, Erik; Bodesheim, Paul; Guanche-Garcia, Yanira; Brenning, Alexander; Denzler, Joachim; Reichstein, Markus
2016-04-01
The number of available Earth observations (EOs) is currently increasing substantially. Detecting anomalous patterns in these multivariate time series is an important step in identifying changes in the underlying dynamical system. Likewise, data quality issues might result in anomalous multivariate data constellations and have to be identified before corrupting subsequent analyses. In industrial applications, a common strategy is to monitor production chains with several sensors coupled to some statistical process control (SPC) algorithm. The basic idea is to raise an alarm when these sensor data depict some anomalous pattern according to the SPC, i.e., the production chain is considered 'out of control'. In fact, such industrial applications are conceptually similar to the on-line monitoring of EOs. However, algorithms used in the context of SPC or process monitoring are rarely considered for supervising multivariate spatio-temporal Earth observations. The objective of this study is to exploit the potential and transferability of SPC concepts to Earth system applications. We compare a range of different algorithms typically applied by SPC systems and evaluate their capability to detect, e.g., known extreme events in land surface processes. Specifically, two main issues are addressed: (1) identifying the most suitable combination of data pre-processing and detection algorithm for a specific type of event, and (2) analyzing the limits of the individual approaches with respect to the magnitude and spatio-temporal size of the event, as well as the data's signal-to-noise ratio. Extensive artificial data sets that represent the typical properties of Earth observations are used in this study. Our results show that the majority of the algorithms used can be considered for the detection of multivariate spatio-temporal events and directly transferred to real Earth observation data as currently assembled in different projects at the European scale, e.g.
http://baci-h2020.eu/index.php/ and http://earthsystemdatacube.net/. Known anomalies such as the Russian heatwave are detected as well as anomalies which are not detectable with univariate methods.
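One classical SPC detector of the kind compared in such studies is a Hotelling-type T² chart: a multivariate observation is flagged as 'out of control' when its T² statistic exceeds a chi-squared limit. The sketch below is a generic bivariate example, not the study's exact algorithm; the training data, the 2-d restriction, and the 1% control limit are illustrative assumptions.

```python
def mean_and_cov(data):
    """Sample mean and covariance of a list of 2-d observations."""
    n = len(data)
    mu = [sum(row[j] for row in data) / n for j in range(2)]
    cov = [[sum((row[i] - mu[i]) * (row[j] - mu[j]) for row in data) / (n - 1)
            for j in range(2)] for i in range(2)]
    return mu, cov

def t2_statistic(x, mu, cov):
    """Hotelling T^2 using the closed-form inverse of a 2x2 covariance."""
    a, b, c, d = cov[0][0], cov[0][1], cov[1][0], cov[1][1]
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    dx = [x[0] - mu[0], x[1] - mu[1]]
    return (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1]) +
            dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))

def out_of_control(x, mu, cov, limit=9.21):  # approx. chi^2 (2 dof) at alpha = 0.01
    return t2_statistic(x, mu, cov) > limit

training = [(0.1, -0.2), (0.0, 0.1), (-0.1, 0.0),
            (0.2, 0.2), (-0.2, -0.1), (0.0, 0.0)]
mu, cov = mean_and_cov(training)
```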
Juillard, Catherine; Kouo Ngamby, Marquise; Ekeke Monono, Martin; Etoundi Mballa, Georges Alain; Dicker, Rochelle A; Stevens, Kent A; Hyder, Adnan A
2017-12-01
Road traffic injury surveillance systems are a cornerstone of organized efforts at injury control. Although high-income countries rely on established trauma registries and police databases, in low- and middle-income countries without mature surveillance systems it is unclear which data source provides the best collection of road traffic injury events. The objective of this study was to compare the information available on road traffic injuries in 3 data sources used for surveillance in the sub-Saharan African country of Cameroon, providing potential insight on data sources for road traffic injury surveillance in low- and middle-income countries. We assessed the number of events captured and the information available in Yaoundé, Cameroon, from 3 separate sources of data on road traffic injuries: trauma registry, police records, and newspapers. Data were collected from a single-hospital trauma registry, police records, and the 6 most widely circulated newspapers in Yaoundé during a 6-month period in 2009. The number of road traffic injury events, mortality, and other variables included commonly in injury surveillance systems were recorded. We compared these sources using descriptive analysis. Hospital, police, and newspaper sources recorded 1,686, 273, and 480 road traffic injuries, respectively. The trauma registry provided the most complete data for the majority of variables explored; however, the newspaper data source captured 2 mass-casualty train crash events unrecorded in the other sources. Police data provided the most complete information on first responders to the scene, missing in only 7% of records. Investing in the hospital-based trauma registry may yield the best surveillance for road traffic injuries in some low- and middle-income countries, such as Yaoundé, Cameroon; however, police and newspaper reports may serve as alternative data sources when specific information is needed. Copyright © 2017 Elsevier Inc. 
All rights reserved.
Piccinni, Carlo; Gissi, Davide B; Gabusi, Andrea; Montebugnoli, Lucio; Poluzzi, Elisabetta
2015-07-01
This study aimed to evaluate possible alert signals of paraesthesia caused by local anaesthetics, focusing on those used in dentistry. A case/non-case study of spontaneous adverse events recorded in FAERS (FDA Adverse Event Reporting System) between 2004 and 2011 was performed. Cases were represented by the reports of reactions grouped under the term 'Paraesthesias and dysaesthesias' involving local anaesthetics (ATC: N01B*); non-cases were all other reports for the same drugs. Reporting odds ratios (ROR) with the relevant 95% confidence intervals (95CI) were calculated. An alert signal was considered present when the number of cases was >3 and the lower limit of the ROR 95CI was >1. To estimate the specificity of signals for dentistry, the analysis was restricted to the specific term "Oral Paraesthesia" and to reports concerning dental practice. Overall, 528 reports of 'Paraesthesias and dysaesthesias' were retrieved, corresponding to 573 drug-reaction pairs (247 lidocaine, 99 bupivacaine, 85 articaine, 30 prilocaine, 112 others). The signal was significant only for articaine (ROR = 18.38; 95CI = 13.95-24.21) and prilocaine (2.66; 1.82-3.90). The analysis of the specific term "Oral Paraesthesia" retrieved 82 reports corresponding to 90 drug-reaction pairs (37 articaine, 19 lidocaine, 14 prilocaine, 7 bupivacaine, 13 others) and confirmed the signal for articaine (58.77; 37.82-91.31) and prilocaine (8.73; 4.89-15.57). The analysis of reports concerning dental procedures retrieved a signal for articaine, both for any procedure (8.84; 2.79-27.97) and for non-surgical procedures (15.79; 1.87-133.46). In conclusion, among local anaesthetics, only articaine and prilocaine generated a signal of paraesthesia, especially when used in dentistry. © 2015 Nordic Association for the Publication of BCPT (former Nordic Pharmacological Society).
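The case/non-case computation can be reproduced in a few lines. With a = paraesthesia reports for the drug of interest, b = its other reports, c = paraesthesia reports for all other drugs, and d = their other reports, the ROR and its Wald 95% CI follow the standard formula; the counts below are illustrative, not the study's actual FAERS counts.

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """ROR with a Wald 95% confidence interval computed on the log scale."""
    ror = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    return ror, lo, hi

def is_signal(n_cases, ci_lower):
    """The study's criterion: more than 3 cases and CI lower bound above 1."""
    return n_cases > 3 and ci_lower > 1.0

# Hypothetical 2x2 counts for one anaesthetic versus all others.
ror, lo, hi = reporting_odds_ratio(a=85, b=200, c=488, d=20000)
```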
Shaw, B E; Chapman, J; Fechter, M; Foeken, L; Greinix, H; Hwang, W; Phillips-Johnson, L; Korhonen, M; Lindberg, B; Navarro, W H; Szer, J
2013-11-01
Safety of living donors is critical to the success of blood, tissue and organ transplantation. Structured and robust vigilance and surveillance systems exist as part of some national entities, but historically no global system has been in place to ensure conformity, harmonisation and the recognition of rare adverse events (AEs). The World Health Assembly has recently resolved to require AE/reaction (AE/R) reporting both nationally and globally. The World Marrow Donor Association (WMDA) is an international organisation promoting the safety of unrelated donors and progenitor cell products for use in haematopoietic progenitor cell (HPC) transplantation. To address this issue, we established a system for collecting, collating, analysing, distributing and reacting to serious adverse events and reactions (SAE/R) in unrelated HPC donors. The WMDA successfully instituted this reporting system, with 203 SAE/R reported in 2011. The committee generated two rapid reports reacting to specific SAE/R, resulting in practice-changing policies. The system has a robust governance structure, formal feedback to the WMDA membership and transparent information flows to other agencies, specialist physicians and transplant programs, and the general public.
Towards a high sensitivity small animal PET system based on CZT detectors (Conference Presentation)
NASA Astrophysics Data System (ADS)
Abbaszadeh, Shiva; Levin, Craig
2017-03-01
Small animal positron emission tomography (PET) is a biological imaging technology that allows non-invasive interrogation of internal molecular and cellular processes and mechanisms of disease. New PET molecular probes with high specificity are under development to target, detect, visualize, and quantify subtle molecular and cellular processes associated with cancer, heart disease, and neurological disorders. However, the limited uptake of these targeted probes leads to significant reduction in signal. There is a need to advance the performance of small animal PET system technology to reach its full potential for molecular imaging. Our goal is to assemble a small animal PET system based on CZT detectors and to explore methods to enhance its photon sensitivity. In this work, we reconstruct an image from a phantom using a two-panel subsystem consisting of six CZT crystals in each panel. For image reconstruction, coincidence events with energy between 450 and 570 keV were included. We are developing an algorithm to improve sensitivity of the system by including multiple interaction events.
Management of natural crises with choreography and orchestration of federated warning-systems
NASA Astrophysics Data System (ADS)
Haener, Rainer; Waechter, Joachim; Hammitzsch, Martin
2013-04-01
The project Collaborative, Complex and Critical Decision-Support in Evolving Crises (TRIDEC), co-funded by the European Commission in its Seventh Framework Programme, focuses on real-time intelligent information management in earth management. The challenges addressed include the design and implementation of a robust and scalable service infrastructure supporting the integration of existing resources, components and systems. A key challenge for TRIDEC is establishing a network of independent systems cooperatively interacting as a collective in a system-of-systems (SoS). For this purpose TRIDEC adopts enhancements of service-oriented architecture (SOA) principles in terms of an event-driven architecture (EDA) design (SOA 2.0). In this way TRIDEC establishes large-scale, concurrent and intelligent information management for a manifold of crisis types by focusing on the integration of autonomous, task-oriented and geographically distributed systems. To this end TRIDEC adopts both interaction styles SOA 2.0 offers: orchestration and choreography. In orchestration, a central knowledge-based processing framework takes control over the involved services and coordinates their execution. Choreography, on the other hand, avoids central coordination: each system involved in the SoS follows a global scenario without a single point of control, governed instead by specifically defined (enacted, agreed-upon) trigger conditions. More than orchestration, choreography enables collaborative business processes across heterogeneous sub-systems (e.g. cooperative decision making) through concurrent Complex Event Processing (CEP) and asynchronous communication. These types of interaction adopt the concept of decoupled relationships between information producers (e.g. sensors and sensor systems) and information consumers (e.g. warning systems and warning dissemination systems).
Asynchronous communication is useful if a participant wants to trigger specific actions by delegating responsibility for the action (separation of concerns) to a dedicated participant. With CEP, none of the participants has to know anything about the others: information is filtered from a stream of manifold events (triggers) assigned to certain well-defined topics. Both orchestration and choreography are based on the specification of conversations, which comprise the information model, the roles and responsibilities of all participants, services and business processes, and interaction scenarios. By maintaining conversations in commonly available, semantically enabled registries it is possible to establish a federation of systems able to provide dynamic, yet coherent behaviour. TRIDEC establishes a reliable and adaptive SoS (concurrent processing of events and activities) which exposes emergent behaviour (e.g. intelligent and adaptive monitoring strategies, cooperative decision making or dynamic system configuration) even in the case of partial system failures. Through a process of self-organisation (task balancing and dynamic delegation of responsibilities) a SoS is able to secure the reliability and responsiveness of real-time, long-running and durable monitoring activities. Concepts like Design by Contract (DbC), service level agreements (SLA), redundancy and failover strategies, as well as a comprehensive knowledge-based description of all facets of all potential interactions, ensure the interoperability, robustness and expected behaviour of the TRIDEC SoS even if it is composed of managerially independent sub-systems. Beyond these features, the adaptability of a SoS offers scalability and virtualization regarding both systems and domains. Composability and re-use of functionality can be achieved easily, even across domain boundaries.
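The decoupled producer/consumer relationship described above can be sketched as a minimal topic-based broker; the class, topic names and payloads are illustrative, not part of the TRIDEC implementation:

```python
from collections import defaultdict

class EventBroker:
    """Minimal topic-based broker: producers and consumers never
    reference each other, only well-defined topics."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        # A consumer (e.g. a warning system) registers interest in a topic.
        self._subs[topic].append(handler)

    def publish(self, topic, event):
        # A producer (e.g. a sensor system) emits without knowing consumers.
        for handler in self._subs[topic]:
            handler(event)

broker = EventBroker()
received = []
broker.subscribe("tsunami.warning", received.append)   # warning system
broker.publish("tsunami.warning", {"magnitude": 8.1})  # sensor system
broker.publish("seismic.raw", {"station": "X1"})       # no subscriber: filtered out
```

The second publish is silently dropped because no consumer subscribed to that topic, which is the essence of the topic-filtered decoupling the abstract describes.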
Hulsman, Robert L; van der Vloodt, Jane
2015-03-01
Self-evaluation and peer-feedback are important strategies within the reflective practice paradigm for the development and maintenance of professional competencies like medical communication. Characteristics of the self-evaluation and peer-feedback annotations of medical students' video-recorded communication skills were analyzed. Twenty-five year 4 medical students recorded history-taking consultations with a simulated patient, uploaded the video to a web-based platform, and marked and annotated positive and negative events. Peers reviewed the video and self-evaluations and provided feedback. The number of positive and negative annotations and the amount of text entered were analyzed. Topics and specificity of the annotations were coded and analyzed qualitatively. Students annotated on average more negative than positive events. Additional peer-feedback was more often positive. Topics most often related to structuring the consultation. Students were most critical about their biomedical topics. Negative annotations were more specific than positive annotations. Self-evaluations were more specific than peer-feedback, and the two were significantly correlated. Four response patterns were detected that negatively bias specificity assessment ratings. Teaching students to be more specific in their self-evaluations may be effective for eliciting more specific peer-feedback. Videofragmentrating is a convenient tool for bringing reflective practice activities like self-evaluation and peer-feedback into the classroom teaching of clinical skills. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Treml, Diana; Venturelli, Gustavo L; Brod, Fábio C A; Faria, Josias C; Arisi, Ana C M
2014-12-10
A genetically modified (GM) common bean event, namely Embrapa 5.1, resistant to the bean golden mosaic virus (BGMV), was approved for commercialization in Brazil. Brazilian regulation for genetically modified organism (GMO) labeling requires that any food containing more than 1% GMO be labeled. The event-specific polymerase chain reaction (PCR) method has been the primary trend in GMO identification and quantitation because of its high specificity, which is based on the flanking sequence. This work reports the development of an event-specific assay, named FGM, for Embrapa 5.1 detection and quantitation using either SYBR Green or a hydrolysis probe. The FGM assay specificity was tested against the Embrapa 2.3 event (a noncommercial GM common bean also resistant to BGMV), 46 non-GM common bean varieties, and other crop species including maize, GM maize, soybean, and GM soybean. The FGM assay showed high specificity for the Embrapa 5.1 event. Standard curves for the FGM assay presented a mean efficiency of 95% and a limit of detection (LOD) of 100 genome copies in the presence of background DNA. The primers and probe developed are suitable for the detection and quantitation of Embrapa 5.1.
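The reported 95% mean efficiency comes from the slope of a qPCR standard curve. A sketch of that computation follows, with a hypothetical dilution series whose slope of -3.45 corresponds to roughly 95% efficiency; the data are illustrative, not the FGM assay's measurements:

```python
import math

def amplification_efficiency(log10_copies, cq_values):
    """Efficiency from a qPCR standard curve: fit Cq = m*log10(copies) + b
    by least squares, then E = 10**(-1/m) - 1 (where 1.0 means 100%)."""
    n = len(log10_copies)
    mx = sum(log10_copies) / n
    my = sum(cq_values) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(log10_copies, cq_values))
         / sum((x - mx) ** 2 for x in log10_copies))
    b = my - m * mx
    return 10 ** (-1.0 / m) - 1.0, m, b

# Hypothetical dilution series: 10^2 .. 10^5 genome copies, perfectly linear.
logs = [2.0, 3.0, 4.0, 5.0]
cqs = [35.0 - 3.45 * (x - 2.0) for x in logs]
eff, slope, intercept = amplification_efficiency(logs, cqs)
```

A slope of exactly -3.32 would give 100% efficiency (perfect doubling per cycle); shallower slopes give lower efficiencies, as here.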
3D Simulation of External Flooding Events for the RISMC Pathway
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad
2015-09-01
Incorporating 3D simulations as part of the Risk-Informed Safety Margins Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently; these can, however, be analyzed with existing, validated physics-simulation toolkits. In this report, we describe these approaches specific to flooding-based analysis using an approach called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis adds a spatial/visual aspect to the design, improves the realism of results, and provides visual understanding to validate the flooding analysis.
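At the core of Smoothed Particle Hydrodynamics, field quantities are estimated as kernel-weighted sums over neighboring particles. Below is a minimal sketch of the density summation step using the common poly6 kernel; the kernel choice, particle positions and parameters are illustrative assumptions, not the RISMC toolkit's actual implementation:

```python
import math

def poly6(r2, h):
    """Poly6 smoothing kernel for squared distance r2 within support radius h."""
    if r2 >= h * h:
        return 0.0
    return 315.0 / (64.0 * math.pi * h**9) * (h * h - r2) ** 3

def densities(positions, mass, h):
    """SPH density at each particle: rho_i = sum_j m * W(|x_i - x_j|, h)."""
    out = []
    for xi in positions:
        rho = 0.0
        for xj in positions:
            r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
            rho += mass * poly6(r2, h)
        out.append(rho)
    return out

# Two close particles and one isolated one: the close pair should have
# higher estimated density than the isolated particle.
pts = [(0.0, 0.0, 0.0), (0.05, 0.0, 0.0), (1.0, 1.0, 1.0)]
rho = densities(pts, mass=0.02, h=0.1)
```

Real SPH codes use spatial hashing so each particle only sums over neighbors within h; the O(n^2) loop here is kept for clarity.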
Moro, Pedro L; Woo, Emily Jane; Paul, Wendy; Lewis, Paige; Petersen, Brett W; Cano, Maria
2016-07-01
In 1980, human diploid cell vaccine (HDCV, Imovax Rabies, Sanofi Pasteur) was licensed for use in the United States. Our objective was to assess adverse events (AEs) after HDCV reported to the US Vaccine Adverse Event Reporting System (VAERS), a spontaneous reporting surveillance system. We searched VAERS for US reports after HDCV among persons vaccinated from January 1, 1990 to July 31, 2015. Medical records were requested for reports classified as serious (death, hospitalization, prolonged hospitalization, disability, life-threatening illness), and for those suggesting anaphylaxis or Guillain-Barré syndrome (GBS). Physicians reviewed available information and assigned a primary clinical category to each report using MedDRA system organ classes. Empirical Bayesian (EB) data mining was used to identify disproportional AE reporting after HDCV. VAERS received 1,611 reports after HDCV; 93 (5.8%) were serious. Among all reports, the three most common AEs were pyrexia (18.2%), headache (17.9%), and nausea (16.5%). Among serious reports, four deaths appeared to be unrelated to vaccination. This 25-year review of VAERS did not identify new or unexpected AEs after HDCV. The vast majority of AEs were non-serious. Injection site reactions, hypersensitivity reactions, and non-specific constitutional symptoms were most frequently reported, similar to findings in pre-licensure studies.
Reliable multicast protocol specifications protocol operations
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd; Whetten, Brian
1995-01-01
This appendix contains the complete state tables for Reliable Multicast Protocol (RMP) Normal Operation, Multi-RPC Extensions, Membership Change Extensions, and Reformation Extensions. First the event types are presented. Afterwards, each RMP operation state, normal and extended, is presented individually and its events shown. Events in the RMP specification are one of several things: (1) arriving packets, (2) expired alarms, (3) user events, (4) exceptional conditions.
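The structure of such state tables can be sketched as a dispatch map from (state, event) pairs to (next state, action) pairs, with exceptional conditions handled by a default transition. The states, events, and actions below are hypothetical placeholders, not entries from the actual RMP tables:

```python
# Hypothetical event-driven state machine in the style of protocol state
# tables: each entry maps (state, event) -> (next_state, action).
TRANSITIONS = {
    ("idle",      "packet_arrived"): ("receiving", "buffer_packet"),
    ("receiving", "packet_arrived"): ("receiving", "buffer_packet"),
    ("receiving", "alarm_expired"):  ("idle",      "request_retransmit"),
    ("idle",      "user_event"):     ("idle",      "queue_request"),
}

def step(state, event):
    """Dispatch one event; unlisted pairs are exceptional conditions."""
    return TRANSITIONS.get((state, event), ("error", "report_exception"))

state, action = step("idle", "packet_arrived")
```

The four event kinds named in the abstract (arriving packets, expired alarms, user events, exceptional conditions) map naturally onto the table keys and the default branch.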
Sense, decide, act, communicate (SDAC): next generation of smart sensor systems
NASA Astrophysics Data System (ADS)
Berry, Nina; Davis, Jesse; Ko, Teresa H.; Kyker, Ron; Pate, Ron; Stark, Doug; Stinnett, Regan; Baker, James; Cushner, Adam; Van Dyke, Colin; Kyckelhahn, Brian
2004-09-01
The recent war on terrorism and increased urban warfare have been a major catalyst for increased interest in the development of disposable unattended wireless ground sensors. While the application of these sensors to hostile domains has generally been governed by specific tasks, this research explores a unique paradigm capitalizing on the fundamental functionality related to sensor systems. This functionality comprises a sensor's ability to Sense - multi-modal sensing of environmental events, Decide - smart analysis of sensor data, Act - response to environmental events, and Communicate - internally within the system and externally to humans (SDAC). The main concept behind SDAC sensor systems is to integrate the hardware, software, and networking to generate 'knowledge and not just data'. This research explores the use of wireless SDAC units to collectively make up a sensor system capable of persistent, adaptive, and autonomous behavior. These systems are based on the evaluation of scenarios and existing systems covering various domains. This paper presents a promising view of sensor network characteristics, which will eventually yield smart (intelligent collective) network arrays of SDAC sensing units generally applicable to multiple related domains. This paper also discusses and evaluates the demonstration system developed to test the concepts related to SDAC systems.
Mechanisms Underlying the Active Self-Assembly of Microtubule Rings and Spools.
VanDelinder, Virginia; Brener, Stephanie; Bachand, George D
2016-03-14
Active self-assembly offers a powerful route for the creation of dynamic multiscale structures that are presently inaccessible with standard microfabrication techniques. One such system uses the translation of microtubule filaments by surface-tethered kinesin to actively assemble nanocomposites with bundle, ring, and spool morphologies. Attempts to observe mechanisms involved in this active assembly system have been hampered by experimental difficulties with performing observation during buffer exchange and photodamage from fluorescent excitation. In the present work, we used a custom microfluidic device to remove these limitations and directly study ring/spool formation, including the earliest events (nucleation) that drive subsequent nanocomposite assembly. Three distinct formation events were observed: pinning, collisions, and induced curvature. Of these three, collisions accounted for the majority of events leading to ring/spool formation, while the rate of pinning was shown to be dependent on the amount of photodamage in the system. We further showed that the formation mechanism directly affects the diameter and rotation direction of the resultant rings and spools. Overall, the fundamental understanding described in this work provides a foundation by which the properties of motor-driven, actively assembled nanocomposites may be tailored toward specific applications.
Mechanisms underlying the active self-assembly of microtubule rings and spools
VanDelinder, Virginia; Brener, Stephanie; Bachand, George D.
2016-02-04
Here, active self-assembly offers a powerful route for the creation of dynamic multiscale structures that are presently inaccessible with standard microfabrication techniques. One such system uses the translation of microtubule filaments by surface-tethered kinesin to actively assemble nanocomposites with bundle, ring, and spool morphologies. Attempts to observe mechanisms involved in this active assembly system have been hampered by experimental difficulties with performing observation during buffer exchange and photodamage from fluorescent excitation. In the present work, we used a custom microfluidic device to remove these limitations and directly study ring/spool formation, including the earliest events (nucleation) that drive subsequent nanocomposite assembly. Three distinct formation events were observed: pinning, collisions, and induced curvature. Of these three, collisions accounted for the majority of events leading to ring/spool formation, while the rate of pinning was shown to be dependent on the amount of photodamage in the system. We further showed that the formation mechanism directly affects the diameter and rotation direction of the resultant rings and spools. Overall, the fundamental understanding described in this work provides a foundation by which the properties of motor-driven, actively assembled nanocomposites may be tailored toward specific applications.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-01
... monitoring will achieve detection and quantification of CO2 in the event surface leakage occurs. The UIC... leakage detection monitoring system or technical specifications should also be described in the MRV plan... of injected CO2 or from another cause (e.g. natural variability). The MRV plan leakage detection and...
2011-06-27
Development: Generic Hull Testing; Airbag and Sensor Technology Development; Blast Data Recorder Specifications and Fielding; Numerical Model Improvement...seat designs, airbag and restraint systems, and energy-absorbing flooring solutions. Vehicle event data recorders for collecting highly accurate...treatments. Airbags or comparable technologies such as bolsters. Sensors that can detect and deploy/trigger interior treatments within the timeframe of a
Two specific fires from 2011 are tracked for local- to regional-scale contributions to ozone (O3) and fine particulate matter (PM2.5) using a freely available regulatory modeling system that includes the BlueSky wildland fire emissions tool, Sparse Matrix Operator Kernel Emissions (...
The Neuroscience of Art: A Research Program for the Next Decade?
ERIC Educational Resources Information Center
Changeux, Jean Pierre
2011-01-01
Works of art can be viewed as elements of a human-specific nonverbal communication system, distinct from language. First, the cognitive abilities and skills required for art creation and perception are built from a cascade of events driven by a "genetic envelope". Essential for the understanding of artistic creation is its epigenetic variability.…
Liu, Yanfeng; Li, Jianghua; Du, Guocheng; Chen, Jian; Liu, Long
By combining advanced omics technology and computational modeling, systems biologists have identified and inferred thousands of regulatory events and system-wide interactions of the bacterium Bacillus subtilis, which is commonly used both in the laboratory and in industry. This dissection of the multiple layers of regulatory networks and their interactions has provided invaluable information for unraveling regulatory mechanisms and guiding metabolic engineering. In this review, we discuss recent advances in the systems biology and metabolic engineering of B. subtilis and highlight current gaps in our understanding of global metabolism and global pathway engineering in this organism. We also propose future perspectives in the systems biology of B. subtilis and suggest ways that this approach can be used to guide metabolic engineering. Specifically, although hundreds of regulatory events have been identified or inferred via systems biology approaches, systematic investigation of the functionality of these events in vivo has lagged, thereby preventing the elucidation of regulatory mechanisms and further rational pathway engineering. In metabolic engineering, ignoring the engineering of multilayer regulation hinders metabolic flux redistribution. Post-translational engineering, allosteric engineering, and dynamic pathway analyses and control will also contribute to the modulation and control of the metabolism of engineered B. subtilis, ultimately producing the desired cellular traits. We hope this review will aid metabolic engineers in making full use of available systems biology datasets and approaches for the design and perfection of microbial cell factories through global metabolism optimization. Copyright © 2016 Elsevier Inc. All rights reserved.
A Flight/Ground/Test Event Logging Facility
NASA Technical Reports Server (NTRS)
Dvorak, Daniel
1999-01-01
The onboard control software for spacecraft such as Mars Pathfinder and Cassini is composed of many subsystems including executive control, navigation, attitude control, imaging, data management, and telecommunications. The software in all of these subsystems needs to be instrumented for several purposes: to report required telemetry data, to report warning and error events, to verify internal behavior during system testing, and to provide ground operators with detailed data when investigating in-flight anomalies. Events can range in importance from purely informational events to major errors. It is desirable to provide a uniform mechanism for reporting such events and controlling their subsequent processing. Because radiation-hardened flight processors are several years behind their commercial cousins in speed and memory, most subsystems require real-time control, and downlink rates to Earth from deep space can be very low, there are limits to how much of the data can be saved and transmitted. Some kinds of events are more important than others and should therefore be preferentially retained when memory is low. Some faults can cause an event to recur at a high rate, but this must not be allowed to consume the memory pool. Some event occurrences may be of low importance when reported but suddenly become more important when a subsequent error event is reported. Some events may be so low-level that they need not be saved and reported unless specifically requested by ground operators.
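One way to realize severity-based retention and per-event rate limiting in a bounded memory pool is sketched below. The design (evict the least important entry when full, cap recurrences per event ID) is an illustrative assumption, not the actual flight software:

```python
from collections import Counter

class EventLog:
    """Bounded event log: when the pool is full, the lowest-severity entry
    is evicted first, and any single event ID is rate-limited so a
    recurring fault cannot exhaust the pool."""
    def __init__(self, capacity, max_per_id):
        self.capacity = capacity
        self.max_per_id = max_per_id
        self.entries = []           # (severity, event_id, message)
        self.counts = Counter()

    def report(self, severity, event_id, message):
        if self.counts[event_id] >= self.max_per_id:
            return False                      # suppressed: recurring event
        if len(self.entries) >= self.capacity:
            victim = min(range(len(self.entries)), key=lambda i: self.entries[i][0])
            if self.entries[victim][0] >= severity:
                return False                  # nothing less important to evict
            evicted = self.entries.pop(victim)
            self.counts[evicted[1]] -= 1
        self.entries.append((severity, event_id, message))
        self.counts[event_id] += 1
        return True

log = EventLog(capacity=3, max_per_id=2)
log.report(1, "INFO_HEARTBEAT", "nominal")
log.report(1, "INFO_HEARTBEAT", "nominal")
log.report(1, "INFO_HEARTBEAT", "nominal")   # rejected: per-ID rate limit
log.report(5, "ERR_THRUSTER", "valve timeout")
log.report(9, "FATAL_POWER", "bus undervolt")  # evicts one INFO entry
```

A real implementation would also timestamp entries and handle the "importance escalates retroactively" case the abstract mentions, e.g. by pinning recent low-level entries when a related error arrives.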
Multivariate Spatial Condition Mapping Using Subtractive Fuzzy Cluster Means
Sabit, Hakilo; Al-Anbuky, Adnan
2014-01-01
Wireless sensor networks (WSNs) are usually deployed to monitor given physical phenomena taking place in a specific space over a specific duration of time. The spatio-temporal distribution of these phenomena often correlates with certain physical events. To appropriately characterise these event-phenomena relationships over a given space for a given time frame, we require continuous monitoring of the conditions. WSNs are well suited to these tasks due to their inherent robustness. This paper presents a subtractive fuzzy cluster means algorithm and its application in data stream mining for wireless sensor systems over a cloud-computing-like architecture, which we call sensor cloud data stream mining. Benchmarked against standard mining algorithms, k-means and FCM, we have demonstrated that the subtractive fuzzy cluster means model can perform high-quality distributed data stream mining tasks comparable to centralised data stream mining. PMID:25313495
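The subtractive pass underlying subtractive fuzzy cluster means can be sketched as follows. This is a Chiu-style formulation (every data point is a candidate center; pick the highest-potential point, then suppress potential near it); the parameters and toy data are illustrative, not the paper's configuration:

```python
import math

def subtractive_clustering(points, ra=0.5, eps=0.15):
    """Return cluster centers by iteratively selecting the point with the
    highest density potential and subtracting its influence."""
    alpha = 4.0 / ra**2                 # neighborhood radius for potential
    beta = 4.0 / (1.5 * ra) ** 2        # wider radius for suppression
    d2 = lambda p, q: sum((a - b) ** 2 for a, b in zip(p, q))
    potential = [sum(math.exp(-alpha * d2(p, q)) for q in points) for p in points]
    centers = []
    first = max(potential)
    while True:
        i = max(range(len(points)), key=potential.__getitem__)
        if potential[i] < eps * first:  # remaining potential too low: stop
            break
        centers.append(points[i])
        pc = potential[i]
        potential = [p - pc * math.exp(-beta * d2(points[j], points[i]))
                     for j, p in enumerate(potential)]
    return centers

# Two well-separated blobs should yield two cluster centers.
pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (5.0, 5.0), (5.1, 5.0), (5.0, 5.1)]
centers = subtractive_clustering(pts, ra=1.0)
```

Unlike k-means or FCM, the number of clusters is not specified in advance; it emerges from the radius and stopping threshold, which is what makes the method attractive for unattended sensor-stream settings.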
Awareness is relative: dissociation as the organisation of meaning.
Lesley, Joan
2006-09-01
This essay discusses how the organisation of mental material within the cognitive system can influence consciousness and awareness, and presents a theory of dissociation based on the premise that awareness is relative, contingent on the activated representation of the ongoing event being linked to the activated self-representation. It allows four possible variations of integration: (i) non-integrated experience--perceptions about an object/event are either not perceived or they remain at the sensory level: traditional dissociative states, amnesia, depersonalisation etc.; (ii) variably integrated experience--activation of information of a specific valence about an object blocks activation of information of contrasting valence: splitting; (iii) alternatively integrated experience--experience is integrated into a specific, limited active self-representation: fugue and multiple identity states; (iv) dis-integrated experience--the ongoing experience of innate drives and needs is no longer consistently activated in the core self-representation: repression and isolation.
A rule-based approach for the correlation of alarms to support Disaster and Emergency Management
NASA Astrophysics Data System (ADS)
Gloria, M.; Minei, G.; Lersi, V.; Pasquariello, D.; Monti, C.; Saitto, A.
2009-04-01
Key words: Simple Event Correlator, Agent Platform, Ontology, Semantic Web, Distributed Systems, Emergency Management. The importance of recognising an emergency's typology in order to control critical situations and protect citizens has always been recognised; this aspect is therefore very important for the proper management of a hazardous event. In this work we present a solution for the recognition of an emergency's typology adopted by an Italian research project called CI6 (Centro Integrato per Servizi di Emergenza Innovativi). In our approach, CI6 receives alarms from citizens or people involved in the response (for example: police, operators of 112, and so on). CI6 represents any alarm by a set of information, including a text that describes it, obtained when the user points out the danger, and a pair of coordinates for its location. The system analyses the text and automatically infers information on the type of emergency by means of a set of parsing rules and inference rules applied by an independent module: an event correlator, operating on logs, called Simple Event Correlator (SEC). SEC, integrated in CI6's platform, is an open-source and platform-independent event correlation tool. SEC accepts input both from files and from standard input, making it flexible because it can be matched to any application that is able to write its output to a file stream. The SEC configuration is stored in text files as rules, each rule specifying an event-matching condition, an action list, and optionally a Boolean expression whose truth value decides whether the rule can be applied at a given moment. SEC can produce output events by executing user-specified shell scripts or programs, by writing messages to files, and by various other means.
SEC has been successfully applied in various domains like network management, system monitoring, data security, intrusion detection, and log file monitoring and analysis; it has been used with or integrated into many applications, such as CiscoWorks, HP OpenView NNM and Operations, and BMC Patrol. Analysis of the text of an alarm can detect keywords that allow the particular event to be classified. The inference rules were developed through an analysis of news reports about real emergencies found via web searches. We have seen that a kind of emergency is often characterized by several keywords. Keywords are not uniquely associated with a specific emergency but can be shared by different types of emergencies (e.g. the keyword "landslide" can be associated with both the "landslide" and the "flood" emergencies). However, the identification of two or more keywords associated with a particular type of emergency identifies, in most cases, the correct type of emergency. So, for example, if the text contains words such as "water", "flood", "overflowing", "landslide" or other words belonging to the set of defined keywords (or words sharing a root with the keywords), the system "decides" that this alarm belongs to a specific typology, in this case the flood typology. The system keeps this information in memory, so if a new alarm is reported and belongs to one of the typologies already identified, it proceeds with the comparison of coordinates. The comparison between the centres of the alarms determines whether they describe an area inscribed in an ideal circle centred on the first alarm, with a radius defined by the typology mentioned above. If this happens, the CI6 system creates an emergency centred on the centre of that area, with the same typology as the alarms. It follows that an emergency is represented by at least two alarms. Thus, the system suggests to the manager (the CI6 user) the possibility that several alarms may concern the same event, and it classifies this event.
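The keyword-matching and proximity rules described above can be sketched as follows. The keyword sets, the two-keyword threshold, and the radius test are illustrative assumptions, not CI6's actual rule base:

```python
import math

# Hypothetical keyword table: typology -> associated keywords.
KEYWORDS = {
    "flood": {"water", "flood", "overflowing", "landslide"},
    "fire": {"fire", "smoke", "burning"},
}

def classify(alarm_text, min_hits=2):
    """Assign an alarm to the typology matching at least `min_hits`
    keywords; a single shared keyword is not enough."""
    words = set(alarm_text.lower().split())
    best, best_hits = None, 0
    for typology, keys in KEYWORDS.items():
        hits = len(words & keys)
        if hits >= min_hits and hits > best_hits:
            best, best_hits = typology, hits
    return best

def same_event(p1, p2, radius):
    """Two alarms of one typology describe one emergency if the second
    falls inside the circle centred on the first."""
    return math.dist(p1, p2) <= radius

t = classify("the river water is overflowing near the bridge")
```

As in the abstract, an isolated "landslide" alone would not classify (one hit is ambiguous between typologies), while two flood keywords do.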
It is important to stress that CI6 is a decision-support system; hence this service too is limited to providing advice to the user to facilitate their task, leaving them the decision to accept it or not. REFERENCES SEC (Simple Event Correlator), http://kodu.neti.ee/~risto/sec/ M. Gloria, V. Lersi, G. Minei, D. Pasquariello, C. Monti, A. Saitto, "A Semantic WEB Services Platform to support Disaster and Emergency Management", 4th biennial Meeting of International Environmental Modelling and Software Society (iEMSs), 2008
NASA Astrophysics Data System (ADS)
Park, Sangwook; Lee, Young-Ran; Hwang, Yoola; Javier Santiago Noguero Galilea
2009-12-01
This paper describes the Flight Dynamics Automation (FDA) system for the COMS Flight Dynamics System (FDS) and its test results in terms of the performance of the automation jobs. FDA controls flight dynamics functions such as orbit determination, orbit prediction, event prediction, and fuel accounting. The designed FDA is independent of the specific characteristics defined by the spacecraft manufacturer or by specific satellite missions. Therefore, FDA can easily link its autonomous job control functions to any satellite mission control system with some interface modification. Adding the automation system to the flight dynamics system relieves operators of tedious, repetitive jobs and increases the usability and reliability of the system. FDA is therefore used to improve the quality and completeness of the whole mission control system. The FDA was applied to the real flight dynamics system of a geostationary satellite, COMS, and an experimental test was performed. The experimental results show the stability and reliability of the mission control operations under automatic job control.
Concurrent systems and time synchronization
NASA Astrophysics Data System (ADS)
Burgin, Mark; Grathoff, Annette
2018-05-01
In the majority of scientific fields, system dynamics is described assuming the existence of a unique time for the whole system. However, it is established theoretically, for example in relativity theory and in the system theory of time, and validated experimentally, that there are different times and time scales in a variety of real systems - physical, chemical, biological, social, etc. In spite of this, there are no wide-ranging scientific approaches to the exploration of such systems. Therefore, the goal of this paper is to study systems with this property. We call them concurrent systems because processes in them can run, events can happen and actions can be performed on different time scales. The problem of time synchronization is specifically explored.
[Continuous quality improvement in anesthesia].
Gaitini, L; Vaida, S; Madgar, S
1998-01-01
Slow continuous quality improvement (SCQI) in anesthesia is a process that allows identification of problems and their causes. Implementing measures to correct them, and continuous monitoring to ensure that the problems have been eliminated, are necessary. The basic assumption of CQI is that the employees of an organization are competent and working to the best of their abilities; if problems occur, they are the consequence of inadequacies in the process rather than in the individual. A CQI program is a dynamic but gradual system that invokes a slower rate of response compared with other quality methods, like quality assurance. Spectacular results following a system change are not to be expected, and the ideal is slow, continuous improvement. An SCQI program was adopted by our department in May 1994, according to the recommendations of the American Society of Anesthesiologists. Problem identification was based on 65 clinical indicators reflecting negative events related to anesthesia. Data were collected using a specially designed computer database. Four events were identified as crossing previously established thresholds (hypertension, hypotension, hypoxia and inadequate nerve block). Statistical process control was used to establish the stability of the system and whether negative events were influenced only by common causes. The causes responsible for these negative events were identified using specific SCQI tools, such as control charts, cause-effect diagrams and Pareto diagrams. Hypertension and inadequate nerve block were successfully managed. The implementation of corrective measures for the other events that crossed their thresholds is still in progress. The program requires considerable dedication on the part of the staff, and it is hoped that it will improve our clinical performance.
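The statistical process control step can be illustrated with control limits for a monthly negative-event proportion (a p-chart): a month outside the limits suggests a special cause worth a cause-effect analysis. The monthly counts below are hypothetical, not the department's data:

```python
import math

def p_chart_limits(event_counts, n_cases):
    """Per-month 3-sigma control limits for event proportions:
    pbar +/- 3*sqrt(pbar*(1-pbar)/n), floored at 0."""
    pbar = sum(event_counts) / sum(n_cases)
    limits = []
    for n in n_cases:
        s = 3.0 * math.sqrt(pbar * (1.0 - pbar) / n)
        limits.append((max(0.0, pbar - s), pbar + s))
    return pbar, limits

# Hypothetical monthly hypotension events per anesthetic cases;
# month 4 contains a deliberate spike.
events = [4, 5, 3, 16, 4]
cases = [200, 210, 190, 205, 195]
pbar, lims = p_chart_limits(events, cases)
out_of_control = [e / n > hi for (e, n), (lo, hi) in zip(zip(events, cases), lims)]
```

Only points beyond the limits are treated as special causes; variation within the limits is attributed to the common causes the abstract refers to.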
Bal-Price, Anna; Lein, Pamela J.; Keil, Kimberly P.; Sethi, Sunjay; Shafer, Timothy; Barenys, Marta; Fritsche, Ellen; Sachana, Magdalini; Meek, M.E. (Bette)
2016-01-01
The Adverse Outcome Pathway (AOP) concept has recently been proposed to support a paradigm shift in regulatory toxicology testing and risk assessment. This concept is similar to the Mode of Action (MOA), in that it describes a sequence of measurable key events triggered by a molecular initiating event in which a stressor interacts with a biological target. The resulting cascade of key events includes molecular, cellular, structural and functional changes in biological systems, resulting in a measurable adverse outcome. Thereby, an AOP ideally provides information relevant to chemical structure-activity relationships as a basis for predicting effects of structurally similar compounds. AOPs could potentially also form the basis for qualitative and quantitative predictive modeling of the human adverse outcome resulting from molecular initiating or other key events for which higher-throughput testing methods are available or can be developed. A variety of cellular and molecular processes are known to be critical for normal function of the central (CNS) and peripheral nervous systems (PNS). Because of the biological and functional complexity of the CNS and PNS, it has been challenging to establish causative links and quantitative relationships between key events that comprise the pathways leading from chemical exposure to an adverse outcome in the nervous system. Following introduction of the principles of MOA and AOPs, examples of potential or putative adverse outcome pathways specific for developmental or adult neurotoxicity are summarized and aspects of their assessment considered. Their possible application in developing mechanistically informed Integrated Approaches to Testing and Assessment (IATA) is also discussed. PMID:27212452
Remote health monitoring system for detecting cardiac disorders.
Bansal, Ayush; Kumar, Sunil; Bajpai, Anurag; Tiwari, Vijay N; Nayak, Mithun; Venkatesan, Shankar; Narayanan, Rangavittal
2015-12-01
Remote health monitoring system with clinical decision support system as a key component could potentially quicken the response of medical specialists to critical health emergencies experienced by their patients. A monitoring system, specifically designed for cardiac care with electrocardiogram (ECG) signal analysis as the core diagnostic technique, could play a vital role in early detection of a wide range of cardiac ailments, from a simple arrhythmia to life threatening conditions such as myocardial infarction. The system that the authors have developed consists of three major components, namely, (a) mobile gateway, deployed on patient's mobile device, that receives 12-lead ECG signals from any ECG sensor, (b) remote server component that hosts algorithms for accurate annotation and analysis of the ECG signal and (c) point of care device of the doctor to receive a diagnostic report from the server based on the analysis of ECG signals. In the present study, their focus has been toward developing a system capable of detecting critical cardiac events well in advance using an advanced remote monitoring system. A system of this kind is expected to have applications ranging from tracking wellness/fitness to detection of symptoms leading to fatal cardiac events.
NASA Astrophysics Data System (ADS)
Brodie, K. L.; McNinch, J. E.
2008-12-01
Accurate predictions of shoreline response to storms are contingent upon coastal-morphodynamic models effectively synthesizing the complex evolving relationships between beach topography, sandbar morphology, nearshore bathymetry, underlying geology, and the nearshore wave-field during storm events. Analysis of "pre" and "post" storm data sets has led to a common theory for event response of the nearshore system: pre-storm three-dimensional bar and shoreline configurations shift to two-dimensional, linear forms post-storm. A lack of data during storms has unfortunately left a gap in our knowledge of how the system explicitly changes during the storm event. This work presents daily observations of the beach and nearshore during high-energy storm events over a spatially extensive field site (order of magnitude: 10 km) using Bar and Swash Imaging Radar (BASIR), a mobile x-band radar system. The field site contains a complexity of features including shore-oblique bars and troughs, heterogeneous sediment, and an erosional hotspot. BASIR data provide observations of the evolution of shoreline and bar morphology, as well as nearshore bathymetry, throughout the storm events. Nearshore bathymetry is calculated using a bathymetry inversion from radar-derived wave celerity measurements. Preliminary results show a relatively stable but non-linear shore-parallel bar and a non-linear shoreline with megacusp and embayment features (order of magnitude: 1 km) that are enhanced during the wave events. Both the shoreline and shore-parallel bar undulate at a similar spatial frequency to the nearshore shore-oblique bar-field. Large-scale shore-oblique bars and troughs remain relatively static in position and morphology throughout the storm events.
The persistence of a three-dimensional shoreline, shore-parallel bar, and large-scale shore-oblique bars and troughs, contradicts the idea of event-driven shifts to two-dimensional morphology and suggests that beach and nearshore response to storms may be location specific. We hypothesize that the influence of underlying geology, defined by (1) the introduction of heterogeneous sediment and (2) the possible creation of shore-oblique bars and troughs in the nearshore, may be responsible for the persistence of three-dimensional forms and the associated shoreline hotspots during storm events.
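The bathymetry inversion from radar-derived wave celerity presumably rests on the linear dispersion relation; assuming that, depth can be recovered in closed form from an observed phase celerity and wave period. This is a sketch of that idea, not the BASIR processing chain.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def depth_from_celerity(c, period):
    """Invert the linear dispersion relation for water depth.

    Linear wave theory gives omega**2 = G * k * tanh(k * h) with
    k = omega / c, so tanh(k * h) = omega * c / G and
    h = atanh(omega * c / G) / k.  Valid only while the observed
    celerity is below the deep-water limit (omega * c / G < 1).
    """
    omega = 2.0 * math.pi / period
    k = omega / c
    ratio = omega * c / G
    if ratio >= 1.0:
        raise ValueError("celerity at or above deep-water limit; "
                         "depth cannot be resolved")
    return math.atanh(ratio) / k

# For a 10 s wave observed moving at 6 m/s the inverted depth (~3.9 m)
# is close to the shallow-water estimate c**2 / G (~3.7 m), as expected.
print(round(depth_from_celerity(6.0, 10.0), 2))
```

In practice radar-derived celerities are noisy, so operational inversions average over many wave crests and frequencies; this single-wave version only shows the governing relation.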
Cao, Youfang; Liang, Jie
2013-01-01
Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. 
This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape. PMID:23862966
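The weighted-SSA idea underlying the abstract above (biasing reaction selection while correcting with a likelihood-ratio weight) can be sketched on the birth-death example. This is the generic importance-sampling scheme, not the ABSIS look-ahead algorithm itself, and all parameters are illustrative.

```python
import random

def wssa_rare_event_prob(x0=10, threshold=25, t_max=5.0,
                         k_birth=10.0, k_death=1.0,
                         bias=3.0, n_trials=2000, seed=1):
    """Weighted SSA sketch (importance sampling on reaction selection).

    Estimates P(population reaches `threshold` before `t_max`) for a
    birth-death process: birth at rate k_birth, death at rate
    k_death * x.  Reaction *selection* is biased (births favoured by
    `bias`) while waiting times keep the unbiased total propensity;
    each trajectory carries a likelihood-ratio weight so the estimate
    stays unbiased.  Parameters are illustrative, not from the paper.
    """
    rng = random.Random(seed)
    total_weight = 0.0
    for _ in range(n_trials):
        x, t, w = x0, 0.0, 1.0
        while t < t_max and x < threshold:
            a_birth, a_death = k_birth, k_death * x
            a0 = a_birth + a_death
            # Unbiased waiting time.
            t += rng.expovariate(a0)
            if t >= t_max:
                break
            # Biased reaction selection, with weight correction.
            b_birth, b_death = bias * a_birth, a_death
            b0 = b_birth + b_death
            if rng.random() < b_birth / b0:
                w *= (a_birth / a0) / (b_birth / b0)
                x += 1
            else:
                w *= (a_death / a0) / (b_death / b0)
                x -= 1
        if x >= threshold:
            total_weight += w
    return total_weight / n_trials

print(wssa_rare_event_prob())
```

Under the biased dynamics most trajectories reach the threshold, but their down-weighted contributions recover a small probability estimate; ABSIS additionally adapts the bias per state from enumerated short paths, which this sketch omits.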
Fail-Safe Shut-Off Valve for Filtering Systems Employing Candle Filters
VanOsdol, John
2006-01-03
The invention relates to an apparatus that acts as a fail-safe shut-off valve. More specifically, the invention relates to a fail-safe shut-off valve that allows fluid flow during normal operational conditions, but prevents the flow of fluids in the event of an upstream system failure that causes over-pressurization. The present invention is particularly well suited for use in conjunction with hot gas filtering systems that utilize ceramic candle filters. Used in such a hot gas system, the present invention stops the flow of hot gas and prevents any particulate-laden gas from entering the clean side of the system.
Wilson, Mary E; Chen, Lin H; Han, Pauline V; Keystone, Jay S; Cramer, Jakob P; Segurado, Aluisio; Hale, DeVon; Jensenius, Mogens; Schwartz, Eli; von Sonnenburg, Frank; Leder, Karin
2014-05-01
Brazil will host the 2014 FIFA World Cup and the 2016 Olympic and Paralympic Games, events that are expected to attract hundreds of thousands of international travelers. Travelers to Brazil will encounter locally endemic infections as well as mass event-specific risks. We describe 1586 ill returned travelers who had visited Brazil and were seen at a GeoSentinel Clinic from July 1997 through May 2013. The most common travel-related illnesses were dermatologic conditions (40%), diarrheal syndromes (25%), and febrile systemic illness (19%). The most common specific dermatologic diagnoses were cutaneous larva migrans, myiasis, and tungiasis. Dengue and malaria, predominantly Plasmodium vivax, were the most frequently identified specific causes of fever and the most common reasons for hospitalization after travel. Dengue fever diagnoses displayed marked seasonality, although cases were seen throughout the year. Among the 28 ill returned travelers with human immunodeficiency virus (HIV) infection, 11 had newly diagnosed asymptomatic infection and 9 had acute symptomatic HIV. Our analysis primarily identified infectious diseases among travelers to Brazil. Knowledge of illness in travelers returning from Brazil can assist clinicians to advise prospective travelers and guide pretravel preparation, including itinerary-tailored advice, vaccines, and chemoprophylaxis; it can also help to focus posttravel evaluation of ill returned travelers. Travelers planning to attend mass events will encounter other risks that are not captured in our surveillance network.
Antón, Alfonso; Pazos, Marta; Martín, Belén; Navero, José Manuel; Ayala, Miriam Eleonora; Castany, Marta; Martínez, Patricia; Bardavío, Javier
2013-01-01
To assess sensitivity, specificity, and agreement among automated event analysis, automated trend analysis, and expert evaluation to detect glaucoma progression. This was a prospective study that included 37 eyes with a follow-up of 36 months. All had glaucomatous disks and fields and performed reliable visual fields every 6 months. Each series of fields was assessed with 3 different methods: subjective assessment by 2 independent teams of glaucoma experts, glaucoma/guided progression analysis (GPA) event analysis, and GPA (visual field index-based) trend analysis. Kappa agreement coefficient between methods and sensitivity and specificity for each method using expert opinion as gold standard were calculated. The incidence of glaucoma progression was 16% to 18% in 3 years but only 3 cases showed progression with all 3 methods. Kappa agreement coefficient was high (k=0.82) between subjective expert assessment and GPA event analysis, and only moderate between these two and GPA trend analysis (k=0.57). Sensitivity and specificity for GPA event and GPA trend analysis were 71% and 96%, and 57% and 93%, respectively. The 3 methods detected similar numbers of progressing cases. The GPA event analysis and expert subjective assessment showed high agreement between them and moderate agreement with GPA trend analysis. In a period of 3 years, both methods of GPA analysis offered high specificity, event analysis showed 83% sensitivity, and trend analysis had a 66% sensitivity.
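The agreement statistics reported above (sensitivity and specificity against an expert gold standard, and Cohen's kappa between methods) are straightforward to compute; the sketch below uses hypothetical progression calls for 37 eyes, not the study's data.

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Sensitivity and specificity from 2x2 confusion-matrix counts."""
    return tp / (tp + fn), tn / (tn + fp)

def cohen_kappa(a, b):
    """Cohen's kappa between two binary raters given paired 0/1 labels."""
    n = len(a)
    agree = sum(x == y for x, y in zip(a, b)) / n
    p_a = sum(a) / n
    p_b = sum(b) / n
    chance = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (agree - chance) / (1 - chance)

# Hypothetical calls for 37 eyes (1 = progressing): the second rater
# misses one true progressor and adds one false positive.
expert    = [1, 1, 1, 1, 1] + [0] * 32
gpa_event = [1, 1, 1, 1, 0, 1] + [0] * 31
tp = sum(e and g for e, g in zip(expert, gpa_event))
fp = sum((not e) and g for e, g in zip(expert, gpa_event))
fn = sum(e and (not g) for e, g in zip(expert, gpa_event))
tn = sum((not e) and (not g) for e, g in zip(expert, gpa_event))
print(sensitivity_specificity(tp, fp, fn, tn), cohen_kappa(expert, gpa_event))
```

Note how kappa discounts the large chance agreement that arises when most eyes are non-progressing, which is why it is preferred over raw percent agreement in studies like this one.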
NASA Technical Reports Server (NTRS)
Prinzel, Lawrence J 3rd; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.
2003-01-01
The present study examined the effects of an electroencephalographic- (EEG-) based system for adaptive automation on tracking performance and workload. In addition, event-related potentials (ERPs) to a secondary task were derived to determine whether they would provide an additional degree of workload specificity. Participants were run in an adaptive automation condition, in which the system switched between manual and automatic task modes based on the value of each individual's own EEG engagement index; a yoked control condition; or another control group, in which task mode switches followed a random pattern. Adaptive automation improved performance and resulted in lower levels of workload. Further, the P300 component of the ERP paralleled the sensitivity to task demands of the performance and subjective measures across conditions. These results indicate that it is possible to improve performance with a psychophysiological adaptive automation system and that ERPs may provide an alternative means for distinguishing among levels of cognitive task demand in such systems. Actual or potential applications of this research include improved methods for assessing operator workload and performance.
Hanken, Taylor; Young, Sam; Smilowitz, Karen; Chiampas, George; Waskowski, David
2016-10-01
As one of the largest marathons worldwide, the Bank of America Chicago Marathon (BACCM; Chicago, Illinois USA) accumulates high volumes of data. Race organizers and engaged agencies need the ability to access specific data in real time. This report details a data visualization system designed for the Chicago Marathon and establishes key principles for event management data visualization. The data visualization system allows for efficient data communication among the organizing agencies of Chicago endurance events. Agencies can observe the progress of the race throughout the day and obtain needed information, such as the number and location of runners on the course and current weather conditions. Implementation of the system can reduce time-consuming, face-to-face interactions between involved agencies by having key data streams in one location, streamlining communications with the purpose of improving race logistics, as well as medical preparedness and response. Hanken T, Young S, Smilowitz K, Chiampas G, Waskowski D. Developing a data visualization system for the Bank of America Chicago Marathon (Chicago, Illinois USA). Prehosp Disaster Med. 2016;31(5):572-577.
Activation and Resolution of Periodontal Inflammation and Its Systemic Impact
Hasturk, Hatice; Kantarci, Alpdogan
2015-01-01
Inflammation is a highly organized event impacting upon organs, tissues and biological systems. Periodontal diseases are characterized by dysregulation or dysfunction of resolution pathways of inflammation, resulting in a failure of healing and a dominant chronic, progressive, destructive and predominantly unresolved inflammation. The biological consequences of inflammatory processes may be independent of the etiological agents, such as trauma, microbial organisms and stress. The impact of the inflammatory pathological process depends upon the affected tissues or organ system. Whilst mediators are similar, there is a tissue specificity for the inflammatory events. It is plausible that inflammatory processes in one organ could directly lead to pathologies in another organ or tissue. Communication between distant parts of the body and their inflammatory status is also mediated by common signaling mechanisms carried by cells and soluble mediators. This review focuses on periodontal inflammation, its systemic associations and advances in therapeutic approaches based on mediators acting through orchestration of natural pathways to the resolution of inflammation. We also discuss a new treatment concept in which natural pathways of resolution of periodontal inflammation can be used to limit systemic inflammation and promote healing and regeneration. PMID:26252412
Fletcher, Richard Ribón; Tam, Sharon; Omojola, Olufemi; Redemske, Richard; Kwan, Joyce
2011-01-01
We present a wearable sensor platform designed for monitoring and studying autonomic nervous system (ANS) activity for the purpose of mental health treatment and interventions. The mobile sensor system consists of a sensor band worn on the ankle that continuously monitors electrodermal activity (EDA), 3-axis acceleration, and temperature. A custom-designed ECG heart monitor worn on the chest is also used as an optional part of the system. The EDA signal from the ankle bands provides a measure of sympathetic nervous system activity and is used to detect arousal events. The optional ECG data can be used to improve the sensor classification algorithm and provide a measure of emotional "valence." Both types of sensor bands contain a Bluetooth radio that enables communication with the patient's mobile phone. When a specific arousal event is detected, the phone automatically presents therapeutic and empathetic messages to the patient in the tradition of Cognitive Behavioral Therapy (CBT). As an example of clinical use, we describe how the system is currently being used in an ongoing study for patients with drug-addiction and post-traumatic stress disorder (PTSD).
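A minimal stand-in for the arousal detection step can be sketched as a baseline-and-threshold rule on the EDA trace: flag samples that rise sharply above a recent moving-average baseline. The window, threshold, and data below are all hypothetical; the actual system uses a trained classifier.

```python
def moving_average(signal, window):
    """Trailing moving average used as a tonic (baseline) estimate."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        out.append(sum(signal[lo:i + 1]) / (i - lo + 1))
    return out

def detect_arousal_events(eda, window=5, delta=0.3):
    """Flag samples where EDA rises more than `delta` microsiemens above
    its recent baseline: a toy stand-in for the paper's classifier.
    `window` and `delta` are illustrative values, not from the system."""
    baseline = moving_average(eda, window)
    return [i for i, (x, b) in enumerate(zip(eda, baseline)) if x - b > delta]

# Synthetic microsiemens trace with a phasic rise around sample 6.
eda = [1.0, 1.0, 1.05, 1.0, 1.02, 1.1, 1.9, 2.0, 1.4, 1.1]
print(detect_arousal_events(eda))
```

Separating the slow tonic level from fast phasic responses is the standard first step in EDA analysis; a deployed system would add artifact rejection using the accelerometer channel.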
A semi-supervised learning framework for biomedical event extraction based on hidden topics.
Zhou, Deyu; Zhong, Dayou
2015-05-01
Scientists have devoted decades of effort to understanding the interactions between proteins and RNA production. This information could enrich current knowledge of drug reactions and of the development of certain diseases. Nevertheless, due to its lack of explicit structure, the life science literature, one of the most important sources of this information, remains difficult for computer-based systems to access. Biomedical event extraction, which automatically acquires knowledge of molecular events from research articles, has therefore attracted community-wide efforts in recent years. Most approaches are based on statistical models and require large-scale annotated corpora to estimate model parameters precisely; however, such corpora are usually difficult to obtain in practice. Employing un-annotated data through semi-supervised learning is therefore a feasible solution for biomedical event extraction and has attracted growing interest. In this paper, a semi-supervised learning framework based on hidden topics for biomedical event extraction is presented. In this framework, sentences in the un-annotated corpus are automatically assigned event annotations based on their distances to sentences in the annotated corpus. More specifically, not only the structures of the sentences but also the hidden topics embedded in the sentences are used to describe the distance. The sentences and newly assigned event annotations, together with the annotated corpus, are employed for training. Experiments were conducted on the multi-level event extraction corpus, a gold standard corpus. Experimental results show that the proposed framework achieves an improvement of more than 2.2% in F-score on biomedical event extraction when compared to the state-of-the-art approach.
The results suggest that, by incorporating un-annotated data, the proposed framework indeed improves the performance of the state-of-the-art event extraction system, and that the similarity between sentences can be described precisely by the hidden topics and structures of the sentences. Copyright © 2015 Elsevier B.V. All rights reserved.
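The topic component of the sentence distance can be illustrated with the Jensen-Shannon divergence between topic distributions, a common symmetric measure in topic space. The paper's actual distance also incorporates sentence structure, which this sketch omits; the 4-topic distributions are hypothetical.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence, skipping zero-probability terms."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jensen_shannon(p, q):
    """Jensen-Shannon divergence between two topic distributions: a
    symmetric, bounded measure of dissimilarity in topic space."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical 4-topic distributions for an annotated sentence and two
# un-annotated candidates: low divergence => transfer the annotation.
annotated   = [0.70, 0.20, 0.05, 0.05]
candidate_a = [0.65, 0.25, 0.05, 0.05]   # similar topic mix
candidate_b = [0.05, 0.05, 0.20, 0.70]   # dissimilar topic mix
print(jensen_shannon(annotated, candidate_a) <
      jensen_shannon(annotated, candidate_b))
```

Because the divergence is bounded and symmetric, it can be combined with a structural distance (e.g. over dependency paths) by simple weighting, which is in the spirit of the framework described above.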
Nishi, Fernanda A; Polak, Catarina; Cruz, Diná de Almeida Lopes Monteiro da
2018-05-01
The purpose of the Manchester Triage System is to clinically prioritize each patient seeking care in an emergency department. Patients with suspected acute myocardial infarction who have typical symptoms, including chest pain, should be classified into the highest priority groups, requiring immediate medical assistance or care within 10 min. As such, the Manchester Triage System should offer adequate sensitivity and specificity. This study estimated the sensitivity and specificity of the Manchester Triage System in the triage of patients with chest pain related to a diagnosis of acute myocardial infarction, and the associations between the performance of the Manchester Triage System and selected variables. This was an observational, analytical, cross-sectional, retrospective study. The sensitivity and specificity of the Manchester Triage System were estimated by verifying the triage classification received by these patients and their established medical diagnoses. The sample comprised 10,087 triage episodes, in which 139 (1.38%) patients had a diagnosis of acute myocardial infarction. In 49 episodes, confirmation of the medical diagnosis was not possible. The estimated sensitivity of the Manchester Triage System was 44.60% (36.18-53.27%) and the estimated specificity was 91.30% (90.73-91.85%). Of the 10,038 episodes in which the diagnosis of acute myocardial infarction was confirmed or excluded, 938 patients (9.34%) received an incorrect classification (undertriage or overtriage). This study showed that the specificity of the Manchester Triage System was very good. However, its low sensitivity indicated that patients in high-priority categories were undertriaged, leading to longer wait times and an associated increased risk of adverse events.
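Confidence intervals like those quoted above can be sketched with the Wilson score interval for a binomial proportion. Assuming the 44.60% sensitivity corresponds to 62 correctly prioritized cases out of the 139 infarctions (an inference, not stated in the abstract), a Wilson interval lands close to the reported range; the paper may have used a different method, such as an exact binomial interval.

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Approximate 95% Wilson score interval for a binomial proportion.

    Sketch only: the study's published interval may come from another
    method (e.g. exact binomial), so values will not match exactly.
    """
    p = successes / n
    denom = 1 + z * z / n
    centre = p + z * z / (2 * n)
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return (centre - half) / denom, (centre + half) / denom

# 62 of 139 infarction cases in the two highest-priority groups is
# assumed here to reproduce the reported ~44.6% sensitivity.
lo, hi = wilson_interval(62, 139)
print(round(62 / 139, 4), round(lo, 4), round(hi, 4))
```

The Wilson interval behaves better than the naive normal approximation near 0 or 1, which matters for the very high specificity also reported in the study.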
The BioLexicon: a large-scale terminological resource for biomedical text mining.
Thompson, Paul; McNaught, John; Montemagni, Simonetta; Calzolari, Nicoletta; del Gratta, Riccardo; Lee, Vivian; Marchi, Simone; Monachini, Monica; Pezik, Piotr; Quochi, Valeria; Rupp, C J; Sasaki, Yutaka; Venturi, Giulia; Rebholz-Schuhmann, Dietrich; Ananiadou, Sophia
2011-10-12
Due to the rapidly expanding body of biomedical literature, biologists require increasingly sophisticated and efficient systems to help them to search for relevant information. Such systems should account for the multiple written variants used to represent biomedical concepts, and allow the user to search for specific pieces of knowledge (or events) involving these concepts, e.g., protein-protein interactions. Such functionality requires access to detailed information about words used in the biomedical literature. Existing databases and ontologies often have a specific focus and are oriented towards human use. Consequently, biological knowledge is dispersed amongst many resources, which often do not attempt to account for the large and frequently changing set of variants that appear in the literature. Additionally, such resources typically do not provide information about how terms relate to each other in texts to describe events. This article provides an overview of the design, construction and evaluation of a large-scale lexical and conceptual resource for the biomedical domain, the BioLexicon. The resource can be exploited by text mining tools at several levels, e.g., part-of-speech tagging, recognition of biomedical entities, and the extraction of events in which they are involved. As such, the BioLexicon must account for real usage of words in biomedical texts. In particular, the BioLexicon gathers together different types of terms from several existing data resources into a single, unified repository, and augments them with new term variants automatically extracted from biomedical literature. Extraction of events is facilitated through the inclusion of biologically pertinent verbs (around which events are typically organized) together with information about typical patterns of grammatical and semantic behaviour, which are acquired from domain-specific texts. 
In order to foster interoperability, the BioLexicon is modelled using the Lexical Markup Framework, an ISO standard. The BioLexicon contains over 2.2 M lexical entries and over 1.8 M terminological variants, as well as over 3.3 M semantic relations, including over 2 M synonymy relations. Its exploitation can benefit both application developers and users. We demonstrate some such benefits by describing integration of the resource into a number of different tools, and evaluating improvements in performance that this can bring.
Architectural design and support for knowledge sharing across heterogeneous MAST systems
NASA Astrophysics Data System (ADS)
Arkin, Ronald C.; Garcia-Vergara, Sergio; Lee, Sung G.
2012-06-01
A novel approach for the sharing of knowledge between widely heterogeneous robotic agents is presented, drawing upon Gärdenfors' Conceptual Spaces approach [4]. Owing to their small size, the target microrobotic platforms are impoverished in computation, power, sensing, and communications compared to more traditional robotic platforms. This poses novel challenges for the system to converge on an interpretation of events within the world, in this case focusing specifically on the task of recognizing the concept of a biohazard in an indoor setting.
Distributed Tactical Decision Support by Using Real-Time Database System
1987-11-01
appendix A and detailed in depth in the Advanced Combat Direction System Specification (reference 5). The assumption is that time 0 (T0) of any contact...CONSTELLATION LAUNCH I F14A CAPM 330 350 10000 STOP At simulated engagement minute 30, the following orders are next submitted to the event generator...time of contact (ETC). There is the assumption in the ETC calculation that COURSE will change such that the new report would be on a dead-reckoning
Life review based on remembering specific positive events in active aging.
Latorre, José M; Serrano, Juan P; Ricarte, Jorge; Bonete, Beatriz; Ros, Laura; Sitges, Esther
2015-02-01
The aim of this study is to evaluate the effectiveness of life review (LR) based on specific positive events in non-depressed older adults taking part in an active aging program. Fifty-five older adults were randomly assigned to an experimental group or an active control (AC) group. A six-session individual training of LR based on specific positive events was carried out with the experimental group. The AC group undertook a "media workshop" of six sessions focused on learning journalistic techniques. Pre-test and post-test measures included life satisfaction, depressive symptoms, experiencing the environment as rewarding, and autobiographical memory (AM) scales. LR intervention decreased depressive symptomatology, improved life satisfaction, and increased specific memories. The findings suggest that practice in AM for specific events is an effective component of LR that could be a useful tool in enhancing emotional well-being in active aging programs, thus reducing depressive symptoms. © The Author(s) 2014.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sattison, M.B.
The Idaho National Engineering Laboratory (INEL) has, over the past three years, created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented.
Homeologous plastid DNA transformation in tobacco is mediated by multiple recombination events.
Kavanagh, T A; Thanh, N D; Lao, N T; McGrath, N; Peter, S O; Horváth, E M; Dix, P J; Medgyesy, P
1999-01-01
Efficient plastid transformation has been achieved in Nicotiana tabacum using cloned plastid DNA of Solanum nigrum carrying mutations conferring spectinomycin and streptomycin resistance. The use of the incompletely homologous (homeologous) Solanum plastid DNA as donor resulted in a Nicotiana plastid transformation frequency comparable with that of other experiments where completely homologous plastid DNA was introduced. Physical mapping and nucleotide sequence analysis of the targeted plastid DNA region in the transformants demonstrated efficient site-specific integration of the 7.8-kb Solanum plastid DNA and the exclusion of the vector DNA. The integration of the cloned Solanum plastid DNA into the Nicotiana plastid genome involved multiple recombination events as revealed by the presence of discontinuous tracts of Solanum-specific sequences that were interspersed between Nicotiana-specific markers. Marked position effects resulted in very frequent cointegration of the nonselected peripheral donor markers located adjacent to the vector DNA. Data presented here on the efficiency and features of homeologous plastid DNA recombination are consistent with the existence of an active RecA-mediated, but a diminished mismatch, recombination/repair system in higher-plant plastids. PMID:10388829
NASA Astrophysics Data System (ADS)
Yu, Yuan; Tong, Qi; Li, Zhongxia; Tian, Jinhai; Wang, Yizhi; Su, Feng; Wang, Yongsheng; Liu, Jun; Zhang, Yong
2014-02-01
PhiC31 integrase-mediated gene delivery has been extensively used in gene therapy and animal transgenesis. However, random integration events are observed in phiC31-mediated integration in different types of mammalian cells; as a result, the efficiencies of pseudo attP site integration and evaluation of site-specific integration are compromised. To improve this system, we used an attB-TK fusion gene as a negative selection marker, thereby eliminating random integration during phiC31-mediated transfection. We also excised the selection system and plasmid bacterial backbone by using two other site-specific recombinases, Cre and Dre. Thus, we generated clean transgenic bovine fetal fibroblast cells free of selectable marker and plasmid bacterial backbone. These clean cells were used as donor nuclei for somatic cell nuclear transfer (SCNT), indicating a similar developmental competence of SCNT embryos to that of non-transgenic cells. Therefore, the present gene delivery system facilitated the development of gene therapy and agricultural biotechnology.
The heuristic basis of remembering and classification: fluency, generation, and resemblance.
Whittlesea, B W; Leboe, J P
2000-03-01
People use 3 heuristics (fluency, generation, and resemblance) in remembering a prior experience of a stimulus. The authors demonstrate that people use the same 3 heuristics in classifying a stimulus as a member of a category and interpret this as support for the idea that people have a unitary memory system that operates by the same fundamental principles in both remembering and nonremembering tasks. The authors argue that the fundamental functions of memory are the production of specific mental events, under the control of the stimulus, task, and context, and the evaluation of the coherence of those events, which controls the subjective experience accompanying performance.
Comprehension of Spacecraft Telemetry Using Hierarchical Specifications of Behavior
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Joshi, Rajeev
2014-01-01
A key challenge in operating remote spacecraft is that ground operators must rely on the limited visibility available through spacecraft telemetry in order to assess spacecraft health and operational status. We describe a tool for processing spacecraft telemetry that allows ground operators to impose structure on received telemetry in order to achieve a better comprehension of system state. A key element of our approach is the design of a domain-specific language that allows operators to express models of expected system behavior using partial specifications. The language allows behavior specifications with data fields, similar to other recent runtime verification systems. What is notable about our approach is the ability to develop hierarchical specifications of behavior. The language is implemented as an internal DSL in the Scala programming language that synthesizes rules from patterns of specification behavior. The rules are automatically applied to received telemetry and the inferred behaviors are available to ground operators using a visualization interface that makes it easier to understand and track spacecraft state. We describe initial results from applying our tool to telemetry received from the Curiosity rover currently roving the surface of Mars, where the visualizations are being used to trend subsystem behaviors, in order to identify potential problems before they happen. However, the technology is completely general and can be applied to any system that generates telemetry such as event logs.
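The core of such behavior monitoring, checking that an expected confirming event arrives within a time window of its triggering command, can be sketched in a few lines. The actual tool is an internal Scala DSL; this Python fragment, with invented event names and timestamps, only illustrates the idea:

```python
def check_response(events, cmd, resp, window):
    """Report command timestamps that never received their confirming
    response within `window` time units (a toy version of a temporal
    behavior rule applied to a telemetry event log)."""
    violations, pending = [], []
    for t, name in sorted(events):
        # Expire pending commands whose response window has closed.
        while pending and t - pending[0] > window:
            violations.append(pending.pop(0))
        if name == cmd:
            pending.append(t)
        elif name == resp and pending:
            pending.pop(0)          # earliest pending command is confirmed
    violations.extend(pending)      # commands never answered in the log
    return violations

# Hypothetical log: the command at t=5 is never acknowledged in time.
log = [(0, "CMD"), (2, "ACK"), (5, "CMD"), (20, "CMD"), (21, "ACK")]
missed = check_response(log, "CMD", "ACK", window=10)
```

A real system would synthesize many such rules, one per specified behavior, and apply them to the telemetry stream as it arrives.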
Deep Space Storm Shelter Simulation Study
NASA Technical Reports Server (NTRS)
Dugan, Kathryn; Phojanamongkolkij, Nipa; Cerro, Jeffrey; Simon, Matthew
2015-01-01
Missions outside of Earth's magnetic field are impeded by the presence of radiation from galactic cosmic rays and solar particle events. To overcome this issue, NASA's Advanced Exploration Systems Radiation Works Storm Shelter (RadWorks) has been studying different radiation protective habitats to shield against the onset of solar particle event radiation. These habitats have the capability of protecting occupants by utilizing available materials such as food, water, brine, human waste, trash, and non-consumables to build short-term shelters. Protection comes from building a barrier with the materials that dampens the impact of the radiation on astronauts. The goal of this study is to develop a discrete event simulation, modeling a solar particle event and the building of a protective shelter. The main hallway location within a larger habitat similar to the International Space Station (ISS) is analyzed. The outputs from this model are: 1) the total area covered on the shelter by the different materials, 2) the amount of radiation the crew members receive, and 3) the amount of time for setting up the habitat during specific points in a mission given an event occurs.
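The event-queue core of a discrete event simulation of this kind can be sketched compactly. The event names, delays, and handler chains below are invented for illustration and are not taken from the study:

```python
import heapq

def simulate(initial_events, handlers):
    """Minimal discrete-event loop: pop the earliest (time, name) event,
    let its handler schedule follow-up events, until the queue is empty."""
    queue = list(initial_events)
    heapq.heapify(queue)
    log = []
    while queue:
        time, name = heapq.heappop(queue)
        log.append((time, name))
        for delay, follow_up in handlers.get(name, []):
            heapq.heappush(queue, (time + delay, follow_up))
    return log

# Hypothetical shelter scenario: an SPE warning triggers material moves.
handlers = {
    "spe_warning": [(5, "move_water"), (8, "move_food")],
    "move_water": [(10, "shelter_ready")],
}
log = simulate([(0, "spe_warning")], handlers)
```

Outputs such as shelter coverage and accumulated crew dose would then be computed as functions of the logged event times.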
FusionAnalyser: a new graphical, event-driven tool for fusion rearrangements discovery
Piazza, Rocco; Pirola, Alessandra; Spinelli, Roberta; Valletta, Simona; Redaelli, Sara; Magistroni, Vera; Gambacorti-Passerini, Carlo
2012-01-01
Gene fusions are common driver events in leukaemias and solid tumours; here we present FusionAnalyser, a tool dedicated to the identification of driver fusion rearrangements in human cancer through the analysis of paired-end high-throughput transcriptome sequencing data. We initially tested FusionAnalyser by using a set of in silico randomly generated sequencing data from 20 known human translocations occurring in cancer and subsequently using transcriptome data from three chronic and three acute myeloid leukaemia samples. In all cases, our tool was able to detect the presence of the correct driver fusion event(s) with high specificity. In one of the acute myeloid leukaemia samples, FusionAnalyser identified a novel, cryptic, in-frame ETS2–ERG fusion. A fully event-driven graphical interface and a flexible filtering system allow complex analyses to be run in the absence of any a priori programming or scripting knowledge. Therefore, we propose FusionAnalyser as an efficient and robust graphical tool for the identification of functional rearrangements in the context of high-throughput transcriptome sequencing data. PMID:22570408
Vaccine adverse event text mining system for extracting features from vaccine safety reports.
Botsis, Taxiarchis; Buttolph, Thomas; Nguyen, Michael D; Winiecki, Scott; Woo, Emily Jane; Ball, Robert
2012-01-01
To develop and evaluate a text mining system for extracting key clinical features from vaccine adverse event reporting system (VAERS) narratives to aid in the automated review of adverse event reports. Based upon clinical significance to VAERS reviewing physicians, we defined the primary (diagnosis and cause of death) and secondary features (eg, symptoms) for extraction. We built a novel vaccine adverse event text mining (VaeTM) system based on a semantic text mining strategy. The performance of VaeTM was evaluated using a total of 300 VAERS reports in three sequential evaluations of 100 reports each. Moreover, we evaluated the VaeTM contribution to case classification; an information retrieval-based approach was used for the identification of anaphylaxis cases in a set of reports and was compared with two other methods: a dedicated text classifier and an online tool. The performance metrics of VaeTM were text mining metrics: recall, precision and F-measure. We also conducted a qualitative difference analysis and calculated sensitivity and specificity for classification of anaphylaxis cases based on the above three approaches. VaeTM performed best in extracting diagnosis, second level diagnosis, drug, vaccine, and lot number features (lenient F-measure in the third evaluation: 0.897, 0.817, 0.858, 0.874, and 0.914, respectively). In terms of case classification, high sensitivity was achieved (83.1%); this was equal to that of the dedicated text classifier (83.1%) and better than that of the online tool (40.7%). Our VaeTM implementation of a semantic text mining strategy shows promise in providing accurate and efficient extraction of key features from VAERS narratives.
Towards a method to characterize temporary groundwater dynamics during droughts
NASA Astrophysics Data System (ADS)
Heudorfer, Benedikt; Stahl, Kerstin
2016-04-01
In order to improve our understanding of the complex mechanisms involved in the development, propagation and termination of drought events, a major challenge is to grasp the role of groundwater systems. Research on how groundwater responds to meteorological drought events (i.e. short-term climate anomalies) is still limited. Part of the problem is that there is as yet no generic method to characterize the response of different groundwater systems to extreme climate anomalies. In order to explore possibilities for such a methodology, we evaluate two statistical approaches to characterize groundwater dynamics on short time scales by applying them to observed groundwater head data from different pre- and peri-mountainous groundwater systems in humid central Europe (Germany). The first method is based on the coefficient of variation in moving windows of various lengths; the second applies streamflow recession characteristics to groundwater data. With these methods, the behavior of the gauges during low-head events and their response to precipitation were explored. The findings make it possible to distinguish between gauges dominated by cyclic patterns and gauges dominated by patterns on the seasonal or event scale (commonly referred to as slow- and fast-responding gauges, respectively). While there are some clues as to which factors might control these patterns, the specific controls remain generally unclear for the gauges in this study. The key open question, however, is whether the variety of groundwater dynamics manifested in real systems can be subsumed under a single generic method. Further studies on the topic are in progress.
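The first of the two statistical approaches, a coefficient of variation computed in moving windows, can be sketched as follows. This is a minimal illustration with made-up head values, not the authors' implementation:

```python
import statistics

def moving_cv(series, window):
    """Coefficient of variation (std/mean) in a sliding window; high CV
    flags dynamic periods, low CV flags slowly responding heads."""
    out = []
    for i in range(len(series) - window + 1):
        w = series[i:i + window]
        m = statistics.mean(w)
        out.append(statistics.pstdev(w) / m if m else float("nan"))
    return out

# Hypothetical groundwater head series: flat, then a sharp drawdown.
heads = [10.0, 10.1, 10.0, 9.0, 8.0, 9.5, 10.0]
cv = moving_cv(heads, 3)
```

Varying the window length then probes whether variability is concentrated at the event, seasonal, or multi-annual scale.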
NASA Astrophysics Data System (ADS)
Kitov, I.; Bobrov, D.; Rozhkov, M.
2016-12-01
Aftershocks of larger earthquakes represent an important source of information on the distribution and evolution of stresses and deformations in the pre-seismic, co-seismic and post-seismic phases. For the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), the largest aftershock sequences are also a challenge for automatic and interactive processing. The highest rate of events recorded by two or more seismic stations of the International Monitoring System (IMS) from a relatively small aftershock area may reach hundreds per hour (e.g. Sumatra 2004 and Tohoku 2011). Moreover, there are thousands of reflected/refracted phases per hour with azimuth and slowness within the uncertainty limits of the first P-waves. Misassociation of these later phases, both regular and site-specific, as the first P-wave results in the creation of numerous wrong event hypotheses in the automatic IDC pipeline. In turn, interactive review of such wrong hypotheses is a direct waste of analysts' resources. Waveform cross correlation (WCC) is a powerful tool to separate coda phases from actual P-wave arrivals and to fully exploit the repeating character of waveforms generated by events close in space. Array seismic stations of the IMS enhance the performance of WCC in two important aspects: they reduce the detection threshold and effectively suppress arrivals from all sources except master events. An IDC-specific aftershock tool has been developed and merged with the standard IDC pipeline. The tool includes several procedures: creation of master events consisting of waveform templates at ten or more IMS stations; cross correlation (CC) of real-time waveforms with these templates; association of arrivals detected on CC traces into event hypotheses; building events matching IDC quality criteria; and resolution of conflicts between event hypotheses created by neighboring master events. The final cross-correlation standard event list (XSEL) is the starting point of interactive analysis.
Since global monitoring of underground nuclear tests is based on historical and synthetic data, each aftershock sequence can also be tested against an evasion scenario in which a large earthquake is used to mask a CTBT violation.
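The waveform cross correlation underlying the method reduces to computing a normalized correlation coefficient between a master-event template and sliding segments of a continuous trace. A minimal pure-Python sketch with toy waveforms (illustrative only; real IDC processing operates on multi-channel array data):

```python
def norm_xcorr(template, trace):
    """Maximum normalized cross-correlation of a master-event template
    against a continuous trace; values near 1 flag repeating waveforms."""
    n = len(template)
    tm = sum(template) / n
    t0 = [x - tm for x in template]
    tnorm = sum(x * x for x in t0) ** 0.5
    best = -1.0
    for lag in range(len(trace) - n + 1):
        seg = trace[lag:lag + n]
        sm = sum(seg) / n
        s0 = [x - sm for x in seg]
        snorm = sum(x * x for x in s0) ** 0.5
        if snorm == 0:
            continue  # flat segment: correlation undefined, skip
        cc = sum(a * b for a, b in zip(t0, s0)) / (tnorm * snorm)
        best = max(best, cc)
    return best

# Toy data: the template repeats exactly at lag 2 within the trace.
template = [0.0, 1.0, -1.0, 0.5]
trace = [0.0, 0.0, 0.0, 1.0, -1.0, 0.5, 0.0]
```

Detections whose correlation exceeds a threshold at several stations are then associated into candidate event hypotheses.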
NASA Astrophysics Data System (ADS)
Batmunkh, Munkhbaatar; Bugay, Alexander; Bayarchimeg, Lkhagvaa; Lkhagva, Oidov
2018-02-01
The present study is focused on the development of optimal models of neuron morphology for Monte Carlo microdosimetry simulations of initial radiation-induced events of heavy charged particles in specific cell types of the hippocampus, the most radiation-sensitive structure of the central nervous system. The neuron geometry and particle track structures were simulated with the Geant4/Geant4-DNA Monte Carlo toolkits. The calculations were made for beams of protons and heavy ions with different energies and doses corresponding to real fluxes of galactic cosmic rays. A simple compartmental model and a complex model with realistic morphology extracted from experimental data were constructed and compared. We estimated the distribution of energy deposition events and the production of reactive chemical species within the developed models of CA3/CA1 pyramidal neurons and DG granule cells of the rat hippocampus under exposure to different particles at the same dose. Similar distributions of energy deposition events and concentrations of some oxidative radical species were obtained in both the simplified and realistic neuron models.
Dorfman, David M; LaPlante, Charlotte D; Pozdnyakova, Olga; Li, Betty
2015-11-01
In our high-sensitivity flow cytometric approach for systemic mastocytosis (SM), we identified mast cell event clustering as a new diagnostic criterion for the disease. To objectively characterize mast cell gated event distributions, we performed cluster analysis using FLOCK, a computational approach to identify cell subsets in multidimensional flow cytometry data in an unbiased, automated fashion. FLOCK identified discrete mast cell populations in most cases of SM (56/75 [75%]) but only a minority of non-SM cases (17/124 [14%]). FLOCK-identified mast cell populations accounted for 2.46% of total cells on average in SM cases and 0.09% of total cells on average in non-SM cases (P < .0001) and were predictive of SM, with a sensitivity of 75%, a specificity of 86%, a positive predictive value of 76%, and a negative predictive value of 85%. FLOCK analysis provides useful diagnostic information for evaluating patients with suspected SM, and may be useful for the analysis of other hematopoietic neoplasms. Copyright© by the American Society for Clinical Pathology.
Haney, Gillian; Cocoros, Noelle; Cranston, Kevin; DeMaria, Alfred
2014-01-01
The Massachusetts Virtual Epidemiologic Network (MAVEN) was deployed in 2006 by the Massachusetts Department of Public Health, Bureau of Infectious Disease to serve as an integrated, Web-based disease surveillance and case management system. MAVEN replaced program-specific, siloed databases, which were inaccessible to local public health and unable to integrate electronic reporting. Disease events are automatically created without human intervention when a case or laboratory report is received and triaged in real time to state and local public health personnel. Events move through workflows for initial notification, case investigation, and case management. Initial development was completed within 12 months and recent state regulations mandate the use of MAVEN by all 351 jurisdictions. More than 300 local boards of health are using MAVEN, there are approximately one million events, and 70 laboratories report electronically. MAVEN has demonstrated responsiveness and flexibility to emerging diseases while also streamlining routine surveillance processes and improving timeliness of notifications and data completeness, although the long-term resource requirements are significant. PMID:24587547
Forecast Based Financing for Managing Weather and Climate Risks to Reduce Potential Disaster Impacts
NASA Astrophysics Data System (ADS)
Arrighi, J.
2017-12-01
There is a critical window of time to reduce potential impacts of a disaster after a forecast for heightened risk is issued and before an extreme event occurs. The concept of Forecast-based Financing focuses on this window of opportunity. Through advanced preparation during system set-up, tailored methodologies are used to 1) analyze a range of potential extreme event forecasts, 2) identify emergency preparedness measures that can be taken when factoring in forecast lead time and inherent uncertainty, and 3) develop standard operating procedures that are agreed on and tied to guaranteed funding sources to facilitate emergency measures led by the Red Cross or government actors when preparedness measures are triggered. This presentation will focus on a broad overview of the current state of theory and approaches used in developing forecast-based financing systems, with a specific focus on hydrologic events; case studies of successes and challenges in the various contexts where this approach is being piloted; and what remains to be explored and developed from a research perspective as the application of this approach continues to expand.
Dynamics of Sensorimotor Oscillations in a Motor Task
NASA Astrophysics Data System (ADS)
Pfurtscheller, Gert; Neuper, Christa
Many BCI systems rely on imagined movement. The brain activity associated with real or imagined movement produces reliable changes in the EEG. Therefore, many people can use BCI systems by imagining movements to convey information. The EEG has many regular rhythms. The most famous are the occipital alpha rhythm and the central mu and beta rhythms. People can desynchronize the alpha rhythm (that is, produce weaker alpha activity) by being alert, and can increase alpha activity by closing their eyes and relaxing. Sensory processing or motor behavior leads to EEG desynchronization or blocking of central beta and mu rhythms, as originally reported by Berger [1], Jasper and Andrew [2] and Jasper and Penfield [3]. This desynchronization reflects a decrease of oscillatory activity related to an internally or externally paced event and is known as Event-Related Desynchronization (ERD, [4]). The opposite, namely the increase of rhythmic activity, was termed Event-Related Synchronization (ERS, [5]). ERD and ERS are characterized by fairly localized topography and frequency specificity [6]. Both phenomena can be studied through topographic maps, time courses, and time-frequency representations (ERD maps, [7]).
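ERD and ERS are conventionally quantified as the percentage change in band power during an event window relative to a resting reference window. A one-line sketch of this convention (the band-power values below are invented for illustration):

```python
def erd_percent(reference_power, event_power):
    """Classical ERD/ERS quantification: percentage band-power change in
    an event window relative to a resting reference window.  Negative
    values indicate desynchronization (ERD), positive values
    synchronization (ERS)."""
    return 100.0 * (event_power - reference_power) / reference_power

# Hypothetical mu-band power dropping from 8.0 to 5.0 during imagined
# movement yields an ERD of -37.5%.
erd = erd_percent(8.0, 5.0)
```

Computing this value per electrode and per frequency band yields the topographic ERD maps mentioned above.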
Matoza, Robin S.; Shearer, Peter M.; Okubo, Paul G.
2016-01-01
Long-period (0.5–5 Hz, LP) seismicity has been recorded for decades in the summit region of Kīlauea Volcano, Hawai‘i, and is postulated to be linked to the magma transport and shallow hydrothermal systems. To better characterize its spatiotemporal occurrence, we perform a systematic analysis of 49,030 seismic events occurring in the Kīlauea summit region from January 1986 to March 2009 recorded by the ∼50-station Hawaiian Volcano Observatory permanent network. We estimate 215,437 P wave spectra, considering all events on all stations, and use a station-averaged spectral metric to consistently classify LP and non-LP seismicity. We compute high-precision relative relocations for 5327 LP events (43% of all classified LP events) using waveform cross correlation and cluster analysis with 6.4 million event pairs, combined with the source-specific station term method. The majority of intermediate-depth (5–15 km) LPs collapse to a compact volume, with remarkable source location stability over 23 years, indicating a source process controlled by geological or conduit structure.
Dynamics of pollutant discharge in combined sewer systems during rain events: chance or determinism?
Hannouche, A; Chebbo, G; Joannis, C
2014-01-01
A large database of continuous flow and turbidity measurements, comprising data on hundreds of rain events and dry weather days from two sites in Paris (Quais and Clichy) and one in Lyon (Ecully), is presented. This database is used to characterize and compare the behaviour of the three sites at the inter-event scale. The analysis considers three variables: total volumes, total suspended solids (TSS) masses, and concentrations during both wet and dry weather periods, in addition to the contributions of diverse-origin sources to event flow volume and TSS load values. The results obtained confirm previous findings regarding the spatial consistency of TSS fluxes and concentrations between the two Paris sites, which have similar land uses. Moreover, masses and concentrations prove to be correlated between the Parisian sites in a way that implies that some deterministic processes may be reproducible from one catchment to another for a particular rain event. The results also demonstrate the important contribution of wastewater and sewer deposits to total event loads, and show that such contributions are not specific to the Paris sewer networks.
Dreyer, Felix R; Pulvermüller, Friedemann
2018-03-01
Previous research showed that modality-preferential sensorimotor areas are relevant for processing concrete words used to speak about actions. However, whether modality-preferential areas also play a role for abstract words is still under debate. Whereas recent functional magnetic resonance imaging (fMRI) studies suggest an involvement of motor cortex in processing the meaning of abstract emotion words as, for example, 'love', other non-emotional abstract words, in particular 'mental words', such as 'thought' or 'logic', are believed to engage 'amodal' semantic systems only. In the present event-related fMRI experiment, subjects passively read abstract emotional and mental nouns along with concrete action related words. Contrary to expectation, the results indicate a specific involvement of face motor areas in the processing of mental nouns, resembling that seen for face related action words. This result was confirmed when subject-specific regions of interest (ROIs) defined by motor localizers were used. We conclude that a role of motor systems in semantic processing is not restricted to concrete words but extends to at least some abstract mental symbols previously thought to be entirely 'disembodied' and divorced from semantically related sensorimotor processing. Implications for neurocognitive theories of semantics and clinical applications will be highlighted, paying specific attention to the role of brain activations as indexes of cognitive processes and their relationships to 'causal' studies addressing lesion and transcranial magnetic stimulation (TMS) effects. Possible implications for clinical practice, in particular speech language therapy, are discussed in closing. Copyright © 2017. Published by Elsevier Ltd.
Exploring the evolution of node neighborhoods in Dynamic Networks
NASA Astrophysics Data System (ADS)
Orman, Günce Keziban; Labatut, Vincent; Naskali, Ahmet Teoman
2017-09-01
Dynamic Networks are a popular way of modeling and studying the behavior of evolving systems. However, their analysis constitutes a relatively recent subfield of Network Science, and the number of available tools is consequently much smaller than for static networks. In this work, we propose a method specifically designed to take advantage of the longitudinal nature of dynamic networks. It characterizes each individual node by studying the evolution of its direct neighborhood, based on the assumption that the way this neighborhood changes reflects the role and position of the node in the whole network. For this purpose, we define the concept of neighborhood event, which corresponds to the various transformations such groups of nodes can undergo, and describe an algorithm for detecting such events. We demonstrate the interest of our method on three real-world networks: DBLP, LastFM and Enron. We apply frequent pattern mining to extract meaningful information from temporal sequences of neighborhood events. This results in the identification of behavioral trends emerging in the whole network, as well as the individual characterization of specific nodes. We also perform a cluster analysis, which reveals that, in all three networks, one can distinguish two types of nodes exhibiting different behaviors: a very small group of active nodes, whose neighborhoods undergo diverse and frequent events, and a very large group of stable nodes.
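The notion of a neighborhood event can be illustrated by comparing a node's neighbor set across two consecutive snapshots. The event labels below are invented for illustration and do not necessarily match the taxonomy defined in the paper:

```python
def neighborhood_event(prev, curr):
    """Classify how a node's direct neighborhood (a set of neighbor ids)
    changed between two snapshots of a dynamic network."""
    joined, left = curr - prev, prev - curr
    if not prev and curr:
        return "birth"        # node acquires its first neighbors
    if prev and not curr:
        return "death"        # node loses all neighbors
    if joined and left:
        return "turnover"     # simultaneous arrivals and departures
    if joined:
        return "growth"
    if left:
        return "contraction"
    return "stable"

event = neighborhood_event({"a", "b"}, {"a", "b", "c"})
```

A temporal sequence of such labels per node is exactly the kind of input on which frequent pattern mining can then be applied.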
Threat and error management for anesthesiologists: a predictive risk taxonomy
Ruskin, Keith J.; Stiegler, Marjorie P.; Park, Kellie; Guffey, Patrick; Kurup, Viji; Chidester, Thomas
2015-01-01
Purpose of review Patient care in the operating room is a dynamic interaction that requires cooperation among team members and reliance upon sophisticated technology. Most human factors research in medicine has focused on analyzing errors and implementing system-wide changes to prevent them from recurring. We describe a set of techniques that has been used successfully by the aviation industry to analyze errors and adverse events and explain how these techniques can be applied to patient care. Recent findings Threat and error management (TEM) describes adverse events in terms of risks or challenges that are present in an operational environment (threats) and the actions of specific personnel that potentiate or exacerbate those threats (errors). TEM is a technique widely used in aviation, and can be adapted for use in a medical setting to predict high-risk situations and prevent errors in the perioperative period. A threat taxonomy is a novel way of classifying and predicting the hazards that can occur in the operating room. TEM can be used to identify error-producing situations, analyze adverse events, and design training scenarios. Summary TEM offers a multifaceted strategy for identifying hazards, reducing errors, and training physicians. A threat taxonomy may improve analysis of critical events with subsequent development of specific interventions, and may also serve as a framework for training programs in risk mitigation. PMID:24113268
Temperament Constructs Related to Betrayal of Trust
1991-12-01
life events, in making decisions about the suitability of individuals for such positions. This system has established issues on which individuals seeking...socialization, hyperactivity, impulsiveness, and other dimensions. The multitude of personality traits by which offenders and nonoffenders have been found...Organization, and the Offender - are used to make the comparison. Within each of these groupings are specific criteria by which congruences and disparities
Direct Energy Conversion for Low Specific Mass In-Space Power and Propulsion
NASA Technical Reports Server (NTRS)
Scott, John H.; George, Jeffrey A.; Tarditi, Alfonso G.
2013-01-01
"Changing the game" in space exploration involves changing the paradigm for the human exploration of the Solar System, e.g, changing the human exploration of Mars from a three-year epic event to an annual expedition. For the purposes of this assessment an "annual expedition" capability is defined as an in-space power & propulsion system which, with launch mass limits as defined in NASA s Mars Architecture 5.0, enables sending a crew to Mars and returning them after a 30-day surface stay within one year, irrespective of planetary alignment. In this work the authors intend to show that obtaining this capability requires the development of an in-space power & propulsion system with an end-to-end specific mass considerably less than 3 kg/kWe. A first order energy balance analysis reveals that the technologies required to create a system with this specific mass include direct energy conversion and nuclear sources that release energy in the form of charged particle beams. This paper lays out this first order approximation and details these conclusions.
NASA Astrophysics Data System (ADS)
Petrinec, S. M.; Chenette, D. L.; Imhof, W. L.; Baker, D. N.; Barth, C. A.; Mankoff, K. D.; Luhmann, J. G.; Mason, G. M.; Mazur, J. E.; Evans, D. S.
2001-12-01
A detailed analysis of the particle precipitation into the auroral regions during specific storm intervals is performed. The global energetic particle input to the ionosphere and lower thermosphere is provided by several monitors: the Polar Ionospheric X-ray Experiment (PIXIE) on board the NASA/GGS Polar satellite (for inferred electron energies greater than about 3 keV); the TED sensor system on board the NOAA/Polar Orbiting Environmental Satellite (POES) (particle energies between about 50 eV and 20 keV); and the sensor system (LICA) on board the Solar, Anomalous, and Magnetospheric Particle Explorer (SAMPEX) spacecraft (for electron energies greater than 25 keV). Changes in nitric oxide (NO) densities at altitudes between 97 and 150 km during these storm intervals are studied using observations from the Student Nitric Oxide Explorer (SNOE). Solar wind observations are also used to provide important information regarding the external drivers for the magnetospheric input to the upper atmosphere. Specific intervals of examination include the recent large geomagnetic event of March 31-April 1, 2001, and other events from the most recent solar maximum.
Recommended Practice: Creating Cyber Forensics Plans for Control Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eric Cornelius; Mark Fabro
Cyber forensics has been in the popular mainstream for some time, and has matured into an information-technology capability that is very common among modern information security programs. The goal of cyber forensics is to support the elements of troubleshooting, monitoring, recovery, and the protection of sensitive data. Moreover, in the event of a crime being committed, cyber forensics is also the approach to collecting, analyzing, and archiving data as evidence in a court of law. Although scalable to many information technology domains, especially modern corporate architectures, cyber forensics can be challenging when being applied to non-traditional environments, which are not comprised of current information technologies or are designed with technologies that do not provide adequate data storage or audit capabilities. In addition, further complexity is introduced if the environments are designed using proprietary solutions and protocols, thus limiting the ease with which modern forensic methods can be utilized. The legacy nature and somewhat diverse or disparate component aspects of control systems environments can often prohibit the smooth translation of modern forensics analysis into the control systems domain. Compounded by a wide variety of proprietary technologies and protocols, as well as critical system technologies with no capability to store significant amounts of event information, the task of creating a ubiquitous and unified strategy for technical cyber forensics on a control systems device or computing resource is far from trivial. To date, no direction regarding cyber forensics as it relates to control systems has been produced other than what might be privately available from commercial vendors. Current materials have been designed to support event recreation (event-based), and although important, these requirements do not always satisfy the needs associated with incident response or forensics that are driven by cyber incidents.
To address these issues and to accommodate for the diversity in both system and architecture types, a framework based in recommended practices to address forensics in the control systems domain is required. This framework must be fully flexible to allow for deployment into any control systems environment regardless of technologies used. Moreover, the framework and practices must provide for direction on the integration of modern network security technologies with traditionally closed systems, the result being a true defense-in-depth strategy for control systems architectures. This document takes the traditional concepts of cyber forensics and forensics engineering and provides direction regarding augmentation for control systems operational environments. The goal is to provide guidance to the reader with specifics relating to the complexity of cyber forensics for control systems, guidance to allow organizations to create a self-sustaining cyber forensics program, and guidance to support the maintenance and evolution of such programs. As the current control systems cyber security community of interest is without any specific direction on how to proceed with forensics in control systems environments, this information product is intended to be a first step.
A System for Traffic Violation Detection
Aliane, Nourdine; Fernandez, Javier; Mata, Mario; Bemposta, Sergio
2014-01-01
This paper describes the framework and components of an experimental platform for an advanced driver assistance system (ADAS) aimed at providing drivers with feedback about traffic violations they have committed while driving. The system can detect some specific traffic violations, record data associated with these faults in a local database, and allow visualization of the spatial and temporal information of these traffic violations on a geographical map using the standard Google Earth tool. The test-bed is mainly composed of two parts: a computer vision subsystem for traffic sign detection and recognition, which operates during both day and nighttime, and an event data recorder (EDR) for recording data related to some specific traffic violations. The paper first describes the hardware architecture and then presents the policies used for handling traffic violations. PMID:25421737
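A minimal sketch of the event-data-recorder component described above, assuming a simple in-memory ring buffer and illustrative violation labels; the actual system stores records in a local database and exports them for Google Earth, so the KML fragment below is only indicative.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Violation:
    timestamp: float
    kind: str   # e.g. "speeding", "stop-sign" (illustrative labels)
    lat: float
    lon: float

class EventDataRecorder:
    """Minimal EDR sketch: keeps the most recent `capacity` violations
    and can export them for map visualization (assumption: the real
    system's schema and storage differ)."""
    def __init__(self, capacity=100):
        self.events = deque(maxlen=capacity)

    def record(self, v: Violation):
        self.events.append(v)

    def export_kml_placemarks(self):
        """Rough KML placemark fragments, since the paper mentions
        Google Earth visualization."""
        return [f'<Placemark><name>{v.kind}</name>'
                f'<Point><coordinates>{v.lon},{v.lat}</coordinates></Point>'
                f'</Placemark>' for v in self.events]
```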
A Photoisomerizing Rhodopsin Mimic Observed at Atomic Resolution.
Nosrati, Meisam; Berbasova, Tetyana; Vasileiou, Chrysoula; Borhan, Babak; Geiger, James H
2016-07-20
The members of the rhodopsin family of proteins are involved in many essential light-dependent processes in biology. Specific photoisomerization of the protein-bound retinylidene protonated Schiff base (PSB) at a specified wavelength range of light is at the heart of all of these systems. Nonetheless, it has been difficult to reproduce in an engineered system. We have developed rhodopsin mimics, using intracellular lipid binding protein family members as scaffolds, to study fundamental aspects of protein/chromophore interactions. Herein we describe a system that specifically isomerizes the retinylidene PSB both thermally and photochemically. This isomerization has been characterized at atomic resolution by quantitatively interconverting the isomers in the crystal both thermally and photochemically. This event is accompanied by a large pKa change of the imine, similar to the pKa changes observed in bacteriorhodopsin and visual opsins during isomerization.
Wang, Licheng; Wang, Zidong; Han, Qing-Long; Wei, Guoliang
2018-03-01
This paper is concerned with the distributed filtering problem for a class of discrete time-varying stochastic parameter systems with error variance constraints over a sensor network where the sensor outputs are subject to successive missing measurements. The phenomenon of the successive missing measurements for each sensor is modeled via a sequence of mutually independent random variables obeying the Bernoulli binary distribution law. To reduce the frequency of unnecessary data transmission and alleviate the communication burden, an event-triggered mechanism is introduced for the sensor node such that only some vitally important data is transmitted to its neighboring sensors when specific events occur. The objective of the problem addressed is to design a time-varying filter such that both the filtering requirements and the error variance constraints are guaranteed over a given finite horizon against the random parameter matrices, successive missing measurements, and stochastic noises. By recurring to stochastic analysis techniques, sufficient conditions are established to ensure the existence of the time-varying filters, whose gain matrices are then explicitly characterized in terms of the solutions to a series of recursive matrix inequalities. A numerical simulation example is provided to illustrate the effectiveness of the developed event-triggered distributed filter design strategy.
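The event-triggered transmission idea can be illustrated with a scalar send-on-delta rule: a node transmits only when its measurement deviates sufficiently from the last transmitted value. The paper's actual triggering condition is not reproduced here (it is typically a quadratic form on vector measurements), so this threshold test is only a sketch of the mechanism.

```python
class EventTrigger:
    """Send-on-delta event-triggered transmission sketch (assumption:
    scalar measurements and a fixed threshold `delta`; the paper's
    condition for vector outputs is more elaborate)."""
    def __init__(self, delta):
        self.delta = delta
        self.last_sent = None

    def step(self, y):
        """Return y if a transmission is triggered, else None."""
        if self.last_sent is None or abs(y - self.last_sent) > self.delta:
            self.last_sent = y   # the "event": deviation exceeded delta
            return y
        return None              # suppressed: saves communication
```

Only the values that cross the threshold reach the neighboring sensors, which is how the scheme reduces communication burden.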
[Specification of cell destiny in early Caenorhabditis elegans embryo].
Schierenberg, E
1997-02-01
Embryogenesis of the nematode Caenorhabditis elegans has been described completely on a cell-by-cell basis and found to be essentially invariant. With this knowledge in hand, micromanipulated embryos and mutants have been analyzed for cell lineage defects and the distribution of specific gene products. The results challenge the classical view of cell-autonomous development in nematodes and indicate that the early embryo of C. elegans is a highly dynamic system. A network of inductive events between neighboring cells is being revealed, which is necessary to assign different developmental programs to blastomeres. In those cases where molecules involved in these cell-cell interactions have been identified, homologies to cell surface receptors, ligands and transcription factors found in other systems have become obvious.
Variability in bacterial community structure during upwelling in the coastal ocean
Kerkhof, L.J.; Voytek, M.A.; Sherrell, Robert M.; Millie, D.; Schofield, O.
1999-01-01
Over the last 30 years, investigations at the community level of marine bacteria and phytoplankton populations suggest they are tightly coupled. However, traditional oceanographic approaches cannot assess whether associations between specific bacteria and phytoplankton exist. Recently, molecular-based approaches have been implemented to characterize specific members of different marine bacterial communities. Yet, few molecular-based studies have examined coastal upwelling situations. This is important since upwelling systems provide a unique opportunity for analyzing the association between specific bacteria and specific phytoplankton in the ocean. It is widely believed that upwelling can lead to changes in phytoplankton populations (blooms). Thus, if specific associations exist, we would expect to observe changes in the bacterial population triggered by the bloom. In this paper, we present preliminary data from coastal waters off New Jersey that confirm a shift in bacterial communities during a 1995 upwelling event recorded at a long-term earth observatory (LEO-15) in the Mid-Atlantic Bight. Using PCR amplification and cloning, specific bacterial 16S ribosomal RNA sequences were found which were present in upwelling samples during a phytoplankton bloom, but were not detected in non-bloom samples (surface seawater, offshore sites or sediment samples) collected at the same time or in the same area. These findings are consistent with the notion of specific associations between bacteria and phytoplankton in the ocean. However, further examination of episodic events, such as coastal upwelling, is needed to confirm the existence of specific associations. Additionally, experiments need to be performed to elucidate the mechanisms leading to the specific linkages between a group of bacteria and a group of phytoplankton.
Solute Response To Arid-Climate Managed-River Flow During Storm Events
NASA Astrophysics Data System (ADS)
McLean, B.; Shock, E.
2006-12-01
Storm pulses are widely used in unmanaged, temperate and subtropical river systems to resolve in-stream surface and subsurface flow components. Resulting catchment-scale hydrochemical mixing models yield insight into mechanisms of solute transport. Managed systems are far more complicated due to the human need for high quality water resources, which drives processes that are superimposed on most, if not all, of the unmanaged components. As an example, an increasingly large portion of the water supply for the Phoenix metropolitan area is derived from multiple surface water sources that are impounded, diverted and otherwise managed upstream from the urban core that consumes the water and produces anthropogenic impacts. During large storm events this managed system is perturbed towards natural behavior as it receives inputs from natural hydrologic pathways in addition to impervious surfaces and storm water drainage channels. Our goals in studying managed river systems during this critical transition state are to determine how the well-characterized behavior of natural systems breaks down as the system responds and then returns to its managed state. Using storm events as perturbations we can contrast an arid managed system with the unmanaged system it approaches during the storm event. In the process, we can extract geochemical consequences specifically related to unknown urban components in the form of chemical fingerprints. The effects of river management on solute behavior were assessed by taking advantage of several anomalously heavy winter storm events in late 2004 and early 2005 using a rigorous sampling routine. Several hundred samples collected between January and October 2005 were analyzed for major ion, isotopic, and trace metal concentrations, with 78 individual measurements for each sample. The data are used to resolve managed watershed processes, mechanisms of solute transport and river mixing from anthropogenic inputs.
Our results show that concentrations of major solutes change slowly and are independent of discharge downstream from the dams on two major tributaries. This is indicative of reservoir release water. In addition, a third input is derived from the Colorado River via the Central Arizona Project canal system. Cross plots including concentrations of solutes such as nitrate and sulfate from downstream of the confluence indicate at least three end-member sources, as do Piper diagrams using major anion and cation data. Dynamic contributions from natural event water and urban inputs can be resolved from the slowly changing release water, and may dictate the short-term transport of pollutants during the storm-induced transition state.
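The end-member resolution described above can be sketched as a small linear inversion: given tracer concentrations for three candidate sources, solve for mixing fractions that sum to one. The tracer values below are made up for illustration; they are not measurements from the study.

```python
import numpy as np

def mixing_fractions(sample, end_members):
    """Least-squares sketch of end-member mixing analysis: solve
    sample = sum_i f_i * end_member_i subject to sum(f) = 1.
    `end_members` is a list of (tracer1, tracer2) tuples; `sample`
    likewise. Tracers could be, e.g., nitrate and sulfate."""
    # Rows: one equation per tracer, plus the mass-balance row of ones.
    A = np.vstack([np.array(end_members).T, np.ones(len(end_members))])
    b = np.append(np.array(sample), 1.0)
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    return f
```

With more tracers than end members the same call gives a least-squares fit, which is one common way such cross-plot arguments are quantified.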
Schwartz, Rafi; Lahav, Ori; Ostfeld, Avi
2014-10-15
As a complementary step towards solving the general event detection problem of water distribution systems, injection of the organophosphate pesticides chlorpyrifos (CP) and parathion (PA) was simulated at various locations within example networks and hydraulic parameters were calculated over a 24-h duration. The uniqueness of this study is that the chemical reactions and byproducts of the contaminants' oxidation were also simulated, as well as other indicative water quality parameters such as alkalinity, acidity, pH and the total concentration of free chlorine species. The information on the change in water quality parameters induced by the contaminant injection may facilitate on-line detection of an actual event involving this specific substance and pave the way for the development of a generic methodology for detecting events involving introduction of pesticides into water distribution systems. Simulation of the contaminant injection was performed at several nodes within two different networks. For each injection, concentrations of the relevant contaminants' mother and daughter species, free chlorine species and water quality parameters were simulated at nodes downstream of the injection location. The results indicate that injection of these substances can be detected at certain conditions by a very rapid drop in Cl2, functioning as the indicative parameter, as well as a drop in alkalinity concentration and a small decrease in pH, both functioning as supporting parameters, whose usage may reduce false positive alarms. Copyright © 2014 Elsevier Ltd. All rights reserved.
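A toy version of the detection logic suggested by these results, with Cl2 as the indicative parameter and alkalinity and pH as supporting parameters used to reduce false alarms. All threshold values are illustrative assumptions, not calibrated numbers from the study.

```python
def detect_event(prev, curr,
                 cl2_drop=0.3, alk_drop=5.0, ph_drop=0.05):
    """Flag a possible pesticide-injection event from two consecutive
    water quality readings (dicts with 'cl2', 'alkalinity', 'ph').
    Thresholds are illustrative placeholders."""
    # Indicative parameter: rapid drop in free chlorine.
    primary = (prev["cl2"] - curr["cl2"]) > cl2_drop
    # Supporting parameters: alkalinity drop and a small pH decrease.
    support = ((prev["alkalinity"] - curr["alkalinity"]) > alk_drop
               and (prev["ph"] - curr["ph"]) > ph_drop)
    return primary and support
```

Requiring the supporting parameters alongside the Cl2 drop is what the abstract suggests may suppress false positives.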
Microearthquake sequences along the Irpinia normal fault system in Southern Apennines, Italy
NASA Astrophysics Data System (ADS)
Orefice, Antonella; Festa, Gaetano; Alfredo Stabile, Tony; Vassallo, Maurizio; Zollo, Aldo
2013-04-01
Microearthquakes reflect a continuous readjustment of tectonic structures, such as faults, under the action of local and regional stress fields. Low magnitude seismicity in the vicinity of active fault zones may reveal insights into the mechanics of the fault systems during the inter-seismic period and shed light on the role of fluids and other physical parameters in promoting or disfavoring the nucleation of larger size events in the same area. Here we analyzed several earthquake sequences concentrated in very limited regions along the 1980 Irpinia earthquake fault zone (Southern Italy), a complex system characterized by a normal stress regime, monitored by the dense, multi-component, high dynamic range seismic network ISNet (Irpinia Seismic Network). For one specific sequence, the May 2008 Laviano swarm, we performed accurate absolute and relative locations and estimated source parameters and scaling laws that were compared with standard stress-drops computed for the area. Additionally, from EGF deconvolution, we computed a slip model for the mainshock and investigated the space-time evolution of the events in the sequence to reveal possible interactions among earthquakes. Through massive cross-correlation analysis based on master-event scanning of the continuous recordings, we also reconstructed the catalog of repeated earthquakes and recognized several co-located sequences. For these events, we analyzed the statistical properties, location and source parameters and their space-time evolution with the aim of inferring the processes that control the occurrence and the size of microearthquakes in a swarm.
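The master-event scanning step can be sketched as a sliding normalized cross-correlation of a template against a continuous trace. Real pipelines band-pass filter and window the data first, and the detection threshold used for the Irpinia catalog is not reproduced here; this is only the core operation.

```python
import numpy as np

def max_norm_xcorr(master, trace):
    """Maximum normalized cross-correlation of a master-event template
    slid over a continuous trace. Values near 1 indicate a co-located
    (repeated) event with a nearly identical waveform."""
    m = master - master.mean()
    m /= np.linalg.norm(m)
    best = 0.0
    n = len(master)
    for i in range(len(trace) - n + 1):
        w = trace[i:i + n] - trace[i:i + n].mean()
        norm = np.linalg.norm(w)
        if norm > 0:
            best = max(best, float(np.dot(m, w / norm)))
    return best
```

Because the correlation is normalized, a scaled copy of the master (a smaller repeat of the same rupture) still scores near 1.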
NASA Astrophysics Data System (ADS)
Kassab, Ala'; Liang, Steve; Gao, Yang
2010-12-01
Emergency agencies seek to maintain situational awareness and effective decision making through continuous monitoring of, and real-time alerting about, sources of information regarding current incidents and developing fire hazards. The nature of this goal requires integrating different, potentially numerous, sources of dynamic geospatial information on one side, and a large number of clients having heterogeneous and specific interests in data on the other. In such scenarios, the traditional request/reply communication style may function inefficiently, as it is based on point-to-point, synchronous, and pulling mode interaction between consumer clients and information providers/services. In this work, we propose Geospatial-based Publish/Subscribe, an interaction framework that serves as a middleware for real-time transacting of spatially related information of interest, termed geospatial events, in distributed systems. Expressive data models, including geospatial event and geospatial subscription, as well as an efficient matching approach for fast dissemination of geospatial events to interested clients, are introduced. The proposed interaction framework is realized through the development of a Real-Time Fire Emergency Response System (RFERS) prototype. The prototype is designed for transacting several topics of geospatial events that are crucial within the context of fire emergencies, including GPS locations of emergency assets, meteorological observations of wireless sensors, fire incidents reports, and temporal sequences of remote sensing images of active wildfires. The performance of the system prototype has been evaluated in order to demonstrate its efficiency.
Estimating rare events in biochemical systems using conditional sampling.
Sundar, V S
2017-01-28
The paper focuses on the development of variance reduction strategies to estimate rare event probabilities in biochemical systems. Obtaining such probabilities using brute-force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, importance sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most of the problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
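A one-dimensional toy of the subset simulation method described above, assuming a standard normal input and a scalar limit-state function g with failure defined as g(x) > 0. It illustrates the product-of-conditional-probabilities idea and the modified Metropolis step, not the paper's coupling with Gillespie's stochastic simulation algorithm.

```python
import math
import random

def subset_simulation(g, n=1000, p0=0.1, max_levels=10, seed=1):
    """Subset simulation sketch (1-D toy): P(g(x) > 0) for x ~ N(0,1)
    is written as a product of more frequent conditional probabilities,
    each level estimated by Metropolis chains restricted to the current
    subset. Parameters n, p0 and the proposal are illustrative choices."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
    prob = 1.0
    for _ in range(max_levels):
        xs.sort(key=g, reverse=True)
        n_seed = int(p0 * n)
        level = g(xs[n_seed - 1])        # intermediate threshold
        if level >= 0.0:                 # failure level reached
            return prob * sum(1 for x in xs if g(x) > 0.0) / n
        prob *= p0                       # one conditional factor per level
        new_xs = []
        for seed_x in xs[:n_seed]:       # grow one Markov chain per seed
            x = seed_x
            for _ in range(n // n_seed):
                cand = x + rng.gauss(0.0, 1.0)
                accept = math.exp(-0.5 * (cand * cand - x * x)) > rng.random()
                if accept and g(cand) >= level:
                    x = cand             # otherwise stay inside the subset
                new_xs.append(x)
        xs = new_xs
    return prob
```

For g(x) = x - 3 the target is the normal tail P(X > 3) ≈ 1.35e-3, which brute-force sampling with n = 1000 would usually miss entirely.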
Will, Thorsten; Helms, Volkhard
2017-04-04
Differential analysis of cellular conditions is a key approach towards understanding the consequences and driving causes behind biological processes such as developmental transitions or diseases. Advances in whole-genome expression profiling have made it convenient to capture the state of a cell's transcriptome and to detect the characteristic features that distinguish cells in specific conditions. In contrast, mapping the physical protein interactome for many samples is experimentally infeasible at the moment. For an understanding of the whole system, however, it is equally important to know how the interactions of proteins are rewired between cellular states. To overcome this deficiency, we recently showed how condition-specific protein interaction networks that even consider alternative splicing can be inferred from transcript expression data. Here, we present the differential network analysis tool PPICompare, which was specifically designed for isoform-sensitive protein interaction networks. Besides detecting significant rewiring events between the interactomes of grouped samples, PPICompare infers which alterations to the transcriptome caused each rewiring event and determines the minimal set of alterations necessary to explain all between-group changes. When applied to the development of blood cells, we verified that a reasonable number of rewiring events were reported by the tool and found that differential gene expression was the major determinant of cellular adjustments to the interactome. Alternative splicing events were consistently necessary in each developmental step to explain all significant alterations and were especially important for rewiring in the context of transcriptional control. Applying PPICompare enabled us to investigate the dynamics of the human protein interactome during developmental transitions. A platform-independent implementation of the tool PPICompare is available at https://sourceforge.net/projects/ppicompare/ .
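The differential step can be sketched as set operations on condition-specific edge sets, followed by a greedy set cover to approximate a minimal set of alterations explaining all rewiring events. This mirrors PPICompare's stated goal but is not its actual algorithm, and all gene and event names are hypothetical.

```python
def edge(u, v):
    """Canonical undirected edge representation."""
    return frozenset((u, v))

def rewiring_events(net_a, net_b):
    """Edges gained or lost between two condition-specific interaction
    networks, each given as a set of edges."""
    return net_b - net_a, net_a - net_b

def minimal_explanation(events, alterations):
    """Greedy set-cover sketch for a small set of transcriptome
    alterations explaining all rewiring events. `alterations` maps an
    alteration name to the set of events it can explain."""
    uncovered, chosen = set(events), []
    while uncovered:
        best = max(alterations, key=lambda a: len(alterations[a] & uncovered))
        if not alterations[best] & uncovered:
            break  # remaining events cannot be explained
        chosen.append(best)
        uncovered -= alterations[best]
    return chosen
```

Greedy set cover is the classic approximation for such minimal-explanation problems; an exact minimal set would require an ILP or exhaustive search.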
The OGC Publish/Subscribe specification in the context of sensor-based applications
NASA Astrophysics Data System (ADS)
Bigagli, Lorenzo
2014-05-01
The Open Geospatial Consortium Publish/Subscribe Standards Working Group (in short, OGC PubSub SWG) was chartered in 2010 to specify a mechanism to support publish/subscribe requirements across OGC service interfaces and data types (coverage, feature, etc.). The Publish/Subscribe Interface Standard 1.0 - Core (13-131) defines an abstract description of the basic mandatory functionality, along with several optional, extended capabilities. The Core is independent of the underlying binding, for which two extensions are currently considered in the PubSub SWG scope: a SOAP binding and a RESTful binding. Two primary parties characterize the publish/subscribe model: a Publisher, which is publishing information, and a Subscriber, which expresses an interest in all or part of the published information. In many cases, the Subscriber and the entity to which data is to be delivered (the Receiver) are one and the same. However, they are distinguished in PubSub to allow for these roles to be segregated. This is useful, for example, in event-based systems, where system entities primarily react to incoming information and may emit new information to other interested entities. The Publish/Subscribe model is distinguished from the typical request/response model, where a client makes a request and the server responds with either the requested information or a failure. This provides relatively immediate feedback, but can be insufficient in cases where the client is waiting for a specific event (such as data arrival, server changes, or data updates). In fact, while waiting for an event, a client must repeatedly request the desired information (polling). This has undesirable side effects: if a client polls frequently this can increase server load and network traffic, and if a client polls infrequently it may not receive a message when it is needed.
These issues are accentuated when event occurrences are unpredictable, or when the delay between event occurrence and client notification must be small. Instead, the Publish/Subscribe model is characterized by the ability for a Subscriber to specify an ongoing (persistent) expression of interest in some messages, and by the asynchronous delivery of such messages. Hence, the publish/subscribe model can be useful to reduce the latency between event occurrence and event notification, as it is the Publisher's responsibility to publish a message when the event occurs, rather than relying on clients to anticipate the occurrence. The following cross-service requirements have been identified for PubSub 1.0:
• Provide notification capabilities as a module to existing OGC services with no impact on existing service semantics and by reusing service-specific filtering semantics.
• Usable as a way to push actual data (not only references to data) from a data access service (i.e. WCS, WFS, SOS) to the client.
• Usable as a way to push notification messages (i.e. lightweight, no data but rather references to data) to the client.
• Usable as a way to provide notifications of service and dataset updates in order to simplify/optimize harvesting by catalogs.
The use-cases identified for PubSub 1.0 include:
• Service Filtered Data Push.
• Service Filtered Notification.
• Notification of Threshold Crossings.
• FAA SAA Dissemination Pilot.
• Emergency / Safety Critical Application.
The above suggests that the OGC Publish/Subscribe specification could be successfully applied to sensor-based monitoring. This work elaborates on this technology and its possible applications in this context.
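The contrast with request/reply polling can be made concrete with a minimal broker sketch: a Subscriber registers a persistent filter and a Receiver callback, and messages are pushed on publication rather than pulled. The topic field and predicates below are illustrative, not the OGC filter encoding.

```python
class Broker:
    """Minimal publish/subscribe sketch. Subscribers register a
    persistent filter predicate together with a receiver callback
    (Subscriber and Receiver may thus be distinct parties, as in
    OGC PubSub)."""
    def __init__(self):
        self.subscriptions = []

    def subscribe(self, predicate, receiver):
        """Persistent expression of interest: predicate(message) -> bool."""
        self.subscriptions.append((predicate, receiver))

    def publish(self, message):
        """Push the message to every matching subscription; no polling,
        so notification latency is bounded by delivery, not poll period."""
        for predicate, receiver in self.subscriptions:
            if predicate(message):
                receiver(message)
```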
Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling
Lareo, Angel; Forlim, Caroline G.; Pinto, Reynaldo D.; Varona, Pablo; Rodriguez, Francisco de Borja
2016-01-01
Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals representing them as temporal sequences of binary events. A specific sequence of these events (code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a characteristic distinct response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox. PMID:27766078
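The temporal-code idea can be sketched as binning event times into a binary sequence and scanning it for a chosen code, which is the moment a closed-loop system would deliver its predefined stimulation. Window sizes, codes, and event times below are illustrative, and the real-time toolbox described in the paper is not reproduced.

```python
def binarize(events, window, duration):
    """Turn event times into a binary sequence: 1 if at least one event
    falls in each successive time window of width `window`."""
    n = round(duration / window)
    bits = [0] * n
    for t in events:
        i = int(t / window)
        if 0 <= i < n:
            bits[i] = 1
    return bits

def code_triggers(bits, code):
    """Indices (one past each match) where the chosen code occurs in the
    binary sequence, i.e. the instants at which a code-driven protocol
    would trigger stimulation."""
    k = len(code)
    return [i + k for i in range(len(bits) - k + 1) if bits[i:i + k] == code]
```

Comparing the system's response after these trigger instants against a control recording without stimulation is the characterization step the abstract describes.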
The new Euskalmet coastal-maritime warning system
NASA Astrophysics Data System (ADS)
Gaztelumendi, Santiago; Egaña, Joseba; Liria, Pedro; Gonzalez, Manuel; Aranda, José Antonio; Anitua, Pedro
2016-06-01
This work presents the main characteristics of the Basque Meteorology Agency (Euskalmet) maritime-coastal risk warning system, with special emphasis on the latest updates, including a clear differentiation between specific warning messages addressing sea conditions for navigation purposes in the first 2 nautical miles and expected coastal impacts. Some details of the warning bulletin for maritime and coastal risk situations are also presented, together with other communication products and strategies used in coastal and maritime severe episodes at the Basque coast. Today, three different aspects are included in the coastal-maritime risk warning system in the Basque Country, related to the main potential severe events that affect coastal activities:
- "Galerna" risk relates to a sudden wind reversal that can severely affect coastal navigation and recreational activities.
- "Navigation" risk relates to severe sea state conditions for 0-2 miles, affecting different navigation activities.
- "Coastal impact" risk relates to adverse wave characteristics and tidal surges that induce flooding events and different impacts in littoral areas.
Soldan, Anja; Mangels, Jennifer A; Cooper, Lynn A
2006-03-01
This study was designed to differentiate between structural description and bias accounts of performance in the possible/impossible object-decision test. Two event-related potential (ERP) studies examined how the visual system processes structurally possible and impossible objects. Specifically, the authors investigated the effects of object repetition on a series of early posterior components during structural (Experiment 1) and functional (Experiment 2) encoding and the relationship of these effects to behavioral measures of priming. In both experiments, the authors found repetition enhancement of the posterior N1 and N2 for possible objects only. In addition, the magnitude of the N1 repetition effect for possible objects was correlated with priming for possible objects. Although the behavioral results were more ambiguous, these ERP results fail to support bias models that hold that both possible and impossible objects are processed similarly in the visual system. Instead, they support the view that priming is supported by a structural description system that encodes the global 3-dimensional structure of an object.
Minding Impacting Events in a Model of Stochastic Variance
Duarte Queirós, Sílvio M.; Curado, Evaldo M. F.; Nobre, Fernando D.
2011-01-01
We introduce a generalization of the well-known ARCH process, widely used for generating uncorrelated stochastic time series with long-term non-Gaussian distributions and long-lasting correlations in the (instantaneous) standard deviation exhibiting a clustering profile. Specifically, inspired by the fact that in a variety of systems impacting events are hardly forgotten, we split the process into two different regimes: a first one for regular periods, where the average volatility of the fluctuations within a certain period of time is below a certain threshold, and another one when the local standard deviation exceeds that threshold. In the former situation we use standard rules for heteroscedastic processes, whereas in the latter case the system starts recalling past values that surpassed the threshold. Our results show that for appropriate parameter values the model is able to provide fat-tailed probability density functions and strong persistence of the instantaneous variance, characterized by large values of the Hurst exponent, which are ubiquitous features in complex systems. PMID:21483864
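An illustrative two-regime heteroscedastic process in the spirit of the model described: in regular periods the variance follows a standard ARCH(1) rule, and once the local standard deviation exceeds a threshold the process starts recalling past above-threshold values. The coefficients and the specific memory rule below are assumptions for illustration, not the paper's equations.

```python
import random


def simulate(n, a0=0.2, a1=0.5, threshold=1.0, seed=1):
    """Simulate a threshold-memory variant of an ARCH(1) process."""
    rng = random.Random(seed)
    x, sigma2, memory = [0.0], [a0], []
    for _ in range(n):
        s2 = a0 + a1 * x[-1] ** 2             # standard ARCH(1) update
        if s2 ** 0.5 > threshold:             # impacting event: remember it
            memory.append(s2)
            s2 = sum(memory) / len(memory)    # recall past impacting values
        sigma2.append(s2)
        x.append(rng.gauss(0.0, s2 ** 0.5))   # draw next observation
    return x, sigma2


x, sigma2 = simulate(500)
```

The memory list is what distinguishes the second regime: once triggered, the instantaneous variance depends on the whole history of impacting events rather than only on the last observation.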
Addis, Donna Rose; Wong, Alana T.; Schacter, Daniel L.
2007-01-01
People can consciously re-experience past events and pre-experience possible future events. This fMRI study examined the neural regions mediating the construction and elaboration of past and future events. Participants were cued with a noun for 20 seconds and instructed to construct a past or future event within a specified time period (week, year, 5–20 years). Once participants had the event in mind, they made a button press and for the remainder of the 20 seconds elaborated on the event. Importantly, all events generated were episodic and did not differ on a number of phenomenological qualities (detail, emotionality, personal significance, field/observer perspective). Conjunction analyses indicated the left hippocampus was commonly engaged by past and future event construction, along with posterior visuospatial regions, but considerable neural differentiation was also observed during the construction phase. Future events recruited regions involved in prospective thinking and generation processes, specifically right frontopolar cortex and left ventrolateral prefrontal cortex, respectively. Furthermore, future event construction uniquely engaged the right hippocampus, possibly as a response to the novelty of these events. In contrast to the construction phase, elaboration was characterized by remarkable overlap in regions comprising the autobiographical memory retrieval network, attributable to the common processes engaged during elaboration, including self-referential processing, contextual and episodic imagery. This striking neural overlap is consistent with findings that amnesic patients exhibit deficits in both past and future thinking, and confirms that the episodic system contributes importantly to imagining the future. PMID:17126370
Novak, Avrey; Nyflot, Matthew J; Ermoian, Ralph P; Jordan, Loucille E; Sponseller, Patricia A; Kane, Gabrielle M; Ford, Eric C; Zeng, Jing
2016-05-01
Radiation treatment planning involves a complex workflow that has multiple potential points of vulnerability. This study utilizes an incident reporting system to identify the origination and detection points of near-miss errors, in order to guide their departmental safety improvement efforts. Previous studies have examined where errors arise, but not where they are detected or applied a near-miss risk index (NMRI) to gauge severity. From 3/2012 to 3/2014, 1897 incidents were analyzed from a departmental incident learning system. All incidents were prospectively reviewed weekly by a multidisciplinary team and assigned a NMRI score ranging from 0 to 4 reflecting potential harm to the patient (no potential harm to potential critical harm). Incidents were classified by point of incident origination and detection based on a 103-step workflow. The individual steps were divided among nine broad workflow categories (patient assessment, imaging for radiation therapy (RT) planning, treatment planning, pretreatment plan review, treatment delivery, on-treatment quality management, post-treatment completion, equipment/software quality management, and other). The average NMRI scores of incidents originating or detected within each broad workflow area were calculated. Additionally, out of 103 individual process steps, 35 were classified as safety barriers, the process steps whose primary function is to catch errors. The safety barriers which most frequently detected incidents were identified and analyzed. Finally, the distance between event origination and detection was explored by grouping events by the number of broad workflow area events passed through before detection, and average NMRI scores were compared. Near-miss incidents most commonly originated within treatment planning (33%). 
However, the incidents with the highest average NMRI scores originated during imaging for RT planning (NMRI = 2.0, versus an average of 1.5 across all events), specifically during the documentation of patient positioning and localization of the patient. Incidents were most frequently detected during treatment delivery (30%), and incidents identified at this point also had higher severity scores than those detected in other workflow areas (NMRI = 1.6). Incidents identified during on-treatment quality management were also more severe (NMRI = 1.7), and the specific process steps of reviewing portal and CBCT images tended to catch the highest-severity incidents. On average, safety barriers caught 46% of all incidents, most frequently at physics chart review, the therapist's chart check, and the review of portal images; however, most of the incidents that pass through a particular safety barrier are of a type that the barrier is not designed to capture. Incident learning systems can be used to assess the most common points of error origination and detection in radiation oncology. This can help tailor safety improvement efforts and target the highest-impact portions of the workflow. The most severe near-miss events tend to originate during simulation and tend to be detected at the time of patient treatment. Safety barriers can be improved to allow earlier detection of near-miss events.
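The core of the analysis described is simple aggregation: each near-miss incident carries a workflow category of origin and an NMRI score (0-4), and average NMRI is compared across categories. A minimal sketch with invented incident records:

```python
# Invented incident records; the real study used 1897 incidents across
# nine broad workflow categories drawn from a 103-step workflow.
incidents = [
    {"origin": "treatment planning", "nmri": 1},
    {"origin": "treatment planning", "nmri": 2},
    {"origin": "imaging for RT planning", "nmri": 2},
    {"origin": "treatment delivery", "nmri": 1},
]


def average_nmri_by_category(records):
    """Average near-miss risk index per workflow category of origin."""
    totals = {}
    for r in records:
        t = totals.setdefault(r["origin"], [0, 0])
        t[0] += r["nmri"]   # running sum of scores
        t[1] += 1           # running count of incidents
    return {cat: s / n for cat, (s, n) in totals.items()}


averages = average_nmri_by_category(incidents)
```

The same grouping applied to the detection point, rather than the origin, yields the second half of the comparison reported above.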
Payne, Daniel C; Franzke, Laura H; Stehr-Green, Paul A; Schwartz, Benjamin; McNeil, Michael M
2007-01-01
In 2002, the Centers for Disease Control and Prevention established the Vaccine Analytic Unit (VAU) in collaboration with the Department of Defense (DoD). The focus of this report is to describe the process by which the VAU's anthrax vaccine safety research plan was developed following a comprehensive review of these topics. Public health literature, surveillance data, and clinical sources were reviewed to create a list of adverse events hypothesized to be potentially related to anthrax vaccine adsorbed (AVA). From this list, a consensus process was used to select 11 important research topics. Adverse event background papers were written for each of these topics, addressing predetermined criteria. These were independently reviewed and ranked by a National Vaccine Advisory Committee (NVAC) workgroup. The adverse events included in the final priority list will be the subject of observational or other post marketing surveillance studies using the Defense Medical Surveillance System (DMSS) database. A review of various information sources identified over 100 potential adverse events. The review process recommended 11 topics as potentially warranting further study. The NVAC workgroup identified the following adverse event topics for study: arthritis, optic neuritis, and Stevens-Johnson syndrome/Toxic epidermal necrolysis. Two additional topics (systemic lupus erythematosus (SLE) and multiple, near-concurrent military vaccinations) were added in response to emerging public health and military concerns. The experience described, while specific for establishing the VAU's research agenda for the safety of the current anthrax vaccine, may be useful and adapted for research planning in other areas of public health research. Copyright (c) 2006 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Hesselbo, Stephen; Bjerrum, Christian; Hinnov, Linda; Mac Niocaill, Conall; Miller, Kenneth; Riding, James; van de Schootbrugge, Bas; Wonik, Thomas
2014-05-01
The Early Jurassic Epoch (201.4-175 Ma) was a time of extreme environmental change. Through this period there are well-documented examples of rapid transitions from cold, or even glacial, climates through to super-greenhouse events, the latter characterized worldwide by hugely enhanced organic carbon burial, multiple large-magnitude isotopic anomalies, global sea-level changes, and mass extinctions. These events not only reflect changes in the global climate system but are also thought to have had significant influence on the evolution of Jurassic marine and terrestrial biota. Furthermore, the events may serve as analogues for present-day and future environmental transitions. Although our knowledge of specific global change events within the Early Jurassic is rapidly improving, a prime case in point being the Toarcian Oceanic Anoxic Event (or T-OAE), we have neither documented all the events, nor do we have a comprehensive understanding of their timing, pacing, or triggers. A key factor contributing to our fragmentary knowledge is the scattered and discontinuous nature of the existing datasets. The major goal for this proposed ICDP project is therefore to produce a new global standard for these key 25 million years of Earth history by re-drilling a 45-year-old borehole at Mochras Farm on the edge of Cardigan Bay, Wales, and to develop an integrated stratigraphy for the cored material, as well as high-resolution proxy records of environmental change. The new datasets will be applied to understand fundamental questions about the long- and short-term evolution of the Earth System.
A Smartwatch-Driven Medication Management System Compliant to the German Medication Plan.
Keil, Andreas; Gegier, Konstantin; Pobiruchin, Monika; Wiesner, Martin
2016-01-01
Medication adherence is an important factor for the outcome of medical therapies. To support high adherence levels, smartwatches can be used to assist the patient. However, a successful integration of such devices into clinicians' or general practitioners' information systems requires the use of standards. In this paper, a medication management system supplied with smartwatch generated feedback events is presented. It allows physicians to manage their patients' medications and track their adherence in real time. Moreover, it fosters interoperability via a ISO/IEC 16022 data matrix which encodes related medication data in compliance with the German Medication Plan specification.
Monitoring and evaluating civil structures using measured vibration
NASA Astrophysics Data System (ADS)
Straser, Erik G.; Kiremidjian, Anne S.
1996-04-01
The need for a rapid assessment of the state of critical and conventional civil structures, such as bridges, control centers, airports, and hospitals, among many, has been amply demonstrated during recent natural disasters. Research is underway at Stanford University to develop a state-of-the-art automated damage monitoring system for long-term and extreme-event monitoring based on both ambient and forced response measurements. Such research requires a multi-disciplinary approach harnessing the talents and expertise of civil, electrical, and mechanical engineering to arrive at a novel hardware and software solution. Recent advances in silicon micro-machining and microprocessor design allow for the economical integration of sensing, processing, and communication components. Coupling these technological advances with parameter identification algorithms allows for the realization of extreme-event damage monitoring systems for civil structures. This paper addresses the first steps toward the development of a near real-time damage diagnostic and monitoring system based on structural response to extreme events. Specifically, micro-electro-mechanical structures (MEMS) and microcontroller embedded systems (MES) are demonstrated to be an effective platform for the measurement and analysis of civil structures. Experimental laboratory tests with small-scale model specimens and a preliminary sensor module are used to evaluate hardware and obtain structural response data from input accelerograms. A multi-step analysis procedure employing ordinary least squares (OLS), extended Kalman filtering (EKF), and a substructuring approach is conducted to extract system characteristics of the model. Results from experimental tests and system identification (SI) procedures as well as fundamental system design issues are presented.
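The multi-step identification procedure mentioned above starts from ordinary least squares. A minimal sketch of that OLS step, under an assumed one-step linear model x[k+1] = a*x[k] + b*u[k] (not the paper's structural model), recovers the parameters by solving the 2x2 normal equations:

```python
def ols_identify(x, u):
    """Recover (a, b) in x[k+1] = a*x[k] + b*u[k] by ordinary least squares."""
    # Accumulate the normal equations A^T A theta = A^T y for theta = (a, b).
    s_xx = s_xu = s_uu = s_xy = s_uy = 0.0
    for k in range(len(x) - 1):
        s_xx += x[k] * x[k]
        s_xu += x[k] * u[k]
        s_uu += u[k] * u[k]
        s_xy += x[k] * x[k + 1]
        s_uy += u[k] * x[k + 1]
    det = s_xx * s_uu - s_xu * s_xu
    a = (s_xy * s_uu - s_uy * s_xu) / det
    b = (s_uy * s_xx - s_xy * s_xu) / det
    return a, b


# Synthetic response generated from a known system (a=0.9, b=0.5),
# standing in for measured accelerogram-driven data.
u = [1.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0]
x = [0.0]
for k in range(len(u)):
    x.append(0.9 * x[k] + 0.5 * u[k])

a_hat, b_hat = ols_identify(x, u)
```

Real structural SI adds extended Kalman filtering for noisy, nonlinear cases and substructuring to localize damage; this shows only the least-squares core.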
Northern Plains Blizzards in Past and Future Climates
NASA Astrophysics Data System (ADS)
Trellinger, A.; Kennedy, A. D.
2017-12-01
High-latitude regions of the globe, including the northern tier of the United States, are subject to adverse winter conditions such as snowstorms. When snowfall combines with strong winds, blizzards can result, and these events have significant personal, societal, and economic impacts for the Northern Plains. Although the climatology of wintertime extremes such as blizzards is reasonably well understood, it is not known how the frequency and intensity of these events may change in a warming climate. Complicating factors include competing trends that suggest winters over this region will have more snow, but over a shorter seasonal duration. Identifying blizzards in climate models is difficult due to the horizontal and vertical grid spacing used. Additionally, blowing snow is not considered in these models, so it cannot be directly diagnosed. Instead, alternative ways must be developed to identify these events. The presented work will use a competitive neural network known as the Self-Organizing Map (SOM) to identify meteorological patterns associated with blizzard events over the Northern Plains from 1979-2016. Once these large-scale patterns are identified from observations, they will be identified in Community Climate System Model (CCSM) 4.0 20th-century-forcing climate simulations run in support of the Coupled Model Intercomparison Project Phase 5 (CMIP-5). Specifically, the methodology will rely on the 'Mother of All Runs' (MOAR) ensemble member. Because this member provides subdaily output for many variables, specific meteorological patterns can be identified. Blizzard events will be identified during historical time periods to determine biases, and then under future emissions scenarios.
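A toy sketch of the competitive learning at the heart of a SOM: each input pattern is assigned to its best-matching unit (BMU), whose weights are pulled toward the input (a full SOM also updates the BMU's grid neighbors, omitted here). The grid size, learning rate, and two-feature "weather patterns" are illustrative only.

```python
import random


def train_som(data, n_units=2, lr=0.5, epochs=20, seed=0):
    """Train a minimal 1-D SOM (no neighborhood function) on 2-D data."""
    rng = random.Random(seed)
    units = [[rng.random() for _ in data[0]] for _ in range(n_units)]
    for _ in range(epochs):
        for x in data:
            # Competitive step: find the best-matching unit.
            bmu = min(range(n_units),
                      key=lambda i: sum((u - v) ** 2 for u, v in zip(units[i], x)))
            # Update step: pull the winner toward the input.
            units[bmu] = [u + lr * (v - u) for u, v in zip(units[bmu], x)]
    return units


def bmu_index(units, x):
    """Index of the unit closest to pattern x."""
    return min(range(len(units)),
               key=lambda i: sum((u - v) ** 2 for u, v in zip(units[i], x)))


# Two artificial regimes, e.g. low-wind and high-wind pressure patterns.
data = [[0.1, 0.1], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]
units = train_som(data)
```

In the blizzard application, each trained unit would correspond to a recurring large-scale meteorological pattern, and model fields are then mapped to their BMU to count how often each pattern occurs.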
NASA Astrophysics Data System (ADS)
Sturdevant-Rees, P. L.; Long, S. C.; Barten, P. K.
2002-05-01
A forty-month investigation to collect microbial and water-quality measurements during storm events under a variety of meteorological and land-use conditions is in its initial stages. Intense sampling during storm event periods will be used to optimize sampling and analysis strategies for accurate determination of constituent loads. Of particular interest is identification of meteorological and hydrologic conditions under which sampling and analysis of surface waters for traditional microbial organisms, emerging microbial organisms and non-bacterial pathogens are critical to ensure the integrity of surface-water drinking supplies. This work is particular to the Quabbin-Ware-Wachusett reservoir system in Massachusetts, which provides unfiltered drinking water to 2.5 million people in Boston and surrounding communities. Sampling and analysis strategies will be optimized in terms of number of samples over the hydrograph, timing of sample collection (including sample initiation), constituents measured, volumes analyzed, and monetary and personnel costs. Initial water-quality analyses include pH, temperature, turbidity, conductivity, total suspended solids, total phosphorus, total Kjeldahl-nitrogen, ammonia nitrogen, and total and fecal coliforms. Giardia cysts and Cryptosporidium oocysts will also be measured at all sample sites. Sorbitol-fermenting Bifidobacteria, Rhodococcus coprophilus, Clostridium perfringens spores, and Somatic and F-specific coliphages are measured at select sites as potential alternative source-specific indicator organisms. It is anticipated that the final database will consist of transport data for the above parameters during twenty-four distinct storm-events in addition to monthly baseline data. Results and analyses for the first monitored storm-event will be presented.
Moro, Pedro L.; Woo, Emily Jane; Paul, Wendy; Lewis, Paige; Petersen, Brett W.; Cano, Maria
2016-01-01
Background: In 1980, human diploid cell vaccine (HDCV, Imovax Rabies, Sanofi Pasteur) was licensed for use in the United States. Objective: To assess adverse events (AEs) after HDCV reported to the US Vaccine Adverse Event Reporting System (VAERS), a spontaneous reporting surveillance system. Methods: We searched VAERS for US reports after HDCV among persons vaccinated from January 1, 1990 to July 31, 2015. Medical records were requested for reports classified as serious (death, hospitalization, prolonged hospitalization, disability, life-threatening illness), and those suggesting anaphylaxis and Guillain-Barré syndrome (GBS). Physicians reviewed available information and assigned a primary clinical category to each report using MedDRA system organ classes. Empirical Bayesian (EB) data mining was used to identify disproportional AE reporting after HDCV. Results: VAERS received 1,611 reports after HDCV; 93 (5.8%) were serious. Among all reports, the three most common AEs were pyrexia (18.2%), headache (17.9%), and nausea (16.5%). Among serious reports, four deaths appeared to be unrelated to vaccination. Conclusions: This 25-year review of VAERS did not identify new or unexpected AEs after HDCV. The vast majority of AEs were non-serious. Injection site reactions, hypersensitivity reactions, and non-specific constitutional symptoms were most frequently reported, similar to findings in pre-licensure studies. PMID:27410239
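The study applied Empirical Bayesian (MGPS-style) data mining, whose shrinkage computation is involved; the underlying disproportionality idea can be illustrated with the simpler proportional reporting ratio (PRR) instead. The counts below are invented for illustration.

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 table:
    a = reports of the event for the vaccine of interest
    b = all other reports for that vaccine
    c = reports of the event for all other vaccines
    d = all other reports for all other vaccines
    """
    rate_vaccine = a / (a + b)   # event share among this vaccine's reports
    rate_others = c / (c + d)    # event share among comparator reports
    return rate_vaccine / rate_others


# Invented example: 30 headache reports out of 160 for a vaccine, versus
# 100 out of 2000 in the comparator set.
value = prr(30, 130, 100, 1900)   # 0.1875 / 0.05 = 3.75
```

A PRR well above 1 flags an AE as disproportionally reported; EB methods refine this by shrinking ratios from small counts toward the null, which is why they are preferred for sparse surveillance data.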
Information extraction from Italian medical reports: An ontology-driven approach.
Viani, Natalia; Larizza, Cristiana; Tibollo, Valentina; Napolitano, Carlo; Priori, Silvia G; Bellazzi, Riccardo; Sacchi, Lucia
2018-03-01
In this work, we propose an ontology-driven approach to identify events and their attributes from episodes of care included in medical reports written in Italian. For this language, shared resources for clinical information extraction are not easily accessible. The corpus considered in this work includes 5432 non-annotated medical reports belonging to patients with rare arrhythmias. To guide the information extraction process, we built a domain-specific ontology that includes the events and the attributes to be extracted, with related regular expressions. The ontology and the annotation system were constructed on a development set, while the performance was evaluated on an independent test set. As a gold standard, we considered a manually curated hospital database named TRIAD, which stores most of the information written in reports. The proposed approach performs well on the considered Italian medical corpus, with a percentage of correct annotations above 90% for most considered clinical events. We also assessed the possibility to adapt the system to the analysis of another language (i.e., English), with promising results. Our annotation system relies on a domain ontology to extract and link information in clinical text. We developed an ontology that can be easily enriched and translated, and the system performs well on the considered task. In the future, it could be successfully used to automatically populate the TRIAD database. Copyright © 2017 Elsevier B.V. All rights reserved.
Design and implementation of the GLIF3 guideline execution engine.
Wang, Dongwen; Peleg, Mor; Tu, Samson W; Boxwala, Aziz A; Ogunyemi, Omolola; Zeng, Qing; Greenes, Robert A; Patel, Vimla L; Shortliffe, Edward H
2004-10-01
We have developed the GLIF3 Guideline Execution Engine (GLEE) as a tool for executing guidelines encoded in the GLIF3 format. In addition to serving as an interface to the GLIF3 guideline representation model to support the specified functions, GLEE provides defined interfaces to electronic medical records (EMRs) and other clinical applications to facilitate its integration with the clinical information system at a local institution. The execution model of GLEE takes the "system suggests, user controls" approach. A tracing system is used to record an individual patient's state when a guideline is applied to that patient. GLEE can also support an event-driven execution model once it is linked to the clinical event monitor in a local environment. Evaluation has shown that GLEE can be used effectively for proper execution of guidelines encoded in the GLIF3 format. When using it to execute each guideline in the evaluation, GLEE's performance duplicated that of the reference systems implementing the same guideline but taking different approaches. The execution flexibility and generality provided by GLEE, and its integration with a local environment, need to be further evaluated in clinical settings. Integration of GLEE with a specific event-monitoring and order-entry environment is the next step of our work to demonstrate its use for clinical decision support. Potential uses of GLEE also include quality assurance, guideline development, and medical education.
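A toy sketch of the "system suggests, user controls" execution model: the engine proposes the next guideline step, but nothing enters the patient's trace until the user confirms it. The flat step list and step names are invented; GLIF3 guidelines are far more expressive (branching, criteria, nesting).

```python
class GuidelineEngine:
    """Minimal 'system suggests, user controls' executor with a trace."""

    def __init__(self, steps):
        self.steps = steps
        self.trace = []  # records the patient's state as steps are confirmed

    def suggest(self):
        """System suggests the next pending step (None when finished)."""
        done = len(self.trace)
        return self.steps[done] if done < len(self.steps) else None

    def confirm(self, step):
        """User controls: only a confirmed suggestion enters the trace."""
        if step != self.suggest():
            raise ValueError("can only confirm the currently suggested step")
        self.trace.append(step)


engine = GuidelineEngine(["assess symptoms", "order ECG", "review results"])
first = engine.suggest()
engine.confirm(first)
```

The per-patient trace is what allows execution to resume later or to be driven by external events, e.g. a clinical event monitor notifying the engine that a result has arrived.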
How Engineers Negotiate Domain Boundaries in a Complex, Interdisciplinary Engineering Project
NASA Technical Reports Server (NTRS)
Panther, Grace; Montfort, Devlin; Pirtle, Zachary
2017-01-01
Engineering educators have an essential role in preparing engineers to work in a complex, interdisciplinary workforce. While much engineering education focuses on teaching students to develop disciplinary expertise in specific engineering domains, there is a strong need to teach engineers about the knowledge that they develop or use in their work (Bucciarelli, 1994; Allenby & Sarewitz, 2011; Frodeman, 2013). The purpose of this research is to gain a better understanding of the knowledge systems of practicing engineers through observations of their practices, such that the insights learned can guide future education efforts. Using an example from a complex and interdisciplinary engineering project, this paper presents a case study providing an overview of the types of epistemological (knowledge-acquiring and knowledge-using) complexities that engineers navigate. Specifically, we looked at a discussion of the thermal design of a CubeSat that occurred during an engineering review at NASA. We analyzed the review using a framework that we call 'peak events', or pointed discussions between reviewers, project engineers, and managers. We examined the dialog within peak events to identify the ways that knowledge was brought to bear, highlighting discussions of uncertainty and the boundaries of knowledge claims. We focus on one example discussion surrounding the thermal design of the CubeSat, which provides a particularly thorough example of a knowledge system since the engineers present explained, justified, negotiated, and defended knowledge within a social setting. Engineering students do not get much practice or instruction in explicitly negotiating knowledge systems and epistemic standards in this way. We highlight issues that should matter to engineering educators, such as the need to discuss what level of uncertainty is sufficient and the need to negotiate boundaries of system responsibility.
Although this analysis is limited to a single discussion or 'peak event', our case shows that this type of discussion can occur in engineering and suggests that it could be important for future engineering education research.
NASA Astrophysics Data System (ADS)
Choudhury, Diptyajit; Angeloski, Aleksandar; Ziah, Haseeb; Buchholz, Hilmar; Landsman, Andre; Gupta, Amitava; Mitra, Tiyasa
Lunar explorations often involve use of a lunar lander, a rover [1], [2] and an orbiter which rotates around the moon with a fixed radius. The orbiters are usually lunar satellites orbiting along a polar orbit to ensure visibility with respect to the rover and the Earth station, although with varying latency. Communication in such deep space missions is usually done using a specialized protocol like Proximity-1 [3]. MATLAB simulation of Proximity-1 has been attempted by some contemporary researchers [4] to simulate all features like transmission control, delay, etc. In this paper it is attempted to simulate, in real time, the communication between a tracking station on Earth (earth station), a lunar orbiter and a lunar rover using concepts of Distributed Real-Time Simulation (DRTS). The objective of the simulation is to simulate, in real time, the time-varying communication delays associated with the communicating elements, with a facility to integrate specific simulation modules to study different aspects, e.g. the response to a specific control command from the earth station to be executed by the rover. The hardware platform comprises four single-board computers operating as stand-alone real-time systems (developed with MATLAB xPC Target and inter-networked using the UDP/IP protocol). A time-triggered DRTS approach is adopted. The earth station, the orbiter and the rover are programmed as three standalone real-time processes representing the communicating elements in the system. Communication from one communicating element to another constitutes an event which passes a state message from one element to another, augmenting the state of the latter. These events are handled by an event scheduler, which is the fourth real-time process. The event scheduler simulates the delay in space communication taking into consideration the distance between the communicating elements.
A unique time synchronization algorithm is developed which takes into account the large latencies in space communication. The DRTS setup thus developed serves as an important and inexpensive test bench for trying out remote-controlled applications on the rover, for example, from an earth station. The simulation is modular and the system is composable. Each of the processes can be augmented with relevant simulation modules that handle the events to simulate specific functionalities. With stringent energy saving requirements on most rovers, such a simulation setup, for example, can be used to design optimal rover movement control strategies from the orbiter in conjunction with autonomous systems on the rover itself. References: 1. Lunar and Planetary Department, Moscow University, Lunokhod 1, "http://selena.sai.msu.ru/Home/Spa 2. NASA History Office, Guidelines for Advanced Manned Space Vehicle Program, "http://history.nasa.gov 35ann/AMSVPguidelines/top.htm" 3. Consultative Committee For Space Data Systems, "Proximity-1 Space Link Protocol", CCSDS 211.0-B-1 Blue Book, October 2002. 4. Segui, J. and Jennings, E., "Delay Tolerant Networking-Bundle Protocol Simulation", in Proceedings of the 2nd IEEE International Conference on Space Mission Challenges for Information Technology, 2006.
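The delay the event scheduler must simulate is, at its simplest, the one-way light-time between two elements: their separation divided by the speed of light. A sketch of that computation, with illustrative round figures (the Earth-Moon average distance is about 384,400 km; the orbiter-rover distance is made up):

```python
C_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s


def one_way_delay_s(distance_km):
    """One-way propagation delay for a signal over distance_km."""
    return distance_km / C_KM_PER_S


earth_to_orbiter = one_way_delay_s(384_400)  # roughly 1.28 s one way
orbiter_to_rover = one_way_delay_s(1_800)    # near-instant by comparison
```

In the actual setup the distances, and hence the delays, vary with time as the orbiter moves along its polar orbit, so the scheduler would recompute the separation for each event rather than use fixed values.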
A Systems-Theoretical Generalization of Non-Local Correlations
NASA Astrophysics Data System (ADS)
von Stillfried, Nikolaus
Non-local correlations between quantum events are not due to a causal interaction in the sense of one being the cause for the other. In principle, the correlated events can thus occur simultaneously. Generalized Quantum Theory (GQT) formalizes the idea that non-local phenomena are not exclusive to quantum mechanics, e.g. due to some specific properties of (sub)atomic particles, but that they instead arise as a consequence of the way such particles are arranged into systems. Non-local phenomena should hence occur in any system which fulfils the necessary systems-theoretical parameters. The two most important parameters with respect to non-local correlations seem to be a conserved global property of the system as a whole and sufficient degrees of freedom of the corresponding property of its subsystems. Both factors place severe limitations on experimental observability of the phenomena, especially in terms of replicability. It has been suggested that reported phenomena of a so-called synchronistic, parapsychological or paranormal kind could be understood as instances of systems-inherent non-local correlations. From a systems-theoretical perspective, their phenomenology (including the favorable conditions for their occurrence and their lack of replicability) displays substantial similarities to non-local correlations in quantum systems and matches well with systems-theoretical parameters, thus providing circumstantial evidence for this hypothesis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitzgerald, Peter; Laughter, Mark D; Martyn, Rose
The Cylinder Accountability and Tracking System (CATS) is a tool designed for use by the International Atomic Energy Agency (IAEA) to improve overall inspector efficiency through real-time unattended monitoring of cylinder movements, site-specific rules-based event detection, and the capability to integrate many types of monitoring technologies. The system is based on the tracking of cylinder movements using radio frequency (RF) tags, and the collection of data, such as accountability weights, that can be associated with the cylinders. This presentation will cover the installation and evaluation of the CATS at the Global Nuclear Fuels (GNF) fuel fabrication facility in Wilmington, NC. This system was installed to evaluate its safeguards applicability, operational durability under operating conditions, and overall performance. An overview of the system design and elements specific to the GNF deployment will be presented along with lessons learned from the installation process and results from the field trial.
Zhu, Pengyu; Fu, Wei; Wang, Chenguang; Du, Zhixin; Huang, Kunlun; Zhu, Shuifang; Xu, Wentao
2016-04-15
The possibility of the absolute quantitation of GMO events by digital PCR was recently reported. However, most absolute quantitation methods based on digital PCR require pretreatment steps. Meanwhile, singleplex detection could not meet the demand of the absolute quantitation of GMO events, which is based on the ratio of foreign fragments to reference genes. Thus, to promote the absolute quantitative detection of different GMO events by digital PCR, we developed a quantitative detection method based on duplex digital PCR without pretreatment. Moreover, we tested 7 GMO events in our study to evaluate the fitness of our method. The optimized combination of foreign and reference primers, limit of quantitation (LOQ), limit of detection (LOD) and specificity were validated. The results showed that the LOQ of our method for different GMO events was 0.5%, while the LOD was 0.1%. Additionally, we found that duplex digital PCR could achieve detection results with lower RSD compared with singleplex digital PCR. In summary, the duplex digital PCR detection system is a simple and stable way to achieve the absolute quantitation of different GMO events. Moreover, the LOQ and LOD indicate that this method is suitable for the daily detection and quantitation of GMO events. Copyright © 2016 Elsevier B.V. All rights reserved.
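The absolute quantitation step described above rests on the standard Poisson correction for partition-based digital PCR, with GMO content expressed as the transgene-to-reference copy ratio read from the same duplex reaction. A minimal sketch (function names and partition counts are illustrative, not the paper's software):

```python
import math

def copies_per_partition(n_positive, n_total):
    """Poisson correction: mean target copies per partition is
    lambda = -ln(1 - p), where p is the positive-partition fraction."""
    p = n_positive / n_total
    return -math.log(1.0 - p)

def gmo_content(pos_transgene, pos_reference, n_total):
    """GMO content (%) as the ratio of transgene copies to reference-gene
    copies, both measured in one duplex reaction (no pretreatment step)."""
    lam_t = copies_per_partition(pos_transgene, n_total)
    lam_r = copies_per_partition(pos_reference, n_total)
    return 100.0 * lam_t / lam_r

# e.g. 150 transgene-positive and 14000 reference-positive of 20000 partitions
content = gmo_content(150, 14000, 20000)
```

Because both channels come from the same partitions, pipetting and dilution errors cancel in the ratio, which is one reason a duplex assay can achieve lower RSD than two singleplex runs.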
Francy, Reshma C; Farid, Amro M; Youcef-Toumi, Kamal
2015-05-01
For many decades, state estimation (SE) has been a critical technology for the energy management systems utilized by power system operators. Over time, it has become a mature technology that provides an accurate representation of system state under fairly stable and well understood system operation. The integration of variable energy resources (VERs) such as wind and solar generation, however, introduces new fast frequency dynamics and uncertainties into the system. Furthermore, such renewable energy is often integrated into the distribution system, thus requiring real-time monitoring all the way to the periphery of the power grid topology and not just the (central) transmission system. The conventional solution is twofold: solve the SE problem (1) at a faster rate in accordance with the newly added VER dynamics and (2) for the entire power grid topology including the transmission and distribution systems. Such an approach results in exponentially growing problem sets which need to be solved at faster rates. This work seeks to address these two simultaneous requirements and builds upon two recent SE methods which incorporate event-triggering such that the state estimator is only called in the case of considerable novelty in the evolution of the system state. The first method incorporates only event-triggering while the second adds the concept of tracking. Both SE methods are demonstrated on the standard IEEE 14-bus system and the results are observed for a specific bus for two different scenarios: (1) a spike in the wind power injection and (2) ramp events with higher variability. Relative to traditional state estimation, the numerical case studies showed that the proposed methods can result in computational time reductions of 90%. These results were supported by a theoretical discussion of the computational complexity of three SE techniques.
The work concludes that the proposed SE techniques demonstrate practical improvements to the computational complexity of classical state estimation. In such a way, state estimation can continue to support the necessary control actions to mitigate the imbalances resulting from the uncertainties in renewables. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
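The event-triggering idea can be illustrated with a deliberately simplified scalar estimator: the (expensive) update runs only when the innovation exceeds a novelty threshold. This is a sketch of the general principle, not the paper's actual formulation; the gain `alpha` and the threshold are illustrative assumptions.

```python
import numpy as np

def event_triggered_se(measurements, x0, threshold, alpha=0.9):
    """Run the estimator update only on 'considerable novelty':
    if |z - x_hat| <= threshold, the prior estimate is simply held."""
    x_hat = x0
    estimates, triggers = [], 0
    for z in measurements:
        innovation = z - x_hat
        if abs(innovation) > threshold:        # novelty detected
            x_hat = x_hat + alpha * innovation  # run the (here trivial) update
            triggers += 1
        estimates.append(x_hat)
    return np.array(estimates), triggers
```

On a slowly varying state the estimator is almost never called, which is where the reported ~90% computation savings come from; a real implementation would replace the scalar update with a full weighted-least-squares SE solve over the network model.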
Real World Experience With Ion Implant Fault Detection at Freescale Semiconductor
NASA Astrophysics Data System (ADS)
Sing, David C.; Breeden, Terry; Fakhreddine, Hassan; Gladwin, Steven; Locke, Jason; McHugh, Jim; Rendon, Michael
2006-11-01
The Freescale automatic fault detection and classification (FDC) system has logged data from over 3.5 million implants in the past two years. The Freescale FDC system is a low-cost system which collects summary implant statistics at the conclusion of each implant run. The data is collected either by downloading implant data log files from the implant tool workstation, or by exporting summary implant statistics through the tool's automation interface. Compared to traditional FDC systems, which gather trace data from sensors on the tool as the implant proceeds, the Freescale FDC system cannot prevent scrap when a fault initially occurs, since the data is collected after the implant concludes. However, the system can prevent catastrophic scrap events due to faults which are not detected for days or weeks, leading to the loss of hundreds or thousands of wafers. At the Freescale ATMC facility, the practical applications of the FDC system fall into two categories: PM trigger rules which monitor tool signals such as ion gauges and charge control signals, and scrap prevention rules which are designed to detect specific failure modes that have been correlated to yield loss and scrap. PM trigger rules are designed to detect shifts in tool signals which indicate normal aging of tool systems. For example, charging parameters gradually shift as flood gun assemblies age, and when charge control rules start to fail a flood gun PM is performed. Scrap prevention rules are deployed to detect events such as particle bursts and excessive beam noise, events which have been correlated to yield loss. The FDC system does have tool log-down capability, and scrap prevention rules often use this capability to automatically log the tool into a maintenance state while simultaneously paging the sustaining technician for data review and disposition of the affected product.
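A post-run rule check of this kind reduces to comparing each run's summary statistics against per-signal limits and mapping violations to actions. The sketch below is hypothetical: the signal names, limits, and action strings are invented for illustration and are not Freescale's actual rules.

```python
def check_implant_run(stats, rules):
    """Evaluate post-run summary statistics against PM-trigger and
    scrap-prevention rules; returns the (rule, action) pairs that fired."""
    actions = []
    for rule in rules:
        value = stats.get(rule["signal"])
        if value is not None and not (rule["low"] <= value <= rule["high"]):
            actions.append((rule["name"], rule["action"]))
    return actions

# Hypothetical rule set: one PM trigger, one scrap-prevention rule.
rules = [
    {"name": "flood_gun_aging", "signal": "charge_ctrl_v",
     "low": 0.0, "high": 4.5, "action": "schedule_pm"},
    {"name": "beam_noise", "signal": "beam_noise_pct",
     "low": 0.0, "high": 2.0, "action": "log_tool_down_and_page"},
]
```

Because the check runs on end-of-run summaries rather than live traces, it trades first-wafer protection for simplicity, exactly the trade-off the abstract describes.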
NASA Astrophysics Data System (ADS)
Zielke, O.; Arrowsmith, R. J.
2005-12-01
The nonlinear dynamics of fault behavior are dominated by complex interactions among the multiple processes controlling the system. For example, temporal and spatial variations in pore pressure, healing effects, and stress transfer cause significant heterogeneities in fault properties and the stress field at the sub-fault level. Numerical and laboratory fault models show that the interaction of large systems of fault elements causes the entire system to develop into a state of self-organized criticality. Once in this state, small perturbations of the system may result in chain reactions (i.e., earthquakes) which can affect any number of fault segments. This sensitivity to small perturbations is strong evidence for chaotic fault behavior, which implies that exact event prediction is not possible. However, earthquake prediction with a useful accuracy is nevertheless possible. Studies of other natural chaotic systems have shown that they may enter states of metastability, in which the system's behavior is predictable. Applying this concept to earthquake faults, these windows of metastable behavior should be characterized by periodic earthquake recurrence. The observed periodicity of the Parkfield, CA (M = 6) events may resemble such a window of metastability. I am statistically analyzing numerically generated seismic records to study these phases of periodic behavior. In this preliminary study, seismic records were generated using a model introduced by Nakanishi [Phys. Rev. A, 43, 6613-6621, 1991]. It consists of a one-dimensional chain of blocks (interconnected by springs) with a relaxation function that mimics velocity-weakened frictional behavior. The earthquakes occurring in this model generally show a power-law frequency-size distribution. However, for large events the distribution has a shoulder where the frequency of events is higher than expected from the power law. I have analyzed time-series of single block motions within the system.
These time-series include noticeable periodicity during certain intervals in an otherwise aperiodic record. The observed periodic signal is not equally distributed over the range of offsets but shows a multi-modal distribution with increased periodicity for the smallest events and for large events that show a specific offset. These large events also form a shoulder in the frequency-size distribution. Apparently, the model exhibits characteristic earthquakes (defined by similar coseismic slip) that occur more frequently than expected from a power law distribution, and also are significantly more periodic. The wavelength of the periodic signal generally equals the minimum loading time, which is related to the loading velocity and the amount of coseismic slip (i.e., stress drop). No significant event occurs between the characteristic events as long as the system stays in a window of periodic behavior. Within the windows of periodic behavior, earthquake prediction is straightforward. Therefore, recognition of these windows not only in synthetic data but also in real seismic records may improve the intra-window forecast of earthquakes. Further studies will attempt to determine the characteristics of the onset, duration, and end of these windows of periodic earthquake recurrence. So far, only the motion of a single block within the larger system has been analyzed. Moving from this zero-dimensional scenario to a two-dimensional case, in which the displacement patterns caused by an entire event are analyzed rather than the offsets of a single block, should improve the reliability of detecting periodic earthquake recurrence within an otherwise chaotic seismic record.
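A minimal cellular-automaton version of such a slider-block chain can be sketched as follows. Note this uses a simple nearest-neighbour stress transfer rather than Nakanishi's specific relaxation function, so it illustrates only the class of model; all parameter values are illustrative.

```python
import random

def slider_block_catalog(n_blocks=64, n_steps=2000, threshold=1.0,
                         transfer=0.2, seed=1):
    """Sketch of a 1-D slider-block automaton: load uniformly until the
    weakest block fails, drop its stress to zero, pass a fraction to each
    neighbour, and let the cascade run. Returns the catalog of event sizes
    (number of block failures per 'earthquake')."""
    random.seed(seed)
    stress = [random.uniform(0, threshold) for _ in range(n_blocks)]
    catalog = []
    for _ in range(n_steps):
        # uniform tectonic loading up to the weakest-link failure
        load = threshold - max(stress)
        stress = [s + load for s in stress]
        failing = [i for i, s in enumerate(stress) if s >= threshold]
        size = 0
        while failing:                           # the cascade = one event
            nxt = set()
            for i in failing:
                size += 1
                drop = stress[i]
                stress[i] = 0.0
                for j in (i - 1, i + 1):         # nearest-neighbour transfer
                    if 0 <= j < n_blocks:
                        stress[j] += transfer * drop
                        if stress[j] >= threshold:
                            nxt.add(j)
            failing = list(nxt)
        catalog.append(size)
    return catalog
```

With `transfer < 0.5` the model is dissipative, so each cascade terminates; the resulting size catalog is the kind of synthetic record whose frequency-size distribution and single-block time series are analyzed above.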
Kay, Aaron C; Shepherd, Steven; Blatz, Craig W; Chua, Sook Ning; Galinsky, Adam D
2010-11-01
It has been recently proposed that people can flexibly rely on sources of control that are both internal and external to the self to satisfy the need to believe that their world is under control (i.e., that events do not unfold randomly or haphazardly). Consistent with this, past research demonstrates that, when personal control is threatened, people defend external systems of control, such as God and government. This theoretical perspective also suggests that belief in God and support for governmental systems, although seemingly disparate, will exhibit a hydraulic relationship with one another. Using both experimental and longitudinal designs in Eastern and Western cultures, the authors demonstrate that experimental manipulations or naturally occurring events (e.g., electoral instability) that lower faith in one of these external systems (e.g., the government) lead to subsequent increases in faith in the other (e.g., God). In addition, mediation and moderation analyses suggest that specific concerns with order and structure underlie these hydraulic effects. Implications for the psychological, sociocultural, and sociopolitical underpinnings of religious faith, as well as system justification theory, are discussed.
Alert Notification System Router
NASA Technical Reports Server (NTRS)
Gurganus, Joseph; Carey, Everett; Antonucci, Robert; Hitchener, Peter
2009-01-01
The Alert Notification System Router (ANSR) software provides satellite operators with notifications of key events through pagers, cell phones, and e-mail. Written in Java, this application is specifically designed to meet the mission-critical standards for mission operations while operating on a variety of hardware environments. ANSR is a software component that runs inside the Mission Operations Center (MOC). It connects to the mission's message bus using the GMSEC [Goddard Space Flight Center (GSFC) Mission Services Evolution Center (GMSEC)] standard. Other components, such as automation and monitoring components, can use ANSR to send directives to notify users or groups. The ANSR system, in addition to notifying users, can check for message acknowledgements from a user and escalate the notification to another user if there is no acknowledgement. When a firewall prevents ANSR from accessing the Internet directly, proxies can be run on the other side of the wall. These proxies can be configured to access the Internet, notify users, and poll for their responses. Multiple ANSRs can be run in parallel, providing a seamless failover capability in the event that one ANSR system becomes incapacitated.
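The notify-acknowledge-escalate loop described above can be sketched generically. The function and contact names below are hypothetical and are not the ANSR API; `send` and `wait_for_ack` stand in for whatever pager, SMS, or e-mail transport (or firewall proxy) is configured.

```python
def notify_with_escalation(event, chain, send, wait_for_ack, timeout_s=300):
    """Walk an escalation chain: notify each contact in turn and stop at
    the first one who acknowledges within the timeout; return that contact,
    or None if the whole chain is exhausted without acknowledgement."""
    for contact in chain:
        send(contact, event)
        if wait_for_ack(contact, timeout_s):
            return contact          # acknowledged; no further escalation
    return None                     # nobody acknowledged
```

Running several such routers in parallel, each consuming the same message-bus directives, gives the seamless failover behaviour the abstract describes, since any surviving instance can complete the walk.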
Mechanism-based Pharmacovigilance over the Life Sciences Linked Open Data Cloud.
Kamdar, Maulik R; Musen, Mark A
2017-01-01
Adverse drug reactions (ADR) result in significant morbidity and mortality in patients, and a substantial proportion of these ADRs are caused by drug-drug interactions (DDIs). Pharmacovigilance methods are used to detect unanticipated DDIs and ADRs by mining Spontaneous Reporting Systems, such as the US FDA Adverse Event Reporting System (FAERS). However, these methods do not provide mechanistic explanations for the discovered drug-ADR associations in a systematic manner. In this paper, we present a systems pharmacology-based approach to perform mechanism-based pharmacovigilance. We integrate data and knowledge from four different sources using Semantic Web Technologies and Linked Data principles to generate a systems network. We present a network-based Apriori algorithm for association mining in FAERS reports. We evaluate our method against existing pharmacovigilance methods for three different validation sets. Our method has AUROC statistics of 0.7-0.8, similar to current methods, and event-specific thresholds generate AUROC statistics greater than 0.75 for certain ADRs. Finally, we discuss the benefits of using Semantic Web technologies to attain the objectives for mechanism-based pharmacovigilance.
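The first two passes of a plain Apriori miner over report itemsets can be sketched as follows. The paper's distinctive step, pruning candidates with the systems-pharmacology network, is omitted here, and the example reports are invented.

```python
from itertools import combinations
from collections import Counter

def frequent_pairs(reports, min_support):
    """Plain Apriori, passes 1 and 2, over adverse-event reports (each
    report is a set of items such as drugs and reactions): keep frequent
    single items, then count only pairs built from them (the Apriori
    property: a pair can be frequent only if both items are)."""
    item_counts = Counter(i for r in reports for i in r)
    frequent = {i for i, c in item_counts.items() if c >= min_support}
    pair_counts = Counter()
    for r in reports:
        for pair in combinations(sorted(frequent & set(r)), 2):
            pair_counts[pair] += 1
    return {p: c for p, c in pair_counts.items() if c >= min_support}
```

In the network-based variant, candidate pairs would additionally be required to lie on a mechanistic path in the integrated systems network, which is what supplies the mechanistic explanation for each surviving drug-ADR association.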
A Decision Support System for Tele-Monitoring COPD-Related Worrisome Events.
Merone, Mario; Pedone, Claudio; Capasso, Giuseppe; Incalzi, Raffaele Antonelli; Soda, Paolo
2017-03-01
Chronic Obstructive Pulmonary Disease (COPD) is a preventable, treatable, and slowly progressive disease, whose course is aggravated by periodic worsening of symptoms and lung function lasting for several days. The development of home telemonitoring systems has made it possible to collect symptoms and physiological data in electronic records, boosting the development of decision support systems (DSSs). Current DSSs work with physiological measurements collected by means of several measuring and communication devices, as well as with symptoms gathered by questionnaires submitted to COPD subjects. However, this contrasts with the advice provided by the World Health Organization and the Global initiative for chronic Obstructive Lung Disease, which recommend avoiding invasive or complex daily measurements. For these reasons, this manuscript presents a DSS detecting the onset of worrisome events in COPD subjects. It uses the heart rate and the oxygen saturation, which can be collected via a pulse oximeter. The DSS consists of a binary finite state machine, whose training stage allows a subject-specific personalization of the predictive model, triggering warnings and alarms as the health status evolves over time. The experiments on data collected from 22 COPD patients tele-monitored at home for six months show that the system's recognition performance is better than that achieved by medical experts. Furthermore, the support offered by the system in the decision-making process increases the agreement between the specialists, largely impacting the recognition of worrisome events.
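A binary state machine over pulse-oximeter readings might look like the following sketch. The thresholds and the persistence rule are illustrative assumptions; as described above, the actual model is personalised per subject during a training stage.

```python
def copd_monitor(readings, hr_max=110, spo2_min=90, persist=3):
    """Two-state machine over daily (heart_rate, SpO2) pairs: enter ALERT
    after `persist` consecutive out-of-range days, return to OK otherwise.
    Thresholds here are illustrative, not clinically validated."""
    bad_days, states = 0, []
    for hr, spo2 in readings:
        out_of_range = hr > hr_max or spo2 < spo2_min
        bad_days = bad_days + 1 if out_of_range else 0
        states.append("ALERT" if bad_days >= persist else "OK")
    return states
```

Requiring persistence before alarming is one simple way to keep single noisy readings from triggering warnings, while still catching the multi-day worsenings that characterise COPD exacerbations.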
40 CFR 50.14 - Treatment of air quality monitoring data influenced by exceptional events.
Code of Federal Regulations, 2010 CFR
2010-07-01
... specific air pollution concentration at a particular air quality monitoring location. (2) Demonstration to... exceptional event caused a specific air pollution concentration in excess of one or more national ambient air... specific air pollution concentration in excess of one or more national ambient air quality standards at a...
Characterization of extreme precipitation within atmospheric river events over California
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeon, S.; Prabhat,; Byna, S.
Atmospheric rivers (ARs) are large, spatially coherent weather systems with high concentrations of elevated water vapor. These systems often cause severe downpours and flooding over the western coastal United States – and with the availability of more atmospheric moisture in the future under global warming we expect ARs to play an important role as potential causes of extreme precipitation changes. Therefore, we aim to investigate changes in extreme precipitation properties correlated with AR events, which are large-scale meteorological patterns affecting the weather and climate of California, in a warmer climate. We have recently developed the TECA (Toolkit for Extreme Climate Analysis) software for automatically identifying and tracking features in climate data sets. Specifically, we can now identify ARs that make landfall on the western coast of North America. Based on this detection procedure, we can investigate the impact of ARs by exploring the spatial extent of AR precipitation using climate model (CMIP5) simulations and characterize spatial patterns of dependence for future projections between AR precipitation extremes under climate change within the statistical framework. Our results show that AR events in the future RCP (Representative Concentration Pathway) 8.5 scenario (2076–2100) tend to produce heavier rainfall with higher frequency and longer days than events from the historical run (1981–2005). We also find that the dependence between extreme precipitation events has a shorter spatial range, within localized areas in California, under the high future emissions scenario than under the historical run.
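A toy version of threshold-plus-connected-component feature detection, the general idea behind detectors of this kind though not TECA's actual criteria, can be sketched as follows; TECA additionally applies shape, length, and landfall tests that are omitted here, and the threshold value is illustrative.

```python
import numpy as np

def detect_ar_candidates(iwv, threshold=20.0, min_cells=50):
    """Flag connected regions of high moisture in a 2-D integrated water
    vapour field (kg/m^2) and keep those above a minimum size. Returns the
    labels of the surviving components."""
    mask = iwv > threshold
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            if mask[i, j] and labels[i, j] == 0:
                current += 1
                stack = [(i, j)]                 # flood-fill one component
                while stack:
                    a, b = stack.pop()
                    if (0 <= a < mask.shape[0] and 0 <= b < mask.shape[1]
                            and mask[a, b] and labels[a, b] == 0):
                        labels[a, b] = current
                        stack += [(a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)]
    sizes = np.bincount(labels.ravel())[1:]      # cells per component
    return [k + 1 for k, s in enumerate(sizes) if s >= min_cells]
```

Applying such a detector frame by frame across a CMIP5 run yields the AR event catalog from which precipitation statistics like those above can be compiled.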
Modeling extreme (Carrington-type) space weather events using three-dimensional MHD code simulations
NASA Astrophysics Data System (ADS)
Ngwira, C. M.; Pulkkinen, A. A.; Kuznetsova, M. M.; Glocer, A.
2013-12-01
There is growing concern over possible severe societal consequences related to adverse space weather impacts on man-made technological infrastructure and systems. In the last two decades, significant progress has been made towards the modeling of space weather events. Three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition, and have played a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for existing global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events with a ground footprint comparable to (or larger than) that of the Carrington superstorm. Results are presented for an initial simulation run with "very extreme" constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated ground induced geoelectric field to such extreme driving conditions. We also discuss the results and what they might mean for the accuracy of the simulations. The model is further tested using input data for an observed space weather event to verify the MHD model's consistency and to draw guidance for future work. This extreme space weather MHD model is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in earth conductors such as power transmission grids.
NASA Astrophysics Data System (ADS)
Ushio, Toshimitsu; Takai, Shigemasa
Supervisory control is a general framework for the logical control of discrete event systems. A supervisor assigns a set of disabled controllable events based on observed events so that the controlled discrete event system generates specified languages. In conventional supervisory control, it is assumed that observed events are determined deterministically by internal events. This assumption does not hold, however, in a discrete event system with sensor errors or in a mobile system, where each observed event depends not only on an internal event but also on the state just before the occurrence of that internal event. In this paper, we model such a discrete event system by a Mealy automaton with a nondeterministic output function. We introduce two kinds of supervisors: one assigns each control action based on a permissive policy, and the other based on an anti-permissive one. We show necessary and sufficient conditions for the existence of each supervisor. Moreover, we discuss the relationship between the supervisors in the case that the output function is deterministic.
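The two policies can be illustrated on a toy Mealy automaton with a nondeterministic output function. This is a sketch of the idea only, not the paper's formal construction: the supervisor tracks the set of states consistent with an observation, then enables a controllable event if it is legal in some consistent state (permissive) or in all of them (anti-permissive).

```python
def consistent_states(states, delta, out, observation):
    """States the system may occupy after the observed output symbol,
    starting from any state in `states`. delta: (state, event) -> state;
    out: (state, event) -> set of possible output symbols."""
    nxt = set()
    for q in states:
        for e in (ev for (s, ev) in delta if s == q):
            if observation in out[(q, e)]:
                nxt.add(delta[(q, e)])
    return nxt

def control_action(belief, legal, controllable, permissive=True):
    """Enable a controllable event under the permissive policy if it is
    legal in SOME consistent state; anti-permissive requires ALL."""
    quant = any if permissive else all
    return {e for e in controllable
            if quant(e in legal[q] for q in belief)}
```

When the output function is deterministic the belief set collapses to a singleton and the two policies coincide, matching the relationship discussed in the abstract.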
Analysis and design of randomised clinical trials involving competing risks endpoints.
Tai, Bee-Choo; Wee, Joseph; Machin, David
2011-05-19
In randomised clinical trials involving time-to-event outcomes, the failures concerned may be events of an entirely different nature and as such define a classical competing risks framework. In designing and analysing clinical trials involving such endpoints, it is important to account for the competing events, and evaluate how each contributes to the overall failure. An appropriate choice of statistical model is important for adequate determination of sample size. We describe how competing events may be summarised in such trials using cumulative incidence functions and Gray's test. The statistical modelling of competing events using proportional cause-specific and subdistribution hazard functions, and the corresponding procedures for sample size estimation are outlined. These are illustrated using data from a randomised clinical trial (SQNP01) of patients with advanced (non-metastatic) nasopharyngeal cancer. In this trial, treatment has no effect on the competing event of loco-regional recurrence. Thus the effects of treatment on the hazard of distant metastasis were similar via both the cause-specific (unadjusted csHR = 0.43, 95% CI 0.25 - 0.72) and subdistribution (unadjusted subHR = 0.43, 95% CI 0.25 - 0.76) hazard analyses, in favour of concurrent chemo-radiotherapy followed by adjuvant chemotherapy. Adjusting for nodal status and tumour size did not alter the results. The results of the logrank test (p = 0.002) comparing the cause-specific hazards and the Gray's test (p = 0.003) comparing the cumulative incidences also led to the same conclusion. However, the subdistribution hazard analysis requires many more subjects than the cause-specific hazard analysis to detect the same magnitude of effect. The cause-specific hazard analysis is appropriate for analysing competing risks outcomes when treatment has no effect on the cause-specific hazard of the competing event. It requires fewer subjects than the subdistribution hazard analysis for a similar effect size.
However, if the main and competing events are influenced in opposing directions by an intervention, a subdistribution hazard analysis may be warranted.
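The cumulative incidence function mentioned above has a standard nonparametric estimator that can be computed directly from (time, cause) data: at each event time, the mass falling to the cause of interest is the overall event-free survival just before that time multiplied by that cause's hazard increment. A minimal sketch (cause code 0 denotes censoring):

```python
def cumulative_incidence(times, events, cause):
    """Nonparametric cumulative incidence for one competing cause.
    events[i] is 0 for censoring, otherwise the cause code of failure i.
    Returns (time, CIF) pairs at each event time of `cause`."""
    data = sorted(zip(times, events))
    n = len(data)
    surv, cif, out = 1.0, 0.0, []
    i = 0
    while i < n:
        t = data[i][0]
        d_cause = sum(1 for tt, e in data if tt == t and e == cause)
        d_all = sum(1 for tt, e in data if tt == t and e != 0)
        at_risk = sum(1 for tt, _ in data if tt >= t)
        cif += surv * d_cause / at_risk      # mass falling to this cause
        surv *= 1.0 - d_all / at_risk        # overall event-free survival
        if d_cause:
            out.append((t, cif))
        while i < n and data[i][0] == t:     # advance past tied times
            i += 1
    return out
```

Unlike one minus the cause-specific Kaplan-Meier curve, this estimator correctly treats competing failures as removing subjects from risk rather than censoring them, which is why the CIFs over all causes sum to the overall failure probability.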
Recent Advances in Macrocyclic Fluorescent Probes for Ion Sensing.
Wong, Joseph K-H; Todd, Matthew H; Rutledge, Peter J
2017-01-25
Small-molecule fluorescent probes play a myriad of important roles in chemical sensing. Many such systems have been developed that incorporate a receptor component designed to recognise and bind a specific analyte, and a reporter or transducer component which signals the binding event with a change in fluorescence output. Fluorescent probes use a variety of mechanisms to transmit the binding event to the reporter unit, including photoinduced electron transfer (PET), charge transfer (CT), Förster resonance energy transfer (FRET), excimer formation, and aggregation-induced emission (AIE) or aggregation-caused quenching (ACQ). These systems respond to a wide array of potential analytes including protons, metal cations, anions, carbohydrates, and other biomolecules. This review surveys important new fluorescence-based probes for these and other analytes that have been reported over the past five years, focusing on the most widely exploited macrocyclic recognition components, those based on cyclam, calixarenes, cyclodextrins and crown ethers; other macrocyclic and non-macrocyclic receptors are also discussed.
Naver: a PC-cluster-based VR system
NASA Astrophysics Data System (ADS)
Park, ChangHoon; Ko, HeeDong; Kim, TaiYun
2003-04-01
In this paper, we present NAVER, a new framework for virtual reality applications. NAVER is based on a cluster of low-cost personal computers. Its goal is to provide a flexible, extensible, scalable and re-configurable framework for virtual environments, defined as the integration of 3D virtual space with external modules; external modules are various input or output devices and applications on remote hosts. From a system perspective, the personal computers are divided into three servers according to their specific functions: Render Server, Device Server and Control Server. The Device Server contains external modules requiring event-based communication for the integration, while the Control Server contains external modules requiring synchronous communication every frame. The Render Server consists of five managers: Scenario Manager, Event Manager, Command Manager, Interaction Manager and Sync Manager. These managers support the declaration and operation of the virtual environment and the integration with external modules on remote servers.
A software bus for thread objects
NASA Technical Reports Server (NTRS)
Callahan, John R.; Li, Dehuai
1995-01-01
The authors have implemented a software bus for lightweight threads in an object-oriented programming environment that allows for rapid reconfiguration and reuse of thread objects in discrete-event simulation experiments. While previous research in object-oriented, parallel programming environments has focused on direct communication between threads, our lightweight software bus, called the MiniBus, provides a means to isolate threads from their contexts of execution by restricting communications between threads to message-passing via their local ports only. The software bus maintains a topology of connections between these ports. It routes, queues, and delivers messages according to this topology. This approach allows for rapid reconfiguration and reuse of thread objects in other systems without making changes to the specifications or source code. A layered approach that provides the needed transparency to developers is presented. Examples of using the MiniBus are given, and the value of bus architectures in building and conducting simulations of discrete-event systems is discussed.
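The port-and-topology routing described above can be sketched in a few lines: threads never address each other directly, and reconfiguration means editing connections only, with no change to thread code. Class and method names here are illustrative, not the original MiniBus interface.

```python
from collections import deque

class MiniBusSketch:
    """Sketch of a port-based software bus: messages sent from a source
    port are queued on every destination port the topology connects it to."""
    def __init__(self):
        self.topology = {}        # source port -> list of destination ports
        self.queues = {}          # destination port -> pending messages

    def connect(self, src, dst):
        self.topology.setdefault(src, []).append(dst)
        self.queues.setdefault(dst, deque())

    def send(self, src, message):
        for dst in self.topology.get(src, []):   # route per topology
            self.queues[dst].append(message)

    def receive(self, port):
        q = self.queues.get(port)
        return q.popleft() if q else None        # None when nothing pending

# Reconfiguration = editing connections; the sending thread is unchanged.
bus = MiniBusSketch()
bus.connect("sim.out", "logger.in")
bus.send("sim.out", {"event": "arrival", "t": 3.2})
```

Because a thread sees only its local ports, the same thread object can be rewired into a different simulation topology without touching its source, which is the reuse property the abstract emphasises.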
The brain's default network: origins and implications for the study of psychosis.
Buckner, Randy L
2013-09-01
The brain's default network is a set of regions that is spontaneously active during passive moments. The network is also active during directed tasks that require participants to remember past events or imagine upcoming events. One hypothesis is that the network facilitates construction of mental models (simulations) that can be used adaptively in many contexts. Extensive research has considered whether disruption of the default network may contribute to disease. While an intriguing possibility, a specific challenge to this notion is the fact that it is difficult to accurately measure the default network in patients where confounds of head motion and compliance are prominent. Nonetheless, some intriguing recent findings suggest that dysfunctional interactions between frontoparietal control systems and the default network contribute to psychosis. Psychosis may be a network disturbance that manifests as disordered thought partly because it disrupts the fragile balance between the default network and competing brain systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, G.D.; Kukielka, C.A.; Olson, L.M.
The engineering analysis group is responsible for all nuclear plant systems analysis and reactor analysis activities, excluding fuel management analysis, at Pennsylvania Power and Light Company. These activities include making pretest and posttest predictions of startup tests; analyzing unplanned or unexpected transient events; providing technical training to plant personnel; assisting in the development of emergency drill scenarios; providing engineering evaluations to support design and technical specification changes; and evaluating, assessing, and resolving a number of license conditions. Many of these activities have required the direct use of RETRAN models. Two RETRAN analyses that were completed to support plant operations - a pretest analysis of the turbine trip startup test, and a posttest analysis of the loss of startup transformer event - are investigated. For each case, RETRAN results are compared with available plant data and comparisons are drawn on the acceptability of the performance of the plant systems.
Isotopic And Geochemical Investigations Of Meteorites
NASA Technical Reports Server (NTRS)
Walker, Richard J.
2005-01-01
The primary goals of our research over the past four years were to constrain the timing of certain early planetary accretion/differentiation events, and to constrain the proportions and provenance of materials involved in these processes. This work was achieved via the analysis and interpretation of long- and short-lived isotope systems, and the study of certain trace elements. Our research targeted these goals primarily via the application of the Re-187–Os-187, Pt-190–Os-186, Tc-98–Ru-98, and Tc-99–Ru-99 isotopic systems, and the determination/modeling of abundances of the highly siderophile elements (HSE; including Re, Os, Ir, Ru, Pd, Pt, and possibly Tc). The specific events we examined include the segregation and crystallization histories of asteroidal cores, the accretion and metamorphic histories of chondrites and chondrite components, and the accretionary and differentiation histories of Mars and the Moon.
On the complex quantification of risk: systems-based perspective on terrorism.
Haimes, Yacov Y
2011-08-01
This article highlights the complexity of the quantification of the multidimensional risk function, develops five systems-based premises on quantifying the risk of terrorism to a threatened system, and advocates the quantification of vulnerability and resilience through the states of the system. The five premises are: (i) There exists interdependence between a specific threat to a system by terrorist networks and the states of the targeted system, as represented through the system's vulnerability, resilience, and criticality-impact. (ii) A specific threat, its probability, its timing, the states of the targeted system, and the probability of consequences can be interdependent. (iii) The two questions in the risk assessment process: "What is the likelihood?" and "What are the consequences?" can be interdependent. (iv) Risk management policy options can reduce both the likelihood of a threat to a targeted system and the associated likelihood of consequences by changing the states (including both vulnerability and resilience) of the system. (v) The quantification of risk to a vulnerable system from a specific threat must be built on a systemic and repeatable modeling process, by recognizing that the states of the system constitute an essential step to construct quantitative metrics of the consequences based on intelligence gathering, expert evidence, and other qualitative information. The fact that the states of all systems are functions of time (among other variables) makes the time frame pivotal in each component of the process of risk assessment, management, and communication. Thus, risk to a system, caused by an initiating event (e.g., a threat) is a multidimensional function of the specific threat, its probability and time frame, the states of the system (representing vulnerability and resilience), and the probabilistic multidimensional consequences. © 2011 Society for Risk Analysis.
Impact of Fast Charging on Life of EV Batteries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neubauer, Jeremy; Wood, Eric; Burton, Evan
2015-05-03
Utilization of public charging infrastructure is heavily dependent on user-specific travel behavior. The availability of fast chargers can positively affect the utility of battery electric vehicles, even given infrequent use. Estimated utilization rates do not appear frequent enough to significantly impact battery life. Battery thermal management systems are critical in mitigating dangerous thermal conditions on long distance tours with multiple fast charge events.
2013-05-01
specifics of the correlation will be explored, followed by discussion of the new paradigms that result from it: the ordered event list (OEL) and the decision tree.
Understanding the Lives of Mà'dí Men and Women through the Names They Give Their Children
ERIC Educational Resources Information Center
Shoemaker, Jack C.
2012-01-01
This study treats Mà'dí naming patterns, a system of names which developed in a specific socio-religious and political environment that practiced patrilocal residence and relied primarily on farming for subsistence. Most Mà'dí names are social commentary names, recalling some event or circumstance parents experienced around the time a child was…
Adolescent Depression and Negative Life Events, the Mediating Role of Cognitive Emotion Regulation
Stikkelbroek, Yvonne; Bodden, Denise H. M.; Kleinjan, Marloes; Reijnders, Mirjam; van Baar, Anneloes L.
2016-01-01
Background Depression during adolescence is a serious mental health problem. Difficulties in regulating evoked emotions after stressful life events are considered to lead to depression. This study examined whether depressive symptoms were mediated by various cognitive emotion regulation strategies after stressful life events, more specifically, the loss of a loved one, health threats, or relational challenges. Methods We used a sample of 398 adolescents (Mage = 16.94, SD = 2.90), including 52 depressed outpatients, who all reported stressful life event(s). Path analyses in Mplus were used to test mediation, for the whole sample as well as separately for participants scoring high versus low on depression, using multigroup analyses. Results Health threats and relationally challenging stressful life events were associated with depressive symptoms, while loss was not. More frequent use of maladaptive strategies was related to more depressive symptoms. More frequent use of adaptive strategies was related to fewer depressive symptoms. Specific life events were associated with specific emotion regulation strategies. The relationship between challenging, stressful life events and depressive symptoms in the whole group was mediated by maladaptive strategies (self-blame, catastrophizing and rumination). No mediation effect was found for adaptive strategies. Conclusion The association between relationally challenging, stressful life events and depressive symptoms was mediated by maladaptive cognitive emotion regulation strategies. PMID:27571274
Configuration-specific kinetic theory applied to an ideal binary gas mixture.
Wiseman, Floyd L
2006-10-05
This paper is the second in a two-part series dealing with the configuration-specific analyses for molecular collision events of hard, spherical molecules at thermal equilibrium. The first paper analyzed a single-component system, and the reader is referred to it for the fundamental concepts. In this paper, the expressions for the configuration-specific collision frequencies and the average line-of-centers collision angles and speeds are derived for an ideal binary gas mixture. The analyses show that the average line-of-centers quantities are all dependent upon the ratio of the masses of the two components, but not upon molecular size. Of course, the configuration-specific collision frequencies do depend on molecular size. The expression for the overall binary collision frequency is a simple sum of the configuration-specific collision frequencies and is identical to the conventional expression.
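For reference, the conventional overall binary collision frequency mentioned in the closing sentence has the standard hard-sphere form (a textbook result; the symbols below are ours and may differ from the paper's notation):

```latex
% Collision frequency per unit volume between species A and B
% (hard spheres at thermal equilibrium)
Z_{AB} \;=\; n_A\, n_B\, \pi d_{AB}^{2}
        \left( \frac{8 k_B T}{\pi \mu} \right)^{1/2},
\qquad
d_{AB} = \frac{d_A + d_B}{2},
\qquad
\mu = \frac{m_A m_B}{m_A + m_B}
```

The mean relative-speed factor depends only on the reduced mass, consistent with the abstract's observation that the average line-of-centers quantities depend on the mass ratio of the two components, while molecular size enters only through the cross section.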
Extended specificity studies of mRNA assays used to infer human organ tissues and body fluids.
van den Berge, Margreet; Sijen, Titia
2017-12-01
Messenger RNA (mRNA) profiling is a technique increasingly applied for the forensic identification of body fluids and skin. More recently, an mRNA-based organ typing assay was developed which allows for the inference of brain, lung, liver, skeletal muscle, heart, kidney, and skin tissue. When applying this organ typing system in forensic casework, the presence of animal, rather than human, tissue is an alternative scenario that may be proposed, for instance that bullets carry cell material from a hunting event. Even though mRNA profiling systems are commonly designed in silico to be primate specific, physical testing against other animal species is generally limited. In this study, the human specificity of the organ tissue inferring system was assessed against organ tissue RNAs of various animals. The results confirm the human specificity of the system, especially when utilizing interpretation rules that consider multiple markers per cell type. In addition, we cross-tested our organ and body fluid mRNA assays against the target types covered by the other assay. Marker expression in the nontarget organ tissues and body fluids was observed to a limited extent, which emphasizes the importance of involving the case-specific context of forensic samples when deciding which mRNA profiling assay to use and when interpreting its results. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Using waveform cross correlation for automatic recovery of aftershock sequences
NASA Astrophysics Data System (ADS)
Bobrov, Dmitry; Kitov, Ivan; Rozhkov, Mikhail
2017-04-01
Aftershock sequences of the largest earthquakes are difficult to recover. There can be several hundred mid-sized aftershocks per hour within a few hundred km of each other recorded by the same stations. Moreover, these events generate thousands of reflected/refracted phases having azimuth and slowness close to those of the P-waves. Therefore, aftershock sequences with thousands of events represent a major challenge for automatic and interactive processing at the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Standard methods of detection and phase association do not use all the information contained in signals. As a result, wrong association of the first and later phases, both regular and site specific, produces an enormous number of wrong event hypotheses and destroys valid event hypotheses in automatic IDC processing. In turn, the IDC analysts have to reject false hypotheses and recreate valid ones, wasting precious human resources. At the current level of IDC catalogue completeness, the method of waveform cross correlation (WCC) can resolve most detection and association problems by fully utilizing the similarity of waveforms generated by aftershocks. Array seismic stations of the International Monitoring System (IMS) can enhance the performance of the WCC method: reduce station-specific detection thresholds, allow accurate estimates of signal attributes, including relative magnitude, and effectively suppress irrelevant arrivals. We have developed and tested a prototype of an aftershock tool matching all IDC processing requirements and merged it with the current IDC pipeline.
This tool includes the creation of master events consisting of real or synthetic waveform templates at ten or more IMS stations; cross correlation (CC) of real-time waveforms with these templates; association of arrivals detected on CC-traces into event hypotheses; building events matching the IDC quality criteria; and resolution of conflicts between event hypotheses created by neighboring master events. The final cross-correlation standard event list (XSEL) is a starting point for interactive analysis with standard tools. We present selected results for the largest earthquakes, such as Sumatra 2004 and Tohoku 2011, as well as for several smaller events with hundreds of aftershocks. The sensitivity and resolution of the aftershock tool are demonstrated using the example of an mb = 2.2 aftershock found after the September 9, 2016 DPRK test.
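The core of template matching in the WCC approach is a sliding normalized cross-correlation between a master-event template and the continuous trace. A minimal single-channel sketch (illustrative only; the IDC pipeline operates on multichannel array data, with detection thresholds and phase association layered on top):

```python
import math

def pearson(x, y):
    """Normalized (Pearson) correlation between two equal-length windows."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    denom = math.sqrt(vx * vy)
    return cov / denom if denom > 0 else 0.0  # flat windows carry no signal

def scan(trace, template):
    """Slide the master-event template along the trace; return the lag
    with the highest correlation coefficient as (best_lag, best_cc)."""
    m = len(template)
    best = (0, -1.0)
    for lag in range(len(trace) - m + 1):
        cc = pearson(trace[lag:lag + m], template)
        if cc > best[1]:
            best = (lag, cc)
    return best
```

A detection is declared when the CC value at some lag exceeds a station-specific threshold; because the statistic is normalized, the threshold is independent of absolute amplitude, which is what lets templates pick out small aftershocks.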
Sessler, Nelson E; Walker, Ekaterina; Chickballapur, Harsha; Kacholakalayil, James; Coplan, Paul M
2017-01-01
Positive-controlled clinical studies have shown a dose-dependent effect of the buprenorphine transdermal system on QTc interval prolongation. This study provides an assessment of the buprenorphine transdermal system and cardiac arrhythmia using US FDA and WHO postmarketing reporting databases. We performed disproportionality analyses of spontaneously reported adverse events to assess whether the reporting rate of cardiac arrhythmia events was disproportionately elevated relative to expected rates of reporting in both the FDA and WHO databases. Cardiac arrhythmia events were identified using the standardized Medical Dictionary for Regulatory Activities query for torsade de pointes and/or QT prolongation (TdP/QTP). The threshold for a signal of disproportionate adverse event reporting was defined as a lower 90% confidence limit ≥2 of the Empiric Bayes geometric mean in the FDA database and as a lower 95% confidence limit of the Information Component >0 in the WHO database. There were 124 (<1%) and 77 (2%) cardiac arrhythmia event cases associated with buprenorphine transdermal, as compared to 3206 (12%) and 2913 (14%) involving methadone, in the FDA and WHO databases, respectively. In the FDA database, methadone was associated with a signal of disproportionate reporting for TdP/QTP (EB05 3.26); however, buprenorphine transdermal was not (EB05 0.33). In the WHO database, methadone was associated with a signal of disproportionate reporting for TdP/QTP (IC025 2.66); however, buprenorphine transdermal was not (IC025 -0.88). Similar trends were observed in sensitivity analyses by age, gender, and specific terms related to ventricular arrhythmia. The signal identified in the transdermal buprenorphine thorough QTc study, which led to a dose limitation in its US label, does not translate into a signal of increased risk for cardiac arrhythmia in real-world use, as assessed by this method of analyzing post-market surveillance data.
LAN attack detection using Discrete Event Systems.
Hubballi, Neminath; Biswas, Santosh; Roopa, S; Ratti, Ritesh; Nandi, Sukumar
2011-01-01
Address Resolution Protocol (ARP) is used for determining the link layer or Medium Access Control (MAC) address of a network host, given its Internet Layer (IP) or Network Layer address. ARP is a stateless protocol, and any IP-MAC pairing sent by a host is accepted without verification. This weakness in ARP may be exploited by malicious hosts in a Local Area Network (LAN) by spoofing IP-MAC pairs. Several schemes have been proposed in the literature to circumvent these attacks; however, these techniques either make IP-MAC pairing static, modify the existing ARP, or require patching the operating systems of all hosts. In this paper we propose a Discrete Event System (DES) approach to an Intrusion Detection System (IDS) for LAN-specific attacks that does not require any extra constraints such as static IP-MAC pairings or changes to the ARP. A DES model is built for the LAN under both normal and compromised (i.e., spoofed request/response) situations based on the sequences of ARP-related packets. Sequences of ARP events in normal and spoofed scenarios are similar, rendering the same DES models for both cases. To create different ARP events under normal and spoofed conditions, the proposed technique uses active ARP probing. However, this probing adds extra ARP traffic to the LAN. A DES detector is then built to determine, from observed ARP-related events, whether the LAN is operating under a normal or compromised situation. The scheme also minimizes extra ARP traffic by probing the source IP-MAC pair of only those ARP packets which are yet to be determined as genuine/spoofed by the detector. Also, spoofed IP-MAC pairs determined by the detector are stored in tables to detect other LAN attacks triggered by spoofing, such as man-in-the-middle (MiTM) and denial-of-service attacks. The scheme is successfully validated in a test bed. Copyright © 2010 ISA. Published by Elsevier Ltd. All rights reserved.
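The gist of the active-probing idea can be sketched as a toy event-driven detector: probe an unverified IP and watch which MACs answer. One consistent responder confirms the binding; conflicting responders indicate spoofing. The class and method names below are ours, and this sketch omits the paper's formal DES machinery:

```python
class ArpProbeDetector:
    """Toy detector inspired by active ARP probing (illustrative sketch,
    not the paper's actual DES model)."""

    def __init__(self):
        self.verified = {}   # ip -> trusted MAC, established by probing
        self.pending = {}    # ip -> set of MACs seen in probe replies

    def observe(self, ip, mac):
        """Feed one ARP reply event; return a verdict or None (ambiguous)."""
        if ip in self.verified:
            return "genuine" if self.verified[ip] == mac else "spoofed"
        self.pending.setdefault(ip, set()).add(mac)
        if len(self.pending[ip]) > 1:
            return "spoofed"     # two hosts claim the same IP: conflict
        return None              # single claim so far: keep probing

    def confirm(self, ip):
        """Probe window closed with a single responder: trust the binding.
        Verified pairs need no further probing, limiting extra traffic."""
        (mac,) = self.pending.pop(ip)
        self.verified[ip] = mac
```

Caching verified pairs mirrors the paper's point about minimizing extra ARP traffic: only bindings not yet classified as genuine or spoofed trigger a probe.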
Mechanical ventilation in disaster situations: a new paradigm using the AGILITIES Score System.
Wilkens, Eric P; Klein, Gary M
2010-01-01
The failure of life-critical systems such as mechanical ventilators in the wake of a pandemic or a disaster may result in death, and therefore, state and federal government agencies must have precautions in place to ensure availability, reliability, and predictability through comprehensive preparedness and response plans. All 50 state emergency preparedness response plans were extensively examined for the attention given to the critically injured and ill patient population during a pandemic or mass casualty event. Public health authorities of each state were contacted as well. Nine of 51 state plans (17.6 percent) included a plan or committee for mechanical ventilation triage and management in a pandemic influenza event. All 51 state plans relied on the Centers for Disease Control and Prevention Flu Surge 2.0 spreadsheet to provide estimates for their influenza planning. In the absence of more specific guidance, the authors have developed and provided guidelines recommended for ventilator triage and the implementation of the AGILITIES Score in the event of a pandemic, mass casualty event, or other catastrophic disaster. The authors present and describe the AGILITIES Score Ventilator Triage System and provide related guidelines to be adopted uniformly by government agencies and hospitals. This scoring system and the set of guidelines are to be used in disaster settings, such as Hurricane Katrina, and are based on three key factors: relative health, duration of time on mechanical ventilation, and patients' use of resources during a disaster. For any event requiring large numbers of ventilators for patients, the United States is woefully unprepared.
The deficiencies in this aspect of preparedness include (1) a lack of accountability for physical ventilators, (2) a lack of understanding of how healthcare professionals can safely operate these ventilators, (3) a lack of understanding of where additional ventilator resources exist, and (4) the lack of a triage strategy to provide ventilator support to those patients with the greatest chances of survival.
Fault tree analysis: NiH2 aerospace cells for LEO mission
NASA Technical Reports Server (NTRS)
Klein, Glenn C.; Rash, Donald E., Jr.
1992-01-01
The Fault Tree Analysis (FTA) is one of several reliability analyses or assessments applied to battery cells to be utilized in typical Electric Power Subsystems for spacecraft in low Earth orbit missions. FTA is generally the process of reviewing and analytically examining a system or equipment in such a way as to emphasize the lower level fault occurrences which directly or indirectly contribute to the major fault or top level event. This qualitative FTA addresses the potential of occurrence for five specific top level events: hydrogen leakage through either discrete leakage paths or through pressure vessel rupture; and four distinct modes of performance degradation - high charge voltage, suppressed discharge voltage, loss of capacity, and high pressure.
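Quantitatively, a fault tree combines basic-event probabilities through AND/OR gates up to the top event. A minimal sketch, assuming independent basic events and hypothetical probabilities (the report above is qualitative and gives no numbers; the leakage-path values below are invented for illustration):

```python
def p_or(*ps):
    """P(at least one of several independent basic events occurs),
    as evaluated at an OR gate."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*ps):
    """P(all independent basic events occur), as evaluated at an AND gate."""
    q = 1.0
    for p in ps:
        q *= p
    return q

# Hypothetical per-mission basic-event probabilities (not from the report):
p_seal_leak = 1e-4
p_weld_leak = 5e-5
p_vessel_rupture = 1e-6

# Top event "hydrogen leakage" via discrete leak paths OR vessel rupture:
p_hydrogen_leak = p_or(p_seal_leak, p_weld_leak, p_vessel_rupture)
```

For rare events the OR-gate result is close to the simple sum of the inputs, which is why rare-event approximations are common in quantitative FTA.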
Brillouin scattering-induced rogue waves in self-pulsing fiber lasers
Hanzard, Pierre-Henry; Talbi, Mohamed; Mallek, Djouher; Kellou, Abdelhamid; Leblond, Hervé; Sanchez, François; Godin, Thomas; Hideur, Ammar
2017-01-01
We report the experimental observation of extreme instabilities in a self-pulsing fiber laser under the influence of stimulated Brillouin scattering (SBS). Specifically, we observe temporally localized structures with high intensities that can be referred to as rogue events through their statistical behaviour with highly-skewed intensity distributions. The emergence of these SBS-induced rogue waves is attributed to the interplay between laser operation and resonant Stokes orders. As this behaviour is not accounted for by existing models, we also present numerical simulations showing that such instabilities can be observed in chaotic laser operation. This study opens up new possibilities towards harnessing extreme events in highly-dissipative systems through adapted laser cavity configurations. PMID:28374840
Auxin acts as a local morphogenetic trigger to specify lateral root founder cells
Dubrovsky, Joseph G.; Sauer, Michael; Napsucialy-Mendivil, Selene; Ivanchenko, Maria G.; Friml, Jiří; Shishkova, Svetlana; Celenza, John; Benková, Eva
2008-01-01
Plants exhibit an exceptional adaptability to different environmental conditions. To a large extent, this adaptability depends on their ability to initiate and form new organs throughout their entire postembryonic life. Plant shoot and root systems unceasingly branch and form axillary shoots or lateral roots, respectively. The first event in the formation of a new organ is specification of founder cells. Several plant hormones, prominent among them auxin, have been implicated in the acquisition of founder cell identity by differentiated cells, but the mechanisms underlying this process are largely elusive. Here, we show that auxin and its local accumulation in root pericycle cells is a necessary and sufficient signal to respecify these cells into lateral root founder cells. Analysis of the alf4–1 mutant suggests that specification of founder cells and the subsequent activation of cell division leading to primordium formation represent two genetically separable events. Time-lapse experiments show that the activation of an auxin response is the earliest detectable event in founder cell specification. Accordingly, local activation of auxin response correlates absolutely with the acquisition of founder cell identity and precedes the actual formation of a lateral root primordium through patterned cell division. Local production and subsequent accumulation of auxin in single pericycle cells induced by Cre-Lox-based activation of auxin synthesis converts them into founder cells. Thus, auxin is the local instructive signal that is sufficient for acquisition of founder cell identity and can be considered a morphogenetic trigger in postembryonic plant organogenesis. PMID:18559858
Ojha, Rohit P; Jackson, Bradford E; Tota, Joseph E; Offutt-Powell, Tabatha N; Singh, Karan P; Bae, Sejong
2014-01-01
Post-marketing surveillance studies provide conflicting evidence about whether Guillain–Barre syndrome occurs more frequently following quadrivalent human papillomavirus (HPV4) vaccination. We aimed to assess whether Guillain–Barre syndrome is reported more frequently following HPV4 vaccination than other vaccinations among females and males aged 9 to 26 y in the United States. We used adverse event reports received by the United States Vaccine Adverse Event Reporting System (VAERS) between January 1, 2010 and December 31, 2012 to estimate overall, age-, and sex-specific proportional reporting ratios (PRRs) and corresponding Χ2 values for reports of Guillain–Barre syndrome between 5 and 42 d following HPV vaccination. Minimum criteria for a signal using this approach are 3 or more cases, PRR ≥2, and Χ2 ≥ 4. Guillain–Barre syndrome was listed as an adverse event in 45 of 14 822 reports, of which 9 reports followed HPV4 vaccination and 36 reports followed all other vaccines. The overall, age-, and sex-specific PRR estimates were uniformly below 1. In addition, the overall, age-, and sex-specific Χ2 values were uniformly below 3. Our analysis of post-marketing surveillance data does not suggest that Guillain–Barre syndrome is reported more frequently following HPV4 vaccination than other vaccinations among vaccine-eligible females or males in the United States. Our findings may be useful when discussing the risks and benefits of HPV4 vaccination. PMID:24013368
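The PRR screening criteria described above (at least 3 cases, PRR ≥ 2, Χ² ≥ 4) can be computed from a 2x2 contingency table of reports. A sketch with hypothetical counts (the abstract reports 9 HPV4 and 36 comparator Guillain-Barre cases among 14,822 reports, but not the per-vaccine report denominators, so the denominators below are invented for illustration):

```python
def prr(a, b, c, d):
    """Proportional reporting ratio.
    a: target event reports after target vaccine
    b: all other event reports after target vaccine
    c: target event reports after comparator vaccines
    d: all other event reports after comparator vaccines"""
    return (a / (a + b)) / (c / (c + d))

def chi2(a, b, c, d):
    """Pearson chi-square for the 2x2 table (no continuity correction)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def is_signal(a, b, c, d):
    """Minimum criteria used in the study: >=3 cases, PRR >= 2, chi2 >= 4."""
    return a >= 3 and prr(a, b, c, d) >= 2 and chi2(a, b, c, d) >= 4
```

With the PRR uniformly below 1, as reported, the target event is proportionally less common after the target vaccine than after comparators, so no signal is raised.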
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peach, D.F.
1987-12-01
Fiber optic telecommunication systems are susceptible to both natural and man-made stress. National Security/Emergency Preparedness (NSEP) is a function of how durable these systems are in light of projected levels of stress. Emergency Preparedness in 1987 is not just a matter of--can they deliver food, water, energy and other essentials--but can they deliver the vital information necessary to maintain corporate function of our country. 'Communication stamina' is a function of 'probability of survival' when faced with stress. This report provides an overview of the enhancements to a fiber-optic communication system/installation that will increase durability. These enhancements are grouped, based on their value in protecting the system, such that a Multitier Specification is created that presents multiple levels of hardness. Mitigation of effects due to high-altitude electromagnetic pulse (HEMP) and gamma radiation, and protection from vandalism and weather events are discussed in the report. The report is presented in two volumes. Volume I presents the Multitier Specification in a format that is usable for management review. The attributes of specified physical parameters, and the levels of protection stated in Volume I, are discussed in more detail in Volume II.
Dynamic event tree analysis with the SAS4A/SASSYS-1 safety analysis code
Jankovsky, Zachary K.; Denman, Matthew R.; Aldemir, Tunc
2018-02-02
The consequences of a transient in an advanced sodium-cooled fast reactor are difficult to capture with the traditional approach to probabilistic risk assessment (PRA). Numerous safety-relevant systems are passive and may have operational states that cannot be represented by binary success or failure. In addition, the specific order and timing of events may be crucial, which necessitates the use of dynamic PRA tools such as ADAPT. The modifications to the SAS4A/SASSYS-1 sodium-cooled fast reactor safety analysis code for linking it to ADAPT to perform a dynamic PRA are described. A test case is used to demonstrate the linking process and to illustrate the type of insights that may be gained with this process. Finally, newly-developed dynamic importance measures are used to assess the significance of reactor parameters/constituents on calculated consequences of initiating events.
Recent Advancements in the Global Understanding of what Drives Heatwaves
NASA Astrophysics Data System (ADS)
Perkins-Kirkpatrick, S.
2016-12-01
Heatwaves, defined as prolonged periods of extreme heat, are disastrous events that impact human, natural and industrial systems all over the world. In recent years, the global research effort has greatly increased our understanding of how to quantify heatwaves and of how they have changed, what drives them, and their future projections. This talk will summarize critical developments made in this field, with particular emphasis on the physical driving mechanisms and the role of internal climate variability. Case studies from various global regions will illustrate both similarities and differences in the physical set-ups of these fascinating events. Future projections of heatwaves and the human contribution behind specific observed heatwave events will be briefly discussed. The talk will conclude by highlighting research priorities so that future investigation is well targeted and closes existing knowledge gaps on what drives heatwaves as effectively as possible. Such developments will ultimately aid the predictability of heatwaves, thus helping to reduce their devastating impacts.
Flood and Weather Monitoring Using Real-time Twitter Data Streams
NASA Astrophysics Data System (ADS)
Demir, I.; Sit, M. A.; Sermet, M. Y.
2016-12-01
Social media data is a widely used source for making inferences during public crises and disaster events. Specifically, since Twitter provides large-scale data publicly in real time, it is one of the most extensive resources with location information. This abstract provides an overview of a real-time Twitter analysis system to support flood preparedness and response using a comprehensive information-centric flood ontology and natural language processing. Within the scope of this project, we deal with the acquisition and processing of real-time Twitter data streams. The system fetches tweets with specified keywords and classifies them as related to flooding or heavy weather conditions. The system uses machine learning algorithms to discover patterns using the correlation between tweets and the Iowa Flood Information System's (IFIS) extensive resources. The system uses these patterns to forecast the formation and progress of a potential future flood event. While fetching tweets, predefined hashtags are used for filtering and enhancing the relevancy of selected tweets. With this project, tweets can also be used as an alternative data source where other data sources are not sufficient for specific tasks. During disasters, the photos that people upload alongside their tweets can be collected and placed at appropriate locations on a mapping system. This allows decision-making authorities and communities to see the most recent outlook of the disaster interactively. In case of an emergency, the concentration of tweets can help the authorities determine a strategy for how to reach people most efficiently while providing them the supplies they need. Thanks to the extendable nature of the flood ontology and framework, results from this project will be a guide for other natural disasters, and will be shared with the community.
Amir, Offer; Barak-Shinar, Deganit; Henry, Antonietta; Smart, Frank W
2012-02-01
Sleep-disordered breathing and Cheyne-Stokes breathing are often not diagnosed, especially in cardiovascular patients. An automated system based on photoplethysmographic signals might provide a convenient screening and diagnostic solution for patient evaluation at home or in an ambulatory setting. We compared event detection and classification obtained by full polysomnography (the 'gold standard') and by a new automated algorithm system in 74 subjects. Each subject underwent overnight polysomnography, 60 in a hospital cardiology department and 14 while being tested for suspected sleep-disordered breathing in a sleep laboratory. The sleep-disordered breathing and Cheyne-Stokes breathing parameters measured by the new automated algorithm system correlated very well with the corresponding results obtained by full polysomnography. The sensitivity of Cheyne-Stokes breathing detection by the system compared to full polysomnography was 92% [95% confidence interval (CI): 78.6-98.3%] and the specificity 94% (95% CI: 81.3-99.3%). Comparison of the Apnea-Hypopnea Index with a cutoff level of 15 shows a sensitivity of 98% (95% CI: 87.1-99.6%) and specificity of 96% (95% CI: 79.8-99.3%). The detection of respiratory events showed agreement of approximately 80%. Regression and Bland-Altman plots revealed good agreement between the two methods. Relative to gold-standard polysomnography, the simple-to-use automated system in this study yielded an acceptable analysis of sleep- and/or cardiac-related breathing disorders. Accordingly, and given the convenience and simplicity of its application, this system can be considered a suitable platform for home and ambulatory screening and diagnosis of sleep-disordered breathing in patients with cardiovascular disease. © 2011 European Sleep Research Society.
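The reported sensitivity and specificity are simple proportions from a 2x2 comparison against polysomnography. A sketch with hypothetical counts chosen to be consistent with the reported 92%/94% (the paper's exact confusion matrix is not given in the abstract, and a Wald interval is used here purely for illustration; the paper likely used an exact method):

```python
import math

def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 confusion matrix
    (tp/fn/tn/fp counted against the polysomnography reference)."""
    return tp / (tp + fn), tn / (tn + fp)

def wald_ci(p, n, z=1.96):
    """Normal-approximation 95% CI for a proportion p observed over n cases."""
    half = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - half), min(1.0, p + half)
```

For proportions near 1 with small n, the Wald interval can be poor, which is one reason exact (Clopper-Pearson) intervals are standard in diagnostic-accuracy reporting.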