Statistical analysis of life history calendar data.
Eerola, Mervi; Helske, Satu
2016-04-01
The life history calendar is a data-collection tool for obtaining reliable retrospective data about life events. To illustrate the analysis of such data, we compare model-based probabilistic event history analysis with a model-free data mining method, sequence analysis. In event history analysis, we estimate the cumulative prediction probabilities of life events over the entire trajectory rather than the transition hazards. In sequence analysis, we compare several dissimilarity metrics and contrast data-driven and user-defined substitution costs. As an example, we study young adults' transition to adulthood as a sequence of events in three life domains. The events define the multistate event history model and the parallel life domains in multidimensional sequence analysis. The relationship between life trajectories and excess depressive symptoms in middle age is further studied by their joint prediction in the multistate model and by regressing the symptom scores on individual-specific cluster indices. The two approaches complement each other in life course analysis; sequence analysis can effectively find typical and atypical life patterns, while event history analysis is needed for causal inquiries. © The Author(s) 2012.
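To make the sequence-analysis side of this comparison concrete, the sketch below implements the dissimilarity computation it rests on: an optimal-matching (edit) distance between state sequences with user-defined substitution costs. The states, costs, and sequences are invented for illustration; a real analysis would typically use a dedicated package such as TraMineR in R.

```python
# A minimal optimal-matching (edit) distance between two life-state
# sequences, with user-defined substitution costs as contrasted in the
# abstract. States and costs here are illustrative only.
def om_distance(seq_a, seq_b, sub_cost, indel=1.0):
    """Dynamic-programming edit distance with state-pair substitution costs."""
    n, m = len(seq_a), len(seq_b)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * indel
    for j in range(1, m + 1):
        d[0][j] = j * indel
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = sub_cost[(seq_a[i - 1], seq_b[j - 1])]
            d[i][j] = min(d[i - 1][j] + indel,    # deletion
                          d[i][j - 1] + indel,    # insertion
                          d[i - 1][j - 1] + sub)  # substitution
    return d[n][m]

# Yearly states in one hypothetical life domain: H = at home,
# E = in education, W = working.
costs = {(a, b): 0.0 if a == b else 2.0 for a in "HEW" for b in "HEW"}
print(om_distance("HHEEW", "HEEWW", costs))  # dissimilarity of two trajectories
```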
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravindra, M.K.; Banon, H.
1992-07-01
In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.
Job optimization in ATLAS TAG-based distributed analysis
NASA Astrophysics Data System (ADS)
Mambelli, M.; Cranshaw, J.; Gardner, R.; Maeno, T.; Malon, D.; Novak, M.
2010-04-01
The ATLAS experiment is projected to collect over one billion events/year during the first few years of operation. The efficient selection of events for various physics analyses across all appropriate samples presents a significant technical challenge. ATLAS computing infrastructure leverages the Grid to tackle the analysis across large samples by organizing data into a hierarchical structure and exploiting distributed computing to churn through the computations. This includes events at different stages of processing: RAW, ESD (Event Summary Data), AOD (Analysis Object Data), DPD (Derived Physics Data). Event Level Metadata Tags (TAGs) contain information about each event stored using multiple technologies accessible by POOL and various web services. This allows users to apply selection cuts on quantities of interest across the entire sample to compile a subset of events that are appropriate for their analysis. This paper describes new methods for organizing jobs using the TAGs criteria to analyze ATLAS data. It further compares different access patterns to the event data and explores ways to partition the workload for event selection and analysis. Here analysis is defined as a broader set of event processing tasks including event selection and reduction operations ("skimming", "slimming" and "thinning") as well as DPD making. Specifically it compares analysis with direct access to the events (AOD and ESD data) to access mediated by different TAG-based event selections. We then compare different ways of splitting the processing to maximize performance.
Survival analysis: Part I — analysis of time-to-event
2018-01-01
Length of time is a variable often encountered during data analysis. Survival analysis provides simple, intuitive results concerning time-to-event for events of interest, which are not confined to death. This review introduces methods of analyzing time-to-event. The Kaplan-Meier survival analysis, log-rank test, and Cox proportional hazards regression modeling method are described with examples of hypothetical data. PMID:29768911
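As a concrete companion to the methods this review describes, here is a minimal hand-rolled Kaplan-Meier estimator; the durations and censoring indicators are hypothetical, in the spirit of the review's invented examples.

```python
# Hand-rolled Kaplan-Meier product-limit estimator for right-censored
# time-to-event data (1 = event observed, 0 = censored).
import numpy as np

def kaplan_meier(times, events):
    """Return event times and the survival probabilities S(t) at those times."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    at_risk = len(times)
    surv, out_t, out_s = 1.0, [], []
    for t in np.unique(times):
        mask = times == t
        d = events[mask].sum()            # events at time t
        if d > 0:
            surv *= 1.0 - d / at_risk     # product-limit step
            out_t.append(t)
            out_s.append(surv)
        at_risk -= mask.sum()             # remove events and censored subjects
    return out_t, out_s

t, s = kaplan_meier([2, 3, 3, 5, 7, 8, 9], [1, 1, 0, 1, 0, 1, 0])
for ti, si in zip(t, s):
    print(f"S({ti:.0f}) = {si:.3f}")
```

In practice one would use an established library (e.g., the lifelines package in Python or the survival package in R) rather than this sketch.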
ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION
2016-03-24
AFIT-ENV-MS-16-M-166, Air Force Institute of Technology master's thesis; distribution unlimited. Author: Erich W…
NASA Astrophysics Data System (ADS)
Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.
2016-05-01
Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis to provide a framework for quantifying the strength, directionality, and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
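The core of the method can be stated compactly: count the fraction of events in one series that are preceded, within a window ΔT, by an event in the other series, and compare that rate against surrogates drawn from the null model. The sketch below assumes a homogeneous Poisson null and invented event times; the window and lag parameters are illustrative.

```python
# Precursor coincidence rate: the fraction of events in series A that are
# preceded by at least one event in series B within a window delta_t.
# Significance is assessed against Poisson (uniform) surrogates of B.
import numpy as np

rng = np.random.default_rng(1)

def coincidence_rate(a_times, b_times, delta_t, tau=0.0):
    """Fraction of A-events with a B-event in (t - tau - delta_t, t - tau]."""
    b = np.asarray(b_times, dtype=float)
    return float(np.mean([np.any((b > t - tau - delta_t) & (b <= t - tau))
                          for t in a_times]))

# Illustrative series on [0, 1000): 60% of B-events trigger an A-event
# about 2 time units later, plus independent background A-events.
b = np.sort(rng.uniform(0, 1000, 40))
a = np.sort(np.concatenate([b[rng.random(b.size) < 0.6] + 2.0,
                            rng.uniform(0, 1000, 15)]))

r_obs = coincidence_rate(a, b, delta_t=5.0)
r_null = np.array([coincidence_rate(a, rng.uniform(0, 1000, b.size), 5.0)
                   for _ in range(2000)])
p_value = float(np.mean(r_null >= r_obs))
print(f"coincidence rate {r_obs:.2f}, surrogate p-value {p_value:.4f}")
```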
Research on Visual Analysis Methods of Terrorism Events
NASA Astrophysics Data System (ADS)
Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing
2016-06-01
As terrorism events occur with increasing frequency throughout the world, improving the response capability to social security incidents has become an important test of a government's ability to govern. Visual analysis has become an important method of event analysis because it is intuitive and effective. To analyse events' spatio-temporal distribution characteristics, the correlations among event items, and development trends, the spatio-temporal characteristics of terrorism events are discussed. A suitable event data table structure based on the "5W" theory (who, what, when, where, why) is designed. Six types of visual analysis are then proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data provided by the Global Terrorism Database, and the results demonstrate the practicality of the methods.
Antón, Alfonso; Pazos, Marta; Martín, Belén; Navero, José Manuel; Ayala, Miriam Eleonora; Castany, Marta; Martínez, Patricia; Bardavío, Javier
2013-01-01
To assess sensitivity, specificity, and agreement among automated event analysis, automated trend analysis, and expert evaluation to detect glaucoma progression. This was a prospective study that included 37 eyes with a follow-up of 36 months. All had glaucomatous disks and fields and performed reliable visual fields every 6 months. Each series of fields was assessed with 3 different methods: subjective assessment by 2 independent teams of glaucoma experts, glaucoma/guided progression analysis (GPA) event analysis, and GPA (visual field index-based) trend analysis. The kappa agreement coefficient between methods, and the sensitivity and specificity of each method using expert opinion as the gold standard, were calculated. The incidence of glaucoma progression was 16% to 18% in 3 years, but only 3 cases showed progression with all 3 methods. The kappa agreement coefficient was high (k=0.82) between subjective expert assessment and GPA event analysis, and only moderate between these two and GPA trend analysis (k=0.57). Sensitivity and specificity were 71% and 96% for GPA event analysis, and 57% and 93% for GPA trend analysis, respectively. The 3 methods detected similar numbers of progressing cases. GPA event analysis and expert subjective assessment showed high agreement between them and moderate agreement with GPA trend analysis. In a period of 3 years, both methods of GPA analysis offered high specificity; event analysis showed 83% sensitivity, and trend analysis had 66% sensitivity.
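For reference, the kappa agreement coefficient reported above is straightforward to compute from a contingency table of the two methods' classifications; the 2x2 table in this sketch is illustrative, not the study's actual data.

```python
# Cohen's kappa from a 2x2 contingency table of two progression-detection
# methods (rows: method 1 progressed/stable; cols: method 2, same order).
import numpy as np

def cohens_kappa(table):
    t = np.asarray(table, dtype=float)
    n = t.sum()
    p_obs = np.trace(t) / n                               # observed agreement
    p_exp = (t.sum(axis=1) * t.sum(axis=0)).sum() / n**2  # chance agreement
    return (p_obs - p_exp) / (1.0 - p_exp)

# e.g. expert assessment vs. GPA event analysis over 37 eyes (made-up counts)
print(f"kappa = {cohens_kappa([[5, 1], [1, 30]]):.2f}")  # ~0.80: high agreement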
EventThread: Visual Summarization and Stage Analysis of Event Sequence Data.
Guo, Shunan; Xu, Ke; Zhao, Rongwen; Gotz, David; Zha, Hongyuan; Cao, Nan
2018-01-01
Event sequence data, such as electronic health records, a person's academic records, or car service records, are ordered series of events that have occurred over a period of time. Analyzing collections of event sequences can reveal common or semantically important sequential patterns. For example, event sequence analysis might reveal frequently used care plans for treating a disease, typical publishing patterns of professors, and the patterns of service that result in a well-maintained car. It is challenging, however, to visually explore large numbers of event sequences, or sequences with large numbers of event types. Existing methods focus on extracting explicitly matching patterns of events using statistical analysis to create stages of event progression over time. However, these methods fail to capture latent clusters of similar but not identical evolutions of event sequences. In this paper, we introduce a novel visualization system named EventThread which clusters event sequences into threads based on tensor analysis and visualizes the latent stage categories and evolution patterns by interactively grouping the threads by similarity into time-specific clusters. We demonstrate the effectiveness of EventThread through usage scenarios in three different application domains and via interviews with an expert user.
ERIC Educational Resources Information Center
Simmonds, Mark C.; Higgins, Julian P. T.; Stewart, Lesley A.
2013-01-01
Meta-analysis of time-to-event data has proved difficult in the past because consistent summary statistics often cannot be extracted from published results. The use of individual patient data allows for the re-analysis of each study in a consistent fashion and thus makes meta-analysis of time-to-event data feasible. Time-to-event data can be…
Rauch, Geraldine; Kieser, Meinhard; Binder, Harald; Bayes-Genis, Antoni; Jahn-Eimermacher, Antje
2018-05-01
Composite endpoints combining several event types of clinical interest often define the primary efficacy outcome in cardiologic trials. They are commonly evaluated as time-to-first-event, thereby following the recommendations of regulatory agencies. However, to assess the patient's full disease burden and to identify preventive factors or interventions, subsequent events following the first one should be considered as well. This is especially important in cohort studies and RCTs with a long follow-up, which leads to a higher number of observed events per patient. So far, there exist no recommendations as to which approach should be preferred. Recently, the Cardiovascular Round Table of the European Society of Cardiology indicated the need to investigate "how to interpret results if recurrent-event analysis results differ […] from time-to-first-event analysis" (Anker et al., Eur J Heart Fail 18:482-489, 2016). This work addresses the topic by means of a systematic simulation study. The paper compares two common analysis strategies for composite endpoints, differing with respect to the incorporation of recurrent events, for typical data scenarios motivated by a clinical trial. We show that the treatment effects estimated from a time-to-first-event analysis (Cox model) and a recurrent-event analysis (Andersen-Gill model) can systematically differ, particularly in cardiovascular trials. Moreover, we provide guidance on how to interpret these results and recommend points to consider for the choice of a meaningful analysis strategy. When planning trials with a composite endpoint, researchers and regulatory agencies should be aware that the model choice affects the estimated treatment effect and its interpretation.
Event-based analysis of free-living behaviour.
Granat, Malcolm H
2012-11-01
The quantification of free-living physical activities is important in understanding how physical activity and sedentary behaviour impact on health and also on how interventions might modify free-living behaviour to enhance health. Quantification, and the terminology used, has in many ways been determined by the choice of measurement technique. The inter-related issues around measurement devices and terminology used are explored. This paper proposes a terminology and a systematic approach for the analysis of free-living activity information using event-based activity data. The event-based approach uses a flexible hierarchical classification of events and, dependent on the research question, analysis can then be undertaken on a selection of these events. The quantification of free-living behaviour is therefore the result of the analysis on the patterns of these chosen events. The application of this approach is illustrated with results from a range of published studies by our group showing how event-based analysis provides a flexible yet robust method of addressing the research question(s) and provides a deeper insight into free-living behaviour. It is proposed that it is through event-based analysis we can more clearly understand how behaviour is related to health and also how we can produce more relevant outcome measures.
Risk Analysis in Support of the Chemical Stockpile Disposal Program. Volume 1. Analysis
1987-12-17
Appendix contents include analysis of event probability (A.4.1), analysis of event consequence (A.4.2), agent release to atmosphere (A.4.2.1), and toxic plume size (A.4.2.2). The analysis supports the Chemical Stockpile Disposal Program (CSDP), which comprises several alternatives for carrying out the disposal effort (U.S. Army Toxic and Hazardous Materials Agency, 1986). An unavoidable accident or event could occur that would expose a nearby civilian population to these toxic chemicals. Such events could occur even…
NASA Astrophysics Data System (ADS)
Tziotziou, Kostas; Malandraki, Olga; Valtonen, Eino; Heber, Bernd; Zucca, Pietro; Klein, Karl-Ludwig; Vainio, Rami; Tsiropoula, Georgia; Share, Gerald
2017-04-01
Multi-spacecraft observations of solar energetic particle (SEP) events are important for understanding the acceleration processes and the interplanetary propagation of particles released during eruptive events. In this work, we have carefully studied 25 gamma-ray flare events observed by FERMI and investigated possible associations with SEP-related events observed with the STEREO and L1 spacecraft in the heliosphere. A data-driven velocity dispersion analysis (VDA) and time-shifting analysis (TSA) are used to derive the release times of protons and electrons at the Sun and to compare them with the respective times stemming from the gamma-ray event analysis and their X-ray signatures, in an attempt to interconnect the SEP and Fermi events and better understand the physics involved. Acknowledgements: This project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No 637324.
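The VDA step can be illustrated compactly: if the first particles of speed v arrive at t_onset = t_release + L/v, a linear fit of onset time against inverse speed recovers the release time (intercept) and the apparent path length (slope). The sketch below uses synthetic onsets, not Fermi or STEREO data.

```python
# Velocity dispersion analysis on synthetic data: onset times at the
# observer satisfy t_onset = t_release + L / v, so regressing onset time
# on 1/v gives the release time (intercept) and path length (slope).
import numpy as np

AU_KM = 1.495979e8
C_KM_S = 2.99792458e5
rng = np.random.default_rng(0)

speeds = C_KM_S * np.array([0.10, 0.18, 0.35, 0.55, 0.75])      # particle speeds
onsets = 1.2 * AU_KM / speeds + rng.normal(0, 60, speeds.size)  # noisy onsets, s

slope, intercept = np.polyfit(1.0 / speeds, onsets, 1)
print(f"release time = {intercept:+.0f} s (true 0)")
print(f"path length  = {slope / AU_KM:.2f} AU (true 1.20)")
```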
How Expert Pilots Think: Cognitive Processes in Expert Decision Making
1993-02-01
Keywords: Crew Resource Management (CRM), Advanced Qualification Program (AQP), Cognitive Task Analysis (CTA); the document is available to the public through the National Technical Information Service. Selecting realistic EDM scenarios with critical events and performing a cognitive task analysis of novice vs. expert decision making for these events is a basic requirement for…
Second-Order Analysis of Semiparametric Recurrent Event Processes
Guan, Yongtao
2011-01-01
A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this paper, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, UK to illustrate their practical value. PMID:21361885
A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.
Yu, Hongyang; Khan, Faisal; Veitch, Brian
2017-09-01
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault tree and event tree analysis (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
ERIC Educational Resources Information Center
Tuma, Nancy Brandon; Hannan, Michael T.
The document, part of a series of chapters described in SO 011 759, considers the problem of censoring in the analysis of event-histories (data on dated events, including dates of change from one qualitative state to another). Censoring refers to the lack of information on events that occur before or after the period for which data are available.…
Event reweighting with the NuWro neutrino interaction generator
NASA Astrophysics Data System (ADS)
Pickering, Luke; Stowell, Patrick; Sobczyk, Jan
2017-09-01
Event reweighting has been implemented in the NuWro neutrino event generator for a number of free theory parameters in the interaction model. Event reweighting is a key analysis technique, used to efficiently study the effect of neutrino interaction model uncertainties. This opens up the possibility for NuWro to be used as a primary event generator by experimental analysis groups. A preliminary model tuning to ANL and BNL data of quasi-elastic and single pion production events was performed to validate the reweighting engine.
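The generic idea behind such reweighting is importance weighting: each generated event receives a weight equal to the ratio of its probability density under new parameter values to that under the generation parameters. The sketch below demonstrates this with a toy one-dimensional density; it is not NuWro's interaction model or API.

```python
# Toy event reweighting: weight = p_new(x) / p_gen(x). A weighted histogram
# of events generated at one parameter value then approximates the
# prediction at another, without regenerating events.
import numpy as np

rng = np.random.default_rng(2)

def density(x, m_a):
    """Toy per-event density parameterized by an axial-mass-like parameter."""
    return np.exp(-0.5 * ((x - m_a) / 0.3) ** 2) / (0.3 * np.sqrt(2 * np.pi))

x = rng.normal(1.03, 0.3, 100_000)       # events generated with M_A = 1.03
w = density(x, 1.20) / density(x, 1.03)  # reweight to M_A = 1.20

# Check the reweighted sample against the target parameter value.
print(f"reweighted mean = {np.average(x, weights=w):.3f} (target 1.200)")
```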
Reaching Out: A Break from Traditional Forensic Events. "On Interpretation Analysis."
ERIC Educational Resources Information Center
Seney, Ronald J.
In recent years a new event called "Interpretation Analysis" has appeared at certain forensic events. The objective is for the student, through analysis and performance, to study a piece of literature and to communicate his or her understanding of that literature to a specific audience. Perhaps there is room within the established…
Second-order analysis of semiparametric recurrent event processes.
Guan, Yongtao
2011-09-01
A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this article, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, United Kingdom to illustrate their practical value. © 2011, The International Biometric Society.
Kahan, Brennan C; Harhay, Michael O
2015-12-01
Adjustment for center in multicenter trials is recommended when there are between-center differences or when randomization has been stratified by center. However, common methods of analysis (such as fixed-effects, Mantel-Haenszel, or stratified Cox models) often require a large number of patients or events per center to perform well. We reviewed 206 multicenter randomized trials published in four general medical journals to assess the average number of patients and events per center and to determine whether appropriate methods of analysis were used in trials with few patients or events per center. The median number of events per center/treatment arm combination for trials using a binary or survival outcome was 3 (interquartile range, 1-10). Sixteen percent of trials had fewer than 1 event per center/treatment combination, 50% fewer than 3, and 63% fewer than 5. Of the trials that adjusted for center using a method of analysis requiring a large number of events per center, 6% had fewer than 1 event per center/treatment combination, 25% fewer than 3, and 50% fewer than 5. Methods of analysis that allow for few events per center, such as random-effects models or generalized estimating equations (GEEs), were rarely used. Many multicenter trials contain few events per center. Adjustment for center using random-effects models or GEE with model-based (non-robust) standard errors may be beneficial in these scenarios. Copyright © 2015 Elsevier Inc. All rights reserved.
Analysis of Emergency Diesel Generators Failure Incidents in Nuclear Power Plants
NASA Astrophysics Data System (ADS)
Hunt, Ronderio LaDavis
In the early years of operation, emergency diesel generators (EDGs) had a minimal rate of demand failures. EDGs are designed to operate as a backup when the main source of electricity has been disrupted. More recently, EDGs have been failing at nuclear power plants (NPPs) around the United States, causing either station blackouts or loss of onsite and offsite power. These failures were of a specific type called demand failures. This thesis evaluated a problem of concern to the nuclear industry: the rate went from an average of 1 EDG demand failure per year in 1997 to an excessive event of 4 EDG demand failures in 2011. To determine the next occurrence of such an extreme event and its possible cause, two analyses were conducted: a statistical analysis and a root cause analysis. The statistical analysis applied an extreme event probability approach to determine the year of the next occurrence of an excessive event as well as the probability of that excessive event occurring. The root cause analysis examined the potential causes of the excessive event by evaluating the EDG manufacturers, aging, policy changes and maintenance practices, and failure components, and it investigated the correlation between demand failure data and historical data. The statistical analysis yielded expectations of an excessive event occurring within a fixed range of probability, and a wider range of probability from the extreme event probability approach. The root cause analysis of the demand failure data followed historical statistics for EDG manufacturer, aging, and policy changes/maintenance practices but indicated a possible cause of the excessive event in the failure components. The conclusions showed that predicting the next excessive demand failure year, and the probability and timing of the next occurrence of such failures with an acceptable confidence level, was difficult, but it is likely that this type of failure will not be a 100-year event. Notably, the majority of the EDG demand failures as of 2005 occurred within the main components. The overall percentages from this study support the statement that the excessive event was caused by the overall age (wear and tear) of the emergency diesel generators in nuclear power plants. Future work will aim to better determine the return period of the excessive event, once it has occurred a second time, by implementing the extreme event probability approach.
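As a rough check on the notion of an "excessive event": if demand failures followed a Poisson process at the 1997 baseline rate of about one per year, the chance of four or more in a single year, and the implied return period, follow directly. This is a back-of-envelope sketch, not the thesis's actual method.

```python
# If EDG demand failures are Poisson with an assumed mean of 1 per year,
# the probability of an "excessive" year with 4 or more failures is small
# but far from negligible on a multi-decade time scale.
from math import exp, factorial

lam = 1.0  # assumed mean demand failures per year
p_ge_4 = 1.0 - sum(exp(-lam) * lam**k / factorial(k) for k in range(4))
print(f"P(>= 4 failures in a year) = {p_ge_4:.4f}")        # ~0.019
print(f"implied return period ~ {1.0 / p_ge_4:.0f} years")  # ~53 years
```

Under these assumptions the return period comes out near 53 years, consistent with the conclusion that such a failure cluster need not be a once-in-a-century occurrence.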
ERIC Educational Resources Information Center
Arya, Poonam; Christ, Tanya; Chiu, Ming
2015-01-01
This study examined how characteristics of Collaborative Peer Video Analysis (CPVA) events are related to teachers' pedagogical outcomes. Data included 39 transcribed literacy video events, in which 14 in-service teachers engaged in discussions of their video clips. Emergent coding and Statistical Discourse Analysis were used to analyze the data.…
Surface Management System Departure Event Data Analysis
NASA Technical Reports Server (NTRS)
Monroe, Gilena A.
2010-01-01
This paper presents a data analysis of the Surface Management System (SMS) performance for departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance for push-back events and a significantly high overall detection performance for runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.
2017-09-01
Effective Training Solutions for Firefighting: The Analysis of Emergency Responses and Line of Duty Death Reports for Low Frequency, High Risk Events
This study examined whether emergency incidents connected to low frequency, high risk events contain sufficient warning signs or indicators of imminent catastrophic events, whether firefighters could identify them, and…
ERIC Educational Resources Information Center
Barker, Bruce O.; Petersen, Paul D.
This paper explores the fault-tree analysis approach to isolating failure modes within a system. Fault-tree analysis investigates potentially undesirable events and then looks for failures in sequence that would lead to their occurrence. Relationships among these events are symbolized by AND or OR logic gates, AND being used when single events must coexist to…
Trend Detection and Bivariate Frequency Analysis for Nonstationary Rainfall Data
NASA Astrophysics Data System (ADS)
Joo, K.; Kim, H.; Shin, J. Y.; Heo, J. H.
2017-12-01
Multivariate frequency analysis has been developed for hydro-meteorological data such as rainfall, flood, and drought. In particular, the copula has been used as a useful tool for multivariate probability modelling because it places no limitation on the choice of marginal distributions. Time-series rainfall data can be characterized as rainfall events using an inter-event time definition (IETD), and each rainfall event has a rainfall depth and a rainfall duration. In addition, nonstationarity in rainfall events has been studied recently due to climate change, and trend detection in rainfall events is important for determining whether the data are nonstationary. Using the rainfall depth and duration of each rainfall event, trend detection and nonstationary bivariate frequency analysis were performed in this study. Over 30 years of hourly recorded data from 62 Korea Meteorological Administration (KMA) stations were used, and the suitability of a nonstationary copula for rainfall events was examined by goodness-of-fit tests.
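The IETD event-separation step described above is simple to implement: scan the hourly series and close an event once a dry spell of at least the IETD length occurs. The sketch below uses an illustrative 6-hour IETD and a toy rainfall series.

```python
# Partition an hourly rainfall series into events using an inter-event time
# definition (IETD): an event ends when at least `ietd` consecutive dry
# hours follow. Each event is summarized by its depth and duration.
import numpy as np

def ietd_events(rain, ietd=6):
    """Return (depth, duration) per event from an hourly rainfall array."""
    events, start, dry = [], None, 0
    for i, r in enumerate(rain):
        if r > 0:
            if start is None:
                start = i
            dry = 0
        elif start is not None:
            dry += 1
            if dry >= ietd:                # dry gap long enough: close event
                end = i - dry + 1
                events.append((float(np.sum(rain[start:end])), end - start))
                start, dry = None, 0
    if start is not None:                  # close a trailing event
        end = len(rain) - dry
        events.append((float(np.sum(rain[start:end])), end - start))
    return events

rain = np.array([0, 2, 5, 1, 0, 0, 0, 0, 0, 0, 3, 4, 0, 1, 0])
print(ietd_events(rain, ietd=6))  # two events: (8.0, 3) and (8.0, 4)
```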
Application of a temporal reasoning framework tool in analysis of medical device adverse events.
Clark, Kimberly K; Sharma, Deepak K; Chute, Christopher G; Tao, Cui
2011-01-01
The Clinical Narrative Temporal Relation Ontology (CNTRO) project offers a semantic-web based reasoning framework, which represents temporal events and relationships within clinical narrative texts and infers new knowledge over them. In this paper, the CNTRO reasoning framework is applied to the temporal analysis of medical device adverse event files. One specific adverse event was used as a test case: late stent thrombosis. Adverse event narratives were obtained from the Food and Drug Administration's (FDA) Manufacturer and User Facility Device Experience (MAUDE) database. 15 adverse event files in which late stent thrombosis was confirmed were randomly selected across multiple drug-eluting stent devices. From these files, 81 events and 72 temporal relations were annotated. 73 temporal questions were generated, of which 65 were correctly answered by the CNTRO system, for an overall accuracy of 89%. This system should be pursued further to continue assessing its potential benefits in the temporal analysis of medical device adverse events.
NASA Astrophysics Data System (ADS)
Bronstert, Axel; Agarwal, Ankit; Boessenkool, Berry; Fischer, Madlen; Heistermann, Maik; Köhn-Reich, Lisei; Moran, Thomas; Wendi, Dadiyorto
2017-04-01
The flash flood of 29 May 2016 in the vicinity of the village of Braunsbach in Southwestern Germany, State of Baden-Wuerttemberg, was a particularly striking instance of the floods that occurred in southern Germany at the end of May and early June 2016. This extreme event was triggered by a convective high-intensity rain storm, causing extreme discharge rates and subsequent debris flow in the local creek. This led to severe flooding of the village with immense damage. Besides its extreme nature, the event is characterized by very local and short-term scales: the catchment of the creek covers an area of only six km2 and the whole event lasted only two hours. This contribution presents a retrospective analysis of the meteorology and hydrology to obtain a quantitative assessment of the governing processes and their development. We term this a "forensic analysis" because, due to the very local and sudden nature of this flash-flood event, the processes cannot be directly measured during the event and/or at the site. Instead, they need to be reconstructed and estimated after the event from a variety of rather different information sources and "soft" data. Using these types of post-event observations and analysis, we aim at obtaining a rather comprehensive picture of the event and its consequences. Regarding rainfall, both station data from the surroundings of the catchment and radar data from the German Weather Service were analyzed, including the analysis of different error types and dynamic features of the convective system. The flood hydrograph, including the maximum discharge rate during the event, was estimated by three different approaches, which were compared to obtain an idea of the associated uncertainty. The overall results of this forensic analysis show that it was a very rare rainfall event with extreme rainfall intensities, e.g. a return period exceeding 100 years. Catalyzed by catchment properties, this led to extreme runoff, severe soil erosion, and subsequent debris flow processes. Because of the complex and interacting processes, the hazard must not be attributed to a single cause; only the interplay of the different processes and catchment conditions could lead to such an event. The people in the region say that such an event "has never happened before". However, first geomorphological analyses give some indications that such events, including debris flow, might have happened during previous times (on a time scale of millennia). Therefore, it would be more appropriate to state that "nobody can remember such an event".
Higher moments of net-proton multiplicity distributions in a heavy-ion event pile-up scenario
NASA Astrophysics Data System (ADS)
Garg, P.; Mishra, D. K.
2017-10-01
High-luminosity modern accelerators, like the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory (BNL) and the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN), inherently have event pile-up scenarios which contribute significantly to physics events as a background. While state-of-the-art tracking algorithms and detector concepts take care of these event pile-up scenarios, several offline analytical techniques are used to remove such events from the physics analysis. It is still difficult to identify the remaining pile-up events in an event sample for physics analysis. Since the fraction of these events is significantly small, it may not be as serious an issue for other analyses as it is for an event-by-event analysis. Particularly when the characteristics of the multiplicity distribution are the observable, one needs to be very careful. In the present work, we demonstrate how a small fraction of residual pile-up events can change the moments, and their ratios, of an event-by-event net-proton multiplicity distribution, which are sensitive to the dynamical fluctuations due to the QCD critical point. For this study, we assume that the individual event-by-event proton and antiproton multiplicity distributions follow Poisson, negative binomial, or binomial distributions. We observe a significant effect on the cumulants and their ratios of net-proton multiplicity distributions due to pile-up events, particularly at lower energies. It might be crucial to estimate the fraction of pile-up events in the data sample while interpreting the experimental observables for the critical point.
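A minimal simulation makes the effect tangible: mix a small fraction of pile-up events (two single events recorded as one) into a sample with Poissonian proton and antiproton multiplicities, and the net-proton cumulant ratio C4/C2 drifts away from its pure (Skellam) value of 1. The multiplicities and pile-up fractions below are illustrative, not the paper's settings.

```python
# Effect of a pile-up fraction alpha on net-proton cumulants: pile-up events
# are the sum of two independent single events; C4/C2 = 1 without pile-up.
import numpy as np

rng = np.random.default_rng(3)

def net_proton_sample(n, mu_p=5.0, mu_pbar=1.5, alpha=0.0):
    single = rng.poisson(mu_p, n) - rng.poisson(mu_pbar, n)
    pileup = (rng.poisson(mu_p, n) + rng.poisson(mu_p, n)
              - rng.poisson(mu_pbar, n) - rng.poisson(mu_pbar, n))
    return np.where(rng.random(n) < alpha, pileup, single)

def c4_over_c2(x):
    d = x - x.mean()
    c2 = np.mean(d**2)
    c4 = np.mean(d**4) - 3.0 * c2**2
    return c4 / c2

for alpha in (0.0, 0.005, 0.02):
    ratio = c4_over_c2(net_proton_sample(2_000_000, alpha=alpha))
    print(f"pile-up fraction {alpha:.3f}: C4/C2 = {ratio:.3f}")
```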
Garde, Ainara; Dehkordi, Parastoo; Wensley, David; Ansermino, J Mark; Dumont, Guy A
2015-01-01
Obstructive sleep apnea (OSA) disrupts normal ventilation during sleep and can lead to serious health problems in children if left untreated. Polysomnography, the gold standard for OSA diagnosis, is resource intensive and requires a specialized laboratory. Thus, we proposed to use the Phone Oximeter™, a portable device integrating pulse oximetry with a smartphone, to detect OSA events. As a proportion of OSA events occur without oxygen desaturation (defined as SpO2 decreases ≥ 3%), we suggest combining SpO2 and pulse rate variability (PRV) analysis to identify all OSA events and provide a more detailed sleep analysis. We recruited 160 children and recorded pulse oximetry consisting of SpO2 and plethysmography (PPG) using the Phone Oximeter™, alongside standard polysomnography. A sleep technician visually scored all OSA events with and without oxygen desaturation from polysomnography. We divided pulse oximetry signals into 1-min signal segments and extracted several features from SpO2 and PPG analysis in the time and frequency domain. Segments with OSA, especially the ones with oxygen desaturation, presented greater SpO2 variability and modulation reflected in the spectral domain than segments without OSA. Segments with OSA also showed higher heart rate and sympathetic activity through the PRV analysis relative to segments without OSA. PRV analysis was more sensitive than SpO2 analysis for identification of OSA events without oxygen desaturation. Combining SpO2 and PRV analysis enhanced OSA event detection through a multiple logistic regression model. The area under the ROC curve increased from 81% to 87%. Thus, the Phone Oximeter™ might be useful to monitor sleep and identify OSA events with and without oxygen desaturation at home.
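The combination step can be sketched as a logistic regression over the two feature families, compared by ROC AUC; the features and labels below are synthetic stand-ins for the paper's SpO2 and PRV features, and scikit-learn is assumed to be available.

```python
# Logistic regression on SpO2 features alone vs. SpO2 + PRV features,
# compared by ROC AUC on held-out 1-min segments (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 4000
y = rng.random(n) < 0.3                    # segments containing OSA events
spo2_var = rng.normal(1.0 + 1.0 * y, 1.0)  # SpO2 variability feature
prv_lfhf = rng.normal(1.0 + 0.8 * y, 1.0)  # PRV sympathetic-balance feature
X = np.column_stack([spo2_var, prv_lfhf])

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.5, random_state=0)
for cols, name in [([0], "SpO2 only"), ([0, 1], "SpO2 + PRV")]:
    model = LogisticRegression().fit(Xtr[:, cols], ytr)
    auc = roc_auc_score(yte, model.predict_proba(Xte[:, cols])[:, 1])
    print(f"{name}: AUC = {auc:.2f}")
```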
Predicting Pilot Performance in Off-Nominal Conditions: A Meta-Analysis and Model Validation
NASA Technical Reports Server (NTRS)
Wickens, C.D.; Hooey, B.L.; Gore, B.F.; Sebok, A.; Koenecke, C.; Salud, E.
2009-01-01
Pilot response to off-nominal (very rare) events represents a critical component of understanding the safety of next-generation airspace technology and procedures. We describe a meta-analysis designed to integrate the existing data regarding pilot accuracy in detecting rare, unexpected events such as runway incursions in realistic flight simulations. Thirty-five studies were identified, and pilot responses were categorized by expectancy, event location, and whether the pilot was flying with a highway-in-the-sky (HITS) display. All three dichotomies produced large, significant effects on event miss rate. A model of human attention and noticing, N-SEEV, was then used to predict event noticing performance as a function of event salience, expectancy, and retinal eccentricity. Eccentricity is predicted from steady-state scanning by the SEEV model of attention allocation. The model was used to predict miss rates for the expectancy, location, and HITS effects identified in the meta-analysis. The correlation between model-predicted results and data from the meta-analysis was 0.72.
Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.
Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian
2011-01-01
Quantitative risk analysis (QRA) is a systematic approach for evaluating the likelihood, consequences, and risk of adverse events. QRA based on event tree analysis (ETA) and fault tree analysis (FTA) employs two basic assumptions. The first assumption relates to the likelihood values of input events, and the second concerns the interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, probability distributions of the input event likelihoods are assumed. These probability distributions are often hard to come by and, even if available, are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
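One way to see the uncertainty-handling idea is through interval arithmetic: if each basic-event likelihood is an interval (for example, one alpha-cut of a fuzzy number), bounds propagate through AND/OR gates under the independence assumption. The gate structure and numbers below are illustrative, not the article's case studies.

```python
# Interval-valued basic-event likelihoods propagated through AND/OR gates
# under the independence assumption; intervals could come from alpha-cuts
# of fuzzy numbers.
def and_gate(intervals):
    """Bounds on the product of independent event probabilities."""
    lo = hi = 1.0
    for a, b in intervals:
        lo, hi = lo * a, hi * b
    return lo, hi

def or_gate(intervals):
    """Bounds on 1 - prod(1 - p) for independent events."""
    q_lo = q_hi = 1.0
    for a, b in intervals:
        q_lo *= 1.0 - b   # smallest complement uses the upper bounds
        q_hi *= 1.0 - a
    return 1.0 - q_hi, 1.0 - q_lo

pump = (1e-3, 5e-3)    # imprecise basic-event likelihoods (lower, upper)
valve = (2e-3, 8e-3)
power = (1e-4, 9e-4)

subsystem = and_gate([pump, valve])  # both components must fail
top = or_gate([subsystem, power])    # either branch fails the system
print(f"top-event probability in [{top[0]:.2e}, {top[1]:.2e}]")
```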
Renfro, Lindsay A; Grothey, Axel M; Paul, James; Floriani, Irene; Bonnetain, Franck; Niedzwiecki, Donna; Yamanaka, Takeharu; Souglakos, Ioannis; Yothers, Greg; Sargent, Daniel J
2014-12-01
Clinical trials are expensive and lengthy, where success of a given trial depends on observing a prospectively defined number of patient events required to answer the clinical question. The point at which this analysis time occurs depends on both patient accrual and primary event rates, which typically vary throughout the trial's duration. We demonstrate real-time analysis date projections using data from a collection of six clinical trials that are part of the IDEA collaboration, an international preplanned pooling of data from six trials testing the duration of adjuvant chemotherapy in stage III colon cancer, and we additionally consider the hypothetical impact of one trial's early termination of follow-up. In the absence of outcome data from IDEA, monthly accrual rates for each of the six IDEA trials were used to project subsequent trial-specific accrual, while historical data from similar Adjuvant Colon Cancer Endpoints (ACCENT) Group trials were used to construct a parametric model for IDEA's primary endpoint, disease-free survival, under the same treatment regimen. With this information and using the planned total accrual from each IDEA trial protocol, individual patient accrual and event dates were simulated and the overall IDEA interim and final analysis times projected. Projections were then compared with actual (previously undisclosed) trial-specific event totals at a recent census time for validation. The change in projected final analysis date assuming early termination of follow-up for one IDEA trial was also calculated. Trial-specific predicted event totals were close to the actual number of events per trial for the recent census date at which the number of events per trial was known, with the overall IDEA projected number of events only off by eight patients. Potential early termination of follow-up by one IDEA trial was estimated to postpone the overall IDEA final analysis date by 9 months. Real-time projection of the final analysis time during a trial, or the overall analysis time during a trial collaborative such as IDEA, has practical implications for trial feasibility when these projections are translated into additional time and resources required.
The detection and analysis of point processes in biological signals
NASA Technical Reports Server (NTRS)
Anderson, D. J.; Correia, M. J.
1977-01-01
A pragmatic approach is taken to the detection and analysis of discrete events in biomedical signals, with examples from both clinical and basic research. Introductory sections discuss not only discrete events which are easily extracted from recordings by conventional threshold detectors but also events embedded in other information-carrying signals. The primary considerations are the factors governing event-time resolution and the effects that limits to this resolution have on the subsequent analysis of the underlying process. The analysis portion describes tests for qualifying the records as stationary point processes and procedures for providing meaningful information about the biological signals under investigation. All of these procedures are designed to be implemented on laboratory computers of modest computational capacity.
Wan, Zhaofei; Liu, Xiaojun; Wang, Xinhong; Liu, Fuqiang; Liu, Weimin; Wu, Yue; Pei, Leilei; Yuan, Zuyi
2014-04-01
Arterial elasticity has been shown to predict cardiovascular disease (CVD) in apparently healthy populations. The present study aimed to explore whether arterial elasticity could predict CVD events in Chinese patients with angiographic coronary artery disease (CAD). Arterial elasticity was measured in 365 patients with angiographic CAD. During follow-up (48 months; range 6-65), 140 CVD events occurred (including 34 deaths). Univariate Cox analysis demonstrated that both large arterial elasticity and small arterial elasticity were significant predictors of CVD events. In multivariate Cox analysis, small arterial elasticity remained significant. Kaplan-Meier analysis showed that the probability of having a CVD event or CVD death increased with decreasing small arterial elasticity (P < .001 for both). Decreased small arterial elasticity independently predicts the risk of CVD events in Chinese patients with angiographic CAD.
Muething, S E; Conway, P H; Kloppenborg, E; Lesko, A; Schoettker, P J; Seid, M; Kotagal, U
2010-10-01
To describe how in-depth analysis of adverse events can reveal underlying causes. Triggers for adverse events were developed using the hospital's computerised medical record (naloxone for opiate-related oversedation and administration of a glucose bolus while on insulin for insulin-related hypoglycaemia). Triggers were identified daily. Based on information from the medical record and interviews, a subject expert determined if an adverse drug event had occurred and then conducted a real-time analysis to identify event characteristics. Expert groups, consisting of frontline staff and specialist physicians, examined event characteristics and determined the apparent cause. 30 insulin-related hypoglycaemia events and 34 opiate-related oversedation events were identified by the triggers over 16 and 21 months, respectively. In the opinion of the experts, patients receiving continuous-infusion insulin and those receiving dextrose only via parenteral nutrition were at increased risk for insulin-related hypoglycaemia. Lack of standardisation in insulin-dosing decisions and variation regarding when and how much to adjust insulin doses in response to changing glucose levels were identified as common causes of the adverse events. Opiate-related oversedation events often occurred within 48 h of surgery. Variation in pain management in the operating room and post-anaesthesia care unit was identified by the experts as potential causes. Variations in practice, multiple services writing orders, multidrug regimens and variations in interpretation of patient assessments were also noted as potential contributing causes. Identification of adverse drug events through an automated trigger system, supplemented by in-depth analysis, can help identify targets for intervention and improvement.
Bringing a transgenic crop to market: where compositional analysis fits.
Privalle, Laura S; Gillikin, Nancy; Wandelt, Christine
2013-09-04
In the process of developing a biotechnology product, thousands of genes and transformation events are evaluated to select the event that will be commercialized. The ideal event is identified on the basis of multiple characteristics, including trait efficacy, the molecular characteristics of the insert, and agronomic performance. Once selected, the commercial event is subjected to a rigorous safety evaluation taking a multipronged approach: examination of the safety of the gene and gene product (the protein), plant performance, the impact of cultivating the crop on the environment, agronomic performance, and the equivalence of the crop/food to conventional crops/food by compositional analysis. The compositional analysis comprises a comparison of the nutrient and antinutrient composition of the crop containing the event, its parental line (variety), and other conventional lines (varieties). Different geographies have different requirements for compositional analysis studies. Parameters that vary include the number of years (seasons) and locations (environments) to be evaluated, the appropriate comparator(s), the analytes to be evaluated, and the statistical analysis. Specific examples of compositional analysis results will be presented.
Meta-analysis: Association between hypoglycaemia and serious adverse events in older patients.
Mattishent, Katharina; Loke, Yoon Kong
2016-07-01
We aimed to conduct a meta-analysis of serious adverse events (macro- and microvascular events, falls and fractures, death) associated with hypoglycaemia in older patients. We searched MEDLINE and EMBASE spanning a ten-year period up to March 2015 (with automated PubMed updates to October 2015). We selected observational studies reporting on hypoglycaemia and associated serious adverse events, and conducted a meta-analysis. We assessed study validity based on ascertainment of hypoglycaemia, adverse events, and adjustment for confounders. We included 17 studies involving 1.86 million participants. Meta-analysis of eight studies demonstrated that hypoglycaemic episodes were associated with macrovascular complications, odds ratio (OR) 1.83 (95% confidence interval [CI] 1.64, 2.05), and, in two studies, with microvascular complications, OR 1.77 (95% CI 1.49, 2.10). Meta-analysis of four studies demonstrated an association between hypoglycaemia and falls or fractures, OR 1.89 (95% CI 1.54, 2.32) and 1.92 (95% CI 1.56, 2.38), respectively. Hypoglycaemia was associated with an increased likelihood of death in a meta-analysis of eight studies, OR 2.04 (95% CI 1.68, 2.47). Our meta-analysis raises major concerns about a range of serious adverse events associated with hypoglycaemia. Clinicians should prioritize individualized therapy and closer monitoring strategies to avoid hypoglycaemia in susceptible older patients. Copyright © 2016 Elsevier Inc. All rights reserved.
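The pooling behind such estimates is typically an inverse-variance random-effects combination of study log odds ratios; the sketch below implements the DerSimonian-Laird estimator on placeholder ORs and confidence intervals, not the review's actual study-level data.

```python
# DerSimonian-Laird random-effects pooling of study odds ratios, given each
# study's OR and 95% CI. All inputs below are placeholders.
import numpy as np

def dersimonian_laird(or_, lo, hi):
    y = np.log(or_)                          # study log odds ratios
    se = (np.log(hi) - np.log(lo)) / 3.92    # SE recovered from 95% CI width
    w = 1.0 / se**2                          # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed)**2)         # heterogeneity statistic
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)  # between-study variance
    w_re = 1.0 / (se**2 + tau2)              # random-effects weights
    mu = np.sum(w_re * y) / np.sum(w_re)
    se_mu = np.sqrt(1.0 / np.sum(w_re))
    return np.exp([mu, mu - 1.96 * se_mu, mu + 1.96 * se_mu])

or_ = np.array([1.9, 1.7, 2.1, 1.6])  # placeholder study ORs
lo = np.array([1.4, 1.2, 1.5, 1.1])   # lower 95% CI bounds
hi = np.array([2.6, 2.4, 2.9, 2.3])   # upper 95% CI bounds
est, ci_lo, ci_hi = dersimonian_laird(or_, lo, hi)
print(f"pooled OR {est:.2f} (95% CI {ci_lo:.2f}, {ci_hi:.2f})")
```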
Interpretation Analysis as a Competitive Event.
ERIC Educational Resources Information Center
Nading, Robert M.
Interpretation analysis is a new and interesting event on the forensics horizon which appears to be attracting an ever larger number of supporters. This event, developed by Larry Lambert of Ball State University in 1989, requires a student to perform all three disciplines of forensic competition (interpretation, public speaking, and limited…
Combining conversation analysis and event sequencing to study health communication.
Pecanac, Kristen E
2018-06-01
Good communication is essential in patient-centered care. The purpose of this paper is to describe conversation analysis and event sequencing and to explain how integrating these methods strengthened the analysis in a study of communication between clinicians and surrogate decision makers in an intensive care unit. Conversation analysis was first used to determine how clinicians introduced the need for decision-making regarding life-sustaining treatment and how surrogate decision makers responded. Event sequence analysis was then used to determine the transitional probability (the probability of one event leading to another in the interaction) that a given type of clinician introduction would lead to surrogate resistance or alignment. Conversation analysis provides a detailed analysis of the interaction between participants in a conversation. When combined with a quantitative analysis of the patterns of communication in an interaction, these data add information on the communication strategies that produce positive outcomes. Researchers can apply this mixed-methods approach to identify beneficial conversational practices and design interventions to improve health communication. © 2018 Wiley Periodicals, Inc.
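The transitional-probability computation itself is a simple matter of counting transitions between coded events and normalizing; the event codes and sequences below are invented for illustration.

```python
# Estimate transitional probabilities (how often one coded event leads to
# another) from a collection of coded interaction sequences.
from collections import Counter, defaultdict

def transition_probs(sequences):
    counts = defaultdict(Counter)
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return {a: {b: n / sum(c.values()) for b, n in c.items()}
            for a, c in counts.items()}

# Hypothetical codes: I = clinician introduces the decision,
# R = surrogate resistance, A = surrogate alignment.
coded = [["I", "A"], ["I", "R", "A"], ["I", "R", "R"], ["I", "A"]]
for a, row in transition_probs(coded).items():
    print(a, {b: round(p, 2) for b, p in row.items()})
```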
Multi-scale comparison of source parameter estimation using empirical Green's function approach
NASA Astrophysics Data System (ADS)
Chen, X.; Cheng, Y.
2015-12-01
Analysis of earthquake source parameters requires correction for path effects, site response, and instrument response. The empirical Green's function (EGF) method is one of the most effective methods for removing path effects and station responses, by taking the spectral ratio between a larger and a smaller event. The traditional EGF method requires identifying suitable event pairs and analyzing each event individually. This allows high-quality estimates for strictly selected events; however, the quantity of resolvable source parameters is limited, which challenges the interpretation of spatial-temporal coherency. On the other hand, methods have been proposed that exploit the redundancy of event-station pairs and utilize stacking to obtain systematic source parameter estimates for a large quantity of events at the same time. This makes it possible to examine large quantities of events systematically, facilitating analysis of spatial-temporal patterns and scaling relationships. However, it is unclear how much resolution is sacrificed in this process. In addition to the empirical Green's function calculation, the choice of model parameters and fitting methods also leads to biases. Here, using two regional focused arrays, the OBS array in the Mendocino region and the borehole array in the Salton Sea geothermal field, I compare the results from large-scale stacking analysis, small-scale cluster analysis, and single event-pair analysis with different fitting methods within completely different tectonic environments, in order to quantify the consistency and inconsistency in source parameter estimates and the associated problems.
Regression analysis of mixed recurrent-event and panel-count data with additive rate models.
Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L
2015-03-01
Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007, The Statistical Analysis of Recurrent Events. New York: Springer-Verlag; Zhao et al., 2011, Test 20, 1-42). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013, Statistics in Medicine 32, 1954-1963). In this article, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. © 2014, The International Biometric Society.
Uncertainties in stormwater runoff data collection from a small urban catchment, Southeast China.
Huang, Jinliang; Tu, Zhenshun; Du, Pengfei; Lin, Jie; Li, Qingsheng
2010-01-01
Monitoring data are often used to identify stormwater runoff characteristics and in stormwater runoff modelling without consideration of their inherent uncertainties. Integrating discrete sample analysis and error propagation analysis, this study attempted to quantify the uncertainties of discrete chemical oxygen demand (COD) and total suspended solids (TSS) concentrations, stormwater flowrate, stormwater event volumes, COD event mean concentration (EMC), and COD event loads in terms of flow measurement, sample collection, storage, and laboratory analysis. The results showed that the uncertainties due to sample collection, storage, and laboratory analysis of COD from stormwater runoff were 13.99%, 19.48%, and 12.28%, respectively. Meanwhile, the flow measurement uncertainty was 12.82%, and the sample collection uncertainty of TSS from stormwater runoff was 31.63%. Based on the law of propagation of uncertainties, the uncertainties regarding event flow volume, COD EMC, and COD event loads were quantified as 7.03%, 10.26%, and 18.47%, respectively.
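The propagation rule used for such estimates is that, for a quantity formed as a product of independent factors (event load = event mean concentration x event volume), relative standard uncertainties combine in quadrature. The sketch below illustrates the rule using component values from the text; it is not expected to reproduce the paper's reported totals, which also reflect averaging across the discrete samples within an event.

```python
# Quadrature combination of relative uncertainties for a product of
# independent factors, e.g. event load = EMC x event volume.
from math import sqrt

def combined_relative_uncertainty(*rel_unc_percent):
    """Law of propagation for a product of independent factors (percent)."""
    return sqrt(sum(u**2 for u in rel_unc_percent))

u_collection, u_storage, u_lab = 13.99, 19.48, 12.28  # % per discrete COD sample
u_volume = 7.03                                       # % for event flow volume

u_conc = combined_relative_uncertainty(u_collection, u_storage, u_lab)
u_load = combined_relative_uncertainty(u_conc, u_volume)
print(f"single-sample COD uncertainty ~ {u_conc:.1f}%")
print(f"COD event load uncertainty (no within-event averaging) ~ {u_load:.1f}%")
```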
Analysis tools for discovering strong parity violation at hadron colliders
NASA Astrophysics Data System (ADS)
Backović, Mihailo; Ralston, John P.
2011-07-01
Several arguments suggest parity violation may be observable in high energy strong interactions. We introduce new analysis tools to describe the azimuthal dependence of multiparticle distributions, or “azimuthal flow.” The analysis uses the representations of the orthogonal group O(2) and the dihedral groups DN necessary to define parity completely in two dimensions. The classification finds that collective angles used in event-by-event statistics represent inequivalent tensor observables that cannot generally be represented by a single “reaction plane.” Many new parity-violating observables exist that have never been measured, while many parity-conserving observables formerly lumped together are now distinguished. We use the concept of “event-shape sorting” to suggest separating right- and left-handed events, and we discuss the effects of transverse and longitudinal spin. The analysis tools are statistically robust and can be applied equally to low- or high-multiplicity events at the Tevatron, RHIC or RHIC Spin, and the LHC.
Stettler, Christoph; Allemann, Sabin; Jüni, Peter; Cull, Carole A; Holman, Rury R; Egger, Matthias; Krähenbühl, Stephan; Diem, Peter
2006-07-01
Uncertainty persists concerning the effect of improved long-term glycemic control on macrovascular disease in diabetes mellitus (DM). We performed a systematic review and meta-analysis of randomized controlled trials comparing interventions to improve glycemic control with conventional treatment in type 1 and type 2 diabetes. Outcomes included the incidence rate ratios for any macrovascular event, cardiac events, stroke, and peripheral arterial disease, and the number needed to treat intensively during 10 years to prevent one macrovascular event. The analysis was based on 8 randomized comparisons including 1800 patients with type 1 DM (134 macrovascular events, 40 cardiac events, 88 peripheral vascular events, 6 cerebrovascular events, 11293 person-years of follow-up) and 6 comparisons including 4472 patients with type 2 DM (1587 macrovascular events, 1197 cardiac events, 87 peripheral vascular events, 303 cerebrovascular events, 43607 person-years). Combined incidence rate ratios for any macrovascular event were 0.38 (95% CI 0.26-0.56) in type 1 and 0.81 (0.73-0.91) in type 2 DM. In type 1 DM, effect was mainly based on reduction of cardiac and peripheral vascular events and, in type 2 DM, due to reductions in stroke and peripheral vascular events. Effects appear to be particularly important in younger patients with shorter duration of diabetes. Our data suggest that attempts to improve glycemic control reduce the incidence of macrovascular events both in type 1 and type 2 DM. In absolute terms, benefits are comparable, although effects on specific manifestations of macrovascular disease differ.
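As a minimal sketch of how incidence rate ratios of the kind reported above can be pooled, the function below performs fixed-effect, inverse-variance weighting on the log scale. The authors' exact meta-analytic model may differ (a random-effects model would weight trials differently), and the inputs are per-trial arrays, not the study's data.

```python
import numpy as np

def pooled_irr(ev_t, py_t, ev_c, py_c):
    """Fixed-effect inverse-variance pooling of log incidence-rate ratios;
    inputs are per-trial arrays of event counts and person-years in the
    intensive (t) and conventional (c) arms."""
    ev_t, py_t = np.asarray(ev_t, float), np.asarray(py_t, float)
    ev_c, py_c = np.asarray(ev_c, float), np.asarray(py_c, float)
    log_irr = np.log((ev_t / py_t) / (ev_c / py_c))
    w = 1.0 / (1.0 / ev_t + 1.0 / ev_c)     # inverse of approx. log-IRR variance
    log_pooled = np.sum(w * log_irr) / np.sum(w)
    se = 1.0 / np.sqrt(np.sum(w))
    ci = np.exp(log_pooled + np.array([-1.96, 1.96]) * se)
    return np.exp(log_pooled), ci            # pooled IRR and 95% CI
```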
Khandelwal, Siddhartha; Wickstrom, Nicholas
2016-12-01
Detecting gait events is the key to many gait analysis applications that would benefit from continuous monitoring or long-term analysis. Most gait event detection algorithms using wearable sensors that offer a potential for use in daily living have been developed from data collected in controlled indoor experiments. However, for real-world applications, it is essential that the analysis is carried out in humans' natural environment; that involves different gait speeds, changing walking terrains, varying surface inclinations and regular turns among other factors. Existing domain knowledge in the form of principles or underlying fundamental gait relationships can be utilized to drive and support the data analysis in order to develop robust algorithms that can tackle real-world challenges in gait analysis. This paper presents a novel approach that exhibits how domain knowledge about human gait can be incorporated into time-frequency analysis to detect gait events from long-term accelerometer signals. The accuracy and robustness of the proposed algorithm are validated by experiments done in indoor and outdoor environments with approximately 93 600 gait events in total. The proposed algorithm exhibits consistently high performance scores across all datasets in both indoor and outdoor environments.
Jusufovic, Mirza; Sandset, Else Charlotte; Bath, Philip M; Berge, Eivind
2016-08-01
Early blood pressure-lowering treatment appears to be beneficial in patients with acute intracerebral haemorrhage and potentially in ischaemic stroke. We used a new method for analysis of vascular events in the Scandinavian Candesartan Acute Stroke Trial to see if the effect was dependent on the timing of treatment. The Scandinavian Candesartan Acute Stroke Trial was a randomized, placebo-controlled trial of candesartan given within 30 h of ischaemic or haemorrhagic stroke. Of 2029 patients, 231 (11.4%) had a vascular event (vascular death, nonfatal stroke or nonfatal myocardial infarction) during the first 6 months. The modified Rankin Scale (mRS) score following a vascular event was used to categorize vascular events in order of severity: no event (n = 1798), minor (mRS 0-2, n = 59), moderately severe (mRS 3-4, n = 57) and major event (mRS 5-6, n = 115). We used ordinal logistic regression for analysis and adjusted for predefined prognostic variables. Candesartan had no overall effect on vascular events (adjusted common odds ratio 1.11, 95% confidence interval 0.84-1.47, P = 0.48), and the effects were the same in ischaemic and haemorrhagic stroke. Among the patients treated within 6 h, the adjusted common odds ratio for vascular events was 0.37, 95% confidence interval 0.16-0.84, P = 0.02, and there was no heterogeneity of effect between ischaemic and haemorrhagic strokes. Ordinal analysis of vascular events showed no overall effect of candesartan in the subacute phase of stroke. The effect of treatment given within 6 h of stroke onset appears promising, and will be addressed in ongoing trials. Ordinal analysis of vascular events is feasible and can be used in future trials.
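The ordinal analysis described above can be sketched with an ordered logit model. The minimal example below uses the statsmodels package and assumes a patient-level data frame with hypothetical column names; the trial's predefined prognostic covariates are only approximated here.

```python
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

def fit_ordinal_events(df):
    """Ordered logistic regression of vascular-event severity: 'severity'
    is 0 = none, 1 = minor, 2 = moderately severe, 3 = major; covariate
    names are hypothetical stand-ins for the trial's prognostic variables."""
    cat = pd.CategoricalDtype(categories=[0, 1, 2, 3], ordered=True)
    y = df["severity"].astype(cat)
    X = df[["candesartan", "age", "baseline_severity_score"]]
    res = OrderedModel(y, X, distr="logit").fit(method="bfgs", disp=False)
    return res   # exp(coefficients) ~ common odds ratios per covariate
```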
CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS
Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...
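Although the record above is truncated, the underlying definition P(A|B) = P(A and B)/P(B) is straightforward to estimate from data. The toy sketch below is not CPROB itself, merely the definition applied to a list of observations with hypothetical field names.

```python
def conditional_probability(events, a, b):
    """Estimate P(A | B) from a list of dicts of boolean event flags."""
    b_count = sum(1 for e in events if e[b])
    ab_count = sum(1 for e in events if e[a] and e[b])
    return ab_count / b_count if b_count else float("nan")

samples = [{"impaired": True, "stressor": True},
           {"impaired": False, "stressor": True},
           {"impaired": False, "stressor": False}]
print(conditional_probability(samples, "impaired", "stressor"))  # 0.5
```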
An Analysis of US School Shooting Data (1840-2015)
ERIC Educational Resources Information Center
Paradice, David
2017-01-01
This paper describes the construction and descriptive analysis of a data set of United States school shooting events. Three hundred forty-three shooting events are included, spanning 175 years of United States educational history. All levels of US educational institution are included. Events are included when a firearm is discharged, regardless of…
Revisiting a Meta-Analysis of Helpful Aspects of Therapy in a Community Counselling Service
ERIC Educational Resources Information Center
Quick, Emma L; Dowd, Claire; Spong, Sheila
2018-01-01
This small scale mixed methods study examines helpful events in a community counselling setting, categorising impacts of events according to Timulak's [(2007). Identifying core categories of client-identified impact of helpful events in psychotherapy: A qualitative meta-analysis. "Psychotherapy Research," 17, 305-314] meta-synthesis of…
Biometrical issues in the analysis of adverse events within the benefit assessment of drugs.
Bender, Ralf; Beckmann, Lars; Lange, Stefan
2016-07-01
The analysis of adverse events plays an important role in the benefit assessment of drugs. Consequently, results on adverse events are an integral part of reimbursement dossiers submitted by pharmaceutical companies to health policy decision-makers. Methods applied in the analysis of adverse events commonly include simple standard methods for contingency tables. However, the results produced may be misleading if observations are censored at the time of discontinuation due to treatment switching or noncompliance, resulting in unequal follow-up periods. In this paper, we present examples to show that the application of inadequate methods for the analysis of adverse events in the reimbursement dossier can lead to a downgrading of the evidence on a drug's benefit in the subsequent assessment, as greater harm from the drug cannot be excluded with sufficient certainty. Legal regulations on the benefit assessment of drugs in Germany are presented, in particular, with regard to the analysis of adverse events. Differences in safety considerations between the drug approval process and the benefit assessment are discussed. We show that the naive application of simple proportions in reimbursement dossiers frequently leads to uninterpretable results if observations are censored and the average follow-up periods differ between treatment groups. Likewise, the application of incidence rates may be misleading in the case of recurrent events and unequal follow-up periods. To allow for an appropriate benefit assessment of drugs, adequate survival time methods accounting for time dependencies and duration of follow-up are required, not only for time-to-event efficacy endpoints but also for adverse events. © 2016 The Authors. Pharmaceutical Statistics published by John Wiley & Sons Ltd.
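A small sketch of the contrast discussed above: a naive proportion, a person-time incidence rate, and a Kaplan-Meier cumulative risk at a fixed horizon can disagree sharply when follow-up is unequal between arms. The example assumes a patient-level data frame with hypothetical column names and uses the lifelines package; recurrent adverse events and competing risks would need further machinery.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

def ae_risk_by_arm(df, horizon_days=365):
    """df has one row per patient with hypothetical columns 'arm',
    'followup_days' (time to first AE or censoring) and 'had_ae'
    (1 = AE observed, 0 = censored at switch/dropout)."""
    rows = {}
    for arm, g in df.groupby("arm"):
        naive = g["had_ae"].mean()                           # ignores follow-up
        rate = g["had_ae"].sum() / g["followup_days"].sum()  # events/person-day
        km = KaplanMeierFitter().fit(g["followup_days"], g["had_ae"])
        rows[arm] = {"naive_prop": naive,
                     "rate_per_day": rate,
                     "km_risk": 1.0 - km.predict(horizon_days)}
    return pd.DataFrame(rows).T
```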
NASA Astrophysics Data System (ADS)
Schröter, Kai; Elmer, Florian; Trieselmann, Werner; Kreibich, Heidi; Kunz, Michael; Khazai, Bijan; Dransch, Doris; Wenzel, Friedemann; Zschau, Jochen; Merz, Bruno; Mühr, Bernhard; Kunz-Plapp, Tina; Möhrle, Stella; Bessel, Tina; Fohringer, Joachim
2014-05-01
The Central European flood of June 2013 is one of the most severe flood events that have occurred in Central Europe in the past decades. All major German river basins were affected (Rhine, Danube, and Elbe as well as the smaller Weser catchment). In terms of spatial extent and event magnitude, it was the most severe event at least since 1950. Within the current research focus on near real time forensic disaster analysis, the Center for Disaster Management and Risk Reduction Technology (CEDIM) assessed and analysed the multiple facets of the flood event from the beginning. The aim is to describe the on-going event, analyse the event sources, link the physical characteristics to the impact and consequences of the event and to understand the root causes that turn the physical event into a disaster (or prevent it from becoming disastrous). For the near real time component of this research, tools for rapid assessment and concise presentation of analysis results are essential. This contribution provides a graphical summary of the results of the CEDIM-FDA analyses on the June 2013 flood. It demonstrates the potential of visual representations for improving the communication and hence usability of findings in a rapid, intelligible and expressive way as a valuable supplement to usual event reporting. It is based on analyses of the hydrometeorological sources, the flood pathways (from satellite imagery, data extraction from social media), the resilience of the affected regions, and causal loss analysis. The prototypical representation of the FDA-results for the June 2013 flood provides an important step in the development of graphical event templates for the visualisation of forensic disaster analyses. These are intended to become a standard component of future CEDIM-FDA event activities.
A cyber-event correlation framework and metrics
NASA Astrophysics Data System (ADS)
Kang, Myong H.; Mayfield, Terry
2003-08-01
In this paper, we propose a cyber-event fusion, correlation, and situation assessment framework that, when instantiated, will allow cyber defenders to better understand the local, regional, and global cyber-situation. This framework, with associated metrics, can be used to guide assessment of our existing cyber-defense capabilities, and to help evaluate the state of cyber-event correlation research and where we must focus our future cyber-event correlation research. The framework, based on the cyber-event gathering activities and analysis functions, consists of five operational steps, each of which provides a richer set of contextual information to support greater situational understanding. The first three steps are categorically depicted as increasingly richer and broader-scoped contexts achieved through correlation activity, while in the final two steps, these richer contexts are achieved through analytical activities (situation assessment, and threat analysis & prediction). Category 1 Correlation focuses on the detection of suspicious activities and the correlation of events from a single cyber-event source. Category 2 Correlation clusters the same or similar events from multiple detectors that are located in close proximity and prioritizes them. Finally, the events from different time periods and event sources at different locations/regions are correlated at Category 3 to recognize the relationship among different events. This is the category that focuses on the detection of large-scale and coordinated attacks. The situation assessment step (Category 4) focuses on the assessment of cyber asset damage and the analysis of the impact on missions. The threat analysis and prediction step (Category 5) analyzes attacks based on attack traces and predicts the next steps. Metrics that can distinguish correlation and cyber-situation assessment tools for each category are also proposed.
Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)
NASA Astrophysics Data System (ADS)
Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko
2016-07-01
A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold) increase in the number of located events compared with the original catalogue. Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation of the induced microseismicity and novel insights into dynamic rupture processes based on the average temporal (foreshock-aftershock) relationship of child events to parents.
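A single-channel toy version of the detection step is sketched below: the parent template is slid along the continuous record, and picks are kept where the locally normalized cross-correlation exceeds a threshold. The real MFA stacks multichannel correlations, rotates waveforms into ray-centred coordinates, and derives relative locations and magnitudes; none of that is attempted here.

```python
import numpy as np

def matched_filter_detect(continuous, template, threshold=0.7):
    """Normalized cross-correlation of a parent template against continuous
    data; returns sample indices of candidate child-event detections."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.correlate(continuous, t, mode="valid")
    # running mean/variance of each data window for local normalization
    c1 = np.cumsum(np.insert(continuous, 0, 0.0))
    c2 = np.cumsum(np.insert(continuous ** 2, 0, 0.0))
    mean = (c1[n:] - c1[:-n]) / n
    var = np.maximum((c2[n:] - c2[:-n]) / n - mean ** 2, 1e-20)
    ncc = cc / np.sqrt(var)                      # values in [-1, 1]
    cand = np.flatnonzero(ncc > threshold)
    # keep only local maxima so each child event is picked once
    return [i for i in cand if ncc[i] == ncc[max(0, i - n):i + n].max()]
```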
Oimatsu, Yu; Kaikita, Koichi; Ishii, Masanobu; Mitsuse, Tatsuro; Ito, Miwa; Arima, Yuichiro; Sueta, Daisuke; Takahashi, Aya; Iwashita, Satomi; Yamamoto, Eiichiro; Kojima, Sunao; Hokimoto, Seiji; Tsujita, Kenichi
2017-04-24
Periprocedural bleeding events are common after percutaneous coronary intervention. We evaluated the association of periprocedural bleeding events with thrombogenicity, which was measured quantitatively by the Total Thrombus-formation Analysis System equipped with microchips and thrombogenic surfaces (collagen, platelet chip [PL]; collagen plus tissue factor, atheroma chip [AR]). Between August 2013 and March 2016, 313 consecutive patients with coronary artery disease undergoing elective percutaneous coronary intervention were enrolled. They were divided into those with or without periprocedural bleeding events. We defined the bleeding events as composites of major bleeding events defined by the International Society on Thrombosis and Hemostasis and minor bleeding events (eg, minor hematoma, arteriovenous shunt and pseudoaneurysm). Blood samples obtained at percutaneous coronary intervention were analyzed for the thrombus formation area under the curve (PL24-AUC10 for the PL chip; AR10-AUC30 for the AR chip) by the Total Thrombus-formation Analysis System and for P2Y12 reaction units by the VerifyNow system. Periprocedural bleeding events occurred in 37 patients. PL24-AUC10 levels were significantly lower in patients with such events than in those without (P=0.002). Multiple logistic regression analyses showed an association between low PL24-AUC10 levels and periprocedural bleeding events (odds ratio, 2.71 [1.22-5.99]; P=0.01) and an association between PL24-AUC10 and periprocedural bleeding events in 176 patients of the femoral approach group (odds ratio, 2.88 [1.11-7.49]; P=0.03). However, PL24-AUC10 levels in 127 patients of the radial approach group were not significantly different between patients with and without periprocedural bleeding events. PL24-AUC10 measured by the Total Thrombus-formation Analysis System is a potentially useful predictor of periprocedural bleeding events in coronary artery disease patients undergoing elective percutaneous coronary intervention. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
Renfro, Lindsay A.; Grothey, Axel M.; Paul, James; Floriani, Irene; Bonnetain, Franck; Niedzwiecki, Donna; Yamanaka, Takeharu; Souglakos, Ioannis; Yothers, Greg; Sargent, Daniel J.
2015-01-01
Purpose: Clinical trials are expensive and lengthy, where success of a given trial depends on observing a prospectively defined number of patient events required to answer the clinical question. The point at which this analysis time occurs depends on both patient accrual and primary event rates, which typically vary throughout the trial's duration. We demonstrate real-time analysis date projections using data from a collection of six clinical trials that are part of the IDEA collaboration, an international preplanned pooling of data from six trials testing the duration of adjuvant chemotherapy in stage III colon cancer, and we additionally consider the hypothetical impact of one trial's early termination of follow-up. Patients and Methods: In the absence of outcome data from IDEA, monthly accrual rates for each of the six IDEA trials were used to project subsequent trial-specific accrual, while historical data from similar Adjuvant Colon Cancer Endpoints (ACCENT) Group trials were used to construct a parametric model for IDEA's primary endpoint, disease-free survival, under the same treatment regimen. With this information and using the planned total accrual from each IDEA trial protocol, individual patient accrual and event dates were simulated and the overall IDEA interim and final analysis times projected. Projections were then compared with actual (previously undisclosed) trial-specific event totals at a recent census time for validation. The change in projected final analysis date assuming early termination of follow-up for one IDEA trial was also calculated. Results: Trial-specific predicted event totals were close to the actual number of events per trial for the recent census date at which the number of events per trial was known, with the overall IDEA projected number of events only off by eight patients. Potential early termination of follow-up by one IDEA trial was estimated to postpone the overall IDEA final analysis date by 9 months. Conclusions: Real-time projection of the final analysis time during a trial, or the overall analysis time during a trial collaborative such as IDEA, has practical implications for trial feasibility when these projections are translated into additional time and resources required. PMID:26989447
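The projection idea can be sketched with a small Monte Carlo: simulate patient accrual and event calendar times, then read off when the target event count is reached. In the sketch below, uniform accrual and an exponential event model stand in for the paper's ACCENT-fitted parametric model, and all input numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)

def project_final_analysis(accrual_per_month, total_n, target_events,
                           median_dfs_months, n_sims=2000):
    """Monte Carlo projection of the calendar month (from study start)
    at which the target event count is reached, assuming uniform accrual
    and exponential time to event."""
    hazard = np.log(2.0) / median_dfs_months
    dates = np.empty(n_sims)
    for s in range(n_sims):
        entry = np.arange(1, total_n + 1) / accrual_per_month    # entry times
        event = entry + rng.exponential(1.0 / hazard, total_n)   # event times
        dates[s] = np.sort(event)[target_events - 1]
    return np.percentile(dates, [2.5, 50, 97.5])

print(project_final_analysis(25, 1500, 500, 60))   # months from study start
```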
Bayesian analysis of rare events
NASA Astrophysics Data System (ADS)
Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
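The core BUS reinterpretation can be shown in a few lines: rejection sampling turns Bayesian updating into a rare-event estimation problem. The crude Monte Carlo sketch below stands in for the FORM, importance sampling, and Subset Simulation machinery the paper actually advocates, and the constant c must bound the likelihood from above; the toy numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def bus_posterior_rare_event(prior_sample, likelihood, limit_state, c,
                             n=200_000):
    """BUS-style rejection sampling: accept a prior sample theta when
    u <= L(theta)/c; accepted samples follow the posterior, from which a
    rare-event probability P[g(theta) <= 0] is estimated."""
    theta = prior_sample(n)
    accept = rng.uniform(size=n) <= likelihood(theta) / c
    post = theta[accept]
    return np.mean(limit_state(post) <= 0.0), accept.mean()

# toy example: N(0,1) prior, one observation y = 2.0 with unit noise
like = lambda th: np.exp(-0.5 * (2.0 - th) ** 2)    # unnormalized, max = 1
pf, acc_rate = bus_posterior_rare_event(
    lambda n: rng.normal(size=n), like, lambda th: 3.5 - th, c=1.0)
print(f"posterior P[theta >= 3.5] ~ {pf:.1e} (acceptance {acc_rate:.2f})")
```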
An analysis of post-event processing in social anxiety disorder.
Brozovich, Faith; Heimberg, Richard G
2008-07-01
Research has demonstrated that self-focused thoughts and negative affect have a reciprocal relationship [Mor, N., Winquist, J. (2002). Self-focused attention and negative affect: A meta-analysis. Psychological Bulletin, 128, 638-662]. In the anxiety disorder literature, post-event processing has emerged as a specific construct of repetitive self-focused thoughts that pertains to social anxiety disorder. Post-event processing can be defined as an individual's repeated consideration and potential reconstruction of his or her performance following a social situation. Post-event processing can also occur when an individual anticipates a social or performance event and begins to brood about other, past social experiences. The present review examined the post-event processing literature in an attempt to organize and highlight the significant results. The methodologies employed to study post-event processing have included self-report measures, daily diaries, social or performance situations created in the laboratory, and experimental manipulations of post-event processing or anticipation of an upcoming event. Directions for future research on post-event processing are discussed.
Bayesian Approach for Flexible Modeling of Semicompeting Risks Data
Han, Baoguang; Yu, Menggang; Dignam, James J.; Rathouz, Paul J.
2016-01-01
Semicompeting risks data arise when two types of events, non-terminal and terminal, are observed. When the terminal event occurs first, it censors the non-terminal event, but not vice versa. To account for possible dependent censoring of the non-terminal event by the terminal event and to improve prediction of the terminal event using the non-terminal event information, it is crucial to model their association properly. Motivated by a breast cancer clinical trial data analysis, we extend the well-known illness-death models to allow flexible random effects to capture heterogeneous association structures in the data. Our extension also represents a generalization of the popular shared frailty models that usually assume that the non-terminal event does not affect the hazards of the terminal event beyond a frailty term. We propose a unified Bayesian modeling approach that can utilize existing software packages for both model fitting and individual specific event prediction. The approach is demonstrated via both simulation studies and a breast cancer data set analysis. PMID:25274445
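For orientation, a simplified form of the illness-death model with a single shared frailty γ is written below; the paper generalizes this by replacing the multiplicative frailty with flexible random effects, so this is only a hedged sketch of the baseline structure.

```latex
% illness-death transition hazards with a shared frailty \gamma
% (simplified; the paper allows flexible random effects instead):
\lambda_1(t_1 \mid \gamma) = \gamma\,\lambda_{01}(t_1)
      % healthy \to non-terminal event
\lambda_2(t_2 \mid \gamma) = \gamma\,\lambda_{02}(t_2)
      % healthy \to terminal event
\lambda_3(t_2 \mid t_1, \gamma) = \gamma\,\lambda_{03}(t_2), \quad t_2 > t_1
      % non-terminal \to terminal event
```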
Visual search of cyclic spatio-temporal events
NASA Astrophysics Data System (ADS)
Gautier, Jacques; Davoine, Paule-Annick; Cunty, Claire
2018-05-01
The analysis of spatio-temporal events, and especially of relationships between their different dimensions (space, time, thematic attributes), can be done with geovisualization interfaces. But few geovisualization tools integrate the cyclic dimension of spatio-temporal event series (natural or social events). Time Coil and Time Wave diagrams represent both linear time and cyclic time. By introducing a cyclic temporal scale, these diagrams can highlight the cyclic characteristics of spatio-temporal events. However, the available cyclic temporal scales are limited to usual durations such as days or months. Because of this, these diagrams cannot be used to visualize cyclic events that reappear with an unusual period, and they do not allow a visual search for cyclic events. Nor do they make it possible to identify relationships between the cyclic behavior of events and their spatial features, in particular to identify localised cyclic events. The lack of means to represent cyclic time outside the temporal diagram of multi-view geovisualization interfaces limits the analysis of relationships between the cyclic reappearance of events and their other dimensions. In this paper, we propose a method and a geovisualization tool, based on an extension of Time Coil and Time Wave, to support the visual search of cyclic events by allowing any possible duration to be set as the diagram's cyclic temporal scale. We also propose a symbology approach to carry the representation of cyclic time into the map itself, in order to improve the analysis of relationships between space and the cyclic behavior of events.
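At its core, the proposed cyclic scale is a projection of event times onto a cycle of arbitrary, user-chosen period. A minimal sketch follows, with a deliberately non-calendar period to illustrate the point; the actual Time Coil/Time Wave construction is graphical and much richer than this.

```python
from datetime import datetime, timedelta

def cyclic_position(timestamps, origin, period):
    """Project event times onto a cyclic scale of arbitrary period:
    returns (cycle index, phase in [0, 1)) for each event."""
    out = []
    for ts in timestamps:
        elapsed = (ts - origin) / period     # timedelta / timedelta -> float
        cycle, phase = divmod(elapsed, 1.0)
        out.append((int(cycle), phase))
    return out

events = [datetime(2018, 5, 1, 3), datetime(2018, 5, 18, 9)]
# an "unusual" period of 17.25 days, impossible with day/month scales
print(cyclic_position(events, datetime(2018, 1, 1), timedelta(days=17.25)))
```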
INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
R.J. Garrett
2005-02-17
The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.
An unjustified benefit: immortal time bias in the analysis of time-dependent events.
Gleiss, Andreas; Oberbauer, Rainer; Heinze, Georg
2018-02-01
Immortal time bias is a problem arising from methodologically wrong analyses of time-dependent events in survival analyses. We illustrate the problem by analysis of a kidney transplantation study. Following patients from transplantation to death, groups defined by the occurrence or nonoccurrence of graft failure during follow-up seemingly had equal overall mortality. Such naive analysis assumes that patients were assigned to the two groups at time of transplantation, which actually are a consequence of occurrence of a time-dependent event later during follow-up. We introduce landmark analysis as the method of choice to avoid immortal time bias. Landmark analysis splits the follow-up time at a common, prespecified time point, the so-called landmark. Groups are then defined by time-dependent events having occurred before the landmark, and outcome events are only considered if occurring after the landmark. Landmark analysis can be easily implemented with common statistical software. In our kidney transplantation example, landmark analyses with landmarks set at 30 and 60 months clearly identified graft failure as a risk factor for overall mortality. We give further typical examples from transplantation research and discuss strengths and limitations of landmark analysis and other methods to address immortal time bias such as Cox regression with time-dependent covariables. © 2017 Steunstichting ESOT.
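A minimal sketch of the landmark approach described above follows, using the lifelines package and hypothetical column names: patients still at risk at the landmark are grouped by whether the time-dependent event (graft failure) occurred before the landmark, and the follow-up clock is restarted at the landmark.

```python
import pandas as pd
from lifelines import CoxPHFitter

def landmark_cox(df, landmark=30):
    """Landmark analysis sketch. df has one row per patient with 'followup'
    (months from transplantation to death or censoring), 'died' (0/1) and
    'graft_failure_time' (months, NaN if no failure); names hypothetical."""
    d = df[df["followup"] > landmark].copy()        # at risk at the landmark
    d["failed_before_lm"] = d["graft_failure_time"].le(landmark).astype(int)
    d["time_after_lm"] = d["followup"] - landmark   # clock restarts here
    cph = CoxPHFitter()
    cph.fit(d[["time_after_lm", "died", "failed_before_lm"]],
            duration_col="time_after_lm", event_col="died")
    return cph    # cph.summary gives the hazard ratio for prior graft failure
```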
Development of a Bayesian Belief Network Runway Incursion and Excursion Model
NASA Technical Reports Server (NTRS)
Green, Lawrence L.
2014-01-01
In a previous work, a statistical analysis of runway incursion (RI) event data was conducted to ascertain the relevance of this data to the top ten Technical Challenges (TC) of the National Aeronautics and Space Administration (NASA) Aviation Safety Program (AvSP). The study revealed connections to several of the AvSP top ten TC and identified numerous primary causes and contributing factors of RI events. The statistical analysis served as the basis for developing a system-level Bayesian Belief Network (BBN) model for RI events, also previously reported. Through literature searches and data analysis, this RI event network has now been extended to also model runway excursion (RE) events. These RI and RE event networks have been further modified and vetted by a Subject Matter Expert (SME) panel. The combined system-level BBN model will allow NASA to generically model the causes of RI and RE events and to assess the effectiveness of technology products being developed under NASA funding. These products are intended to reduce the frequency of runway safety incidents/accidents, and to improve runway safety in general. The development and structure of the BBN for both RI and RE events are documented in this paper.
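Mechanically, a BBN reduces to marginalizing conditional probability tables over parent nodes. The toy fragment below illustrates this for a runway incursion node with two independent causal factors; all probabilities are invented for illustration and bear no relation to the vetted NASA model.

```python
# toy BBN fragment: two independent causal factors feed a conditional
# probability table for a runway incursion (all numbers hypothetical)
p_atc_error, p_pilot_dev = 0.02, 0.05
p_ri_given = {(0, 0): 0.0001, (0, 1): 0.02,
              (1, 0): 0.03,   (1, 1): 0.25}

# marginalize the CPT over both parents to get P(runway incursion)
p_ri = sum(p_ri_given[(a, d)]
           * (p_atc_error if a else 1 - p_atc_error)
           * (p_pilot_dev if d else 1 - p_pilot_dev)
           for a in (0, 1) for d in (0, 1))
print(f"P(runway incursion) = {p_ri:.5f}")
```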
Limitations imposed on fire PRA methods as the result of incomplete and uncertain fire event data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nowlen, Steven Patrick; Hyslop, J. S.
2010-04-01
Fire probabilistic risk assessment (PRA) methods utilize data and insights gained from actual fire events in a variety of ways. For example, fire occurrence frequencies, manual fire fighting effectiveness and timing, and the distribution of fire events by fire source and plant location are all based directly on the historical experience base. Other factors are either derived indirectly or supported qualitatively based on insights from the event data. These factors include the general nature and intensity of plant fires, insights into operator performance, and insights into fire growth and damage behaviors. This paper will discuss the potential methodology improvements that could be realized if more complete fire event reporting information were available. Areas that could benefit from more complete event reporting that will be discussed in the paper include fire event frequency analysis, analysis of fire detection and suppression system performance including incipient detection systems, analysis of manual fire fighting performance, treatment of fire growth from incipient stages to fully-involved fires, operator response to fire events, the impact of smoke on plant operations and equipment, and the impact of fire-induced cable failures on plant electrical circuits.
Video analysis of motor events in REM sleep behavior disorder.
Frauscher, Birgit; Gschliesser, Viola; Brandauer, Elisabeth; Ulmer, Hanno; Peralta, Cecilia M; Müller, Jörg; Poewe, Werner; Högl, Birgit
2007-07-30
In REM sleep behavior disorder (RBD), several studies focused on electromyographic characterization of motor activity, whereas video analysis has remained more general. The aim of this study was to undertake a detailed and systematic video analysis. Nine polysomnographic records from 5 Parkinson patients with RBD were analyzed and compared with sex- and age-matched controls. Each motor event in the video during REM sleep was classified according to duration, type of movement, and topographical distribution. In RBD, a mean of 54 +/- 23.2 events/10 minutes of REM sleep (total 1392) were identified and visually analyzed. Seventy-five percent of all motor events lasted <2 seconds. Of these events, 1,155 (83.0%) were classified as elementary, 188 (13.5%) as complex behaviors, 50 (3.6%) as violent, and 146 (10.5%) as vocalizations. In the control group, 3.6 +/- 2.3 events/10 minutes (total 264) of predominantly elementary simple character (n = 240, 90.9%) were identified. Number and types of motor events differed significantly between patients and controls (P < 0.05). This study shows a very high number and great variety of motor events during REM sleep in symptomatic RBD. However, most motor events are minor, and violent episodes represent only a small fraction. Copyright 2007 Movement Disorder Society
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, Danny H; Elwood Jr, Robert H
2011-01-01
Analysis of the material protection, control, and accountability (MPC&A) system is necessary to understand the limits and vulnerabilities of the system to internal threats. A self-appraisal helps the facility be prepared to respond to internal threats and reduce the risk of theft or diversion of nuclear material. The material control and accountability (MC&A) system effectiveness tool (MSET) fault tree was developed to depict the failure of the MPC&A system as a result of poor practices and random failures in the MC&A system. It can also be employed as a basis for assessing deliberate threats against a facility. MSET uses fault tree analysis, which is a top-down approach to examining system failure. The analysis starts with identifying a potential undesirable event called a 'top event' and then determining the ways it can occur (e.g., 'Fail To Maintain Nuclear Materials Under The Purview Of The MC&A System'). The analysis proceeds by determining how the top event can be caused by individual or combined lower level faults or failures. These faults, which are the causes of the top event, are 'connected' through logic gates. The MSET model uses AND-gates and OR-gates and propagates the effect of event failure using Boolean algebra. To enable the fault tree analysis calculations, the basic events in the fault tree are populated with probability risk values derived by conversion of questionnaire data to numeric values. The basic events are treated as independent variables. This assumption affects the Boolean algebraic calculations used to calculate results. All the necessary calculations are built into the fault tree codes, but it is often useful to estimate the probabilities manually as a check on code functioning. The probability of failure of a given basic event is the probability that the basic event primary question fails to meet the performance metric for that question. The failure probability is related to how well the facility performs the task identified in that basic event over time (not just one performance or exercise). Fault tree calculations provide a failure probability for the top event in the fault tree. The basic fault tree calculations establish a baseline relative risk value for the system. This probability depicts relative risk, not absolute risk. Subsequent calculations are made to evaluate the change in relative risk that would occur if system performance is improved or degraded. During the development effort of MSET, the fault tree analysis program used was SAPHIRE. SAPHIRE is an acronym for 'Systems Analysis Programs for Hands-on Integrated Reliability Evaluations.' Version 1 of the SAPHIRE code was sponsored by the Nuclear Regulatory Commission in 1987 as an innovative way to draw, edit, and analyze graphical fault trees primarily for safe operation of nuclear power reactors. When the fault tree calculations are performed, the fault tree analysis program will produce several reports that can be used to analyze the MPC&A system. SAPHIRE produces reports showing risk importance factors for all basic events in the operational MC&A system. The risk importance information is used to examine the potential impacts when performance of certain basic events increases or decreases. The initial results produced by the SAPHIRE program are considered relative risk values.
None of the results can be interpreted as absolute risk values since the basic event probability values represent estimates of risk associated with the performance of MPC&A tasks throughout the material balance area (MBA). The RRR for a basic event represents the decrease in total system risk that would result from improvement of that one event to a perfect performance level. Improvement of the basic event with the greatest RRR value produces a greater decrease in total system risk than improvement of any other basic event. Basic events with the greatest potential for system risk reduction are assigned performance improvement values, and new fault tree calculations show the improvement in total system risk. The operational impact or cost-effectiveness from implementing the performance improvements can then be evaluated. The improvements being evaluated can be system performance improvements, or they can be potential, or actual, upgrades to the system. The RIR for a basic event represents the increase in total system risk that would result from failure of that one event. Failure of the basic event with the greatest RIR value produces a greater increase in total system risk than failure of any other basic event. Basic events with the greatest potential for system risk increase are assigned failure performance values, and new fault tree calculations show the increase in total system risk. This evaluation shows the importance of preventing performance degradation of the basic events. SAPHIRE identifies combinations of basic events where concurrent failure of the events results in failure of the top event.
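The gate arithmetic and the two importance measures described above can be illustrated on a toy tree of independent basic events. Real MSET/SAPHIRE calculations work through minimal cut sets and Boolean reduction rather than this direct evaluation, and the probabilities below are invented.

```python
def or_gate(*p):     # independent inputs: P(at least one occurs)
    q = 1.0
    for pi in p:
        q *= 1.0 - pi
    return 1.0 - q

def and_gate(*p):    # independent inputs: P(all occur)
    out = 1.0
    for pi in p:
        out *= pi
    return out

def top_event(b):    # toy tree: TOP = OR(AND(b1, b2), b3)
    return or_gate(and_gate(b["b1"], b["b2"]), b["b3"])

base = {"b1": 0.10, "b2": 0.20, "b3": 0.05}   # basic-event failure probs
p0 = top_event(base)
for name in base:
    p_perfect = top_event({**base, name: 0.0})   # event performs perfectly
    p_failed = top_event({**base, name: 1.0})    # event fails outright
    rrr = p0 / p_perfect if p_perfect > 0 else float("inf")
    rir = p_failed / p0
    print(f"{name}: RRR = {rrr:.2f}, RIR = {rir:.2f}")
```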
[Analysis on the adverse events of cupping therapy in the application].
Zhou, Xin; Ruan, Jing-wen; Xing, Bing-feng
2014-10-01
A detailed analysis was performed of the cases of adverse events and common injuries from cupping therapy encountered in recent years, in terms of manipulation technique and patient constitution. The adverse events of cupping therapy are commonly caused by improper manipulation by medical practitioners, by ignoring contraindications, and by the patient's constitution. Clinical practitioners should apply cupping therapy cautiously, strictly follow the rules of standard manipulation and the medical core system, pay attention to contraindications, and take strict precautions against the occurrence of adverse events.
NASA Technical Reports Server (NTRS)
Perez, Christopher E.; Berg, Melanie D.; Friendlich, Mark R.
2011-01-01
The motivation for this work is to: (1) accurately characterize digital signal processor (DSP) core single-event effect (SEE) behavior; (2) test DSP cores across a large frequency range and across various input conditions; (3) isolate SEE analysis to the DSP cores alone; (4) interpret SEE analysis in terms of single-event upsets (SEUs) and single-event transients (SETs); and (5) provide flight missions with an accurate estimate of DSP core error rates and error signatures.
Distributed video data fusion and mining
NASA Astrophysics Data System (ADS)
Chang, Edward Y.; Wang, Yuan-Fang; Rodoplu, Volkan
2004-09-01
This paper presents an event sensing paradigm for intelligent event-analysis in a wireless, ad hoc, multi-camera, video surveillance system. In particular, we present statistical methods that we have developed to support three aspects of event sensing: 1) energy-efficient, resource-conserving, and robust sensor data fusion and analysis, 2) intelligent event modeling and recognition, and 3) rapid deployment, dynamic configuration, and continuous operation of the camera networks. We outline our preliminary results, and discuss future directions that research might take.
A Bayesian technique for improving the sensitivity of the atmospheric neutrino L/E analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blake, A. S. T.; Chapman, J. D.; Thomson, M. A.
This paper outlines a method for improving the precision of atmospheric neutrino oscillation measurements. One experimental signature for these oscillations is an observed deficit in the rate of νμ charged-current interactions with an oscillatory dependence on Lν/Eν, where Lν is the neutrino propagation distance and Eν is the neutrino energy. For contained-vertex atmospheric neutrino interactions, the Lν/Eν resolution varies significantly from event to event. The precision of the oscillation measurement can be improved by incorporating information on Lν/Eν resolution into the oscillation analysis. In the analysis presented here, a Bayesian technique is used to estimate the Lν/Eν resolution of observed atmospheric neutrinos on an event-by-event basis. By separating the events into bins of Lν/Eν resolution in the oscillation analysis, a significant improvement in oscillation sensitivity can be achieved.
Hydrometeorological Analysis of Flooding Events in San Antonio, TX
NASA Astrophysics Data System (ADS)
Chintalapudi, S.; Sharif, H.; Elhassan, A.
2008-12-01
South Central Texas is particularly vulnerable to floods due to: proximity to a moist air source (the Gulf of Mexico); the Balcones Escarpment, which concentrates rainfall runoff; a tendency for synoptic scale features to become cut-off and stall over the area; and decaying tropical cyclones stalling over the area. The San Antonio Metropolitan Area is the 7th largest city in the nation, one of the most flash-flood prone regions in North America, and has experienced a number of flooding events in the last decade (1998, 2002, 2004, and 2007). Research is being conducted to characterize the meteorological conditions that lead to these events and apply the rainfall and watershed characteristics data to recreate the runoff events using a two-dimensional, physically-based, distributed-parameter hydrologic model. The physically based, distributed-parameter Gridded Surface Subsurface Hydrologic Analysis (GSSHA) hydrological model was used for simulating the watershed response to these storm events. Finally, observed discharges were compared to GSSHA model discharges for these storm events. Analysis of some of these events will be presented.
Element analysis: a wavelet-based method for analysing time-localized events in noisy time series.
Lilly, Jonathan M
2017-04-01
A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized 'events'. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event's 'region of influence' within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry.
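A rough sketch of the transform-maxima idea follows, with an analytic Morlet wavelet standing in for the generalized Morse family (and with simplified normalization); the paper's noise-model significance test and region-of-influence criterion are omitted entirely.

```python
import numpy as np

def morlet_cwt(x, dt, scales, w0=6.0):
    """Continuous wavelet transform with an analytic Morlet wavelet,
    used here as a stand-in for generalized Morse wavelets."""
    n = len(x)
    freqs = np.fft.fftfreq(n, dt)
    X = np.fft.fft(x)
    W = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # frequency-domain Morlet, positive frequencies only (analytic)
        psi_hat = np.exp(-0.5 * (2.0 * np.pi * s * freqs - w0) ** 2) * (freqs > 0)
        W[i] = np.fft.ifft(X * psi_hat) * np.sqrt(s)
    return W

def transform_maxima(W, scales, floor):
    """Joint time/scale local maxima of |W| above an amplitude floor:
    candidate event points (edges wrap via roll; fine for a sketch)."""
    A = np.abs(W)
    scales = np.asarray(scales)
    m = ((A > floor)
         & (A >= np.roll(A, 1, axis=1)) & (A >= np.roll(A, -1, axis=1))
         & (A >= np.roll(A, 1, axis=0)) & (A >= np.roll(A, -1, axis=0)))
    rows, cols = np.where(m)
    return list(zip(scales[rows], cols))   # (scale, time index) pairs
```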
Pediatric emergency department census during major sporting events.
Kim, Tommy Y; Barcega, Besh B; Denmark, T Kent
2012-11-01
Our study attempted to evaluate the effects of major sporting events on the census of a pediatric emergency department (ED) in the United States, specifically related to the National Football League Super Bowl, National Basketball Association (NBA) Finals, and Major League Baseball World Series. We performed a retrospective data analysis of our pediatric ED census on the number of visits during major sporting events over a 5-year period. Data during the same period 1 week after the major sporting event were collected for comparison as the control. We evaluated the medians of 2-hour increments around the event start time. Subgroup analysis was performed for games involving the local sporting teams. Our results showed no significant difference in ED census during the sporting events, except in the 6 to 8 hours after the start of the NBA Finals. Subgroup analysis of the Los Angeles Lakers games showed the same significant findings in the 6 to 8 hours after the start of the NBA Finals. No major difference in pediatric ED census is observed during most major sporting events in the United States.
Alternative splicing and trans-splicing events revealed by analysis of the Bombyx mori transcriptome
Shao, Wei; Zhao, Qiong-Yi; Wang, Xiu-Ye; Xu, Xin-Yan; Tang, Qing; Li, Muwang; Li, Xuan; Xu, Yong-Zhen
2012-01-01
Alternative splicing and trans-splicing events have not been systematically studied in the silkworm Bombyx mori. Here, the silkworm transcriptome was analyzed by RNA-seq. We identified 320 novel genes, modified 1140 gene models, and found thousands of alternative splicing and 58 trans-splicing events. Studies of three SR proteins show that both their alternative splicing patterns and mRNA products are conserved from insect to human, and one isoform of Srsf6 with a retained intron is expressed sex-specifically in silkworm gonads. Trans-splicing of mod(mdg4) in silkworm was experimentally confirmed. We identified integrations from a common 5′-gene with 46 newly identified alternative 3′-exons that are located on both DNA strands over a 500-kb region. Other trans-splicing events in B. mori were predicted by bioinformatic analysis, in which 12 events were confirmed by RT-PCR, six events were further validated by chimeric SNPs, and two events were confirmed by allele-specific RT-PCR in F1 hybrids from distinct silkworm lines of JS and L10, indicating that trans-splicing is more widespread in insects than previously thought. Analysis of the B. mori transcriptome by RNA-seq provides valuable information of regulatory alternative splicing events. The conservation of splicing events across species and newly identified trans-splicing events suggest that B. mori is a good model for future studies. PMID:22627775
NASA Astrophysics Data System (ADS)
Tian, F.; Sivapalan, M.; Li, H.; Hu, H.
2007-12-01
The importance of diagnostic analysis of hydrological models is increasingly recognized by the scientific community (Sivapalan et al., 2003; Gupta et al., 2007). Model diagnosis refers to model structures and parameters being identified not only by statistical comparison of system state variables and outputs but also by process understanding in a specific watershed. Process understanding can be gained by the analysis of observational data and model results at the specific watershed as well as through regionalization. Although remote sensing technology can provide valuable data about the inputs, state variables, and outputs of the hydrological system, observational rainfall-runoff data still constitute the most accurate, reliable, and direct information, and thus a basic component of hydrology-related databases. One critical question in model diagnostic analysis is, therefore, what signature characteristics can be extracted from rainfall and runoff data. To date only a few studies have focused on this question, such as Merz et al. (2006) and Lana-Renault et al. (2007), and none of these studies has related event analysis to model diagnosis in an explicit, rigorous, and systematic manner. Our work focuses on the identification of dominant runoff generation mechanisms from event analysis of rainfall-runoff data, including correlation analysis and analysis of timing patterns. The correlation analysis involves identifying the complex relationships among rainfall depth, intensity, runoff coefficient, and antecedent conditions, and the timing pattern analysis aims to identify the clustering pattern of runoff events in relation to the patterns of rainfall events. Our diagnostic analysis illustrates the changing pattern of runoff generation mechanisms in the DMIP2 test watersheds located in the Oklahoma region, which is also well reproduced by numerical simulations based on the TsingHua Representative Elementary Watershed (THREW) model. The result suggests the usefulness of rainfall-runoff event analysis for model development as well as model diagnostics.
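The event-signature step described above amounts to building a per-event table and examining rank correlations among depth, intensity, antecedent wetness, and runoff coefficient. A minimal sketch with hypothetical column names follows; real analyses would first extract events via baseflow separation and storm delineation.

```python
import pandas as pd

def event_signatures(events: pd.DataFrame) -> pd.DataFrame:
    """events: one row per rainfall-runoff event; all column names here
    are hypothetical. Returns the Spearman rank-correlation matrix of
    event characteristics, a simple signature for diagnosing runoff
    generation mechanisms."""
    events = events.assign(
        runoff_coeff=events["runoff_mm"] / events["rain_mm"])
    cols = ["rain_mm", "max_intensity_mm_h",
            "antecedent_rain_5d_mm", "runoff_coeff"]
    return events[cols].corr(method="spearman")
```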
Systematic inference of functional phosphorylation events in yeast metabolism.
Chen, Yu; Wang, Yonghong; Nielsen, Jens
2017-07-01
Protein phosphorylation is a post-translational modification that affects proteins by changing their structure and conformation in a rapid and reversible way, and it is an important mechanism for metabolic regulation in cells. Phosphoproteomics enables high-throughput identification of phosphorylation events on metabolic enzymes, but identifying functional phosphorylation events still requires more detailed biochemical characterization. Therefore, development of computational methods for investigating unknown functions of a large number of phosphorylation events identified by phosphoproteomics has received increased attention. We developed a mathematical framework that describes the relationship between the phosphorylation level of a metabolic enzyme and the corresponding flux through the enzyme. Using this framework, it is possible to quantitatively estimate the contribution of phosphorylation events to flux changes. We showed that phosphorylation regulation analysis, combined with a systematic workflow and correlation analysis, can be used for inference of functional phosphorylation events in steady and dynamic conditions, respectively. Using this analysis, we assigned functionality to phosphorylation events of 17 metabolic enzymes in the yeast Saccharomyces cerevisiae, among which 10 are novel. Phosphorylation regulation analysis can not only be extended for inference of other functional post-translational modifications but also serve as a promising scaffold for multi-omics data integration in systems biology. Matlab codes for flux balance analysis in this study are available in Supplementary material. yhwang@ecust.edu.cn or nielsenj@chalmers.se. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
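As a crude stand-in for the correlation component of the workflow above, one can rank-correlate an enzyme's phosphorylation level with the flux it carries across conditions. The sketch below assumes one value per condition (e.g., from phosphoproteomics and flux balance analysis) and is far simpler than the paper's quantitative framework.

```python
from scipy.stats import spearmanr

def phospho_flux_screen(phos_levels, fluxes):
    """Rank-correlate phosphorylation level with flux across conditions
    for one enzyme; inputs are hypothetical per-condition values."""
    rho, p = spearmanr(phos_levels, fluxes)
    return rho, p

rho, p = phospho_flux_screen([0.10, 0.35, 0.60, 0.80], [5.2, 3.9, 2.1, 1.0])
print(f"rho={rho:.2f}, p={p:.3f}")  # strong negative: candidate inhibitory site
```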
Bercu, Joel P; Jolly, Robert A; Flagella, Kelly M; Baker, Thomas K; Romero, Pedro; Stevens, James L
2010-12-01
In order to determine a threshold for nongenotoxic carcinogens, the traditional risk assessment approach has been to identify a mode of action (MOA) with a nonlinear dose-response. The dose-response for one or more key event(s) linked to the MOA for carcinogenicity allows a point of departure (POD) to be selected from the most sensitive effect dose or no-effect dose. However, this can be challenging because multiple MOAs and key events may exist for carcinogenicity and oftentimes extensive research is required to elucidate the MOA. In the present study, a microarray analysis was conducted to determine if a POD could be identified following short-term oral rat exposure with two nongenotoxic rodent carcinogens, fenofibrate and methapyrilene, using a benchmark dose analysis of genes aggregated in Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways and Gene Ontology (GO) biological processes, which likely encompass key event(s) for carcinogenicity. The gene expression response for fenofibrate given to rats for 2 days was consistent with its MOA and known key events linked to PPARα activation. The temporal response from daily dosing with methapyrilene demonstrated biological complexity with waves of pathways/biological processes occurring over 1, 3, and 7 days; nonetheless, the benchmark dose values were consistent over time. When comparing the dose-response of toxicogenomic data to tumorigenesis or precursor events, the toxicogenomics POD was slightly below any effect level. Our results suggest that toxicogenomic analysis using short-term studies can be used to identify a threshold for nongenotoxic carcinogens based on evaluation of potential key event(s) which then can be used within a risk assessment framework. Copyright © 2010 Elsevier Inc. All rights reserved.
Araz, Coskun; Pirat, Arash; Unlukaplan, Aytekin; Torgay, Adnan; Karakayali, Hamdi; Arslan, Gulnaz; Moray, Gokhan; Haberal, Mehmet
2012-04-01
To evaluate the frequency, type, and predictors of intraoperative adverse events during donor hepatectomy for living-donor liver transplant. Retrospective analyses of the data from 182 consecutive living-donor liver transplant donors between May 2002 and September 2008. Ninety-one patients (50%) had at least 1 intraoperative adverse event including hypothermia (39%), hypotension (26%), need for transfusions (17%), and hypertension (7%). Patients with an adverse event were older (P = .001), had a larger graft weight (P = .023), more frequently underwent a right hepatectomy (P = .019), and were more frequently classified as American Society of Anesthesiologists physical status class II (P = .027) than those who did not have these adverse events. Logistic regression analysis revealed that only age (95% confidence interval 1.018-1.099; P = .001) was a risk factor for intraoperative adverse events. Patients with these adverse events more frequently required admission to the intensive care unit and were hospitalized longer postoperatively. A before and after analysis showed that after introduction of in-line fluid warmers and more frequent use of acute normovolemic hemodilution, the frequency of intraoperative adverse events was significantly lower (80% vs 29%; P < .001). Intraoperative adverse events such as hypothermia and hypotension were common in living-donor liver transplant donors, and older age was associated with an increased risk of these adverse events. However, the effect of these adverse events on postoperative recovery is not clear.
Luhmann, Maike; Hofmann, Wilhelm; Eid, Michael; Lucas, Richard E.
2012-01-01
Previous research has shown that major life events can have short- and long-term effects on subjective well-being (SWB). The present meta-analysis examines (a) whether life events have different effects on cognitive and affective well-being and (b) how the rate of adaptation varies across different life events. Longitudinal data from 188 publications (313 samples, N = 65,911) were integrated to describe the reaction and adaptation to four family events (marriage, divorce, bereavement, childbirth) and four work events (unemployment, reemployment, retirement, relocation/migration). The findings show that life events have very different effects on affective and cognitive well-being, and that for most events the effects of life events on cognitive well-being are stronger and more consistent across samples. Different life events differ in their effects on SWB, but these effects are not a function of the alleged desirability of events. The results are discussed with respect to their theoretical implications, and recommendations for future studies on adaptation are given. PMID:22059843
NASA Technical Reports Server (NTRS)
Turso, James; Lawrence, Charles; Litt, Jonathan
2004-01-01
The development of a wavelet-based feature extraction technique specifically targeting FOD-event induced vibration signal changes in gas turbine engines is described. The technique performs wavelet analysis of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/health parameter estimation for FOD-event detection via information fusion from these (and perhaps other) sources. Due to the lack of high-frequency FOD-event test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite element model modal analysis to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-event detection. In the presence of significant noise, precise location of the FOD event in time was obtained using the developed wavelet-based feature.
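As a hedged illustration of the wavelet idea described above, the sketch below flags a short FOD-like transient buried in noisy accelerometer data by thresholding fine-scale wavelet detail coefficients with PyWavelets. The sampling rate, wavelet choice, and threshold are illustrative assumptions, not the paper's engine model or tuned detector.

```python
# Wavelet detail coefficients as a transient detector: a short FOD-like
# impulse in noisy accelerometer data produces outlying fine-scale coefficients.
import numpy as np
import pywt

fs = 10_000                                   # Hz, assumed sampling rate
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(1)
signal = 0.5 * np.sin(2 * np.pi * 60 * t) + 0.3 * rng.normal(size=t.size)
signal[5_000:5_020] += 4.0                    # synthetic FOD transient at t = 0.5 s

coeffs = pywt.wavedec(signal, "db4", level=4)
d1 = coeffs[-1]                               # finest-scale detail coefficients
sigma = np.median(np.abs(d1)) / 0.6745        # robust noise estimate
hits = np.flatnonzero(np.abs(d1) > 5 * sigma)
print("transient near t =", hits[0] * 2 / fs if hits.size else "none", "s")
```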
Results from the First Two Flights of the Static Computer Memory Integrity Testing Experiment
NASA Technical Reports Server (NTRS)
Hancock, Thomas M., III
1999-01-01
This paper details the scientific objectives, experiment design, data collection method, and post-flight analysis following the first two flights of the Static Computer Memory Integrity Testing (SCMIT) experiment. SCMIT is designed to detect soft-event upsets in passive magnetic memory. A soft-event upset is a change in the logic state of active or passive forms of magnetic memory, commonly referred to as a "bitflip". In its mildest form, a soft-event upset can cause software exceptions or unexpected events, trigger spacecraft safing (ending data collection), or corrupt fault protection and error-recovery capabilities. In its most severe form, loss of the mission or spacecraft can occur. Analysis after the first flight (in 1991 during STS-40) identified possible soft-event upsets in 25% of the experiment detectors. Post-flight analysis after the second flight (in 1997 on STS-87) failed to find any evidence of soft-event upsets. The SCMIT experiment is currently scheduled for a third flight in December 1999 on STS-101.
NASA Astrophysics Data System (ADS)
Tavakoli, S.; Poslad, S.; Fruhwirth, R.; Winter, M.
2012-04-01
This paper introduces an application of the novel EventTracker platform for instantaneous sensitivity analysis (SA) of large-scale real-time geo-information. Earth disaster management systems demand high-quality information to aid a quick and timely response to their evolving environments. The idea behind the proposed EventTracker platform is the assumption that modern information management systems are able to capture data in real time and have the technological flexibility to adjust their services to work with specific sources of data/information. However, to assure this adaptation in real time, the online data should be collected, interpreted, and translated into corrective actions in a concise and timely manner. This can hardly be handled by existing sensitivity analysis methods, because they rely on historical data and lazy processing algorithms. In event-driven systems, the effect of system inputs on system state is of value, as events could cause this state to change. This 'event triggering' situation underpins the logic of the proposed approach. The event-tracking sensitivity analysis method describes the system variables and states as a collection of events. The higher the occurrence of an input variable during the triggering of an event, the greater its potential impact on the final analysis of the system state. Experiments were designed to compare the proposed event-tracking sensitivity analysis with an existing entropy-based sensitivity analysis method. The results showed a 10% improvement in computational efficiency with no compromise in accuracy, and the computational time required to perform the sensitivity analysis was 0.5% of that required by the entropy-based method. The proposed method has been applied to real-world data in the context of preventing emerging crises at drilling rigs. One of the major purposes of such rigs is to drill boreholes to explore oil or gas reservoirs, with the final aim of recovering their content, in both onshore and offshore regions. Drilling a well is always guided by technical, economic, and security constraints to protect crew, equipment, and the environment from injury, damage, and pollution. Although risk assessment and local practice provide a high degree of security, uncertainty arises from the behaviour of the formation, which may cause critical situations at the rig. To overcome such uncertainties, real-time sensor measurements form a basis for predicting, and thus preventing, such crises; the proposed method supports the identification of the data necessary to do so.
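As a loose, hedged reconstruction of the event-tracking idea (not the EventTracker implementation itself), the sketch below scores each input variable by how often its own change-events immediately precede system state events; the signals, threshold, and window length are all invented.

```python
# Toy event-tracking sensitivity scores: how often does each input variable's
# own change-event fall in a short window before a system state event?
import numpy as np

rng = np.random.default_rng(2)
n = 2_000
inputs = {name: rng.normal(size=n) for name in ("flow_rate", "pressure", "rpm")}
inputs["pressure"][::50] += 6.0               # bursts that actually drive the state
state = np.zeros(n)
state[2::50] = 1                              # state events lag the pressure bursts

def change_events(x, k=3.0):
    """Flag samples whose first difference exceeds k robust sigmas."""
    dx = np.abs(np.diff(x, prepend=x[0]))
    sigma = np.median(np.abs(dx - np.median(dx))) / 0.6745
    return np.flatnonzero(dx > k * sigma)

window = 5                                    # trigger window (samples)
state_idx = np.flatnonzero(state)
for name, x in inputs.items():
    ev = change_events(x)
    score = np.mean([np.any((ev <= s) & (ev > s - window)) for s in state_idx])
    print(f"{name:10s} sensitivity score: {score:.2f}")
```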
The Monitoring Erosion of Agricultural Land and spatial database of erosion events
NASA Astrophysics Data System (ADS)
Kapicka, Jiri; Zizala, Daniel
2013-04-01
In 2011, the Monitoring Erosion of Agricultural Land project was launched in the Czech Republic as a joint project of the State Land Office (SLO) and the Research Institute for Soil and Water Conservation (RISWC). The aim of the project is to collect and keep records of information about erosion events on agricultural land and to evaluate them. The main idea is the creation of a spatial database serving as a source of data and information for evaluating and modelling the erosion process, for proposing preventive measures, and for reducing the negative impacts of erosion events. The subject of monitoring is the manifestation of water erosion, wind erosion, and slope deformation that damage agricultural land. A website, available at http://me.vumop.cz, is used as a tool for keeping and browsing information about monitored events. SLO employees carry out the record keeping. RISWC is the specialist institute in the project: it maintains the spatial database, runs the website, manages the record keeping of events, analyses the causes of events, and performs statistical evaluation of recorded events and proposed measures. Records are inserted into the database through the user interface of the website, which includes a map server. The website is based on the PostgreSQL database technology with the PostGIS extension and UMN MapServer. Each record is spatially localized in the database by a drawing and contains descriptive information about the character of the event (date, description of the situation, etc.); information about land cover and the crops grown is also recorded. Part of the database is photo documentation taken during field reconnaissance, which is performed within two days of notification of an event. Another part comprises precipitation data from accessible rain gauges. The website allows simple spatial analyses such as area calculation, slope calculation, percentage representation of GAEC, etc. The database structure was designed on the basis of an analysis of the inputs needed by mathematical models, which are used for detailed analysis of chosen erosion events, including soil analysis. By the end of 2012 the database contained 135 events. Its content continues to grow, giving rise to an extensive source of data usable for testing mathematical models.
The `TTIME' Package: Performance Evaluation in a Cluster Computing Environment
NASA Astrophysics Data System (ADS)
Howe, Marico; Berleant, Daniel; Everett, Albert
2011-06-01
The objective of translating developmental event time across mammalian species is to gain an understanding of the timing of human developmental events based on the known times of those events in animals. The potential benefits include improvements to diagnostic and intervention capabilities. The CRAN `ttime' package provides the functionality to infer unknown event timings and to investigate phylogenetic proximity by hierarchical clustering of both known and predicted event timings. The original generic mammalian model included nine eutherian mammals: Felis domestica (cat), Mustela putorius furo (ferret), Mesocricetus auratus (hamster), Macaca mulatta (monkey), Homo sapiens (humans), Mus musculus (mouse), Oryctolagus cuniculus (rabbit), Rattus norvegicus (rat), and Acomys cahirinus (spiny mouse). However, the data for this model are expected to grow as more developmental events are identified and incorporated into the analysis. Evaluating the performance of the `ttime' package in a cluster computing environment against a comparative analysis in a serial computing environment provides an important computational performance assessment. A theoretical analysis is the first stage of a process in which the second stage, if justified by the theoretical analysis, is to investigate an actual implementation of the `ttime' package in a cluster computing environment and to understand the parallelization process that underlies the implementation.
Bayesian analysis of rare events
DOE Office of Scientific and Technical Information (OSTI.GOV)
Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
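To make the rejection-sampling reinterpretation concrete, here is a hedged sketch that implements BUS with plain Monte Carlo rather than the FORM/IS/SuS machinery the paper couples it with; the prior, likelihood, observations, and the rare event {theta > 1.2} are all invented for illustration.

```python
# BUS as rejection sampling: accept a prior sample when u < c * L(theta);
# accepted samples follow the posterior, and acceptance is a rare event.
import numpy as np

rng = np.random.default_rng(3)
data = np.array([1.1, 0.9, 1.05])        # hypothetical monitoring observations
sigma = 0.2                              # assumed measurement noise

def likelihood(theta):
    """Gaussian likelihood of the observations for each theta (vectorized)."""
    theta = np.atleast_1d(theta)
    return np.exp(-0.5 * np.sum((data[None, :] - theta[:, None]) ** 2, axis=1) / sigma**2)

# choose c so that c * L(theta) <= 1; this Gaussian likelihood peaks at the mean
c = 1.0 / likelihood(data.mean())[0]

theta = rng.normal(0.0, 1.0, 200_000)    # prior samples
u = rng.uniform(size=theta.size)
posterior = theta[u < c * likelihood(theta)]

print("acceptance rate:", posterior.size / theta.size)
print("posterior P(theta > 1.2) ≈", (posterior > 1.2).mean())
```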
Magnesium and the Risk of Cardiovascular Events: A Meta-Analysis of Prospective Cohort Studies
Hao, Yongqiang; Li, Huiwu; Tang, Tingting; Wang, Hao; Yan, Weili; Dai, Kerong
2013-01-01
Background Prospective studies that have examined the association between dietary magnesium intake and serum magnesium concentrations and the risk of cardiovascular disease (CVD) events have reported conflicting findings. We undertook a meta-analysis to evaluate the association between dietary magnesium intake and serum magnesium concentrations and the risk of total CVD events. Methodology/Principal Findings We performed systematic searches on MEDLINE, EMBASE, and OVID up to February 1, 2012 without limits. Categorical, linear, and nonlinear dose-response analyses, as well as heterogeneity, publication bias, subgroup, and meta-regression analyses, were performed. The analysis included 532,979 participants from 19 studies (11 studies on dietary magnesium intake, 6 studies on serum magnesium concentrations, and 2 studies on both) with 19,926 CVD events. The pooled relative risks of total CVD events for the highest vs. lowest category of dietary magnesium intake and serum magnesium concentrations were 0.85 (95% confidence interval 0.78 to 0.92) and 0.77 (0.66 to 0.87), respectively. In linear dose-response analysis, only serum magnesium concentrations ranging from 1.44 to 1.8 mEq/L were significantly associated with the risk of total CVD events (0.91, 0.85 to 0.97, per 0.1 mEq/L increment; P for nonlinearity = 0.465). However, significant inverse associations emerged in nonlinear models for dietary magnesium intake (P for nonlinearity = 0.024). The greatest risk reduction occurred when intake increased from 150 to 400 mg/d. There was no evidence of publication bias. Conclusions/Significance There is a statistically significant nonlinear inverse association between dietary magnesium intake and the risk of total CVD events. Serum magnesium concentrations are linearly and inversely associated with the risk of total CVD events. PMID:23520480
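Pooled relative risks of the kind quoted above are typically produced by inverse-variance random-effects pooling. The hedged sketch below implements DerSimonian-Laird pooling for six made-up study-level RRs and confidence intervals; it shows the generic machinery, not this meta-analysis's actual data or software.

```python
# DerSimonian-Laird random-effects pooling of study-level relative risks.
import numpy as np

rr = np.array([0.80, 0.92, 0.75, 1.05, 0.85, 0.88])
lo = np.array([0.65, 0.78, 0.60, 0.85, 0.70, 0.72])
hi = np.array([0.98, 1.08, 0.94, 1.30, 1.03, 1.08])

y = np.log(rr)                                   # log relative risks
se = (np.log(hi) - np.log(lo)) / (2 * 1.96)      # SE recovered from the 95% CI
w = 1 / se**2                                    # fixed-effect weights

# DerSimonian-Laird between-study variance tau^2
q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1 / (se**2 + tau2)                        # random-effects weights
pooled = np.sum(w_re * y) / np.sum(w_re)
se_pooled = np.sqrt(1 / np.sum(w_re))
print(f"pooled RR {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * se_pooled):.2f}-{np.exp(pooled + 1.96 * se_pooled):.2f})")
```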
DOE Office of Scientific and Technical Information (OSTI.GOV)
Supanich, M; Chu, J; Wehmeyer, A
2014-06-15
Purpose: This work offers as a teaching example a reported high-dose fluoroscopy case and the workflow the institution followed to self-report a radiation overdose sentinel event to the Joint Commission. Methods: Following the completion of a clinical case in a hybrid OR room with a reported air kerma of >18 Gy at the Interventional Reference Point (IRP), the physicians involved in the case referred the study to the institution's Radiation Safety Committee (RSC) for review. The RSC assigned a Diagnostic Medical Physicist (DMP) to estimate the patient's Peak Skin Dose (PSD) and analyze the case. Following the DMP's analysis and estimate of a PSD of >15 Gy, the institution's adverse event committee was convened to discuss the case and to self-report it as a radiation overdose sentinel event to the Joint Commission. The committee assigned a subgroup to perform the root cause analysis and develop institutional responses to the event. Results: The self-reporting of the sentinel event and the associated root cause analysis resulted in several institutional action items designed to improve process and safety. A formal reporting and analysis mechanism was adopted to review fluoroscopy cases with air kerma greater than 6 Gy at the IRP. An improved and formalized radiation safety training program for physicians using fluoroscopy equipment was implemented. Additionally, efforts already under way to monitor radiation exposure in the Radiology department were expanded to include all fluoroscopy equipment capable of automated dose reporting. Conclusion: The adverse event review process and the root cause analysis following the self-reporting of the sentinel event resulted in policies and procedures that are expected to improve the quality and safe usage of fluoroscopy throughout the institution.
NASA Astrophysics Data System (ADS)
Otto, Friederike E. L.; van der Wiel, Karin; van Oldenborgh, Geert Jan; Philip, Sjoukje; Kew, Sarah F.; Uhe, Peter; Cullen, Heidi
2018-02-01
On 4-6 December 2015, storm Desmond caused very heavy rainfall in Northern England and Southern Scotland, which led to widespread flooding. A week after the event we provided an initial assessment of the influence of anthropogenic climate change on the likelihood of one-day precipitation events averaged over an area encompassing Northern England and Southern Scotland, using data and methods available immediately after the event occurred. The analysis was based on three independent methods of extreme event attribution: historical observed trends, coupled climate model simulations and a large ensemble of regional model simulations. All three methods agreed that the effect of climate change was positive, making precipitation events like this about 40% more likely, with a provisional 2.5%-97.5% confidence interval of 5%-80%. Here we revisit the assessment using more station data, an additional monthly event definition, a second global climate model and regional model simulations of winter 2015/16. The overall result of the analysis is similar to the real-time analysis, with a best estimate of a 59% increase in event frequency, but a larger confidence interval that does include no change. It is important to highlight that the observational data in the additional monthly analysis represent not only the rainfall associated with storm Desmond but also that of storms Eve and Frank occurring towards the end of the month.
Oliker, Nurit; Ostfeld, Avi
2014-03-15
This study describes a decision support system that generates alerts for contamination events in water distribution systems. The developed model comprises a weighted support vector machine (SVM) for the detection of outliers and a subsequent sequence analysis for the classification of contamination events. The contribution of this study is an improvement in contamination event detection ability and a multi-dimensional analysis of the data, in contrast to the parallel one-dimensional analyses conducted so far. The multivariate analysis examines the relationships between water quality parameters and detects changes in their mutual patterns. The weights of the SVM model accomplish two goals: blurring the difference in size between the two classes' data sets (as there are many more normal/regular measurements than event-time measurements), and incorporating the time factor through a time-decay coefficient that ascribes higher importance to recent observations when classifying a time-step measurement. All model parameters were determined by data-driven optimization, so the calibration of the model was completely autonomous. The model was trained and tested on a real water distribution system (WDS) data set with randomly simulated events superimposed on the original measurements. The model is notable for its ability to detect events that were only partly expressed in the data (i.e., affecting only some of the measured parameters). The model showed high accuracy and better detection ability compared to previous modeling attempts at contamination event detection. Copyright © 2013 Elsevier Ltd. All rights reserved.
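Here is a hedged sketch of the two weighting ideas: class weighting against the rare-event imbalance, and exponential time-decay sample weights favouring recent observations, using scikit-learn's SVC. The data, kernel, and decay rate are synthetic stand-ins, not the study's calibrated model.

```python
# Weighted SVM outlier/event detector: class weights offset the rare-event
# imbalance, sample weights decay exponentially so recent data matter more.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(4)
n = 1_000
X = rng.normal(size=(n, 4))                   # four water-quality parameters
y = np.zeros(n, dtype=int)
y[-30:] = 1                                   # a rare, recent contamination window
X[-30:] += [1.5, -1.0, 0.8, 0.0]              # joint multivariate shift

decay = 0.999
sample_weight = decay ** np.arange(n)[::-1]   # newest observations weigh most

clf = SVC(kernel="rbf", class_weight="balanced")
clf.fit(X, y, sample_weight=sample_weight)
print("flagged as event:", int(clf.predict(X[-5:]).sum()), "of the last 5 samples")
```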
NASA Astrophysics Data System (ADS)
Folesky, J.; Kummerow, J.; Shapiro, S. A.; Asanuma, H.; Häring, M. O.
2015-12-01
The Empirical Green's Function (EGF) method uses pairs of events with high waveform similarity and adjacent hypocenters to decompose the influences of the source time function, ray path, instrument site, and instrument response. The seismogram of the smaller event is treated as the Green's function, which can then be deconvolved from the other seismogram. The result is a reconstructed relative source time function (RSTF) of the larger event of the pair. Comparing the RSTFs at different stations of the observation system yields information on the rupture process of the larger event, based on the directivity effect and on changes in RSTF complexity. The Basel EGS dataset of 2006-2007 consists of about 2800 localized events of magnitudes between 0.0
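A hedged numerical sketch of the deconvolution step follows: spectral division of the larger event's trace by the smaller event's trace, stabilized with a water level, recovering the relative source time function. The traces are synthetic Gaussians; real applications need carefully aligned, high-similarity event pairs.

```python
# Water-level spectral deconvolution: divide the larger event's spectrum by
# the smaller event's (the empirical Green's function) to recover the RSTF.
import numpy as np

fs = 100.0                                    # Hz, assumed sampling rate
t = np.arange(0, 10, 1 / fs)
green = np.exp(-((t - 2.0) ** 2) / 0.01)      # small event: path/site response
stf = np.exp(-((t - 0.3) ** 2) / 0.04)        # true relative source time function
big = np.convolve(stf, green)[: t.size] / fs  # larger event = stf convolved with green

G, B = np.fft.rfft(green), np.fft.rfft(big)
water = 0.01 * np.max(np.abs(G)) ** 2         # water-level regularization
rstf = np.fft.irfft(B * np.conj(G) / np.maximum(np.abs(G) ** 2, water), n=t.size) * fs

print(f"recovered RSTF peak at t = {t[np.argmax(rstf)]:.2f} s (true: 0.30 s)")
```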
Analysis of the Interactions of Planetary Waves with the Mean Flow of the Stratosphere
NASA Technical Reports Server (NTRS)
Newman, Paul A.
2007-01-01
During the winter period, large scale waves (planetary waves) are observed to propagate from the troposphere into the stratosphere. Such wave events have been recognized since the 1950s. The very largest wave events result in major stratospheric warmings. These large scale wave events have typical durations of a few days to 2 weeks. The wave events deposit easterly momentum in the stratosphere, decelerating the polar night jet and warming the polar region. In this presentation we show the typical characteristics of these events via a compositing analysis. We will show the typical periods and scales of motion and the associated decelerations and warmings. We will illustrate some of the differences between major and minor warming wave events. We will further illustrate the feedback by the mean flow on subsequent wave events.
Low frequency events on Montserrat
NASA Astrophysics Data System (ADS)
Visser, K.; Neuberg, J.
2003-04-01
Earthquake swarms observed on volcanoes generally consist of low frequency events. The low frequency content of these events indicates the presence of interface waves at the boundary of the magma-filled conduit and the surrounding country rock. The seismic signal observed at the surface therefore shows a complicated interference pattern of waves originating at various parts of the magma-filled conduit, interacting with the free surface and interfaces in the volcanic edifice. This research investigates the applicability of conventional seismic tools to these low frequency events, focusing on hypocenter location analysis using arrival times and on particle motion analysis, for the Soufrière Hills Volcano on Montserrat. Both single low frequency events and swarms are observed on this volcano. Synthetic low frequency events are used for comparison. Results show that reliable hypocenter locations and particle motions can only be obtained if the low frequency events are single events with an identifiable P wave onset, for example the single events preceding swarms on Montserrat or the first low frequency event of a swarm. Consecutive events of the same swarm are dominated by interface waves which are converted at the top of the conduit into weak secondary P waves and surface waves. Conventional seismic tools fail to correctly analyse these events.
Ice Particle Analysis of the Honeywell ALF502 Engine Booster
NASA Technical Reports Server (NTRS)
Bidwell, Colin S.; Rigby, David L.
2015-01-01
A flow and ice particle trajectory analysis was performed for the booster of the Honeywell ALF502 engine. The analysis focused on two closely related conditions, one of which produced an icing event and one of which did not, during testing of the ALF502 engine in the Propulsion Systems Lab (PSL) at NASA Glenn Research Center. The flow analysis was generated using the NASA Glenn GlennHT flow solver, and the particle analysis was generated using the NASA Glenn LEWICE3D v3.63 ice accretion software. The inflow conditions for the two cases were similar, the main differences being that the condition that produced the icing event was 6.8 K colder than the non-icing case and that the inflow ice water content (IWC) for the non-icing case was 50% less than for the icing case. The particle analysis, which considered sublimation, evaporation, and phase change, was generated for a 5 micron ice particle with a sticky impact model and for a 24 micron median volume diameter (MVD), 7-bin ice particle distribution with a supercooled large droplet (SLD) splash model used to simulate ice particle breakup. The particle analysis did not consider the effect of runback and re-impingement of water resulting from the heated spinner and anti-icing system. The results showed that the amount of impingement on the components was similar for the same particle size and impact model in the icing and non-icing conditions. This was attributed to the similar aerodynamic conditions in the booster for the two cases. The particle temperature and melt fraction were higher at the same location and particle size for the non-icing case than for the icing case, due to the higher incoming inflow temperature in the non-icing case. The 5 micron ice particle case produced higher impact temperatures and higher melt fractions on the components downstream of the fan than the 24 micron MVD case, because the average particle size generated by particle breakup was larger than 5 microns, which yielded less warming and melting. The analysis also showed that the melt fraction and wet bulb temperature icing criteria developed during tests in the Research Altitude Test Facility (RATFac) at the National Research Council (NRC) of Canada were useful in predicting icing events in the ALF502 engine. The development of an ice particle impact model that includes the effects of particle breakup, phase change, and surface state is necessary to further improve the prediction of ice particle transport with phase change through turbomachinery.
Analysis and design of randomised clinical trials involving competing risks endpoints.
Tai, Bee-Choo; Wee, Joseph; Machin, David
2011-05-19
In randomised clinical trials involving time-to-event outcomes, the failures concerned may be events of an entirely different nature and as such define a classical competing risks framework. In designing and analysing clinical trials involving such endpoints, it is important to account for the competing events and to evaluate how each contributes to the overall failure. An appropriate choice of statistical model is important for adequate determination of sample size. We describe how competing events may be summarised in such trials using cumulative incidence functions and Gray's test. The statistical modelling of competing events using proportional cause-specific and subdistribution hazard functions, and the corresponding procedures for sample size estimation, are outlined. These are illustrated using data from a randomised clinical trial (SQNP01) of patients with advanced (non-metastatic) nasopharyngeal cancer. In this trial, treatment has no effect on the competing event of loco-regional recurrence. Thus the effects of treatment on the hazard of distant metastasis were similar via both the cause-specific (unadjusted csHR = 0.43, 95% CI 0.25 - 0.72) and subdistribution (unadjusted subHR = 0.43, 95% CI 0.25 - 0.76) hazard analyses, in favour of concurrent chemo-radiotherapy followed by adjuvant chemotherapy. Adjusting for nodal status and tumour size did not alter the results. The results of the log-rank test (p = 0.002) comparing the cause-specific hazards and Gray's test (p = 0.003) comparing the cumulative incidences also led to the same conclusion. However, the subdistribution hazard analysis requires many more subjects than the cause-specific hazard analysis to detect the same magnitude of effect. The cause-specific hazard analysis is appropriate for analysing competing risks outcomes when treatment has no effect on the cause-specific hazard of the competing event. It requires fewer subjects than the subdistribution hazard analysis for a similar effect size. However, if the main and competing events are influenced in opposing directions by an intervention, a subdistribution hazard analysis may be warranted.
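For readers who want to see these quantities computed, the hedged sketch below simulates a two-arm trial with a main and a competing event, estimates the cumulative incidence function with an Aalen-Johansen estimator, and fits a cause-specific Cox model using the lifelines library; the data, effect sizes, and censoring are invented, and this is not the SQNP01 analysis.

```python
# Simulated two-arm trial with a main event (1) and a competing event (2):
# Aalen-Johansen cumulative incidence plus a cause-specific Cox model.
import numpy as np
import pandas as pd
from lifelines import AalenJohansenFitter, CoxPHFitter

rng = np.random.default_rng(5)
n = 400
arm = rng.integers(0, 2, n)                                  # 0 = control, 1 = treatment
t_main = rng.exponential(np.where(arm == 1, 16.0, 8.0), n)   # treatment delays the main event
t_comp = rng.exponential(12.0, n)                            # competing event, no arm effect
time = np.minimum.reduce([t_main, t_comp, np.full(n, 10.0)])
event = np.select([t_main == time, t_comp == time], [1, 2], default=0)

ajf = AalenJohansenFitter()
ajf.fit(time, event, event_of_interest=1)                    # CIF of the main event
print(ajf.cumulative_density_.tail(1))

# Cause-specific Cox: the competing event is treated as censoring.
df = pd.DataFrame({"time": time, "main": (event == 1).astype(int), "arm": arm})
print(CoxPHFitter().fit(df, "time", "main").summary[["exp(coef)", "p"]])
```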
Li, Hu; Benoit, Karin; Wang, Wei; Motsko, Stephen
2016-04-01
Limited information exists about whether exogenous testosterone therapy is associated with a risk of venous thrombotic events. We investigated, via cohort and nested case-control analyses, whether exogenous testosterone therapy is associated with the risk of venous thrombotic events in men with hypogonadism. Databases were reviewed to identify men prescribed exogenous testosterone therapy and/or men with a hypogonadism diagnosis. Propensity score 1:1 matching was used to select patients for the cohort analysis. Cases (men with venous thrombotic events) were matched 1:4 with controls (men without venous thrombotic events) for the nested case-control analysis. The primary outcome was defined as incident idiopathic venous thrombotic events. Cox regression and conditional logistic regression were used to assess HRs and ORs, respectively. Sensitivity analyses were also performed. A total of 102,650 exogenous testosterone-treated and 102,650 untreated patients were included in the cohort analysis after matching, and 2,785 cases and 11,119 controls were included in the case-control analysis. The cohort analysis revealed a HR of 1.08 for all testosterone-treated patients (95% CI 0.91, 1.27, p = 0.378). The case-control analysis resulted in an OR of 1.02 (95% CI 0.92, 1.13, p = 0.702) for current exogenous testosterone therapy exposure and an OR of 0.92 (95% CI 0.82, 1.03, p = 0.145) for past exogenous testosterone therapy exposure. These results remained nonsignificant after stratifying by exogenous testosterone therapy administration route and age category. Most sensitivity analyses yielded consistent results. No significant association was found between exogenous testosterone therapy and incident idiopathic or overall venous thrombotic events in men with hypogonadism. However, some discrepant findings exist for the association between injectable formulations and the risk of overall venous thrombotic events. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
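The cohort arm of such a design rests on propensity-score matching followed by a survival model. Below is a hedged end-to-end sketch: estimate propensity scores with logistic regression, match 1:1 on nearest score (with replacement, for brevity), and fit a Cox model with lifelines; the data and the single covariate are synthetic.

```python
# Propensity-score 1:1 matching (nearest score, with replacement for brevity)
# followed by a Cox model on the matched cohort.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 4_000
age = rng.normal(55, 10, n)                                # the only confounder here
treated = rng.binomial(1, 1 / (1 + np.exp(-(age - 55) / 10)))
time = rng.exponential(20, n)                              # time to event or censoring
event = rng.binomial(1, 0.1, n)

ps = LogisticRegression().fit(age.reshape(-1, 1), treated).predict_proba(age.reshape(-1, 1))[:, 1]

t_idx, c_idx = np.flatnonzero(treated), np.flatnonzero(1 - treated)
nn = NearestNeighbors(n_neighbors=1).fit(ps[c_idx].reshape(-1, 1))
matched_controls = c_idx[nn.kneighbors(ps[t_idx].reshape(-1, 1))[1][:, 0]]
cohort = np.concatenate([t_idx, matched_controls])

df = pd.DataFrame({"time": time[cohort], "event": event[cohort], "trt": treated[cohort]})
print(CoxPHFitter().fit(df, "time", "event").summary.loc["trt", ["exp(coef)", "p"]])
```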
Media Stereotypes Analysis in the Classroom at the Student Audience
ERIC Educational Resources Information Center
Fedorov, Alexander
2015-01-01
Media Stereotypes Analysis is the identification and analysis of stereotypical images of people, ideas, events, stories, themes, etc. in media texts. A media stereotype reflects well-established attitudes towards a particular object; it is a schematic, averaged, familiar, stable representation of genres, social processes/events, ideas, people,…
Regression Analysis of Mixed Recurrent-Event and Panel-Count Data with Additive Rate Models
Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L.
2015-01-01
Summary Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007; Zhao et al., 2011). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013). In this paper, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. PMID:25345405
Tudur Smith, Catrin; Gueyffier, François; Kolamunnage‐Dona, Ruwanthi
2017-01-01
Background Joint modelling of longitudinal and time‐to‐event data is often preferred over separate longitudinal or time‐to‐event analyses as it can account for study dropout, error in longitudinally measured covariates, and correlation between longitudinal and time‐to‐event outcomes. The joint modelling literature focuses mainly on the analysis of single studies with no methods currently available for the meta‐analysis of joint model estimates from multiple studies. Methods We propose a 2‐stage method for meta‐analysis of joint model estimates. These methods are applied to the INDANA dataset to combine joint model estimates of systolic blood pressure with time to death, time to myocardial infarction, and time to stroke. Results are compared to meta‐analyses of separate longitudinal or time‐to‐event models. A simulation study is conducted to contrast separate versus joint analyses over a range of scenarios. Results Using the real dataset, similar results were obtained by using the separate and joint analyses. However, the simulation study indicated a benefit of use of joint rather than separate methods in a meta‐analytic setting where association exists between the longitudinal and time‐to‐event outcomes. Conclusions Where evidence of association between longitudinal and time‐to‐event outcomes exists, results from joint models over standalone analyses should be pooled in 2‐stage meta‐analyses. PMID:29250814
TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach.
Elgendi, Mohamed
2016-11-02
Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method, referred to for the first time as two event-related moving averages ("TERMA"), detects events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high accuracy detection of biomedical events. The results recommend that the window sizes for the two moving averages (W1 and W2) follow the inequality (8 × W1) ≥ W2 ≥ (2 × W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions.
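A hedged sketch of the core TERMA step follows: a short event-related moving average compared against a longer cycle-related one (here W2 = 5 × W1, inside the recommended band) to form blocks of interest around each event. The spike-train signal, window lengths, and the omission of an extra offset threshold are simplifying assumptions, not the paper's full six-block pipeline.

```python
# TERMA-style blocks of interest: a short event-related moving average (W1)
# compared against a longer cycle-related one (W2), with 2*W1 <= W2 <= 8*W1.
import numpy as np

def moving_average(x, w):
    return np.convolve(x, np.ones(w) / w, mode="same")

fs = 250                                      # Hz, assumed sampling rate
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(7)
beats = np.zeros(t.size)
beats[fs // 2 :: fs] = 1.0                    # one sharp "event" per second
x = np.abs(beats + 0.05 * rng.normal(size=t.size))

w1, w2 = int(0.12 * fs), int(0.60 * fs)       # W2 = 5 * W1, inside the recommended band
blocks = moving_average(x, w1) > moving_average(x, w2)

onsets = np.flatnonzero(np.diff(blocks.astype(int)) == 1)
print("blocks of interest start near t =", np.round(t[onsets], 2))
```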
Event shape analysis of deep inelastic scattering events with a large rapidity gap at HERA
NASA Astrophysics Data System (ADS)
ZEUS Collaboration; Breitweg, J.; Derrick, M.; Krakauer, D.; Magill, S.; Mikunas, D.; Musgrave, B.; Repond, J.; Stanek, R.; Talaga, R. L.; Yoshida, R.; Zhang, H.; Mattingly, M. C. K.; Anselmo, F.; Antonioli, P.; Bari, G.; Basile, M.; Bellagamba, L.; Boscherini, D.; Bruni, A.; Bruni, G.; Cara Romeo, G.; Castellini, G.; Cifarelli, L.; Cindolo, F.; Contin, A.; Corradi, M.; de Pasquale, S.; Gialas, I.; Giusti, P.; Iacobucci, G.; Laurenti, G.; Levi, G.; Margotti, A.; Massam, T.; Nania, R.; Palmonari, F.; Pesci, A.; Polini, A.; Ricci, F.; Sartorelli, G.; Zamora Garcia, Y.; Zichichi, A.; Amelung, C.; Bornheim, A.; Brock, I.; Coböken, K.; Crittenden, J.; Deffner, R.; Eckert, M.; Grothe, M.; Hartmann, H.; Heinloth, K.; Heinz, L.; Hilger, E.; Jakob, H.-P.; Katz, U. F.; Kerger, R.; Paul, E.; Pfeiffer, M.; Rembser, Ch.; Stamm, J.; Wedemeyer, R.; Wieber, H.; Bailey, D. S.; Campbell-Robson, S.; Cottingham, W. N.; Foster, B.; Hall-Wilton, R.; Hayes, M. E.; Heath, G. P.; Heath, H. F.; McFall, J. D.; Piccioni, D.; Roff, D. G.; Tapper, R. J.; Arneodo, M.; Ayad, R.; Capua, M.; Garfagnini, A.; Iannotti, L.; Schioppa, M.; Susinno, G.; Kim, J. Y.; Lee, J. H.; Lim, I. T.; Pac, M. Y.; Caldwell, A.; Cartiglia, N.; Jing, Z.; Liu, W.; Mellado, B.; Parsons, J. A.; Ritz, S.; Sampson, S.; Sciulli, F.; Straub, P. B.; Zhu, Q.; Borzemski, P.; Chwastowski, J.; Eskreys, A.; Figiel, J.; Klimek, K.; Przybycień, M. B.; Zawiejski, L.; Adamczyk, L.; Bednarek, B.; Bukowy, M.; Jeleń, K.; Kisielewska, D.; Kowalski, T.; Przybycień, M.; Rulikowska-Zarębska, E.; Suszycki, L.; Zając, J.; Duliński, Z.; Kotański, A.; Abbiendi, G.; Bauerdick, L. A. T.; Behrens, U.; Beier, H.; Bienlein, J. K.; Cases, G.; Deppe, O.; Desler, K.; Drews, G.; Fricke, U.; Gilkinson, D. J.; Glasman, C.; Göttlicher, P.; Haas, T.; Hain, W.; Hasell, D.; Johnson, K. F.; Kasemann, M.; Koch, W.; Kötz, U.; Kowalski, H.; Labs, J.; Lindemann, L.; Löhr, B.; Löwe, M.; Mańczak, O.; Milewski, J.; Monteiro, T.; Ng, J. S. T.; Notz, D.; Ohrenberg, K.; Park, I. H.; Pellegrino, A.; Pelucchi, F.; Piotrzkowski, K.; Roco, M.; Rohde, M.; Roldán, J.; Ryan, J. J.; Savin, A. A.; Schneekloth, U.; Selonke, F.; Surrow, B.; Tassi, E.; Voß, T.; Westphal, D.; Wolf, G.; Wollmer, U.; Youngman, C.; Żarnecki, A. F.; Zeuner, W.; Burow, B. D.; Grabosch, H. J.; Meyer, A.; Schlenstedt, S.; Barbagli, G.; Gallo, E.; Pelfer, P.; Maccarrone, G.; Votano, L.; Bamberger, A.; Eisenhardt, S.; Markun, P.; Trefzger, T.; Wölfle, S.; Bromley, J. T.; Brook, N. H.; Bussey, P. J.; Doyle, A. T.; MacDonald, N.; Saxon, D. H.; Sinclair, L. E.; Strickland, E.; Waugh, R.; Bohnet, I.; Gendner, N.; Holm, U.; Meyer-Larsen, A.; Salehi, H.; Wick, K.; Gladilin, L. K.; Horstmann, D.; Kçira, D.; Klanner, R.; Lohrmann, E.; Poelz, G.; Schott, W.; Zetsche, F.; Bacon, T. C.; Butterworth, I.; Cole, J. E.; Howell, G.; Hung, B. H. Y.; Lamberti, L.; Long, K. R.; Miller, D. B.; Pavel, N.; Prinias, A.; Sedgbeer, J. K.; Sideris, D.; Walker, R.; Mallik, U.; Wang, S. M.; Wu, J. T.; Cloth, P.; Filges, D.; Fleck, J. I.; Ishii, T.; Kuze, M.; Suzuki, I.; Tokushuku, K.; Yamada, S.; Yamauchi, K.; Yamazaki, Y.; Hong, S. J.; Lee, S. B.; Nam, S. W.; Park, S. K.; Barreiro, F.; Fernández, J. P.; García, G.; Graciani, R.; Hernández, J. M.; Hervás, L.; Labarga, L.; Martínez, M.; del Peso, J.; Puga, J.; Terrón, J.; de Trocóniz, J. F.; Corriveau, F.; Hanna, D. S.; Hartmann, J.; Hung, L. W.; Murray, W. N.; Ochs, A.; Riveline, M.; Stairs, D. G.; St-Laurent, M.; Ullmann, R.; Tsurugai, T.; Bashkirov, V.; Dolgoshein, B. A.; Stifutkin, A.; Bashindzhagyan, G. L.; Ermolov, P. F.; Golubkov, Yu. A.; Khein, L. A.; Korotkova, N. A.; Korzhavina, I. A.; Kuzmin, V. A.; Lukina, O. Yu.; Proskuryakov, A. S.; Shcheglova, L. M.; Solomin, A. N.; Zotkin, S. A.; Bokel, C.; Botje, M.; Brümmer, N.; Chlebana, F.; Engelen, J.; Koffeman, E.; Kooijman, P.; van Sighem, A.; Tiecke, H.; Tuning, N.; Verkerke, W.; Vossebeld, J.; Vreeswijk, M.; Wiggers, L.; de Wolf, E.; Acosta, D.; Bylsma, B.; Durkin, L. S.; Gilmore, J.; Ginsburg, C. M.; Kim, C. L.; Ling, T. Y.; Nylander, P.; Romanowski, T. A.; Blaikley, H. E.; Cashmore, R. J.; Cooper-Sarkar, A. M.; Devenish, R. C. E.; Edmonds, J. K.; Große-Knetter, J.; Harnew, N.; Nath, C.; Noyes, V. A.; Quadt, A.; Ruske, O.; Tickner, J. R.; Uijterwaal, H.; Walczak, R.; Waters, D. S.; Bertolin, A.; Brugnera, R.; Carlin, R.; dal Corso, F.; Dosselli, U.; Limentani, S.; Morandin, M.; Posocco, M.; Stanco, L.; Stroili, R.; Voci, C.; Bulmahn, J.; Oh, B. Y.; Okrasiński, J. R.; Toothacker, W. S.; Whitmore, J. J.; Iga, Y.; D'Agostini, G.; Marini, G.; Nigro, A.; Raso, M.; Hart, J. C.; McCubbin, N. A.; Shah, T. P.; Epperson, D.; Heusch, C.; Rahn, J. T.; Sadrozinski, H. F.-W.; Seiden, A.; Wichmann, R.; Williams, D. C.; Schwarzer, O.; Walenta, A. H.; Abramowicz, H.; Briskin, G.; Dagan, S.; Kananov, S.; Levy, A.; Abe, T.; Fusayasu, T.; Inuzuka, M.; Nagano, K.; Umemori, K.; Yamashita, T.; Hamatsu, R.; Hirose, T.; Homma, K.; Kitamura, S.; Matsushita, T.; Cirio, R.; Costa, M.; Ferrero, M. I.; Maselli, S.; Monaco, V.; Peroni, C.; Petrucci, M. C.; Ruspa, M.; Sacchi, R.; Solano, A.; Staiano, A.; Dardo, M.; Bailey, D. C.; Fagerstroem, C.-P.; Galea, R.; Hartner, G. F.; Joo, K. K.; Levman, G. M.; Martin, J. F.; Orr, R. S.; Polenz, S.; Sabetfakhri, A.; Simmons, D.; Teuscher, R. J.; Butterworth, J. M.; Catterall, C. D.; Jones, T. W.; Lane, J. B.; Saunders, R. L.; Sutton, M. R.; Wing, M.; Ciborowski, J.; Grzelak, G.; Kasprzak, M.; Muchorowski, K.; Nowak, R. J.; Pawlak, J. M.; Pawlak, R.; Tymieniecka, T.; Wróblewski, A. K.; Zakrzewski, J. A.; Adamus, M.; Coldewey, C.; Eisenberg, Y.; Hochman, D.; Karshon, U.; Badgett, W. F.; Chapin, D.; Cross, R.; Dasu, S.; Foudas, C.; Loveless, R. J.; Mattingly, S.; Reeder, D. D.; Smith, W. H.; Vaiciulis, A.; Wodarczyk, M.; Deshpande, A.; Dhawan, S.; Hughes, V. W.; Bhadra, S.; Frisken, W. R.; Khakzad, M.; Schmidke, W. B.
1998-03-01
A global event shape analysis of the multihadronic final states observed in neutral current deep inelastic scattering events with a large rapidity gap with respect to the proton direction is presented. The analysis is performed in the range 5 ≤ Q² ≤ 185 GeV² and 160 ≤ W ≤ 250 GeV, where Q² is the virtuality of the photon and W is the virtual-photon proton centre of mass energy. Particular emphasis is placed on the dependence of the shape variables, measured in the γ*-pomeron rest frame, on the mass of the hadronic final state, M_X. With increasing M_X the multihadronic final state becomes more collimated and planar. The experimental results are compared with several models which attempt to describe diffractive events. The broadening effects exhibited by the data require in these models a significant gluon component of the pomeron.
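For orientation, event-shape variables of this kind are simple functions of the final-state momenta. The hedged sketch below computes the sphericity tensor and its eigenvalues for a set of random stand-in momenta; it illustrates the generic construction only, not this analysis's specific variables or reference frame.

```python
# Sphericity from a set of final-state three-momenta: build the normalized
# momentum tensor and combine its two smallest eigenvalues.
import numpy as np

rng = np.random.default_rng(8)
p = rng.normal(size=(12, 3))                  # 12 stand-in particle momenta (GeV)

s = p.T @ p / np.sum(p**2)                    # S_ab = sum_i p_ia p_ib / sum_i |p_i|^2
lam = np.linalg.eigvalsh(s)                   # eigenvalues in ascending order
sphericity = 1.5 * (lam[0] + lam[1])          # 1 = isotropic, 0 = pencil-like
print(f"sphericity = {sphericity:.3f}, smallest eigenvalue = {lam[0]:.3f} (small => planar)")
```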
Dynamic Analysis and Research on Environmental Pollution in China from 1992 to 2014
NASA Astrophysics Data System (ADS)
Sun, Fei; Yuan, Peng; Li, Huiting; Zhang, Moli
2018-01-01
The regular pattern of development of environmental pollution events was analyzed from the perspective of statistical analysis of pollution events in recent years. Moran's I and the spatial center-of-gravity shift curve of China's environmental emergencies were calculated with ArcGIS software, using global spatial analysis and spatial center-of-gravity shift methods. The results showed that China's environmental pollution events from 1992 to 2014 exhibited dynamic growth at first and then a gradual decline. Environmental pollution events showed a spatially aggregated distribution in 1992-1994, 2001-2006, and 2008-2014, and a spatially random distribution in the remaining years. There were two stages in the shift of China's environmental pollution events: a transition to the southwest from 1992 to 2006 and a transition to the northeast from 2006 to 2014.
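Global Moran's I, the clustering statistic used above, is straightforward to compute directly. The hedged sketch below evaluates it for synthetic event counts on a grid with row-standardized rook-contiguity weights; the original analysis used ArcGIS on provincial data, which this does not reproduce.

```python
# Global Moran's I for event counts on a grid with row-standardized
# rook-contiguity weights; positive values indicate spatial clustering.
import numpy as np

rng = np.random.default_rng(9)
side = 8
n = side * side
counts = rng.poisson(3, n).astype(float)
counts[: n // 2] += 4                         # clustered excess in half the grid

W = np.zeros((n, n))
for i in range(n):
    r, c = divmod(i, side)
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= r + dr < side and 0 <= c + dc < side:
            W[i, (r + dr) * side + (c + dc)] = 1.0
W /= W.sum(axis=1, keepdims=True)             # row-standardize

z = counts - counts.mean()
moran_i = (n / W.sum()) * (z @ W @ z) / (z @ z)
print(f"Moran's I = {moran_i:.3f} (≈0 random, >0 clustered)")
```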
Komaki, Yuga; Komaki, Fukiko; Micic, Dejan; Yamada, Akihiro; Suzuki, Yasuo; Sakuraba, Atsushi
2017-06-01
A limited range of therapies is available for hospitalized patients with severe steroid refractory ulcerative colitis (UC). Furthermore, there is a paucity of direct comparisons between them. To provide a comparative evaluation of the efficacy and safety of pharmacologic therapies, we conducted a network meta-analysis combined with a benefit-risk analysis of randomized controlled trials (RCTs) performed in hospitalized patients with severe steroid refractory UC. Electronic databases were searched through November 2015 for RCTs evaluating the efficacy of therapies for severe steroid refractory hospitalized UC. The outcomes were clinical response, colectomy-free rate, and severe adverse events leading to discontinuation of therapy. The primary endpoints were the ranks of therapies based on network meta-analysis combined with benefit-risk analysis between clinical response and severe adverse events, as well as between colectomy-free rate and severe adverse events. Eight RCTs of 421 patients were identified. Cyclosporine, infliximab, and tacrolimus, as well as placebo, were included in our analysis. Network meta-analysis with benefit-risk analysis simultaneously assessing clinical response and severe adverse events demonstrated the rank order of efficacy as infliximab, cyclosporine, tacrolimus, and placebo. A similar analysis for colectomy-free rate and severe adverse events demonstrated the same rank order of efficacy. The differences among infliximab, cyclosporine, and tacrolimus were small in all analyses. The results of the present comprehensive benefit-risk assessment using network meta-analysis provide RCT-based evidence on the efficacy and safety of infliximab, cyclosporine, and tacrolimus for hospitalized patients with severe steroid refractory UC. © 2016 Journal of Gastroenterology and Hepatology Foundation and John Wiley & Sons Australia, Ltd.
Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task 2 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Shuai; Makarov, Yuri V.; McKinstry, Craig A.
Task report detailing low probability tail event analysis and mitigation in the BPA control area. A tail event refers to a situation in a power system in which unfavorable forecast errors for load and wind are superposed on fast load and wind ramps, or non-wind generators fall short of scheduled output, causing the imbalance between generation and load to become very significant.
ERIC Educational Resources Information Center
Zembylas, Michalinos; Bekerman, Zvi
2011-01-01
This article presents an in-depth analysis of two commemoration events in a first-grade classroom of a bilingual school in Israel. The two events presented--the commemorations of the Holocaust Day and the Memorial Day--derive from a longitudinal ethnographic study of integrated bilingual schools in Israel. The analysis of these events shows…
NASA Astrophysics Data System (ADS)
Petrucci, Olga; Caloiero, Tommaso; Aurora Pasqua, Angela; Perrotta, Piero; Russo, Luigi; Tansi, Carlo
2017-11-01
Calabria (southern Italy) is a flood prone region, due to both its rough orography and the fast hydrologic response of most watersheds. During the rainy season, intense rain affects the region, triggering floods and mass movements that cause economic damage and fatalities. This work presents a methodological approach to the comparative analysis of two events affecting the same area 15 years apart, by collecting all the qualitative and quantitative features useful to describe both rain and damage. The aim is to understand whether similar meteorological events affecting the same area can have different outcomes in terms of damage. The first event, which occurred between 8 and 10 September 2000, damaged 109 out of 409 municipalities of the region and killed 13 people in a campsite hit by a flood. The second event, which occurred between 30 October and 1 November 2015, damaged 79 municipalities and killed one man in a flood. The comparative analysis highlights that, although the exceptionality of the triggering daily rain was higher in the 2015 event, the damage caused by the 2000 event to both infrastructure and property was greater, and was made far worse by the 13 flood victims. We conclude that, in the 2015 event, the management of the pre-event phases, with the issuing of meteorological alerts, and the emergency management, with the preventive evacuation of people in situations made hazardous by landslides or floods, contributed to reducing the number of victims.
Regression analysis of mixed recurrent-event and panel-count data
Zhu, Liang; Tong, Xinwei; Sun, Jianguo; Chen, Manhua; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L.
2014-01-01
In event history studies concerning recurrent events, two types of data have been extensively discussed. One is recurrent-event data (Cook and Lawless, 2007. The Analysis of Recurrent Event Data. New York: Springer), and the other is panel-count data (Zhao and others, 2010. Nonparametric inference based on panel-count data. Test 20, 1–42). In the former case, all study subjects are monitored continuously; thus, complete information is available for the underlying recurrent-event processes of interest. In the latter case, study subjects are monitored periodically; thus, only incomplete information is available for the processes of interest. In reality, however, a third type of data could occur in which some study subjects are monitored continuously, but others are monitored periodically. When this occurs, we have mixed recurrent-event and panel-count data. This paper discusses regression analysis of such mixed data and presents two estimation procedures for the problem. One is a maximum likelihood estimation procedure, and the other is an estimating equation procedure. The asymptotic properties of both resulting estimators of regression parameters are established. Also, the methods are applied to a set of mixed recurrent-event and panel-count data that arose from a Childhood Cancer Survivor Study and motivated this investigation. PMID:24648408
SCADA data and the quantification of hazardous events for QMRA.
Nilsson, P; Roser, D; Thorwaldsdotter, R; Petterson, S; Davies, C; Signor, R; Bergstedt, O; Ashbolt, N
2007-01-01
The objective of this study was to assess the use of on-line monitoring to support the QMRA at water treatment plants studied in the EU MicroRisk project. SCADA data were obtained from three Catchment-to-Tap Systems (CTS) along with system descriptions, diary records, grab sample data and deviation reports. Particular attention was paid to estimating hazardous event frequency, duration and magnitude. Using Shewhart and CUSUM methods we identified 'change-points' corresponding to events of between 10 min and >1 month duration in time-series data. Our analysis confirmed it is possible to quantify hazardous event durations from turbidity, chlorine residual and pH records and distinguish them from non-hazardous variability in the time-series dataset. The durations of most 'events' were short-term (0.5-2.3 h). These data were combined with QMRA to estimate the pathogen infection risk arising from events such as chlorination failure. While analysis of SCADA data alone could identify events provisionally, its interpretation was severely constrained in the absence of diary records and other system information. SCADA data analysis should only complement traditional water sampling, rather than replace it. More work on on-line data management, quality control and interpretation is needed before it can be used routinely for event characterization.
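As a hedged illustration of the CUSUM screening named above, the sketch below runs a one-sided CUSUM over a synthetic turbidity-like minute series and flags the onset of a short excursion; the in-control level, slack, and decision limit are textbook defaults, not the project's calibrated values.

```python
# One-sided CUSUM on a turbidity-like minute series: flag the onset of a
# sustained upward shift against an assumed in-control level.
import numpy as np

rng = np.random.default_rng(10)
x = rng.normal(0.2, 0.02, 24 * 60)            # one day of minute data, ~0.2 NTU
x[600:740] += 0.15                            # ~2.3 h excursion, as in the study's range

mu = np.median(x)                             # robust in-control level
sigma = 0.02                                  # assumed in-control standard deviation
k, h = 0.5 * sigma, 5 * sigma                 # conventional slack and decision limit

s = np.zeros(x.size)
for i in range(1, x.size):
    s[i] = max(0.0, s[i - 1] + (x[i] - mu - k))

alarm = np.flatnonzero(s > h)
print("event onset flagged at minute", alarm[0] if alarm.size else None, "(true start: 600)")
```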
Markov chains and semi-Markov models in time-to-event analysis.
Abner, Erin L; Charnigo, Richard J; Kryscio, Richard J
2013-10-25
A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields.
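To make the multi-state idea concrete, here is a hedged sketch of a discrete-time Markov chain with two transient health states and two absorbing states (the event of interest, and death as a competing risk); the states and transition probabilities are invented for illustration.

```python
# A four-state discrete-time Markov chain: two transient health states and two
# absorbing states (the event of interest, and death without the event).
import numpy as np

# states: 0 = healthy, 1 = impaired, 2 = event of interest, 3 = death w/o event
P = np.array([
    [0.90, 0.07, 0.02, 0.01],
    [0.05, 0.80, 0.10, 0.05],
    [0.00, 0.00, 1.00, 0.00],   # absorbing
    [0.00, 0.00, 0.00, 1.00],   # absorbing
])

dist = np.array([1.0, 0.0, 0.0, 0.0])         # everyone starts healthy
for year in (1, 5, 10, 20):
    d = dist @ np.linalg.matrix_power(P, year)
    print(f"year {year:2d}: P(event) = {d[2]:.3f}, P(death without event) = {d[3]:.3f}")
```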
Idaho National Laboratory Quarterly Event Performance Analysis FY 2013 4th Quarter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, Lisbeth A.
2013-11-01
This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, "Occurrence Reporting and Processing of Operations Information," requires a quarterly analysis of events, both reportable and not reportable, for the previous twelve months. This report is the analysis of occurrence reports and deficiency reports (including not-reportable events) identified at INL during the period of October 2012 through September 2013.
OAE: The Ontology of Adverse Events.
He, Yongqun; Sarntivijai, Sirarat; Lin, Yu; Xiang, Zuoshuang; Guo, Abra; Zhang, Shelley; Jagannathan, Desikan; Toldo, Luca; Tao, Cui; Smith, Barry
2014-01-01
A medical intervention is a medical procedure or application intended to relieve or prevent illness or injury. Examples of medical interventions include vaccination and drug administration. After a medical intervention, adverse events (AEs) may occur which lie outside the intended consequences of the intervention. The representation and analysis of AEs are critical to the improvement of public health. The Ontology of Adverse Events (OAE), previously named Adverse Event Ontology (AEO), is a community-driven ontology developed to standardize and integrate data relating to AEs arising subsequent to medical interventions, as well as to support computer-assisted reasoning. OAE has over 3,000 terms with unique identifiers, including terms imported from existing ontologies and more than 1,800 OAE-specific terms. In OAE, the term 'adverse event' denotes a pathological bodily process in a patient that occurs after a medical intervention. Causal adverse events are defined by OAE as those events that are causal consequences of a medical intervention. OAE represents various adverse events based on patient anatomic regions and clinical outcomes, including symptoms, signs, and abnormal processes. OAE has been used in the analysis of several different sorts of vaccine and drug adverse event data. For example, using the data extracted from the Vaccine Adverse Event Reporting System (VAERS), OAE was used to analyse vaccine adverse events associated with the administration of different types of influenza vaccines. OAE has also been used to represent and classify the vaccine adverse events cited in package inserts of FDA-licensed human vaccines in the USA. OAE is a biomedical ontology that logically defines and classifies various adverse events occurring after medical interventions. OAE has successfully been applied in several adverse event studies. The OAE ontological framework provides a platform for systematic representation and analysis of adverse events and of the factors (e.g., vaccinee age) important for determining their clinical outcomes.
Zhang, Bu-Chun; Wu, Qiang; Wang, Cheng; Li, Dong-Ye; Wang, Zhi-Rong
2014-04-01
The iso-osmolar contrast agent iodixanol may be associated with a lower incidence of cardiac events than low-osmolar contrast media (LOCM), but previous trials have yielded mixed results. To compare the risk of total cardiovascular events of the iso-osmolar contrast medium, iodixanol, to LOCM. Medical literature databases were searched to identify comparisons between iodixanol and LOCM with cardiovascular events as a primary endpoint. A random-effects model was used to obtain pooled odds ratio (OR) for within-hospital and 30-day events. A total of 2 prospective cross-sectional studies and 11 randomized controlled trials (RCTs) (covering 6859 subjects) met our criteria. There was no significant difference in the incidence of within-hospital and 30-day cardiovascular events when iodixanol was compared with LOCM, with pooled OR of 0.72 (95%CI 0.49-1.06, p=0.09) and 1.19 (95%CI 0.70-2.02, p=0.53), respectively. Subgroup analysis showed no relative difference when iodixanol was compared with ioxaglate (OR=0.92, 95%CI 0.50-1.70, p=0.80) and iohexol (OR=0.75, 95%CI 0.48-1.17, p=0.21). However, a reduction in the within-hospital cardiovascular events was observed when iodixanol was compared with LOCM in the RCT subgroup (OR=0.65, 95%CI 0.44-0.96, p=0.03). Sensitivity analyses revealed that three studies had a strong impact on the association of within-hospital cardiovascular events between iodixanol and LOCM. Meta-regression analysis failed to account for heterogeneity. No publication bias was detected. This meta-analysis demonstrates that there is no conclusive evidence that iodixanol is superior to LOCM overall with regard to fewer cardiovascular events. Copyright © 2014. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Ajayakumar, J.; Shook, E.; Turner, V. K.
2017-10-01
With social media becoming increasingly location-based, there has been a greater push from researchers across various domains, including social science, public health, and disaster management, to tap into the spatial, temporal, and textual data available from these sources to analyze public response during extreme events such as an epidemic outbreak or a natural disaster. Studies based on demographics and other socio-economic factors suggest that social media data can be highly skewed by place-to-place variations in population density. To capture the spatio-temporal variations in public response during extreme events we have developed the Socio-Environmental Data Explorer (SEDE). SEDE collects and integrates social media, news and environmental data to support exploration and assessment of public response to extreme events. For this study, using SEDE, we conduct spatio-temporal social media response analysis on four major extreme events in the United States: the "North American storm complex" in December 2015, the "snowstorm Jonas" in January 2016, the "West Virginia floods" in June 2016, and "Hurricane Matthew" in October 2016. The analysis uses geo-tagged social media data from Twitter and warnings from the storm events database provided by the National Centers for Environmental Information (NCEI). Results demonstrate that, to support complex social media analyses, spatial and population-based normalization and filtering are necessary. These results suggest that, while developing software solutions to support analysis of non-conventional data sources such as social media, it is essential to identify the inherent biases associated with the data sources and to adapt techniques and enhance capabilities to mitigate those biases. The normalization strategies that we have developed and incorporated into SEDE help reduce the population bias associated with social media data and will be useful for researchers and decision makers in enhancing their analysis of spatio-temporal social media responses during extreme events.
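At its simplest, the population-based normalization described here converts raw geotagged message counts into per-capita rates. A minimal sketch in Python, where the region names and numbers are hypothetical placeholders:

    def normalize_counts(message_counts, populations, per=100_000):
        """Messages per `per` residents; None where population data is missing."""
        rates = {}
        for region, n in message_counts.items():
            pop = populations.get(region)
            rates[region] = (n / pop) * per if pop else None
        return rates

    # Hypothetical county-level tweet counts during a flood event.
    rates = normalize_counts({"Kanawha, WV": 840, "Greenbrier, WV": 310},
                             {"Kanawha, WV": 183000, "Greenbrier, WV": 35000})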
The Frasnian-Famennian mass killing event(s), methods of identification and evaluation
NASA Technical Reports Server (NTRS)
Geldsetzer, H. H. J.
1988-01-01
The absence of an abnormally high number of earlier Devonian taxa from Famennian sediments has been repeatedly documented and can hardly be questioned. Primary recognition of the event(s) was based on paleontological data, especially common macrofossils. Most paleontologists place the disappearance of these common forms at the gigas/triangularis contact, and this boundary was recently proposed as the Frasnian-Famennian (F-F) boundary. Not unexpectedly, alternative F-F positions have been suggested, prompted by temporary Frasnian survivors or sudden post-event radiations of new forms. Secondary supporting evidence for mass killing event(s) is supplied by trace element and stable isotope geochemistry, but not with the same success as for the K/T boundary, probably due to an additional 300 Ma of tectonic and diagenetic overprinting. Another tool is microfacies analysis, which is surprisingly rarely used even though it can explain geochemical anomalies or paleontological overlap not detectable by conventional macrofacies analysis. The combination of microfacies analysis and geochemistry was applied at two F-F sections in western Canada and showed how interdependent the two methods are. Additional F-F sections from western Canada, the western United States, France, Germany and Australia were sampled or re-sampled and await geochemical/microfacies evaluation.
Regression Analysis of Mixed Panel Count Data with Dependent Terminal Events
Yu, Guanglei; Zhu, Liang; Li, Yang; Sun, Jianguo; Robison, Leslie L.
2017-01-01
Event history studies are commonly conducted in many fields, and a great deal of literature has been established for the analysis of the two types of data commonly arising from these studies: recurrent event data and panel count data. The former arises if all study subjects are followed continuously, while the latter means that each study subject is observed only at discrete time points. In reality, a third type of data, a mixture of the two types above, may occur, and furthermore, as with the first two types of data, there may exist a dependent terminal event, which may preclude the occurrences of the recurrent events of interest. This paper discusses regression analysis of mixed recurrent event and panel count data in the presence of a terminal event, and an estimating equation-based approach is proposed for estimation of the regression parameters of interest. In addition, the asymptotic properties of the proposed estimator are established, and a simulation study conducted to assess the finite-sample performance of the proposed method suggests that it works well in practical situations. Finally, the methodology is applied to the childhood cancer study that motivated this work. PMID:28098397
NASA Technical Reports Server (NTRS)
Wilson, R. M.
1976-01-01
The Skylab ATM/S-056 X-Ray Event Analyzer, part of an X-ray telescope experiment, is described. The techniques employed in the analysis of its data to determine electron temperatures and emission measures are reviewed. The analysis of a sample event - the 15 June 1973 1B/M3 flare - is performed. Comparison of the X-Ray Event Analyzer data with the SolRad 9 observations indicates that the X-Ray Event Analyzer accurately monitored the sun's 2.5 to 7.25 A X-ray emission and, to a lesser extent, the 6.1 to 20 A emission. A mean peak temperature of 15 million K at 1412 UT and a mean peak electron density (assuming a flare volume of 10^13 cu km) of 27 million/cu mm at 1416 to 1417 UT are deduced for the event. The X-Ray Event Analyzer data, having a 2.5 s time resolution, should be invaluable in comparisons with other high-time-resolution data (e.g., radio bursts).
NASA Astrophysics Data System (ADS)
Eldardiry, H.; Hossain, F.
2017-12-01
Atmospheric Rivers (ARs) are narrow, elongated corridors of horizontal water vapor transport located within the warm sector of extratropical cyclones. While it is widely known that most heavy rainfall events across the western United States (US) are driven by ARs, the connection between atmospheric conditions and precipitation during an AR event has not been fully documented. In this study, we present a statistical analysis of the connection between precipitation, temperature, wind, and snowpack during cold-season AR events hitting the coastal regions of the western US. For each AR event, the precipitation and other atmospheric variables are retrieved through dynamic downscaling of the NCEP/NCAR Reanalysis product using the Advanced Research Weather Research and Forecasting Model (ARW-WRF). The results show a low frequency of precipitation (below 0.3) during AR events, which reflects the connection of ARs with extreme precipitation. Examining the horizontal wind speed during AR events indicates a high correlation (above 0.7) with precipitation. In addition, high levels of snow water equivalent (SWE) are also observed along mountainous regions, e.g., the Cascade Range and the Sierra Nevada, during most AR events. Addressing the impact of duration on the frequency of precipitation, we develop Intensity-Duration-Frequency (IDF) curves for AR events that can potentially describe the future predictability of precipitation along the north and south coasts. To complement our analysis, we further investigate the flooding events recorded in the National Centers for Environmental Information (NCEI) storm events database. While some flooding events are attributed to heavy rainfall associated with an AR event, other flooding events are significantly connected to increased snowmelt before the flood date. Thus, we introduce an index that describes the contribution of rainfall vs snowmelt and categorizes the flooding events during an AR event into rain-driven and snow-driven events. Such categorization can provide insight into whether or not an AR will produce extreme precipitation or flooding. The results from such investigations are important for understanding historical AR events and assessing how precipitation and flooding might evolve in a future climate.
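The abstract does not give the exact form of the rainfall-vs-snowmelt index, but one natural reading is a simple fraction of event water input supplied by rain. A hedged sketch under that assumption (all names, values and the 0.5 threshold are illustrative):

    def flood_driver_index(rain_mm, snowmelt_mm):
        """Assumed form: fraction of event water input supplied by rainfall."""
        total = rain_mm + snowmelt_mm
        return rain_mm / total if total > 0 else float("nan")

    idx = flood_driver_index(rain_mm=85.0, snowmelt_mm=40.0)      # hypothetical event
    category = "rain-driven" if idx >= 0.5 else "snow-driven"     # assumed threshold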
Persistent landfalling atmospheric rivers over the west coast of North America
NASA Astrophysics Data System (ADS)
Payne, Ashley E.; Magnusdottir, Gudrun
2016-11-01
Landfalling atmospheric rivers (ARs) are linked to heavy precipitation and extreme flooding, and are well known along the western coast of North America. The hydrological impacts of ARs upon landfall are correlated with their duration and magnitude. In order to improve the forecast of these hydrologically significant landfalling events, a better understanding of how they differ from other landfalling events must be established through an investigation of the mechanisms leading to their development prior to landfall. A subset of persistent landfalling AR events between 30°N and 50°N is identified in 3-hourly Modern-Era Retrospective Analysis for Research and Applications reanalysis and validated against existing data sets. These events are identified as features in the low troposphere with high moisture transport and extended geometry that persist over a limited region of the coastline for longer than 63 h (85th percentile of AR duration). A composite analysis shows that persistent events have distinct thermodynamical and dynamical characteristics compared to all AR events. They are characterized by greater moisture content, suggestive of Pineapple Express-type events, a perturbed upper level jet and anticyclonic overturning of potential vorticity contours associated with anticyclonic Rossby wave breaking. Moreover, the location of the Rossby wave breaking is shifted inland compared to all AR events. Analogue analysis of the 500 hPa geopotential height anomalies is used to find nonpersistent events with similar dynamical characteristics to persistent events. Despite their similarity to persistent events, nonpersistent analogues show very little shift toward longer duration. A comparison of the development of persistent and nonpersistent analogues shows that persistent events have much greater moisture content.
Joint Modeling Approach for Semicompeting Risks Data with Missing Nonterminal Event Status
Hu, Chen; Tsodikov, Alex
2014-01-01
Semicompeting risks data, where a subject may experience sequential non-terminal and terminal events, and the terminal event may censor the non-terminal event but not vice versa, are widely available in many biomedical studies. We consider the situation when a proportion of subjects’ non-terminal events is missing, such that the observed data become a mixture of “true” semicompeting risks data and partially observed terminal event only data. An illness-death multistate model with proportional hazards assumptions is proposed to study the relationship between non-terminal and terminal events, and provide covariate-specific global and local association measures. Maximum likelihood estimation based on semiparametric regression analysis is used for statistical inference, and asymptotic properties of proposed estimators are studied using empirical process and martingale arguments. We illustrate the proposed method with simulation studies and data analysis of a follicular cell lymphoma study. PMID:24430204
Automated Detection of Events of Scientific Interest
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
A report presents a slightly different perspective on the subject matter of Fusing Symbolic and Numerical Diagnostic Computations (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events. In the case of the cited companion NASA Tech Briefs article, the contemplated events that one seeks to detect would be primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the case of the present report, the events to be detected could also include natural phenomena that could be of scientific interest. Hence, the use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.
Analysis of event data recorder data for vehicle safety improvement
DOT National Transportation Integrated Search
2008-04-01
The Volpe Center performed a comprehensive engineering analysis of Event Data Recorder (EDR) data supplied by the National Highway Traffic Safety Administration (NHTSA) to assess its accuracy and usefulness in crash reconstruction and improvement of ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane
2003-09-01
This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
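The core idea of probabilistic branching within an object model can be illustrated in a few lines. The sketch below is a toy Python rendering of the concept, not the OBEST implementation; the runway-incursion states and probabilities are invented for illustration.

    import random

    # Hypothetical object behavior model: each state maps to possible next
    # events with branch probabilities, so event ordering is not fixed a priori.
    BRANCHES = {
        "taxiing": [("holds short", 0.98), ("enters runway", 0.02)],
        "enters runway": [("controller catches error", 0.90), ("incursion", 0.10)],
    }

    def run_scenario(state="taxiing"):
        """Sample one scenario; its likelihood accumulates at every branch."""
        path, likelihood = [state], 1.0
        while state in BRANCHES:
            events, probs = zip(*BRANCHES[state])
            nxt = random.choices(events, weights=probs)[0]
            likelihood *= dict(BRANCHES[state])[nxt]
            state = nxt
            path.append(state)
        return path, likelihood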
Discourse Analysis and Language Learning [Summary of a Symposium].
ERIC Educational Resources Information Center
Hatch, Evelyn
1981-01-01
A symposium on discourse analysis and language learning is summarized. Discourse analysis can be divided into six fields of research: syntax, the amount of syntactic organization required for different types of discourse, large speech events, intra-sentential cohesion in text, speech acts, and unequal power discourse. Research on speech events and…
External events analysis for the Savannah River Site K reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brandyberry, M.D.; Wingo, H.E.
1990-01-01
The probabilistic external events analysis performed for the Savannah River Site K-reactor PRA considered many different events which are generally perceived to be "external" to the reactor and its systems, such as fires, floods, seismic events, and transportation accidents (as well as many others). Events which have been shown to be significant contributors to risk include seismic events, tornados, a crane failure scenario, fires and dam failures. The total contribution to the core melt frequency from external initiators has been found to be 2.2 × 10^-4 per year, of which seismic events are the major contributor (1.2 × 10^-4 per year). Fire-initiated events contribute 1.4 × 10^-7 per year, tornados 5.8 × 10^-7 per year, dam failures 1.5 × 10^-6 per year and the crane failure scenario less than 10^-4 per year to the core melt frequency. 8 refs., 3 figs., 5 tabs.
Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R
2009-01-01
We present g-PRIME, a software-based tool for physiology data acquisition, analysis, and stimulus generation in education and research. This software was developed in an undergraduate neurophysiology course and was strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone Windows application coded and "compiled" in Matlab (it does not require a Matlab license). g-PRIME supports many data acquisition interfaces, from the PC sound card to expensive high-throughput calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real-time and offline filtering of signals, multi-parameter threshold-and-window based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rate and intervals. The software also correlates detected events with another simultaneously acquired source (event-triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication-quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
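Threshold-and-window event detection of the kind described here has a compact generic form: an event begins when the signal crosses a threshold and is kept only if its width falls inside a user-set window. A minimal NumPy sketch of that general technique (not g-PRIME's code):

    import numpy as np

    def detect_events(signal, threshold, min_width, max_width):
        """Return (start, end) sample indices of supra-threshold events whose
        width in samples lies within [min_width, max_width]."""
        above = np.concatenate(([False], signal > threshold, [False]))
        edges = np.diff(above.astype(int))
        starts = np.flatnonzero(edges == 1)
        ends = np.flatnonzero(edges == -1)        # exclusive end indices
        widths = ends - starts
        keep = (widths >= min_width) & (widths <= max_width)
        return list(zip(starts[keep], ends[keep]))

    # e.g. spikes 1-5 ms wide in a 10 kHz recording: min_width=10, max_width=50.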
Evidence of validity of the Stress-Producing Life Events (SPLE) instrument.
Rizzini, Marta; Santos, Alcione Miranda Dos; Silva, Antônio Augusto Moura da
2018-01-01
OBJECTIVE Evaluate the construct validity of a list of eight stressful life events in pregnant women. METHODS A cross-sectional study was conducted with 1,446 pregnant women in São Luís, MA, and 1,364 pregnant women in Ribeirão Preto, SP (BRISA cohort), from February 2010 to June 2011. In the exploratory factor analysis, promax oblique rotation was used, and internal consistency was calculated using composite reliability. The construct validity was determined by means of confirmatory factor analysis using mean- and variance-adjusted weighted least squares estimation. RESULTS The model with the best fit in the exploratory analysis was the one that retained three factors with a cumulative variance of 61.1%. The one-factor model did not obtain a good fit in either sample in the confirmatory analysis. The three-factor model, called Stress-Producing Life Events, presented a good fit (RMSEA < 0.05; CFI/TLI > 0.90) for both samples. CONCLUSIONS The Stress-Producing Life Events constitute a second-order construct with three dimensions related to health, personal and financial aspects, and violence. This study found evidence that confirms the construct validity of a list of stressor events, entitled the Stress-Producing Life Events Inventory.
Schug, Stephan A; Parsons, Bruce; Li, Chunming; Xia, Feng
2017-01-01
Background: Nonselective, nonsteroidal anti-inflammatory drugs (NSAIDs) and selective cyclooxygenase-2 (COX-2) inhibitors are associated with safety issues including cardiovascular, renal, and gastrointestinal (GI) events. Objective: To examine the safety of parecoxib, a COX-2 inhibitor, for the management of postoperative pain. Design: Pooled analysis of 28 placebo-controlled trials of parecoxib and review of postauthorization safety data. Main outcome measures: Prespecified safety events commonly associated with COX-2 inhibitors and/or NSAIDs. In the clinical trial analysis, the frequency of each event was compared between treatment groups using a chi-square test. In the postauthorization review, the number of confirmed cases, along with outcome, was presented for each event. Results: In the clinical trial analysis, GI-related events occurred in ~0.2% of patients in the parecoxib and placebo groups. Renal failure and impairment was similar between parecoxib (1.0%) and placebo (0.9%). The occurrence of arterial (parecoxib=0.3%; placebo=0.2%) and venous (parecoxib=0.2%; placebo=0.1%) cardiovascular embolic and thrombotic events was similar between groups. Hypersensitivity reactions including anaphylactic reactions (parecoxib=8.7%; placebo=8.6%), hypotension (parecoxib=2.6%; placebo=2.1%), angioedema (parecoxib=2.5%; placebo=2.8%), and severe cutaneous adverse reactions (0% in both groups) were similar between groups. Incision site or other skin/tissue infections occurred in <0.1% of patients in both groups. The occurrence of these events (total reports/serious reports) in the postauthorization database, based on 69,567,300 units of parecoxib, was as follows: GI ulceration-related events (35/35), renal failure and impairment (77/68), cardiovascular embolic and thrombotic events (66/64), hypersensitivity reactions including hypotension-related events (32/25) and severe cutaneous adverse events (17/17), and masking signs of inflammation (18/18). A majority of reported outcomes were classified as recovered or recovering. Conclusions: Potentially serious safety events occur infrequently with parecoxib, which highlights its safety in patients with postoperative pain. PMID:29066931
NASA Astrophysics Data System (ADS)
Jun, Changhyun; Qin, Xiaosheng; Gan, Thian Yew; Tung, Yeou-Koung; De Michele, Carlo
2017-10-01
This study presents a storm-event based bivariate frequency analysis approach to determine design rainfalls in which the number, intensity and duration of actual rainstorm events were considered. To derive more realistic design storms, the occurrence probability of an individual rainstorm event was determined from the joint distribution of storm intensity and duration through a copula model. Hourly rainfall data were used from three climate stations located in Singapore, South Korea and Canada, respectively. It was found that the proposed approach gives a more realistic description of the rainfall characteristics of rainstorm events and of design rainfalls. As a result, the design rainfall quantities derived from actual rainstorm events at the three studied sites are consistently lower than those obtained from the conventional rainfall depth-duration-frequency (DDF) method, especially for short-duration storms (such as 1-h). This follows from assigning occurrence probabilities to individual rainstorm events, a different angle on rainfall frequency analysis, and could offer an alternative way of describing extreme rainfall properties and potentially help improve the hydrologic design of stormwater management facilities in urban areas.
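The abstract does not name the copula family, so as an illustration the sketch below uses a Gumbel-Hougaard copula, a common choice for positively dependent intensity-duration pairs, to turn two marginal non-exceedance probabilities into a joint "AND" exceedance probability. Parameter values are hypothetical.

    import math

    def gumbel_copula(u, v, theta):
        """C(u,v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)), theta >= 1."""
        return math.exp(-(((-math.log(u)) ** theta +
                           (-math.log(v)) ** theta) ** (1.0 / theta)))

    def joint_exceedance(u, v, theta):
        """P(I > i and D > d) from the survival form 1 - u - v + C(u, v)."""
        return 1.0 - u - v + gumbel_copula(u, v, theta)

    # Marginals u (intensity) and v (duration) both at their 99th percentile,
    # with moderate dependence (theta = 2):
    p_and = joint_exceedance(0.99, 0.99, theta=2.0)
    return_period_events = 1.0 / p_and   # in units of rainstorm events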
An analysis of high-impact, low-predictive skill severe weather events in the northeast U.S.
NASA Astrophysics Data System (ADS)
Vaughan, Matthew T.
An objective evaluation of Storm Prediction Center slight-risk convective outlooks, as well as a method to identify high-impact severe weather events with poor predictive skill, are presented in this study. The objectives are to assess severe weather forecast skill over the northeast U.S. relative to the continental U.S., build a climatology of high-impact, low-predictive skill events between 1980 and 2013, and investigate the dynamic and thermodynamic differences between severe weather events with low predictive skill and high predictive skill over the northeast U.S. Severe storm reports of hail, wind, and tornadoes are used to calculate skill scores, including probability of detection (POD), false alarm ratio (FAR) and threat score (TS), for each convective outlook. Low-predictive skill events are binned into low POD (type 1) and high FAR (type 2) categories to assess the temporal variability of low-predictive skill events. Type 1 events were found to occur in every year of the dataset, with an average of 6 events per year. Type 2 events occur less frequently and are more common in the earlier half of the study period. An event-centered composite analysis is performed on the low-predictive skill database using the National Centers for Environmental Prediction Climate Forecast System Reanalysis 0.5° gridded dataset to analyze the dynamic and thermodynamic conditions prior to high-impact severe weather events with varying predictive skill. Deep-layer vertical shear between 1000 and 500 hPa is found to be a significant discriminator of slight-risk forecast skill: high-impact events with less than 31 kt of shear have lower threat scores than high-impact events with higher shear values. Case study analysis of type 1 events suggests the environment over which severe weather occurs is characterized by high downdraft convective available potential energy, steep low-level lapse rates, and high lifting condensation level heights that contribute to an elevated risk of severe wind.
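The three skill scores used here come straight from a 2x2 forecast/observation contingency table. A minimal Python sketch with the standard definitions (the counts are hypothetical):

    def skill_scores(hits, false_alarms, misses):
        """POD, FAR and threat score (CSI) from contingency-table counts."""
        pod = hits / (hits + misses)                  # probability of detection
        far = false_alarms / (hits + false_alarms)    # false alarm ratio
        ts = hits / (hits + false_alarms + misses)    # threat score
        return pod, far, ts

    pod, far, ts = skill_scores(hits=42, false_alarms=18, misses=23)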
Negated bio-events: analysis and identification
2013-01-01
Background: Negation occurs frequently in scientific literature, especially in biomedical literature. It has previously been reported that around 13% of sentences found in biomedical research articles contain negation. Historically, the main motivation for identifying negated events has been to ensure their exclusion from lists of extracted interactions. However, recently, there has been a growing interest in negative results, which has resulted in negation detection being identified as a key challenge in biomedical relation extraction. In this article, we focus on the problem of identifying negated bio-events, given gold standard event annotations. Results: We have conducted a detailed analysis of three open access bio-event corpora containing negation information (i.e., GENIA Event, BioInfer and BioNLP’09 ST), and have identified the main types of negated bio-events. We have analysed the key aspects of a machine learning solution to the problem of detecting negated events, including selection of negation cues, feature engineering and the choice of learning algorithm. Combining the best solutions for each aspect of the problem, we propose a novel framework for the identification of negated bio-events. We have evaluated our system on each of the three open access corpora mentioned above. The performance of the system significantly surpasses the best results previously reported on the BioNLP’09 ST corpus, and achieves even better results on the GENIA Event and BioInfer corpora, both of which contain more varied and complex events. Conclusions: Recently, in the field of biomedical text mining, the development and enhancement of event-based systems has received significant interest. The ability to identify negated events is a key performance element for these systems. We have conducted the first detailed study on the analysis and identification of negated bio-events. Our proposed framework can be integrated with state-of-the-art event extraction systems. The resulting systems will be able to extract bio-events with attached polarities from textual documents, which can serve as the foundation for more elaborate systems that are able to detect mutually contradicting bio-events. PMID:23323936
Event selection services in ATLAS
NASA Astrophysics Data System (ADS)
Cranshaw, J.; Cuhadar-Donszelmann, T.; Gallas, E.; Hrivnac, J.; Kenyon, M.; McGlone, H.; Malon, D.; Mambelli, M.; Nowak, M.; Viegas, F.; Vinek, E.; Zhang, Q.
2010-04-01
ATLAS has developed and deployed event-level selection services based upon event metadata records ("TAGS") and supporting file and database technology. These services allow physicists to extract events that satisfy their selection predicates from any stage of data processing and use them as input to later analyses. One component of these services is a web-based Event-Level Selection Service Interface (ELSSI). ELSSI supports event selection by integrating run-level metadata, luminosity-block-level metadata (e.g., detector status and quality information), and event-by-event information (e.g., triggers passed and physics content). The list of events that survive after some selection criterion is returned in a form that can be used directly as input to local or distributed analysis; indeed, it is possible to submit a skimming job directly from the ELSSI interface using grid proxy credential delegation. ELSSI allows physicists to explore ATLAS event metadata as a means to understand, qualitatively and quantitatively, the distributional characteristics of ATLAS data. In fact, the ELSSI service provides an easy interface to see the highest missing ET events or the events with the most leptons, to count how many events passed a given set of triggers, or to find events that failed a given trigger but nonetheless look relevant to an analysis based upon the results of offline reconstruction, and more. This work provides an overview of ATLAS event-level selection services, with an emphasis upon the interactive Event-Level Selection Service Interface.
NASA Astrophysics Data System (ADS)
Gonzalez-Hidalgo, J. C.; Batalla, R.; Cerda, A.; de Luis, M.
2009-04-01
When Thornes and Brunsden wrote in 1977 "How often one hears the researcher (and no less the undergraduate) complain that after weeks of observation "nothing happened" only to learn that, the day after his departure, a flood caused unprecedent erosion and channel changes!" (Thornes and Brunsden, 1977, p. 57), they focussed on two different problems in geomorphological research: the effects of extreme events and the temporal compression of geomorphological processes. Time compression is one of the main characteristics of erosion processes. It means that a large share of the total soil eroded is produced in very short temporal intervals, i.e. in a few events, mostly related to extreme events. From magnitude-frequency analysis we know that a few events, not necessarily extreme in magnitude, produce a large amount of geomorphological work. Last but not least, extreme isolated events are a classical issue in geomorphology because of their specific effects, and they receive continuing attention, heightened at present by scenarios of global change. Notwithstanding, the time compression of geomorphological processes can be studied not only through the analysis of extreme events and the traditional magnitude-frequency approach, but also through a complementary approach based on the effects of the largest events. The classical approach defines an extreme event as a rare event (identified by its magnitude and quantified by some deviation from a central value), while we define the largest events by rank, whatever their magnitude. In a previous study of the time compression of soil erosion, using the USLE soil erosion database (Gonzalez-Hidalgo et al., EGU 2007), we described a relationship between the total number of daily erosive events recorded per plot and the percentage contribution to total soil erosion of the n-largest aggregated daily events. Here we offer a further refined analysis comparing different agricultural regions in the USA. To do so we have analyzed data from 594 erosion plots from the USLE database with different record periods, located in different climatic regions. Results indicate that there are no significant differences in the mean contribution of the aggregated 5-largest daily erosion events between different agricultural divisions (i.e. different regional climates), and the differences detected can be attributed to site- and plot-specific conditions. The expected contribution of the 5-largest daily events per 100 recorded daily events is estimated at around 40% of total soil erosion. We discuss the possible causes of these results and their applicability to the design of field research on soil erosion plots.
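The headline quantity, the percentage of total erosion contributed by the n largest daily events, is simple to compute. A short Python sketch (the daily values below are hypothetical):

    def largest_event_contribution(daily_erosion, n=5):
        """Percentage of total erosion delivered by the n largest daily events."""
        events = sorted(daily_erosion, reverse=True)
        total = sum(events)
        return 100.0 * sum(events[:n]) / total if total > 0 else float("nan")

    # Hypothetical plot record (t/ha per erosive day):
    share = largest_event_contribution([0.1, 0.4, 2.5, 0.2, 6.1, 0.3, 1.8, 0.1])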
Becker, Jeroen H; Krikhaar, Anniek; Schuit, Ewoud; Mårtendal, Annika; Maršál, Karel; Kwee, Anneke; Visser, Gerard H A; Amer-Wåhlin, Isis
2015-02-01
To study the predictive value of biphasic ST-events for interventions for suspected fetal distress and adverse neonatal outcome when using ST-analysis of the fetal electrocardiogram (FECG) for intrapartum fetal monitoring. Prospective cohort study. Three academic hospitals in Sweden. Women in labor with a high-risk singleton fetus in cephalic position beyond 36 weeks of gestation. In women in labor who were monitored with conventional cardiotocography, ST-waveform analysis was recorded and concealed. Traces with biphasic ST-events of the FECG (index) were compared with traces without biphasic events of the FECG. The ability of biphasic events to predict interventions for suspected fetal distress and adverse outcome was assessed using univariable and multivariable logistic regression analyses. Interventions for suspected fetal distress and adverse outcome (defined as presence of metabolic acidosis (i.e. umbilical cord pH <7.05 and base deficit in extracellular fluid >12 mmol/L), umbilical cord pH <7.00, 5-min Apgar score <7, admittance to neonatal intensive care unit or perinatal death). Although the presence of biphasic events of the FECG was associated with more interventions for fetal distress and an increased risk of adverse outcome compared with cases with no biphasic events, the presence of significant (i.e. intervention advised according to cardiotocography interpretation) biphasic events showed no independent association with interventions for fetal distress [odds ratio (OR) 1.71, 95% confidence interval (CI) 0.65-4.50] or adverse outcome (OR 1.96, 95% CI 0.74-5.24). The presence of significant biphasic events did not discriminate in the prediction of interventions for fetal distress or adverse outcome. Therefore, biphasic events in relation to ST-analysis monitoring during birth should be omitted if future studies confirm our findings. © 2014 Nordic Federation of Societies of Obstetrics and Gynecology.
NASA Astrophysics Data System (ADS)
Müller, Eva; Pfister, Angela; Bürger, Gerd; Heistermann, Maik; Bronstert, Axel
2015-04-01
Hydrological extreme events can be triggered by rainfall on different spatiotemporal scales: river floods are typically caused by event durations of between hours and days, while urban flash floods, as well as soil erosion or contaminant transport, rather result from storm events of very short duration (minutes). Still, the analysis of climate change impacts on rainfall-induced extreme events is usually carried out using daily precipitation data at best. Trend analyses of extreme rainfall at sub-daily or even sub-hourly time scales are rare. In this contribution two lines of research are combined: first, we analyse sub-hourly rainfall data spanning several decades in three European regions. Second, we investigate the scaling behaviour of heavy short-term precipitation with temperature, i.e. the dependence of high-intensity rainfall on the atmospheric temperature at that particular time and location. The trend analysis of high-resolution rainfall data shows for the first time that the frequency of short and intensive storm events in the temperate lowland regions of Germany has increased by up to 0.5 events per year over the last decades; that is, the occurrence of these types of storms has multiplied over only a few decades. Parallel to the changes in the rainfall regime, increases in the annual and seasonal average temperature and changes in the occurrence of circulation patterns responsible for the generation of high-intensity storms have been found. The analysis of temporally highly resolved rainfall records from three European regions further indicates that extreme precipitation events are more intense at warmer temperatures during the rainfall event. These observations partly follow the Clausius-Clapeyron (CC) relation. Based on this relation one may derive a general rule for the maximum rainfall intensity associated with the event temperature. Such a rule might be used in scenarios of future maximum rainfall intensities under a warming climate.
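The Clausius-Clapeyron relation implies roughly a 7% increase in atmospheric moisture-holding capacity per degree of warming, which is the usual basis for such a scaling rule. A hedged sketch of the kind of rule the authors allude to (the reference intensity and temperatures are placeholders):

    def cc_scaled_intensity(i_ref_mm_h, t_event_c, t_ref_c, rate=0.07):
        """Scale a reference rainfall intensity by ~7% per degree C (CC rate)."""
        return i_ref_mm_h * (1.0 + rate) ** (t_event_c - t_ref_c)

    # A 20 mm/h reference intensity at 15 C scaled to a 20 C event (~28 mm/h):
    i_20c = cc_scaled_intensity(20.0, t_event_c=20.0, t_ref_c=15.0)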
Event Reports Promoting Root Cause Analysis.
Pandit, Swananda; Gong, Yang
2016-01-01
Improving health is the sole objective of medical care. Unfortunately, mishaps or patient safety events happen during care. If safety events were collected effectively, they would help identify patterns and underlying causes, and ultimately generate proactive and remedial solutions for the prevention of recurrence. Based on the AHRQ Common Formats, we examine the quality of patient safety incident reports and describe the initial data requirements that can support and accelerate effective root cause analysis. The ultimate goal is to develop a knowledge base of patient safety events and their common solutions that is readily available for sharing and learning.
Survey of critical failure events in on-chip interconnect by fault tree analysis
NASA Astrophysics Data System (ADS)
Yokogawa, Shinji; Kunii, Kyousuke
2018-07-01
In this paper, a framework based on reliability physics is proposed for applying fault tree analysis (FTA) to the on-chip interconnect system of a semiconductor. By integrating expert knowledge and experience regarding the failure possibilities of basic events, critical issues of on-chip interconnect reliability are evaluated by FTA. In particular, FTA is used to identify the minimal cut sets with high risk priority. Critical events affecting on-chip interconnect reliability are identified and discussed from the viewpoint of long-term reliability assessment. The moisture impact is evaluated as an external event.
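For readers new to FTA, the arithmetic behind a top-event probability is compact: basic-event probabilities combine through AND and OR gates, assuming independent events. A toy Python sketch; the interconnect failure modes and probabilities below are invented for illustration, not taken from the paper.

    def p_and(*ps):
        """AND gate: all basic events must occur (independence assumed)."""
        out = 1.0
        for p in ps:
            out *= p
        return out

    def p_or(*ps):
        """OR gate: at least one basic event occurs (independence assumed)."""
        out = 1.0
        for p in ps:
            out *= (1.0 - p)
        return 1.0 - out

    # Hypothetical tree: an interconnect open fails if electromigration occurs,
    # OR moisture ingress coincides with a weak barrier layer.
    p_top = p_or(1e-4, p_and(1e-2, 5e-3))   # minimal cut sets: {EM}, {MI, WB}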
Sudell, Maria; Tudur Smith, Catrin; Gueyffier, François; Kolamunnage-Dona, Ruwanthi
2018-04-15
Joint modelling of longitudinal and time-to-event data is often preferred over separate longitudinal or time-to-event analyses as it can account for study dropout, error in longitudinally measured covariates, and correlation between longitudinal and time-to-event outcomes. The joint modelling literature focuses mainly on the analysis of single studies, with no methods currently available for the meta-analysis of joint model estimates from multiple studies. We propose a 2-stage method for meta-analysis of joint model estimates. The method is applied to the INDANA dataset to combine joint model estimates of systolic blood pressure with time to death, time to myocardial infarction, and time to stroke. Results are compared to meta-analyses of separate longitudinal or time-to-event models. A simulation study is conducted to contrast separate versus joint analyses over a range of scenarios. Using the real dataset, similar results were obtained by the separate and joint analyses. However, the simulation study indicated a benefit of using joint rather than separate methods in a meta-analytic setting where association exists between the longitudinal and time-to-event outcomes. Where evidence of association between longitudinal and time-to-event outcomes exists, results from joint models, rather than standalone analyses, should be pooled in 2-stage meta-analyses. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
Mitigation of Manhole Events Caused by Secondary Cable Failure
NASA Astrophysics Data System (ADS)
Zhang, Lili
"Manhole event" refers to a range of phenomena, such as smokers, fires and explosions which occur on underground electrical infrastructure, primarily in major cities. The most common cause of manhole events is decomposition of secondary cable initiated by an electric fault. The work presented in this thesis addresses various aspects related to the evolution and mitigation of the manhole events caused by secondary cable insulation failure. Manhole events develop as a result of thermal decomposition of organic materials present in the cable duct and manholes. Polymer characterization techniques are applied to intensively study the materials properties as related to manhole events, mainly the thermal decomposition behaviors of the polymers present in the cable duct. Though evolved gas analysis, the combustible gases have been quantitatively identified. Based on analysis and knowledge of field conditions, manhole events is divided into at least two classes, those in which exothermic chemical reactions dominate and those in which electrical energy dominates. The more common form of manhole event is driven by air flow down the duct. Numerical modeling of smolder propagation in the cable duct demonstrated that limiting air flow is effective in reducing the generation rate of combustible gas, in other words, limiting manhole events to relatively minor "smokers". Besides manhole events, another by-product of secondary cable insulation breakdown is stray voltage. The danger to personnel due to stray voltage is mostly caused by the 'step potential'. The amplitude of step potential as a result of various types of insulation defects is calculated using Finite Element Analysis (FEA) program.
Fernández-Fernández, Virginia; Márquez-González, María; Losada-Baltar, Andrés; García, Pablo E; Romero-Moreno, Rosa
2013-01-01
Older people's emotional distress is often related to rumination processes focused on past vital events that occurred during their lives. The specific coping strategies displayed to face those events may help explain older adults' current well-being: they can perceive that they have obtained personal growth after those events and/or they can show a tendency to have intrusive thoughts about those events. This paper describes the development and analysis of the psychometric properties of the Scales for the Assessment of the Psychological Impact of Past Life Events (SAPIPLE): the past life events-occurrence scale (LE-O), the ruminative thought scale (LE-R) and the personal growth scale (LE-PG). Participants were 393 community-dwelling older adults (mean age = 71.5 years; SD = 6.9). In addition to the SAPIPLE scales, depressive symptomatology, anxiety, psychological well-being, life satisfaction, physical function and vitality were assessed. The inter-rater agreement analysis suggests the presence of two factors in the LE-O: positive and negative vital events. Confirmatory factor analysis (CFA) supported this two-dimensional structure for both the LE-R and the LE-PG. Good internal consistency indexes were obtained for each scale and subscale, as well as good criterion and concurrent validity indexes. Both ruminative thoughts about past life events and personal growth following those events are related to older adults' current well-being. The SAPIPLE presents good psychometric properties that justify its use with elderly people. Copyright © 2012 SEGG. Published by Elsevier Espana. All rights reserved.
Leal, Cristiano; Amaral, António Luís; Costa, Maria de Lourdes
2016-08-01
Activated sludge systems are prone to foaming occurrences, which cause the sludge to rise in the reactor and affect wastewater treatment plant (WWTP) performance. Nonetheless, there is currently a knowledge gap hindering the development of foaming event prediction tools, one that may be filled by quantitative monitoring of activated sludge biota and sludge characteristics. As such, the present study focuses on the assessment of foaming events in full-scale WWTPs through quantitative analysis of protozoa, metazoa, filamentous bacteria, and sludge characteristics, further used to elucidate the relationships between these parameters. In the current study, a conventional activated sludge system (CAS) and an oxidation ditch (OD) were surveyed over periods of 2 and 3 months, respectively, regarding their biota and sludge characteristics. The biota community was monitored by microscopic observation, and a new filamentous bacteria index was developed to quantify their occurrence. Sludge characteristics (aggregated and filamentous biomass contents and aggregate size) were determined by quantitative image analysis (QIA). The obtained data were then processed by principal components analysis (PCA), cross-correlation analysis, and decision trees to assess the foaming occurrences and elucidate the underlying relationships. It was found that such events were best assessed by the combined use of the relative abundance of testate amoebae and the nocardioform filamentous index, presenting a 92.9 % success rate for overall foaming events, and 87.5 and 100 %, respectively, for persistent and mild events.
Sources of Error and the Statistical Formulation of MS:mb Seismic Event Screening Analysis
NASA Astrophysics Data System (ADS)
Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.
2014-03-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS is analysed by the International Data Centre (IDC). The IDC provides CTBT signatories basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H0: explosion characteristics using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted mb) computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with mb greater than 3.5. The Rayleigh wave magnitude (denoted MS) is a measure of later arriving surface wave energy. Magnitudes are measurements of seismic energy that include adjustments (physical correction model) for path and distance effects between event and station. Relative to mb, earthquakes generally have a larger MS magnitude than explosions. This article proposes a hypothesis test (screening analysis) using MS and mb that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the 2009 announced nuclear weapon test of the Democratic People's Republic of Korea fails to reject the null hypothesis H0: explosion characteristics.
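A schematic of such a test statistic helps fix ideas: standardize the departure of MS from an explosion-population regression line on mb, inflating the variance with a model-inadequacy term. The sketch below only illustrates that structure; the slope, intercept, variances and decision threshold are placeholders, not the paper's or the IDC's values.

    import math

    def screening_z(ms, mb, a=1.25, b=-2.6, sigma2=0.04, tau2=0.02):
        """Standardized departure of MS from an assumed explosion line a*mb + b.
        tau2 inflates the variance for physical-correction-model inadequacy."""
        resid = ms - (a * mb + b)
        return resid / math.sqrt(sigma2 + tau2)

    # Screen out (earthquake-like) when z exceeds a one-sided normal critical value:
    screened_out = screening_z(ms=4.4, mb=4.5) > 1.645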
Pre-2014 mudslides at Oso revealed by InSAR and multi-source DEM analysis
NASA Astrophysics Data System (ADS)
Kim, J. W.; Lu, Z.; QU, F.
2014-12-01
The landslide is a process that results in the downward and outward movement of slope-reshaping materials, including rocks and soils, and annually causes approximately $3.5 billion in losses and tens of casualties in the United States. The 2014 Oso mudslide was an extreme event, causing nearly 40 deaths and damaging civilian properties. Landslides are often unpredictable, but in many cases catastrophic events are repetitive. The historical record at the Oso mudslide site indicates that there have been serial events over the decades, though the extent of the sliding events varied from time to time. In our study, the combination of multi-source DEMs, InSAR, and time-series InSAR analysis has enabled us to characterize the Oso mudslide. InSAR results from ALOS PALSAR show that there was no significant deformation between mid-2006 and 2011. The combination of time-series InSAR analysis and an older DEM revealed topographic changes associated with the 2006 sliding event, which is confirmed by differencing multiple LiDAR DEMs. Precipitation and discharge measurements before the 2006 and 2014 landslide events did not exhibit extremely anomalous records, suggesting that precipitation is not the controlling factor in determining the sliding events at Oso. The lack of surface deformation during 2006-2011 and the weak correlation between precipitation and the sliding events suggest that other factors (such as porosity) might play a critical role in the run-away events at Oso and other similar landslides.
Adverse event reporting in cancer clinical trial publications.
Sivendran, Shanthi; Latif, Asma; McBride, Russell B; Stensland, Kristian D; Wisnivesky, Juan; Haines, Lindsay; Oh, William K; Galsky, Matthew D
2014-01-10
Reporting adverse events is a critical element of a clinical trial publication. In 2003, the Consolidated Standards of Reporting Trials (CONSORT) group generated recommendations regarding the appropriate reporting of adverse events. The degree to which these recommendations are followed in oncology publications has not been comprehensively evaluated. A review of citations from PubMed, Medline, and Embase published between January 1, 2009 and December 31, 2011 identified eligible randomized, controlled phase III trials in metastatic solid malignancies. Publications were assessed for 14 adverse event-reporting elements derived from the CONSORT harms extension statement; a completeness score (range, 0 to 14) was calculated by adding the number of elements reported. Linear regression analysis identified which publication characteristics were associated with reporting completeness. A total of 175 publications, with data for 96,125 patients, were included in the analysis. The median completeness score was eight (range, three to 12). Most publications (96%) reported only adverse events occurring above a threshold rate or severity, 37% did not specify the criteria used to select which adverse events were reported, and 88% grouped together adverse events of varying severity. Regression analysis revealed that trials without a stated funding source and with an earlier year of publication had significantly lower completeness scores. Reporting of adverse events in oncology publications of randomized trials is suboptimal and characterized by substantial selectivity and heterogeneity. Oncology-specific standards for adverse event reporting should be established to ensure consistency and provide the critical information required for medical decision-making.
Structural monitoring for rare events in remote locations
NASA Astrophysics Data System (ADS)
Hale, J. M.
2005-01-01
A structural monitoring system has been developed for use on high-value engineering structures, particularly suitable for remote locations where rare events such as accidental impacts, seismic activity or terrorist attack might otherwise go undetected. The system comprises a low-power intelligent on-site data logger and a remote analysis computer that communicate with one another using the internet and mobile telephone technology. The analysis computer also generates e-mail alarms and maintains a web page that displays detected events in near real-time to authorised users. The application of the prototype system to pipeline monitoring is described, in which the analysis of detected events is used to differentiate between impacts and pressure surges. The system has been demonstrated successfully and is ready for deployment.
NASA Astrophysics Data System (ADS)
Dhakal, N.; Jain, S.
2013-12-01
Rare and unusually large events (such as hurricanes and floods) can create unusual and interesting patterns in sample statistics. The Generalized Extreme Value (GEV) distribution is usually used to statistically describe extreme rainfall events. A number of recent studies have shown that the frequency of extreme rainfall events has increased over the last century, and as a result the parameters of the GEV distribution change with time (non-stationarity). But what impact does a single unusually large rainfall event (e.g., Hurricane Irene) have on the GEV parameters and, consequently, on the level of risk or the return periods used in designing civil infrastructure? In other words, if such a large event occurs today, how will it influence the level of risk (estimated from past rainfall records) for civil infrastructure? To answer these questions, we performed a sensitivity analysis of the GEV distribution parameters, as well as the return periods, to unusually large outlier events. Long-term precipitation records over the period 1981-2010 from 12 USHCN stations across the state of Maine were used for the analysis. For most of the stations, the addition of each outlier event caused an increase in the shape parameter with a large decrease in the corresponding return period. This is a key consideration for time-varying engineering design. Such isolated extreme weather events should be considered alongside traditional statistical methodology for extreme events when designing civil infrastructure (such as dams, bridges, and culverts). Such analysis is also useful in understanding the statistical uncertainty of projecting extreme events into the future.
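The experiment the authors describe can be sketched in a few lines with SciPy: fit a GEV to an annual-maximum series with and without one outsized value and compare a design return level. The synthetic record below is a placeholder for the USHCN data.

    import numpy as np
    from scipy import stats

    def return_level(sample, T=100.0):
        """Fit a GEV and return the T-year return level (1 - 1/T quantile)."""
        c, loc, scale = stats.genextreme.fit(sample)
        return stats.genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

    rng = np.random.default_rng(1)
    annual_max = rng.gumbel(loc=50.0, scale=12.0, size=30)   # synthetic maxima, mm
    rl_without = return_level(annual_max)
    rl_with = return_level(np.append(annual_max, 180.0))     # add one outlier event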
TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach
Elgendi, Mohamed
2016-01-01
Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks when investigating, interpreting and analyzing their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method, referred to for the first time as two event-related moving averages (“TERMA”), detects events in biomedical signals using event-related moving averages. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high-accuracy detection of biomedical events. Results recommend that the window sizes for the two moving averages (W1 and W2) follow the inequality (8×W1)≥W2≥(2×W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions. PMID:27827852
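The essence of TERMA is to compare a short, event-scale moving average against a longer, cycle-scale one and flag the samples where the former dominates. A minimal NumPy sketch of that idea (the offset parameter beta and the window choices are illustrative, not the paper's tuned values):

    import numpy as np

    def terma_blocks(x, w1, w2, beta=0.0):
        """Boolean blocks of interest where the event moving average (window W1)
        exceeds the cycle moving average (window W2) plus an offset."""
        assert 2 * w1 <= w2 <= 8 * w1, "recommended: (8*W1) >= W2 >= (2*W1)"
        ma = lambda s, w: np.convolve(s, np.ones(w) / w, mode="same")
        return ma(x, w1) > ma(x, w2) + beta * np.mean(x)

    # e.g. for ECG QRS detection one might try W1 ~ 30 ms and W2 ~ 150 ms,
    # expressed in samples at the signal's sampling rate.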
Fault tree analysis of the causes of waterborne outbreaks.
Risebro, Helen L; Doria, Miguel F; Andersson, Yvonne; Medema, Gertjan; Osborn, Keith; Schlosser, Olivier; Hunter, Paul R
2007-01-01
Prevention and containment of outbreaks requires examination of the contribution and interrelation of outbreak causative events. An outbreak fault tree was developed and applied to 61 enteric outbreaks related to public drinking water supplies in the EU. A mean of 3.25 causative events per outbreak were identified; each event was assigned a score based on percentage contribution per outbreak. Source and treatment system causative events often occurred concurrently (in 34 outbreaks). Distribution system causative events occurred less frequently (19 outbreaks) but were often solitary events contributing heavily towards the outbreak (a mean % score of 87.42). Livestock and rainfall in the catchment with no/inadequate filtration of water sources contributed concurrently to 11 of 31 Cryptosporidium outbreaks. Of the 23 protozoan outbreaks experiencing at least one treatment causative event, 90% of these events were filtration deficiencies; by contrast, for bacterial, viral, gastroenteritis and mixed pathogen outbreaks, 75% of treatment events were disinfection deficiencies. Roughly equal numbers of groundwater and surface water outbreaks experienced at least one treatment causative event (18 and 17 outbreaks, respectively). Retrospective analysis of multiple outbreaks of enteric disease can be used to inform outbreak investigations, facilitate corrective measures, and further develop multi-barrier approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bley, D.C.; Cooper, S.E.; Forester, J.A.
ATHEANA, a second-generation Human Reliability Analysis (HRA) method, integrates advances in psychology with the engineering, human factors, and Probabilistic Risk Analysis (PRA) disciplines to provide an HRA quantification process and PRA modeling interface that can accommodate and represent human performance in real nuclear power plant events. The method uses the characteristics of serious accidents, identified through retrospective analysis of serious operational events, to set priorities in a search process for significant human failure events, unsafe acts, and error-forcing context (unfavorable plant conditions combined with negative performance-shaping factors). ATHEANA has been tested in a demonstration project at an operating pressurized water reactor.
Paleo-event data standards for dendrochronology
Elaine Kennedy Sutherland; P. Brewer; W. Gross
2017-01-01
Extreme environmental events, such as storm winds, landslides, insect infestations, and wildfire, cause loss of life, resources, and human infrastructure. Disaster risk-reduction analysis can be improved with information about past frequency, intensity, and spatial patterns of extreme events. Tree-ring analyses can provide such information: tree rings reflect events as...
Genome-wide analysis of alternative splicing during human heart development
NASA Astrophysics Data System (ADS)
Wang, He; Chen, Yanmei; Li, Xinzhong; Chen, Guojun; Zhong, Lintao; Chen, Gangbing; Liao, Yulin; Liao, Wangjun; Bin, Jianping
2016-10-01
Alternative splicing (AS) drives determinative changes during mouse heart development. Recent high-throughput technological advancements have facilitated genome-wide AS analysis, but the AS transition from the human foetal to the adult heart has not been reported. Here, we present a high-resolution global analysis of AS transitions between human foetal and adult hearts. RNA-sequencing data showed that extensive AS transitions occurred between human foetal and adult hearts, and AS events occurred more frequently in protein-coding genes than in long non-coding RNA (lncRNA). A significant difference in AS patterns was found between foetal and adult hearts. The predicted difference in AS events was further confirmed using quantitative reverse transcription-polymerase chain reaction analysis of human heart samples. Functional analysis of foetal-specific AS events showed enrichment in cell proliferation-related pathways, including the cell cycle, whereas adult-specific AS events were associated with protein synthesis. Furthermore, 42.6% of foetal-specific AS events showed significant changes in gene expression levels between foetal and adult hearts. Genes exhibiting both foetal-specific AS and differential expression were highly enriched in cell cycle-associated functions. In conclusion, we provide a genome-wide profile of AS transitions between foetal and adult hearts and propose that AS transitions and differential gene expression may play determinative roles in human heart development.
Drought, flood and rainfall analysis under climate change in Crete, Greece
NASA Astrophysics Data System (ADS)
Tapoglou, Evdokia; Vozinaki, Anthi-Eirini; Tsanis, Ioannis; Nerantzaki, Sofia; Nikolaidis, Nikolaos
2017-04-01
In this study, an analysis of drought frequency and magnitude under climate change in Crete, Greece is performed. The analysis covers the period 1983-2100, divided into three sub-periods (1983-1999, 2000-2049 and 2050-2099) for inter-comparison. Two climate models were studied, MPI-ESM-LR-r1-CSC-REMO and EC-EARTH-r12-SMHI-RCA4, following three representative concentration pathways (+2.6, +4.5 and +8.5 W/m2). The analysis used the results of a SWAT simulation covering the entirety of Crete with 352 subbasins. Drought events are identified using the Standardized Precipitation Index (SPI) for meteorological droughts and the Standardized Runoff Index (SRI) for hydrological droughts; these indices were used to count drought events for each climate model and scenario. In all cases, an increase in both the severity and number of drought events was calculated for the future periods compared to the baseline period 1983-1999. This increase was smallest for the +2.6 W/m2 scenario and largest for the +8.5 W/m2 scenario. The magnitude of events with 10- and 100-year return periods was calculated for the subbasins of Crete, and the most vulnerable subbasins were identified, both in terms of severity and in terms of the change in index magnitude over the years. Next, a flood frequency analysis was performed for the whole of Crete to calculate the magnitude of flood events with 10- and 100-year return periods, using the SWAT-simulated runoff in each subbasin; from the change in magnitude across the time periods, the most vulnerable subbasins are identified. The same frequency analysis was performed for the precipitation at each subbasin, yielding the magnitude of extreme precipitation events with 10- and 100-year return periods. Here the most significant changes appeared in the Chania prefecture, with a 25-50% increase in extreme precipitation magnitude for the 10- and 100-year return periods by the end of the third study period. Drought and flood frequency analysis can prove a valuable tool in water management and infrastructure planning, providing an integrated analysis of extreme event magnitudes in Crete. The research reported in this paper was fully supported by the Project "Innovative solutions to climate change adaptation and governance in the water management of the Region of Crete - AQUAMAN" funded within the framework of the EEA Financial Mechanism 2009-2014.
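A minimal sketch of the SPI computation used for drought identification, assuming the common gamma-fit formulation at a 3-month aggregation scale; the monthly series is synthetic and the handling of zero-rainfall months is simplified.

    import numpy as np
    from scipy import stats

    def spi(precip, scale=3):
        agg = np.convolve(precip, np.ones(scale), mode="valid")  # running 3-month totals
        a, _, b = stats.gamma.fit(agg[agg > 0], floc=0)          # gamma fit to wet totals
        q0 = np.mean(agg == 0)                                   # probability of a zero total
        cdf = q0 + (1 - q0) * stats.gamma.cdf(agg, a, loc=0, scale=b)
        return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))     # map to standard normal

    rng = np.random.default_rng(1)
    monthly = rng.gamma(shape=2.0, scale=40.0, size=360)  # hypothetical mm/month
    print(spi(monthly)[:5].round(2))                      # SPI < -1 indicates drought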
Fong, Allan; Harriott, Nicole; Walters, Donna M; Foley, Hanan; Morrissey, Richard; Ratwani, Raj R
2017-08-01
Many healthcare providers have implemented patient safety event reporting systems to better understand and improve patient safety. Reviewing and analyzing these reports is often time consuming and resource intensive because of both the quantity of reports and the length of the free-text descriptions in the reports. Natural language processing (NLP) experts collaborated with clinical experts on a patient safety committee to assist in the identification and analysis of medication-related patient safety events. Different NLP algorithmic approaches were developed to identify four types of medication-related patient safety events, and the models were compared. Well-performing NLP models were generated to categorize medication-related events into pharmacy delivery delays, dispensing errors, Pyxis discrepancies, and prescriber errors, with receiver operating characteristic areas under the curve of 0.96, 0.87, 0.96, and 0.81, respectively. We also found that modeling the brief without the resolution text generally improved model performance. These models were integrated into a dashboard visualization to support the patient safety committee review process. We demonstrate the capabilities of various NLP models and the use of two text inclusion strategies at categorizing medication-related patient safety events. The NLP models and visualization could be used to improve the efficiency of patient safety event data review and analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
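As a hedged illustration of the simpler end of this modelling space (the paper's actual algorithms and features are not specified here), the following sketch trains a bag-of-words classifier for a single event category and evaluates it by cross-validated ROC AUC; the report texts and labels are hypothetical.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    reports = ["pyxis count discrepancy on unit 4", "dose delivered late by pharmacy",
               "wrong concentration dispensed", "order entered for wrong patient"] * 25
    is_delay = [0, 1, 0, 0] * 25  # 1 = pharmacy delivery delay

    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                          LogisticRegression(max_iter=1000))
    auc = cross_val_score(model, reports, is_delay, cv=5, scoring="roc_auc").mean()
    print(f"cross-validated AUC: {auc:.2f}")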
Regression analysis of mixed recurrent-event and panel-count data.
Zhu, Liang; Tong, Xinwei; Sun, Jianguo; Chen, Manhua; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L
2014-07-01
In event history studies concerning recurrent events, two types of data have been extensively discussed. One is recurrent-event data (Cook and Lawless, 2007. The Analysis of Recurrent Event Data. New York: Springer), and the other is panel-count data (Zhao and others, 2010. Nonparametric inference based on panel-count data. Test 20, 1-42). In the former case, all study subjects are monitored continuously; thus, complete information is available for the underlying recurrent-event processes of interest. In the latter case, study subjects are monitored periodically; thus, only incomplete information is available for the processes of interest. In reality, however, a third type of data could occur in which some study subjects are monitored continuously, but others are monitored periodically. When this occurs, we have mixed recurrent-event and panel-count data. This paper discusses regression analysis of such mixed data and presents two estimation procedures for the problem. One is a maximum likelihood estimation procedure, and the other is an estimating equation procedure. The asymptotic properties of both resulting estimators of regression parameters are established. Also, the methods are applied to a set of mixed recurrent-event and panel-count data that arose from a Childhood Cancer Survivor Study and motivated this investigation. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
North Atlantic storm driving of extreme wave heights in the North Sea
NASA Astrophysics Data System (ADS)
Bell, R. J.; Gray, S. L.; Jones, O. P.
2017-04-01
The relationship between storms and extreme ocean waves in the North Sea is assessed using a long-period wave data set and storms identified in the Interim ECMWF Re-Analysis (ERA-Interim). An ensemble sensitivity analysis is used to provide information on the spatial and temporal forcing from mean sea-level pressure and surface wind associated with extreme ocean wave height responses. Extreme ocean waves in the central North Sea arise due to intense extratropical cyclone winds from either the cold conveyor belt (northerly-wind events) or the warm conveyor belt (southerly-wind events). The largest wave heights are associated with northerly-wind events which tend to have stronger wind speeds and occur as the cold conveyor belt wraps rearward round the cyclone to the cold side of the warm front. The northerly-wind events provide a larger fetch to the central North Sea to aid wave growth. Southerly-wind events are associated with the warm conveyor belts of intense extratropical cyclones that develop in the left upper tropospheric jet exit region. Ensemble sensitivity analysis can provide early warning of extreme wave events by demonstrating a relationship between wave height and high pressure to the west of the British Isles for northerly-wind events 48 h prior. Southerly-wind extreme events demonstrate sensitivity to low pressure to the west of the British Isles 36 h prior.
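A minimal sketch of the ensemble sensitivity computation, assuming the standard formulation in which a scalar response (here, wave height) is regressed on a forecast field (here, mean sea-level pressure) across ensemble members, grid point by grid point; the ensemble is synthetic, with one built-in sensitive location.

    import numpy as np

    rng = np.random.default_rng(2)
    n_members, ny, nx = 50, 20, 30
    mslp = rng.normal(1010.0, 8.0, size=(n_members, ny, nx))            # hPa
    wave = 4.0 - 0.05 * mslp[:, 10, 5] + rng.normal(0, 0.2, n_members)  # m

    x_anom = mslp - mslp.mean(axis=0)
    j_anom = wave - wave.mean()
    # Sensitivity dJ/dx at each grid point: cov(J, x) / var(x)
    sens = (j_anom[:, None, None] * x_anom).mean(axis=0) / x_anom.var(axis=0)
    print(round(sens[10, 5], 3))  # recovers roughly -0.05 m per hPa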
NASA Astrophysics Data System (ADS)
Feller, Jens; Feller, Sebastian; Mauersberg, Bernhard; Mergenthaler, Wolfgang
2009-09-01
Many applications in plant management require close monitoring of equipment performance, in particular with the objective of preventing certain critical events. At each point in time, the information available to classify the criticality of the process is represented by the historic signal database as well as the current measurement. This paper presents an approach to detecting and predicting critical events, based on pattern recognition and discriminant analysis.
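A hedged sketch of one concrete instance of such a scheme, assuming window-level summary features and linear discriminant analysis as the classifier; the features, class statistics, and data are illustrative, not the paper's.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(3)
    normal = rng.normal(0.0, 1.0, size=(200, 3))        # e.g. mean/variance/slope per window
    pre_critical = rng.normal(1.5, 1.0, size=(40, 3))   # drifted statistics before events
    X = np.vstack([normal, pre_critical])
    y = np.r_[np.zeros(200), np.ones(40)]

    clf = LinearDiscriminantAnalysis().fit(X, y)
    new_window = rng.normal(1.5, 1.0, size=(1, 3))
    print(clf.predict_proba(new_window)[0, 1])  # criticality score for a new window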
Lee, Sandra; Reddington, Elise; Koutsogiannaki, Sophia; Hernandez, Michael R; Odegard, Kirsten C; DiNardo, James A; Yuki, Koichi
2018-04-27
While mortality and adverse perioperative events after noncardiac surgery in children with a broad range of congenital cardiac lesions have been investigated using large multi-institutional databases, to date single-center studies addressing adverse outcomes in children with congenital heart disease (CHD) undergoing noncardiac surgery have included only small numbers of patients with significant heart disease. The primary objective of this study was to determine the incidences of perioperative cardiovascular and respiratory events in a large cohort of patients from a single institution with a broad range of congenital cardiac lesions undergoing noncardiac procedures, and to determine risk factors for these events. We identified 3010 CHD patients presenting for noncardiac procedures in our institution over a 5-year period. We collected demographic information, including the procedure performed, cardiac diagnosis, ventricular function as assessed by echocardiogram within 6 months of the procedure, and classification of CHD into 3 groups (minor, major, or severe CHD) based on residual lesion burden and cardiovascular functional status. Characteristics related to the conduct of anesthesia care were also collected. The primary outcome variables for our analysis were the incidences of intraoperative cardiovascular and respiratory events. Univariable and multivariable logistic regressions were used to determine risk factors for these 2 outcomes. The incidence of cardiovascular events was 11.5% and of respiratory events was 4.7%. Univariable and multivariable analyses demonstrated that American Society of Anesthesiologists status (≥3), emergency cases, major and severe CHD, single-ventricle physiology, ventricular dysfunction, orthopedic surgery, general surgery, neurosurgery, and pulmonary procedures were associated with perioperative cardiovascular events. Respiratory events were associated with American Society of Anesthesiologists status (≥4) and otolaryngology, gastrointestinal, general surgery, and maxillofacial procedures. Intraoperative cardiovascular events and respiratory events in patients with CHD were relatively common. While cardiovascular events were highly associated with cardiovascular status, respiratory events were not.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao
In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit not only from known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both the time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing, and results are discussed.
Tang, Huilin; Cui, Wei; Li, Dandan; Wang, Tiansheng; Zhang, Jingjing; Zhai, Suodi; Song, Yiqing
2017-01-01
Given inconsistent trial results of sodium-glucose cotransporter 2 (SGLT2) inhibitors in addition to insulin therapy for treating type 2 diabetes mellitus (T2DM), a meta-analysis was performed to evaluate the efficacy and safety of this combination for T2DM by searching available randomized trials from PubMed, Embase, CENTRAL and ClinicalTrials.gov. Our meta-analysis included seven eligible placebo-controlled trials involving 4235 patients. Compared with placebo, SGLT2 inhibitor treatment was significantly associated with a mean reduction in HbA1c of -0.56%, fasting plasma glucose of -0.95 mmol/L, body weight of -2.63 kg and insulin dose of -8.79 IU, but an increased risk of drug-related adverse events by 36%, urinary tract infections by 29% and genital infections by 357%. No significant increase was observed in risk of overall adverse events [risk ratio (RR), 1.00], serious adverse events (RR, 0.90), adverse events leading to discontinuation (RR, 1.16), hypoglycaemia events (RR, 1.07) and severe hypoglycaemia events (RR, 1.24). No diabetic ketoacidosis events were reported. Further studies are needed to establish optimal combination type and dose. © 2016 John Wiley & Sons Ltd.
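For reference, pooled risk ratios such as those above are conventionally computed by DerSimonian-Laird random-effects pooling of per-trial log risk ratios; a sketch with hypothetical trial values (not those of the seven included trials):

    import numpy as np

    log_rr = np.log([1.25, 1.40, 1.10, 1.35])  # per-trial risk ratios
    var = np.array([0.02, 0.05, 0.03, 0.04])   # variances of the log risk ratios

    w = 1.0 / var                                           # fixed-effect weights
    q = np.sum(w * (log_rr - np.sum(w * log_rr) / w.sum()) ** 2)
    tau2 = max(0.0, (q - (len(var) - 1)) / (w.sum() - (w**2).sum() / w.sum()))
    w_re = 1.0 / (var + tau2)                               # random-effects weights
    pooled = np.sum(w_re * log_rr) / w_re.sum()
    se = np.sqrt(1.0 / w_re.sum())
    print(f"pooled RR {np.exp(pooled):.2f} "
          f"(95% CI {np.exp(pooled - 1.96 * se):.2f} to {np.exp(pooled + 1.96 * se):.2f})")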
Regression analysis of mixed panel count data with dependent terminal events.
Yu, Guanglei; Zhu, Liang; Li, Yang; Sun, Jianguo; Robison, Leslie L
2017-05-10
Event history studies are commonly conducted in many fields, and a great deal of literature has been established for the analysis of the two types of data commonly arising from these studies: recurrent event data and panel count data. The former arises if all study subjects are followed continuously, while the latter means that each study subject is observed only at discrete time points. In reality, a third type of data, a mixture of the two types described earlier, may occur; furthermore, as with the first two types of data, there may exist a dependent terminal event, which may preclude the occurrence of the recurrent events of interest. This paper discusses regression analysis of mixed recurrent event and panel count data in the presence of a terminal event, and an estimating equation-based approach is proposed for estimation of the regression parameters of interest. In addition, the asymptotic properties of the proposed estimator are established, and a simulation study conducted to assess the finite-sample performance of the proposed method suggests that it works well in practical situations. Finally, the methodology is applied to a childhood cancer study that motivated this study. Copyright © 2017 John Wiley & Sons, Ltd.
Poisson-event-based analysis of cell proliferation.
Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul
2015-05-01
A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division, rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified, temporal, and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
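A minimal sketch of the model class fitted above: a nonhomogeneous Poisson process whose event rate increases exponentially with time, simulated by Lewis-Shedler thinning; the rate parameters are illustrative, not the fitted values.

    import numpy as np

    def simulate_nhpp(lam0, tau, t_end, rng):
        """Events from rate lam(t) = lam0 * exp(t / tau) on [0, t_end], by thinning."""
        lam_max = lam0 * np.exp(t_end / tau)      # majorizing constant rate
        t, events = 0.0, []
        while True:
            t += rng.exponential(1.0 / lam_max)   # candidate event from rate lam_max
            if t > t_end:
                return np.array(events)
            if rng.random() < lam0 * np.exp(t / tau) / lam_max:  # accept w.p. lam(t)/lam_max
                events.append(t)

    rng = np.random.default_rng(4)
    mitoses = simulate_nhpp(lam0=0.5, tau=20.0, t_end=48.0, rng=rng)  # 48 h of observation
    print(len(mitoses), np.diff(mitoses)[:5].round(2))  # event count and inter-event times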
Comparing Low-Frequency Earthquakes During Triggered and Ambient Tremor in Taiwan
NASA Astrophysics Data System (ADS)
Alvarado Lara, F., Sr.; Ledezma, C., Sr.
2014-12-01
In South America, larger magnitude seismic events originate in the subduction zone between the Nazca and continental plates, as opposed to crustal events. Crustal seismic events are important in areas very close to active fault lines; however, seismic hazard analyses incorporate crustal events only within a maximum distance from the site under study. In order to use crustal events as part of a seismic hazard analysis, it is necessary to use attenuation relationships that represent the seismic behavior of the site under study. Unfortunately, in South America the amount of compiled crustal-event historical data is not yet sufficient to generate a firm regional attenuation relationship. In the absence of attenuation relationships for crustal earthquakes in the region, the conventional approach is to use attenuation relationships from other regions that have a large amount of compiled data and similar seismic conditions to the site under study. This practice permits seismic hazard analysis work within a certain margin of accuracy. In engineering practice in South America, next-generation attenuation relationships (NGA-W) are used, among other alternatives, to incorporate the effect of crustal events in a seismic hazard analysis. In 2014, NGA-W Version 2 (NGA-W2) was presented, with a database containing information from Taiwan, Turkey, Iran, the USA, Mexico, Japan, and Alaska. This paper examines whether it is acceptable to use the NGA-W2 in seismic hazard analysis in South America. To support this examination, a comparison is developed between seismic-hazard response spectra prepared in accordance with NGA-W2 and actual response spectra of crustal events from Argentina. The seismic data were gathered from equipment installed in the cities of Santiago, Chile and Mendoza, Argentina.
Wang, Kyle; Pearlstein, Kevin A; Patchett, Nicholas D; Deal, Allison M; Mavroidis, Panayiotis; Jensen, Brian C; Lipner, Matthew B; Zagar, Timothy M; Wang, Yue; Lee, Carrie B; Eblan, Michael J; Rosenman, Julian G; Socinski, Mark A; Stinchcombe, Thomas E; Marks, Lawrence B
2017-11-01
To assess associations between radiation dose/volume parameters for cardiac subvolumes and different types of cardiac events in patients treated on radiation dose-escalation trials. Patients with Stage III non-small-cell lung cancer received dose-escalated radiation (median 74 Gy) using 3D-conformal radiotherapy on six prospective trials from 1996 to 2009. Volumes analyzed included whole heart, left ventricle (LV), right atrium (RA), and left atrium (LA). Cardiac events were divided into three categories: pericardial (symptomatic effusion and pericarditis), ischemia (myocardial infarction and unstable angina), and arrhythmia. Univariable competing risks analysis was used. A total of 112 patients were analyzed, with a median follow-up of 8.8 years for surviving patients. Nine patients had pericardial, seven patients had ischemic, and 12 patients had arrhythmic events. Pericardial events were correlated with whole heart, RA, and LA dose (eg, heart-V30 [p=0.024], RA-V30 [p=0.013], and LA-V30 [p=0.001]), but not LV dose. Ischemic events were correlated with LV and whole heart dose (eg, LV-V30 [p=0.012], heart-V30 [p=0.048]). Arrhythmic events showed borderline significant associations with RA, LA, and whole heart dose (eg, RA-V30 [p=0.082], LA-V30 [p=0.076], heart-V30 [p=0.051]). Cardiac events were associated with decreased survival on univariable analysis (p=0.008, HR 2.09), but only disease progression predicted for decreased survival on multivariable analysis. Cardiac events were heterogeneous and associated with distinct heart subvolume doses. These data support the hypothesis of distinct etiologies for different types of radiation-associated cardiotoxicity. Copyright © 2017 Elsevier B.V. All rights reserved.
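For reference, the dose-volume parameters used above reduce to simple computations over a structure's per-voxel doses: V30 is the percentage of the volume receiving at least 30 Gy. A sketch with a hypothetical dose array:

    import numpy as np

    def v_dose(dose_gy, threshold_gy=30.0):
        return 100.0 * np.mean(dose_gy >= threshold_gy)  # percent of voxels at/above threshold

    rng = np.random.default_rng(5)
    left_atrium_dose = rng.gamma(shape=2.0, scale=12.0, size=5000)  # Gy per voxel
    print(f"LA-V30 = {v_dose(left_atrium_dose):.1f}%")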
Revision of the Applicability of the NGA's in South America, Chile - Argentina.
NASA Astrophysics Data System (ADS)
Alvarado Lara, F., Sr.; Ledezma, C., Sr.
2015-12-01
In South America, larger magnitude seismic events originate in the subduction zone between the Nazca and continental plates, as opposed to crustal events. Crustal seismic events are important in areas very close to active fault lines; however, seismic hazard analyses incorporate crustal events only within a maximum distance from the site under study. In order to use crustal events as part of a seismic hazard analysis, it is necessary to use attenuation relationships that represent the seismic behavior of the site under study. Unfortunately, in South America the amount of compiled crustal-event historical data is not yet sufficient to generate a firm regional attenuation relationship. In the absence of attenuation relationships for crustal earthquakes in the region, the conventional approach is to use attenuation relationships from other regions that have a large amount of compiled data and similar seismic conditions to the site under study. This practice permits seismic hazard analysis work within a certain margin of accuracy. In engineering practice in South America, next-generation attenuation relationships (NGA-W) are used, among other alternatives, to incorporate the effect of crustal events in a seismic hazard analysis. In 2014, NGA-W Version 2 (NGA-W2) was presented, with a database containing information from Taiwan, Turkey, Iran, the USA, Mexico, Japan, and Alaska. This paper examines whether it is acceptable to use the NGA-W2 in seismic hazard analysis in South America. To support this examination, a comparison is developed between seismic-hazard response spectra prepared in accordance with NGA-W2 and actual response spectra of crustal events from Argentina. The seismic data were gathered from equipment installed in the cities of Santiago, Chile and Mendoza, Argentina.
Asher, Lucy; Harvey, Naomi D.; Green, Martin; England, Gary C. W.
2017-01-01
Epidemiology is the study of patterns of health-related states or events in populations. Statistical models developed for epidemiology could be usefully applied to behavioral states or events. The aim of this study is to present the application of epidemiological statistics to understand animal behavior where discrete outcomes are of interest, using data from guide dogs to illustrate. Specifically, survival analysis and multistate modeling are applied to data on guide dogs, comparing dogs that completed training and qualified as a guide dog to those that were withdrawn from the training program. Survival analysis models the time to (or between) binary events and the probability of an event occurring at or beyond a specified time point. Survival analysis, using a Cox proportional hazards model, was used to examine the time taken to withdraw a dog from training. Sex, breed, and other factors affected time to withdrawal. Bitches were withdrawn faster than dogs; Labradors were withdrawn faster, and Labrador × Golden Retrievers slower, than Golden Retriever × Labradors; and dogs not bred by Guide Dogs were withdrawn faster than those bred by Guide Dogs. Multistate modeling (MSM) can be used as an extension of survival analysis to incorporate more than two discrete events or states. Multistate models were used to investigate transitions between states of training to qualification as a guide dog or behavioral withdrawal, and from qualification as a guide dog to behavioral withdrawal. Sex, breed (with purebred Labradors and Golden Retrievers differing from F1 crosses), and whether or not the dog was bred by Guide Dogs affected movements between states. We postulate that survival analysis and MSM could be applied to a wide range of behavioral data, and key examples are provided. PMID:28804710
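A minimal sketch of the survival-analysis step using the lifelines package, on a toy data set with hypothetical durations, withdrawal indicators, and two covariates; with real data the fitted hazard ratios would correspond to the sex and breeding-source effects reported above.

    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "weeks_in_training":  [10, 24, 30, 8, 26, 18, 30, 22],
        "withdrawn":          [1, 0, 0, 1, 0, 1, 0, 1],   # 0 = qualified (censored)
        "is_bitch":           [1, 0, 1, 0, 0, 1, 1, 0],
        "bred_by_guide_dogs": [0, 1, 1, 0, 1, 1, 0, 0],
    })
    cph = CoxPHFitter().fit(df, duration_col="weeks_in_training", event_col="withdrawn")
    cph.print_summary()  # hazard ratios for the two covariates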
Pesticide leaching via subsurface drains in different hydrologic situations
NASA Astrophysics Data System (ADS)
Zajíček, Antonín; Fučík, Petr; Liška, Marek; Dobiáš, Jakub
2017-04-01
Pesticides and their degradates in tile drainage waters were studied in two small, predominantly agricultural, tile-drained subcatchments in the Bohemian-Moravian Highlands, Czech Republic. The goal was to evaluate their occurrence and the dynamics of their concentrations in drainage waters in different hydrologic situations, using discharge and concentration monitoring together with 18O and 2H isotope analysis for Mean Residence Time (MRT) estimation and hydrograph separations during rainfall-runoff (R-R) events. The drainage and stream discharges were measured continuously at the closing outlets of three drainage groups and one small stream. During periods of prevailing base flow and interflow, samples were collected manually at two-week intervals for isotope analysis and, during the spraying period (March to October), also for pesticide analysis. During R-R events, samples were taken by automatic samplers at intervals varying from 20 min (summer) to 1 hour (winter). To enable isotopic analysis, precipitation was sampled both manually at two-week intervals and with an automatic rainfall sampler that collected precipitation samples during R-R events at 20-min intervals. The isotopic analysis showed that the MRT of drainage base flow and interflow varies from 2.2 to 3.3 years, while the MRT of base flow and interflow in the surface stream is several months. During R-R events, the proportion of event water varied from 0 to 60% in both drainage and surface runoff. The occurrence of pesticides and their degradates in drainage waters is strongly dependent on the hydrologic situation. While degradates were permanently present in drainage waters in high but varying concentrations according to the instantaneous runoff composition, parent compounds were detected almost exclusively during R-R events. In periods with prevailing base flow and interflow (grab samples), ESA forms of chloroacetanilide degradates in particular occurred in high concentrations in all samples; the average sum of degradates varied between 1730 and 5760 ng/l. During R-R events, pesticide concentrations varied according to the runoff composition and the time between spraying and the event. Events with no proportion of event water in drainage runoff were characterized by an increase in degradate concentrations (up to 20,000 ng/l) and little or no occurrence of parent compounds. Events with a significant event-water proportion in drainage runoff were characterized by a decrease in degradate concentrations and, when the event happened soon after spraying, by the presence of parent pesticides in drainage runoff. Instantaneous concentrations of parent compounds can be extremely high in such cases, up to 23,000 ng/l in drainage waters and up to 40,000 ng/l in the small stream. The above results suggest that drainage systems can act as a significant source of pesticide leaching. For parent compounds to leach via tile drainage systems, several boundary conditions must coincide, such as the occurrence of an R-R event soon after pesticide application and the presence of event water (or water with a short residence time in the catchment) in the drainage runoff.
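The event-water proportions quoted above follow from the classical two-component isotope mixing model; a minimal sketch with hypothetical 18O delta values (per mil):

    def event_water_fraction(delta_runoff, delta_pre_event, delta_event):
        """Tracer mass balance: f_event = (d_runoff - d_pre) / (d_event - d_pre)."""
        return (delta_runoff - delta_pre_event) / (delta_event - delta_pre_event)

    f = event_water_fraction(delta_runoff=-9.2, delta_pre_event=-10.0, delta_event=-7.0)
    print(f"event water: {100 * f:.0f}% of drainage runoff")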
Web Video Event Recognition by Semantic Analysis From Ubiquitous Documents.
Yu, Litao; Yang, Yang; Huang, Zi; Wang, Peng; Song, Jingkuan; Shen, Heng Tao
2016-12-01
In recent years, the task of event recognition from videos has attracted increasing interest in the multimedia area. While most of the existing research has mainly focused on exploring visual cues to handle relatively small-granular events, it is difficult to directly analyze video content without any prior knowledge. Therefore, synthesizing both visual and semantic analysis is a natural way toward video event understanding. In this paper, we study the problem of Web video event recognition, where Web videos often describe large-granular events and carry limited textual information. Key challenges include how to accurately represent event semantics from incomplete textual information and how to effectively explore the correlation between visual and textual cues for video event understanding. We propose a novel framework to perform complex event recognition from Web videos. In order to compensate for the insufficient expressive power of visual cues, we construct an event knowledge base by deeply mining semantic information from ubiquitous Web documents. This event knowledge base is capable of describing each event with comprehensive semantics. By utilizing this base, the textual cues for a video can be significantly enriched. Furthermore, we introduce a two-view adaptive regression model, which explores the intrinsic correlation between the visual and textual cues of the videos to learn reliable classifiers. Extensive experiments on two real-world video data sets show the effectiveness of our proposed framework and prove that the event knowledge base indeed helps improve the performance of Web video event recognition.
Discrete event simulation tool for analysis of qualitative models of continuous processing systems
NASA Technical Reports Server (NTRS)
Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)
1990-01-01
An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized into four modules: a library design module, a model construction module, a simulation module, and an experimentation and analysis module. The library design module supports the building of library knowledge, including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability to compare log files.
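A generic sketch of the discrete event scheme described, in which a time-ordered event queue is processed and each executed event may schedule further events after a delay; this illustrates the general simulation technique, not the disclosed tool's implementation.

    import heapq

    def simulate(initial_events, t_end):
        queue = list(initial_events)              # entries: (time, name, action)
        heapq.heapify(queue)
        while queue:
            t, name, action = heapq.heappop(queue)
            if t > t_end:
                break
            print(f"t={t:5.1f}  {name}")
            for delay, next_name, next_action in action(t):
                heapq.heappush(queue, (t + delay, next_name, next_action))

    def pump_on(t):
        # A mode transition process scheduling a mode-dependent process after a delay.
        return [(4.0, "pressure_nominal", lambda t: [])]

    simulate([(0.0, "pump_on", pump_on)], t_end=10.0)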
Test report for single event effects of the 80386DX microprocessor
NASA Technical Reports Server (NTRS)
Watson, R. Kevin; Schwartz, Harvey R.; Nichols, Donald K.
1993-01-01
The Jet Propulsion Laboratory Section 514 Single Event Effects (SEE) Testing and Analysis Group has performed a series of SEE tests of certain strategic registers of Intel's 80386DX CHMOS 4 microprocessor. Following a summary of the test techniques and hardware used to gather the data, we present the SEE heavy ion and proton test results. We also describe the registers tested, along with a system impact analysis should these registers experience a single event upset.
NASA Astrophysics Data System (ADS)
Petrov, Yevgeniy
2009-10-01
Searches for sources of the highest-energy cosmic rays have traditionally included looking for clusters of event arrival directions on the sky. The smallest cluster is a pair of events falling within some angular window. In contrast to the standard two point (2-pt) autocorrelation analysis, this work takes into account the influence of the galactic magnetic field (GMF). The highest-energy events, those above 50 EeV, collected by the surface detector of the Pierre Auger Observatory between January 1, 2004 and May 31, 2009 are used in the analysis. Assuming protons as primaries, events are backtracked through the BSS/S, BSS/A, ASS/S and ASS/A versions of the Harari-Mollerach-Roulet (HMR) model of the GMF. For each version of the model, a 2-pt autocorrelation analysis is applied to the backtracked events and to 10^5 isotropic Monte Carlo realizations weighted by the Auger exposure. Scans in energy, separation angular window, and different model parameters reveal clustering at different angular scales. Small-angle clustering at 2-3 deg is particularly interesting, and it is compared between different field scenarios. The strength of the autocorrelation signal at those angular scales differs between the BSS and ASS versions of the HMR model. The BSS versions of the model tend to defocus protons as they arrive at Earth, whereas the ASS versions, on the contrary, tend to focus them.
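A minimal sketch of the 2-pt autocorrelation step: count event pairs within an angular window and compare with isotropic Monte Carlo realizations; exposure weighting and magnetic backtracking are omitted, and the event set is synthetic.

    import numpy as np

    def pair_count(ra, dec, max_sep_deg):
        v = np.column_stack([np.cos(dec) * np.cos(ra),
                             np.cos(dec) * np.sin(ra),
                             np.sin(dec)])
        sep = np.degrees(np.arccos(np.clip(v @ v.T, -1.0, 1.0)))
        iu = np.triu_indices(len(ra), k=1)                 # unique pairs only
        return int(np.sum(sep[iu] < max_sep_deg))

    rng = np.random.default_rng(6)
    n_ev = 60
    ra, dec = rng.uniform(0, 2 * np.pi, n_ev), np.arcsin(rng.uniform(-1, 1, n_ev))
    data_pairs = pair_count(ra, dec, 3.0)
    mc = [pair_count(rng.uniform(0, 2 * np.pi, n_ev),
                     np.arcsin(rng.uniform(-1, 1, n_ev)), 3.0) for _ in range(200)]
    print(data_pairs, np.mean(np.array(mc) >= data_pairs))  # pairs and chance probability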
The pros and cons of researching events ethnographically
2017-01-01
Events (remarkable, disruptive happenings) are important subjects of study for understanding processes of change. In this essay, I reflect upon the issue of what the ethnographic method has to offer for the analysis of this social phenomenon. To do so, I review three recently published ethnographic studies of events. My conclusion is that it is indeed a very useful method for understanding the feelings and ideas of people who are experiencing eventful situations, for instance around protests or natural disasters. However, using this method also brings about practical difficulties, such as the ‘luck’ that an event occurs at the ethnographic fieldwork site. Next, as transformative responses to events are not bound by the place or time of the happening, other methods (interviews, discourse analysis, surveys) that make it easier to follow them in varying locations and periods might be more suitable for getting a comprehensive picture of their meaning-making dynamics. PMID:29081715
NASA Astrophysics Data System (ADS)
Nasution, A. H.; Rachmawan, Y. A.
2018-04-01
Fashion trends around the world change extremely fast, and fashion has become part of people's lifestyle worldwide. Fashion week events in several areas can serve as a measure of current fashion trends. Jakarta Fashion Week (JFW), a fashion week event in Indonesia, aims to show fashion trends to people who want to improve their fashion style. People will join events that involve them; hence they will come to those events again and again. An annual, continuous event is important for creating loyalty among the people involved in it, in order to foster positive development for the organizer in staging the next event, save a large amount from the marketing budget, and create a higher-quality event. This study aims to determine the effect of the five brand personality dimensions on event involvement and loyalty in Jakarta Fashion Week (JFW). The study uses a quantitative confirmatory method with Structural Equation Modeling (SEM) analysis. The sample consists of 150 respondents who participated in Jakarta Fashion Week 2017. Results show a significant effect of the five brand personality dimensions on the three dimensions of event involvement and on loyalty. Meanwhile, one dimension of event involvement, personal self-expression, had no effect on loyalty.
Identification of Tf1 integration events in S. pombe under nonselective conditions
Cherry, Kristina E.; Hearn, Willis E.; Seshie, Osborne Y.; Singleton, Teresa L.
2014-01-01
Integration of retroviral elements into the host genome is a phenomenon observed among many classes of retroviruses. Much information concerning integration of retroviral elements has been documented based on in vitro analysis or expression of selectable markers. To identify possible Tf1 integration events within silent regions of the S. pombe genome, we focused on performing an in vivo genome-wide analysis of Tf1 integration events from the nonselective phase of the retrotransposition assay. We analyzed 1000 individual colonies streaked from four independent Tf1 transposed patches under nonselection conditions. Our analysis detected a population of G418S/neo+ Tf1 integration events that would have been overlooked during the selective phase of the assay. Further RNA analysis from the G418S/neo+ clones revealed 50% of clones expressing the neo selectable marker. Our data reveal Tf1’s ability to insert within silent regions of S. pombe’s genome. PMID:24680781
Identification of Tf1 integration events in S. pombe under nonselective conditions.
Cherry, Kristina E; Hearn, Willis E; Seshie, Osborne Y K; Singleton, Teresa L
2014-06-01
Integration of retroviral elements into the host genome is a phenomenon observed among many classes of retroviruses. Much information concerning the integration of retroviral elements has been documented based on in vitro analysis or expression of selectable markers. To identify possible Tf1 integration events within silent regions of the Schizosaccharomyces pombe genome, we focused on performing an in vivo genome-wide analysis of Tf1 integration events from the nonselective phase of the retrotransposition assay. We analyzed 1000 individual colonies streaked from four independent Tf1 transposed patches under nonselection conditions. Our analysis detected a population of G418(S)/neo(+) Tf1 integration events that would have been overlooked during the selective phase of the assay. Further RNA analysis from the G418(S)/neo(+) clones revealed 50% of clones expressing the neo selectable marker. Our data reveal Tf1's ability to insert within silent regions of S. pombe's genome. Copyright © 2014 Elsevier B.V. All rights reserved.
Using Aoristic Analysis to Link Remote and Ground-Level Phenological Observations
NASA Astrophysics Data System (ADS)
Henebry, G. M.
2013-12-01
Phenology is about observing events in time and space. With the advent of publicly accessible geospatial datastreams and easy-to-use mapping software, specifying where an event occurs is much less of a challenge than it was just two decades ago. In contrast, specifying when an event occurs remains a nontrivial function of a population of organismal responses, sampling interval, compositing period, and reporting precision. I explore how aoristic analysis can be used to analyze spatiotemporal events for which the location is known to acceptable levels of precision but the temporal coordinates are poorly specified or only partially bounded. Aoristic analysis was developed in the late 1990s in the field of quantitative criminology to leverage temporally imprecise geospatial data of crime reports. Here I demonstrate how aoristic analysis can be used to link remotely sensed observations of land surface phenology to ground-level observations of organismal phenophase transitions. Explicit representation of the windows of temporal uncertainty with aoristic weights enables cross-validation exercises and forecasting efforts to avoid false precision.
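A minimal sketch of aoristic weighting: an event known only to fall within a time window contributes fractional weight to each time bin in proportion to the overlap. The 16-day compositing example is hypothetical.

    import numpy as np

    def aoristic_weights(t_start, t_end, bin_edges):
        lo = np.clip(bin_edges[:-1], t_start, t_end)
        hi = np.clip(bin_edges[1:], t_start, t_end)
        return np.maximum(hi - lo, 0.0) / (t_end - t_start)  # weights sum to 1

    # Green-up detected somewhere within a 16-day composite (days 96-112),
    # distributed over 10-day bins spanning days 90-120:
    print(aoristic_weights(96.0, 112.0, np.arange(90, 130, 10)))  # [0.25 0.625 0.125]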
Classification and Space-Time Analysis of Precipitation Events in Manizales, Caldas, Colombia.
NASA Astrophysics Data System (ADS)
Suarez Hincapie, J. N.; Vélez, J.; Romo Melo, L.; Chang, P.
2015-12-01
Manizales is a mid-mountain Andean city located near the Nevado del Ruiz volcano in west-central Colombia; this location exposes it to earthquakes, floods, landslides and volcanic eruptions. It is located in the intertropical convergence zone (ITCZ) and has a climate with a bimodal rainfall regime (Cortés, 2010). Its mean annual rainfall is 2000 mm, and one may observe precipitation on 70% of the days in a year. This rainfall, which favors the formation of large masses of clouds, together with macroclimatic phenomena such as the "El Niño Southern Oscillation", has historically caused great impacts in the region (Vélez et al., 2012). For example, the geographical location, coupled with rain events, results in a high risk of landslides in the city. Manizales has a hydrometeorological network of 40 stations that measure and transmit data for up to eight climate variables. Some of these stations keep 10 years of historical data. However, until now this information has not been used for space-time classification of precipitation events, nor have the meteorological variables that influence them been thoroughly researched. The purpose of this study was to classify historical rain events in the urban area of Manizales and to investigate patterns of atmospheric behavior that influence or trigger such events. Classification of events was performed by calculating the "n" index of heavy rainfall, which describes the behavior of precipitation as a function of time throughout the event (Monjo, 2009). The analysis of meteorological variables was performed using statistical quantification over variable time periods before each event. The proposed classification allowed for an analysis of the evolution of rainfall events. In particular, it helped identify the influence of different meteorological variables triggering rainfall events in hazard-prone areas such as the city of Manizales.
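A hedged sketch of estimating such an "n" index, assuming one common formulation in which the maximum mean intensity decays with duration roughly as I(t) ∝ t^(-n), so that n is the negative slope of a log-log fit; the exact definition in Monjo (2009) may differ, and the hyetograph below is synthetic.

    import numpy as np

    def n_index(intensity, durations):
        imax = [np.convolve(intensity, np.ones(d) / d, mode="valid").max()
                for d in durations]                 # max mean intensity per duration
        slope, _ = np.polyfit(np.log(durations), np.log(imax), 1)
        return -slope                               # larger n = more concentrated rainfall

    rng = np.random.default_rng(7)
    event = rng.gamma(1.0, 2.0, size=120) * np.hanning(120)  # 120 one-minute steps, mm/min
    print(f"n = {n_index(event, durations=[1, 5, 10, 30, 60]):.2f}")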
Transient Volcano Deformation Event Detection over Variable Spatial Scales in Alaska
NASA Astrophysics Data System (ADS)
Li, J. D.; Rude, C. M.; Gowanlock, M.; Herring, T.; Pankratius, V.
2016-12-01
Transient deformation events driven by volcanic activity can be monitored using increasingly dense networks of continuous Global Positioning System (GPS) ground stations. The wide spatial extent of GPS networks, the large number of GPS stations, and the spatially and temporally varying scale of deformation events result in the mixing of signals from multiple sources. Typical analysis then necessitates manual identification of the times and regions of volcanic activity for further study and careful tuning of algorithmic parameters to extract possible transient events. Here we present a computer-aided discovery system that facilitates the discovery of potential transient deformation events at volcanoes by providing a framework for selecting varying spatial regions of interest and for tuning the analysis parameters. This site specification step in the framework reduces the spatial mixing of signals from different volcanic sources before applying filters to remove interfering signals originating from other geophysical processes. We analyze GPS data recorded by the Plate Boundary Observatory network and volcanic activity logs from the Alaska Volcano Observatory to search for and characterize transient inflation events in Alaska. We find 3 transient inflation events between 2008 and 2015 at the Akutan, Westdahl, and Shishaldin volcanoes in the Aleutian Islands. The inflation event detected in the first half of 2008 at Akutan is validated by other studies, while the inflation events observed in early 2011 at Westdahl and in early 2013 at Shishaldin are previously unreported. Our analysis framework also incorporates modelling of the transient inflation events and enables a comparison of different magma chamber inversion models. Here, we also estimate the magma sources that best describe the deformation observed by the GPS stations at Akutan, Westdahl, and Shishaldin. We acknowledge support from NASA AIST-NNX15AG84G (PI: V. Pankratius).
Increasing the Operational Value of Event Messages
NASA Technical Reports Server (NTRS)
Li, Zhenping; Savkli, Cetin; Smith, Dan
2003-01-01
Assessing the health of a space mission has traditionally been performed using telemetry analysis tools. Parameter values are compared to known operational limits and are plotted over various time periods. This presentation begins with the notion that there is an incredible amount of untapped information contained within the mission's event message logs. Through creative advancements in message handling tools, the event message logs can be used to better assess spacecraft and ground system status and to highlight and report on conditions not readily apparent when messages are evaluated one at a time during a real-time pass. Work in this area is being funded as part of a larger NASA effort at the Goddard Space Flight Center to create a component-based, middleware-based, standards-based, general-purpose ground system architecture referred to as GMSEC - the GSFC Mission Services Evolution Center. The new capabilities and operational concepts for event display, event data analyses and data mining are being developed by Lockheed Martin, and the new subsystem has been named GREAT - the GMSEC Reusable Event Analysis Toolkit. Planned for use on existing and future missions, GREAT has the potential to increase operational efficiency in areas of problem detection and analysis, general status reporting, and real-time situational awareness.
NASA Astrophysics Data System (ADS)
Labak, P.; Arndt, R.; Villagran, M.
2009-04-01
One of the sub-goals of the Integrated Field Experiment in 2008 (IFE08) in Kazakhstan was testing the prototype elements of the Seismic Aftershock Monitoring System (SAMS) for on-site inspection purposes. The task of the SAMS is to collect facts that should help clarify the nature of the triggering event. The SAMS therefore has to be capable of detecting and identifying events as small as magnitude -2 in an inspection area of up to 1000 km2. The field equipment of the SAMS comprised 30 mini-arrays and 10 three-component stations. Each mini-array consisted of a central 3-component seismometer and 3 vertical seismometers at a distance of about 100 m from the central seismometer. The mini-arrays covered approximately 80% of the surrogate inspection area (IA) on the territory of the former Semipalatinsk test site. Most of the stations were installed during the first four days of field operations by the seismic sub-team, which consisted of 10 seismologists. The SAMS data center comprised 2 IBM Blade centers and 8 working places for data archiving, detection list production and event analysis. A prototype of the SAMS software was tested. The average daily amount of collected raw data was 15-30 GB and increased with the number of stations entering operation. Routine manual data screening and data analyses were performed by 2-6 sub-team members. Automatic screening was used for selected time intervals. Screening was performed using the Sonoview program in the frequency domain and using the Geotool and Hypolines programs in the time domain. The screening results were merged into a master event list, which served as the basis for detailed analysis of unclear events and events identified as potentially in the IA. Detailed analysis of events potentially in the IA was performed with the Hypoline and Geotool programs. In addition, the Hyposimplex and Hypocenter programs were used to localize events. The results of the analysis were integrated visually using the Seistrain/geosearch program. Data were fully screened for the period 5-13 September 2008, and 360 teleseismic, regional and local events were identified. Results of the detection and analysis will be presented, and consequences for further SAMS development will be discussed.
Healy, Donagh A; Kimura, Shiori; Power, David; Elhaj, Abubaker; Abdeldaim, Yasser; Cross, Keith S; McGreal, Gerard T; Burke, Paul E; Moloney, Tony; Manning, Brian J; Kavanagh, Eamon G
2018-06-09
A systematic review and meta-analysis was performed to determine the incidence of thrombotic events following great saphenous vein (GSV) endovenous thermal ablation (EVTA). MEDLINE, Embase and conference abstracts were searched. Eligible studies were randomised controlled trials and case series that included at least 100 patients who underwent GSV EVTA (endovenous laser ablation [EVLA] or radiofrequency ablation [RFA]) with duplex ultrasound (DUS) within 30 days. The systematic review focused on the complications of endovenous heat-induced thrombosis (EHIT), deep venous thrombosis (DVT), and pulmonary embolism (PE). The primary outcome for the meta-analysis was deep venous thrombotic events, defined as DVT or EHIT Type 2, 3, or 4. Secondary outcomes for the meta-analysis were EHIT Type 2, 3, or 4, DVT, and PE. Subgroup analyses were performed for both the RFA and EVLA groups. Pooled proportions were calculated using random effects modelling. Fifty-two studies (16,398 patients) were included. Thrombotic complications occurred infrequently. Deep venous thrombotic events occurred in 1.7% of cases (95% CI 0.9-2.7%) (25 studies; 10,012 patients; 274 events). EHIT Type 2, 3, or 4 occurred in 1.4% of cases (95% CI 0.8-2.3%) (26 studies; 10,225 patients; 249 events). DVT occurred in 0.3% of cases (95% CI 0.2-0.5%) (49 studies; 15,676 patients; 48 events). PE occurred in 0.1% of cases (95% CI 0.1-0.2%) (29 studies; 8223 patients; 3 events). Similar results were found when the RFA and EVLA groups were analysed separately. Thrombotic events occur infrequently following GSV EVTA. Given the large numbers of procedures worldwide and the potential for serious consequences, further research is needed on the burden of these complications and their management. Copyright © 2018 European Society for Vascular Surgery. Published by Elsevier B.V. All rights reserved.
Journot, Valérie; Tabuteau, Sophie; Collin, Fidéline; Molina, Jean-Michel; Chene, Geneviève; Rancinan, Corinne
2008-03-01
Since 2003, the Medical Dictionary for Regulatory Activities (MedDRA) has been the regulatory standard for safety reporting in clinical trials in the European Community. Yet, we found no published example of a practical experience of a scientifically oriented statistical analysis of events coded with MedDRA. We took advantage of a randomized trial in HIV-infected patients with MedDRA-coded events to explain the difficulties encountered during the analysis of events and the strategy developed to report events consistently with trial-specific objectives. MedDRA has a rich hierarchical structure, which allows the grouping of coded terms into 5 levels, the highest being "System Organ Class" (SOC). Each coded term may be related to several SOCs, among which one primary SOC is defined. We developed a new general 5-step strategy to select a SOC as the trial primary SOC, consistently with the trial-specific objectives for this analysis. We applied it to the ANRS 099 ALIZE trial, where all events were coded with MedDRA version 3.0. We compared the MedDRA and the ALIZE primary SOCs. In the ANRS 099 ALIZE trial, 355 patients were recruited, and 3,722 events were reported and documented, among which 35% had multiple SOCs (2 to 4). We applied the proposed 5-step strategy. Altogether, 23% of MedDRA primary SOCs were modified, mainly from the MedDRA primary SOCs "Investigations" (69%) and "Ear and labyrinth disorders" (6%), to the ALIZE primary SOCs "Hepatobiliary disorders" (35%), "Musculoskeletal and connective tissue disorders" (21%), and "Gastrointestinal disorders" (15%). MedDRA has grown considerably in size and complexity with versioning and the development of Standardized MedDRA Queries. Yet, statisticians should not systematically rely on the primary SOCs proposed by MedDRA to report events. A simple general 5-step strategy to re-classify events consistently with trial-specific objectives might be useful in HIV trials as well as in other fields.
Estimating Rupture Directivity of Aftershocks of the 2014 Mw8.1 Iquique Earthquake, Northern Chile
NASA Astrophysics Data System (ADS)
Folesky, Jonas; Kummerow, Jörn; Timann, Frederik; Shapiro, Serge
2017-04-01
The 2014 Mw8.1 Iquique earthquake was accompanied by numerous fore- and aftershocks of magnitudes up to M ~ 7.6. While the rupture processes of the main event and its largest aftershock have already been analysed in great detail, this study focuses on the rupture processes of about 230 smaller aftershocks that occurred during the first two days after the main event. Since the events are of magnitudes 4.0 ≤ M ≤ 6.5, it is not trivial which method is most suitable. We therefore apply and compare three different approaches, attempting to extract a possible rupture directivity for each single event. The seismic broadband recordings of the Integrated Plate Boundary Observatory Chile (IPOC) provide an excellent database for our analysis. Their high sampling rate (100 Hz) and a well-distributed station selection that covers an aperture of about 180° are a great advantage for a thorough directivity analysis. First, we apply a P-wave polarization analysis (PPA), in which we reconstruct the direction of the incoming wavefield by covariance analysis of the first particle motions. Combined with a sliding time window, the results from different stations can identify first the hypocentre of the events and then a migration of the rupture front, if the event is of unilateral character. A second approach is the back projection imaging (BPI) technique, which illuminates the rupture path by back-projecting the recorded seismic energy to its source. A propagating rupture front would be reconstructed from the migration of the zone of high constructive amplitude stacks. In a third step we apply the empirical Green's function (EGF) method, where events of high waveform similarity, hence co-located and of similar mechanisms, are selected in order to use the smaller event as the Green's function of the larger event. This approach results in an estimated source time function, which is compared station by station and whose azimuthal variations are analysed for complexities and directivity.
NASA Astrophysics Data System (ADS)
Folesky, Jonas; Kummerow, Jörn
2015-04-01
The Empirical Green's Function (EGF) method uses pairs of events of high waveform similarity and adjacent hypocenters to decompose the influences of source time function, ray path, instrument site, and instrument response. The seismogram of the smaller event is considered as the Green's function, which can then be deconvolved from the other seismogram. The result provides a reconstructed relative source time function (RSTF) of the larger event of that event pair. The comparison of the RSTFs at all stations of the observation system yields information on the rupture process of the event, based on an apparent directivity effect and possible changes in RSTF complexity. The Basel EGS dataset of 2006-2007 consists of about 2800 localized events of magnitudes between 0.0 < ML < 3.5, with event pairs of adequate magnitude difference for EGF analysis. The data have sufficient quality to analyse events with magnitudes down to ML = 0.0 for an apparent directivity effect, although the approximate rupture duration for those events is only a few milliseconds. The dataset contains a number of multiplets for which fault plane solutions are known from earlier studies. Using the EGF method we compute rupture orientations for about 190 event pairs and compare them to the fault plane solutions of the multiplets. For the majority of events we observe good consistency between the inferred rupture direction and one of the previously determined nodal planes from the fault plane solutions. In combination this resolves the fault plane ambiguity. Furthermore, the rupture direction fitting yields estimates of the projection of the rupture velocity onto the horizontal plane; these appear to vary between the multiplets in the reservoir from 0.3 to 0.7 times the S-wave velocity. To our knowledge, source characterization by EGF analysis has not yet been introduced to microseismic reservoirs with the data quality found in Basel. Our results show that EGF analysis can provide valuable additional insights into the distribution of rupture properties within a reservoir.
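The core of the EGF method is a deconvolution of the smaller event's seismogram from the larger one's. A minimal sketch using water-level spectral division, assuming aligned, equally sampled traces; real applications add band-limiting and positivity constraints on the RSTF:

```python
# Minimal EGF deconvolution sketch via water-level spectral division;
# inputs are synthetic, and the regularization is deliberately simple.
import numpy as np

def egf_deconvolve(large, small, water_level=0.01):
    """Return the relative source time function of the larger event."""
    n = len(large)
    L, S = np.fft.rfft(large, n), np.fft.rfft(small, n)
    power = np.abs(S) ** 2
    floor = water_level * power.max()          # regularize spectral holes
    return np.fft.irfft(L * np.conj(S) / np.maximum(power, floor), n)

rng = np.random.default_rng(1)
small = rng.standard_normal(512)               # proxy Green's function
stf = np.zeros(512); stf[:20] = np.hanning(20) # 20-sample source pulse
large = np.convolve(small, stf)[:512]
rstf = egf_deconvolve(large, small)
print(np.argmax(rstf))                         # pulse recovered near samples 0-10
```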
NASA Technical Reports Server (NTRS)
English, Thomas
2005-01-01
A standard tool of reliability analysis used at NASA-JSC is the event tree. An event tree is simply a probability tree, with the probabilities determining the next step through the tree specified at each node. The nodal probabilities are determined by a reliability study of the physical system at work for a particular node. The reliability study performed at a node is typically referred to as a fault tree analysis, with the potential of a fault tree existing for each node on the event tree. When examining an event tree it is obvious why the event tree/fault tree approach has been adopted. Typical event trees are quite complex in nature, and the event tree/fault tree approach provides a systematic and organized approach to reliability analysis. The purpose of this study was twofold. First, we wanted to explore the possibility that a semi-Markov process can create dependencies between sojourn times (the times it takes to transition from one state to the next) that can decrease the uncertainty when estimating times to failure. Using a generalized semi-Markov model, we studied a four-element reliability model and were able to demonstrate such sojourn time dependencies. Second, we wanted to study the use of semi-Markov processes to introduce a time variable into the event tree diagrams that are commonly developed in PRA (Probabilistic Risk Assessment) analyses. Event tree end states which change with time are more representative of failure scenarios than are the usual static probability-derived end states.
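A semi-Markov event tree of this kind can be simulated directly: each node has transition probabilities and a state-dependent sojourn-time distribution. A minimal sketch with a hypothetical four-state model and Weibull sojourn times (not the study's actual model):

```python
# Minimal semi-Markov reliability sketch: states 0-2 are operating modes,
# state 3 is absorbing failure. All numbers are hypothetical.
import numpy as np

P = np.array([[0.0, 0.7, 0.2, 0.1],    # transition probabilities between states
              [0.1, 0.0, 0.6, 0.3],
              [0.0, 0.2, 0.0, 0.8],
              [0.0, 0.0, 0.0, 1.0]])   # state 3 = absorbing failure
SHAPE = [1.5, 1.2, 0.9, 1.0]           # Weibull shape per state (hypothetical)
SCALE = [100.0, 50.0, 20.0, 1.0]       # Weibull scale per state (hours)

def time_to_failure(rng):
    state, t = 0, 0.0
    while state != 3:
        t += SCALE[state] * rng.weibull(SHAPE[state])  # sojourn in current state
        state = rng.choice(4, p=P[state])              # next node of the event tree
    return t

rng = np.random.default_rng(42)
ttf = np.array([time_to_failure(rng) for _ in range(10_000)])
print(f"mean TTF = {ttf.mean():.1f} h, 5th pct = {np.percentile(ttf, 5):.1f} h")
```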
Preterm Versus Term Children: Analysis of Sedation/Anesthesia Adverse Events and Longitudinal Risk.
Havidich, Jeana E; Beach, Michael; Dierdorf, Stephen F; Onega, Tracy; Suresh, Gautham; Cravero, Joseph P
2016-03-01
Preterm and former preterm children frequently require sedation/anesthesia for diagnostic and therapeutic procedures. Our objective was to determine the age at which children who are born <37 weeks gestational age are no longer at increased risk for sedation/anesthesia adverse events. Our secondary objective was to describe the nature and incidence of adverse events. This is a prospective observational study of children receiving sedation/anesthesia for diagnostic and/or therapeutic procedures outside of the operating room by the Pediatric Sedation Research Consortium. A total of 57,227 patients 0 to 22 years of age were eligible for this study. All adverse events and descriptive terms were predefined. Logistic regression and locally weighted scatterplot regression were used for analysis. Preterm and former preterm children had higher adverse event rates (14.7% vs 8.5%) compared with children born at term. Our analysis revealed a biphasic pattern for the development of adverse sedation/anesthesia events. Airway and respiratory adverse events were most commonly reported. MRI scans were the most commonly performed procedures in both categories of patients. Patients born preterm are nearly twice as likely to develop sedation/anesthesia adverse events, and this risk continues up to 23 years of age. We recommend obtaining birth history during the formulation of an anesthetic/sedation plan, with heightened awareness that preterm and former preterm children may be at increased risk. Further prospective studies focusing on the etiology and prevention of adverse events in former preterm patients are warranted. Copyright © 2016 by the American Academy of Pediatrics.
Guyot, Patricia; Ades, A E; Ouwens, Mario J N M; Welton, Nicky J
2012-02-01
The results of Randomized Controlled Trials (RCTs) on time-to-event outcomes that are usually reported are the median time to event and the Cox hazard ratio. These do not constitute the sufficient statistics required for meta-analysis or cost-effectiveness analysis, and their use in secondary analyses requires strong assumptions that may not have been adequately tested. In order to enhance the quality of secondary data analyses, we propose a method which derives from the published Kaplan-Meier survival curves a close approximation to the original individual patient time-to-event data from which they were generated. We develop an algorithm that maps from digitised curves back to KM data by finding numerical solutions to the inverted KM equations, using, where available, information on the number of events and numbers at risk. The reproducibility and accuracy of survival probabilities, median survival times and hazard ratios based on reconstructed KM data were assessed by comparing published statistics (survival probabilities, medians and hazard ratios) with statistics based on repeated reconstructions by multiple observers. The validation exercise established that there was no material systematic error and that there was a high degree of reproducibility for all statistics. Accuracy was excellent for survival probabilities and medians; for hazard ratios, reasonable accuracy can only be obtained if at least numbers at risk or the total number of events are reported. The algorithm is a reliable tool for meta-analysis and cost-effectiveness analyses of RCTs reporting time-to-event data. It is recommended that all RCTs report information on numbers at risk and total number of events alongside KM curves.
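The inversion at the heart of the algorithm follows from the KM product-limit formula S_i = S_{i-1}(1 - d_i/n_i). A deliberately simplified sketch, assuming digitized step coordinates and ignoring censoring, which the published algorithm handles via the reported numbers at risk:

```python
# Highly simplified sketch of KM curve inversion; the published algorithm
# additionally distributes censoring between reported numbers at risk.
def reconstruct_events(times, surv, n_at_risk0):
    """Return (time, n_events) pairs from a digitized KM step function."""
    events, n, s_prev = [], n_at_risk0, 1.0
    for t, s in zip(times, surv):
        d = round(n * (1.0 - s / s_prev))   # invert S_i = S_{i-1}*(1 - d_i/n_i)
        if d > 0:
            events.append((t, d))
        n -= d                              # no censoring in this toy version
        s_prev = s
    return events

# Digitized curve for 100 patients (hypothetical numbers).
times = [3.0, 7.5, 12.0]
surv = [0.95, 0.90, 0.80]
print(reconstruct_events(times, surv, 100))  # [(3.0, 5), (7.5, 5), (12.0, 10)]
```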
Sources of Infrasound events listed in IDC Reviewed Event Bulletin
NASA Astrophysics Data System (ADS)
Bittner, Paulina; Polich, Paul; Gore, Jane; Ali, Sherif; Medinskaya, Tatiana; Mialle, Pierrick
2017-04-01
Until 2003 two waveform technologies, i.e. seismic and hydroacoustic, were used to detect and locate events included in the International Data Centre (IDC) Reviewed Event Bulletin (REB). The first atmospheric event was published in the REB in 2003; however, automatic processing required significant improvements to reduce the number of false events. At the beginning of 2010 the infrasound technology was reintroduced to IDC operations and has since contributed to both the automatic and reviewed IDC bulletins. The primary contribution of infrasound technology is to detect atmospheric events. These events may also be observed at seismic stations, which significantly improves event location. Example sources of REB events detected by the International Monitoring System (IMS) infrasound network were fireballs (e.g. the Bangkok fireball, 2015), volcanic eruptions (e.g. Calbuco, Chile, 2015) and large surface explosions (e.g. Tianjin, China, 2015). Quarry blasts (e.g. Zheleznogorsk) and large earthquakes (e.g. Italy, 2016) belong to events primarily recorded at seismic stations of the IMS network but often detected at the infrasound stations. In the case of earthquakes, analysis of infrasound signals may help to estimate the area affected by ground vibration. Infrasound associations to quarry blast events may help to obtain better source locations. The role of IDC analysts is to verify and improve the location of events detected by the automatic system and to add events which were missed in the automatic process. Open source materials may help to identify the nature of some events. Well recorded examples may be added to the Reference Infrasound Event Database to aid the analysis process. This presentation will provide examples of events generated by different sources which were included in the IDC bulletins.
Analysis of magnetometer data/wave signals in the Earth's magnetosphere
NASA Technical Reports Server (NTRS)
Engebretson, Mark J.
1993-01-01
Work on the reduction and analysis of Dynamics Explorer (DE) satellite magnetometer data with special emphasis on the ULF fluctuations and waves evident in such data is described. Research focused on the following: (1) studies of Pc 1 wave packets near the plasmapause; (2) satellite-ground pulsation study; (3) support for studies of ion energization processes; (4) search for Pc 1 wave events in 1981 DE 1 data; (5) study of Pc 3-5 events observed simultaneously by DE 1 and by AMPTE CCE; (6) support for studies of electromagnetic transients on DE 1; and (7) analysis of wave events induced by sudden impulses.
Video Traffic Analysis for Abnormal Event Detection
DOT National Transportation Integrated Search
2010-01-01
We propose the use of video imaging sensors for the detection and classification of abnormal events to be used primarily for mitigation of traffic congestion. Successful detection of such events will allow for new road guidelines; for rapid deploymen...
Fine-Scale Event Location and Error Analysis in NET-VISA
NASA Astrophysics Data System (ADS)
Arora, N. S.; Russell, S.
2016-12-01
NET-VISA is a generative probabilistic model for the occurrence of seismic, hydroacoustic, and atmospheric events, and for the propagation of energy from these events through various media and phases before being detected, or misdetected, by IMS stations. It is built on top of the basic station and arrival detection processing at the IDC, and is currently being tested in the IDC network processing pipelines. A key distinguishing feature of NET-VISA is that it is easy to incorporate prior scientific knowledge and historical data into the probabilistic model. The model accounts for both detections and misdetections when forming events, and this allows it to make more accurate event hypotheses. It has been continuously evaluated since 2012, and in each year it has achieved a roughly 60% reduction in the number of missed events without increasing the false event rate as compared to the existing GA algorithm. More importantly, the model finds large numbers of events that have been confirmed by regional seismic bulletins but missed by the IDC analysts using the same data. In this work we focus on enhancements to the model to improve the location accuracy and error ellipses. We will present a new version of the model that focuses on the fine scale around the event location, and present error ellipses and analysis of recent important events.
Network hydraulics inclusion in water quality event detection using multiple sensor stations data.
Oliker, Nurit; Ostfeld, Avi
2015-09-01
Event detection is one of the most challenging current topics in water distribution systems analysis: how regular on-line hydraulic (e.g., pressure, flow) and water quality (e.g., pH, residual chlorine, turbidity) measurements at different network locations can be efficiently utilized to detect water quality contamination events. This study describes an integrated event detection model which combines data from multiple sensor stations with network hydraulics. To date, event detection modelling has largely been limited to a single sensor station location and dataset. Single sensor station models are detached from network hydraulics insights and as a result might be significantly exposed to false positive alarms. This work aims to reduce this limitation by integrating local and spatial hydraulic data understanding into an event detection model. The spatial analysis complements the local event detection effort by discovering events with lower signatures through exploration of the sensors' mutual hydraulic influences. The unique contribution of this study is in incorporating hydraulic simulation information into the overall event detection process of spatially distributed sensors. The methodology is demonstrated on two example applications using base runs and sensitivity analyses. Results show a clear advantage of the suggested model over single-sensor event detection schemes. Copyright © 2015 Elsevier Ltd. All rights reserved.
AOP: An R Package For Sufficient Causal Analysis in Pathway ...
Summary: How can I quickly find the key events in a pathway that I need to monitor to predict that a beneficial/adverse event/outcome will occur? This is a key question when using signaling pathways for drug/chemical screening in pharmacology, toxicology and risk assessment. By identifying these sufficient causal key events, we have fewer events to monitor for a pathway, thereby decreasing assay costs and time, while maximizing the value of the information. I have developed the "aop" package, which uses backdoor analysis of causal networks to identify these minimal sets of key events that are sufficient for making causal predictions. Availability and Implementation: The source and binary are available online through the Bioconductor project (http://www.bioconductor.org/) as an R package titled "aop". The R/Bioconductor package runs within the R statistical environment. The package has functions that can take pathways (as directed graphs) formatted as a Cytoscape JSON file as input, or pathways can be represented as directed graphs using the R/Bioconductor "graph" package. The "aop" package has functions that can perform backdoor analysis to identify the minimal set of key events for making causal predictions. Contact: burgoon.lyle@epa.gov This paper describes an R/Bioconductor package that was developed to facilitate the identification of key events within an AOP that are the minimal set of sufficient key events that need to be tested/monitored.
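The backdoor-style search over a directed AOP graph can be approximated with a standard minimum vertex cut. A minimal sketch in Python rather than R (the package itself is R/Bioconductor), with hypothetical node names:

```python
# Minimal sketch of finding a smallest set of key events separating a
# molecular initiating event (MIE) from an adverse outcome (AO); networkx's
# minimum_node_cut plays the role of the package's backdoor-style search.
import networkx as nx

aop = nx.DiGraph()
aop.add_edges_from([
    ("MIE", "KE1"), ("MIE", "KE2"),
    ("KE1", "KE3"), ("KE2", "KE3"),
    ("KE3", "AO"),
])

# Smallest set of intermediate key events whose monitoring suffices to
# predict the adverse outcome: every MIE -> AO path must pass through it.
cut = nx.minimum_node_cut(aop, "MIE", "AO")
print(cut)  # {'KE3'}
```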
HIGH-PRECISION BIOLOGICAL EVENT EXTRACTION: EFFECTS OF SYSTEM AND OF DATA
Cohen, K. Bretonnel; Verspoor, Karin; Johnson, Helen L.; Roeder, Chris; Ogren, Philip V.; Baumgartner, William A.; White, Elizabeth; Tipney, Hannah; Hunter, Lawrence
2013-01-01
We approached the problems of event detection, argument identification, and negation and speculation detection in the BioNLP'09 information extraction challenge through concept recognition and analysis. Our methodology involved using the OpenDMAP semantic parser with manually written rules. The original OpenDMAP system was updated for this challenge with a broad ontology defined for the events of interest, new linguistic patterns for those events, and specialized coordination handling. We achieved state-of-the-art precision for two of the three tasks, scoring the highest of 24 teams at a precision of 71.81 on Task 1 and the highest of 6 teams at a precision of 70.97 on Task 2. We provide a detailed analysis of the training data and show that a number of trigger words were ambiguous as to event type, even when their arguments were constrained by semantic class. The data are also shown to have a number of missing annotations. Analysis of a sampling of the comparatively small number of false positives returned by our system shows that the major causes of this type of error were failure to recognize second themes in two-theme events, failure to recognize events when they were the arguments to other events, failure to recognize nontheme arguments, and sentence segmentation errors. We show that specifically handling coordination had a small but important impact on the overall performance of the system. The OpenDMAP system and the rule set are available at http://bionlp.sourceforge.net. PMID:25937701
Victim countries of transnational terrorism: an empirical characteristics analysis.
Elbakidze, Levan; Jin, Yanhong
2012-12-01
This study empirically investigates the association between country-level socioeconomic characteristics and risk of being victimized in transnational terrorism events. We find that a country's annual financial contribution to the U.N. general operating budget has a positive association with the frequency of being victimized in transnational terrorism events. In addition, per capita GDP, political freedom, and openness to trade are nonlinearly related to the frequency of being victimized in transnational terrorism events. © 2012 Society for Risk Analysis.
[Design of medical devices management system supporting full life-cycle process management].
Su, Peng; Zhong, Jianping
2014-03-01
Based on an analysis of the present status of medical device management, this paper optimizes the management process and describes a Web-based medical device management system. Information technology is used to dynamically track the state of use across the entire life cycle of medical devices. Through closed-loop management with pre-event budgeting, mid-event control, and after-event analysis, the system improves the precision of medical device management, optimizes asset allocation, and promotes effective operation of the devices.
Prins, Theo W; Scholtens, Ingrid M J; Bak, Arno W; van Dijk, Jeroen P; Voorhuijzen, Marleen M; Laurensse, Emile J; Kok, Esther J
2016-12-15
During routine monitoring for GMOs in food in the Netherlands, papaya-containing food supplements were found positive for the genetically modified (GM) elements P-35S and T-nos. The goal of this study was to identify the unknown and EU-unauthorised GM papaya event(s). A screening strategy was applied using additional GM screening elements, including a newly developed PRSV coat protein PCR. The detected PRSV coat protein PCR product was sequenced, and the nucleotide sequence showed identity to PRSV YK strains indigenous to China and Taiwan. The GM events 16-0-1 and 18-2-4 could be identified by amplifying and sequencing event-specific sequences. Further analyses showed that both papaya event 16-0-1 and event 18-2-4 were transformed with the same construct. For use in routine analysis, TaqMan qPCR methods for events 16-0-1 and 18-2-4 were derived. Event 16-0-1 was detected in all samples tested, whereas event 18-2-4 was detected in one sample. This study presents a strategy for combining information from different sources (literature, patent databases) and novel sequence data to identify unknown GM papaya events. Copyright © 2016 Elsevier Ltd. All rights reserved.
Standardized Analytical Methods for Environmental Restoration Following Homeland Security Events
USDA-ARS?s Scientific Manuscript database
Methodology was formulated for use in the event of a terrorist attack using a variety of chemical, radioactive, biological, and toxic agents. Standardized analysis procedures were determined for use should these events occur. This publication is annually updated....
Streamflow characterization using functional data analysis of the Potomac River
NASA Astrophysics Data System (ADS)
Zelmanow, A.; Maslova, I.; Ticlavilca, A. M.; McKee, M.
2013-12-01
Flooding and droughts are extreme hydrological events that affect the United States economically and socially. The severity and unpredictability of flooding have caused billions of dollars in damage and the loss of lives in the eastern United States. In this context, there is an urgent need to build a firm scientific basis for adaptation by developing and applying new modeling techniques for accurate streamflow characterization and reliable hydrological forecasting. The goal of this analysis is to use numerical streamflow characteristics to classify, model, and estimate the likelihood of extreme events in the eastern United States, mainly on the Potomac River. Functional data analysis techniques are used to study yearly streamflow patterns, with the extreme streamflow events characterized via functional principal component analysis. These methods are merged with more classical techniques such as cluster analysis, classification analysis, and time series modeling. The developed functional data analysis approach is used to model continuous streamflow hydrographs. The forecasting potential of this technique is explored by incorporating climate factors to produce a yearly streamflow outlook.
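Functional PCA of annual hydrographs can be sketched with a plain SVD once each year is resampled to a common grid. A minimal illustration, assuming a (n_years, 365) array and skipping the basis smoothing that real FDA work would apply:

```python
# Minimal functional-PCA sketch on synthetic annual hydrographs; extreme
# years stand out as outlying scores on the leading components.
import numpy as np

def functional_pca(hydrographs, n_components=2):
    mean_curve = hydrographs.mean(axis=0)
    centered = hydrographs - mean_curve
    # Rows of Vt are the principal "modes" of yearly streamflow variation;
    # U * S gives the per-year scores on those modes.
    U, S, Vt = np.linalg.svd(centered, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    return mean_curve, Vt[:n_components], scores

rng = np.random.default_rng(7)
days = np.arange(365)
base = 50 + 30 * np.exp(-((days - 90) ** 2) / 800.0)       # spring peak
years = base + rng.standard_normal((40, 365)) * 5
years[3] += 80 * np.exp(-((days - 100) ** 2) / 200.0)      # one flood year
_, modes, scores = functional_pca(years)
print(np.argmax(np.abs(scores[:, 0])))                     # flags year 3
```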
Fusing Symbolic and Numerical Diagnostic Computations
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method, the other implementing a symbolic analysis method into a unified event-based decision analysis software system for realtime detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAMs), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.
Parameterization of synoptic weather systems in the South Atlantic Bight for modeling applications
NASA Astrophysics Data System (ADS)
Wu, Xiaodong; Voulgaris, George; Kumar, Nirnimesh
2017-10-01
An event-based, long-term climatological analysis is presented that allows the creation of atmospheric forcing on the coastal ocean that preserves both frequency of occurrence and event time history. An algorithm is developed that identifies individual storm events (cold fronts, warm fronts, and tropical storms) from meteorological records. The algorithm has been applied to a location along the South Atlantic Bight, off South Carolina, an area prone to cyclogenesis and passages of atmospheric fronts. Comparison against daily weather maps confirms that the algorithm is efficient in identifying cold fronts and warm fronts, while the identification of tropical storms is less successful. The average state of the storm events and their variability are represented by the temporal evolution of atmospheric pressure, air temperature, wind velocity, and wave directional spectral energy. The use of uncorrected algorithm-detected events provides climatologies that show little deviation from those derived using corrected events. The effectiveness of this analysis method is further verified by numerically simulating the wave conditions driven by the characteristic wind forcing and comparing the results with the wave climatology that corresponds to each storm type. The high level of consistency found in the comparison indicates that this analysis method can be used for accurately characterizing event-based oceanic processes and long-term storm-induced morphodynamic processes on wind-dominated coasts.
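The event-identification step can be illustrated with simple threshold rules on pressure drop and wind shift. A minimal sketch, assuming an hourly pandas record; the thresholds are illustrative, not the paper's criteria:

```python
# Minimal storm-event flagging sketch on an hourly meteorological record;
# thresholds (4 hPa drop, 45 deg wind shift over 24 h) are hypothetical.
import numpy as np
import pandas as pd

def detect_events(df, drop_hpa=4.0, window="24h", shift_deg=45.0):
    p_drop = df["pressure"].rolling(window).max() - df["pressure"]
    # Unwrap wind direction so a 350 -> 10 deg change counts as 20 deg.
    wd = np.unwrap(np.radians(df["wind_dir"].to_numpy()))
    wd_shift = np.degrees(pd.Series(wd, index=df.index).diff(24).abs())
    return df.index[(p_drop > drop_hpa) & (wd_shift > shift_deg)]

rng = np.random.default_rng(3)
idx = pd.date_range("2015-01-01", periods=24 * 30, freq="h")
df = pd.DataFrame({
    "pressure": 1015 + rng.standard_normal(len(idx)),
    "wind_dir": (180 + 5 * rng.standard_normal(len(idx))) % 360,
}, index=idx)
df.loc["2015-01-10":"2015-01-11", "pressure"] -= 10   # synthetic frontal passage
df.loc["2015-01-10":"2015-01-11", "wind_dir"] = 300
print(detect_events(df)[:3])
```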
Held, Jürgen; Manser, Tanja
2005-02-01
This article outlines how a Palm- or Newton-based PDA (personal digital assistant) system for online event recording was used to record and analyze concurrent events. We describe the features of this PDA-based system, called the FIT-System (flexible interface technique), and its application to the analysis of concurrent events in complex behavioral processes--in this case, anesthesia work processes. The patented FIT-System has a unique user interface design allowing the user to design an interface template with a pencil and paper or using a transparency film. The template usually consists of a drawing or sketch that includes icons or symbols that depict the observer's representation of the situation to be observed. In this study, the FIT-System allowed us to create a design for fast, intuitive online recording of concurrent events using a set of 41 observation codes. An analysis of concurrent events leads to a description of action density, and our results revealed a characteristic distribution of action density during the administration of anesthesia in the operating room. This distribution indicated the central role of the overlapping operations in the action sequences of medical professionals as they deal with the varying requirements of this complex task. We believe that the FIT-System for online recording of concurrent events in complex behavioral processes has the potential to be useful across a broad spectrum of research areas.
Analyzing Verbal and Nonverbal Classroom Communications.
ERIC Educational Resources Information Center
Heger, Herbert K.
The Miniaturized Interaction Analysis System (Mini-TIA) was developed to permit improved analysis of classroom communication in conjunction with video taping. Each of seven verbal event categories is subdivided into two categories according to the nature of the nonverbal events paralleling them. Integrated into the system are (1) defined verbal…
Effects of Interventions on Relapse to Narcotics Addiction: An Event-History Analysis.
ERIC Educational Resources Information Center
Hser, Yih-Ing; And Others
1995-01-01
Event-history analysis was applied to the life history data of 581 male narcotics addicts to specify the concurrent, postintervention, and durational effects of social interventions on relapse to narcotics use. Results indicate the advisability of supporting methadone maintenance with other prevention strategies. (SLD)
Analysis of electrical penetration graph data: what to do with artificially terminated events?
USDA-ARS?s Scientific Manuscript database
Observing the durations of hemipteran feeding behaviors via Electrical Penetration Graph (EPG) results in situations where the duration of the last behavior is not ended by the insect under observation, but by the experimenter. These are artificially terminated events. In data analysis, one must ch...
Assessing and quantifying changes in precipitation patterns using event-driven analysis
USDA-ARS?s Scientific Manuscript database
Studies have claimed that climate change may adversely affect precipitation patterns by increasing the occurrence of extreme events. The effects of climate change on precipitation are expected to take place over a long period of time and will require long-term data to demonstrate. Frequency analysis ...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-15
... NUCLEAR REGULATORY COMMISSION [NRC-2011-0254] Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft Report for Comment; Correction AGENCY: Nuclear Regulatory Commission. ACTION: Draft NUREG; request for comment; correction. SUMMARY: This document corrects a notice appearing...
The natural mathematics of behavior analysis.
Li, Don; Hautus, Michael J; Elliffe, Douglas
2018-04-19
Models that generate event records have very general scope regarding the dimensions of the target behavior that we measure. From a set of predicted event records, we can generate predictions for any dependent variable that we could compute from the event records of our subjects. In this sense, models that generate event records permit us a freely multivariate analysis. To explore this proposition, we conducted a multivariate examination of Catania's Operant Reserve on single VI schedules in transition using a Markov Chain Monte Carlo scheme for Approximate Bayesian Computation. Although we found systematic deviations between our implementation of Catania's Operant Reserve and our observed data (e.g., mismatches in the shape of the interresponse time distributions), the general approach that we have demonstrated represents an avenue for modelling behavior that transcends the typical constraints of algebraic models. © 2018 Society for the Experimental Analysis of Behavior.
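The ABC idea can be shown with a rejection sampler on a toy response-rate model standing in for Catania's Operant Reserve (the paper itself uses a Markov Chain Monte Carlo ABC scheme, which this simplifies):

```python
# Minimal ABC rejection sketch: simulate event records from a toy Poisson
# response model and keep parameter draws whose summary statistics match
# the observed record. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def simulate(rate, n_seconds=600):
    """Generate an event record: response times from a Poisson process."""
    times = np.cumsum(rng.exponential(1.0 / rate, size=int(3 * rate * n_seconds)))
    return times[times < n_seconds]

observed = simulate(0.5)                       # "subject" data, rate unknown
obs_stats = np.array([len(observed), np.mean(np.diff(observed))])

accepted = []
for _ in range(20_000):
    theta = rng.uniform(0.05, 2.0)             # prior over response rate
    sim = simulate(theta)
    sim_stats = np.array([len(sim), np.mean(np.diff(sim))])
    if np.abs((sim_stats - obs_stats) / obs_stats).max() < 0.05:
        accepted.append(theta)                 # keep draws matching the data

print(f"posterior mean rate ~ {np.mean(accepted):.2f} (true 0.5)")
```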
Analysis of Hospital Processes with Process Mining Techniques.
Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises
2015-01-01
Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application for making clinical and administrative decisions for the management of hospital activities.
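The most basic event-log analysis that such a component enables is trace-variant counting. A minimal pure-Python sketch, assuming hypothetical (case_id, activity, timestamp) log entries:

```python
# Minimal process-mining sketch: count trace variants, i.e. the distinct
# orderings of activities per patient case, in a toy HIS event log.
from collections import Counter, defaultdict

log = [  # (case_id, activity, timestamp) -- hypothetical entries
    (1, "admit", 1), (1, "triage", 2), (1, "treat", 3), (1, "discharge", 4),
    (2, "admit", 1), (2, "treat", 2), (2, "triage", 3), (2, "discharge", 4),
    (3, "admit", 1), (3, "triage", 2), (3, "treat", 3), (3, "discharge", 4),
]

traces = defaultdict(list)
for case, activity, ts in sorted(log, key=lambda e: (e[0], e[2])):
    traces[case].append(activity)

variants = Counter(tuple(t) for t in traces.values())
for variant, count in variants.most_common():
    print(count, " -> ".join(variant))
# 2 admit -> triage -> treat -> discharge   (expected flow)
# 1 admit -> treat -> triage -> discharge   (deviation worth investigating)
```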
NASA Astrophysics Data System (ADS)
Li, Jiaqiang; Choutko, Vitaly; Xiao, Liyi
2018-03-01
Based on the collection of error data from the Alpha Magnetic Spectrometer (AMS) Digital Signal Processors (DSP), on-orbit Single Event Upsets (SEUs) of the DSP program memory are analyzed. The daily error distribution and the time intervals between errors are calculated to evaluate the reliability of the system. The particle density distribution of the International Space Station (ISS) orbit is presented, and the effects of the South Atlantic Anomaly (SAA) and the geomagnetic poles are analyzed. The impact of solar events on the DSP program memory is assessed by combining data analysis with Monte Carlo (MC) simulation. From the analysis and simulation results, it is concluded that the area corresponding to the SAA is the main source of errors on the ISS orbit. Solar events can also cause errors in DSP program memory, but the effect depends on the on-orbit particle density.
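The two summary analyses mentioned, daily error counts and inter-error intervals, are straightforward to compute. A minimal sketch with synthetic upset timestamps and a maximum-likelihood exponential fit:

```python
# Minimal SEU bookkeeping sketch: daily error counts and the distribution
# of times between consecutive errors, with an exponential MLE fit.
# Upset timestamps are synthetic (one upset per ~6 h on average).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
t_err = np.cumsum(rng.exponential(6 * 3600, size=400))      # seconds since start

daily_counts = np.bincount((t_err // 86_400).astype(int))   # errors per day
intervals = np.diff(t_err)                                  # time between errors
loc, scale = stats.expon.fit(intervals, floc=0)             # MLE rate = 1/scale

print(f"mean daily errors = {daily_counts.mean():.2f}")
print(f"fitted mean interval = {scale / 3600:.1f} h")
ks = stats.kstest(intervals, "expon", args=(0, scale))
print(f"KS p-value vs exponential = {ks.pvalue:.2f}")       # ~Poisson arrivals
```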
Evaluation of statistical distributions to analyze the pollution of Cd and Pb in urban runoff.
Toranjian, Amin; Marofi, Safar
2017-05-01
Heavy metal pollution in urban runoff causes severe environmental damage. Identification of these pollutants and their statistical analysis is necessary to provide management guidelines. In this study, 45 continuous probability distribution functions were selected to fit the Cd and Pb data in the runoff events of an urban area during October 2014-May 2015. The sampling was conducted at the outlet of the city basin during seven precipitation events. For evaluation and ranking of the functions, we used the Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests. The results of the Cd analysis showed that the Hyperbolic Secant, Wakeby and Log-Pearson 3 distributions are suitable for frequency analysis of the event mean concentration (EMC), the instantaneous concentration series (ICS) and the instantaneous concentration of each event (ICEE), respectively. In addition, the LP3, Wakeby and Generalized Extreme Value functions were chosen for the EMC, ICS and ICEE related to Pb contamination.
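Distribution ranking of this kind can be reproduced with SciPy's fitting and goodness-of-fit machinery. A minimal sketch with hypothetical Cd EMC values and a handful of candidate distributions (the study compared 45):

```python
# Minimal distribution-ranking sketch: ML-fit several candidates and rank
# them by the Kolmogorov-Smirnov statistic. EMC values are hypothetical.
import numpy as np
from scipy import stats

emc = np.array([0.8, 1.1, 0.5, 2.3, 1.7, 0.9, 1.4])   # Cd EMCs (ug/L), toy data

candidates = {
    "lognorm": stats.lognorm,
    "gamma": stats.gamma,
    "pearson3": stats.pearson3,
    "genextreme": stats.genextreme,
}

ranking = []
for name, dist in candidates.items():
    params = dist.fit(emc)                        # maximum-likelihood fit
    ks = stats.kstest(emc, name, args=params)     # goodness of fit
    ranking.append((ks.statistic, name))

for stat, name in sorted(ranking):
    print(f"{name:12s} KS={stat:.3f}")            # smaller = better fit
```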
NASA Astrophysics Data System (ADS)
Djallel Dilmi, Mohamed; Mallet, Cécile; Barthes, Laurent; Chazottes, Aymeric
2017-04-01
The study of rain time series records is mainly carried out using rainfall rate or rain accumulation parameters estimated on a fixed integration time (typically 1 min, 1 hour or 1 day). In this study we used the concept of a rain event. In fact, the discrete and intermittent nature of rain processes makes the definition of some features inadequate when they are defined over a fixed duration. Long integration times (hour, day) mix rainy and clear-air periods in the same sample, while short integration times (seconds, minutes) lead to noisy data that are highly sensitive to detector characteristics. Analysis of whole rain events, instead of individual short samples of fixed duration, helps to clarify relationships between features, in particular between macrophysical and microphysical ones. This approach suppresses the intra-event variability partly due to measurement uncertainties and allows focusing on physical processes. An algorithm based on a Genetic Algorithm (GA) and Self-Organising Maps (SOM) is developed to obtain a parsimonious characterisation of rain events using a minimal set of variables. The use of a self-organizing map (SOM) is justified by the fact that it maps a high-dimensional data space onto a two-dimensional space while preserving the initial space topology as much as possible, in an unsupervised way. The obtained SOM reveals the dependencies between variables and consequently allows redundant variables to be removed, leading to a minimal subset of only five features (the event duration, the rain rate peak, the rain event depth, the event rain rate standard deviation and the absolute rain rate variation of order 0.5). To confirm the relevance of the five selected features, the corresponding SOM is analyzed. This analysis clearly shows the existence of relationships between features. It also shows the independence of the inter-event time (IETp) feature and the weak dependence of the dry percentage in event (Dd%e) feature. This confirms that a rain time series can be considered as an alternation of independent rain events and no-rain periods. The five selected features are used to perform a hierarchical clustering of the events. The well-known division between stratiform and convective events appears clearly. This classification into two classes is then refined into 5 fairly homogeneous subclasses. The data-driven analysis performed on whole rain events instead of fixed-length samples allows identifying strong relationships between macrophysical (based on rain rate) and microphysical (based on raindrops) features. We show that among the 5 identified subclasses some have specific microphysical characteristics. Obtaining information on the microphysical characteristics of rainfall events from rain gauge measurements suggests many implications for the development of quantitative precipitation estimation (QPE) and for the improvement of rain-rate retrieval algorithms in a remote sensing context.
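The final clustering step can be sketched with standard hierarchical clustering on the five features. A minimal example with synthetic stratiform/convective populations; standardization and Ward linkage mirror common practice, not necessarily the paper's exact choices:

```python
# Minimal hierarchical clustering sketch on five rain-event features
# (duration, rain-rate peak, depth, rain-rate std, abs. variation of order
# 0.5); the two synthetic populations stand in for stratiform/convective.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
stratiform = rng.normal([300, 2, 5, 0.5, 0.2], [60, 0.5, 1, 0.1, 0.05], (80, 5))
convective = rng.normal([40, 30, 8, 8.0, 2.0], [10, 6.0, 2, 1.5, 0.4], (40, 5))
X = np.vstack([stratiform, convective])

Xs = (X - X.mean(axis=0)) / X.std(axis=0)       # z-score each feature
Z = linkage(Xs, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")  # 2 classes; refine with t=5
print(np.bincount(labels)[1:])                   # ~[80, 40]
```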
NASA Technical Reports Server (NTRS)
Berg, Melanie; LaBel, Kenneth; Campola, Michael; Xapsos, Michael
2017-01-01
We are investigating the application of classical reliability performance metrics combined with standard single event upset (SEU) analysis data. We expect to relate SEU behavior to system performance requirements. Our proposed methodology will provide better prediction of SEU responses in harsh radiation environments, with confidence metrics. Keywords: single event upset (SEU), single event effect (SEE), field programmable gate array devices (FPGAs)
Risch, Neil; Herrell, Richard; Lehner, Thomas; Liang, Kung-Yee; Eaves, Lindon; Hoh, Josephine; Griem, Andrea; Kovacs, Maria; Ott, Jurg; Merikangas, Kathleen Ries
2009-06-17
Substantial resources are being devoted to identify candidate genes for complex mental and behavioral disorders through inclusion of environmental exposures following the report of an interaction between the serotonin transporter linked polymorphic region (5-HTTLPR) and stressful life events on an increased risk of major depression. To conduct a meta-analysis of the interaction between the serotonin transporter gene and stressful life events on depression using both published data and individual-level original data. Search of PubMed, EMBASE, and PsycINFO databases through March 2009 yielded 26 studies of which 14 met criteria for the meta-analysis. Criteria for studies for the meta-analyses included published data on the association between 5-HTTLPR genotype (SS, SL, or LL), number of stressful life events (0, 1, 2, > or = 3) or equivalent, and a categorical measure of depression defined by the Diagnostic and Statistical Manual of Mental Disorders (Fourth Edition) or the International Statistical Classification of Diseases, 10th Revision (ICD-10) or use of a cut point to define depression from standardized rating scales. To maximize our ability to use a common framework for variable definition, we also requested original data from all studies published prior to 2008 that met inclusion criteria. Of the 14 studies included in the meta-analysis, 10 were also included in a second sex-specific meta-analysis of original individual-level data. Logistic regression was used to estimate the effects of the number of short alleles at 5-HTTLPR, the number of stressful life events, and their interaction on depression. Odds ratios (ORs) and 95% confidence intervals (CIs) were calculated separately for each study and then weighted averages of the individual estimates were obtained using random-effects meta-analysis. Both sex-combined and sex-specific meta-analyses were conducted. Of a total of 14,250 participants, 1769 were classified as having depression; 12,481 as not having depression. In the meta-analysis of published data, the number of stressful life events was significantly associated with depression (OR, 1.41; 95% CI,1.25-1.57). No association was found between 5-HTTLPR genotype and depression in any of the individual studies nor in the weighted average (OR, 1.05; 95% CI, 0.98-1.13) and no interaction effect between genotype and stressful life events on depression was observed (OR, 1.01; 95% CI, 0.94-1.10). Comparable results were found in the sex-specific meta-analysis of individual-level data. This meta-analysis yielded no evidence that the serotonin transporter genotype alone or in interaction with stressful life events is associated with an elevated risk of depression in men alone, women alone, or in both sexes combined.
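The random-effects pooling used in such meta-analyses is compact enough to show in full. A minimal DerSimonian-Laird sketch with hypothetical per-study log odds ratios (not the studies analyzed here):

```python
# Minimal DerSimonian-Laird random-effects meta-analysis of log odds ratios;
# the ORs and standard errors are hypothetical placeholders.
import numpy as np

log_or = np.log(np.array([1.10, 0.95, 1.20, 1.02, 0.98]))  # per-study ln(OR)
se = np.array([0.10, 0.12, 0.15, 0.08, 0.11])              # per-study SE

w_fixed = 1.0 / se**2
mu_fixed = np.sum(w_fixed * log_or) / np.sum(w_fixed)
Q = np.sum(w_fixed * (log_or - mu_fixed) ** 2)             # heterogeneity stat
df = len(log_or) - 1
tau2 = max(0.0, (Q - df) /
           (np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)))

w = 1.0 / (se**2 + tau2)                                   # random-effects weights
mu = np.sum(w * log_or) / np.sum(w)
se_mu = np.sqrt(1.0 / np.sum(w))
lo, hi = mu - 1.96 * se_mu, mu + 1.96 * se_mu
print(f"pooled OR = {np.exp(mu):.2f} (95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
```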
Bronstert, Axel; Agarwal, Ankit; Boessenkool, Berry; Crisologo, Irene; Fischer, Madlen; Heistermann, Maik; Köhn-Reich, Lisei; López-Tarazón, José Andrés; Moran, Thomas; Ozturk, Ugur; Reinhardt-Imjela, Christian; Wendi, Dadiyorto
2018-07-15
The flash-flood in Braunsbach in the north-eastern part of Baden-Wuerttemberg/Germany was a particularly strong and concise event which took place during the floods in southern Germany at the end of May/early June 2016. This article presents a detailed analysis of the hydro-meteorological forcing and the hydrological consequences of this event. A specific approach, the "forensic hydrological analysis", was followed in order to retrospectively include and combine a variety of data from different disciplines. Such an approach investigates the origins, mechanisms and course of such natural events, if possible in a "near real time" mode, in order to follow the most recent traces of the event. The results show that it was a very rare rainfall event with extreme intensities which, in combination with catchment properties, led to extreme runoff plus severe geomorphological hazards, i.e. large debris flows, which together resulted in immense damage in the small rural town of Braunsbach. It was definitely a record-breaking event and greatly exceeded existing design guidelines for extreme flood discharge for this region, i.e. by a factor of about 10. Being such a rare or even unique event, it is not reliably feasible to put it into a crisp probabilistic context. However, one can conclude that a return period clearly above 100 years can be assigned to all event components: rainfall, peak discharge and sediment transport. Due to the complex and interacting processes, no single cause of the flood or reason for the very high damage can be identified, since only their interplay and cascading characteristics led to such an event. The roles of different human activities in the origin and/or intensification of such an extreme event are finally discussed. Copyright © 2018. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Egozcue, J. J.; Pawlowsky-Glahn, V.; Ortego, M. I.
2005-03-01
Standard practice in wave-height hazard analysis often pays little attention to the uncertainty of assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, the uncertainty of the hazard estimates is normally able to hide the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where recent years have been extremely disastrous. Thus, it is possible to compare the hazard assessment based on data prior to those years with the analysis including them. With our approach, no significant change is detected when the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. The time-occurrence of events is assumed to be Poisson distributed. The wave-height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave-heights are assumed independent from event to event and also independent of their occurrence in time. A threshold for excesses is assessed empirically. The other three parameters (the Poisson rate and the shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for physical features of ocean waves in the Mediterranean sea and experience with these phenomena. The posterior distribution of the parameters yields posterior distributions of derived parameters such as occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
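The peaks-over-threshold model can be sketched with a maximum-likelihood GPD fit standing in for the paper's Bayesian estimation; the wave heights and threshold below are synthetic:

```python
# Minimal peaks-over-threshold sketch: Poisson exceedance rate plus a GPD
# fit to threshold excesses, combined into a return-level estimate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
years = 30.0
waves = rng.gumbel(3.0, 0.8, size=900)            # synthetic event wave heights (m)
u = np.quantile(waves, 0.9)                       # empirical threshold for excesses
excess = waves[waves > u] - u
rate = len(excess) / years                        # Poisson rate of exceedances

shape, loc, scale = stats.genpareto.fit(excess, floc=0)

def return_level(T):
    """Wave height exceeded on average once every T years."""
    p = 1.0 / (rate * T)                          # exceedance prob per event
    return u + stats.genpareto.ppf(1 - p, shape, loc=0, scale=scale)

print(f"100-year wave height ~ {return_level(100):.2f} m")
```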
[Analysis of Kudiezi injection's security literature].
Chang, Yan-Peng; Xie, Yan-Ming
2012-09-01
By searching the relevant databases, the aim was to review the reported safety of Kudiezi injection (Yueanxin). We analyzed patients' gender, age, underlying disease, medication dosage, solvent, time of occurrence of adverse events/adverse reactions, and clinical presentation. It was found that adverse events/adverse reactions occur mostly in older people, involving organs and systems including the skin and its appendages, the digestive system, the nervous system, the circulatory system, the respiratory system, and systemic reactions; some of the adverse events/adverse reactions were caused by use not in accordance with the label instructions. The judgment of adverse events/adverse reactions was found to lack objective evidence, which affects the objective evaluation of the safety of Kudiezi injection (Yueanxin).
First Measurement of Electron Neutrino Appearance in NOvA
NASA Astrophysics Data System (ADS)
Adamson, P.; Ader, C.; Andrews, M.; Anfimov, N.; Anghel, I.; Arms, K.; Arrieta-Diaz, E.; Aurisano, A.; Ayres, D. S.; Backhouse, C.; Baird, M.; Bambah, B. A.; Bays, K.; Bernstein, R.; Betancourt, M.; Bhatnagar, V.; Bhuyan, B.; Bian, J.; Biery, K.; Blackburn, T.; Bocean, V.; Bogert, D.; Bolshakova, A.; Bowden, M.; Bower, C.; Broemmelsiek, D.; Bromberg, C.; Brunetti, G.; Bu, X.; Butkevich, A.; Capista, D.; Catano-Mur, E.; Chase, T. R.; Childress, S.; Choudhary, B. C.; Chowdhury, B.; Coan, T. E.; Coelho, J. A. B.; Colo, M.; Cooper, J.; Corwin, L.; Cronin-Hennessy, D.; Cunningham, A.; Davies, G. S.; Davies, J. P.; Del Tutto, M.; Derwent, P. F.; Deepthi, K. N.; Demuth, D.; Desai, S.; Deuerling, G.; Devan, A.; Dey, J.; Dharmapalan, R.; Ding, P.; Dixon, S.; Djurcic, Z.; Dukes, E. C.; Duyang, H.; Ehrlich, R.; Feldman, G. J.; Felt, N.; Fenyves, E. J.; Flumerfelt, E.; Foulkes, S.; Frank, M. J.; Freeman, W.; Gabrielyan, M.; Gallagher, H. R.; Gebhard, M.; Ghosh, T.; Gilbert, W.; Giri, A.; Goadhouse, S.; Gomes, R. A.; Goodenough, L.; Goodman, M. C.; Grichine, V.; Grossman, N.; Group, R.; Grudzinski, J.; Guarino, V.; Guo, B.; Habig, A.; Handler, T.; Hartnell, J.; Hatcher, R.; Hatzikoutelis, A.; Heller, K.; Howcroft, C.; Huang, J.; Huang, X.; Hylen, J.; Ishitsuka, M.; Jediny, F.; Jensen, C.; Jensen, D.; Johnson, C.; Jostlein, H.; Kafka, G. K.; Kamyshkov, Y.; Kasahara, S. M. S.; Kasetti, S.; Kephart, K.; Koizumi, G.; Kotelnikov, S.; Kourbanis, I.; Krahn, Z.; Kravtsov, V.; Kreymer, A.; Kulenberg, Ch.; Kumar, A.; Kutnink, T.; Kwarciancy, R.; Kwong, J.; Lang, K.; Lee, A.; Lee, W. M.; Lee, K.; Lein, S.; Liu, J.; Lokajicek, M.; Lozier, J.; Lu, Q.; Lucas, P.; Luchuk, S.; Lukens, P.; Lukhanin, G.; Magill, S.; Maan, K.; Mann, W. A.; Marshak, M. L.; Martens, M.; Martincik, J.; Mason, P.; Matera, K.; Mathis, M.; Matveev, V.; Mayer, N.; McCluskey, E.; Mehdiyev, R.; Merritt, H.; Messier, M. D.; Meyer, H.; Miao, T.; Michael, D.; Mikheyev, S. P.; Miller, W. H.; Mishra, S. R.; Mohanta, R.; Moren, A.; Mualem, L.; Muether, M.; Mufson, S.; Musser, J.; Newman, H. B.; Nelson, J. K.; Niner, E.; Norman, A.; Nowak, J.; Oksuzian, Y.; Olshevskiy, A.; Oliver, J.; Olson, T.; Paley, J.; Pandey, P.; Para, A.; Patterson, R. B.; Pawloski, G.; Pearson, N.; Perevalov, D.; Pershey, D.; Peterson, E.; Petti, R.; Phan-Budd, S.; Piccoli, L.; Pla-Dalmau, A.; Plunkett, R. K.; Poling, R.; Potukuchi, B.; Psihas, F.; Pushka, D.; Qiu, X.; Raddatz, N.; Radovic, A.; Rameika, R. A.; Ray, R.; Rebel, B.; Rechenmacher, R.; Reed, B.; Reilly, R.; Rocco, D.; Rodkin, D.; Ruddick, K.; Rusack, R.; Ryabov, V.; Sachdev, K.; Sahijpal, S.; Sahoo, H.; Samoylov, O.; Sanchez, M. C.; Saoulidou, N.; Schlabach, P.; Schneps, J.; Schroeter, R.; Sepulveda-Quiroz, J.; Shanahan, P.; Sherwood, B.; Sheshukov, A.; Singh, J.; Singh, V.; Smith, A.; Smith, D.; Smolik, J.; Solomey, N.; Sotnikov, A.; Sousa, A.; Soustruznik, K.; Stenkin, Y.; Strait, M.; Suter, L.; Talaga, R. L.; Tamsett, M. C.; Tariq, S.; Tas, P.; Tesarek, R. J.; Thayyullathil, R. B.; Thomsen, K.; Tian, X.; Tognini, S. C.; Toner, R.; Trevor, J.; Tzanakos, G.; Urheim, J.; Vahle, P.; Valerio, L.; Vinton, L.; Vrba, T.; Waldron, A. V.; Wang, B.; Wang, Z.; Weber, A.; Wehmann, A.; Whittington, D.; Wilcer, N.; Wildberger, R.; Wildman, D.; Williams, K.; Wojcicki, S. G.; Wood, K.; Xiao, M.; Xin, T.; Yadav, N.; Yang, S.; Zadorozhnyy, S.; Zalesak, J.; Zamorano, B.; Zhao, A.; Zirnstein, J.; Zwaska, R.; NOvA Collaboration
2016-04-01
We report results from the first search for νμ→νe transitions by the NOvA experiment. In an exposure equivalent to 2.74 × 10²⁰ protons on target in the upgraded NuMI beam at Fermilab, we observe 6 events in the Far Detector, compared to a background expectation of 0.99 ± 0.11 (syst) events based on the Near Detector measurement. A secondary analysis observes 11 events with a background of 1.07 ± 0.14 (syst). The 3.3σ excess of events observed in the primary analysis disfavors 0.1π < δCP < 0.5π in the inverted mass hierarchy at the 90% C.L.
Weir, Matthew R; Haskell, Lloyd; Berger, Jeffrey S; Ashton, Veronica; Laliberté, François; Crivera, Concetta; Brown, Kip; Lefebvre, Patrick; Schein, Jeffrey
2018-05-01
Renal dysfunction increases the risk of thromboembolic and bleeding events in patients with nonvalvular atrial fibrillation (NVAF). Adult NVAF patients with ≥ 6 months of data prior to the first warfarin or rivaroxaban dispensing were selected from the IMS Health Real-World Data Adjudicated Claims database (05/2011 - 06/2015) with electronic medical records. Ischemic stroke events, thromboembolic events (venous thromboembolism, myocardial infarction, or ischemic stroke), and major bleeding events were compared between patients by renal function, identified by 1) relevant ICD-9-CM diagnosis codes and 2) estimated creatinine clearance (eCrCl). Baseline confounders were adjusted using inverse probability of treatment weights. The diagnosis-based analysis included 39,872 rivaroxaban and 48,637 warfarin users (3,572 and 8,230 with renal dysfunction, respectively). The eCrCl-based analysis included 874 rivaroxaban and 1,069 warfarin users (66 and 208 with eCrCl < 60 mL/min, respectively). In the diagnosis-based analysis, rivaroxaban users with renal dysfunction had a significantly lower stroke rate (HR = 0.55, p = 0.0004) compared to warfarin users; rivaroxaban users with and without renal dysfunction had significantly lower thromboembolic event rates (HR = 0.62, p < 0.0001; and HR = 0.64, p < 0.0001, respectively), and major bleeding rates similar to warfarin users. In the eCrCl-based analysis, rivaroxaban users with eCrCl ≥ 60 mL/min had a significantly lower thromboembolic event rate, but other outcomes were not statistically significant. Rivaroxaban-treated NVAF patients with diagnosed renal dysfunction had a significantly lower stroke rate compared to warfarin-treated patients. Regardless of renal dysfunction diagnoses, rivaroxaban users had lower thromboembolic event rates compared to warfarin users, and a similar rate of major bleeding. The eCrCl-based analysis was limited by a small sample size.
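Inverse probability of treatment weighting, the adjustment used above, is easy to sketch. A minimal example with hypothetical covariates and stabilized weights; the variable names and data are illustrative only:

```python
# Minimal IPTW sketch: fit a propensity model, build stabilized weights,
# and check covariate balance. Covariates and treatment model are toy.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 5000
df = pd.DataFrame({
    "age": rng.normal(70, 10, n),
    "chads2": rng.integers(0, 7, n),
    "renal_dysfunction": rng.integers(0, 2, n),
})
logit = -3 + 0.03 * df["age"] + 0.2 * df["chads2"]
df["rivaroxaban"] = rng.random(n) < 1 / (1 + np.exp(-logit))

X, t = df[["age", "chads2", "renal_dysfunction"]], df["rivaroxaban"]
ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]

p_treated = t.mean()                        # stabilization factor
df["iptw"] = np.where(t, p_treated / ps, (1 - p_treated) / (1 - ps))

# Weighted arms should now be balanced on baseline covariates:
for arm, g in df.groupby("rivaroxaban"):
    print(arm, round(np.average(g["age"], weights=g["iptw"]), 2))
```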
Climate signature of Northwest U.S. precipitation Extremes
NASA Astrophysics Data System (ADS)
Kushnir, Y.; Nakamura, J.
2017-12-01
The climate signature of precipitation extremes in the Northwest U.S. - the region that includes Oregon, Washington, Idaho, Montana and Wyoming - is studied using composite analysis of the atmospheric fields leading to and associated with extreme rainfall events. A K-Medoids cluster analysis is applied to winter (November-February) monthly maximum 5-day precipitation amounts calculated from 1-degree gridded daily rainfall between 1950/51 and 2013/14. The clustering divides the region into three sub-regions: one over the far eastern part of the analysis domain, including most of Montana and Wyoming, and two others in the west, lying north and south of the latitude of 45N. Using the time series corresponding to the Medoid centers, we extract the largest (top 5%) monthly extreme events to form the basis for the composite analysis. The main circulation feature distinguishing a 5-day extreme precipitation event in the two western sub-regions of the Northwest is the presence of a large, blocking, high pressure anomaly over the Gulf of Alaska about a week before the beginning of the 5-day intense precipitation event. The high pressure center intensifies considerably with time, drifting slowly westward, up to a day before the extreme event. During that time, a weak low pressure center appears at 30N, to the southwest of the high, deepening as it moves east. As the extreme rainfall event is about to begin, the now deep low encroaches on the Northwest coast while its southern flank taps well into the subtropical Pacific, drawing moisture from as far south as 15N. During the 5-day extreme precipitation event the high pressure center moves west and weakens while the now intense low converges large amounts of subtropical moisture to precipitate over the western Northwest. The implication of this analysis for extended range prediction is assessed.
NASA Astrophysics Data System (ADS)
Streubel, D. P.; Kodama, K.
2014-12-01
To provide continuous flash flood situational awareness and to better differentiate the severity of ongoing individual precipitation events, the National Weather Service Research Distributed Hydrologic Model (RDHM) is being implemented over Hawaii and Alaska. In the implementation of RDHM, three gridded precipitation analyses are used as forcing. The first analysis is a radar-only precipitation estimate derived from WSR-88D digital hybrid reflectivity and a Z-R relationship, aggregated onto an hourly ¼ HRAP grid. The second analysis is derived from a rain gauge network and interpolated onto an hourly ¼ HRAP grid using PRISM climatology. The third analysis is derived from a rain gauge network where rain gauges are assigned static pre-determined weights to derive a uniform mean areal precipitation that is applied over a catchment on a ¼ HRAP grid. To assess the effect of different QPE analyses on the accuracy of RDHM simulations and to potentially identify a preferred analysis for operational use, each QPE was used to force RDHM to simulate streamflow for 20 USGS peak flow events. The evaluation of the RDHM simulations focused on peak flow magnitude, peak flow timing, and event volume accuracy, these being most relevant for operational use. Results showed that RDHM simulations based on the observed rain gauge amounts were more accurate in simulating peak flow magnitude and event volume relative to the radar-derived analysis. However, this result was not consistent for all 20 events, nor was it consistent for a few of the rainfall events where an annual peak flow was recorded at more than one USGS gage. This indicates that a more robust QPE forcing, with the inclusion of uncertainty derived from the three analyses, may provide a better input for simulating extreme peak flow events.
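The radar-only analysis rests on a Z-R power law, Z = aR^b. A minimal sketch using the common Marshall-Palmer coefficients (a=200, b=1.6) as an assumption; operational values vary by site and season:

```python
# Minimal Z-R conversion sketch: reflectivity (dBZ) -> rain rate (mm/h),
# then aggregation of 5-minute scans to an hourly total. Coefficients are
# the Marshall-Palmer defaults, used here only for illustration.
import numpy as np

def dbz_to_rainrate(dbz, a=200.0, b=1.6):
    """Reflectivity (dBZ) -> rain rate (mm/h) via Z = a * R^b."""
    z = 10.0 ** (dbz / 10.0)          # dBZ -> linear reflectivity factor
    return (z / a) ** (1.0 / b)

# Twelve 5-minute scans aggregated to an hourly accumulation (mm).
scans_dbz = np.array([20, 25, 30, 35, 38, 40, 38, 35, 30, 25, 20, 15])
rates = dbz_to_rainrate(scans_dbz)
hourly_mm = np.sum(rates * (5.0 / 60.0))
print(f"hourly accumulation = {hourly_mm:.1f} mm")
```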
Li, Zhuqing; Li, Xiang; Wang, Canhua; Song, Guiwen; Pi, Liqun; Zheng, Lan; Zhang, Dabing; Yang, Litao
2017-09-27
Multiple-target plasmid DNA reference materials have been generated and utilized as good substitutes for matrix-based reference materials in the analysis of genetically modified organisms (GMOs). Herein, we report the construction of one multiple-target plasmid reference molecule, pCAN, which harbors eight GM canola event-specific sequences (RF1, RF2, MS1, MS8, Topas 19/2, Oxy235, RT73, and T45) and a partial sequence of the canola endogenous reference gene PEP. The applicability of this plasmid reference material in qualitative and quantitative PCR assays of the eight GM canola events was evaluated, including analysis of specificity, limit of detection (LOD), limit of quantification (LOQ), and the performance of pCAN in the analysis of various canola samples. The LODs are 15 copies for the RF2, MS1, and RT73 assays using pCAN as the calibrator and 10 genome copies for the other events. The LOQ in each event-specific real-time PCR assay is 20 copies. In quantitative real-time PCR analysis, the PCR efficiencies of all event-specific and PEP assays are between 91% and 97%, and the squared regression coefficients (R 2 ) are all higher than 0.99. The quantification bias values varied from 0.47% to 20.68%, with relative standard deviations (RSD) from 1.06% to 24.61%, in the quantification of simulated samples. Furthermore, 10 practical canola samples taken from imported shipments in the port of Shanghai, China, were analyzed employing pCAN as the calibrator, and the results were comparable with those of assays using commercial certified materials as the calibrator. From these results, we believe that this newly developed pCAN plasmid is a good candidate plasmid DNA reference material for the detection and quantification of the eight GM canola events in routine analysis.
Participation of the NDC Austria at the NDC Preparedness Exercise 2012
NASA Astrophysics Data System (ADS)
Mitterbauer, Ulrike; Wotawa, Gerhard; Schraick, Irene
2013-04-01
NDC Preparedness Exercises (NPEs) are conducted annually by the National Data Centers (NDCs) of CTBT States Signatories to train the detection of a (hypothetical) nuclear test. During the NDC Preparedness Exercise 2012, a fictitious radionuclide scenario originating from a real seismic event (mining explosion) was calculated by the German NDC and distributed among all NDCs. For the scenario computation, it was assumed that the selected seismic event was the epicentre of an underground nuclear fission explosion. The scenario included detections of the Iodine isotopes I-131 and I-133 (both particulates), and the Radioxenon Isotopes Xe-133, Xe-133M, Xe-131M and Xe-135 (noble gas). By means of atmospheric transport modelling (ATM), concentrations of all these six isotopes which would result from the hypothetical explosion were calculated and interpolated to the IMS station locations. The participating NDCs received information about the concentration of the isotopes at the station locations without knowing the underlying seismic event. The aim of the exercise was to identify this event based on the detection scenario. The Austrian NDC performed the following analyses: • Atmospheric backtracking and data fusion to identify seismic candidate events, • Seismic analysis of candidate events within the possible source region, • Atmospheric transport modelling (forward mode) from identified candidate events, comparison between "measured" and simulated concentrations based on certain release assumptions. The main goal of the analysis was to identify the event selected by NDC Germany to calculate the radionuclide scenario, and to exclude other events. In the presentation, the analysis methodology as well as the final results and conclusions will be shown and discussed in detail.
NASA Astrophysics Data System (ADS)
Allison, Lesley; Hawkins, Ed; Woollings, Tim
2015-01-01
Many previous studies have shown that unforced climate model simulations exhibit decadal-scale fluctuations in the Atlantic meridional overturning circulation (AMOC), and that this variability can have impacts on surface climate fields. However, the robustness of these surface fingerprints across different models is less clear. Furthermore, with the potential for coupled feedbacks that may amplify or damp the response, it is not known whether the associated climate signals are linearly related to the strength of the AMOC changes, or if the fluctuation events exhibit nonlinear behaviour with respect to their strength or polarity. To explore these questions, we introduce an objective and flexible method for identifying the largest natural AMOC fluctuation events in multicentennial/multimillennial simulations of a variety of coupled climate models. The characteristics of the events are explored, including their magnitude, meridional coherence and spatial structure, as well as links with ocean heat transport and the horizontal circulation. The surface fingerprints in ocean temperature and salinity are examined, and compared with the results of linear regression analysis. It is found that the regressions generally provide a good indication of the surface changes associated with the largest AMOC events. However, there are some exceptions, including a nonlinear change in the atmospheric pressure signal, particularly at high latitudes, in HadCM3. Some asymmetries are also found between the changes associated with positive and negative AMOC events in the same model. Composite analysis suggests that there are signals that are robust across the largest AMOC events in each model, which provides reassurance that the surface changes associated with one particular event will be similar to those expected from regression analysis. However, large differences are found between the AMOC fingerprints in different models, which may hinder the prediction and attribution of such events in reality.
FASEA: A FPGA Acquisition System and Software Event Analysis for liquid scintillation counting
NASA Astrophysics Data System (ADS)
Steele, T.; Mo, L.; Bignell, L.; Smith, M.; Alexiev, D.
2009-10-01
The FASEA (FPGA-based Acquisition and Software Event Analysis) system has been developed to replace the MAC3 for coincidence pulse processing. The system uses a National Instruments Virtex-5 FPGA card (PXI-7842R) for data acquisition and purpose-developed software for data analysis. Initial comparisons to the MAC3 unit, based on measurements of 89Sr and 3H, are included, confirming that the system is able to accurately emulate the behaviour of the MAC3 unit.
Economic impact and market analysis of a special event: The Great New England Air Show
Rodney B. Warnick; David C. Bojanic; Atul Sheel; Apurv Mather; Deepak Ninan
2010-01-01
We conducted a post-event evaluation for the Great New England Air Show to assess its general economic impact and to refine economic estimates where possible. In addition to the standard economic impact variables, we examined travel distance, purchase decision involvement, event satisfaction, and frequency of attendance. Graphic mapping of event visitors' home ZIP...
ERIC Educational Resources Information Center
Tanner, Marie
2017-01-01
In this article, I examine the relation between literacy events and literacy practices in classroom interaction and add to ongoing discussions in the field of NLS about the transcontextual nature of literacy and how local literacy events are linked to broader literacy practices. It specifically focuses on how the link between literacy events and…
DOT National Transportation Integrated Search
2002-02-01
This report examines the literature on involuntary, high-consequence, low-probability (IHL) events like nuclear power plant meltdowns to determine what can be applied to the problem of voluntary, low-consequence, high-probability (VLH) events like tra...
Performance characterisation of a constructed wetland.
Mangangka, Isri R; Egodawatta, Prasanna; Parker, Nathaniel; Gardner, Ted; Goonetilleke, Ashantha
2013-01-01
Performance of a constructed wetland is commonly reported as being variable due to the site specific nature of influential factors. This paper discusses the outcomes from an in-depth study which characterised the treatment performance of a wetland based on the variation in the runoff regime. The study included a comprehensive field monitoring of a well-established constructed wetland in Gold Coast, Australia. Samples collected at the inlet and outlet were tested for Total Suspended Solids (TSS), Total Nitrogen (TN) and Total Phosphorus (TP). Pollutant concentrations in the outflow were found to be consistent irrespective of the variation in inflow water quality. The analysis revealed two different treatment characteristics for events with different rainfall depths. TSS and TN load reduction was found to be strongly influenced by the hydraulic retention time where performance was relatively superior for rainfall events below the design event. For small events, treatment performance was higher at the beginning of the event and gradually decreased during the course of the event. For large events, the treatment performance was comparatively poor at the beginning and improved during the course of the event. The analysis also confirmed the variable treatment trends for different pollutant types.
The Human Factors of an Early Space Accident: Flight 3-65 of the X-15
NASA Technical Reports Server (NTRS)
Barshi, Immanuel; Statler, Irving C.; Orr, Jeb S.
2016-01-01
The X-15 was a critical research vehicle in the early days of space flight. On November 15, 1967, the X-15-3 suffered an in-flight breakup. This 191st flight of the X-15 and 65th flight of this third configuration was the only fatal accident of the X-15 program. This paper presents an analysis, from a human factors perspective, of the events that led up to the accident. The analysis is based on the information contained in the report of the Air Force-NASA Accident Investigation Board (AIB) dated January 1968. The AIB's analysis addressed, primarily, the events that occurred subsequent to the pilot's taking direct control of the reaction control system. The analysis described here suggests that, rather than the events following the pilot's switch to direct control, it was the events preceding the switch that led to the accident. Consequently, the analyses and conclusions presented here regarding the causal factors of, and the contributing factors to, the loss of Flight 3-65 differ from those of the AIB based on the same evidence. Although the accident occurred in 1967, the results of the presented analysis are still relevant today. We present our analysis and discuss its implications for the safety of space operations.
NASA Astrophysics Data System (ADS)
Kaitna, R.; Braun, M.
2016-12-01
Steep mountain channels can episodically experience very different geomorphic processes, ranging from flash floods and intensive bedload transport to debris floods and debris flows. The rainfall-related trigger conditions and geomorphic disposition for each of these processes to occur, as well as the conditions leading to one process rather than another, are not well understood. In this contribution, we analyze triggering rainfalls for all documented events in the Eastern (Austrian) Alps on a daily and sub-daily basis. The analysis with daily rainfall data covers more than 6640 events between 1901 and 2014, and the analysis based on sub-daily (10 min interval) rainfall data includes around 950 events between 1992 and 2014. Of the four investigated event types, we find that debris flows are typically associated with the least cumulative rainfall, while intensive bedload transport as well as torrential floods occur when there is a substantial amount of cumulative rainfall. Debris floods occur, on average, with cumulative rainfall in a range between the aforementioned processes. Comparison of historical data shows that about 90% of events were triggered by a combination of extreme rainfall and temperature. Bayesian analysis reveals that a large proportion of geomorphic events is associated with very short rainfall durations that cannot be resolved with daily rainfall data; a comparison of both datasets shows that sub-daily data give more accurate results. Additionally, we find substantial regional differences, e.g. between regions north and south of the Alpine chain or between high and low Alpine regions. There is an indication that debris flows in particular need less total rainfall when occurring in regions with high relief energy than in less steep environments. The limitations of our analysis are mainly due to the distance between the locations of event triggering and rainfall measurement, and to the definition of rainfall events for the Bayesian analysis. In a next step, we will connect our results with analyses of the hydrological and geomorphological disposition in selected study regions and with projections of changing climate conditions.
Disruption Event Characterization and Forecasting in Tokamaks
NASA Astrophysics Data System (ADS)
Berkery, J. W.; Sabbagh, S. A.; Park, Y. S.; Ahn, J. H.; Jiang, Y.; Riquezes, J. D.; Gerhardt, S. P.; Myers, C. E.
2017-10-01
The Disruption Event Characterization and Forecasting (DECAF) code, being developed to meet the challenging goal of high reliability disruption prediction in tokamaks, automates data analysis to determine chains of events that lead to disruptions and to forecast their evolution. The relative timing of magnetohydrodynamic modes and other events including plasma vertical displacement, loss of boundary control, proximity to density limits, reduction of safety factor, and mismatch of the measured and desired plasma current are considered. NSTX/-U databases are examined with analysis expanding to DIII-D, KSTAR, and TCV. Characterization of tearing modes has determined mode bifurcation frequency and locking points. In an NSTX database exhibiting unstable resistive wall modes (RWM), the RWM event and loss of boundary control event were found in 100%, and the vertical displacement event in over 90% of cases. A reduced kinetic RWM stability physics model is evaluated to determine the proximity of discharges to marginal stability. The model shows high success as a disruption predictor (greater than 85%) with relatively low false positive rate. Supported by US DOE Contracts DE-FG02-99ER54524, DE-AC02-09CH11466, and DE-SC0016614.
NASA Astrophysics Data System (ADS)
Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao
2006-12-01
We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
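A minimal sketch of the inlier/outlier idea described above: segment a feature time series, build an affinity matrix between segment models, and read background membership off the dominant eigenvector. The Gaussian kernel on segment means is a simplification of the statistical models the authors estimate, and all parameters and data below are illustrative.

```python
import numpy as np

def outlier_scores(features, win=50, sigma=1.0):
    # Chop the low/mid-level feature time series into fixed-length subsequences.
    segs = [features[i:i + win].mean(axis=0)
            for i in range(0, len(features) - win + 1, win)]
    m = np.array(segs)
    # Affinity between subsequence models (here simply a Gaussian kernel on
    # segment means; the paper estimates full statistical models per segment).
    d2 = ((m[:, None, :] - m[None, :, :]) ** 2).sum(-1)
    affinity = np.exp(-d2 / (2 * sigma ** 2))
    # Dominant eigenvector: large components mark the usual "background"
    # segments, small components mark candidate outliers ("interesting" events).
    vals, vecs = np.linalg.eigh(affinity)
    background = np.abs(vecs[:, -1])
    return 1.0 - background / background.max()  # higher score = more outlying

rng = np.random.default_rng(0)
x = rng.normal(0, 1, size=(1000, 12))      # background audio features
x[400:450] += 4.0                          # an atypical "highlight" burst
print(outlier_scores(x).round(2))          # segment 8 stands out
```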
Lu, Guo-Cai; Wei, Rui-Li
2012-01-01
Background Intravitreal anti-vascular endothelial growth factor (VEGF) monoclonal antibodies are used in ocular neovascular diseases. A consensus has emerged that intravenous anti-VEGF can increase the risk of arterial thromboembolic events. However, the role of intravitreal anti-VEGF in arterial thromboembolism is controversial. Therefore, we did a systematic review and meta-analysis to investigate the effects of intravitreal anti-VEGF on the risk of arterial thromboembolic events. Methods Electronic databases were searched to identify relevant randomized clinical trials comparing intravitreal anti-VEGF with controls. Criteria for inclusion in our meta-analysis included a study duration of no less than 12 months, the use of a randomized control group not receiving any intravitreal active agent, and the availability of outcome data for arterial thromboembolic events, myocardial infarction, cerebrovascular accidents, and vascular death. The risk ratios and 95% CIs were calculated using a fixed-effects or random-effects model, depending on the heterogeneity of the included studies. Results A total of 4942 patients with a variety of ocular neovascular diseases from 13 randomized controlled trials were identified and included for analysis. There was no significant difference between intravitreal anti-VEGF and control in the risk of all events, with risk ratios of 0.87 (95% CI, 0.64–1.19) for arterial thromboembolic events, 0.96 (95% CI, 0.55–1.68) for cerebrovascular accidents, 0.69 (95% CI, 0.40–1.21) for myocardial infarctions, and 0.68 (95% CI, 0.37–1.27) for vascular death. Conclusions The strength of the evidence suggests that the intravitreal use of anti-VEGF antibodies is not associated with an increased risk of arterial thromboembolic events. PMID:22829940
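The pooled risk ratios above come from standard inverse-variance meta-analysis; a fixed-effects sketch in Python is shown below (the abstract also used random-effects models depending on heterogeneity), with hypothetical per-trial counts rather than the trial data analyzed in the study.

```python
import numpy as np

def pooled_risk_ratio(events_t, n_t, events_c, n_c):
    """Fixed-effects (inverse-variance) pooled risk ratio across trials."""
    e_t, e_c = np.asarray(events_t, float), np.asarray(events_c, float)
    n_t, n_c = np.asarray(n_t, float), np.asarray(n_c, float)
    log_rr = np.log((e_t / n_t) / (e_c / n_c))
    # Standard variance of a log risk ratio per trial.
    var = 1 / e_t - 1 / n_t + 1 / e_c - 1 / n_c
    w = 1 / var
    pooled = (w * log_rr).sum() / w.sum()
    se = np.sqrt(1 / w.sum())
    ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
    return np.exp(pooled), ci

# Hypothetical per-trial counts (treated events, treated N, control events, control N).
rr, ci = pooled_risk_ratio([12, 8, 15], [800, 600, 900], [14, 9, 18], [790, 610, 880])
print(f"RR = {rr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```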
The cost of nurse-sensitive adverse events.
Pappas, Sharon Holcombe
2008-05-01
The aim of this study was to describe the methodology for nursing leaders to determine the cost of adverse events and effective levels of nurse staffing. The growing transparency of quality and cost outcomes motivates healthcare leaders to optimize the effectiveness of nurse staffing. Most hospitals have robust cost accounting systems that provide actual patient-level direct costs. These systems allow an analysis of the cost consumed by patients during a hospital stay. By knowing the cost of complications, leaders have the ability to justify the cost of improved staffing when quality evidence shows that higher nurse staffing improves quality. An analysis was performed on financial and clinical data from hospital databases of 3,200 inpatients. The purpose was to establish a methodology to determine actual cost per case. Three diagnosis-related groups were the focus of the analysis. Five adverse events were analyzed along with the costs. A regression analysis reported that the actual direct cost of an adverse event was $1,029 per case in the congestive heart failure cases and $903 in the surgical cases. There was a significant increase in the cost per case in medical patients with urinary tract infection and pressure ulcers and in surgical patients with urinary tract infection and pneumonia. The odds of pneumonia occurring in surgical patients decreased with additional registered nurse hours per patient day. Hospital cost accounting systems are useful in determining the cost of adverse events and can aid in decision making about nurse staffing. Adverse events add costs to patient care and should be measured at the unit level to adjust staffing to reduce adverse events and avoid costs.
On-line data analysis and monitoring for H1 drift chambers
NASA Astrophysics Data System (ADS)
Düllmann, Dirk
1992-05-01
The on-line monitoring, slow control and calibration of the H1 central jet chamber uses a VME multiprocessor system to perform the analysis and a connected Macintosh computer as the graphical interface for the operator on shift. The tasks of this system are: - analysis of event data, including on-line track search, - on-line calibration from normal events and testpulse events, - control of the high voltage and monitoring of settings and currents, - monitoring of temperature, pressure and mixture of the chamber gas. A program package is described which controls the dataflow between data acquisition, the different VME CPUs and the Macintosh. It allows off-line-style programs to be run for the different tasks.
Re-Evaluation of Event Correlations in Virtual California Using Statistical Analysis
NASA Astrophysics Data System (ADS)
Glasscoe, M. T.; Heflin, M. B.; Granat, R. A.; Yikilmaz, M. B.; Heien, E.; Rundle, J.; Donnellan, A.
2010-12-01
Fusing the results of simulation tools with statistical analysis methods has contributed to a better understanding of the earthquake process. In a previous study, we used a statistical method to investigate emergent phenomena in data produced by the Virtual California earthquake simulator. The analysis indicated that there were some interesting fault interactions and possible triggering and quiescence relationships between events. We have converted the original code from Matlab to python/C++ and are now evaluating data from the most recent version of Virtual California in order to analyze and compare any new behavior exhibited by the model. The Virtual California earthquake simulator can be used to study fault and stress interaction scenarios for realistic California earthquakes. The simulation generates a synthetic earthquake catalog of events with a minimum size of ~M 5.8 that can be evaluated using statistical analysis methods. Virtual California utilizes realistic fault geometries and a simple Amontons-Coulomb stick-slip friction law in order to drive the earthquake process by means of a back-slip model, where loading of each segment occurs due to the accumulation of a slip deficit at the prescribed slip rate of the segment. Like any complex system, Virtual California may generate emergent phenomena unexpected even by its designers. In order to investigate this, we have developed a statistical method that analyzes the interaction between Virtual California fault elements and thereby determines whether events on any given fault elements show correlated behavior. Our method examines events on one fault element and then determines whether there is an associated event within a specified time window on a second fault element. Note that an event in our analysis is defined as any time an element slips, rather than any particular “earthquake” along the entire fault length. Results are tabulated and then differenced with an expected correlation, calculated by assuming a uniform distribution of events in time. We generate a correlation score matrix, which indicates how weakly or strongly correlated each fault element is with every other over the course of the VC simulation. We calculate correlation scores by summing the difference between the actual and expected correlations over all time window lengths and normalizing by the time window size. The correlation score matrix can focus attention on the most interesting areas for more in-depth analysis of event correlation vs. time. The previous study included 59 faults (639 elements) in the model, comprising all the faults save the creeping section of the San Andreas. The analysis spanned 40,000 yrs of Virtual California-generated earthquake data. The newly revised VC model includes 70 faults, 8720 fault elements, and spans 110,000 years. Due to computational considerations, we will evaluate the elements comprising the southern California region, which our previous study indicated showed interesting fault interaction and event triggering/quiescence relationships.
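A rough Python sketch of the correlation-score idea as described: compare the observed rate at which slips on one element are followed by slips on another against the rate expected from a uniform distribution of events in time, summed over window lengths and normalized by window size. Details such as the window set are assumptions, not the authors' implementation.

```python
import numpy as np

def correlation_score(times_i, times_j, total_time, windows=(1, 5, 10, 50)):
    """Score how strongly slips on element j follow slips on element i,
    relative to what a uniform-in-time event rate would predict."""
    times_i = np.sort(np.asarray(times_i, float))
    times_j = np.sort(np.asarray(times_j, float))
    score = 0.0
    for w in windows:
        # Actual: fraction of i-events with at least one j-event within w years.
        lo = np.searchsorted(times_j, times_i)
        hi = np.searchsorted(times_j, times_i + w)
        actual = ((hi - lo) > 0).mean()
        # Expected under a uniform distribution of j-events in time.
        expected = 1.0 - (1.0 - w / total_time) ** len(times_j)
        score += (actual - expected) / w   # difference normalized by window size
    return score

rng = np.random.default_rng(1)
t_i = np.sort(rng.uniform(0, 40000, 200))
t_j = np.sort(np.concatenate([t_i + rng.uniform(0, 2, 200),   # triggered slips
                              rng.uniform(0, 40000, 50)]))    # background slips
print(f"correlation score: {correlation_score(t_i, t_j, 40000):.3f}")
```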
Joint Attributes and Event Analysis for Multimedia Event Detection.
Ma, Zhigang; Chang, Xiaojun; Xu, Zhongwen; Sebe, Nicu; Hauptmann, Alexander G
2017-06-15
Semantic attributes have been increasingly used in the past few years for multimedia event detection (MED) with promising results. The motivation is that multimedia events generally consist of lower-level components such as objects, scenes, and actions. By characterizing multimedia event videos with semantic attributes, one can exploit more informative cues for improved detection results. Much existing work obtains semantic attributes from images, which may be suboptimal for video analysis since these image-inferred attributes do not carry the dynamic information that is essential for videos. To address this issue, we propose to learn semantic attributes from external videos using their semantic labels. We name them video attributes in this paper. In contrast with multimedia event videos, these external videos depict lower-level contents such as objects, scenes, and actions. To harness video attributes, we propose an algorithm built on a correlation vector that correlates them to a target event. Consequently, we can incorporate video attributes latently as extra information into the event detector learnt from multimedia event videos in a joint framework. To validate our method, we perform experiments on the real-world large-scale TRECVID MED 2013 and 2014 data sets and compare our method with several state-of-the-art algorithms. The experiments show that our method is advantageous for MED.
Ginsburg, Liane R; Chuang, You-Ta; Berta, Whitney Blair; Norton, Peter G; Ng, Peggy; Tregunno, Deborah; Richardson, Julia
2010-06-01
To examine the relationship between organizational leadership for patient safety and five types of learning from patient safety events (PSEs). Forty-nine general acute care hospitals in Ontario, Canada. A nonexperimental design using cross-sectional surveys of hospital patient safety officers (PSOs) and patient care managers (PCMs). PSOs provided data on organization-level learning from (a) minor events, (b) moderate events, (c) major near misses, (d) major event analysis, and (e) major event dissemination/communication. PCMs provided data on organizational leadership (formal and informal) for patient safety. Hospitals were the unit of analysis. Seemingly unrelated regression was used to examine the influence of formal and informal leadership for safety on the five types of learning from PSEs. The interaction between leadership and hospital size was also examined. Formal organizational leadership for patient safety is an important predictor of learning from minor, moderate, and major near-miss events, and major event dissemination. This relationship is significantly stronger for small hospitals (<100 beds). We find support for the relationship between patient safety leadership and patient safety behaviors such as learning from safety events. Formal leadership support for safety is of particular importance in small organizations where the economic burden of safety programs is disproportionately large and formal leadership is closer to the front lines.
A Comprehensive Seismic Characterization of the Cove Fort-Sulphurdale Geothermal Site, Utah
NASA Astrophysics Data System (ADS)
Zhang, H.; Li, J.; Zhang, X.; Liu, Y.; Kuleli, H. S.; Toksoz, M. N.
2012-12-01
The Cove Fort-Sulphurdale geothermal area is located in the transition zone between the extensional Basin and Range Province to the west and the uplifted Colorado Plateau to the east. The region around the geothermal site has the highest heat flow values in Utah, over 260 mW m-2. To better understand the structure around the geothermal site, the MIT group deployed 10 seismic stations for a period of one year from August 2010. The local seismic network detected over 500 local earthquakes, from which ~200 events located within the network were selected for further analysis. Our seismic analysis is focused on three aspects: seismic velocity and attenuation tomography, seismic event focal mechanism analysis, and seismic shear-wave splitting analysis. First P- and S-wave arrivals are picked manually, and then the waveform cross-correlation technique is applied to obtain more accurate differential times between event pairs observed at common stations. The double-difference tomography method of Zhang and Thurber (2003) is used to simultaneously determine Vp and Vs models and seismic event locations. For the attenuation tomography, we first calculate t* values from spectrum fitting and then invert them to get Q models based on known velocity models and seismic event locations. Due to the limited station coverage and relatively low signal-to-noise ratio, many seismic waveforms do not have clear first P-arrival polarities, and as a result the conventional focal mechanism determination method relying on polarity information is not applicable. Therefore, we used the full waveform matching method of Li et al. (2010) to determine event focal mechanisms. For the shear-wave splitting analysis, we used the cross-correlation method to determine the delay times between fast and slow shear waves and the polarization angles of the fast shear waves. The delay times are further used to image the anisotropy percentage distribution in three dimensions using the shear-wave splitting tomography method of Zhang et al. (2007). For the study region, overall the velocity is lower and the attenuation higher in the western part. Correspondingly, the anisotropy is also stronger there, indicating that fractures may be more developed in the western part. The average polarization directions of the fast shear waves at each station mostly point NNE. Focal mechanism analysis of selected events shows that the normal-faulting events have strikes in the NNE direction, and the events with strike-slip mechanisms have strikes either parallel to the NNE-trending faults or to their conjugates. Assuming the maximum horizontal stress (SHmax) is parallel to the strike of the normal-faulting events and bisects the two fault planes of the strike-slip events, the inverted source mechanisms suggest a NNE-oriented maximum horizontal stress regime. This area is under W-E tensional stress, which means the maximum compressional stress should be in the NE or NNE direction in general. The combination of shear-wave splitting and focal mechanism analysis suggests that in this region the faults and fractures are aligned in the NNE direction.
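The delay-time measurement in the shear-wave splitting step can be illustrated with a simple cross-correlation; the sketch below assumes the horizontal components have already been rotated into the fast/slow orientations, and the synthetic pulse and sampling rate are placeholders.

```python
import numpy as np

def splitting_delay(fast, slow, dt):
    """Estimate the delay time between fast and slow shear waves by
    cross-correlation (a minimal sketch of the measurement step only)."""
    n = len(fast)
    cc = np.correlate(slow - slow.mean(), fast - fast.mean(), mode="full")
    lag = np.argmax(cc) - (n - 1)   # samples by which `slow` trails `fast`
    return lag * dt

dt = 0.005                                  # 200 Hz sampling, assumed
t = np.arange(0, 1, dt)
pulse = np.exp(-((t - 0.3) / 0.02) ** 2)    # synthetic S-wave pulse
delayed = np.roll(pulse, 12)                # slow wave arrives 12 samples later
print(f"delay = {splitting_delay(pulse, delayed, dt) * 1000:.1f} ms")  # ~60 ms
```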
A Survey of Logic Formalisms to Support Mishap Analysis
NASA Technical Reports Server (NTRS)
Johnson, Chris; Holloway, C. M.
2003-01-01
Mishap investigations provide important information about adverse events and near-miss incidents. They are intended to help avoid any recurrence of previous failures. Over time, they can also yield statistical information about incident frequencies that helps to detect patterns of failure and can validate risk assessments. However, the increasing complexity of many safety-critical systems is posing new challenges for mishap analysis. Similarly, the recognition that many failures have complex, systemic causes has helped to widen the scope of many mishap investigations. These two factors have combined to pose new challenges for the analysis of adverse events. A new generation of formal and semi-formal techniques has been proposed to help investigators address these problems. We introduce the term mishap logics to collectively describe these notations that might be applied to support the analysis of mishaps. The proponents of these notations have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. Such proofs can be used to reduce the bias that is often perceived to affect the interpretation of adverse events. Others have argued that one cannot use logic formalisms to prove causes in the same way that one might prove propositions or theorems; such mechanisms cannot accurately capture the wealth of inductive, deductive, and statistical forms of inference that investigators must use in their analysis of adverse events. This paper provides an overview of these mishap logics. It also identifies several additional classes of logic that might be used to support mishap analysis.
Gibbons, Jeffrey A; Horowitz, Kyle A; Dunlap, Spencer M
2017-08-01
Unpleasant affect fades faster than pleasant affect (e.g., Walker, Vogl, & Thompson, 1997); this effect is referred to as the Fading Affect Bias (FAB; Walker, Skowronski, Gibbons, Vogl, & Thompson, 2003a). Research shows that the FAB is consistently related to positive/healthy outcomes at a general but not at a specific level of analysis based on event types and individual differences (e.g., Gibbons et al., 2013). Based on the positive outcomes for FAB and negative outcomes for social media (Bolton et al., 2013; Huang, 2010), the current study examined FAB in the context of social media events along with related individual differences. General positive outcomes were shown in the form of robust FAB effects across social media and non-social media events, a larger FAB for non-social media events than for social media events, negative correlations of FAB with depression, anxiety, and stress as well as a positive correlation of FAB with self-esteem. However, the lack of a negative correlation between FAB and anxiety for social media events in a 3-way interaction did not show positive outcomes at a specific level of analysis. Rehearsal ratings mediated the 3-way interaction. Implications are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.
Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Cui, Jonathan J; Basques, Bryce A; Albert, Todd J; Grauer, Jonathan N
2018-04-09
The presence of missing data is a limitation of large datasets, including the National Surgical Quality Improvement Program (NSQIP). In addressing this issue, most studies use complete case analysis, which excludes cases with missing data, thus potentially introducing selection bias. Multiple imputation, a statistically rigorous approach that approximates missing data and preserves sample size, may be an improvement over complete case analysis. The present study aims to evaluate the impact of using multiple imputation in comparison with complete case analysis for assessing the associations between preoperative laboratory values and adverse outcomes following anterior cervical discectomy and fusion (ACDF) procedures. This is a retrospective review of prospectively collected data. Patients undergoing one-level ACDF were identified in NSQIP 2012-2015. Perioperative adverse outcome variables assessed included the occurrence of any adverse event, severe adverse events, and hospital readmission. Missing preoperative albumin and hematocrit values were handled using complete case analysis and multiple imputation. These preoperative laboratory levels were then tested for associations with 30-day postoperative outcomes using logistic regression. A total of 11,999 patients were included. Of this cohort, 63.5% of patients had missing preoperative albumin and 9.9% had missing preoperative hematocrit. When using complete case analysis, only 4,311 patients were studied. The removed patients were significantly younger, healthier, of a common body mass index, and male. Logistic regression analysis failed to identify either preoperative hypoalbuminemia or preoperative anemia as significantly associated with adverse outcomes. When employing multiple imputation, all 11,999 patients were included. Preoperative hypoalbuminemia was significantly associated with the occurrence of any adverse event and severe adverse events. Preoperative anemia was significantly associated with the occurrence of any adverse event, severe adverse events, and hospital readmission. Multiple imputation is a rigorous statistical procedure that is being increasingly used to address missing values in large datasets. Using this technique for ACDF avoided the loss of cases that may have affected the representativeness and power of the study and led to different results than complete case analysis. Multiple imputation should be considered for future spine studies. Copyright © 2018 Elsevier Inc. All rights reserved.
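For readers unfamiliar with the technique, a minimal chained-equations sketch using statsmodels follows; the simulated laboratory values, missingness fractions, and outcome model are illustrative stand-ins for the NSQIP data, not a reproduction of the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.imputation import mice

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "albumin": rng.normal(4.0, 0.5, n),
    "hematocrit": rng.normal(41.0, 4.0, n),
})
# Simulated adverse-event outcome driven by hypoalbuminemia and anemia.
logit = -6 + 1.2 * (df.albumin < 3.5) + 0.9 * (df.hematocrit < 36)
df["adverse"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))
# Introduce missingness comparable to the proportions reported above.
df.loc[rng.random(n) < 0.63, "albumin"] = np.nan
df.loc[rng.random(n) < 0.10, "hematocrit"] = np.nan

# Chained-equations multiple imputation, pooling estimates across imputations.
imp = mice.MICEData(df)
fit = mice.MICE("adverse ~ albumin + hematocrit", sm.GLM, imp,
                init_kwds={"family": sm.families.Binomial()}).fit(10, 10)
print(fit.summary())
```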
Methodology for Collision Risk Assessment of an Airspace Flow Corridor Concept
NASA Astrophysics Data System (ADS)
Zhang, Yimin
This dissertation presents a methodology to estimate the collision risk associated with a future air-transportation concept called the flow corridor. The flow corridor is a Next Generation Air Transportation System (NextGen) concept to reduce congestion and increase throughput in en-route airspace. The flow corridor has the potential to increase throughput by reducing the controller workload required to manage aircraft outside the corridor and by reducing separation of aircraft within the corridor. The analysis in this dissertation is a starting point for the safety analysis required by the Federal Aviation Administration (FAA) to eventually approve and implement the corridor concept. This dissertation develops a hybrid risk analysis methodology that combines Monte Carlo simulation with dynamic event tree analysis. The analysis captures the unique characteristics of the flow corridor concept, including self-separation within the corridor, lane-change maneuvers, speed adjustments, and the automated separation assurance system. Monte Carlo simulation is used to model the movement of aircraft in the flow corridor and to identify precursor events that might lead to a collision. Since these precursor events are not rare, standard Monte Carlo simulation can be used to estimate their occurrence rates. Dynamic event trees are then used to model the subsequent series of events that may lead to a collision. When two aircraft are on course for a near-mid-air collision (NMAC), the on-board automated separation assurance system provides a series of safety layers to prevent the impending NMAC or collision. Dynamic event trees are used to evaluate the potential failures of these layers in order to estimate the rare-event collision probabilities. The results show that throughput can be increased by reducing separation to 2 nautical miles while maintaining the current level of safety. A sensitivity analysis shows that the parameters most critical to the overall collision probability are the minimum separation, the probability that both flights fail to respond to the traffic collision avoidance system, the probability that an NMAC results in a collision, the failure probability of the automatic dependent surveillance-broadcast receiver, and the conflict detection probability.
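A toy illustration of the hybrid structure described above: Monte Carlo for the non-rare precursors, then an event-tree product for the rare-event tail. Every rate, kinematic assumption, and branch probability below is a placeholder, not a value from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stage 1 (Monte Carlo): estimate the rate of conflict precursors, i.e. pairs
# of corridor aircraft converging below the separation minimum within an hour.
# Illustrative kinematics only: speeds in knots, separation in nautical miles.
n_hours = 100_000
sep_min = 2.0
closing = rng.normal(0.0, 8.0, n_hours)          # relative along-track speed
spacing = rng.uniform(sep_min, 20.0, n_hours)    # initial in-trail spacing
precursor = (closing > 0) & (spacing - closing < sep_min)
p_precursor = precursor.mean()

# Stage 2 (dynamic event tree): conditional failure probabilities of the
# safety layers between a precursor and an actual collision.
# All branch probabilities below are placeholders, not certified values.
p_asas_fails = 1e-3        # automated separation assurance fails to resolve
p_tcas_fails = 1e-2        # both flights fail to respond to TCAS
p_nmac_to_collision = 0.1  # an NMAC actually results in a collision

p_collision_per_hour = (p_precursor * p_asas_fails
                        * p_tcas_fails * p_nmac_to_collision)
print(f"precursor rate/hr: {p_precursor:.2e}, "
      f"collision prob/hr: {p_collision_per_hour:.2e}")
```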
Climate change impacts on extreme events in the United States: an uncertainty analysis
Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in mean changes. As a result, the uncertainty in future changes ...
"That in your hands". A comprehensive process analysis of a significant event in psychotherapy.
Elliott, R
1983-05-01
This article illustrates a new approach to the study of change processes in psychotherapy. The approach involves selecting significant change events and analyzing them according to the Comprehensive Process Model. In this model, client and therapist behaviors are analyzed for content, interpersonal action, style and response quality by using information derived from Interpersonal Process Recall, client and therapist objective process ratings and qualitative analyses. The event selected for analysis in this paper was rated by client and therapist as significantly helpful. The focal therapist response was a reflective-interpretive intervention in which the therapist collaboratively and evocatively expanded the client's implicit meanings. The event involved working through an earlier insight and realization of progress by the client. The event suggests an association between subjective "felt shifts" and public "process shifts" in client in-therapy behaviors. A model, consistent with Gendlin's experiential psychotherapy (1970), is offered to describe the change process which occurred in this event.
Weng, Shenglin; Li, Yiping; Wei, Jin; Du, Wei; Gao, Xiaomeng; Wang, Wencai; Wang, Jianwei; Acharya, Kumud; Luo, Liancong
2018-05-01
The identification of coherent structures is very important for investigating sediment transport mechanisms and controlling eutrophication in shallow lakes. This study analyzed the turbulence characteristics and the sensitivity of quadrant analysis to the threshold level. Simultaneous in situ measurements of velocities and suspended sediment concentration (SSC) were conducted in Lake Taihu with acoustic Doppler velocimeter (ADV) and optical backscatter sensor (OBS) instruments. The results show that increasing the hole size makes the difference between dominant and non-dominant events more distinct. Wind velocity determines the frequency of occurrence of sweep and ejection events, which provide the dominant contributions to the Reynolds stress. An increase in wind velocity enlarges the magnitude of coherent events but has little impact on event frequency at the same hole size. Events occurring within short periods provide large contributions to the momentum flux. Sediment transport and diffusion are, to a large extent, controlled by the intermittent coherent events.
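A minimal sketch of quadrant analysis with a hole-size threshold, for readers who want the mechanics: classify u'w' samples into ejections, sweeps, and the two interaction quadrants, discarding weak events inside the hole. The hole convention (relative to the mean |u'w'|) and the synthetic series are assumptions; other conventions use the product of the r.m.s. velocities.

```python
import numpy as np

def quadrant_fractions(u, w, hole=0):
    """Fractional contribution of each quadrant to the total momentum flux,
    excluding events inside the hole |u'w'| <= hole * |mean(u'w')|."""
    up, wp = u - u.mean(), w - w.mean()
    uw = up * wp
    keep = np.abs(uw) > hole * np.abs(uw.mean())
    quadrants = {
        "Q1 outward":  (up > 0) & (wp > 0) & keep,
        "Q2 ejection": (up < 0) & (wp > 0) & keep,
        "Q3 inward":   (up < 0) & (wp < 0) & keep,
        "Q4 sweep":    (up > 0) & (wp < 0) & keep,
    }
    total = uw.sum()
    return {name: uw[mask].sum() / total for name, mask in quadrants.items()}

rng = np.random.default_rng(3)
u = rng.normal(0, 1, 20_000)
w = -0.4 * u + rng.normal(0, 0.9, 20_000)   # negatively correlated, as in shear flow
for hole in (0, 2, 4):                       # larger holes isolate strong events
    print(hole, {k: round(v, 2) for k, v in quadrant_fractions(u, w, hole).items()})
```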
Re-presentations of space in Hollywood movies: an event-indexing analysis.
Cutting, James; Iricinschi, Catalina
2015-03-01
Popular movies present chunk-like events (scenes and subscenes) that promote episodic, serial updating of viewers' representations of the ongoing narrative. Event-indexing theory would suggest that the beginnings of new scenes trigger these updates, which in turn require more cognitive processing. Typically, a new movie event is signaled by an establishing shot, one providing more background information and a longer look than the average shot. Our analysis of 24 films reconfirms this. More important, we show that, when returning to a previously shown location, the re-establishing shot reduces both context and duration while remaining greater than the average shot. In general, location shifts dominate character and time shifts in event segmentation of movies. In addition, over the last 70 years re-establishing shots have become more like the noninitial shots of a scene. Establishing shots have also approached noninitial shot scales, but not their durations. Such results suggest that film form is evolving, perhaps to suit more rapid encoding of narrative events. Copyright © 2014 Cognitive Science Society, Inc.
Insertion of coherence requests for debugging a multiprocessor
Blumrich, Matthias A.; Salapura, Valentina
2010-02-23
A method and system are disclosed to insert coherence events in a multiprocessor computer system, and to present those coherence events to the processors of the multiprocessor computer system for analysis and debugging purposes. The coherence events are inserted in the computer system by adding one or more special insert registers. By writing into the insert registers, coherence events are inserted in the multiprocessor system as if they were generated by the normal coherence protocol. Once these coherence events are processed, the processing of coherence events can continue in the normal operation mode.
Automatic event recognition and anomaly detection with attribute grammar by learning scene semantics
NASA Astrophysics Data System (ADS)
Qi, Lin; Yao, Zhenyu; Li, Li; Dong, Junyu
2007-11-01
In this paper we present a novel framework for automatic event recognition and abnormal behavior detection with attribute grammars by learning scene semantics. This framework combines learning scene semantics through trajectory analysis with constructing attribute-grammar-based event representations. The scene and event information is learned automatically. Abnormal behaviors that disobey scene semantics or event grammar rules are detected. By this method, an approach to understanding video scenes is achieved. Furthermore, with this prior knowledge, the accuracy of abnormal event detection is increased.
Uchiyama, Shinichiro; Demaerschalk, Bart M; Goto, Shinya; Shinohara, Yukito; Gotoh, Fumio; Stone, William M; Money, Samuel R; Kwon, Sun Uck
2009-01-01
Cilostazol is an antiplatelet agent that inhibits phosphodiesterase III in platelets and vascular endothelium. Previous randomized controlled trials of cilostazol for prevention of cerebrovascular events have garnered mixed results. We performed a systematic review and meta-analysis of the randomized clinical trials in patients with atherothrombotic diseases to determine the effects of cilostazol on cerebrovascular, cardiac, and all vascular events, and on all major hemorrhagic events. Relevant trials were identified by searching MEDLINE, EMBASE, and the Cochrane Controlled Trial Registry for titles and abstracts. Data from 12 randomized controlled trials, involving 5674 patients, were analyzed for end points of cerebrovascular, cardiac, and major bleeding events. Searching, determination of eligibility, data extraction, and meta-analyses were conducted by multiple independent investigators. Data were available in 3782, 1187, and 705 patients with peripheral arterial disease, cerebrovascular disease, and coronary stenting, respectively. Incidence of total vascular events was significantly lower in the cilostazol group compared with the placebo group (relative risk [RR], 0.86; 95% confidence interval [CI], 0.74-0.99; P=.038). This was particularly influenced by a significant decrease of incidence of cerebrovascular events in the cilostazol group (RR, 0.58; 95% CI, 0.43-0.78; P < .001). There was no significant intergroup difference in incidence of cardiac events (RR, 0.99; 95% CI, 0.83-1.17; P=.908) and serious bleeding complications (RR, 1.00; 95% CI, 0.66-1.51; P=.996). This first meta-analysis of cilostazol in patients with atherothrombosis demonstrated a significant risk reduction for cerebrovascular events, with no associated increase of bleeding risk.
Vu, Duy; Lomi, Alessandro; Mascia, Daniele; Pallotti, Francesca
2017-06-30
The main objective of this paper is to introduce and illustrate relational event models, a new class of statistical models for the analysis of time-stamped data with complex temporal and relational dependencies. We outline the main differences between recently proposed relational event models and more conventional network models based on the graph-theoretic formalism typically adopted in empirical studies of social networks. Our main contribution involves the definition and implementation of a marked point process extension of currently available models. According to this approach, the sequence of events of interest is decomposed into two components: (a) event time and (b) event destination. This decomposition transforms the problem of selecting the event destination in relational event models into a conditional multinomial logistic regression problem. The main advantages of this formulation are the possibility of controlling for the effect of event-specific data and a significant reduction in the estimation time of currently available relational event models. We demonstrate the empirical value of the model in an analysis of interhospital patient transfers within a regional community of health care organizations. We conclude with a discussion of how the models we presented help to overcome some of the limitations of statistical models for networks that are currently available. Copyright © 2017 John Wiley & Sons, Ltd.
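The destination step reduces to a conditional multinomial logit, which can be sketched in a few lines; the dyadic covariates, candidate-set size, and effect sizes below are simulated, not the patient-transfer data.

```python
import numpy as np
from scipy.optimize import minimize

# Conditional (multinomial) logit for event destination: for each event, the
# sender chooses one receiver among k candidates, with utility a linear
# function of dyadic statistics (e.g. inertia, reciprocity) -- simulated here.
rng = np.random.default_rng(7)
n_events, k, p = 500, 10, 2
X = rng.normal(size=(n_events, k, p))        # dyadic covariates per candidate
beta_true = np.array([1.0, -0.5])
util = X @ beta_true
prob = np.exp(util) / np.exp(util).sum(1, keepdims=True)
y = np.array([rng.choice(k, p=pr) for pr in prob])   # chosen destinations

def neg_loglik(beta):
    # Conditional-logit log-likelihood: chosen utility minus log-sum-exp
    # over the candidate set of each event.
    u = X @ beta
    return -(u[np.arange(n_events), y] - np.log(np.exp(u).sum(1))).sum()

fit = minimize(neg_loglik, np.zeros(p), method="BFGS")
print("estimated effects:", fit.x.round(2))   # should be near [1.0, -0.5]
```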
Creating Reality: How TV News Distorts Events.
ERIC Educational Resources Information Center
Altheide, David L.
A three-year research project, including more than one year in a network affiliate station, provided the material for an analysis of current practices in television news programming. Based on the thesis that the organization of news encourages the oversimplification of events, this analysis traces the foundation of the bias called the "news…
Service-Learning and Graduation: Evidence from Event History Analysis
ERIC Educational Resources Information Center
Yue, Hongtao; Hart, Steven M.
2017-01-01
This research employed Event History Analysis to understand how service-learning participation is related to students' graduation within six years. The longitudinal dataset includes 31,074 new undergraduate students who enrolled in a large western U.S. public university from Fall 2002 to Fall 2009. The study revealed that service-learning…
Data Albums: An Event Driven Search, Aggregation and Curation Tool for Earth Science
NASA Technical Reports Server (NTRS)
Ramachandran, Rahul; Kulkarni, Ajinkya; Maskey, Manil; Bakare, Rohan; Basyal, Sabin; Li, Xiang; Flynn, Shannon
2014-01-01
Approaches used in Earth science research, such as case study analysis and climatology studies, involve discovering and gathering diverse data sets and information to support the research goals. Gathering relevant data and information for case studies and climatology analysis is both tedious and time consuming. Current Earth science data systems are designed with the assumption that researchers access data primarily by instrument or geophysical parameter. In cases where researchers are interested in studying a significant event, they have to manually assemble a variety of datasets relevant to it by searching the different distributed data systems. This paper presents a specialized search, aggregation and curation tool for Earth science to address these challenges. The search tool automatically creates curated 'Data Albums', aggregated collections of information related to a specific event, containing links to relevant data files (granules) from different instruments, tools and services for visualization and analysis, and information about the event contained in news reports, images or videos to supplement research analysis. Curation in the tool is driven by an ontology-based relevancy ranking algorithm that filters out non-relevant information and data.
NASA Astrophysics Data System (ADS)
Gómez, Wilmar
2017-04-01
By analyzing the spatial and temporal variability of extreme precipitation events we can help prevent or reduce the associated threat and risk. Many water resources projects require joint probability distributions of random variables such as precipitation intensity and duration, which cannot be assumed independent of each other. The problem of defining a probability model for observations of several dependent variables is greatly simplified by expressing the joint distribution in terms of its marginals by means of copulas. This document presents a general framework for bivariate and multivariate frequency analysis using Archimedean copulas, applied to extreme events of a hydroclimatological nature such as severe storms. The analysis was conducted in the lower Tunjuelo River basin in Colombia for precipitation events. The results show that, for a joint study of intensity-duration-frequency, IDF curves can be obtained through copulas, yielding more accurate and reliable information on design storms and associated risks. It also shows how the use of copulas greatly simplifies the study of multivariate distributions and introduces the concept of the joint return period, used to properly represent the needs of hydrological design in frequency analysis.
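To make the joint-return-period concept concrete, here is a small sketch with a Gumbel (Archimedean) copula; the marginal probabilities, dependence parameter, and mean inter-arrival time are illustrative, not values fitted to the Tunjuelo basin.

```python
import numpy as np

def gumbel_copula(u, v, theta):
    """Gumbel (Archimedean) copula C(u, v); theta >= 1 controls dependence."""
    return np.exp(-((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1 / theta))

# Marginal non-exceedance probabilities of a design storm's intensity and
# duration (values are illustrative only).
u, v, theta = 0.98, 0.95, 2.0
mu = 1.0                     # mean inter-arrival time of events, in years

# "AND" joint return period: both intensity AND duration are exceeded.
t_and = mu / (1 - u - v + gumbel_copula(u, v, theta))
# "OR" joint return period: at least one of the two is exceeded.
t_or = mu / (1 - gumbel_copula(u, v, theta))
print(f"T_and = {t_and:.0f} yr, T_or = {t_or:.0f} yr")
```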
NASA Technical Reports Server (NTRS)
Hejduk, M. D.
2016-01-01
Provide a response to MOWG action item 1410-01: analyze close approaches which have required mission team action on short notice, and determine why these approaches were identified later in the process than most other events. Method: performed an analysis to determine whether there is any correlation between late-notice event identification and space weather, sparse tracking, or high-drag objects, which would allow preventive action to be taken; examined specific late-notice events identified by missions as problematic to try to identify root causes and attempted to relate them to the correlation analysis.
Pealing, Louise; Perel, Pablo; Prieto-Merino, David; Roberts, Ian
2012-01-01
Background Vascular occlusive events can complicate recovery following trauma. We examined risk factors for venous and arterial vascular occlusive events in trauma patients and the extent to which the risk of vascular occlusive events varies with the severity of bleeding. Methods and Findings We conducted a cohort analysis using data from a large international, double-blind, randomised, placebo-controlled trial (The CRASH-2 trial) [1]. We studied the association between patient demographic and physiological parameters at hospital admission and the risk of vascular occlusive events. To assess the extent to which risk of vascular occlusive events varies with severity of bleeding, we constructed a prognostic model for the risk of death due to bleeding and assessed the relationship between risk of death due to bleeding and risk of vascular occlusive events. There were 20,127 trauma patients with outcome data including 204 (1.01%) patients with a venous event (pulmonary embolism or deep vein thrombosis) and 200 (0.99%) with an arterial event (myocardial infarction or stroke). There were 81 deaths due to vascular occlusive events. Increasing age, decreasing systolic blood pressure, increased respiratory rates, longer central capillary refill times, higher heart rates and lower Glasgow Coma Scores (all p<0.02) were strong risk factors for venous and arterial vascular occlusive events. Patients with more severe bleeding as assessed by predicted risk of haemorrhage death had a greatly increased risk for all types of vascular occlusive event (all p<0.001). Conclusions Patients with severe traumatic bleeding are at greatly increased risk of venous and arterial vascular occlusive events. Older age and blunt trauma are also risk factors for vascular occlusive events. Effective treatment of bleeding may reduce venous and arterial vascular occlusive complications in trauma patients. PMID:23251374
Pertinent anatomy and analysis for midface volumizing procedures.
Surek, Christopher C; Beut, Javier; Stephens, Robert; Jelks, Glenn; Lamb, Jerome
2015-05-01
The study was conducted to construct an anatomically inspired midfacial analysis facilitating safe, accurate, and dynamic nonsurgical rejuvenation. Emphasis is placed on determining injection target areas and adverse event zones. Twelve hemifacial fresh cadavers were dissected in a layered fashion. Dimensional measurements between the midfacial fat compartments, prezygomatic space, mimetic muscles, and neurovascular bundles were used to develop a topographic analysis for clinical injections. A longitudinal line from the base of the alar crease to the medial edge of the levator anguli oris muscle (1.9 cm), lateral edge of the levator anguli oris muscle (2.6 cm), and zygomaticus major muscle (4.6 cm) partitions the cheek into two aesthetic regions. A six-step facial analysis outlines three target zones and two adverse event zones and triangulates the point of maximum cheek projection. The lower adverse event zone yields an anatomical explanation to inadvertent jowling during anterior cheek injection. The upper adverse event zone localizes the palpebral branch of the infraorbital artery. The medial malar target area isolates quadrants for anterior cheek projection and tear trough effacement. The middle malar target area addresses lid-cheek blending and superficial compartment turgor. The lateral malar target area highlights lateral cheek projection and locates the prezygomatic space. This stepwise analysis illustrates target areas and adverse event zones to achieve midfacial support, contour, and profile in the repose position and simultaneous molding of a natural shape during animation. This reproducible method can be used both procedurally and in record-keeping for midface volumizing procedures.
Statistical process control methods allow the analysis and improvement of anesthesia care.
Fasting, Sigurd; Gisvold, Sven E
2003-10-01
Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
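The p-chart logic referred to above is straightforward to sketch: a centre line at the overall adverse-event proportion and three-sigma limits that vary with the monthly caseload. The monthly counts below are hypothetical, not the five-year dataset described in the abstract.

```python
import numpy as np

def p_chart_limits(events, n_cases):
    """Three-sigma control limits for a proportion (p-chart)."""
    events, n_cases = np.asarray(events, float), np.asarray(n_cases, float)
    p_bar = events.sum() / n_cases.sum()            # centre line
    sigma = np.sqrt(p_bar * (1 - p_bar) / n_cases)  # varies with subgroup size
    ucl = p_bar + 3 * sigma
    lcl = np.clip(p_bar - 3 * sigma, 0, None)
    rate = events / n_cases
    out_of_control = (rate > ucl) | (rate < lcl)
    return p_bar, ucl, lcl, out_of_control

# Hypothetical monthly counts: difficult-emergence events / anesthetics given.
events = [14, 11, 17, 9, 30, 12]
cases = [900, 850, 1000, 800, 950, 900]
p_bar, ucl, lcl, flag = p_chart_limits(events, cases)
print(f"centre line {p_bar:.3f}; months out of control: {np.where(flag)[0]}")
```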
NASA Astrophysics Data System (ADS)
Hu, H.; Ge, Y. J.
2013-11-01
As social networking and the socialisation of the web have brought more text information and social relationships into our daily lives, the question of whether big data can be fully used to study phenomena in the natural sciences has prompted many specialists and scholars to innovate in their research. Although politics has been integrally involved in hyperlinked-world issues since the 1990s, and the automatic assembly of geospatial web services and distributed geospatial information systems using service chaining has recently been explored and built, information collection and data visualisation for geo-events have always faced the bottleneck of traditional manual analysis because of the sensitivity, complexity, relativity, timeliness and unexpectedness of political events. Based on the Heritrix framework and the analysis of web-based text, the word frequency, sentiment tendency and dissemination path of the Huangyan Island incident are studied here by combining web-crawler technology with text-analysis methods. The results indicate that the tag cloud, frequency map, attitude pie chart, individual mention ratios and dissemination flow graph based on the collected and processed data not only highlight the subject and theme vocabularies of related topics but also reveal certain issues and problems behind them. Being able to express the time-space relationship of text information and to disseminate information regarding geo-events, the text analysis of network information based on focused web-crawler technology can be a tool for understanding the formation and diffusion of web-based public opinion on political events.
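The word-frequency step can be sketched in a few lines of Python; the sample page texts and stop-word list are placeholders for the crawled corpus.

```python
from collections import Counter
import re

# Minimal sketch of the word-frequency step: tokenize crawled page text and
# count topic terms. Sample texts and stop words are placeholders.
pages = [
    "Huangyan Island incident draws wide attention on the web",
    "web users debate the Huangyan Island standoff",
]
stopwords = {"the", "on", "of", "a", "an", "and"}
tokens = (w for page in pages for w in re.findall(r"[a-z]+", page.lower()))
freq = Counter(w for w in tokens if w not in stopwords)
print(freq.most_common(5))   # feeds the tag cloud and frequency map
```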
Reducing Information Overload in Large Seismic Data Sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
HAMPTON,JEFFERY W.; YOUNG,CHRISTOPHER J.; MERCHANT,BION J.
2000-08-02
Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to the data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers to make better use of large data sets. In this paper, the authors present their effortsmore » to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool and (2) some content analysis tools, which area combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g. single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important to bridge gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site. As part of the research into how to winnow the reference events in these large reconciled event sets, additional database query approaches have been developed to provide windows into these datasets. These custom built content analysis tools help identify dataset characteristics that can potentially aid in providing a basis for comparing similar reference events in these large reconciled event sets. Once these characteristics can be identified, algorithms can be developed to create and add to the reduced set of events used by the Event Search Engine. These content analysis tools have already been useful in providing information on station coverage of the referenced events and basic statistical, information on events in the research datasets. The tools can also provide researchers with a quick way to find interesting and useful events within the research datasets. The tools could also be used as a means to review reference event datasets as part of a dataset delivery verification process. There has also been an effort to explore the usefulness of commercially available web-based software to help with this problem. The advantages of using off-the-shelf software applications, such as Oracle's WebDB, to manipulate, customize and manage research data are being investigated. These types of applications are being examined to provide access to large integrated data sets for regional seismic research in Asia. 
All of these software tools would provide the researcher with unprecedented power without having to learn the intricacies and complexities of relational database systems.
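To make the clustering step concrete, here is a minimal sketch, assuming synthetic traces and illustrative parameters rather than MatSeis itself, of the dendrogram workflow: correlate preprocessed waveforms, turn similarity into distance, and cut the resulting tree.

```python
# Minimal sketch of correlation-based hierarchical clustering of waveforms.
# Traces and the cut threshold are synthetic/illustrative, not MatSeis values.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def cluster_waveforms(waveforms, method="single", cut=0.3):
    """waveforms: (n_events, n_samples) array of windowed, filtered traces."""
    corr = np.corrcoef(waveforms)                 # pairwise waveform correlation
    dist = squareform(1.0 - corr, checks=False)   # condensed distance matrix
    tree = linkage(dist, method=method)           # e.g. single or complete linkage
    return fcluster(tree, t=cut, criterion="distance")

rng = np.random.default_rng(0)
traces = rng.standard_normal((20, 500))           # stand-in for real event windows
print(cluster_waveforms(traces, method="complete"))
```

Clustering on two or more stations simultaneously, as the tool supports, would amount to combining per-station correlation matrices before the linkage step.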
SNIa detection in the SNLS photometric analysis using Morphological Component Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Möller, A.; Ruhlmann-Kleider, V.; Neveu, J.
2015-04-01
Detection of supernovae (SNe) and, more generally, of transient events in large surveys can produce numerous false detections. In the case of a deferred processing of survey images, this implies reconstructing complete light curves for all detections, requiring sizable processing time and resources. Optimizing the detection of transient events is thus an important issue for both present and future surveys. We present here the optimization done in the SuperNova Legacy Survey (SNLS) for the 5-year deferred photometric analysis. In this analysis, detections are derived from stacks of subtracted images with one stack per lunation. The 3-year analysis provided 300,000 detections dominated by signals of bright objects that were not perfectly subtracted. Allowing these artifacts to be detected leads not only to a waste of resources but also to possible signal coordinate contamination. We developed a subtracted image stack treatment to reduce the number of non-SN-like events using morphological component analysis. This technique exploits the morphological diversity of objects to be detected to extract the signal of interest. At the level of our subtraction stacks, SN-like events are rather circular objects while most spurious detections exhibit different shapes. A two-step procedure was necessary to have a proper evaluation of the noise in the subtracted image stacks and thus a reliable signal extraction. We also set up a new detection strategy to obtain coordinates with good resolution for the extracted signal. SNIa Monte Carlo (MC) generated images were used to study detection efficiency and coordinate resolution. When tested on SNLS 3-year data, this procedure decreases the number of detections by a factor of two, while losing only 10% of SN-like events, almost all faint ones. MC results show that SNIa detection efficiency is equivalent to that of the original method for bright events, while the coordinate resolution is improved.
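The morphological component analysis itself is involved, but the underlying idea, keeping roughly circular SN-like blobs while rejecting differently shaped artifacts, can be illustrated with a simple, hypothetical shape screen; the threshold and circularity cut below are invented, not SNLS pipeline values.

```python
# Hypothetical shape-based screening of detections: keep circular blobs,
# reject elongated artifacts. A stand-in for the paper's MCA, not the pipeline.
import numpy as np
from skimage import measure

def screen_detections(stack, threshold, min_circularity=0.6):
    """stack: 2-D subtracted image stack; returns centroids of circular blobs."""
    labels = measure.label(stack > threshold)
    kept = []
    for region in measure.regionprops(labels):
        if region.perimeter == 0:                    # single-pixel blob
            continue
        circularity = 4.0 * np.pi * region.area / region.perimeter ** 2
        if circularity >= min_circularity:           # 1.0 for a perfect disc
            kept.append(region.centroid)
    return kept

rng = np.random.default_rng(1)
image = rng.normal(0.0, 1.0, (256, 256))
image[100:106, 100:106] += 8.0                       # a compact, SN-like spot
print(screen_detections(image, threshold=4.0))
```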
NASA Astrophysics Data System (ADS)
Pasten, D.; Comte, D.; Vallejos, J.
2013-05-01
During the last decades, several authors have shown that the spatial distribution of earthquakes follows multifractal laws, and that the most interesting behavior is the decrease of the fractal dimensions before the occurrence of a large earthquake, and also before its main aftershocks. A multifractal analysis was applied to over 55,920 microseismic events recorded from January 2006 to January 2009 at Creighton mine, Canada. In order to work with a catalogue complete in magnitude, the data associated with the linear part of the Gutenberg-Richter law were used, with magnitudes greater than -1.5. A multifractal analysis was performed using the microseismic data, considering that significant events are those with magnitude MW ≥ 1.0. A moving window containing a constant number of events was used in order to guarantee precise estimation of the fractal dimensions. After different trials, we chose 200 events as the number of data points in each window. Two consecutive windows were shifted by 20 events. The complete data set was separated into six sections, and the multifractal analysis was applied to each section of 9,320 events. The multifractal analysis of each section shows that there is a systematic decrease of the fractal dimension (Dq) with time before the occurrence of a rockburst or natural event with magnitude MW ≥ 1.0, as is observed in the seismic sequences of large earthquakes. This methodology was repeated for minimum magnitudes MW ≥ 1.5 and MW ≥ 2.0, obtaining the same results. The best result was obtained using MW ≥ 2.0, with a success rate varying between fifty and eighty percent. The results show the possibility of systematically using the determination of the Dq parameter to anticipate the next rockburst or natural event in the studied mine. This project has been financially supported by FONDECyT Grant No. 3120237 (D.P).
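A hedged sketch of the moving-window estimate follows, using plain box counting for the generalized dimension Dq; the epicentre coordinates are synthetic, and only the window sizes (200 events, step 20, sections of 9,320) follow the abstract.

```python
# Box-counting estimate of the generalized fractal dimension Dq over a moving
# window of events. Coordinates are synthetic; window sizes follow the abstract.
import numpy as np

def dq_box_counting(points, q=2.0, eps_list=(0.05, 0.1, 0.2, 0.4)):
    """points: (n, 2) coordinates normalized to the unit square."""
    logs_eps, logs_m = [], []
    for eps in eps_list:
        boxes = np.floor(points / eps).astype(int)
        _, counts = np.unique(boxes, axis=0, return_counts=True)
        p = counts / counts.sum()
        logs_eps.append(np.log(eps))
        logs_m.append(np.log(np.sum(p ** q)))
    slope, _ = np.polyfit(logs_eps, logs_m, 1)       # log-log regression
    return slope / (q - 1.0)

def moving_dq(points, window=200, step=20, q=2.0):
    return [dq_box_counting(points[i:i + window], q)
            for i in range(0, len(points) - window + 1, step)]

rng = np.random.default_rng(2)
epicentres = rng.random((9320, 2))    # stand-in for one catalogue section
dq_series = moving_dq(epicentres)     # look for a systematic decrease in Dq
```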
Using the DOE Knowledge Base for Special Event Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Armstrong, H.M.; Harris, J.M.; Young, C.J.
1998-10-20
The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g., mines, volcanoes), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified either by spatial proximity searches or through waveform correlation processing. The locations and waveforms of these events can then be made available for side-by-side comparison and processing. If synthetic modeling is thought to be warranted, a wide variety of relevant contextual information (e.g., crustal thickness and layering, seismic velocities, attenuation factors) can be retrieved and sent to the appropriate applications. Once formed, the synthetics can then be brought in for side-by-side comparison and further processing. Based on our study, we make two general recommendations. First, proper inter-process communication between sensor data analysis software and contextual data analysis software should be developed. Second, some of the Knowledge Base data sets should be prioritized or winnowed to streamline comparison with observed quantities.
Development and assessment of stressful life events subscales - A preliminary analysis.
Buccheri, Teresa; Musaad, Salma; Bost, Kelly K; Fiese, Barbara H
2018-01-15
Stress affects people of all ages, genders, and cultures and is associated with physical and psychological complications. Stressful life events are an important research focus, and a psychometrically valid measure could provide useful clinical information. The purpose of the study was to develop a reliable and valid measurement of stressful life events and to assess its reliability and validity using established measures of social support, stress, depression, anxiety, and maternal and child health. The authors used an adaptation of the Social Readjustment Rating Scale (SRRS) to describe the prevalence of life events; they developed 4-factor stressful life events subscales and used the Medical Outcomes Social Support Scale, the Social Support Scale, the Depression, Anxiety and Stress Scale, and 14 general health items for validity analysis. Analyses were performed with descriptive statistics, Cronbach's alpha, Spearman's rho, the Chi-square test or Fisher's exact test, and the Wilcoxon 2-sample test. The 4-factor stressful life events subscales showed acceptable reliability. The resulting subscale scores were significantly associated with established measures of social support, depression, anxiety, stress, and caregiver health indicators. The study presented a number of limitations in terms of design and recall bias. Despite these limitations, the study provided valuable insight and suggested that further investigation is needed to determine the effectiveness of the measures in revealing family wellbeing and to develop and strengthen a more detailed analysis of the stressful life events/health association. Copyright © 2017 Elsevier B.V. All rights reserved.
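For instance, the reliability check reported here reduces to Cronbach's alpha per subscale; a minimal sketch with invented item scores:

```python
# Cronbach's alpha for one subscale; respondent-by-item scores are invented.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_var_sum / total_var)

rng = np.random.default_rng(3)
scores = rng.integers(0, 5, size=(120, 6))   # 120 respondents, 6 items
print(f"alpha = {cronbach_alpha(scores):.2f}")
```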
Detection and analysis of a transient energy burst with beamforming of multiple teleseismic phases
NASA Astrophysics Data System (ADS)
Retailleau, Lise; Landès, Matthieu; Gualtieri, Lucia; Shapiro, Nikolai M.; Campillo, Michel; Roux, Philippe; Guilbert, Jocelyn
2018-01-01
Seismological detection methods are traditionally based on picking techniques. These methods cannot be used to analyse emergent signals whose arrivals cannot be picked. Here, we detect and locate seismic events by applying a beamforming method that combines multiple body-wave phases to USArray data. This method exploits the consistency and characteristic behaviour of teleseismic body waves recorded by a large-scale, yet dense, seismic network. We perform time-slowness analysis of the signals and correlate this with the time-slowness equivalent of the different body-wave phases predicted by a global traveltime calculator, to determine the occurrence of an event with no a priori information about it. We apply this method continuously to one year of data to analyse the different events that generate signals reaching the USArray network. In particular, we analyse in detail a low-frequency secondary microseismic event that occurred on 2010 February 1. This event, which lasted one day, had a narrow frequency band around 0.1 Hz and occurred at a distance of 150° from the USArray network, south of Australia. We show that the most energetic phase observed is the PKPab phase. Direct amplitude analysis of regional seismograms confirms the occurrence of this event. We compare the seismic observations with models of the spectral density of the pressure field generated by the interference between oceanic waves. We attribute the observed signals to a storm-generated microseismic event that occurred along the South East Indian Ridge.
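The core of the method, scanning a time-slowness grid for coherent energy, can be sketched as plane-wave delay-and-sum beamforming; the array geometry, data and grids below are synthetic, not USArray parameters.

```python
# Hedged sketch of delay-and-sum beamforming over a slowness/azimuth grid.
# Geometry and traces are synthetic; maxima mark coherent plane-wave arrivals.
import numpy as np

def beamform(traces, coords, dt, slowness_grid, azimuth_grid):
    """traces: (n_sta, n_samples); coords: (n_sta, 2) in km; slowness in s/km."""
    n_sta, n_samp = traces.shape
    power = np.zeros((len(slowness_grid), len(azimuth_grid)))
    for i, s in enumerate(slowness_grid):
        for j, az in enumerate(azimuth_grid):
            sx, sy = s * np.sin(az), s * np.cos(az)
            beam = np.zeros(n_samp)
            for k in range(n_sta):
                delay = int(round((coords[k, 0] * sx + coords[k, 1] * sy) / dt))
                beam += np.roll(traces[k], -delay)   # align and stack
            power[i, j] = np.mean((beam / n_sta) ** 2)
    return power

rng = np.random.default_rng(4)
power = beamform(rng.standard_normal((8, 3000)), rng.uniform(-100, 100, (8, 2)),
                 dt=0.05, slowness_grid=np.linspace(0.01, 0.1, 10),
                 azimuth_grid=np.radians(np.arange(0.0, 360.0, 30.0)))
```

Matching the resulting power maxima against predicted time-slowness curves for candidate phases (PKPab, etc.) is what lets the method identify events without a priori information.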
Zhao, Kun; Yang, Ming; Lu, Yanxia; Sun, Shusen; Li, Wei; Li, Xingang; Zhao, Zhigang
2018-05-23
Some studies have reported an association between P2Y12 gene polymorphisms and clopidogrel adverse outcomes, with inconsistent results. We aimed to explore the relationship between P2Y12 polymorphisms and the risk of adverse clinical events in patients treated with clopidogrel through a meta-analysis. A systematic search of PubMed, Web of Science and the Cochrane Library was conducted. Retrieved articles were comprehensively reviewed, eligible studies were included, and the relevant data were extracted for this meta-analysis. All statistical tests were performed with the Review Manager 5.3 software. A total of 14 studies involving 8,698 patients were included. In the Han Chinese population, ischemic events were associated with the P2Y12 T744C polymorphism in the CC vs TT+CT genetic model (OR=3.32, 95%CI=1.62-6.82, P=0.001), and with the P2Y12 C34T polymorphism in the TT+TC vs CC genetic model (OR=1.70, 95%CI=1.22-2.36, P=0.002). However, ischemic events were not related to the P2Y12 G52T polymorphism (TT+TG vs GG: OR=1.13, 95%CI=0.76-1.68, P=0.56; TT vs GG+TG: OR=2.02, 95%CI=0.65-6.28, P=0.22). The associations between the P2Y12 polymorphisms and ischemic events were not significant for the T744C, G52T and C34T genotypes in the Caucasian population subgroup (P>0.05). Only two studies referring to bleeding events were included in the analysis of the C34T polymorphism, and no significant association was found (TT+TC vs CC: OR=1.07, 95%CI=0.37-3.15, P=0.90). In the Caucasian population, P2Y12 gene polymorphisms are not associated with clinical events. However, in the Chinese Han population, the P2Y12 T744C and C34T polymorphisms are significantly associated with adverse clinical events. © Georg Thieme Verlag KG Stuttgart · New York.
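For context, the pooling behind figures like OR=3.32 (95%CI 1.62-6.82) is typically inverse-variance weighting on the log-odds scale; a hedged sketch with invented study values:

```python
# Fixed-effect inverse-variance pooling of odds ratios; study ORs and CIs are
# invented, not the values from the 14 studies analysed above.
import numpy as np

def pool_odds_ratios(ors, ci_low, ci_high):
    log_or = np.log(ors)
    se = (np.log(ci_high) - np.log(ci_low)) / (2.0 * 1.96)  # SE from 95% CI
    w = 1.0 / se ** 2
    pooled = np.sum(w * log_or) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return tuple(np.exp([pooled, pooled - 1.96 * pooled_se,
                         pooled + 1.96 * pooled_se]))

or_hat, lo, hi = pool_odds_ratios(np.array([1.7, 2.1, 1.4]),
                                  np.array([1.1, 1.2, 0.9]),
                                  np.array([2.6, 3.7, 2.2]))
print(f"pooled OR = {or_hat:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```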
Walton, Merrilyn; Smith-Merry, Jennifer; Harrison, Reema; Manias, Elizabeth; Iedema, Rick; Kelly, Patrick
2014-01-01
Introduction: Evidence of patients' experiences is fundamental to creating effective health policy and service responses, yet is missing from our knowledge of adverse events. This protocol describes explorative research redressing this significant deficit; investigating the experiences of a large cohort of recently hospitalised patients aged 45 years and above in hospitals in New South Wales (NSW), Australia. Methods and analysis: The 45 and Up Study is a cohort of 265,000 adults aged 45 years and above in NSW. Patients who were hospitalised between 1 January and 30 June 2014 will be identified from this cohort using data linkage and a random sample of 20,000 invited to participate. A cross-sectional survey (including qualitative and quantitative components) will capture patients' experiences in hospital and specifically of adverse events. Approximately 25% of respondents are likely to report experiencing an adverse event. Quantitative components will capture the nature and type of events as well as common features of patients' experiences. Qualitative data provide contextual knowledge of their condition and care and the impact of the event on individuals. Respondents who do not report an adverse event will report their experience in hospital and be the control group. Statistical and thematic analysis will be used to present a patient perspective of their experiences in hospital; the characteristics of patients experiencing an adverse event; experiences of information sharing after an event (open disclosure) and the other avenues of redress pursued. Interviews with key policymakers and a document analysis will be used to create a map of current practice. Ethics and dissemination: Dissemination via a one-day workshop, peer-reviewed publications and conference presentations will enable effective clinical responses and service provision and policy responses to adverse events to be developed. PMID:25311039
Initiating Event Analysis of a Lithium Fluoride Thorium Reactor
NASA Astrophysics Data System (ADS)
Geraci, Nicholas Charles
The primary purpose of this study is to perform an Initiating Event Analysis for a Lithium Fluoride Thorium Reactor (LFTR) as the first step of a Probabilistic Safety Assessment (PSA). The major objective of the research is to compile a list of key initiating events capable of resulting in failure of safety systems and release of radioactive material from the LFTR. Due to the complex interactions between engineering design, component reliability and human reliability, probabilistic safety assessments are most useful when the scope is limited to a single reactor plant. Thus, this thesis will study the LFTR design proposed by Flibe Energy. An October 2015 Electric Power Research Institute report on the Flibe Energy LFTR asked "what-if?" questions of subject matter experts and compiled a list of key hazards with the most significant consequences to the safety or integrity of the LFTR. The potential exists for unforeseen hazards to pose additional risk for the LFTR, but the scope of this thesis is limited to evaluation of those key hazards already identified by Flibe Energy. These key hazards are the starting point for the Initiating Event Analysis performed in this thesis. Engineering evaluation and technical study of the plant using a literature review and comparison to reference technology revealed four hazards with high potential to cause reactor core damage. To determine the initiating events resulting in realization of these four hazards, reference was made to previous PSAs and existing NRC and EPRI initiating event lists. Finally, fault tree and event tree analyses were conducted, completing the logical classification of initiating events. Results are qualitative as opposed to quantitative due to the early stages of system design descriptions and lack of operating experience or data for the LFTR. In summary, this thesis analyzes initiating events using previous research and inductive and deductive reasoning through traditional risk management techniques to arrive at a list of key initiating events that can be used to address vulnerabilities during the design phases of LFTR development.
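The fault-tree step can be made concrete with a toy top-event calculation; the gate structure and basic-event probabilities below are placeholders, not Flibe Energy design data.

```python
# Toy fault-tree evaluation with independent basic events; probabilities are
# placeholders, not LFTR design values.
def and_gate(*probs):                 # all inputs must fail
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(*probs):                  # any input failing suffices
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

pump_fails, valve_fails, operator_error = 1e-3, 5e-4, 1e-2
cooling_loss = or_gate(pump_fails, valve_fails)       # either component fails
top_event = and_gate(cooling_loss, operator_error)    # plus failed recovery
print(f"P(top event) = {top_event:.2e}")
```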
Van Gent, Jan-Michael; Calvo, Richard Yee; Zander, Ashley L; Olson, Erik J; Sise, C Beth; Sise, Michael J; Shackford, Steven R
2017-12-01
Venous thromboembolism, including deep vein thrombosis (DVT) and pulmonary embolism (PE), is typically reported as a composite measure of the quality of trauma center care. Given recent data suggesting that postinjury DVT and PE are distinct clinical processes, a better understanding may result from analyzing them as independent, competing events. Using competing risks analysis, we evaluated our hypothesis that the risk factors and timing of postinjury DVT and PE are different. We examined all adult trauma patients admitted to our Level I trauma center from July 2006 to December 2011 who received at least one surveillance duplex ultrasound of the lower extremities and who were at high risk or greater for DVT. Outcomes included DVT and PE events, and time-to-event from admission. We used competing risks analysis to evaluate risk factors for DVT while accounting for PE as a competing event, and vice versa. Of 2,370 patients, 265 (11.2%) had at least one venous thromboembolism event: 235 with DVT only, 19 with PE only, and 11 with both DVT and PE. Within 2 days of admission, 38% of DVT cases had occurred, compared with 26% of PE cases. Competing risks modeling of DVT as the primary event identified older age, severe injury (Injury Severity Score ≥ 15), mechanical ventilation longer than 4 days, active cancer, history of DVT or PE, major venous repair, male sex, and prophylactic enoxaparin and prophylactic heparin as associated risk factors. Modeling of PE as the primary event showed younger age, nonsevere injury (Injury Severity Score < 15), central line placement, and prophylactic heparin as relevant factors. The risk factors for PE and DVT after injury were different, suggesting that they are clinically distinct events that merit independent consideration. Many DVT events occurred early despite prophylaxis, bringing into question the preventability of postinjury DVT. We recommend that trauma center quality reporting program measures be revised to account for DVT and PE as unique events. Epidemiologic study, level III.
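The competing-risks machinery used here can be sketched with a nonparametric cumulative incidence estimate in the Aalen-Johansen form; the data below are synthetic, and ties are ignored for brevity.

```python
# Hedged sketch of a cumulative incidence function with competing events
# (0 = censored, 1 = DVT, 2 = PE). Synthetic data; ties ignored for brevity.
import numpy as np

def cumulative_incidence(times, events, cause):
    times, events = np.asarray(times, float), np.asarray(events)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    surv, cif = 1.0, 0.0
    out_t, out_c = [], []
    for i, (t, e) in enumerate(zip(times, events)):
        at_risk = n - i
        if e == cause:
            cif += surv / at_risk          # S(t-) times cause-specific hazard
        if e != 0:
            surv *= 1.0 - 1.0 / at_risk    # any event removes from survival
        out_t.append(t)
        out_c.append(cif)
    return np.array(out_t), np.array(out_c)

rng = np.random.default_rng(5)
t = rng.exponential(30.0, 500)
e = rng.choice([0, 1, 2], 500, p=[0.6, 0.3, 0.1])
t_dvt, cif_dvt = cumulative_incidence(t, e, cause=1)
```

Treating PE as a competing event rather than as censoring is exactly what keeps the DVT incidence from being overstated, which is the methodological point of the study.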
Younger African American Adults' Use of Religious Songs to Manage Stressful Life Events.
Hamilton, Jill B; Stewart, Jennifer M; Thompson, Keitra; Alvarez, Carmen; Best, Nakia C; Amoah, Kevin; Carlton-LaNey, Iris B
2017-02-01
The aim of this study was to explore the use of religious songs in response to stressful life events among young African American adults. Fifty-five young African American adults aged 18-49 participated in a qualitative study involving criterion sampling and open-ended interviews. Data analysis included content analysis and descriptive statistics. Stressful life events were related to work or school; caregiving and death of a family member; and relationships. Religious songs represented five categories: Instructive, Communication with God, Thanksgiving and Praise, Memory of Forefathers, and Life after Death. The tradition of using religious songs in response to stressful life events continues among these young adults. Incorporating religious songs into health-promoting interventions might enhance their cultural relevance to this population.
The problem of extreme events in paired-watershed studies
James W. Hornbeck
1973-01-01
In paired-watershed studies, the occurrence of an extreme event during the after-treatment period presents a problem: the effects of treatment must be determined by using greatly extrapolated regression statistics. Several steps are presented to help ensure careful handling of extreme events during analysis and reporting of research results.
Neural Events in the Reinforcement Contingency
ERIC Educational Resources Information Center
Silva, Maria Teresa Araujo; Goncalves, Fabio Leyser; Garcia-Mijares, Miriam
2007-01-01
When neural events are analyzed as stimuli and responses, functional relations among them and among overt stimuli and responses can be unveiled. The integration of neuroscience and the experimental analysis of behavior is beginning to provide empirical evidence of involvement of neural events in the three-term contingency relating discriminative…
Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.
2016-08-18
This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC’s worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output.The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC’s 24/7 analyst staff with a suite of seismic analysis graphical user interfaces.In addition to the NEIC’s monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC’s quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.
Zhou, Bing-Yang; Guo, Yuan-Lin; Wu, Na-Qiong; Zhu, Cheng-Gang; Gao, Ying; Qing, Ping; Li, Xiao-Lin; Wang, Yao; Dong, Qian; Liu, Geng; Xu, Rui Xia; Cui, Chuan-Jue; Sun, Jing; Li, Jian-Jun
2017-03-01
Big endothelin-1 (ET-1) has been proposed as a novel prognostic indicator of acute coronary syndrome, while its role in predicting cardiovascular outcomes in patients with stable coronary artery disease (CAD) is unclear. A total of 3,154 consecutive patients with stable CAD were enrolled and followed up for 24 months. The outcomes included all-cause death, non-fatal myocardial infarction, stroke and unplanned revascularization (percutaneous coronary intervention and coronary artery bypass grafting). Baseline big ET-1 was measured using a sandwich enzyme immunoassay method. Cox proportional hazard regression analysis and Kaplan-Meier analysis were used to evaluate the prognostic value of big ET-1 for cardiovascular outcomes. One hundred and eighty-nine events (5.99%) occurred during follow-up. Patients were divided into two groups: an events group (n=189) and a non-events group (n=2,965). The results indicated that the events group had higher levels of big ET-1 than the non-events group. Multivariable Cox proportional hazard regression analysis showed that big ET-1 was positively and significantly correlated with clinical outcomes (hazard ratio: 1.656, 95% confidence interval: 1.099-2.496, p=0.016). Additionally, the Kaplan-Meier analysis revealed that patients with higher big ET-1 presented lower event-free survival (p=0.016). The present study suggests for the first time that big ET-1 is an independent risk marker of cardiovascular outcomes in patients with stable CAD; more studies are needed to confirm our findings. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
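A minimal sketch of this analysis pipeline, assuming the lifelines package and invented data with placeholder column names:

```python
# Hedged sketch of the Cox regression and Kaplan-Meier steps on synthetic data;
# column names, effect sizes and follow-up are placeholders, not the CAD cohort.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

rng = np.random.default_rng(6)
df = pd.DataFrame({
    "months": rng.exponential(24.0, 300).clip(max=24.0),  # follow-up capped at 24 months
    "event": rng.integers(0, 2, 300),                     # 1 = outcome occurred
    "big_et1": rng.lognormal(0.0, 0.5, 300),              # baseline biomarker
    "age": rng.normal(60.0, 8.0, 300),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="event")
print(cph.summary[["exp(coef)", "p"]])                    # hazard ratios, p-values

kmf = KaplanMeierFitter()
high = df["big_et1"] > df["big_et1"].median()
kmf.fit(df.loc[high, "months"], df.loc[high, "event"], label="high big ET-1")
```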
Post-blasting seismicity in Rudna copper mine, Poland - source parameters analysis.
NASA Astrophysics Data System (ADS)
Caputa, Alicja; Rudziński, Łukasz; Talaga, Adam
2017-04-01
A major hazard in Polish copper mines is high seismicity and the associated rockbursts. Many methods are used to reduce the seismic hazard; among them, one of the most effective is preventive blasting in potentially hazardous mining panels. The method is expected to provoke small to moderate tremors (up to M2.0) and thereby reduce stress accumulation in the rock mass. This work presents an analysis of post-blasting events in the Rudna copper mine, Poland. Using full moment tensor (MT) inversion and seismic spectra analysis, we try to find characteristic features of post-blasting seismic sources. Source parameters estimated for post-blasting events are compared with the parameters of non-provoked mining events that occurred in the vicinity of the provoked sources. Our studies show that the focal mechanisms of events which occurred after blasts have similar MT decompositions; namely, they are characterized by a rather strong isotropic component compared with that of non-provoked events. Source parameters obtained from spectral analysis also show that provoked seismicity has a specific source physics; among other indicators, this is visible in the S- to P-wave energy ratio, which is higher for non-provoked events. The comparison of all our results reveals three possible groups of sources: (a) events occurring just after blasts, (b) events occurring from 5 min to 24 h after blasts, and (c) non-provoked seismicity (more than 24 h after blasting). Acknowledgements: This work was supported within statutory activities No. 3841/E-41/S/2016 of the Ministry of Science and Higher Education of Poland.
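The spectral side of such an analysis usually means fitting a source model to displacement spectra; a hedged sketch fitting a Brune-type spectrum to an invented spectrum, not Rudna mine data:

```python
# Fit a Brune source model A(f) = Omega0 / (1 + (f/fc)^2) to a displacement
# spectrum to recover the low-frequency plateau and corner frequency.
import numpy as np
from scipy.optimize import curve_fit

def brune(f, omega0, fc):
    return omega0 / (1.0 + (f / fc) ** 2)

f = np.linspace(0.5, 100.0, 400)
rng = np.random.default_rng(7)
spectrum = brune(f, 1e-6, 12.0) * rng.lognormal(0.0, 0.1, f.size)  # synthetic

(omega0, fc), _ = curve_fit(brune, f, spectrum, p0=(1e-6, 10.0))
print(f"Omega0 = {omega0:.2e}, fc = {fc:.1f} Hz")
```

Seismic moment, radiated energy and stress drop estimates then follow from Omega0 and fc, which is the basis for comparing provoked and non-provoked source physics.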
Acoustic emission analysis as a non-destructive test procedure for fiber compound structures
NASA Technical Reports Server (NTRS)
Block, J.
1983-01-01
The concept of acoustic emission analysis is explained in scientific terms. The detection of acoustic events, their localization, damage discrimination, and event summation curves are discussed. A block diagram of the concept of damage-free testing of fiber-reinforced synthetic materials is depicted. Prospects for application of the concept are assessed.
ERIC Educational Resources Information Center
Kelly, Sean
2004-01-01
In this event history analysis of the 1990-1991 Schools and Staffing Survey and the 1992 Teacher Follow-up Survey, a retrospective person-year database was constructed to examine teacher attrition over the course of the teaching career. Consistent with prior research, higher teacher salaries reduced attrition, but only slightly so. Teacher…
Propensity for Violence among Homeless and Runaway Adolescents: An Event History Analysis
ERIC Educational Resources Information Center
Crawford, Devan M.; Whitbeck, Les B.; Hoyt, Dan R.
2011-01-01
Little is known about the prevalence of violent behaviors among homeless and runaway adolescents or the specific behavioral factors that influence violent behaviors across time. In this longitudinal study of 300 homeless and runaway adolescents aged 16 to 19 at baseline, the authors use event history analysis to assess the factors associated with…
ERIC Educational Resources Information Center
Dante, Angelo; Fabris, Stefano; Palese, Alvisa
2013-01-01
Empirical studies and conceptual frameworks presented in the extant literature offer a static imagining of academic failure. Time-to-event analysis, which captures the dynamism of individual factors, as when they determine the failure to properly tailor timely strategies, impose longitudinal studies which are still lacking within the field. The…
An x ray archive on your desk: The Einstein CD-ROM's
NASA Technical Reports Server (NTRS)
Prestwich, A.; Mcdowell, J.; Plummer, D.; Manning, K.; Garcia, M.
1992-01-01
Data from the Einstein Observatory imaging proportional counter (IPC) and high resolution imager (HRI) were released on several CD-ROM sets. The sets released so far include pointed IPC and HRI observations in both simple image and detailed photon event list format, as well as the IPC slew survey. With the data on these CD-ROMs, the user can perform spatial analysis (e.g., surface brightness distributions), spectral analysis (with the IPC event lists), and timing analysis (with the IPC and HRI event lists). The next CD-ROM set will contain IPC unscreened data, allowing the user to perform custom screening to recover, for instance, data during times of lost aspect data or high particle background rates.
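Timing analysis from a photon event list is simple enough to sketch directly: bin arrival times into a light curve. The arrival times below are synthetic stand-ins for IPC/HRI event-list data.

```python
# Bin a photon event list into a light curve (counts per second).
import numpy as np

def light_curve(arrival_times, bin_width):
    edges = np.arange(arrival_times.min(),
                      arrival_times.max() + bin_width, bin_width)
    counts, edges = np.histogram(arrival_times, bins=edges)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, counts / bin_width

rng = np.random.default_rng(8)
photons = np.sort(rng.uniform(0.0, 2000.0, 5000))   # arrival times in seconds
t, rate = light_curve(photons, bin_width=50.0)
```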
Maley, Christine M; Pagana, Nicole K; Velenger, Christa A; Humbert, Tamera Keiter
2016-01-01
This systematic literature review analyzed the construct of spirituality as perceived by people who have experienced or are experiencing a major life event or transition. The researchers investigated studies that used narrative analysis or a phenomenological methodology related to the topic. Thematic analysis resulted in three major themes: (1) avenues to and through spirituality, (2) the experience of spirituality, and (3) the meaning of spirituality. The results provide insights into the intersection of spirituality, meaning, and occupational engagement as understood by people experiencing a major life event or transition and suggest further research that addresses spirituality in occupational therapy and interdisciplinary intervention. Copyright © 2016 by the American Occupational Therapy Association, Inc.
NASA Astrophysics Data System (ADS)
Bauwe, Andreas; Tiemeyer, Bärbel; Kahle, Petra; Lennartz, Bernd
2015-12-01
Nitrate is one of the most important sources of pollution for surface waters in tile-drained agricultural areas. In order to develop appropriate management strategies to reduce nitrate losses, it is crucial to first understand the underlying hydrological processes. In this study, we used Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) to analyze 212 discharge events between 2004 and 2011 across three spatial scales (68 events at the collector drain, 72 at the ditch, and 72 at the brook) to identify the controlling factors for hydrograph response characteristics and their influence on nitrate concentration patterns. Our results showed that the 212 hydrological events can be classified into six different types: summer events (28%), snow-dominated events (10%), events controlled by rainfall duration (16%), rainfall totals (8%), dry antecedent conditions (10%), and events controlled by wet antecedent conditions (14%). The relatively large number of unclassified events (15%) demonstrated the difficulty of separating event types due to mutually influencing variables. NO3-N concentrations showed a remarkably consistent pattern during the discharge events regardless of event type, with minima at the beginning, increasing concentrations at the rising limb, and maxima around peak discharge. However, the level of NO3-N concentrations varied notably among the event types. The highest average NO3-N concentrations were found for events controlled by rainfall totals (NO3-N = 17.1 mg/l), events controlled by wet antecedent conditions (NO3-N = 17.1 mg/l), and snowmelt (NO3-N = 15.2 mg/l). Average maximum NO3-N concentrations were significantly lower during summer events (NO3-N = 10.2 mg/l) and events controlled by dry antecedent conditions (NO3-N = 11.7 mg/l). The results have furthermore shown that similar hydrological and biogeochemical processes determine the hydrograph and NO3-N response to storm events at various spatial scales. The management of tile-drained agricultural land to reduce NO3-N losses should focus explicitly on flow events and, more specifically, active management should preferably be conducted in the winter season for discharge events after snowmelt, after heavy rain storms, and when soil moisture conditions are wet.
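A compact sketch of a PCA-plus-LDA classification step, assuming scikit-learn and invented hydrograph descriptors (the real study's variables and class structure are richer):

```python
# Hedged sketch of PCA followed by LDA for discharge-event classification.
# Feature matrix and labels are synthetic stand-ins for hydrograph descriptors.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(9)
X = rng.standard_normal((212, 8))   # e.g. rainfall total, duration, wetness...
y = rng.integers(0, 6, 212)         # six event types, as in the classification

clf = make_pipeline(StandardScaler(), PCA(n_components=4),
                    LinearDiscriminantAnalysis()).fit(X, y)
print("training accuracy:", clf.score(X, y))
```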
NASA Astrophysics Data System (ADS)
Petrucci, Olga; Caloiero, Tommaso; Aurora Pasqua, Angela
2016-04-01
Each year, especially during the winter season, episodes of intense rain affect Calabria, the southernmost Italian peninsular region, triggering flash floods and mass movements that cause damage and fatalities. This work presents a comparative analysis between two events that affected the southeast sector of the region, in 2000 and 2014, respectively. The event that occurred between the 9th and 10th of September 2000 is known in Italy as the Soverato event, after the name of the municipality where it reached the highest damage severity. In the Soverato area, more than 200 mm of rain fell in 24 hours, causing a disastrous flood that swept away a campsite at about 4 a.m., killing 13 people and injuring 45. Moreover, the rain affected a larger area, causing damage in 89 (out of 409) municipalities of the region. Flooding was the most common process, damaging housing and trade. Landslides mostly affected the road network, housing and cultivation. The more recent event affected the same regional sector between the 30th of October and the 2nd of November 2014. The daily rain recorded at some of the rain gauges of the area almost reached 400 mm. Of the 409 municipalities of Calabria, 109 suffered damage. The most frequent types of processes were flash floods and landslides. The most heavily damaged element was the road network: the representative picture of the event is a railway bridge destroyed by the river flow. Housing was damaged too, and 486 people were temporarily evacuated from their homes. The event also caused one fatality, a victim killed by a flood. The event-centred study approach aims to highlight differences and similarities in both the causes and the effects of the two events, which occurred at a temporal distance of 14 years. The comparative analysis focuses on three main aspects: the intensity of the triggering rain, the modification of urbanised areas, and the evolution of emergency management. The comparative analysis of rain is made by comparing the return periods of both daily and cumulative rain. The modification of urbanised sectors is assessed by comparing ISTAT (National Statistic Institute of Italy) data and Google maps of the affected areas at the time of occurrence of the events. The emergency management is analysed by comparing the types and extent of civil protection alerts issued in the two studied cases.
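Return-period comparisons of this kind are commonly made by fitting an extreme-value distribution to annual maxima; a hedged sketch with invented rainfall values, not the Calabrian gauge records:

```python
# Fit a Gumbel distribution to annual maximum daily rainfall and convert an
# observed total into a return period. The record below is invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
annual_max_mm = stats.gumbel_r.rvs(loc=90.0, scale=30.0, size=50,
                                   random_state=rng)

loc, scale = stats.gumbel_r.fit(annual_max_mm)
def return_period(x_mm):
    return 1.0 / (1.0 - stats.gumbel_r.cdf(x_mm, loc, scale))

print(f"T(200 mm) = {return_period(200.0):.0f} years")
```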
Stewart, C M; Newlands, S D; Perachio, A A
2004-12-01
Rapid and accurate discrimination of single units from extracellular recordings is a fundamental process for the analysis and interpretation of electrophysiological recordings. We present an algorithm that performs detection, characterization, discrimination, and analysis of action potentials from extracellular recording sessions. The program was entirely written in LabVIEW (National Instruments), and requires no external hardware devices or a priori information about action potential shapes. Waveform events are detected by scanning the digital record for voltages that exceed a user-adjustable trigger. Detected events are characterized to determine nine different time and voltage levels for each event. Various algebraic combinations of these waveform features are used as axis choices for 2-D Cartesian plots of events. The user selects axis choices that generate distinct clusters. Multiple clusters may be defined as action potentials by manually generating boundaries of arbitrary shape. Events defined as action potentials are validated by visual inspection of overlain waveforms. Stimulus-response relationships may be identified by selecting any recorded channel for comparison to continuous and average cycle histograms of binned unit data. The algorithm includes novel aspects of feature analysis and acquisition, including higher acquisition rates for electrophysiological data compared to other channels. The program confirms that electrophysiological data may be discriminated with high-speed and efficiency using algebraic combinations of waveform features derived from high-speed digital records.
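A Python re-implementation of the core detection and feature steps (the original is LabVIEW; names and thresholds here are illustrative, not a port of the published algorithm):

```python
# Threshold-crossing event detection plus two waveform features for a 2-D
# cluster plot; a simplified stand-in for the LabVIEW algorithm.
import numpy as np

def detect_events(trace, trigger, window=30):
    """Return (n_events, window) snippets starting at upward trigger crossings."""
    above = trace > trigger
    onsets = np.flatnonzero(~above[:-1] & above[1:]) + 1
    onsets = onsets[onsets + window <= len(trace)]
    if len(onsets) == 0:
        return np.empty((0, window))
    return np.stack([trace[i:i + window] for i in onsets])

def waveform_features(snippets):
    amplitude = snippets.max(axis=1) - snippets.min(axis=1)    # peak-to-trough
    width = snippets.argmin(axis=1) - snippets.argmax(axis=1)  # samples apart
    return amplitude, width       # axis choices for a 2-D Cartesian plot

rng = np.random.default_rng(11)
signal = 0.1 * rng.standard_normal(20000)
events = detect_events(signal, trigger=0.35)
print(events.shape)
```

Plotting algebraic combinations of such features against each other, and drawing cluster boundaries by hand, is the discrimination step the abstract describes.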
Challenges of Guarantee-Time Bias
Giobbie-Hurder, Anita; Gelber, Richard D.; Regan, Meredith M.
2013-01-01
The potential for guarantee-time bias (GTB), also known as immortal time bias, exists whenever an analysis that is timed from enrollment or random assignment, such as disease-free or overall survival, is compared across groups defined by a classifying event occurring sometime during follow-up. The types of events associated with GTB are varied and may include the occurrence of objective disease response, onset of toxicity, or seroconversion. However, comparative analyses using these types of events as predictors are different from analyses using baseline characteristics that are specified completely before the occurrence of any outcome event. Recognizing the potential for GTB is not always straightforward, and it can be challenging to know when GTB is influencing the results of an analysis. This article defines GTB, provides examples of GTB from several published articles, and discusses three analytic techniques that can be used to remove the bias: conditional landmark analysis, extended Cox model, and inverse probability weighting. The strengths and limitations of each technique are presented. As an example, we explore the effect of bisphosphonate use on disease-free survival (DFS) using data from the BIG (Breast International Group) 1-98 randomized clinical trial. An analysis using a naive approach showed substantial benefit for patients who received bisphosphonate therapy. In contrast, analyses using the three methods known to remove GTB showed no statistical evidence of a reduction in risk of a DFS event with bisphosphonate therapy. PMID:23835712
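Of the three corrections, the conditional landmark analysis is the simplest to sketch: fix a landmark time, classify patients by exposure status at that time, and restrict the comparison to those still at risk. The data and the 6-month landmark below are invented, not the BIG 1-98 analysis.

```python
# Hedged sketch of building a conditional landmark dataset; column names and
# the landmark time are placeholders.
import numpy as np
import pandas as pd

def landmark_dataset(df, landmark):
    """df columns: time (DFS time), event (1 = DFS event), exposure_time (NaN if never)."""
    at_risk = df[df["time"] > landmark].copy()     # drop earlier events/censoring
    at_risk["exposed"] = at_risk["exposure_time"] <= landmark  # frozen at landmark
    at_risk["time"] = at_risk["time"] - landmark   # restart the analysis clock
    return at_risk

rng = np.random.default_rng(12)
df = pd.DataFrame({"time": rng.exponential(60.0, 1000),
                   "event": rng.integers(0, 2, 1000),
                   "exposure_time": np.where(rng.random(1000) < 0.4,
                                             rng.uniform(0.0, 36.0, 1000), np.nan)})
lm = landmark_dataset(df, landmark=6.0)   # compare exposed vs not, from month 6
```

Freezing exposure status at the landmark is what removes the guaranteed event-free time that biases the naive comparison.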
Kaye, Stephen; Baddon, Andrew; Jones, Mark; Armitage, W John; Fehily, Deirdre; Warwick, Ruth M
2010-02-01
Reporting and investigation of serious adverse events and reactions associated with tissue and cell transplantation is a fundamental aspect of ensuring adequate levels of safety and quality and is a requirement of the European Union Directives on tissues and cells. In the UK, a system for the reporting and analysis of events and reactions associated with ocular tissue transplantation is well established. It is operated by a network of individuals and organisations, each with clearly defined roles and responsibilities, following written procedures for reporting and investigation. Analysis of reports indicates that the most important adverse reactions associated with this type of tissue transplantation are endophthalmitis (0.58%) and primary graft failure (0.3%). This system allows the analysis of all types of events and reactions by the professionals involved so that trends can be identified and services improved. Tools to evaluate the severity and imputability of individual events or reactions, such as those developed by the EUSTITE project, can be utilised to facilitate the selection of those cases meeting the criteria for reporting to the Competent Authority. This vigilance model has been shown to be effective and could be applied in other fields of tissue or cell transplantation.
Comparative hazard analysis of processes leading to remarkable flash floods (France, 1930-1999)
NASA Astrophysics Data System (ADS)
Boudou, M.; Lang, M.; Vinet, F.; Cœur, D.
2016-10-01
Flash flood events are responsible for large economic losses and lead to fatalities every year in France. This is especially the case in the Mediterranean and overseas territories/departments of France, characterized by extreme hydro-climatological features and with a large part of the population exposed to flood risks. The recurrence of remarkable flash flood events, associated with high hazard intensity, significant damage and socio-political consequences, therefore raises several issues for authorities and risk management policies. This study aims to improve our understanding of the hazard analysis process in the case of four remarkable flood events: March 1930, October 1940, January 1980 and November 1999. Firstly, we present the methodology used to define the remarkability score of a flood event. Then, to identify the factors leading to a remarkable flood event, we explore the main parameters of the hazard analysis process, such as the meteorological triggering conditions, the return period of the rainfall and peak discharge, as well as some additional factors (initial catchment state, flood chronology, cascade effects, etc.). The results contribute to understanding the complexity of the processes leading to flood hazard and highlight the importance for risk managers of taking additional factors into account.
Frndak, Seth E; Smerbeck, Audrey M; Irwin, Lauren N; Drake, Allison S; Kordovski, Victoria M; Kunker, Katrina A; Khan, Anjum L; Benedict, Ralph H B
2016-10-01
We endeavored to clarify how distinct co-occurring symptoms relate to the presence of negative work events in employed multiple sclerosis (MS) patients. Latent profile analysis (LPA) was utilized to elucidate common disability patterns by isolating patient subpopulations. Samples of 272 employed MS patients and 209 healthy controls (HC) were administered neuroperformance tests of ambulation, hand dexterity, processing speed, and memory. Regression-based norms were created from the HC sample. LPA identified latent profiles using the regression-based z-scores. Finally, multinomial logistic regression tested for negative work event differences among the latent profiles. Four profiles were identified via LPA: a common profile (55%) characterized by slightly below average performance in all domains, a broadly low-performing profile (18%), a poor motor abilities profile with average cognition (17%), and a generally high-functioning profile (9%). Multinomial regression analysis revealed that the uniformly low-performing profile demonstrated a higher likelihood of reported negative work events. Employed MS patients with co-occurring motor, memory and processing speed impairments were most likely to report a negative work event, classifying them as uniquely at risk for job loss.
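LPA is often approximated in code by a Gaussian mixture over the z-scores, with the number of profiles chosen by an information criterion; a hedged sketch with synthetic scores:

```python
# Gaussian-mixture stand-in for latent profile analysis over regression-based
# z-scores; data are synthetic and the BIC selection is illustrative.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(13)
z = rng.standard_normal((272, 4))   # ambulation, dexterity, speed, memory

models = {k: GaussianMixture(n_components=k, covariance_type="diag",
                             random_state=0).fit(z) for k in range(1, 7)}
best_k = min(models, key=lambda k: models[k].bic(z))
profiles = models[best_k].predict(z)     # profile membership per patient
print("profiles selected by BIC:", best_k)
```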
Using social media for disaster emergency management
NASA Astrophysics Data System (ADS)
Wang, Y. D.; Wang, T.; Ye, X. Y.; Zhu, J. Q.; Lee, J.
2016-06-01
Social media have become a universal phenomenon in our society (Wang et al., 2012). As a new data source, social media have been widely used in knowledge discovery in fields related to health (Jackson et al., 2014), human behaviour (Lee, 2014), social influence (Hong, 2013), and market analysis (Hanna et al., 2011). In this paper, we report a case study of the 2012 Beijing Rainstorm to investigate how emergency information was distributed in a timely manner using social media during emergency events. We present a classification and location model for social media text streams during emergency events. This model classifies social media text streams based on their topical contents. Integrated with a trend analysis, we show how Sina-Weibo activity fluctuated during emergency events. Using a spatial statistical analysis method, we found that the distribution patterns of Sina-Weibo messages were related to the emergency events but varied among different topics. This study helps us to better understand emergency events so that decision-makers can act on emergencies in a timely manner. In addition, this paper presents the tools, methods, and models developed in this study that can be used to work with text streams from social media in the context of disaster management.
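The classification step for such text streams can be sketched with a bag-of-words model; the posts and topic labels below are invented stand-ins for the Sina-Weibo data:

```python
# Hedged sketch of topic classification for social-media text streams using
# TF-IDF features and naive Bayes; posts and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

posts = ["subway station flooded", "power outage downtown",
         "road closed by water", "lightning hit the tower",
         "water entering basement", "blackout on my block"]
topics = ["flood", "power", "flood", "power", "flood", "power"]

clf = make_pipeline(TfidfVectorizer(), MultinomialNB()).fit(posts, topics)
print(clf.predict(["street under water near the bridge"]))
```

Feeding the classified, geolocated stream into a spatial statistic around the event location is the second half of the pipeline the paper describes.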
Element analysis: a wavelet-based method for analysing time-localized events in noisy time series
2017-01-01
A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized ‘events’. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event’s ‘region of influence’ within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry. PMID:28484325
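The transform-maxima idea can be sketched with an off-the-shelf continuous wavelet transform; note this uses a Morlet wavelet from PyWavelets in place of the generalized Morse family, and the detection threshold is ad hoc.

```python
# Hedged sketch of locating time-localized events as local maxima of the CWT
# magnitude (Morlet stands in for generalized Morse wavelets).
import numpy as np
import pywt

def transform_maxima(signal, scales, dt, threshold):
    coefs, _ = pywt.cwt(signal, scales, "morl", sampling_period=dt)
    mag = np.abs(coefs)
    core = mag[1:-1, 1:-1]
    neighbours = [mag[i:i + core.shape[0], j:j + core.shape[1]]
                  for i in range(3) for j in range(3) if (i, j) != (1, 1)]
    is_max = np.all([core >= n for n in neighbours], axis=0) & (core > threshold)
    rows, cols = np.nonzero(is_max)
    return scales[rows + 1], cols + 1, core[rows, cols]  # scale, time, magnitude

t = np.linspace(0.0, 10.0, 2000)
rng = np.random.default_rng(14)
sig = np.exp(-((t - 5.0) / 0.2) ** 2) * np.cos(40.0 * t) \
      + 0.1 * rng.standard_normal(t.size)
s, k, m = transform_maxima(sig, np.arange(1, 64), dt=t[1] - t[0], threshold=0.5)
```

The paper's contribution is precisely the parts this sketch omits: estimating event properties from the maxima values, a noise-based false-detection rate, and a region-of-influence criterion for rejecting spurious maxima.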
Statistical analysis of secondary particle distributions in relativistic nucleus-nucleus collisions
NASA Technical Reports Server (NTRS)
Mcguire, Stephen C.
1987-01-01
The use of several statistical techniques to characterize structure in the angular distributions of secondary particles from nucleus-nucleus collisions in the energy range 24 to 61 GeV/nucleon is described. The objective of this work was to determine whether there are correlations between emitted particle intensity and angle that may be used to support the existence of the quark-gluon plasma. The techniques include chi-square null hypothesis tests, the method of discrete Fourier transform analysis, and fluctuation analysis. We have also used the method of composite unit vectors to test for azimuthal asymmetry in a data set of 63 JACEE-3 events. Each method is presented in a manner that provides the reader with some practical detail regarding its application. Among the events with relatively high statistics, one Fe event at 55 GeV/nucleon was found to possess an azimuthal distribution with a highly non-random structure. No evidence of non-statistical fluctuations was found in the pseudo-rapidity distributions of the events studied. The most effective application of these methods relies upon the availability of many events, or of single events that possess very high multiplicities.
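The composite-unit-vector test for azimuthal asymmetry is essentially a Rayleigh test; a minimal sketch on invented angles, not JACEE data:

```python
# Rayleigh test for non-uniformity of azimuthal angles (composite unit vectors).
import numpy as np

def rayleigh_test(azimuths):
    """azimuths in radians; returns (mean resultant length, approx. p-value)."""
    n = len(azimuths)
    r_bar = np.hypot(np.cos(azimuths).sum(), np.sin(azimuths).sum()) / n
    z = n * r_bar ** 2
    p = np.exp(-z) * (1.0 + (2.0 * z - z ** 2) / (4.0 * n))  # small-n correction
    return r_bar, p

rng = np.random.default_rng(15)
phi = rng.uniform(0.0, 2.0 * np.pi, 200)   # isotropic case: expect a large p
print(rayleigh_test(phi))
```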
Photographic Analysis Technique for Assessing External Tank Foam Loss Events
NASA Technical Reports Server (NTRS)
Rieckhoff, T. J.; Covan, M.; OFarrell, J. M.
2001-01-01
A video camera and recorder were placed inside the solid rocket booster forward skirt in order to view foam loss events over an area on the external tank (ET) intertank surface. In this Technical Memorandum, a method of processing video images to allow rapid detection of permanent changes indicative of foam loss events on the ET surface was defined and applied to accurately count, categorize, and locate such events.
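A simplified sketch of the underlying change-detection idea: flag pixels whose brightness change relative to a reference frame persists across frames, indicating a permanent surface change rather than transient lighting. The frame data and thresholds below are synthetic, not the method defined in the memorandum.

```python
# Frame differencing with a persistence requirement, a simplified stand-in for
# the video processing described above.
import numpy as np

def detect_permanent_change(frames, diff_thresh=25, persist=5):
    """frames: (n_frames, h, w) grayscale; returns a boolean change mask."""
    ref = frames[0].astype(float)
    persistence = np.zeros(frames.shape[1:], dtype=int)
    for frame in frames[1:]:
        changed = np.abs(frame.astype(float) - ref) > diff_thresh
        persistence = np.where(changed, persistence + 1, 0)  # reset if it reverts
    return persistence >= persist

rng = np.random.default_rng(16)
video = rng.integers(100, 110, size=(30, 64, 64))
video[10:, 20:24, 30:34] += 60          # a patch changes at frame 10 and stays
print(int(detect_permanent_change(video).sum()), "changed pixels")
```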
Ruckart, Perri Z; Wattigney, Wendy A; Kaye, Wendy E
2004-01-01
Background Releases of hazardous materials can cause substantial morbidity and mortality. To reduce and prevent the public health consequences (victims or evacuations) from uncontrolled or illegally released hazardous substances, a more comprehensive analysis is needed to determine risk factors for hazardous materials incidents. Methods Hazardous Substances Emergency Events Surveillance (HSEES) data from 1996 through 2001 were analyzed using bivariate and multiple logistic regression. Fixed-facility and transportation-related events were analyzed separately. Results For fixed-facility events, 2,327 (8%) resulted in at least one victim and 2,844 (10%) involved ordered evacuations. For transportation-related events, 759 (8%) resulted in at least one victim, and 405 (4%) caused evacuation orders. Fire and/or explosion were the strongest risk factors for events involving either victims or evacuations. Stratified analysis of fixed-facility events involving victims showed a strong association for acid releases in the agriculture, forestry, and fisheries industry. Chlorine releases in fixed-facility events resulted in victims and evacuations in more industry categories than any other substance. Conclusions Outreach efforts should focus on preventing and preparing for fires and explosions, acid releases in the agricultural industry, and chlorine releases in fixed facilities. PMID:15496226
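A hedged sketch of the regression step with synthetic stand-in data (variable names and effect sizes are invented, not HSEES estimates):

```python
# Multiple logistic regression of "any victims" on incident attributes using
# statsmodels; the data-generating effects are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(17)
n = 2000
df = pd.DataFrame({"fire_or_explosion": rng.integers(0, 2, n),
                   "chlorine_release": rng.integers(0, 2, n)})
logit = -3.0 + 1.5 * df["fire_or_explosion"] + 0.8 * df["chlorine_release"]
df["victims"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["fire_or_explosion", "chlorine_release"]])
fit = sm.Logit(df["victims"], X).fit(disp=0)
print(np.exp(fit.params))     # odds ratios per risk factor
```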
Genetic consequences of sequential founder events by an island-colonizing bird.
Clegg, Sonya M; Degnan, Sandie M; Kikkawa, Jiro; Moritz, Craig; Estoup, Arnaud; Owens, Ian P F
2002-06-11
The importance of founder events in promoting evolutionary changes on islands has been a subject of long-running controversy. Resolution of this debate has been hindered by a lack of empirical evidence from naturally founded island populations. Here we undertake a genetic analysis of a series of historically documented, natural colonization events by the silvereye species-complex (Zosterops lateralis), a group used to illustrate the process of island colonization in the original founder effect model. Our results indicate that single founder events do not affect levels of heterozygosity or allelic diversity, nor do they result in immediate genetic differentiation between populations. Instead, four to five successive founder events are required before indices of diversity and divergence approach that seen in evolutionarily old forms. A Bayesian analysis based on computer simulation allows inferences to be made on the number of effective founders and indicates that founder effects are weak because island populations are established from relatively large flocks. Indeed, statistical support for a founder event model was not significantly higher than for a gradual-drift model for all recently colonized islands. Taken together, these results suggest that single colonization events in this species complex are rarely accompanied by severe founder effects, and multiple founder events and/or long-term genetic drift have been of greater consequence for neutral genetic diversity.
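The quantitative intuition is easy to reproduce: with a founding flock of effective size N, expected heterozygosity drops by only 1/(2N) per event, so several sequential events are needed before the loss is noticeable. The founder number below is hypothetical.

```python
# Toy calculation: expected heterozygosity after k sequential founder events,
# each a single-generation bottleneck of N_e founders (N_e = 50 is hypothetical).
def heterozygosity_after_events(h0, founders, n_events):
    h = h0
    for _ in range(n_events):
        h *= 1.0 - 1.0 / (2.0 * founders)
    return h

for k in range(6):
    print(f"{k} founder events: H = {heterozygosity_after_events(0.80, 50, k):.3f}")
```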
Simplex and duplex event-specific analytical methods for functional biotech maize.
Lee, Seong-Hun; Kim, Su-Jeong; Yi, Bu-Young
2009-08-26
Analytical methods are very important in the control of genetically modified organism (GMO) labeling systems or living modified organism (LMO) management for biotech crops. Event-specific primers and probes were developed for the qualitative and quantitative analysis of biotech maize events 3272 and LY 038 on the basis of their respective 3' flanking regions. The qualitative primers confirmed specificity by a single PCR product and sensitivity down to 0.05% as the limit of detection (LOD). Simplex and duplex quantitative methods were also developed using TaqMan real-time PCR. One synthetic plasmid was constructed from two taxon-specific DNA sequences of maize and the two event-specific 3' flanking DNA sequences of events 3272 and LY 038 as reference molecules. In-house validation of the quantitative methods was performed using six levels of mixed samples, from 0.1 to 10.0%. As a result, the biases from the true value and the relative deviations were all within the range of ±30%. Limits of quantitation (LOQs) were 0.1% for the simplex real-time PCRs of events 3272 and LY 038 and 0.5% for the duplex real-time PCR of LY 038. This study reports event-specific analytical methods applicable to the qualitative and quantitative analysis of biotech maize events 3272 and LY 038.
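The quantitative step in such assays typically inverts a standard curve for each target and takes the copy-number ratio; a hedged sketch with invented Ct values, slopes and intercepts, not the validated assay parameters:

```python
# Relative quantification from TaqMan Ct values via standard curves for the
# event-specific and taxon-specific targets; all numbers are invented.
def copies_from_ct(ct, slope, intercept):
    """Invert a standard curve Ct = slope * log10(copies) + intercept."""
    return 10.0 ** ((ct - intercept) / slope)

event_copies = copies_from_ct(ct=29.5, slope=-3.32, intercept=40.0)
taxon_copies = copies_from_ct(ct=24.1, slope=-3.35, intercept=39.2)
print(f"GM content = {100.0 * event_copies / taxon_copies:.2f}%")
```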
Thinking about thinking and feeling about feeling
Moore, J.
2000-01-01
Traditional clinical psychology generally posits “mental” events that differ from “behavioral” events. Mental events are not publicly observable, take place in a different dimension from overt behavior, and are the topic of primary concern. For example, mental events are often taken to be causes of troublesome overt behavior. In addition, the mental events themselves may be regarded as troublesome, independent of their relation to any specific overt behavior. Therapy is usually aimed at fixing these troublesome mental events, under an assumption that improvement in the client's status will follow in due course. Behavior analysis has its own position on the relations among clinical matters, overt behavior, and such private events as thinking and feeling. In a behavior-analytic view, private events are behavioral phenomena rather than mental phenomena. They are not initiating causes of behavior; rather, they are themselves caused by antecedent conditions, but they may contribute to discriminative control over subsequent behavior, both verbal and nonverbal. Verbal processes are viewed as vitally important in understanding troublesome behavior. However, the circumstances that cause both the troublesome private events and the troublesome behavior in the first place still need to be addressed. Finally, clinical behavior analysis will need to market its insights into diagnosis and treatment very adroitly, because it rejects the mentalism upon which most traditional forms of therapy are predicated and the mentalism that most consumers expect to encounter. PMID:22478337
North Sea Storm Driving of Extreme Wave Heights
NASA Astrophysics Data System (ADS)
Bell, Ray; Gray, Suzanne; Jones, Oliver
2017-04-01
The relationship between storms and extreme ocean waves in the North Sea is assessed using a long-period wave dataset and storms identified in the Interim ECMWF Re-Analysis (ERA-Interim). An ensemble sensitivity analysis is used to provide information on the spatial and temporal forcing from mean sea-level pressure and surface wind associated with extreme ocean wave height responses. Extreme ocean waves in the central North Sea arise due to either the winds in the cold conveyor belt (northerly-wind events) or winds in the warm conveyor belt (southerly-wind events) of extratropical cyclones. The largest wave heights are associated with northerly-wind events, which tend to have stronger wind speeds and occur as the cold conveyor belt wraps rearwards round the cyclone to the cold side of the warm front. The northerly-wind events also provide a larger fetch to the central North Sea. Southerly-wind events are associated with the warm conveyor belts of intense extratropical storms developing in the right upper-tropospheric jet exit region. There is predictability in the extreme ocean wave events up to two days before the event, associated with a strengthening of a high pressure system to the west (northerly-wind events) and south-west (southerly-wind events) of the British Isles. This acts to increase the pressure gradient over the British Isles and therefore drive stronger wind speeds in the central North Sea.
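Ensemble sensitivity analysis reduces, at each grid point, to regressing the scalar response on the forcing field across members; a hedged sketch on synthetic fields, not ERA-Interim data:

```python
# Ensemble sensitivity: per-grid-point regression of a scalar response (peak
# wave height) on a forcing field (MSLP) across ensemble members.
import numpy as np

def ensemble_sensitivity(response, field):
    """response: (n_members,); field: (n_members, ny, nx); returns dJ/dx map."""
    j = response - response.mean()
    x = field - field.mean(axis=0)
    cov = np.einsum("m,mij->ij", j, x) / (len(j) - 1)
    return cov / x.var(axis=0, ddof=1)

rng = np.random.default_rng(18)
members = 50
mslp = rng.standard_normal((members, 40, 60))
hs = 0.8 * mslp[:, 20, 30] + 0.2 * rng.standard_normal(members)
sens = ensemble_sensitivity(hs, mslp)    # should peak near grid point (20, 30)
```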
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk-significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk-significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.
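At its simplest, the source-term bookkeeping that an MST analysis formalizes is a product of release fractions across successive barriers; the toy numbers below are placeholders, not Argonne/GEH results.

```python
# Toy source-term chain: environmental release = inventory x product of
# release fractions across barriers. All values are illustrative placeholders.
inventory_bq = {"Cs-137": 5.0e17, "I-131": 2.0e17}          # core inventory, Bq
release_fractions = {"fuel": 0.05, "primary_system": 0.2, "containment": 0.01}

def environmental_release(inventory, fractions):
    passed = 1.0
    for frac in fractions.values():
        passed *= frac                  # fraction surviving each barrier
    return {nuclide: activity * passed for nuclide, activity in inventory.items()}

print(environmental_release(inventory_bq, release_fractions))
```

The mechanistic analysis replaces each constant fraction with transport and retention models, which is where the realism, and the coupling to the rest of the PRA, comes from.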
Back analysis of fault-slip in burst prone environment
NASA Astrophysics Data System (ADS)
Sainoki, Atsushi; Mitri, Hani S.
2016-11-01
In deep underground mines, stress re-distribution induced by mining activities could cause fault-slip. Seismic waves arising from fault-slip occasionally induce rock ejection when hitting the boundary of mine openings, and as a result, severe damage could be inflicted. In general, it is difficult to estimate fault-slip-induced ground motion in the vicinity of mine openings because of the complexity of the dynamic response of faults and the presence of geological structures. In this paper, a case study is conducted for a Canadian underground mine, herein called "Mine-A", which is known for its seismic activities. Using a microseismic database collected from the mine, a back analysis of fault-slip is carried out with mine-wide 3-dimensional numerical modeling. The back analysis is conducted to estimate the physical and mechanical properties of the causative fracture or shear zones, and one large fault-slip related seismic event has been selected for this purpose. In the back analysis, the shear zone properties are estimated with respect to the moment magnitude of the seismic event and the peak particle velocity (PPV) recorded by a strong ground motion sensor. The estimated properties are then validated through comparison with peak ground acceleration recorded by accelerometers. Lastly, ground motion in active mining areas is estimated by conducting dynamic analysis with the estimated values. The present study implies that it would be possible to estimate the magnitude of seismic events that might occur in the near future by applying the estimated properties to the numerical model. Although the case study is conducted for a specific mine, the developed methodology can be equally applied to other mines suffering from fault-slip related seismic events.
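Back analysis of this kind matches modeled fault-slip to the observed moment magnitude. As a worked illustration of that link, a minimal sketch using the seismic moment M0 = G·A·d and the Hanks-Kanamori relation Mw = (2/3)(log10 M0 - 9.1); the fault-patch numbers are hypothetical, not taken from the Mine-A study:

```python
import math

def moment_magnitude(shear_modulus_pa, area_m2, avg_slip_m):
    """Seismic moment M0 = G*A*d (N*m), then the Hanks-Kanamori relation
    Mw = (2/3) * (log10(M0) - 9.1)."""
    m0 = shear_modulus_pa * area_m2 * avg_slip_m
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

# Hypothetical shear-zone patch: G = 30 GPa, 200 m x 200 m rupture, 5 mm slip
print(round(moment_magnitude(30e9, 200.0 * 200.0, 0.005), 2))   # ~2.45
```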
Sudell, Maria; Kolamunnage-Dona, Ruwanthi; Tudur-Smith, Catrin
2016-12-05
Joint models for longitudinal and time-to-event data are commonly used to simultaneously analyse correlated data in single study cases. Synthesis of evidence from multiple studies using meta-analysis is a natural next step but its feasibility depends heavily on the standard of reporting of joint models in the medical literature. During this review we aim to assess the current standard of reporting of joint models applied in the literature, and to determine whether current reporting standards would allow or hinder future aggregate data meta-analyses of model results. We undertook a literature review of non-methodological studies that involved joint modelling of longitudinal and time-to-event medical data. Study characteristics were extracted and an assessment of whether separate meta-analyses for longitudinal, time-to-event and association parameters were possible was made. The 65 studies identified used a wide range of joint modelling methods in a selection of software. Identified studies concerned a variety of disease areas. The majority of studies reported adequate information to conduct a meta-analysis (67.7% for longitudinal parameter aggregate data meta-analysis, 69.2% for time-to-event parameter aggregate data meta-analysis, 76.9% for association parameter aggregate data meta-analysis). In some cases model structure was difficult to ascertain from the published reports. Whilst extraction of sufficient information to permit meta-analyses was possible in a majority of cases, the standard of reporting of joint models should be maintained and improved. Recommendations for future practice include clear statement of model structure, of values of estimated parameters, of software used and of statistical methods applied.
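Aggregate-data meta-analysis of the reported parameters (longitudinal, time-to-event, or association) typically reduces to inverse-variance pooling. A minimal random-effects sketch using the DerSimonian-Laird estimator; the study estimates and standard errors are hypothetical:

```python
import numpy as np

def dersimonian_laird(estimates, std_errors):
    """Random-effects pooling of study-level estimates (DerSimonian-Laird)."""
    y = np.asarray(estimates, float)
    v = np.asarray(std_errors, float) ** 2
    w = 1.0 / v                                    # fixed-effect weights
    mu_fe = np.sum(w * y) / w.sum()
    q = np.sum(w * (y - mu_fe) ** 2)               # Cochran's Q
    c = w.sum() - np.sum(w ** 2) / w.sum()
    tau2 = max(0.0, (q - (len(y) - 1)) / c)        # between-study variance
    w_re = 1.0 / (v + tau2)
    pooled = np.sum(w_re * y) / w_re.sum()
    return pooled, np.sqrt(1.0 / w_re.sum()), tau2

# Hypothetical association parameters (e.g. log hazard per unit of the
# longitudinal marker) and standard errors from three joint-model studies
print(dersimonian_laird([0.25, 0.40, 0.10], [0.08, 0.12, 0.10]))
```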
Ontology-supported research on vaccine efficacy, safety and integrative biological networks.
He, Yongqun
2014-07-01
While vaccine efficacy and safety research has dramatically progressed with the methods of in silico prediction and data mining, many challenges still exist. A formal ontology is a human- and computer-interpretable set of terms and relations that represent entities in a specific domain and how these terms relate to each other. Several community-based ontologies (including Vaccine Ontology, Ontology of Adverse Events and Ontology of Vaccine Adverse Events) have been developed to support vaccine and adverse event representation, classification, data integration, literature mining of host-vaccine interaction networks, and analysis of vaccine adverse events. The author further proposes minimal vaccine information standards and their ontology representations, ontology-based linked open vaccine data and meta-analysis, an integrative One Network ('OneNet') Theory of Life, and ontology-based approaches to study and apply the OneNet theory. In the Big Data era, these proposed strategies provide a novel framework for advanced data integration and analysis of fundamental biological networks including vaccine immune mechanisms.
Hsu, Yi-Fang; Szűcs, Dénes
2012-02-15
Several functional magnetic resonance imaging (fMRI) studies have used neural adaptation paradigms to detect anatomical locations of brain activity related to number processing. However, currently not much is known about the temporal structure of number adaptation. In the present study, we used electroencephalography (EEG) to elucidate the time course of neural events in symbolic number adaptation. The numerical distance of deviants relative to standards was manipulated. In order to avoid perceptual confounds, all levels of deviants consisted of perceptually identical stimuli. Multiple successive numerical distance effects were detected in event-related potentials (ERPs). Analysis of oscillatory activity further showed at least two distinct stages of neural processes involved in the automatic analysis of numerical magnitude, with the earlier effect emerging at around 200 ms and the later effect appearing at around 400 ms. The findings support the hypothesis that numerical magnitude processing involves a succession of cognitive events. Crown Copyright © 2011. Published by Elsevier Inc. All rights reserved.
On the properties of stochastic intermittency in rainfall processes.
Molini, A; La Barbera, P; Lanza, L G
2002-01-01
In this work we propose a mixed approach to the modelling of rainfall events, based on the analysis of geometrical and statistical properties of rain intermittency in time, combined with the predictive power derived from the analysis of the distribution of no-rain periods and from the binary decomposition of the rain signal. Some recent hypotheses on the nature of rain intermittency are also reviewed. In particular, the internal intermittent structure of a high-resolution pluviometric time series covering one decade and recorded at the tipping bucket station of the University of Genova is analysed, by separating the internal intermittency of rainfall events from the inter-arrival process through a simple geometrical filtering procedure. In this way it is possible to associate no-rain intervals with a probability distribution both by virtue of their position within the event and of their percentage. From this analysis, an invariant probability distribution for the no-rain periods within the events is obtained at different aggregation levels, and its satisfactory agreement with a typical extreme value distribution is shown.
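The binary decomposition described above reduces the rain signal to wet/dry runs, after which the no-rain periods can be assigned an empirical distribution. A minimal sketch, assuming a regularly sampled intensity series; the toy series is illustrative:

```python
import numpy as np

def dry_spell_lengths(intensity, threshold=0.0):
    """Binary-decompose a rain series and return the lengths of no-rain runs."""
    wet = np.asarray(intensity) > threshold
    lengths, run = [], 0
    for w in wet:
        if not w:
            run += 1
        elif run:
            lengths.append(run)
            run = 0
    if run:
        lengths.append(run)
    return np.array(lengths)

# Hypothetical fixed-interval series (mm per interval) from a tipping bucket
rain = [0, 0, 0.2, 0, 0, 0, 0.4, 0.2, 0, 0, 0, 0, 0.2, 0, 0]
print(dry_spell_lengths(rain))   # -> [2 3 4 2]
```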
SAFE Software and FED Database to Uncover Protein-Protein Interactions using Gene Fusion Analysis.
Tsagrasoulis, Dimosthenis; Danos, Vasilis; Kissa, Maria; Trimpalis, Philip; Koumandou, V Lila; Karagouni, Amalia D; Tsakalidis, Athanasios; Kossida, Sophia
2012-01-01
Domain Fusion Analysis takes advantage of the fact that certain proteins in a given proteome A, are found to have statistically significant similarity with two separate proteins in another proteome B. In other words, the result of a fusion event between two separate proteins in proteome B is a specific full-length protein in proteome A. In such a case, it can be safely concluded that the protein pair has a common biological function or even interacts physically. In this paper, we present the Fusion Events Database (FED), a database for the maintenance and retrieval of fusion data both in prokaryotic and eukaryotic organisms and the Software for the Analysis of Fusion Events (SAFE), a computational platform implemented for the automated detection, filtering and visualization of fusion events (both available at: http://www.bioacademy.gr/bioinformatics/projects/ProteinFusion/index.htm). Finally, we analyze the proteomes of three microorganisms using these tools in order to demonstrate their functionality.
A lengthy look at the daily grind: time series analysis of events, mood, stress, and satisfaction.
Fuller, Julie A; Stanton, Jeffrey M; Fisher, Gwenith G; Spitzmuller, Christiane; Russell, Steven S; Smith, Patricia C
2003-12-01
The present study investigated processes by which job stress and satisfaction unfold over time by examining the relations between daily stressful events, mood, and these variables. Using a Web-based daily survey of stressor events, perceived strain, mood, and job satisfaction completed by 14 university workers, 1,060 occasions of data were collected. Transfer function analysis, a multivariate version of time series analysis, was used to examine the data for relationships among the measured variables after factoring out the contaminating influences of serial dependency. Results revealed a contrast effect in which a stressful event associated positively with higher strain on the same day and associated negatively with strain on the following day. Perceived strain increased over the course of a semester for a majority of participants, suggesting that effects of stress build over time. Finally, the data were consistent with the notion that job satisfaction is a distal outcome that is mediated by perceived strain. ((c) 2003 APA, all rights reserved)
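Transfer function analysis first factors out serial dependency (prewhitening) before estimating lagged input-output relations. A minimal sketch with statsmodels, assuming an AR(1) prewhitening filter and synthetic stressor/strain series built to show the same-day positive and next-day negative (contrast) pattern reported above:

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(1)
n = 1060
stressor = rng.normal(size=n)
# Hypothetical strain: same-day positive effect, next-day negative (contrast)
strain = 0.6 * stressor + np.r_[0.0, -0.3 * stressor[:-1]] + rng.normal(0, 0.5, n)

# Prewhiten: fit AR(1) to the input and filter both series with its weights,
# so cross-correlations are not inflated by serial dependency
ar = AutoReg(stressor, lags=1).fit()
phi = ar.params[1]
x_w = stressor[1:] - phi * stressor[:-1]
y_w = strain[1:] - phi * strain[:-1]

for lag in (0, 1):
    r = np.corrcoef(x_w[: len(x_w) - lag], y_w[lag:])[0, 1]
    print(f"lag {lag}: r = {r:+.2f}")   # positive at lag 0, negative at lag 1
```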
Li, W.; Thorne, R. M.; Bortnik, J.; ...
2015-09-07
Determining preferential solar wind conditions leading to efficient radiation belt electron acceleration is crucial for predicting radiation belt electron dynamics. Using Van Allen Probes electron observations (>1 MeV) from 2012 to 2015, we identify a number of efficient and inefficient acceleration events separately to perform a superposed epoch analysis of the corresponding solar wind parameters and geomagnetic indices. By directly comparing efficient and inefficient acceleration events, we clearly show that prolonged southward Bz, high solar wind speed, and low dynamic pressure are critical for electron acceleration to >1 MeV energies in the heart of the outer radiation belt. We also evaluate chorus wave evolution using the superposed epoch analysis for the identified efficient and inefficient acceleration events and find that chorus wave intensity is much stronger and lasts longer during efficient electron acceleration events, supporting the scenario that chorus waves play a key role in MeV electron acceleration.
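Superposed epoch analysis itself is a simple stack-and-average around event onset times. A minimal sketch, assuming hourly data and synthetic onset indices (the high-speed-stream numbers are illustrative):

```python
import numpy as np

def superposed_epoch(series, event_idx, before=48, after=120):
    """Stack a series around each event index (epoch zero) and average."""
    windows = [series[i - before : i + after]
               for i in event_idx
               if i - before >= 0 and i + after <= len(series)]
    stack = np.vstack(windows)
    return stack.mean(axis=0)

# Hypothetical hourly solar wind speed with a high-speed stream after onsets
rng = np.random.default_rng(2)
vsw = rng.normal(400.0, 30.0, 5000)
onsets = [500, 1500, 3000, 4200]
for i in onsets:
    vsw[i : i + 48] += 150.0
mean_curve = superposed_epoch(vsw, onsets)
print(mean_curve[:48].mean(), mean_curve[48:96].mean())   # ~400 vs ~550 km/s
```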
Association between antipsychotics and cardiovascular adverse events: A systematic review.
Silva, Ana Amancio Santos Da; Ribeiro, Marina Viegas Moura Rezende; Sousa-Rodrigues, Célio Fernando de; Barbosa, Fabiano Timbó
2017-03-01
To determine whether there is an association between the risk of cardiovascular adverse events and the use of antipsychotic agents, we analysed original articles retrieved from the following databases: LILACS, PubMed, the Cochrane Central Register of Controlled Trials (CENTRAL) and PsycINFO, without language restriction, dated until November 2015. After screening of 2,812 studies, three original cohort articles were selected for quality analysis. Data from 403,083 patients with schizophrenia and 119,015 control participants were analyzed. The occurrence of cardiovascular events observed in the articles was 63.5% (article 1), 13.1% (article 2) and 24.95% (article 3) in the groups of treated schizophrenic patients, and 46.2%, 86.9% and 24.9%, respectively, in the control groups. Clinical heterogeneity among the studies allowed only a provisional answer and made it impossible to perform the meta-analysis, although the articles demonstrate an association between cardiovascular adverse events and the use of antipsychotics. More high-quality clinical trials are needed to support this evidence.
Maximum Likelihood Analysis of Low Energy CDMS II Germanium Data
Agnese, R.
2015-03-30
We report on the results of a search for a Weakly Interacting Massive Particle (WIMP) signal in low-energy data of the Cryogenic Dark Matter Search experiment using a maximum likelihood analysis. A background model is constructed using GEANT4 to simulate the surface-event background from 210Pb decay-chain events, while independent calibration data are used to model the gamma background. Fitting this background model to the data results in no statistically significant WIMP component. In addition, we perform fits using an analytic ad hoc background model proposed by Collar and Fields, who claimed to find a large excess of signal-like events in our data. We confirm the strong preference for a signal hypothesis in their analysis under these assumptions, but excesses are observed in both single- and multiple-scatter events, which implies the signal is not caused by WIMPs, but rather reflects the inadequacy of their background model.
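The analysis style described, fitting signal-plus-background yields to unbinned events, is an extended maximum likelihood fit. A minimal two-component sketch with scipy, assuming a falling-exponential "signal" and a flat background on a toy energy range; the shapes and yields are illustrative, not the CDMS II models:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import expon, uniform

# Hypothetical recoil-energy sample: flat background only on [0, 10] keV
rng = np.random.default_rng(3)
data = uniform.rvs(0, 10, size=500, random_state=rng)

def nll(params):
    """Extended negative log-likelihood: n_sig events from a falling
    exponential (scale 2 keV, truncated to [0, 10]) plus n_bkg flat events."""
    n_sig, n_bkg = params
    if n_sig < 0 or n_bkg <= 0:
        return np.inf
    sig_pdf = expon.pdf(data, scale=2.0) / expon.cdf(10.0, scale=2.0)
    bkg_pdf = np.full_like(data, 1.0 / 10.0)
    return (n_sig + n_bkg) - np.sum(np.log(n_sig * sig_pdf + n_bkg * bkg_pdf))

fit = minimize(nll, x0=[50.0, 450.0], method="Nelder-Mead")
print(fit.x)   # fitted signal yield should be consistent with zero
```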
Nickel, Daniela Alba; Calvo, Maria Cristina Marino; Natal, Sonia; Freitas, Sérgio Fernando Torres de; Hartz, Zulmira Maria de Araújo
2014-04-01
This article analyzes evaluation capacity-building based on the case study of a State Health Secretariat participating in the Project to Strengthen the Technical Capacity of State Health Secretariats in Monitoring and Evaluating Primary Healthcare. The case study adopted a mixed design with information from documents, semi-structured interviews, and evaluation of primary care by the State Health Secretariat in 2008-2011. Process analysis was used to identify the logical events that contributed to evaluation capacity-building, with two categories: evaluation capacity-building events and events for building organizational structure. The logical chain of events was formed by negotiation and agreement on the decision-making levels for the continuity of evaluation, data collection and analysis by the State Health Secretariat, a change in key indicators, restructuring of the evaluation matrix, and communication of the results to the municipalities. The three-way analysis showed that the aim of developing evaluation capacity was achieved.
Idaho National Laboratory Quarterly Performance Analysis for the 2nd Quarter FY 2015
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, Lisbeth A.
2015-04-01
This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of events for the 2nd Qtr FY-15.
Categorizing accident sequences in the external radiotherapy for risk analysis
2013-01-01
Purpose: This study identifies accident sequences from past accidents in order to help apply risk analysis to external radiotherapy. Materials and Methods: This study reviews 59 accidental cases in two retrospective safety analyses that have collected incidents in external radiotherapy extensively. Two accident analysis reports that accumulated past incidents are investigated to identify accident sequences, including initiating events, failure of safety measures, and consequences. This study classifies the accidents by the treatment stages and sources of errors for initiating events, types of failures in the safety measures, and types of undesirable consequences and the number of affected patients. Then, the accident sequences are grouped into several categories on the basis of similarity of progression. As a result, these cases can be categorized into 14 groups of accident sequences. Results: The result indicates that risk analysis needs to pay attention not only to the planning stage, but also to the calibration stage that is carried out prior to the main treatment process. It also shows that human error is the largest contributor to initiating events as well as to the failure of safety measures. This study also illustrates an event tree analysis for an accident sequence initiated in the calibration stage. Conclusion: This study is expected to provide insights into the accident sequences for prospective risk analysis through the review of experiences. PMID:23865005
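An event tree propagates an initiating-event frequency through the success or failure of each safety barrier; each leaf's probability is the product along its branch. A minimal sketch with hypothetical numbers (not from the reviewed incident data):

```python
# Hypothetical event tree: a calibration error (initiator) passes through two
# safety barriers; each sequence probability is the initiator frequency times
# the product of the branch probabilities along the sequence.
initiator = 1e-2                 # calibration error per treatment course
p_miss_check = 5e-2              # independent check misses the error
p_miss_dosimetry = 1e-1          # in-vivo dosimetry misses the error

sequences = {
    "caught by independent check": initiator * (1 - p_miss_check),
    "caught by in-vivo dosimetry": initiator * p_miss_check * (1 - p_miss_dosimetry),
    "mistreatment reaches patient": initiator * p_miss_check * p_miss_dosimetry,
}
for name, prob in sequences.items():
    print(f"{name}: {prob:.2e}")   # the three sequences sum to the initiator
```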
Chuang, Sheuwen; Howley, Peter P; Lin, Shih-Hua
2015-05-01
Root cause analysis (RCA) is often adopted to complement epidemiologic investigation for outbreaks and infection-related adverse events in hospitals; however, RCA has been argued to have limited effectiveness in preventing such events. We describe how an innovative systems analysis approach halted repeated scabies outbreaks, and highlight the importance of systems thinking for outbreaks analysis and sustaining effective infection prevention and control. Following RCA for a third successive outbreak of scabies over a 17-month period in a 60-bed respiratory care ward of a Taiwan hospital, a systems-oriented event analysis (SOEA) model was used to reanalyze the outbreak. Both approaches and the recommendations were compared. No nosocomial scabies have been reported for more than 1975 days since implementation of the SOEA. Previous intervals between seeming eradication and repeat outbreaks following RCA were 270 days and 180 days. Achieving a sustainable positive resolution relied on applying systems thinking and the holistic analysis of the system, not merely looking for root causes of events. To improve the effectiveness of outbreaks analysis and infection control, an emphasis on systems thinking is critical, along with a practical approach to ensure its effective implementation. The SOEA model provides the necessary framework and is a viable complementary approach, or alternative, to RCA. Copyright © 2015 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
Human Rights Event Detection from Heterogeneous Social Media Graphs.
Chen, Feng; Neill, Daniel B
2015-03-01
Human rights organizations are increasingly monitoring social media for identification, verification, and documentation of human rights violations. Since manual extraction of events from the massive amount of online social network data is difficult and time-consuming, we propose an approach for automated, large-scale discovery and analysis of human rights-related events. We apply our recently developed Non-Parametric Heterogeneous Graph Scan (NPHGS), which models social media data such as Twitter as a heterogeneous network (with multiple different node types, features, and relationships) and detects emerging patterns in the network, to identify and characterize human rights events. NPHGS efficiently maximizes a nonparametric scan statistic (an aggregate measure of anomalousness) over connected subgraphs of the heterogeneous network to identify the most anomalous network clusters. It summarizes each event with information such as type of event, geographical locations, time, and participants, and provides documentation such as links to videos and news reports. Building on our previous work that demonstrates the utility of NPHGS for civil unrest prediction and rare disease outbreak detection, we present an analysis of human rights events detected by NPHGS using two years of Twitter data from Mexico. NPHGS was able to accurately detect relevant clusters of human rights-related tweets prior to international news sources, and in some cases, prior to local news reports. Analysis of social media using NPHGS could enhance the information-gathering missions of human rights organizations by pinpointing specific abuses, revealing events and details that may be blocked from traditional media sources, and providing evidence of emerging patterns of human rights violations. This could lead to more timely, targeted, and effective advocacy, as well as other potential interventions.
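At the heart of NPHGS is a nonparametric scan statistic computed from empirical p-values of the nodes in a candidate cluster. A minimal sketch of a Berk-Jones-style score for a single cluster, assuming per-node p-values are already available; the subgraph search and the exact statistic used by NPHGS are not reproduced here:

```python
import numpy as np

def bj_statistic(pvals, alphas=(0.005, 0.01, 0.05, 0.1)):
    """Berk-Jones-style nonparametric scan score for one candidate cluster:
    N * KL(observed fraction of p-values <= alpha || alpha), maximized over
    a small grid of significance levels alpha."""
    p = np.asarray(pvals, float)
    n = len(p)
    best = 0.0
    for a in alphas:
        f = min(np.mean(p <= a), 1.0 - 1e-12)  # guard against log(0)
        if f > a:  # more small p-values than expected under the null
            kl = f * np.log(f / a) + (1 - f) * np.log((1 - f) / (1 - a))
            best = max(best, n * kl)
    return best

# Hypothetical per-node empirical p-values for two candidate clusters
quiet = [0.4, 0.7, 0.2, 0.9, 0.5]
anomalous = [0.001, 0.004, 0.03, 0.6, 0.008]
print(bj_statistic(quiet), bj_statistic(anomalous))  # low vs high score
```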
NASA Astrophysics Data System (ADS)
Slinskey, E. A.; Loikith, P. C.; Waliser, D. E.; Goodman, A.
2017-12-01
Extreme precipitation events are associated with numerous societal and environmental impacts. Furthermore, anthropogenic climate change is projected to alter precipitation intensity across portions of the Continental United States (CONUS). Therefore, a spatial understanding and intuitive means of monitoring extreme precipitation over time is critical. Towards this end, we apply an event-based indicator, developed as a part of NASA's support of the ongoing efforts of the US National Climate Assessment, which assigns categories to extreme precipitation events based on 3-day storm totals as a basis for dataset intercomparison. To assess observational uncertainty across a wide range of historical precipitation measurement approaches, we intercompare in situ station data from the Global Historical Climatology Network (GHCN), satellite-derived precipitation data from NASA's Tropical Rainfall Measuring Mission (TRMM), gridded in situ station data from the Parameter-elevation Regressions on Independent Slopes Model (PRISM), global reanalysis from NASA's Modern-Era Retrospective analysis for Research and Applications, version 2 (MERRA-2), and regional reanalysis with gauge data assimilation from NCEP's North American Regional Reanalysis (NARR). Results suggest considerable variability across the five-dataset suite in the frequency, spatial extent, and magnitude of extreme precipitation events. Consistent with expectations, higher resolution datasets were found to resemble station data best and capture a greater frequency of high-end extreme events relative to lower spatial resolution datasets. The degree of dataset agreement varies regionally; however, all datasets successfully capture the seasonal cycle of precipitation extremes across the CONUS. These intercomparison results provide additional insight about observational uncertainty and the ability of a range of precipitation measurement and analysis products to capture extreme precipitation event climatology. While the event category threshold is fixed in this analysis, preliminary results from the development of a flexible categorization scheme, which scales with grid resolution, are presented.
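The indicator described here reduces to rolling 3-day precipitation totals binned into categories. A minimal sketch, assuming daily totals in millimetres; the category bounds are placeholders, not the NCA indicator's actual thresholds:

```python
import numpy as np

def categorize_events(daily_precip_mm, bounds=(50, 100, 150, 200)):
    """Rolling 3-day storm totals, then a category 0-4 per window from fixed
    thresholds (the bounds here are placeholders, not the indicator's)."""
    x = np.asarray(daily_precip_mm, float)
    totals = np.convolve(x, np.ones(3), mode="valid")   # 3-day totals
    return totals, np.digitize(totals, bounds)          # categories 0..4

daily = [2, 0, 60, 75, 30, 0, 0, 5]
totals, cats = categorize_events(daily)
print(totals)   # [ 62. 135. 165. 105.  30.   5.]
print(cats)     # [1 2 3 2 0 0]
```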
Cardiovascular disease in live related renal transplantation.
Kaul, A; Sharma, R K; Gupta, A; Sinha, N; Singh, U
2011-11-01
Cardiovascular disease has become the leading cause of morbidity and mortality in renal transplant recipients, although its pathogenesis and treatment are poorly understood. Modifiable cardiovascular risk factors and graft dysfunction both play an important role in the development of post transplant cardiovascular events. The prevalence of cardiovascular disease was studied in stable kidney transplant patients on cyclosporine-based triple immunosuppression in relation to various risk factors and post transplant cardiovascular events. In an analysis of 562 post transplant patients with stable graft function for 6 months, the patients were evaluated for cardiovascular events in the post transplant period. Pre and post transplant risk factors were analyzed using the Cox proportional hazards model. 174 patients had undergone pre transplant coronary angiography, and 15 of these underwent coronary revascularization (angioplasty in 12, CABG in 3). The prevalence of CAD was 7.2% in transplant recipients. Of 42 patients with CAD, 31 (73.8%) had a cardiovascular event in the post transplant period. Age ≥ 40 yrs, male sex, graft dysfunction, diabetes as primary renal disease, pre transplant cardiovascular event, and chronic rejection showed significant correlation in univariate analysis, and there was a significant association of age ≥ 40 years (OR = 2.16, 95% CI 0.977-4.78), S creatinine ≥ 1.4 mg% (OR = 2.40, 95% CI 1.20-4.82), diabetes as primary disease (OR = 3.67, 95% CI 3.2-14.82), PTDM (OR = 3.67, 95% CI 1.45-9.40), and pre-transplant cardiovascular disease (OR = 4.14, 95% CI .38-13.15) with post transplant cardiovascular events on multivariate analysis. There was poor patient and graft survival among those who suffered a post transplant cardiovascular event. The incidence of cardiovascular disease continues to be high after renal transplantation, and modifiable risk factors should be identified to prevent the occurrence of events in the post transplant period.
Kraan, Tamar; Velthorst, Eva; Smit, Filip; de Haan, Lieuwe; van der Gaag, Mark
2015-02-01
Childhood trauma and recent life-events have been related to psychotic disorders. The aim of the present study was to examine whether childhood trauma and recent life-events are significantly more prevalent in patients at Ultra High Risk (UHR) of developing a psychotic disorder compared to healthy controls. A search of PsychInfo and Embase was conducted, relevant papers were reviewed, and three random-effects meta-analyses were performed. One meta-analysis assessed the prevalence rate of childhood trauma in UHR subjects and two meta-analyses were conducted to compare UHR subjects and healthy control subjects on the experience of childhood trauma and recent life-events. We found 12 studies on the prevalence of (childhood) trauma in UHR populations and 4 studies on recent life-events in UHR populations. We performed a meta-analysis on 6 studies (of which trauma prevalence rates were available) on childhood trauma in UHR populations, yielding a mean prevalence rate of 86.8% (95% CI 77%-93%). Childhood trauma was significantly more prevalent in UHR subjects compared to healthy control groups (Random effects Hedges' g=1.09; Z=4.60, p<.001). In contrast to our hypothesis, life-event rates were significantly lower in UHR subjects compared to healthy controls (Random effects Hedges' g=-0.53; Z=-2.36, p<.02). Our meta-analytic results illustrate that childhood trauma is highly prevalent among UHR subjects and that childhood trauma is related to UHR status. These results are in line with studies on childhood trauma in psychotic populations. In contrast to studies on recent life-events in psychotic populations, our results show that recent life-events are not associated with UHR status. Copyright © 2014 Elsevier B.V. All rights reserved.
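Hedges' g, the effect size reported above, is a standardized mean difference with a small-sample correction. A minimal sketch; the group means, SDs, and sizes are hypothetical, not the reviewed studies' data:

```python
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference with Hedges' small-sample correction J."""
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled                 # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # correction factor
    return j * d

# Hypothetical childhood-trauma scores: UHR group vs healthy controls
print(round(hedges_g(m1=42.0, s1=9.0, n1=60, m2=33.0, s2=8.0, n2=55), 2))  # ~1.05
```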
Adaptation of Chain Event Graphs for use with Case-Control Studies in Epidemiology.
Keeble, Claire; Thwaites, Peter Adam; Barber, Stuart; Law, Graham Richard; Baxter, Paul David
2017-09-26
Case-control studies are used in epidemiology to try to uncover the causes of diseases, but are a retrospective study design known to suffer from non-participation and recall bias, which may explain their decreased popularity in recent years. Traditional analyses usually report only the odds ratio for given exposures and the binary disease status. Chain event graphs are a graphical representation of a statistical model derived from event trees, which have been developed in artificial intelligence and statistics, and only recently introduced to the epidemiology literature. They are a modern Bayesian technique which enables prior knowledge to be incorporated into the data analysis, using the agglomerative hierarchical clustering algorithm to form a suitable chain event graph. Additionally, they can account for missing data and be used to explore missingness mechanisms. Here we adapt the chain event graph framework to suit scenarios often encountered in case-control studies, to strengthen this study design, which is time and financially efficient. We demonstrate eight adaptations to the graphs: two suitable for full case-control study analysis, four which can be used in interim analyses to explore biases, and two which aim to improve the ease and accuracy of analyses. The adaptations are illustrated with complete, reproducible, fully-interpreted examples, including the event tree and chain event graph. Chain event graphs are used here for the first time to summarise non-participation, data collection techniques, data reliability, and disease severity in case-control studies. We demonstrate how these features of a case-control study can be incorporated into the analysis to provide further insight, which can help to identify potential biases and lead to more accurate study results.
TRACE/PARCS analysis of the OECD/NEA Oskarshamn-2 BWR stability benchmark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kozlowski, T.; Downar, T.; Xu, Y.
2012-07-01
On February 25, 1999, the Oskarshamn-2 NPP experienced a stability event which culminated in diverging power oscillations with a decay ratio of about 1.4. The event was successfully modeled by the TRACE/PARCS coupled code system, and further analysis of the event is described in this paper. The results show very good agreement with the plant data, capturing the entire behavior of the transient including the onset of instability, growth of the oscillations (decay ratio) and oscillation frequency. This provides confidence in the prediction of other parameters which are not available from the plant records. The event provides coupled code validation for a challenging BWR stability event, which involves the accurate simulation of neutron kinetics (NK), thermal-hydraulics (TH), and TH/NK coupling. The success of this work has demonstrated the ability of the 3-D coupled systems code TRACE/PARCS to capture the complex behavior of BWR stability events. The problem was released as an international OECD/NEA benchmark, and it is the first benchmark based on measured plant data for a stability event with a decay ratio greater than one. Interested participants are invited to contact the authors for more information. (authors)
Lu, Yan; Li, Tao
2014-03-01
To explore the effect of Chinese drugs for activating blood circulation and removing blood stasis (CDABCRBS) on carotid atherosclerotic plaque and long-term ischemic cerebrovascular events. Using an open, controlled design, the effects of 4 regimens (platelet antagonists; platelet antagonists + CDABCRBS; platelet antagonists + atorvastatin; platelet antagonists + atorvastatin + CDABCRBS) on carotid atherosclerotic plaque and long-term ischemic cerebrovascular events were analyzed in 90 cerebral infarction patients. By survival analysis, there was no statistical difference in the effect of the 4 interventions on the variation of carotid stenosis rates or on ischemic cerebrovascular events (P > 0.05). The occurrence of ischemic cerebrovascular events could be postponed by about 4 months in those treated with platelet antagonists + CDABCRBS and platelet antagonists + atorvastatin + CDABCRBS. By multivariate logistic analysis, age, hypertension, and clopidogrel were associated with stenosis of extracranial carotid arteries (P < 0.05). Age, diabetes, aspirin, clopidogrel, and CDABCRBS were correlated with cerebrovascular accidents (P < 0.05). The presence of hypertension is an influential factor for carotid stenosis, but it does not affect the occurrence of ischemic cerebrovascular events. CDABCRBS could effectively delay the occurrence of ischemic cerebrovascular events.
Towards cross-lingual alerting for bursty epidemic events.
Collier, Nigel
2011-10-06
Online news reports are increasingly becoming a source for event-based early warning systems that detect natural disasters. Harnessing the massive volume of information available from multilingual newswire presents as many challenges as opportunities due to the patterns of reporting complex spatio-temporal events. In this article we study the problem of utilising correlated event reports across languages. We track the evolution of 16 disease outbreaks using 5 temporal aberration detection algorithms on text-mined events classified according to disease and outbreak country. Using ProMED reports as a silver standard, comparative analysis of news data for 13 languages over a 129-day trial period showed improved sensitivity, F1 and timeliness across most models using cross-lingual events. We report a detailed case study analysis for cholera in Angola in 2010 which highlights the challenges faced in correlating news events with the silver standard. The results show that automated health surveillance using multilingual text mining has the potential to turn low-value news into high-value alerts if informed choices are used to govern the selection of models and data sources. An implementation of the C2 alerting algorithm using multilingual news is available at the BioCaster portal http://born.nii.ac.jp/?page=globalroundup.
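The C2 algorithm mentioned above belongs to the EARS family of temporal aberration detectors: each day's count is standardized against a short baseline separated by a guard band. A minimal sketch, assuming daily event counts and the commonly cited 7-day baseline, 2-day lag, and threshold of 3; the BioCaster implementation's parameter choices may differ:

```python
import numpy as np

def ears_c2(counts, baseline=7, lag=2, threshold=3.0):
    """EARS C2-style detector: standardize each day's count against a 7-day
    baseline separated by a 2-day guard band; alert above the threshold."""
    counts = np.asarray(counts, float)
    alerts = []
    for t in range(baseline + lag, len(counts)):
        ref = counts[t - baseline - lag : t - lag]
        mu, sd = ref.mean(), max(ref.std(ddof=1), 1e-9)
        if (counts[t] - mu) / sd > threshold:
            alerts.append(t)
    return alerts

# Hypothetical daily counts of text-mined outbreak events for one country
daily = [1, 0, 2, 1, 1, 0, 1, 2, 1, 1, 0, 9, 12, 2]
print(ears_c2(daily))   # -> [11, 12]
```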
Root Cause Analysis: Learning from Adverse Safety Events.
Brook, Olga R; Kruskal, Jonathan B; Eisenberg, Ronald L; Larson, David B
2015-10-01
Serious adverse events continue to occur in clinical practice, despite our best preventive efforts. It is essential that radiologists, both as individuals and as a part of organizations, learn from such events and make appropriate changes to decrease the likelihood that such events will recur. Root cause analysis (RCA) is a process to (a) identify factors that underlie variation in performance or that predispose an event toward undesired outcomes and (b) allow for development of effective strategies to decrease the likelihood of similar adverse events occurring in the future. An RCA process should be performed within the environment of a culture of safety, focusing on underlying system contributors and, in a confidential manner, taking into account the emotional effects on the staff involved. The Joint Commission now requires that a credible RCA be performed within 45 days for all sentinel or major adverse events, emphasizing the need for all radiologists to understand the processes with which an effective RCA can be performed. Several RCA-related tools that have been found to be useful in the radiology setting include the "five whys" approach to determine causation; cause-and-effect, or Ishikawa, diagrams; causal tree mapping; affinity diagrams; and Pareto charts. © RSNA, 2015.
Ford, Alexander C; Malfertheiner, Peter; Giguère, Monique; Santana, José; Khan, Mostafizur; Moayyedi, Paul
2008-01-01
AIM: To assess the safety of bismuth used in Helicobacter pylori (H pylori) eradication therapy regimens. METHODS: We conducted a systematic review and meta-analysis. MEDLINE and EMBASE were searched (up to October 2007) to identify randomised controlled trials comparing bismuth with placebo or no treatment, or bismuth salts in combination with antibiotics as part of eradication therapy with the same dose and duration of antibiotics alone or, in combination, with acid suppression. Total numbers of adverse events were recorded. Data were pooled and expressed as relative risks with 95% confidence intervals (CI). RESULTS: We identified 35 randomised controlled trials containing 4763 patients. There were no serious adverse events occurring with bismuth therapy. There was no statistically significant difference detected in total adverse events with bismuth [relative risk (RR) = 1.01; 95% CI: 0.87-1.16], specific individual adverse events, with the exception of dark stools (RR = 5.06; 95% CI: 1.59-16.12), or adverse events leading to withdrawal of therapy (RR = 0.86; 95% CI: 0.54-1.37). CONCLUSION: Bismuth for the treatment of H pylori is safe and well-tolerated. The only adverse event occurring significantly more commonly was dark stools. PMID:19109870
NASA Astrophysics Data System (ADS)
Bauwe, Andreas; Tiemeyer, Bärbel; Kahle, Petra; Lennartz, Bernd
2015-04-01
Nitrate is one of the most important sources of pollution for surface waters in tile-drained agricultural areas. In order to develop appropriate management strategies to reduce nitrate losses, it is crucial to first understand the underlying hydrological processes. In this study, we used Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) to analyze 212 storm events between 2004 and 2011 across three spatial scales (collector drain, ditch, and brook) to identify the controlling factors for hydrograph response characteristics and their influence on nitrate concentration patterns. Our results showed that the 212 hydrological events can be classified into six different types: summer events (28%), snow-dominated events (10%), events controlled by rainfall duration (16%), rainfall totals (8%), dry antecedent conditions (10%), and events controlled by wet antecedent conditions (14%). The relatively large number of unclassified events (15%) demonstrated the difficulty in separating event types due to mutually influencing variables. NO3-N concentrations showed a remarkably consistent pattern during the discharge events regardless of event type, with minima at the beginning, increasing concentrations at the rising limb, and maxima around peak discharge. However, the level of NO3-N concentrations varied notably among the event types. The highest average NO3-N concentrations were found for events controlled by rainfall totals (NO3-N=17.1 mg/l), events controlled by wet antecedent conditions (NO3-N=17.1 mg/l), and snowmelt (NO3-N=15.2 mg/l). Average maximum NO3-N concentrations were significantly lower during summer events (NO3-N=10.2 mg/l) and events controlled by dry antecedent conditions (NO3-N=11.7 mg/l). The results have furthermore shown that similar hydrological and biogeochemical processes determine the hydrograph and NO3-N response to storm events at various spatial scales. The management of tile-drained agricultural land to reduce NO3-N losses should focus explicitly on flow events and, more specifically, active management should preferably be conducted in the winter season for discharge events after snowmelt, after heavy rain storms and when the soil moisture conditions are wet.
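The PCA-plus-LDA pipeline used here compresses event descriptors and then finds the directions that best separate event types. A minimal scikit-learn sketch on synthetic descriptors; the three variables and two classes are illustrative, not the study's predictor set:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical event descriptors: rainfall total (mm), duration (h),
# antecedent soil moisture (-); two synthetic event types
rng = np.random.default_rng(4)
X = np.vstack([
    rng.normal([40, 6, 0.2], [10, 2, 0.05], size=(30, 3)),   # summer-like
    rng.normal([25, 18, 0.8], [8, 4, 0.05], size=(30, 3)),   # wet-antecedent
])
labels = np.array([0] * 30 + [1] * 30)

scores = PCA(n_components=2).fit_transform(X)           # compress descriptors
lda = LinearDiscriminantAnalysis().fit(scores, labels)  # separate event types
print(lda.score(scores, labels))                        # resubstitution accuracy
```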
2012-01-01
Background: Adverse consequences of medical interventions are a source of concern, but clinical trials may lack power to detect elevated rates of such events, while observational studies have inherent limitations. Meta-analysis allows the combination of individual studies, which can increase power and provide stronger evidence relating to adverse events. However, meta-analysis of adverse events has associated methodological challenges. The aim of this study was to systematically identify and review the methodology used in meta-analyses where a primary outcome is an adverse or unintended event, following a therapeutic intervention. Methods: Using a collection of reviews identified previously, 166 references including a meta-analysis were selected for review. At least one of the primary outcomes in each review was an adverse or unintended event. The nature of the intervention, source of funding, number of individual meta-analyses performed, number of primary studies included in the review, and use of meta-analytic methods were all recorded. Specific areas of interest relating to the methods used included the choice of outcome metric, methods of dealing with sparse events, heterogeneity, publication bias and use of individual patient data. Results: The 166 included reviews were published between 1994 and 2006. Interventions included drugs and surgery among other interventions. Many of the references being reviewed included multiple meta-analyses, with 44.6% (74/166) including more than ten. Randomised trials only were included in 42.2% of meta-analyses (70/166), observational studies only in 33.7% (56/166) and a mix of observational studies and trials in 15.7% (26/166). Sparse data, in the form of zero events in one or both arms where the outcome was a count of events, was found in 64 reviews of two-arm studies, of which 41 (64.1%) had zero events in both arms. Conclusions: Meta-analyses of adverse events data are common and useful in terms of increasing the power to detect an association with an intervention, especially when the events are infrequent. However, with regard to existing meta-analyses, a wide variety of different methods have been employed, often with no evident rationale for using a particular approach. More specifically, the approach to dealing with zero events varies, and guidelines on this issue would be desirable. PMID:22553987
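One of the sparse-data issues above, a zero count in one arm, is commonly handled with a 0.5 continuity correction before computing a relative risk and its confidence interval. A minimal sketch with illustrative counts; this is one common convention, not a recommendation among the methods the review surveys:

```python
import math

def relative_risk_cc(a, n1, b, n2, z=1.96):
    """Relative risk with a 0.5 continuity correction (added to all four
    cells) when either arm has zero events; returns (RR, lower, upper)."""
    if a == 0 or b == 0:
        a, b = a + 0.5, b + 0.5
        n1, n2 = n1 + 1, n2 + 1
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)   # SE of log(RR)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# Hypothetical trial: 0/100 events on treatment vs 4/100 on control
print(relative_risk_cc(0, 100, 4, 100))
```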
Partial and no recovery from delirium after hospital discharge predict increased adverse events.
Cole, Martin G; McCusker, Jane; Bailey, Robert; Bonnycastle, Michael; Fung, Shek; Ciampi, Antonio; Belzile, Eric
2017-01-08
The implications of partial and no recovery from delirium after hospital discharge are not clear. We sought to explore whether partial and no recovery from delirium among recently discharged patients predicted increased adverse events (emergency room visits, hospitalisations, death) during the subsequent 3 months. Prospective study of recovery from delirium in older hospital inpatients. The Confusion Assessment Method was used to diagnose delirium in hospital and determine recovery status after discharge (T0). Adverse events were determined during the 3 months after T0. Survival analysis to the first adverse event and counting process modelling for one or more adverse events were used to examine associations between recovery status (ordinal variable, 0, 1 or 2 for full, partial or no recovery, respectively) and adverse events. Of 278 hospital inpatients with delirium, 172 were discharged before the assessment of recovery status (T0). Delirium recovery status at T0 was determined for 152: 25 had full recovery, 32 had partial recovery and 95 had no recovery. Forty-four patients had at least one adverse event during the subsequent 3 months. In multivariable analysis of one or more adverse events, poorer recovery status predicted increased adverse events; the hazard ratio (HR) (95% confidence interval, CI) was 1.72 (1.09, 2.71). The association of recovery status with adverse events was stronger among patients without dementia. Partial and no recovery from delirium after hospital discharge appear to predict increased adverse events during the subsequent 3 months. These findings have potentially important implications for in-hospital and post-discharge management and policy.
Nishii, Nobuhiro; Miyoshi, Akihito; Kubo, Motoki; Miyamoto, Masakazu; Morimoto, Yoshimasa; Kawada, Satoshi; Nakagawa, Koji; Watanabe, Atsuyuki; Nakamura, Kazufumi; Morita, Hiroshi; Ito, Hiroshi
2018-03-01
Remote monitoring (RM) has been advocated as the new standard of care for patients with cardiovascular implantable electronic devices (CIEDs). RM has allowed the early detection of adverse clinical events, such as arrhythmia, lead failure, and battery depletion. However, lead failure was often identified only by arrhythmic events, but not impedance abnormalities. To compare the usefulness of arrhythmic events with conventional impedance abnormalities for identifying lead failure in CIED patients followed by RM. CIED patients in 12 hospitals have been followed by the RM center in Okayama University Hospital. All transmitted data have been analyzed and summarized. From April 2009 to March 2016, 1,873 patients have been followed by the RM center. During the mean follow-up period of 775 days, 42 lead failure events (atrial lead 22, right ventricular pacemaker lead 5, implantable cardioverter defibrillator [ICD] lead 15) were detected. The proportion of lead failures detected only by arrhythmic events, which were not detected by conventional impedance abnormalities, was significantly higher than that detected by impedance abnormalities (arrhythmic event 76.2%, 95% CI: 60.5-87.9%; impedance abnormalities 23.8%, 95% CI: 12.1-39.5%). Twenty-seven events (64.7%) were detected without any alert. Of 15 patients with ICD lead failure, none has experienced inappropriate therapy. RM can detect lead failure earlier, before clinical adverse events. However, CIEDs often diagnose lead failure as just arrhythmic events without any warning. Thus, to detect lead failure earlier, careful human analysis of arrhythmic events is useful. © 2017 Wiley Periodicals, Inc.
Dźwiarek, Marek; Latała, Agata
2016-01-01
This article presents an analysis of results of 1035 serious and 341 minor accidents recorded by Poland's National Labour Inspectorate (PIP) in 2005-2011, in view of their prevention by means of additional safety measures applied by machinery users. Since the analysis aimed at formulating principles for the application of technical safety measures, the analysed accidents should bear additional attributes: the type of machine operation, technical safety measures and the type of events causing injuries. The analysis proved that the executed tasks and injury-causing events were closely connected and there was a relation between casualty events and technical safety measures. In the case of tasks consisting of manual feeding and collecting materials, the injuries usually occur because of the rotating motion of tools or crushing due to a closing motion. Numerous accidents also happened in the course of supporting actions, like removing pollutants, correcting material position, cleaning, etc.
Analysis of selected microflares observed by SphinX over the last minimum of solar activity
NASA Astrophysics Data System (ADS)
Siarkowski, Marek; Sylwester, Janusz; Sylwester, Barbara; Gryciuk, Magdalena
The Solar Photometer in X-rays (SphinX) was designed to observe soft X-ray solar emission in the energy range between 1 keV and 15 keV with a resolution better than 0.5 keV. The instrument operated from February until November 2009 aboard the CORONAS-Photon satellite, during the phase of an exceptionally low minimum of solar activity. Here we use SphinX data for the analysis of selected microflare-class events, choosing events with unusual lightcurves or locations. Our study involves determination of temporal characteristics (times of start, maximum and end of flares) and analysis of physical conditions in the flaring plasma (temperature, emission measure). A dedicated method has been used in order to remove emission not related to the flare. Supplementary information about the morphology and evolution of the investigated events has been derived from the analysis of XRT/Hinode and SECCHI/STEREO images.
LHCb trigger streams optimization
NASA Astrophysics Data System (ADS)
Derkach, D.; Kazeev, N.; Neychev, R.; Panin, A.; Trofimov, I.; Ustyuzhanin, A.; Vesterinen, M.
2017-10-01
The LHCb experiment stores around 10¹¹ collision events per year. A typical physics analysis deals with a final sample of up to 10⁷ events. Event preselection algorithms (lines) are used for data reduction. Since the data are stored in a format that requires sequential access, the lines are grouped into several output file streams, in order to increase the efficiency of user analysis jobs that read these data. The scheme efficiency heavily depends on the stream composition. By putting similar lines together and balancing the stream sizes it is possible to reduce the overhead. We present a method for finding an optimal stream composition. The method is applied to a part of the LHCb data (Turbo stream) at the stage where it is prepared for user physics analysis. This results in an expected improvement of 15% in the speed of user analysis jobs, and will be applied to data to be recorded in 2017.
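The optimization target can be phrased as follows: a job that needs any line in a stream must read the whole stream sequentially, so grouping co-accessed lines together and isolating rarely shared ones reduces the data read. A minimal brute-force sketch over two-stream compositions; the line names, rates, and reader sets are invented for illustration, and the actual LHCb method is not reproduced here:

```python
import itertools

# Hypothetical trigger lines: (kept rate in arbitrary units, analyses reading it)
lines = {
    "charm_D0": (80, {"charm"}),
    "charm_Dst": (60, {"charm"}),
    "beauty_B0": (50, {"beauty"}),
    "beauty_Bs": (40, {"beauty"}),
    "ew_Z": (5, {"ew"}),
}

def cost(grouping):
    """Data a user job must read: every analysis touching a stream reads the
    whole stream sequentially, so cost = sum(stream size * reader count)."""
    total = 0
    for stream in grouping:
        size = sum(lines[l][0] for l in stream)
        readers = set().union(*(lines[l][1] for l in stream))
        total += size * len(readers)
    return total

names = list(lines)
best = min(  # brute force over two-stream compositions (fine at toy scale)
    ([set(c), set(names) - set(c)]
     for r in range(1, len(names))
     for c in itertools.combinations(names, r)),
    key=cost,
)
print(best, cost(best))   # charm lines share a stream; beauty+ew take the other
```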
A computer aided treatment event recognition system in radiation therapy.
Xia, Junyi; Mart, Christopher; Bayouth, John
2014-01-01
To develop an automated system to safeguard radiation therapy treatments by analyzing electronic treatment records and reporting treatment events. CATERS (Computer Aided Treatment Event Recognition System) was developed to detect treatment events by retrieving and analyzing electronic treatment records. CATERS is designed to make the treatment monitoring process more efficient by automating the search of the electronic record for possible deviations from the physician's intention, such as logical inconsistencies as well as aberrant treatment parameters (e.g., beam energy, dose, table position, prescription change, treatment overrides, etc.). Over a 5 month period (July 2012-November 2012), physicists were assisted by the CATERS software in conducting normal weekly chart checks with the aims of (a) determining the relative frequency of particular events in the authors' clinic and (b) incorporating these checks into CATERS. During this study period, 491 patients were treated at the University of Iowa Hospitals and Clinics for a total of 7692 fractions. All treatment records from the 5 month analysis period were evaluated using all the checks incorporated into CATERS after the training period. In total, 553 events were detected as exceptions, although none of them had a significant dosimetric impact on patient treatments. These events included every known event type that was discovered during the trial period. A frequency analysis of the events showed that the top three types of detected events were couch position overrides (3.2%), extra cone beam imaging (1.85%), and significant couch position deviations (1.31%). A significant couch deviation is defined as a treatment where the couch vertical exceeded two times the standard deviation of all couch verticals, or the couch lateral/longitudinal exceeded three times the standard deviation of all couch laterals and longitudinals. On average, the application takes about 1 s per patient when executed on either a desktop computer or a mobile device. CATERS offers an effective tool to detect and report treatment events. Automation and rapid processing enable daily interrogation of the electronic record, alerting the medical physicist of deviations potentially days before the weekly check is performed. The output of CATERS could also be utilized as an important input to failure mode and effects analysis.
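The significant-couch-deviation rule described above is easy to express directly: flag fractions whose couch vertical deviates by more than two standard deviations from the patient's own mean, or lateral/longitudinal by more than three. A minimal sketch on hypothetical couch readings:

```python
import numpy as np

def flag_couch_deviations(vert, lat, lng):
    """Flag fractions whose couch vertical deviates more than 2 standard
    deviations, or lateral/longitudinal more than 3, from the patient mean."""
    flags = []
    for values, k in ((np.asarray(vert), 2.0),
                      (np.asarray(lat), 3.0),
                      (np.asarray(lng), 3.0)):
        sd = values.std(ddof=1)
        flags.append(np.abs(values - values.mean()) > k * sd)
    return np.any(flags, axis=0)

# Hypothetical couch positions (cm) over 10 fractions; fraction 7 is aberrant
vert = [12.1, 12.0, 12.2, 12.1, 12.0, 12.1, 12.2, 14.9, 12.1, 12.0]
lat = [0.5] * 10
lng = [95.0] * 10
print(np.where(flag_couch_deviations(vert, lat, lng))[0])   # -> [7]
```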
NASA Astrophysics Data System (ADS)
Heo, J. H.; Ahn, H.; Kjeldsen, T. R.
2017-12-01
South Korea is prone to large, and often disastrous, rainfall events caused by a mixture of monsoon and typhoon rainfall phenomena. However, traditionally, regional frequency analysis models did not consider this mixture of phenomena when fitting probability distributions, potentially underestimating the risk posed by the more extreme typhoon events. Using long-term observed records of extreme rainfall from 56 sites combined with detailed information on the timing and spatial impact of past typhoons from the Korea Meteorological Administration (KMA), this study developed and tested a new mixture model for frequency analysis of two different phenomena: events occurring regularly every year (monsoon) and events only occurring in some years (typhoon). The available annual maximum 24 hour rainfall data were divided into two sub-samples corresponding to years where the annual maximum is from either (1) a typhoon event, or (2) a non-typhoon event. Then, a three-parameter GEV distribution was fitted to each sub-sample along with a weighting parameter characterizing the proportion of historical events associated with typhoon events. Spatial patterns of the model parameters were analyzed and showed that typhoon events are less commonly associated with annual maximum rainfall in the north-west part of the country (Seoul area), and more prevalent in the southern and eastern parts, leading to the formation of two distinct typhoon regions: (1) north-west; and (2) southern and eastern. Using a leave-one-out procedure, the new regional frequency model was tested and compared to a more traditional index flood method. The results showed that the impact of typhoons on design events might previously have been underestimated in the Seoul area. This suggests that the mixture model should be preferred where the typhoon phenomenon is less frequent and can thus have a significant effect on the rainfall-frequency curve. This research was supported by a grant (2017-MPSS31-001) from the Supporting Technology Development Program for Disaster Management funded by the Ministry of Public Safety and Security (MPSS) of the Korean government.
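Under the two-population view, the annual-maximum CDF is a weighted mixture, F(x) = p·F_typhoon(x) + (1-p)·F_other(x), and a T-year design rainfall is a quantile of that mixture. A minimal sketch with scipy, fitting GEVs to the two sub-samples and inverting the mixture numerically; all sample values are synthetic, and this is a sketch of the general idea rather than the study's exact estimation procedure:

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import genextreme

def mixture_quantile(am_typhoon, am_other, p_typhoon, return_period=100.0):
    """Fit separate GEVs to typhoon-year and other-year annual maxima, then
    invert the weighted mixture CDF for the T-year design rainfall."""
    g_t = genextreme.fit(am_typhoon)
    g_o = genextreme.fit(am_other)
    target = 1.0 - 1.0 / return_period
    mix_cdf = lambda x: (p_typhoon * genextreme.cdf(x, *g_t)
                         + (1.0 - p_typhoon) * genextreme.cdf(x, *g_o))
    hi = 5.0 * max(np.max(am_typhoon), np.max(am_other))
    return brentq(lambda x: mix_cdf(x) - target, 1e-6, hi)

# Hypothetical 24-h annual maxima (mm); heavier tail in typhoon years
rng = np.random.default_rng(5)
typhoon = genextreme.rvs(-0.2, loc=180, scale=60, size=25, random_state=rng)
other = genextreme.rvs(0.0, loc=120, scale=30, size=35, random_state=rng)
print(round(mixture_quantile(typhoon, other, p_typhoon=25 / 60), 1))
```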
The FTA Method And A Possibility Of Its Application In The Area Of Road Freight Transport
NASA Astrophysics Data System (ADS)
Poliaková, Adela
2015-06-01
The Fault Tree process utilizes logic diagrams to portray and analyse potentially hazardous events. Three basic symbols (logic gates) are adequate for diagramming any fault tree. However, additional recently developed symbols can be used to reduce the time and effort required for analysis. A fault tree is a graphical representation of the relationship between certain specific events and the ultimate undesired event (2). This paper describes the basics of the Fault Tree Analysis method and provides a practical view of its possible application to quality improvement in a road freight transport company.
Seismic event near Jarocin (Poland)
NASA Astrophysics Data System (ADS)
Lizurek, Grzegorz; Plesiewicz, Beata; Wiejacz, Paweł; Wiszniowski, Jan; Trojanowski, Jacek
2013-02-01
The earthquake of magnitude ML = 3.8 (EMSC) took place on Friday, 6 January 2012, north-east of the town of Jarocin in the Wielkopolska Region, Poland. The only historical information about past earthquakes in the region was found in a diary from 1824; apart from it, a seismic event was noted in the vicinity of Wielkopolska in 1606 (Pagaczewski 1982). The scope of this paper is to describe the 6 January 2012 event in view of instrumental seismology, macroseismic data analysis, and the known tectonics of the region, which should be useful in future seismic hazard analysis of Poland.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sezen, Halil; Aldemir, Tunc; Denning, R.
Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.
Activity recognition using Video Event Segmentation with Text (VEST)
NASA Astrophysics Data System (ADS)
Holloway, Hillary; Jones, Eric K.; Kaluzniacki, Andrew; Blasch, Erik; Tierno, Jorge
2014-06-01
Multi-Intelligence (multi-INT) data includes video, text, and signals that require analysis by operators. Analysis methods include information fusion approaches such as filtering, correlation, and association. In this paper, we discuss the Video Event Segmentation with Text (VEST) method, which provides event boundaries of an activity to compile related messages and video clips for future review. VEST infers meaningful activities by clustering multiple streams of time-sequenced multi-INT intelligence data and derived fusion products. We discuss exemplar results that segment raw full-motion video (FMV) data by using extracted commentary message timestamps, FMV metadata, and user-defined queries.
Risk Analysis of Return Support Material on Gas Compressor Platform Project
NASA Astrophysics Data System (ADS)
Silvianita; Aulia, B. U.; Khakim, M. L. N.; Rosyid, Daniel M.
2017-07-01
Fixed platform projects are carried out not by a single contractor but by two or more contractors. Cooperation in the construction of fixed platforms often does not go according to plan, owing to several factors. Good synergy between the contractors is needed to avoid miscommunication that may cause problems on the project. An example is the support material (sea fastening, skid shoes, and shipping supports) used when shipping a jacket structure to its operating location, which often is not returned to the contractor. A systematic method is needed to overcome this support material problem. This paper analyses the causes and effects of unreturned support material on the GAS Compressor Platform project using Fault Tree Analysis (FTA) and Event Tree Analysis (ETA). From the fault tree analysis, the probability of the top event is 0.7783. From the event tree analysis diagram, the contractors lose between Rp 350,000,000 and Rp 10,000,000,000.
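An event tree then propagates the initiating-event probability through successive branch points to outcome probabilities and consequences. The sketch below shows that generic arithmetic with hypothetical branch probabilities and losses; only the 0.7783 top-event probability comes from the paper:

    # Hypothetical outcome branches: (description, conditional probability,
    # loss in Rp). Probabilities along each path through the tree multiply.
    p_initiating = 0.7783  # top-event probability from the fault tree

    outcomes = [
        ("support material recovered", 0.60, 0),
        ("partial loss, refabrication needed", 0.30, 350_000_000),
        ("total loss of support material", 0.10, 10_000_000_000),
    ]

    expected_loss = sum(p_initiating * p * loss for _, p, loss in outcomes)
    print(f"Expected loss: Rp {expected_loss:,.0f}")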
A comparison of analysis methods to estimate contingency strength.
Lloyd, Blair P; Staubitz, Johanna L; Tapp, Jon T
2018-05-09
To date, several data analysis methods have been used to estimate contingency strength, yet few studies have compared these methods directly. To compare the relative precision and sensitivity of four analysis methods (i.e., exhaustive event-based, nonexhaustive event-based, concurrent interval, concurrent+lag interval), we applied all methods to a simulated data set in which several response-dependent and response-independent schedules of reinforcement were programmed. We evaluated the degree to which contingency strength estimates produced from each method (a) corresponded with expected values for response-dependent schedules and (b) showed sensitivity to parametric manipulations of response-independent reinforcement. Results indicated both event-based methods produced contingency strength estimates that aligned with expected values for response-dependent schedules, but differed in sensitivity to response-independent reinforcement. The precision of interval-based methods varied by analysis method (concurrent vs. concurrent+lag) and schedule type (continuous vs. partial), and showed similar sensitivities to response-independent reinforcement. Recommendations and considerations for measuring contingencies are identified. © 2018 Society for the Experimental Analysis of Behavior.
Vullings, Rik; Verdurmen, Kim M J; Hulsenboom, Alexandra D J; Scheffer, Stephanie; de Lau, Hinke; Kwee, Anneke; Wijn, Pieter F F; Amer-Wåhlin, Isis; van Laar, Judith O E H; Oei, S Guid
2017-01-01
Reducing perinatal morbidity and mortality is one of the major challenges in modern health care. Analysing the ST segment of the fetal electrocardiogram was thought to be the breakthrough in fetal monitoring during labour. However, its implementation in clinical practice yields many false alarms and ST monitoring is highly dependent on cardiotocogram assessment, limiting its value for the prediction of fetal distress during labour. This study aims to evaluate the relation between physiological variations in the orientation of the fetal electrical heart axis and the occurrence of ST events. A post-hoc analysis was performed following a multicentre randomised controlled trial, including 1097 patients from two participating centres. All women were monitored with ST analysis during labour. Cases of fetal metabolic acidosis, poor signal quality, missing blood gas analysis, and congenital heart disease were excluded. The orientation of the fetal electrical heart axis affects the height of the initial T/QRS baseline, and therefore the incidence of ST events. We grouped tracings with the same initial baseline T/QRS value. We depicted the number of ST events as a function of the initial baseline T/QRS value with a linear regression model. A significant increment of ST events was observed with increasing height of the initial T/QRS baseline, irrespective of the fetal condition; correlation coefficient 0.63, p<0.001. The most frequent T/QRS baseline is 0.12. The orientation of the fetal electrical heart axis and accordingly the height of the initial T/QRS baseline should be taken into account in fetal monitoring with ST analysis.
Tonin, Fernanda S; Piazza, Thais; Wiens, Astrid; Fernandez-Llimos, Fernando; Pontarolo, Roberto
2015-12-01
Objective: We aimed to gather evidence of the discontinuation rates owing to adverse events or treatment failure for four recently approved antipsychotics (asenapine, blonanserin, iloperidone, and lurasidone). Methods: A systematic review followed by pairwise meta-analysis and mixed treatment comparison meta-analysis (MTC) was performed, including randomized controlled trials (RCTs) that compared the use of the above-mentioned drugs versus placebo in patients with schizophrenia. An electronic search was conducted in PubMed, Scopus, Science Direct, Scielo, the Cochrane Library, and International Pharmaceutical Abstracts (January 2015). The included trials were at least single blinded. The main outcome measures extracted were discontinuation owing to adverse events and discontinuation owing to treatment failure. Results: Fifteen RCTs were identified (n = 5400 participants) and 13 of them were amenable for use in our meta-analyses. No significant differences were observed between any of the four drugs and placebo as regards discontinuation owing to adverse events, whether in pairwise meta-analysis or in MTC. All drugs presented a better profile than placebo on discontinuation owing to treatment failure, both in pairwise meta-analysis and MTC. Asenapine was found to be the best therapy in terms of tolerability owing to failure, while lurasidone was the worst treatment in terms of adverse events. The evidence around blonanserin is weak. Conclusion: MTCs allowed the creation of two different rank orders of these four antipsychotic drugs in two outcome measures. This evidence-generating method allows direct and indirect comparisons, supporting approval and pricing decisions when lacking sufficient, direct, head-to-head trials.
A Typology of Language-Brokering Events in Dual-Language Immersion Classrooms
ERIC Educational Resources Information Center
Coyoca, Anne Marie; Lee, Jin Sook
2009-01-01
This paper examines language-brokering events to better understand how children utilize their linguistic resources to create spaces where the coexistence of two languages can enable or restrict understanding and learning of academic content for themselves and others. An analysis of the structure of language-brokering events reveals that different…
Atmospheric dust events in Central Asia: Relationship to wind, soil type, and land use
USDA-ARS?s Scientific Manuscript database
Xinjiang Province is one of the most important source regions of atmospheric dust in China. Spatial-temporal characteristics of dust events in the region were investigated by time series analysis of annual dust event frequency and meteorological data collected at 101 stations in Xinjiang Province fr...
Du, Xiuquan; Hu, Changlin; Yao, Yu; Sun, Shiwei; Zhang, Yanping
2017-12-12
In bioinformatics, exon skipping (ES) event prediction is an essential part of alternative splicing (AS) event analysis. Although many methods have been developed to predict ES events, a solution has yet to be found. In this study, given the limitations of machine learning algorithms with RNA-Seq data or genome sequences, a new feature, called RS (RNA-seq and sequence) features, was constructed. These features include RNA-Seq features derived from the RNA-Seq data and sequence features derived from genome sequences. We propose a novel Rotation Forest classifier to predict ES events with the RS features (RotaF-RSES). To validate the efficacy of RotaF-RSES, a dataset from two human tissues was used, and RotaF-RSES achieved an accuracy of 98.4%, a specificity of 99.2%, a sensitivity of 94.1%, and an area under the curve (AUC) of 98.6%. When compared to the other available methods, the results indicate that RotaF-RSES is efficient and can predict ES events with RS features.
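Scikit-learn has no built-in Rotation Forest, so the sketch below substitutes a random forest stand-in simply to illustrate evaluating a combined RNA-Seq-plus-sequence feature matrix; the feature matrix, labels, and dimensions are synthetic placeholders, not the paper's data or classifier:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Placeholder RS feature matrix: RNA-Seq-derived columns (e.g., junction
    # read support) concatenated with sequence-derived columns (e.g., k-mer
    # frequencies around the candidate exon); y = 1 marks a skipped exon.
    rng = np.random.default_rng(0)
    X = rng.random((500, 40))
    y = rng.integers(0, 2, 500)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()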
Marín Torrens, R M; Sánchez Cánovas, J; Donat Colomer, F; Dupuy Layo, M J; Salas Trejo, M D
1996-05-15
To find which vital events middle-aged women in our society most often experience and their influence as stress factors on physical health and subjective psychological well-being. A multivariate cross-sectional study. 5 primary care centres in Valencia and Alicante. 306 women chosen at random among those seen at these health centres. Frequency analysis of vital events. Correlation analysis with questionnaires on physical symptoms and diseases, psychological well-being, work situation, emotional behaviour, sexuality and relationships with their partner. ANOVA: dividing the sample into 2 groups based on mean adaptive effort. Twenty-three events were identified as the most common. The ANOVA showed a significant association between greater adaptive effort and negative emotional behaviour, personal control, material well-being, relationship with the partner, and physical and psychological symptoms. The relevance of daily events as generators of stress was confirmed, as was the impact of these and major events on these women's physical and psychological health. The importance of attending to women at this stage of their lives from an integrated and interdisciplinary perspective, which tackles the physiological, psychological and cultural features together, was shown.
Sport events and climate for visitors—the case of FIFA World Cup in Qatar 2022
NASA Astrophysics Data System (ADS)
Matzarakis, Andreas; Fröhlich, Dominik
2015-04-01
The effect of weather on sport events is not well studied. It requires special attention if the event takes place at a time and location with extreme weather conditions. For the world soccer championship in Qatar (Doha 2022), a human-biometeorological analysis has been performed in order to identify the time of the year that is most suitable in terms of thermal comfort for visitors attending the event. The analysis is based on thermal indices like the Physiologically Equivalent Temperature (PET). The results show that this kind of event may not be appropriate for visitors if it is scheduled during months with extreme conditions. For Doha, this is the period from May to September, when conditions during a large majority of the hours of the day cause strong heat stress for visitors. A more appropriate time would be the months November to February, when thermally comfortable conditions are much more frequent. The methods applied here can quantify the thermal conditions and show limitations and possibilities for specific events and locations.
Extreme precipitation events and related weather patterns over Iraq
NASA Astrophysics Data System (ADS)
raheem Al-nassar, Ali; Sangrà, Pablo; Alarcón, Marta
2016-04-01
This study aims to investigate extreme precipitation events and the associated weather phenomena in the Middle East and particularly in Iraq. For this purpose we used Baghdad daily precipitation records from the Iraqi Meteorological and Seismology Organization combined with ECMWF (ERA-Interim) reanalysis data for the period from January 2002 to December 2013. Extreme events were defined statistically as those above the 90th percentile of the recorded precipitation, and were highly correlated with hydrological flooding in some cities of Iraq. We identified fifteen extreme precipitation events. The analysis of the corresponding weather patterns (500 hPa and 250 hPa geopotential and velocity field distribution) indicated that 5 events were related to cut-off lows causing the highest precipitation (180 mm), 3 events to rex blocks (158 mm), 3 events to jet streak occurrence (130 mm), and 4 events to troughs (107 mm). Five of these events caused flash floods, and one of them, related to a rex block, was the most dramatic heavy rain event in Iraq in 30 years. For each case we investigated the convective instability and dynamical forcing together with humidity sources. For convective instability we explored the distribution of the K index and SWEAT index. For dynamical forcing we analyzed, at several levels, the Q vector, divergence, potential and relative vorticity advection, and omega vertical velocity. The source of humidity was investigated through the humidity field and the convergence of specific humidity. Advection and convergence of humidity from the Red Sea and the Persian Gulf was a triggering factor in all events and thus appears to be a necessary condition for extreme precipitation in Iraq. Our preliminary analysis also indicates that extreme precipitation events are primarily dynamically forced, with convective instability playing a secondary role.
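The event definition above amounts to a percentile threshold on the precipitation record; a minimal sketch with a synthetic series standing in for the Baghdad data:

    import numpy as np

    rng = np.random.default_rng(1)
    daily_rain = rng.gamma(shape=0.3, scale=8.0, size=4383)  # mm, stand-in

    wet_days = daily_rain[daily_rain > 0.1]      # ignore dry/trace days
    threshold = np.percentile(wet_days, 90)      # 90th-percentile cut
    extreme_days = np.flatnonzero(daily_rain >= threshold)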
Stressful Encounters with Social Work Clients: A Descriptive Account Based on Critical Incidents
ERIC Educational Resources Information Center
Savaya, Riki; Gardner, Fiona; Stange, Dorit
2011-01-01
This article presents the findings of an analysis of 130 critical incidents reported by social workers in Israel. Almost all the incidents turned out to be upsetting events that caused the writers a great deal of pain, frustration, and self-doubt. Content analysis yielded four main categories of incidents or events: (1) client hostility and…
ERIC Educational Resources Information Center
Anderson, Kate T.
2017-01-01
This article presents a reflexive and critical discourse analysis of classroom events that grew out of a cross-cultural partnership with a secondary school teacher in Singapore. I aim to illuminate how differences between researcher and teacher assumptions about what participation in classroom activities should look like came into high relief when…
Evidential Networks for Fault Tree Analysis with Imprecise Knowledge
NASA Astrophysics Data System (ADS)
Yang, Jianping; Huang, Hong-Zhong; Liu, Yu; Li, Yan-Feng
2012-06-01
Fault tree analysis (FTA), as one of the powerful tools in reliability engineering, has been widely used to enhance system quality attributes. In most fault tree analyses, precise values are adopted to represent the probabilities of occurrence of events. Due to the lack of sufficient data or the imprecision of existing data at the early stage of product design, it is often difficult to accurately estimate the failure rates of individual events or the probabilities of occurrence of the events. Therefore, such imprecision and uncertainty need to be taken into account in reliability analysis. In this paper, evidential networks (EN) are employed to quantify and propagate the aforementioned uncertainty and imprecision in fault tree analysis. The detailed processes for converting fault tree (FT) logic gates to EN are described. The figures of the logic gates and the converted equivalent EN, together with the associated truth tables and the conditional belief mass tables, are also presented in this work. A new epistemic importance measure is proposed to describe the effect of the degree of ignorance about an event. The fault tree of an aircraft engine damaged by oil filter plugs is presented to demonstrate the proposed method.
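When event probabilities are only known as intervals, the same gate arithmetic can be applied to the bounds. The sketch below shows this interval propagation under an independence assumption; it illustrates the idea of imprecise gate probabilities but is not the paper's belief-mass formulation:

    from math import prod

    def and_gate_interval(bounds):
        # AND gate on [low, high] probability intervals, independent inputs.
        return (prod(lo for lo, _ in bounds), prod(hi for _, hi in bounds))

    def or_gate_interval(bounds):
        # OR gate on [low, high] probability intervals, independent inputs.
        return (1.0 - prod(1.0 - lo for lo, _ in bounds),
                1.0 - prod(1.0 - hi for _, hi in bounds))

    # Example: top = (A AND B) OR C with imprecise basic-event probabilities.
    ab = and_gate_interval([(0.01, 0.03), (0.02, 0.05)])
    p_top_low, p_top_high = or_gate_interval([ab, (0.001, 0.004)])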
Bonofiglio, Federico; Beyersmann, Jan; Schumacher, Martin; Koller, Michael; Schwarzer, Guido
2016-09-01
Meta-analysis of a survival endpoint is typically based on the pooling of hazard ratios (HRs). If competing risks occur, the HRs may lose translation into changes of survival probability. The cumulative incidence functions (CIFs), the expected proportion of cause-specific events over time, re-connect the cause-specific hazards (CSHs) to the probability of each event type. We use CIF ratios to measure treatment effect on each event type. To retrieve information on aggregated, typically poorly reported, competing risks data, we assume constant CSHs. Next, we develop methods to pool CIF ratios across studies. The procedure computes pooled HRs alongside and checks the influence of follow-up time on the analysis. We apply the method to a medical example, showing that follow-up duration is relevant both for pooled cause-specific HRs and CIF ratios. Moreover, if all-cause hazard and follow-up time are large enough, CIF ratios may reveal additional information about the effect of treatment on the cumulative probability of each event type. Finally, to improve the usefulness of such analysis, better reporting of competing risks data is needed. Copyright © 2015 John Wiley & Sons, Ltd.
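Under the constant-CSH assumption the CIF has a closed form, which is what makes aggregated data workable. With cause-specific hazards h_1 and h_2 and all-cause hazard h = h_1 + h_2, a standard competing-risks identity (not reproduced from the paper) gives

    CIF_k(t) = (h_k / h) (1 - e^{-h t}),  k = 1, 2,

so the treatment-to-control CIF ratio for event type k,

    CIF_k^T(t) / CIF_k^C(t) = [h_k^T h^C (1 - e^{-h^T t})] / [h_k^C h^T (1 - e^{-h^C t})],

depends on the follow-up time t unless the all-cause hazards coincide, consistent with the reported relevance of follow-up duration.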
Kim, M H; Lin, J; Hussein, M; Battleman, D
2009-12-01
Rhythm- and rate-control therapies are an essential part of atrial fibrillation (AF) management; however, the use of existing agents is often limited by the occurrence of adverse events. The aim of this study was to evaluate suspected adverse events and adverse event monitoring, and associated medical costs, in patients receiving AF rhythm-control and/or rate-control therapy. This retrospective cohort study used claims data from the Integrated Healthcare Information Systems National Managed Care Benchmark Database from 2002-2006. Patients hospitalized for AF (primary diagnosis), and who had at least 365 days' enrollment before and after the initial (index) AF hospitalization, were included in the analysis. Suspected AF therapy-related adverse events and function tests for adverse event monitoring were identified according to pre-specified diagnosis codes/procedures, and examined over the 12 months following discharge from the index hospitalization. Events/function tests had to have occurred within 90 days of a claim for AF therapy to be considered a suspected adverse event/adverse event monitoring. Of 4174 AF patients meeting the study criteria, 3323 received AF drugs; 428 received rhythm-control only (12.9%), 2130 rate-control only (64.1%), and 765 combined rhythm/rate-control therapy (23.0%). Overall, 50.1% of treated patients had a suspected adverse event and/or function test for adverse event monitoring (45.5% with rate-control, 53.5% with rhythm-control, and 61.2% with combined rhythm/rate-control). Suspected cardiovascular adverse events were the most common events (occurring in 36.1% of patients), followed by pulmonary (6.1%), and endocrine events (5.9%). Overall, suspected adverse events/function tests were associated with mean annual per-patient costs of $3089 ($1750 with rhythm-control, $2041 with rate control, and $6755 with combined rhythm/rate-control). As a retrospective analysis, the study is subject to potential selection bias, while its reliance on diagnostic codes for identification of AF and suspected adverse events is a source of potential investigator error. A direct cause-effect relationship between suspected adverse events/function tests and AF therapy cannot be confirmed based on the claims data available. The incidence of suspected adverse events and adverse event monitoring during AF rhythm-control and/or rate-control therapy is high. Costs associated with adverse events and adverse event monitoring are likely to add considerably to the overall burden of AF management.
NASA Astrophysics Data System (ADS)
Yoo, Chulsang; Park, Minkyu; Kim, Hyeon Jun; Choi, Juhee; Sin, Jiye; Jun, Changhyun
2015-01-01
In this study, documentary records on storm events in the Annals of the Choson Dynasty, covering the entire 519-year period from 1392 to 1910, were analyzed. By applying various key words related to storm events, a total of 556 documentary records could be identified. The main objective of this study was to develop rules of classification for the documentary records on storm events in the Annals of the Choson Dynasty. The results were also compared with the rainfall data of the traditional Korean rain gauge, named Chukwooki, which are available from 1777 to 1910 (about 130 years). The analysis is organized as follows. First, the frequency of the documents, their length, comments about the size of the inundated area, the number of casualties, the number of property losses, and the size of the countermeasures were considered to determine the magnitude of the events. To this end, rules of classification of the storm events were developed. Cases in which the word 'disaster' was used along with detailed information about casualties and property damage were classified as high-level storm events. The high-level storm events were additionally sub-categorized into catastrophic, extreme, and severe events. Second, by applying the developed rules of classification, a total of 326 events were identified as high-level storm events during the 519 years of the Choson Dynasty. Among these, only 19 events were classified as catastrophic, 106 as extreme, and 201 as severe. The mean return period of these storm events was found to be about 30 years for the catastrophic events, 5 years for the extreme events, and 2-3 years for the severe events. Third, the classification results were verified against the records of the traditional Korean rain gauge; the catastrophic events were strongly distinguished from other events, with a mean total rainfall of 439.8 mm and a mean storm duration of 49.3 h. The return period of these catastrophic events was also estimated to be in the range 100-500 years.
Cheng, Ji; Pullenayegum, Eleanor; Marshall, John K; Thabane, Lehana
2016-01-01
Objectives: There is no consensus on whether studies with no observed events in the treatment and control arms, the so-called both-armed zero-event studies, should be included in a meta-analysis of randomised controlled trials (RCTs). Current analytic approaches handled them differently depending on the choice of effect measures and authors' discretion. Our objective is to evaluate the impact of including or excluding both-armed zero-event (BA0E) studies in meta-analysis of RCTs with rare outcome events through a simulation study. Methods: We simulated 2500 data sets for different scenarios varying the parameters of baseline event rate, treatment effect and number of patients in each trial, and between-study variance. We evaluated the performance of commonly used pooling methods in classical meta-analysis—namely, Peto, Mantel-Haenszel with fixed-effects and random-effects models, and inverse variance method with fixed-effects and random-effects models—using bias, root mean square error, length of 95% CI and coverage. Results: The overall performance of the approaches of including or excluding BA0E studies in meta-analysis varied according to the magnitude of the true treatment effect. Including BA0E studies introduced very little bias, decreased mean square error, narrowed the 95% CI and increased the coverage when no true treatment effect existed. However, when a true treatment effect existed, the estimates from the approach of excluding BA0E studies led to smaller bias than including them. Among all evaluated methods, the Peto method excluding BA0E studies gave the least biased results when a true treatment effect existed. Conclusions: We recommend including BA0E studies when treatment effects are unlikely, but excluding them when there is a decisive treatment effect. Providing results of including and excluding BA0E studies to assess the robustness of the pooled estimated effect is a sensible way to communicate the results of a meta-analysis when the treatment effects are unclear. PMID: 27531725
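For intuition, the Peto one-step estimator makes the BA0E behaviour explicit: a study with zero events in both arms has O - E = 0 and hypergeometric variance V = 0, so it contributes nothing to the pooled estimate whether or not it is formally included. A generic sketch of that arithmetic (not the authors' simulation code):

    import math

    def peto_components(a, n1, c, n0):
        # O - E and hypergeometric variance V for one 2x2 table;
        # a, c = events in treatment/control arms, n1, n0 = arm sizes.
        N, m1 = n1 + n0, a + c
        E = n1 * m1 / N
        V = n1 * n0 * m1 * (N - m1) / (N**2 * (N - 1))
        return a - E, V

    def pooled_peto_or(tables, include_ba0e=True):
        # Fixed-effect pooled Peto odds ratio across studies.
        oe_sum = v_sum = 0.0
        for a, n1, c, n0 in tables:
            if not include_ba0e and a + c == 0:
                continue  # explicitly drop both-armed zero-event studies
            oe, v = peto_components(a, n1, c, n0)
            oe_sum += oe
            v_sum += v
        return math.exp(oe_sum / v_sum)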
On the relation of earthquake stress drop and ground motion variability
NASA Astrophysics Data System (ADS)
Oth, Adrien; Miyake, Hiroe; Bindi, Dino
2017-07-01
One of the key parameters for earthquake source physics is stress drop since it can be directly linked to the spectral level of ground motion. Stress drop estimates from moment corner frequency analysis have been shown to be extremely variable, and this to a much larger degree than expected from the between-event ground motion variability. This discrepancy raises the question whether classically determined stress drop variability is too large, which would have significant consequences for seismic hazard analysis. We use a large high-quality data set from Japan with well-studied stress drop data to address this issue. Nonparametric and parametric reference ground motion models are derived, and the relation of between-event residuals for Japan Meteorological Agency equivalent seismic intensity and peak ground acceleration with stress drop is analyzed for crustal earthquakes. We find a clear correlation of the between-event residuals with stress drops estimates; however, while the island of Kyushu is characterized by substantially larger stress drops than Honshu, the between-event residuals do not reflect this observation, leading to the appearance of two event families with different stress drop levels yet similar range of between-event residuals. Both the within-family and between-family stress drop variations are larger than expected from the ground motion between-event variability. A systematic common analysis of these parameters holds the potential to provide important constraints on the relative robustness of different groups of data in the different parameter spaces and to improve our understanding on how much of the observed source parameter variability is likely to be true source physics variability.
NASA Astrophysics Data System (ADS)
Al-Doukhi, Hanadi Abulateef
The Salalah Crystalline Basement (SCB) is the largest Precambrian exposure in Oman located on the southern margin of the Arabian Plate at the Arabian Sea shore. This work used remote sensing, detailed structural analysis and the analysis of ten samples using 40Ar/39Ar age dating to establish the Precambrian evolution of the SCB by focusing on its central and southwestern parts. This work found that the SCB evolved through four deformational events that shaped its final architecture: (1) Folding and thrusting event that resulted in the emplacement of the Sadh complex atop the Juffa complex. This event resulted in the formation of a possibly N-verging nappe structure; (2) Regional folding event around SE- and SW-plunging axes that deformed the regional fabric developed during formation of the N-verging nappe structure and produced map-scale SE- and SW-plunging antiforms shaping the complexes into a semi-dome structure; (3) Strike-slip shearing event that produced a conjugate set of NE-trending sinistral and NW-trending dextral strike-slip shear zones; and (4) Localized SE-directed gravitational collapse manifested by top-to-the-southeast kinematic indicators. Deformation within the SCB might have ceased by 752.2 ± 2.7 Ma as indicated by the age of an undeformed granite. The thermochronology of samples collected throughout the SCB complexes shows a single cooling event that occurred between about 800 and 760 Ma. This cooling event could have been accomplished by crustal exhumation, resulting in regional collapse following the prolonged period of contractional deformation of the SCB. This makes the SCB a possible metamorphic core complex.
1990-02-01
transform the waveforms of this event to those of Titania1 must be a band-limited representation of the firing sequence. Therefore, we decided to... design a Wiener filter to transform the Pn waveforms of Event Titania4 into those of Event Titania1 at all sensors of NORESS. Prior to applying this technique... for transforming the Pn phases of Event Titania4 into those of Event Titania1.
VizieR Online Data Catalog: OGLE-II DIA microlensing events (Wozniak+, 2001)
NASA Astrophysics Data System (ADS)
Wozniak, P. R.; Udalski, A.; Szymanski, M.; Kubiak, M.; Pietrzynski, G.; Soszynski, I.; Zebrun, K.
2002-11-01
We present a sample of microlensing events discovered in the Difference Image Analysis (DIA) of the OGLE-II images collected during three observing seasons, 1997-1999. 4424 light curves pass our criteria on the presence of a brightening episode on top of a constant baseline. Among those, 512 candidate microlensing events were selected visually. We designed an automated procedure, which unambiguously selects up to 237 best events. Including eight candidate events recovered by other means, a total of 520 light curves are presented in this work. (4 data files).
Idaho National Laboratory Quarterly Performance Analysis - 2nd Quarter FY2014
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lisbeth A. Mitchell
2014-06-01
This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of occurrence reports and other deficiency reports (including not reportable events) identified at INL from January 2014 through March 2014.
Applying STAMP in Accident Analysis
NASA Technical Reports Server (NTRS)
Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen
2003-01-01
Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.
NASA Astrophysics Data System (ADS)
Sadegh, M.; Moftakhari, H.; AghaKouchak, A.
2017-12-01
Many natural hazards are driven by multiple forcing variables, and concurrent or consecutive extreme events significantly increase the risk of infrastructure/system failure. It is common practice to use univariate analysis based upon a perceived ruling driver to estimate design quantiles and/or return periods of extreme events. A multivariate analysis, however, permits modeling the simultaneous occurrence of multiple forcing variables. In this presentation, we introduce the Multi-hazard Assessment and Scenario Toolbox (MhAST), which comprehensively analyzes marginal and joint probability distributions of natural hazards. MhAST also offers a wide range of scenarios of return periods and design levels and their likelihoods. The contribution of this study is four-fold: 1. comprehensive analysis of the marginal and joint probability of multiple drivers through 17 continuous distributions and 26 copulas; 2. multiple scenario analysis of concurrent extremes based upon the most likely joint occurrence, one ruling variable, and weighted random sampling of joint occurrences with similar exceedance probabilities; 3. weighted average scenario analysis based on an expected event; and 4. uncertainty analysis of the most likely joint occurrence scenario using a Bayesian framework.
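As a small illustration of the joint-probability step, the sketch below computes "OR" and "AND" joint return periods for two drivers coupled by a Gaussian copula; MhAST considers many more copula families, and all numbers here are hypothetical:

    from scipy.stats import multivariate_normal, norm

    def joint_return_periods(u, v, rho, mu=1.0):
        # u, v: marginal non-exceedance probabilities of the two drivers;
        # rho: Gaussian-copula correlation; mu: mean inter-arrival time (yr).
        cop = multivariate_normal(mean=[0.0, 0.0],
                                  cov=[[1.0, rho], [rho, 1.0]])
        c_uv = cop.cdf([norm.ppf(u), norm.ppf(v)])  # copula value C(u, v)
        p_or = 1.0 - c_uv              # at least one driver exceeds its level
        p_and = 1.0 - u - v + c_uv     # both drivers exceed their levels
        return mu / p_or, mu / p_and

    # Example: both drivers at their marginal 100-year levels, rho = 0.5.
    T_or, T_and = joint_return_periods(0.99, 0.99, 0.5)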
Aung, Theingi; Halsey, Jim; Kromhout, Daan; Gerstein, Hertzel C; Marchioli, Roberto; Tavazzi, Luigi; Geleijnse, Johanna M; Rauch, Bernhard; Ness, Andrew; Galan, Pilar; Chew, Emily Y; Bosch, Jackie; Collins, Rory; Lewington, Sarah; Armitage, Jane; Clarke, Robert
2018-03-01
Current guidelines advocate the use of marine-derived omega-3 fatty acids supplements for the prevention of coronary heart disease and major vascular events in people with prior coronary heart disease, but large trials of omega-3 fatty acids have produced conflicting results. To conduct a meta-analysis of all large trials assessing the associations of omega-3 fatty acid supplements with the risk of fatal and nonfatal coronary heart disease and major vascular events in the full study population and prespecified subgroups. This meta-analysis included randomized trials that involved at least 500 participants and a treatment duration of at least 1 year and that assessed associations of omega-3 fatty acids with the risk of vascular events. Aggregated study-level data were obtained from 10 large randomized clinical trials. Rate ratios for each trial were synthesized using observed minus expected statistics and variances. Summary rate ratios were estimated by a fixed-effects meta-analysis using 95% confidence intervals for major diseases and 99% confidence intervals for all subgroups. The main outcomes included fatal coronary heart disease, nonfatal myocardial infarction, stroke, major vascular events, and all-cause mortality, as well as major vascular events in study population subgroups. Of the 77 917 high-risk individuals participating in the 10 trials, 47 803 (61.4%) were men, and the mean age at entry was 64.0 years; the trials lasted a mean of 4.4 years. The associations of treatment with outcomes were assessed on 6273 coronary heart disease events (2695 coronary heart disease deaths and 2276 nonfatal myocardial infarctions) and 12 001 major vascular events. Randomization to omega-3 fatty acid supplementation (eicosapentaenoic acid dose range, 226-1800 mg/d) had no significant associations with coronary heart disease death (rate ratio [RR], 0.93; 99% CI, 0.83-1.03; P = .05), nonfatal myocardial infarction (RR, 0.97; 99% CI, 0.87-1.08; P = .43) or any coronary heart disease events (RR, 0.96; 95% CI, 0.90-1.01; P = .12). Neither did randomization to omega-3 fatty acid supplementation have any significant associations with major vascular events (RR, 0.97; 95% CI, 0.93-1.01; P = .10), overall or in any subgroups, including subgroups composed of persons with prior coronary heart disease, diabetes, lipid levels greater than a given cutoff level, or statin use. This meta-analysis demonstrated that omega-3 fatty acids had no significant association with fatal or nonfatal coronary heart disease or any major vascular events. It provides no support for current recommendations for the use of such supplements in people with a history of coronary heart disease.
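The pooling scheme described (observed-minus-expected statistics combined by fixed-effects inverse-variance weighting) has a compact generic form; the sketch below uses hypothetical per-trial values, not the trial data:

    import math

    def pooled_rate_ratio(trials, z=1.96):
        # trials: per-trial (O - E, V) pairs, where log RR_i = (O - E)/V
        # has variance 1/V. Fixed-effect pooling sums O - E and V.
        oe_sum = sum(oe for oe, _ in trials)
        v_sum = sum(v for _, v in trials)
        log_rr = oe_sum / v_sum
        se = 1.0 / math.sqrt(v_sum)
        return (math.exp(log_rr),
                (math.exp(log_rr - z * se), math.exp(log_rr + z * se)))

    # Hypothetical (O - E, V) pairs for three trials; z = 2.576 would give
    # the 99% intervals used for subgroup analyses in the paper.
    rr, ci = pooled_rate_ratio([(-3.2, 45.0), (1.1, 60.5), (-2.0, 38.7)])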
ANTARES constrains a blazar origin of two IceCube PeV neutrino events
NASA Astrophysics Data System (ADS)
ANTARES Collaboration; Adrián-Martínez, S.; Albert, A.; André, M.; Anton, G.; Ardid, M.; Aubert, J.-J.; Baret, B.; Barrios, J.; Basa, S.; Bertin, V.; Biagi, S.; Bogazzi, C.; Bormuth, R.; Bou-Cabo, M.; Bouwhuis, M. C.; Bruijn, R.; Brunner, J.; Busto, J.; Capone, A.; Caramete, L.; Carr, J.; Chiarusi, T.; Circella, M.; Coniglione, R.; Costantini, H.; Coyle, P.; Creusot, A.; De Rosa, G.; Dekeyser, I.; Deschamps, A.; De Bonis, G.; Distefano, C.; Donzaud, C.; Dornic, D.; Dorosti, Q.; Drouhin, D.; Dumas, A.; Eberl, T.; Enzenhöfer, A.; Escoffier, S.; Fehn, K.; Felis, I.; Fermani, P.; Folger, F.; Fusco, L. A.; Galatà, S.; Gay, P.; Geißelsöder, S.; Geyer, K.; Giordano, V.; Gleixner, A.; Gómez-González, J. P.; Gracia-Ruiz, R.; Graf, K.; van Haren, H.; Heijboer, A. J.; Hello, Y.; Hernández-Rey, J. J.; Herrero, A.; Hößl, J.; Hofestädt, J.; Hugon, C.; James, C. W.; de Jong, M.; Kalekin, O.; Katz, U.; Kießling, D.; Kooijman, P.; Kouchner, A.; Kulikovskiy, V.; Lahmann, R.; Lattuada, D.; Lefèvre, D.; Leonora, E.; Loehner, H.; Loucatos, S.; Mangano, S.; Marcelin, M.; Margiotta, A.; Martínez-Mora, J. A.; Martini, S.; Mathieu, A.; Michael, T.; Migliozzi, P.; Neff, M.; Nezri, E.; Palioselitis, D.; Păvălaş, G. E.; Perrina, C.; Piattelli, P.; Popa, V.; Pradier, T.; Racca, C.; Riccobene, G.; Richter, R.; Roensch, K.; Rostovtsev, A.; Saldaña, M.; Samtleben, D. F. E.; Sánchez-Losa, A.; Sanguineti, M.; Sapienza, P.; Schmid, J.; Schnabel, J.; Schulte, S.; Schüssler, F.; Seitz, T.; Sieger, C.; Spies, A.; Spurio, M.; Steijger, J. J. M.; Stolarczyk, Th.; Taiuti, M.; Tamburini, C.; Tayalati, Y.; Trovato, A.; Tselengidou, M.; Tönnis, C.; Vallage, B.; Vallée, C.; Van Elewyck, V.; Visser, E.; Vivolo, D.; Wagner, S.; de Wolf, E.; Yepes, H.; Zornoza, J. D.; Zúñiga, J.; TANAMI Collaboration; Krauß, F.; Kadler, M.; Mannheim, K.; Schulz, R.; Trüstedt, J.; Wilms, J.; Ojha, R.; Ros, E.; Baumgartner, W.; Beuchert, T.; Blanchard, J.; Bürkel, C.; Carpenter, B.; Edwards, P. G.; Eisenacher Glawion, D.; Elsässer, D.; Fritsch, U.; Gehrels, N.; Gräfe, C.; Großberger, C.; Hase, H.; Horiuchi, S.; Kappes, A.; Kreikenbohm, A.; Kreykenbohm, I.; Langejahn, M.; Leiter, K.; Litzinger, E.; Lovell, J. E. J.; Müller, C.; Phillips, C.; Plötz, C.; Quick, J.; Steinbring, T.; Stevens, J.; Thompson, D. J.; Tzioumis, A. K.
2015-04-01
Context. The source(s) of the neutrino excess reported by the IceCube Collaboration is unknown. The TANAMI Collaboration recently reported on the multiwavelength emission of six bright, variable blazars which are positionally coincident with two of the most energetic IceCube events. Objects like these are prime candidates to be the source of the highest-energy cosmic rays, and thus of associated neutrino emission. Aims: We present an analysis of neutrino emission from the six blazars using observations with the ANTARES neutrino telescope. Methods: The standard methods of the ANTARES candidate list search are applied to six years of data to search for an excess of muons - and hence their neutrino progenitors - from the directions of the six blazars described by the TANAMI Collaboration, and which are possibly associated with two IceCube events. Monte Carlo simulations of the detector response to both signal and background particle fluxes are used to estimate the sensitivity of this analysis for different possible source neutrino spectra. A maximum-likelihood approach, using the reconstructed energies and arrival directions of through-going muons, is used to identify events with properties consistent with a blazar origin. Results: Both blazars predicted to be the most neutrino-bright in the TANAMI sample (1653-329 and 1714-336) have a signal flux fitted by the likelihood analysis corresponding to approximately one event. This observation is consistent with the blazar-origin hypothesis of the IceCube event IC14 for a broad range of blazar spectra, although an atmospheric origin cannot be excluded. No ANTARES events are observed from any of the other four blazars, including the three associated with IceCube event IC20. This excludes at a 90% confidence level the possibility that this event was produced by these blazars unless the neutrino spectrum is flatter than -2.4.
Sommermeyer, Dirk; Zou, Ding; Grote, Ludger; Hedner, Jan
2012-10-15
To assess the accuracy of novel algorithms using an oximeter-based finger plethysmographic signal in combination with a nasal cannula for the detection and differentiation of central and obstructive apneas. The validity of single pulse oximetry to detect respiratory disturbance events was also studied. Patients recruited from four sleep laboratories underwent an ambulatory overnight cardiorespiratory polygraphy recording. The nasal flow and photoplethysmographic signals of the recording were analyzed by automated algorithms. The apnea hypopnea index (AHI(auto)) was calculated using both signals, and a respiratory disturbance index (RDI(auto)) was calculated from photoplethysmography alone. Apnea events were classified into obstructive and central types using the oximeter derived pulse wave signal and compared with manual scoring. Sixty-six subjects (42 males, age 54 ± 14 yrs, body mass index 28.5 ± 5.9 kg/m²) were included in the analysis. AHI(manual) (19.4 ± 18.5 events/h) correlated highly significantly with AHI(auto) (19.9 ± 16.5 events/h) and RDI(auto) (20.4 ± 17.2 events/h); the correlation coefficients were r = 0.94 and 0.95, respectively (p < 0.001) with a mean difference of -0.5 ± 6.6 and -1.0 ± 6.1 events/h. The automatic analysis of AHI(auto) and RDI(auto) detected sleep apnea (cutoff AHI(manual) ≥ 15 events/h) with a sensitivity/specificity of 0.90/0.97 and 0.86/0.94, respectively. The automated obstructive/central apnea indices correlated closely with manually scoring (r = 0.87 and 0.95, p < 0.001) with mean difference of -4.3 ± 7.9 and 0.3 ± 1.5 events/h, respectively. Automatic analysis based on routine pulse oximetry alone may be used to detect sleep disordered breathing with accuracy. In addition, the combination of photoplethysmographic signals with a nasal flow signal provides an accurate distinction between obstructive and central apneic events during sleep.
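As a toy illustration of flow-based event detection (not the validated algorithm evaluated in this study), the sketch below counts apnea-like episodes as sustained drops of a flow envelope below a fraction of its median, and converts the count to an hourly index:

    import numpy as np

    def count_apnea_events(flow_env, fs, drop=0.9, min_dur=10.0):
        # Flag samples where the flow envelope falls below (1 - drop) of
        # its median, then count runs lasting at least min_dur seconds.
        low = flow_env < (1.0 - drop) * np.median(flow_env)
        padded = np.concatenate(([0], low.astype(int), [0]))
        edges = np.diff(padded)
        starts = np.flatnonzero(edges == 1)
        ends = np.flatnonzero(edges == -1)
        return int(np.sum((ends - starts) / fs >= min_dur))

    def apnea_index(flow_env, fs, hours):
        # Events per hour of recording, analogous to an AHI-style index.
        return count_apnea_events(flow_env, fs) / hours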
NASA Astrophysics Data System (ADS)
Bialas, A.
2004-02-01
It is shown that the method of eliminating the statistical fluctuations from event-by-event analysis proposed recently by Fu and Liu can be rewritten in a compact form involving the generalized factorial moments.
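For reference, the normalized factorial moments involved take the standard form F_q = <n(n-1)...(n-q+1)> / <n>^q (a textbook definition; the exact normalization used by Fu and Liu may differ). For a Poisson multiplicity distribution the numerator equals <n>^q, so F_q = 1 for all q, which is why factorial moments remove purely statistical (Poissonian) fluctuations from event-by-event observables.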
3D Simulation of External Flooding Events for the RISMC Pathway
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad
2015-09-01
Incorporating 3D simulations as part of the Risk-Informed Safety Margins Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently; however, these can be analyzed with existing and validated physics-simulation toolkits. In this report, we describe these approaches specific to flooding-based analysis using an approach called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis provides a spatial/visual aspect to the design, improves the realism of results, and can provide visual understanding to validate the analysis of flooding.
Idaho National Laboratory Quarterly Performance Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, Lisbeth
2014-11-01
This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 60 reportable events (23 from the 4th Qtr FY14 and 37 from the prior three reporting quarters) as well as 58 other issue reports (including not reportable events and Significant Category A and B conditions) identified at INL from July 2013 through October 2014. Battelle Energy Alliance (BEA) operates the INL under contract DE AC07 051D14517.
Idaho National Laboratory Quarterly Occurrence Analysis - 3rd Quarter FY-2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, Lisbeth Ann
This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 73 reportable events (23 from the 3rd Qtr FY-16 and 50 from the prior three reporting quarters), as well as 45 other issue reports (including events found to be not reportable and Significant Category A and B conditions) identified at INL during the past 12 months (16 from this quarter and 29 from the prior three quarters).
Idaho National Laboratory Quarterly Occurrence Analysis - 1st Quarter FY 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, Lisbeth Ann
This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 74 reportable events (16 from the 1st Qtr FY-16 and 58 from the prior three reporting quarters), as well as 35 other issue reports (including events found to be not reportable and Significant Category A and B conditions) identified at INL during the past 12 months (15 from this quarter and 20 from the prior three quarters).
Idaho National Laboratory Quarterly Occurrence Analysis 4th Quarter FY 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, Lisbeth Ann
This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System, as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 84 reportable events (29 from the 4th quarter fiscal year 2016 and 55 from the prior three reporting quarters), as well as 39 other issue reports (including events found to be not reportable and Significant Category A and B conditions) identified at INL during the past 12 months (two from this quarter and 37 from the prior three quarters).
Idaho National Laboratory Quarterly Occurrence Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, Lisbeth Ann
This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 85 reportable events (18 from the 4th Qtr FY-15 and 67 from the prior three reporting quarters), as well as 25 other issue reports (including events found to be not reportable and Significant Category A and B conditions) identified at INL during the past 12 months (8 from this quarter and 17 from the prior three quarters).
NPE 2010 results - Independent performance assessment by simulated CTBT violation scenarios
NASA Astrophysics Data System (ADS)
Ross, O.; Bönnemann, C.; Ceranna, L.; Gestermann, N.; Hartmann, G.; Plenefisch, T.
2012-04-01
For verification of compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), the global International Monitoring System (IMS) is currently being built up. The IMS is designed to detect nuclear explosions through their seismic, hydroacoustic, infrasound, and radionuclide signature. The IMS data are collected, processed to analysis products, and distributed to the state signatories by the International Data Centre (IDC) in Vienna. The state signatories themselves may operate National Data Centres (NDC) giving technical advice concerning CTBT verification to the government. NDC Preparedness Exercises (NPE) are regularly performed to practice the verification procedures for the detection of nuclear explosions in the framework of CTBT monitoring. The initial focus of NPE 2010 was on the radionuclide detection component and the application of Atmospheric Transport Modeling (ATM) for defining the source region of a radionuclide event. The exercise was triggered by fictitious radioactive noble gas detections, which were calculated beforehand in secret by forward ATM for a hypothetical xenon release scenario starting at the location and time of a real seismic event. The task for the exercise participants was to find potential source events by atmospheric backtracking and then to analyze promising candidate events using their waveform signals. This study shows one possible solution of NPE 2010, as performed at the German NDC by a team without prior knowledge of the selected event and release scenario. The ATM Source Receptor Sensitivity (SRS) fields as provided by the IDC were evaluated in a logical approach in order to define probable source regions for several days before the first reported fictitious radioactive xenon finding. Additional information on likely event times was derived from xenon isotopic ratios where applicable. Of the seismic events considered in the potential source region, all except one could be identified as earthquakes by seismological analysis. The remaining event at Black Thunder Mine, Wyoming, on 23 Oct at 21:15 UTC showed clear explosion characteristics. It also caused infrasound detections at one station in Canada. A one-station infrasonic localization algorithm led to event localization results comparable in precision to the teleseismic localization. However, the analysis of regional seismological stations gave the most accurate result, with an error ellipse of about 60 square kilometers. Finally, a forward ATM simulation was performed with the candidate event as the source in order to reproduce the original detection scenario. The ATM results showed a simulated station fingerprint in the IMS very similar to the fictitious detections given in the NPE 2010 scenario, which is additional confirmation that the event was correctly identified. The NPE 2010 event analysis shown here serves as a successful example of data fusion between radionuclide detection supported by ATM, seismological methodology, and infrasound signal processing.
Mars Methane highs unrelated to comets
NASA Astrophysics Data System (ADS)
Roos-Serote, Maarten; Atreya, Sushil K.; Webster, Chris; Mahaffy, Paul
2016-10-01
Until the Curiosity Rover arrived at Mars, all measurements of methane were done by remote sensing, either from Earth or from orbiting spacecraft, using a variety of different instruments and under different observing conditions. The Curiosity Rover's Sample Analysis at Mars (SAM) / Tunable Laser Spectrometer (TLS) has carried out systematic measurements of martian methane from Gale crater for two consecutive martian years (31 - 33, starting in October 2012). Meteoric material interacts with the martian atmosphere when Mars passes through a meteoroid stream left behind by cometary bodies orbiting the Sun. Predictions show that 33 such events are likely to occur during the martian year. It has been suggested that the organics present in this material trigger the formation of methane in the atmosphere, and thus these events could possibly be an explanation for the observed variations in the methane abundance. In a recent paper, Fries et al. [2016] argued that all measurements of high methane concentrations are within 16 days of a predicted meteor shower event, and that as such there is a correlation. We present a new analysis including seven new data points that were not available previously. All these new measurements show low methane values. Some of the new measurements were deliberately taken at the same Ls when high values of methane were measured in the previous martian year, showing that the high methane measurements are likely not seasonal, as would be expected if they were connected to meteor shower events. In our analysis we take into account all the predicted meteor events and search for any correlation between these events and the level of methane in the atmosphere. We conclude that, whether we consider individual data points, apply statistical analysis, consider different time spans between measurements and the occurrence of meteor events, or allow for a possible supply of organic material from comets, there is no evidence for such a correlation in the available data.
Big Data Tools as Applied to ATLAS Event Data
NASA Astrophysics Data System (ADS)
Vukotic, I.; Gardner, R. W.; Bryant, L. A.
2017-10-01
Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Logfiles, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and the associated analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data. Such modes would simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning environments and tools like Spark, Jupyter, R, SciPy, Caffe, TensorFlow, etc. Machine learning challenges such as the Higgs Boson Machine Learning Challenge and the Tracking challenge, event viewers (VP1, ATLANTIS, ATLASrift), and still-to-be-developed educational and outreach tools would be able to access the data through a simple REST API. In this preliminary investigation we focus on derived xAOD data sets. These are much smaller than the primary xAODs, having containers, variables, and events of interest to a particular analysis. Encouraged by the performance of Elasticsearch for the ADC analytics platform, we developed an algorithm for indexing derived xAOD event data. We have made an appropriate document mapping and have imported a full set of standard model W/Z datasets. We compare the disk space efficiency of this approach to that of standard ROOT files and its performance in simple cut-flow data analysis, and we present preliminary results on its scaling characteristics with different numbers of clients, query complexity, and size of the data retrieved.
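A minimal sketch of what such an indexing step could look like with the official Python Elasticsearch client; the index name, field names, and endpoint are assumptions for illustration, not the document mapping actually used for the W/Z datasets:

    from elasticsearch import Elasticsearch, helpers

    es = Elasticsearch("http://localhost:9200")  # assumed endpoint

    def event_actions(events):
        # Yield one bulk-index action per event; field names are illustrative.
        for ev in events:
            yield {
                "_index": "xaod-events",
                "_source": {
                    "run_number": ev["run"],
                    "event_number": ev["event"],
                    "electrons": ev["electrons"],  # e.g. list of {pt, eta, phi}
                    "muons": ev["muons"],
                    "met": ev["met"],
                },
            }

    helpers.bulk(es, event_actions(parsed_events))  # parsed_events: xAOD content

    # A cut-flow style selection becomes a range query, e.g. missing ET > 50 GeV:
    hits = es.search(index="xaod-events", query={"range": {"met": {"gt": 50.0}}})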
Bio-inspired approach for intelligent unattended ground sensors
NASA Astrophysics Data System (ADS)
Hueber, Nicolas; Raymond, Pierre; Hennequin, Christophe; Pichler, Alexander; Perrot, Maxime; Voisin, Philippe; Moeglin, Jean-Pierre
2015-05-01
Improving the surveillance capacity over wide zones requires a set of smart battery-powered Unattended Ground Sensors capable of issuing an alarm to a decision-making center. Only high-level information has to be sent when a relevant suspicious situation occurs. In this paper we propose an innovative bio-inspired approach that mimics the human bi-modal vision mechanism and the parallel processing ability of the human brain. The designed prototype exploits two levels of analysis: a low-level panoramic motion analysis, the peripheral vision, and a high-level event-focused analysis, the foveal vision. By tracking moving objects and fusing multiple criteria (size, speed, trajectory, etc.), the peripheral vision module acts as a fast relevant event detector. The foveal vision module focuses on the detected events to extract more detailed features (texture, color, shape, etc.) in order to improve the recognition efficiency. The implemented recognition core is able to acquire human knowledge and to classify in real time a huge amount of heterogeneous data thanks to its natively parallel hardware structure. This UGS prototype validates our system approach in laboratory tests. The peripheral analysis module demonstrates a low false alarm rate, whereas the foveal vision correctly focuses on the detected events. A parallel FPGA implementation of the recognition core succeeds in fulfilling the embedded application requirements. These results pave the way for future reconfigurable virtual field agents. By locally processing the data and sending only high-level information, their energy requirements and electromagnetic signature are optimized. Moreover, the embedded Artificial Intelligence core enables these bio-inspired systems to recognize and learn new significant events. By duplicating human expertise in potentially hazardous places, our miniature visual event detector will allow early warning and contribute to better human decision making.
Explosive Yield Estimation using Fourier Amplitude Spectra of Velocity Histories
NASA Astrophysics Data System (ADS)
Steedman, D. W.; Bradley, C. R.
2016-12-01
The Source Physics Experiment (SPE) is a series of explosive shots of various size detonated at varying depths in a borehole in jointed granite. The testbed includes an extensive array of accelerometers for measuring the shock environment close-in to the explosive source. One goal of SPE is to develop greater understanding of the explosion phenomenology in all regimes: from near-source, non-linear response to the far-field linear elastic region, and connecting the analyses from the respective regimes. For example, near-field analysis typically involves review of kinematic response (i.e., acceleration, velocity and displacement) in the time domain and looks at various indicators (e.g., peaks, pulse duration) to facilitate comparison among events. Review of far-field data more often is based on study of response in the frequency domain to facilitate comparison of event magnitudes. To try to "bridge the gap" between approaches, we have developed a scaling law for Fourier amplitude spectra of near-field velocity histories that successfully collapses data from a wide range of yields (100 kg to 5000 kg) and range to sensors in jointed granite. Moreover, we show that we can apply this scaling law to data from a new event to accurately estimate the explosive yield of that event. This approach presents a new way of working with near-field data that will be more compatible with traditional methods of analysis of seismic data and should serve to facilitate end-to-end event analysis. The goal is that this new approach to data analysis will eventually result in improved methods for discrimination of event type (i.e., nuclear or chemical explosion, or earthquake) and magnitude.
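The scaling law itself is not given in the abstract, so the sketch below only illustrates the generic workflow: compute the Fourier amplitude spectrum of a velocity history and collapse records onto a reference yield. The cube-root (W^(1/3)) scaling, its application to the frequency and amplitude axes, and the synthetic signal are all placeholder assumptions.

```python
# Illustrative sketch only: Fourier amplitude spectrum (FAS) of a near-field
# velocity history, collapsed via an assumed cube-root yield scaling.
import numpy as np

def fourier_amplitude_spectrum(vel, dt):
    """One-sided Fourier amplitude spectrum of a velocity history vel(t)."""
    amps = np.abs(np.fft.rfft(vel)) * dt          # approximate continuous FT
    freqs = np.fft.rfftfreq(len(vel), d=dt)
    return freqs, amps

def scale_to_reference(freqs, amps, yield_kg, ref_yield_kg=1000.0):
    """Collapse a record onto a reference yield (placeholder cube-root scaling)."""
    s = (yield_kg / ref_yield_kg) ** (1.0 / 3.0)
    # direction and exponents are placeholders; the actual SPE law differs
    return freqs * s, amps / s

# Synthetic damped-sine "velocity history" standing in for a 100 kg shot record.
dt = 1e-4
t = np.arange(0.0, 0.5, dt)
vel = np.exp(-20.0 * t) * np.sin(2.0 * np.pi * 80.0 * t)
f, a = fourier_amplitude_spectrum(vel, dt)
f_sc, a_sc = scale_to_reference(f, a, yield_kg=100.0)
```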
Tse, Gary; Gong, Mengqi; Wong, Cheuk Wai; Chan, Cynthia; Georgopoulos, Stamatis; Chan, Yat Sun; Yan, Bryan P; Li, Guangping; Whittaker, Paula; Ciobanu, Ana; Ali-Hasan-Al-Saegh, Sadeq; Wong, Sunny H; Wu, William K K; Bazoukis, George; Lampropoulos, Konstantinos; Wong, Wing Tak; Tse, Lap Ah; Baranchuk, Adrian M; Letsas, Konstantinos P; Liu, Tong
2018-03-01
The total cosine R-to-T (TCRT), a vectorcardiographic marker reflecting the spatial difference between the depolarization and repolarization wavefronts, has been used to predict ventricular tachycardia/fibrillation (VT/VF) and sudden cardiac death (SCD) in different clinical settings. However, its prognostic value has been controversial. This systematic review and meta-analysis evaluated the significance of TCRT in predicting arrhythmic and/or mortality endpoints. PubMed and Embase databases were searched through December 31, 2016. Of the 890 studies identified initially, 13 observational studies were included in our meta-analysis. A total of 11,528 patients (mean age 47 years, 72% male) were followed for 43 ± 6 months. Data from five studies demonstrated lower TCRT values in myocardial infarction patients with adverse events (syncope, ventricular arrhythmias, or sudden cardiac death) compared to those without these events (mean difference = -0.36 ± 0.05, p < .001; I² = 48%). By contrast, only two studies analyzed outcomes in heart failure, and the pooled meta-analysis did not demonstrate a significant difference in TCRT between event-positive and event-negative patients (mean difference = -0.01 ± 0.10, p > .05; I² = 80%). TCRT is lower in MI patients at high risk of adverse events when compared to those free from such events. It can provide additional risk stratification beyond the use of clinical parameters and traditional electrocardiogram markers. Its value in other diseases such as heart failure requires further studies. © 2017 Wiley Periodicals, Inc.
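A minimal sketch of the pooling arithmetic behind such a meta-analysis: fixed-effect inverse-variance weighting of study-level mean differences, with Cochran's Q and I² for heterogeneity. The study-level inputs are invented for illustration and are not the paper's data.

```python
# Fixed-effect inverse-variance pooling of mean differences with Cochran's Q
# and I^2 heterogeneity. Inputs below are hypothetical study-level values.
import numpy as np

md = np.array([-0.40, -0.25, -0.45, -0.30, -0.38])   # per-study mean differences
se = np.array([0.10, 0.12, 0.15, 0.09, 0.11])        # per-study standard errors

w = 1.0 / se**2                          # inverse-variance weights
md_pooled = np.sum(w * md) / np.sum(w)
se_pooled = np.sqrt(1.0 / np.sum(w))

q = np.sum(w * (md - md_pooled) ** 2)    # Cochran's Q
df = len(md) - 1
i2 = max(0.0, (q - df) / q) * 100.0      # I^2 in percent

print(f"pooled MD = {md_pooled:.2f} +/- {se_pooled:.2f}, I^2 = {i2:.0f}%")
```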
Potas, Jason Robert; de Castro, Newton Gonçalves; Maddess, Ted; de Souza, Marcio Nogueira
2015-01-01
Experimental electrophysiological assessment of evoked responses from regenerating nerves is challenging due to the typically complex response of events dispersed over various latencies and poor signal-to-noise ratio. Our objective was to automate the detection of compound action potential events and derive their latencies and magnitudes using a simple cross-correlation template comparison approach. For this, we developed an algorithm called Waveform Similarity Analysis. To test the algorithm, challenging signals were generated in vivo by stimulating sural and sciatic nerves, whilst recording evoked potentials at the sciatic nerve and tibialis anterior muscle, respectively, in animals recovering from sciatic nerve transection. Our template for the algorithm was generated from responses evoked on the intact side. We also simulated noisy signals and examined the output of the Waveform Similarity Analysis algorithm with imperfect templates. Signals were detected and quantified using Waveform Similarity Analysis, and the results were compared to event detection, latency and magnitude measurements of the same signals performed by a trained observer, a process we called Trained Eye Analysis. The Waveform Similarity Analysis algorithm could successfully detect and quantify simple or complex responses from nerve and muscle compound action potentials of intact or regenerated nerves. Even with an incorrectly specified template, Waveform Similarity Analysis outperformed Trained Eye Analysis in predicting signal amplitude, although it produced consistent latency errors for the simulated signals examined. Compared to the trained eye, Waveform Similarity Analysis is automatic, objective, does not rely on the observer to identify and/or measure peaks, and can detect small clustered events even when the signal-to-noise ratio is poor. Waveform Similarity Analysis provides a simple, reliable and convenient approach to quantifying the latencies and magnitudes of complex waveforms and therefore serves as a useful tool for studying evoked compound action potentials in neural regeneration studies.
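A minimal sketch of the general technique named above, cross-correlation template matching, not the authors' exact algorithm: slide a template over a noisy recording, treat normalized correlation peaks as detected events, and read off latencies and magnitudes. Thresholds and normalization choices are illustrative.

```python
# Cross-correlation template matching for event detection in noisy recordings.
# In the spirit of Waveform Similarity Analysis: the template would be built
# from responses evoked on the intact side, then slid over recordings from
# the regenerating side. Threshold and normalization are illustrative.
import numpy as np
from scipy.signal import correlate, find_peaks

def detect_events(signal, template, fs, threshold=0.6):
    """Return event latencies (s) and amplitude estimates via correlation."""
    tpl = (template - template.mean()) / (template.std() * len(template))
    sig = signal - signal.mean()
    # approximately normalized cross-correlation (template energy handled above)
    cc = correlate(sig, tpl, mode="valid") / (sig.std() + 1e-12)
    peaks, _ = find_peaks(cc, height=threshold, distance=len(template) // 2)
    latencies = peaks / fs
    # magnitude estimate: peak-to-peak of the matched signal segment
    mags = [np.ptp(signal[p:p + len(template)]) for p in peaks]
    return latencies, mags, cc
```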
An Overview of Recent Advances in Event-Triggered Consensus of Multiagent Systems.
Ding, Lei; Han, Qing-Long; Ge, Xiaohua; Zhang, Xian-Ming
2018-04-01
Event-triggered consensus of multiagent systems (MASs) has attracted tremendous attention from both theoretical and practical perspectives because it enables all agents eventually to reach an agreement upon a common quantity of interest while significantly reducing the use of communication and computation resources. This paper aims to provide an overview of recent advances in event-triggered consensus of MASs. First, a basic framework of multiagent event-triggered operational mechanisms is established. Second, representative results and methodologies reported in the literature are reviewed and some in-depth analysis is made of several event-triggered schemes, including event-based sampling schemes, model-based event-triggered schemes, sampled-data-based event-triggered schemes, and self-triggered sampling schemes. Third, two examples are outlined to show the applicability of event-triggered consensus in power sharing of microgrids and formation control of multirobot systems, respectively. Finally, some challenging issues on event-triggered consensus are proposed for future research.
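To make the basic event-triggered mechanism concrete, here is a toy simulation of event-based sampling for single-integrator consensus: each agent broadcasts its state only when its local measurement error exceeds a threshold, and controls are computed from the last broadcast values. The graph, gains, and constant threshold are illustrative choices, not drawn from the survey.

```python
# Toy event-based sampling for single-integrator consensus:
# x_i' = u_i, u_i = -sum_j a_ij (xhat_i - xhat_j), where xhat_i is the state
# agent i broadcast at its last trigger instant. Agent i triggers when its
# measurement error |x_i - xhat_i| exceeds a (here constant) threshold.
import numpy as np

A = np.array([[0, 1, 0, 1],     # adjacency of a 4-agent ring graph
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

dt, T = 1e-3, 5.0
x = np.array([1.0, -2.0, 0.5, 3.0])   # initial states
xhat = x.copy()                        # last broadcast states
sigma = 0.05                           # trigger threshold (illustrative)
triggers = 0

for _ in range(int(T / dt)):
    # control uses only broadcast (event-sampled) information: u = -L @ xhat
    u = -(A.sum(axis=1) * xhat - A @ xhat)
    x = x + dt * u
    # event condition checked locally by each agent
    fire = np.abs(x - xhat) > sigma
    xhat[fire] = x[fire]
    triggers += int(fire.sum())

print(f"final spread: {x.max() - x.min():.3f}, total trigger events: {triggers}")
```

With a constant threshold the agents reach practical consensus (a neighborhood of agreement) while broadcasting far less often than a periodically sampled scheme would.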
Arenal-type pyroclastic flows: A probabilistic event tree risk analysis
NASA Astrophysics Data System (ADS)
Meloy, Anthony F.
2006-09-01
A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (i.e. VERY HIGH, HIGH, MODERATE etc.) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation. With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such, an ETA is considered to be a valuable quantitative decision support tool.
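The essence of an event tree calculation is multiplying conditional probabilities along each branch; a toy sketch follows. Every probability below is an invented placeholder, not a value estimated for Arenal.

```python
# Event tree sketch: annual individual risk = product of conditional
# probabilities along a branch. All numbers are invented placeholders.
p_collapse = 1e-2          # P(crater-wall collapse generates an ATPF in a year)
branches = {
    # label: (P(flow reaches zone | ATPF), P(individual present), P(death | exposed))
    "resident": (0.30, 0.80, 0.9),   # lives in a proximal zone
    "worker":   (0.30, 0.25, 0.9),   # present part of the day
    "tourist":  (0.10, 0.02, 0.9),   # brief visits to a distal zone
}

for label, (p_reach, p_present, p_death) in branches.items():
    annual_risk = p_collapse * p_reach * p_present * p_death
    print(f"{label}: annual individual risk ~ {annual_risk:.1e}")
```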
Accident study of torrential protective structures based on the French RTM database
NASA Astrophysics Data System (ADS)
Boncompain, Ingrid; Quefféléan, Yann; Carladous, Simon
2017-04-01
Torrential protective structures such as dikes, sediment traps, and check dams aim to reduce damage to elements at risk. They are built for a given reference scenario. Nevertheless, this scenario can be exceeded, or structures can fail because of their design or their ageing. We refer to these situations as "accidents". The 1996 Aras disaster near Biescas (Spain) showed that the consequences can be significant: 35 of 40 check dams were destroyed, which involved 87 fatalities in a campsite. The accident probability and its consequences must be taken into account to analyze risk. Databases are useful tools to extract the needed information. In France, the Restoration of Mountainous Areas department (RTM) has been publicly funded to develop a database specific to mountainous areas (the Alps and the Pyrenees). Almost 12,500 check dams, 80 sediment traps and 600 dikes were registered in public forests in 2011. These samples were assumed representative for check dams and sediment traps but not for dikes, most of which were missing. In parallel, more than 31,000 torrential events were registered. Given these elements, an accident study was developed. We first extracted 1,925 events with accidents on protective structures: 39% occurred during the 19th century and 53% have occurred since 1900. Sediment traps were involved in 37 events, check dams in 336, and dikes in 1,488. Then, a detailed analysis was carried out specifically for check dams. Event phenomena were extracted: torrential flood, liquid flood, snow avalanche, rock fall, and landslide. Accident typology was also specified: scouring, breaking of several check dams, total or partial destruction of one structure, overflowing. Causes of accidents on check dams were analyzed first. Torrential floods were responsible for 85% of events (284 of 336), even if other phenomena must also be taken into account. Almost 45% of events (152 of 336) involved total destruction of one or several check dams. Taking into account events over the last 150 years, 30 events were registered with destruction of several check dams: the annual probability of occurrence is 3×10⁻⁴. Analyzing consequences, only 11 of the 1,925 retained events were registered with fatalities. Finally, these quantitative elements were compared to qualitative feedback from field practitioners and were illustrated with the 1987 Saint-Antoine event in Modane. The total or partial destruction of 25 check dams released between 20,000 and 30,000 m³, one third of the estimated debris flow volume (85,000 m³) that damaged an industrial area. The lack of maintenance partially explains this accident. As a conclusion, we must keep in mind that these results are limited to the available data (not all events have necessarily been reported). Accidents on check dams are rare according to these data, which may be due to their general maintenance. This first analysis could be improved by taking into account expert analysis, complementing it with information from other countries' databases, and formalizing the approach within a dependability analysis framework.
DARHT Multi-intelligence Seismic and Acoustic Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, Garrison Nicole; Van Buren, Kendra Lu; Hemez, Francois M.
The purpose of this report is to document the analysis of seismic and acoustic data collected at the Dual-Axis Radiographic Hydrodynamic Test (DARHT) facility at Los Alamos National Laboratory for robust, multi-intelligence decision making. The data utilized herein are obtained from two tri-axial seismic sensors and three acoustic sensors, resulting in a total of nine data channels. The goal of this analysis is to develop a generalized, automated framework to determine internal operations at DARHT using informative features extracted from measurements collected outside the facility. Our framework involves four components: (1) feature extraction, (2) data fusion, (3) classification, and finally (4) robustness analysis. Two approaches are taken for extracting features from the data. The first of these, generic feature extraction, involves extraction of statistical features from the nine data channels. The second approach, event detection, identifies specific events relevant to traffic entering and leaving the facility as well as explosive activities at DARHT and nearby explosive testing sites. Event detection is completed using a two-stage method: first, signatures in the frequency domain are used to identify outliers; second, short-duration events of interest among these outliers are extracted by evaluating the residuals of an autoregressive exogenous (ARX) time series model. Features extracted from each data set are then fused to perform analysis with a multi-intelligence paradigm, where information from multiple data sets is combined to generate more information than is available through analysis of each independently. The fused feature set is used to train a statistical classifier and predict the state of operations to inform a decision maker. We demonstrate this classification using both generic statistical features and event detection, and provide a comparison of the two methods. Finally, the concept of decision robustness is presented through a preliminary analysis where uncertainty is added to the system through noise in the measurements.
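A sketch of the second-stage detector described above, under simplifying assumptions: fit an autoregressive model with an exogenous input (ARX) by least squares and flag samples where the one-step residual is anomalous. Model orders, the threshold, and the synthetic channels are illustrative.

```python
# ARX residual-based event detection sketch: fit y[t] on its own lags and
# lags of an exogenous channel x, then flag residual outliers as candidate
# short-duration events. Orders and threshold are illustrative.
import numpy as np

def arx_residuals(y, x, na=4, nb=2):
    """Least-squares ARX fit y[t] ~ y[t-1..t-na], x[t-1..t-nb]; return residuals."""
    n0 = max(na, nb)
    rows = []
    for t in range(n0, len(y)):
        rows.append(np.concatenate([y[t - na:t][::-1], x[t - nb:t][::-1]]))
    phi = np.asarray(rows)
    target = y[n0:]
    theta, *_ = np.linalg.lstsq(phi, target, rcond=None)
    return target - phi @ theta   # residual indices offset by n0 relative to y

rng = np.random.default_rng(0)
x = rng.normal(size=5000)                      # exogenous reference channel
y = np.convolve(x, [0.5, 0.3], mode="same") + 0.1 * rng.normal(size=5000)
y[2400:2410] += 1.5                            # injected short-duration "event"

r = arx_residuals(y, x)
z = (r - r.mean()) / r.std()
events = np.flatnonzero(np.abs(z) > 5.0)       # residual outliers
print("candidate event samples:", events[:10])
```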
Tool for Constructing Data Albums for Significant Weather Events
NASA Astrophysics Data System (ADS)
Kulkarni, A.; Ramachandran, R.; Conover, H.; McEniry, M.; Goodman, H.; Zavodsky, B. T.; Braun, S. A.; Wilson, B. D.
2012-12-01
Case study analysis and climatology studies are common approaches used in Atmospheric Science research. Research based on case studies involves a detailed description of specific weather events using data from different sources, to characterize physical processes in play for a given event. Climatology-based research tends to focus on the representativeness of a given event, by studying the characteristics and distribution of a large number of events. To gather relevant data and information for case studies and climatology analysis is tedious and time consuming; current Earth Science data systems are not suited to assemble multi-instrument, multi mission datasets around specific events. For example, in hurricane science, finding airborne or satellite data relevant to a given storm requires searching through web pages and data archives. Background information related to damages, deaths, and injuries requires extensive online searches for news reports and official storm summaries. We will present a knowledge synthesis engine to create curated "Data Albums" to support case study analysis and climatology studies. The technological challenges in building such a reusable and scalable knowledge synthesis engine are several. First, how to encode domain knowledge in a machine usable form? This knowledge must capture what information and data resources are relevant and the semantic relationships between the various fragments of information and data. Second, how to extract semantic information from various heterogeneous sources including unstructured texts using the encoded knowledge? Finally, how to design a structured database from the encoded knowledge to store all information and to support querying? The structured database must allow both knowledge overviews of an event as well as drill down capability needed for detailed analysis. An application ontology driven framework is being used to design the knowledge synthesis engine. The knowledge synthesis engine is being applied to build a portal for hurricane case studies at the Global Hydrology and Resource Center (GHRC), a NASA Data Center. This portal will auto-generate Data Albums for specific hurricane events, compiling information from distributed resources such as NASA field campaign collections, relevant data sets, storm reports, pictures, videos and other useful sources.
When a checklist is not enough: How to improve them and what else is needed.
Raman, Jaishankar; Leveson, Nancy; Samost, Aubrey Lynn; Dobrilovic, Nikola; Oldham, Maggie; Dekker, Sidney; Finkelstein, Stan
2016-08-01
Checklists are being introduced to enhance patient safety, but the results have been mixed. The goal of this research is to understand why time-outs and checklists are sometimes not effective in preventing surgical adverse events and to identify additional measures needed to reduce these events. A total of 380 consecutive patients underwent complex cardiac surgery over a 24-month period between November 2011 and November 2013 at an academic medical center, out of a total of 529 cardiac cases. Elective isolated aortic valve replacements, mitral valve repairs, and coronary artery bypass graft surgical procedures (N = 149) were excluded. A time-out was conducted in a standard fashion in all patients in accordance with the World Health Organization surgical checklist protocol. Adverse events were classified as anything that resulted in an operative delay, nonavailability of equipment, failure of drug administration, or an unexpected adverse clinical outcome. These events and their details were collected every week and analyzed using a systems-theoretic causal analysis technique called CAST (Causal Analysis based on Systems Theory). This analytic technique evaluates the sociotechnical system to identify the set of causal factors involved in each adverse event, and these factors were then explored to identify underlying reasons. Recommendations were made for the improvement of checklists and for system design changes that could prevent such events in the future. Thirty events were identified. The causal analysis of these 30 adverse events was carried out and actionable events classified. There were important limitations in the use of standard checklists as a stand-alone patient safety measure in the operating room setting, because of multiple factors. Major categories included miscommunication between staff, medication errors, missing instrumentation, missing implants, and improper handling of equipment or instruments. An average of 3.9 recommendations was generated for each adverse event scenario. Time-outs and checklists can prevent some types of adverse events, but they need to be carefully designed. Additional interventions aimed at improving safety controls in the system design are needed to augment the use of checklists. Customization of checklists for specialized surgical procedures may reduce adverse events. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Risk analysis of computer system designs
NASA Technical Reports Server (NTRS)
Vallone, A.
1981-01-01
Adverse events during implementation can affect the final capabilities, schedule and cost of a computer system even though the system was accurately designed and evaluated. Risk analysis enables the manager to forecast the impact of those events and to request design revisions or contingency plans in a timely manner, before making any decision. This paper presents a structured procedure for an effective risk analysis. The procedure identifies the required activities, separates subjective assessments from objective evaluations, and defines a risk measure to determine the analysis results. The procedure is consistent with system design evaluation and enables a meaningful comparison among alternative designs.
Luminosity variations in several parallel auroral arcs before auroral breakup
NASA Astrophysics Data System (ADS)
Safargaleev, V.; Lyatsky, W.; Tagirov, V.
1997-08-01
The variation of luminosity in two parallel auroral arcs before auroral breakup has been studied using digitised TV data with high temporal and spatial resolution. Intervals when a new arc appears near an already existing one were chosen for analysis. It is shown, for all cases, that the appearance of a new arc is accompanied by the fading or disappearance of the other arc. We have named these events out-of-phase (OP) events. Another type of luminosity variation is characterised by an almost simultaneous enhancement of intensity in both arcs (in-phase event, IP). The characteristic time of IP events is 10-20 s, whereas OP events last about one minute. Sometimes out-of-phase events begin as IP events. Possible mechanisms for OP and IP events are discussed.
The unique contribution of the IDC Reviewed Event Bulletin to global seismicity catalogues
NASA Astrophysics Data System (ADS)
Koch, Karl; Kebede, Fekadu
2010-05-01
For monitoring the Comprehensive Nuclear-Test-Ban Treaty (CTBT), the International Monitoring System (IMS) network is currently being established; it will eventually consist of 241 seismic, hydroacoustic and infrasound stations. The final result of processing and analysis of seismological and other waveform technology data from these stations is the Reviewed Event Bulletin (REB), which has been issued daily by the International Data Center (IDC) under provisional operation since February 2000, except for a total of 28 days. The nearly 300,000 events produced since then correspond to more than 25,000 events per year. As an accompanying effort to bulletin production at the IDC, quality assurance work has been carried out for the REB for the years 2000 to 2008 through comparisons to similar bulletins of global seismicity issued by the ISC and the National Earthquake Information Center (NEIC) of the United States Geological Survey. The comparisons with the NEIC bulletin concentrate on the timely identification of larger events that were either missed during interactive analysis at the IDC or significantly mislocated. For the scope of this study, the comparisons with the ISC bulletin are the focus, as this bulletin provides the most complete reference to global seismicity, even though it becomes available only about two years after event occurrence. In our quality assessments we aimed at evaluating: the consistency of event locations for common events, i.e. those found in both the REB and the ISC bulletin and relocated by the ISC; the number and geospatial location of events produced only in the REB and verified as not being bogus; and the ISC-relocated events not contained in the REB, which were missed during IDC analysis. Even though the seismic component of the IMS network, with its maximum of 170 seismometer stations, is a sparse teleseismic network, location differences of less than 1° (0.5°) are observed, on average, for about 94% (85%) of the common events, as obtained from the ISC bulletin comparisons for the years 2000 to 2006. On the other hand, only 0.25% of such events were located more than 5° apart by the IDC and the ISC. The number of events of significant magnitude missed by the IDC is small and is related predominantly to the lack of a sufficient number of observed arrivals to define an event. The unique contribution of the REB to global seismicity catalogues is expressed by the significant number of events in the ISC bulletin that are contributed solely by the REB. Over the most recent years 2004-2006, the REB and the ISC bulletin include about 20,000 common events which were reprocessed by the ISC. This compares to a steadily rising number of events contributed solely by the REB, from more than 5,000 events in 2004 to nearly 7,500 events in 2006, i.e. a quarter to a third more events. These unique IDC events lie mainly in remote and oceanic areas. A more important aspect, however, is the number of unique IDC events at depth. Below a depth of about 300 km there are hardly any ISC events not detected by the IMS network, while the number of events at larger depths detected only by the IMS is significant. In conclusion, the IDC REB is found to be a valuable source for studies of deep seismicity occurring within the global subduction zones.
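For illustration, identifying "common events" between two bulletins can be sketched as matching on origin time and epicentral separation; the tolerances below are placeholders, not the IDC/ISC association rules.

```python
# Naive bulletin-matching sketch: pair events whose origin times and
# epicentres agree within chosen tolerances. Tolerances are illustrative.
import math

def gc_distance_deg(lat1, lon1, lat2, lon2):
    """Great-circle separation in degrees between two epicentres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    cosd = (math.sin(p1) * math.sin(p2)
            + math.cos(p1) * math.cos(p2) * math.cos(dlon))
    return math.degrees(math.acos(max(-1.0, min(1.0, cosd))))

def match_events(reb, isc, dt_max=60.0, ddeg_max=1.0):
    """Pair (time_s, lat, lon) events within dt_max seconds and ddeg_max degrees.

    O(n*m) brute force for clarity; a real catalogue match would sort by time.
    """
    common = []
    for t1, la1, lo1 in reb:
        for t2, la2, lo2 in isc:
            if (abs(t1 - t2) <= dt_max
                    and gc_distance_deg(la1, lo1, la2, lo2) <= ddeg_max):
                common.append(((t1, la1, lo1), (t2, la2, lo2)))
                break
    return common
```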
A Method for Automated Detection of Usability Problems from Client User Interface Events
Saadawi, Gilan M.; Legowski, Elizabeth; Medvedeva, Olga; Chavan, Girish; Crowley, Rebecca S.
2005-01-01
Think-aloud usability (TAU) analysis provides extremely useful data but is very time-consuming and expensive to perform because of the extensive manual video analysis required. We describe a simple method for automated detection of usability problems from client user interface events for a developing medical intelligent tutoring system. The method incorporates (1) an agent-based method for communication that funnels all interface events and system responses to a centralized database, (2) a simple schema for representing interface events and higher-order subgoals, and (3) an algorithm that reproduces the criteria used for manual coding of usability problems. A correction factor was empirically determined to account for the slower task performance of users when thinking aloud. We tested the validity of the method by simultaneously identifying usability problems using TAU and computing them from stored interface event data using the proposed algorithm. All usability problems that did not rely on verbal utterances were detectable with the proposed method. PMID:16779121
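A hedged sketch of what rule-based detection over stored interface events might look like; the event schema and the two detection rules here are hypothetical stand-ins for the study's manual coding criteria.

```python
# Rule-based usability-problem detection over a logged event stream.
# Schema and rules are hypothetical illustrations, not the study's criteria.
from dataclasses import dataclass

@dataclass
class UIEvent:
    t: float        # seconds since session start
    widget: str     # interface element
    action: str     # e.g. "click", "undo"

def detect_problems(events, repeat_n=3, repeat_window=5.0):
    problems = []
    for i, ev in enumerate(events):
        # Rule 1: same widget+action repeated >= repeat_n times in a short window
        recent = [e for e in events[max(0, i - repeat_n + 1):i + 1]
                  if e.widget == ev.widget and e.action == ev.action]
        if len(recent) >= repeat_n and ev.t - recent[0].t <= repeat_window:
            problems.append((ev.t, "repeated action", ev.widget))
        # Rule 2: an undo immediately after another action suggests a slip
        if ev.action == "undo" and i > 0:
            problems.append((ev.t, "undo after " + events[i - 1].action, ev.widget))
    return problems
```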
Fabbri, Alice; Grundy, Quinn; Mintzes, Barbara; Swandari, Swestika; Moynihan, Ray; Walkom, Emily; Bero, Lisa A
2017-01-01
Objectives: To analyse patterns and characteristics of pharmaceutical industry sponsorship of events for Australian health professionals and to understand the implications of recent changes in transparency provisions that no longer require reporting of payments for food and beverages. Design: Cross-sectional analysis. Participants and setting: 301 publicly available company transparency reports downloaded from the website of Medicines Australia, the pharmaceutical industry trade association, covering the period from October 2011 to September 2015. Results: Forty-two companies sponsored 116,845 events for health professionals, on average 608 per week with 30 attendees per event. Events typically included a broad range of health professionals: 82.0% included medical doctors, including specialists and primary care doctors, and 38.3% trainees. Oncology, surgery and endocrinology were the most frequent clinical areas of focus. Most events (64.2%) were held in a clinical setting. The median cost per event was $A263 (IQR $A153–1195) and over 90% included food and beverages. Conclusions: Over this 4-year period, industry-sponsored events were widespread and pharmaceutical companies maintained a high frequency of contact with health professionals. Most events were held in clinical settings, suggesting a pervasive commercial presence in everyday clinical practice. Food and beverages, known to be associated with changes to prescribing practice, were almost always provided. New Australian transparency provisions explicitly exclude meals from the reporting requirements; thus, a large proportion of potentially influential payments from pharmaceutical companies to health professionals will disappear from public view. PMID:28667226
Sun, Siao; Barraud, Sylvie; Castebrunet, Hélène; Aubin, Jean-Baptiste; Marmonier, Pierre
2015-11-15
The assessment of urban stormwater quantity and quality is important for evaluating and controlling the impact of the stormwater to natural water and environment. This study mainly addresses long-term evolution of stormwater quantity and quality in a French urban catchment using continuous measured data from 2004 to 2011. Storm event-based data series are obtained (716 rainfall events and 521 runoff events are available) from measured continuous time series. The Mann-Kendall test is applied to these event-based data series for trend detection. A lack of trend is found in rainfall and an increasing trend in runoff is detected. As a result, an increasing trend is present in the runoff coefficient, likely due to growing imperviousness of the catchment caused by urbanization. The event mean concentration of the total suspended solid (TSS) in stormwater does not present a trend, whereas the event load of TSS has an increasing tendency, which is attributed to the increasing event runoff volume. Uncertainty analysis suggests that the major uncertainty in trend detection results lies in uncertainty due to available data. A lack of events due to missing data leads to dramatically increased uncertainty in trend detection results. In contrast, measurement uncertainty in time series data plays a trivial role. The intra-event distribution of TSS is studied based on both M(V) curves and pollutant concentrations of absolute runoff volumes. The trend detection test reveals no significant change in intra-event distributions of TSS in the studied catchment. Copyright © 2015 Elsevier Ltd. All rights reserved.
A case for multi-model and multi-approach based event attribution: The 2015 European drought
NASA Astrophysics Data System (ADS)
Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Seneviratne, Sonia Isabelle
2017-04-01
Science on the role of anthropogenic influence on extreme weather events such as heat waves or droughts has evolved rapidly over the past years. The approach of "event attribution" compares the occurrence probability of an event in the present, factual world with the probability of the same event in a hypothetical, counterfactual world without human-induced climate change. Every such analysis necessarily faces multiple methodological choices including, but not limited to: the event definition, climate model configuration, and the design of the counterfactual world. Here, we explore the role of such choices for an attribution analysis of the 2015 European summer drought (Hauser et al., in preparation). While some GCMs suggest that anthropogenic forcing made the 2015 drought more likely, others suggest no impact, or even a decrease in the event probability. These results additionally differ for single GCMs, depending on the reference used for the counterfactual world. Observational results do not suggest a historical tendency towards more drying, but the record may be too short to provide robust assessments because of the large interannual variability of drought occurrence. These results highlight the need for a multi-model and multi-approach framework in event attribution research. This is especially important for events with low signal to noise ratio and high model dependency such as regional droughts. Hauser, M., L. Gudmundsson, R. Orth, A. Jézéquel, K. Haustein, S.I. Seneviratne, in preparation. A case for multi-model and multi-approach based event attribution: The 2015 European drought.
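The core attribution quantity can be written as the probability ratio PR = P1/P0 of exceeding an event threshold in the factual versus counterfactual ensembles. A minimal sketch with synthetic ensembles and a bootstrap confidence interval:

```python
# Probability-ratio event attribution sketch with synthetic ensembles.
# The distributions, threshold, and ensemble sizes are illustrative only;
# the event definition (threshold) is itself one of the key choices.
import numpy as np

rng = np.random.default_rng(42)
factual = rng.normal(0.2, 1.0, size=2000)         # e.g. drought-index ensemble
counterfactual = rng.normal(0.0, 1.0, size=2000)  # world without human forcing
threshold = 1.5                                   # event definition

def prob_ratio(f, c, thr):
    p1 = np.mean(f >= thr)
    p0 = np.mean(c >= thr)
    return np.inf if p0 == 0 else p1 / p0

pr = prob_ratio(factual, counterfactual, threshold)
boot = [prob_ratio(rng.choice(factual, factual.size),
                   rng.choice(counterfactual, counterfactual.size), threshold)
        for _ in range(1000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"PR = {pr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```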
Benjet, C; Bromet, E; Karam, E G; Kessler, R C; McLaughlin, K A; Ruscio, A M; Shahly, V; Stein, D J; Petukhova, M; Hill, E; Alonso, J; Atwoli, L; Bunting, B; Bruffaerts, R; Caldas-de-Almeida, J M; de Girolamo, G; Florescu, S; Gureje, O; Huang, Y; Lepine, J P; Kawakami, N; Kovess-Masfety, Viviane; Medina-Mora, M E; Navarro-Mateu, F; Piazza, M; Posada-Villa, J; Scott, K M; Shalev, A; Slade, T; ten Have, M; Torres, Y; Viana, M C; Zarkov, Z; Koenen, K C
2016-01-01
Considerable research has documented that exposure to traumatic events has negative effects on physical and mental health. Much less research has examined the predictors of traumatic event exposure. Increased understanding of risk factors for exposure to traumatic events could be of considerable value in targeting preventive interventions and anticipating service needs. General population surveys in 24 countries with a combined sample of 68 894 adult respondents across six continents assessed exposure to 29 traumatic event types. Differences in prevalence were examined with cross-tabulations. Exploratory factor analysis was conducted to determine whether traumatic event types clustered into interpretable factors. Survival analysis was carried out to examine associations of sociodemographic characteristics and prior traumatic events with subsequent exposure. Over 70% of respondents reported a traumatic event; 30.5% were exposed to four or more. Five types - witnessing death or serious injury, the unexpected death of a loved one, being mugged, being in a life-threatening automobile accident, and experiencing a life-threatening illness or injury - accounted for over half of all exposures. Exposure varied by country, sociodemographics and history of prior traumatic events. Being married was the most consistent protective factor. Exposure to interpersonal violence had the strongest associations with subsequent traumatic events. Given the near ubiquity of exposure, limited resources may best be dedicated to those that are more likely to be further exposed such as victims of interpersonal violence. Identifying mechanisms that account for the associations of prior interpersonal violence with subsequent trauma is critical to develop interventions to prevent revictimization.
Jokeit, H; Makeig, S
1994-01-01
Fast- and slow-reacting subjects exhibit different patterns of gamma-band electroencephalogram (EEG) activity when responding as quickly as possible to auditory stimuli. This result appears to confirm long-standing speculations of Wundt that fast- and slow-reacting subjects produce speeded reactions in different ways and demonstrates that analysis of event-related changes in the amplitude of EEG activity recorded from the human scalp can reveal information about event-related brain processes unavailable using event-related potential measures. Time-varying spectral power in a selected (35- to 43-Hz) gamma frequency band was averaged across trials in two experimental conditions: passive listening and speeded reacting to binaural clicks, forming 40-Hz event-related spectral responses. Factor analysis of between-subject event-related spectral response differences split subjects into two near-equal groups composed of faster- and slower-reacting subjects. In faster-reacting subjects, 40-Hz power peaked near 200 ms and 400 ms poststimulus in the react condition, whereas in slower-reacting subjects, 40-Hz power just before stimulus delivery was larger in the react condition. These group differences were preserved in separate averages of relatively long and short reaction-time epochs for each group. gamma-band (20-60 Hz)-filtered event-related potential response averages did not differ between the two groups or conditions. Because of this and because gamma-band power in the auditory event-related potential is small compared with the EEG, the observed event-related spectral response features must represent gamma-band EEG activity reliably induced by, but not phase-locked to, experimental stimuli or events. PMID:8022783
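A minimal sketch of computing such an event-related spectral response: bandpass each stimulus-locked trial in the gamma band, take the analytic-signal power envelope, and average across trials. The sampling rate, band edges, and synthetic data are illustrative; note the synthetic burst has random phase, so it is induced rather than phase-locked, which is exactly what distinguishes this measure from the event-related potential.

```python
# Event-related spectral response sketch: band power envelope averaged across
# trials. Sampling rate, band edges, and synthetic epochs are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250.0
b, a = butter(4, [35 / (fs / 2), 43 / (fs / 2)], btype="bandpass")

def event_related_spectral_response(trials):
    """trials: (n_trials, n_samples) array of stimulus-locked EEG epochs."""
    envs = []
    for trial in trials:
        filt = filtfilt(b, a, trial)
        envs.append(np.abs(hilbert(filt)) ** 2)   # instantaneous band power
    return np.mean(envs, axis=0)                  # average over trials

# Synthetic epochs: noise plus an induced (non-phase-locked) 40 Hz burst.
rng = np.random.default_rng(3)
t = np.arange(0, 1.0, 1 / fs)
trials = rng.normal(0, 1, (50, t.size))
for k in range(50):
    phase = rng.uniform(0, 2 * np.pi)   # random phase: induced, not evoked
    burst = np.exp(-((t - 0.2) ** 2) / 0.005) * np.sin(2 * np.pi * 40 * t + phase)
    trials[k] += 2.0 * burst

ers = event_related_spectral_response(trials)
print(f"peak gamma power at t = {t[np.argmax(ers)]:.2f} s")
```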
The Presentation of American Cultural Events in the Soviet Press (1977-1979).
ERIC Educational Resources Information Center
Williams, Katherine A.
A content analysis of selected Soviet newspapers and magazines was conducted to examine what cultural events from the United States were featured in the Soviet press, whether the event or artist was presented favorably or unfavorably, and whether the stories were used to make an ideological statement. Nine publications were examined over a…
ERIC Educational Resources Information Center
Maunder, Robert G.; Peladeau, Nathalie; Savage, Diane; Lancee, William J.
2010-01-01
Objective: We investigated the prevalence of childhood adversity among healthcare workers and if such experiences affect responses to adult life stress. Methods: A secondary analysis was conducted of a 2003 study of 176 hospital-based healthcare workers, which surveyed lifetime traumatic events, recent life events, psychological distress, coping,…
Forest inventory, catastrophic events and historic geospatial assessments in the south
Dennis M. Jacobs
2007-01-01
Catastrophic events are a regular occurrence of disturbance to forestland in the Southern United States. Each major event affects the integrity of the forest inventory database developed and maintained by the Forest Inventory & Analysis Research Work Unit of the U.S. Department of Agriculture, Forest Service. Some of these major disturbances through the years have...
ERIC Educational Resources Information Center
LaFollette, Lindsay K.; Knobloch, Neil A.; Schutz, Michael M.; Brady, Colleen M.
2015-01-01
Exploratory discriminant analysis was used to determine the extent adult consumers' interest motivation to participate in a free educational dairy farm event and their beliefs of the dairy industry could correctly classify the respondents' predicted participation in a nonformal educational event. The most prominent conclusion of the study was that…
ERIC Educational Resources Information Center
Sandin, Bonifacio; Chorot, Paloma; Santed, Miguel A.; Valiente, Rosa M.; Joiner, Thomas E., Jr.
1998-01-01
Empirical evidence relating negative life events and adolescent suicidal behavior is reviewed. The contribution of life events tends to be moderate or weak. A stress process model is presented. Past research has not incorporated mediating and moderating variables into pathways that link psychosocial stressors and suicidal outcomes, providing a…
Genome wide identification of aberrant alternative splicing events in myotonic dystrophy type 2.
Perfetti, Alessandra; Greco, Simona; Fasanaro, Pasquale; Bugiardini, Enrico; Cardani, Rosanna; Garcia-Manteiga, Jose M; Riba, Michela; Cittaro, Davide; Stupka, Elia; Meola, Giovanni; Martelli, Fabio
2014-01-01
Myotonic dystrophy type 2 (DM2) is a genetic, autosomal dominant disease due to expansion of tetraplet (CCTG) repetitions in the first intron of the ZNF9/CNBP gene. DM2 is a multisystemic disorder affecting the skeletal muscle, the heart, the eye and the endocrine system. According to the proposed pathological mechanism, the expanded tetraplets have an RNA toxic effect, disrupting the splicing of many mRNAs. Thus, the identification of aberrantly spliced transcripts is instrumental for our understanding of the molecular mechanisms underpinning the disease. The aim of this study was the identification of new aberrant alternative splicing events in DM2 patients. By genome wide analysis of 10 DM2 patients and 10 controls (CTR), we identified 273 alternative spliced exons in 218 genes. While many aberrant splicing events were already identified in the past, most were new. A subset of these events was validated by qPCR assays in 19 DM2 and 15 CTR subjects. To gain insight into the molecular pathways involving the identified aberrantly spliced genes, we performed a bioinformatics analysis with Ingenuity system. This analysis indicated a deregulation of development, cell survival, metabolism, calcium signaling and contractility. In conclusion, our genome wide analysis provided a database of aberrant splicing events in the skeletal muscle of DM2 patients. The affected genes are involved in numerous pathways and networks important for muscle physio-pathology, suggesting that the identified variants may contribute to DM2 pathogenesis.
A Cost-Utility Model of Care for Peristomal Skin Complications
Inglese, Gary; Manson, Andrea; Townshend, Arden
2016-01-01
PURPOSE: The aim of this study was to evaluate the economic and humanistic implications of using ostomy components to prevent subsequent peristomal skin complications (PSCs) in individuals who experience an initial, leakage-related PSC event. DESIGN: Cost-utility analysis. METHODS: We developed a simple decision model to consider, from a payer's perspective, PSCs managed with and without the use of ostomy components over 1 year. The model evaluated the extent to which outcomes associated with the use of ostomy components (PSC events avoided; quality-adjusted life days gained) offset the costs associated with their use. RESULTS: Our base case analysis of 1000 hypothetical individuals over 1 year assumes that using ostomy components following a first PSC reduces recurrent events versus PSC management without components. In this analysis, component acquisition costs were largely offset by lower resource use for ostomy supplies (barriers; pouches) and lower clinical utilization to manage PSCs. The overall annual average resource use for individuals using components was about 6.3% ($139) higher versus individuals not using components. Each PSC event avoided yielded, on average, 8 additional quality-adjusted life days over 1 year. CONCLUSIONS: In our analysis, (1) acquisition costs for ostomy components were offset in whole or in part by the use of fewer ostomy supplies to manage PSCs and (2) use of ostomy components to prevent PSCs produced better outcomes (fewer repeat PSC events; more health-related quality-adjusted life days) over 1 year compared to not using components. PMID:26633166
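A toy sketch of the decision-model arithmetic, with invented placeholder parameters except for the 8 quality-adjusted life days (QALDs) per event avoided cited above:

```python
# Toy cost-utility decision model: expected annual cost and QALDs with and
# without ostomy components after a first PSC event. All values are invented
# placeholders except the 8-QALD gain per event avoided cited in the study.
n = 1000                      # cohort size
p_recur_no_comp = 0.60        # hypothetical recurrence risk without components
p_recur_comp = 0.40           # hypothetical recurrence risk with components
cost_components = 400.0       # hypothetical annual component acquisition cost
cost_per_psc = 900.0          # hypothetical cost of managing one PSC event
qald_per_event_avoided = 8.0  # from the study

events_no_comp = n * p_recur_no_comp
events_comp = n * p_recur_comp

cost_no_comp = events_no_comp * cost_per_psc
cost_comp = n * cost_components + events_comp * cost_per_psc
qalds_gained = (events_no_comp - events_comp) * qald_per_event_avoided

print(f"incremental cost: ${cost_comp - cost_no_comp:,.0f} "
      f"for {qalds_gained:,.0f} QALDs gained")
```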
Behaviorism, Private Events, and the Molar View of Behavior
Baum, William M
2011-01-01
Viewing the science of behavior (behavior analysis) to be a natural science, radical behaviorism rejects any form of dualism, including subjective–objective or inner–outer dualism. Yet radical behaviorists often claim that treating private events as covert behavior and internal stimuli is necessary and important to behavior analysis. To the contrary, this paper argues that, compared with the rejection of dualism, private events constitute a trivial idea and are irrelevant to accounts of behavior. Viewed in the framework of evolutionary theory or for any practical purpose, behavior is commerce with the environment. By its very nature, behavior is extended in time. The temptation to posit private events arises when an activity is viewed in too small a time frame, obscuring what the activity does. When activities are viewed in an appropriately extended time frame, private events become irrelevant to the account. This insight provides the answer to many philosophical questions about thinking, sensing, and feeling. Confusion about private events arises in large part from failure to appreciate fully the radical implications of replacing mentalistic ideas about language with the concept of verbal behavior. Like other operant behavior, verbal behavior involves no agent and no hidden causes; like all natural events, it is caused by other natural events. In a science of behavior grounded in evolutionary theory, the same set of principles applies to verbal and nonverbal behavior and to human and nonhuman organisms. PMID:22532740
NASA Technical Reports Server (NTRS)
Blaauw, Rhiannon C.; Cooke, William J.; Kingery, Aaron M.
2015-01-01
Being the only U.S. Government entity charged with monitoring the meteor environment, the Meteoroid Environment Office (MEO) has deployed a network of all-sky and wide-field meteor cameras, along with the appropriate software tools to quickly analyze data from these systems. However, the coverage of this network is still quite limited, forcing the incorporation of data from other cameras posted to the internet when analyzing many of the fireballs reported by the public and media. A procedure has been developed that determines the analysis process for a given fireball event based on the types and amount of data available. The differences between these analysis processes are explained and outlined by looking at three bolide events, all of which were large enough to produce meteorites. The first example is an ideal event: a bright meteor that occurred over NASA's All Sky Camera Network on August 2, 2014. With clear video of the event from various angles, a high-accuracy trajectory, beginning and end heights, orbit, and approximate brightness/size of the event can be found very quickly using custom software. The bolide had the potential to have dropped meteorites, so dark flight analysis and modeling was performed, allowing potential fall locations to be mapped as a function of meteorite mass. The second case study was a bright bolide that occurred on November 3, 2014 over West Virginia. This was just north of the NASA southeastern all-sky network, and just south of the Ohio-Pennsylvania network. This case study showcases the MEO's ability to use social media and various internet sources to locate videos of the event from obscure sources (including the Washington Monument), seeking anything that will permit a determination of a basic trajectory and fireball light curve. The third case study highlights the ability to use Doppler weather radar to help locate meteorites, which enables a definitive classification of the impactor. The input data and analysis steps differ for each case study, but the goals remain the same: a trajectory, orbit, and mass estimate for the bolide within hours of the event and, for events with a high probability of producing meteorites, a location of the strewn field within a day.
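A minimal dark-flight sketch under stated assumptions: after luminous flight ends, integrate ballistic fall with atmospheric drag and map landing points as a function of assumed mass. The drag coefficient, exponential atmosphere, meteorite density, and initial state are placeholders, and winds, which matter greatly in practice, are omitted.

```python
# Dark-flight fall sketch: integrate drag + gravity from the end of luminous
# flight and report downrange landing distance per assumed mass. All
# parameters are illustrative placeholders; real analyses include winds.
import numpy as np
from scipy.integrate import solve_ivp

def rho_air(h):                        # simple exponential atmosphere (kg/m^3)
    return 1.225 * np.exp(-h / 8500.0)

def dark_flight(t, y, m, cd=1.3, rho_m=3500.0):
    x, h, vx, vh = y
    r = (3 * m / (4 * np.pi * rho_m)) ** (1 / 3)   # sphere of stony density
    area = np.pi * r**2
    v = np.hypot(vx, vh)
    drag = 0.5 * cd * rho_air(h) * area * v / m
    return [vx, vh, -drag * vx, -drag * vh - 9.81]

def hit_ground(t, y, *args):
    return y[1]                        # altitude crosses zero
hit_ground.terminal = True

for mass in [0.05, 0.5, 5.0]:          # kg; fall location varies with mass
    y0 = [0.0, 25e3, 3000.0, -1000.0]  # assumed end-of-luminous-flight state
    sol = solve_ivp(dark_flight, (0, 2000), y0, args=(mass,),
                    events=hit_ground, max_step=1.0)
    print(f"{mass:4.2f} kg lands ~{sol.y[0, -1] / 1000:.1f} km downrange")
```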
Data Albums: An Event Driven Search, Aggregation and Curation Tool for Earth Science
NASA Technical Reports Server (NTRS)
Ramachandran, Rahul; Kulkarni, Ajinkya; Maskey, Manil; Bakare, Rohan; Basyal, Sabin; Li, Xiang; Flynn, Shannon
2014-01-01
One of the largest continuing challenges in any Earth science investigation is the discovery and access of useful science content from the increasingly large volumes of Earth science data and related information available. Approaches used in Earth science research, such as case study analysis and climatology studies, involve discovering and gathering diverse data sets and information to support the research goals. Research based on case studies involves a detailed description of specific weather events using data from different sources, to characterize the physical processes in play for a specific event. Climatology-based research tends to focus on the representativeness of a given event, by studying the characteristics and distribution of a large number of events. This allows researchers to generalize characteristics such as spatio-temporal distribution, intensity, annual cycle, duration, etc. Gathering relevant data and information for case studies and climatology analysis is both tedious and time consuming. Current Earth science data systems are designed with the assumption that researchers access data primarily by instrument or geophysical parameter. Those who know exactly the datasets of interest can obtain the specific files they need using these systems. However, in cases where researchers are interested in studying a significant event, they have to manually assemble a variety of datasets relevant to it by searching the different distributed data systems. In these cases, the search process needs to be organized around the event rather than the observing instruments. In addition, the existing data systems assume users have sufficient knowledge of the domain vocabulary to be able to effectively utilize their catalogs. These systems do not support new or interdisciplinary researchers who may be unfamiliar with the domain terminology. This paper presents a specialized search, aggregation and curation tool for Earth science to address these existing challenges. The tool automatically creates curated "Data Albums", aggregated collections of information related to a specific science topic or event, containing links to relevant data files (granules) from different instruments; tools and services for visualization and analysis; and information about the event contained in news reports, images or videos to supplement research analysis. Curation in the tool is driven by an ontology-based relevancy ranking algorithm that filters out non-relevant information and data.
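The ranking algorithm itself is not spelled out in the abstract; as an illustration only, ontology-driven relevancy ranking could be sketched as weighted concept matching with a keep/drop threshold:

```python
# Illustrative relevancy-ranking sketch: score candidate resources by
# weighted matches against concepts an event ontology links to the event
# type. Concepts, weights, documents, and threshold are all hypothetical;
# this is not the Data Albums algorithm.
HURRICANE_CONCEPTS = {
    "storm surge": 3.0, "landfall": 3.0, "wind speed": 2.0,
    "precipitation": 2.0, "evacuation": 1.0, "damage": 1.0,
}

def relevancy(doc_text, concepts=HURRICANE_CONCEPTS, threshold=2.0):
    text = doc_text.lower()
    score = sum(w for term, w in concepts.items() if term in text)
    return score, score >= threshold    # keep only items above threshold

docs = [
    "NOAA report: landfall wind speed and storm surge observations",
    "Local bakery reopens after renovation",
]
for d in docs:
    score, keep = relevancy(d)
    print(f"{score:4.1f} {'KEEP' if keep else 'DROP'}  {d}")
```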
Cross-National Analysis of Islamic Fundamentalism
2016-01-20
attitudes, and was fully involved in activities concerning questionnaire design, including a new experimental design in the survey, pilot testing, and... possible collaboration with the research design of the panel survey in Tunisia.
• Data analysis: Analyses of religious fundamentalism, women's dress, trust... the Event History Calendar and the best methods to ask about knowledge and experience of past events. The group designed a series of cognitive
Skylab indicators (event timer) (secondary display) (four-digit metabolic display)
NASA Technical Reports Server (NTRS)
Tiberg, W.
1971-01-01
The effort expended in developing the following indicators is summarized: (1) event timer; (2) secondary display; and (3) 4 digit display (metabolic). The mechanical design, vibration analysis, and thermal analysis of all these units are identical, and descriptions pertain to all three units. All problems incurred during the program are discussed along with the recommendations, conclusions, and actions taken to rectify the situations.
ERIC Educational Resources Information Center
Wiesner, Margit; Capaldi, Deborah M.; Kim, Hyoun K.
2010-01-01
This study used longitudinal data from 202 at-risk young men to examine effects of arrests, prior risk factors, and recent life circumstances on job loss across a 7-year period in early adulthood. Repeated failure-time continuous event-history analysis indicated that occurrence of job loss was primarily related to prior mental health problems,…
NASA Astrophysics Data System (ADS)
Odenweller, Adrian; Donner, Reik V.
2017-04-01
Over the last decade, complex network methods have been frequently used for characterizing spatio-temporal patterns of climate variability from a complex systems perspective, yielding new insights into time-dependent teleconnectivity patterns and couplings between different components of the Earth's climate. Among the foremost results reported, network analyses of the synchronicity of extreme events, as captured by so-called event synchronization, have been proposed as powerful tools for disentangling the spatio-temporal organization of extreme rainfall events and anticipating the timing of monsoon onsets or extreme floodings. Rooted in spike train synchrony analysis in the neurosciences, event synchronization has the great advantage of automatically classifying pairs of events arising at two distinct spatial locations as temporally close (and thus possibly statistically, or even dynamically, interrelated) or not, without the necessity of selecting an additional parameter in terms of a maximally tolerable delay between these events. This consideration is conceptually justified in the case of the original application to spike trains in electroencephalogram (EEG) recordings, where the inter-spike intervals show relatively narrow distributions at high temporal sampling rates. However, in climate studies, precipitation extremes defined by daily precipitation sums exceeding a certain empirical percentile of their local distribution exhibit a distinctly different distribution of waiting times between subsequent events. This raises conceptual concerns about whether event synchronization is still appropriate for detecting interlinkages between spatially distributed precipitation extremes. In order to study this problem in more detail, we employ event synchronization together with an alternative similarity measure for event sequences, event coincidence rates, which requires a manual setting of the maximally tolerable delay between two events that are to be considered potentially related. Both measures are then used to generate climate networks from parts of the satellite-based TRMM precipitation data set at daily resolution covering the Indian and East Asian monsoon domains, respectively, thereby reanalysing previously published results. The obtained spatial patterns of degree densities and local clustering coefficients exhibit marked differences between the two similarity measures. Specifically, we demonstrate that there exists a strong relationship between the fraction of extremes occurring on subsequent days and the degree density in the event-synchronization-based networks, suggesting that the spatial patterns obtained using this approach are strongly affected by the presence of serial dependencies between events. Given that a manual selection of the maximally tolerable delay between two events can be guided by a priori climatological knowledge and even used for systematic testing of different hypotheses on the climatic processes underlying the emergence of spatio-temporal patterns of extreme precipitation, our results provide evidence that event coincidence rates are a more appropriate statistical characteristic for similarity assessment and network construction for climate extremes, while results based on event synchronization need to be interpreted with great caution.
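A compact sketch contrasting the two measures for event-time sequences: event synchronization (ES) with its adaptive coincidence window, and the event coincidence rate (ECR) with a fixed, user-chosen tolerance ΔT. The normalization shown for ES is one common choice; the data are synthetic.

```python
# Event synchronization (adaptive window) vs. event coincidence rate (fixed
# window) for two sorted event-time arrays. Data below are synthetic.
import numpy as np

def es_count(ta, tb):
    """Directional ES count of ta events synchronized with earlier tb events."""
    c = 0.0
    for i in range(1, len(ta) - 1):
        for j in range(1, len(tb) - 1):
            # adaptive window: half the smallest adjacent inter-event interval
            tau = 0.5 * min(ta[i + 1] - ta[i], ta[i] - ta[i - 1],
                            tb[j + 1] - tb[j], tb[j] - tb[j - 1])
            d = ta[i] - tb[j]
            if d == 0:
                c += 0.5
            elif 0 < d <= tau:
                c += 1.0
    return c

def event_synchronization(ta, tb):
    """Symmetric ES, normalized by sqrt(n_a * n_b) (one common convention)."""
    return (es_count(ta, tb) + es_count(tb, ta)) / np.sqrt(len(ta) * len(tb))

def event_coincidence_rate(ta, tb, delta_t):
    """Fraction of ta events with at least one tb event within delta_t."""
    return float(np.mean([np.any(np.abs(tb - t) <= delta_t) for t in ta]))

rng = np.random.default_rng(7)
ta = np.sort(rng.choice(3650, size=90, replace=False)).astype(float)  # ~10 yr daily
keep = rng.random(ta.size) < 0.6
tb = np.sort(ta[keep] + rng.integers(0, 3, size=keep.sum()))  # lagged subset

print(f"ES  = {event_synchronization(ta, tb):.2f}")
print(f"ECR = {event_coincidence_rate(ta, tb, delta_t=2.0):.2f}")
```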
Naturalistic Cycling Study: Identifying Risk Factors for On-Road Commuter Cyclists
Johnson, Marilyn; Charlton, Judith; Oxley, Jennifer; Newstead, Stuart
2010-01-01
The study aim was to identify risk factors for collisions/near-collisions involving on-road commuter cyclists and drivers. A naturalistic cycling study was conducted in Melbourne, Australia, with cyclists wearing helmet-mounted video cameras. Video recordings captured the cyclists' perspective of the road and traffic behaviours including head checks, reactions and manoeuvres. The 100-car naturalistic driving study analysis technique was adapted for data analysis, and events were classified by severity: collision, near-collision and incident. Participants were adult cyclists, each of whom filmed 12 hours of commuter cycling trips over a 4-week period. In total, 127 hours and 38 minutes of footage from 13 participants were analysed; 54 events were identified: 2 collisions, 6 near-collisions and 46 incidents. Prior to events, 88.9% of cyclists travelled in a safe/legal manner. Sideswipe was the most frequent event type (40.7%). Most events occurred at an intersection or intersection-related location (70.3%). The vehicle driver was judged at fault in the majority of events (87.0%), and no post-event driver reaction was observed in most (83.3%). Cross tabulations revealed significant associations between event severity and: cyclist reaction, cyclist post-event manoeuvre, pre-event driver behaviour, other vehicle involved, driver reaction, visual obstruction, cyclist head check (left), event type and vehicle location (p<0.05). Frequent head checks suggest cyclists had high situational awareness, and their reactive behaviour to driver actions led to successful avoidance of collisions/near-collisions. Strategies are discussed for improving driver awareness of on-road cyclists and for indicating early before turning or changing lanes when sharing the roadway with cyclists. Findings will contribute to the development of effective countermeasures to reduce cyclist trauma. PMID:21050610
NASA Astrophysics Data System (ADS)
Nita, B.; Perchuc, E.; Thybo, H.; Maguire, P.; Denton, P.
2004-12-01
We evaluate the existence and the depth of the '8° discontinuity' beneath the Alpine orogen using the natural seismicity of Europe and northern Africa as well as events induced by mining activity. For this analysis, the regional events (1) must have epicenters more than 1000 km from the structure being imaged, and (2) must have body-wave magnitudes above 4.0 to obtain a favourable signal-to-noise ratio. The events satisfying these conditions have epicentres in Algeria, Spain, Bulgaria, Greece and in the Lubin Copper Basin in Poland. The last region is characterised by high seismicity resulting from mining activity. We base our analysis on P-wave traveltime residuals relative to the iasp91 reference model. The 8° discontinuity appears to be expressed in the observed P-wave traveltime delays at epicentral distances around 800 km. The analysis of events from the Lubin Copper Basin and from the other regions mentioned above gives P-wave delays of 3 s at the Alpine stations in comparison with stations in the Variscan areas further to the north. We attribute this variation in travel time to the difference between 'fast' and 'slow' uppermost mantle structures in Europe.
Young doctors' problem solving strategies on call may be improved.
Michelsen, Jens; Malchow-Møller, Axel; Charles, Peder; Eika, Berit
2013-03-01
The first year following graduation from medical school is challenging, as learning from books changes to workplace-based learning. Analysis of and reflection on experience may ease this transition. We used Significant Event Analysis (SEA) as a tool to explore what pre-registration house officers (PRHOs) consider successful and problematic events, and to identify what problem-solving strategies they employ. A senior house officer systematically led each PRHO through the SEA of one successful and one problematic event following a night call. The PRHO wrote answers to questions about the diagnosis, what happened, how he or she contributed, and what knowledge-gaining activities the PRHO would prioritise before the next call. Using an inductive thematic data analysis, we identified five problem-solving strategies: non-analytical reasoning, analytical reasoning, communication with patients, communication with colleagues and professional behaviour. On average, 1.5 strategies were used in the successful events and 1.2 strategies in the problematic events. Most PRHOs were unable to suggest activities other than reading textbooks. SEA was valuable for identifying PRHOs' problem-solving strategies in a natural setting. PRHOs should be assisted in increasing their repertoire of strategies, and they should also be helped to 'learn to learn', as they were largely unable to point to new learning strategies.
Kim, Seokyeon; Jeong, Seongmin; Woo, Insoo; Jang, Yun; Maciejewski, Ross; Ebert, David S
2018-03-01
Geographic visualization research has focused on a variety of techniques to represent and explore spatiotemporal data. The goal of those techniques is to enable users to explore events and interactions over space and time in order to facilitate the discovery of patterns, anomalies and relationships within the data. However, it is difficult to extract and visualize data flow patterns over time for non-directional statistical data without trajectory information. In this work, we develop a novel flow analysis technique to extract, represent, and analyze flow maps of non-directional spatiotemporal data unaccompanied by trajectory information. We estimate a continuous distribution of these events over space and time, and extract flow fields for spatial and temporal changes utilizing a gravity model. Then, we visualize the spatiotemporal patterns in the data by employing flow visualization techniques. The user is presented with temporal trends of geo-referenced discrete events on a map. As such, overall spatiotemporal data flow patterns help users analyze geo-referenced temporal events, such as disease outbreaks, crime patterns, etc. To validate our model, we discard the trajectory information in an origin-destination dataset and apply our technique to the data and compare the derived trajectories and the original. Finally, we present spatiotemporal trend analysis for statistical datasets including twitter data, maritime search and rescue events, and syndromic surveillance.
Rejection of randomly coinciding events in ZnMoO₄ scintillating bolometers
NASA Astrophysics Data System (ADS)
Chernyak, D. M.; Danevich, F. A.; Giuliani, A.; Mancuso, M.; Nones, C.; Olivieri, E.; Tenconi, M.; Tretyak, V. I.
2014-06-01
Random coincidence of events (particularly from two-neutrino double beta decay) could be one of the main sources of background in the search for neutrinoless double beta decay with cryogenic bolometers due to their poor time resolution. Pulse-shape discrimination using front-edge analysis, mean-time and χ² methods was applied to discriminate randomly coinciding events in ZnMoO₄ cryogenic scintillating bolometers. These events can be effectively rejected at the level of 99% by the analysis of the heat signals with rise-time of about 14 ms and signal-to-noise ratio of 900, and at the level of 92% by the analysis of the light signals with rise-time of about 3 ms and signal-to-noise ratio of 30, under the requirement to detect 95% of single events. These rejection efficiencies are compatible with extremely low background levels in the region of interest of neutrinoless double beta decay of ¹⁰⁰Mo for enriched ZnMoO₄ detectors, of the order of 10⁻⁴ counts/(y keV kg). Pulse-shape parameters have been chosen on the basis of the performance of a real massive ZnMoO₄ scintillating bolometer. The importance of the signal-to-noise ratio, correct identification of the signal start, and the choice of an appropriate sampling frequency are discussed.
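The mean-time discriminant referred to here is simply the amplitude-weighted mean arrival time of the sampled pulse, which shifts when two events pile up. A toy sketch (pulse shapes and numbers are ours, loosely echoing the 14 ms rise-time scale; not the analysis code of the paper):

    import numpy as np

    def mean_time(pulse, dt):
        """Amplitude-weighted mean time of a sampled pulse; a pile-up of two
        randomly coinciding events shifts it relative to a single event."""
        pulse = np.asarray(pulse, dtype=float)
        t = np.arange(pulse.size) * dt
        return np.sum(pulse * t) / np.sum(pulse)

    dt = 1e-3                                   # 1 ms sampling
    t = np.arange(0.0, 0.5, dt)
    # single-event template: ~14 ms rise, slower decay (illustrative shape)
    template = (1.0 - np.exp(-t / 0.014)) * np.exp(-t / 0.1)

    delayed = np.zeros_like(template)
    delayed[30:] = template[:-30]               # second event 30 ms later
    pileup = template + 0.8 * delayed

    print(mean_time(template, dt), mean_time(pileup, dt))  # pile-up is larger

A cut on this single scalar (or on rise-time and similar shape parameters) is what allows coincident events to be rejected while retaining most single events.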
Deljavan, Reza; Sadeghi-Bazargani, Homayoun; Fouladi, Nasrin; Arshi, Shahnam; Mohammadi, Reza
2012-01-01
Little has been done to investigate the application of injury-specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon's matrix) within a qualitative research methodology to better understand people's perceptions about burn injuries. Both child and adult burn injury victims were enrolled into a qualitative study conducted using focus group discussions. Haddon's matrix was used to develop the interview guide and again in the analysis phase. The main analysis clusters were pre-event phase/human (including risky behaviors, beliefs and cultural factors, and knowledge and education), pre-event phase/object, pre-event phase/environment, and the event and post-event phases (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). The study showed that Haddon's matrix is applicable in a qualitative research methodology at both the data collection and data analysis phases, and it yielded substantially rich information regarding burn injuries that may be useful both for future injury research, including quantitative studies, and for designing burn injury prevention plans.
NASA Astrophysics Data System (ADS)
Kawamura, Masashi; Yamaoka, Koshun
2009-02-01
We investigated the temporal relationship between two events: the seismovolcanic activity near the Miyakejima and Kozushima islands and the slow-slip event along the plate boundary in the Tokai district. The islands are located east of Tokai, and the Tokai slow-slip event was discovered immediately after the large crustal deformation caused by the volcanic activity in the Miyakejima-Kozushima region ceased. However, the order of occurrence of these events is still controversial, and resolving it will help us understand the tectonic processes of the central part of Japan, where much volcanic and seismic activity occurs. For this purpose, we applied a statistical approach (Kawamura, M., Yamaoka, K., 2006. Spatiotemporal characteristics of the displacement field revealed with principal component analysis and the mode-rotation technique, Tectonophys., 419, 55-73), which consists of principal component analysis (PCA) and a mode-rotation procedure, to the displacement field provided by the nationwide GPS network (GEONET), in order to obtain the characteristic structures of spatiotemporal crustal deformation caused by the two events. We divided the analysis period into two sections (June 26, 1999 to June 25, 2000 and June 26, 2000 to June 25, 2002), split at the day the magma intrusion occurred beneath the Miyakejima volcano on June 26, 2000. The spatial and temporal modes for the first time period did not indicate any significant spatiotemporal patterns corresponding to the two events, indicating the absence of episodic crustal deformation during this period. In contrast, the modes for the latter time period included the changes caused by these events. The two major modes captured the spatiotemporal structures of the first and second halves of the Miyake-Kozu seismovolcanic activity. The characteristic pattern of crustal deformation corresponding to the Tokai slow-slip event was found in the fourth mode, which became prominent after the beginning of the Miyake-Kozu seismovolcanic activity. From these results, we conclude that the crustal deformation caused by the Tokai slow-slip event did not start before the Miyake-Kozu seismovolcanic activity.
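The PCA step on a GEONET-style displacement field reduces, in essence, to a singular value decomposition of the mean-removed station-displacement matrix, with the columns of V giving spatial modes and the scaled columns of U their time histories. A minimal sketch with synthetic data standing in for the GPS time series (array names and sizes are ours; the mode-rotation step of the cited method is not shown):

    import numpy as np

    # disp: (n_days, 2 * n_stations) with east and north components of every
    # station concatenated per day; random placeholder data here.
    rng = np.random.default_rng(0)
    n_days, n_stations = 730, 50
    disp = rng.normal(0.0, 1e-3, size=(n_days, 2 * n_stations))

    X = disp - disp.mean(axis=0)             # remove the mean field
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    explained = s**2 / np.sum(s**2)          # variance fraction per mode
    temporal_modes = U * s                   # time evolution of each mode
    spatial_modes = Vt                       # station pattern of each mode
    print("variance explained by first 4 modes:", explained[:4])

Episodic deformation such as a slow-slip event would appear as a mode whose temporal component departs from noise during the event interval.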
Multifractal analysis of the 2001 Mw 7.7 Bhuj earthquake sequence in Gujarat, Western India
NASA Astrophysics Data System (ADS)
Aggarwal, Sandeep Kumar; Pastén, Denisse; Khan, Prosanta Kumar
2017-12-01
The seismic sequence following the 2001 Mw 7.7 Bhuj mainshock in the Kachchh area, covering 2001 to 2012, has been analyzed using mono-fractal and multi-fractal dimension spectrum analysis. This region was characterized by frequent moderate shocks of Mw ≥ 5.0 for more than a decade after the 2001 Bhuj earthquake, which makes the present study important for precursory analysis. This long sequence has been investigated for the first time, with a completeness magnitude Mc of 3.0 determined by the maximum curvature method. The multi-fractal dimension spectrum Dq (Dq ∼ q) was computed in windows of 200 earthquakes advanced by 20 events at a time (an overlap of 180 events). The robustness of the analysis was tested by adding a completeness-magnitude correction term of 0.2 (Mc 3.2) and evaluating the resulting error in Dq for each magnitude threshold. The stability of the analysis was further examined down to the minimum magnitude of Mw ≥ 2.6 in the sequence. The analysis shows that the multi-fractal dimension spectrum Dq decreases as events cluster in time before a moderate-magnitude earthquake in the sequence, which in turn indicates non-randomness in the spatial distribution of epicenters and its self-organized criticality. Similar behavior is ubiquitous elsewhere around the globe and warns of the proximity of a damaging seismic event in an area.
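For readers unfamiliar with the Dq spectrum: the generalized dimensions can be estimated by box counting over epicentre coordinates, covering the region with boxes of shrinking size ε, forming occupation probabilities p_i, and regressing log Σ p_i^q / (q - 1) on log ε (with the usual limit form for q = 1). A crude, self-contained sketch (uniform random points stand in for epicentres; all names are ours, and this is not the authors' implementation):

    import numpy as np

    def dq_spectrum(points, qs, n_scales=8):
        """Box-counting estimate of generalized dimensions D_q for 2-D points."""
        pts = (points - points.min(axis=0)) / np.ptp(points, axis=0)
        dims = []
        for q in qs:
            log_eps, y = [], []
            for k in range(1, n_scales + 1):
                n_boxes = 2 ** k
                idx = np.minimum((pts * n_boxes).astype(int), n_boxes - 1)
                _, counts = np.unique(idx[:, 0] * n_boxes + idx[:, 1],
                                      return_counts=True)
                p = counts / counts.sum()
                log_eps.append(np.log(1.0 / n_boxes))
                if abs(q - 1.0) < 1e-9:
                    y.append(np.sum(p * np.log(p)))      # D_1, information dim.
                else:
                    y.append(np.log(np.sum(p ** q)) / (q - 1.0))
            dims.append(np.polyfit(log_eps, y, 1)[0])    # slope = D_q
        return np.array(dims)

    rng = np.random.default_rng(1)
    pts = rng.random((2000, 2))              # placeholder epicentres
    print(dq_spectrum(pts, np.linspace(-5, 5, 11)))  # ~2 for uniform points

A flat spectrum near the embedding dimension indicates near-random scatter; a Dq that sinks and spreads with q signals increasing clustering, the precursory behavior described above.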
NASA Astrophysics Data System (ADS)
Fedotova, Yu. V.; Panin, V. I.
2018-03-01
The results of a statistical retrospective analysis of the officially recorded geodynamic events in the mines of the Apatit Company within the Khibiny Massif are presented. The risks and aftereffects of geodynamic events have been calculated. The results of three calculation variants, which take into account the scale of human impact on the rock mass, are discussed. The analysis shows that the main damage due to geodynamic events is destruction of mine workings to varying degrees, while the remaining aftereffects account for less than ten percent. That is, the geodynamic risk in apatite mines can be identified as technological.
Formal analysis of imprecise system requirements with Event-B.
Le, Hong Anh; Nakajima, Shin; Truong, Ninh Thuan
2016-01-01
Formal analysis of functional properties of system requirements needs precise descriptions. However, stakeholders sometimes describe the system with ambiguous, vague or fuzzy terms, hence formal frameworks for modeling and verifying such requirements are desirable. Fuzzy If-Then rules have been used to represent imprecise requirements, but verifying their functional properties still needs new methods. In this paper, we propose a refinement-based modeling approach for the specification and verification of such requirements. First, we introduce a set-theoretic representation of imprecise requirements. Then we make use of Event-B refinement, providing a set of translation rules from fuzzy If-Then rules to Event-B notation. After that, we show how to verify both safety and eventuality properties with RODIN/Event-B. Finally, we illustrate the proposed method on a crane controller example.
Chronodes: Interactive Multifocus Exploration of Event Sequences
POLACK, PETER J.; CHEN, SHANG-TSE; KAHNG, MINSUK; DE BARBARO, KAYA; BASOLE, RAHUL; SHARMIN, MOUSHUMI; CHAU, DUEN HORNG
2018-01-01
The advent of mobile health (mHealth) technologies challenges the capabilities of current visualizations, interactive tools, and algorithms. We present Chronodes, an interactive system that unifies data mining and human-centric visualization techniques to support explorative analysis of longitudinal mHealth data. Chronodes extracts and visualizes frequent event sequences that reveal chronological patterns across multiple participant timelines of mHealth data. It then combines novel interaction and visualization techniques to enable multifocus event sequence analysis, which allows health researchers to interactively define, explore, and compare groups of participant behaviors using event sequence combinations. Through summarizing insights gained from a pilot study with 20 behavioral and biomedical health experts, we discuss Chronodes’s efficacy and potential impact in the mHealth domain. Ultimately, we outline important open challenges in mHealth, and offer recommendations and design guidelines for future research. PMID:29515937
Pet-Armacost, J J; Sepulveda, J; Sakude, M
1999-12-01
The US Department of Transportation was interested in the risks associated with transporting hydrazine in tanks with and without relief devices. Hydrazine is highly toxic, flammable, and corrosive. Consequently, there was a conflict as to whether a relief device should be used or not. Data were not available on the impact of relief devices on release probabilities or on the impact of hydrazine on the likelihood of fires and explosions. In this paper, a Monte Carlo sensitivity analysis of the unknown parameters was used to assess the risks associated with highway transport of hydrazine. To help determine whether or not relief devices should be used, fault trees and event trees were used to model the sequences of events that could lead to adverse consequences during transport of hydrazine. The event probabilities in the event trees were derived as functions of the parameters whose effects were not known. The impacts of these parameters on the risk of toxic exposures, fires, and explosions were analyzed through a Monte Carlo sensitivity analysis and statistically through an analysis of variance. The analysis allowed the determination of which of the unknown parameters had a significant impact on the risks. It also provided the necessary support to a critical transportation decision even though the values of several key parameters were not known.
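The structure of such an analysis is easy to sketch: sample the unknown branch probabilities from priors, propagate each sample through the event tree, and correlate inputs with the outcome risks. A toy Python version with invented parameter ranges and a simplified tree (not the paper's values or model):

    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(42)
    n = 100_000

    # Uncertain inputs drawn from wide priors (hypothetical ranges): tank
    # breach on impact, relief-device venting, and ignition of a release.
    p_breach = rng.uniform(0.01, 0.20, n)
    p_vent = rng.uniform(0.05, 0.50, n)
    p_ignite = rng.uniform(0.001, 0.10, n)

    p_crash = 1e-6   # fixed per-trip crash probability (illustrative only)

    # Simplified event-tree branches for a tank WITH a relief device:
    # a crash either vents (toxic release, no fire) or breaches, then
    # the released hydrazine either ignites or does not.
    p_fire = p_crash * (1 - p_vent) * p_breach * p_ignite
    p_toxic = p_crash * (p_vent + (1 - p_vent) * p_breach * (1 - p_ignite))

    # Rank correlation of each input with fire risk: a crude sensitivity index
    for name, x in [("p_breach", p_breach), ("p_vent", p_vent),
                    ("p_ignite", p_ignite)]:
        rho, _ = spearmanr(x, p_fire)
        print(f"{name}: rho = {rho:+.2f}")

Inputs whose samples barely move the outcome distribution can be left uncertain; inputs with strong rank correlations are the ones worth pinning down with data.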
Kostal, Vratislav; Arriaga, Edgar A.
2011-01-01
Interactions between the cytoskeleton and mitochondria are essential for normal cellular function. An assessment of such interactions is commonly based on bulk analysis of mitochondrial and cytoskeletal markers present in a given sample, which assumes complete binding between these two organelle types. Such measurements are biased because they rarely account for non-bound 'free' subcellular species. Here we report on the use of capillary electrophoresis with dual laser-induced fluorescence detection (CE-LIF) to identify, classify, count and quantify properties of individual binding events of mitochondria and cytoskeleton. Mitochondria were fluorescently labeled with DsRed2, while F-actin, a major cytoskeletal component, was fluorescently labeled with Alexa488-phalloidin. In a typical subcellular fraction of L6 myoblasts, 79% of mitochondrial events did not have detectable levels of F-actin, while the rest had on average ~2 zeptomoles of F-actin, which theoretically represents a ~2.5-μm long network of actin filaments per event. Trypsin treatment of L6 subcellular fractions prior to analysis decreased the fraction of mitochondrial events with detectable levels of F-actin, which is expected from digestion of cytoskeletal proteins on the surface of mitochondria. The electrophoretic mobility distributions of the individual events were also used to further distinguish cytoskeleton-bound from cytoskeleton-free mitochondrial events. The CE-LIF approach described here could be further developed to explore cytoskeleton interactions with other subcellular structures, the effects of cytoskeleton-destabilizing drugs, and the progression of viral infections. PMID:21309532
NASA Astrophysics Data System (ADS)
LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.
2016-12-01
Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable to a diverse set of satellite data and will be made publicly available for scientists in early 2017.
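A minimal stand-in for a clustering-based detector of this kind is density-based clustering on standardized spatio-temporal feature vectors, with the unclustered residue flagged as outliers. A sketch using scikit-learn's DBSCAN on synthetic records (the paper's algorithm is its own; this only illustrates the idea, and all names and values are ours):

    import numpy as np
    from sklearn.cluster import DBSCAN

    rng = np.random.default_rng(0)
    normal = np.column_stack([
        rng.uniform(-60, 60, 5000),       # latitude
        rng.uniform(-180, 180, 5000),     # longitude
        rng.integers(0, 365, 5000),       # day of year
        rng.normal(280, 5, 5000),         # e.g. brightness temperature (K)
    ])
    anomaly = np.array([[10.0, 45.0, 200.0, 330.0]])  # one unusually hot record
    data = np.vstack([normal, anomaly])

    # Standardise features so no single axis dominates the distance metric
    z = (data - data.mean(axis=0)) / data.std(axis=0)

    # Records that join no dense cluster are labelled -1, i.e. outliers
    labels = DBSCAN(eps=0.8, min_samples=10).fit_predict(z)
    outliers = np.flatnonzero(labels == -1)
    print(len(outliers), "outliers; hot record flagged:",
          data.shape[0] - 1 in outliers)

Grouping flagged outliers that are close in space and time is then what turns individual outliers into candidate "events" that can be ranked by rareness or size.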
Detection of goal events in soccer videos
NASA Astrophysics Data System (ADS)
Kim, Hyoung-Gook; Roeber, Steffen; Samour, Amjad; Sikora, Thomas
2005-01-01
In this paper, we present an automatic extraction of goal events in soccer videos using audio track features alone, without relying on expensive-to-compute video track features. The extracted goal events can be used for high-level indexing and selective browsing of soccer videos. The detection of soccer video highlights using audio contents comprises three steps: 1) extraction of audio features from a video sequence, 2) detection of candidate highlight events based on the information provided by the feature extraction methods and a Hidden Markov Model (HMM), and 3) goal event selection to finally determine the video intervals to be included in the summary. For this purpose we compared the performance of the well-known Mel-scale Frequency Cepstral Coefficients (MFCC) feature extraction method against the MPEG-7 Audio Spectrum Projection (ASP) feature extraction method based on three different decomposition methods, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA) and Non-Negative Matrix Factorization (NMF). To evaluate our system we collected five soccer game videos from various sources, in total seven hours of soccer games comprising eight gigabytes of data. One of the five games is used as training data for the audio classes (e.g., announcers' excited speech, ambient audience noise, audience clapping, environmental sounds). Our goal event detection results are encouraging.
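A typical audio front end for this kind of HMM-based highlight detector can be assembled with an off-the-shelf library. A minimal sketch using librosa (our choice for illustration; the paper predates this library, and the file name is a placeholder):

    import numpy as np
    import librosa

    # Load the audio track of a match (any mono WAV works for the sketch)
    y, sr = librosa.load("soccer_audio.wav", sr=16000, mono=True)

    # 13 MFCCs per ~25 ms frame: a standard observation vector for HMMs
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13,
                                n_fft=int(0.025 * sr),       # 25 ms window
                                hop_length=int(0.010 * sr))  # 10 ms hop
    print(mfcc.shape)   # (13, n_frames)

    # Delta features are commonly appended before HMM training/decoding
    features = np.vstack([mfcc, librosa.feature.delta(mfcc)])

Each audio class (excited speech, clapping, ambient noise, and so on) would get its own HMM trained on such frame sequences, with candidate highlights scored by class likelihoods.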
Analysis and synthesis of intonation using the Tilt model.
Taylor, P
2000-03-01
This paper introduces the Tilt intonational model and describes how this model can be used to automatically analyze and synthesize intonation. In the model, intonation is represented as a linear sequence of events, which can be pitch accents or boundary tones. Each event is characterized by continuous parameters representing amplitude, duration, and tilt (a measure of the shape of the event). The paper describes an event detector, in effect an intonational recognition system, which produces a transcription of an utterance's intonation. The features and parameters of the event detector are discussed, and performance figures are reported on a variety of read and spontaneous speaker-independent conversational speech databases. Given the event locations, algorithms are described which produce an automatic analysis of each event in terms of the Tilt parameters. Synthesis algorithms are also presented which generate F0 contours from Tilt representations. Their accuracy is shown by comparing synthetic F0 contours to real F0 contours. The paper concludes with an extensive discussion of linguistic representations of intonation and gives evidence that the Tilt model goes a long way toward satisfying the desired goals of such a representation, in that it has the right number of degrees of freedom to describe and synthesize intonation accurately.
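The tilt parameter itself is a simple function of the rise and fall portions of an event. A sketch of the computation as commonly stated for the Tilt model, averaging the amplitude- and duration-based ratios (treat the exact formula as our paraphrase, not a quotation from the paper):

    def tilt_parameters(a_rise, a_fall, d_rise, d_fall):
        """Tilt parameters of one intonational event from its F0 rise/fall.

        a_rise, a_fall : F0 excursions of the rise and fall parts (Hz)
        d_rise, d_fall : their durations (s)
        Returns (amplitude, duration, tilt); tilt = +1 pure rise, -1 pure fall.
        """
        a_rise, a_fall = abs(a_rise), abs(a_fall)
        amplitude = a_rise + a_fall
        duration = d_rise + d_fall
        tilt_amp = (a_rise - a_fall) / (a_rise + a_fall) if amplitude else 0.0
        tilt_dur = (d_rise - d_fall) / (d_rise + d_fall) if duration else 0.0
        return amplitude, duration, 0.5 * (tilt_amp + tilt_dur)

    # A rise-dominated pitch accent: 40 Hz up over 120 ms, 10 Hz down over 60 ms
    print(tilt_parameters(40, -10, 0.12, 0.06))  # tilt ~ +0.47

Collapsing shape into a single continuous value is what gives the model its small, phonetically interpretable set of degrees of freedom.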
Statistical analysis of mixed recurrent event data with application to cancer survivor study
Zhu, Liang; Tong, Xingwei; Zhao, Hui; Sun, Jianguo; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L.
2014-01-01
Event history studies occur in many fields, including economics, medical studies and social science. In such studies concerning recurrent events, two types of data have been extensively discussed in the literature. One is recurrent event data, which arise if study subjects are monitored or observed continuously; in this case, the observed information provides the times of all occurrences of the recurrent events of interest. The other is panel count data, which occur if the subjects are monitored or observed only periodically; this can happen if continuous observation is too expensive or impractical, and in this case only the numbers of occurrences of the events between subsequent observation times are available. In this paper, we discuss a third type of data, a mixture of recurrent event and panel count data, for which little literature exists. For regression analysis of such data, a marginal mean model is presented and we propose an estimating equation-based approach for estimating the regression parameters. A simulation study is conducted to assess the finite sample performance of the proposed methodology and indicates that it works well in practical situations. Finally it is applied to a motivating study on childhood cancer survivors. PMID:23139023
Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P
2014-10-01
There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, such as medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events, and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.
Measuring the impact of major life events upon happiness.
Ballas, Dimitris; Dorling, Danny
2007-12-01
In recent years there have been numerous attempts to define and measure happiness in various contexts and across a wide range of disciplines, from neuroscience and psychology to philosophy, economics and social policy. This article builds on recent work by economists who estimate happiness regressions using large random samples of individuals in order to calculate monetary 'compensating amounts' for different life 'events'. We estimate happiness regressions using the 'major life event' and 'happiness' data from the British Household Panel Survey. The data and methods used in this article suggest that, in contrast to living states such as 'being married', it is events such as 'starting a new relationship' that have the highest positive effect on happiness, closely followed by 'employment-related gains' (in contrast to employment status). Also, women who become pregnant on average report higher than average levels of subjective happiness (in contrast to 'being a parent'). Other events that appear to be associated with happiness according to our analysis include 'personal education-related events' (e.g. starting a new course, graduating from university, passing exams) and 'finance/house-related events' (e.g. buying a new house). On the other hand, the event with the highest negative impact upon happiness according to our analysis is 'the end of my relationship', closely followed by 'death of a parent'. Adverse health events pertaining to the parents of the respondents also have a high negative coefficient, as does an employment-related loss. The analysis presented in this article suggests that what matters most in people's lives in Britain is to have good dynamic interpersonal relationships and to be respected at work, with that respect being constantly renewed; these 'goods' are reflected as much in dynamic events as in static situations. Relationships at work appear to be of a similar order of importance to those at home. Other factors that contribute to higher than average levels of subjective happiness, at least at a superficial level, include delaying death and keeping illness at bay, having babies, buying homes and cars, and passing exams. The analysis presented here also suggests that people should not expect too much from their holidays and wider families. The findings presented in this article may help us to understand a little better the propensity for groups to be more or less happy, and to begin to better understand the importance of the dynamics of social context: the context in which we come to terms with reward and loss.
Hazard assessment for small torrent catchments - lessons learned
NASA Astrophysics Data System (ADS)
Eisl, Julia; Huebl, Johannes
2013-04-01
The documentation of extreme events, as part of the integral risk management cycle, is an important basis for the analysis and assessment of natural hazards. In July 2011 a flood event occurred in the Wölzer valley in the province of Styria, Austria. For this event at the Wölzerbach a detailed event documentation was carried out, gathering data about rainfall, runoff and sediment transport as well as information on damaged objects, infrastructure and crops from various sources. The flood was triggered by heavy rainfall in two tributaries of the Wölzer river. Though rain and discharge gauging stations exist for the Wölzer river, the torrents affected by the high-intensity rainfall are ungauged. For these ungauged torrent catchments the common methods for hazard assessment were evaluated. The back-calculation of the rainfall event was done using a new approach for precipitation analysis. In torrent catchments, small-scale, high-intensity rainfall events are mainly responsible for extreme events. Austria's weather surveillance radar is operated by the air traffic service AustroControl. The usually available dataset is interpreted and shows divergences, especially for high-intensity rainfall; for this study the raw radar data were therefore requested and analysed. Furthermore, the event was back-calculated with different rainfall-runoff models, hydraulic models and sediment transport models to obtain calibration parameters for future use in hazard assessment for this region. Since there are often problems with woody debris, different scenarios were simulated. The calibrated and plausible results from the runoff models were used for comparison with empirical approaches used in practice. For the planning of mitigation measures at the Schöttl torrent, one of the affected tributaries of the Wölzer river, a physical scale model was used in addition to the insights of the event analysis to design a check dam for sediment retention. Since the transport capacity of the lower reaches is limited, a balance had to be found between protection on the one hand and sediment connectivity to the Wölzer river on the other. The lessons learned kicked off discussions on future hazard assessment, especially concerning the use of rainfall data and design precipitation values for small torrent catchments. The comparison with empirical values also showed the need for differentiated concepts for hazard analysis. Therefore, recommendations for the use of spatial rainfall reduction factors as well as for the demarcation of hazard maps using different event scenarios are proposed.
NASA Astrophysics Data System (ADS)
Stachnik, J.; Rozhkov, M.; Baker, B.; Bobrov, D.; Friberg, P. A.
2015-12-01
Depth of event is an important criterion of seismic event screening at the International Data Center (IDC), CTBTO. However, a thorough determination of the event depth can mostly be conducted only through special analysis, because the IDC's Event Definition Criteria are based, in particular, on depth estimation uncertainties. This causes a large number of events in the Reviewed Event Bulletin to have their depth constrained to the surface. When the true origin depth is greater than that reasonable for a nuclear test (3 km based on existing observations), this may result in a heavier workload to manually distinguish between shallow and deep events. Also, the IDC depth criterion is not applicable to events with a small t(pP-P) travel time difference, which is the case for nuclear tests. Since the shape of the first few seconds of the signal of very shallow events is very sensitive to the presence of the depth phase, cross-correlation between observed and theoretical seismograms can provide an estimate of the depth of the event, and so extend the screening process. We exercised this approach mostly with events at teleseismic and, to a lesser extent, regional distances. We found that such an approach can be very efficient for the seismic event screening process, with certain caveats related mostly to poorly defined crustal models at source and receiver, which can shift the depth estimate. We used an adjustable t* teleseismic attenuation model for the synthetics, since this characteristic is not determined for most of the ray paths we studied. We studied a wide set of historical records of nuclear explosions, including so-called Peaceful Nuclear Explosions (PNEs) with presumably known depths, and recent DPRK nuclear tests. The teleseismic synthetics are based on the stationary phase approximation with Robert Herrmann's hudson96 program, and the regional modelling was done with the generalized ray technique of Vlastislav Cerveny, modified to account for complex source topography.
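The screening idea lends itself to a simple grid search: generate synthetics for a range of trial depths and keep the depth whose waveform correlates best with the observation. A toy, self-contained sketch (our own illustration with a made-up pP delay model, not the authors' hudson96-based workflow):

    import numpy as np

    def best_depth(observed, synth_for_depth, depths):
        """Pick the trial depth whose synthetic best cross-correlates
        with the observed P waveform (both z-scored first)."""
        obs = (observed - observed.mean()) / observed.std()
        scores = []
        for h in depths:
            syn = synth_for_depth(h)
            syn = (syn - syn.mean()) / syn.std()
            cc = np.correlate(obs, syn, mode="full") / obs.size
            scores.append(cc.max())   # allow small alignment shifts
        return depths[int(np.argmax(scores))], np.asarray(scores)

    def make_trace(h, n=400, v=6.0, dt=0.01):
        """Toy 'synthetic': direct P pulse plus a reversed-polarity pP
        copy whose delay grows roughly as 2h/v (crude placeholder)."""
        t0, delay = 100, int(2 * h / v / dt)
        tr = np.zeros(n)
        tr[t0] = 1.0
        if t0 + delay < n:
            tr[t0 + delay] = -0.6
        return np.convolve(tr, np.hanning(15), mode="same")

    observed = make_trace(1.0)                  # "data" built with a 1 km source
    depths = np.arange(0.25, 4.25, 0.25)
    print(best_depth(observed, make_trace, depths)[0])   # recovers ~1.0 km

In practice the synthetic generator is the expensive part; the correlation scan over depth is what flags events whose waveforms demand a depth phase inconsistent with a near-surface source.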
Hip fractures are risky business: an analysis of the NSQIP data.
Sathiyakumar, Vasanth; Greenberg, Sarah E; Molina, Cesar S; Thakore, Rachel V; Obremskey, William T; Sethi, Manish K
2015-04-01
Hip fractures are one of the most common types of orthopaedic injury, with high rates of morbidity. Currently, no study has compared risk factors and adverse events across the different types of hip fracture surgery. The purpose of this paper is to investigate the major and minor adverse events, and the risk factors for complication development, associated with five common surgeries for the treatment of hip fractures, using the NSQIP database. Using the ACS-NSQIP database, complications for five forms of hip surgery were selected and categorized into major and minor adverse events. Demographic and clinical variables were collected, and an unadjusted bivariate logistic regression analysis was performed to determine significant risk factors for adverse events. A multivariate regression was run for each of the five surgeries, along with a combined regression analysis. A total of 9640 patients undergoing surgery for hip fracture were identified, with an adverse event rate of 25.2% (n=2433). Open reduction and internal fixation of a femoral neck fracture had the greatest percentage of major events (16.6%) and total adverse events (27.4%), whereas partial hip hemiarthroplasty had the greatest percentage of minor events (11.6%). Mortality was the most common major adverse event (44.9-50.6%), and urinary tract infections were the most common minor adverse event (52.7-62.6%). Significant risk factors for development of any adverse event included age, BMI, gender, race, active smoking status, history of COPD, history of CHF, ASA score, dyspnoea, and functional status, with various combinations of these factors significantly affecting complication development for the individual surgeries. Hip fractures are associated with significantly high numbers of adverse events. The type of surgery affects the type of complications developed and also affects which risk factors significantly predict the development of a complication. Concerted efforts should be made by orthopaedists to identify higher-risk patients and prevent the most common postoperative adverse events. Copyright © 2014 Elsevier Ltd. All rights reserved.
Murphy, Sabina A; Cannon, Christopher P; Blazing, Michael A; Giugliano, Robert P; White, Jennifer A; Lokhnygina, Yuliya; Reist, Craig; Im, KyungAh; Bohula, Erin A; Isaza, Daniel; Lopez-Sendon, Jose; Dellborg, Mikael; Kher, Uma; Tershakovec, Andrew M; Braunwald, Eugene
2016-02-02
Intensive low-density lipoprotein cholesterol therapy with ezetimibe/simvastatin in IMPROVE-IT (IMProved Reduction of Outcomes: Vytorin Efficacy International Trial) significantly reduced the first primary endpoint (PEP) in patients post-acute coronary syndrome (ACS) compared to placebo/simvastatin. This analysis tested the hypothesis that total events, including those beyond the first event, would also be reduced with ezetimibe/simvastatin therapy. All PEP events (cardiovascular [CV] death, myocardial infarction [MI], stroke, unstable angina [UA] leading to hospitalization, coronary revascularization ≥30 days post-randomization) during a median 6-year follow-up were analyzed in patients randomized to receive ezetimibe/simvastatin or placebo/simvastatin in IMPROVE-IT. Negative binomial regression was used for the primary analysis. Among 18,144 patients, there were 9,545 total PEP events (56% were first events and 44% subsequent events). Total PEP events were significantly reduced by 9% with ezetimibe/simvastatin vs placebo/simvastatin (incidence-rate ratio [RR]: 0.91; 95% confidence interval [CI]: 0.85 to 0.97; p = 0.007), as were the 3 pre-specified secondary composite endpoints and the exploratory composite endpoint of CV death, MI, or stroke (RR: 0.88; 95% CI: 0.81 to 0.96; p = 0.002). The reduction in total events was driven by decreases in total nonfatal MI (RR: 0.87; 95% CI: 0.79 to 0.96; p = 0.004) and total nonfatal stroke (RR: 0.77; 95% CI: 0.65 to 0.93; p = 0.005). Lipid-lowering therapy with ezetimibe plus simvastatin improved clinical outcomes. Reductions in total PEP events, driven by reductions in MI and stroke, more than doubled the number of events prevented compared with examining only the first event. These data support continuation of intensive combination lipid-lowering therapy after an initial CV event. (IMProved Reduction of Outcomes: Vytorin Efficacy International Trial [IMPROVE-IT]; NCT00202878). Copyright © 2016 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
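The primary analysis named here, negative binomial regression of total event counts with follow-up time as exposure, has a compact generic form. A simulated sketch with statsmodels (the data, dispersion, and rates are invented for illustration; this is not the trial's code):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    n = 2000
    treated = rng.integers(0, 2, n)               # 1 = active arm (illustrative)
    follow_up = rng.uniform(2.0, 7.0, n)          # years on study
    mu = 0.10 * np.exp(np.log(0.91) * treated) * follow_up   # true RR = 0.91

    # Gamma-Poisson mixture: NB counts with mean mu and overdispersion
    events = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu))

    X = sm.add_constant(treated.astype(float))
    model = sm.GLM(events, X,
                   family=sm.families.NegativeBinomial(alpha=0.5),
                   offset=np.log(follow_up))      # exposure enters as an offset
    fit = model.fit()
    print(np.exp(fit.params[1]))                  # estimated incidence-rate ratio

The exponentiated treatment coefficient is the incidence-rate ratio reported in the abstract; counting all events, rather than censoring at the first, is what roughly doubles the number of events available to the comparison.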
Ben-Yoav, Hadar; Dykstra, Peter H; Bentley, William E; Ghodssi, Reza
2017-01-01
A microfluidic electrochemical lab-on-a-chip (LOC) device for DNA hybridization detection has been developed. The device comprises a 3 × 3 array of microelectrodes integrated with a dual-layer microfluidic valved manipulation system that provides controlled and automated capabilities for high-throughput analysis of microliter-volume samples. The surface of the microelectrodes is functionalized with single-stranded DNA (ssDNA) probes which enable specific detection of complementary ssDNA targets. These targets are detected by a capacitive technique which measures dielectric variation at the microelectrode-electrolyte interface due to DNA hybridization events. A quantitative analysis of the hybridization events is carried out based on a sensing model that includes a detailed analysis of energy storage and dissipation components. By calculating these components during hybridization events, the device is able to demonstrate specific and dose-response sensing characteristics. The developed microfluidic LOC for DNA hybridization detection offers a technology for real-time and label-free assessment of genetic markers outside of laboratory settings, such as at the point of care or in in-field environmental monitoring.
Predictability of rogue events.
Birkholz, Simon; Brée, Carsten; Demircan, Ayhan; Steinmeyer, Günter
2015-05-29
Using experimental data from three different rogue wave supporting systems, the determinism and predictability of the underlying dynamics are evaluated with methods of nonlinear time series analysis. We included original records from the Draupner platform in the North Sea as well as time series from two optical systems in our analysis. One of the latter was measured in the infrared tail of optical fiber supercontinua, the other in the fluence profiles of multifilaments. All three data sets exhibit extreme-value statistics and exceed the significant wave height in the respective system by a factor larger than 2. Nonlinear time series analysis indicates a different degree of determinism in the systems. The optical fiber scenario is found to be driven by quantum noise, whereas rogue waves emerge as a consequence of turbulence in the others. With the large number of rogue events observed in the multifilament system, we can systematically explore the predictability of such events in a turbulent system. We observe that rogue events do not necessarily appear without warning, but are often preceded by a short phase of relative order. This surprising finding sheds new light on the fascinating phenomenon of rogue waves.
3-Dimensional Root Cause Diagnosis via Co-analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Ziming; Lan, Zhiling; Yu, Li
2012-01-01
With the growth of system size and complexity, reliability has become a major concern for large-scale systems. Upon the occurrence of failure, system administrators typically trace the events in Reliability, Availability, and Serviceability (RAS) logs for root cause diagnosis. However, RAS logs contain only limited diagnosis information. Moreover, manual processing is time-consuming, error-prone, and not scalable. To address the problem, in this paper we present an automated root cause diagnosis mechanism for large-scale HPC systems. Our mechanism examines multiple logs to provide a 3-D fine-grained root cause analysis. Here, 3-D means that our analysis will pinpoint the failure layer, the time, and the location of the event that causes the problem. We evaluate our mechanism by means of real logs collected from a production IBM Blue Gene/P system at Oak Ridge National Laboratory. It successfully identifies failure layer information for 219 failures during a 23-month period. Furthermore, it effectively identifies the triggering events with time and location information, even when the triggering events occur hundreds of hours before the resulting failures.
Reverse translation of adverse event reports paves the way for de-risking preclinical off-targets.
Maciejewski, Mateusz; Lounkine, Eugen; Whitebread, Steven; Farmer, Pierre; DuMouchel, William; Shoichet, Brian K; Urban, Laszlo
2017-08-08
The Food and Drug Administration Adverse Event Reporting System (FAERS) remains the primary source for post-marketing pharmacovigilance. The system is largely un-curated and unstandardized, and it lacks a method for linking drugs to the chemical structures of their active ingredients, increasing noise and artefactual trends. To address these problems, we mapped drugs to their ingredients and used natural language processing to classify and correlate drug events. Our analysis exposed key idiosyncrasies in FAERS: for example, reports of thalidomide causing a deadly ADR when used against myeloma, a likely result of the disease itself; duplications of the same report, unjustifiably increasing its importance; and correlations of reported ADRs with public events, regulatory announcements, and publications. Comparing the pharmacological, pharmacokinetic, and clinical ADR profiles of methylphenidate, aripiprazole, and risperidone, and of kinase drugs targeting the VEGF receptor, demonstrates how underlying molecular mechanisms can emerge from ADR co-analysis. The precautions and methods we describe may enable investigators to avoid confounding chemistry-based associations and reporting biases in FAERS, and they illustrate how comparative analysis of ADRs can reveal underlying mechanisms.
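Once reports are mapped to ingredients and de-duplicated, drug-event correlation is typically screened with disproportionality statistics. A minimal sketch of one standard measure, the reporting odds ratio (our illustration with invented counts; the paper's own pipeline is more elaborate):

    import numpy as np

    def reporting_odds_ratio(a, b, c, d):
        """Reporting odds ratio and 95% CI from a 2x2 contingency table:

                           reaction R   all other reactions
            drug D              a               b
            all other drugs     c               d
        """
        ror = (a / b) / (c / d)
        se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(ROR)
        lo, hi = np.exp(np.log(ror) + np.array([-1.96, 1.96]) * se)
        return ror, lo, hi

    # Hypothetical counts for one drug-ADR pair in a cleaned FAERS extract
    print(reporting_odds_ratio(120, 4880, 900, 99100))   # ROR ~ 2.7

The idiosyncrasies listed in the abstract (report duplication, indication bias, publicity-driven reporting waves) all inflate such counts, which is precisely why the cleaning steps matter before any ratio is computed.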
Analysis and Prediction of Ice Shedding for a Full-Scale Heated Tail Rotor
NASA Technical Reports Server (NTRS)
Kreeger, Richard E.; Work, Andrew; Douglass, Rebekah; Gazella, Matthew; Koster, Zakery; Turk, Jodi
2016-01-01
When helicopters are to fly in icing conditions, it is necessary to consider the possibility of ice shedding from the rotor blades. In 2013, a series of tests was conducted on a heated tail rotor at NASA Glenn's Icing Research Tunnel (IRT). The tests produced several shed events that were captured on camera. Three of these shed events were captured at a frame rate high enough to obtain multiple in-flight images of a sufficiently long section of shed ice for analysis. Analysis of these shed events is presented and compared to an analytical Shedding Trajectory Model (STM). The STM is developed under the assumption that the ice breaks off instantly as it reaches the end of the blade, while frictional and viscous forces are used as parameters to fit the STM. The trajectory of each shed is compared to that predicted by the STM, where the STM describes the shed group of ice as a whole. The limitations of the model's underlying assumptions are discussed in comparison to the experimental shed events.
MatSeis and the GNEM R&E regional seismic analysis tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chael, Eric Paul; Hart, Darren M.; Young, Christopher John
2003-08-01
To improve the nuclear event monitoring capability of the U.S., the NNSA Ground-based Nuclear Explosion Monitoring Research & Engineering (GNEM R&E) program has been developing a collection of products known as the Knowledge Base (KB). Though much of the focus for the KB has been on the development of calibration data, we have also developed numerous software tools for various purposes. The Matlab-based MatSeis package and the associated suite of regional seismic analysis tools were developed to aid in the testing and evaluation of some Knowledge Base products for which existing applications were either not available or ill-suited. This presentation will provide brief overviews of MatSeis and each of the tools, emphasizing features added in the last year. MatSeis was begun in 1996 and is now a fairly mature product. It is a highly flexible seismic analysis package that provides interfaces to read data from either flatfiles or an Oracle database. All of the standard seismic analysis tasks are supported (e.g., filtering, 3-component rotation, phase picking, event location, magnitude calculation), as well as a variety of array processing algorithms (beaming, FK, coherency analysis, vespagrams). The simplicity of Matlab coding and the tremendous number of available functions make MatSeis/Matlab an ideal environment for developing new monitoring research tools (see the regional seismic analysis tools below). New MatSeis features include: addition of evid information to events in MatSeis, options to screen picks by author, input and output of origerr information, improved performance in reading flatfiles, improved speed in FK calculations, and significant improvements to Measure Tool (filtering, multiple phase display), Free Plot (filtering, phase display and alignment), Mag Tool (maximum likelihood options), and Infra Tool (improved calculation speed, display of an F-statistic stream). Work on the regional seismic analysis tools (CodaMag, EventID, PhaseMatch, and Dendro) began in 1999, and the tools vary in their level of maturity. All rely on MatSeis to provide the necessary data (waveforms, arrivals, origins, and travel time curves). CodaMag Tool implements magnitude calculation by scaling to fit the envelope shape of the coda for a selected phase type (Mayeda, 1993; Mayeda and Walter, 1996). New tool features include: calculation of a yield estimate based on the source spectrum, display of a filtered version of the seismogram based on the selected band, and the output of codamag data records for processed events. EventID Tool implements event discrimination using phase ratios of regional arrivals (Hartse et al., 1997; Walter et al., 1999). New features include: bandpass filtering of displayed waveforms, screening of reference events based on SNR, multivariate discriminants, use of libcgi to access correction surfaces, and the output of discrim_data records for processed events. PhaseMatch Tool implements match filtering to isolate surface waves (Herrin and Goforth, 1977). New features include: display of the signal's observed dispersion and an option to use a station-based dispersion surface. Dendro Tool implements agglomerative hierarchical clustering using dendrograms to identify similar events based on waveform correlation (Everitt, 1993). New features include: modifications to include arrival information within the tool, and the capability to automatically add/re-pick arrivals based on the picked arrivals for similar events.
Heller, Gabriella T; Zwang, Theodore J; Sarapata, Elizabeth A; Haber, Michael A; Sazinsky, Matthew H; Radunskaya, Ami E; Johal, Malkiat S
2014-05-01
Previous methods for analyzing protein-ligand binding events using the quartz crystal microbalance with dissipation monitoring (QCM-D) fail to account for the unintended binding that inevitably occurs during surface measurements and obscures kinetic information. In this article, we present a system of differential equations that accounts for both reversible and irreversible unintended interactions. This model is tested on three protein-ligand systems, each of which has different features, to establish the feasibility of using the QCM-D for protein binding analysis. Based on this analysis, we were able to obtain kinetic information for the intended interaction that is consistent with values obtained in the literature via bulk-phase methods. In the appendix, we include a method for decoupling the unintended interactions from the intended binding events and extracting the relevant affinity information. Copyright © 2014 Elsevier B.V. All rights reserved.
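The flavour of such a rate-equation model can be sketched in a few lines: one reversible specific-binding term plus reversible and irreversible unintended-adsorption terms, integrated numerically. All rate constants and the ligand concentration below are invented for illustration; the paper's actual equations and parameters may differ:

    import numpy as np
    from scipy.integrate import solve_ivp

    k_on, k_off = 1e4, 1e-3    # specific binding (1/(M s), 1/s)
    u_on, u_off = 2e3, 5e-4    # reversible unintended adsorption
    w_on = 3e2                 # irreversible unintended adsorption (1/(M s))
    L = 1e-7                   # bulk ligand concentration (M)

    def rates(t, y):
        s, u, w = y            # fractional coverages of three site types
        ds = k_on * L * (1 - s) - k_off * s
        du = u_on * L * (1 - u) - u_off * u
        dw = w_on * L * (1 - w)              # no desorption: irreversible
        return [ds, du, dw]

    sol = solve_ivp(rates, (0, 5000), [0.0, 0.0, 0.0])
    total = sol.y.sum(axis=0)  # what a raw QCM-D trace would lump together
    print(total[-1], sol.y[:, -1])

Fitting the summed signal with all three terms, then discarding the unintended components, is the kind of decoupling the appendix method performs to recover the intended affinity.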
ANCA: Anharmonic Conformational Analysis of Biomolecular Simulations.
Parvatikar, Akash; Vacaliuc, Gabriel S; Ramanathan, Arvind; Chennubhotla, S Chakra
2018-05-08
Anharmonicity in time-dependent conformational fluctuations is noted to be a key feature of the functional dynamics of biomolecules. Although anharmonic events are rare, long-timescale (μs-ms and beyond) simulations facilitate probing of such events. We have previously developed quasi-anharmonic analysis to resolve higher-order spatial correlations and characterize anharmonicity in biomolecular simulations. In this article, we have extended this toolbox to resolve higher-order temporal correlations and built a scalable Python package called anharmonic conformational analysis (ANCA). ANCA has modules to: 1) measure anharmonicity in the form of higher-order statistics and its variation as a function of time, 2) output a storyboard representation of the simulations to identify key anharmonic conformational events, and 3) identify putative anharmonic conformational substates and visualize transitions between these substates. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Idaho National Laboratory Quarterly Occurrence Analysis for the 1st Quarter FY2017
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, Lisbeth Ann
This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, "Occurrence Reporting and Processing of Operations Information," requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 82 reportable events (13 from the 1st quarter (Qtr) of fiscal year (FY) 2017 and 68 from the prior three reporting quarters), as well as 31 other issue reports (including events found to be not reportable and Significant Category A and B conditions) identified at INL during the past 12 months (seven from this quarter and 24 from the prior three quarters).
NASA Astrophysics Data System (ADS)
Kawazoe, S.; Gutowski, W. J., Jr.
2015-12-01
We analyze the ability of regional climate models (RCMs) to simulate very heavy daily precipitation and its supporting processes for both contemporary and future-scenario simulations during summer (JJA). RCM output comes from North American Regional Climate Change Assessment Program (NARCCAP) simulations, which are all run at a spatial resolution of 50 km. Analysis focuses on the upper Mississippi basin in summer, between 1982 and 1998 for the contemporary climate and between 2052 and 2068 for the scenario climate. We also compare simulated precipitation and supporting processes with those obtained from observed precipitation and reanalysis atmospheric states. Precipitation observations are from the University of Washington (UW) and the Climate Prediction Center (CPC) gridded datasets. Utilizing two observational datasets helps determine whether any uncertainties arise from differences in precipitation gridding schemes. Reanalysis fields come from the North American Regional Reanalysis. The NARCCAP models generally reproduce well the precipitation-vs.-intensity spectrum seen in observations, while producing overly strong precipitation at high intensity thresholds. In the future-scenario climate, there is a decrease in frequency for light to moderate precipitation intensities, while an increase in frequency is seen for the higher-intensity events. Further analysis focuses on precipitation events exceeding the 99.5th percentile that occur simultaneously at several points in the region, yielding so-called "widespread events". For widespread events, we analyze local and large-scale environmental parameters, such as 2-m temperature and specific humidity, 500-hPa geopotential heights, Convective Available Potential Energy (CAPE), and vertically integrated moisture flux convergence, among others, to compare atmospheric states and processes leading to such events in the models and observations. The results suggest that an analysis of atmospheric states supporting very heavy precipitation events is a more fruitful path for understanding and detecting changes than simply looking at precipitation itself.
Inzucchi, Silvio E.; Lachin, John M.; Wanner, Christoph; Fitchett, David; Kohler, Sven; Mattheus, Michaela; Woerle, Hans J.; Broedl, Uli C.; Johansen, Odd Erik; Albers, Gregory W.; Diener, Hans Christoph
2017-01-01
Background and Purpose— In the EMPA-REG OUTCOME trial (Empagliflozin Cardiovascular Outcome Event Trial in Type 2 Diabetes Mellitus Patients), empagliflozin added to standard of care in patients with type 2 diabetes mellitus and high cardiovascular risk reduced the risk of 3-point major adverse cardiovascular events, driven by a reduction in cardiovascular mortality, with no significant difference between empagliflozin and placebo in risk of myocardial infarction or stroke. In a modified intent-to-treat analysis, the hazard ratio for stroke was 1.18 (95% confidence interval, 0.89–1.56; P=0.26). We further investigated cerebrovascular events. Methods— Patients were randomized to empagliflozin 10 mg, empagliflozin 25 mg, or placebo; 7020 patients were treated. Median observation time was 3.1 years. Results— The numeric difference in stroke between empagliflozin and placebo in the modified intent-to-treat analysis was primarily because of 18 patients in the empagliflozin group with a first event >90 days after last intake of study drug (versus 3 on placebo). In a sensitivity analysis based on events during treatment or ≤90 days after last dose of drug, the hazard ratio for stroke with empagliflozin versus placebo was 1.08 (95% confidence interval, 0.81–1.45; P=0.60). There were no differences in risk of recurrent, fatal, or disabling strokes, or transient ischemic attack, with empagliflozin versus placebo. Patients with the largest increases in hematocrit or largest decreases in systolic blood pressure did not have an increased risk of stroke. Conclusions— In patients with type 2 diabetes mellitus and high cardiovascular risk, there was no significant difference in the risk of cerebrovascular events with empagliflozin versus placebo. Clinical Trial Registration— URL: http://www.clinicaltrials.gov. Unique identifier: NCT01131676. PMID:28386035
Traumatic events and depressive symptoms among youth in Southwest Nigeria: a qualitative analysis.
Omigbodun, Olayinka; Bakare, Kofoworola; Yusuf, Bidemi
2008-01-01
Traumatic experiences have dire consequences for the mental health of young persons. Despite high rates of traumatic experiences in some African cities, there are no reports for Nigerian youth. To investigate the pattern of traumatic events and their association with depressive symptoms among youth in Southwest Nigeria, we conducted a descriptive cross-sectional study of randomly selected youth in urban and rural schools in Southwest Nigeria. They completed self-reports on traumatic events and depressive symptoms using the Street Children's Project Questionnaire and the Youth DISC Predictive Scale (DPS). Of the 1,768 responses (88.4% response rate) entered into the analysis, 34% reported experiencing a traumatic situation. Following interpretative phenomenological analysis, 13 themes emerged. Frequently occurring traumatic events were 'road traffic accidents' (33.0%), 'sickness' (17.1%), 'lost or trapped' (11.2%) and 'armed robbery attack' (9.7%). A bad dream was described by 3.7%. Traumatic experiences were more common in males (36.2%) than in females (31.6%) (χ² = 4.2; p = .041). Experiencing a traumatic event was associated with depressive symptoms (χ² = 37.98; p < .001), especially when the event directly affected the youth, as in sexual assault or physical abuse. One-third of the youth in Southwest Nigeria described having experienced a traumatic event. Road traffic accidents, armed robbery attacks, and communal disturbances depict the prevailing social environment, whereas 'bad dreams' reveal the influence of cultural beliefs. Policy makers must be aware of the social issues affecting the health of youth. Multi-agency interventions to improve the social environment and provide mental health services for traumatized young people are essential.
Sun, S; Cui, Z; Zhou, M; Li, R; Li, H; Zhang, S; Ba, Y; Cheng, G
2017-02-01
Proton pump inhibitors (PPIs) are commonly used as potent gastric acid secretion antagonists for gastro-esophageal disorders, and their overall safety in patients with gastro-esophageal reflux disease (GERD) is considered good; they are well tolerated. However, recent studies have suggested that PPIs may be a potential independent risk factor for cardiovascular adverse events. The aim of our meta-analysis was to examine the association between PPI monotherapy and cardiovascular events in patients with GERD. A literature search examined relevant databases up to July 2015, including PubMed, Cochrane Library, EMBASE, and ClinicalTrial.gov, and selected randomized controlled trials (RCTs) reporting cardiovascular events with PPI exposure in GERD patients. The pooled risk ratio (RR) and heterogeneity were assessed based on a fixed-effects meta-analysis model and the I² statistic, respectively. Seventeen RCTs covering 7540 patients were selected. The pooled data suggested that the use of PPIs was associated with a 70% increased cardiovascular risk (RR=1.70, 95% CI: [1.13-2.56], P=.01, I²=0%). Furthermore, higher risks of adverse cardiovascular events were found in the omeprazole subgroup (RR=3.17, 95% CI: [1.43-7.03], P=.004, I²=25%) and the long-term treatment subgroup (RR=2.33, 95% CI: [1.33-4.08], P=.003, I²=0%). PPI monotherapy can be a risk factor for cardiovascular adverse events. Omeprazole could significantly increase the risk of cardiovascular events and so should be used carefully. © 2016 John Wiley & Sons Ltd.
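For readers unfamiliar with the pooling step, the sketch below reproduces the arithmetic of a fixed-effect meta-analysis on the log relative-risk scale, together with Cochran's Q and the I² heterogeneity statistic. It uses inverse-variance weighting as a simple stand-in for the Mantel-Haenszel method, and the per-trial counts are hypothetical placeholders, not data from the meta-analysis above.

```python
# Minimal sketch of a fixed-effect meta-analysis on the log relative-risk scale.
import numpy as np

# (events_treated, n_treated, events_control, n_control) per hypothetical trial
trials = np.array([[4, 500, 2, 510], [6, 800, 3, 790], [3, 400, 2, 405]], dtype=float)

a, n1, c, n2 = trials.T
log_rr = np.log((a / n1) / (c / n2))
var = 1/a - 1/n1 + 1/c - 1/n2             # large-sample variance of log RR
w = 1 / var                               # inverse-variance weights (fixed effect)

pooled = np.sum(w * log_rr) / np.sum(w)
se = np.sqrt(1 / np.sum(w))
q = np.sum(w * (log_rr - pooled) ** 2)    # Cochran's Q
i2 = max(0.0, (q - (len(trials) - 1)) / q) * 100 if q > 0 else 0.0

print(f"pooled RR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*se):.2f}-{np.exp(pooled + 1.96*se):.2f}), "
      f"I^2 = {i2:.0f}%")
```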
Lloyd, Belinda; Matthews, Sharon; Livingston, Michael; Jayasekara, Harindra; Smith, Karen
2013-04-01
To assess the relationship between ambulance attendances, emergency department (ED) presentations and hospital admissions for acute alcohol intoxication and the timing of public holidays, sporting and social events. Time-series analysis was used to explore trends in intoxication in the context of major events. Population of Melbourne, Victoria, Australia between 2000 and 2009. All patients attended by ambulance, presenting to hospital EDs, or admitted to hospital who were classified as acutely alcohol intoxicated. Analyses of daily numbers of presentations for acute alcohol intoxication associated with major events were undertaken, including lead and lag effects. Analyses controlled for day of week and month of year to address temporal and seasonal variations. Alcohol intoxication presentations were significantly elevated the day before all public holidays, with intoxication cases on the day of the public holiday itself higher only on New Year's Day (ambulance 6.57, 95% confidence intervals (CI): 3.4-9.74; ED 3.34, 95% CI: 1.28-5.4) and ANZAC Day (ambulance 3.71, 95% CI: 0.68-6.75). The Australian Football League (AFL) Grand Final (ED 2.37, 95% CI: 0.55-4.19), the Commonwealth Games (ED 2.45, 95% CI: 0.6-4.3) and Melbourne Cup Day (ambulance 6.14, 95% CI: 2.42-9.85) were the sporting events with significant elevations in acute intoxication requiring medical attention. The last working day before Christmas was the only social event on which a significant increase in acute intoxication occurred (ambulance 8.98, 95% CI: 6.8-11.15). Acute alcohol intoxication cases requiring ambulance, emergency department and hospital in-patient treatment increase substantially on the day preceding public holidays and other major social events. © 2012 The Authors, Addiction © 2012 Society for the Study of Addiction.
Zinman, Bernard; Inzucchi, Silvio E; Lachin, John M; Wanner, Christoph; Fitchett, David; Kohler, Sven; Mattheus, Michaela; Woerle, Hans J; Broedl, Uli C; Johansen, Odd Erik; Albers, Gregory W; Diener, Hans Christoph
2017-05-01
In the EMPA-REG OUTCOME trial (Empagliflozin Cardiovascular Outcome Event Trial in Type 2 Diabetes Mellitus Patients), empagliflozin added to standard of care in patients with type 2 diabetes mellitus and high cardiovascular risk reduced the risk of 3-point major adverse cardiovascular events, driven by a reduction in cardiovascular mortality, with no significant difference between empagliflozin and placebo in risk of myocardial infarction or stroke. In a modified intent-to-treat analysis, the hazard ratio for stroke was 1.18 (95% confidence interval, 0.89-1.56; P =0.26). We further investigated cerebrovascular events. Patients were randomized to empagliflozin 10 mg, empagliflozin 25 mg, or placebo; 7020 patients were treated. Median observation time was 3.1 years. The numeric difference in stroke between empagliflozin and placebo in the modified intent-to-treat analysis was primarily because of 18 patients in the empagliflozin group with a first event >90 days after last intake of study drug (versus 3 on placebo). In a sensitivity analysis based on events during treatment or ≤90 days after last dose of drug, the hazard ratio for stroke with empagliflozin versus placebo was 1.08 (95% confidence interval, 0.81-1.45; P =0.60). There were no differences in risk of recurrent, fatal, or disabling strokes, or transient ischemic attack, with empagliflozin versus placebo. Patients with the largest increases in hematocrit or largest decreases in systolic blood pressure did not have an increased risk of stroke. In patients with type 2 diabetes mellitus and high cardiovascular risk, there was no significant difference in the risk of cerebrovascular events with empagliflozin versus placebo. URL: http://www.clinicaltrials.gov. Unique identifier: NCT01131676. © 2017 The Authors.
Wu, Jia-Rong; Song, Eun Kyeung; Moser, Debra K; Lennie, Terry A
2018-04-01
Heart failure is a chronic, burdensome condition with higher re-hospitalization rates in African Americans than in Whites. Higher dietary antioxidant intake is associated with lower oxidative stress and improved endothelial function. Lower dietary antioxidant intake in African Americans may play a role in the re-hospitalization disparity between African American and White patients with heart failure. The objective of this study was to examine the associations among race, dietary antioxidant intake, and cardiac event-free survival in patients with heart failure. In a secondary analysis of 247 patients with heart failure who completed a four-day food diary, intake of alpha-carotene, beta-carotene, beta-cryptoxanthin, lutein, zeaxanthin, lycopene, vitamins C and E, zinc, and selenium was assessed. Antioxidant deficiency was defined as intake below the estimated average requirement for antioxidants with an established estimated average requirement, or lower than the sample median for antioxidants without one. Patients were followed for a median of one year to determine time to first cardiac event (hospitalization or death). Survival analysis was used for data analysis. African American patients had more dietary antioxidant deficiencies and shorter cardiac event-free survival compared with Whites (p = .007 and p = .028, respectively). In Cox regression, race and antioxidant deficiency were associated with cardiac event-free survival before and after adjusting for covariates. African Americans with heart failure had more dietary antioxidant deficiencies and shorter cardiac event-free survival than Whites. This suggests that encouraging African American patients with heart failure to consume an antioxidant-rich diet may help lengthen cardiac event-free survival.
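A minimal sketch of the kind of Cox proportional hazards model described above, using the lifelines package; the column names and data below are hypothetical placeholders, not the study's dataset.

```python
# Cox regression of time-to-first-cardiac-event on race and antioxidant
# deficiency count; event = 1 means hospitalization or death was observed,
# event = 0 means the patient was censored at end of follow-up.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "time_to_event_days":       [120, 365, 90, 365, 200, 365, 300, 365],
    "event":                    [1,   0,   1,  0,   1,   0,   1,   0],
    "african_american":         [1,   0,   1,  0,   0,   1,   1,   0],
    "antioxidant_deficiencies": [5,   1,   6,  2,   3,   4,   5,   2],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_to_event_days", event_col="event")
cph.print_summary()   # hazard ratios for race and antioxidant deficiency
```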
Leitão, Cristiane B; Gross, Jorge L
2017-01-01
Objective To evaluate the efficacy of coronary artery disease screening in asymptomatic patients with type 2 diabetes and assess the statistical reliability of the findings. Methods Electronic databases (MEDLINE, EMBASE, Cochrane Library and clinicaltrials.org) were reviewed up to July 2016. Randomised controlled trials evaluating coronary artery disease screening in asymptomatic patients with type 2 diabetes and reporting cardiovascular events and/or mortality were included. Data were summarised with Mantel-Haenszel relative risks. Trial sequential analysis (TSA) was used to evaluate the optimal sample size to detect a 40% reduction in outcomes. Main outcomes were all-cause mortality and cardiac events (non-fatal myocardial infarction and cardiovascular death); secondary outcomes were non-fatal myocardial infarction, myocardial revascularisations and heart failure. Results One hundred and thirty-five references were identified; 5 studies fulfilled the inclusion criteria, totalling 3315 patients, 117 all-cause deaths and 100 cardiac events. Screening for coronary artery disease was not associated with a decreased risk of all-cause death (RR 0.95 (95% CI 0.66 to 1.35)) or cardiac events (RR 0.72 (95% CI 0.49 to 1.06)). TSA shows that futility boundaries were reached for all-cause mortality, so a relative risk reduction of 40% between treatments can be discarded. However, there is not enough information for firm conclusions on cardiac events. For secondary outcomes, no benefit or harm was identified; optimal sample sizes were not reached. Conclusion Currently available data do not support screening for coronary artery disease in patients with type 2 diabetes to prevent fatal events. Further studies are needed to assess the effects on cardiac events. PROSPERO CRD42015026627. PMID:28490559
Revealing the underlying drivers of disaster risk: a global analysis
NASA Astrophysics Data System (ADS)
Peduzzi, Pascal
2017-04-01
Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components, such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used to extract population and economic exposures based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using a multiple regression analysis on these individual events, it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty, governance as well as other underlying factors (e.g. remoteness) can explain the magnitude of past disasters. Analysis was also performed to highlight the role of future trends in population and climate change and how these may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk taking into account its various components. The same methodology can be applied to various types of risk at local to global scales. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied to mortality risk in GAR 2009 and GAR 2011. New models ranging from global asset exposure to global flood hazard were also recently developed to improve the resolution of the risk analysis and applied through the CAPRA software to provide probabilistic economic risk assessments such as Average Annual Losses (AAL) and Probable Maximum Losses (PML) in GAR 2013 and GAR 2015. In parallel, similar methodologies were developed to highlight the role of ecosystems in Climate Change Adaptation (CCA) and Disaster Risk Reduction (DRR). New developments may include slow hazards (e.g. soil degradation and droughts) and natech hazards (by intersecting with georeferenced critical infrastructures). The various global hazard, exposure and risk models can be visualized and downloaded through the PREVIEW Global Risk Data Platform.
Adverse childhood events, substance abuse, and measures of affiliation.
Zlotnick, Cheryl; Tam, Tammy; Robertson, Marjorie J
2004-08-01
Adverse childhood events may influence later behaviors, including adulthood substance use and social affiliation. Studies have noted high prevalence rates of adverse childhood experiences and adulthood substance abuse among homeless adults. Using an existing longitudinal, countywide probability sample of 397 homeless adults, we examine the relationship between adverse childhood events and adulthood substance use, and the relationship of these variables to affiliation. Almost 75% of the sample had experienced an adverse childhood event. Path analysis indicated that adulthood substance abuse mediated the inverse relationship between adverse childhood events and two measures of adulthood affiliation. Thus, although there is a relationship between adverse childhood events and adulthood substance use, it is adulthood substance use that determines most aspects of affiliation.
75 FR 47212 - Special Local Regulation for Marine Events; Elizabeth River, Portsmouth, VA
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-05
... includes but is not limited to sail boat regattas, boat parades, power boat racing, swimming events, crew racing, and sail board racing. An environmental analysis checklist and a categorical exclusion...
ERIC Educational Resources Information Center
Viard, Armelle; Desgranges, Beatrice; Eustache, Francis; Piolino, Pascale
2012-01-01
Remembering the past and envisioning the future are at the core of one's sense of identity. Neuroimaging studies investigating the neural substrates underlying past and future episodic events have been growing in number. However, the experimental paradigms used to select and elicit episodic events vary greatly, leading to disparate results,…
ERIC Educational Resources Information Center
Pansters, Wil G.; van Rinsum, Henk J.
2016-01-01
On the basis of ethnographic and historical material this article makes a comparative analysis of the relationship between public events, ceremonies and academic rituals, institutional identity, and processes of transition and power at two universities, one in Mexico and the other in South Africa. The public events examined here play a major role…
Matoza, Robin S.; Shearer, Peter M.; Okubo, Paul G.
2016-01-01
Long-period (0.5–5 Hz, LP) seismicity has been recorded for decades in the summit region of Kı̄lauea Volcano, Hawai‘i, and is postulated to be linked with the magma transport and shallow hydrothermal systems. To better characterize its spatiotemporal occurrence, we perform a systematic analysis of 49,030 seismic events occurring in the Kı̄lauea summit region from January 1986 to March 2009 recorded by the ∼50-station Hawaiian Volcano Observatory permanent network. We estimate 215,437 P-wave spectra, considering all events on all stations, and use a station-averaged spectral metric to consistently classify LP and non-LP seismicity. We compute high-precision relative relocations for 5327 LP events (43% of all classified LP events) using waveform cross-correlation and cluster analysis with 6.4 million event pairs, combined with the source-specific station term method. The majority of intermediate-depth (5–15 km) LPs collapse to a compact volume, with remarkable source location stability over 23 years, indicating a source process controlled by geological or conduit structure.
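The station-averaged spectral classification step can be illustrated with a small sketch: compute P-wave spectral power in the LP band (0.5-5 Hz) versus a higher band for each station, average the log ratio across stations, and threshold it. The synthetic waveforms and the threshold below are illustrative assumptions, not the metric used in the cited study.

```python
# Classify an event as LP or non-LP from a station-averaged spectral ratio.
import numpy as np

fs = 100.0                                   # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
# one synthetic "LP-like" waveform per station: 2 Hz dominant tone plus noise
waveforms = [np.sin(2 * np.pi * 2.0 * t) + 0.2 * rng.normal(size=t.size)
             for _ in range(5)]

def band_power(x, fs, fmin, fmax):
    f = np.fft.rfftfreq(x.size, 1 / fs)
    p = np.abs(np.fft.rfft(x)) ** 2
    return p[(f >= fmin) & (f < fmax)].sum()

ratios = [band_power(x, fs, 0.5, 5.0) / band_power(x, fs, 5.0, 15.0)
          for x in waveforms]
metric = np.mean(np.log10(ratios))           # station-averaged spectral metric
print("LP" if metric > 1.0 else "non-LP", f"(metric = {metric:.2f})")
```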
Pseudo and conditional score approach to joint analysis of current count and current status data.
Wen, Chi-Chung; Chen, Yi-Hau
2018-04-17
We develop a joint analysis approach for recurrent and nonrecurrent event processes subject to case I interval censorship, which are known in the literature as current count and current status data, respectively. We use a shared frailty to link the recurrent and nonrecurrent event processes, while leaving the distribution of the frailty fully unspecified. Conditional on the frailty, the recurrent event is assumed to follow a nonhomogeneous Poisson process, and the mean function of the recurrent event and the survival function of the nonrecurrent event are assumed to follow some general form of semiparametric transformation models. Estimation of the models is based on the pseudo-likelihood and conditional score techniques. The resulting estimators for the regression parameters and the unspecified baseline functions are shown to be consistent with rates of the square and cubic roots of the sample size, respectively. Asymptotic normality with a closed-form asymptotic variance is derived for the estimator of the regression parameters. We apply the proposed method to fracture-osteoporosis survey data to identify risk factors jointly for fracture and osteoporosis in the elderly, while accounting for the association between the two events within a subject. © 2018, The International Biometric Society.
Vasylkivska, Veronika S.; Huerta, Nicolas J.
2017-06-24
Determining the spatiotemporal characteristics of natural and induced seismic events holds the opportunity to gain new insights into why these events occur. Linking the seismicity characteristics with other geologic, geographic, natural, or anthropogenic factors could help to identify the causes and suggest mitigation strategies that reduce the risk associated with such events. The nearest-neighbor approach utilized in this work represents a practical first step toward identifying statistically correlated clusters of recorded earthquake events. Detailed study of the Oklahoma earthquake catalog’s inherent errors, empirical model parameters, and model assumptions is presented. We found that the cluster analysis results are stable with respect to empirical parameters (e.g., fractal dimension) but were sensitive to epicenter location errors and seismicity rates. Most critically, we show that the patterns in the distribution of earthquake clusters in Oklahoma are primarily defined by spatial relationships between events. This observation is a stark contrast to California (also known for induced seismicity) where a comparable cluster distribution is defined by both spatial and temporal interactions between events. These results highlight the difficulty in understanding the mechanisms and behavior of induced seismicity but provide insights for future work.
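The nearest-neighbor idea can be sketched as follows: for each event, find the earlier event minimizing a Zaliapin-type space-time proximity, eta = dt * dr^d_f * 10^(-b*m_parent), where d_f is the fractal dimension of epicenters and b the Gutenberg-Richter b-value. The toy catalog and parameter values below are hypothetical, not Oklahoma data.

```python
# Link each earthquake to its "nearest neighbor" (most likely parent) in
# space-time-magnitude, the first step of cluster identification.
import numpy as np

# columns: time (days), x (km), y (km), magnitude
catalog = np.array([
    [0.0,   0.0,  0.0, 3.0],
    [1.0,   0.5,  0.2, 2.1],
    [1.2,  40.0, 10.0, 2.5],
    [2.5,   0.6,  0.1, 2.0],
])
d_f, b = 1.6, 1.0     # fractal dimension of epicenters, Gutenberg-Richter b-value

for j in range(1, len(catalog)):
    t_j, x_j, y_j, _ = catalog[j]
    etas = []
    for i in range(j):
        t_i, x_i, y_i, m_i = catalog[i]
        dt = t_j - t_i
        dr = max(np.hypot(x_j - x_i, y_j - y_i), 1e-3)   # avoid zero distance
        etas.append(dt * dr**d_f * 10**(-b * m_i))
    parent = int(np.argmin(etas))
    print(f"event {j}: nearest neighbor = event {parent}, eta = {min(etas):.3g}")
```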
Held, Philip; Klassen, Brian J; Hall, Joanne M; Friese, Tanya R; Bertsch-Gout, Marcel M; Zalta, Alyson K; Pollack, Mark H
2018-05-03
Moral injury is a nascent construct intended to capture reactions to events that violate deeply held beliefs and moral values. Although a model of moral injury has been proposed, many of the theoretical propositions of this model have yet to be systematically studied. We conducted semistructured interviews with eight veterans who reported experiencing morally injurious events during war zone deployments. Using narrative thematic analysis, five main themes and associated subthemes emerged from the data. The main themes capture the timing of the event, contextual factors that affected the decision-making process during the morally injurious event, reactions to the moral injurious event, search for purpose and meaning, and opening up. The findings from the present study supported an existing model of moral injury, while extending it in several important ways. Preliminary clinical recommendations and directions for future research are discussed based on the study findings. These include directly exploring the context surrounding the morally injurious event, examining the veterans' moral appraisals, and helping them assume appropriate responsibility for their actions to reduce excessive self-blame. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Parallel Event Analysis Under Unix
NASA Astrophysics Data System (ADS)
Looney, S.; Nilsson, B. S.; Oest, T.; Pettersson, T.; Ranjard, F.; Thibonnier, J.-P.
The ALEPH experiment at LEP, the CERN CN division and Digital Equipment Corp. have, in a joint project, developed a parallel event analysis system. The parallel physics code is identical to ALEPH's standard analysis code, ALPHA, only the organisation of input/output is changed. The user may switch between sequential and parallel processing by simply changing one input "card". The initial implementation runs on an 8-node DEC 3000/400 farm, using the PVM software, and exhibits a near-perfect speed-up linearity, reducing the turn-around time by a factor of 8.
Applying AI tools to operational space environmental analysis
NASA Technical Reports Server (NTRS)
Krajnak, Mike; Jesse, Lisa; Mucks, John
1995-01-01
The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges in meeting the needs of their growing user community. These centers provide current space environmental information and short-term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general-purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and to predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTMs) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTMs using a graphical specification language. The ability to manipulate TTMs in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain-specific event analysis abstractions. The prototype system defines events covering reports of natural phenomena such as solar flares, bursts, geomagnetic storms, and five others pertinent to space environmental analysis. With our preliminary event definitions we experimented with TAS's support for temporal pattern analysis using X-ray flare and geomagnetic storm forecasts as case studies. We are currently working on a framework for integrating advanced graphics and space environmental models into this analytical environment.
Application of artificial neural network to fMRI regression analysis.
Misaki, Masaya; Miyauchi, Satoru
2006-01-15
We used an artificial neural network (ANN) to detect correlations between event sequences and fMRI (functional magnetic resonance imaging) signals. The layered feed-forward neural network, given a series of events as inputs and the fMRI signal as a supervised signal, performed a non-linear regression analysis. This type of ANN is capable of approximating any continuous function, and thus this analysis method can detect any fMRI signals that correlated with corresponding events. Because of the flexible nature of ANNs, fitting to autocorrelation noise is a problem in fMRI analyses. We avoided this problem by using cross-validation and an early stopping procedure. The results showed that the ANN could detect various responses with different time courses. The simulation analysis also indicated an additional advantage of ANN over non-parametric methods in detecting parametrically modulated responses, i.e., it can detect various types of parametric modulations without a priori assumptions. The ANN regression analysis is therefore beneficial for exploratory fMRI analyses in detecting continuous changes in responses modulated by changes in input values.
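A minimal sketch of the approach, assuming a scikit-learn stand-in for the authors' network: a feed-forward net regresses a synthetic fMRI-like signal on lagged event regressors, with validation-based early stopping to limit fitting of autocorrelated noise. The response kernel and data are illustrative placeholders.

```python
# Non-linear regression of an fMRI-like signal on an event sequence with a
# feed-forward network and early stopping (validation split held out).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
n = 600
events = (rng.random(n) < 0.05).astype(float)          # sparse event sequence
hrf = np.exp(-np.arange(20) / 4.0)                     # crude response kernel
signal = np.convolve(events, hrf)[:n] + 0.1 * rng.normal(size=n)

# lagged event regressors as inputs so the net can learn the response shape
lags = 20
X = np.column_stack([np.roll(events, k) for k in range(lags)])
X[:lags] = 0.0                                         # remove wrap-around rows

model = MLPRegressor(hidden_layer_sizes=(16,), early_stopping=True,
                     validation_fraction=0.2, max_iter=2000, random_state=0)
model.fit(X, signal)
print(f"R^2 on training data: {model.score(X, signal):.3f}")
```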
The ARGO Project: assessing NA-TECH risks on off-shore oil platforms
NASA Astrophysics Data System (ADS)
Capuano, Paolo; Basco, Anna; Di Ruocco, Angela; Esposito, Simona; Fusco, Giannetta; Garcia-Aristizabal, Alexander; Mercogliano, Paola; Salzano, Ernesto; Solaro, Giuseppe; Teofilo, Gianvito; Scandone, Paolo; Gasparini, Paolo
2017-04-01
ARGO (Analysis of natural and anthropogenic risks on off-shore oil platforms) is a two-year project funded by the DGS-UNMIG (Directorate General for Safety of Mining and Energy Activities - National Mining Office for Hydrocarbons and Georesources) of the Italian Ministry of Economic Development. The project, coordinated by AMRA (Center for the Analysis and Monitoring of Environmental Risk), aims at providing technical support for the analysis of natural and anthropogenic risks on offshore oil platforms. In order to achieve this challenging objective, ARGO brings together climate experts, risk management experts, seismologists, geologists, chemical engineers, and earth and coastal observation experts. ARGO has developed methodologies for the probabilistic analysis of industrial accidents triggered by natural events (NA-TECH) on offshore oil platforms in the Italian seas, including extreme events related to climate change. Furthermore, the environmental effects of offshore activities have been investigated, including changes in seismicity and in the evolution of coastal areas close to offshore platforms. A probabilistic multi-risk framework has then been developed for the analysis of NA-TECH events on offshore installations for hydrocarbon extraction.
Defining Human Failure Events for Petroleum Risk Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; Knut Øien
2014-06-01
In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.
A Civilian/Military Trauma Institute: National Trauma Coordinating Center
2015-12-01
… zip codes were used in a “proximity to violence” analysis. Data were analyzed using SPSS (version 20.0, SPSS Inc., Chicago, IL). Multivariable linear … number of adverse events and serious events was not statistically higher in one group; the incidence of deep venous thrombosis (DVT) was statistically … subjects; the lack of statistical difference on multivariate analysis may be related to an underpowered sample size. It was recommended that the …
NASA Astrophysics Data System (ADS)
Haneda, Kiyofumi; Kajima, Toshio; Koyama, Tadashi; Muranaka, Hiroyuki; Dojo, Hirofumi; Aratani, Yasuhiko
2002-05-01
The aim of our study is to analyze the level of security required, to search for suitable security measures, and to optimize the distribution of security across every portion of medical practice. Where possible, we introduce quantitative measures to enable simplified follow-up security procedures and easy evaluation of security outcomes or results. Using fault tree analysis (FTA), system analysis showed that subdividing system elements into detailed groups results in a much more accurate analysis. Such subdivided composition factors depend greatly on the behavior of staff, interactive terminal devices, kinds of services provided, and network routes. Security measures were then implemented based on the analysis results. In conclusion, we identified the methods needed to determine the required level of security, proposed security measures for each medical information system, and identified the basic events and combinations of events that comprise the threat composition factors. Methods for identifying suitable security measures were found and implemented. Risk factors for each basic event, the number of elements for each composition factor, and potential security measures were found. Methods to optimize the security measures for each medical information system were proposed, yielding the most efficient distribution of risk factors for basic events.
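The FTA quantification itself is simple arithmetic once the tree is fixed: basic-event probabilities are combined through AND/OR gates under an independence assumption. The gate structure, event names, and probabilities below are hypothetical, not those of the cited study.

```python
# Minimal fault-tree quantification: top-event probability from basic events.
def p_and(*ps):                 # AND gate: all basic events must occur
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):                  # OR gate: at least one event occurs
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# hypothetical basic-event probabilities
p_weak_password   = 0.05
p_terminal_unlock = 0.02
p_network_sniff   = 0.01
p_staff_error     = 0.03

# top event: unauthorized access to the medical information system
p_top = p_or(p_and(p_weak_password, p_terminal_unlock),
             p_network_sniff,
             p_staff_error)
print(f"P(top event) = {p_top:.4f}")
```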
An Overview of the Runtime Verification Tool Java PathExplorer
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)
2002-01-01
We present an overview of the Java PathExplorer runtime verification tool, in short referred to as JPAX. JPAX can monitor the execution of a Java program and check that it conforms with a set of user provided properties formulated in temporal logic. JPAX can in addition analyze the program for concurrency errors such as deadlocks and data races. The concurrency analysis requires no user provided specification. The tool facilitates automated instrumentation of a program's bytecode, which when executed will emit an event stream, the execution trace, to an observer. The observer dispatches the incoming event stream to a set of observer processes, each performing a specialized analysis, such as the temporal logic verification, the deadlock analysis and the data race analysis. Temporal logic specifications can be formulated by the user in the Maude rewriting logic, where Maude is a high-speed rewriting system for equational logic, but here extended with executable temporal logic. The Maude rewriting engine is then activated as an event driven monitoring process. Alternatively, temporal specifications can be translated into efficient automata, which check the event stream. JPAX can be used during program testing to gain increased information about program executions, and can potentially furthermore be applied during operation to survey safety critical systems.
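The observer pattern described above can be sketched in a few lines: a monitor consumes the emitted event stream and checks a temporal property over it. The property and event vocabulary below are hypothetical, and the sketch is far simpler than JPAX's Maude-based logic engine.

```python
# Tiny event-stream monitor for the property "every acquired lock is released
# before shutdown" (a stand-in for a temporal-logic specification).
def monitor(trace):
    held = set()
    for i, (event, lock) in enumerate(trace):
        if event == "acquire":
            held.add(lock)
        elif event == "release":
            held.discard(lock)
        elif event == "shutdown" and held:
            return f"violation at event {i}: locks still held: {sorted(held)}"
    return "property satisfied"

trace = [("acquire", "L1"), ("acquire", "L2"), ("release", "L1"), ("shutdown", "")]
print(monitor(trace))   # -> violation: L2 was never released
```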
Malicki, Julian; Bly, Ritva; Bulot, Mireille; Godet, Jean-Luc; Jahnen, Andreas; Krengli, Marco; Maingon, Philippe; Prieto Martin, Carlos; Przybylska, Kamila; Skrobała, Agnieszka; Valero, Marc; Jarvinen, Hannu
2017-04-01
To describe the current status of implementation of European directives for risk management in radiotherapy and to assess variability in risk management in the following areas: (1) in-country regulatory framework; (2) proactive risk assessment; (3) reactive analysis of events; and (4) reporting and learning systems. The original data were collected as part of the ACCIRAD project through two online surveys. Risk assessment criteria are closely associated with quality assurance programs. Only 9/32 responding countries (28%) with national regulations reported clear "requirements" for proactive risk assessment and/or reactive risk analysis, with wide variability in assessment methods. Reporting of adverse error events is mandatory in most (70%) but not all surveyed countries. Most European countries have taken steps to implement European directives designed to reduce the probability and magnitude of accidents in radiotherapy. Variability between countries is substantial in terms of legal frameworks, tools used to conduct proactive risk assessment and reactive analysis of events, and in the reporting and learning systems utilized. These findings underscore the need for greater harmonisation in common terminology, classification and reporting practices across Europe to improve patient safety and to enable more reliable inter-country comparisons. Copyright © 2017 Elsevier B.V. All rights reserved.
Pinti, Paola; Merla, Arcangelo; Aichelburg, Clarisse; Lind, Frida; Power, Sarah; Swingler, Elizabeth; Hamilton, Antonia; Gilbert, Sam; Burgess, Paul W; Tachtsidis, Ilias
2017-07-15
Recent technological advances have allowed the development of portable functional near-infrared spectroscopy (fNIRS) devices that can be used to perform neuroimaging in the real world. However, as real-world experiments are designed to mimic everyday life situations, the identification of event onsets can be extremely challenging and time-consuming. Here, we present a novel analysis method based on general linear model (GLM) least-squares fitting for the Automatic IDentification of functional Events (AIDE) directly from real-world fNIRS neuroimaging data. In order to investigate the accuracy and feasibility of this method, as a proof of principle we applied the algorithm to (i) synthetic fNIRS data simulating block-, event-related and mixed-design experiments and (ii) experimental fNIRS data recorded during a conventional lab-based task (involving maths). AIDE was able to recover functional events from simulated fNIRS data with an accuracy of 89%, 97% and 91% for the simulated block-, event-related and mixed-design experiments, respectively. For the lab-based experiment, AIDE recovered more than 66.7% of the functional events from the experimentally measured fNIRS data. To illustrate the strength of this method, we then applied AIDE to fNIRS data recorded by a wearable system on one participant during a complex real-world prospective memory experiment conducted outside the lab. As part of the experiment, there were four and six events (actions where participants had to interact with a target) for the two conditions, respectively (condition 1: social, interact with a person; condition 2: non-social, interact with an object). AIDE recovered 3/4 events and 3/6 events for conditions 1 and 2, respectively. The identified functional events were then matched to behavioural data from video recordings of the participant's movements and actions. Our results suggest that "brain-first" rather than "behaviour-first" analysis is possible, and that the present method can provide a novel solution for analysing real-world fNIRS data, filling the gap between real-life testing and functional neuroimaging. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
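The core idea of recovering events directly from the signal can be sketched as a sliding GLM fit: slide a candidate onset across the recording, fit a haemodynamic-response regressor by least squares, and keep well-separated onsets with the best fits. The signal, response kernel, and selection rule below are illustrative assumptions, not the published AIDE algorithm.

```python
# "Brain-first" event recovery from a 1-D signal via a sliding GLM fit.
import numpy as np

rng = np.random.default_rng(3)
n = 300                                            # samples at 1 Hz
true_onsets = [60, 180]
hrf = (np.arange(30) / 6.0) ** 2 * np.exp(-np.arange(30) / 6.0)  # crude HRF shape

signal = np.zeros(n)
for onset in true_onsets:
    signal[onset:onset + 30] += hrf
signal += 0.2 * rng.normal(size=n)

# score every candidate onset by the residual of a two-column GLM (HRF + intercept)
scores = np.full(n - 30, -np.inf)
for onset in range(n - 30):
    x = np.zeros(n)
    x[onset:onset + 30] = hrf
    X = np.column_stack([x, np.ones(n)])
    _, res, *_ = np.linalg.lstsq(X, signal, rcond=None)
    scores[onset] = -res[0]                        # higher score = better fit

# greedily keep the best-scoring onsets, enforcing a minimum separation
recovered = []
for idx in np.argsort(scores)[::-1]:
    if all(abs(idx - r) > 30 for r in recovered):
        recovered.append(int(idx))
    if len(recovered) == len(true_onsets):
        break
print("true onsets:", true_onsets, "recovered near:", sorted(recovered))
```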
Effect of Aspirin Coadministration on the Safety of Celecoxib, Naproxen, or Ibuprofen.
Reed, Grant W; Abdallah, Mouin S; Shao, Mingyuan; Wolski, Kathy; Wisniewski, Lisa; Yeomans, Neville; Lüscher, Thomas F; Borer, Jeffrey S; Graham, David Y; Husni, M Elaine; Solomon, Daniel H; Libby, Peter; Menon, Venu; Lincoff, A Michael; Nissen, Steven E
2018-04-24
The safety of nonsteroidal anti-inflammatory drug (NSAID) and aspirin coadministration is uncertain. The aim of this study was to compare the safety of combining NSAIDs with low-dose aspirin. This analysis of the PRECISION (Prospective Randomized Evaluation of Celecoxib Integrated Safety Versus Ibuprofen or Naproxen) trial included 23,953 patients with osteoarthritis or rheumatoid arthritis at increased cardiovascular risk randomized to celecoxib, ibuprofen, or naproxen. The on-treatment population was used for this study. Outcomes included composite major adverse cardiovascular events, noncardiovascular death, gastrointestinal or renal events, and components of the composite. Cox proportional hazards models compared outcomes among NSAIDs stratified by aspirin use following propensity score adjustment. Kaplan-Meier analysis was used to compare the cumulative probability of events. When taken without aspirin, naproxen or ibuprofen had greater risk for the primary composite endpoint compared with celecoxib (hazard ratio [HR]: 1.52; 95% confidence interval [CI]: 1.22 to 1.90, p <0.001; and HR: 1.81; 95% CI: 1.46 to 2.26; p <0.001, respectively). Compared with celecoxib, ibuprofen had more major adverse cardiovascular events (p < 0.05), and both ibuprofen and naproxen had more gastrointestinal (p < 0.001) and renal (p < 0.05) events. Taken with aspirin, ibuprofen had greater risk for the primary composite endpoint compared with celecoxib (HR: 1.27; 95% CI: 1.06 to 1.51; p < 0.01); this was not significantly higher with naproxen (HR: 1.18; 95% CI: 0.98 to 1.41; p = 0.08). Among patients on aspirin, major adverse cardiovascular events were similar among NSAIDs, and compared with celecoxib, ibuprofen had more gastrointestinal and renal events (p < 0.05), while naproxen had more gastrointestinal events (p < 0.05), without a difference in renal events. Similar results were seen on adjusted Kaplan-Meier analysis. Celecoxib has a more favorable overall safety profile than naproxen or ibuprofen when taken without aspirin. Adding aspirin attenuates the safety advantage of celecoxib, although celecoxib is still associated with fewer gastrointestinal events than ibuprofen or naproxen and fewer renal events than ibuprofen. (Prospective Randomized Evaluation of Celecoxib Integrated Safety vs Ibuprofen or Naproxen [PRECISION]; NCT00346216). Copyright © 2018 American College of Cardiology Foundation. All rights reserved.
Maximum Likelihood Analysis in the PEN Experiment
NASA Astrophysics Data System (ADS)
Lehman, Martin
2013-10-01
The experimental determination of the π⁺ → e⁺ν(γ) decay branching ratio currently provides the most accurate test of lepton universality. The PEN experiment at PSI, Switzerland, aims to improve the present world-average experimental precision of 3.3×10⁻³ to 5×10⁻⁴ using a stopped-beam approach. During runs in 2008-10, PEN acquired over 2×10⁷ πe2 events. The experiment includes active beam detectors (degrader, mini TPC, target), central MWPC tracking with plastic scintillator hodoscopes, and a spherical pure CsI electromagnetic shower calorimeter. The final branching ratio will be calculated using a maximum likelihood analysis. This analysis assigns each event a probability for 5 processes (π⁺ → e⁺ν, π⁺ → μ⁺ν, decay-in-flight, pile-up, and hadronic events) using Monte Carlo verified probability distribution functions of our observables (energies, times, etc.). A progress report on the PEN maximum likelihood analysis will be presented. Work supported by NSF grant PHY-0970013.
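The event-by-event likelihood construction can be sketched with one observable and two processes: each event's observable is assigned a density under each process, and the process fractions are fit by maximizing the summed log-likelihood. The PDFs and synthetic data below are illustrative, not the PEN analysis.

```python
# Fit a signal fraction by maximizing sum_i log( f*p_sig(x_i) + (1-f)*p_bkg(x_i) ).
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(4)
# synthetic "positron energy" sample: 70% signal-like, 30% background-like
x = np.concatenate([rng.normal(70, 3, 700), rng.normal(50, 10, 300)])

p_sig = norm(70, 3).pdf          # assumed signal PDF of the observable
p_bkg = norm(50, 10).pdf         # assumed background PDF

def nll(f_sig):                  # negative log-likelihood of the mixture
    return -np.sum(np.log(f_sig * p_sig(x) + (1 - f_sig) * p_bkg(x)))

res = minimize_scalar(nll, bounds=(1e-6, 1 - 1e-6), method="bounded")
print(f"fitted signal fraction: {res.x:.3f}")
```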
Improved WIMP-search reach of the CDMS II germanium data
Agnese, R.
2015-10-12
CDMS II data from the five-tower runs at the Soudan Underground Laboratory were reprocessed with an improved charge-pulse fitting algorithm. Two new analysis techniques to reject surface-event backgrounds were applied to the 612 kg-days germanium-detector weakly interacting massive particle (WIMP)-search exposure. An extended analysis was also completed by decreasing the 10 keV analysis threshold to ~5 keV, to increase sensitivity near a WIMP mass of 8 GeV/c². After unblinding, there were zero candidate events above a deposited energy of 10 keV and six events in the lower-threshold analysis. This yielded minimum WIMP-nucleon spin-independent scattering cross-section limits of 1.8×10⁻⁴⁴ and 1.18×10⁻⁴¹ cm² at 90% confidence for 60 and 8.6 GeV/c² WIMPs, respectively. This improves the previous CDMS II result by a factor of 2.4 (2.7) for 60 (8.6) GeV/c² WIMPs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, A.; Santoso, S.; Muljadi, E.
2013-08-01
A network of multiple phasor measurement units (PMUs) was created, set up, and maintained at the University of Texas at Austin to obtain actual power system measurements for power system analysis. Power system analysis in this report covers a variety of time ranges, such as short-term analysis of power system disturbances and their effects on power system behavior, and long-term power system behavior using modal analysis. The first objective of this report is to screen the PMU data for events. The second objective is to identify and describe common characteristics extracted from power system events as measured by PMUs. The numerical characteristics for each category and how these characteristics are used to create selection rules for the algorithm are also described. Trends in PMU data related to different levels and fluctuations in wind power output are also examined.
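A minimal sketch of the first objective, screening PMU data for events: flag samples whose frequency deviates from a rolling-median baseline by more than a threshold. The synthetic 30 frames/s data and the thresholds are illustrative, not the report's selection rules.

```python
# Screen a PMU frequency stream for events via deviation from a rolling median.
import numpy as np
import pandas as pd

fs = 30                                      # PMU reporting rate (frames per second)
t = np.arange(0, 120, 1 / fs)                # two minutes of data
rng = np.random.default_rng(5)
freq = 60.0 + 0.002 * rng.normal(size=t.size)
freq[1800:1950] -= 0.08                      # injected dip, e.g., a generation-loss event

s = pd.Series(freq)
baseline = s.rolling(window=30 * fs, center=True, min_periods=1).median()
event_mask = (s - baseline).abs() > 0.05     # Hz deviation threshold

if event_mask.any():
    idx = np.flatnonzero(event_mask.to_numpy())
    print(f"event flagged from t = {t[idx[0]]:.1f} s to t = {t[idx[-1]]:.1f} s")
else:
    print("no events flagged")
```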
Thomas, Roger E; Lorenzetti, Diane L; Spragins, Wendy; Jackson, Dave; Williamson, Tyler
2011-07-01
To assess the reporting rates of serious adverse events attributable to yellow fever vaccination with 17D and 17DD strains as reported in pharmacovigilance databases, and to assess reasons for differences in reporting rates, we searched 9 electronic databases for peer-reviewed and grey literature (government reports, conferences) in all languages. Reference lists of key studies were also reviewed to identify additional studies. We identified 2,415 abstracts, of which 472 were selected for full-text review. We identified 15 pharmacovigilance databases that reported adverse events attributed to yellow fever vaccination, of which 10 contributed data to this review, covering about 107,600,000 patients (allowing for overlapping time periods in the studies of the US VAERS database); the data are very heavily weighted (94%) toward the Brazilian database. The estimates of serious adverse events form three groups. The estimates for Australia were low, at 0/210,656 for "severe neurological disease" and 1/210,656 for YEL-AVD, as were those for Brazil, with 9 hypersensitivity, 0.23 anaphylactic shock, 0.84 neurologic syndrome, and 0.19 viscerotropic events per million doses. The five analyses of partly overlapping periods for the US VAERS database provide estimates of 3.6 cases per million for YEL-AND in one analysis and 7.8 in another, and 3.1 cases per million for YEL-AVD in one analysis and 3.9 in another. The estimates for the UK used only the inclusive term "serious adverse events", not further classified into YEL-AVD or YEL-AND, and reported 34 "serious adverse events." The Swiss database also used the term "serious adverse events" and reported 7 such events (including 4 "neurologic reactions"), for a reporting rate of 25 "serious adverse events" per million doses. Reporting rates for serious adverse events following yellow fever vaccination are low. Differences in reporting rates may be due to differences in definitions, surveillance system organisation, methods of reporting cases, administration of YFV with other vaccines, incomplete information about denominators, time intervals for reporting events, the degree of passive reporting, access to diagnostic resources, and differences in the time periods of reporting.
Seoane-Pillado, María Teresa; Pita-Fernández, Salvador; Valdés-Cañedo, Francisco; Seijo-Bestilleiro, Rocio; Pértega-Díaz, Sonia; Fernández-Rivera, Constantino; Alonso-Hernández, Ángel; González-Martín, Cristina; Balboa-Barreiro, Vanesa
2017-03-07
The high prevalence of cardiovascular risk factors in the renal transplant population accounts for increased mortality. The aim of this study is to determine the incidence of cardiovascular events and the factors associated with these events in such patients. An observational ambispective follow-up study of renal transplant recipients (n = 2029) in the health district of A Coruña (Spain) during the period 1981-2011 was completed. Competing-risks survival analysis methods were applied to estimate the cumulative incidence of developing cardiovascular events over time and to identify which characteristics were associated with the risk of these events. Post-transplant cardiovascular events are defined as the presence of myocardial infarction, invasive coronary artery therapy, cerebral vascular events, new-onset angina, congestive heart failure, rhythm disturbances, peripheral vascular disease and cardiovascular disease and death. The cause of death was identified through the medical history and death certificate using ICD9 (390-459, except: 427.5, 435, 446, 459.0). The mean age of patients at the time of transplantation was 47.0 ± 14.2 years; 62% were male. 16.5% had suffered some cardiovascular disease prior to transplantation and 9.7% had suffered a cardiovascular event. The mean follow-up period for the patients with a cardiovascular event was 3.5 ± 4.3 years. Applying competing-risks methodology, it was observed that the cumulative incidence of the event was 5.0% one year after transplantation, 8.1% after five years, and 11.9% after ten years. After applying multivariate models, the variables with an independent effect for predicting cardiovascular events were: male sex, age of recipient, previous cardiovascular disorders, pre-transplant smoking and post-transplant diabetes. This study makes it possible to determine, taking competing events into account, the incidence of post-transplant cardiovascular events in kidney transplant patients and the risk factors for these events. Modifiable risk factors are identified; changes in these factors could therefore affect the incidence of events.
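A minimal sketch of the competing-risks estimate described above, using the Aalen-Johansen estimator from the lifelines package: the cumulative incidence of a cardiovascular event is computed while treating death from other causes as a competing event. The event coding and follow-up times below are hypothetical placeholders, not the study's data.

```python
# Cumulative incidence of a cardiovascular event with a competing risk.
import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.DataFrame({
    "years": [0.5, 1.2, 2.0, 3.1, 4.0, 5.5, 6.0, 7.2, 8.0, 10.0],
    "event": [1,   0,   2,   1,   0,   1,   2,   0,   1,   0],
    # event codes: 1 = cardiovascular event, 2 = competing death, 0 = censored
})

ajf = AalenJohansenFitter()
ajf.fit(df["years"], df["event"], event_of_interest=1)
print(ajf.cumulative_density_)   # P(CV event by time t), accounting for competing death
```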
Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task One Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Shuai; Makarov, Yuri V.
This is a report for task one of the tail event analysis project for BPA. A tail event refers to the situation in a power system when unfavorable load and wind forecast errors are superposed onto fast load and wind ramps, or non-wind generators fall short of scheduled output, so that the imbalance between generation and load becomes very significant. This type of event occurs infrequently and appears on the tails of the distribution of system power imbalance; it is therefore referred to as a tail event. This report analyzes what happened during the Electric Reliability Council of Texas (ERCOT) reliability event on February 26, 2008, which was widely reported because of the involvement of wind generation. The objective is to identify the sources of the problem, solutions to it, and potential improvements that can be made to the system. Lessons learned from the analysis include the following: (1) a large mismatch between generation and load can be caused by load forecast error, wind forecast error, generation scheduling control error on traditional generators, or a combination of all of the above; (2) the capability of system balancing resources should be evaluated both in capacity (MW) and in ramp rate (MW/min), and be procured accordingly to meet both requirements; the resources need to be able to cover a range corresponding to the variability of load and wind in the system, in addition to other uncertainties; (3) unexpected ramps caused by load and wind can both become the cause of serious issues; (4) a look-ahead tool that evaluates the system balancing requirement during real-time operations and compares it with available system resources should be very helpful to system operators in predicting the approach of similar events and planning ahead; and (5) demand response (only load reduction in the ERCOT event) can effectively reduce load-generation mismatch and terminate frequency deviation in an emergency situation.
Jardine, Meg J; Kang, Amy; Zoungas, Sophia; Navaneethan, Sankar D; Ninomiya, Toshiharu; Nigwekar, Sagar U; Gallagher, Martin P; Cass, Alan; Strippoli, Giovanni; Perkovic, Vlado
2012-06-13
To systematically review the effect of folic acid based homocysteine lowering on cardiovascular outcomes in people with kidney disease. Systematic review and meta-analysis. Medline, Embase, the Cochrane Library, and ClinicalTrials.gov to June 2011. Randomised trials in people with non-dialysis dependent chronic kidney disease or end stage kidney disease or with a functioning kidney transplant reporting at least 100 patient years of follow-up and assessing the effect of folic acid based homocysteine lowering therapy. No language restrictions were applied. Two reviewers independently extracted data on study setting, design, and outcomes using a standardised form. The primary endpoint was cardiovascular events (myocardial infarction, stroke, and cardiovascular mortality, or as defined by study author). Secondary endpoints included the individual composite components, all cause mortality, access thrombosis, requirement for renal replacement therapy, and reported adverse events, including haematological and neurological events. The effect of folic acid based homocysteine lowering on outcomes was assessed with meta-analysis using random effects models. 11 trials were identified that reported on 4389 people with chronic kidney disease, 2452 with end stage kidney disease, and 4110 with functioning kidney transplants (10,951 participants in total). Folic acid based homocysteine therapy did not prevent cardiovascular events (relative risk 0.97, 95% confidence interval 0.92 to 1.03, P = 0.326) or any of the secondary outcomes. There was no evidence of heterogeneity in subgroup analyses, including those of kidney disease category, background fortification, rates of pre-existing disease, or baseline homocysteine level. The definitions of chronic kidney disease varied widely between the studies. Non-cardiovascular events could not be analysed as few studies reported these outcomes. Folic acid based homocysteine lowering does not reduce cardiovascular events in people with kidney disease. Folic acid based regimens should not be used for the prevention of cardiovascular events in people with kidney disease.
Video-tracker trajectory analysis: who meets whom, when and where
NASA Astrophysics Data System (ADS)
Jäger, U.; Willersinn, D.
2010-04-01
Unveiling unusual or hostile events by observing manifold moving persons in a crowd is a challenging task for human operators, especially when sitting in front of monitor walls for hours. Typically, hostile events are rare. Thus, due to tiredness and negligence the operator may miss important events. In such situations, an automatic alarming system is able to support the human operator. The system incorporates a processing chain consisting of (1) people tracking, (2) event detection, (3) data retrieval, and (4) display of the relevant video sequence overlaid with highlighted regions of interest. In this paper we focus on the event detection stage of the processing chain mentioned above. In our case, the selected event of interest is the encounter of people. Although based on a rather simple trajectory analysis, this kind of event has great practical importance because it paves the way to answering the question "who meets whom, when and where". This, in turn, forms the basis for detecting potential situations where, e.g., money, weapons, or drugs are handed over from one person to another in crowded environments like railway stations, airports, or busy streets and squares. The input to the trajectory analysis comes from a multi-object video-based tracking system developed at IOSB which is able to track multiple individuals within a crowd in real time [1]. From this we calculate the inter-distances between all persons on a frame-to-frame basis. We use a sequence of simple rules based on the individuals' kinematics to detect the event mentioned above and to output the frame number, the persons' IDs from the tracker and the pixel coordinates of the meeting position. Using this information, a data retrieval system may extract the corresponding part of the recorded video image sequence and finally allows for replaying the selected video clip with a highlighted region of interest to attract the operator's attention for further visual inspection.
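The inter-distance rule can be sketched directly: compute pairwise distances per frame and report an encounter when two tracked persons stay within a distance threshold for a minimum number of consecutive frames. The synthetic tracks and thresholds below are illustrative assumptions, not the IOSB tracker's output format or rule set.

```python
# Detect "who meets whom, when and where" from per-frame track positions.
import numpy as np

rng = np.random.default_rng(6)
n_frames, n_people = 200, 4
base = np.array([[0.0, 0.0], [5.0, 5.0], [12.0, 0.0], [0.0, 12.0]])  # start positions (m)
walks = np.cumsum(rng.normal(0, 0.02, size=(n_frames, n_people, 2)), axis=0)
tracks = base[None, :, :] + walks
tracks[:, 1] += np.linspace([0.0, 0.0], [-4.8, -4.8], n_frames)      # person 1 approaches person 0

DIST_M, MIN_FRAMES = 1.5, 10
for i in range(n_people):
    for j in range(i + 1, n_people):
        close = np.linalg.norm(tracks[:, i] - tracks[:, j], axis=1) < DIST_M
        run = best = 0
        start = None
        for f, c in enumerate(close):            # longest run of consecutive close frames
            run = run + 1 if c else 0
            if run > best:
                best, start = run, f - run + 1
        if best >= MIN_FRAMES:
            where = tracks[start + best // 2, i]
            print(f"persons {i} and {j} meet from frame {start}, "
                  f"near ({where[0]:.1f}, {where[1]:.1f})")
```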
An analysis of the 2016 Hitomi breakup event
NASA Astrophysics Data System (ADS)
Flegel, Sven; Bennett, James; Lachut, Michael; Möckel, Marek; Smith, Craig
2017-04-01
The breakup of Hitomi (ASTRO-H) on 26 March 2016 is analysed. Debris from the fragmentation is used to estimate the time of the event by propagating the fragments backwards and finding their close approaches with the parent object. Based on this method, the breakup is predicted to have occurred at approximately 01:42 UTC on 26 March 2016. The Gaussian variation of parameters equations, based on the instantaneous orbits at the predicted time of the event, are solved to gain additional insight into the on-orbit position of Hitomi at the time of the event and to test an alternate approach to determining the event epoch and location. A conjunction analysis is carried out between Hitomi and all catalogued objects in orbit around the estimated time of the anomaly. Several debris objects have close approaches with Hitomi; however, there is no evidence that the breakup was caused by a catalogued object. Debris from both of the largest fragmentation events (the Iridium 33-Cosmos 2251 collision in 2009 and the intentional destruction of Fengyun 1C in 2007) is involved in close approaches with Hitomi, indicating the persistent threat these events pose to subsequent space missions. To quantify the magnitude of a potential conjunction, the fragmentation resulting from a collision with the debris is modelled using the EVOLVE-4 breakup model, with debris characteristics estimated from two-line element data. This analysis is indicative of the threat to space assets that mission planners face due to the growing debris population. The impact of the actual event on the environment is investigated based on the debris associated with Hitomi currently contained in the United States Strategic Command's catalogue. A look at the active missions in the orbital vicinity of Hitomi reveals that the Hubble Space Telescope is among the spacecraft that may be immediately affected by the new debris.
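A minimal sketch of the closest-approach timing idea: assuming debris and parent state histories have already been propagated backwards onto a common time grid (the study itself works from two-line element sets and an orbit propagator), the breakup epoch can be taken as a robust combination of per-fragment closest-approach times. The function and array names are illustrative.

```python
import numpy as np

def estimate_breakup_epoch(times, parent_xyz, debris_xyz_list):
    """Estimate a fragmentation epoch as the time at which debris pieces,
    propagated backwards, pass closest to the parent object.

    times: array of epochs (e.g. minutes from a reference epoch)
    parent_xyz: (n, 3) parent positions on that time grid
    debris_xyz_list: list of (n, 3) position arrays, one per debris piece
    Returns the median of the per-piece closest-approach epochs."""
    epochs = []
    for debris_xyz in debris_xyz_list:
        sep = np.linalg.norm(debris_xyz - parent_xyz, axis=1)
        epochs.append(times[np.argmin(sep)])   # closest-approach epoch
    return np.median(epochs)                   # robust combined estimate
```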
NASA Astrophysics Data System (ADS)
Nie, Yong; Liu, Qiao; Wang, Jida; Zhang, Yili; Sheng, Yongwei; Liu, Shiyin
2018-05-01
Glacial lake outburst floods (GLOFs) are a unique type of natural hazard in the cryosphere that may result in catastrophic fatalities and damage. The Himalayas are known as one of the world's most GLOF-vulnerable zones. Effective hazard assessment and risk management require a thorough inventory of historical GLOF events across the Himalayas, which has hitherto been absent. Existing studies imply that numerous historical GLOF events are contentious because of discrepant geographic coordinates, names, or outburst times, requiring further verification. This study reviews and verifies over 60 historical GLOF events across the Himalayas using a comprehensive method that combines literature documentation, archival remote sensing observations, geomorphological analysis, and field investigations. As a result, three unreported GLOF events were discovered from remote sensing images and geomorphological analysis. Eleven suspicious events were identified and suggested for exclusion. The properties of five outburst lakes, i.e., Degaco, Chongbaxia Tsho, Geiqu, Lemthang Tsho, and a lake on Tshojo Glacier, were corrected or updated. A total of 51 GLOF events were verified as convincing, and the outburst lakes were classified into three categories according to their status over the past decades, namely disappeared (12), stable (30), and expanding (9). Statistics of the verified GLOF events show that GLOFs in the Himalayas tended to occur between April and October. We suggest that more attention be paid to rapidly expanding glacial lakes with a high possibility of repeated outbursts. This study also demonstrates the effectiveness of integrating remote sensing and geomorphic interpretation in identifying and verifying GLOF events in remote alpine environments. This inventory of GLOFs, with a range of critical attributes (e.g., locations, times, and mechanisms), will benefit the continuous monitoring and prediction of potentially dangerous glacial lakes and contribute to outburst-induced risk assessment and hazard mitigation.
Atighechian, Golrokh; Maleki, Mohammadreza; Aryankhesal, Aidin; Jahangiri, Katayoun
2016-07-24
Oil spills in fresh water can affect ecological processes and, accordingly, can influence human health. Iran, due to holding 58.8% of the world's oil reserves, is highly vulnerable to water contamination by oil products. The aim of this study was to determine the environmental factors affecting the management of an oil spill into one of the rivers in Iran, using PESTLE analysis. This was a qualitative case study conducted in 2015 on an oil spill incident in Iran and its roots, from a disaster management approach. Semi-structured interviews were conducted for data collection. Seventy managers and staff members responsible for or involved in oil spill incident management were recruited to the study. A qualitative content analysis approach was employed for the data analysis, and document analysis was used to collect additional information. The findings indicated that different factors affected the management of the oil spill into one of the central rivers and, consequently, the management of drinking water resources. Using this analysis, managers can plan for such events and develop scenarios to improve performance in future events.
Urbanization and Fertility: An Event-History Analysis of Coastal Ghana
WHITE, MICHAEL J.; MUHIDIN, SALUT; ANDRZEJEWSKI, CATHERINE; TAGOE, EVA; KNIGHT, RODNEY; REED, HOLLY
2008-01-01
In this article, we undertake an event-history analysis of fertility in Ghana. We exploit detailed life history calendar data to conduct a more refined and definitive analysis of the relationship among personal traits, urban residence, and fertility. Although urbanization is generally associated with lower fertility in developing countries, inferences in most studies have been hampered by a lack of information about the timing of residence in relationship to childbearing. We find that the effect of urbanization itself is strong, evident, and complex, and persists after we control for the effects of age, cohort, union status, and education. Our discrete-time event-history analysis shows that urban women exhibit fertility rates that are, on average, 11% lower than those of rural women, but the effects vary by parity. Differences in urban population traits would augment the effects of urban adaptation itself. Extensions of the analysis point to the operation of a selection effect in rural-to-urban mobility but provide limited evidence for disruption effects. The possibility of further selection of urbanward migrants on unmeasured traits remains. The analysis also demonstrates the utility of an annual life history calendar for collecting such data in the field. PMID:19110898
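A discrete-time event-history analysis of this kind is typically fit as a logistic regression on person-period data. The sketch below assumes a hypothetical person-year file with the named columns; it is not the authors' actual specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical person-year file built from a life history calendar:
# one row per woman per year at risk, birth = 1 if a birth occurred that
# year, urban = 1 if the woman resided in an urban area that year.
df = pd.read_csv("person_years.csv")

# Discrete-time event-history model: logistic regression on person-years.
model = smf.logit(
    "birth ~ urban + age + I(age**2) + C(cohort) + C(union_status) + education",
    data=df,
).fit()

# exp(coef) on 'urban' approximates the urban/rural fertility rate ratio;
# a value near 0.89 would match the ~11% lower urban fertility reported.
print(np.exp(model.params["urban"]))
```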
Sandberg, S; Järvenpää, S; Penttinen, A; Paton, J Y; McCann, D C
2004-12-01
A recent prospective study of children with asthma, employing a within-subject, over-time analysis using dynamic logistic regression, showed that severely negative life events significantly increased the risk of an acute exacerbation during the subsequent 6 week period. The timing of the maximum risk depended on the degree of chronic psychosocial stress also present. A hierarchical Cox regression analysis was undertaken to examine whether there were any immediate effects of negative life events in children without a background of high chronic stress. Sixty children with verified chronic asthma were followed prospectively for 18 months with continuous monitoring of asthma by daily symptom diaries and peak flow measurements, accompanied by repeated interview assessments of life events. The key outcome measures were asthma exacerbations and severely negative life events. An immediate effect, evident within the first 2 days following a severely negative life event, increased the risk of a new asthma attack by a factor of 4.69 (95% confidence interval 2.33 to 9.44, p<0.001). In the period 3-10 days after a severe event there was no increased risk of an asthma attack (p = 0.5). In addition to the immediate effect, an increased risk of 1.81 (95% confidence interval 1.24 to 2.65) was found 5-7 weeks after a severe event (p = 0.002). This is consistent with earlier findings. There was a statistically significant variation, due to unobserved factors, in the incidence of asthma attacks between the children. The use of statistical methods capable of investigating short time lags showed that stressful life events significantly increase the risk of a new asthma attack immediately after the event; a more delayed increase in risk was also evident 5-7 weeks later.
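One way to investigate such lagged effects is a Cox model with time-dependent covariates coding the lag windows from the abstract (0-2 days, 3-10 days, 5-7 weeks). This is a minimal sketch using lifelines; the file and column names are hypothetical, and the exact hierarchical specification of the study is not reproduced.

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Hypothetical long-format data: one row per child per risk interval, with
# lagged indicators for a severe life event in each preceding window.
df = pd.read_csv("asthma_intervals.csv")
# assumed columns: child_id, start_day, stop_day, attack (0/1),
#                  event_lag_0_2, event_lag_3_10, event_lag_35_49

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="child_id", event_col="attack",
        start_col="start_day", stop_col="stop_day")
ctv.print_summary()  # exp(coef) on event_lag_0_2 ~ the immediate relative risk
```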
NASA Technical Reports Server (NTRS)
Collow, Allie Marquardt; Bosilovich, Mike; Ullrich, Paul; Hoeck, Ian
2017-01-01
Extreme precipitation events can have a large impact on society through flooding, which can result in property destruction, crop losses, economic losses, the spread of water-borne diseases, and fatalities. Observations indicate a statistically significant increase in extreme precipitation events over the past 15 years in the Northeastern United States, and other localized regions of the country have been crippled by record flooding, for example the flooding that occurred in the Southeast United States in association with Hurricane Matthew in October 2016. Extreme precipitation events in the United States can be caused by various meteorological influences such as extratropical cyclones, tropical cyclones, mesoscale convective complexes, general air mass thunderstorms, upslope flow, fronts, and the North American Monsoon. Reanalyses, such as the Modern-Era Retrospective analysis for Research and Applications, version 2 (MERRA-2), have become a pivotal tool for studying the meteorology surrounding extreme precipitation events. Using days classified as extreme precipitation events based on a combination of gauge and radar observations, two classification techniques are applied to MERRA-2 atmospheric data to determine how events have changed over time. The first is self-organizing maps, an artificial neural network that uses unsupervised learning to cluster like patterns; the second is an automated detection technique that searches for atmospheric characteristics that define a meteorological phenomenon. For example, the automated detection of tropical cyclones searches for a defined area of suppressed sea level pressure alongside thickness anomalies aloft, indicating the presence of a warm core. These techniques are employed for extreme precipitation events in preselected regions chosen based on an analysis of the climatology of precipitation.
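To make the self-organizing map idea concrete, here is a minimal hand-rolled SOM that clusters flattened anomaly fields onto a small node grid. The grid size, learning schedule, and input file are assumptions for illustration, not the study's configuration.

```python
import numpy as np

def train_som(data, grid=(4, 4), iters=5000, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal self-organizing map: clusters circulation patterns
    (rows of `data`, e.g. flattened anomaly fields) onto a 2-D node grid."""
    rng = np.random.default_rng(seed)
    n_nodes = grid[0] * grid[1]
    w = rng.standard_normal((n_nodes, data.shape[1])) * data.std()
    gy, gx = np.divmod(np.arange(n_nodes), grid[1])     # node grid coordinates
    for t in range(iters):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(np.linalg.norm(w - x, axis=1))  # best-matching unit
        frac = t / iters
        lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
        grid_d2 = (gy - gy[bmu])**2 + (gx - gx[bmu])**2
        h = np.exp(-grid_d2 / (2 * sigma**2))           # neighborhood kernel
        w += lr * h[:, None] * (x - w)                  # pull nodes toward x
    return w

# Each row: a flattened MERRA-2 anomaly field (e.g. SLP) on an event day.
fields = np.load("event_day_anomalies.npy")             # hypothetical file
nodes = train_som(fields)
labels = np.argmin(np.linalg.norm(fields[:, None] - nodes, axis=2), axis=1)
```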
NASA Astrophysics Data System (ADS)
Masbruch, M.; Rumsey, C.; Gangopadhyay, S.; Susong, D.; Pruitt, T.
2015-12-01
There has been a considerable amount of research linking climatic variability to hydrologic responses in arid and semi-arid regions such as the western United States. Although much effort has been spent to assess and predict changes in surface-water resources, little has been done to understand how climatic events and changes affect groundwater resources. This study focuses on quantifying the effects of large quasi-decadal groundwater recharge events on groundwater in the northern Utah portion of the Great Basin for the period 1960 to 2013. Groundwater-level monitoring data were analyzed together with climatic data to characterize the climatic conditions and frequency of these large recharge events. Using observed water-level changes and multivariate analysis, five large groundwater recharge events were identified within the study area and period, with a frequency of about 11 to 13 years. These events were generally characterized by above-average annual precipitation and snow water equivalent and below-average seasonal temperatures, especially during the spring (April through June). Existing groundwater flow models for several basins within the study area were used to quantify changes in groundwater storage from these events. Simulated groundwater storage increases per basin from a single event ranged from about 115 Mm3 (93,000 acre-feet) to 205 Mm3 (166,000 acre-feet). Extrapolating these amounts over the entire northern Great Basin indicates that even a single large quasi-decadal recharge event could result in billions of cubic meters (millions of acre-feet) of groundwater recharge. Understanding the role of these large quasi-decadal recharge events in replenishing aquifers and sustaining water supplies is crucial for making informed water management decisions.
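A simple way to flag candidate recharge-event years from the characteristics described (wet, snowy, cool springs, rising water levels) is to standardize each annual series and apply thresholds. The file, column names, and cutoffs below are assumptions, not the study's multivariate method.

```python
import pandas as pd

# Hypothetical annual series per basin: precipitation, snow water
# equivalent, April-June mean temperature, and water-level change.
df = pd.read_csv("basin_climate_gw.csv", index_col="year")

z = (df - df.mean()) / df.std()   # standardize each variable

# Flag candidate quasi-decadal recharge events: wet, snowy, cool-spring
# years in which groundwater levels actually rose.
events = df.index[
    (z["precip"] > 0.5) & (z["swe"] > 0.5)
    & (z["spring_temp"] < -0.5) & (df["gw_level_change"] > 0)
]
print(list(events))   # compare spacing against the reported 11-13 years
```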
Gurbel, Paul A.; Bliden, Kevin P.; Navickas, Irene A.; Mahla, Elizabeth; Dichiara, Joseph; Suarez, Thomas A.; Antonino, Mark J.; Tantry, Udaya S.; Cohen, Eli
2010-01-01
Background: Post-stenting ischemic events occur despite dual antiplatelet therapy, suggesting that a "one size fits all" antithrombotic strategy has significant limitations. Ex vivo platelet function measurements may facilitate risk stratification and personalized antiplatelet therapy. Methods: We investigated the prognostic utility of the strength of ADP-induced (MA(ADP)) and thrombin-induced (MA(THROMBIN)) platelet-fibrin clots measured by thrombelastography, and of ADP-induced light transmittance aggregation (LTA(ADP)), in 225 serial patients treated with aspirin and clopidogrel following elective stenting. Ischemic and bleeding events were assessed over three years. Results: Overall, 59 (26%) first ischemic events occurred. Patients with ischemic events had higher MA(ADP), MA(THROMBIN), and LTA(ADP) (p<0.0001 for all comparisons). By receiver operating characteristic curve analysis, MA(ADP) >47 mm had the best predictive value for long-term ischemic events compared with the other measurements, with an area under the curve of 0.84 (95% CI 0.78-0.89, p<0.0001). The univariate Cox proportional hazards model identified MA(ADP) >47 mm, MA(THROMBIN) >69 mm, and LTA(ADP) >34% as significant independent predictors of first ischemic events at the three-year time point, with hazard ratios of 10.3 (p<0.0001), 3.8 (p<0.0001), and 4.8 (p<0.0001), respectively. Fifteen bleeding events occurred. Receiver operating characteristic curve and quartile analyses suggest MA(ADP) ≤31 as a predictive value for bleeding. Conclusion: This study is the first demonstration of the prognostic utility of MA(ADP) in predicting long-term event occurrence following stenting. The quantitative assessment of ADP-stimulated platelet-fibrin clot strength measured by thrombelastography can serve as a future tool in investigations of personalized antiplatelet treatment designed to reduce ischemic events and bleeding. PMID:20691842
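An ROC analysis like the one reported can be sketched as follows; the abstract does not state how the 47 mm cutoff was chosen, so the Youden index shown here is one common choice, and the input arrays are hypothetical.

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

# Hypothetical arrays: ma_adp = MA(ADP) in mm per patient,
# ischemic = 1 if a first ischemic event occurred within 3 years.
ma_adp = np.load("ma_adp.npy")
ischemic = np.load("ischemic.npy")

fpr, tpr, thresholds = roc_curve(ischemic, ma_adp)
print("AUC:", auc(fpr, tpr))               # the study reports 0.84

youden = tpr - fpr                         # Youden's J at each cutoff
best = thresholds[np.argmax(youden)]
print("optimal MA(ADP) cutoff:", best)     # the study reports >47 mm
```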
Timbo, Babgaleh B; Chirtel, Stuart J; Ihrie, John; Oladipo, Taiye; Velez-Suarez, Loy; Brewer, Vickery; Mozersky, Robert
2018-05-01
The Food and Drug Administration (FDA)'s Center for Food Safety and Applied Nutrition (CFSAN) oversees the safety of the nation's foods, dietary supplements, and cosmetic products. The objective was to present a descriptive analysis of the 2004-2013 dietary supplement adverse event report (AER) data from the CFSAN Adverse Event Reporting System (CAERS) and to evaluate the 2006 Dietary Supplement and Nonprescription Drug Consumer Protection Act as it pertains to dietary supplement adverse event reporting. We queried CAERS for data from 2004-2013 AERs specifying at least 1 suspected dietary supplement product and extracted the product name(s), the symptom(s) reported, age, sex, and serious adverse event outcomes. We examined time trends for mandatory and voluntary reporting and performed the analysis using SAS v9.4 and R v3.3.0 software. Of the total AERs (n = 15,430) received from January 1, 2004, through December 31, 2013, indicating at least 1 suspected dietary supplement product, 66.9% were mandatory, 32.2% were voluntary, and 0.9% were both mandatory and voluntary. Reported serious outcomes (5.1%) included death, life-threatening conditions, hospitalizations, congenital anomalies/birth defects, and events requiring intervention to prevent permanent impairment. Based on CAERS data, the dietary supplement adverse event reporting rate in the United States was estimated at approximately 2%. The findings show that the 2006 Dietary Supplement and Nonprescription Drug Consumer Protection Act had a substantial impact on the reporting of adverse events.
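The time-trend tabulation described is straightforward descriptive work; a minimal sketch follows, with a hypothetical extract file and column names standing in for the CAERS data.

```python
import pandas as pd

# Hypothetical extract of CAERS dietary-supplement AERs, one row per report.
aers = pd.read_csv("caers_supplements.csv", parse_dates=["report_date"])
# assumed columns: report_date, report_type ('mandatory'/'voluntary'/'both'),
#                  serious_outcome (0/1)

by_year = (aers.assign(year=aers["report_date"].dt.year)
                .groupby(["year", "report_type"]).size()
                .unstack(fill_value=0))
print(by_year)                          # mandatory vs voluntary time trend
print(aers["serious_outcome"].mean())   # share of serious outcomes (~5.1%)
```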
A substitution method to improve completeness of events documentation in anesthesia records.
Lamer, Antoine; De Jonckheere, Julien; Marcilly, Romaric; Tavernier, Benoît; Vallet, Benoît; Jeanne, Mathieu; Logier, Régis
2015-12-01
Anesthesia information management systems (AIMS) are optimized to find and display the data and curves of one specific intervention, but not for retrospective analysis over a large volume of interventions. Such systems present two main limitations: (1) the transactional database architecture and (2) the completeness of documentation. To solve the architectural problem, data warehouses have been developed to provide an architecture suitable for analysis; however, the completeness of documentation remains unsolved. In this paper, we describe a method for determining substitution rules in order to detect missing anesthesia events in an anesthesia record. The method is based on the principle that a missing event can be reconstructed from a substitute, defined as the nearest documented event. As an example, we focused on the automatic detection of the start and end of the anesthesia procedure when these events were not documented by the clinicians. We applied the method to a set of records in order to evaluate (1) the event detection accuracy and (2) the improvement in the number of valid records. For the years 2010-2012, we obtained event detection with a precision of 0.00 (-2.22; 2.00) min for the start of anesthesia and 0.10 (0.00; 0.35) min for the end of anesthesia, and we increased data completeness for the start and end of anesthesia events by 21.1% (from 80.3% to 97.2% of the total database). This method appears efficient for replacing missing "start and end of anesthesia" events, and it could also be used to replace other missing time events in this particular data warehouse as well as in other kinds of data warehouses.
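One plausible reading of the substitution principle is sketched below: learn a median time offset between the target event and its surrogate on records where both are documented, then apply it where the target is missing. The file, column names, and surrogate choice are assumptions, not the paper's actual rule set.

```python
import pandas as pd

def learn_offset(df, target, surrogate):
    """On records where both events are documented, learn the median time
    offset between the target event and its documented surrogate."""
    both = df.dropna(subset=[target, surrogate])
    return (both[target] - both[surrogate]).median()

def substitute(df, target, surrogate, offset):
    """Fill a missing target event time from the surrogate plus the
    learned offset (event times here as minutes from admission)."""
    missing = df[target].isna() & df[surrogate].notna()
    df.loc[missing, target] = df.loc[missing, surrogate] + offset
    return df

# Hypothetical columns: 'anesthesia_start' (often missing) and
# 'first_drug_admin' (reliably documented, used as the substitute).
records = pd.read_csv("anesthesia_events.csv")
off = learn_offset(records, "anesthesia_start", "first_drug_admin")
records = substitute(records, "anesthesia_start", "first_drug_admin", off)
```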
Yegya-Raman, Nikhil; Wang, Kyle; Kim, Sinae; Reyhan, Meral; Deek, Matthew P; Sayan, Mutlay; Li, Diana; Patel, Malini; Malhotra, Jyoti; Aisner, Joseph; Marks, Lawrence B; Jabbour, Salma K
2018-06-05
We hypothesized that higher cardiac doses correlate with clinically significant cardiotoxicity after standard-dose chemoradiation therapy (CRT) (∼60 Gy) for inoperable non-small cell lung cancer (NSCLC). We retrospectively reviewed the records of 140 patients with inoperable NSCLC treated with concurrent CRT from 2007 to 2015. Extracted data included baseline cardiac status, dosimetric parameters for the whole heart (WH) and cardiac substructures, and the development of post-CRT symptomatic cardiac events (acute coronary syndrome [ACS], arrhythmia, pericardial effusion, pericarditis, and congestive heart failure [CHF]). Competing risks analysis was used to estimate time to cardiac events. Median follow-up was 47.4 months. Median radiation therapy dose was 61.2 Gy (interquartile range, 60-66 Gy). Forty patients (28.6%) developed 47 symptomatic cardiac events, at a median of 15.3 months to first event. On multivariate analysis, higher WH doses and baseline cardiac status were associated with an increased risk of symptomatic cardiac events. The 4-year cumulative incidence of symptomatic cardiac events was 48.6% versus 18.5% for mean WH dose ≥20 Gy versus <20 Gy, respectively (p = 0.0002). Doses to the WH, ventricles, and left anterior descending artery were associated with ACS/CHF, whereas doses to the WH and atria were not associated with supraventricular arrhythmias. Symptomatic cardiac events (p = 0.0001) were independently associated with death. Incidental cardiac irradiation was associated with subsequent symptomatic cardiac events, particularly ACS/CHF, and symptomatic cardiac events were associated with inferior survival. These results support the minimization of cardiac doses among patients with inoperable NSCLC receiving standard-dose CRT. Copyright © 2018 International Association for the Study of Lung Cancer. Published by Elsevier Inc. All rights reserved.
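The cumulative incidence estimates with death as a competing risk can be sketched with an Aalen-Johansen estimator; the lifelines implementation is used here under assumed file and column names, and this is not the authors' actual code.

```python
import pandas as pd
from lifelines import AalenJohansenFitter

# Hypothetical patient table: months to first symptomatic cardiac event,
# with death before a cardiac event treated as a competing risk.
df = pd.read_csv("nsclc_cardiac.csv")
# assumed columns: months, status (0=censored, 1=cardiac event, 2=death),
#                  mean_heart_dose_gy

for high_dose, grp in df.groupby(df["mean_heart_dose_gy"] >= 20):
    ajf = AalenJohansenFitter()
    ajf.fit(grp["months"], grp["status"], event_of_interest=1)
    # cumulative incidence of cardiac events at end of follow-up,
    # by dose group (True = mean WH dose >= 20 Gy, False = < 20 Gy)
    print(high_dose, ajf.cumulative_density_.iloc[-1])
```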